WorldWideScience

Sample records for contingent probability statistics

  1. Comparison of accuracy in predicting emotional instability from MMPI data: Fisherian versus contingent probability statistics

    Energy Technology Data Exchange (ETDEWEB)

    Berghausen, P.E. Jr.; Mathews, T.W.

    1987-01-01

    The security plans of nuclear power plants generally require that all personnel who are to have access to protected areas or vital islands be screened for emotional stability. In virtually all instances, the screening involves the administration of one or more psychological tests, usually including the Minnesota Multiphasic Personality Inventory (MMPI). At some plants, all employees receive a structured clinical interview after they have taken the MMPI and results have been obtained. At other plants, only those employees with dirty MMPIs are interviewed. This latter protocol is referred to as interviews by exception. Behaviordyne Psychological Corp. has succeeded in removing some of the uncertainty associated with interview-by-exception protocols by developing an empirically based, predictive equation. This equation permits utility companies to make informed choices regarding the risks they are assuming. A conceptual problem exists with the predictive equation, however. Like most predictive equations currently in use, it is based on Fisherian statistics, involving least-squares analyses. Consequently, Behaviordyne Psychological Corp., in conjunction with T.W. Mathews and Associates, has just developed a second predictive equation, one based on contingent probability statistics. The particular technique used is the multi-contingent analysis of probability systems (MAPS) approach. The present paper compares the predictive accuracy of the two equations: the one derived using Fisherian techniques versus the one derived using contingent probability techniques.
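
    The MAPS technique itself is not described in this abstract, so the following is only a minimal sketch of the contrast the paper draws: a least-squares (Fisherian) fit of outcome on a test score versus a conditional-probability estimate read from a contingency table. The scores, thresholds, and risk curve are invented for illustration.

    ```python
    # Hypothetical sketch: least-squares vs. contingency-based prediction of a
    # binary outcome. Nothing here reproduces Behaviordyne's MAPS equation.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 500
    scale = rng.normal(50, 10, n)                    # e.g. an MMPI scale score
    p_true = 1 / (1 + np.exp(-(scale - 60) / 5))     # assumed risk curve
    unstable = rng.random(n) < p_true                # True = rated emotionally unstable

    # Fisherian route: least-squares fit of the outcome on the score.
    b, a = np.polyfit(scale, unstable.astype(float), 1)
    ls_pred = a + b * scale                          # can fall outside [0, 1]

    # Contingent-probability route: P(unstable | score bin) from a contingency table.
    bins = np.digitize(scale, [40, 50, 60, 70])
    cp_pred = np.array([unstable[bins == k].mean() for k in bins])

    print("least-squares predictions:", ls_pred.min(), "to", ls_pred.max())
    print("conditional probabilities:", cp_pred.min(), "to", cp_pred.max())
    ```

    One practical point such a comparison can surface: the contingency-table predictions are genuine probabilities by construction, whereas the least-squares line can stray outside [0, 1].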

  2. On-line multi-contingency preprocessing of security assessment for severe weather based on qualitative reasoning with probability statistics classification

    Energy Technology Data Exchange (ETDEWEB)

    Chen, R.H.; Malik, O.P.; Hope, G.S. [Calgary Univ., AB (Canada). Dept. of Electrical and Computer Engineering]

    1995-02-01

    A preprocessing method to select multi-contingencies with higher probability and severity for security assessment is presented in this paper. Based upon probabilistic classification and qualitative analysis, the proposed method can quickly create and renew the multiple-contingency list with probability attributes and greatly reduce the number of multi-contingencies to be assessed. Operators can concentrate on analysing multi-contingency cases with higher severity and higher probability, especially for on-line application during severe weather periods. Thus, both the probability and the duration of loss of load can be reduced. (Author)
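
    The abstract does not give the selection rule itself; the sketch below only illustrates the general idea of shortlisting and ranking contingencies by probability and severity, with invented names, numbers, and thresholds.

    ```python
    # Illustrative contingency screening: keep cases whose estimated probability
    # and severity both exceed thresholds, then rank by their product.
    contingencies = [
        {"name": "line 1+2 outage", "probability": 0.03, "severity": 8.5},
        {"name": "line 3 outage",   "probability": 0.20, "severity": 1.2},
        {"name": "gen A + line 4",  "probability": 0.01, "severity": 9.8},
        {"name": "line 5+6 outage", "probability": 0.08, "severity": 6.0},
    ]

    P_MIN, S_MIN = 0.02, 5.0   # tighter screening during severe weather
    shortlist = [c for c in contingencies
                 if c["probability"] >= P_MIN and c["severity"] >= S_MIN]
    shortlist.sort(key=lambda c: c["probability"] * c["severity"], reverse=True)

    for c in shortlist:
        print(f'{c["name"]}: risk = {c["probability"] * c["severity"]:.2f}')
    ```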

  3. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace, Innsbruck, Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to his memory and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are included especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum, ranging from foundations of probability, across psychological aspects of formulating subjective probability statements and abstract measure-theoretical considerations, to contributions to theoretical statistics an...

  4. Statistics and Probability

    Directory of Open Access Journals (Sweden)

    Laktineh Imad

    2010-04-01

    Full Text Available This course constitutes a brief introduction to probability applications in high energy physics. First the mathematical tools related to the different probability concepts are introduced. The probability distributions which are commonly used in high energy physics and their characteristics are then shown and commented on. The central limit theorem and its consequences are analysed. Finally, some numerical methods used to produce different kinds of probability distribution are presented. The full article (17 p.) corresponding to this lecture is written in French and is provided in the proceedings of the book SOS 2008.

  5. Sampling, Probability Models and Statistical Reasoning: Statistical Inference

    Indian Academy of Sciences (India)

    Mohan Delampady, V R Padmawar. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49-58 ...

  6. Probability and statistics: selected problems

    OpenAIRE

    Machado, J.A. Tenreiro; Pinto, Carla M. A.

    2014-01-01

    Probability and Statistics—Selected Problems is a unique book for senior undergraduate and graduate students seeking a fast review of basic material in probability and statistics. Descriptive statistics are presented first, and probability is reviewed second. Discrete and continuous distributions are presented. Sampling and estimation, with hypothesis testing, are covered in the last two chapters. Solutions to the proposed exercises are listed for the reader's reference.

  7. Probability, Information and Statistical Physics

    Science.gov (United States)

    Kuzemsky, A. L.

    2016-03-01

    In this short survey review we discuss foundational issues of the probabilistic approach to information theory and statistical mechanics from a unified standpoint. Emphasis is on the interrelations between the theories. The basic aim is tutorial, i.e. to provide a basic introduction to the analysis and applications of probabilistic concepts in the description of various aspects of complexity and stochasticity. We consider probability as a foundational concept in statistical mechanics and review selected advances in the theoretical understanding of the interrelation of probability, information and statistical description with regard to the basic notions of statistical mechanics of complex systems. It also includes a synthesis of past and present research and a survey of methodology. The purpose of this terse overview is to discuss and partially describe those probabilistic methods and approaches that are used in statistical mechanics, with the aim of making these ideas easier to understand and to apply.

  8. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, their aims and scopes, the future direction of research, and how their work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  9. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes. This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d...

  10. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self-study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit...

  11. Probability, Statistics, and Computational Science

    OpenAIRE

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient infe...

  12. Probability and statistics: A reminder

    Science.gov (United States)

    Clément, Benoit

    2013-07-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from "data analysis in experimental sciences" given in [1].

  13. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was, say, fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  14. Probability and statistics: A reminder

    OpenAIRE

    Clément Benoit

    2013-01-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from “data analysis in experimental sciences” given in [1].

  15. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
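
    As one concrete instance of the models the chapter surveys, a hidden Markov model likelihood can be computed with the forward algorithm in a few lines; the two states, the emission alphabet, and all parameter values below are invented.

    ```python
    # Minimal forward algorithm for a two-state HMM with a binary alphabet.
    import numpy as np

    pi = np.array([0.5, 0.5])              # initial state distribution
    A = np.array([[0.9, 0.1],              # transition probabilities
                  [0.2, 0.8]])
    B = np.array([[0.7, 0.3],              # emission probabilities P(symbol | state)
                  [0.2, 0.8]])
    obs = [0, 1, 1, 0, 1]                  # observed symbol indices

    # Forward recursion: alpha[i] = P(observations so far, current state = i)
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]

    print("likelihood P(obs) =", alpha.sum())
    ```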

  16. Probability and statistics: A reminder

    Directory of Open Access Journals (Sweden)

    Clément Benoit

    2013-07-01

    Full Text Available The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from “data analysis in experimental sciences” given in [1].

  17. Frequentist probability and frequentist statistics

    Energy Technology Data Exchange (ETDEWEB)

    Neyman, J.

    1977-01-01

    A brief, nontechnical outline is given of the author's views on the "frequentist" theory of probability and the "frequentist" theory of statistics; their applications are illustrated in a few domains of study of nature. The phenomenon of apparently stable relative frequencies as the source of the frequentist theories of probability and statistics is taken up first. Three steps are set out: empirical establishment of apparently stable long-run relative frequencies of events judged interesting, as they develop in nature; guessing and then verifying the chance mechanism, the repeated operation of which produced the observed frequencies--this is a problem of frequentist probability theory; using the hypothetical chance mechanism of the phenomenon studied to deduce rules of adjusting our actions to the observations to ensure the highest "measure" of "success". Illustrations of the three steps are given. The theory of testing statistical hypotheses is sketched: basic concepts, simple and composite hypotheses, hypothesis tested, importance of the power of the test used, practical applications of the theory of testing statistical hypotheses. Basic ideas and an example of the randomization of experiments are discussed, and an "embarrassing" example is given. The problem of statistical estimation is sketched: example of an isolated problem, example of connected problems treated routinely, empirical Bayes theory, point estimation. The theory of confidence intervals is outlined: basic concepts, anticipated misunderstandings, construction of confidence intervals: regions of acceptance. Finally, the theory of estimation by confidence intervals or regions is considered briefly. 4 figures. (RWR)

  18. Simplified Freeman-Tukey test statistics for testing probabilities in ...

    African Journals Online (AJOL)

    This paper presents the simplified version of the Freeman-Tukey test statistic for testing hypothesis about multinomial probabilities in one, two and multidimensional contingency tables that does not require calculating the expected cell frequencies before test of significance. The simplified method established new criteria of ...

  19. Entropy in probability and statistics

    Energy Technology Data Exchange (ETDEWEB)

    Rolke, W.A.

    1992-01-01

    The author develops a theory of entropy, where entropy is defined as the Legendre-Fenchel transform of the logarithmic moment generating function of a probability measure on a Banach space. A variety of properties relating the probability measure and its entropy are proven. It is shown that the entropy of a large class of stochastic processes can be approximated by the entropies of the finite-dimensional distributions of the process. For several types of measures the author finds explicit formulas for the entropy, for example for stochastic processes with independent increments and for Gaussian processes. For the entropy of Markov chains, evaluated at the observations of the process, the author proves a central limit theorem. Theorems relating weak convergence of probability measures on a finite dimensional space and pointwise convergence of their entropies are developed and then used to give a new proof of Donsker's theorem. Finally the use of entropy in statistics is discussed. The author shows the connection between entropy and Kullback's minimum discrimination information. A central limit theorem yields a test for the independence of a sequence of observations.
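
    Written out in the standard large-deviations form consistent with this summary, for a probability measure μ on a Banach space E with dual E*:

    ```latex
    % Logarithmic moment generating function and its Legendre-Fenchel transform
    \Lambda(\lambda) = \log \int_E e^{\langle \lambda, x \rangle}\, \mu(\mathrm{d}x),
    \qquad \lambda \in E^{*},
    \qquad
    H(x) = \Lambda^{*}(x) = \sup_{\lambda \in E^{*}}
    \bigl( \langle \lambda, x \rangle - \Lambda(\lambda) \bigr),
    \qquad x \in E.
    ```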

  20. Lectures on probability and statistics

    Energy Technology Data Exchange (ETDEWEB)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.
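
    Both directions of the dice example fit in a few lines; the observed rolls below are invented.

    ```python
    # Forward (probability) problem: with two fair dice, enumerate the 36 equally
    # likely outcomes to get the probability of any specified total.
    from fractions import Fraction
    from itertools import product

    outcomes = list(product(range(1, 7), repeat=2))
    p_total_7 = Fraction(sum(1 for a, b in outcomes if a + b == 7), len(outcomes))
    print(p_total_7)   # 1/6

    # Inverse (statistics) problem: given observed rolls of one die, the relative
    # frequencies are maximum-likelihood estimates of each side's probability --
    # and with few rolls they need not look fair at all.
    rolls = [3, 6, 2, 6, 1, 4, 6, 5, 2, 6]
    mle = {side: rolls.count(side) / len(rolls) for side in range(1, 7)}
    print(mle)
    ```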

  1. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables. The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vectors...

  2. Dependence in Probability and Statistics

    CERN Document Server

    Doukhan, Paul; Surgailis, Donatas; Teyssiere, Gilles

    2010-01-01

    This volume collects recent works on weakly dependent, long-memory and multifractal processes and introduces new dependence measures for studying complex stochastic systems. Other topics include the statistical theory for bootstrap and permutation statistics for infinite variance processes, the dependence structure of max-stable processes, and the statistical properties of spectral estimators of the long memory parameter. The asymptotic behavior of Fejér graph integrals and their use for proving central limit theorems for tapered estimators are investigated. New multifractal processes are intr...

  3. Probability and statistics: models for research

    National Research Council Canada - National Science Library

    Bailey, Daniel Edgar

    1971-01-01

    This book is an interpretative presentation of the mathematical and logical basis of probability and statistics, indulging in some mathematics, but concentrating on the logical and scientific meaning...

  4. Selected papers on probability and statistics

    CERN Document Server

    2009-01-01

    This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.

  5. Recent Developments in Applied Probability and Statistics

    CERN Document Server

    Devroye, Luc; Kohler, Michael; Korn, Ralf

    2010-01-01

    This book presents surveys on recent developments in applied probability and statistics. The contributions include topics such as nonparametric regression and density estimation, option pricing, probabilistic methods for multivariate interpolation, robust graphical modelling and stochastic differential equations. Due to its broad coverage of different topics the book offers an excellent overview of recent developments in applied probability and statistics.

  6. Probability: an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h...

  7. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  8. Probably not: future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives. Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: how travel time is affected by congestion, driving speed, and traffic lights; why different gambling ...

  9. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698). * Incorporates more than 1,000 engaging problems with answers. * Includes more than 300 solved examples. * Uses varied problem-solving methods.

  10. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

    Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results". Special emphases on simulation and discrete decision theory; mathematically rich but self-contained text, at a gentle pace; review of calculus and linear algebra in an appendix; mathematical interludes (in each chapter) which examine mathematical techniques in the context of probabilistic or statistical importance; numerous section exercises, summaries, historical notes, and Further Readings for reinforcem...

  11. Uncertainty: the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...

  12. Integrated statistical modelling of spatial landslide probability

    Science.gov (United States)

    Mergili, M.; Chu, H.-J.

    2015-09-01

    Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) the landslide inventory is subset into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e. the spatial probability that at least one landslide pixel occurs within a zone of defined size. We quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km² study area in southern Taiwan, using an inventory of 1399 landslides triggered by Typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability; (iv) removing the largest landslides from the analysis leads to enhanced model performance.
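
    The combination rule in step (v) is explicit enough to state in code; the toy grid below uses invented values, not the Taiwan data.

    ```python
    # Pixel-wise integrated spatial landslide probability per the abstract:
    # max(release probability, impact probability * zonal release probability).
    import numpy as np

    p_release = np.array([[0.02, 0.10],
                          [0.25, 0.05]])   # step (ii): pixel release probability
    p_impact = np.array([[0.60, 0.30],
                         [0.10, 0.80]])    # step (iii): from angle-of-reach CDF
    p_zonal = np.array([[0.40, 0.40],
                        [0.50, 0.50]])     # step (iv): >= 1 release pixel in zone

    p_integrated = np.maximum(p_release, p_impact * p_zonal)   # step (v)
    print(p_integrated)
    ```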

  13. Probabilities for separating sets of order statistics.

    Science.gov (United States)

    Glueck, D H; Karimpour-Fard, A; Mandel, J; Muller, K E

    2010-04-01

    Consider a set of order statistics that arise from sorting samples from two different populations, each with its own, possibly different distribution function. The probability that these order statistics fall in disjoint, ordered intervals and that, of the smallest statistics, a certain number come from the first population is given in terms of the two distribution functions. The result is applied to computing the joint probability of the number of rejections and the number of false rejections for the Benjamini-Hochberg false discovery rate procedure.
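
    For reference, the Benjamini-Hochberg step-up procedure that the result is applied to can be sketched as follows; the p-values and level q are invented.

    ```python
    # Benjamini-Hochberg: reject the k smallest p-values, where k is the largest
    # index with p_(i) <= (i / m) * q.
    import numpy as np

    def benjamini_hochberg(pvals, q=0.05):
        p = np.sort(np.asarray(pvals))
        m = len(p)
        below = p <= (np.arange(1, m + 1) / m) * q
        k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
        return k, p[:k]            # number of rejections, rejected p-values

    k, rejected = benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.20, 0.74])
    print(k, rejected)             # 2 rejections: 0.001 and 0.008
    ```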

  14. Marrakesh International Conference on Probability and Statistics

    CERN Document Server

    Ouassou, Idir; Rachdi, Mustapha

    2015-01-01

    This volume, which highlights recent advances in statistical methodology and applications, is divided into two main parts. The first part presents theoretical results on estimation techniques in functional statistics, while the second examines three key areas of application: estimation problems in queuing theory, an application in signal processing, and the copula approach to epidemiologic modelling. The book’s peer-reviewed contributions are based on papers originally presented at the Marrakesh International Conference on Probability and Statistics held in December 2013.

  15. Statistics with JMP graphs, descriptive statistics and probability

    CERN Document Server

    Goos, Peter

    2015-01-01

    Peter Goos, Department of Statistics, University of Leuven, Faculty of Bio-Science Engineering and University of Antwerp, Faculty of Applied Economics, Belgium. David Meintrup, Department of Mathematics and Statistics, University of Applied Sciences Ingolstadt, Faculty of Mechanical Engineering, Germany. Thorough presentation of introductory statistics and probability theory, with numerous examples and applications using JMP. Descriptive Statistics and Probability provides an accessible and thorough overview of the most important descriptive statistics for nominal, ordinal and quantitative data with partic...

  16. Statistical physics of pairwise probability models

    DEFF Research Database (Denmark)

    Roudi, Yasser; Aurell, Erik; Hertz, John

    2009-01-01

    (No Danish abstract available.) Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data...

  17. Stochastics: introduction to probability and statistics

    CERN Document Server

    Georgii, Hans-Otto; Baake, Ellen; Georgii, Hans-Otto

    2008-01-01

    This book is a translation of the third edition of the well accepted German textbook 'Stochastik', which presents the fundamental ideas and results of both probability theory and statistics, and comprises the material of a one-year course. The stochastic concepts, models and methods are motivated by examples and problems and then developed and analysed systematically.

  18. Probability & Statistics: Modular Learning Exercises. Student Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…

  19. Probability & Statistics: Modular Learning Exercises. Teacher Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…

  20. Probability and Statistics for Particle Physicists

    CERN Document Server

    Ocariz, J.

    2014-01-01

    Lectures presented at the 1st CERN Asia-Europe-Pacific School of High-Energy Physics, Fukuoka, Japan, 14-27 October 2012. A pedagogical selection of topics in probability and statistics is presented. Choice and emphasis are driven by the author's personal experience, predominantly in the context of physics analyses using experimental data from high-energy physics detectors.

  1. The Britannica Guide to Statistics and Probability

    CERN Document Server

    2011-01-01

    By observing patterns and repeated behaviors, mathematicians have devised calculations to significantly reduce human potential for error. This volume introduces the historical and mathematical basis of statistics and probability as well as their application to everyday situations. Readers will also meet the prominent thinkers who advanced the field and established a numerical basis for prediction.

  2. Pointwise probability reinforcements for robust statistical inference.

    Science.gov (United States)

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be with respect to their theoretical probability, and include, e.g., outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR, and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation.
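
    The framework is generic likelihood maximisation with per-observation reinforcements; as a toy instance (not the authors' algorithm), the sketch below estimates a Gaussian mean robustly by maximising sum_i log(p(x_i | mu) + r_i) - lam * sum_i r_i with r_i >= 0, using alternating updates.

    ```python
    # Toy pointwise-reinforcement estimator for a Gaussian mean. Outliers receive
    # a large reinforcement r_i and are thereby down-weighted; lam controls how
    # much reinforcement the regularisation allows.
    import numpy as np

    def ppr_mean(x, sigma=1.0, lam=5.0, iters=50):
        mu = np.median(x)                            # robust starting point
        for _ in range(iters):
            p = np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
            r = np.maximum(0.0, 1.0 / lam - p)       # optimal r_i for fixed mu
            w = p / (p + r)                          # implied weight per point
            mu = np.sum(w * x) / np.sum(w)           # reweighted mean update
        return mu, r                                 # estimate + abnormality degrees

    x = np.concatenate([np.random.default_rng(1).normal(0, 1, 95),
                        [25.0, 30.0, 40.0, 50.0, 60.0]])   # 5 gross outliers
    mu, r = ppr_mean(x)
    print(mu, x.mean())    # reinforced estimate vs. outlier-pulled plain mean
    ```

    The returned reinforcements play the role of the abstract's abnormality degree: points with r_i well above zero are the ones the fit has flagged as abnormally frequent.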

  3. Geometric modeling in probability and statistics

    CERN Document Server

    Calin, Ovidiu

    2014-01-01

    This book covers topics of Information Geometry, a field which deals with the differential-geometric study of manifolds of probability density functions. This is a field that is increasingly attracting the interest of researchers from many different areas of science, including mathematics, statistics, geometry, computer science, signal processing, physics and neuroscience. It is the authors' hope that the present book will be a valuable reference for researchers and graduate students in one of the aforementioned fields. This textbook is a unified presentation of differential geometry and probability theory, and constitutes a text for a course directed at graduate or advanced undergraduate students interested in applications of differential geometry in probability and statistics. The book contains over 100 proposed exercises meant to help students deepen their understanding, and it is accompanied by software that is able to provide numerical computations of several information geometric objects. The reader...

  4. Python for probability, statistics, and machine learning

    CERN Document Server

    Unpingco, José

    2016-01-01

    This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowl...

  5. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  6. Selected papers on analysis, probability, and statistics

    CERN Document Server

    Nomizu, Katsumi

    1994-01-01

    This book presents papers that originally appeared in the Japanese journal Sugaku. The papers fall into the general area of mathematical analysis as it pertains to probability and statistics, dynamical systems, differential equations and analytic function theory. Among the topics discussed are: stochastic differential equations, spectra of the Laplacian and Schrödinger operators, nonlinear partial differential equations which generate dissipative dynamical systems, fractal analysis on self-similar sets and the global structure of analytic functions.

  7. Probability and Statistics in Aerospace Engineering

    Science.gov (United States)

    Rheinfurth, M. H.; Howell, L. W.

    1998-01-01

    This monograph was prepared to give the practicing engineer a clear understanding of probability and statistics with special consideration to problems frequently encountered in aerospace engineering. It is conceived to be both a desktop reference and a refresher for aerospace engineers in government and industry. It could also be used as a supplement to standard texts for in-house training courses on the subject.

  8. Statistical methods for solar flare probability forecasting

    Science.gov (United States)

    Vecchia, D. F.; Tryon, P. V.; Caldwell, G. A.; Jones, R. W.

    1980-09-01

    The Space Environment Services Center (SESC) of the National Oceanic and Atmospheric Administration provides probability forecasts of regional solar flare disturbances. This report describes a statistical method useful for obtaining 24-hour solar flare forecasts which, historically, have been subjectively formulated. In Section 1 of this report flare classifications of the SESC and the particular probability forecasts to be considered are defined. In Section 2 we describe the solar flare data base and outline general principles for effective data management. Three statistical techniques for solar flare probability forecasting are discussed in Section 3, viz., discriminant analysis, logistic regression, and multiple linear regression. We also review two scoring measures and suggest the logistic regression approach for obtaining 24-hour forecasts. In Section 4 a heuristic procedure is used to select nine basic predictors from the many available explanatory variables. Using these nine variables, logistic regression is demonstrated by example in Section 5. We conclude in Section 6 with broad suggestions regarding continued development of objective methods for solar flare probability forecasting.
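
    The report's preferred technique, logistic regression, maps a predictor vector to a flare probability. The sketch below fits one by plain gradient ascent on synthetic data, since the nine SESC predictors themselves are not listed in this abstract.

    ```python
    # Logistic regression by gradient ascent on the Bernoulli log-likelihood;
    # the two predictors are invented stand-ins for solar-activity variables.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 400
    X = np.column_stack([np.ones(n),             # intercept
                         rng.normal(size=n),     # e.g. sunspot-area proxy
                         rng.normal(size=n)])    # e.g. magnetic-class proxy
    true_w = np.array([-1.0, 1.5, 0.8])
    y = (rng.random(n) < 1 / (1 + np.exp(-X @ true_w))).astype(float)

    w = np.zeros(3)
    for _ in range(5000):
        p = 1 / (1 + np.exp(-X @ w))             # current predicted probabilities
        w += 0.05 * X.T @ (y - p) / n            # gradient of the log-likelihood

    x_today = np.array([1.0, 2.0, 0.5])          # today's (invented) predictors
    print("P(flare within 24 h) =", 1 / (1 + np.exp(-x_today @ w)))
    ```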

  9. Stochastics: introduction to probability and statistics

    CERN Document Server

    Georgii, Hans-Otto

    2012-01-01

    This second revised and extended edition presents the fundamental ideas and results of both probability theory and statistics, and comprises the material of a one-year course. It is addressed to students with an interest in the mathematical side of stochastics. Stochastic concepts, models and methods are motivated by examples and developed and analysed systematically. Some measure theory is included, but this is done at an elementary level that is in accordance with the introductory character of the book. A large number of problems offer applications and supplements to the text.

  10. Statistical physics of pairwise probability models

    Directory of Open Access Journals (Sweden)

    Yasser Roudi

    2009-11-01

    Full Text Available Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the means and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying and using pairwise models. We build on our previous work on the subject and study the relation between different methods for fitting these models and evaluating their quality. In particular, using data from simulated cortical networks we study how the quality of various approximate methods for inferring the parameters in a pairwise model depends on the time bin chosen for binning the data. We also study the effect of the size of the time bin on the model quality itself, again using simulated data. We show that using finer time bins increases the quality of the pairwise model. We offer new ways of deriving the expressions reported in our previous work for assessing the quality of pairwise models.

  11. Lectures on probability and statistics. Revision

    Energy Technology Data Exchange (ETDEWEB)

    Yost, G.P.

    1985-06-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. They begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. They finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another. Hopefully, the reader will come away from these notes with a feel for some of the problems and uncertainties involved. Although there are standard approaches, most of the time there is no cut-and-dried "best" solution - "best" according to every criterion.

  12. Sampling, Probability Models and Statistical Reasoning ...

    Indian Academy of Sciences (India)

    ... eligible voters who support a particular political party. A random sample of size n is selected from this population and suppose k voters support this party. What is a good estimate of the required proportion? How do we obtain a probability model for the experiment just conducted? Let us examine the following simple example.

  13. Probability and statistics for particle physics

    CERN Document Server

    Mana, Carlos

    2017-01-01

    This book comprehensively presents the basic concepts of probability and Bayesian inference with sufficient generality to make them applicable to current problems in scientific research. The first chapter provides the fundamentals of probability theory that are essential for the analysis of random phenomena. The second chapter includes a full and pragmatic review of the Bayesian methods that constitute a natural and coherent framework with enough freedom to analyze all the information available from experimental data in a conceptually simple manner. The third chapter presents the basic Monte Carlo techniques used in scientific research, which allow a large variety of problems that are difficult to tackle by other procedures to be handled. The author also introduces a basic algorithm, which enables readers to simulate samples from simple distributions, and describes useful cases for researchers in particle physics. The final chapter is devoted to the basic ideas of Information Theory, which are important in the Bayesian me...

  14. An introduction to probability and statistical inference

    CERN Document Server

    Roussas, George G

    2003-01-01

    "The text is wonderfully written and has the mostcomprehensive range of exercise problems that I have ever seen." - Tapas K. Das, University of South Florida"The exposition is great; a mixture between conversational tones and formal mathematics; the appropriate combination for a math text at [this] level. In my examination I could find no instance where I could improve the book." - H. Pat Goeters, Auburn, University, Alabama* Contains more than 200 illustrative examples discussed in detail, plus scores of numerical examples and applications* Chapters 1-8 can be used independently for an introductory course in probability* Provides a substantial number of proofs

  15. On lacunary statistical convergence of order α in probability

    Science.gov (United States)

    Işık, Mahmut; Et, Kübra Elif

    2015-09-01

    In this study, we examine the concepts of lacunary statistical convergence of order α in probability and Nθ-convergence of order α in probability. We give some relations connected to these concepts.
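
    For orientation, one standard formulation of the first concept from this literature (the paper's own definitions may differ in detail): given a lacunary sequence θ = (k_r) with intervals I_r = (k_{r-1}, k_r] and h_r = k_r - k_{r-1} → ∞, a sequence of random variables (X_k) converges lacunary statistically of order α in probability to X when

    ```latex
    \lim_{r \to \infty} \frac{1}{h_r^{\alpha}}
    \Bigl| \bigl\{ k \in I_r : P\bigl( |X_k - X| \ge \varepsilon \bigr) \ge \delta \bigr\} \Bigr| = 0
    \qquad \text{for every } \varepsilon > 0 \text{ and } \delta > 0, \quad 0 < \alpha \le 1.
    ```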

  16. CRC standard probability and statistics tables and formulae, Student ed.

    CERN Document Server

    Kokoska, Stephen

    2000-01-01

    Users of statistics in their professional lives and statistics students will welcome this concise, easy-to-use reference for basic statistics and probability. It contains all of the standardized statistical tables and formulas typically needed, plus material on basic statistics topics, such as probability theory and distributions, regression, analysis of variance, nonparametric statistics, and statistical quality control. For each type of distribution the authors supply: definitions; tables; relationships with other distributions, including limiting forms; statistical parameters, such as variance a...

  17. Probability and braiding statistics in Majorana nanowires

    Science.gov (United States)

    Clarke, David J.; Sau, Jay D.; Das Sarma, S.

    2017-04-01

    Given recent progress in the realization of Majorana zero modes in semiconducting nanowires with proximity-induced superconductivity, a crucial next step is to attempt an experimental demonstration of the predicted braiding statistics associated with the Majorana mode. Such a demonstration should, in principle, confirm that the experimentally observed zero-bias anomalies are indeed due to the presence of anyonic Majorana zero modes. Moreover, such a demonstration would be a breakthrough at the level of fundamental physics: the first clear demonstration of a non-Abelian excitation. It is therefore important to clarify the expected signals of Majorana physics in the braiding context and to differentiate these signals from those that might also arise in nontopological variants of the same system. A definitive and critical distinction between signals expected in topological (i.e., anyonic) and nontopological (i.e., trivial) situations is therefore essential for future progress in the field. In this paper, we carefully examine the expected signals of proposed braiding and fusion experiments in topological and nontopological variants of the experimental nanowire systems in which Majoranas are predicted to occur. We point out situations where 'trivial' and 'anyonic' signatures may be qualitatively similar experimentally, necessitating a certain level of caution in the interpretation of various proposed fusion and braiding experiments. We find in particular that braiding experiments consisting of full braids (two Majorana exchanges) are better at distinguishing between topological and nontopological systems than fusion experiments or experiments with an odd number of Majorana exchanges. Successful fusion experiments, particularly in nanowires where zero bias conductance peaks are also observed, can also provide strong evidence for the existence of Majorana modes, but such fusion evidence without a corresponding braiding success is not definitive.

  18. Probability, statistics and queueing theory, with computer science applications

    CERN Document Server

    Allen, Arnold O

    1978-01-01

    Probability, Statistics, and Queueing Theory: With Computer Science Applications focuses on the use of statistics and queueing theory for the design and analysis of data communication systems, emphasizing how the theorems and theory can be used to solve practical computer science problems. This book is divided into three parts. The first part discusses the basic concept of probability, probability distributions commonly used in applied probability, and the important concept of a stochastic process. Part II covers the discipline of queueing theory, while Part III deals with statistical inference.

  19. Probability and Statistics: The Science of Uncertainty (Revised Edition)

    CERN Document Server

    Tabak, John

    2011-01-01

    Probability and Statistics, Revised Edition deals with the history of probability, describing the modern concept of randomness and examining "pre-probabilistic" ideas of what most people today would characterize as randomness. This revised book documents some historically important early uses of probability to illustrate some very important probabilistic questions. It goes on to explore statistics and the generations of mathematicians and non-mathematicians who began to address problems in statistical analysis, including the statistical structure of data sets as well as the theory of...

  20. Statistics and probability with applications for engineers and scientists

    CERN Document Server

    Gupta, Bhisham C

    2013-01-01

    Introducing the tools of statistics and probability from the ground up. An understanding of statistical tools is essential for engineers and scientists who often need to deal with data analysis over the course of their work. Statistics and Probability with Applications for Engineers and Scientists walks readers through a wide range of popular statistical techniques, explaining step-by-step how to generate, analyze, and interpret data for diverse applications in engineering and the natural sciences. Unique among books of this kind, Statistics and Prob...

  1. Introduction to probability and statistics for engineers and scientists

    CERN Document Server

    Ross, Sheldon M

    2009-01-01

    This updated text provides a superior introduction to applied probability and statistics for engineering or science majors. Ross emphasizes the manner in which probability yields insight into statistical problems, ultimately resulting in an intuitive understanding of the statistical procedures most often used by practicing engineers and scientists. Real data sets are incorporated in a wide variety of exercises and examples throughout the book, and this emphasis on data motivates the probability coverage. As with the previous editions, Ross' text has tremendously clear exposition, plus real-data...

  2. Models for probability and statistical inference theory and applications

    CERN Document Server

    Stapleton, James H

    2007-01-01

    This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readers. Models for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping. Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...

  3. Introduction to probability and statistics for science, engineering, and finance

    CERN Document Server

    Rosenkrantz, Walter A

    2008-01-01

    Data Analysis: Orientation; The Role and Scope of Statistics in Science and Engineering; Types of Data: Examples from Engineering, Public Health, and Finance; The Frequency Distribution of a Variable Defined on a Population; Quantiles of a Distribution; Measures of Location (Central Value) and Variability; Covariance, Correlation, and Regression: Computing a Stock's Beta; Mathematical Details and Derivations; Large Data Sets. Probability Theory: Orientation; Sample Space, Events, Axioms of Probability Theory; Mathematical Models of Random Sampling; Conditional Probability and Baye...

  4. Theory of overdispersion in counting statistics caused by fluctuating probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Semkow, Thomas M. E-mail: semkow@wadsworth.org

    1999-11-01

    It is shown that the random Lexis fluctuations of probabilities such as probability of decay or detection cause the counting statistics to be overdispersed with respect to the classical binomial, Poisson, or Gaussian distributions. The generating and the distribution functions for the overdispersed counting statistics are derived. Applications to radioactive decay with detection and more complex experiments are given, as well as distinguishing between the source and background, in the presence of overdispersion. Monte-Carlo verifications are provided.
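
    A quick Monte Carlo in the spirit of the paper's verifications (distributions and parameters invented here): drawing the Poisson rate itself from a gamma distribution yields counts whose variance exceeds their mean.

    ```python
    # Overdispersion from a fluctuating rate: Poisson counts with a fixed rate
    # have variance ~ mean; gamma-fluctuating rates give variance > mean.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 200_000
    mean_rate = 10.0

    fixed = rng.poisson(mean_rate, n)                            # classical Poisson
    rates = rng.gamma(shape=4.0, scale=mean_rate / 4.0, size=n)  # fluctuating rate
    mixed = rng.poisson(rates)                                   # Poisson-gamma mixture

    for name, counts in [("fixed rate", fixed), ("fluctuating rate", mixed)]:
        print(f"{name}: mean {counts.mean():.2f}, variance {counts.var():.2f}")
    # fluctuating-rate variance ~ mean + mean^2 / shape = 10 + 25 = 35
    ```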

  5. Theory of overdispersion in counting statistics caused by fluctuating probabilities

    CERN Document Server

    Semkow, T M

    1999-01-01

    It is shown that the random Lexis fluctuations of probabilities such as probability of decay or detection cause the counting statistics to be overdispersed with respect to the classical binomial, Poisson, or Gaussian distributions. The generating and the distribution functions for the overdispersed counting statistics are derived. Applications to radioactive decay with detection and more complex experiments are given, as well as distinguishing between the source and background, in the presence of overdispersion. Monte-Carlo verifications are provided.

  6. Visualizing and Understanding Probability and Statistics: Graphical Simulations Using Excel

    Science.gov (United States)

    Gordon, Sheldon P.; Gordon, Florence S.

    2009-01-01

    The authors describe a collection of dynamic interactive simulations for teaching and learning most of the important ideas and techniques of introductory statistics and probability. The modules cover such topics as randomness, simulations of probability experiments such as coin flipping, dice rolling and general binomial experiments, a simulation…

  7. Teaching Basic Probability in Undergraduate Statistics or Management Science Courses

    Science.gov (United States)

    Naidu, Jaideep T.; Sanford, John F.

    2017-01-01

    Standard textbooks in core Statistics and Management Science classes present various examples to introduce basic probability concepts to undergraduate business students. These include tossing of a coin, throwing a die, and examples of that nature. While these are good examples to introduce basic probability, we use improvised versions of Russian…

  8. Constructing diagnostic likelihood: clinical decisions using subjective versus statistical probability.

    Science.gov (United States)

    Kinnear, John; Jackson, Ruth

    2017-07-01

    Although physicians are highly trained in the application of evidence-based medicine, and are assumed to make rational decisions, there is evidence that their decision making is prone to biases. One of the biases that has been shown to affect accuracy of judgements is that of representativeness and base-rate neglect, where the saliency of a person's features leads to overestimation of their likelihood of belonging to a group. This results in the substitution of 'subjective' probability for statistical probability. This study examines clinicians' propensity to make estimations of subjective probability when presented with clinical information that is considered typical of a medical condition. The strength of the representativeness bias is tested by presenting choices in textual and graphic form. Understanding of statistical probability is also tested by omitting all clinical information. For the questions that included clinical information, 46.7% and 45.5% of clinicians, respectively, made judgements of statistical probability. Where the question omitted clinical information, 79.9% of clinicians made a judgement consistent with statistical probability. There was a statistically significant difference in responses to the questions with and without representativeness information (χ²(1, n=254) = 54.45, p < 0.001), consistent with the substitution of subjective for statistical probability. One of the causes for this representativeness bias may be the way clinical medicine is taught, where stereotypic presentations are emphasised in diagnostic decision making.

  9. Statistics of adaptive optics speckles: From probability cloud to probability density function

    OpenAIRE

    Yaitskova, Natalia; Gladysz, Szymon

    2016-01-01

    The complex amplitude in the focal plane of an adaptive optics system is modelled as an elliptical complex random variable. The geometrical properties of the probability density function of such a variable relate directly to the statistics of the residual phase. Building solely on the two-dimensional geometry, the expression for the probability density function of speckle intensity is derived.

  10. Statistical hydrodynamics and related problems in spaces of probability measures

    Science.gov (United States)

    Dostoglou, Stamatios

    2017-11-01

    A rigorous theory of statistical solutions of the Navier-Stokes equations, suitable for exploring Kolmogorov's ideas, has been developed by M.I. Vishik and A.V. Fursikov, culminating in their monograph "Mathematical problems of Statistical Hydromechanics." We review some progress made in recent years following this approach, with emphasis on problems concerning the correlation of velocities and corresponding questions in the space of probability measures on Hilbert spaces.

  11. Elements of probability and statistics an introduction to probability with De Finetti’s approach and to Bayesian statistics

    CERN Document Server

    Biagini, Francesca

    2016-01-01

    This book provides an introduction to elementary probability and to Bayesian statistics using de Finetti's subjectivist approach. One of the features of this approach is that it does not require the introduction of sample space – a non-intrinsic concept that makes the treatment of elementary probability unnecessarily complicated – but instead introduces as fundamental the concept of random numbers, directly related to their interpretation in applications. Events become a particular case of random numbers, and probability a particular case of expectation when it is applied to events. The subjective evaluation of expectation and of conditional expectation is based on an economic choice of an acceptable bet or penalty. The properties of expectation and conditional expectation are derived by applying a coherence criterion that the evaluation has to follow. The book is suitable for all introductory courses in probability and statistics for students in Mathematics, Informatics, Engineering, and Physics.

  12. Teaching Probability with the Support of the R Statistical Software

    Science.gov (United States)

    dos Santos Ferreira, Robson; Kataoka, Verônica Yumi; Karrer, Monica

    2014-01-01

    The objective of this paper is to discuss aspects of high school students' learning of probability in a context where they are supported by the statistical software R. We report on the application of a teaching experiment, constructed using the perspective of Gal's probabilistic literacy and Papert's constructionism. The results show improvement…

  13. Sets, Probability and Statistics: The Mathematics of Life Insurance.

    Science.gov (United States)

    Clifford, Paul C.; And Others

    The practical use of such concepts as sets, probability, and statistics is considered by many to be vital and necessary to our everyday life. This student manual is intended to familiarize students with these concepts and to provide practice using real life examples. It also attempts to illustrate how the insurance industry uses such mathematic…

  14. Statistical learning of action: the role of conditional probability.

    Science.gov (United States)

    Meyer, Meredith; Baldwin, Dare

    2011-12-01

    Identification of distinct units within a continuous flow of human action is fundamental to action processing. Such segmentation may rest in part on statistical learning. In a series of four experiments, we examined what types of statistics people can use to segment a continuous stream involving many brief, goal-directed action elements. The results of Experiment 1 showed no evidence for sensitivity to conditional probability, whereas Experiment 2 displayed learning based on joint probability. In Experiment 3, we demonstrated that additional exposure to the input failed to engender sensitivity to conditional probability. However, the results of Experiment 4 showed that a subset of adults-namely, those more successful at identifying actions that had been seen more frequently than comparison sequences-were also successful at learning conditional-probability statistics. These experiments help to clarify the mechanisms subserving processing of intentional action, and they highlight important differences from, as well as similarities to, prior studies of statistical learning in other domains, including language.
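
    The distinction the authors draw between joint and conditional probability can be made concrete with a small sketch (hypothetical toy data, not the authors' stimuli): a stream is built from two three-element "actions", and both statistics are computed for a within-unit pair and a boundary-spanning pair.

        import random
        from collections import Counter

        random.seed(1)
        # Toy action stream: each letter is a brief motion element; the units
        # "abc" and "xyz" play the role of coherent small actions.
        stream = "".join(random.choice(["abc", "xyz"]) for _ in range(2000))

        pairs = Counter(zip(stream, stream[1:]))
        firsts = Counter(stream[:-1])
        n_pairs = sum(pairs.values())

        def joint(a, b):                     # P(a, b): raw frequency of the pair
            return pairs[(a, b)] / n_pairs

        def cond(a, b):                      # P(b | a): how predictive a is of b
            return pairs[(a, b)] / firsts[a]

        print("within unit  : joint =", round(joint("a", "b"), 3),
              " cond =", round(cond("a", "b"), 3))
        print("across units : joint =", round(joint("c", "x"), 3),
              " cond =", round(cond("c", "x"), 3))

    Within a unit the conditional probability is 1.0, while across a boundary it is near 0.5; a learner tracking only joint probability would have to rely on the weaker raw pair frequencies.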

  15. Focus in High School Mathematics: Statistics and Probability

    Science.gov (United States)

    National Council of Teachers of Mathematics, 2009

    2009-01-01

    Reasoning about and making sense of statistics and probability are essential to students' future success. This volume belongs to a series that supports National Council of Teachers of Mathematics' (NCTM's) "Focus in High School Mathematics: Reasoning and Sense Making" by providing additional guidance for making reasoning and sense making part of…

  16. Refresher Course on Probability, Statistics and Stochastic Processes

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 8, Issue 4, April 2003, p. 65. Full text available as PDF.

  17. Truth, possibility and probability new logical foundations of probability and statistical inference

    CERN Document Server

    Chuaqui, R

    1991-01-01

    Anyone involved in the philosophy of science is naturally drawn into the study of the foundations of probability. Different interpretations of probability, based on competing philosophical ideas, lead to different statistical techniques, and frequently to mutually contradictory consequences. This unique book presents a new interpretation of probability, rooted in the traditional interpretation that was current in the 17th and 18th centuries. Mathematical models are constructed based on this interpretation, and statistical inference and decision theory are applied, including some examples in artificial intelligence, solving the main foundational problems. Nonstandard analysis is extensively developed for the construction of the models and in some of the proofs. Many nonstandard theorems are proved, some of them new, in particular, a representation theorem that asserts that any stochastic process can be approximated by a process defined over a space with equiprobable outcomes.

  18. Algebraic Probability, Classical Stochastic Processes, and Counting Statistics

    Science.gov (United States)

    Ohkubo, Jun

    2013-08-01

    We study a connection between the algebraic probability and classical stochastic processes described by master equations. Introducing a definition of a state which has not been used for quantum cases, the classical stochastic processes can be reformulated in terms of the algebraic probability. This reformulation immediately gives the Doi–Peliti formalism, which has been frequently used in nonequilibrium physics. As an application of the reformulation, we give a derivation of basic equations for counting statistics, which plays an important role in nonequilibrium physics.

  19. Full counting statistics for noninteracting fermions: joint probability distributions.

    Science.gov (United States)

    Inhester, L; Schönhammer, K

    2009-11-25

    The joint probability distribution in the full counting statistics (FCS) for noninteracting electrons is discussed for an arbitrary number of initially separate subsystems which are connected at t = 0 and separated again at a later time. A simple method to obtain the leading-order long-time contribution to the logarithm of the characteristic function is presented which simplifies earlier approaches. New explicit results for the determinant involving the scattering matrices are found. The joint probability distribution for the charges in two leads is discussed for Y junctions and dots connected to four leads.

  20. Full counting statistics for noninteracting fermions: joint probability distributions

    Energy Technology Data Exchange (ETDEWEB)

    Inhester, L; Schoenhammer, K [Institut fuer Theoretische Physik, Universitaet Goettingen, Friedrich-Hund-Platz 1, D-37077 Goettingen (Germany)

    2009-11-25

    The joint probability distribution in the full counting statistics (FCS) for noninteracting electrons is discussed for an arbitrary number of initially separate subsystems which are connected at t = 0 and separated again at a later time. A simple method to obtain the leading-order long-time contribution to the logarithm of the characteristic function is presented which simplifies earlier approaches. New explicit results for the determinant involving the scattering matrices are found. The joint probability distribution for the charges in two leads is discussed for Y junctions and dots connected to four leads.

  1. Ladar range image denoising by a nonlocal probability statistics algorithm

    Science.gov (United States)

    Xia, Zhi-Wei; Li, Qi; Xiong, Zhi-Peng; Wang, Qi

    2013-01-01

    According to the characteristics of range images of coherent ladar and on the basis of nonlocal means (NLM), a nonlocal probability statistics (NLPS) algorithm is proposed in this paper. The difference is that NLM performs denoising using the mean of the conditional probability distribution function (PDF), while NLPS uses the maximum of the marginal PDF. In the algorithm, similar blocks are found by block matching and form a group. Pixels in the group are analyzed by probability statistics, and the gray value with maximum probability is used as the estimated value of the current pixel. The simulated range images of coherent ladar with different carrier-to-noise ratios and a real range image of coherent ladar with 8 gray-scales are denoised by this algorithm, and the results are compared with those of the median filter, multitemplate order mean filter, NLM, median nonlocal mean filter and its incorporation of anatomical side information, and the unsupervised information-theoretic adaptive filter. The range abnormality noise and Gaussian noise in range images of coherent ladar are effectively suppressed by NLPS.
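
    A toy one-dimensional sketch of the idea as described in the abstract (the actual algorithm operates on 2-D range images and is not reproduced here): for each pixel, similar patches are grouped by block matching, and the estimate is the gray value of maximum probability in the group's histogram rather than an NLM-style weighted mean. Patch size, group size and bin count are arbitrary choices.

        import numpy as np

        def nlps_denoise_1d(img, patch=5, top_k=24, bins=16):
            """Toy NLPS analogue: replace each pixel by the mode (maximum of the
            marginal histogram) of the centre pixels of its most similar patches."""
            half = patch // 2
            pad = np.pad(img, half, mode="reflect")
            windows = np.lib.stride_tricks.sliding_window_view(pad, patch)
            out = np.empty_like(img, dtype=float)
            for i, w in enumerate(windows):
                d = np.sum((windows - w) ** 2, axis=1)       # block matching (SSD)
                group = img[np.argsort(d)[:top_k]]           # similar patch centres
                hist, edges = np.histogram(group, bins=bins)
                j = np.argmax(hist)                          # most probable gray value
                out[i] = 0.5 * (edges[j] + edges[j + 1])
            return out

        rng = np.random.default_rng(2)
        clean = np.linspace(0.0, 7.0, 256)                   # smooth 8-level ramp
        noisy = clean + rng.normal(0.0, 0.3, clean.size)
        noisy[rng.choice(clean.size, 10, replace=False)] = 7.0   # range anomalies
        print("MSE noisy:", round(float(np.mean((noisy - clean) ** 2)), 4))
        print("MSE NLPS :", round(float(np.mean((nlps_denoise_1d(noisy) - clean) ** 2)), 4))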

  2. Emission probability and photon statistics of a coherently driven mazer

    Energy Technology Data Exchange (ETDEWEB)

    Xiong Jin [Department of Physics, Shanghai Jiao Tong University, Shanghai (China)]. E-mail: xiongjint@hotmail.com; Zhang Zhiming [Department of Physics, Shanghai Jiao Tong University, Shanghai (China)

    2002-05-14

    The idea of a mazer is put forward with particular reference to the question of driving-induced atomic coherence, which is established by a coherent driving field. The interaction of a quantum cavity field with an ultracold V-type three-level atom, in which two levels are coupled by a coherent driving field, is derived. Its general quantum theory is established, and the atomic emission probability and photon statistics are calculated and analysed. It is found that the mazer based on this driving-induced atomic coherence shows new features. There is a non-vanishing probability for the atom to emit a photon in the cavity even when the resonance condition is not fulfilled (here the resonance condition means that the cavity length is an integer multiple of half the atomic de Broglie wavelength). Under the resonance condition, the atomic emission probability has two sets of resonance peaks. For a very strong coherent driving field, the emission of the atom can be forbidden. As to the photon statistics, when the driving field is not very strong, the driving-induced atomic coherence reduces the photon number fluctuations of the cavity field, and the photon statistics exhibits strong sub-Poissonian behaviour; in the region considered here, it can even be sub-Poissonian for any cavity length. However, when the driving field is too strong, the sub-Poissonian property may disappear.

  3. A new statistical methodology predicting chip failure probability considering electromigration

    Science.gov (United States)

    Sun, Ted

    In this research thesis, we present a new approach to analyzing chip reliability subject to electromigration (EM); the fundamental causes of EM and the EM phenomena that occur in different materials are also presented. The new approach exploits the statistical nature of EM failure in order to assess overall EM risk. It incorporates within-die temperature variations from the chip's temperature map, extracted by an Electronic Design Automation (EDA) tool, to estimate the failure probability of a design. Both the power estimation and the thermal analysis are performed in the EDA flow. We first used the traditional EM approach to analyze a design involving 6 metal and 5 via layers, assuming a single temperature across the entire chip, and then repeated the same traditional analysis with a realistic temperature map. The traditional EM analysis, the analysis coupled with a temperature map, and a comparison between the two sets of results are presented in this research. The comparison confirms that using a temperature map yields a less pessimistic estimation of the chip's EM risk. Finally, we employed the statistical methodology we developed, considering a temperature map and different use-condition voltages and frequencies, to estimate the overall failure probability of the chip. The statistical model scales the traditional Black equation across four major use conditions, and the resulting comparisons are within our expectations. The results of this statistical analysis confirm that the chip-level failure probability is higher (i) at higher use-condition frequencies for all use-condition voltages, and (ii) when a single temperature instead of a temperature map across the chip is considered. The thesis begins with an overall review of current design types, common flows, and the verification and reliability-checking steps used in the IC design industry.
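
    The abstract's core ingredients, Black's equation for the median time to failure and a per-segment failure distribution aggregated over a temperature map, can be sketched as follows. Black's equation itself is standard (MTTF = A j^-n exp(Ea/kT)); every constant, current density, temperature and the lognormal spread below is illustrative, not taken from the thesis.

        import numpy as np
        from math import erf, exp, log, sqrt

        K_B = 8.617e-5                                   # Boltzmann constant, eV/K

        def mttf_black(j, T, A=1e-6, n=2.0, Ea=0.9):
            """Black's equation: median time to failure of one interconnect
            segment (hours) at current density j (MA/cm^2) and temperature T (K)."""
            return A * j ** (-n) * exp(Ea / (K_B * T))

        def p_fail(t, t50, sigma=0.5):
            """Lognormal CDF: probability that a segment has failed by time t."""
            return 0.5 * (1.0 + erf(log(t / t50) / (sigma * sqrt(2.0))))

        rng = np.random.default_rng(3)
        j = rng.uniform(0.2, 1.0, size=1000)             # per-segment current density
        T_worst = np.full(1000, 398.0)                   # single worst-case temperature
        T_map = rng.uniform(358.0, 398.0, size=1000)     # within-die temperature map

        t = 10 * 8760.0                                  # ten years, in hours
        for label, T in (("single worst-case T", T_worst), ("temperature map", T_map)):
            survive = np.prod([1.0 - p_fail(t, mttf_black(ji, Ti))
                               for ji, Ti in zip(j, T)])
            print(f"{label:20s} chip failure probability = {1.0 - survive:.3e}")

    Treating every segment at the worst-case hotspot temperature inflates the chip-level failure probability; the map concentrates the risk in the genuinely hot segments, which is the "less pessimistic" effect reported above.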

  4. Diffusion-based population statistics using tract probability maps.

    Science.gov (United States)

    Wassermann, Demian; Kanterakis, Efstathios; Gur, Ruben C; Deriche, Rachid; Verma, Ragini

    2010-01-01

    We present a novel technique for the tract-based statistical analysis of diffusion imaging data. In our technique, we represent each white matter (WM) tract as a tract probability map (TPM): a function mapping a point to its probability of belonging to the tract. We start by automatically clustering the tracts identified in the brain via tractography into TPMs using a novel Gaussian process framework. Then, each tract is modeled by the skeleton of its TPM, a medial representation with a tubular or sheet-like geometry. The appropriate geometry for each tract is implicitly inferred from the data instead of being selected a priori, as is done by current tract-specific approaches. The TPM representation makes it possible to average diffusion imaging based features along directions locally perpendicular to the skeleton of each WM tract, increasing the sensitivity and specificity of statistical analyses on the WM. Our framework therefore facilitates the automated analysis of WM tract bundles, and enables the quantification and visualization of tract-based statistical differences between groups. We have demonstrated the applicability of our framework by studying WM differences between 34 schizophrenia patients and 24 healthy controls.

  5. On Dobrushin's way from probability theory to statistical physics

    CERN Document Server

    Minlos, R A; Suhov, Yu M; Suhov, Yu

    2000-01-01

    R. Dobrushin worked in several branches of mathematics (probability theory, information theory), but his deepest influence was on mathematical physics. He was one of the founders of the rigorous study of statistical physics. When Dobrushin began working in that direction in the early sixties, only a few people worldwide were thinking along the same lines. Now there is an army of researchers in the field. This collection is devoted to the memory of R. L. Dobrushin. The authors who contributed to this collection knew him quite well and were his colleagues. The title, "On Dobrushin's Way", is mea

  6. Applications of Statistics and Probability in Civil Engineering

    DEFF Research Database (Denmark)

    Faber, Michael Havbro

    This volume contains the proceedings of the 11th International Conference on Applications of Statistics and Probability in Civil Engineering (ICASP11, Zürich, Switzerland, 1-4 August 2011). The book focuses not only on the more traditional technical issues, but also emphasizes the societal context … and reliability in engineering; to professionals and engineers, including insurance and consulting companies working with natural hazards, design, operation and maintenance of civil engineering and industrial facilities; and to decision makers and professionals in the public sector, including nongovernmental…

  7. Statistical Validation of Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Veld, Aart A. van' t; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)

    2012-09-01

    Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.
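
    A simplified sketch of the validation machinery (single cross-validation plus a permutation test; the paper uses repeated double cross-validation), with scikit-learn's L1-penalized logistic regression standing in for LASSO-based NTCP modeling. The synthetic data and all settings are placeholders.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import StratifiedKFold, cross_val_score

        rng = np.random.default_rng(0)
        n, p = 120, 30                         # patients, candidate predictors
        X = rng.normal(size=(n, p))
        y = (X[:, 0] + 0.8 * X[:, 1] + rng.normal(size=n) > 0).astype(int)

        lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
        cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
        auc = cross_val_score(lasso, X, y, cv=cv, scoring="roc_auc").mean()

        # Permutation test: null distribution of the AUC under shuffled labels
        null = [cross_val_score(lasso, X, rng.permutation(y), cv=cv,
                                scoring="roc_auc").mean() for _ in range(200)]
        p_val = (1 + sum(a >= auc for a in null)) / (1 + len(null))
        print(f"cross-validated AUC = {auc:.3f}, permutation p = {p_val:.3f}")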

  8. Unnormalized probability: A different view of statistical mechanics

    Science.gov (United States)

    Swendsen, Robert H.

    2014-10-01

    All teachers and students of physics have absorbed the doctrine that probability must be normalized. Nevertheless, there are problems for which the normalization factor only gets in the way. An important example of this counter-intuitive assertion is provided by the derivation of the thermodynamic entropy from the principles of statistical mechanics. Unnormalized probabilities provide a surprisingly effective teaching tool that can make it easier to explain to students the essential concept of entropy. The elimination of the normalization factor offers simpler equations for thermodynamic equilibrium in statistical mechanics, which then lead naturally to a new and simpler definition of the entropy in thermodynamics. Notably, this definition does not change the formal expression of the entropy based on composite systems that I have previously offered. My previous definition of entropy has been criticized by Dieks, based on what appears to be a misinterpretation. I believe that the new definition presented here has the advantage of greatly reducing the possibility of such a misunderstanding—either by students or by experts.

  9. Development of the Interactive Statistical Tutorial Package (ISTP) for Learning Mathematical Concepts of Probability and Statistics.

    Science.gov (United States)

    1986-09-01

    …the full breadth of elementary statistical theory, its presumption of statistical naivete on the part of users severely limits its pedagogical… be accommodated. Although plotting the shape of probability distributions involves elementary graphical functions of S such as plot(x,y) for a two… # weibull probability distribution: ifelse(x != 0, exp(log(a) + (a-1)*log(x) - (x/b)^a - a*log(b)), ifelse(a > 1, 0, ifelse(a == 1, 1/b, dgamma(0, 1…

  10. Practical Statistics for LHC Physicists: Descriptive Statistics, Probability and Likelihood (1/3)

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    These lectures cover those principles and practices of statistics that are most relevant for work at the LHC. The first lecture discusses the basic ideas of descriptive statistics, probability and likelihood. The second lecture covers the key ideas in the frequentist approach, including confidence limits, profile likelihoods, p-values, and hypothesis testing. The third lecture covers inference in the Bayesian approach. Throughout, real-world examples will be used to illustrate the practical application of the ideas. No previous knowledge is assumed.

  11. Survival probability and order statistics of diffusion on disordered media.

    Science.gov (United States)

    Acedo, L; Yuste, S B

    2002-07-01

    We investigate the first passage time t(j,N) to a given chemical or Euclidean distance of the first j of a set of N>1 independent random walkers all initially placed on a site of a disordered medium. To solve this order-statistics problem we assume that, for short times, the survival probability (the probability that a single random walker is not absorbed by a hyperspherical surface during some time interval) decays for disordered media in the same way as for Euclidean and some class of deterministic fractal lattices. This conjecture is checked by simulation on the incipient percolation aggregate embedded in two dimensions. Arbitrary moments of t(j,N) are expressed in terms of an asymptotic series in powers of 1/ln N, which is formally identical to those found for Euclidean and (some class of) deterministic fractal lattices. The agreement of the asymptotic expressions with simulation results for the two-dimensional percolation aggregate is good when the boundary is defined in terms of the chemical distance. The agreement worsens slightly when the Euclidean distance is used.
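
    The quantity t(j,N) can be illustrated with a sketch on a plain 1-D Euclidean lattice (not a disordered medium): N walkers start at the origin and we record when the j-th of them first reaches distance L. Consistent with the asymptotics quoted above, the mean first-passage time of the fastest walker shrinks roughly like 1/ln N.

        import numpy as np

        def t_j_of_N(N, L, j, rng):
            """Time at which the j-th of N independent +/-1 random walkers,
            all started at the origin, first reaches distance L."""
            pos = np.zeros(N, dtype=int)
            alive = np.ones(N, dtype=bool)
            t = hits = 0
            while hits < j:
                t += 1
                pos[alive] += rng.choice((-1, 1), size=alive.sum())
                arrived = alive & (np.abs(pos) >= L)
                hits += int(arrived.sum())
                alive &= ~arrived
            return t

        rng = np.random.default_rng(4)
        L = 20
        for N in (4, 16, 64, 256):
            mean_t = np.mean([t_j_of_N(N, L, 1, rng) for _ in range(200)])
            print(f"N = {N:4d}   mean t(1,N) = {mean_t:7.1f}   L^2/ln N = {L**2/np.log(N):6.1f}")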

  12. Do doctors need statistics? Doctors' use of and attitudes to probability and statistics.

    Science.gov (United States)

    Swift, Louise; Miles, Susan; Price, Gill M; Shepstone, Lee; Leinster, Sam J

    2009-07-10

    There is little published evidence on what doctors do in their work that requires probability and statistics, yet the General Medical Council (GMC) requires new doctors to have these skills. This study investigated doctors' use of and attitudes to probability and statistics with a view to informing undergraduate teaching. An email questionnaire was sent to 473 clinicians with an affiliation to the University of East Anglia's Medical School. Of 130 respondents, approximately 90 per cent of doctors who performed each of the following activities found probability and statistics useful for that activity: accessing clinical guidelines and evidence summaries, explaining levels of risk to patients, assessing medical marketing and advertising material, interpreting the results of a screening test, reading research publications for general professional interest, and using research publications to explore non-standard treatment and management options. Seventy-nine per cent (103/130, 95 per cent CI 71 per cent, 86 per cent) of participants considered probability and statistics important in their work. Sixty-three per cent (78/124, 95 per cent CI 54 per cent, 71 per cent) said that there were activities that they could do better or start doing if they had an improved understanding of these areas, and 74 of these participants elaborated on this. Themes highlighted by participants included: being better able to critically evaluate other people's research; becoming more research-active; having a better understanding of risk; and being better able to explain things to, or teach, other people. Our results can be used to inform how probability and statistics should be taught to medical undergraduates and should encourage today's medical students of the subjects' relevance to their future careers.
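
    The reported interval "79 per cent (103/130, 95 per cent CI 71 per cent, 86 per cent)" can be roughly reproduced with a standard large-sample proportion interval; the exact method the authors used is not stated, so small differences remain.

        from math import sqrt

        k, n = 103, 130
        p = k / n
        half = 1.96 * sqrt(p * (1 - p) / n)          # Wald 95% half-width
        print(f"{100*p:.0f}% (95% CI {100*(p-half):.0f}%, {100*(p+half):.0f}%)")
        # prints: 79% (95% CI 72%, 86%), close to the published 71%-86%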

  13. Random phenomena fundamentals of probability and statistics for engineers

    CERN Document Server

    Ogunnaike, Babatunde A

    2009-01-01

    Prelude: Approach Philosophy; Four Basic Principles. I Foundations: Two Motivating Examples; Yield Improvement in a Chemical Process; Quality Assurance in a Glass Sheet Manufacturing Process; Outline of a Systematic Approach; Random Phenomena, Variability, and Uncertainty; Two Extreme Idealizations of Natural Phenomena; Random Mass Phenomena; Introducing Probability; The Probabilistic Framework. II Probability: Fundamentals of Probability Theory; Building Blocks; Operations; Probability; Conditional Probability; Independence; Random Variables and Distributions; Distributions; Mathematical Expectation; Characterizing Distributions; Special Derived Probability Functions; Multidimensional Random Variables; Distributions of Several Random Variables; Distributional Characteristics of Jointly Distributed Random Variables; Random Variable Transformations; Single Variable Transformations; Bivariate Transformations; General Multivariate Transformations; Application Case Studies I: Probability; Mendel and Heredity; World War II Warship Tactical Response Under Attack. III Distributions: Ide...

  14. Probability and Statistics Questions and Tests : a critical analysis

    Directory of Open Access Journals (Sweden)

    Fabrizio Maturo

    2015-06-01

    Full Text Available In probability and statistics courses, a popular method for the evaluation of students is to assess them using multiple choice tests. The use of these tests makes it possible to evaluate certain types of skills, such as fast response, short-term memory, mental clarity and the ability to compete. In our opinion, verification through testing can certainly be useful for the analysis of certain aspects and for speeding up the assessment process, but we should be aware of the limitations of such a standardized procedure and therefore rule out that the assessment of pupils, classes and schools can be reduced to the processing of test results. To support this thesis, this article discusses the main limitations of tests in detail, presents some recent models proposed in the literature, and suggests some alternative assessment methods. Keywords: item response theory, assessment, tests, probability.

  15. Statistical learning of action: The role of conditional probability

    National Research Council Canada - National Science Library

    Meyer, Meredith; Baldwin, Dare

    2011-01-01

    … Such segmentation may rest in part on statistical learning. In a series of four experiments, we examined what types of statistics people can use to segment a continuous stream involving many brief, goal-directed action elements…

  16. What's Missing in Teaching Probability and Statistics: Building Cognitive Schema for Understanding Random Phenomena

    Science.gov (United States)

    Kuzmak, Sylvia

    2016-01-01

    Teaching probability and statistics is more than teaching the mathematics itself. Historically, the mathematics of probability and statistics was first developed through analyzing games of chance such as the rolling of dice. This article makes the case that the understanding of probability and statistics is dependent upon building a…

  17. Statistical computation of Boltzmann entropy and estimation of the optimal probability density function from statistical sample

    Science.gov (United States)

    Sui, Ning; Li, Min; He, Ping

    2014-12-01

    In this work, we investigate the statistical computation of the Boltzmann entropy of statistical samples. For this purpose, we use both histograms and kernel functions to estimate the probability density function of statistical samples. We find that, due to coarse-graining, the entropy is a monotonically increasing function of the bin width for histogram estimation, or of the bandwidth for kernel estimation, which makes it difficult to select an optimal bin width or bandwidth for computing the entropy. Fortunately, we notice that there exists a minimum of the first derivative of entropy for both histogram and kernel estimation, and this minimum point of the first derivative asymptotically points to the optimal bin width or bandwidth. We have verified these findings through extensive numerical experiments. Hence, we suggest that the minimum of the first derivative of entropy be used as a selector for the optimal bin width or bandwidth of density estimation. Moreover, the optimal bandwidth selected by the minimum of the first derivative of entropy is purely data-based and independent of the unknown underlying probability density distribution, which makes it superior to existing estimators. Our results are not restricted to the one-dimensional case, but can also be extended to multivariate cases. It should be emphasized, however, that we do not provide a rigorous mathematical proof of these findings, and we leave these issues to those who are interested in them.
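
    A sketch of the histogram version of this selector on Gaussian data, where the exact differential entropy 0.5*ln(2*pi*e) is known. The sample size and the grid of candidate bin widths are arbitrary choices, and the kernel version is analogous.

        import numpy as np

        rng = np.random.default_rng(5)
        x = rng.normal(size=5000)

        widths = np.logspace(-2.5, 0.0, 60)          # candidate bin widths
        S = []
        for h in widths:
            edges = np.arange(x.min(), x.max() + h, h)
            counts, _ = np.histogram(x, bins=edges)
            p = counts[counts > 0] / x.size
            S.append(-np.sum(p * np.log(p / h)))     # coarse-grained entropy estimate
        S = np.array(S)

        dS = np.gradient(S, widths)                  # first derivative of entropy
        i = np.argmin(dS)                            # its minimum marks the selection
        print(f"selected bin width = {widths[i]:.3f}")
        print(f"entropy there = {S[i]:.3f}, exact Gaussian value = "
              f"{0.5*np.log(2*np.pi*np.e):.3f}")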

  18. Duality of circulation decay statistics and survival probability

    Science.gov (United States)

    2010-09-01

    Survival probability and circulation decay history have both been used for setting wake turbulence separation standards. Conceptually, a strong correlation should exist between these two characterizations of vortex behavior; however, the literature…

  19. Probability, statistics, and reliability for engineers and scientists

    CERN Document Server

    Ayyub, Bilal M

    2012-01-01

    Introduction: Introduction; Knowledge, Information, and Opinions; Ignorance and Uncertainty; Aleatory and Epistemic Uncertainties in System Abstraction; Characterizing and Modeling Uncertainty; Simulation for Uncertainty Analysis and Propagation; Simulation Projects. Data Description and Treatment: Introduction; Classification of Data; Graphical Description of Data; Histograms and Frequency Diagrams; Descriptive Measures; Applications; Analysis of Simulated Data; Simulation Projects. Fundamentals of Probability: Introduction; Sets, Sample Spaces, and Events; Mathematics of Probability; Random Variables and Their Proba

  20. Sufficient Statistics for Divergence and the Probability of Misclassification

    Science.gov (United States)

    Quirein, J.

    1972-01-01

    One particular aspect of the feature selection problem is considered, namely that which results from the transformation x = Bz, where B is a k by n matrix of rank k and k ≤ n. It is shown that, in general, such a transformation results in a loss of information. In terms of the divergence, this is equivalent to the fact that the average divergence computed using the variable x is less than or equal to the average divergence computed using the variable z. A loss of information in terms of the probability of misclassification is shown to be equivalent to the fact that the probability of misclassification computed using the variable x is greater than or equal to that computed using the variable z. First, the necessary facts relating k-dimensional and n-dimensional integrals are derived. Then the mentioned results about the divergence and probability of misclassification are derived. Finally, it is shown that if no information is lost (in x = Bz) as measured by the divergence, then no information is lost as measured by the probability of misclassification.
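
    For Gaussian class-conditional densities the (symmetric Kullback) divergence has a closed form, and the information loss under x = Bz can be checked directly; this sketch uses random illustrative parameters.

        import numpy as np

        def j_divergence(m1, S1, m2, S2):
            """Symmetric Kullback divergence between N(m1, S1) and N(m2, S2)."""
            P1, P2 = np.linalg.inv(S1), np.linalg.inv(S2)
            dm = m1 - m2
            d = S1.shape[0]
            return 0.5 * np.trace(S1 @ P2 + S2 @ P1 - 2 * np.eye(d)) \
                 + 0.5 * dm @ (P1 + P2) @ dm

        rng = np.random.default_rng(6)
        n, k = 6, 3
        m1, m2 = rng.normal(size=n), rng.normal(size=n)
        A1, A2 = rng.normal(size=(n, n)), rng.normal(size=(n, n))
        S1, S2 = A1 @ A1.T + n * np.eye(n), A2 @ A2.T + n * np.eye(n)

        B = rng.normal(size=(k, n))            # rank-k feature extractor x = Bz
        Dz = j_divergence(m1, S1, m2, S2)
        Dx = j_divergence(B @ m1, B @ S1 @ B.T, B @ m2, B @ S2 @ B.T)
        print(f"divergence in z: {Dz:.3f}  >=  divergence in x: {Dx:.3f}")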

  1. Measurement of zero-count probability in photoelectron statistics.

    Science.gov (United States)

    Basano, L; Ottonello, P

    1982-10-15

    The probability of zero counts P0(T) (as a function of the counting interval T) is one of the most interesting functions characterizing a light field. Experimentally, P0(T) is usually obtained by successively measuring the zero-count probability for a set of different intervals. This procedure exposes the measurement of P0(T) to errors attributable to drift. We present a simple zero-counter which is essentially free from drift effects and displays P0(T) directly on the CRT of an oscilloscope for sixteen values of T. Another advantage of the instrument is a considerable reduction of the overall measuring time.
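
    What the instrument measures can be emulated numerically: from a single record of photon arrival times (Poisson statistics here, i.e., an ideal coherent field), P0(T) is estimated for sixteen counting intervals at once and compared with the expected exp(-rT). The rate and record length are arbitrary.

        import numpy as np

        rng = np.random.default_rng(7)
        rate, T_rec = 5.0, 2000.0                     # count rate, record length
        gaps = rng.exponential(1.0 / rate, int(1.5 * rate * T_rec))
        arrivals = np.cumsum(gaps)
        arrivals = arrivals[arrivals < T_rec]

        for T in np.linspace(0.05, 0.8, 16):          # sixteen counting intervals
            counts, _ = np.histogram(arrivals, bins=np.arange(0.0, T_rec, T))
            print(f"T = {T:5.2f}   P0(T) = {np.mean(counts == 0):.3f}"
                  f"   exp(-rate*T) = {np.exp(-rate * T):.3f}")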

  2. Log-concave Probability Distributions: Theory and Statistical Testing

    DEFF Research Database (Denmark)

    An, Mark Yuing

    1996-01-01

    This paper studies the broad class of log-concave probability distributions that arise in economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing the differentiability of density functions. Discrete...

  3. Limiting values of large deviation probabilities of quadratic statistics

    NARCIS (Netherlands)

    Jeurnink, Gerardus A.M.; Kallenberg, W.C.M.

    1990-01-01

    Application of exact Bahadur efficiencies in testing theory or exact inaccuracy rates in estimation theory needs evaluation of large deviation probabilities. Because of the complexity of the expressions, frequently a local limit of the nonlocal measure is considered. Local limits of large deviation

  4. Bayesian Statistics-The Theory of Inverse Probability

    Indian Academy of Sciences (India)

    Author affiliations: Mohan Delampady, Indian Statistical Institute, 8th Mile, Mysore Road, Bangalore 560 059, India; T. Krishnan, Systat Software Asia-Pacific (P) Ltd., Floor 5, 'C' Tower, Golden Enclave, Airport Road, Bangalore 560 017, India.

  5. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses

    Science.gov (United States)

    Dinov, Ivo D.; Sanchez, Juana; Christou, Nicolas

    2009-01-01

    Technology-based instruction represents a new recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools for communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results of the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction, and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment per individual

  6. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses.

    Science.gov (United States)

    Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas

    2008-01-01

    Technology-based instruction represents a new recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools for communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results of the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction, and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment per individual

  7. On Asymptotically Lacunary Statistical Equivalent Sequences of Order α in Probability

    Directory of Open Access Journals (Sweden)

    Işık Mahmut

    2017-01-01

    Full Text Available In this study, we introduce and examine the concepts of asymptotically lacunary statistical equivalence of order α in probability and strong asymptotically lacunary equivalence of order α in probability. We give some relations connected to these concepts.

  8. On Asymptotically Lacunary Statistical Equivalent Sequences of Order α in Probability

    OpenAIRE

    Işık Mahmut; Akbaş Kübra Elif

    2017-01-01

    In this study, we introduce and examine the concepts of asymptotically lacunary statistical equivalence of order α in probability and strong asymptotically lacunary equivalence of order α in probability. We give some relations connected to these concepts.

  9. Consolidity analysis for fully fuzzy functions, matrices, probability and statistics

    Directory of Open Access Journals (Sweden)

    Walaa Ibrahim Gabr

    2015-03-01

    Full Text Available The paper presents a comprehensive review of the know-how for developing systems consolidity theory for modeling, analysis, optimization and design in a fully fuzzy environment. The development of systems consolidity theory includes its extension to handle new functions of different dimensionalities, fuzzy analytic geometry, fuzzy vector analysis, functions of fuzzy complex variables, ordinary differentiation of fuzzy functions and partial fractions of fuzzy polynomials. On the other hand, the handling of fuzzy matrices covers determinants of fuzzy matrices, the eigenvalues of fuzzy matrices, and the solution of least-squares fuzzy linear equations. The approach is demonstrated to be applicable in a systematic way to new fuzzy probabilistic and statistical problems, including the extension of conventional probabilistic and statistical analysis to handle fuzzy random data. Applications also cover the consolidity of fuzzy optimization problems. The various numerical examples solved demonstrate that the new consolidity concept is highly effective in solving, in a compact form, the propagation of fuzziness in linear, nonlinear, multivariable and dynamic problems with different types of complexities. Finally, it is demonstrated that the suggested fuzzy mathematics can be easily embedded within normal mathematics by building a special fuzzy function library inside the computational Matlab Toolbox or other similar software languages.

  10. Sets, Probability and Statistics: The Mathematics of Life Insurance. [Computer Program.] Second Edition.

    Science.gov (United States)

    King, James M.; And Others

    The materials described here represent the conversion of a highly popular student workbook "Sets, Probability and Statistics: The Mathematics of Life Insurance" into a computer program. The program is designed to familiarize students with the concepts of sets, probability, and statistics, and to provide practice using real life examples. It also…

  11. Probability theory and statistical applications a profound treatise for self-study

    CERN Document Server

    Zörnig, Peter

    2016-01-01

    This accessible and easy-to-read book provides many examples to illustrate diverse topics in probability and statistics, from initial concepts up to advanced calculations. Special attention is devoted, e.g., to the independence of events, inequalities in probability, and functions of random variables. The book is directed to students of mathematics, statistics, engineering, and other quantitative sciences.

  12. Solutions manual to accompany Statistics and probability with applications for engineers and scientists

    CERN Document Server

    Gupta, Bhisham C

    2013-01-01

    A solutions manual to accompany Statistics and Probability with Applications for Engineers and Scientists. Unique among books of this kind, Statistics and Probability with Applications for Engineers and Scientists covers descriptive statistics first, then goes on to discuss the fundamentals of probability theory. Along with case studies, examples, and real-world data sets, the book incorporates clear instructions on how to use the statistical packages Minitab® and Microsoft® Office Excel® to analyze various data sets. The book also features: detailed discussions on sampling distributions, stati

  13. Methods in probability and statistical inference. Progress report, June 15, 1976--June 14, 1977. [Dept. of Statistics, Univ. of Chicago

    Energy Technology Data Exchange (ETDEWEB)

    Perlman, M D

    1977-03-01

    Research activities of the Department of Statistics, University of Chicago, during the period 15 June 1976 to 14 June 1977 are reviewed. Individual projects were carried out in the following eight areas: statistical computing--approximations to statistical tables and functions; numerical computation of boundary-crossing probabilities for Brownian motion and related stochastic processes; probabilistic methods in statistical mechanics; combining independent tests of significance; small-sample efficiencies of tests and estimates; improved procedures for simultaneous estimation and testing of many correlations; statistical computing and improved regression methods; and comparison of several populations. Brief summaries of these projects are given, along with other administrative information. (RWR)

  14. Impact of statistical learning methods on the predictive power of multivariate normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A.; van t Veld, Aart A.

    2012-01-01

    PURPOSE: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator

  15. Analysis of stabilization quality false alarm probability by a radar adaptive detector in order statistics

    OpenAIRE

    Andreev, F. M.; Taranchenko, I. V.

    2007-01-01

    New expressions are derived for estimating the false alarm probability by adaptive threshold detectors, using the method of averaging and method of order statistics, in the presence of nonstationary interference. Potentialities of these devices, from the viewpoint of their stabilization of false alarm probability in the presence of nonstationary interference, are defined, and comparative analysis of the devices is performed.

  16. The exact probability distribution of the rank product statistics for replicated experiments.

    Science.gov (United States)

    Eisinga, Rob; Breitling, Rainer; Heskes, Tom

    2013-03-18

    The rank product method is a widely accepted technique for detecting differentially regulated genes in replicated microarray experiments. To approximate the sampling distribution of the rank product statistic, the original publication proposed a permutation approach, whereas recently an alternative approximation based on the continuous gamma distribution was suggested. However, both approximations are imperfect for estimating small tail probabilities. In this paper we relate the rank product statistic to number theory and provide a derivation of its exact probability distribution and the true tail probabilities.
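
    For small numbers of genes and replicates, the exact null distribution described in the abstract can also be obtained by brute-force convolution, a useful cross-check on any closed-form expression (the paper's number-theoretic derivation is what makes large cases tractable). The sizes below are illustrative.

        from collections import defaultdict

        def rank_product_pmf(n, k):
            """Exact pmf of the product of k independent ranks, each uniform on
            {1, ..., n}: the null distribution of the rank product statistic."""
            pmf = {1: 1.0}
            for _ in range(k):
                nxt = defaultdict(float)
                for value, prob in pmf.items():
                    for r in range(1, n + 1):
                        nxt[value * r] += prob / n
                pmf = nxt
            return pmf

        pmf = rank_product_pmf(n=100, k=3)       # 100 genes, 3 replicates
        tail = sum(p for v, p in pmf.items() if v <= 50)
        print(f"distinct rank products: {len(pmf)}")
        print(f"exact tail probability P(RP <= 50) = {tail:.3e}")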

  17. Frequency, Contingency and Online Processing of Multiword Sequences: An Eye-Tracking Study

    Science.gov (United States)

    Yi, Wei; Lu, Shiyi; Ma, Guojie

    2017-01-01

    Frequency and contingency are two primary statistical factors that drive the acquisition and processing of language. This study explores the role of phrasal frequency and contingency (the co-occurrence probability/statistical association of the constituent words in multiword sequences) during online processing of multiword sequences. Meanwhile, it…

  18. Academic Training Lecture | Practical Statistics for LHC Physicists: Descriptive Statistics, Probability and Likelihood | 7-9 April

    CERN Multimedia

    2015-01-01

    Please note that our next series of Academic Training Lectures will take place on 7, 8 and 9 April 2015: Practical Statistics for LHC Physicists: Descriptive Statistics, Probability and Likelihood, by Harrison Prosper, Florida State University, USA, from 11.00 a.m. to 12.00 p.m. in the Council Chamber (503-1-001). https://indico.cern.ch/event/358542/

  19. Methods in probability and statistical inference. Final report, June 15, 1975-June 30, 1979. [Dept. of Statistics, Univ. of Chicago

    Energy Technology Data Exchange (ETDEWEB)

    Wallace, D L; Perlman, M D

    1980-06-01

    This report describes the research activities of the Department of Statistics, University of Chicago, during the period June 15, 1975 to July 30, 1979. Nine research projects are briefly described on the following subjects: statistical computing and approximation techniques in statistics; numerical computation of first passage distributions; probabilities of large deviations; combining independent tests of significance; small-sample efficiencies of tests and estimates; improved procedures for simultaneous estimation and testing of many correlations; statistical computing and improved regression methods; comparison of several populations; and unbiasedness in multivariate statistics. A description of the statistical consultation activities of the Department that are of interest to DOE, in particular, the scientific interactions between the Department and the scientists at Argonne National Laboratories, is given. A list of publications issued during the term of the contract is included.

  20. Tagging French Without Lexical Probabilities Combining Linguistic Knowledge And Statistical Learning

    CERN Document Server

    Tzoukermann, E; Gale, W A; Tzoukermann, Evelyne; Radev, Dragomir R.; Gale, William A.

    1997-01-01

    This paper explores morpho-syntactic ambiguities for French to develop a strategy for part-of-speech disambiguation that a) reflects the complexity of French as an inflected language, b) optimizes the estimation of probabilities, c) allows the user flexibility in choosing a tagset. The problem in extracting lexical probabilities from a limited training corpus is that the statistical model may not necessarily represent the use of a particular word in a particular context. In a highly morphologically inflected language, this argument is particularly serious since a word can be tagged with a large number of parts of speech. Due to the lack of sufficient training data, we argue against estimating lexical probabilities to disambiguate parts of speech in unrestricted texts. Instead, we use the strength of contextual probabilities along with a feature we call "genotype", a set of tags associated with a word. Using this knowledge, we have built a part-of-speech tagger that combines linguistic and statistical approa...
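
    A miniature sketch of tagging with contextual probabilities plus genotypes only (all counts invented; the paper's actual model, tagset and training corpus are not reproduced): each word contributes the set of tags it can bear, and the tag sequence is chosen by contextual tag-bigram probabilities alone, with no lexical P(word|tag) term.

        import math
        from itertools import product

        # Invented contextual (tag-bigram) probabilities; unseen bigrams smoothed
        P_CTX = {("DET", "NOUN"): 0.6, ("DET", "ADJ"): 0.3, ("ADJ", "NOUN"): 0.7,
                 ("NOUN", "VERB"): 0.5, ("VERB", "DET"): 0.4}

        def ctx(prev, cur):
            return P_CTX.get((prev, cur), 0.01)

        # Genotype: the set of tags a word form can bear, taken from a lexicon
        GENOTYPE = {"la": {"DET"}, "belle": {"ADJ", "NOUN"},
                    "ferme": {"ADJ", "NOUN", "VERB"}, "porte": {"NOUN", "VERB"}}

        def tag(words):
            """Choose, among the tag sequences licensed by the genotypes, the one
            maximizing the product of contextual probabilities (brute force; a
            Viterbi lattice computes the same in linear time)."""
            best, best_lp = None, -math.inf
            for seq in product(*(sorted(GENOTYPE[w]) for w in words)):
                lp = sum(math.log(ctx(a, b))
                         for a, b in zip(("START",) + seq, seq))
                if lp > best_lp:
                    best, best_lp = seq, lp
            return list(zip(words, best))

        print(tag(["la", "belle", "ferme", "porte"]))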

  1. A new expression of the probability distribution in Incomplete Statistics and fundamental thermodynamic relations

    Energy Technology Data Exchange (ETDEWEB)

    Huang Zhifu [Department of Physics, Xiamen University, Xiamen 361005 (China); Lin Bihong [Department of Physics, Xiamen University, Xiamen 361005 (China); Department of Physics, Quanzhou Normal University, Quanzhou 362000 (China); Chen Jincan [Department of Physics, Xiamen University, Xiamen 361005 (China)], E-mail: jcchen@xmu.edu.cn

    2009-05-15

    In order to overcome the limitations of the original expression of the probability distribution appearing in the literature on Incomplete Statistics, a new expression of the probability distribution is derived, where the Lagrange multiplier β introduced here is proved to be identical with that introduced in the second and third choices for the internal energy constraint in Tsallis' statistics, and to be just equal to the physical inverse temperature. It is expounded that the probability distribution described by the new expression is invariant under uniform translation of the energy spectrum. Moreover, several fundamental thermodynamic relations are given, and the relationship between the new and the original expressions of the probability distribution is discussed.

  2. 6th International Conference on Soft Methods in Probability and Statistics

    CERN Document Server

    Berthold, Michael; Moewes, Christian; Gil, María; Grzegorzewski, Przemysław; Hryniewicz, Olgierd; Synergies of Soft Computing and Statistics for Intelligent Data Analysis

    2013-01-01

    In recent years there has been a growing interest in extending classical methods for data analysis. The aim is to allow a more flexible modeling of phenomena such as uncertainty, imprecision or ignorance. Such extensions of classical probability theory and statistics are useful in many real-life situations, since uncertainties in data are not only present in the form of randomness: various types of incomplete or subjective information have to be handled. About twelve years ago the idea of strengthening the dialogue between the various research communities in the field of data analysis was born and resulted in the International Conference Series on Soft Methods in Probability and Statistics (SMPS). This book gathers contributions presented at SMPS'2012, held in Konstanz, Germany. Its aim is to present recent results illustrating new trends in intelligent data analysis. It gives a comprehensive overview of current research into the fusion of soft computing methods with probability and statistics. Synergies o...

  3. Statistics and Probability at Secondary Schools in the Federal State of Salzburg: An Empirical Study

    Directory of Open Access Journals (Sweden)

    Wolfgang Voit

    2014-12-01

    Full Text Available Knowledge about the practical use of statistics and probability in today's mathematics instruction at secondary schools is vital for improving the academic education of future teachers. We conducted an empirical study among school teachers to inform improved mathematics instruction and teacher preparation. The study provides a snapshot of the daily practice of instruction at school. The status of statistics and probability was examined around the following four questions: Where did the current mathematics teachers study? What relevance do statistics and probability have in school? Which contents are actually taught in class? What kind of continuing education would be desirable for teachers? The study population consisted of all teachers of mathematics at secondary schools in the federal state of Salzburg.

  4. Central limit theorems for Sinkhorn divergence between probability distributions on finite spaces and statistical applications

    OpenAIRE

    Bigot, Jérémie; Cazelles, Elsa; Papadakis, Nicolas

    2017-01-01

    The notion of Sinkhorn divergence has recently gained popularity in machine learning and statistics, as it makes feasible the use of smoothed optimal transportation distances for data analysis. The Sinkhorn divergence allows the fast computation of an entropically regularized Wasserstein distance between two probability distributions supported on a finite metric space of (possibly) high-dimension. For data sampled from one or two unknown probability distributions, we derive central limit theo...
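
    The core computation behind the Sinkhorn divergence is the entropically regularized transport cost between two discrete distributions on a finite metric space, obtained by alternating diagonal scalings; the divergence itself adds debiasing terms built from the same routine. A minimal sketch follows (problem size and regularization strength are arbitrary).

        import numpy as np

        def sinkhorn_cost(a, b, C, eps=0.05, iters=500):
            """Entropically regularized OT between discrete distributions a and b
            with cost matrix C, via Sinkhorn's alternating scaling iterations."""
            K = np.exp(-C / eps)
            u = np.ones_like(a)
            for _ in range(iters):
                v = b / (K.T @ u)
                u = a / (K @ v)
            P = u[:, None] * K * v[None, :]          # transport plan
            return float((P * C).sum()), P

        x = np.linspace(0.0, 1.0, 8)                 # finite metric space
        C = (x[:, None] - x[None, :]) ** 2           # squared-distance costs
        a = np.full(8, 1.0 / 8.0)
        b = np.array([0.30, 0.20, 0.20, 0.10, 0.10, 0.05, 0.03, 0.02])

        cost, P = sinkhorn_cost(a, b, C)
        print(f"regularized transport cost = {cost:.4f}")
        print(f"max marginal error = {np.abs(P.sum(axis=0) - b).max():.2e}")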

  5. 8th International Conference on Soft Methods in Probability and Statistics

    CERN Document Server

    Giordani, Paolo; Vantaggi, Barbara; Gagolewski, Marek; Gil, María; Grzegorzewski, Przemysław; Hryniewicz, Olgierd

    2017-01-01

    This proceedings volume is a collection of peer reviewed papers presented at the 8th International Conference on Soft Methods in Probability and Statistics (SMPS 2016) held in Rome (Italy). The book is dedicated to Data science which aims at developing automated methods to analyze massive amounts of data and to extract knowledge from them. It shows how Data science employs various programming techniques and methods of data wrangling, data visualization, machine learning, probability and statistics. The soft methods proposed in this volume represent a collection of tools in these fields that can also be useful for data science.

  6. Relevant Aspects on Teaching Probability and Statistics in Basic General Education

    Directory of Open Access Journals (Sweden)

    Marianela Alpízar-Vargas

    2012-08-01

    Probability and statistics have been included in the Basic General Education curricula by the Ministry of Public Education (Costa Rica) since 1995. To analyze the teaching reality in these fields, a study was conducted in two educational regions of the country: Heredia and Pérez Zeledón. The survey covered the university training and continuing-education processes of teachers teaching statistics and probability in schools. The research revealed the limited university training in these fields, teachers' dissatisfaction with it, and the weak support that training institutions give to their professional practice.

  7. 7th International Conference on Soft Methods in Probability and Statistics – SMPS 2014

    CERN Document Server

    Gagolewski, Marek; Hryniewicz, Olgierd; Gil, María; Strengthening Links between Data Analysis and Soft Computing

    2015-01-01

    This book gathers contributions presented at the 7th International Conference on Soft Methods in Probability and Statistics SMPS 2014, held in Warsaw (Poland) on September 22-24, 2014. Its aim is to present recent results illustrating new trends in intelligent data analysis. It gives a comprehensive overview of current research into the fusion of soft computing methods with probability and statistics. Synergies of both fields might improve intelligent data analysis methods in terms of robustness to noise and applicability to larger datasets, while being able to efficiently obtain understandable solutions of real-world problems.

  8. Methods in probability and statistical inference. Progress report, June 1975--June 14, 1976. [Dept. of Statistics, Univ. of Chicago

    Energy Technology Data Exchange (ETDEWEB)

    Perlman, M D

    1976-03-01

    Efficient methods for approximating percentage points of the largest characteristic root of a Wishart matrix, and other statistical quantities of interest, were developed. Fitting of non-additive models to two-way and higher-way tables and the further development of the SNAP statistical computing system were reported. Numerical procedures for computing boundary-crossing probabilities for Brownian motion and other stochastic processes, such as Bessel diffusions, were implemented. Mathematical techniques from statistical mechanics were applied to obtain a unified treatment of probabilities of large deviations of the sample, in the setting of general topological vector spaces. The application of the Martin boundary to questions about infinite particle systems was studied. A comparative study of classical "omnibus" and Bayes procedures for combining several independent noncentral chi-square test statistics was completed. Work proceeds on the related problem of combining noncentral F-tests. A numerical study of the small-sample powers of the Pearson chi-square and likelihood ratio tests for multinomial goodness-of-fit was made. The relationship between asymptotic (large-sample) efficiency of test statistics, as measured by Bahadur's concept of exact slope, and actual small-sample efficiency was studied. A promising new technique for the simultaneous estimation of all correlation coefficients in a multivariate population was developed. The method adapts the James-Stein "shrinking" estimator (for location parameters) to the estimation of correlations.

  9. Evidence-Based Medicine as a Tool for Undergraduate Probability and Statistics Education.

    Science.gov (United States)

    Masel, J; Humphrey, P T; Blackburn, B; Levine, J A

    2015-01-01

    Most students have difficulty reasoning about chance events, and misconceptions regarding probability can persist or even strengthen following traditional instruction. Many biostatistics classes sidestep this problem by prioritizing exploratory data analysis over probability. However, probability itself, in addition to statistics, is essential both to the biology curriculum and to informed decision making in daily life. One area in which probability is particularly important is medicine. Given the preponderance of pre-health students, in addition to more general interest in medicine, we capitalized on students' intrinsic motivation in this area to teach both probability and statistics. We use the randomized controlled trial as the centerpiece of the course, because it exemplifies the most salient features of the scientific method and the application of critical thinking to medicine. The other two pillars of the course are biomedical applications of Bayes' theorem and science-and-society content. Backward design from these three overarching aims was used to select appropriate probability and statistics content, with a focus on eliciting and countering previously documented misconceptions in their medical context. Pretest/posttest assessments using the Quantitative Reasoning Quotient and Attitudes Toward Statistics instruments are positive, bucking several negative trends previously reported in statistics education.
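
    As an illustration of the Bayes' theorem pillar mentioned above, a minimal diagnostic-test calculation (the prevalence, sensitivity and specificity are hypothetical values, not course materials):

        # Posterior probability of disease after a positive test (Bayes' theorem).
        prevalence  = 0.01   # hypothetical prior P(disease)
        sensitivity = 0.90   # P(positive | disease)
        specificity = 0.95   # P(negative | no disease)

        p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
        ppv = sensitivity * prevalence / p_positive
        print(f"P(disease | positive) = {ppv:.3f}")   # about 0.154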

  10. Probability density cloud as a geometrical tool to describe statistics of scattered light.

    Science.gov (United States)

    Yaitskova, Natalia

    2017-04-01

    The first-order statistics of scattered light are described using the representation of the probability density cloud, which visualizes a two-dimensional distribution for complex amplitude. The geometric parameters of the cloud are studied in detail and are connected to the statistical properties of phase. The moment-generating function for intensity is obtained in closed form through these parameters. An example of the exponentially modified normal distribution is provided to illustrate the functioning of this geometrical approach.

  11. An experimental study of the surface elevation probability distribution and statistics of wind-generated waves

    Science.gov (United States)

    Huang, N. E.; Long, S. R.

    1980-01-01

    Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data were compared with some limited field data. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.

  12. Dynamic Graphics in Excel for Teaching Statistics: Understanding the Probability Density Function

    Science.gov (United States)

    Coll-Serrano, Vicente; Blasco-Blasco, Olga; Alvarez-Jareno, Jose A.

    2011-01-01

    In this article, we show a dynamic graphic in Excel that is used to introduce an important concept in our subject, Statistics I: the probability density function. This interactive graphic seeks to facilitate conceptual understanding of the main aspects analysed by the learners.

  13. Animating Statistics: A New Kind of Applet for Exploring Probability Distributions

    Science.gov (United States)

    Kahle, David

    2014-01-01

    In this article, I introduce a novel applet ("module") for exploring probability distributions, their samples, and various related statistical concepts. The module is primarily designed to be used by the instructor in the introductory course, but it can be used far beyond it as well. It is a free, cross-platform, stand-alone interactive…

  14. The Effects and Side-Effects of Statistics Education: Psychology Students' (Mis-)Conceptions of Probability

    Science.gov (United States)

    Morsanyi, Kinga; Primi, Caterina; Chiesi, Francesca; Handley, Simon

    2009-01-01

    In three studies we looked at two typical misconceptions of probability: the representativeness heuristic, and the equiprobability bias. The literature on statistics education predicts that some typical errors and biases (e.g., the equiprobability bias) increase with education, whereas others decrease. This is in contrast with reasoning theorists'…

  15. Chances Are...Making Probability and Statistics Fun To Learn and Easy To Teach.

    Science.gov (United States)

    Pfenning, Nancy

    Probability and statistics may be the horror of many college students, but if these subjects are trimmed to include only the essential symbols, they are easily within the grasp of interested middle school or even elementary school students. This book can serve as an introduction for any beginner, from gifted students who would like to broaden…

  16. Two-Way Tables: Issues at the Heart of Statistics and Probability for Students and Teachers

    Science.gov (United States)

    Watson, Jane; Callingham, Rosemary

    2014-01-01

    Some problems exist at the intersection of statistics and probability, creating a dilemma in relation to the best approach to assist student understanding. Such is the case with problems presented in two-way tables representing conditional information. The difficulty can be confounded if the context within which the problem is set is one where…

  17. Eliciting and Developing Teachers' Conceptions of Random Processes in a Probability and Statistics Course

    Science.gov (United States)

    Smith, Toni M.; Hjalmarson, Margret A.

    2013-01-01

    The purpose of this study is to examine prospective mathematics specialists' engagement in an instructional sequence designed to elicit and develop their understandings of random processes. The study was conducted with two different sections of a probability and statistics course for K-8 teachers. Thirty-two teachers participated. Video analyses…

  18. There Once Was a 9-Block ... - A Middle-School Design for Probability and Statistics

    Science.gov (United States)

    Abrahamson, Dor; Janusz, Ruth M.; Wilensky, Uri

    2006-01-01

    ProbLab is a probability-and-statistics unit developed at the Center for Connected Learning and Computer-Based Modeling, Northwestern University. Students analyze the combinatorial space of the 9-block, a 3-by-3 grid of squares, in which each square can be either green or blue. All 512 possible 9-blocks are constructed and assembled in a "bar…

  19. Poisson statistics of PageRank probabilities of Twitter and Wikipedia networks

    Science.gov (United States)

    Frahm, Klaus M.; Shepelyansky, Dima L.

    2014-04-01

    We use the methods of quantum chaos and Random Matrix Theory for analysis of statistical fluctuations of PageRank probabilities in directed networks. In this approach the effective energy levels are given by a logarithm of PageRank probability at a given node. After the standard energy level unfolding procedure we establish that the nearest spacing distribution of PageRank probabilities is described by the Poisson law typical for integrable quantum systems. Our studies are done for the Twitter network and three networks of Wikipedia editions in English, French and German. We argue that due to absence of level repulsion the PageRank order of nearby nodes can be easily interchanged. The obtained Poisson law implies that the nearby PageRank probabilities fluctuate as random independent variables.
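
    A rough sketch of the procedure under stated assumptions: treat -log(PageRank) as energy levels, unfold crudely by the mean spacing, and compare the nearest-neighbour spacing histogram with the Poisson law P(s) = exp(-s). A random directed graph stands in for the Twitter/Wikipedia networks, and the crude global unfolding simplifies the standard local procedure:

        import numpy as np
        import networkx as nx

        G = nx.gnp_random_graph(2000, 0.005, directed=True, seed=7)  # stand-in network
        pr = nx.pagerank(G, alpha=0.85)

        E = np.sort(-np.log(np.array(list(pr.values()))))  # "energy levels"
        s = np.diff(E)
        s = s / s.mean()                                   # crude global unfolding

        # Poisson statistics predicts P(s) = exp(-s); compare the first few bins
        hist, edges = np.histogram(s, bins=20, range=(0, 4), density=True)
        print(np.round(hist[:5], 2))
        print(np.round(np.exp(-edges[:5]), 2))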

  1. Probability of Identification: A Statistical Model for the Validation of Qualitative Botanical Identification Methods

    Science.gov (United States)

    LaBudde, Robert A.; Harnly, James M.

    2013-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive non-target (undesirable) material. The report describes the development and validation of studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single collaborator and multicollaborative study examples are given. PMID:22468371
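
    A minimal sketch of the observed statistic described above: POI as a binomial proportion with an exact (Clopper-Pearson) confidence interval. The counts are invented, and the paper's full validation design (response curves, collaborative studies) is not reproduced:

        import numpy as np
        from scipy import stats

        def poi_with_ci(identified, replicates, alpha=0.05):
            """POI estimate with an exact Clopper-Pearson confidence interval."""
            poi = identified / replicates
            lo = 0.0 if identified == 0 else stats.beta.ppf(
                alpha / 2, identified, replicates - identified + 1)
            hi = 1.0 if identified == replicates else stats.beta.ppf(
                1 - alpha / 2, identified + 1, replicates - identified)
            return poi, lo, hi

        print(poi_with_ci(18, 20))   # e.g. 18 of 20 replicates identified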

  2. Fundamental questions of earthquake statistics, source behavior, and the estimation of earthquake probabilities from possible foreshocks

    Science.gov (United States)

    Michael, Andrew J.

    2012-01-01

    Estimates of the probability that an ML 4.8 earthquake, which occurred near the southern end of the San Andreas fault on 24 March 2009, would be followed by an M 7 mainshock over the following three days vary from 0.0009 using a Gutenberg–Richter model of aftershock statistics (Reasenberg and Jones, 1989) to 0.04 using a statistical model of foreshock behavior and long‐term estimates of large earthquake probabilities, including characteristic earthquakes (Agnew and Jones, 1991). I demonstrate that the disparity between the existing approaches depends on whether or not they conform to Gutenberg–Richter behavior. While Gutenberg–Richter behavior is well established over large regions, it could be violated on individual faults if they have characteristic earthquakes or over small areas if the spatial distribution of large‐event nucleations is disproportional to the rate of smaller events. I develop a new form of the aftershock model that includes characteristic behavior and combines the features of both models. This new model and the older foreshock model yield the same results when given the same inputs, but the new model has the advantage of producing probabilities for events of all magnitudes, rather than just for events larger than the initial one. Compared with the aftershock model, the new model has the advantage of taking into account long‐term earthquake probability models. Using consistent parameters, the probability of an M 7 mainshock on the southernmost San Andreas fault is 0.0001 for three days from long‐term models and the clustering probabilities following the ML 4.8 event are 0.00035 for a Gutenberg–Richter distribution and 0.013 for a characteristic‐earthquake magnitude–frequency distribution. Our decisions about the existence of characteristic earthquakes and how large earthquakes nucleate have a first‐order effect on the probabilities obtained from short‐term clustering models for these large events.

  3. Development of a statistical tool for the estimation of riverbank erosion probability

    Science.gov (United States)

    Varouchakis, Emmanouil

    2016-04-01

    Riverbank erosion affects river morphology and local habitat, and results in riparian land loss, property and infrastructure damage, and ultimately flood defence weakening. An important issue concerning riverbank erosion is the identification of vulnerable areas in order to predict river changes and assist stream management/restoration. One approach to predicting areas vulnerable to erosion is to quantify the erosion probability by identifying the underlying relations between riverbank erosion and the geomorphological or hydrological variables that prevent or stimulate erosion. In the present work, an innovative statistical methodology is proposed to predict the probability of presence or absence of erosion in a river section. A physically based model determines the locations vulnerable to erosion by quantifying the potential eroded area. The derived results are used to determine validation locations for the evaluation of the statistical tool's performance. The statistical tool is based on a series of independent local variables and employs the Logistic Regression methodology. It is developed in two forms, Logistic Regression and Locally Weighted Logistic Regression, both of which deliver useful and accurate results. The second form, though, provides the most accurate results, as it validates the presence or absence of erosion at all validation locations. The proposed tool is easy to use, accurate, and can be applied to any region and river. Varouchakis, E. A., Giannakis, G. V., Lilli, M. A., Ioannidou, E., Nikolaidis, N. P., and Karatzas, G. P.: Development of a statistical tool for the estimation of riverbank erosion probability, SOIL (EGU), in print, 2016.
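
    A minimal sketch of the core of such a tool: logistic regression mapping local covariates to an erosion probability. The covariates and data are hypothetical, and the paper's locally weighted variant and validation against the physically based model are not reproduced:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # hypothetical predictors per river section, e.g. bank height, near-bank
        # water depth, channel curvature; y = 1 where erosion was observed
        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 3))
        y = (0.8 * X[:, 0] + 1.2 * X[:, 2] + rng.normal(size=200) > 0).astype(int)

        model = LogisticRegression().fit(X, y)
        p_erosion = model.predict_proba(X[:5])[:, 1]   # erosion probability per section
        print(np.round(p_erosion, 2))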

  4. A scan statistic for continuous data based on the normal probability model

    Directory of Open Access Journals (Sweden)

    Huang Lan

    2009-10-01

    Temporal, spatial and space-time scan statistics are commonly used to detect and evaluate the statistical significance of temporal and/or geographical disease clusters, without any prior assumptions on the location, time period or size of those clusters. Scan statistics are mostly used for count data, such as disease incidence or mortality. Sometimes there is an interest in looking for clusters with respect to a continuous variable, such as lead levels in children or low birth weight. For such continuous data, we present a scan statistic where the likelihood is calculated using the normal probability model. It may also be used for other distributions, while still maintaining the correct alpha level. In an application of the new method, we look for geographical clusters of low birth weight in New York City.

  5. Success probability of atom-molecule sympathetic cooling: A statistical approach

    Science.gov (United States)

    Morita, Masato; Krems, Roman; Tscherbul, Timur

    2017-04-01

    Sympathetic cooling with ultracold atoms is a promising route toward creating colder and denser ensembles of polar molecules at temperatures below 1 mK. Rigorous quantum scattering calculations can be carried out to identify atom-molecule collision systems with suitable collisional properties for sympathetic cooling experiments. The accuracy of such calculations is limited by the uncertainties of the underlying ab initio interaction potentials. To overcome these limitations, we introduce a statistical approach based on cumulative probability distributions for the ratio of elastic-to-inelastic collision cross sections, from which the success probability of atom-molecule sympathetic cooling can be estimated. Our analysis shows that, for a range of experimentally relevant collision systems, the cumulative probabilities are not sensitive to the number of rotational states in the basis set, potentially leading to a dramatic reduction of the computational cost of simulating cold molecular collisions in external fields.
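
    A sketch of the statistical idea with invented numbers: sample the elastic-to-inelastic ratio over plausible variations of the interaction potential, then read the success probability off the empirical cumulative distribution. The log-normal stand-in and the threshold of 100 are assumptions, not values from the paper:

        import numpy as np

        rng = np.random.default_rng(1)
        # hypothetical elastic-to-inelastic ratios, one per sampled scaling of
        # the ab initio potential (stand-in for quantum scattering calculations)
        gamma = rng.lognormal(mean=5.0, sigma=1.5, size=400)

        threshold = 100.0   # assumed ratio needed for efficient sympathetic cooling
        print(np.mean(gamma > threshold))   # estimated success probability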

  6. Estimating Effect Sizes and Expected Replication Probabilities from GWAS Summary Statistics

    DEFF Research Database (Denmark)

    Holland, Dominic; Wang, Yunpeng; Thompson, Wesley K

    2016-01-01

    Genome-wide Association Studies (GWAS) result in millions of summary statistics ("z-scores") for single nucleotide polymorphism (SNP) associations with phenotypes. These rich datasets afford deep insights into the nature and extent of genetic contributions to complex phenotypes such as psychiatric disorders. It is of interest to estimate the underlying effect sizes from the z-scores, as such knowledge would enhance causal SNP and gene discovery, help elucidate mechanistic pathways, and inform future study design. Here we present a parsimonious methodology for modeling effect sizes and replication probabilities, relying only on summary statistics from GWAS substudies, and a scheme allowing ... 9.3 million SNP z-scores in both cases. We show that, over a broad range of z-scores and sample sizes, the model accurately predicts expectation estimates of true effect sizes and replication probabilities in multistage GWAS designs. We assess the degree to which effect sizes are over-estimated when ...

  7. A Probability-Based Statistical Method to Extract Water Body of TM Images with Missing Information

    Science.gov (United States)

    Lian, Shizhong; Chen, Jiangping; Luo, Minghai

    2016-06-01

    Water information cannot be accurately extracted from TM images in which true information is lost because of blocking clouds and missing data stripes. Water is continuously distributed under natural conditions; thus, this paper proposes a new method of water body extraction based on probability statistics to improve the accuracy of water information extraction from TM images with missing information. Different types of disturbing information from clouds and missing data stripes are simulated. Water information is extracted from the simulated images using global histogram matching, local histogram matching, and the probability-based statistical method. Experiments show that a smaller areal error and a higher boundary recall can be obtained using this method compared with the conventional methods.

  8. Pao-Lu Hsu (Xu, Bao-lu): The Grandparent of Probability and Statistics in China

    OpenAIRE

    Chen, Dayue; Olkin, Ingram

    2012-01-01

    The years 1910–1911 are auspicious years in Chinese mathematics with the births of Pao-Lu Hsu, Luo-Keng Hua and Shiing-Shen Chern. These three began the development of modern mathematics in China: Hsu in probability and statistics, Hua in number theory, and Chern in differential geometry. We here review some facts about the life of P.-L. Hsu which have been uncovered recently, and then discuss some of his contributions. We have drawn heavily on three papers in the 1979 Annals of Statistics (v...

  9. Statistical analysis of nature frequencies of hemispherical resonator gyroscope based on probability theory

    Science.gov (United States)

    Yu, Xudong; Long, Xingwu; Wei, Guo; Li, Geng; Qu, Tianliang

    2015-04-01

    A finite element model of the hemispherical resonator gyro (HRG) is established, and its natural frequencies and vibration modes are investigated. The matrix perturbation technique from the random finite element method is introduced to analyze the statistical characteristics of the natural frequencies of the HRG. The influences of random material parameters and dimensional parameters on the natural frequencies are quantitatively described based on probability theory. The statistical expressions of the random parameters are given, and the influences of three key parameters on the natural frequency are pointed out. These results are important for the design and improvement of high-accuracy HRGs.

  10. Spencer-Brown vs. Probability and Statistics: Entropy’s Testimony on Subjective and Objective Randomness

    Directory of Open Access Journals (Sweden)

    Julio Michael Stern

    2011-04-01

    This article analyzes the role of entropy in Bayesian statistics, focusing on its use as a tool for detection, recognition and validation of eigen-solutions. “Objects as eigen-solutions” is a key metaphor of the cognitive constructivism epistemological framework developed by the philosopher Heinz von Foerster. Special attention is given to some objections to the concepts of probability, statistics and randomization posed by George Spencer-Brown, a figure of great influence in the field of radical constructivism.

  11. Statistics of ultimate strain of the Earth's crust and probability of earthquake occurrence

    Energy Technology Data Exchange (ETDEWEB)

    Rikitake, T.

    1975-03-01

    Statistics of ultimate strain are improved by adding new data to the previous data. The critical value for horizontal strain seems somewhat larger than that for vertical strain, although parameters of a Weibull distribution, which is customarily used for quality-control research and which fits in very well with the present statistics, are calculated for the whole set of data, making no distinction between the two subsets because of their scantiness. On the basis of the parameters thus determined and strain rates obtained from geodetic data, probabilities of earthquake occurrence in a few regions in Japan and the U.S. are estimated. The probability of having an earthquake in an area southwest of Tokyo, where the 1923 earthquake (magnitude 7.9) occurred, is at this time 20%, a value almost the same as that obtained in the previous papers. The probability will reach somewhere between 50 and 90% by 2000 and 2050, respectively. In the North Izu District, where an earthquake of magnitude 7.0 occurred in 1930, a shearing crustal motion is occurring. Because of the extent of this motion, there is a 40% probability that an earthquake will recur during the next 40 years. By the end of this century, it will become as high as 85%. Similar estimates of such cumulative probabilities are made for the San Francisco and Fort Tejon regions, where great earthquakes occurred respectively in 1906 and 1857, yielding values of 30 and 80% at present. These probabilities are tentative because of possible errors in evaluating geodetic measurements and because of the uncertainty of the ultimate crustal strain assigned to the San Andreas Fault.
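
    A minimal sketch of the kind of calculation described: with the waiting time following a Weibull law, the conditional probability of an earthquake within a given horizon, given quiescence so far, follows from the survival function (the shape and scale below are hypothetical, not the paper's fitted values):

        import numpy as np

        def conditional_eq_probability(t_elapsed, horizon, shape, scale):
            """P(event within horizon | quiescent for t_elapsed), Weibull model."""
            survival = lambda t: np.exp(-(t / scale) ** shape)
            return 1.0 - survival(t_elapsed + horizon) / survival(t_elapsed)

        # e.g. 90 years since the last event, 40-year outlook
        print(conditional_eq_probability(90, 40, shape=2.0, scale=150.0))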

  12. Intelligent tutorial system for teaching of probability and statistics at high school in Mexico

    Directory of Open Access Journals (Sweden)

    Fernando Gudino Penaloza, Miguel Gonzalez Mendoza, Neil Hernandez Gress, Jaime Mora Vargas

    2009-12-01

    This paper describes the implementation of an intelligent tutoring system dedicated to teaching probability and statistics at the preparatory school (or high school) in Mexico. The system was first deployed as a desktop solution and then adapted to a mobile environment for the implementation of mobile learning, or m-learning. The system complies with the idea of being adaptable to the needs of each student and is able to adapt to three different teaching models that meet the criteria of three student profiles.

  13. How well is the probability of tumor cure after fractionated irradiation described by Poisson statistics?

    Science.gov (United States)

    Tucker, S L; Thames, H D; Taylor, J M

    1990-12-01

    The probability of tumor cure in a homogeneous population of tumors exposed to fractionated radiotherapy was modeled using numerical simulations and compared with the predictions of Poisson statistics, assuming exact knowledge of the relevant tumor parameters (clonogen number, radiosensitivity, and growth kinetics). The results show that although Poisson statistics (based on exact knowledge of all parameters) accurately describes the probability of tumor cure when no proliferation occurs during treatment, it underestimates the cure rate when proliferation does occur. In practice, however, the inaccuracy is not likely to be more than about 10%. When the tumor parameters are unknown and are estimated by fitting an empirical Poisson model to tumor-cure data from a homogeneous population of proliferative tumors, the resulting estimates of tumor growth rate and radiosensitivity accurately reflect the true values, but the estimate of initial clonogen number is biased downward. A new formula that is more accurate than Poisson statistics in predicting the probability of tumor cure when proliferation occurs during treatment is discussed.
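
    For reference, the no-proliferation Poisson baseline that the paper starts from, with linear-quadratic cell survival; the parameter values are hypothetical, and the paper's point is that this expression underestimates the cure rate once proliferation during treatment is included:

        import numpy as np

        def tcp_poisson(n_clonogens, alpha, beta, dose_per_fx, n_fx):
            """Poisson tumor control probability, LQ survival, no proliferation."""
            d = dose_per_fx
            surviving_fraction = np.exp(-n_fx * (alpha * d + beta * d ** 2))
            return np.exp(-n_clonogens * surviving_fraction)

        # hypothetical: 1e7 clonogens, alpha = 0.3/Gy, beta = 0.03/Gy^2, 30 x 2 Gy
        print(tcp_poisson(1e7, 0.3, 0.03, 2.0, 30))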

  14. Motion statistics at the saccade landing point: attentional capture by spatiotemporal features in a gaze-contingent reference

    Science.gov (United States)

    Belardinelli, Anna; Carbone, Andrea

    2012-06-01

    Motion is known to play a fundamental role in attentional capture, yet it is not always included in computational models of visual attention. A wealth of literature in the past years has investigated natural image statistics at the centre of gaze to assess static low-level features accounting for fixation capture on images. A motion counterpart describing which features trigger saccades on dynamic scenes has received less attention, although it would provide significant insight into visuomotor behaviour when attending to events instead of less realistic still images. Such knowledge would be paramount to devising active vision systems that can spot interesting or malicious activities and disregard less relevant patterns. In this paper, we present an analysis of spatiotemporal features at the future centre of gaze to extract possible regularities in the fixation distribution, to contrast with the feature distribution of non-fixated points. A substantial novelty in the methodology is the evaluation of the features in a gaze-contingent reference. Each video sequence fragment is indeed foveated with respect to the current fixation, while features are collected at the next saccade landing point. This allows us to estimate covertly selected motion cues in a retinotopic fashion. We consider video sequences and eye-tracking data from a recent state-of-the-art dataset and test a bottom-up motion saliency measure against human performance. The obtained results can be used to further tune saliency computational models and to learn to predict human fixations on video sequences or generate meaningful shifts of active sensors in real-world scenarios.

  15. Firing statistics of inhibitory neuron with delayed feedback. I. Output ISI probability density.

    Science.gov (United States)

    Vidybida, A K; Kravchuk, K G

    2013-06-01

    The activity of an inhibitory neuron with delayed feedback is considered in the framework of point stochastic processes. The neuron receives excitatory input impulses from a Poisson stream, and inhibitory impulses from the feedback line with a delay. We investigate here how the presence of inhibitory feedback affects the output firing statistics. Using the binding neuron (BN) as a model, we derive analytically the exact expressions for the output interspike interval (ISI) probability density, mean output ISI and coefficient of variation as functions of the model's parameters for the case of threshold 2. Using the leaky integrate-and-fire (LIF) model, as well as the BN model with higher thresholds, these statistical quantities are found numerically. In contrast to the previously studied situation of no feedback, the ISI probability densities found here for both the BN and LIF neuron become bimodal and have a discontinuity of jump type. Nevertheless, the presence of inhibitory delayed feedback was not found to affect substantially the output ISI coefficient of variation, which ranges between 0.5 and 1. It is concluded that the introduction of delayed inhibitory feedback can radically change neuronal output firing statistics. These statistics are also distinct from what was found previously (Vidybida and Kravchuk, 2009) by a similar method for an excitatory neuron with delayed feedback.

  16. Audio analysis of statistically instantaneous signals with mixed Gaussian probability distributions

    Science.gov (United States)

    Naik, Ganesh R.; Wang, Wenwu

    2012-10-01

    In this article, a novel method is proposed to measure the separation qualities of statistically instantaneous audio signals with mixed Gaussian probability distributions. This study evaluates the impact of the Probability Distribution Function (PDF) of the mixed signals on the outcomes of both sub- and super-Gaussian distributions. Different Gaussian measures are evaluated by using various spectral-distortion measures. It aims to compare the different audio mixtures from both super-Gaussian and sub-Gaussian perspectives. Extensive computer simulation confirms that the separated sources always have super-Gaussian characteristics irrespective of the PDF of the signals or mixtures. The result based on the objective measures demonstrates the effectiveness of source separation in improving the quality of the separated audio sources.
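
    A minimal illustration of the sub-/super-Gaussian distinction used above, via excess kurtosis (a standard criterion; the paper's spectral-distortion measures are not reproduced):

        import numpy as np
        from scipy.stats import kurtosis

        rng = np.random.default_rng(2)
        laplace = rng.laplace(size=100_000)    # super-Gaussian source
        uniform = rng.uniform(-1, 1, 100_000)  # sub-Gaussian source

        print(kurtosis(laplace))  # positive excess kurtosis, about +3
        print(kurtosis(uniform))  # negative excess kurtosis, about -1.2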

  17. Computing light statistics in heterogeneous media based on a mass weighted probability density function method.

    Science.gov (United States)

    Jenny, Patrick; Mourad, Safer; Stamm, Tobias; Vöge, Markus; Simon, Klaus

    2007-08-01

    Based on the transport theory, we present a modeling approach to light scattering in turbid material. It uses an efficient and general statistical description of the material's scattering and absorption behavior. The model estimates the spatial distribution of intensity and the flow direction of radiation, both of which are required, e.g., for adaptable predictions of the appearance of colors in halftone prints. This is achieved by employing a computational particle method, which solves a model equation for the probability density function of photon positions and propagation directions. In this framework, each computational particle represents a finite probability of finding a photon in a corresponding state, including properties like wavelength. Model evaluations and verifications conclude the discussion.

  18. Robust functional statistics applied to Probability Density Function shape screening of sEMG data.

    Science.gov (United States)

    Boudaoud, S; Rix, H; Al Harrach, M; Marin, F

    2014-01-01

    Recent studies have pointed out possible shape modifications of the Probability Density Function (PDF) of surface electromyographical (sEMG) data in several contexts, such as fatigue and muscle force increase. Following this idea, criteria have been proposed to monitor these shape modifications, mainly using High Order Statistics (HOS) parameters like skewness and kurtosis. In experimental conditions, these parameters are confronted with small sample sizes in the estimation process. This small sample size induces errors in the estimated HOS parameters, limiting real-time and precise sEMG PDF shape monitoring. Recently, a functional formalism, the Core Shape Model (CSM), has been used to analyse shape modifications of PDF curves. In this work, taking inspiration from the CSM method, robust functional statistics are proposed to emulate both skewness and kurtosis behaviours. These functional statistics combine kernel density estimation and PDF shape distances to evaluate shape modifications even in the presence of small sample sizes. The proposed statistics are tested, using Monte Carlo simulations, on both normal and log-normal PDFs that mimic the observed sEMG PDF shape behaviour during muscle contraction. According to the obtained results, the functional statistics seem to be more robust than HOS parameters to small sample size effects and more accurate in sEMG PDF shape screening applications.

  19. A statistical technique for defining rainfall forecast probabilities in southern Africa

    Science.gov (United States)

    Husak, G. J.; Magadzire, T.

    2010-12-01

    Probabilistic forecasts are currently produced by many climate centers and by just as many processes. These models use a number of inputs to generate the probability of rainfall falling in the lower/middle/upper tercile (frequently termed "below-normal"/"normal"/"above-normal") of the historical rainfall distribution. Generation of forecasts for a season may be the result of a dynamic climate model, a statistical model, a consensus of a panel of experts, or a combination of some of the afore-mentioned techniques, among others. This last method is the one most commonly accepted in Southern Africa, resulting from the Southern Africa Regional Climate Outlook Forum (SARCOF). While it has been noted that there is a reasonable chance of polygons having a dominant tercile of 60% probability or more, this has seldom been the case. Indeed, over the last four years, the SARCOF process has not produced any polygons with such a likelihood. In fact, the terciles in all of the SARCOFs since 2007 have been some combination of 40%, 35% and 25%. Discussions with SARCOF scientists suggest that the SARCOF process is currently using the probabilistic format to define relatively qualitative, rank-ordered outcomes in the format "most likely", "second-most likely" and "least likely" terciles. While this rank-ordered classification has its merits, it limits the sort of downstream quantitative statistical analysis that could potentially be of assistance to various decision makers. In this study we build a simple statistical model to analyze the probabilistic outcomes for the coming rainfall season, and analyze their resulting probabilities. The prediction model takes a similar approach to that already used in the SARCOF process: namely, using available historic rainfall data and SST information to create a linear regression between rainfall and SSTs, define a likely rainfall outcome, and analyze the cross-validation errors over the most recent 30 years. The cross
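
    A sketch of the statistical model outlined above, with invented data: regress rainfall on an SST index, then convert the point prediction plus the spread of the regression errors into tercile probabilities (the paper's leave-one-out treatment of the 30-year record is simplified to in-sample residuals):

        import numpy as np

        # hypothetical 30-year records of an SST index and seasonal rainfall
        rng = np.random.default_rng(3)
        sst = rng.normal(size=30)
        rain = 50 + 20 * sst + rng.normal(scale=25, size=30)

        coef = np.polyfit(sst, rain, 1)                   # linear regression
        resid = rain - np.polyval(coef, sst)              # regression errors
        t1, t2 = np.percentile(rain, [100 / 3, 200 / 3])  # historical terciles

        # forecast distribution = point prediction + residual spread
        pred = np.polyval(coef, 0.8) + resid              # e.g. SST index of +0.8
        probs = [np.mean(pred < t1),
                 np.mean((pred >= t1) & (pred <= t2)),
                 np.mean(pred > t2)]
        print(np.round(probs, 2))   # P(below-normal / normal / above-normal)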

  1. Problems in probability theory, mathematical statistics and theory of random functions

    CERN Document Server

    Sveshnikov, A A

    1979-01-01

    Problem solving is the main thrust of this excellent, well-organized workbook. Suitable for students at all levels in probability theory and statistics, the book presents over 1,000 problems and their solutions, illustrating fundamental theory and representative applications in the following fields: Random Events; Distribution Laws; Correlation Theory; Random Variables; Entropy & Information; Markov Processes; Systems of Random Variables; Limit Theorems; Data Processing; and more.The coverage of topics is both broad and deep, ranging from the most elementary combinatorial problems through lim

  2. What is the probability of replicating a statistically significant association in genome-wide association studies?

    Science.gov (United States)

    Jiang, Wei; Xue, Jing-Hao; Yu, Weichuan

    2017-11-01

    The goal of genome-wide association studies (GWASs) is to discover genetic variants associated with diseases/traits. Replication is a common validation method in GWASs. We regard an association as a true finding when it shows significance in both the primary and replication studies. A question worth pondering is: what is the probability that a primary association (i.e. a statistically significant association in the primary study) will be validated in the replication study? This article systematically reviews the answers to this question from different points of view. As Bayesian methods can help us integrate out the uncertainty about the underlying effect of the primary association, we mainly focus on the Bayesian view in this article. We refer to the Bayesian replication probability as the replication rate (RR). We further describe an estimation method for RR, which makes use of the summary statistics from the primary study. The estimated RR can be used to determine the sample size of the replication study and to check the consistency between the results of the primary study and those of the replication study. We describe an R-package to estimate and apply RR in GWASs. Simulation and real-data experiments show that the estimated RR has good prediction and calibration performance. We also use these data to demonstrate the usefulness of RR. The R-package is available at http://bioinformatics.ust.hk/RRate.html.

  3. A Statistical Model for Determining the Probability of Observing Exoplanetary Radio Emissions

    Science.gov (United States)

    Garcia, R.; Knapp, M.; Winterhalter, D.; Majid, W.

    2015-12-01

    The idea that extrasolar planets should emit radiation in the low-frequency radio regime is a generalization of the observation of decametric and kilometric radio emissions from magnetic planets in our own solar system, yet none of these emissions have been observed. Such radio emissions result from interactions between the host star's magnetized wind and the planet's magnetosphere, which accelerate electrons along the field lines and lead to radio emission at the electron gyrofrequency. To understand why these emissions have not yet been observed, and to guide target selection for future detection efforts, we took a statistical approach to determine the ideal location in parameter space for these hypothesized exoplanetary radio emissions to be detected. We derived probability distribution functions from current datasets for the observationally constrained parameters (such as the radius of the host star), and conducted a review of the literature to construct reasonable probability distribution functions for the unconstrained parameters (such as the magnetic field strength of the exoplanet). We then used Monte Carlo sampling to develop a synthetic population of exoplanetary systems and calculated whether the radio emissions from the systems were detectable, depending on the angle of beaming, the frequency (above the ionospheric cutoff of 10 MHz) and the flux density (above 5 mJy) of the emission. From millions of simulations we derived a probability distribution function in parameter space as a function of host star type, orbital radius and planetary or host star radius. The probability distribution function illustrates the optimal parameter values of an exoplanetary system that may make the system's radio emissions detectable to current instruments and those under development, such as the SKA. We found that detection of exoplanetary radio emissions favors planets larger than 5 Earth radii and within 1 AU of their M dwarf host.

  4. Probability uniformization and application to statistical palaeomagnetic field models and directional data

    Science.gov (United States)

    Khokhlov, A.; Hulot, G.

    2013-04-01

    We introduce and apply the concept of 2-D probability uniformization to palaeomagnetic directional data. 2-D uniformization belongs to a very general class of probability transformations that map multivariate probability distributions into multivariate uniform distributions. Our goal is to produce joint tests of directional data sets assumed generated by a common statistical model, but with different sampling distributions. This situation is encountered when testing so-called Giant Gaussian Process (GGP) models of the Earth's magnetic field against palaeomagnetic directional data collected from different geographical sites, the predicted sampling distributions being site-dependent. To introduce the concept, we first consider 2-D Gaussian distributions in the plane R^2, before turning to Angular Gaussian and more general 2-D distributions on the unit sphere S^2. We detail the approach when applied to the 2-D distributions expected for palaeomagnetic directional data, if these are to be consistent with a GGP model while affected by some Fisherian error. We finally provide some example applications to real palaeomagnetic data. In particular, we show how subtle inhomogeneities in the distribution of the data, such as the so-called right-handed effect in palaeomagnetism, can be detected. This effect, whether of geomagnetic origin or not, affects the Brunhes data in such a way that they cannot easily be reconciled with GGP models originally built with the help of these data. 2-D probability uniformization is a powerful tool which, we argue, could be used to build and test better GGP models of the mean palaeomagnetic field and palaeosecular variation. The software designed in the course of this study is available upon request from the authors. It can also be downloaded from http://geomag.ipgp.fr/download/PSVT.tgz.

  5. Statistical inference of the generation probability of T-cell receptors from sequence repertoires.

    Science.gov (United States)

    Murugan, Anand; Mora, Thierry; Walczak, Aleksandra M; Callan, Curtis G

    2012-10-02

    Stochastic rearrangement of germline V-, D-, and J-genes to create variable coding sequence for certain cell surface receptors is at the origin of immune system diversity. This process, known as "VDJ recombination", is implemented via a series of stochastic molecular events involving gene choices and random nucleotide insertions between, and deletions from, genes. We use large sequence repertoires of the variable CDR3 region of human CD4+ T-cell receptor beta chains to infer the statistical properties of these basic biochemical events. Because any given CDR3 sequence can be produced in multiple ways, the probability distribution of hidden recombination events cannot be inferred directly from the observed sequences; we therefore develop a maximum likelihood inference method to achieve this end. To separate the properties of the molecular rearrangement mechanism from the effects of selection, we focus on nonproductive CDR3 sequences in T-cell DNA. We infer the joint distribution of the various generative events that occur when a new T-cell receptor gene is created. We find a rich picture of correlation (and absence thereof), providing insight into the molecular mechanisms involved. The generative event statistics are consistent between individuals, suggesting a universal biochemical process. Our probabilistic model predicts the generation probability of any specific CDR3 sequence by the primitive recombination process, allowing us to quantify the potential diversity of the T-cell repertoire and to understand why some sequences are shared between individuals. We argue that the use of formal statistical inference methods, of the kind presented in this paper, will be essential for quantitative understanding of the generation and evolution of diversity in the adaptive immune system.

  6. Quantifying changes and their uncertainties in probability distribution of climate variables using robust statistics

    Energy Technology Data Exchange (ETDEWEB)

    Hannachi, A. [University of Reading, Department of Meteorology, Earley Gate, PO Box 243, Reading (United Kingdom)

    2006-08-15

    Robust tools are presented in this manuscript to assess changes in probability density function (pdf) of climate variables. The approach is based on order statistics and aims at computing, along with their standard errors, changes in various quantiles and related statistics. The technique, which is nonparametric and simple to compute, is developed for both independent and dependent data. For autocorrelated data, serial correlation is addressed via Monte Carlo simulations using various autoregressive models. The ratio between the average standard errors, over several quantiles, of quantile estimates for correlated and independent data, is then computed. A simple scaling-law type relationship is found to hold between this ratio and the lag-1 autocorrelation. The approach has been applied to winter monthly Central England Temperature (CET) and North Atlantic Oscillation (NAO) time series from 1659 to 1999 to assess/quantify changes in various parameters of their pdf. For the CET, significant changes in median (or scale) and also in low and high quantiles are observed between various time slices, in particular between the pre- and post-industrial revolution. Observed changes in spread and also quartile skewness of the pdf, however, are not statistically significant (at 95% confidence level). For the NAO index we find mainly large significant changes in variance (or scale), yielding significant changes in low/high quantiles. Finally, the performance of the method compared to few conventional approaches is discussed. (orig.)
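
    A minimal Monte Carlo sketch of the ratio discussed above: the standard error of an empirical quantile for lag-1 autocorrelated versus independent data (the AR(1) coefficient and quantile are arbitrary; n = 341 mimics the 341-year span 1659-1999):

        import numpy as np

        rng = np.random.default_rng(4)

        def iid(n):
            return rng.normal(size=n)

        def ar1(n, phi=0.5):                 # lag-1 autocorrelated, unit variance
            x = np.empty(n)
            x[0] = rng.normal()
            for t in range(1, n):
                x[t] = phi * x[t - 1] + np.sqrt(1 - phi ** 2) * rng.normal()
            return x

        def quantile_se(sample_gen, q=0.9, n=341, n_mc=1000):
            """Monte Carlo standard error of the empirical q-quantile."""
            return np.std([np.quantile(sample_gen(n), q) for _ in range(n_mc)])

        print(round(quantile_se(ar1) / quantile_se(iid), 2))  # inflation factor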

  7. Impact of Statistical Learning Methods on the Predictive Power of Multivariate Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Schilstra, Cornelis; Langendijk, Johannes A.; Veld, Aart A. van 't [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands)

    2012-03-15

    Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.
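
    A minimal sketch of the comparison: L1-penalized (LASSO-type) logistic regression for an NTCP-style binary outcome against an effectively unpenalized fit standing in for stepwise selection, scored by cross-validated AUC (the data, predictor count, and penalty strength are invented):

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        # hypothetical NTCP data: 20 candidate predictors, 3 truly informative
        rng = np.random.default_rng(5)
        X = rng.normal(size=(300, 20))
        y = (X[:, 0] + 0.7 * X[:, 1] - 0.5 * X[:, 2]
             + rng.normal(size=300) > 0).astype(int)

        lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.3)
        full = LogisticRegression(C=1e6, max_iter=1000)  # ~unpenalized baseline

        for name, model in [("lasso", lasso), ("unpenalized", full)]:
            auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
            print(name, round(auc, 3))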

  8. Comparison of probability density functions for analyzing irradiance statistics due to atmospheric turbulence.

    Science.gov (United States)

    Mclaren, Jason R W; Thomas, John C; Mackintosh, Jessica L; Mudge, Kerry A; Grant, Kenneth J; Clare, Bradley A; Cowley, William G

    2012-09-01

    A large number of model probability density functions (PDFs) are used to analyze atmospheric scintillation statistics. We have analyzed scintillation data from two different experimental setups covering a range of scintillation strengths to determine which candidate model PDFs best describe the experimental data. The PDFs were fitted to the experimental data using the method of least squares. The root-mean-squared fitting error was used to monitor the goodness of fit. The results of the fitting were found to depend strongly on the scintillation strength. We find that the log normally modulated Rician and the log normal PDFs are the best fit to the experimental data over the range of scintillation strengths encountered.
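
    A sketch of the fitting procedure described above: a least-squares fit of a candidate PDF (here the unit-mean log-normal, one of the standard scintillation models) to an irradiance histogram, monitored by the RMS fitting error; the data are synthetic, not the paper's measurements:

        import numpy as np
        from scipy.optimize import curve_fit

        def lognormal_pdf(I, s2):
            """Unit-mean log-normal irradiance PDF with scintillation index s2."""
            return (1.0 / (I * np.sqrt(2 * np.pi * s2))
                    * np.exp(-(np.log(I) + s2 / 2) ** 2 / (2 * s2)))

        # synthetic "measured" irradiance, normalized to unit mean
        rng = np.random.default_rng(6)
        s2_true = 0.3
        I = rng.lognormal(mean=-s2_true / 2, sigma=np.sqrt(s2_true), size=50_000)
        hist, edges = np.histogram(I, bins=80, density=True)
        centers = 0.5 * (edges[:-1] + edges[1:])

        (s2_fit,), _ = curve_fit(lognormal_pdf, centers, hist, p0=[0.2])
        rms = np.sqrt(np.mean((hist - lognormal_pdf(centers, s2_fit)) ** 2))
        print(s2_fit, rms)   # fitted scintillation index and RMS fitting error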

  9. Level set segmentation of medical images based on local region statistics and maximum a posteriori probability.

    Science.gov (United States)

    Cui, Wenchao; Wang, Yi; Lei, Tao; Fan, Yangyu; Feng, Yan

    2013-01-01

    This paper presents a variational level set method for simultaneous segmentation and bias field estimation of medical images with intensity inhomogeneity. In our model, the statistics of image intensities belonging to each different tissue in local regions are characterized by Gaussian distributions with different means and variances. According to maximum a posteriori probability (MAP) and Bayes' rule, we first derive a local objective function for image intensities in a neighborhood around each pixel. Then this local objective function is integrated with respect to the neighborhood center over the entire image domain to give a global criterion. In level set framework, this global criterion defines an energy in terms of the level set functions that represent a partition of the image domain and a bias field that accounts for the intensity inhomogeneity of the image. Therefore, image segmentation and bias field estimation are simultaneously achieved via a level set evolution process. Experimental results for synthetic and real images show desirable performances of our method.

  10. Zooming in on local level statistics by supersymmetric extension of free probability

    Energy Technology Data Exchange (ETDEWEB)

    Mandt, S; Zirnbauer, M R [Institut fuer Theoretische Physik, Universitaet zu Koeln, Zuelpicher Strasse 77, 50937 Koeln (Germany)], E-mail: zirnbauer@uni-koeln.de

    2010-01-15

    We consider unitary ensembles of Hermitian N x N matrices governed by a confining potential NV with analytic and uniformly convex V. From earlier work it is known that the large-N limit of the characteristic function for a finite-rank Fourier variable K is determined by the Voiculescu R-transform, a key object in free probability theory. Going beyond these results, we argue that the same holds true when the finite-rank operator K has the form that is required by the Wegner-Efetov supersymmetry method. This insight leads to a potent new technique for the study of local statistics, e.g. level correlations. We illustrate the new technique by demonstrating universality in a random matrix model of stochastic scattering.

  11. Statistical analysis and verification of 3-hourly geomagnetic activity probability predictions

    Science.gov (United States)

    Wang, Jingjing; Zhong, Qiuzhen; Liu, Siqing; Miao, Juan; Liu, Fanghua; Li, Zhitao; Tang, Weiwei

    2015-12-01

    The Space Environment Prediction Center (SEPC) has classified geomagnetic activity into four levels, ranging from quiet to unsettled up to storm (Kp of 6 and above). The 3-hourly Kp index prediction product provided by the SEPC is updated half-hourly. In this study, statistical conditional forecast models for the 3-hourly geomagnetic activity level were developed based on 10 years of data and applied to more than 3 years of data, using the previous Kp index, the interplanetary magnetic field, and solar wind parameters measured by the Advanced Composition Explorer as conditional parameters. The quality of the forecast models was measured and compared against verifications of accuracy, reliability, discrimination capability, and skill in predicting all geomagnetic activity levels, especially the probability of reaching the storm level given a previous "calm" (nonstorm level) or "storm" (storm level) condition. It was found that the conditional models that used the previous Kp index, the peak value of BtV (the product of the total interplanetary magnetic field and speed), the average value of Bz (the southward component of the interplanetary magnetic field), and BzV (the product of the southward component of the interplanetary magnetic field and speed) over the last 6 h as conditional parameters provide a relative operating characteristic area of 0.64 and can serve as an appropriate predictor for the probability forecast of the geomagnetic activity level.

  12. Statistics and probability analysis of the voltage on the pantograph of dc electric locomotive in the recuperation mode

    Directory of Open Access Journals (Sweden)

    V.M. Liashuk

    2012-12-01

    Full Text Available The statistical analysis and probability characteristics of the random voltage variation on the pantograph of a DC electric locomotive in the recuperation mode are presented in the article.

  13. Future Contingents

    DEFF Research Database (Denmark)

    Øhrstrøm, Peter; Hasle, Per F. V.

    2011-01-01

    , but that person has not yet been born. The notion of ‘future contingent objects’ involves important philosophical questions, for instance the issue of ethical obligations towards future generations, quantification over ‘future contingent objects’ etc. However, this entry is confined to the study of future...... contingent statements. The problem of future contingents is interwoven with a number of issues in theology, philosophy, logic, semantics of natural language, computer science, and applied mathematics. The theological issue of how to reconcile the assumption of God's foreknowledge with the freedom and moral...... accountability of human beings has been a main impetus to the discussion and a major inspiration to the development of various logical models of time and future contingents. This theological issue is connected with the general philosophical question of determinism versus indeterminism. Within logic, the relation...

  14. Future Contingents

    DEFF Research Database (Denmark)

    Øhrstrøm, Peter; Hasle, Per F. V.

    2015-01-01

    , but that person has not yet been born. The notion of ‘future contingent objects’ involves important philosophical questions, for instance the issue of ethical obligations towards future generations, quantification over ‘future contingent objects’ etc. However, this entry is confined to the study of future...... contingent statements. The problem of future contingents is interwoven with a number of issues in theology, philosophy, logic, semantics of natural language, computer science, and applied mathematics. The theological issue of how to reconcile the assumption of God's foreknowledge with the freedom and moral...... accountability of human beings has been a main impetus to the discussion and a major inspiration to the development of various logical models of time and future contingents. This theological issue is connected with the general philosophical question of determinism versus indeterminism. Within logic, the relation...

  15. An analysis of problems in statistics and probability in second year educational text books

    Directory of Open Access Journals (Sweden)

    Nicolás Andrés Sánchez

    2017-09-01

    Full Text Available At present, society demands that every citizen develop the capacity to interpret and question the different phenomena presented in tables, graphs and data, capacities that must be developed progressively from the earliest years of education. For this, it is also necessary that the resources, such as the Mathematics textbook, point to the development of these skills. The objective of the present work is to analyze the types of problems proposed in two Secondary Mathematics textbooks in the thematic area of Statistics and Probability. Both texts were tendered and distributed free of charge and respond to two different curricular periods: (1) the one in which the old curricular bases were in force, and (2) the one in which the current curricula were implemented. The use of the school textbook by students and teachers assumes the premise that the various tasks proposed should tend toward problem solving. The research was carried out with a qualitative methodology through content analysis, using the theoretical categories proposed by Díaz and Poblete (2001). Among the results, mostly routine problems were identified, which serve to mechanize processes; non-routine or real-context problems appear in very few cases.

  16. Level Set Segmentation of Medical Images Based on Local Region Statistics and Maximum a Posteriori Probability

    Directory of Open Access Journals (Sweden)

    Wenchao Cui

    2013-01-01

    Full Text Available This paper presents a variational level set method for simultaneous segmentation and bias field estimation of medical images with intensity inhomogeneity. In our model, the statistics of image intensities belonging to each different tissue in local regions are characterized by Gaussian distributions with different means and variances. According to maximum a posteriori probability (MAP) and Bayes' rule, we first derive a local objective function for image intensities in a neighborhood around each pixel. Then this local objective function is integrated with respect to the neighborhood center over the entire image domain to give a global criterion. In the level set framework, this global criterion defines an energy in terms of the level set functions that represent a partition of the image domain and a bias field that accounts for the intensity inhomogeneity of the image. Therefore, image segmentation and bias field estimation are simultaneously achieved via a level set evolution process. Experimental results for synthetic and real images show the desirable performance of our method.

  17. Introduction to the life estimation of materials - Application of the theory of probability statistics of extreme values to corrosion

    Energy Technology Data Exchange (ETDEWEB)

    Kowaka, M.

    1984-01-01

    The book contains a history of the application of extreme-value statistics to corrosion, fundamentals of statistics and probability for corrosion phenomena, and exercises for understanding the theory. The corrosion phenomena are described, and quantitative analysis of localized corrosion and life estimation of materials become possible by using the method.
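
    A minimal sketch of the extreme-value workflow the book covers, assuming the usual Gumbel model for maximum pit depths; the data, the method-of-moments fit and the area-ratio return period are illustrative assumptions, not the book's own procedure.

    import math

    # Invented maximum pit depths (mm), one per inspected coupon.
    depths = [0.42, 0.55, 0.38, 0.61, 0.47, 0.52, 0.44, 0.58, 0.49, 0.53]

    n = len(depths)
    mean = sum(depths) / n
    var = sum((d - mean) ** 2 for d in depths) / (n - 1)

    # Method-of-moments fit of the Gumbel CDF F(x) = exp(-exp(-(x - mu)/beta)):
    # variance = pi^2 beta^2 / 6, mean = mu + gamma * beta (Euler-Mascheroni gamma).
    beta = math.sqrt(6.0 * var) / math.pi
    mu = mean - 0.5772156649 * beta

    # Extrapolation: the expected largest pit over an area T times the coupon
    # area is roughly the (1 - 1/T) quantile of the fitted Gumbel.
    T = 100.0  # assumed ratio: total surface area / sampled coupon area
    x_T = mu - beta * math.log(-math.log(1.0 - 1.0 / T))
    print(f"mu = {mu:.3f} mm, beta = {beta:.3f} mm, predicted max depth ~ {x_T:.2f} mm")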

  18. Statistical Analysis of Gait Maturation in Children Using Nonparametric Probability Density Function Modeling

    Directory of Open Access Journals (Sweden)

    Ning Xiang

    2013-02-01

    Full Text Available Analysis of gait dynamics in children may help understand the development of neuromuscular control and maturation of locomotor function. This paper applied the nonparametric Parzen-window estimation method to establish the probability density function (PDF) models for the stride interval time series of 50 children (25 boys and 25 girls). Four statistical parameters, in terms of averaged stride interval (ASI), variation of stride interval (VSI), PDF skewness (SK), and PDF kurtosis (KU), were computed with the Parzen-window PDFs to study the maturation of stride interval in children. By analyzing the results of the children in three age groups (aged 3–5 years, 6–8 years, and 10–14 years), we summarize the key findings of the present study as follows. (1) The gait cycle duration, in terms of ASI, increases until 14 years of age. On the other hand, the gait variability, in terms of VSI, decreases rapidly until 8 years of age, and then continues to decrease at a slower rate. (2) The SK values of both the histograms and Parzen-window PDFs for all of the three age groups are positive, which indicates an imbalance in the stride interval distribution within an age group. However, such an imbalance would be meliorated when the children grow up. (3) The KU values of both the histograms and Parzen-window PDFs decrease with the body growth in children, which suggests that the musculoskeletal growth enables the children to modulate a gait cadence with ease. (4) The SK and KU results also demonstrate the superiority of the Parzen-window PDF estimation method to the Gaussian distribution modeling, for the study of gait maturation in children.
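
    A minimal sketch of the Parzen-window (Gaussian-kernel) PDF estimate named above, together with the four reported statistics; the stride-interval data are invented, and the bandwidth rule (Silverman's) is an assumption, since the paper's exact choice is not stated here.

    import numpy as np

    def parzen_pdf(samples, grid, h):
        """Gaussian Parzen-window estimate of the PDF evaluated on 'grid'."""
        samples = np.asarray(samples, dtype=float)
        z = (grid[:, None] - samples[None, :]) / h
        return np.exp(-0.5 * z**2).sum(axis=1) / (len(samples) * h * np.sqrt(2 * np.pi))

    rng = np.random.default_rng(0)
    stride = rng.normal(1.0, 0.05, size=200)        # invented stride intervals (s)

    asi, vsi = stride.mean(), stride.std(ddof=1)    # ASI and VSI
    sk = np.mean(((stride - asi) / vsi) ** 3)       # skewness (SK)
    ku = np.mean(((stride - asi) / vsi) ** 4) - 3   # excess kurtosis (KU)

    h = 1.06 * vsi * len(stride) ** (-1 / 5)        # Silverman's rule (assumed)
    grid = np.linspace(stride.min() - 3 * h, stride.max() + 3 * h, 400)
    pdf = parzen_pdf(stride, grid, h)
    area = pdf.sum() * (grid[1] - grid[0])          # Riemann check: should be ~1
    print(f"ASI={asi:.3f}s VSI={vsi:.3f}s SK={sk:.2f} KU={ku:.2f}, PDF area={area:.3f}")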

  19. The Probability of Exceedance as a Nonparametric Person-Fit Statistic for Tests of Moderate Length

    NARCIS (Netherlands)

    Tendeiro, Jorge N.; Meijer, Rob R.

    2013-01-01

    To classify an item score pattern as not fitting a nonparametric item response theory (NIRT) model, the probability of exceedance (PE) of an observed response vector x can be determined as the sum of the probabilities of all response vectors that are, at most, as likely as x, conditional on the…
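
    Since the conditioning variable is not visible in the truncated abstract, the following sketch computes an unconditional version of the probability of exceedance for a short test, assuming known per-item success probabilities under some model with local independence; all numbers are invented.

    from itertools import product

    # Assumed per-item probabilities of a correct response for one examinee.
    p = [0.9, 0.7, 0.6, 0.4, 0.2]
    x = (1, 1, 0, 0, 1)  # observed response pattern

    def pattern_prob(pattern, probs):
        """Probability of a full response pattern under local independence."""
        out = 1.0
        for resp, pj in zip(pattern, probs):
            out *= pj if resp else (1.0 - pj)
        return out

    px = pattern_prob(x, p)
    # PE: total probability of all patterns that are at most as likely as x.
    pe = sum(prob for pattern in product((0, 1), repeat=len(p))
             if (prob := pattern_prob(pattern, p)) <= px)
    print(f"P(x) = {px:.4f}, probability of exceedance = {pe:.4f}")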

  20. Statistics, Probability, Significance, Likelihood: Words Mean What We Define Them to Mean

    Science.gov (United States)

    Drummond, Gordon B.; Tom, Brian D. M.

    2011-01-01

    Statisticians use words deliberately and specifically, but not necessarily in the way they are used colloquially. For example, in general parlance "statistics" can mean numerical information, usually data. In contrast, one large statistics textbook defines the term "statistic" to denote "a characteristic of a…

  1. Understanding common statistical methods, Part I: descriptive methods, probability, and continuous data.

    Science.gov (United States)

    Skinner, Carl G; Patel, Manish M; Thomas, Jerry D; Miller, Michael A

    2011-01-01

    Statistical methods are pervasive in medical research and general medical literature. Understanding general statistical concepts will enhance our ability to critically appraise the current literature and ultimately improve the delivery of patient care. This article intends to provide an overview of the common statistical methods relevant to medicine.

  2. Some possible q-exponential type probability distribution in the non-extensive statistical physics

    Science.gov (United States)

    Chung, Won Sang

    2016-08-01

    In this paper, we present two exponential-type probability distributions which are different from Tsallis's case, which we call Type I: one given by $p_i = \frac{1}{Z_q}\,[e_q(E_i)]^{-\beta}$ (Type IIA) and another given by $p_i = \frac{1}{Z_q}\,[e_q(-\beta)]^{E_i}$ (Type IIIA). Starting with the Boltzmann-Gibbs entropy, we obtain the different probability distributions by using the Kolmogorov-Nagumo average for the microstate energies. We present the first-order differential equations related to Types I, II and III. For the three types of probability distributions, we discuss the quantum harmonic oscillator, the two-level problem and the spin-1/2 paramagnet.
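
    For reference, the q-exponential underlying all three types is the standard Tsallis deformation (this definition is textbook material, not taken from the record):

    \[
      e_q(x) \;=\; \big[\,1 + (1-q)\,x\,\big]^{\frac{1}{1-q}},
      \qquad
      \lim_{q\to 1} e_q(x) = e^{x},
    \]

    and the familiar Type I (Tsallis) distribution reads $p_i = e_q(-\beta E_i)/Z_q$, against which the Type II and Type III variants above place the deformation differently.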

  3. The Prediction of Future Order Statistics from Weibull or Extreme-Value Distribution Using the Probability Paper

    OpenAIRE

    Ishikawa, Takeshi

    1996-01-01

    This article provides a method using the probability papers for point and interval predictions of future order statistics, on the basis of Type II censored samples from the Weibull and Extreme-Value distribution. First we propose two-point predictor for the point prediction problem and the problem of choosing plotting position are studied. Second we give the method to construct the prediction interval of the future order statistics using the two-point predictor.
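
    The probability-paper idea is to linearize the Weibull CDF, F(t) = 1 - exp(-(t/eta)^beta), so that ln(-ln(1-F)) is linear in ln t. A minimal sketch with median-rank plotting positions and a least-squares line follows; the failure times are invented and the article's own two-point predictor is not reproduced here.

    import numpy as np

    t = np.sort(np.array([72., 103., 131., 158., 184., 211., 243., 287.]))  # invented failure times
    n = len(t)
    ranks = np.arange(1, n + 1)
    F = (ranks - 0.3) / (n + 0.4)          # Benard's median-rank plotting position

    # Weibull-paper coordinates: y = ln(-ln(1-F)) versus x = ln(t) is a line
    # with slope beta (shape) and intercept -beta*ln(eta).
    x, y = np.log(t), np.log(-np.log(1.0 - F))
    beta, intercept = np.polyfit(x, y, 1)
    eta = np.exp(-intercept / beta)
    print(f"shape beta ~ {beta:.2f}, scale eta ~ {eta:.1f}")

    # A crude point prediction of the next order statistic from the fitted line:
    F_next = (n + 1 - 0.3) / (n + 1 + 0.4)
    t_next = eta * (-np.log(1.0 - F_next)) ** (1.0 / beta)
    print(f"predicted next failure time ~ {t_next:.0f}")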

  4. Application of binomial and multinomial probability statistics to the sampling design process of a global grain tracing and recall system

    Science.gov (United States)

    Small, coded, pill-sized tracers embedded in grain are proposed as a method for grain traceability. A sampling process for a grain traceability system was designed and investigated by applying probability statistics using a science-based sampling approach to collect an adequate number of tracers fo...
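
    The record is truncated, but the binomial logic it refers to can be illustrated: if tracers are blended in at a known rate, the probability that a sample of n kernels contains at least one tracer is 1 - (1-p)^n, which yields the sample size needed for a target detection probability. The concentration and target below are invented.

    import math

    p = 1.0e-4          # assumed tracer concentration: one tracer per 10,000 kernels
    target = 0.95       # desired probability that a sample captures >= 1 tracer

    # Solve P(at least one tracer in n kernels) = 1 - (1-p)^n >= target for n.
    n = math.ceil(math.log(1.0 - target) / math.log(1.0 - p))
    print(f"need n = {n} kernels")                 # about 29,956 here
    print(f"check: {1.0 - (1.0 - p)**n:.4f}")      # ~0.95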

  5. Computational Modeling of Statistical Learning: Effects of Transitional Probability versus Frequency and Links to Word Learning

    Science.gov (United States)

    Mirman, Daniel; Estes, Katharine Graf; Magnuson, James S.

    2010-01-01

    Statistical learning mechanisms play an important role in theories of language acquisition and processing. Recurrent neural network models have provided important insights into how these mechanisms might operate. We examined whether such networks capture two key findings in human statistical learning. In Simulation 1, a simple recurrent network…

  6. Probability density function shape sensitivity in the statistical modeling of turbulent particle dispersion

    Science.gov (United States)

    Litchford, Ron J.; Jeng, San-Mou

    1992-01-01

    The performance of a recently introduced statistical transport model for turbulent particle dispersion is studied here for rigid particles injected into a round turbulent jet. Both uniform and isosceles triangle pdfs are used. The statistical sensitivity to parcel pdf shape is demonstrated.

  7. Studies in the History of Probability and Statistics. XXXV: Multiple decrements or competing risks

    OpenAIRE

    SEAL, HILARY L.

    2017-01-01

    Given two states A and B such that individuals in state A have mutually exclusive probabilities, possibly dependent on the time spent in state A, of leaving that state because of (i) death, or (ii) passage to state B, what is the probability of an individual passing to state B and dying there within a given period? This problem has been of great interest and importance to actuaries for over 100 years and the solutions of their professional contemporaries have appeared in their textbooks. Twen...

  8. Aarhus Conference on Probability, Statistics and Their Applications : Celebrating the Scientific Achievements of Ole E. Barndorff-Nielsen

    CERN Document Server

    Stelzer, Robert; Thorbjørnsen, Steen; Veraart, Almut

    2016-01-01

    Collecting together twenty-three self-contained articles, this volume presents the current research of a number of renowned scientists in both probability theory and statistics as well as their various applications in economics, finance, the physics of wind-blown sand, queueing systems, risk assessment, turbulence and other areas. The contributions are dedicated to and inspired by the research of Ole E. Barndorff-Nielsen who, since the early 1960s, has been and continues to be a very active and influential researcher working on a wide range of important problems. The topics covered include, but are not limited to, econometrics, exponential families, Lévy processes and infinitely divisible distributions, limit theory, mathematical finance, random matrices, risk assessment, statistical inference for stochastic processes, stochastic analysis and optimal control, time series, and turbulence. The book will be of interest to researchers and graduate students in probability, statistics and their applications.

  9. Characterization of probability distributions by conditional expectation of function of record statistics

    OpenAIRE

    Zubdah-e-Noor; Athar, Haseeb

    2014-01-01

    In this paper, two general classes of distributions have been characterized through conditional expectation of power of difference of two record statistics. Further, some particular cases and examples are also discussed.

  10. Characterization of probability distributions by conditional expectation of function of record statistics

    Directory of Open Access Journals (Sweden)

    Zubdah-e-Noor

    2014-07-01

    Full Text Available In this paper, two general classes of distributions have been characterized through conditional expectation of power of difference of two record statistics. Further, some particular cases and examples are also discussed.

  11. Conceptual and Statistical Issues Regarding the Probability of Default and Modeling Default Risk

    Directory of Open Access Journals (Sweden)

    Emilia TITAN

    2011-03-01

    Full Text Available In today’s rapidly evolving financial markets, risk management offers different techniques for implementing an efficient system against market risk. Probability of default (PD) is an essential part of business intelligence and customer relation management systems in financial institutions. Recent studies indicate that underestimating this important component, and also the loss given default (LGD), might threaten the stability and smooth running of the financial markets. From the perspective of risk management, an accurate estimate of the probability of default is more valuable than the standard binary classification into credible and non-credible clients. The Basle II Accord recognizes the methods of reducing credit risk and also PD and LGD as important components of the advanced Internal Rating Based (IRB) approach.

  12. Incident prediction: a statistical approach to dynamic probability estimation : application to a test site in Barcelona

    OpenAIRE

    Montero Mercadé, Lídia; Barceló Bugeda, Jaime; Perarnau, Josep

    2002-01-01

    DR 2002/08 Departament d'EIO - Research Supported by PRIME European Project. Real-time models for estimating incident probabilities (EIP models) are innovative methods for predicting the potential occurrence of incidents and improving the effectiveness of incident management policies devoted to increasing road safety. EIP models embedded in traffic management systems can lead to the development of control strategies for reducing the likelihood of incidents before they occur. This paper pre...

  13. Ignorance is not bliss: Statistical power is not probability of trial success.

    Science.gov (United States)

    Zierhut, M L; Bycott, P; Gibbs, M A; Smith, B P; Vicini, P

    2016-04-01

    The purpose of this commentary is to place probability of trial success, or assurance, in the context of decision making in drug development, and to illustrate its properties in an intuitive manner for the readers of Clinical Pharmacology and Therapeutics. The hope is that this will stimulate a dialog on how assurance should be incorporated into a quantitative decision approach for clinical development and trial design that uses all available information. © 2015 ASCPT.
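
    The distinction drawn above can be made concrete with a toy calculation: power fixes the true effect, while assurance averages power over a prior on the effect. A sketch for a one-sided one-sample z-test; all numbers are invented and are not the commentary's own example.

    import math, random

    def phi(x):  # standard normal CDF
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

    n, sigma = 50, 1.0
    z_alpha = 1.959963985                      # upper 2.5% point of N(0,1)

    def power(theta):
        """Power of the one-sided z-test of H0: theta <= 0 at true effect theta."""
        return phi(theta * math.sqrt(n) / sigma - z_alpha)

    print(f"power at theta=0.3: {power(0.3):.3f}")

    # Assurance: expected power under a prior theta ~ N(0.3, 0.15^2), by Monte Carlo.
    random.seed(1)
    assurance = sum(power(random.gauss(0.3, 0.15)) for _ in range(100_000)) / 100_000
    print(f"assurance: {assurance:.3f}")        # generally differs from fixed-theta power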

  14. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  15. The return period analysis of natural disasters with statistical modeling of bivariate joint probability distribution.

    Science.gov (United States)

    Li, Ning; Liu, Xueqin; Xie, Wei; Wu, Jidong; Zhang, Peng

    2013-01-01

    New features of natural disasters have been observed over the last several years. The factors that influence the disasters' formation mechanisms, regularity of occurrence and main characteristics have been revealed to be more complicated and diverse in nature than previously thought. As the uncertainty involved increases, the variables need to be examined further. This article discusses the importance and the shortage of multivariate analysis of natural disasters and presents a method to estimate the joint probability of the return periods and perform a risk analysis. Severe dust storms from 1990 to 2008 in Inner Mongolia were used as a case study to test this new methodology, as they are normal and recurring climatic phenomena on Earth. Based on the 79 investigated events and a bivariate definition of severe dust storms, their joint probability distribution was established using the observed data of maximum wind speed and duration. The joint return periods of severe dust storms were calculated, and the relevant risk was analyzed according to the joint probability. The copula function is able to simulate severe dust storm disasters accurately. The joint return periods generated are closer to those observed in reality than the univariate return periods and thus have more value in severe dust storm disaster mitigation, strategy making, program design, and improvement of risk management. This research may prove useful in risk-based decision making. The exploration of multivariate analysis methods can also lay the foundation for further applications in natural disaster risk analysis. © 2012 Society for Risk Analysis.
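
    A minimal sketch of the copula-based joint return period computation described above, using a Gumbel-Hougaard copula; gumbel_copula is a hypothetical helper, and the marginal probabilities, dependence parameter and event rate are invented, not the paper's fitted values.

    import math

    def gumbel_copula(u, v, theta):
        """Gumbel-Hougaard copula C(u, v), with theta >= 1."""
        return math.exp(-(((-math.log(u)) ** theta
                           + (-math.log(v)) ** theta) ** (1.0 / theta)))

    # Invented marginal non-exceedance probabilities at thresholds w and d.
    u, v = 0.95, 0.90          # F_wind(w), F_duration(d)
    theta = 2.0                # assumed dependence strength
    mu = 1.0 / 4.4             # assumed mean interarrival time between events (years)

    # "AND" return period: both thresholds exceeded.
    T_and = mu / (1.0 - u - v + gumbel_copula(u, v, theta))
    # "OR" return period: at least one threshold exceeded.
    T_or = mu / (1.0 - gumbel_copula(u, v, theta))
    print(f"T_and = {T_and:.1f} yr, T_or = {T_or:.1f} yr")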

  16. Probabilistic risk assessment course documentation. Volume 2. Probability and statistics for PRA applications

    Energy Technology Data Exchange (ETDEWEB)

    Iman, R.L.; Prairie, R.R.; Cramond, W.R.

    1985-08-01

    This course is intended to provide the necessary probabilistic and statistical skills to perform a PRA. Fundamental background information is reviewed, but the principal purpose is to address specific techniques used in PRAs and to illustrate them with applications. Specific examples and problems are presented for most of the topics.

  17. Level crossing statistics of atmospheric wind speeds: Probability distribution of episode lengths

    Science.gov (United States)

    Edwards, Paul J.

    2000-03-01

    The probability distribution of the duration of episodes ("wind runs") during which the horizontal wind speed in the planetary surface boundary layer remains above or below a given threshold value is of interest in the fields of renewable energy generation and pollutant dispersal. There still appear to be no analytic or conceptual models to explain the remarkable constancy of the power-law form of the wind-run distribution measured at a variety of sites on the earth's surface, for run lengths ranging from a few minutes to a day or more.

  18. [Probability of disturbance of Poisson statistics in radioactive decay type processes].

    Science.gov (United States)

    Kirillov, A A; Zenchenko, K I

    2001-01-01

    The phenomenon of regular dynamics of fine structure of sampling distributions in processes of radioactive decay type is discussed. The time dependence of the probability of their similarity indicates that the process measured is nonstationary. The most natural explanation for the disturbance of stationarity is the influence of gauging equipment, which in principle cannot be avoided completely. However, the universality of the effect and the presence of the correlation of the fine structure in independent synchronous processes leads one to suggest the existence of some global source. A mechanism of the generation of fine structure as a result of the influence of nontrivial topology of space on the decay process is proposed.

  19. Rank-k Maximal Statistics for Divergence and Probability of Misclassification

    Science.gov (United States)

    Decell, H. P., Jr.

    1972-01-01

    A technique is developed for selecting from n-channel multispectral data some k combinations of the n-channels upon which to base a given classification technique so that some measure of the loss of the ability to distinguish between classes, using the compressed k-dimensional data, is minimized. Information loss in compressing the n-channel data to k channels is taken to be the difference in the average interclass divergences (or probability of misclassification) in n-space and in k-space.

  20. Review of RiskQ: An interactive approach to probability, uncertainty, and statistics for use with Mathematica

    Energy Technology Data Exchange (ETDEWEB)

    Murray, D.M.; Burmaster, D.E. (Alceon Corp., Cambridge, MA (United States))

    1993-08-01

    RiskQ is a computer program designed to enable users to perform analyses involving probability, uncertainty, and statistics. It is written in the Mathematica programming language. RiskQ is intended for individuals who have experience with Mathematica and a good working knowledge of probabilistic and statistical concepts. In this article the authors discuss the features of RiskQ, its performance as compared with Crystal Ball and Risk, documentation, ease of learning and use, error handling, and support. 2 refs., 2 figs.

  1. ESTIMATION OF PILOTING ERROR STATISTICS AND THE PROBABILITY OF SATELLITE NAVIGATION SYSTEM FAILURE

    Directory of Open Access Journals (Sweden)

    V. L. Kuznetsov

    2014-01-01

    Full Text Available A new approach to the problem of co-processing aircraft position data from satellite navigation systems and the secondary surveillance radar system has been developed. The purpose of the task is to obtain estimates of the error distributions for control systems and piloting errors. The possibility of a statistical relationship between piloting errors and satellite navigation system errors is taken into account.

  2. Some statistical properties of surface slopes via remote sensing considering a non-Gaussian probability density function

    Science.gov (United States)

    Poom-Medina, José Luis; Álvarez-Borrego, Josué

    2016-07-01

    Theoretical relationships for the statistical properties of surface slope, derived from the statistical properties of image intensity in remotely sensed images and considering a non-Gaussian probability density function of the surface slope, are shown. Considering a variable detector line-of-sight angle, and considering ocean waves moving along a single direction with the observer and the sun both in the vertical plane containing this direction, new expressions relating the variance of the image intensity to the variance of the surface slopes are derived using two different glitter functions. In this case, the skewness and kurtosis moments are taken into account. In addition, new expressions relating the correlation functions of the image intensities and the surface slopes are numerically analyzed; for this case, only the skewness moments were considered. It is possible to observe more changes in these statistical relationships when the Rect function is used. The skewness and kurtosis values are in direct relation with the wind velocity on the sea surface.

  3. A refined statistical cloud closure using double-Gaussian probability density functions

    Science.gov (United States)

    Naumann, A. K.; Seifert, A.; Mellado, J. P.

    2013-10-01

    We introduce a probability density function (PDF)-based scheme to parameterize cloud fraction, average liquid water and liquid water flux in large-scale models, which is developed from and tested against large-eddy simulations and observational data. Because the tails of the PDFs are crucial for an appropriate parameterization of cloud properties, we use a double-Gaussian distribution that is able to represent the observed, skewed PDFs properly. Introducing two closure equations, the resulting parameterization relies on the first three moments of the subgrid variability of temperature and moisture as input parameters. The parameterization is found to be superior to a single-Gaussian approach in diagnosing the cloud fraction and average liquid water profiles. A priori testing also suggests improved accuracy compared to existing double-Gaussian closures. Furthermore, we find that the error of the new parameterization is smallest for a horizontal resolution of about 5-20 km and also depends on the appearance of mesoscale structures that are accompanied by higher rain rates. In combination with simple autoconversion schemes that depend only on the liquid water, the error introduced by the new parameterization is orders of magnitude smaller than the difference between various autoconversion schemes. For the liquid water flux, we introduce a parameterization that depends on the skewness of the subgrid variability of temperature and moisture and that reproduces the profiles of the liquid water flux well.

  4. Analytical probability density function for the statistics of the ENSO phenomenon: Asymmetry and power law tail

    Science.gov (United States)

    Bianucci, M.

    2016-01-01

    This letter has two main goals. The first one is to give a physically reasonable explanation for the use of stochastic models for mimicking the apparent random features of the El Niño-Southern Oscillation (ENSO) phenomenon. The second one is to obtain, from the theory, an analytical expression for the equilibrium density function of the anomaly sea surface temperature, an expression that fits the data from observations well, reproducing the asymmetry and the power law tail of the histograms of the NIÑO3 index. We succeed in these tasks by exploiting some recent theoretical results of the author in the field of the dynamical origin of stochastic processes. More precisely, we apply this approach to the celebrated recharge oscillator model (ROM), weakly interacting through a multiplicative term with a general deterministic complex forcing (Madden-Julian Oscillations, westerly wind bursts, etc.), and we obtain a Fokker-Planck equation that describes the statistical behavior of the ROM.

  5. Active Contour Driven by Local Region Statistics and Maximum A Posteriori Probability for Medical Image Segmentation

    Directory of Open Access Journals (Sweden)

    Xiaoliang Jiang

    2014-01-01

    Full Text Available This paper presents a novel active contour model in a variational level set formulation for simultaneous segmentation and bias field estimation of medical images. An energy function is formulated based on an improved Kullback-Leibler distance (KLD) with likelihood ratio. According to the additive model of images with intensity inhomogeneity, we characterize the statistics of image intensities belonging to each different object in local regions as Gaussian distributions with different means and variances. Then, we use the Gaussian distribution with bias field as a local region descriptor in the level set formulation for segmentation and bias field correction of the images with inhomogeneous intensities. Therefore, image segmentation and bias field estimation are simultaneously achieved by minimizing the level set formulation. Experimental results demonstrate the desirable performance of the proposed method for different medical images with weak boundaries and noise.

  6. Performance of Statistical Control Charts with Bilateral Limits of Probability to Monitor Processes Weibull in Maintenance

    Directory of Open Access Journals (Sweden)

    Quintana Alicia Esther

    2015-01-01

    Full Text Available Manufacturing to optimal quality standards rests on the high reliability of equipment and systems, among other essential pillars. Maintenance engineering is responsible for the planning, control and continuous improvement of critical equipment under any approach, such as Six Sigma, which draws on numerous statistical tools, notably statistical process control charts. While their first applications were in production, other designs have emerged to adapt to new needs, such as monitoring equipment and systems in the manufacturing environment. The time between failures usually fits an exponential or Weibull model. The t chart and the adjusted t chart, with probabilistic control limits, are suitable alternatives for monitoring the mean time between failures. Unfortunately, publications applying them to Weibull models, which are very useful in contexts such as maintenance, are hard to find. In addition, the literature limits the study of their performance to the analysis of the standard metric, average run length, thus giving a partial view. The aim of this paper is to explore the performance of the t chart and the adjusted t chart using three metrics, two of them unconventional. To do this, it incorporates the concept of lateral variability, in its left and right forms. A more precise description of the behavior of these charts makes it possible to understand the conditions under which each is suitable: if the main objective of monitoring is to detect deterioration, the adjusted t chart is recommended. On the other hand, when the priority is to detect improvements, the unadjusted t chart is the better choice. However, the response speed of both charts varies considerably from run to run.
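
    A minimal sketch of the bilateral probability limits discussed above: for times between failures following a Weibull distribution with shape beta and scale eta, the control limits are simply the alpha/2 and 1-alpha/2 quantiles. The parameters are invented, and the adjustment step of the adjusted t chart is not reproduced here.

    import math

    beta_shape, eta_scale = 1.5, 200.0   # assumed Weibull shape and scale (hours)
    alpha = 0.0027                       # false-alarm rate matching 3-sigma charts

    def weibull_quantile(p, beta, eta):
        """Inverse CDF of the Weibull: F(t) = 1 - exp(-(t/eta)^beta)."""
        return eta * (-math.log(1.0 - p)) ** (1.0 / beta)

    lcl = weibull_quantile(alpha / 2.0, beta_shape, eta_scale)
    cl = weibull_quantile(0.5, beta_shape, eta_scale)         # median center line
    ucl = weibull_quantile(1.0 - alpha / 2.0, beta_shape, eta_scale)
    print(f"LCL = {lcl:.2f} h, CL = {cl:.1f} h, UCL = {ucl:.1f} h")
    # A time-between-failures below the LCL signals deterioration (failures too
    # frequent); a point above the UCL signals improvement.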

  7. Inclusion probability for DNA mixtures is a subjective one-sided match statistic unrelated to identification information

    Directory of Open Access Journals (Sweden)

    Mark William Perlin

    2015-01-01

    Full Text Available Background: DNA mixtures of two or more people are a common type of forensic crime scene evidence. A match statistic that connects the evidence to a criminal defendant is usually needed for court. Jurors rely on this strength of match to help decide guilt or innocence. However, the reliability of unsophisticated match statistics for DNA mixtures has been questioned. Materials and Methods: The most prevalent match statistic for DNA mixtures is the combined probability of inclusion (CPI), used by crime labs for over 15 years. When testing 13 short tandem repeat (STR) genetic loci, the CPI⁻¹ value is typically around a million, regardless of DNA mixture composition. However, actual identification information, as measured by a likelihood ratio (LR), spans a much broader range. This study examined probability of inclusion (PI) mixture statistics for 517 locus experiments drawn from 16 reported cases and compared them with LR locus information calculated independently on the same data. The log(PI⁻¹) values were examined and compared with corresponding log(LR) values. Results: The LR and CPI methods were compared in case examples of false inclusion, false exclusion, a homicide, and criminal justice outcomes. Statistical analysis of crime laboratory STR data shows that inclusion match statistics exhibit a truncated normal distribution having zero center, with little correlation to actual identification information. By the law of large numbers (LLN), CPI⁻¹ increases with the number of tested genetic loci, regardless of DNA mixture composition or match information. These statistical findings explain why CPI is relatively constant, with implications for DNA policy, criminal justice, cost of crime, and crime prevention. Conclusions: Forensic crime laboratories have generated CPI statistics on hundreds of thousands of DNA mixture evidence items. However, this commonly used match statistic behaves like a random generator of inclusionary values, following the LLN
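
    For concreteness, the CPI criticized above is conventionally computed per locus from the frequencies of all alleles observed in the mixture and multiplied across loci; a minimal sketch with invented allele frequencies (real casework uses population databases):

    # Invented allele frequencies for the alleles observed in a mixture, by locus.
    mixture_alleles = {
        "D8S1179": [0.10, 0.15, 0.20, 0.31],
        "D21S11":  [0.05, 0.18, 0.25],
        "TH01":    [0.12, 0.22, 0.30, 0.19],
    }

    cpi = 1.0
    for locus, freqs in mixture_alleles.items():
        pi_locus = sum(freqs) ** 2      # probability of inclusion at this locus
        cpi *= pi_locus
        print(f"{locus}: PI = {pi_locus:.4f}")

    print(f"CPI = {cpi:.6f}, CPI^-1 = {1.0 / cpi:,.0f}")
    # Note how CPI^-1 grows multiplicatively with each added locus regardless
    # of the mixture itself, the law-of-large-numbers behavior the article describes.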

  8. Survival probability and first-passage-time statistics of a Wiener process driven by an exponential time-dependent drift

    Science.gov (United States)

    Urdapilleta, Eugenio

    2011-02-01

    The survival probability and the first-passage-time statistics are important quantities in different fields. The Wiener process is the simplest stochastic process with continuous variables, and important results can be explicitly found from it. The presence of a constant drift does not modify its simplicity; however, when the process has a time-dependent component the analysis becomes difficult. In this work we analyze the statistical properties of the Wiener process with an absorbing boundary, under the effect of an exponential time-dependent drift. Based on the backward Fokker-Planck formalism we set the time-inhomogeneous equation and conditions that rule the diffusion of the corresponding survival probability. We propose as the solution an expansion series in terms of the intensity of the exponential drift, resulting in a set of recurrence equations. We explicitly solve the expansion up to second order and comment on higher-order solutions. The first-passage-time density function arises naturally from the survival probability and preserves the proposed expansion. Explicit results, related properties, and limit behaviors are analyzed and extensively compared to numerical simulations.
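
    As a point of reference for the expansion described above, the zeroth-order (constant-drift) problem has a classical closed form: for a Wiener process started at x_0 > 0 with drift mu and an absorbing boundary at the origin, the first-passage-time density is inverse Gaussian (a standard result, quoted here in our own notation rather than the paper's):

    \[
      f(t) \;=\; \frac{x_0}{\sqrt{2\pi t^{3}}}\,
      \exp\!\left(-\frac{(x_0 + \mu t)^2}{2t}\right), \qquad t > 0,
    \]

    with survival probability \(S(t) = 1 - \int_0^t f(s)\,ds\); the exponential time-dependent component of the drift is what perturbs this baseline in the paper's expansion.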

  9. Survival probability and first-passage-time statistics of a Wiener process driven by an exponential time-dependent drift.

    Science.gov (United States)

    Urdapilleta, Eugenio

    2011-02-01

    The survival probability and the first-passage-time statistics are important quantities in different fields. The Wiener process is the simplest stochastic process with continuous variables, and important results can be explicitly found from it. The presence of a constant drift does not modify its simplicity; however, when the process has a time-dependent component the analysis becomes difficult. In this work we analyze the statistical properties of the Wiener process with an absorbing boundary, under the effect of an exponential time-dependent drift. Based on the backward Fokker-Planck formalism we set the time-inhomogeneous equation and conditions that rule the diffusion of the corresponding survival probability. We propose as the solution an expansion series in terms of the intensity of the exponential drift, resulting in a set of recurrence equations. We explicitly solve the expansion up to second order and comment on higher-order solutions. The first-passage-time density function arises naturally from the survival probability and preserves the proposed expansion. Explicit results, related properties, and limit behaviors are analyzed and extensively compared to numerical simulations.

  10. A Hybrid Course for Probability and Statistics for Engineers: An E-Readiness Study at Shahid Beheshti University

    Directory of Open Access Journals (Sweden)

    Amir T Payandeh

    2010-09-01

    Full Text Available Probability and Statistics for Engineers covers a variety of subjects in set theory, combinatorial analysis, probability, statistics, and (in some universities) stochastic processes. Since the course receives only 3 credits, it has to be taught 3 hours/week. This overloaded content, along with the time limitation, makes the course a challenging and difficult one for students. Also, many instructors, including the first author, have found the course very challenging to teach. The two popular training systems, on-site and e-learning, do not provide an appropriate solution. This article suggests a hybrid training system, which combines elements of both training systems to reduce the disadvantages of each. Readiness for such a hybrid course is measured by the preparedness of students for online activities. The readiness study at Shahid Beheshti University shows that Internet skills, self-directed learning, learner attitude toward e-learning, e-mail skills, and software ability are the factors that significantly affect students' readiness.

  11. Integrating Geologic, Geochemical and Geophysical Data in a Statistical Analysis of Geothermal Resource Probability across the State of Hawaii

    Science.gov (United States)

    Lautze, N. C.; Ito, G.; Thomas, D. M.; Hinz, N.; Frazer, L. N.; Waller, D.

    2015-12-01

    Hawaii offers the opportunity to gain knowledge and develop geothermal energy on the only oceanic hotspot in the U.S. As a remote island state, Hawaii is more dependent on imported fossil fuel than any other state in the U.S., and energy prices are 3 to 4 times higher than the national average. The only proven resource, located on Hawaii Island's active Kilauea volcano, is a region of high geologic risk; other regions of probable resource exist but lack adequate assessment. The last comprehensive statewide geothermal assessment occurred in 1983 and found a potential resource on all islands (Hawaii Institute of Geophysics, 1983). Phase 1 of a Department of Energy funded project to assess the probability of geothermal resource potential statewide in Hawaii was recently completed. The execution of this project was divided into three main tasks: (1) compile all historical and current data for Hawaii that is relevant to geothermal resources into a single Geographic Information System (GIS) project; (2) analyze and rank these datasets in terms of their relevance to the three primary properties of a viable geothermal resource: heat (H), fluid (F), and permeability (P); and (3) develop and apply a Bayesian statistical method to incorporate the ranks and produce probability models that map out Hawaii's geothermal resource potential. Here, we summarize the project methodology and present maps that highlight both high prospect areas as well as areas that lack enough data to make an adequate assessment. We suggest a path for future exploration activities in Hawaii, and discuss how this method of analysis can be adapted to other regions and other types of resources. The figure below shows multiple layers of GIS data for Hawaii Island. Color shades indicate crustal density anomalies produced from inversions of gravity (Flinders et al. 2013). Superimposed on this are mapped calderas, rift zones, volcanic cones, and faults (following Sherrod et al., 2007). These features were used

  12. Statistics, Probability and Chaos

    OpenAIRE

    Berliner, L. Mark

    1992-01-01

    The study of chaotic behavior has received substantial attention in many disciplines. Although often based on deterministic models, chaos is associated with complex, "random" behavior and forms of unpredictability. Mathematical models and definitions associated with chaos are reviewed. The relationship between the mathematics of chaos and probabilistic notions, including ergodic theory and uncertainty modeling, are emphasized. Popular data analytic methods appearing in the literature are disc...

  13. Two novel quantitative trait linkage analysis statistics based on the posterior probability of linkage: application to the COGA families.

    Science.gov (United States)

    Bartlett, Christopher W; Vieland, Veronica J

    2005-12-30

    In this paper we apply two novel quantitative trait linkage statistics based on the posterior probability of linkage (PPL) to chromosome 4 from the GAW 14 COGA dataset. Our approaches are advantageous since they use the full likelihood, use full phenotypic information, do not assume normality at the population level or require population/sample parameter estimates; and, like other forms of the PPL, they are specifically tailored to accumulate linkage evidence, either for or against linkage, across multiple sets of heterogeneous data. The first statistic uses all quantitative trait (QT) information from the pedigree (the QT-posterior probability of linkage, QT-PPL); we applied the QT-PPL to the trait ecb21 (resting electroencephalogram). The second statistic allows simultaneous incorporation of dichotomous trait data into the QT analysis via a threshold model (QTT-PPL); we applied the QTT-PPL to combined data on ecb21 and ALDX1. We obtained a QT-PPL of 96% at GABRB1 and a QT-PPL of 18% at FABP2, while the QTT-PPL was 4% and 2% at the same two loci, respectively. By comparison, the variance-components (VC) method, as implemented in SOLAR, yielded multipoint VC LOD scores of 2.05 and 2.21 at GABRB1 and FABP2, respectively; no other VC LODs were greater than 2. The QTT-PPL was only 4% at GABRB1, which might suggest that the underlying ecb21 gene does not also cause ALDX1, although features of the data complicate interpretation of this result.

  14. Development of a statistical model for the determination of the probability of riverbank erosion in a Meditteranean river basin

    Science.gov (United States)

    Varouchakis, Emmanouil; Kourgialas, Nektarios; Karatzas, George; Giannakis, Georgios; Lilli, Maria; Nikolaidis, Nikolaos

    2014-05-01

    Riverbank erosion affects the river morphology and the local habitat and results in riparian land loss, damage to property and infrastructure, ultimately weakening flood defences. An important issue concerning riverbank erosion is the identification of the areas vulnerable to erosion, as it allows for predicting changes and assists with stream management and restoration. One way to predict the areas vulnerable to erosion is to determine the erosion probability by identifying the underlying relations between riverbank erosion and the geomorphological and/or hydrological variables that prevent or stimulate erosion. A statistical model for evaluating the probability of erosion based on a series of independent local variables and using logistic regression is developed in this work. The main variables affecting erosion are vegetation index (stability), the presence or absence of meanders, bank material (classification), stream power, bank height, river bank slope, riverbed slope, cross section width and water velocities (Luppi et al. 2009). In statistics, logistic regression is a type of regression analysis used for predicting the outcome of a categorical dependent variable, e.g. a binary response, based on one or more predictor variables (continuous or categorical). The probabilities of the possible outcomes are modelled as a function of the independent variables using a logistic function. Logistic regression measures the relationship between a categorical dependent variable and, usually, one or several continuous independent variables by converting the dependent variable to probability scores. Then, a logistic regression is formed, which predicts success or failure of a given binary variable (e.g. 1 = "presence of erosion" and 0 = "no erosion") for any value of the independent variables. The regression coefficients are estimated by using maximum likelihood estimation. The erosion occurrence probability can be calculated in conjunction with the model deviance regarding
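
    A minimal sketch of the logistic-regression step described above, fitting P(erosion) by Newton-Raphson (IRLS) maximum likelihood; the data and the choice of only two predictors are invented for illustration, whereas the study uses the full variable list given in the abstract.

    import numpy as np

    # Invented training data: [bank slope (deg), stream power (W/m)] -> erosion (1/0).
    X = np.array([[35, 120], [10, 30], [42, 150], [15, 45], [38, 90], [8, 20],
                  [30, 110], [12, 60], [45, 160], [20, 50], [28, 95], [18, 70]],
                 dtype=float)
    y = np.array([1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 0, 1], dtype=float)

    A = np.column_stack([np.ones(len(X)), X])    # add intercept column
    beta = np.zeros(A.shape[1])

    for _ in range(25):                          # IRLS (Newton-Raphson) iterations
        p = 1.0 / (1.0 + np.exp(-A @ beta))      # current logistic probabilities
        W = p * (1.0 - p)                        # IRLS weights
        grad = A.T @ (y - p)                     # score vector
        hess = A.T @ (A * W[:, None])            # observed information
        beta += np.linalg.solve(hess + 1e-6 * np.eye(len(beta)), grad)

    new_site = np.array([1.0, 33.0, 100.0])      # intercept, slope, stream power
    print("P(erosion) =", 1.0 / (1.0 + np.exp(-new_site @ beta)))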

  15. Interference statistics and capacity analysis for uplink transmission in two-tier small cell networks: A geometric probability approach

    KAUST Repository

    Tabassum, Hina

    2014-07-01

    This paper presents a novel framework to derive the statistics of the interference considering dedicated and shared spectrum access for uplink transmission in two-tier small cell networks such as the macrocell-femtocell networks. The framework exploits the distance distributions from geometric probability theory to characterize the uplink interference while considering a traditional grid-model set-up for macrocells along with the randomly deployed femtocells. The derived expressions capture the impact of path-loss, composite shadowing and fading, uniform and non-uniform traffic loads, spatial distribution of femtocells, and partial and full spectral reuse among femtocells. Considering dedicated spectrum access, first, we derive the statistics of co-tier interference incurred at both femtocell and macrocell base stations (BSs) from a single interferer by approximating the generalized-K composite fading distribution with the tractable Gamma distribution. We then derive the distribution of the number of interferers considering partial spectral reuse and the moment generating function (MGF) of the cumulative interference for both partial and full spectral reuse scenarios. Next, we derive the statistics of the cross-tier interference at both femtocell and macrocell BSs considering shared spectrum access. Finally, we utilize the derived expressions to analyze the capacity in both dedicated and shared spectrum access scenarios. The derived expressions are validated by the Monte Carlo simulations. Numerical results are generated to assess the feasibility of shared and dedicated spectrum access in femtocells under varying traffic load and spectral reuse scenarios. © 2014 IEEE.

  16. Guided waves based SHM systems for composites structural elements: statistical analyses finalized at probability of detection definition and assessment

    Science.gov (United States)

    Monaco, E.; Memmolo, V.; Ricci, F.; Boffa, N. D.; Maio, L.

    2015-03-01

    Maintenance approaches based on sensorised structures and Structural Health Monitoring systems have represented one of the most promising innovations in the field of aerostructures for many years, especially where composite materials (fiber-reinforced resins) are concerned. Layered materials still suffer today from drastic reductions of maximum allowable stress values during the design phase, as well as from costly recurrent inspections during the life cycle, which prevent their structural and economic potential from being fully exploited in today's aircraft. Those penalizing measures are necessary mainly to account for the presence of undetected hidden flaws within the layered sequence (delaminations) or in bonded areas (partial disbonding). In order to relax design and maintenance constraints, a system based on sensors permanently installed on the structure to detect and locate eventual flaws can be considered (an SHM system), once its effectiveness and reliability have been statistically demonstrated via a rigorous Probability of Detection (POD) function definition and evaluation. This paper presents an experimental approach with a statistical procedure for the evaluation of the detection threshold of a guided-waves-based SHM system oriented to delamination detection on a typical composite layered wing panel. The experimental tests are mostly oriented to characterizing the statistical distribution of measurements and damage metrics, as well as to characterizing the system detection capability using this approach. Numerically, it is not possible to substitute the part of the experimental tests aimed at POD, where the noise in the system response is crucial. Results of the experiments are presented in the paper and analyzed.

  17. Relationship between the generalized equivalent uniform dose formulation and the Poisson statistics-based tumor control probability model.

    Science.gov (United States)

    Zhou, Su-Min; Das, Shiva; Wang, Zhiheng; Marks, Lawrence B

    2004-09-01

    The generalized equivalent uniform dose (GEUD) model uses a power-law formalism, where the outcome is related to the dose via a power law. We herein investigate the mathematical compatibility between this GEUD model and the Poisson statistics based tumor control probability (TCP) model. The GEUD and TCP formulations are combined and subjected to a compatibility constraint equation. This compatibility constraint equates tumor control probability from the original heterogeneous target dose distribution to that from the homogeneous dose from the GEUD formalism. It is shown that this constraint equation possesses a unique, analytical closed-form solution which relates radiation dose to the tumor cell survival fraction. It is further demonstrated that, when there is no positive threshold or finite critical dose in the tumor response to radiation, this relationship is not bounded within the realistic cell survival limits of 0%-100%. Thus, the GEUD and TCP formalisms are, in general, mathematically inconsistent. However, when a threshold dose or finite critical dose exists in the tumor response to radiation, there is a unique mathematical solution for the tumor cell survival fraction that allows the GEUD and TCP formalisms to coexist, provided that all portions of the tumor are confined within certain specific dose ranges.
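
    For orientation, the two formalisms being combined are, in their usual textbook forms (stated from general knowledge of the radiotherapy literature; the paper's own compatibility constraint equation is not reproduced):

    \[
      \mathrm{GEUD} \;=\; \Bigl(\sum_i v_i\, D_i^{\,a}\Bigr)^{1/a},
      \qquad
      \mathrm{TCP} \;=\; \exp\bigl(-N_0\,\mathrm{SF}(D)\bigr),
    \]

    where v_i is the fractional tumor volume receiving dose D_i, a is the power-law parameter, N_0 the initial clonogen number and SF(D) the cell survival fraction at dose D; the compatibility constraint equates the TCP computed on the heterogeneous dose distribution \{(v_i, D_i)\} to the TCP evaluated at the single homogeneous dose GEUD.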

  18. Flood probability quantification for road infrastructure: Data-driven spatial-statistical approach and case study applications.

    Science.gov (United States)

    Kalantari, Zahra; Cavalli, Marco; Cantone, Carolina; Crema, Stefano; Destouni, Georgia

    2017-03-01

    Climate-driven increase in the frequency of extreme hydrological events is expected to impose greater strain on the built environment and major transport infrastructure, such as roads and railways. This study develops a data-driven spatial-statistical approach to quantifying and mapping the probability of flooding at critical road-stream intersection locations, where water flow and sediment transport may accumulate and cause serious road damage. The approach is based on novel integration of key watershed and road characteristics, including also measures of sediment connectivity. The approach is concretely applied to and quantified for two specific study case examples in southwest Sweden, with documented road flooding effects of recorded extreme rainfall. The novel contributions of this study in combining a sediment connectivity account with that of soil type, land use, spatial precipitation-runoff variability and road drainage in catchments, and in extending the connectivity measure use for different types of catchments, improve the accuracy of model results for road flood probability. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Variation in the fatal cancer probability per unit effective dose with age at exposure and population statistics

    Energy Technology Data Exchange (ETDEWEB)

    Carter, M.W.

    1995-12-31

    Effective dose is a measure of radiation detriment, that is, the probability of an adverse consequence resulting from radiation exposure. The unit of effective dose is the sievert. Unlike the gray, a radiation dose unit with real physical meaning, the sievert includes non-dimensional factors which can vary. The sievert cannot be directly measured, and the radiation detriment per sievert may not be the same in all circumstances. With the change, in ICRP 60, from 'absolute' risk to 'relative' risk, the risk that the sievert represents now depends, inter alia, on population statistics and age at exposure. With 'relative' risk, radiation exposure is assumed to have a multiplying effect on the 'natural' cancer incidence. As this varies with age and between populations, the risk per sievert will differ between populations. Persons with unusual dose accumulation patterns, or who are members of populations with vital statistics significantly different from those used by the ICRP, will experience a risk per sievert different from the ICRP value. Results of calculations for populations representative of a developed country and a hypothetical developing country, over a range of ages at exposure, are presented to illustrate the range of risks represented by the sievert. The calculations indicate that the risk per sievert is lower if the dose is received later in life and is lower in a developing country than in a developed country, but the range in risk is not great. (author).

  20. A Study of Students' Learning Styles, Discipline Attitudes and Knowledge Acquisition in Technology-Enhanced Probability and Statistics Education.

    Science.gov (United States)

    Christou, Nicolas; Dinov, Ivo D

    2010-09-01

    Many modern technological advances have direct impact on the format, style and efficacy of delivery and consumption of educational content. For example, various novel communication and information technology tools and resources enable efficient, timely, interactive and graphical demonstrations of diverse scientific concepts. In this manuscript, we report on a meta-study of 3 controlled experiments of using the Statistics Online Computational Resources in probability and statistics courses. Web-accessible SOCR applets, demonstrations, simulations and virtual experiments were used in different courses as treatment and compared to matched control classes utilizing traditional pedagogical approaches. Qualitative and quantitative data we collected for all courses included Felder-Silverman-Soloman index of learning styles, background assessment, pre and post surveys of attitude towards the subject, end-point satisfaction survey, and varieties of quiz, laboratory and test scores. Our findings indicate that students' learning styles and attitudes towards a discipline may be important confounds of their final quantitative performance. The observed positive effects of integrating information technology with established pedagogical techniques may be valid across disciplines within the broader spectrum courses in the science education curriculum. The two critical components of improving science education via blended instruction include instructor training, and development of appropriate activities, simulations and interactive resources.

  1. A Study of Students' Learning Styles, Discipline Attitudes and Knowledge Acquisition in Technology-Enhanced Probability and Statistics Education

    Science.gov (United States)

    Christou, Nicolas; Dinov, Ivo D.

    2011-01-01

    Many modern technological advances have direct impact on the format, style and efficacy of delivery and consumption of educational content. For example, various novel communication and information technology tools and resources enable efficient, timely, interactive and graphical demonstrations of diverse scientific concepts. In this manuscript, we report on a meta-study of 3 controlled experiments of using the Statistics Online Computational Resources in probability and statistics courses. Web-accessible SOCR applets, demonstrations, simulations and virtual experiments were used in different courses as treatment and compared to matched control classes utilizing traditional pedagogical approaches. Qualitative and quantitative data we collected for all courses included Felder-Silverman-Soloman index of learning styles, background assessment, pre and post surveys of attitude towards the subject, end-point satisfaction survey, and varieties of quiz, laboratory and test scores. Our findings indicate that students' learning styles and attitudes towards a discipline may be important confounds of their final quantitative performance. The observed positive effects of integrating information technology with established pedagogical techniques may be valid across disciplines within the broader spectrum courses in the science education curriculum. The two critical components of improving science education via blended instruction include instructor training, and development of appropriate activities, simulations and interactive resources. PMID:21603097

  2. Evaluation of forensic DNA mixture evidence: protocol for evaluation, interpretation, and statistical calculations using the combined probability of inclusion.

    Science.gov (United States)

    Bieber, Frederick R; Buckleton, John S; Budowle, Bruce; Butler, John M; Coble, Michael D

    2016-08-31

    The evaluation and interpretation of forensic DNA mixtures faces growing challenges as mixture evidence becomes increasingly complex. Such challenges include: casework involving low-quantity or degraded evidence leading to allele and locus dropout; allele sharing among contributors leading to allele stacking; and differentiation of PCR stutter artifacts from true alleles. There is variation in the statistical approaches used to evaluate the strength of the evidence when inclusion of a specific known individual(s) is determined, and the approaches used must be supportable. There are concerns that methods utilized for interpretation of complex forensic DNA mixtures may not be implemented properly in some casework. Similar questions are being raised in a number of U.S. jurisdictions, leading to some confusion about mixture interpretation for current and previous casework. Key elements necessary for the interpretation and statistical evaluation of forensic DNA mixtures are described. The most common method for statistical evaluation of DNA mixtures in many parts of the world, including the USA, is the Combined Probability of Inclusion/Exclusion (CPI/CPE); exposition and elucidation of this method, and a protocol for its use, are the focus of this article. Formulae and other supporting materials are provided. Guidance and details of a DNA mixture interpretation protocol are provided for application of the CPI/CPE method in the analysis of more complex forensic DNA mixtures. This description, in turn, should help reduce the variability of interpretation with application of this methodology and thereby improve the quality of DNA mixture interpretation throughout the forensic community.
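
    A minimal sketch of the core CPI computation, assuming hypothetical allele frequencies; real casework applies the additional interpretation rules (stutter, dropout, stochastic thresholds) that the protocol in the article describes.

      # CPI: at each locus, square the summed frequencies of all alleles
      # observed in the mixture, then multiply across loci.
      from math import prod

      def cpi(loci_allele_freqs):
          """loci_allele_freqs: one list of observed-allele frequencies per locus."""
          return prod(sum(freqs) ** 2 for freqs in loci_allele_freqs)

      # Two example loci with hypothetical frequencies of the observed alleles.
      mixture = [[0.12, 0.08, 0.20], [0.15, 0.05]]
      print(f"CPI = {cpi(mixture):.4f}")  # chance a random person is included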

  3. Statistics concerning the Apollo command module water landing, including the probability of occurrence of various impact conditions, successful impact, and body X-axis loads

    Science.gov (United States)

    Whitnah, A. M.; Howes, D. B.

    1971-01-01

    Statistical information for the Apollo command module water landings is presented. This information includes the probability of occurrence of various impact conditions, a successful impact, and body X-axis loads of various magnitudes.

  4. CAN'T MISS--conquer any number task by making important statistics simple. Part 2. Probability, populations, samples, and normal distributions.

    Science.gov (United States)

    Hansen, John P

    2003-01-01

    Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 2, describes probability, populations, and samples. The uses of descriptive and inferential statistics are outlined. The article also discusses the properties and probability of normal distributions, including the standard normal distribution.
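
    As a small worked illustration of the normal-distribution probabilities the article covers (the numbers are generic, not taken from the article):

      from scipy.stats import norm

      print(norm.cdf(1.96))              # P(Z < 1.96) ~ 0.975, standard normal
      print(norm.cdf(1) - norm.cdf(-1))  # P(-1 < Z < 1) ~ 0.683, the 1-SD band

      # Sample means vary less than individual observations: the standard
      # error shrinks with the square root of the sample size.
      mu, sigma, n = 100.0, 15.0, 25
      se = sigma / n ** 0.5
      print(norm.sf(106, loc=mu, scale=se))  # P(sample mean > 106)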

  5. A simplification of the likelihood ratio test statistic for testing ...

    African Journals Online (AJOL)

    The traditional likelihood ratio test statistic for testing hypotheses about the goodness of fit of multinomial probabilities in one-, two- and multi-dimensional contingency tables was simplified. Advantageously, using the simplified version of the statistic to test the null hypothesis is easier and faster because calculating the expected ...
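
    For reference, a sketch of the traditional statistic being simplified, G = 2 Σ O ln(O/E) for a two-way table, with hypothetical counts (the simplified version proposed in the article is not reproduced here):

      import numpy as np
      from scipy.stats import chi2

      table = np.array([[30.0, 10.0], [20.0, 40.0]])  # hypothetical 2x2 counts
      expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / table.sum()

      G = 2.0 * np.sum(table * np.log(table / expected))
      df = (table.shape[0] - 1) * (table.shape[1] - 1)
      print(f"G = {G:.3f}, p = {chi2.sf(G, df):.4g}")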

  6. Introduction to Statistics - eNotes

    DEFF Research Database (Denmark)

    Brockhoff, Per B.; Møller, Jan Kloppenborg; Andersen, Elisabeth Wreford

    2015-01-01

    Online textbook used in the introductory statistics courses at DTU. It provides a basic introduction to applied statistics for engineers. The necessary elements from probability theory are introduced (stochastic variable, density and distribution function, mean and variance, etc.) and thereafter the most basic statistical analysis methods are presented: confidence bands, hypothesis testing, simulation, simple and multiple regression, ANOVA and analysis of contingency tables. Examples with the software R are included for all presented theory and methods.

  7. Statistical analysis of blocking probability and fragmentation based on Markov modeling of elastic spectrum allocation on fiber link

    Science.gov (United States)

    Rosa, A. N. F.; Wiatr, P.; Cavdar, C.; Carvalho, S. V.; Costa, J. C. W. A.; Wosinska, L.

    2015-11-01

    In an Elastic Optical Network (EON), spectrum fragmentation refers to the existence of non-aligned, small-sized blocks of free subcarrier slots in the optical spectrum. Several metrics have been proposed in order to quantify the level of spectrum fragmentation. Approximation methods might be used for estimating average blocking probability and some fragmentation measures, but they are so far unable to accurately evaluate the influence of different sizes of connection requests and do not allow in-depth investigation of blocking events and their relation to fragmentation. The analytical study of the effect of fragmentation on request blocking probability is still under-explored. In this work, we introduce new definitions for blocking that differentiate between the reasons for blocking events. We develop a framework based on Markov modeling to calculate steady-state probabilities for the different blocking events and to analyze fragmentation-related problems in elastic optical links under dynamic traffic conditions. This framework can also be used to evaluate different definitions of fragmentation in terms of their relation to the blocking probability. We investigate how different allocation request sizes contribute to fragmentation and blocking probability. Moreover, we show to what extent blocking events due to an insufficient amount of available resources become inevitable and, by comparison with the amount of blocking due to fragmented spectrum, we draw conclusions on the possible gains achievable through system defragmentation. We also show how efficient spectrum allocation policies really are in reducing the part of fragmentation that leads to actual blocking events. Simulation experiments show a good match with our analytical results for blocking probability in a small-scale scenario. Simulated blocking probabilities for the different blocking events are provided for a larger-scale elastic optical link.
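
    A generic sketch of the central Markov-modeling step: solving πQ = 0 with Σπ = 1 for steady-state probabilities. The toy three-state generator below is illustrative only, not the spectrum-allocation model of the paper.

      import numpy as np

      Q = np.array([[-1.0,  1.0,  0.0],
                    [ 0.5, -1.5,  1.0],
                    [ 0.0,  2.0, -2.0]])  # generator matrix, rows sum to zero

      n = Q.shape[0]
      A = np.vstack([Q.T, np.ones(n)])    # stack the normalization constraint
      b = np.append(np.zeros(n), 1.0)
      pi = np.linalg.lstsq(A, b, rcond=None)[0]
      print("steady-state probabilities:", pi)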

  8. Classroom Research: Assessment of Student Understanding of Sampling Distributions of Means and the Central Limit Theorem in Post-Calculus Probability and Statistics Classes

    Science.gov (United States)

    Lunsford, M. Leigh; Rowell, Ginger Holmes; Goodson-Espy, Tracy

    2006-01-01

    We applied a classroom research model to investigate student understanding of sampling distributions of sample means and the Central Limit Theorem in post-calculus introductory probability and statistics courses. Using a quantitative assessment tool developed by previous researchers and a qualitative assessment tool developed by the authors, we…

  9. Test the Overall Significance of p-values by Using Joint Tail Probability of Ordered p-values as Test Statistic

    NARCIS (Netherlands)

    Fang, Yongxiang; Wit, Ernst

    2008-01-01

    Fisher’s combined probability test is the most commonly used method to test the overall significance of a set of independent p-values. However, it is obvious that Fisher’s statistic is more sensitive to smaller p-values than to larger ones, and a single small p-value may overrule the other p-values
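
    The statistic in question is X = -2 Σ ln p_i, which is chi-squared with 2k degrees of freedom under the null; a quick sketch (with made-up p-values) shows how a single tiny p-value drives the combined result:

      import numpy as np
      from scipy.stats import chi2

      pvals = np.array([0.001, 0.8, 0.9, 0.95])  # one very small p-value
      X = -2.0 * np.sum(np.log(pvals))
      print(chi2.sf(X, df=2 * len(pvals)))       # the small p dominates X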

  10. SOME ASPECTS OF THE USE OF MATHEMATICAL-STATISTICAL METHODS IN THE ANALYSIS OF SOCIO-HUMANISTIC TEXTS [Keywords: humanities and social text, mathematics, method, statistics, probability]

    Directory of Open Access Journals (Sweden)

    Zaira M Alieva

    2016-01-01

    Full Text Available The article analyzes the application of mathematical and statistical methods in the analysis of socio-humanistic texts. It describes the essence of mathematical and statistical methods and presents examples of their use in the study of humanities and social phenomena. It considers the key issues faced by the expert in applying mathematical-statistical methods in the socio-humanitarian sphere, including the persistent contrast between the socio-humanitarian sciences and mathematics, the complexity of identifying the object that is the bearer of the problem, and the use of a probabilistic approach. Conclusions are drawn from the results of the study.

  11. Statistical methods to quantify the effect of mite parasitism on the probability of death in honey bee colonies

    Science.gov (United States)

    Varroa destructor is a mite parasite of European honey bees, Apis mellifera, that weakens the population, can lead to the death of an entire honey bee colony, and is believed to be the parasite with the most economic impact on beekeeping. The purpose of this study was to estimate the probability of ...

  12. Sampling stored product insect pests: a comparison of four statistical sampling models for probability of pest detection

    Science.gov (United States)

    Statistically robust sampling strategies form an integral component of grain storage and handling activities throughout the world. Developing sampling strategies to target biological pests such as insects in stored grain is inherently difficult due to species biology and behavioral characteristics. ...

  13. A scan statistic for binary outcome based on hypergeometric probability model, with an application to detecting spatial clusters of Japanese encephalitis.

    Science.gov (United States)

    Zhao, Xing; Zhou, Xiao-Hua; Feng, Zijian; Guo, Pengfei; He, Hongyan; Zhang, Tao; Duan, Lei; Li, Xiaosong

    2013-01-01

    As a useful tool for geographical cluster detection of events, the spatial scan statistic is widely applied in many fields and plays an increasingly important role. The classic version of the spatial scan statistic for binary outcomes was developed by Kulldorff, based on the Bernoulli or the Poisson probability model. In this paper, we apply the hypergeometric probability model to construct the likelihood function under the null hypothesis. Compared with existing methods, the likelihood function under the null hypothesis is an alternative and indirect way to identify the potential cluster, and the test statistic is the extreme value of the likelihood function. As in Kulldorff's methods, we adopt a Monte Carlo test for the assessment of significance. Both methods are applied to detecting spatial clusters of Japanese encephalitis in Sichuan province, China, in 2009, and the detected clusters are identical. Simulations on independent benchmark data indicate that the test statistic based on the hypergeometric model outperforms Kulldorff's statistics for clusters of high population density or large size; otherwise Kulldorff's statistics are superior.
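
    A sketch of the hypergeometric ingredient, with hypothetical counts: under the null, the number of cases falling inside a candidate scanning window follows a hypergeometric law, and significance of the most extreme window is then assessed by Monte Carlo, as in the paper.

      from scipy.stats import hypergeom

      N, C = 10_000, 120   # population size and total cases (hypothetical)
      n, c = 500, 18       # subjects and cases inside the scanning window

      # P(X >= c) when c of the C cases land in a window of n subjects at random
      print(hypergeom.sf(c - 1, N, C, n))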

  14. Joint Contingency Contracting

    Science.gov (United States)

    2005-06-01

    Hurricanes Charley, Jeanne and Andrew are examples of domestic disaster emergency relief.

  15. Natural analogue study of CO2 storage monitoring using probability statistics of CO2-rich groundwater chemistry

    Science.gov (United States)

    Kim, K. K.; Hamm, S. Y.; Kim, S. O.; Yun, S. T.

    2016-12-01

    For confronting global climate change, carbon capture and storage (CCS) is a very useful strategy: greenhouse gases such as CO2 emitted from stacks are captured and then isolated in underground geologic storage. CO2-rich groundwater could be produced by CO2 dissolution into fresh groundwater around a CO2 storage site. As a consequence, natural analogue studies related to geologic storage provide insights into future geologic CO2 storage sites and can provide crucial information on the safety and security of geologic sequestration, the long-term impact of CO2 storage on the environment, and the field operation and monitoring that could be implemented for geologic sequestration. In this study, we developed a CO2 leakage monitoring method using probability density functions (PDFs) by characterizing naturally occurring CO2-rich groundwater. For the study, we used existing data on CO2-rich groundwaters in different geological regions (Gangwondo, Gyeongsangdo, and Choongchungdo provinces) of South Korea. Using the PDF method and a quantitative index (QI), we carried out qualitative and quantitative comparisons among local areas and chemical constituents. Geochemical properties of groundwater with and without CO2, expressed as PDFs, proved that pH, EC, TDS, HCO3-, Ca2+, Mg2+, and SiO2 were effective monitoring parameters for carbonated groundwater in the case of CO2 leakage from an underground storage site. KEY WORDS: CO2-rich groundwater, CO2 storage site, monitoring parameter, natural analogue, probability density function (PDF), QI (quantitative index). Acknowledgement: This study was supported by the "Basic Science Research Program through the National Research Foundation of Korea (NRF), which is funded by the Ministry of Education (NRF-2013R1A1A2058186)" and the "R&D Project on Environmental Management of Geologic CO2 Storage" from KEITI (Project number: 2014001810003).
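
    A minimal sketch of the PDF-based comparison with hypothetical concentrations: estimate the density of one monitoring parameter for background and CO2-rich groundwaters and quantify their overlap (the actual QI definition used in the study may differ).

      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(0)
      background = rng.normal(60, 15, 200)   # e.g. HCO3- in mg/L, hypothetical
      co2_rich = rng.normal(400, 120, 200)   # hypothetical CO2-rich samples

      kde_bg, kde_co2 = gaussian_kde(background), gaussian_kde(co2_rich)
      x = np.linspace(0, 900, 500)
      # Small overlap between the two PDFs marks the parameter as effective.
      overlap = np.minimum(kde_bg(x), kde_co2(x)).sum() * (x[1] - x[0])
      print(f"PDF overlap: {overlap:.3f}")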

  16. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  17. Using Logistic Regression and Random Forests multivariate statistical methods for landslide spatial probability assessment in North-Est Sicily, Italy

    Science.gov (United States)

    Trigila, Alessandro; Iadanza, Carla; Esposito, Carlo; Scarascia-Mugnozza, Gabriele

    2015-04-01

    North-East Sicily is strongly exposed to shallow landslide events. On October 1st, 2009, a severe rainstorm (225.5 mm of cumulative rainfall in 9 hours) caused flash floods and more than 1000 landslides, which struck several small villages such as Giampilieri, Altolia, Molino, Pezzolo, Scaletta Zanclea and Itala, with 31 fatalities, 6 missing persons and damage to buildings and transportation infrastructures. The landslides, mainly consisting of earth and debris translational slides evolving into debris flows, were triggered on steep slopes involving colluvium and regolith materials which cover the underlying metamorphic bedrock of the Peloritani Mountains. In this area catchments are small (about 10 square kilometres), elongated, with steep slopes, low-order streams and short times of concentration, and they discharge directly into the sea. In the past, landslides occurred at Altolia in 1613 and 2000, at Molino in 1750, 1805 and 2000, and at Giampilieri in 1791, 1918, 1929, 1932, 2000 and on October 25, 2007. The aim of this work is to define susceptibility models for shallow landslides using multivariate statistical analyses in the Giampilieri area (25 square kilometres). As a first step, a detailed landslide inventory map was produced through field surveys coupled with the observation of high-resolution aerial colour orthophotos taken immediately after the event. 1,490 initiation zones were identified; most of them have planimetric dimensions ranging from tens to a few hundreds of square metres. The spatial hazard assessment focused on the detachment areas. Susceptibility models, performed in a GIS environment, took into account several parameters. The morphometric and hydrologic parameters were derived from detailed 1×1 m LiDAR data. Square grid cells of 4×4 m were adopted as mapping units, on the basis of the area-frequency distribution of the detachment zones and the optimal representation of the local morphometric conditions (e.g., slope angle, plan curvature).
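
    A hedged sketch of the susceptibility-modeling step with scikit-learn; the features and labels below are synthetic placeholders, not the paper's LiDAR-derived grid data.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      n = 1000
      X = np.column_stack([rng.uniform(0, 60, n),   # slope angle (degrees)
                           rng.normal(0, 1, n)])    # plan curvature
      # Synthetic labels: steeper cells fail more often (illustrative only).
      y = (rng.random(n) < 1 / (1 + np.exp(-(X[:, 0] - 35) / 5))).astype(int)

      for model in (LogisticRegression(max_iter=1000),
                    RandomForestClassifier(n_estimators=200, random_state=0)):
          auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
          print(type(model).__name__, f"AUC = {auc:.3f}")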

  18. The DO-climate events are probably noise induced: statistical investigation of the claimed 1470 years cycle

    Directory of Open Access Journals (Sweden)

    P. D. Ditlevsen

    2007-01-01

    Full Text Available The significance of the apparent 1470-year cycle in the recurrence of the Dansgaard-Oeschger (DO) events, observed in the Greenland ice cores, is debated. Here we present statistical significance tests of this periodicity. The detection of a periodicity relies strongly on the accuracy of the dating of the DO events. Here we use both the new NGRIP GICC05 time scale, based on multi-parameter annual layer counting, and the GISP2 time scale, where the periodicity is most pronounced. For the NGRIP dating the recurrence times are indistinguishable from a random occurrence. This is also the case for the GISP2 dating, except in the case where the DO9 event is omitted from the record.

  19. Statistically advanced, self-similar, radial probability density functions of atmospheric and under-expanded hydrogen jets

    Science.gov (United States)

    Ruggles, Adam J.

    2015-11-01

    This paper presents improved statistical insight regarding the self-similar scalar mixing process of atmospheric hydrogen jets and the downstream region of under-expanded hydrogen jets. Quantitative planar laser Rayleigh scattering imaging is used to probe both jets. The self-similarity of statistical moments up to the sixth order (beyond the literature established second order) is documented in both cases. This is achieved using a novel self-similar normalization method that facilitated a degree of statistical convergence that is typically limited to continuous, point-based measurements. This demonstrates that image-based measurements of a limited number of samples can be used for self-similar scalar mixing studies. Both jets exhibit the same radial trends of these moments, demonstrating that advanced atmospheric self-similarity can be applied in the analysis of under-expanded jets. Self-similar histograms away from the centerline are shown to be the combination of two distributions. The first is attributed to turbulent mixing. The second, a symmetric Poisson-type distribution centered on zero mass fraction, progressively becomes the dominant and eventually sole distribution at the edge of the jet. This distribution is attributed to shot noise-affected pure air measurements, rather than a diffusive superlayer at the jet boundary. This conclusion is reached after a rigorous measurement uncertainty analysis and inspection of pure air data collected with each hydrogen data set. A threshold based upon the measurement noise analysis is used to separate the turbulent and pure air data, and thus estimate intermittency. Beta-distributions (four parameters) are used to accurately represent the turbulent distribution moments. This combination of measured intermittency and four-parameter beta-distributions constitutes a new, simple approach to model scalar mixing. Comparisons between global moments from the data and moments calculated using the proposed model show excellent

  20. Four hundred or more participants needed for stable contingency table estimates of clinical prediction rule performance

    DEFF Research Database (Denmark)

    Kent, Peter; Boyle, Eleanor; Keating, Jennifer L

    2017-01-01

    OBJECTIVE: To quantify variability in the results of statistical analyses based on contingency tables and discuss the implications for the choice of sample size for studies that derive clinical prediction rules. STUDY DESIGN AND SETTING: An analysis of three pre-existing sets of large cohort data (n = 4,062 to 8,674) was performed. In each dataset, repeated random sampling of various sample sizes, from n = 100 up to n = 2,000, was performed 100 times at each sample size, and the variability in estimates of sensitivity, specificity, positive and negative likelihood ratios, post-test probabilities, odds ratios and risk/prevalence ratios for each sample size was calculated. RESULTS: There were very wide, and statistically significant, differences in estimates derived from contingency tables from the same dataset when calculated in sample sizes below 400 people, and typically this variability
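
    The repeated random-sampling idea can be sketched on synthetic data: draw subsamples of increasing size from one large cohort and watch how widely a contingency-table statistic such as sensitivity swings below a few hundred participants (the data are simulated, not the study cohorts).

      import numpy as np

      rng = np.random.default_rng(1)
      N = 8000
      truth = rng.random(N) < 0.3                   # condition present
      test = np.where(truth, rng.random(N) < 0.8,   # sensitivity 0.8
                      rng.random(N) < 0.1)          # 1 - specificity 0.1

      for n in (100, 400, 2000):
          sens = []
          for _ in range(100):                      # 100 repeated subsamples
              idx = rng.choice(N, size=n, replace=False)
              t, d = test[idx], truth[idx]
              sens.append((t & d).sum() / d.sum())
          print(f"n={n}: sensitivity spread = {np.ptp(sens):.3f}")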

  1. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  2. Waterfowl disease contingency plan

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The purpose of this contingency plan is reduce waterfowl losses from disease, primarily avian botulism, along the eastern shore of the Great Salt Lake in Utah. This...

  3. Contingency Contracting Customer Guide

    Science.gov (United States)

    1996-12-01

    …deployed contracting officer to train individual customers on the process, the customer support guide provides the necessary explanations without straining valuable manpower resources. The Contracting Deployment Customer Guide aids the customer in contingency situations and addresses purchase requests

  4. Contingent Information Systems Development

    NARCIS (Netherlands)

    van Slooten, C.; Schoonhoven, Bram

    1996-01-01

    Situated approaches based on project contingencies are becoming more and more an important research topic for information systems development organizations. The Information Services Organization, which was investigated, has recognized that it should tune its systems development approaches to the

  5. Contingency planning: preparation of contingency plans

    DEFF Research Database (Denmark)

    Westergaard, J M

    2008-01-01

    Outbreaks of infectious animal diseases such as foot-and-mouth disease, classical swine fever, Newcastle disease and avian influenza may have a devastating impact, not only on the livestock sector and the rural community in the directly affected areas, but also beyond agriculture and nationwide. The risk of introducing disease pathogens into a country and the spread of the agent within a country depend on a number of factors, including import controls, movement of animals and animal products, and the biosecurity applied by livestock producers. An adequate contingency plan is an important instrument in the preparation for and the handling of an epidemic. The legislation of the European Union requires that all Member States draw up a contingency plan which specifies the national measures required to maintain a high level of awareness and preparedness and is to be implemented in the event of a disease outbreak

  6. Factual and cognitive probability

    OpenAIRE

    Chuaqui, Rolando

    2012-01-01

    This modification separates the two aspects of probability: probability as a part of physical theories (factual), and as a basis for statistical inference (cognitive). Factual probability is represented by probability structures as in the earlier papers, but now built independently of the language. Cognitive probability is interpreted as a form of "partial truth". The paper also contains a discussion of the Principle of Insufficient Reason and of Bayesian and classical statistical methods, in...

  7. Probability and statistics with R

    CERN Document Server

    Ugarte, Maria Dolores; Arnholt, Alan T

    2008-01-01

    —Technometrics, May 2009, Vol. 51, No. 2. "The book is comprehensive and well written. The notation is clear and the mathematical derivations behind nontrivial equations and computational implementations are carefully explained. Rather than presenting a collection of R scripts together with a summary of relevant theoretical results, this book offers a well-balanced mix of theory, examples and R code." —Raquel Prado, University of California, Santa Cruz, The American Statistician, February 2009. "… an impressive book … Overall, this is a good reference book with comprehensive coverage of the details"

  8. A contingency table approach to nonparametric testing

    CERN Document Server

    Rayner, JCW

    2000-01-01

    Most texts on nonparametric techniques concentrate on location and linear-linear (correlation) tests, with less emphasis on dispersion effects and linear-quadratic tests. Tests for higher moment effects are virtually ignored. Using a fresh approach, A Contingency Table Approach to Nonparametric Testing unifies and extends the popular, standard tests by linking them to tests based on models for data that can be presented in contingency tables.This approach unifies popular nonparametric statistical inference and makes the traditional, most commonly performed nonparametric analyses much more comp

  9. Contingent Faculty as Nonideal Workers

    Science.gov (United States)

    Kezar, Adrianna; Bernstein-Sierra, Samantha

    2016-01-01

    This chapter explores how contingent faculty address the issue of work and family and demonstrates the importance of understanding the diversity of contingent faculty experiences and of underemployment rather than notions of the ideal worker to explain their work lives.

  10. What Are Probability Surveys?

    Science.gov (United States)

    The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.

  11. Supervisory Styles: A Contingency Framework

    Science.gov (United States)

    Boehe, Dirk Michael

    2016-01-01

    While the contingent nature of doctoral supervision has been acknowledged, the literature on supervisory styles has yet to deliver a theory-based contingency framework. A contingency framework can assist supervisors and research students in identifying appropriate supervisory styles under varying circumstances. The conceptual study reported here…

  12. Evaluating probability forecasts

    OpenAIRE

    Lai, Tze Leung; Gross, Shulamith T.; Shen, David Bo

    2011-01-01

    Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability for...

  13. Modeling and simulation of cascading contingencies

    Science.gov (United States)

    Zhang, Jianfeng

    This dissertation proposes a new approach to model and study cascading contingencies in large power systems. The most important contribution of the work involves the development and validation of a heuristic analytic model to assess the likelihood of cascading contingencies, and the development and validation of a uniform search strategy. We model the probability of cascading contingencies as a function of power flow and power flow changes. Utilizing logistic regression, the proposed model is calibrated using real industry data. This dissertation analyzes random search strategies for Monte Carlo simulations and proposes a new uniform search strategy based on the Metropolis-Hastings Algorithm. The proposed search strategy is capable of selecting the most significant cascading contingencies, and it is capable of constructing an unbiased estimator to provide a measure of system security. This dissertation makes it possible to reasonably quantify system security and justify security operations when economic concerns conflict with reliability concerns in the new competitive power market environment. It can also provide guidance to system operators about actions that may be taken to reduce the risk of major system blackouts. Various applications can be developed to take advantage of the quantitative security measures provided in this dissertation.
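
    An illustrative evaluation of the kind of model described, with made-up coefficients (the dissertation calibrates a logistic regression on real industry data): the probability that a dependent outage cascades, as a logistic function of line loading and the flow change just absorbed.

      import math

      def cascade_probability(loading, d_loading, b0=-6.0, b1=4.5, b2=2.0):
          """P(cascade) for a line at `loading` (fraction of rating) that
          absorbed a flow change `d_loading` after an initiating outage."""
          z = b0 + b1 * loading + b2 * d_loading
          return 1.0 / (1.0 + math.exp(-z))

      print(cascade_probability(0.6, 0.05))  # lightly loaded: small risk
      print(cascade_probability(1.1, 0.30))  # overloaded after shift: high risk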

  14. Contingencies of Value

    DEFF Research Database (Denmark)

    Strandvad, Sara Malou

    2014-01-01

    Based on a study of the admission test at a design school, this paper investigates the contingencies of aesthetic values as these become visible in assessment practices. Theoretically, the paper takes its starting point in Herrnstein Smith’s notion of ‘contingencies of values’ and outlines a pragmatist ground where cultural sociology and economic sociology meet. Informed by the literature on cultural intermediaries, the paper discusses the role of evaluators and the devices which accompany them. Whereas studies of cultural intermediaries traditionally apply a Bourdieusian perspective, recent developments within this field of literature draw inspiration from the so-called ‘new new economic sociology’, which this paper adds to. While the admission test is easily described as a matter of overcoming “subjective” aesthetic evaluations by means of “objective” and standardized assessment criteria...

  15. Applied Problems and Use of Technology in an Aligned Way in Basic Courses in Probability and Statistics for Engineering Students--A Way to Enhance Understanding and Increase Motivation

    Science.gov (United States)

    Zetterqvist, Lena

    2017-01-01

    Researchers and teachers often recommend motivating exercises and use of mathematics or statistics software for the teaching of basic courses in probability and statistics. Our courses are given to large groups of engineering students at Lund Institute of Technology. We found that the mere existence of real-life data and technology in a course…

  16. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  17. Fuel rod pressure in nuclear power reactors: Statistical evaluation of the fuel rod internal pressure in LWRs with application to lift-off probability

    Energy Technology Data Exchange (ETDEWEB)

    Jelinek, Tomas

    2001-02-01

    In this thesis, a methodology for quantifying the risk of exceeding the Lift-off limit in nuclear light water power reactors is outlined. Due to fission gas release, the pressure in the gap between the fuel pellets and the cladding increases with burnup of the fuel. An increase in the fuel-clad gap due to clad creep would be expected to result in positive feedback, in the form of higher fuel temperatures leading to more fission gas release, higher rod pressure, etc., until the cladding breaks. An increase in the fuel-clad gap that leads to this positive feedback is a phenomenon called Lift-off and is a limitation that must be considered in fuel core management. Lift-off is a consequence of very high internal fuel rod pressure. The internal fuel rod pressure is therefore used as a Lift-off indicator. The internal fuel rod pressure is closely connected to the fission gas release into the fuel rod plenum and is thus used to increase the database. It is concluded that the dominating error source in the prediction of the pressure in Boiling Water Reactors (BWRs) is the power history. There is a bias in the fuel pressure prediction that is dependent on the fuel rod position in the fuel assembly for BWRs. A methodology to quantify the risk of the fuel rod internal pressure exceeding a certain limit is developed; the risk is dependent on the pressure prediction and the fuel rod position. The methodology is based on statistical treatment of the discrepancies between predicted and measured fuel rod internal pressures. Finally, a methodology to estimate the Lift-off probability of the whole core is outlined.
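
    A sketch of the kind of statistical treatment described, assuming the discrepancies between measured and predicted pressures behave normally; all numbers are illustrative, not thesis data.

      import numpy as np
      from scipy.stats import norm

      # Measured minus predicted rod pressures (MPa), hypothetical sample.
      discrepancies = np.array([-0.3, 0.1, 0.4, -0.2, 0.0, 0.5, -0.1, 0.2])
      bias, spread = discrepancies.mean(), discrepancies.std(ddof=1)

      predicted, limit = 7.2, 8.0   # MPa, hypothetical prediction and limit
      p_exceed = norm.sf(limit, loc=predicted + bias, scale=spread)
      print(f"P(pressure > limit) = {p_exceed:.4f}")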

  18. Using GAISE and NCTM Standards as Frameworks for Teaching Probability and Statistics to Pre-Service Elementary and Middle School Mathematics Teachers

    Science.gov (United States)

    Metz, Mary Louise

    2010-01-01

    Statistics education has become an increasingly important component of the mathematics education of today's citizens. In part to address the call for a more statistically literate citizenship, The "Guidelines for Assessment and Instruction in Statistics Education (GAISE)" were developed in 2005 by the American Statistical Association. These…

  19. Contingent kernel density estimation.

    Directory of Open Access Journals (Sweden)

    Scott Fortmann-Roe

    Full Text Available Kernel density estimation is a widely used method for estimating a distribution based on a sample of points drawn from that distribution. Generally, in practice some form of error contaminates the sample of observed points. Such error can be the result of imprecise measurements or observation bias. Often this error is negligible and may be disregarded in analysis. In cases where the error is non-negligible, estimation methods should be adjusted to reduce resulting bias. Several modifications of kernel density estimation have been developed to address specific forms of errors. One form of error that has not yet been addressed is the case where observations are nominally placed at the centers of areas from which the points are assumed to have been drawn, where these areas are of varying sizes. In this scenario, the bias arises because the size of the error can vary among points and some subset of points can be known to have smaller error than another subset or the form of the error may change among points. This paper proposes a "contingent kernel density estimation" technique to address this form of error. This new technique adjusts the standard kernel on a point-by-point basis in an adaptive response to changing structure and magnitude of error. In this paper, equations for our contingent kernel technique are derived, the technique is validated using numerical simulations, and an example using the geographic locations of social networking users is worked to demonstrate the utility of the method.
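
    A minimal sketch of the idea under simplifying assumptions: a Gaussian kernel whose bandwidth is widened point-by-point for observations representing larger areas (the paper derives the adjustment formally; the square-root scaling below is only a placeholder).

      import numpy as np

      def contingent_kde(x, centers, areas, base_h=0.5):
          """Density at x from kernels widened for points with larger areas."""
          h = base_h + np.sqrt(areas)   # per-point bandwidth (assumption)
          k = np.exp(-0.5 * ((x[:, None] - centers[None, :]) / h) ** 2)
          k /= h * np.sqrt(2 * np.pi)   # normalize each Gaussian kernel
          return k.mean(axis=1)

      centers = np.array([0.0, 1.0, 5.0])  # nominal point locations
      areas = np.array([0.1, 0.1, 4.0])    # third point drawn from a large area
      x = np.linspace(-5, 15, 400)
      density = contingent_kde(x, centers, areas)
      print(density.sum() * (x[1] - x[0]))  # integrates to roughly 1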

  20. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS/E planning tool (PSS/E). It has the following features: It uses a hybrid dynamic and steady-state approach to simulating the cascading outage sequences that includes fast dynamic and slower steady-state events. It integrates dynamic models with protection scheme models for generation, transmission, and load. It models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.

  1. Random walks on three-strand braids and on related hyperbolic groups [PACS: 05.40.-a Fluctuation phenomena, random processes, noise, and Brownian motion; 02.50.-r Probability theory, stochastic processes, and statistics; 02.40.Ky Riemannian geometries]

    CERN Document Server

    Nechaev, S

    2003-01-01

    We investigate the statistical properties of random walks on the simplest nontrivial braid group B₃, and on related hyperbolic groups. We provide a method using Cayley graphs of groups allowing us to compute explicitly the probability distribution of the basic statistical characteristics of random trajectories - the drift and the return probability. The action of the groups under consideration in the hyperbolic plane is investigated, and the distribution of a geometric invariant - the hyperbolic distance - is analysed. It is shown that a random walk on B₃ can be viewed as a 'magnetic random walk' on the group PSL(2, Z).

  2. Random walks on three-strand braids and on related hyperbolic groups [PACS: 05.40.-a Fluctuation phenomena, random processes, noise, and Brownian motion; 02.50.-r Probability theory, stochastic processes, and statistics; 02.40.Ky Riemannian geometries]

    Energy Technology Data Exchange (ETDEWEB)

    Nechaev, Sergei [Laboratoire de Physique Theorique et Modeles Statistiques, Universite Paris Sud, 91405 Orsay Cedex (France); Voituriez, Raphael [Laboratoire de Physique Theorique et Modeles Statistiques, Universite Paris Sud, 91405 Orsay Cedex (France)

    2003-01-10

    We investigate the statistical properties of random walks on the simplest nontrivial braid group B₃, and on related hyperbolic groups. We provide a method using Cayley graphs of groups allowing us to compute explicitly the probability distribution of the basic statistical characteristics of random trajectories - the drift and the return probability. The action of the groups under consideration in the hyperbolic plane is investigated, and the distribution of a geometric invariant - the hyperbolic distance - is analysed. It is shown that a random walk on B₃ can be viewed as a 'magnetic random walk' on the group PSL(2, Z).

  3. MARKOV CHAIN MODEL FOR PROBABILITY OF DRY, WET DAYS AND STATISTICAL ANALISIS OF DAILY RAINFALL IN SOME CLIMATIC ZONE OF IRAN

    Directory of Open Access Journals (Sweden)

    N. SHAHRAKI

    2013-03-01

    Full Text Available Water scarcity is a major problem in arid and semi-arid areas. The scarcity of water is further stressed by growing demand due to population growth in developing countries. Climate change and its effects on precipitation and water resources are another problem in these areas. Several models are widely used for modeling daily precipitation occurrence. In this study, the Markov chain model was used to study spell distribution; for this purpose, a one-day period was considered as the optimum length of time. Given the assumption that the Markov chain model is the right model for daily precipitation occurrence, the choice of Markov model order was examined on a daily basis for 4 synoptic weather stations with different climates in Iran (Gorgan, Khorram Abad, Zahedan, Tabriz) during 1978-2009. Based on probability rules for the occurrence of sequences of dry and wet days, these data were analyzed as a stochastic process using the Markov chain method, and the transition probability matrix was calculated by the maximum likelihood method. The probabilities of 2-5 consecutive dry and wet days were calculated. The results showed that the maximum probability of consecutive dry periods and the highest climatic probability of dry days occurred in Zahedan. The probability of consecutive dry periods fluctuated from 73.3 to 100 percent, and the climatic probability of occurrence of dry days ranged from 70.96 to 100 percent, with an average probability of about 90.45 percent.
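
    A minimal sketch of the first-order two-state chain underlying such analyses: estimate the dry-to-dry transition probability from a binary wet/dry record and compute dry-spell persistence (the daily sequence below is made up, not station data).

      import numpy as np

      days = np.array([0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 1, 0])  # 1 = wet

      pairs = np.stack([days[:-1], days[1:]], axis=1)
      # P(dry tomorrow | dry today), estimated by maximum likelihood
      p_dd = (pairs == [0, 0]).all(axis=1).sum() / (days[:-1] == 0).sum()
      print(f"P(dry->dry) = {p_dd:.2f}")
      for k in range(2, 6):   # chance a dry spell persists 2-5 days
          print(f"P({k} consecutive dry days | dry) = {p_dd ** (k - 1):.3f}")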

  4. The Ethics of Group Contingencies.

    Science.gov (United States)

    Sapon-Shevin, Mara

    Group contingencies structure situations which link individual behavior with group outcomes, attempting to change behavior through peer pressure. As such, group contingencies raise numerous methodological and ethical problems, and illuminate the relationship between what data is collected and what subsequent decisions can be made. Over 100…

  5. Waste Management Project Contingency Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Edward L. Parsons, Jr.

    1999-08-31

    The purpose of this report is to provide the office of Waste Management (WM) with recommended contingency calculation procedures for typical WM projects. Typical projects were defined as conventional construction-type activities that use innovative elements when necessary to meet the project objectives. Projects involve treatment, storage, and disposal of low level, mixed low level, hazardous, transuranic, and high level waste. Cost contingencies are an essential part of Total Cost Management. A contingency is an amount added to a cost estimate to compensate for unexpected expenses resulting from incomplete design, unforeseen and unpredictable conditions, or uncertainties in the project scope (DOE 1994, AACE 1998). Contingency allowances are expressed as percentages of estimated cost and improve cost estimates by accounting for uncertainties. The contingency allowance is large at the beginning of a project because there are more uncertainties, but as a project develops, the allowance shrinks to adjust for costs already incurred. Ideally, the total estimated cost remains the same throughout a project. Project contingency reflects the degree of uncertainty caused by lack of project definition, and process contingency reflects the degree of uncertainty caused by use of new technology. Different cost estimation methods were reviewed and compared with respect to terminology, accuracy, and Cost Guide standards. The Association for the Advancement of Cost Engineering (AACE) methods for cost estimation were selected to represent best industry practice. AACE methodology for contingency analysis can be readily applied to WM Projects, accounts for uncertainties associated with different stages of a project, and considers both project and process contingencies and the stage of technical readiness. As recommended, AACE contingency allowances taper off linearly as a project nears completion.

  6. Contingencies and onus in the delictual law of damages | Steynberg ...

    African Journals Online (AJOL)

    law the burden of proof rests upon the plaintiff and the expected measure of proof is a preponderance of probability. It has become clear that the terms 'burden of proof' and 'measure of proof' according to their strict evidentiary meaning, do not fit naturally into the theory of proof in the case of contingencies. If the amount of ...

  7. Judgment of contingency in depressed and nondepressed students: sadder but wiser?

    Science.gov (United States)

    Alloy, L B; Abramson, L Y

    1979-12-01

    How are humans' subjective judgments of contingencies related to objective contingencies? Work in social psychology and human contingency learning predicts that the greater the frequency of desired outcomes, the greater people's judgments of contingency will be. Second, the learned helplessness theory of depression provides both a strong and a weak prediction concerning the linkage between subjective and objective contingencies. According to the strong prediction, depressed individuals should underestimate the degree of contingency between their responses and outcomes relative to the objective degree of contingency. According to the weak prediction, depressed individuals merely should judge that there is a smaller degree of contingency between their responses and outcomes than nondepressed individuals should. In addition, the present investigation deduced a new strong prediction from the helplessness theory: Nondepressed individuals should overestimate the degree of contingency between their responses and outcomes relative to the objective degree of contingency. In the experiments, depressed and nondepressed students were presented with one of a series of problems varying in the actual degree of contingency. In each problem, subjects estimated the degree of contingency between their responses (pressing or not pressing a button) and an environmental outcome (onset of a green light). Performance on a behavioral task and estimates of the conditional probability of green light onset associated with the two response alternatives provided additional measures for assessing beliefs about contingencies. Depressed students' judgments of contingency were surprisingly accurate in all four experiments. Nondepressed students, on the other hand, overestimated the degree of contingency between their responses and outcomes when noncontingent outcomes were frequent and/or desired and underestimated the degree of contingency when contingent outcomes were undesired. Thus, predictions
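
    The objective contingency manipulated in such experiments is commonly summarized as ΔP = P(outcome | response) − P(outcome | no response); a tiny sketch with hypothetical trial counts shows a zero-contingency problem with frequent outcomes:

      def delta_p(onsets_press, n_press, onsets_no_press, n_no_press):
          """Objective response-outcome contingency from trial counts."""
          return onsets_press / n_press - onsets_no_press / n_no_press

      # The light comes on 75% of the time whether or not the button is
      # pressed: zero contingency, yet frequent desired outcomes tend to
      # inflate nondepressed subjects' judged control.
      print(delta_p(30, 40, 30, 40))   # 0.0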

  8. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  9. Heat capacity and sticking probability measurements of ⁴He submonolayers adsorbed on evaporated Ag films: Bose statistics in two dimensions

    Energy Technology Data Exchange (ETDEWEB)

    Kenny, T.W.; Richards, P.L. (Department of Physics, University of California, Berkeley, Berkeley, CA (USA) Materials and Chemical Sciences Division, Lawrence Berkeley Laboratories, Berkeley, CA (USA))

    1990-05-14

    We have measured the heat capacity of submonolayers of ⁴He adsorbed on Ag films between 1.7 and 3.3 K. Good fits to the results are obtained with a model of a noninteracting two-dimensional Bose gas. The sticking probability for room-temperature ⁴He atoms on cold Ag has been measured as a function of substrate temperature and ⁴He coverage. The sticking probability is 4% at low coverage, and abruptly drops to 1% for coverages above 0.5 monolayer.

  10. Transition probabilities in a problem of stochastic process switching

    NARCIS (Netherlands)

    Veestraeten, D.

    2012-01-01

    Extant solutions for state-contingent process switching use first-passage time densities or differential equations. We alternatively employ transition probabilities. These conditional likelihood functions also have obvious appeal for econometric analyses as well as derivative pricing and decision

  11. National Contingency Plan Subpart J

    Science.gov (United States)

    Subpart J of the National Oil and Hazardous Substances Pollution Contingency Plan (NCP) directs EPA to prepare a schedule of dispersants, other chemicals, and oil spill mitigating devices and substances that may be used to remove or control oil discharges.

  12. Energy Emergency and Contingency Planning

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — This Region 3 document outlines the purpose of Energy Emergency and Contingency Plans. These plans are intended to help refuges continue to function during energy...

  13. RMA Emergency Management / Contingency Plan

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The RMA Emergency Management/Contingency Plan (EM/CP) provides guidance to the Rocky Mountain Arsenal (RMA) and the RMA National Wildlife Refuge (RMANWR) managers...

  14. Mobile contingency unit

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Sergio O. da; Magalhaes, Milton P. de [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil); Junqueira, Rodrigo A.; Torres, Carlos A.R. [PETROBRAS Transporte S/A (TRANSPETRO), Rio de Janeiro, RJ (Brazil)

    2009-07-01

    This paper is aimed at presenting what is already a reality in TRANSPETRO in the area covered by OSBRA, a pipeline that carries by-products to the Mid-West region of Brazil. In order to meet the need of covering occasional accidents, TRANSPETRO counts on a standardized system of emergency management. It is a great challenge to secure efficient communication along the pipeline's 964 km of extension, considering that there are shadow zones where it is not possible to use conventional means of communication such as mobile telephony and the internet. It was in this context that the Mobile Contingency Unit Via Satellite (MCU) was developed, to extend the communication facilities existing in fixed installations to remote places, mainly the pipeline rights of way. In case of emergencies, simulations and work on the pipeline right of way, the MCU is fully able to provide the same data, voice, closed-circuit TV and satellite video conference facilities that are available in any internal area of the PETROBRAS system. (author)

  15. Contingent Commitments: Bringing Part-Time Faculty into Focus. Methodology Supplement

    Science.gov (United States)

    Center for Community College Student Engagement, 2014

    2014-01-01

    Center reporting prior to 2013 focused primarily on descriptive statistics (frequencies and means) of student and faculty behaviors. The goal of the analyses reported here and in "Contingent Commitments: Bringing Part-Time Faculty into Focus" is to understand the engagement of part-time or contingent faculty in various activities that…

  16. In All Probability, Probability is not All

    Science.gov (United States)

    Helman, Danny

    2004-01-01

    The national lottery is often portrayed as a game of pure chance with no room for strategy. This misperception seems to stem from the application of probability instead of expectancy considerations, and can be utilized to introduce the statistical concept of expectation.

  17. Spectral clustering and biclustering learning large graphs and contingency tables

    CERN Document Server

    Bolla, Marianna

    2013-01-01

    Explores regular structures in graphs and contingency tables by spectral theory and statistical methods This book bridges the gap between graph theory and statistics by giving answers to the demanding questions which arise when statisticians are confronted with large weighted graphs or rectangular arrays. Classical and modern statistical methods applicable to biological, social, communication networks, or microarrays are presented together with the theoretical background and proofs. This book is suitable for a one-semester course for graduate students in data mining, mult

  18. On Randomness and Probability

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 1, Issue 2. On Randomness and Probability: How to Mathematically Model Uncertain Events. Author: Rajeeva L Karandikar, Statistics and Mathematics Unit, Indian Statistical Institute, 7 S J S Sansanwal Marg, New Delhi 110 016, India.

  19. On Probability Domains IV

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2017-12-01

    Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one —a quantum phenomenon and, dually, an observable can map a crisp random event to a genuine fuzzy random event —a fuzzy phenomenon. The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.

  20. On Probability Domains IV

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2017-06-01

    Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one —a quantum phenomenon and, dually, an observable can map a crisp random event to a genuine fuzzy random event —a fuzzy phenomenon. The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.

  1. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.

  2. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.
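
    As an illustration of the classical setting the book treats, a quick Monte Carlo estimate of a finite-horizon ruin probability in the compound Poisson model (the book develops exact and asymptotic answers instead; all parameters below are made up):

      import numpy as np

      rng = np.random.default_rng(2)

      def ruin_probability(u=10.0, c=1.2, lam=1.0, horizon=100.0, runs=2000):
          """Estimate P(reserve drops below 0 before `horizon`) for initial
          capital u, premium rate c, Poisson claim rate lam, exp(1) claims."""
          ruined = 0
          for _ in range(runs):
              t, claims = 0.0, 0.0
              while True:
                  t += rng.exponential(1.0 / lam)  # next claim arrival
                  if t > horizon:
                      break
                  claims += rng.exponential(1.0)   # claim size
                  if u + c * t - claims < 0:       # ruin only occurs at claims
                      ruined += 1
                      break
          return ruined / runs

      print(ruin_probability())   # roughly 0.15 with these toy parameters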

  3. The Effect of the Probability of Correct Response on the Variability of Measures of Differential Item Functioning. Program Statistics Research Technical Report No. 94-4.

    Science.gov (United States)

    Zwick, Rebecca

    The Mantel-Haenszel (MH; 1959) approach of Holland and Thayer (1988) is a well-established method for assessing differential item functioning (DIF). The formula for the variance of the MH DIF statistic is based on work by Phillips and Holland (1987) and Robins, Breslow, and Greenland (1986). Recent simulation studies showed that the MH variances…
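
    For readers unfamiliar with the MH DIF statistic itself, the following minimal Python sketch (counts are hypothetical; the variance formulas cited above are not reproduced here) computes the MH common odds ratio across score-matched strata and its ETS delta-scale transform:

      import math

      def mh_odds_ratio(strata):
          """Mantel-Haenszel common odds ratio across 2x2 tables.

          Each stratum is (a, b, c, d): reference-group right/wrong,
          focal-group right/wrong, matched on total test score."""
          num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
          den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
          return num / den

      # One stratum per matched score level (hypothetical counts).
      strata = [(40, 10, 35, 15), (25, 20, 20, 25), (10, 30, 8, 32)]
      alpha = mh_odds_ratio(strata)
      print("alpha_MH =", round(alpha, 3))
      print("MH D-DIF =", round(-2.35 * math.log(alpha), 3))  # ETS delta scale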

  4. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  5. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think … By doing so, we will obtain a deeper insight into how events involving large values of sums of heavy-tailed random variables are likely to occur.
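
    A standard way to build such insight is the single-big-jump heuristic: for subexponential claims, P(S_n > x) is close to n·P(X > x) for large x. A small Monte Carlo sketch in Python (Pareto tail; all parameters illustrative):

      import random

      def pareto(alpha):
          # Inverse-transform sample with tail P(X > x) = x**(-alpha), x >= 1.
          return (1.0 - random.random()) ** (-1.0 / alpha)

      def tail_prob(n, x, alpha=1.5, trials=200_000):
          # Estimate P(S_n > x) for a sum of n i.i.d. Pareto variables.
          hits = sum(sum(pareto(alpha) for _ in range(n)) > x
                     for _ in range(trials))
          return hits / trials

      x, n, alpha = 50.0, 5, 1.5
      print("Monte Carlo P(S_n > x) ~", tail_prob(n, x, alpha))
      print("single-big-jump n*P(X > x) =", n * x ** -alpha)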

  6. Combining contingency tables with missing dimensions.

    Science.gov (United States)

    Dominici, F

    2000-06-01

    We propose a methodology for estimating the cell probabilities in a multiway contingency table by combining partial information from a number of studies when not all of the variables are recorded in all studies. We jointly model the full set of categorical variables recorded in at least one of the studies, and we treat the variables that are not reported as missing dimensions of the study-specific contingency table. For example, we might be interested in combining several cohort studies in which the incidence in the exposed and nonexposed groups is not reported for all risk factors in all studies, while the overall number of cases and the cohort size are always available. To account for study-to-study variability, we adopt a Bayesian hierarchical model. At the first stage of the model, the observation stage, data are modeled by a multinomial distribution with fixed total number of observations. At the second stage, we use the logistic normal (LN) distribution to model variability in the study-specific cells' probabilities. Using this model and data augmentation techniques, we reconstruct the contingency table for each study regardless of which dimensions are missing, and we estimate population parameters of interest. Our hierarchical procedure borrows strength from all the studies and accounts for correlations among the cells' probabilities. The main difficulty in combining studies recording different variables is in maintaining a consistent interpretation of parameters across studies. The approach proposed here overcomes this difficulty and at the same time addresses the uncertainty arising from the missing dimensions. We apply our modeling strategy to analyze data on air pollution and mortality from 1987 to 1994 for six U.S. cities by combining six cross-classifications of low, medium, and high levels of mortality counts, particulate matter, ozone, and carbon monoxide, with the complication that four of the six cities do not report all the air pollution variables. Our …
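
    As a rough illustration of the two modeling stages described above (not the authors' code; dimensions, covariance and sample size are invented), one can simulate a single study's table by drawing cell probabilities from a logistic-normal and counts from a multinomial:

      import numpy as np

      rng = np.random.default_rng(0)

      def sample_study_table(mu, sigma, n_obs):
          """Draw one study's cell probabilities from a logistic-normal
          (softmax of a Gaussian), then counts from a multinomial."""
          z = rng.multivariate_normal(mu, sigma)   # study-level random effect
          p = np.exp(z) / np.exp(z).sum()          # softmax -> cell probabilities
          return rng.multinomial(n_obs, p)         # observation stage

      k = 4                                        # cells of a 2x2 table, flattened
      mu = np.zeros(k)
      sigma = 0.3 * np.eye(k)                      # between-study covariance (assumed)
      print(sample_study_table(mu, sigma, n_obs=500))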

  7. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  8. The foundations of statistics

    CERN Document Server

    Savage, Leonard J

    1972-01-01

    Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.

  9. Probability theory

    CERN Document Server

    S Varadhan, S R

    2001-01-01

    This volume presents topics in probability theory covered during a first-year graduate course given at the Courant Institute of Mathematical Sciences. The necessary background material in measure theory is developed, including the standard topics, such as extension theorem, construction of measures, integration, product spaces, Radon-Nikodym theorem, and conditional expectation. In the first part of the book, characteristic functions are introduced, followed by the study of weak convergence of probability distributions. Then both the weak and strong limit theorems for sums of independent rando

  10. Contingent Conspiracies: Art, Philosophy, Science

    DEFF Research Database (Denmark)

    Wilson, Alexander

    2013-01-01

    … with contingency (Negarestani), they do so today with a new paradigm of scientific knowledge at their disposal. For science too has increasingly been forced to respond to the notion of contingency. Progressively discovering the ubiquity of non-linear dynamics, deterministic chaos and emergent complexity in the natural world, science has been forced to reassess its foundational motifs. Contingency and necessity, or unreason and reason, are philosophically inextricable from science's discovery of systems where randomness emerges from simple deterministic processes or where ordered patterns emerge from random distributions (chaos). Perhaps most intriguingly of all, on the "edge of chaos", between order and randomness, universal computing (Turing) has been shown to emerge in simple rule-based systems such as Conway's "Game of Life" and Wolfram's "rule 110", implying that any computational universe can be emulated …

  11. Surface drift prediction in the Adriatic Sea using hyper-ensemble statistics on atmospheric, ocean and wave models: Uncertainties and probability distribution areas

    Science.gov (United States)

    Rixen, M.; Ferreira-Coelho, E.; Signell, R.

    2008-01-01

    Despite numerous and regular improvements in underlying models, surface drift prediction in the ocean remains a challenging task because of our yet limited understanding of all processes involved. Hence, deterministic approaches to the problem are often limited by empirical assumptions on underlying physics. Multi-model hyper-ensemble forecasts, which exploit the power of an optimal local combination of available information including ocean, atmospheric and wave models, may show superior forecasting skills when compared to individual models because they allow for local correction and/or bias removal. In this work, we explore in greater detail the potential and limitations of the hyper-ensemble method in the Adriatic Sea, using a comprehensive surface drifter database. The performance of the hyper-ensembles and the individual models are discussed by analyzing associated uncertainties and probability distribution maps. Results suggest that the stochastic method may reduce position errors significantly for 12 to 72 h forecasts and hence compete with pure deterministic approaches. © 2007 NATO Undersea Research Centre (NURC).
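
    The core of the hyper-ensemble idea is a local, data-driven weighting of the individual model forecasts. A minimal Python sketch (toy numbers; operational systems refit the weights over a recent training window near each drifter):

      import numpy as np

      # Rows: past times; columns: drift predictions from ocean,
      # atmospheric and wave models (toy values).
      X = np.array([[1.2, 0.9, 1.1],
                    [0.7, 0.8, 0.6],
                    [1.5, 1.1, 1.4],
                    [0.9, 1.0, 0.8]])
      y = np.array([1.0, 0.7, 1.3, 0.9])   # observed drifter displacements

      # Least-squares "optimal local combination" of the member models.
      w, *_ = np.linalg.lstsq(X, y, rcond=None)
      print("weights:", w)

      x_new = np.array([1.1, 1.0, 1.05])   # fresh model outputs for the next step
      print("hyper-ensemble forecast:", x_new @ w)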

  12. Contingency Theories of Leadership: A Study.

    Science.gov (United States)

    Saha, Sunhir K.

    1979-01-01

    Some of the major contingency theories of leadership are reviewed; some results from the author's study of Fiedler's contingency model are reported; and some thoughts for the future of leadership research are provided. (Author/MLF)

  13. The Virtual Quake Earthquake Simulator: Earthquake Probability Statistics for the El Mayor-Cucapah Region and Evidence of Predictability in Simulated Earthquake Sequences

    Science.gov (United States)

    Schultz, K.; Yoder, M. R.; Heien, E. M.; Rundle, J. B.; Turcotte, D. L.; Parker, J. W.; Donnellan, A.

    2015-12-01

    We introduce a framework for developing earthquake forecasts using Virtual Quake (VQ), the generalized successor to the perhaps better known Virtual California (VC) earthquake simulator. We discuss the basic merits and mechanics of the simulator, and we present several statistics of interest for earthquake forecasting. We also show that, though the system as a whole (in aggregate) behaves quite randomly, (simulated) earthquake sequences limited to specific fault sections exhibit measurable predictability in the form of increasing seismicity precursory to large m > 7 earthquakes. In order to quantify this, we develop an alert-based forecasting metric similar to those presented in Keilis-Borok (2002) and Molchan (1997), and show that it exhibits significant information gain compared to random forecasts. We also discuss the long-standing question of activation versus quiescent type earthquake triggering. We show that VQ exhibits both behaviors separately for independent fault sections; some fault sections exhibit activation-type triggering, while others are better characterized by quiescent-type triggering. We discuss these aspects of VQ specifically with respect to faults in the Salton Basin and near the El Mayor-Cucapah region in southern California, USA, and northern Baja California, Mexico.

  14. The Contingency Theory of Education.

    Science.gov (United States)

    Goodnow, Wilma Elizabeth

    1982-01-01

    Develops a conceptual framework for determining the appropriateness of various methodologies. Concludes that educators should stop switching from one to another and recognize that the best methodology is contingent upon the circumstances. (Falmer Press, Falmer House, Barcombe, Nr Lewes, East Sussex, BN8 5DL, UK) (JOW)

  15. The Psychophysics of Contingency Assessment

    Science.gov (United States)

    Allan, Lorraine G.; Hannah, Samuel D.; Crump, Matthew J. C.; Siegel, Shepard

    2008-01-01

    The authors previously described a procedure that permits rapid, multiple within-participant evaluations of contingency assessment (the "streamed-trial" procedure, M. J. C. Crump, S. D. Hannah, L. G. Allan, & L. K. Hord, 2007). In the present experiments, they used the streamed-trial procedure, combined with the method of constant stimuli and a…

  16. How Precarious Is Contingent Work?

    DEFF Research Database (Denmark)

    Scheuer, Steen

    2015-01-01

    rights and opportunities in the job. The analyses (based on logit modelling, multivariate logistic regression) clearly show that contingent employment (e.g. as a temp) is a risk condition, not only because of the stipulated end of the employment period, but also because it implies a clearly lower chance...

  17. Job satisfaction and contingent employment

    NARCIS (Netherlands)

    de Graaf-Zijl, M.

    2012-01-01

    This paper analyses job satisfaction as an aggregate of satisfaction with several job aspects, with special focus on the influence of contingent-employment contracts. Fixed-effect analysis is applied on a longitudinal sample of Dutch employees in four work arrangements: regular, fixed-term, on-call

  18. Univariate Probability Distributions

    Science.gov (United States)

    Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.

    2012-01-01

    We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…

  19. the theory of probability

    Indian Academy of Sciences (India)

    important practical applications in statistical quality control. Of a similar kind are the laws of probability for the scattering of missiles, which are basic in the … deviations for different ranges for each type of gun and of shell are found empirically in firing practice on an artillery range. But the subsequent solution of all possible …

  20. Distinct motivational effects of contingent and non-contingent rewards

    OpenAIRE

    Manohar, S.; Finzi, R; Drew, DS; Husain, M

    2017-01-01

    When rewards are available, people expend more energy, increasing their motivational vigor. In theory, incentives might drive behavior for two distinct reasons: First, they increase expected reward; second, they increase the difference in subjective value between successful and unsuccessful performance, which increases contingency, the degree to which action determines outcome. Previous studies of motivational vigor have never compared these directly. Here, we indexed motivational vigor by mea…

  1. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  2. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  3. Probability workshop to be better in probability topic

    Science.gov (United States)

    Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed

    2015-02-01

    The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students in higher education affect their performance. 62 fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance on the probability topic is not related to anxiety level; that is, higher statistics anxiety does not lead to lower scores on the probability topic. The study also revealed that motivated students benefited from the probability workshop: their performance on the probability topic showed a positive improvement compared with before the workshop. In addition, there was a significant difference in performance between genders, with better achievement among female students than male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.

  4. Statistical eruption forecast for the Chilean Southern Volcanic Zone: typical probabilities of volcanic eruptions as baseline for possibly enhanced activity following the large 2010 Concepción earthquake

    Directory of Open Access Journals (Sweden)

    Y. Dzierma

    2010-10-01

    A probabilistic eruption forecast is provided for ten volcanoes of the Chilean Southern Volcanic Zone (SVZ). Since 70% of the Chilean population lives in this area, the estimation of future eruption likelihood is an important part of hazard assessment. After investigating the completeness and stationarity of the historical eruption time series, the exponential, Weibull, and log-logistic distribution functions are fit to the repose time distributions for the individual volcanoes and the models are evaluated. This procedure has been implemented in two different ways to methodologically compare details in the fitting process. With regard to the probability of at least one VEI ≥ 2 eruption in the next decade, Llaima, Villarrica and Nevados de Chillán are most likely to erupt, while Osorno shows the lowest eruption probability among the volcanoes analysed. In addition to giving a compilation of the statistical eruption forecasts for the historically most active volcanoes of the SVZ, this paper aims to give "typical" eruption probabilities, which may in the future permit us to distinguish possibly enhanced activity in the aftermath of the large 2010 Concepción earthquake.
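
    To make the repose-time approach concrete, here is a hedged Python sketch (hypothetical repose intervals; SciPy assumed available) that fits two of the candidate distributions and evaluates a ten-year eruption probability, treating the next interval as a fresh renewal:

      import numpy as np
      from scipy import stats

      repose = np.array([3.1, 7.4, 2.0, 11.5, 5.2, 4.8, 9.3, 6.1])  # years, made up

      # Exponential model: P(eruption within t) = 1 - exp(-t / mean repose).
      lam = 1.0 / repose.mean()
      p_exp = 1.0 - np.exp(-lam * 10)

      # Weibull model: fit shape and scale (location fixed at zero), use the CDF.
      shape, loc, scale = stats.weibull_min.fit(repose, floc=0)
      p_wbl = stats.weibull_min.cdf(10, shape, loc=loc, scale=scale)

      print(f"P(>=1 eruption in 10 yr): exponential {p_exp:.2f}, Weibull {p_wbl:.2f}")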

  5. A seismic probability map

    Directory of Open Access Journals (Sweden)

    J. M. MUNUERA

    1964-06-01

    The material included in two former papers (SB and EF), which sums 3,307 shocks over 2,360 years, up to 1960, was reduced to a 50-year period by means of the weight obtained for each epoch. The weighting factor is the ratio of 50 to the number of years in each epoch. The frequency has been referred to base VII of the international seismic intensity scale, for all cases in which the earthquakes are equal to or greater than VI and up to IX. The sum of the products of frequency and the parameters described above is the probable frequency expected for the 50-year period. For each active small square we have made the corresponding computation, and so we have drawn Map No. 1, in percentages. The epicentres with intensity from X to XI are plotted in Map No. 2 to provide complementary information. A table shows the return periods obtained for all data (VII to XI); after checking them against periods computed from the first to the last shock, a list gives the probable approximate return periods estimated for the area. The solution we suggest is an appropriate form in which to express the seismic contingent phenomenon, and it improves on the conventional maps showing equal-intensity curves for the maximal observed values.
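
    The 50-year reduction described above is simple arithmetic; a minimal Python sketch (epoch spans and counts are hypothetical) of the weighting:

      # Reduce multi-epoch earthquake counts to a common 50-year basis.
      epochs = [
          # (years spanned, shocks of intensity >= VII observed) -- made-up counts
          (1800, 210),
          (400, 95),
          (160, 60),
      ]

      # Weight for each epoch is 50 / (years in the epoch); sum the weighted counts.
      expected_50yr = sum(count * (50.0 / years) for years, count in epochs)
      print("probable frequency in 50 years:", round(expected_50yr, 1))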

  6. Detecting contingencies: an infomax approach.

    Science.gov (United States)

    Butko, Nicholas J; Movellan, Javier R

    2010-01-01

    The ability to detect social contingencies plays an important role in the social and emotional development of infants. Analyzing this problem from a computational perspective may provide important clues for understanding social development, as well as for the synthesis of social behavior in robots. In this paper, we show that the turn-taking behaviors observed in infants during contingency detection situations are tuned to optimally gather information as to whether a person is responsive to them. We show that simple reinforcement learning mechanisms can explain how infants acquire these efficient contingency detection schemas. The key is to use the reduction of uncertainty (information gain) as a reward signal. The result is an interesting form of learning in which the learner rewards itself for conducting actions that help reduce its own sense of uncertainty. This paper illustrates the possibilities of an emerging area of computer science and engineering that focuses on the computational understanding of human behavior and on its synthesis in robots. We believe that the theory of stochastic optimal control will play a key role in providing a formal mathematical foundation for this newly emerging discipline. Copyright © 2010 Elsevier Ltd. All rights reserved.
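
    The information-gain reward described above can be written down in a few lines. A minimal Python sketch (the response rates are invented for illustration) of the expected reduction in uncertainty about whether a partner is responsive, after one probe action such as a vocalization:

      import math

      def entropy(p):
          if p in (0.0, 1.0):
              return 0.0
          return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

      def expected_info_gain(p_responsive,
                             p_reply_if_responsive=0.8,
                             p_reply_if_not=0.1):
          """Expected drop in uncertainty about 'is the person responsive?'
          from emitting one probe action."""
          p_reply = (p_responsive * p_reply_if_responsive
                     + (1 - p_responsive) * p_reply_if_not)
          # Posterior over responsiveness for each observable outcome (Bayes).
          post_if_reply = p_responsive * p_reply_if_responsive / p_reply
          post_if_silent = (p_responsive * (1 - p_reply_if_responsive)
                            / (1 - p_reply))
          h_after = (p_reply * entropy(post_if_reply)
                     + (1 - p_reply) * entropy(post_if_silent))
          return entropy(p_responsive) - h_after   # info gain = reward signal

      print(round(expected_info_gain(0.5), 3))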

  7. Probability, statistics and modelling in public health

    National Research Council Canada - National Science Library

    Nikulin, Mikhail Stepanovich; Commenges, Daniel; Huber, Catherine

    2006-01-01

    … Several well-known biostatisticians from Europe and America were invited. A special issue of Lifetime Data Analysis was published (Volume 10, No 4), gathering some of the works discussed at this symposium. This volume gathers a larger number of papers, some of them being extended versions of papers published in the Lifetime Data Analysis issue…

  8. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  9. Statistical inference

    CERN Document Server

    Rohatgi, Vijay K

    2003-01-01

    Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth

  10. Scan Statistics

    CERN Document Server

    Glaz, Joseph

    2009-01-01

    Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.

  11. Evolutionary contingency and SETI revisited

    Science.gov (United States)

    Cirkovic, Milan M.

    2014-07-01

    The well-known argument against the Search for ExtraTerrestrial Intelligence (SETI) due to George Gaylord Simpson is re-analyzed almost half a century later, in the light of our improved understanding of preconditions for the emergence of life and intelligence brought about by the ongoing "astrobiological revolution". Simpson's argument has been enormously influential, in particular in biological circles, and it arguably fueled the most serious opposition to SETI programmes and their funding. I argue that both proponents and opponents of Simpson's argument have occasionally misrepresented its core content. Proponents often oversimplify it as just another consequence of biological contingency, thus leaving their position open to general arguments limiting the scope of contingency in evolution (such as the recent argument of Geerat Vermeij based on selection effects in the fossil record). They also tend to neglect that the argument has been presented as essentially atemporal, while referring to entities and processes that are likely to change over time; this has become even less justifiable as our astrobiological knowledge increased in recent years. Opponents have failed to see that the weaknesses in Simpson's position could be removed by restructuring of the argument; I suggest one way of such restructuring, envisioned long ago in the fictional context by Stanislaw Lem. While no firm consensus has emerged on the validity of Simpson's argument so far, I suggest that, contrary to the original motivation, today it is less an anti-SETI argument, and more an astrobiological research programme. In this research programme, SETI could be generalized into a platform for testing some of the deepest assumptions about evolutionary continuity and the relative role of contingency versus convergence on unprecedented spatial and temporal scales.

  12. Automated Contingency Management for Propulsion Systems

    Data.gov (United States)

    National Aeronautics and Space Administration — Increasing demand for improved reliability and survivability of mission-critical systems is driving the development of health monitoring and Automated Contingency...

  13. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl…

  14. Considerations on a posteriori probability

    Directory of Open Access Journals (Sweden)

    Corrado Gini

    2015-06-01

    In this first paper of 1911, relating to the sex ratio at birth, Gini recast Laplace's rule of succession in a Bayesian form. Gini's intuition consisted in assuming a Beta-type distribution for the prior probability and introducing the "method of results (direct and indirect)" for determining prior probabilities from the statistical frequencies obtained from the data.
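
    For reference, the Beta-prior form of the rule of succession on which this line of work builds can be stated in one line (a standard result, given here in modern notation rather than Gini's): with a Beta(α, β) prior on the unknown probability and s successes observed in n trials,

      \Pr(\text{success on trial } n+1 \mid s, n) \;=\; \frac{s + \alpha}{n + \alpha + \beta},

    which reduces to Laplace's (s + 1)/(n + 2) under the uniform prior α = β = 1.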

  15. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    A statistical procedure for estimating Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test-case models. The influence of the safety factor is also investigated. The DECOFF statistical method is benchmarked against the standard Alpha-factor method defined by DNV (2011) and model performance is evaluated. In addition, the effects of weather forecast uncertainty on the output Probabilities of Failure are analysed and reported.
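
    A toy version of the underlying calculation, with a Gaussian forecast-error model and a single operational sea-state limit (both assumptions for illustration, not the DECOFF model itself), can be sketched in Python:

      import random

      def p_failed_operation(limit=2.0, bias=0.0, sigma=0.5, trials=100_000):
          """P(true sea state exceeds the operational limit given that the
          forecast said 'go'), under an assumed Gaussian forecast error."""
          failures = go = 0
          for _ in range(trials):
              actual = random.uniform(0.5, 3.5)              # true sea state
              forecast = actual + random.gauss(bias, sigma)  # imperfect forecast
              if forecast <= limit:                          # operation is started
                  go += 1
                  failures += actual > limit                 # true state too severe
          return failures / go

      print("P(failure | go) ~", round(p_failed_operation(), 3))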

  16. Contingent Diversity on Anthropic Landscapes

    Directory of Open Access Journals (Sweden)

    William Balée

    2010-02-01

    Behaviorally modern human beings have lived in Amazonia for thousands of years. Significant dynamics in species turnovers due to human-mediated disturbance were associated with the ultimate emergence and expansion of agrarian technologies in prehistory. Such disturbances initiated primary and secondary landscape transformations in various locales of the Amazon region. Diversity in these locales can be understood by accepting the initial premise of contingency, expressed as unprecedented human agency and human history. These effects can be accessed through the archaeological record and in the study of living languages. In addition, landscape transformation can be demonstrated in the study of traditional knowledge (TK). One way of elucidating TK distinctions between anthropic and nonanthropic landscapes concerns elicitation of differential labeling of these landscapes and, more significantly, elicitation of their specific contents, such as trees. Freelisting is a method which can be used to distinguish the differential species compositions of landscapes resulting from human-mediated disturbance versus those which do not evince records of human agency and history. The TK of the Ka’apor Indians of Amazonian Brazil, as revealed in freelisting exercises, shows differentiation of anthropogenic from high forests as well as a recognition of diversity in the anthropogenic forests. This suggests that the agents of human-mediated disturbance and landscape transformation in traditional Amazonia encode diversity and contingency into their TK, an encoding that reflects past cultural influence on landscape and society over time.

  17. The rejection-rage contingency in borderline personality disorder.

    Science.gov (United States)

    Berenson, Kathy R; Downey, Geraldine; Rafaeli, Eshkol; Coifman, Karin G; Paquin, Nina Leventhal

    2011-08-01

    Though long-standing clinical observation reflected in the Diagnostic and Statistical Manual of Mental Disorders (4th ed., text rev.) suggests that the rage characteristic of borderline personality disorder (BPD) often appears in response to perceived rejection, the role of perceived rejection in triggering rage in BPD has never been empirically tested. Extending basic personality research on rejection sensitivity to a clinical sample, a priming-pronunciation experiment and a 21-day experience-sampling diary examined the contingent relationship between perceived rejection and rage in participants diagnosed with BPD compared with healthy controls. Despite the differences in these 2 assessment methods, the indices of rejection-contingent rage that they both produced were elevated in the BPD group and were strongly interrelated. They provide corroborating evidence that reactions to perceived rejection significantly explain the rage seen in BPD. © 2011 American Psychological Association

  18. 40 CFR 264.53 - Copies of contingency plan.

    Science.gov (United States)

    2010-07-01

    Contingency Plan and Emergency Procedures, § 264.53 Copies of contingency plan. A copy of the contingency plan … must be submitted to all local police departments, fire departments, hospitals, and State and local emergency response teams that may be called upon to provide emergency services.

  19. The Role of Contingency in Classical Conditioning.

    Science.gov (United States)

    Papini, Mauricio R.; Bitterman, M. E.

    1990-01-01

    Early experiments suggesting that classical conditioning depends on the contingency between conditioned stimulus (CS) and the unconditioned stimulus (US) are reconsidered along with later evidence that shows conditioning of the CS and its context in random training. CS-US contingency is neither necessary nor sufficient for conditioning. (SLD)

  20. Contingency management: perspectives of Australian service providers.

    Science.gov (United States)

    Cameron, Jacqui; Ritter, Alison

    2007-03-01

    Given the very positive and extensive research evidence demonstrating the efficacy and effectiveness of contingency management, it is important that Australia explore whether contingency management has a role to play in our own treatment context. Qualitative interviews were conducted with 30 experienced alcohol and drug practitioners, service managers and policy-makers in Victoria. Interviewees were selected to represent the range of drug treatment service types, and included rural representation. A semi-structured interview schedule, covering their perceptions and practices of contingency management, was used. All interviews were transcribed verbatim and analysed using the N2 qualitative data analysis program. The majority of key informants were positively inclined toward contingency management, notwithstanding some concerns about its philosophical underpinnings. Concerns were raised in relation to the use of monetary rewards. Examples of the use of contingency management provided by key informants revealed an over-inclusive definition: not all of the examples adhered to the key principles of contingency management. This may create problems if structured contingency management were to be introduced in Australia. Contingency management is an important adjunctive treatment intervention and its use in Australia has the potential to enhance treatment outcomes. No unmanageable barriers were identified in this study.

  1. 48 CFR 218.201 - Contingency operation.

    Science.gov (United States)

    2010-10-01

    … preparation for overseas contingency, humanitarian, or peacekeeping operations (see 213.270(c)(3) and (5)) … a humanitarian or peacekeeping operation may use the Governmentwide commercial purchase card to make a purchase … threshold in support of a contingency operation or a humanitarian or peacekeeping operation (see 213.305-3(d) …).

  2. Thevenin Equivalent Method for Dynamic Contingency Assessment

    DEFF Research Database (Denmark)

    Møller, Jakob Glarbo; Jóhannsson, Hjörtur; Østergaard, Jacob

    2015-01-01

    A method that exploits the Thevenin equivalent representation for obtaining post-contingency steady-state nodal voltages is integrated with a method for detecting post-contingency aperiodic small-signal instability. The task of integrating stability assessment with contingency assessment is challenged by cases of unstable post-contingency conditions, for which there exists no credible steady state that can serve as the basis for a stability assessment. This paper demonstrates how Thevenin equivalent methods can be applied in an algebraic representation of the bifurcation points that may be used to assess post-contingency aperiodic small-signal stability. The assessment method is introduced with a numeric example.
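
    The basic Thevenin calculation behind such methods is compact. A hedged Python sketch (per-unit values are invented; real assessments derive the Thevenin EMF and impedance from system-wide models or measurements) of the post-contingency voltage at a load bus:

      # Post-contingency voltage at a load bus from its Thevenin equivalent.
      E_th = 1.02 + 0.00j     # Thevenin EMF (p.u.), assumed
      Z_th = 0.02 + 0.15j     # Thevenin impedance after the outage (p.u.), assumed
      Z_load = 0.80 + 0.35j   # equivalent load impedance (p.u.), assumed

      V = E_th * Z_load / (Z_th + Z_load)   # complex voltage divider
      print(f"|V| = {abs(V):.3f} p.u.")     # a very low |V| flags a stressed case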

  3. The Contingent Character of Necessity

    Directory of Open Access Journals (Sweden)

    Luis Guzmán

    2006-12-01

    One of the most frequent criticisms raised against Hegel has to do with the totalizing aspect of his system, which determines what is as absolutely necessary. The Science of Logic, being the conceptual edifice upon which his whole system is built, is the appropriate place to determine the specific meaning of the Hegelian concepts. The following paper offers a detailed analysis of the chapter on Actuality (Wirklichkeit) in the Science of Logic, in order to show how the concept of absolute necessity not only includes the concept of contingency but contains it as a structural element. In this manner a deflationary interpretation is generated, in which the absolutely necessary character of actuality is understood not as grounded in a pre-established end that inexorably determines actuality, but rather as an interpretive movement, in recollection, of its process.

  4. The historical contingency of bioethics.

    Science.gov (United States)

    Kevles, D J

    2000-01-01

    The principles of bioethics have been historically contingent, a product of social values, circumstances, and experience. During the early twentieth century, they rested on a doctor-knows-best autonomy that permitted physicians to perform research on human subjects with a minimum degree, if any, of informed consent. The eugenics movement of the period embraced an implicit bioethics by presuming to sterilize individuals for the sake of a larger social benefit, a practice and doctrine that helped lead to the Nazi medical experiments and death camps. After World War II, the promulgation of the Nuremberg Code failed to halt eugenic sterilization and risky human experimentation without informed consent either in civilian or military venues. However, beginning in the 1960s, these practices came under mounting critical scrutiny, partly because of the increasing attention given to individual rights. By now, it is widely understood that concern for individual rights rather than an appeal to some national good belongs at the heart of bioethics.

  5. Assessing the benefits of reducing fire risk in the wildland urban interface: A contingent valuation approach

    Science.gov (United States)

    Jeremy Fried; Greg J. Winter; Keith J. Gilless

    1999-01-01

    Wildland-urban interface (WUI) residents in Michigan were interviewed using a contingent valuation protocol to assess their willingness-to-pay (WTP) for incremental reductions in the risk of losing their homes to wildfire. WTP was elicited using a probability model which segments the risk of structure loss into "public" and "private" components.
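
    One way to read such a segmentation (an illustrative form, not necessarily the authors' exact specification) is that a structure is lost if either the public or the private risk component materialises, so a risk-neutral household's WTP for a mitigation scales with the change in total loss probability:

      p_{\text{loss}} = 1 - (1 - p_{\text{public}})(1 - p_{\text{private}}),
      \qquad
      \mathrm{WTP} \approx \left(p_{\text{loss}} - p'_{\text{loss}}\right) V,

    where V is the value the household attaches to the home and p' is the post-mitigation loss probability.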

  6. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  7. Learning predictive statistics from temporal sequences: Dynamics and strategies.

    Science.gov (United States)

    Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E; Kourtzi, Zoe

    2017-10-01

    Human behavior is guided by our expectations about the future. Often, we make predictions by monitoring how event sequences unfold, even though such sequences may appear incomprehensible. Event structures in the natural environment typically vary in complexity, from simple repetition to complex probabilistic combinations. How do we learn these structures? Here we investigate the dynamics of structure learning by tracking human responses to temporal sequences that change in structure unbeknownst to the participants. Participants were asked to predict the upcoming item following a probabilistic sequence of symbols. Using a Markov process, we created a family of sequences, from simple frequency statistics (e.g., some symbols are more probable than others) to context-based statistics (e.g., symbol probability is contingent on preceding symbols). We demonstrate the dynamics with which individuals adapt to changes in the environment's statistics; that is, they extract the behaviorally relevant structures to make predictions about upcoming events. Further, we show that this structure learning relates to individual decision strategy; faster learning of complex structures relates to selection of the most probable outcome in a given context (maximizing) rather than matching of the exact sequence statistics. Our findings provide evidence for alternate routes to learning of behaviorally relevant statistics that facilitate our ability to predict future events in variable environments.
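
    A compact Python sketch of the maximizing-versus-matching distinction on a first-order Markov sequence (the transition table is invented; the study's actual sequences differed):

      import random

      # Context-based statistics: symbol probability depends on the previous symbol.
      transition = {
          "A": {"A": 0.1, "B": 0.7, "C": 0.2},
          "B": {"A": 0.6, "B": 0.1, "C": 0.3},
          "C": {"A": 0.3, "B": 0.3, "C": 0.4},
      }

      def next_symbol(prev):
          r, acc = random.random(), 0.0
          for sym, p in transition[prev].items():
              acc += p
              if r < acc:
                  return sym
          return sym

      def predict(prev, strategy):
          if strategy == "maximize":            # always pick the most probable symbol
              return max(transition[prev], key=transition[prev].get)
          return next_symbol(prev)              # "match": sample in proportion

      state, score = "A", {"maximize": 0, "match": 0}
      for _ in range(10_000):
          upcoming = next_symbol(state)
          for s in score:
              score[s] += predict(state, s) == upcoming
          state = upcoming
      print({s: v / 10_000 for s, v in score.items()})  # maximizing scores higher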

  8. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
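
    The calculation itself is a one-liner: each side's probability is its central angle divided by 360 degrees. A tiny Python sketch with made-up angles:

      # Probability that a biased spinner lands in a sector is proportional
      # to the sector's central angle (angles are hypothetical).
      sectors = {"red": 90, "blue": 150, "green": 120}   # degrees
      probs = {colour: angle / 360 for colour, angle in sectors.items()}
      print(probs)   # {'red': 0.25, 'blue': 0.416..., 'green': 0.333...}
      assert abs(sum(probs.values()) - 1.0) < 1e-9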

  9. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p…

  10. A Study of Organizational Commitment: Contingent and Survivor Workers

    Directory of Open Access Journals (Sweden)

    Fenika Walani

    2015-08-01

    In recent years, contingent and survivor workers have emerged as a common reality in business activities. Unfortunately, contingent workers face high job insecurity owing to their employment status. On the other side, downsizing activities can reduce the job security of survivor workers. As a consequence, both contingent and survivor workers are likely to have low organizational commitment. However, organizations still have an opportunity to treat these workers in ways that build organizational commitment, without ignoring the fact that workers have other commitment foci.

  11. Contingent Faculty Perceptions of Organizational Support, Workplace Attitudes, and Teaching Evaluations at a Public Research University

    Directory of Open Access Journals (Sweden)

    Min Young Cha

    2016-03-01

    This research examines contingent faculty's perceptions of organizational support, workplace attitudes, and Student Ratings of Teaching (SRT) at a large public research university, in order to investigate their employee-organization relationship. According to t-tests and regression analyses for a sample of 2,229 faculty and instructional staff who answered the survey and had SRT data (tenured and tenure-track faculty: 1,708, 76.6% of the total; contingent faculty: 521, 23.4% of the total), the employment relationship of contingent faculty in this institution was closer to a combined economic and social exchange model than to a pure economic exchange model or an underinvestment model. Contingent faculty's satisfaction with work, satisfaction with coworkers, perception of being supported at work, and affective organizational commitment were higher than those of tenured and tenure-track faculty at a statistically significant level. In addition, contingent faculty had higher SRT means on all SRT items in medium-size (10-30) classes, and on 'class presentation', 'feedback', 'deeper understanding', and 'interest stimulated' in large-size (30-50) classes, than tenured and tenure-track faculty. These results not only refute the misconception that contingent faculty have too little time to provide students with feedback, but also support the view that they provide students with good teaching, at least in medium-size and large-size classes. Whereas these results might be partially attributable to the relatively stable status of contingent faculty in this study (who work at more than 50 percent FTE), they indicate that, as a collective, contingent faculty also represent a significant contributor to the university, who are satisfied with their work, enjoy the community they are in, and are committed to their institution.

  12. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i…

  13. Thoughts Without Content are Empty, Intuitions Without Concepts are Blind - Determinism and Contingency Revisited

    Science.gov (United States)

    Pohorille, Andrew

    2011-01-01

    … similar reasoning is common in other fields of science, for example in statistical mechanics. Some trajectories lead to life, perhaps in different forms, whereas others do not. Of true interest to us is the ratio of these two outcomes. The issue of determinism does not directly enter the picture. The debate about the likelihood of the emergence of life is quite old. One view holds that the origin of life is an event governed by chance, and the result of so many random events (contingencies) is unpredictable. This view was eloquently expressed by Monod. In his book "Chance and Necessity" he argued that life was a product of "nature's roulette." In an alternative view, expressed in particular by de Duve and Morowitz, the origin of life is considered a highly probable or even inevitable event (although its details need not be determined in every respect). Only in this sense can the origin of life be considered a "deterministic event".

  14. [Ottawa National Wildlife Refuge : Disease Contingency Plan

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The Disease Contingency Plan for Ottawa NWR provides background information on disease surveillance; an inventory of Refuge personnel, equipment, and resources; and...

  15. Strategy as Mutually Contingent Choice

    Directory of Open Access Journals (Sweden)

    Neil Martin

    2016-05-01

    Full Text Available Thomas Schelling’s The Strategy of Conflict carries significant behavioral implications which have been overlooked by economic readers. I argue that these implications are central to Schelling’s vision of game theory, that they fit well with recent advances in experimental psychology and behavioral economics, and provide a comprehensive framework that can inform research on strategy. In my view, Schelling develops a non-mathematical approach to strategy which anticipates on Gigerenzer and Selten’s “ecological rationality” program. This approach maps the processes involved in strategic reasoning and highlights their reliance on the particular information structure of interactive social environments. Building on this approach, I model strategy as a heuristic form of reasoning that governs the way in which individuals search for and provide cues in situations of mutually contingent choice. I conclude by examining how the reference to ecological rationality can help clarify Schelling’s contribution to game theory and outline potential avenues of research into strategic reasoning and interaction.

  16. Contingency management for patients with dual disorders in intensive outpatient treatment for addiction.

    Science.gov (United States)

    Kelly, Thomas M; Daley, Dennis C; Douaihy, Antoine B

    2014-01-01

    This quality improvement program evaluation investigated the effectiveness of contingency management for improving retention in treatment and positive outcomes among patients with dual disorders in intensive outpatient treatment for addiction. The effect of contingency management was explored in a group of 160 patients, either exposed to contingency management (n = 88) or not exposed (no contingency management, n = 72), in a six-week partial hospitalization program. Patients referred to the partial hospitalization program for treatment of substance use and comorbid psychiatric disorders received diagnoses from psychiatrists and specialist clinicians according to the Diagnostic and Statistical Manual of the American Psychiatric Association. A unique application of the contingency management "fishbowl" method was used to improve the consistency of attendance at treatment sessions, which patients attended 5 days a week. Days attending treatment and drug-free days were the main outcome variables. Other outcomes of interest were depression, anxiety and psychological stress, coping ability, and intensity of drug cravings. Patients in the contingency management group attended more treatment days than patients in the no contingency management group, M = 16.2 days (SD = 10.0) versus M = 9.9 days (SD = 8.5), respectively; t = 4.2, df = 158, p < .001. The groups did not differ on number of drug-free days. Psychological stress and drug craving were inversely associated with drug-free days in bivariate testing (r = -.18, p < .05). Treatment days attended and drug craving were associated with drug-free days in multivariate testing (B = .05, SE = .01, β = .39, t = 4.9, p < .001). Days attending treatment partially mediated the relationship between exposure to contingency management and self-reported drug-free days. Contingency management is a valuable adjunct for increasing retention in treatment among patients with dual disorders in partial hospitalization treatment. Exposure to …
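
    For readers unfamiliar with the fishbowl technique, a minimal Python sketch of a prize-draw bowl (the slip mix below is a commonly cited configuration, assumed here rather than taken from this study):

      import random

      # "Fishbowl" draw: most slips carry praise only; a few carry prizes.
      # The mix is hypothetical; programs calibrate it to their budget.
      bowl = (["Good job!"] * 250       # reinforcing message, no prize
              + ["small prize"] * 209   # ~$1 value
              + ["large prize"] * 40    # ~$20 value
              + ["jumbo prize"] * 1)    # ~$100 value

      def draw(n_attended_sessions):
          # One draw per attended session; escalating draws are a common variant.
          return [random.choice(bowl) for _ in range(n_attended_sessions)]

      print(draw(5))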

  17. Learning predictive statistics from temporal sequences: Dynamics and strategies

    Science.gov (United States)

    Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E.; Kourtzi, Zoe

    2017-01-01

    Human behavior is guided by our expectations about the future. Often, we make predictions by monitoring how event sequences unfold, even though such sequences may appear incomprehensible. Event structures in the natural environment typically vary in complexity, from simple repetition to complex probabilistic combinations. How do we learn these structures? Here we investigate the dynamics of structure learning by tracking human responses to temporal sequences that change in structure unbeknownst to the participants. Participants were asked to predict the upcoming item following a probabilistic sequence of symbols. Using a Markov process, we created a family of sequences, from simple frequency statistics (e.g., some symbols are more probable than others) to context-based statistics (e.g., symbol probability is contingent on preceding symbols). We demonstrate the dynamics with which individuals adapt to changes in the environment's statistics—that is, they extract the behaviorally relevant structures to make predictions about upcoming events. Further, we show that this structure learning relates to individual decision strategy; faster learning of complex structures relates to selection of the most probable outcome in a given context (maximizing) rather than matching of the exact sequence statistics. Our findings provide evidence for alternate routes to learning of behaviorally relevant statistics that facilitate our ability to predict future events in variable environments. PMID:28973111

  18. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence. Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation. Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers…

  19. Statistical mechanics

    CERN Document Server

    Sheffield, Scott

    2009-01-01

    In recent years, statistical mechanics has been increasingly recognized as a central domain of mathematics. Major developments include the Schramm-Loewner evolution, which describes two-dimensional phase transitions, random matrix theory, renormalization group theory and the fluctuations of random surfaces described by dimers. The lectures contained in this volume present an introduction to recent mathematical progress in these fields. They are designed for graduate students in mathematics with a strong background in analysis and probability. This book will be of particular interest to graduate students and researchers interested in modern aspects of probability, conformal field theory, percolation, random matrices and stochastic differential equations.

  20. Problems with Contingency Theory: Testing Assumptions Hidden within the Language of Contingency "Theory".

    Science.gov (United States)

    Schoonhoven, Claudia Bird

    1981-01-01

    Discusses problems in contingency theory, which relates organizational structure to the tasks performed and the information needed. Analysis of data from 17 hospitals suggests that traditional contingency theory underrepresents the complexity of relations among technological uncertainty, structure, and organizational effectiveness. (Author/RW)

  1. Optimal self-esteem is contingent: Intrinsic versus extrinsic and upward versus downward contingencies

    NARCIS (Netherlands)

    Vonk, R.; Smit, H.M.M.

    2012-01-01

    We argue that noncontingent, unconditional self-esteem is not optimal but defensive. We introduce the concept of intrinsic contingency, where self-esteem is affected by whether one's actions are self-congruent and conducive to personal growth. Whereas external contingencies, especially social and

  2. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

    compensation (a $10 lottery) on Amazon Mechanical Turk, an online platform hosted on Amazon.com [31]. All of the participants stated that they were native …

  3. Contingency Contractor Optimization Phase 3 Sustainment Software Design Document - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa; Jones, Katherine A

    2016-05-01

    This document describes the final software design of the Contingency Contractor Optimization Tool - Prototype. Its purpose is to provide the overall architecture of the software and the logic behind this architecture. Documentation for the individual classes is provided in the application Javadoc. The Contingency Contractor Optimization project is intended to address Department of Defense mandates by delivering a centralized strategic planning tool that allows senior decision makers to quickly and accurately assess the impacts, risks, and mitigation strategies associated with utilizing contract support. The Contingency Contractor Optimization Tool - Prototype was developed in Phase 3 of the OSD ATL Contingency Contractor Optimization project to support strategic planning for contingency contractors. The planning tool uses a model to optimize the Total Force mix by minimizing the combined total costs for selected mission scenarios. The model optimizes the match of personnel types (military, DoD civilian, and contractors) and capabilities to meet mission requirements as effectively as possible, based on risk, cost, and other requirements.
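
    As a rough illustration of the kind of Total Force cost model described above (all coefficients are invented; the actual tool's model is richer), a linear program via SciPy:

      from scipy.optimize import linprog

      # Choose military / DoD-civilian / contractor headcounts to cover a
      # required capability at minimum cost (all numbers hypothetical).
      cost = [120, 100, 90]                # $k per person-year

      # Capability coverage: 0.9*mil + 1.0*civ + 1.1*ctr >= 400,
      # negated to express ">=" in linprog's "<=" form.
      A_ub = [[-0.9, -1.0, -1.1]]
      b_ub = [-400]

      # Risk policy: contractors at most 40% of the total force,
      # i.e. ctr <= 0.4 * (mil + civ + ctr).
      A_ub.append([-0.4, -0.4, 0.6])
      b_ub.append(0)

      res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
      print("force mix (mil, civ, ctr):", res.x)
      print("total cost ($k):", res.fun)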

  4. Understanding advanced statistical methods

    CERN Document Server

    Westfall, Peter

    2013-01-01

    Contents: Introduction: Probability, Statistics, and Science; Reality, Nature, Science, and Models; Statistical Processes: Nature, Design and Measurement, and Data; Models; Deterministic Models; Variability; Parameters; Purely Probabilistic Statistical Models; Statistical Models with Both Deterministic and Probabilistic Components; Statistical Inference; Good and Bad Models; Uses of Probability Models; Random Variables and Their Probability Distributions; Introduction; Types of Random Variables: Nominal, Ordinal, and Continuous; Discrete Probability Distribution Functions; Continuous Probability Distribution Functions; Some Calculus: Derivatives and Least Squares; More Calculus: Integrals and Cumulative Distribution Functions; Probability Calculation and Simulation; Introduction; Analytic Calculations, Discrete and Continuous Cases; Simulation-Based Approximation; Generating Random Numbers; Identifying Distributions; Introduction; Identifying Distributions from Theory Alone; Using Data: Estimating Distributions via the Histogram; Quantiles: Theoretical and Data-Based Estimate...

  5. A parallel asynchronous decomposition for on-line contingency planning

    Energy Technology Data Exchange (ETDEWEB)

    Ramesh, V.C. [Illinois Inst. of Tech., Chicago, IL (United States). Electrical and Computer Engineering Dept.; Talukdar, S.N. [Carnegie Mellon Univ., Pittsburgh, PA (United States). Engineering Design Research Center

    1995-12-31

    Traditional formulations of security-constrained optimal power flows represent contingencies by hard constraints. The disadvantages are four-fold. First, the conflicts among contingencies must be arbitrated a priori, before their effects are known. Second, the feasible region shrinks with an increase in the number of contingencies. Third, computational time increases with the number of contingencies. Fourth, hard constraints provide poor models of fuzzy quantities such as equipment ratings and operating guidelines. This paper develops a modeling framework without these disadvantages. Specifically, it allows for soft constraints and always has feasible solutions. The effects of conflicts among contingencies are displayed so system operators can arbitrate them in an informed manner. Moreover, each contingency can be handled asynchronously and in parallel. In other words, the computational time for handling an arbitrarily large number of contingencies remains the same as for performing an optimal power flow without any contingencies (provided that a computer is dedicated to each contingency).
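
    A minimal sketch of the soft-constraint idea in this abstract: instead of enforcing a post-contingency limit as a hard constraint, a penalty for violating it is added to the objective, so the problem always stays feasible and conflicts among contingencies show up as nonzero penalties. The flows, limits, and penalty weight below are invented for illustration.

      # Soft (penalized) constraint in place of a hard line-flow limit.
      def soft_penalty(flow, limit, weight=1000.0):
          """Quadratic penalty on the amount by which |flow| exceeds its limit."""
          violation = max(0.0, abs(flow) - limit)
          return weight * violation ** 2

      # Objective for one contingency: dispatch cost plus soft-limit penalties.
      def contingency_cost(dispatch_cost, flows, limits):
          return dispatch_cost + sum(soft_penalty(f, l) for f, l in zip(flows, limits))

      print(contingency_cost(1000.0, flows=[95.0, 130.0], limits=[100.0, 120.0]))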

  7. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, the binomial theorem, probability distributions, means, standard deviations, the probability function of the binomial distribution, and more. Includes 360 problems, with answers for half of them.

  8. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
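
    One classic instance is the matching problem: if n items are returned in random order, the probability that none ends up in its original position tends to 1/e as n grows. A quick Monte Carlo check in Python:

      import math
      import random

      def no_fixed_point(n):
          perm = random.sample(range(n), n)  # a uniformly random permutation
          return all(perm[i] != i for i in range(n))

      trials = 100_000
      est = sum(no_fixed_point(52) for _ in range(trials)) / trials
      print(est, 1 / math.e)  # both close to 0.3679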

  9. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  10. Probability shapes perceptual precision: A study in orientation estimation.

    Science.gov (United States)

    Jabar, Syaheed B; Anderson, Britt

    2015-12-01

    Probability is known to affect perceptual estimations, but an understanding of the mechanisms is lacking. Moving beyond binary classification tasks, we had naive participants report the orientation of briefly viewed gratings for which we systematically manipulated contingent probability. Participants rapidly developed faster and more precise estimations for high-probability tilts. The shapes of their error distributions, as indexed by a kurtosis measure, also showed a distortion from Gaussian. This kurtosis metric was robust, capturing probability effects that were graded, contextual, and varying as a function of stimulus orientation. Our data can be understood as a probability-induced reduction in the variability or "shape" of estimation errors, as would be expected if probability affects the perceptual representations. As probability manipulations are an implicit component of many endogenous cuing paradigms, changes at the perceptual level could account for changes in performance that might have traditionally been ascribed to "attention." (c) 2015 APA, all rights reserved.

  11. Searching for Plausible N-k Contingencies Endangering Voltage Stability

    DEFF Research Database (Denmark)

    Weckesser, Johannes Tilman Gabriel; Van Cutsem, Thierry

    2017-01-01

    This paper presents a novel search algorithm using time-domain simulations to identify plausible N − k contingencies endangering voltage stability. Starting from an initial list of disturbances, progressively more severe contingencies are investigated. After simulation of an N − k contingency, the simulation results are assessed. If the system response is unstable, a plausible harmful contingency sequence has been found. Otherwise, components affected by the contingencies are considered as candidate next events leading to N − (k + 1) contingencies. This implicitly takes into account hidden failures...
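
    A rough sketch of the search strategy described above, with the time-domain simulator abstracted behind an assumed function simulate(sequence) -> (stable, affected_components); everything else here is invented scaffolding:

      from collections import deque

      def search_harmful_sequences(initial_events, simulate, max_k=3):
          harmful = []
          queue = deque([event] for event in initial_events)
          while queue:
              seq = queue.popleft()
              stable, affected = simulate(seq)
              if not stable:
                  harmful.append(seq)      # plausible harmful N-k sequence found
              elif len(seq) < max_k:
                  for comp in affected:    # affected components become candidate
                      if comp not in seq:  # next events: N-(k+1) contingencies
                          queue.append(seq + [comp])
          return harmful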

  12. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  13. Agreeing Probability Measures for Comparative Probability Structures

    NARCIS (Netherlands)

    P.P. Wakker (Peter)

    1981-01-01

    textabstractIt is proved that fine and tight comparative probability structures (where the set of events is assumed to be an algebra, not necessarily a σ-algebra) have agreeing probability measures. Although this was often claimed in the literature, all proofs the author encountered are not valid

  14. Probability sampling in legal cases: Kansas cellphone users

    Science.gov (United States)

    Kadane, Joseph B.

    2012-10-01

    Probability sampling is a standard statistical technique. This article introduces the basic ideas of probability sampling, and shows in detail how probability sampling was used in a particular legal case.

  15. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  16. Color and Contingency in Robert Boyle's Works.

    Science.gov (United States)

    Baker, Tawrin

    2015-01-01

    This essay investigates the relationship between color and contingency in Robert Boyle's Experiments and Considerations Touching Colours (1664) and his essays on the unsuccessfulness of experiments in Certain Physiological Essays (1661). In these two works Boyle wrestles with a difficult practical and philosophical problem with experiments, which he calls the problem of contingency. In Touching Colours, the problem of contingency is magnified by the much-debated issue of whether color had any deep epistemic importance. His limited theoretical principle guiding him in Touching Colours, that color is but modified light, further exacerbated the problem. Rather than theory, Boyle often relied on craftsmen, whose mastery of color phenomena was, Boyle mentions, brought about by economic forces, to determine when colors were indicators of important 'inward' properties of substances, and thus to secure a solid foundation for his experimental history of color.

  17. Estimating state-contingent production functions

    DEFF Research Database (Denmark)

    Rasmussen, Svend; Karantininis, Kostas

    The paper reviews the empirical problem of estimating state-contingent production functions. The major problem is that states of nature may not be registered and/or that the number of observations per state is low. Monte Carlo simulation is used to generate an artificial, uncertain production environment based on Cobb-Douglas production functions with state-contingent parameters. The parameters are subsequently estimated from samples of different sizes using Generalized Least Squares and Generalized Maximum Entropy, and the results are compared. It is concluded that Maximum Entropy may be useful, but that further analysis is needed to evaluate the efficiency of this estimation method compared to traditional methods.
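
    A toy version of the Monte Carlo setup the abstract describes, generating output from a Cobb-Douglas technology whose parameters depend on an unobserved state of nature; all parameter values are assumed for illustration:

      import numpy as np

      rng = np.random.default_rng(0)
      # Cobb-Douglas parameters (A, alpha, beta), one entry per state of nature
      A     = np.array([1.0, 0.8])
      alpha = np.array([0.3, 0.4])
      beta  = np.array([0.5, 0.4])

      n = 200
      x1 = rng.uniform(1.0, 10.0, n)          # input 1
      x2 = rng.uniform(1.0, 10.0, n)          # input 2
      s = rng.integers(0, 2, n)               # unobserved state per observation
      y = A[s] * x1**alpha[s] * x2**beta[s] * rng.lognormal(0.0, 0.1, n)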

  18. Stationary algorithmic probability

    National Research Council Canada - National Science Library

    Müller, Markus

    2010-01-01

    ...since their actual values depend on the choice of the universal reference computer. In this paper, we analyze a natural approach to eliminate this machine-dependence. Our method is to assign algorithmic probabilities to the different...

  19. Comments on contingency management and conditional cash transfers.

    Science.gov (United States)

    Higgins, Stephen T

    2010-10-01

    This essay discusses research on incentive-based interventions to promote healthy behavior change, contingency management (CM) and conditional cash transfers (CCT). The overarching point of the essay is that CM and CCT are often treated as distinct areas of inquiry when at their core they represent a common approach. Some potential bi-directional benefits of recognizing this commonality are discussed. Distinct intellectual traditions probably account for the separate paths of CM and CCT to date, with the former being rooted in behavioral psychology and the latter in microeconomics. It is concluded that the emerging field of behavioral economics, which is informed by and integrates principles of each of those disciplines, may provide the proper conceptual framework for integrating CM and CCT.

  20. Experience Matters: Information Acquisition Optimizes Probability Gain

    Science.gov (United States)

    Nelson, Jonathan D.; McKenzie, Craig R.M.; Cottrell, Garrison W.; Sejnowski, Terrence J.

    2010-01-01

    Deciding which piece of information to acquire or attend to is fundamental to perception, categorization, medical diagnosis, and scientific inference. Four statistical theories of the value of information—information gain, Kullback-Leibler distance, probability gain (error minimization), and impact—are equally consistent with extant data on human information acquisition. Three experiments, designed via computer optimization to be maximally informative, tested which of these theories best describes human information search. Experiment 1, which used natural sampling and experience-based learning to convey environmental probabilities, found that probability gain explained subjects’ information search better than the other statistical theories or the probability-of-certainty heuristic. Experiments 1 and 2 found that subjects behaved differently when the standard method of verbally presented summary statistics (rather than experience-based learning) was used to convey environmental probabilities. Experiment 3 found that subjects’ preference for probability gain is robust, suggesting that the other models contribute little to subjects’ search behavior. PMID:20525915

  2. Probability machines: consistent probability estimation using nonparametric learning machines.

    Science.gov (United States)

    Malley, J D; Kruppa, J; Dasgupta, A; Malley, K G; Ziegler, A

    2012-01-01

    Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications.
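
    The paper's central point, sketched here with scikit-learn rather than the R packages it mentions (synthetic data, assumed settings): a regression forest fit to a 0/1 response produces individual probability estimates.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(3)
      X = rng.normal(size=(500, 4))
      p_true = 1 / (1 + np.exp(-X[:, 0]))     # true P(y = 1 | x)
      y = rng.binomial(1, p_true)             # observed binary response

      forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
      print(forest.predict(X[:5]))            # estimated probabilities, in [0, 1]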

  4. Insensitivity to Scope in Contingent Valuation Studies

    DEFF Research Database (Denmark)

    Søgaard, Rikke; Lindholt, Jes Sanddal; Gyrd-Hansen, Dorte

    2012-01-01

    Background: The credibility of contingent valuation studies has been questioned because of the potential occurrence of scope insensitivity, i.e. that respondents do not react to higher quantities or qualities of a good. Objective: The aim of this study was to examine the extent of scope insensitivity and to assess the relevance of potential explanations that may help to shed light on how to appropriately handle this problem in contingent valuation studies. Methods: We surveyed a sample of 2004 men invited for cardiovascular disease screening. Each respondent had three contingent valuation... was overall found to be sensitive to scope, testing at the conventional sample-mean level. At the individual respondent level, however, more than half of the respondents failed the tests. Potential determinants for failing the tests were examined in alternative regression models but few consistent...

  5. An Exploratory Contingency Model for Schools.

    Science.gov (United States)

    Whorton, David M.

    In an application of contingency theory, data from 45 Arizona schools were analyzed to determine the relationships between three sets of independent variables (organizational structure, leadership style, and environmental characteristics) and the dependent variable (organizational effectiveness as perceived by principals and teachers). Contingency…

  6. Toward a Contingency Theory of Decision Making.

    Science.gov (United States)

    Tarter, C. John; Hoy, Wayne K.

    1998-01-01

    There is no single best decision-making approach. This article reviews and compares six contemporary models (classical, administrative, incremental, mixed-scanning, garbage-can, and political) and develops a framework and 10 propositions to match strategies with circumstances. A contingency approach suggests that administrators use satisficing (a…

  7. Paradoxes of leadership: contingencies and critical learning

    African Journals Online (AJOL)

    Erna Kinsey

    In our resolution we argue that the contingencies of leadership contexts are sufficiently different to compromise the goal of producing a single leadership model. Instead, we urge that the role of school leaders in promoting learning, or other organizational goals, needs to be discerned from the leader's own theory that guides ...

  8. Towards a contingency theory of Operations Management

    DEFF Research Database (Denmark)

    Boer, Harry; Boer, Henrike Engele Elisabeth; Demeter, Krisztina

    2017-01-01

    This paper presents an analysis of papers addressing relationships between context, OM practice and performance, published in IJOPM and JOM over the last 25 years. The analysis suggests that the field is highly scattered, still quite immature, but growing. Suggestions for further analysis of existing, and directions for future, research are formulated, aimed at furthering the development of OM contingency theory.

  9. Strategy changes in human contingency judgments as a function of contingency tables.

    Science.gov (United States)

    Shimazaki, T; Tsuda, Y; Imada, H

    1991-10-01

    Judgment strategies of 169 undergraduate students on problems requiring them to judge the contingency between two binary events were identified by the method of rule-based analysis, to clarify whether or not the strategies the subjects used would be affected by the concrete nature of the contingency table. Problems were constructed along two factors: the total cell frequency and the width of the range of objective contingencies. Although the factor of total cell frequency had no effect on subjects' strategies, the number of subjects who changed strategies across problem instances increased when the objective contingencies were set closer to zero or when the problems became more difficult. These results are discussed in the context of previous studies of this issue in the literature.
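
    For reference, the objective contingency in such 2x2 studies is usually the delta-P statistic, P(outcome | cue) - P(outcome | no cue), computed from the four cell frequencies; a minimal implementation with made-up counts:

      def delta_p(a, b, c, d):
          """a: cue & outcome, b: cue & no outcome, c: no cue & outcome, d: neither."""
          return a / (a + b) - c / (c + d)

      print(delta_p(20, 5, 10, 15))  # 0.8 - 0.4 = 0.4, a positive contingency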

  10. Contingency Contractor Optimization Phase 3 Sustainment Third-Party Software List - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa

    2016-05-01

    The Contingency Contractor Optimization Tool - Prototype (CCOT-P) requires several third-party software packages. These are documented below for each of the CCOT-P elements: client, web server, database server, solver, web application and polling application.

  11. The Role of Feedback Contingency in Perceptual Category Learning

    Science.gov (United States)

    Ashby, F. Gregory; Vucovich, Lauren E.

    2016-01-01

    Feedback is highly contingent on behavior if it eventually becomes easy to predict, and weakly contingent on behavior if it remains difficult or impossible to predict even after learning is complete. Many studies have demonstrated that humans and nonhuman animals are highly sensitive to feedback contingency, but no known studies have examined how…

  12. Managing Contingent Liabilities in Public-Private Partnerships

    OpenAIRE

    Irwin, Timothy; Mokdad, Tanya

    2010-01-01

    Contingent liabilities create management problems for governments. They have a cost, but judging what the cost is and whether it is worth incurring is difficult. Except in the case of contingent liabilities created by simple guarantees of debt, governments usually can incur contingent liabilities without budgetary approval or recognition in the government's accounts. So governments may pref...

  13. Efficient probability sequence

    OpenAIRE

    Regnier, Eva

    2014-01-01

    A probability sequence is an ordered set of probability forecasts for the same event. Although single-period probabilistic forecasts and methods for evaluating them have been extensively analyzed, we are not aware of any prior work on evaluating probability sequences. This paper proposes an efficiency condition for probability sequences and shows properties of efficient forecasting systems, including memorylessness and increasing discrimination. These results suggest tests for efficiency and ...

  16. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    Subjective probabilities play a central role in many economic decisions and act as an immediate confound of inferences about behavior, unless controlled for. Several procedures to recover subjective probabilities have been proposed, but in order to recover the correct latent probability one must ...

  17. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned

  18. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  19. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the groundwork to later claim the existence of stochastic processes with prescribed finite-dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, the Erdos-Kac invariance principle, the functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a textbook for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain interest and joy in learning the subject.

  20. Probability on compact Lie groups

    CERN Document Server

    Applebaum, David

    2014-01-01

    Probability theory on compact Lie groups deals with the interaction between “chance” and “symmetry,” a beautiful area of mathematics of great interest for its own sake but which is now also finding increasing applications in statistics and engineering (particularly with respect to signal processing). The author gives a comprehensive introduction to some of the principal areas of study, with an emphasis on applicability. The most important topics presented are: the study of measures via the non-commutative Fourier transform, existence and regularity of densities, properties of random walks and convolution semigroups of measures, and the statistical problem of deconvolution. The emphasis on compact (rather than general) Lie groups helps readers to get acquainted with what is widely seen as a difficult field but which is also justified by the wealth of interesting results at this level and the importance of these groups for applications. The book is primarily aimed at researchers working in probability, s...

  1. Idaho National Engineering Laboratory Environmental Restoration Program Schedule Contingency Evaluation Report

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    This report presents the schedule contingency evaluation done on the FY-93 Major System Acquisition (MSA) Baseline for the Idaho National Engineering Laboratory's (INEL) Environmental Restoration Program (ERP). A Schedule Contingency Evaluation Team (SCET) was established to evaluate schedule contingency on the MSA Baseline for the INEL ERP associated with completing work within milestones established in the baseline. Baseline schedules had been established considering enforceable deadlines contained in the Federal Facilities Agreement/Consent Order (FFA/CO), the agreement signed in 1992 by the State of Idaho, Department of Health & Welfare, the U.S. Environmental Protection Agency, Region 10, and the U.S. Department of Energy, Idaho Operations Office. The evaluation was based upon the application of standard schedule risk management techniques to the specific problems of the INEL ERP. The schedule contingency evaluation was designed to provide early visibility for potential schedule delays impacting enforceable deadlines. The focus of the analysis was on the duration of time needed to accomplish all required activities to achieve completion of the milestones in the baseline corresponding to the enforceable deadlines. Additionally, the analysis was designed to identify and control high-probability, high-impact schedule risk factors.

  2. Flood hazard probability mapping method

    Science.gov (United States)

    Kalantari, Zahra; Lyon, Steve; Folkeson, Lennart

    2015-04-01

    In Sweden, spatially explicit approaches have been applied in various disciplines such as landslide modelling based on soil type data and flood risk modelling for large rivers. Regarding flood mapping, most previous studies have focused on complex hydrological modelling on a small scale, whereas just a few studies have used a robust GIS-based approach integrating most physical catchment descriptor (PCD) aspects on a larger scale. The aim of the present study was to develop a methodology for predicting the spatial probability of flooding on a general large scale. Factors such as topography, land use, soil data and other PCDs were analysed in terms of their relative importance for flood generation. The specific objective was to test the methodology using statistical methods to identify factors having a significant role in controlling flooding. A second objective was to generate an index quantifying a flood probability value for each cell, based on different weighted factors, in order to provide a more accurate analysis of potential high flood hazards than can be obtained using just a single variable. The ability of indicator covariance to capture flooding probability was determined for different watersheds in central Sweden. Using data from this initial investigation, a method to extract spatial data for multiple catchments and to produce soft data for statistical analysis was developed. It allowed flood probability to be predicted from spatially sparse data without compromising the significant hydrological features on the landscape. By using PCD data, realistic representations of high-probability flood regions were made, irrespective of the magnitude of rain events. This in turn allowed objective quantification of the probability of floods at the field scale for future model development and watershed management.
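
    A minimal sketch of the weighted-factor index described above: each cell's flood-probability index is a weighted sum of its normalized physical catchment descriptors. Factor names and weights are invented for illustration.

      import numpy as np

      weights = np.array([0.40, 0.25, 0.35])   # slope, land use, soil (assumed)
      cells = np.array([
          # slope, land_use, soil, each normalized to [0, 1]
          [0.8, 0.3, 0.6],
          [0.2, 0.9, 0.4],
      ])
      flood_index = cells @ weights            # higher value = higher flood hazard
      print(flood_index)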

  3. Oxygen boundary crossing probabilities.

    Science.gov (United States)

    Busch, N A; Silver, I A

    1987-01-01

    The probability that an oxygen particle will reach a time dependent boundary is required in oxygen transport studies involving solution methods based on probability considerations. A Volterra integral equation is presented, the solution of which gives directly the boundary crossing probability density function. The boundary crossing probability is the probability that the oxygen particle will reach a boundary within a specified time interval. When the motion of the oxygen particle may be described as strongly Markovian, then the Volterra integral equation can be rewritten as a generalized Abel equation, the solution of which has been widely studied.

  4. Philosophy and probability

    CERN Document Server

    Childers, Timothy

    2013-01-01

    Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability-frequentist, propensity, classical, Bayesian, and objective Bayesian-and uses stimulating examples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,

  5. Molecular Darwinism: the contingency of spontaneous genetic variation.

    Science.gov (United States)

    Arber, Werner

    2011-01-01

    The availability of spontaneously occurring genetic variants is an important driving force of biological evolution. Largely thanks to experimental investigations by microbial geneticists, we know today that several different molecular mechanisms contribute to the overall genetic variation. These mechanisms can be assigned to three natural strategies to generate genetic variants: 1) local sequence changes, 2) intragenomic reshuffling of DNA segments, and 3) acquisition of a segment of foreign DNA. In these processes, specific gene products are involved in cooperation with different nongenetic elements. Some genetic variations occur fully at random along the DNA filaments, others rather with a statistical reproducibility, although at many possible sites. We have to be aware that evolution in natural ecosystems is of higher complexity than under most laboratory conditions, not least in view of symbiotic associations and the occurrence of horizontal gene transfer. The encountered contingency of genetic variation can possibly best ensure a long-term persistence of life under steadily changing living conditions.

  6. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections

  7. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal-probability-plot-based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
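
    A basic normal probability plot is easy to produce with scipy; the paper's contribution is the simultaneous 1-α intervals around the line, which this minimal sketch omits:

      import numpy as np
      from scipy import stats
      import matplotlib.pyplot as plt

      rng = np.random.default_rng(1)
      sample = rng.normal(loc=10.0, scale=2.0, size=50)

      stats.probplot(sample, dist="norm", plot=plt)  # plotted points vs. fitted line
      plt.show()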

  8. Algebraic statistics computational commutative algebra in statistics

    CERN Document Server

    Pistone, Giovanni; Wynn, Henry P

    2000-01-01

    Written by pioneers in this exciting new field, Algebraic Statistics introduces the application of polynomial algebra to experimental design, discrete probability, and statistics. It begins with an introduction to Gröbner bases and a thorough description of their applications to experimental design. A special chapter covers the binary case with new application to coherent systems in reliability and two level factorial designs. The work paves the way, in the last two chapters, for the application of computer algebra to discrete probability and statistical modelling through the important concept of an algebraic statistical model.As the first book on the subject, Algebraic Statistics presents many opportunities for spin-off research and applications and should become a landmark work welcomed by both the statistical community and its relatives in mathematics and computer science.

  9. Representing Uncertainty by Probability and Possibility

    DEFF Research Database (Denmark)

    Uncertain parameters in modeling are usually represented by probability distributions reflecting either the objective uncertainty of the parameters or the subjective belief held by the model builder. This approach is particularly suited for representing the statistical nature or variance of uncertain...

  10. Analysis of Contemporary Contingency Contracting Educational Resources

    Science.gov (United States)

    2010-11-29

    Attn: James B. Greene, RADM, USN (Ret), Acquisition Chair, Graduate School of Business and Public Policy, Naval Postgraduate School, 555 Dyer Road... Acronyms include: CERP, Commander's Emergency Response Program; CICA, Competition in Contracting Act; CJCS, Chairman of the Joint Chiefs of Staff; CoCo... TLO 1: Recognize and defend the most appropriate approaches for a contingency CoCo in an AOR throughout the four

  11. Contingent Claim-Based Expected Stock Returns

    OpenAIRE

    Nicholas Zhiyao Chen; Strebulaev, Ilya A.

    2013-01-01

    We develop and test a parsimonious contingent claims model for cross-sectional returns of stock portfolios formed on market leverage, book-to-market equity, asset growth rate, and equity size. Since stocks are residual claims on firms' assets that generate operating cash flows, stock returns are cash flow rates scaled by the sensitivities of stocks to cash flows. Our model performs well because the stock-cash flow sensitivities contain economic information. Value stocks, high-leverage stocks ...

  12. Food Marketing Technology and Contingency Market Valuation

    OpenAIRE

    Holloway, Garth J.; Anthony C. Zwart

    1993-01-01

    Marketing activities are introduced into a rational expectations model of the food marketing system. The model is used to evaluate effects of alternative marketing technologies on the distribution of the benefits of contingency markets in agriculture. Benefits depend on two parameters: the cost share of farm inputs and the elasticity of substitution between farm and nonfarm inputs in food marketing. Over a broad spectrum of technologies, consumers are likely to be the net beneficiaries and fa...

  13. Sound-contingent visual motion aftereffect

    Directory of Open Access Journals (Sweden)

    Kobayashi Maori

    2011-05-01

    Abstract. Background: After a prolonged exposure to a paired presentation of different types of signals (e.g., color and motion), one of the signals (color) becomes a driver for the other signal (motion). This phenomenon, which is known as the contingent motion aftereffect, indicates that the brain can establish new neural representations even in the adult brain. However, the contingent motion aftereffect has been reported only in the visual or auditory domain. Here, we demonstrate that a visual motion aftereffect can be contingent on a specific sound. Results: Dynamic random dots moving in an alternating right or left direction were presented to the participants. Each direction of motion was accompanied by an auditory tone of a unique and specific frequency. After a 3-minute exposure, the tones began to exert a marked influence on the visual motion perception, and the percentage of dots required to trigger motion perception systematically changed depending on the tones. Furthermore, this effect lasted for at least 2 days. Conclusions: These results indicate that a new neural representation can be rapidly established between auditory and visual modalities.

  14. Probability Density Functions and Higher Order Statistics of Large-Scale Geostrophic Velocity Estimates and Sea Surface Height, as seen from the Jason-1 -TOPEX/Poseidon Tandem Mission

    Science.gov (United States)

    Scharffenberg, Martin G.; Biri, Stavroula; Stammer, Detlef

    2013-09-01

    Geostrophic velocity Probability Density Functions (PDF), Skewness (S) and Kurtosis (K) are shown for both velocity components (u, v) estimated from the 3-year-long Jason-1 - TOPEX/Poseidon (JTP) Tandem Mission, which allowed both velocity components to be inferred directly from the altimeter observations. To be comparable to previous results for velocity- (w) and SSH-PDF, we include the 18.5-year time series of SSH from the TOPEX/Poseidon, Jason-1 and Jason-2 (TPJJ) missions. The differences in the PDF of the two velocity components are evident, with a wider shape for the zonal velocity component due to the larger variability in the zonal direction. Results confirm that the exponential shape of the global velocity PDF is a consequence of the spatially inhomogeneous EKE distribution over the global ocean. Only regions with a small variance in EKE have Gaussian-shaped PDF; however, normalizing each time series with its STD results in Gaussian PDF everywhere.
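
    For reference, the sample skewness S and kurtosis K reported in such studies can be computed directly with scipy; the series below is a synthetic stand-in for a velocity component:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      u = rng.laplace(size=10_000)   # heavy-tailed stand-in for a zonal velocity series
      print(stats.skew(u), stats.kurtosis(u, fisher=False))  # K = 3 for a Gaussian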

  16. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  17. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  18. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  19. A practical overview on probability distributions.

    Science.gov (United States)

    Viti, Andrea; Terzi, Alberto; Bertolaccini, Luca

    2015-03-01

    The aim of this paper is to give a general definition of probability, of its main mathematical features, and of the features it presents under particular circumstances. The behavior of probability is linked to the features of the phenomenon we would like to predict. This link can be defined as a probability distribution. Given the characteristics of a phenomenon (which we can also call a variable), there is a corresponding probability distribution. For categorical (or discrete) variables, the probability can be described by a binomial or Poisson distribution in the majority of cases. For continuous variables, the probability can be described by the most important distribution in statistics, the normal distribution. Distributions of probability are briefly described together with some examples of their possible application.
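
    A quick illustration of the distributions this abstract names, using scipy.stats (parameter values are arbitrary):

      from scipy import stats

      print(stats.binom.pmf(3, n=10, p=0.3))  # P(X = 3) for X ~ Binomial(10, 0.3)
      print(stats.poisson.pmf(2, mu=1.5))     # P(X = 2) for X ~ Poisson(1.5)
      print(stats.norm.cdf(1.96))             # P(Z <= 1.96) ~ 0.975 for Z ~ N(0, 1)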

  20. Context-Gated Statistical Learning and Its Role in Visual-Saccadic Decisions

    Science.gov (United States)

    Ludwig, Casimir J. H.; Farrell, Simon; Ellis, Lucy A.; Hardwicke, Tom E.; Gilchrist, Iain D.

    2012-01-01

    Adaptive behavior in a nonstationary world requires humans to learn and track the statistics of the environment. We examined the mechanisms of adaptation in a nonstationary environment in the context of visual-saccadic inhibition of return (IOR). IOR is adapted to the likelihood that return locations will be refixated in the near future. We examined 2 potential learning mechanisms underlying adaptation: (a) a local tracking or priming mechanism that facilitates behavior that is consistent with recent experience and (b) a mechanism that supports retrieval of knowledge of the environmental statistics based on the contextual features of the environment. Participants generated sequences of 2 saccadic eye movements in conditions where the probability that the 2nd saccade was directed back to the previously fixated location varied from low (.17) to high (.50). In some conditions, the contingency was signaled by a contextual cue (the shape of the movement cue). Adaptation occurred in the absence of contextual signals but was more pronounced in the presence of contextual cues. Adaptation even occurred when different contingencies were randomly intermixed, showing the parallel formation of multiple associations between context and statistics. These findings are accounted for by an evidence accumulation framework in which the resting baseline of decision alternatives is adjusted on a trial-by-trial basis. This baseline tracks the subjective prior beliefs about the behavioral relevance of the different alternatives and is updated on the basis of the history of recent events and the contextual features of the current environment. PMID:21843019
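
    A toy version of the trial-by-trial baseline adjustment the authors propose, in which the resting baseline of each decision alternative is nudged toward recent outcome history; the update rule and rate are assumptions for illustration:

      def update_baselines(baselines, observed, rate=0.1):
          """Exponentially weighted update: the observed alternative moves up,
          the others decay toward zero."""
          return {alt: (1 - rate) * b + (rate if alt == observed else 0.0)
                  for alt, b in baselines.items()}

      b = {"return": 0.0, "away": 0.0}
      for outcome in ["return", "away", "return", "return"]:
          b = update_baselines(b, outcome)
      print(b)  # baselines now favor the more frequent "return" outcome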

  1. The price of privately releasing contingency tables, and the spectra of random matrices with correlated rows

    Energy Technology Data Exchange (ETDEWEB)

    Kasiviswanathan, Shiva [Los Alamos National Laboratory; Rudelson, Mark [UNIV OF MISSOURI; Smith, Adam [PENNSYLVANIA STATE U

    2009-01-01

    Contingency tables are the method of choice of government agencies for releasing statistical summaries of categorical data. In this paper, we consider lower bounds on how much distortion (noise) is necessary in these tables to provide privacy guarantees when the data being summarized is sensitive. We extend a line of recent work on lower bounds on noise for private data analysis [10, 13, 14, 15] to a natural and important class of functionalities. Our investigation also leads to new results on the spectra of random matrices with correlated rows. Consider a database D consisting of n rows (one per individual), each row comprising d binary attributes. For any subset T of attributes of size |T| = k, the marginal table for T has 2^k entries; each entry counts how many times in the database a particular setting of these attributes occurs. Imagine an agency that wishes to release all (d choose k) contingency tables for a given database. For constant k, previous work showed that distortion Õ(min{n, (n²d)^(1/3), √(d^k)}) is sufficient for satisfying differential privacy, a rigorous definition of privacy that has received extensive recent study. Our main contributions are: (1) For ε- and (ε, δ)-differential privacy (with ε constant and δ = 1/poly(n)), we give a lower bound of Ω̃(min{√n, √(d^k)}), which is tight for n = Ω̃(d^k). Moreover, for a natural and popular class of mechanisms based on additive noise, our bound can be strengthened to Ω(√(d^k)), which is tight for all n. Our bounds extend even to non-constant k, losing roughly a factor of √(2^k) compared to the best known upper bounds for large n. (2) We give efficient polynomial time attacks which allow an adversary to reconstruct sensitive information given insufficiently perturbed contingency table releases. For constant k, we obtain a
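
    For contrast with the lower bounds, a minimal sketch of the additive-noise mechanism class the paper analyzes: releasing a single marginal table with i.i.d. Laplace noise, the standard route to ε-differential privacy for count queries. Table values and ε are arbitrary; under add/remove-one-row neighboring databases, each count changes by at most 1.

      import numpy as np

      rng = np.random.default_rng(7)
      true_table = np.array([30, 12, 7, 51])     # a 2x2 marginal, flattened counts
      epsilon = 1.0
      sensitivity = 1.0                          # one row changes one count by 1
      noisy_table = true_table + rng.laplace(scale=sensitivity / epsilon, size=4)
      print(noisy_table)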

  2. Development of a Tutoring System for Probability Problem-Solving.

    Science.gov (United States)

    O'Connell, Ann Aileen; Bol, Linda

    Interpreting information regarding health risks, crime statistics, and government polls requires some ability to use and interpret probabilities. Studies have shown that even after training or coursework in probability and statistics, people still have many difficulties solving probability problems. The thesis of this document is that helping…

  3. Transportation Contingency Plans for Future Gas Shortages will not Meet Commuter Needs.

    Science.gov (United States)

    1981-07-01

    documents, the contingency plans and supporting documentation, and statistical reports on transit ridership and rideshare/carpool applications... personnel and equipment, such as school buses or retired transit buses that have been held in reserve, (3) increase ridesharing by promotional activities and... actions such as alternative work hour programs and ridesharing programs (see pp. 22 to 25), (2) problems with acquiring, maintaining, and activating

  4. Portable EMG devices, Biofeedback and Contingent Electrical Stimulation applications in Bruxism

    DEFF Research Database (Denmark)

    Castrillon, Eduardo

    Portable EMG devices, Biofeedback and Contingent Electrical Stimulation applications in Bruxism. Eduardo Enrique Castrillon Watanabe, DDS, MSc, PhD; Section of Orofacial Pain and Jaw Function, Department of Dentistry, Aarhus University, Aarhus, Denmark; Scandinavian Center for Orofacial Neuroscience... Summary: Bruxism is a parafunctional activity which involves the masticatory muscles and is probably as old as mankind. Different methods, such as portable EMG devices, have been proposed to diagnose and understand the pathophysiology of bruxism. Biofeedback/contingent electrical stimulation (CES) methods have lately also been studied in the field of bruxism as a management method. Results from studies on portable EMG devices that can assess EMG activity on multiple nights tell us that it is possible to improve the accuracy of the clinical diagnosis of sleep bruxism. New algorithms

  5. Difficulties related to Probabilities

    OpenAIRE

    Rosinger, Elemer Elad

    2010-01-01

    Probability theory is often used as if it had the same ontological status as, for instance, Euclidean Geometry or Peano Arithmetic. In this regard, several highly questionable aspects of probability theory are mentioned, which have earlier been presented in two arXiv papers.

  6. On Randomness and Probability

    Indian Academy of Sciences (India)

    casinos and gambling houses? How does one interpret a statement like "there is a 30 per cent chance of rain tonight" - a statement we often hear on the news? Such questions arise in the mind of every student when she/he is taught probability as part of mathematics. Many students who go on to study probability and ...

  7. Dynamic update with probabilities

    NARCIS (Netherlands)

    Van Benthem, Johan; Gerbrandy, Jelle; Kooi, Barteld

    2009-01-01

    Current dynamic-epistemic logics model different types of information change in multi-agent scenarios. We generalize these logics to a probabilistic setting, obtaining a calculus for multi-agent update with three natural slots: prior probability on states, occurrence probabilities in the relevant

  8. Elements of quantum probability

    NARCIS (Netherlands)

    Kummerer, B.; Maassen, H.

    1996-01-01

    This is an introductory article presenting some basic ideas of quantum probability. From a discussion of simple experiments with polarized light and a card game we deduce the necessity of extending the body of classical probability theory. For a class of systems, containing classical systems with

  9. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  10. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  11. Business statistics I essentials

    CERN Document Server

    Clark, Louise

    2014-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams or doing homework, and they remain a lasting reference source for students, teachers, and professionals. Business Statistics I includes descriptive statistics, introduction to probability, probability distributions, sampling and sampling distributions, interval estimation, and hypothesis t

  12. A pseudo-sequential choice model for valuing multi-attribute environmental policies or programs in contingent valuation applications

    Science.gov (United States)

    Dmitriy Volinskiy; John C Bergstrom; Christopher M Cornwell; Thomas P Holmes

    2010-01-01

    The assumption of independence of irrelevant alternatives in a sequential contingent valuation format should be questioned. Statistically, most valuation studies treat nonindependence as a consequence of unobserved individual effects. Another approach is to consider an inferential process in which any particular choice is part of a general choosing strategy of a survey...

  13. The probability of probability and research truths.

    Science.gov (United States)

    Fatovich, Daniel M; Phillips, Michael

    2017-04-01

    The foundation of much medical research rests on the statistical significance of the P-value, but we have fallen prey to the seductive certainty of significance. Other scientific disciplines work to a different standard. This may partly explain why medical reversal is an increasing phenomenon, whereby new studies (based on the 0.05 standard) overturn previous significant findings. This has generated a crisis in the rigour of evidence-based medicine, as many people erroneously believe that a P value below 0.05 establishes that a finding is true. But statistics are not facts about the world. Nor should they be based on an arbitrary threshold that arose for historical reasons. This arbitrary threshold encourages an unthinking automatic response that contributes to industry's influence on medical research. Examples from emergency medicine practice illustrate these themes. Study replication needs to be valued as much as discovery. Careful and thoughtful unbiased thinking about the results we do have is undervalued. © 2017 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.

  14. Parallel auto-correlative statistics with VTK.

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre; Bennett, Janine Camille

    2013-08-01

    This report summarizes existing statistical engines in VTK and presents both the serial and parallel auto-correlative statistics engines. It is a sequel to [PT08, BPRT09b, PT09, BPT09, PT10], which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, and order statistics engines. The ease of use of the new parallel auto-correlative statistics engine is illustrated by means of C++ code snippets, and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the auto-correlative statistics engine.
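
    The statistic such an engine computes can be stated compactly. The plain-Python sketch below (not the VTK C++ API) computes the lag-m autocorrelation of a series, i.e. the normalized covariance between x_t and x_{t+m}:

        def autocorrelation(x, m):
            # Lag-m autocorrelation of the series x, for 0 < m < len(x).
            n = len(x) - m
            mean0 = sum(x[:n]) / n                 # mean of x_t
            mean_m = sum(x[m:m + n]) / n           # mean of x_{t+m}
            cov = sum((x[t] - mean0) * (x[t + m] - mean_m) for t in range(n)) / n
            var0 = sum((x[t] - mean0) ** 2 for t in range(n)) / n
            var_m = sum((x[t + m] - mean_m) ** 2 for t in range(n)) / n
            return cov / (var0 * var_m) ** 0.5

        print(autocorrelation([1.0, 2.0, 1.0, 2.0, 1.0, 2.0, 1.0, 2.0], 2))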

  15. Service development success: a contingent approach by knowledge strategy

    OpenAIRE

    Storey, Chris; Frank M. Hull

    2010-01-01

    Purpose – Contingency theory suggests that effective strategies and structures are not universal but dependent upon situational factors. The purpose of this paper is to explore how the way service firms compete acts as a strategic contingency, moderating the effect of a new service development (NSD) system on innovation performance. Two knowledge‐based strategies are tested as contingency factors. One strategy adds value for customers via the delivery of personalized knowledge‐based se...

  16. Statistics For Dummies

    CERN Document Server

    Rumsey, Deborah

    2011-01-01

    The fun and easy way to get down to business with statistics. Stymied by statistics? No fear: this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first semester statistics cou

  17. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  18. Automated Contingency Management for Advanced Propulsion Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Impact Technologies LLC, in cooperation with the Georgia Institute of Technology, proposes to develop and demonstrate an innovative Automated Contingency Management (ACM)...

  19. The dependency and contingency of politics

    DEFF Research Database (Denmark)

    Triantafillou, Peter

    2016-01-01

    Historical Institutionalism and genealogy are two strong analytical approaches that emphasise the importance of history in grasping contemporary politics that so far have lived in isolation from each other. This article, firstly, accounts for the similarities and differences between the two...... differences which make any analytical synthesis both a difficult and a questionable endeavour. In particular, whereas historical institutionalism seeks to explain the present in terms of its dependence on past events, genealogy seeks to provoke the present by demonstrating its historical contingency. In spite...

  20. Resource Contingency Program : Draft Environmental Impact Statement.

    Energy Technology Data Exchange (ETDEWEB)

    United States. Bonneville Power Administration.

    1995-02-01

    In 1990, the Bonneville Power Administration (BPA) embarked upon the Resource Contingency Program (RCP) to fulfill its statutory responsibilities to supply electrical power to its utility, industrial and other customers in the Pacific Northwest. Instead of buying or building generating plants now, BPA has purchased options to acquire power later if needed. Three option development agreements were signed in September 1993 with three proposed natural gas-fired, combined cycle combustion turbine (CT) projects near Chehalis and Satsop, Washington, and near Hermiston, Oregon. This environmental impact statement addresses the environmental consequences of purchasing power from these options.

  1. Sustainability between Necessity, Contingency and Impossibility

    Directory of Open Access Journals (Sweden)

    Karl Bruckmeier

    2009-12-01

    Sustainable use of natural resources seems necessary to maintain functions and services of eco- and social systems in the long run. Efforts in policy and science for sustainable development have shown the splintering of local, national and global strategies. Sustainability becomes contingent and insecure with the actors' conflicting knowledge, interests and aims, and seems even impossible through the “rebound”-effect. To make short and long term requirements of sustainability coherent requires critical, comparative and theoretical analysis of the problems met. For this purpose important concepts and theories are discussed in this review of recent interdisciplinary literature about resource management.

  2. Contingency Contractor Optimization Phase 3 Sustainment Database Design Document - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Frazier, Christopher Rawls; Durfee, Justin David; Bandlow, Alisa; Gearhart, Jared Lee; Jones, Katherine A

    2016-05-01

    The Contingency Contractor Optimization Tool – Prototype (CCOT-P) database is used to store input and output data for the linear program model described in [1]. The database supports queries to retrieve these data, as well as updates and insertions of new input data.

  3. Contingency in the Cosmos and the Contingency of the Cosmos : Two Theological Approaches

    NARCIS (Netherlands)

    Drees, W.B.

    Contingency in reality may be epistemic, due to incomplete knowledge or the intersection of unrelated causal trajectories. In quantum physics, it appears to be ontological. More fundamental and interesting is the limit-question ‘why is there something rather than nothing,’ pointing out the

  4. The Necessity of Contingency or Contingent Necessity: Meillassoux, Hegel, and the Subject

    Directory of Open Access Journals (Sweden)

    John Van Houdt

    2011-06-01

    This article addresses the relationship of contingency to necessity as developed by Quentin Meillassoux and G.W.F. Hegel. Meillassoux criticizes the restriction of possibility by modern philosophy to the conditions of the transcendental subject, which he calls ‘correlationism’, and opposes to this correlationism mathematics as an absolute form of thought. The arch-figure of a metaphysical version of correlationism for Meillassoux is Hegel. This article argues that, while Meillassoux is right to criticize a version of correlationism for restricting the range of contingency, he overlooks Hegel’s unique contribution to this issue. Hegel provides us a version of necessity modeled on the mathematical proof which answers Meillassoux’s concerns about correlationist versions of necessity but does not altogether jettison the concept of the subject. Instead, the subject in Hegel is a contingent interruption which emerges from the breaks in the kinds of necessity we posit about the world. Hegel offers us a way of tying these two concepts together in what I call ‘contingent necessity’.

  5. Dissociating Contingency Awareness and Conditioned Attitudes: Evidence of Contingency-Unaware Evaluative Conditioning

    Science.gov (United States)

    Hutter, Mandy; Sweldens, Steven; Stahl, Christoph; Unkelbach, Christian; Klauer, Karl Christoph

    2012-01-01

    Whether human evaluative conditioning can occur without contingency awareness has been the subject of an intense and ongoing debate for decades, troubled by a wide array of methodological difficulties. Following recent methodological innovations, the available evidence currently points to the conclusion that evaluative conditioning effects do not…

  6. Contingency Contracting Officer Proficiency Assessment Test Development for Construction, Architect-Engineer, and Contingency Contracting

    Science.gov (United States)

    2013-03-01

    AT&L: Acquisition, Technology, and Logistics; ATLDS: Army Training and Leader Development Strategy; CCO: Contingency Contracting Officer; CLP... DEFINITIONS AND EXAMPLES 4. ANALYZE 4.1 DIFFERENTIATING: Discriminating, Distinguishing, Focusing, Selecting. 4.2 ORGANIZING: Finding Coherence, Integrating ... Logistics (AT&L)]), exposed failures occurring in expeditionary contracting operations. In order to rectify difficulties in the acquisition

  7. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  8. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  9. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  10. Elements of quantum probability

    OpenAIRE

    Kummerer, B.; Maassen, Hans

    1996-01-01

    This is an introductory article presenting some basic ideas of quantum probability. From a discussion of simple experiments with polarized light and a card game we deduce the necessity of extending the body of classical probability theory. For a class of systems, containing classical systems with finitely many states, a probabilistic model is developed. It can describe, in particular, the polarization experiments. Some examples of ‘quantum coin tosses’ are discussed, closely related to V.F.R....

  11. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  12. Statistical Neurodynamics.

    Science.gov (United States)

    Paine, Gregory Harold

    1982-03-01

    The primary objective of the thesis is to explore the dynamical properties of small nerve networks by means of the methods of statistical mechanics. To this end, a general formalism is developed and applied to elementary groupings of model neurons which are driven by either constant (steady state) or nonconstant (nonsteady state) forces. Neuronal models described by a system of coupled, nonlinear, first-order, ordinary differential equations are considered. A linearized form of the neuronal equations is studied in detail. A Lagrange function corresponding to the linear neural network is constructed which, through a Legendre transformation, provides a constant of motion. By invoking the Maximum-Entropy Principle with the single integral of motion as a constraint, a probability distribution function for the network in a steady state can be obtained. The formalism is implemented for some simple networks driven by a constant force; accordingly, the analysis focuses on a study of fluctuations about the steady state. In particular, a network composed of N noninteracting neurons, termed Free Thinkers, is considered in detail, with a view to interpretation and numerical estimation of the Lagrange multiplier corresponding to the constant of motion. As an archetypical example of a net of interacting neurons, the classical neural oscillator, consisting of two mutually inhibitory neurons, is investigated. It is further shown that in the case of a network driven by a nonconstant force, the Maximum-Entropy Principle can be applied to determine a probability distribution functional describing the network in a nonsteady state. The above examples are reconsidered with nonconstant driving forces which produce small deviations from the steady state. Numerical studies are performed on simplified models of two physical systems: the starfish central nervous system and the mammalian olfactory bulb. Discussions are given as to how statistical neurodynamics can be used to gain a better

  13. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
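
    For readers who want the normative reference point, the sketch below implements the simplest stationary Bayesian observer: a Beta-Bernoulli posterior updated outcome by outcome. It is only a baseline for comparison, assuming a fixed hidden parameter; it is not the authors' change-point model.

        def beta_bernoulli_estimates(outcomes):
            # Track the posterior mean of a Bernoulli parameter under a
            # Beta(1, 1) prior, updated after each 0/1 outcome.
            a, b = 1.0, 1.0                    # Beta prior pseudo-counts
            estimates = []
            for o in outcomes:
                a, b = a + o, b + (1 - o)
                estimates.append(a / (a + b))  # posterior mean so far
            return estimates

        print(beta_bernoulli_estimates([1, 1, 0, 1, 0, 0, 1]))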

  14. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  15. Information provided by diagnostic and screening tests: improving probabilities.

    Science.gov (United States)

    Weatherall, Mark

    2017-11-13

    Uncertainty in clinical encounters is inevitable, and despite this uncertainty clinicians must still work with patients to make diagnostic and treatment decisions. Explicit diagnostic reasoning based on probabilities will optimise information in relation to uncertainty. In clinical diagnostic encounters, there is often pre-existing information that reflects the probability any particular patient has a disease. Diagnostic testing provides extra information that refines diagnostic probabilities. However, in general diagnostic tests will be positive in most, but not all, cases of disease (sensitivity) and may not be negative in all cases of disease absence (specificity). Bayes rule is an arithmetic method of using diagnostic testing information to refine diagnostic probabilities. In this method, probabilities are converted to odds, and the odds of disease before diagnostic testing are multiplied by the positive likelihood ratio (LR+), the sensitivity of the test divided by 1 minus the specificity, to refine the probability of a particular diagnosis. Similar arithmetic applies to the probability of not having a disease, where the negative likelihood ratio is the specificity divided by 1 minus the sensitivity. A useful diagnostic test is one where the LR+ is greater than 5-10. This can be clarified by creating a contingency table for hypothetical groups of patients in relation to true disease prevalence and test performance predicted by sensitivity and specificity. Most screening tests in populations with a low prevalence of disease have a very high ratio of false positive results to true positive results, which can also be illustrated by contingency tables. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
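
    The odds arithmetic described above is small enough to compute directly. The following sketch uses an illustrative helper name and made-up numbers to refine a pre-test probability after a positive test result:

        def post_test_probability(pre_test_p, sensitivity, specificity):
            # Bayes rule in odds form: post-odds = pre-odds * LR+.
            lr_positive = sensitivity / (1.0 - specificity)   # LR+
            pre_odds = pre_test_p / (1.0 - pre_test_p)        # probability -> odds
            post_odds = pre_odds * lr_positive
            return post_odds / (1.0 + post_odds)              # odds -> probability

        # Example: a 10% pre-test probability with a test of 90% sensitivity and
        # 85% specificity (LR+ = 6) rises to a 40% post-test probability.
        print(post_test_probability(0.10, 0.90, 0.85))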

  16. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability when the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we show that ranking by quantum probability yields a higher probability of detection than ranking by classical probability provided a given probability of ...

  17. Nonparametric statistics for social and behavioral sciences

    CERN Document Server

    Kraska-MIller, M

    2013-01-01

    Introduction to Research in Social and Behavioral Sciences; Basic Principles of Research; Planning for Research; Types of Research Designs; Sampling Procedures; Validity and Reliability of Measurement Instruments; Steps of the Research Process; Introduction to Nonparametric Statistics; Data Analysis; Overview of Nonparametric Statistics and Parametric Statistics; Overview of Parametric Statistics; Overview of Nonparametric Statistics; Importance of Nonparametric Methods; Measurement Instruments; Analysis of Data to Determine Association and Agreement; Pearson Chi-Square Test of Association and Independence; Contingency

  18. Modeling HIVAIDS Variables, A Case Of Contingency Analysis

    African Journals Online (AJOL)

    PROF. OLIVER OSUAGWA

    2015-06-01

    Hypothesis testing may be performed on contingency tables in order to decide whether or not effects are present. Effects in a contingency table are defined as relationships or interactions between variables of interest. However, considering the number of adolescent patients diagnosed with HIV/AIDS within ...

  19. Modeling HIVAIDS Variables, A Case Of Contingency Analysis ...

    African Journals Online (AJOL)

    Hypothesis testing may be performed on contingency tables in order to decide whether or not effects are present. Effects in a contingency table are defined as relationships or interactions between variables of interest. However, considering the number of adolescent patients diagnosed with HIV/AIDS within a short frame of ...
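
    A test of this kind takes only a few lines in practice. The sketch below applies Pearson's chi-square test of independence to a 2x2 contingency table with scipy; the counts are placeholders, not the study's data.

        from scipy.stats import chi2_contingency

        # Rows: two groups; columns: outcome present / absent (made-up counts).
        table = [[34, 16],
                 [23, 27]]

        # Returns the chi-square statistic, p-value, degrees of freedom,
        # and the table of expected counts under independence.
        chi2, p_value, dof, expected = chi2_contingency(table)
        print(chi2, p_value, dof)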

  20. 40 CFR 265.53 - Copies of contingency plan.

    Science.gov (United States)

    2010-07-01

    ...) Submitted to all local police departments, fire departments, hospitals, and State and local emergency response teams that may be called upon to provide emergency services. ... DISPOSAL FACILITIES Contingency Plan and Emergency Procedures § 265.53 Copies of contingency plan. A copy...

  1. An Experimental Test of the Contingency Model of Leadership Effectiveness.

    Science.gov (United States)

    Chemers, Martin M.; Skrzypek, George J.

    The present experiment provided a test of Fiedler's (1967) Contingency Model of Leadership Effectiveness, i.e., that the relationship of leader style to group effectiveness is mediated by situational demands. Thirty-two 4-man task groups composed of military academy cadets were run in the experiment. In accordance with the Contingency Model, leaders…

  2. Effectiveness evaluation of contingency sum as a risk management ...

    African Journals Online (AJOL)

    Construction managers, in a bid to effectively manage risk-prone projects, have adopted several methods, one of which is contingency sum. This study aims at evaluating the effectiveness of contingency sum as a risk management tool for construction projects in the Niger Delta region of Nigeria. The objectives are to establish ...

  3. Better off Jobless? Scarring Effects of Contingent Employment in Japan

    Science.gov (United States)

    Yu, Wei-hsin

    2012-01-01

    Previous research fails to address whether contingent employment benefits individuals' careers more than the alternative they often face: being without a job. Using work history data from Japan, this study shows that accepting a contingent job delays individuals' transition to standard employment more than remaining jobless. Moreover, having a…

  4. Modification of Preschool Children's Bathroom Behaviors by Contingent Teacher Attention

    Science.gov (United States)

    Taylor, Marjorie J.; Kratochwill, Thomas R.

    1978-01-01

    Repeated measures of the frequency of paper towel litter, unflushed toilets, dirty sinks, and running water faucets were used to evaluate effectiveness of contingent teacher praise for appropriate bathroom use by preschool children. Contingent praise for appropriate bathroom behaviors resulted in markedly decreased frequencies of four target…

  5. Appraisal of the Performance of Contingency Cost Provision for ...

    African Journals Online (AJOL)

    The paper appraised the performance of contingency allowance in addressing projects' cost risk. To achieve this aim, the impact of contingency provision in some selected building projects was evaluated. Data for the study were collected by means of a checklist from 40 completed projects' files. Furthermore, 100 questionnaires on ...

  6. Relative speed of processing determines color-word contingency learning.

    Science.gov (United States)

    Forrin, Noah D; MacLeod, Colin M

    2017-10-01

    In three experiments, we tested a relative-speed-of-processing account of color-word contingency learning, a phenomenon in which color identification responses to high-contingency stimuli (words that appear most often in particular colors) are faster than those to low-contingency stimuli. Experiment 1 showed equally large contingency-learning effects whether responding was to the colors or to the words, likely due to slow responding to both dimensions because of the unfamiliar mapping required by the key press responses. For Experiment 2, participants switched to vocal responding, in which reading words is considerably faster than naming colors, and we obtained a contingency-learning effect only for color naming, the slower dimension. In Experiment 3, previewing the color information resulted in a reduced contingency-learning effect for color naming, but it enhanced the contingency-learning effect for word reading. These results are all consistent with contingency learning influencing performance only when the nominally irrelevant feature is faster to process than the relevant feature, and therefore are entirely in accord with a relative-speed-of-processing explanation.

  7. A Test Based on the Contingency Model of Leadership.

    Science.gov (United States)

    Vecchio, Robert P.

    1981-01-01

    The contingency model of leadership was extended to investigate subordinate satisfaction. It was hypothesized that subordinate satisfaction with a leader would yield evidence of an interaction between leadership style and situational parameters. Results indicated moderate support for an extension of the contingency model formulation. (RC)

  8. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...
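
    The two-step structure described above reduces to a simple product: a geometric count of collision candidates (assuming no evasive manoeuvres) times a causation factor, the probability that a critical situation is not resolved. A minimal sketch with illustrative numbers only:

        def collision_probability(candidates_per_year, causation_factor):
            # Step 1: geometric number of collision candidates per year if no
            # evasive action were taken. Step 2: scale by the causation factor,
            # the probability that the navigators fail to resolve the situation.
            return candidates_per_year * causation_factor

        print(collision_probability(50.0, 2e-4))  # illustrative: 0.01 collisions/year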

  9. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications....... The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models....
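
    The best-known instance of this construction is the multinomial logit model, for which the CPGF is the log-sum-exp function and its gradient is the softmax. The sketch below shows only that textbook special case, not the paper's general construction.

        import math

        def cpgf(u):
            # Multinomial-logit CPGF: G(u) = log(sum_i exp(u_i)),
            # computed with the usual max-shift for numerical stability.
            m = max(u)
            return m + math.log(sum(math.exp(ui - m) for ui in u))

        def choice_probabilities(u):
            # The gradient of G at u is the vector of choice probabilities.
            g = cpgf(u)
            return [math.exp(ui - g) for ui in u]

        print(choice_probabilities([1.0, 2.0, 0.5]))  # entries sum to 1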

  10. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions: Sample Space and Events; Probabilities; Counting Techniques. Independence and Conditional Probability: Independence; Conditioning; The Borel-Cantelli Theorem. Discrete Random Variables: Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation. Generating Functions, Branching Processes, Random Walk Revisited: Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk. Markov Chains: Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity. Continuous Random Variables: Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  11. What are the statistics in statistical learning?

    Science.gov (United States)

    Holt, Lori L.; Lotto, Andrew J.

    2003-10-01

    The idea that speech perception is shaped by the statistical structure of the input is gaining wide enthusiasm and growing empirical support. Nonetheless, statistics and statistical learning are broad terms with many possible interpretations and, perhaps, many potential underlying mechanisms. In order to define the role of statistics in speech perception mechanistically, we will need to more precisely define the statistics of statistical learning and examine similarities and differences across subgroups. In this talk, we examine learning of four types of information: (1) acoustic variance that is defining for contrastive categories, (2) the correlation between acoustic attributes or linguistic features, (3) the probability or frequency of events or a series of events, (4) the shape of input distributions. We present representative data from online speech perception and speech development and discuss inter-relationships among the subgroups. [Work supported by NSF, NIH and the James S. McDonnell Foundation.]

  12. Guidelines for Statistical Testing

    OpenAIRE

    Strigini, L.; Littlewood, B.; European Space Agency

    1997-01-01

    This document provides an introduction to statistical testing. Statistical testing of software is here defined as testing in which the test cases are produced by a random process meant to produce different test cases with the same probabilities with which they would arise in actual use of the software. Statistical testing of software has these main advantages: for the purpose of reliability assessment and product acceptance, it directly supports estimates of reliability, and thus decisions on...

  13. Equilibrium statistical mechanics

    CERN Document Server

    Jackson, E Atlee

    2000-01-01

    Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction t

  14. Estimating tail probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Carr, D.B.; Tolley, H.D.

    1982-12-01

    This paper investigates procedures for univariate nonparametric estimation of tail probabilities. Extrapolated values for tail probabilities beyond the data are also obtained based on the shape of the density in the tail. Several estimators which use exponential weighting are described. These are compared in a Monte Carlo study to nonweighted estimators, to the empirical cdf, to an integrated kernel, to a Fourier series estimate, to a penalized likelihood estimate and a maximum likelihood estimate. Selected weighted estimators are shown to compare favorably to many of these standard estimators for the sampling distributions investigated.
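
    Two of the simplest estimators in this family are easy to state: the empirical tail fraction, and an extrapolation beyond a threshold under an assumed exponential tail shape. The sketch below is a crude illustration of the idea, not the paper's weighted estimators; the function names are made up.

        import math

        def empirical_tail(sample, x):
            # P(X > x) estimated as the fraction of observations above x;
            # this is useless beyond the largest observation.
            return sum(1 for s in sample if s > x) / len(sample)

        def exponential_tail(sample, x, threshold):
            # Extrapolate past `threshold` by fitting an exponential rate to
            # the mean excess over the threshold (a crude plug-in choice).
            excesses = [s - threshold for s in sample if s > threshold]
            rate = len(excesses) / sum(excesses)
            return empirical_tail(sample, threshold) * math.exp(-rate * (x - threshold))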

  15. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  16. Head First Statistics

    CERN Document Server

    Griffiths, Dawn

    2009-01-01

    Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics

  17. Statistics & probability for dummies

    CERN Document Server

    Rumsey, Deborah J

    2013-01-01

    Two complete eBooks for one low price! Created and compiled by the publisher, this Statistics I and Statistics II bundle brings together two math titles in one, e-only bundle. With this special bundle, you'll get the complete text of the following two titles: Statistics For Dummies, 2nd Edition  Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tra

  18. The Impact of the Contingency of Robot Feedback for HRI

    DEFF Research Database (Denmark)

    Fischer, Kerstin; Lohan, Katrin Solveig; Saunders, Joe

    2013-01-01

    In this paper, we investigate the impact the contingency of robot feedback may have on the quality of verbal human-robot interaction. In order to assess not only what the effects are but also what they are caused by, we carried out experiments in which naïve participants instructed the humanoid...... robot iCub on a set of shapes and on a stacking task in two conditions, once with socially contingent, nonverbal feedback implemented in response to different gaze and looming behaviors of the human tutor, and once with non-contingent, saliency-based feedback. The results of the analysis of participants......’ linguistic behaviors in the two conditions show that contingency has an impact on the complexity and the pre-structuring of the task for the robot, i.e. on the participants’ tutoring behaviors. Contingency thus plays a considerable role for learning by demonstration....

  19. 7th High Dimensional Probability Meeting

    CERN Document Server

    Mason, David; Reynaud-Bouret, Patricia; Rosinski, Jan

    2016-01-01

    This volume collects selected papers from the 7th High Dimensional Probability meeting held at the Institut d'Études Scientifiques de Cargèse (IESC) in Corsica, France. High Dimensional Probability (HDP) is an area of mathematics that includes the study of probability distributions and limit theorems in infinite-dimensional spaces such as Hilbert spaces and Banach spaces. The most remarkable feature of this area is that it has resulted in the creation of powerful new tools and perspectives, whose range of application has led to interactions with other subfields of mathematics, statistics, and computer science. These include random matrices, nonparametric statistics, empirical processes, statistical learning theory, concentration of measure phenomena, strong and weak approximations, functional estimation, combinatorial optimization, and random graphs. The contributions in this volume show that HDP theory continues to thrive and develop new tools, methods, techniques and perspectives to analyze random phenome...

  20. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  1. Huygens' foundations of probability

    NARCIS (Netherlands)

    Freudenthal, Hans

    It is generally accepted that Huygens based probability on expectation. The term “expectation,” however, stems from Van Schooten's Latin translation of Huygens' treatise. A literal translation of Huygens' Dutch text shows more clearly what Huygens actually meant and how he proceeded.

  2. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  3. Probably Almost Bayes Decisions

    DEFF Research Database (Denmark)

    Anoulova, S.; Fischer, Paul; Poelt, S.

    1996-01-01

    In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian...

  4. The Theory of Probability

    Indian Academy of Sciences (India)

    Andrei Nikolaevich Kolmogorov. Classics, Resonance – Journal of Science Education, Volume 3, Issue 4, April 1998, pp. 103-112. Permanent link: http://www.ias.ac.in/article/fulltext/reso/003/04/0103-0112

  5. Probability Theory Without Tears!

    Indian Academy of Sciences (India)

    S Ramasubramanian. Book Review, Resonance – Journal of Science Education, Volume 1, Issue 2, February 1996, pp. 115-116. Permanent link: http://www.ias.ac.in/article/fulltext/reso/001/02/0115-0116

  6. probably mostly white

    African Journals Online (AJOL)

    Willem Scholtz

    internet – the (probably mostly white) public's interest in the so-called Border War is ostensibly at an all-time high. By far most of the publications are written by ex- ... understanding of this very important episode in the history of Southern Africa. It was, therefore, with some anticipation that one waited for this book, which.

  7. Tourism employment: contingent work or professional career?

    DEFF Research Database (Denmark)

    Hjalager, Anne-Mette; Andersen, Steen

    2001-01-01

    with a professional or vocational tourism education. Discusses the implications of the retention pattern, arguing that tourism shares its professional labour market with neighbouring sectors, and that the industry and educational support framework must therefore take account of this. However, there is a very real...... background does not give them any particular advantages vis-à-vis employees with less relevant qualifications. The retention of employees is a critical problem in Danish tourism, but while turnover is extremely high among the unskilled, significantly better retention rates are found among those...... risk of losing the competition for the best-qualified staff. Finally, it is postulated that tourism is a locus for new types of career concepts; however, we still lack a genuine understanding of the role of tourism for the contingent or boundaryless career....

  8. On Probability Domains

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2010-12-01

    Motivated by IF-probability theory (intuitionistic fuzzy), we study n-component probability domains in which each event represents a body of competing components and the range of a state represents a simplex S_n of n-tuples of possible rewards; the sum of the rewards is a number from [0,1]. For n = 1 we get fuzzy events, for example a bold algebra, and the corresponding fuzzy probability theory can be developed within the category ID of D-posets (equivalently, effect algebras) of fuzzy sets and sequentially continuous D-homomorphisms. For n = 2 we get IF-events, i.e., pairs (μ, ν) of fuzzy sets μ, ν ∈ [0,1]^X such that μ(x) + ν(x) ≤ 1 for all x ∈ X, but we order our pairs (events) coordinatewise. Hence the structure of IF-events (where (μ₁, ν₁) ≤ (μ₂, ν₂) whenever μ₁ ≤ μ₂ and ν₂ ≤ ν₁) is different and, consequently, the resulting IF-probability theory models a different principle. The category ID is cogenerated by I = [0,1] (objects of ID are subobjects of powers I^X), has nice properties, and basic probabilistic notions and constructions are categorical. For example, states are morphisms. We introduce the category S_nD cogenerated by S_n = {(x₁, x₂, …, xₙ) ∈ I^n : Σ_{i=1}^n xᵢ ≤ 1}, carrying the coordinatewise partial order, difference, and sequential convergence, and we show how basic probability notions can be defined within S_nD.

  9. Historical contingency in fluviokarst landscape evolution

    Science.gov (United States)

    Phillips, Jonathan D.

    2018-02-01

    Lateral and vertical erosion at meander bends in the Kentucky River gorge area has created a series of strath terraces on the interior of incised meander bends. These represent a chronosequence of fluviokarst landscape evolution from the youngest valley side transition zone near the valley bottom to the oldest upland surface. This five-part chronosequence (not including the active river channel and floodplain) was analyzed in terms of the landforms that occur at each stage or surface. These include dolines, uvalas, karst valleys, pocket valleys, unincised channels, incised channels, and cliffs (smaller features such as swallets and shafts also occur). Landform coincidence analysis shows higher coincidence indices (CI) than would be expected based on an idealized chronosequence. CI values indicate genetic relationships (common causality) among some landforms and unexpected persistence of some features on older surfaces. The idealized and two observed chronosequences were also represented as graphs and analyzed using algebraic graph theory. The two field sites yielded graphs more complex and with less historical contingency than the idealized sequence. Indeed, some of the spectral graph measures for the field sites more closely approximate a purely hypothetical no-historical-contingency benchmark graph. The deviations of observations from the idealized expectations, and the high levels of graph complexity both point to potential transitions among landform types as being the dominant phenomenon, rather than canalization along a particular evolutionary pathway. As the base level of both the fluvial and karst landforms is lowered as the meanders expand, both fluvial and karst denudation are rejuvenated, and landform transitions remain active.

  10. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  11. Advice for New and Student Lecturers on Probability and Statistics

    Science.gov (United States)

    Larsen, Michael D.

    2006-01-01

    Lecture is a common presentation style that gives instructors a lot of control over topics and time allocation, but can limit active student participation and learning. This article presents some ideas to increase the level of student involvement in lecture. The examples and suggestions are based on the author's experience as a senior lecturer for…

  12. Seismicity, seismic regionalization, earthquake risk, statistics, and probability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chinnery, M.A.; North, R.G.

    1975-12-19

    Observational data relating surface wave magnitude M_s to seismic moment M_0 are used to convert a well-known frequency-M_s plot into a frequency-M_0 relationship, which turns out to be remarkably linear. There is no evidence of an upper bound to M_0, on the basis of presently available evidence. The possibility exists that extremely large earthquakes (M_0 = 10^31 dyne-cm or greater) may occur from time to time.
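
    The kind of conversion involved can be illustrated with the standard moment-magnitude scaling log10(M_0) = 1.5 M + 16.1 (M_0 in dyne-cm). This scaling is an assumption used here for illustration; the paper fits its own empirical M_s-M_0 relation, which would differ in detail.

        def moment_from_magnitude(m):
            # Standard moment-magnitude scaling, M0 in dyne-cm; only an
            # approximation to the paper's empirical Ms-M0 fit.
            return 10.0 ** (1.5 * m + 16.1)

        print(moment_from_magnitude(7.0))  # roughly 4e26 dyne-cm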

  13. Motivating Inquiry in Statistics and Probability in the Primary Classroom

    Science.gov (United States)

    Leavy, Aisling; Hourigan, Mairéad

    2015-01-01

    We describe how the use of a games environment combined with technology supports upper primary children in engaging with a concept traditionally considered too advanced for the primary classes: "The Law of Large Numbers."

  14. Exact Probability Levels for Multisample SMIRNOV-TYPE Statistics

    Science.gov (United States)

    1983-03-01


  15. Consolidity analysis for fully fuzzy functions, matrices, probability and statistics

    OpenAIRE

    Walaa Ibrahim Gabr

    2015-01-01

    The paper presents a comprehensive review of the know-how for developing systems consolidity theory for modeling, analysis, optimization and design in a fully fuzzy environment. The development of systems consolidity theory included extending it to handle new functions of different dimensionalities, fuzzy analytic geometry, fuzzy vector analysis, functions of fuzzy complex variables, ordinary differentiation of fuzzy functions and partial fractions of fuzzy polynomials. On the other hand, ...

  16. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  17. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    reveals the computational intuition lying behind the mathematics. In the second part of the thesis we provide an operational reading of continuous valuations on certain domains (the distributive concrete domains of Kahn and Plotkin) through the model of probabilistic event structures. Event structures......Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particular...... there is no categorical distributive law between them. We introduce the powerdomain of indexed valuations which modifies the usual probabilistic powerdomain to take more detailed account of where probabilistic choices are made. We show the existence of a distributive law between the powerdomain of indexed valuations...

  18. Waste Package Misload Probability

    Energy Technology Data Exchange (ETDEWEB)

    J.K. Knudsen

    2001-11-20

    The objective of this calculation is to calculate the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and of FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using this information, a probability of occurrence will be calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.
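
    The arithmetic underlying such a calculation is a per-demand event rate, plus a standard bound for the zero-event case. The sketch below uses placeholder counts, not the report's data.

        def misload_probability(misloads, moves):
            # Point estimate: observed misload events per fuel-assembly move.
            return misloads / moves

        def rule_of_three_upper_bound(moves):
            # Approximate 95% upper bound on the per-move rate when zero
            # events were observed in `moves` demands ("rule of three").
            return 3.0 / moves

        print(misload_probability(8, 2_000_000))    # placeholder counts
        print(rule_of_three_upper_bound(2_000_000))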

  19. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion of possibilistic vs. probabilistic frameworks. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  20. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduate students and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  1. Superpositions of probability distributions.

    Science.gov (United States)

    Jizba, Petr; Kleinert, Hagen

    2008-09-01

    Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = sigma^2 play a favored role in quantum theory and financial markets. Such superpositions need not obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and it can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much more easily than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
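
    A standard numerical illustration of such a variance superposition (a textbook case, not necessarily one of the paper's three examples): smearing the variance of a zero-mean Gaussian with an inverse-gamma distribution reproduces a Student-t.

      # Hedged sketch: Student-t as a Gaussian variance mixture.
      import numpy as np

      rng = np.random.default_rng(0)
      nu, n = 3.0, 100_000
      # v ~ Inv-Gamma(nu/2, nu/2): draw 1/v from Gamma(shape=nu/2, scale=2/nu)
      v = 1.0 / rng.gamma(shape=nu / 2, scale=2 / nu, size=n)
      x = rng.normal(0.0, np.sqrt(v))        # smeared Gaussian draws
      t = rng.standard_t(nu, size=n)         # direct Student-t draws
      print(np.percentile(x, [5, 50, 95]))   # quantiles should roughly
      print(np.percentile(t, [5, 50, 95]))   # agree between the two samples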

  2. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  3. Computational statistics handbook with Matlab

    CERN Document Server

    Martinez, Wendy L

    2007-01-01

    Prefaces Introduction What Is Computational Statistics? An Overview of the Book Probability Concepts Introduction Probability Conditional Probability and Independence Expectation Common Distributions Sampling Concepts Introduction Sampling Terminology and Concepts Sampling Distributions Parameter Estimation Empirical Distribution Function Generating Random Variables Introduction General Techniques for Generating Random Variables Generating Continuous Random Variables Generating Discrete Random Variables Exploratory Data Analysis Introduction Exploring Univariate Data Exploring Bivariate and Trivariate Data Exploring Multidimensional Data Finding Structure Introduction Projecting Data Principal Component Analysis Projection Pursuit EDA Independent Component Analysis Grand Tour Nonlinear Dimensionality Reduction Monte Carlo Methods for Inferential Statistics Introduction Classical Inferential Statistics Monte Carlo Methods for Inferential Statist...

  4. Measurement uncertainty and probability

    National Research Council Canada - National Science Library

    Willink, Robin

    2013-01-01

    Table-of-contents excerpt: ... and probability models; 3.4 Inference and confidence; 3.5 Two central limit theorems; 3.6 The Monte Carlo method and process simulation; 4 The randomization of systematic errors; 4.1 The Working Group of 1980; 4.2 From classical repetition to practica...

  5. Structural Minimax Probability Machine.

    Science.gov (United States)

    Gu, Bin; Sun, Xingming; Sheng, Victor S

    2017-07-01

    Minimax probability machine (MPM) is an interesting discriminative classifier based on generative prior knowledge. It can directly estimate the probabilistic accuracy bound by minimizing the maximum probability of misclassification. The structural information of data is an effective way to represent prior knowledge and has been found to be vital for designing classifiers in real-world problems. However, MPM only considers the prior probability distribution of each class with a given mean and covariance matrix, and so does not efficiently exploit the structural information of the data. In this paper, we use two finite mixture models to capture the structural information of the data in binary classification. For each subdistribution in a finite mixture model, only its mean and covariance matrix are assumed to be known. Based on the finite mixture models, we propose a structural MPM (SMPM). SMPM can be solved effectively by solving a sequence of second-order cone programming problems. Moreover, we extend the linear SMPM to a nonlinear model by exploiting kernelization techniques. We also show that SMPM can be interpreted as a large-margin classifier and can be transformed into the support vector machine and the maxi-min margin machine under certain special conditions. Experimental results on both synthetic and real-world data sets demonstrate the effectiveness of SMPM.
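
    A minimal sketch of the plain (non-structural) MPM bound that SMPM generalizes, assuming the usual formulation: minimize sqrt(a'S1 a) + sqrt(a'S2 a) over hyperplane normals a subject to a'(mu1 - mu2) = 1; the optimum yields kappa and the worst-case accuracy kappa^2 / (1 + kappa^2). The means and covariances below are illustrative, not from the paper.

      # Hedged sketch: the classical (non-structural) MPM accuracy bound.
      import numpy as np
      from scipy.optimize import minimize

      mu1 = np.array([1.0, 1.0]);   S1 = np.array([[1.0, 0.3], [0.3, 0.5]])
      mu2 = np.array([-1.0, -1.0]); S2 = np.array([[0.8, -0.2], [-0.2, 1.2]])

      def objective(a):
          return np.sqrt(a @ S1 @ a) + np.sqrt(a @ S2 @ a)

      cons = {"type": "eq", "fun": lambda a: a @ (mu1 - mu2) - 1.0}
      res = minimize(objective, x0=np.ones(2), constraints=cons)
      kappa = 1.0 / res.fun
      alpha = kappa**2 / (1.0 + kappa**2)   # worst-case accuracy guarantee
      print(f"kappa = {kappa:.3f}, worst-case accuracy >= {alpha:.3f}")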

  6. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  7. Students' Understanding of Conditional Probability on Entering University

    Science.gov (United States)

    Reaburn, Robyn

    2013-01-01

    An understanding of conditional probability is essential for students of inferential statistics as it is used in Null Hypothesis Tests. Conditional probability is also used in Bayes' theorem, in the interpretation of medical screening tests and in quality control procedures. This study examines the understanding of conditional probability of…
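
    The medical-screening use of Bayes' theorem mentioned above is easy to make concrete; a sketch with illustrative numbers (none taken from the study):

      # Hedged sketch: posterior probability of disease given a positive test.
      prevalence = 0.01        # P(disease), illustrative
      sensitivity = 0.95       # P(positive | disease), illustrative
      specificity = 0.90       # P(negative | no disease), illustrative

      p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
      p_disease_given_pos = sensitivity * prevalence / p_pos
      print(f"P(disease | positive) = {p_disease_given_pos:.3f}")  # ~0.088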

  8. Contingency Management Requirements Document: Preliminary Version. Revision F

    Science.gov (United States)

    2005-01-01

    This is the High Altitude, Long Endurance (HALE) Remotely Operated Aircraft (ROA) Contingency Management (CM) Functional Requirements document. This document applies to HALE ROA operating within the National Airspace System (NAS) limited at this time to enroute operations above 43,000 feet (defined as Step 1 of the Access 5 project, sponsored by the National Aeronautics and Space Administration). A contingency is an unforeseen event requiring a response. The unforeseen event may be an emergency, an incident, a deviation, or an observation. Contingency Management (CM) is the process of evaluating the event, deciding on the proper course of action (a plan), and successfully executing the plan.

  9. Application of the IPEBS method to dynamic contingency analysis

    Energy Technology Data Exchange (ETDEWEB)

    Martins, A.C.B. [FURNAS, Rio de Janeiro, RJ (Brazil); Pedroso, A.S. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil)

    1994-12-31

    Dynamic contingency analysis is certainly a demanding task in the context of dynamic performance evaluation. This paper presents the results of a test checking the contingency-screening capability of the IPEBS method. A Brazilian 1100-bus, 112-generator system was used in the test; the ranking of the contingencies based on critical clearing times obtained with IPEBS was compared with the ranking derived from detailed time-domain simulation. The results of this comparison encourage us to recommend the use of the method in industry applications, on a complementary basis to the current method of time-domain simulation. (author) 5 refs., 1 fig., 2 tabs.

  10. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  11. Orbitofrontal cortex reflects changes in response-outcome contingencies during probabilistic reversal learning.

    Science.gov (United States)

    Amodeo, L R; McMurray, M S; Roitman, J D

    2017-03-14

    In a continuously changing environment, in which behavioral outcomes are rarely certain, animals must be able to learn to integrate feedback from their choices over time and adapt to changing reward contingencies to maintain flexible behavior. The orbitofrontal region of prefrontal cortex (OFC) has been widely implicated as playing a role in the ability to flexibly control behavior. We used a probabilistic reversal learning task to measure rats' behavioral flexibility and its neural basis in the activity of single neurons in OFC. In this task, one lever, designated as 'correct', was rewarded at a high probability (80%) and a second, spatially distinct lever, designated as 'incorrect', was rewarded at a low probability (20%). Once rats reached a learning criterion for reliably selecting the correct lever, reward contingencies of the two levers were switched, and daily sessions were conducted until rats reliably selected the new correct lever. All rats performed the initial Acquisition and subsequent Reversal successfully, with more sessions needed to learn the Reversal. OFC neurons were recorded during five behavioral sessions spanning Acquisition and Reversal learning. The dominant pattern of neural responding in OFC, identified by principal component analysis of the population of neurons recorded, was modulated by reward outcome across behavioral sessions. Generally, activity was higher following rewarded choices than unrewarded. However, there was a correlation between reduced responses to reward following incorrect choices and the establishment of the preference for the correct lever. These results show how signaling by individual OFC neurons may participate in the flexible adaptation of behavior under changing reward contingencies. Copyright © 2016 IBRO. Published by Elsevier Ltd. All rights reserved.
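
    A toy simulation of the task structure described above, with an illustrative delta-rule chooser standing in for the animal (the study records OFC neurons and is not committed to this learning model):

      # Hedged sketch: 80%/20% probabilistic reversal learning task.
      import numpy as np

      rng = np.random.default_rng(1)
      q = np.zeros(2)                    # value estimates for the two levers
      p_reward = np.array([0.8, 0.2])    # lever 0 starts as 'correct'
      alpha, trials = 0.1, 400

      for t in range(trials):
          if t == trials // 2:
              p_reward = p_reward[::-1]  # reversal: contingencies switch
          # epsilon-greedy choice between the two levers
          choice = int(np.argmax(q)) if rng.random() > 0.1 else int(rng.integers(2))
          reward = float(rng.random() < p_reward[choice])
          q[choice] += alpha * (reward - q[choice])   # prediction-error update

      print(q)   # after the reversal, lever 1's estimated value dominates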

  12. Climbing Mount Probable

    Science.gov (United States)

    Harper, Marc Allen

    2009-01-01

    This work attempts to explain the relationships between natural selection, information theory, and statistical inference. In particular, a geometric formulation of information theory known as information geometry and its deep connections to evolutionary game theory inform the role of natural selection in evolutionary processes. The goals of this…
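
    A concrete object from the evolutionary game theory this work draws on is the replicator dynamic, under which population shares are reweighted by relative fitness; a minimal sketch with hypothetical fitness values, not the thesis's own formulation:

      # Hedged sketch: discrete-time replicator dynamics.
      import numpy as np

      x = np.array([0.6, 0.3, 0.1])   # population shares of three types
      f = np.array([1.0, 1.2, 1.5])   # hypothetical constant fitnesses
      for _ in range(40):
          x = x * f / (x @ f)         # selection reweights shares by fitness
      print(x)                        # mass concentrates on the fittest type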

  13. Probability in Action: The Red Traffic Light

    Science.gov (United States)

    Shanks, John A.

    2007-01-01

    Emphasis on problem solving in mathematics has gained considerable attention in recent years. While statistics teaching has always been problem driven, the same cannot be said for the teaching of probability where discrete examples involving coins and playing cards are often the norm. This article describes an application of simple probability…
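
    Under the simplest uniform-arrival model (the article's treatment may be richer), the probability of meeting the red phase is just its share of the whole signal cycle:

      # Hedged sketch: chance of arriving during the red phase, assuming the
      # arrival time is uniform over the cycle (phase lengths are made up).
      red, amber, green = 30.0, 5.0, 45.0   # hypothetical phase lengths (s)
      cycle = red + amber + green
      p_red = red / cycle
      print(f"P(arrive on red) = {p_red:.3f}")   # 30/80 = 0.375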

  14. Applied probability models with optimization applications

    CERN Document Server

    Ross, Sheldon M

    1992-01-01

    Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous time optimization models, much more. Problems and references at chapter ends. "Excellent introduction." - Journal of the American Statistical Association. Bibliography. 1970 edition.

  15. Laplace's 1774 Memoir on Inverse Probability

    OpenAIRE

    Stigler, Stephen M.

    1986-01-01

    Laplace's first major article on mathematical statistics was published in 1774. It is arguably the most influential article in this field to appear before 1800, being the first widely read presentation of inverse probability and its application to both binomial and location parameter estimation. After a brief introduction, an English translation of this epochal memoir is given.

  16. A nonparametric method for predicting survival probabilities

    NARCIS (Netherlands)

    van der Klaauw, B.; Vriend, S.

    2015-01-01

    Public programs often use statistical profiling to assess the risk that applicants will become long-term dependent on the program. The literature uses linear probability models and (Cox) proportional hazard models to predict duration outcomes. These either focus on one threshold duration or impose
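
    For contrast with the parametric approaches cited, the best-known nonparametric survival estimate is Kaplan-Meier; a minimal sketch with hypothetical spell data (not necessarily the authors' method):

      # Hedged sketch: Kaplan-Meier survival probabilities.
      import numpy as np

      durations = np.array([3, 5, 5, 8, 12, 12, 15])   # hypothetical spells
      observed = np.array([1, 1, 0, 1, 1, 1, 0])       # 0 = censored

      surv = 1.0
      for t in np.unique(durations[observed == 1]):
          at_risk = np.sum(durations >= t)              # still in the program
          events = np.sum((durations == t) & (observed == 1))
          surv *= 1.0 - events / at_risk
          print(f"S({t}) = {surv:.3f}")   # probability of lasting beyond t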

  17. Five-Parameter Bivariate Probability Distribution

    Science.gov (United States)

    Tubbs, J.; Brewer, D.; Smith, O. W.

    1986-01-01

    This NASA technical memorandum presents four papers about the five-parameter bivariate gamma class of probability distributions. With some overlap of subject matter, the papers address different aspects of the theory of these distributions and their use in forming statistical models of such phenomena as wind gusts. The class provides acceptable results for defining constraints in problems of designing aircraft and spacecraft to withstand large wind-gust loads.

  18. Relative changes from prior reward contingencies can constrain brain correlates of outcome monitoring.

    Science.gov (United States)

    Mushtaq, Faisal; Stoet, Gijsbert; Bland, Amy Rachel; Schaefer, Alexandre

    2013-01-01

    It is well-known that the affective value of an environment can be relative to whether it reflects an improvement or a worsening from a previous state. A potential explanation for this phenomenon suggests that relative changes from previous reward contingencies can constrain how brain monitoring systems form predictions about future events. In support of this idea, we found that changes per se relative to previous states of learned reward contingencies modulated the Feedback-Related Negativity (FRN), a human brain potential known to index discrepancies between predictions and affective outcomes. Specifically, we observed that environments with a 50% reward probability yielded different FRN patterns according to whether they reflected an improvement or a worsening from a previous environment. Further, we also found that this pattern of results was driven mainly by variations in the amplitude of ERPs to positive outcomes. Overall, these results suggest that relative changes in reward probability from previous learned environments can constrain how neural systems of outcome monitoring formulate predictions about the likelihood of future rewards and nonrewards.

  20. Anxious individuals have difficulty learning the causal statistics of aversive environments.

    Science.gov (United States)

    Browning, Michael; Behrens, Timothy E; Jocham, Gerhard; O'Reilly, Jill X; Bishop, Sonia J

    2015-04-01

    Statistical regularities in the causal structure of the environment enable us to predict the probable outcomes of our actions. Environments differ in the extent to which action-outcome contingencies are stable or volatile. Difficulty in using this information to optimally update outcome predictions might contribute to the decision-making difficulties seen in anxiety. We tested this using an aversive learning task that manipulated environmental volatility. Human participants low in trait anxiety matched the updating of their outcome predictions to the volatility of the current environment, as predicted by a Bayesian model. Individuals with high trait anxiety showed less ability to adjust the updating of outcome expectancies between stable and volatile environments. This was linked to reduced sensitivity of the pupil dilatory response to volatility, potentially indicative of altered norepinephrinergic responsivity to changes in this aspect of environmental information.
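
    A toy illustration of the updating problem (fixed-learning-rate stand-ins, not the Bayesian model fitted in the study): a high learning rate tracks volatile contingencies better, a low one wins when contingencies are stable, so an optimal learner adapts its rate to volatility.

      # Hedged sketch: tracking error of delta-rule learners under stable
      # versus volatile outcome contingencies (all parameters illustrative).
      import numpy as np

      rng = np.random.default_rng(2)

      def mean_error(p_series, alpha):
          est, err = 0.5, 0.0
          for p in p_series:
              outcome = float(rng.random() < p)
              err += abs(est - p)                 # distance from true rate
              est += alpha * (outcome - est)      # delta-rule update
          return err / len(p_series)

      stable = np.full(400, 0.75)
      volatile = np.tile(np.repeat([0.8, 0.2], 25), 8)  # flips every 25 trials
      for alpha in (0.05, 0.4):
          print(f"alpha={alpha}: stable={mean_error(stable, alpha):.3f}, "
                f"volatile={mean_error(volatile, alpha):.3f}")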