WorldWideScience

Sample records for contingent probability statistics

  1. Comparison of accuracy in predicting emotional instability from MMPI data: Fisherian versus contingent probability statistics

    Energy Technology Data Exchange (ETDEWEB)

    Berghausen, P.E. Jr.; Mathews, T.W.

    1987-01-01

    The security plans of nuclear power plants generally require that all personnel who are to have access to protected areas or vital islands be screened for emotional stability. In virtually all instances, the screening involves the administration of one or more psychological tests, usually including the Minnesota Multiphasic Personality Inventory (MMPI). At some plants, all employees receive a structured clinical interview after they have taken the MMPI and results have been obtained. At other plants, only those employees with dirty MMPI results are interviewed. This latter protocol is referred to as interviews by exception. Behaviordyne Psychological Corp. has succeeded in removing some of the uncertainty associated with interview-by-exception protocols by developing an empirically based, predictive equation. This equation permits utility companies to make informed choices regarding the risks they are assuming. A conceptual problem exists with the predictive equation, however. Like most predictive equations currently in use, it is based on Fisherian statistics, involving least-squares analyses. Consequently, Behaviordyne Psychological Corp., in conjunction with T.W. Mathews and Associates, has just developed a second predictive equation, one based on contingent probability statistics. The particular technique used is the multi-contingent analysis of probability systems (MAPS) approach. The present paper presents a comparison of the predictive accuracy of the two equations: the one derived using Fisherian techniques versus the one using contingent probability techniques.
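
    As a rough illustration of the two approaches being compared (the MAPS technique itself is not specified in the abstract, so a simple contingency-table estimate stands in for the contingent side; all data, thresholds, and bins below are invented):

```python
# A rough, self-contained sketch (not the MAPS method): contrast a
# least-squares (Fisherian) linear predictor with a direct
# contingent-probability estimate from a contingency table.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical screening data: one MMPI-derived score per past employee and a
# binary "emotionally unstable" criterion.
n = 500
score = rng.normal(50, 10, n)                    # invented T-scores
p_true = 1 / (1 + np.exp(-(score - 65) / 4))     # invented true relation
unstable = rng.random(n) < p_true

# Fisherian route: least-squares regression of the 0/1 criterion on the score.
X = np.column_stack([np.ones(n), score])
beta, *_ = np.linalg.lstsq(X, unstable.astype(float), rcond=None)

# Contingent-probability route: estimate P(unstable | score bin) directly.
bins = np.array([0.0, 55.0, 65.0, 120.0])
labels = np.digitize(score, bins)
p_table = {b: unstable[labels == b].mean() for b in np.unique(labels)}

new_score = 70.0
print("least-squares prediction:", beta[0] + beta[1] * new_score)
print("contingent probability:  ", p_table[int(np.digitize(new_score, bins))])
```

    Note that the least-squares predictor can return values outside [0, 1] for a binary criterion, which is one commonly cited conceptual weakness of the Fisherian route.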

  2. Comparison of accuracy in predicting emotional instability from MMPI data: Fisherian versus contingent probability statistics

    International Nuclear Information System (INIS)

    Berghausen, P.E. Jr.; Mathews, T.W.

    1987-01-01

    The security plans of nuclear power plants generally require that all personnel who are to have access to protected areas or vital islands be screened for emotional stability. In virtually all instances, the screening involves the administration of one or more psychological tests, usually including the Minnesota Multiphasic Personality Inventory (MMPI). At some plants, all employees receive a structured clinical interview after they have taken the MMPI and results have been obtained. At other plants, only those employees with dirty MMPI results are interviewed. This latter protocol is referred to as interviews by exception. Behaviordyne Psychological Corp. has succeeded in removing some of the uncertainty associated with interview-by-exception protocols by developing an empirically based, predictive equation. This equation permits utility companies to make informed choices regarding the risks they are assuming. A conceptual problem exists with the predictive equation, however. Like most predictive equations currently in use, it is based on Fisherian statistics, involving least-squares analyses. Consequently, Behaviordyne Psychological Corp., in conjunction with T.W. Mathews and Associates, has just developed a second predictive equation, one based on contingent probability statistics. The particular technique used is the multi-contingent analysis of probability systems (MAPS) approach. The present paper presents a comparison of the predictive accuracy of the two equations: the one derived using Fisherian techniques versus the one using contingent probability techniques.

  3. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  4. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers were selected especially for their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability, across psychological aspects of formulating subjective probability statements and abstract measure-theoretical considerations, to contributions to theoretical statistics an...

  5. Sampling, Probability Models and Statistical Reasoning: Statistical Inference

    Indian Academy of Sciences (India)

    Mohan Delampady, V R Padmawar. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49-58.

  6. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  7. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  8. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research, and how their work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  9. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  10. Probability and statistics: A reminder

    International Nuclear Information System (INIS)

    Clement, B.

    2013-01-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from 'data analysis in experimental sciences' given in [1]. (authors)

  11. Statistical probability tables CALENDF program

    International Nuclear Information System (INIS)

    Ribon, P.

    1989-01-01

    The purpose of the probability tables is twofold: to obtain a dense data representation and to calculate integrals by quadratures. They are mainly used in the USA for calculations by Monte Carlo and in the USSR and Europe for self-shielding calculations by the sub-group method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity.

  12. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  13. Simplified Freeman-Tukey test statistics for testing probabilities in ...

    African Journals Online (AJOL)

    This paper presents the simplified version of the Freeman-Tukey test statistic for testing hypothesis about multinomial probabilities in one, two and multidimensional contingency tables that does not require calculating the expected cell frequencies before test of significance. The simplified method established new criteria of ...
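
    For reference, the classical Freeman-Tukey goodness-of-fit statistic that the paper simplifies is commonly written in terms of observed counts O_i and expected counts E_i as:

```latex
T = \sum_{i} \left( \sqrt{O_i} + \sqrt{O_i + 1} - \sqrt{4 E_i + 1} \right)^{2}
```

    Under the null hypothesis T is approximately chi-square distributed; the simplified version described above avoids computing the expected cell frequencies E_i before the test of significance.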

  14. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  15. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.
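
    A minimal numerical rendering of the two directions described above (the loaded-die weights are invented for illustration):

```python
# Forward (probability) and inverse (statistics) dice problems.
import numpy as np

fair = np.full(6, 1 / 6)
loaded = np.array([0.1, 0.1, 0.1, 0.1, 0.1, 0.5])     # hypothetical bias

# Forward problem: P(two fair dice sum to 7); faces are indexed 0-5 for 1-6.
p7 = sum(fair[a] * fair[5 - a] for a in range(6))
print("P(sum = 7) =", p7)                              # 6/36 = 1/6

# Inverse problem: which die likely produced the observed rolls?
rng = np.random.default_rng(1)
rolls = rng.choice(6, size=50, p=loaded)               # data from the loaded die
log_lr = np.log(fair[rolls]).sum() - np.log(loaded[rolls]).sum()
print("log likelihood ratio (fair vs loaded):", round(log_lr, 1))
```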

  16. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  17. Selected papers on probability and statistics

    CERN Document Server

    2009-01-01

    This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.

  18. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  19. Model selection for contingency tables with algebraic statistics

    NARCIS (Netherlands)

    Krampe, A.; Kuhnt, S.; Gibilisco, P.; Riccimagno, E.; Rogantin, M.P.; Wynn, H.P.

    2009-01-01

    Goodness-of-fit tests based on chi-square approximations are commonly used in the analysis of contingency tables. Results from algebraic statistics combined with MCMC methods provide alternatives to the chi-square approximation. However, within a model selection procedure usually a large number of

  20. Disruptive Effects of Contingent Food on High-Probability Behavior

    Science.gov (United States)

    Frank-Crawford, Michelle A.; Borrero, John C.; Nguyen, Linda; Leon-Enriquez, Yanerys; Carreau-Webster, Abbey B.; DeLeon, Iser G.

    2012-01-01

    The delivery of food contingent on 10 s of consecutive toy engagement resulted in a decrease in engagement and a corresponding increase in other responses that had been previously reinforced with food. Similar effects were not observed when tokens exchangeable for the same food were delivered, suggesting that engagement was disrupted by the…

  1. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  2. Contingency bias in probability judgement may arise from ambiguity regarding additional causes.

    Science.gov (United States)

    Mitchell, Chris J; Griffiths, Oren; More, Pranjal; Lovibond, Peter F

    2013-09-01

    In laboratory contingency learning tasks, people usually give accurate estimates of the degree of contingency between a cue and an outcome. However, if they are asked to estimate the probability of the outcome in the presence of the cue, they tend to be biased by the probability of the outcome in the absence of the cue. This bias is often attributed to an automatic contingency detection mechanism, which is said to act via an excitatory associative link to activate the outcome representation at the time of testing. We conducted 3 experiments to test alternative accounts of contingency bias. Participants were exposed to the same outcome probability in the presence of the cue, but different outcome probabilities in the absence of the cue. Phrasing the test question in terms of frequency rather than probability and clarifying the test instructions reduced but did not eliminate contingency bias. However, removal of ambiguity regarding the presence of additional causes during the test phase did eliminate contingency bias. We conclude that contingency bias may be due to ambiguity in the test question, and therefore it does not require postulation of a separate associative link-based mechanism.
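
    In the standard notation of this literature (not spelled out in the abstract), the cue-outcome contingency that participants are asked to judge is

```latex
\Delta P = P(O \mid C) - P(O \mid \lnot C)
```

    and the bias consists of estimates of P(O | C) being pulled toward the outcome's probability in the absence of the cue, P(O | ¬C).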

  3. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives. Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: how travel time is affected by congestion, driving speed, and traffic lights; why different gambling ...

  4. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698). * Incorporates more than 1,000 engaging problems with answers * Includes more than 300 solved examples * Uses varied problem solving methods

  5. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

    Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results". Special emphases on simulation and discrete decision theory. Mathematically-rich, but self-contained text, at a gentle pace. Review of calculus and linear algebra in an appendix. Mathematical interludes (in each chapter) which examine mathematical techniques in the context of probabilistic or statistical importance. Numerous section exercises, summaries, historical notes, and Further Readings for reinforcem

  6. Uncertainty the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...

  7. Marrakesh International Conference on Probability and Statistics

    CERN Document Server

    Ouassou, Idir; Rachdi, Mustapha

    2015-01-01

    This volume, which highlights recent advances in statistical methodology and applications, is divided into two main parts. The first part presents theoretical results on estimation techniques in functional statistics, while the second examines three key areas of application: estimation problems in queuing theory, an application in signal processing, and the copula approach to epidemiologic modelling. The book’s peer-reviewed contributions are based on papers originally presented at the Marrakesh International Conference on Probability and Statistics held in December 2013.

  8. Statistical Efficiency of Double-Bounded Dichotomous Choice Contingent Valuation

    OpenAIRE

    Michael Hanemann; John Loomis; Barbara Kanninen

    1991-01-01

    The statistical efficiency of conventional dichotomous choice contingent valuation surveys can be improved by asking each respondent a second dichotomous choice question which depends on the response to the first question—if the first response is "yes," the second bid is some amount greater than the first bid; while, if the first response is "no," the second bid is some amount smaller. This "double-bounded" approach is shown to be asymptotically more efficient than the conventional, "singlebo...
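
    In the usual formulation of this design, with F the assumed distribution function of willingness to pay, first bid B and follow-up bids B^H > B > B^L, the four possible response pairs have probabilities:

```latex
\begin{aligned}
P(\text{yes, yes}) &= 1 - F(B^{H}), & P(\text{yes, no}) &= F(B^{H}) - F(B),\\
P(\text{no, yes})  &= F(B) - F(B^{L}), & P(\text{no, no}) &= F(B^{L}).
\end{aligned}
```

    Each response pair thus brackets willingness to pay within a narrower interval than the single question does, which is where the efficiency gain comes from.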

  9. Statistics with JMP graphs, descriptive statistics and probability

    CERN Document Server

    Goos, Peter

    2015-01-01

    Peter Goos, Department of Statistics, University of Leuven, Faculty of Bio-Science Engineering and University of Antwerp, Faculty of Applied Economics, Belgium. David Meintrup, Department of Mathematics and Statistics, University of Applied Sciences Ingolstadt, Faculty of Mechanical Engineering, Germany. Thorough presentation of introductory statistics and probability theory, with numerous examples and applications using JMP. Descriptive Statistics and Probability provides an accessible and thorough overview of the most important descriptive statistics for nominal, ordinal and quantitative data with partic

  10. Statistical models based on conditional probability distributions

    International Nuclear Information System (INIS)

    Narayanan, R.S.

    1991-10-01

    We present a formulation of statistical mechanics models based on conditional probability distribution rather than a Hamiltonian. We show that it is possible to realize critical phenomena through this procedure. Closely linked with this formulation is a Monte Carlo algorithm, in which a configuration generated is guaranteed to be statistically independent from any other configuration for all values of the parameters, in particular near the critical point. (orig.)

  11. Probability and statistics in particle physics

    International Nuclear Information System (INIS)

    Frodesen, A.G.; Skjeggestad, O.

    1979-01-01

    Probability theory is entered into at an elementary level and given a simple and detailed exposition. The material on statistics has been organised with an eye to the experimental physicist's practical need, which is likely to be statistical methods for estimation or decision-making. The book is intended for graduate students and research workers in experimental high energy and elementary particle physics, and numerous examples from these fields are presented. (JIW)

  12. Probability & Statistics: Modular Learning Exercises. Teacher Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…

  13. Probability & Statistics: Modular Learning Exercises. Student Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…

  14. Pointwise probability reinforcements for robust statistical inference.

    Science.gov (United States)

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include, e.g., outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR, and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation.
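
    A minimal sketch of the reinforcement idea as described in the abstract, under our own illustrative assumptions (a Gaussian linear model, an L1 penalty on the reinforcements, and invented data); the paper's exact objective may differ:

```python
# Each observation's likelihood is "reinforced" by a non-negative slack r_i,
# and an L1 penalty controls the total reinforcement, so outliers can buy
# slack instead of dragging the fit. Illustrative assumptions throughout.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 40)
y = 2.0 * x + 0.5 + rng.normal(0, 0.1, x.size)
y[[5, 25]] += 3.0                       # inject two outliers

lam = 5.0                               # reinforcement penalty weight

def neg_objective(params):
    w, b, log_s = params[:3]
    r = params[3:]                      # one reinforcement per observation
    p = norm.pdf(y, w * x + b, np.exp(log_s))
    return -(np.sum(np.log(p + r)) - lam * np.sum(r))

p0 = np.concatenate([[0.0, 0.0, 0.0], np.zeros(x.size)])
bounds = [(None, None)] * 3 + [(0.0, None)] * x.size
fit = minimize(neg_objective, p0, bounds=bounds, method="L-BFGS-B")
w, b = fit.x[:2]
print(f"slope {w:.2f}, intercept {b:.2f}")       # outliers mostly ignored
print("largest reinforcements at:", np.argsort(fit.x[3:])[-2:])
```

    The reinforcements themselves serve as the per-observation abnormality degrees mentioned in the abstract.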

  15. Python for probability, statistics, and machine learning

    CERN Document Server

    Unpingco, José

    2016-01-01

    This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowl...

  16. Geometric modeling in probability and statistics

    CERN Document Server

    Calin, Ovidiu

    2014-01-01

    This book covers topics of Informational Geometry, a field which deals with the differential geometric study of the manifold of probability density functions. This is a field that is increasingly attracting the interest of researchers from many different areas of science, including mathematics, statistics, geometry, computer science, signal processing, physics and neuroscience. It is the authors’ hope that the present book will be a valuable reference for researchers and graduate students in one of the aforementioned fields. This textbook is a unified presentation of differential geometry and probability theory, and constitutes a text for a course directed at graduate or advanced undergraduate students interested in applications of differential geometry in probability and statistics. The book contains over 100 proposed exercises meant to help students deepen their understanding, and it is accompanied by software that is able to provide numerical computations of several information geometric objects. The reader...

  17. Probability and logical structure of statistical theories

    International Nuclear Information System (INIS)

    Hall, M.J.W.

    1988-01-01

    A characterization of statistical theories is given which incorporates both classical and quantum mechanics. It is shown that each statistical theory induces an associated logic and joint probability structure, and simple conditions are given for the structure to be of a classical or quantum type. This provides an alternative for the quantum logic approach to axiomatic quantum mechanics. The Bell inequalities may be derived for those statistical theories that have a classical structure and satisfy a locality condition weaker than factorizability. The relation of these inequalities to the issue of hidden variable theories for quantum mechanics is discussed and clarified

  18. Statistical physics of pairwise probability models

    DEFF Research Database (Denmark)

    Roudi, Yasser; Aurell, Erik; Hertz, John

    2009-01-01

    (No Danish abstract available.) Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the means and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying...

  19. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in the new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  20. Probability and Statistics in Aerospace Engineering

    Science.gov (United States)

    Rheinfurth, M. H.; Howell, L. W.

    1998-01-01

    This monograph was prepared to give the practicing engineer a clear understanding of probability and statistics with special consideration to problems frequently encountered in aerospace engineering. It is conceived to be both a desktop reference and a refresher for aerospace engineers in government and industry. It could also be used as a supplement to standard texts for in-house training courses on the subject.

  1. Selected papers on analysis, probability, and statistics

    CERN Document Server

    Nomizu, Katsumi

    1994-01-01

    This book presents papers that originally appeared in the Japanese journal Sugaku. The papers fall into the general area of mathematical analysis as it pertains to probability and statistics, dynamical systems, differential equations and analytic function theory. Among the topics discussed are: stochastic differential equations, spectra of the Laplacian and Schrödinger operators, nonlinear partial differential equations which generate dissipative dynamical systems, fractal analysis on self-similar sets and the global structure of analytic functions.

  2. Stochastics introduction to probability and statistics

    CERN Document Server

    Georgii, Hans-Otto

    2012-01-01

    This second revised and extended edition presents the fundamental ideas and results of both probability theory and statistics, and comprises the material of a one-year course. It is addressed to students with an interest in the mathematical side of stochastics. Stochastic concepts, models and methods are motivated by examples and developed and analysed systematically. Some measure theory is included, but this is done at an elementary level that is in accordance with the introductory character of the book. A large number of problems offer applications and supplements to the text.

  3. Statistical physics of pairwise probability models

    Directory of Open Access Journals (Sweden)

    Yasser Roudi

    2009-11-01

    Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the means and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying and using pairwise models. We build on our previous work on the subject and study the relation between different methods for fitting these models and evaluating their quality. In particular, using data from simulated cortical networks we study how the quality of various approximate methods for inferring the parameters in a pairwise model depends on the time bin chosen for binning the data. We also study the effect of the size of the time bin on the model quality itself, again using simulated data. We show that using finer time bins increases the quality of the pairwise model. We offer new ways of deriving the expressions reported in our previous work for assessing the quality of pairwise models.
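
    A minimal sketch of the kind of pairwise model discussed here, fit by brute-force gradient ascent on a deliberately tiny system (the statistical-physics approximations studied in the paper replace this exact enumeration for realistic sizes):

```python
# Fit a pairwise maximum-entropy (Ising-like) model
# P(s) ~ exp(sum_i h_i s_i + sum_{i<j} J_ij s_i s_j) to binary data by
# matching means and pairwise correlations. Exact enumeration: 2^5 states.
import itertools
import numpy as np

rng = np.random.default_rng(2)
n = 5
data = (rng.random((2000, n)) < 0.3).astype(float) * 2 - 1  # spins in {-1,+1}

states = np.array(list(itertools.product([-1, 1], repeat=n)), dtype=float)

def model_stats(h, J):
    E = states @ h + 0.5 * np.einsum('si,ij,sj->s', states, J, states)
    p = np.exp(E - E.max()); p /= p.sum()
    m = p @ states                                   # model means <s_i>
    C = np.einsum('s,si,sj->ij', p, states, states)  # model correlations
    return m, C

m_data = data.mean(0)
C_data = data.T @ data / len(data)

h = np.zeros(n); J = np.zeros((n, n))
for _ in range(2000):                                # gradient ascent
    m, C = model_stats(h, J)
    h += 0.1 * (m_data - m)
    dJ = 0.1 * (C_data - C); np.fill_diagonal(dJ, 0)
    J += dJ
print("max moment mismatch:", np.abs(m_data - model_stats(h, J)[0]).max())
```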

  4. Probability, statistics, and associated computing techniques

    International Nuclear Information System (INIS)

    James, F.

    1983-01-01

    This chapter attempts to explore the extent to which it is possible for the experimental physicist to find optimal statistical techniques to provide a unique and unambiguous quantitative measure of the significance of raw data. Discusses statistics as the inverse of probability; normal theory of parameter estimation; normal theory (Gaussian measurements); the universality of the Gaussian distribution; real-life resolution functions; combination and propagation of uncertainties; the sum or difference of 2 variables; local theory, or the propagation of small errors; error on the ratio of 2 discrete variables; the propagation of large errors; confidence intervals; classical theory; Bayesian theory; use of the likelihood function; the second derivative of the log-likelihood function; multiparameter confidence intervals; the method of MINOS; least squares; the Gauss-Markov theorem; maximum likelihood for uniform error distribution; the Chebyshev fit; the parameter uncertainties; the efficiency of the Chebyshev estimator; error symmetrization; robustness vs. efficiency; testing of hypotheses (e.g., the Neyman-Pearson test); goodness-of-fit; distribution-free tests; comparing two one-dimensional distributions; comparing multidimensional distributions; and permutation tests for comparing two point sets
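
    As one concrete entry from this list, the error on the ratio of two variables follows, to first order in the propagation of small errors, from

```latex
r = \frac{x}{y}, \qquad
\left(\frac{\sigma_r}{r}\right)^{2} \approx
\left(\frac{\sigma_x}{x}\right)^{2} + \left(\frac{\sigma_y}{y}\right)^{2}
- 2\,\frac{\operatorname{cov}(x, y)}{x\,y}
```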

  5. Lectures on probability and statistics. Revision

    International Nuclear Information System (INIS)

    Yost, G.P.

    1985-06-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. They begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. They finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another. Hopefully, the reader will come away from these notes with a feel for some of the problems and uncertainties involved. Although there are standard approaches, most of the time there is no cut-and-dried "best" solution - "best" according to every criterion.

  6. A Multidisciplinary Approach for Teaching Statistics and Probability

    Science.gov (United States)

    Rao, C. Radhakrishna

    1971-01-01

    The author presents a syllabus for an introductory (first year after high school) course in statistics and probability and some methods of teaching statistical techniques. The description comes basically from the procedures used at the Indian Statistical Institute, Calcutta. (JG)

  7. Sampling, Probability Models and Statistical Reasoning -RE ...

    Indian Academy of Sciences (India)

    random sampling allows data to be modelled with the help of probability ... based on different trials to get an estimate of the experimental error ... if e is indeed the true value of the proportion of defectives in the ...

  8. Probability and statistics for particle physics

    CERN Document Server

    Mana, Carlos

    2017-01-01

    This book comprehensively presents the basic concepts of probability and Bayesian inference with sufficient generality to make them applicable to current problems in scientific research. The first chapter provides the fundamentals of probability theory that are essential for the analysis of random phenomena. The second chapter includes a full and pragmatic review of the Bayesian methods that constitute a natural and coherent framework with enough freedom to analyze all the information available from experimental data in a conceptually simple manner. The third chapter presents the basic Monte Carlo techniques used in scientific research, which allow a large variety of problems that are difficult to tackle by other procedures to be handled. The author also introduces a basic algorithm, which enables readers to simulate samples from simple distributions, and describes useful cases for researchers in particle physics. The final chapter is devoted to the basic ideas of Information Theory, which are important in the Bayesian me...
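
    A minimal example of the sort of basic simulation algorithm the description refers to; the specific choice of inverse-transform sampling from an exponential distribution is ours:

```python
# Inverse-transform sampling: turn uniform random numbers into samples from a
# simple distribution via its inverse CDF.
import numpy as np

rng = np.random.default_rng(3)
lam = 2.0
u = rng.random(100_000)              # U(0,1) draws
samples = -np.log(1 - u) / lam       # inverse CDF of Exponential(lam)

print("sample mean:", samples.mean())    # should be near 1/lam = 0.5
```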

  9. An introduction to probability and statistical inference

    CERN Document Server

    Roussas, George G

    2003-01-01

    "The text is wonderfully written and has the mostcomprehensive range of exercise problems that I have ever seen." - Tapas K. Das, University of South Florida"The exposition is great; a mixture between conversational tones and formal mathematics; the appropriate combination for a math text at [this] level. In my examination I could find no instance where I could improve the book." - H. Pat Goeters, Auburn, University, Alabama* Contains more than 200 illustrative examples discussed in detail, plus scores of numerical examples and applications* Chapters 1-8 can be used independently for an introductory course in probability* Provides a substantial number of proofs

  10. Probability and Statistics The Science of Uncertainty (Revised Edition)

    CERN Document Server

    Tabak, John

    2011-01-01

    Probability and Statistics, Revised Edition deals with the history of probability, describing the modern concept of randomness and examining "pre-probabilistic" ideas of what most people today would characterize as randomness. This revised book documents some historically important early uses of probability to illustrate some very important probabilistic questions. It goes on to explore statistics and the generations of mathematicians and non-mathematicians who began to address problems in statistical analysis, including the statistical structure of data sets as well as the theory of

  11. Statistics and probability with applications for engineers and scientists

    CERN Document Server

    Gupta, Bhisham C

    2013-01-01

    Introducing the tools of statistics and probability from the ground up An understanding of statistical tools is essential for engineers and scientists who often need to deal with data analysis over the course of their work. Statistics and Probability with Applications for Engineers and Scientists walks readers through a wide range of popular statistical techniques, explaining step-by-step how to generate, analyze, and interpret data for diverse applications in engineering and the natural sciences. Unique among books of this kind, Statistics and Prob

  12. Introduction to probability and statistics for engineers and scientists

    CERN Document Server

    Ross, Sheldon M

    2009-01-01

    This updated text provides a superior introduction to applied probability and statistics for engineering or science majors. Ross emphasizes the manner in which probability yields insight into statistical problems, ultimately resulting in an intuitive understanding of the statistical procedures most often used by practicing engineers and scientists. Real data sets are incorporated in a wide variety of exercises and examples throughout the book, and this emphasis on data motivates the probability coverage. As with the previous editions, Ross' text has tremendously clear exposition, plus real-data

  13. Statistical validation of normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; van t Veld, Aart; Langendijk, Johannes A.; Schilstra, Cornelis

    2012-01-01

    PURPOSE: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: A penalized regression method, LASSO (least absolute shrinkage

  14. Models for probability and statistical inference theory and applications

    CERN Document Server

    Stapleton, James H

    2007-01-01

    This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readersModels for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping.Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...

  15. Introduction to probability and statistics for science, engineering, and finance

    CERN Document Server

    Rosenkrantz, Walter A

    2008-01-01

    Data Analysis: Orientation; The Role and Scope of Statistics in Science and Engineering; Types of Data: Examples from Engineering, Public Health, and Finance; The Frequency Distribution of a Variable Defined on a Population; Quantiles of a Distribution; Measures of Location (Central Value) and Variability; Covariance, Correlation, and Regression: Computing a Stock's Beta; Mathematical Details and Derivations; Large Data Sets. Probability Theory: Orientation; Sample Space, Events, Axioms of Probability Theory; Mathematical Models of Random Sampling; Conditional Probability and Baye

  16. Theory of overdispersion in counting statistics caused by fluctuating probabilities

    International Nuclear Information System (INIS)

    Semkow, Thomas M.

    1999-01-01

    It is shown that the random Lexis fluctuations of probabilities such as probability of decay or detection cause the counting statistics to be overdispersed with respect to the classical binomial, Poisson, or Gaussian distributions. The generating and the distribution functions for the overdispersed counting statistics are derived. Applications to radioactive decay with detection and more complex experiments are given, as well as distinguishing between the source and background in the presence of overdispersion. Monte-Carlo verifications are provided.
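
    A minimal Monte Carlo illustration of the mechanism (the gamma mixing distribution is a standard choice for fluctuating rates, not necessarily the paper's):

```python
# A fixed-rate Poisson has variance equal to its mean, while a Poisson whose
# rate fluctuates from trial to trial (gamma-distributed here, giving a
# negative binomial mixture) is overdispersed. Numbers are invented.
import numpy as np

rng = np.random.default_rng(4)
trials = 100_000
mean_rate = 20.0
shape = 10.0                                     # fluctuation strength

fixed = rng.poisson(mean_rate, trials)           # classical counting statistics
rates = rng.gamma(shape, mean_rate / shape, trials)
mixed = rng.poisson(rates)                       # fluctuating decay/detection rate

print(f"fixed rate:       mean {fixed.mean():.2f}  variance {fixed.var():.2f}")
print(f"fluctuating rate: mean {mixed.mean():.2f}  variance {mixed.var():.2f}")
# expected variance for the mixture: mean + mean**2 / shape = 60
```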

  17. Visualizing and Understanding Probability and Statistics: Graphical Simulations Using Excel

    Science.gov (United States)

    Gordon, Sheldon P.; Gordon, Florence S.

    2009-01-01

    The authors describe a collection of dynamic interactive simulations for teaching and learning most of the important ideas and techniques of introductory statistics and probability. The modules cover such topics as randomness, simulations of probability experiments such as coin flipping, dice rolling and general binomial experiments, a simulation…

  18. Teaching Basic Probability in Undergraduate Statistics or Management Science Courses

    Science.gov (United States)

    Naidu, Jaideep T.; Sanford, John F.

    2017-01-01

    Standard textbooks in core Statistics and Management Science classes present various examples to introduce basic probability concepts to undergraduate business students. These include tossing of a coin, throwing a die, and examples of that nature. While these are good examples to introduce basic probability, we use improvised versions of Russian…

  19. Constructing diagnostic likelihood: clinical decisions using subjective versus statistical probability.

    Science.gov (United States)

    Kinnear, John; Jackson, Ruth

    2017-07-01

    Although physicians are highly trained in the application of evidence-based medicine, and are assumed to make rational decisions, there is evidence that their decision making is prone to biases. One of the biases that has been shown to affect accuracy of judgements is that of representativeness and base-rate neglect, where the saliency of a person's features leads to overestimation of their likelihood of belonging to a group. This results in the substitution of 'subjective' probability for statistical probability. This study examines clinicians' propensity to make estimations of subjective probability when presented with clinical information that is considered typical of a medical condition. The strength of the representativeness bias is tested by presenting choices in textual and graphic form. Understanding of statistical probability is also tested by omitting all clinical information. For the questions that included clinical information, 46.7% and 45.5% of clinicians made judgements of statistical probability, respectively. Where the question omitted clinical information, 79.9% of clinicians made a judgement consistent with statistical probability. There was a statistically significant difference in responses to the questions with and without representativeness information (χ² (1, n=254) = 54.45, p < 0.001). One of the causes for this representativeness bias may be the way clinical medicine is taught, where stereotypic presentations are emphasised in diagnostic decision making.
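
    A minimal numerical contrast between the two probabilities at issue, with invented numbers: even a highly 'typical' presentation leaves the statistical (posterior) probability low when the condition is rare.

```python
# Bayes' theorem applied to a salient but nonspecific clinical picture.
# All numbers are hypothetical.
prevalence = 0.01                   # base rate of the condition
p_typical_given_disease = 0.90      # "typical" picture, given disease
p_typical_given_healthy = 0.20      # the picture is common in others, too

p_typical = (p_typical_given_disease * prevalence
             + p_typical_given_healthy * (1 - prevalence))
posterior = p_typical_given_disease * prevalence / p_typical
print(f"P(disease | typical presentation) = {posterior:.3f}")   # ~0.043
```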

  20. Decision-making in the probability and statistics Chilean curriculum

    DEFF Research Database (Denmark)

    Elicer, Raimundo

    2018-01-01

    Probability and statistics have become prominent subjects in school mathematics curricula. As an exemplary case, I investigate the role of decision making in the justification for probability and statistics in the current Chilean upper secondary mathematics curriculum. For addressing this concern, I draw upon Fairclough’s model for Critical Discourse Analysis to analyse selected texts as examples of discourse practices. The texts are interconnected with politically driven ideas of stochastics “for all”, the notion of statistical literacy coined by statisticians’ communities, schooling...

  1. Statistics and Probability Theory In Pursuit of Engineering Decision Support

    CERN Document Server

    Faber, Michael Havbro

    2012-01-01

    This book provides the reader with the basic skills and tools of statistics and probability in the context of engineering modeling and analysis. The emphasis is on the application and the reasoning behind the application of these skills and tools for the purpose of enhancing decision making in engineering. The purpose of the book is to ensure that the reader will acquire the required theoretical basis and technical skills so as to feel comfortable with the theory of basic statistics and probability. Moreover, in this book, as opposed to many standard books on the same subject, the perspective is to focus on the use of the theory for the purpose of engineering model building and decision making. This work is suitable for readers with little or no prior knowledge on the subject of statistics and probability.

  2. Elements of probability and statistics an introduction to probability with De Finetti’s approach and to Bayesian statistics

    CERN Document Server

    Biagini, Francesca

    2016-01-01

    This book provides an introduction to elementary probability and to Bayesian statistics using de Finetti's subjectivist approach. One of the features of this approach is that it does not require the introduction of sample space – a non-intrinsic concept that makes the treatment of elementary probability unnecessarily complicated – but introduces as fundamental the concept of random numbers directly related to their interpretation in applications. Events become a particular case of random numbers and probability a particular case of expectation when it is applied to events. The subjective evaluation of expectation and of conditional expectation is based on an economic choice of an acceptable bet or penalty. The properties of expectation and conditional expectation are derived by applying a coherence criterion that the evaluation has to follow. The book is suitable for all introductory courses in probability and statistics for students in Mathematics, Informatics, Engineering, and Physics.

  3. Focus in High School Mathematics: Statistics and Probability

    Science.gov (United States)

    National Council of Teachers of Mathematics, 2009

    2009-01-01

    Reasoning about and making sense of statistics and probability are essential to students' future success. This volume belongs to a series that supports National Council of Teachers of Mathematics' (NCTM's) "Focus in High School Mathematics: Reasoning and Sense Making" by providing additional guidance for making reasoning and sense making part of…

  4. Teaching Probability with the Support of the R Statistical Software

    Science.gov (United States)

    dos Santos Ferreira, Robson; Kataoka, Verônica Yumi; Karrer, Monica

    2014-01-01

    The objective of this paper is to discuss aspects of high school students' learning of probability in a context where they are supported by the statistical software R. We report on the application of a teaching experiment, constructed using the perspective of Gal's probabilistic literacy and Papert's constructionism. The results show improvement…

  5. Statistical learning of action: the role of conditional probability.

    Science.gov (United States)

    Meyer, Meredith; Baldwin, Dare

    2011-12-01

    Identification of distinct units within a continuous flow of human action is fundamental to action processing. Such segmentation may rest in part on statistical learning. In a series of four experiments, we examined what types of statistics people can use to segment a continuous stream involving many brief, goal-directed action elements. The results of Experiment 1 showed no evidence for sensitivity to conditional probability, whereas Experiment 2 displayed learning based on joint probability. In Experiment 3, we demonstrated that additional exposure to the input failed to engender sensitivity to conditional probability. However, the results of Experiment 4 showed that a subset of adults-namely, those more successful at identifying actions that had been seen more frequently than comparison sequences-were also successful at learning conditional-probability statistics. These experiments help to clarify the mechanisms subserving processing of intentional action, and they highlight important differences from, as well as similarities to, prior studies of statistical learning in other domains, including language.
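
    A minimal sketch of the two statistics contrasted above, computed over adjacent pairs of a made-up symbol stream standing in for brief action elements:

```python
# Joint probability counts how often a pair occurs in the stream; conditional
# probability normalizes by how often the first element occurs at all.
from collections import Counter

stream = list("abxabyabxcdzcd")          # hypothetical action-element stream
pairs = list(zip(stream, stream[1:]))

pair_counts = Counter(pairs)
first_counts = Counter(stream[:-1])

for pair in [("a", "b"), ("c", "d")]:
    joint = pair_counts[pair] / len(pairs)
    conditional = pair_counts[pair] / first_counts[pair[0]]
    print(pair, f"joint={joint:.2f}", f"conditional={conditional:.2f}")
```

    In this toy stream the pairs ab and cd have the same conditional probability but different joint probabilities, which is exactly the distinction the experiments probe.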

  6. A Brief Look at the History of Probability and Statistics.

    Science.gov (United States)

    Lightner, James E.

    1991-01-01

    The historical development of probability theory is traced from its early origins in games of chance through its mathematical foundations in the work of Pascal and Fermat. The roots of statistics are also presented beginning with early actuarial developments through the work of Laplace, Gauss, and others. (MDH)

  7. Statistical complexity without explicit reference to underlying probabilities

    Science.gov (United States)

    Pennini, F.; Plastino, A.

    2018-06-01

    We show that extremely simple systems of a not too large number of particles can be simultaneously thermally stable and complex. To such an end, we extend the notion of statistical complexity to simple configurations of non-interacting particles, without appeal to probabilities, and discuss configurational properties.

  8. Truth, possibility and probability new logical foundations of probability and statistical inference

    CERN Document Server

    Chuaqui, R

    1991-01-01

    Anyone involved in the philosophy of science is naturally drawn into the study of the foundations of probability. Different interpretations of probability, based on competing philosophical ideas, lead to different statistical techniques, and frequently to mutually contradictory consequences. This unique book presents a new interpretation of probability, rooted in the traditional interpretation that was current in the 17th and 18th centuries. Mathematical models are constructed based on this interpretation, and statistical inference and decision theory are applied, including some examples in artificial intelligence, solving the main foundational problems. Nonstandard analysis is extensively developed for the construction of the models and in some of the proofs. Many nonstandard theorems are proved, some of them new, in particular, a representation theorem that asserts that any stochastic process can be approximated by a process defined over a space with equiprobable outcomes.

  9. On the limits of statistical learning: Intertrial contextual cueing is confined to temporally close contingencies.

    Science.gov (United States)

    Thomas, Cyril; Didierjean, André; Maquestiaux, François; Goujon, Annabelle

    2018-04-12

    Since the seminal study by Chun and Jiang (Cognitive Psychology, 36, 28-71, 1998), a large body of research based on the contextual-cueing paradigm has shown that the cognitive system is capable of extracting statistical contingencies from visual environments. Most of these studies have focused on how individuals learn regularities found within an intratrial temporal window: A context predicts the target position within a given trial. However, Ono, Jiang, and Kawahara (Journal of Experimental Psychology, 31, 703-712, 2005) provided evidence of an intertrial implicit-learning effect, whereby a distractor configuration in the preceding trial N - 1 predicted the target location in trial N. The aim of the present study was to gain further insight into this effect by examining whether it occurs when predictive relationships are impeded by interfering task-relevant noise (Experiments 2 and 3) or by a long delay (Experiments 1, 4, and 5). Our results replicated the intertrial contextual-cueing effect, which occurred in the condition of temporally close contingencies. However, there was no evidence of integration across long-range spatiotemporal contingencies, suggesting a temporal limitation of statistical learning.

  10. Log-concave Probability Distributions: Theory and Statistical Testing

    DEFF Research Database (Denmark)

    An, Mark Yuing

    1996-01-01

    This paper studies the broad class of log-concave probability distributions that arise in the economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing the differentiability of density functions. Discrete and multivariate distributions are also discussed. We propose simple non-parametric testing procedures for log-concavity. The test statistics are constructed to test one of the two implications of log-concavity: increasing hazard rates and the new-is-better-than-used (NBU) property. The tests for increasing hazard rates are based on normalized spacings of the sample order statistics. The tests for the NBU property fall into the category of Hoeffding's U-statistics.
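
    A hedged sketch of the normalized-spacing idea behind the increasing-hazard-rate tests: under the constant-hazard (exponential) null the normalized spacings are i.i.d. exponential, while an increasing hazard makes later spacings stochastically smaller. The trend statistic below is illustrative and need not match the authors' exact construction.

      import numpy as np

      rng = np.random.default_rng(0)
      x = np.sort(rng.gamma(shape=3.0, scale=1.0, size=200))  # IHR example

      n = len(x)
      gaps = np.diff(np.concatenate(([0.0], x)))
      d = (n - np.arange(1, n + 1) + 1) * gaps   # normalized spacings

      # Fraction of earlier spacings exceeding later ones: ~0.5 under the
      # exponential null, systematically above 0.5 for increasing hazards.
      i, j = np.triu_indices(n, k=1)
      print((d[i] > d[j]).mean())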

  11. Joint probability of statistical success of multiple phase III trials.

    Science.gov (United States)

    Zhang, Jianliang; Zhang, Jenny J

    2013-01-01

    In drug development, after completion of phase II proof-of-concept trials, the sponsor needs to make a go/no-go decision to start expensive phase III trials. The probability of statistical success (PoSS) of the phase III trials based on data from earlier studies is an important factor in that decision-making process. Instead of statistical power, the predictive power of a phase III trial, which takes into account the uncertainty in the estimation of treatment effect from earlier studies, has been proposed to evaluate the PoSS of a single trial. However, regulatory authorities generally require statistical significance in two (or more) trials for marketing licensure. We show that the predictive statistics of two future trials are statistically correlated through use of the common observed data from earlier studies. Thus, the joint predictive power should not be evaluated as a simplistic product of the predictive powers of the individual trials. We develop the relevant formulae for the appropriate evaluation of the joint predictive power and provide numerical examples. Our methodology is further extended to the more complex phase III development scenario comprising more than two (K > 2) trials, that is, the evaluation of the PoSS of at least k₀ (k₀ ≤ K) trials from a program of K total trials. Copyright © 2013 John Wiley & Sons, Ltd.
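
    A hedged Monte Carlo sketch of the correlation argument (all numbers invented): both phase III trials condition on the same phase II estimate, so averaging over that shared uncertainty makes the joint probability of two successes exceed the naive product of the marginal predictive powers.

      import numpy as np

      rng = np.random.default_rng(1)
      theta_hat, se_phase2 = 0.30, 0.15    # phase II estimate and its SE
      n_per_arm, sigma, z_alpha = 300, 1.0, 1.96

      theta = rng.normal(theta_hat, se_phase2, size=200_000)  # shared uncertainty
      se3 = sigma * np.sqrt(2.0 / n_per_arm)                  # phase III SE
      z1 = rng.normal(theta / se3, 1.0)    # z-statistics of the two trials,
      z2 = rng.normal(theta / se3, 1.0)    # independent given theta

      p1, p2 = (z1 > z_alpha).mean(), (z2 > z_alpha).mean()
      joint = ((z1 > z_alpha) & (z2 > z_alpha)).mean()
      print(p1 * p2, joint)                # the joint PoSS exceeds the product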

  12. On Dobrushin's way from probability theory to statistical physics

    CERN Document Server

    Minlos, R A; Suhov, Yu M; Suhov, Yu

    2000-01-01

    R. Dobrushin worked in several branches of mathematics (probability theory, information theory), but his deepest influence was on mathematical physics. He was one of the founders of the rigorous study of statistical physics. When Dobrushin began working in that direction in the early sixties, only a few people worldwide were thinking along the same lines. Now there is an army of researchers in the field. This collection is devoted to the memory of R. L. Dobrushin. The authors who contributed to this collection knew him quite well and were his colleagues. The title, "On Dobrushin's Way", is mea

  13. Applications of Statistics and Probability in Civil Engineering

    DEFF Research Database (Denmark)

    Faber, Michael Havbro

    This volume contains the proceedings of the 11th International Conference on Applications of Statistics and Probability in Civil Engineering (ICASP11, Zürich, Switzerland, 1-4 August 2011). The book focuses not only on the more traditional technical issues, but also emphasizes the societal context… and reliability in engineering; to professionals and engineers, including insurance and consulting companies working with natural hazards, design, operation and maintenance of civil engineering and industrial facilities; and to decision makers and professionals in the public sector, including nongovernmental…

  14. Statistical Validation of Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Veld, Aart A. van' t; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)

    2012-09-01

    Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.
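
    A sketch of the recommended recipe using scikit-learn on synthetic data (standing in for the xerostomia dataset): an L1-penalized (LASSO-type) logistic model tuned in an inner cross-validation loop, assessed by an outer loop, with a permutation test attaching statistical significance to the resulting AUC.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegressionCV
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import StratifiedKFold, cross_val_predict

      X, y = make_classification(n_samples=200, n_features=30,
                                 n_informative=5, random_state=0)

      def nested_cv_auc(X, y, seed):
          # Inner CV chooses the penalty; outer CV assesses the tuned model.
          inner = LogisticRegressionCV(Cs=10, cv=5, penalty="l1",
                                       solver="liblinear", scoring="roc_auc")
          outer = StratifiedKFold(5, shuffle=True, random_state=seed)
          prob = cross_val_predict(inner, X, y, cv=outer,
                                   method="predict_proba")[:, 1]
          return roc_auc_score(y, prob)

      observed = nested_cv_auc(X, y, 0)
      perm = [nested_cv_auc(X, np.random.default_rng(s).permutation(y), s)
              for s in range(20)]      # use many more permutations in practice
      p_value = float(np.mean([a >= observed for a in perm]))
      print(observed, p_value)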

  15. Statistical validation of normal tissue complication probability models.

    Science.gov (United States)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.

  16. Practical Statistics for LHC Physicists: Descriptive Statistics, Probability and Likelihood (1/3)

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    These lectures cover those principles and practices of statistics that are most relevant for work at the LHC. The first lecture discusses the basic ideas of descriptive statistics, probability and likelihood. The second lecture covers the key ideas in the frequentist approach, including confidence limits, profile likelihoods, p-values, and hypothesis testing. The third lecture covers inference in the Bayesian approach. Throughout, real-world examples will be used to illustrate the practical application of the ideas. No previous knowledge is assumed.

  17. Efficient statistical tests to compare Youden index: accounting for contingency correlation.

    Science.gov (United States)

    Chen, Fangyao; Xue, Yuqiang; Tan, Ming T; Chen, Pingyan

    2015-04-30

    The Youden index is widely utilized in studies evaluating the accuracy of diagnostic tests and the performance of predictive, prognostic, or risk models. However, both the one-sample and the two-independent-sample tests on the Youden index have been derived ignoring the dependence (association) between sensitivity and specificity, resulting in potentially misleading findings, and a paired-sample test on the Youden index has been unavailable. This article develops efficient statistical inference procedures for one-sample, independent-sample, and paired-sample tests on the Youden index by accounting for contingency correlation, namely the associations between sensitivity and specificity and between paired samples typically represented in contingency tables. For the one- and two-independent-sample tests, the variances are estimated by the delta method, and the statistical inference is based on central limit theory, which is then verified by bootstrap estimates. For the paired-samples test, we show that the estimated covariance of the two sensitivities and specificities can be represented as a function of the kappa statistic, so the test can be readily carried out. We then show the remarkable accuracy of the estimated variance using a constrained optimization approach. Simulation is performed to evaluate the statistical properties of the derived tests. The proposed approaches yield more stable type I errors at the nominal level and substantially higher power (efficiency) than does the original Youden's approach. Therefore, the simple explicit large-sample solution performs very well. Because we can readily implement the asymptotic and exact bootstrap computation with common software like R, the method is broadly applicable to the evaluation of diagnostic tests and model performance. Copyright © 2015 John Wiley & Sons, Ltd.
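
    For orientation, a minimal sketch of the quantities involved (counts invented): the Youden index J = sensitivity + specificity - 1, with the naive delta-method standard error that treats sensitivity and specificity as independent; the paper's contribution is precisely the contingency-correlation terms this simple version omits.

      import numpy as np

      tp, fn, tn, fp = 80, 20, 90, 10     # hypothetical 2 x 2 counts
      se = tp / (tp + fn)                 # sensitivity
      sp = tn / (tn + fp)                 # specificity
      J = se + sp - 1.0

      # Naive (independence) delta-method variance from two binomials.
      var_J = se * (1 - se) / (tp + fn) + sp * (1 - sp) / (tn + fp)
      z = J / np.sqrt(var_J)              # Wald test of H0: J = 0
      print(J, np.sqrt(var_J), z)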

  18. Do doctors need statistics? Doctors' use of and attitudes to probability and statistics.

    Science.gov (United States)

    Swift, Louise; Miles, Susan; Price, Gill M; Shepstone, Lee; Leinster, Sam J

    2009-07-10

    There is little published evidence on what doctors do in their work that requires probability and statistics, yet the General Medical Council (GMC) requires new doctors to have these skills. This study investigated doctors' use of and attitudes to probability and statistics with a view to informing undergraduate teaching. An email questionnaire was sent to 473 clinicians with an affiliation to the University of East Anglia's Medical School. Of 130 respondents, approximately 90 per cent of doctors who performed each of the following activities found probability and statistics useful for that activity: accessing clinical guidelines and evidence summaries, explaining levels of risk to patients, assessing medical marketing and advertising material, interpreting the results of a screening test, reading research publications for general professional interest, and using research publications to explore non-standard treatment and management options. Seventy-nine per cent (103/130, 95 per cent CI 71 per cent, 86 per cent) of participants considered probability and statistics important in their work. Sixty-three per cent (78/124, 95 per cent CI 54 per cent, 71 per cent) said that there were activities that they could do better or start doing if they had an improved understanding of these areas, and 74 of these participants elaborated on this. Themes highlighted by participants included: being better able to critically evaluate other people's research; becoming more research-active; having a better understanding of risk; and being better able to explain things to, or teach, other people. Our results can be used to inform how probability and statistics should be taught to medical undergraduates and should help convince today's medical students of the subjects' relevance to their future careers. Copyright 2009 John Wiley & Sons, Ltd.

  19. Random phenomena fundamentals of probability and statistics for engineers

    CERN Document Server

    Ogunnaike, Babatunde A

    2009-01-01

    Prelude: Approach Philosophy; Four Basic Principles. I Foundations: Two Motivating Examples; Yield Improvement in a Chemical Process; Quality Assurance in a Glass Sheet Manufacturing Process; Outline of a Systematic Approach; Random Phenomena, Variability, and Uncertainty; Two Extreme Idealizations of Natural Phenomena; Random Mass Phenomena; Introducing Probability; The Probabilistic Framework. II Probability: Fundamentals of Probability Theory; Building Blocks; Operations; Probability; Conditional Probability; Independence; Random Variables and Distributions; Distributions; Mathematical Expectation; Characterizing Distributions; Special Derived Probability Functions; Multidimensional Random Variables; Distributions of Several Random Variables; Distributional Characteristics of Jointly Distributed Random Variables; Random Variable Transformations; Single Variable Transformations; Bivariate Transformations; General Multivariate Transformations; Application Case Studies I: Probability; Mendel and Heredity; World War II Warship Tactical Response Under Attack. III Distributions: Ide...

  20. Cellular Analysis of Boltzmann Most Probable Ideal Gas Statistics

    Science.gov (United States)

    Cahill, Michael E.

    2018-04-01

    Exact treatment of Boltzmann's most probable statistics for an ideal gas of identical-mass particles having translational kinetic energy gives a distribution law for velocity phase-space cell j which relates the particle energy and the particle population according to B e(j) = A - Ψ(n(j) + 1), where A and B are the Lagrange multipliers and Ψ is the digamma function defined by Ψ(x + 1) = d/dx ln(x!). A useful, sufficiently accurate approximation for Ψ is given by Ψ(x + 1) ≈ ln(e^(-γ) + x), where γ is the Euler constant (≈ 0.5772156649), so the above distribution equation is approximately B e(j) = A - ln(e^(-γ) + n(j)), which can be inverted to solve for n(j), giving n(j) = (e^(B(e_H - e(j))) - 1) e^(-γ), where B e_H = A + γ and B e_H is a unitless particle energy which replaces the parameter A. The two approximate distribution equations imply that e_H is the highest particle energy, and the highest particle population is n_H = (e^(B e_H) - 1) e^(-γ), which is due to the facts that the population becomes negative if e(j) > e_H and the kinetic energy becomes negative if n(j) > n_H. An explicit construction of cells in velocity space which are equal in volume and homogeneous for almost all cells is shown to be useful in the analysis. Plots of sample distribution properties using e(j) as the independent variable are presented.
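
    A hedged numerical check of the approximate distribution law above, with illustrative values for B and e_H: cell populations are computed from the inverted form and then mapped back through the logarithmic approximation to recover the cell energies.

      import numpy as np

      gamma = 0.5772156649   # Euler's constant
      B, eH = 1.0, 5.0       # illustrative Lagrange multiplier and cutoff energy

      e = np.linspace(0.0, eH, 6)                        # sample cell energies
      n = (np.exp(B * (eH - e)) - 1.0) * np.exp(-gamma)  # populations n(j)

      # Consistency check with B e(j) = A - ln(exp(-gamma) + n(j)),
      # where A = B eH - gamma.
      A = B * eH - gamma
      e_back = (A - np.log(np.exp(-gamma) + n)) / B
      print(np.allclose(e, e_back))   # True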

  1. What's Missing in Teaching Probability and Statistics: Building Cognitive Schema for Understanding Random Phenomena

    Science.gov (United States)

    Kuzmak, Sylvia

    2016-01-01

    Teaching probability and statistics is more than teaching the mathematics itself. Historically, the mathematics of probability and statistics was first developed through analyzing games of chance such as the rolling of dice. This article makes the case that the understanding of probability and statistics is dependent upon building a…

  2. Sufficient Statistics for Divergence and the Probability of Misclassification

    Science.gov (United States)

    Quirein, J.

    1972-01-01

    One particular aspect of the feature selection problem is considered: that which results from the transformation x = Bz, where B is a k by n matrix of rank k with k ≤ n. It is shown that, in general, such a transformation results in a loss of information. In terms of the divergence, this is equivalent to the fact that the average divergence computed using the variable x is less than or equal to the average divergence computed using the variable z. A loss of information in terms of the probability of misclassification is shown to be equivalent to the fact that the probability of misclassification computed using variable x is greater than or equal to the probability of misclassification computed using variable z. First, the necessary facts relating k-dimensional and n-dimensional integrals are derived. Then the stated results about the divergence and probability of misclassification are derived. Finally, it is shown that if no information is lost (in x = Bz) as measured by the divergence, then no information is lost as measured by the probability of misclassification.
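
    A hedged numeric illustration of the information-loss claim for the special case of two Gaussian classes with common covariance S, where the divergence reduces to the Mahalanobis distance d'S^(-1)d; projecting through a rank-k matrix B can only shrink it (all matrices invented).

      import numpy as np

      rng = np.random.default_rng(2)
      n_dim, k_dim = 5, 2
      d = rng.normal(size=n_dim)                 # difference of class means
      root = rng.normal(size=(n_dim, n_dim))
      S = root @ root.T + n_dim * np.eye(n_dim)  # common covariance
      B = rng.normal(size=(k_dim, n_dim))        # rank-k transformation x = Bz

      div_z = d @ np.linalg.solve(S, d)          # divergence in z-space
      dx = B @ d
      div_x = dx @ np.linalg.solve(B @ S @ B.T, dx)  # divergence in x-space
      print(div_x <= div_z + 1e-12)              # True: no information gained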

  3. Conditional Probability Analysis: A Statistical Tool for Environmental Analysis.

    Science.gov (United States)

    The use and application of environmental conditional probability analysis (CPA) is relatively recent. The first presentation using CPA was made in 2002 at the New England Association of Environmental Biologists Annual Meeting in Newport, Rhode Island. CPA has been used since the...

  4. Limiting values of large deviation probabilities of quadratic statistics

    NARCIS (Netherlands)

    Jeurnink, Gerardus A.M.; Kallenberg, W.C.M.

    1990-01-01

    Application of exact Bahadur efficiencies in testing theory or exact inaccuracy rates in estimation theory needs evaluation of large deviation probabilities. Because of the complexity of the expressions, frequently a local limit of the nonlocal measure is considered. Local limits of large deviation

  5. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses.

    Science.gov (United States)

    Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas

    2008-01-01

    Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction, and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment per individual

  6. On Asymptotically Lacunary Statistical Equivalent Sequences of Order α in Probability

    Directory of Open Access Journals (Sweden)

    Işık Mahmut

    2017-01-01

    In this study, we introduce and examine the concepts of asymptotically lacunary statistical equivalence of order α in probability and strong asymptotically lacunary equivalence of order α in probability, and we give some relations connecting these concepts.

  7. Consolidity analysis for fully fuzzy functions, matrices, probability and statistics

    Directory of Open Access Journals (Sweden)

    Walaa Ibrahim Gabr

    2015-03-01

    The paper presents a comprehensive review of the know-how for developing systems consolidity theory for modeling, analysis, optimization and design in a fully fuzzy environment. The development of systems consolidity theory included handling new functions of different dimensionalities, fuzzy analytic geometry, fuzzy vector analysis, functions of fuzzy complex variables, ordinary differentiation of fuzzy functions, and partial fractions of fuzzy polynomials. On the other hand, the handling of fuzzy matrices covered determinants of fuzzy matrices, the eigenvalues of fuzzy matrices, and the solution of least-squares fuzzy linear equations. The approach is demonstrated to be applicable in a systematic way to new fuzzy probabilistic and statistical problems, including extending conventional probabilistic and statistical analysis to handle fuzzy random data. Applications also covered the consolidity of fuzzy optimization problems. The various numerical examples solved demonstrate that the new consolidity concept is highly effective in solving, in a compact form, the propagation of fuzziness in linear, nonlinear, multivariable and dynamic problems with different types of complexities. Finally, it is demonstrated that the suggested fuzzy mathematics can easily be embedded within normal mathematics by building a special fuzzy functions library inside the computational MATLAB toolbox or using other similar software languages.

  8. Probability theory and statistical applications a profound treatise for self-study

    CERN Document Server

    Zörnig, Peter

    2016-01-01

    This accessible and easy-to-read book provides many examples to illustrate diverse topics in probability and statistics, from initial concepts up to advanced calculations. Special attention is devoted, e.g., to independence of events, inequalities in probability, and functions of random variables. The book is directed to students of mathematics, statistics, engineering, and other quantitative sciences.

  9. Interpretation of the results of statistical measurements. [search for basic probability model

    Science.gov (United States)

    Olshevskiy, V. V.

    1973-01-01

    For random processes, the calculated probability characteristic and the measured statistical estimate are used in a quality functional which defines the difference between the two functions. Based on the assumption that the statistical measurement procedure is organized so that the parameters of a selected model are optimized, it is shown that the interpretation of experimental research is a search for a basic probability model.

  10. A new expression of the probability distribution in Incomplete Statistics and fundamental thermodynamic relations

    International Nuclear Information System (INIS)

    Huang Zhifu; Lin Bihong; Chen Jincan

    2009-01-01

    In order to overcome the limitations of the original expression of the probability distribution appearing in the literature on Incomplete Statistics, a new expression of the probability distribution is derived, where the Lagrange multiplier β introduced here is proved to be identical with that introduced in the second and third choices for the internal energy constraint in Tsallis' statistics and to be just equal to the physical inverse temperature. It is expounded that the probability distribution described by the new expression is invariant under uniform translation of the energy spectrum. Moreover, several fundamental thermodynamic relations are given and the relationship between the new and the original expressions of the probability distribution is discussed.

  11. Statistical methods in epidemiology. VII. An overview of the χ² test for 2 x 2 contingency table analysis.

    Science.gov (United States)

    Rigby, A S

    2001-11-10

    The odds ratio is an appropriate method of analysis for data in 2 x 2 contingency tables. However, other methods of analysis exist. One such method is based on the χ² goodness-of-fit test. Key players in the development of statistical theory include Pearson, Fisher and Yates. Data are presented in the form of 2 x 2 contingency tables and a method of analysis based on the χ² test is introduced. There are many variations of the basic test statistic, one of which is the χ² test with Yates' continuity correction. The usefulness (or not) of Yates' continuity correction is discussed. Problems of interpretation when the method is applied to k x m tables are highlighted. Some properties of the χ² test are illustrated by taking examples from the author's teaching experiences. Journal editors should be encouraged to give both observed and expected cell frequencies so that better information comes out of the χ² test statistic.
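
    A minimal sketch of the test under discussion, on an invented 2 x 2 table: Pearson's χ² with and without Yates' continuity correction, also printing the expected cell frequencies the author urges journals to report.

      import numpy as np
      from scipy.stats import chi2_contingency

      table = np.array([[20, 15],
                        [10, 30]])

      chi2_plain, p_plain, dof, expected = chi2_contingency(table, correction=False)
      chi2_yates, p_yates, _, _ = chi2_contingency(table, correction=True)
      print(chi2_plain, p_plain)   # uncorrected Pearson chi-squared
      print(chi2_yates, p_yates)   # with Yates' continuity correction
      print(expected)              # expected cell frequencies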

  12. Estimating the Probability of Traditional Copying, Conditional on Answer-Copying Statistics.

    Science.gov (United States)

    Allen, Jeff; Ghattas, Andrew

    2016-06-01

    Statistics for detecting copying on multiple-choice tests produce p values measuring the probability of a value at least as large as that observed, under the null hypothesis of no copying. The posterior probability of copying is arguably more relevant than the p value, but cannot be derived from Bayes' theorem unless the population probability of copying and probability distribution of the answer-copying statistic under copying are known. In this article, the authors develop an estimator for the posterior probability of copying that is based on estimable quantities and can be used with any answer-copying statistic. The performance of the estimator is evaluated via simulation, and the authors demonstrate how to apply the formula using actual data. Potential uses, generalizability to other types of cheating, and limitations of the approach are discussed.
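
    A toy illustration of the Bayes'-theorem identity the estimator is built on (all three inputs are invented; the paper's point is that the population copying rate and the distribution of the statistic under copying must themselves be estimated from data):

      prior = 0.01      # assumed population probability of copying
      p_value = 0.001   # P(T >= t | no copying), from the answer-copying statistic
      power = 0.60      # assumed P(T >= t | copying)

      posterior = prior * power / (prior * power + (1 - prior) * p_value)
      print(posterior)  # ~0.86: far more informative than the p value alone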

  13. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables.

  14. Impact of statistical learning methods on the predictive power of multivariate normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A.; van t Veld, Aart A.

    2012-01-01

    PURPOSE: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator

  15. Introduction to probability and statistics for ecosystem managers simulation and resampling

    CERN Document Server

    Haas, Timothy C

    2013-01-01

    Explores computer-intensive probability and statistics for ecosystem management decision making. Simulation is an accessible way to explain probability and stochastic model behavior to beginners. This book introduces probability and statistics to future and practicing ecosystem managers by providing a comprehensive treatment of these two areas. The author presents a self-contained introduction for individuals involved in monitoring, assessing, and managing ecosystems and features intuitive, simulation-based explanations of probabilistic and statistical concepts. Mathematical programming details are provided for estimating ecosystem model parameters with Minimum Distance, a robust and computer-intensive method. The majority of examples illustrate how probability and statistics can be applied to ecosystem management challenges. There are over 50 exercises, making this book suitable for a lecture course in a natural resource and/or wildlife management department, or as the main text in a program of self-stud...

  16. New exponential, logarithm and q-probability in the non-extensive statistical physics

    OpenAIRE

    Chung, Won Sang

    2013-01-01

    In this paper, a new exponential and logarithm related to non-extensive statistical physics are proposed by using the q-sum and q-product, which satisfy distributivity. We also discuss the q-mapping from an ordinary probability to a q-probability. The q-entropy defined by the idea of q-probability is shown to be q-additive.

  17. Quantum probability, choice in large worlds, and the statistical structure of reality.

    Science.gov (United States)

    Ross, Don; Ladyman, James

    2013-06-01

    Classical probability models of incentive response are inadequate in "large worlds," where the dimensions of relative risk and the dimensions of similarity in outcome comparisons typically differ. Quantum probability models for choice in large worlds may be motivated pragmatically (there is no third theory) or metaphysically: statistical processing in the brain adapts to the true scale-relative structure of the universe.

  18. The exact probability distribution of the rank product statistics for replicated experiments.

    Science.gov (United States)

    Eisinga, Rob; Breitling, Rainer; Heskes, Tom

    2013-03-18

    The rank product method is a widely accepted technique for detecting differentially regulated genes in replicated microarray experiments. To approximate the sampling distribution of the rank product statistic, the original publication proposed a permutation approach, whereas recently an alternative approximation based on the continuous gamma distribution was suggested. However, both approximations are imperfect for estimating small tail probabilities. In this paper we relate the rank product statistic to number theory and provide a derivation of its exact probability distribution and the true tail probabilities. Copyright © 2013 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
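
    A hedged brute-force sketch of the exact null distribution for small problems: under the null each replicate's rank is uniform on {1, ..., n}, so the distribution of the rank product follows by repeated convolution over products. The paper's derivation is far more efficient; this merely makes the exact tail probabilities concrete.

      from collections import Counter
      from fractions import Fraction

      def rank_product_null(n, k):
          # Exact null distribution of the product of k uniform ranks on 1..n.
          dist = Counter({1: Fraction(1)})
          for _ in range(k):
              new = Counter()
              for prod, p in dist.items():
                  for r in range(1, n + 1):
                      new[prod * r] += p / n
              dist = new
          return dist

      dist = rank_product_null(n=10, k=3)
      tail = sum(p for prod, p in dist.items() if prod <= 8)
      print(float(tail))   # exact P(rank product <= 8) under the null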

  19. Academic Training Lecture | Practical Statistics for LHC Physicists: Descriptive Statistics, Probability and Likelihood | 7-9 April

    CERN Multimedia

    2015-01-01

    Please note that our next series of Academic Training Lectures will take place on 7, 8 and 9 April 2015: Practical Statistics for LHC Physicists: Descriptive Statistics, Probability and Likelihood, by Harrison Prosper, Florida State University, USA, from 11.00 a.m. to 12.00 p.m. in the Council Chamber (503-1-001). https://indico.cern.ch/event/358542/

  20. Statistics and Probability at Secondary Schools in the Federal State of Salzburg: An Empirical Study

    Directory of Open Access Journals (Sweden)

    Wolfgang Voit

    2014-12-01

    Knowledge about the practical use of statistics and probability in today's mathematics instruction at secondary schools is vital in order to improve the academic education of future teachers. We conducted an empirical study among school teachers to inform improvements in mathematics instruction and teacher preparation. The study provides a snapshot of the daily practice of instruction at school. Centered around the following four questions, the status of statistics and probability was examined: Where did the current mathematics teachers study? What relevance do statistics and probability have in school? Which contents are actually taught in class? What kind of continuing education would be desirable for teachers? The study population consisted of all teachers of mathematics at secondary schools in the federal state of Salzburg.

  1. 8th International Conference on Soft Methods in Probability and Statistics

    CERN Document Server

    Giordani, Paolo; Vantaggi, Barbara; Gagolewski, Marek; Gil, María; Grzegorzewski, Przemysław; Hryniewicz, Olgierd

    2017-01-01

    This proceedings volume is a collection of peer-reviewed papers presented at the 8th International Conference on Soft Methods in Probability and Statistics (SMPS 2016), held in Rome (Italy). The book is dedicated to data science, which aims at developing automated methods to analyze massive amounts of data and to extract knowledge from them. It shows how data science employs various programming techniques and methods of data wrangling, data visualization, machine learning, probability and statistics. The soft methods proposed in this volume represent a collection of tools in these fields that can also be useful for data science.

  2. Probability density cloud as a geometrical tool to describe statistics of scattered light.

    Science.gov (United States)

    Yaitskova, Natalia

    2017-04-01

    First-order statistics of scattered light is described using the representation of the probability density cloud, which visualizes a two-dimensional distribution for complex amplitude. The geometric parameters of the cloud are studied in detail and are connected to the statistical properties of phase. The moment-generating function for intensity is obtained in a closed form through these parameters. An example of exponentially modified normal distribution is provided to illustrate the functioning of this geometrical approach.

  3. An experimental study of the surface elevation probability distribution and statistics of wind-generated waves

    Science.gov (United States)

    Huang, N. E.; Long, S. R.

    1980-01-01

    Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data along with some limited field data were compared. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.

  4. Evidence-Based Medicine as a Tool for Undergraduate Probability and Statistics Education.

    Science.gov (United States)

    Masel, J; Humphrey, P T; Blackburn, B; Levine, J A

    2015-01-01

    Most students have difficulty reasoning about chance events, and misconceptions regarding probability can persist or even strengthen following traditional instruction. Many biostatistics classes sidestep this problem by prioritizing exploratory data analysis over probability. However, probability itself, in addition to statistics, is essential both to the biology curriculum and to informed decision making in daily life. One area in which probability is particularly important is medicine. Given the preponderance of pre-health students, in addition to more general interest in medicine, we capitalized on students' intrinsic motivation in this area to teach both probability and statistics. We use the randomized controlled trial as the centerpiece of the course, because it exemplifies the most salient features of the scientific method, and the application of critical thinking to medicine. The other two pillars of the course are biomedical applications of Bayes' theorem and science and society content. Backward design from these three overarching aims was used to select appropriate probability and statistics content, with a focus on eliciting and countering previously documented misconceptions in their medical context. Pretest/posttest assessments using the Quantitative Reasoning Quotient and Attitudes Toward Statistics instruments are positive, bucking several negative trends previously reported in statistics education. © 2015 J. Masel et al. CBE—Life Sciences Education © 2015 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  5. Probability cueing of distractor locations: both intertrial facilitation and statistical learning mediate interference reduction.

    Science.gov (United States)

    Goschy, Harriet; Bakos, Sarolta; Müller, Hermann J; Zehetleitner, Michael

    2014-01-01

    Targets in a visual search task are detected faster if they appear in a probable target region as compared to a less probable target region, an effect which has been termed "probability cueing." The present study investigated whether probability cueing cannot only speed up target detection, but also minimize distraction by distractors in probable distractor regions as compared to distractors in less probable distractor regions. To this end, three visual search experiments with a salient, but task-irrelevant, distractor ("additional singleton") were conducted. Experiment 1 demonstrated that observers can utilize uneven spatial distractor distributions to selectively reduce interference by distractors in frequent distractor regions as compared to distractors in rare distractor regions. Experiments 2 and 3 showed that intertrial facilitation, i.e., distractor position repetitions, and statistical learning (independent of distractor position repetitions) both contribute to the probability cueing effect for distractor locations. Taken together, the present results demonstrate that probability cueing of distractor locations has the potential to serve as a strong attentional cue for the shielding of likely distractor locations.

  6. The Effects and Side-Effects of Statistics Education: Psychology Students' (Mis-)Conceptions of Probability

    Science.gov (United States)

    Morsanyi, Kinga; Primi, Caterina; Chiesi, Francesca; Handley, Simon

    2009-01-01

    In three studies we looked at two typical misconceptions of probability: the representativeness heuristic, and the equiprobability bias. The literature on statistics education predicts that some typical errors and biases (e.g., the equiprobability bias) increase with education, whereas others decrease. This is in contrast with reasoning theorists'…

  7. Two-Way Tables: Issues at the Heart of Statistics and Probability for Students and Teachers

    Science.gov (United States)

    Watson, Jane; Callingham, Rosemary

    2014-01-01

    Some problems exist at the intersection of statistics and probability, creating a dilemma in relation to the best approach to assist student understanding. Such is the case with problems presented in two-way tables representing conditional information. The difficulty can be confounded if the context within which the problem is set is one where…

  8. There Once Was a 9-Block ...--A Middle-School Design for Probability and Statistics

    Science.gov (United States)

    Abrahamson, Dor; Janusz, Ruth M.; Wilensky, Uri

    2006-01-01

    ProbLab is a probability-and-statistics unit developed at the Center for Connected Learning and Computer-Based Modeling, Northwestern University. Students analyze the combinatorial space of the 9-block, a 3-by-3 grid of squares, in which each square can be either green or blue. All 512 possible 9-blocks are constructed and assembled in a "bar…

  9. Ergodic theory, interpretations of probability and the foundations of statistical mechanics

    NARCIS (Netherlands)

    van Lith, J.H.

    2001-01-01

    The traditional use of ergodic theory in the foundations of equilibrium statistical mechanics is that it provides a link between thermodynamic observables and microcanonical probabilities. First of all, the ergodic theorem demonstrates the equality of microcanonical phase averages and infinite time

  10. Dynamic Graphics in Excel for Teaching Statistics: Understanding the Probability Density Function

    Science.gov (United States)

    Coll-Serrano, Vicente; Blasco-Blasco, Olga; Alvarez-Jareno, Jose A.

    2011-01-01

    In this article, we show a dynamic graphic in Excel that is used to introduce an important concept in our subject, Statistics I: the probability density function. This interactive graphic seeks to facilitate conceptual understanding of the main aspects analysed by the learners.

  11. Eliciting and Developing Teachers' Conceptions of Random Processes in a Probability and Statistics Course

    Science.gov (United States)

    Smith, Toni M.; Hjalmarson, Margret A.

    2013-01-01

    The purpose of this study is to examine prospective mathematics specialists' engagement in an instructional sequence designed to elicit and develop their understandings of random processes. The study was conducted with two different sections of a probability and statistics course for K-8 teachers. Thirty-two teachers participated. Video analyses…

  12. Promoting Active Learning When Teaching Introductory Statistics and Probability Using a Portfolio Curriculum Approach

    Science.gov (United States)

    Adair, Desmond; Jaeger, Martin; Price, Owen M.

    2018-01-01

    The use of a portfolio curriculum approach, when teaching a university introductory statistics and probability course to engineering students, is developed and evaluated. The portfolio curriculum approach, so called, as the students need to keep extensive records both as hard copies and digitally of reading materials, interactions with faculty,…

  13. Poisson statistics of PageRank probabilities of Twitter and Wikipedia networks

    Science.gov (United States)

    Frahm, Klaus M.; Shepelyansky, Dima L.

    2014-04-01

    We use the methods of quantum chaos and Random Matrix Theory for analysis of statistical fluctuations of PageRank probabilities in directed networks. In this approach the effective energy levels are given by a logarithm of PageRank probability at a given node. After the standard energy level unfolding procedure we establish that the nearest spacing distribution of PageRank probabilities is described by the Poisson law typical for integrable quantum systems. Our studies are done for the Twitter network and three networks of Wikipedia editions in English, French and German. We argue that due to absence of level repulsion the PageRank order of nearby nodes can be easily interchanged. The obtained Poisson law implies that the nearby PageRank probabilities fluctuate as random independent variables.
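
    A hedged sketch of the pipeline on a small random directed graph standing in for Twitter/Wikipedia: take -log(PageRank) as effective energy levels, crudely unfold to unit mean spacing (the paper uses the standard local unfolding), and compare the spacing tail with the Poisson (exponential) law P(s > x) = exp(-x).

      import numpy as np
      import networkx as nx

      g = nx.gnp_random_graph(2000, 0.005, seed=0, directed=True)
      pr = np.array(sorted(nx.pagerank(g).values()))

      energy = np.sort(-np.log(pr))       # effective energy levels
      s = np.diff(energy)
      s = s / s.mean()                    # crude global unfolding

      x = 1.0
      print((s > x).mean(), np.exp(-x))   # empirical tail vs Poisson law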

  14. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    Science.gov (United States)

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. The report describes the development and validation of studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single collaborator and multicollaborative study examples are given.

  15. Fundamental questions of earthquake statistics, source behavior, and the estimation of earthquake probabilities from possible foreshocks

    Science.gov (United States)

    Michael, Andrew J.

    2012-01-01

    Estimates of the probability that an ML 4.8 earthquake, which occurred near the southern end of the San Andreas fault on 24 March 2009, would be followed by an M 7 mainshock over the following three days vary from 0.0009 using a Gutenberg–Richter model of aftershock statistics (Reasenberg and Jones, 1989) to 0.04 using a statistical model of foreshock behavior and long‐term estimates of large earthquake probabilities, including characteristic earthquakes (Agnew and Jones, 1991). I demonstrate that the disparity between the existing approaches depends on whether or not they conform to Gutenberg–Richter behavior. While Gutenberg–Richter behavior is well established over large regions, it could be violated on individual faults if they have characteristic earthquakes or over small areas if the spatial distribution of large‐event nucleations is disproportional to the rate of smaller events. I develop a new form of the aftershock model that includes characteristic behavior and combines the features of both models. This new model and the older foreshock model yield the same results when given the same inputs, but the new model has the advantage of producing probabilities for events of all magnitudes, rather than just for events larger than the initial one. Compared with the aftershock model, the new model has the advantage of taking into account long‐term earthquake probability models. Using consistent parameters, the probability of an M 7 mainshock on the southernmost San Andreas fault is 0.0001 for three days from long‐term models and the clustering probabilities following the ML 4.8 event are 0.00035 for a Gutenberg–Richter distribution and 0.013 for a characteristic‐earthquake magnitude–frequency distribution. Our decisions about the existence of characteristic earthquakes and how large earthquakes nucleate have a first‐order effect on the probabilities obtained from short‐term clustering models for these large events.

  16. A scan statistic for continuous data based on the normal probability model

    Directory of Open Access Journals (Sweden)

    Huang Lan

    2009-10-01

    Temporal, spatial and space-time scan statistics are commonly used to detect and evaluate the statistical significance of temporal and/or geographical disease clusters, without any prior assumptions on the location, time period or size of those clusters. Scan statistics are mostly used for count data, such as disease incidence or mortality. Sometimes there is an interest in looking for clusters with respect to a continuous variable, such as lead levels in children or low birth weight. For such continuous data, we present a scan statistic where the likelihood is calculated using the normal probability model. It may also be used for other distributions, while still maintaining the correct alpha level. In an application of the new method, we look for geographical clusters of low birth weight in New York City.
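
    A hedged sketch of a purely temporal version of such a scan statistic: for each candidate window, a normal-model log-likelihood ratio compares a window-specific mean against the common mean with a shared variance. The significance adjustment for scanning many windows (Monte Carlo in the paper) is omitted here.

      import numpy as np

      rng = np.random.default_rng(3)
      x = rng.normal(0.0, 1.0, size=365)
      x[100:130] -= 0.9        # planted cluster of low values

      def window_llr(x, i, j):
          # Log likelihood ratio: separate means inside/outside vs one mean.
          inside, outside = x[i:j], np.concatenate([x[:i], x[j:]])
          sse_alt = (((inside - inside.mean()) ** 2).sum()
                     + ((outside - outside.mean()) ** 2).sum())
          sse_null = ((x - x.mean()) ** 2).sum()
          return 0.5 * len(x) * np.log(sse_null / sse_alt)

      best = max(((window_llr(x, i, j), i, j)
                  for i in range(0, 360, 5)
                  for j in range(i + 10, min(i + 120, 365), 5)),
                 key=lambda t: t[0])
      print(best)              # the top window should cover [100, 130)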

  17. Spencer-Brown vs. Probability and Statistics: Entropy’s Testimony on Subjective and Objective Randomness

    Directory of Open Access Journals (Sweden)

    Julio Michael Stern

    2011-04-01

    This article analyzes the role of entropy in Bayesian statistics, focusing on its use as a tool for detection, recognition and validation of eigen-solutions. “Objects as eigen-solutions” is a key metaphor of the cognitive constructivism epistemological framework developed by the philosopher Heinz von Foerster. Special attention is given to some objections to the concepts of probability, statistics and randomization posed by George Spencer-Brown, a figure of great influence in the field of radical constructivism.

  18. Probability of Detection (POD) as a statistical model for the validation of qualitative methods.

    Science.gov (United States)

    Wehling, Paul; LaBudde, Robert A; Brunelle, Sharon L; Nelson, Maria T

    2011-01-01

    A statistical model is presented for use in validation of qualitative methods. This model, termed Probability of Detection (POD), harmonizes the statistical concepts and parameters between quantitative and qualitative method validation. POD characterizes method response with respect to concentration as a continuous variable. The POD model provides a tool for graphical representation of response curves for qualitative methods. In addition, the model allows comparisons between candidate and reference methods, and provides calculations of repeatability, reproducibility, and laboratory effects from collaborative study data. Single laboratory study and collaborative study examples are given.
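
    A hedged sketch of one way to fit a POD response curve (data invented): logistic regression of the detection proportion on log concentration, ignoring the binomial weighting and the laboratory effects that the full POD model also covers.

      import numpy as np
      from scipy.optimize import curve_fit

      conc = np.array([0.1, 0.2, 0.5, 1.0, 2.0, 5.0])   # spiking levels
      n = np.array([12, 12, 12, 12, 12, 12])            # replicates per level
      detected = np.array([1, 3, 6, 10, 11, 12])        # positives per level

      def pod(c, a, b):
          # POD modeled as a logistic function of log concentration
          return 1.0 / (1.0 + np.exp(-(a + b * np.log(c))))

      params, _ = curve_fit(pod, conc, detected / n, p0=(0.0, 1.0))
      print(params, pod(1.0, *params))   # fitted parameters; POD at c = 1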

  19. Intelligent tutorial system for teaching of probability and statistics at high school in Mexico

    Directory of Open Access Journals (Sweden)

    Fernando Gudino Penaloza, Miguel Gonzalez Mendoza, Neil Hernandez Gress, Jaime Mora Vargas

    2009-12-01

    This paper describes the implementation of an intelligent tutoring system dedicated to teaching probability and statistics at the preparatory school (high school) level in Mexico. The system was implemented as a desktop application and adapted to a mobile environment for mobile learning (m-learning). The system follows the idea of being adaptable to the needs of each student and is able to adapt to three different teaching models that meet the criteria of three student profiles.

  20. Bayes factor and posterior probability: Complementary statistical evidence to p-value.

    Science.gov (United States)

    Lin, Ruitao; Yin, Guosheng

    2015-09-01

    As a convention, a p-value is often computed in hypothesis testing and compared with the nominal level of 0.05 to determine whether to reject the null hypothesis. Although the smaller the p-value, the more significant the statistical test, it is difficult to perceive the p-value on a probability scale and to quantify it as the strength of the data against the null hypothesis. In contrast, the Bayesian posterior probability of the null hypothesis has an explicit interpretation of how strongly the data support the null. We make a comparison of the p-value and the posterior probability by considering a recent clinical trial. The results show that even when we reject the null hypothesis, there is still a substantial probability (around 20%) that the null is true. Not only should we examine whether the data would have rarely occurred under the null hypothesis, but we also need to know whether the data would be rare under the alternative. As a result, the p-value only provides one side of the information, for which the Bayes factor and posterior probability may offer complementary evidence. Copyright © 2015 Elsevier Inc. All rights reserved.
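
    A hedged sketch of the comparison for a two-sided z-test with a simple N(0, tau^2) prior on the effect under the alternative (the prior scale is an assumption; the paper's clinical-trial example is more elaborate): the same data give a small p-value yet a posterior probability of the null near the 20% figure quoted above.

      import numpy as np
      from scipy.stats import norm

      z, tau = 2.3, 1.0            # observed z-statistic; prior sd of the effect
      p_value = 2 * norm.sf(abs(z))

      # Marginal likelihoods: z ~ N(0, 1) under H0, z ~ N(0, 1 + tau^2) under H1.
      bf01 = norm.pdf(z, 0, 1) / norm.pdf(z, 0, np.sqrt(1 + tau**2))
      prior_h0 = 0.5
      post_h0 = bf01 * prior_h0 / (bf01 * prior_h0 + (1 - prior_h0))
      print(p_value, post_h0)      # p ~ 0.02, yet P(H0 | data) ~ 0.27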

  1. Pairwise contact energy statistical potentials can help to find probability of point mutations.

    Science.gov (United States)

    Saravanan, K M; Suvaithenamudhan, S; Parthasarathy, S; Selvaraj, S

    2017-01-01

    To adopt a particular fold, a protein requires several interactions between its amino acid residues. The energetic contribution of these residue-residue interactions can be approximated by extracting statistical potentials from known high-resolution structures. Several methods based on statistical potentials extracted from unrelated proteins are found to make a better prediction of the probability of point mutations. We postulate that statistical potentials extracted from known structures of similar folds with varying sequence identity can be a powerful tool to examine the probability of point mutation. With this in mind, we have derived pairwise residue and atomic contact energy potentials for the different functional families that adopt the (α/β)8 TIM-barrel fold. We carried out computational point mutations at various conserved residue positions in the yeast triose phosphate isomerase enzyme, for which experimental results have already been reported. We also performed molecular dynamics simulations on a subset of point mutants to make a comparative study. The difference in pairwise residue and atomic contact energies between the wild type and various point mutations reveals the probability of mutation at a particular position. Interestingly, we found that our computational prediction agrees with the experimental studies of Silverman et al. (Proc Natl Acad Sci 2001;98:3092-3097) and performs better than I-Mutant and the Cologne University Protein Stability Analysis Tool. The present work thus suggests that deriving pairwise contact energy potentials and molecular dynamics simulations of functionally important folds could help us to predict the probability of point mutations, which may ultimately reduce the time and cost of mutation experiments. Proteins 2016; 85:54-64. © 2016 Wiley Periodicals, Inc.

  2. A statistical model for investigating binding probabilities of DNA nucleotide sequences using microarrays.

    Science.gov (United States)

    Lee, Mei-Ling Ting; Bulyk, Martha L; Whitmore, G A; Church, George M

    2002-12-01

    There is considerable scientific interest in knowing the probability that a site-specific transcription factor will bind to a given DNA sequence. Microarray methods provide an effective means for assessing the binding affinities of a large number of DNA sequences as demonstrated by Bulyk et al. (2001, Proceedings of the National Academy of Sciences, USA 98, 7158-7163) in their study of the DNA-binding specificities of Zif268 zinc fingers using microarray technology. In a follow-up investigation, Bulyk, Johnson, and Church (2002, Nucleic Acid Research 30, 1255-1261) studied the interdependence of nucleotides on the binding affinities of transcription proteins. Our article is motivated by this pair of studies. We present a general statistical methodology for analyzing microarray intensity measurements reflecting DNA-protein interactions. The log probability of a protein binding to a DNA sequence on an array is modeled using a linear ANOVA model. This model is convenient because it employs familiar statistical concepts and procedures and also because it is effective for investigating the probability structure of the binding mechanism.
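
    A hedged simulation of the modeling idea (sequences, effects and noise all invented): write the log binding probability as an additive ANOVA-style model with one effect per (position, nucleotide) pair and fit it by least squares to array-style measurements; centered per-position effects are the identifiable quantities.

      import numpy as np

      rng = np.random.default_rng(5)
      L, bases = 4, "ACGT"
      true = rng.normal(0, 0.5, size=(L, 4))   # per-(position, base) effects

      seqs = ["".join(rng.choice(list(bases), L)) for _ in range(300)]

      def design(seq):
          # One-hot row with one indicator per (position, nucleotide) pair.
          row = np.zeros(L * 4)
          for pos, b in enumerate(seq):
              row[pos * 4 + bases.index(b)] = 1.0
          return row

      X = np.array([design(s) for s in seqs])
      y = X @ true.ravel() + rng.normal(0, 0.1, len(seqs))  # log probabilities
      coef, *_ = np.linalg.lstsq(X, y, rcond=None)
      est = coef.reshape(L, 4)
      print(np.round(est - est.mean(axis=1, keepdims=True), 2))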

  3. The intensity detection of single-photon detectors based on photon counting probability density statistics

    International Nuclear Information System (INIS)

    Zhang Zijing; Song Jie; Zhao Yuan; Wu Long

    2017-01-01

    Single-photon detectors possess ultra-high sensitivity, but they cannot directly respond to signal intensity. Conventional methods adopt sampling gates of fixed width and count the number of triggered sampling gates, which yields a photon counting probability from which the echo signal intensity can be estimated. In this paper, we not only count the number of triggered sampling gates, but also record the time positions of the photon counting pulses. The photon counting probability density distribution is obtained through statistics over a series of these trigger time positions. Then the Minimum Variance Unbiased Estimation (MVUE) method is used to estimate the echo signal intensity. Compared with conventional methods, this method improves the estimation accuracy of the echo signal intensity owing to the acquisition of more detected information. Finally, a proof-of-principle laboratory system is established. The estimation accuracy of the echo signal intensity is discussed and a high-accuracy intensity image is acquired under low-light-level environments. (paper)
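
    A hedged sketch of the underlying counting principle: for Poisson-distributed photon numbers a sampling gate triggers with probability p = 1 - exp(-lam), so the mean photon number lam (the echo intensity) can be estimated from the triggered fraction; the paper goes further by also exploiting the trigger-time distribution with an MVUE.

      import numpy as np

      rng = np.random.default_rng(4)
      lam_true, n_gates = 0.7, 10_000
      triggered = rng.poisson(lam_true, n_gates) > 0   # gate fires on >= 1 photon

      p_hat = triggered.mean()
      lam_hat = -np.log(1.0 - p_hat)                   # inverted trigger probability
      print(p_hat, lam_hat)                            # lam_hat ~ 0.7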

  4. Robust functional statistics applied to Probability Density Function shape screening of sEMG data.

    Science.gov (United States)

    Boudaoud, S; Rix, H; Al Harrach, M; Marin, F

    2014-01-01

    Recent studies have pointed out possible shape modifications of the Probability Density Function (PDF) of surface electromyographic (sEMG) data in several contexts, such as fatigue and increasing muscle force. Following this idea, criteria have been proposed to monitor these shape modifications, mainly using High Order Statistics (HOS) parameters such as skewness and kurtosis. In experimental conditions, these parameters must be estimated from small samples, which induces errors in the estimated HOS parameters and hinders real-time, precise sEMG PDF shape monitoring. Recently, a functional formalism, the Core Shape Model (CSM), has been used to analyse shape modifications of PDF curves. In this work, taking inspiration from the CSM method, robust functional statistics are proposed to emulate the behavior of both skewness and kurtosis. These functional statistics combine kernel density estimation and PDF shape distances to evaluate shape modifications even in the presence of small sample sizes. The proposed statistics are then tested, using Monte Carlo simulations, on both normal and log-normal PDFs that mimic the observed sEMG PDF shape behavior during muscle contraction. According to the results obtained, the functional statistics appear more robust than HOS parameters to the small-sample-size effect and more accurate for sEMG PDF shape screening applications.
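
    The small-sample fragility that motivates the paper can be seen directly by computing moment-based skewness and kurtosis at several sample sizes, alongside a kernel density estimate of the PDF of the kind the functional statistics build on. This is an illustrative sketch only; the log-normal stand-in for sEMG amplitudes and all numbers are assumptions, and it does not implement the CSM-based statistics themselves.

```python
# Illustrative sketch: instability of moment-based skewness/kurtosis
# at small sample sizes, with a kernel density estimate of the PDF.
# The log-normal stand-in for sEMG amplitude data is an assumption.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
for n in (30, 300, 3000):
    sample = rng.lognormal(mean=0.0, sigma=0.5, size=n)
    print(f"n={n:5d}  skew={stats.skew(sample):+.3f}  "
          f"kurtosis={stats.kurtosis(sample):+.3f}")

# Kernel density estimate of the PDF shape for a small sample
sample = rng.lognormal(mean=0.0, sigma=0.5, size=30)
kde = stats.gaussian_kde(sample)
grid = np.linspace(sample.min(), sample.max(), 100)
pdf_estimate = kde(grid)   # smooth PDF curve usable for shape distances
```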

  5. Problems in probability theory, mathematical statistics and theory of random functions

    CERN Document Server

    Sveshnikov, A A

    1979-01-01

    Problem solving is the main thrust of this excellent, well-organized workbook. Suitable for students at all levels in probability theory and statistics, the book presents over 1,000 problems and their solutions, illustrating fundamental theory and representative applications in the following fields: Random Events; Distribution Laws; Correlation Theory; Random Variables; Entropy & Information; Markov Processes; Systems of Random Variables; Limit Theorems; Data Processing; and more. The coverage of topics is both broad and deep, ranging from the most elementary combinatorial problems through limit theorems.

  6. Estimating Effect Sizes and Expected Replication Probabilities from GWAS Summary Statistics

    DEFF Research Database (Denmark)

    Holland, Dominic; Wang, Yunpeng; Thompson, Wesley K

    2016-01-01

    Genome-wide Association Studies (GWAS) result in millions of summary statistics ("z-scores") for single nucleotide polymorphism (SNP) associations with phenotypes. These rich datasets afford deep insights into the nature and extent of genetic contributions to complex phenotypes such as psychiatric … z-scores, as such knowledge would enhance causal SNP and gene discovery, help elucidate mechanistic pathways, and inform future study design. Here we present a parsimonious methodology for modeling effect sizes and replication probabilities, relying only on summary statistics from GWAS substudies, and a scheme allowing … for estimating the degree of polygenicity of the phenotype and predicting the proportion of chip heritability explainable by genome-wide significant SNPs in future studies with larger sample sizes. We apply the model to recent GWAS of schizophrenia (N = 82,315) and putamen volume (N = 12,596), with approximately…

  7. Statistical inference of the generation probability of T-cell receptors from sequence repertoires.

    Science.gov (United States)

    Murugan, Anand; Mora, Thierry; Walczak, Aleksandra M; Callan, Curtis G

    2012-10-02

    Stochastic rearrangement of germline V-, D-, and J-genes to create variable coding sequence for certain cell surface receptors is at the origin of immune system diversity. This process, known as "VDJ recombination", is implemented via a series of stochastic molecular events involving gene choices and random nucleotide insertions between, and deletions from, genes. We use large sequence repertoires of the variable CDR3 region of human CD4+ T-cell receptor beta chains to infer the statistical properties of these basic biochemical events. Because any given CDR3 sequence can be produced in multiple ways, the probability distribution of hidden recombination events cannot be inferred directly from the observed sequences; we therefore develop a maximum likelihood inference method to achieve this end. To separate the properties of the molecular rearrangement mechanism from the effects of selection, we focus on nonproductive CDR3 sequences in T-cell DNA. We infer the joint distribution of the various generative events that occur when a new T-cell receptor gene is created. We find a rich picture of correlation (and absence thereof), providing insight into the molecular mechanisms involved. The generative event statistics are consistent between individuals, suggesting a universal biochemical process. Our probabilistic model predicts the generation probability of any specific CDR3 sequence by the primitive recombination process, allowing us to quantify the potential diversity of the T-cell repertoire and to understand why some sequences are shared between individuals. We argue that the use of formal statistical inference methods, of the kind presented in this paper, will be essential for quantitative understanding of the generation and evolution of diversity in the adaptive immune system.

  8. Probability distribution and statistical properties of spherically compensated cosmic regions in ΛCDM cosmology

    Science.gov (United States)

    Alimi, Jean-Michel; de Fromont, Paul

    2018-04-01

    The statistical properties of cosmic structures are well known to be strong probes for cosmology. In particular, several studies have tried to use the cosmic void counting number to obtain tight constraints on dark energy. In this paper, we model the statistical properties of these regions using the CoSphere formalism (de Fromont & Alimi) in both the primordial and the non-linearly evolved Universe in the standard Λ cold dark matter model. This formalism applies similarly to minima (voids) and maxima (such as DM haloes), which are here considered symmetrically. We first derive the full joint Gaussian distribution of CoSphere's parameters in the Gaussian random field. We recover the results of Bardeen et al. only in the limit where the compensation radius becomes very large, i.e. when the central extremum decouples from its cosmic environment. We compute the probability distribution of the compensation size in this primordial field. We show that this distribution is redshift independent and can be used to model the cosmic void size distribution. We also derive the statistical distribution of the peak parameters introduced by Bardeen et al. and discuss their correlation with the cosmic environment. We show that small central extrema with low density are associated with narrow compensation regions with deep compensation density, while higher central extrema are preferentially located in larger but smoother over/under massive regions.

  9. Impact of Statistical Learning Methods on the Predictive Power of Multivariate Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Schilstra, Cornelis; Langendijk, Johannes A.; Veld, Aart A. van' t [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands)

    2012-03-15

    Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.

  10. Level set segmentation of medical images based on local region statistics and maximum a posteriori probability.

    Science.gov (United States)

    Cui, Wenchao; Wang, Yi; Lei, Tao; Fan, Yangyu; Feng, Yan

    2013-01-01

    This paper presents a variational level set method for simultaneous segmentation and bias field estimation of medical images with intensity inhomogeneity. In our model, the statistics of image intensities belonging to each different tissue in local regions are characterized by Gaussian distributions with different means and variances. According to maximum a posteriori probability (MAP) and Bayes' rule, we first derive a local objective function for image intensities in a neighborhood around each pixel. Then this local objective function is integrated with respect to the neighborhood center over the entire image domain to give a global criterion. In level set framework, this global criterion defines an energy in terms of the level set functions that represent a partition of the image domain and a bias field that accounts for the intensity inhomogeneity of the image. Therefore, image segmentation and bias field estimation are simultaneously achieved via a level set evolution process. Experimental results for synthetic and real images show desirable performances of our method.

  11. Impact of statistical learning methods on the predictive power of multivariate normal tissue complication probability models.

    Science.gov (United States)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A; van't Veld, Aart A

    2012-03-15

    To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended. Copyright © 2012 Elsevier Inc. All rights reserved.

  12. Simulation of statistical systems with not necessarily real and positive probabilities

    International Nuclear Information System (INIS)

    Kalkreuter, T.

    1991-01-01

    A new method to determine expectation values of observables in statistical systems with not necessarily real and positive probabilities is proposed. It is tested in a numerical study of the two-dimensional O(3)-symmetric nonlinear σ-model with Symanzik's one-loop improved lattice action. This model is simulated as a polymer system with field-dependent activities, which can be made positive definite or indefinite by adjusting additive constants of the action. For a system with indefinite activities the new proposal is found to work. It is also verified that local observables are not affected by far-away polymers with indefinite activities when the system has no long-range order. (orig.)

  13. Impact of Statistical Learning Methods on the Predictive Power of Multivariate Normal Tissue Complication Probability Models

    International Nuclear Information System (INIS)

    Xu Chengjian; Schaaf, Arjen van der; Schilstra, Cornelis; Langendijk, Johannes A.; Veld, Aart A. van’t

    2012-01-01

    Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.

  14. A comparison of Probability Of Detection (POD) data determined using different statistical methods

    Science.gov (United States)

    Fahr, A.; Forsyth, D.; Bullock, M.

    1993-12-01

    Different statistical methods have been suggested for determining probability of detection (POD) data for nondestructive inspection (NDI) techniques. A comparative assessment of various methods of determining POD was conducted using the results of three NDI methods obtained by inspecting actual aircraft engine compressor disks that contained service-induced cracks. The study found that the POD and 95 percent confidence curves as functions of crack size, as well as the 90/95 percent crack length, vary depending on the statistical method used and the type of data. The distribution function, as well as the parameter estimation procedure used for determining the POD and the confidence bound, must be stated when referencing information such as the 90/95 percent crack length. The POD curves and confidence bounds determined using the range interval method are very dependent on information that does not come from the inspection data. The maximum likelihood estimation (MLE) method does not require such information, and its POD results are more reasonable. The log-logistic function appears to model the POD of hit/miss data relatively well and is easy to implement. The log-normal distribution using MLE provides more realistic POD results and is the preferred method. Although it is more complicated and slower to calculate, it can be implemented in a common spreadsheet program.
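
    The log-logistic hit/miss model mentioned in the comparison can be sketched compactly: simulate crack sizes with hit/miss inspection outcomes, then fit POD(a) by maximum likelihood. This is a hedged illustration of the model family, not the report's code or data; all parameter values are invented.

```python
# Hedged sketch of a log-logistic hit/miss POD model fitted by maximum
# likelihood, in the spirit of the comparison above (data are simulated;
# this is not the report's code or data).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
a = rng.uniform(0.2, 5.0, 200)                 # crack sizes (mm), invented
true_b0, true_b1 = -1.0, 3.0                   # invented true parameters
p_true = 1.0 / (1.0 + np.exp(-(true_b0 + true_b1 * np.log(a))))
hit = rng.random(200) < p_true                 # hit/miss inspection outcomes

def neg_log_lik(beta):
    # POD(a) = logistic(b0 + b1 * log a); Bernoulli likelihood for hit/miss
    eta = beta[0] + beta[1] * np.log(a)
    p = 1.0 / (1.0 + np.exp(-eta))
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -np.sum(hit * np.log(p) + (~hit) * np.log(1 - p))

fit = minimize(neg_log_lik, x0=np.zeros(2), method="Nelder-Mead")
b0, b1 = fit.x
a90 = np.exp((np.log(9.0) - b0) / b1)          # crack size with POD = 0.90
print(f"b0={b0:.2f}  b1={b1:.2f}  a90={a90:.2f} mm")
```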

  15. Assessment of climate change using methods of mathematic statistics and theory of probability

    International Nuclear Information System (INIS)

    Trajanoska, Lidija; Kaevski, Ivancho

    2004-01-01

    In simple terms, 'climate' is the average of 'weather'. The Earth's weather system is a complex machine composed of coupled sub-systems (ocean, air, land, ice and the biosphere) between which energy is exchanged. The understanding and study of climate change does not rely only on understanding the physics of climate change, but is linked to the following question: 'How can we detect change in a system that is changing all the time under its own volition?' What is even the meaning of 'change' in such a situation? If we transform the concept of 'change' into the concept of 'significant and long-term' change, this re-phrasing allows for a definition in mathematical terms. Significant change in a system becomes a measure of how large an observed change is in terms of the variability one would see under 'normal' conditions. An example is the analysis of yearly air temperature and precipitation, as in this paper. A large amount of data is selected as representing the 'before' case and another set of data is selected as the 'after' case, and then the averages of these two cases are compared. These comparisons take the form of 'hypothesis tests', in which one tests whether the hypothesis that there has been no change can be rejected. Both parametric and nonparametric methods of mathematical statistics are used. The most indicative variables showing global change are the average, the standard deviation and the probability distribution function of the examined time series. The examined meteorological series are treated as random processes, so mathematical statistics can be applied. (Author)

  16. Probability and statistical correlation of the climatic parameters for estimatingenergy consumption of a building

    Directory of Open Access Journals (Sweden)

    Samarin Oleg Dmitrievich

    2014-01-01

    Full Text Available The problem of accurately estimating the energy consumption of ventilation and air conditioning systems in buildings is a high-priority task now, because of the decrease of energy and fuel sources and because of the revision of building standards in the Russian Federation. That is why it is very important to find simple but sufficiently accurate correlations between the climatic parameters in the heating and cooling seasons of a year. Therefore, the probabilistic and statistical relationships of the external climate parameters in the warm and cold seasons are considered. The climatic curves for the cold and warm seasons in Moscow, showing the most probable combinations of external air temperature and relative air humidity, are plotted using data from the Design Guidelines to the State Building Code "Building Climatology". The statistical relationship between the enthalpy and the external air temperature for the climatic conditions of Moscow is determined using these climatic curves and formulas connecting relative air humidity and other parameters of air moisture content. The mean value of the external air enthalpy for the heating season is calculated in order to simplify the determination of the full heat consumption of ventilation and air conditioning systems, taking into account the real mean state of the external air. The field of application and estimates of the accuracy and standard deviation of the presented dependences are given. The obtained model contains only one independent parameter, namely the external air temperature, and therefore it can be easily used in engineering practice, especially in preliminary calculations.

  17. When Is Statistical Evidence Superior to Anecdotal Evidence in Supporting Probability Claims? The Role of Argument Type

    Science.gov (United States)

    Hoeken, Hans; Hustinx, Lettica

    2009-01-01

    Under certain conditions, statistical evidence is more persuasive than anecdotal evidence in supporting a claim about the probability that a certain event will occur. In three experiments, it is shown that the type of argument is an important condition in this respect. If the evidence is part of an argument by generalization, statistical evidence…

  18. A statistical model for deriving probability distributions of contamination for accidental releases

    International Nuclear Information System (INIS)

    ApSimon, H.M.; Davison, A.C.

    1986-01-01

    Results generated from a detailed long-range transport model, MESOS, simulating dispersal of a large number of hypothetical releases of radionuclides in a variety of meteorological situations over Western Europe have been used to derive a simpler statistical model, MESOSTAT. This model may be used to generate probability distributions of different levels of contamination at a receptor point 100-1000 km or so from the source (for example, across a frontier in another country) without considering individual release and dispersal scenarios. The model is embodied in a series of equations involving parameters which are determined from such factors as distance between source and receptor, nuclide decay and deposition characteristics, release duration, and geostrophic windrose at the source. Suitable geostrophic windrose data have been derived for source locations covering Western Europe. Special attention has been paid to the relatively improbable extreme values of contamination at the top end of the distribution. The MESOSTAT model and its development are described, with illustrations of its use and comparison with the original more detailed modelling techniques. (author)

  19. The Work Sample Verification and the Calculation of the Statistical, Mathematical and Economical Probability for the Risks of the Direct Procurement

    Directory of Open Access Journals (Sweden)

    Lazăr Cristiana Daniela

    2017-01-01

    Full Text Available Each organization has, among the multiple secondary objectives subordinated to a central objective, that of avoiding contingencies. Direct procurement is carried out on the market in SEAP (Electronic System of Public Procurement), and performing management in a public institution rests, among other things, on risk management. The risks may be investigated by econometric simulation, calculated using the calculus of probability and sampling for determining the relevance of these probabilities.

  20. Optimum Inductive Methods. A study in Inductive Probability, Bayesian Statistics, and Verisimilitude.

    NARCIS (Netherlands)

    Festa, Roberto

    1992-01-01

    According to the Bayesian view, scientific hypotheses must be appraised in terms of their posterior probabilities relative to the available experimental data. Such posterior probabilities are derived from the prior probabilities of the hypotheses by applying Bayes' theorem. One of the most important…

  1. Temporal contingency

    Science.gov (United States)

    Gallistel, C.R.; Craig, Andrew R.; Shahan, Timothy A.

    2015-01-01

    Contingency, and more particularly temporal contingency, has often figured in thinking about the nature of learning. However, it has never been formally defined in such a way as to make it a measure that can be applied to most animal learning protocols. We use elementary information theory to define contingency in such a way as to make it a measurable property of almost any conditioning protocol. We discuss how making it a measurable construct enables the exploration of the role of different contingencies in the acquisition and performance of classically and operantly conditioned behavior. PMID:23994260

  2. Temporal contingency.

    Science.gov (United States)

    Gallistel, C R; Craig, Andrew R; Shahan, Timothy A

    2014-01-01

    Contingency, and more particularly temporal contingency, has often figured in thinking about the nature of learning. However, it has never been formally defined in such a way as to make it a measure that can be applied to most animal learning protocols. We use elementary information theory to define contingency in such a way as to make it a measurable property of almost any conditioning protocol. We discuss how making it a measurable construct enables the exploration of the role of different contingencies in the acquisition and performance of classically and operantly conditioned behavior. Copyright © 2013 Elsevier B.V. All rights reserved.

  3. Understanding common statistical methods, Part I: descriptive methods, probability, and continuous data.

    Science.gov (United States)

    Skinner, Carl G; Patel, Manish M; Thomas, Jerry D; Miller, Michael A

    2011-01-01

    Statistical methods are pervasive in medical research and general medical literature. Understanding general statistical concepts will enhance our ability to critically appraise the current literature and ultimately improve the delivery of patient care. This article intends to provide an overview of the common statistical methods relevant to medicine.

  4. The statistical significance of error probability as determined from decoding simulations for long codes

    Science.gov (United States)

    Massey, J. L.

    1976-01-01

    The very low error probability obtained with long error-correcting codes results in a very small number of observed errors in simulation studies of practical size and renders the usual confidence interval techniques inapplicable to the observed error probability. A natural extension of the notion of a 'confidence interval' is made and applied to such determinations of error probability by simulation. An example is included to show the surprisingly great significance of as few as two decoding errors in a very large number of decoding trials.
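
    The small-count regime described here can be illustrated with an exact (Clopper-Pearson) binomial interval, which remains meaningful when only a handful of decoding errors are observed. This is a standard construction offered for illustration, not Massey's specific extension of the confidence-interval notion.

```python
# Illustration of the small-count problem: an exact (Clopper-Pearson)
# binomial confidence interval for an error probability estimated from
# very few observed decoding errors. This is a standard construction,
# not the paper's specific extension of the confidence-interval notion.
from scipy.stats import beta

def clopper_pearson(k, n, conf=0.95):
    """Exact two-sided interval for p given k errors in n trials."""
    alpha = 1.0 - conf
    lo = 0.0 if k == 0 else beta.ppf(alpha / 2, k, n - k + 1)
    hi = 1.0 if k == n else beta.ppf(1 - alpha / 2, k + 1, n - k)
    return lo, hi

# Two decoding errors in ten million trials: the point estimate is 2e-7,
# but the upper confidence limit is several times larger.
print(clopper_pearson(2, 10_000_000))
```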

  5. A Scan Statistic for Continuous Data Based on the Normal Probability Model

    OpenAIRE

    Konty, Kevin; Kulldorff, Martin; Huang, Lan

    2009-01-01

    Temporal, spatial and space-time scan statistics are commonly used to detect and evaluate the statistical significance of temporal and/or geographical disease clusters, without any prior assumptions on the location, time period or size of those clusters. Scan statistics are mostly used for count data, such as disease incidence or mortality. Sometimes there is an interest in looking for clusters with respect to a continuous variable, such as lead levels in children or low birth weight...

  6. A Study of Strengthening Secondary Mathematics Teachers' Knowledge of Statistics and Probability via Professional Development

    Science.gov (United States)

    DeVaul, Lina

    2017-01-01

    A professional development program (PSPD) was implemented to improve in-service secondary mathematics teachers' content knowledge, pedagogical knowledge, and self-efficacy in teaching secondary school statistics and probability. Participants generated a teaching resource website at the conclusion of the PSPD program. Participants' content…

  7. Application of binomial and multinomial probability statistics to the sampling design process of a global grain tracing and recall system

    Science.gov (United States)

    Small, coded, pill-sized tracers embedded in grain are proposed as a method for grain traceability. A sampling process for a grain traceability system was designed and investigated by applying probability statistics using a science-based sampling approach to collect an adequate number of tracers fo...

  8. A statistical analysis on failure-to open/close probability of pneumatic valve in sodium cooling systems

    International Nuclear Information System (INIS)

    Kurisaka, Kenichi

    1999-11-01

    The objective of this study is to develop fundamental data for examining the efficiency of preventive maintenance and surveillance tests from the standpoint of failure probability. In this study, a pneumatic valve in sodium cooling systems was selected as a major standby component. A statistical analysis was made of the trend of the valve failure-to-open/close (FTOC) probability depending on the number of demands ('n'), the time since installation ('t') and the standby time since the last open/close action ('T'). The analysis is based on the field data of operating- and failure-experiences stored in the Component Reliability Database and Statistical Analysis System for LMFBR's (CORDS). In the analysis, the FTOC probability ('P') was expressed as follows: P = 1 − exp{−C − En − F/n − λT − aT(t − T/2) − AT²/2}. The functional parameters, 'C', 'E', 'F', 'λ', 'a' and 'A', were estimated with the maximum likelihood estimation method. As a result, the FTOC probability is well approximated by the failure probability derived from the failure rate under the assumption of a Poisson distribution only when the valve cycle (i.e. the open-close-open cycle) exceeds about 100 days. When the valve cycle is shorter than about 100 days, the FTOC probability can be adequately estimated with the parametric model proposed in this study. The results obtained from this study may make it possible to derive an adequate frequency of surveillance tests for a given target FTOC probability. (author)
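
    The quoted parametric model can be evaluated directly once the functional parameters are known. The sketch below simply codes the formula; the parameter values are invented placeholders, since the paper estimates them from CORDS field data by maximum likelihood.

```python
# Direct evaluation of the FTOC probability model quoted above,
# P = 1 - exp{-C - E*n - F/n - lambda*T - a*T*(t - T/2) - A*T^2/2}.
# The parameter values below are invented placeholders; the paper
# estimates them from the CORDS field data by maximum likelihood.
import math

def ftoc_probability(n, t, T, C, E, F, lam, a, A):
    """n: demands, t: time since installation, T: standby time."""
    exponent = -(C + E * n + F / n + lam * T + a * T * (t - T / 2) + A * T**2 / 2)
    return 1.0 - math.exp(exponent)

# Example: 50 demands, 1000 days since installation, 30 days standby
print(ftoc_probability(n=50, t=1000.0, T=30.0,
                       C=1e-3, E=1e-5, F=1e-2, lam=1e-5, a=1e-8, A=1e-8))
```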

  9. Probability density function shape sensitivity in the statistical modeling of turbulent particle dispersion

    Science.gov (United States)

    Litchford, Ron J.; Jeng, San-Mou

    1992-01-01

    The performance of a recently introduced statistical transport model for turbulent particle dispersion is studied here for rigid particles injected into a round turbulent jet. Both uniform and isosceles triangle pdfs are used. The statistical sensitivity to parcel pdf shape is demonstrated.

  10. Computational Modeling of Statistical Learning: Effects of Transitional Probability versus Frequency and Links to Word Learning

    Science.gov (United States)

    Mirman, Daniel; Estes, Katharine Graf; Magnuson, James S.

    2010-01-01

    Statistical learning mechanisms play an important role in theories of language acquisition and processing. Recurrent neural network models have provided important insights into how these mechanisms might operate. We examined whether such networks capture two key findings in human statistical learning. In Simulation 1, a simple recurrent network…

  11. Aarhus Conference on Probability, Statistics and Their Applications : Celebrating the Scientific Achievements of Ole E. Barndorff-Nielsen

    CERN Document Server

    Stelzer, Robert; Thorbjørnsen, Steen; Veraart, Almut

    2016-01-01

    Collecting together twenty-three self-contained articles, this volume presents the current research of a number of renowned scientists in both probability theory and statistics as well as their various applications in economics, finance, the physics of wind-blown sand, queueing systems, risk assessment, turbulence and other areas. The contributions are dedicated to and inspired by the research of Ole E. Barndorff-Nielsen who, since the early 1960s, has been and continues to be a very active and influential researcher working on a wide range of important problems. The topics covered include, but are not limited to, econometrics, exponential families, Lévy processes and infinitely divisible distributions, limit theory, mathematical finance, random matrices, risk assessment, statistical inference for stochastic processes, stochastic analysis and optimal control, time series, and turbulence. The book will be of interest to researchers and graduate students in probability, statistics and their applications.

  12. Transformation & uncertainty : some thoughts on quantum probability theory, quantum statistics, and natural bundles

    NARCIS (Netherlands)

    Janssens, B.

    2010-01-01

    This PhD thesis is concerned partly with uncertainty relations in quantum probability theory, partly with state estimation in quantum stochastics, and partly with natural bundles in differential geometry. The laws of quantum mechanics impose severe restrictions on the performance of measurement.

  13. Statistical Study of Aircraft Icing Probabilities at the 700- and 500- Millibar Levels over Ocean Areas in the Northern Hemisphere

    Science.gov (United States)

    Perkins, Porter J.; Lewis, William; Mulholland, Donald R.

    1957-01-01

    A statistical study is made of icing data reported from weather reconnaissance aircraft flown by Air Weather Service (USAF). The weather missions studied were flown at fixed flight levels of 500 millibars (18,000 ft) and 700 millibars (10,000 ft) over wide areas of the Pacific, Atlantic, and Arctic Oceans. This report is presented as part of a program conducted by the NACA to obtain extensive icing statistics relevant to aircraft design and operation. The thousands of in-flight observations recorded over a 2- to 4-year period provide reliable statistics on icing encounters for the specific areas, altitudes, and seasons included in the data. The relative frequencies of icing occurrence are presented, together with the estimated icing probabilities and the relation of these probabilities to the frequencies of flight in clouds and cloud temperatures. The results show that aircraft operators can expect icing probabilities to vary widely throughout the year from near zero in the cold Arctic areas in winter up to 7 percent in areas where greater cloudiness and warmer temperatures prevail. The data also reveal a general tendency of colder cloud temperatures to reduce the probability of icing in equally cloudy conditions.

  14. Statistical analysis on failure-to-open/close probability of motor-operated valve in sodium system

    International Nuclear Information System (INIS)

    Kurisaka, Kenichi

    1998-08-01

    The objective of this work is to develop basic data for examining the efficiency of preventive maintenance and actuation tests from the standpoint of failure probability. This work consists of a statistical trend analysis of the valve failure probability in the failure-to-open/close mode with respect to time since installation and time since the last open/close action, based on field data of operating and failure experience. In this work, terms both dependent on and independent of time were considered in the failure probability. The linear aging model was modified and applied to the time-dependent term. In this model there are two terms, with failure rates proportional to time since installation and to time since the last open/close demand, respectively. Because of their sufficient statistical population, motor-operated valves (MOVs) in sodium systems were selected for analysis from the CORDS database, which contains operating data and failure data of components in the fast reactors and sodium test facilities. From these data, the functional parameters were statistically estimated to quantify the valve failure probability in the failure-to-open/close mode, with consideration of uncertainty. (J.P.N.)

  15. Conceptual and Statistical Issues Regarding the Probability of Default and Modeling Default Risk

    Directory of Open Access Journals (Sweden)

    Emilia TITAN

    2011-03-01

    Full Text Available In today's rapidly evolving financial markets, risk management offers different techniques for implementing an efficient system against market risk. The probability of default (PD) is an essential part of business intelligence and customer relationship management systems in financial institutions. Recent studies indicate that underestimating this important component, as well as the loss given default (LGD), might threaten the stability and smooth running of the financial markets. From the perspective of risk management, the predictive accuracy of the estimated probability of default is more valuable than the standard binary classification into creditworthy and non-creditworthy clients. The Basle II Accord recognizes methods of reducing credit risk, and also PD and LGD as important components of the advanced Internal Ratings-Based (IRB) approach.

  16. Ignorance is not bliss: Statistical power is not probability of trial success.

    Science.gov (United States)

    Zierhut, M L; Bycott, P; Gibbs, M A; Smith, B P; Vicini, P

    2016-04-01

    The purpose of this commentary is to place probability of trial success, or assurance, in the context of decision making in drug development, and to illustrate its properties in an intuitive manner for the readers of Clinical Pharmacology and Therapeutics. The hope is that this will stimulate a dialog on how assurance should be incorporated into a quantitative decision approach for clinical development and trial design that uses all available information. © 2015 ASCPT.

  17. Contingent valuation and incentives

    Science.gov (United States)

    Patricia A. Champ; Nicholas E. Flores; Thomas C. Brown; James Chivers

    2002-01-01

    We empirically investigate the effect of the payment mechanism on contingent values by asking a willingness-to-pay question with one of three different payment mechanisms: individual contribution, contribution with provision point, and referendum. We find statistical evidence of more affirmative responses in the referendum treatment relative to the individual...

  18. Probabilistic risk assessment course documentation. Volume 2. Probability and statistics for PRA applications

    International Nuclear Information System (INIS)

    Iman, R.L.; Prairie, R.R.; Cramond, W.R.

    1985-08-01

    This course is intended to provide the necessary probabilistic and statistical skills to perform a PRA. Fundamental background information is reviewed, but the principal purpose is to address specific techniques used in PRAs and to illustrate them with applications. Specific examples and problems are presented for most of the topics

  19. The return period analysis of natural disasters with statistical modeling of bivariate joint probability distribution.

    Science.gov (United States)

    Li, Ning; Liu, Xueqin; Xie, Wei; Wu, Jidong; Zhang, Peng

    2013-01-01

    New features of natural disasters have been observed over the last several years. The factors that influence the disasters' formation mechanisms, regularity of occurrence and main characteristics have been revealed to be more complicated and diverse in nature than previously thought. As the uncertainty involved increases, the variables need to be examined further. This article discusses the importance of, and the current shortage of, multivariate analysis of natural disasters, and presents a method to estimate the joint probability of the return periods and perform a risk analysis. Severe dust storms from 1990 to 2008 in Inner Mongolia were used as a case study to test this new methodology, as they are normal and recurring climatic phenomena on Earth. Based on the 79 investigated events and a bivariate definition of dust storms, the joint probability distribution of severe dust storms was established using observed data on maximum wind speed and duration. The joint return periods of severe dust storms were calculated, and the relevant risk was analyzed according to the joint probability. The copula function is able to simulate severe dust storm disasters accurately. The joint return periods generated are closer to those observed in reality than the univariate return periods and thus have more value in severe dust storm disaster mitigation, strategy making, program design, and improvement of risk management. This research may prove useful in risk-based decision making. The exploration of multivariate analysis methods can also lay the foundation for further applications in natural disaster risk analysis. © 2012 Society for Risk Analysis.
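
    The joint-return-period construction can be sketched with a Gumbel copula linking the two marginal distributions. The copula family, dependence parameter, event frequency and percentiles below are invented placeholders, not the paper's fitted model; the sketch only illustrates how a joint 'AND' return period follows from the copula.

```python
# Hedged sketch of the bivariate joint-return-period idea: a Gumbel
# copula links the marginal exceedance probabilities of maximum wind
# speed and storm duration. Marginals, theta and the event frequency
# are invented placeholders, not the paper's fitted model.
import numpy as np

def gumbel_copula(u, v, theta):
    """Gumbel copula C(u, v); theta >= 1 controls upper-tail dependence."""
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

def joint_return_period_and(u, v, theta, events_per_year=4.0):
    """Return period (years) of wind speed AND duration both exceeded.
    u, v are the marginal non-exceedance probabilities F(x), G(y)."""
    p_exceed_both = 1.0 - u - v + gumbel_copula(u, v, theta)
    return 1.0 / (events_per_year * p_exceed_both)

# Example: both marginals at their 95th percentile, moderate dependence
print(joint_return_period_and(u=0.95, v=0.95, theta=2.0))
```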

  20. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  1. Rank-k Maximal Statistics for Divergence and Probability of Misclassification

    Science.gov (United States)

    Decell, H. P., Jr.

    1972-01-01

    A technique is developed for selecting from n-channel multispectral data some k combinations of the n-channels upon which to base a given classification technique so that some measure of the loss of the ability to distinguish between classes, using the compressed k-dimensional data, is minimized. Information loss in compressing the n-channel data to k channels is taken to be the difference in the average interclass divergences (or probability of misclassification) in n-space and in k-space.

  2. Quantile selection procedure and associated distribution of ratios of order statistics from a restricted family of probability distributions

    International Nuclear Information System (INIS)

    Gupta, S.S.; Panchapakesan, S.

    1975-01-01

    A quantile selection procedure in reliability problems pertaining to a restricted family of probability distributions is discussed. This family is assumed to be star-ordered with respect to the standard normal distribution folded at the origin. Motivation for this formulation of the problem is described. Both exact and asymptotic results dealing with the distribution of the maximum of ratios of order statistics from such a family are obtained and tables of the appropriate constants, percentiles of this statistic, are given in order to facilitate the use of the selection procedure

  3. Lay understanding of forensic statistics: Evaluation of random match probabilities, likelihood ratios, and verbal equivalents.

    Science.gov (United States)

    Thompson, William C; Newman, Eryn J

    2015-08-01

    Forensic scientists have come under increasing pressure to quantify the strength of their evidence, but it is not clear which of several possible formats for presenting quantitative conclusions will be easiest for lay people, such as jurors, to understand. This experiment examined the way that people recruited from Amazon's Mechanical Turk (n = 541) responded to 2 types of forensic evidence--a DNA comparison and a shoeprint comparison--when an expert explained the strength of this evidence 3 different ways: using random match probabilities (RMPs), likelihood ratios (LRs), or verbal equivalents of likelihood ratios (VEs). We found that verdicts were sensitive to the strength of DNA evidence regardless of how the expert explained it, but verdicts were sensitive to the strength of shoeprint evidence only when the expert used RMPs. The weight given to DNA evidence was consistent with the predictions of a Bayesian network model that incorporated the perceived risk of a false match from 3 causes (coincidence, a laboratory error, and a frame-up), but shoeprint evidence was undervalued relative to the same Bayesian model. Fallacious interpretations of the expert's testimony (consistent with the source probability error and the defense attorney's fallacy) were common and were associated with the weight given to the evidence and verdicts. The findings indicate that perceptions of forensic science evidence are shaped by prior beliefs and expectations as well as expert testimony and consequently that the best way to characterize and explain forensic evidence may vary across forensic disciplines. (c) 2015 APA, all rights reserved.

  4. Future Contingents

    DEFF Research Database (Denmark)

    Øhrstrøm, Peter; Hasle., Per F. V.

    2015-01-01

    contingent statements. The problem of future contingents is interwoven with a number of issues in theology, philosophy, logic, semantics of natural language, computer science, and applied mathematics. The theological issue of how to reconcile the assumption of God's foreknowledge with the freedom and moral accountability of human beings has been a main impetus to the discussion and a major inspiration to the development of various logical models of time and future contingents. This theological issue is connected with the general philosophical question of determinism versus indeterminism. Within logic, the relation … about the future. Finally, it should be mentioned that temporal logic has found a remarkable application in computer science and applied mathematics. In the late 1970s the first computer scientists realised the relevance of temporal logic for the purposes of computer science (see Hasle and Øhrstrøm 2004).

  5. Future Contingents

    DEFF Research Database (Denmark)

    Øhrstrøm, Peter; Hasle., Per F. V.

    2011-01-01

    contingent statements. The problem of future contingents is interwoven with a number of issues in theology, philosophy, logic, semantics of natural language, computer science, and applied mathematics. The theological issue of how to reconcile the assumption of God's foreknowledge with the freedom and moral accountability of human beings has been a main impetus to the discussion and a major inspiration to the development of various logical models of time and future contingents. This theological issue is connected with the general philosophical question of determinism versus indeterminism. Within logic, the relation … about the future. Finally, it should be mentioned that temporal logic has found a remarkable application in computer science and applied mathematics. In the late 1970s the first computer scientists realised the relevance of temporal logic for the purposes of computer science (see Hasle and Øhrstrøm 2004).

  6. ExGUtils: A Python Package for Statistical Analysis With the ex-Gaussian Probability Density.

    Science.gov (United States)

    Moret-Tatay, Carmen; Gamermann, Daniel; Navarro-Pardo, Esperanza; Fernández de Córdoba Castellá, Pedro

    2018-01-01

    The study of reaction times and their underlying cognitive processes is an important field in Psychology. Reaction times are often modeled through the ex-Gaussian distribution, because it provides a good fit to multiple empirical data. The complexity of this distribution makes the use of computational tools an essential element. Therefore, there is a strong need for efficient and versatile computational tools for the research in this area. In this manuscript we discuss some mathematical details of the ex-Gaussian distribution and apply the ExGUtils package, a set of functions and numerical tools, programmed for python, developed for numerical analysis of data involving the ex-Gaussian probability density. In order to validate the package, we present an extensive analysis of fits obtained with it, discuss advantages and differences between the least squares and maximum likelihood methods and quantitatively evaluate the goodness of the obtained fits (which is usually an overlooked point in most literature in the area). The analysis done allows one to identify outliers in the empirical datasets and criteriously determine if there is a need for data trimming and at which points it should be done.
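
    ExGUtils exposes its own fitting routines; as a library-agnostic illustration of the same task, scipy's exponnorm (the exponentially modified Gaussian) can fit an ex-Gaussian by maximum likelihood. The simulated reaction-time parameters below are invented, and this is not the ExGUtils API.

```python
# Library-agnostic sketch of fitting an ex-Gaussian to reaction times.
# scipy parametrizes the distribution as exponnorm(K, loc, scale) with
# mu = loc, sigma = scale and tau = K * scale; the simulated parameter
# values are invented, and this is not the ExGUtils API itself.
import numpy as np
from scipy.stats import exponnorm

rng = np.random.default_rng(3)
mu, sigma, tau = 400.0, 50.0, 100.0          # ms; typical RT-like values
rts = rng.normal(mu, sigma, 1000) + rng.exponential(tau, 1000)

K, loc, scale = exponnorm.fit(rts)           # maximum likelihood fit
print(f"mu={loc:.1f}  sigma={scale:.1f}  tau={K * scale:.1f}")
```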

  7. ExGUtils: A Python Package for Statistical Analysis With the ex-Gaussian Probability Density

    Directory of Open Access Journals (Sweden)

    Carmen Moret-Tatay

    2018-05-01

    Full Text Available The study of reaction times and their underlying cognitive processes is an important field in Psychology. Reaction times are often modeled through the ex-Gaussian distribution, because it provides a good fit to multiple empirical data. The complexity of this distribution makes the use of computational tools an essential element. Therefore, there is a strong need for efficient and versatile computational tools for the research in this area. In this manuscript we discuss some mathematical details of the ex-Gaussian distribution and apply the ExGUtils package, a set of functions and numerical tools, programmed for python, developed for numerical analysis of data involving the ex-Gaussian probability density. In order to validate the package, we present an extensive analysis of fits obtained with it, discuss advantages and differences between the least squares and maximum likelihood methods and quantitatively evaluate the goodness of the obtained fits (which is usually an overlooked point in most literature in the area). The analysis done allows one to identify outliers in the empirical datasets and criteriously determine if there is a need for data trimming and at which points it should be done.

  8. Inclusion probability for DNA mixtures is a subjective one-sided match statistic unrelated to identification information.

    Science.gov (United States)

    Perlin, Mark William

    2015-01-01

    DNA mixtures of two or more people are a common type of forensic crime scene evidence. A match statistic that connects the evidence to a criminal defendant is usually needed for court. Jurors rely on this strength of match to help decide guilt or innocence. However, the reliability of unsophisticated match statistics for DNA mixtures has been questioned. The most prevalent match statistic for DNA mixtures is the combined probability of inclusion (CPI), used by crime labs for over 15 years. When testing 13 short tandem repeat (STR) genetic loci, the CPI(-1) value is typically around a million, regardless of DNA mixture composition. However, actual identification information, as measured by a likelihood ratio (LR), spans a much broader range. This study examined probability of inclusion (PI) mixture statistics for 517 locus experiments drawn from 16 reported cases and compared them with LR locus information calculated independently on the same data. The log(PI(-1)) values were examined and compared with corresponding log(LR) values. The LR and CPI methods were compared in case examples of false inclusion, false exclusion, a homicide, and criminal justice outcomes. Statistical analysis of crime laboratory STR data shows that inclusion match statistics exhibit a truncated normal distribution having zero center, with little correlation to actual identification information. By the law of large numbers (LLN), CPI(-1) increases with the number of tested genetic loci, regardless of DNA mixture composition or match information. These statistical findings explain why CPI is relatively constant, with implications for DNA policy, criminal justice, cost of crime, and crime prevention. Forensic crime laboratories have generated CPI statistics on hundreds of thousands of DNA mixture evidence items. However, this commonly used match statistic behaves like a random generator of inclusionary values, following the LLN rather than measuring identification information. A quantitative
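
    The CPI arithmetic itself is simple enough to sketch, which is part of the article's point: at each locus the probability of inclusion is the squared sum of the frequencies of all alleles observed in the mixture, and the CPI is the product across loci. The allele frequencies below are invented for illustration.

```python
# Minimal sketch of the combined probability of inclusion (CPI) match
# statistic for a DNA mixture: at each locus, PI is the squared sum of
# the frequencies of all alleles observed in the mixture, and CPI is
# the product across loci. Allele frequencies below are invented.
locus_allele_freqs = [
    [0.12, 0.08, 0.20],        # frequencies of alleles seen at locus 1
    [0.15, 0.10],              # locus 2
    [0.05, 0.22, 0.11, 0.09],  # locus 3
]

cpi = 1.0
for freqs in locus_allele_freqs:
    pi_locus = sum(freqs) ** 2     # probability a random person is "included"
    cpi *= pi_locus

print(f"CPI = {cpi:.3e};  match statistic CPI^-1 = {1.0 / cpi:.3e}")
```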

  9. Inclusion probability for DNA mixtures is a subjective one-sided match statistic unrelated to identification information

    Directory of Open Access Journals (Sweden)

    Mark William Perlin

    2015-01-01

    Full Text Available Background: DNA mixtures of two or more people are a common type of forensic crime scene evidence. A match statistic that connects the evidence to a criminal defendant is usually needed for court. Jurors rely on this strength of match to help decide guilt or innocence. However, the reliability of unsophisticated match statistics for DNA mixtures has been questioned. Materials and Methods: The most prevalent match statistic for DNA mixtures is the combined probability of inclusion (CPI), used by crime labs for over 15 years. When testing 13 short tandem repeat (STR) genetic loci, the CPI(-1) value is typically around a million, regardless of DNA mixture composition. However, actual identification information, as measured by a likelihood ratio (LR), spans a much broader range. This study examined probability of inclusion (PI) mixture statistics for 517 locus experiments drawn from 16 reported cases and compared them with LR locus information calculated independently on the same data. The log(PI(-1)) values were examined and compared with corresponding log(LR) values. Results: The LR and CPI methods were compared in case examples of false inclusion, false exclusion, a homicide, and criminal justice outcomes. Statistical analysis of crime laboratory STR data shows that inclusion match statistics exhibit a truncated normal distribution having zero center, with little correlation to actual identification information. By the law of large numbers (LLN), CPI(-1) increases with the number of tested genetic loci, regardless of DNA mixture composition or match information. These statistical findings explain why CPI is relatively constant, with implications for DNA policy, criminal justice, cost of crime, and crime prevention. Conclusions: Forensic crime laboratories have generated CPI statistics on hundreds of thousands of DNA mixture evidence items. However, this commonly used match statistic behaves like a random generator of inclusionary values, following the LLN

  10. Active Contour Driven by Local Region Statistics and Maximum A Posteriori Probability for Medical Image Segmentation

    Directory of Open Access Journals (Sweden)

    Xiaoliang Jiang

    2014-01-01

    Full Text Available This paper presents a novel active contour model in a variational level set formulation for simultaneous segmentation and bias field estimation of medical images. An energy function is formulated based on improved Kullback-Leibler distance (KLD with likelihood ratio. According to the additive model of images with intensity inhomogeneity, we characterize the statistics of image intensities belonging to each different object in local regions as Gaussian distributions with different means and variances. Then, we use the Gaussian distribution with bias field as a local region descriptor in level set formulation for segmentation and bias field correction of the images with inhomogeneous intensities. Therefore, image segmentation and bias field estimation are simultaneously achieved by minimizing the level set formulation. Experimental results demonstrate desirable performance of the proposed method for different medical images with weak boundaries and noise.

  11. Binomial distribution of Poisson statistics and tracks overlapping probability to estimate total tracks count with low uncertainty

    International Nuclear Information System (INIS)

    Khayat, Omid; Afarideh, Hossein; Mohammadnia, Meisam

    2015-01-01

    In chemically etched solid state nuclear track detectors, nuclear tracks whose center-to-center distance is shorter than two track radii emerge as overlapping tracks. Track overlapping in this type of detector causes track count losses, which become rather severe at high track densities. Therefore, track counting in this condition should include a correction factor for the count losses of the different track-overlapping orders, since a number of overlapping tracks may be counted as one track. Another aspect of the problem arises in cases where imaging the whole area of the detector and counting all tracks are not possible. In these conditions, a statistical generalization method is desired that is applicable to counting a segmented area of the detector, with results that can be generalized to the whole surface of the detector. There is also a challenge in counting densely overlapped tracks, because sufficient geometrical or contextual information is not available. In this paper we present a statistical counting method which gives the user a relation between the track-overlapping probabilities on a segmented area of the detector surface and the total number of tracks. To apply the proposed method, one can estimate the total number of tracks on a solid state detector of arbitrary shape and dimensions by approximating the average track area, the whole detector surface area and some orders of track-overlapping probabilities. It is shown that this method is applicable to high and ultra-high density track images and that the count loss error can be reduced using a statistical generalization approach, as in the sketch below. - Highlights: • A correction factor for the count losses of different track-overlapping orders. • Applicable when imaging the whole area of the detector is not possible. • A statistical generalization method for segmented areas. • A relation between the track-overlapping probabilities and the total tracks
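
    A hedged sketch of the count-loss idea, under the simplifying assumption that track centers follow a spatial Poisson process: a track is overlap-free when no neighboring center lies within two radii, which gives a closed-form correction from the counted isolated tracks to an estimate of the total. This illustrative model and its numbers are assumptions, not the paper's estimator.

```python
# Hedged sketch of the count-loss idea: if track centers follow a
# spatial Poisson process with density lam (tracks per unit area) and
# tracks of radius r overlap when centers are closer than 2r, the
# chance that a given track is overlap-free is exp(-lam * pi * (2r)**2).
# Inverting this gives a rough correction from counted isolated tracks
# to the true total; the model and numbers are illustrative assumptions,
# not the paper's estimator.
import math

def overlap_free_fraction(lam, r):
    """Probability a track has no neighbor within 2r (Poisson model)."""
    return math.exp(-lam * math.pi * (2 * r) ** 2)

lam = 2.0e4        # invented track density per cm^2
r = 5.0e-4         # invented track radius in cm
frac = overlap_free_fraction(lam, r)
counted_singles = 15000                      # invented count of isolated tracks
estimated_total = counted_singles / frac     # rough correction for overlaps
print(f"overlap-free fraction = {frac:.3f}, estimated total = {estimated_total:.0f}")
```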

  12. A probabilistic framework for microarray data analysis: fundamental probability models and statistical inference.

    Science.gov (United States)

    Ogunnaike, Babatunde A; Gelmi, Claudio A; Edwards, Jeremy S

    2010-05-21

    Gene expression studies generate large quantities of data with the defining characteristic that the number of genes (whose expression profiles are to be determined) exceeds the number of available replicates by several orders of magnitude. Standard spot-by-spot analysis still seeks to extract useful information for each gene on the basis of the number of available replicates, and thus plays to the weakness of microarrays. On the other hand, because of the data volume, treating the entire data set as an ensemble, and developing theoretical distributions for these ensembles, provides a framework that plays instead to the strength of microarrays. We present theoretical results showing that, under reasonable assumptions, the distribution of microarray intensities follows the Gamma model, with the biological interpretations of the model parameters emerging naturally. We subsequently establish that for each microarray data set, the fractional intensities can be represented as a mixture of Beta densities, and we develop a procedure for using these results to draw statistical inference regarding differential gene expression. We illustrate the results with experimental data from gene expression studies on Deinococcus radiodurans following DNA damage using cDNA microarrays. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  13. A comparator-hypothesis account of biased contingency detection.

    Science.gov (United States)

    Vadillo, Miguel A; Barberia, Itxaso

    2018-02-12

    Our ability to detect statistical dependencies between different events in the environment is strongly biased by the number of coincidences between them. Even when there is no true covariation between a cue and an outcome, if the marginal probability of either of them is high, people tend to perceive some degree of statistical contingency between both events. The present paper explores the ability of the Comparator Hypothesis to explain the general pattern of results observed in this literature. Our simulations show that this model can account for the biasing effects of the marginal probabilities of cues and outcomes. Furthermore, the overall fit of the Comparator Hypothesis to a sample of experimental conditions from previous studies is comparable to that of the popular Rescorla-Wagner model. These results should encourage researchers to further explore and put to the test the predictions of the Comparator Hypothesis in the domain of biased contingency detection. Copyright © 2018 Elsevier B.V. All rights reserved.
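
    For intuition, a minimal Python simulation of the rival Rescorla-Wagner model mentioned above (parameters, trial counts and stimulus coding are assumptions): with zero true contingency, a short training run leaves the cue with more associative strength when the outcome base rate is high, which is the pre-asymptotic bias at issue.

        import random

        random.seed(1)

        def rescorla_wagner(p_cue, p_out, trials=100, lr=0.05):
            """Learn a null contingency: the outcome is independent of the
            cue, so any nonzero cue strength reflects bias. The bias is
            pre-asymptotic and washes out with extended training."""
            v_cue, v_ctx = 0.0, 0.0          # cue and context strengths
            for _ in range(trials):
                cue = random.random() < p_cue
                outcome = 1.0 if random.random() < p_out else 0.0
                prediction = v_ctx + (v_cue if cue else 0.0)
                error = outcome - prediction   # prediction error
                v_ctx += lr * error
                if cue:
                    v_cue += lr * error
            return v_cue

        # Same zero contingency; only the marginal outcome probability differs.
        print(rescorla_wagner(p_cue=0.5, p_out=0.8))   # typically well above zero
        print(rescorla_wagner(p_cue=0.5, p_out=0.2))   # typically much smaller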

  14. Between Certainty and Uncertainty: Statistics and Probability in Five Units with Notes on Historical Origins and Illustrative Numerical Examples

    CERN Document Server

    Laudański, Ludomir M

    2013-01-01

    "Between Certainty & Uncertainty" is a one-of-a-kind short course on statistics for students, engineers and researchers. It is a fascinating introduction to statistics and probability, with notes on historical origins and 80 illustrative numerical examples organized in five units: • Chapter 1, Descriptive Statistics: compressing small samples; basic averages (mean and variance) and their main properties, including God's proof; linear transformations and z-scored statistics. • Chapter 2, Grouped Data: Udny Yule's concept of qualitative and quantitative variables; grouping these two kinds of data; graphical tools; combinatorial rules and qualitative variables; designing the frequency histogram; direct and coded evaluation of quantitative data; significance of percentiles. • Chapter 3, Regression and Correlation: geometrical distance and equivalent distances in two orthogonal directions as a prerequisite to the concept of two regressi...

  15. Urban seismic risk assessment: statistical repair cost data and probable structural losses based on damage scenario—correlation analysis

    Science.gov (United States)

    Eleftheriadou, Anastasia K.; Baltzopoulou, Aikaterini D.; Karabinis, Athanasios I.

    2016-06-01

    The current seismic risk assessment is based on two discrete approaches, actual and probable, with the produced results validated against each other afterwards. In the first part of this research, the seismic risk is evaluated from the available data regarding the mean statistical repair/strengthening or replacement cost for the total number of damaged structures (180,427 buildings) after the 7/9/1999 Parnitha (Athens) earthquake. The actual evaluated seismic risk is afterwards compared to the estimated probable structural losses, presented in the second part of the paper, based on a damage scenario for the same earthquake. The applied damage scenario is based on recently developed damage probability matrices (DPMs) from the Athens (Greece) damage database. The seismic risk estimation refers to 750,085 buildings situated in the extended urban region of Athens. The building exposure is categorized into five typical structural types and represents 18.80 % of the entire building stock in Greece. The latter information is provided by the National Statistics Service of Greece (NSSG) according to the 2000-2001 census. The seismic input is characterized by the ratio a_g/a_o, where a_g is the regional peak ground acceleration (PGA), evaluated from the earlier estimated research macroseismic intensities, and a_o is the PGA according to the hazard map of the 2003 Greek Seismic Code. Finally, the collected financial data, derived from the different National Services responsible for post-earthquake crisis management and concerning the repair/strengthening or replacement costs and other categories of costs for the rehabilitation of earthquake victims (construction and operation of settlements for the earthquake homeless, rent supports, demolitions, shorings), are used to determine the final total seismic risk factor.

  16. Development of a statistical model for the determination of the probability of riverbank erosion in a Meditteranean river basin

    Science.gov (United States)

    Varouchakis, Emmanouil; Kourgialas, Nektarios; Karatzas, George; Giannakis, Georgios; Lilli, Maria; Nikolaidis, Nikolaos

    2014-05-01

    Riverbank erosion affects the river morphology and the local habitat, and results in riparian land loss and damage to property and infrastructure, ultimately weakening flood defences. An important issue concerning riverbank erosion is the identification of the areas vulnerable to erosion, as it allows for predicting changes and assists with stream management and restoration. One way to predict the areas vulnerable to erosion is to determine the erosion probability by identifying the underlying relations between riverbank erosion and the geomorphological and/or hydrological variables that prevent or stimulate erosion. In this work, a statistical model for evaluating the probability of erosion based on a series of independent local variables is developed using logistic regression. The main variables affecting erosion are the vegetation index (stability), the presence or absence of meanders, the bank material (classification), stream power, bank height, riverbank slope, riverbed slope, cross-section width and water velocities (Luppi et al. 2009). In statistics, logistic regression is a type of regression analysis used for predicting the outcome of a categorical dependent variable, e.g. a binary response, based on one or more predictor variables (continuous or categorical). The probabilities of the possible outcomes are modelled as a function of the independent variables using a logistic function. Logistic regression measures the relationship between a categorical dependent variable and, usually, one or several continuous independent variables by converting the dependent variable to probability scores. A logistic regression is then formed, which predicts success or failure of a given binary variable (e.g. 1 = "presence of erosion" and 0 = "no erosion") for any value of the independent variables. The regression coefficients are estimated by maximum likelihood estimation. The erosion occurrence probability can be calculated in conjunction with the model deviance regarding
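
    A compact Python illustration of this modelling step (synthetic data; the two predictors, their coefficients and the scikit-learn estimator are assumptions standing in for the paper's full variable set and maximum likelihood fit):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(42)
        n = 500
        # Synthetic stand-ins for two of the predictors named above.
        bank_slope = rng.uniform(10, 60, n)       # degrees
        stream_power = rng.uniform(0, 300, n)     # W/m
        logit = -6.0 + 0.05 * bank_slope + 0.015 * stream_power
        erosion = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

        # scikit-learn adds mild L2 regularization by default; statsmodels
        # Logit would give the plain maximum likelihood fit described above.
        model = LogisticRegression().fit(
            np.column_stack([bank_slope, stream_power]), erosion)

        # Erosion probability for a steep bank on a high-energy reach.
        print(model.predict_proba([[50.0, 250.0]])[0, 1])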

  17. Interference statistics and capacity analysis for uplink transmission in two-tier small cell networks: A geometric probability approach

    KAUST Repository

    Tabassum, Hina

    2014-07-01

    This paper presents a novel framework to derive the statistics of the interference considering dedicated and shared spectrum access for uplink transmission in two-tier small cell networks such as the macrocell-femtocell networks. The framework exploits the distance distributions from geometric probability theory to characterize the uplink interference while considering a traditional grid-model set-up for macrocells along with the randomly deployed femtocells. The derived expressions capture the impact of path-loss, composite shadowing and fading, uniform and non-uniform traffic loads, spatial distribution of femtocells, and partial and full spectral reuse among femtocells. Considering dedicated spectrum access, first, we derive the statistics of co-tier interference incurred at both femtocell and macrocell base stations (BSs) from a single interferer by approximating the generalized-K composite fading distribution with the tractable Gamma distribution. We then derive the distribution of the number of interferers considering partial spectral reuse and the moment generating function (MGF) of the cumulative interference for both partial and full spectral reuse scenarios. Next, we derive the statistics of the cross-tier interference at both femtocell and macrocell BSs considering shared spectrum access. Finally, we utilize the derived expressions to analyze the capacity in both dedicated and shared spectrum access scenarios. The derived expressions are validated by Monte Carlo simulations. Numerical results are generated to assess the feasibility of shared and dedicated spectrum access in femtocells under varying traffic load and spectral reuse scenarios. © 2014 IEEE.
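
    The Gamma approximation step can be illustrated with second-order moment matching in Python (the generalized-K power variate is simulated as the product of two independent Gamma variates; the shape values are illustrative):

        import numpy as np

        rng = np.random.default_rng(7)
        # Product of a fading and a shadowing Gamma variate, each with
        # unit mean; generalized-K composite power can be simulated this way.
        m_fading, m_shadow = 2.0, 3.0
        x = (rng.gamma(m_fading, 1.0 / m_fading, 200_000)
             * rng.gamma(m_shadow, 1.0 / m_shadow, 200_000))

        # Second-order moment matching to a single tractable Gamma:
        # shape k = mean^2 / var, scale theta = var / mean.
        mu, var = x.mean(), x.var()
        k_hat, theta_hat = mu * mu / var, var / mu
        print(k_hat, theta_hat)   # parameters of the approximating Gamma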

  18. Statistical equivalence and test-retest reliability of delay and probability discounting using real and hypothetical rewards.

    Science.gov (United States)

    Matusiewicz, Alexis K; Carter, Anne E; Landes, Reid D; Yi, Richard

    2013-11-01

    Delay discounting (DD) and probability discounting (PD) refer to the reduction in the subjective value of outcomes as a function of delay and uncertainty, respectively. Elevated measures of discounting are associated with a variety of maladaptive behaviors, and confidence in the validity of these measures is imperative. The present research examined (1) the statistical equivalence of discounting measures when rewards were hypothetical or real, and (2) their 1-week reliability. While previous research has partially explored these issues using the low threshold of nonsignificant difference, the present study fully addressed them using the more compelling threshold of statistical equivalence. DD and PD measures were collected from 28 healthy adults using real and hypothetical $50 rewards during each of two experimental sessions, one week apart. Analyses using area-under-the-curve measures revealed a general pattern of statistical equivalence, indicating equivalence of real/hypothetical conditions as well as 1-week reliability. Exceptions are identified and discussed. Copyright © 2013 Elsevier B.V. All rights reserved.
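
    For reference, the area-under-the-curve measure used here is typically computed on normalized axes with the trapezoid rule; a short Python sketch with illustrative indifference points:

        import numpy as np

        # Indifference points for a hypothetical $50 reward (illustrative).
        delays = np.array([0.0, 7, 30, 90, 180, 365])        # days
        values = np.array([1.0, 0.9, 0.75, 0.6, 0.45, 0.3])  # fraction of $50

        # Myerson-style AUC: normalize the delay axis to [0, 1], then apply
        # the trapezoid rule; smaller AUC = steeper discounting.
        x = delays / delays.max()
        auc = np.sum((x[1:] - x[:-1]) * (values[1:] + values[:-1]) / 2.0)
        print(auc)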

  19. Guided waves based SHM systems for composites structural elements: statistical analyses finalized at probability of detection definition and assessment

    Science.gov (United States)

    Monaco, E.; Memmolo, V.; Ricci, F.; Boffa, N. D.; Maio, L.

    2015-03-01

    Maintenance approaches based on sensorised structures and Structural Health Monitoring (SHM) systems have for many years represented one of the most promising innovations in the field of aerostructures, especially where composite materials (fibre-reinforced resins) are concerned. Layered materials still suffer today from drastic reductions of the maximum allowable stress values during the design phase, as well as from costly and recurrent inspections during the life cycle, which prevent their structural and economic potential from being fully exploited in today's aircraft. Those penalizing measures are necessary mainly to account for the presence of undetected hidden flaws within the layered sequence (delaminations) or in bonded areas (partial disbonding). In order to relax design and maintenance constraints, a system based on sensors permanently installed on the structure to detect and locate possible flaws (an SHM system) can be considered, once its effectiveness and reliability have been statistically demonstrated via a rigorous Probability of Detection (POD) function definition and evaluation. This paper presents an experimental approach with a statistical procedure for the evaluation of the detection threshold of a guided-wave-based SHM system oriented to delamination detection on a typical composite layered wing panel. The experimental tests are mostly oriented to characterizing the statistical distribution of measurements and damage metrics, as well as to characterizing the system detection capability using this approach. Numerically it is not possible to substitute the part of the experimental tests aimed at POD where the noise in the system response is crucial. Results of the experiments are presented in the paper and analyzed.
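
    As a hedged Python sketch of how hit/miss trials turn into a POD curve (the logistic hit/miss model, the synthetic flaw sizes and the a90 readout are common POD conventions assumed here, not the paper's procedure):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(3)
        # Synthetic hit/miss trials: delamination size (mm) versus whether
        # the guided-wave damage metric exceeded its detection threshold.
        size = rng.uniform(2, 40, 400)
        p_true = 1 / (1 + np.exp(-0.35 * (size - 12)))
        hit = (rng.random(400) < p_true).astype(int)

        pod = LogisticRegression().fit(size.reshape(-1, 1), hit)

        # a90: smallest flaw size detected with at least 90% probability.
        grid = np.linspace(2, 40, 1000).reshape(-1, 1)
        a90 = grid[pod.predict_proba(grid)[:, 1] >= 0.9][0, 0]
        print(a90)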

  20. Flood probability quantification for road infrastructure: Data-driven spatial-statistical approach and case study applications.

    Science.gov (United States)

    Kalantari, Zahra; Cavalli, Marco; Cantone, Carolina; Crema, Stefano; Destouni, Georgia

    2017-03-01

    Climate-driven increase in the frequency of extreme hydrological events is expected to impose greater strain on the built environment and major transport infrastructure, such as roads and railways. This study develops a data-driven spatial-statistical approach to quantifying and mapping the probability of flooding at critical road-stream intersection locations, where water flow and sediment transport may accumulate and cause serious road damage. The approach is based on novel integration of key watershed and road characteristics, including also measures of sediment connectivity. The approach is concretely applied to and quantified for two specific case study examples in southwest Sweden, with documented road flooding effects of recorded extreme rainfall. The novel contributions of this study in combining a sediment connectivity account with that of soil type, land use, spatial precipitation-runoff variability and road drainage in catchments, and in extending the use of the connectivity measure to different types of catchments, improve the accuracy of model results for road flood probability. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Relationship between the generalized equivalent uniform dose formulation and the Poisson statistics-based tumor control probability model

    International Nuclear Information System (INIS)

    Zhou Sumin; Das, Shiva; Wang Zhiheng; Marks, Lawrence B.

    2004-01-01

    The generalized equivalent uniform dose (GEUD) model uses a power-law formalism, where the outcome is related to the dose via a power law. We herein investigate the mathematical compatibility between this GEUD model and the Poisson statistics-based tumor control probability (TCP) model. The GEUD and TCP formulations are combined and subjected to a compatibility constraint equation. This compatibility constraint equates the tumor control probability from the original heterogeneous target dose distribution to that from the homogeneous dose of the GEUD formalism. It is shown that this constraint equation possesses a unique, analytical closed-form solution which relates radiation dose to the tumor cell survival fraction. It is further demonstrated that, when there is no positive threshold or finite critical dose in the tumor response to radiation, this relationship is not bounded within the realistic cell survival limits of 0%-100%. Thus, the GEUD and TCP formalisms are, in general, mathematically inconsistent. However, when a threshold dose or finite critical dose exists in the tumor response to radiation, there is a unique mathematical solution for the tumor cell survival fraction that allows the GEUD and TCP formalisms to coexist, provided that all portions of the tumor are confined within certain specific dose ranges.
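
    In a plausible reconstruction of the notation (v_i the fractional volume receiving dose D_i, N_0 the initial clonogen number, SF the cell survival fraction; an interpretation, not a quotation of the paper), the two formalisms and the compatibility constraint read:

        \mathrm{gEUD} = \Bigl(\sum_i v_i D_i^{a}\Bigr)^{1/a},
        \qquad
        \mathrm{TCP} = \exp\Bigl(-N_0 \sum_i v_i \, \mathrm{SF}(D_i)\Bigr)

        \exp\Bigl(-N_0 \sum_i v_i \, \mathrm{SF}(D_i)\Bigr)
        = \exp\bigl(-N_0 \, \mathrm{SF}(\mathrm{gEUD})\bigr)
        \;\Longleftrightarrow\;
        \sum_i v_i \, \mathrm{SF}(D_i) = \mathrm{SF}(\mathrm{gEUD})

    Requiring this equality to hold for arbitrary dose distributions pins down SF analytically, which is the closed-form solution the abstract refers to.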

  2. A Study of Students' Learning Styles, Discipline Attitudes and Knowledge Acquisition in Technology-Enhanced Probability and Statistics Education

    Science.gov (United States)

    Christou, Nicolas; Dinov, Ivo D.

    2011-01-01

    Many modern technological advances have direct impact on the format, style and efficacy of delivery and consumption of educational content. For example, various novel communication and information technology tools and resources enable efficient, timely, interactive and graphical demonstrations of diverse scientific concepts. In this manuscript, we report on a meta-study of 3 controlled experiments of using the Statistics Online Computational Resources (SOCR) in probability and statistics courses. Web-accessible SOCR applets, demonstrations, simulations and virtual experiments were used in different courses as treatment and compared to matched control classes utilizing traditional pedagogical approaches. The qualitative and quantitative data collected for all courses included the Felder-Silverman-Soloman index of learning styles, a background assessment, pre- and post-surveys of attitude towards the subject, an end-point satisfaction survey, and varieties of quiz, laboratory and test scores. Our findings indicate that students' learning styles and attitudes towards a discipline may be important confounds of their final quantitative performance. The observed positive effects of integrating information technology with established pedagogical techniques may be valid across disciplines within the broader spectrum of courses in the science education curriculum. The two critical components of improving science education via blended instruction include instructor training, and development of appropriate activities, simulations and interactive resources. PMID:21603097

  3. A statistical approach to estimating effects of performance shaping factors on human error probabilities of soft controls

    International Nuclear Information System (INIS)

    Kim, Yochan; Park, Jinkyun; Jung, Wondea; Jang, Inseok; Hyun Seong, Poong

    2015-01-01

    Despite recent efforts toward data collection for supporting human reliability analysis, there remains a lack of empirical basis for determining the effects of performance shaping factors (PSFs) on human error probabilities (HEPs). To enhance the empirical basis regarding the effects of the PSFs, a statistical methodology using logistic regression and stepwise variable selection was proposed, and the effects of the PSFs on HEPs related to soft controls were estimated with this methodology. For this estimation, more than 600 human error opportunities related to soft controls in a computerized control room were obtained through laboratory experiments. From the eight PSF surrogates and combinations of these variables, the procedure quality, practice level, and operation type were identified as significant factors for screen switch and mode conversion errors. The contributions of these significant factors to HEPs were also estimated in terms of a multiplicative form. The usefulness and limitations of the experimental data and the techniques employed are discussed herein, and we believe that the logistic regression and stepwise variable selection methods will provide a way to estimate the effects of PSFs on HEPs in an objective manner. - Highlights: • It is necessary to develop an empirical basis for the effects of the PSFs on the HEPs. • A statistical method using a logistic regression and variable selection was proposed. • The effects of PSFs on the HEPs of soft controls were empirically investigated. • The significant factors were identified and their effects were estimated
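
    A hedged Python sketch of logistic regression with forward stepwise selection by AIC, reporting effects as multiplicative odds ratios (statsmodels is assumed; the PSF names, effect sizes and synthetic data are stand-ins, not the study's data):

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 600   # of the order of the error opportunities mentioned above
        # Binary surrogates for candidate PSFs (names are assumptions).
        psfs = {
            "procedure_quality": rng.integers(0, 2, n),
            "practice_level":    rng.integers(0, 2, n),
            "operation_type":    rng.integers(0, 2, n),
            "time_pressure":     rng.integers(0, 2, n),   # irrelevant here
        }
        logit = (-3.0 + 1.2 * psfs["procedure_quality"]
                 + 0.8 * psfs["practice_level"] + 0.9 * psfs["operation_type"])
        error = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

        def design(names):
            return sm.add_constant(np.column_stack([psfs[v] for v in names]))

        # Forward stepwise selection by AIC.
        selected, remaining, best_aic = [], list(psfs), np.inf
        while remaining:
            aic = {v: sm.Logit(error, design(selected + [v])).fit(disp=0).aic
                   for v in remaining}
            best = min(aic, key=aic.get)
            if aic[best] >= best_aic:
                break
            best_aic = aic[best]
            selected.append(best)
            remaining.remove(best)

        fit = sm.Logit(error, design(selected)).fit(disp=0)
        # Multiplicative effect of each selected PSF on the error odds.
        print(selected, np.exp(fit.params[1:]))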

  4. A Study of Students' Learning Styles, Discipline Attitudes and Knowledge Acquisition in Technology-Enhanced Probability and Statistics Education.

    Science.gov (United States)

    Christou, Nicolas; Dinov, Ivo D

    2010-09-01

    Many modern technological advances have direct impact on the format, style and efficacy of delivery and consumption of educational content. For example, various novel communication and information technology tools and resources enable efficient, timely, interactive and graphical demonstrations of diverse scientific concepts. In this manuscript, we report on a meta-study of 3 controlled experiments of using the Statistics Online Computational Resources (SOCR) in probability and statistics courses. Web-accessible SOCR applets, demonstrations, simulations and virtual experiments were used in different courses as treatment and compared to matched control classes utilizing traditional pedagogical approaches. The qualitative and quantitative data collected for all courses included the Felder-Silverman-Soloman index of learning styles, a background assessment, pre- and post-surveys of attitude towards the subject, an end-point satisfaction survey, and varieties of quiz, laboratory and test scores. Our findings indicate that students' learning styles and attitudes towards a discipline may be important confounds of their final quantitative performance. The observed positive effects of integrating information technology with established pedagogical techniques may be valid across disciplines within the broader spectrum of courses in the science education curriculum. The two critical components of improving science education via blended instruction include instructor training, and development of appropriate activities, simulations and interactive resources.

  5. Evaluation of forensic DNA mixture evidence: protocol for evaluation, interpretation, and statistical calculations using the combined probability of inclusion.

    Science.gov (United States)

    Bieber, Frederick R; Buckleton, John S; Budowle, Bruce; Butler, John M; Coble, Michael D

    2016-08-31

    The evaluation and interpretation of forensic DNA mixture evidence faces greater interpretational challenges due to increasingly complex mixture evidence. Such challenges include: casework involving low quantity or degraded evidence leading to allele and locus dropout; allele sharing of contributors leading to allele stacking; and differentiation of PCR stutter artifacts from true alleles. There is variation in the statistical approaches used to evaluate the strength of the evidence when inclusion of a specific known individual(s) is determined, and the approaches used must be supportable. There are concerns that methods utilized for interpretation of complex forensic DNA mixtures may not be implemented properly in some casework. Similar questions are being raised in a number of U.S. jurisdictions, leading to some confusion about mixture interpretation for current and previous casework. Key elements necessary for the interpretation and statistical evaluation of forensic DNA mixtures are described. The most common method for statistical evaluation of DNA mixtures in many parts of the world, including the USA, is the Combined Probability of Inclusion/Exclusion (CPI/CPE); exposition and elucidation of this method and a protocol for its use are the focus of this article. Formulae and other supporting materials are provided. Guidance and details of a DNA mixture interpretation protocol are provided for application of the CPI/CPE method in the analysis of more complex forensic DNA mixtures. This description, in turn, should help reduce the variability of interpretation with application of this methodology and thereby improve the quality of DNA mixture interpretation throughout the forensic community.
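
    The CPI/CPE arithmetic itself is compact: at each locus the inclusion probability is the squared sum of the frequencies of all alleles observed in the mixture, and CPI is the product over loci. A small Python sketch (locus names and allele frequencies are illustrative, not from the article):

        # Frequencies of the alleles observed in the mixture at each locus.
        mixture = {
            "D8S1179": [0.18, 0.11, 0.09],
            "D21S11":  [0.25, 0.16],
            "TH01":    [0.30, 0.22, 0.08],
        }

        cpi = 1.0
        for locus, freqs in mixture.items():
            # Probability a random person carries only observed alleles.
            cpi *= sum(freqs) ** 2

        print("CPI =", cpi, "CPE =", 1 - cpi)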

  6. CAN'T MISS--conquer any number task by making important statistics simple. Part 2. Probability, populations, samples, and normal distributions.

    Science.gov (United States)

    Hansen, John P

    2003-01-01

    Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 2, describes probability, populations, and samples. The uses of descriptive and inferential statistics are outlined. The article also discusses the properties and probability of normal distributions, including the standard normal distribution.

  7. Statistics concerning the Apollo command module water landing, including the probability of occurrence of various impact conditions, sucessful impact, and body X-axis loads

    Science.gov (United States)

    Whitnah, A. M.; Howes, D. B.

    1971-01-01

    Statistical information for the Apollo command module water landings is presented. This information includes the probability of occurrence of various impact conditions, a successful impact, and body X-axis loads of various magnitudes.

  8. Contingency proportion systematically influences contingency learning.

    Science.gov (United States)

    Forrin, Noah D; MacLeod, Colin M

    2018-01-01

    In the color-word contingency learning paradigm, each word appears more often in one color (high contingency) than in the other colors (low contingency). Shortly after beginning the task, color identification responses become faster on the high-contingency trials than on the low-contingency trials: the contingency learning effect. Across five groups, we varied the high-contingency proportion in 10% steps, from 80% to 40%. The size of the contingency learning effect was positively related to high-contingency proportion, with the effect disappearing when high contingency was reduced to 40%. At the two highest contingency proportions, the magnitude of the effect increased over trials, the pattern suggesting that there was an increasing cost for the low-contingency trials rather than an increasing benefit for the high-contingency trials. Overall, the results fit a modified version of Schmidt's (2013, Acta Psychologica, 142, 119-126) parallel episodic processing account in which prior trial instances are routinely retrieved from memory and influence current trial performance.

  9. A new framework of statistical inferences based on the valid joint sampling distribution of the observed counts in an incomplete contingency table.

    Science.gov (United States)

    Tian, Guo-Liang; Li, Hui-Qiong

    2017-08-01

    Some existing confidence interval methods and hypothesis testing methods in the analysis of a contingency table with incomplete observations in both margins entirely depend on an underlying assumption that the sampling distribution of the observed counts is a product of independent multinomial/binomial distributions for complete and incomplete counts. However, it can be shown that this independency assumption is incorrect and can result in unreliable conclusions because of the under-estimation of the uncertainty. Therefore, the first objective of this paper is to derive the valid joint sampling distribution of the observed counts in a contingency table with incomplete observations in both margins. The second objective is to provide a new framework for analyzing incomplete contingency tables based on the derived joint sampling distribution of the observed counts by developing a Fisher scoring algorithm to calculate maximum likelihood estimates of parameters of interest, the bootstrap confidence interval methods, and the bootstrap testing hypothesis methods. We compare the differences between the valid sampling distribution and the sampling distribution under the independency assumption. Simulation studies showed that average/expected confidence-interval widths of parameters based on the sampling distribution under the independency assumption are shorter than those based on the new sampling distribution, yielding unrealistic results. A real data set is analyzed to illustrate the application of the new sampling distribution for incomplete contingency tables and the analysis results again confirm the conclusions obtained from the simulation studies.
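
    The paper's contribution is the valid joint sampling distribution itself, which is not reproduced here; purely as context for the bootstrap machinery it builds on, a Python sketch of a parametric-bootstrap percentile confidence interval for an odds ratio from a complete 2x2 table (counts are illustrative):

        import numpy as np

        rng = np.random.default_rng(5)
        table = np.array([[30, 20],
                          [15, 35]])          # a complete 2x2 table
        n = table.sum()
        probs = (table / n).ravel()            # multinomial MLE

        def odds_ratio(t):
            return (t[0, 0] * t[1, 1]) / (t[0, 1] * t[1, 0])

        # Resample full tables from the fitted multinomial, then take
        # percentile limits of the bootstrap odds ratios.
        boots = []
        for _ in range(5000):
            t = rng.multinomial(n, probs).reshape(2, 2)
            if t[0, 1] > 0 and t[1, 0] > 0:   # avoid division by zero
                boots.append(odds_ratio(t))
        print(np.percentile(boots, [2.5, 97.5]))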

  10. Estimating survival probabilities by exposure levels: utilizing vital statistics and complex survey data with mortality follow-up.

    Science.gov (United States)

    Landsman, V; Lou, W Y W; Graubard, B I

    2015-05-20

    We present a two-step approach for estimating hazard rates and, consequently, survival probabilities, by levels of general categorical exposure. The resulting estimator utilizes three sources of data: vital statistics data and census data are used at the first step to estimate the overall hazard rate for a given combination of gender and age group, and cohort data, constructed from a nationally representative complex survey with linked mortality records, are used at the second step to divide the overall hazard rate by exposure levels. We present an explicit expression for the resulting estimator and consider two methods for variance estimation that account for complex multistage sample design: (1) the leaving-one-out jackknife method, and (2) the Taylor linearization method, which provides an analytic formula for the variance estimator. The methods are illustrated with smoking and all-cause mortality data from the US National Health Interview Survey Linked Mortality Files, and the proposed estimator is compared with a previously studied crude hazard rate estimator that uses survey data only. The advantages of a two-step approach and possible extensions of the proposed estimator are discussed. Copyright © 2015 John Wiley & Sons, Ltd.
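
    A minimal Python sketch of the two-step logic under a constant-hazard assumption (the hazard ratio and prevalence values are illustrative; the paper's estimator and its variance methods are more general):

        import math

        # Step 1 (vital statistics + census): overall annual hazard for one
        # gender-age group, e.g. deaths per person-year.
        overall_hazard = 0.012

        # Step 2 (survey with mortality follow-up): split the overall hazard
        # by exposure level, here smoking, using the survey-estimated hazard
        # ratio and exposure prevalence.
        hr_smokers = 2.1          # hazard ratio, smokers vs non-smokers
        p_smoker = 0.25           # exposure prevalence in the group

        # overall = p * HR * h0 + (1 - p) * h0  =>  solve for h0.
        h_nonsmoker = overall_hazard / ((1 - p_smoker) + p_smoker * hr_smokers)
        h_smoker = hr_smokers * h_nonsmoker

        # Ten-year survival probabilities by exposure level.
        for label, h in [("non-smoker", h_nonsmoker), ("smoker", h_smoker)]:
            print(label, math.exp(-10 * h))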

  11. Automated segmentation of ultrasonic breast lesions using statistical texture classification and active contour based on probability distance.

    Science.gov (United States)

    Liu, Bo; Cheng, H D; Huang, Jianhua; Tian, Jiawei; Liu, Jiafeng; Tang, Xianglong

    2009-08-01

    Because of its complicated structure, low signal/noise ratio, low contrast and blurry boundaries, fully automated segmentation of a breast ultrasound (BUS) image is a difficult task. In this paper, a novel segmentation method for BUS images without human intervention is proposed. Unlike most published approaches, the proposed method handles the segmentation problem by using a two-step strategy: ROI generation and ROI segmentation. First, a well-trained texture classifier categorizes the tissues into different classes, and the background knowledge rules are used for selecting the regions of interest (ROIs) from them. Second, a novel probability distance-based active contour model is applied for segmenting the ROIs and finding the accurate positions of the breast tumors. The active contour model combines both global statistical information and local edge information, using a level set approach. The proposed segmentation method was performed on 103 BUS images (48 benign and 55 malignant). To validate the performance, the results were compared with the corresponding tumor regions marked by an experienced radiologist. Three error metrics, true-positive ratio (TP), false-negative ratio (FN) and false-positive ratio (FP) were used for measuring the performance of the proposed method. The final results (TP = 91.31%, FN = 8.69% and FP = 7.26%) demonstrate that the proposed method can segment BUS images efficiently, quickly and automatically.
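
    For clarity, the three error metrics can be computed from boolean masks as follows (a generic Python sketch; expressing the ratios relative to the radiologist-marked tumour area is one common convention, assumed here):

        import numpy as np

        def segmentation_metrics(auto_mask, ref_mask):
            """TP, FN and FP ratios of an automatic segmentation against a
            reference mask, relative to the reference tumour area."""
            ref_area = ref_mask.sum()
            tp = (auto_mask & ref_mask).sum() / ref_area
            fn = (~auto_mask & ref_mask).sum() / ref_area
            fp = (auto_mask & ~ref_mask).sum() / ref_area
            return tp, fn, fp

        # Toy masks standing in for segmented and radiologist-marked regions.
        auto = np.array([[0, 1, 1], [0, 1, 1], [0, 0, 0]], dtype=bool)
        ref  = np.array([[0, 1, 1], [0, 1, 0], [0, 1, 0]], dtype=bool)
        print(segmentation_metrics(auto, ref))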

  12. Contingent and Alternative Work Arrangements, Defined.

    Science.gov (United States)

    Polivka, Anne E.

    1996-01-01

    Discusses the definitions of contingent workers and alternative work arrangements used by the Bureau of Labor Statistics to analyze data, and presents aggregate estimates of the number of workers in each group. Discusses the overlap between contingent workers and workers in alternative arrangements. (Author/JOW)

  13. A simplification of the likelihood ratio test statistic for testing ...

    African Journals Online (AJOL)

    The traditional likelihood ratio test statistic for testing hypotheses about the goodness of fit of multinomial probabilities in one-, two- and multi-dimensional contingency tables was simplified. Advantageously, using the simplified version of the statistic to test the null hypothesis is easier and faster because calculating the expected ...
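
    For a concrete instance, the likelihood ratio (G-test) statistic for a contingency table is available in SciPy; an illustrative Python call (the table values are made up):

        import numpy as np
        from scipy.stats import chi2_contingency

        table = np.array([[25, 15, 10],
                          [20, 30, 20]])

        # lambda_="log-likelihood" returns the likelihood ratio statistic
        # G^2 = 2 * sum(O * ln(O / E)) instead of Pearson's chi-square.
        g2, p, dof, expected = chi2_contingency(table, lambda_="log-likelihood")
        print(g2, p, dof)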

  14. Introduction to Statistics - eNotes

    DEFF Research Database (Denmark)

    Brockhoff, Per B.; Møller, Jan Kloppenborg; Andersen, Elisabeth Wreford

    2015-01-01

    Online textbook used in the introductory statistics courses at DTU. It provides a basic introduction to applied statistics for engineers. The necessary elements from probability theory are introduced (stochastic variable, density and distribution function, mean and variance, etc.) and thereafter the most basic statistical analysis methods are presented: confidence band, hypothesis testing, simulation, simple and multiple regression, ANOVA and analysis of contingency tables. Examples with the software R are included for all presented theory and methods.

  15. Statistical tests for whether a given set of independent, identically distributed draws comes from a specified probability density.

    Science.gov (United States)

    Tygert, Mark

    2010-09-21

    We discuss several tests for determining whether a given set of independent and identically distributed (i.i.d.) draws does not come from a specified probability density function. The most commonly used are Kolmogorov-Smirnov tests, particularly Kuiper's variant, which focus on discrepancies between the cumulative distribution function for the specified probability density and the empirical cumulative distribution function for the given set of i.i.d. draws. Unfortunately, variations in the probability density function often get smoothed over in the cumulative distribution function, making it difficult to detect discrepancies in regions where the probability density is small in comparison with its values in surrounding regions. We discuss tests without this deficiency, complementing the classical methods. The tests of the present paper are based on the plain fact that it is unlikely to draw a random number whose probability is small, provided that the draw is taken from the same distribution used in calculating the probability (thus, if we draw a random number whose probability is small, then we can be confident that we did not draw the number from the same distribution used in calculating the probability).
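
    The core idea lends itself to a short Monte Carlo implementation in Python (the null distribution, sample size and minimum-density statistic are illustrative choices consistent with, but not copied from, the paper):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(11)
        null = stats.norm(0.0, 1.0)               # the specified density
        draws = rng.standard_t(df=3, size=200)    # actually heavier-tailed

        # Statistic: the smallest density value among the draws; under the
        # null it is rarely tiny. Monte Carlo gives its null distribution.
        observed = null.pdf(draws).min()
        sims = np.array([null.pdf(null.rvs(size=200, random_state=r)).min()
                         for r in range(2000)])
        p_value = (sims <= observed).mean()
        print(p_value)   # flags low-density draws that CDF-based tests
                         # (Kolmogorov-Smirnov, Kuiper) can smooth over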

  16. Detection and Classification of Low Probability of Intercept Radar Signals Using Parallel Filter Arrays and Higher Order Statistics

    National Research Council Canada - National Science Library

    Taboada, Fernando

    2002-01-01

    Low probability of intercept (LPI) is that property of an emitter that because of its low power, wide bandwidth, frequency variability, or other design attributes, makes it difficult to be detected or identified by means of passive...

  17. Test the Overall Significance of p-values by Using Joint Tail Probability of Ordered p-values as Test Statistic

    NARCIS (Netherlands)

    Fang, Yongxiang; Wit, Ernst

    2008-01-01

    Fisher's combined probability test is the most commonly used method to test the overall significance of a set of independent p-values. However, it is obvious that Fisher's statistic is more sensitive to smaller p-values than to larger p-values, and a small p-value may overrule the other p-values
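
    For reference, Fisher's method in Python (a standard construction, not code from the paper): the statistic -2 * sum(ln p_i) is chi-square distributed with 2k degrees of freedom under the global null of k independent tests.

        import numpy as np
        from scipy.stats import chi2

        def fisher_combined(pvalues):
            p = np.asarray(pvalues, dtype=float)
            stat = -2.0 * np.log(p).sum()
            return stat, chi2.sf(stat, df=2 * len(p))

        # One small p-value dominates the combination - the sensitivity
        # criticized above.
        print(fisher_combined([0.001, 0.6, 0.7, 0.8]))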

  18. Classroom Research: Assessment of Student Understanding of Sampling Distributions of Means and the Central Limit Theorem in Post-Calculus Probability and Statistics Classes

    Science.gov (United States)

    Lunsford, M. Leigh; Rowell, Ginger Holmes; Goodson-Espy, Tracy

    2006-01-01

    We applied a classroom research model to investigate student understanding of sampling distributions of sample means and the Central Limit Theorem in post-calculus introductory probability and statistics courses. Using a quantitative assessment tool developed by previous researchers and a qualitative assessment tool developed by the authors, we…

  19. SOME ASPECTS OF THE USE OF MATHEMATICAL-STATISTICAL METHODS IN THE ANALYSIS OF SOCIO-HUMANISTIC TEXTS (keywords: humanities and social text, mathematics, method, statistics, probability)

    Directory of Open Access Journals (Sweden)

    Zaira M Alieva

    2016-01-01

    The article analyzes the application of mathematical and statistical methods in the analysis of socio-humanistic texts. It describes the essence of mathematical and statistical methods and presents examples of their use in the study of humanities and social phenomena. It considers the key issues faced by the expert in the application of mathematical-statistical methods in the socio-humanitarian sphere, including the persistent contrast between the socio-humanitarian sciences and mathematics, the complexity of delineating the object that is the bearer of the problem, and the need for a probabilistic approach. Conclusions are drawn from the results of the study.

  20. What subject matter questions motivate the use of machine learning approaches compared to statistical models for probability prediction?

    Science.gov (United States)

    Binder, Harald

    2014-07-01

    This is a discussion of the following papers: "Probability estimation with machine learning methods for dichotomous and multicategory outcome: Theory" by Jochen Kruppa, Yufeng Liu, Gérard Biau, Michael Kohler, Inke R. König, James D. Malley, and Andreas Ziegler; and "Probability estimation with machine learning methods for dichotomous and multicategory outcome: Applications" by Jochen Kruppa, Yufeng Liu, Hans-Christian Diener, Theresa Holste, Christian Weimar, Inke R. König, and Andreas Ziegler. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Hanford Facility contingency plan

    International Nuclear Information System (INIS)

    Sutton, L.N.; Miskho, A.G.; Brunke, R.C.

    1993-10-01

    The Hanford Facility Contingency Plan, together with each TSD unit-specific contingency plan, meets the WAC 173-303 requirements for a contingency plan. This plan includes descriptions of responses to a nonradiological hazardous materials spill or release at Hanford Facility locations not covered by TSD unit-specific contingency plans or building emergency plans. This plan includes descriptions of responses for spills or releases as a result of transportation activities, movement of materials, packaging, and storage of hazardous materials

  2. Statistical methods to quantify the effect of mite parasitism on the probability of death in honey bee colonies

    Science.gov (United States)

    Varroa destructor is a mite parasite of European honey bees, Apis mellifera, that weakens the population, can lead to the death of an entire honey bee colony, and is believed to be the parasite with the most economic impact on beekeeping. The purpose of this study was to estimate the probability of ...

  3. Sampling stored product insect pests: a comparison of four statistical sampling models for probability of pest detection

    Science.gov (United States)

    Statistically robust sampling strategies form an integral component of grain storage and handling activities throughout the world. Developing sampling strategies to target biological pests such as insects in stored grain is inherently difficult due to species biology and behavioral characteristics. ...

  4. Interference statistics and capacity analysis for uplink transmission in two-tier small cell networks: A geometric probability approach

    KAUST Repository

    Tabassum, Hina; Dawy, Zaher; Hossain, Ekram; Alouini, Mohamed-Slim

    2014-01-01

    This paper presents a novel framework to derive the statistics of the interference considering dedicated and shared spectrum access for uplink transmission in two-tier small cell networks such as the macrocell-femtocell networks. The framework

  5. A scan statistic for binary outcome based on hypergeometric probability model, with an application to detecting spatial clusters of Japanese encephalitis.

    Science.gov (United States)

    Zhao, Xing; Zhou, Xiao-Hua; Feng, Zijian; Guo, Pengfei; He, Hongyan; Zhang, Tao; Duan, Lei; Li, Xiaosong

    2013-01-01

    As a useful tool for geographical cluster detection of events, the spatial scan statistic is widely applied in many fields and plays an increasingly important role. The classic version of the spatial scan statistic for a binary outcome was developed by Kulldorff, based on the Bernoulli or the Poisson probability model. In this paper, we apply the hypergeometric probability model to construct the likelihood function under the null hypothesis. Compared with existing methods, the likelihood function under the null hypothesis is an alternative and indirect way to identify the potential cluster, and the test statistic is the extreme value of the likelihood function. As in Kulldorff's method, we adopt a Monte Carlo test for the test of significance. Both methods are applied to detect spatial clusters of Japanese encephalitis in Sichuan province, China, in 2009, and the detected clusters are identical. A simulation on independent benchmark data indicates that the test statistic based on the hypergeometric model outperforms Kulldorff's statistics for clusters of high population density or large size; otherwise Kulldorff's statistics are superior.
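
    A Python sketch of the hypergeometric null likelihood for one candidate window, with a Monte Carlo significance check (population and case counts are invented; a full scan would take the extreme value of such terms over all windows):

        import numpy as np
        from scipy.stats import hypergeom

        N_pop, n_cases = 100_000, 300   # region population, total cases
        n_win, c_win = 8_000, 60        # window population, window cases

        # Negative log-likelihood of the observed split under the
        # hypergeometric null of no clustering.
        stat = -hypergeom.logpmf(c_win, N_pop, n_cases, n_win)

        # Monte Carlo significance: redistribute the cases at random.
        rng = np.random.default_rng(2)
        sims = -hypergeom.logpmf(
            rng.hypergeometric(n_win, N_pop - n_win, n_cases, 5000),
            N_pop, n_cases, n_win)
        print(stat, float((sims >= stat).mean()))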

  6. Test the Overall Significance of p-values by Using Joint Tail Probability of Ordered p-values as Test Statistic

    OpenAIRE

    Fang, Yongxiang; Wit, Ernst

    2008-01-01

    Fisher's combined probability test is the most commonly used method to test the overall significance of a set of independent p-values. However, it is obvious that Fisher's statistic is more sensitive to smaller p-values than to larger p-values, and a small p-value may overrule the other p-values and decide the test result. This is, in some cases, viewed as a flaw. In order to overcome this flaw and improve the power of the test, the joint tail probability of a set of p-values is proposed as a ...

  7. Graft rejection episodes after Descemet stripping with endothelial keratoplasty: part two: the statistical analysis of probability and risk factors.

    Science.gov (United States)

    Price, M O; Jordan, C S; Moore, G; Price, F W

    2009-03-01

    To investigate risk factors and probability of initial immunological graft rejection episodes after Descemet stripping with endothelial keratoplasty (DSEK). Outcomes of 598 DSEK cases from a single tertiary referral centre were reviewed. Risk factors and probability of rejection were assessed by multivariate Cox proportional hazards modelling. Rejection episodes occurred in 54 eyes of 48 patients. Estimated probability of a rejection episode was 7.6% by 1 year and 12% by 2 years after grafting. Relative risk of rejection was five times higher for African-American patients compared with Caucasians (p = 0.0002). Eyes with pre-existing glaucoma (9%) or steroid-responsive ocular hypertension (27%) had twice the relative risk of rejection (p = 0.045) compared with eyes that did not have those problems. Patient age, sex and corneal diagnosis did not significantly influence rejection risk. Risk of rejection was not increased when fellow eyes were grafted within 1 year of the first eye (p = 0.62). Pre-existing glaucoma or steroid-responsive ocular hypertension and race were the two factors that independently influenced relative risk of rejection after DSEK. Rejection risk was not increased if the fellow eye was grafted within the prior year with DSEK.
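
    A hedged Python sketch of the multivariate Cox modelling step using the lifelines library (covariate coding, effect sizes and the synthetic follow-up scheme are assumptions, not the study's data):

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(9)
        n = 598
        african_american = rng.integers(0, 2, n)
        glaucoma_or_oht = rng.integers(0, 2, n)
        # Exponential event times with proportional hazards (illustrative).
        hazard = 0.03 * np.exp(1.6 * african_american + 0.7 * glaucoma_or_oht)
        event_time = rng.exponential(1.0 / hazard)
        follow_up = rng.uniform(0.5, 3.0, n)             # years observed

        df = pd.DataFrame({
            "T": np.minimum(event_time, follow_up),
            "E": (event_time <= follow_up).astype(int),  # 1 = rejection seen
            "african_american": african_american,
            "glaucoma_or_oht": glaucoma_or_oht,
        })

        cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
        print(cph.hazard_ratios_)    # multivariate relative risks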

  8. Natural analogue study of CO2 storage monitoring using probability statistics of CO2-rich groundwater chemistry

    Science.gov (United States)

    Kim, K. K.; Hamm, S. Y.; Kim, S. O.; Yun, S. T.

    2016-12-01

    For confronting global climate change, carbon capture and storage (CCS) is one of several very useful strategies, capturing greenhouse gases such as CO2 emitted from stacks and then isolating the gases in underground geologic storage. CO2-rich groundwater could be produced by CO2 dissolution into fresh groundwater around a CO2 storage site. As a consequence, natural analogue studies related to geologic storage provide insights into future geologic CO2 storage sites and can provide crucial information on the safety and security of geologic sequestration, the long-term impact of CO2 storage on the environment, and the field operation and monitoring that could be implemented for geologic sequestration. In this study, we developed a CO2 leakage monitoring method using probability density functions (PDFs) by characterizing naturally occurring CO2-rich groundwater. For the study, we used existing data on CO2-rich groundwaters in different geological regions (Gangwondo, Gyeongsangdo, and Choongchungdo provinces) of South Korea. Using the PDF method and a quantitative index (QI), we carried out qualitative and quantitative comparisons among local areas and chemical constituents. Geochemical properties of groundwater with/without CO2, expressed as PDFs, proved that pH, EC, TDS, HCO3-, Ca2+, Mg2+, and SiO2 are effective monitoring parameters for carbonated groundwater in the case of CO2 leakage from an underground storage site. KEY WORDS: CO2-rich groundwater, CO2 storage site, monitoring parameter, natural analogue, probability density function (PDF), QI (quantitative index). Acknowledgement: This study was supported by the "Basic Science Research Program through the National Research Foundation of Korea (NRF), which is funded by the Ministry of Education (NRF-2013R1A1A2058186)" and the "R&D Project on Environmental Management of Geologic CO2 Storage" from KEITI (Project number: 2014001810003).

  9. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  10. DETERMINING TYPE Ia SUPERNOVA HOST GALAXY EXTINCTION PROBABILITIES AND A STATISTICAL APPROACH TO ESTIMATING THE ABSORPTION-TO-REDDENING RATIO R_V

    Energy Technology Data Exchange (ETDEWEB)

    Cikota, Aleksandar [European Southern Observatory, Karl-Schwarzschild-Strasse 2, D-85748 Garching b. München (Germany); Deustua, Susana [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Marleau, Francine, E-mail: acikota@eso.org [Institute for Astro- and Particle Physics, University of Innsbruck, Technikerstrasse 25/8, A-6020 Innsbruck (Austria)

    2016-03-10

    We investigate limits on the extinction values of Type Ia supernovae (SNe Ia) to statistically determine the most probable color excess, E(B-V), with galactocentric distance, and use these statistics to determine the absorption-to-reddening ratio, R_V, for dust in the host galaxies. We determined pixel-based dust mass surface density maps for 59 galaxies from the Key Insight on Nearby Galaxies: a Far-infrared Survey with Herschel (KINGFISH). We use SN Ia spectral templates to develop a Monte Carlo simulation of color excess E(B-V) with R_V = 3.1 and investigate the color excess probabilities E(B-V) with projected radial galaxy center distance. Additionally, we tested our model using observed spectra of SN 1989B, SN 2002bo, and SN 2006X, which occurred in three KINGFISH galaxies. Finally, we determined the most probable reddening for Sa-Sap, Sab-Sbp, Sbc-Scp, Scd-Sdm, S0, and irregular galaxy classes as a function of R/R_25. We find that the largest expected reddening probabilities are in Sab-Sb and Sbc-Sc galaxies, while S0 and irregular galaxies are very dust poor. We present a new approach for determining the absorption-to-reddening ratio R_V using color excess probability functions and find values of R_V = 2.71 ± 1.58 for 21 SNe Ia observed in Sab-Sbp galaxies, and R_V = 1.70 ± 0.38 for 34 SNe Ia observed in Sbc-Scp galaxies.

  11. The DO-climate events are probably noise induced: statistical investigation of the claimed 1470 years cycle

    Directory of Open Access Journals (Sweden)

    P. D. Ditlevsen

    2007-01-01

    The significance of the apparent 1470-year cycle in the recurrence of the Dansgaard-Oeschger (DO) events, observed in the Greenland ice cores, is debated. Here we present statistical significance tests of this periodicity. The detection of a periodicity relies strongly on the accuracy of the dating of the DO events. Here we use both the new NGRIP GICC05 time scale, based on multi-parameter annual layer counting, and the GISP2 time scale, where the periodicity is most pronounced. For the NGRIP dating the recurrence times are indistinguishable from a random occurrence. This is also the case for the GISP2 dating, except when the DO9 event is omitted from the record.

  12. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
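
    A one-line Python illustration of the definition (numbers are invented): a Gaussian regression model for a strictly positive outcome assigns positive probability below zero, and that mass is the leakage.

        from scipy.stats import norm

        # Model M: predictive distribution Normal(1.5, 1) for an outcome
        # that evidence E says must be positive.
        leakage = norm.cdf(0, loc=1.5, scale=1.0)
        print(leakage)   # about 0.067 of M's mass sits on impossible events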

  13. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  14. Contingency planning: preparation of contingency plans

    DEFF Research Database (Denmark)

    Westergaard, J M

    2008-01-01

    The risk of introducing disease pathogens into a country and the spread of the agent within a country depend on a number of factors, including import controls, movement of animals and animal products and the biosecurity applied by livestock producers. An adequate contingency plan is an important instrument in the preparation for and the handling of an epidemic. The legislation of the European Union requires that all Member States draw up a contingency plan which specifies the national measures required to maintain a high level of awareness and preparedness and is to be implemented in the event of a disease outbreak...

  15. Statistics

    International Nuclear Information System (INIS)

    2005-01-01

    For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time-series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004.) The applied energy units and conversion coefficients are shown on the back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees

  16. Statistics

    International Nuclear Information System (INIS)

    2001-01-01

    For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  17. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g., Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  18. Statistics

    International Nuclear Information System (INIS)

    1999-01-01

    For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  19. No Win, No Fee: Some Economics of Contingent Legal Fees.

    OpenAIRE

    Gravelle, Hugh; Waterson, Michael

    1993-01-01

    This paper analyzes the effects on the litigation process of alternative contracts between plaintiffs and their lawyers. Three contracts are compared: normal (hourly) fees, contingent mark-up fees, and contingent share contracts. The focus is on the first two, with a recent change in English law governing legal fees providing the motivation. The influences of the contract type on the acceptance of settlement offers, the settlement probability, the accident probability, the demand for trials, and th...

  20. Statistics

    International Nuclear Information System (INIS)

    2003-01-01

    For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity (GWh), Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees on energy products.

  1. Statistics

    International Nuclear Information System (INIS)

    2004-01-01

    For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supply and total consumption of electricity (GWh), Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Excise taxes, precautionary stock fees and oil pollution fees.

  2. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources, and Energy taxes and precautionary stock fees on oil products.

  3. Probability and statistics with R

    CERN Document Server

    Ugarte, Maria Dolores; Arnholt, Alan T

    2008-01-01

    -Technometrics, May 2009, Vol. 51, No. 2. "The book is comprehensive and well written. The notation is clear and the mathematical derivations behind nontrivial equations and computational implementations are carefully explained. Rather than presenting a collection of R scripts together with a summary of relevant theoretical results, this book offers a well-balanced mix of theory, examples and R code." -Raquel Prado, University of California, Santa Cruz, The American Statistician, February 2009. "… an impressive book … Overall, this is a good reference book with comprehensive coverage of the details"

  4. Licensee safeguards contingency plans

    International Nuclear Information System (INIS)

    Anon.

    1978-01-01

    The Nuclear Regulatory Commission is amending its regulations to require that licensees authorized to operate a nuclear reactor (other than certain research and test reactors), and those authorized to possess strategic quantities of plutonium, uranium-233, or uranium-235, develop and implement acceptable plans for responding to threats, thefts, and industrial sabotage of licensed nuclear materials and facilities. The plans will provide a structured, orderly, and timely response to safeguards contingencies and will be an important segment of NRC's contingency planning programs. Licensee safeguards contingency plans will result in organizing a licensee's safeguards resources in such a way that, in the unlikely event of a safeguards contingency, the responding participants will be identified, their several responsibilities specified, and their responses coordinated.

  5. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  6. Probability of mediastinal involvement in non-small-cell lung cancer: a statistical definition of the clinical target volume for 3-dimensional conformal radiotherapy?

    International Nuclear Information System (INIS)

    Giraud, Philippe; De Rycke, Yann; Lavole, Armelle; Milleron, Bernard; Cosset, Jean-Marc; Rosenzweig, Kenneth E.

    2006-01-01

    Purpose: Conformal irradiation (3D-CRT) of non-small-cell lung carcinoma (NSCLC) is largely based on precise definition of the nodal clinical target volume (CTVn). A reduction of the number of nodal stations to be irradiated would facilitate tumor dose escalation. The aim of this study was to design a mathematical tool based on documented data to predict the risk of metastatic involvement for each nodal station. Methods and Materials: We reviewed the large surgical series published in the literature to identify the main pretreatment parameters that modify the risk of nodal invasion. The probability of involvement for the 17 nodal stations described by the American Thoracic Society (ATS) was computed from all these publications. Starting with the primary site of the tumor as the main characteristic, we built a probabilistic tree for each nodal station representing the risk distribution as a function of each tumor feature. Statistical analysis used the inversion-of-probability-trees method described by Weinstein and Feinberg. Validation of the software, based on 134 patients from two different populations, was performed with receiver operating characteristic (ROC) curves and multivariate logistic regression. Results: Analysis of all of the various parameters of pretreatment staging relative to each level of the ATS map results in 20,000 different combinations. The first parameters included in the tree, depending on tumor site, were histologic classification, metastatic stage, nodal stage weighted as a function of the sensitivity and specificity of the diagnostic examination used (positron emission tomography scan, computed tomography scan), and tumor stage. Software is proposed to compute a predicted probability of involvement of each nodal station for any given clinical presentation. Double cross-validation confirmed the methodology. A 10% cutoff point, calculated from the ROC curve and the logistic model, gave the best prediction of mediastinal lymph node involvement. Conclusion
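
    The station-by-station weighting of nodal stage by diagnostic sensitivity and specificity described above is, at its core, a Bayes'-rule update. The study's actual parameter values are not given in this record, so the following Python sketch uses purely hypothetical prior, sensitivity, and specificity numbers to illustrate the computation; only the 10% decision cutoff is taken from the abstract.

        # Hypothetical illustration: updating the probability that a nodal
        # station is involved, given one diagnostic test result (Bayes' rule).
        # The prior/sensitivity/specificity values below are placeholders.

        def posterior_involvement(prior, sensitivity, specificity, test_positive):
            if test_positive:
                num = sensitivity * prior
                den = num + (1.0 - specificity) * (1.0 - prior)
            else:
                num = (1.0 - sensitivity) * prior
                den = num + specificity * (1.0 - prior)
            return num / den

        # Station with a 30% pre-test involvement risk and a negative PET scan:
        p = posterior_involvement(prior=0.30, sensitivity=0.85,
                                  specificity=0.90, test_positive=False)
        print(f"post-test probability of involvement: {p:.3f}")  # ~0.063
        # Stations falling below the 10% cutoff would be candidates for
        # exclusion from the CTVn.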

  7. Monte Carlo dose calculations and radiobiological modelling: analysis of the effect of the statistical noise of the dose distribution on the probability of tumour control

    International Nuclear Information System (INIS)

    Buffa, Francesca M.

    2000-01-01

    The aim of this work is to investigate the influence of the statistical fluctuations of Monte Carlo (MC) dose distributions on the dose volume histograms (DVHs) and radiobiological models, in particular the Poisson model for tumour control probability (tcp). The MC matrix is characterized by a mean dose in each scoring voxel, d, and a statistical error on the mean dose, σ_d; whilst the quantities d and σ_d depend on many statistical and physical parameters, here we consider only their dependence on the phantom voxel size and the number of histories from the radiation source. Dose distributions from high-energy photon beams have been analysed. It has been found that the DVH broadens when increasing the statistical noise of the dose distribution, and the tcp calculation systematically underestimates the real tumour control value, defined here as the value of tumour control when the statistical error of the dose distribution tends to zero. When increasing the number of energy deposition events, either by increasing the voxel dimensions or increasing the number of histories from the source, the DVH broadening decreases and tcp converges to the 'correct' value. It is shown that the underestimation of the tcp due to the noise in the dose distribution depends on the degree of heterogeneity of the radiobiological parameters over the population; in particular this error decreases with increasing the biological heterogeneity, whereas it becomes significant in the hypothesis of a radiosensitivity assay for single patients, or for subgroups of patients. It has been found, for example, that when the voxel dimension is changed from a cube with sides of 0.5 cm to a cube with sides of 0.25 cm (with a fixed number of histories of 10^8 from the source), the systematic error in the tcp calculation is about 75% in the homogeneous hypothesis, and it decreases to a minimum value of about 15% in a case of high radiobiological heterogeneity. The possibility of using the error on the
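
    The Poisson tcp model mentioned above has a standard closed form: with N0 clonogens per voxel and survival exp(-alpha*d_i) in voxel i, tcp = exp(-sum_i N0*exp(-alpha*d_i)). The Python sketch below (entirely hypothetical N0 and alpha, and a simple 2% Gaussian noise model) reproduces the qualitative effect the study reports: zero-mean noise on the voxel doses biases the computed tcp downward, because cell survival is convex in dose.

        import numpy as np

        rng = np.random.default_rng(0)

        def poisson_tcp(dose, n0, alpha):
            # Poisson tcp for a voxelized dose array (LQ beta term omitted)
            surviving = n0 * np.exp(-alpha * dose)
            return np.exp(-surviving.sum())

        true_dose = np.full(1000, 60.0)          # uniform 60 Gy, 1000 voxels
        tcp_true = poisson_tcp(true_dose, n0=1e4, alpha=0.3)

        # Average tcp over noisy realizations (sigma_d = 2% of local dose)
        tcp_noisy = np.mean([
            poisson_tcp(true_dose + rng.normal(0.0, 0.02 * true_dose), 1e4, 0.3)
            for _ in range(200)])

        print(f"tcp, noiseless dose: {tcp_true:.3f}")
        print(f"tcp, 2% noise:       {tcp_noisy:.3f}   # systematically lower")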

  8. A contingency table approach to nonparametric testing

    CERN Document Server

    Rayner, JCW

    2000-01-01

    Most texts on nonparametric techniques concentrate on location and linear-linear (correlation) tests, with less emphasis on dispersion effects and linear-quadratic tests. Tests for higher moment effects are virtually ignored. Using a fresh approach, A Contingency Table Approach to Nonparametric Testing unifies and extends the popular, standard tests by linking them to tests based on models for data that can be presented in contingency tables. This approach unifies popular nonparametric statistical inference and makes the traditional, most commonly performed nonparametric analyses much more comp

  9. Contingencies of Value

    DEFF Research Database (Denmark)

    Strandvad, Sara Malou

    2014-01-01

    Based on a study of the admission test at a design school, this paper investigates the contingencies of aesthetic values as these become visible in assessment practices. Theoretically, the paper takes its starting point in Herrnstein Smith’s notion of ‘contingencies of values’ and outlines … a pragmatist ground where cultural sociology and economic sociology meet. Informed by the literature on cultural intermediaries, the paper discusses the role of evaluators and the devices which accompany them. Whereas studies of cultural intermediaries traditionally apply a Bourdieusian perspective, recent …, the paper does not accept this storyline. As an alternative, the paper outlines the contingencies of values which are at play at the admission test, composed of official assessment criteria and scoring devices together with conventions within the world of design, and set in motion by interactions …

  10. Applied Problems and Use of Technology in an Aligned Way in Basic Courses in Probability and Statistics for Engineering Students--A Way to Enhance Understanding and Increase Motivation

    Science.gov (United States)

    Zetterqvist, Lena

    2017-01-01

    Researchers and teachers often recommend motivating exercises and use of mathematics or statistics software for the teaching of basic courses in probability and statistics. Our courses are given to large groups of engineering students at Lund Institute of Technology. We found that the mere existence of real-life data and technology in a course…

  11. Appendix F - Sample Contingency Plan

    Science.gov (United States)

    This sample Contingency Plan in Appendix F is intended to provide examples of contingency planning as a reference when a facility determines that the required secondary containment is impracticable, pursuant to 40 CFR §112.7(d).

  12. Supervisory Styles: A Contingency Framework

    Science.gov (United States)

    Boehe, Dirk Michael

    2016-01-01

    While the contingent nature of doctoral supervision has been acknowledged, the literature on supervisory styles has yet to deliver a theory-based contingency framework. A contingency framework can assist supervisors and research students in identifying appropriate supervisory styles under varying circumstances. The conceptual study reported here…

  13. Contingency diagrams as teaching tools

    OpenAIRE

    Mattaini, Mark A.

    1995-01-01

    Contingency diagrams are particularly effective teaching tools, because they provide a means for students to view the complexities of contingency networks present in natural and laboratory settings while displaying the elementary processes that constitute those networks. This paper sketches recent developments in this visualization technology and illustrates approaches for using contingency diagrams in teaching.

  14. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  15. Fuel rod pressure in nuclear power reactors: Statistical evaluation of the fuel rod internal pressure in LWRs with application to lift-off probability

    Energy Technology Data Exchange (ETDEWEB)

    Jelinek, Tomas

    2001-02-01

    In this thesis, a methodology for quantifying the risk of exceeding the Lift-off limit in nuclear light water power reactors is outlined. Due to fission gas release, the pressure in the gap between the fuel pellets and the cladding increases with burnup of the fuel. An increase in the fuel-clad gap due to clad creep would be expected to result in positive feedback, in the form of higher fuel temperatures, leading to more fission gas release, higher rod pressure, etc., until the cladding breaks. An increase in the fuel-clad gap that leads to this positive feedback is a phenomenon called Lift-off and is a limitation that must be considered in fuel core management. Lift-off is a consequence of very high internal fuel rod pressure. The internal fuel rod pressure is therefore used as a Lift-off indicator. The internal fuel rod pressure is closely connected to the fission gas release into the fuel rod plenum and is thus used to increase the database. It is concluded that the dominating error source in the prediction of the pressure in Boiling Water Reactors (BWR) is the power history. There is a bias in the fuel pressure prediction that is dependent on the fuel rod position in the fuel assembly for BWRs. A methodology to quantify the risk of the fuel rod internal pressure exceeding a certain limit is developed; the risk is dependent on the pressure prediction and the fuel rod position. The methodology is based on statistical treatment of the discrepancies between predicted and measured fuel rod internal pressures. Finally, a methodology to estimate the Lift-off probability of the whole core is outlined.
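
    The core of the methodology (statistical treatment of measured-minus-predicted pressures, with a position-dependent bias) can be sketched in a few lines of Python. The data, the normality assumption and all numbers below are placeholders for illustration, not values from the thesis.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)

        # Placeholder discrepancies (measured - predicted rod pressure, MPa),
        # split by rod position class to mimic the position-dependent bias.
        disc_edge = rng.normal(0.3, 0.5, size=40)
        disc_interior = rng.normal(-0.1, 0.4, size=60)

        def exceedance_probability(predicted, limit, discrepancies):
            # Assume normally distributed discrepancies for the rod's position
            # class; then the true pressure ~ N(predicted + mu, sigma).
            mu, sigma = stats.norm.fit(discrepancies)
            return stats.norm.sf(limit, loc=predicted + mu, scale=sigma)

        p_rod = exceedance_probability(predicted=11.2, limit=12.5,
                                       discrepancies=disc_edge)
        print(f"P(rod pressure exceeds limit) = {p_rod:.4f}")
        # A whole-core estimate would combine the per-rod probabilities,
        # e.g. 1 - prod(1 - p_i) under an independence assumption.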

  16. Using GAISE and NCTM Standards as Frameworks for Teaching Probability and Statistics to Pre-Service Elementary and Middle School Mathematics Teachers

    Science.gov (United States)

    Metz, Mary Louise

    2010-01-01

    Statistics education has become an increasingly important component of the mathematics education of today's citizens. In part to address the call for a more statistically literate citizenship, The "Guidelines for Assessment and Instruction in Statistics Education (GAISE)" were developed in 2005 by the American Statistical Association. These…

  17. Social Sensorimotor Contingencies

    OpenAIRE

    Bütepage, Judith

    2016-01-01

    As the field of robotics advances, more robots are employed in our everyday environment. Thus, the implementation of robots that can actively engage in physical collaboration and naturally interact with humans is of high importance. In order to achieve this goal, it is necessary to study human interaction and social cognition and how these aspects can be implemented in robotic agents. The theory of social sensorimotor contingencies hypothesises that many aspects of human-human interaction de...

  18. Numerical comparison of improved methods of testing in contingency tables with small frequencies

    Energy Technology Data Exchange (ETDEWEB)

    Sugiura, Nariaki; Otake, Masanori

    1968-11-14

    The significance levels of various tests for a general c x k contingency table are usually given by large sample theory. But they are not accurate for tables having small frequencies. In this paper, a numerical evaluation was made to determine how good the approximation of the significance level is for various improved tests that have been developed by Nass, Yoshimura, Gart, etc. for the c x k contingency table with small frequencies in some of the cells. For this purpose we compared the significance levels of the various approximate methods (i) with those of the one-sided tail defined in terms of exact probabilities for given marginals in the 2 x 2 table; (ii) with those of exact probabilities accumulated in the order of magnitude of the χ² statistic or likelihood ratio (LR) statistic in the 2 x 3 table mentioned by Yates. In the 2 x 2 table it is well known that Yates' correction gives satisfactory results for small cell frequencies, and the other methods, which we have not referred to here, can be considered if we devote our attention only to the 2 x 2 or 2 x k table. But we are mainly interested in comparing methods that are applicable to a general c x k table. It appears that such a comparison of the various improved methods on the same example has not been made explicitly, even though these tests are frequently used in biological and medical research. 9 references, 6 figures, 6 tables.
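
    For the 2 x 2 case discussed above, the competing approximate tests and the exact reference are all available in scipy; a minimal sketch with illustrative counts (not the paper's data):

        import numpy as np
        from scipy.stats import chi2_contingency, fisher_exact

        table = np.array([[3, 7],      # a 2 x 2 table with small frequencies
                          [9, 2]])

        chi2, p_plain, _, _ = chi2_contingency(table, correction=False)
        _, p_yates, _, _ = chi2_contingency(table, correction=True)
        _, p_lr, _, _ = chi2_contingency(table, correction=False,
                                         lambda_="log-likelihood")
        _, p_exact = fisher_exact(table)   # exact reference (2 x 2 only)

        print(f"Pearson chi-square p = {p_plain:.4f}")
        print(f"Yates-corrected p    = {p_yates:.4f}")
        print(f"Likelihood ratio p   = {p_lr:.4f}")
        print(f"Fisher exact p       = {p_exact:.4f}")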

  19. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method, either prior to or as an alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes censored-data martingales. The text includes measure theoretic...

  20. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS/E planning tool (PSS/E). It has the following features: It uses a hybrid dynamic and steady-state approach to simulating the cascading outage sequences that includes fast dynamic and slower steady-state events. It integrates dynamic models with protection scheme models for generation, transmission, and load. It models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.

  1. Alternative Forms of Fit in Contingency Theory.

    Science.gov (United States)

    Drazin, Robert; Van de Ven, Andrew H.

    1985-01-01

    This paper examines the selection, interaction, and systems approaches to fit in structural contingency theory. The concepts of fit evaluated may be applied not only to structural contingency theory but to contingency theories in general. (MD)

  2. Contingent Conspiracies: Art, Philosophy, Science

    DEFF Research Database (Denmark)

    Wilson, Alexander

    2013-01-01

    The question of whether creativity comes from being “open” or “closed” to contingent processes deeply intersects art-historical discourse on authorship, style, technique and practice: from the Greek notion of the Daimon, through commedia dell'arte’s improvised styles and romanticism’s investment …, Hegel) contain a deeper tension between contingency and necessity, often revealed in correlate discussions of the sublime. But as artists find themselves returning again to a concern or care for contingency (a thread running through Heidegger, Levinas and Derrida) or the question how to conspire … with contingency (Negarestani), they do so today with a new paradigm of scientific knowledge at their disposal. For science too has increasingly been forced to respond to the notion of contingency. Progressively discovering the ubiquity of non-linear dynamics, deterministic chaos and emergent complexity…

  3. Statistical power in parallel group point exposure studies with time-to-event outcomes: an empirical comparison of the performance of randomized controlled trials and the inverse probability of treatment weighting (IPTW) approach.

    Science.gov (United States)

    Austin, Peter C; Schuster, Tibor; Platt, Robert W

    2015-10-15

    Estimating statistical power is an important component of the design of both randomized controlled trials (RCTs) and observational studies. Methods for estimating statistical power in RCTs have been well described and can be implemented simply. In observational studies, statistical methods must be used to remove the effects of confounding that can occur due to non-random treatment assignment. Inverse probability of treatment weighting (IPTW) using the propensity score is an attractive method for estimating the effects of treatment using observational data. However, sample size and power calculations have not been adequately described for these methods. We used an extensive series of Monte Carlo simulations to compare the statistical power of an IPTW analysis of an observational study with time-to-event outcomes with that of an analysis of a similarly-structured RCT. We examined the impact of four factors on the statistical power function: number of observed events, prevalence of treatment, the marginal hazard ratio, and the strength of the treatment-selection process. We found that, on average, an IPTW analysis had lower statistical power compared to an analysis of a similarly-structured RCT. The difference in statistical power increased as the magnitude of the treatment-selection model increased. The statistical power of an IPTW analysis tended to be lower than the statistical power of a similarly-structured RCT.
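
    A single replicate of the kind of simulation the authors describe might look as follows in Python (hypothetical coefficients throughout; the propensity model and weighting step are the generic IPTW recipe, while the time-to-event outcome generation and Cox model used in the paper are omitted for brevity):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(42)
        n = 5000

        # One confounder drives both treatment selection and outcome risk
        x = rng.normal(size=n)
        p_treat = 1.0 / (1.0 + np.exp(-(-0.4 + 0.8 * x)))   # selection model
        z = rng.binomial(1, p_treat)

        # Estimate the propensity score and form stabilized IPTW weights
        ps = (LogisticRegression().fit(x.reshape(-1, 1), z)
              .predict_proba(x.reshape(-1, 1))[:, 1])
        w = np.where(z == 1, z.mean() / ps, (1 - z.mean()) / (1 - ps))

        # The weighted pseudo-population should be balanced on x
        print("unweighted means:", x[z == 1].mean(), x[z == 0].mean())
        print("weighted means:  ",
              np.average(x[z == 1], weights=w[z == 1]),
              np.average(x[z == 0], weights=w[z == 0]))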

  4. Waste Management Project Contingency Analysis

    International Nuclear Information System (INIS)

    Edward L. Parsons, Jr.

    1999-01-01

    The purpose of this report is to provide the Office of Waste Management (WM) with recommended contingency calculation procedures for typical WM projects. Typical projects were defined as conventional construction-type activities that use innovative elements when necessary to meet the project objectives. Projects involve treatment, storage, and disposal of low level, mixed low level, hazardous, transuranic, and high level waste. Cost contingencies are an essential part of Total Cost Management. A contingency is an amount added to a cost estimate to compensate for unexpected expenses resulting from incomplete design, unforeseen and unpredictable conditions, or uncertainties in the project scope (DOE 1994, AACE 1998). Contingency allowances are expressed as percentages of estimated cost and improve cost estimates by accounting for uncertainties. The contingency allowance is large at the beginning of a project because there are more uncertainties, but as a project develops, the allowance shrinks to adjust for costs already incurred. Ideally, the total estimated cost remains the same throughout a project. Project contingency reflects the degree of uncertainty caused by lack of project definition, and process contingency reflects the degree of uncertainty caused by use of new technology. Different cost estimation methods were reviewed and compared with respect to terminology, accuracy, and Cost Guide standards. The Association for the Advancement of Cost Engineering (AACE) methods for cost estimation were selected to represent best industry practice. AACE methodology for contingency analysis can be readily applied to WM projects, accounts for uncertainties associated with different stages of a project, and considers both project and process contingencies and the stage of technical readiness. As recommended, AACE contingency allowances taper off linearly as a project nears completion.
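
    The linear taper recommended above is simple enough to state as a formula: allowance = base estimate x initial rate x (1 - fraction complete). A toy Python illustration with invented figures (not from the report):

        def contingency_allowance(base_estimate, initial_rate, fraction_complete):
            # Linear taper: the allowance rate falls to zero at completion
            return base_estimate * initial_rate * (1.0 - fraction_complete)

        # Hypothetical WM project: $10M base estimate, 30% initial contingency
        for done in (0.0, 0.5, 0.9):
            alw = contingency_allowance(10e6, 0.30, done)
            print(f"{done:.0%} complete -> contingency ${alw:,.0f}")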

  5. Grading system to categorize breast MRI using BI-RADS 5th edition: a statistical study of non-mass enhancement descriptors in terms of probability of malignancy.

    Science.gov (United States)

    Asada, Tatsunori; Yamada, Takayuki; Kanemaki, Yoshihide; Fujiwara, Keishi; Okamoto, Satoko; Nakajima, Yasuo

    2018-03-01

    To analyze the association of breast non-mass enhancement descriptors in the BI-RADS 5th edition with malignancy, and to establish a grading system and categorization of descriptors. This study was approved by our institutional review board. A total of 213 patients were enrolled. Breast MRI was performed with a 1.5-T MRI scanner using a 16-channel breast radiofrequency coil. Two radiologists determined internal enhancement and distribution of non-mass enhancement by consensus. Corresponding pathologic diagnoses were obtained by either biopsy or surgery. The probability of malignancy by descriptor was analyzed using Fisher's exact test and multivariate logistic regression analysis. The probability of malignancy by category was analyzed using Fisher's exact and multi-group comparison tests. One hundred seventy-eight lesions were malignant. Multivariate model analysis showed that internal enhancement (homogeneous vs. others) was significantly associated with the probability of malignancy (p < 0.0001). The three-grade criteria and categorization by sum-up grades of descriptors appear valid for non-mass enhancement.
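
    Fisher's exact test, the main tool named above, takes nothing more than a 2 x 2 count table; a minimal Python sketch with made-up counts (not the study's data):

        from scipy.stats import fisher_exact

        # Rows: homogeneous vs. other internal enhancement
        # Columns: benign vs. malignant (illustrative counts only)
        table = [[14, 9],
                 [21, 169]]

        odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
        print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4g}")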

  6. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber–Shiu functions and dependence.

  7. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real-valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P plots.

  8. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  9. Changes in reward contingency modulate the trial-to-trial variability of hippocampal place cells.

    Science.gov (United States)

    Wikenheiser, Andrew M; Redish, A David

    2011-08-01

    Pyramidal cells in the rodent hippocampus often exhibit clear spatial tuning. Theories of hippocampal function suggest that these "place cells" implement multiple, independent neural representations of position (maps), based on different reference frames or environmental features. Consistent with the "multiple maps" theory, previous studies have shown that manipulating spatial factors related to task performance modulates the within-session variability (overdispersion) of cells in the hippocampus. However, the influence of changes in reward contingency on overdispersion has not been examined. To test this, we first trained rats to collect food from three feeders positioned around a circular track (task 1). When subjects were proficient, the reward contingency was altered such that every other feeder delivered food (task 2). We recorded ensembles of hippocampal neurons as rats performed both tasks. Place cell overdispersion was high during task 1 but decreased significantly during task 2, and this increased reliability could not be accounted for by changes in running speed or familiarity with the task. Intuitively, decreased variability might be expected to improve neural representations of position. To test this, we used Bayesian decoding of hippocampal spike trains to estimate subjects' location. Neither the amount of probability decoded to subjects' position (local probability) nor the difference between estimated position and true location (decoding accuracy) differed between tasks. However, we found that hippocampal ensembles were significantly more self-consistent during task 2 performance. These results suggest that changes in task demands can affect the firing statistics of hippocampal neurons, leading to changes in the properties of decoded neural representations.
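
    The Bayesian decoding step referred to above is conventionally done with a Poisson spiking model and a uniform prior: P(x | n) is proportional to prod_i rate_i(x)**n_i * exp(-rate_i(x)*dt). A self-contained Python sketch with synthetic Gaussian tuning curves (all parameters hypothetical, and track wrap-around ignored for simplicity):

        import numpy as np

        rng = np.random.default_rng(3)

        def decode_position(counts, tuning, dt):
            # Poisson log-posterior over position bins, uniform prior
            log_post = (counts[:, None] * np.log(tuning * dt + 1e-12)
                        - tuning * dt).sum(axis=0)
            post = np.exp(log_post - log_post.max())
            return post / post.sum()

        # 30 synthetic place cells with Gaussian tuning on a 100-bin track
        positions = np.arange(100)
        centers = rng.uniform(0, 100, size=30)
        tuning = 10.0 * np.exp(-0.5 * ((positions[None, :]
                                        - centers[:, None]) / 8.0) ** 2)

        true_pos, dt = 42, 0.25                    # 250 ms decoding window
        counts = rng.poisson(tuning[:, true_pos] * dt)
        posterior = decode_position(counts, tuning, dt)
        print("decoded bin:", posterior.argmax(),
              "| local probability:", round(posterior[true_pos], 3))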

  10. Spectral clustering and biclustering learning large graphs and contingency tables

    CERN Document Server

    Bolla, Marianna

    2013-01-01

    Explores regular structures in graphs and contingency tables by spectral theory and statistical methods. This book bridges the gap between graph theory and statistics by giving answers to the demanding questions which arise when statisticians are confronted with large weighted graphs or rectangular arrays. Classical and modern statistical methods applicable to biological, social, communication networks, or microarrays are presented together with the theoretical background and proofs. This book is suitable for a one-semester course for graduate students in data mining, mult…

  11. Contingent Commitments: Bringing Part-Time Faculty into Focus. Methodology Supplement

    Science.gov (United States)

    Center for Community College Student Engagement, 2014

    2014-01-01

    Center reporting prior to 2013 focused primarily on descriptive statistics (frequencies and means) of student and faculty behaviors. The goal of the analyses reported here and in "Contingent Commitments: Bringing Part-Time Faculty into Focus" is to understand the engagement of part-time or contingent faculty in various activities that…

  12. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  13. 49 CFR 1544.301 - Contingency plan.

    Science.gov (United States)

    2010-10-01

    Title 49 (Transportation), Commercial Operators, Threat and Threat Response, § 1544.301 Contingency plan. Each aircraft operator must adopt a contingency plan and must: (a) Implement its contingency plan when directed by TSA. (b) Ensure...

  14. 30 CFR 282.26 - Contingency Plan.

    Science.gov (United States)

    2010-07-01

    Title 30 (Mineral Resources), § 282.26 Contingency Plan. (a) When required by the Director, a lessee shall include a Contingency Plan as part of its request for approval of a Delineation, Testing, or Mining Plan. The Contingency Plan...

  15. National Contingency Plan Subpart J

    Science.gov (United States)

    Subpart J of the National Oil and Hazardous Substances Pollution Contingency Plan (NCP) directs EPA to prepare a schedule of dispersants, other chemicals, and oil spill mitigating devices and substances that may be used to remove or control oil discharges.

  16. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think … analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...

  17. Mobile contingency unit

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Sergio O. da; Magalhaes, Milton P. de [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil); Junqueira, Rodrigo A.; Torres, Carlos A.R. [PETROBRAS Transporte S/A (TRANSPETRO), Rio de Janeiro, RJ (Brazil)

    2009-07-01

    This paper presents what is already a reality in TRANSPETRO in the area covered by OSBRA, a pipeline that carries oil products to the Mid-West region of Brazil. In order to meet the needs of covering occasional accidents, TRANSPETRO counts on a standardized system of emergency management. It is a great challenge to secure efficient communication along the pipeline's 964 km, considering that there are shadow zones where it is not possible to use conventional means of communication such as mobile telephony and internet. It was in this context that the Mobile Contingency Unit Via Satellite (MCU) was developed, to extend the communication facilities existing in fixed installations to remote places, mainly the pipeline rights of way. In case of an emergency, a simulation or work on the pipeline right of way, the MCU is fully able to provide the same data, voice, closed-circuit TV and satellite video-conference facilities that are available in any internal area of the PETROBRAS system. (author)

  18. The foundations of statistics

    CERN Document Server

    Savage, Leonard J

    1972-01-01

    Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.

  19. Dealing with customers enquiries simultaneously under contingent situation

    Directory of Open Access Journals (Sweden)

    Sujan Piya

    2015-09-01

    This paper proposes a method to quote the due date and the price of incoming orders to multiple customers simultaneously when contingent orders exist. The proposed method utilizes probabilistic information on contingent orders and incorporates some negotiation theories to generate quotations. Rather than improving the acceptance probability of the quotation for a single customer, the method improves the overall acceptance probability of quotations being submitted to multiple customers. This helps increase the total expected contribution of the company and the acceptance probability of entire new orders, rather than increasing these measures only for a single customer. Numerical analysis is conducted to demonstrate the working mechanism of the proposed method and its effectiveness in contrast to the sequential method of quotation.
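
    The record gives no formulas, but the idea of maximizing the overall acceptance-weighted contribution across several simultaneous enquiries can be sketched with an assumed logistic acceptance model and a brute-force search over price quotes (all names and numbers below are hypothetical; due dates and contingent orders are left out):

        import numpy as np

        def accept_prob(price, reservation, sensitivity=0.08):
            # Assumed logistic model: acceptance falls as the quote
            # rises above the customer's reservation price
            return 1.0 / (1.0 + np.exp(sensitivity * (price - reservation)))

        customers = [{"reservation": 120.0, "cost": 80.0},
                     {"reservation": 100.0, "cost": 80.0}]

        best = (-np.inf, None, None)
        for p1 in np.arange(85.0, 130.0, 1.0):
            for p2 in np.arange(85.0, 130.0, 1.0):
                ev = (accept_prob(p1, customers[0]["reservation"])
                      * (p1 - customers[0]["cost"])
                      + accept_prob(p2, customers[1]["reservation"])
                      * (p2 - customers[1]["cost"]))
                if ev > best[0]:
                    best = (ev, p1, p2)

        print(f"expected contribution {best[0]:.1f} "
              f"at quotes {best[1]:.0f} and {best[2]:.0f}")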

  20. Surface drift prediction in the Adriatic Sea using hyper-ensemble statistics on atmospheric, ocean and wave models: Uncertainties and probability distribution areas

    Science.gov (United States)

    Rixen, M.; Ferreira-Coelho, E.; Signell, R.

    2008-01-01

    Despite numerous and regular improvements in underlying models, surface drift prediction in the ocean remains a challenging task because of our yet limited understanding of all processes involved. Hence, deterministic approaches to the problem are often limited by empirical assumptions on underlying physics. Multi-model hyper-ensemble forecasts, which exploit the power of an optimal local combination of available information including ocean, atmospheric and wave models, may show superior forecasting skills when compared to individual models because they allow for local correction and/or bias removal. In this work, we explore in greater detail the potential and limitations of the hyper-ensemble method in the Adriatic Sea, using a comprehensive surface drifter database. The performance of the hyper-ensembles and the individual models are discussed by analyzing associated uncertainties and probability distribution maps. Results suggest that the stochastic method may reduce position errors significantly for 12 to 72 h forecasts and hence compete with pure deterministic approaches. © 2007 NATO Undersea Research Centre (NURC).
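
    The hyper-ensemble idea (fit an optimal local linear combination of member models over a training window, then forecast with it) can be shown in a few lines of numpy; everything below is synthetic stand-in data, not NURC's models or drifter database:

        import numpy as np

        rng = np.random.default_rng(7)
        n, n_train = 200, 150

        # Synthetic "true" drift velocity plus three biased, noisy models
        truth = rng.normal(0.0, 1.0, size=n)
        models = np.column_stack([truth + rng.normal(b, s, size=n)
                                  for b, s in [(0.3, 0.5), (-0.2, 0.7),
                                               (0.0, 1.2)]])

        # Least-squares combination weights (plus a bias term), training window
        A = np.column_stack([models[:n_train], np.ones(n_train)])
        weights, *_ = np.linalg.lstsq(A, truth[:n_train], rcond=None)

        # Apply to the forecast window and compare RMS errors
        B = np.column_stack([models[n_train:], np.ones(n - n_train)])
        combo = B @ weights
        rms_members = np.sqrt(((models[n_train:]
                                - truth[n_train:, None]) ** 2).mean(axis=0))
        rms_combo = np.sqrt(((combo - truth[n_train:]) ** 2).mean())
        print("member RMS:", np.round(rms_members, 3),
              "| hyper-ensemble RMS:", round(rms_combo, 3))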

  1. Automated Search Method for Statistical Test Probability Distribution Generation

    Institute of Scientific and Technical Information of China (English)

    周晓莹; 高建华

    2013-01-01

    A strategy based on automated search for probability distribution construction is proposed, which comprises the design of a representation format and an evaluation function for the probability distribution. Combining this with a simulated annealing algorithm, an indicator is defined to formalize the automated search process based on the Markov model. Experimental results show that the method effectively improves the accuracy of the automated search: it successfully finds a near-optimal probability distribution within a certain time and generates efficient test data, thereby reducing the cost of statistical testing.
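
    The record describes the method only at a high level (a representation for the distribution, an evaluation function, simulated annealing over a Markov-model-based indicator). A generic simulated-annealing sketch in Python over a discrete input-probability vector, with a stand-in objective, illustrates the search loop; none of the specifics below come from the paper:

        import math
        import random

        random.seed(0)

        def evaluate(dist):
            # Stand-in objective: closeness to a hypothetical usage profile
            target = [0.4, 0.3, 0.2, 0.1]
            return -sum((p - t) ** 2 for p, t in zip(dist, target))

        def neighbour(dist, step=0.05):
            # Move a little probability mass between two random states
            d = list(dist)
            i, j = random.sample(range(len(d)), 2)
            delta = min(step, d[i])
            d[i] -= delta
            d[j] += delta
            return d

        dist = [0.25, 0.25, 0.25, 0.25]       # uniform initial distribution
        score, temp = evaluate(dist), 1.0
        while temp > 1e-3:
            cand = neighbour(dist)
            cand_score = evaluate(cand)
            accept = (cand_score > score
                      or random.random() < math.exp((cand_score - score) / temp))
            if accept:
                dist, score = cand, cand_score
            temp *= 0.995                      # geometric cooling schedule

        print("near-optimal distribution:", [round(p, 3) for p in dist])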

  2. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  3. Tethered Satellite System Contingency Investigation Board

    Science.gov (United States)

    1992-11-01

    The Tethered Satellite System (TSS-1) was launched aboard the Space Shuttle Atlantis (STS-46) on July 31, 1992. During the attempted on-orbit operations, the Tethered Satellite System failed to deploy successfully beyond 256 meters. The satellite was retrieved successfully and was returned on August 6, 1992. The National Aeronautics and Space Administration (NASA) Associate Administrator for Space Flight formed the Tethered Satellite System (TSS-1) Contingency Investigation Board on August 12, 1992. The TSS-1 Contingency Investigation Board was asked to review the anomalies which occurred, to determine the probable cause, and to recommend corrective measures to prevent recurrence. The board was supported by the TSS Systems Working group as identified in MSFC-TSS-11-90, 'Tethered Satellite System (TSS) Contingency Plan'. The board identified five anomalies for investigation: initial failure to retract the U2 umbilical; initial failure to flyaway; unplanned tether deployment stop at 179 meters; unplanned tether deployment stop at 256 meters; and failure to move tether in either direction at 224 meters. Initial observations of the returned flight hardware revealed evidence of mechanical interference by a bolt with the level wind mechanism travel as well as a helical shaped wrap of tether which indicated that the tether had been unwound from the reel beyond the travel by the level wind mechanism. Examination of the detailed mission events from flight data and mission logs related to the initial failure to flyaway and the failure to move in either direction at 224 meters, together with known preflight concerns regarding slack tether, focused the assessment of these anomalies on the upper tether control mechanism. After the second meeting, the board requested the working group to complete and validate a detailed integrated mission sequence to focus the fault tree analysis on a stuck U2 umbilical, level wind mechanical interference, and slack tether in upper tether

  4. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in 1600, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", that is, a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied in statistical analysis.

  5. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  6. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  7. Inevitability, contingency, and epistemic humility.

    Science.gov (United States)

    Kidd, Ian James

    2016-02-01

    This paper offers an epistemological framework for the debate about whether the results of scientific enquiry are inevitable or contingent. I argue in Sections 2 and 3 that inevitabilist stances are doubly guilty of epistemic hubris--a lack of epistemic humility--and that the real question concerns the scope and strength of our contingentism. The latter stages of the paper (Sections 4 and 5) address some epistemological and historiographical worries and sketch some examples of deep contingencies to guide further debate. I conclude by affirming that the concept of epistemic humility can usefully inform critical reflection on the contingency of the sciences and the practice of history of science. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Contingent Employment in the Netherlands

    OpenAIRE

    Pot, F.; Koene, Bas; Paauwe, Jaap

    2001-01-01

    In the last decade the Dutch labour market has demonstrated an admirable capacity to generate jobs. Consequently, the unemployment rate has significantly decreased. However, the newly generated jobs are a-typical in the sense that they are not full-time jobs based on open-ended contracts. Instead, the job growth has relied on the growth of part-time and contingent jobs. While the creation of part-time jobs seems to be employee-driven, contingent employment, in contrast, seems to b...

  9. Statistical eruption forecast for the Chilean Southern Volcanic Zone: typical probabilities of volcanic eruptions as baseline for possibly enhanced activity following the large 2010 Concepción earthquake

    Directory of Open Access Journals (Sweden)

    Y. Dzierma

    2010-10-01

    A probabilistic eruption forecast is provided for ten volcanoes of the Chilean Southern Volcanic Zone (SVZ). Since 70% of the Chilean population lives in this area, the estimation of future eruption likelihood is an important part of hazard assessment. After investigating the completeness and stationarity of the historical eruption time series, the exponential, Weibull, and log-logistic distribution functions are fit to the repose time distributions for the individual volcanoes and the models are evaluated. This procedure has been implemented in two different ways to methodologically compare details in the fitting process. With regard to the probability of at least one VEI ≥ 2 eruption in the next decade, Llaima, Villarrica and Nevados de Chillán are most likely to erupt, while Osorno shows the lowest eruption probability among the volcanoes analysed. In addition to giving a compilation of the statistical eruption forecasts for the historically most active volcanoes of the SVZ, this paper aims to give "typical" eruption probabilities, which may in the future make it possible to distinguish possibly enhanced activity in the aftermath of the large 2010 Concepción earthquake.
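
    The fitting-and-forecast step has a compact generic form: fit candidate renewal distributions to the repose times and read the eruption probability off the fitted CDF. The Python sketch below uses invented repose times and, for simplicity, ignores the time already elapsed since the last eruption (the paper's analysis is more careful on both counts):

        import numpy as np
        from scipy import stats

        # Invented repose times (years between successive VEI >= 2 eruptions)
        repose = np.array([3.1, 7.4, 1.9, 12.0, 5.5, 2.2, 9.8, 4.0, 6.3])

        shape_w, _, scale_w = stats.weibull_min.fit(repose, floc=0)
        rate = 1.0 / repose.mean()             # exponential (memoryless) fit

        horizon = 10.0                         # the next decade
        p_weibull = stats.weibull_min.cdf(horizon, shape_w, loc=0, scale=scale_w)
        p_expon = 1.0 - np.exp(-rate * horizon)

        print(f"P(at least one eruption in {horizon:.0f} yr): "
              f"Weibull {p_weibull:.2f}, exponential {p_expon:.2f}")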

  10. Contingency Theories of Leadership: A Study.

    Science.gov (United States)

    Saha, Sunhir K.

    1979-01-01

    Some of the major contingency theories of leadership are reviewed; some results from the author's study of Fiedler's contingency model are reported; and some thoughts for the future of leadership research are provided. (Author/MLF)

  11. Subjective Probabilities for State-Dependent Continuous Utility

    NARCIS (Netherlands)

    P.P. Wakker (Peter)

    1987-01-01

    For the expected utility model with state-dependent utilities, Karni, Schmeidler and Vind (1983) have shown how to recover uniquely the involved subjective probabilities if the preferences, contingent on a hypothetical probability distribution over the state space, are known. This they

  12. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  13. Probability, statistics and modelling in public health

    National Research Council Canada - National Science Library

    Nikulin, Mikhail Stepanovich; Commenges, Daniel; Huber, Catherine

    2006-01-01

    … Several well-known biostatisticians from Europe and America were invited. A special issue of Lifetime Data Analysis was published (Volume 10, No. 4), gathering some of the works discussed at this symposium. This volume gathers a larger number of papers, some of them being extended versions of papers published in the Lifetime Data Analysis issue...

  14. Statistical Methods for Solar Flare Probability Forecasting.

    Science.gov (United States)

    1980-09-01


  15. Contingency Teaching during Close Reading

    Science.gov (United States)

    Fisher, Douglas; Frey, Nancy

    2015-01-01

    12 teachers were interviewed and observed as they engaged students in close reading. We analyzed their responses and instruction to determine the scaffolds that were used as well as the contingency teaching plans they implemented when students were unable to understand the text.

  16. Breakdown concepts for contingency tables

    NARCIS (Netherlands)

    Kuhnt, S.

    2010-01-01

    Loglinear Poisson models are commonly used to analyse contingency tables. So far, robustness of parameter estimators as well as outlier detection have rarely been treated in this context. We start with finite-sample breakdown points. We find that the breakdown point of mean value estimators

  17. Developing standardized facility contingency plans

    International Nuclear Information System (INIS)

    Davidson, D.A.

    1993-01-01

    Texaco consists of several operating departments that are, in effect, independent companies. Each of these departments is responsible for complying with all environmental laws and regulations. This includes the preparation by each facility to respond to an oil spill at that location. For larger spills, however, management of the response will rest with corporate regional response teams. Personnel from all departments make up the regional teams. In 1990, Congress passed the Oil Pollution Act. In 1991, the US Coast Guard began developing oil spill response contingency plan regulations, which they are still working on. Meanwhile, four of the five west coast states have also passed laws requiring contingency plans. (Only Hawaii has chosen to wait and see what the federal regulations will entail). Three of the states have already adopted regulations. Given these laws and regulations, along with its corporate structure, Texaco addressed the need to standardize local facility plans as well as its response organization. This paper discusses how, by working together, the Texaco corporate international oil spill response staff and the Texaco western region on-scene commander developed: A standard contingency plan format crossing corporate boundaries and meeting federal and state requirements. A response organization applicable to any size facility or spill. A strategy to sell the standard contingency plan and response organization to the operating units

  18. Lessons in Contingent, Recursive Humility

    Science.gov (United States)

    Vagle, Mark D.

    2011-01-01

    In this article, the author argues that critical work in teacher education should begin with teacher educators turning a critical eye on their own practices. The author uses Lesko's conception of contingent, recursive growth and change to analyze a lesson he observed as part of a phenomenological study aimed at understanding more about what it is…

  19. Job satisfaction and contingent employment

    NARCIS (Netherlands)

    de Graaf-Zijl, M.

    2012-01-01

    This paper analyses job satisfaction as an aggregate of satisfaction with several job aspects, with special focus on the influence of contingent-employment contracts. Fixed-effect analysis is applied on a longitudinal sample of Dutch employees in four work arrangements: regular, fixed-term, on-call

  20. A Profile of Contingent Workers.

    Science.gov (United States)

    Polivka, Anne E.

    1996-01-01

    Based on data from the supplement to the February 1995 Current Population Survey, contingent workers were more likely to be female, black, young, enrolled in school, and employed in services and construction industries than were noncontingent workers. More than 10% were teachers. (Author)

  1. How Precarious Is Contingent Work?

    DEFF Research Database (Denmark)

    Scheuer, Steen

    2015-01-01

    agree. This study focuses on a number of non-pay conditions for contingent employees, compared to permanent staff, under the assumption that these conditions are cumulatively negative. The article utilizes a survey of approx. 4,900 employees (response rate 57%), asking questions concerning...

  2. 48 CFR 18.201 - Contingency operation.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Contingency operation. 18... METHODS AND CONTRACT TYPES EMERGENCY ACQUISITIONS Emergency Acquisition Flexibilities 18.201 Contingency operation. (a) Contingency operation is defined in 2.101. (b) Micro-purchase threshold. The threshold...

  3. 48 CFR 218.201 - Contingency operation.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Contingency operation. 218... Flexibilities 218.201 Contingency operation. (1) Selection, appointment, and termination of appointment... in a contingency contracting force. See 201.603-2(2). (2) Policy for unique item identification...

  4. 49 CFR 1542.301 - Contingency plan.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 9 2010-10-01 2010-10-01 false Contingency plan. 1542.301 Section 1542.301..., DEPARTMENT OF HOMELAND SECURITY CIVIL AVIATION SECURITY AIRPORT SECURITY Contingency Measures § 1542.301 Contingency plan. (a) Each airport operator required to have a security program under § 1542.103(a) and (b...

  5. Statistical inference

    CERN Document Server

    Rohatgi, Vijay K

    2003-01-01

    Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth

  6. Contingency management for patients with dual disorders in intensive outpatient treatment for addiction.

    Science.gov (United States)

    Kelly, Thomas M; Daley, Dennis C; Douaihy, Antoine B

    2014-01-01

    This quality improvement program evaluation investigated the effectiveness of contingency management for improving retention in treatment and positive outcomes among patients with dual disorders in intensive outpatient treatment for addiction. The effect of contingency management was explored among a group of 160 patients exposed to contingency management (n = 88) and not exposed to contingency management (no contingency management, n = 72) in a six-week partial hospitalization program. Patients referred to the partial hospitalization program for treatment of substance use and comorbid psychiatric disorders received diagnoses from psychiatrists and specialist clinicians according to the Diagnostic and Statistical Manual of the American Psychiatric Association. A unique application of the contingency management "fishbowl" method was used to improve the consistency of attendance at treatment sessions, which patients attended 5 days a week. Days attending treatment and drug-free days were the main outcome variables. Other outcomes of interest were depression, anxiety and psychological stress, coping ability, and intensity of drug cravings. Patients in the contingency management group attended more treatment days compared to patients in the no contingency management group; M = 16.2 days (SD = 10.0) versus M = 9.9 days (SD = 8.5), respectively; t = 4.2, df = 158, p < .001. Greater attendance was in turn associated with more self-reported drug-free days, indirectly linking contingency management and self-reported drug-free days. Contingency management is a valuable adjunct for increasing retention in treatment among patients with dual disorders in partial hospitalization treatment. Exposure to contingency management increases retention in treatment, which in turn contributes to increased drug-free days. Interventions for coping with psychological stress and drug cravings should be emphasized in intensive dual diagnosis group therapy.
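
    For readers unfamiliar with the "fishbowl" method, the sketch below illustrates the general prize-draw idea; the slip mix is the composition commonly cited in the prize-based contingency-management literature, not necessarily the one used in this program:

    ```python
    # Sketch of a contingency-management "fishbowl": each attended session
    # earns one draw, and most slips carry no or small prizes, which keeps
    # costs down while still reinforcing attendance.
    import random

    random.seed(0)
    bowl = (["Good job! (no prize)"] * 250 +
            ["small prize (~$1)"] * 209 +
            ["large prize (~$20)"] * 40 +
            ["jumbo prize (~$100)"] * 1)

    for session in range(1, 6):   # five attended sessions, five draws
        print(f"session {session}: {random.choice(bowl)}")
    ```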

  7. Contingency Operations of Americas Next Moon Rocket, Ares V

    Science.gov (United States)

    Jaap, John; Richardson, Lea

    2010-01-01

    America has begun the development of a new space vehicle system which will enable humans to return to the moon and reach even farther destinations. The system is called Constellation: it has 2 earth-launch vehicles, Ares I and Ares V; a crew module, Orion; and a lander, Altair with descent and ascent stages. Ares V will launch an Earth Departure Stage (EDS) and Altair into low earth orbit. Ares I will launch the Orion crew module into low earth orbit where it will rendezvous and dock with the Altair and EDS "stack". After rendezvous, the stack will contain four complete rocket systems, each capable of independent operations. Of course this multiplicity of vehicles provides a multiplicity of opportunities for off-nominal behavior and multiple mitigation options for each. Contingency operations are complicated by the issues of crew safety and the possibility of debris from the very large components impacting the ground. This paper examines contingency operations of the EDS in low earth orbit, during the boost to translunar orbit, and after the translunar boost. Contingency operations under these conditions have not been a consideration since the Apollo era and analysis of the possible contingencies and mitigations will take some time to evolve. Since the vehicle has not been designed, much less built, it is not possible to evaluate contingencies from a root-cause basis or from a probability basis; rather they are discussed at an effects level (such as the reaction control system is consuming propellant at a high rate). Mitigations for the contingencies are based on the severity of the off-nominal condition, the time of occurrence, recovery options, options for alternate missions, crew safety, evaluation of the condition (forensics) and future prevention. Some proposed mitigations reflect innovation in thinking and make use of the multiplicity of on-orbit resources including the crew; example: Orion could do a "fly around" to allow the crew to determine the condition

  8. Scan Statistics

    CERN Document Server

    Glaz, Joseph

    2009-01-01

    Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.

  9. Fuzzy-set based contingency ranking

    International Nuclear Information System (INIS)

    Hsu, Y.Y.; Kuo, H.C.

    1992-01-01

    In this paper, a new approach based on fuzzy set theory is developed for contingency ranking of the Taiwan power system. To examine whether a power system can remain in a secure and reliable operating state under contingency conditions, those contingency cases that will result in loss-of-load, loss-of-generation, or islanding are first identified. Then a 1P-1Q iteration of the fast decoupled load flow is performed to estimate post-contingent quantities (line flows, bus voltages) for the other contingency cases. Based on system operators' past experience, each post-contingent quantity is assigned a degree of severity according to the potential damage that could be imposed on the power system by that quantity, should the contingency occur. An approach based on fuzzy set theory is developed to deal with the imprecision of linguistic terms

  10. STATLIB, Interactive Statistics Program Library of Tutorial System

    International Nuclear Information System (INIS)

    Anderson, H.E.

    1986-01-01

    1 - Description of program or function: STATLIB is a conversational statistical program library developed in conjunction with a Sandia National Laboratories applied statistics course intended for practicing engineers and scientists. STATLIB is a group of 15 interactive, argument-free, statistical routines. Included are analysis of sensitivity tests; sample statistics for the normal, exponential, hypergeometric, Weibull, and extreme value distributions; three models of multiple regression analysis; x-y data plots; exact probabilities for RxC tables; n sets of m permuted integers in the range 1 to m; simple linear regression and correlation; K different random integers in the range m to n; and Fisher's exact test of independence for a 2 by 2 contingency table. Forty-five other subroutines in the library support the basic 15
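
    Two of the routines listed above, exact probabilities for RxC tables and Fisher's exact test for a 2 by 2 table, have compact modern equivalents; a sketch using scipy (invented counts, not the original library's code):

    ```python
    # Fisher's exact test of independence for a 2x2 contingency table,
    # plus the exact hypergeometric probability of the observed table.
    from scipy.stats import fisher_exact, hypergeom

    table = [[8, 2],
             [1, 5]]  # invented 2x2 counts

    odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
    print(f"odds ratio = {odds_ratio:.2f}, two-sided p = {p_value:.4f}")

    # With all margins fixed, the count in cell (1,1) is hypergeometric.
    a, b = table[0]
    c, d = table[1]
    p_observed = hypergeom.pmf(a, a + b + c + d, a + b, a + c)
    print(f"P(observed table) = {p_observed:.4f}")
    ```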

  11. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)
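
    The structural parallel invoked above can be made explicit in standard notation (our paraphrase, not the paper's own formulas):

    ```latex
    \[
      p(x \mid y) \;=\; \frac{p(x,y)}{p(y)}, \qquad
      H(X \mid Y) \;=\; -\sum_{x,y} p(x,y)\,\log p(x \mid y),
    \]
    % H(X|Y) is assembled from conditional probabilities exactly as the
    % mathematical entropy H(X) is assembled from marginal ones, which is
    % the sense in which the two conditionals share a logical structure.
    ```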

  12. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  13. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords: Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics OBOR OECD: Statistics and probability Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  14. Contingent factors affecting network learning

    OpenAIRE

    Peters, Linda D.; Pressey, Andrew D.; Johnston, Wesley J.

    2016-01-01

    To increase understanding of the impact of individuals on organizational learning processes, this paper explores the impact of individual cognition and action on the absorptive capacity process of the wider network. In particular this study shows how contingent factors such as social integration mechanisms and power relationships influence how network members engage in, and benefit from, learning. The use of cognitive consistency and sensemaking theory enables examination of how these conting...

  15. Suited Contingency Ops Food - 2

    Science.gov (United States)

    Glass, J. W.; Leong, M. L.; Douglas, G. L.

    2014-01-01

    The contingency scenario for an emergency cabin depressurization event may require crewmembers to subsist in a pressurized suit for up to 144 hours. This scenario requires the capability for safe nutrition delivery through a helmet feed port against a 4 psi pressure differential to enable crewmembers to maintain strength and cognition to perform critical tasks. Two nutritional delivery prototypes were developed and analyzed for compatibility with the helmet feed port interface and for operational effectiveness against the pressure differential. The bag-in-bag (BiB) prototype, designed to equalize the suit pressure with the beverage pouch and enable a crewmember to drink normally, delivered water successfully to three different subjects in suits pressurized to 4 psi. The Boa restrainer pouch, designed to provide mechanical leverage to overcome the pressure differential, did not operate sufficiently. Guidelines were developed and compiled for contingency beverages that provide macro-nutritional requirements, a minimum one-year shelf life, and compatibility with the delivery hardware. Evaluation results and food product parameters have the potential to be used to improve future prototype designs and develop complete nutritional beverages for contingency events. These feeding capabilities would have additional use on extended surface mission EVAs, where the current in-suit drinking device may be insufficient.

  16. Some Tests for Evaluation of Contingency Tables (for Biomedical Applications)

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2011-01-01

    Roč. 7, č. 1 (2011), s. 37-50 ISSN 1336-9180 R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords: contingency tables * hypothesis testing Subject RIV: BB - Applied Statistics, Operational Research http://jamsi.fpv.ucm.sk/docs/v07n01_05_2011/v07_n01_03_KALINA.pdf

  17. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2015-01-01

    A statistical procedure of estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. Also safety factor influence is investigated. DECOFF statistical method is benchmarked against standard Alpha-factor...

  18. Network location theory and contingency planning

    Energy Technology Data Exchange (ETDEWEB)

    Hakimi, S L

    1983-08-01

    A brief survey of results in network location theory is first presented. Then, a systems view of contingency planning is described. Finally, some results in location theory are re-examined and it is shown that they are motivated by contingency planning considerations. Some new issues and problems in location theory are described, which, if properly tackled, will have a substantial impact on contingency planning in transportation.

  19. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  20. 40 CFR 264.53 - Copies of contingency plan.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 25 2010-07-01 2010-07-01 false Copies of contingency plan. 264.53... Contingency Plan and Emergency Procedures § 264.53 Copies of contingency plan. A copy of the contingency plan... called upon to provide emergency services. [Comment: The contingency plan must be submitted to the...

  1. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  2. Pavlovian contingencies and temporal information.

    Science.gov (United States)

    Balsam, Peter D; Fairhurst, Stephen; Gallistel, Charles R

    2006-07-01

    The effects of altering the contingency between the conditioned stimulus (CS) and the unconditioned stimulus (US) on the acquisition of autoshaped responding were investigated by changing the frequency of unsignaled USs during the intertrial interval. The addition of the unsignaled USs had an effect on acquisition speed comparable with that of massing trials. The effects of these manipulations can be understood in terms of their effect on the amount of information (number of bits) that the average CS conveys to the subject about the timing of the next US. The number of reinforced CSs prior to acquisition is inversely related to the information content of the CS.
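
    The information-theoretic reading can be condensed into one formula, with notation assumed from the rate-based timing literature rather than quoted from the paper: if C is the mean US-US interval and T the mean CS-US interval, the information a CS onset conveys about the timing of the next US is

    ```latex
    \[
      H_{\mathrm{CS}} \;=\; \log_2 \frac{C}{T} \quad \text{(bits)}.
    \]
    % Adding unsignaled USs in the intertrial interval shortens C, and
    % massing trials shortens C relative to T; both reduce log2(C/T),
    % consistent with the comparable slowing of acquisition reported above.
    ```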

  3. Probability in High Dimension

    Science.gov (United States)

    2014-06-30

    (Abstract not recoverable: the source text consists of stray fragments of the report body and of its bibliography, including references to the Saint-Flour lecture notes in probability.)

  4. Impaired awareness of action-outcome contingency and causality during healthy ageing and following ventromedial prefrontal cortex lesions.

    Science.gov (United States)

    O'Callaghan, Claire; Vaghi, Matilde M; Brummerloh, Berit; Cardinal, Rudolf N; Robbins, Trevor W

    2018-02-02

    Detecting causal relationships between actions and their outcomes is fundamental to guiding goal-directed behaviour. The ventromedial prefrontal cortex (vmPFC) has been extensively implicated in computing these environmental contingencies, via animal lesion models and human neuroimaging. However, whether the vmPFC is critical for contingency learning, and whether it can occur without subjective awareness of those contingencies, has not been established. To address this, we measured response adaptation to contingency and subjective awareness of action-outcome relationships in individuals with vmPFC lesions and healthy elderly subjects. We showed that in both vmPFC damage and ageing, successful behavioural adaptation to variations in action-outcome contingencies was maintained, but subjective awareness of these contingencies was reduced. These results highlight two contexts where performance and awareness have been dissociated, and show that learning response-outcome contingencies to guide behaviour can occur without subjective awareness. Preserved responding in the vmPFC group suggests that this region is not critical for computing action-outcome contingencies to guide behaviour. In contrast, our findings highlight a critical role for the vmPFC in supporting awareness, or metacognitive ability, during learning. We further advance the hypothesis that responding to changing environmental contingencies, whilst simultaneously maintaining conscious awareness of those statistical regularities, is a form of dual-tasking that is impaired in ageing due to reduced prefrontal function. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  5. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the 7 presentations made at the third meeting 'physics and fundamental questions', whose theme was probability and prediction. The concept of probability, invented to apprehend random phenomena, has become an important branch of mathematics, and its applications range from radioactivity to species evolution, via cosmology and the management of very weak risks. The notion of probability is the basis of quantum mechanics and is thus bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  6. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  7. Learning predictive statistics from temporal sequences: Dynamics and strategies.

    Science.gov (United States)

    Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E; Kourtzi, Zoe

    2017-10-01

    Human behavior is guided by our expectations about the future. Often, we make predictions by monitoring how event sequences unfold, even though such sequences may appear incomprehensible. Event structures in the natural environment typically vary in complexity, from simple repetition to complex probabilistic combinations. How do we learn these structures? Here we investigate the dynamics of structure learning by tracking human responses to temporal sequences that change in structure unbeknownst to the participants. Participants were asked to predict the upcoming item following a probabilistic sequence of symbols. Using a Markov process, we created a family of sequences, from simple frequency statistics (e.g., some symbols are more probable than others) to context-based statistics (e.g., symbol probability is contingent on preceding symbols). We demonstrate the dynamics with which individuals adapt to changes in the environment's statistics; that is, they extract the behaviorally relevant structures to make predictions about upcoming events. Further, we show that this structure learning relates to individual decision strategy; faster learning of complex structures relates to selection of the most probable outcome in a given context (maximizing) rather than matching of the exact sequence statistics. Our findings provide evidence for alternate routes to learning of behaviorally relevant statistics that facilitate our ability to predict future events in variable environments.
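
    A toy version of the paradigm (assumed parameters; none of this is the authors' material) shows how the two strategies separate on context-based statistics:

    ```python
    # Two idealized observers predict the next symbol of a first-order
    # Markov sequence: "maximizing" always picks the most probable
    # continuation, "matching" samples from the learned conditionals.
    import numpy as np

    rng = np.random.default_rng(0)
    P = np.array([[0.8, 0.2],   # P(next symbol | previous = 0)
                  [0.3, 0.7]])  # P(next symbol | previous = 1)

    seq = [0]
    for _ in range(10_000):
        seq.append(rng.choice(2, p=P[seq[-1]]))
    seq = np.array(seq)

    prev, nxt = seq[:-1], seq[1:]
    hits_max = hits_match = 0
    for s in (0, 1):
        idx = prev == s
        cond = np.bincount(nxt[idx], minlength=2) / idx.sum()  # learned P
        hits_max += (nxt[idx] == cond.argmax()).sum()
        hits_match += (nxt[idx] == rng.choice(2, size=idx.sum(), p=cond)).sum()

    n = len(nxt)
    print(f"maximizing accuracy: {hits_max / n:.2f}")   # ~0.76 here
    print(f"matching accuracy:   {hits_match / n:.2f}") # ~0.64 here
    ```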

  8. Contingent Diversity on Anthropic Landscapes

    Directory of Open Access Journals (Sweden)

    William Balée

    2010-02-01

    Full Text Available Behaviorally modern human beings have lived in Amazonia for thousands of years. Significant dynamics in species turnovers due to human-mediated disturbance were associated with the ultimate emergence and expansion of agrarian technologies in prehistory. Such disturbances initiated primary and secondary landscape transformations in various locales of the Amazon region. Diversity in these locales can be understood by accepting the initial premise of contingency, expressed as unprecedented human agency and human history. These effects can be accessed through the archaeological record and in the study of living languages. In addition, landscape transformation can be demonstrated in the study of traditional knowledge (TK. One way of elucidating TK distinctions between anthropic and nonanthropic landscapes concerns elicitation of differential labeling of these landscapes and more significantly, elicitation of the specific contents, such as trees, occurring in these landscapes. Freelisting is a method which can be used to distinguish the differential species compositions of landscapes resulting from human-mediated disturbance vs. those which do not evince records of human agency and history. The TK of the Ka’apor Indians of Amazonian Brazil as revealed in freelisting exercises shows differentiation of anthropogenic from high forests as well as a recognition of diversity in the anthropogenic forests. This suggests that the agents of human-mediated disturbance and landscape transformation in traditional Amazonia encode diversity and contingency into their TK, which encoding reflects past cultural influence on landscape and society over time.

  9. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  10. 48 CFR 1318.201 - Contingency operation.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Contingency operation. 1318.201 Section 1318.201 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE CONTRACTING METHODS AND CONTRACT TYPES EMERGENCY ACQUISITIONS Emergency Acquisition Flexibilities 1318.201 Contingency...

  11. 7 CFR 457.9 - Appropriation contingency.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 6 2010-01-01 2010-01-01 false Appropriation contingency. 457.9 Section 457.9 Agriculture Regulations of the Department of Agriculture (Continued) FEDERAL CROP INSURANCE CORPORATION, DEPARTMENT OF AGRICULTURE COMMON CROP INSURANCE REGULATIONS § 457.9 Appropriation contingency...

  12. Reporting, Recording, and Transferring Contingency Demand Data

    National Research Council Canada - National Science Library

    Smith, Bernard

    2000-01-01

    .... In this report, we develop a standard set of procedures for reporting and recording demand data at the contingency location and transferring contingency demand data to the home base - ensuring proper level allocation and valid worldwide peacetime operating stock (POS) and readiness spares package (RSP) requirements.

  13. Contingency management: perspectives of Australian service providers.

    Science.gov (United States)

    Cameron, Jacqui; Ritter, Alison

    2007-03-01

    Given the very positive and extensive research evidence demonstrating efficacy and effectiveness of contingency management, it is important that Australia explore whether contingency management has a role to play in our own treatment context. Qualitative interviews were conducted with 30 experienced alcohol and drug practitioners, service managers and policy-makers in Victoria. Interviewees were selected to represent the range of drug treatment services types and included rural representation. A semi-structured interview schedule, covering their perceptions and practices of contingency management was used. All interviews were transcribed verbatim and analysed using N2 qualitative data analysis program. The majority of key informants were positively inclined toward contingency management, notwithstanding some concerns about the philosophical underpinnings. Concerns were raised in relation to the use of monetary rewards. Examples of the use of contingency management provided by key informants demonstrated an over-inclusive definition: all the examples did not adhere to the key principles of contingency management. This may create problems if a structured contingency management were to be introduced in Australia. Contingency management is an important adjunctive treatment intervention and its use in Australia has the potential to enhance treatment outcomes. No unmanageable barriers were identified in this study.

  14. Contingent Attentional Capture by Conceptually Relevant Images

    Science.gov (United States)

    Wyble, Brad; Folk, Charles; Potter, Mary C.

    2013-01-01

    Attentional capture is an unintentional shift of visuospatial attention to the location of a distractor that is either highly salient, or relevant to the current task set. The latter situation is referred to as contingent capture, in that the effect is contingent on a match between characteristics of the stimuli and the task-defined…

  15. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which include subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  16. Probability mapping of contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).
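
    The post-processing step is compact enough to sketch (synthetic lognormal fields stand in for the conditional geostatistical simulations; the threshold is invented):

    ```python
    # Probability mapping: across many equally likely simulated
    # contamination maps, the exceedance probability at each parcel is
    # simply the fraction of realizations above the clean-up threshold.
    import numpy as np

    rng = np.random.default_rng(42)
    n_real, ny, nx = 500, 20, 20

    # Stand-in for conditional geostatistical simulations.
    sims = rng.lognormal(mean=2.0, sigma=0.8, size=(n_real, ny, nx))

    threshold = 15.0                               # invented clean-up level
    prob_exceed = (sims > threshold).mean(axis=0)  # per-parcel P(exceed)
    print(prob_exceed.round(2))                    # map of values in [0, 1]
    ```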

  17. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)

  18. Thevenin Equivalent Method for Dynamic Contingency Assessment

    DEFF Research Database (Denmark)

    Møller, Jakob Glarbo; Jóhannsson, Hjörtur; Østergaard, Jacob

    2015-01-01

    A method that exploits Thevenin equivalent representation for obtaining post-contingency steady-state nodal voltages is integrated with a method of detecting post-contingency aperiodic small-signal instability. The task of integrating stability assessment with contingency assessment is challenged... by the cases of unstable post-contingency conditions. For unstable post-contingency conditions there exists no credible steady state which can be used as the basis of a stability assessment. This paper demonstrates how Thevenin equivalent methods can be applied in algebraic representation of such bifurcation... points, which may be used in assessment of post-contingency aperiodic small-signal stability. The assessment method is introduced with a numeric example.

  19. Four hundred or more participants needed for stable contingency table estimates of clinical prediction rule performance

    DEFF Research Database (Denmark)

    Kent, Peter; Boyle, Eleanor; Keating, Jennifer L

    2017-01-01

    OBJECTIVE: To quantify variability in the results of statistical analyses based on contingency tables and discuss the implications for the choice of sample size for studies that derive clinical prediction rules. STUDY DESIGN AND SETTING: An analysis of three pre-existing sets of large cohort data......, odds ratios and risk/prevalence ratios, for each sample size was calculated. RESULTS: There were very wide, and statistically significant, differences in estimates derived from contingency tables from the same dataset when calculated in sample sizes below 400 people, and typically this variability...... stabilized in samples of 400 to 600 people. Although estimates of prevalence also varied significantly in samples below 600 people, that relationship only explains a small component of the variability in these statistical parameters. CONCLUSION: To reduce sample-specific variability, contingency tables...
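
    The instability below roughly 400 participants is easy to reproduce in a sketch (synthetic cohort with invented prevalences, not the study's data): subsample n people, build the 2 by 2 table, and watch the spread of the odds ratio shrink as n grows:

    ```python
    # Variability of contingency-table odds ratios as a function of
    # sample size, using repeated subsampling from one large cohort.
    import numpy as np

    rng = np.random.default_rng(1)
    N = 20_000
    predictor = rng.random(N) < 0.4
    outcome = rng.random(N) < np.where(predictor, 0.30, 0.15)  # true OR ~ 2.4

    for n in (100, 200, 400, 800):
        ors = []
        for _ in range(1_000):
            i = rng.choice(N, size=n, replace=False)
            p, o = predictor[i], outcome[i]
            a = (p & o).sum(); b = (p & ~o).sum()
            c = (~p & o).sum(); d = (~p & ~o).sum()
            # Haldane correction (+0.5) guards against zero cells.
            ors.append((a + 0.5) * (d + 0.5) / ((b + 0.5) * (c + 0.5)))
        lo, hi = np.percentile(ors, [2.5, 97.5])
        print(f"n = {n:4d}: 95% of odds ratios fall in [{lo:.2f}, {hi:.2f}]")
    ```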

  20. Outlier identification procedures for contingency tables using maximum likelihood and L1 estimates

    NARCIS (Netherlands)

    Kuhnt, S.

    2004-01-01

    Observed cell counts in contingency tables are perceived as outliers if they have low probability under an anticipated loglinear Poisson model. New procedures for the identification of such outliers are derived using the classical maximum likelihood estimator and an estimator based on the L1 norm.

  1. Contingency Contracting within the Department of Defense: A Comparative Analysis

    National Research Council Canada - National Science Library

    McMillion, Chester

    2000-01-01

    .... The thesis compares and contrasts the regulations governing the contingency contracting operations, the organization structure, contingency contracting support plans, and the training requirements...

  2. Historical Contingency in Controlled Evolution

    Science.gov (United States)

    Schuster, Peter

    2014-12-01

    A basic question in evolution concerns the nature of evolutionary memory. At thermodynamic equilibrium, at stable stationary states, or at other stable attractors, the memory of the path leading to the long-time solution is erased, at least in part. Similar arguments hold for unique optima. Optimality in biology is discussed on the basis of microbial metabolism. Biology, on the other hand, is characterized by historical contingency, which has recently become accessible to experimental test in bacterial populations evolving under controlled conditions. Computer simulations give additional insight into the nature of the evolutionary memory, which is ultimately caused by the enormous space of possibilities, so large that it escapes all attempts at visualization. In essence, this contribution deals with two questions of current evolutionary theory: (i) Are organisms operating at optimal performance? and (ii) How is the evolutionary memory built up in populations?

  3. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Contents: Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence; Discrete Distributions; Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation; Continuous Distributions; The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  4. Equilibria of perceptrons for simple contingency problems.

    Science.gov (United States)

    Dawson, Michael R W; Dupuis, Brian

    2012-08-01

    The contingency between cues and outcomes is fundamentally important to theories of causal reasoning and to theories of associative learning. Researchers have computed the equilibria of Rescorla-Wagner models for a variety of contingency problems, and have used these equilibria to identify situations in which the Rescorla-Wagner model is consistent, or inconsistent, with normative models of contingency. Mathematical analyses that directly compare artificial neural networks to contingency theory have not been performed, because of the assumed equivalence between the Rescorla-Wagner learning rule and the delta rule training of artificial neural networks. However, recent results indicate that this equivalence is not as straightforward as typically assumed, suggesting a strong need for mathematical accounts of how networks deal with contingency problems. One such analysis is presented here, where it is proven that the structure of the equilibrium for a simple network trained on a basic contingency problem is quite different from the structure of the equilibrium for a Rescorla-Wagner model faced with the same problem. However, these structural differences lead to functionally equivalent behavior. The implications of this result for the relationships between associative learning, contingency theory, and connectionism are discussed.
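
    Although the paper's contribution is analytical, the behavior it analyzes is easy to simulate; a minimal sketch, assuming a simple contingency problem (P(US|cue) = 0.8, P(US|context alone) = 0.4) and invented learning parameters:

    ```python
    # Rescorla-Wagner (delta-rule) learning of a basic contingency
    # problem; the cue's asymptotic strength tracks the contingency
    # deltaP = 0.8 - 0.4 = 0.4 relative to the background context.
    import numpy as np

    rng = np.random.default_rng(7)
    alpha, lam = 0.05, 1.0
    v_cue = v_ctx = 0.0            # associative strengths

    for _ in range(20_000):
        cue = rng.random() < 0.5
        us = rng.random() < (0.8 if cue else 0.4)
        pred = v_ctx + (v_cue if cue else 0.0)
        err = (lam if us else 0.0) - pred   # prediction error
        v_ctx += alpha * err                # context present every trial
        if cue:
            v_cue += alpha * err

    print(f"V(cue) ~ {v_cue:.2f}, V(context) ~ {v_ctx:.2f}")
    # Near equilibrium V(context) ~ 0.4 and V(cue) ~ 0.4, so cue+context
    # together predict the programmed 0.8 reinforcement rate.
    ```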

  5. ACCOUNTING FOR CONTINGENT CONSIDERATIONS IN BUSINESS COMBINATIONS

    Directory of Open Access Journals (Sweden)

    Gurgen KALASHYAN

    2017-07-01

    Full Text Available According to IFRS 3 Business Combinations, contingent considerations must be included in the total consideration given for the acquired entity along with cash, other assets, ordinary or preference equity instruments, options, and warrants. The contingent consideration is a determined amount which the acquiring entity has to pay to the acquired entity provided that certain conditions are fulfilled in the future. In case the provisions are not satisfied, we get the situation where the amount of contingent consideration has been included in the total consideration given in the business combination but, in fact, the acquirer has not paid that amount. In its turn, the acquired entity will recognize the contingent consideration as a financial asset according to IFRS 9 Financial Instruments. In that case, it would be appropriate to recognize the contingent consideration as a contingent asset applying IAS 37. In the article the author explores the challenges of contingent consideration accounting and suggests ways of solving the above-mentioned problems.

  6. Statistical processing of experimental data

    OpenAIRE

    NAVRÁTIL, Pavel

    2012-01-01

    This thesis presents the theory of probability and statistical sets, with solved and unsolved problems covering probability, random variables and their distributions, random vectors, statistical sets, and regression and correlation analysis. Solutions are provided for the unsolved problems.

  7. Contingent Faculty Perceptions of Organizational Support, Workplace Attitudes, and Teaching Evaluations at a Public Research University

    Directory of Open Access Journals (Sweden)

    Min Young Cha

    2016-03-01

    Full Text Available This research examines contingent faculty's perception of organizational support, workplace attitudes, and Student Ratings of Teaching (SRT) in a large public research university to investigate their employee-organization relationship. According to t-tests and regression analyses for samples of 2,229 faculty and instructional staff who answered the survey and had SRT data (tenured and tenure-track faculty: 1,708, 76.6% of total; contingent faculty: 521, 23.4% of total), the employment relationship of contingent faculty in this institution was closer to a combined economic and social exchange model than to a pure economic exchange model or underinvestment model. Contingent faculty's satisfaction with work, satisfaction with coworkers, perception of being supported at work, and affective organizational commitment were higher than tenured and tenure-track faculty's at a statistically significant level. In addition, contingent faculty had higher SRT mean results in all areas of SRT items in medium-size (10-30) classes and in 'class presentation,' 'feedback,' 'deeper understanding,' and 'interest stimulated' in large-size (30-50) classes than tenured and tenure-track faculty. These results not only refute the misconception that contingent faculty have too little time to provide students with feedback but also support the view that they provide students with good teaching, at least in medium-size and large-size classes. Whereas these results might be partially attributable to the relatively stable status of contingent faculty in this study (who work at more than 50 percent FTE), they indicate that, as a collective, contingent faculty also represent a significant contributor to the university, who are satisfied with their work, enjoy the community they are in, and are committed to their institution.

  8. Thoughts Without Content are Empty, Intuitions Without Concepts are Blind - Determinism and Contingency Revisited

    Science.gov (United States)

    Pohorille, Andrew

    2011-01-01

    similar reasoning is common in other fields of science, for example in statistical mechanics. Some trajectories lead to life, perhaps in different forms, whereas others do not. Our true interest is the ratio of these two outcomes. The issue of determinism does not directly enter the picture. The debate about the likelihood of the emergence of life is quite old. One view holds that the origin of life is an event governed by chance, and the result of so many random events (contingencies) is unpredictable. This view was eloquently expressed by Monod. In his book "Chance and Necessity" he argued that life was a product of "nature's roulette." In an alternative view, expressed in particular by de Duve and Morowitz, the origin of life is considered a highly probable or even inevitable event (although its details need not be determined in every respect). Only in this sense can the origin of life be considered a "deterministic event".

  9. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
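
    One classic instance of the phenomenon (our illustration; the article's three problems are not named above):

    ```latex
    \[
      \Pr(\text{no success in } n \text{ trials at rate } 1/n)
      \;=\; \Bigl(1 - \tfrac{1}{n}\Bigr)^{n}
      \;\longrightarrow\; e^{-1} \approx 0.3679
      \qquad (n \to \infty),
    \]
    % the same limit behind the matching problem: the probability that a
    % random permutation returns no letter to its own envelope tends to 1/e.
    ```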

  10. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  11. Understanding advanced statistical methods

    CERN Document Server

    Westfall, Peter

    2013-01-01

    Contents: Introduction: Probability, Statistics, and Science; Reality, Nature, Science, and Models; Statistical Processes: Nature, Design and Measurement, and Data; Models; Deterministic Models; Variability; Parameters; Purely Probabilistic Statistical Models; Statistical Models with Both Deterministic and Probabilistic Components; Statistical Inference; Good and Bad Models; Uses of Probability Models. Random Variables and Their Probability Distributions: Introduction; Types of Random Variables: Nominal, Ordinal, and Continuous; Discrete Probability Distribution Functions; Continuous Probability Distribution Functions; Some Calculus: Derivatives and Least Squares; More Calculus: Integrals and Cumulative Distribution Functions. Probability Calculation and Simulation: Introduction; Analytic Calculations, Discrete and Continuous Cases; Simulation-Based Approximation; Generating Random Numbers. Identifying Distributions: Introduction; Identifying Distributions from Theory Alone; Using Data: Estimating Distributions via the Histogram; Quantiles: Theoretical and Data-Based Estimate...

  12. DETERMINE THE PROBABILITY OF PASSENGER SURVIVAL IN AN AVIATION INCIDENT WITH FIRE ON THE GROUND

    Directory of Open Access Journals (Sweden)

    Vladislav Pavlovich Turko

    2017-05-01

    Full Text Available The risk level of an aviation incident involving fire, and the impact of contingent affecting factors on people, is assessed. Based on statistical data on aviation incidents, a model of an aircraft fire situation on the ground is offered.

  13. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  14. 40 CFR 265.54 - Amendment of contingency plan.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 25 2010-07-01 2010-07-01 false Amendment of contingency plan. 265.54... DISPOSAL FACILITIES Contingency Plan and Emergency Procedures § 265.54 Amendment of contingency plan. The contingency plan must be reviewed, and immediately amended, if necessary, whenever: (a) Applicable regulations...

  15. 40 CFR 265.53 - Copies of contingency plan.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 25 2010-07-01 2010-07-01 false Copies of contingency plan. 265.53... DISPOSAL FACILITIES Contingency Plan and Emergency Procedures § 265.53 Copies of contingency plan. A copy of the contingency plan and all revisions to the plan must be: (a) Maintained at the facility; and (b...

  16. 40 CFR 264.54 - Amendment of contingency plan.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 25 2010-07-01 2010-07-01 false Amendment of contingency plan. 264.54 Section 264.54 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES... Contingency Plan and Emergency Procedures § 264.54 Amendment of contingency plan. The contingency plan must be...

  17. The Corps Engineer Battalion in Contingency Operations

    National Research Council Canada - National Science Library

    Raymer, James

    2001-01-01

    .... The central research question asks: Is the proposed echelons above division engineer battalion design a better one for active and reserve component corps engineer forces to respond in a contingency...

  18. Strategy as Mutually Contingent Choice

    Directory of Open Access Journals (Sweden)

    Neil Martin

    2016-05-01

    Full Text Available Thomas Schelling's The Strategy of Conflict carries significant behavioral implications which have been overlooked by economic readers. I argue that these implications are central to Schelling's vision of game theory, that they fit well with recent advances in experimental psychology and behavioral economics, and provide a comprehensive framework that can inform research on strategy. In my view, Schelling develops a non-mathematical approach to strategy which anticipates Gigerenzer and Selten's "ecological rationality" program. This approach maps the processes involved in strategic reasoning and highlights their reliance on the particular information structure of interactive social environments. Building on this approach, I model strategy as a heuristic form of reasoning that governs the way in which individuals search for and provide cues in situations of mutually contingent choice. I conclude by examining how the reference to ecological rationality can help clarify Schelling's contribution to game theory and outline potential avenues of research into strategic reasoning and interaction.

  19. Probability sampling in legal cases: Kansas cellphone users

    Science.gov (United States)

    Kadane, Joseph B.

    2012-10-01

    Probability sampling is a standard statistical technique. This article introduces the basic ideas of probability sampling, and shows in detail how probability sampling was used in a particular legal case.

  20. Probabilistic real-time contingency ranking method

    International Nuclear Information System (INIS)

    Mijuskovic, N.A.; Stojnic, D.

    2000-01-01

    This paper describes a real-time contingency ranking method based on a probabilistic index: expected energy not supplied. In this way it is possible to take into account the stochastic nature of electric power system equipment outages. This approach enables more comprehensive ranking of contingencies, and it is possible to derive reliability cost values that can form the basis for hourly spot price calculations. The electric power system of Serbia is used as an example for the proposed method. (author)
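
    The index itself is simple to sketch (illustrative numbers, not the paper's system data): rank each contingency by its outage probability times the energy that would go unserved if it occurred:

    ```python
    # Probabilistic contingency ranking by expected energy not supplied:
    # EENS = P(outage) * energy not supplied given the outage.
    contingencies = [
        # (name, outage probability over the study period, MWh unserved)
        ("line A-B out", 1e-3, 120.0),
        ("line B-C out", 5e-4, 400.0),
        ("generator G1 out", 2e-3, 60.0),
    ]

    ranked = sorted(contingencies, key=lambda c: c[1] * c[2], reverse=True)
    for name, p, ens in ranked:
        print(f"{name:17s} EENS = {p * ens:.3f} MWh")
    ```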

  1. ACCOUNTING FOR CONTINGENT CONSIDERATIONS IN BUSINESS COMBINATIONS

    OpenAIRE

    Gurgen KALASHYAN

    2017-01-01

    According to IFRS 3 Business Combinations, contingent considerations must be included in the total consideration given for the acquired entity, along with cash, other assets, ordinary or preference equity instruments, options, and warrants. The contingent consideration is a determined amount which the acquiring entity has to pay to the acquired entity provided that certain conditions are fulfilled in the future. In case the provisions are not satisfied, we will get the situation when the amount of c...

  2. The Contingent Value of Organizational Integration

    Directory of Open Access Journals (Sweden)

    Virpi Turkulainen

    2013-08-01

    Full Text Available We elaborate the link between organizational design and effectiveness by examining organizational integration and performance in the context of modern manufacturing. Through careful contextualization and empirical analysis of 266 manufacturing organizations in three industries and nine countries, we uncover a joint effect of integration and complexity on organizational effectiveness. The results extend structural contingency theory, in particular the mechanisms that link organizational integration to organizational effectiveness. We conclude by discussing the continuing relevance of structural contingency theory.

  3. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  4. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  5. Probability shapes perceptual precision: A study in orientation estimation.

    Science.gov (United States)

    Jabar, Syaheed B; Anderson, Britt

    2015-12-01

    Probability is known to affect perceptual estimations, but an understanding of mechanisms is lacking. Moving beyond binary classification tasks, we had naive participants report the orientation of briefly viewed gratings where we systematically manipulated contingent probability. Participants rapidly developed faster and more precise estimations for high-probability tilts. The shapes of their error distributions, as indexed by a kurtosis measure, also showed a distortion from Gaussian. This kurtosis metric was robust, capturing probability effects that were graded, contextual, and varying as a function of stimulus orientation. Our data can be understood as a probability-induced reduction in the variability or "shape" of estimation errors, as would be expected if probability affects the perceptual representations. As probability manipulations are an implicit component of many endogenous cuing paradigms, changes at the perceptual level could account for changes in performance that might have traditionally been ascribed to "attention." (c) 2015 APA, all rights reserved.

  6. Using contingency management procedures to reduce at-risk drinking in heavy drinkers.

    Science.gov (United States)

    Dougherty, Donald M; Lake, Sarah L; Hill-Kapturczak, Nathalie; Liang, Yuanyuan; Karns, Tara E; Mullen, Jillian; Roache, John D

    2015-04-01

    Treatments for alcohol use disorders typically have been abstinence based, but harm reduction approaches, which encourage drinkers to alter their drinking behavior to reduce the probability of alcohol-related consequences, have gained in popularity. This study used a contingency management procedure to determine its effectiveness in reducing alcohol consumption among heavy drinkers. Eighty-two nontreatment-seeking heavy drinkers (ages 21 to 54, M = 30.20) who did not meet diagnostic criteria for alcohol dependence participated in the study. The study had 3 phases: (i) an Observation phase (4 weeks) where participants drank normally; (ii) a Contingency Management phase (12 weeks) where participants were paid $50 weekly for not exceeding low levels of alcohol consumption as measured by transdermal alcohol concentrations; and (iii) a Follow-up phase where the contingencies were removed. Transdermal alcohol monitors were used to verify meeting contingency requirements; all other analyses were conducted on self-reported alcohol use. On average, 42.3% of participants met the contingency criteria and were paid an average of $222 during the Contingency Management phase, with an average $1,998 in total compensation throughout the study. Compared to the Observation phase, the percent of any self-reported drinking days significantly decreased from 59.9 to 40.0% in the Contingency Management and 32.0% in the Follow-up phases. The percent of self-reported heavy drinking days also significantly decreased from 42.4% in the Observation phase to 19.7% in the Contingency Management phase, which was accompanied by a significant increase in percent days of self-reported no (from 40.1 to 60.0%) and low-level drinking (from 9.9 to 15.4%). Self-reported reductions in drinking either persisted, or became more pronounced, during the Follow-up phase. Contingency management was associated with a reduction in self-reported episodes of heavy drinking among nontreatment-seeking heavy drinkers. These effects persisted even

  7. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  8. Schema bias in source monitoring varies with encoding conditions: support for a probability-matching account.

    Science.gov (United States)

    Kuhlmann, Beatrice G; Vaterrodt, Bianca; Bayen, Ute J

    2012-09-01

    Two experiments examined reliance on schematic knowledge in source monitoring. Based on a probability-matching account of source guessing, a schema bias will only emerge if participants do not have a representation of the source-item contingency in the study list, or if the perceived contingency is consistent with schematic expectations. Thus, the account predicts that encoding conditions that affect contingency detection also affect schema bias. In Experiment 1, the schema bias commonly found when schematic information about the sources is not provided before encoding was diminished by an intentional source-memory instruction. In Experiment 2, the depth of processing of schema-consistent and schema-inconsistent source-item pairings was manipulated. Participants consequently overestimated the occurrence of the pairing type they processed in a deep manner, and their source guessing reflected this biased contingency perception. Results support the probability-matching account of source guessing. PsycINFO Database Record (c) 2012 APA, all rights reserved.
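
The probability-matching idea can be made concrete with a toy simulation (not the authors' procedure; the contingency value and trial counts are invented): a guesser who matches the perceived source-item contingency reproduces that contingency in their guessing rates rather than always picking the more likely source.

```python
import random

# Toy probability-matching guesser. Suppose 70% of "type X" study items came
# from source A; when memory fails, the guesser says "A" with probability
# equal to that perceived contingency instead of always guessing "A".
random.seed(1)
perceived_p_source_a = 0.70
guesses = ["A" if random.random() < perceived_p_source_a else "B"
           for _ in range(10_000)]
print("proportion guessed A:", guesses.count("A") / len(guesses))  # ~0.70
```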

  9. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  10. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Background: Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives: The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods: Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results: Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions: Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
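
The paper supplies R implementations; as a rough Python analogue (synthetic data and illustrative parameters only, not the authors' code), a random forest fit as a nonparametric regression on 0/1 labels yields estimates of P(Y = 1 | X):

```python
# A "probability machine" sketch: a random forest used as nonparametric
# regression on a binary outcome, so predictions estimate P(Y = 1 | X).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 3))
true_p = 1 / (1 + np.exp(-(0.8 * X[:, 0] - 1.2 * X[:, 1])))  # logistic truth
y = rng.binomial(1, true_p)

rf = RandomForestRegressor(n_estimators=500, min_samples_leaf=25, random_state=0)
rf.fit(X, y)
p_hat = rf.predict(X)  # regression on 0/1 labels -> probability estimates
print("mean abs error vs true probabilities:", np.abs(p_hat - true_p).mean())
```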

  11. Problems with Contingency Theory: Testing Assumptions Hidden within the Language of Contingency "Theory".

    Science.gov (United States)

    Schoonhoven, Clausia Bird

    1981-01-01

    Discusses problems in contingency theory, which relates organizational structure to the tasks performed and the information needed. Analysis of data from 17 hospitals suggests that traditional contingency theory underrepresents the complexity of relations among technological uncertainty, structure, and organizational effectiveness. (Author/RW)

  12. Optimal self-esteem is contingent: Intrinsic versus extrinsic and upward versus downward contingencies

    NARCIS (Netherlands)

    Vonk, R.; Smit, H.M.M.

    2012-01-01

    We argue that noncontingent, unconditional self-esteem is not optimal but defensive. We introduce the concept of intrinsic contingency, where self-esteem is affected by whether one's actions are self-congruent and conducive to personal growth. Whereas external contingencies, especially social and

  13. Statistical optics

    Science.gov (United States)

    Goodman, J. W.

    This book is based on the thesis that some training in the area of statistical optics should be included as a standard part of any advanced optics curriculum. Random variables are discussed, taking into account definitions of probability and random variables, distribution functions and density functions, an extension to two or more random variables, statistical averages, transformations of random variables, sums of real random variables, Gaussian random variables, complex-valued random variables, and random phasor sums. Other subjects examined are related to random processes, some first-order properties of light waves, the coherence of optical waves, some problems involving high-order coherence, effects of partial coherence on imaging systems, imaging in the presence of randomly inhomogeneous media, and fundamental limits in photoelectric detection of light. Attention is given to deterministic versus statistical phenomena and models, the Fourier transform, and the fourth-order moment of the spectrum of a detected speckle image.

  14. Contingency Contractor Optimization Phase 3 Sustainment Software Design Document - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa; Jones, Katherine A

    2016-05-01

    This document describes the final software design of the Contingency Contractor Optimization Tool - Prototype. Its purpose is to provide the overall architecture of the software and the logic behind this architecture. Documentation for the individual classes is provided in the application Javadoc. The Contingency Contractor Optimization project is intended to address Department of Defense mandates by delivering a centralized strategic planning tool that allows senior decision makers to quickly and accurately assess the impacts, risks, and mitigation strategies associated with utilizing contract support. The Contingency Contractor Optimization Tool - Prototype was developed in Phase 3 of the OSD ATL Contingency Contractor Optimization project to support strategic planning for contingency contractors. The planning tool uses a model to optimize the Total Force mix by minimizing the combined total costs for selected mission scenarios. The model optimizes the match of personnel types (military, DoD civilian, and contractors) and capabilities to meet mission requirements as effectively as possible, based on risk, cost, and other requirements.

  15. Searching for Plausible N-k Contingencies Endangering Voltage Stability

    DEFF Research Database (Denmark)

    Weckesser, Johannes Tilman Gabriel; Van Cutsem, Thierry

    2017-01-01

    This paper presents a novel search algorithm using time-domain simulations to identify plausible N − k contingencies endangering voltage stability. Starting from an initial list of disturbances, progressively more severe contingencies are investigated. After simulation of an N − k contingency, the simulation results are assessed. If the system response is unstable, a plausible harmful contingency sequence has been found. Otherwise, components affected by the contingencies are considered as candidate next events leading to N − (k + 1) contingencies. This implicitly takes into account hidden failures...
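
A schematic rendering of the search loop described in the abstract (a sketch only: `simulate` and `affected_components` are placeholders for the paper's time-domain simulation and protection modeling, not real implementations):

```python
# Grow contingency sequences event by event. A sequence whose simulated
# response is unstable is recorded as a plausible harmful N-k contingency;
# otherwise the components it affects seed the N-(k+1) candidates.
from itertools import count

def search_harmful_sequences(initial_events, simulate, affected_components, max_k=3):
    harmful = []
    frontier = [[e] for e in initial_events]      # all N-1 sequences
    for k in count(1):
        if k > max_k or not frontier:
            break
        next_frontier = []
        for seq in frontier:
            if not simulate(seq):                 # unstable system response
                harmful.append(seq)
            else:
                for comp in affected_components(seq):
                    next_frontier.append(seq + [comp])
        frontier = next_frontier                  # N-(k+1) candidates
    return harmful
```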

  16. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the groundwork to later claim the existence of stochastic processes with prescribed finite-dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, Erdos-Kac invariance principle, functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a text book for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  17. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned

  18. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  19. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  20. Color and Contingency in Robert Boyle's Works.

    Science.gov (United States)

    Baker, Tawrin

    2015-01-01

    This essay investigates the relationship between color and contingency in Robert Boyle's Experiments and Considerations Touching Colours (1664) and his essays on the unsuccessfulness of experiments in Certain Physiological Essays (1661). In these two works Boyle wrestles with a difficult practical and philosophical problem with experiments, which he calls the problem of contingency. In Touching Colours, the problem of contingency is magnified by the much-debated issue of whether color had any deep epistemic importance. His limited theoretical principle guiding him in Touching Colours, that color is but modified light, further exacerbated the problem. Rather than theory, Boyle often relied on craftsmen, whose mastery of color phenomena was, Boyle mentions, brought about by economic forces, to determine when colors were indicators of important 'inward' properties of substances, and thus to secure a solid foundation for his experimental history of color.

  1. Equivalence relations and the reinforcement contingency.

    Science.gov (United States)

    Sidman, M

    2000-07-01

    Where do equivalence relations come from? One possible answer is that they arise directly from the reinforcement contingency. That is to say, a reinforcement contingency produces two types of outcome: (a) 2-, 3-, 4-, 5-, or n-term units of analysis that are known, respectively, as operant reinforcement, simple discrimination, conditional discrimination, second-order conditional discrimination, and so on; and (b) equivalence relations that consist of ordered pairs of all positive elements that participate in the contingency. This conception of the origin of equivalence relations leads to a number of new and verifiable ways of conceptualizing equivalence relations and, more generally, the stimulus control of operant behavior. The theory is also capable of experimental disproof.

  2. Risk estimation using probability machines

    Science.gov (United States)

    2014-01-01

    Background: Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results: We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions: The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306

  3. The enigma of probability and physics

    International Nuclear Information System (INIS)

    Mayants, L.

    1984-01-01

    This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (science of probability) and probabilistic physics (application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed - quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)

  4. Contingency learning in human fear conditioning involves the ventral striatum.

    Science.gov (United States)

    Klucken, Tim; Tabbert, Katharina; Schweckendiek, Jan; Merz, Christian Josef; Kagerer, Sabine; Vaitl, Dieter; Stark, Rudolf

    2009-11-01

    The ability to detect and learn contingencies between fearful stimuli and their predictive cues is an important capacity to cope with the environment. Contingency awareness refers to the ability to verbalize the relationships between conditioned and unconditioned stimuli. Although there is a heated debate about the influence of contingency awareness on conditioned fear responses, neural correlates behind the formation process of contingency awareness have gained only little attention in human fear conditioning. Recent animal studies indicate that the ventral striatum (VS) could be involved in this process, but in human studies the VS is mostly associated with positive emotions. To examine this question, we reanalyzed four recently published classical fear conditioning studies (n = 117) with respect to the VS at three distinct levels of contingency awareness: subjects who did not learn the contingencies (unaware), subjects who learned the contingencies during the experiment (learned aware), and subjects who were informed about the contingencies in advance (instructed aware). The results showed significantly increased activations in the left and right VS in learned aware compared to unaware subjects. Interestingly, this activation pattern was only found in learned but not in instructed aware subjects. We assume that the VS is not involved when contingency awareness does not develop during conditioning or when contingency awareness is unambiguously induced already prior to conditioning. VS involvement seems to be important for the transition from a contingency-unaware to a contingency-aware state. Implications for fear conditioning models as well as for the contingency awareness debate are discussed.

  5. Estimating state-contingent production functions

    DEFF Research Database (Denmark)

    Rasmussen, Svend; Karantininis, Kostas

    The paper reviews the empirical problem of estimating state-contingent production functions. The major problem is that states of nature may not be registered and/or that the number of observations per state is low. Monte Carlo simulation is used to generate an artificial, uncertain production environment based on Cobb-Douglas production functions with state-contingent parameters. The parameters are subsequently estimated based on different sample sizes using Generalized Least Squares and Generalized Maximum Entropy, and the results are compared. It is concluded that Maximum Entropy may...

  6. The contingent valuation method: a review

    International Nuclear Information System (INIS)

    Venkatachalam, L.

    2004-01-01

    The contingent valuation method (CVM) is a simple, flexible nonmarket valuation method that is widely used in cost-benefit analysis and environmental impact assessment. However, this method is subject to severe criticism. The criticism revolves mainly around two aspects, namely, the validity and the reliability of the results, and the effects of various biases and errors. The major objective of this paper is to review the recent developments on measures to address the validity and reliability issues arising out of different kinds of biases/errors and other related empirical and methodological issues concerning contingent valuation method

  7. On the discretization of probability density functions and the ...

    Indian Academy of Sciences (India)

    important for most applications or theoretical problems of interest. In statistics ... In probability theory, statistics, statistical mechanics, communication theory, and other .... (1) by taking advantage of SMVT as a general mathematical approach.

  8. Comments on contingency management and conditional cash transfers.

    Science.gov (United States)

    Higgins, Stephen T

    2010-10-01

    This essay discusses research on incentive-based interventions to promote healthy behavior change, contingency management (CM) and conditional cash transfers (CCT). The overarching point of the essay is that CM and CCT are often treated as distinct areas of inquiry when at their core they represent a common approach. Some potential bi-directional benefits of recognizing this commonality are discussed. Distinct intellectual traditions probably account for the separate paths of CM and CCT to date, with the former being rooted in behavioral psychology and the latter in microeconomics. It is concluded that the emerging field of behavioral economics, which is informed by and integrates principles of each of those disciplines, may provide the proper conceptual framework for integrating CM and CCT.

  9. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. Features: a good and solid introduction to probability theory and stochastic processes; logically organized, with writing presented in a clear manner; a comprehensive choice of topics within the area of probability; ample homework problems organized into chapter sections.

  10. Understanding Statistics - Cancer Statistics

    Science.gov (United States)

    Annual reports of U.S. cancer statistics including new cases, deaths, trends, survival, prevalence, lifetime risk, and progress toward Healthy People targets, plus statistical summaries for a number of common cancer types.

  11. The Role of the Rat Medial Prefrontal Cortex in Adapting to Changes in Instrumental Contingency

    Science.gov (United States)

    Coutureau, Etienne; Esclassan, Frederic; Di Scala, Georges; Marchand, Alain R.

    2012-01-01

    In order to select actions appropriate to current needs, a subject must identify relationships between actions and events. Control over the environment is determined by the degree to which action consequences can be predicted, as described by action-outcome contingencies, i.e., performing an action should affect the probability of the outcome. We evaluated in a first experiment adaptation to contingency changes in rats with neurotoxic lesions of the medial prefrontal cortex. Results indicate that this brain region is not critical to adjust instrumental responding to a negative contingency where the rats must refrain from pressing a lever, as this action prevents reward delivery. By contrast, this brain region is required to reduce responding in a non-contingent situation where the same number of rewards is freely delivered and actions do not affect the outcome any more. In a second experiment, we determined that this effect does not result from a different perception of temporal relationships between actions and outcomes, since lesioned rats adapted normally to gradually increasing delays in reward delivery. These data indicate that the medial prefrontal cortex is not directly involved in evaluating the correlation between action- and reward-rates or in the perception of reward delays. The deficit in lesioned rats appears to consist of an abnormal response to the balance between contingent and non-contingent rewards. By highlighting the role of prefrontal regions in adapting to the causal status of actions, these data contribute to our understanding of the neural basis of choice tasks. PMID:22496747
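
The action-outcome contingency invoked here is commonly quantified as Delta-P, the difference between the outcome probability given the action and given no action; a worked toy example with invented counts:

```python
from fractions import Fraction

# Delta-P = P(outcome | action) - P(outcome | no action), from trial counts.
outcome_given_action = Fraction(30, 40)      # 30 rewards in 40 lever-press periods
outcome_given_no_action = Fraction(10, 40)   # 10 "free" rewards in 40 idle periods
delta_p = outcome_given_action - outcome_given_no_action
print(delta_p)  # 1/2 -> positive contingency; 0 would be the non-contingent case
```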

  12. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
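
To make the construction concrete, the sketch below draws beta-distribution bands for the order statistics of a normal QQ plot. Note these are pointwise intervals, a simplified stand-in for the simultaneous 1-α intervals the paper actually derives:

```python
import numpy as np
from scipy import stats

# Normal probability plot with pointwise bands: the k-th uniform order
# statistic is Beta(k, n - k + 1), mapped through the normal quantile
# function. (The paper's intervals hold *simultaneously*; these do not.)
n, alpha = 50, 0.05
x = np.sort(stats.norm.rvs(size=n, random_state=1))
z = (x - x.mean()) / x.std(ddof=1)             # standardized order statistics

k = np.arange(1, n + 1)
lo = stats.norm.ppf(stats.beta.ppf(alpha / 2, k, n - k + 1))
hi = stats.norm.ppf(stats.beta.ppf(1 - alpha / 2, k, n - k + 1))

print("points outside their pointwise band:", int(((z < lo) | (z > hi)).sum()))
```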

  13. Analyzing Contingency Contracting Purchases for Operation Iraqi Freedom (Unrestricted Version)

    National Research Council Canada - National Science Library

    Baldwin, Laura H; Ausink, John A; Campbell, Nancy F; Drew, John G; Roll, Jr, Charles R

    2008-01-01

    ...) in an effort to determine the size and extent of contractor support, and how plans for and the organization and execution of contingency contracting activities might be improved so that Contingency...

  14. Effectiveness evaluation of contingency sum as a risk management ...

    African Journals Online (AJOL)

    Ethiopian Journal of Environmental Studies and Management ... manage risks prone projects have adopted several methods, one of which is contingency sum. ... initial project cost, cost overrun and percentage allowed for contingency.

  15. 48 CFR 1632.770 - Contingency reserve payments.

    Science.gov (United States)

    2010-10-01

    ... FINANCING Contract Funding 1632.770 Contingency reserve payments. (a) Payments from the contingency reserve... advise the carrier of its decision. However, OPM shall not unreasonably withhold approval for amounts...

  16. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
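
A standard worked example of the conditional-probability reasoning involved (a generic illustration, not necessarily the representation proposed in the article):

```python
from fractions import Fraction

# Conditional probability under sampling without replacement: the chance of
# drawing two aces in a row from a standard 52-card deck.
p_first_ace = Fraction(4, 52)
p_second_ace_given_first = Fraction(3, 51)     # one ace and one card removed
print(p_first_ace * p_second_ace_given_first)  # 1/221
```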

  17. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  18. Statistical Optics

    Science.gov (United States)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research

  19. Contingency Management with Human Autonomy Teaming

    Science.gov (United States)

    Shively, Robert J.; Lachter, Joel B.

    2018-01-01

    Automation is playing an increasingly important role in many operations. It is often cheaper, faster, and more precise than human operators. However, automation is not perfect. There are many situations in which a human operator must step in. We refer to these instances as contingencies, and to the act of stepping in as contingency management. Here we propose coupling Human Autonomy Teaming (HAT) with contingency management. We describe two aspects of HAT: bi-directional communication and working agreements (or plays). Bi-directional communication, like Crew Resource Management in traditional aviation, allows all parties to contribute to a decision. Working agreements specify roles and responsibilities. Importantly, working agreements allow for the possibility of roles and responsibilities changing depending on environmental factors (e.g., situations the automation was not designed for, workload, risk, or trust). This allows the automation to "automatically" become more autonomous as it becomes more trusted and/or is updated to deal with a more complete set of possible situations. We present a concrete example using a prototype contingency management station one might find in a future airline operations center. Automation proposes reroutes for aircraft that encounter bad weather or are forced to divert for environmental or systems reasons. If specific conditions are met, these recommendations may be autonomously datalinked to the affected aircraft.

  20. Thomas Aquinas on Contingency of Nature

    Czech Academy of Sciences Publication Activity Database

    Dvořák, Petr

    2008-01-01

    Roč. 5, č. 2 (2008), s. 185-196 ISSN 1214-8407 R&D Projects: GA AV ČR(CZ) IAA900090602 Institutional research plan: CEZ:AV0Z90090514 Keywords : Thomas Aquinas * determinism * contingency Subject RIV: AA - Philosophy ; Religion

  1. Two psychologies: Cognitive versus contingency-oriented

    NARCIS (Netherlands)

    Mey, H.R.A. De

    2003-01-01

    Cognitive psychology and contingency-based behavior analysis are contrasted to each other with respect to their philosophical and theoretical underpinnings as well as to their practical goals. Whereas the former focuses on intra-organismic structure and function in explaining minds, the latter

  2. The dependency and contingency of politics

    DEFF Research Database (Denmark)

    Triantafillou, Peter

    2016-01-01

    differences which make any analytical synthesis both a difficult and a questionable endeavour. In particular, whereas historical institutionalism seeks to explain the present in terms of its dependence on past events, genealogy seeks to provoke the present by demonstrating its historical contingency. In spite...

  3. Management issues regarding the contingent workforce

    Energy Technology Data Exchange (ETDEWEB)

    Bowen-Smed, S. [Bowen Workforce Solutions, Calgary, AB (Canada)

    2004-07-01

    Fifty per cent of corporate leaders in Calgary today will be eligible for retirement over the next 5 years. In addition, 53 per cent of the entire Calgary workforce is 45 years or older. This paper suggests that only companies that pursue aggressive programs to engage immigrants and contractors will weather the skills shortages anticipated in the future. It was noted that contractors care about aligning values to organizations, regardless of the project length, and that professional development is a key consideration when it comes to selecting their next project. Contingent workforce issues include: effectiveness; classification; risk; and cost. It was stated that effectiveness of the contingent workforce is an employer's responsibility. Factors that would strengthen the relationship between corporations and contractors include: proper orientation to manage expectations; training to improve productivity; tracking to enhance the quality of the workforce; and a management process to ensure adherence to protocol. It was concluded that the contingent workforce is an essential component of human capital management strategy, but that key issues must be managed to avoid unnecessary costs. In addition, effectiveness improves when processes are implemented. It was also suggested that technology is an essential component of the solution. Outsourcing is an effective approach to managing the contingent workforce. tabs., figs.

  4. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: (1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; (2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and (3) testing of the theory typically takes the form of confronting the expectation values of observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  5. Contingency learning in alcohol dependence and pathological gambling: learning and unlearning reward contingencies

    NARCIS (Netherlands)

    Vanes, Lucy D.; van Holst, Ruth J.; Jansen, Jochem M.; van den Brink, Wim; Oosterlaan, Jaap; Goudriaan, Anna E.

    2014-01-01

    Patients with alcohol dependence (AD) and pathological gambling (PG) are characterized by dysfunctional reward processing and their ability to adapt to alterations of reward contingencies is impaired. However, most neurocognitive tasks investigating reward processing involve a complex mix of

  6. Contingency learning in alcohol dependence and pathological gambling: learning and unlearning reward contingencies

    NARCIS (Netherlands)

    Vanes, L.D.; Holst, R.J. van; Jansen, J.M.; Brink, W. van den; Oosterlaan, J.; Goudriaan, A.E.

    2014-01-01

    BACKGROUND: Patients with alcohol dependence (AD) and pathological gambling (PG) are characterized by dysfunctional reward processing and their ability to adapt to alterations of reward contingencies is impaired. However, most neurocognitive tasks investigating reward processing involve a complex

  7. Contingency Learning in Alcohol Dependence and Pathological Gambling: Learning and Unlearning Reward Contingencies

    NARCIS (Netherlands)

    Vanes, L.D.; Holst, R.; Jansen, J.D.; van den Brink, W.A.; Oosterlaan, J.; Goudriaan, A.E.

    2014-01-01

    Background: Patients with alcohol dependence (AD) and pathological gambling (PG) are characterized by dysfunctional reward processing and their ability to adapt to alterations of reward contingencies is impaired. However, most neurocognitive tasks investigating reward processing involve a complex

  8. Contingency Contractor Optimization Phase 3 Sustainment Third-Party Software List - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa

    2016-05-01

    The Contingency Contractor Optimization Tool - Prototype (CCOT-P) requires several third-party software packages. These are documented below for each of the CCOT-P elements: client, web server, database server, solver, web application and polling application.

  9. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  10. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications....
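
A standard concrete instance of the CPGF idea (a textbook example, not specific to this paper's generality): for the multinomial logit ARUM the log-sum function acts as a CPGF, and its gradient recovers the softmax choice probabilities, which can be checked numerically:

```python
import numpy as np

# For multinomial logit, G(u) = log(sum_j exp(u_j)): its gradient is the
# softmax, i.e. the vector of choice probabilities. Quick numerical check.
def G(u):
    return np.log(np.exp(u).sum())

u = np.array([1.0, 0.5, -0.2])
eps = 1e-6
grad = np.array([(G(u + eps * e) - G(u - eps * e)) / (2 * eps)
                 for e in np.eye(3)])
softmax = np.exp(u) / np.exp(u).sum()
print(np.allclose(grad, softmax, atol=1e-6))   # True
```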

  11. 40 CFR 51.1012 - Requirement for contingency measures.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 2 2010-07-01 2010-07-01 false Requirement for contingency measures... Implementation of PM2.5 National Ambient Air Quality Standards § 51.1012 Requirement for contingency measures... contingency measures to be undertaken if the area fails to make reasonable further progress, or fails to...

  12. 40 CFR 264.227 - Emergency repairs; contingency plans.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 25 2010-07-01 2010-07-01 false Emergency repairs; contingency plans... FACILITIES Surface Impoundments § 264.227 Emergency repairs; contingency plans. (a) A surface impoundment... days after detecting the problem. (c) As part of the contingency plan required in subpart D of this...

  13. 78 FR 46781 - Federal Acquisition Regulation; Definition of Contingency Operation

    Science.gov (United States)

    2013-08-01

    ... Federal Acquisition Regulation; Definition of Contingency Operation AGENCY: Department of Defense (DoD... the Federal Acquisition Regulation (FAR) to revise the definition of ``contingency operation'' to... ``contingency operation'' at FAR 2.101 in accordance with the statutory change to the definition made by...

  14. 30 CFR 218.152 - Fishermen's Contingency Fund.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 2 2010-07-01 2010-07-01 false Fishermen's Contingency Fund. 218.152 Section 218.152 Mineral Resources MINERALS MANAGEMENT SERVICE, DEPARTMENT OF THE INTERIOR MINERALS REVENUE..., Offshore § 218.152 Fishermen's Contingency Fund. Upon the establishment of the Fishermen's Contingency Fund...

  15. 78 FR 13765 - Federal Acquisition Regulation; Definition of Contingency Operation

    Science.gov (United States)

    2013-02-28

    ... Federal Acquisition Regulation; Definition of Contingency Operation AGENCY: Department of Defense (DoD... Regulation (FAR) to revise the definition of ``contingency operation'' to address the statutory change to the... ``contingency operation'' at FAR 2.101 in accordance with the statutory change to the definition made by...

  16. 10 CFR 72.184 - Safeguards contingency plan.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Safeguards contingency plan. 72.184 Section 72.184 Energy... Protection § 72.184 Safeguards contingency plan. (a) The requirements of the licensee's safeguards contingency plan for responding to threats and radiological sabotage must be as defined in appendix C to part...

  17. 50 CFR 296.3 - Fishermen's contingency fund.

    Science.gov (United States)

    2010-10-01

    ... 50 Wildlife and Fisheries 7 2010-10-01 2010-10-01 false Fishermen's contingency fund. 296.3... ADMINISTRATION, DEPARTMENT OF COMMERCE CONTINENTAL SHELF FISHERMEN'S CONTINGENCY FUND § 296.3 Fishermen's contingency fund. (a) General. There is established in the Treasury of the United States the Fishermen's...

  18. Psychophysics of associative learning: Quantitative properties of subjective contingency.

    Science.gov (United States)

    Maia, Susana; Lefèvre, Françoise; Jozefowiez, Jérémie

    2018-01-01

    Allan and collaborators (Allan, Hannah, Crump, & Siegel, 2008; Allan, Siegel, & Tangen, 2005; Siegel, Allan, Hannah, & Crump, 2009) recently proposed to apply signal detection theory to the analysis of contingency judgment tasks. When exposed to a flow of stimuli, participants are asked to judge whether there is a contingent relation between a cue and an outcome, that is, whether the subjective cue-outcome contingency exceeds a decision threshold. In this context, we tested the following hypotheses regarding the relation between objective and subjective cue-outcome contingency: (a) the underlying distributions of subjective cue-outcome contingency are Gaussian; (b) the mean of the distribution of subjective contingency is a linear function of objective cue-outcome contingency; and (c) the variance in the distribution of subjective contingency is constant. The hypotheses were tested by combining a streamed-trial contingency assessment task with a confidence rating procedure. Participants were exposed to rapid flows of stimuli at the end of which they had to judge whether an outcome was more (Experiment 1) or less (Experiment 2) likely to appear following a cue and how sure they were of their judgment. We found that although Hypothesis A seems reasonable, Hypotheses B and C were not supported. Regarding Hypothesis B, participants were more sensitive to positive than to negative contingencies. Regarding Hypothesis C, the perceived cue-outcome contingency became more variable when the contingency became more positive or negative, but only to a slight extent. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
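
The signal-detection framing can be illustrated with a toy simulation (parameters invented, not fitted to the experiments): subjective contingency is modeled as the objective Delta-P plus Gaussian noise, and a "contingent" response is emitted when it exceeds a decision threshold:

```python
import numpy as np

# Toy signal-detection model of contingency judgment: on each streamed trial
# the observer's subjective contingency is objective Delta-P plus Gaussian
# noise; they respond "contingent" above a threshold. Illustrative values only.
rng = np.random.default_rng(2)

def judge(delta_p, noise_sd=0.15, threshold=0.1, trials=10_000):
    subjective = delta_p + rng.normal(0, noise_sd, trials)
    return (subjective > threshold).mean()   # P("contingent" response)

for dp in (0.0, 0.2, 0.4):
    print(f"Delta-P = {dp:.1f}: P(respond contingent) = {judge(dp):.3f}")
```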

  19. 48 CFR 225.7303-4 - Contingent fees.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Contingent fees. 225.7303....7303-4 Contingent fees. (a) Except as provided in paragraph (b) of this subsection, contingent fees are generally allowable under DoD contracts, provided— (1) The fees are paid to a bona fide employee or a bona...

  20. Sartre's Contingency of Being and Asouzu's Principle of Causality ...

    African Journals Online (AJOL)

    The position of this work is that all contingent beings have a causal agent. This position is taken as a result of trying to delve into the issue of contingency and causality of being which has been discussed by many philosophers of diverse epochs of philosophy. This work tries to participate in the debate of whether contingent ...

  1. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  2. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  3. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  4. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  5. Contingent capture effects in temporal order judgments.

    Science.gov (United States)

    Born, Sabine; Kerzel, Dirk; Pratt, Jay

    2015-08-01

    The contingent attentional capture hypothesis proposes that visual stimuli that do not possess characteristics relevant for the current task will not capture attention, irrespective of their bottom-up saliency. Typically, contingent capture is tested in a spatial cuing paradigm, comparing manual reaction times (RTs) across different conditions. However, attention may act through several mechanisms and RTs may not be ideal to disentangle those different components. In 3 experiments, we examined whether color singleton cues provoke cuing effects in temporal order judgments (TOJs) and whether they would be contingent on attentional control sets. Experiment 1 showed that color singleton cues indeed produce cuing effects in TOJs, even in a cluttered and dynamic target display containing multiple heterogeneous distractors. In Experiment 2, consistent with contingent capture, we observed reliable cuing effects only when the singleton cue matched participants' current attentional control set. Experiment 3 suggests that a sensory interaction account of the differences found in Experiment 2 is unlikely. Our results help to discern the attentional components that may play a role in contingent capture. Further, we discuss a number of other effects (e.g., reversed cuing effects) that are found in RTs, but so far have not been reported in TOJs. Those differences suggest that RTs are influenced by a multitude of mechanisms; however, not all of these mechanisms may affect TOJs. We conclude by highlighting how the study of attentional capture in TOJs provides valuable insights for the attention literature, but also for studies concerned with the perceived timing between stimuli. (c) 2015 APA, all rights reserved.

  6. Financial derivative pricing under probability operator via Esscher transformation

    Energy Technology Data Exchange (ETDEWEB)

    Achi, Godswill U., E-mail: achigods@yahoo.com [Department of Mathematics, Abia State Polytechnic Aba, P.M.B. 7166, Aba, Abia State (Nigeria)

    2014-10-24

    The problem of pricing contingent claims has been extensively studied for non-Gaussian models, and in particular, the Black-Scholes formula has been derived for the NIG asset pricing model. This approach was first developed in insurance pricing [9], where the original distortion function was defined in terms of the normal distribution. The approach was later studied in [6], where the standard Black-Scholes contingent pricing and distortion-based contingent pricing were compared. In this paper, we aim at using distortion operators based on the Cauchy distribution under a simple transformation to price contingent claims. We also show that we can recover the Black-Scholes formula using the distribution. Similarly, in a financial market in which the asset price is represented by a stochastic differential equation with respect to Brownian motion, the price mechanism based on the characteristic Esscher measure can generate approximately arbitrage-free financial derivative prices. The price representation derived involves the probability Esscher measure and the Esscher martingale measure, and under a new complex-valued measure φ(u) evaluated at the characteristic exponents φ_x(u) of X_t we recover the Black-Scholes formula for financial derivative prices.
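
    The Esscher transform invoked here has a standard textbook form; as a hedged sketch in conventional notation (not quoted from the paper), for a density f with finite moment generating function M_X(\theta) = \int e^{\theta y} f(y) \, dy, the transformed density is

        f_\theta(x) = \frac{e^{\theta x} f(x)}{M_X(\theta)}.

    Pricing then selects the parameter \theta^* for which the discounted asset price is a martingale under f_{\theta^*}; for geometric Brownian motion this construction reproduces the Black-Scholes price, consistent with the recovery claimed in the abstract.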

  7. Financial derivative pricing under probability operator via Esscher transformation

    International Nuclear Information System (INIS)

    Achi, Godswill U.

    2014-01-01

    The problem of pricing contingent claims has been extensively studied for non-Gaussian models, and in particular, the Black-Scholes formula has been derived for the NIG asset pricing model. This approach was first developed in insurance pricing [9], where the original distortion function was defined in terms of the normal distribution. The approach was later studied in [6], where the standard Black-Scholes contingent pricing and distortion-based contingent pricing were compared. In this paper, we aim at using distortion operators based on the Cauchy distribution under a simple transformation to price contingent claims. We also show that we can recover the Black-Scholes formula using the distribution. Similarly, in a financial market in which the asset price is represented by a stochastic differential equation with respect to Brownian motion, the price mechanism based on the characteristic Esscher measure can generate approximately arbitrage-free financial derivative prices. The price representation derived involves the probability Esscher measure and the Esscher martingale measure, and under a new complex-valued measure φ(u) evaluated at the characteristic exponents φ_x(u) of X_t we recover the Black-Scholes formula for financial derivative prices

  8. Financial derivative pricing under probability operator via Esscher transformation

    Science.gov (United States)

    Achi, Godswill U.

    2014-10-01

    The problem of pricing contingent claims has been extensively studied for non-Gaussian models, and in particular, the Black-Scholes formula has been derived for the NIG asset pricing model. This approach was first developed in insurance pricing [9], where the original distortion function was defined in terms of the normal distribution. The approach was later studied in [6], where the standard Black-Scholes contingent pricing and distortion-based contingent pricing were compared. In this paper, we aim at using distortion operators based on the Cauchy distribution under a simple transformation to price contingent claims. We also show that we can recover the Black-Scholes formula using the distribution. Similarly, in a financial market in which the asset price is represented by a stochastic differential equation with respect to Brownian motion, the price mechanism based on the characteristic Esscher measure can generate approximately arbitrage-free financial derivative prices. The price representation derived involves the probability Esscher measure and the Esscher martingale measure, and under a new complex-valued measure φ(u) evaluated at the characteristic exponents φ_x(u) of X_t we recover the Black-Scholes formula for financial derivative prices.

  9. Dysphoric mood states are related to sensitivity to temporal changes in contingency

    Directory of Open Access Journals (Sweden)

    Rachel M. Msetfi

    2012-09-01

    A controversial finding in the field of causal learning is that mood contributes to the accuracy of perceptions of uncorrelated relationships. When asked to report the degree of control between an action and its outcome, people with dysphoria or depression are claimed to be more realistic in reporting non-contingency (e.g., Alloy & Abramson, 1979). The strongest evidence for this depressive realism (DR) effect is derived from data collected with experimental procedures in which the dependent variables are verbal or written ratings of contingency or cause, and, perhaps more importantly, the independent variable in these procedures may be ambiguous and difficult to define. In order to address these possible confounds, we used a two-response free-operant causal learning task in which the dependent measures were performance based. Participants were required to respond to maximise the occurrence of a temporally contiguous outcome that was programmed with different probabilities, which also varied temporally across two responses. Dysphoric participants were more sensitive to the changing outcome contingencies than controls even though they responded at a similar rate. During probe trials, in which the outcome was masked, their performance recovered more quickly than that of the control group. These data provide unexpected support for the depressive realism hypothesis, suggesting that dysphoria is associated with heightened sensitivity to temporal shifts in contingency.

  10. Dysphoric Mood States are Related to Sensitivity to Temporal Changes in Contingency.

    Science.gov (United States)

    Msetfi, Rachel M; Murphy, Robin A; Kornbrot, Diana E

    2012-01-01

    A controversial finding in the field of causal learning is that mood contributes to the accuracy of perceptions of uncorrelated relationships. When asked to report the degree of control between an action and its outcome, people with dysphoria or depression are claimed to be more realistic in reporting non-contingency (e.g., Alloy and Abramson, 1979). The strongest evidence for this depressive realism (DR) effect is derived from data collected with experimental procedures in which the dependent variables are verbal or written ratings of contingency or cause, and, perhaps more importantly, the independent variable in these procedures may be ambiguous and difficult to define. In order to address these possible confounds, we used a two-response free-operant causal learning task in which the dependent measures were performance based. Participants were required to respond to maximize the occurrence of a temporally contiguous outcome that was programmed with different probabilities, which also varied temporally across two responses. Dysphoric participants were more sensitive to the changing outcome contingencies than controls even though they responded at a similar rate. During probe trials, in which the outcome was masked, their performance recovered more quickly than that of the control group. These data provide unexpected support for the DR hypothesis suggesting that dysphoria is associated with heightened sensitivity to temporal shifts in contingency.
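
    The action-outcome contingency manipulated in tasks like this one is conventionally summarized by the delta-P statistic. The sketch below is illustrative only (the function name and data layout are ours, not the authors'): it contrasts the outcome probability given a response with the outcome probability given no response.

        # Minimal sketch of the delta-P contingency statistic.
        def delta_p(trials):
            """trials: list of (responded, outcome) boolean pairs."""
            after_response = [o for responded, o in trials if responded]
            after_no_response = [o for responded, o in trials if not responded]
            p_o_given_r = sum(after_response) / len(after_response) if after_response else 0.0
            p_o_given_not_r = sum(after_no_response) / len(after_no_response) if after_no_response else 0.0
            return p_o_given_r - p_o_given_not_r

        # Outcome follows 75% of responses but only 25% of non-responses.
        trials = [(True, True)] * 3 + [(True, False)] + [(False, True)] + [(False, False)] * 3
        print(delta_p(trials))  # 0.5: a positive contingency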

  11. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  12. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  13. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  14. Yampa River Valley sub-area contingency plan

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-08-01

    The Yampa River Valley sub-area contingency plan (Contingency Plan) has been prepared for two counties in northwestern Colorado: Moffat County and Routt County. The Contingency Plan is provided in two parts, the Contingency Plan and the Emergency Response Action Plan (ERAP). The Contingency Plan provides information that should be helpful in planning to minimize the impact of an oil spill or hazardous material incident. It contains discussions of planning and response roles, hazard identification, vulnerability analysis, risk analysis, cleanup, cost recovery, training, and health and safety. It includes information on the incident command system, notifications, response capabilities, emergency response organizations, evacuation and shelter-in-place, and immediate actions.

  15. Is probability of frequency too narrow?

    International Nuclear Information System (INIS)

    Martz, H.F.

    1993-01-01

    Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed

  16. Statistics of Extremes

    KAUST Repository

    Davison, Anthony C.; Huser, Raphaël

    2015-01-01

    Statistics of extremes concerns inference for rare events. Often the events have never yet been observed, and their probabilities must therefore be estimated by extrapolation of tail models fitted to available data. Because data concerning the event...

  17. Playing at Statistical Mechanics

    Science.gov (United States)

    Clark, Paul M.; And Others

    1974-01-01

    Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)
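
    For reference, the occupancy formulas behind such a sorting game are the standard textbook results (not taken from the article): with single-particle energy \varepsilon, chemical potential \mu, and temperature T,

        \bar n_{\mathrm{FD}}(\varepsilon) = \frac{1}{e^{(\varepsilon - \mu)/k_B T} + 1}, \qquad \bar n_{\mathrm{BE}}(\varepsilon) = \frac{1}{e^{(\varepsilon - \mu)/k_B T} - 1}, \qquad \bar n_{\mathrm{MB}}(\varepsilon) = e^{-(\varepsilon - \mu)/k_B T},

    the last being the most probable (Maxwell-Boltzmann) distribution that both quantum statistics approach in the dilute, high-temperature limit.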

  18. Molecular Darwinism: the contingency of spontaneous genetic variation.

    Science.gov (United States)

    Arber, Werner

    2011-01-01

    The availability of spontaneously occurring genetic variants is an important driving force of biological evolution. Largely thanks to experimental investigations by microbial geneticists, we know today that several different molecular mechanisms contribute to the overall genetic variation. These mechanisms can be assigned to three natural strategies to generate genetic variants: 1) local sequence changes, 2) intragenomic reshuffling of DNA segments, and 3) acquisition of a segment of foreign DNA. In these processes, specific gene products are involved in cooperation with different nongenetic elements. Some genetic variations occur fully at random along the DNA filaments, others with a statistical reproducibility, although at many possible sites. We have to be aware that evolution in natural ecosystems is of higher complexity than under most laboratory conditions, not least in view of symbiotic associations and the occurrence of horizontal gene transfer. The encountered contingency of genetic variation can possibly best ensure a long-term persistence of life under steadily changing living conditions.

  19. Introductory statistical inference

    CERN Document Server

    Mukhopadhyay, Nitis

    2014-01-01

    This gracefully organized text reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, figures, tables, and computer simulations to develop and illustrate concepts. Drills and boxed summaries emphasize and reinforce important ideas and special techniques. Beginning with a review of the basic concepts and methods in probability theory, moments, and moment generating functions, the author moves to more intricate topics. Introductory Statistical Inference studies multivariate random variables, exponential families of distributions...

  20. Business statistics I essentials

    CERN Document Server

    Clark, Louise

    2014-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Business Statistics I includes descriptive statistics, introduction to probability, probability distributions, sampling and sampling distributions, interval estimation, and hypothesis testing...

  1. Statistics for Engineers

    International Nuclear Information System (INIS)

    Kim, Jin Gyeong; Park, Jin Ho; Park, Hyeon Jin; Lee, Jae Jun; Jun, Whong Seok; Whang, Jin Su

    2009-08-01

    This book explains statistics for engineers using MATLAB. It covers the arrangement and summary of data, probability, probability distributions, sampling distributions, estimation and hypothesis testing, analysis of variance, regression analysis, categorical data analysis, quality assurance (the concepts of control charts, consecutive control charts, breakthrough strategy, and analysis using MATLAB), reliability analysis (measurement of reliability and analysis with MATLAB), and Markov chains.

  2. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition: "It is, as far as I'm concerned, among the best books in math ever written... if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006). A complete and comprehensive classic in probability and measure theory. Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this...

  3. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  4. [Contingency management in opioid substitution treatment].

    Science.gov (United States)

    Specka, M; Böning, A; Scherbaum, N

    2011-07-01

    The majority of opiate-dependent patients in substitution treatment show additional substance-related disorders. Concomitant use of heroin, alcohol, benzodiazepines or cocaine compromises treatment success. Concomitant drug use may be treated using contingency management (CM), which is based on learning theory. In CM, abstinence from drugs, as verified by drug screenings, is reinforced directly and contingently. Reinforcers used in CM studies with substituted patients were, amongst others, vouchers and take-home privileges. Studies in the USA show a medium average effect of CM on drug consumption rates and abstinence. The effects decrease markedly after the end of the intervention. We discuss whether CM is applicable within the German substitution treatment system and how it can be combined with other interventions such as selective detoxification treatments or cognitive-behavioural programmes. © Georg Thieme Verlag KG Stuttgart · New York.

  5. Contingency and similarity in response selection.

    Science.gov (United States)

    Prinz, Wolfgang

    2018-05-09

    This paper explores issues of task representation in choice reaction time tasks. How is it possible, and what does it take, to represent such a task in a way that enables a performer to do the task in line with the prescriptions entailed in the instructions? First, a framework for task representation is outlined which combines the implementation of task sets and their use for performance with different kinds of representational operations (pertaining to feature compounds for event codes and code assemblies for task sets, respectively). Then, in a second step, the framework is itself embedded in the bigger picture of the classical debate on the roles of contingency and similarity for the formation of associations. The final conclusion is that both principles are needed and that the operation of similarity at the level of task sets requires and presupposes the operation of contingency at the level of event codes. Copyright © 2018 The Author. Published by Elsevier Inc. All rights reserved.

  6. Statistics For Dummies

    CERN Document Server

    Rumsey, Deborah

    2011-01-01

    The fun and easy way to get down to business with statistics. Stymied by statistics? No fear: this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first semester statistics course...

  7. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  8. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit theorem...

  9. Food Marketing Technology and Contingency Market Valuation

    OpenAIRE

    Garth J. Holloway; Anthony C. Zwart

    1993-01-01

    Marketing activities are introduced into a rational expectations model of the food marketing system. The model is used to evaluate effects of alternative marketing technologies on the distribution of the benefits of contingency markets in agriculture. Benefits depend on two parameters: the cost share of farm inputs and the elasticity of substitution between farm and nonfarm inputs in food marketing. Over a broad spectrum of technologies, consumers are likely to be the net beneficiaries and farmers...

  10. Capacity Adjustment through Contingent Staffing Outsourcing

    OpenAIRE

    Neubert , Gilles; Adjadj , Philippe

    2009-01-01

    For a long time, contingent staffing was considered the responsibility of the Human Resources department. High needs for workforce flexibility, combined with dispersed agencies, have led some companies to a great number of labor suppliers. This situation has produced significant cost variation, poor quality of service, and considerable risk due to local managers' misunderstanding of legal considerations. To face this situation, companies have started to move fr...

  11. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  12. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  13. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  14. Parallel auto-correlative statistics with VTK.

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre; Bennett, Janine Camille

    2013-08-01

    This report summarizes existing statistical engines in VTK and presents both the serial and parallel auto-correlative statistics engines. It is a sequel to [PT08, BPRT09b, PT09, BPT09, PT10], which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, and order statistics engines. The ease of use of the new parallel auto-correlative statistics engine is illustrated by means of C++ code snippets, and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the auto-correlative statistics engine.
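
    As a rough illustration of the quantity such an engine computes, and emphatically not of the VTK API itself, the following sketch (names and layout are ours) estimates the lag-k autocorrelation of a one-dimensional series:

        import numpy as np

        def autocorrelation(x, lag):
            """Sample autocorrelation of a 1-D series at the given lag."""
            x = np.asarray(x, dtype=float)
            x = x - x.mean()
            # Covariance with the lagged copy, normalized by the variance.
            return np.dot(x[:-lag], x[lag:]) / np.dot(x, x) if lag else 1.0

        rng = np.random.default_rng(0)
        noise = rng.normal(size=1000)
        smoothed = np.convolve(noise, np.ones(5) / 5, mode="same")  # correlated series
        print(autocorrelation(noise, 1))     # near 0 for white noise
        print(autocorrelation(smoothed, 1))  # clearly positive for the smoothed series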

  15. Contingency Cost estimation for Research reactor Decommissioning

    International Nuclear Information System (INIS)

    Jin, Hyung Gon; Hong, Yun Jeong

    2016-01-01

    There are many types of cost items in decommissioning cost estimation; contingencies are for unforeseen elements of cost within the defined project scope. Regulatory bodies want a reasonable quantification of this issue. Many countries have adopted the breakdown into activity-dependent and period-dependent costs to structure their estimates. Period-dependent costs can be broken down into defined time frames to reduce overall uncertainties. Several countries apply this notion by using different contingency factors for different phases of the project. This study is a compilation of research reactor contingency costs for each country.

  16. Contingency Cost estimation for Research reactor Decommissioning

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Hyung Gon; Hong, Yun Jeong [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    There are many types of cost items in decommissioning cost estimation; contingencies are for unforeseen elements of cost within the defined project scope. Regulatory bodies want a reasonable quantification of this issue. Many countries have adopted the breakdown into activity-dependent and period-dependent costs to structure their estimates. Period-dependent costs can be broken down into defined time frames to reduce overall uncertainties. Several countries apply this notion by using different contingency factors for different phases of the project. This study is a compilation of research reactor contingency costs for each country.

  17. Sound-contingent visual motion aftereffect

    Directory of Open Access Journals (Sweden)

    Maori Kobayashi

    2011-05-01

    Background: After a prolonged exposure to a paired presentation of different types of signals (e.g., color and motion), one of the signals (color) becomes a driver for the other signal (motion). This phenomenon, which is known as contingent motion aftereffect, indicates that the brain can establish new neural representations even in the adult brain. However, contingent motion aftereffect has been reported only in the visual or auditory domain. Here, we demonstrate that a visual motion aftereffect can be contingent on a specific sound. Results: Dynamic random dots moving in an alternating right or left direction were presented to the participants. Each direction of motion was accompanied by an auditory tone of a unique and specific frequency. After a 3-minute exposure, the tones began to exert a marked influence on the visual motion perception, and the percentage of dots required to trigger motion perception systematically changed depending on the tones. Furthermore, this effect lasted for at least 2 days. Conclusions: These results indicate that a new neural representation can be rapidly established between auditory and visual modalities.

  18. Contingency Valuation of Croatian Arboretum Opeka

    Directory of Open Access Journals (Sweden)

    Stjepan Posavec

    2012-12-01

    Background and Purpose: Social aspects of forestry have always been an important factor in forest usage and management, and therefore have significant influence on its sustainability. Non-wood forest functions such as recreation, tourism, aesthetic and educational factors influence the development of rural areas. The contingent valuation method has rarely been used for the evaluation of protected forest areas. The aim of the article is to estimate the amount of money visitors are willing to pay for the preservation of natural resources in the Opeka arboretum in northwest Croatia. Material and Methods: The Opeka arboretum is situated in the Vinica municipality in northern Croatia. Located in a large park surrounding a manor, the Opeka arboretum, with its 65 hectares, is the largest of the three arboretums existing in Croatia today. The arboretum was founded in 1860 by the Count Marko Bombelles. Contingent valuation is a survey-based economic technique for the non-market valuation of resources, such as environmental preservation or the impact of contamination. It is also the approach that can generally be used to include what is usually referred to as the passive use component of the economic value of environmental goods. Results and Conclusion: Visitors' willingness to pay for use of the arboretum has been investigated using a survey and the contingent valuation method on a sample of 53 respondents. The results show a high preference for arboretum benefits such as beauty of landscape, cultural and historical significance, recreation, and health, but a low willingness to pay.

  19. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample-size-invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and over-fitting the data, as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.
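
    The scaled quantile residual diagnostic described here can be sketched simply: if the estimated CDF fits, applying it to the sorted sample should give values close to the uniform plotting positions k/(n+1). A hypothetical illustration (our names and data, not the authors' code):

        import numpy as np
        from scipy.stats import norm

        def quantile_residuals(sample, cdf):
            """Residuals u_k - k/(n+1); near zero everywhere if cdf fits the data."""
            x = np.sort(np.asarray(sample, dtype=float))
            n = len(x)
            u = cdf(x)  # approximately uniform on (0, 1) under a good fit
            return u - np.arange(1, n + 1) / (n + 1)

        rng = np.random.default_rng(1)
        data = rng.normal(loc=2.0, scale=0.5, size=500)
        resid = quantile_residuals(data, lambda v: norm.cdf(v, loc=2.0, scale=0.5))
        print(np.abs(resid).max())  # small when the candidate density matches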

  20. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  1. Stupid statistics!

    Science.gov (United States)

    Tellinghuisen, Joel

    2008-01-01

    The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares are presented using matrix notation, and the significance of the key probability distributions (Gaussian, chi-square, and t) is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
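
    As one concrete instance of the machinery reviewed here, a weighted linear least-squares fit with the parameter variance-covariance matrix takes only a few lines (a generic sketch, not the article's own code):

        import numpy as np

        def weighted_linear_fit(x, y, sigma):
            """Fit y = a + b*x with known errors sigma; return params and covariance."""
            X = np.column_stack([np.ones_like(x), x])   # design matrix
            W = np.diag(1.0 / np.asarray(sigma) ** 2)   # weights from known errors
            cov = np.linalg.inv(X.T @ W @ X)            # parameter covariance matrix
            beta = cov @ X.T @ W @ y                    # [a, b]
            return beta, cov

        x = np.array([1.0, 2.0, 3.0, 4.0])
        y = np.array([2.1, 3.9, 6.2, 7.8])
        sigma = np.full(4, 0.2)
        beta, cov = weighted_linear_fit(x, y, sigma)
        print(beta)                   # intercept and slope
        print(np.sqrt(np.diag(cov)))  # their standard errors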

  2. Portable EMG devices, Biofeedback and Contingent Electrical Stimulation applications in Bruxism

    DEFF Research Database (Denmark)

    Castrillon, Eduardo

    Portable EMG devices, Biofeedback and Contingent Electrical Stimulation applications in Bruxism. Eduardo Enrique Castrillon Watanabe, DDS, MSc, PhD; Section of Orofacial Pain and Jaw Function, Department of Dentistry, Aarhus University, Aarhus, Denmark; Scandinavian Center for Orofacial Neuroscience... Summary: Bruxism is a parafunctional activity which involves the masticatory muscles and is probably as old as humankind. Different methods, such as portable EMG devices, have been proposed to diagnose and understand the pathophysiology of bruxism. Biofeedback / contingent electrical stimulation... characteristics make it complicated to assess bruxism using portable EMG devices. The possibility to assess bruxism-like EMG activity on a portable device made it possible to use biofeedback and CES approaches in order to treat / manage bruxism. The available scientific information about CES effects on bruxism...

  3. [A factor analysis method for contingency table data with unlimited multiple choice questions].

    Science.gov (United States)

    Toyoda, Hideki; Haiden, Reina; Kubo, Saori; Ikehara, Kazuya; Isobe, Yurie

    2016-02-01

    The purpose of this study is to propose a method of factor analysis for analyzing contingency tables developed from the data of unlimited multiple-choice questions. This method assumes that the element of each cell of the contingency table has a binomial distribution, and a factor analysis model is applied to the logit of the selection probability. A scree plot and WAIC are used to decide the number of factors, and the standardized residual, i.e., the standardized difference between the sample and predicted proportions, is used to select items. The proposed method was applied to real product impression research data on advertised chips and energy drinks. The results of the analysis showed that this method can be used in conjunction with the conventional factor analysis model and that the extracted factors were fully interpretable, suggesting the usefulness of the proposed method in psychological studies using unlimited multiple-choice questions.
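
    In conventional notation (assumed here, not quoted from the paper), the model has the form: with n_i respondents in row i and cell count x_{ij} for option j,

        x_{ij} \sim \mathrm{Binomial}(n_i, p_{ij}), \qquad \mathrm{logit}(p_{ij}) = \mu_j + \boldsymbol{\lambda}_j^{\top} \mathbf{f}_i,

    where \mathbf{f}_i is a vector of factor scores and \boldsymbol{\lambda}_j the loadings for option j, so the usual factor structure sits on the logit scale rather than on the raw proportions.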

  4. Bayesian statistical inference

    Directory of Open Access Journals (Sweden)

    Bruno De Finetti

    2017-04-01

    This work was translated into English and published in the volume: Bruno De Finetti, Induction and Probability, Biblioteca di Statistica, eds. P. Monari, D. Cocchi, Clueb, Bologna, 1993. Bayesian Statistical Inference is one of the last fundamental philosophical papers in which we can find the essence of De Finetti's approach to statistical inference.

  5. Advances in statistics

    Science.gov (United States)

    Howard Stauffer; Nadav Nur

    2005-01-01

    The papers included in the Advances in Statistics section of the Partners in Flight (PIF) 2002 Proceedings represent a small sample of statistical topics of current importance to Partners In Flight research scientists: hierarchical modeling, estimation of detection probabilities, and Bayesian applications. Sauer et al. (this volume) examine a hierarchical model...

  6. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.
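
    The point scales directly to a quick classroom simulation; a minimal sketch (our own construction, not from the article):

        import random

        def experimental_probability(trials):
            """Estimate P(heads) from repeated simulated coin flips."""
            heads = sum(random.random() < 0.5 for _ in range(trials))
            return heads / trials

        for n in (10, 100, 10_000):
            # The estimate wanders for small n and settles near 0.5 as n grows.
            print(n, experimental_probability(n))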

  7. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  8. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.

  9. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Contents: Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation; Generating Functions; Branching Processes; Random Walk Revisited; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  10. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look-out, etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds ... probability, i.e. a study of the navigator's role in resolving critical situations; a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...

  11. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    ... either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  12. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, including...

  13. Equilibrium statistical mechanics

    CERN Document Server

    Jackson, E Atlee

    2000-01-01

    Ideal as an elementary introduction to equilibrium statistical mechanics, this volume covers both classical and quantum methodology for open and closed systems. Introductory chapters familiarize readers with probability and microscopic models of systems, while additional chapters describe the general derivation of the fundamental statistical mechanics relationships. The final chapter contains 16 sections, each dealing with a different application, ordered according to complexity, from classical through degenerate quantum statistical mechanics. Key features include an elementary introduction to...

  14. Statistics & probability for dummies

    CERN Document Server

    Rumsey, Deborah J

    2013-01-01

    Two complete eBooks for one low price! Created and compiled by the publisher, this Statistics I and Statistics II bundle brings together two math titles in one, e-only bundle. With this special bundle, you'll get the complete text of the following two titles: Statistics For Dummies, 2nd Edition. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first semester statistics course...

  15. Head First Statistics

    CERN Document Server

    Griffiths, Dawn

    2009-01-01

    Wouldn't it be great if there were a statistics book that made histograms, probability distributions, and chi square analysis more enjoyable than going to the dentist? Head First Statistics brings this typically dry subject to life, teaching you everything you want and need to know about statistics through engaging, interactive, and thought-provoking material, full of puzzles, stories, quizzes, visual aids, and real-world examples. Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics

  16. Disciplined Decision Making in an Interdisciplinary Environment: Some Implications for Clinical Applications of Statistical Process Control.

    Science.gov (United States)

    Hantula, Donald A.

    1995-01-01

    Clinical applications of statistical process control (SPC) in human service organizations are considered. SPC is seen as providing a standard set of criteria that serves as a common interface for data-based decision making, which may bring decision making under the control of established contingencies rather than the immediate contingencies of…
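
    The "standard set of criteria" at the core of SPC is the control chart; a minimal sketch of Shewhart-style three-sigma limits (illustrative only, not drawn from the article):

        import statistics

        def control_limits(baseline):
            """Individual-value chart limits: mean +/- 3 standard deviations."""
            center = statistics.mean(baseline)
            sd = statistics.stdev(baseline)
            return center - 3 * sd, center, center + 3 * sd

        baseline = [12.1, 11.8, 12.4, 12.0, 11.9, 12.2, 12.1, 11.7, 12.3, 12.0]
        lcl, center, ucl = control_limits(baseline)
        print(13.9 > ucl)  # True: the new point is "out of control" and merits review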

  17. Skype me! Socially Contingent Interactions Help Toddlers Learn Language

    OpenAIRE

    Roseberry, Sarah; Hirsh-Pasek, Kathy; Golinkoff, Roberta Michnick

    2013-01-01

    Language learning takes place in the context of social interactions, yet the mechanisms that render social interactions useful for learning language remain unclear. This paper focuses on whether social contingency might support word learning. Toddlers aged 24 to 30 months (N=36) were exposed to novel verbs in one of three conditions: live interaction training, socially contingent video training over video chat, and non-contingent video training (yoked video). Results suggest...

  18. Contingency planning in southern Africa: Events rather than processes?

    OpenAIRE

    Elias Mabaso; Siambabala B. Manyena

    2013-01-01

    With the increasing frequency, magnitude and impact of disasters, there is growing focus on contingency planning as a tool for enhancing resilience. Yet, there is little empirical evidence that reflects on the practice of contingency planning systems within the context of disaster risk reduction. This article explores the practice of contingency planning in southern Africa, focussing on Malawi, Mozambique, Namibia, Zambia and Zimbabwe. A qualitative comparative analysis informed by fieldwork ...

  19. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  20. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrödinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general.

  1. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    Kim, Y.K.

    1980-01-01

    Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods

  2. Nonparametric statistics for social and behavioral sciences

    CERN Document Server

    Kraska-MIller, M

    2013-01-01

    Contents: Introduction to Research in Social and Behavioral Sciences; Basic Principles of Research; Planning for Research; Types of Research Designs; Sampling Procedures; Validity and Reliability of Measurement Instruments; Steps of the Research Process; Introduction to Nonparametric Statistics; Data Analysis; Overview of Nonparametric Statistics and Parametric Statistics; Overview of Parametric Statistics; Overview of Nonparametric Statistics; Importance of Nonparametric Methods; Measurement Instruments; Analysis of Data to Determine Association and Agreement; Pearson Chi-Square Test of Association and Independence; Contingency...

  3. The Impact of the Contingency of Robot Feedback for HRI

    DEFF Research Database (Denmark)

    Fischer, Kerstin; Lohan, Katrin Solveig; Saunders, Joe

    2013-01-01

    ... robot iCub on a set of shapes and on a stacking task in two conditions, once with socially contingent, nonverbal feedback implemented in response to different gaze and looming behaviors of the human tutor, and once with non-contingent, saliency-based feedback. The results of the analysis of participants' linguistic behaviors in the two conditions show that contingency has an impact on the complexity and the pre-structuring of the task for the robot, i.e. on the participants' tutoring behaviors. Contingency thus plays a considerable role in learning by demonstration.

  4. Motor contingency learning and infants with Spina Bifida.

    Science.gov (United States)

    Taylor, Heather B; Barnes, Marcia A; Landry, Susan H; Swank, Paul; Fletcher, Jack M; Huang, Furong

    2013-02-01

    Infants with Spina Bifida (SB) were compared to typically developing infants (TD) using a conjugate reinforcement paradigm at 6 months-of-age (n = 98) to evaluate learning, and retention of a sensory-motor contingency. Analyses evaluated infant arm-waving rates at baseline (wrist not tethered to mobile), during acquisition of the sensory-motor contingency (wrist tethered), and immediately after the acquisition phase and then after a delay (wrist not tethered), controlling for arm reaching ability, gestational age, and socioeconomic status. Although both groups responded to the contingency with increased arm-waving from baseline to acquisition, 15% to 29% fewer infants with SB than TD were found to learn the contingency depending on the criterion used to determine contingency learning. In addition, infants with SB who had learned the contingency had more difficulty retaining the contingency over time when sensory feedback was absent. The findings suggest that infants with SB do not learn motor contingencies as easily or at the same rate as TD infants, and are more likely to decrease motor responses when sensory feedback is absent. Results are discussed with reference to research on contingency learning in infants with and without neurodevelopmental disorders, and with reference to motor learning in school-age children with SB.

  5. Resiliency Evaluation, Assessment and Contingency Tools, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Resiliency Evaluation, Assessment and Contingency Tools (REACT). Achieving resiliency in any system requires capabilities that are beyond the boundaries of currently...

  6. The Need for a Strategic Approach to Contingency Contracting

    National Research Council Canada - National Science Library

    D'Angelo, Anthony F; Houglan, Danny H; Ruckwardt, Edwin

    2007-01-01

    ...; however, the contingency arena is often overlooked. Corporations are finding a strategic enterprise orientation to procurement can create or enhance their own competitive position within a market...

  7. Practical application of double-contingency protection

    International Nuclear Information System (INIS)

    Kent, N.A.; Sanders, C.F.

    1995-01-01

    The Westinghouse Commercial Fuel Fabrication Facility in Columbia, South Carolina, manufactures fuel assemblies and core components for the commercial nuclear power industry. The ammonium diuranate conversion process converts low-enriched (235U) uranium hexafluoride (UF6) or uranyl nitrate into ceramic-grade uranium dioxide (UO2) powder. The UO2 powder is then tumble blended in 1700-kg containers to ensure powder homogeneity and obtain necessary enrichments. The double-contingency principle is applied to all systems, processes, and components in which special nuclear material is processed, handled, or stored to ensure that an acceptable nuclear criticality margin of safety is maintained. The Nuclear Criticality Safety (NCS) Program at the Columbia plant is divided into three primary functions: analysis and evaluation, implementation, and compliance. The primary task in analysis and evaluation is to develop comprehensive criticality safety evaluations for all proposed new installations and system modifications. These evaluations involve identifying which of the nine physical process parameters directly affect neutron multiplication and establishing bounding assumptions and criticality safety limits (CSLs). The implementation function primarily consists of translating the "NCS-speak" (parameters, k_eff, contingencies, barriers, controls, etc.) into operator language (procedural requirements, valve positions, flow rates, pressures, temperatures, etc.) and communicating this information clearly to the manufacturing function through procedures and training. The compliance function deals primarily with conducting criticality safety inspections, audits, and process upset investigations. This paper presents two examples of the challenges associated with the practical implementation of the double-contingency principle in the chemical manufacturing process at the Columbia plant

  8. Rethinking Reinforcement: Allocation, Induction, and Contingency

    Science.gov (United States)

    Baum, William M

    2012-01-01

    The concept of reinforcement is at least incomplete and almost certainly incorrect. An alternative way of organizing our understanding of behavior may be built around three concepts: allocation, induction, and correlation. Allocation is the measure of behavior and captures the centrality of choice: All behavior entails choice and consists of choice. Allocation changes as a result of induction and correlation. The term induction covers phenomena such as adjunctive, interim, and terminal behavior—behavior induced in a situation by occurrence of food or another Phylogenetically Important Event (PIE) in that situation. Induction resembles stimulus control in that no one-to-one relation exists between induced behavior and the inducing event. If one allowed that some stimulus control were the result of phylogeny, then induction and stimulus control would be identical, and a PIE would resemble a discriminative stimulus. Much evidence supports the idea that a PIE induces all PIE-related activities. Research also supports the idea that stimuli correlated with PIEs become PIE-related conditional inducers. Contingencies create correlations between “operant” activity (e.g., lever pressing) and PIEs (e.g., food). Once an activity has become PIE-related, the PIE induces it along with other PIE-related activities. Contingencies also constrain possible performances. These constraints specify feedback functions, which explain phenomena such as the higher response rates on ratio schedules in comparison with interval schedules. Allocations that include a lot of operant activity are “selected” only in the sense that they generate more frequent occurrence of the PIE within the constraints of the situation; contingency and induction do the “selecting.” PMID:22287807

  9. 40 CFR 264.51 - Purpose and implementation of contingency plan.

    Science.gov (United States)

    2010-07-01

    40 CFR § 264.51 (Protection of Environment; Contingency Plan and Emergency Procedures): (a) Each owner or operator must have a contingency plan for his facility. The contingency...

  10. Sustainability between Necessity, Contingency and Impossibility

    Directory of Open Access Journals (Sweden)

    Karl Bruckmeier

    2009-12-01

    Sustainable use of natural resources seems necessary to maintain the functions and services of eco- and social systems in the long run. Efforts in policy and science for sustainable development have shown the splintering of local, national and global strategies. Sustainability becomes contingent and insecure with the actors' conflicting knowledge, interests and aims, and seems even impossible through the “rebound” effect. Making short- and long-term requirements of sustainability coherent requires critical, comparative and theoretical analysis of the problems met. For this purpose, important concepts and theories are discussed in this review of recent interdisciplinary literature on resource management.

  11. Resource Contingency Program : Draft Environmental Impact Statement.

    Energy Technology Data Exchange (ETDEWEB)

    United States. Bonneville Power Administration.

    1995-02-01

    In 1990, the Bonneville Power Administration (BPA) embarked upon the Resource Contingency Program (RCP) to fulfill its statutory responsibilities to supply electrical power to its utility, industrial and other customers in the Pacific Northwest. Instead of buying or building generating plants now, BPA has purchased options to acquire power later if needed. Three option development agreements were signed in September 1993 with three proposed natural gas-fired, combined-cycle combustion turbine (CT) projects near Chehalis and Satsop, Washington, and near Hermiston, Oregon. This environmental impact statement addresses the environmental consequences of purchasing power from these options.

  12. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  13. Dissociating Contingency Awareness and Conditioned Attitudes: Evidence of Contingency-Unaware Evaluative Conditioning

    Science.gov (United States)

    Hutter, Mandy; Sweldens, Steven; Stahl, Christoph; Unkelbach, Christian; Klauer, Karl Christoph

    2012-01-01

    Whether human evaluative conditioning can occur without contingency awareness has been the subject of an intense and ongoing debate for decades, troubled by a wide array of methodological difficulties. Following recent methodological innovations, the available evidence currently points to the conclusion that evaluative conditioning effects do not…

  14. Contingency Contractor Optimization Phase 3 Sustainment Database Design Document - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Frazier, Christopher Rawls; Durfee, Justin David; Bandlow, Alisa; Gearhart, Jared Lee; Jones, Katherine A

    2016-05-01

    The Contingency Contractor Optimization Tool – Prototype (CCOT-P) database is used to store input and output data for the linear program model described in [1]. The database supports queries to retrieve these data, as well as updates and insertion of new input data.

  15. The Necessity of Contingency or Contingent Necessity: Meillassoux, Hegel, and the Subject

    Directory of Open Access Journals (Sweden)

    John Van Houdt

    2011-06-01

    This article addresses the relationship of contingency to necessity as developed by Quentin Meillassoux and G.W.F. Hegel. Meillassoux criticizes the restriction of possibility by modern philosophy to the conditions of the transcendental subject, which he calls ‘correlationism’, and opposes to this correlationism mathematics as an absolute form of thought. The arch-figure of a metaphysical version of correlationism for Meillassoux is Hegel. This article argues that, while Meillassoux is right to criticize a version of correlationism for restricting the range of contingency, he overlooks Hegel’s unique contribution to this issue. Hegel provides us a version of necessity modeled on the mathematical proof which answers Meillassoux’s concerns about correlationist versions of necessity but does not altogether jettison the concept of the subject. Instead, the subject in Hegel is a contingent interruption which emerges from the breaks in the kinds of necessity we posit about the world. Hegel offers us a way of tying these two concepts together in what I call ‘contingent necessity’.

  16. Contingency in the Cosmos and the Contingency of the Cosmos : Two Theological Approaches

    NARCIS (Netherlands)

    Drees, W.B.

    Contingency in reality may be epistemic, due to incomplete knowledge or the intersection of unrelated causal trajectories. In quantum physics, it appears to be ontological. More fundamental and interesting is the limit-question ‘why is there something rather than nothing,’ pointing out the

  17. Aspects of modern fracture statistics

    International Nuclear Information System (INIS)

    Tradinik, W.; Pabst, R.F.; Kromp, K.

    1981-01-01

    This contribution begins with introductory general remarks about fracture statistics. Then the fundamentals of the distribution of fracture probability are described. In the following part the application of Weibull statistics is justified. In the fourth chapter the microstructure of the material is considered in connection with calculations made to determine the fracture probability or risk of fracture. (RW)
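
    The abstract invokes Weibull statistics for fracture probability without reproducing the model. As a reminder of what that entails, here is a minimal sketch of the standard two-parameter Weibull form; the characteristic strength and modulus below are illustrative assumptions, not values from the paper.

        import math

        def weibull_fracture_probability(stress, sigma_0, m):
            """Two-parameter Weibull model: P_f = 1 - exp(-(stress/sigma_0)**m),
            with characteristic strength sigma_0 and Weibull modulus m."""
            return 1.0 - math.exp(-((stress / sigma_0) ** m))

        # Illustrative values: characteristic strength 300 MPa, modulus m = 10.
        for stress in (200.0, 300.0, 400.0):
            p_f = weibull_fracture_probability(stress, 300.0, 10.0)
            print(f"{stress:6.1f} MPa -> fracture probability {p_f:.3f}")

    At the characteristic strength the model gives P_f = 1 - 1/e ≈ 0.632 by construction.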

  18. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.
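
    To make the Experiment 1 design concrete, the sketch below draws target locations from a continuous distribution of the kind described: a mixture of an off-center Gaussian hotspot and a uniform background. The display size, hotspot position, spread, and mixture weight are all hypothetical.

        import random

        def sample_target_location(width=800, height=600, hotspot=(550.0, 200.0),
                                   sd=60.0, p_hotspot=0.7):
            """Draw a target location that is more probable near an off-center
            hotspot and falls back to a uniform background rate elsewhere."""
            if random.random() < p_hotspot:
                # Gaussian cloud around the hotspot, clipped to the display.
                x = min(max(random.gauss(hotspot[0], sd), 0.0), width)
                y = min(max(random.gauss(hotspot[1], sd), 0.0), height)
            else:
                # Uniform background anywhere on the display.
                x, y = random.uniform(0, width), random.uniform(0, height)
            return x, y

        trials = [sample_target_location() for _ in range(1000)]
        print(trials[:3])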

  19. Learning, awareness, and instruction: subjective contingency awareness does matter in the colour-word contingency learning paradigm.

    Science.gov (United States)

    Schmidt, James R; De Houwer, Jan

    2012-12-01

    In three experiments, each of a set of colour-unrelated distracting words was presented most often in a particular target print colour (e.g., "month" most often in red). In Experiment 1, half of the participants were told the word-colour contingencies in advance (instructed) and half were not (control). The instructed group showed a larger learning effect. This instruction effect was fully explained by increases in subjective awareness with instruction. In Experiment 2, contingency instructions were again given, but no contingencies were actually present. Although many participants claimed to be aware of these (non-existent) contingencies, they did not produce an instructed contingency effect. In Experiment 3, half of the participants were given contingency instructions that did not correspond to the correct contingencies. Participants with these false instructions learned the actual contingencies worse than controls. Collectively, our results suggest that conscious contingency knowledge might play a moderating role in the strength of implicit learning. Copyright © 2012 Elsevier Inc. All rights reserved.
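
    A minimal sketch of how such a contingency manipulation might be constructed: each distracting word is assigned a frequent print colour and appears in it on a fixed proportion of trials. "month" in red is taken from the abstract; the other pairings and the 80% contingency level are assumptions for illustration.

        import random

        WORD_TO_FREQUENT_COLOUR = {"month": "red", "plate": "green", "under": "blue"}
        COLOURS = ["red", "green", "blue"]

        def make_trials(n_per_word=50, contingency=0.8):
            """Pair each word with its frequent colour on `contingency` of its
            trials and with one of the remaining colours otherwise."""
            trials = []
            for word, frequent in WORD_TO_FREQUENT_COLOUR.items():
                others = [c for c in COLOURS if c != frequent]
                for _ in range(n_per_word):
                    colour = frequent if random.random() < contingency else random.choice(others)
                    trials.append((word, colour))
            random.shuffle(trials)
            return trials

        print(make_trials()[:5])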

  20. Significant others and contingencies of self-worth: activation and consequences of relationship-specific contingencies of self-worth.

    Science.gov (United States)

    Horberg, E J; Chen, Serena

    2010-01-01

    Three studies tested the activation and consequences of contingencies of self-worth associated with specific significant others, that is, relationship-specific contingencies of self-worth. The results showed that activating the mental representation of a significant other with whom one strongly desires closeness led participants to stake their self-esteem in domains in which the significant other wanted them to excel. This was shown in terms of self-reported contingencies of self-worth (Study 1), in terms of self-worth after receiving feedback on a successful or unsatisfactory performance in a relationship-specific contingency domain (Study 2), and in terms of feelings of reduced self-worth after thinking about a failure in a relationship-specific contingency domain (Study 3). Across studies, a variety of contingency domains were examined. Furthermore, Study 3 showed that failing in an activated relationship-specific contingency domain had negative implications for current feelings of closeness and acceptance in the significant-other relationship. Overall, the findings suggest that people's contingencies of self-worth depend on the social situation and that performance in relationship-specific contingency domains can influence people's perceptions of their relationships.

  1. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  2. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.
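
    As a taste of the material listed under Bayesian parameter estimation, here is the canonical conjugate beta-binomial update in a few lines; the prior and the counts are illustrative, not taken from the book.

        # Conjugate beta-binomial updating: a Beta(alpha, beta) prior on a
        # success probability p, updated with observed successes and failures.
        alpha_prior, beta_prior = 1.0, 1.0   # uniform Beta(1, 1) prior
        successes, failures = 7, 3           # hypothetical data

        alpha_post = alpha_prior + successes
        beta_post = beta_prior + failures
        posterior_mean = alpha_post / (alpha_post + beta_post)
        print(f"Posterior: Beta({alpha_post:.0f}, {beta_post:.0f}), mean = {posterior_mean:.3f}")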

  3. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  4. Waste Package Misload Probability

    International Nuclear Information System (INIS)

    Knudsen, J.K.

    2001-01-01

    The objective of this calculation is to calculate the probability of occurrence for fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using this information, a probability of occurrence will be calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.
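
    In its simplest form the calculation described reduces to an event rate per fuel-assembly movement. A minimal sketch with hypothetical counts (the report's actual event data are not reproduced here):

        # Point estimate of a per-movement misload probability from event counts.
        misload_events = 4            # categorized misload events (hypothetical)
        total_fa_movements = 250_000  # total FA movements reviewed (hypothetical)

        p_misload = misload_events / total_fa_movements
        print(f"Estimated misload probability per FA movement: {p_misload:.2e}")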

  5. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  6. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved, but the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  7. Patterns of medicinal plant use: an examination of the Ecuadorian Shuar medicinal flora using contingency table and binomial analyses.

    Science.gov (United States)

    Bennett, Bradley C; Husby, Chad E

    2008-03-28

    Botanical pharmacopoeias are non-random subsets of floras, with some taxonomic groups over- or under-represented. Moerman [Moerman, D.E., 1979. Symbols and selectivity: a statistical analysis of Native American medical ethnobotany. Journal of Ethnopharmacology 1, 111-119] introduced linear regression/residual analysis to examine these patterns. However, regression, the commonly employed analysis, suffers from several statistical flaws. We use contingency table and binomial analyses to examine patterns of Shuar medicinal plant use (from Amazonian Ecuador). We first analyzed the Shuar data using Moerman's approach, modified to better meet the requirements of linear regression analysis. Second, we assessed the exact randomization contingency table test for goodness of fit. Third, we developed a binomial model to test for non-random selection of plants in individual families. Modified regression models (which accommodated assumptions of linear regression) reduced R² from 0.59 to 0.38, but did not eliminate all problems associated with regression analyses. Contingency table analyses revealed that the entire flora departs from the null model of equal proportions of medicinal plants in all families. In the binomial analysis, only 10 angiosperm families (of 115) differed significantly from the null model. These 10 families are largely responsible for patterns seen at higher taxonomic levels. Contingency table and binomial analyses offer an easy and statistically valid alternative to the regression approach.
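
    A minimal sketch of the per-family binomial test the authors describe: given the flora-wide proportion of medicinal plants, test whether one family's medicinal count departs from that null proportion. The counts below are hypothetical, not the Shuar data.

        from math import comb

        def binomial_two_sided_p(k, n, p):
            """Exact two-sided binomial test: sum the probabilities of all
            outcomes no more likely than the observed count k."""
            pmf = [comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(n + 1)]
            return sum(q for q in pmf if q <= pmf[k] * (1 + 1e-12))

        # Hypothetical family: 12 of 30 species used medicinally, against a
        # flora-wide medicinal proportion of 20%.
        print(f"p = {binomial_two_sided_p(12, 30, 0.20):.4f}")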

  8. A Study on Contingency Learning in Introductory Physics Concepts

    Science.gov (United States)

    Scaife, Thomas M.

    ...investigation of their behavior, students were asked what rule they used when answering questions. Although the self-reported rules might not be congruent with their behavior, training with specific examples might affect how students explicitly think about physics problems. In addition to exploring the effectiveness of various training examples, the results were also compared to a cognitive theory of causality: the contingency model. Physical concepts can often be expressed in terms of causal relations (e.g., a net force causes an object to accelerate), and a large body of work has found that people make many decisions that are consistent with causal reasoning. The contingency model, in particular, explains how certain statistical regularities in the co-occurrence of two events can be interpreted by individuals as causal relations, and was chosen primarily because of its robust results and simple, parsimonious form. The empirical results demonstrate that different categories of training examples did affect student answers differently. Furthermore, these effects were mostly consistent with the predictions made by the contingency model. When rule use was explored, the self-reported rules were consistent with contingency model predictions, but indicated that examples alone were insufficient to teach complex functional relationships between physical dimensions, such as torque.
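
    The contingency model referred to here is standardly formalized as the Delta-P rule: the perceived causal strength of a cue tracks P(effect | cue) - P(effect | no cue), estimated from a 2x2 co-occurrence table. A minimal sketch with hypothetical counts:

        def delta_p(n_ce, n_c_noe, n_noc_e, n_noc_noe):
            """Delta-P = P(E|C) - P(E|not C), from the four cells of a
            2x2 table of cue/effect co-occurrence counts."""
            p_e_given_c = n_ce / (n_ce + n_c_noe)
            p_e_given_not_c = n_noc_e / (n_noc_e + n_noc_noe)
            return p_e_given_c - p_e_given_not_c

        # Hypothetical counts: the effect follows the cue on 18 of 20 cue
        # trials but occurs on only 5 of 20 no-cue trials.
        print(f"Delta-P = {delta_p(18, 2, 5, 15):+.2f}")  # +0.65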

  9. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  10. Retrocausality and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two counts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagen-like disavowal of realism in quantum mechanics. 6 refs. (Author)

  11. Some aspects of statistical modeling of human-error probability

    International Nuclear Information System (INIS)

    Prairie, R.R.

    1982-01-01

    Human reliability analyses (HRA) are often performed as part of risk assessment and reliability projects. Recent events in nuclear power have shown the potential importance of the human element. There are several ongoing efforts in the US and elsewhere with the purpose of modeling human error such that the human contribution can be incorporated into an overall risk assessment associated with one or more aspects of nuclear power. The effort described here uses the HRA event tree to quantify and model the human contribution to risk. As an example, risk analyses are being prepared on several nuclear power plants as part of the Interim Reliability Assessment Program (IREP). In this process the risk analyst selects the elements of his fault tree to which human error could contribute. He then asks the human factors (HF) analyst to perform an HRA on each such element.
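
    A minimal sketch of the arithmetic behind quantifying such an event tree: multiply the per-step success probabilities along the all-success path and take the complement as the overall human-error probability. The step values are hypothetical, not from the paper.

        # Overall human-error probability for a sequence of task steps,
        # assuming independence between steps (illustrative numbers).
        step_failure_probs = [0.003, 0.01, 0.001]  # hypothetical per-step HEPs

        p_all_success = 1.0
        for p_fail in step_failure_probs:
            p_all_success *= 1.0 - p_fail

        print(f"Overall human-error probability: {1.0 - p_all_success:.4f}")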

  12. Advice for New and Student Lecturers on Probability and Statistics

    Science.gov (United States)

    Larsen, Michael D.

    2006-01-01

    Lecture is a common presentation style that gives instructors a lot of control over topics and time allocation, but can limit active student participation and learning. This article presents some ideas to increase the level of student involvement in lecture. The examples and suggestions are based on the author's experience as a senior lecturer for…

  13. Motivating Inquiry in Statistics and Probability in the Primary Classroom

    Science.gov (United States)

    Leavy, Aisling; Hourigan, Mairéad

    2015-01-01

    We describe how the use of a games environment combined with technology supports upper primary children in engaging with a concept traditionally considered too advanced for the primary classes: "The Law of Large Numbers."
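
    The target concept is easy to demonstrate outside a games environment as well: the running mean of fair-die rolls settles toward the expected value 3.5 as the sample grows. A simulation sketch (not the authors' classroom activity):

        import random

        random.seed(1)
        for n in (10, 100, 1_000, 10_000, 100_000):
            rolls = [random.randint(1, 6) for _ in range(n)]
            print(f"n = {n:>6}: mean = {sum(rolls) / n:.3f}")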

  14. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  15. Statistical thermodynamics

    International Nuclear Information System (INIS)

    Lim, Gyeong Hui

    2008-03-01

    This book consists of 15 chapters: basic concepts and meaning of statistical thermodynamics; Maxwell-Boltzmann statistics; ensembles; thermodynamic functions and fluctuations; statistical dynamics of independent-particle systems; ideal molecular systems; chemical equilibrium and chemical reaction rates in ideal gas mixtures; classical statistical thermodynamics; the ideal lattice model; lattice statistics and non-ideal lattice models; imperfect gas theory applied to liquids; the theory of solutions; statistical thermodynamics of interfaces; statistical thermodynamics of high-molecule systems; and quantum statistics.

  16. Computational statistics handbook with Matlab

    CERN Document Server

    Martinez, Wendy L

    2007-01-01

    Prefaces; Introduction: What Is Computational Statistics?, An Overview of the Book; Probability Concepts: Introduction, Probability, Conditional Probability and Independence, Expectation, Common Distributions; Sampling Concepts: Introduction, Sampling Terminology and Concepts, Sampling Distributions, Parameter Estimation, Empirical Distribution Function; Generating Random Variables: Introduction, General Techniques for Generating Random Variables, Generating Continuous Random Variables, Generating Discrete Random Variables; Exploratory Data Analysis: Introduction, Exploring Univariate Data, Exploring Bivariate and Trivariate Data, Exploring Multidimensional Data; Finding Structure: Introduction, Projecting Data, Principal Component Analysis, Projection Pursuit EDA, Independent Component Analysis, Grand Tour, Nonlinear Dimensionality Reduction; Monte Carlo Methods for Inferential Statistics: Introduction, Classical Inferential Statistics, Monte Carlo Methods for Inferential Statist...

  17. Students' Understanding of Conditional Probability on Entering University

    Science.gov (United States)

    Reaburn, Robyn

    2013-01-01

    An understanding of conditional probability is essential for students of inferential statistics as it is used in Null Hypothesis Tests. Conditional probability is also used in Bayes' theorem, in the interpretation of medical screening tests and in quality control procedures. This study examines the understanding of conditional probability of…
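
    The medical-screening interpretation mentioned in the abstract is the standard worked example of conditional probability. A minimal sketch with assumed test characteristics, showing why a positive result for a rare condition still leaves the disease improbable:

        # Bayes' theorem for a screening test (all numbers are assumptions).
        prevalence = 0.01   # P(disease)
        sensitivity = 0.95  # P(positive | disease)
        specificity = 0.90  # P(negative | no disease)

        p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
        p_disease_given_positive = sensitivity * prevalence / p_positive
        print(f"P(disease | positive) = {p_disease_given_positive:.3f}")  # ~0.088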

  18. The estimation of small probabilities and risk assessment

    International Nuclear Information System (INIS)

    Kalbfleisch, J.D.; Lawless, J.F.; MacKay, R.J.

    1982-01-01

    The primary contribution of statistics to risk assessment is in the estimation of probabilities. Frequently the probabilities in question are small, and their estimation is particularly difficult. The authors consider three examples illustrating some problems inherent in the estimation of small probabilities
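
    One classic illustration of the difficulty (a standard textbook result, not necessarily one of the authors' three examples): observing zero events in n independent trials does not mean the probability is zero. The exact one-sided 95% upper confidence bound on p solves (1 - p)^n = 0.05, which is approximately 3/n, the "rule of three".

        # Upper 95% confidence bound on p after k = 0 events in n trials.
        for n in (100, 1_000, 10_000):
            upper = 1.0 - 0.05 ** (1.0 / n)
            print(f"n = {n:>6}: upper bound = {upper:.2e} (rule of three: {3 / n:.2e})")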

  19. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  20. Probability of Boulders

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    1997-01-01

    To collect background information for formulating a description of the expected soil properties along the tunnel line, in 1987 Storebælt initiated a statistical investigation of the occurrence and size of boulders in the Great Belt area. The data for the boulder size distribution were obtained... The data collection part of the investigation was made on the basis of geological expert advice (Gunnar Larsen, Århus) by the Danish Geotechnical Institute (DGI). The statistical data analysis combined with stochastic modeling based on geometry and sound wave diffraction theory gave a point estimate