WorldWideScience

Sample records for contingent probability statistics

  1. Comparison of accuracy in predicting emotional instability from MMPI data: Fisherian versus contingent probability statistics

    Energy Technology Data Exchange (ETDEWEB)

    Berghausen, P.E. Jr.; Mathews, T.W.

    1987-01-01

    The security plans of nuclear power plants generally require that all personnel who are to have access to protected areas or vital islands be screened for emotional stability. In virtually all instances, the screening involves the administration of one or more psychological tests, usually including the Minnesota Multiphasic Personality Inventory (MMPI). At some plants, all employees receive a structured clinical interview after they have taken the MMPI and results have been obtained. At other plants, only those employees with "dirty" MMPI profiles are interviewed. This latter protocol is referred to as interviews by exception. Behaviordyne Psychological Corp. has succeeded in removing some of the uncertainty associated with interview-by-exception protocols by developing an empirically based predictive equation. This equation permits utility companies to make informed choices regarding the risks they are assuming. A conceptual problem exists with the predictive equation, however. Like most predictive equations currently in use, it is based on Fisherian statistics, involving least-squares analyses. Consequently, Behaviordyne Psychological Corp., in conjunction with T.W. Mathews and Associates, has just developed a second predictive equation, one based on contingent probability statistics. The particular technique used is the multi-contingent analysis of probability systems (MAPS) approach. The present paper compares the predictive accuracy of the two equations: the one derived using Fisherian techniques versus the one derived using contingent probability techniques.
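
    To make the contrast concrete, here is a hedged sketch. All numbers and the score threshold below are invented for illustration; the actual MAPS equation is proprietary and not described in the abstract. The Fisherian route fits a least-squares line to the data, while the contingent-probability route tabulates outcome frequencies conditional on the cue.

```python
# Hypothetical screening data: an MMPI scale score per employee (cue) and a
# later-observed instability flag (outcome). All values are invented.
scores = [55, 60, 72, 48, 80, 65, 70, 52, 77, 58]
unstable = [0, 0, 1, 0, 1, 0, 1, 0, 1, 0]
n = len(scores)

# Fisherian route: ordinary least-squares line, then threshold the fit.
mean_x = sum(scores) / n
mean_y = sum(unstable) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(scores, unstable))
         / sum((x - mean_x) ** 2 for x in scores))
intercept = mean_y - slope * mean_x
ls_pred = [slope * x + intercept >= 0.5 for x in scores]

# Contingent-probability route: estimate P(unstable | score band) directly
# from cell frequencies of a 2x2 contingency table, with no least-squares model.
high = [x >= 70 for x in scores]
p_high = sum(y for y, h in zip(unstable, high) if h) / sum(high)
p_low = sum(y for y, h in zip(unstable, high) if not h) / (n - sum(high))
```

    The frequency-based estimates make no distributional assumptions, which is the conceptual point of difference the abstract raises.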

  3. Contingency bias in probability judgement may arise from ambiguity regarding additional causes.

    Science.gov (United States)

    Mitchell, Chris J; Griffiths, Oren; More, Pranjal; Lovibond, Peter F

    2013-09-01

    In laboratory contingency learning tasks, people usually give accurate estimates of the degree of contingency between a cue and an outcome. However, if they are asked to estimate the probability of the outcome in the presence of the cue, they tend to be biased by the probability of the outcome in the absence of the cue. This bias is often attributed to an automatic contingency detection mechanism, which is said to act via an excitatory associative link to activate the outcome representation at the time of testing. We conducted 3 experiments to test alternative accounts of contingency bias. Participants were exposed to the same outcome probability in the presence of the cue, but different outcome probabilities in the absence of the cue. Phrasing the test question in terms of frequency rather than probability and clarifying the test instructions reduced but did not eliminate contingency bias. However, removal of ambiguity regarding the presence of additional causes during the test phase did eliminate contingency bias. We conclude that contingency bias may be due to ambiguity in the test question, and therefore it does not require postulation of a separate associative link-based mechanism.
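
    The bias the abstract describes can be stated in terms of a 2x2 contingency table. The sketch below (counts are hypothetical) computes the two conditional probabilities and the delta-P contingency; the reported bias is that judgements of P(outcome | cue) drift toward P(outcome | no cue).

```python
from fractions import Fraction

# Hypothetical 2x2 contingency table of trial counts:
#                 outcome  no outcome
# cue present       a=12       b=8
# cue absent        c=16       d=4
a, b, c, d = 12, 8, 16, 4

p_outcome_cue = Fraction(a, a + b)          # P(outcome | cue) = 3/5
p_outcome_nocue = Fraction(c, c + d)        # P(outcome | no cue) = 4/5
delta_p = p_outcome_cue - p_outcome_nocue   # contingency = -1/5

# The bias at issue: judged P(outcome | cue) is pulled toward
# P(outcome | no cue), although the normative answer is 3/5.
```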

  4. Simplified Freeman-Tukey test statistics for testing probabilities in ...

    African Journals Online (AJOL)

    This paper presents the simplified version of the Freeman-Tukey test statistic for testing hypothesis about multinomial probabilities in one, two and multidimensional contingency tables that does not require calculating the expected cell frequencies before test of significance. The simplified method established new criteria of ...
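
    For reference, the classical (non-simplified) Freeman-Tukey statistic can be computed as below; the paper's simplified version, which avoids computing the expected frequencies E_i, is not reproduced here. Under the null hypothesis the statistic is approximately chi-square with k - 1 degrees of freedom.

```python
import math

def freeman_tukey(observed, probs):
    """Classical Freeman-Tukey goodness-of-fit statistic:
    T2 = sum_i (sqrt(O_i) + sqrt(O_i + 1) - sqrt(4 * E_i + 1)) ** 2,
    where E_i = n * p_i are the expected cell frequencies."""
    n = sum(observed)
    return sum(
        (math.sqrt(o) + math.sqrt(o + 1) - math.sqrt(4 * n * p + 1)) ** 2
        for o, p in zip(observed, probs)
    )

# Uniform null over 4 categories, n = 40, so E_i = 10 in each cell.
t2 = freeman_tukey([8, 12, 9, 11], [0.25] * 4)
# Compare t2 with the chi-square critical value on k - 1 = 3 df.
```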

  5. Model selection for contingency tables with algebraic statistics

    NARCIS (Netherlands)

    Krampe, A.; Kuhnt, S.; Gibilisco, P.; Riccimagno, E.; Rogantin, M.P.; Wynn, H.P.

    2009-01-01

    Goodness-of-fit tests based on chi-square approximations are commonly used in the analysis of contingency tables. Results from algebraic statistics combined with MCMC methods provide alternatives to the chi-square approximation. However, within a model selection procedure usually a large number of
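
    The algebraic-statistics alternative works by a random walk over all tables with the same margins, using a Markov basis of moves. A minimal sketch for a 2x2 table (where the basis is a single +1/-1 diagonal move) follows; a full exact test would additionally weight the walk by the hypergeometric distribution via a Metropolis step, omitted here.

```python
import random

# Random walk over 2x2 tables with fixed row and column sums. For 2x2
# tables the Markov basis is the single move that adds +1 to one diagonal
# and -1 to the other; moves that would create a negative cell are rejected.
def markov_walk(table, steps, rng):
    (a, b), (c, d) = table
    for _ in range(steps):
        eps = rng.choice([1, -1])
        if min(a + eps, d + eps, b - eps, c - eps) >= 0:
            a, b, c, d = a + eps, b - eps, c - eps, d + eps
    return (a, b), (c, d)

rng = random.Random(0)
(a, b), (c, d) = markov_walk(((5, 3), (2, 6)), 1000, rng)
# The walk never leaves the fiber: all margins of the start table are kept.
```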

  6. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers were included especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability, across psychological aspects of formulating subjective probability statements and abstract measure-theoretical considerations, to contributions to theoretical statistics an...

  7. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  8. Sampling, Probability Models and Statistical Reasoning: Statistical Inference

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 1, Issue 5. Sampling, Probability Models and Statistical Reasoning: Statistical Inference. Mohan Delampady, V R Padmawar. General Article, May 1996, pp. 49-58.

  9. Selected papers on probability and statistics

    CERN Document Server

    2009-01-01

    This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.

  10. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables. The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  11. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  12. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability, concurrent with and integrated with statistics, through interactive, tailored software applications designed to illustrate the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698). Incorporates more than 1,000 engaging problems with answers; includes more than 300 solved examples; uses varied problem-solving methods.

  13. A comparator-hypothesis account of biased contingency detection.

    Science.gov (United States)

    Vadillo, Miguel A; Barberia, Itxaso

    2018-02-12

    Our ability to detect statistical dependencies between different events in the environment is strongly biased by the number of coincidences between them. Even when there is no true covariation between a cue and an outcome, if the marginal probability of either of them is high, people tend to perceive some degree of statistical contingency between both events. The present paper explores the ability of the Comparator Hypothesis to explain the general pattern of results observed in this literature. Our simulations show that this model can account for the biasing effects of the marginal probabilities of cues and outcomes. Furthermore, the overall fit of the Comparator Hypothesis to a sample of experimental conditions from previous studies is comparable to that of the popular Rescorla-Wagner model. These results should encourage researchers to further explore and put to the test the predictions of the Comparator Hypothesis in the domain of biased contingency detection. Copyright © 2018 Elsevier B.V. All rights reserved.
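
    The Rescorla-Wagner model named in the abstract as the comparison baseline is an error-driven learning rule. The sketch below (parameters, trial counts, and probabilities are illustrative, not taken from the paper) shows the standard update in a zero-contingency design with high outcome density; the Comparator Hypothesis differs in comparing cue and context associations at the time of test rather than during learning.

```python
import random

# Standard Rescorla-Wagner updates in a zero-contingency design with high
# outcome density. The always-present context competes with the cue for
# associative strength.
alpha, lam = 0.1, 1.0      # learning rate and outcome asymptote (illustrative)
v_cue, v_ctx = 0.0, 0.0    # associative strengths
rng = random.Random(42)
for _ in range(200):
    cue_present = rng.random() < 0.5
    outcome = rng.random() < 0.8           # independent of the cue
    total = v_ctx + (v_cue if cue_present else 0.0)
    error = (lam if outcome else 0.0) - total
    v_ctx += alpha * error                 # context updated on every trial
    if cue_present:
        v_cue += alpha * error
```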

  14. Statistics with JMP graphs, descriptive statistics and probability

    CERN Document Server

    Goos, Peter

    2015-01-01

    Peter Goos, Department of Statistics, University of Leuven, Faculty of Bio-Science Engineering, and University of Antwerp, Faculty of Applied Economics, Belgium. David Meintrup, Department of Mathematics and Statistics, University of Applied Sciences Ingolstadt, Faculty of Mechanical Engineering, Germany. A thorough presentation of introductory statistics and probability theory, with numerous examples and applications using JMP. Descriptive Statistics and Probability provides an accessible and thorough overview of the most important descriptive statistics for nominal, ordinal and quantitative data with partic...

  15. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  16. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  17. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, their aims and scopes, the future direction of research, and how their own work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  18. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  19. Statistical Efficiency of Double-Bounded Dichotomous Choice Contingent Valuation

    OpenAIRE

    Michael Hanemann; John Loomis; Barbara Kanninen

    1991-01-01

    The statistical efficiency of conventional dichotomous choice contingent valuation surveys can be improved by asking each respondent a second dichotomous choice question which depends on the response to the first question—if the first response is "yes," the second bid is some amount greater than the first bid; while, if the first response is "no," the second bid is some amount smaller. This "double-bounded" approach is shown to be asymptotically more efficient than the conventional, "singlebo...
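
    The efficiency gain comes from the second question narrowing each respondent's willingness-to-pay (WTP) interval. A minimal sketch of the interval logic follows; the bid values and the zero lower bound for the no-no case are illustrative assumptions, not from the paper.

```python
import math

def wtp_interval(bid1, yes1, bid2, yes2):
    """(lower, upper) willingness-to-pay bounds implied by the two answers.
    By design, bid2 > bid1 after a "yes" and bid2 < bid1 after a "no".
    A zero lower bound for the no-no case is an assumption made here."""
    if yes1 and yes2:
        return (bid2, math.inf)   # yes-yes: WTP at least the raised bid
    if yes1:
        return (bid1, bid2)       # yes-no: between the two bids
    if yes2:
        return (bid2, bid1)       # no-yes: between the lowered bid and bid1
    return (0.0, bid2)            # no-no: below the lowered bid

interval = wtp_interval(20, True, 40, False)   # WTP bracketed in (20, 40)
```

    The likelihood for each respondent is then the probability mass of the WTP distribution inside this interval, which is what makes the double-bounded estimator more efficient than the single-bounded one.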

  20. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  1. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

    Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results". Special emphases on simulation and discrete decision theory. A mathematically rich, but self-contained text, at a gentle pace. Review of calculus and linear algebra in an appendix. Mathematical interludes (in each chapter) examine mathematical techniques in the context of probabilistic or statistical importance. Numerous section exercises, summaries, historical notes, and Further Readings for reinforcem

  2. Uncertainty the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...

  3. Efficient statistical tests to compare Youden index: accounting for contingency correlation.

    Science.gov (United States)

    Chen, Fangyao; Xue, Yuqiang; Tan, Ming T; Chen, Pingyan

    2015-04-30

    The Youden index is widely utilized in studies evaluating the accuracy of diagnostic tests and the performance of predictive, prognostic, or risk models. However, both one-sample and two-independent-sample tests on the Youden index have been derived ignoring the dependence (association) between sensitivity and specificity, resulting in potentially misleading findings. Moreover, a paired-sample test on the Youden index is currently unavailable. This article develops efficient statistical inference procedures for one-sample, independent, and paired-sample tests on the Youden index by accounting for contingency correlation, namely the associations between sensitivity and specificity and between paired samples typically represented in contingency tables. For the one-sample and two-independent-sample tests, the variances are estimated by the delta method, and the statistical inference is based on the central limit theorem; these are then verified by bootstrap estimates. For the paired-sample test, we show that the estimated covariance of the two sensitivities and specificities can be represented as a function of the kappa statistic, so the test can be readily carried out. We then show the remarkable accuracy of the estimated variance using a constrained optimization approach. Simulation is performed to evaluate the statistical properties of the derived tests. The proposed approaches yield more stable type I errors at the nominal level and substantially higher power (efficiency) than does the original Youden's approach. Therefore, the simple explicit large-sample solution performs very well. Because we can readily implement the asymptotic and exact bootstrap computation with common software like R, the method is broadly applicable to the evaluation of diagnostic tests and model performance. Copyright © 2015 John Wiley & Sons, Ltd.
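
    The Youden index itself is simply J = sensitivity + specificity - 1. The sketch below (counts are hypothetical) computes J for a 2x2 diagnostic table and bootstraps its sampling distribution by resampling subjects within each group; the paper's own delta-method variances and kappa-based paired-sample covariance are not reproduced here.

```python
import random

def youden(tp, fn, tn, fp):
    """Youden index J = sensitivity + specificity - 1."""
    return tp / (tp + fn) + tn / (tn + fp) - 1

# Hypothetical diagnostic study: 50 diseased and 50 healthy subjects.
j = youden(tp=45, fn=5, tn=40, fp=10)   # sensitivity 0.9, specificity 0.8

# Bootstrap the sampling distribution of J by resampling subjects within
# each group; the paper instead derives delta-method variances (verified
# against bootstrap estimates such as these).
rng = random.Random(1)
diseased = [1] * 45 + [0] * 5    # 1 = positive test result
healthy = [1] * 10 + [0] * 40    # 1 = positive test result (false alarm)
boots = []
for _ in range(500):
    sens = sum(rng.choice(diseased) for _ in range(50)) / 50
    spec = 1 - sum(rng.choice(healthy) for _ in range(50)) / 50
    boots.append(sens + spec - 1)
```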

  4. On the limits of statistical learning: Intertrial contextual cueing is confined to temporally close contingencies.

    Science.gov (United States)

    Thomas, Cyril; Didierjean, André; Maquestiaux, François; Goujon, Annabelle

    2018-04-12

    Since the seminal study by Chun and Jiang (Cognitive Psychology, 36, 28-71, 1998), a large body of research based on the contextual-cueing paradigm has shown that the cognitive system is capable of extracting statistical contingencies from visual environments. Most of these studies have focused on how individuals learn regularities found within an intratrial temporal window: A context predicts the target position within a given trial. However, Ono, Jiang, and Kawahara (Journal of Experimental Psychology, 31, 703-712, 2005) provided evidence of an intertrial implicit-learning effect when a distractor configuration in preceding trials N - 1 predicted the target location in trials N. The aim of the present study was to gain further insight into this effect by examining whether it occurs when predictive relationships are impeded by interfering task-relevant noise (Experiments 2 and 3) or by a long delay (Experiments 1, 4, and 5). Our results replicated the intertrial contextual-cueing effect, which occurred in the condition of temporally close contingencies. However, there was no evidence of integration across long-range spatiotemporal contingencies, suggesting a temporal limitation of statistical learning.

  5. Probability and Statistics The Science of Uncertainty (Revised Edition)

    CERN Document Server

    Tabak, John

    2011-01-01

    Probability and Statistics, Revised Edition deals with the history of probability, describing the modern concept of randomness and examining "pre-probabilistic" ideas of what most people today would characterize as randomness. This revised book documents some historically important early uses of probability to illustrate some very important probabilistic questions. It goes on to explore statistics and the generations of mathematicians and non-mathematicians who began to address problems in statistical analysis, including the statistical structure of data sets as well as the theory of

  6. Elements of probability and statistics an introduction to probability with De Finetti’s approach and to Bayesian statistics

    CERN Document Server

    Biagini, Francesca

    2016-01-01

    This book provides an introduction to elementary probability and to Bayesian statistics using de Finetti's subjectivist approach. One of the features of this approach is that it does not require the introduction of a sample space – a non-intrinsic concept that makes the treatment of elementary probability unnecessarily complicated – but introduces as fundamental the concept of random numbers directly related to their interpretation in applications. Events become a particular case of random numbers, and probability a particular case of expectation when it is applied to events. The subjective evaluation of expectation and of conditional expectation is based on an economic choice of an acceptable bet or penalty. The properties of expectation and conditional expectation are derived by applying a coherence criterion that the evaluation has to follow. The book is suitable for all introductory courses in probability and statistics for students in Mathematics, Informatics, Engineering, and Physics.

  7. Geometric modeling in probability and statistics

    CERN Document Server

    Calin, Ovidiu

    2014-01-01

    This book covers topics in Information Geometry, a field which deals with the differential-geometric study of manifolds of probability density functions. This is a field that is increasingly attracting the interest of researchers from many different areas of science, including mathematics, statistics, geometry, computer science, signal processing, physics and neuroscience. It is the authors’ hope that the present book will be a valuable reference for researchers and graduate students in one of the aforementioned fields. This textbook is a unified presentation of differential geometry and probability theory, and constitutes a text for a course directed at graduate or advanced undergraduate students interested in applications of differential geometry in probability and statistics. The book contains over 100 proposed exercises meant to help students deepen their understanding, and it is accompanied by software that is able to provide numerical computations of several information geometric objects. The reader...

  8. Decision-making in probability and statistics Chilean curriculum

    DEFF Research Database (Denmark)

    Elicer, Raimundo

    2018-01-01

    Probability and statistics have become prominent subjects in school mathematics curricula. As an exemplary case, I investigate the role of decision making in the justification for probability and statistics in the current Chilean upper secondary mathematics curriculum. For addressing this concern, I draw upon Fairclough’s model for Critical Discourse Analysis to analyse selected texts as examples of discourse practices. The texts are interconnected with politically driven ideas of stochastics “for all”, the notion of statistical literacy coined by statisticians’ communities, schooling...

  9. Introduction to probability and statistics for engineers and scientists

    CERN Document Server

    Ross, Sheldon M

    2009-01-01

    This updated text provides a superior introduction to applied probability and statistics for engineering or science majors. Ross emphasizes the manner in which probability yields insight into statistical problems, ultimately resulting in an intuitive understanding of the statistical procedures most often used by practicing engineers and scientists. Real data sets are incorporated in a wide variety of exercises and examples throughout the book, and this emphasis on data motivates the probability coverage. As with the previous editions, Ross' text has tremendously clear exposition, plus real-data

  10. Models for probability and statistical inference theory and applications

    CERN Document Server

    Stapleton, James H

    2007-01-01

    This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readers. Models for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping. Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...

  11. Statistics and probability with applications for engineers and scientists

    CERN Document Server

    Gupta, Bhisham C

    2013-01-01

    Introducing the tools of statistics and probability from the ground up. An understanding of statistical tools is essential for engineers and scientists who often need to deal with data analysis over the course of their work. Statistics and Probability with Applications for Engineers and Scientists walks readers through a wide range of popular statistical techniques, explaining step-by-step how to generate, analyze, and interpret data for diverse applications in engineering and the natural sciences. Unique among books of this kind, Statistics and Prob...

  12. Truth, possibility and probability new logical foundations of probability and statistical inference

    CERN Document Server

    Chuaqui, R

    1991-01-01

    Anyone involved in the philosophy of science is naturally drawn into the study of the foundations of probability. Different interpretations of probability, based on competing philosophical ideas, lead to different statistical techniques, and frequently to mutually contradictory consequences. This unique book presents a new interpretation of probability, rooted in the traditional interpretation that was current in the 17th and 18th centuries. Mathematical models are constructed based on this interpretation, and statistical inference and decision theory are applied, including some examples in artificial intelligence, solving the main foundational problems. Nonstandard analysis is extensively developed for the construction of the models and in some of the proofs. Many nonstandard theorems are proved, some of them new, in particular, a representation theorem that asserts that any stochastic process can be approximated by a process defined over a space with equiprobable outcomes.

  13. Statistical learning of action: the role of conditional probability.

    Science.gov (United States)

    Meyer, Meredith; Baldwin, Dare

    2011-12-01

    Identification of distinct units within a continuous flow of human action is fundamental to action processing. Such segmentation may rest in part on statistical learning. In a series of four experiments, we examined what types of statistics people can use to segment a continuous stream involving many brief, goal-directed action elements. The results of Experiment 1 showed no evidence for sensitivity to conditional probability, whereas Experiment 2 displayed learning based on joint probability. In Experiment 3, we demonstrated that additional exposure to the input failed to engender sensitivity to conditional probability. However, the results of Experiment 4 showed that a subset of adults, namely those more successful at identifying actions that had been seen more frequently than comparison sequences, were also successful at learning conditional-probability statistics. These experiments help to clarify the mechanisms subserving processing of intentional action, and they highlight important differences from, as well as similarities to, prior studies of statistical learning in other domains, including language.
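
    The two statistics contrasted in the experiments can be computed directly from a symbol stream. In the hypothetical stream below, made of three-element "action units", the conditional (transitional) probability is high within a unit and lower at unit boundaries, while the joint probability simply counts raw pair frequencies.

```python
from collections import Counter

# A hypothetical stream built from three-element "action units": abc, xyz.
stream = list("abcabcxyzabcxyzxyz")

pair_counts = Counter(zip(stream, stream[1:]))
first_counts = Counter(stream[:-1])
total_pairs = sum(pair_counts.values())

def joint_p(x, y):
    """Joint probability of the adjacent pair (x, y)."""
    return pair_counts[(x, y)] / total_pairs

def cond_p(x, y):
    """Conditional (transitional) probability P(y | x)."""
    return pair_counts[(x, y)] / first_counts[x]

# Within-unit transition a -> b is certain; boundary transition c -> a is not.
```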

  14. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another
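The forward/inverse distinction these lectures draw can be sketched in a few lines (a hedged illustration, not part of the notes): the forward problem computes outcome probabilities from an a priori model of a fair die, while the inverse problem estimates those probabilities from observed data.

```python
import random
from fractions import Fraction

# Forward problem: a priori probabilities for fair dice.
p_face = Fraction(1, 6)          # any given face of one die
p_double_six = p_face * p_face   # two independent dice both show six

# Inverse problem: infer the face probabilities from observed rolls.
random.seed(1)
rolls = [random.randint(1, 6) for _ in range(60_000)]
estimates = {face: rolls.count(face) / len(rolls) for face in range(1, 7)}
# Each relative frequency (the maximum-likelihood estimate) should be
# close to 1/6 ~ 0.1667, but essentially never exactly equal to it.
```

The estimates carry sampling error, which is why the inverse problem is harder: one must also quantify the uncertainty of the inference, not just produce a point estimate.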

  15. Statistical probability tables CALENDF program

    International Nuclear Information System (INIS)

    Ribon, P.

    1989-01-01

The probability tables serve two purposes: to obtain a dense data representation and to calculate integrals by quadratures. They are mainly used in the USA for calculations by Monte Carlo and in the USSR and Europe for self-shielding calculations by the sub-group method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity.

  16. A Multidisciplinary Approach for Teaching Statistics and Probability

    Science.gov (United States)

    Rao, C. Radhakrishna

    1971-01-01

    The author presents a syllabus for an introductory (first year after high school) course in statistics and probability and some methods of teaching statistical techniques. The description comes basically from the procedures used at the Indian Statistical Institute, Calcutta. (JG)

  17. Probability and statistics in particle physics

    International Nuclear Information System (INIS)

    Frodesen, A.G.; Skjeggestad, O.

    1979-01-01

    Probability theory is entered into at an elementary level and given a simple and detailed exposition. The material on statistics has been organised with an eye to the experimental physicist's practical need, which is likely to be statistical methods for estimation or decision-making. The book is intended for graduate students and research workers in experimental high energy and elementary particle physics, and numerous examples from these fields are presented. (JIW)

  18. Python for probability, statistics, and machine learning

    CERN Document Server

    Unpingco, José

    2016-01-01

This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowledge...

  19. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  20. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  1. Probability and statistics: A reminder

    International Nuclear Information System (INIS)

    Clement, B.

    2013-01-01

The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from 'data analysis in experimental sciences' given in [1]. (authors)

  2. Pointwise probability reinforcements for robust statistical inference.

    Science.gov (United States)

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be with respect to their theoretical probability, and include, e.g., outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR, and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation.

  3. Do doctors need statistics? Doctors' use of and attitudes to probability and statistics.

    Science.gov (United States)

    Swift, Louise; Miles, Susan; Price, Gill M; Shepstone, Lee; Leinster, Sam J

    2009-07-10

There is little published evidence on what doctors do in their work that requires probability and statistics, yet the General Medical Council (GMC) requires new doctors to have these skills. This study investigated doctors' use of and attitudes to probability and statistics with a view to informing undergraduate teaching. An email questionnaire was sent to 473 clinicians with an affiliation to the University of East Anglia's Medical School. Of 130 respondents, approximately 90 per cent of doctors who performed each of the following activities found probability and statistics useful for that activity: accessing clinical guidelines and evidence summaries, explaining levels of risk to patients, assessing medical marketing and advertising material, interpreting the results of a screening test, reading research publications for general professional interest, and using research publications to explore non-standard treatment and management options. Seventy-nine per cent (103/130, 95 per cent CI 71 per cent, 86 per cent) of participants considered probability and statistics important in their work. Sixty-three per cent (78/124, 95 per cent CI 54 per cent, 71 per cent) said that there were activities that they could do better or start doing if they had an improved understanding of these areas, and 74 of these participants elaborated on this. Themes highlighted by participants included: being better able to critically evaluate other people's research; becoming more research-active; having a better understanding of risk; and being better able to explain things to, or teach, other people. Our results can be used to inform how probability and statistics should be taught to medical undergraduates and should encourage today's medical students of the subjects' relevance to their future careers.
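The intervals quoted in this abstract are standard confidence intervals for a proportion. The survey's exact method is not stated; a normal-approximation (Wald) interval, sketched below, approximately reproduces the "79 per cent (103/130, 95 per cent CI 71 per cent, 86 per cent)" figure:

```python
import math

successes, n = 103, 130   # doctors who considered the subjects important
p_hat = successes / n
z = 1.96                  # 97.5th percentile of the standard normal
se = math.sqrt(p_hat * (1 - p_hat) / n)
lower, upper = p_hat - z * se, p_hat + z * se
# Gives roughly 79% with a CI of about (72%, 86%). The published 71%
# lower bound suggests a slightly different interval was used
# (e.g. the Wilson score interval), but the calculation is the same idea.
```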

  4. Introduction to Statistics - eNotes

    DEFF Research Database (Denmark)

    Brockhoff, Per B.; Møller, Jan Kloppenborg; Andersen, Elisabeth Wreford

    2015-01-01

Online textbook used in the introductory statistics courses at DTU. It provides a basic introduction to applied statistics for engineers. The necessary elements from probability theory are introduced (stochastic variable, density and distribution function, mean and variance, etc.) and thereafter the most basic statistical analysis methods are presented: confidence band, hypothesis testing, simulation, simple and multiple regression, ANOVA and analysis of contingency tables. Examples with the software R are included for all presented theory and methods.

  5. Marrakesh International Conference on Probability and Statistics

    CERN Document Server

    Ouassou, Idir; Rachdi, Mustapha

    2015-01-01

    This volume, which highlights recent advances in statistical methodology and applications, is divided into two main parts. The first part presents theoretical results on estimation techniques in functional statistics, while the second examines three key areas of application: estimation problems in queuing theory, an application in signal processing, and the copula approach to epidemiologic modelling. The book’s peer-reviewed contributions are based on papers originally presented at the Marrakesh International Conference on Probability and Statistics held in December 2013.

  6. Statistical physics of pairwise probability models

    DEFF Research Database (Denmark)

    Roudi, Yasser; Aurell, Erik; Hertz, John

    2009-01-01

(No Danish abstract available.) Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the means and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying...

  7. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand's paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features of the new edition include 35 new exercises, an expanded section on the algebra of sets, expanded chapters on probabilities with more classical examples, a new section on regression, and an online instructors' manual containing solutions to all exercises.

  8. Constructing diagnostic likelihood: clinical decisions using subjective versus statistical probability.

    Science.gov (United States)

    Kinnear, John; Jackson, Ruth

    2017-07-01

Although physicians are highly trained in the application of evidence-based medicine, and are assumed to make rational decisions, there is evidence that their decision making is prone to biases. One of the biases that has been shown to affect accuracy of judgements is that of representativeness and base-rate neglect, where the saliency of a person's features leads to overestimation of their likelihood of belonging to a group. This results in the substitution of 'subjective' probability for statistical probability. This study examines clinicians' propensity to make estimations of subjective probability when presented with clinical information that is considered typical of a medical condition. The strength of the representativeness bias is tested by presenting choices in textual and graphic form. Understanding of statistical probability is also tested by omitting all clinical information. For the questions that included clinical information, 46.7% and 45.5% of clinicians made judgements of statistical probability, respectively. Where the question omitted clinical information, 79.9% of clinicians made a judgement consistent with statistical probability. There was a statistically significant difference in responses to the questions with and without representativeness information (χ2(1, n=254)=54.45, p<0.001). One of the causes for this representativeness bias may be the way clinical medicine is taught, where stereotypic presentations are emphasised in diagnostic decision making.

  9. Statistical models based on conditional probability distributions

    International Nuclear Information System (INIS)

    Narayanan, R.S.

    1991-10-01

    We present a formulation of statistical mechanics models based on conditional probability distribution rather than a Hamiltonian. We show that it is possible to realize critical phenomena through this procedure. Closely linked with this formulation is a Monte Carlo algorithm, in which a configuration generated is guaranteed to be statistically independent from any other configuration for all values of the parameters, in particular near the critical point. (orig.)

  10. A simplification of the likelihood ratio test statistic for testing ...

    African Journals Online (AJOL)

The traditional likelihood ratio test statistic for testing hypotheses about goodness of fit of multinomial probabilities in one-, two- and multi-dimensional contingency tables was simplified. Advantageously, using the simplified version of the statistic to test the null hypothesis is easier and faster because calculating the expected ...
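For reference, the unsimplified likelihood ratio statistic for a contingency table is G² = 2 Σ Oᵢⱼ ln(Oᵢⱼ/Eᵢⱼ), with expected counts computed under independence. A minimal sketch with hypothetical counts:

```python
import math

# Hypothetical 2 x 2 table of observed counts.
observed = [[10, 20],
            [30, 40]]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
n = sum(row_totals)

# Expected counts under independence: E_ij = (row_i total)(col_j total)/n.
expected = [[r * c / n for c in col_totals] for r in row_totals]

# Likelihood ratio statistic: G^2 = 2 * sum over cells of O * ln(O / E).
g2 = 2 * sum(o * math.log(o / e)
             for o_row, e_row in zip(observed, expected)
             for o, e in zip(o_row, e_row))
# For this table G^2 is about 0.804, close to Pearson's chi-squared,
# as expected when all cell counts are moderate.
```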

  11. Estimating the Probability of Traditional Copying, Conditional on Answer-Copying Statistics.

    Science.gov (United States)

    Allen, Jeff; Ghattas, Andrew

    2016-06-01

    Statistics for detecting copying on multiple-choice tests produce p values measuring the probability of a value at least as large as that observed, under the null hypothesis of no copying. The posterior probability of copying is arguably more relevant than the p value, but cannot be derived from Bayes' theorem unless the population probability of copying and probability distribution of the answer-copying statistic under copying are known. In this article, the authors develop an estimator for the posterior probability of copying that is based on estimable quantities and can be used with any answer-copying statistic. The performance of the estimator is evaluated via simulation, and the authors demonstrate how to apply the formula using actual data. Potential uses, generalizability to other types of cheating, and limitations of the approach are discussed.
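The Bayes'-theorem relation the authors build on can be written out directly. The numbers below are hypothetical placeholders, not values or the estimator from the article; the point is only that the posterior depends on the prior and on the likelihood under copying, not on the p value alone:

```python
# Posterior probability of copying via Bayes' theorem. All numbers
# are hypothetical placeholders, not values from the article.
prior_copying = 0.02   # assumed population probability of copying
lik_copy = 0.60        # P(statistic at least as extreme | copying)
lik_no_copy = 0.01     # the p value: P(statistic at least as extreme | no copying)

posterior = (prior_copying * lik_copy) / (
    prior_copying * lik_copy + (1 - prior_copying) * lik_no_copy)
# With a 2% prior, a p value of .01 yields a posterior near 0.55,
# far from the near-certainty the small p value alone might suggest.
```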

  12. Introduction to probability and statistics for ecosystem managers simulation and resampling

    CERN Document Server

    Haas, Timothy C

    2013-01-01

Explores computer-intensive probability and statistics for ecosystem management decision making. Simulation is an accessible way to explain probability and stochastic model behavior to beginners. This book introduces probability and statistics to future and practicing ecosystem managers by providing a comprehensive treatment of these two areas. The author presents a self-contained introduction for individuals involved in monitoring, assessing, and managing ecosystems and features intuitive, simulation-based explanations of probabilistic and statistical concepts. Mathematical programming details are provided for estimating ecosystem model parameters with Minimum Distance, a robust and computer-intensive method. The majority of examples illustrate how probability and statistics can be applied to ecosystem management challenges. There are over 50 exercises, making this book suitable for a lecture course in a natural resource and/or wildlife management department, or as the main text in a program of self-study...

  13. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

This book provides a unique and balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and ...

  14. Probability and logical structure of statistical theories

    International Nuclear Information System (INIS)

    Hall, M.J.W.

    1988-01-01

    A characterization of statistical theories is given which incorporates both classical and quantum mechanics. It is shown that each statistical theory induces an associated logic and joint probability structure, and simple conditions are given for the structure to be of a classical or quantum type. This provides an alternative for the quantum logic approach to axiomatic quantum mechanics. The Bell inequalities may be derived for those statistical theories that have a classical structure and satisfy a locality condition weaker than factorizability. The relation of these inequalities to the issue of hidden variable theories for quantum mechanics is discussed and clarified

  15. Disruptive Effects of Contingent Food on High-Probability Behavior

    Science.gov (United States)

    Frank-Crawford, Michelle A.; Borrero, John C.; Nguyen, Linda; Leon-Enriquez, Yanerys; Carreau-Webster, Abbey B.; DeLeon, Iser G.

    2012-01-01

    The delivery of food contingent on 10 s of consecutive toy engagement resulted in a decrease in engagement and a corresponding increase in other responses that had been previously reinforced with food. Similar effects were not observed when tokens exchangeable for the same food were delivered, suggesting that engagement was disrupted by the…

  16. Statistical methods in epidemiology. VII. An overview of the chi2 test for 2 x 2 contingency table analysis.

    Science.gov (United States)

    Rigby, A S

    2001-11-10

The odds ratio is an appropriate method of analysis for data in 2 x 2 contingency tables. However, other methods of analysis exist. One such method is based on the chi2 test of goodness-of-fit. Key players in the development of statistical theory include Pearson, Fisher and Yates. Data are presented in the form of 2 x 2 contingency tables and a method of analysis based on the chi2 test is introduced. There are many variations of the basic test statistic, one of which is the chi2 test with Yates' continuity correction. The usefulness (or not) of Yates' continuity correction is discussed. Problems of interpretation when the method is applied to k x m tables are highlighted. Some properties of the chi2 test are illustrated by taking examples from the author's teaching experiences. Journal editors should be encouraged to give both observed and expected cell frequencies so that better information comes out of the chi2 test statistic.
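A minimal sketch of the statistic this overview discusses, computed with and without Yates' continuity correction on a hypothetical 2 x 2 table:

```python
# Pearson chi-squared for a 2 x 2 table, with and without Yates'
# continuity correction (hypothetical counts for illustration).
observed = [[10, 20],
            [30, 40]]
row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
n = sum(row_totals)
expected = [[r * c / n for c in col_totals] for r in row_totals]

cells = [(o, e) for o_row, e_row in zip(observed, expected)
                for o, e in zip(o_row, e_row)]
chi2 = sum((o - e) ** 2 / e for o, e in cells)
chi2_yates = sum((abs(o - e) - 0.5) ** 2 / e for o, e in cells)
# Yates' correction always lowers the statistic (here from about 0.79
# to about 0.45), making the test more conservative.
```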

  17. Probability and Statistics in Aerospace Engineering

    Science.gov (United States)

    Rheinfurth, M. H.; Howell, L. W.

    1998-01-01

    This monograph was prepared to give the practicing engineer a clear understanding of probability and statistics with special consideration to problems frequently encountered in aerospace engineering. It is conceived to be both a desktop reference and a refresher for aerospace engineers in government and industry. It could also be used as a supplement to standard texts for in-house training courses on the subject.

  18. Statistics and Probability Theory In Pursuit of Engineering Decision Support

    CERN Document Server

    Faber, Michael Havbro

    2012-01-01

This book provides the reader with the basic skills and tools of statistics and probability in the context of engineering modeling and analysis. The emphasis is on the application, and the reasoning behind the application, of these skills and tools for the purpose of enhancing decision making in engineering. The purpose of the book is to ensure that the reader acquires the required theoretical basis and technical skills to feel comfortable with the theory of basic statistics and probability. Moreover, in this book, as opposed to many standard books on the same subject, the perspective is to focus on the use of the theory for the purpose of engineering model building and decision making. This work is suitable for readers with little or no prior knowledge on the subject of statistics and probability.

  19. Theory of overdispersion in counting statistics caused by fluctuating probabilities

    International Nuclear Information System (INIS)

    Semkow, Thomas M.

    1999-01-01

It is shown that the random Lexis fluctuations of probabilities such as probability of decay or detection cause the counting statistics to be overdispersed with respect to the classical binomial, Poisson, or Gaussian distributions. The generating and the distribution functions for the overdispersed counting statistics are derived. Applications to radioactive decay with detection and more complex experiments are given, as well as distinguishing between the source and background in the presence of overdispersion. Monte Carlo verifications are provided.
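One standard way fluctuating probabilities produce overdispersion is a Poisson rate that itself varies, e.g. gamma-distributed, yielding a negative binomial overall. This simulation sketch (an illustration of the general mechanism, not the paper's derivation) shows the variance-to-mean ratio rising above the Poisson value of 1:

```python
import math
import random

def poisson_sample(lam, rng):
    """Knuth's Poisson sampler; adequate for small rates."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def variance_to_mean(xs):
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / len(xs)
    return v / m

rng = random.Random(42)
n = 20_000
# Fixed rate: classical Poisson counting, variance/mean = 1.
fixed = [poisson_sample(5.0, rng) for _ in range(n)]
# Fluctuating rate: gamma(shape=2, scale=2.5), same mean rate of 5.
mixed = [poisson_sample(rng.gammavariate(2.0, 2.5), rng) for _ in range(n)]
# variance_to_mean(fixed) is near 1, while variance_to_mean(mixed) is
# well above 1 (theoretically 1 + var(rate)/mean(rate) = 3.5 here).
```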

  20. Probability theory and statistical applications a profound treatise for self-study

    CERN Document Server

    Zörnig, Peter

    2016-01-01

This accessible and easy-to-read book provides many examples to illustrate diverse topics in probability and statistics, from initial concepts up to advanced calculations. Special attention is devoted, e.g., to independence of events, inequalities in probability, and functions of random variables. The book is directed to students of mathematics, statistics, engineering, and other quantitative sciences.

  1. Contingent and Alternative Work Arrangements, Defined.

    Science.gov (United States)

    Polivka, Anne E.

    1996-01-01

    Discusses the definitions of contingent workers and alternative work arrangements used by the Bureau of Labor Statistics to analyze data, and presents aggregate estimates of the number of workers in each group. Discusses the overlap between contingent workers and workers in alternative arrangements. (Author/JOW)

  2. Practical Statistics for LHC Physicists: Descriptive Statistics, Probability and Likelihood (1/3)

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    These lectures cover those principles and practices of statistics that are most relevant for work at the LHC. The first lecture discusses the basic ideas of descriptive statistics, probability and likelihood. The second lecture covers the key ideas in the frequentist approach, including confidence limits, profile likelihoods, p-values, and hypothesis testing. The third lecture covers inference in the Bayesian approach. Throughout, real-world examples will be used to illustrate the practical application of the ideas. No previous knowledge is assumed.

  3. What's Missing in Teaching Probability and Statistics: Building Cognitive Schema for Understanding Random Phenomena

    Science.gov (United States)

    Kuzmak, Sylvia

    2016-01-01

    Teaching probability and statistics is more than teaching the mathematics itself. Historically, the mathematics of probability and statistics was first developed through analyzing games of chance such as the rolling of dice. This article makes the case that the understanding of probability and statistics is dependent upon building a…

  4. Probability & Statistics: Modular Learning Exercises. Teacher Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…

  5. Probability & Statistics: Modular Learning Exercises. Student Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…

  6. Statistical validation of normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; van t Veld, Aart; Langendijk, Johannes A.; Schilstra, Cornelis

    2012-01-01

PURPOSE: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: A penalized regression method, LASSO (least absolute shrinkage and selection operator), ...

  7. Dealing with customers enquiries simultaneously under contingent situation

    Directory of Open Access Journals (Sweden)

    Sujan Piya

    2015-09-01

This paper proposes a method to quote the due date and the price of incoming orders to multiple customers simultaneously when contingent orders exist. The proposed method utilizes probabilistic information on contingent orders and incorporates some negotiation theories to generate quotations. Rather than improving the acceptance probability of the quotation for a single customer, the method improves the overall acceptance probability of the quotations submitted to multiple customers. This helps increase the company's total expected contribution and the acceptance probability of all new orders, rather than increasing these measures for a single customer only. Numerical analysis is conducted to demonstrate the working mechanism of the proposed method and its effectiveness in contrast to the sequential method of quotation.

  8. Contingency proportion systematically influences contingency learning.

    Science.gov (United States)

    Forrin, Noah D; MacLeod, Colin M

    2018-01-01

In the color-word contingency learning paradigm, each word appears more often in one color (high contingency) than in the other colors (low contingency). Shortly after beginning the task, color identification responses become faster on the high-contingency trials than on the low-contingency trials: the contingency learning effect. Across five groups, we varied the high-contingency proportion in 10% steps, from 80% to 40%. The size of the contingency learning effect was positively related to high-contingency proportion, with the effect disappearing when high contingency was reduced to 40%. At the two highest contingency proportions, the magnitude of the effect increased over trials, the pattern suggesting that there was an increasing cost for the low-contingency trials rather than an increasing benefit for the high-contingency trials. Overall, the results fit a modified version of Schmidt's (2013, Acta Psychologica, 142, 119-126) parallel episodic processing account, in which prior trial instances are routinely retrieved from memory and influence current trial performance.

  9. Visualizing and Understanding Probability and Statistics: Graphical Simulations Using Excel

    Science.gov (United States)

    Gordon, Sheldon P.; Gordon, Florence S.

    2009-01-01

    The authors describe a collection of dynamic interactive simulations for teaching and learning most of the important ideas and techniques of introductory statistics and probability. The modules cover such topics as randomness, simulations of probability experiments such as coin flipping, dice rolling and general binomial experiments, a simulation…

  10. Log-concave Probability Distributions: Theory and Statistical Testing

    DEFF Research Database (Denmark)

    An, Mark Yuing

    1996-01-01

This paper studies the broad class of log-concave probability distributions that arise in the economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing the differentiability of density functions. Discrete and multivariate distributions are also discussed. We propose simple non-parametric testing procedures for log-concavity. The test statistics are constructed to test one of the two implications of log-concavity: increasing hazard rates and the new-is-better-than-used (NBU) property. The tests for increasing hazard rates are based on normalized spacings of the sample order statistics. The tests for the NBU property fall into the category of Hoeffding's U-statistics...

  11. Evidence-Based Medicine as a Tool for Undergraduate Probability and Statistics Education.

    Science.gov (United States)

    Masel, J; Humphrey, P T; Blackburn, B; Levine, J A

    2015-01-01

Most students have difficulty reasoning about chance events, and misconceptions regarding probability can persist or even strengthen following traditional instruction. Many biostatistics classes sidestep this problem by prioritizing exploratory data analysis over probability. However, probability itself, in addition to statistics, is essential both to the biology curriculum and to informed decision making in daily life. One area in which probability is particularly important is medicine. Given the preponderance of pre-health students, in addition to more general interest in medicine, we capitalized on students' intrinsic motivation in this area to teach both probability and statistics. We use the randomized controlled trial as the centerpiece of the course, because it exemplifies the most salient features of the scientific method, and the application of critical thinking to medicine. The other two pillars of the course are biomedical applications of Bayes' theorem and science and society content. Backward design from these three overarching aims was used to select appropriate probability and statistics content, with a focus on eliciting and countering previously documented misconceptions in their medical context. Pretest/posttest assessments using the Quantitative Reasoning Quotient and Attitudes Toward Statistics instruments are positive, bucking several negative trends previously reported in statistics education.
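A concrete instance of the "biomedical applications of Bayes' theorem" pillar mentioned here is the positive predictive value of a screening test. The numbers below are hypothetical, chosen only to exhibit the base-rate effect such courses aim to teach:

```python
# Bayes' theorem for a diagnostic test (hypothetical numbers).
sensitivity = 0.90   # P(test positive | disease)
specificity = 0.95   # P(test negative | no disease)
prevalence = 0.01    # base rate of the disease

# Total probability of a positive test, then the posterior.
p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
ppv = sensitivity * prevalence / p_pos
# Despite an accurate-sounding test, P(disease | positive) is only
# about 15% here: the low base rate dominates the calculation.
```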

  12. No Win, No Fee: Some Economics of Contingent Legal Fees.

    OpenAIRE

    Gravelle, Hugh; Waterson, Michael

    1993-01-01

    This paper analyzes the effects on the litigation process of alternative contracts between plaintiffs and their lawyers. Three contracts are compared: normal (hourly fee), contingent mark up fees, and contingent share contracts. The focus is on the first two, a recent change in English law governing legal fees providing the motivation. The influences of the contract type on the acceptance of settlement offers, the settlement probability, the accident probability, the demand for trials, and th...

  13. Teaching Basic Probability in Undergraduate Statistics or Management Science Courses

    Science.gov (United States)

    Naidu, Jaideep T.; Sanford, John F.

    2017-01-01

    Standard textbooks in core Statistics and Management Science classes present various examples to introduce basic probability concepts to undergraduate business students. These include tossing of a coin, throwing a die, and examples of that nature. While these are good examples to introduce basic probability, we use improvised versions of Russian…

  14. Statistical complexity without explicit reference to underlying probabilities

    Science.gov (United States)

    Pennini, F.; Plastino, A.

    2018-06-01

    We show that extremely simple systems of a not too large number of particles can be simultaneously thermally stable and complex. To such an end, we extend the statistical complexity's notion to simple configurations of non-interacting particles, without appeal to probabilities, and discuss configurational properties.

  15. Stochastics introduction to probability and statistics

    CERN Document Server

    Georgii, Hans-Otto

    2012-01-01

    This second revised and extended edition presents the fundamental ideas and results of both probability theory and statistics, and comprises the material of a one-year course. It is addressed to students with an interest in the mathematical side of stochastics. Stochastic concepts, models and methods are motivated by examples and developed and analysed systematically. Some measure theory is included, but this is done at an elementary level that is in accordance with the introductory character of the book. A large number of problems offer applications and supplements to the text.

  16. Introduction to probability and statistics for science, engineering, and finance

    CERN Document Server

    Rosenkrantz, Walter A

    2008-01-01

    Data Analysis Orientation The Role and Scope of Statistics in Science and Engineering Types of Data: Examples from Engineering, Public Health, and Finance The Frequency Distribution of a Variable Defined on a Population Quantiles of a Distribution Measures of Location (Central Value) and Variability Covariance, Correlation, and Regression: Computing a Stock's Beta Mathematical Details and Derivations Large Data Sets Probability Theory Orientation Sample Space, Events, Axioms of Probability Theory Mathematical Models of Random Sampling Conditional Probability and Baye

  17. Focus in High School Mathematics: Statistics and Probability

    Science.gov (United States)

    National Council of Teachers of Mathematics, 2009

    2009-01-01

    Reasoning about and making sense of statistics and probability are essential to students' future success. This volume belongs to a series that supports National Council of Teachers of Mathematics' (NCTM's) "Focus in High School Mathematics: Reasoning and Sense Making" by providing additional guidance for making reasoning and sense making part of…

  18. On Asymptotically Lacunary Statistical Equivalent Sequences of Order α in Probability

    Directory of Open Access Journals (Sweden)

    Işık Mahmut

    2017-01-01

    In this study, we introduce and examine the concepts of asymptotically lacunary statistical equivalence of order α in probability and strong asymptotically lacunary equivalence of order α in probability, and we give some relations connecting these concepts.

  19. Interpretation of the results of statistical measurements. [search for basic probability model

    Science.gov (United States)

    Olshevskiy, V. V.

    1973-01-01

    For random processes, the calculated probability characteristic and the measured statistical estimate are used in a quality functional that defines the difference between the two functions. Based on the assumption that the statistical measurement procedure is organized so that the parameters for a selected model are optimized, it is shown that the interpretation of experimental research is a search for a basic probability model.

  20. The exact probability distribution of the rank product statistics for replicated experiments.

    Science.gov (United States)

    Eisinga, Rob; Breitling, Rainer; Heskes, Tom

    2013-03-18

    The rank product method is a widely accepted technique for detecting differentially regulated genes in replicated microarray experiments. To approximate the sampling distribution of the rank product statistic, the original publication proposed a permutation approach, whereas recently an alternative approximation based on the continuous gamma distribution was suggested. However, both approximations are imperfect for estimating small tail probabilities. In this paper we relate the rank product statistic to number theory and provide a derivation of its exact probability distribution and the true tail probabilities. Copyright © 2013 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
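    For readers unfamiliar with the statistic itself, a minimal sketch follows. It computes the rank product for one gene and estimates a tail probability via the permutation approach that the abstract describes (and which the paper shows to be imperfect in the tails, motivating the exact derivation); the gene ranks and list size are invented:

```python
import math
import random

def rank_product(ranks):
    """Geometric mean of one gene's ranks across k replicates."""
    return math.prod(ranks) ** (1.0 / len(ranks))

def permutation_pvalue(observed_ranks, n_genes, n_perm=50_000, seed=0):
    """Estimate P(rank product <= observed) under the null that each
    replicate assigns the gene a uniform random rank in 1..n_genes."""
    rng = random.Random(seed)
    k = len(observed_ranks)
    obs = rank_product(observed_ranks)
    hits = sum(
        rank_product([rng.randint(1, n_genes) for _ in range(k)]) <= obs
        for _ in range(n_perm)
    )
    return hits / n_perm

# A gene ranked 5th, 12th and 8th out of 500 genes in three replicates:
p = permutation_pvalue([5, 12, 8], n_genes=500)
```

    For small tail probabilities like this one, the permutation estimate rests on very few (or zero) hits, which is precisely why an exact distribution is valuable.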

  1. Selected papers on analysis, probability, and statistics

    CERN Document Server

    Nomizu, Katsumi

    1994-01-01

    This book presents papers that originally appeared in the Japanese journal Sugaku. The papers fall into the general area of mathematical analysis as it pertains to probability and statistics, dynamical systems, differential equations and analytic function theory. Among the topics discussed are: stochastic differential equations, spectra of the Laplacian and Schrödinger operators, nonlinear partial differential equations which generate dissipative dynamical systems, fractal analysis on self-similar sets and the global structure of analytic functions.

  2. The Work Sample Verification and the Calculation of the Statistical, Mathematical and Economical Probability for the Risks of the Direct Procurement

    Directory of Open Access Journals (Sweden)

    Lazăr Cristiana Daniela

    2017-01-01

    Each organization has, among its multiple secondary endpoints subordinated to a central objective, that of avoiding contingencies. Direct procurement is carried out on the market in SEAP (Electronic System of Public Procurement), and performing management in a public institution also rests on risk management. The risks may be investigated by econometric simulation, which is calculated by the use of the calculus of probability and the work sample for determining the relevance of these probabilities.

  3. 8th International Conference on Soft Methods in Probability and Statistics

    CERN Document Server

    Giordani, Paolo; Vantaggi, Barbara; Gagolewski, Marek; Gil, María; Grzegorzewski, Przemysław; Hryniewicz, Olgierd

    2017-01-01

    This proceedings volume is a collection of peer-reviewed papers presented at the 8th International Conference on Soft Methods in Probability and Statistics (SMPS 2016), held in Rome (Italy). The book is dedicated to data science, which aims at developing automated methods to analyze massive amounts of data and to extract knowledge from them. It shows how data science employs various programming techniques and methods of data wrangling, data visualization, machine learning, probability and statistics. The soft methods proposed in this volume represent a collection of tools in these fields that can also be useful for data science.

  4. New exponential, logarithm and q-probability in the non-extensive statistical physics

    OpenAIRE

    Chung, Won Sang

    2013-01-01

    In this paper, a new exponential and logarithm related to non-extensive statistical physics are proposed using the q-sum and q-product, which satisfy distributivity. We also discuss the q-mapping from an ordinary probability to a q-probability. The q-entropy defined by the idea of q-probability is shown to be q-additive.

  5. Probability, statistics, and associated computing techniques

    International Nuclear Information System (INIS)

    James, F.

    1983-01-01

    This chapter attempts to explore the extent to which it is possible for the experimental physicist to find optimal statistical techniques to provide a unique and unambiguous quantitative measure of the significance of raw data. Discusses statistics as the inverse of probability; normal theory of parameter estimation; normal theory (Gaussian measurements); the universality of the Gaussian distribution; real-life resolution functions; combination and propagation of uncertainties; the sum or difference of 2 variables; local theory, or the propagation of small errors; error on the ratio of 2 discrete variables; the propagation of large errors; confidence intervals; classical theory; Bayesian theory; use of the likelihood function; the second derivative of the log-likelihood function; multiparameter confidence intervals; the method of MINOS; least squares; the Gauss-Markov theorem; maximum likelihood for uniform error distribution; the Chebyshev fit; the parameter uncertainties; the efficiency of the Chebyshev estimator; error symmetrization; robustness vs. efficiency; testing of hypotheses (e.g., the Neyman-Pearson test); goodness-of-fit; distribution-free tests; comparing two one-dimensional distributions; comparing multidimensional distributions; and permutation tests for comparing two point sets

  6. Numerical comparison of improved methods of testing in contingency tables with small frequencies

    Energy Technology Data Exchange (ETDEWEB)

    Sugiura, Nariaki; Otake, Masanori

    1968-11-14

    The significance levels of various tests for a general c x k contingency table are usually given by large-sample theory, but they are not accurate for tables with small frequencies. In this paper, a numerical evaluation was made to determine how good the approximation of the significance level is for various improved tests developed by Nass, Yoshimura, Gart, and others for c x k contingency tables with small frequencies in some cells. For this purpose we compared the significance levels of the various approximate methods (i) with those of the one-sided tail defined in terms of exact probabilities for given marginals in a 2 x 2 table, and (ii) with those of exact probabilities accumulated in order of magnitude of the chi-square statistic or likelihood ratio (LR) statistic in the 2 x 3 table discussed by Yates. For the 2 x 2 table it is well known that Yates' correction gives satisfactory results for small cell frequencies, and other methods not referred to here can be considered if attention is restricted to 2 x 2 or 2 x k tables. We are mainly interested, however, in comparing methods applicable to a general c x k table. It appears that such a comparison of the various improved methods on the same examples has not been made explicitly, even though these tests are frequently used in biological and medical research. 9 references, 6 figures, 6 tables.
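    The benchmark used in comparison (i) above, the exact one-sided tail for a 2 x 2 table with fixed marginals, is a short hypergeometric sum. A standard-library sketch, with an illustrative table rather than one from the paper:

```python
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    """Exact P(A >= a) for the 2x2 table [[a, b], [c, d]], conditioning
    on the row and column marginals (hypergeometric distribution)."""
    row1, row2 = a + b, c + d
    col1, n = a + c, a + b + c + d
    denom = comb(n, col1)
    return sum(comb(row1, x) * comb(row2, col1 - x)
               for x in range(a, min(row1, col1) + 1)) / denom

# Table [[3, 1], [1, 3]]: exact one-sided tail probability is 17/70
print(fisher_exact_one_sided(3, 1, 1, 3))  # 0.24285714...
```

    Approximate tests for larger c x k tables can then be judged by how closely their nominal significance levels track such exact tail probabilities.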

  7. Academic Training Lecture | Practical Statistics for LHC Physicists: Descriptive Statistics, Probability and Likelihood | 7-9 April

    CERN Multimedia

    2015-01-01

    Please note that our next series of Academic Training Lectures will take place on 7, 8 and 9 April 2015: Practical Statistics for LHC Physicists: Descriptive Statistics, Probability and Likelihood, by Harrison Prosper, Florida State University, USA, from 11.00 a.m. to 12.00 p.m. in the Council Chamber (503-1-001). https://indico.cern.ch/event/358542/

  8. Statistical validation of normal tissue complication probability models.

    Science.gov (United States)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.

  9. Statistical Validation of Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Veld, Aart A. van' t; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)

    2012-09-01

    Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.
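    As an illustration of the permutation-testing idea recommended in these two records, a minimal label-permutation sketch follows. The separation score and the data are invented; a real NTCP validation would permute the observed complications against the model's predictions inside the cross-validation loop:

```python
import random

def label_permutation_pvalue(scores, labels, n_perm=10_000, seed=1):
    """Permutation p-value for whether predicted scores separate the
    binary labels better than chance. Separation = mean score of the
    1-labeled cases minus mean score of the 0-labeled cases."""
    def separation(labs):
        pos = [s for s, y in zip(scores, labs) if y == 1]
        neg = [s for s, y in zip(scores, labs) if y == 0]
        return sum(pos) / len(pos) - sum(neg) / len(neg)

    rng = random.Random(seed)
    observed = separation(labels)
    perm = list(labels)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(perm)
        if separation(perm) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # smoothed to avoid a zero p-value

# Hypothetical NTCP predictions for six patients, three with complications:
p = label_permutation_pvalue([0.9, 0.8, 0.85, 0.1, 0.2, 0.15],
                             [1, 1, 1, 0, 0, 0])
```

    The permutation distribution directly answers whether the model's apparent performance could have arisen by chance, which is the statistical significance question the abstract raises.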

  10. A new expression of the probability distribution in Incomplete Statistics and fundamental thermodynamic relations

    International Nuclear Information System (INIS)

    Huang, Zhifu; Lin, Bihong; Chen, Jincan

    2009-01-01

    In order to overcome the limitations of the original expression of the probability distribution appearing in the literature on Incomplete Statistics, a new expression of the probability distribution is derived, where the Lagrange multiplier β introduced here is proved to be identical with that introduced in the second and third choices for the internal energy constraint in Tsallis' statistics, and to be just equal to the physical inverse temperature. It is expounded that the probability distribution described by the new expression is invariant under uniform translation of the energy spectrum. Moreover, several fundamental thermodynamic relations are given, and the relationship between the new and the original expressions of the probability distribution is discussed.

  11. Subjective Probabilities for State-Dependent Continuous Utility

    NARCIS (Netherlands)

    P.P. Wakker (Peter)

    1987-01-01

    For the expected utility model with state dependent utilities, Karni, Schmeidler and Vind (1983) have shown how to recover uniquely the involved subjective probabilities if the preferences, contingent on a hypothetical probability distribution over the state space, are known. This they

  12. Contingent valuation and incentives

    Science.gov (United States)

    Patricia A. Champ; Nicholas E. Flores; Thomas C. Brown; James Chivers

    2002-01-01

    We empirically investigate the effect of the payment mechanism on contingent values by asking a willingness-to-pay question with one of three different payment mechanisms: individual contribution, contribution with provision point, and referendum. We find statistical evidence of more affirmative responses in the referendum treatment relative to the individual...

  13. Statistical physics of pairwise probability models

    Directory of Open Access Journals (Sweden)

    Yasser Roudi

    2009-11-01

    Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the means and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying and using pairwise models. We build on our previous work on the subject and study the relation between different methods for fitting these models and evaluating their quality. In particular, using data from simulated cortical networks we study how the quality of various approximate methods for inferring the parameters in a pairwise model depends on the time bin chosen for binning the data. We also study the effect of the size of the time bin on the model quality itself, again using simulated data. We show that using finer time bins increases the quality of the pairwise model. We offer new ways of deriving the expressions reported in our previous work for assessing the quality of pairwise models.

  14. Applications of Statistics and Probability in Civil Engineering

    DEFF Research Database (Denmark)

    Faber, Michael Havbro

    contains the proceedings of the 11th International Conference on Applications of Statistics and Probability in Civil Engineering (ICASP11, Zürich, Switzerland, 1-4 August 2011). The book focuses not only on the more traditional technical issues, but also emphasizes the societal context...... and reliability in engineering; to professionals and engineers, including insurance and consulting companies working with natural hazards, design, operation and maintenance of civil engineering and industrial facilities; and to decision makers and professionals in the public sector, including nongovernmental...

  15. Poisson statistics of PageRank probabilities of Twitter and Wikipedia networks

    Science.gov (United States)

    Frahm, Klaus M.; Shepelyansky, Dima L.

    2014-04-01

    We use the methods of quantum chaos and Random Matrix Theory for analysis of statistical fluctuations of PageRank probabilities in directed networks. In this approach the effective energy levels are given by a logarithm of PageRank probability at a given node. After the standard energy level unfolding procedure we establish that the nearest spacing distribution of PageRank probabilities is described by the Poisson law typical for integrable quantum systems. Our studies are done for the Twitter network and three networks of Wikipedia editions in English, French and German. We argue that due to absence of level repulsion the PageRank order of nearby nodes can be easily interchanged. The obtained Poisson law implies that the nearby PageRank probabilities fluctuate as random independent variables.

  16. A Brief Look at the History of Probability and Statistics.

    Science.gov (United States)

    Lightner, James E.

    1991-01-01

    The historical development of probability theory is traced from its early origins in games of chance through its mathematical foundations in the work of Pascal and Fermat. The roots of statistics are also presented beginning with early actuarial developments through the work of Laplace, Gauss, and others. (MDH)

  17. Quantum probability, choice in large worlds, and the statistical structure of reality.

    Science.gov (United States)

    Ross, Don; Ladyman, James

    2013-06-01

    Classical probability models of incentive response are inadequate in "large worlds," where the dimensions of relative risk and the dimensions of similarity in outcome comparisons typically differ. Quantum probability models for choice in large worlds may be motivated pragmatically - there is no third theory - or metaphysically: statistical processing in the brain adapts to the true scale-relative structure of the universe.

  18. STATLIB, Interactive Statistics Program Library of Tutorial System

    International Nuclear Information System (INIS)

    Anderson, H.E.

    1986-01-01

    1 - Description of program or function: STATLIB is a conversational statistical program library developed in conjunction with a Sandia National Laboratories applied statistics course intended for practicing engineers and scientists. STATLIB is a group of 15 interactive, argument-free, statistical routines. Included are analysis of sensitivity tests; sample statistics for the normal, exponential, hypergeometric, Weibull, and extreme value distributions; three models of multiple regression analysis; x-y data plots; exact probabilities for RxC tables; n sets of m permuted integers in the range 1 to m; simple linear regression and correlation; K different random integers in the range m to n; and Fisher's exact test of independence for a 2 by 2 contingency table. Forty-five other subroutines in the library support the basic 15

  19. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    Science.gov (United States)

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. The report describes the development and validation studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single-collaborator and multicollaborative study examples are given.

  20. Teaching Probability with the Support of the R Statistical Software

    Science.gov (United States)

    dos Santos Ferreira, Robson; Kataoka, Verônica Yumi; Karrer, Monica

    2014-01-01

    The objective of this paper is to discuss aspects of high school students' learning of probability in a context where they are supported by the statistical software R. We report on the application of a teaching experiment, constructed using the perspective of Gal's probabilistic literacy and Papert's constructionism. The results show improvement…

  1. Lectures on probability and statistics. Revision

    International Nuclear Information System (INIS)

    Yost, G.P.

    1985-06-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. They begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. They finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another. Hopefully, the reader will come away from these notes with a feel for some of the problems and uncertainties involved. Although there are standard approaches, most of the time there is no cut-and-dried "best" solution, "best" according to every criterion.

  2. Contingent Commitments: Bringing Part-Time Faculty into Focus. Methodology Supplement

    Science.gov (United States)

    Center for Community College Student Engagement, 2014

    2014-01-01

    Center reporting prior to 2013 focused primarily on descriptive statistics (frequencies and means) of student and faculty behaviors. The goal of the analyses reported here and in "Contingent Commitments: Bringing Part-Time Faculty into Focus" is to understand the engagement of part-time or contingent faculty in various activities that…

  3. Ergodic theory, interpretations of probability and the foundations of statistical mechanics

    NARCIS (Netherlands)

    van Lith, J.H.

    2001-01-01

    The traditional use of ergodic theory in the foundations of equilibrium statistical mechanics is that it provides a link between thermodynamic observables and microcanonical probabilities. First of all, the ergodic theorem demonstrates the equality of microcanonical phase averages and infinite time

  4. A contingency table approach to nonparametric testing

    CERN Document Server

    Rayner, JCW

    2000-01-01

    Most texts on nonparametric techniques concentrate on location and linear-linear (correlation) tests, with less emphasis on dispersion effects and linear-quadratic tests. Tests for higher moment effects are virtually ignored. Using a fresh approach, A Contingency Table Approach to Nonparametric Testing unifies and extends the popular, standard tests by linking them to tests based on models for data that can be presented in contingency tables.This approach unifies popular nonparametric statistical inference and makes the traditional, most commonly performed nonparametric analyses much more comp

  5. Learning predictive statistics from temporal sequences: Dynamics and strategies.

    Science.gov (United States)

    Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E; Kourtzi, Zoe

    2017-10-01

    Human behavior is guided by our expectations about the future. Often, we make predictions by monitoring how event sequences unfold, even though such sequences may appear incomprehensible. Event structures in the natural environment typically vary in complexity, from simple repetition to complex probabilistic combinations. How do we learn these structures? Here we investigate the dynamics of structure learning by tracking human responses to temporal sequences that change in structure unbeknownst to the participants. Participants were asked to predict the upcoming item following a probabilistic sequence of symbols. Using a Markov process, we created a family of sequences, from simple frequency statistics (e.g., some symbols are more probable than others) to context-based statistics (e.g., symbol probability is contingent on preceding symbols). We demonstrate the dynamics with which individuals adapt to changes in the environment's statistics; that is, they extract the behaviorally relevant structures to make predictions about upcoming events. Further, we show that this structure learning relates to individual decision strategy; faster learning of complex structures relates to selection of the most probable outcome in a given context (maximizing) rather than matching of the exact sequence statistics. Our findings provide evidence for alternate routes to learning of behaviorally relevant statistics that facilitate our ability to predict future events in variable environments.
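    A toy version of the context-based condition can make the maximizing strategy concrete. The sketch below generates a first-order Markov sequence and scores a predictor that always picks the most probable successor; the transition probabilities are invented for illustration, not taken from the study:

```python
import random

TRANSITIONS = {            # P(next symbol | current symbol)
    "A": {"A": 0.1, "B": 0.8, "C": 0.1},
    "B": {"A": 0.2, "B": 0.1, "C": 0.7},
    "C": {"A": 0.9, "B": 0.05, "C": 0.05},
}

def generate(n, start="A", seed=0):
    """Generate a length-n symbol sequence from the Markov source."""
    rng = random.Random(seed)
    seq = [start]
    for _ in range(n - 1):
        nxt = TRANSITIONS[seq[-1]]
        seq.append(rng.choices(list(nxt), weights=list(nxt.values()))[0])
    return seq

def maximize(current):
    """'Maximizing' strategy: always predict the most likely successor."""
    return max(TRANSITIONS[current], key=TRANSITIONS[current].get)

seq = generate(1000)
accuracy = sum(maximize(a) == b
               for a, b in zip(seq, seq[1:])) / (len(seq) - 1)
```

    Maximizing caps the expected hit rate at the average of the largest transition probabilities, whereas probability matching (sampling predictions in proportion to the transition probabilities) yields a strictly lower expected hit rate, which is the behavioral distinction the study draws.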

  6. Statistics and Probability at Secondary Schools in the Federal State of Salzburg: An Empirical Study

    Directory of Open Access Journals (Sweden)

    Wolfgang Voit

    2014-12-01

    Knowledge about the practical use of statistics and probability in today's mathematics instruction at secondary schools is vital in order to improve the academic education of future teachers. We have conducted an empirical study among school teachers to inform improvements in mathematics instruction and teacher preparation. The study provides a snapshot of the daily practice of instruction at school. Centered on the following four questions, the status of statistics and probability was examined. Where did the current mathematics teachers study? What relevance do statistics and probability have in school? Which contents are actually taught in class? What kind of continuing education would be desirable for teachers? The study population consisted of all teachers of mathematics at secondary schools in the federal state of Salzburg.

  7. Bayes factor and posterior probability: Complementary statistical evidence to p-value.

    Science.gov (United States)

    Lin, Ruitao; Yin, Guosheng

    2015-09-01

    As a convention, a p-value is often computed in hypothesis testing and compared with the nominal level of 0.05 to determine whether to reject the null hypothesis. Although the smaller the p-value, the more significant the statistical test, it is difficult to perceive the p-value on a probability scale and quantify it as the strength of the data against the null hypothesis. In contrast, the Bayesian posterior probability of the null hypothesis has an explicit interpretation of how strongly the data support the null. We make a comparison of the p-value and the posterior probability by considering a recent clinical trial. The results show that even when we reject the null hypothesis, there is still a substantial probability (around 20%) that the null is true. Not only should we examine whether the data would have rarely occurred under the null hypothesis, but we also need to know whether the data would be rare under the alternative. As a result, the p-value only provides one side of the information, for which the Bayes factor and posterior probability may offer complementary evidence. Copyright © 2015 Elsevier Inc. All rights reserved.
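    A small worked contrast in the spirit of this abstract, using a binomial experiment (the numbers are illustrative, not the clinical trial analyzed in the paper). The null is p = 0.5; under the alternative, p is uniform on (0, 1); prior odds are 1:1:

```python
from math import comb, erf, sqrt

def binomial_pvalue(n, k):
    """Two-sided p-value via the normal approximation to Binomial(n, 1/2)."""
    z = abs(k - n / 2) / sqrt(n / 4)
    return 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))

def posterior_prob_null(n, k):
    """P(H0 | data) with a Uniform(0,1) prior under H1 and prior odds 1:1."""
    marginal_h0 = comb(n, k) * 0.5 ** n   # P(data | p = 1/2)
    marginal_h1 = 1 / (n + 1)             # integral of C(n,k) p^k (1-p)^(n-k)
    bf01 = marginal_h0 / marginal_h1      # Bayes factor in favor of the null
    return bf01 / (1 + bf01)

n, k = 200, 115
print(binomial_pvalue(n, k), posterior_prob_null(n, k))
```

    With these illustrative numbers the approximate two-sided p-value falls below 0.05, yet the posterior probability of the null remains near one half: the two summaries can diverge sharply, which is the point the abstract makes.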

  8. Statistical analysis on failure-to-open/close probability of motor-operated valve in sodium system

    International Nuclear Information System (INIS)

    Kurisaka, Kenichi

    1998-08-01

    The objective of this work is to develop basic data for examining the efficiency of preventive maintenance and actuation tests from the standpoint of failure probability. This work consists of a statistical trend analysis of valve failure probability in a failure-to-open/close mode, as a function of time since installation and time since the last open/close action, based on field data of operating and failure experience. In this work, terms both dependent on and independent of time were considered in the failure probability. The linear aging model was modified and applied to the former. In this model there are two terms, with failure rates proportional to time since installation and to time since the last open/close demand. Because of their sufficient statistical population, motor-operated valves (MOVs) in sodium systems were selected for analysis from the CORDS database, which contains operating data and failure data of components in fast reactors and sodium test facilities. According to these data, the functional parameters were statistically estimated to quantify the valve failure probability in a failure-to-open/close mode, with consideration of uncertainty. (J.P.N.)
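    The two-term form described in this abstract can be sketched directly. The functional shape follows the description (a constant term plus terms proportional to the two elapsed times); the parameter values below are hypothetical, not the CORDS estimates:

```python
def demand_failure_probability(t_installed, t_since_demand,
                               p0=1e-4, a=1e-9, b=1e-7):
    """Failure-to-open/close probability at a demand: a constant term
    plus terms proportional to hours since installation (t_installed)
    and hours since the last open/close demand (t_since_demand)."""
    return p0 + a * t_installed + b * t_since_demand

# A valve installed 5 years ago and last exercised 6 months ago:
p = demand_failure_probability(5 * 8760, 0.5 * 8760)
print(p)  # about 5.8e-4
```

    Under such a model, regular actuation tests reset the second term, which is exactly the trade-off (test burden versus demand failure probability) the basic data are meant to inform.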

  9. Dynamic Graphics in Excel for Teaching Statistics: Understanding the Probability Density Function

    Science.gov (United States)

    Coll-Serrano, Vicente; Blasco-Blasco, Olga; Alvarez-Jareno, Jose A.

    2011-01-01

    In this article, we show a dynamic graphic in Excel that is used to introduce an important concept in our subject, Statistics I: the probability density function. This interactive graphic seeks to facilitate conceptual understanding of the main aspects analysed by the learners.

  10. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses.

    Science.gov (United States)

    Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas

    2008-01-01

    Technology-based instruction represents a recent pedagogical paradigm rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational, and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools for communication, engagement, experimentation, critical thinking, and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction, and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment per individual

  11. A statistical model for investigating binding probabilities of DNA nucleotide sequences using microarrays.

    Science.gov (United States)

    Lee, Mei-Ling Ting; Bulyk, Martha L; Whitmore, G A; Church, George M

    2002-12-01

    There is considerable scientific interest in knowing the probability that a site-specific transcription factor will bind to a given DNA sequence. Microarray methods provide an effective means for assessing the binding affinities of a large number of DNA sequences as demonstrated by Bulyk et al. (2001, Proceedings of the National Academy of Sciences, USA 98, 7158-7163) in their study of the DNA-binding specificities of Zif268 zinc fingers using microarray technology. In a follow-up investigation, Bulyk, Johnson, and Church (2002, Nucleic Acid Research 30, 1255-1261) studied the interdependence of nucleotides on the binding affinities of transcription proteins. Our article is motivated by this pair of studies. We present a general statistical methodology for analyzing microarray intensity measurements reflecting DNA-protein interactions. The log probability of a protein binding to a DNA sequence on an array is modeled using a linear ANOVA model. This model is convenient because it employs familiar statistical concepts and procedures and also because it is effective for investigating the probability structure of the binding mechanism.

  12. Temporal contingency.

    Science.gov (United States)

    Gallistel, C R; Craig, Andrew R; Shahan, Timothy A

    2014-01-01

    Contingency, and more particularly temporal contingency, has often figured in thinking about the nature of learning. However, it has never been formally defined in such a way as to make it a measure that can be applied to most animal learning protocols. We use elementary information theory to define contingency in such a way as to make it a measurable property of almost any conditioning protocol. We discuss how making it a measurable construct enables the exploration of the role of different contingencies in the acquisition and performance of classically and operantly conditioned behavior. Copyright © 2013 Elsevier B.V. All rights reserved.
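The information-theoretic flavor of this definition can be sketched with a toy computation: the mutual information between responses and reinforcers tabulated in a 2x2 contingency table. This is only an illustration of contingency as shared information, not the paper's temporal definition, which operates on event timing rather than counts:

```python
import math

def mutual_information(table):
    """Mutual information (bits) between the row and column variables of a
    2x2 contingency table, e.g. response x reinforcement counts. Zero means
    no contingency; the maximum is reached when one variable fully
    determines the other."""
    total = sum(sum(row) for row in table)
    row_m = [sum(row) / total for row in table]
    col_m = [sum(table[i][j] for i in range(2)) / total for j in range(2)]
    mi = 0.0
    for i in range(2):
        for j in range(2):
            p = table[i][j] / total
            if p > 0:
                mi += p * math.log2(p / (row_m[i] * col_m[j]))
    return mi

print(mutual_information([[50, 0], [0, 50]]))    # → 1.0 (perfect contingency)
print(mutual_information([[25, 25], [25, 25]]))  # → 0.0 (no contingency)
```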

  13. Temporal contingency

    Science.gov (United States)

    Gallistel, C.R.; Craig, Andrew R.; Shahan, Timothy A.

    2015-01-01

    Contingency, and more particularly temporal contingency, has often figured in thinking about the nature of learning. However, it has never been formally defined in such a way as to make it a measure that can be applied to most animal learning protocols. We use elementary information theory to define contingency in such a way as to make it a measurable property of almost any conditioning protocol. We discuss how making it a measurable construct enables the exploration of the role of different contingencies in the acquisition and performance of classically and operantly conditioned behavior. PMID:23994260

  14. Pairwise contact energy statistical potentials can help to find probability of point mutations.

    Science.gov (United States)

    Saravanan, K M; Suvaithenamudhan, S; Parthasarathy, S; Selvaraj, S

    2017-01-01

    To adopt a particular fold, a protein requires several interactions between its amino acid residues. The energetic contribution of these residue-residue interactions can be approximated by extracting statistical potentials from known high-resolution structures. Several methods based on statistical potentials extracted from unrelated proteins are found to make a better prediction of the probability of point mutations. We postulate that statistical potentials extracted from known structures of similar folds with varying sequence identity can be a powerful tool to examine the probability of point mutation. With this in mind, we have derived pairwise residue and atomic contact energy potentials for the different functional families that adopt the (α/β)8 TIM-barrel fold. We carried out computational point mutations at various conserved residue positions in the yeast triose phosphate isomerase enzyme, for which experimental results have already been reported. We also performed molecular dynamics simulations on a subset of point mutants to make a comparative study. The difference in pairwise residue and atomic contact energy between the wildtype and various point mutations reveals the probability of mutation at a particular position. Interestingly, we found that our computational prediction agrees with the experimental studies of Silverman et al. (Proc Natl Acad Sci 2001;98:3092-3097) and performs better than I-Mutant and the Cologne University Protein Stability Analysis Tool. The present work thus suggests that deriving pairwise contact energy potentials and molecular dynamics simulations of functionally important folds could help us predict the probability of point mutations, which may ultimately reduce the time and cost of mutation experiments. Proteins 2016; 85:54-64. © 2016 Wiley Periodicals, Inc.

  15. On Dobrushin's way from probability theory to statistical physics

    CERN Document Server

    Minlos, R A; Suhov, Yu M; Suhov, Yu

    2000-01-01

    R. Dobrushin worked in several branches of mathematics (probability theory, information theory), but his deepest influence was on mathematical physics. He was one of the founders of the rigorous study of statistical physics. When Dobrushin began working in that direction in the early sixties, only a few people worldwide were thinking along the same lines. Now there is an army of researchers in the field. This collection is devoted to the memory of R. L. Dobrushin. The authors who contributed to this collection knew him quite well and were his colleagues. The title, "On Dobrushin's Way", is mea

  16. A statistical analysis on failure-to open/close probability of pneumatic valve in sodium cooling systems

    International Nuclear Information System (INIS)

    Kurisaka, Kenichi

    1999-11-01

    The objective of this study is to develop fundamental data for examining the efficiency of preventive maintenance and surveillance tests from the standpoint of failure probability. In this study, a pneumatic valve in sodium cooling systems was selected as a major standby component. A statistical analysis was made of the trend of valve failure-to-open/close (FTOC) probability depending on the number of demands ('n'), time since installation ('t'), and standby time since the last open/close action ('T'). The analysis is based on field data of operating and failure experience stored in the Component Reliability Database and Statistical Analysis System for LMFBR's (CORDS). In the analysis, the FTOC probability ('P') was expressed as follows: P = 1 - exp{-C - En - F/n - λT - aT(t - T/2) - AT²/2}. The functional parameters 'C', 'E', 'F', 'λ', 'a' and 'A' were estimated with the maximum likelihood estimation method. As a result, the FTOC probability is well expressed by the failure probability derived from the failure rate under the assumption of a Poisson distribution only when the valve cycle (i.e., open-close-open cycle) exceeds about 100 days. When the valve cycle is shorter than about 100 days, the FTOC probability can be adequately estimated with the parametric model proposed in this study. The results obtained from this study may make it possible to derive an adequate frequency of surveillance tests for a given target of the FTOC probability. (author)
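The parametric FTOC model quoted in the abstract is straightforward to evaluate once the six parameters have been fitted. A minimal sketch (the parameter values below are hypothetical placeholders, not the CORDS maximum likelihood estimates):

```python
import math

def ftoc_probability(n, t, T, C, E, F, lam, a, A):
    """Failure-to-open/close probability from the parametric model in the
    abstract: P = 1 - exp{-C - E*n - F/n - lam*T - a*T*(t - T/2) - A*T^2/2}.
    n: number of demands (n >= 1), t: time since installation,
    T: standby time since the last open/close action."""
    exponent = C + E * n + F / n + lam * T + a * T * (t - T / 2) + A * T**2 / 2
    return 1.0 - math.exp(-exponent)

# Hypothetical parameter values, for illustration only:
p = ftoc_probability(n=5, t=365.0, T=30.0,
                     C=1e-3, E=2e-4, F=5e-4, lam=1e-4,
                     a=1e-7, A=1e-6)
print(f"FTOC probability: {p:.5f}")
```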

  17. Joint probability of statistical success of multiple phase III trials.

    Science.gov (United States)

    Zhang, Jianliang; Zhang, Jenny J

    2013-01-01

    In drug development, after completion of phase II proof-of-concept trials, the sponsor needs to make a go/no-go decision to start expensive phase III trials. The probability of statistical success (PoSS) of the phase III trials based on data from earlier studies is an important factor in that decision-making process. Instead of statistical power, the predictive power of a phase III trial, which takes into account the uncertainty in the estimation of treatment effect from earlier studies, has been proposed to evaluate the PoSS of a single trial. However, regulatory authorities generally require statistical significance in two (or more) trials for marketing licensure. We show that the predictive statistics of two future trials are statistically correlated through use of the common observed data from earlier studies. Thus, the joint predictive power should not be evaluated as a simplistic product of the predictive powers of the individual trials. We develop the relevant formulae for the appropriate evaluation of the joint predictive power and provide numerical examples. Our methodology is further extended to the more complex phase III development scenario comprising more than two (K > 2) trials, that is, the evaluation of the PoSS of at least k₀ (k₀≤ K) trials from a program of K total trials. Copyright © 2013 John Wiley & Sons, Ltd.
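The correlation between the predictive statistics of the two trials can be seen in a small Monte Carlo sketch (all numbers hypothetical): the true effect is drawn once from its phase II posterior and shared by both simulated phase III trials, so the joint success probability exceeds the naive product of the individual predictive powers:

```python
import random
from math import sqrt

random.seed(1)

# Hypothetical phase II posterior for the standardized treatment effect:
theta_hat, se2 = 0.3, 0.12
n_per_arm = 250                 # hypothetical phase III sample size per arm
se3 = sqrt(2 / n_per_arm)       # SE of the effect estimate in each phase III trial
z_crit = 1.96                   # significance threshold on the z-scale

def simulate(n_sims=100_000):
    """Estimate marginal and joint probabilities of statistical success."""
    t1 = t2 = both = 0
    for _ in range(n_sims):
        theta = random.gauss(theta_hat, se2)   # shared phase II uncertainty
        z1 = random.gauss(theta, se3) / se3    # trial 1: independent sampling error
        z2 = random.gauss(theta, se3) / se3    # trial 2: independent sampling error
        s1, s2 = z1 > z_crit, z2 > z_crit
        t1 += s1
        t2 += s2
        both += s1 and s2
    return t1 / n_sims, t2 / n_sims, both / n_sims

p1, p2, p_joint = simulate()
print(f"marginal predictive powers: {p1:.3f}, {p2:.3f}")
print(f"joint PoSS: {p_joint:.3f}  vs naive product: {p1 * p2:.3f}")
```

Because both trials condition on the same draw of theta, successes are positively correlated and the joint probability is larger than the product of the marginals, which is the abstract's central point.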

  18. A scan statistic for continuous data based on the normal probability model

    Directory of Open Access Journals (Sweden)

    Huang Lan

    2009-10-01

    Full Text Available Abstract Temporal, spatial and space-time scan statistics are commonly used to detect and evaluate the statistical significance of temporal and/or geographical disease clusters, without any prior assumptions on the location, time period or size of those clusters. Scan statistics are mostly used for count data, such as disease incidence or mortality. Sometimes there is an interest in looking for clusters with respect to a continuous variable, such as lead levels in children or low birth weight. For such continuous data, we present a scan statistic where the likelihood is calculated using the normal probability model. It may also be used for other distributions, while still maintaining the correct alpha level. In an application of the new method, we look for geographical clusters of low birth weight in New York City.

  19. Probability density cloud as a geometrical tool to describe statistics of scattered light.

    Science.gov (United States)

    Yaitskova, Natalia

    2017-04-01

    First-order statistics of scattered light is described using the representation of the probability density cloud, which visualizes a two-dimensional distribution for complex amplitude. The geometric parameters of the cloud are studied in detail and are connected to the statistical properties of phase. The moment-generating function for intensity is obtained in a closed form through these parameters. An example of exponentially modified normal distribution is provided to illustrate the functioning of this geometrical approach.

  20. Probability cueing of distractor locations: both intertrial facilitation and statistical learning mediate interference reduction.

    Science.gov (United States)

    Goschy, Harriet; Bakos, Sarolta; Müller, Hermann J; Zehetleitner, Michael

    2014-01-01

    Targets in a visual search task are detected faster if they appear in a probable target region as compared to a less probable target region, an effect which has been termed "probability cueing." The present study investigated whether probability cueing can not only speed up target detection, but also minimize distraction by distractors in probable distractor regions as compared to distractors in less probable distractor regions. To this end, three visual search experiments with a salient, but task-irrelevant, distractor ("additional singleton") were conducted. Experiment 1 demonstrated that observers can utilize uneven spatial distractor distributions to selectively reduce interference by distractors in frequent distractor regions as compared to distractors in rare distractor regions. Experiments 2 and 3 showed that intertrial facilitation, i.e., distractor position repetitions, and statistical learning (independent of distractor position repetitions) both contribute to the probability cueing effect for distractor locations. Taken together, the present results demonstrate that probability cueing of distractor locations has the potential to serve as a strong attentional cue for the shielding of likely distractor locations.

  1. Four hundred or more participants needed for stable contingency table estimates of clinical prediction rule performance

    DEFF Research Database (Denmark)

    Kent, Peter; Boyle, Eleanor; Keating, Jennifer L

    2017-01-01

    OBJECTIVE: To quantify variability in the results of statistical analyses based on contingency tables and discuss the implications for the choice of sample size for studies that derive clinical prediction rules. STUDY DESIGN AND SETTING: An analysis of three pre-existing sets of large cohort data......, odds ratios and risk/prevalence ratios, for each sample size was calculated. RESULTS: There were very wide, and statistically significant, differences in estimates derived from contingency tables from the same dataset when calculated in sample sizes below 400 people, and typically this variability...... stabilized in samples of 400 to 600 people. Although estimates of prevalence also varied significantly in samples below 600 people, that relationship only explains a small component of the variability in these statistical parameters. CONCLUSION: To reduce sample-specific variability, contingency tables...

  2. Schema bias in source monitoring varies with encoding conditions: support for a probability-matching account.

    Science.gov (United States)

    Kuhlmann, Beatrice G; Vaterrodt, Bianca; Bayen, Ute J

    2012-09-01

    Two experiments examined reliance on schematic knowledge in source monitoring. Based on a probability-matching account of source guessing, a schema bias will only emerge if participants do not have a representation of the source-item contingency in the study list, or if the perceived contingency is consistent with schematic expectations. Thus, the account predicts that encoding conditions that affect contingency detection also affect schema bias. In Experiment 1, the schema bias commonly found when schematic information about the sources is not provided before encoding was diminished by an intentional source-memory instruction. In Experiment 2, the depth of processing of schema-consistent and schema-inconsistent source-item pairings was manipulated. Participants consequently overestimated the occurrence of the pairing type they processed in a deep manner, and their source guessing reflected this biased contingency perception. Results support the probability-matching account of source guessing. PsycINFO Database Record (c) 2012 APA, all rights reserved.

  3. Changes in reward contingency modulate the trial-to-trial variability of hippocampal place cells.

    Science.gov (United States)

    Wikenheiser, Andrew M; Redish, A David

    2011-08-01

    Pyramidal cells in the rodent hippocampus often exhibit clear spatial tuning. Theories of hippocampal function suggest that these "place cells" implement multiple, independent neural representations of position (maps), based on different reference frames or environmental features. Consistent with the "multiple maps" theory, previous studies have shown that manipulating spatial factors related to task performance modulates the within-session variability (overdispersion) of cells in the hippocampus. However, the influence of changes in reward contingency on overdispersion has not been examined. To test this, we first trained rats to collect food from three feeders positioned around a circular track (task 1). When subjects were proficient, the reward contingency was altered such that every other feeder delivered food (task 2). We recorded ensembles of hippocampal neurons as rats performed both tasks. Place cell overdispersion was high during task 1 but decreased significantly during task 2, and this increased reliability could not be accounted for by changes in running speed or familiarity with the task. Intuitively, decreased variability might be expected to improve neural representations of position. To test this, we used Bayesian decoding of hippocampal spike trains to estimate subjects' location. Neither the amount of probability decoded to subjects' position (local probability) nor the difference between estimated position and true location (decoding accuracy) differed between tasks. However, we found that hippocampal ensembles were significantly more self-consistent during task 2 performance. These results suggest that changes in task demands can affect the firing statistics of hippocampal neurons, leading to changes in the properties of decoded neural representations.
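Bayesian decoding of position from place cell spike trains is commonly implemented with an independent Poisson model and a flat spatial prior, in the style of Zhang et al. (1998). A hedged sketch with synthetic tuning curves; the cell counts, rates, and window length below are hypothetical, not the recordings from this study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 20 place cells with circular-Gaussian tuning on a
# circular track discretized into 60 position bins.
n_cells, n_bins, dt = 20, 60, 0.25                 # dt: decoding window (s)
positions = np.linspace(0, 2 * np.pi, n_bins, endpoint=False)
centers = rng.uniform(0, 2 * np.pi, n_cells)
baseline, peak_rate, width = 0.5, 15.0, 0.5        # Hz, Hz, radians

def tuning(pos):
    """Expected firing rate of every cell at position pos."""
    d = np.angle(np.exp(1j * (pos - centers)))     # wrapped angular distance
    return baseline + peak_rate * np.exp(-d ** 2 / (2 * width ** 2))

rates = np.stack([tuning(p) for p in positions])   # shape (n_bins, n_cells)

def decode(spike_counts):
    """Posterior over position bins for one window of spike counts,
    assuming independent Poisson firing and a flat spatial prior."""
    log_post = (spike_counts * np.log(rates * dt) - rates * dt).sum(axis=1)
    log_post -= log_post.max()                     # numerical stability
    post = np.exp(log_post)
    return post / post.sum()

true_bin = 17
counts = rng.poisson(rates[true_bin] * dt)         # synthetic spikes at bin 17
post = decode(counts)
print("decoded bin:", int(post.argmax()), "true bin:", true_bin)
```

The "local probability" measure in the abstract corresponds to `post[true_bin]`, and decoding accuracy to the distance between `post.argmax()` and the true bin.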

  4. Estimating Effect Sizes and Expected Replication Probabilities from GWAS Summary Statistics

    DEFF Research Database (Denmark)

    Holland, Dominic; Wang, Yunpeng; Thompson, Wesley K

    2016-01-01

    Genome-wide Association Studies (GWAS) result in millions of summary statistics ("z-scores") for single nucleotide polymorphism (SNP) associations with phenotypes. These rich datasets afford deep insights into the nature and extent of genetic contributions to complex phenotypes such as psychiatric......-scores, as such knowledge would enhance causal SNP and gene discovery, help elucidate mechanistic pathways, and inform future study design. Here we present a parsimonious methodology for modeling effect sizes and replication probabilities, relying only on summary statistics from GWAS substudies, and a scheme allowing...... for estimating the degree of polygenicity of the phenotype and predicting the proportion of chip heritability explainable by genome-wide significant SNPs in future studies with larger sample sizes. We apply the model to recent GWAS of schizophrenia (N = 82,315) and putamen volume (N = 12,596), with approximately...

  5. Problems in probability theory, mathematical statistics and theory of random functions

    CERN Document Server

    Sveshnikov, A A

    1979-01-01

    Problem solving is the main thrust of this excellent, well-organized workbook. Suitable for students at all levels in probability theory and statistics, the book presents over 1,000 problems and their solutions, illustrating fundamental theory and representative applications in the following fields: Random Events; Distribution Laws; Correlation Theory; Random Variables; Entropy & Information; Markov Processes; Systems of Random Variables; Limit Theorems; Data Processing; and more.The coverage of topics is both broad and deep, ranging from the most elementary combinatorial problems through lim

  6. Hanford Facility contingency plan

    International Nuclear Information System (INIS)

    Sutton, L.N.; Miskho, A.G.; Brunke, R.C.

    1993-10-01

    The Hanford Facility Contingency Plan, together with each TSD unit-specific contingency plan, meets the WAC 173-303 requirements for a contingency plan. This plan includes descriptions of responses to a nonradiological hazardous materials spill or release at Hanford Facility locations not covered by TSD unit-specific contingency plans or building emergency plans. This plan includes descriptions of responses for spills or releases as a result of transportation activities, movement of materials, packaging, and storage of hazardous materials

  7. Using contingency management procedures to reduce at-risk drinking in heavy drinkers.

    Science.gov (United States)

    Dougherty, Donald M; Lake, Sarah L; Hill-Kapturczak, Nathalie; Liang, Yuanyuan; Karns, Tara E; Mullen, Jillian; Roache, John D

    2015-04-01

    Treatments for alcohol use disorders typically have been abstinence based, but harm reduction approaches, which encourage drinkers to alter their drinking behavior to reduce the probability of alcohol-related consequences, have gained in popularity. This study used a contingency management procedure to determine its effectiveness in reducing alcohol consumption among heavy drinkers. Eighty-two nontreatment-seeking heavy drinkers (ages 21 to 54, M = 30.20) who did not meet diagnostic criteria for alcohol dependence participated in the study. The study had 3 phases: (i) an Observation phase (4 weeks) where participants drank normally; (ii) a Contingency Management phase (12 weeks) where participants were paid $50 weekly for not exceeding low levels of alcohol consumption as measured by transdermal alcohol concentrations; and (iii) a Follow-up phase, in which the contingencies were removed. Transdermal alcohol monitors were used to verify meeting contingency requirements; all other analyses were conducted on self-reported alcohol use. On average 42.3% of participants met the contingency criteria and were paid an average of $222 during the Contingency Management phase, with an average $1,998 in total compensation throughout the study. Compared to the Observation phase, the percent of any self-reported drinking days significantly decreased from 59.9 to 40.0% in the Contingency Management and 32.0% in the Follow-up phases. The percent of self-reported heavy drinking days also significantly decreased from 42.4% in the Observation phase to 19.7% in the Contingency Management phase, which was accompanied by a significant increase in percent days of self-reported no (from 40.1 to 60.0%) and low-level drinking (from 9.9 to 15.4%). Self-reported reductions in drinking either persisted, or became more pronounced, during the Follow-up phase. Contingency management was associated with a reduction in self-reported episodes of heavy drinking among nontreatment-seeking heavy drinkers. These effects persisted even

  8. Spencer-Brown vs. Probability and Statistics: Entropy’s Testimony on Subjective and Objective Randomness

    Directory of Open Access Journals (Sweden)

    Julio Michael Stern

    2011-04-01

    Full Text Available This article analyzes the role of entropy in Bayesian statistics, focusing on its use as a tool for detection, recognition and validation of eigen-solutions. “Objects as eigen-solutions” is a key metaphor of the cognitive constructivism epistemological framework developed by the philosopher Heinz von Foerster. Special attention is given to some objections to the concepts of probability, statistics and randomization posed by George Spencer-Brown, a figure of great influence in the field of radical constructivism.

  9. Fuzzy-set based contingency ranking

    International Nuclear Information System (INIS)

    Hsu, Y.Y.; Kuo, H.C.

    1992-01-01

    In this paper, a new approach based on fuzzy set theory is developed for contingency ranking of the Taiwan power system. To examine whether a power system can remain in a secure and reliable operating state under contingency conditions, those contingency cases that will result in loss-of-load, loss-of-generation, or islanding are first identified. Then a 1P-1Q iteration of fast decoupled load flow is performed to estimate post-contingent quantities (line flows, bus voltages) for the other contingency cases. Based on system operators' past experience, each post-contingent quantity is assigned a degree of severity according to the potential damage that could be imposed on the power system by that quantity, should the contingency occur. An approach based on fuzzy set theory is developed to deal with the imprecision of linguistic terms

  10. Contingency management for patients with dual disorders in intensive outpatient treatment for addiction.

    Science.gov (United States)

    Kelly, Thomas M; Daley, Dennis C; Douaihy, Antoine B

    2014-01-01

    This quality improvement program evaluation investigated the effectiveness of contingency management for improving retention in treatment and positive outcomes among patients with dual disorders in intensive outpatient treatment for addiction. The effect of contingency management was explored among a group of 160 patients exposed to contingency management (n = 88) and not exposed to contingency management (no contingency management, n = 72) in a six-week partial hospitalization program. Patients referred to the partial hospitalization program for treatment of substance use and comorbid psychiatric disorders received diagnoses from psychiatrists and specialist clinicians according to the Diagnostic and Statistical Manual of the American Psychiatric Association. A unique application of the contingency management "fishbowl" method was used to improve the consistency of attendance at treatment sessions, which patients attended 5 days a week. Days attending treatment and drug-free days were the main outcome variables. Other outcomes of interest were depression, anxiety and psychological stress, coping ability, and intensity of drug cravings. Patients in the contingency management group attended more treatment days compared to patients in the no contingency management group; M = 16.2 days (SD = 10.0) versus M = 9.9 days (SD = 8.5), respectively; t = 4.2, df = 158, p < .001. Days attending treatment mediated the relationship between exposure to contingency management and self-reported drug-free days. Contingency management is a valuable adjunct for increasing retention in treatment among patients with dual disorders in partial hospitalization treatment. Exposure to contingency management increases retention in treatment, which in turn contributes to increased drug-free days. Interventions for coping with psychological stress and drug cravings should be emphasized in intensive dual diagnosis group therapy.

  11. Contingency planning: preparation of contingency plans

    DEFF Research Database (Denmark)

    Westergaard, J M

    2008-01-01

    . The risk of introducing disease pathogens into a country and the spread of the agent within a country depends on a number of factors including import controls, movement of animals and animal products and the biosecurity applied by livestock producers. An adequate contingency plan is an important instrument...... in the preparation for and the handling of an epidemic. The legislation of the European Union requires that all Member States draw up a contingency plan which specifies the national measures required to maintain a high level of awareness and preparedness and is to be implemented in the event of disease outbreak...

  12. Learning, awareness, and instruction: subjective contingency awareness does matter in the colour-word contingency learning paradigm.

    Science.gov (United States)

    Schmidt, James R; De Houwer, Jan

    2012-12-01

    In three experiments, each of a set of colour-unrelated distracting words was presented most often in a particular target print colour (e.g., "month" most often in red). In Experiment 1, half of the participants were told the word-colour contingencies in advance (instructed) and half were not (control). The instructed group showed a larger learning effect. This instruction effect was fully explained by increases in subjective awareness with instruction. In Experiment 2, contingency instructions were again given, but no contingencies were actually present. Although many participants claimed to be aware of these (non-existent) contingencies, they did not produce an instructed contingency effect. In Experiment 3, half of the participants were given contingency instructions that did not correspond to the correct contingencies. Participants with these false instructions learned the actual contingencies worse than controls. Collectively, our results suggest that conscious contingency knowledge might play a moderating role in the strength of implicit learning. Copyright © 2012 Elsevier Inc. All rights reserved.

  13. The Effects and Side-Effects of Statistics Education: Psychology Students' (Mis-)Conceptions of Probability

    Science.gov (United States)

    Morsanyi, Kinga; Primi, Caterina; Chiesi, Francesca; Handley, Simon

    2009-01-01

    In three studies we looked at two typical misconceptions of probability: the representativeness heuristic, and the equiprobability bias. The literature on statistics education predicts that some typical errors and biases (e.g., the equiprobability bias) increase with education, whereas others decrease. This is in contrast with reasoning theorists'…

  14. Spectral clustering and biclustering learning large graphs and contingency tables

    CERN Document Server

    Bolla, Marianna

    2013-01-01

    Explores regular structures in graphs and contingency tables by spectral theory and statistical methods This book bridges the gap between graph theory and statistics by giving answers to the demanding questions which arise when statisticians are confronted with large weighted graphs or rectangular arrays. Classical and modern statistical methods applicable to biological, social, communication networks, or microarrays are presented together with the theoretical background and proofs. This book is suitable for a one-semester course for graduate students in data mining, mult

  15. Probability of Detection (POD) as a statistical model for the validation of qualitative methods.

    Science.gov (United States)

    Wehling, Paul; LaBudde, Robert A; Brunelle, Sharon L; Nelson, Maria T

    2011-01-01

    A statistical model is presented for use in validation of qualitative methods. This model, termed Probability of Detection (POD), harmonizes the statistical concepts and parameters between quantitative and qualitative method validation. POD characterizes method response with respect to concentration as a continuous variable. The POD model provides a tool for graphical representation of response curves for qualitative methods. In addition, the model allows comparisons between candidate and reference methods, and provides calculations of repeatability, reproducibility, and laboratory effects from collaborative study data. Single laboratory study and collaborative study examples are given.
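A POD curve treats the probability of a positive qualitative result as a smooth function of analyte concentration. The logistic form below is an illustrative assumption (the AOAC POD framework specifies its own fitting and comparison details), and the parameter values are hypothetical:

```python
import math

def pod_curve(conc, b0, b1):
    """One common choice for a POD response curve: logistic in
    concentration. Returns the probability of a positive result at the
    given analyte concentration; b0, b1 are hypothetical fitted
    parameters, not values from the cited validation study."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * conc)))

# Hypothetical candidate vs. reference method curves across concentrations:
for c in (0.0, 2.0, 5.0, 10.0):
    cand, ref = pod_curve(c, -3.0, 0.8), pod_curve(c, -3.5, 0.8)
    print(f"conc {c:4.1f}: POD_candidate = {cand:.3f}, POD_reference = {ref:.3f}")
```

Plotting such curves side by side is what gives the graphical method comparison the abstract describes.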

  16. Eliciting and Developing Teachers' Conceptions of Random Processes in a Probability and Statistics Course

    Science.gov (United States)

    Smith, Toni M.; Hjalmarson, Margret A.

    2013-01-01

    The purpose of this study is to examine prospective mathematics specialists' engagement in an instructional sequence designed to elicit and develop their understandings of random processes. The study was conducted with two different sections of a probability and statistics course for K-8 teachers. Thirty-two teachers participated. Video analyses…

  17. Promoting Active Learning When Teaching Introductory Statistics and Probability Using a Portfolio Curriculum Approach

    Science.gov (United States)

    Adair, Desmond; Jaeger, Martin; Price, Owen M.

    2018-01-01

    The use of a portfolio curriculum approach, when teaching a university introductory statistics and probability course to engineering students, is developed and evaluated. The portfolio curriculum approach, so called, as the students need to keep extensive records both as hard copies and digitally of reading materials, interactions with faculty,…

  18. An experimental study of the surface elevation probability distribution and statistics of wind-generated waves

    Science.gov (United States)

    Huang, N. E.; Long, S. R.

    1980-01-01

    Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data along with some limited field data were compared. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.

  19. Outlier identification procedures for contingency tables using maximum likelihood and $L_1$ estimates

    NARCIS (Netherlands)

    Kuhnt, S.

    2004-01-01

    Observed cell counts in contingency tables are perceived as outliers if they have low probability under an anticipated loglinear Poisson model. New procedures for the identification of such outliers are derived using the classical maximum likelihood estimator and an estimator based on the L1 norm.
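The classical maximum likelihood side of this idea can be sketched directly: fit an independence loglinear Poisson model and flag cells whose observed counts have low probability under it. The 3x3 table and the 0.01 cutoff below are illustrative assumptions, not data from the paper:

```python
import numpy as np
from math import exp, lgamma, log

# Toy 3x3 contingency table with one suspicious cell (invented counts).
table = np.array([[20, 15, 10],
                  [18, 14,  9],
                  [ 2, 16, 11]], dtype=float)

# ML fit of the independence loglinear model: expected = row_total * col_total / N
N = table.sum()
expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / N

def poisson_pmf(k, mu):
    """Poisson probability mass, computed in log space for stability."""
    return exp(k * log(mu) - mu - lgamma(k + 1))

# Flag cells whose observed count is improbable under the fitted model.
alpha = 0.01
outliers = [(i, j) for i in range(3) for j in range(3)
            if poisson_pmf(table[i, j], expected[i, j]) < alpha]
print(outliers)  # the depressed cell (2, 0) is flagged
```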

  20. Fundamental questions of earthquake statistics, source behavior, and the estimation of earthquake probabilities from possible foreshocks

    Science.gov (United States)

    Michael, Andrew J.

    2012-01-01

    Estimates of the probability that an ML 4.8 earthquake, which occurred near the southern end of the San Andreas fault on 24 March 2009, would be followed by an M 7 mainshock over the following three days vary from 0.0009 using a Gutenberg–Richter model of aftershock statistics (Reasenberg and Jones, 1989) to 0.04 using a statistical model of foreshock behavior and long‐term estimates of large earthquake probabilities, including characteristic earthquakes (Agnew and Jones, 1991). I demonstrate that the disparity between the existing approaches depends on whether or not they conform to Gutenberg–Richter behavior. While Gutenberg–Richter behavior is well established over large regions, it could be violated on individual faults if they have characteristic earthquakes or over small areas if the spatial distribution of large‐event nucleations is disproportional to the rate of smaller events. I develop a new form of the aftershock model that includes characteristic behavior and combines the features of both models. This new model and the older foreshock model yield the same results when given the same inputs, but the new model has the advantage of producing probabilities for events of all magnitudes, rather than just for events larger than the initial one. Compared with the aftershock model, the new model has the advantage of taking into account long‐term earthquake probability models. Using consistent parameters, the probability of an M 7 mainshock on the southernmost San Andreas fault is 0.0001 for three days from long‐term models and the clustering probabilities following the ML 4.8 event are 0.00035 for a Gutenberg–Richter distribution and 0.013 for a characteristic‐earthquake magnitude–frequency distribution. Our decisions about the existence of characteristic earthquakes and how large earthquakes nucleate have a first‐order effect on the probabilities obtained from short‐term clustering models for these large events.
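For orientation, the Gutenberg–Richter scaling at issue can be illustrated with a back-of-the-envelope calculation; this is not the paper's full clustering model, and b = 1 is an assumed typical b-value:

```python
# Under Gutenberg-Richter with b-value b, the rate of events at or above
# magnitude M scales as 10**(-b*M). The chance that triggered seismicity
# reaches M >= 7, relative to the rate of M >= 4.8 events, scales with the
# rate ratio below -- which is why the choice of magnitude-frequency
# distribution (G-R vs. characteristic) moves the probabilities so much.
b = 1.0
m_small, m_big = 4.8, 7.0
ratio = 10 ** (-b * (m_big - m_small))
print(f"relative rate of M>={m_big} vs M>={m_small}: {ratio:.4f}")
```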

  1. Waste Management Project Contingency Analysis

    International Nuclear Information System (INIS)

    Edward L. Parsons, Jr.

    1999-01-01

The purpose of this report is to provide the Office of Waste Management (WM) with recommended contingency calculation procedures for typical WM projects. Typical projects were defined as conventional construction-type activities that use innovative elements when necessary to meet the project objectives. Projects involve treatment, storage, and disposal of low level, mixed low level, hazardous, transuranic, and high level waste. Cost contingencies are an essential part of Total Cost Management. A contingency is an amount added to a cost estimate to compensate for unexpected expenses resulting from incomplete design, unforeseen and unpredictable conditions, or uncertainties in the project scope (DOE 1994, AACE 1998). Contingency allowances are expressed as percentages of estimated cost and improve cost estimates by accounting for uncertainties. The contingency allowance is large at the beginning of a project because there are more uncertainties, but as a project develops, the allowance shrinks to adjust for costs already incurred. Ideally, the total estimated cost remains the same throughout a project. Project contingency reflects the degree of uncertainty caused by lack of project definition, and process contingency reflects the degree of uncertainty caused by use of new technology. Different cost estimation methods were reviewed and compared with respect to terminology, accuracy, and Cost Guide standards. The Association for the Advancement of Cost Engineering (AACE) methods for cost estimation were selected to represent best industry practice. AACE methodology for contingency analysis can be readily applied to WM projects, accounts for uncertainties associated with different stages of a project, and considers both project and process contingencies and the stage of technical readiness. As recommended, AACE contingency allowances taper off linearly as a project nears completion.
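The linear taper of the allowance described above can be sketched as follows; the 30% starting allowance is an illustrative assumption, not an AACE figure:

```python
def contingency_allowance(initial_pct, percent_complete):
    """Linear drawdown of a contingency allowance as the project completes:
    full allowance at 0% complete, zero allowance at 100% complete."""
    return initial_pct * (1.0 - percent_complete / 100.0)

# Allowance (as % of estimated cost) at several stages of completion.
for done in (0, 25, 50, 75, 100):
    print(f"{done:3d}% complete -> {contingency_allowance(30.0, done):.1f}% allowance")
```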

  2. When Is Statistical Evidence Superior to Anecdotal Evidence in Supporting Probability Claims? The Role of Argument Type

    Science.gov (United States)

    Hoeken, Hans; Hustinx, Lettica

    2009-01-01

    Under certain conditions, statistical evidence is more persuasive than anecdotal evidence in supporting a claim about the probability that a certain event will occur. In three experiments, it is shown that the type of argument is an important condition in this respect. If the evidence is part of an argument by generalization, statistical evidence…

  3. Portable EMG devices, Biofeedback and Contingent Electrical Stimulation applications in Bruxism

    DEFF Research Database (Denmark)

    Castrillon, Eduardo

Portable EMG devices, Biofeedback and Contingent Electrical Stimulation applications in Bruxism Eduardo Enrique, Castrillon Watanabe, DDS, MSc, PhD Section of Orofacial Pain and Jaw Function, Department of Dentistry, Aarhus University, Aarhus, Denmark; Scandinavian Center for Orofacial Neuroscience...... Summary: Bruxism is a parafunctional activity which involves the masticatory muscles and is probably as old as humankind. Different methods, such as portable EMG devices, have been proposed to diagnose and understand the pathophysiology of bruxism. Biofeedback / contingent electrical stimulation...... characteristics make it complicated to assess bruxism using portable EMG devices. The possibility to assess bruxism-like EMG activity on a portable device made it possible to use biofeedback and CES approaches in order to treat / manage bruxism. The available scientific information about CES effects on bruxism......

  4. Aarhus Conference on Probability, Statistics and Their Applications : Celebrating the Scientific Achievements of Ole E. Barndorff-Nielsen

    CERN Document Server

    Stelzer, Robert; Thorbjørnsen, Steen; Veraart, Almut

    2016-01-01

Collecting together twenty-three self-contained articles, this volume presents the current research of a number of renowned scientists in both probability theory and statistics as well as their various applications in economics, finance, the physics of wind-blown sand, queueing systems, risk assessment, turbulence and other areas. The contributions are dedicated to and inspired by the research of Ole E. Barndorff-Nielsen who, since the early 1960s, has been and continues to be a very active and influential researcher working on a wide range of important problems. The topics covered include, but are not limited to, econometrics, exponential families, Lévy processes and infinitely divisible distributions, limit theory, mathematical finance, random matrices, risk assessment, statistical inference for stochastic processes, stochastic analysis and optimal control, time series, and turbulence. The book will be of interest to researchers and graduate students in probability, statistics and their applications.

  5. Future Contingents

    DEFF Research Database (Denmark)

    Øhrstrøm, Peter; Hasle., Per F. V.

    2015-01-01

    contingent statements. The problem of future contingents is interwoven with a number of issues in theology, philosophy, logic, semantics of natural language, computer science, and applied mathematics. The theological issue of how to reconcile the assumption of God's foreknowledge with the freedom and moral...... accountability of human beings has been a main impetus to the discussion and a major inspiration to the development of various logical models of time and future contingents. This theological issue is connected with the general philosophical question of determinism versus indeterminism. Within logic, the relation...... about the future. Finally, it should be mentioned that temporal logic has found a remarkable application in computer science and applied mathematics. In the late 1970s the first computer scientists realised the relevance of temporal logic for the purposes of computer science (see Hasle and Øhrstrøm 2004)....

  6. Future Contingents

    DEFF Research Database (Denmark)

    Øhrstrøm, Peter; Hasle., Per F. V.

    2011-01-01

    contingent statements. The problem of future contingents is interwoven with a number of issues in theology, philosophy, logic, semantics of natural language, computer science, and applied mathematics. The theological issue of how to reconcile the assumption of God's foreknowledge with the freedom and moral...... accountability of human beings has been a main impetus to the discussion and a major inspiration to the development of various logical models of time and future contingents. This theological issue is connected with the general philosophical question of determinism versus indeterminism. Within logic, the relation...... about the future. Finally, it should be mentioned that temporal logic has found a remarkable application in computer science and applied mathematics. In the late 1970s the first computer scientists realised the relevance of temporal logic for the purposes of computer science (see Hasle and Øhrstrøm 2004)....

  7. There Once Was a 9-Block ...--A Middle-School Design for Probability and Statistics

    Science.gov (United States)

    Abrahamson, Dor; Janusz, Ruth M.; Wilensky, Uri

    2006-01-01

    ProbLab is a probability-and-statistics unit developed at the Center for Connected Learning and Computer-Based Modeling, Northwestern University. Students analyze the combinatorial space of the 9-block, a 3-by-3 grid of squares, in which each square can be either green or blue. All 512 possible 9-blocks are constructed and assembled in a "bar…

  8. CAN'T MISS--conquer any number task by making important statistics simple. Part 2. Probability, populations, samples, and normal distributions.

    Science.gov (United States)

    Hansen, John P

    2003-01-01

    Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 2, describes probability, populations, and samples. The uses of descriptive and inferential statistics are outlined. The article also discusses the properties and probability of normal distributions, including the standard normal distribution.
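The inferential step this series builds toward — a confidence interval for a population mean computed from a sample — can be sketched with a normal-based z-interval. The sample values below are hypothetical turnaround times, invented for illustration:

```python
import math
import statistics

# Hypothetical sample of 30 turnaround times (minutes) -- illustrative only.
sample = [52, 47, 55, 60, 49, 51, 58, 45, 53, 50,
          62, 48, 54, 57, 46, 52, 59, 50, 55, 51,
          49, 56, 53, 61, 47, 52, 54, 50, 58, 48]

mean = statistics.mean(sample)
sem  = statistics.stdev(sample) / math.sqrt(len(sample))  # standard error
z    = 1.96  # ~95% coverage under a normal sampling distribution
lo, hi = mean - z * sem, mean + z * sem
print(f"95% CI for the population mean: ({lo:.1f}, {hi:.1f})")
```

With a sample this small, a t-multiplier would be slightly wider than 1.96; the z-interval keeps the sketch simple.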

9. Financial derivative pricing under probability operator via Esscher transformation

    Energy Technology Data Exchange (ETDEWEB)

    Achi, Godswill U., E-mail: achigods@yahoo.com [Department of Mathematics, Abia State Polytechnic Aba, P.M.B. 7166, Aba, Abia State (Nigeria)

    2014-10-24

The problem of pricing contingent claims has been extensively studied for non-Gaussian models, and in particular, the Black-Scholes formula has been derived for the NIG asset pricing model. This approach was first developed in insurance pricing [9], where the original distortion function was defined in terms of the normal distribution. It was later studied in [6], where the standard Black-Scholes contingent pricing and distortion-based contingent pricing were compared. So, in this paper, we aim at using distortion operators given by the Cauchy distribution under a simple transformation to price contingent claims. We also show that we can recover the Black-Scholes formula using this distribution. Similarly, in a financial market in which the asset price is represented by a stochastic differential equation with respect to Brownian motion, the price mechanism based on the characteristic Esscher measure can generate approximately arbitrage-free financial derivative prices. The price representation derived involves the probability Esscher measure and the Esscher martingale measure, and under a new complex-valued measure φ(u) evaluated at the characteristic exponents φ_x(u) of X_t we recover the Black-Scholes formula for financial derivative prices.

10. Financial derivative pricing under probability operator via Esscher transformation

    International Nuclear Information System (INIS)

    Achi, Godswill U.

    2014-01-01

The problem of pricing contingent claims has been extensively studied for non-Gaussian models, and in particular, the Black-Scholes formula has been derived for the NIG asset pricing model. This approach was first developed in insurance pricing [9], where the original distortion function was defined in terms of the normal distribution. It was later studied in [6], where the standard Black-Scholes contingent pricing and distortion-based contingent pricing were compared. So, in this paper, we aim at using distortion operators given by the Cauchy distribution under a simple transformation to price contingent claims. We also show that we can recover the Black-Scholes formula using this distribution. Similarly, in a financial market in which the asset price is represented by a stochastic differential equation with respect to Brownian motion, the price mechanism based on the characteristic Esscher measure can generate approximately arbitrage-free financial derivative prices. The price representation derived involves the probability Esscher measure and the Esscher martingale measure, and under a new complex-valued measure φ(u) evaluated at the characteristic exponents φ_x(u) of X_t we recover the Black-Scholes formula for financial derivative prices.

11. Financial derivative pricing under probability operator via Esscher transformation

    Science.gov (United States)

    Achi, Godswill U.

    2014-10-01

The problem of pricing contingent claims has been extensively studied for non-Gaussian models, and in particular, the Black-Scholes formula has been derived for the NIG asset pricing model. This approach was first developed in insurance pricing [9], where the original distortion function was defined in terms of the normal distribution. It was later studied in [6], where the standard Black-Scholes contingent pricing and distortion-based contingent pricing were compared. So, in this paper, we aim at using distortion operators given by the Cauchy distribution under a simple transformation to price contingent claims. We also show that we can recover the Black-Scholes formula using this distribution. Similarly, in a financial market in which the asset price is represented by a stochastic differential equation with respect to Brownian motion, the price mechanism based on the characteristic Esscher measure can generate approximately arbitrage-free financial derivative prices. The price representation derived involves the probability Esscher measure and the Esscher martingale measure, and under a new complex-valued measure φ(u) evaluated at the characteristic exponents φ_x(u) of X_t we recover the Black-Scholes formula for financial derivative prices.

  12. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes censored data martingales. The text includes measure theoretic...

  13. 49 CFR 1544.301 - Contingency plan.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 9 2010-10-01 2010-10-01 false Contingency plan. 1544.301 Section 1544.301... COMMERCIAL OPERATORS Threat and Threat Response § 1544.301 Contingency plan. Each aircraft operator must adopt a contingency plan and must: (a) Implement its contingency plan when directed by TSA. (b) Ensure...

  14. Simulation of statistical systems with not necessarily real and positive probabilities

    International Nuclear Information System (INIS)

    Kalkreuter, T.

    1991-01-01

A new method to determine expectation values of observables in statistical systems with not necessarily real and positive probabilities is proposed. It is tested in a numerical study of the two-dimensional O(3)-symmetric nonlinear σ-model with Symanzik's one-loop improved lattice action. This model is simulated as a polymer system with field-dependent activities which can be made positive definite or indefinite by adjusting additive constants of the action. For a system with indefinite activities the new proposal is found to work. It is also verified that local observables are not affected by far-away polymers with indefinite activities when the system has no long-range order. (orig.)

  15. Supervisory Styles: A Contingency Framework

    Science.gov (United States)

    Boehe, Dirk Michael

    2016-01-01

    While the contingent nature of doctoral supervision has been acknowledged, the literature on supervisory styles has yet to deliver a theory-based contingency framework. A contingency framework can assist supervisors and research students in identifying appropriate supervisory styles under varying circumstances. The conceptual study reported here…

  16. Significant others and contingencies of self-worth: activation and consequences of relationship-specific contingencies of self-worth.

    Science.gov (United States)

    Horberg, E J; Chen, Serena

    2010-01-01

    Three studies tested the activation and consequences of contingencies of self-worth associated with specific significant others, that is, relationship-specific contingencies of self-worth. The results showed that activating the mental representation of a significant other with whom one strongly desires closeness led participants to stake their self-esteem in domains in which the significant other wanted them to excel. This was shown in terms of self-reported contingencies of self-worth (Study 1), in terms of self-worth after receiving feedback on a successful or unsatisfactory performance in a relationship-specific contingency domain (Study 2), and in terms of feelings of reduced self-worth after thinking about a failure in a relationship-specific contingency domain (Study 3). Across studies, a variety of contingency domains were examined. Furthermore, Study 3 showed that failing in an activated relationship-specific contingency domain had negative implications for current feelings of closeness and acceptance in the significant-other relationship. Overall, the findings suggest that people's contingencies of self-worth depend on the social situation and that performance in relationship-specific contingency domains can influence people's perceptions of their relationships.

  17. Licensee safeguards contingency plans

    International Nuclear Information System (INIS)

    Anon.

    1978-01-01

The Nuclear Regulatory Commission is amending its regulations to require that licensees authorized to operate a nuclear reactor (other than certain research and test reactors), and those authorized to possess strategic quantities of plutonium, uranium-233, or uranium-235, develop and implement acceptable plans for responding to threats, thefts, and industrial sabotage of licensed nuclear materials and facilities. The plans will provide a structured, orderly, and timely response to safeguards contingencies and will be an important segment of NRC's contingency planning programs. Licensee safeguards contingency plans will result in organizing licensees' safeguards resources in such a way that, in the unlikely event of a safeguards contingency, the responding participants will be identified, their several responsibilities specified, and their responses coordinated.

18. Contingency Operations of America's Next Moon Rocket, Ares V

    Science.gov (United States)

    Jaap, John; Richardson, Lea

    2010-01-01

    America has begun the development of a new space vehicle system which will enable humans to return to the moon and reach even farther destinations. The system is called Constellation: it has 2 earth-launch vehicles, Ares I and Ares V; a crew module, Orion; and a lander, Altair with descent and ascent stages. Ares V will launch an Earth Departure Stage (EDS) and Altair into low earth orbit. Ares I will launch the Orion crew module into low earth orbit where it will rendezvous and dock with the Altair and EDS "stack". After rendezvous, the stack will contain four complete rocket systems, each capable of independent operations. Of course this multiplicity of vehicles provides a multiplicity of opportunities for off-nominal behavior and multiple mitigation options for each. Contingency operations are complicated by the issues of crew safety and the possibility of debris from the very large components impacting the ground. This paper examines contingency operations of the EDS in low earth orbit, during the boost to translunar orbit, and after the translunar boost. Contingency operations under these conditions have not been a consideration since the Apollo era and analysis of the possible contingencies and mitigations will take some time to evolve. Since the vehicle has not been designed, much less built, it is not possible to evaluate contingencies from a root-cause basis or from a probability basis; rather they are discussed at an effects level (such as the reaction control system is consuming propellant at a high rate). Mitigations for the contingencies are based on the severity of the off-nominal condition, the time of occurrence, recovery options, options for alternate missions, crew safety, evaluation of the condition (forensics) and future prevention. 
Some proposed mitigations reflect innovation in thinking and make use of the multiplicity of on-orbit resources including the crew; example: Orion could do a "fly around" to allow the crew to determine the condition

  19. 30 CFR 282.26 - Contingency Plan.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 2 2010-07-01 2010-07-01 false Contingency Plan. 282.26 Section 282.26 Mineral... § 282.26 Contingency Plan. (a) When required by the Director, a lessee shall include a Contingency Plan as part of its request for approval of a Delineation, Testing, or Mining Plan. The Contingency Plan...

  20. Random phenomena fundamentals of probability and statistics for engineers

    CERN Document Server

    Ogunnaike, Babatunde A

    2009-01-01

Prelude; Approach Philosophy; Four Basic Principles; I Foundations; Two Motivating Examples; Yield Improvement in a Chemical Process; Quality Assurance in a Glass Sheet Manufacturing Process; Outline of a Systematic Approach; Random Phenomena, Variability, and Uncertainty; Two Extreme Idealizations of Natural Phenomena; Random Mass Phenomena; Introducing Probability; The Probabilistic Framework; II Probability; Fundamentals of Probability Theory; Building Blocks; Operations; Probability; Conditional Probability; Independence; Random Variables and Distributions; Distributions; Mathematical Expectation; Characterizing Distributions; Special Derived Probability Functions; Multidimensional Random Variables; Distributions of Several Random Variables; Distributional Characteristics of Jointly Distributed Random Variables; Random Variable Transformations; Single Variable Transformations; Bivariate Transformations; General Multivariate Transformations; Application Case Studies I: Probability; Mendel and Heredity; World War II Warship Tactical Response Under Attack; III Distributions; Ide...

  1. Impact of statistical learning methods on the predictive power of multivariate normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A.; van t Veld, Aart A.

    2012-01-01

    PURPOSE: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator

  2. State-of-the-art risk-based approach to spill contingency planning and risk management

    International Nuclear Information System (INIS)

    Schmidt Etkin, Dagmar; Reilly, Timothy; French McCay, Deborah

    2011-01-01

The paper proposes incorporating a comprehensive examination of spill risk into risk management and contingency planning, and applying state-of-the-art modeling tools to evaluate various alternatives for appropriate spill response measures and optimize protective responses. The approach allows spill contingency planners and decision-makers to determine the types of spill scenarios that may occur in a particular location or from a particular source and calculate the probability distribution of the various scenarios. The spill probability information is useful in assessing and putting into perspective the costs of the various spill control options that will ultimately be recommended. Using advanced modeling tools helps in estimating the potential environmental and socioeconomic consequences of each spill scenario based on location-specific factors over a range of stochastic possibilities, simulating spill scenarios and determining optimal responses and protection strategies. The benefits and costs of various response alternatives and variations in response time can be calculated, and modeling tools can be used for training and risk allocation/transfer purposes.
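The stochastic-scenario idea can be sketched with a toy Monte Carlo simulation; the distributions, thresholds, and probabilities below are illustrative assumptions, not values from the paper:

```python
import random

random.seed(7)  # fixed seed so the sketch is reproducible

# Each trial draws a spill volume and a wind direction, and we tabulate how
# often the combination exceeds a notional shoreline-impact threshold.
def one_trial():
    volume = random.lognormvariate(mu=3.0, sigma=1.0)  # spill size, m^3 (assumed)
    onshore_wind = random.random() < 0.4               # assumed wind frequency
    return volume > 50.0 and onshore_wind

trials = 100_000
hits = sum(one_trial() for _ in range(trials))
print(f"estimated P(shoreline impact per spill) ~ {hits / trials:.3f}")
```

Real planning models replace these toy draws with location-specific trajectory and fate simulations, but the probability-distribution output has the same shape.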

  3. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

The theory of probability has been debated for centuries: back in the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", that is, a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied in statistical analysis.
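The normal-distribution probabilities this kind of review leans on can be computed directly, for instance with Python's `statistics.NormalDist`:

```python
from statistics import NormalDist

# Standard normal distribution (mean 0, standard deviation 1).
std = NormalDist(mu=0.0, sigma=1.0)

# Probability mass within 1 and 2 standard deviations of the mean.
within_1sd = std.cdf(1) - std.cdf(-1)
within_2sd = std.cdf(2) - std.cdf(-2)
print(f"P(-1 < Z < 1) = {within_1sd:.4f}")  # ~0.6827
print(f"P(-2 < Z < 2) = {within_2sd:.4f}")  # ~0.9545
```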

  4. Statistical inference of the generation probability of T-cell receptors from sequence repertoires.

    Science.gov (United States)

    Murugan, Anand; Mora, Thierry; Walczak, Aleksandra M; Callan, Curtis G

    2012-10-02

    Stochastic rearrangement of germline V-, D-, and J-genes to create variable coding sequence for certain cell surface receptors is at the origin of immune system diversity. This process, known as "VDJ recombination", is implemented via a series of stochastic molecular events involving gene choices and random nucleotide insertions between, and deletions from, genes. We use large sequence repertoires of the variable CDR3 region of human CD4+ T-cell receptor beta chains to infer the statistical properties of these basic biochemical events. Because any given CDR3 sequence can be produced in multiple ways, the probability distribution of hidden recombination events cannot be inferred directly from the observed sequences; we therefore develop a maximum likelihood inference method to achieve this end. To separate the properties of the molecular rearrangement mechanism from the effects of selection, we focus on nonproductive CDR3 sequences in T-cell DNA. We infer the joint distribution of the various generative events that occur when a new T-cell receptor gene is created. We find a rich picture of correlation (and absence thereof), providing insight into the molecular mechanisms involved. The generative event statistics are consistent between individuals, suggesting a universal biochemical process. Our probabilistic model predicts the generation probability of any specific CDR3 sequence by the primitive recombination process, allowing us to quantify the potential diversity of the T-cell repertoire and to understand why some sequences are shared between individuals. We argue that the use of formal statistical inference methods, of the kind presented in this paper, will be essential for quantitative understanding of the generation and evolution of diversity in the adaptive immune system.

  5. Inclusion probability for DNA mixtures is a subjective one-sided match statistic unrelated to identification information.

    Science.gov (United States)

    Perlin, Mark William

    2015-01-01

    DNA mixtures of two or more people are a common type of forensic crime scene evidence. A match statistic that connects the evidence to a criminal defendant is usually needed for court. Jurors rely on this strength of match to help decide guilt or innocence. However, the reliability of unsophisticated match statistics for DNA mixtures has been questioned. The most prevalent match statistic for DNA mixtures is the combined probability of inclusion (CPI), used by crime labs for over 15 years. When testing 13 short tandem repeat (STR) genetic loci, the CPI(-1) value is typically around a million, regardless of DNA mixture composition. However, actual identification information, as measured by a likelihood ratio (LR), spans a much broader range. This study examined probability of inclusion (PI) mixture statistics for 517 locus experiments drawn from 16 reported cases and compared them with LR locus information calculated independently on the same data. The log(PI(-1)) values were examined and compared with corresponding log(LR) values. The LR and CPI methods were compared in case examples of false inclusion, false exclusion, a homicide, and criminal justice outcomes. Statistical analysis of crime laboratory STR data shows that inclusion match statistics exhibit a truncated normal distribution having zero center, with little correlation to actual identification information. By the law of large numbers (LLN), CPI(-1) increases with the number of tested genetic loci, regardless of DNA mixture composition or match information. These statistical findings explain why CPI is relatively constant, with implications for DNA policy, criminal justice, cost of crime, and crime prevention. Forensic crime laboratories have generated CPI statistics on hundreds of thousands of DNA mixture evidence items. However, this commonly used match statistic behaves like a random generator of inclusionary values, following the LLN rather than measuring identification information. A quantitative
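The locus-wise construction of CPI can be sketched directly; the allele frequencies below are hypothetical, and the point is only that CPI⁻¹ compounds with the number of loci tested, as the abstract's law-of-large-numbers argument describes:

```python
# At each locus, the probability of inclusion is the squared sum of the
# frequencies of all alleles observed in the mixture. CPI is the product
# across loci, so CPI^-1 grows with the number of loci regardless of how
# much identification information the match actually carries.
included_freqs_per_locus = [
    [0.20, 0.15, 0.25, 0.10],  # alleles seen at locus 1 (hypothetical)
    [0.30, 0.20, 0.15],
    [0.25, 0.25, 0.20, 0.05],
] * 4  # pretend 12 loci with repeating structure, for illustration

cpi = 1.0
for freqs in included_freqs_per_locus:
    cpi *= sum(freqs) ** 2

print(f"CPI = {cpi:.3e}, CPI^-1 = {1 / cpi:.3e}")
```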

  6. Contingent Conspiracies: Art, Philosophy, Science

    DEFF Research Database (Denmark)

    Wilson, Alexander

    2013-01-01

    The question of whether creativity comes from being “open” or “closed” to contingent processes, deeply intersects art-historical discourse on authorship, style, technique and practice: from the Greek notion of the Daimon, through commedia dell'arte’s improvised styles and romanticism’s investment......, Hegel) contain a deeper tension between contingency and necessity, often revealed in correlate discussions of the sublime. But as artists find themselves returning again to a concern or care for contingency (a thread running through Heidegger, Levinas and Derrida) or the question how to conspire...... with contingency (Negarestani), they do so today with a new paradigm of scientific knowledge at their disposal. For science too has increasingly been forced to respond to the notion of contingency. Progressively discovering the ubiquity of non-linear dynamics, deterministic chaos and emergent complexity...

  7. Contingency diagrams as teaching tools

    OpenAIRE

    Mattaini, Mark A.

    1995-01-01

    Contingency diagrams are particularly effective teaching tools, because they provide a means for students to view the complexities of contingency networks present in natural and laboratory settings while displaying the elementary processes that constitute those networks. This paper sketches recent developments in this visualization technology and illustrates approaches for using contingency diagrams in teaching.

  8. Statistical Study of Aircraft Icing Probabilities at the 700- and 500- Millibar Levels over Ocean Areas in the Northern Hemisphere

    Science.gov (United States)

    Perkins, Porter J.; Lewis, William; Mulholland, Donald R.

    1957-01-01

    A statistical study is made of icing data reported from weather reconnaissance aircraft flown by Air Weather Service (USAF). The weather missions studied were flown at fixed flight levels of 500 millibars (18,000 ft) and 700 millibars (10,000 ft) over wide areas of the Pacific, Atlantic, and Arctic Oceans. This report is presented as part of a program conducted by the NACA to obtain extensive icing statistics relevant to aircraft design and operation. The thousands of in-flight observations recorded over a 2- to 4-year period provide reliable statistics on icing encounters for the specific areas, altitudes, and seasons included in the data. The relative frequencies of icing occurrence are presented, together with the estimated icing probabilities and the relation of these probabilities to the frequencies of flight in clouds and cloud temperatures. The results show that aircraft operators can expect icing probabilities to vary widely throughout the year from near zero in the cold Arctic areas in winter up to 7 percent in areas where greater cloudiness and warmer temperatures prevail. The data also reveal a general tendency of colder cloud temperatures to reduce the probability of icing in equally cloudy conditions.

  9. A Study of Strengthening Secondary Mathematics Teachers' Knowledge of Statistics and Probability via Professional Development

    Science.gov (United States)

    DeVaul, Lina

    2017-01-01

    A professional development program (PSPD) was implemented to improve in-service secondary mathematics teachers' content knowledge, pedagogical knowledge, and self-efficacy in teaching secondary school statistics and probability. Participants generated a teaching resource website at the conclusion of the PSPD program. Participants' content…

  10. Probability shapes perceptual precision: A study in orientation estimation.

    Science.gov (United States)

    Jabar, Syaheed B; Anderson, Britt

    2015-12-01

    Probability is known to affect perceptual estimations, but an understanding of mechanisms is lacking. Moving beyond binary classification tasks, we had naive participants report the orientation of briefly viewed gratings where we systematically manipulated contingent probability. Participants rapidly developed faster and more precise estimations for high-probability tilts. The shapes of their error distributions, as indexed by a kurtosis measure, also showed a distortion from Gaussian. This kurtosis metric was robust, capturing probability effects that were graded, contextual, and varying as a function of stimulus orientation. Our data can be understood as a probability-induced reduction in the variability or "shape" of estimation errors, as would be expected if probability affects the perceptual representations. As probability manipulations are an implicit component of many endogenous cuing paradigms, changes at the perceptual level could account for changes in performance that might have traditionally been ascribed to "attention." (c) 2015 APA, all rights reserved.
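A kurtosis measure of the kind this abstract describes can be computed directly from a sample of estimation errors. A minimal Python sketch; the error values below are invented for illustration and are not taken from the study:

```python
def excess_kurtosis(xs):
    """Sample excess kurtosis: 0 for a Gaussian, positive for sharper
    peaks with heavier tails, negative for flatter distributions."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    return m4 / m2 ** 2 - 3.0

# Invented samples: high-probability tilts give errors tightly clustered
# near zero with occasional large misses (leptokurtic) ...
high_prob_errors = [0.0] * 20 + [-0.5, 0.5, -8.0, 8.0]
# ... while low-probability tilts give broadly spread errors.
low_prob_errors = [-9 + i for i in range(19)]  # roughly uniform spread

print(excess_kurtosis(high_prob_errors))  # positive (peaked, heavy-tailed)
print(excess_kurtosis(low_prob_errors))   # negative (flatter than Gaussian)
```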

  11. Contingency Contractor Optimization Phase 3 Sustainment Software Design Document - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa; Jones, Katherine A

    2016-05-01

    This document describes the final software design of the Contingency Contractor Optimization Tool - Prototype. Its purpose is to provide the overall architecture of the software and the logic behind this architecture. Documentation for the individual classes is provided in the application Javadoc. The Contingency Contractor Optimization project is intended to address Department of Defense mandates by delivering a centralized strategic planning tool that allows senior decision makers to quickly and accurately assess the impacts, risks, and mitigation strategies associated with utilizing contract support. The Contingency Contractor Optimization Tool - Prototype was developed in Phase 3 of the OSD ATL Contingency Contractor Optimization project to support strategic planning for contingency contractors. The planning tool uses a model to optimize the Total Force mix by minimizing the combined total costs for selected mission scenarios. The model optimizes the match of personnel types (military, DoD civilian, and contractors) and capabilities to meet mission requirements as effectively as possible, based on risk, cost, and other requirements.

  12. Two-Way Tables: Issues at the Heart of Statistics and Probability for Students and Teachers

    Science.gov (United States)

    Watson, Jane; Callingham, Rosemary

    2014-01-01

    Some problems exist at the intersection of statistics and probability, creating a dilemma in relation to the best approach to assist student understanding. Such is the case with problems presented in two-way tables representing conditional information. The difficulty can be confounded if the context within which the problem is set is one where…

  13. Dysphoric mood states are related to sensitivity to temporal changes in contingency

    Directory of Open Access Journals (Sweden)

    Rachel M. Msetfi

    2012-09-01

    Full Text Available A controversial finding in the field of causal learning is that mood contributes to the accuracy of perceptions of uncorrelated relationships. When asked to report the degree of control between an action and its outcome, people with dysphoria or depression are claimed to be more realistic in reporting non-contingency (e.g., Alloy & Abramson, 1979). The strongest evidence for this depressive realism (DR) effect is derived from data collected with experimental procedures in which the dependent variables are verbal or written ratings of contingency or cause, and, perhaps more importantly, the independent variable in these procedures may be ambiguous and difficult to define. In order to address these possible confounds, we used a two-response free-operant causal learning task in which the dependent measures were performance based. Participants were required to respond to maximise the occurrence of a temporally contiguous outcome that was programmed with different probabilities, which also varied temporally across two responses. Dysphoric participants were more sensitive to the changing outcome contingencies than controls even though they responded at a similar rate. During probe trials, in which the outcome was masked, their performance recovered more quickly than that of the control group. These data provide unexpected support for the depressive realism hypothesis, suggesting that dysphoria is associated with heightened sensitivity to temporal shifts in contingency.

  14. Dysphoric Mood States are Related to Sensitivity to Temporal Changes in Contingency.

    Science.gov (United States)

    Msetfi, Rachel M; Murphy, Robin A; Kornbrot, Diana E

    2012-01-01

    A controversial finding in the field of causal learning is that mood contributes to the accuracy of perceptions of uncorrelated relationships. When asked to report the degree of control between an action and its outcome, people with dysphoria or depression are claimed to be more realistic in reporting non-contingency (e.g., Alloy and Abramson, 1979). The strongest evidence for this depressive realism (DR) effect is derived from data collected with experimental procedures in which the dependent variables are verbal or written ratings of contingency or cause, and, perhaps more importantly, the independent variable in these procedures may be ambiguous and difficult to define. In order to address these possible confounds, we used a two-response free-operant causal learning task in which the dependent measures were performance based. Participants were required to respond to maximize the occurrence of a temporally contiguous outcome that was programmed with different probabilities, which also varied temporally across two responses. Dysphoric participants were more sensitive to the changing outcome contingencies than controls even though they responded at a similar rate. During probe trials, in which the outcome was masked, their performance recovered more quickly than that of the control group. These data provide unexpected support for the DR hypothesis suggesting that dysphoria is associated with heightened sensitivity to temporal shifts in contingency.

  15. Probability and statistics for particle physics

    CERN Document Server

    Mana, Carlos

    2017-01-01

    This book comprehensively presents the basic concepts of probability and Bayesian inference with sufficient generality to make them applicable to current problems in scientific research. The first chapter provides the fundamentals of probability theory that are essential for the analysis of random phenomena. The second chapter includes a full and pragmatic review of the Bayesian methods that constitute a natural and coherent framework with enough freedom to analyze all the information available from experimental data in a conceptually simple manner. The third chapter presents the basic Monte Carlo techniques used in scientific research, allowing a large variety of problems that are difficult to tackle by other procedures to be handled. The author also introduces a basic algorithm, which enables readers to simulate samples from simple distributions, and describes useful cases for researchers in particle physics. The final chapter is devoted to the basic ideas of Information Theory, which are important in the Bayesian me...

  16. Contingencies of Value

    DEFF Research Database (Denmark)

    Strandvad, Sara Malou

    2014-01-01

    Based on a study of the admission test at a design school, this paper investigates the contingencies of aesthetic values as these become visible in assessment practices. Theoretically, the paper takes its starting point in Herrnstein Smith’s notion of ‘contingencies of values’ and outlines...... a pragmatist ground where cultural sociology and economic sociology meet. Informed by the literature on cultural intermediaries, the paper discusses the role of evaluators and the devices which accompany them. Whereas studies of cultural intermediaries traditionally apply a Bourdieusian perspective, recent......, the paper does not accept this storyline. As an alternative, the paper outlines the contingencies of values which are at play at the admission test, composed of official assessment criteria and scoring devices together with conventions within the world of design, and set in motion by interactions...

  17. 48 CFR 18.201 - Contingency operation.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Contingency operation. 18... METHODS AND CONTRACT TYPES EMERGENCY ACQUISITIONS Emergency Acquisition Flexibilities 18.201 Contingency operation. (a) Contingency operation is defined in 2.101. (b) Micro-purchase threshold. The threshold...

  18. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
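The definition in this abstract is easy to make concrete. A minimal Python sketch, assuming a regression model that predicts a nonnegative quantity with a normal error distribution; the numbers are illustrative, not drawn from the paper:

```python
import math

def normal_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def probability_leakage(mu: float, sigma: float) -> float:
    """Mass the model places on impossible outcomes y < 0, given
    evidence E that y must be nonnegative."""
    return normal_cdf(-mu / sigma)

# Hypothetical regression predictions for a nonnegative quantity:
print(probability_leakage(mu=2.0, sigma=1.0))  # ~0.023: modest leakage
print(probability_leakage(mu=0.5, sigma=1.0))  # ~0.309: severe leakage
```

Nonzero leakage here does not falsify the normal model, but it does mean the model's probabilities cannot be empirically calibrated on the impossible region, which is the paper's point.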

  19. Problems with Contingency Theory: Testing Assumptions Hidden within the Language of Contingency "Theory".

    Science.gov (United States)

    Schoonhoven, Claudia Bird

    1981-01-01

    Discusses problems in contingency theory, which relates organizational structure to the tasks performed and the information needed. Analysis of data from 17 hospitals suggests that traditional contingency theory underrepresents the complexity of relations among technological uncertainty, structure, and organizational effectiveness. (Author/RW)

  20. Optimal self-esteem is contingent: Intrinsic versus extrinsic and upward versus downward contingencies

    NARCIS (Netherlands)

    Vonk, R.; Smit, H.M.M.

    2012-01-01

    We argue that noncontingent, unconditional self-esteem is not optimal but defensive. We introduce the concept of intrinsic contingency, where self-esteem is affected by whether one's actions are self-congruent and conducive to personal growth. Whereas external contingencies, especially social and

  1. Inclusion probability for DNA mixtures is a subjective one-sided match statistic unrelated to identification information

    Directory of Open Access Journals (Sweden)

    Mark William Perlin

    2015-01-01

    Full Text Available Background: DNA mixtures of two or more people are a common type of forensic crime scene evidence. A match statistic that connects the evidence to a criminal defendant is usually needed for court. Jurors rely on this strength of match to help decide guilt or innocence. However, the reliability of unsophisticated match statistics for DNA mixtures has been questioned. Materials and Methods: The most prevalent match statistic for DNA mixtures is the combined probability of inclusion (CPI), used by crime labs for over 15 years. When testing 13 short tandem repeat (STR) genetic loci, the CPI(-1) value is typically around a million, regardless of DNA mixture composition. However, actual identification information, as measured by a likelihood ratio (LR), spans a much broader range. This study examined probability of inclusion (PI) mixture statistics for 517 locus experiments drawn from 16 reported cases and compared them with LR locus information calculated independently on the same data. The log(PI(-1)) values were examined and compared with corresponding log(LR) values. Results: The LR and CPI methods were compared in case examples of false inclusion, false exclusion, a homicide, and criminal justice outcomes. Statistical analysis of crime laboratory STR data shows that inclusion match statistics exhibit a truncated normal distribution having zero center, with little correlation to actual identification information. By the law of large numbers (LLN), CPI(-1) increases with the number of tested genetic loci, regardless of DNA mixture composition or match information. These statistical findings explain why CPI is relatively constant, with implications for DNA policy, criminal justice, cost of crime, and crime prevention. Conclusions: Forensic crime laboratories have generated CPI statistics on hundreds of thousands of DNA mixture evidence items. However, this commonly used match statistic behaves like a random generator of inclusionary values, following the LLN rather than measuring identification information.
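The LLN behavior the abstract describes follows from CPI being a product of per-locus inclusion probabilities, so its inverse grows multiplicatively with every locus tested, whatever the mixture actually says. A Python sketch with invented per-locus PI values, not drawn from the paper's data:

```python
import math

def combined_probability_of_inclusion(pi_per_locus):
    """CPI multiplies per-locus inclusion probabilities, so CPI^-1
    grows with each additional locus regardless of how informative
    the mixture evidence actually is."""
    cpi = math.prod(pi_per_locus)
    return cpi, 1.0 / cpi

# Illustrative values: a modest PI of 0.5 at each of 13 STR loci
# already yields CPI^-1 = 2^13.
_, cpi_inv = combined_probability_of_inclusion([0.5] * 13)
print(cpi_inv)  # 8192.0

# Testing more loci inflates the statistic without new information:
_, cpi_inv_21 = combined_probability_of_inclusion([0.5] * 21)
print(cpi_inv_21)  # 2097152.0
```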

  2. 48 CFR 218.201 - Contingency operation.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Contingency operation. 218... Flexibilities 218.201 Contingency operation. (1) Selection, appointment, and termination of appointment... in a contingency contracting force. See 201.603-2(2). (2) Policy for unique item identification...

  3. Thoughts Without Content are Empty, Intuitions Without Concepts are Blind - Determinism and Contingency Revisited

    Science.gov (United States)

    Pohorille, Andrew

    2011-01-01

    similar reasoning is common in other fields of science, for example in statistical mechanics. Some trajectories lead to life, perhaps in different forms, whereas others do not. What is of real interest is the ratio of these two outcomes. The issue of determinism does not directly enter the picture. The debate about the likelihood of the emergence of life is quite old. One view holds that the origin of life is an event governed by chance, and the result of so many random events (contingencies) is unpredictable. This view was eloquently expressed by Monod. In his book "Chance and Necessity" he argued that life was a product of "nature's roulette." In an alternative view, expressed in particular by de Duve and Morowitz, the origin of life is considered a highly probable or even inevitable event (although its details need not be determined in every respect). Only in this sense can the origin of life be considered a "deterministic event".

  4. The Necessity of Contingency or Contingent Necessity: Meillassoux, Hegel, and the Subject

    Directory of Open Access Journals (Sweden)

    John Van Houdt

    2011-06-01

    Full Text Available This article addresses the relationship of contingency to necessity as developed by Quentin Meillassoux and G.W.F. Hegel. Meillassoux criticizes the restriction of possibility by modern philosophy to the conditions of the transcendental subject, which he calls ‘correlationism’, and opposes to this correlationism, mathematics as an absolute form of thought. The arch-figure of a metaphysical version of correlationism for Meillassoux is Hegel. This article argues that, while Meillassoux is right to criticize a version of correlationism for restricting the range of contingency, he overlooks Hegel’s unique contribution to this issue. Hegel provides us a version of necessity modeled on the mathematical proof which answers Meillassoux’s concerns about correlationist versions of necessity but does not altogether jettison the concept of the subject. Instead, the subject in Hegel is a contingent interruption which emerges from the breaks in the kinds of necessity we posit about the world. Hegel offers us a way of tying these two concepts together in what I call ‘contingent necessity’.

  5. 40 CFR 264.53 - Copies of contingency plan.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 25 2010-07-01 2010-07-01 false Copies of contingency plan. 264.53... Contingency Plan and Emergency Procedures § 264.53 Copies of contingency plan. A copy of the contingency plan... called upon to provide emergency services. [Comment: The contingency plan must be submitted to the...

  6. Impaired awareness of action-outcome contingency and causality during healthy ageing and following ventromedial prefrontal cortex lesions.

    Science.gov (United States)

    O'Callaghan, Claire; Vaghi, Matilde M; Brummerloh, Berit; Cardinal, Rudolf N; Robbins, Trevor W

    2018-02-02

    Detecting causal relationships between actions and their outcomes is fundamental to guiding goal-directed behaviour. The ventromedial prefrontal cortex (vmPFC) has been extensively implicated in computing these environmental contingencies, via animal lesion models and human neuroimaging. However, whether the vmPFC is critical for contingency learning, and whether it can occur without subjective awareness of those contingencies, has not been established. To address this, we measured response adaptation to contingency and subjective awareness of action-outcome relationships in individuals with vmPFC lesions and healthy elderly subjects. We showed that in both vmPFC damage and ageing, successful behavioural adaptation to variations in action-outcome contingencies was maintained, but subjective awareness of these contingencies was reduced. These results highlight two contexts where performance and awareness have been dissociated, and show that learning response-outcome contingencies to guide behaviour can occur without subjective awareness. Preserved responding in the vmPFC group suggests that this region is not critical for computing action-outcome contingencies to guide behaviour. In contrast, our findings highlight a critical role for the vmPFC in supporting awareness, or metacognitive ability, during learning. We further advance the hypothesis that responding to changing environmental contingencies, whilst simultaneously maintaining conscious awareness of those statistical regularities, is a form of dual-tasking that is impaired in ageing due to reduced prefrontal function. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  7. Robust functional statistics applied to Probability Density Function shape screening of sEMG data.

    Science.gov (United States)

    Boudaoud, S; Rix, H; Al Harrach, M; Marin, F

    2014-01-01

    Recent studies have pointed out possible shape modifications of the Probability Density Function (PDF) of surface electromyographical (sEMG) data in several contexts, such as fatigue and increasing muscle force. Following this idea, criteria have been proposed to monitor these shape modifications, mainly using High Order Statistics (HOS) parameters such as skewness and kurtosis. In experimental conditions, these parameters must be estimated from small samples, which introduces errors into the estimated HOS parameters and hinders real-time, precise sEMG PDF shape monitoring. Recently, a functional formalism, the Core Shape Model (CSM), has been used to analyse shape modifications of PDF curves. In this work, taking inspiration from the CSM method, robust functional statistics are proposed to emulate both skewness and kurtosis behaviors. These functional statistics combine kernel density estimation and PDF shape distances to evaluate shape modifications even in the presence of small sample sizes. The proposed statistics are then tested, using Monte Carlo simulations, on both normal and log-normal PDFs that mimic the sEMG PDF shape behavior observed during muscle contraction. According to the obtained results, the functional statistics appear more robust to small-sample effects than HOS parameters and more accurate in sEMG PDF shape screening applications.

  8. 49 CFR 1542.301 - Contingency plan.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 9 2010-10-01 2010-10-01 false Contingency plan. 1542.301 Section 1542.301..., DEPARTMENT OF HOMELAND SECURITY CIVIL AVIATION SECURITY AIRPORT SECURITY Contingency Measures § 1542.301 Contingency plan. (a) Each airport operator required to have a security program under § 1542.103(a) and (b...

  9. Patterns of medicinal plant use: an examination of the Ecuadorian Shuar medicinal flora using contingency table and binomial analyses.

    Science.gov (United States)

    Bennett, Bradley C; Husby, Chad E

    2008-03-28

    Botanical pharmacopoeias are non-random subsets of floras, with some taxonomic groups over- or under-represented. Moerman [Moerman, D.E., 1979. Symbols and selectivity: a statistical analysis of Native American medical ethnobotany, Journal of Ethnopharmacology 1, 111-119] introduced linear regression/residual analysis to examine these patterns. However, regression, the commonly employed analysis, suffers from several statistical flaws. We use contingency table and binomial analyses to examine patterns of Shuar medicinal plant use (from Amazonian Ecuador). We first analyzed the Shuar data using Moerman's approach, modified to better meet the requirements of linear regression analysis. Second, we assessed the exact randomization contingency table test for goodness of fit. Third, we developed a binomial model to test for non-random selection of plants in individual families. Modified regression models (which accommodated assumptions of linear regression) reduced R(2) from 0.59 to 0.38, but did not eliminate all problems associated with regression analyses. Contingency table analyses revealed that the entire flora departs from the null model of equal proportions of medicinal plants in all families. In the binomial analysis, only 10 angiosperm families (of 115) differed significantly from the null model. These 10 families are largely responsible for patterns seen at higher taxonomic levels. Contingency table and binomial analyses offer an easy and statistically valid alternative to the regression approach.
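The binomial analysis described here asks whether an individual family contains more medicinal species than random selection at the flora-wide rate would predict. A minimal Python sketch with hypothetical counts (not the Shuar data):

```python
from math import comb

def binomial_tail(k: int, n: int, p: float) -> float:
    """One-sided P(X >= k) for X ~ Binomial(n, p): the probability of
    seeing at least k medicinal species in a family of n species if
    species were selected at the flora-wide rate p."""
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(k, n + 1))

# Hypothetical counts: a flora-wide medicinal rate of 10%, and one
# family with 12 medicinal species out of 40.
p_tail = binomial_tail(12, 40, 0.10)
print(p_tail < 0.05)  # True: the family is significantly over-represented
```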

  10. Some Tests for Evaluation of Contingency Tables (for Biomedical Applications)

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2011-01-01

    Roč. 7, č. 1 (2011), s. 37-50 ISSN 1336-9180 R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords: contingency tables * hypothesis testing Subject RIV: BB - Applied Statistics, Operational Research http://jamsi.fpv.ucm.sk/docs/v07n01_05_2011/v07_n01_03_KALINA.pdf

  11. The Role of the Rat Medial Prefrontal Cortex in Adapting to Changes in Instrumental Contingency

    Science.gov (United States)

    Coutureau, Etienne; Esclassan, Frederic; Di Scala, Georges; Marchand, Alain R.

    2012-01-01

    In order to select actions appropriate to current needs, a subject must identify relationships between actions and events. Control over the environment is determined by the degree to which action consequences can be predicted, as described by action-outcome contingencies – i.e. performing an action should affect the probability of the outcome. In a first experiment, we evaluated adaptation to contingency changes in rats with neurotoxic lesions of the medial prefrontal cortex. Results indicate that this brain region is not critical to adjust instrumental responding to a negative contingency where the rats must refrain from pressing a lever, as this action prevents reward delivery. By contrast, this brain region is required to reduce responding in a non-contingent situation where the same number of rewards is freely delivered and actions no longer affect the outcome. In a second experiment, we determined that this effect does not result from a different perception of temporal relationships between actions and outcomes, since lesioned rats adapted normally to gradually increasing delays in reward delivery. These data indicate that the medial prefrontal cortex is not directly involved in evaluating the correlation between action- and reward-rates or in the perception of reward delays. The deficit in lesioned rats appears to consist of an abnormal response to the balance between contingent and non-contingent rewards. By highlighting the role of prefrontal regions in adapting to the causal status of actions, these data contribute to our understanding of the neural basis of choice tasks. PMID:22496747
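The action-outcome contingency manipulated in such experiments is commonly summarized as ΔP, the difference between the outcome probability given the action and given its absence. A small Python sketch of the three schedule types described above; the probabilities are invented for illustration:

```python
def delta_p(p_outcome_given_action: float, p_outcome_given_no_action: float) -> float:
    """Action-outcome contingency: DeltaP = P(O|A) - P(O|~A)."""
    return p_outcome_given_action - p_outcome_given_no_action

# Positive contingency: pressing the lever earns reward.
print(delta_p(0.75, 0.00))  # 0.75

# Non-contingent: free rewards match the earned rate, action is irrelevant.
print(delta_p(0.75, 0.75))  # 0.0

# Negative contingency: pressing prevents an otherwise likely reward.
print(delta_p(0.00, 0.75))  # -0.75
```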

  12. ACCOUNTING FOR CONTINGENT CONSIDERATIONS IN BUSINESS COMBINATIONS

    Directory of Open Access Journals (Sweden)

    Gurgen KALASHYAN

    2017-07-01

    Full Text Available According to IFRS 3 Business Combinations, contingent considerations must be included in the total consideration given for the acquired entity, along with cash, other assets, ordinary or preference equity instruments, options, and warrants. The contingent consideration is a determined amount which the acquiring entity has to pay to the acquired entity provided that certain conditions are fulfilled in the future. If the provisions are not satisfied, we get a situation where the amount of contingent consideration has been included in the total consideration given in the business combination, but in fact the acquirer has not paid that amount. In turn, the acquired entity will recognize the contingent consideration as a financial asset according to IFRS 9 Financial Instruments. In that case, it would be appropriate to recognize the contingent consideration as a contingent asset applying IAS 37. In the article the author explores the challenges of contingent consideration accounting and suggests ways of solving the above-mentioned problems.

  13. Contingency management: perspectives of Australian service providers.

    Science.gov (United States)

    Cameron, Jacqui; Ritter, Alison

    2007-03-01

    Given the very positive and extensive research evidence demonstrating the efficacy and effectiveness of contingency management, it is important that Australia explore whether contingency management has a role to play in our own treatment context. Qualitative interviews were conducted with 30 experienced alcohol and drug practitioners, service managers and policy-makers in Victoria. Interviewees were selected to represent the range of drug treatment service types and included rural representation. A semi-structured interview schedule covering their perceptions and practices of contingency management was used. All interviews were transcribed verbatim and analysed using the N2 qualitative data analysis program. The majority of key informants were positively inclined toward contingency management, notwithstanding some concerns about its philosophical underpinnings. Concerns were raised in relation to the use of monetary rewards. Examples of the use of contingency management provided by key informants demonstrated an over-inclusive definition: not all the examples adhered to the key principles of contingency management. This may create problems if structured contingency management were to be introduced in Australia. Contingency management is an important adjunctive treatment intervention and its use in Australia has the potential to enhance treatment outcomes. No unmanageable barriers were identified in this study.

  14. A scan statistic for binary outcome based on hypergeometric probability model, with an application to detecting spatial clusters of Japanese encephalitis.

    Science.gov (United States)

    Zhao, Xing; Zhou, Xiao-Hua; Feng, Zijian; Guo, Pengfei; He, Hongyan; Zhang, Tao; Duan, Lei; Li, Xiaosong

    2013-01-01

    As a useful tool for geographical cluster detection of events, the spatial scan statistic is widely applied in many fields and plays an increasingly important role. The classic version of the spatial scan statistic for binary outcomes was developed by Kulldorff, based on the Bernoulli or the Poisson probability model. In this paper, we apply the hypergeometric probability model to construct the likelihood function under the null hypothesis. Compared with existing methods, the likelihood function under the null hypothesis is an alternative and indirect way to identify the potential cluster, and the test statistic is the extreme value of the likelihood function. As with Kulldorff's methods, we adopt a Monte Carlo test for the test of significance. Both methods are applied to detecting spatial clusters of Japanese encephalitis in Sichuan province, China, in 2009, and the detected clusters are identical. A simulation on independent benchmark data indicates that the test statistic based on the hypergeometric model outperforms Kulldorff's statistics for clusters of high population density or large size; otherwise Kulldorff's statistics are superior.
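The hypergeometric null model in this abstract can be evaluated directly for a single candidate cluster. A minimal Python sketch with invented counts (not the Sichuan data); the full scan statistic would maximize such a likelihood over many candidate windows and calibrate significance by Monte Carlo:

```python
from math import comb

def hypergeom_pmf(k: int, N: int, K: int, n: int) -> float:
    """P(X = k): k cases fall inside a window covering n of N people,
    when K cases exist in total and the null of no clustering holds."""
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

def upper_tail(k: int, N: int, K: int, n: int) -> float:
    """P(X >= k): how surprising the observed in-window count is."""
    return sum(hypergeom_pmf(i, N, K, n) for i in range(k, min(K, n) + 1))

# Invented counts: 10,000 residents, 50 cases overall, and a candidate
# window of 500 residents containing 9 cases (expected ~2.5 under the null).
print(upper_tail(9, 10_000, 50, 500) < 0.01)  # True: a strong candidate cluster
```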

  15. Alternative Forms of Fit in Contingency Theory.

    Science.gov (United States)

    Drazin, Robert; Van de Ven, Andrew H.

    1985-01-01

    This paper examines the selection, interaction, and systems approaches to fit in structural contingency theory. The concepts of fit evaluated may be applied not only to structural contingency theory but to contingency theories in general. (MD)

  16. Intelligent tutorial system for teaching of probability and statistics at high school in Mexico

    Directory of Open Access Journals (Sweden)

    Fernando Gudino Penaloza, Miguel Gonzalez Mendoza, Neil Hernandez Gress, Jaime Mora Vargas

    2009-12-01

    Full Text Available This paper describes the implementation of an intelligent tutoring system dedicated to teaching probability and statistics at the preparatory school (or high school) level in Mexico. The system was developed as a desktop solution and adapted to a mobile environment for the implementation of mobile learning, or m-learning. The system is designed to be adaptable to the needs of each student and can adapt to three different teaching models that meet the criteria of three student profiles.

  17. Appendix F - Sample Contingency Plan

    Science.gov (United States)

    This sample Contingency Plan in Appendix F is intended to provide examples of contingency planning as a reference when a facility determines that the required secondary containment is impracticable, pursuant to 40 CFR §112.7(d).

  18. Thevenin Equivalent Method for Dynamic Contingency Assessment

    DEFF Research Database (Denmark)

    Møller, Jakob Glarbo; Jóhannsson, Hjörtur; Østergaard, Jacob

    2015-01-01

    A method that exploits Thevenin equivalent representation for obtaining post-contingency steady-state nodal voltages is integrated with a method for detecting post-contingency aperiodic small-signal instability. The task of integrating stability assessment with contingency assessment is challenged by cases of unstable post-contingency conditions, for which there exists no credible steady state that can serve as the basis of a stability assessment. This paper demonstrates how Thevenin equivalent methods can be applied in an algebraic representation of such bifurcation points, which may be used in the assessment of post-contingency aperiodic small-signal stability. The assessment method is introduced with a numeric example.

  19. Yampa River Valley sub-area contingency plan

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-08-01

    The Yampa River Valley sub-area contingency plan (Contingency Plan) has been prepared for two counties in northwestern Colorado: Moffat County and Routt County. The Contingency Plan is provided in two parts: the Contingency Plan itself and the Emergency Response Action Plan (ERAP). The Contingency Plan provides information that should be helpful in planning to minimize the impact of an oil spill or hazardous material incident. It contains discussions of planning and response roles, hazards identification, vulnerability analysis, risk analysis, cleanup, cost recovery, training, and health and safety. It includes information on the incident command system, notifications, response capabilities, emergency response organizations, evacuation and shelter-in-place, and immediate actions.

  20. 40 CFR 265.54 - Amendment of contingency plan.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 25 2010-07-01 2010-07-01 false Amendment of contingency plan. 265.54... DISPOSAL FACILITIES Contingency Plan and Emergency Procedures § 265.54 Amendment of contingency plan. The contingency plan must be reviewed, and immediately amended, if necessary, whenever: (a) Applicable regulations...

  1. 40 CFR 264.54 - Amendment of contingency plan.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 25 2010-07-01 2010-07-01 false Amendment of contingency plan. 264.54 Section 264.54 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES... Contingency Plan and Emergency Procedures § 264.54 Amendment of contingency plan. The contingency plan must be...

  2. Sufficient Statistics for Divergence and the Probability of Misclassification

    Science.gov (United States)

    Quirein, J.

    1972-01-01

    One particular aspect of the feature selection problem is considered, namely that which results from the transformation x = Bz, where B is a k by n matrix of rank k and k ≤ n. It is shown that, in general, such a transformation results in a loss of information. In terms of the divergence, this is equivalent to the fact that the average divergence computed using the variable x is less than or equal to the average divergence computed using the variable z. A loss of information in terms of the probability of misclassification is shown to be equivalent to the fact that the probability of misclassification computed using the variable x is greater than or equal to the probability of misclassification computed using the variable z. First, the necessary facts relating k-dimensional and n-dimensional integrals are derived. Then the stated results about the divergence and the probability of misclassification are derived. Finally, it is shown that if no information is lost (in x = Bz) as measured by the divergence, then no information is lost as measured by the probability of misclassification.
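
    The information-loss claim can be illustrated numerically. The sketch below uses the KL divergence between two Gaussian class densities (an illustration, not the paper's derivation, which concerns the average divergence): projecting z down to x = Bz cannot increase the divergence between the two classes.

    ```python
    def kl_gauss_identity(mu1, mu2):
        """KL divergence between N(mu1, I) and N(mu2, I): 0.5 * ||mu1 - mu2||^2."""
        return 0.5 * sum((a - b) ** 2 for a, b in zip(mu1, mu2))

    # Two class-conditional densities in z-space (n = 2 dimensions).
    mu1, mu2 = (1.0, 2.0), (0.0, 0.0)

    # x = Bz with B = [1 0] (rank k = 1, k <= n): keep only the first coordinate.
    d_z = kl_gauss_identity(mu1, mu2)            # divergence using z: 2.5
    d_x = kl_gauss_identity(mu1[:1], mu2[:1])    # divergence using x = Bz: 0.5

    # The projection discards the second coordinate's contribution, so d_x <= d_z.
    assert d_x <= d_z
    ```
    
    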

  3. Network location theory and contingency planning

    Energy Technology Data Exchange (ETDEWEB)

    Hakimi, S L

    1983-08-01

    A brief survey of results in network location theory is first presented. Then, a systems view of contingency planning is described. Finally, some results in location theory are re-examined and it is shown that they are motivated by contingency planning considerations. Some new issues and problems in location theory are described, which, if properly tackled, will have a substantial impact on contingency planning in transportation.

  4. Binomial distribution of Poisson statistics and tracks overlapping probability to estimate total tracks count with low uncertainty

    International Nuclear Information System (INIS)

    Khayat, Omid; Afarideh, Hossein; Mohammadnia, Meisam

    2015-01-01

    In solid state nuclear track detectors of the chemically etched type, nuclear tracks whose center-to-center distance is shorter than twice the track radius emerge as overlapping tracks. Track overlapping in this type of detector causes track count losses and becomes rather severe at high track densities. Therefore, track counting in this condition should include a correction factor for the count losses of different track overlapping orders, since a number of overlapping tracks may be counted as one track. Another aspect of the problem is the case where imaging the whole area of the detector and counting all tracks are not possible. In these conditions a statistical generalization method is desired, so that counts from a segmented area of the detector can be generalized to the whole surface of the detector. There is also a challenge in counting densely overlapped tracks, because insufficient geometrical or contextual information is available. In this paper we present a statistical counting method which gives the user a relation between the track overlapping probabilities on a segmented area of the detector surface and the total number of tracks. To apply the proposed method, one can estimate the total number of tracks on a solid state detector of arbitrary shape and dimensions by approximating the average track area, the whole detector surface area, and some orders of track overlapping probabilities. It is shown that this method is applicable to high and ultra-high density track images and that the count loss error can be mitigated using a statistical generalization approach. - Highlights: • A correction factor for count losses of different tracks overlapping orders. • For the cases imaging the whole area of the detector is not possible. • Presenting a statistical generalization method for segmented areas. • Giving a relation between the tracks overlapping probabilities and the total tracks
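
    The flavor of such overlap corrections can be seen in the standard Boolean (Poisson) coverage model, sketched below. This is a textbook relation and not the authors' specific correction: if tracks of average area a fall as a spatial Poisson process with density rho, the covered area fraction is C = 1 − exp(−rho·a), which can be inverted to estimate the total count from a measured coverage on a segment.

    ```python
    import math

    def total_tracks_from_coverage(coverage, avg_track_area, detector_area):
        """Invert C = 1 - exp(-rho * a) for the Poisson (Boolean) coverage model
        to estimate track density rho, then scale to the whole detector."""
        rho = -math.log(1.0 - coverage) / avg_track_area
        return rho * detector_area

    # Round trip: true density 100 tracks per unit area, average track area 0.001.
    rho_true, a = 100.0, 0.001
    coverage = 1.0 - math.exp(-rho_true * a)      # ~0.095 covered fraction
    n_hat = total_tracks_from_coverage(coverage, a, detector_area=10.0)
    # recovers rho_true * detector_area = 1000 tracks
    ```
    
    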

  5. 30 CFR 218.152 - Fishermen's Contingency Fund.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 2 2010-07-01 2010-07-01 false Fishermen's Contingency Fund. 218.152 Section 218.152 Mineral Resources MINERALS MANAGEMENT SERVICE, DEPARTMENT OF THE INTERIOR MINERALS REVENUE..., Offshore § 218.152 Fishermen's Contingency Fund. Upon the establishment of the Fishermen's Contingency Fund...

  6. 40 CFR 265.53 - Copies of contingency plan.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 25 2010-07-01 2010-07-01 false Copies of contingency plan. 265.53... DISPOSAL FACILITIES Contingency Plan and Emergency Procedures § 265.53 Copies of contingency plan. A copy of the contingency plan and all revisions to the plan must be: (a) Maintained at the facility; and (b...

  7. Tethered Satellite System Contingency Investigation Board

    Science.gov (United States)

    1992-11-01

    The Tethered Satellite System (TSS-1) was launched aboard the Space Shuttle Atlantis (STS-46) on July 31, 1992. During the attempted on-orbit operations, the Tethered Satellite System failed to deploy successfully beyond 256 meters. The satellite was retrieved successfully and was returned on August 6, 1992. The National Aeronautics and Space Administration (NASA) Associate Administrator for Space Flight formed the Tethered Satellite System (TSS-1) Contingency Investigation Board on August 12, 1992. The TSS-1 Contingency Investigation Board was asked to review the anomalies which occurred, to determine the probable cause, and to recommend corrective measures to prevent recurrence. The board was supported by the TSS Systems Working group as identified in MSFC-TSS-11-90, 'Tethered Satellite System (TSS) Contingency Plan'. The board identified five anomalies for investigation: initial failure to retract the U2 umbilical; initial failure to flyaway; unplanned tether deployment stop at 179 meters; unplanned tether deployment stop at 256 meters; and failure to move tether in either direction at 224 meters. Initial observations of the returned flight hardware revealed evidence of mechanical interference by a bolt with the level wind mechanism travel as well as a helical shaped wrap of tether which indicated that the tether had been unwound from the reel beyond the travel by the level wind mechanism. Examination of the detailed mission events from flight data and mission logs related to the initial failure to flyaway and the failure to move in either direction at 224 meters, together with known preflight concerns regarding slack tether, focused the assessment of these anomalies on the upper tether control mechanism. After the second meeting, the board requested the working group to complete and validate a detailed integrated mission sequence to focus the fault tree analysis on a stuck U2 umbilical, level wind mechanical interference, and slack tether in upper tether

  8. Equilibria of perceptrons for simple contingency problems.

    Science.gov (United States)

    Dawson, Michael R W; Dupuis, Brian

    2012-08-01

    The contingency between cues and outcomes is fundamentally important to theories of causal reasoning and to theories of associative learning. Researchers have computed the equilibria of Rescorla-Wagner models for a variety of contingency problems, and have used these equilibria to identify situations in which the Rescorla-Wagner model is consistent, or inconsistent, with normative models of contingency. Mathematical analyses that directly compare artificial neural networks to contingency theory have not been performed, because of the assumed equivalence between the Rescorla-Wagner learning rule and the delta rule training of artificial neural networks. However, recent results indicate that this equivalence is not as straightforward as typically assumed, suggesting a strong need for mathematical accounts of how networks deal with contingency problems. One such analysis is presented here, where it is proven that the structure of the equilibrium for a simple network trained on a basic contingency problem is quite different from the structure of the equilibrium for a Rescorla-Wagner model faced with the same problem. However, these structural differences lead to functionally equivalent behavior. The implications of this result for the relationships between associative learning, contingency theory, and connectionism are discussed.
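
    The Rescorla-Wagner side of the comparison can be made concrete. A minimal sketch (the contingency values, learning rate, and expected-update iteration are illustrative assumptions, not the authors' analysis): iterating the expected Rescorla-Wagner updates for a cue plus an always-present context drives the cue's weight to the objective contingency ΔP = P(O|cue) − P(O|no cue).

    ```python
    # Expected Rescorla-Wagner updates for a basic 2x2 contingency problem:
    # cue present on half of trials; P(outcome|cue) = 0.8, P(outcome|no cue) = 0.2.
    p_cue, p_o_cue, p_o_nocue = 0.5, 0.8, 0.2
    alpha = 0.1               # learning rate (illustrative)
    v_cue, v_ctx = 0.0, 0.0   # associative strengths of cue and context

    for _ in range(5000):
        err_cue = p_o_cue - (v_cue + v_ctx)   # prediction error on cue trials
        err_ctx = p_o_nocue - v_ctx           # prediction error on context-only trials
        v_cue += alpha * p_cue * err_cue
        v_ctx += alpha * (p_cue * err_cue + (1 - p_cue) * err_ctx)

    delta_p = p_o_cue - p_o_nocue
    # At equilibrium the cue's weight matches the objective contingency:
    # v_cue -> 0.6 = delta_p, v_ctx -> 0.2
    ```
    
    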

  9. Nonparametric statistics for social and behavioral sciences

    CERN Document Server

    Kraska-MIller, M

    2013-01-01

    Introduction to Research in Social and Behavioral Sciences; Basic Principles of Research; Planning for Research; Types of Research Designs; Sampling Procedures; Validity and Reliability of Measurement Instruments; Steps of the Research Process; Introduction to Nonparametric Statistics; Data Analysis; Overview of Nonparametric Statistics and Parametric Statistics; Overview of Parametric Statistics; Overview of Nonparametric Statistics; Importance of Nonparametric Methods; Measurement Instruments; Analysis of Data to Determine Association and Agreement; Pearson Chi-Square Test of Association and Independence; Contingency

  10. 50 CFR 296.3 - Fishermen's contingency fund.

    Science.gov (United States)

    2010-10-01

    ... 50 Wildlife and Fisheries 7 2010-10-01 2010-10-01 false Fishermen's contingency fund. 296.3... ADMINISTRATION, DEPARTMENT OF COMMERCE CONTINENTAL SHELF FISHERMEN'S CONTINGENCY FUND § 296.3 Fishermen's contingency fund. (a) General. There is established in the Treasury of the United States the Fishermen's...

  11. The intensity detection of single-photon detectors based on photon counting probability density statistics

    International Nuclear Information System (INIS)

    Zhang Zijing; Song Jie; Zhao Yuan; Wu Long

    2017-01-01

    Single-photon detectors possess ultra-high sensitivity, but they cannot directly respond to signal intensity. Conventional methods adopt sampling gates of fixed width and count the number of triggered sampling gates, from which the photon counting probability can be obtained to estimate the echo signal intensity. In this paper, we not only count the number of triggered sampling gates but also record the triggered time positions of the photon counting pulses. The photon counting probability density distribution is obtained through the statistics of a series of triggered time positions. Then the Minimum Variance Unbiased Estimation (MVUE) method is used to estimate the echo signal intensity. Compared with conventional methods, this method improves the estimation accuracy of the echo signal intensity because more detection information is acquired. Finally, a proof-of-principle laboratory system is established. The estimation accuracy of the echo signal intensity is discussed, and a high-accuracy intensity image is acquired under low-light-level environments. (paper)
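
    The conventional gate-counting estimate that the abstract contrasts against can be sketched as follows (an illustration of the baseline method only; the gate count and signal level are made up, and the paper's MVUE refinement over triggered time positions is not reproduced): for Poisson-distributed photon arrivals with mean λ per gate, the trigger probability is p = 1 − exp(−λ), so λ can be estimated from the fraction of triggered gates.

    ```python
    import math
    import random

    random.seed(42)
    lam_true = 0.5                 # mean photons per sampling gate (illustrative)
    n_gates = 200_000

    # A gate triggers if at least one photon arrives: P(trigger) = 1 - exp(-lam).
    triggered = sum(random.random() < 1.0 - math.exp(-lam_true)
                    for _ in range(n_gates))

    p_hat = triggered / n_gates
    lam_hat = -math.log(1.0 - p_hat)   # invert the trigger probability
    # lam_hat ~ 0.5 for a large number of gates
    ```
    
    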

  12. Motor contingency learning and infants with Spina Bifida.

    Science.gov (United States)

    Taylor, Heather B; Barnes, Marcia A; Landry, Susan H; Swank, Paul; Fletcher, Jack M; Huang, Furong

    2013-02-01

    Infants with Spina Bifida (SB) were compared to typically developing infants (TD) using a conjugate reinforcement paradigm at 6 months-of-age (n = 98) to evaluate learning, and retention of a sensory-motor contingency. Analyses evaluated infant arm-waving rates at baseline (wrist not tethered to mobile), during acquisition of the sensory-motor contingency (wrist tethered), and immediately after the acquisition phase and then after a delay (wrist not tethered), controlling for arm reaching ability, gestational age, and socioeconomic status. Although both groups responded to the contingency with increased arm-waving from baseline to acquisition, 15% to 29% fewer infants with SB than TD were found to learn the contingency depending on the criterion used to determine contingency learning. In addition, infants with SB who had learned the contingency had more difficulty retaining the contingency over time when sensory feedback was absent. The findings suggest that infants with SB do not learn motor contingencies as easily or at the same rate as TD infants, and are more likely to decrease motor responses when sensory feedback is absent. Results are discussed with reference to research on contingency learning in infants with and without neurodevelopmental disorders, and with reference to motor learning in school-age children with SB.

  13. 10 CFR 72.184 - Safeguards contingency plan.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Safeguards contingency plan. 72.184 Section 72.184 Energy... Protection § 72.184 Safeguards contingency plan. (a) The requirements of the licensee's safeguards contingency plan for responding to threats and radiological sabotage must be as defined in appendix C to part...

  14. Searching for Plausible N-k Contingencies Endangering Voltage Stability

    DEFF Research Database (Denmark)

    Weckesser, Johannes Tilman Gabriel; Van Cutsem, Thierry

    2017-01-01

    This paper presents a novel search algorithm that uses time-domain simulations to identify plausible N − k contingencies endangering voltage stability. Starting from an initial list of disturbances, progressively more severe contingencies are investigated. After simulation of an N − k contingency, the simulation results are assessed. If the system response is unstable, a plausible harmful contingency sequence has been found. Otherwise, components affected by the contingencies are considered as candidate next events leading to N − (k + 1) contingencies. This implicitly takes into account hidden failures.
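
    The search logic described above can be sketched generically. In the toy code below the grid model and stability check are stand-ins (simulate_and_assess, the component names, and the unstable pair are all hypothetical), but the expansion rule follows the abstract: simulate a sequence, report it if unstable, otherwise extend it with the components the sequence affected.

    ```python
    def find_harmful_sequences(initial_events, simulate_and_assess, max_k):
        """Breadth-first search over N-k contingency sequences.

        simulate_and_assess(seq) must return (is_unstable, affected_components).
        """
        harmful, frontier = [], [(e,) for e in initial_events]
        for _ in range(max_k):
            next_frontier = []
            for seq in frontier:
                unstable, affected = simulate_and_assess(seq)
                if unstable:
                    harmful.append(seq)        # plausible harmful N-k sequence
                else:
                    # affected components become candidate next events: N-(k+1)
                    next_frontier += [seq + (c,) for c in affected if c not in seq]
            frontier = next_frontier
        return harmful

    # Toy system: losing line L1 then L2 is unstable; L1 affects L2 and L3.
    def toy_sim(seq):
        if seq == ("L1", "L2"):
            return True, []
        affected = {"L1": ["L2", "L3"], "L2": ["L3"], "L3": []}[seq[-1]]
        return False, affected

    print(find_harmful_sequences(["L1", "L2"], toy_sim, max_k=2))
    # → [('L1', 'L2')]
    ```
    
    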

  15. Psychophysics of associative learning: Quantitative properties of subjective contingency.

    Science.gov (United States)

    Maia, Susana; Lefèvre, Françoise; Jozefowiez, Jérémie

    2018-01-01

    Allan and collaborators (Allan, Hannah, Crump, & Siegel, 2008; Allan, Siegel, & Tangen, 2005; Siegel, Allan, Hannah, & Crump, 2009) recently proposed to apply signal detection theory to the analysis of contingency judgment tasks. When exposed to a flow of stimuli, participants are asked to judge whether there is a contingent relation between a cue and an outcome, that is, whether the subjective cue-outcome contingency exceeds a decision threshold. In this context, we tested the following hypotheses regarding the relation between objective and subjective cue-outcome contingency: (a) The underlying distributions of subjective cue-outcome contingency are Gaussian; (b) The mean distribution of subjective contingency is a linear function of objective cue-outcome contingency; and (c) The variance in the distribution of subjective contingency is constant. The hypotheses were tested by combining a streamed-trial contingency assessment task with a confidence rating procedure. Participants were exposed to rapid flows of stimuli at the end of which they had to judge whether an outcome was more (Experiment 1) or less (Experiment 2) likely to appear following a cue and how sure they were of their judgment. We found that although Hypothesis A seems reasonable, Hypotheses B and C were not. Regarding Hypothesis B, participants were more sensitive to positive than to negative contingencies. Regarding Hypothesis C, the perceived cue-outcome contingency became more variable when the contingency became more positive or negative, but only to a slight extent. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
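
    A sketch of the signal-detection framing being tested (the slope, variance, and criterion values are illustrative, not fitted parameters): if subjective contingency is Gaussian with a mean linear in the objective contingency ΔP and constant variance, the probability of a "contingent" judgment is the Gaussian tail beyond the decision threshold.

    ```python
    import math

    def p_judge_contingent(delta_p, slope=2.0, sigma=1.0, criterion=0.0):
        """P(subjective contingency > criterion) under Hypotheses A-C:
        subjective contingency ~ N(slope * delta_p, sigma^2)."""
        z = (slope * delta_p - criterion) / sigma
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    # Zero contingency at an unbiased criterion yields chance responding.
    assert abs(p_judge_contingent(0.0) - 0.5) < 1e-12
    # Stronger positive contingency -> higher probability of "contingent".
    assert p_judge_contingent(0.5) > p_judge_contingent(0.2) > 0.5
    ```

    The experiments reported above found that Hypotheses B (linearity) and C (constant variance) do not hold exactly, so this linear-Gaussian form should be read as the null model under test, not the fitted result.
    
    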

  16. Sampling, Probability Models and Statistical Reasoning -RE ...

    Indian Academy of Sciences (India)

    random sampling allows data to be modelled with the help of probability ... g based on different trials to get an estimate of the experimental error. ... research interests lie in the .... if e is indeed the true value of the proportion of defectives in the.

  17. 48 CFR 225.7303-4 - Contingent fees.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Contingent fees. 225.7303....7303-4 Contingent fees. (a) Except as provided in paragraph (b) of this subsection, contingent fees are generally allowable under DoD contracts, provided— (1) The fees are paid to a bona fide employee or a bona...

  18. 40 CFR 264.51 - Purpose and implementation of contingency plan.

    Science.gov (United States)

    2010-07-01

    ... contingency plan. 264.51 Section 264.51 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED... DISPOSAL FACILITIES Contingency Plan and Emergency Procedures § 264.51 Purpose and implementation of contingency plan. (a) Each owner or operator must have a contingency plan for his facility. The contingency...

  19. Contingency Theories of Leadership: A Study.

    Science.gov (United States)

    Saha, Sunhir K.

    1979-01-01

    Some of the major contingency theories of leadership are reviewed; some results from the author's study of Fiedler's contingency model are reported; and some thoughts for the future of leadership research are provided. (Author/MLF)

  20. Understanding advanced statistical methods

    CERN Document Server

    Westfall, Peter

    2013-01-01

    Introduction: Probability, Statistics, and Science; Reality, Nature, Science, and Models; Statistical Processes: Nature, Design and Measurement, and Data; Models; Deterministic Models; Variability; Parameters; Purely Probabilistic Statistical Models; Statistical Models with Both Deterministic and Probabilistic Components; Statistical Inference; Good and Bad Models; Uses of Probability Models; Random Variables and Their Probability Distributions; Introduction; Types of Random Variables: Nominal, Ordinal, and Continuous; Discrete Probability Distribution Functions; Continuous Probability Distribution Functions; Some Calculus: Derivatives and Least Squares; More Calculus: Integrals and Cumulative Distribution Functions; Probability Calculation and Simulation; Introduction; Analytic Calculations, Discrete and Continuous Cases; Simulation-Based Approximation; Generating Random Numbers; Identifying Distributions; Introduction; Identifying Distributions from Theory Alone; Using Data: Estimating Distributions via the Histogram; Quantiles: Theoretical and Data-Based Estimate...

  1. Contingency Management of Health Care Organizations: It Depends.

    Science.gov (United States)

    Olden, Peter C

    Managers in health care organizations (HCOs) must perform many processes and activities, such as planning goals, designing organization structure, leading people, motivating employees, making decisions, and resolving conflict. How they do all this strongly affects the performance and outcomes of their organizations and themselves. Some managers develop a usual way of performing their jobs and achieve some success with a preferred method of leading or a favorite approach to motivating. However, their success will be limited if they always rely on a standard "1-size-fits-all" approach. This is because contingency factors influence the effectiveness of a given approach to managing. The "best" approach depends on contingency factors, including the situation and the people involved. Managers should choose an approach to fit with the changing contingency factors. This article explains why and how managers should develop a contingency approach to managing HCOs. The development of contingency theory is briefly described. Practical application of contingency management is explained for leading, motivating, decision making, and resolving conflict. By using a contingency approach, managers can be more effective when managing their HCOs.

  2. [Contingencies of self-worth in Japanese culture: validation of the Japanese contingencies of self-worth scale].

    Science.gov (United States)

    Uchida, Yukiko

    2008-08-01

    The author developed a Japanese version of the Contingencies of Self-Worth Scale (CSWS) that was originally developed in the United States (Crocker, Luhtanen, Cooper, & Bouvrette, 2003). The Japanese version of the scale measures seven contingencies of self-esteem: Defeating others in competition, appearance, relationship harmony, other's approval, academic competence, virtue, and support of family and friends. Scores on the scale had systematic relationships with related variables, and the scale therefore exhibited satisfactory levels of construct validity: Relationship harmony, other's approval, and support of family and friends were positively correlated with sympathy and interdependence, whereas competitiveness was negatively correlated with sympathy. Moreover, competitiveness and academic achievement contingencies predicted competitive motivation, whereas the support of family and friends contingency predicted self-sufficient motivation. The scale has adequate test-retest reliability and a seven-factor structural model was confirmed. The implications for self-esteem and interpersonal relationships in Japanese culture are discussed.

  3. Contingent Faculty Perceptions of Organizational Support, Workplace Attitudes, and Teaching Evaluations at a Public Research University

    Directory of Open Access Journals (Sweden)

    Min Young Cha

    2016-03-01

    Full Text Available This research examines contingent faculty's perceptions of organizational support, workplace attitudes, and Student Ratings of Teaching (SRT) in a large public research university in order to investigate their employee-organization relationship. According to t-tests and regression analyses for samples of 2,229 faculty and instructional staff who answered the survey and had SRT data (tenured and tenure-track faculty: 1,708, 76.6% of total; contingent faculty: 521, 23.4% of total), the employment relationship of contingent faculty in this institution was closer to a combined economic and social exchange model than to a pure economic exchange model or underinvestment model. Contingent faculty's satisfaction with work, satisfaction with coworkers, perception of being supported at work, and affective organizational commitment were higher than those of tenured and tenure-track faculty at a statistically significant level. In addition, contingent faculty had higher SRT means in all areas of SRT items in medium-size (10-30) classes, and in 'class presentation,' 'feedback,' 'deeper understanding,' and 'interest stimulated' in large-size (30-50) classes, than tenured and tenure-track faculty. These results not only refute the misconception that contingent faculty have too little time to provide students with feedback, but also support that they provide students with good teaching, at least in medium-size and large-size classes. Whereas these results might be partially attributable to the relatively stable status of contingent faculty in this study (who work at more than 50 percent FTE), they indicate that, as a collective, contingent faculty also represent a significant contributor to the university, who are satisfied with their work, enjoy the community they are in, and are committed to their institution.

  4. Inevitability, contingency, and epistemic humility.

    Science.gov (United States)

    Kidd, Ian James

    2016-02-01

    This paper offers an epistemological framework for the debate about whether the results of scientific enquiry are inevitable or contingent. I argue in Sections 2 and 3 that inevitabilist stances are doubly guilty of epistemic hubris--a lack of epistemic humility--and that the real question concerns the scope and strength of our contingentism. The latter stages of the paper-Sections 4 and 5-address some epistemological and historiographical worries and sketch some examples of deep contingencies to guide further debate. I conclude by affirming that the concept of epistemic humility can usefully inform critical reflection on the contingency of the sciences and the practice of history of science. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. [A factor analysis method for contingency table data with unlimited multiple choice questions].

    Science.gov (United States)

    Toyoda, Hideki; Haiden, Reina; Kubo, Saori; Ikehara, Kazuya; Isobe, Yurie

    2016-02-01

    The purpose of this study is to propose a factor analysis method for contingency tables developed from the data of unlimited multiple-choice questions. The method assumes that the element of each cell of the contingency table follows a binomial distribution, and a factor analysis model is applied to the logit of the selection probability. A scree plot and WAIC are used to decide the number of factors, and the standardized residual (the standardized difference between the sample proportion and the model-implied proportion) is used to select items. The proposed method was applied to real product impression research data on advertised chips and energy drinks. The results of the analysis showed that this method can be used in conjunction with the conventional factor analysis model, and the extracted factors were fully interpretable, suggesting the usefulness of the proposed method in psychological studies using unlimited multiple-choice questions.

  6. Quantile selection procedure and associated distribution of ratios of order statistics from a restricted family of probability distributions

    International Nuclear Information System (INIS)

    Gupta, S.S.; Panchapakesan, S.

    1975-01-01

    A quantile selection procedure for reliability problems pertaining to a restricted family of probability distributions is discussed. This family is assumed to be star-ordered with respect to the standard normal distribution folded at the origin. The motivation for this formulation of the problem is described. Both exact and asymptotic results dealing with the distribution of the maximum of ratios of order statistics from such a family are obtained, and tables of the appropriate constants, percentiles of this statistic, are given in order to facilitate the use of the selection procedure.

  7. 48 CFR 1318.201 - Contingency operation.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Contingency operation. 1318.201 Section 1318.201 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE CONTRACTING METHODS AND CONTRACT TYPES EMERGENCY ACQUISITIONS Emergency Acquisition Flexibilities 1318.201 Contingency...

  8. 7 CFR 457.9 - Appropriation contingency.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 6 2010-01-01 2010-01-01 false Appropriation contingency. 457.9 Section 457.9 Agriculture Regulations of the Department of Agriculture (Continued) FEDERAL CROP INSURANCE CORPORATION, DEPARTMENT OF AGRICULTURE COMMON CROP INSURANCE REGULATIONS § 457.9 Appropriation contingency...

  9. Relative speed of processing determines color-word contingency learning.

    Science.gov (United States)

    Forrin, Noah D; MacLeod, Colin M

    2017-10-01

    In three experiments, we tested a relative-speed-of-processing account of color-word contingency learning, a phenomenon in which color identification responses to high-contingency stimuli (words that appear most often in particular colors) are faster than those to low-contingency stimuli. Experiment 1 showed equally large contingency-learning effects whether responding was to the colors or to the words, likely due to slow responding to both dimensions because of the unfamiliar mapping required by the key press responses. For Experiment 2, participants switched to vocal responding, in which reading words is considerably faster than naming colors, and we obtained a contingency-learning effect only for color naming, the slower dimension. In Experiment 3, previewing the color information resulted in a reduced contingency-learning effect for color naming, but it enhanced the contingency-learning effect for word reading. These results are all consistent with contingency learning influencing performance only when the nominally irrelevant feature is faster to process than the relevant feature, and therefore are entirely in accord with a relative-speed-of-processing explanation.

  10. Statistical equivalence and test-retest reliability of delay and probability discounting using real and hypothetical rewards.

    Science.gov (United States)

    Matusiewicz, Alexis K; Carter, Anne E; Landes, Reid D; Yi, Richard

    2013-11-01

    Delay discounting (DD) and probability discounting (PD) refer to the reduction in the subjective value of outcomes as a function of delay and uncertainty, respectively. Elevated measures of discounting are associated with a variety of maladaptive behaviors, and confidence in the validity of these measures is imperative. The present research examined (1) the statistical equivalence of discounting measures when rewards were hypothetical or real, and (2) their 1-week reliability. While previous research has partially explored these issues using the low threshold of nonsignificant difference, the present study fully addressed this issue using the more-compelling threshold of statistical equivalence. DD and PD measures were collected from 28 healthy adults using real and hypothetical $50 rewards during each of two experimental sessions, one week apart. Analyses using area-under-the-curve measures revealed a general pattern of statistical equivalence, indicating equivalence of real/hypothetical conditions as well as 1-week reliability. Exceptions are identified and discussed. Copyright © 2013 Elsevier B.V. All rights reserved.

  11. Mini-Membrane Evaporator for Contingency Spacesuit Cooling

    Science.gov (United States)

    Makinen, Janice V.; Bue, Grant C.; Campbell, Colin; Petty, Brian; Craft, Jesse; Lynch, William; Wilkes, Robert; Vogel, Matthew

    2015-01-01

The next-generation Advanced Extravehicular Mobility Unit (AEMU) Portable Life Support System (PLSS) is integrating a number of new technologies to improve reliability and functionality. One of these improvements is the development of the Auxiliary Cooling Loop (ACL) for contingency crewmember cooling. The ACL is a completely redundant, independent cooling system that consists of a small evaporative cooler--the Mini Membrane Evaporator (Mini-ME)--an independent pump, an independent feedwater assembly, and an independent Liquid Cooling Garment (LCG). The Mini-ME utilizes the same hollow-fiber technology featured in the full-sized AEMU PLSS cooling device, the Spacesuit Water Membrane Evaporator (SWME), but occupies only approximately 25% of the volume of SWME, thereby providing only the cooling necessary in a contingency situation. The ACL provides a number of benefits compared with the current EMU PLSS contingency cooling technology, which relies upon a Secondary Oxygen Vessel (SOV): contingency crewmember cooling can be provided for a longer period of time; more contingency situations can be accounted for; there is no reliance on the SOV for contingency cooling, thereby allowing a reduction in SOV size and pressure; and the ACL can be recharged, allowing the AEMU PLSS to be reused even after a contingency event. The first iteration of Mini-ME was developed and tested in-house. Mini-ME is currently packaged in AEMU PLSS 2.0, where it is being tested in environments and situations that are representative of potential future Extravehicular Activities (EVAs). The second iteration, known as Mini-ME2, is currently being developed to offer more heat-rejection capability. The development of this contingency evaporative cooling system will contribute to a more robust and comprehensive AEMU PLSS.

  12. Parallel auto-correlative statistics with VTK.

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre; Bennett, Janine Camille

    2013-08-01

This report summarizes existing statistical engines in VTK and presents both the serial and parallel auto-correlative statistics engines. It is a sequel to [PT08, BPRT09b, PT09, BPT09, PT10], which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, and order statistics engines. The ease of use of the new parallel auto-correlative statistics engine is illustrated by means of C++ code snippets, and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the auto-correlative statistics engine.
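The report's C++ snippets for the VTK engine are not reproduced here, but the quantity an auto-correlative statistics engine computes is easy to illustrate. The sketch below (plain NumPy, not the VTK API) estimates the sample autocorrelation of a series at a given time lag:

```python
import numpy as np

def autocorrelation(x, lag):
    """Sample autocorrelation at a given lag: correlation of the
    mean-centred series with a time-shifted copy of itself,
    normalised by the total sum of squares."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    num = np.sum(x[:-lag] * x[lag:]) if lag > 0 else np.sum(x * x)
    return num / np.sum(x * x)

# A noisy sine is strongly self-correlated at its period and
# anti-correlated at half its period.
t = np.arange(200)
series = np.sin(2 * np.pi * t / 20) + 0.1 * np.random.default_rng(3).normal(size=200)
r_period = autocorrelation(series, 20)  # near +1
r_half = autocorrelation(series, 10)    # near -1
```

A parallel engine would compute the same per-lag sums in distributed partial aggregates and merge them, which is the scalability concern the report analyzes.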

  13. Test the Overall Significance of p-values by Using Joint Tail Probability of Ordered p-values as Test Statistic

    OpenAIRE

    Fang, Yongxiang; Wit, Ernst

    2008-01-01

Fisher’s combined probability test is the most commonly used method to test the overall significance of a set of independent p-values. However, Fisher’s statistic is clearly more sensitive to small p-values than to large ones, so a single small p-value may overrule the other p-values and decide the test result. This is, in some cases, viewed as a flaw. In order to overcome this flaw and improve the power of the test, the joint tail probability of a set of p-values is proposed as a ...
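Fisher's combined probability test, and the small-p-value sensitivity the abstract describes, can be illustrated with a short sketch: the statistic is X = -2 Σ ln p_i, which is chi-squared with 2k degrees of freedom under the null hypothesis.

```python
import numpy as np
from scipy import stats

def fisher_combined(pvalues):
    """Fisher's method: X = -2 * sum(ln p_i) ~ chi2 with 2k df under H0."""
    pvalues = np.asarray(pvalues, dtype=float)
    statistic = -2.0 * np.sum(np.log(pvalues))
    combined_p = stats.chi2.sf(statistic, 2 * len(pvalues))
    return statistic, combined_p

# One very small p-value dominates the combination even when the
# remaining p-values are unremarkable -- the flaw discussed above.
stat_a, p_a = fisher_combined([0.001, 0.9, 0.9, 0.9])
stat_b, p_b = fisher_combined([0.2, 0.2, 0.2, 0.2])
```

Here the first set combines to a larger statistic (smaller combined p) than the second, driven almost entirely by its single p = 0.001 entry.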

  14. Contingency learning in human fear conditioning involves the ventral striatum.

    Science.gov (United States)

    Klucken, Tim; Tabbert, Katharina; Schweckendiek, Jan; Merz, Christian Josef; Kagerer, Sabine; Vaitl, Dieter; Stark, Rudolf

    2009-11-01

    The ability to detect and learn contingencies between fearful stimuli and their predictive cues is an important capacity to cope with the environment. Contingency awareness refers to the ability to verbalize the relationships between conditioned and unconditioned stimuli. Although there is a heated debate about the influence of contingency awareness on conditioned fear responses, neural correlates behind the formation process of contingency awareness have gained only little attention in human fear conditioning. Recent animal studies indicate that the ventral striatum (VS) could be involved in this process, but in human studies the VS is mostly associated with positive emotions. To examine this question, we reanalyzed four recently published classical fear conditioning studies (n = 117) with respect to the VS at three distinct levels of contingency awareness: subjects, who did not learn the contingencies (unaware), subjects, who learned the contingencies during the experiment (learned aware) and subjects, who were informed about the contingencies in advance (instructed aware). The results showed significantly increased activations in the left and right VS in learned aware compared to unaware subjects. Interestingly, this activation pattern was only found in learned but not in instructed aware subjects. We assume that the VS is not involved when contingency awareness does not develop during conditioning or when contingency awareness is unambiguously induced already prior to conditioning. VS involvement seems to be important for the transition from a contingency unaware to a contingency aware state. Implications for fear conditioning models as well as for the contingency awareness debate are discussed.

  15. 40 CFR 264.227 - Emergency repairs; contingency plans.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 25 2010-07-01 2010-07-01 false Emergency repairs; contingency plans... FACILITIES Surface Impoundments § 264.227 Emergency repairs; contingency plans. (a) A surface impoundment... days after detecting the problem. (c) As part of the contingency plan required in subpart D of this...

  16. Development of a statistical model for the determination of the probability of riverbank erosion in a Meditteranean river basin

    Science.gov (United States)

    Varouchakis, Emmanouil; Kourgialas, Nektarios; Karatzas, George; Giannakis, Georgios; Lilli, Maria; Nikolaidis, Nikolaos

    2014-05-01

Riverbank erosion affects river morphology and the local habitat and results in riparian land loss, damage to property and infrastructure, ultimately weakening flood defences. An important issue concerning riverbank erosion is the identification of the areas vulnerable to erosion, as this allows changes to be predicted and assists with stream management and restoration. One way to predict the areas vulnerable to erosion is to determine the erosion probability by identifying the underlying relations between riverbank erosion and the geomorphological and/or hydrological variables that prevent or stimulate erosion. This work develops a statistical model that evaluates the probability of erosion from a series of independent local variables using logistic regression. The main variables affecting erosion are vegetation index (stability), the presence or absence of meanders, bank material (classification), stream power, bank height, riverbank slope, riverbed slope, cross-section width and water velocities (Luppi et al. 2009). Logistic regression is a type of regression analysis used for predicting the outcome of a categorical dependent variable, e.g. a binary response, based on one or more continuous or categorical predictor variables. The probabilities of the possible outcomes are modelled as a function of the independent variables using a logistic function, converting the dependent variable to probability scores. The fitted regression then predicts success or failure of a given binary variable (e.g. 1 = "presence of erosion" and 0 = "no erosion") for any value of the independent variables. The regression coefficients are estimated by maximum likelihood estimation.
The erosion occurrence probability can be calculated in conjunction with the model deviance regarding
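As a minimal illustration of the modelling step described above, the sketch below fits a logistic regression by maximum likelihood (Newton-Raphson/IRLS) on synthetic data. The predictor names and coefficients are illustrative assumptions, not the study's actual data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical standardized predictors per bank section: bank slope,
# stream power, and a vegetation index (stabilising, hence negative effect).
n = 500
X = rng.normal(size=(n, 3))
true_beta = np.array([1.5, 1.0, -2.0])
logits = X @ true_beta
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logits))).astype(float)  # 1 = erosion

def fit_logistic(X, y, iters=25):
    """Maximum-likelihood logistic regression via Newton-Raphson (IRLS)."""
    Xd = np.column_stack([np.ones(len(X)), X])        # add intercept
    beta = np.zeros(Xd.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xd @ beta))
        W = p * (1.0 - p)                             # IRLS weights
        grad = Xd.T @ (y - p)
        hess = Xd.T @ (Xd * W[:, None])
        beta += np.linalg.solve(hess, grad)
    return beta

beta_hat = fit_logistic(X, y)

# Predicted erosion probability for a new cross-section
# (intercept term followed by the three predictor values):
x_new = np.array([1.0, 0.5, 0.2, -1.0])
p_erosion = 1.0 / (1.0 + np.exp(-x_new @ beta_hat))
```

The fitted coefficients recover the assumed signs (erosion more likely on steep, high-power banks; less likely with dense vegetation), and the logistic function maps any new section's predictors to a probability score in (0, 1).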

  17. A new framework of statistical inferences based on the valid joint sampling distribution of the observed counts in an incomplete contingency table.

    Science.gov (United States)

    Tian, Guo-Liang; Li, Hui-Qiong

    2017-08-01

Some existing confidence interval methods and hypothesis testing methods in the analysis of a contingency table with incomplete observations in both margins entirely depend on an underlying assumption that the sampling distribution of the observed counts is a product of independent multinomial/binomial distributions for complete and incomplete counts. However, it can be shown that this independence assumption is incorrect and can result in unreliable conclusions because of the underestimation of the uncertainty. Therefore, the first objective of this paper is to derive the valid joint sampling distribution of the observed counts in a contingency table with incomplete observations in both margins. The second objective is to provide a new framework for analyzing incomplete contingency tables based on the derived joint sampling distribution of the observed counts by developing a Fisher scoring algorithm to calculate maximum likelihood estimates of parameters of interest, bootstrap confidence interval methods, and bootstrap hypothesis-testing methods. We compare the differences between the valid sampling distribution and the sampling distribution under the independence assumption. Simulation studies showed that average/expected confidence-interval widths of parameters based on the sampling distribution under the independence assumption are shorter than those based on the new sampling distribution, yielding unrealistic results. A real data set is analyzed to illustrate the application of the new sampling distribution for incomplete contingency tables, and the analysis results again confirm the conclusions obtained from the simulation studies.
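The paper's valid joint sampling distribution is not reproduced here, but the generic bootstrap confidence-interval machinery it feeds into can be sketched for a fully observed 2x2 table. This is a deliberate simplification: the paper's setting additionally involves supplementary counts with a missing margin, which this sketch omits.

```python
import numpy as np

rng = np.random.default_rng(1)

# Fully observed 2x2 contingency table of counts (illustrative numbers).
table = np.array([[30, 10],
                  [12, 25]])

def odds_ratio(t):
    return (t[0, 0] * t[1, 1]) / (t[0, 1] * t[1, 0])

n = table.sum()
p_hat = table.ravel() / n   # MLE of the four cell probabilities

# Parametric bootstrap: resample whole tables from Multinomial(n, p_hat),
# recompute the statistic, and take percentile confidence limits.
B = 5000
boot = np.empty(B)
for b in range(B):
    t = rng.multinomial(n, p_hat).reshape(2, 2) + 0.5  # Haldane correction
    boot[b] = odds_ratio(t)

lo, hi = np.percentile(boot, [2.5, 97.5])
```

Under the paper's valid joint distribution, the resampling step would draw from the derived distribution of complete and incomplete counts instead of a single multinomial, typically widening these intervals.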

  18. 40 CFR 51.1012 - Requirement for contingency measures.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 2 2010-07-01 2010-07-01 false Requirement for contingency measures... Implementation of PM2.5 National Ambient Air Quality Standards § 51.1012 Requirement for contingency measures... contingency measures to be undertaken if the area fails to make reasonable further progress, or fails to...

  19. The Role of Feedback Contingency in Perceptual Category Learning

    Science.gov (United States)

    Ashby, F. Gregory; Vucovich, Lauren E.

    2016-01-01

    Feedback is highly contingent on behavior if it eventually becomes easy to predict, and weakly contingent on behavior if it remains difficult or impossible to predict even after learning is complete. Many studies have demonstrated that humans and nonhuman animals are highly sensitive to feedback contingency, but no known studies have examined how feedback contingency affects category learning, and current theories assign little or no importance to this variable. Two experiments examined the effects of contingency degradation on rule-based and information-integration category learning. In rule-based tasks, optimal accuracy is possible with a simple explicit rule, whereas optimal accuracy in information-integration tasks requires integrating information from two or more incommensurable perceptual dimensions. In both experiments, participants each learned rule-based or information-integration categories under either high or low levels of feedback contingency. The exact same stimuli were used in all four conditions and optimal accuracy was identical in every condition. Learning was good in both high-contingency conditions, but most participants showed little or no evidence of learning in either low-contingency condition. Possible causes of these effects are discussed, as well as their theoretical implications. PMID:27149393

  20. 48 CFR 1632.770 - Contingency reserve payments.

    Science.gov (United States)

    2010-10-01

    ... FINANCING Contract Funding 1632.770 Contingency reserve payments. (a) Payments from the contingency reserve... advise the carrier of its decision. However, OPM shall not unreasonably withhold approval for amounts...

  1. INVESTIGATION OF CONTINGENCY PATTERNS OF TEACHERS’ SCAFFOLDING IN TEACHING AND LEARNING MATHEMATICS

    Directory of Open Access Journals (Sweden)

    Anwar Anwar

    2016-12-01

The purpose of this study is to investigate the patterns of scaffolding contingency in teaching and learning mathematics carried out by three teachers. Contingency patterns are obtained by examining the transcription from video recording of conversation fragments between teachers and students during the provision of scaffolding. The contingency patterns are drawn in three strategies: diagnostic strategy, intervention strategy, and checking diagnosis. The result shows that the three teachers expressed different interaction contingencies in their scaffolding activities: contingent dominant, non-contingent dominant, and pseudo-contingent. It is also found that the learning interaction performed by experienced teachers tends to be contingent dominant compared to novice teachers. Keywords: Contingency, Contingent Dominant, Non-Contingent Dominant, Pseudo Contingent, Scaffolding DOI: http://dx.doi.org/10.22342/jme.8.1.3410.65-76

  2. Cellular Analysis of Boltzmann Most Probable Ideal Gas Statistics

    Science.gov (United States)

    Cahill, Michael E.

    2018-04-01

Exact treatment of Boltzmann's most probable statistics for an ideal gas of identical-mass particles having translational kinetic energy gives a distribution law for velocity phase-space cell j which relates the particle energy and the particle population according to

    B e(j) = A - Ψ(n(j) + 1)

where A and B are the Lagrange multipliers and Ψ is the digamma function, defined by

    Ψ(x + 1) = d/dx ln(x!)

A useful, sufficiently accurate approximation for Ψ is given by

    Ψ(x + 1) ≈ ln(e^(-γ) + x)

where γ is the Euler constant (≈ 0.5772156649), so the above distribution equation is approximately

    B e(j) = A - ln(e^(-γ) + n(j))

which can be inverted to solve for n(j), giving

    n(j) = (e^(B(e_H - e(j))) - 1) e^(-γ),  where B e_H = A + γ

and where B e_H is a unitless particle energy which replaces the parameter A. The two approximate distribution equations imply that e_H is the highest particle energy, and the highest particle population is

    n_H = (e^(B e_H) - 1) e^(-γ)

which is due to the facts that the population becomes negative if e(j) > e_H and the kinetic energy becomes negative if n(j) > n_H. An explicit construction of cells in velocity space which are equal in volume and homogeneous for almost all cells is shown to be useful in the analysis. Plots of sample distribution properties, using e(j) as the independent variable, are presented.

  3. Comments on contingency management and conditional cash transfers.

    Science.gov (United States)

    Higgins, Stephen T

    2010-10-01

    This essay discusses research on incentive-based interventions to promote healthy behavior change, contingency management (CM) and conditional cash transfers (CCT). The overarching point of the essay is that CM and CCT are often treated as distinct areas of inquiry when at their core they represent a common approach. Some potential bi-directional benefits of recognizing this commonality are discussed. Distinct intellectual traditions probably account for the separate paths of CM and CCT to date, with the former being rooted in behavioral psychology and the latter in microeconomics. It is concluded that the emerging field of behavioral economics, which is informed by and integrates principles of each of those disciplines, may provide the proper conceptual framework for integrating CM and CCT.

  4. Resurgence of instrumental behavior after an abstinence contingency.

    Science.gov (United States)

    Bouton, Mark E; Schepers, Scott T

    2014-06-01

    In resurgence, an extinguished instrumental behavior (R1) recovers when a behavior that has replaced it (R2) is also extinguished. The phenomenon may be relevant to understanding relapse that can occur after the termination of "contingency management" treatments, in which an unwanted behavior (e.g., substance abuse) is reduced by reinforcing an alternative behavior. When reinforcement is discontinued, the unwanted behavior might resurge. However, unlike most resurgence experiments, contingency management treatments also introduce a negative contingency, in which reinforcers are not delivered unless the client has abstained from the unwanted behavior. In two experiments with rats, we therefore examined the effects of adding a negative "abstinence" contingency to the resurgence design. During response elimination, R2 was not reinforced unless R1 had not been emitted for a minimum period of time (45, 90, or 135 s). In both experiments, adding such a contingency to simple R1 extinction reduced, but did not eliminate, resurgence. In Experiment 2, we found the same effect in a yoked group that could earn reinforcers for R2 at the same points in time as the negative-contingency group, but without the requirement to abstain from R1. Thus, the negative contingency per se did not contribute to the reduction in resurgence. These results suggest that the contingency reduced resurgence by making reinforcers more difficult to earn and more widely spaced in time. This could have allowed the animal to learn that R1 was extinguished in the "context" of infrequent reinforcement-a context more like that of resurgence testing. The results are thus consistent with a contextual (renewal) account of resurgence. The method might provide a better model of relapse after termination of a contingency management treatment.

  5. The enigma of probability and physics

    International Nuclear Information System (INIS)

    Mayants, L.

    1984-01-01

This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (science of probability) and probabilistic physics (application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed - quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)

  6. 40 CFR 265.51 - Purpose and implementation of contingency plan.

    Science.gov (United States)

    2010-07-01

    ... contingency plan. 265.51 Section 265.51 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED..., STORAGE, AND DISPOSAL FACILITIES Contingency Plan and Emergency Procedures § 265.51 Purpose and implementation of contingency plan. (a) Each owner or operator must have a contingency plan for his facility. The...

  7. Contingency planning in southern Africa: Events rather than processes?

    Directory of Open Access Journals (Sweden)

    Elias Mabaso

    2013-09-01

With the increasing frequency, magnitude and impact of disasters, there is growing focus on contingency planning as a tool for enhancing resilience. Yet, there is little empirical evidence that reflects on the practice of contingency planning systems within the context of disaster risk reduction. This article explores the practice of contingency planning in southern Africa, focussing on Malawi, Mozambique, Namibia, Zambia and Zimbabwe. A qualitative comparative analysis informed by fieldwork was used. The findings show that (1) there was a wide gap between theory and practice in contingency planning, (2) response activities rarely reflected projected scenarios and (3) resources were inadequate for effective contingency planning. We conclude that unless these issues are addressed, contingency planning is likely to remain a theoretical rather than a practical tool for building disaster-resilient communities in southern African countries. Although a generalisation cannot be made on the status of contingency planning and practice in southern Africa without a wider analysis of more examples, the findings may apply beyond the examined contexts and also offer insights into research gaps.

  8. Probability distribution and statistical properties of spherically compensated cosmic regions in ΛCDM cosmology

    Science.gov (United States)

    Alimi, Jean-Michel; de Fromont, Paul

    2018-04-01

The statistical properties of cosmic structures are well known to be strong probes for cosmology. In particular, several studies tried to use the cosmic void counting number to obtain tight constraints on dark energy. In this paper, we model the statistical properties of these regions using the CoSphere formalism (de Fromont & Alimi) in both the primordial and non-linearly evolved Universe in the standard Λ cold dark matter model. This formalism applies similarly for minima (voids) and maxima (such as DM haloes), which are here considered symmetrically. We first derive the full joint Gaussian distribution of CoSphere's parameters in the Gaussian random field. We recover the results of Bardeen et al. only in the limit where the compensation radius becomes very large, i.e. when the central extremum decouples from its cosmic environment. We compute the probability distribution of the compensation size in this primordial field. We show that this distribution is redshift independent and can be used to model cosmic void size distribution. We also derive the statistical distribution of the peak parameters introduced by Bardeen et al. and discuss their correlation with the cosmic environment. We show that small central extrema with low density are associated with narrow compensation regions with deep compensation density, while higher central extrema are preferentially located in larger but smoother over/under massive regions.

  9. The foundations of statistics

    CERN Document Server

    Savage, Leonard J

    1972-01-01

    Classic analysis of the foundations of statistics and development of personal probability, one of the greatest controversies in modern statistical thought. Revised edition. Calculus, probability, statistics, and Boolean algebra are recommended.

  10. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  11. A comparison of Probability Of Detection (POD) data determined using different statistical methods

    Science.gov (United States)

    Fahr, A.; Forsyth, D.; Bullock, M.

    1993-12-01

    Different statistical methods have been suggested for determining probability of detection (POD) data for nondestructive inspection (NDI) techniques. A comparative assessment of various methods of determining POD was conducted using results of three NDI methods obtained by inspecting actual aircraft engine compressor disks which contained service induced cracks. The study found that the POD and 95 percent confidence curves as a function of crack size as well as the 90/95 percent crack length vary depending on the statistical method used and the type of data. The distribution function as well as the parameter estimation procedure used for determining POD and the confidence bound must be included when referencing information such as the 90/95 percent crack length. The POD curves and confidence bounds determined using the range interval method are very dependent on information that is not from the inspection data. The maximum likelihood estimators (MLE) method does not require such information and the POD results are more reasonable. The log-logistic function appears to model POD of hit/miss data relatively well and is easy to implement. The log-normal distribution using MLE provides more realistic POD results and is the preferred method. Although it is more complicated and slower to calculate, it can be implemented on a common spreadsheet program.
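The maximum-likelihood approach the study favours can be sketched for hit/miss data: fit a log-logistic POD model by numerically maximizing the likelihood, then read off a characteristic crack length such as a90 (the length detected with 90% probability). All data below are synthetic, and only the log-logistic model form named in the abstract is assumed.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)

# Synthetic hit/miss inspection data: crack lengths (mm) and detect flags.
a = rng.uniform(0.5, 10.0, 300)
mu_true, sigma_true = np.log(3.0), 0.4
pod_true = 1.0 / (1.0 + np.exp(-(np.log(a) - mu_true) / sigma_true))
hit = (rng.random(300) < pod_true).astype(float)

def neg_log_lik(params):
    """Log-logistic POD model: POD(a) = logistic((ln a - mu) / sigma)."""
    mu, sigma = params
    z = (np.log(a) - mu) / sigma
    p = np.clip(1.0 / (1.0 + np.exp(-z)), 1e-12, 1 - 1e-12)
    return -np.sum(hit * np.log(p) + (1 - hit) * np.log(1 - p))

res = minimize(neg_log_lik, x0=[0.0, 1.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x

# Crack length detected with 90% probability (a90):
a90 = np.exp(mu_hat + sigma_hat * np.log(0.9 / 0.1))
```

The 90/95 value referenced in the abstract would additionally require a 95% confidence bound on this curve (e.g. from the MLE information matrix), which is where the choice of distribution and estimation procedure matters.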

  12. Equivalence relations and the reinforcement contingency.

    Science.gov (United States)

    Sidman, M

    2000-07-01

    Where do equivalence relations come from? One possible answer is that they arise directly from the reinforcement contingency. That is to say, a reinforcement contingency produces two types of outcome: (a) 2-, 3-, 4-, 5-, or n-term units of analysis that are known, respectively, as operant reinforcement, simple discrimination, conditional discrimination, second-order conditional discrimination, and so on; and (b) equivalence relations that consist of ordered pairs of all positive elements that participate in the contingency. This conception of the origin of equivalence relations leads to a number of new and verifiable ways of conceptualizing equivalence relations and, more generally, the stimulus control of operant behavior. The theory is also capable of experimental disproof.

  13. Flood probability quantification for road infrastructure: Data-driven spatial-statistical approach and case study applications.

    Science.gov (United States)

    Kalantari, Zahra; Cavalli, Marco; Cantone, Carolina; Crema, Stefano; Destouni, Georgia

    2017-03-01

    Climate-driven increase in the frequency of extreme hydrological events is expected to impose greater strain on the built environment and major transport infrastructure, such as roads and railways. This study develops a data-driven spatial-statistical approach to quantifying and mapping the probability of flooding at critical road-stream intersection locations, where water flow and sediment transport may accumulate and cause serious road damage. The approach is based on novel integration of key watershed and road characteristics, including also measures of sediment connectivity. The approach is concretely applied to and quantified for two specific study case examples in southwest Sweden, with documented road flooding effects of recorded extreme rainfall. The novel contributions of this study in combining a sediment connectivity account with that of soil type, land use, spatial precipitation-runoff variability and road drainage in catchments, and in extending the connectivity measure use for different types of catchments, improve the accuracy of model results for road flood probability. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Optimal Nonlinear Pricing, Bundling Commodities and Contingent Services

    International Nuclear Information System (INIS)

    Podesta, Marion; Poudou, Jean-Christophe

    2008-01-01

In this paper, we propose to analyze optimal nonlinear pricing when a firm offers a commodity and a contingent service in a bundle. The paper studies a mechanism design where all private information can be captured in a single scalar variable, in a monopoly context. We show that offering the commodity and the service as a package is less costly for the consumer, and the firm concedes lower consumer rent than in the situation where it sells the good and the contingent service under an independent pricing strategy. In fact, the gain from price discrimination via the supply of a package is dominated by the fact that signing two contracts is costly for the consumer. Bundling energy and a contingent service is a profitable strategy for an energy monopoly practising an optimal nonlinear tariff. We show that the rates for the energy and the contingent service depend on the optional character of the contingent service and on the degree of complementarity between commodities and services. (authors)

  15. Contingency blindness: location-identity binding mismatches obscure awareness of spatial contingencies and produce profound interference in visual working memory.

    Science.gov (United States)

    Fiacconi, Chris M; Milliken, Bruce

    2012-08-01

    The purpose of the present study was to highlight the role of location-identity binding mismatches in obscuring explicit awareness of a strong contingency. In a spatial-priming procedure, we introduced a high likelihood of location-repeat trials. Experiments 1, 2a, and 2b demonstrated that participants' explicit awareness of this contingency was heavily influenced by the local match in location-identity bindings. In Experiment 3, we sought to determine why location-identity binding mismatches produce such low levels of contingency awareness. Our results suggest that binding mismatches can interfere substantially with visual-memory performance. We attribute the low levels of contingency awareness to participants' inability to remember the critical location-identity binding in the prime on a trial-to-trial basis. These results imply a close interplay between object files and visual working memory.

  16. Test the Overall Significance of p-values by Using Joint Tail Probability of Ordered p-values as Test Statistic

    NARCIS (Netherlands)

    Fang, Yongxiang; Wit, Ernst

    2008-01-01

Fisher’s combined probability test is the most commonly used method to test the overall significance of a set of independent p-values. However, Fisher’s statistic is clearly more sensitive to small p-values than to large ones, so a single small p-value may overrule the other p-values

  17. Developing standardized facility contingency plans

    International Nuclear Information System (INIS)

    Davidson, D.A.

    1993-01-01

    Texaco consists of several operating departments that are, in effect, independent companies. Each of these departments is responsible for complying with all environmental laws and regulations. This includes the preparation by each facility to respond to an oil spill at that location. For larger spills, however, management of the response will rest with corporate regional response teams. Personnel from all departments make up the regional teams. In 1990, Congress passed the Oil Pollution Act. In 1991, the US Coast Guard began developing oil spill response contingency plan regulations, which they are still working on. Meanwhile, four of the five west coast states have also passed laws requiring contingency plans. (Only Hawaii has chosen to wait and see what the federal regulations will entail). Three of the states have already adopted regulations. Given these laws and regulations, along with its corporate structure, Texaco addressed the need to standardize local facility plans as well as its response organization. This paper discusses how, by working together, the Texaco corporate international oil spill response staff and the Texaco western region on-scene commander developed: A standard contingency plan format crossing corporate boundaries and meeting federal and state requirements. A response organization applicable to any size facility or spill. A strategy to sell the standard contingency plan and response organization to the operating units

  18. Skype me! Socially Contingent Interactions Help Toddlers Learn Language

    Science.gov (United States)

    Roseberry, Sarah; Hirsh-Pasek, Kathy; Golinkoff, Roberta Michnick

    2013-01-01

    Language learning takes place in the context of social interactions, yet the mechanisms that render social interactions useful for learning language remain unclear. This paper focuses on whether social contingency might support word learning. Toddlers aged 24- to 30-months (N=36) were exposed to novel verbs in one of three conditions: live interaction training, socially contingent video training over video chat, and non-contingent video training (yoked video). Results suggest that children only learned novel verbs in socially contingent interactions (live interactions and video chat). The current study highlights the importance of social contingency in interactions for language learning and informs the literature on learning through screen media as the first study to examine word learning through video chat technology. PMID:24112079

  19. QV modal distance displacement - a criterion for contingency ranking

    Energy Technology Data Exchange (ETDEWEB)

    Rios, M.A.; Sanchez, J.L.; Zapata, C.J. [Universidad de Los Andes (Colombia). Dept. of Electrical Engineering], Emails: mrios@uniandes.edu.co, josesan@uniandes.edu.co, cjzapata@utp.edu.co

    2009-07-01

    This paper proposes a new methodology using concepts of fast decoupled load flow, modal analysis and ranking of contingencies, where the impact of each contingency is measured hourly, taking into account the influence of each contingency on the mathematical model of the system, i.e. the Jacobian matrix. The method computes the displacement of the eigenvalues of the reduced Jacobian matrix used in voltage stability analysis as a criterion for contingency ranking, considering the fact that the lowest eigenvalue in the normal operating condition is not necessarily the lowest eigenvalue under the N-1 contingency condition. The ranking is computed both over all branches in the system and over specific branches selected according to the IBPF index. The test system used is the IEEE 118-node system. (author)
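The ranking idea can be sketched on a toy 2x2 stand-in for the reduced Q-V Jacobian (all matrix values below are hypothetical, not from the paper; a real study would use the full system matrix):

```python
import math

def min_eig_2x2(a, b, c, d):
    """Smallest eigenvalue of the 2x2 matrix [[a, b], [c, d]],
    a stand-in here for the reduced Q-V Jacobian of voltage stability analysis."""
    tr, det = a + d, a * d - b * c
    return tr / 2 - math.sqrt(tr * tr / 4 - det)

# Toy base-case reduced Jacobian and two hypothetical N-1 branch outages.
base = (8.0, -2.0, -2.0, 6.0)
outage_a = (8.0, -2.0, -2.0, 3.0)   # severe: strongly weakens voltage support
outage_b = (7.5, -2.0, -2.0, 5.5)   # mild outage

lam0 = min_eig_2x2(*base)
# Rank contingencies by how far each displaces the smallest eigenvalue.
displacement = {name: lam0 - min_eig_2x2(*m)
                for name, m in [("outage_a", outage_a), ("outage_b", outage_b)]}
ranking = sorted(displacement, key=displacement.get, reverse=True)
```

The larger the downward displacement of the smallest eigenvalue, the closer the post-contingency system is to voltage instability, so the severe outage ranks first.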

  20. Application of binomial and multinomial probability statistics to the sampling design process of a global grain tracing and recall system

    Science.gov (United States)

    Small, coded, pill-sized tracers embedded in grain are proposed as a method for grain traceability. A sampling process for a grain traceability system was designed and investigated by applying probability statistics using a science-based sampling approach to collect an adequate number of tracers fo...
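The binomial logic behind such a sampling design can be sketched as follows (the tracer rate and confidence level below are hypothetical, not the study's actual figures):

```python
import math

def prob_at_least_one(tracer_rate, n_kernels):
    """P(a sample of n kernels contains at least one tracer), binomial model."""
    return 1.0 - (1.0 - tracer_rate) ** n_kernels

def sample_size_for_confidence(tracer_rate, confidence):
    """Smallest n with P(at least one tracer) >= confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - tracer_rate))

# Assumed embedding rate of 1 tracer per 1000 kernels, 95% recovery confidence.
n = sample_size_for_confidence(0.001, 0.95)
```

With these assumed numbers, roughly three thousand kernels must be drawn before a tracer is recovered with 95% confidence, which is the kind of science-based sample-size calculation the abstract refers to.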

  1. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and overfitting the data, as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.

  2. Prediction suppression in monkey inferotemporal cortex depends on the conditional probability between images.

    Science.gov (United States)

    Ramachandran, Suchitra; Meyer, Travis; Olson, Carl R

    2016-01-01

    When monkeys view two images in fixed sequence repeatedly over days and weeks, neurons in area TE of the inferotemporal cortex come to exhibit prediction suppression. The trailing image elicits only a weak response when presented following the leading image that preceded it during training. Induction of prediction suppression might depend either on the contiguity of the images, as determined by their co-occurrence and captured in the measure of joint probability P(A,B), or on their contingency, as determined by their correlation and as captured in the measures of conditional probability P(A|B) and P(B|A). To distinguish between these possibilities, we measured prediction suppression after imposing training regimens that held P(A,B) constant but varied P(A|B) and P(B|A). We found that reducing either P(A|B) or P(B|A) during training attenuated prediction suppression as measured during subsequent testing. We conclude that prediction suppression depends on contingency, as embodied in the predictive relations between the images, and not just on contiguity, as embodied in their co-occurrence. Copyright © 2016 the American Physiological Society.
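The distinction between contiguity P(A,B) and contingency P(A|B), P(B|A) can be made concrete with two hypothetical training regimens that share the same joint probability (toy pair counts, not the study's actual design):

```python
from collections import Counter

def pair_probs(pairs, a="A", b="B"):
    """Joint and conditional probabilities for a (leading, trailing) image pair."""
    n = len(pairs)
    joint = Counter(pairs)[(a, b)]
    lead = sum(1 for x, _ in pairs if x == a)    # trials where A leads
    trail = sum(1 for _, y in pairs if y == b)   # trials where B trails
    return joint / n, joint / lead, joint / trail  # P(A,B), P(B|A), P(A|B)

# Regimen 1: B follows only A.  Regimen 2: B also follows C.
regimen_1 = [("A", "B")] * 20 + [("C", "D")] * 20
regimen_2 = [("A", "B")] * 20 + [("C", "B")] * 20
```

Both regimens give P(A,B) = 0.5, but in the second P(A|B) drops to 0.5 because B is no longer diagnostic of A; it is exactly this kind of manipulation that lets the authors dissociate contingency from contiguity.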

  3. Contingent Employment in the Netherlands

    OpenAIRE

    Pot, F.; Koene, Bas; Paauwe, Jaap

    2001-01-01

    textabstractIn the last decade the Dutch labour market has demonstrated an admirable capacity to generate jobs. Consequently, the unemployment rate has significantly decreased. However, the newly generated jobs are a-typical in the sense that they are not full-time jobs based on open-ended contracts. Instead, the job growth has relied on the growth of part-time and contingent jobs. While the creation of part-time jobs seems to be employee-driven, contingent employment, in contrast, seems to b...

  4. Contingency Management and deliberative decision-making processes

    Directory of Open Access Journals (Sweden)

    Paul S. Regier

    2015-06-01

    Full Text Available Contingency Management is an effective treatment for drug addiction. The current explanation for its success is rooted in alternative reinforcement theory. We suggest that alternative reinforcement theory is inadequate to explain the success of Contingency Management and produce a model based on demand curves that show how little the monetary rewards offered in this treatment would affect drug use. Instead, we offer an explanation of its success based on the concept that it accesses deliberative decision-making processes. We suggest that Contingency Management is effective because it offers a concrete and immediate alternative to using drugs, which engages deliberative processes, improves the ability of those deliberative processes to attend to non-drug options, and offsets more automatic action-selection systems. This theory makes explicit predictions that can be tested, suggests which users will be most helped by Contingency Management, and suggests improvements in its implementation.

  5. Contingency Management and Deliberative Decision-Making Processes.

    Science.gov (United States)

    Regier, Paul S; Redish, A David

    2015-01-01

    Contingency management is an effective treatment for drug addiction. The current explanation for its success is rooted in alternative reinforcement theory. We suggest that alternative reinforcement theory is inadequate to explain the success of contingency management and produce a model based on demand curves that show how little the monetary rewards offered in this treatment would affect drug use. Instead, we offer an explanation of its success based on the concept that it accesses deliberative decision-making processes. We suggest that contingency management is effective because it offers a concrete and immediate alternative to using drugs, which engages deliberative processes, improves the ability of those deliberative processes to attend to non-drug options, and offsets more automatic action-selection systems. This theory makes explicit predictions that can be tested, suggests which users will be most helped by contingency management, and suggests improvements in its implementation.

  6. Assessment of climate change using methods of mathematic statistics and theory of probability

    International Nuclear Information System (INIS)

    Trajanoska, Lidija; Kaevski, Ivancho

    2004-01-01

    In simple terms: 'Climate' is the average of 'weather'. The Earth's weather system is a complex machine composed of coupled sub-systems (ocean, air, land, ice and the biosphere) between which energy is exchanged. The understanding and study of climate change does not rely only on the physics of climate change but is linked to the following question: 'How can we detect change in a system that is changing all the time under its own volition?' What is even the meaning of 'change' in such a situation? If we transform the concept of 'change' into the concept of 'significant and long-term' change, this re-phrasing allows for a definition in mathematical terms. A significant change in a system becomes a measure of how large an observed change is in terms of the variability one would see under 'normal' conditions. An example is the analysis of yearly air temperature and precipitation, as in this paper. A large amount of data is selected as representing the 'before' case and another set of data is selected as the 'after' case, and the averages in the two cases are compared. These comparisons take the form of 'hypothesis tests', in which one tests whether the hypothesis that there has been no change can be rejected. Both parametric and nonparametric methods of mathematical statistics are used. The most indicative variables showing global change are the average, the standard deviation and the probability distribution function of the examined time series. The examined meteorological series are treated as a random process, so mathematical statistics can be applied. (Author)
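The before/after comparison described above can be sketched as a simple large-sample z-test (the yearly temperature values below are synthetic, and the 1.96 threshold is the usual 5% two-sided cutoff, not a figure from the paper):

```python
import math
import statistics

def change_z_score(before, after):
    """Shift of the 'after' mean, measured in units of the 'normal' (before)
    variability of a sample mean; |z| > 1.96 rejects 'no change' at ~5%."""
    mean_b = statistics.fmean(before)
    mean_a = statistics.fmean(after)
    sd_b = statistics.stdev(before)
    return (mean_a - mean_b) / (sd_b / math.sqrt(len(after)))

# Synthetic yearly mean air temperatures (deg C) for two decades.
before = [9.5, 10.2, 9.8, 10.5, 10.0, 9.7, 10.3, 9.9, 10.1, 10.0]
after = [10.8, 11.1, 10.9, 11.2, 11.0, 10.7, 11.3, 10.9, 11.1, 11.0]
```

A one-degree shift against a variability of about 0.3 degrees yields a z-score far above 1.96, so the 'no change' hypothesis would be rejected for this synthetic series.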

  7. Turning negative memories around: Contingency versus devaluation techniques.

    Science.gov (United States)

    Dibbets, Pauline; Lemmens, Anke; Voncken, Marisol

    2018-09-01

    It is assumed that fear responses can be altered by changing the contingency between a conditioned stimulus (CS) and an unconditioned stimulus (US), or by devaluing the present mental representation of the US. The aim of the present study was to compare the efficacy of contingency- and devaluation-based intervention techniques on the diminishment in, and return of, fear. We hypothesized that extinction (EXT, contingency-based) would outperform devaluation-based techniques on contingency measures, but that devaluation-based techniques would be most effective in reducing the mental representation of the US. Additionally, we expected that incorporation of the US during devaluation would result in less reinstatement of US averseness. Healthy participants underwent a fear conditioning paradigm followed by one of three interventions: extinction (EXT, contingency-based), imagery rescripting (ImRs, devaluation-based) or eye movement desensitization and reprocessing (EMDR, devaluation-based). A reinstatement procedure and test followed the next day. EXT was indeed most successful in diminishing contingency-based US expectancies and skin conductance responses (SCRs), but all interventions were equally successful in reducing the averseness of the mental US representation. After reinstatement, EXT showed the lowest expectancies and SCRs; no differences were observed between the conditions concerning the mental US representation. A partial reinforcement schedule was used, resulting in a vast number of contingency-unaware participants. Additionally, a non-clinical sample was used, which may limit the generalizability to clinical populations. EXT is most effective in reducing conditioned fear responses. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Statistical inference

    CERN Document Server

    Rohatgi, Vijay K

    2003-01-01

    Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth

  9. Impact of Contingent Liabilities Reclassification on Financial Performance: An Analysis with Brazilian Companies of Electricity Utilities Sector

    Directory of Open Access Journals (Sweden)

    Carlos Henrique Silva do Carmo

    2018-01-01

    Full Text Available The main objective of this research is to verify the effect of a reclassification of contingent liabilities on the financial performance of Brazilian companies in the electricity sector in the years 2013 to 2015. The sample consisted of 53 companies, and 153 financial statements were analyzed. The model used to reclassify the contingent liabilities into provisions was developed by Rose (2014). The author classifies the reversal into 5 scenarios: Optimistic (20%), Partially Optimistic (40%), Moderate (60%), Partially Pessimistic (80%) and Pessimistic (100%). To represent economic and financial performance, three indicators were selected: General Liquidity (LG), General Indebtedness (EG) and Return Over Assets (ROA). The results showed that for the worst-case scenarios there is a significant difference between the indicators calculated on the original data and those calculated within the reclassified scenarios. Cohen's (1988) d statistic pointed out that, in addition to being statistically significant, in the worst scenarios the effect size was also very high, especially for ROA and Indebtedness. For Liquidity the differences were not as significant. The findings of this research serve as a warning to users of financial statements, who should be aware of the financial effect of the different interpretations made by the individuals involved in decisions about the likelihood of loss in provisions and contingent liabilities.
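The scenario logic can be sketched as follows: a fraction of the disclosed contingent liabilities is recognized as a provision and the indicators are recomputed. The balance-sheet figures below are toy values, and the simplifying assumption that the full provision flows through income is ours, not the paper's:

```python
def indicators(assets, liabilities, net_income):
    """General Indebtedness (EG) and Return Over Assets (ROA)."""
    return {"EG": liabilities / assets, "ROA": net_income / assets}

def reclassify(assets, liabilities, net_income, contingent, fraction):
    """Recognize `fraction` of contingent liabilities as a provision:
    liabilities rise and the provisioning expense reduces net income."""
    return indicators(assets,
                      liabilities + fraction * contingent,
                      net_income - fraction * contingent)

# Scenario fractions follow the abstract's classification of the reversal.
scenarios = {"optimistic": 0.2, "moderate": 0.6, "pessimistic": 1.0}
base = indicators(assets=1000.0, liabilities=400.0, net_income=80.0)
stressed = {k: reclassify(1000.0, 400.0, 80.0, contingent=150.0, fraction=f)
            for k, f in scenarios.items()}
```

Even with these toy numbers, the pessimistic scenario pushes ROA negative while liquidity-style ratios move much less, mirroring the paper's finding that ROA and Indebtedness are the most affected indicators.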

  10. Breaking the Myth of Flexible Work: Contingent Work in Toronto. A Study Conducted by the Contingent Workers Project.

    Science.gov (United States)

    de Wolff, Alice

    A survey of 205 people, 4 group interviews with approximately 30 people, and 6 design and analysis meetings involving approximately 40 people were conducted in a 1999 participatory study of contingent workers in Toronto. (Contingent work was defined to be lower-waged forms of non-permanent work arrangements that include contracting, employment…

  11. Disciplined Decision Making in an Interdisciplinary Environment: Some Implications for Clinical Applications of Statistical Process Control.

    Science.gov (United States)

    Hantula, Donald A.

    1995-01-01

    Clinical applications of statistical process control (SPC) in human service organizations are considered. SPC is seen as providing a standard set of criteria that serves as a common interface for data-based decision making, which may bring decision making under the control of established contingencies rather than the immediate contingencies of…

  12. Reporting, Recording, and Transferring Contingency Demand Data

    National Research Council Canada - National Science Library

    Smith, Bernard

    2000-01-01

    .... In this report, we develop a standard set of procedures for reporting and recording demand data at the contingency location and transferring contingency demand data to the home base - ensuring proper level allocation and valid worldwide peacetime operating stock (POS) and readiness spares package (RSP) requirements.

  13. Sartre's Contingency of Being and Asouzu's Principle of Causality ...

    African Journals Online (AJOL)

    The position of this work is that all contingent beings have a causal agent. This position is taken as a result of trying to delve into the issue of contingency and causality of being which has been discussed by many philosophers of diverse epochs of philosophy. This work tries to participate in the debate of whether contingent ...

  14. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  15. 40 CFR 267.54 - When must I amend the contingency plan?

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 26 2010-07-01 2010-07-01 false When must I amend the contingency plan... STANDARDIZED PERMIT Contingency Plan and Emergency Procedures § 267.54 When must I amend the contingency plan? You must review, and immediately amend the contingency plan, if necessary, whenever: (a) The facility...

  16. 25 CFR 39.503 - How can a school use contingency funds?

    Science.gov (United States)

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false How can a school use contingency funds? 39.503 Section 39.503 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR EDUCATION THE INDIAN SCHOOL EQUALIZATION PROGRAM Contingency Fund § 39.503 How can a school use contingency funds? Contingency funds can be...

  17. Estimating state-contingent production functions

    DEFF Research Database (Denmark)

    Rasmussen, Svend; Karantininis, Kostas

    The paper reviews the empirical problem of estimating state-contingent production functions. The major problem is that states of nature may not be registered and/or that the number of observations per state is low. Monte Carlo simulation is used to generate an artificial, uncertain production...... environment based on Cobb Douglas production functions with state-contingent parameters. The parameters are subsequently estimated based on different sizes of samples using Generalized Least Squares and Generalized Maximum Entropy and the results are compared. It is concluded that Maximum Entropy may...

  18. Between Certainty and Uncertainty Statistics and Probability in Five Units with Notes on Historical Origins and Illustrative Numerical Examples

    CERN Document Server

    Laudański, Ludomir M

    2013-01-01

    „Between Certainty & Uncertainty” is a one-of-a-kind short course on statistics for students, engineers and researchers. It is a fascinating introduction to statistics and probability with notes on historical origins and 80 illustrative numerical examples organized in five units:
    · Chapter 1 Descriptive Statistics: Compressing small samples, basic averages - mean and variance, their main properties including God's proof; linear transformations and z-scored statistics.
    · Chapter 2 Grouped data: Udny Yule's concept of qualitative and quantitative variables. Grouping these two kinds of data. Graphical tools. Combinatorial rules and qualitative variables. Designing frequency histograms. Direct and coded evaluation of quantitative data. Significance of percentiles.
    · Chapter 3 Regression and correlation: Geometrical distance and equivalent distances in two orthogonal directions as a prerequisite to the concept of two regressi...

  19. 25 CFR 39.501 - What is an emergency or unforeseen contingency?

    Science.gov (United States)

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false What is an emergency or unforeseen contingency? 39.501... EQUALIZATION PROGRAM Contingency Fund § 39.501 What is an emergency or unforeseen contingency? An emergency or unforeseen contingency is an event that meets all of the following criteria: (a) It could not be planned for...

  20. 78 FR 46781 - Federal Acquisition Regulation; Definition of Contingency Operation

    Science.gov (United States)

    2013-08-01

    ... Federal Acquisition Regulation; Definition of Contingency Operation AGENCY: Department of Defense (DoD... the Federal Acquisition Regulation (FAR) to revise the definition of ``contingency operation'' to... ``contingency operation'' at FAR 2.101 in accordance with the statutory change to the definition made by...

  1. An introduction to probability and statistical inference

    CERN Document Server

    Roussas, George G

    2003-01-01

    "The text is wonderfully written and has the most comprehensive range of exercise problems that I have ever seen." - Tapas K. Das, University of South Florida. "The exposition is great; a mixture between conversational tones and formal mathematics; the appropriate combination for a math text at [this] level. In my examination I could find no instance where I could improve the book." - H. Pat Goeters, Auburn University, Alabama.
    * Contains more than 200 illustrative examples discussed in detail, plus scores of numerical examples and applications
    * Chapters 1-8 can be used independently for an introductory course in probability
    * Provides a substantial number of proofs

  2. The double-contingency principle: An historical perspective

    International Nuclear Information System (INIS)

    Knief, R.A.

    1995-01-01

    Standard ANSI/ANS-8.1 states the double contingency principle as: Process designs should, in general, incorporate sufficient factors of safety to require at least two unlikely, independent, and concurrent changes in process conditions before a criticality accident is possible. This paper presents a perspective on the double contingency principle

  3. 40 CFR 300.210 - Federal contingency plans.

    Science.gov (United States)

    2010-07-01

    ... contingency plans under the national response system: The National Contingency Plan, RCPs, and ACPs. These... discharge under § 300.324, and to mitigate or prevent a substantial threat of such a discharge, from a vessel, offshore facility, or onshore facility operating in or near the area. (2) The areas of...

  4. A statistical model for deriving probability distributions of contamination for accidental releases

    International Nuclear Information System (INIS)

    ApSimon, H.M.; Davison, A.C.

    1986-01-01

    Results generated from a detailed long-range transport model, MESOS, simulating dispersal of a large number of hypothetical releases of radionuclides in a variety of meteorological situations over Western Europe have been used to derive a simpler statistical model, MESOSTAT. This model may be used to generate probability distributions of different levels of contamination at a receptor point 100-1000 km or so from the source (for example, across a frontier in another country) without considering individual release and dispersal scenarios. The model is embodied in a series of equations involving parameters which are determined from such factors as distance between source and receptor, nuclide decay and deposition characteristics, release duration, and geostrophic windrose at the source. Suitable geostrophic windrose data have been derived for source locations covering Western Europe. Special attention has been paid to the relatively improbable extreme values of contamination at the top end of the distribution. The MESOSTAT model and its development are described, with illustrations of its use and comparison with the original more detailed modelling techniques. (author)

  5. Urban seismic risk assessment: statistical repair cost data and probable structural losses based on damage scenario—correlation analysis

    Science.gov (United States)

    Eleftheriadou, Anastasia K.; Baltzopoulou, Aikaterini D.; Karabinis, Athanasios I.

    2016-06-01

    The current seismic risk assessment is based on two discrete approaches, actual and probable, validating the produced results afterwards. In the first part of this research, the seismic risk is evaluated from the available data on the mean statistical repair/strengthening or replacement cost for the total number of damaged structures (180,427 buildings) after the 7/9/1999 Parnitha (Athens) earthquake. The actual evaluated seismic risk is then compared to the estimated probable structural losses, presented in the second part of the paper, based on a damage scenario for the same earthquake. The applied damage scenario is based on recently developed damage probability matrices (DPMs) from the Athens (Greece) damage database. The seismic risk estimation refers to 750,085 buildings situated in the extended urban region of Athens. The building exposure is categorized into five typical structural types and represents 18.80 % of the entire building stock in Greece. The latter information is provided by the National Statistics Service of Greece (NSSG) according to the 2000-2001 census. The seismic input is characterized by the ratio a_g/a_o, where a_g is the regional peak ground acceleration (PGA), evaluated from the earlier estimated research macroseismic intensities, and a_o is the PGA according to the hazard map of the 2003 Greek Seismic Code. Finally, the collected financial data derived from the different National Services responsible for post-earthquake crisis management, concerning the repair/strengthening or replacement costs and other categories of costs for the rehabilitation of earthquake victims (construction and operation of settlements for the earthquake homeless, rent support, demolitions, shorings), are used to determine the final total seismic risk factor.

  6. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study.
    * Good and solid introduction to probability theory and stochastic processes
    * Logically organized; writing is presented in a clear manner
    * Choice of topics is comprehensive within the area of probability
    * Ample homework problems are organized into chapter sections

  7. Optimum Inductive Methods. A study in Inductive Probability, Bayesian Statistics, and Verisimilitude.

    NARCIS (Netherlands)

    Festa, Roberto

    1992-01-01

    According to the Bayesian view, scientific hypotheses must be appraised in terms of their posterior probabilities relative to the available experimental data. Such posterior probabilities are derived from the prior probabilities of the hypotheses by applying Bayes'theorem. One of the most important

  8. DETERMINE THE PROBABILITY OF PASSENGER SURVIVAL IN AN AVIATION INCIDENT WITH FIRE ON THE GROUND

    Directory of Open Access Journals (Sweden)

    Vladislav Pavlovich Turko

    2017-05-01

    Full Text Available Assessing the risk level of an aviation incident involving fire and the impact of contingent affecting factors on people. Based on statistical data on aviation incidents, a model of an aircraft fire situation on the ground is proposed.

  9. Contingency Management with Human Autonomy Teaming

    Science.gov (United States)

    Shively, Robert J.; Lachter, Joel B.

    2018-01-01

    Automation is playing an increasingly important role in many operations. It is often cheaper, faster and more precise than human operators. However, automation is not perfect. There are many situations in which a human operator must step in. We refer to these instances as contingencies, and to the act of stepping in as contingency management. Here we propose coupling Human Autonomy Teaming (HAT) with contingency management. We describe two aspects of HAT: bi-directional communication and working agreements (or plays). Bi-directional communication, like Crew Resource Management in traditional aviation, allows all parties to contribute to a decision. Working agreements specify roles and responsibilities. Importantly, working agreements allow for the possibility of roles and responsibilities changing depending on environmental factors (e.g., situations the automation was not designed for, workload, risk, or trust). This allows the automation to "automatically" become more autonomous as it becomes more trusted and/or is updated to deal with a more complete set of possible situations. We present a concrete example using a prototype contingency management station one might find in a future airline operations center. Automation proposes reroutes for aircraft that encounter bad weather or are forced to divert for environmental or systems reasons. If specific conditions are met, these recommendations may be autonomously datalinked to the affected aircraft.

  10. Management issues regarding the contingent workforce

    Energy Technology Data Exchange (ETDEWEB)

    Bowen-Smed, S. [Bowen Workforce Solutions, Calgary, AB (Canada)

    2004-07-01

    Fifty per cent of corporate leaders in Calgary today will be eligible for retirement over the next 5 years. In addition, 53 per cent of the entire Calgary workforce is 45 years or older. This paper suggests that only companies that pursue aggressive programs to engage immigrants and contractors will weather the skills shortages anticipated in the future. It was noted that contractors care about aligning values to organizations, regardless of the project length, and that professional development is a key consideration when it comes to selecting their next project. Contingent workforce issues include: effectiveness; classification; risk; and cost. It was stated that the effectiveness of the contingent workforce is an employer's responsibility. Factors that would strengthen the relationship between corporations and contractors include: proper orientation to manage expectations; training to improve productivity; tracking to enhance the quality of the workforce; and a management process to ensure adherence to protocol. It was concluded that the contingent workforce is an essential component of human capital management strategy, but that key issues must be managed to avoid unnecessary costs. In addition, effectiveness improves when processes are implemented. It was also suggested that technology is an essential component of the solution. Outsourcing is an effective approach to managing the contingent workforce. tabs., figs.

  11. Sound-contingent visual motion aftereffect

    Directory of Open Access Journals (Sweden)

    Kobayashi Maori

    2011-05-01

    Full Text Available Abstract Background After a prolonged exposure to a paired presentation of different types of signals (e.g., color and motion), one of the signals (color) becomes a driver for the other signal (motion). This phenomenon, which is known as contingent motion aftereffect, indicates that the brain can establish new neural representations even in the adult brain. However, contingent motion aftereffect has previously been reported only within the visual or auditory domain. Here, we demonstrate that a visual motion aftereffect can be contingent on a specific sound. Results Dynamic random dots moving in an alternating right or left direction were presented to the participants. Each direction of motion was accompanied by an auditory tone of a unique and specific frequency. After a 3-minute exposure, the tones began to exert a marked influence on the visual motion perception, and the percentage of dots required to trigger motion perception systematically changed depending on the tones. Furthermore, this effect lasted for at least 2 days. Conclusions These results indicate that a new neural representation can be rapidly established between the auditory and visual modalities.

  12. 78 FR 13765 - Federal Acquisition Regulation; Definition of Contingency Operation

    Science.gov (United States)

    2013-02-28

    ... Federal Acquisition Regulation; Definition of Contingency Operation AGENCY: Department of Defense (DoD... Regulation (FAR) to revise the definition of ``contingency operation'' to address the statutory change to the... ``contingency operation'' at FAR 2.101 in accordance with the statutory change to the definition made by...

  13. Selection and ordering of contingencies for evaluation of voltage safety conditions; Selecao e ordenacao de contingencias para avaliacao das condicoes de seguranca de tensao

    Energy Technology Data Exchange (ETDEWEB)

    Moura, Ricardo Drumond de

    2002-10-01

    Although the use of reactive power compensation allows a higher loading in the electric system, it also drives the system to operate closer to voltage collapse situations. It is therefore necessary to assess the behavior of the system with regard to this phenomenon in the occurrence of contingencies. In this work, several existing methods are studied to check their capacity for ranking the contingencies that might affect the system, how they rank these contingencies by severity, and how they select those which are most damaging. The methods are studied with real-time operation as the main focus. This work proposes a method able to rank and select a list of probable contingencies based on nodal indexes for assessing voltage security conditions. These indexes are based on the MVA margin to the maximum loading, indicate the region of operation on the V x P,Q curve, and reflect the relative importance among buses. The sensitivity index, which indicates the reduction of the power margin under a contingency, is studied in detail. Besides the nodal analysis, a form of systemic analysis is proposed that can rank and select contingencies according to their influence on the whole electrical system. (author)
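
As a rough sketch of the kind of margin-based ranking this abstract describes (the base margin, contingency names, and post-contingency values below are invented for illustration, not taken from the thesis), contingencies can be ordered by how much each one reduces the MVA margin to maximum loading:

```python
def rank_by_margin_reduction(base_margin_mva, post_margins):
    """Rank contingencies by how much they shrink the MVA loading margin.

    base_margin_mva: margin to maximum loading in the intact system;
    post_margins: {contingency name: margin after that outage}.
    All values are hypothetical, for illustration only.
    """
    return sorted(
        ((name, base_margin_mva - m) for name, m in post_margins.items()),
        key=lambda item: item[1],
        reverse=True,  # largest margin reduction = most severe
    )

base = 420.0  # assumed MVA margin of the intact system
post = {"line 1 out": 310.0, "line 2 out": 395.0, "shunt out": 260.0}
severity = rank_by_margin_reduction(base, post)
# severity lists the contingencies from most to least damaging
```

The same ordering logic extends to any scalar severity index; only the computation of the post-contingency margin (a load-flow or continuation power-flow problem) changes.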

  14. Contingent Attentional Capture by Conceptually Relevant Images

    Science.gov (United States)

    Wyble, Brad; Folk, Charles; Potter, Mary C.

    2013-01-01

    Attentional capture is an unintentional shift of visuospatial attention to the location of a distractor that is either highly salient, or relevant to the current task set. The latter situation is referred to as contingent capture, in that the effect is contingent on a match between characteristics of the stimuli and the task-defined…

  15. Pure perceptual-based learning of second-, third-, and fourth-order sequential probabilities.

    Science.gov (United States)

    Remillard, Gilbert

    2011-07-01

    There is evidence that sequence learning in the traditional serial reaction time task (SRTT), where target location is the response dimension, and sequence learning in the perceptual SRTT, where target location is not the response dimension, are handled by different mechanisms. The ability of the latter mechanism to learn sequential contingencies that can be learned by the former mechanism was examined. Prior research has established that people can learn second-, third-, and fourth-order probabilities in the traditional SRTT. The present study reveals that people can learn such probabilities in the perceptual SRTT. This suggests that the two mechanisms may have similar architectures. A possible neural basis of the two mechanisms is discussed.
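
For readers unfamiliar with the terminology, an nth-order sequential probability is the probability of the next target location conditioned on the previous n locations. A minimal sketch of how such probabilities could be estimated from a sequence of locations (the helper below is illustrative, not code from the study):

```python
from collections import Counter

def nth_order_probs(seq, n):
    """Estimate P(next location | previous n locations) from a sequence.

    Hypothetical helper for illustration: counts each length-n context and
    each (context, next) pair, then returns their ratio.
    """
    context_counts = Counter()
    joint_counts = Counter()
    for i in range(n, len(seq)):
        context = tuple(seq[i - n:i])
        context_counts[context] += 1
        joint_counts[(context, seq[i])] += 1
    return {
        (ctx, nxt): joint_counts[(ctx, nxt)] / context_counts[ctx]
        for (ctx, nxt) in joint_counts
    }

# Example: a second-order contingency -- after locations (1, 2),
# the target usually appears at location 3.
seq = [1, 2, 3, 1, 2, 3, 1, 2, 4, 1, 2, 3]
probs = nth_order_probs(seq, 2)
# probs[((1, 2), 3)] -> 0.75; probs[((1, 2), 4)] -> 0.25
```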

  16. The Effectiveness of Gaze-Contingent Control in Computer Games.

    Science.gov (United States)

    Orlov, Paul A; Apraksin, Nikolay

    2015-01-01

    Eye-tracking technology and gaze-contingent control in human-computer interaction have become an objective reality. This article reports on a series of eye-tracking experiments, in which we concentrated on one aspect of gaze-contingent interaction: its effectiveness compared with mouse-based control in a computer strategy game. We propose a measure for evaluating the effectiveness of interaction based on "the time of recognition" of the game unit. In this article, we use this measure to compare gaze- and mouse-contingent systems, and we present an analysis of the differences as a function of the number of game units. Our results indicate that the performance of gaze-contingent interaction is typically higher than that of mouse manipulation in a visual searching task. When tested on 60 subjects, the results showed that the effectiveness of gaze-contingent systems was over 1.5 times higher. In addition, we found that eye behavior stays quite stable with or without mouse interaction. © The Author(s) 2015.

  17. Electrophysiological brain indices of risk behavior modification induced by contingent feedback.

    Science.gov (United States)

    Megías, Alberto; Torres, Miguel Angel; Catena, Andrés; Cándido, Antonio; Maldonado, Antonio

    2018-02-01

    The main aim of this research was to study the effects of response feedback on risk behavior and the neural and cognitive mechanisms involved, as a function of the feedback contingency. Sixty drivers were randomly assigned to one of three feedback groups: contingent, non-contingent and no feedback. The participants' task consisted of braking or not when confronted with a set of risky driving situations, while their electroencephalographic activity was continuously recorded. We observed that contingent feedback, as opposed to non-contingent feedback, promoted changes in the response bias towards safer decisions. This behavioral modification implied a higher demand on cognitive control, reflected in a larger amplitude of the N400 component. Moreover, the contingent feedback, being predictable and entailing more informative value, gave rise to smaller SPN and larger FRN scores when compared with non-contingent feedback. Taken together, these findings provide a new and complex insight into the neurophysiological basis of the influence of feedback contingency on the processing of decision-making under risk. We suggest that response feedback, when contingent upon the risky behavior, appears to improve the functionality of the brain mechanisms involved in decision-making and can be a powerful tool for reducing the tendency to choose risky options in risk-prone individuals. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. Skype me! Socially Contingent Interactions Help Toddlers Learn Language

    OpenAIRE

    Roseberry, Sarah; Hirsh-Pasek, Kathy; Golinkoff, Roberta Michnick

    2013-01-01

    Language learning takes place in the context of social interactions, yet the mechanisms that render social interactions useful for learning language remain unclear. This paper focuses on whether social contingency might support word learning. Toddlers aged 24- to 30-months (N=36) were exposed to novel verbs in one of three conditions: live interaction training, socially contingent video training over video chat, and non-contingent video training (yoked video). Results sugges...

  19. Contingency planning in southern Africa: Events rather than processes?

    OpenAIRE

    Elias Mabaso; Siambabala B. Manyena

    2013-01-01

    With the increasing frequency, magnitude and impact of disasters, there is growing focus on contingency planning as a tool for enhancing resilience. Yet, there is little empirical evidence that reflects on the practice of contingency planning systems within the context of disaster risk reduction. This article explores the practice of contingency planning in southern Africa, focussing on Malawi, Mozambique, Namibia, Zambia and Zimbabwe. A qualitative comparative analysis informed by fieldwork ...

  20. MATERNAL ANXIETY SYMPTOMS AND MOTHER–INFANT SELF- AND INTERACTIVE CONTINGENCY

    Science.gov (United States)

    Beebe, Beatrice; Steele, Miriam; Jaffe, Joseph; Buck, Karen A.; Chen, Henian; Cohen, Patricia; Kaitz, Marsha; Markese, Sara; Andrews, Howard; Margolis, Amy; Feldstein, Stanley

    2014-01-01

    Associations of maternal self-report anxiety-related symptoms with mother–infant 4-month face-to-face play were investigated in 119 pairs. Attention, affect, spatial orientation, and touch were coded from split-screen videotape on a 1-s time base. Self- and interactive contingency were assessed by time-series methods. Because anxiety symptoms signal emotional dysregulation, we expected to find atypical patterns of mother–infant interactive contingencies, and of degree of stability/lability within an individual’s own rhythms of behavior (self-contingencies). Consistent with our optimum midrange model, maternal anxiety-related symptoms biased the interaction toward interactive contingencies that were both heightened (vigilant) in some modalities and lowered (withdrawn) in others; both may be efforts to adapt to stress. Infant self-contingency was lowered (“destabilized”) with maternal anxiety symptoms; however, maternal self-contingency was both lowered in some modalities and heightened (overly stable) in others. Interactive contingency patterns were characterized by intermodal discrepancies, confusing forms of communication. For example, mothers vigilantly monitored infants visually, but withdrew from contingently coordinating with infants emotionally, as if mothers were “looking through” them. This picture fits descriptions of mothers with anxiety symptoms as overaroused/fearful, leading to vigilance, but dealing with their fear through emotional distancing. Infants heightened facial affect coordination (vigilance), but dampened vocal affect coordination (withdrawal), with mother’s face—a pattern of conflict. The maternal and infant patterns together generated a mutual ambivalence. PMID:25983359
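
The time-series contingency measures described above can be approximated, very loosely, by lagged correlations: self-contingency relates a partner's behavior to their own earlier behavior, and interactive contingency relates it to the other partner's earlier behavior. A toy sketch of the idea (a simplified stand-in, not the actual time-series method used in the study):

```python
def mean(xs):
    return sum(xs) / len(xs)

def lagged_corr(x, y, lag=1):
    """Pearson correlation of x[t] with y[t + lag].

    With x is y, this approximates self-contingency (stability of one's own
    behavioral rhythm); with two different series, interactive contingency.
    Illustrative only -- the study used more elaborate time-series models.
    """
    a, b = x[:-lag], y[lag:]
    ma, mb = mean(a), mean(b)
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    var_a = sum((u - ma) ** 2 for u in a)
    var_b = sum((v - mb) ** 2 for v in b)
    return cov / (var_a * var_b) ** 0.5

# Self-contingency of a perfectly regular (highly "stable") toy series,
# coded on a 1-s time base as in the study:
infant = [0, 1, 0, 1, 0, 1, 0, 1, 0, 1]
self_contingency = lagged_corr(infant, infant, lag=2)
```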

  1. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  2. The Impact of the Contingency of Robot Feedback for HRI

    DEFF Research Database (Denmark)

    Fischer, Kerstin; Lohan, Katrin Solveig; Saunders, Joe

    2013-01-01

    ... robot iCub on a set of shapes and on a stacking task in two conditions, once with socially contingent, nonverbal feedback implemented in response to different gaze and looming behaviors of the human tutor, and once with non-contingent, saliency-based feedback. The results of the analysis of participants’ linguistic behaviors in the two conditions show that contingency has an impact on the complexity and the pre-structuring of the task for the robot, i.e. on the participants’ tutoring behaviors. Contingency thus plays a considerable role for learning by demonstration.

  3. Statistical inference based on divergence measures

    CERN Document Server

    Pardo, Leandro

    2005-01-01

    The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. However, in spite of the fact that divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models, many statisticians remain unaware of this powerful approach.Statistical Inference Based on Divergence Measures explores classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence. The first two chapters form an overview, from a statistical perspective, of the most important measures of entropy and divergence and study their properties. The author then examines the statistical analysis of discrete multivariate data with emphasis on problems in contingency tables and loglinear models using phi-divergence test statistics as well as minimum phi-divergence estimators. The final chapter looks at testing in general populations, prese...

  4. Probabilistic real-time contingency ranking method

    International Nuclear Information System (INIS)

    Mijuskovic, N.A.; Stojnic, D.

    2000-01-01

    This paper describes a real-time contingency ranking method based on a probabilistic index: expected energy not supplied. In this way it is possible to take into account the stochastic nature of electric power system equipment outages. This approach enables a more comprehensive ranking of contingencies and makes it possible to form reliability cost values that can serve as the basis for hourly spot price calculations. The electric power system of Serbia is used as an example for the proposed method. (author)
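
A minimal sketch of ranking by an expected-energy-not-supplied (EENS) index, assuming each contingency is summarized by an outage probability and the energy not supplied if it occurs (contingency names and numbers below are invented for illustration, not from the paper):

```python
def rank_by_eens(contingencies):
    """Rank contingencies by expected energy not supplied.

    Each entry is (name, outage_probability, energy_not_supplied_MWh);
    EENS = probability * energy not supplied. Values are illustrative.
    """
    ranked = sorted(contingencies, key=lambda c: c[1] * c[2], reverse=True)
    return [(name, p * ens) for name, p, ens in ranked]

contingencies = [
    ("line A-B outage", 0.02, 150.0),
    ("transformer T1",  0.005, 900.0),
    ("line C-D outage", 0.01, 200.0),
]
ranking = rank_by_eens(contingencies)
# A rare but severe outage (transformer T1) can outrank a more
# probable but milder one -- the point of a probabilistic index.
```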

  5. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2015-01-01

    A statistical procedure of estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. Also safety factor influence is investigated. DECOFF statistical method is benchmarked against standard Alpha-factor...

  6. Probability sampling in legal cases: Kansas cellphone users

    Science.gov (United States)

    Kadane, Joseph B.

    2012-10-01

    Probability sampling is a standard statistical technique. This article introduces the basic ideas of probability sampling, and shows in detail how probability sampling was used in a particular legal case.

  7. DETERMINING TYPE Ia SUPERNOVA HOST GALAXY EXTINCTION PROBABILITIES AND A STATISTICAL APPROACH TO ESTIMATING THE ABSORPTION-TO-REDDENING RATIO R{sub V}

    Energy Technology Data Exchange (ETDEWEB)

    Cikota, Aleksandar [European Southern Observatory, Karl-Schwarzschild-Strasse 2, D-85748 Garching b. München (Germany); Deustua, Susana [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Marleau, Francine, E-mail: acikota@eso.org [Institute for Astro- and Particle Physics, University of Innsbruck, Technikerstrasse 25/8, A-6020 Innsbruck (Austria)

    2016-03-10

    We investigate limits on the extinction values of Type Ia supernovae (SNe Ia) to statistically determine the most probable color excess, E(B – V), with galactocentric distance, and use these statistics to determine the absorption-to-reddening ratio, R{sub V}, for dust in the host galaxies. We determined pixel-based dust mass surface density maps for 59 galaxies from the Key Insight on Nearby Galaxies: a Far-infrared Survey with Herschel (KINGFISH). We use SN Ia spectral templates to develop a Monte Carlo simulation of color excess E(B – V) with R{sub V} = 3.1 and investigate the color excess probabilities E(B – V) with projected radial galaxy center distance. Additionally, we tested our model using observed spectra of SN 1989B, SN 2002bo, and SN 2006X, which occurred in three KINGFISH galaxies. Finally, we determined the most probable reddening for Sa–Sap, Sab–Sbp, Sbc–Scp, Scd–Sdm, S0, and irregular galaxy classes as a function of R/R{sub 25}. We find that the largest expected reddening probabilities are in Sab–Sb and Sbc–Sc galaxies, while S0 and irregular galaxies are very dust poor. We present a new approach for determining the absorption-to-reddening ratio R{sub V} using color excess probability functions and find values of R{sub V} = 2.71 ± 1.58 for 21 SNe Ia observed in Sab–Sbp galaxies, and R{sub V} = 1.70 ± 0.38, for 34 SNe Ia observed in Sbc–Scp galaxies.
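
As a toy version of the kind of Monte Carlo described above (the exponential reddening distribution, its mean, and the seed are assumptions for illustration, not the KINGFISH-based model), color excesses E(B - V) can be drawn and converted to V-band extinction via A_V = R_V * E(B - V):

```python
import random

def simulate_extinction(n, r_v=3.1, mean_ebv=0.1, seed=42):
    """Toy Monte Carlo of host-galaxy reddening (illustrative only).

    Draws n color excesses E(B-V) from an exponential distribution with
    the given mean (an assumption, not the paper's dust-map-based model)
    and converts each to extinction via A_V = R_V * E(B-V).
    """
    rng = random.Random(seed)
    ebv = [rng.expovariate(1.0 / mean_ebv) for _ in range(n)]
    a_v = [r_v * e for e in ebv]
    return ebv, a_v

ebv, a_v = simulate_extinction(10000)
# With many draws, the sample mean of E(B-V) sits near the assumed 0.1 mag.
```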

  8. Does contingency in adults' responding influence 12-month-old infants' social referencing?

    Science.gov (United States)

    Stenberg, Gunilla

    2017-11-01

    In two experiments we examined the influence of contingent versus non-contingent responding on infant social referencing behavior. EXPERIMENT 1: Forty 12-month-old infants were exposed to an ambiguous toy in a social referencing situation. In one condition an unfamiliar adult who in a previous play situation had responded contingently to the infant's looks gave the infant positive information about the toy. In the other condition an unfamiliar adult who previously had not responded contingently delivered the positive information. EXPERIMENT 2: Forty-eight 12-month-old infants participated in Experiment 2. This experiment examined whether the familiarity of the adult influences infants' reactions to contingency in responding. In one condition a parent who previously had responded contingently to the infant's looks provided positive information about the ambiguous toy, and in the other condition a parent who previously had not responded contingently provided the positive information. The infants looked more at the contingent experimenter in Experiment 1, and also played more with the toy after receiving positive information from the contingent experimenter. No differences in looking at the parent and in playing with the toy were found in Experiment 2. The results indicate that contingency in responding, as well as the familiarity of the adult, influence infants' social referencing behavior. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Contingency learning without awareness: evidence for implicit control.

    Science.gov (United States)

    Schmidt, James R; Crump, Matthew J C; Cheesman, Jim; Besner, Derek

    2007-06-01

    The results of four experiments provide evidence for controlled processing in the absence of awareness. Participants identified the colour of a neutral distracter word. Each of four words (e.g., MOVE) was presented in one of the four colours 75% of the time (Experiments 1 and 4) or 50% of the time (Experiments 2 and 3). Colour identification was faster when the words appeared in the colour they were most often presented in relative to when they appeared in another colour, even for participants who were subjectively unaware of any contingencies between the words and the colours. An analysis of sequence effects showed that participants who were unaware of the relation between distracter words and colours nonetheless controlled the impact of the word on performance depending on the nature of the previous trial. A block analysis of contingency-unaware participants revealed that contingencies were learned rapidly in the first block of trials. Experiment 3 showed that the contingency effect does not depend on the level of awareness, thus ruling out explicit strategy accounts. Finally, Experiment 4 showed that the contingency effect results from behavioural control and not from semantic association or stimulus familiarity. These results thus provide evidence for implicit control.
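
A minimal sketch of the contingency manipulation described above, in which each distracter word appears in its assigned colour on a fixed proportion of trials (the word list, colours, trial count, and seed are illustrative, not the exact design):

```python
import random

def build_trials(words, colours, p_high=0.75, n_trials=400, seed=1):
    """Generate word-colour trials with a hidden contingency.

    Each word is paired with one colour on roughly p_high of its trials
    (as in Experiments 1 and 4) and with a randomly chosen other colour
    on the remainder. Hypothetical helper, for illustration only.
    """
    rng = random.Random(seed)
    pairing = dict(zip(words, colours))  # each word's high-contingency colour
    trials = []
    for _ in range(n_trials):
        word = rng.choice(words)
        if rng.random() < p_high:
            colour = pairing[word]
        else:
            colour = rng.choice([c for c in colours if c != pairing[word]])
        trials.append((word, colour, colour == pairing[word]))
    return trials

trials = build_trials(["MOVE", "SENT", "TAKE", "WORD"],
                      ["red", "green", "blue", "yellow"])
```

The contingency effect is then the difference in mean colour-identification RT between the high-contingency trials (third field True) and the low-contingency trials.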

  10. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords : Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics OBOR OECD: Statistics and probability Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  11. The Contingent Value of Organizational Integration

    Directory of Open Access Journals (Sweden)

    Virpi Turkulainen

    2013-08-01

    Full Text Available We elaborate the link between organizational design and effectiveness by examining organizational integration and performance in the context of modern manufacturing. Through careful contextualization and empirical analysis of 266 manufacturing organizations in three industries and nine countries, we uncover a joint effect of integration and complexity on organizational effectiveness. The results extend structural contingency theory, in particular the mechanisms that link organizational integration to organizational effectiveness. We conclude by discussing the continuing relevance of structural contingency theory.

  12. Distinct changes in CREB phosphorylation in frontal cortex and striatum during contingent and non-contingent performance of a visual attention task

    Directory of Open Access Journals (Sweden)

    Mirjana eCarli

    2011-10-01

    Full Text Available The cyclic-AMP response element binding protein (CREB) family of transcription factors has been implicated in numerous forms of behavioural plasticity. We investigated CREB phosphorylation along some nodes of corticostriatal circuitry, namely the frontal cortex (FC) and the dorsal (caudate putamen, CPu) and ventral (nucleus accumbens, NAC) striatum, in response to the contingent or non-contingent performance of the five-choice serial reaction time task (5-CSRTT), used to assess visuospatial attention. Three experimental manipulations were used: an attentional performance group (contingent, master), a group trained previously on the task but for whom the instrumental contingency coupling responding with stimulus detection and reward was abolished (non-contingent, yoked), and a control group matched for food deprivation and exposure to the test apparatus (untrained). Rats trained on the 5-CSRTT (both master and yoked) had higher levels of CREB protein in the FC, CPu and NAC compared to untrained controls. Despite the divergent behaviour of master and yoked rats, CREB activity in the FC was not substantially different. In rats performing the 5-CSRTT (master), CREB activity was completely abolished in the CPu, whereas in the NAC it remained unchanged. In contrast, CREB phosphorylation in the CPu and NAC increased only when the contingency changed from goal-dependent to goal-independent reinforcement (yoked). The present results indicate that the up-regulation of CREB protein expression across cortical and striatal regions possibly reflects the extensive instrumental learning and performance, whereas increased CREB activity in striatal regions may signal the unexpected change in the relationship between instrumental action and reinforcement.

  13. Conditional Probability Analysis: A Statistical Tool for Environmental Analysis.

    Science.gov (United States)

    The use and application of environmental conditional probability analysis (CPA) is relatively recent. The first presentation using CPA was made in 2002 at the New England Association of Environmental Biologists Annual Meeting in Newport, Rhode Island. CPA has been used since the...

  14. Limiting values of large deviation probabilities of quadratic statistics

    NARCIS (Netherlands)

    Jeurnink, Gerardus A.M.; Kallenberg, W.C.M.

    1990-01-01

    Application of exact Bahadur efficiencies in testing theory or exact inaccuracy rates in estimation theory needs evaluation of large deviation probabilities. Because of the complexity of the expressions, frequently a local limit of the nonlocal measure is considered. Local limits of large deviation

  15. Effectiveness evaluation of contingency sum as a risk management ...

    African Journals Online (AJOL)

    Ethiopian Journal of Environmental Studies and Management ... manage risks prone projects have adopted several methods, one of which is contingency sum. ... initial project cost, cost overrun and percentage allowed for contingency.

  16. Contingent capture effects in temporal order judgments.

    Science.gov (United States)

    Born, Sabine; Kerzel, Dirk; Pratt, Jay

    2015-08-01

    The contingent attentional capture hypothesis proposes that visual stimuli that do not possess characteristics relevant for the current task will not capture attention, irrespective of their bottom-up saliency. Typically, contingent capture is tested in a spatial cuing paradigm, comparing manual reaction times (RTs) across different conditions. However, attention may act through several mechanisms and RTs may not be ideal to disentangle those different components. In 3 experiments, we examined whether color singleton cues provoke cuing effects in temporal order judgments (TOJs) and whether they would be contingent on attentional control sets. Experiment 1 showed that color singleton cues indeed produce cuing effects in TOJs, even in a cluttered and dynamic target display containing multiple heterogeneous distractors. In Experiment 2, consistent with contingent capture, we observed reliable cuing effects only when the singleton cue matched participants' current attentional control set. Experiment 3 suggests that a sensory interaction account of the differences found in Experiment 2 is unlikely. Our results help to discern the attentional components that may play a role in contingent capture. Further, we discuss a number of other effects (e.g., reversed cuing effects) that are found in RTs, but so far have not been reported in TOJs. Those differences suggest that RTs are influenced by a multitude of mechanisms; however, not all of these mechanisms may affect TOJs. We conclude by highlighting how the study of attentional capture in TOJs provides valuable insights for the attention literature, but also for studies concerned with the perceived timing between stimuli. (c) 2015 APA, all rights reserved.

  17. 25 CFR 39.502 - How does a school apply for contingency funds?

    Science.gov (United States)

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false How does a school apply for contingency funds? 39.502... EQUALIZATION PROGRAM Contingency Fund § 39.502 How does a school apply for contingency funds? To apply for contingency funds, a school must send a request to the ELO. The ELO must send the request to the Director for...

  18. The detection of contingency and animacy from simple animations in the human brain.

    Science.gov (United States)

    Blakemore, S-J; Boyer, P; Pachot-Clouard, M; Meltzoff, A; Segebarth, C; Decety, J

    2003-08-01

    Contingencies between objects and people can be mechanical or intentional-social in nature. In this fMRI study we used simplified stimuli to investigate brain regions involved in the detection of mechanical and intentional contingencies. Using a factorial design we manipulated the 'animacy' and 'contingency' of stimulus movement, and the subject's attention to the contingencies. The detection of mechanical contingency between shapes whose movement was inanimate engaged the middle temporal gyrus and right intraparietal sulcus. The detection of intentional contingency between shapes whose movement was animate activated superior parietal networks bilaterally. These activations were unaffected by attention to contingency. Additional regions, the right middle frontal gyrus and left superior temporal sulcus, became activated by the animate-contingent stimuli when subjects specifically attended to the contingent nature of the stimuli. Our results help to clarify neural networks previously associated with 'theory of mind' and agency detection. In particular, the results suggest that low-level perception of agency in terms of objects reacting to other objects at a distance is processed by parietal networks. In contrast, the activation of brain regions traditionally associated with theory of mind tasks appears to require attention to be directed towards agency and contingency.

  19. CW-FIT: Group Contingency Effects across the Day

    Science.gov (United States)

    Wills, Howard P.; Iwaszuk, Wendy M.; Kamps, Debra; Shumate, Emily

    2014-01-01

    This study explored the effects of a group-contingency intervention on student behavior across academic instructional periods. Research suggests group contingencies are evidence-based practices, yet calls for investigation to determine the best conditions and groups suited for this type of intervention. CW-FIT (Class-Wide Function-related…

  20. Skype me! Socially contingent interactions help toddlers learn language.

    Science.gov (United States)

    Roseberry, Sarah; Hirsh-Pasek, Kathy; Golinkoff, Roberta M

    2014-01-01

    Language learning takes place in the context of social interactions, yet the mechanisms that render social interactions useful for learning language remain unclear. This study focuses on whether social contingency might support word learning. Toddlers aged 24-30 months (N = 36) were exposed to novel verbs in one of three conditions: live interaction training, socially contingent video training over video chat, and noncontingent video training (yoked video). Results suggest that children only learned novel verbs in socially contingent interactions (live interactions and video chat). This study highlights the importance of social contingency in interactions for language learning and informs the literature on learning through screen media as the first study to examine word learning through video chat technology. © 2013 The Authors. Child Development © 2013 Society for Research in Child Development, Inc.

  1. Statistics concerning the Apollo command module water landing, including the probability of occurrence of various impact conditions, successful impact, and body X-axis loads

    Science.gov (United States)

    Whitnah, A. M.; Howes, D. B.

    1971-01-01

    Statistical information for the Apollo command module water landings is presented. This information includes the probability of occurrence of various impact conditions, a successful impact, and body X-axis loads of various magnitudes.

  2. The Role of Relational Information in Contingent Capture

    Science.gov (United States)

    Becker, Stefanie I.; Folk, Charles L.; Remington, Roger W.

    2010-01-01

    On the contingent capture account, top-down attentional control settings restrict involuntary attentional capture to items that match the features of the search target. Attention capture is involuntary, but contingent on goals and intentions. The observation that only target-similar items can capture attention has usually been taken to show that…

  3. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the 7 presentations made at the third meeting 'physics and fundamental questions', whose theme was probability and prediction. The concept of probability, invented to apprehend random phenomena, has become an important branch of mathematics, and its applications range from radioactivity to species evolution, via cosmology and the management of very weak risks. The notion of probability is the basis of quantum mechanics and is thus bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  4. Contingency Contracting within the Department of Defense: A Comparative Analysis

    National Research Council Canada - National Science Library

    McMillion, Chester

    2000-01-01

    .... The thesis compares and contrasts the regulations governing the contingency contracting operations, the organization structure, contingency contracting support plans, and the training requirements...

  5. 48 CFR 403.405 - Misrepresentations or violations of the Covenant Against Contingent Fees.

    Science.gov (United States)

    2010-10-01

    ... violations of the Covenant Against Contingent Fees. 403.405 Section 403.405 Federal Acquisition Regulations... Contingent Fees 403.405 Misrepresentations or violations of the Covenant Against Contingent Fees. (a) A suspected misrepresentation or violation of the Covenant Against Contingent Fees shall be documented in...

  6. Dengue Contingency Planning: From Research to Policy and Practice

    Science.gov (United States)

    Runge-Ranzinger, Silvia; Kroeger, Axel; Olliaro, Piero; McCall, Philip J.; Sánchez Tejeda, Gustavo; Lloyd, Linda S.; Hakim, Lokman; Bowman, Leigh R.; Horstick, Olaf; Coelho, Giovanini

    2016-01-01

    Background: Dengue is a disease of increasing incidence across many parts of the world. In response, an evidence-based handbook to translate research into policy and practice was developed. This handbook facilitates contingency planning as well as the development and use of early warning and response systems for dengue fever epidemics, by identifying decision-making processes that contribute to the success or failure of dengue surveillance, as well as triggers that initiate effective responses to incipient outbreaks. Methodology/Principal Findings: Available evidence was evaluated using a step-wise process that included systematic literature reviews, policymaker and stakeholder interviews, a study to assess dengue contingency planning and outbreak management in 10 countries, and a retrospective logistic regression analysis to identify alarm signals for an outbreak warning system using datasets from five dengue endemic countries. Best practices for managing a dengue outbreak are provided for key elements of a dengue contingency plan including timely contingency planning, the importance of a detailed, context-specific dengue contingency plan that clearly distinguishes between routine and outbreak interventions, surveillance systems for outbreak preparedness, outbreak definitions, alert algorithms, managerial capacity, vector control capacity, and clinical management of large caseloads. Additionally, a computer-assisted early warning system, which enables countries to identify and respond to context-specific variables that predict forthcoming dengue outbreaks, has been developed. Conclusions/Significance: Most countries do not have comprehensive, detailed contingency plans for dengue outbreaks. Countries tend to rely on intensified vector control as their outbreak response, with minimal focus on integrated management of clinical care, epidemiological, laboratory and vector surveillance, and risk communication. The Technical Handbook for Surveillance, Dengue Outbreak

  7. Category learning in the color-word contingency learning paradigm.

    Science.gov (United States)

    Schmidt, James R; Augustinova, Maria; De Houwer, Jan

    2018-04-01

    In the typical color-word contingency learning paradigm, participants respond to the print color of words where each word is presented most often in one color. Learning is indicated by faster and more accurate responses when a word is presented in its usual color, relative to another color. To eliminate the possibility that this effect is driven exclusively by the familiarity of item-specific word-color pairings, we examine whether contingency learning effects can also be observed when colors are related to categories of words rather than to individual words. To this end, the reported experiments used three categories of words (animals, verbs, and professions) that were each predictive of one color. Importantly, each individual word was presented only once, thus eliminating individual color-word contingencies. Nevertheless, for the first time, a category-based contingency effect was observed, with faster and more accurate responses when a category item was presented in the color in which most of the other items of that category were presented. This finding helps to constrain episodic learning models and sets the stage for new research on category-based contingency learning.

  8. Acquisition of automatic imitation is sensitive to sensorimotor contingency.

    Science.gov (United States)

    Cook, Richard; Press, Clare; Dickinson, Anthony; Heyes, Cecilia

    2010-08-01

    The associative sequence learning model proposes that the development of the mirror system depends on the same mechanisms of associative learning that mediate Pavlovian and instrumental conditioning. To test this model, two experiments used the reduction of automatic imitation through incompatible sensorimotor training to assess whether mirror system plasticity is sensitive to contingency (i.e., the extent to which activation of one representation predicts activation of another). In Experiment 1, residual automatic imitation was measured following incompatible training in which the action stimulus was a perfect predictor of the response (contingent) or not at all predictive of the response (noncontingent). A contingency effect was observed: There was less automatic imitation indicative of more learning in the contingent group. Experiment 2 replicated this contingency effect and showed that, as predicted by associative learning theory, it can be abolished by signaling trials in which the response occurs in the absence of an action stimulus. These findings support the view that mirror system development depends on associative learning and indicate that this learning is not purely Hebbian. If this is correct, associative learning theory could be used to explain, predict, and intervene in mirror system development.
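    Contingency in this associative-learning sense is standardly quantified as ΔP = P(response | stimulus) − P(response | no stimulus); the contingent group above corresponds to ΔP = 1 and the noncontingent group to ΔP = 0. A minimal sketch of the measure (illustrative data, not the authors' materials):

```python
def delta_p(trials):
    """One-way contingency ΔP = P(R|S) - P(R|~S), computed from a list
    of (stimulus_present, response_occurred) pairs coded as 0/1."""
    with_s = [r for s, r in trials if s]
    without_s = [r for s, r in trials if not s]
    return sum(with_s) / len(with_s) - sum(without_s) / len(without_s)

# Perfectly contingent: the response occurs iff the stimulus is shown.
contingent = [(1, 1)] * 10 + [(0, 0)] * 10
# Noncontingent: the response occurs half the time regardless of stimulus.
noncontingent = [(1, 1), (1, 0), (0, 1), (0, 0)] * 5

print(delta_p(contingent))     # 1.0
print(delta_p(noncontingent))  # 0.0
```

    A signaling manipulation like that of Experiment 2 amounts to adding response-alone trials, which lowers P(R|~S) attributed to the stimulus and hence the learned contingency.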

  9. Contingency learning in alcohol dependence and pathological gambling: learning and unlearning reward contingencies

    NARCIS (Netherlands)

    Vanes, L.D.; Holst, R.J. van; Jansen, J.M.; Brink, W. van den; Oosterlaan, J.; Goudriaan, A.E.

    2014-01-01

    BACKGROUND: Patients with alcohol dependence (AD) and pathological gambling (PG) are characterized by dysfunctional reward processing and their ability to adapt to alterations of reward contingencies is impaired. However, most neurocognitive tasks investigating reward processing involve a complex

  10. Contingency Learning in Alcohol Dependence and Pathological Gambling: Learning and Unlearning Reward Contingencies

    NARCIS (Netherlands)

    Vanes, L.D.; Holst, R.; Jansen, J.D.; van den Brink, W.A.; Oosterlaan, J.; Goudriaan, A.E.

    2014-01-01

    Background: Patients with alcohol dependence (AD) and pathological gambling (PG) are characterized by dysfunctional reward processing and their ability to adapt to alterations of reward contingencies is impaired. However, most neurocognitive tasks investigating reward processing involve a complex

  11. 75 FR 27986 - Electronic Filing System-Web (EFS-Web) Contingency Option

    Science.gov (United States)

    2010-05-19

    ...] Electronic Filing System--Web (EFS-Web) Contingency Option AGENCY: United States Patent and Trademark Office... contingency option when the primary portal to EFS-Web has an unscheduled outage. Previously, the entire EFS-Web system was unavailable to users during such an outage. The contingency option in EFS-Web will...

  12. The Graduate School and Its Organizational Structure: A Contingency Theory Approach.

    Science.gov (United States)

    Sanford, Judith Babcock

    Contingency theory, the formal structure of graduate schools, and the applicability of contingency variables to graduate schools as organizations are examined. Contingency theory is based on an open systems concept that views an organization as composed of many interdependent parts that are interacting with one another. It also holds that under…

  13. Contingency Table Browser - prediction of early stage protein structure.

    Science.gov (United States)

    Kalinowska, Barbara; Krzykalski, Artur; Roterman, Irena

    2015-01-01

    The Early Stage (ES) intermediate represents the starting structure in protein folding simulations based on the Fuzzy Oil Drop (FOD) model. The accuracy of FOD predictions is greatly dependent on the accuracy of the chosen intermediate. A suitable intermediate can be constructed using the sequence-structure relationship information contained in the so-called contingency table - this table expresses the likelihood of encountering various structural motifs for each tetrapeptide fragment in the amino acid sequence. The limited accuracy with which such structures could previously be predicted provided the motivation for a more in-depth study of the contingency table itself. The Contingency Table Browser is a tool which can visualize, search and analyze the table. Our work presents possible applications of the Contingency Table Browser, among them the analysis of specific protein sequences from the point of view of their structural ambiguity.

  14. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
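    The competing methods reviewed there all reduce to the chain rule for conditional probabilities. As one standard illustration (a hypothetical urn example, not taken from the article):

```python
from fractions import Fraction

def prob_all_red(red, total, draws):
    """P(all `draws` balls are red) when drawing without replacement from
    an urn with `red` red balls out of `total`, via the chain rule:
    P = prod_{i=0}^{draws-1} (red - i) / (total - i)."""
    p = Fraction(1)
    for i in range(draws):
        p *= Fraction(red - i, total - i)
    return p

# Urn with 4 red and 6 black balls; draw 2 without replacement:
# P(both red) = (4/10) * (3/9) = 2/15.
print(prob_all_red(4, 10, 2))  # 2/15
```

    Using exact rational arithmetic sidesteps the rounding questions that often confuse students when the same computation is done with decimals.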

  15. Dissociating Contingency Awareness and Conditioned Attitudes: Evidence of Contingency-Unaware Evaluative Conditioning

    Science.gov (United States)

    Hutter, Mandy; Sweldens, Steven; Stahl, Christoph; Unkelbach, Christian; Klauer, Karl Christoph

    2012-01-01

    Whether human evaluative conditioning can occur without contingency awareness has been the subject of an intense and ongoing debate for decades, troubled by a wide array of methodological difficulties. Following recent methodological innovations, the available evidence currently points to the conclusion that evaluative conditioning effects do not…

  16. Color and Contingency in Robert Boyle's Works.

    Science.gov (United States)

    Baker, Tawrin

    2015-01-01

    This essay investigates the relationship between color and contingency in Robert Boyle's Experiments and Considerations Touching Colours (1664) and his essays on the unsuccessfulness of experiments in Certain Physiological Essays (1661). In these two works Boyle wrestles with a difficult practical and philosophical problem with experiments, which he calls the problem of contingency. In Touching Colours, the problem of contingency is magnified by the much-debated issue of whether color had any deep epistemic importance. His limited theoretical principle guiding him in Touching Colours, that color is but modified light, further exacerbated the problem. Rather than theory, Boyle often relied on craftsmen, whose mastery of color phenomena was, Boyle mentions, brought about by economic forces, to determine when colors were indicators of important 'inward' properties of substances, and thus to secure a solid foundation for his experimental history of color.

  17. Application of the double-contingency principle within BNFL

    International Nuclear Information System (INIS)

    Strafford, P.I.D.

    1995-01-01

    Historically, the double-contingency principle has been used for criticality assessment within British Nuclear Fuels plc (BNFL). This paper outlines what is understood by the double-contingency principle to illustrate how it is applied in criticality safety assessments and to highlight various problem areas that are encountered and, where possible, how they might be solved

  18. Suited Contingency Ops Food - 2

    Science.gov (United States)

    Glass, J. W.; Leong, M. L.; Douglas, G. L.

    2014-01-01

    The contingency scenario for an emergency cabin depressurization event may require crewmembers to subsist in a pressurized suit for up to 144 hours. This scenario requires the capability for safe nutrition delivery through a helmet feed port against a 4 psi pressure differential to enable crewmembers to maintain strength and cognition to perform critical tasks. Two nutritional delivery prototypes were developed and analyzed for compatibility with the helmet feed port interface and for operational effectiveness against the pressure differential. The bag-in-bag (BiB) prototype, designed to equalize the suit pressure with the beverage pouch and enable a crewmember to drink normally, delivered water successfully to three different subjects in suits pressurized to 4 psi. The Boa restrainer pouch, designed to provide mechanical leverage to overcome the pressure differential, did not operate sufficiently. Guidelines were developed and compiled for contingency beverages that provide macro-nutritional requirements, a minimum one-year shelf life, and compatibility with the delivery hardware. Evaluation results and food product parameters have the potential to be used to improve future prototype designs and develop complete nutritional beverages for contingency events. These feeding capabilities would have additional use on extended surface mission EVAs, where the current in-suit drinking device may be insufficient.

  19. Statistical processing of experimental data

    OpenAIRE

    NAVRÁTIL, Pavel

    2012-01-01

    This thesis covers the theory of probability and of statistical sets: solved and unsolved problems on probability, random variables and their distributions, random vectors, statistical sets, and regression and correlation analysis. Solutions to the unsolved problems are included.

  20. Distinct Motivational Effects of Contingent and Noncontingent Rewards.

    Science.gov (United States)

    Manohar, Sanjay G; Finzi, Rebecca Dawn; Drew, Daniel; Husain, Masud

    2017-07-01

    When rewards are available, people expend more energy, increasing their motivational vigor. In theory, incentives might drive behavior for two distinct reasons: First, they increase expected reward; second, they increase the difference in subjective value between successful and unsuccessful performance, which increases contingency-the degree to which action determines outcome. Previous studies of motivational vigor have never compared these directly. Here, we indexed motivational vigor by measuring the speed of eye movements toward a target after participants heard a cue indicating how outcomes would be determined. Eye movements were faster when the cue indicated that monetary rewards would be contingent on performance than when the cue indicated that rewards would be random. But even when the cue indicated that a reward was guaranteed regardless of speed, movement was still faster than when no reward was available. Motivation by contingent and certain rewards was uncorrelated across individuals, which suggests that there are two separable, independent components of motivation. Contingent motivation generated autonomic arousal, and unlike noncontingent motivation, was effective with penalties as well as rewards.

  1. Contingency learning in alcohol dependence and pathological gambling: learning and unlearning reward contingencies

    NARCIS (Netherlands)

    Vanes, Lucy D.; van Holst, Ruth J.; Jansen, Jochem M.; van den Brink, Wim; Oosterlaan, Jaap; Goudriaan, Anna E.

    2014-01-01

    Patients with alcohol dependence (AD) and pathological gambling (PG) are characterized by dysfunctional reward processing and their ability to adapt to alterations of reward contingencies is impaired. However, most neurocognitive tasks investigating reward processing involve a complex mix of

  2. ACCOUNTING FOR CONTINGENT CONSIDERATIONS IN BUSINESS COMBINATIONS

    OpenAIRE

    Gurgen KALASHYAN

    2017-01-01

    According to IFRS 3 Business Combinations, contingent considerations must be included in the total consideration given for the acquired entity, along with cash, other assets, ordinary or preference equity instruments, options, and warrants. The contingent consideration is a determined amount that the acquiring entity has to pay to the acquired entity provided that certain conditions are fulfilled in the future. In case the provisions are not satisfied, we will get the situation when the amount of c...

  3. An Improved On-line Contingency Screening for Power System Transient Stability Assessment

    DEFF Research Database (Denmark)

    Weckesser, Johannes Tilman Gabriel; Jóhannsson, Hjörtur; Glavic, Mevludin

    2017-01-01

    This paper presents a contingency screening method and a framework for its on-line implementation. The proposed method carries out contingency screening and on-line stability assessment with respect to first-swing transient stability. For that purpose, it utilizes the single machine equivalent...... method and aims at improving previously developed contingency screening approaches. In order to determine the vulnerability of the system with respect to a particular contingency, only one time-domain simulation needs to be performed. An early stopping criterion is proposed so that in a majority of the cases...... the simulation can be terminated after a few hundred milliseconds of simulated system response. The method's outcome is an assessment of the system's stability and a classification of each considered contingency. The contingencies are categorized by exploiting parameters of an equivalent one machine infinite bus...

  4. Classical probabilities for Majorana and Weyl spinors

    International Nuclear Information System (INIS)

    Wetterich, C.

    2011-01-01

    Highlights: → Map of classical statistical Ising model to fermionic quantum field theory. → Lattice-regularized real Grassmann functional integral for single Weyl spinor. → Emerging complex structure characteristic for quantum physics. → A classical statistical ensemble describes a quantum theory. - Abstract: We construct a map between the quantum field theory of free Weyl or Majorana fermions and the probability distribution of a classical statistical ensemble for Ising spins or discrete bits. More precisely, a Grassmann functional integral based on a real Grassmann algebra specifies the time evolution of the real wave function q_τ(t) for the Ising states τ. The time-dependent probability distribution of a generalized Ising model is obtained as p_τ(t) = q_τ²(t). The functional integral employs a lattice regularization for single Weyl or Majorana spinors. We further introduce the complex structure characteristic for quantum mechanics. Probability distributions of the Ising model which correspond to one or many propagating fermions are discussed explicitly. Expectation values of observables can be computed equivalently in the classical statistical Ising model or in the quantum field theory for fermions.
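    In the notation of the abstract, the link between the real wave function and the Ising probabilities can be written compactly as (for classical, diagonal observables A; a restatement of the abstract's relations, not an additional result of the paper):

```latex
p_\tau(t) = q_\tau^2(t), \qquad
\sum_\tau p_\tau(t) = \sum_\tau q_\tau^2(t) = 1, \qquad
\langle A \rangle(t) = \sum_\tau A_\tau \, p_\tau(t),
```

    so that normalization of the wave function directly guarantees a normalized classical probability distribution, and expectation values can be evaluated on either side of the map.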

  5. All of statistics a concise course in statistical inference

    CERN Document Server

    Wasserman, Larry

    2004-01-01

    This book is for people who want to learn probability and statistics quickly. It brings together many of the main ideas in modern statistics in one place. The book is suitable for students and researchers in statistics, computer science, data mining and machine learning. This book covers a much wider range of topics than a typical introductory text on mathematical statistics. It includes modern topics like nonparametric curve estimation, bootstrapping and classification, topics that are usually relegated to follow-up courses. The reader is assumed to know calculus and a little linear algebra. No previous knowledge of probability and statistics is required. The text can be used at the advanced undergraduate and graduate level. Larry Wasserman is Professor of Statistics at Carnegie Mellon University. He is also a member of the Center for Automated Learning and Discovery in the School of Computer Science. His research areas include nonparametric inference, asymptotic theory, causality, and applications to astrophysics, bi...

  6. Contingency inferences driven by base rates: Valid by sampling

    Directory of Open Access Journals (Sweden)

    Florian Kutzner

    2011-04-01

    Fiedler et al. (2009) reviewed evidence for the utilization of a contingency inference strategy termed pseudocontingencies (PCs). In PCs, the more frequent levels (and, by implication, the less frequent levels) are assumed to be associated. PCs have been obtained using a wide range of task settings and dependent measures. Yet, the readiness with which decision makers rely on PCs is poorly understood. A computer simulation explored two potential sources of subjective validity of PCs. First, PCs are shown to perform above chance level when the task is to infer the sign of moderate to strong population contingencies from a sample of observations. Second, contingency inferences based on PCs and inferences based on cell frequencies are shown to partially agree across samples. Intriguingly, this criterion and convergent validity are by-products of random sampling error, highlighting the inductive nature of contingency inferences.
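    The PC strategy, inferring the sign of a contingency from aligned marginal skews alone, can be sketched as follows (the generator, base rates, and names here are illustrative assumptions, not the paper's simulation code):

```python
import random

def sample_contingency_table(n, phi_positive, base=0.7):
    """Draw n observations of two binary variables whose frequent levels
    are aligned (phi_positive=True) or misaligned. Returns 2x2 counts."""
    counts = [[0, 0], [0, 0]]
    for _ in range(n):
        x = 1 if random.random() < base else 0
        # y tends to match x for a positive contingency, mismatch otherwise
        match = random.random() < 0.8
        y = x if (match == phi_positive) else 1 - x
        counts[x][y] += 1
    return counts

def pc_inference(counts):
    """Pseudocontingency: assume the two frequent levels go together.
    Returns +1 if both marginals are skewed toward the same level."""
    n = sum(map(sum, counts))
    x1 = counts[1][0] + counts[1][1]   # frequency of x = 1
    y1 = counts[0][1] + counts[1][1]   # frequency of y = 1
    return +1 if (x1 > n - x1) == (y1 > n - y1) else -1

random.seed(0)
tab = sample_contingency_table(1000, phi_positive=True)
tab2 = sample_contingency_table(1000, phi_positive=False)
print(pc_inference(tab))   # 1  (marginal skews aligned)
print(pc_inference(tab2))  # -1 (marginal skews misaligned)
```

    Note that the PC rule never looks at the joint cell frequencies, only at the two marginals, which is exactly why its validity depends on how base rates and contingencies covary in the sampled environment.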

  7. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  8. Is probability of frequency too narrow?

    International Nuclear Information System (INIS)

    Martz, H.F.

    1993-01-01

    Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed

  9. The estimation of small probabilities and risk assessment

    International Nuclear Information System (INIS)

    Kalbfleisch, J.D.; Lawless, J.F.; MacKay, R.J.

    1982-01-01

    The primary contribution of statistics to risk assessment is in the estimation of probabilities. Frequently the probabilities in question are small, and their estimation is particularly difficult. The authors consider three examples illustrating some problems inherent in the estimation of small probabilities
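    A classic instance of the difficulty: with zero observed events in n trials, the point estimate is 0 and is useless for risk assessment, so one falls back on an upper confidence bound such as the "rule of three" (a standard textbook device used here for illustration, not necessarily one of the authors' three examples):

```python
def rule_of_three_upper(n):
    """Approximate 95% upper bound on p when 0 events occur in n trials."""
    return 3.0 / n

def exact_upper(n, alpha=0.05):
    """Exact one-sided upper bound: the p solving (1 - p)^n = alpha."""
    return 1.0 - alpha ** (1.0 / n)

n = 1000
print(rule_of_three_upper(n))     # 0.003
print(round(exact_upper(n), 6))   # 0.002991
```

    The closeness of the two numbers shows why 3/n is a serviceable shortcut; the hard problems the authors discuss arise when even n of this size is unattainable for the rare event in question.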

  10. F.Y. Edgeworth’s Treatise on Probabilities

    OpenAIRE

    Alberto Baccini

    2007-01-01

    Probability theory has a central role in Edgeworth’s thought; this paper examines the philosophical foundation of the theory. Starting from a frequentist position, Edgeworth introduced some innovations on the definition of primitive probabilities. He distinguished between primitive probabilities based on experience of statistical evidence, and primitive a priori probabilities based on a more general and less precise kind of experience, inherited by the human race through evolution. Given prim...

  11. The contingent valuation method: a review

    International Nuclear Information System (INIS)

    Venkatachalam, L.

    2004-01-01

    The contingent valuation method (CVM) is a simple, flexible nonmarket valuation method that is widely used in cost-benefit analysis and environmental impact assessment. However, this method is subject to severe criticism. The criticism revolves mainly around two aspects, namely, the validity and the reliability of the results, and the effects of various biases and errors. The major objective of this paper is to review the recent developments on measures to address the validity and reliability issues arising out of different kinds of biases/errors and other related empirical and methodological issues concerning contingent valuation method

  12. Computational statistics handbook with Matlab

    CERN Document Server

    Martinez, Wendy L

    2007-01-01

    Prefaces; Introduction (What Is Computational Statistics?; An Overview of the Book); Probability Concepts (Introduction; Probability; Conditional Probability and Independence; Expectation; Common Distributions); Sampling Concepts (Introduction; Sampling Terminology and Concepts; Sampling Distributions; Parameter Estimation; Empirical Distribution Function); Generating Random Variables (Introduction; General Techniques for Generating Random Variables; Generating Continuous Random Variables; Generating Discrete Random Variables); Exploratory Data Analysis (Introduction; Exploring Univariate Data; Exploring Bivariate and Trivariate Data; Exploring Multidimensional Data); Finding Structure (Introduction; Projecting Data; Principal Component Analysis; Projection Pursuit EDA; Independent Component Analysis; Grand Tour; Nonlinear Dimensionality Reduction); Monte Carlo Methods for Inferential Statistics (Introduction; Classical Inferential Statistics; Monte Carlo Methods for Inferential Statist...

  13. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability (The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence); Discrete Distributions (Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation); Continuous Probability (From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation); Continuous Distributions (The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables); Asymptotic Theory (Strong and Weak Laws of Large Numbers; Central Limit Theorem); Stochastic Processes and Applications (Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics); Appendix (Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  14. Analyzing Contingency Contracting Purchases for Operation Iraqi Freedom (Unrestricted Version)

    National Research Council Canada - National Science Library

    Baldwin, Laura H; Ausink, John A; Campbell, Nancy F; Drew, John G; Roll, Jr, Charles R

    2008-01-01

    ...) in an effort to determine the size and extent of contractor support, and how plans for and the organization and execution of contingency contracting activities might be improved so that Contingency...

  15. Contingency Analysis Post-Processing With Advanced Computing and Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt; Fitzhenry, Erin

    2017-07-01

    Contingency analysis is a critical function widely used in energy management systems to assess the impact of power system component failures. Its outputs are important for power system operation for improved situational awareness, power system planning studies, and power market operations. With the increased complexity of power system modeling and simulation caused by increased energy production and demand, the penetration of renewable energy and fast deployment of smart grid devices, and the trend of operating grids closer to their capacity for better efficiency, more and more contingencies must be executed and analyzed quickly in order to ensure grid reliability and accuracy for the power market. Currently, many researchers have proposed different techniques to accelerate the computational speed of contingency analysis, but not much work has been published on how to post-process the large amount of contingency outputs quickly. This paper proposes a parallel post-processing function that can analyze contingency analysis outputs faster and display them in a web-based visualization tool to help power engineers improve their work efficiency by fast information digestion. Case studies using an ESCA-60 bus system and a WECC planning system are presented to demonstrate the functionality of the parallel post-processing technique and the web-based visualization tool.
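    The post-processing stage described above is embarrassingly parallel over per-contingency outputs; a minimal sketch using a thread pool (the data format, voltage limits, and contingency names are hypothetical, not the paper's implementation):

```python
from concurrent.futures import ThreadPoolExecutor

def screen_contingency(result):
    """Flag a contingency whose post-outage bus voltages leave limits.
    `result` is a hypothetical (name, list of per-unit bus voltages) pair."""
    name, voltages = result
    violation = any(v < 0.95 or v > 1.05 for v in voltages)
    return name, violation

# Hypothetical outputs from a batch contingency analysis run.
results = [
    ("line_12_out", [1.00, 0.97, 0.96]),
    ("gen_3_out",   [1.01, 0.93, 0.98]),  # undervoltage at one bus
    ("xfmr_7_out",  [1.06, 1.00, 0.99]),  # overvoltage at one bus
]

with ThreadPoolExecutor(max_workers=4) as pool:
    flagged = dict(pool.map(screen_contingency, results))

print(flagged)
# {'line_12_out': False, 'gen_3_out': True, 'xfmr_7_out': True}
```

    Because each contingency's output is screened independently, the same pattern scales from a pool of threads to a cluster of workers, which is the regime the paper's case studies target.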

  16. Vagal tone during infant contingency learning and its disruption.

    Science.gov (United States)

    Sullivan, Margaret Wolan

    2016-04-01

    This study used contingency learning to examine changes in infants' vagal tone during learning and its disruption. The heart rate of 160 five-month-old infants was recorded continuously during the first of two training sessions as they experienced an audiovisual event contingent on their pulling. Maternal reports of infant temperament were also collected. Baseline vagal tone, a measure of parasympathetic regulation of the heart, was related to vagal levels during the infants' contingency learning session, but not to their learner status. Vagal tone levels did not vary significantly over session minutes. Instead, vagal tone levels were a function of both individual differences in learner status and infant soothability. Vagal levels of infants who learned in the initial session were similar regardless of their soothability; however, vagal levels of infants who learned in a subsequent session differed as a function of soothability. Additionally, vagal levels during contingency disruption were significantly higher among infants in this group who were more soothable as opposed to those who were less soothable. The results suggest that contingency learning and disruption is associated with stable vagal tone in the majority of infants, but that individual differences in attention processes and state associated with vagal tone may be most readily observed during the disruption phase. © 2015 Wiley Periodicals, Inc.

  17. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
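    The nearest-neighbor probability machine can be illustrated with a minimal sketch: treating the binary response as a 0/1 regression target, the estimated individual probability is simply the mean response among the k nearest training points. This is a pure-Python toy under that assumption, not the authors' R code.

    ```python
    import math

    def knn_probability(x_new, X, y, k=3):
        """Estimate P(y = 1 | x_new) as the mean 0/1 response among the
        k nearest training points (Euclidean distance)."""
        dists = sorted((math.dist(x_new, xi), yi) for xi, yi in zip(X, y))
        neighbors = [yi for _, yi in dists[:k]]
        return sum(neighbors) / k

    # Toy data: one feature; the response is mostly 1 for large x.
    X = [(0.0,), (0.5,), (1.0,), (2.0,), (2.5,), (3.0,)]
    y = [0, 0, 0, 1, 1, 1]

    print(knn_probability((2.6,), X, y))  # -> 1.0 (all 3 neighbors have y = 1)
    print(knn_probability((1.2,), X, y))  # neighbors y = 0, 0, 1 -> 1/3
    ```

    Consistency of such estimators (as the sample grows and k grows suitably) is what licenses reading the output as a probability rather than just a vote share.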

  18. Thévenin equivalent based static contingency assessment

    DEFF Research Database (Denmark)

    2015-01-01

    of the determined present state of the power system and determining a first representation of the network based on the determined Thevenin equivalents, determining a modified representation of the network, wherein the modified representation is a representation of the network having at least one contingency......, wherein at least one Thevenin equivalent of at least one voltage controlled node is modified due to the at least one contingency, the modified network representation being determined on the basis of the modified Thevenin equivalents, calculating voltage angles of the modified Thevenin equivalents......, and evaluating the voltage angles to determine whether the network having at least one contingency admit a steady state. Also a method of providing information on a real time static security assessment of a power system is disclosed....

  19. Appraisal of the Performance of Contingency Cost Provision for ...

    African Journals Online (AJOL)

    The paper appraised performance of contingency allowance in addressing projects' cost risk. To achieve this aim, impact of contingency provision in some selected building projects were evaluated. Data for the study was collected by means of checklist from 40 completed projects' files. Furthermore, 100 questionnaires on ...

  20. Contingency in the Cosmos and the Contingency of the Cosmos : Two Theological Approaches

    NARCIS (Netherlands)

    Drees, W.B.

    Contingency in reality may be epistemic, due to incomplete knowledge or the intersection of unrelated causal trajectories. In quantum physics, it appears to be ontological. More fundamental and interesting is the limit-question ‘why is there something rather than nothing,’ pointing out the

  1. Hypersensitivity to Contingent Behavior in Paranoia: A New Virtual Reality Paradigm.

    Science.gov (United States)

    Fornells-Ambrojo, Miriam; Elenbaas, Maaike; Barker, Chris; Swapp, David; Navarro, Xavier; Rovira, Aitor; Sanahuja, Josep Maria Tomàs; Slater, Mel

    2016-02-01

    Contingency in interpersonal relationships is associated with the development of secure attachment and trust, whereas paranoia arises from the overattribution of negative intentions. We used a new virtual reality paradigm to experimentally investigate the impact of contingent behavior on trust along the paranoia continuum. Sixty-one healthy participants were randomly allocated to have a social interaction with a pleasant virtual human (avatar) programmed to be highly responsive or not (high/low contingency). Perceived trustworthiness and trusting behavior were assessed alongside control variables attachment and anxiety. Higher paranoia and dismissive attachment were associated with larger interpersonal distances. Unexpectedly, extremely paranoid individuals experienced the highly contingent avatar as more trustworthy than their low contingency counterpart. Higher dismissive attachment was also associated with more subjective trust in both conditions. Extreme paranoia is associated with hypersensitivity to noncontingent behavior, which might explain experiences of mistrust when others are not highly responsive in everyday social situations.

  2. Reducing contingent self-worth: a defensive response to self-threats.

    Science.gov (United States)

    Buckingham, Justin; Lam, Tiffany A; Andrade, Fernanda C; Boring, Brandon L; Emery, Danielle

    2018-04-10

    Previous research shows that people with high self-esteem cope with threats to the self by reducing the extent to which their self-worth is contingent on the threatened domain (Buckingham, Weber, & Sypher, 2012). The present studies tested the hypothesis that this is a defensive process. In support of this hypothesis, Study 1 (N = 160) showed that self-affirmation attenuates the tendency for people with high self-esteem to reduce their contingencies of self-worth following self-threat. Furthermore, Study 2 (N = 286) showed that this tendency was more prevalent among people with defensive self-esteem than among those with secure self-esteem. The present studies imply that reducing contingent self-worth after self-threat is a defensive process. We discuss implications for theories of contingent self-worth.

  3. Classroom Research: Assessment of Student Understanding of Sampling Distributions of Means and the Central Limit Theorem in Post-Calculus Probability and Statistics Classes

    Science.gov (United States)

    Lunsford, M. Leigh; Rowell, Ginger Holmes; Goodson-Espy, Tracy

    2006-01-01

    We applied a classroom research model to investigate student understanding of sampling distributions of sample means and the Central Limit Theorem in post-calculus introductory probability and statistics courses. Using a quantitative assessment tool developed by previous researchers and a qualitative assessment tool developed by the authors, we…
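    The sampling-distribution concept the study assesses can be demonstrated with a short simulation (illustrative only, not the authors' assessment tool): means of repeated samples from a skewed population cluster around the population mean with spread close to σ/√n, as the Central Limit Theorem predicts.

    ```python
    import random
    import statistics

    random.seed(42)

    # Skewed population: exponential with mean 1 (and standard deviation 1).
    def sample_mean(n):
        return statistics.fmean(random.expovariate(1.0) for _ in range(n))

    # Simulate the sampling distribution of the mean for samples of size n = 30.
    means = [sample_mean(30) for _ in range(2000)]

    print(round(statistics.fmean(means), 2))  # close to the population mean 1.0
    print(round(statistics.stdev(means), 2))  # close to sigma/sqrt(n) = 1/sqrt(30) ≈ 0.18
    ```

    Plotting a histogram of `means` alongside the skewed parent distribution is the standard classroom demonstration of the theorem.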

  4. Effects of contingent self-esteem on depressive symptoms and suicidal behavior.

    Science.gov (United States)

    Lakey, Chad E; Hirsch, Jameson K; Nelson, Lyndsay A; Nsamenang, Sheri A

    2014-01-01

    Contingent self-esteem, or self-worth hinged upon successfully meeting standards or attaining goals, requires continual maintenance and validation. Despite the inherent instability that accompanies contingent self-esteem, relatively little is known about how it relates to markers of mental health. A sample of 371 college students completed measures of self-esteem, contingent self-esteem, suicidal behaviors, and depression. Individuals with fragile low self-esteem, described as highly contingent, reported greater depressive symptoms and suicidal behavior. Among those with secure high self-esteem, or high yet noncontingent, depression and suicide risk were markedly lower. Therapeutically promoting positive but noncontingent self-worth may reduce poor mental health outcomes.

  5. Step 1: Human System Integration (HSI) FY05 Pilot-Technology Interface Requirements for Contingency Management

    Science.gov (United States)

    2005-01-01

    This document involves definition of technology interface requirements for Contingency Management. This was performed through a review of Contingency Management-related, HSI requirements documents, standards, and recommended practices. Technology concepts in use by the Contingency Management Work Package were considered. Beginning with HSI high-level functional requirements for Contingency Management, and Contingency Management technology elements, HSI requirements for the interface to the pilot were identified. Results of the analysis describe (1) the information required by the pilot to have knowledge of system failures and associated contingency procedures, and (2) the control capability needed by the pilot to obtain system status and procedure information. Fundamentally, these requirements provide the candidate Contingency Management technology concepts with the necessary human-related elements to make them compatible with human capabilities and limitations. The results of the analysis describe how Contingency Management operations and functions should interface with the pilot to provide the necessary Contingency Management functionality to the UA-pilot system. Requirements and guidelines for Contingency Management are partitioned into four categories: (1) Health and Status and (2) Contingency Management. Each requirement is stated and is supported with a rationale and associated reference(s).

  6. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
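    The core calculation for a biased spinner is normalizing sector angles into probabilities; a minimal sketch (the sector angles below are made up):

    ```python
    # Hypothetical biased spinner: sector angle in degrees for each label.
    sectors = {"A": 180, "B": 90, "C": 60, "D": 30}

    total = sum(sectors.values())  # 360 degrees in a full turn
    probs = {label: angle / total for label, angle in sectors.items()}

    print(probs)  # A: 0.5, B: 0.25, C: 1/6, D: 1/12
    assert abs(sum(probs.values()) - 1.0) < 1e-12  # probabilities sum to 1
    ```

    The geometry-to-probability step (angle over 360°) is exactly the connection to other areas of mathematics the article highlights.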

  7. Contingency Cost estimation for Research reactor Decommissioning

    International Nuclear Information System (INIS)

    Jin, Hyung Gon; Hong, Yun Jeong

    2016-01-01

    There are many types of cost items in decommissioning cost estimation; contingencies, however, cover unforeseen elements of cost within the defined project scope, and regulatory bodies want them quantified on a reasonable basis. Many countries have adopted a breakdown into activity-dependent and period-dependent costs to structure their estimates. Period-dependent costs can be broken down into defined time frames to reduce overall uncertainties, and several countries apply this notion by using different contingency factors for different phases of the project. This study is a compilation of research reactor contingency costs for each country
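    The phase-specific contingency idea can be sketched numerically. The phases, base costs, and contingency factors below are invented for illustration and are not taken from any country's estimate.

    ```python
    # Hypothetical period-dependent base costs (M$) and per-phase contingency
    # factors for a research reactor decommissioning project.
    phases = {
        "planning":        (10.0, 0.10),
        "decontamination": (40.0, 0.25),
        "dismantling":     (80.0, 0.30),
        "waste_disposal":  (50.0, 0.20),
    }

    # Applying a different contingency factor to each project phase.
    contingency = {name: cost * f for name, (cost, f) in phases.items()}
    total_base = sum(cost for cost, _ in phases.values())
    total_contingency = sum(contingency.values())

    print(total_base)         # 180.0
    print(total_contingency)  # 1.0 + 10.0 + 24.0 + 10.0 = 45.0
    print(total_contingency / total_base)  # effective overall factor 0.25
    ```

    Breaking the estimate down this way lets riskier phases (here, dismantling) carry a larger share of the contingency than an undifferentiated flat percentage would.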

  8. Contingency Cost estimation for Research reactor Decommissioning

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Hyung Gon; Hong, Yun Jeong [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    There are many types of cost items in decommissioning cost estimation; contingencies, however, cover unforeseen elements of cost within the defined project scope, and regulatory bodies want them quantified on a reasonable basis. Many countries have adopted a breakdown into activity-dependent and period-dependent costs to structure their estimates. Period-dependent costs can be broken down into defined time frames to reduce overall uncertainties, and several countries apply this notion by using different contingency factors for different phases of the project. This study is a compilation of research reactor contingency costs for each country.

  9. Financial Management: DoD Process for Reporting Contingent Legal Liabilities

    National Research Council Canada - National Science Library

    Granetto, Paul J; Marsh, Patricia A; Peek, Marvin L; Brittingham, Scott S; Baidridge, Denise E; Egu, Charles O; Schenck, Kristy M; Adams, Carl L; Reiser, Cheri L

    2006-01-01

    ... contingent liabilities should read this report. It identifies areas where DoD and its Components have not fully complied with Federal financial accounting standards and are not consistent in computing and disclosing contingent legal liabilities...

  10. 10 CFR 72.186 - Change to physical security and safeguards contingency plans.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Change to physical security and safeguards contingency... contingency plans. (a) The licensee shall make no change that would decrease the safeguards effectiveness of... licensee safeguards contingency plan without prior approval of the Commission. A licensee desiring to make...

  11. Application of the IPEBS method to dynamic contingency analysis

    Energy Technology Data Exchange (ETDEWEB)

    Martins, A C.B. [FURNAS, Rio de Janeiro, RJ (Brazil); Pedroso, A S [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil)

    1994-12-31

    Dynamic contingency analysis is certainly a demanding task in the context of dynamic performance evaluation. This paper presents the results of a test checking the contingency screening capability of the IPEBS method. A Brazilian 1100-bus, 112-generator system was used in the test; the ranking of the contingencies based on critical clearing times obtained with IPEBS was compared with the ranking derived from detailed time-domain simulation. The results of this comparison encourage us to recommend the use of the method in industry applications, as a complement to the current method of time-domain simulation. (author) 5 refs., 1 fig., 2 tabs.

  12. Simple artificial neural networks that match probability and exploit and explore when confronting a multiarmed bandit.

    Science.gov (United States)

    Dawson, Michael R W; Dupuis, Brian; Spetch, Marcia L; Kelly, Debbie M

    2009-08-01

    The matching law (Herrnstein 1961) states that response rates become proportional to reinforcement rates; this is related to the empirical phenomenon called probability matching (Vulkan 2000). Here, we show that a simple artificial neural network generates responses consistent with probability matching. This behavior was then used to create an operant procedure for network learning. We use the multiarmed bandit (Gittins 1989), a classic problem of choice behavior, to illustrate that operant training balances exploiting the bandit arm expected to pay off most frequently with exploring other arms. Perceptrons provide a medium for relating results from neural networks, genetic algorithms, animal learning, contingency theory, reinforcement learning, and theories of choice.
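    Probability matching can be reproduced with a crude simulation. This sketch is not the authors' perceptron model: it is a simple agent, invented here, that tracks each bandit arm's payoff rate and responds in proportion to those estimates, which makes its choice proportions match the reinforcement proportions rather than maximize.

    ```python
    import random

    random.seed(1)

    payoff_p = [0.7, 0.3]  # true reward probability of each bandit arm
    est = [0.5, 0.5]       # running payoff-rate estimates
    alpha = 0.1            # learning rate for the running average

    choices = []
    for _ in range(20000):
        # Respond in proportion to estimated payoff rates (matching, not maximizing).
        p0 = est[0] / (est[0] + est[1])
        arm = 0 if random.random() < p0 else 1
        reward = 1.0 if random.random() < payoff_p[arm] else 0.0
        est[arm] += alpha * (reward - est[arm])  # exponential running average
        choices.append(arm)

    share0 = choices.count(0) / len(choices)
    print(round(share0, 2))  # near 0.7: choice proportion matches reinforcement proportion
    ```

    A maximizing agent would instead converge on choosing arm 0 almost exclusively; the gap between matching and maximizing is exactly the exploit/explore trade-off the paper discusses.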

  13. Students' Understanding of Conditional Probability on Entering University

    Science.gov (United States)

    Reaburn, Robyn

    2013-01-01

    An understanding of conditional probability is essential for students of inferential statistics as it is used in Null Hypothesis Tests. Conditional probability is also used in Bayes' theorem, in the interpretation of medical screening tests and in quality control procedures. This study examines the understanding of conditional probability of…
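    The medical-screening use of conditional probability mentioned above comes down to Bayes' theorem; here is a worked sketch with made-up sensitivity, specificity, and prevalence figures.

    ```python
    # Hypothetical screening test (all three numbers are assumptions).
    sensitivity = 0.95   # P(test+ | disease)
    specificity = 0.90   # P(test- | no disease)
    prevalence  = 0.01   # P(disease)

    # Total probability of a positive test, then Bayes' theorem for P(disease | test+).
    p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
    ppv = sensitivity * prevalence / p_pos

    print(round(ppv, 3))  # 0.088 -- even after a positive test, disease is unlikely
    ```

    The counterintuitively low positive predictive value at low prevalence is the classic stumbling block for students of conditional probability that the study refers to.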

  14. Zero-contingent entropy of quantum states of a Hydrogen atom

    International Nuclear Information System (INIS)

    Charvot, R.; Majernik, V.

    1996-01-01

    We calculated the zero-contingent entropy for the position of the electron in the H-atom as a function of its quantum numbers and compared it with the corresponding value of the Shannon entropy. The values of the zero-contingent entropy of the quantum states of the H-atom correlate well with the corresponding values of Shannon's entropy. This indicates that, besides the Shannon entropy, the zero-contingent entropy represents an appropriate, and mathematically rather simple, measure of the spreading out of the wave functions in the H-atom. (authors)

  15. 40 CFR 267.53 - Who must have copies of the contingency plan?

    Science.gov (United States)

    2010-07-01

    ... contingency plan? 267.53 Section 267.53 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED... UNDER A STANDARDIZED PERMIT Contingency Plan and Emergency Procedures § 267.53 Who must have copies of the contingency plan? (a) You must maintain a copy of the plan with all revisions at the facility; and...

  16. Contingency Contractor Optimization Phase 3 Sustainment Database Design Document - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Frazier, Christopher Rawls; Durfee, Justin David; Bandlow, Alisa; Gearhart, Jared Lee; Jones, Katherine A

    2016-05-01

    The Contingency Contractor Optimization Tool – Prototype (CCOT-P) database is used to store input and output data for the linear program model described in [1]. The database supports queries to retrieve these data, as well as updating and inserting new input data.

  17. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  18. Statistics for experimentalists

    CERN Document Server

    Cooper, B E

    2014-01-01

    Statistics for Experimentalists aims to provide experimental scientists with a working knowledge of statistical methods and search approaches to the analysis of data. The book first elaborates on probability and continuous probability distributions. Discussions focus on properties of continuous random variables and normal variables, independence of two random variables, central moments of a continuous distribution, prediction from a normal distribution, binomial probabilities, and multiplication of probabilities and independence. The text then examines estimation and tests of significance. Topics include estimators and estimates, expected values, minimum variance linear unbiased estimators, sufficient estimators, methods of maximum likelihood and least squares, and the test of significance method. The manuscript ponders on distribution-free tests, Poisson process and counting problems, correlation and function fitting, balanced incomplete randomized block designs and the analysis of covariance, and experiment...

  19. Probability and statistical correlation of the climatic parameters for estimating energy consumption of a building

    Directory of Open Access Journals (Sweden)

    Samarin Oleg Dmitrievich

    2014-01-01

    Full Text Available The accurate estimation of energy consumption by ventilation and air-conditioning systems in buildings is a high-priority task today, owing to diminishing energy and fuel sources and the revision of building standards in the Russian Federation. It is therefore important to find simple but sufficiently accurate correlations between the climatic parameters in the heating and cooling seasons of a year. The probabilistic and statistical relationships of the external climate parameters in the warm and cold seasons are considered. Climatic curves for the cold and warm seasons in Moscow, showing the most probable combinations of external air temperature and relative air humidity, are plotted using data from the Design Guidelines to the State Building Code "Building Climatology". The statistical relationship between enthalpy and external air temperature for the climatic conditions of Moscow is determined using these climatic curves and formulas connecting relative air humidity with other measures of air moisture. The mean value of the external air enthalpy for the heating season is calculated in order to simplify the determination of the full heat consumption of ventilation and air-conditioning systems, taking into account the real mean state of the external air. The field of application, accuracy, and standard deviation of the presented dependences are determined. The obtained model contains a single independent parameter, namely the external air temperature, and can therefore easily be used in engineering practice, especially during preliminary calculations.

  20. Visual perceptual learning by operant conditioning training follows rules of contingency

    Science.gov (United States)

    Kim, Dongho; Seitz, Aaron R; Watanabe, Takeo

    2015-01-01

    Visual perceptual learning (VPL) can occur as a result of a repetitive stimulus-reward pairing in the absence of any task. This suggests that rules that guide Conditioning, such as stimulus-reward contingency (e.g. that stimulus predicts the likelihood of reward), may also guide the formation of VPL. To address this question, we trained subjects with an operant conditioning task in which there were contingencies between the response to one of three orientations and the presence of reward. Results showed that VPL only occurred for positive contingencies, but not for neutral or negative contingencies. These results suggest that the formation of VPL is influenced by similar rules that guide the process of Conditioning. PMID:26028984

  1. Visual perceptual learning by operant conditioning training follows rules of contingency.

    Science.gov (United States)

    Kim, Dongho; Seitz, Aaron R; Watanabe, Takeo

    2015-01-01

    Visual perceptual learning (VPL) can occur as a result of a repetitive stimulus-reward pairing in the absence of any task. This suggests that rules that guide Conditioning, such as stimulus-reward contingency (e.g. that stimulus predicts the likelihood of reward), may also guide the formation of VPL. To address this question, we trained subjects with an operant conditioning task in which there were contingencies between the response to one of three orientations and the presence of reward. Results showed that VPL only occurred for positive contingencies, but not for neutral or negative contingencies. These results suggest that the formation of VPL is influenced by similar rules that guide the process of Conditioning.

  2. Preserved statistical learning of tonal and linguistic material in congenital amusia.

    Science.gov (United States)

    Omigie, Diana; Stewart, Lauren

    2011-01-01

    Congenital amusia is a lifelong disorder whereby individuals have pervasive difficulties in perceiving and producing music. In contrast, typical individuals display a sophisticated understanding of musical structure, even in the absence of musical training. Previous research has shown that they acquire this knowledge implicitly, through exposure to music's statistical regularities. The present study tested the hypothesis that congenital amusia may result from a failure to internalize statistical regularities - specifically, lower-order transitional probabilities. To explore the specificity of any potential deficits to the musical domain, learning was examined with both tonal and linguistic material. Participants were exposed to structured tonal and linguistic sequences and, in a subsequent test phase, were required to identify items which had been heard in the exposure phase, as distinct from foils comprising elements that had been present during exposure, but presented in a different temporal order. Amusic and control individuals showed comparable learning, for both tonal and linguistic material, even when the tonal stream included pitch intervals around one semitone. However analysis of binary confidence ratings revealed that amusic individuals have less confidence in their abilities and that their performance in learning tasks may not be contingent on explicit knowledge formation or level of awareness to the degree shown in typical individuals. The current findings suggest that the difficulties amusic individuals have with real-world music cannot be accounted for by an inability to internalize lower-order statistical regularities but may arise from other factors.
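    The lower-order transitional probabilities at issue can be computed directly from a sequence. A minimal sketch on a toy syllable stream (the stream itself is invented, not the study's stimuli):

    ```python
    from collections import Counter

    # Toy "language" stream built from the two-syllable words ba-bi, go-la, tu-pi.
    stream = "ba bi go la tu pi ba bi tu pi go la ba bi".split()

    pairs = Counter(zip(stream, stream[1:]))  # counts of adjacent syllable pairs
    firsts = Counter(stream[:-1])             # counts of each syllable as a pair's first element

    # Transitional probability P(next | current) = count(current, next) / count(current).
    tp = {(a, b): n / firsts[a] for (a, b), n in pairs.items()}

    print(tp[("ba", "bi")])  # 1.0: within-word transition, fully predictable
    print(tp[("bi", "go")])  # 0.5: word-boundary transition, less predictable
    ```

    High transitional probabilities within words and low ones at word boundaries are the statistical regularity that both typical and, per this study, amusic listeners internalize.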

  3. Preserved Statistical Learning of Tonal and Linguistic Material in Congenital Amusia

    Directory of Open Access Journals (Sweden)

    Diana eOmigie

    2011-06-01

    Full Text Available Congenital amusia is a lifelong disorder whereby individuals have pervasive difficulties in perceiving and producing music. In contrast, typical individuals display a sophisticated understanding of musical structure, even in the absence of musical training. Previous research has shown that they acquire this knowledge implicitly, through exposure to music’s statistical regularities. The present study tested the hypothesis that congenital amusia may result from a failure to internalize statistical regularities - specifically, lower-order transitional probabilities. To explore the specificity of any potential deficits to the musical domain, learning was examined with both tonal and linguistic material. Participants were exposed to structured tonal and linguistic sequences and, in a subsequent test phase, were required to identify items which had been heard in the exposure phase, as distinct from foils comprising elements that had been present during exposure, but presented in a different temporal order. Amusic and control individuals showed comparable learning, for both tonal and linguistic material, even when the tonal stream included pitch intervals around one semitone. However analysis of binary confidence ratings revealed that amusic individuals have less confidence in their abilities and that their performance in learning tasks may not be contingent on explicit knowledge formation or level of awareness to the degree shown in typical individuals. The current findings suggest that the difficulties amusic individuals have with real-world music cannot be accounted for by an inability to internalize lower-order statistical regularities but may arise from other factors.

  4. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  5. Comparing Value of Urban Green Space Using Contingent Valuation and Travel Cost Methods

    Science.gov (United States)

    Chintantya, Dea; Maryono

    2018-02-01

    Green urban open spaces are an important element of the city. They give multiple benefits for social life, human health, biodiversity, air quality, carbon sequestration, and water management. The Travel Cost Method (TCM) and the Contingent Valuation Method (CVM) are the methods most frequently used in studies that assess environmental goods and services in monetary terms for valuing urban green space. Both methods determine the value of urban green space through willingness to pay (WTP) for ecosystem benefits, with data collected through direct interviews and questionnaires. The findings of this study show the weaknesses and strengths of both methods for valuing urban green space and identify factors influencing the probability of a user's willingness to pay under each method.

  6. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  7. A comparison of analysis methods to estimate contingency strength.

    Science.gov (United States)

    Lloyd, Blair P; Staubitz, Johanna L; Tapp, Jon T

    2018-05-09

    To date, several data analysis methods have been used to estimate contingency strength, yet few studies have compared these methods directly. To compare the relative precision and sensitivity of four analysis methods (i.e., exhaustive event-based, nonexhaustive event-based, concurrent interval, concurrent+lag interval), we applied all methods to a simulated data set in which several response-dependent and response-independent schedules of reinforcement were programmed. We evaluated the degree to which contingency strength estimates produced from each method (a) corresponded with expected values for response-dependent schedules and (b) showed sensitivity to parametric manipulations of response-independent reinforcement. Results indicated both event-based methods produced contingency strength estimates that aligned with expected values for response-dependent schedules, but differed in sensitivity to response-independent reinforcement. The precision of interval-based methods varied by analysis method (concurrent vs. concurrent+lag) and schedule type (continuous vs. partial), and showed similar sensitivities to response-independent reinforcement. Recommendations and considerations for measuring contingencies are identified. © 2018 Society for the Experimental Analysis of Behavior.
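    An event-based contingency strength estimate of the kind the paper compares can be sketched as the difference between the conditional probability of reinforcement given a response and given no response (the operant contingency value). The 2x2 counts below are invented for illustration.

    ```python
    def contingency_strength(r_sr, r_no_sr, no_r_sr, no_r_no_sr):
        """Operant contingency value: P(Sr | R) - P(Sr | no R),
        from 2x2 counts of response (R) by reinforcer (Sr)."""
        p_sr_given_r = r_sr / (r_sr + r_no_sr)
        p_sr_given_no_r = no_r_sr / (no_r_sr + no_r_no_sr)
        return p_sr_given_r - p_sr_given_no_r

    # Hypothetical session: reinforcement follows most responses,
    # and rarely occurs without one.
    print(contingency_strength(40, 10, 5, 45))   # ≈ 0.8 - 0.1 = 0.7

    # A response-independent schedule: reinforcement equally likely either way.
    print(contingency_strength(20, 30, 20, 30))  # 0.0, no contingency
    ```

    Values near 0 under response-independent reinforcement and near the programmed dependency otherwise are the precision and sensitivity properties the paper evaluates across methods.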

  8. What constitutes vulnerable self-esteem? Comparing the prospective effects of low, unstable, and contingent self-esteem on depressive symptoms.

    Science.gov (United States)

    Sowislo, Julia Friederike; Orth, Ulrich; Meier, Laurenz L

    2014-11-01

    A growing body of longitudinal studies suggests that low self-esteem is a risk factor for depression. However, it is unclear whether other characteristics of self-esteem, besides its level, explain incremental or even greater variance in subsequent depression. We examined the prospective effects of self-esteem level, instability (i.e., the degree of variability in self-esteem across short periods), and contingency (i.e., the degree to which self-esteem fluctuates in response to self-relevant events) on depressive symptoms in 1 overarching model, using data from 2 longitudinal studies. In Study 1, 372 adults were assessed at 2 waves over 6 months, including 40 daily diary assessments at Wave 1. In Study 2, 235 young adults were assessed at 2 waves over 6 weeks, including about 6 daily diary assessments at each wave. Self-esteem contingency was measured by self-report and by a statistical index based on the diary data (capturing event-related fluctuations in self-esteem). In both studies self-esteem level, but not self-esteem contingency, predicted subsequent depressive symptoms. Self-esteem instability predicted subsequent depressive symptoms in Study 2 only, with a smaller effect size than self-esteem level. Also, level, instability, and contingency of self-esteem did not interact in the prediction of depressive symptoms. Moreover, the effect of self-esteem level held when controlling for neuroticism and for all other Big Five personality traits. Thus, the findings provide converging evidence for a vulnerability effect of self-esteem level, tentative evidence for a smaller vulnerability effect of self-esteem instability, and no evidence for a vulnerability effect of self-esteem contingency.

  9. APPROPRIATE ALLOCATION OF CONTINGENCY USING RISK ANALYSIS METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Andi Andi

    2004-01-01

    Full Text Available Many cost overruns in the world of construction are attributable to either unforeseen events or foreseen events for which uncertainty was not appropriately accommodated. It is argued that a significant improvement to project management performance may result from greater attention to the process of analyzing project risks. The objective of this paper is to propose a risk analysis methodology for appropriate allocation of contingency in project cost estimation. In the first step, project risks will be identified. Influence diagramming technique is employed to identify and to show how the risks affect the project cost elements and also the relationships among the risks themselves. The second step is to assess the project costs with regards to the risks under consideration. Using a linguistic approach, the degree of uncertainty of identified project risks is assessed and quantified. The problem of dependency between risks is taken into consideration during this analysis. For the final step, as the main purpose of this paper, a method for allocating appropriate contingency is presented. Two types of contingencies, i.e. project contingency and management reserve are proposed to accommodate the risks. An illustrative example is presented at the end to show the application of the methodology.
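
The paper itself uses influence diagrams and a linguistic (fuzzy) assessment; as a simpler stand-in, contingency sizing is often sketched with Monte Carlo simulation, taking contingency as the gap between a chosen percentile of simulated total cost and the base estimate. All cost elements and risk parameters below are invented.

```python
# Minimal Monte Carlo sketch of contingency allocation (an assumption: the
# paper uses a linguistic/fuzzy method, not simulation). Contingency is the
# P80 simulated total cost minus the deterministic base estimate.
import random

random.seed(1)
base_items = [100.0, 250.0, 80.0]  # hypothetical cost elements
risk_factors = [(0.10, 0.05), (0.05, 0.03), (0.20, 0.10)]  # (mean, sd) fractional overrun

def simulate_total():
    return sum(c * (1 + random.gauss(m, s))
               for c, (m, s) in zip(base_items, risk_factors))

totals = sorted(simulate_total() for _ in range(2000))
base = sum(base_items)
p80 = totals[int(0.80 * len(totals))]
contingency = p80 - base
print(f"base={base:.1f}  P80={p80:.1f}  contingency={contingency:.1f}")
```

A real application would also model the dependencies between risks, which is exactly the issue the influence-diagram step in the methodology addresses.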

  10. I spy with my little eye - the detection of intentional contingency in early psychosis.

    Science.gov (United States)

    Fett, Anne-Kathrin J; González Berdugo, Clara Isabel; Hanssen, Esther; Lemmers-Jansen, Imke; Shergill, Sukhi S; Krabbendam, Lydia

    2015-01-01

    Paranoid delusions have been associated with a tendency to over-attribute intentionality and contingency to others' actions and incidental events in individuals with chronic psychosis. However, this hyper-associative perception bias has not been investigated in the early illness stages of psychosis, during which it may play a particularly crucial role in the formation of symptoms. We used an experimental paradigm with 20 short film clips of simple animate and inanimate shapes that either moved in a contingent or non-contingent manner to investigate the perception of contingency in 38 adolescents with early psychosis and 93 healthy control adolescents. Participants rated the contingency between the shapes' movements on a scale from 0 to 10. The data were analysed with multilevel regression analyses to account for repeated measures within subjects. There were no significant differences between patients and controls; both perceived the contingency of the shapes' movements similarly across all conditions and patients' contingency perception was unrelated to their levels of paranoid delusions. Contingency perception was unimpaired in patients with early psychosis, suggesting that it might still be intact in the early illness stages. Future studies should set out to determine whether the early illness stages could offer a window for interventions that counteract the development of hyper-associative perceptions of contingency.

  11. Contingency Contractor Optimization Phase 3 Sustainment Third-Party Software List - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa

    2016-05-01

    The Contingency Contractor Optimization Tool - Prototype (CCOT-P) requires several third-party software packages. These are documented below for each of the CCOT-P elements: client, web server, database server, solver, web application and polling application.

  12. On the discretization of probability density functions and the ...

    Indian Academy of Sciences (India)

    important for most applications or theoretical problems of interest. In statistics ... In probability theory, statistics, statistical mechanics, communication theory, and other ... (1) by taking advantage of SMVT as a general mathematical approach.

  13. Flexibility to contingency changes distinguishes habitual and goal-directed strategies in humans.

    Science.gov (United States)

    Lee, Julie J; Keramati, Mehdi

    2017-09-01

    Decision-making in the real world presents the challenge of requiring flexible yet prompt behavior, a balance that has been characterized in terms of a trade-off between a slower, prospective goal-directed model-based (MB) strategy and a fast, retrospective habitual model-free (MF) strategy. Theory predicts that flexibility to changes in both reward values and transition contingencies can determine the relative influence of the two systems in reinforcement learning, but few studies have manipulated the latter. Therefore, we developed a novel two-level contingency change task in which transition contingencies between states change every few trials; MB and MF control predict different responses following these contingency changes, allowing their relative influence to be inferred. Additionally, we manipulated the rate of contingency changes in order to determine whether contingency change volatility would play a role in shifting subjects between a MB and MF strategy. We found that human subjects employed a hybrid MB/MF strategy on the task, corroborating the parallel contribution of MB and MF systems in reinforcement learning. Further, subjects did not remain at one level of MB/MF behaviour but rather displayed a shift towards more MB behavior over the first two blocks that was not attributable to the rate of contingency changes but rather to the extent of training. We demonstrate that flexibility to contingency changes can distinguish MB and MF strategies, with human subjects utilizing a hybrid strategy that shifts towards more MB behavior over blocks, consequently corresponding to a higher payoff.

  14. Role of contingency in striatal response to incentive in adolescents with anxiety.

    Science.gov (United States)

    Benson, Brenda E; Guyer, Amanda E; Nelson, Eric E; Pine, Daniel S; Ernst, Monique

    2015-03-01

    This study examines the effect of contingency on reward function in anxiety. We define contingency as the aspect of a situation in which the outcome is determined by one's action-that is, when there is a direct link between one's action and the outcome of the action. Past findings in adolescents with anxiety or at risk for anxiety have revealed hypersensitive behavioral and neural responses to higher value rewards with correct performance. This hypersensitivity to highly valued (salient) actions suggests that the value of actions is determined not only by outcome magnitude, but also by the degree to which the outcome is contingent on correct performance. Thus, contingency and incentive value might each modulate reward responses in unique ways in anxiety. Using fMRI with a monetary reward task, striatal response to cue anticipation is compared in 18 clinically anxious and 20 healthy adolescents. This task manipulates orthogonally reward contingency and incentive value. Findings suggest that contingency modulates the neural response to incentive magnitude differently in the two groups. Specifically, during the contingent condition, right-striatal response tracks incentive value in anxious, but not healthy, adolescents. During the noncontingent condition, striatal response is bilaterally stronger to low than to high incentive in anxious adolescents, while healthy adolescents exhibit the expected opposite pattern. Both contingency and reward magnitude differentiate striatal activation in anxious versus healthy adolescents. These findings may reflect exaggerated concern about performance and/or alterations of striatal coding of reward value in anxious adolescents. Abnormalities in reward function in anxiety may have treatment implications.

  15. 5-HT modulation by acute tryptophan depletion of human instrumental contingency judgements.

    Science.gov (United States)

    Chase, Henry W; Crockett, Molly J; Msetfi, Rachel M; Murphy, Robin A; Clark, Luke; Sahakian, Barbara J; Robbins, Trevor W

    2011-02-01

    The concept of 'depressive realism', that depression leads to more accurate perception of causal control, has been influential in the field of depression research, but remains controversial. Recent work testing contingency learning has suggested that contextual processing might determine realism-like effects. Serotonin (5-hydroxytryptamine, (5-HT)), which is implicated in the pathophysiology of depression, might also influence contextual processing. Using acute tryptophan depletion (ATD), we tested the hypothesis that dysfunctional serotoninergic neurotransmission influences contingency judgements in dysphoric subjects via an effect on contextual processing. We employed a novel contingency learning task to obtain separate measures (ratings) of the causal effect of participants' responses and efficacy of the background context over an outcome. Participants, without a history of depression, completed this task on and off ATD in a double-blind, placebo-controlled, within-subjects design. As with other work on contingency learning, the effects of ATD were related to baseline mood levels. Although no overall effects of ATD were observed, the subgroup of participants with low Beck depression inventory (BDI) scores showed reduced ratings of contextual control and improved accuracy of contingency judgements under positive contingencies following ATD, compared to placebo. High BDI participants demonstrated low accuracy in contingency judgements, regardless of serotoninergic status. No effect of ATD on contingency judgements was observed in the group as a whole, but effects were observed in a subgroup of participants with low BDI scores. We discuss these data in light of the context processing hypothesis, and prior research on 5-HT and depressive realism.

  16. Flexibility to contingency changes distinguishes habitual and goal-directed strategies in humans.

    Directory of Open Access Journals (Sweden)

    Julie J Lee

    2017-09-01

    Full Text Available Decision-making in the real world presents the challenge of requiring flexible yet prompt behavior, a balance that has been characterized in terms of a trade-off between a slower, prospective goal-directed model-based (MB strategy and a fast, retrospective habitual model-free (MF strategy. Theory predicts that flexibility to changes in both reward values and transition contingencies can determine the relative influence of the two systems in reinforcement learning, but few studies have manipulated the latter. Therefore, we developed a novel two-level contingency change task in which transition contingencies between states change every few trials; MB and MF control predict different responses following these contingency changes, allowing their relative influence to be inferred. Additionally, we manipulated the rate of contingency changes in order to determine whether contingency change volatility would play a role in shifting subjects between a MB and MF strategy. We found that human subjects employed a hybrid MB/MF strategy on the task, corroborating the parallel contribution of MB and MF systems in reinforcement learning. Further, subjects did not remain at one level of MB/MF behaviour but rather displayed a shift towards more MB behavior over the first two blocks that was not attributable to the rate of contingency changes but rather to the extent of training. We demonstrate that flexibility to contingency changes can distinguish MB and MF strategies, with human subjects utilizing a hybrid strategy that shifts towards more MB behavior over blocks, consequently corresponding to a higher payoff.

  17. Ensuring the Reliable Operation of the Power Grid: State-Based and Distributed Approaches to Scheduling Energy and Contingency Reserves

    Science.gov (United States)

    Prada, Jose Fernando

    Keeping a contingency reserve in power systems is necessary to preserve the security of real-time operations. This work studies two different approaches to the optimal allocation of energy and reserves in the day-ahead generation scheduling process. Part I presents a stochastic security-constrained unit commitment model to co-optimize energy and the locational reserves required to respond to a set of uncertain generation contingencies, using a novel state-based formulation. The model is applied in an offer-based electricity market to allocate contingency reserves throughout the power grid, in order to comply with the N-1 security criterion under transmission congestion. The objective is to minimize expected dispatch and reserve costs, together with post contingency corrective redispatch costs, modeling the probability of generation failure and associated post contingency states. The characteristics of the scheduling problem are exploited to formulate a computationally efficient method, consistent with established operational practices. We simulated the distribution of locational contingency reserves on the IEEE RTS96 system and compared the results with the conventional deterministic method. We found that assigning locational spinning reserves can guarantee an N-1 secure dispatch accounting for transmission congestion at a reasonable extra cost. The simulations also showed little value of allocating downward reserves but sizable operating savings from co-optimizing locational nonspinning reserves. Overall, the results indicate the computational tractability of the proposed method. Part II presents a distributed generation scheduling model to optimally allocate energy and spinning reserves among competing generators in a day-ahead market. The model is based on the coordination between individual generators and a market entity. The proposed method uses forecasting, augmented pricing and locational signals to induce efficient commitment of generators based on firm

  18. Roentgenologic characteristic of 7 group contingent of dispensary registration

    International Nuclear Information System (INIS)

    Derzhavin, V.I.; Nalivajko, N.N.; Kozlova, L.N.; Petrik, R.N.

    1984-01-01

    9694 persons in dispensary registration group 7 were examined. Roentgenologic study of post-tuberculous changes in this group showed that processes of secondary genesis prevail in subgroup 7-A (79.4%), whereas processes of primary genesis prevail in subgroup 7-B (55.8%). Consequences of secondary tuberculosis recur most frequently

  19. Analisa Kemampuan Saluran Berdasarkan Metode Contingency N-1 Analysis

    OpenAIRE

    Syukriyadin,; Susanti, Rahmi

    2010-01-01

    The transmission system plays a very important role in the power delivery process, so the protection of transmission lines requires serious attention at the planning stage. Line capability analysis is an application for studying system stability. In this study, line capability analysis uses the contingency N-1 analysis method. Contingency N-1 analysis is a program for evaluating the various conditions that may occur in the system in the ...

  20. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  1. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    Full Text Available The purpose of this article is to give a new approach in calculating the probability of returning a loan. A lot of factors affect the value of the probability. In this article by using statistical and econometric models some influencing factors are proved. The main approach is concerned with applying probit and logit models in loan management institutions. A new aspect of the credit risk analysis is given. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum and data fields concerning the borrower of the loan (month of birth, year of birth (age, gender, region, where he/she lives may be independent variables in a binary logistics model with a dependent variable “the probability of returning a loan”. It is proved that the month of signing a contract, the year of signing a contract, the gender and the age of the loan owner do not affect the probability of returning a loan. It is proved that the probability of returning a loan depends on the sum of contract, the remoteness of the loan owner and the month of birth. The probability of returning a loan increases with the increase of the given sum, decreases with the proximity of the customer, increases for people born in the beginning of the year and decreases for people born at the end of the year.
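
The logit model described above maps a linear combination of predictors to a probability via the logistic function. The sketch below uses invented coefficients whose signs follow the abstract's findings (probability rises with the loan sum and the borrower's remoteness, and falls for later birth months); the article's own estimates would differ.

```python
# Hedged sketch of scoring a loan with a fitted binary logit model.
# Coefficients and predictor values are hypothetical.
import math

def logit_probability(x, coefs, intercept):
    z = intercept + sum(b * v for b, v in zip(coefs, x))
    return 1.0 / (1.0 + math.exp(-z))

# Predictors: loan sum (thousands), distance to branch (km), birth month (1-12).
coefs, intercept = [0.002, 0.01, -0.05], -0.5
p_small = logit_probability([5.0, 12.0, 3.0], coefs, intercept)
p_large = logit_probability([50.0, 12.0, 3.0], coefs, intercept)
print(f"p(small loan)={p_small:.3f}  p(large loan)={p_large:.3f}")
```

With these signs, a larger loan sum produces a higher predicted probability of repayment, mirroring the reported direction of the effect.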

  2. Extended child and caregiver benefits of behavior-based child contingency learning games.

    Science.gov (United States)

    Dunst, Carl J; Raab, Melinda; Trivette, Carol M; Wilson, Linda L; Hamby, Deborah W; Parkey, Cindy

    2010-08-01

    Findings from 2 studies of the relationship between response-contingent child behavior and child, caregiver-child, and caregiver behavior not directly associated with child contingency learning are described. The participants were 19 children with significant developmental delays and their mothers in 1 study and 22 children with significant developmental delays and their teachers in the second study. Caregivers engaged the children in learning games characterized by behavior-based contingencies for 15 weeks. Research staff observed the children and their caregivers in everyday routines and activities and rated child and caregiver behavior while the children and caregivers were not playing the games. Results from both studies showed that the degree of response-contingent responding during the games was related to child and caregiver behavior, not the focus of the contingency learning opportunities afforded the children. Implications for practice are described.

  3. Fast and precise method of contingency ranking in modern power system

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain; Chen, Zhe; Thøgersen, Paul

    2011-01-01

    Contingency analysis is one of the most important aspects of power system security analysis. This paper presents a fast and precise method of contingency ranking for effective power system security analysis. The method proposed in this research work takes due consideration of both apparent power o... is based on a realistic approach taking practical situations into account. Besides taking real situations into consideration, the proposed method is fast enough to be considered for on-line security analysis.

  4. 48 CFR 603.405 - Misrepresentations or violations of the Covenant Against Contingent Fees.

    Science.gov (United States)

    2010-10-01

    48 Federal Acquisition Regulations System 4, 2010-10-01. Misrepresentations or violations of the Covenant Against Contingent Fees, Section 603.405. (a) The...

  5. Contingent self-worth moderates the relationship between school stressors and psychological stress responses.

    Science.gov (United States)

    Ishizu, Kenichiro

    2017-04-01

    This study examined the moderating role of contingent self-worth on the relationships between school stressors and psychological stress responses among Japanese adolescents. A total of 371 Japanese junior high school students (184 boys and 187 girls, M age  = 12.79 years, SD = 0.71) completed the Japanese version of the Self-Worth Contingency Questionnaire and a mental health checklist at two points separated by a two-month interval. Hierarchical multiple regression analyses were then used to determine whether contingent self-worth moderated the relationship between school stressors and psychological stress responses. The results indicated that, when psychological stress responses were controlled for at Time 1, contingent self-worth did not predict the psychological stress responses at Time 2. However, a two-way interaction between contingent self-worth and stressors was found to significantly influence psychological stress responses, thus indicating that stressors had a stronger impact on psychological stress responses among those with high contingent self-worth compared to those with low contingent self-worth. Copyright © 2017 The Foundation for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.
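
The moderation test described above amounts to a regression with a stressor-by-CSW product term, where a significant interaction means the simple slope of stressors differs across CSW levels. The coefficients below are invented purely to illustrate the structure of the model.

```python
# Illustrative sketch of the two-way interaction (moderation) model:
# stress_response = b0 + b1*stressors + b2*CSW + b3*(stressors*CSW).
# All coefficients are hypothetical.
b0, b_stress, b_csw, b_inter = 1.0, 0.30, 0.20, 0.25

def predicted(stress, csw):
    return b0 + b_stress * stress + b_csw * csw + b_inter * stress * csw

# Simple slope of stressors at low (-1 SD) vs. high (+1 SD) contingent self-worth:
slope_low = predicted(1, -1) - predicted(0, -1)
slope_high = predicted(1, 1) - predicted(0, 1)
print(slope_low, slope_high)  # stressors hit harder when CSW is high
```

A positive interaction coefficient reproduces the reported pattern: stressors have a stronger impact on stress responses for adolescents with high contingent self-worth.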

  6. National Contingency Plan Subpart J

    Science.gov (United States)

    Subpart J of the National Oil and Hazardous Substances Pollution Contingency Plan (NCP) directs EPA to prepare a schedule of dispersants, other chemicals, and oil spill mitigating devices and substances that may be used to remove or control oil discharges.

  7. Power plant construction lead times: The value of contingency planning

    International Nuclear Information System (INIS)

    Rubin, L.J.

    1985-01-01

    In this paper an analysis of two different approaches to the construction of a major (nuclear) power plant is presented. The analysis compares an accelerated, "go-for-broke" strategy, which carries some risk of delay, with a more deliberate contingency construction schedule in terms of revenue requirements and costs of electricity. It is demonstrated that under a wide variety of circumstances there are important advantages to the contingency strategy, but that the magnitude of those advantages is sensitive to the character of the power system being examined and to the flexibility of the contingency approach.

  8. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more; these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science, which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  9. Public education programme for nuclear contingency planning in Hong Kong

    International Nuclear Information System (INIS)

    Wong, M. C.; Li, S. W.

    2002-01-01

    Two nuclear power stations on the coast of southern China are situated some 50 kilometers to the northeast of Hong Kong. Although the stations are far away from Hong Kong, the construction and operation of the nuclear power stations have generated public anxiety locally, in particular, after the Chernobyl accident in 1986. A comprehensive contingency plan which takes into account such concerns of the public has been implemented in Hong Kong. Cooperation by the public is vital to the effective implementation of any contingency plan. Understanding of the basics of radiation protection as well as the contingency plan will help the public to appreciate the situation and react in a rational manner. A public education program to promote awareness of the contingency plan has been implemented in Hong Kong. In particular, a Virtual Exhibition Hall on radiation has been developed and launched in February 2002 for access by the public via Internet. A video and a set of web pages will be launched in the later part of 2002 to inform and educate the public on matters related to nuclear accident response in Hong Kong. This paper describes the public education programme in Hong Kong to promote public awareness and understanding of the nuclear contingency plan

  10. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  11. Introductory statistical inference

    CERN Document Server

    Mukhopadhyay, Nitis

    2014-01-01

    This gracefully organized text reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, figures, tables, and computer simulations to develop and illustrate concepts. Drills and boxed summaries emphasize and reinforce important ideas and special techniques.Beginning with a review of the basic concepts and methods in probability theory, moments, and moment generating functions, the author moves to more intricate topics. Introductory Statistical Inference studies multivariate random variables, exponential families of dist

  12. Applied Problems and Use of Technology in an Aligned Way in Basic Courses in Probability and Statistics for Engineering Students--A Way to Enhance Understanding and Increase Motivation

    Science.gov (United States)

    Zetterqvist, Lena

    2017-01-01

    Researchers and teachers often recommend motivating exercises and use of mathematics or statistics software for the teaching of basic courses in probability and statistics. Our courses are given to large groups of engineering students at Lund Institute of Technology. We found that the mere existence of real-life data and technology in a course…

  13. Relationship between the generalized equivalent uniform dose formulation and the Poisson statistics-based tumor control probability model

    International Nuclear Information System (INIS)

    Zhou Sumin; Das, Shiva; Wang Zhiheng; Marks, Lawrence B.

    2004-01-01

    The generalized equivalent uniform dose (GEUD) model uses a power-law formalism, where the outcome is related to the dose via a power law. We herein investigate the mathematical compatibility between this GEUD model and the Poisson statistics based tumor control probability (TCP) model. The GEUD and TCP formulations are combined and subjected to a compatibility constraint equation. This compatibility constraint equates tumor control probability from the original heterogeneous target dose distribution to that from the homogeneous dose from the GEUD formalism. It is shown that this constraint equation possesses a unique, analytical closed-form solution which relates radiation dose to the tumor cell survival fraction. It is further demonstrated that, when there is no positive threshold or finite critical dose in the tumor response to radiation, this relationship is not bounded within the realistic cell survival limits of 0%-100%. Thus, the GEUD and TCP formalisms are, in general, mathematically inconsistent. However, when a threshold dose or finite critical dose exists in the tumor response to radiation, there is a unique mathematical solution for the tumor cell survival fraction that allows the GEUD and TCP formalisms to coexist, provided that all portions of the tumor are confined within certain specific dose ranges
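
The two formalisms being compared can be made concrete numerically: GEUD is a power-law average of the dose distribution, and the Poisson TCP model maps a dose to a control probability through the surviving cell number. The dose-volume histogram, cell number N0, and radiosensitivity alpha below are generic placeholders, not values from the paper.

```python
# Numeric sketch (hypothetical parameters) of the GEUD and Poisson-TCP models:
#   GEUD = (sum_i v_i * D_i^a)^(1/a)
#   TCP  = exp(-N0 * exp(-alpha * D))
import math

def geud(doses, volumes, a):
    return sum(v * d**a for d, v in zip(doses, volumes)) ** (1.0 / a)

def tcp_poisson(dose, n0=1e7, alpha=0.3):
    return math.exp(-n0 * math.exp(-alpha * dose))

doses, volumes = [58.0, 60.0, 62.0], [0.2, 0.6, 0.2]  # hypothetical tumor DVH
eud = geud(doses, volumes, a=-10)  # a < 0 weights cold spots, as for tumors
tcp = tcp_poisson(eud)
print(f"GEUD = {eud:.2f} Gy, TCP(GEUD) = {tcp:.3f}")
```

The paper's compatibility constraint requires that the TCP computed from the heterogeneous distribution equal the TCP of this single GEUD value; the exponential survival model used here is one specific choice, and the inconsistency result concerns the general case.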

  14. Collectivists' contingency and autonomy as predictors of buffet preferences among Taiwanese adolescents.

    Science.gov (United States)

    Chiou, Wen-Bin

    2006-01-01

    In a culture or society with high collectivism, contingent orientation and constrained autonomy are the prominent characteristics of adolescents' self-construal. This article examined whether Taiwanese adolescents' contingency and autonomy were associated with their prevalent preferences for buffet consumption. Findings in a panel survey indicated that contingency was positively correlated with adolescents' buffet preference, whereas autonomy was negatively correlated. Moreover, the results showed that adolescents' contingent orientation and perceived autonomy could predict their subsequent buffet preference over a half-year period. A laboratory experiment showed that adolescents who perceived lower autonomy exhibited greater preferences for buffet over the other diet consumption. In general, the results suggest that collectivist adolescents' contingency and autonomy were related to their trait-like preferences for buffet, and the state-like preferences for buffet were affected by their perceived levels of autonomy. Findings provide further insights into the impact of adolescents' self-construal on their diet consumption.

  15. Tests of the power PC theory of causal induction with negative contingencies.

    Science.gov (United States)

    Shanks, David R

    2002-01-01

    The power PC theory of causal induction (Cheng, 1997) proposes that causal estimates are based on the power p of a potential cause, where p is the contingency between the cause and effect normalized by the base rate of the effect. Previous tests of this theory have concentrated on generative causes that have positive contingencies with their associated outcomes. Here we empirically test this theory in two experiments using preventive causes that have negative contingencies for their outcomes. Contrary to the power PC theory, the results show that causal judgments vary with contingency across conditions of constant power p. This pattern is consistent, however, with several alternative accounts of causal judgment.
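
For a preventive cause, the power PC quantities reduce to two lines: delta-P is the raw contingency, and preventive power normalizes its negation by the base rate of the effect in the cause's absence. The numbers below are illustrative, but they reproduce the experimental logic: two conditions with equal power p but different contingencies.

```python
# Sketch of power PC quantities for a preventive cause (after Cheng, 1997):
#   delta-P = P(e|c) - P(e|~c);  preventive power = -delta-P / P(e|~c).
def delta_p(p_e_c, p_e_notc):
    return p_e_c - p_e_notc

def preventive_power(p_e_c, p_e_notc):
    return -delta_p(p_e_c, p_e_notc) / p_e_notc

# Equal power, different contingencies (illustrative probabilities):
power_a = preventive_power(0.25, 0.50)  # delta-P = -0.25
power_b = preventive_power(0.50, 1.00)  # delta-P = -0.50
print(power_a, power_b)  # both 0.5
```

The power PC theory predicts identical judgments across such conditions; the finding that judgments instead track delta-P is the evidence against it reported here.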

  16. An analysis of contingency statements in a DRO procedure: A case report.

    Science.gov (United States)

    Gerow, Stephanie; Rispoli, Mandy; Boles, Margot B; Neely, Leslie C

    2015-06-01

    The aim was to examine latency to criterion for the reduction of challenging behaviour with and without a contingency statement delivered immediately prior to a DRO procedure. An ABAC design was used, in which A was baseline, B evaluated the efficacy of a DRO procedure alone, and C evaluated the efficacy of a DRO procedure with a contingency statement. The DRO intervention with the contingency statement was associated with a shorter latency to behaviour change than the DRO procedure without it. These preliminary findings from this case study highlight the importance of examining the efficiency of behaviour-change procedures. Directions for future research are provided.

  17. Overdispersion in nuclear statistics

    International Nuclear Information System (INIS)

    Semkow, Thomas M.

    1999-01-01

    Modern statistical distribution theory is applied to the development of overdispersion theory in ionizing-radiation statistics for the first time. The physical nuclear system is treated as a sequence of binomial processes, each depending on a characteristic probability, such as the probability of decay or detection. The probabilities fluctuate in the course of a measurement, and the physical reasons for this are discussed. If the average values of the probabilities change from measurement to measurement, as in the random Lexis binomial sampling scheme, the resulting distribution is overdispersed. The generating functions and probability distribution functions are derived, followed by a moment analysis. The Poisson and Gaussian limits are also given. The distribution functions belong to the family of generalized hypergeometric factorial-moment distributions of Kemp and Kemp, and can serve as likelihood functions for statistical estimation. An application to radioactive decay with detection is described and working formulae are given, including a procedure for testing counting data for overdispersion. More complex experiments in nuclear physics (such as solar-neutrino experiments) can be handled by this model, as can the problem of distinguishing between source and background.
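    A standard first check for overdispersion in counting data, in the spirit of the testing procedure described (the paper's own working formulae are not reproduced here), is the index of dispersion: under a Poisson model the variance equals the mean, so the sample ratio s²/x̄ should sit near 1, and values well above 1 signal overdispersion. The count series below are invented for illustration.

```python
# Index-of-dispersion check for Poisson-distributed counts.
# Under a Poisson model, variance ~= mean, so s^2/xbar should be close to 1;
# values well above 1 indicate overdispersion. Counts below are hypothetical.

def dispersion_index(counts):
    n = len(counts)
    xbar = sum(counts) / n
    s2 = sum((x - xbar) ** 2 for x in counts) / (n - 1)  # sample variance
    return s2 / xbar

poisson_like = [88, 112, 95, 107, 90, 110, 103, 98]    # spread ~ sqrt(mean)
overdispersed = [60, 140, 85, 120, 70, 135, 90, 130]   # spread >> sqrt(mean)

print(dispersion_index(poisson_like))   # close to 1
print(dispersion_index(overdispersed))  # well above 1
```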

  18. Contingency Teaching during Close Reading

    Science.gov (United States)

    Fisher, Douglas; Frey, Nancy

    2015-01-01

    Twelve teachers were interviewed and observed as they engaged students in close reading. We analyzed their responses and instruction to determine the scaffolds they used, as well as the contingency teaching plans they implemented when students were unable to understand the text.

  19. A Profile of Contingent Workers.

    Science.gov (United States)

    Polivka, Anne E.

    1996-01-01

    Based on data from the supplement to the February 1995 Current Population Survey, contingent workers were more likely to be female, black, young, enrolled in school, and employed in services and construction industries than were noncontingent workers. More than 10% were teachers. (Author)

  20. Scan Statistics

    CERN Document Server

    Glaz, Joseph

    2009-01-01

    Suitable for graduate students and researchers in applied probability and statistics, as well as for scientists in biology, computer science, pharmaceutical science and medicine, this title brings together a collection of chapters illustrating the depth and diversity of theory, methods and applications in the area of scan statistics.

  1. Radiological Contingency Planning for the Mars Science Laboratory Launch

    Energy Technology Data Exchange (ETDEWEB)

    Paul P. Guss

    2008-04-01

    This paper describes the contingency planning for the launch of the Mars Science Laboratory scheduled for the 21-day window beginning on September 15, 2009. National Security Technologies, LLC (NSTec), based in Las Vegas, Nevada, will support the U.S. Department of Energy (DOE) in its role of managing the overall radiological contingency planning support effort. This paper will focus on new technologies that NSTec's Remote Sensing Laboratory (RSL) is developing to enhance the overall response capability that would be required for a highly unlikely anomaly. This paper presents recent advances in collecting and collating data transmitted from deployed teams and sensors. RSL is responsible for contingency planning across a range of areas: monitoring and assessment, sample collection and control, contaminated-material release criteria, data management, reporting, recording, and communications. The tools RSL has available to support these efforts will be reported. The data platform RSL will provide will also be compatible with the integration of assets and field data acquired by other DOE, National Aeronautics and Space Administration (NASA), state, and local resources, personnel, and equipment. This paper also outlines the organizational structure for response elements in radiological contingency planning.

  2. Social Sensorimotor Contingencies

    OpenAIRE

    Bütepage, Judith

    2016-01-01

    As the field of robotics advances, more robots are employed in our everyday environment. Thus, the implementation of robots that can actively engage in physical collaboration and naturally interact with humans is of high importance. In order to achieve this goal, it is necessary to study human interaction and social cognition and how these aspects can be implemented in robotic agents. The theory of social sensorimotor contingencies hypothesises that many aspects of human-human interaction de...

  3. Associationism and cognition: human contingency learning at 25.

    Science.gov (United States)

    Shanks, David R

    2007-03-01

    A major topic within human learning, the field of contingency judgement, began to emerge about 25 years ago following publication of an article on depressive realism by Alloy and Abramson (1979). Subsequently, associationism has been the dominant theoretical framework for understanding contingency learning but this has been challenged in recent years by an alternative cognitive or inferential approach. This article outlines the key conceptual differences between these approaches and summarizes some of the main methods that have been employed to distinguish between them.

  4. [Making up tuberculosis risk groups from decreed contingents].

    Science.gov (United States)

    Kucherov, A L; Il'icheva, E Iu

    2001-01-01

    The paper provides materials for forming tuberculosis risk groups from decreed contingents, using a database developed and introduced in the Novomoskovsk district, as well as a programme for rapid determination of tuberculosis risk. This procedure reduces the scope of fluorographic surveys among the decreed contingents, as well as their cost, by 60%. Moreover, it may inform occupational decisions in the employment of decreed persons, which may promote a decrease in the incidence of tuberculosis among them.

  5. The effective application of contingency theory in health settings: problems and recommended solutions.

    Science.gov (United States)

    Strasser, S

    1983-01-01

    Contingency theory as a managerial perspective is conceptually elegant, but it may cause a number of unforeseen problems when applied in real work settings. Health care administrators can avoid many of these problems by using a hybrid contingency theory framework that blends the manager's own perceptions and experience with established contingency models.

  6. Breakdown concepts for contingency tables

    NARCIS (Netherlands)

    Kuhnt, S.

    2010-01-01

    Loglinear Poisson models are commonly used to analyse contingency tables. So far, robustness of parameter estimators as well as outlier detection have rarely been treated in this context. We start with finite-sample breakdown points, showing that the breakdown point of mean value estimators…
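    As a generic illustration of the loglinear setting the abstract refers to (a sketch, not the paper's breakdown-point analysis), one can fit the independence model to a contingency table and screen cells via Pearson residuals: fitted counts are m_ij = (row_i × col_j) / N, and large |(n_ij − m_ij)/√m_ij| flags candidate outlier cells. The table and its inflated cell below are invented.

```python
# Independence-model fit and Pearson residuals for a hypothetical 2x3 table.
# Cells with large absolute residuals are candidate outliers.
import math

table = [[20, 10, 5],
         [10, 60, 5]]   # cell (0,0) is deliberately inflated

rows = [sum(r) for r in table]             # row margins
cols = [sum(c) for c in zip(*table)]       # column margins
N = sum(rows)

fitted = [[ri * cj / N for cj in cols] for ri in rows]
resid = [[(table[i][j] - fitted[i][j]) / math.sqrt(fitted[i][j])
          for j in range(len(cols))] for i in range(len(rows))]

for row in resid:
    print(["%.2f" % r for r in row])   # cell (0,0) stands out
```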

  7. Job satisfaction and contingent employment

    NARCIS (Netherlands)

    de Graaf-Zijl, M.

    2012-01-01

    This paper analyses job satisfaction as an aggregate of satisfaction with several job aspects, with special focus on the influence of contingent-employment contracts. Fixed-effect analysis is applied on a longitudinal sample of Dutch employees in four work arrangements: regular, fixed-term, on-call

  8. Quantum mechanics from classical statistics

    International Nuclear Information System (INIS)

    Wetterich, C.

    2010-01-01

    Quantum mechanics can emerge from classical statistics. A typical quantum system describes an isolated subsystem of a classical statistical ensemble with infinitely many classical states. The state of this subsystem can be characterized by only a few probabilistic observables. Their expectation values define a density matrix if they obey a 'purity constraint'. Then all the usual laws of quantum mechanics follow, including Heisenberg's uncertainty relation, entanglement and a violation of Bell's inequalities. No concepts beyond classical statistics are needed for quantum physics - the differences are only apparent and result from the particularities of those classical statistical systems which admit a quantum mechanical description. Born's rule for quantum mechanical probabilities follows from the probability concept for a classical statistical ensemble. In particular, we show how the non-commuting properties of quantum operators are associated to the use of conditional probabilities within the classical system, and how a unitary time evolution reflects the isolation of the subsystem. As an illustration, we discuss a classical statistical implementation of a quantum computer.

  9. Baseline Response Levels Are a Nuisance in Infant Contingency Learning

    Science.gov (United States)

    Millar, W. S.; Weir, Catherine

    2015-01-01

    The impact of differences in level of baseline responding on contingency learning in the first year was examined by considering the response acquisition of infants classified into baseline response quartiles. Whereas the three lower baseline groups showed the predicted increment in responding to a contingency, the highest baseline responders did…

  10. Body Image Concerns and Contingent Self-Esteem in Male and Female College Students.

    Science.gov (United States)

    Grossbard, Joel R; Lee, Christine M; Neighbors, Clayton; Larimer, Mary E

    2009-02-01

    Body dissatisfaction in females, and to a lesser extent males, is associated with low self-esteem, depression, and eating disorders. This research examined gender as a moderator of the association between contingent self-esteem and body image concerns, including weight and muscularity. Participants included 359 heavy-drinking, first-year U.S. undergraduate students (59.1% female) who completed a survey assessing health-related risk behaviors. Hierarchical multiple regression was used to examine relations among gender, contingent self-esteem, and body image. Females reported higher levels of contingent self-esteem and greater concerns about their weight, although males reported a greater drive for muscularity. The relationship between contingent self-esteem and weight concerns was stronger among females; for males, greater contingent self-esteem was associated with a greater drive for muscularity.

  11. Wind Turbine Contingency Control Through Generator De-Rating

    Science.gov (United States)

    Frost, Susan; Goebel, Kai; Balas, Mark

    2013-01-01

    Maximizing turbine up-time and reducing maintenance costs are key technology drivers for wind turbine operators. Components within wind turbines are subject to considerable stresses due to unpredictable environmental conditions resulting from rapidly changing local dynamics. In that context, systems health management aims to assess the state of health of components within a wind turbine, to estimate remaining life, and to aid in autonomous decision-making to minimize damage to the turbine. Advanced contingency control is one way to enable autonomous decision-making by providing the mechanism for safe and efficient turbine operation. The work reported herein explores the integration of condition monitoring of wind turbines with contingency control to balance the trade-offs between maintaining system health and energy capture. The contingency control involves de-rating the generator operating point to achieve reduced loads on the wind turbine. Results are demonstrated using a high-fidelity simulator of a utility-scale wind turbine.

  12. Contingency table analysis methods and implementation using R

    CERN Document Server

    Kateri, Maria

    2014-01-01

    Combining theory and applications, this book presents models and methods for the analysis of two- and multi-dimensional contingency tables. The author uses a threefold approach: fundamental models and related inferences are presented, their interpretational aspects are highlighted, and their practical usefulness is demonstrated. Throughout, practical guidance for using R is provided, along with a comprehensive R-functions web appendix. Contingency tables arise in diverse fields, including the life, pedagogic, social, and political sciences. They also play a prominent role in market research and opinion surveys. The analysis of contingency tables can provide insight into essential structures, relevant quantities and their interactions, and thus leads to improved decision-making. Special features include: a motivating example for each topic; applications and implementations in R for all models discussed; and an emphasis on association and symmetry models…

  13. NextGen Flight Deck Surface Trajectory-Based Operations (STBO): Contingency Holds

    Science.gov (United States)

    Bakowski, Deborah Lee; Hooey, Becky Lee; Foyle, David C.; Wolter, Cynthia A.; Cheng, Lara W. S.

    2013-01-01

    The purpose of this pilot-in-the-loop taxi simulation was to investigate a NextGen Surface Trajectory-Based Operations (STBO) concept called "contingency holds." The contingency-hold concept parses a taxi route into segments, allowing an air traffic control (ATC) surface traffic management (STM) system to hold an aircraft when necessary for safety. Under nominal conditions, if the intersection or active runway crossing is clear, the hold is removed, allowing the aircraft to continue taxiing without slowing, thus improving taxi efficiency, while minimizing the excessive brake use, fuel burn, and emissions associated with stop-and-go taxi. However, when a potential traffic conflict exists, the hold remains in place as a fail-safe mechanism. In this departure operations simulation, the taxi clearance included a required time of arrival (RTA) to a specified intersection. The flight deck was equipped with speed-guidance avionics to aid the pilot in safely meeting the RTA. On two trials, the contingency hold was not released, and pilots were required to stop. On two trials the contingency hold was released 15 sec prior to the RTA, and on two trials the contingency hold was released 30 sec prior to the RTA. When the hold remained in place, all pilots complied with the hold. Results also showed that when the hold was released at 15-sec or 30-sec prior to the RTA, the 30-sec release allowed pilots to maintain nominal taxi speed, thus supporting continuous traffic flow; whereas, the 15-sec release did not. The contingency-hold concept, with at least a 30-sec release, allows pilots to improve taxiing efficiency by reducing braking, slowing, and stopping, but still maintains safety in that no pilots "busted" the clearance holds. Overall, the evidence suggests that the contingency-hold concept is a viable concept for optimizing efficiency while maintaining safety.

  14. Comparing Value of Urban Green Space Using Contingent Valuation and Travel Cost Methods

    Directory of Open Access Journals (Sweden)

    Chintantya Dea

    2018-01-01

    Green urban open spaces are an important element of the city. They give multiple benefits for social life, human health, biodiversity, air quality, carbon sequestration, and water management. The Travel Cost Method (TCM) and the Contingent Valuation Method (CVM) are the most frequently used methods in studies that assess environmental goods and services in monetary terms for valuing urban green space. Both methods determine the value of urban green space through willingness to pay (WTP) for ecosystem benefits, collecting data through direct interviews and questionnaires. The findings of this study showed the weaknesses and strengths of both methods for valuing urban green space and identified factors influencing the probability of users' willingness to pay under each method.

  15. Contingencies and metacontingencies: Toward a synthesis of behavior analysis and cultural materialism

    Science.gov (United States)

    Glenn, Sigrid S.

    1988-01-01

    A synthesis of cultural materialism and behavior analysis might increase the scientific and technological value of both fields. Conceptual and substantive relations between the two fields show important similarities, particularly with regard to the causal role of the environment in behavioral and cultural evolution. Key concepts in Marvin Harris's cultural materialist theories are outlined. A distinction is made between contingencies at the behavioral level of analysis (contingencies of reinforcement) and contingencies at the cultural level of analysis (metacontingencies). Relations between the two kinds of contingencies are explored in cultural practices from paleolithic to industrial sociocultural systems. A synthesis of these two fields may offer the opportunity to resolve serious problems currently facing modern cultures. PMID:22478011

  16. The Contingent Unknowability of Facts and its Relation with Informal, Epistemological Contexts

    Directory of Open Access Journals (Sweden)

    Stanley Kreiter Bezerra Medeiros

    2017-11-01

    This paper focuses on elements that are involved in a specific type of judgment, namely, those involving facts that, in virtue of contingent reasons, are out of our epistemic reach. Its goal is to propose a philosophical explanation about why we, in informal contexts, take some facts as contingently unknowable. In order to accomplish that goal, we develop a theory that defines contingently unknowable facts in a very specific way. We establish three clauses that are jointly necessary and sufficient — so we argue — for taking an arbitrary fact as contingently unknowable. In a variety of contexts, this strategy has the potential of reducing efforts in an epistemological analysis of this particular type of unknowability.

  17. Negotiating contingent knowledges in a time of epistemic doubt

    DEFF Research Database (Denmark)

    Phillips, Louise Jane

    How can/should we produce and communicate social scientific knowledge with authority under conditions of epistemic doubt? If all knowledge is contingent and if truth is a discursive effect rather than the final claim about reality - as post-foundationalism suggests - how can we formulate and provide support for contingent knowledge-claims? And how can the communication of social scientific knowledge be theorised and practised as the negotiation between social scientific knowledge and other forms of contingent knowledge rather than the one-way transmission of universal, value-free truth-claims? In the paper, I outline an approach to addressing the final question. The approach is based on a combination of approaches to the production of knowledge developed in post-foundationalist sociology and philosophy of science, and approaches to the communication of knowledge developed within communication studies…

  18. Task-Contingent Conscientiousness as a Unit of Personality at Work

    Science.gov (United States)

    Minbashian, Amirali; Wood, Robert E.; Beckmann, Nadin

    2010-01-01

    The present study examined the viability of incorporating task-contingent units into the study of personality at work, using conscientiousness as an illustrative example. We used experience-sampling data from 123 managers to show that (a) momentary conscientiousness at work is contingent on the difficulty and urgency demands of the tasks people…

  19. How Precarious Is Contingent Work?

    DEFF Research Database (Denmark)

    Scheuer, Steen

    2015-01-01

    …agree. This study focuses on a number of non-pay conditions for contingent employees, compared to permanent staff, under the assumption that these conditions are cumulatively negative. The article utilizes a survey of approximately 4,900 employees (response rate 57%), asking questions concerning…

  20. Dangerous "spin": the probability myth of evidence-based prescribing - a Merleau-Pontyian approach.

    Science.gov (United States)

    Morstyn, Ron

    2011-08-01

    The aim of this study was to examine the logical positivist statistical probability statements used to support and justify "evidence-based" prescribing rules in psychiatry when viewed from the major philosophical theories of probability, and to propose "phenomenological probability", based on Maurice Merleau-Ponty's philosophy of "phenomenological positivism", as a better clinical and ethical basis for psychiatric prescribing. The logical positivist statistical probability statements currently used to support "evidence-based" prescribing rules in psychiatry have little clinical or ethical justification when subjected to critical analysis from any of the major theories of probability; they represent dangerous "spin" because they necessarily exclude the individual, intersubjective, and ambiguous meaning of mental illness. A concept of "phenomenological probability" founded on Merleau-Ponty's philosophy of "phenomenological positivism" overcomes the clinically destructive "objectivist" and "subjectivist" consequences of logical positivist statistical probability and allows psychopharmacological treatments to be appropriately integrated into psychiatric treatment.

  1. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  2. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)
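    The parallel the abstract draws can be sketched numerically. Conditional entropy has the same structure as conditional probability, H(Y|X) = −Σ p(x,y) log₂ p(y|x), and, like conditional probability, it is generally asymmetric: H(Y|X) ≠ H(X|Y). The joint distribution below is an arbitrary illustrative choice, not from the paper.

```python
# Conditional entropy from a joint distribution, illustrating that
# H(Y|X) and H(X|Y) generally differ. The joint distribution is hypothetical.
import math

joint = {('a', 0): 0.5, ('b', 0): 0.25, ('b', 1): 0.25}

def marginal(joint, axis):
    m = {}
    for xy, p in joint.items():
        m[xy[axis]] = m.get(xy[axis], 0.0) + p
    return m

def cond_entropy(joint, given_axis):
    """H(other | given) = -sum p(x,y) log2( p(x,y) / p(given) )."""
    m = marginal(joint, given_axis)
    return -sum(p * math.log2(p / m[xy[given_axis]])
                for xy, p in joint.items() if p > 0)

print(cond_entropy(joint, 0))  # H(Y|X) = 0.5 bits
print(cond_entropy(joint, 1))  # H(X|Y) ~ 0.689 bits: asymmetric
```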

  3. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  4. Effects of Individual and Group Contingency Interventions on Attendance in Adolescent Part-Time Employees

    Science.gov (United States)

    Berkovits, Shira Melody; Sturmey, Peter; Alvero, Alicia M.

    2012-01-01

    This study examined the effects of individual and group monetary contingencies on the attendance of adolescent part-time employees. Attendance increased in both individual and group contingency phases; however staff questionnaire responses indicated a preference for the individual contingencies. Future research should consider staff acceptability…

  5. 78 FR 53113 - Approval and Promulgation of Implementation Plans; California; San Joaquin Valley; Contingency...

    Science.gov (United States)

    2013-08-28

    ...] Approval and Promulgation of Implementation Plans; California; San Joaquin Valley; Contingency Measures for... California to address Clean Air Act nonattainment area contingency measure requirements for the 1997 annual... Air Act Requirements for Contingency Measures III. Review of the Submitted San Joaquin Valley PM 2.5...

  6. 48 CFR 2132.770 - Insurance premium payments and special contingency reserve.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Insurance premium payments... GENERAL CONTRACTING REQUIREMENTS CONTRACT FINANCING Contract Funding 2132.770 Insurance premium payments and special contingency reserve. Insurance premium payments and a special contingency reserve are made...

  7. Aspects of modern fracture statistics

    International Nuclear Information System (INIS)

    Tradinik, W.; Pabst, R.F.; Kromp, K.

    1981-01-01

    This contribution begins with introductory general remarks about fracture statistics. Then the fundamentals of the distribution of fracture probability are described. In the following part, the application of Weibull statistics is justified. In the fourth chapter, the microstructure of the material is considered in connection with calculations made to determine the fracture probability or risk of fracture. (RW)
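    The Weibull form referred to here is the standard two-parameter expression for the failure probability of a brittle specimen at stress σ, P_f(σ) = 1 − exp(−(σ/σ₀)^m), where σ₀ is the characteristic strength and m the Weibull modulus. The parameter values in this sketch are assumed for illustration, not taken from the paper.

```python
# Two-parameter Weibull failure probability, the standard form in
# fracture statistics. Parameter values below are illustrative assumptions.
import math

def failure_probability(sigma, sigma_0, m):
    """P_f = 1 - exp(-(sigma/sigma_0)**m)."""
    return 1.0 - math.exp(-((sigma / sigma_0) ** m))

sigma_0, m = 300.0, 10.0   # characteristic strength (MPa) and Weibull modulus
for sigma in (200.0, 300.0, 400.0):
    print(sigma, round(failure_probability(sigma, sigma_0, m), 4))
```

    At σ = σ₀ the failure probability is 1 − e⁻¹ ≈ 0.632 by construction; a larger modulus m makes the transition from survival to failure sharper.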

  8. Contingency management for the treatment of methamphetamine use disorders.

    Science.gov (United States)

    Roll, John M; Petry, Nancy M; Stitzer, Maxine L; Brecht, Mary L; Peirce, Jessica M; McCann, Michael J; Blaine, Jack; MacDonald, Marilyn; DiMaria, Joan; Lucero, Leroy; Kellogg, Scott

    2006-11-01

    Theory and some preliminary evidence suggest that contingency management may be an effective treatment strategy or adjunct to psychosocial treatment for methamphetamine use disorders. An experimentally rigorous investigation on the topic was provided by a large multisite trial conducted under the auspices of the Clinical Trials Network of the National Institute on Drug Abuse. The authors report data on 113 participants who were diagnosed with methamphetamine abuse or dependence. They were randomly assigned to receive 12 weeks of either treatment as usual or treatment as usual plus contingency management. Urine samples were tested for illicit drugs, and breath samples were tested for alcohol. The reinforcers for drug-negative samples were plastic chips, some of which could be exchanged for prizes. The number of plastic chips drawn increased with each week of negative samples but was reset to one after a missed or positive sample. The participants in both groups remained in treatment for equivalent times, but those receiving contingency management in addition to usual treatment submitted significantly more negative samples, and they were abstinent for a longer period of time (5 versus 3 weeks). These results suggest that contingency management has promise as a component in treatment strategies for methamphetamine use disorder.
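    The escalating reinforcement schedule described above is easy to state as code. The abstract specifies only that draws increased with each week of negative samples and reset to one after a missed or positive sample; the +1-per-week escalation rule below is an assumption for illustration.

```python
# Sketch of the escalating prize-draw schedule described in the abstract:
# draws grow with each consecutive week of drug-negative samples and reset
# after a missed or positive sample. The +1-per-week rule is assumed.

def draws_per_week(results):
    """results: list of 'neg', 'pos', or 'missed', one entry per week.
    Returns the number of prize draws earned each week."""
    draws, streak = [], 0
    for r in results:
        if r == 'neg':
            streak += 1          # escalate with continued abstinence
            draws.append(streak)
        else:
            streak = 0           # reset: the next negative earns one draw
            draws.append(0)
    return draws

print(draws_per_week(['neg', 'neg', 'pos', 'neg', 'neg', 'neg']))
# -> [1, 2, 0, 1, 2, 3]
```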

  9. Contingencies of self-worth and social-networking-site behavior.

    Science.gov (United States)

    Stefanone, Michael A; Lackaff, Derek; Rosen, Devan

    2011-01-01

    Social-networking sites like Facebook enable people to share a range of personal information with expansive groups of "friends." With the growing popularity of media sharing online, many questions remain regarding antecedent conditions for this behavior. Contingencies of self-worth afford a more nuanced approach to variable traits that affect self-esteem, and may help explain online behavior. A total of 311 participants completed an online survey measuring such contingencies and typical behaviors on Facebook. First, exploratory factor analyses revealed an underlying structure to the seven dimensions of self-worth. Public-based contingencies explained online photo sharing (β = 0.158, p < .05); a negative relationship with time online (β = -0.186, p < .05) and a positive relationship with the intensity of online photo sharing (β = 0.242) were also observed, although no relationship was evident for time spent managing profiles.

  10. An economic evaluation of contingency management for completion of hepatitis B vaccination in those on treatment for opiate dependence.

    Science.gov (United States)

    Rafia, Rachid; Dodd, Peter J; Brennan, Alan; Meier, Petra S; Hope, Vivian D; Ncube, Fortune; Byford, Sarah; Tie, Hiong; Metrebian, Nicola; Hellier, Jennifer; Weaver, Tim; Strang, John

    2016-09-01

    To determine whether the provision of contingency management using financial incentives to improve hepatitis B vaccine completion in people who inject drugs entering community treatment represents a cost-effective use of health-care resources. A probabilistic cost-effectiveness analysis was conducted, using a decision-tree to estimate the short-term clinical and health-care cost impact of the vaccination strategies, followed by a Markov process to evaluate the long-term clinical consequences and costs associated with hepatitis B infection. Data on attendance to vaccination from a UK cluster randomized trial. Two contingency management options were examined in the trial: fixed versus escalating schedule financial incentives. Life-time health-care costs and quality-adjusted life years discounted at 3.5% annually; incremental cost-effectiveness ratios. The resulting estimate for the incremental life-time health-care cost of the contingency management strategy versus usual care was £21.86 [95% confidence interval (CI) = -£12.20 to 39.86] per person offered the incentive. For 1000 people offered the incentive, the incremental reduction in numbers of hepatitis B infections avoided over their lifetime was estimated at 19 (95% CI = 8-30). The probabilistic incremental cost per quality adjusted life-year gained of the contingency management programme was estimated to be £6738 (95% CI = £6297-7172), with an 89% probability of being considered cost-effective at a threshold of £20 000 per quality-adjusted life years gained (97.60% at £30 000). Using financial incentives to increase hepatitis B vaccination completion in people who inject drugs could be a cost-effective use of health-care resources in the UK as long as the incidence remains above 1.2%. © 2016 Society for the Study of Addiction.
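    The headline quantity in this evaluation is the incremental cost-effectiveness ratio (ICER): incremental cost divided by incremental QALYs, compared against a willingness-to-pay threshold. The sketch below uses the point estimates reported in the abstract (incremental cost £21.86 per person; ICER £6,738 per QALY) to back out the implied per-person QALY gain; that back-calculation is an illustration, not a figure from the paper.

```python
# ICER arithmetic using the point estimates reported in the abstract.
# The implied per-person QALY gain is back-calculated for illustration.

def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: cost per QALY gained."""
    return delta_cost / delta_qaly

delta_cost = 21.86            # incremental lifetime cost per person (GBP)
reported_icer = 6738.0        # GBP per QALY gained, from the abstract

implied_delta_qaly = delta_cost / reported_icer   # ~0.003 QALYs per person
print(round(implied_delta_qaly, 5))

threshold = 20000.0           # willingness-to-pay threshold (GBP/QALY)
print(icer(delta_cost, implied_delta_qaly) <= threshold)
```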

  11. Molecular Darwinism: the contingency of spontaneous genetic variation.

    Science.gov (United States)

    Arber, Werner

    2011-01-01

The availability of spontaneously occurring genetic variants is an important driving force of biological evolution. Largely thanks to experimental investigations by microbial geneticists, we know today that several different molecular mechanisms contribute to the overall genetic variation. These mechanisms can be assigned to three natural strategies for generating genetic variants: 1) local sequence changes, 2) intragenomic reshuffling of DNA segments, and 3) acquisition of a segment of foreign DNA. In these processes, specific gene products act in cooperation with different nongenetic elements. Some genetic variations occur fully at random along the DNA filaments; others occur with statistical reproducibility, although at many possible sites. We have to be aware that evolution in natural ecosystems is of higher complexity than under most laboratory conditions, not least in view of symbiotic associations and the occurrence of horizontal gene transfer. The encountered contingency of genetic variation may best ensure the long-term persistence of life under steadily changing living conditions.

  12. Simple statistical model for branched aggregates

    DEFF Research Database (Denmark)

    Lemarchand, Claire; Hansen, Jesper Schmidt

    2015-01-01

We propose a statistical model that can reproduce the size distribution of any branched aggregate, including amylopectin, dendrimers, molecular clusters of monoalcohols, and asphaltene nanoaggregates. It is based on the conditional probability for one molecule to form a new bond with a molecule, given that it already has bonds with others. The model is applied here to asphaltene nanoaggregates observed in molecular dynamics simulations of Cooee bitumen. The relevance of the statistical model in the case of asphaltene nanoaggregates is checked by comparing the predicted value of the probability for one molecule to have exactly i bonds with the same probability directly measured in the molecular dynamics simulations. The agreement is satisfactory. The variation with temperature of the probabilities deduced from this model is discussed in terms of statistical mechanics arguments.
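
A minimal generative sketch of such a conditional-bond model can be simulated directly. The probabilities in P_BOND below are hypothetical placeholders, not the values fitted to Cooee bitumen: each molecule forms one more bond with probability P_BOND[i] given that it already has i bonds, and aggregate sizes follow from the resulting branching process.

```python
import random

random.seed(1)

# Hypothetical conditional bond probabilities: P_BOND[i] is the probability
# that a molecule forms one more bond given it already has i bonds.
P_BOND = [0.6, 0.3, 0.1, 0.0]

def aggregate_size():
    """Grow one branched aggregate as a tree; return its number of molecules."""
    size = 1
    frontier = [0]                 # bond counts of molecules that may still bond
    while frontier:
        bonds = frontier.pop()
        while bonds < len(P_BOND) and random.random() < P_BOND[bonds]:
            bonds += 1             # this molecule gains a bond...
            size += 1              # ...to one new molecule, which starts with 1 bond
            frontier.append(1)
    return size

sizes = [aggregate_size() for _ in range(10_000)]
mean_size = sum(sizes) / len(sizes)
print(round(mean_size, 2))
```

Because the conditional probabilities decrease with the number of existing bonds, the branching process terminates and the simulated size distribution can be compared against counts measured in simulation, as the abstract describes.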

  13. Contingency and inevitability in science - Instruments, interfaces and the independent world

    NARCIS (Netherlands)

    Boon, Mieke; Soler, L.; Trizio, E.; Pickering, A.

    2015-01-01

    It is argued that the meaning of inevitability and contingency depends on the position someone has in the realism/constructivism debate. Furthermore, it is argued that analyzing what we mean by inevitable versus contingent knowledge adds a new dimension to the realism/constructivism debate.

  14. Risk estimation using probability machines

    Science.gov (United States)

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
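
The core idea, that a consistent nonparametric learner recovers the conditional probabilities of a logistic data-generating model, can be sketched with a simple local-averaging estimator standing in for the random forest. This is an assumed setup for illustration, not the paper's code.

```python
import math
import random

random.seed(0)

# Simulate a logistic data-generating model, then estimate P(y=1 | x)
# nonparametrically by averaging outcomes in a small window around x --
# a minimal stand-in for the consistent "probability machines" above.

def true_prob(x):
    return 1.0 / (1.0 + math.exp(-(0.5 + 1.5 * x)))

data = []
for _ in range(50_000):
    x = random.uniform(-2, 2)
    y = 1 if random.random() < true_prob(x) else 0
    data.append((x, y))

def window_estimate(x0, h=0.1):
    """Estimate P(y=1 | x = x0) from outcomes with |x - x0| < h."""
    hits = [y for x, y in data if abs(x - x0) < h]
    return sum(hits) / len(hits)

est = window_estimate(0.0)
print(round(est, 2))
```

With enough data the window average converges to the true conditional probability; effect sizes can then be read off as differences or ratios of such estimates at counterfactual predictor values, which is the paper's "risk machine" construction.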

  15. 45 CFR 264.72 - What requirements are imposed on a State if it receives contingency funds?

    Science.gov (United States)

    2010-10-01

    ... receives contingency funds? 264.72 Section 264.72 Public Welfare Regulations Relating to Public Welfare... Contingency Fund? § 264.72 What requirements are imposed on a State if it receives contingency funds? (a)(1) A State must meet a Contingency Fund MOE level of 100 percent of historic State expenditures for FY 1994...

  16. Individual responsibility and health-risk behaviour: a contingent valuation study from the ex ante societal perspective.

    Science.gov (United States)

    van der Star, Sanne M; van den Berg, Bernard

    2011-08-01

This study analyzes people's social preferences for individual responsibility for health-risk behaviour in health care using the contingent valuation method, adopting a societal perspective. We measure people's willingness to pay for inclusion of a treatment in basic health insurance for a hypothetical lifestyle-dependent (smoking) and a lifestyle-independent (chronic) health problem. Our hypothesis is that people's willingness to pay for the independent and the dependent health problems is similar. As a methodological challenge, this study also analyzes the extent to which people consider their personal situation when answering contingent valuation questions adopting a societal perspective. 513 Dutch inhabitants responded to the questionnaire. They were asked to state their maximum willingness to pay for inclusion of treatments in the basic health insurance package for two health problems. We asked them to assume that one hypothetical health problem was totally independent of behaviour (for simplicity called the chronic disease). Alternatively, we asked them to assume that the other hypothetical health problem was totally caused by health-risk behaviour (for simplicity called the smoking disease). We applied the payment card method to guide respondents in answering the contingent valuation questions. Mean willingness to pay was 42.39 Euros (CI=37.24-47.55) for inclusion of treatment for the health problem that was unrelated to behaviour, with '5-10' and '10-20 Euros' as the most frequently stated answers. In contrast, mean willingness to pay for inclusion of treatment for the health-risk-related problem was 11.29 Euros (CI=8.83-14.55), with '0' and '0-5 Euros' as the most frequently provided answers. The difference in mean willingness to pay was substantial (over 30 Euros) and statistically significant (p-value=0.000). Smokers were statistically significantly more willing to pay for the health-risk-related problem than non-smokers, while people with a chronic condition were not willing to pay more for the health-risk-unrelated (chronic) problem.

  17. Contingency Contractor Optimization Phase 3 Sustainment Platform Requirements - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, Justin David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Frazier, Christopher Rawls [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bandlow, Alisa [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gearhart, Jared Lee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jones, Katherine A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-06-01

Sandia National Laboratories (Sandia) is in Phase 3 Sustainment of development of a prototype tool, currently referred to as the Contingency Contractor Optimization Tool - Prototype (CCOT-P), under the direction of OSD Program Support. CCOT-P is intended to help provide senior Department of Defense (DoD) leaders with comprehensive insight into the global availability, readiness and capabilities of the Total Force Mix. The CCOT-P will allow senior decision makers to quickly and accurately assess the impacts, risks and mitigating strategies for proposed changes to force/capabilities assignments, apportionments and allocation options, focusing specifically on contingency contractor planning. During Phase 2 of the program, conducted during fiscal year 2012, Sandia developed an electronic storyboard prototype of the Contingency Contractor Optimization Tool that can be used for communication with senior decision makers and other Operational Contract Support (OCS) stakeholders. Phase 3 used feedback from demonstrations of the electronic storyboard prototype to develop an engineering prototype for planners to evaluate. Sandia worked with the DoD and Joint Chiefs of Staff strategic planning community to get feedback and input to ensure that the engineering prototype was developed to closely align with future planning needs. The intended deployment environment was also a key consideration as this prototype was developed. Initial release of the engineering prototype was done on servers at Sandia in the middle of Phase 3. In 2013, the tool was installed on a production pilot server managed by the OUSD(AT&L) eBusiness Center. The purpose of this document is to specify the CCOT-P engineering prototype platform requirements as of May 2016. Sandia developed the CCOT-P engineering prototype using common technologies to minimize the likelihood of deployment issues. The CCOT-P engineering prototype was architected and designed to be as independent as possible of the major deployment

  18. Uncertainty analysis with statistically correlated failure data

    International Nuclear Information System (INIS)

    Modarres, M.; Dezfuli, H.; Roush, M.L.

    1987-01-01

Likelihood of occurrence of the top event of a fault tree or sequences of an event tree is estimated from the failure probability of components that constitute the events of the fault/event tree. Component failure probabilities are subject to statistical uncertainties. In addition, there are cases where the failure data are statistically correlated. At present most fault tree calculations are based on uncorrelated component failure data. This chapter describes a methodology for assessing the probability intervals for the top event failure probability of fault trees or frequency of occurrence of event tree sequences when event failure data are statistically correlated. To estimate the mean and variance of the top event, a second-order system moment method is presented through Taylor series expansion, which provides an alternative to the normally used Monte Carlo method. For cases where component failure probabilities are statistically correlated, the Taylor expansion terms are treated properly. A moment-matching technique is used to obtain the probability distribution function of the top event through fitting the Johnson S_B distribution. The computer program, CORRELATE, was developed to perform the calculations necessary for the implementation of the method developed. (author)
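
A toy version of the moment-propagation idea, for a two-component AND gate P_top = p1 * p2 with correlated failure probabilities, can be checked against Monte Carlo. The product mean including the covariance term is exact, and the first-order Taylor variance is a standard result; the numbers below are illustrative, not from the chapter.

```python
import math
import random

random.seed(42)

# Taylor (method-of-moments) propagation for an AND gate with
# statistically correlated component failure probabilities.
mu1, sigma1 = 1e-3, 2e-4
mu2, sigma2 = 5e-3, 1e-3
rho = 0.8
cov = rho * sigma1 * sigma2

mean_taylor = mu1 * mu2 + cov                     # exact for a product of two variables
var_taylor = (mu2 * sigma1) ** 2 + (mu1 * sigma2) ** 2 + 2 * mu1 * mu2 * cov

# Monte Carlo check with correlated Gaussian failure probabilities
samples = []
for _ in range(200_000):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    p1 = mu1 + sigma1 * z1
    p2 = mu2 + sigma2 * (rho * z1 + math.sqrt(1 - rho ** 2) * z2)
    samples.append(p1 * p2)

mean_mc = sum(samples) / len(samples)
print(mean_taylor, mean_mc)
```

Note that ignoring the covariance term (setting cov = 0, as uncorrelated calculations implicitly do) would underestimate the top-event mean here by about 3%, which is exactly the kind of error the chapter's method corrects.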

  19. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes' theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  20. Counterfactuals and history: Contingency and convergence in histories of science and life.

    Science.gov (United States)

    Hesketh, Ian

    2016-08-01

    This article examines a series of recent histories of science that have attempted to consider how science may have developed in slightly altered historical realities. These works have, moreover, been influenced by debates in evolutionary science about the opposing forces of contingency and convergence in regard to Stephen Jay Gould's notion of "replaying life's tape." The article argues that while the historians under analysis seem to embrace contingency in order to present their counterfactual narratives, for the sake of historical plausibility they are forced to accept a fairly weak role for contingency in shaping the development of science. It is therefore argued that Simon Conway Morris's theory of evolutionary convergence comes closer to describing the restrained counterfactual worlds imagined by these historians of science than does contingency. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. A statistical model for aggregating judgments by incorporating peer predictions

    OpenAIRE

    McCoy, John; Prelec, Drazen

    2017-01-01

    We propose a probabilistic model to aggregate the answers of respondents answering multiple-choice questions. The model does not assume that everyone has access to the same information, and so does not assume that the consensus answer is correct. Instead, it infers the most probable world state, even if only a minority vote for it. Each respondent is modeled as receiving a signal contingent on the actual world state, and as using this signal to both determine their own answer and predict the ...

  2. Interference statistics and capacity analysis for uplink transmission in two-tier small cell networks: A geometric probability approach

    KAUST Repository

    Tabassum, Hina

    2014-07-01

    This paper presents a novel framework to derive the statistics of the interference considering dedicated and shared spectrum access for uplink transmission in two-tier small cell networks such as the macrocell-femtocell networks. The framework exploits the distance distributions from geometric probability theory to characterize the uplink interference while considering a traditional grid-model set-up for macrocells along with the randomly deployed femtocells. The derived expressions capture the impact of path-loss, composite shadowing and fading, uniform and non-uniform traffic loads, spatial distribution of femtocells, and partial and full spectral reuse among femtocells. Considering dedicated spectrum access, first, we derive the statistics of co-tier interference incurred at both femtocell and macrocell base stations (BSs) from a single interferer by approximating generalized- K composite fading distribution with the tractable Gamma distribution. We then derive the distribution of the number of interferers considering partial spectral reuse and moment generating function (MGF) of the cumulative interference for both partial and full spectral reuse scenarios. Next, we derive the statistics of the cross-tier interference at both femtocell and macrocell BSs considering shared spectrum access. Finally, we utilize the derived expressions to analyze the capacity in both dedicated and shared spectrum access scenarios. The derived expressions are validated by the Monte Carlo simulations. Numerical results are generated to assess the feasibility of shared and dedicated spectrum access in femtocells under varying traffic load and spectral reuse scenarios. © 2014 IEEE.
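
The Gamma approximation step mentioned above relies on simple moment matching: for a Gamma(k, θ) distribution, E[X] = kθ and Var[X] = kθ². A minimal sketch with generic moments, not the paper's fading parameters:

```python
# Moment-matching a Gamma(k, theta) distribution to a target mean and
# variance, as used above to replace the intractable generalized-K
# composite fading distribution. From E[X] = k*theta and
# Var[X] = k*theta**2, the shape and scale follow directly.

def gamma_match(mean, var):
    """Return (shape k, scale theta) of the moment-matched Gamma."""
    k = mean ** 2 / var
    theta = var / mean
    return k, theta

k, theta = gamma_match(2.0, 0.5)
print(k, theta)               # 8.0 0.25
```

The matched Gamma reproduces the first two moments exactly (kθ = 2.0, kθ² = 0.5), which is what makes downstream MGF manipulations of the cumulative interference tractable.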

  3. Contingency Valuation of Croatian Arboretum Opeka

    Directory of Open Access Journals (Sweden)

    Stjepan Posavec

    2012-12-01

Full Text Available Background and Purpose: Social aspects of forestry have always been an important factor in forest usage and management, and therefore have a significant influence on its sustainability. Non-wood forest functions such as recreation, tourism, and aesthetic and educational factors influence the development of rural areas. The contingent valuation method has rarely been used for the evaluation of protected forest areas. The aim of the article is to estimate the amount of money visitors are willing to pay for the preservation of natural resources in the Arboretum Opeka in north-west Croatia. Material and Methods: Opeka Arboretum is situated in the Vinica municipality in northern Croatia. Located in a large park surrounding a manor, Opeka arboretum, with its 65 hectares, is the largest of the three arboretums existing in Croatia today. The arboretum was founded in 1860 by the Count Marko Bombelles. Contingent valuation is a survey-based economic technique for the non-market valuation of resources, such as environmental preservation or the impact of contamination. It is also the approach that can generally be used to include what is usually referred to as the passive use component of the economic value of environmental goods. Results and Conclusion: Willingness to pay for visitors' use of the arboretum has been investigated using a survey and the contingent valuation method on a sample of 53 respondents. Research results show a high preference for arboretum benefits such as beauty of landscape, cultural and historical significance, recreation and health, but a low willingness to pay.

  4. Understanding common statistical methods, Part I: descriptive methods, probability, and continuous data.

    Science.gov (United States)

    Skinner, Carl G; Patel, Manish M; Thomas, Jerry D; Miller, Michael A

    2011-01-01

    Statistical methods are pervasive in medical research and general medical literature. Understanding general statistical concepts will enhance our ability to critically appraise the current literature and ultimately improve the delivery of patient care. This article intends to provide an overview of the common statistical methods relevant to medicine.
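
The descriptive toolkit such an overview covers (central tendency, spread, and a normal-approximation confidence interval for continuous data) can be illustrated with Python's standard library; the sample values below are invented for illustration.

```python
import statistics

# Descriptive statistics for a small sample of continuous data
# (illustrative values only).
data = [5.1, 4.8, 6.0, 5.5, 5.9, 4.7, 5.3, 5.6]

mean = statistics.mean(data)
median = statistics.median(data)
sd = statistics.stdev(data)             # sample standard deviation (n-1)

# Normal-approximation 95% confidence interval for the mean
half_width = 1.96 * sd / len(data) ** 0.5
ci = (mean - half_width, mean + half_width)

print(round(median, 2))                 # 5.4
```

For a sample this small, a t-based interval would be more appropriate than the 1.96 normal quantile; the normal form is kept here only to match the introductory level of the article.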

  5. Statistics For Dummies

    CERN Document Server

    Rumsey, Deborah

    2011-01-01

The fun and easy way to get down to business with statistics. Stymied by statistics? No fear: this friendly guide offers clear, practical explanations of statistical ideas, techniques, formulas, and calculations, with lots of examples that show you how these concepts apply to your everyday life. Statistics For Dummies shows you how to interpret and critique graphs and charts, determine the odds with probability, guesstimate with confidence using confidence intervals, set up and carry out a hypothesis test, compute statistical formulas, and more. Tracks to a typical first-semester statistics course.

  6. [Contingency management in opioid substitution treatment].

    Science.gov (United States)

    Specka, M; Böning, A; Scherbaum, N

    2011-07-01

    The majority of opiate-dependent patients in substitution treatment show additional substance-related disorders. Concomitant use of heroin, alcohol, benzodiazepines or cocaine compromises treatment success. Concomitant drug use may be treated by using contingency management (CM) which is based on learning theory. In CM, abstinence from drugs, as verified by drug screenings, is reinforced directly and contingently. Reinforcers used in CM studies with substituted patients were, amongst others, vouchers and take-home privileges. Studies in the USA show a medium average effect of CM on drug consumption rates and abstinence. The effects decrease markedly after the end of the intervention. We discuss whether CM is applicable within the German substitution treatment system and how it can be combined with other interventions such as selective detoxification treatments or cognitive-behavioural programmes. © Georg Thieme Verlag KG Stuttgart · New York.

  7. Strategic cost management, contingent factors and performance in services

    Directory of Open Access Journals (Sweden)

    Odysseas Pavlatos

    2018-06-01

Full Text Available The purpose of this paper is to investigate the relationship between contextual factors identified from contingency-based research, the extent of the use of strategic cost management (SCM) techniques and business performance in services. An empirical survey was conducted on a sample of 88 services in Greece. The analysis of the survey data indicates that the use of strategic cost management techniques in services can be considered quite satisfactory. By drawing on the grounds of contingency theory, five factors were identified as potentially exhibiting an emergent relationship with strategic cost management. The five factors are: (1) perceived environmental uncertainty, (2) structure, (3) organizational life cycle stage, (4) strategy and (5) size. The survey revealed that SCM usage is positively affected by these five contingent factors, while SCM usage, in turn, positively affects performance. A significant mediating effect of SCM usage on performance is evident.

  8. New Challenges of Contingency Theory in Management Accounting System, in Terms of Global Economic Crisis

    OpenAIRE

    Ene Dumitru

    2010-01-01

This paper aims to answer two questions: 1. Can contingency theory be a source of improvement in management accounting research in terms of the global economic crisis? 2. Can contingency factors be a bridge between organizational theories and management accounting? Research purpose: -Contingency theory can be a source of improvement in management accounting research in terms of the global economic crisis; -Contingency factors can be a bridge between organizational theories and management accounting.

  9. Information-theoretic methods for estimating of complicated probability distributions

    CERN Document Server

    Zong, Zhi

    2006-01-01

Mixing up various disciplines frequently produces something profound and far-reaching. Cybernetics is an often-quoted example. The mix of information theory, statistics and computing technology has proved very useful, leading to the recent development of information-theory-based methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is a fundamental task in quite a few fields besides statistics, such as reliability, probabilistic risk analysis (PSA), machine learning, pattern recognition, image processing, and neural networks.

  10. Level set segmentation of medical images based on local region statistics and maximum a posteriori probability.

    Science.gov (United States)

    Cui, Wenchao; Wang, Yi; Lei, Tao; Fan, Yangyu; Feng, Yan

    2013-01-01

    This paper presents a variational level set method for simultaneous segmentation and bias field estimation of medical images with intensity inhomogeneity. In our model, the statistics of image intensities belonging to each different tissue in local regions are characterized by Gaussian distributions with different means and variances. According to maximum a posteriori probability (MAP) and Bayes' rule, we first derive a local objective function for image intensities in a neighborhood around each pixel. Then this local objective function is integrated with respect to the neighborhood center over the entire image domain to give a global criterion. In level set framework, this global criterion defines an energy in terms of the level set functions that represent a partition of the image domain and a bias field that accounts for the intensity inhomogeneity of the image. Therefore, image segmentation and bias field estimation are simultaneously achieved via a level set evolution process. Experimental results for synthetic and real images show desirable performances of our method.

  11. Impact of statistical learning methods on the predictive power of multivariate normal tissue complication probability models.

    Science.gov (United States)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A; van't Veld, Aart A

    2012-03-15

    To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended. Copyright © 2012 Elsevier Inc. All rights reserved.
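
The selection behaviour that separates the LASSO from stepwise selection in such NTCP modeling comes from its L1 penalty, which enters coordinate-wise updates as a soft-thresholding operator: small coefficients are set exactly to zero. A minimal sketch of that operator:

```python
def soft_threshold(z, lam):
    """LASSO coordinate update: shrink z toward 0 and cut it off at lam.
    Coefficients with |z| <= lam are set exactly to 0, which is how the
    LASSO performs variable selection (unlike stepwise regression, which
    adds or drops whole predictors by sequential significance tests)."""
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

print(soft_threshold(2.0, 0.5), soft_threshold(0.3, 0.5))   # 1.5 0.0
```

Because the surviving coefficients are shrunk rather than fit greedily, the resulting model tends to generalize better under cross-validation, consistent with the predictive-power gains reported above.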

  12. Impact of Statistical Learning Methods on the Predictive Power of Multivariate Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Schilstra, Cornelis; Langendijk, Johannes A.; Veld, Aart A. van' t [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands)

    2012-03-15

    Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.

  13. Impact of Statistical Learning Methods on the Predictive Power of Multivariate Normal Tissue Complication Probability Models

    International Nuclear Information System (INIS)

    Xu Chengjian; Schaaf, Arjen van der; Schilstra, Cornelis; Langendijk, Johannes A.; Veld, Aart A. van’t

    2012-01-01

    Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.

  14. Monte Carlo approach to define the refrigerator capacities for JT-60SA

    International Nuclear Information System (INIS)

    Wanner, Manfred; Barabaschi, Pietro; Lamaison, Valerie; Michel, Frederic; Reynaud, Pascal; Roussel, Pascal

    2011-01-01

The JT-60SA cryogenic system shall provide refrigeration to keep the superconducting magnets and their structures at 4.4 K, the cryo-pumps at 3.7 K, and the thermal shields at 80-100 K, and deliver a flow of 50 K helium to the current leads. A Monte Carlo method is proposed to determine the capacity contingencies for the refrigeration system. Attributing individual contingencies and probability distribution functions to the design variables allows the different load contributions to be statistically averaged. The total refrigeration contingency is derived for each temperature level from the 95% confidence level of the integrated distribution function.
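
The statistical-averaging idea can be sketched as a Monte Carlo draw over uncertain heat loads, reading the required capacity off the 95% quantile of the total. The load values and uncertainties below are illustrative placeholders, not the actual JT-60SA figures.

```python
import random

random.seed(7)

# Monte Carlo sizing of refrigeration capacity: each heat load gets a
# nominal value and a relative 1-sigma uncertainty; the capacity with
# contingency is read off at the 95% confidence level of the total.
loads_w = {                         # (nominal load in W, relative 1-sigma)
    "magnets":        (2500.0, 0.10),
    "cryo_pumps":     ( 400.0, 0.15),
    "thermal_shield": ( 900.0, 0.10),
    "current_leads":  ( 600.0, 0.20),
}

totals = []
for _ in range(50_000):
    total = sum(random.gauss(nom, nom * rel) for nom, rel in loads_w.values())
    totals.append(total)

totals.sort()
p95 = totals[int(0.95 * len(totals))]   # 95% confidence level of the total
nominal = sum(nom for nom, _ in loads_w.values())
print(nominal, round(p95))
```

Because independent uncertainties partially cancel in the sum, the 95% quantile sits well below the worst case of adding every load's individual contingency, which is the economic argument for the statistical approach.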

  15. The statistical significance of error probability as determined from decoding simulations for long codes

    Science.gov (United States)

    Massey, J. L.

    1976-01-01

    The very low error probability obtained with long error-correcting codes results in a very small number of observed errors in simulation studies of practical size and renders the usual confidence interval techniques inapplicable to the observed error probability. A natural extension of the notion of a 'confidence interval' is made and applied to such determinations of error probability by simulation. An example is included to show the surprisingly great significance of as few as two decoding errors in a very large number of decoding trials.
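
The extension referred to above is in the spirit of exact binomial bounds: with very few observed errors, a Clopper-Pearson style upper confidence limit stays valid where normal approximations fail. The sketch below (for the zero-error case it reduces to the "rule of three") illustrates the principle and is not the paper's own construction.

```python
import math

# With k observed decoding errors in n trials, the usual normal-approximation
# interval breaks down. An exact upper bound solves P(X <= k; n, p) = alpha
# for p by bisection on the binomial CDF.

def binom_cdf(k, n, p):
    return sum(math.comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(k + 1))

def upper_bound(k, n, alpha=0.05):
    """95% (by default) exact upper confidence limit on the error probability."""
    lo, hi = 0.0, 1.0
    for _ in range(60):                  # bisection to ample precision
        mid = (lo + hi) / 2
        if binom_cdf(k, n, mid) > alpha:
            lo = mid                     # p still plausible, push higher
        else:
            hi = mid
    return hi

# Zero errors in 1000 trials: the 95% upper bound is ~3/n, the rule of three.
print(round(upper_bound(0, 1000), 4))   # 0.003
```

The jump from zero to two observed errors roughly doubles the upper bound at the same n, which is the "surprisingly great significance of as few as two decoding errors" noted in the abstract.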

  16. Business statistics I essentials

    CERN Document Server

    Clark, Louise

    2014-01-01

REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Business Statistics I includes descriptive statistics, introduction to probability, probability distributions, sampling and sampling distributions, interval estimation, and hypothesis testing.

  17. Contingent Imitation Increases Verbal Interaction in Children with Autism Spectrum Disorders

    Science.gov (United States)

    Ishizuka, Yuka; Yamamoto, Jun-ichi

    2016-01-01

    Several studies have suggested that contingent adult imitation increase nonverbal communication, such as attention and proximity to adults, in children with autism spectrum disorders. However, few studies have shown the effect of contingent imitation on verbal communication. This study examined whether children with autism were able to promote…

  18. Evaluation of a modified contingency management intervention for consistent attendance in therapeutic workplace participants.

    Science.gov (United States)

    Wong, Conrad J; Dillon, Erin M; Sylvest, Christine; Silverman, Kenneth

    2004-06-11

    In a therapeutic workplace business, drug abuse patients are hired as data entry operators and paid to perform data entry work contingent upon documented drug abstinence. Reliable attendance has been difficult to maintain despite the opportunity for operators to earn a living wage, 6 h per day, 5 days per week. A within-subject reversal design experiment evaluated a contingency management intervention that allowed flexibility in when operators could arrive at work, yet maintained a contingency for reliable workplace attendance. Results demonstrated the contingency management intervention to be effective in increasing the frequency of completed work shifts in four of five operators. Repeated measures ANOVA and Tukey's post-hoc tests of grouped data showed that the contingency management intervention significantly (P workplace participants.

  19. Defense plan of Hydro-Quebec for extreme contingencies

    International Nuclear Information System (INIS)

    Trudel, Guilles; Bernard, Serge; Portales, Esteban

    2000-01-01

    In recent years, Hydro-Quebec has undertaken a major program to improve the dependability of its power transmission network. Efforts have focused on increasing the network's capacity to withstand extreme contingencies, generally caused by multiple incidents or by the successive disconnection of transmission lines. To counter such contingencies, Hydro-Quebec adopted a series of special measures grouped under the general title of Defense Plan for Extreme Contingencies. The objective of this plan is to detect incidents that exceed the capacity of the network. It is fully automatic and is based mainly on: a system for automatic generation rejection and remote load shedding; a system for automatic switching (opening and closing) of 735 kV shunt reactors; an undervoltage load-shedding system; and an underfrequency load-shedding system. The present document summarizes the approach Hydro-Quebec is taking to protect its network against extreme contingencies and describes the various automatic systems adopted, in particular the automatic generation rejection and remote load-shedding system (RPTC), one of the main components of the defense plan. The RPTC system detects the simultaneous loss of several lines directly at 15 substations of the 735 kV network. It comprises four automatic generation rejection sites and a centralized remote load-shedding system

  20. Resource Management and Contingencies in Aerospace Concurrent Engineering

    Science.gov (United States)

    Karpati, Gabe; Hyde, Tupper; Peabody, Hume; Garrison, Matthew

    2012-01-01

    A significant concern in designing complex systems implementing new technologies is that while knowledge about the system is acquired incrementally, substantial financial commitments, even make-or-break decisions, must be made upfront, essentially in the unknown. One practice that helps in dealing with this dichotomy is the smart embedding of contingencies and margins in the design to serve as buffers against surprises. This issue presents itself in full force in the aerospace industry, where unprecedented systems are formulated and committed to as a matter of routine. As more and more aerospace mission concepts are generated by concurrent design laboratories, it is imperative that such laboratories apply well-thought-out contingency and margin structures to their designs. The first part of this publication provides an overview of resource management techniques and standards used in the aerospace industry. That is followed by a thought-provoking treatise on margin policies. The exposé presents the actual flight telemetry data recorded by the thermal discipline during several recent NASA Goddard Space Flight Center missions. The margins actually achieved in flight are compared against pre-flight predictions, and the appropriateness and the ramifications of having designed with rigid margins to bounding stacked worst-case conditions are assessed. The second half of the paper examines the particular issues associated with the application of contingencies and margins in the concurrent engineering environment. In closing, a discipline-by-discipline disclosure of the contingency and margin policies in use at the Integrated Design Center at NASA's Goddard Space Flight Center is made.

  1. Evaluation of probability and hazard in nuclear energy

    International Nuclear Information System (INIS)

    Novikov, V.Ya.; Romanov, N.L.

    1979-01-01

    Because statistical evaluation of NPP safety is unreliable, various methods for evaluating the probability of an accident at an NPP are proposed. The concept of subjective probability for the quantitative analysis of safety and hazard is described. The interpretation of probability as an expert's actual degree of belief is taken as the basis of this concept. It is suggested that event uncertainty be studied in the framework of subjective probability theory, which not only permits but demands that expert opinions be taken into account when evaluating probability. These subjective expert evaluations affect, to a certain extent, the calculation of the usual mathematical probability of an event. The above technique is advantageous for the consideration of a single experiment or random event

  2. Two psychologies: Cognitive versus contingency-oriented

    NARCIS (Netherlands)

    Mey, H.R.A. De

    2003-01-01

    Cognitive psychology and contingency-based behavior analysis are contrasted with respect to their philosophical and theoretical underpinnings as well as their practical goals. Whereas the former focuses on intra-organismic structure and function in explaining minds, the latter

  3. Thomas Aquinas on Contingency of Nature

    Czech Academy of Sciences Publication Activity Database

    Dvořák, Petr

    2008-01-01

    Roč. 5, č. 2 (2008), s. 185-196 ISSN 1214-8407 R&D Projects: GA AV ČR(CZ) IAA900090602 Institutional research plan: CEZ:AV0Z90090514 Keywords : Thomas Aquinas * determinism * contingency Subject RIV: AA - Philosophy ; Religion

  4. A Contingency View of Problem Solving in Schools: A Case Analysis.

    Science.gov (United States)

    Hanson, E. Mark; Brown, Michael E.

    Patterns of problem-solving activity in one middle-class urban high school are examined and a problem-solving model rooted in a conceptual framework of contingency theory is presented. Contingency theory stresses that as political, economic, and social conditions in an organization's environment become problematic, the internal structures of the…

  5. Dynamic encoding of speech sequence probability in human temporal cortex.

    Science.gov (United States)

    Leonard, Matthew K; Bouchard, Kristofer E; Tang, Claire; Chang, Edward F

    2015-05-06

    Sensory processing involves identification of stimulus features, but also integration with the surrounding sensory and cognitive context. Previous work in animals and humans has shown fine-scale sensitivity to context in the form of learned knowledge about the statistics of the sensory environment, including relative probabilities of discrete units in a stream of sequential auditory input. These statistics are a defining characteristic of one of the most important sequential signals humans encounter: speech. For speech, extensive exposure to a language tunes listeners to the statistics of sound sequences. To address how speech sequence statistics are neurally encoded, we used high-resolution direct cortical recordings from human lateral superior temporal cortex as subjects listened to words and nonwords with varying transition probabilities between sound segments. In addition to their sensitivity to acoustic features (including contextual features, such as coarticulation), we found that neural responses dynamically encoded the language-level probability of both preceding and upcoming speech sounds. Transition probability first negatively modulated neural responses, followed by positive modulation of neural responses, consistent with coordinated predictive and retrospective recognition processes, respectively. Furthermore, transition probability encoding was different for real English words compared with nonwords, providing evidence for online interactions with high-order linguistic knowledge. These results demonstrate that sensory processing of deeply learned stimuli involves integrating physical stimulus features with their contextual sequential structure. Despite not being consciously aware of phoneme sequence statistics, listeners use this information to process spoken input and to link low-level acoustic representations with linguistic information about word identity and meaning. Copyright © 2015 the authors 0270-6474/15/357203-12$15.00/0.

  6. Double contingency: A practical example of a risk acceptance philosophy

    International Nuclear Information System (INIS)

    Bazley, J.J.

    1995-01-01

    The double-contingency principle as defined in ANSI/ANS-8.1 specifies that "Process designs should, in general, incorporate sufficient factors of safety to require at least two unlikely, independent, and concurrent changes in process conditions before a criticality accident is possible." The following practical example has been used to familiarize plant operators and managers and to train criticality safety engineers in double contingency

  7. Defense Infrastructure: Actions Needed to Enhance Oversight of Construction Projects Supporting Military Contingency Operations

    Science.gov (United States)

    2016-09-01

    supporting documentation for reviews that the U.S. Forces-Afghanistan conducted beginning in November 2011 of planned or ongoing contingency … Contingency basing includes the planning, designing, constructing, operating, managing, and transitioning or closing of a non-enduring location … Background: definition of "contingency construction" project; statutory authority (GAO-16-406, Defense Infrastructure)

  8. A contingency-based approach to the etiology of 'disorganized' attachment: the 'flickering switch' hypothesis.

    Science.gov (United States)

    Koós, O; Gergely, G

    2001-01-01

    The authors present a new approach to the etiology of disorganized attachment based on contingency detection theory. According to this view, the relevant common factor in parental maltreatment and unresolved loss that leads to disorganized attachment has to do with the type of "deviant contingency environment" that both of these conditions generate. In such environments, infants experience periods of being in control followed by periods of sudden loss of control over the caregiver's behavior. The authors hypothesize that this adversely affects the developmental unfolding of the infant's innate "contingency detection module" (Gergely & Watson, 1999), which normally involves a maturational shift around 3 months from an initial attention bias for perfectly contingent stimulation to an emerging preference for less-than-perfect social contingencies. The periodically changing controllability of abusive and dissociating "unresolved" attachment figures is hypothesized to block this process and to lead to the defensive fixation of a dysfunctional "flickering contingency switch" mechanism with two dominant and competing target positions (self-oriented vs. other-oriented). This results in the dissociative style of attention and behavioral organization characteristic of disorganized infant attachment. The authors summarize the preliminary results of an empirical study that provides support for this model in 6.5-month-old infants using a modified Still-Face situation (the Mirror Interaction Situation). The study demonstrates differential emotional and behavioral reactions to sudden loss of maternal contingency and a specific interest in exploring the perfectly contingent self-image in the mirror in infants who at 12 months become categorized as "disorganized" in the Strange Situation.

  9. Dynamic Contingency Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2016-01-14

    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS/E planning tool (PSS/E). It has the following features: It uses a hybrid dynamic and steady-state approach to simulating the cascading outage sequences that includes fast dynamic and slower steady-state events. It integrates dynamic models with protection scheme models for generation, transmission, and load. It models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.
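    As a toy-scale illustration of the kind of cascading-outage loop DCAT automates (the real tool drives PSS/E dynamic and steady-state simulations), the following sketch uses invented line flows, capacities, and a naive equal-redistribution rule:

    ```python
    # Toy steady-state cascading loop: trip the initial outage, redistribute
    # lost flow equally over surviving lines, and trip any line that exceeds
    # its capacity, repeating until no further trips occur.
    def cascade(flows, capacities, initial_outage):
        in_service = {line: True for line in flows}
        in_service[initial_outage] = False
        tripped = [initial_outage]
        while True:
            lost = sum(flows[l] for l in flows if not in_service[l])
            live = [l for l in flows if in_service[l]]
            if not live:
                break
            loading = {l: flows[l] + lost / len(live) for l in live}
            over = [l for l in live if loading[l] > capacities[l]]
            if not over:
                break
            for l in over:
                in_service[l] = False
                tripped.append(l)
        return tripped

    flows = {"A": 80.0, "B": 60.0, "C": 40.0}
    caps = {"A": 100.0, "B": 90.0, "C": 120.0}
    sequence = cascade(flows, caps, "A")   # loss of A overloads B, then C
    ```

    With these invented numbers the loss of line A overloads B, whose loss in turn overloads C, so the whole toy system cascades; raising B's capacity to 110 stops the cascade at the initial outage.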

  10. When does self-esteem relate to deviant behavior? The role of contingencies of self-worth.

    Science.gov (United States)

    Ferris, D Lance; Brown, Douglas J; Lian, Huiwen; Keeping, Lisa M

    2009-09-01

    Researchers have assumed that low self-esteem predicts deviance, but empirical results have been mixed. This article draws upon recent theoretical developments regarding contingencies of self-worth to clarify the self-esteem/deviance relation. It was predicted that self-esteem level would relate to deviance only when self-esteem was not contingent on workplace performance. In this manner, contingent self-esteem is a boundary condition for self-consistency/behavioral plasticity theory predictions. Using multisource data collected from 123 employees over 6 months, the authors examined the interaction between level (high/low) and type (contingent/noncontingent) of self-esteem in predicting workplace deviance. Results support the hypothesized moderating effects of contingent self-esteem; implications for self-esteem theories are discussed.

  11. conting : an R package for Bayesian analysis of complete and incomplete contingency tables

    OpenAIRE

    Overstall, Antony; King, Ruth

    2014-01-01

    The aim of this paper is to demonstrate the R package conting for the Bayesian analysis of complete and incomplete contingency tables using hierarchical log-linear models. This package allows a user to identify interactions between categorical factors (via complete contingency tables) and to estimate closed population sizes using capture-recapture studies (via incomplete contingency tables). The models are fitted using Markov chain Monte Carlo methods. In particular, implementations of the Me...

  12. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    The design wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability) is based on the statistical analysis of the long-term extreme significant wave height. The design individual wave height is then calculated as the expected maximum individual wave height associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions.
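    Under the Rayleigh assumption stated in this record, the single-wave exceedence probability, the lifetime encounter probability, and a first-order expected maximum can be sketched as follows (the formulas are the standard Rayleigh-distribution ones; the specific numbers are illustrative):

    ```python
    import math

    def rayleigh_exceedence(h, hs):
        # P(H > h) for individual wave heights given significant height Hs.
        return math.exp(-2.0 * (h / hs) ** 2)

    def encounter_probability(h, hs, n_waves):
        # Probability that at least one of n_waves exceeds h
        # (the encounter probability over the structure lifetime).
        p_single = rayleigh_exceedence(h, hs)
        return 1.0 - (1.0 - p_single) ** n_waves

    def expected_max(hs, n_waves):
        # Common first-order estimate of the expected maximum of n waves.
        return hs * math.sqrt(math.log(n_waves) / 2.0)

    # Illustrative numbers: Hs = 10 m, a storm of 1000 waves.
    h_max = expected_max(10.0, 1000)              # roughly 18.6 m
    p_enc = encounter_probability(h_max, 10.0, 1000)
    ```

    The record's point is visible here: designing to the expected maximum still leaves a substantial encounter probability, which is why the paper works backwards from a target exceedence probability instead.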

  13. 78 FR 71448 - Regional Reliability Standard BAL-002-WECC-2-Contingency Reserve

    Science.gov (United States)

    2013-11-29

    ...; Order No. 789] Regional Reliability Standard BAL-002-WECC-2--Contingency Reserve AGENCY: Federal Energy... (Contingency Reserve). The North American Electric Reliability Corporation (NERC) and Western Electricity... Region and is meant to specify the quantity and types of...

  14. The Corps Engineer Battalion in Contingency Operations

    National Research Council Canada - National Science Library

    Raymer, James

    2001-01-01

    .... The central research question asks: Is the proposed echelons above division engineer battalion design a better one for active and reserve component corps engineer forces to respond in a contingency...

  15. Attenuation of the contingency detection effect in the extrastriate body area in autism spectrum disorder.

    Science.gov (United States)

    Okamoto, Yuko; Kitada, Ryo; Tanabe, Hiroki C; Hayashi, Masamichi J; Kochiyama, Takanori; Munesue, Toshio; Ishitobi, Makoto; Saito, Daisuke N; Yanaka, Hisakazu T; Omori, Masao; Wada, Yuji; Okazawa, Hidehiko; Sasaki, Akihiro T; Morita, Tomoyo; Itakura, Shoji; Kosaka, Hirotaka; Sadato, Norihiro

    2014-10-01

    Detection of the contingency between one's own behavior and consequent social events is important for normal social development, and impaired contingency detection may be a cause of autism spectrum disorder (ASD). To depict the neural underpinnings of this contingency effect, 19 adults with ASD and 22 control participants underwent functional MRI while imitating another's actions and their actions being imitated by the other. As the extrastriate body area (EBA) receives efference copies of one's own movements, we predicted that the EBA would show an atypical response during contingency detection in ASD. We manipulated two factors: the congruency of the executed and observed actions, and the order of action execution and observation. Both groups showed the congruency effect in the bilateral EBA during imitation. When action preceded observation, the left EBA of the control group showed the congruency effect, representing the response to being imitated, indicating contingency detection. The ASD group showed a reduced contingency effect in the left EBA. These results indicate that the function of the EBA in the contingency detection is altered in ASD. Copyright © 2014 Elsevier Ireland Ltd and the Japan Neuroscience Society. All rights reserved.

  16. Alternative Approaches to the Analysis of Multidimensional Contingency Tables

    Directory of Open Access Journals (Sweden)

    Iva Pecáková

    2011-12-01

    Full Text Available The practical analysis of interactions between categorical variables in various areas (such as public opinion research or marketing research) is often only an application of chi-square tests in two-way contingency tables. However, in many situations it is impossible to use large-sample approximations to sampling distributions when their adequacy is in doubt. It is known that these approximations may be very poor when the contingency table contains very small expected frequencies. Recent work has shown that these approximations can also be very poor when the contingency table contains both small and large expected frequencies. Of course, the rule of thumb of a minimum expected frequency is not met in the case of a sparse table either. The article deals with alternative approaches to data analysis in such cases. It points out other possibilities and shows that, thanks to the development of computer technology, exact methods that were previously difficult to use are now available for this purpose.
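    For a 2x2 table, the exact approach mentioned above amounts to Fisher's exact test (available, for example, as scipy.stats.fisher_exact); a self-contained sketch of the two-sided p-value, with an invented sparse table, might look like:

    ```python
    import math

    def fisher_exact_p(a, b, c, d):
        # Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]:
        # sum the hypergeometric probabilities of all tables with the same
        # margins that are no more likely than the observed one.
        def hyper(a, b, c, d):
            return (math.comb(a + b, a) * math.comb(c + d, c)
                    / math.comb(a + b + c + d, a + c))
        row1, row2, col1 = a + b, c + d, a + c
        p_obs = hyper(a, b, c, d)
        p = 0.0
        for x in range(max(0, col1 - row2), min(row1, col1) + 1):
            px = hyper(x, row1 - x, col1 - x, row2 - col1 + x)
            if px <= p_obs * (1 + 1e-9):
                p += px
        return p

    # A sparse table where the chi-square approximation would be dubious:
    p = fisher_exact_p(1, 9, 11, 3)
    ```

    The enumeration is exact, so no minimum-expected-frequency rule of thumb is needed; the cost is combinatorial, which is why such methods only became routine with modern computing.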

  17. Ordinal Log-Linear Models for Contingency Tables

    Directory of Open Access Journals (Sweden)

    Brzezińska Justyna

    2016-12-01

    Full Text Available A log-linear analysis is a method providing a comprehensive scheme to describe the association between categorical variables in a contingency table. The log-linear model specifies how the expected counts depend on the levels of the categorical variables for these cells and provides detailed information on the associations. The aim of this paper is to present theoretical, as well as empirical, aspects of ordinal log-linear models used for contingency tables with ordinal variables. We introduce log-linear models for ordinal variables: the linear-by-linear association model, the row effect model, the column effect model, and Goodman's RC model. The algorithm, advantages, and disadvantages are discussed in the paper. An empirical analysis is conducted with the use of R.
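    The linear-by-linear association model mentioned above has a simple closed-form test statistic, M² = (n − 1)r², where r is the Pearson correlation between row and column scores; a sketch with equally spaced integer scores and an invented symmetric table:

    ```python
    import math

    def linear_by_linear(table):
        # M^2 = (n - 1) * r^2 for an ordinal contingency table, using
        # integer scores 0, 1, 2, ... for rows and columns; M^2 is
        # approximately chi-squared with 1 degree of freedom.
        rows, cols = len(table), len(table[0])
        n = sum(sum(row) for row in table)
        col_tot = [sum(table[i][j] for i in range(rows)) for j in range(cols)]
        sx = sum(i * sum(table[i]) for i in range(rows))
        sy = sum(j * col_tot[j] for j in range(cols))
        sxy = sum(i * j * table[i][j] for i in range(rows) for j in range(cols))
        sxx = sum(i * i * sum(table[i]) for i in range(rows))
        syy = sum(j * j * col_tot[j] for j in range(cols))
        r = ((n * sxy - sx * sy)
             / math.sqrt((n * sxx - sx ** 2) * (n * syy - sy ** 2)))
        return (n - 1) * r ** 2

    # Invented 3x3 table with counts concentrated on the diagonal:
    m2 = linear_by_linear([[20, 10, 5], [10, 15, 10], [5, 10, 20]])
    ```

    Because the statistic spends only one degree of freedom on the ordinal trend, it is far more powerful against monotone association than the general chi-square test on the same table.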

  18. Lessons in Contingent, Recursive Humility

    Science.gov (United States)

    Vagle, Mark D.

    2011-01-01

    In this article, the author argues that critical work in teacher education should begin with teacher educators turning a critical eye on their own practices. The author uses Lesko's conception of contingent, recursive growth and change to analyze a lesson he observed as part of a phenomenological study aimed at understanding more about what it is…

  19. School and conference on probability theory

    International Nuclear Information System (INIS)

    Lawler, G.F.

    2004-01-01

    This volume includes expanded lecture notes from the School and Conference in Probability Theory held at ICTP in May 2001. Probability theory is a very large area, too large for a single school and conference. The organizers, G. Lawler, C. Newman, and S. Varadhan, chose to focus on a number of active research areas that have their roots in statistical physics. The pervasive theme in these lectures is trying to find the large-time or large-space behaviour of models defined on discrete lattices. Usually the definition of the model is relatively simple: either assigning a particular weight to each possible configuration (equilibrium statistical mechanics) or specifying the rules under which the system evolves (nonequilibrium statistical mechanics). Interacting particle systems is the area of probability that studies the evolution of particles (either finite or infinite in number) under random motions. The evolution of particles depends on the positions of the other particles; often one assumes that it depends only on the particles that are close to the particular particle. Thomas Liggett's lectures give an introduction to this very large area. Claudio Landim's lectures follow up by discussing hydrodynamic limits of particle systems. The goal of this area is to describe the long-time, large-system-size dynamics in terms of partial differential equations. The area of random media is concerned with the properties of materials or environments that are not homogeneous. Percolation theory studies one of the simplest stated models for impurities: taking a lattice and removing some of the vertices or bonds. Luiz Renato G. Fontes and Vladas Sidoravicius give a detailed introduction to this area. Random walk in random environment combines two sources of randomness: a particle performing stochastic motion in which the transition probabilities depend on position and have been chosen from some probability distribution. Alain-Sol Sznitman gives a survey of recent developments in this

  20. Statistics for Engineers

    International Nuclear Information System (INIS)

    Kim, Jin Gyeong; Park, Jin Ho; Park, Hyeon Jin; Lee, Jae Jun; Jun, Whong Seok; Whang, Jin Su

    2009-08-01

    This book explains statistics for engineers using MATLAB, covering arrangement and summary of data, probability, probability distributions, sampling distributions, estimation, hypothesis testing, analysis of variance, regression analysis, categorical data analysis, quality assurance (including the concept of control charts, consecutive control charts, the breakthrough strategy, and analysis using MATLAB), reliability analysis (including measurement of reliability and analysis with MATLAB), and Markov chains.