WorldWideScience

Sample records for subject probability theory

  1. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  2. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    …either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still…

  3. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  4. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  5. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  6. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  7. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the ground work to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, Erdos-Kac invariance principle, functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a textbook for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  8. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved, but the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  9. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus. Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic processes…

  10. Records via probability theory

    CERN Document Server

    Ahsanullah, Mohammad

    2015-01-01

    Many statisticians, actuarial mathematicians, reliability engineers, meteorologists, hydrologists, economists, and business and sport analysts deal with records, which play important roles in various fields of statistics and its applications. This book enables a reader to check his/her level of understanding of the theory of record values. We give the basic formulae which are most important in the theory and present a lot of examples which illustrate the theoretical statements. For a beginner in record statistics, as well as for graduate students, the study of our book needs only basic knowledge of the subject. A more advanced reader can use our book to polish his/her knowledge. An updated bibliography, which will help a reader to enrich his/her theoretical knowledge and widen the experience of dealing with ordered observations, is also given in the book.

  11. PSA, subjective probability and decision making

    International Nuclear Information System (INIS)

    Clarotti, C.A.

    1989-01-01

    PSA is the natural way to make decisions in the face of uncertainty about potentially dangerous plants; subjective probability, subjective utility and Bayes statistics are the ideal tools for carrying out a PSA. To support this statement, the paper examines the various stages of the PSA procedure in detail and proves, step by step, the superiority of Bayesian techniques over sampling-theory machinery.

  12. Constructor theory of probability

    Science.gov (United States)

    2016-01-01

    Unitary quantum theory, having no Born Rule, is non-probabilistic. Hence the notorious problem of reconciling it with the unpredictability and appearance of stochasticity in quantum measurements. Generalizing and improving upon the so-called ‘decision-theoretic approach’, I shall recast that problem in the recently proposed constructor theory of information—where quantum theory is represented as one of a class of superinformation theories, which are local, non-probabilistic theories conforming to certain constructor-theoretic conditions. I prove that the unpredictability of measurement outcomes (to which constructor theory gives an exact meaning) necessarily arises in superinformation theories. Then I explain how the appearance of stochasticity in (finitely many) repeated measurements can arise under superinformation theories. And I establish sufficient conditions for a superinformation theory to inform decisions (made under it) as if it were probabilistic, via a Deutsch–Wallace-type argument—thus defining a class of decision-supporting superinformation theories. This broadens the domain of applicability of that argument to cover constructor-theory compliant theories. In addition, in this version some of the argument's assumptions, previously construed as merely decision-theoretic, follow from physical properties expressed by constructor-theoretic principles. PMID:27616914

  13. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  14. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  15. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axiomatic…

  16. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  17. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self-study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edition…

  18. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    …objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation…

  19. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    The theoretical literature has a rich characterization of scoring rules for eliciting the subjective beliefs that an individual has for continuous events, but under the restrictive assumption of risk neutrality. It is well known that risk aversion can dramatically affect the incentives to correctly report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can "calibrate" inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have to distort reports. We characterize the comparable implications of the general case of a risk averse agent when facing a popular scoring rule over continuous events, and find that these concerns do not apply with anything like the same force. For empirically plausible levels of risk aversion, one can…
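    To make the incentive problem above concrete, here is a minimal sketch (my own illustration, not the authors' code; the payoff scale, the CRRA utility family, and the grid search are all assumptions) of the report that maximizes expected utility under a quadratic scoring rule for a binary event. Risk neutrality recovers the true belief; concave utility pulls the optimal report toward 0.5.

        import numpy as np

        def qsr_payoffs(r):
            # Quadratic scoring rule for a binary event (hypothetical stakes):
            # pay 1 - (1 - r)^2 if the event occurs, 1 - r^2 if it does not.
            return 1 - (1 - r) ** 2, 1 - r ** 2

        def optimal_report(p, crra):
            # Report maximizing expected CRRA utility when the true belief is p.
            u = (lambda x: np.log(x)) if crra == 1 else (lambda x: x ** (1 - crra) / (1 - crra))
            grid = np.linspace(0.01, 0.99, 981)
            win, lose = qsr_payoffs(grid)
            eu = p * u(win + 0.01) + (1 - p) * u(lose + 0.01)  # shift keeps payoffs positive
            return grid[np.argmax(eu)]

        for crra in (0.0, 0.3, 0.6):   # 0 = risk neutral
            print(crra, optimal_report(0.8, crra))
        # crra = 0 recovers the true belief 0.8; larger risk aversion moves the
        # optimal report toward 0.5, the distortion that calibration corrects.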

  20. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  1. A basic course in probability theory

    CERN Document Server

    Bhattacharya, Rabi

    2016-01-01

    This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added and basic theory has been expanded. General Markov dependent sequences and their convergence to equilibrium is the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and/or concentration inequalities ranging from those of Chebyshev, Cramer–Chernoff, Bahadur–Rao, to Hoeffding have been added, with illustrative comparisons of thei...
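    Since the abstract highlights concentration inequalities, here is a quick check of Hoeffding's bound, P(|S_n/n - mu| >= t) <= 2 exp(-2 n t^2) for i.i.d. variables in [0, 1], against simulation (a sketch of my own; the Bernoulli example and sample sizes are arbitrary choices):

        import numpy as np

        rng = np.random.default_rng(0)
        n, t, trials = 100, 0.1, 100_000
        # Empirical tail probability for the mean of n Bernoulli(0.5) draws.
        means = rng.binomial(1, 0.5, size=(trials, n)).mean(axis=1)
        empirical = np.mean(np.abs(means - 0.5) >= t)
        bound = 2 * np.exp(-2 * n * t ** 2)
        print(empirical, bound)   # empirical tail sits well below the bound (about 0.27)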

  2. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables. The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vectors…

  3. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    2017-01-01

    …significantly due to risk aversion. We characterize an approach for eliciting the entire subjective belief distribution that is minimally biased due to risk aversion. We offer simulated examples to demonstrate the intuition of our approach. We also provide theory to formally characterize our framework. And we provide experimental evidence which corroborates our theoretical results. We conclude that for empirically plausible levels of risk aversion, one can reliably elicit most important features of the latent subjective belief distribution without undertaking calibration for risk attitudes, providing one…

  4. Foundations of the theory of probability

    CERN Document Server

    Kolmogorov, AN

    2018-01-01

    This famous little book remains a foundational text for the understanding of probability theory, important both to students beginning a serious study of probability and to historians of modern mathematics. 1956 second edition.

  5. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers’ theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises’ exceptionally restrictive definition of probability. This paper challenges Richard von Mises’ definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that the nature of human action and the relative frequency method for calculating numerical probabilities both presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  6. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions were suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert space with dimension larger than two. If measurement contexts are included into the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.

  7. Chance, determinism and the classical theory of probability.

    Science.gov (United States)

    Vasudevan, Anubav

    2018-02-01

    This paper situates the metaphysical antinomy between chance and determinism in the historical context of some of the earliest developments in the mathematical theory of probability. Since Hacking's seminal work on the subject, it has been a widely held view that the classical theorists of probability were guilty of an unwitting equivocation between a subjective, or epistemic, interpretation of probability, on the one hand, and an objective, or statistical, interpretation, on the other. While there is some truth to this account, I argue that the tension at the heart of the classical theory of probability is not best understood in terms of the duality between subjective and objective interpretations of probability. Rather, the apparent paradox of chance and determinism, when viewed through the lens of the classical theory of probability, manifests itself in a much deeper ambivalence on the part of the classical probabilists as to the rational commensurability of causal and probabilistic reasoning.

  8. Single Trial Probability Applications: Can Subjectivity Evade Frequency Limitations?

    Directory of Open Access Journals (Sweden)

    David Howden

    2009-10-01

    Frequency probability theorists define an event’s probability distribution as the limit of a repeated set of trials belonging to a homogeneous collective. The subsets of this collective are events which we have deficient knowledge about on an individual level, although for the larger collective we have knowledge of its aggregate behavior. Hence, probabilities can only be achieved through repeated trials of these subsets, arriving at the established frequencies that define the probabilities. Crovelli (2009) argues that this is a mistaken approach, and that a subjective assessment of individual trials should be used instead. Bifurcating between the two concepts of risk and uncertainty, Crovelli first asserts that probability is the tool used to manage uncertain situations, and then attempts to rebuild a definition of probability theory with this in mind. We show that such an attempt has little to gain, and results in an indeterminate application of entrepreneurial forecasting to uncertain decisions—a process far removed from any application of probability theory.

  9. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling.

  10. Introduction to probability and measure theories

    International Nuclear Information System (INIS)

    Partasarati, K.

    1983-01-01

    Chapters on probability and measure theories are presented. Borel mappings of measure spaces into each other and into separable metric spaces are studied. The Kolmogorov theorem on the extension of probabilities is derived from the theorem on the extension of measures to projective limits of measure spaces. Integration theory is developed, and measures on products of spaces are studied. The theory of conditional mathematical expectations via projections in Hilbert space is presented. In conclusion, the theory of weak convergence of measures, elements of the theory of characteristic functions, and the theory of invariant and quasi-invariant measures on groups and homogeneous spaces are given.

  11. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

    ""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  12. Surprisingly rational: probability theory plus noise explains biases in judgment.

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2014-07-01

    The systematic biases seen in people's probability judgments are typically taken as evidence that people do not use the rules of probability theory when reasoning about probability but instead use heuristics, which sometimes yield reasonable judgments and sometimes yield systematic biases. This view has had a major impact in economics, law, medicine, and other fields; indeed, the idea that people cannot reason with probabilities has become a truism. We present a simple alternative to this view, where people reason about probability according to probability theory but are subject to random variation or noise in the reasoning process. In this account the effect of noise is canceled for some probabilistic expressions. Analyzing data from 2 experiments, we find that, for these expressions, people's probability judgments are strikingly close to those required by probability theory. For other expressions, this account produces systematic deviations in probability estimates. These deviations explain 4 reliable biases in human probabilistic reasoning (conservatism, subadditivity, conjunction, and disjunction fallacies). These results suggest that people's probability judgments embody the rules of probability theory and that biases in those judgments are due to the effects of random noise.
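    The cancellation the authors describe can be mimicked in a few lines. The sketch below is my reading of the abstract, not the authors' code: each probability is estimated from samples that are individually misread with probability d, so estimates of p average (1-2d)p + d (regression toward 0.5), yet the biases cancel in the expression P(A) + P(B) - P(A and B) - P(A or B), whose normative value is 0.

        import numpy as np

        rng = np.random.default_rng(1)

        def noisy_estimate(p, d=0.1, m=200):
            # Estimate p from m samples, each misread with probability d.
            hits = rng.random(m) < p
            read = np.where(rng.random(m) < d, ~hits, hits)
            return read.mean()

        avg = lambda p: np.mean([noisy_estimate(p) for _ in range(2000)])
        pA, pB = 0.7, 0.4
        pAB = pA * pB                    # take A and B independent, for the demo
        pAorB = pA + pB - pAB
        print(avg(pA))                   # about (1 - 2d)*0.7 + d = 0.66: conservatism
        print(avg(pA) + avg(pB) - avg(pAB) - avg(pAorB))   # about 0: the noise cancels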

  13. Causal inference, probability theory, and graphical insights.

    Science.gov (United States)

    Baker, Stuart G

    2013-11-10

    Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design.

  14. Eliciting Subjective Probability Distributions with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    2015-01-01

    We test in a laboratory experiment the theoretical prediction that risk attitudes have a surprisingly small role in distorting reports from true belief distributions. We find evidence consistent with theory in our experiment…

  15. Probability and logical structure of statistical theories

    International Nuclear Information System (INIS)

    Hall, M.J.W.

    1988-01-01

    A characterization of statistical theories is given which incorporates both classical and quantum mechanics. It is shown that each statistical theory induces an associated logic and joint probability structure, and simple conditions are given for the structure to be of a classical or quantum type. This provides an alternative to the quantum logic approach to axiomatic quantum mechanics. The Bell inequalities may be derived for those statistical theories that have a classical structure and satisfy a locality condition weaker than factorizability. The relation of these inequalities to the issue of hidden variable theories for quantum mechanics is discussed and clarified.

  16. Concurrency meets probability: theory and practice (abstract)

    NARCIS (Netherlands)

    Katoen, Joost P.

    Treating random phenomena in concurrency theory has a long tradition. Petri nets [18, 10] and process algebras [14] have been extended with probabilities. The same applies to behavioural semantics such as strong and weak (bi)simulation [1], and testing pre-orders [5]. Beautiful connections between…

  17. Independent events in elementary probability theory

    Science.gov (United States)

    Csenki, Attila

    2011-07-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): if the n events E1, E2, …, En are jointly independent, then any two events A and B built in finitely many steps from two disjoint subsets of E1, E2, …, En are also independent. The operations 'union', 'intersection' and 'complementation' are permitted only when forming the events A and B. Here we examine this statement from the point of view of elementary probability theory. The approach described here is accessible also to users of probability theory and is believed to be novel.
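    The quoted statement is easy to check on a small example. The sketch below (my own illustration; the three-coin space and the particular events are arbitrary choices) builds A from {E1, E2} by union and complementation, takes B = E3, and verifies the product rule numerically.

        from itertools import product

        omega = list(product((0, 1), repeat=3))   # 8 equally likely outcomes
        P = lambda ev: sum(1 for w in omega if ev(w)) / len(omega)

        # E1, E2, E3 are jointly independent coordinate events ("bit i is 1").
        E1, E2, E3 = (lambda w: w[0] == 1), (lambda w: w[1] == 1), (lambda w: w[2] == 1)
        A = lambda w: E1(w) or not E2(w)          # built from {E1, E2}
        B = E3                                    # built from {E3}

        print(P(lambda w: A(w) and B(w)), P(A) * P(B))   # both 0.375: A, B independent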

  18. Probability Theory Plus Noise: Descriptive Estimation and Inferential Judgment.

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2018-01-01

    We describe a computational model of two central aspects of people's probabilistic reasoning: descriptive probability estimation and inferential probability judgment. This model assumes that people's reasoning follows standard frequentist probability theory, but it is subject to random noise. This random noise has a regressive effect in descriptive probability estimation, moving probability estimates away from normative probabilities and toward the center of the probability scale. This random noise has an anti-regressive effect in inferential judgement, however. These regressive and anti-regressive effects explain various reliable and systematic biases seen in people's descriptive probability estimation and inferential probability judgment. This model predicts that these contrary effects will tend to cancel out in tasks that involve both descriptive estimation and inferential judgement, leading to unbiased responses in those tasks. We test this model by applying it to one such task, described by Gallistel et al. Participants' median responses in this task were unbiased, agreeing with normative probability theory over the full range of responses. Our model captures the pattern of unbiased responses in this task, while simultaneously explaining systematic biases away from normatively correct probabilities seen in other tasks.

  19. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author included new material on the probability of large deviations, and on the central limit theorem for sums of dependent random variables.

  20. Bayesian probability theory and inverse problems

    International Nuclear Information System (INIS)

    Kopec, S.

    1994-01-01

    Bayesian probability theory is applied to the approximate solution of inverse problems. In order to solve the moment problem with noisy data, the entropic prior is used. The expressions for the solution and its error bounds are presented. When the noise level tends to zero, the Bayesian solution tends to the classic maximum entropy solution in the L2 norm. The way of using a spline prior is also shown. (author)
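    For reference, the classic maximum entropy solution that the Bayesian answer approaches in the zero-noise limit can be sketched for a simple moment problem (my own illustration; the six-point support and the target mean are arbitrary): maximize entropy subject to a prescribed mean, which gives an exponential family p_i proportional to exp(lam * x_i), with the multiplier found by bisection.

        import numpy as np

        x = np.arange(1, 7)               # support of the unknown distribution

        def maxent_given_mean(target, lo=-5.0, hi=5.0):
            # Bisection on the Lagrange multiplier lam in p_i ~ exp(lam * x_i);
            # the mean p @ x is increasing in lam.
            for _ in range(100):
                lam = (lo + hi) / 2
                p = np.exp(lam * x)
                p /= p.sum()
                lo, hi = (lam, hi) if p @ x < target else (lo, lam)
            return p

        p = maxent_given_mean(4.5)        # a noisy-data version would add a likelihood
        print(p.round(4), (p @ x).round(3))   # tilted toward high values, mean 4.5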

  1. A Challenge to Ludwig von Mises’s Theory of Probability

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2010-10-01

    The most interesting and completely overlooked aspect of Ludwig von Mises’s theory of probability is the total absence of any explicit definition for probability in his theory. This paper examines Mises’s theory of probability in light of the fact that his theory possesses no definition for probability. It is argued, first, that Mises’s theory differs in important respects from his brother’s famous theory of probability. A defense of the subjective definition for probability is then provided, which is subsequently used to critique Ludwig von Mises’s theory. It is argued that only the subjective definition for probability comports with Mises’s other philosophical positions. Since Mises did not provide an explicit definition for probability, it is suggested that he ought to have adopted a subjective definition.

  2. Statistics and Probability Theory In Pursuit of Engineering Decision Support

    CERN Document Server

    Faber, Michael Havbro

    2012-01-01

    This book provides the reader with the basic skills and tools of statistics and probability in the context of engineering modeling and analysis. The emphasis is on the application and the reasoning behind the application of these skills and tools for the purpose of enhancing decision making in engineering. The purpose of the book is to ensure that the reader acquires the required theoretical basis and technical skills to feel comfortable with the theory of basic statistics and probability. Moreover, in this book, as opposed to many standard books on the same subject, the perspective is to focus on the use of the theory for the purpose of engineering model building and decision making. This work is suitable for readers with little or no prior knowledge of statistics and probability.

  3. Subjective Illness theory and coping

    Directory of Open Access Journals (Sweden)

    Gessmann H.-W.

    2015-03-01

    The article presents a view of the problem of subjective illness theory in the context of coping behavior. It compiles the results of the latest studies of coping; discloses the way subjective illness theory affects illness coping and the patient's health; and presents a study of differences in the coping behavior of patients at risk of heart attack and of oncology patients. The article is recommended for specialists concerned with the psychological causes of pathogenic processes and the coping strategies of patients.

  4. School and conference on probability theory

    International Nuclear Information System (INIS)

    Lawler, G.F.

    2004-01-01

    This volume includes expanded lecture notes from the School and Conference in Probability Theory held at ICTP in May, 2001. Probability theory is a very large area, too large for a single school and conference. The organizers, G. Lawler, C. Newman, and S. Varadhan chose to focus on a number of active research areas that have their roots in statistical physics. The pervasive theme in these lectures is trying to find the large time or large space behaviour of models defined on discrete lattices. Usually the definition of the model is relatively simple: either assigning a particular weight to each possible configuration (equilibrium statistical mechanics) or specifying the rules under which the system evolves (nonequilibrium statistical mechanics). Interacting particle systems is the area of probability that studies the evolution of particles (either finite or infinite in number) under random motions. The evolution of particles depends on the positions of the other particles; often one assumes that it depends only on the particles that are close to the particular particle. Thomas Liggett's lectures give an introduction to this very large area. Claudio Landim follows up by discussing hydrodynamic limits of particle systems. The goal of this area is to describe the long time, large system size dynamics in terms of partial differential equations. The area of random media is concerned with the properties of materials or environments that are not homogeneous. Percolation theory studies one of the simplest stated models for impurities - taking a lattice and removing some of the vertices or bonds. Luiz Renato G. Fontes and Vladas Sidoravicius give a detailed introduction to this area. Random walk in random environment combines two sources of randomness - a particle performing stochastic motion in which the transition probabilities depend on position and have been chosen from some probability distribution. Alain-Sol Sznitman gives a survey of recent developments in this…

  5. Subjective Probabilities for State-Dependent Continuous Utility

    NARCIS (Netherlands)

    P.P. Wakker (Peter)

    1987-01-01

    For the expected utility model with state dependent utilities, Karni, Schmeidler and Vind (1983) have shown how to recover uniquely the involved subjective probabilities if the preferences, contingent on a hypothetical probability distribution over the state space, are known. This they…

  6. Using Fuzzy Probability Weights in Cumulative Prospect Theory

    Directory of Open Access Journals (Sweden)

    Užga-Rebrovs Oļegs

    2016-12-01

    In recent years, rapid growth has been seen in descriptive approaches to decision choice. As opposed to normative expected utility theory, these approaches are based on individuals' subjective perception of probabilities, which takes place in real situations of risky choice. Perceptions of this kind are modelled by probability weighting functions. In cumulative prospect theory, which is the focus of this paper, decision prospect outcome weights are calculated using the obtained probability weights. If the value functions are constructed on the sets of positive and negative outcomes, then, based on the outcome value evaluations and outcome decision weights, generalised evaluations of prospect value are calculated, which are the basis for choosing an optimal prospect.
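    A minimal sketch of the machinery described here, using the standard Tversky-Kahneman functional forms rather than anything specific to this paper (the parameter values 0.61 and 0.88 are theirs, and the gains-only prospect is an arbitrary example):

        def w(p, gamma=0.61):
            # Inverse-S probability weighting function for gains.
            return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

        def v(x, alpha=0.88):
            # Value function on gains, measured from the reference point.
            return x ** alpha

        def cpt_value(outcomes, probs):
            # Rank-dependent evaluation of a gains-only prospect.
            ranked = sorted(zip(outcomes, probs), key=lambda t: -t[0])  # best first
            total, cum = 0.0, 0.0
            for x, p in ranked:
                total += (w(cum + p) - w(cum)) * v(x)   # cumulative decision weight
                cum += p
            return total

        print(cpt_value([100, 0], [0.05, 0.95]))   # the 5% chance of 100 is overweighted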

  7. Joseph L Doob and Development of Probability Theory

    Indian Academy of Sciences (India)

    IAS Admin

    …(equivalent to BSc in India) in mathematics at the famous Harvard University… Doob says that the force of economic circumstances got him into probability theory… J L Doob established probability theory as a major discipline of study…

  8. Constructing diagnostic likelihood: clinical decisions using subjective versus statistical probability.

    Science.gov (United States)

    Kinnear, John; Jackson, Ruth

    2017-07-01

    Although physicians are highly trained in the application of evidence-based medicine, and are assumed to make rational decisions, there is evidence that their decision making is prone to biases. One of the biases that has been shown to affect accuracy of judgements is that of representativeness and base-rate neglect, where the saliency of a person's features leads to overestimation of their likelihood of belonging to a group. This results in the substitution of 'subjective' probability for statistical probability. This study examines clinicians' propensity to make estimations of subjective probability when presented with clinical information that is considered typical of a medical condition. The strength of the representativeness bias is tested by presenting choices in textual and graphic form. Understanding of statistical probability is also tested by omitting all clinical information. For the questions that included clinical information, 46.7% and 45.5% of clinicians made judgements of statistical probability, respectively. Where the question omitted clinical information, 79.9% of clinicians made a judgement consistent with statistical probability. There was a statistically significant difference in responses to the questions with and without representativeness information (χ² (1, n=254) = 54.45, p < 0.001). One of the causes for this representativeness bias may be the way clinical medicine is taught, where stereotypic presentations are emphasised in diagnostic decision making.
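    The statistical probability that the study contrasts with 'subjective' probability is just Bayes' theorem applied with the base rate. A toy calculation (all numbers invented for illustration) shows how a rare condition stays improbable even when the presentation looks typical:

        base_rate = 0.01         # P(disease) in the relevant population
        sensitivity = 0.90       # P(typical features | disease)
        false_positive = 0.20    # P(typical features | no disease)

        # Bayes' theorem: P(disease | typical features)
        posterior = (sensitivity * base_rate) / (
            sensitivity * base_rate + false_positive * (1 - base_rate)
        )
        print(round(posterior, 3))   # 0.043: far below what representativeness suggests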

  9. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  10. Tests of Cumulative Prospect Theory with graphical displays of probability

    Directory of Open Access Journals (Sweden)

    Michael H. Birnbaum

    2008-10-01

    Recent research reported evidence that contradicts cumulative prospect theory and the priority heuristic. The same body of research also violates two editing principles of original prospect theory: cancellation (the principle that people delete any attribute that is the same in both alternatives before deciding between them) and combination (the principle that people combine branches leading to the same consequence by adding their probabilities). This study was designed to replicate previous results and to test whether the violations of cumulative prospect theory might be eliminated or reduced by using formats for presentation of risky gambles in which cancellation and combination could be facilitated visually. Contrary to the idea that decision behavior contradicting cumulative prospect theory and the priority heuristic would be altered by use of these formats, however, data with two new graphical formats as well as fresh replication data continued to show the patterns of evidence that violate cumulative prospect theory, the priority heuristic, and the editing principles of combination and cancellation. Systematic violations of restricted branch independence also contradicted predictions of 'stripped' prospect theory (subjectively weighted additive utility without the editing rules).

  11. Lady luck the theory of probability

    CERN Document Server

    Weaver, Warren

    1982-01-01

    ""Should I take my umbrella?"" ""Should I buy insurance?"" ""Which horse should I bet on?"" Every day ― in business, in love affairs, in forecasting the weather or the stock market questions arise which cannot be answered by a simple ""yes"" or ""no."" Many of these questions involve probability. Probabilistic thinking is as crucially important in ordinary affairs as it is in the most abstruse realms of science. This book is the best nontechnical introduction to probability ever written. Its author, the late Dr. Warren Weaver, was a professor of mathematics, active in the Rockefeller and Sloa

  12. Independent Events in Elementary Probability Theory

    Science.gov (United States)

    Csenki, Attila

    2011-01-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E1, …

  13. On a paradox of probability theory

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard's proposal concerning physical retrocausality has been shown to fail on two crucial points. However, it is argued that his proposal still merits serious attention. The argument arises from showing that his proposal reveals a paradox involving relations between conditional probabilities, statistical correlations and reciprocal causalities of the type exhibited by cooperative dynamics in physical systems. 4 refs. (Author)

  14. Subjective Expected Utility Theory without States of the World

    OpenAIRE

    Edi Karni

    2005-01-01

    This paper develops an axiomatic theory of decision making under uncertainty that dispenses with the state space. The results are subjective expected utility models with unique, action-dependent, subjective probabilities, and a utility function defined over wealth-effect pairs that is unique up to positive linear transformation.

  15. Probability Theory, Not the Very Guide of Life

    Science.gov (United States)

    Juslin, Peter; Nilsson, Hakan; Winman, Anders

    2009-01-01

    Probability theory has long been taken as the self-evident norm against which to evaluate inductive reasoning, and classical demonstrations of violations of this norm include the conjunction error and base-rate neglect. Many of these phenomena require multiplicative probability integration, whereas people seem more inclined to linear additive…

  16. Probability Estimation in the Framework of Intuitionistic Fuzzy Evidence Theory

    Directory of Open Access Journals (Sweden)

    Yafei Song

    2015-01-01

    Intuitionistic fuzzy (IF) evidence theory, as an extension of the Dempster-Shafer theory of evidence to the intuitionistic fuzzy environment, is exploited to process imprecise and vague information. Since its inception, much interest has been concentrated on IF evidence theory. Many works on belief functions in IF information systems have appeared. Although belief functions on IF sets can deal with uncertainty and vagueness well, they are not convenient for decision making. This paper addresses the issue of probability estimation in the framework of IF evidence theory with the hope of making rational decisions. Background knowledge about evidence theory, fuzzy sets, and IF sets is first reviewed, followed by an introduction to IF evidence theory. Axiomatic properties of probability distributions are then proposed to assist our interpretation. Finally, probability estimations based on fuzzy and IF belief functions, together with their proofs, are presented. It is verified that the probability estimation method based on IF belief functions is also potentially applicable to classical evidence theory and fuzzy evidence theory. Moreover, IF belief functions can be combined in a convenient way once they are transformed to interval-valued possibilities.
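    As background for the probability estimation discussed here, the classical (non-fuzzy) Dempster-Shafer baseline is the pignistic transform, BetP(x) = sum over focal sets A containing x of m(A)/|A|. The sketch below implements only that classical transform, with an invented mass assignment; the paper's intuitionistic fuzzy method extends this idea.

        # Pignistic probability from a classical Dempster-Shafer mass function.
        mass = {frozenset({'a'}): 0.4,
                frozenset({'a', 'b'}): 0.3,
                frozenset({'a', 'b', 'c'}): 0.3}

        def pignistic(mass):
            frame = set().union(*mass)               # frame of discernment
            return {x: sum(m / len(A) for A, m in mass.items() if x in A)
                    for x in sorted(frame)}

        print(pignistic(mass))   # {'a': 0.65, 'b': 0.25, 'c': 0.1}, sums to 1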

  17. A Short History of Probability Theory and Its Applications

    Science.gov (United States)

    Debnath, Lokenath; Basu, Kanadpriya

    2015-01-01

    This paper deals with a brief history of probability theory and its applications to Jacob Bernoulli's famous law of large numbers and theory of errors in observations or measurements. Included are the major contributions of Jacob Bernoulli and Laplace. It is written to pay the tricentennial tribute to Jacob Bernoulli, since the year 2013…

  18. Quantum interference of probabilities and hidden variable theories

    International Nuclear Information System (INIS)

    Srinivas, M.D.

    1984-01-01

    One of the fundamental contributions of Louis de Broglie, which does not get cited often, has been his analysis of the basic difference between the calculus of the probabilities as predicted by quantum theory and the usual calculus of probabilities - the one employed by most mathematicians, in its standard axiomatised version due to Kolmogorov. This paper is basically devoted to a discussion of the 'quantum interference of probabilities', discovered by de Broglie. In particular, it is shown that it is this feature of the quantum theoretic probabilities which leads to some serious constraints on the possible 'hidden-variable formulations' of quantum mechanics, including the celebrated theorem of Bell. (Auth.)

  19. Hamiltonian theories quantization based on a probability operator

    International Nuclear Information System (INIS)

    Entral'go, E.E.

    1986-01-01

    A quantization method based on a linear mapping of classical coordinate-momentum-time functions Λ(q,p,t) to quantum operators in a space of quantum states ψ is considered. The probability operator satisfies a system of equations representing the principles of dynamical and canonical correspondence between the classical and quantum theories. Quantization based on a probability operator leads to a quantum theory with a nonnegative joint coordinate-momentum distribution function for any state ψ. The main consequences of quantum mechanics with a probability operator are discussed in comparison with the generally accepted quantum and classical theories. It is shown that a probability operator leads to the appearance of some new notions, called ''subquantum'' ones. Hence the quantum theory with a probability operator does not pretend to a complete description of physical reality in terms of classical variables and for this reason contains no problems like the Einstein-Podolsky-Rosen paradox. The results for some concrete problems are given: a free particle, a harmonic oscillator, an electron in the Coulomb field. These results give hope for the possibility of an experimental verification of the quantization based on a probability operator.

  20. Models for probability and statistical inference theory and applications

    CERN Document Server

    Stapleton, James H

    2007-01-01

    This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readers. Models for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping. Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo…

  1. Bayesian probability theory applications in the physical sciences

    CERN Document Server

    Linden, Wolfgang von der; Toussaint, Udo von

    2014-01-01

    From the basics to the forefront of modern research, this book presents all aspects of probability theory, statistics and data analysis from a Bayesian perspective for physicists and engineers. The book presents the roots, applications and numerical implementation of probability theory, and covers advanced topics such as maximum entropy distributions, stochastic processes, parameter estimation, model selection, hypothesis testing and experimental design. In addition, it explores state-of-the art numerical techniques required to solve demanding real-world problems. The book is ideal for students and researchers in physical sciences and engineering.

  2. Probability and information theory, with applications to radar

    CERN Document Server

    Woodward, P M; Higinbotham, W

    1964-01-01

    Electronics and Instrumentation, Second Edition, Volume 3: Probability and Information Theory with Applications to Radar provides information pertinent to developments in research carried out in electronics and applied physics. This book presents the established mathematical techniques that provide the code in which so much of the mathematical theory of electronics and radar is expressed. Organized into eight chapters, this edition begins with an overview of the geometry of probability distributions, in which moments play a significant role. This text then examines the mathematical methods in

  3. Reliability of structures by using probability and fatigue theories

    International Nuclear Information System (INIS)

    Lee, Ouk Sub; Kim, Dong Hyeok; Park, Yeon Chang

    2008-01-01

    Methodologies to calculate the failure probability and to estimate the reliability of fatigue-loaded structures are developed. The applicability of the methodologies is evaluated with the help of the fatigue crack growth models suggested by Paris and Walker. Probability-based methods such as the FORM (first-order reliability method), the SORM (second-order reliability method) and MCS (Monte Carlo simulation) are utilized. It is found that the failure probability decreases with the increase of the design fatigue life and the applied minimum stress, and with the decrease of the initial edge crack size, the applied maximum stress and the slope of the Paris equation. Furthermore, according to the sensitivity analysis of random variables, the slope of the Paris equation affects the failure probability dominantly among the random variables in the Paris and the Walker models
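
    As a hedged illustration of the MCS approach named above, the following sketch propagates uncertain crack-growth inputs through the closed-form Paris-law life integral and counts design-life exceedances. All parameter values and distributions are invented for illustration; the paper's actual inputs are not given in the abstract.

```python
# Minimal Monte Carlo sketch of a fatigue failure probability, assuming a
# Paris-law crack growth model da/dN = C * (dK)^m with dK = Y * ds * sqrt(pi*a).
# All numbers below are illustrative assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

C = rng.lognormal(np.log(5e-12), 0.3, n)   # Paris coefficient [(m/cycle)/(MPa*sqrt(m))^m]
m = 3.0                                    # Paris exponent (slope)
a0 = rng.lognormal(np.log(1e-3), 0.2, n)   # initial edge crack size [m]
ds = rng.normal(100.0, 10.0, n)            # stress range [MPa]
ac, Y = 0.02, 1.12                         # critical crack size [m], geometry factor

# Cycles to grow the crack from a0 to ac: closed form of the Paris integral (m != 2)
Nf = (ac**(1 - m/2) - a0**(1 - m/2)) / ((1 - m/2) * C * (Y * ds * np.sqrt(np.pi))**m)

N_design = 5e5                             # design fatigue life [cycles]
print("estimated failure probability:", np.mean(Nf < N_design))
```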

  4. On Dobrushin's way from probability theory to statistical physics

    CERN Document Server

    Minlos, R A; Suhov, Yu M; Suhov, Yu

    2000-01-01

    R. Dobrushin worked in several branches of mathematics (probability theory, information theory), but his deepest influence was on mathematical physics. He was one of the founders of the rigorous study of statistical physics. When Dobrushin began working in that direction in the early sixties, only a few people worldwide were thinking along the same lines. Now there is an army of researchers in the field. This collection is devoted to the memory of R. L. Dobrushin. The authors who contributed to this collection knew him quite well and were his colleagues. The title, "On Dobrushin's Way", is mea

  5. Probability theory versus simulation of petroleum potential in play analysis

    Science.gov (United States)

    Crovelli, R.A.

    1987-01-01

    An analytic probabilistic methodology for resource appraisal of undiscovered oil and gas resources in play analysis is presented. This play-analysis methodology is a geostochastic system for petroleum resource appraisal in explored as well as frontier areas. An objective was to replace an existing Monte Carlo simulation method in order to increase the efficiency of the appraisal process. Underlying the two methods is a single geologic model which considers both the uncertainty of the presence of the assessed hydrocarbon and its amount if present. The results of the model are resource estimates of crude oil, nonassociated gas, dissolved gas, and gas for a geologic play in terms of probability distributions. The analytic method is based upon conditional probability theory and a closed form solution of all means and standard deviations, along with the probabilities of occurrence. © 1987 J.C. Baltzer A.G., Scientific Publishing Company.
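
    To make the analytic-versus-simulation comparison concrete, here is a minimal sketch of such a presence-times-amount model: the closed-form mean and variance follow from conditional probability, and a Monte Carlo run reproduces them. The Bernoulli-lognormal structure and all numbers are illustrative assumptions, not Crovelli's actual model.

```python
# Presence-times-amount resource model: the play is productive with probability
# p, and the amount (if present) is lognormal. Closed-form moments via
# conditional probability are checked against Monte Carlo. Illustrative values.
import numpy as np

p = 0.3                                  # probability the play is productive
mu, sigma = np.log(50.0), 0.8            # lognormal parameters of the amount

# Closed-form mean and variance from conditional probability theory
m1 = np.exp(mu + sigma**2 / 2)           # E[amount | present]
m2 = np.exp(2 * mu + 2 * sigma**2)       # E[amount^2 | present]
mean_analytic = p * m1
var_analytic = p * m2 - (p * m1)**2

# Monte Carlo check
rng = np.random.default_rng(1)
present = rng.random(1_000_000) < p
amount = rng.lognormal(mu, sigma, 1_000_000)
r = present * amount
print(mean_analytic, r.mean())           # should agree closely
print(var_analytic, r.var())
```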

  6. A subjective utilitarian theory of moral judgment.

    Science.gov (United States)

    Cohen, Dale J; Ahn, Minwoo

    2016-10-01

    Current theories hypothesize that moral judgments are difficult because rational and emotional decision processes compete. We present a fundamentally different theory of moral judgment: the Subjective Utilitarian Theory of moral judgment. The Subjective Utilitarian Theory posits that people try to identify and save the competing item with the greatest "personal value." Moral judgments become difficult only when the competing items have similar personal values. In Experiment 1, we estimate the personal values of 104 items. In Experiments 2-5, we show that the distributional overlaps of the estimated personal values account for over 90% of the variance in reaction times (RTs) and response choices in a moral judgment task. Our model fundamentally restructures our understanding of moral judgments from a competition between decision processes to a competition between similarly valued items. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
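
    As a hedged illustration of the core idea, choice and difficulty can be read off the overlap of two Gaussian value distributions; this toy calculation is not the authors' fitted model, and all parameters are invented.

```python
# Toy overlap model: if the two items' personal values are Gaussian, the
# probability of saving item A and a crude difficulty proxy follow from the
# distributional overlap. Purely illustrative, not the paper's fitted model.
import numpy as np
from scipy.stats import norm

def choice(mu_a, sd_a, mu_b, sd_b):
    z = (mu_a - mu_b) / np.hypot(sd_a, sd_b)
    p_a = norm.cdf(z)                  # P(value_A > value_B)
    difficulty = 1 - abs(2 * p_a - 1)  # near 1 when the values overlap heavily
    return p_a, difficulty

print(choice(5.0, 1.0, 3.0, 1.0))      # well-separated values: easy judgment
print(choice(5.0, 1.0, 4.8, 1.0))      # similar values: difficult judgment
```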

  7. The Misapplication of Probability Theory in Quantum Mechanics

    Science.gov (United States)

    Racicot, Ronald

    2014-03-01

    This article is a revision of two papers submitted to the APS in the past two and a half years. In these papers, arguments and proofs are summarized for the following: (1) The wrong conclusion by EPR that Quantum Mechanics is incomplete, perhaps requiring the addition of ``hidden variables'' for completion. Theorems that assume such ``hidden variables,'' such as Bell's theorem, are also wrong. (2) Quantum entanglement is not a realizable physical phenomenon and is based entirely on assuming a probability superposition model for quantum spin. Such a model directly violates conservation of angular momentum. (3) Simultaneous multiple paths followed by a quantum particle traveling through space also cannot possibly exist. Besides violating Noether's theorem, the multiple-paths theory is based solely on probability calculations. Probability calculations by themselves cannot possibly represent simultaneous physically real events. None of the reviews of the submitted papers actually refuted the arguments and evidence presented. These analyses should therefore be carefully evaluated, since the conclusions reached have such an important impact on quantum mechanics and quantum information theory.

  8. Subjective Expected Utility Theory with "Small Worlds"

    DEFF Research Database (Denmark)

    Gyntelberg, Jacob; Hansen, Frank

    which is a more general construction than a state space. We retain preference axioms similar in spirit to the Savage axioms and obtain, without abandoning linearity of expectations, a subjective expected utility theory which allows for an intuitive distinction between risk and uncertainty. We also...

  9. Exaggerated risk: prospect theory and probability weighting in risky choice.

    Science.gov (United States)

    Kusev, Petko; van Schaik, Paul; Ayton, Peter; Dent, John; Chater, Nick

    2009-11-01

    In 5 experiments, we studied precautionary decisions in which participants decided whether or not to buy insurance with specified cost against an undesirable event with specified probability and cost. We compared the risks taken for precautionary decisions with those taken for equivalent monetary gambles. Fitting these data to Tversky and Kahneman's (1992) prospect theory, we found that the weighting function required to model precautionary decisions differed from that required for monetary gambles. This result indicates a failure of the descriptive invariance axiom of expected utility theory. For precautionary decisions, people overweighted small, medium-sized, and moderately large probabilities: they exaggerated risks. This effect is not anticipated by prospect theory or experience-based decision research (Hertwig, Barron, Weber, & Erev, 2004). We found evidence that exaggerated risk is caused by the accessibility of events in memory: The weighting function varies as a function of the accessibility of events. This suggests that people's experiences of events leak into decisions even when risk information is explicitly provided. Our findings highlight a need to investigate how variation in decision content produces variation in preferences for risk.
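
    For reference, the Tversky-Kahneman (1992) one-parameter weighting function used in such fits is easy to compute; the sketch below uses the original gains estimate gamma = 0.61, not the values fitted in this study.

```python
# Tversky & Kahneman's (1992) probability weighting function. gamma = 0.61 is
# the original gains estimate, shown here for illustration only.
import numpy as np

def w(p, gamma=0.61):
    """Inverse-S-shaped weighting: overweights small p, underweights large p."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

for p in [0.01, 0.1, 0.5, 0.9, 0.99]:
    print(f"p = {p:<5} w(p) = {w(p):.3f}")
# Exaggerated risk in precautionary choice corresponds to an elevated weighting
# curve: w(p) > p over small, medium, and moderately large probabilities.
```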

  10. A short course on measure and probability theories

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre

    2004-02-01

    This brief Introduction to Measure Theory, and its applications to Probabilities, corresponds to the lecture notes of a seminar series given at Sandia National Laboratories in Livermore during the spring of 2003. The goal of these seminars was to provide a minimal background to Computational Combustion scientists interested in using more advanced stochastic concepts and methods, e.g., in the context of uncertainty quantification. Indeed, most mechanical engineering curricula do not provide students with formal training in the field of probability, and even less in measure theory. However, stochastic methods have been used more and more extensively in the past decade, and have provided more successful computational tools. Scientists at the Combustion Research Facility of Sandia National Laboratories have been using computational stochastic methods for years. Addressing more and more complex applications, and facing difficult problems that arose in applications, showed the need for a better understanding of the theoretical foundations. This is why the seminar series was launched, and these notes summarize most of the concepts which have been discussed. The goal of the seminars was to bring a group of mechanical engineers and computational combustion scientists to a full understanding of N. Wiener's polynomial chaos theory. Therefore, these lecture notes are built along those lines, and are not intended to be exhaustive. In particular, the author welcomes any comments or criticisms.

  11. The maximum entropy method of moments and Bayesian probability theory

    Science.gov (United States)

    Bretthorst, G. Larry

    2013-08-01

    The problem of density estimation occurs in many disciplines. For example, in MRI it is often necessary to classify the types of tissues in an image. To perform this classification one must first identify the characteristics of the tissues to be classified. These characteristics might be the intensity of a T1-weighted image, and in MRI many other types of characteristic weightings (classifiers) may be generated. In a given tissue type there is no single intensity that characterizes the tissue; rather, there is a distribution of intensities. Often these distributions can be characterized by a Gaussian, but just as often they are much more complicated. Either way, estimating the distribution of intensities is an inference problem. In the case of a Gaussian distribution, one must estimate the mean and standard deviation. However, in the non-Gaussian case the shape of the density function itself must be inferred. Three common techniques for estimating density functions are binned histograms [1, 2], kernel density estimation [3, 4], and the maximum entropy method of moments [5, 6]. In the introduction, the maximum entropy method of moments is reviewed, along with some of its problems and the conditions under which it fails. In later sections, the functional form of the maximum entropy method of moments probability distribution is incorporated into Bayesian probability theory. It is shown that Bayesian probability theory solves all of the problems with the maximum entropy method of moments. One gets posterior probabilities for the Lagrange multipliers, and, finally, one can put error bars on the resulting estimated density function.
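
    A minimal numerical sketch of the maximum entropy method of moments discussed above: the Lagrange multipliers are found by minimizing the convex dual log Z(λ) + λ·μ on a grid. The data, grid, and moment order are illustrative assumptions.

```python
# Maximum entropy method of moments in 1D (illustrative sketch): find
# p(x) ∝ exp(-sum_k lam_k x^k) whose first K moments match the sample moments,
# by minimizing the convex dual  log Z(lam) + lam . mu.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
data = rng.normal(0.3, 1.0, 5000)           # stand-in for image intensities
K = 4
x = np.linspace(-5.0, 5.0, 1001)
dx = x[1] - x[0]
mu = np.array([np.mean(data**k) for k in range(1, K + 1)])
powers = np.stack([x**k for k in range(1, K + 1)])   # shape (K, len(x))

def dual(lam):
    # Partition function on the grid; exponent clipped for numerical safety
    Z = np.exp(np.clip(-lam @ powers, -700.0, 700.0)).sum() * dx
    return np.log(Z) + lam @ mu             # gradient vanishes when moments match

res = minimize(dual, np.zeros(K), method="BFGS")
p = np.exp(-res.x @ powers)
p /= p.sum() * dx                           # normalized maxent density estimate
print("fitted Lagrange multipliers:", np.round(res.x, 3))
```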

  12. On the subjectivity of personality theory.

    Science.gov (United States)

    Atwood, G E; Tomkins, S S

    1976-04-01

    Every theorist of personality views the human condition from the unique perspective of his own individuality. As a consequence, personality theories are strongly influenced by personal and subjective factors. These influences are partially responsible for the present-day lack of consensus in psychology as to basic conceptual frameworks for the study of man. The science of human personality can achieve a greater degree of consensus and generality only if it begins to turn back on itself and question its own psychological foundations. The role of subjective and personal factors in this field can be studied and made more explicit by means of a psychobiographical method which interprets the major ideas of personality theories in the light of the formative experiences in the respective theorists' lives. This method is briefly illustrated by an examination of the influence of personal experiences on theoretical concepts in the work of Carl Jung, Carl Rogers, Wilhelm Reich, and Gordon Allport. The subjective factors disclosed by psychobiographical analysis can be seen to interact with influences stemming from the intellectual and historical context within which the theorists worked. The psychobiographical study of personality theory is only one part of a larger discipline, the psychology of knowledge, which would study the role of subjective and personal factors in the structure of man's knowledge in general.

  13. Perturbation theory and collision probability formalism. Vol. 2

    Energy Technology Data Exchange (ETDEWEB)

    Nasr, M [National Center for Nuclear Safety and Radiation Control, Atomic Energy Authority, Cairo (Egypt)

    1996-03-01

    Perturbation theory is commonly used in evaluating reactivity effects, particularly those resulting from small and localized perturbations in multiplying media, e.g. in small-sample reactivity measurements. The Boltzmann integral transport equation is generally used for evaluating the direct and adjoint fluxes in the heterogeneous lattice cells to be used in the perturbation equations. When applying perturbation theory in this formalism, a term involving the perturbation effects on the spatial transfer kernel arises. This term is difficult to evaluate correctly, since it involves an integration over the entire system. The main advantage of perturbation theory, namely the restriction of the integration procedure to the perturbed region, is thus of no practical use in such cases. In the present work, the perturbation equation in the collision probability formalism is analyzed. A mathematical treatment of the term in question is performed, and a new expression for this term, which can be estimated easily, is derived.

  14. Problems in probability theory, mathematical statistics and theory of random functions

    CERN Document Server

    Sveshnikov, A A

    1979-01-01

    Problem solving is the main thrust of this excellent, well-organized workbook. Suitable for students at all levels in probability theory and statistics, the book presents over 1,000 problems and their solutions, illustrating fundamental theory and representative applications in the following fields: Random Events; Distribution Laws; Correlation Theory; Random Variables; Entropy & Information; Markov Processes; Systems of Random Variables; Limit Theorems; Data Processing; and more.The coverage of topics is both broad and deep, ranging from the most elementary combinatorial problems through lim

  15. High-resolution elastic recoil detection utilizing Bayesian probability theory

    International Nuclear Information System (INIS)

    Neumaier, P.; Dollinger, G.; Bergmaier, A.; Genchev, I.; Goergens, L.; Fischer, R.; Ronning, C.; Hofsaess, H.

    2001-01-01

    Elastic recoil detection (ERD) analysis is improved in view of depth resolution and the reliability of the measured spectra. Good statistics even at low ion fluences are obtained by utilizing a large solid angle of 5 msr at the Munich Q3D magnetic spectrograph and a 40 MeV ¹⁹⁷Au beam. In this way the elemental depth profiles are not essentially altered during analysis, even if distributions with area densities below 1×10¹⁴ atoms/cm² are measured. As the energy spread due to the angular acceptance is fully eliminated by ion-optical and numerical corrections, an accurate and reliable apparatus function is derived. It allows deconvolution of the measured spectra using the adaptive kernel method, a maximum entropy concept in the framework of Bayesian probability theory. In addition, the uncertainty of the reconstructed spectra is quantified. The concepts are demonstrated on ¹³C depth profiles measured on ultra-thin films of tetrahedral amorphous carbon (ta-C). Depth scales of those profiles are given with an accuracy of 1.4×10¹⁵ atoms/cm²

  16. Measuring inequity aversion in a heterogeneous population using experimental decisions and subjective probabilities

    NARCIS (Netherlands)

    Bellemare, C.; Kroger, S.; van Soest, A.H.O.

    2008-01-01

    We combine choice data in the ultimatum game with the expectations of proposers elicited by subjective probability questions to estimate a structural model of decision making under uncertainty. The model, estimated using a large representative sample of subjects from the Dutch population, allows

  17. Probability model for analyzing fire management alternatives: theory and structure

    Science.gov (United States)

    Frederick W. Bratten

    1982-01-01

    A theoretical probability model has been developed for analyzing program alternatives in fire management. It includes submodels or modules for predicting probabilities of fire behavior, fire occurrence, fire suppression, effects of fire on land resources, and financial effects of fire. Generalized "fire management situations" are used to represent actual fire...

  18. Probability theory plus noise: Replies to Crupi and Tentori (2016) and to Nilsson, Juslin, and Winman (2016).

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2016-01-01

    A standard assumption in much of current psychology is that people do not reason about probability using the rules of probability theory but instead use various heuristics or "rules of thumb," which can produce systematic reasoning biases. In Costello and Watts (2014), we showed that a number of these biases can be explained by a model where people reason according to probability theory but are subject to random noise. More importantly, that model also predicted agreement with probability theory for certain expressions that cancel the effects of random noise: Experimental results strongly confirmed this prediction, showing that probabilistic reasoning is simultaneously systematically biased and "surprisingly rational." In their commentaries on that paper, both Crupi and Tentori (2016) and Nilsson, Juslin, and Winman (2016) point to various experimental results that, they suggest, our model cannot explain. In this reply, we show that our probability theory plus noise model can in fact explain every one of the results identified by these authors. This gives a degree of additional support to the view that people's probability judgments embody the rational rules of probability theory and that biases in those judgments can be explained as simply effects of random noise. (PsycINFO Database Record (c) 2015 APA, all rights reserved).
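
    A small simulation sketch of the model as described: noisy reading of stored events biases individual judgments toward 0.5, while the addition-law combination cancels the noise in expectation. The noise level and probabilities below are arbitrary illustrations.

```python
# "Probability theory plus noise" sketch: true probabilities are estimated
# from stored events, but each event is misread with probability d. Individual
# judgments are biased toward 0.5, yet the addition-law combination
# P(A) + P(B) - P(A or B) - P(A and B) cancels the bias in expectation.
import numpy as np

rng = np.random.default_rng(3)
d, n_items, n_trials = 0.15, 200, 10_000    # illustrative noise level

pA, pB, pAB = 0.6, 0.3, 0.2                 # P(A), P(B), P(A and B)

def judge(p_true):
    """Noisy frequency estimate: each stored item flips with probability d."""
    items = rng.random((n_trials, n_items)) < p_true
    flips = rng.random((n_trials, n_items)) < d
    return (items ^ flips).mean(axis=1)

jA, jB = judge(pA), judge(pB)
j_and, j_or = judge(pAB), judge(pA + pB - pAB)

print("bias in P(A):", jA.mean() - pA)                         # approx d*(1 - 2*pA)
print("addition-law combination:", (jA + jB - j_or - j_and).mean())  # approx 0
```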

  19. Theory of overdispersion in counting statistics caused by fluctuating probabilities

    International Nuclear Information System (INIS)

    Semkow, Thomas M.

    1999-01-01

    It is shown that the random Lexis fluctuations of probabilities such as probability of decay or detection cause the counting statistics to be overdispersed with respect to the classical binomial, Poisson, or Gaussian distributions. The generating and the distribution functions for the overdispersed counting statistics are derived. Applications to radioactive decay with detection and more complex experiments are given, as well as distinguishing between the source and background, in the presence of overdispersion. Monte-Carlo verifications are provided
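
    The mechanism is easy to reproduce numerically: mixing a Poisson count over a fluctuating (here gamma-distributed) rate yields a variance above the Poisson value. A hedged sketch with invented parameters:

```python
# Lexis-type overdispersion sketch: if the Poisson rate itself fluctuates
# (gamma-distributed here), counts are negative-binomial and the variance
# exceeds the Poisson value (variance = mean). Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(4)
mean_rate, shape = 50.0, 20.0                    # fluctuating detection rate
rates = rng.gamma(shape, mean_rate / shape, 100_000)
counts = rng.poisson(rates)

print("mean:", counts.mean())                    # ~50
print("variance:", counts.var())                 # ~50 + 50^2/20 = 175 > mean
print("Fano factor:", counts.var() / counts.mean())
```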

  20. Non-equilibrium random matrix theory. Transition probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Pedro, Francisco Gil [Univ. Autonoma de Madrid (Spain). Dept. de Fisica Teorica; Westphal, Alexander [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany). Gruppe Theorie

    2016-06-15

    In this letter we present an analytic method for calculating the transition probability between two random Gaussian matrices with given eigenvalue spectra in the context of Dyson Brownian motion. We show that in the Coulomb gas language, in large N limit, memory of the initial state is preserved in the form of a universal linear potential acting on the eigenvalues. We compute the likelihood of any given transition as a function of time, showing that as memory of the initial state is lost, transition probabilities converge to those of the static ensemble.
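
    For intuition, the underlying Dyson Brownian motion can be discretized directly; the toy Euler scheme below (beta = 2, one common normalization) shows eigenvalues diffusing while repelling through the Coulomb term. It is an illustration only, not the authors' computation.

```python
# Toy Euler discretization of Dyson Brownian motion (beta = 2): eigenvalues
# diffuse while repelling each other through the Coulomb interaction term.
# Normalization conventions vary; this sketch uses one common choice.
import numpy as np

rng = np.random.default_rng(5)
N, dt, steps = 20, 1e-4, 5000
lam = np.sort(rng.normal(0.0, 0.1, N))          # initial spectrum

for _ in range(steps):
    diff = lam[:, None] - lam[None, :]
    np.fill_diagonal(diff, np.inf)              # exclude the i == j term
    drift = (1.0 / diff).sum(axis=1) / N        # Coulomb repulsion
    lam += drift * dt + np.sqrt(dt / N) * rng.normal(size=N)

print("evolved spectrum:", np.round(np.sort(lam), 3))
```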

  2. The first cycle of the reflective pedagogical paradigm implementation in the introduction probability theory course

    Science.gov (United States)

    Julie, Hongki

    2017-08-01

    One purpose of this study was to describe the steps of the teaching and learning process when the teacher in the Introduction to Probability Theory course wanted to teach event probability using the reflective pedagogical paradigm (RPP), and to describe the results achieved by the students. The study consisted of three cycles, but the results presented in this paper are limited to those obtained in the first cycle. The stages conducted by the researcher in the first cycle can be divided into five steps: (1) getting to know the students' context, (2) planning and providing student learning experiences, (3) facilitating students in actions, (4) asking students to make a reflection and (5) evaluating. The study used descriptive qualitative and quantitative methods. The students' learning experiences, actions and reflections are described qualitatively, and the student evaluation results quantitatively. The research subjects were 38 students taking the Introduction to Probability Theory course in class C. From the students' reflections, quite a lot of students were still incomplete in writing down the concepts they had learned and/or imprecise in describing the relationships between those concepts. From the evaluation, 85.29% of the students scored under 7. Examined more deeply, the students' greatest difficulty lay in the horizontal mathematization process. As a result, they had difficulty in performing the vertical mathematization process.

  4. Main factors for fatigue failure probability of pipes subjected to fluid thermal fluctuation

    International Nuclear Information System (INIS)

    Machida, Hideo; Suzuki, Masaaki; Kasahara, Naoto

    2015-01-01

    It is very important to grasp failure probability and failure mode appropriately in order to carry out risk reduction measures at nuclear power plants. To clarify the important factors for the failure probability and failure mode of pipes subjected to fluid thermal fluctuation, failure probability analyses were performed by changing the values of the stress range, stress ratio, stress components and threshold of the stress intensity factor range. The important factors for the failure probability are the stress range, the stress ratio (mean stress condition) and the threshold of the stress intensity factor range. The important factor for the failure mode is the circumferential angle range of the fluid thermal fluctuation. When a large fluid thermal fluctuation acts on the entire circumferential surface of the pipe, the probability of pipe breakage increases, calling for measures to prevent such a failure and reduce the risk to the plant. When the circumferential angle subjected to fluid thermal fluctuation is small, the failure mode of the piping is leakage, and corrective maintenance might be applicable from the viewpoint of risk to the plant. (author)

  5. Sociological theories of subjective well-being

    NARCIS (Netherlands)

    R. Veenhoven (Ruut)

    2009-01-01

    Subjective well-being is no great issue in sociology; the subject is not mentioned in sociological textbooks (a notable exception is Nolan & Lenski, 2004) and is rarely discussed in sociological journals. This absence has many reasons: pragmatic, ideological, and theoretical. To begin

  6. Log-concave Probability Distributions: Theory and Statistical Testing

    DEFF Research Database (Denmark)

    An, Mark Yuing

    1996-01-01

    This paper studies the broad class of log-concave probability distributions that arise in the economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing the differentiability of density functions. Discrete and multivariate distributions are also discussed. We propose simple non-parametric testing procedures for log-concavity. The test statistics are constructed to test one of the two implications of log-concavity: increasing hazard rates and the new-is-better-than-used (NBU) property. The tests for increasing hazard rates are based on normalized spacing of the sample order statistics. The tests for the NBU property fall into the category of Hoeffding's U-statistics...

  7. Higher risk of probable mental emotional disorder in low or severe vision subjects

    Directory of Open Access Journals (Sweden)

    Lutfah Rif’ati

    2012-07-01

    health problem priority in Indonesia. This paper presents an assessment of severe visual impairments related to the risk of MED. Methods: This paper assessed a part of the Basic Health Research (Riskesdas 2007) data. For this assessment, subjects 15 years old or more had their visual acuity measured using the Snellen chart and their mental health status determined using the Self Reporting Questionnaire (SRQ-20). A subject was considered to have probable MED if the subject had a total score of 6 or more on the SRQ. Based on the measure of visual acuity, visual acuity was divided into 3 categories: normal/mild (20/20 to 20/60); low vision (less than 20/60 to 3/60); and blind (less than 3/60 to 0/0). Results: Among 972,989 subjects, 554,886 were aged 15 years or older. 11.4% of the subjects had probable MED. The prevalence of low vision and blindness was 5.1% and 0.9%, respectively. Compared to subjects with normal or mild visual impairments, subjects with low vision had a 74% increased risk for probable MED [adjusted relative risk (RRa) = 1.75; 95% confidence interval (CI) = 1.71-1.79]. Blind subjects had a 2.7-fold risk of probable MED (RRa = 2.69; 95% CI = 2.60-2.78) compared to subjects with normal or mild visual impairments. Conclusion: Visual impairment severity increased probable MED risk. Therefore, visually impaired subjects need more attention regarding probable MED. (Health Science Indones 2011;2:9-13)

  8. Inventory control based on advanced probability theory, an application

    CERN Document Server

    Krever, Maarten; Schorr, Bernd; Wunderink, S

    2005-01-01

    Whenever stock is placed as a buffer between consumption and supply, the decision when to replenish the stock is based on uncertain values of future demand and supply variables. Uncertainty exists about the replenishment lead time, and about the number of demands and the quantities demanded during this period. We develop a new analytical expression for the reorder point, which is based on the desired service level and three distributions: the distribution of the quantity of single demands during lead time, the distribution of the lengths of time intervals between successive demands, and the distribution of the lead time itself. The distribution of lead-time demand is derived from the distributions of individual demand quantities and not from the demand per period. It is not surprising that the resulting formulae for the mean and variance differ from those currently used. The theory developed is also applicable to periodic review systems. The system has been implemented at CERN and enables a significant enha...
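
    The abstract's three-distribution structure invites a simple Monte Carlo cross-check of any closed-form reorder point: sample the lead time, the demand count, and each demand quantity, then take the service-level quantile of lead-time demand. The distributions below are illustrative stand-ins, not the ones fitted at CERN.

```python
# Monte Carlo sketch of a reorder point from lead-time demand, assuming
# illustrative distributions for the three ingredients named in the abstract.
import numpy as np

rng = np.random.default_rng(6)
n, service_level = 20_000, 0.95

lead_time = rng.gamma(shape=4.0, scale=2.5, size=n)   # lead time [days], mean 10
n_demands = rng.poisson(0.8 * lead_time)              # demand arrivals at 0.8/day
# Lead-time demand: sum of lognormal single-demand quantities per scenario
demand = np.array([rng.lognormal(2.0, 0.5, k).sum() for k in n_demands])

reorder_point = np.quantile(demand, service_level)
print(f"reorder point at {service_level:.0%} service level: {reorder_point:.1f} units")
```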

  9. Measurement and probability a probabilistic theory of measurement with applications

    CERN Document Server

    Rossi, Giovanni Battista

    2014-01-01

    Measurement plays a fundamental role both in physical and behavioral sciences, as well as in engineering and technology: it is the link between abstract models and empirical reality and is a privileged method of gathering information from the real world. Is it possible to develop a single theory of measurement for the various domains of science and technology in which measurement is involved? This book takes the challenge by addressing the following main issues: What is the meaning of measurement? How do we measure? What can be measured? A theoretical framework that could truly be shared by scientists in different fields, ranging from physics and engineering to psychology is developed. The future in fact will require greater collaboration between science and technology and between different sciences. Measurement, which played a key role in the birth of modern science, can act as an essential interdisciplinary tool and language for this new scenario. A sound theoretical basis for addressing key problems in mea...

  10. Determinism and probability in the development of the cell theory.

    Science.gov (United States)

    Duchesneau, François

    2012-09-01

    A return to Claude Bernard's original use of the concept of 'determinism' displays the fact that natural laws were presumed to rule over all natural processes. In a more restricted sense, the term boiled down to a mere presupposition of constant determinant causes for those processes, leaving aside any particular ontological principle, even stochastic. The history of the cell theory until around 1900 was dominated by a twofold conception of determinant causes. Along a reductionist trend, cells' structures and processes were supposed to be accounted for through their analysis into detailed partial mechanisms. But a more holistic approach tended to subsume those analytic means and the mechanisms involved under a program of global functional determinations. When mitotic and meiotic sequences in nuclear replication were being unveiled and neo-Mendelian genetics was being grafted onto cytology and embryology, a conception of strict determinism at the nuclear level, principally represented by Wilhelm Roux and August Weismann, would seem to rule unilaterally over the mosaic interpretation of the cleavage of blastomeres. But, as shown by E.B. Wilson, in developmental processes there occur contingent outcomes of cell division which observations and experiments reveal. This induces the need to admit 'epigenetic' determinants and relativize the presumed 'preformation' of the developmental phases by making room for an emergent order which the accidental circumstances of gene replication would trigger. Copyright © 2012 Elsevier Ltd. All rights reserved.

  11. Subjective probability appraisal of uranium resources in the state of New Mexico

    International Nuclear Information System (INIS)

    Ellis, J.R.; Harris, D.P.; VanWie, N.H.

    1975-12-01

    This report presents an estimate of undiscovered uranium resources in New Mexico of 226,681,000 tons of material containing 455,480 tons U3O8. The basis for this estimate was a survey of the expectations of 36 geologists, in terms of subjective probabilities of the number of deposits, ore tonnage, and grade. Weighting of the geologists' estimates to derive a mean value used a self-appraisal index of their knowledge within the field. Detailed estimates are presented for the state, for each of 62 subdivisions (cells), and for an aggregation of eight cells encompassing the San Juan Basin, which is estimated to contain 92 percent of the undiscovered uranium resources in New Mexico. Ore-body attributes stated as probability distributions enabled the application of Monte Carlo methods to the analysis of the data. Sampling of the estimates of material and contained U3O8, which are provided as probability distributions, indicates a 10 percent probability of there being at least 600,000 tons U3O8 remaining undiscovered in deposits virtually certain to number between 500 and 565. An indicated probability of 99.5 percent that the ore grade is greater than 0.12 percent U3O8 suggests that this survey may not provide reliable estimates of the abundance of material in very low-grade categories. Extrapolation to examine the potential for such deposits indicates more than 1,000,000 tons U3O8 may be available down to a grade of 0.05 percent U3O8. Supplemental point estimates of ore depth and thickness allowed derivative estimates of the cost of development, extraction, and milling. 80 percent of the U3O8 is estimated to be available at a cost less than $15/lb (1974) and about 98 percent at less than $30/lb

  12. An Alternative Method to Compute the Bit Error Probability of Modulation Schemes Subject to Nakagami-m Fading

    Directory of Open Access Journals (Sweden)

    Madeiro Francisco

    2010-01-01

    This paper presents an alternative method for determining exact expressions for the bit error probability (BEP) of modulation schemes subject to Nakagami-m fading. In this method, the Nakagami-m fading channel is seen as an additive noise channel whose noise is modeled as the ratio between Gaussian and Nakagami-m random variables. The method consists of using the cumulative distribution function of the resulting noise to obtain closed-form expressions for the BEP of modulation schemes subject to Nakagami-m fading. In particular, the proposed method is used to obtain closed-form expressions for the BEP of M-ary quadrature amplitude modulation (M-QAM), M-ary pulse amplitude modulation (M-PAM), and rectangular quadrature amplitude modulation (R-QAM) under Nakagami-m fading. The main contribution of this paper is to show that this alternative method can be used to reduce the computational complexity for detecting signals in the presence of fading.
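
    A hedged numerical companion to the ratio formulation above: for BPSK (the simplest PAM case) an error occurs exactly when the Gaussian-to-Nakagami ratio drops below -1, so the BEP can be estimated directly by simulation. The SNR, m, and sample size are arbitrary choices, not the paper's cases.

```python
# Monte Carlo sketch of the BEP of BPSK over Nakagami-m fading, with the error
# event written in the ratio form G/H < -1 (equivalent to H + G < 0).
# Illustrative parameter values only.
import numpy as np

rng = np.random.default_rng(7)
m, snr_db, n = 2.0, 10.0, 2_000_000
snr = 10 ** (snr_db / 10)

# Nakagami-m amplitude: square root of a Gamma(m, 1/m) power (unit mean power)
H = np.sqrt(rng.gamma(m, 1.0 / m, n))
G = rng.normal(0.0, np.sqrt(1.0 / (2 * snr)), n)   # AWGN term for BPSK

bep = np.mean(G / H < -1.0)                        # error event in ratio form
print(f"estimated BEP at {snr_db} dB, m = {m}: {bep:.2e}")
```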

  13. Estimation and asymptotic theory for transition probabilities in Markov Renewal Multi–state models

    NARCIS (Netherlands)

    Spitoni, C.; Verduijn, M.; Putter, H.

    2012-01-01

    In this paper we discuss estimation of transition probabilities for semi–Markov multi–state models. Non–parametric and semi–parametric estimators of the transition probabilities for a large class of models (forward going models) are proposed. Large sample theory is derived using the functional

  14. Probabilities and Shannon's Entropy in the Everett Many-Worlds Theory

    Directory of Open Access Journals (Sweden)

    Andreas Wichert

    2016-12-01

    Following a controversial suggestion by David Deutsch that decision theory can solve the problem of probabilities in the Everett many-worlds theory, we suggest that the probabilities are induced by Shannon's entropy, which measures the uncertainty of events. We argue that a rational person prefers certainty to uncertainty due to the fundamental biological principle of homeostasis.

  15. Probability, random variables, and random processes theory and signal processing applications

    CERN Document Server

    Shynk, John J

    2012-01-01

    Probability, Random Variables, and Random Processes is a comprehensive textbook on probability theory for engineers that provides a more rigorous mathematical framework than is usually encountered in undergraduate courses. It is intended for first-year graduate students who have some familiarity with probability and random variables, though not necessarily with random processes and systems that operate on random signals. It is also appropriate for advanced undergraduate students who have a strong mathematical background. The book has the following features: Several app

  16. Relationship between Future Time Orientation and Item Nonresponse on Subjective Probability Questions: A Cross-Cultural Analysis.

    Science.gov (United States)

    Lee, Sunghee; Liu, Mingnan; Hu, Mengyao

    2017-06-01

    Time orientation is an unconscious yet fundamental cognitive process that provides a framework for organizing personal experiences in temporal categories of past, present and future, reflecting the relative emphasis given to these categories. Culture lies central to individuals' time orientation, leading to cultural variations in time orientation. For example, people from future-oriented cultures tend to emphasize the future and store information relevant for the future more than those from present- or past-oriented cultures. For survey questions that ask respondents to report expected probabilities of future events, this may translate into culture-specific question difficulties, manifested through systematically varying "I don't know" item nonresponse rates. This study drew on the time orientation theory and examined culture-specific nonresponse patterns on subjective probability questions using methodologically comparable population-based surveys from multiple countries. The results supported our hypothesis. Item nonresponse rates on these questions varied significantly in the way that future-orientation at the group as well as individual level was associated with lower nonresponse rates. This pattern did not apply to non-probability questions. Our study also suggested potential nonresponse bias. Examining culture-specific constructs, such as time orientation, as a framework for measurement mechanisms may contribute to improving cross-cultural research.

  17. Aspects of a representation of quantum theory in terms of classical probability theory by means of integration in Hilbert space

    International Nuclear Information System (INIS)

    Bach, A.

    1981-01-01

    A representation of quantum mechanics in terms of classical probability theory by means of integration in Hilbert space is discussed. This formal hidden-variables representation is analysed in the context of impossibility proofs concerning hidden-variables theories. The structural analogy of this formulation of quantum theory with classical statistical mechanics is used to elucidate the difference between classical mechanics and quantum mechanics. (author)

  18. USING THE WEB-SERVICES WOLFRAM|ALPHA TO SOLVE PROBLEMS IN PROBABILITY THEORY

    Directory of Open Access Journals (Sweden)

    Taras Kobylnyk

    2015-10-01

    The trend towards the use of remote network resources on the Internet is clearly delineated: traditional training is increasingly combined with networked, remote technologies, and cloud computing has become popular. Methods of probability theory are used in various fields; of particular note is their use in psychological and educational research for the statistical analysis of experimental data. Conducting such research is impossible without modern information technology. Given the advantages of web-based software, the article describes the web service Wolfram|Alpha and analyses in detail the possibilities of using it to solve problems of probability theory. Case studies present the results of queries for solving problems of probability theory, in particular from the sections on random events and random variables. The problem of the number of occurrences of an event A in n independent trials is considered and analysed using Wolfram|Alpha, with a detailed analysis of the possibilities of using the service for the study of a continuous random variable that has a normal or uniform probability distribution, including calculating the probability that the value of a random variable falls in a given interval. The application of the binomial and hypergeometric probability distributions of a discrete random variable is also considered, demonstrating the possibility of using the service to solve such problems.
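
    The same classroom-style queries can also be reproduced offline; here is a sketch using scipy.stats with example parameters (the article's concrete exercises are not reproduced):

```python
# Local counterparts of typical Wolfram|Alpha probability queries, using
# scipy.stats. All distribution parameters are invented examples.
from scipy.stats import binom, hypergeom, norm

# Probability of exactly k successes in n Bernoulli trials (binomial law)
print(binom.pmf(k=3, n=10, p=0.2))

# Probability that a normal random variable falls in a given interval
print(norm.cdf(1.5, loc=0, scale=1) - norm.cdf(-0.5, loc=0, scale=1))

# Hypergeometric: P(X = 2) when drawing 5 from 20 objects of which 7 are marked
print(hypergeom.pmf(2, 20, 7, 5))
```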

  19. Geometric function theory: a modern view of a classical subject

    International Nuclear Information System (INIS)

    Crowdy, Darren

    2008-01-01

    Geometric function theory is a classical subject. Yet it continues to find new applications in an ever-growing variety of areas such as modern mathematical physics, more traditional fields of physics such as fluid dynamics, nonlinear integrable systems theory and the theory of partial differential equations. This paper surveys, with a view to modern applications, open problems and challenges in this subject. Here we advocate an approach based on the use of the Schottky–Klein prime function within a Schottky model of compact Riemann surfaces. (open problem)

  20. The Subject, Feminist Theory and Latin American Texts

    Directory of Open Access Journals (Sweden)

    Sara Castro-Klaren

    1996-01-01

    From a feminist perspective, this essay reviews and analyzes the interaction between metropolitan feminist theories and their interface with the academic criticism of texts written by Latin American women. The discussion focuses on the question of the subject, which the author believes to be paramount in feminist theory, inasmuch as the construction of gender and the historical subordination of women devolve on the play of difference and identity. This paper examines how the problematic assumption by feminist theorists in the North American academy of Freudian and Lacanian theories of the subject poses unresolved problems and unanticipated complications for the subsequent deployment of this subject theory as a mode of interpretation of texts written by women in Latin America, and even for the emancipatory goals of feminists in the academy. This is a case where "traveling theory" must be examined and evaluated very carefully. The second part of the paper concentrates on the feminist challenges that have already been made to both Freudian and Lacanian theories of the feminine. It highlights the work of Jane Flax, Nancy Chodorow, Gayatri Spivak and Judith Butler in suggesting a way out of theories that rely on the primacy of male subject formation and therefore occlude and preclude the investigation of the modes of women's agency.

  1. Probability density function evolution of power systems subject to stochastic variation of renewable energy

    Science.gov (United States)

    Wei, J. Q.; Cong, Y. C.; Xiao, M. Q.

    2018-05-01

    As renewable energies are increasingly integrated into power systems, there is growing interest in the stochastic analysis of power systems. Better techniques should be developed to account for the uncertainty caused by the penetration of renewables and, consequently, to analyse its impact on the stochastic stability of power systems. In this paper, Stochastic Differential Equations (SDEs) are used to represent the evolutionary behaviour of power systems. The stationary Probability Density Function (PDF) solution to SDEs modelling power systems excited by Gaussian white noise is analysed. Subject to such random excitation, the Joint Probability Density Function (JPDF) solution for the phase angle and angular velocity is governed by the generalized Fokker-Planck-Kolmogorov (FPK) equation. To solve this equation, a numerical method is adopted. A special measure is taken such that the generalized FPK equation is satisfied in the average sense of integration with the assumed PDF. Both weak and strong intensities of the stochastic excitations are considered in a single-machine infinite-bus power system. The numerical analysis gives the same result as the Monte Carlo simulation. Potential studies on the stochastic behaviour of multi-machine power systems with random excitations are discussed at the end.
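
    The Monte Carlo side of such a comparison is straightforward to sketch: an Euler-Maruyama discretization of a noisy single-machine infinite-bus swing equation, with the long-run histogram of (delta, omega) serving as the stationary JPDF estimate. All per-unit coefficients below are invented for illustration.

```python
# Euler-Maruyama sketch of a single-machine infinite-bus swing equation with
# Gaussian white-noise power fluctuations; the long-run histogram of
# (delta, omega) approximates the stationary JPDF governed by the FPK equation.
import numpy as np

rng = np.random.default_rng(8)
M, D, Pm, Pmax, eps = 10.0, 1.0, 0.8, 2.0, 0.2   # illustrative per-unit values
dt, steps = 1e-3, 500_000

delta, omega = np.arcsin(Pm / Pmax), 0.0          # start at the equilibrium
samples = np.empty((steps, 2))
for t in range(steps):
    dW = rng.normal(0.0, np.sqrt(dt))             # Brownian increment
    delta += omega * dt
    omega += ((Pm - Pmax * np.sin(delta) - D * omega) * dt + eps * dW) / M
    samples[t] = delta, omega

hist, _, _ = np.histogram2d(samples[:, 0], samples[:, 1], bins=50, density=True)
print("peak of the estimated joint PDF:", hist.max())
```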

  2. Opera house acoustics based on subjective preference theory

    CERN Document Server

    Ando, Yoichi

    2015-01-01

    This book focuses on opera house acoustics based on subjective preference theory; it targets researchers in acoustics and vision who are working in physics, psychology, and brain physiology. This book helps readers to understand any subjective attributes in relation to objective parameters based on the powerful and workable model of the auditory system. It is reconfirmed here that the well-known Helmholtz theory, which was based on a peripheral model of the auditory system, may not well describe pitch, timbre, and duration as well as the spatial sensations described in this book, nor overall responses such as subjective preference of sound fields and the annoyance of environmental noise.

  3. Self-Organized Complexity and Coherent Infomax from the Viewpoint of Jaynes’s Probability Theory

    Directory of Open Access Journals (Sweden)

    William A. Phillips

    2012-01-01

    This paper discusses concepts of self-organized complexity and the theory of Coherent Infomax in the light of Jaynes's probability theory. Coherent Infomax shows, in principle, how adaptively self-organized complexity can be preserved and improved by using probabilistic inference that is context-sensitive. It argues that neural systems do this by combining local reliability with flexible, holistic context-sensitivity. Jaynes argued that the logic of probabilistic inference shows it to be based upon Bayesian and Maximum Entropy methods, or special cases of them. He presented his probability theory as the logic of science; here it is considered as the logic of life. It is concluded that the theory of Coherent Infomax specifies a general objective for probabilistic inference, and that contextual interactions in neural systems perform functions required of the scientist within Jaynes's theory.

  4. Traceable accounts of subjective probability judgments in the IPCC and beyond

    Science.gov (United States)

    Baer, P. G.

    2012-12-01

    One of the major sources of controversy surrounding the reports of the IPCC has been the characterization of uncertainty. Although arguably the IPCC has paid more attention to the process of uncertainty analysis and communication than any comparable assessment body, its efforts to achieve consistency have produced mixed results. In particular, the extensive use of subjective probability assessment has attracted widespread criticism. Statements such as "Average Northern Hemisphere temperatures during the second half of the 20th century were very likely higher than during any other 50-year period in the last 500 years" are ubiquitous (one online database lists nearly 3000 such claims), and indeed are the primary way in which its key "findings" are reported. Much attention is drawn to the precise quantitative definition of such statements (e.g., "very likely" means >90% probability, vs. "extremely likely," which means >95% certainty). But there is no process by which the decision regarding the choice of such an uncertainty level for a given finding is formally made or reported, and thus the statements are easily disputed by anyone, expert or otherwise, who disagrees with the assessment. In the "Uncertainty Guidance Paper" for the Third Assessment Report, Richard Moss and Steve Schneider defined the concept of a "traceable account," which gave exhaustive detail regarding how one ought to provide documentation of such an uncertainty assessment. But the guidance, while appearing straightforward and reasonable, was in fact an unworkable recipe, which would have taken near-infinite time if used for more than a few key results, and would have required a different structuring of the text than the conventional scientific assessment. And even then it would have left a gap when it came to the actual provenance of any such specific judgments, because there simply is no formal step at which individuals turn their knowledge of the evidence on some finding into a probability judgment. The

  5. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think ... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators
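
    To illustrate why sharp asymptotics matter here, the sketch below compares crude Monte Carlo with the classical subexponential approximation P(S_n > x) ≈ n·P(X > x) for a sum of Pareto variables; the distribution and thresholds are arbitrary examples, not the dissertation's cases.

```python
# Crude Monte Carlo versus the subexponential "single big jump" asymptotic
# P(S_n > x) ~ n * P(X > x) for a sum of heavy-tailed (Pareto) variables.
# Illustrative parameters only.
import numpy as np

rng = np.random.default_rng(9)
alpha, n_sum, x = 1.5, 5, 100.0

xs = rng.pareto(alpha, size=(2_000_000, n_sum)) + 1.0   # Pareto, P(X > t) = t^-alpha
crude = np.mean(xs.sum(axis=1) > x)

asymptotic = n_sum * x**(-alpha)                        # valid for x >> 1
print(f"crude MC: {crude:.2e}   asymptotic: {asymptotic:.2e}")
```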

  6. Unification of field theory and maximum entropy methods for learning probability densities

    OpenAIRE

    Kinney, Justin B.

    2014-01-01

    The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy de...

  7. A Complete Theory of Everything (Will Be Subjective)

    Directory of Open Access Journals (Sweden)

    Marcus Hutter

    2010-09-01

    Increasingly encompassing models have been suggested for our world. Theories range from generally accepted to increasingly speculative to apparently bogus. The progression of theories from ego- to geo- to helio-centric models to universe and multiverse theories and beyond was accompanied by a dramatic increase in the sizes of the postulated worlds, with humans being expelled from their center to ever more remote and random locations. Rather than leading to a true theory of everything, this trend faces a turning point after which the predictive power of such theories decreases (actually to zero). Incorporating the location and other capacities of the observer into such theories avoids this problem and allows one to distinguish meaningful from predictively meaningless theories. This also leads to a truly complete theory of everything consisting of a (conventional) objective theory of everything plus a (novel) subjective observer process. The observer localization is neither based on the controversial anthropic principle, nor has it anything to do with the quantum-mechanical observation process. The suggested principle is extended to more practical (partial, approximate, probabilistic, parametric) world models (rather than theories of everything). Finally, I provide a justification of Ockham's razor, and criticize the anthropic principle, the doomsday argument, the no free lunch theorem, and the falsifiability dogma.

  8. The force distribution probability function for simple fluids by density functional theory.

    Science.gov (United States)

    Rickayzen, G; Heyes, D M

    2013-02-28

    Classical density functional theory (DFT) is used to derive a formula for the probability density distribution function, P(F), and the probability distribution function, W(F), for simple fluids, where F is the net force on a particle. The final formula gives P(F) ∝ exp(-AF²), where A depends on the fluid density, the temperature, and the Fourier transform of the pair potential. The form of the DFT theory used is only applicable to bounded-potential fluids. When combined with the hypernetted chain closure of the Ornstein-Zernike equation, the DFT theory for W(F) agrees with molecular dynamics computer simulations for the Gaussian and bounded soft-sphere potentials at high density. The Gaussian form for P(F) is still accurate at lower densities (but not too low a density) for the two potentials, but with a smaller value for the constant, A, than that predicted by the DFT theory.
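
    To see what the Gaussian form implies for the force magnitude, note that if P(F) ∝ exp(-AF²) with F² summed over Cartesian components, then W(F) is Maxwell-like, W(F) ∝ F² exp(-AF²). A quick numerical check with an arbitrary example value of A:

```python
# Numerical check: Gaussian force components (P ∝ exp(-A F^2)) imply a
# Maxwell-type magnitude distribution W(F) ∝ F^2 exp(-A F^2). A is arbitrary.
import numpy as np

rng = np.random.default_rng(11)
A = 0.5
sigma = np.sqrt(1.0 / (2.0 * A))            # component std dev for exp(-A F^2)

F = np.linalg.norm(rng.normal(0.0, sigma, size=(1_000_000, 3)), axis=1)
hist, edges = np.histogram(F, bins=100, density=True)
centers = 0.5 * (edges[1:] + edges[:-1])

maxwell = centers**2 * np.exp(-A * centers**2)
maxwell /= maxwell.sum() * (edges[1] - edges[0])   # normalize on the same grid
print("max abs deviation from Maxwell form:", np.abs(hist - maxwell).max())
```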

  9. Exact closed form expressions for outage probability of GSC receivers over Rayleigh fading channel subject to self-interference

    KAUST Repository

    Nam, Sungsik; Hasna, Mazen Omar; Alouini, Mohamed-Slim

    2010-01-01

    in mind, we capitalize in this paper on some new order statistics results to derive exact closed-form expressions for outage probability of GSC RAKE receivers subject to self-interference over independent and identically distributed Rayleigh fading

  10. Ergodic theory, interpretations of probability and the foundations of statistical mechanics

    NARCIS (Netherlands)

    van Lith, J.H.

    2001-01-01

    The traditional use of ergodic theory in the foundations of equilibrium statistical mechanics is that it provides a link between thermodynamic observables and microcanonical probabilities. First of all, the ergodic theorem demonstrates the equality of microcanonical phase averages and infinite time

  11. Transformation & uncertainty : some thoughts on quantum probability theory, quantum statistics, and natural bundles

    NARCIS (Netherlands)

    Janssens, B.

    2010-01-01

    This PhD thesis is concerned partly with uncertainty relations in quantum probability theory, partly with state estimation in quantum stochastics, and partly with natural bundles in differential geometry. The laws of quantum mechanics impose severe restrictions on the performance of measurement.

  12. The use of modern information technologies in teaching students of economics theory of probability

    Directory of Open Access Journals (Sweden)

    Иван Васильевич Детушев

    2013-12-01

    This article discusses the use of the program «MathCAD» in teaching mathematics to students of economic specialties. It is shown that the use of this software product contributes to the effective development of methods for solving problems of probability theory.

  13. Probabilities and Possibilities: The Strategic Counseling Implications of the Chaos Theory of Careers

    Science.gov (United States)

    Pryor, Robert G. L.; Amundson, Norman E.; Bright, Jim E. H.

    2008-01-01

    The chaos theory of careers emphasizes both stability and change in its account of career development. This article outlines counseling strategies derived from this emphasis in terms of convergent or probability thinking and emergent or possibility thinking. These 2 perspectives are characterized, and practical counseling strategy implications are…

  14. Using the Reliability Theory for Assessing the Decision Confidence Probability for Comparative Life Cycle Assessments.

    Science.gov (United States)

    Wei, Wei; Larrey-Lassalle, Pyrène; Faure, Thierry; Dumoulin, Nicolas; Roux, Philippe; Mathias, Jean-Denis

    2016-03-01

    A comparative decision-making process is widely used to identify which option (system, product, service, etc.) has the smaller environmental footprint and to provide recommendations that help stakeholders take future decisions. However, uncertainty complicates the comparison and the decision making. Probability-based decision support in LCA is a way to help stakeholders in their decision-making process. It calculates the decision confidence probability, which expresses the probability that one option has a smaller environmental impact than another. Here we apply reliability theory to approximate the decision confidence probability. We compare the traditional Monte Carlo method with a reliability method called the FORM method. The Monte Carlo method needs a large amount of computational time to calculate the decision confidence probability. The FORM method enables us to approximate the decision confidence probability with fewer simulations than the Monte Carlo method by approximating the response surface. Moreover, the FORM method calculates the associated importance factors, which correspond to a sensitivity analysis in relation to the probability. The importance factors allow stakeholders to determine which factors influence their decision. Our results clearly show that the reliability method provides additional useful information to stakeholders and reduces the computational time.
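
    The baseline Monte Carlo quantity that FORM approximates is easy to sketch: the probability that option A's impact is lower than option B's under parameter uncertainty. Lognormal inputs are a common LCA assumption, and the numbers below are invented; FORM would approximate the same probability from the limit state g = impact_B - impact_A without full sampling.

```python
# Monte Carlo sketch of the decision confidence probability in a comparative
# LCA: P(impact_A < impact_B) under uncertain (here lognormal) inputs.
# All parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(10)
n = 1_000_000

impact_A = rng.lognormal(np.log(10.0), 0.20, n) + rng.lognormal(np.log(2.0), 0.40, n)
impact_B = rng.lognormal(np.log(11.0), 0.25, n) + rng.lognormal(np.log(1.5), 0.30, n)

confidence = np.mean(impact_A < impact_B)
print(f"decision confidence probability P(A better than B): {confidence:.3f}")
```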

  15. [A probability wave theory on the ion movement across cell membrane].

    Science.gov (United States)

    Zhang, Hui; Xu, Jiadong; Niu, Zhongqi

    2007-04-01

    The quantity of ions crossing the channels of the cell membrane determines the life state of the cell. Existing theoretical analyses of the biological effects of electromagnetic fields (EMF) do not reveal the relationship between the EMF applied to the cell and the quantity of ions crossing the cell membrane. Based on the cell structure, the existing theoretical analyses, and experimental results, an ionic probability wave theory is proposed in this paper to explain the biological window effects of electromagnetic waves. The theory treats the membrane channel as a periodic potential barrier and gives a physical picture of ion movement across the cell membrane. It revises the relationship between the energy of an ion in the channel and the frequency of the applied EMF. Applying the concept of the wave function, the probability of an ion crossing the cell membrane is obtained by the methods of quantum mechanics. The numerical results identify the physical factors that influence ion movement across the cell membrane and show that the theory can explain the phenomenon of biological window effects.

  16. Unification of field theory and maximum entropy methods for learning probability densities

    Science.gov (United States)

    Kinney, Justin B.

    2015-09-01

    The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy density estimate can be recovered in the infinite smoothness limit of an appropriate Bayesian field theory. I also show that Bayesian field theory estimation can be performed without imposing any boundary conditions on candidate densities, and that the infinite smoothness limit of these theories recovers the most common types of maximum entropy estimates. Bayesian field theory thus provides a natural test of the maximum entropy null hypothesis and, furthermore, returns an alternative (lower entropy) density estimate when the maximum entropy hypothesis is falsified. The computations necessary for this approach can be performed rapidly for one-dimensional data, and software for doing this is provided.
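
    For the maximum entropy half of the correspondence, a minimal sketch assuming mean and second-moment constraints on a grid (the Bayesian field theory limit and the author's software are not reproduced here):

      # The maxent density has the form exp(lam1*x + lam2*x^2); the multipliers
      # minimize the convex dual logZ(lam) - lam . m for sample moments m.
      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(1)
      data = rng.normal(0.5, 1.0, size=500)
      x = np.linspace(-5.0, 6.0, 400)
      dx = x[1] - x[0]
      m = np.array([data.mean(), (data ** 2).mean()])

      def dual(lam):
          logz = np.log(np.exp(lam[0] * x + lam[1] * x ** 2).sum() * dx)
          return logz - lam @ m

      lam = minimize(dual, x0=[0.0, -0.5], method="Nelder-Mead").x
      p = np.exp(lam[0] * x + lam[1] * x ** 2)
      p /= p.sum() * dx  # grid-normalized maxent estimate (here, a Gaussian)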

  18. Estimation of delayed neutron emission probability by using the gross theory of nuclear β-decay

    International Nuclear Information System (INIS)

    Tachibana, Takahiro

    1999-01-01

    The delayed neutron emission probabilities (Pn-values) of fission products are necessary in the study of reactor physics, e.g. in the calculation of total delayed neutron yields and in the summation calculation of decay heat. In this report, the Pn-values estimated by the gross theory for some fission products are compared with experiment, and it is found that, on the average, the semi-gross theory somewhat underestimates the experimental Pn-values. A modification of the β-decay strength function is briefly discussed to get more reasonable Pn-values. (author)

  19. The attention schema theory: a mechanistic account of subjective awareness.

    Science.gov (United States)

    Graziano, Michael S A; Webb, Taylor W

    2015-01-01

    We recently proposed the attention schema theory, a novel way to explain the brain basis of subjective awareness in a mechanistic and scientifically testable manner. The theory begins with attention, the process by which signals compete for the brain's limited computing resources. This internal signal competition is partly under a bottom-up influence and partly under top-down control. We propose that the top-down control of attention is improved when the brain has access to a simplified model of attention itself. The brain therefore constructs a schematic model of the process of attention, the 'attention schema,' in much the same way that it constructs a schematic model of the body, the 'body schema.' The content of this internal model leads a brain to conclude that it has a subjective experience. One advantage of this theory is that it explains how awareness and attention can sometimes become dissociated; the brain's internal models are never perfect, and sometimes a model becomes dissociated from the object being modeled. A second advantage of this theory is that it explains how we can be aware of both internal and external events. The brain can apply attention to many types of information including external sensory information and internal information about emotions and cognitive states. If awareness is a model of attention, then this model should pertain to the same domains of information to which attention pertains. A third advantage of this theory is that it provides testable predictions. If awareness is the internal model of attention, used to help control attention, then without awareness, attention should still be possible but should suffer deficits in control. In this article, we review the existing literature on the relationship between attention and awareness, and suggest that at least some of the predictions of the theory are borne out by the evidence.

  20. The attention schema theory: a mechanistic account of subjective awareness

    Directory of Open Access Journals (Sweden)

    Taylor W. Webb

    2015-04-01

    We recently proposed the attention schema theory, a novel way to explain the brain basis of subjective awareness in a mechanistic and scientifically testable manner. The theory begins with attention, the process by which signals compete for the brain’s limited computing resources. This internal signal competition is partly under a bottom-up influence and partly under top-down control. We propose that the top-down control of attention is improved when the brain has access to a simplified model of attention itself. The brain therefore constructs a schematic model of the process of attention, the ‘attention schema’, in much the same way that it constructs a schematic model of the body, the ‘body schema’. The content of this internal model leads a brain to conclude that it has a subjective experience. One advantage of this theory is that it explains how awareness and attention can sometimes become dissociated; the brain’s internal models are never perfect, and sometimes a model becomes dissociated from the object being modeled. A second advantage of this theory is that it explains how we can be aware of both internal and external events. The brain can apply attention to many types of information including external sensory information and internal information about emotions and cognitive states. If awareness is a model of attention, then this model should pertain to the same domains of information to which attention pertains. A third advantage of this theory is that it provides testable predictions. If awareness is the internal model of attention, used to help control attention, then without awareness, attention should still be possible but should suffer deficits in control. In this article, we review the existing literature on the relationship between attention and awareness, and suggest that at least some of the predictions of the theory are borne out by the evidence.

  1. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  2. Realistic neurons can compute the operations needed by quantum probability theory and other vector symbolic architectures.

    Science.gov (United States)

    Stewart, Terrence C; Eliasmith, Chris

    2013-06-01

    Quantum probability (QP) theory can be seen as a type of vector symbolic architecture (VSA): mental states are vectors storing structured information and manipulated using algebraic operations. Furthermore, the operations needed by QP match those in other VSAs. This allows existing biologically realistic neural models to be adapted to provide a mechanistic explanation of the cognitive phenomena described in the target article by Pothos & Busemeyer (P&B).
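
    A sketch of the kind of algebraic operation meant here, using circular-convolution binding (holographic reduced representations) with illustrative dimensionality and no attempt at the neural implementation:

      import numpy as np

      rng = np.random.default_rng(2)
      d = 512
      role, filler = rng.normal(0.0, 1.0 / np.sqrt(d), size=(2, d))

      def bind(a, b):  # circular convolution via FFT
          return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

      def involution(a):  # approximate inverse, used for unbinding
          return np.roll(a[::-1], 1)

      trace = bind(role, filler)                 # store a role/filler pair
      recovered = bind(trace, involution(role))  # query the trace with the role
      cos = recovered @ filler / (np.linalg.norm(recovered) * np.linalg.norm(filler))
      print(f"similarity of recovered filler: {cos:.2f}")  # well above chance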

  3. Fusing probability density function into Dempster-Shafer theory of evidence for the evaluation of water treatment plant.

    Science.gov (United States)

    Chowdhury, Shakhawat

    2013-05-01

    The evaluation of the status of a municipal drinking water treatment plant (WTP) is important. The evaluation depends on several factors, including human health risks from disinfection by-products (R), disinfection performance (D), and cost (C) of water production and distribution. The Dempster-Shafer theory (DST) of evidence can combine the individual status with respect to R, D, and C to generate a new indicator, from which the overall status of a WTP can be evaluated. In the DST, the ranges of the different factors affecting the overall status are divided into several segments. The basic probability assignments (BPA) for each segment of these factors are provided by multiple experts, which are then combined to obtain the overall status. In assigning the BPA, the experts use their individual judgments, which can introduce subjective biases into the overall evaluation. In this research, an approach has been introduced to avoid the assignment of subjective BPA. The factors contributing to the overall status were characterized using probability density functions (PDF). The cumulative probabilities for different segments of these factors were determined from the cumulative density function, and these were then assigned as the BPA for these factors. A case study is presented to demonstrate the application of the PDF in DST to evaluate a WTP, leading to the selection of the required level of upgradation for the WTP.
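
    The combination step can be made concrete with Dempster's rule over a hypothetical two-element frame {good, poor} for the plant status; the BPA numbers below are invented for illustration:

      from itertools import product

      def combine(m1, m2):
          # Dempster's rule: multiply masses, keep non-empty intersections,
          # renormalize away the mass assigned to conflict.
          combined, conflict = {}, 0.0
          for (a, p), (b, q) in product(m1.items(), m2.items()):
              inter = a & b
              if inter:
                  combined[inter] = combined.get(inter, 0.0) + p * q
              else:
                  conflict += p * q
          return {k: v / (1.0 - conflict) for k, v in combined.items()}

      good, poor = frozenset({"good"}), frozenset({"poor"})
      either = good | poor  # ignorance: mass on the whole frame
      expert1 = {good: 0.6, poor: 0.1, either: 0.3}
      expert2 = {good: 0.5, poor: 0.3, either: 0.2}
      print(combine(expert1, expert2))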

  4. The theory, direction, and magnitude of ecosystem fire probability as constrained by precipitation and temperature.

    Science.gov (United States)

    Guyette, Richard; Stambaugh, Michael C; Dey, Daniel; Muzika, Rose Marie

    2017-01-01

    The effects of climate on wildland fire confront society across a range of different ecosystems. Water and temperature affect the combustion dynamics, irrespective of whether those are associated with carbon fueled motors or ecosystems, but through different chemical, physical, and biological processes. We use an ecosystem combustion equation developed with the physical chemistry of atmospheric variables to estimate and simulate fire probability and mean fire interval (MFI). The calibration of ecosystem fire probability with basic combustion chemistry and physics offers a quantitative method to address wildland fire in addition to the well-studied forcing factors such as topography, ignition, and vegetation. We develop a graphic analysis tool for estimating climate forced fire probability with temperature and precipitation based on an empirical assessment of combustion theory and fire prediction in ecosystems. Climate-affected fire probability for any period, past or future, is estimated with given temperature and precipitation. A graphic analysis of wildland fire dynamics driven by climate supports a dialectic in hydrologic processes that affect ecosystem combustion: 1) the water needed by plants to produce carbon bonds (fuel) and 2) the inhibition of successful reactant collisions by water molecules (humidity and fuel moisture). These two postulates enable a classification scheme for ecosystems into three or more climate categories using their position relative to change points defined by precipitation in combustion dynamics equations. Three classifications of combustion dynamics in ecosystem fire probability include: 1) precipitation insensitive, 2) precipitation unstable, and 3) precipitation sensitive. All three classifications interact in different ways with variable levels of temperature.

  5. Probability of primordial black hole pair creation in a modified gravitational theory

    International Nuclear Information System (INIS)

    Paul, B. C.; Paul, Dilip

    2006-01-01

    We compute the probability for quantum creation of an inflationary universe with and without a pair of black holes in a modified gravity theory. The action of the modified theory contains αR² and δR⁻¹ terms in addition to a cosmological constant (Λ) in the Einstein-Hilbert action. The probabilities for the creation of a universe with a pair of black holes have been evaluated considering two different kinds of spatial sections, one which accommodates a pair of black holes and the other without a black hole. We adopt a technique prescribed by Bousso and Hawking to calculate the creation probability in a semiclassical approximation using the Hartle-Hawking boundary condition. We note a class of new and physically interesting instanton solutions characterized by the parameters in the action. These instantons may play an important role in the creation of the early universe. We also note that the probability of creation of a universe with a pair of black holes is strongly suppressed with a positive cosmological constant when δ = 4Λ²/3 for α > 0, but it is more probable for α < −1/(6Λ). In the modified gravity considered here, instanton solutions are permitted even without a cosmological constant when one begins with a negative δ.

  6. Reality, Causality, and Probability, from Quantum Mechanics to Quantum Field Theory

    Science.gov (United States)

    Plotnitsky, Arkady

    2015-10-01

    These three lectures consider the questions of reality, causality, and probability in quantum theory, from quantum mechanics to quantum field theory. They do so in part by exploring the ideas of the key founding figures of the theory, such as N. Bohr, W. Heisenberg, E. Schrödinger, or P. A. M. Dirac. However, while my discussion of these figures aims to be faithful to their thinking and writings, and while these lectures are motivated by my belief in the helpfulness of their thinking for understanding and advancing quantum theory, this project is not driven by loyalty to their ideas. In part for that reason, these lectures also present different and even conflicting ways of thinking in quantum theory, such as that of Bohr or Heisenberg vs. that of Schrödinger. The lectures, most especially the third one, also consider new physical, mathematical, and philosophical complexities brought in by quantum field theory vis-à-vis quantum mechanics. I close by briefly addressing some of the implications of the argument presented here for the current state of fundamental physics.

  7. The relevance of the early history of probability theory to current risk assessment practices in mental health care.

    Science.gov (United States)

    Large, Matthew

    2013-12-01

    Probability theory is at the base of modern concepts of risk assessment in mental health. The aim of the current paper is to review the key developments in the early history of probability theory in order to enrich our understanding of current risk assessment practices.

  8. What subject matter questions motivate the use of machine learning approaches compared to statistical models for probability prediction?

    Science.gov (United States)

    Binder, Harald

    2014-07-01

    This is a discussion of the following papers: "Probability estimation with machine learning methods for dichotomous and multicategory outcome: Theory" by Jochen Kruppa, Yufeng Liu, Gérard Biau, Michael Kohler, Inke R. König, James D. Malley, and Andreas Ziegler; and "Probability estimation with machine learning methods for dichotomous and multicategory outcome: Applications" by Jochen Kruppa, Yufeng Liu, Hans-Christian Diener, Theresa Holste, Christian Weimar, Inke R. König, and Andreas Ziegler. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Explaining participation differentials in Dutch higher education: The impact of subjective success probabilities on level choice and field choice

    NARCIS (Netherlands)

    Tolsma, J.; Need, A.; Jong, U. de

    2010-01-01

    In this article we examine whether subjective estimates of success probabilities explain the effect of social origin, sex, and ethnicity on students' choices between different school tracks in Dutch higher education. The educational options analysed differ in level (i.e. university versus

  11. Knot probability of polygons subjected to a force: a Monte Carlo study

    International Nuclear Information System (INIS)

    Rensburg, E J Janse van; Orlandini, E; Tesi, M C; Whittington, S G

    2008-01-01

    We use Monte Carlo methods to study the knot probability of lattice polygons on the cubic lattice in the presence of an external force f. The force is coupled to the span of the polygons along a lattice direction, say the z-direction. If the force is negative polygons are squeezed (the compressive regime), while positive forces tend to stretch the polygons along the z-direction (the tensile regime). For sufficiently large positive forces we verify that the Pincus scaling law in the force-extension curve holds. At a fixed number of edges n the knot probability is a decreasing function of the force. For a fixed force the knot probability approaches unity as 1 − exp(−α₀(f)n + o(n)), where α₀(f) is positive and a decreasing function of f. We also examine the average of the absolute value of the writhe, and we verify the square-root growth law (known for f = 0) for all values of f.

  12. Communicating through Probabilities: Does Quantum Theory Optimize the Transfer of Information?

    Directory of Open Access Journals (Sweden)

    William K. Wootters

    2013-08-01

    A quantum measurement can be regarded as a communication channel, in which the parameters of the state are expressed only in the probabilities of the outcomes of the measurement. We begin this paper by considering, in a non-quantum-mechanical setting, the problem of communicating through probabilities. For example, a sender, Alice, wants to convey to a receiver, Bob, the value of a continuous variable, θ, but her only means of conveying this value is by sending Bob a coin in which the value of θ is encoded in the probability of heads. We ask what the optimal encoding is when Bob will be allowed to flip the coin only a finite number of times. As the number of tosses goes to infinity, we find that the optimal encoding is the same as what nature would do if we lived in a world governed by real-vector-space quantum theory. We then ask whether the problem might be modified, so that the optimal communication strategy would be consistent with standard, complex-vector-space quantum theory.
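
    A toy version of this coin channel (a sketch under simple assumptions, not the paper's derivation) encodes θ ∈ (0, 1) in the heads probability either linearly or through the variance-stabilizing sin² map, and compares Bob's worst-case estimation error after n tosses:

      import numpy as np

      rng = np.random.default_rng(3)
      n, trials = 100, 20_000

      def mse(theta, encode, decode):
          phat = rng.binomial(n, encode(theta), size=trials) / n
          return np.mean((decode(np.clip(phat, 0.0, 1.0)) - theta) ** 2)

      encodings = {
          "linear": (lambda t: t, lambda p: p),
          "sin^2": (lambda t: np.sin(np.pi * t / 2) ** 2,
                    lambda p: 2.0 / np.pi * np.arcsin(np.sqrt(p))),
      }
      for name, (enc, dec) in encodings.items():
          worst = max(mse(t, enc, dec) for t in np.linspace(0.05, 0.95, 19))
          print(name, f"worst-case MSE: {worst:.5f}")

    The sin² encoding equalizes the Fisher information across θ, which is the flavor of optimality the paper connects to real-vector-space quantum theory.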

  13. How Prevalent Is Wishful Thinking? Misattribution of Arousal Causes Optimism and Pessimism in Subjective Probabilities

    Science.gov (United States)

    Vosgerau, Joachim

    2010-01-01

    People appear to be unrealistically optimistic about their future prospects, as reflected by theory and research in the fields of psychology, organizational behavior, behavioral economics, and behavioral finance. Many real-world examples (e.g., consumer behavior during economic recessions), however, suggest that people are not always overly…

  14. Conditional Probabilities in the Excursion Set Theory. Generic Barriers and non-Gaussian Initial Conditions

    CERN Document Server

    De Simone, Andrea; Riotto, Antonio

    2011-01-01

    The excursion set theory, where density perturbations evolve stochastically with the smoothing scale, provides a method for computing the dark matter halo mass function. The computation of the mass function is mapped into the so-called first-passage time problem in the presence of a moving barrier. The excursion set theory is also a powerful formalism to study other properties of dark matter halos such as halo bias, accretion rate, formation time, merging rate and the formation history of halos. This is achieved by computing conditional probabilities with non-trivial initial conditions, and the conditional two-barrier first-crossing rate. In this paper we use the recently-developed path integral formulation of the excursion set theory to calculate analytically these conditional probabilities in the presence of a generic moving barrier, including the one describing the ellipsoidal collapse, and for both Gaussian and non-Gaussian initial conditions. The non-Markovianity of the random walks induced by non-Gaussi...

  15. Spencer-Brown vs. Probability and Statistics: Entropy’s Testimony on Subjective and Objective Randomness

    Directory of Open Access Journals (Sweden)

    Julio Michael Stern

    2011-04-01

    This article analyzes the role of entropy in Bayesian statistics, focusing on its use as a tool for detection, recognition and validation of eigen-solutions. “Objects as eigen-solutions” is a key metaphor of the cognitive constructivism epistemological framework developed by the philosopher Heinz von Foerster. Special attention is given to some objections to the concepts of probability, statistics and randomization posed by George Spencer-Brown, a figure of great influence in the field of radical constructivism.

  16. On possibility of agreement of quantum mechanics with classical probability theory

    International Nuclear Information System (INIS)

    Slavnov, D.A.

    2006-01-01

    The paper describes a scheme for constructing quantum mechanics in which the quantum system is assumed to be a collection of open classical subsystems. This makes it possible to use both formal classical logic and classical probability theory in quantum mechanics. At the same time, the approach allows a complete reconstruction of the standard mathematical apparatus of quantum mechanics and specifies the limits of its applicability. The problem of quantum state reduction is also examined. [ru]

  17. Dependence in probabilistic modeling Dempster-Shafer theory and probability bounds analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ferson, Scott [Applied Biomathematics, Setauket, NY (United States); Nelsen, Roger B. [Lewis & Clark College, Portland OR (United States); Hajagos, Janos [Applied Biomathematics, Setauket, NY (United States); Berleant, Daniel J. [Iowa State Univ., Ames, IA (United States); Zhang, Jianzhong [Iowa State Univ., Ames, IA (United States); Tucker, W. Troy [Applied Biomathematics, Setauket, NY (United States); Ginzburg, Lev R. [Applied Biomathematics, Setauket, NY (United States); Oberkampf, William L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-05-01

    This report summarizes methods to incorporate information (or lack of information) about intervariable dependence into risk assessments that use Dempster-Shafer theory or probability bounds analysis to address epistemic and aleatory uncertainty. The report reviews techniques for simulating correlated variates for a given correlation measure and dependence model, computation of bounds on distribution functions under a specified dependence model, formulation of parametric and empirical dependence models, and bounding approaches that can be used when information about the intervariable dependence is incomplete. The report also reviews several of the most pervasive and dangerous myths among risk analysts about dependence in probabilistic models.

  18. Subjective Vertical Conflict Theory and Space Motion Sickness.

    Science.gov (United States)

    Chen, Wei; Chao, Jian-Gang; Wang, Jin-Kun; Chen, Xue-Wen; Tan, Cheng

    2016-02-01

    Space motion sickness (SMS) remains a troublesome problem during spaceflight. The subjective vertical (SV) conflict theory postulates that all motion sickness provoking situations are characterized by a condition in which the SV sensed from gravity and visual and idiotropic cues differs from the expected vertical. This theory has been successfully used to predict motion sickness in different vehicles on Earth. We have summarized the most outstanding and recent studies on the illusions and characteristics associated with spatial disorientation and SMS during weightlessness, such as cognitive map and mental rotation, the visual reorientation and inversion illusions, and orientation preferences between visual scenes and the internal z-axis of the body. The relationships between the SV and the incidence of and susceptibility to SMS as well as spatial disorientation were addressed. A consistent framework was presented to understand and explain SMS characteristics in more detail on the basis of the SV conflict theory, which is expected to be more advantageous in SMS prediction, prevention, and training.

  19. The theory, direction, and magnitude of ecosystem fire probability as constrained by precipitation and temperature.

    Directory of Open Access Journals (Sweden)

    Richard Guyette

    The effects of climate on wildland fire confront society across a range of different ecosystems. Water and temperature affect the combustion dynamics, irrespective of whether those are associated with carbon fueled motors or ecosystems, but through different chemical, physical, and biological processes. We use an ecosystem combustion equation developed with the physical chemistry of atmospheric variables to estimate and simulate fire probability and mean fire interval (MFI). The calibration of ecosystem fire probability with basic combustion chemistry and physics offers a quantitative method to address wildland fire in addition to the well-studied forcing factors such as topography, ignition, and vegetation. We develop a graphic analysis tool for estimating climate forced fire probability with temperature and precipitation based on an empirical assessment of combustion theory and fire prediction in ecosystems. Climate-affected fire probability for any period, past or future, is estimated with given temperature and precipitation. A graphic analysis of wildland fire dynamics driven by climate supports a dialectic in hydrologic processes that affect ecosystem combustion: 1) the water needed by plants to produce carbon bonds (fuel) and 2) the inhibition of successful reactant collisions by water molecules (humidity and fuel moisture). These two postulates enable a classification scheme for ecosystems into three or more climate categories using their position relative to change points defined by precipitation in combustion dynamics equations. Three classifications of combustion dynamics in ecosystem fire probability include: 1) precipitation insensitive, 2) precipitation unstable, and 3) precipitation sensitive. All three classifications interact in different ways with variable levels of temperature.

  20. Beta-decay rate and beta-delayed neutron emission probability of improved gross theory

    Science.gov (United States)

    Koura, Hiroyuki

    2014-09-01

    A theoretical study has been carried out on the beta-decay rate and the beta-delayed neutron emission probability. The gross theory of beta decay is based on the idea of a sum rule for the beta-decay strength function, and it has succeeded in describing beta-decay half-lives of nuclei over the whole nuclear mass region. The gross theory includes not only the allowed transitions (Fermi and Gamow-Teller) but also the first-forbidden transition. In this work, some improvements are introduced: a nuclear shell correction on the nuclear level densities and nuclear deformation in the nuclear strength functions, effects that were not included in the original gross theory. The shell energy and the nuclear deformation for unmeasured nuclei are adopted from the KTUY nuclear mass formula, which is based on the spherical-basis method. Considering the properties of the integrated Fermi function, the excited-state energy region of a daughter nucleus can be roughly categorized into three regions: a highly excited region, which fully affects the delayed neutron probability; a middle region, which is estimated to contribute to the decay heat; and a region neighboring the ground state, which determines the beta-decay rate. Some results will be given in the presentation.

  1. Gas Hydrate Formation Probability Distributions: The Effect of Shear and Comparisons with Nucleation Theory.

    Science.gov (United States)

    May, Eric F; Lim, Vincent W; Metaxas, Peter J; Du, Jianwei; Stanwix, Paul L; Rowland, Darren; Johns, Michael L; Haandrikman, Gert; Crosby, Daniel; Aman, Zachary M

    2018-03-13

    Gas hydrate formation is a stochastic phenomenon of considerable significance for any risk-based approach to flow assurance in the oil and gas industry. In principle, well-established results from nucleation theory offer the prospect of predictive models for hydrate formation probability in industrial production systems. In practice, however, heuristics are relied on when estimating formation risk for a given flowline subcooling or when quantifying kinetic hydrate inhibitor (KHI) performance. Here, we present statistically significant measurements of formation probability distributions for natural gas hydrate systems under shear, which are quantitatively compared with theoretical predictions. Distributions with over 100 points were generated using low-mass, Peltier-cooled pressure cells, cycled in temperature between 40 and -5 °C at up to 2 K·min⁻¹ and analyzed with robust algorithms that automatically identify hydrate formation and initial growth rates from dynamic pressure data. The application of shear had a significant influence on the measured distributions: at 700 rpm mass-transfer limitations were minimal, as demonstrated by the kinetic growth rates observed. The formation probability distributions measured at this shear rate had mean subcoolings consistent with theoretical predictions and steel-hydrate-water contact angles of 14-26°. However, the experimental distributions were substantially wider than predicted, suggesting that phenomena acting on macroscopic length scales are responsible for much of the observed stochastic formation. Performance tests of a KHI provided new insights into how such chemicals can reduce the risk of hydrate blockage in flowlines. Our data demonstrate that the KHI not only reduces the probability of formation (by both shifting and sharpening the distribution) but also reduces hydrate growth rates by a factor of 2.

  2. Joint probabilities of noncommuting observables and the Einstein-Podolsky-Rosen question in Wiener-Siegel quantum theory

    International Nuclear Information System (INIS)

    Warnock, R.L.

    1996-02-01

    Ordinary quantum theory is a statistical theory without an underlying probability space. The Wiener-Siegel theory provides a probability space, defined in terms of the usual wave function and its "stochastic coordinates"; i.e., projections of its components onto differentials of complex Wiener processes. The usual probabilities of quantum theory emerge as measures of subspaces defined by inequalities on stochastic coordinates. Since each point α of the probability space is assigned values (or arbitrarily small intervals) of all observables, the theory gives a pseudo-classical or "hidden-variable" view in which normally forbidden concepts are allowed. Joint probabilities for values of noncommuting variables are well-defined. This paper gives a brief description of the theory, including a new generalization to incorporate spin, and reports the first concrete calculation of a joint probability for noncommuting components of spin of a single particle. Bohm's form of the Einstein-Podolsky-Rosen Gedankenexperiment is discussed along the lines of Carlen's paper at this Congress. It would seem that the "EPR Paradox" is avoided, since to each α the theory assigns opposite values for spin components of two particles in a singlet state, along any axis. In accordance with Bell's ideas, the price to pay for this attempt at greater theoretical detail is a disagreement with usual quantum predictions. The disagreement is computed and found to be large.

  3. Average bit error probability of binary coherent signaling over generalized fading channels subject to additive generalized gaussian noise

    KAUST Repository

    Soury, Hamza

    2012-06-01

    This letter considers the average bit error probability of binary coherent signaling over flat fading channels subject to additive generalized Gaussian noise. More specifically, a generic closed-form expression in terms of the Fox H function is offered for the extended generalized-K fading case. Simplifications for some special fading distributions such as generalized-K fading and Nakagami-m fading and special additive noise distributions such as Gaussian and Laplacian noise are then presented. Finally, the mathematical formalism is illustrated by some numerical examples verified by computer-based simulations for a variety of fading and additive noise parameters. © 2012 IEEE.
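
    A Monte Carlo sketch of the setting (a simulation under assumed parameters, not the paper's closed-form expressions): coherent BPSK over Rayleigh fading with Gaussian and Laplacian noise of equal power, two special cases of generalized Gaussian noise:

      import numpy as np

      rng = np.random.default_rng(4)
      n = 1_000_000
      sigma = np.sqrt(1.0 / (2.0 * 10.0))  # noise scale for a nominal 10 dB SNR

      h = rng.rayleigh(scale=np.sqrt(0.5), size=n)  # fading gains, E[h^2] = 1
      bits = rng.integers(0, 2, size=n)
      s = 2.0 * bits - 1.0

      noises = {
          "gaussian": rng.normal(0.0, sigma, size=n),
          "laplacian": rng.laplace(0.0, sigma / np.sqrt(2.0), size=n),  # equal variance
      }
      for name, w in noises.items():
          detected = (h * s + w > 0.0).astype(int)  # coherent detection, h > 0
          print(name, f"BER = {np.mean(detected != bits):.4f}")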

  4. Exact Symbol Error Probability of Square M-QAM Signaling over Generalized Fading Channels subject to Additive Generalized Gaussian Noise

    KAUST Repository

    Soury, Hamza

    2013-07-01

    This paper considers the average symbol error probability of square Quadrature Amplitude Modulation (QAM) coherent signaling over flat fading channels subject to additive generalized Gaussian noise. More specifically, a generic closed-form expression in terms of the Fox H function and the bivariate Fox H function is offered for the extended generalized-K fading case. Simplifications for some special fading distributions such as generalized-K fading, Nakagami-m fading, and Rayleigh fading and special additive noise distributions such as Gaussian and Laplacian noise are then presented. Finally, the mathematical formalism is illustrated by some numerical examples verified by computer-based simulations for a variety of fading and additive noise parameters.

  5. Contribution to the neutronic theory of random stacks (diffusion coefficient and first-flight collision probabilities) with a general theorem on collision probabilities

    International Nuclear Information System (INIS)

    Dixmier, Marc.

    1980-10-01

    A general expression for the diffusion coefficient (d.c.) of neutrons was given, with stress put on symmetries. A system of first-flight collision probabilities was built for the case of a random stack of any number of types of one- and two-zoned spherical pebbles, with an albedo at the boundaries of the elements or consideration of the interstitial medium; to that end, the bases of collision probability theory were reviewed, and a wide generalisation of the reciprocity theorem for those probabilities was demonstrated. The migration area of neutrons was expressed for any random stack of convex, 'simple' and 'regular-contact' elements, taking into account the correlations between free paths; the average cosine of re-emission of neutrons by an element was expressed for the case of a homogeneous spherical pebble in the transport approximation; the superiority of the result thus obtained over Behrens' theory, for the type of media under consideration, was established. The 'fine structure current term' of the d.c. was also expressed, and it was shown that its 'polarisation term' is negligible. Numerical applications showed that the global heterogeneity effect on the d.c. of pebble-bed reactors is comparable with that for graphite-moderated, carbon gas-cooled, natural uranium reactors. The code CARACOLE, which integrates all the results obtained here, was introduced. [fr]

  6. The golden mean in the topology of four-manifolds, in conformal field theory, in the mathematical probability theory and in Cantorian space-time

    International Nuclear Information System (INIS)

    Marek-Crnjac, L.

    2006-01-01

    In the present work we show the connections between the topology of four-manifolds, conformal field theory, mathematical probability theory, and Cantorian space-time. In all these different mathematical fields, we find the appearance of the golden mean as the main connection.

  7. Comparison between the Health Belief Model and Subjective Expected Utility Theory: predicting incontinence prevention behaviour in post-partum women.

    Science.gov (United States)

    Dolman, M; Chase, J

    1996-08-01

    A small-scale study was undertaken to test the relative predictive power of the Health Belief Model and Subjective Expected Utility Theory for the uptake of a behaviour (pelvic floor exercises) to reduce post-partum urinary incontinence in primigravida females. A structured questionnaire was used to gather data relevant to both models from a sample of antenatal and postnatal primigravida women. Questions examined the perceived probability of becoming incontinent, the perceived (dis)utility of incontinence, the perceived probability of pelvic floor exercises preventing future urinary incontinence, the costs and benefits of performing pelvic floor exercises, and sources of information and knowledge about incontinence. Multiple regression analysis focused on whether or not respondents intended to perform pelvic floor exercises and the factors influencing their decisions. Aggregated data were analysed to compare the Health Belief Model and Subjective Expected Utility Theory directly.

  8. Integrated data analysis of fusion diagnostics by means of the Bayesian probability theory

    International Nuclear Information System (INIS)

    Fischer, R.; Dinklage, A.

    2004-01-01

    Integrated data analysis (IDA) of fusion diagnostics is the combination of heterogeneous diagnostics to obtain validated physical results. Benefits from the integrated approach result from a systematic use of interdependencies; in that sense IDA optimizes the extraction of information from sets of different data. For that purpose IDA requires a systematic and formalized error analysis of all (statistical and systematic) uncertainties involved in each diagnostic. Bayesian probability theory allows for a systematic combination of all information entering the diagnostic model by considering all uncertainties of the measured data, the calibration measurements, and the physical model. Prior physics knowledge on model parameters can be included. Handling of systematic errors is provided. A central goal of the integration of redundant or complementary diagnostics is to provide information to resolve inconsistencies by exploiting interdependencies. A comparable analysis of sets of diagnostics (meta-diagnostics) is performed by combining statistical and systematic uncertainties with model parameters and model uncertainties. Diagnostics improvement, experimental optimization, and the design of meta-diagnostics will be discussed.

  9. Individual variation in social aggression and the probability of inheritance: theory and a field test.

    Science.gov (United States)

    Cant, Michael A; Llop, Justine B; Field, Jeremy

    2006-06-01

    Recent theory suggests that much of the wide variation in individual behavior that exists within cooperative animal societies can be explained by variation in the future direct component of fitness, or the probability of inheritance. Here we develop two models to explore the effect of variation in future fitness on social aggression. The models predict that rates of aggression will be highest toward the front of the queue to inherit and will be higher in larger, more productive groups. A third prediction is that, in seasonal animals, aggression will increase as the time available to inherit the breeding position runs out. We tested these predictions using a model social species, the paper wasp Polistes dominulus. We found that rates of both aggressive "displays" (aimed at individuals of lower rank) and aggressive "tests" (aimed at individuals of higher rank) decreased down the hierarchy, as predicted by our models. The only other significant factor affecting aggression rates was date, with more aggression observed later in the season, also as predicted. Variation in future fitness due to inheritance rank is the hidden factor accounting for much of the variation in aggressiveness among apparently equivalent individuals in this species.

  10. Exact closed form expressions for outage probability of GSC receivers over Rayleigh fading channel subject to self-interference

    KAUST Repository

    Nam, Sungsik

    2010-11-01

    Previous work on performance analyses of generalized selection combining (GSC) RAKE receivers based on the signal-to-noise ratio focused on the development of methodologies to derive exact closed-form expressions for various performance measures. However, some open problems related to the performance evaluation of GSC RAKE receivers remain to be solved, such as the assessment of the impact of self-interference on their performance. To have a full and exact understanding of the performance of GSC RAKE receivers, closed-form expressions for their outage probability are needed. The major difficulty in this problem is to derive certain joint statistics of ordered exponential variates. With this motivation in mind, we capitalize in this paper on some new order statistics results to derive exact closed-form expressions for the outage probability of GSC RAKE receivers subject to self-interference over independent and identically distributed Rayleigh fading channels. © 2010 IEEE.
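
    The basic quantity can be illustrated by simulation (a sketch without the self-interference term that the paper's analysis adds): GSC combines the Lc strongest of L i.i.d. Rayleigh branches, whose SNRs are exponentially distributed:

      import numpy as np

      rng = np.random.default_rng(5)
      L, Lc, mean_snr, n = 5, 3, 1.0, 200_000           # illustrative parameters
      snr = rng.exponential(mean_snr, size=(n, L))      # Rayleigh -> exponential SNR
      gsc = np.sort(snr, axis=1)[:, -Lc:].sum(axis=1)   # sum of Lc largest branches
      for threshold in (0.5, 1.0, 2.0):
          print(threshold, f"outage = {np.mean(gsc < threshold):.4f}")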

  11. Assessment of climate change using methods of mathematic statistics and theory of probability

    International Nuclear Information System (INIS)

    Trajanoska, Lidija; Kaevski, Ivancho

    2004-01-01

    In simple terms, 'climate' is the average of 'weather'. The Earth's weather system is a complex machine composed of coupled sub-systems (ocean, air, land, ice and the biosphere) between which energy is exchanged. The understanding and study of climate change does not rely only on the physics of climate change but is linked to the following question: 'How can we detect change in a system that is changing all the time under its own volition?' What is even the meaning of 'change' in such a situation? If the concept of 'change' is transformed into the concept of 'significant and long-term change', this re-phrasing allows for a definition in mathematical terms. Significant change in a system becomes a measure of how large an observed change is in terms of the variability one would see under 'normal' conditions. An example is the analysis of yearly air temperature and precipitation, as in this paper. One set of data is selected as representing the 'before' case and another set as the 'after' case, and the averages in the two cases are compared. These comparisons take the form of 'hypothesis tests', in which one tests whether the hypothesis that there has been no change can be rejected. Both parametric and nonparametric methods of mathematical statistics are used. The most indicative variables showing global change are the average, the standard deviation, and the probability distribution function of the examined time series. The examined meteorological series are treated as random processes so that mathematical statistics can be applied. (Author)
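
    The 'hypothesis test' framing amounts to a before/after two-sample test; a sketch with synthetic yearly temperatures (the station data are not reproduced here):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(6)
      before = rng.normal(12.0, 0.8, size=30)  # hypothetical yearly means, deg C
      after = rng.normal(12.6, 0.8, size=30)
      t, p = stats.ttest_ind(before, after, equal_var=False)  # Welch's t-test
      print(f"t = {t:.2f}, p = {p:.4f}")  # small p: reject 'no change'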

  12. Analysis of femtosecond pump-probe photoelectron-photoion coincidence measurements applying Bayesian probability theory

    Science.gov (United States)

    Rumetshofer, M.; Heim, P.; Thaler, B.; Ernst, W. E.; Koch, M.; von der Linden, W.

    2018-06-01

    Ultrafast dynamical processes in photoexcited molecules can be observed with pump-probe measurements, in which information about the dynamics is obtained from the transient signal associated with the excited state. Background signals provoked by pump and/or probe pulses alone often obscure these excited-state signals. Simple subtraction of pump-only and/or probe-only measurements from the pump-probe measurement, as commonly applied, results in a degradation of the signal-to-noise ratio and, in the case of coincidence detection, the danger of overrated background subtraction. Coincidence measurements additionally suffer from false coincidences, requiring long data-acquisition times to keep erroneous signals at an acceptable level. Here we present a probabilistic approach based on Bayesian probability theory that overcomes these problems. For a pump-probe experiment with photoelectron-photoion coincidence detection, we reconstruct the interesting excited-state spectrum from pump-probe and pump-only measurements. This approach allows us to treat background and false coincidences consistently and on the same footing. We demonstrate that the Bayesian formalism has the following advantages over simple signal subtraction: (i) the signal-to-noise ratio is significantly increased, (ii) the pump-only contribution is not overestimated, (iii) false coincidences are excluded, (iv) prior knowledge, such as positivity, is consistently incorporated, (v) confidence intervals are provided for the reconstructed spectrum, and (vi) it is applicable to any experimental situation and noise statistics. Most importantly, by accounting for false coincidences, the Bayesian approach allows us to run experiments at higher ionization rates, resulting in a significant reduction of data acquisition times. The probabilistic approach is thoroughly scrutinized by challenging mock data. The application to pump-probe coincidence measurements on acetone molecules enables quantitative interpretations
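
    A minimal sketch of the core idea, assuming Poisson counting statistics and flat priors (the authors' model, which also handles false coincidences, is far richer): infer the signal rate jointly from a pump-probe count and a pump-only count instead of subtracting them:

      import numpy as np

      n_pump_probe, n_pump_only = 40, 35  # hypothetical counts
      s = np.linspace(0.0, 40.0, 801)     # signal-rate grid
      b = np.linspace(0.0, 60.0, 601)     # background-rate grid
      S, B = np.meshgrid(s, b, indexing="ij")

      # log-likelihood of Poisson(n_pp | s+b) * Poisson(n_po | b), constants dropped
      loglike = (n_pump_probe * np.log(S + B + 1e-12) - (S + B)
                 + n_pump_only * np.log(B + 1e-12) - B)
      post = np.exp(loglike - loglike.max()).sum(axis=1)  # marginalize background
      ds = s[1] - s[0]
      post /= post.sum() * ds
      print(f"posterior mean signal rate: {(s * post).sum() * ds:.1f}")  # never negative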

  13. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  14. The Use of Probability Theory as a Basis for Planning and Controlling Overhead Costs in Education and Industry. Final Report.

    Science.gov (United States)

    Vinson, R. B.

    In this report, the author suggests changes in the treatment of overhead costs by hypothesizing that "the effectiveness of standard costing in planning and controlling overhead costs can be increased through the use of probability theory and associated statistical techniques." To test the hypothesis, the author (1) presents an overview of the…

  15. Probability theory and statistical applications a profound treatise for self-study

    CERN Document Server

    Zörnig, Peter

    2016-01-01

    This accessible and easy-to-read book provides many examples to illustrate diverse topics in probability and statistics, from initial concepts up to advanced calculations. Special attention is devoted, e.g., to independence of events, inequalities in probability, and functions of random variables. The book is directed to students of mathematics, statistics, engineering, and other quantitative sciences.

  16. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle, and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber–Shiu functions, and dependence.
  17. Item response theory at subject- and group-level

    NARCIS (Netherlands)

    Tobi, Hilde

    1990-01-01

    This paper reviews the literature about item response models for the subject level and aggregated level (group level). Group-level item response models (IRMs) are used in the United States in large-scale assessment programs such as the National Assessment of Educational Progress and the California

  18. Probability estimation with machine learning methods for dichotomous and multicategory outcome: theory.

    Science.gov (United States)

    Kruppa, Jochen; Liu, Yufeng; Biau, Gérard; Kohler, Michael; König, Inke R; Malley, James D; Ziegler, Andreas

    2014-07-01

    Probability estimation for binary and multicategory outcomes using logistic and multinomial logistic regression has a long-standing tradition in biostatistics. However, biases may occur if the model is misspecified. In contrast, outcome probabilities for individuals can be estimated consistently with machine learning approaches, including k-nearest neighbors (k-NN), bagged nearest neighbors (b-NN), random forests (RF), and support vector machines (SVM). Because machine learning methods are rarely used by applied biostatisticians, the primary goal of this paper is to explain the concept of probability estimation with these methods and to summarize recent theoretical findings. Probability estimation in k-NN, b-NN, and RF can be embedded into the class of nonparametric regression learning machines; therefore, we start with the construction of nonparametric regression estimates and review results on consistency and rates of convergence. In SVMs, outcome probabilities for individuals are estimated consistently by repeatedly solving classification problems. For SVMs, we first review the classification problem and then dichotomous probability estimation. Next we extend the algorithms for estimating probabilities using k-NN, b-NN, and RF to multicategory outcomes and discuss approaches for the multicategory probability estimation problem using SVM. In simulation studies for dichotomous and multicategory dependent variables we demonstrate the general validity of the machine learning methods and compare them with logistic regression. However, each method fails in at least one simulation scenario. We conclude with a discussion of the failures and give recommendations for selecting and tuning the methods. Applications to real data and example code are provided in a companion article (doi:10.1002/bimj.201300077). © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
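
    A sketch of the kind of comparison discussed, with scikit-learn standing in for the reviewed learners on synthetic data (illustrative settings only):

      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import brier_score_loss
      from sklearn.model_selection import train_test_split

      X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
      Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
      for model in (LogisticRegression(max_iter=1000),
                    RandomForestClassifier(n_estimators=200, random_state=0)):
          prob = model.fit(Xtr, ytr).predict_proba(Xte)[:, 1]  # estimated P(y = 1)
          print(type(model).__name__, f"Brier score: {brier_score_loss(yte, prob):.3f}")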

  19. Probability theory for 3-layer remote sensing radiative transfer model: univariate case.

    Science.gov (United States)

    Ben-David, Avishai; Davidson, Charles E

    2012-04-23

    A probability model for a 3-layer radiative transfer model (foreground layer, cloud layer, background layer, and an external source at the end of line of sight) has been developed. The 3-layer model is fundamentally important as the primary physical model in passive infrared remote sensing. The probability model is described by the Johnson family of distributions that are used as a fit for theoretically computed moments of the radiative transfer model. From the Johnson family we use the SU distribution that can address a wide range of skewness and kurtosis values (in addition to addressing the first two moments, mean and variance). In the limit, SU can also describe lognormal and normal distributions. With the probability model one can evaluate the potential for detecting a target (vapor cloud layer), the probability of observing thermal contrast, and evaluate performance (receiver operating characteristics curves) in clutter-noise limited scenarios. This is (to our knowledge) the first probability model for the 3-layer remote sensing geometry that treats all parameters as random variables and includes higher-order statistics. © 2012 Optical Society of America
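
    The moment-matching step can be sketched as a numerical search for Johnson SU parameters reproducing four target moments (targets and starting point are invented; the paper's radiative transfer moments are not reproduced):

      import numpy as np
      from scipy import optimize, stats

      target = np.array([5.0, 1.5, -0.4, 4.2])  # mean, std, skewness, kurtosis

      def moment_gap(params):
          a, b, loc, scale = params
          if b <= 0.0 or scale <= 0.0:
              return 1e6  # keep the search inside the valid parameter region
          m, v, s, k = stats.johnsonsu(a, b, loc=loc, scale=scale).stats(moments="mvsk")
          fitted = np.array([m, np.sqrt(v), s, k + 3.0])  # scipy's k is excess kurtosis
          return float(np.sum((fitted - target) ** 2))

      res = optimize.minimize(moment_gap, x0=[0.5, 2.0, 4.0, 1.0], method="Nelder-Mead")
      print("fitted (a, b, loc, scale):", np.round(res.x, 3))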

  20. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  1. Probability theory for 3-layer remote sensing in ideal gas law environment.

    Science.gov (United States)

    Ben-David, Avishai; Davidson, Charles E

    2013-08-26

    We extend the probability model for 3-layer radiative transfer [Opt. Express 20, 10004 (2012)] to ideal gas conditions where a correlation exists between transmission and temperature of each of the 3 layers. The effect on the probability density function for the at-sensor radiances is surprisingly small, and thus the added complexity of addressing the correlation can be avoided. The small overall effect is due to (a) small perturbations by the correlation on variance population parameters and (b) cancellation of perturbation terms that appear with opposite signs in the model moment expressions.

  2. Some highlights on the work in probability theory in India during ...

    Indian Academy of Sciences (India)

    processes, growth of cancer cells, queuing theory ... context are: • J Medhi ... person dynamic game, the role of the Skorohod ... Neumann and oblique derivative boundary value ... belief is wrong. ... work has been done under the leadership of.

  3. Bias and spread in extreme value theory measurements of probability of error

    Science.gov (United States)

    Smith, J. G.

    1972-01-01

    Extreme value theory is examined to explain the cause of the bias and spread in performance of communications systems characterized by low bit rates and high data reliability requirements, for cases in which underlying noise is Gaussian or perturbed Gaussian. Experimental verification is presented and procedures that minimize these effects are suggested. Even under these conditions, however, extreme value theory test results are not particularly more significant than bit error rate tests.

  4. Quantum-correlation breaking channels, quantum conditional probability and Perron-Frobenius theory

    Science.gov (United States)

    Chruściński, Dariusz

    2013-03-01

    Using the quantum analog of conditional probability and classical Bayes theorem we discuss some aspects of particular entanglement breaking channels: quantum-classical and classical-classical channels. Applying the quantum analog of Perron-Frobenius theorem we generalize the recent result of Korbicz et al. (2012) [8] on full and spectrum broadcasting from quantum-classical channels to arbitrary quantum channels.

  5. Quantum-correlation breaking channels, quantum conditional probability and Perron–Frobenius theory

    International Nuclear Information System (INIS)

    Chruściński, Dariusz

    2013-01-01

    Using the quantum analog of conditional probability and classical Bayes theorem we discuss some aspects of particular entanglement breaking channels: quantum–classical and classical–classical channels. Applying the quantum analog of Perron–Frobenius theorem we generalize the recent result of Korbicz et al. (2012) [8] on full and spectrum broadcasting from quantum–classical channels to arbitrary quantum channels.

  6. Biedenharn transformation in the theory of H ion. Probabilities of radiative transitions

    International Nuclear Information System (INIS)

    Zapryagaev, S.A.

    1987-01-01

    The solution of the Dirac equation in the Coulomb field is investigated by means of an anti-unitary transformation reducing the set of relativistic equations to a non-relativistic equation. The obtained solutions are used to calculate probabilities of radiative transitions between fine-structure and hyperfine-structure levels of the H ion with an arbitrary nuclear charge.

  7. Theory and analysis of accuracy for the method of characteristics direction probabilities with boundary averaging

    International Nuclear Information System (INIS)

    Liu, Zhouyu; Collins, Benjamin; Kochunas, Brendan; Downar, Thomas; Xu, Yunlin; Wu, Hongchun

    2015-01-01

    Highlights: • The CDP combines the benefits of the CPM’s efficiency and the MOC’s flexibility. • Boundary averaging reduces the computational effort at the cost of only minor accuracy. • An analysis model is used to justify the choice of an optimal averaging strategy. • Numerical results show the performance and accuracy. - Abstract: The method of characteristic direction probabilities (CDP) combines the benefits of the collision probability method (CPM) and the method of characteristics (MOC) for the solution of the integral form of the Boltzmann Transport Equation. By coupling only the fine regions traversed by the characteristic rays in a particular direction, the computational effort required to calculate the probability matrices and to solve the matrix system is considerably reduced compared to the CPM. Furthermore, boundary averaging is performed to reduce the storage and computation, but the capability of dealing with complicated geometries is preserved, since the same ray tracing information is used as in MOC. An analysis model for the outgoing angular flux is used to analyze a variety of outgoing angular flux averaging methods for the boundary and to justify the choice of an optimal averaging strategy. The boundary-averaged CDP method was then implemented in the Michigan PArallel Characteristic based Transport (MPACT) code to perform 2-D and 3-D transport calculations. The numerical results are given for different cases to show the effect of averaging on the outgoing angular flux, region scalar flux and the eigenvalue. Comparison of the results with the case with no averaging demonstrates that an angular dependent averaging strategy is possible for the CDP to improve its computational performance without compromising the achievable accuracy.

  8. Classical many-body theory with retarded interactions: Dynamical irreversibility and determinism without probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Zakharov, A.Yu., E-mail: Anatoly.Zakharov@novsu.ru; Zakharov, M.A., E-mail: ma_zakharov@list.ru

    2016-01-28

    The exact equations of motion for the microscopic density of a classical many-body system with retarded inter-particle interactions are derived. It is shown that retardation of the interactions leads to irreversible behavior of many-body systems. - Highlights: • A new form of the equations of motion of a classical many-body system is proposed. • Interaction retardation as one of the mechanisms of many-body system irreversibility. • Irreversibility and determinism without probabilities. • A possible way to a microscopic foundation of thermodynamics.

  9. Modeling self on others: An import theory of subjectivity and selfhood.

    Science.gov (United States)

    Prinz, Wolfgang

    2017-03-01

    This paper outlines an Import Theory of subjectivity and selfhood. Import theory claims that subjectivity is initially perceived as a key feature of other minds before it becomes imported from other minds to one's own mind, whereby it lays the ground for mental selfhood. Import theory builds on perception-production matching, which in turn draws on both representational mechanisms and social practices. Representational mechanisms rely on common coding of perception and production. Social practices rely on action mirroring in dyadic interactions. The interplay between mechanisms and practices makes it possible to model self on others. Individuals become intentional agents in virtue of perceiving others mirroring themselves. The outline of the theory is preceded by an introductory section that locates import theory in the broader context of competing approaches, and it is followed by a concluding section that assesses import theory in terms of empirical evidence and explanatory power. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Strong lensing probability in TeVeS (tensor–vector–scalar) theory

    International Nuclear Information System (INIS)

    Chen Daming

    2008-01-01

    We recalculate the strong lensing probability as a function of the image separation in TeVeS (tensor–vector–scalar) cosmology, which is a relativistic version of MOND (MOdified Newtonian Dynamics). The lens is modeled by the Hernquist profile. We assume an open cosmology with Ω_b = 0.04 and Ω_Λ = 0.5 and three different kinds of interpolating functions. Two different galaxy stellar mass functions (GSMF) are adopted: PHJ (Panter, Heavens and Jimenez 2004 Mon. Not. R. Astron. Soc. 355 764) determined from SDSS data release 1 and Fontana (Fontana et al 2006 Astron. Astrophys. 459 745) from the GOODS-MUSIC catalog. We compare our results with both the predicted probabilities for lenses from singular isothermal sphere galaxy halos in LCDM (Lambda cold dark matter) with a Schechter-fit velocity function, and the observational results for the well defined combined sample of the Cosmic Lens All-Sky Survey (CLASS) and Jodrell Bank/Very Large Array Astrometric Survey (JVAS). It turns out that the interpolating function μ(x) = x/(1+x) combined with the Fontana GSMF matches the results from CLASS/JVAS quite well.

  11. Strong lensing probability in TeVeS (tensor-vector-scalar) theory

    Science.gov (United States)

    Chen, Da-Ming

    2008-01-01

    We recalculate the strong lensing probability as a function of the image separation in TeVeS (tensor-vector-scalar) cosmology, which is a relativistic version of MOND (MOdified Newtonian Dynamics). The lens is modeled by the Hernquist profile. We assume an open cosmology with Ωb = 0.04 and ΩΛ = 0.5 and three different kinds of interpolating functions. Two different galaxy stellar mass functions (GSMF) are adopted: PHJ (Panter, Heavens and Jimenez 2004 Mon. Not. R. Astron. Soc. 355 764) determined from SDSS data release 1 and Fontana (Fontana et al 2006 Astron. Astrophys. 459 745) from GOODS-MUSIC catalog. We compare our results with both the predicted probabilities for lenses from singular isothermal sphere galaxy halos in LCDM (Lambda cold dark matter) with a Schechter-fit velocity function, and the observational results for the well defined combined sample of the Cosmic Lens All-Sky Survey (CLASS) and Jodrell Bank/Very Large Array Astrometric Survey (JVAS). It turns out that the interpolating function μ(x) = x/(1+x) combined with Fontana GSMF matches the results from CLASS/JVAS quite well.

  12. Computational modeling of the probability of destructions in total joint endoprosthesis ceramic heads using Weibull's theory

    Czech Academy of Sciences Publication Activity Database

    Janíček, P.; Fuis, Vladimír; Málek, M.

    2010-01-01

    Roč. 14, č. 4 (2010), s. 42-51 ISSN 1335-2393 Institutional research plan: CEZ:AV0Z20760514 Keywords: computational modeling * ceramic head * in vivo destructions * hip joint endoprosthesis * probability of rupture Subject RIV: BO - Biophysics

  13. Probability distribution of distance in a uniform ellipsoid: Theory and applications to physics

    International Nuclear Information System (INIS)

    Parry, Michelle; Fischbach, Ephraim

    2000-01-01

    A number of authors have previously found the probability P_n(r) that two points uniformly distributed in an n-dimensional sphere are separated by a distance r. This result greatly facilitates the calculation of self-energies of spherically symmetric matter distributions interacting by means of an arbitrary radially symmetric two-body potential. We present here the analogous results for P_2(r;ε) and P_3(r;ε), which respectively describe an ellipse and an ellipsoid whose major and minor axes are 2a and 2b. It is shown that for ε = (1 - b²/a²)^(1/2) ≤ 1, P_2(r;ε) and P_3(r;ε) can be obtained as an expansion in powers of ε, and our results are valid through order ε⁴. As an application of these results we calculate the Coulomb energy of an ellipsoidal nucleus, and compare our result to an earlier result quoted in the literature. (c) 2000 American Institute of Physics
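
    A Monte Carlo companion to the simplest case (the sphere, i.e. ε = 0), checking the distance distribution between two points uniform in the unit ball; this is a sketch of the underlying probability, not the authors' ε-expansion:

        import numpy as np

        rng = np.random.default_rng(0)

        def uniform_ball(n, dim=3):
            x = rng.standard_normal((n, dim))
            x /= np.linalg.norm(x, axis=1, keepdims=True)      # uniform directions
            return x * rng.uniform(0, 1, (n, 1)) ** (1 / dim)  # radii ~ U^(1/3)

        r = np.linalg.norm(uniform_ball(10**6) - uniform_ball(10**6), axis=1)
        print("mean distance:", r.mean())  # classical exact value: 36/35 ≈ 1.0286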

  14. From axiomatics of quantum probability to modelling geological uncertainty and management of intelligent hydrocarbon reservoirs with the theory of open quantum systems

    Science.gov (United States)

    Lozada Aguilar, Miguel Ángel; Khrennikov, Andrei; Oleschko, Klaudia

    2018-04-01

    As was recently shown by the authors, quantum probability theory can be used for the modelling of the process of decision-making (e.g. probabilistic risk analysis) for macroscopic geophysical structures such as hydrocarbon reservoirs. This approach can be considered as a geophysical realization of Hilbert's programme on axiomatization of statistical models in physics (the famous sixth Hilbert problem). In this conceptual paper, we continue development of this approach to decision-making under uncertainty which is generated by complexity, variability, heterogeneity, anisotropy, as well as the restrictions to accessibility of subsurface structures. The belief state of a geological expert about the potential of exploring a hydrocarbon reservoir is continuously updated by outputs of measurements, and selection of mathematical models and scales of numerical simulation. These outputs can be treated as signals from the information environment E. The dynamics of the belief state can be modelled with the aid of the theory of open quantum systems: a quantum state (representing uncertainty in beliefs) is dynamically modified through coupling with E; stabilization to a steady state determines a decision strategy. In this paper, the process of decision-making about hydrocarbon reservoirs (e.g. `explore or not?'; `open new well or not?'; `contaminated by water or not?'; `double or triple porosity medium?') is modelled by using the Gorini-Kossakowski-Sudarshan-Lindblad equation. In our model, this equation describes the evolution of experts' predictions about a geophysical structure. We proceed with the information approach to quantum theory and the subjective interpretation of quantum probabilities (due to quantum Bayesianism). This article is part of the theme issue `Hilbert's sixth problem'.

  15. From axiomatics of quantum probability to modelling geological uncertainty and management of intelligent hydrocarbon reservoirs with the theory of open quantum systems.

    Science.gov (United States)

    Lozada Aguilar, Miguel Ángel; Khrennikov, Andrei; Oleschko, Klaudia

    2018-04-28

    As was recently shown by the authors, quantum probability theory can be used for the modelling of the process of decision-making (e.g. probabilistic risk analysis) for macroscopic geophysical structures such as hydrocarbon reservoirs. This approach can be considered as a geophysical realization of Hilbert's programme on axiomatization of statistical models in physics (the famous sixth Hilbert problem). In this conceptual paper, we continue development of this approach to decision-making under uncertainty which is generated by complexity, variability, heterogeneity, anisotropy, as well as the restrictions to accessibility of subsurface structures. The belief state of a geological expert about the potential of exploring a hydrocarbon reservoir is continuously updated by outputs of measurements, and selection of mathematical models and scales of numerical simulation. These outputs can be treated as signals from the information environment E. The dynamics of the belief state can be modelled with the aid of the theory of open quantum systems: a quantum state (representing uncertainty in beliefs) is dynamically modified through coupling with E; stabilization to a steady state determines a decision strategy. In this paper, the process of decision-making about hydrocarbon reservoirs (e.g. 'explore or not?'; 'open new well or not?'; 'contaminated by water or not?'; 'double or triple porosity medium?') is modelled by using the Gorini-Kossakowski-Sudarshan-Lindblad equation. In our model, this equation describes the evolution of experts' predictions about a geophysical structure. We proceed with the information approach to quantum theory and the subjective interpretation of quantum probabilities (due to quantum Bayesianism). This article is part of the theme issue 'Hilbert's sixth problem'. © 2018 The Author(s).
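
    A rough numerical illustration of the GKSL (Lindblad) dynamics invoked above: a single-qubit toy model with an invented Hamiltonian and jump operator (not the paper's reservoir model), showing a "belief state" stabilizing to a steady state:

        import numpy as np

        H = 0.5 * np.array([[1, 0], [0, -1]], dtype=complex)          # toy Hamiltonian
        L = np.sqrt(0.2) * np.array([[0, 1], [0, 0]], dtype=complex)  # toy jump operator

        def gksl_rhs(rho):
            # drho/dt = -i[H, rho] + L rho L+ - (1/2){L+L, rho}
            unitary = -1j * (H @ rho - rho @ H)
            dissip = (L @ rho @ L.conj().T
                      - 0.5 * (L.conj().T @ L @ rho + rho @ L.conj().T @ L))
            return unitary + dissip

        rho = np.full((2, 2), 0.5, dtype=complex)  # maximally uncertain belief state
        for _ in range(10000):                     # crude Euler integration, dt = 0.01
            rho += 0.01 * gksl_rhs(rho)
        print(np.round(rho.real, 3))               # approaches the steady state diag(1, 0)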

  16. Subjectivity

    Directory of Open Access Journals (Sweden)

    Jesús Vega Encabo

    2015-11-01

    Full Text Available In this paper, I claim that subjectivity is a way of being that is constituted through a set of practices in which the self is subject to the dangers of fictionalizing and plotting her life and self-image. I examine some ways of becoming subject through narratives and through theatrical performance before others. Through these practices, a real and active subjectivity is revealed, capable of self-knowledge and self-transformation. 

  17. Increased probability of repetitive spinal motoneuron activation by transcranial magnetic stimulation after muscle fatigue in healthy subjects

    DEFF Research Database (Denmark)

    Andersen, Birgit; Felding, Ulrik Ascanius; Krarup, Christian

    2012-01-01

    Triple stimulation technique (TST) has previously shown that transcranial magnetic stimulation (TMS) fails to activate a proportion of spinal motoneurons (MNs) during motor fatigue. The TST response depression without attenuation of the conventional motor evoked potential suggested increased...... probability of repetitive spinal MN activation during exercise even if some MNs failed to discharge by the brain stimulus. Here we used a modified TST (Quadruple stimulation; QuadS and Quintuple stimulation; QuintS) to examine the influence of fatiguing exercise on second and third MN discharges after......, reflecting that a greater proportion of spinal MNs were activated 2 or 3 times by the transcranial stimulus. The size of QuadS responses did not return to pre-contraction levels during 10 min observation time indicating long-lasting increase in excitatory input to spinal MNs. In addition, the post...

  18. Optimal Volume for Concert Halls Based on Ando’s Subjective Preference and Barron Revised Theories

    Directory of Open Access Journals (Sweden)

    Salvador Cerdá

    2014-03-01

    Full Text Available The Ando-Beranek model, a linear version of Ando's subjective preference theory obtained by the authors in a recent work, was combined with Barron's revised theory. An optimal volume region for each reverberation time was obtained for classical music in symphony orchestra concert halls. The obtained relation was tested with good agreement against the top rated halls reported by Beranek and other halls with reported anomalies.

  19. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  20. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.

  1. Inclusion probability for DNA mixtures is a subjective one-sided match statistic unrelated to identification information.

    Science.gov (United States)

    Perlin, Mark William

    2015-01-01

    DNA mixtures of two or more people are a common type of forensic crime scene evidence. A match statistic that connects the evidence to a criminal defendant is usually needed for court. Jurors rely on this strength of match to help decide guilt or innocence. However, the reliability of unsophisticated match statistics for DNA mixtures has been questioned. The most prevalent match statistic for DNA mixtures is the combined probability of inclusion (CPI), used by crime labs for over 15 years. When testing 13 short tandem repeat (STR) genetic loci, the CPI(-1) value is typically around a million, regardless of DNA mixture composition. However, actual identification information, as measured by a likelihood ratio (LR), spans a much broader range. This study examined probability of inclusion (PI) mixture statistics for 517 locus experiments drawn from 16 reported cases and compared them with LR locus information calculated independently on the same data. The log(PI(-1)) values were examined and compared with corresponding log(LR) values. The LR and CPI methods were compared in case examples of false inclusion, false exclusion, a homicide, and criminal justice outcomes. Statistical analysis of crime laboratory STR data shows that inclusion match statistics exhibit a truncated normal distribution having zero center, with little correlation to actual identification information. By the law of large numbers (LLN), CPI(-1) increases with the number of tested genetic loci, regardless of DNA mixture composition or match information. These statistical findings explain why CPI is relatively constant, with implications for DNA policy, criminal justice, cost of crime, and crime prevention. Forensic crime laboratories have generated CPI statistics on hundreds of thousands of DNA mixture evidence items. However, this commonly used match statistic behaves like a random generator of inclusionary values, following the LLN rather than measuring identification information. A quantitative

  2. Inclusion probability for DNA mixtures is a subjective one-sided match statistic unrelated to identification information

    Directory of Open Access Journals (Sweden)

    Mark William Perlin

    2015-01-01

    Full Text Available Background: DNA mixtures of two or more people are a common type of forensic crime scene evidence. A match statistic that connects the evidence to a criminal defendant is usually needed for court. Jurors rely on this strength of match to help decide guilt or innocence. However, the reliability of unsophisticated match statistics for DNA mixtures has been questioned. Materials and Methods: The most prevalent match statistic for DNA mixtures is the combined probability of inclusion (CPI), used by crime labs for over 15 years. When testing 13 short tandem repeat (STR) genetic loci, the CPI(-1) value is typically around a million, regardless of DNA mixture composition. However, actual identification information, as measured by a likelihood ratio (LR), spans a much broader range. This study examined probability of inclusion (PI) mixture statistics for 517 locus experiments drawn from 16 reported cases and compared them with LR locus information calculated independently on the same data. The log(PI(-1)) values were examined and compared with corresponding log(LR) values. Results: The LR and CPI methods were compared in case examples of false inclusion, false exclusion, a homicide, and criminal justice outcomes. Statistical analysis of crime laboratory STR data shows that inclusion match statistics exhibit a truncated normal distribution having zero center, with little correlation to actual identification information. By the law of large numbers (LLN), CPI(-1) increases with the number of tested genetic loci, regardless of DNA mixture composition or match information. These statistical findings explain why CPI is relatively constant, with implications for DNA policy, criminal justice, cost of crime, and crime prevention. Conclusions: Forensic crime laboratories have generated CPI statistics on hundreds of thousands of DNA mixture evidence items. However, this commonly used match statistic behaves like a random generator of inclusionary values, following the LLN
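
    A toy illustration of the LLN effect both records describe (the per-locus inclusion probabilities below are invented, not casework data): log(CPI^-1) grows roughly linearly with the number of tested loci, whatever each locus actually contributes:

        import numpy as np

        rng = np.random.default_rng(1)
        for n_loci in (5, 13, 20):
            pi = rng.uniform(0.2, 0.6, n_loci)   # per-locus inclusion probabilities
            log_cpi_inv = -np.log10(pi).sum()    # log10 of the combined CPI^-1
            print(f"{n_loci:2d} loci -> log10(CPI^-1) ≈ {log_cpi_inv:.1f}")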

  3. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  4. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  5. Improving subject recruitment, retention, and participation in research through Peplau's theory of interpersonal relations.

    Science.gov (United States)

    Penckofer, Sue; Byrn, Mary; Mumby, Patricia; Ferrans, Carol Estwing

    2011-04-01

    Recruitment and retention of persons participating in research is one of the most significant challenges faced by investigators. Although incentives are often used to improve recruitment and retention, evidence suggests that the relationship of the patient to study personnel may be the single, most important factor in subject accrual and continued participation. Peplau's theory of interpersonal relations provides a framework to study the nurse-patient relationship during the research process. In this paper the authors provide a brief summary of research strategies that have been used for the recruitment and retention of subjects and an overview of Peplau's theory of interpersonal relations including its use in research studies. In addition, a discussion of how this theory was used for the successful recruitment and retention of women with type 2 diabetes who participated in a clinical trial using a nurse-delivered psychoeducational intervention for depression is addressed.

  6. the theory of probability

    Indian Academy of Sciences (India)

    the event A) such that the frequencies v get closer "generally speaking" to p as the number of ... Before showing how this is done, we must enumerate .... sequences is obviously equal to the number of combinations of n things taken k at a time, ...

  7. the theory of probability

    Indian Academy of Sciences (India)

    the event A demonstrate the absence of any law connecting the complex of conditions S and the event A? For example, let it be established that lamps of a specific type, manufactured in a certain factory (condition S) sometimes continue to burn more than 2,000 hours (event A), but some- times burn out and become useless ...

  8. What is Probability Theory?

    Indian Academy of Sciences (India)

    IAS Admin

    statistics at all levels. .... P(Ai) for k < ∞ and A1,A2, ··· ,Ak ∈ F and Ai ∩ Aj = ∅ for i = j. Next, it is reasonable to require that F be closed .... roll of dice, card games such as Bridge. ..... ing data (i.e., generating random variables) according to ...

  9. Subjective socioeconomic status causes aggression: A test of the theory of social deprivation.

    Science.gov (United States)

    Greitemeyer, Tobias; Sagioglou, Christina

    2016-08-01

    Seven studies (overall N = 3690) addressed the relation between people's subjective socioeconomic status (SES) and their aggression levels. Based on relative deprivation theory, we proposed that people low in subjective SES would feel at a disadvantage, which in turn would elicit aggressive responses. In 3 correlational studies, subjective SES was negatively related to trait aggression. Importantly, this relation held when controlling for measures that are related to one or both of subjective SES and trait aggression, such as the dark tetrad and the Big Five. Four experimental studies then demonstrated that participants in a low status condition were more aggressive than were participants in a high status condition. Compared with a medium-SES condition, it was participants of low subjective SES who were more aggressive, rather than participants of high subjective SES being less aggressive. Moreover, low SES increased aggressive behavior toward targets that were the source of participants' experience of disadvantage, but also toward neutral targets. Sequential mediation analyses suggest that the experience of disadvantage underlies the effect of subjective SES on aggressive affect, whereas aggressive affect was the proximal determinant of aggressive behavior. Taken together, the present research found comprehensive support for key predictions derived from the theory of relative deprivation of how the perception of low SES is related to a person's judgments, emotional reactions, and actions. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  10. Facilitating Group Decision-Making: Facilitator's Subjective Theories on Group Coordination

    Directory of Open Access Journals (Sweden)

    Michaela Kolbe

    2008-10-01

    Full Text Available A key feature of group facilitation is motivating and coordinating people to perform their joint work. This paper focuses on group coordination which is a prerequisite to group effectiveness, especially in complex tasks. Decision-making in groups is a complex task that consequently needs to be coordinated by explicit rather than implicit coordination mechanisms. Based on the embedded definition that explicit coordination does not just happen but is purposely executed by individuals, we argue that individual coordination intentions and mechanisms should be taken into account. Thus far, the subjective perspective of coordination has been neglected in coordination theory, which is understandable given the difficulties in defining and measuring subjective aspects of group facilitation. We therefore conducted focused interviews with eight experts who either worked as senior managers or as experienced group facilitators and analysed their approaches to group coordination using methods of content analysis. Results show that these experts possess sophisticated mental representations of their coordination behaviour. These subjective coordination theories can be organised in terms of coordination schemes in which coordination-releasing situations are facilitated by special coordination mechanisms that, in turn, lead to the perception of specific consequences. We discuss the importance of these subjective coordination theories for effectively facilitating group decision-making and minimising process losses. URN: urn:nbn:de:0114-fqs0901287

  11. Evaluation of probability and hazard in nuclear energy

    International Nuclear Information System (INIS)

    Novikov, V.Ya.; Romanov, N.L.

    1979-01-01

    Various methods for evaluating accident probability at nuclear power plants (NPPs) are proposed, because purely statistical evaluation of NPP safety is unreliable. The conception of subjective probability for the quantitative analysis of safety and hazard is described. Interpretation of probability as the actual belief of an expert is assumed as the basis of the conception. It is suggested to study event uncertainty in the framework of subjective probability theory, which not only permits but demands that expert opinions be taken into account when evaluating probabilities. These subjective expert evaluations affect to a certain extent the calculation of the usual mathematical event probability.

  12. End-to-end probability for an interacting center vortex world line in Yang-Mills theory

    International Nuclear Information System (INIS)

    Teixeira, Bruno F.I.; Lemos, Andre L.L. de; Oxman, Luis E.

    2011-01-01

    Full text: The understanding of quark confinement is a very important open problem in Yang-Mills theory. In this regard, nontrivial topological defects are expected to play a relevant role in achieving a solution. Here we are interested in how to deal with these structures, relying on the Cho-Faddeev-Niemi decomposition and the possibility it offers to describe defects in terms of a local color frame. In particular, the path integral for a single center vortex is a fundamental object for handling the ensemble integration. As is well known, in three dimensions center vortices are string-like and the associated physics is closely related with that of polymers. Using recent techniques developed in the latter context, we present in this work a detailed derivation of the equation for the end-to-end probability for a center vortex world line, including the effects of interactions. Its solution can be associated with a Green function that depends on the position and orientation at the boundaries, where monopole-like instantons are placed. In the limit of semiflexible polymers, an expansion keeping only the lower angular momenta for the final orientation leads to a reduced Green function for a complex vortex field minimally coupled to the dual Yang-Mills fields. This constitutes a key ingredient in proposing an effective model for correlated monopoles, center vortices and the dual fields. (author)

  13. Applying Probability Theory for the Quality Assessment of a Wildfire Spread Prediction Framework Based on Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Andrés Cencerrado

    2013-01-01

    Full Text Available This work presents a framework for assessing how the existing constraints at the time of attending an ongoing forest fire affect simulation results, both in terms of quality (accuracy obtained) and the time needed to make a decision. In the wildfire spread simulation and prediction area, it is essential to properly exploit the computational power offered by new computing advances. For this purpose, we rely on a two-stage prediction process to enhance the quality of traditional predictions, taking advantage of parallel computing. This strategy is based on an adjustment stage which is carried out by a well-known evolutionary technique: Genetic Algorithms. The core of this framework is evaluated according to the probability theory principles. Thus, a strong statistical study is presented and oriented towards the characterization of such an adjustment technique in order to help the operation managers deal with the two aspects previously mentioned: time and quality. The experimental work in this paper is based on a region in Spain which is one of the most prone to forest fires: El Cap de Creus.
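
    A minimal genetic-algorithm sketch of the adjustment stage: calibrate one unknown simulation parameter so predicted spread matches an observed value. The "simulator", its parameter, and the observed value are made-up stand-ins, not the framework's actual model:

        import random

        random.seed(0)
        OBSERVED = 7.3                   # observed fire spread (hypothetical units)

        def simulate(wind):              # stand-in for the fire spread simulator
            return 1.8 * wind

        def fitness(wind):               # closer prediction -> higher fitness
            return -abs(simulate(wind) - OBSERVED)

        pop = [random.uniform(0.0, 10.0) for _ in range(30)]
        for _ in range(40):                                   # generations
            pop.sort(key=fitness, reverse=True)
            parents = pop[:10]                                # elitist selection
            children = [
                (random.choice(parents) + random.choice(parents)) / 2  # crossover
                + random.gauss(0.0, 0.1)                                # mutation
                for _ in range(20)
            ]
            pop = parents + children
        print("calibrated wind parameter:", round(max(pop, key=fitness), 3))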

  14. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  15. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    Science.gov (United States)

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-01

    In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.
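
    A small sketch of the shape-feature idea mentioned above: low-order Fourier descriptors of closed 2D contours fed to an SVM classifier. The synthetic circle-versus-ellipse contours and all settings are invented for illustration; they are not the renal parenchyma pipeline:

        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)

        def descriptors(a_ratio, n_pts=64):
            t = np.linspace(0, 2 * np.pi, n_pts, endpoint=False)
            z = np.cos(t) + 1j * a_ratio * np.sin(t) + rng.normal(0, 0.02, n_pts)
            f = np.abs(np.fft.fft(z))
            # low-order positive and negative frequency magnitudes,
            # normalized by the dominant component for scale invariance
            return np.r_[f[1:5], f[-4:]] / f[1]

        ratios = np.r_[np.full(50, 1.0), np.full(50, 0.5)]  # circles vs. ellipses
        X = np.array([descriptors(r) for r in ratios])
        y = np.r_[np.zeros(50), np.ones(50)]
        clf = SVC(probability=True).fit(X, y)
        print(clf.predict_proba(descriptors(0.5)[None, :]))  # class probabilities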

  16. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    International Nuclear Information System (INIS)

    Gloger, Oliver; Völzke, Henry; Tönnies, Klaus; Mensel, Birger

    2015-01-01

    In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches. (paper)

  17. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more - these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  18. Bonding in Heavier Group 14 Zero-Valent Complexes-A Combined Maximum Probability Domain and Valence Bond Theory Approach.

    Science.gov (United States)

    Turek, Jan; Braïda, Benoît; De Proft, Frank

    2017-10-17

    The bonding in heavier Group 14 zero-valent complexes of a general formula L₂E (E = Si-Pb; L = phosphine, N-heterocyclic and acyclic carbene, cyclic tetrylene and carbon monoxide) is probed by combining valence bond (VB) theory and maximum probability domain (MPD) approaches. All studied complexes are initially evaluated on the basis of the structural parameters and the shape of frontier orbitals revealing a bent structural motif and the presence of two lone pairs at the central E atom. For the VB calculations three resonance structures are suggested, representing the "ylidone", "ylidene" and "bent allene" structures, respectively. The influence of both ligands and central atoms on the bonding situation is clearly expressed in different weights of the resonance structures for the particular complexes. In general, the bonding in the studied E(0) compounds, the tetrylones, is best described as a resonating combination of "ylidone" and "ylidene" structures with a minor contribution of the "bent allene" structure. Moreover, the VB calculations allow for a straightforward assessment of the π-backbonding (E→L) stabilization energy. The validity of the suggested resonance model is further confirmed by the complementary MPD calculations focusing on the E lone pair region as well as the E-L bonding region. Likewise, the MPD method reveals a strong influence of the σ-donating and π-accepting properties of the ligand. In particular, either one single domain or two symmetrical domains are found in the lone pair region of the central atom, supporting the predominance of either the "ylidene" or "ylidone" structures having one or two lone pairs at the central atom, respectively. Furthermore, the calculated average populations in the lone pair MPDs correlate very well with the natural bond orbital (NBO) populations, and can be related to the average number of electrons that is backdonated to the ligands. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Using extreme value theory approaches to forecast the probability of outbreak of highly pathogenic influenza in Zhejiang, China.

    Directory of Open Access Journals (Sweden)

    Jiangpeng Chen

    Full Text Available Influenza is a contagious disease with high transmissibility to spread around the world, with considerable morbidity and mortality, and presents an enormous burden on worldwide public health. Few mathematical models can be used because influenza incidence data are generally not normally distributed. We developed a mathematical model using Extreme Value Theory (EVT) to forecast the probability of outbreak of highly pathogenic influenza. The incidence data of highly pathogenic influenza in Zhejiang province from April 2009 to November 2013 were retrieved from the website of the Health and Family Planning Commission of Zhejiang Province. The MATLAB "VIEM" toolbox was used for data analysis and modelling. In the present work, we used the Peak Over Threshold (POT) model, assuming the frequency to be a Poisson process and the intensity to be Pareto distributed, to characterize the temporal variability of the long-term extreme incidence of highly pathogenic influenza in Zhejiang, China. The skewness and kurtosis of the incidence of highly pathogenic influenza in Zhejiang between April 2009 and November 2013 were 4.49 and 21.12, which indicated a "fat tail" distribution. A QQ plot and a mean excess plot were used to further validate the features of the distribution. After determining the threshold, we modeled the extremes and estimated the shape parameter and scale parameter by the maximum likelihood method. The results showed that months in which the incidence of highly pathogenic influenza is about 4462/2286/1311/487 are predicted to occur once every five/three/two/one year, respectively. Despite its simplicity, the present study successfully offers a sound modeling strategy and a methodological avenue to implement forecasting of an epidemic in the midst of its course.
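
    A sketch of the peaks-over-threshold step, assuming SciPy and synthetic heavy-tailed counts rather than the Zhejiang data: fit a generalized Pareto distribution to threshold excesses and read off the shape and scale parameters:

        import numpy as np
        from scipy.stats import genpareto

        rng = np.random.default_rng(0)
        incidence = rng.pareto(2.5, 600) * 100        # synthetic "fat tail" counts
        threshold = np.quantile(incidence, 0.95)
        excesses = incidence[incidence > threshold] - threshold

        shape, loc, scale = genpareto.fit(excesses, floc=0)  # MLE for the GPD
        print(f"threshold={threshold:.0f} shape={shape:.2f} scale={scale:.1f}")
        # Tail estimate: P(exceed threshold + x) ≈ exceedance_rate * genpareto.sf(x, shape, scale=scale)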

  20. A large deformation theory of solids subject to electromagnetic loads and its application

    International Nuclear Information System (INIS)

    Nishiguchi, I.; Sasaki, M.

    1993-01-01

    A large deformation theory of deformable solids is proposed in which the interaction with electromagnetic fields is taken into account. Weak forms of Maxwell's equations in a fixed reference configuration, together with the balance of momentum, constitute the governing equations for our theory. The weak forms of Maxwell's equations in a reference configuration can be derived by direct transformation from the spatial weak forms. The results coincide with the weak forms obtained from the local expressions by Lax and Nelson, though we make an explicit distinction between covariant and contravariant vectors. For a deformable body subject to electromagnetic fields, weak forms of Ampère's law and/or Faraday's law, when combined with the weak form of the balance of momentum, can serve as the governing equations of the theory. As is known, however, these equations are not sufficient to describe the response of a specific material to a given loading. As for the momentum balance, we need the dependency of stress on the deformation, and objective constitutive equations of hyperelasticity, hypoelasticity and inelasticity are available. Parallel to these, objective constitutive equations for electromagnetism are discussed. As an application of the theory, linearized equations for quasi-static deformation under a magnetic field are derived based on the vector potential formulation. (author)

  1. [Social intelligence deficits in autistic children and adolescents--subjective theories of psychosocial health care professionals].

    Science.gov (United States)

    Krech, M; Probst, P

    1998-10-01

    The paper is concerned with the personal theories of health care professionals about deficits in the social intelligence of autistic persons. In the component model, social intelligence means the ability of individuals or groups to interact with each other in social situations. It comprises social perception, social behavior, and social conceptions, and refers to emotional, cognitive, and normative aspects. Thirty-three interviewees working as psychologists or teachers in kindergartens, schools, or therapy institutions were questioned in semi-structured individual interviews about their beliefs concerning nonverbal social abilities, social perspective taking, and the construction of a theory of mind in autistic persons. The major finding is that impairments can be found in all aspects of social intelligence. Especially relevant are emotional handicaps, mentioned by more than 80% of the interviewees, and low cognitive preconditions for mastering social stimuli, mentioned by nearly all interviewees. The subjective theories of the interviewees are in accordance with the models of parents as well as those of leading experts. The professional relationship to autistic persons and the practical experience of the health care professionals lead to their specific personal theories of social intelligence deficits in autistic people, with wide-reaching consequences for professional contact with autistic children and young adults.

  2. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  3. Stochastic Analysis and Applied Probability(3.3.1): Topics in the Theory and Applications of Stochastic Analysis

    Science.gov (United States)

    2015-08-13

    Critical Catalyst Reactant Branching Processes with Controlled Immigration, Annals of Applied Probability (03 2012), Amarjit Budhiraja, Rami Atar ... Markus Fischer, Large Deviation Properties of Weakly Interacting Processes via Weak Convergence Methods, Annals of Probability (10 2010), Rami Atar ... Dimensional Forward-Backward Stochastic Differential Equations and the KPZ Equation, Electron. J. Probab., 19 (2014), no. 40, 121. [2] R. Atar and A

  4. Optimal Release Time and Sensitivity Analysis Using a New NHPP Software Reliability Model with Probability of Fault Removal Subject to Operating Environments

    Directory of Open Access Journals (Sweden)

    Kwang Yoon Song

    2018-05-01

    Full Text Available With the latest technological developments, the software industry is at the center of the fourth industrial revolution. In today's complex and rapidly changing environment, where software applications must be developed quickly and easily, software must be focused on rapidly changing information technology. The basic goal of software engineering is to produce high-quality software at low cost. However, because of the complexity of software systems, software development can be time consuming and expensive. Software reliability models (SRMs) are used to estimate and predict the reliability, number of remaining faults, failure intensity, total and development cost, etc., of software. Additionally, it is very important to decide when, how, and at what cost to release the software to users. In this study, we propose a new nonhomogeneous Poisson process (NHPP) SRM with a fault detection rate function affected by the probability of fault removal on failure, subject to operating environments, and discuss the optimal release time and software reliability with the new NHPP SRM. The example results show a good fit to the proposed model, and we propose an optimal release time for a given change in the proposed model.
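
    An illustrative sketch of the release-time reasoning, using the classic Goel-Okumoto NHPP mean value function m(t) = a(1 - exp(-b t)) as a simpler relative of the proposed model; the parameters and release criterion are assumed, not the paper's:

        import math

        a, b = 120.0, 0.05          # assumed total fault content and detection rate
        target_remaining = 5.0      # release once expected remaining faults drop below this

        def m(t):                   # expected number of faults detected by time t
            return a * (1 - math.exp(-b * t))

        t = 0.0
        while a - m(t) > target_remaining:
            t += 1.0
        print(f"release at t ≈ {t:.0f} test-units; faults found ≈ {m(t):.1f}")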

  5. Monte Carlo simulation of γ and fission transfer-induced probabilities using extended R-matrix theory: Application to the 237U∗ system

    Directory of Open Access Journals (Sweden)

    Bouland Olivier

    2017-01-01

    Full Text Available This paper deals with simultaneous neutron-induced average partial cross sections and surrogate-like probability simulations over several excitation and de-excitation channels of the compound nucleus. Present calculations, based on one-dimensional fission barrier extended -matrix theory using Monte Carlo samplings of both first and second well resonance parameters, avoid the surrogate-reaction method historically taken for surrogate data analyses that proved to be very poor in terms of extrapolated neutron-induced capture cross sections. Present theoretical approach is portrayed and subsequent results can be compared for the first time with experimental γ-decay probabilities; thanks to brand new simultaneous 238U(3He,4Heγ and 238U(3He,4He f surrogate measurements. Future integration of our strategy in standard neutron cross section data evaluation remains tied to the developments made in terms of direct reaction population probability calculations.

  6. Development of a subjective cognitive decline questionnaire using item response theory: a pilot study.

    Science.gov (United States)

    Gifford, Katherine A; Liu, Dandan; Romano, Raymond; Jones, Richard N; Jefferson, Angela L

    2015-12-01

    Subjective cognitive decline (SCD) may indicate unhealthy cognitive changes, but no standardized SCD measurement exists. This pilot study aims to identify reliable SCD questions. 112 cognitively normal participants (NC, 76±8 years, 63% female), 43 participants with mild cognitive impairment (MCI; 77±7 years, 51% female), and 33 diagnostically ambiguous participants (79±9 years, 58% female) were recruited from a research registry and completed 57 self-report SCD questions. Psychometric methods were used for item reduction. Factor analytic models assessed unidimensionality of the latent trait (SCD); 19 items were removed for extreme response distributions or poor trait fit. Item response theory (IRT) provided information about question utility; 17 items with low information were dropped. Post-hoc simulation using computerized adaptive test (CAT) modeling selected the most commonly used items (n=9 of 21 items) that represented the latent trait well (r=0.94) and differentiated NC from MCI participants (F(1,146)=8.9, p=0.003). Item response theory and computerized adaptive test modeling identified nine reliable SCD items. This pilot study is a first step toward refining SCD assessment in older adults. Replication of these findings and validation with Alzheimer's disease biomarkers will be an important next step for the creation of an SCD screener.
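
    A small sketch of the IRT/CAT machinery used here: two-parameter logistic (2PL) item information and max-information item selection. The item parameters are invented, not the questionnaire's:

        import numpy as np

        disc = np.array([1.8, 0.6, 1.2, 2.1])     # discrimination a_i
        diff = np.array([-0.5, 0.0, 0.7, 1.0])    # difficulty b_i

        def information(theta):
            p = 1 / (1 + np.exp(-disc * (theta - diff)))  # 2PL P(endorse | theta)
            return disc**2 * p * (1 - p)                  # Fisher item information

        theta_hat = 0.4                                   # current trait estimate
        print("next item to administer:", int(np.argmax(information(theta_hat))))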

  7. The mean distance to the nth neighbour in a uniform distribution of random points: an application of probability theory

    International Nuclear Information System (INIS)

    Bhattacharyya, Pratip; Chakrabarti, Bikas K

    2008-01-01

    We study different ways of determining the mean distance ⟨r_n⟩ between a reference point and its nth neighbour among random points distributed with uniform density in a D-dimensional Euclidean space. First, we present a heuristic method; though this method provides only a crude mathematical result, it shows a simple way of estimating ⟨r_n⟩. Next, we describe two alternative means of deriving the exact expression of ⟨r_n⟩: we review the method using absolute probability and develop an alternative method using conditional probability. Finally, we obtain an approximation to ⟨r_n⟩ from the mean volume between the reference point and its nth neighbour and compare it with the heuristic and exact results.
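
    A numerical companion to these derivations: a Monte Carlo estimate of ⟨r_n⟩ for uniform points in the unit square (D = 2), assuming SciPy's k-d tree; edge effects are ignored, so this is only a rough check:

        import numpy as np
        from scipy.spatial import cKDTree

        rng = np.random.default_rng(0)
        pts = rng.uniform(0, 1, (20000, 2))
        tree = cKDTree(pts)
        # Column 0 of the query result is each point itself (distance 0),
        # so the nth neighbour distance sits in column n.
        dist, _ = tree.query(pts, k=4)
        for n in (1, 2, 3):
            print(f"<r_{n}> ≈ {dist[:, n].mean():.4f}")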

  8. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: (1) the functions P(μ, Q), in terms of which states and propositions can be represented, are, formally speaking, classical (Kolmogoroff) probabilities; (2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and (3) testing of the theory typically takes the form of confronting the expectation values of an observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  9. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    By using a fluid theory which is an alternative to quantum theory, but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  10. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particular...

  11. A Generic Simulation Approach for the Fast and Accurate Estimation of the Outage Probability of Single Hop and Multihop FSO Links Subject to Generalized Pointing Errors

    KAUST Repository

    Ben Issaid, Chaouki; Park, Kihong; Alouini, Mohamed-Slim

    2017-01-01

    When assessing the performance of free-space optical (FSO) communication systems, the outage probability encountered is generally very small, and the use of naive Monte Carlo simulations thereby becomes prohibitively expensive. To estimate these rare-event probabilities, we propose in this work an importance sampling approach based on the exponential twisting technique, which offers fast and accurate results. We consider a variety of turbulence regimes and investigate the outage probability of FSO communication systems, under a generalized pointing error model based on the Beckmann distribution, for both single and multihop scenarios. Selected numerical simulations are presented to show the accuracy and the efficiency of our approach compared to naive Monte Carlo.
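
    For readers unfamiliar with the technique named in the abstract, the sketch below applies exponential twisting to a deliberately simple rare event, the tail of a sum of exponentials, rather than to the authors' Beckmann pointing-error model; all parameter values are illustrative.

```python
# Importance sampling with exponential twisting: estimate P(S > t) for
# S = X_1 + ... + X_n, X_i ~ Exp(1), a probability far too small for naive
# Monte Carlo at reasonable sample sizes.
import numpy as np

rng = np.random.default_rng(1)
n, t, N = 10, 50.0, 100_000

# Tilt each Exp(1) by theta < 1: the tilted law is Exp(1 - theta), and the
# per-run likelihood ratio is exp(-theta * S) * (1 - theta)**(-n).
theta = 1.0 - n / t                    # chosen so E[S] under the tilt equals t

samples = rng.exponential(scale=1.0 / (1.0 - theta), size=(N, n))
S = samples.sum(axis=1)
weights = np.exp(-theta * S) * (1.0 - theta) ** (-n)
hits = weights * (S > t)

print(f"P(S > {t}) ~ {hits.mean():.3e} +/- {hits.std() / np.sqrt(N):.1e}")
```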

  13. Key Informant Models for Measuring Group-Level Variables in Small Groups: Application to Plural Subject Theory

    Science.gov (United States)

    Algesheimer, René; Bagozzi, Richard P.; Dholakia, Utpal M.

    2018-01-01

    We offer a new conceptualization and measurement models for constructs at the group-level of analysis in small group research. The conceptualization starts with classical notions of group behavior proposed by Tönnies, Simmel, and Weber and then draws upon plural subject theory by philosophers Gilbert and Tuomela to frame a new perspective…

  14. What's Foucault Got to Do with It? History, Theory, and Becoming Subjected

    Science.gov (United States)

    Butchart, Ronald E.

    2011-01-01

    The three essays that make up this issue on theory in educational history by Eileen Tamura, Caroline Eick, and Roland Sintos Coloma constitute an indictment of the field of the history of education for its neglect of theory. Read linearly, from the Introduction through Coloma, the indictment becomes increasingly strident, moving from a gentle call…

  15. A work-family conflict/subjective well-being process model: a test of competing theories of longitudinal effects.

    Science.gov (United States)

    Matthews, Russell A; Wayne, Julie Holliday; Ford, Michael T

    2014-11-01

    In the present study, we examine competing predictions of stress reaction models and adaptation theories regarding the longitudinal relationship between work-family conflict and subjective well-being. Based on data from 432 participants over 3 time points with 2 lags of varying lengths (i.e., 1 month, 6 months), our findings suggest that in the short term, consistent with prior theory and research, work-family conflict is associated with poorer subjective well-being. Counter to traditional work-family predictions but consistent with adaptation theories, after accounting for concurrent levels of work-family conflict as well as past levels of subjective well-being, past exposure to work-family conflict was associated with higher levels of subjective well-being over time. Moreover, evidence was found for reverse causation in that greater subjective well-being at 1 point in time was associated with reduced work-family conflict at a subsequent point in time. Finally, the pattern of results did not vary as a function of using different temporal lags. We discuss the theoretical, research, and practical implications of our findings.

  16. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY. Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introduction...

  17. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  18. An introduction to decision theory

    NARCIS (Netherlands)

    Peterson, M.B.

    2009-01-01

    This up-to-date introduction to decision theory offers comprehensive and accessible discussions of decision making under ignorance and risk, the foundations of utility theory, the debate over subjective and objective probability, Bayesianism, causal decision theory, game theory, and social choice theory.

  19. Using the Theory of Planned Behaviour to Understand Students' Subject Choices in Post-Compulsory Education

    Science.gov (United States)

    Taylor, Rachel Charlotte

    2015-01-01

    In recent years, there have been concerns in the UK regarding the uptake of particular subjects in post-compulsory education. Whilst entries for Advanced level (A-level) subjects such as media studies have experienced considerable growth, entries for A-level physics have, until recently, been declining, prompting fears of a skills crisis in future…

  20. A Theory-Driven Approach to Subject Design in Teacher Education

    Science.gov (United States)

    Zundans-Fraser, Lucia; Auhl, Greg

    2016-01-01

    The intent of this study was to examine how a theoretically-designed subject in an undergraduate teacher education course impacted on the learning and confidence of pre-service teachers in catering for the needs of students with diverse needs. The subject design utilised theoretical principles of self-organisation that were incorporated with the…

  1. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real-valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P plots.
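
    As background, a minimal sketch of the classical one-sample P-P plot that these generalized plots extend: the theoretical CDF evaluated at the order statistics is plotted against the empirical CDF. The N(0,1) null and all names below are illustrative, not taken from the paper.

```python
# Classical one-sample P-P plot against a hypothesised N(0,1) distribution.
import numpy as np
from scipy.stats import norm
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
x = np.sort(rng.normal(size=200))       # sample under test, sorted
n = len(x)

theoretical = norm.cdf(x)               # F(x_(i)) under the null
empirical = np.arange(1, n + 1) / n     # empirical CDF at the order statistics

plt.plot(theoretical, empirical, ".", label="P-P points")
plt.plot([0, 1], [0, 1], "k--", label="perfect fit")
plt.xlabel("theoretical probability")
plt.ylabel("empirical probability")
plt.legend()
plt.show()
```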

  2. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  3. The Devaluation of the Subject in Popper's Theory of World 3

    Czech Academy of Sciences Publication Activity Database

    Parusniková, Zuzana

    2016-01-01

    Vol. 46, No. 3 (2016), pp. 304-317. ISSN 0048-3931. Institutional support: RVO:67985955. Keywords: Karl Popper * World 3 * objective knowledge * criticism * creativity. Subject RIV: AA - Philosophy; Religion. Impact factor: 0.392, year: 2016

  4. The role of subjective norms in theory of planned behavior in the context of organic food consumption

    OpenAIRE

    Al-Swidi, Abdullah; Huque, Sheikh Mohammed Rafiul; Hafeez, Muhammad Haroon; Shariff, Mohd Noor Mohd

    2014-01-01

    The purpose of the paper is to investigate the applicability of the theory of planned behavior (TPB), with special emphasis on measuring the direct and moderating effects of subjective norms on attitude, perceived behavioral control, and buying intention in the context of buying organic food. Structured questionnaires were randomly distributed among academic staff and students of two universities in southern Punjab, Pakistan. Structural equation modeling was employed to test the fit of the proposed model.

  5. Introduction to probability and measure

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    According to a remark attributed to Mark Kac, 'Probability Theory is a measure theory with a soul'. This book, with its choice of proofs, remarks, examples, and exercises, has been prepared taking both these aesthetic and practical aspects into account.

  6. Probability theory a concise course

    CERN Document Server

    Rozanov, Y A

    1977-01-01

    This clear exposition begins with basic concepts and moves on to combination of events, dependent events and random variables, Bernoulli trials and the De Moivre-Laplace theorem, a detailed treatment of Markov chains, continuous Markov processes, and more. Includes 150 problems, many with answers. Indispensable to mathematicians and natural scientists alike.

  7. Re-assessment of road accident data-analysis policy : applying theory from involuntary, high-consequence, low-probability events like nuclear power plant meltdowns to voluntary, low-consequence, high-probability events like traffic accidents

    Science.gov (United States)

    2002-02-01

    This report examines the literature on involuntary, high-consequence, low-probability (IHL) events like nuclear power plant meltdowns to determine what can be applied to the problem of voluntary, low-consequence, high-probability (VLH) events like traffic accidents.

  8. Contemporary Leadership Theories. Enhancing the Understanding of the Complexity, Subjectivity and Dynamic of Leadership

    DEFF Research Database (Denmark)

    Winkler, Ingo

    This book provides a comprehensive overview of the basic theoretical approaches of today's leadership research. These approaches conceive leadership as an interactive and complex process, and they stress the significance of individual perception for developing and forming leadership relations. Leadership is understood as a product of complex social relationships embedded in the logic and dynamic of the social system. The book discusses theoretical approaches from top leadership journals, but also addresses various alternatives that are suitable to challenge mainstream leadership research. It includes attributional and psychodynamic approaches, charismatic leadership theories, and theoretical approaches that define leader-member relations in terms of exchange relations, leadership under symbolic and political perspectives, in the light of role theory, and as a process of social learning.

  9. Gestalt Theory in Visual Screen Design — A New Look at an old subject

    OpenAIRE

    Chang, D.; Dooley, L.; Tuovinen, J. E

    2002-01-01

    Although often presented as a single basis for educational visual screen design, Gestalt theory is not a single small set of visual principles uniformly applied by all designers. In fact, it appears that instructional visual design literature often deals with only a small set of Gestalt laws. In this project Gestalt literature was consulted to distil the most relevant Gestalt laws for educational visual screen design. Eleven laws were identified. They deal with balance/symmetry, continuation,...

  10. Uncertainty theory

    CERN Document Server

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  11. Top-down versus bottom-up theories of subjective well-being

    NARCIS (Netherlands)

    B. Headey; R. Veenhoven (Ruut); A.J. Wearing

    1991-01-01

    This paper addresses issues of causal direction in research on subjective well-being (SWB). Previous researchers have generally assumed that such variables as domain satisfactions, social support, life events, and levels of expectation and aspiration are causes of SWB. Critics have...

  12. Item Response Theory at Subject- and Group-Level. Research Report 90-1.

    Science.gov (United States)

    Tobi, Hilde

    This paper reviews the literature about item response models for the subject level and aggregated level (group level). Group-level item response models (IRMs) are used in the United States in large-scale assessment programs such as the National Assessment of Educational Progress and the California Assessment Program. In the Netherlands, these…

  13. Brecht, Hegel, Lacan: Brecht's Theory of Gest and the Problem of the Subject

    Directory of Open Access Journals (Sweden)

    Philip E. Bishop

    1986-01-01

    Brecht used the term "gest" to describe the generic components of human social behavior. He schooled actors in "decomposing" real conduct into distinct gestic images, which were criticized, compared, and altered by other actor-spectators. In his pedagogic theater, Brecht's young players engaged in a reciprocal process of acting and observing, which prepared them to act critically outside the theater. This gestic reciprocality echoes the master-slave dialectic in Hegel's Phenomenology and Lacan's description of the mirror phase. In Hegel, a subject achieves mastery (or self-consciousness) through the recognition of another subject. In Lacan, the infant recognizes itself in an (alienated) mirror-image and in its dramatic interactions with other infants. In each of these intersubjective dialectics, the subject achieves sovereignty through the recognition of others and through a dramatic exchange with others. For Brecht, however, the structural roles of actor and spectator, teacher and student, were reversible, thus yielding a utopian notion of shared or collective sovereignty that is absent from Lacan. Furthermore, Brecht hoped that the sovereignty gained in the gestic theater would be transferred to actions outside the theater, on the stage of history.

  14. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, including...

  15. Dopaminergic Drug Effects on Probability Weighting during Risky Decision Making.

    Science.gov (United States)

    Ojala, Karita E; Janssen, Lieneke K; Hashemi, Mahur M; Timmer, Monique H M; Geurts, Dirk E M; Ter Huurne, Niels P; Cools, Roshan; Sescousse, Guillaume

    2018-01-01

    Dopamine has been associated with risky decision-making, as well as with pathological gambling, a behavioral addiction characterized by excessive risk-taking behavior. However, the specific mechanisms through which dopamine might act to foster risk-taking and pathological gambling remain elusive. Here we test the hypothesis that this might be achieved, in part, via modulation of subjective probability weighting during decision making. Human healthy controls (n = 21) and pathological gamblers (n = 16) played a decision-making task involving choices between sure monetary options and risky gambles both in the gain and loss domains. Each participant played the task twice, either under placebo or the dopamine D2/D3 receptor antagonist sulpiride, in a double-blind counterbalanced design. A prospect theory modelling approach was used to estimate subjective probability weighting and sensitivity to monetary outcomes. Consistent with prospect theory, we found that participants presented a distortion in the subjective weighting of probabilities, i.e., they overweighted low probabilities and underweighted moderate to high probabilities, both in the gain and loss domains. Compared with placebo, sulpiride attenuated this distortion in the gain domain. Across drugs, the groups did not differ in their probability weighting, although gamblers consistently underweighted losing probabilities in the placebo condition. Overall, our results reveal that dopamine D2/D3 receptor antagonism modulates the subjective weighting of probabilities in the gain domain, in the direction of more objective, economically rational decision making.
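
    To make the modelling step concrete, the sketch below computes the prospect-theory value of a simple gamble using a common one-parameter inverse-S weighting function and power utility (textbook Tversky-Kahneman forms with illustrative parameter values; the study's exact specification may differ).

```python
# Prospect-theory valuation of a two-outcome gamble in the gain domain.
def weight(p, gamma=0.61):
    # Inverse-S weighting: overweights small p, underweights moderate-to-high p.
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

def value(x, alpha=0.88):
    # Power utility for gains; alpha is the sensitivity to monetary outcomes.
    return x**alpha

# "Win 100 with probability 0.05, else nothing" versus a sure 5:
# overweighting of the 5% chance makes the gamble subjectively attractive.
p, win, sure = 0.05, 100.0, 5.0
print("gamble:", weight(p) * value(win))   # about 7.6
print("sure thing:", value(sure))          # about 4.1
```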

  17. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition: "It is, as far as I'm concerned, among the best books in math ever written... if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory. Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this...

  18. The subject of pedagogy from theory to practice--the view of newly registered nurses.

    Science.gov (United States)

    Ivarsson, Bodil; Nilsson, Gunilla

    2009-07-01

    The aim was to describe, from the newly registered nurses' perspective, specific events when using their pedagogical knowledge in their everyday clinical practice. The design was qualitative and the critical incident technique was used. Data was collected via interviews with ten newly registered nurses who graduated from the same University program 10 months earlier and are now employed at a university hospital. Two categories emerged in the analyses. The first category was "Pedagogical methods in theory" with the sub-categories Theory and the application of the course in practice, Knowledge of pedagogy and Information as a professional competence. The second category was "Pedagogical methods in everyday clinical practice" with sub-categories Factual knowledge versus pedagogical knowledge, Information and relatives, Difficulties when giving information, Understanding information received, Pedagogical tools, Collaboration in teams in pedagogical situations, and Time and giving information. By identifying specific events regarding pedagogical methods the findings can be useful for everyone from teachers and health-care managers to nurse students and newly registered nurses, to improve teaching methods in nurse education.

  19. Possible further subjects of study in the field of fusion plasma theory in Latvia

    International Nuclear Information System (INIS)

    Dumbrajs, O.

    2004-01-01

    Full text: The first studies in Latvia relevant to fusion plasma theory concerned edge localized modes (ELMs). This work has been carried out at the Institute of Solid State Physics and is nearing successful completion. The next suggested topic, 'Stochastisation of magnetic field lines and its impact on fusion plasma', is related to the theory of ergodic magnetic fields. The stochastisation of magnetic field lines is thought to play a major role in fast energy-loss events from magnetically confined fusion plasma due to magnetohydrodynamic (MHD) modes. Classical examples are sawtooth crashes and disruptions. Here it is thought that stochastisation plays a role in the enhanced reconnection rate, which is often observed. More recently, this process has also been proposed as an explanation for the neoclassical tearing mode (NTM) phenomenon, which is a repetitive rapid decrease of a neoclassical magnetic island due to its interaction with other MHD modes. The timescale for this phenomenon is clearly too fast to be explained by conventional reconnection. The theoretical study of the onset of stochastisation will be illustrated for plasma parameters typical of the ASDEX Upgrade tokamak operated at the Max Planck Institute for Plasma Physics in Garching, Germany.

  20. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  1. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  2. INTRODUCTION IN TECHNOLOGY CONCEPTUALIZATION THE SUBJECT DOMAIN OF SOCIOLOGY: EXPANSION OF THE THEORY (in the example of relationship/kinship)

    Directory of Open Access Journals (Sweden)

    A. Yu. Ivanov

    2017-01-01

    This article is the second of two whose aim is to acquaint a reader without special mathematical training with the possibilities of applying the mathematical methods developed within the research programme 'Conceptual analysis and design of systems of organizational management (CAD SOM)'. These methods are designed to solve a variety of tasks in both technical and humanitarian spheres on the basis of the proposed methodological approach to the mathematization of theoretical knowledge. At the heart of this methodological approach is the process of conceptualization, understood as a theoretical study of the qualitative aspects of a selected domain using mathematical forms (axiomatic theories) that fix the connections between concepts through the logical derivability characterizing the subject area. The axiomatic theory so designed, the conceptual scheme, is the basis for building database structures and decision-making processes, for generating the variety of phenomena of the subject area, for analysing the structure and genesis of the domain, and for other tasks. One of the main advantages of the methodological approach is the ability to work with complex domains through the controlled synthesis of terminal theories from conceptual schemes that explicate simple fragments of the subject area. Given the non-mathematical preparation of the reader, the content of the methods is illustrated by conceptualizing a conceptually simple subject area, the area of kinship relations, and by choosing one of the simplest goals of conceptualization: structuring the domain and generating the variety of its phenomena. The first article gave a brief description of the mathematical methods and described the main stages of conceptualizing a subject area, from defining the boundaries of the domain to synthesizing the terminal theory and determining its compliance with the tasks of conceptualization. In the chosen example, the area of kinship relations...

  3. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  4. [Biometric bases: basic concepts of probability calculation].

    Science.gov (United States)

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The basics of event algebra, the definition of probability, the classical probability model, and the random variable are presented.

  5. Theory of mind network activity is altered in subjects with familial liability for schizophrenia

    Science.gov (United States)

    Mohnke, Sebastian; Erk, Susanne; Schnell, Knut; Romanczuk-Seiferth, Nina; Schmierer, Phöbe; Romund, Lydia; Garbusow, Maria; Wackerhagen, Carolin; Ripke, Stephan; Grimm, Oliver; Haller, Leila; Witt, Stephanie H.; Degenhardt, Franziska; Tost, Heike; Heinz, Andreas; Meyer-Lindenberg, Andreas; Walter, Henrik

    2016-01-01

    As evidenced by a multitude of studies, abnormalities in Theory of Mind (ToM) and its neural processing might constitute an intermediate phenotype of schizophrenia. If so, neural alterations during ToM should be observable in unaffected relatives of patients as well, since they share a considerable amount of genetic risk. While behaviorally, impaired ToM function is confirmed meta-analytically in relatives, evidence on aberrant function of the neural ToM network is sparse and inconclusive. The present study therefore aimed to further explore the neural correlates of ToM in relatives of patients with schizophrenia. A total of 297 controls and 63 unaffected first-degree relatives of patients with schizophrenia performed a ToM task during functional magnetic resonance imaging. Consistent with the literature, relatives exhibited decreased activity of the medial prefrontal cortex. Additionally, increased recruitment of the right middle temporal gyrus and posterior cingulate cortex was found, which was related to subclinical paranoid symptoms in relatives. These results further support decreased medial prefrontal activation during ToM as an intermediate phenotype of genetic risk for schizophrenia. Enhanced recruitment of posterior ToM areas in relatives might indicate inefficiency mechanisms in the presence of genetic risk. PMID:26341902

  6. Heuristics can produce surprisingly rational probability estimates: Comment on Costello and Watts (2014).

    Science.gov (United States)

    Nilsson, Håkan; Juslin, Peter; Winman, Anders

    2016-01-01

    Costello and Watts (2014) present a model assuming that people's knowledge of probabilities adheres to probability theory, but that their probability judgments are perturbed by random noise in retrieval from memory. Predictions for the relationships between probability judgments for constituent events and their disjunctions and conjunctions, as well as for sums of such judgments, were derived from probability theory. Costello and Watts (2014) report behavioral data showing that subjective probability judgments accord with these predictions. Based on the finding that subjective probability judgments follow probability theory, Costello and Watts (2014) conclude that the results imply that people's probability judgments embody the rules of probability theory and thereby refute theories of heuristic processing. Here, we demonstrate the invalidity of this conclusion by showing that all of the tested predictions follow straightforwardly from an account assuming heuristic probability integration (Nilsson, Winman, Juslin, & Hansson, 2009). We end with a discussion of a number of previous findings that harmonize very poorly with the predictions of the model suggested by Costello and Watts (2014).
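
    A sketch of the kind of "probability theory plus noise" account at issue in this exchange, as we read it: true probabilities are read from memory through samples that each flip with probability d, so the expected judged probability is (1 - 2d)p + d, yet the addition-law combination P(A) + P(B) - P(A and B) - P(A or B) still averages to zero. All parameter values are illustrative, not fitted.

```python
import numpy as np

rng = np.random.default_rng(3)
d, n_samples = 0.1, 10_000             # flip noise, memory samples per judgment

def judged(p):
    # Read noisy event flags (each flipped with probability d); report the mean.
    flags = rng.random(n_samples) < p
    flips = rng.random(n_samples) < d
    return np.mean(flags ^ flips)

pA, pB, pAB, pAorB = 0.3, 0.5, 0.2, 0.6    # true probabilities
print("judged P(A):", judged(pA), "vs (1-2d)p + d =", (1 - 2 * d) * pA + d)
combo = judged(pA) + judged(pB) - judged(pAB) - judged(pAorB)
print("addition-law combination (should be near 0):", combo)
```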

  7. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
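
    The problem COVAL addresses can be illustrated in miniature with a generic sampling approach (COVAL itself performs a numerical transformation of the distributions rather than simulation; the load/strength example and all numbers below are illustrative).

```python
# Distribution of a function of random variables by Monte Carlo, with a
# structural-reliability flavour: safety margin = strength - load.
import numpy as np

rng = np.random.default_rng(42)
N = 1_000_000

strength = rng.normal(loc=12.0, scale=1.5, size=N)   # resistance R
load = rng.gumbel(loc=7.0, scale=1.0, size=N)        # random load S
margin = strength - load                             # function of the inputs

print("mean margin:", margin.mean())
print("P(failure) = P(margin < 0) ~", np.mean(margin < 0))
print("1/5/50/95/99% quantiles:", np.percentile(margin, [1, 5, 50, 95, 99]))
```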

  8. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  9. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fields...

  10. When does quality improvement count as research? Human subject protection and theories of knowledge.

    Science.gov (United States)

    Lynn, J

    2004-02-01

    The publication of insights from a quality improvement project recently precipitated a ruling by the lead federal regulatory agency that regulations providing protection for human subjects of research should apply. The required research review process did not match the rapid changes, small samples, limited documentation, clinician management, and type of information commonly used in quality improvement. Yet quality improvement can risk harm to patients, so some review might be in order. The boundaries and processes are not clear. Efforts have been made to determine what constitutes "research", but this has proved difficult and often yields irrational guidance with regard to protection of patients. Society needs a workable way to separate activities that will improve care, on the one hand, and those that constitute research, on the other. Practitioners who lead both quality improvement and research projects claim that those which rapidly give feedback to the care system that generated the data, aiming to change practices within that system, are "quality improvement" no matter whether the findings are published, whether the project is grant funded, and whether contemporaneous controls do not have the intervention. This criterion has not previously been proposed as a possible demarcation. The quandaries of which projects to put through research review and how to ensure ethical implementation of quality improvement need to be resolved.

  11. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making.
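
    For intuition about the estimation task the subjects faced (not the authors' change-point model), here is a toy online estimator that tracks a stepwise nonstationary Bernoulli parameter; the forgetting-factor scheme and all parameter values are our illustrative choices.

```python
# Outcome-by-outcome tracking of a hidden Bernoulli parameter that steps
# from 0.2 to 0.8 halfway through the session.
import numpy as np

rng = np.random.default_rng(7)
true_p = np.concatenate([np.full(500, 0.2), np.full(500, 0.8)])
outcomes = rng.random(1000) < true_p

a, b, lam = 1.0, 1.0, 0.98            # Beta(1,1) prior, exponential forgetting
estimates = []
for y in outcomes:
    a = lam * a + y                   # discount old evidence, add new outcome
    b = lam * b + (1 - y)
    estimates.append(a / (a + b))     # posterior-mean estimate of p

print("estimate before the step:", round(estimates[499], 3))
print("estimate at the end:    ", round(estimates[-1], 3))
```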

  12. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  13. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  14. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application...

  15. Subjective expected utility without preferences

    OpenAIRE

    Bouyssou , Denis; Marchant , Thierry

    2011-01-01

    This paper proposes a theory of subjective expected utility based on primitives only involving the fact that an act can be judged either "attractive" or "unattractive". We give conditions implying that there are a utility function on the set of consequences and a probability distribution on the set of states such that attractive acts have a subjective expected utility above some threshold. The numerical representation that is obtained has strong uniqueness properties.

  16. Representation theory of finite monoids

    CERN Document Server

    Steinberg, Benjamin

    2016-01-01

    This first text on the subject provides a comprehensive introduction to the representation theory of finite monoids. Carefully worked examples and exercises provide the bells and whistles for graduate accessibility, bringing a broad range of advanced readers to the forefront of research in the area. Highlights of the text include applications to probability theory, symbolic dynamics, and automata theory. Comfort with module theory, a familiarity with ordinary group representation theory, and the basics of Wedderburn theory, are prerequisites for advanced graduate level study. Researchers in algebra, algebraic combinatorics, automata theory, and probability theory, will find this text enriching with its thorough presentation of applications of the theory to these fields. Prior knowledge of semigroup theory is not expected for the diverse readership that may benefit from this exposition. The approach taken in this book is highly module-theoretic and follows the modern flavor of the theory of finite dimensional ...

  17. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, in general, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome...

  19. The Role of Cooperative Learning Type Team Assisted Individualization to Improve the Students' Mathematics Communication Ability in the Subject of Probability Theory

    Science.gov (United States)

    Tinungki, Georgina Maria

    2015-01-01

    The importance of learning mathematics cannot be separated from its role in all aspects of life. Communicating ideas in the language of mathematics is more practical, systematic, and efficient. In order to overcome the difficulties of students who have an insufficient understanding of mathematical material, good communication should be built in…

  20. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  1. Dynamic analysis of isotropic nanoplates subjected to moving load using state-space method based on nonlocal second order plate theory

    Energy Technology Data Exchange (ETDEWEB)

    Nami, Mohammad Rahim [Shiraz University, Shiraz, Iran (Iran, Islamic Republic of); Janghorban, Maziar [Islamic Azad University, Marvdash (Iran, Islamic Republic of)

    2015-06-15

    In this work, a dynamic analysis of rectangular nanoplates subjected to a moving load is presented. In order to derive the governing equations of motion, second-order plate theory is used. To capture small-scale effects, the nonlocal elasticity theory is adopted. It is assumed that the nanoplate is subjected to a moving concentrated load with constant velocity V in the x direction. To solve the governing equations, the state-space method is used to find the deflections of the rectangular nanoplate under the moving load. The results obtained here reveal that nonlocality has a significant effect on the deflection of a rectangular nanoplate subjected to a moving load.

  2. Probability functions in the context of signed involutive meadows

    NARCIS (Netherlands)

    Bergstra, J.A.; Ponse, A.

    2016-01-01

    The Kolmogorov axioms for probability functions are placed in the context of signed meadows. A completeness theorem is stated and proven for the resulting equational theory of probability calculus. Elementary definitions of probability theory are restated in this framework.

  3. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  4. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: as early as the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction to probability, we will review the concept of the "probability distribution", a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, the distribution most relevant to statistical analysis.
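
    The closing point in one line of code: a probability distribution assigns probabilities to outcomes, and for the normal distribution these are areas under the density curve, obtained from the cumulative distribution function. The numbers below are illustrative.

```python
from scipy.stats import norm

# P(a < X < b) for X ~ N(mu, sigma), via the cumulative distribution function.
mu, sigma = 100.0, 15.0
p = norm.cdf(130, mu, sigma) - norm.cdf(70, mu, sigma)
print(f"P(70 < X < 130) = {p:.4f}")   # about 0.954, the two-sigma rule
```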

  5. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  6. Classical test theory and Rasch analysis validation of the Upper Limb Functional Index in subjects with upper limb musculoskeletal disorders.

    Science.gov (United States)

    Bravini, Elisabetta; Franchignoni, Franco; Giordano, Andrea; Sartorio, Francesco; Ferriero, Giorgio; Vercelli, Stefano; Foti, Calogero

    2015-01-01

    To perform a comprehensive analysis of the psychometric properties and dimensionality of the Upper Limb Functional Index (ULFI) using both classical test theory and Rasch analysis (RA). Prospective, single-group observational design. Freestanding rehabilitation center. Convenience sample of Italian-speaking subjects with upper limb musculoskeletal disorders (N=174). Not applicable. The Italian version of the ULFI. Data were analyzed using parallel analysis, exploratory factor analysis, and RA for evaluating dimensionality, functioning of rating scale categories, item fit, hierarchy of item difficulties, and reliability indices. Parallel analysis revealed 2 factors explaining 32.5% and 10.7% of the response variance. RA confirmed the failure of the unidimensionality assumption, and 6 items out of the 25 misfitted the Rasch model. When the analysis was rerun excluding the misfitting items, the scale showed acceptable fit values, loading meaningfully to a single factor. Item separation reliability and person separation reliability were .98 and .89, respectively. Cronbach alpha was .92. RA revealed weakness of the scale concerning dimensionality and internal construct validity. However, a set of 19 ULFI items defined through the statistical process demonstrated a unidimensional structure, good psychometric properties, and clinical meaningfulness. These findings represent a useful starting point for further analyses of the tool (based on modern psychometric approaches and confirmatory factor analysis) in larger samples, including different patient populations and nationalities. Copyright © 2015 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  7. Dangerous "spin": the probability myth of evidence-based prescribing - a Merleau-Pontyian approach.

    Science.gov (United States)

    Morstyn, Ron

    2011-08-01

    The aim of this study was to examine logical positivist statistical probability statements used to support and justify "evidence-based" prescribing rules in psychiatry when viewed from the major philosophical theories of probability, and to propose "phenomenological probability" based on Maurice Merleau-Ponty's philosophy of "phenomenological positivism" as a better clinical and ethical basis for psychiatric prescribing. The logical positivist statistical probability statements which are currently used to support "evidence-based" prescribing rules in psychiatry have little clinical or ethical justification when subjected to critical analysis from any of the major theories of probability and represent dangerous "spin" because they necessarily exclude the individual , intersubjective and ambiguous meaning of mental illness. A concept of "phenomenological probability" founded on Merleau-Ponty's philosophy of "phenomenological positivism" overcomes the clinically destructive "objectivist" and "subjectivist" consequences of logical positivist statistical probability and allows psychopharmacological treatments to be appropriately integrated into psychiatric treatment.

  8. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  9. Representing Uncertainty by Probability and Possibility

    DEFF Research Database (Denmark)

Uncertain parameters in modeling are usually represented by probability distributions reflecting either the objective uncertainty of the parameters or the subjective belief held by the model builder. This approach is particularly suited for representing the statistical nature or variance of uncertain parameters. Monte Carlo simulation is readily used for practical calculations. However, an alternative approach is offered by possibility theory making use of possibility distributions such as intervals and fuzzy intervals. This approach is well suited to represent lack of knowledge or imprecision...
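
    To make the contrast between the two representations concrete, here is a small Python sketch (a toy model y = a·b with invented parameter ranges, not taken from the record) propagating uncertainty once by Monte Carlo sampling and once by plain interval arithmetic:

        import numpy as np

        rng = np.random.default_rng(0)

        # Probabilistic representation: sample the uncertain parameters, propagate.
        a = rng.normal(2.0, 0.1, size=100_000)      # statistical variability
        b = rng.uniform(0.5, 1.5, size=100_000)
        y = a * b
        print(f"Monte Carlo: mean={y.mean():.3f}, "
              f"95% interval=({np.quantile(y, 0.025):.3f}, {np.quantile(y, 0.975):.3f})")

        # Possibilistic representation: intervals encode imprecision only
        # (a is cut at +/- 2 sigma here purely for illustration).
        a_lo, a_hi = 1.8, 2.2
        b_lo, b_hi = 0.5, 1.5
        print(f"Interval arithmetic: [{a_lo * b_lo:.3f}, {a_hi * b_hi:.3f}]")  # all positive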

  10. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords : Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics OBOR OECD: Statistics and probability Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  11. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, psychophysical foundations of the probability weighting functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p)=exp(-(-ln p)^α) (0<α<1; w(0)=0, w(1/e)=1/e, w(1)=1), which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
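
    For concreteness, a minimal Python sketch of the one-parameter Prelec weighting function as reconstructed above; the value alpha = 0.65 is an arbitrary illustrative choice, not a parameter reported in the paper:

        import numpy as np

        def prelec_w(p, alpha: float = 0.65):
            """Prelec (1998) weighting: w(p) = exp(-(-ln p)**alpha), 0 < alpha < 1."""
            p = np.asarray(p, dtype=float)
            return np.exp(-((-np.log(p)) ** alpha))

        ps = np.array([0.01, 0.1, 1 / np.e, 0.5, 0.9, 1.0])
        for p, w in zip(ps, prelec_w(ps)):
            print(f"w({p:.3f}) = {w:.3f}")  # note the fixed points w(1/e) = 1/e and w(1) = 1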

  12. Introduction to representation theory

    CERN Document Server

    Etingof, Pavel; Hensel, Sebastian; Liu, Tiankai; Schwendner, Alex

    2011-01-01

    Very roughly speaking, representation theory studies symmetry in linear spaces. It is a beautiful mathematical subject which has many applications, ranging from number theory and combinatorics to geometry, probability theory, quantum mechanics, and quantum field theory. The goal of this book is to give a "holistic" introduction to representation theory, presenting it as a unified subject which studies representations of associative algebras and treating the representation theories of groups, Lie algebras, and quivers as special cases. Using this approach, the book covers a number of standard topics in the representation theories of these structures. Theoretical material in the book is supplemented by many problems and exercises which touch upon a lot of additional topics; the more difficult exercises are provided with hints. The book is designed as a textbook for advanced undergraduate and beginning graduate students. It should be accessible to students with a strong background in linear algebra and a basic k...

  13. Probability in High Dimension

    Science.gov (United States)

    2014-06-30


  14. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

This book reports the 7 presentations made at the third meeting 'physics and fundamental questions' whose theme was probability and prediction. The concept of probability, invented to apprehend random phenomena, has become an important branch of mathematics, and its applications range from radioactivity to species evolution via cosmology or the management of very weak risks. The notion of probability is the basis of quantum mechanics and is thus bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  15. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  16. A theory of Bayesian decision making

    OpenAIRE

    Karni, Edi

    2009-01-01

This paper presents a complete, choice-based, axiomatic Bayesian decision theory. It introduces a new choice set consisting of information-contingent plans for choosing actions and bets, and a subjective expected utility model with effect-dependent utility functions and action-dependent subjective probabilities which, in conjunction with the updating of the probabilities using Bayes’ rule, gives rise to a unique prior and a set of action-dependent posterior probabilities representing the decisio...

  17. Assessing Goodness of Fit in Item Response Theory with Nonparametric Models: A Comparison of Posterior Probabilities and Kernel-Smoothing Approaches

    Science.gov (United States)

    Sueiro, Manuel J.; Abad, Francisco J.

    2011-01-01

    The distance between nonparametric and parametric item characteristic curves has been proposed as an index of goodness of fit in item response theory in the form of a root integrated squared error index. This article proposes to use the posterior distribution of the latent trait as the nonparametric model and compares the performance of an index…

  18. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  19. Joint probabilities and quantum cognition

    International Nuclear Information System (INIS)

    Acacio de Barros, J.

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  20. Joint probabilities and quantum cognition

    Energy Technology Data Exchange (ETDEWEB)

    Acacio de Barros, J. [Liberal Studies, 1600 Holloway Ave., San Francisco State University, San Francisco, CA 94132 (United States)

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  1. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  2. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

Contents: Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence; Discrete Distributions; Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation; Continuous Distributions; The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  3. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  4. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions, employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.

  5. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events y, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  6. Choice-centred versus subject-centred theories in the social sciences. The influence of simplification on explananda

    NARCIS (Netherlands)

    Lindenberg, S

    The idea, originating in economics and forcefully brought back by Goldthorpe, that rational choice theory and large-scale data analysis are symbiotic, is very attractive. Rational choice is in dire need of explananda which can be provided by large-scale data analysis, while large-scale data analysis

  7. A Cross-Cultural Study Testing the Universality of Basic Psychological Needs Theory across Different Academic Subjects

    Science.gov (United States)

    Erturan-Ilker, Gökçe; Quested, Eleanor; Appleton, Paul; Duda, Joan L.

    2018-01-01

    Basic Psychological Needs Theory (BPNT) suggests that autonomy-supportive teachers can promote the satisfaction of students' three basic psychological needs (i.e., the need for autonomy, competence, and relatedness) and this is essential for optimal functioning and personal well-being. The role of need satisfaction as a determinant of well-being…

  8. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
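
    A classic instance (chosen here for illustration; the article's own three problems are not reproduced) is the probability that a random permutation has no fixed point, which tends to 1/e as the permutation grows. A quick Monte Carlo check in Python:

        import math
        import random

        def is_derangement(n: int) -> bool:
            """True if a uniform random permutation of range(n) has no fixed point."""
            perm = list(range(n))
            random.shuffle(perm)
            return all(perm[i] != i for i in range(n))

        trials = 100_000
        hits = sum(is_derangement(52) for _ in range(trials))
        print(f"estimated P = {hits / trials:.4f}   vs   1/e = {1 / math.e:.4f}")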

  9. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Algebraic and stochastic coding theory

    CERN Document Server

    Kythe, Dave K

    2012-01-01

    Using a simple yet rigorous approach, Algebraic and Stochastic Coding Theory makes the subject of coding theory easy to understand for readers with a thorough knowledge of digital arithmetic, Boolean and modern algebra, and probability theory. It explains the underlying principles of coding theory and offers a clear, detailed description of each code. More advanced readers will appreciate its coverage of recent developments in coding theory and stochastic processes. After a brief review of coding history and Boolean algebra, the book introduces linear codes, including Hamming and Golay codes.

  11. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

Using a simple relation between the Dirac delta-function and the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  12. Travel Mode Use, Travel Mode Shift and Subjective Well-Being: Overview of Theories, Empirical Findings and Policy Implications

    NARCIS (Netherlands)

    Ettema, D.F.; Friman, M.; Gärling, Tommy; Olsson, Lars

    2016-01-01

    This chapter discusses how travel by different travel modes is related to primarily subjective well-being but also to health or physical well-being. Studies carried out in different geographic contexts consistently show that satisfaction with active travel modes is higher than travel by car and

  13. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  14. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  15. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  16. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  17. Situated technology in reproductive health care: Do we need a new theory of the subject to promote person-centred care?

    Science.gov (United States)

    Stankovic, Biljana

    2017-01-01

Going through reproductive experiences (especially pregnancy and childbirth) in contemporary Western societies almost inevitably involves interaction with medical practitioners and various medical technologies in an institutional context. This has important consequences for women as embodied subjects. A critical appraisal of these consequences, coming dominantly from feminist scholarship, relied on a problematic theory of both technology and the subject, which are in contemporary approaches no longer considered as given, coherent and well individualized wholes, but as complex constellations that are locally situated and that can only be described empirically. In this study, we will be relying on the developments in phenomenological theory to reconceptualize women as technologically mediated embodied subjects and on the new paradigms in philosophy of technology and STS to reconstruct medical technology as situated, with the aim of reconceptualizing their relationship and exploring different possibilities for the mediating role of medical technology. It will be argued that the technologization of female reproductive processes and its alienating consequences for women are not necessary or directly interrelated. The role of technology varies from case to case and depends mainly on the nontechnological and relational aspects of the institutional context, in which medical practitioners play a decisive role. © 2016 John Wiley & Sons Ltd.

  18. Understanding of Accountancy Graduates on the Relevant Concepts Taught in the Subject Accounting Theory at HEI in Greater Florianópolis

    Directory of Open Access Journals (Sweden)

    Fabiana Frigo Souza

    2017-03-01

This research aims to identify undergraduate Accountancy students' understanding of the relevant concepts taught in the discipline Accounting Theory. To reach this goal, a questionnaire was sent to selected institutions or administered in person, yielding a total of 65 respondents who had already studied Accounting Theory. The results show that students perceive the concepts of the discipline as tied mainly to standardization and that, for most respondents, Accounting Theory is of fundamental importance and should not be eliminated. In addition, no relationship could be established between the respondents' area of professional activity or length of experience and their perceptions of the discipline's concepts. Some subjects, such as Agency Theory and Earnings Management, receive little discussion and are entirely unknown to some students, which may indicate a gap in the teaching of the discipline. For future research, an analysis of distance learning is suggested, as well as studies examining the existence of this possible gap.

  19. Usability of a theory of visual attention (TVA) for parameter-based measurement of attention I: evidence from normal subjects

    DEFF Research Database (Denmark)

    Finke, Kathrin; Bublak, Peter; Krummenacher, Joseph

    2005-01-01

The present study investigated the usability of whole and partial report of briefly displayed letter arrays as a diagnostic tool for the assessment of attentional functions. The tool is based on Bundesen's (1990, 1998, 2002; Bundesen et al., 2005) theory of visual attention (TVA), which assumes four separable attentional components: processing speed, working memory storage capacity, spatial distribution of attention, and top-down control. A number of studies (Duncan et al., 1999; Habekost & Bundesen, 2003; Peers et al., 2005) have already demonstrated the clinical relevance of these parameters. The present study was designed to examine whether (a) a shortened procedure bears sufficient accuracy and reliability, (b) the procedures reveal attentional constructs with clinical relevance, and (c) the mathematically independent parameters are also empirically independent.

  20. Improvement in cognitive and affective theory of mind with observation and imitation treatment in subjects with schizophrenia

    Directory of Open Access Journals (Sweden)

    Maria C. Pino

    2015-06-01

Objective: The main objective of this study is to consider Theory of Mind (ToM), i.e. the ability to perceive other people in terms of thinking, believing and emotions, as a target for effective rehabilitative intervention, using Emotion and ToM Imitation Training (ETIT), aimed at improving social cognition and social functioning in schizophrenia. ToM impairment is a key feature of schizophrenia. According to recent literature, ToM is a multidimensional process requiring at least two components: cognitive and affective. Cognitive ToM seems to be a prerequisite for affective ToM, which requires intact empathic ability. Method: Seven patients with schizophrenia completed ETIT treatment and were compared to 7 patients who participated in Problem Solving Training (PST). The participants were assessed at pre and post treatment on measures of cognitive (Advanced Theory of Mind Task and Social Situation Test) and affective (Emotion Attribution Task and Eyes Task) ToM, and also empathy (Empathy Quotient). Results: When compared to the control group, ETIT participants improved in the three social cognition components evaluated (cognitive and affective ToM, and empathy). Improvement in cognitive and affective ToM was found within the ETIT group pre and post treatment. Conclusions: Action observation and imitation could be important goals for future "low cost" rehabilitation treatment in several disorders in which the deficit of social cognition is considered "core" to the disease. This represents a new perspective in the rehabilitation field.

  1. Discriminating Among Probability Weighting Functions Using Adaptive Design Optimization

    Science.gov (United States)

    Cavagnaro, Daniel R.; Pitt, Mark A.; Gonzalez, Richard; Myung, Jay I.

    2014-01-01

    Probability weighting functions relate objective probabilities and their subjective weights, and play a central role in modeling choices under risk within cumulative prospect theory. While several different parametric forms have been proposed, their qualitative similarities make it challenging to discriminate among them empirically. In this paper, we use both simulation and choice experiments to investigate the extent to which different parametric forms of the probability weighting function can be discriminated using adaptive design optimization, a computer-based methodology that identifies and exploits model differences for the purpose of model discrimination. The simulation experiments show that the correct (data-generating) form can be conclusively discriminated from its competitors. The results of an empirical experiment reveal heterogeneity between participants in terms of the functional form, with two models (Prelec-2, Linear in Log Odds) emerging as the most common best-fitting models. The findings shed light on assumptions underlying these models. PMID:24453406

  2. Probability theory: the logic of science

    National Research Council Canada - National Science Library

    Jaynes, E. T; Bretthorst, G. Larry

    2005-01-01

Edited by G. Larry Bretthorst. Cambridge University Press: Cambridge, New York, Melbourne, Madrid, Cape Town, Singapore, São Paulo.

  3. Logic, Probability, and Human Reasoning

    Science.gov (United States)

    2015-01-01

...accordingly suggest a way to integrate probability and deduction. The nature of deductive reasoning: to be rational is to be able to make deductions... [3–6] and they underlie mathematics, science, and technology [7–10]. Plato claimed that emotions upset reasoning. However, individuals in the grip... fundamental to human rationality. So, if counterexamples to its principal predictions occur, the theory will at least explain its own refutation.

  4. The enigma of probability and physics

    International Nuclear Information System (INIS)

    Mayants, L.

    1984-01-01

This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (science of probability) and probabilistic physics (application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed - quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)

  5. What Influences Chinese Adolescents’ Choice Intention between Playing Online Games and Learning? Application of Theory of Planned Behavior with Subjective Norm Manipulated as Peer Support and Parental Monitoring

    Science.gov (United States)

    Wang, Jia; Liu, Ru-De; Ding, Yi; Liu, Ying; Xu, Le; Zhen, Rui

    2017-01-01

    This study investigated how and why Chinese adolescents choose between playing online games and doing homework, using the model of the theory of planned behavior (TPB) in which the subjective norm was manipulated as two sub-elements (peer support and parental monitoring). A total of 530 students from an elementary school and a middle school in China were asked to complete the measures assessing two predictors of TPB: attitude and perceived behavioral control (PBC). Next, they completed a survey about their choice intention between playing an online game and doing homework in three different situations, wherein a conflict between playing online games and doing homework was introduced and subjective norm was manipulated as peers supporting and parents objecting to playing online games. The results showed that adolescents’ attitude and PBC, as well as the perception of obtaining or not obtaining support from their peers and caregivers (manipulated subjective norm), significantly influenced their choice intention in online gaming situations. These findings contribute to the understanding of the factors affecting adolescents’ online gaming, which has been a concern of both caregivers and educators. With regard to the theoretical implications, this study extended previous work by providing evidence that TPB can be applied to analyze choice intention. Moreover, this study illuminated the effects of the separating factors of subjective norm on choice intention between playing online games and studying. PMID:28458649

  6. What Influences Chinese Adolescents' Choice Intention between Playing Online Games and Learning? Application of Theory of Planned Behavior with Subjective Norm Manipulated as Peer Support and Parental Monitoring.

    Science.gov (United States)

    Wang, Jia; Liu, Ru-De; Ding, Yi; Liu, Ying; Xu, Le; Zhen, Rui

    2017-01-01

    This study investigated how and why Chinese adolescents choose between playing online games and doing homework, using the model of the theory of planned behavior (TPB) in which the subjective norm was manipulated as two sub-elements (peer support and parental monitoring). A total of 530 students from an elementary school and a middle school in China were asked to complete the measures assessing two predictors of TPB: attitude and perceived behavioral control (PBC). Next, they completed a survey about their choice intention between playing an online game and doing homework in three different situations, wherein a conflict between playing online games and doing homework was introduced and subjective norm was manipulated as peers supporting and parents objecting to playing online games. The results showed that adolescents' attitude and PBC, as well as the perception of obtaining or not obtaining support from their peers and caregivers (manipulated subjective norm), significantly influenced their choice intention in online gaming situations. These findings contribute to the understanding of the factors affecting adolescents' online gaming, which has been a concern of both caregivers and educators. With regard to the theoretical implications, this study extended previous work by providing evidence that TPB can be applied to analyze choice intention. Moreover, this study illuminated the effects of the separating factors of subjective norm on choice intention between playing online games and studying.

  7. The determinants of physician attitudes and subjective norms toward drug information sources: modification and test of the theory of reasoned action.

    Science.gov (United States)

    Gaither, C A; Bagozzi, R P; Ascione, F J; Kirking, D M

    1997-10-01

To improve upon the theory of reasoned action and apply it to pharmaceutical research, we investigated the effects of relevant appraisals, attributes, and past behavior of physicians on the use of drug information sources. We also examined the moderating effects of practice characteristics. A mail questionnaire asked HMO physicians to evaluate seven common sources of drug information on general appraisals (degree of usefulness and ease of use), specific attributes (availability, quality of information on harmful effects and on drug efficacy), and past behavior when searching for information on a new, simulated H2 antagonist agent. Semantic differential scales were used to measure each appraisal, attribute and past behavior. Information was also collected on practice characteristics. Findings from 108/200 respondents indicated that appraisals and attributes were useful determinants of attitudes and subjective norms toward use. Degree of usefulness and quality of information on harmful effects were important predictors of attitudes toward use for several sources of information. Ease of use and degree of usefulness were important predictors of subjective norms toward use. In many cases, moderating effects of practice characteristics were in opposing directions. Past behavior had significant direct effects on attitudes toward the PDR. The findings suggest ways to improve the usefulness of the theory of reasoned action as a model of decision-making. We also propose practical guidelines that can be used to improve the types of drug information sources used by physicians.

  8. Introducing Disjoint and Independent Events in Probability.

    Science.gov (United States)

    Kelly, I. W.; Zwiers, F. W.

    Two central concepts in probability theory are those of independence and mutually exclusive events. This document is intended to provide suggestions to teachers that can be used to equip students with an intuitive, comprehensive understanding of these basic concepts in probability. The first section of the paper delineates mutually exclusive and…

  9. Selected papers on probability and statistics

    CERN Document Server

    2009-01-01

    This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.

  10. Some open problems in noncommutative probability

    International Nuclear Information System (INIS)

    Kruszynski, P.

    1981-01-01

    A generalization of probability measures to non-Boolean structures is discussed. The starting point of the theory is the Gleason theorem about the form of measures on closed subspaces of a Hilbert space. The problems are formulated in terms of probability on lattices of projections in arbitrary von Neumann algebras. (Auth.)

  11. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  12. Controlled cross-over study in normal subjects of naloxone-preceding-lactate infusions; respiratory and subjective responses: relationship to endogenous opioid system, suffocation false alarm theory and childhood parental loss.

    Science.gov (United States)

    Preter, M; Lee, S H; Petkova, E; Vannucci, M; Kim, S; Klein, D F

    2011-02-01

    The expanded suffocation false alarm theory (SFA) hypothesizes that dysfunction in endogenous opioidergic regulation increases sensitivity to CO2, separation distress and panic attacks. In panic disorder (PD) patients, both spontaneous clinical panics and lactate-induced panics markedly increase tidal volume (TV), whereas normals have a lesser effect, possibly due to their intact endogenous opioid system. We hypothesized that impairing the opioidergic system by naloxone could make normal controls parallel PD patients' response when lactate challenged. Whether actual separations and losses during childhood (childhood parental loss, CPL) affected naloxone-induced respiratory contrasts was explored. Subjective panic-like symptoms were analyzed although pilot work indicated that the subjective aspect of anxious panic was not well modeled by this specific protocol. Randomized cross-over sequences of intravenous naloxone (2 mg/kg) followed by lactate (10 mg/kg), or saline followed by lactate, were given to 25 volunteers. Respiratory physiology was objectively recorded by the LifeShirt. Subjective symptomatology was also recorded. Impairment of the endogenous opioid system by naloxone accentuates TV and symptomatic response to lactate. This interaction is substantially lessened by CPL. Opioidergic dysregulation may underlie respiratory pathophysiology and suffocation sensitivity in PD. Comparing specific anti-panic medications with ineffective anti-panic agents (e.g. propranolol) can test the specificity of the naloxone+lactate model. A screen for putative anti-panic agents and a new pharmacotherapeutic approach are suggested. Heuristically, the experimental unveiling of the endogenous opioid system impairing effects of CPL and separation in normal adults opens a new experimental, investigatory area.

  13. Controlled cross-over study in normal subjects of naloxone-preceding-lactate infusions; respiratory and subjective responses: relationship to endogenous opioid system, suffocation false alarm theory and childhood parental loss

    Science.gov (United States)

    Preter, M.; Lee, S. H.; Petkova, E.; Vannucci, M.; Kim, S.; Klein, D. F.

    2015-01-01

    Background The expanded suffocation false alarm theory (SFA) hypothesizes that dysfunction in endogenous opioidergic regulation increases sensitivity to CO2, separation distress and panic attacks. In panic disorder (PD) patients, both spontaneous clinical panics and lactate-induced panics markedly increase tidal volume (TV), whereas normals have a lesser effect, possibly due to their intact endogenous opioid system. We hypothesized that impairing the opioidergic system by naloxone could make normal controls parallel PD patients' response when lactate challenged. Whether actual separations and losses during childhood (childhood parental loss, CPL) affected naloxone-induced respiratory contrasts was explored. Subjective panic-like symptoms were analyzed although pilot work indicated that the subjective aspect of anxious panic was not well modeled by this specific protocol. Method Randomized cross-over sequences of intravenous naloxone (2 mg/kg) followed by lactate (10 mg/kg), or saline followed by lactate, were given to 25 volunteers. Respiratory physiology was objectively recorded by the LifeShirt. Subjective symptomatology was also recorded. Results Impairment of the endogenous opioid system by naloxone accentuates TV and symptomatic response to lactate. This interaction is substantially lessened by CPL. Conclusions Opioidergic dysregulation may underlie respiratory pathophysiology and suffocation sensitivity in PD. Comparing specific anti-panic medications with ineffective anti-panic agents (e.g. propranolol) can test the specificity of the naloxone + lactate model. A screen for putative anti-panic agents and a new pharmacotherapeutic approach are suggested. Heuristically, the experimental unveiling of the endogenous opioid system impairing effects of CPL and separation in normal adults opens a new experimental, investigatory area. PMID:20444308

  14. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.

  15. Nuclear data uncertainties: I, Basic concepts of probability

    International Nuclear Information System (INIS)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs
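
    Of the topics listed in this report, Bayes' theorem lends itself to a compact numerical illustration. The sketch below, with detector efficiencies invented purely for the example, computes the posterior probability that an accepted pulse is a true event:

        def bayes_posterior(prior: float, p_d_given_h: float, p_d_given_not_h: float) -> float:
            """Posterior P(H|D) via Bayes' theorem for a binary hypothesis H."""
            evidence = p_d_given_h * prior + p_d_given_not_h * (1 - prior)
            return p_d_given_h * prior / evidence

        # Hypothetical counting experiment: 2% of pulses are true events (prior);
        # the discriminator accepts 95% of true events and 10% of background pulses.
        print(f"P(event | accepted) = {bayes_posterior(0.02, 0.95, 0.10):.3f}")  # ~0.162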

  16. Applied probability models with optimization applications

    CERN Document Server

    Ross, Sheldon M

    1992-01-01

    Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous time optimization models, much more. Problems and references at chapter ends. ""Excellent introduction."" - Journal of the American Statistical Association. Bibliography. 1970 edition.

  17. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals provide therefore an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
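
    The paper's construction is exact; as a loose, simulation-based approximation of the same idea (not the authors' method), one can attach Bonferroni-adjusted bands to the n plotted order statistics, which guarantees joint coverage of at least 1-α:

        import numpy as np

        def normal_plot_bands(n: int, alpha: float = 0.05, sims: int = 20_000, seed: int = 0):
            """Conservative simultaneous bands for the n order statistics of a
            standard normal sample (Bonferroni: joint coverage >= 1 - alpha)."""
            rng = np.random.default_rng(seed)
            order_stats = np.sort(rng.standard_normal((sims, n)), axis=1)
            lo = np.quantile(order_stats, alpha / (2 * n), axis=0)
            hi = np.quantile(order_stats, 1 - alpha / (2 * n), axis=0)
            return lo, hi

        n = 30
        lo, hi = normal_plot_bands(n)
        sample = np.sort(np.random.default_rng(1).standard_normal(n))
        print("points outside band:", int(np.sum((sample < lo) | (sample > hi))))  # 0 is consistent with normality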

  18. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  19. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications....

  20. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

The influence of “Grundbegriffe” by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory “calling for” an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, the dual maps to random variables) have very different “mathematical nature”. Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events, elementary category theory, and covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing “fractions” of classical random events, and we upgrade the notions of probability measure and random variable.

  1. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  2. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  3. Subjective Expected Utility: A Model of Decision-Making.

    Science.gov (United States)

    Fischoff, Baruch; And Others

    1981-01-01

    Outlines a model of decision making known to researchers in the field of behavioral decision theory (BDT) as subjective expected utility (SEU). The descriptive and predictive validity of the SEU model, probability and values assessment using SEU, and decision contexts are examined, and a 54-item reference list is provided. (JL)
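
    The SEU rule itself is a one-line computation: each act is scored by the sum of subjective probability times utility over its outcomes. A minimal Python sketch with invented numbers:

        def seu(outcomes) -> float:
            """Subjective expected utility: sum of probability x utility over outcomes."""
            return sum(p * u for p, u in outcomes)

        # Hypothetical decision between two acts, each a list of (p, utility) pairs.
        act_a = [(0.7, 10.0), (0.3, -5.0)]   # SEU = 5.5
        act_b = [(0.5, 20.0), (0.5, -8.0)]   # SEU = 6.0
        print("choose", "A" if seu(act_a) >= seu(act_b) else "B")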

  4. Hegel’s Theory of Moral Action, its Place in his System and the ‘Highest’ Right of the Subject

    Directory of Open Access Journals (Sweden)

    David Rose

    2007-12-01

There is at present, amongst Hegel scholars and in the interpretative discussions of Hegel's social and political theories, the flavour of old-style 'apology' for his liberal credentials, as though there exists a real need to prove he holds basic liberal views palatable to the hegemonic, contemporary political worldview. Such an approach is no doubt motivated by the need to reconstruct what is left of the modern moral conscience when Hegel has finished discussing the flaws and contradictions of the Kantian model of moral judgement. The main claim made in the following pages is that the critique of 'subjective' moralities is neither the sole nor even the main reason for the adoption of an immanent doctrine of ethics. This paper will look to Hegel's mature theory of action as motivating the critique of transcendentalism rather than merely filling in the hole left when one rejects Kant and it will discuss what the consequences of this approach are for the role of the moral conscience within the political sphere, arguing that Hegel's own conditions of free action would not be met unless the subjective moral conscience was operative in the rational state.

  5. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  6. Sensitivity and bias in decision-making under risk: evaluating the perception of reward, its probability and value.

    Directory of Open Access Journals (Sweden)

    Madeleine E Sharp

BACKGROUND: There are few clinical tools that assess decision-making under risk. Tests that characterize sensitivity and bias in decisions between prospects varying in magnitude and probability of gain may provide insights in conditions with anomalous reward-related behaviour. OBJECTIVE: We designed a simple test of how subjects integrate information about the magnitude and the probability of reward, which can determine discriminative thresholds and choice bias in decisions under risk. DESIGN/METHODS: Twenty subjects were required to choose between two explicitly described prospects, one with higher probability but lower magnitude of reward than the other, with the difference in expected value between the two prospects varying from 3 to 23%. RESULTS: Subjects showed a mean threshold sensitivity of 43% difference in expected value. Regarding choice bias, there was a 'risk premium' of 38%, indicating a tendency to choose higher probability over higher reward. An analysis using prospect theory showed that this risk premium is the predicted outcome of hypothesized non-linearities in the subjective perception of reward value and probability. CONCLUSIONS: This simple test provides a robust measure of discriminative value thresholds and biases in decisions under risk. Prospect theory can also make predictions about decisions when subjective perception of reward or probability is anomalous, as may occur in populations with dopaminergic or striatal dysfunction, such as Parkinson's disease and schizophrenia.

  7. Sensitivity and Bias in Decision-Making under Risk: Evaluating the Perception of Reward, Its Probability and Value

    Science.gov (United States)

    Sharp, Madeleine E.; Viswanathan, Jayalakshmi; Lanyon, Linda J.; Barton, Jason J. S.

    2012-01-01

    Background There are few clinical tools that assess decision-making under risk. Tests that characterize sensitivity and bias in decisions between prospects varying in magnitude and probability of gain may provide insights in conditions with anomalous reward-related behaviour. Objective We designed a simple test of how subjects integrate information about the magnitude and the probability of reward, which can determine discriminative thresholds and choice bias in decisions under risk. Design/Methods Twenty subjects were required to choose between two explicitly described prospects, one with higher probability but lower magnitude of reward than the other, with the difference in expected value between the two prospects varying from 3 to 23%. Results Subjects showed a mean threshold sensitivity of 43% difference in expected value. Regarding choice bias, there was a ‘risk premium’ of 38%, indicating a tendency to choose higher probability over higher reward. An analysis using prospect theory showed that this risk premium is the predicted outcome of hypothesized non-linearities in the subjective perception of reward value and probability. Conclusions This simple test provides a robust measure of discriminative value thresholds and biases in decisions under risk. Prospect theory can also make predictions about decisions when subjective perception of reward or probability is anomalous, as may occur in populations with dopaminergic or striatal dysfunction, such as Parkinson's disease and schizophrenia. PMID:22493669
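
    The kind of comparison the subjects faced can be mimicked with a toy prospect-theory calculation; the parameter values and prospects below are invented for illustration and are not fitted to the study:

        import math

        def pt_value(p: float, x: float, alpha: float = 0.6, gamma: float = 0.61) -> float:
            """Prospect-theory value of a gain prospect (p, x): power value function
            v(x) = x**alpha combined with a Prelec-style weight on p."""
            w = math.exp(-((-math.log(p)) ** gamma))  # probability weighting
            return w * x ** alpha

        # Higher-probability/lower-reward vs lower-probability/higher-reward,
        # with a ~10% expected-value edge for the riskier prospect.
        safe, risky = (0.8, 10.0), (0.4, 22.0)
        print(f"EV: safe={safe[0] * safe[1]:.2f}, risky={risky[0] * risky[1]:.2f}")
        print(f"PT: safe={pt_value(*safe):.2f}, risky={pt_value(*risky):.2f}")
        # With these illustrative parameters the safer prospect scores higher
        # despite its lower expected value, i.e. a positive 'risk premium'.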

  8. Probability as a Physical Motive

    Directory of Open Access Journals (Sweden)

    Peter Martin

    2007-04-01

    Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production (“MEP”) to the information-theoretical “MaxEnt” principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand “the adjacent possible” as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  9. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments serve only to deepen actual intellectual contrasts. The book can be considered one of the most important contributions to the analysis of probability interpretation of the last 10-15 years.

  10. F.Y. Edgeworth’s Treatise on Probabilities

    OpenAIRE

    Alberto Baccini

    2007-01-01

    Probability theory has a central role in Edgeworth’s thought; this paper examines the philosophical foundation of the theory. Starting from a frequentist position, Edgeworth introduced some innovations on the definition of primitive probabilities. He distinguished between primitive probabilities based on experience of statistical evidence, and primitive a priori probabilities based on a more general and less precise kind of experience, inherited by the human race through evolution. Given prim...

  11. Elapsed decision time affects the weighting of prior probability in a perceptual decision task

    Science.gov (United States)

    Hanks, Timothy D.; Mazurek, Mark E.; Kiani, Roozbeh; Hopp, Elizabeth; Shadlen, Michael N.

    2012-01-01

    Decisions are often based on a combination of new evidence with prior knowledge of the probable best choice. Optimal combination requires knowledge about the reliability of evidence, but in many realistic situations, this is unknown. Here we propose and test a novel theory: the brain exploits elapsed time during decision formation to combine sensory evidence with prior probability. Elapsed time is useful because (i) decisions that linger tend to arise from less reliable evidence, and (ii) the expected accuracy at a given decision time depends on the reliability of the evidence gathered up to that point. These regularities allow the brain to combine prior information with sensory evidence by weighting the latter in accordance with reliability. To test this theory, we manipulated the prior probability of the rewarded choice while subjects performed a reaction-time discrimination of motion direction using a range of stimulus reliabilities that varied from trial to trial. The theory explains the effect of prior probability on choice and reaction time over a wide range of stimulus strengths. We found that prior probability was incorporated into the decision process as a dynamic bias signal that increases as a function of decision time. This bias signal depends on the speed-accuracy setting of human subjects, and it is reflected in the firing rates of neurons in the lateral intraparietal cortex (LIP) of rhesus monkeys performing this task. PMID:21525274
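
    A toy simulation may make the proposed mechanism concrete (a bare sketch under assumed parameters, not the authors' fitted model): the prior enters as a bias signal that grows with elapsed decision time, so slow decisions driven by weak evidence are increasingly dominated by it.

        import random

        def trial(coherence, prior_logodds, k=0.05, bound=30.0, noise=1.0):
            # Accumulate noisy momentary evidence; add a prior-driven bias
            # that grows linearly with elapsed time (the 'dynamic bias signal').
            evidence, t = 0.0, 0
            while True:
                t += 1
                evidence += coherence + random.gauss(0.0, noise)
                total = evidence + k * prior_logodds * t
                if abs(total) >= bound:
                    return total > 0, t   # choice, reaction time in steps

        # weak stimulus: decisions linger, so the prior carries more weight
        print(trial(coherence=0.02, prior_logodds=2.0))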

  12. Analytic methods in applied probability in memory of Fridrikh Karpelevich

    CERN Document Server

    Suhov, Yu M

    2002-01-01

    This volume is dedicated to F. I. Karpelevich, an outstanding Russian mathematician who made important contributions to applied probability theory. The book contains original papers focusing on several areas of applied probability and its uses in modern industrial processes, telecommunications, computing, mathematical economics, and finance. It opens with a review of Karpelevich's contributions to applied probability theory and includes a bibliography of his works. Other articles discuss queueing network theory, in particular, in heavy traffic approximation (fluid models). The book is suitable

  13. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  14. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
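
    A minimal illustration of the two inference principles named above, for a Bernoulli model with made-up counts (a stand-in for, say, mismatches observed at a genomic site):

        k, n = 7, 20          # hypothetical successes out of n trials

        # maximum likelihood: the binomial likelihood is maximized at k/n
        p_ml = k / n

        # Bayesian inference with a conjugate Beta(a, b) prior:
        # the posterior is Beta(a + k, b + n - k); report its mean
        a, b = 1.0, 1.0       # uniform prior
        p_bayes = (a + k) / (a + b + n)

        print(p_ml, p_bayes)  # 0.35 vs ~0.364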

  15. Transition probability spaces in loop quantum gravity

    Science.gov (United States)

    Guo, Xiao-Kan

    2018-03-01

    We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.

  16. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * A good and solid introduction to probability theory and stochastic processes * Logically organized, with the writing presented in a clear manner * A comprehensive choice of topics within the area of probability * Ample homework problems organized into chapter sections

  17. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  18. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory, which has proven essential in meteorology, will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  19. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  20. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  1. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  2. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  3. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  4. On the discretization of probability density functions and the ...

    Indian Academy of Sciences (India)

    important for most applications or theoretical problems of interest. In statistics ... In probability theory, statistics, statistical mechanics, communication theory, and other .... (1) by taking advantage of SMVT as a general mathematical approach.

  5. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    International Nuclear Information System (INIS)

    Vourdas, A.

    2014-01-01

    The orthocomplemented modular lattice of subspaces L[H(d)] of a quantum system with d-dimensional Hilbert space H(d) is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H_1, H_2), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H_1), P(H_2) onto the subspaces H_1, H_2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities
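
    A small numerical check of the kind of additivity failure described above (an illustrative two-dimensional example, not drawn from the paper): for non-commuting projectors, p(H1 v H2) + p(H1 ^ H2) differs from p(H1) + p(H2).

        import numpy as np

        P1 = np.array([[1.0, 0.0], [0.0, 0.0]])    # projector onto span{|0>}
        v = np.array([1.0, 1.0]) / np.sqrt(2)
        P2 = np.outer(v, v)                         # projector onto span{|+>}

        rho = np.array([[0.7, 0.2], [0.2, 0.3]])    # a valid density matrix

        p1 = np.trace(rho @ P1)
        p2 = np.trace(rho @ P2)
        # here H1 v H2 is the whole space and H1 ^ H2 = {0}
        print(p1 + p2)               # 1.4
        print(1.0 + 0.0)             # Kolmogorov additivity would require 1.0
        print(P1 @ P2 - P2 @ P1)     # the nonzero commutator drives the deviation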

  6. The control processes and subjective well-being of Chinese teachers: Evidence of convergence with and divergence from the key propositions of the motivational theory of life-span development

    Directory of Open Access Journals (Sweden)

    Wan-Chi Wong

    2014-05-01

    An analytical review of the motivational theory of life-span development reveals that this theory has undergone a series of elegant theoretical integrations. Its claim to universality nonetheless brings forth unresolved controversies. With the purpose of scrutinizing the key propositions of this theory, an empirical study was designed to examine the control processes and subjective well-being of Chinese teachers (N = 637). The OPS-Scales (Optimization in Primary and Secondary Control Scales for the Domain of Teaching) were constructed to assess patterns of control processes. Three facets of subjective well-being were investigated with the Positive and Negative Affect Schedule, the Life Satisfaction Scale, and the Subjective Vitality Scale. The results revealed certain aspects of alignment with and certain divergences from the key propositions of the motivational theory of life-span development. Neither primacy of primary control nor primacy of secondary control was clearly supported. Notably, using different criteria for subjective well-being yielded different subtypes of primary and secondary control as predictors. The hypothesized life-span trajectories of primary and secondary control received limited support. To advance the theory in this area, we recommend incorporating Lakatos' ideas about sophisticated falsification by specifying the hard core of the motivational theory of life-span development and articulating new auxiliary hypotheses.

  7. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

    Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results". Special emphases on simulation and discrete decision theory. Mathematically rich, but self-contained, text at a gentle pace. Review of calculus and linear algebra in an appendix. Mathematical interludes (in each chapter) examine mathematical techniques in the context of probabilistic or statistical importance. Numerous section exercises, summaries, historical notes, and Further Readings for reinforcem

  8. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information' since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)
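
    For concreteness, conditional entropy has the same 'chain' structure as conditional probability: H(Y|X) = H(X,Y) - H(X). A minimal computation over an arbitrary joint distribution:

        import math

        joint = {('a', 0): 0.3, ('a', 1): 0.2, ('b', 0): 0.1, ('b', 1): 0.4}

        def H(ps):
            # Shannon entropy in bits: a measure of non-uniformity
            return -sum(p * math.log2(p) for p in ps if p > 0)

        marginal_x = {}
        for (x, _), p in joint.items():
            marginal_x[x] = marginal_x.get(x, 0.0) + p

        h_cond = H(joint.values()) - H(marginal_x.values())   # H(Y|X)
        print(h_cond)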

  9. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  10. Topos theory

    CERN Document Server

    Johnstone, PT

    2014-01-01

    Focusing on topos theory's integration of geometric and logical ideas into the foundations of mathematics and theoretical computer science, this volume explores internal category theory, topologies and sheaves, geometric morphisms, and other subjects. 1977 edition.

  11. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  12. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.
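
    The simplest instance of the result may help fix ideas (a standard special case, not the paper's general construction): for the multinomial logit, the CPGF is G(u) = log sum_j exp(u_j), and its gradient returns the logit choice probabilities.

        import numpy as np

        u = np.array([1.0, 0.5, -0.2])       # utilities of three alternatives

        def G(u):
            return np.log(np.exp(u).sum())   # log-sum-exp CPGF

        eps = 1e-6                           # numerical gradient of G
        grad = np.array([(G(u + eps * e) - G(u - eps * e)) / (2 * eps)
                         for e in np.eye(len(u))])

        logit = np.exp(u) / np.exp(u).sum()  # closed-form logit probabilities
        print(grad)
        print(logit)                         # the two agree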

  13. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions: Sample Space and Events; Probabilities; Counting Techniques. Independence and Conditional Probability: Independence; Conditioning; The Borel-Cantelli Theorem. Discrete Random Variables: Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation. Generating Functions, Branching Processes, Random Walk Revisited: Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk. Markov Chains: Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity. Continuous Random Variables: Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  14. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a lookout, etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds ... probability, i.e. a study of the navigator's role in resolving critical situations; a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving
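
    The two-step structure described here is commonly summarized as: expected collisions = (number of 'geometric' collision candidates, assuming no evasive action) x (causation factor). A back-of-envelope sketch with purely hypothetical numbers:

        n_geometric = 120.0     # hypothetical geometric candidates per year
        p_causation = 2.0e-4    # hypothetical causation factor

        rate = n_geometric * p_causation
        print(rate)             # expected collisions per year
        print(1.0 / rate)       # mean return period in years (~42)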

  15. Quantum probability and quantum decision-making.

    Science.gov (United States)

    Yukalov, V I; Sornette, D

    2016-01-13

    A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary. © 2015 The Author(s).

  16. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  17. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  18. Accurate step-hold tracking of smoothly varying periodic and aperiodic probability.

    Science.gov (United States)

    Ricci, Matthew; Gallistel, Randy

    2017-07-01

    Subjects observing many samples from a Bernoulli distribution are able to perceive an estimate of the generating parameter. A question of fundamental importance is how the current percept (what we think the probability now is) depends on the sequence of observed samples. Answers to this question are strongly constrained by the manner in which the current percept changes in response to changes in the hidden parameter. Subjects do not update their percept trial-by-trial when the hidden probability undergoes unpredictable and unsignaled step changes; instead, they update it only intermittently in a step-hold pattern. It could be that the step-hold pattern is not essential to the perception of probability and is only an artifact of step changes in the hidden parameter. However, we now report that the step-hold pattern obtains even when the parameter varies slowly and smoothly. It obtains even when the smooth variation is periodic (sinusoidal) and perceived as such. We elaborate on a previously published theory that accounts for: (i) the quantitative properties of the step-hold update pattern; (ii) subjects' quick and accurate reporting of changes; (iii) subjects' second thoughts about previously reported changes; (iv) subjects' detection of higher-order structure in patterns of change. We also call attention to the challenges these results pose for trial-by-trial updating theories.
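
    An illustrative step-hold estimator (a deliberate simplification, not the theory elaborated in the paper): hold the current percept and step to a new one only when a recent window of samples is inconsistent with it. Run against a sinusoidally varying hidden parameter, its output is piecewise constant, as the experiments report.

        import math
        import random

        def step_hold(samples, window=20, threshold=0.2):
            estimate, history, out = 0.5, [], []
            for s in samples:
                history.append(s)
                recent = history[-window:]
                mean = sum(recent) / len(recent)
                if len(recent) == window and abs(mean - estimate) > threshold:
                    estimate = mean          # step to a new percept...
                out.append(estimate)         # ...otherwise hold
            return out

        p = [0.5 + 0.4 * math.sin(2 * math.pi * t / 500) for t in range(1000)]
        samples = [1 if random.random() < q else 0 for q in p]
        print(step_hold(samples)[::100])     # a step-hold track of a smooth p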

  19. Integration, measure and probability

    CERN Document Server

    Pitt, H R

    2012-01-01

    Introductory treatment develops the theory of integration in a general context, making it applicable to other branches of analysis. More specialized topics include convergence theorems and random sequences and functions. 1963 edition.

  20. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    Kim, Y.K.

    1980-01-01

    Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods

  1. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  2. Essays on Subjective Survival Probabilities, Consumption, and Retirement Decisions

    NARCIS (Netherlands)

    Kutlu Koc, Vesile

    2015-01-01

    Recent pension reforms in industrialized countries are, in part, motivated by the increased life expectancy. As individuals are expected to take more responsibility in their retirement planning and savings decisions, it is important to understand whether they are aware of improvements in life

  3. Interpreting The Awakening in Terms of Lacan’s Subject Theory

    Institute of Scientific and Technical Information of China (English)

    江云琴

    2014-01-01

    Edna, the heroine of The Awakening, was not satisfied with her life as a married woman with two children, and therefore tried to explore a new and independent life through the liberation of her sexual desire and romantic love, but she ended by killing herself. From the perspective of Lacan's theory of the three orders, Edna goes through a psychological journey with three aspects: trying to break away from the socialized roles set by the Other in the symbolic order; seeking solace in the imaginary order of sensual pleasures and romance; and finally, facing up to the void at the core of the subject and walking into the sea to attain a union with the Mother.

  4. Introduction to probability and statistics for science, engineering, and finance

    CERN Document Server

    Rosenkrantz, Walter A

    2008-01-01

    Data Analysis: Orientation; The Role and Scope of Statistics in Science and Engineering; Types of Data: Examples from Engineering, Public Health, and Finance; The Frequency Distribution of a Variable Defined on a Population; Quantiles of a Distribution; Measures of Location (Central Value) and Variability; Covariance, Correlation, and Regression: Computing a Stock's Beta; Mathematical Details and Derivations; Large Data Sets. Probability Theory: Orientation; Sample Space, Events, Axioms of Probability Theory; Mathematical Models of Random Sampling; Conditional Probability and Baye...

  5. Waste Package Misload Probability

    International Nuclear Information System (INIS)

    Knudsen, J.K.

    2001-01-01

    The objective of this calculation is to calculate the probability of occurrence for fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a
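
    The final step reduces to a simple rate estimate. A sketch with hypothetical counts (the Framatome event data are not reproduced in this record):

        misloads = 3              # hypothetical categorized misload events
        movements = 250_000       # hypothetical total FA movements

        p_point = misloads / movements                     # point estimate
        p_jeffreys = (misloads + 0.5) / (movements + 1.0)  # Jeffreys-prior mean
        print(f"{p_point:.2e} per movement, {p_jeffreys:.2e} with Jeffreys prior")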

  6. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
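
    The classical calculation behind a '95 percent interval of measurement uncertainty' can be sketched in a few lines (the standard textbook recipe; the book's extended methods refine it):

        import math
        import statistics

        readings = [10.03, 10.01, 9.98, 10.05, 10.00, 9.99]   # made-up data
        n = len(readings)
        mean = statistics.fmean(readings)
        u = statistics.stdev(readings) / math.sqrt(n)   # standard uncertainty

        k = 2.571    # Student-t coverage factor, 95%, n - 1 = 5 dof
        print(f"{mean:.3f} +/- {k * u:.3f}")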

  7. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  8. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  9. Retrocausality and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two counts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagen-like disavowal of realism in quantum mechanics. 6 refs. (Author)

  10. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  11. Quantum probability for probabilists

    CERN Document Server

    Meyer, Paul-André

    1993-01-01

    In recent years, the classical theory of stochastic integration and stochastic differential equations has been extended to a non-commutative set-up to develop models for quantum noises. The author, a specialist in classical stochastic calculus and martingale theory, tries to provide an introduction to this rapidly expanding field in a way which should be accessible to probabilists familiar with the Ito integral. It can also, on the other hand, provide a means of access to the methods of stochastic calculus for physicists familiar with Fock space analysis.

  12. Neutrosophic Probability, Set, And Logic (first version)

    OpenAIRE

    Smarandache, Florentin

    2000-01-01

    This project is a part of a National Science Foundation interdisciplinary project proposal. Starting from a new viewpoint in philosophy, the neutrosophy, one extends the classical "probability theory", "fuzzy set", and "fuzzy logic" to "neutrosophic probability", "neutrosophic set", and "neutrosophic logic", respectively. They are useful in artificial intelligence, neural networks, evolutionary programming, neutrosophic dynamic systems, and quantum mechanics.

  13. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another
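
    Both directions can be made concrete with dice. The a priori direction is counting; the statistical direction weighs hypotheses by their likelihoods (the loaded-die alternative below is invented for illustration):

        from fractions import Fraction
        from itertools import product

        # forward (probability): with fair dice every outcome is equally likely
        def p_sum(total, dice=2, sides=6):
            outcomes = list(product(range(1, sides + 1), repeat=dice))
            hits = sum(1 for o in outcomes if sum(o) == total)
            return Fraction(hits, len(outcomes))

        print(p_sum(7))   # 1/6 for two fair dice

        # inverse (statistics): likelihood ratio of 'fair' against a die
        # that shows six half the time, given some observed rolls
        rolls = [6, 2, 6, 6, 3, 6]
        def likelihood(rolls, p_six):
            out = 1.0
            for r in rolls:
                out *= p_six if r == 6 else (1.0 - p_six) / 5.0
            return out

        print(likelihood(rolls, 1 / 6) / likelihood(rolls, 0.5))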

  14. EARLY HISTORY OF GEOMETRIC PROBABILITY AND STEREOLOGY

    Directory of Open Access Journals (Sweden)

    Magdalena Hykšová

    2012-03-01

    The paper provides an account of the history of geometric probability and stereology from the time of Newton to the early 20th century. It traces two parallel lines of development: on one hand, the theory of geometric probability was formed with little attention to applications other than spatial games of chance. On the other hand, practical rules for the estimation of area or volume fraction and other characteristics, easily deducible from geometric probability theory, were proposed without knowledge of that branch. Special attention is paid to the paper of J.-É. Barbier published in 1860, which contained the fundamental stereological formulas but remained almost unnoticed by both mathematicians and practitioners.
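
    The canonical example of the branch, closely tied to Barbier's name through the needle problem: a needle of length L dropped on a floor ruled with lines a distance d apart (L <= d) crosses a line with probability 2L/(pi d). A Monte Carlo check:

        import math
        import random

        def buffon(n, L=1.0, d=2.0):
            hits = 0
            for _ in range(n):
                x = random.uniform(0.0, d / 2.0)        # centre to nearest line
                theta = random.uniform(0.0, math.pi)    # needle orientation
                if x <= (L / 2.0) * math.sin(theta):
                    hits += 1
            return hits / n

        print(buffon(100_000))                # empirical crossing frequency
        print(2 * 1.0 / (math.pi * 2.0))      # theory: 1/pi ~ 0.3183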

  15. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and respective tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability". The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a subjective "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples, from the most general scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management.

  16. The mathematics of various entertaining subjects

    CERN Document Server

    Rosenhouse, Jason

    Volume 1 : The history of mathematics is filled with major breakthroughs resulting from solutions to recreational problems. Problems of interest to gamblers led to the modern theory of probability, for example, and surreal numbers were inspired by the game of Go. Yet even with such groundbreaking findings and a wealth of popular-level books exploring puzzles and brainteasers, research in recreational mathematics has often been neglected. The Mathematics of Various Entertaining Subjects brings together authors from a variety of specialties to present fascinating problems and solutions in recreational mathematics. Contributors to the book show how sophisticated mathematics can help construct mazes that look like famous people, how the analysis of crossword puzzles has much in common with understanding epidemics, and how the theory of electrical circuits is useful in understanding the classic Towers of Hanoi puzzle. The card game SET is related to the theory of error-correcting codes, and simple tic-tac-toe tak...

  17. Probability mapping of contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).
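
    The post-processing step has a very simple core: the probability map is the per-cell fraction of the equally likely simulations that exceed the threshold. A sketch on synthetic stand-in data (the real inputs are conditional geostatistical simulations):

        import numpy as np

        rng = np.random.default_rng(0)
        n_sims, ny, nx = 200, 50, 50
        sims = rng.lognormal(mean=3.0, sigma=1.0, size=(n_sims, ny, nx))

        threshold = 35.0                              # hypothetical clean-up level
        prob_map = (sims > threshold).mean(axis=0)    # P(exceedance) per cell

        print(prob_map.shape, float(prob_map.min()), float(prob_map.max()))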

  18. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)

  19. Decisions under risk in Parkinson's disease: preserved evaluation of probability and magnitude.

    Science.gov (United States)

    Sharp, Madeleine E; Viswanathan, Jayalakshmi; McKeown, Martin J; Appel-Cresswell, Silke; Stoessl, A Jon; Barton, Jason J S

    2013-11-01

    Unmedicated Parkinson's disease patients tend to be risk-averse, while dopaminergic treatment causes a tendency to take risks. While dopamine agonists may result in clinically apparent impulse control disorders, treatment with levodopa also causes a shift in behaviour associated with an enhanced response to rewards. Two important determinants in decision-making are how subjects perceive the magnitude and probability of outcomes. Our objective was to determine whether patients with Parkinson's disease on or off levodopa showed differences in their perception of value when making decisions under risk. The Vancouver Gambling task presents subjects with a choice between one prospect with larger outcome and a second with higher probability. Eighteen age-matched controls and eighteen patients with Parkinson's disease, before and after levodopa, were tested. In the Gain Phase subjects chose between one prospect with higher probability and another with larger reward to maximize their gains. In the Loss Phase, subjects played to minimize their losses. Patients with Parkinson's disease, on or off levodopa, were similar to controls when evaluating gains. However, in the Loss Phase before levodopa, they were more likely to avoid the prospect with lower probability but larger loss, as indicated by the steeper slope of their group psychometric function (t(24) = 2.21, p = 0.04). Modelling with prospect theory suggested that this was attributable to a 28% overestimation of the magnitude of loss, rather than an altered perception of its probability. While pre-medicated patients with Parkinson's disease show risk-aversion for large losses, patients on levodopa have normal perception of magnitude and probability for both loss and gain. The finding of accurate and normally biased decisions under risk in medicated patients with PD is important because it indicates that, if there is indeed anomalous risk-seeking behaviour in such a cohort, it may derive from abnormalities in components of

  20. Foundations of quantization for probability distributions

    CERN Document Server

    Graf, Siegfried

    2000-01-01

    Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous and special classes of singular probabilities (surface measures, self-similar measures) presenting some new results for the first time. Written for researchers and graduate students in probability theory the monograph is of potential interest to all people working in the disciplines mentioned above.

  1. Probability of causation approach

    International Nuclear Information System (INIS)

    Jose, D.E.

    1988-01-01

    Probability of causation (PC) is sometimes viewed as a great improvement by those persons who are not happy with the present rulings of courts in radiation cases. The author does not share that hope and expects that PC will not play a significant role in these issues for at least the next decade. If it is ever adopted in a legislative compensation scheme, it will be used in a way that is unlikely to please most scientists. Consequently, PC is a false hope for radiation scientists, and its best contribution may well lie in some of the spin-off effects, such as an influence on medical practice

  2. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

    From the integration of nonsymmetrical hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable for generalizing some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments were also obtained analytically.
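
    A common way to write such a one-parameter pair, consistent with the abstract's description (the Tsallis-like form; the paper's own notation may differ): ln_q(x) = (x^(1-q) - 1)/(1 - q), with inverse exp_q(x) = (1 + (1-q)x)^(1/(1-q)), both reducing to ln and exp as q -> 1.

        import math

        def ln_q(x, q):
            # generalized logarithm; ordinary log at q = 1
            return math.log(x) if q == 1 else (x ** (1 - q) - 1) / (1 - q)

        def exp_q(x, q):
            # generalized exponential, the inverse of ln_q
            if q == 1:
                return math.exp(x)
            base = 1 + (1 - q) * x
            return base ** (1 / (1 - q)) if base > 0 else 0.0

        x, q = 2.5, 0.7
        print(ln_q(x, q), exp_q(ln_q(x, q), q))   # inverse pair: recovers 2.5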

  3. A modern theory of random variation with applications in stochastic calculus, financial mathematics, and Feynman integration

    CERN Document Server

    Muldowney, Patrick

    2012-01-01

    A Modern Theory of Random Variation is a new and radical re-formulation of the mathematical underpinnings of subjects as diverse as investment, communication engineering, and quantum mechanics. Setting aside the classical theory of probability measure spaces, the book utilizes a mathematically rigorous version of the theory of random variation that bases itself exclusively on finitely additive probability distribution functions. In place of twentieth century Lebesgue integration and measure theory, the author uses the simpler concept of Riemann sums, and the non-absolute Riemann-type integration of Henstock. Readers are supplied with an accessible approach to standard elements of probability theory such as the central limit theorem and Brownian motion as well as remarkable new results on Feynman diagrams and stochastic integrals. Throughout the book, detailed numerical demonstrations accompany the discussions of abstract mathematical theory, from the simplest elements of the subject to the most complex. I...

  4. Probable maximum flood control

    International Nuclear Information System (INIS)

    DeGabriele, C.E.; Wu, C.L.

    1991-11-01

    This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility

  5. Probability of Boulders

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    1997-01-01

    To collect background information for formulating a description of the expected soil properties along the tunnel line, in 1987 Storebælt initiated a statistical investigation of the occurrence and size of boulders in the Great Belt area. The data for the boulder size distribution were obtained ... The data collection part of the investigation was made on the basis of geological expert advice (Gunnar Larsen, Århus) by the Danish Geotechnical Institute (DGI). The statistical data analysis combined with stochastic modeling based on geometry and sound wave diffraction theory gave a point estimate ...

  6. Classical probabilities for Majorana and Weyl spinors

    International Nuclear Information System (INIS)

    Wetterich, C.

    2011-01-01

    Highlights: → Map of classical statistical Ising model to fermionic quantum field theory. → Lattice-regularized real Grassmann functional integral for single Weyl spinor. → Emerging complex structure characteristic for quantum physics. → A classical statistical ensemble describes a quantum theory. - Abstract: We construct a map between the quantum field theory of free Weyl or Majorana fermions and the probability distribution of a classical statistical ensemble for Ising spins or discrete bits. More precisely, a Grassmann functional integral based on a real Grassmann algebra specifies the time evolution of the real wave function q_τ(t) for the Ising states τ. The time-dependent probability distribution of a generalized Ising model obtains as p_τ(t) = q_τ(t)². The functional integral employs a lattice regularization for single Weyl or Majorana spinors. We further introduce the complex structure characteristic for quantum mechanics. Probability distributions of the Ising model which correspond to one or many propagating fermions are discussed explicitly. Expectation values of observables can be computed equivalently in the classical statistical Ising model or in the quantum field theory for fermions.
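
    The central rule p_τ(t) = q_τ(t)² is simple to demonstrate numerically; in the Python sketch below the wave-function values are hypothetical and the normalization convention is an assumption, since the abstract states only the squaring rule.

        import numpy as np

        # Toy real wave function over four Ising states tau (hypothetical values).
        q = np.array([0.6, -0.3, 0.5, 0.2])
        q = q / np.linalg.norm(q)   # normalize so the probabilities sum to one

        # Probability distribution of the generalized Ising model: p_tau = q_tau**2.
        p = q**2
        print(p, p.sum())           # p.sum() == 1.0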

  7. Conditional probability on MV-algebras

    Czech Academy of Sciences Publication Activity Database

    Kroupa, Tomáš

    2005-01-01

    Roč. 149, č. 2 (2005), s. 369-381 ISSN 0165-0114 R&D Projects: GA AV ČR IAA2075302 Institutional research plan: CEZ:AV0Z10750506 Keywords : conditional probability * tribe * MV-algebra Subject RIV: BA - General Mathematics Impact factor: 1.039, year: 2005

  8. Probability and rational choice

    Directory of Open Access Journals (Sweden)

    David Botting

    2014-05-01

    Full Text Available http://dx.doi.org/10.5007/1808-1711.2014v18n1p1 In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.
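
    The gap between these two inferences has a classical numerical illustration (my example, not the paper's): under Laplace's rule of succession, the probability of the next instance after n uniformly positive cases differs sharply from the probability that the next m cases all conform, the nearest finite proxy for the generalization. A short Python sketch:

        def next_instance(n):
            # Laplace's rule of succession: P(next positive | n positives, 0 negatives)
            # under a uniform prior on the Bernoulli parameter.
            return (n + 1) / (n + 2)

        def next_m_instances(n, m):
            # P(the next m instances are all positive | n positives): a finite
            # stand-in for the generalization "all future instances are positive".
            return (n + 1) / (n + m + 1)

        n = 20
        print(next_instance(n))           # ~0.955: confident about the next case
        print(next_m_instances(n, 100))   # ~0.174: far less confident in the generalization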

  9. Stochastics introduction to probability and statistics

    CERN Document Server

    Georgii, Hans-Otto

    2012-01-01

    This second revised and extended edition presents the fundamental ideas and results of both probability theory and statistics, and comprises the material of a one-year course. It is addressed to students with an interest in the mathematical side of stochastics. Stochastic concepts, models and methods are motivated by examples and developed and analysed systematically. Some measure theory is included, but this is done at an elementary level that is in accordance with the introductory character of the book. A large number of problems offer applications and supplements to the text.

  10. The theory of quantum information

    CERN Document Server

    Watrous, John

    2018-01-01

    This largely self-contained book on the theory of quantum information focuses on precise mathematical formulations and proofs of fundamental facts that form the foundation of the subject. It is intended for graduate students and researchers in mathematics, computer science, and theoretical physics seeking to develop a thorough understanding of key results, proof techniques, and methodologies that are relevant to a wide range of research topics within the theory of quantum information and computation. The book is accessible to readers with an understanding of basic mathematics, including linear algebra, mathematical analysis, and probability theory. An introductory chapter summarizes these necessary mathematical prerequisites, and starting from this foundation, the book includes clear and complete proofs of all results it presents. Each subsequent chapter includes challenging exercises intended to help readers to develop their own skills for discovering proofs concerning the theory of quantum information.

  11. Random vibrations theory and practice

    CERN Document Server

    Wirsching, Paul H; Ortiz, Keith

    1995-01-01

    Random Vibrations: Theory and Practice covers the theory and analysis of mechanical and structural systems undergoing random oscillations due to any number of phenomena— from engine noise, turbulent flow, and acoustic noise to wind, ocean waves, earthquakes, and rough pavement. For systems operating in such environments, a random vibration analysis is essential to the safety and reliability of the system. By far the most comprehensive text available on random vibrations, Random Vibrations: Theory and Practice is designed for readers who are new to the subject as well as those who are familiar with the fundamentals and wish to study a particular topic or use the text as an authoritative reference. It is divided into three major sections: fundamental background, random vibration development and applications to design, and random signal analysis. Introductory chapters cover topics in probability, statistics, and random processes that prepare the reader for the development of the theory of random vibrations a...

  12. The Influence of Subjective Life Expectancy on Retirement Transition and Planning: A Longitudinal Study

    Science.gov (United States)

    Griffin, Barbara; Hesketh, Beryl; Loh, Vanessa

    2012-01-01

    This study examines the construct of subjective life expectancy (SLE), or the estimation of one's probable age of death. Drawing on the tenets of socioemotional selectivity theory (Carstensen, Isaacowitz, & Charles, 1999), we propose that SLE provides individuals with their own unique mental model of remaining time that is likely to affect their…

  13. Probability and statistics for particle physics

    CERN Document Server

    Mana, Carlos

    2017-01-01

    This book comprehensively presents the basic concepts of probability and Bayesian inference with sufficient generality to make them applicable to current problems in scientific research. The first chapter provides the fundamentals of probability theory that are essential for the analysis of random phenomena. The second chapter includes a full and pragmatic review of the Bayesian methods that constitute a natural and coherent framework with enough freedom to analyze all the information available from experimental data in a conceptually simple manner. The third chapter presents the basic Monte Carlo techniques used in scientific research, allowing a large variety of problems that are difficult to tackle by other procedures to be handled. The author also introduces a basic algorithm, which enables readers to simulate samples from simple distributions, and describes useful cases for researchers in particle physics. The final chapter is devoted to the basic ideas of Information Theory, which are important in the Bayesian me...

  14. Comment on 'The meaning of probability in probabilistic safety analysis'

    International Nuclear Information System (INIS)

    Yellman, Ted W.; Murray, Thomas M.

    1995-01-01

    A recent article in Reliability Engineering and System Safety argues that there is 'fundamental confusion over how to interpret the numbers which emerge from a Probabilistic Safety Analysis [PSA]', [Watson, S. R., The meaning of probability in probabilistic safety analysis. Reliab. Engng and System Safety, 45 (1994) 261-269.] As a standard for comparison, the author employs the 'realist' interpretation that a PSA output probability should be a 'physical property' of the installation being analyzed, 'objectively measurable' without controversy. The author finds all the other theories and philosophies discussed wanting by this standard. Ultimately, he argues that the outputs of a PSA should be considered to be no more than constructs of the computational procedure chosen - just an 'argument' or a 'framework for the debate about safety' rather than a 'representation of truth'. He even suggests that 'competing' PSA's be done - each trying to 'argue' for a different message. The commentors suggest that the position the author arrives at is an overreaction to the subjectivity which is part of any complex PSA, and that that overreaction could in fact easily lead to the belief that PSA's are meaningless. They suggest a broader interpretation, one based strictly on relative frequency--a concept which the commentors believe the author abandoned too quickly. Their interpretation does not require any 'tests' to determine whether a statement of likelihood is qualified to be a 'true' probability and it applies equally well in pure analytical models. It allows anyone's proper numerical statement of the likelihood of an event to be considered a probability. It recognizes that the quality of PSA's and their results will vary. But, unlike the author, the commentors contend that a PSA should always be a search for truth--not a vehicle for adversarial pleadings

  15. Ignition probabilities for Compact Ignition Tokamak designs

    International Nuclear Information System (INIS)

    Stotler, D.P.; Goldston, R.J.

    1989-09-01

    A global power balance code employing Monte Carlo techniques has been developed to study the "probability of ignition" and has been applied to several different configurations of the Compact Ignition Tokamak (CIT). Probability distributions for the critical physics parameters in the code were estimated using existing experimental data. This included a statistical evaluation of the uncertainty in extrapolating the energy confinement time. A substantial probability of ignition is predicted for CIT if peaked density profiles can be achieved or if one of the two higher plasma current configurations is employed. In other cases, values of the energy multiplication factor Q of order 10 are generally obtained. The Ignitor-U and ARIES designs are also examined briefly. Comparisons of our empirically based confinement assumptions with two theory-based transport models yield conflicting results. 41 refs., 11 figs
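
    The Monte Carlo approach described, propagating parameter uncertainty through a power-balance criterion and reading off the fraction of ignited samples, can be sketched generically in Python; the parameter distributions and the ignition criterion below are illustrative stand-ins, not the CIT code.

        import numpy as np

        rng = np.random.default_rng(0)
        N = 100_000

        # Hypothetical uncertain physics parameters (stand-ins for confinement
        # scaling, density peaking, etc.), sampled from assumed distributions.
        confinement_mult = rng.lognormal(mean=0.0, sigma=0.2, size=N)
        density_peaking = rng.uniform(1.0, 2.0, size=N)

        # Toy ignition criterion: a sample "ignites" when the combined multiplier
        # exceeds a threshold (a stand-in for the full power balance).
        ignites = confinement_mult * density_peaking > 1.8

        print("estimated probability of ignition:", ignites.mean())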

  16. Python for probability, statistics, and machine learning

    CERN Document Server

    Unpingco, José

    2016-01-01

    This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowl...

  17. Geometric modeling in probability and statistics

    CERN Document Server

    Calin, Ovidiu

    2014-01-01

    This book covers topics of Informational Geometry, a field which deals with the differential geometric study of manifolds of probability density functions. This is a field that is increasingly attracting the interest of researchers from many different areas of science, including mathematics, statistics, geometry, computer science, signal processing, physics and neuroscience. It is the authors’ hope that the present book will be a valuable reference for researchers and graduate students in one of the aforementioned fields. This textbook is a unified presentation of differential geometry and probability theory, and constitutes a text for a course directed at graduate or advanced undergraduate students interested in applications of differential geometry in probability and statistics. The book contains over 100 proposed exercises meant to help students deepen their understanding, and it is accompanied by software that is able to provide numerical computations of several information geometric objects. The reader...

  18. Probability of brittle failure

    Science.gov (United States)

    Kim, A.; Bosnyak, C. P.; Chudnovsky, A.

    1991-01-01

    A methodology was developed for collecting statistically representative data for crack initiation and arrest from a small number of test specimens. An epoxy (based on bisphenol A diglycidyl ether and polyglycol extended diglycyl ether and cured with diethylene triamine) is selected as a model material. A compact tension specimen with displacement controlled loading is used to observe multiple crack initiations and arrests. The energy release rate at crack initiation is significantly higher than that at a crack arrest, as has been observed elsewhere. The difference between these energy release rates is found to depend on specimen size (scale effect), and is quantitatively related to the fracture surface morphology. The scale effect, similar to that in statistical strength theory, is usually attributed to the statistics of defects which control the fracture process. Triangular shaped ripples (deltoids) are formed on the fracture surface during the slow subcritical crack growth, prior to the smooth mirror-like surface characteristic of fast cracks. The deltoids are complementary on the two crack faces, which excludes any inelastic deformation from consideration. The presence of defects is also suggested by the observed scale effect. However, there are no defects detectable at the deltoid apexes down to the 0.1 micron level.

  19. Probability Weighting as Evolutionary Second-best

    OpenAIRE

    Herold, Florian; Netzer, Nick

    2011-01-01

    The economic concept of the second-best involves the idea that multiple simultaneous deviations from a hypothetical first-best optimum may be optimal once the first-best itself can no longer be achieved, since one distortion may partially compensate for another. Within an evolutionary framework, we translate this concept to behavior under uncertainty. We argue that the two main components of prospect theory, the value function and the probability weighting function, are complements in the sec...

  20. Marrakesh International Conference on Probability and Statistics

    CERN Document Server

    Ouassou, Idir; Rachdi, Mustapha

    2015-01-01

    This volume, which highlights recent advances in statistical methodology and applications, is divided into two main parts. The first part presents theoretical results on estimation techniques in functional statistics, while the second examines three key areas of application: estimation problems in queuing theory, an application in signal processing, and the copula approach to epidemiologic modelling. The book’s peer-reviewed contributions are based on papers originally presented at the Marrakesh International Conference on Probability and Statistics held in December 2013.

  1. Selected papers on analysis, probability, and statistics

    CERN Document Server

    Nomizu, Katsumi

    1994-01-01

    This book presents papers that originally appeared in the Japanese journal Sugaku. The papers fall into the general area of mathematical analysis as it pertains to probability and statistics, dynamical systems, differential equations and analytic function theory. Among the topics discussed are: stochastic differential equations, spectra of the Laplacian and Schrödinger operators, nonlinear partial differential equations which generate dissipative dynamical systems, fractal analysis on self-similar sets and the global structure of analytic functions.

  2. Probability and statistics in particle physics

    International Nuclear Information System (INIS)

    Frodesen, A.G.; Skjeggestad, O.

    1979-01-01

    Probability theory is introduced at an elementary level and given a simple and detailed exposition. The material on statistics has been organised with an eye to the experimental physicist's practical needs, which are likely to be statistical methods for estimation or decision-making. The book is intended for graduate students and research workers in experimental high energy and elementary particle physics, and numerous examples from these fields are presented. (JIW)

  3. Prospect evaluation as a function of numeracy and probability denominator.

    Science.gov (United States)

    Millroth, Philip; Juslin, Peter

    2015-05-01

    This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used.
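
    The cumulative prospect theory analysis mentioned here involves fitting a probability weighting function; a minimal Python sketch using the common one-parameter Tversky-Kahneman form (an assumption; the study may fit a different parameterization):

        import numpy as np

        def tk_weight(p, gamma):
            # Tversky-Kahneman probability weighting function; gamma < 1
            # overweights small probabilities and underweights large ones.
            return p**gamma / (p**gamma + (1.0 - p)**gamma)**(1.0 / gamma)

        def cpt_price(outcome, p, gamma, alpha):
            # Subjective price of a simple gamble (outcome with probability p):
            # a power value function times the weighted probability.
            return (outcome**alpha) * tk_weight(p, gamma)

        # A gamble paying 100 with probability 0.05, for an agent with moderate
        # probability distortion and slight value curvature (hypothetical values).
        print(cpt_price(100.0, 0.05, gamma=0.6, alpha=0.9))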

  4. Neutron emission probability at high excitation and isospin

    International Nuclear Information System (INIS)

    Aggarwal, Mamta

    2005-01-01

    One-neutron and two-neutron emission probabilities at different excitations and varying isospin have been studied. Several degrees of freedom, such as deformation, rotation, temperature, isospin fluctuations and shell structure, are incorporated via the statistical theory of hot rotating nuclei.

  5. Continuation of probability density functions using a generalized Lyapunov approach

    NARCIS (Netherlands)

    Baars, S.; Viebahn, J. P.; Mulder, T. E.; Kuehn, C.; Wubs, F. W.; Dijkstra, H. A.

    2017-01-01

    Techniques from numerical bifurcation theory are very useful to study transitions between steady fluid flow patterns and the instabilities involved. Here, we provide computational methodology to use parameter continuation in determining probability density functions of systems of stochastic partial differential equations.

  6. Consistent probabilities in loop quantum cosmology

    International Nuclear Information System (INIS)

    Craig, David A; Singh, Parampreet

    2013-01-01

    A fundamental issue for any quantum cosmological theory is to specify how probabilities can be assigned to various quantum events or sequences of events such as the occurrence of singularities or bounces. In previous work, we have demonstrated how this issue can be successfully addressed within the consistent histories approach to quantum theory for Wheeler–DeWitt-quantized cosmological models. In this work, we generalize that analysis to the exactly solvable loop quantization of a spatially flat, homogeneous and isotropic cosmology sourced with a massless, minimally coupled scalar field known as sLQC. We provide an explicit, rigorous and complete decoherent-histories formulation for this model and compute the probabilities for the occurrence of a quantum bounce versus a singularity. Using the scalar field as an emergent internal time, we show for generic states that the probability for a singularity to occur in this model is zero, and that of a bounce is unity, complementing earlier studies of the expectation values of the volume and matter density in this theory. We also show from the consistent histories point of view that all states in this model, whether quantum or classical, achieve arbitrarily large volume in the limit of infinite ‘past’ or ‘future’ scalar ‘time’, in the sense that the wave function evaluated at any arbitrary fixed value of the volume vanishes in that limit. Finally, we briefly discuss certain misconceptions concerning the utility of the consistent histories approach in these models. (paper)

  7. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    Table of contents (extract): Foreword; Preface; Sets, Events, and Probability; The Algebra of Sets; The Bernoulli Sample Space; The Algebra of Multisets; The Concept of Probability; Properties of Probability Measures; Independent Events; The Bernoulli Process; The R Language; Finite Processes; The Basic Models; Counting Rules; Computing Factorials; The Second Rule of Counting; Computing Probabilities; Discrete Random Variables; The Bernoulli Process: Tossing a Coin; The Bernoulli Process: Random Walk; Independence and Joint Distributions; Expectations; The Inclusion-Exclusion Principle; General Random Variable

  8. Quantum processes: probability fluxes, transition probabilities in unit time and vacuum vibrations

    International Nuclear Information System (INIS)

    Oleinik, V.P.; Arepjev, Ju D.

    1989-01-01

    Transition probabilities in unit time and probability fluxes are compared in studying the elementary quantum processes - the decay of a bound state under the action of time-varying and constant electric fields. It is shown that the difference between these quantities may be considerable, and so the use of transition probabilities W instead of probability fluxes Π, in calculating the particle fluxes, may lead to serious errors. The quantity W represents the rate of change with time of the population of the energy levels relating partly to the real states and partly to the virtual ones, and it cannot be directly measured in experiment. The vacuum background is shown to be continuously distorted when a perturbation acts on a system. Because of this the viewpoint of an observer on the physical properties of real particles continuously varies with time. This fact is not taken into consideration in the conventional theory of quantum transitions based on using the notion of probability amplitude. As a result, the probability amplitudes lose their physical meaning. All the physical information on quantum dynamics of a system is contained in the mean values of physical quantities. The existence of considerable differences between the quantities W and Π permits one in principle to make a choice of the correct theory of quantum transitions on the basis of experimental data. (author)

  9. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists underfitting and overfitting the data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.
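
    The scaled quantile residual diagnostic mentioned at the end can be illustrated with a short Python sketch: push the sorted sample through a trial CDF and compare against the expected uniform order statistics (here the trial CDF is simply the true normal CDF; this illustrates the idea, not the authors' implementation).

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        sample = np.sort(rng.normal(loc=0.0, scale=1.0, size=500))

        # A good trial CDF maps the sorted sample close to the expected
        # uniform order statistics.
        u = stats.norm.cdf(sample)
        n = len(u)
        expected = (np.arange(1, n + 1) - 0.5) / n

        # Scale the residuals by each position's approximate standard deviation,
        # so typical fluctuations are O(1) everywhere along the distribution.
        scale = np.sqrt(expected * (1.0 - expected) / n)
        residuals = (u - expected) / scale
        print("max |scaled quantile residual|:", np.abs(residuals).max())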

  10. Information-theoretic methods for estimating of complicated probability distributions

    CERN Document Server

    Zong, Zhi

    2006-01-01

    Mixing various disciplines frequently produces something profound and far-reaching. Cybernetics is an often-quoted example. The mix of information theory, statistics and computing technology has proved very useful, leading to the recent development of information-theory based methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is the fundamental task for quite some fields besides statistics, such as reliability, probabilistic risk analysis (PSA), machine learning, pattern recognition, image processing, neur

  11. Nokton theory

    OpenAIRE

    SAIDANI Lassaad

    2015-01-01

    The nokton theory is an attempt to construct a theory adapted to every physical phenomenon. Space and time have been discretized. Its laws are iterative and precise. Probability plays an important role here. At first I defined the notion of image function and its mathematical framework. The notion of nokton and its state are the basis of several definitions. I later defined the canonical image function and the canonical contribution. Two constants have been necessary to define the dynam...

  12. Nokton theory

    OpenAIRE

    SAIDANI Lassaad

    2017-01-01

    The nokton theory is an attempt to construct a theory adapted to every physical phenomenon. Space and time have been discretized. Its laws are iterative and precise. Probability plays an important role here. At first I defined the notion of image function and its mathematical framework. The notion of nokton and its state are the basis of several definitions. I later defined the canonical image function and the canonical contribution. Two constants have been necessary to define the dynam...

  13. A Quantum Theoretical Explanation for Probability Judgment Errors

    Science.gov (United States)

    Busemeyer, Jerome R.; Pothos, Emmanuel M.; Franco, Riccardo; Trueblood, Jennifer S.

    2011-01-01

    A quantum probability model is introduced and used to explain human probability judgment errors including the conjunction and disjunction fallacies, averaging effects, unpacking effects, and order effects on inference. On the one hand, quantum theory is similar to other categorization and memory models of cognition in that it relies on vector…

  14. Upper Bounds for Ruin Probability with Stochastic Investment Return

    Institute of Scientific and Technical Information of China (English)

    ZHANG Lihong

    2005-01-01

    Risk models with stochastic investment return are widely used in practice, as well as in more challenging research fields. Risk theory is mainly concerned with ruin probability, and a tight bound on the ruin probability is the most useful in practice. This paper presents a discrete time risk model with stochastic investment return. Conditional expectation properties and martingale inequalities are used to obtain both exponential and non-exponential upper bounds for the ruin probability.
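
    A Monte Carlo simulation of such a discrete-time risk model gives a useful check on analytic upper bounds; in the Python sketch below the premium, claim, and investment-return distributions are hypothetical choices, not those of the paper.

        import numpy as np

        rng = np.random.default_rng(2)

        def ruin_probability(u0, premium, n_years=50, n_paths=100_000):
            # Finite-horizon ruin probability for the surplus recursion
            # U_t = R_t * U_{t-1} + premium - claim (hypothetical model).
            surplus = np.full(n_paths, u0, dtype=float)
            ruined = np.zeros(n_paths, dtype=bool)
            for _ in range(n_years):
                returns = rng.lognormal(mean=0.03, sigma=0.15, size=n_paths)
                claims = rng.exponential(scale=1.0, size=n_paths)
                surplus = returns * surplus + premium - claims
                ruined |= surplus < 0.0
            return ruined.mean()

        print(ruin_probability(u0=5.0, premium=1.2))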

  15. Atomic theories

    CERN Document Server

    Loring, FH

    2014-01-01

    Summarising the most novel facts and theories which were coming into prominence at the time, particularly those which had not yet been incorporated into standard textbooks, this important work was first published in 1921. The subjects treated cover a wide range of research that was being conducted into the atom, and include Quantum Theory, the Bohr Theory, the Sommerfeld extension of Bohr's work, the Octet Theory and Isotopes, as well as Ionisation Potentials and Solar Phenomena. Because much of the material of Atomic Theories lies on the boundary between experimentally verified fact and spec

  16. An introduction to information theory

    CERN Document Server

    Reza, Fazlollah M

    1994-01-01

    Graduate-level study for engineering students presents elements of modern probability theory, information theory, coding theory, more. Emphasis on sample space, random variables, capacity, etc. Many reference tables and extensive bibliography. 1961 edition.

  17. Expanding subjectivities

    DEFF Research Database (Denmark)

    Lundgaard Andersen, Linda; Soldz, Stephen

    2012-01-01

    A major theme in recent psychoanalytic thinking concerns the use of therapist subjectivity, especially “countertransference,” in understanding patients. This thinking converges with and expands developments in qualitative research regarding the use of researcher subjectivity as a tool. ... Drawing on both Anglo-Saxon and continental traditions, this special issue provides examples of the use of researcher subjectivity, informed by psychoanalytic thinking, in expanding research understanding.

  18. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  19. AN EDUCATIONAL THEORY MODEL--(SIGGS), AN INTEGRATION OF SET THEORY, INFORMATION THEORY, AND GRAPH THEORY WITH GENERAL SYSTEMS THEORY.

    Science.gov (United States)

    MACCIA, ELIZABETH S.; AND OTHERS

    AN ANNOTATED BIBLIOGRAPHY OF 20 ITEMS AND A DISCUSSION OF ITS SIGNIFICANCE WAS PRESENTED TO DESCRIBE CURRENT UTILIZATION OF SUBJECT THEORIES IN THE CONSTRUCTION OF AN EDUCATIONAL THEORY. ALSO, A THEORY MODEL WAS USED TO DEMONSTRATE CONSTRUCTION OF A SCIENTIFIC EDUCATIONAL THEORY. THE THEORY MODEL INCORPORATED SET THEORY (S), INFORMATION THEORY…

  20. A High-Order Theory for the Analysis of Circular Cylindrical Composite Sandwich Shells with Transversely Compliant Core Subjected to External Loads

    DEFF Research Database (Denmark)

    Rahmani, Omid; Khalili, S.M.R.; Thomsen, Ole Thybo

    2012-01-01

    A new model based on the high order sandwich panel theory is proposed to study the effect of external loads on the free vibration of circular cylindrical composite sandwich shells with transversely compliant core, including also the calculation of the buckling loads. In the present model ... The model, which is based on a 3D elasticity solution for the core material, can be used as a benchmark in future studies of the free vibration and buckling of circular cylindrical composite sandwich shells with a transversely compliant core.

  1. Characteristics of Spatial Synchronization of Encephalograms in Left- and Right-Handed Subjects in Resting State and During Cognitive Testing: a Graph-Theory Analysis

    OpenAIRE

    Lukoyanov M.V.; Grechikhin I.S.; Kalyagin V.A.; Pardalos P.M.; Mukhina I.V.

    2014-01-01

    Hand preference is one of the most striking manifestations of functional brain asymmetry. However, the nature of the phenomenon, as well as its interaction with other brain functions has not been fully understood. Therefore, the study of brain peculiarities of left- and right-handed subjects by neuronal network analysis is of particular interest. The aim of the investigation was to analyze brain network structures according to electroencephalography findings in left- and right-handed subj...

  2. Probability with applications in engineering, science, and technology

    CERN Document Server

    Carlton, Matthew A

    2017-01-01

    This updated and revised first-course textbook in applied probability provides a contemporary and lively post-calculus introduction to the subject of probability. The exposition reflects a desirable balance between fundamental theory and many applications involving a broad range of real problem scenarios. It is intended to appeal to a wide audience, including mathematics and statistics majors, prospective engineers and scientists, and those business and social science majors interested in the quantitative aspects of their disciplines. The textbook contains enough material for a year-long course, though many instructors will use it for a single term (one semester or one quarter). As such, three course syllabi with expanded course outlines are now available for download on the book’s page on the Springer website. A one-term course would cover material in the core chapters (1-4), supplemented by selections from one or more of the remaining chapters on statistical inference (Ch. 5), Markov chains (Ch. 6), stoch...

  3. Random phenomena fundamentals of probability and statistics for engineers

    CERN Document Server

    Ogunnaike, Babatunde A

    2009-01-01

    Table of contents (extract): Prelude; Approach Philosophy; Four Basic Principles; I. Foundations; Two Motivating Examples; Yield Improvement in a Chemical Process; Quality Assurance in a Glass Sheet Manufacturing Process; Outline of a Systematic Approach; Random Phenomena, Variability, and Uncertainty; Two Extreme Idealizations of Natural Phenomena; Random Mass Phenomena; Introducing Probability; The Probabilistic Framework; II. Probability; Fundamentals of Probability Theory; Building Blocks; Operations; Probability; Conditional Probability; Independence; Random Variables and Distributions; Distributions; Mathematical Expectation; Characterizing Distributions; Special Derived Probability Functions; Multidimensional Random Variables; Distributions of Several Random Variables; Distributional Characteristics of Jointly Distributed Random Variables; Random Variable Transformations; Single Variable Transformations; Bivariate Transformations; General Multivariate Transformations; Application Case Studies I: Probability; Mendel and Heredity; World War II Warship Tactical Response Under Attack; III. Distributions; Ide...

  4. Greek paideia and terms of probability

    Directory of Open Access Journals (Sweden)

    Fernando Leon Parada

    2016-06-01

    Full Text Available This paper addresses three aspects of the conceptual framework for doctoral dissertation research in progress in the field of Mathematics Education, in particular, in the subfield of teaching and learning basic concepts of Probability Theory at the College level. It intends to contrast, sustain and elucidate the central statement that the meanings of some of these basic terms used in Probability Theory were not formally defined by any specific theory but relate to primordial ideas developed in Western culture from Ancient Greek myths. The first aspect deals with the notion of uncertainty, with which Greek thinkers described several archaic gods and goddesses of Destiny, like Parcas and Moiras, often personified in the goddess Tyche (Fortuna for the Romans), as regarded in Werner Jaeger’s “Paideia”. The second aspect treats the idea of hazard from two different approaches: the first approach deals with hazard, denoted by Plato with the already demythologized term ‘tyche’ from the viewpoint of innate knowledge, as Jaeger points out. The second approach deals with hazard from a perspective that could be called “phenomenological”, from which Aristotle attempted to articulate uncertainty with a discourse based on the hypothesis of causality. The term ‘causal’ was opposed both to ‘casual’ and to ‘spontaneous’ (as used in the expression “spontaneous generation”), attributing uncertainty to ignorance of the future, thus respecting causal flow. The third aspect treated in the paper refers to some definitions and etymologies of some other modern words that have become technical terms in current Probability Theory, confirming the above-mentioned main proposition of this paper.

  5. Probability and Statistics in Aerospace Engineering

    Science.gov (United States)

    Rheinfurth, M. H.; Howell, L. W.

    1998-01-01

    This monograph was prepared to give the practicing engineer a clear understanding of probability and statistics with special consideration to problems frequently encountered in aerospace engineering. It is conceived to be both a desktop reference and a refresher for aerospace engineers in government and industry. It could also be used as a supplement to standard texts for in-house training courses on the subject.

  6. Heart sounds analysis using probability assessment

    Czech Academy of Sciences Publication Activity Database

    Plešinger, Filip; Viščor, Ivo; Halámek, Josef; Jurčo, Juraj; Jurák, Pavel

    2017-01-01

    Roč. 38, č. 8 (2017), s. 1685-1700 ISSN 0967-3334 R&D Projects: GA ČR GAP102/12/2034; GA MŠk(CZ) LO1212; GA MŠk ED0017/01/01 Institutional support: RVO:68081731 Keywords : heart sounds * FFT * machine learning * signal averaging * probability assessment Subject RIV: FS - Medical Facilities ; Equipment OBOR OECD: Medical engineering Impact factor: 2.058, year: 2016

  7. Limiting values of large deviation probabilities of quadratic statistics

    NARCIS (Netherlands)

    Jeurnink, Gerardus A.M.; Kallenberg, W.C.M.

    1990-01-01

    Application of exact Bahadur efficiencies in testing theory or exact inaccuracy rates in estimation theory needs evaluation of large deviation probabilities. Because of the complexity of the expressions, frequently a local limit of the nonlocal measure is considered. Local limits of large deviation

  8. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability, and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similarly to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step by step procedure for constructing a (compound) free Poisso...

  9. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  10. Posterior Probability Matching and Human Perceptual Decision Making.

    Directory of Open Access Journals (Sweden)

    Richard F Murray

    2015-06-01

    Full Text Available Probability matching is a classic theory of decision making that was first developed in models of cognition. Posterior probability matching, a variant in which observers match their response probabilities to the posterior probability of each response being correct, is being used increasingly often in models of perception. However, little is known about whether posterior probability matching is consistent with the vast literature on vision and hearing that has developed within signal detection theory. Here we test posterior probability matching models using two tools from detection theory. First, we examine the models' performance in a two-pass experiment, where each block of trials is presented twice, and we measure the proportion of times that the model gives the same response twice to repeated stimuli. We show that at low performance levels, posterior probability matching models give highly inconsistent responses across repeated presentations of identical trials. We find that practised human observers are more consistent across repeated trials than these models predict, and we find some evidence that less practised observers are more consistent as well. Second, we compare the performance of posterior probability matching models on a discrimination task to the performance of a theoretical ideal observer that achieves the best possible performance. We find that posterior probability matching is very inefficient at low-to-moderate performance levels, and that human observers can be more efficient than is ever possible according to posterior probability matching models. These findings support classic signal detection models, and rule out a broad class of posterior probability matching models for expert performance on perceptual tasks that range in complexity from contrast discrimination to symmetry detection. However, our findings leave open the possibility that inexperienced observers may show posterior probability matching behaviour, and our methods
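
    The two-pass logic is easy to reproduce in simulation: a probability-matching observer samples each response from the posterior, so near chance performance it agrees with itself on only about half the repeated trials, while a deterministic maximum-posterior observer always agrees with itself. A minimal Python sketch under assumed posteriors:

        import numpy as np

        rng = np.random.default_rng(3)
        n_trials = 10_000

        # Hypothetical posterior probability of "signal" on each two-choice trial,
        # drawn near 0.5 to mimic a low-performance regime.
        posterior = rng.uniform(0.4, 0.6, size=n_trials)

        # Posterior probability matching: sample the response from the posterior,
        # independently on each of two passes through the same trials.
        pass1 = rng.random(n_trials) < posterior
        pass2 = rng.random(n_trials) < posterior
        print("matching observer agreement:", (pass1 == pass2).mean())  # ~0.5

        # A deterministic maximum a posteriori observer repeats itself exactly,
        # so its two-pass agreement is 1.0 by construction.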

  11. Cognitive-psychology expertise and the calculation of the probability of a wrongful conviction.

    Science.gov (United States)

    Rouder, Jeffrey N; Wixted, John T; Christenfeld, Nicholas J S

    2018-05-08

    Cognitive psychologists are familiar with how their expertise in understanding human perception, memory, and decision-making is applicable to the justice system. They may be less familiar with how their expertise in statistical decision-making and their comfort working in noisy real-world environments is just as applicable. Here we show how this expertise in ideal-observer models may be leveraged to calculate the probability of guilt of Gary Leiterman, a man convicted of murder on the basis of DNA evidence. We show by common probability theory that Leiterman is likely a victim of a tragic contamination event rather than a murderer. Making any calculation of the probability of guilt necessarily relies on subjective assumptions. The conclusion about Leiterman's innocence is not overly sensitive to the assumptions: the probability of innocence remains high for a wide range of reasonable assumptions. We note that cognitive psychologists may be well suited to make these calculations because as working scientists they may be comfortable with the role a reasonable degree of subjectivity plays in analysis.
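
    The calculation described is, at its core, Bayes' theorem applied to the competing hypotheses of guilt and contamination; a generic Python sketch with openly hypothetical numbers (not the values from the Leiterman case):

        def posterior_odds(prior_odds, likelihood_ratio):
            # Bayes' theorem in odds form: posterior odds = prior odds * LR.
            return prior_odds * likelihood_ratio

        # Hypothetical inputs: low prior odds of guilt, and evidence that is
        # better explained by contamination than by guilt (LR < 1).
        odds = posterior_odds(prior_odds=1 / 1000, likelihood_ratio=0.1)
        prob_guilt = odds / (1 + odds)
        print(prob_guilt)  # stays small across a wide range of assumptions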

  12. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, L_st, p_st) for stochastic uncertainty, a probability space (S_su, L_su, p_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, L_st, p_st) and (S_su, L_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems

  13. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-01-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, L_st, P_st) for stochastic uncertainty, a probability space (S_su, L_su, P_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, L_st, P_st) and (S_su, L_su, P_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the US Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems

  14. p-adic probability interpretation of Bell's inequality

    International Nuclear Information System (INIS)

    Khrennikov, A.

    1995-01-01

    We study the violation of Bell's inequality using a p-adic generalization of the theory of probability. p-adic probability is introduced as a limit of relative frequencies but this limit exists with respect to a p-adic metric. In particular, negative probability distributions are well defined on the basis of the frequency definition. This new type of stochastics can be used to describe hidden-variables distributions of some quantum models. If the hidden variables have a p-adic probability distribution, Bell's inequality is not valid and it is not necessary to discuss the experimental violations of this inequality. ((orig.))
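
    The p-adic metric underlying this frequency limit is easy to compute; a small Python sketch of the p-adic valuation and absolute value on the integers:

        def p_adic_valuation(n, p):
            # Largest k such that p**k divides the nonzero integer n.
            if n == 0:
                raise ValueError("the valuation of 0 is +infinity")
            k = 0
            while n % p == 0:
                n //= p
                k += 1
            return k

        def p_adic_abs(n, p):
            # p-adic absolute value |n|_p = p**(-v_p(n)), with |0|_p = 0.
            return 0.0 if n == 0 else float(p) ** (-p_adic_valuation(n, p))

        # In the 2-adic metric the sequence 2**k tends to 0: |2**k|_2 = 2**(-k).
        print([p_adic_abs(2**k, 2) for k in range(5)])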

  15. Grammaticality, Acceptability, and Probability: A Probabilistic View of Linguistic Knowledge.

    Science.gov (United States)

    Lau, Jey Han; Clark, Alexander; Lappin, Shalom

    2017-07-01

    The question of whether humans represent grammatical knowledge as a binary condition on membership in a set of well-formed sentences, or as a probabilistic property has been the subject of debate among linguists, psychologists, and cognitive scientists for many decades. Acceptability judgments present a serious problem for both classical binary and probabilistic theories of grammaticality. These judgements are gradient in nature, and so cannot be directly accommodated in a binary formal grammar. However, it is also not possible to simply reduce acceptability to probability. The acceptability of a sentence is not the same as the likelihood of its occurrence, which is, in part, determined by factors like sentence length and lexical frequency. In this paper, we present the results of a set of large-scale experiments using crowd-sourced acceptability judgments that demonstrate gradience to be a pervasive feature in acceptability judgments. We then show how one can predict acceptability judgments on the basis of probability by augmenting probabilistic language models with an acceptability measure. This is a function that normalizes probability values to eliminate the confounding factors of length and lexical frequency. We describe a sequence of modeling experiments with unsupervised language models drawn from state-of-the-art machine learning methods in natural language processing. Several of these models achieve very encouraging levels of accuracy in the acceptability prediction task, as measured by the correlation between the acceptability measure scores and mean human acceptability values. We consider the relevance of these results to the debate on the nature of grammatical competence, and we argue that they support the view that linguistic knowledge can be intrinsically probabilistic.
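
    One acceptability measure of the kind described, normalizing a language model's log probability by sentence length and unigram (lexical) frequency, is the SLOR score used in this literature; a Python sketch assuming the log probabilities are already available:

        def slor(sentence_logprob, unigram_logprobs):
            # SLOR-style acceptability: sentence log probability minus the sum of
            # unigram log probabilities, per token. This removes the penalty a raw
            # LM probability assigns to long sentences and rare words.
            n = len(unigram_logprobs)
            return (sentence_logprob - sum(unigram_logprobs)) / n

        # Hypothetical numbers: a five-token sentence scored by some language model.
        lm_lp = -18.2                                # log P(sentence) from the LM
        uni_lps = [-4.1, -2.3, -5.0, -3.2, -2.9]     # unigram log P(word) values
        print(slor(lm_lp, uni_lps))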

  16. Applied probability and stochastic processes. 2. ed.

    Energy Technology Data Exchange (ETDEWEB)

    Feldman, Richard M. [Texas A and M Univ., College Station, TX (United States). Industrial and Systems Engineering Dept.; Valdez-Flores, Ciriaco [Sielken and Associates Consulting, Inc., Bryan, TX (United States)

    2010-07-01

    This book presents applied probability and stochastic processes in an elementary but mathematically precise manner, with numerous examples and exercises to illustrate the range of engineering and science applications of the concepts. The book is designed to give the reader an intuitive understanding of probabilistic reasoning, in addition to an understanding of mathematical concepts and principles. The initial chapters present a summary of probability and statistics and then Poisson processes, Markov chains, Markov processes and queuing processes are introduced. Advanced topics include simulation, inventory theory, replacement theory, Markov decision theory, and the use of matrix geometric procedures in the analysis of queues. Included in the second edition are appendices at the end of several chapters giving suggestions for the use of Excel in solving the problems of the chapter. Also new in this edition are an introductory chapter on statistics and a chapter on Poisson processes that includes some techniques used in risk assessment. The old chapter on queues has been expanded and broken into two new chapters: one for simple queuing processes and one for queuing networks. Support is provided through the web site http://apsp.tamu.edu where students will have the answers to odd numbered problems and instructors will have access to full solutions and Excel files for homework. (orig.)

  17. Probability, statistics, and associated computing techniques

    International Nuclear Information System (INIS)

    James, F.

    1983-01-01

    This chapter attempts to explore the extent to which it is possible for the experimental physicist to find optimal statistical techniques to provide a unique and unambiguous quantitative measure of the significance of raw data. Discusses statistics as the inverse of probability; normal theory of parameter estimation; normal theory (Gaussian measurements); the universality of the Gaussian distribution; real-life resolution functions; combination and propagation of uncertainties; the sum or difference of 2 variables; local theory, or the propagation of small errors; error on the ratio of 2 discrete variables; the propagation of large errors; confidence intervals; classical theory; Bayesian theory; use of the likelihood function; the second derivative of the log-likelihood function; multiparameter confidence intervals; the method of MINOS; least squares; the Gauss-Markov theorem; maximum likelihood for uniform error distribution; the Chebyshev fit; the parameter uncertainties; the efficiency of the Chebyshev estimator; error symmetrization; robustness vs. efficiency; testing of hypotheses (e.g., the Neyman-Pearson test); goodness-of-fit; distribution-free tests; comparing two one-dimensional distributions; comparing multidimensional distributions; and permutation tests for comparing two point sets

  18. Theory of semigroups and applications

    CERN Document Server

    Sinha, Kalyan B

    2017-01-01

    The book presents major topics in semigroups, such as operator theory, partial differential equations, harmonic analysis, probability and statistics, and classical and quantum mechanics, and their applications. Along with a systematic development of the subject, the book emphasises the exploration of the contact areas and interfaces, supported by presentations of explicit computations, wherever feasible. Designed into seven chapters and three appendixes, the book targets graduate and senior undergraduate students of mathematics, as well as researchers in the respective areas. The book presupposes a good understanding of real analysis with elements of the theory of measures and integration, and a first course in functional analysis and in the theory of operators. Chapters 4 through 6 contain advanced topics, which have many interesting applications such as the Feynman–Kac formula, the central limit theorem and the construction of Markov semigroups. Many examples have been given in...

  19. Probabilities, causes and propensities in physics

    CERN Document Server

    Suárez, Mauricio

    2010-01-01

    This volume defends a novel approach to the philosophy of physics: it is the first book devoted to a comparative study of probability, causality, and propensity, and their various interrelations, within the context of contemporary physics - particularly quantum and statistical physics. The philosophical debates and distinctions are firmly grounded upon examples from actual physics, thus exemplifying a robustly empiricist approach. The essays, by both prominent scholars in the field and promising young researchers, constitute a pioneer effort in bringing out the connections between probabilistic, causal and dispositional aspects of the quantum domain. This book will appeal to specialists in philosophy and foundations of physics, philosophy of science in general, metaphysics, ontology of physics theories, and philosophy of probability.

  20. How Can Histograms Be Useful for Introducing Continuous Probability Distributions?

    Science.gov (United States)

    Derouet, Charlotte; Parzysz, Bernard

    2016-01-01

    The teaching of probability has changed a great deal since the end of the last century. The development of technologies is indeed part of this evolution. In France, continuous probability distributions began to be studied in 2002 by scientific 12th graders, but this subject was marginal and appeared only as an application of integral calculus.…

  1. Nurses' behavioural intentions towards self-poisoning patients: a theory of reasoned action, comparison of attitudes and subjective norms as predictive variables.

    Science.gov (United States)

    McKinlay, A; Couston, M; Cowan, S

    2001-04-01

    The incidence of self-poisoning is on the increase. Most patients who self-poison are dealt with initially in the general hospital. Therefore, the type and quality of care self-poisoning patients receive will depend, in part, on how they are viewed by nursing staff within the general hospital setting. A knowledge and understanding of the attitudes held by nurses towards self-poisoning patients is therefore important to those involved in the planning and delivery of care towards this client group. Previous studies have examined health care professionals' attitudes towards people who self-poison. Usually, however, these have not focused specifically on nurses' attitudes, and they have ignored the relationship between the attitudes expressed by staff and their intentions to engage in subsequent caring behaviour of one sort or another. It is hence unclear how the findings of such studies are relevant or applicable to nursing policy and practice. The present study aims to address these limitations using a methodology informed by the theory of reasoned action. The study aims to separate out the distinctive roles played by nurses' own attitudes, and the social pressures represented by other people's attitudes, in determining the types of caring behaviour in which nurses intend to engage when dealing with self-poisoning patients. The study adopts a questionnaire-based approach incorporating two specially designed vignettes. The results show that nurses' own attitudes, and what they believe about the attitudes of others, predict their behavioural intentions towards self-poisoning patients. The study also shows that nurses with a more positive orientation towards self-poisoning patients differ in behavioural and normative beliefs from nurses who have a less positive orientation. The implications for future attempts to explore the relationship between nurses' attitudes and subsequent caring behaviour are considered, along with implications for nursing policy and practice.

  2. Introduction to graph theory

    CERN Document Server

    Wilson, Robin J

    1985-01-01

    Graph Theory has recently emerged as a subject in its own right, as well as being an important mathematical tool in such diverse subjects as operational research, chemistry, sociology and genetics. This book provides a comprehensive introduction to the subject.

  3. An analytical model for the prediction of fluid-elastic forces in a rod bundle subjected to axial flow: theory, experimental validation and application to PWR fuel assemblies

    International Nuclear Information System (INIS)

    Beaud, F.

    1997-01-01

    A model predicting the fluid-elastic forces in a bundle of circular cylinders subjected to axial flow is presented in this paper. Whereas previously published models were limited to circular flow channels, the present one allows a rectangular external flow boundary to be taken into account. For that purpose, an original approach is derived from the standard method of images. This model will eventually be used to predict the fluid-structure coupling between the flow of primary coolant and the fuel assemblies in PWR nuclear reactors. This coupling is of major importance since the flow is shown to induce quite high damping and could therefore mitigate the effect of an external load, such as a seismic excitation, on the dynamics of the assemblies. The proposed model is validated on two cases from the literature but still needs further comparison with the experiments currently being carried out on the EDF set-up. The flow has been shown to induce approximately 12% damping on a PWR fuel assembly at nominal reactor conditions. The possible effect of the grids on the fluid-structure coupling has been neglected so far but will soon be investigated at EDF. (author)

  4. Nonequilibrium Green's function theory of resonant steady state photoconduction in a double quantum well FET subject to THz radiation at plasmon frequency

    International Nuclear Information System (INIS)

    Horing, Norman J Morgenstern; Popov, Vyacheslav V

    2006-01-01

    Recent experimental observations by X.G. Peralta, S.J. Allen, et al. of dc photoconductivity resonances in steady source-drain current subject to terahertz radiation in a grid-gated double-quantum-well FET suggested an association with plasmon resonances. This association was definitively confirmed for some parameter ranges in our detailed electrodynamic absorbance calculations. In this paper we propose that the reason the dc photoconductance resonances match the plasmon resonances in semiconductors lies in a nonlinear dynamic screening mechanism. To this end, we employ a shielded potential approximation that is nonlinear in the terahertz field to determine the nonequilibrium Green's function and associated density perturbation that govern the nonequilibrium dielectric polarization of the medium. This 'conditioning' of the system by the incident THz radiation results in a resonant polarization response at the plasmon frequencies which, in turn, causes a sharp drop of the resistive shielded impurity scattering potentials and an attendant increase of the dc source-drain current. This amounts to disabling the impurity scattering mechanism by plasmon resonant behavior in nonlinear screening.

  5. The Probability Model of Expectation Disconfirmation Process

    Directory of Open Access Journals (Sweden)

    Hui-Hsin HUANG

    2015-06-01

    Full Text Available This paper proposes a probability model to explore the dynamic process of customer satisfaction. Based on expectation disconfirmation theory, satisfaction is constructed from the customer's expectation before the buying behavior and the perceived performance after purchase. An experiment is designed to measure expectation disconfirmation effects, and the collected data are also used to estimate overall satisfaction and to calibrate the model. The results show a good fit between the model and the real data. The model has applications in business marketing for managing relationship satisfaction.

  6. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
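
    A hedged Python sketch of the paper's central point (the authors' own sample code is in R): any consistent nonparametric regression machine fit to 0/1 outcomes estimates the conditional probability P(Y = 1 | x). The data-generating model below is invented for illustration.

        # Regression forest as a "probability machine" (scikit-learn assumed).
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(1)
        n = 20_000
        X = rng.uniform(-2, 2, size=(n, 2))
        p_true = 1 / (1 + np.exp(-(X[:, 0] + 0.5 * X[:, 1])))  # known P(Y=1|x)
        y = rng.binomial(1, p_true)                            # binary outcomes

        forest = RandomForestRegressor(n_estimators=200, min_samples_leaf=50,
                                       random_state=0).fit(X, y)
        print(np.round(forest.predict(X[:5]), 3))  # estimated probabilities
        print(np.round(p_true[:5], 3))             # true probabilities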

  7. Truth, possibility and probability new logical foundations of probability and statistical inference

    CERN Document Server

    Chuaqui, R

    1991-01-01

    Anyone involved in the philosophy of science is naturally drawn into the study of the foundations of probability. Different interpretations of probability, based on competing philosophical ideas, lead to different statistical techniques, and frequently to mutually contradictory consequences. This unique book presents a new interpretation of probability, rooted in the traditional interpretation that was current in the 17th and 18th centuries. Mathematical models are constructed based on this interpretation, and statistical inference and decision theory are applied, including some examples in artificial intelligence, solving the main foundational problems. Nonstandard analysis is extensively developed for the construction of the models and in some of the proofs. Many nonstandard theorems are proved, some of them new, in particular, a representation theorem that asserts that any stochastic process can be approximated by a process defined over a space with equiprobable outcomes.

  8. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
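
    A small Monte Carlo sketch of the headline effect, under assumptions of my own (lognormal losses, a plug-in quantile threshold): estimating the threshold from limited data makes the realized failure frequency exceed the nominal level.

        # Realized vs. nominal failure probability with an estimated threshold.
        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(2)
        alpha, n, trials = 0.01, 50, 20_000
        failures = 0
        for _ in range(trials):
            logs = np.log(rng.lognormal(0.0, 1.0, size=n))
            mu_hat, sig_hat = logs.mean(), logs.std(ddof=1)
            # plug-in threshold: estimated (1 - alpha) quantile of the fitted model
            threshold = np.exp(mu_hat + sig_hat * norm.ppf(1 - alpha))
            failures += rng.lognormal(0.0, 1.0) > threshold  # next-period loss
        print("nominal:", alpha, "  realized:", failures / trials)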

  9. Exploring non-signalling polytopes with negative probability

    International Nuclear Information System (INIS)

    Oas, G; Barros, J Acacio de; Carvalhaes, C

    2014-01-01

    Bipartite and tripartite EPR–Bell type systems are examined via joint quasi-probability distributions where probabilities are permitted to be negative. It is shown that such distributions exist only when the no-signalling condition is satisfied. A characteristic measure, the probability mass, is introduced and, via its minimization, limits the number of quasi-distributions describing a given marginal probability distribution. The minimized probability mass is shown to be an alternative way to characterize non-local systems. Non-signalling polytopes for two to eight settings in the bipartite scenario are examined and compared to prior work. Examining perfect cloning of non-local systems within the tripartite scenario suggests defining two categories of signalling. It is seen that many properties of non-local systems can be efficiently described by quasi-probability theory. (paper)

  10. Daniel Courgeau: Probability and social science: methodological relationships between the two approaches [book review]

    NARCIS (Netherlands)

    Willekens, F.J.C.

    2013-01-01

    Throughout history, humans engaged in games in which randomness plays a role. In the 17th century, scientists started to approach chance scientifically and to develop a theory of probability. Courgeau describes how the relationship between probability theory and the social sciences emerged and evolved.

  11. SUBJECT INDEX

    Indian Academy of Sciences (India)

    Unknown

    Self-assembly of a Co(II) dimer through H-bonding of water molecules to a 3D ... A density functional theory-based chemical potential equalisation approach to ... Electrochemical determination of hydrogen peroxide using o-dianisidine as ...

  12. Decision making generalized by a cumulative probability weighting function

    Science.gov (United States)

    dos Santos, Lindomar Soares; Destefano, Natália; Martinez, Alexandre Souto

    2018-01-01

    Typical examples of intertemporal decision making involve situations in which individuals must choose between a smaller but more immediate reward and a larger one delivered later. Analogously, probabilistic decision making involves choices between options whose consequences differ in their probability of being received. In Economics, the expected utility theory (EUT) and the discounted utility theory (DUT) are traditionally accepted normative models for describing, respectively, probabilistic and intertemporal decision making. A large number of experiments confirmed that the linearity assumed by the EUT does not explain some observed behaviors, such as nonlinear preference, risk-seeking and loss aversion. That observation led to the development of new theoretical models, called non-expected utility theories (NEUT), which include a nonlinear transformation of the probability scale. An essential feature of the so-called preference function of these theories is that the probabilities are transformed into decision weights by means of a (cumulative) probability weighting function, w(p). We obtain in this article a generalized function for the probabilistic discount process. This function has as particular cases mathematical forms already well established in the literature, including discount models that consider effects of psychophysical perception. We also propose a new generalized function for the functional form of w. The limiting cases of this function encompass some parametric forms already proposed in the literature. Far beyond a mere generalization, our function allows the interpretation of probabilistic decision making theories based on the assumption that individuals behave similarly in the face of probabilities and delays, and is supported by phenomenological models.
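
    For concreteness, a sketch of one well-established parametric weighting function of the kind the authors generalize - Prelec's w(p) = exp(-δ(-ln p)^γ); with γ < 1 it produces the familiar inverse-S shape (small probabilities overweighted, large ones underweighted). Parameter values are illustrative only.

        # Prelec probability weighting function.
        import numpy as np

        def prelec(p, gamma=0.65, delta=1.0):
            p = np.asarray(p, dtype=float)
            return np.exp(-delta * (-np.log(p)) ** gamma)

        for p in (0.01, 0.10, 0.50, 0.90, 0.99):
            print(f"p = {p:4.2f}  ->  w(p) = {prelec(p):.3f}")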

  13. Measure and integration theory

    CERN Document Server

    Burckel, Robert B

    2001-01-01

    This book gives a straightforward introduction to the field as it is nowadays required in many branches of analysis and especially in probability theory. The first three chapters (Measure Theory, Integration Theory, Product Measures) basically follow the clear and approved exposition given in the author's earlier book on "Probability Theory and Measure Theory". Special emphasis is laid on a complete discussion of the transformation of measures and integration with respect to the product measure, convergence theorems, parameter-dependent integrals, as well as the Radon-Nikodym theorem. The fi

  14. Foundations of compositional model theory

    Czech Academy of Sciences Publication Activity Database

    Jiroušek, Radim

    2011-01-01

    Roč. 40, č. 6 (2011), s. 623-678 ISSN 0308-1079 R&D Projects: GA MŠk 1M0572; GA ČR GA201/09/1891; GA ČR GEICC/08/E010 Institutional research plan: CEZ:AV0Z10750506 Keywords : multidimensional probability distribution * conditional independence * graphical Markov model * composition of distributions Subject RIV: IN - Informatics, Computer Science Impact factor: 0.667, year: 2011 http://library.utia.cas.cz/separaty/2011/MTR/jirousek-foundations of compositional model theory.pdf

  15. Local homotopy theory

    CERN Document Server

    Jardine, John F

    2015-01-01

    This monograph on the homotopy theory of topologized diagrams of spaces and spectra gives an expert account of a subject at the foundation of motivic homotopy theory and the theory of topological modular forms in stable homotopy theory. Beginning with an introduction to the homotopy theory of simplicial sets and topos theory, the book covers core topics such as the unstable homotopy theory of simplicial presheaves and sheaves, localized theories, cocycles, descent theory, non-abelian cohomology, stacks, and local stable homotopy theory. A detailed treatment of the formalism of the subject is interwoven with explanations of the motivation, development, and nuances of ideas and results. The coherence of the abstract theory is elucidated through the use of widely applicable tools, such as Barr's theorem on Boolean localization, model structures on the category of simplicial presheaves on a site, and cocycle categories. A wealth of concrete examples convey the vitality and importance of the subject in topology, n...

  16. Probability, random processes, and ergodic properties

    CERN Document Server

    Gray, Robert M

    1988-01-01

    This book has been written for several reasons, not all of which are academic. This material was for many years the first half of a book in progress on information and ergodic theory. The intent was and is to provide a reasonably self-contained advanced treatment of measure theory, probability theory, and the theory of discrete time random processes with an emphasis on general alphabets and on ergodic and stationary properties of random processes that might be neither ergodic nor stationary. The intended audience was mathematically inclined engineering graduate students and visiting scholars who had not had formal courses in measure theoretic probability. Much of the material is familiar stuff for mathematicians, but many of the topics and results have not previously appeared in books. The original project grew too large and the first part contained much that would likely bore mathematicians and discourage them from the second part. Hence I finally followed the suggestion to separate the material and split...

  17. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations with other loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...

  18. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

    Max Lotstein and Phil Johnson-Laird, Department of Psychology, Princeton University, Princeton, NJ, USA, August 30th 2012. ... social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as ... retorted that such a flagrant violation of the probability calculus was a result of a psychological experiment that obscured the rationality of the ...

  19. Probability Matching, Fast and Slow

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2014-01-01

    A prominent point of contention among researchers regarding the interpretation of probability-matching behavior is whether it represents a cognitively sophisticated, adaptive response to the inherent uncertainty of the tasks or settings in which it is observed, or whether instead it represents a fundamental shortcoming in the heuristics that support and guide human decision making. Put crudely, researchers disagree on whether probability matching is "smart" or "dumb." Here, we consider eviden...

  20. Educational Implications of Paulo Freire’s Theory of Dialogic Action Based on a Subjective Perspective and Five Criteria

    Directory of Open Access Journals (Sweden)

    蘇鈺楠 Yu-Nan Su

    2014-06-01

    Full Text Available In the face of hegemonic structures, including ideology, critical pedagogy upholds a dynamic view of culture, trusting that the energy produced by interaction between subjects can drive structural renewal and the emancipation of the subject; the starting point of such interaction is the dialogic action proposed in Freire's theory. This study examines Freire's theory of dialogic action from a subjective perspective, showing that his dialogic theory rests on a foundation of mutual recognition deeply influenced by G. W. F. Hegel and K. Marx, and that its prior assumption proceeds from ethical concern. Its content comprises five elements: (1) love, (2) humility, (3) faith, (4) hope, and (5) critical thinking. Finally, the educational implications of the theory are derived on this basis.

  1. The Probability of Detection in the Telephone Line of Device of the Unauthorized Removal of Information

    Directory of Open Access Journals (Sweden)

    I. V. Svintsov

    2011-06-01

    Full Text Available The article discusses a theory for the quantitative description of the possible presence, in a telephone line, of devices for the unauthorized removal of information, investigated with the help of probability theory.

  2. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  3. A quantum probability model of causal reasoning

    Directory of Open Access Journals (Sweden)

    Jennifer S Trueblood

    2012-05-01

    Full Text Available People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects, thus proving to be a viable new candidate for modeling human judgment.
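
    The classical benchmark against which such judgments are compared is Bayes' rule, which ties a predictive judgment P(e|c) to its diagnostic counterpart P(c|e); the phenomena the paper models are deviations from this consistency. In LaTeX:

        P(c \mid e) = \frac{P(e \mid c)\,P(c)}{P(e \mid c)\,P(c) + P(e \mid \neg c)\,P(\neg c)}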

  4. Graph theory

    CERN Document Server

    Gould, Ronald

    2012-01-01

    This introduction to graph theory focuses on well-established topics, covering primary techniques and including both algorithmic and theoretical problems. The algorithms are presented with a minimum of advanced data structures and programming details. This thoroughly corrected 1988 edition provides insights to computer scientists as well as advanced undergraduates and graduate students of topology, algebra, and matrix theory. Fundamental concepts and notation and elementary properties and operations are the first subjects, followed by examinations of paths and searching, trees, and networks. S

  5. Politics of modern muslim subjectivities

    DEFF Research Database (Denmark)

    Jung, Dietrich; Petersen, Marie Juul; Sparre, Sara Lei

    Examining modern Muslim identity constructions, the authors introduce a novel analytical framework to Islamic Studies, drawing on theories of successive modernities, sociology of religion, and poststructuralist approaches to modern subjectivity, as well as the results of extensive fieldwork...

  6. People's Intuitions about Randomness and Probability: An Empirical Study

    Science.gov (United States)

    Lecoutre, Marie-Paule; Rovira, Katia; Lecoutre, Bruno; Poitevineau, Jacques

    2006-01-01

    What people mean by randomness should be taken into account when teaching statistical inference. This experiment explored subjective beliefs about randomness and probability through two successive tasks. Subjects were asked to categorize 16 familiar items: 8 real items from everyday life experiences, and 8 stochastic items involving a repeatable…

  7. Monte Carlo methods to calculate impact probabilities

    Science.gov (United States)

    Rickman, H.; Wiśniowski, T.; Wajer, P.; Gabryszewski, R.; Valsecchi, G. B.

    2014-09-01

    Context. Unraveling the events that took place in the solar system during the period known as the late heavy bombardment requires the interpretation of the cratered surfaces of the Moon and terrestrial planets. This, in turn, requires good estimates of the statistical impact probabilities for different source populations of projectiles, a subject that has received relatively little attention, since the works of Öpik (1951, Proc. R. Irish Acad. Sect. A, 54, 165) and Wetherill (1967, J. Geophys. Res., 72, 2429). Aims: We aim to work around the limitations of the Öpik and Wetherill formulae, which are caused by singularities due to zero denominators under special circumstances. Using modern computers, it is possible to make good estimates of impact probabilities by means of Monte Carlo simulations, and in this work, we explore the available options. Methods: We describe three basic methods to derive the average impact probability for a projectile with a given semi-major axis, eccentricity, and inclination with respect to a target planet on an elliptic orbit. One is a numerical averaging of the Wetherill formula; the next is a Monte Carlo super-sizing method using the target's Hill sphere. The third uses extensive minimum orbit intersection distance (MOID) calculations for a Monte Carlo sampling of potentially impacting orbits, along with calculations of the relevant interval for the timing of the encounter allowing collision. Numerical experiments are carried out for an intercomparison of the methods and to scrutinize their behavior near the singularities (zero relative inclination and equal perihelion distances). Results: We find an excellent agreement between all methods in the general case, while there appear large differences in the immediate vicinity of the singularities. With respect to the MOID method, which is the only one that does not involve simplifying assumptions and approximations, the Wetherill averaging impact probability departs by diverging toward
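
    A schematic toy (my own, not the authors' code) of the super-sizing idea: direct hits on a tiny target are too rare to sample efficiently, so one inflates the target by a known factor and rescales the hit rate by the corresponding geometric ratio.

        # Monte Carlo "super-sizing" on a toy 2D target.
        import numpy as np

        rng = np.random.default_rng(3)
        r_true, f, n = 1e-3, 50.0, 1_000_000
        pts = rng.uniform(-1, 1, size=(n, 2))         # impact points on a square
        dist = np.hypot(pts[:, 0], pts[:, 1])

        p_direct = np.mean(dist < r_true)             # almost always 0 hits
        p_scaled = np.mean(dist < f * r_true) / f**2  # rescale by area ratio

        exact = np.pi * r_true**2 / 4                 # disk area / square area
        print(f"direct: {p_direct:.2e}  super-sized: {p_scaled:.2e}  exact: {exact:.2e}")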

  8. Probability in reasoning: a developmental test on conditionals.

    Science.gov (United States)

    Barrouillet, Pierre; Gauffroy, Caroline

    2015-04-01

    Probabilistic theories have been claimed to constitute a new paradigm for the psychology of reasoning. A key assumption of these theories is captured by what they call the Equation, the hypothesis that the meaning of the conditional is probabilistic in nature and that the probability of If p then q is the conditional probability, in such a way that P(if p then q) = P(q|p). Using the probabilistic truth-table task, in which participants are required to evaluate the probability of If p then q sentences, the present study explored the pervasiveness of the Equation across ages (from early adolescence to adulthood), types of conditionals (basic, causal, and inducements) and contents. The results reveal that the Equation is a late developmental achievement only endorsed by a narrow majority of educated adults for certain types of conditionals, depending on the content they involve. Age-related changes in evaluating the probability of all the conditionals studied closely mirror the development of truth-value judgements observed in previous studies with traditional truth-table tasks. We argue that our modified mental model theory can account for this development, and hence for the findings related to the probability task, which consequently do not support the probabilistic approach to human reasoning over alternative theories. Copyright © 2014 Elsevier B.V. All rights reserved.
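
    A toy computation (frequencies invented) of the two readings at stake in the truth-table task: the conditional-probability reading P(q|p) demanded by the Equation versus the material-conditional reading P(not-p or q).

        # Evaluating P(if p then q) from a four-cell truth-table frequency count.
        counts = {("p", "q"): 40, ("p", "not-q"): 10,
                  ("not-p", "q"): 25, ("not-p", "not-q"): 25}
        total = sum(counts.values())

        p_cases = counts[("p", "q")] + counts[("p", "not-q")]
        conditional = counts[("p", "q")] / p_cases           # P(q | p) = 0.80
        material = (total - counts[("p", "not-q")]) / total  # P(p -> q) = 0.90

        print(f"P(q|p) = {conditional:.2f}   P(material conditional) = {material:.2f}")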

  9. Two-slit experiment: quantum and classical probabilities

    International Nuclear Information System (INIS)

    Khrennikov, Andrei

    2015-01-01

    The inter-relation between quantum and classical probability models is one of the most fundamental problems of quantum foundations. Nowadays this problem also plays an important role in quantum technologies, in quantum cryptography and in the theory of quantum random generators. In this letter, we compare the viewpoint of Richard Feynman that the behavior of quantum particles cannot be described by classical probability theory with the viewpoint that the quantum-classical inter-relation is more complicated (cf., in particular, the tomographic model of quantum mechanics developed in detail by Vladimir Man'ko). As a basic example, we consider the two-slit experiment, which played a crucial role in quantum foundational debates at the beginning of quantum mechanics (QM). In particular, its analysis led Niels Bohr to the formulation of the principle of complementarity. First, we demonstrate that, in complete accordance with Feynman's viewpoint, the probabilities for the two-slit experiment have a non-Kolmogorovian structure, since they violate one of the basic laws of classical probability theory, the law of total probability (the heart of Bayesian analysis). However, we then show that these probabilities can be embedded in a natural way into the classical (Kolmogorov, 1933) probability model. To do this, one has to take into account the randomness of the selection of different experimental contexts, the joint consideration of which led Feynman to a conclusion about the non-classicality of quantum probability. We compare this embedding of non-Kolmogorovian quantum probabilities into the Kolmogorov model with well-known embeddings of non-Euclidean geometries into Euclidean space (e.g., the Poincaré disk model for the Lobachevsky plane). (paper)
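
    The violation discussed here can be stated compactly: the classical law of total probability acquires an interference term in the quantum case. In LaTeX (a standard formulation, consistent with the abstract):

        P(x) = P(x \mid A)P(A) + P(x \mid B)P(B)   % classical law of total probability
        P(x) = P(x \mid A)P(A) + P(x \mid B)P(B)
               + 2\sqrt{P(x \mid A)P(A)\,P(x \mid B)P(B)}\,\cos\theta(x)   % two-slit case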

  10. Practical differences among probabilities, possibilities, and credibilities

    Science.gov (United States)

    Grandin, Jean-Francois; Moulin, Caroline

    2002-03-01

    This paper presents some important differences between theories that allow uncertainty management in data fusion. The main comparative results illustrated in this paper are the following. Incompatibility between decisions obtained from probabilities and credibilities is highlighted. In the dynamic frame, as remarked in [19] or [17], the belief and plausibility of the Dempster-Shafer model do not bracket the Bayesian probability. This bracketing can however be obtained by the Modified Dempster-Shafer approach. It can also be obtained in the Bayesian framework either by simulation techniques, or with a studentization. The uncommitted mass in the Dempster-Shafer approach, i.e. the mass assigned to ignorance, provides a mechanism similar to the reliability factor in the Bayesian model. Uncommitted mass in Dempster-Shafer theory, or reliability in Bayes theory, acts like a filter that weakens extracted information and improves robustness to outliers. It is therefore logical to observe, on examples like the one presented by D.M. Buede, faster convergence of a Bayesian method that does not take reliability into account compared with a Dempster-Shafer method that uses uncommitted mass. But if, for Bayesian masses, reliability is taken into account at the same level as the uncommitted mass, e.g. F = 1 - m, we observe an equivalent convergence rate. When the Dempster-Shafer and Bayes operators are informed by uncertainty, faster or slower convergence can be exhibited on non-Bayesian masses. This is due to positive or negative synergy between the information delivered by the sensors. This effect is a direct consequence of non-additivity when considering non-Bayesian masses. Ignorance of the prior in Bayesian techniques can be quickly compensated by the information accumulated over time by a set of sensors. All these results are presented on simple examples, and developed when necessary.
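
    A minimal, self-contained implementation of Dempster's rule of combination over a two-element frame, with invented masses; the mass placed on the whole frame is the "uncommitted" belief discussed above.

        # Dempster's rule: combine two mass functions, renormalizing conflict away.
        from itertools import product

        FRAME = frozenset({"a", "b"})

        def combine(m1, m2):
            out, conflict = {}, 0.0
            for (x, p), (y, q) in product(m1.items(), m2.items()):
                inter = x & y
                if inter:
                    out[inter] = out.get(inter, 0.0) + p * q
                else:
                    conflict += p * q
            return {a: v / (1.0 - conflict) for a, v in out.items()}

        m1 = {frozenset({"a"}): 0.6, frozenset({"b"}): 0.1, FRAME: 0.3}  # 0.3 uncommitted
        m2 = {frozenset({"a"}): 0.5, frozenset({"b"}): 0.3, FRAME: 0.2}
        for subset, mass in sorted(combine(m1, m2).items(), key=lambda kv: -kv[1]):
            print(set(subset), round(mass, 3))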

  11. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis of a vibration isolation system for high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. External sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criterion. This problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of exceeding the vibration criterion for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are derived, so that the probability of exceeding vibration criteria VC-E and VC-D is kept below 0.04.
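
    A sketch of the Gaussian exceedance computation the abstract describes, with placeholder numbers (the article's actual RMS response and criteria are not reproduced here).

        # Probability that a zero-mean Gaussian displacement exceeds a criterion.
        from scipy.stats import norm

        sigma = 0.4e-6      # RMS relative displacement, m (assumed)
        criterion = 1.0e-6  # vibration criterion, m (assumed)

        # P(|x| > criterion) = 2 * (1 - Phi(criterion / sigma))
        p_exceed = 2 * norm.sf(criterion / sigma)
        print(f"P(exceed criterion) = {p_exceed:.4f}")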

  12. Measurement Errors and Uncertainties Theory and Practice

    CERN Document Server

    Rabinovich, Semyon G

    2006-01-01

    Measurement Errors and Uncertainties addresses the most important problems that physicists and engineers encounter when estimating errors and uncertainty. Building from the fundamentals of measurement theory, the author develops the theory of accuracy of measurements and offers a wealth of practical recommendations and examples of applications. This new edition covers a wide range of subjects, including: basic concepts of metrology; characterization, standardization and calibration of measuring instruments; estimation of errors and uncertainty of single and multiple measurements; and modern probability-based methods of estimating measurement uncertainty. With this new edition, the author completes the development of the new theory of indirect measurements. This theory provides more accurate and efficient methods for processing indirect measurement data. It eliminates the need to calculate the correlation coefficient - a stumbling block in measurement data processing - and offers for the first time a way to obtain...

  13. Reconciliation of Decision-Making Heuristics Based on Decision Trees Topologies and Incomplete Fuzzy Probabilities Sets.

    Science.gov (United States)

    Doubravsky, Karel; Dohnal, Mirko

    2015-01-01

    Complex decision making tasks of different natures, e.g. in economics, safety engineering, ecology and biology, are based on vague, sparse, partially inconsistent and subjective knowledge. Moreover, decision making economists / engineers are usually not willing to invest too much time into the study of complex formal theories. They require decisions that can be (re)checked by human-like common sense reasoning. One important problem related to realistic decision making tasks is incomplete data sets required by the chosen decision making algorithm. This paper presents a relatively simple algorithm by which some missing III (input information items) can be generated using mainly decision tree topologies and integrated into incomplete data sets. The algorithm is based on an easy-to-understand heuristic, e.g. a longer decision tree sub-path is less probable. This heuristic can solve decision problems under total ignorance, i.e. when the decision tree topology is the only information available. In practice, however, isolated information items, e.g. some vaguely known probabilities (e.g. fuzzy probabilities), are usually available. This means that a realistic problem is analysed under partial ignorance. The proposed algorithm reconciles topology-related heuristics and additional fuzzy sets using fuzzy linear programming. The case study, represented by a tree with six lotteries and one fuzzy probability, is presented in detail.

  15. Model uncertainty: Probabilities for models?

    International Nuclear Information System (INIS)

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising

  16. Statistical probability tables CALENDF program

    International Nuclear Information System (INIS)

    Ribon, P.

    1989-01-01

    The purposes of the probability tables are to obtain a dense data representation and to calculate integrals by quadratures. They are mainly used in the USA for Monte Carlo calculations and in the USSR and Europe for self-shielding calculations by the sub-group method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity

  17. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits...... in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  18. Optimization of plasma diagnostics using Bayesian probability theory

    International Nuclear Information System (INIS)

    Dreier, H.; Dinklage, A.; Hirsch, M.; Kornejew, P.; Fischer, R.

    2006-01-01

    The diagnostic set-up for Wendelstein 7-X, a magnetic fusion device presently under construction, is currently in the design process, with the aim of optimizing the outcome under given technical constraints. Compared to traditional design approaches, Bayesian Experimental Design (BED) allows optimization with respect to physically motivated design criteria. It aims to find the optimal design by maximizing an expected utility function that quantifies the goals of the experiment. The expectation marginalizes over the uncertain physical parameters and the possible values of future data. The approach presented here is based on the maximization of an information measure (Kullback-Leibler entropy). As an example, the optimization of an infrared multichannel interferometer is shown in detail. Design aspects such as the impact of technical restrictions are discussed
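
    A toy version of the BED utility (my sketch, not the W7-X code): for a Gaussian prior N(0, s0^2) and a measurement y = theta + noise, the expected Kullback-Leibler gain has the closed form 0.5*ln(1 + s0^2/sn^2), so the best design is the one with the smallest design-dependent noise sn(d). Designs and numbers are invented.

        # Expected information gain for a Gaussian prior and Gaussian noise.
        import math

        s0 = 2.0                                   # prior std of the parameter
        designs = {"A": 1.5, "B": 0.8, "C": 1.1}   # candidate design -> noise std

        def expected_kl_gain(sn, s0=s0):
            return 0.5 * math.log(1.0 + (s0 / sn) ** 2)

        for d, sn in designs.items():
            print(d, round(expected_kl_gain(sn), 3))
        print("optimal design:", max(designs, key=lambda d: expected_kl_gain(designs[d])))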

  19. The Bernoullis and the Origin of Probability Theory: Looking back ...

    Indian Academy of Sciences (India)

    (1) Through the work of Huygens, it became necessary to make the terms ... concepts in one family of problems trigger similar approaches in related problems of ... the 'chain-line' problem), which ultimately led to the conflict between the two ...

  20. Fitness Probability Distribution of Bit-Flip Mutation.

    Science.gov (United States)

    Chicano, Francisco; Sutton, Andrew M; Whitley, L Darrell; Alba, Enrique

    2015-01-01

    Bit-flip mutation is a common mutation operator for evolutionary algorithms applied to optimize functions over binary strings. In this paper, we develop results from the theory of landscapes and Krawtchouk polynomials to exactly compute the probability distribution of fitness values of a binary string undergoing uniform bit-flip mutation. We prove that this probability distribution can be expressed as a polynomial in p, the probability of flipping each bit. We analyze these polynomials and provide closed-form expressions for an easy linear problem (Onemax), and an NP-hard problem, MAX-SAT. We also discuss a connection of the results with runtime analysis.
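
    A direct sketch of the Onemax case: a parent with k ones out of n loses X ~ Bin(k, p) ones and gains Y ~ Bin(n - k, p), so the child fitness is k - X + Y and each point probability is a polynomial in p, as the paper proves in general.

        # Exact Onemax fitness distribution after uniform bit-flip mutation.
        from scipy.stats import binom

        def onemax_child_pmf(n, k, p):
            pmf = {}
            for x in range(k + 1):          # ones flipped to zeros
                for y in range(n - k + 1):  # zeros flipped to ones
                    f = k - x + y
                    pmf[f] = pmf.get(f, 0.0) + binom.pmf(x, k, p) * binom.pmf(y, n - k, p)
            return pmf

        dist = onemax_child_pmf(n=10, k=7, p=0.1)
        for fitness in sorted(dist):
            print(fitness, round(dist[fitness], 4))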

  1. On the structure of the quantum-mechanical probability models

    International Nuclear Information System (INIS)

    Cufaro-Petroni, N.

    1992-01-01

    In this paper the role of mathematical probability models in classical and quantum physics is briefly analyzed. In particular, the formal structure of quantum probability spaces (QPS) is contrasted with the usual Kolmogorovian models of probability, by highlighting the connections between this structure and the fundamental principles of quantum mechanics. The fact that there is no unique Kolmogorovian model reproducing a QPS is recognized as one of the main reasons for the paradoxical behaviors pointed out in quantum theory from its early days. 8 refs.

  2. Probability and Statistics The Science of Uncertainty (Revised Edition)

    CERN Document Server

    Tabak, John

    2011-01-01

    Probability and Statistics, Revised Edition deals with the history of probability, describing the modern concept of randomness and examining "pre-probabilistic" ideas of what most people today would characterize as randomness. This revised book documents some historically important early uses of probability to illustrate some very important probabilistic questions. It goes on to explore statistics and the generations of mathematicians and non-mathematicians who began to address problems in statistical analysis, including the statistical structure of data sets as well as the theory of

  3. Towards a definition of SUBJECT in binding domains and subject ...

    African Journals Online (AJOL)

    ... be antecedents for subject-oriented anaphors (e.g. Maling 1984) ... 1985), it is unclear what actually determines this binding behaviour, or why subjects should ... contexts can be unified by the fact that both functionally determine their complements. ... Binding theory, control and pro. ... San Diego: Academic Press. pp. 179 ...

  4. Probability and containment of turbine missiles

    International Nuclear Information System (INIS)

    Yeh, G.C.K.

    1976-01-01

    With the trend toward ever larger power generating plants with large high-speed turbines, an important plant design consideration is the potential for and consequences of mechanical failure of turbine rotors. Such rotor failure could result in high-velocity disc fragments (turbine missiles) perforating the turbine casing and jeopardizing vital plant systems. The designer must first estimate the probability of any turbine missile damaging any safety-related plant component for his turbine and his plant arrangement. If the probability is not low enough to be acceptable to the regulatory agency, he must design a shield to contain the postulated turbine missiles. Alternatively, the shield could be designed to retard (to reduce the velocity of) the missiles such that they would not damage any vital plant system. In this paper, some of the presently available references that can be used to evaluate the probability, containment and retardation of turbine missiles are reviewed; various alternative methods are compared; and subjects for future research are recommended. (Auth.)

  5. Judgment, Probability, and Aristotle's Rhetoric.

    Science.gov (United States)

    Warnick, Barbara

    1989-01-01

    Discusses Aristotle's five means of making judgments: intelligence, "episteme" (scientific knowledge), "sophia" (theoretical wisdom), "techne" (art), and "phronesis" (practical wisdom). Sets Aristotle's theory of rhetorical argument within the context of his overall view of human judgment. Notes that…

  6. Dynamic SEP event probability forecasts

    Science.gov (United States)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
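
    A generic sketch of the decay idea (the paper's fitted algorithm is not reproduced): if S(t) is the fraction of historical SEP events whose onset delay exceeded t, Bayes' rule deflates the initial forecast P0 as time passes with no onset. The exponential delay model below is an assumption for illustration.

        # Dynamic deflation of an SEP event probability with elapsed time.
        import math

        def dynamic_probability(p0, t_hours, tau=6.0):
            s = math.exp(-t_hours / tau)   # assumed survival function of onset delay
            return p0 * s / (p0 * s + (1.0 - p0))

        for t in (0, 3, 6, 12, 24):
            print(f"t = {t:2d} h  ->  P = {dynamic_probability(0.5, t):.3f}")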

  7. Conditional Independence in Applied Probability.

    Science.gov (United States)

    Pfeiffer, Paul E.

    This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…

  8. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  9. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  10. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations, the relation between the seismic strength of the earthquake, focal depth, distance and ground acceleration is calculated. We found that the most significant Swedish earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as interesting. This probability gives ground accelerations in the range of 5-20% of g for the sites. This acceleration is for a free bedrock site; for consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get an epicentral acceleration for this earthquake of 5-15% of g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)

  11. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2015-01-01

    A statistical procedure for the estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. The influence of the safety factor is also investigated. The DECOFF statistical method is benchmarked against the standard Alpha-factor...

  12. Risk estimation using probability machines

    Science.gov (United States)

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification, and the resultant estimation biases, as a logistic model. This methodology, which we call a "risk machine", shares the properties of the statistical machine from which it is derived. PMID:24581306

  13. Probability and statistics: A reminder

    International Nuclear Information System (INIS)

    Clement, B.

    2013-01-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It builds on the introduction from 'data analysis in experimental sciences' given in [1]. (authors)

  14. Nash equilibrium with lower probabilities

    DEFF Research Database (Denmark)

    Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1998-01-01

    We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible...

  15. On probability-possibility transformations

    Science.gov (United States)

    Klir, George J.; Parviz, Behzad

    1992-01-01

    Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.
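
    One standard transformation of this kind, sketched below (whether it is among those the authors compare is not stated in the abstract): with outcomes ordered by decreasing probability, the possibility of outcome i is the tail sum of all probabilities not exceeding p_i, which yields the most specific possibility distribution that dominates the probabilities while preserving their ordering.

        # Tail-sum probability-to-possibility transformation.
        def prob_to_poss(p):
            order = sorted(range(len(p)), key=lambda i: -p[i])
            poss, tail = [0.0] * len(p), 0.0
            for i in reversed(order):   # accumulate from least probable upward
                tail += p[i]
                poss[i] = tail
            return poss

        print(prob_to_poss([0.5, 0.3, 0.2]))  # -> [1.0, 0.5, 0.2]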

  16. VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES

    International Nuclear Information System (INIS)

    G.A. Valentine; F.V. Perry; S. Dartevelle

    2005-01-01

    Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields, such as performance assessment for hazardous and/or radioactive waste disposal sites, that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drives the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) the probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) the probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on the consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for "closure" of wall rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive "eruption" of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision

  17. Elementary decision theory

    CERN Document Server

    Chernoff, Herman

    1988-01-01

    This well-respected introduction to statistics and statistical theory covers data processing, probability and random variables, utility and descriptive statistics, computation of Bayes strategies, models, testing hypotheses, and much more. 1959 edition.

  18. Introduction to Probability, Part 1 - Basic Concepts. Student Text. Revised Edition.

    Science.gov (United States)

    Blakeslee, David W.; And Others

    This book is designed to introduce the reader to some fundamental ideas about probability. The mathematical theory of probability plays an increasingly important role in science, government, industry, business, and economics. An understanding of the basic concepts of probability is essential for the study of statistical methods that are widely…

  19. Quantum probability and cognitive modeling: some cautions and a promising direction in modeling physics learning.

    Science.gov (United States)

    Franceschetti, Donald R; Gire, Elizabeth

    2013-06-01

    Quantum probability theory offers a viable alternative to classical probability, although there are some ambiguities inherent in transferring the quantum formalism to a less determined realm. A number of physicists are now looking at the applicability of quantum ideas to the assessment of physics learning, an area particularly suited to quantum probability ideas.

  20. Measures, Probability and Holography in Cosmology

    Science.gov (United States)

    Phillips, Daniel

    This dissertation compiles four research projects on predicting values for cosmological parameters and models of the universe on the broadest scale. The first examines the Causal Entropic Principle (CEP) in inhomogeneous cosmologies. The CEP aims to predict the unexpectedly small value of the cosmological constant Λ using a weighting by entropy increase on causal diamonds. The original work assumed a purely isotropic and homogeneous cosmology. But even the level of inhomogeneity observed in our universe forces reconsideration of certain arguments about entropy production. In particular, we must consider an ensemble of causal diamonds associated with each background cosmology, and we can no longer immediately discard entropy production in the far future of the universe. Depending on our choices for a probability measure and our treatment of black hole evaporation, the prediction for Λ may be left intact or dramatically altered. The second related project extends the CEP to universes with curvature. We have found that curvature values larger than ρ_k = 40ρ_m are disfavored by more than 99.99%, with a peak value at ρ_Λ = 7.9 × 10^-123 and ρ_k = 4.3ρ_m for open universes. For universes that allow only positive curvature, or both positive and negative curvature, we find a correlation between curvature and dark energy that leads to an extended region of preferred values. Our universe is found to be disfavored to an extent depending on the priors on curvature. We also provide a comparison to previous anthropic constraints on open universes and discuss future directions for this work. The third project examines how cosmologists should formulate basic questions of probability. We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We