WorldWideScience

Sample records for subject probability theory

  1. Support Theory: A Nonextensional Representation of Subjective Probability.

    Science.gov (United States)

    Tversky, Amos; Koehler, Derek J.

    1994-01-01

    A new theory of subjective probability is presented. According to this theory, different descriptions of the same event can give rise to different judgments. Experimental evidence supporting this theory is summarized, demonstrating that the theory provides a unified treatment of a wide range of empirical findings. (SLD)
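
    The theory's core can be stated compactly. A sketch of the standard formulation (notation taken from the support-theory literature, not from this record):

```latex
% Judged probability that hypothesis A rather than B holds, given
% support s(.) assigned to their descriptions:
P(A, B) = \frac{s(A)}{s(A) + s(B)} .
% Support is subadditive: unpacking an event into an explicit
% disjunction can only raise its total support,
s(A) \;\le\; s(A_1 \vee A_2) \;\le\; s(A_1) + s(A_2),
\qquad A = A_1 \vee A_2 ,
% which is why different descriptions of the same event can
% receive different judged probabilities.
```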

  2. Probability theory

    CERN Document Server

    Varadhan, S R S

    2001-01-01

    This volume presents topics in probability theory covered during a first-year graduate course given at the Courant Institute of Mathematical Sciences. The necessary background material in measure theory is developed, including the standard topics, such as extension theorem, construction of measures, integration, product spaces, Radon-Nikodym theorem, and conditional expectation. In the first part of the book, characteristic functions are introduced, followed by the study of weak convergence of probability distributions. Then both the weak and strong limit theorems for sums of independent random variables …

  3. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  4. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    Subjective probabilities play a central role in many economic decisions, and act as an immediate confound of inferences about behavior, unless controlled for. Several procedures to recover subjective probabilities have been proposed, but in order to recover the correct latent probability one must...

  5. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    Subjective probabilities play a central role in many economic decisions and act as an immediate confound of inferences about behavior, unless controlled for. Several procedures to recover subjective probabilities have been proposed, but in order to recover the correct latent probability one must ...

  6. The Theory of Probability

    Indian Academy of Sciences (India)

    The Theory of Probability. Andrei Nikolaevich Kolmogorov. Resonance – Journal of Science Education, Volume 3, Issue 4 (Classics), April 1998, pp. 103-112. Permanent link: http://www.ias.ac.in/article/fulltext/reso/003/04/0103-0112

  7. Probability Theory Without Tears!

    Indian Academy of Sciences (India)

    Probability Theory Without Tears! S Ramasubramanian. Resonance – Journal of Science Education, Volume 1, Issue 2 (Book Review), February 1996, pp. 115-116. Permanent link: http://www.ias.ac.in/article/fulltext/reso/001/02/0115-0116

  8. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  9. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the groundwork to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, Erdos-Kac invariance principle, functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a textbook for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  10. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus. Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic processes …

  11. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved, but the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  12. Records via probability theory

    CERN Document Server

    Ahsanullah, Mohammad

    2015-01-01

    Many statisticians, actuarial mathematicians, reliability engineers, meteorologists, hydrologists, economists, and business and sport analysts deal with records, which play important roles in various fields of statistics and its applications. This book enables a reader to check his/her level of understanding of the theory of record values. We give the basic formulae which are most important in the theory and present many examples which illustrate the theoretical statements. For a beginner in record statistics, as well as for graduate students, the study of our book needs only basic knowledge of the subject. A more advanced reader can use our book to polish his/her knowledge. An updated bibliography, which will help a reader to enrich his/her theoretical knowledge and widen the experience of dealing with ordered observations, is also given in the book.

  13. Constructor theory of probability

    Science.gov (United States)

    2016-01-01

    Unitary quantum theory, having no Born Rule, is non-probabilistic. Hence the notorious problem of reconciling it with the unpredictability and appearance of stochasticity in quantum measurements. Generalizing and improving upon the so-called ‘decision-theoretic approach’, I shall recast that problem in the recently proposed constructor theory of information—where quantum theory is represented as one of a class of superinformation theories, which are local, non-probabilistic theories conforming to certain constructor-theoretic conditions. I prove that the unpredictability of measurement outcomes (to which constructor theory gives an exact meaning) necessarily arises in superinformation theories. Then I explain how the appearance of stochasticity in (finitely many) repeated measurements can arise under superinformation theories. And I establish sufficient conditions for a superinformation theory to inform decisions (made under it) as if it were probabilistic, via a Deutsch–Wallace-type argument—thus defining a class of decision-supporting superinformation theories. This broadens the domain of applicability of that argument to cover constructor-theory compliant theories. In addition, in this version some of the argument's assumptions, previously construed as merely decision-theoretic, follow from physical properties expressed by constructor-theoretic principles. PMID:27616914

  14. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or mathematics. Includes problems with answers and six appendixes. 1965 edition.

  15. The Theory of Probability

    Indian Academy of Sciences (India)

    important practical applications in statistical quality control. Of a similar kind are the laws of probability for the scattering of missiles, which are basic in the … deviations for different ranges for each type of gun and of shell are found empirically in firing practice on an artillery range. But the subsequent solution of all possible …

  16. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  17. What is Probability Theory?

    Indian Academy of Sciences (India)

    … gambling problems in 18th-century Europe … (random) phenomena, especially those evolving over time. The study of the motion of physical objects over time by Newton led to his famous three laws of motion as well as many important developments in the theory of ordinary differential equations. Similarly, the construction …

  18. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axiomatics …

  19. Comments on quantum probability theory.

    Science.gov (United States)

    Sloman, Steven

    2014-01-01

    Quantum probability theory (QP) is the best formal representation available of the most common form of judgment involving attribute comparison (inside judgment). People are capable, however, of judgments that involve proportions over sets of instances (outside judgment). Here, the theory does not do so well. I discuss the theory both in terms of descriptive adequacy and normative appropriateness. Copyright © 2013 Cognitive Science Society, Inc.

  20. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    … report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can "calibrate" inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk-averse agents have …
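
    For concreteness, a minimal sketch of the quadratic scoring rule that this literature builds on; the payoff constants and the truth-telling comparison below are illustrative, not taken from the paper:

```python
def quadratic_score(report, outcome, a=1.0, b=1.0):
    """Quadratic scoring rule payoff: a - b * sum_i (r_i - 1{i=outcome})^2.
    A risk-neutral agent maximizes expected payoff by reporting true
    beliefs; risk-averse agents generally do not, which is the bias
    the calibration procedures discussed here correct for."""
    return a - b * sum((r - (1.0 if i == outcome else 0.0)) ** 2
                       for i, r in enumerate(report))

belief = [0.7, 0.3]  # true subjective probability of a binary event

def expected_payoff(report):
    return sum(p * quadratic_score(report, i) for i, p in enumerate(belief))

print(expected_payoff([0.7, 0.3]))  # truthful report: 0.58 (maximal)
print(expected_payoff([0.9, 0.1]))  # distorted report: 0.50 (lower)
```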

  1. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  2. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self-study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edition …

  3. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    2014-01-01

    We evaluate a binary lottery procedure for inducing risk neutral behavior in a subjective belief elicitation task. Prior research has shown this procedure to robustly induce risk neutrality when subjects are given a single risk task defined over objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure also induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation of subjective probabilities in subjects …
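
    Why the binary lottery procedure should induce risk neutrality follows from a one-line expected-utility argument; a sketch under standard assumptions, not reproduced from the paper:

```latex
% Pay the subject in lottery tickets: a task score S yields the high
% prize H with probability p(S), and the low prize L otherwise.
EU(S) = p(S)\,u(H) + \bigl(1 - p(S)\bigr)\,u(L)
      = u(L) + p(S)\,\bigl[u(H) - u(L)\bigr].
% EU is affine in p(S), so maximizing expected utility is equivalent
% to maximizing the win probability p(S) itself, whatever the
% curvature of u: the agent behaves as if risk neutral.
```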

  4. Harmonic analysis and the theory of probability

    CERN Document Server

    Bochner, Salomon

    2005-01-01

    Nineteenth-century studies of harmonic analysis were closely linked with the work of Joseph Fourier on the theory of heat and with that of P. S. Laplace on probability. During the 1920s, the Fourier transform developed into one of the most effective tools of modern probabilistic research; conversely, the demands of probability theory stimulated further research into harmonic analysis. Mathematician Salomon Bochner wrote a pair of landmark books on the subject in the 1930s and 40s. In this volume, originally published in 1955, he adopts a more probabilistic view and emphasizes stochastic processes …

  5. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments in financial markets, and they guide us in constructing more efficient algorithms. To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as: • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c…

  6. Subjective Probability and Information Retrieval: A Review of the Psychological Literature.

    Science.gov (United States)

    Thompson, Paul

    1988-01-01

    Reviews the subjective probability estimation literature of six schools of human judgement and decision making: decision theory, behavioral decision theory, psychological decision theory, social judgement theory, information integration theory, and attribution theory. Implications for probabilistic information retrieval are discussed, including…

  7. A basic course in probability theory

    CERN Document Server

    Bhattacharya, Rabi

    2016-01-01

    This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added and basic theory has been expanded. General Markov dependent sequences and their convergence to equilibrium is the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and/or concentration inequalities ranging from those of Chebyshev, Cramer–Chernoff, Bahadur–Rao, to Hoeffding have been added, with illustrative comparisons of their …

  8. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    We evaluate the binary lottery procedure for inducing risk neutral behavior in a subjective belief elicitation task. Harrison, Martínez-Correa and Swarthout [2013] found that the binary lottery procedure works robustly to induce risk neutrality when subjects are given one risk task defined over objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation …

  9. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables. The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vectors …

  10. A Course on Elementary Probability Theory

    OpenAIRE

    Lo, Gane Samb

    2017-01-01

    This book introduces the theory of probability from the beginning. Assuming that the reader possesses the normal mathematical level acquired at the end of secondary school, we aim to equip him with a solid basis in probability theory. The theory is preceded by a general chapter on counting methods. Then, the theory of probabilities is presented in a discrete framework. Two objectives are sought. The first is to give the reader the ability to solve a large number of problems related to …

  11. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers' theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises' exceptionally restrictive definition of probability. This paper challenges Richard von Mises' definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that the nature of human action and the relative frequency method for calculating numerical probabilities both presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  12. Single Trial Probability Applications: Can Subjectivity Evade Frequency Limitations?

    Directory of Open Access Journals (Sweden)

    David Howden

    2009-10-01

    Frequency probability theorists define an event's probability distribution as the limit of a repeated set of trials belonging to a homogeneous collective. The subsets of this collective are events which we have deficient knowledge about on an individual level, although for the larger collective we have knowledge of its aggregate behavior. Hence, probabilities can only be achieved through repeated trials of these subsets, arriving at the established frequencies that define the probabilities. Crovelli (2009) argues that this is a mistaken approach, and that a subjective assessment of individual trials should be used instead. Bifurcating between the two concepts of risk and uncertainty, Crovelli first asserts that probability is the tool used to manage uncertain situations, and then attempts to rebuild a definition of probability theory with this in mind. We show that such an attempt has little to gain, and results in an indeterminate application of entrepreneurial forecasting to uncertain decisions—a process far removed from any application of probability theory.

  13. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

    ""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  14. Probability as a theory dependent concept

    NARCIS (Netherlands)

    Atkinson, D; Peijnenburg, J

    1999-01-01

    It is argued that probability should be defined implicitly by the distributions of possible measurement values characteristic of a theory. These distributions are tested by, but not defined in terms of, relative frequencies of occurrences of events of a specified kind. The adoption of an a priori …

  15. Concurrency meets probability: theory and practice (abstract)

    NARCIS (Netherlands)

    Katoen, Joost P.

    Treating random phenomena in concurrency theory has a long tradition. Petri nets [18, 10] and process algebras [14] have been extended with probabilities. The same applies to behavioural semantics such as strong and weak (bi)simulation [1], and testing pre-orders [5]. Beautiful connections between …

  16. Eliciting Subjective Probability Distributions with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    2015-01-01

    We test in a laboratory experiment the theoretical prediction that risk attitudes have a surprisingly small role in distorting reports from true belief distributions. We find evidence consistent with theory in our experiment....

  17. Inferring Beliefs as Subjectively Imprecise Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2012-01-01

    … The experimental task consists of a series of standard lottery choices in which the subject is assumed to use conventional risk attitudes to select one lottery or the other, and then a series of betting choices in which the subject is presented with a range of bookies offering odds on the outcome of some event …

  18. Probability Theory Without Tears!

    Indian Academy of Sciences (India)

    … gambling dens where probability theory had its humble beginnings.) Also, she or he may be put off by the mathematical hairsplitting of advanced treatises. A good book intended for beginners should avoid these pitfalls. Too austere an approach will make one interested in a working knowledge of the subject uncomfortable …

  19. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    2017-01-01

    Subjective beliefs are elicited routinely in economics experiments. However, such elicitation often suffers from two possible disadvantages. First, beliefs are recovered in the form of a summary statistic, usually the mean, of the underlying latent distribution. Second, recovered beliefs are biased …

  20. A Challenge to Ludwig von Mises’s Theory of Probability

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2010-10-01

    The most interesting and completely overlooked aspect of Ludwig von Mises's theory of probability is the total absence of any explicit definition for probability in his theory. This paper examines Mises's theory of probability in light of the fact that his theory possesses no definition for probability. It is argued, first, that Mises's theory differs in important respects from his brother's famous theory of probability. A defense of the subjective definition for probability is then provided, which is subsequently used to critique Ludwig von Mises's theory. It is argued that only the subjective definition for probability comports with Mises's other philosophical positions. Since Mises did not provide an explicit definition for probability, it is suggested that he ought to have adopted a subjective definition.

  1. Subjective Illness theory and coping

    Directory of Open Access Journals (Sweden)

    Gessmann H.-W.

    2015-03-01

    The article presents a view of the problem of subjective illness theory in the context of coping behavior. It compiles the results of the latest studies of coping, discusses how subjective illness theory affects illness coping and the patient's health, and presents a study of differences in the coping behavior of patients at risk of heart attack and of oncology patients. The article is recommended for specialists concerned with the psychological causes of pathogenic processes and with patients' coping strategies.

  2. Naive Probability: A Mental Model Theory of Extensional Reasoning.

    Science.gov (United States)

    Johnson-Laird, P. N.; Legrenzi, Paolo; Girotto, Vittorio; Legrenzi, Maria Sonino; Caverni, Jean-Paul

    1999-01-01

    Outlines a theory of naive probability in which individuals who are unfamiliar with the probability calculus can infer the probabilities of events in an "extensional" way. The theory accommodates reasoning based on numerical premises, and explains how naive reasoners can infer posterior probabilities without relying on Bayes's theorem.…
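
    The extensional inference the theory describes amounts to counting equiprobable mental models; a toy illustration (the three-coin scenario is invented here, not from the article):

```python
from itertools import product

# Equiprobable "mental models": every heads/tails assignment to
# three coins. A naive reasoner infers P(event) as the fraction of
# models in which the event holds -- no Bayes's theorem required.
models = list(product("HT", repeat=3))

conditioning = [m for m in models if m.count("H") >= 2]  # observed: >= 2 heads
joint = [m for m in conditioning if m[0] == "H"]         # query: first coin heads

# Posterior by enumeration: P(first heads | at least two heads)
print(len(joint) / len(conditioning))  # 3/4 = 0.75
```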

  3. Tests of Cumulative Prospect Theory with graphical displays of probability

    Directory of Open Access Journals (Sweden)

    Michael H. Birnbaum

    2008-10-01

    Recent research reported evidence that contradicts cumulative prospect theory and the priority heuristic. The same body of research also violates two editing principles of original prospect theory: cancellation (the principle that people delete any attribute that is the same in both alternatives before deciding between them) and combination (the principle that people combine branches leading to the same consequence by adding their probabilities). This study was designed to replicate previous results and to test whether the violations of cumulative prospect theory might be eliminated or reduced by using formats for presentation of risky gambles in which cancellation and combination could be facilitated visually. Contrary to the idea that decision behavior contradicting cumulative prospect theory and the priority heuristic would be altered by use of these formats, however, data with two new graphical formats as well as fresh replication data continued to show the patterns of evidence that violate cumulative prospect theory, the priority heuristic, and the editing principles of combination and cancellation. Systematic violations of restricted branch independence also contradicted predictions of "stripped" prospect theory (subjectively weighted additive utility without the editing rules).

  4. Probability, statistics and queueing theory, with computer science applications

    CERN Document Server

    Allen, Arnold O

    1978-01-01

    Probability, Statistics, and Queueing Theory: With Computer Science Applications focuses on the use of statistics and queueing theory for the design and analysis of data communication systems, emphasizing how the theorems and theory can be used to solve practical computer science problems. This book is divided into three parts. The first part discusses the basic concept of probability, probability distributions commonly used in applied probability, and the important concept of a stochastic process. Part II covers the discipline of queueing theory, while Part III deals with statistical inference. …

  5. Lady luck the theory of probability

    CERN Document Server

    Weaver, Warren

    1982-01-01

    ""Should I take my umbrella?"" ""Should I buy insurance?"" ""Which horse should I bet on?"" Every day ― in business, in love affairs, in forecasting the weather or the stock market questions arise which cannot be answered by a simple ""yes"" or ""no."" Many of these questions involve probability. Probabilistic thinking is as crucially important in ordinary affairs as it is in the most abstruse realms of science. This book is the best nontechnical introduction to probability ever written. Its author, the late Dr. Warren Weaver, was a professor of mathematics, active in the Rockefeller and Sloa

  6. Nonequilibrium random matrix theory: Transition probabilities

    Science.gov (United States)

    Pedro, Francisco Gil; Westphal, Alexander

    2017-03-01

    In this paper we present an analytic method for calculating the transition probability between two random Gaussian matrices with given eigenvalue spectra in the context of Dyson Brownian motion. We show that in the Coulomb gas language, in large N limit, memory of the initial state is preserved in the form of a universal linear potential acting on the eigenvalues. We compute the likelihood of any given transition as a function of time, showing that as memory of the initial state is lost, transition probabilities converge to those of the static ensemble.
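
    For reference, the eigenvalue dynamics underlying such transition probabilities; this is the standard Dyson Brownian motion SDE (normalization conventions vary), while the universal linear potential is the paper's own result:

```latex
% Dyson Brownian motion: eigenvalues \lambda_i of an N x N matrix
% whose entries diffuse by Gaussian increments obey coupled SDEs
d\lambda_i = dB_i + \frac{\beta}{2} \sum_{j \neq i}
             \frac{dt}{\lambda_i - \lambda_j},
% with \beta = 1, 2, 4 for orthogonal/unitary/symplectic ensembles:
% independent Brownian motions plus pairwise Coulomb-gas repulsion.
```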

  7. Constructing diagnostic likelihood: clinical decisions using subjective versus statistical probability.

    Science.gov (United States)

    Kinnear, John; Jackson, Ruth

    2017-07-01

    Although physicians are highly trained in the application of evidence-based medicine, and are assumed to make rational decisions, there is evidence that their decision making is prone to biases. One of the biases that has been shown to affect accuracy of judgements is that of representativeness and base-rate neglect, where the saliency of a person's features leads to overestimation of their likelihood of belonging to a group. This results in the substitution of 'subjective' probability for statistical probability. This study examines clinicians' propensity to make estimations of subjective probability when presented with clinical information that is considered typical of a medical condition. The strength of the representativeness bias is tested by presenting choices in textual and graphic form. Understanding of statistical probability is also tested by omitting all clinical information. For the questions that included clinical information, 46.7% and 45.5% of clinicians, respectively, made judgements of statistical probability. Where the question omitted clinical information, 79.9% of clinicians made a judgement consistent with statistical probability. There was a statistically significant difference in responses to the questions with and without representativeness information (χ2 (1, n=254)=54.45, p<0.001), consistent with substitution of subjective for statistical probability. One of the causes for this representativeness bias may be the way clinical medicine is taught, where stereotypic presentations are emphasised in diagnostic decision making.
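
    The statistical probability against which the clinicians' answers were scored is a base-rate calculation; a worked example with illustrative numbers (not the study's stimuli):

```latex
% Bayes' theorem with an explicit base rate: prevalence 1%,
% sensitivity 90%, false-positive rate 5%.
P(D \mid +) = \frac{P(+ \mid D)\,P(D)}
                   {P(+ \mid D)\,P(D) + P(+ \mid \neg D)\,P(\neg D)}
            = \frac{0.9 \times 0.01}{0.9 \times 0.01 + 0.05 \times 0.99}
            \approx 0.15 .
% Representativeness invites an answer near the 0.9 sensitivity;
% the low base rate pulls the statistical answer down to 0.15.
```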

  8. Continuous subjective expected utility with non-additive probabilities

    NARCIS (Netherlands)

    P.P. Wakker (Peter)

    1989-01-01

    A well-known theorem of Debreu about additive representations of preferences is applied in a non-additive context, to characterize continuous subjective expected utility maximization for the case where the probability measures may be non-additive. The approach of this paper does not need …

  9. Probability Theories and the Justification of Theism

    OpenAIRE

    Portugal, Agnaldo Cuoco

    2003-01-01

    In the present paper I intend to analyse, criticise and suggest an alternative to Richard Swinburne's use of Bayes's theorem to justify the belief that there is a God. Swinburne's contribution here lies in the scope of his project and the interpretation he adopts for Bayes's formula, a very important theorem of the probability calculus.

  10. Stability and coherence of health experts' upper and lower subjective probabilities about dose-response functions.

    Science.gov (United States)

    Wallsten, T S; Forsyth, B H

    1983-06-01

    As part of a method for assessing health risks associated with primary National Ambient Air Quality Standards, T. B. Feagans and W. F. Biller (Research Triangle Park, North Carolina: EPA Office of Air Quality Planning and Standards, May 1981) developed a technique for encoding experts' subjective probabilities regarding dose-response functions. The encoding technique is based on B. O. Koopman's (Bulletin of the American Mathematical Society, 1940, 46, 763-764; Annals of Mathematics, 1940, 41, 269-292) probability theory, which does not require probabilities to be sharp, but rather allows lower and upper probabilities to be associated with an event. Uncertainty about a dose-response function can be expressed either in terms of the response rate expected at a given concentration or, conversely, in terms of the concentration expected to support a given response rate. Feagans and Biller (1981, cited above) derive the relation between the two conditional probabilities, which is easily extended to upper and lower conditional probabilities. These relations were treated as coherence requirements in an experiment utilizing four ozone and four lead experts as subjects, each providing judgments on two separate occasions. Four subjects strongly satisfied the coherence requirements in both sessions, and three more did so in the second session only. The eighth subject also improved in Session 2. Encoded probabilities were highly correlated between the two sessions, but changed from the first to the second in a manner that improved coherence and reflected greater attention to certain parameters of the dose-response function.
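
    A sketch of the kind of coherence conditions involved, assuming a monotone dose-response relation; the exact Feagans-Biller formulation is in the cited reports:

```latex
% Conjugacy of lower and upper probabilities (Koopman-style):
\underline{P}(A) = 1 - \overline{P}(\neg A), \qquad
\underline{P}(A) \le \overline{P}(A).
% For a monotone dose-response curve, the response rate R(c) at
% concentration c and the concentration C(r) supporting rate r
% describe the same underlying event:
P\bigl(R(c) \ge r\bigr) = P\bigl(C(r) \le c\bigr),
% so the analogous identities for \underline{P} and \overline{P}
% serve as coherence requirements across the two elicitation modes.
```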

  11. Quantum Probability, Orthogonal Polynomials and Quantum Field Theory

    Science.gov (United States)

    Accardi, Luigi

    2017-03-01

    The main thesis of the present paper is that Quantum Probability is not a generalization of classical probability, but a deeper level of it. Classical random variables have an intrinsic (microscopic) non-commutative structure that generalizes usual quantum theory. The study of this generalization is the core of the non-linear quantization program.

  12. Probability Theory, Not the Very Guide of Life

    Science.gov (United States)

    Juslin, Peter; Nilsson, Hakan; Winman, Anders

    2009-01-01

    Probability theory has long been taken as the self-evident norm against which to evaluate inductive reasoning, and classical demonstrations of violations of this norm include the conjunction error and base-rate neglect. Many of these phenomena require multiplicative probability integration, whereas people seem more inclined to linear additive…

  13. A Short History of Probability Theory and Its Applications

    Science.gov (United States)

    Debnath, Lokenath; Basu, Kanadpriya

    2015-01-01

    This paper deals with a brief history of probability theory and its applications to Jacob Bernoulli's famous law of large numbers and theory of errors in observations or measurements. Included are the major contributions of Jacob Bernoulli and Laplace. It is written to pay the tricentennial tribute to Jacob Bernoulli, since the year 2013…

  14. Probability in biology: overview of a comprehensive theory of probability in living systems.

    Science.gov (United States)

    Nakajima, Toshiyuki

    2013-09-01

    Probability is closely related to biological organization and adaptation to the environment. Living systems need to maintain their organizational order by producing specific internal events non-randomly, and must cope with the uncertain environments. These processes involve increases in the probability of favorable events for these systems by reducing the degree of uncertainty of events. Systems with this ability will survive and reproduce more than those that have less of this ability. Probabilistic phenomena have been deeply explored using the mathematical theory of probability since Kolmogorov's axiomatization provided mathematical consistency for the theory. However, the interpretation of the concept of probability remains both unresolved and controversial, which creates problems when the mathematical theory is applied to problems in real systems. In this article, recent advances in the study of the foundations of probability from a biological viewpoint are reviewed, and a new perspective is discussed toward a comprehensive theory of probability for understanding the organization and adaptation of living systems. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. Problems from the discrete to the continuous probability, number theory, graph theory, and combinatorics

    CERN Document Server

    Pinsky, Ross G

    2014-01-01

    The primary intent of the book is to introduce an array of beautiful problems in a variety of subjects quickly, pithily and completely rigorously to graduate students and advanced undergraduates. The book takes a number of specific problems and solves them, the needed tools developed along the way in the context of the particular problems. It treats a mélange of topics from combinatorial probability theory, number theory, random graph theory and combinatorics. The problems in this book involve the asymptotic analysis of a discrete construct as some natural parameter of the system tends to infinity. Besides bridging discrete mathematics and mathematical analysis, the book makes a modest attempt at bridging disciplines. The problems were selected with an eye toward accessibility to a wide audience, including advanced undergraduate students. The book could be used for a seminar course in which students present the lectures.

  16. Models for probability and statistical inference theory and applications

    CERN Document Server

    Stapleton, James H

    2007-01-01

    This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readers. Models for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping. Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo…

  17. Bayesian probability theory applications in the physical sciences

    CERN Document Server

    Linden, Wolfgang von der; Toussaint, Udo von

    2014-01-01

    From the basics to the forefront of modern research, this book presents all aspects of probability theory, statistics and data analysis from a Bayesian perspective for physicists and engineers. The book presents the roots, applications and numerical implementation of probability theory, and covers advanced topics such as maximum entropy distributions, stochastic processes, parameter estimation, model selection, hypothesis testing and experimental design. In addition, it explores state-of-the art numerical techniques required to solve demanding real-world problems. The book is ideal for students and researchers in physical sciences and engineering.

  18. Probability and information theory, with applications to radar

    CERN Document Server

    Woodward, P M; Higinbotham, W

    1964-01-01

    Electronics and Instrumentation, Second Edition, Volume 3: Probability and Information Theory with Applications to Radar provides information pertinent to the development of research carried out in electronics and applied physics. This book presents the established mathematical techniques that provide the code in which so much of the mathematical theory of electronics and radar is expressed. Organized into eight chapters, this edition begins with an overview of the geometry of probability distributions in which moments play a significant role. This text then examines the mathematical methods in …

  19. Probability Theory as Logic: Data Assimilation for Multiple Source Reconstruction

    Science.gov (United States)

    Yee, Eugene

    2012-03-01

    Probability theory as logic (or Bayesian probability theory) is a rational inferential methodology that provides a natural and logically consistent framework for source reconstruction. This methodology fully utilizes the information provided by a limited number of noisy concentration data obtained from a network of sensors and combines it in a consistent manner with the available prior knowledge (mathematical representation of relevant physical laws), hence providing a rigorous basis for the assimilation of this data into models of atmospheric dispersion for the purpose of contaminant source reconstruction. This paper addresses the application of this framework to the reconstruction of contaminant source distributions consisting of an unknown number of localized sources, using concentration measurements obtained from a sensor array. To this purpose, Bayesian probability theory is used to formulate the full joint posterior probability density function for the parameters of the unknown source distribution. A simulated annealing algorithm, applied in conjunction with a reversible-jump Markov chain Monte Carlo technique, is used to draw random samples of source distribution models from the posterior probability density function. The methodology is validated against a real (full-scale) atmospheric dispersion experiment involving a multiple point source release.
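
    A minimal sketch of the Bayesian machinery in the simplest case: one source at a known location with unknown strength, a toy forward model, and plain Metropolis sampling. Everything here (kernel, priors, step size) is assumed for illustration; the paper itself uses atmospheric dispersion models and reversible-jump MCMC over an unknown number of sources.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy forward model: sensor i reads a(x_i) * q + Gaussian noise,
# with q the unknown source strength. A real application replaces
# a() with an atmospheric transport/dispersion calculation.
x_sensors = np.linspace(1.0, 5.0, 8)
a = lambda x: 1.0 / x**2                 # assumed decay-with-distance kernel
q_true, sigma = 3.0, 0.05
data = a(x_sensors) * q_true + rng.normal(0.0, sigma, x_sensors.size)

def log_posterior(q):
    """Gaussian likelihood with a flat prior on q > 0."""
    if q <= 0.0:
        return -np.inf
    resid = data - a(x_sensors) * q
    return -0.5 * np.sum(resid**2) / sigma**2

# Plain Metropolis sampling of p(q | data).
q, samples = 1.0, []
for _ in range(20000):
    q_prop = q + rng.normal(0.0, 0.1)
    if np.log(rng.uniform()) < log_posterior(q_prop) - log_posterior(q):
        q = q_prop
    samples.append(q)

burned = samples[5000:]
print(np.mean(burned), np.std(burned))   # posterior mean near q_true = 3.0
```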

  20. Tsallis Entropy, Escort Probability and the Incomplete Information Theory

    Directory of Open Access Journals (Sweden)

    Parvin Sadeghi

    2010-12-01

    Non-extensive statistical mechanics appears as a powerful way to describe complex systems. Tsallis entropy, the main core of this theory, has remained an unproven assumption. Many people have tried to derive the Tsallis entropy axiomatically. Here we follow the work of Wang (EPJB, 2002) and use the incomplete information theory to retrieve the Tsallis entropy. We change the incomplete information axioms to consider the escort probability and obtain a correct form of Tsallis entropy in comparison with Wang's work.
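
    The two objects named in the abstract, written out in their common normalizations (conventions differ slightly across authors):

```latex
% Tsallis entropy with entropic index q (Boltzmann constant k = 1):
S_q = \frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
\lim_{q \to 1} S_q = -\sum_i p_i \ln p_i \quad \text{(Shannon/Gibbs)}.
% Escort probability distribution associated with p:
P_i = \frac{p_i^{\,q}}{\sum_j p_j^{\,q}} .
```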

  1. On Dobrushin's way from probability theory to statistical physics

    CERN Document Server

    Minlos, R A; Suhov, Yu M

    2000-01-01

    R. Dobrushin worked in several branches of mathematics (probability theory, information theory), but his deepest influence was on mathematical physics. He was one of the founders of the rigorous study of statistical physics. When Dobrushin began working in that direction in the early sixties, only a few people worldwide were thinking along the same lines. Now there is an army of researchers in the field. This collection is devoted to the memory of R. L. Dobrushin. The authors who contributed to this collection knew him quite well and were his colleagues. The title, "On Dobrushin's Way", is meant …

  2. Coexistence on reflecting hyperplane in generalized probability theories

    Science.gov (United States)

    Kobayashi, Masatomo

    2017-08-01

    The coexistence of effects in a certain class of generalized probability theories is investigated. The effect space corresponding to an even-sided regular polygon state space has a central hyperplane that contains all the nontrivial extremal effects. The existence of such a hyperplane, called a reflecting hyperplane, is closely related to the point symmetry of the corresponding state space. The effects on such a hyperplane can be regarded as the (generalized) unbiased effects. A necessary and sufficient condition for a pair of unbiased effects in the even-sided regular polygon theories is presented. This result reproduces a low-dimensional analogue of known results of qubit effects in a certain limit.

  3. A short course on measure and probability theories

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre

    2004-02-01

    This brief introduction to Measure Theory, and its applications to Probabilities, corresponds to the lecture notes of a seminar series given at Sandia National Laboratories in Livermore, during the spring of 2003. The goal of these seminars was to provide a minimal background to Computational Combustion scientists interested in using more advanced stochastic concepts and methods, e.g., in the context of uncertainty quantification. Indeed, most mechanical engineering curricula do not provide students with formal training in the field of probability, and even less in measure theory. However, stochastic methods have been used more and more extensively in the past decade, and have provided more successful computational tools. Scientists at the Combustion Research Facility of Sandia National Laboratories have been using computational stochastic methods for years. Addressing more and more complex applications, and facing difficult problems that arose in applications, showed the need for a better understanding of theoretical foundations. This is why the seminar series was launched, and these notes summarize most of the concepts which have been discussed. The goal of the seminars was to bring a group of mechanical engineers and computational combustion scientists to a full understanding of N. Wiener's polynomial chaos theory. Therefore, these lecture notes are built along those lines, and are not intended to be exhaustive. In particular, the author welcomes any comments or criticisms.
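
    The target of the lecture series, Wiener's polynomial chaos, expands a finite-variance random quantity in orthogonal polynomials of Gaussian variables; a standard statement (notation assumed, not quoted from the notes):

```latex
% Hermite polynomial chaos expansion of a second-order random
% variable u(\theta), with \xi(\theta) a standard Gaussian vector:
u(\theta) = \sum_{k=0}^{\infty} u_k \, \Psi_k\bigl(\xi(\theta)\bigr),
\qquad
u_k = \frac{\langle u \, \Psi_k \rangle}{\langle \Psi_k^2 \rangle},
% where the \Psi_k are multivariate Hermite polynomials, orthogonal
% under the Gaussian measure: \langle \Psi_j \Psi_k \rangle = 0, j \ne k.
```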

  4. Working directly with probabilities in quantum field theory

    Science.gov (United States)

    Dickinson, R.; Forshaw, J.; Millington, P.

    2017-08-01

    We present a novel approach to computing transition probabilities in quantum field theory, which allows them to be written directly in terms of expectation values of nested commutators and anti-commutators of field operators, rather than squared matrix elements. We show that this leads to a diagrammatic expansion in which the retarded propagator plays a dominant role. As a result, one is able to see clearly how faster-than-light signalling is prevented between sources and detectors. Finally, we comment on potential implications of this approach for dealing with infra-red divergences.

  5. Subjective Probability of Receiving Harm as a Function of Attraction and Harm Delivered.

    Science.gov (United States)

    Schlenker, Barry R.; And Others

    It was hypothesized that subjects who liked a source of potential harm would estimate the probability of receiving harm mediated by him as lower than would subjects who disliked the source. To test the hypothesis, subjects were asked to estimate the probability that a liked or disliked confederate would deliver an electric shock on each of 10…

  6. Problems in probability theory, mathematical statistics and theory of random functions

    CERN Document Server

    Sveshnikov, A A

    1979-01-01

    Problem solving is the main thrust of this excellent, well-organized workbook. Suitable for students at all levels in probability theory and statistics, the book presents over 1,000 problems and their solutions, illustrating fundamental theory and representative applications in the following fields: Random Events; Distribution Laws; Correlation Theory; Random Variables; Entropy & Information; Markov Processes; Systems of Random Variables; Limit Theorems; Data Processing; and more. The coverage of topics is both broad and deep, ranging from the most elementary combinatorial problems through lim…

  7. The instrumentalist aspects of quantum mechanics stem from probability theory

    Science.gov (United States)

    Vervoort, Louis

    2012-03-01

    The aim of the article is to argue that the interpretations of quantum mechanics and of probability are much closer than usually thought. Indeed, a detailed analysis of the concept of probability (within the standard frequency interpretation of R. von Mises) reveals that this notion always refers to an observing system. Therefore the instrumentalist aspects of quantum mechanics, and in particular the enigmatic role of the observer in the Copenhagen interpretation, derive from a precise understanding of probability.

  8. Probability model for analyzing fire management alternatives: theory and structure

    Science.gov (United States)

    Frederick W. Bratten

    1982-01-01

    A theoretical probability model has been developed for analyzing program alternatives in fire management. It includes submodels or modules for predicting probabilities of fire behavior, fire occurrence, fire suppression, effects of fire on land resources, and financial effects of fire. Generalized "fire management situations" are used to represent actual fire...

  9. Cold and hot cognition: quantum probability theory and realistic psychological modeling.

    Science.gov (United States)

    Corr, Philip J

    2013-06-01

    Typically, human decision making is emotionally "hot" and does not conform to "cold" classical probability (CP) theory. As quantum probability (QP) theory emphasises order, context, superimposition states, and nonlinear dynamic effects, one of its major strengths may be its power to unify formal modeling and realistic psychological theory (e.g., information uncertainty, anxiety, and indecision, as seen in the Prisoner's Dilemma).

  10. Asymptotic Theory for the Probability Density Functions in Burgers Turbulence

    CERN Document Server

    E, Weinan; Vanden Eijnden, Eric

    1999-01-01

    A rigorous study is carried out for the randomly forced Burgers equation in the inviscid limit. No closure approximations are made. Instead the probability density functions of velocity and velocity gradient are related to the statistics of quantities defined along the shocks. This method allows one to compute the anomalies, as well as asymptotics for the structure functions and the probability density functions. It is shown that the left tail for the probability density function of the velocity gradient has to decay faster than $|\xi|^{-3}$. A further argument confirms the prediction of E et al., Phys. Rev. Lett. 78, 1904 (1997), that it should decay as $|\xi|^{-7/2}$.

  11. Theory of overdispersion in counting statistics caused by fluctuating probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Semkow, Thomas M. E-mail: semkow@wadsworth.org

    1999-11-01

    It is shown that the random Lexis fluctuations of probabilities such as probability of decay or detection cause the counting statistics to be overdispersed with respect to the classical binomial, Poisson, or Gaussian distributions. The generating and the distribution functions for the overdispersed counting statistics are derived. Applications to radioactive decay with detection and more complex experiments are given, as well as distinguishing between the source and background, in the presence of overdispersion. Monte-Carlo verifications are provided.
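
    The mechanism can be seen in one line from the law of total variance; a sketch of the mixed-Poisson case (the paper derives the full generating and distribution functions, which this does not reproduce):

```latex
% Let N | \lambda ~ Poisson(\lambda), with the rate \lambda itself
% fluctuating (Lexis fluctuations). Then
E[N] = E[\lambda], \qquad
\operatorname{Var}(N) = E\bigl[\operatorname{Var}(N \mid \lambda)\bigr]
                      + \operatorname{Var}\bigl(E[N \mid \lambda]\bigr)
                      = E[\lambda] + \operatorname{Var}(\lambda) \;>\; E[N],
% so any fluctuation of the underlying rate or probability inflates
% the variance above the Poisson value: overdispersion.
```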

  12. Theory of overdispersion in counting statistics caused by fluctuating probabilities

    CERN Document Server

    Semkow, T M

    1999-01-01

    It is shown that the random Lexis fluctuations of probabilities such as probability of decay or detection cause the counting statistics to be overdispersed with respect to the classical binomial, Poisson, or Gaussian distributions. The generating and the distribution functions for the overdispersed counting statistics are derived. Applications to radioactive decay with detection and more complex experiments are given, as well as distinguishing between the source and background, in the presence of overdispersion. Monte-Carlo verifications are provided.

  13. The World According to de Finetti: On de Finetti's Theory of Probability and Its Application to Quantum Mechanics

    Science.gov (United States)

    Berkovitz, Joseph

    Bruno de Finetti is one of the founding fathers of the subjectivist school of probability, where probabilities are interpreted as rational degrees of belief. His work on the relation between the theorems of probability and rationality is among the cornerstones of modern subjective probability theory. De Finetti maintained that rationality requires that degrees of belief be coherent, and he argued that the whole of probability theory could be derived from these coherence conditions. De Finetti's interpretation of probability has been highly influential in science. This paper focuses on the application of this interpretation to quantum mechanics. We argue that de Finetti held that the coherence conditions of degrees of belief in events depend on their verifiability. Accordingly, the standard coherence conditions of degrees of belief that are familiar from the literature on subjective probability only apply to degrees of belief in events which could (in principle) be jointly verified; and the coherence conditions of degrees of belief in events that cannot be jointly verified are weaker. While the most obvious explanation of de Finetti's verificationism is the influence of positivism, we argue that it could be motivated by the radical subjectivist and instrumental nature of probability in his interpretation; for as it turns out, in this interpretation it is difficult to make sense of the idea of coherent degrees of belief in, and accordingly probabilities of, unverifiable events. We then consider the application of this interpretation to quantum mechanics, concentrating on the Einstein-Podolsky-Rosen experiment and Bell's theorem.
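
    To make the coherence idea concrete, a standard textbook Dutch-book illustration (not taken from Berkovitz's paper) runs as follows, where $b(E)$ denotes the price an agent would pay for a bet that returns 1 if $E$ occurs and 0 otherwise:

        \text{If } b(A) + b(\neg A) = 1 + \varepsilon \text{ for some } \varepsilon > 0,
        \text{ then buying both bets costs } 1 + \varepsilon
        \text{ but returns exactly } 1 \text{ whichever way } A \text{ turns out:}
        \text{ a sure loss of } \varepsilon.

    Coherence rules out every such sure-loss book, which is what forces $b$ to satisfy the probability axioms.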

  14. Estimation of first excursion probability for mechanical appendage system subjected to nonstationary earthquake excitation

    Energy Technology Data Exchange (ETDEWEB)

    Aoki, Shigeru; Suzuki, Kohei (Tokyo Metropolitan Univ. (Japan))

    1984-06-01

    An estimation technique is proposed whereby the first excursion probability of a mechanical appendage system subjected to nonstationary seismic excitation can be conveniently calculated. The first excursion probability of the appendage system is estimated by using this method and the following results are obtained. (1) The probability from this technique is more conservative than that from a simulation technique taking artificial time histories compatible with the design spectrum as input excitation. (2) The first excursion probability is practically independent of the natural period of the appendage system when the tolerable barrier level is normalized by the response amplification factor given by the design spectrum. (3) The first excursion probability decreases as the damping ratio of the appendage system increases. It also decreases as the mass ratio of the appendage system to the supporting system increases. (4) For the inelastic appendage system, the first excursion probability is reduced if an appropriate elongation is permitted.

  15. Non-equilibrium random matrix theory. Transition probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Pedro, Francisco Gil [Univ. Autonoma de Madrid (Spain). Dept. de Fisica Teorica; Westphal, Alexander [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany). Gruppe Theorie

    2016-06-15

    In this letter we present an analytic method for calculating the transition probability between two random Gaussian matrices with given eigenvalue spectra in the context of Dyson Brownian motion. We show that in the Coulomb gas language, in the large N limit, memory of the initial state is preserved in the form of a universal linear potential acting on the eigenvalues. We compute the likelihood of any given transition as a function of time, showing that as memory of the initial state is lost, transition probabilities converge to those of the static ensemble.
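
    A minimal numerical sketch of the setting (a direct simulation, not the authors' analytic Coulomb gas method; the stationary Ornstein-Uhlenbeck form of the matrix process and all sizes are illustrative assumptions):

        import numpy as np

        def goe(n, rng):
            """Sample an n x n GOE matrix scaled so the spectrum stays O(1)."""
            a = rng.normal(size=(n, n))
            return (a + a.T) / np.sqrt(2 * n)

        rng = np.random.default_rng(0)
        n = 200
        m0 = goe(n, rng)                 # initial matrix with a fixed spectrum
        ev0 = np.sort(np.linalg.eigvalsh(m0))
        for t in (0.01, 0.1, 1.0, 5.0):
            # Stationary Dyson Brownian motion: interpolate towards a fresh GOE draw
            mt = np.exp(-t) * m0 + np.sqrt(1 - np.exp(-2 * t)) * goe(n, rng)
            ev = np.sort(np.linalg.eigvalsh(mt))
            print(f"t={t:4.2f}  mean spectral displacement={np.abs(ev - ev0).mean():.3f}")

    As t grows the spectrum forgets m0 and the displacement saturates at its static-ensemble value, mirroring the convergence of transition probabilities described in the abstract.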

  16. The first cycle of the reflective pedagogical paradigm implementation in the introduction probability theory course

    Science.gov (United States)

    Julie, Hongki

    2017-08-01

    One purpose of this study was to describe the steps of the teaching and learning process when the teacher in the Introduction to Probability Theory course teaches event probability using the reflective pedagogical paradigm (RPP), and to describe the results achieved by the students. The study consisted of three cycles, but the results presented in this paper are limited to those obtained in the first cycle. The stages conducted by the researcher in the first cycle can be divided into five: (1) getting to know the students' context, (2) planning and providing student learning experiences, (3) facilitating students in action, (4) asking students to make a reflection, and (5) evaluating. The research was descriptive qualitative and quantitative: the students' learning experiences, actions, and reflections are described qualitatively, and the student evaluation results quantitatively. The research subjects were 38 students taking the Introduction to Probability Theory course in class C. The students' reflections showed that quite a lot of students were still incomplete in writing down the concepts they had learned and/or imprecise in describing the relationships between those concepts. On the evaluation, 85.29% of the students scored under 7. Examined more deeply, the students' greatest difficulty lay in the horizontal mathematization process; as a result, they also had difficulty performing the vertical mathematization process.

  17. Log-concave Probability Distributions: Theory and Statistical Testing

    DEFF Research Database (Denmark)

    An, Mark Yuing

    1996-01-01

    This paper studies the broad class of log-concave probability distributions that arise in economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing the differentiability of density functions. Discrete...

  18. Probability Spaces, Hilbert Spaces, and The Axioms of Test Theory

    Science.gov (United States)

    Zimmerman, Donald W.

    1975-01-01

    Classical test theory findings can be derived from the concepts of conditional expectation, conditional independence, and related notions. It is shown that these concepts provide precisely the formalism needed to obtain the classical results with minimal assumptions and with greatest economy in the methods of proof. (RC)

  19. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think... By doing so, we will obtain a deeper insight into how events involving large values of sums of heavy-tailed random variables are likely to occur.

  20. Bounding the Failure Probability Range of Polynomial Systems Subject to P-box Uncertainties

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2012-01-01

    This paper proposes a reliability analysis framework for systems subject to multiple design requirements that depend polynomially on the uncertainty. Uncertainty is prescribed by probability boxes, also known as p-boxes, whose distribution functions have free or fixed functional forms. An approach based on the Bernstein expansion of polynomials and optimization is proposed. In particular, we search for the elements of a multi-dimensional p-box that minimize (i.e., the best-case) and maximize (i.e., the worst-case) the probability of inner and outer bounding sets of the failure domain. This technique yields intervals that bound the range of failure probabilities. The offset between this bounding interval and the actual failure probability range can be made arbitrarily tight with additional computational effort.
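
    A brute-force sketch of the bounding idea (plain Monte Carlo plus global optimization over the p-box parameters, standing in for the paper's Bernstein expansion approach; the requirement polynomial and parameter intervals are made-up toy values):

        import numpy as np
        from scipy import stats, optimize

        # Toy requirement: the design fails when g(x) = x**3 - 2*x - 1 > 0.
        # The p-box: x ~ Normal(mu, sigma), mu in [0, 0.5], sigma in [0.8, 1.2].
        def fail_prob(params, n=50_000, seed=1):
            mu, sigma = params
            x = stats.norm(mu, sigma).rvs(size=n, random_state=seed)
            return float(np.mean(x**3 - 2 * x - 1 > 0))

        bounds = [(0.0, 0.5), (0.8, 1.2)]
        best = optimize.differential_evolution(fail_prob, bounds, seed=2, tol=1e-3)
        worst = optimize.differential_evolution(lambda p: -fail_prob(p), bounds,
                                                seed=2, tol=1e-3)
        print(f"failure probability range = [{best.fun:.4f}, {-worst.fun:.4f}]")

    Searching the p-box for its best- and worst-case elements yields the bounding interval; the Bernstein-plus-optimization machinery in the paper tightens and certifies such bounds rather than estimating them by sampling.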

  1. Sociological theories of subjective well-being

    NARCIS (Netherlands)

    R. Veenhoven (Ruut)

    2009-01-01

    Subjective well-being is no great issue in sociology; the subject is not mentioned in sociological textbooks (a notable exception is Nolan & Lenski, 2004) and is rarely discussed in sociological journals. This absence has many reasons: pragmatic, ideological, and theoretical. To begin

  2. Inventory control based on advanced probability theory, an application

    CERN Document Server

    Krever, Maarten; Schorr, Bernd; Wunderink, S

    2005-01-01

    Whenever stock is placed as a buffer between consumption and supply, the decision when to replenish the stock is based on uncertain values of future demand and supply variables. Uncertainty exists about the replenishment lead time, and about the number of demands and the quantities demanded during this period. We develop a new analytical expression for the reorder point, which is based on the desired service level and three distributions: the distribution of the quantity of single demands during lead time, the distribution of the lengths of time intervals between successive demands, and the distribution of the lead time itself. The distribution of lead time demand is derived from the distributions of individual demand quantities and not from the demand per period. It is not surprising that the resulting formulae for the mean and variance are different from those currently used. The theory developed is also applicable to periodic review systems. The system has been implemented at CERN and enables a significant enha...
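
    The same three ingredients can be combined numerically; the sketch below estimates a reorder point by Monte Carlo rather than by the paper's analytical formula, with hypothetical gamma/exponential/lognormal stand-ins for the three empirical distributions:

        import numpy as np

        rng = np.random.default_rng(42)

        def lead_time_demand(n_sim=20_000):
            """Simulate total demand during one replenishment lead time."""
            out = np.empty(n_sim)
            for i in range(n_sim):
                lead = rng.gamma(shape=4.0, scale=2.5)           # lead time (days)
                t = total = 0.0
                while True:
                    t += rng.exponential(scale=1.5)              # gap to next demand
                    if t > lead:
                        break
                    total += rng.lognormal(mean=1.0, sigma=0.5)  # quantity demanded
                out[i] = total
            return out

        demand = lead_time_demand()
        service_level = 0.95
        print(f"reorder point for {service_level:.0%} service: "
              f"{np.quantile(demand, service_level):.1f} units")

    The reorder point is then simply the service-level quantile of the simulated lead-time demand distribution.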

  3. From gap probabilities in random matrix theory to eigenvalue expansions

    Science.gov (United States)

    Bothner, Thomas

    2016-02-01

    We present a method to derive asymptotics of eigenvalues for trace-class integral operators $K: L^2(J; \mathrm{d}\lambda) \circlearrowleft$, acting on a single interval $J \subset \mathbb{R}$, which belongs to the ring of integrable operators (Its et al 1990 Int. J. Mod. Phys. B 4 1003-37). Our emphasis lies on the behavior of the spectrum $\{\lambda_i(J)\}_{i=0}^{\infty}$ of $K$ as $|J| \to \infty$ and $i$ is fixed. We show that this behavior is intimately linked to the analysis of the Fredholm determinant $\det(I - \gamma K)|_{L^2(J)}$ as $|J| \to \infty$ and $\gamma \uparrow 1$ in a Stokes type scaling regime. Concrete asymptotic formulae are obtained for the eigenvalues of Airy and Bessel kernels in random matrix theory. Dedicated to Percy Deift and Craig Tracy on the occasion of their 70th birthdays.

  4. Measurement and probability a probabilistic theory of measurement with applications

    CERN Document Server

    Rossi, Giovanni Battista

    2014-01-01

    Measurement plays a fundamental role both in physical and behavioral sciences, as well as in engineering and technology: it is the link between abstract models and empirical reality and is a privileged method of gathering information from the real world. Is it possible to develop a single theory of measurement for the various domains of science and technology in which measurement is involved? This book takes up the challenge by addressing the following main issues: What is the meaning of measurement? How do we measure? What can be measured? A theoretical framework that could truly be shared by scientists in different fields, ranging from physics and engineering to psychology, is developed. The future in fact will require greater collaboration between science and technology and between different sciences. Measurement, which played a key role in the birth of modern science, can act as an essential interdisciplinary tool and language for this new scenario. A sound theoretical basis for addressing key problems in mea...

  5. Determinism and probability in the development of the cell theory.

    Science.gov (United States)

    Duchesneau, François

    2012-09-01

    A return to Claude Bernard's original use of the concept of 'determinism' displays the fact that natural laws were presumed to rule over all natural processes. In a more restricted sense, the term boiled down to a mere presupposition of constant determinant causes for those processes, leaving aside any particular ontological principle, even stochastic. The history of the cell theory until around 1900 was dominated by a twofold conception of determinant causes. Along a reductionist trend, cells' structures and processes were supposed to be accounted for through their analysis into detailed partial mechanisms. But a more holistic approach tended to subsume those analytic means and the mechanisms involved under a program of global functional determinations. When mitotic and meiotic sequences in nuclear replication were being unveiled and neo-Mendelian genetics was being grafted onto cytology and embryology, a conception of strict determinism at the nuclear level, principally represented by Wilhelm Roux and August Weismann, would seem to rule unilaterally over the mosaic interpretation of the cleavage of blastomeres. But, as shown by E.B. Wilson, in developmental processes there occur contingent outcomes of cell division which observations and experiments reveal. This induces the need to admit 'epigenetic' determinants and to relativize the presumed 'preformation' of the developmental phases by making room for an emergent order triggered by the accidental circumstances of gene replication. Copyright © 2012 Elsevier Ltd. All rights reserved.

  6. Probabilities and Shannon's Entropy in the Everett Many-Worlds Theory

    Directory of Open Access Journals (Sweden)

    Andreas Wichert

    2016-12-01

    Full Text Available Following a controversial suggestion by David Deutsch that decision theory can solve the problem of probabilities in the Everett many-worlds theory, we suggest that the probabilities are induced by Shannon's entropy, which measures the uncertainty of events. We argue that a rational person prefers certainty to uncertainty due to the fundamental biological principle of homeostasis.

  7. Probability, random variables, and random processes theory and signal processing applications

    CERN Document Server

    Shynk, John J

    2012-01-01

    Probability, Random Variables, and Random Processes is a comprehensive textbook on probability theory for engineers that provides a more rigorous mathematical framework than is usually encountered in undergraduate courses. It is intended for first-year graduate students who have some familiarity with probability and random variables, though not necessarily of random processes and systems that operate on random signals. It is also appropriate for advanced undergraduate students who have a strong mathematical background. The book has the following features: Several app

  8. Higher risk of probable mental emotional disorder in low or severe vision subjects

    Directory of Open Access Journals (Sweden)

    Lutfah Rif’ati

    2012-07-01

    health problem priority in Indonesia. This paper presents an assessment of severe visual impairments related to the risk of MED. Methods: This paper assessed a part of the Basic Health Research (Riskesdas) 2007 data. For this assessment, subjects 15 years old or older had their visual acuity measured using the Snellen chart and their mental health status determined using the Self Reporting Questionnaire (SRQ-20). A subject was considered to have probable MED if the subject had a total score of 6 or more on the SRQ. Visual acuity was divided into 3 categories: normal/mild (20/20 to 20/60); low vision (less than 20/60 to 3/60); and blind (less than 3/60 to 0/0). Results: Among 972,989 subjects, 554,886 were aged 15 years or older. 11.4% of the subjects had probable MED. The prevalence of low vision and blindness was 5.1% and 0.9%, respectively. Compared to subjects with normal or mild visual impairments, subjects with low vision had a 74% increased risk of probable MED [adjusted relative risk (RRa) = 1.75; 95% confidence interval (CI) = 1.71-1.79]. Blind subjects had a 2.7-fold risk of probable MED [RRa = 2.69; 95% CI = 2.60-2.78] compared to subjects with normal or mild visual impairments. Conclusion: Visual impairment severity increased probable MED risk. Therefore, subjects with visual impairment need more attention on probable MED. (Health Science Indones 2011;2:9-13)

  9. USING THE WEB-SERVICES WOLFRAM|ALPHA TO SOLVE PROBLEMS IN PROBABILITY THEORY

    Directory of Open Access Journals (Sweden)

    Taras Kobylnyk

    2015-10-01

    Full Text Available The trend towards the use of remote network resources on the Internet is clearly delineated: traditional training is increasingly combined with networked, remote technologies, and cloud computing has become popular. Research methods of probability theory are used in various fields. Of particular note is the use of methods of probability theory in psychological and educational research, in the statistical analysis of experimental data. Conducting such research is impossible without the use of modern information technology. Given the advantages of web-based software, the article describes the web-service Wolfram|Alpha and analyzes in detail the possibilities of using it to solve problems of probability theory. The case studies describe the results of queries for solving problems of probability theory, in particular from the sections on random events and random variables: the problem of the number of occurrences of event A in n independent trials; the study of a continuous random variable that has a normal or uniform probability distribution, including calculating the probability that the value of the random variable falls in a given interval; and the application of the binomial and hypergeometric probability distributions of a discrete random variable.
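
    For concreteness, Wolfram|Alpha accepts such problems as plain-language queries; the phrasings below are illustrative examples of the kinds of problems the article analyzes, not queries quoted from it:

        P(X = 4) for X binomial with n = 10, p = 0.3
        probability of at least 3 successes in 8 trials with p = 0.25
        P(-1 < X < 2) for X normal with mean 0, standard deviation 1
        hypergeometric distribution N = 50, K = 5, n = 10, P(X >= 2)

    Each such query typically returns the distribution, the requested probability, and plots, which is what makes the service usable for checking hand calculations in a probability course.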

  10. Relationship between Future Time Orientation and Item Nonresponse on Subjective Probability Questions: A Cross-Cultural Analysis

    Science.gov (United States)

    Lee, Sunghee; Liu, Mingnan; Hu, Mengyao

    2017-01-01

    Time orientation is an unconscious yet fundamental cognitive process that provides a framework for organizing personal experiences in temporal categories of past, present and future, reflecting the relative emphasis given to these categories. Culture is central to individuals' time orientation, leading to cultural variations in time orientation. For example, people from future-oriented cultures tend to emphasize the future and store information relevant for the future more than those from present- or past-oriented cultures. For survey questions that ask respondents to report expected probabilities of future events, this may translate into culture-specific question difficulties, manifested through systematically varying “I don’t know” item nonresponse rates. This study drew on time orientation theory and examined culture-specific nonresponse patterns on subjective probability questions using methodologically comparable population-based surveys from multiple countries. The results supported our hypothesis: item nonresponse rates on these questions varied significantly, in that future orientation at the group as well as the individual level was associated with lower nonresponse rates. This pattern did not apply to non-probability questions. Our study also suggested potential nonresponse bias. Examining culture-specific constructs, such as time orientation, as a framework for measurement mechanisms may contribute to improving cross-cultural research. PMID:28781381

  11. Neural correlates of decision making with explicit information about probabilities and incentives in elderly healthy subjects.

    Science.gov (United States)

    Labudda, Kirsten; Woermann, Friedrich G; Mertens, Markus; Pohlmann-Eden, Bernd; Markowitsch, Hans J; Brand, Matthias

    2008-06-01

    Recent functional neuroimaging and lesion studies demonstrate the involvement of the orbitofrontal/ventromedial prefrontal cortex as a key structure in decision making processes. This region seems to be particularly crucial when contingencies between options and consequences are unknown but have to be learned by the use of feedback following previous decisions (decision making under ambiguity). However, little is known about the neural correlates of decision making under risk conditions in which information about probabilities and potential outcomes is given. In the present study, we used functional magnetic resonance imaging to measure blood-oxygenation-level-dependent (BOLD) responses in 12 subjects during a decision making task. This task provided explicit information about probabilities and associated potential incentives. The responses were compared to BOLD signals in a control condition without information about incentives. In contrast to previous decision making studies, we completely removed the outcome phase following a decision to exclude the potential influence of feedback previously received on current decisions. The results indicate that the integration of information about probabilities and incentives leads to activations within the dorsolateral prefrontal cortex, the posterior parietal lobe, the anterior cingulate and the right lingual gyrus. We assume that this pattern of activation is due to the involvement of executive functions, conflict detection mechanisms and arithmetic operations during the deliberation phase of decisional processes that are based on explicit information.

  12. The Subject, Feminist Theory and Latin American Texts

    Directory of Open Access Journals (Sweden)

    Sara Castro-Klaren

    1996-01-01

    Full Text Available From a feminist perspective, this essay reviews and analyzes the interaction between metropolitan feminist theories and their interface with the academic criticism of texts written by Latin American women. Discussion focuses on the question of the subject, which the author believes to be paramount in feminist theory, inasmuch as the construction of gender and the historical subordination of women devolve on the play of difference and identity. This paper examines how the problematic assumption by feminist theorists in the North American academy of Freudian and Lacanian theories of the subject poses unresolved problems and unanticipated complications for the subsequent deployment of this subject theory as a mode of interpretation of texts written by women in Latin America, and even for the emancipatory goals of feminists in the academy. This is a case where "traveling theory" must be examined and evaluated very carefully. The second part of the paper concentrates on the feminist challenges that have already been made to both Freudian and Lacanian theories of the feminine. It highlights the work of Jane Flax, Nancy Chodorow, Gayatri Spivak and Judith Butler in suggesting a way out of theories that rely on the primacy of the male subject formation and therefore occlude and preclude the investigation of the modes of women's agency.

  13. Opera house acoustics based on subjective preference theory

    CERN Document Server

    Ando, Yoichi

    2015-01-01

    This book focuses on opera house acoustics based on subjective preference theory; it targets researchers in acoustics and vision who are working in physics, psychology, and brain physiology. This book helps readers to understand any subjective attributes in relation to objective parameters based on the powerful and workable model of the auditory system. It is reconfirmed here that the well-known Helmholtz theory, which was based on a peripheral model of the auditory system, may not well describe pitch, timbre, and duration as well as the spatial sensations described in this book, nor overall responses such as subjective preference of sound fields and the annoyance of environmental noise.

  14. Estimation and asymptotic theory for transition probabilities in Markov renewal multi-state models.

    Science.gov (United States)

    Spitoni, Cristian; Verduijn, Marion; Putter, Hein

    2012-08-07

    In this paper we discuss estimation of transition probabilities for semi-Markov multi-state models. Non-parametric and semi-parametric estimators of the transition probabilities for a large class of models (forward going models) are proposed. Large sample theory is derived using the functional delta method and the use of resampling is proposed to derive confidence bands for the transition probabilities. The last part of the paper concerns the presentation of the main ideas of the R implementation of the proposed estimators, and data from a renal replacement study are used to illustrate the behavior of the estimators proposed.

  15. Particle number and probability density functional theory and A-representability.

    Science.gov (United States)

    Pan, Xiao-Yin; Sahni, Viraht

    2010-04-28

    In Hohenberg-Kohn density functional theory, the energy $E$ is expressed as a unique functional of the ground state density $\rho(\mathbf{r})$: $E = E[\rho]$, with the internal energy component $F_{HK}[\rho]$ being universal. Knowledge of the functional $F_{HK}[\rho]$ by itself, however, is insufficient to obtain the energy: the particle number $N$ is primary. By emphasizing this primacy, the energy $E$ is written as a nonuniversal functional of $N$ and the probability density $p(\mathbf{r})$: $E = E[N,p]$. The set of functions $p(\mathbf{r})$ satisfies the constraints of normalization to unity and non-negativity, exists for each $N$, $N = 1, \ldots, \infty$, and defines the probability density or $p$-space. A particle number $N$ and probability density $p(\mathbf{r})$ functional theory is constructed. Two examples for which the exact energy functionals $E[N,p]$ are known are provided. The concept of $A$-representability is introduced, by which is meant the set of functions $\Psi(p)$ that leads to probability densities $p(\mathbf{r})$ obtained as the quantum-mechanical expectation of the probability density operator, and which satisfies the above constraints. We show that the set of functions $p(\mathbf{r})$ of $p$-space is equivalent to the $A$-representable probability density set. We also show via the Harriman and Gilbert constructions that the $A$-representable and $N$-representable probability density $p(\mathbf{r})$ sets are equivalent.
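
    In symbols, the decomposition the abstract describes reads (a restatement in the abstract's own notation, nothing added):

        \rho(\mathbf{r}) = N\,p(\mathbf{r}), \qquad
        \int p(\mathbf{r})\,d\mathbf{r} = 1, \qquad
        p(\mathbf{r}) \ge 0, \qquad
        E = E[N, p].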

  16. Optimized lower leg injury probability curves from postmortem human subject tests under axial impacts.

    Science.gov (United States)

    Yoganandan, Narayan; Arun, Mike W J; Pintar, Frank A; Szabo, Aniko

    2014-01-01

    Derive optimum injury probability curves to describe human tolerance of the lower leg using parametric survival analysis. The study reexamined lower leg postmortem human subject (PMHS) data from a large group of specimens. Briefly, axial loading experiments were conducted by impacting the plantar surface of the foot. Both injury and noninjury tests were included in the testing process. They were identified by pre- and posttest radiographic images and detailed dissection following the impact test. Fractures included injuries to the calcaneus and distal tibia-fibula complex (including pylon), representing severities at Abbreviated Injury Score (AIS) level 2+. For the statistical analysis, peak force was chosen as the main explanatory variable and age was chosen as the covariable. Censoring statuses depended on experimental outcomes. Parameters from the parametric survival analysis were estimated using the maximum likelihood approach, and the dfbetas statistic was used to identify overly influential samples. The best fit from the Weibull, log-normal, and log-logistic distributions was chosen based on the Akaike information criterion. Plus and minus 95% confidence intervals were obtained for the optimum injury probability distribution. The relative sizes of the interval were determined at predetermined risk levels. Quality indices were described at each of the selected probability levels. The mean age, stature, and weight were 58.2±15.1 years, 1.74±0.08 m, and 74.9±13.8 kg, respectively. Excluding all overly influential tests resulted in the tightest confidence intervals. The Weibull distribution was the most optimum function compared to the other 2 distributions. A majority of quality indices were in the good category for this optimum distribution when results were extracted for 25-, 45- and 65-year-olds at 5, 25, and 50% risk levels for lower leg fracture. For 25, 45, and 65 years, peak forces were 8.1, 6.5, and 5.1 kN at 5% risk; 9.6, 7.7, and 6.1 k
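
    A stripped-down sketch of the survival-analysis step (Weibull fit by maximum likelihood with right-censoring only and the age covariate omitted; the forces and outcomes below are invented toy data, and real analyses also handle left and interval censoring):

        import numpy as np
        from scipy.optimize import minimize

        # Peak force (kN) per test; event = 1 if injury observed, 0 if censored.
        force = np.array([5.2, 6.1, 6.8, 7.4, 8.0, 8.9, 9.5, 10.2, 4.8, 7.9])
        event = np.array([0,   0,   1,   0,   1,   1,   1,   1,    0,   1])

        def neg_log_lik(theta):
            """Weibull likelihood: pdf for events, survival for censored tests."""
            lam, k = np.exp(theta)          # optimize on log scale for positivity
            z = force / lam
            log_pdf = np.log(k / lam) + (k - 1) * np.log(z) - z**k
            log_surv = -(z**k)
            return -np.sum(event * log_pdf + (1 - event) * log_surv)

        fit = minimize(neg_log_lik, x0=[np.log(8.0), np.log(2.0)],
                       method="Nelder-Mead")
        lam, k = np.exp(fit.x)
        for risk in (0.05, 0.25, 0.50):
            # Weibull quantile: force at which injury probability reaches `risk`
            print(f"{risk:.0%} risk at about {lam * (-np.log(1 - risk)) ** (1 / k):.1f} kN")

    The published curves additionally model age and use the dfbetas diagnostic to drop overly influential specimens before reporting the risk-level forces.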

  17. The use of modern information technologies in teaching students of economics theory of probability

    Directory of Open Access Journals (Sweden)

    Иван Васильевич Детушев

    2013-12-01

    Full Text Available This article discusses the use of the program «MathCAD» in teaching mathematics to students of economic specialties. It is shown that the use of this software product contributes to the effective mastery of methods for solving problems in probability theory.

  18. Test Reliability and the Kuder-Richardson Formulas: Derivation from Probability Theory

    Science.gov (United States)

    Zimmerman, Donald W.

    1972-01-01

    Although a great deal of attention has been devoted over a period of years to the estimation of reliability from item statistics, there are still gaps in the mathematical derivation of the Kuder-Richardson results. The main purpose of this paper is to fill some of these gaps, using language consistent with modern probability theory. (Author)

  19. The Subject, Feminist Theory and Latin American Texts

    OpenAIRE

    Sara Castro-Klaren

    1996-01-01

    From a feminist perspective, this essay reviews and analyzes the interaction between metropolitan feminist theories and their interface with the academic criticism of texts written by Latin American women. Discussion focuses on the question of the subject, which the author believes to be paramount in feminist theory, inasmuch as the construction of gender and the historical subordination of women devolve on the play of difference and identity. This paper examines how the problematic assump...

  20. Traceable accounts of subjective probability judgments in the IPCC and beyond

    Science.gov (United States)

    Baer, P. G.

    2012-12-01

    One of the major sources of controversy surrounding the reports of the IPCC has been the characterization of uncertainty. Although arguably the IPCC has paid more attention to the process of uncertainty analysis and communication than any comparable assessment body, its efforts to achieve consistency have produced mixed results. In particular, the extensive use of subjective probability assessment has attracted widespread criticism. Statements such as "Average Northern Hemisphere temperatures during the second half of the 20th century were very likely higher than during any other 50-year period in the last 500 years" are ubiquitous (one online database lists nearly 3000 such claims), and indeed are the primary way in which its key "findings" are reported. Much attention is drawn to the precise quantitative definition of such statements (e.g., "very likely" means >90% probability, vs. "extremely likely," which means >95% probability). But there is no process by which the decision regarding the choice of such uncertainty level for a given finding is formally made or reported, and thus they are easily disputed by anyone, expert or otherwise, who disagrees with the assessment. In the "Uncertainty Guidance Paper" for the Third Assessment Report, Richard Moss and Steve Schneider defined the concept of a "traceable account," which gave exhaustive detail regarding how one ought to provide documentation of such an uncertainty assessment. But the guidance, while appearing straightforward and reasonable, was in fact an unworkable recipe, which would have taken near-infinite time if used for more than a few key results, and would have required a different structuring of the text than the conventional scientific assessment. And even then it would have left a gap when it came to the actual provenance of any such specific judgments, because there simply is no formal step at which individuals turn their knowledge of the evidence on some finding into a probability judgment. The

  1. Further Evidence That the Effects of Repetition on Subjective Time Depend on Repetition Probability.

    Science.gov (United States)

    Skylark, William J; Gheorghiu, Ana I

    2017-01-01

    Repeated stimuli typically have shorter apparent duration than novel stimuli. Most explanations for this effect have attributed it to the repeated stimuli being more expected or predictable than the novel items, but an emerging body of work suggests that repetition and expectation exert distinct effects on time perception. The present experiment replicated a recent study in which the probability of repetition was varied between blocks of trials. As in the previous work, the repetition effect was smaller when repeats were common (and therefore more expected) than when they were rare. These results add to growing evidence that, contrary to traditional accounts, expectation increases apparent duration whereas repetition compresses subjective time, perhaps via a low-level process like adaptation. These opposing processes can be seen as instances of a more general "processing principle," according to which subjective time is a function of the perceptual strength of the stimulus representation, and therefore depends on a confluence of "bottom-up" and "top-down" variables.

  2. Unification of field theory and maximum entropy methods for learning probability densities.

    Science.gov (United States)

    Kinney, Justin B

    2015-09-01

    The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy density estimate can be recovered in the infinite smoothness limit of an appropriate Bayesian field theory. I also show that Bayesian field theory estimation can be performed without imposing any boundary conditions on candidate densities, and that the infinite smoothness limit of these theories recovers the most common types of maximum entropy estimates. Bayesian field theory thus provides a natural test of the maximum entropy null hypothesis and, furthermore, returns an alternative (lower entropy) density estimate when the maximum entropy hypothesis is falsified. The computations necessary for this approach can be performed rapidly for one-dimensional data, and software for doing this is provided.
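
    A compact sketch of the maximum entropy side of this correspondence (moment-constrained maxent on a grid via its convex dual; the data, grid, and choice of first- and second-moment constraints are illustrative, and the Bayesian field theory generalization is not shown):

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(0)
        data = rng.normal(1.0, 0.5, size=500)
        x = np.linspace(-2.0, 4.0, 400)
        dx = x[1] - x[0]
        target = np.array([data.mean(), (data**2).mean()])  # constrained moments

        def dual(lams):
            """Convex dual: log partition function minus the constraint terms."""
            logits = lams[0] * x + lams[1] * x**2
            m = logits.max()
            logz = m + np.log(np.sum(np.exp(logits - m)) * dx)
            return logz - lams @ target

        res = minimize(dual, x0=np.array([0.0, -0.1]), method="BFGS")
        logits = res.x[0] * x + res.x[1] * x**2
        p = np.exp(logits - logits.max())
        p /= p.sum() * dx                                   # normalized density
        print("fitted moments:", (p * x).sum() * dx, (p * x**2).sum() * dx)

    Minimizing the dual matches the constrained moments exactly; Kinney's result is that this estimate arises as the infinite-smoothness limit of a corresponding Bayesian field theory posterior.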

  3. Investigation of probability theory on Ising models with different four-spin interactions

    Science.gov (United States)

    Yang, Yuming; Teng, Baohua; Yang, Hongchun; Cui, Haijuan

    2017-10-01

    Based on probability theory, two types of three-dimensional Ising models with different four-spin interactions are studied. First, the partition function of the system is calculated by considering the local correlation of spins in a given configuration, and then the properties of the phase transition are quantitatively discussed with the series expansion technique and a numerical method. The rounding errors in this calculation are also analyzed, so that the possible source of error in calculations based on mean field theory is pointed out.
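
    For orientation, a Metropolis Monte Carlo sketch of this kind of model (a numerical stand-in, not the paper's probability-theory and series-expansion calculation; lattice size, couplings, and temperature are arbitrary):

        import numpy as np

        rng = np.random.default_rng(1)
        L, J, J4, T = 6, 1.0, 0.2, 4.0   # lattice size, pair/four-spin couplings, temperature

        def energy(s):
            """3D Ising energy with nearest-neighbour and square-plaquette four-spin terms."""
            e = 0.0
            for ax in range(3):                       # pair term along each axis
                e -= J * np.sum(s * np.roll(s, 1, axis=ax))
            for a, b in ((0, 1), (0, 2), (1, 2)):     # four spins around each square
                e -= J4 * np.sum(s * np.roll(s, 1, a) * np.roll(s, 1, b)
                                 * np.roll(np.roll(s, 1, a), 1, b))
            return e

        s = rng.choice([-1, 1], size=(L, L, L))
        e = energy(s)
        for _ in range(20_000):
            i, j, k = rng.integers(0, L, size=3)
            s[i, j, k] *= -1                          # trial flip (energy recomputed naively)
            e_new = energy(s)
            if e_new > e and rng.random() >= np.exp(-(e_new - e) / T):
                s[i, j, k] *= -1                      # reject the flip
            else:
                e = e_new
        print("magnetization per spin:", s.mean())

    Scanning the temperature and the four-spin coupling in such a sampler gives an independent numerical check on the phase-transition properties extracted analytically in the paper.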

  4. The theory, direction, and magnitude of ecosystem fire probability as constrained by precipitation and temperature.

    Science.gov (United States)

    Guyette, Richard; Stambaugh, Michael C; Dey, Daniel; Muzika, Rose Marie

    2017-01-01

    The effects of climate on wildland fire confront society across a range of different ecosystems. Water and temperature affect combustion dynamics, irrespective of whether those are associated with carbon fueled motors or ecosystems, but through different chemical, physical, and biological processes. We use an ecosystem combustion equation developed with the physical chemistry of atmospheric variables to estimate and simulate fire probability and mean fire interval (MFI). The calibration of ecosystem fire probability with basic combustion chemistry and physics offers a quantitative method to address wildland fire in addition to the well-studied forcing factors such as topography, ignition, and vegetation. We develop a graphic analysis tool for estimating climate-forced fire probability with temperature and precipitation based on an empirical assessment of combustion theory and fire prediction in ecosystems. Climate-affected fire probability for any period, past or future, is estimated with given temperature and precipitation. A graphic analysis of wildland fire dynamics driven by climate supports a dialectic in hydrologic processes that affect ecosystem combustion: 1) the water needed by plants to produce carbon bonds (fuel) and 2) the inhibition of successful reactant collisions by water molecules (humidity and fuel moisture). These two postulates enable a classification scheme for ecosystems into three or more climate categories using their position relative to change points defined by precipitation in combustion dynamics equations. Three classifications of combustion dynamics in ecosystem fire probability include: 1) precipitation insensitive, 2) precipitation unstable, and 3) precipitation sensitive. All three classifications interact in different ways with variable levels of temperature.

  5. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  6. Reality, Causality, and Probability, from Quantum Mechanics to Quantum Field Theory

    Science.gov (United States)

    Plotnitsky, Arkady

    2015-10-01

    These three lectures consider the questions of reality, causality, and probability in quantum theory, from quantum mechanics to quantum field theory. They do so in part by exploring the ideas of the key founding figures of the theory, such as N. Bohr, W. Heisenberg, E. Schrödinger, or P. A. M. Dirac. However, while my discussion of these figures aims to be faithful to their thinking and writings, and while these lectures are motivated by my belief in the helpfulness of their thinking for understanding and advancing quantum theory, this project is not driven by loyalty to their ideas. In part for that reason, these lectures also present different and even conflicting ways of thinking in quantum theory, such as that of Bohr or Heisenberg vs. that of Schrödinger. The lectures, most especially the third one, also consider new physical, mathematical, and philosophical complexities brought in by quantum field theory vis-à-vis quantum mechanics. I close by briefly addressing some of the implications of the argument presented here for the current state of fundamental physics.

  7. The attention schema theory: a mechanistic account of subjective awareness

    Directory of Open Access Journals (Sweden)

    Taylor W. Webb

    2015-04-01

    Full Text Available We recently proposed the attention schema theory, a novel way to explain the brain basis of subjective awareness in a mechanistic and scientifically testable manner. The theory begins with attention, the process by which signals compete for the brain’s limited computing resources. This internal signal competition is partly under a bottom-up influence and partly under top-down control. We propose that the top-down control of attention is improved when the brain has access to a simplified model of attention itself. The brain therefore constructs a schematic model of the process of attention, the ‘attention schema’, in much the same way that it constructs a schematic model of the body, the ‘body schema’. The content of this internal model leads a brain to conclude that it has a subjective experience. One advantage of this theory is that it explains how awareness and attention can sometimes become dissociated; the brain’s internal models are never perfect, and sometimes a model becomes dissociated from the object being modeled. A second advantage of this theory is that it explains how we can be aware of both internal and external events. The brain can apply attention to many types of information including external sensory information and internal information about emotions and cognitive states. If awareness is a model of attention, then this model should pertain to the same domains of information to which attention pertains. A third advantage of this theory is that it provides testable predictions. If awareness is the internal model of attention, used to help control attention, then without awareness, attention should still be possible but should suffer deficits in control. In this article, we review the existing literature on the relationship between attention and awareness, and suggest that at least some of the predictions of the theory are borne out by the evidence.

  8. Communicating through Probabilities: Does Quantum Theory Optimize the Transfer of Information?

    Directory of Open Access Journals (Sweden)

    William K. Wootters

    2013-08-01

    Full Text Available A quantum measurement can be regarded as a communication channel, in which the parameters of the state are expressed only in the probabilities of the outcomes of the measurement. We begin this paper by considering, in a non-quantum-mechanical setting, the problem of communicating through probabilities. For example, a sender, Alice, wants to convey to a receiver, Bob, the value of a continuous variable, θ, but her only means of conveying this value is by sending Bob a coin in which the value of θ is encoded in the probability of heads. We ask what the optimal encoding is when Bob will be allowed to flip the coin only a finite number of times. As the number of tosses goes to infinity, we find that the optimal encoding is the same as what nature would do if we lived in a world governed by real-vector-space quantum theory. We then ask whether the problem might be modified, so that the optimal communication strategy would be consistent with standard, complex-vector-space quantum theory.

  9. Conditional Probabilities in the Excursion Set Theory. Generic Barriers and non-Gaussian Initial Conditions

    CERN Document Server

    De Simone, Andrea; Riotto, Antonio

    2011-01-01

    The excursion set theory, where density perturbations evolve stochastically with the smoothing scale, provides a method for computing the dark matter halo mass function. The computation of the mass function is mapped into the so-called first-passage time problem in the presence of a moving barrier. The excursion set theory is also a powerful formalism to study other properties of dark matter halos such as halo bias, accretion rate, formation time, merging rate and the formation history of halos. This is achieved by computing conditional probabilities with non-trivial initial conditions, and the conditional two-barrier first-crossing rate. In this paper we use the recently-developed path integral formulation of the excursion set theory to calculate analytically these conditional probabilities in the presence of a generic moving barrier, including the one describing the ellipsoidal collapse, and for both Gaussian and non-Gaussian initial conditions. The non-Markovianity of the random walks induced by non-Gaussi...

  10. Dependence in probabilistic modeling, Dempster-Shafer theory, and probability bounds analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Tucker, W. Troy (Applied Biomathematics, Setauket, NY); Zhang, Jianzhong (Iowa State University, Ames, IA); Ginzburg, Lev (Applied Biomathematics, Setauket, NY); Berleant, Daniel J. (Iowa State University, Ames, IA); Ferson, Scott (Applied Biomathematics, Setauket, NY); Hajagos, Janos (Applied Biomathematics, Setauket, NY); Nelsen, Roger B. (Lewis & Clark College, Portland, OR)

    2004-10-01

    This report summarizes methods to incorporate information (or lack of information) about inter-variable dependence into risk assessments that use Dempster-Shafer theory or probability bounds analysis to address epistemic and aleatory uncertainty. The report reviews techniques for simulating correlated variates for a given correlation measure and dependence model, computation of bounds on distribution functions under a specified dependence model, formulation of parametric and empirical dependence models, and bounding approaches that can be used when information about the intervariable dependence is incomplete. The report also reviews several of the most pervasive and dangerous myths among risk analysts about dependence in probabilistic models.
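
    One of the reviewed techniques, simulating correlated variates under a chosen dependence model, can be sketched with a Gaussian copula (the margins, correlation value, and sample size here are illustrative choices, not taken from the report):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        rho = 0.8                                    # copula correlation parameter
        cov = np.array([[1.0, rho], [rho, 1.0]])
        z = rng.multivariate_normal([0.0, 0.0], cov, size=10_000)
        u = stats.norm.cdf(z)                        # dependent uniform scores
        x = stats.lognorm(s=0.6).ppf(u[:, 0])        # margin 1: lognormal
        y = stats.gamma(a=2.0).ppf(u[:, 1])          # margin 2: gamma
        print(f"Spearman rho: {stats.spearmanr(x, y).correlation:.3f}")

    Swapping the copula or the correlation measure changes the dependence model while leaving the margins intact, which is exactly the degree of freedom whose neglect the report warns against.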

  11. Statistical analysis of natural frequencies of hemispherical resonator gyroscope based on probability theory

    Science.gov (United States)

    Yu, Xudong; Long, Xingwu; Wei, Guo; Li, Geng; Qu, Tianliang

    2015-04-01

    A finite element model of the hemispherical resonator gyro (HRG) is established and the natural frequencies and vibration modes are investigated. The matrix perturbation technology in the random finite element method is first introduced to analyze the statistical characteristics of the natural frequencies of HRG. The influences of random material parameters and dimensional parameters on the natural frequencies are quantitatively described based on the probability theory. The statistics expressions of the random parameters are given and the influences of three key parameters on natural frequency are pointed out. These results are important for design and improvement of high accuracy HRG.

  12. How Much Will the Sea Level Rise? Outcome Selection and Subjective Probability in Climate Change Predictions.

    Science.gov (United States)

    Juanchich, Marie; Sirota, Miroslav

    2017-08-17

    We tested whether people focus on extreme outcomes to predict climate change and assessed the gap between the frequency of the predicted outcome and its perceived probability while controlling for climate change beliefs. We also tested 2 cost-effective interventions to reduce the preference for extreme outcomes and the frequency-probability gap by manipulating the probabilistic format: numerical or dual-verbal-numerical. In 4 experiments, participants read a scenario featuring a distribution of sea level rises, selected a sea rise to complete a prediction (e.g., "It is 'unlikely' that the sea level will rise . . . inches") and judged the likelihood of this sea rise occurring. Results showed that people have a preference for predicting extreme climate change outcomes in verbal predictions (59% in Experiments 1-4) and that this preference was not predicted by climate change beliefs. Results also showed an important gap between the predicted outcome frequency and participants' perception of the probability that it would occur. The dual-format reduced the preference for extreme outcomes for low and medium probability predictions but not for high ones, and none of the formats consistently reduced the frequency-probability gap. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  13. The theory, direction, and magnitude of ecosystem fire probability as constrained by precipitation and temperature

    Science.gov (United States)

    Guyette, Richard; Stambaugh, Michael C.; Dey, Daniel

    2017-01-01

    The effects of climate on wildland fire confront society across a range of different ecosystems. Water and temperature affect combustion dynamics, irrespective of whether those are associated with carbon fueled motors or ecosystems, but through different chemical, physical, and biological processes. We use an ecosystem combustion equation developed with the physical chemistry of atmospheric variables to estimate and simulate fire probability and mean fire interval (MFI). The calibration of ecosystem fire probability with basic combustion chemistry and physics offers a quantitative method to address wildland fire in addition to the well-studied forcing factors such as topography, ignition, and vegetation. We develop a graphic analysis tool for estimating climate-forced fire probability with temperature and precipitation based on an empirical assessment of combustion theory and fire prediction in ecosystems. Climate-affected fire probability for any period, past or future, is estimated with given temperature and precipitation. A graphic analysis of wildland fire dynamics driven by climate supports a dialectic in hydrologic processes that affect ecosystem combustion: 1) the water needed by plants to produce carbon bonds (fuel) and 2) the inhibition of successful reactant collisions by water molecules (humidity and fuel moisture). These two postulates enable a classification scheme for ecosystems into three or more climate categories using their position relative to change points defined by precipitation in combustion dynamics equations. Three classifications of combustion dynamics in ecosystem fire probability include: 1) precipitation insensitive, 2) precipitation unstable, and 3) precipitation sensitive. All three classifications interact in different ways with variable levels of temperature. PMID:28704457

  14. The Effect of Computer-Assisted Teaching on Remedying Misconceptions: The Case of the Subject "Probability"

    Science.gov (United States)

    Gurbuz, Ramazan; Birgin, Osman

    2012-01-01

    The aim of this study is to determine the effects of computer-assisted teaching (CAT) on remedying misconceptions students often have regarding some probability concepts in mathematics. Toward this aim, computer-assisted teaching materials were developed and used in the process of teaching. Within the true-experimental research method, a pre- and…

  15. Explaining participation differentials in Dutch higher education : the impact of subjective success probabilities on level choice and field choice

    NARCIS (Netherlands)

    Tolsma, J.; Need, A.; Jong, U. de

    2010-01-01

    In this article we examine whether subjective estimates of success probabilities explain the effect of social origin, sex, and ethnicity on students’ choices between different school tracks in Dutch higher education. The educational options analysed differ in level (i.e. university versus

  16. Using optimal transport theory to estimate transition probabilities in metapopulation dynamics

    Science.gov (United States)

    Nichols, Jonathan M.; Spendelow, Jeffrey A.; Nichols, James D.

    2017-01-01

    This work considers the estimation of transition probabilities associated with populations moving among multiple spatial locations based on numbers of individuals at each location at two points in time. The problem is generally underdetermined as there exists an extremely large number of ways in which individuals can move from one set of locations to another. A unique solution therefore requires a constraint. The theory of optimal transport provides such a constraint in the form of a cost function, to be minimized in expectation over the space of possible transition matrices. We demonstrate the optimal transport approach on marked bird data and compare to the probabilities obtained via maximum likelihood estimation based on marked individuals. It is shown that by choosing the squared Euclidean distance as the cost, the estimated transition probabilities compare favorably to those obtained via maximum likelihood with marked individuals. Other implications of this cost are discussed, including the ability to accurately interpolate the population's spatial distribution at unobserved points in time and the more general relationship between the cost and minimum transport energy.
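
    A minimal version of this estimator can be written as a linear program (the three sites, coordinates, and counts are invented, and the population is assumed conserved between the two time points):

        import numpy as np
        from scipy.optimize import linprog

        n_from = np.array([100.0, 50.0, 30.0])            # counts at time 1
        n_to = np.array([70.0, 60.0, 50.0])               # counts at time 2
        sites = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])

        # Squared Euclidean cost, the paper's favoured choice
        C = ((sites[:, None, :] - sites[None, :, :]) ** 2).sum(-1)

        k = len(n_from)
        A_eq = np.zeros((2 * k, k * k))                   # marginal constraints
        for i in range(k):
            A_eq[i, i * k:(i + 1) * k] = 1                # row i sums to n_from[i]
            A_eq[k + i, i::k] = 1                         # column i sums to n_to[i]
        b_eq = np.concatenate([n_from, n_to])

        res = linprog(C.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
        T = res.x.reshape(k, k)                           # optimal transport plan
        P = T / n_from[:, None]                           # transition probabilities
        print(np.round(P, 3))

    The minimum-cost plan T is the optimal-transport constraint in action: among all movements consistent with the two censuses, it selects the one requiring the least total squared displacement.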

  17. Beta-decay rate and beta-delayed neutron emission probability of improved gross theory

    Science.gov (United States)

    Koura, Hiroyuki

    2014-09-01

    A theoretical study has been carried out on the beta-decay rate and the beta-delayed neutron emission probability. The gross theory of beta decay is based on the idea of a sum rule for the beta-decay strength function, and has succeeded in describing beta-decay half-lives of nuclei over the whole nuclear mass region. The gross theory includes not only the allowed transitions, as the Fermi and the Gamow-Teller, but also the first-forbidden transitions. In this work, some improvements are introduced, namely a nuclear shell correction on nuclear level densities and nuclear deformation in the nuclear strength functions; these effects were not included in the original gross theory. The shell energy and the nuclear deformation for unmeasured nuclei are adopted from the KTUY nuclear mass formula, which is based on the spherical-basis method. Considering the properties of the integrated Fermi function, we can roughly categorize the energy regions of excited states of a daughter nucleus into three: a highly excited region, which fully affects the delayed neutron probability; a middle region, which is estimated to contribute to the decay heat; and a region neighboring the ground state, which determines the beta-decay rate. Some results will be given in the presentation.

  18. How Prevalent Is Wishful Thinking? Misattribution of Arousal Causes Optimism and Pessimism in Subjective Probabilities

    Science.gov (United States)

    Vosgerau, Joachim

    2010-01-01

    People appear to be unrealistically optimistic about their future prospects, as reflected by theory and research in the fields of psychology, organizational behavior, behavioral economics, and behavioral finance. Many real-world examples (e.g., consumer behavior during economic recessions), however, suggest that people are not always overly…

  19. Spencer-Brown vs. Probability and Statistics: Entropy’s Testimony on Subjective and Objective Randomness

    Directory of Open Access Journals (Sweden)

    Julio Michael Stern

    2011-04-01

    Full Text Available This article analyzes the role of entropy in Bayesian statistics, focusing on its use as a tool for detection, recognition and validation of eigen-solutions. “Objects as eigen-solutions” is a key metaphor of the cognitive constructivism epistemological framework developed by the philosopher Heinz von Foerster. Special attention is given to some objections to the concepts of probability, statistics and randomization posed by George Spencer-Brown, a figure of great influence in the field of radical constructivism.

  20. Implementation of Subjective Probability Estimates in Army Intelligence Procedures: A Critical Review of Research Findings

    Science.gov (United States)

    1980-03-01

    subjective probability estimates have been incorporated routinely into tactical intelligence communications. Research in the area of intelligence...analysis: Report on Phase I. Report FSC-71-5047. Gaithersburg, Md.: International Business Machines (IBM), Federal Systems Division, 1971. Kelly, C. W

  1. The application of Bayes probability theory for uncertainty assessments of Antarctic ice sheet predictions

    Science.gov (United States)

    Wernecke, Andreas; Edwards, Tamsin; Edwards, Neil; Holden, Philip

    2017-04-01

    Ice sheet models (ISMs) require a variety of inputs which are known with different levels of certainty. Our current knowledge of ISM sensitivities is mainly based on single- or multi-parameter perturbation studies which cover only a small subset of all model inputs, due to the high dimensionality of ISMs and computational constraints. Here we present a framework to extend this approach to a systematic statistical investigation of all major sensitivities, based on the well-known Bayes probability theory. We demonstrate that a principal component decomposition can be used to drastically reduce the dimensionality of field-type components while retaining their structure. However, a systematic perturbation of all inputs is still not computationally feasible with grounding line resolving ISMs. Therefore we propose a Gaussian Process (GP) model trained on a set of ISM runs to emulate the model's behaviour, and with it the sensitivities to input parameters. The beauty of a GP model is, amongst other things, that it provides probability distributions instead of only "best" estimates, which enables an iterative emulation: an initial set of ISM runs is used to train a GP model as emulator. This emulator is used to identify new ISM setups which are of high interest for improving the emulation (i.e., have wide probability distributions). Performing those setups leads to an updated emulator, and so forth. This framework is not only a cost-effective tool for ice sheet model analytics but also for predictive purposes. Applications may include model calibrations, updates of revised input datasets and setup adjustments for model intercomparisons with virtually no additional computational cost.
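
    The iterative emulation loop described here is easy to prototype. A toy sketch under strong simplifications (a one-dimensional stand-in function instead of an ice sheet model; scikit-learn's GP; all names and numbers invented):

```python
# Toy sketch of iterative GP emulation: fit a Gaussian Process to a few
# "expensive" runs, then repeatedly evaluate the setup where the emulator is
# least certain. The stand-in function below replaces a real ice sheet model.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
expensive_model = lambda x: np.sin(3 * x) + 0.5 * x  # stand-in for a model run

candidates = np.linspace(0, 2, 200)[:, None]         # possible model setups
X = rng.uniform(0, 2, size=(4, 1))                   # initial design
y = expensive_model(X).ravel()

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-6)
for _ in range(10):
    gp.fit(X, y)
    _, std = gp.predict(candidates, return_std=True)
    x_new = candidates[np.argmax(std)]               # widest predictive spread
    X = np.vstack([X, x_new])
    y = np.append(y, expensive_model(x_new))

print("final design points:", np.sort(X.ravel()).round(2))
```

    Each round refits the emulator and runs the setup with the widest probability distribution, which is the acquisition rule the abstract describes.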

  2. Overestimating HIV infection: The construction and accuracy of subjective probabilities of HIV infection in rural Malawi

    OpenAIRE

    Anglewicz, Philip; Kohler, Hans-Peter

    2009-01-01

    In the absence of HIV testing, how do rural Malawians assess their HIV status? In this paper, we use a unique dataset that includes respondents' HIV status as well as their subjective likelihood of HIV infection. These data show that many rural Malawians overestimate their likelihood of current HIV infection. The discrepancy between actual and perceived status raises an important question: Why are so many wrong? We begin by identifying determinants of self-assessed HIV status, and then compar...

  3. A Complete Theory of Everything (will be subjective)

    CERN Document Server

    Hutter, Marcus

    2009-01-01

    The progression of theories suggested for our world, from ego- to geo- to helio-centric models to universe and multiverse theories and beyond, shows one tendency: The size of the described worlds increases, with humans being expelled from their center to ever more remote and random locations. If pushed too far, a potential theory of everything (ToE) is actually more a theory of nothing (ToN). Indeed such theories have already been developed. I show that including observer localization into such theories is necessary and sufficient to avoid this problem. Ockham's razor is used to develop a quantitative recipe to identify ToEs and distinguish them from ToNs and theories in-between. This precisely shows what the problem is with some recently suggested universal ToEs. The suggested principle is extended to more practical (partial, approximate, probabilistic, parametric) world models (rather than ToEs). Finally, I provide a justification of Ockham's razor.

  4. Average bit error probability of binary coherent signaling over generalized fading channels subject to additive generalized gaussian noise

    KAUST Repository

    Soury, Hamza

    2012-06-01

    This letter considers the average bit error probability of binary coherent signaling over flat fading channels subject to additive generalized Gaussian noise. More specifically, a generic closed-form expression in terms of the Fox H function is offered for the extended generalized-K fading case. Simplifications for some special fading distributions such as generalized-K fading and Nakagami-m fading and special additive noise distributions such as Gaussian and Laplacian noise are then presented. Finally, the mathematical formalism is illustrated by some numerical examples verified by computer-based simulations for a variety of fading and additive noise parameters. © 2012 IEEE.
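
    The closed forms above involve the Fox H function, which common numerical libraries do not provide; the special cases, however, are easy to check numerically. A hedged sketch for one such case (BPSK over Nakagami-m fading with ordinary Gaussian noise; parameters are arbitrary), evaluating the average bit error probability E[Q(sqrt(2*gamma))] by quadrature and Monte Carlo:

```python
# Sanity-check sketch of a special case: average BER of BPSK over Nakagami-m
# fading with Gaussian noise, where the SNR gamma is Gamma(m, gamma_bar/m)
# distributed and the conditional BER is Q(sqrt(2*gamma)).
import numpy as np
from scipy import integrate, special, stats

def q_func(x):
    # Gaussian Q-function via the complementary error function.
    return 0.5 * special.erfc(x / np.sqrt(2.0))

m, gamma_bar = 2.0, 10.0  # Nakagami parameter and average SNR (linear scale)
pdf = stats.gamma(a=m, scale=gamma_bar / m).pdf

avg_ber, _ = integrate.quad(lambda g: q_func(np.sqrt(2 * g)) * pdf(g), 0, np.inf)

# Monte Carlo cross-check of the same expectation.
rng = np.random.default_rng(1)
g = rng.gamma(shape=m, scale=gamma_bar / m, size=1_000_000)
print(f"quadrature: {avg_ber:.5f}  Monte Carlo: {q_func(np.sqrt(2 * g)).mean():.5f}")
```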

  5. Exact Symbol Error Probability of Square M-QAM Signaling over Generalized Fading Channels subject to Additive Generalized Gaussian Noise

    KAUST Repository

    Soury, Hamza

    2013-07-01

    This paper considers the average symbol error probability of square Quadrature Amplitude Modulation (QAM) coherent signaling over flat fading channels subject to additive generalized Gaussian noise. More specifically, a generic closed-form expression in terms of the Fox H function and the bivariate Fox H function is offered for the extended generalized-K fading case. Simplifications for some special fading distributions such as generalized-K fading, Nakagami-m fading, and Rayleigh fading and special additive noise distributions such as Gaussian and Laplacian noise are then presented. Finally, the mathematical formalism is illustrated by some numerical examples verified by computer-based simulations for a variety of fading and additive noise parameters.

  6. Some considerations on the definition of risk based on concepts of systems theory and probability.

    Science.gov (United States)

    Andretta, Massimo

    2014-07-01

    The concept of risk has been applied in many modern science and technology fields. Despite its successes in many applicative fields, there is still no well-established vision and universally accepted definition of the principles and fundamental concepts of the risk assessment discipline. As emphasized recently, the risk fields suffer from a lack of clarity on their scientific bases that can define, in a unique theoretical framework, the general concepts in the different areas of application. The aim of this article is to make suggestions for another perspective of risk definition that could be applied and, in a certain sense, generalize some of the previously known definitions (at least in the fields of technical and scientific applications). By drawing on my experience of risk assessment in different applicative situations (particularly in the risk estimation for major industrial accidents, and in the health and ecological risk assessment for contaminated sites), I would like to revise some general and foundational concepts of risk analysis in as consistent a manner as possible from the axiomatic/deductive point of view. My proposal is based on the fundamental concepts of systems theory and of probability. In this way, I try to frame, in a single, broad, and general theoretical context some fundamental concepts and principles applicable in many different fields of risk assessment. I hope that this article will contribute to the revitalization and stimulation of useful discussions and new insights into the key issues and theoretical foundations of risk assessment disciplines. © 2013 Society for Risk Analysis.

  7. Introduction to the life estimation of materials - Application of the theory of probability statistics of extreme values to corrosion

    Energy Technology Data Exchange (ETDEWEB)

    Kowaka, M.

    1984-01-01

    The book contains a history of the application of statistics of extreme values to corrosion, fundamentals of statistics, the probabilistic treatment of corrosion phenomena, and exercises for understanding the theory. The corrosion phenomena are described, and quantitative analysis of localized corrosion and life estimation of materials are made possible by using the method.

  8. Lévy laws in free probability

    OpenAIRE

    Barndorff-Nielsen, Ole E.; Thorbjørnsen, Steen

    2002-01-01

    This article and its sequel outline recent developments in the theory of infinite divisibility and Lévy processes in free probability, a subject area belonging to noncommutative (or quantum) probability. The present paper discusses the classes of infinitely divisible probability measures in classical and free probability, respectively, via a study of the Bercovici–Pata bijection between these classes.

  9. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  10. Probability theory and statistical applications a profound treatise for self-study

    CERN Document Server

    Zörnig, Peter

    2016-01-01

    This accessible and easy-to-read book provides many examples to illustrate diverse topics in probability and statistics, from initial concepts up to advanced calculations. Special attention is devoted, e.g., to independence of events, inequalities in probability and functions of random variables. The book is directed to students of mathematics, statistics, engineering, and other quantitative sciences.

  11. Exact closed form expressions for outage probability of GSC receivers over Rayleigh fading channel subject to self-interference

    KAUST Repository

    Nam, Sungsik

    2010-11-01

    Previous work on performance analyses of generalized selection combining (GSC) RAKE receivers based on the signal to noise ratio focused on the development of methodologies to derive exact closed-form expressions for various performance measures. However, some open problems related to the performance evaluation of GSC RAKE receivers still remain to be solved, such as the assessment of the impact of self-interference on the performance of GSC RAKE receivers. To have a full and exact understanding of the performance of GSC RAKE receivers, the outage probability of GSC RAKE receivers needs to be expressed in closed form. The major difficulty in this problem is to derive some joint statistics of ordered exponential variates. With this motivation in mind, we capitalize in this paper on some new order statistics results to derive exact closed-form expressions for the outage probability of GSC RAKE receivers subject to self-interference over independent and identically distributed Rayleigh fading channels. © 2010 IEEE.

  12. Probability estimation with machine learning methods for dichotomous and multicategory outcome: theory.

    Science.gov (United States)

    Kruppa, Jochen; Liu, Yufeng; Biau, Gérard; Kohler, Michael; König, Inke R; Malley, James D; Ziegler, Andreas

    2014-07-01

    Probability estimation for binary and multicategory outcomes using logistic and multinomial logistic regression has a long-standing tradition in biostatistics. However, biases may occur if the model is misspecified. In contrast, outcome probabilities for individuals can be estimated consistently with machine learning approaches, including k-nearest neighbors (k-NN), bagged nearest neighbors (b-NN), random forests (RF), and support vector machines (SVM). Because machine learning methods are rarely used by applied biostatisticians, the primary goal of this paper is to explain the concept of probability estimation with these methods and to summarize recent theoretical findings. Probability estimation in k-NN, b-NN, and RF can be embedded into the class of nonparametric regression learning machines; therefore, we start with the construction of nonparametric regression estimates and review results on consistency and rates of convergence. In SVMs, outcome probabilities for individuals are estimated consistently by repeatedly solving classification problems. For SVMs, we first review the classification problem and then dichotomous probability estimation. Next we extend the algorithms for estimating probabilities using k-NN, b-NN, and RF to multicategory outcomes and discuss approaches for the multicategory probability estimation problem using SVM. In simulation studies for dichotomous and multicategory dependent variables we demonstrate the general validity of the machine learning methods and compare them with logistic regression. However, each method fails in at least one simulation scenario. We conclude with a discussion of the failures and give recommendations for selecting and tuning the methods. Applications to real data and example code are provided in a companion article (doi:10.1002/bimj.201300077). © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
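
    A minimal sketch of the comparison the paper describes, using scikit-learn on simulated data (illustrative only; the companion article cited above provides the authors' own example code):

```python
# Probability estimates for a dichotomous outcome from logistic regression,
# k-NN and a random forest, scored by the Brier score on held-out data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import brier_score_loss
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "logistic": LogisticRegression(max_iter=1000),
    "k-NN": KNeighborsClassifier(n_neighbors=25),
    "random forest": RandomForestClassifier(n_estimators=300, random_state=0),
}
for name, model in models.items():
    p = model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]  # estimated P(y=1|x)
    print(f"{name:14s} Brier score: {brier_score_loss(y_te, p):.4f}")
```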

  13. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber-Shiu functions and dependence.
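
    For the classical compound Poisson model named here, a small simulation makes the objects concrete. A sketch with illustrative parameters and exponential claims, for which the infinite-horizon ruin probability has the known closed form psi(u) = (lam*mu/c) * exp(-(1/mu - lam/c) * u):

```python
# Monte Carlo estimate of a finite-horizon ruin probability in the
# Cramer-Lundberg model, compared with the exact infinite-horizon formula
# for exponential claims. All parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(42)

def ruined(u, c, lam, mu, horizon, rng):
    """Simulate one surplus path; True if ruin occurs before `horizon`."""
    t, surplus = 0.0, u
    while True:
        w = rng.exponential(1 / lam)    # inter-arrival time of the next claim
        t += w
        if t > horizon:
            return False
        surplus += c * w                # premiums earned until the claim
        surplus -= rng.exponential(mu)  # exponential claim size with mean mu
        if surplus < 0:
            return True

u, c, lam, mu = 10.0, 1.2, 1.0, 1.0     # initial reserve; premium rate c > lam*mu
n = 4000
mc = sum(ruined(u, c, lam, mu, horizon=200.0, rng=rng) for _ in range(n)) / n
exact = (lam * mu / c) * np.exp(-(1 / mu - lam / c) * u)
print(f"Monte Carlo (finite horizon): {mc:.3f}  exact (infinite horizon): {exact:.3f}")
```

    Lundberg's inequality, one of the listed topics, states psi(u) <= exp(-R*u) with adjustment coefficient R; for exponential claims R = 1/mu - lam/c, consistent with the formula above.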

  14. Item response theory at subject- and group-level

    NARCIS (Netherlands)

    Tobi, Hilde

    1990-01-01

    This paper reviews the literature about item response models for the subject level and aggregated level (group level). Group-level item response models (IRMs) are used in the United States in large-scale assessment programs such as the National Assessment of Educational Progress and the California

  15. Comparing the difficulty of examination subjects with item response theory

    NARCIS (Netherlands)

    Korobko, O.B.; Glas, Cornelis A.W.; Bosker, Roel; Luyten, Johannes W.

    2008-01-01

    Methods are presented for comparing grades obtained in a situation where students can choose between different subjects. It must be expected that the comparison between the grades is complicated by the interaction between the students' pattern and level of proficiency on one hand, and the choice of

  16. Quantum-correlation breaking channels, quantum conditional probability and Perron–Frobenius theory

    Energy Technology Data Exchange (ETDEWEB)

    Chruściński, Dariusz, E-mail: darch@fizyka.umk.pl [Institute of Physics, Faculty of Physics, Astronomy and Informatics, Nicolaus Copernicus University, Grudziadzka 5, 87-100 Toruń (Poland)

    2013-03-01

    Using the quantum analog of conditional probability and classical Bayes theorem we discuss some aspects of particular entanglement breaking channels: quantum–classical and classical–classical channels. Applying the quantum analog of Perron–Frobenius theorem we generalize the recent result of Korbicz et al. (2012) [8] on full and spectrum broadcasting from quantum–classical channels to arbitrary quantum channels.

  17. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  18. A Philatelic Excursion with Jeff Hunter in Probability and Matrix Theory

    Directory of Open Access Journals (Sweden)

    George P. H. Styan

    2007-01-01

    Full Text Available We present an excursion with Jeff Hunter, visiting some of his research topics. Specifically, we will present some facts about certain people whose work seems to have influenced Jeff in his scientific career; we illustrate our presentation with postage stamps that have been issued in honour of these people. Our main guide is Hunter’s two-volume book entitled Mathematical Techniques of Applied Probability (Academic Press, 1983).

  19. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber-Shiu functions and dependence.

  20. Landscape History and Theory: from Subject Matter to Analytical Tool

    Directory of Open Access Journals (Sweden)

    Jan Birksted

    2003-10-01

    Full Text Available This essay explores how landscape history can engage methodologically with the adjacent disciplines of art history and visual/cultural studies. Central to the methodological problem is the mapping of the beholder - spatially, temporally and phenomenologically. In this mapping process, landscape history is transformed from subject matter to analytical tool. As a result, landscape history no longer simply imports and applies ideas from other disciplines but develops its own methodologies to engage and influence them. Landscape history, like art history, thereby takes on a creative cultural presence. Through that process, landscape architecture and garden design regains the cultural power now carried by the arts and museum studies, and has an effect on the innovative capabilities of contemporary landscape design.

  1. Experience matters: information acquisition optimizes probability gain.

    Science.gov (United States)

    Nelson, Jonathan D; McKenzie, Craig R M; Cottrell, Garrison W; Sejnowski, Terrence J

    2010-07-01

    Deciding which piece of information to acquire or attend to is fundamental to perception, categorization, medical diagnosis, and scientific inference. Four statistical theories of the value of information - information gain, Kullback-Leibler distance, probability gain (error minimization), and impact - are equally consistent with extant data on human information acquisition. Three experiments, designed via computer optimization to be maximally informative, tested which of these theories best describes human information search. Experiment 1, which used natural sampling and experience-based learning to convey environmental probabilities, found that probability gain explained subjects' information search better than the other statistical theories or the probability-of-certainty heuristic. Experiments 1 and 2 found that subjects behaved differently when the standard method of verbally presented summary statistics (rather than experience-based learning) was used to convey environmental probabilities. Experiment 3 found that subjects' preference for probability gain is robust, suggesting that the other models contribute little to subjects' search behavior.
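
    The competing measures are simple to state computationally. A sketch for a binary category and a binary query with made-up environmental probabilities, contrasting probability gain with information gain:

```python
# Value-of-information measures for one query Q about a category C.
import numpy as np

p_c = np.array([0.7, 0.3])                 # prior P(C) over two categories
p_q_given_c = np.array([[0.9, 0.1],        # P(Q=q | C=c); rows are categories
                        [0.3, 0.7]])

p_q = p_c @ p_q_given_c                    # marginal P(Q)
post = (p_q_given_c * p_c[:, None]) / p_q  # posterior P(C | Q=q); columns are q

def entropy(p):
    return -(p * np.log2(p)).sum()

# Probability gain: expected improvement in the probability of a correct guess.
prob_gain = (p_q * post.max(axis=0)).sum() - p_c.max()
# Information gain: expected reduction in Shannon entropy.
info_gain = entropy(p_c) - (p_q * np.array([entropy(post[:, q]) for q in range(2)])).sum()

print(f"probability gain: {prob_gain:.3f}  information gain: {info_gain:.3f} bits")
```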

  2. Classical many-body theory with retarded interactions: Dynamical irreversibility and determinism without probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Zakharov, A.Yu., E-mail: Anatoly.Zakharov@novsu.ru; Zakharov, M.A., E-mail: ma_zakharov@list.ru

    2016-01-28

    The exact equations of motion for the microscopic density of a classical many-body system with inter-particle retarded interactions taken into account are derived. It is shown that retardation of the interactions leads to irreversible behavior of many-body systems. - Highlights: • A new form of the equation of motion of a classical many-body system is proposed. • Interaction retardation as one of the mechanisms of many-body system irreversibility. • Irreversibility and determinism without probabilities. • A possible way to a microscopic foundation of thermodynamics.

  3. MODEL OF PROFESSIONALLY ORIENTED EDUCATION OF FUTURE ENGINEERS IN PROBABILITY THEORY AND STOCHASTIC PROCESSES AND ITS IMPLEMENTATION IN PRACTICAL LESSONS

    Directory of Open Access Journals (Sweden)

    E. Chumak

    2014-04-01

    Full Text Available The concept of a model and various approaches to creating such models are analyzed in the paper. The essence of a model that reflects the process of implementing all the components of the designed teaching methodology in their interaction is presented. A professionally oriented education model for the probability theory and stochastic processes course for future engineers is proposed by the author. It consists of four parts: a theoretical unit; a methodological unit; a content and organization unit; and a control and effectiveness unit. The application of the methodological foundations of professionally oriented, heuristic, and problem-based learning to the formation of intensive learning activities of students during practical classes is shown. Organizational methods, forms and tools of training that promote the formation of students' internal purposes are presented in the paper. Methods of designing a system of professionally oriented tasks and of applying it in practical classes are given by the author. Some ways of developing students' skills and abilities during the generalization and systematization of knowledge, integrated practical exercises, laboratory works, and business games are considered. Indicators of the formation levels of learning-activity motivation, professional motivation and self-motivation, levels of knowledge and skills in the probability theory and stochastic processes course, levels of development of professional and analytical thinking, and the level of use of some e-tools are analyzed by the author. The possibility of using measuring tools, including questionnaires, surveys, freshman tests, modular tests, exams, special engineering discipline tests, and current tests, is underlined.

  4. A Framework for the Statistical Analysis of Probability of Mission Success Based on Bayesian Theory

    Science.gov (United States)

    2014-06-01

    subjectively, they also require a large amount of cognitive work. 2.2 Logistic Regression Logistic regression uses Frequentist probabilistic methods...node is an influence on the system. The colours of the nodes represent the different data types, as follows: • Orange: variable. • Blue: fixed, and

  5. Modeling self on others: An import theory of subjectivity and selfhood.

    Science.gov (United States)

    Prinz, Wolfgang

    2017-03-01

    This paper outlines an Import Theory of subjectivity and selfhood. Import theory claims that subjectivity is initially perceived as a key feature of other minds before it is then imported from other minds to one's own mind, whereby it lays the ground for mental selfhood. Import theory builds on perception-production matching, which in turn draws on both representational mechanisms and social practices. Representational mechanisms rely on common coding of perception and production. Social practices rely on action mirroring in dyadic interactions. The interplay between mechanisms and practices gives rise to modeling self on others. Individuals become intentional agents in virtue of perceiving others mirroring themselves. The outline of the theory is preceded by an introductory section that locates import theory in the broader context of competing approaches, and it is followed by a concluding section that assesses import theory in terms of empirical evidence and explanatory power. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Quantifying Diagnostic Uncertainty Using Item Response Theory: The Posterior Probability of Diagnosis Index

    Science.gov (United States)

    Lindhiem, Oliver; Kolko, David J.; Yu, Lan

    2013-01-01

    Using traditional Diagnostic and Statistical Manual of Mental Disorders (DSM; American Psychiatric Association, 2000) diagnostic criteria, clinicians are forced to make categorical decisions (diagnosis versus no diagnosis). This forced choice implies that mental and behavioral health disorders are categorical and does not fully characterize varying degrees of uncertainty associated with a particular diagnosis. Using an IRT (latent trait model) framework, we describe the development of the Posterior Probability of Diagnosis (PPOD) Index, which answers the question, “What is the likelihood that a patient meets or exceeds the latent trait threshold for a diagnosis?” The PPOD Index is based on the posterior distribution of θ (latent trait score) for each patient’s profile of symptoms. The PPOD Index allows clinicians to quantify and communicate the degree of uncertainty associated with each diagnosis in probabilistic terms. We illustrate the advantages of the PPOD Index in a clinical sample (N = 321) of children and adolescents with Oppositional Defiant Disorder (ODD). PMID:23356682
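
    The index reduces to a tail probability of the posterior for θ. A minimal sketch, assuming a normal approximation to that posterior (all numbers invented):

```python
# PPOD as the posterior probability that the latent trait theta meets or
# exceeds the diagnostic threshold, under a normal posterior approximation.
from scipy.stats import norm

theta_hat, se = 1.10, 0.45  # posterior mean and SD from an IRT scoring run
threshold = 0.80            # latent-trait cutoff for the diagnosis

ppod = 1 - norm.cdf(threshold, loc=theta_hat, scale=se)
print(f"Posterior probability of diagnosis: {ppod:.2f}")
```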

  7. Re/Writing the Subject: A Contribution to Post-Structuralist Theory in Mathematics Education

    Science.gov (United States)

    Roth, Wolff-Michael

    2012-01-01

    This text, occasioned by a critical reading of "Mathematics Education and Subjectivity" (Brown, 2011) and constituting a response to the book, aims at contributing to the building of (post-structuralist) theory in mathematics education. Its purpose was to re/write two major positions that "Mathematics Education and Subjectivity" articulates:…

  8. Optimal Volume for Concert Halls Based on Ando’s Subjective Preference and Barron Revised Theories

    Directory of Open Access Journals (Sweden)

    Salvador Cerdá

    2014-03-01

    Full Text Available The Ando-Beranek model, a linear version of Ando's subjective preference theory, obtained by the authors in a recent work, was combined with Barron's revised theory. An optimal volume region for each reverberation time was obtained for classical music in symphony orchestra concert halls. The obtained relation was tested, with good agreement, against the top rated halls reported by Beranek and other halls with reported anomalies.

  9. Dynamic mean field theory for lattice gas models of fluids confined in porous materials: Higher order theory based on the Bethe-Peierls and path probability method approximations

    Science.gov (United States)

    Edison, John R.; Monson, Peter A.

    2014-07-01

    Recently we have developed a dynamic mean field theory (DMFT) for lattice gas models of fluids in porous materials [P. A. Monson, J. Chem. Phys. 128(8), 084701 (2008)]. The theory can be used to describe the relaxation processes in the approach to equilibrium or metastable states for fluids in pores and is especially useful for studying systems exhibiting adsorption/desorption hysteresis. In this paper we discuss the extension of the theory to higher order by means of the path probability method (PPM) of Kikuchi and co-workers. We show that this leads to a treatment of the dynamics that is consistent with thermodynamics coming from the Bethe-Peierls or Quasi-Chemical approximation for the equilibrium or metastable equilibrium states of the lattice model. We compare the results from the PPM with those from DMFT and from dynamic Monte Carlo simulations. We find that the predictions from PPM are qualitatively similar to those from DMFT but give somewhat improved quantitative accuracy, in part due to the superior treatment of the underlying thermodynamics. This comes at the cost of greater computational expense associated with the larger number of equations that must be solved.

  10. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.

  11. What is Probability Theory?

    Indian Academy of Sciences (India)


  12. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  13. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  14. Increased probability of repetitive spinal motoneuron activation by transcranial magnetic stimulation after muscle fatigue in healthy subjects

    DEFF Research Database (Denmark)

    Andersen, Birgit; Felding, Ulrik Ascanius; Krarup, Christian

    2012-01-01

    Triple stimulation technique (TST) has previously shown that transcranial magnetic stimulation (TMS) fails to activate a proportion of spinal motoneurons (MNs) during motor fatigue. The TST response depression without attenuation of the conventional motor evoked potential suggested an increased probability of repetitive spinal MN activation during exercise, even if some MNs failed to discharge by the brain stimulus. Here we used a modified TST (Quadruple stimulation, QuadS, and Quintuple stimulation, QuintS) to examine the influence of fatiguing exercise on second and third MN discharges after ... the muscle is fatigued. Repetitive MN firing may provide an adaptive mechanism to maintain motor unit activation and task performance during sustained voluntary activity.

  15. Inclusion probability for DNA mixtures is a subjective one-sided match statistic unrelated to identification information

    Directory of Open Access Journals (Sweden)

    Mark William Perlin

    2015-01-01

    Full Text Available Background: DNA mixtures of two or more people are a common type of forensic crime scene evidence. A match statistic that connects the evidence to a criminal defendant is usually needed for court. Jurors rely on this strength of match to help decide guilt or innocence. However, the reliability of unsophisticated match statistics for DNA mixtures has been questioned. Materials and Methods: The most prevalent match statistic for DNA mixtures is the combined probability of inclusion (CPI), used by crime labs for over 15 years. When testing 13 short tandem repeat (STR) genetic loci, the CPI⁻¹ value is typically around a million, regardless of DNA mixture composition. However, actual identification information, as measured by a likelihood ratio (LR), spans a much broader range. This study examined probability of inclusion (PI) mixture statistics for 517 locus experiments drawn from 16 reported cases and compared them with LR locus information calculated independently on the same data. The log(PI⁻¹) values were examined and compared with corresponding log(LR) values. Results: The LR and CPI methods were compared in case examples of false inclusion, false exclusion, a homicide, and criminal justice outcomes. Statistical analysis of crime laboratory STR data shows that inclusion match statistics exhibit a truncated normal distribution having zero center, with little correlation to actual identification information. By the law of large numbers (LLN), CPI⁻¹ increases with the number of tested genetic loci, regardless of DNA mixture composition or match information. These statistical findings explain why CPI is relatively constant, with implications for DNA policy, criminal justice, cost of crime, and crime prevention. Conclusions: Forensic crime laboratories have generated CPI statistics on hundreds of thousands of DNA mixture evidence items. However, this commonly used match statistic behaves like a random generator of inclusionary values, following the LLN
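
    In its usual textbook form, the locus-wise probability of inclusion is the squared sum of the frequencies of the alleles observed in the mixture, and CPI is the product across loci. A toy computation (invented frequencies) showing how CPI⁻¹ grows with the number of loci:

```python
# Combined probability of inclusion (CPI) in its usual textbook form:
# PI per locus = (sum of frequencies of alleles seen in the mixture)^2,
# CPI = product of PI over loci. Frequencies below are invented.
import numpy as np

locus_freqs = [[0.10, 0.20, 0.15], [0.05, 0.30], [0.12, 0.18, 0.25],
               [0.08, 0.22], [0.15, 0.15, 0.10]]

pi_per_locus = [sum(f) ** 2 for f in locus_freqs]
cpi = np.prod(pi_per_locus)
# With 13 loci of this kind, CPI^-1 reaches the order of a million,
# which is the law-of-large-numbers effect the paper describes.
print(f"CPI = {cpi:.3e}, CPI^-1 = {1 / cpi:,.0f}")
```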

  16. Developing a Methodology for Eliciting Subjective Probability Estimates During Expert Evaluations of Safety Interventions: Application for Bayesian Belief Networks

    Science.gov (United States)

    Wiegmann, Douglas A.

    2005-01-01

    The NASA Aviation Safety Program (AvSP) has defined several products that will potentially modify airline and/or ATC operations, enhance aircraft systems, and improve the identification of potential hazardous situations within the National Airspace System (NAS). Consequently, there is a need to develop methods for evaluating the potential safety benefit of each of these intervention products so that resources can be effectively invested. One approach is to elicit expert judgments to develop Bayesian Belief Networks (BBN's) that model the potential impact that specific interventions may have. Specifically, the present report summarizes methodologies for improving the elicitation of probability estimates during expert evaluations of AvSP products for use in BBN's. The work involved joint efforts between Professor James Luxhoj from Rutgers University and researchers at the University of Illinois. The Rutgers project to develop BBN's received funding from NASA under the title "Probabilistic Decision Support for Evaluating Technology Insertion and Assessing Aviation Safety System Risk." The proposed project was funded separately but supported the existing Rutgers program.

  17. Subject positions theory -- its application to understanding collaboration (and confrontation) in critical care.

    Science.gov (United States)

    Sundin-Huard, D

    2001-05-01

    Doctors and nurses do not usually take a collaborative approach to the ethical challenges of the critical care environment. This leads to the stresses that produce moral anguish and burnout -- both for nursing and medical staff. A more collegial relationship between nurses and physicians should improve patient care. If we are to promote this collegiality, one way to proceed is to investigate the interactions between health care professionals in order to develop an understanding of the barriers to, and supports for collaboration. Subject positions theory offers a method of explaining and elucidating the interactions between nurse and physician in terms of power dynamics, mutual expectations and the discourse available to each individual. This paper aims to demonstrate how subject positions theory can facilitate the interpretation of the interactions between health professionals in terms of the power dynamics influencing those interactions. This paper will use the example of a case study from my own research to demonstrate the application of this theory and its usefulness in the analysis of the interactions between health care professionals. Application of this theory is used to demonstrate the author's argument that the current political and cultural structure of the health care system does not support the subject position - nurse advocate.

  18. A multiscale finite element model validation method of composite cable-stayed bridge based on Probability Box theory

    Science.gov (United States)

    Zhong, Rumian; Zong, Zhouhong; Niu, Jie; Liu, Qiqi; Zheng, Peijuan

    2016-05-01

    Modeling and simulation are routinely implemented to predict the behavior of complex structures. These tools powerfully unite theoretical foundations, numerical models and experimental data, which include associated uncertainties and errors. A new methodology for multi-scale finite element (FE) model validation is proposed in this paper. The method is based on a two-step updating method, a novel approach to obtain coupling parameters in the gluing sub-regions of a multi-scale FE model, and on Probability Box (P-box) theory, which can provide lower and upper bounds for the purpose of quantifying and transmitting the uncertainty of structural parameters. The structural health monitoring data of Guanhe Bridge, a composite cable-stayed bridge with large span, and Monte Carlo simulation were used to verify the proposed method. The results show satisfactory accuracy: the overlap ratio index of each modal frequency is over 89%, the average absolute values of the relative errors are small, and the CDF of the normal distribution coincides well with the measured frequencies of Guanhe Bridge. The validated multiscale FE model may be further used in structural damage prognosis and safety prognosis.

  19. Applying Probability Theory for the Quality Assessment of a Wildfire Spread Prediction Framework Based on Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Andrés Cencerrado

    2013-01-01

    Full Text Available This work presents a framework for assessing how the constraints existing at the time of attending an ongoing forest fire affect simulation results, both in terms of quality (accuracy obtained) and of the time needed to make a decision. In the wildfire spread simulation and prediction area, it is essential to properly exploit the computational power offered by new computing advances. For this purpose, we rely on a two-stage prediction process to enhance the quality of traditional predictions, taking advantage of parallel computing. This strategy is based on an adjustment stage which is carried out by a well-known evolutionary technique: Genetic Algorithms. The core of this framework is evaluated according to probability theory principles. Thus, a thorough statistical study is presented and oriented towards the characterization of such an adjustment technique in order to help the operation managers deal with the two aspects previously mentioned: time and quality. The experimental work in this paper is based on a region in Spain which is one of the most prone to forest fires: El Cap de Creus.

  20. Facilitating Group Decision-Making: Facilitator's Subjective Theories on Group Coordination

    Directory of Open Access Journals (Sweden)

    Michaela Kolbe

    2008-10-01

    Full Text Available A key feature of group facilitation is motivating and coordinating people to perform their joint work. This paper focuses on group coordination which is a prerequisite to group effectiveness, especially in complex tasks. Decision-making in groups is a complex task that consequently needs to be coordinated by explicit rather than implicit coordination mechanisms. Based on the embedded definition that explicit coordination does not just happen but is purposely executed by individuals, we argue that individual coordination intentions and mechanisms should be taken into account. Thus far, the subjective perspective of coordination has been neglected in coordination theory, which is understandable given the difficulties in defining and measuring subjective aspects of group facilitation. We therefore conducted focused interviews with eight experts who either worked as senior managers or as experienced group facilitators and analysed their approaches to group coordination using methods of content analysis. Results show that these experts possess sophisticated mental representations of their coordination behaviour. These subjective coordination theories can be organised in terms of coordination schemes in which coordination-releasing situations are facilitated by special coordination mechanisms that, in turn, lead to the perception of specific consequences. We discuss the importance of these subjective coordination theories for effectively facilitating group decision-making and minimising process losses. URN: urn:nbn:de:0114-fqs0901287

  1. Subjectivity

    Directory of Open Access Journals (Sweden)

    Jesús Vega Encabo

    2015-11-01

    Full Text Available In this paper, I claim that subjectivity is a way of being that is constituted through a set of practices in which the self is subject to the dangers of fictionalizing and plotting her life and self-image. I examine some ways of becoming subject through narratives and through theatrical performance before others. Through these practices, a real and active subjectivity is revealed, capable of self-knowledge and self-transformation. 

  2. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  3. Experience Matters: Information Acquisition Optimizes Probability Gain

    Science.gov (United States)

    Nelson, Jonathan D.; McKenzie, Craig R.M.; Cottrell, Garrison W.; Sejnowski, Terrence J.

    2010-01-01

    Deciding which piece of information to acquire or attend to is fundamental to perception, categorization, medical diagnosis, and scientific inference. Four statistical theories of the value of information—information gain, Kullback-Leibler distance, probability gain (error minimization), and impact—are equally consistent with extant data on human information acquisition. Three experiments, designed via computer optimization to be maximally informative, tested which of these theories best describes human information search. Experiment 1, which used natural sampling and experience-based learning to convey environmental probabilities, found that probability gain explained subjects’ information search better than the other statistical theories or the probability-of-certainty heuristic. Experiments 1 and 2 found that subjects behaved differently when the standard method of verbally presented summary statistics (rather than experience-based learning) was used to convey environmental probabilities. Experiment 3 found that subjects’ preference for probability gain is robust, suggesting that the other models contribute little to subjects’ search behavior. PMID:20525915

  4. A grounded theory approach to the subjective understanding of urban soundscape in Sheffield

    OpenAIRE

    Liu, F.; Kang, J.

    2016-01-01

    The aim of this study is to gain a greater insight into the factors that affect individuals' preferences and understanding of urban soundscapes. Based on a grounded theory approach, with 53 participants in Sheffield, five categories have been revealed for the subjective understanding of soundscape: soundscape definition, soundscape memory, soundscape sentiment, soundscape expectation, and soundscape aesthetics. More specifically, to some extent, the value people place on sounds does not lie i...

  5. Relativistic Many-body Møller-Plesset Perturbation Theory Calculations of the Energy Levels and Transition Probabilities in Na- to P-like Xe Ions

    Energy Technology Data Exchange (ETDEWEB)

    Vilkas, M J; Ishikawa, Y; Trabert, E

    2007-03-27

    Relativistic multireference many-body perturbation theory calculations have been performed on Xe⁴³⁺-Xe³⁹⁺ ions, resulting in energy levels, electric dipole transition probabilities, and level lifetimes. The second-order many-body perturbation theory calculation of energy levels included mass shifts, the frequency-dependent Breit correction and Lamb shifts. The calculated transition energies and E1 transition rates are used to present synthetic spectra in the extreme ultraviolet range for some of the Xe ions.

  6. Using extreme value theory approaches to forecast the probability of outbreak of highly pathogenic influenza in Zhejiang, China.

    Directory of Open Access Journals (Sweden)

    Jiangpeng Chen

    Full Text Available Influenza is a contagious disease of high transmissibility that spreads around the world with considerable morbidity and mortality, and it presents an enormous burden on worldwide public health. Few mathematical models can be used because influenza incidence data are generally not normally distributed. We developed a mathematical model using Extreme Value Theory (EVT) to forecast the probability of outbreak of highly pathogenic influenza. The incidence data of highly pathogenic influenza in Zhejiang province from April 2009 to November 2013 were retrieved from the website of the Health and Family Planning Commission of Zhejiang Province. The MATLAB "VIEM" toolbox was used for data analysis and modelling. In the present work, we used the Peak Over Threshold (POT) model, assuming the frequency to follow a Poisson process and the intensity to be Pareto distributed, to characterize the temporal variability of the long-term extreme incidence of highly pathogenic influenza in Zhejiang, China. The skewness and kurtosis of the incidence of highly pathogenic influenza in Zhejiang between April 2009 and November 2013 were 4.49 and 21.12, which indicated a "fat tail" distribution. A QQ plot and a mean excess plot were used to further validate the features of the distribution. After determining the threshold, we modeled the extremes and estimated the shape and scale parameters by the maximum likelihood method. The results showed that months in which the incidence of highly pathogenic influenza is about 4462/2286/1311/487 are predicted to occur once every five/three/two/one years, respectively. Despite its simplicity, the present study offers a sound modeling strategy and a methodological avenue to implement forecasting of an epidemic in the midst of its course.
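
    The peaks-over-threshold fit can be reproduced in outline with scipy instead of the MATLAB toolbox used in the paper. A sketch on synthetic monthly counts (all numbers invented), fitting a generalized Pareto distribution to threshold excesses and converting it into return levels:

```python
# Peaks-over-threshold sketch: GPD for the excesses over a high threshold,
# a Poisson rate for how often excesses occur, and derived return levels.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
incidence = rng.lognormal(mean=5.0, sigma=1.0, size=120)  # synthetic monthly counts

threshold = np.quantile(incidence, 0.90)
excess = incidence[incidence > threshold] - threshold
rate = len(excess) / len(incidence)                       # exceedances per month

# Fit the generalized Pareto distribution to the excesses (location fixed at 0).
shape, _, scale = stats.genpareto.fit(excess, floc=0)

def return_level(months):
    """Incidence level exceeded on average once every `months` months."""
    p_exceed = 1.0 / (months * rate)                      # tail prob. above the level
    return threshold + stats.genpareto.ppf(1 - p_exceed, shape, scale=scale)

for years in (1, 2, 3, 5):
    print(f"{years}-year return level: {return_level(12 * years):,.0f}")
```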

  7. Bonding in Heavier Group 14 Zero-Valent Complexes-A Combined Maximum Probability Domain and Valence Bond Theory Approach.

    Science.gov (United States)

    Turek, Jan; Braïda, Benoît; De Proft, Frank

    2017-10-17

    The bonding in heavier Group 14 zero-valent complexes of a general formula L2 E (E=Si-Pb; L=phosphine, N-heterocyclic and acyclic carbene, cyclic tetrylene and carbon monoxide) is probed by combining valence bond (VB) theory and maximum probability domain (MPD) approaches. All studied complexes are initially evaluated on the basis of the structural parameters and the shape of frontier orbitals revealing a bent structural motif and the presence of two lone pairs at the central E atom. For the VB calculations three resonance structures are suggested, representing the "ylidone", "ylidene" and "bent allene" structures, respectively. The influence of both ligands and central atoms on the bonding situation is clearly expressed in different weights of the resonance structures for the particular complexes. In general, the bonding in the studied E(0) compounds, the tetrylones, is best described as a resonating combination of "ylidone" and "ylidene" structures with a minor contribution of the "bent allene" structure. Moreover, the VB calculations allow for a straightforward assessment of the π-backbonding (E→L) stabilization energy. The validity of the suggested resonance model is further confirmed by the complementary MPD calculations focusing on the E lone pair region as well as the E-L bonding region. Likewise, the MPD method reveals a strong influence of the σ-donating and π-accepting properties of the ligand. In particular, either one single domain or two symmetrical domains are found in the lone pair region of the central atom, supporting the predominance of either the "ylidene" or "ylidone" structures having one or two lone pairs at the central atom, respectively. Furthermore, the calculated average populations in the lone pair MPDs correlate very well with the natural bond orbital (NBO) populations, and can be related to the average number of electrons that is backdonated to the ligands. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. "A terrible piece of bad metaphysics"? Towards a history of abstraction in nineteenth- and early twentieth-century probability theory, mathematics and logic

    NARCIS (Netherlands)

    Verburgt, L.M.

    2015-01-01

    This dissertation provides a contribution to a new history of exact thought in which the existence of a break between "non-modern" and "modern" abstraction is put forward to account for the possibility of the "modernization" of probability theory, mathematics and logic during the 19th- and early

  9. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  10. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    Science.gov (United States)

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-01

    In epidemiological studies as well as in clinical practice, the amount of medical image data produced has increased strongly in the last decade. In this context, organ segmentation in MR volume data has gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant, requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-step probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several extended segmentation strategies. We present a three-class support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high-quality subject-specific parenchyma probability maps. Several refinement strategies, including a final shape-based 3D level set segmentation technique, are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.

  11. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  12. A Longitudinal Item Response Theory Model to Characterize Cognition Over Time in Elderly Subjects

    Science.gov (United States)

    Bornkamp, Björn; Krahnke, Tillmann; Mielke, Johanna; Monsch, Andreas; Quarg, Peter

    2017-01-01

    For drug development in neurodegenerative diseases such as Alzheimer's disease, it is important to understand which cognitive domains carry the most information on the earliest signs of cognitive decline, and which subject characteristics are associated with a faster decline. A longitudinal Item Response Theory (IRT) model was developed for the Basel Study on the Elderly, in which the Consortium to Establish a Registry for Alzheimer's Disease – Neuropsychological Assessment Battery (with additions) and the California Verbal Learning Test were measured on 1,750 elderly subjects for up to 13.9 years. The model jointly captured the multifaceted nature of cognition and its longitudinal trajectory. The word list learning and delayed recall tasks carried the most information. Greater age at baseline, fewer years of education, and positive APOEɛ4 carrier status were associated with a faster cognitive decline. Longitudinal IRT modeling is a powerful approach for progressive diseases with multifaceted endpoints. PMID:28643388

  13. Factual and cognitive probability

    OpenAIRE

    Chuaqui, Rolando

    2012-01-01

    This modification separates the two aspects of probability: probability as a part of physical theories (factual), and as a basis for statistical inference (cognitive). Factual probability is represented by probability structures as in the earlier papers, but now built independently of the language. Cognitive probability is interpreted as a form of "partial truth". The paper also contains a discussion of the Principle of Insufficient Reason and of Bayesian and classical statistical methods, in...

  14. Evaluating probability forecasts

    OpenAIRE

    Lai, Tze Leung; Gross, Shulamith T.; Shen, David Bo

    2011-01-01

    Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability for...
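
    The record truncates before the authors' own theory; as a hedged illustration of the classical scoring rules it refers to, the Brier and logarithmic scores of a forecast sequence can be computed as follows (the toy forecasts are invented for the example):

        # Two classical proper scoring rules for probability forecasts.
        import numpy as np

        def brier_score(p, y):
            """Mean squared difference between forecasts p and outcomes y in {0, 1}."""
            p, y = np.asarray(p, float), np.asarray(y, float)
            return np.mean((p - y) ** 2)

        def log_score(p, y, eps=1e-12):
            """Mean negative log-likelihood of the outcomes; lower is better."""
            p = np.clip(np.asarray(p, float), eps, 1 - eps)
            y = np.asarray(y, float)
            return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

        forecasts = [0.9, 0.2, 0.7, 0.5]
        outcomes  = [1,   0,   1,   0]
        print(brier_score(forecasts, outcomes), log_score(forecasts, outcomes))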

  15. Thermodynamic characterization of polymeric materials subjected to non-isothermal flows: Experiment, theory and simulation

    Science.gov (United States)

    Ionescu, Tudor Constantin

    Frictional or viscous heating phenomena are found in virtually every industrial operation dealing with processing of polymeric materials. This work is aimed at addressing some of the existing shortcomings in modeling non-isothermal polymer flowing processes. Specifically, existing theories suggest that when a polymer melt is subjected to deformation, its internal energy changes very little compared to its conformational entropy. This statement forms the definition of the Theory of Purely Entropic Elasticity (PEE) applied to polymer melts. Under the auspices of this theory, the temperature evolution equation for modeling the polymer melt under an applied deformation is greatly simplified. In this study, using a combination of experimental measurements, continuum-based computer modeling and molecular simulation techniques, the validity of this theory is tested for a wide range of processing conditions. First, we present experimental evidence that this theory is only valid for low deformation regimes. Furthermore, using molecular theory, a direct correlation is found between the relaxation characteristics of the polymer and the flow regime where this theory stops being valid. We present a new and improved form of the temperature equation containing an extra term previously neglected under the PEE assumption, followed by a recipe for evaluating the extra term. The corrected temperature equation is found to give more accurate predictions for the temperature profiles in the high flow rate regimes, in excellent agreement with our experimental measurements. Next, in order to gain a molecular-level understanding of our experimental findings, a series of polydisperse linear alkane systems with average chain lengths between 24 and 78 carbon atoms are modeled with an applied "orienting field" using a highly efficient non-equilibrium Monte Carlo scheme. Our simulation results appear to substantiate our experimental findings. The internal energy change of the oriented

  16. Difficulties related to Probabilities

    OpenAIRE

    Rosinger, Elemer Elad

    2010-01-01

    Probability theory is often used as if it had the same ontological status as, for instance, Euclidean geometry or Peano arithmetic. In this regard, several highly questionable aspects of probability theory are mentioned which have earlier been presented in two arXiv papers.

  17. Detection Probability of Trends in Rare Events: Theory and Application to Heavy Precipitation in the Alpine Region.

    Science.gov (United States)

    Frei, Christoph; Schär, Christoph

    2001-04-01

    A statistical framework is presented for the assessment of climatological trends in the frequency of rare and extreme weather events. The methodology applies to long-term records of event counts and is based on the stochastic concept of binomially distributed counts. It embraces logistic regression for trend estimation and testing, and includes a quantification of the potential/limitation to discriminate a trend from the stochastic fluctuations in a record. This potential is expressed in terms of a detection probability, which is calculated from Monte Carlo-simulated surrogate records, and determined as a function of the record length, the magnitude of the trend and the average return period (i.e., the rarity) of events. Calculations of the detection probability for daily events reveal a strong sensitivity to the rarity of events: in a 100-yr record of seasonal counts, a frequency change by a factor of 1.5 can be detected with a probability of 0.6 for events with an average return period of 30 days; however, this value drops to 0.2 for events with a return period of 100 days. For moderately rare events the detection probability decreases rapidly with shorter record length, but it does not significantly increase with longer record length when very rare events are considered. The results demonstrate the difficulty of determining trends of very rare events, underpin the need for long-period data for trend analyses, and point toward a careful interpretation of statistically nonsignificant trend results. The statistical method is applied to examine seasonal trends of heavy daily precipitation at 113 rain gauge stations in the Alpine region of Switzerland (1901-94). For intense events (return period: 30 days) a statistically significant frequency increase was found in winter and autumn for a high number of stations. For strong precipitation events (return period larger than 100 days), trends are mostly statistically nonsignificant, which does not necessarily imply the absence
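
    The abstract describes the computation in enough detail to sketch it; the following hedged Monte Carlo illustration (the numbers are the abstract's examples or invented defaults, and a statsmodels binomial GLM stands in for the authors' logistic regression) estimates a detection probability for a frequency change by a given factor:

        # Monte Carlo sketch of trend detection probability for rare-event counts.
        import numpy as np
        import statsmodels.api as sm

        def detection_probability(years=100, days=90, return_period=30.0,
                                  factor=1.5, alpha=0.05, n_sim=500, seed=0):
            rng = np.random.default_rng(seed)
            p0 = 1.0 / return_period              # daily event probability at the start
            t = np.arange(years, dtype=float)
            # log-odds trend reaching `factor` times the initial frequency at the end
            logit0 = np.log(p0 / (1 - p0))
            logit1 = np.log(factor * p0 / (1 - factor * p0))
            p_t = 1 / (1 + np.exp(-(logit0 + (logit1 - logit0) * t / (years - 1))))
            X = sm.add_constant(t)
            hits = 0
            for _ in range(n_sim):
                counts = rng.binomial(days, p_t)  # one surrogate record of seasonal counts
                fit = sm.GLM(np.c_[counts, days - counts], X,
                             family=sm.families.Binomial()).fit()
                hits += fit.pvalues[1] < alpha    # was a significant trend detected?
            return hits / n_sim

        print(detection_probability())            # estimated detection probability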

  18. Theory of mind deficit in subjects with alcohol use disorder: an analysis of mindreading processes.

    Science.gov (United States)

    Bosco, Francesca M; Capozzi, Francesca; Colle, Livia; Marostica, Paolo; Tirassa, Maurizio

    2014-01-01

    The aim of the study was to investigate multidimensional Theory of Mind (ToM) abilities in subjects with alcohol use disorder (AUD). A semi-structured interview and a set of brief stories were used to investigate different components of the participants' ToM, namely first- vs. third-person, egocentric vs. allocentric, first- vs. second-order ToM in 22 persons with AUD plus an equal number of healthy controls. Participants were administered the Theory of Mind Assessment Scale (Th.o.m.a.s., Bosco et al., 2009a) and the Strange Stories test (Happé et al., 1999). Persons with AUD performed worse than controls on all ToM dimensions. The patterns of differences between groups varied according to the Th.o.m.a.s. dimension investigated. In particular, persons with AUD performed worse at third-person than at first-person ToM, and at the allocentric than at the egocentric perspective. These findings support the hypothesis that the ability to understand and ascribe mental states is impaired in AUD. Future studies should focus on the relevance of the different ToM impairments as predictors of treatment outcome in alcoholism, and on the possibility that rehabilitative interventions may be diversified according to ToM assessment.

  19. [Encountering the subject in the health field: a human care theory based on lived experience].

    Science.gov (United States)

    Vonarx, Nicolas; Desgroseilliers, Valérie

    2013-09-01

    Dominated by a bio-mechanistic paradigm, Western health systems are suffering from numerous problems. One such problem is the lack of consideration for lived experiences and the complexity and depth of meaning that characterize them. We accordingly emphasize in this text the importance of taking a deep look at the experiences of the cared-for Subject and changing the viewpoint on his or her problems. We defend this viewpoint with the help of a few ideas borrowed from Georges Canguilhem. We then refer to a socio-phenomenological approach inspired by the work of Alfred Schütz which allows us to better grasp people's lived experiences. We thus rehabilitate the Subject by proposing a human care theory that focuses on its relationship(s) with the body, others, time and space, as well as on self-referent identity labels that give meaning to one's existence. This study is a theoretical reflection on human care that considers professional collaboration and interdisciplinarity, and that does not ignore the concrete practices of stakeholders and professionals.

  20. On Probability Domains IV

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2017-12-01

    Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one (a quantum phenomenon) and, dually, an observable can map a crisp random event to a genuine fuzzy random event (a fuzzy phenomenon). The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.
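
    In the finite, crisp case the "quantum-like" behavior described above can be made concrete: a statistical map between finite probability domains is just a row-stochastic matrix, and it can send a degenerate (pure) state to a non-degenerate one. A toy sketch (finite state spaces assumed; the paper's categorical machinery is not reproduced):

        # A statistical map on finite probability domains as a stochastic matrix.
        import numpy as np

        T = np.array([[0.7, 0.3],          # row-stochastic: each row sums to 1
                      [0.2, 0.8]])

        pure_state = np.array([1.0, 0.0])  # degenerate distribution on state 0
        image = pure_state @ T             # push-forward under the statistical map
        print(image)                       # [0.7 0.3] -- no longer degenerate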

  1. On Probability Domains IV

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2017-06-01

    Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one (a quantum phenomenon) and, dually, an observable can map a crisp random event to a genuine fuzzy random event (a fuzzy phenomenon). The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.

  2. Concepts of Dhatu Siddhanta (theory of tissues formation and differentiation) and Rasayana; probable predecessor of stem cell therapy.

    Science.gov (United States)

    Sharma, Vinamra; Chaudhary, Anand Kumar

    2014-01-01

    To maintain health and to cure diseases through Rasayana (rejuvenation) therapy along with the main treatment is the unique approach of Ayurveda. The basic constituent unit of a living being is always a functional cell. Questions arise: from where is it generated? How does it attain its final, specifically differentiated form? As age progresses, various changes occur at every cell level and the cell adapts accordingly. The microenvironment for cell nourishment diminishes with age or as a disease condition persists. In this context, the Acharyas contributed and documented various facts and theories through their insight and wisdom. The hidden secrets in the basic principles of any medical system need to be explained in terms of contemporary knowledge. Contemporary research areas should be opened to include various explanations from different fields of ancient thought to support these new doctrines, if any. This review may be helpful in opening the door to future research in the field of the reverse scientific approach of Ayurveda in the context of Dhatu Siddhanta (theory of tissues formation and differentiation) and the theory of stem cells.

  3. Philosophy and probability

    CERN Document Server

    Childers, Timothy

    2013-01-01

    Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to the philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability (frequentist, propensity, classical, Bayesian, and objective Bayesian) and uses stimulating examples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,

  4. Five-Parameter Bivariate Probability Distribution

    Science.gov (United States)

    Tubbs, J.; Brewer, D.; Smith, O. W.

    1986-01-01

    NASA technical memorandum presents four papers about the five-parameter bivariate gamma class of probability distributions. With some overlap of subject matter, the papers address different aspects of the theory of these distributions and their use in forming statistical models of such phenomena as wind gusts. The distributions provide acceptable results for defining constraints in problems of designing aircraft and spacecraft to withstand large wind-gust loads.

  5. EDUCATION, CLASS STRUGGLE, REVOLUTION. SUBJECTIVITY AND OBJECTIVITY IN THE THEORY OF KARL MARX

    Directory of Open Access Journals (Sweden)

    Irene Viparelli

    2011-02-01

    Full Text Available This article aims to investigate the relationship between education, class struggle and revolution based on the Manifesto and the "historical texts" of Marx on the 1848 revolution. Initially, this confrontation is shown to be fruitful in illuminating the basic features of the Marxian theoretical framework: rejecting both "objectivist" and "subjectivist" interpretations, Marx's theory of history turns out to be founded on two different times - linear and cyclical. These define a dialectical relationship between objectivity and subjectivity in history. Next, the same confrontation is used to show how Marx represents the process in which the proletariat acquires a mature revolutionary consciousness. Far from considering it a "simple" development of proletarian class consciousness, Marx conceives this process as a more complex path that implicates all social classes. Finally, the last part is devoted to showing this theoretical device "empirically" in action: through Marx's analysis of the development of class struggles in France, it shows the centrality of the educational dimension in the Marxist conception of history

  6. Trust into Collective Privacy? The Role of Subjective Theories for Self-Disclosure in Online Communication

    Directory of Open Access Journals (Sweden)

    Ricarda Moll

    2014-12-01

    Full Text Available In order to build and maintain social capital in their Online Social Networks, users need to disclose personal information, a behavior that at the same time leads to a lower level of privacy. In this conceptual paper, we offer a new theoretical perspective on the question of why people might regulate their privacy boundaries inadequately when communicating in Online Social Networks. We argue that people have developed a subjective theory about online privacy putting them into a processing mode of default trust. In this trusting mode people would (a) discount the risk of a self-disclosure directly; and (b) infer the risk from invalid cues which would then reinforce their trusting mode. As a consequence people might be more willing to self-disclose information than their actual privacy preferences would otherwise indicate. We exemplify the biasing potential of a trusting mode for memory and metacognitive accuracy and discuss the role of a default trust mode for the development of social capital.

  7. Implications of the Subject's Ontological Statute in the Bakhtin, Medvedev, Vološinov Circle's Discursive Theory

    Directory of Open Access Journals (Sweden)

    Vera Lúcia Pires

    2013-06-01

    Full Text Available This paper aims to explore the dialogical ontology from a Bakhtinian perspective, for this is an essential topic linked to subjects' agency in the context of dialogism. From this definition of acting emerges a conception of language that is the basis of the Bakhtin, Medvedev, Vološinov Circle's theory of enunciation. Thus, one intends to examine here the implications of the statute of the subject in Bakhtin's philosophical theory and its resulting application in the Circle's enunciative theory, which makes it necessary to explore the philosophical bases of the Bakhtinian architecture as regards the dialogical principle, the conceptions of identity and intersubjectivity, social evaluation, the ethics of responsibility and the relationship among subjects. One aims to show that sense, resulting from enunciation, has to do with the ontological statute, understood as a social and historical one, of interacting subjects.

  8. A theory for the fracture of thin plates subjected to bending and twisting moments

    Science.gov (United States)

    Hui, C. Y.; Zehnder, Alan T.

    1993-01-01

    Stress fields near the tip of a through crack in an elastic plate under bending and twisting moments are reviewed assuming both Kirchhoff and Reissner plate theories. The crack tip displacement and rotation fields based on the Reissner theory are calculated. These results are used to calculate the J-integral (energy release rate) for both Kirchhoff and Reissner plate theories. Invoking Simmonds and Duva's (1981) result that the value of the J-integral based on either theory is the same for thin plates, a universal relationship between the Kirchhoff theory stress intensity factors and the Reissner theory stress intensity factors is obtained for thin plates. Calculation of Kirchhoff theory stress intensity factors from finite elements based on energy release rate is illustrated. It is proposed that, for thin plates, fracture toughness and crack growth rates be correlated with the Kirchhoff theory stress intensity factors.

  9. Theory of Nonlinear Generation of X-Radiation in a Crystal Subjected to Intense Electron Bombardment.

    Science.gov (United States)

    1980-11-01

    initially excited atom and unexcited atoms respectively, a(t) is the probability amplitude that the initially excited atom is excited at time t, all other atoms are in their ground state and there are no photons in the radiation field; b is the probability amplitude that there is one photon with

  10. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  11. Elements of quantum probability

    NARCIS (Netherlands)

    Kummerer, B.; Maassen, H.

    1996-01-01

    This is an introductory article presenting some basic ideas of quantum probability. From a discussion of simple experiments with polarized light and a card game we deduce the necessity of extending the body of classical probability theory. For a class of systems, containing classical systems with

  12. Competitive testing of health behavior theories: how do benefits, barriers, subjective norm, and intention influence mammography behavior?

    Science.gov (United States)

    Murphy, Caitlin C; Vernon, Sally W; Diamond, Pamela M; Tiro, Jasmin A

    2014-02-01

    Competitive hypothesis testing may explain differences in predictive power across multiple health behavior theories. We tested competing hypotheses of the Health Belief Model (HBM) and Theory of Reasoned Action (TRA) to quantify pathways linking subjective norm, benefits, barriers, intention, and mammography behavior. We analyzed longitudinal surveys of women veterans randomized to the control group of a mammography intervention trial (n = 704). We compared direct, partial mediation, and full mediation models with Satorra-Bentler χ2 difference testing. Barriers had a direct and indirect negative effect on mammography behavior; intention only partially mediated barriers. Benefits had little to no effect on behavior and intention; however, it was negatively correlated with barriers. Subjective norm directly affected behavior and indirectly affected intention through barriers. Our results provide empiric support for different assertions of HBM and TRA. Future interventions should test whether building subjective norm and reducing negative attitudes increases regular mammography.

  13. Competitive testing of health behavior theories: how do benefits, barriers, subjective norm, and intention influence mammography behavior?

    Science.gov (United States)

    Murphy, Caitlin C.; Vernon, Sally W.; Diamond, Pamela M.; Tiro, Jasmin A.

    2013-01-01

    Background Competitive hypothesis testing may explain differences in predictive power across multiple health behavior theories. Purpose We tested competing hypotheses of the Health Belief Model (HBM) and Theory of Reasoned Action (TRA) to quantify pathways linking subjective norm, benefits, barriers, intention, and mammography behavior. Methods We analyzed longitudinal surveys of women veterans randomized to the control group of a mammography intervention trial (n=704). We compared direct, partial mediation, and full mediation models with Satorra-Bentler χ2 difference testing. Results Barriers had a direct and indirect negative effect on mammography behavior; intention only partially mediated barriers. Benefits had little to no effect on behavior and intention; however, it was negatively correlated with barriers. Subjective norm directly affected behavior and indirectly affected intention through barriers. Conclusions Our results provide empiric support for different assertions of HBM and TRA. Future interventions should test whether building subjective norm and reducing negative attitudes increases regular mammography. PMID:23868613

  14. Uncertainty theory

    CERN Document Server

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  15. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  16. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.
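
    The scenario-dependence that the debate turns on is easy to exhibit by enumeration; a minimal sketch (equally likely sexes and birth orders assumed):

        # The "second child" problem: the answer depends on what was learned.
        from itertools import product

        pairs = list(product("BG", repeat=2))   # four equally likely sibling pairs

        # Scenario 1: we learn the elder child is a boy.
        s1 = [p for p in pairs if p[0] == "B"]
        print(sum(p[1] == "B" for p in s1) / len(s1))      # 0.5

        # Scenario 2: we learn at least one child is a boy.
        s2 = [p for p in pairs if "B" in p]
        print(sum(p == ("B", "B") for p in s2) / len(s2))  # 1/3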

  17. A Generic Simulation Approach for the Fast and Accurate Estimation of the Outage Probability of Single Hop and Multihop FSO Links Subject to Generalized Pointing Errors

    KAUST Repository

    Ben Issaid, Chaouki

    2017-07-28

    When assessing the performance of free space optical (FSO) communication systems, the outage probability encountered is generally very small, and the use of naïve Monte Carlo simulations thereby becomes prohibitively expensive. To estimate these rare event probabilities, we propose in this work an importance sampling approach which is based on the exponential twisting technique to offer fast and accurate results. In fact, we consider a variety of turbulence regimes, and we investigate the outage probability of FSO communication systems, under a generalized pointing error model based on the Beckmann distribution, for both single and multihop scenarios. Selected numerical simulations are presented to show the accuracy and the efficiency of our approach compared to naive Monte Carlo.
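
    The Beckmann-distributed pointing-error model is too involved to reproduce here; as a hedged sketch of the exponential-twisting idea itself, the following toy example estimates a Gaussian tail probability far below the resolution of naive Monte Carlo (the Gaussian model and all parameter values are stand-ins, not the paper's channel model):

        # Importance sampling via exponential twisting for a rare tail probability.
        import numpy as np

        def tail_prob_is(n=10, gamma=30.0, n_samples=100_000, seed=1):
            """Estimate P(X1 + ... + Xn > gamma) for iid standard normal Xi."""
            rng = np.random.default_rng(seed)
            theta = gamma / n                   # twist so the rare event becomes typical
            x = rng.normal(theta, 1.0, size=(n_samples, n))  # sample the tilted law
            s = x.sum(axis=1)
            # likelihood ratio of the original law to the tilted one
            weights = np.exp(-theta * s + n * theta**2 / 2)
            return np.mean(weights * (s > gamma))

        print(tail_prob_is())   # on the order of 1e-21; naive MC would return 0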

  18. Jacques Lacan's theory of the subject as real, symbolic and imaginary: how can Lacanian theory be of help to mental health nursing practice?

    Science.gov (United States)

    McSherry, A

    2013-11-01

    This paper presents an outline of Lacan's theory of the human subject, in particular focusing on Lacan's concepts of the real, symbolic and imaginary registers, and how an understanding of these can inform change and practice in mental health nursing. Mental health nursing is under pressure to define itself as a practice distinct from other professions in the field, and to respond in new ways to promoting mental health to the individual and a wider public. Lacan's theory of the subject is of particular relevance to mental health nurses working with mental distress but has received little attention in mental health nursing literature. Six implications for practice are outlined in terms of: against normalization, the importance of the function of the symptom, what cannot be known, meaning as ever-changing, against empathy and against holistic ideas of the self. © 2012 John Wiley & Sons Ltd.

  19. Subjects great and small: Maxwell on Saturn's rings and kinetic theory.

    Science.gov (United States)

    Garber, Elizabeth

    2008-05-28

    Since 1890, James Clerk Maxwell's reputation has rested upon his theory of electromagnetism. However, during his lifetime he was recognized 'as the leading molecular scientist' of his generation. We will explore the foundation of his significance before 1890 using his work on the stability of Saturn's rings and the development of his kinetic theory of gases, and then briefly discuss the grounds for the change of his reputation.

  20. Contemporary Leadership Theories. Enhancing the Understanding of the Complexity, Subjectivity and Dynamic of Leadership

    DEFF Research Database (Denmark)

    Winkler, Ingo

    Leadership is understood as a product of complex social relationships embedded in the logic and dynamics of the social system. The book discusses theoretical approaches from top leadership journals, but also addresses various alternatives that are suitable to challenge mainstream leadership research. It includes attributional and psychodynamic approaches, charismatic leadership theories, and theoretical approaches that define leader-member relations in terms of exchange relations, leadership under symbolic and political perspectives, leadership in the light of role theory, and leadership as a process of social learning.

  1. Norbert Wiener and Probability Theory

    Indian Academy of Sciences (India)

    In his investigation of continuous time stationary processes, Wiener introduced the famous Wiener-Hopf equation to solve the prediction problem. ... We would go to an empty classroom and Wiener would start writing on the blackboard while I sat at the desk listening or taking notes. Among the topics we talked about.

  2. Probability theory a concise course

    CERN Document Server

    Rozanov, Y A

    1977-01-01

    This clear exposition begins with basic concepts and moves on to combination of events, dependent events and random variables, Bernoulli trials and the De Moivre-Laplace theorem, a detailed treatment of Markov chains, continuous Markov processes, and more. Includes 150 problems, many with answers. Indispensable to mathematicians and natural scientists alike.
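
    The De Moivre-Laplace theorem mentioned in the synopsis is easy to check numerically; a small sketch comparing the binomial pmf near its mean with the approximating normal density (n and p chosen arbitrarily):

        # Numerical check of the De Moivre-Laplace normal approximation.
        import math

        def binom_pmf(n, p, k):
            return math.comb(n, k) * p**k * (1 - p)**(n - k)

        def normal_pdf(x, mu, sigma):
            return math.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

        n, p = 600, 0.5
        mu, sigma = n * p, math.sqrt(n * p * (1 - p))
        for k in (290, 300, 310):
            print(k, binom_pmf(n, p, k), normal_pdf(k, mu, sigma))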

  3. Using the Theory of Planned Behaviour to Understand Students' Subject Choices in Post-Compulsory Education

    Science.gov (United States)

    Taylor, Rachel Charlotte

    2015-01-01

    In recent years, there have been concerns in the UK regarding the uptake of particular subjects in post-compulsory education. Whilst entries for Advanced level (A-level) subjects such as media studies have experienced considerable growth, entries for A-level physics have, until recently, been declining, prompting fears of a skills crisis in future…

  4. Re-assessment of road accident data-analysis policy : applying theory from involuntary, high-consequence, low-probability events like nuclear power plant meltdowns to voluntary, low-consequence, high-probability events like traffic accidents

    Science.gov (United States)

    2002-02-01

    This report examines the literature on involuntary, high-consequence, low-probability (IHL) events like nuclear power plant meltdowns to determine what can be applied to the problem of voluntary, low-consequence, high-probability (VLH) events like tra...

  5. Introduction to probability and measure

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    According to a remark attributed to Mark Kac, 'Probability Theory is a measure theory with a soul'. This book, with its choice of proofs, remarks, examples and exercises, has been prepared taking both these aesthetic and practical aspects into account.

  6. The role of subjective norms in theory of planned behavior in the context of organic food consumption

    OpenAIRE

    Al-Swidi, Abdullah; Huque, Sheikh Mohammed Rafiul; Hafeez, Muhammad Haroon; Shariff, Mohd Noor Mohd

    2014-01-01

    The purpose of the paper is to investigate the applicability of the theory of planned behavior (TPB), with special emphasis on measuring the direct and moderating effects of subjective norms on attitude, perceived behavioral control and buying intention, in the context of buying organic food. Structured questionnaires were randomly distributed among academic staff and students of two universities in southern Punjab, Pakistan. Structural equation modeling was employed to test the proposed model fit....

  7. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  8. The Devaluation of the Subject in Popper's Theory of World 3

    Czech Academy of Sciences Publication Activity Database

    Parusniková, Zuzana

    2016-01-01

    Roč. 46, č. 3 (2016), s. 304-317 ISSN 0048-3931 Institutional support: RVO:67985955 Keywords : Karl Popper * World 3 * objective knowledge * criticism * creativity Subject RIV: AA - Philosophy ; Religion Impact factor: 0.392, year: 2016

  9. Elements of quantum probability

    OpenAIRE

    Kummerer, B.; Maassen, Hans

    1996-01-01

    This is an introductory article presenting some basic ideas of quantum probability. From a discussion of simple experiments with polarized light and a card game we deduce the necessity of extending the body of classical probability theory. For a class of systems, containing classical systems with finitely many states, a probabilistic model is developed. It can describe, in particular, the polarization experiments. Some examples of ‘quantum coin tosses’ are discussed, closely related to V.F.R....

  10. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  11. No Magic Bullet: A Theory-Based Meta-Analysis of Markov Transition Probabilities in Studies of Service Systems for Persons With Mental Disabilities.

    Science.gov (United States)

    Leff, Hugh Stephen; Chow, Clifton M; Graves, Stephen C

    2017-03-01

    A random-effects meta-analysis of studies that used Markov transition probabilities (TPs) to describe outcomes for mental health service systems of differing quality for persons with serious mental illness was implemented to improve the scientific understanding of systems performance, to use in planning simulations to project service system costs and outcomes over time, and to test a theory of how outcomes for systems varying in quality differ. Nineteen systems described in 12 studies were coded as basic (B), maintenance (M), and recovery oriented (R) on the basis of descriptions of services provided. TPs for studies were aligned with a common functional-level framework, converted to a one-month time period, synthesized, and compared with theory-based expectations. Meta-regression was employed to explore associations between TPs and characteristics of service recipients and studies. R systems performed better than M and B systems. However, M systems did not perform better than B systems. All systems showed negative as well as positive TPs. For approximately one-third of synthesized TPs, substantial interstudy heterogeneity was noted. Associations were found between TPs and service recipient and study variables. Conclusions: Conceptualizing systems as B, M, and R has potential for improving scientific understanding and systems planning. R systems appear more effective than B and M systems, although there is no "magic bullet" system for all service recipients. Interstudy heterogeneity indicates a need for common approaches to reporting service recipient states, time periods for TPs, service recipient attributes, and service system characteristics. The TPs found should be used in Markov simulations to project system effectiveness and costs over time.
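
    The kind of projection the conclusions call for is straightforward once a one-month TP matrix is in hand; a hedged sketch (the three functional levels and all matrix entries below are invented for illustration, not values from the meta-analysis):

        # Projecting a cohort across functional levels with monthly Markov TPs.
        import numpy as np

        states = ["low", "moderate", "high"]   # functional levels (illustrative)
        P = np.array([[0.85, 0.12, 0.03],      # rows: from-state, cols: to-state
                      [0.10, 0.80, 0.10],
                      [0.02, 0.13, 0.85]])     # each row sums to 1

        dist = np.array([0.5, 0.4, 0.1])       # initial cohort distribution
        for month in range(24):                # project two years forward
            dist = dist @ P
        print(dict(zip(states, dist.round(3))))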

  12. Gestalt Theory in Visual Screen Design — A New Look at an old subject

    OpenAIRE

    Chang, D.; Dooley, L.; Tuovinen, J. E

    2002-01-01

    Although often presented as a single basis for educational visual screen design, Gestalt theory is not a single small set of visual principles uniformly applied by all designers. In fact, it appears that instructional visual design literature often deals with only a small set of Gestalt laws. In this project Gestalt literature was consulted to distil the most relevant Gestalt laws for educational visual screen design. Eleven laws were identified. They deal with balance/symmetry, continuation,...

  13. EDUCATION, CLASS STRUGGLE, REVOLUTION. SUBJECTIVITY AND OBJECTIVITY IN THE THEORY OF KARL MARX

    OpenAIRE

    Irene Viparelli

    2011-01-01

    This article aims to investigate the relationship between education, class struggle and revolution based on the Manifesto and the "historical texts" of Marx on the 1848 revolution. Initially, this confrontation is shown to be fruitful in illuminating the basic features of the Marxian theoretical framework: rejecting both "objectivist" and "subjectivist" interpretations, Marx's theory of history turns out to be founded on two different times - linear and cyclical. These define a dialectical relat...

  14. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  15. Brecht, Hegel, Lacan: Brecht's Theory of Gest and the Problem of the Subject

    Directory of Open Access Journals (Sweden)

    Philip E. Bishop

    1986-01-01

    Full Text Available Brecht used the term "gest" to describe the generic components of human social behavior. He schooled actors in "decomposing" real conduct into distinct gestic images, which were criticized, compared, and altered by other actor-spectators. In his pedagogic theater, Brecht's young players engaged in a reciprocal process of acting and observing, which prepared them to act critically outside the theater. This gestic reciprocality echoes the master-slave dialectic in Hegel's Phenomenology and Lacan's description of the mirror phase. In Hegel, a subject achieves mastery (or self-consciousness) through the recognition of another subject. In Lacan, the infant recognizes itself in an (alienated) mirror-image and in its dramatic interactions with other infants. In each of these inter-subjective dialectics, the subject achieves sovereignty through the recognition of others and through a dramatic exchange with others. For Brecht, however, the structural roles of actor and spectator, teacher and student, were reversible, thus yielding a utopian notion of shared or collective sovereignty that is absent from Lacan. Furthermore, Brecht hoped that the sovereignty gained in the gestic theater would be transferred to actions outside the theater, on the stage of history.

  16. Top-down versus bottom-up theories of subjective well-being

    NARCIS (Netherlands)

    B. Headey; R. Veenhoven (Ruut); A.J. Wearing

    1991-01-01

    This paper addresses issues of causal direction in research on subjective well-being (SWB). Previous researchers have generally assumed that such variables as domain satisfactions, social support, life events, and levels of expectation and aspiration are causes of SWB. Critics have

  17. Physical Education Teachers' Subjective Theories about Integrating Information and Communication Technology (ICT) into Physical Education

    Science.gov (United States)

    Kretschmann, Rolf

    2015-01-01

    Like other school subjects, physical education (PE) is increasingly integrating information and communication technology (ICT) into regular classes. Such innovative teaching practices that implement ICT in PE involve diverse parties that are affected by these teaching processes. Students, principals, districts, parents,…

  18. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...
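
    A belief function assigns probability mass to sets of outcomes rather than to single outcomes; one standard way to evaluate an act under such lower probabilities, in the spirit of the article though not necessarily its exact axiomatization, scores each focal set by its worst member:

        # Lower expected utility under a belief function (illustrative values).
        utility = {"win": 1.0, "draw": 0.4, "lose": 0.0}

        # Mass assigned to subsets of outcomes (focal sets); masses sum to 1.
        masses = {
            frozenset({"win"}): 0.3,
            frozenset({"win", "draw"}): 0.5,           # ambiguity between win and draw
            frozenset({"win", "draw", "lose"}): 0.2,   # mass expressing full ignorance
        }

        lower_eu = sum(m * min(utility[o] for o in focal)
                       for focal, m in masses.items())
        print(lower_eu)   # 0.3*1.0 + 0.5*0.4 + 0.2*0.0 = 0.5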

  19. John Dewey on theory of learning and inquiry: The scientific method and subject matter

    Science.gov (United States)

    Chen, Po-Nien

    This study examines the educational debate between Dewey and some of his critics on the merits of learning the scientific method. Four of Dewey's critics (Hutchins, Hirsch, Hirst, and Scheffler) criticize Dewey for over-emphasizing the importance of the scientific method and under-emphasizing the importance of subject matter in education. This dissertation argues that these critics misunderstand Dewey's use of the scientific method and its importance in education. It argues that Dewey conceives of the scientific method in two different ways: first as an attitude and second as a tool. It also argues that, by failing to understand this critical distinction, these critics misunderstand the role of the scientific method in education. The dissertation concludes by showing that, educationally, Dewey's ideas of the scientific method have different meanings in different contexts. It analyzes the scientific method as empirical method, critical thinking, cooperative learning, and creative thinking, and shows the place of subject matter in each of them.

  20. Control Design of Active Magnetic Bearings for Rotors Subjected to Destabilising Seal Forces - Theory & Experiment

    DEFF Research Database (Denmark)

    Lauridsen, Jonas Skjødt

    advantages over traditional types of bearings, including: no mechanical contact, no lubrication, low maintenance, low vibration level, high rotational speed and low energy consumption. These advantages make AMBs especially useful in challenging environments, for instance in subsea turbomachinery applications... The main original contribution of the thesis is the framework for design of model-based controllers for AMB systems subjected to uncertain and changing dynamic seal forces. An identification method has been developed, and experimentally validated, to obtain precise models of Linear Fractional Transformation

  1. Watson’s theory of human caring and subjective living experiences: carative factors/caritas processes as a disciplinary guide to the professional nursing practice

    OpenAIRE

    Jean Watson

    2007-01-01

    This article provides an overview of Watson’s theory of Human Caring, the notion of Caritas and human phenomena. Special emphasis is placed upon the theoretical structure of human caring theory referred to as 10 Carative Factors/Caritas Processes and subjective living processes and experiences. These core conceptual aspects of the theory and human living processes are grounded within the philosophical and ethical foundation of the body of my caring theory work. Together they serve as a guide ...

  2. INTRODUCTION IN TECHNOLOGY CONCEPTUALIZATION THE SUBJECT DOMAIN OF SOCIOLOGY: EXPANSION OF THE THEORY (in the example of relationship/kinship)

    Directory of Open Access Journals (Sweden)

    A. Yu. Ivanov

    2017-01-01

    Full Text Available This article is the second of two whose aim is to introduce the reader without special mathematical training to the possibilities of applying the mathematical methods developed within the research direction “Conceptual analysis and design of systems of organizational management (CAD SOM)”, which are designed to solve a variety of tasks in both technical and humanitarian spheres on the basis of the proposed methodological approach to the mathematization of theoretical knowledge. At the heart of this methodological approach is a process of conceptualization, understood as a theoretical study of the qualitative aspects of a selected domain using mathematical forms (axiomatic theories) that fix the connections of logical derivability between the concepts characterizing this subject area. The resulting axiomatic theory – the conceptual scheme – is the basis for building database structures, decision-making processes, the variety of domain phenomena, analyses of the structure and genesis of the domain, and other tasks. One of the main advantages of this methodological approach is the ability to work with complex domains on the basis of the controlled synthesis of a terminal theory from conceptual schemes that explicate simple fragments of the subject area. Given the non-mathematical preparation of the reader, the content of the methods is illustrated by conceptualizing a conceptually simple subject area – the area of kinship relations – and by choosing one of the simplest goals of conceptualization – structuring the domain and building the variety of its phenomena. The first article gave a brief description of the mathematical methods and described the main stages of conceptualizing a subject area, from defining the boundaries of the domain to synthesizing the terminal theory and determining its compliance with the tasks of conceptualization. In the chosen example – the area of kinship relations

  3. Dynamic response of porous functionally graded material nanobeams subjected to moving nanoparticle based on nonlocal strain gradient theory

    Science.gov (United States)

    Barati, Mohammad Reza

    2017-11-01

    Up to now, nonlocal strain gradient theory (NSGT) has been broadly applied to examine the free vibration, static bending and buckling of nanobeams. This theory captures nonlocal stress field effects together with microstructure-dependent strain gradient effects. In this study, forced vibrations of NSGT nanobeams on an elastic substrate subjected to moving loads are examined. The nanobeam is made of functionally graded material (FGM) with even and uneven porosity distributions inside the material structure. The graded material properties with porosities are described by a modified power-law model. The dynamic deflection of the nanobeam is obtained via Galerkin and inverse Laplace transform methods. The importance of the nonlocal parameter, strain gradient parameter, moving load velocity, porosity volume fraction, type of porosity distribution and elastic foundation for the forced vibration behavior of nanobeams is discussed.

  4. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrödinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  5. A random matrix/transition state theory for the probability distribution of state-specific unimolecular decay rates: Generalization to include total angular momentum conservation and other dynamical symmetries

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez, R.; Miller, W.H.; Moore, C.B. (Department of Chemistry, University of California, and Chemical Sciences Division, Lawrence Berkeley Laboratory, Berkeley, California 94720 (United States)); Polik, W.F. (Department of Chemistry, Hope College, Holland, Michigan 49423 (United States))

    1993-07-15

    A previously developed random matrix/transition state theory (RM/TST) model for the probability distribution of state-specific unimolecular decay rates has been generalized to incorporate total angular momentum conservation and other dynamical symmetries. The model is made into a predictive theory by using a semiclassical method to determine the transmission probabilities of a nonseparable rovibrational Hamiltonian at the transition state. The overall theory gives a good description of the state-specific rates for the D₂CO → D₂ + CO unimolecular decay; in particular, it describes the dependence of the distribution of rates on total angular momentum J. Comparison of the experimental values with results of the RM/TST theory suggests that there is mixing among the rovibrational states.

  6. Representation theory of finite monoids

    CERN Document Server

    Steinberg, Benjamin

    2016-01-01

    This first text on the subject provides a comprehensive introduction to the representation theory of finite monoids. Carefully worked examples and exercises provide the bells and whistles for graduate accessibility, bringing a broad range of advanced readers to the forefront of research in the area. Highlights of the text include applications to probability theory, symbolic dynamics, and automata theory. Comfort with module theory, a familiarity with ordinary group representation theory, and the basics of Wedderburn theory are prerequisites for advanced graduate level study. Researchers in algebra, algebraic combinatorics, automata theory, and probability theory will find this text enriching with its thorough presentation of applications of the theory to these fields. Prior knowledge of semigroup theory is not expected for the diverse readership that may benefit from this exposition. The approach taken in this book is highly module-theoretic and follows the modern flavor of the theory of finite dimensional ...

  7. Theory of mind network activity is altered in subjects with familial liability for schizophrenia

    Science.gov (United States)

    Mohnke, Sebastian; Erk, Susanne; Schnell, Knut; Romanczuk-Seiferth, Nina; Schmierer, Phöbe; Romund, Lydia; Garbusow, Maria; Wackerhagen, Carolin; Ripke, Stephan; Grimm, Oliver; Haller, Leila; Witt, Stephanie H.; Degenhardt, Franziska; Tost, Heike; Heinz, Andreas; Meyer-Lindenberg, Andreas; Walter, Henrik

    2016-01-01

    As evidenced by a multitude of studies, abnormalities in Theory of Mind (ToM) and its neural processing might constitute an intermediate phenotype of schizophrenia. If so, neural alterations during ToM should be observable in unaffected relatives of patients as well, since they share a considerable amount of genetic risk. While behaviorally, impaired ToM function is confirmed meta-analytically in relatives, evidence on aberrant function of the neural ToM network is sparse and inconclusive. The present study therefore aimed to further explore the neural correlates of ToM in relatives of schizophrenia. A total of 297 controls and 63 unaffected first-degree relatives of patients with schizophrenia performed a ToM task during functional magnetic resonance imaging. Consistent with the literature relatives exhibited decreased activity of the medial prefrontal cortex. Additionally, increased recruitment of the right middle temporal gyrus and posterior cingulate cortex was found, which was related to subclinical paranoid symptoms in relatives. These results further support decreased medial prefrontal activation during ToM as an intermediate phenotype of genetic risk for schizophrenia. Enhanced recruitment of posterior ToM areas in relatives might indicate inefficiency mechanisms in the presence of genetic risk. PMID:26341902

  8. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  9. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  10. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
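
    The authors' 2-parameter model is not reproduced in this record; a toy version of intermittent, step-like updating (an invented illustration, not the model itself) holds the current estimate and steps only when a window of recent outcomes becomes too surprising under it:

        # Toy intermittent estimator for a Bernoulli parameter.
        import math
        import random

        def step_estimates(outcomes, window=20, threshold=3.0):
            """Yield the 'perceived probability' after each outcome."""
            est, history = 0.5, []
            for y in outcomes:
                history.append(y)
                recent = history[-window:]
                p_hat = min(max(sum(recent) / len(recent), 0.01), 0.99)
                # log-likelihood ratio: local rate vs. currently held estimate
                llr = sum(math.log((p_hat if z else 1 - p_hat) /
                                   (est if z else 1 - est)) for z in recent)
                if llr > threshold:
                    est = p_hat          # step to a new perceived probability
                yield est

        # Hidden parameter steps from 0.2 to 0.8 halfway through the sequence.
        random.seed(0)
        seq = [random.random() < 0.2 for _ in range(100)] + \
              [random.random() < 0.8 for _ in range(100)]
        estimates = list(step_estimates(seq))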

  11. Static instability of beam-type NEMS subjected to symmetric electrostatic actuation based on couple stress theory and Timoshenko beam theory

    Science.gov (United States)

    Shojaeian, M.; Zeighampour, H.

    2017-06-01

    The behavior of nanoelectromechanical systems subjected to symmetrical electrostatic actuation and symmetrical Casimir intermolecular force has been investigated. Two different phenomena (i.e. electromechanical bifurcation and electromechanical buckling) have been considered to explore the static electromechanical instability of such systems. The Timoshenko beam model has been employed to find the effect of shear deformation on these systems. Modified couple stress theory has been used to investigate size-dependency. Besides, the compressive and tensile residual stresses of nanobridges have been measured on the basis of electromechanical buckling. The governing equations and corresponding boundary conditions have been obtained by means of the principle of minimum potential energy. Finally, following validation of results, the effects of material length scale, length, shear deformation, beam geometry, and gap distance on the symmetric electromechanical behavior have been discussed and examined.

  12. Positron emission tomography/computerised tomography imaging in detecting and managing recurrent cervical cancer: systematic review of evidence, elicitation of subjective probabilities and economic modelling.

    Science.gov (United States)

    Meads, C; Auguste, P; Davenport, C; Małysiak, S; Sundar, S; Kowalska, M; Zapalska, A; Guest, P; Thangaratinam, S; Martin-Hirsch, P; Borowiack, E; Barton, P; Roberts, T; Khan, K

    2013-03-01

    Cancer of the uterine cervix is a common cause of mortality in women. After initial treatment women may be symptom free, but the cancer may recur within a few years. It is uncertain whether it is more clinically effective to survey asymptomatic women for signs of recurrence or to await symptoms or signs before using imaging. This project compared the diagnostic accuracy of imaging using positron emission tomography/computerised tomography (PET-CT) with that of imaging using CT or magnetic resonance imaging (MRI) alone and evaluated the cost-effectiveness of adding PET-CT as an adjunct to standard practice. Standard systematic review methods were used to obtain and evaluate relevant test accuracy and effectiveness studies. Databases searched included MEDLINE, EMBASE, Science Citation Index and The Cochrane Library. All databases were searched from inception to May 2010. Study quality was assessed using appropriately modified Quality Assessment of Diagnostic Accuracy Studies (QUADAS) criteria. Included were any studies of PET-CT, MRI or CT compared with the reference standard of histopathological findings or clinical follow-up in symptomatic women suspected of having recurrent or persistent cervical cancer and in asymptomatic women a minimum of 3 months after completion of primary treatment. Subjective elicitation of expert opinion was used to supplement diagnostic information needed for the economic evaluation. The effectiveness of treatment with chemotherapy, radiotherapy, chemoradiotherapy, radical hysterectomy and pelvic exenteration was systematically reviewed. Meta-analysis was carried out in RevMan 5.1 (The Cochrane Collaboration, The Nordic Cochrane Centre, Copenhagen, Denmark) and Stata version 11 (StataCorp LP, College Station, Texas, USA). A Markov model was developed to compare the relative cost-effectiveness using TreeAge Pro software version 2011 (TreeAge Software Inc., Evanston, IL, USA). For the diagnostic review, a total of 7524 citations were

  13. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  14. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject with regard to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  15. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  16. On Probability Domains

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2010-12-01

    Motivated by IF-probability theory (intuitionistic fuzzy), we study n-component probability domains in which each event represents a body of competing components and the range of a state represents a simplex S_n of n-tuples of possible rewards; the sum of the rewards is a number from [0,1]. For n=1 we get fuzzy events, for example a bold algebra, and the corresponding fuzzy probability theory can be developed within the category ID of D-posets (equivalently, effect algebras) of fuzzy sets and sequentially continuous D-homomorphisms. For n=2 we get IF-events, i.e., pairs (μ, ν) of fuzzy sets μ, ν ∈ [0,1]^X such that μ(x) + ν(x) ≤ 1 for all x ∈ X, but we order our pairs (events) coordinatewise. Hence the structure of IF-events (where (μ_1, ν_1) ≤ (μ_2, ν_2) whenever μ_1 ≤ μ_2 and ν_2 ≤ ν_1) is different and, consequently, the resulting IF-probability theory models a different principle. The category ID is cogenerated by I = [0,1] (objects of ID are subobjects of powers I^X), has nice properties, and basic probabilistic notions and constructions are categorical. For example, states are morphisms. We introduce the category S_nD cogenerated by S_n = {(x_1, x_2, ..., x_n) ∈ I^n : Σ_{i=1}^n x_i ≤ 1}, carrying the coordinatewise partial order, difference, and sequential convergence, and we show how basic probability notions can be defined within S_nD.

  17. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  18. The Role of Cooperative Learning Type Team Assisted Individualization to Improve the Students' Mathematics Communication Ability in the Subject of Probability Theory

    Science.gov (United States)

    Tinungki, Georgina Maria

    2015-01-01

    The importance of learning mathematics cannot be separated from its role in all aspects of life. Communicating ideas by using the language of mathematics is even more practical, systematic, and efficient. In order to overcome the difficulties of students who have insufficient understanding of mathematics material, good communications should be built in…

  19. Dangerous "spin": the probability myth of evidence-based prescribing - a Merleau-Pontyian approach.

    Science.gov (United States)

    Morstyn, Ron

    2011-08-01

    The aim of this study was to examine logical positivist statistical probability statements used to support and justify "evidence-based" prescribing rules in psychiatry when viewed from the major philosophical theories of probability, and to propose "phenomenological probability", based on Maurice Merleau-Ponty's philosophy of "phenomenological positivism", as a better clinical and ethical basis for psychiatric prescribing. The logical positivist statistical probability statements which are currently used to support "evidence-based" prescribing rules in psychiatry have little clinical or ethical justification when subjected to critical analysis from any of the major theories of probability, and they represent dangerous "spin" because they necessarily exclude the individual, intersubjective and ambiguous meaning of mental illness. A concept of "phenomenological probability" founded on Merleau-Ponty's philosophy of "phenomenological positivism" overcomes the clinically destructive "objectivist" and "subjectivist" consequences of logical positivist statistical probability and allows psychopharmacological treatments to be appropriately integrated into psychiatric treatment.

  20. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  1. A time-dependent wave packet approach to atom-diatom reactive collision probabilities - Theory and application to the H + H2(J = 0) system

    Science.gov (United States)

    Neuhauser, Daniel; Baer, Michael; Judson, Richard S.; Kouri, Donald J.

    1990-01-01

    This paper describes a new approach to the study of atom-diatom reactive collisions in three dimensions employing wave packets and the time-dependent Schroedinger equation. The method uses a projection operator approach to couple the inelastic and reactive portions of the total wave function and optical potentials to circumvent the necessity of using product arrangement coordinates. Reactive transition probabilities are calculated from the state resolved flux of the wave packet as it leaves the interaction region in the direction of the reactive arrangement channel. The present approach is used to obtain such vibrationally resolved probabilities for the three-dimensional H + H2 (J = 0) hydrogen exchange reaction, using a body-fixed system of coordinates.

  2. Information Fusion from the Point of View of Communication Theory; Fusing Information to Trade-Off the Resolution of Assessments Against the Probability of Mis-Assessment

    Science.gov (United States)

    2013-08-19

    compressing vector-valued measurements into scalars simply selects compression vectors from a set of a priori computed eigenvectors, according to... the large deviations solution is minimax. In other words, the LD solution has a constant error exponent under any a priori probabilities of the null and... that the optimum scalar statistic is an inner product between the measurement and an eigenvector of an a priori signal covariance. That is, the greedy

  3. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particular, there is no categorical distributive law between them. We introduce the powerdomain of indexed valuations, which modifies the usual probabilistic powerdomain to take more detailed account of where probabilistic choices are made. We show the existence of a distributive law between the powerdomain of indexed valuations... In the second part of the thesis we provide an operational reading of continuous valuations on certain domains (the distributive concrete domains of Kahn and Plotkin) through the model of probabilistic event structures, which reveals the computational intuition lying behind the mathematics.

  4. Superpositions of probability distributions.

    Science.gov (United States)

    Jizba, Petr; Kleinert, Hagen

    2008-09-01

    Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v=sigma;{2} play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much easier than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
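
    The mechanism described here can be checked numerically. The short Python sketch below compares a fixed-variance Gaussian with a superposition whose variance v is drawn from an exponential smearing distribution; the exponential choice is made for illustration only (it is the known special case yielding the heavy-tailed Laplace law, whose excess kurtosis is 3), not the paper's general construction.

        import random, statistics

        def sample_smeared_gaussian(n, draw_variance):
            """Draw from a superposition of zero-mean Gaussians whose
            variance v is itself drawn at random for every sample."""
            return [random.gauss(0.0, draw_variance() ** 0.5) for _ in range(n)]

        def excess_kurtosis(xs):
            m = statistics.fmean(xs)
            s2 = statistics.fmean([(x - m) ** 2 for x in xs])
            m4 = statistics.fmean([(x - m) ** 4 for x in xs])
            return m4 / s2 ** 2 - 3.0

        random.seed(0)
        plain = sample_smeared_gaussian(100_000, lambda: 1.0)
        smeared = sample_smeared_gaussian(100_000, lambda: random.expovariate(1.0))
        print(excess_kurtosis(plain))    # ~ 0 for a fixed-variance Gaussian
        print(excess_kurtosis(smeared))  # ~ 3: exponential smearing gives a Laplace law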

  5. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  6. Measure, integral and probability

    CERN Document Server

    Capiński, Marek

    2004-01-01

    Measure, Integral and Probability is a gentle introduction that makes measure and integration theory accessible to the average third-year undergraduate student. The ideas are developed at an easy pace in a form that is suitable for self-study, with an emphasis on clear explanations and concrete examples rather than abstract theory. For this second edition, the text has been thoroughly revised and expanded. New features include: · a substantial new chapter, featuring a constructive proof of the Radon-Nikodym theorem, an analysis of the structure of Lebesgue-Stieltjes measures, the Hahn-Jordan decomposition, and a brief introduction to martingales · key aspects of financial modelling, including the Black-Scholes formula, discussed briefly from a measure-theoretical perspective to help the reader understand the underlying mathematical framework. In addition, further exercises and examples are provided to encourage the reader to become directly involved with the material.

  7. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming framework that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  8. Probability measures on metric spaces

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    In this book, the author gives a cohesive account of the theory of probability measures on complete metric spaces (which is viewed as an alternative approach to the general theory of stochastic processes). After a general description of the basics of topology on the set of measures, the author discusses regularity, tightness, and perfectness of measures, properties of sampling distributions, and metrizability and compactness theorems. Next, he describes arithmetic properties of probability measures on metric groups and locally compact abelian groups. Covered in detail are notions such as decom

  9. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  10. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  11. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence. Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation. Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  12. Algebraic and stochastic coding theory

    CERN Document Server

    Kythe, Dave K

    2012-01-01

    Using a simple yet rigorous approach, Algebraic and Stochastic Coding Theory makes the subject of coding theory easy to understand for readers with a thorough knowledge of digital arithmetic, Boolean and modern algebra, and probability theory. It explains the underlying principles of coding theory and offers a clear, detailed description of each code. More advanced readers will appreciate its coverage of recent developments in coding theory and stochastic processes. After a brief review of coding history and Boolean algebra, the book introduces linear codes, including Hamming and Golay codes.

  13. Lévy processes in free probability

    OpenAIRE

    Barndorff-Nielsen, Ole E.; Thorbjørnsen, Steen

    2002-01-01

    This is the continuation of a previous article that studied the relationship between the classes of infinitely divisible probability measures in classical and free probability, respectively, via the Bercovici–Pata bijection. Drawing on the results of the preceding article, the present paper outlines recent developments in the theory of Lévy processes in free probability.

  14. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions, employing appropriate notions of conditional probability and paraconsistent updating via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.

  15. Frequentist probability and frequentist statistics

    Energy Technology Data Exchange (ETDEWEB)

    Neyman, J.

    1977-01-01

    A brief, nontechnical outline is given of the author's views on the "frequentist" theory of probability and the "frequentist" theory of statistics; their applications are illustrated in a few domains of study of nature. The phenomenon of apparently stable relative frequencies as the source of the frequentist theories of probability and statistics is taken up first. Three steps are set out: empirical establishment of apparently stable long-run relative frequencies of events judged interesting, as they develop in nature; guessing and then verifying the chance mechanism, the repeated operation of which produced the observed frequencies (this is a problem of frequentist probability theory); using the hypothetical chance mechanism of the phenomenon studied to deduce rules of adjusting our actions to the observations to ensure the highest "measure" of "success". Illustrations of the three steps are given. The theory of testing statistical hypotheses is sketched: basic concepts, simple and composite hypotheses, hypothesis tested, importance of the power of the test used, practical applications of the theory of testing statistical hypotheses. Basic ideas and an example of the randomization of experiments are discussed, and an "embarrassing" example is given. The problem of statistical estimation is sketched: example of an isolated problem, example of connected problems treated routinely, empirical Bayes theory, point estimation. The theory of confidence intervals is outlined: basic concepts, anticipated misunderstandings, construction of confidence intervals: regions of acceptance. Finally, the theory of estimation by confidence intervals or regions is considered briefly. 4 figures. (RWR)

  16. Quantitative metric theory of continued fractions

    Indian Academy of Sciences (India)

    Quantitative versions of the central results of the metric theory of continued fractions were given primarily by C. ... Continued fractions; ergodic averages; metric theory of numbers. Mathematics Subject ... of subsets of X, a probability measure μ on the measurable space (X, β) and a measurable self-map T of X that is also ...

  17. Probability on real Lie algebras

    CERN Document Server

    Franz, Uwe

    2016-01-01

    This monograph is a progressive introduction to non-commutativity in probability theory, summarizing and synthesizing recent results about classical and quantum stochastic processes on Lie algebras. In the early chapters, focus is placed on concrete examples of the links between algebraic relations and the moments of probability distributions. The subsequent chapters are more advanced and deal with Wigner densities for non-commutative couples of random variables, non-commutative stochastic processes with independent increments (quantum Lévy processes), and the quantum Malliavin calculus. This book will appeal to advanced undergraduate and graduate students interested in the relations between algebra, probability, and quantum theory. It also addresses a more advanced audience by covering other topics related to non-commutativity in stochastic calculus, Lévy processes, and the Malliavin calculus.

  18. Choice-centred versus subject-centred theories in the social sciences. The influence of simplification on explananda

    NARCIS (Netherlands)

    Lindenberg, S

    The idea, originating in economics and forcefully brought back by Goldthorpe, that rational choice theory and large-scale data analysis are symbiotic, is very attractive. Rational choice is in dire need of explananda which can be provided by large-scale data analysis, while large-scale data analysis

  19. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
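
    One classic example of this phenomenon, the matching (derangement) problem, is easy to check by simulation; the Python sketch below is an illustrative aside, not part of the article:

        import math, random

        def no_fixed_point_probability(n, trials=200_000):
            """Estimate the probability that a uniformly random permutation
            of n items leaves no item in its original position."""
            hits = 0
            for _ in range(trials):
                perm = list(range(n))
                random.shuffle(perm)
                if all(perm[i] != i for i in range(n)):
                    hits += 1
            return hits / trials

        random.seed(42)
        print(no_fixed_point_probability(10))   # approaches 1/e as n grows
        print(1 / math.e)                        # 0.36787944...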

  20. Quantum Bayesianism as the basis of general theory of decision-making.

    Science.gov (United States)

    Khrennikov, Andrei

    2016-05-28

    We discuss the subjective probability interpretation of the quantum-like approach to decision making and, more generally, to cognition. Our aim is to adopt the subjective probability interpretation of quantum mechanics, quantum Bayesianism (QBism), to serve quantum-like modelling and applications of quantum probability outside of physics. We analyse the classical and quantum probabilistic schemes of probability update, learning and decision-making and emphasize the role of Jeffrey conditioning and its quantum generalizations. Classically, this type of conditioning and the corresponding probability update is based on the formula of total probability, one of the basic laws of classical probability theory. © 2016 The Author(s).
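
    For reference, classical Jeffrey conditioning generalizes the law of total probability on which the record's argument rests: when new evidence redistributes probability over a partition {B_i} without making any single B_i certain,

        P(A) = \sum_i P(A \mid B_i)\, P(B_i)  \qquad \text{(total probability)}

        P_{\mathrm{new}}(A) = \sum_i P_{\mathrm{old}}(A \mid B_i)\, P_{\mathrm{new}}(B_i)  \qquad \text{(Jeffrey conditioning)}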

  1. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  2. Agreeing Probability Measures for Comparative Probability Structures

    NARCIS (Netherlands)

    P.P. Wakker (Peter)

    1981-01-01

    textabstractIt is proved that fine and tight comparative probability structures (where the set of events is assumed to be an algebra, not necessarily a σ-algebra) have agreeing probability measures. Although this was often claimed in the literature, all proofs the author encountered are not valid

  3. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  4. Travel Mode Use, Travel Mode Shift and Subjective Well-Being: Overview of Theories, Empirical Findings and Policy Implications

    NARCIS (Netherlands)

    Ettema, D.F.; Friman, M.; Gärling, Tommy; Olsson, Lars

    2016-01-01

    This chapter discusses how travel by different travel modes is related to primarily subjective well-being but also to health or physical well-being. Studies carried out in different geographic contexts consistently show that satisfaction with active travel modes is higher than travel by car and

  5. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  6. Stationary algorithmic probability

    National Research Council Canada - National Science Library

    Müller, Markus

    2010-01-01

    ..., since their actual values depend on the choice of the universal reference computer. In this paper, we analyze a natural approach to eliminate this machine-dependence. Our method is to assign algorithmic probabilities to the different...

  7. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  8. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  9. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords: Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  10. Probability theory: the logic of science

    National Research Council Canada - National Science Library

    Jaynes, E. T; Bretthorst, G. Larry

    2005-01-01


  11. Toward an Objectivistic Theory of Probability

    Science.gov (United States)

    1956-01-01


  12. Converting happiness theory into (interior) architectural design missions. Designing for subjective well-being in residential care centers.

    OpenAIRE

    STEVENS, Ruth; Petermans, Ann; Vanrie, Jan

    2014-01-01

    Subjective well-being (SWB) is an emerging research topic in the field of design sciences, whereby various design researchers focus on the key question ‘how to design for SWB’. Across different design disciplines, definitions of SWB are emerging and design models and strategies are being developed in an effort to enable designers to increase users’ SWB. However, a clear image of how to design an (interior) architectural environment with the purpose of increasing people’s level of subjectiv...

  13. Situated technology in reproductive health care: Do we need a new theory of the subject to promote person-centred care?

    Science.gov (United States)

    Stankovic, Biljana

    2017-01-01

    Going through reproductive experiences (especially pregnancy and childbirth) in contemporary Western societies almost inevitably involves interaction with medical practitioners and various medical technologies in an institutional context. This has important consequences for women as embodied subjects. A critical appraisal of these consequences, coming predominantly from feminist scholarship, relied on a problematic theory of both technology and the subject, which are in contemporary approaches no longer considered as given, coherent and well-individualized wholes, but as complex constellations that are locally situated and that can only be described empirically. In this study, we will be relying on developments in phenomenological theory to reconceptualize women as technologically mediated embodied subjects, and on new paradigms in the philosophy of technology and STS to reconstruct medical technology as situated, with the aim of reconceptualizing their relationship and exploring different possibilities for the mediating role of medical technology. It will be argued that the technologization of female reproductive processes and its alienating consequences for women are not necessary or directly interrelated. The role of technology varies from case to case and depends mainly on the nontechnological and relational aspects of the institutional context, in which medical practitioners play a decisive role. © 2016 John Wiley & Sons Ltd.

  14. The effects of subjective norms on behaviour in the theory of planned behaviour: a meta-analysis.

    Science.gov (United States)

    Manning, Mark

    2009-12-01

    A meta-analysis investigated the effects of perceived injunctive (IN) and descriptive (DN) norms on behaviour (BEH) within the theory of planned behaviour (TPB) in a sample of 196 studies. Two related correlation matrices (pairwise and listwise) were synthesized from the data and used to model the TPB relations with path analyses. Convergent evidence indicated that the relation between DN and BEH was stronger than the relation between IN and BEH. Evidence also suggested a significant direct relation between DN and BEH in the context of TPB. A suppressor effect of IN on DN in its relation with BEH was also noted. Moderator analyses indicated that the DN-BEH relation was stronger when there was more time between measures of cognition and behaviour, when behaviours were not socially approved, more socially motive and more pleasant; results were mixed in the case of the IN-BEH relation. Results imply that IN and DN are conceptually different constructs.

  15. Usability of a theory of visual attention (TVA) for parameter-based measurement of attention I: evidence from normal subjects

    DEFF Research Database (Denmark)

    Finke, Kathrin; Bublak, Peter; Krummenacher, Joseph

    2005-01-01

    The present study investigated the usability of whole and partial report of briefly displayed letter arrays as a diagnostic tool for the assessment of attentional functions. The tool is based on Bundesen's (1990, 1998, 2002; Bundesen et al., 2005) theory of visual attention (TVA), which assumes four separable attentional components: processing speed, working memory storage capacity, spatial distribution of attention, and top-down control. A number of studies (Duncan et al., 1999; Habekost & Bundesen, 2003; Peers et al., 2005) have already demonstrated the clinical relevance... clinical tests measuring similar constructs. The empirical independence of the four TVA parameters is suggested by nonsignificant or, in the case of processing speed and working memory storage capacity, only modest correlations between the parameter values.

  16. Improvement in cognitive and affective theory of mind with observation and imitation treatment in subjects with schizophrenia

    Directory of Open Access Journals (Sweden)

    Maria C. Pino

    2015-06-01

    Full Text Available Objective: the main objective of this study is to consider Theory of Mind (ToM), i.e. the ability to perceive other people in terms of thinking, believing and emotions, as a target for effective rehabilitative intervention, using Emotion and ToM Imitation Training (ETIT), aimed at improving social cognition and social functioning in schizophrenia. ToM impairment is a key feature of schizophrenia. According to recent literature, ToM is a multidimensional process requiring at least two components: cognitive and affective. Cognitive ToM seems to be a prerequisite for affective ToM, which requires intact empathic ability. Method: seven patients with schizophrenia completed ETIT treatment and were compared to 7 patients who participated in Problem Solving Training (PST). The participants were assessed at pre and post treatment on measures of cognitive (Advanced Theory of Mind Task and Social Situation Test) and affective (Emotion Attribution Task and Eyes Task) ToM, and also empathy (Empathy Quotient). Results: our results showed that when compared to the control group, ETIT participants improved in the three social cognition components evaluated (cognitive ToM, affective ToM and empathy). Improvement in cognitive and affective ToM was found within the ETIT group pre and post treatment. Conclusions: action observation and imitation could be important goals for future "low cost" rehabilitation treatment in several disorders in which the deficit of social cognition is considered as "core" to the disease. This represents a new perspective in the rehabilitation field.

  17. Probability, Information and Statistical Physics

    Science.gov (United States)

    Kuzemsky, A. L.

    2016-03-01

    In this short survey review we discuss foundational issues of the probabilistic approach to information theory and statistical mechanics from a unified standpoint. Emphasis is on the interrelations between theories. The basic aim is tutorial, i.e. to give a basic introduction to the analysis and applications of probabilistic concepts in the description of various aspects of complexity and stochasticity. We consider probability as a foundational concept in statistical mechanics and review selected advances in the theoretical understanding of the interrelation of probability, information and statistical description with regard to basic notions of the statistical mechanics of complex systems. The review also includes a synthesis of past and present research and a survey of methodology. The purpose of this terse overview is to discuss and partially describe those probabilistic methods and approaches that are used in statistical mechanics, with the aim of making these ideas easier to understand and to apply.

  18. Probability, Statistics, and Computational Science

    OpenAIRE

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient infe...

  19. What Influences Chinese Adolescents’ Choice Intention between Playing Online Games and Learning? Application of Theory of Planned Behavior with Subjective Norm Manipulated as Peer Support and Parental Monitoring

    Science.gov (United States)

    Wang, Jia; Liu, Ru-De; Ding, Yi; Liu, Ying; Xu, Le; Zhen, Rui

    2017-01-01

    This study investigated how and why Chinese adolescents choose between playing online games and doing homework, using the model of the theory of planned behavior (TPB) in which the subjective norm was manipulated as two sub-elements (peer support and parental monitoring). A total of 530 students from an elementary school and a middle school in China were asked to complete the measures assessing two predictors of TPB: attitude and perceived behavioral control (PBC). Next, they completed a survey about their choice intention between playing an online game and doing homework in three different situations, wherein a conflict between playing online games and doing homework was introduced and subjective norm was manipulated as peers supporting and parents objecting to playing online games. The results showed that adolescents’ attitude and PBC, as well as the perception of obtaining or not obtaining support from their peers and caregivers (manipulated subjective norm), significantly influenced their choice intention in online gaming situations. These findings contribute to the understanding of the factors affecting adolescents’ online gaming, which has been a concern of both caregivers and educators. With regard to the theoretical implications, this study extended previous work by providing evidence that TPB can be applied to analyze choice intention. Moreover, this study illuminated the effects of the separating factors of subjective norm on choice intention between playing online games and studying. PMID:28458649


  1. What Are Probability Surveys?

    Science.gov (United States)

    The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.

  2. Efficient probability sequence

    OpenAIRE

    Regnier, Eva

    2014-01-01

    A probability sequence is an ordered set of probability forecasts for the same event. Although single-period probabilistic forecasts and methods for evaluating them have been extensively analyzed, we are not aware of any prior work on evaluating probability sequences. This paper proposes an efficiency condition for probability sequences and shows properties of efficient forecasting systems, including memorylessness and increasing discrimination. These results suggest tests for efficiency and ...

  3. Efficient probability sequences

    OpenAIRE

    Regnier, Eva

    2014-01-01

    DRMI working paper. A probability sequence is an ordered set of probability forecasts for the same event. Although single-period probabilistic forecasts and methods for evaluating them have been extensively analyzed, we are not aware of any prior work on evaluating probability sequences. This paper proposes an efficiency condition for probability sequences and shows properties of efficient forecasting systems, including memorylessness and increasing discrimination. These res...

  4. Selected papers on probability and statistics

    CERN Document Server

    2009-01-01

    This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.

  5. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  6. The determinants of physician attitudes and subjective norms toward drug information sources: modification and test of the theory of reasoned action.

    Science.gov (United States)

    Gaither, C A; Bagozzi, R P; Ascione, F J; Kirking, D M

    1997-10-01

    To improve upon the theory of reasoned action and apply it to pharmaceutical research, we investigated the effects of relevant appraisals, attributes, and past behavior of physicians on the use of drug information sources. We also examined the moderating effects of practice characteristics. A mail questionnaire asked HMO physicians to evaluate seven common sources of drug information on general appraisals (degree of usefulness and ease of use), specific attributes (availability, quality of information on harmful effects and on drug efficacy), and past behavior when searching for information on a new, simulated H2 antagonist agent. Semantic differential scales were used to measure each appraisal, attribute and past behavior. Information was also collected on practice characteristics. Findings from 108/200 respondents indicated that appraisals and attributes were useful determinants of attitudes and subjective norms toward use. Degree of usefulness and quality of information on harmful effects were important predictors of attitudes toward use for several sources of information. Ease of use and degree of usefulness were important predictors of subjective norms toward use. In many cases, moderating effects of practice characteristics were in opposing directions. Past behavior had significant direct effects on attitudes toward the PDR. The findings suggest ways to improve the usefulness of the theory of reasoned action as a model of decision-making. We also propose practical guidelines that can be used to improve the types of drug information sources used by physicians.

  7. Measure theory and integration

    CERN Document Server

    De Barra, G

    2003-01-01

    This text approaches integration via measure theory as opposed to measure theory via integration, an approach which makes it easier to grasp the subject. Apart from its central importance to pure mathematics, the material is also relevant to applied mathematics and probability, with proofs of the mathematics set out clearly and in considerable detail. Numerous worked examples necessary for teaching and learning at undergraduate level constitute a strong feature of the book, and after studying statements of results of the theorems, students should be able to attempt the 300 problem exercises whi

  8. Oxygen boundary crossing probabilities.

    Science.gov (United States)

    Busch, N A; Silver, I A

    1987-01-01

    The probability that an oxygen particle will reach a time dependent boundary is required in oxygen transport studies involving solution methods based on probability considerations. A Volterra integral equation is presented, the solution of which gives directly the boundary crossing probability density function. The boundary crossing probability is the probability that the oxygen particle will reach a boundary within a specified time interval. When the motion of the oxygen particle may be described as strongly Markovian, then the Volterra integral equation can be rewritten as a generalized Abel equation, the solution of which has been widely studied.

  9. In All Probability, Probability is not All

    Science.gov (United States)

    Helman, Danny

    2004-01-01

    The national lottery is often portrayed as a game of pure chance with no room for strategy. This misperception seems to stem from the application of probability instead of expectancy considerations, and can be utilized to introduce the statistical concept of expectation.
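
    The distinction the article draws can be made concrete: the chance of winning says nothing about whether a ticket is worth its price, which is a question of expectation. A minimal Python sketch, with purely hypothetical prize figures:

        def expected_net_gain(ticket_price, prizes):
            """Expected net gain of one ticket, where prizes is a list of
            (probability, payout) pairs. All figures here are hypothetical."""
            return sum(p * payout for p, payout in prizes) - ticket_price

        # Illustrative 6-of-49 style odds with made-up payouts.
        prizes = [(1 / 13_983_816, 5_000_000),   # jackpot
                  (1 / 54_201, 2_500),           # match five
                  (1 / 1_032, 50)]               # match four
        print(expected_net_gain(1.0, prizes))    # negative: an unfavourable bet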

  10. INTRODUCTION IN TECHNOLOGY CONCEPTUALIZATION THE SUBJECT DOMAIN OF SOCIOLOGY: BASES OF POSTULATION OF THE NUCLEUS OF THE THEORY (in the example of relationship/kinship

    Directory of Open Access Journals (Sweden)

    A. Ju. Ivanov

    2017-01-01

    Full Text Available This is the first of two articles whose aim is to familiarize the uninitiated reader with the possibilities of applying the mathematical methods developed in the scientific direction of "Conceptual analysis and design of systems of organizational management" (CAD SOM), which are designed to solve various problems, both technical and in the humanities. At the heart of this methodological approach is the process of conceptualization, understood as a theoretical study of the qualitative aspects of a selected subject domain by means of mathematical forms (axiomatic theories fixing the connections of logical derivability between the concepts that characterize this subject area). The axiomatic theory so designed, the conceptual scheme, is the basis for building database structures and decision-making processes, for generating the variety of phenomena of the subject domain, for analysing the structure and genesis of the domain, and for other tasks. Given the non-mathematical preparation of the reader, the content of the methods is illustrated by conceptualizing a conceptually simple subject domain, the domain of kinship relations, and by choosing one of the simplest goals of conceptualization: structuring the domain and building the variety of its phenomena. For better understanding, the article describes and explains the stages of the conceptualization technology in a lightweight form, together with the main difficulties of postulation and typical solutions. This first article explains the issues of postulation in constructing conceptual schemes. These include, in particular, the definition of the domain boundaries, the selection of the basic concepts and the precise definition of their content. Considerable attention is paid to the impact of these decisions on the results of conceptualization.

  11. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.
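
    Since the report culminates in Bayes' theorem, it may help to state it here in its usual form: for a partition A_1, ..., A_n of hypotheses and observed evidence B with P(B) > 0,

        P(A_j \mid B) = \frac{P(B \mid A_j)\, P(A_j)}{\sum_{i=1}^{n} P(B \mid A_i)\, P(A_i)}

    The denominator is the law of total probability, so the posterior probabilities P(A_j | B) automatically sum to one.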

  12. Applied probability models with optimization applications

    CERN Document Server

    Ross, Sheldon M

    1992-01-01

    Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous time optimization models, much more. Problems and references at chapter ends. ""Excellent introduction."" - Journal of the American Statistical Association. Bibliography. 1970 edition.

  13. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

    compensation (a $10 lottery) on Amazon Mechanical Turk, an online platform hosted on Amazon.com [31]. All of the participants stated that they were native...

  14. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    Full Text Available The influence of “Grundbegriffe” by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory “calling for” an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, the dual maps to random variables) have very different “mathematical nature”. Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events, elementary category theory, and covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing “fractions” of classical random events, and we upgrade the notions of probability measure and random variable.
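
    For readers unfamiliar with them, the Łukasiewicz operations mentioned in the abstract act on [0, 1]-valued events f, g as truncated sum, truncated product and complement:

        f \oplus g = \min(1,\, f + g), \qquad f \odot g = \max(0,\, f + g - 1), \qquad \neg f = 1 - f

    Restricted to {0, 1}-valued indicator functions these reduce to the Boolean union, intersection and complement, which is what makes the embedding of classical events into "fractions" of events possible.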

  15. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
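    As a rough illustration of the idea (our sketch, not the authors' exact construction), the pointwise level of order-statistic intervals can be calibrated by simulation until the joint coverage reaches 1 - α; all names and parameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def simultaneous_npp_intervals(n, alpha=0.05, n_sim=20000):
    """Calibrate order-statistic intervals for a normal probability plot
    so that, under normality, all n points fall inside simultaneously
    with probability ~ 1 - alpha (simulation-based sketch)."""
    # Order statistics of samples drawn under H0: N(0, 1).
    sims = np.sort(rng.standard_normal((n_sim, n)), axis=1)
    # Widen the pointwise level until the joint coverage reaches 1 - alpha.
    for gamma in np.linspace(alpha, alpha / 50, 200):
        lo = np.quantile(sims, gamma / 2, axis=0)
        hi = np.quantile(sims, 1 - gamma / 2, axis=0)
        joint = np.mean(np.all((sims >= lo) & (sims <= hi), axis=1))
        if joint >= 1 - alpha:
            return lo, hi
    return lo, hi

# Usage: standardize the observed sample and check all points at once.
# (This sketch ignores the effect of estimating the mean and sd, which
# the paper's intervals handle properly.)
x = np.sort(rng.standard_normal(30))          # observed sample (toy data)
z = (x - x.mean()) / x.std(ddof=1)            # standardized order statistics
lo, hi = simultaneous_npp_intervals(len(x))
print("compatible with normality:", bool(np.all((z >= lo) & (z <= hi))))
```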

  16. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications....

  17. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...
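    A concrete instance (our illustration, not reproduced from the paper): for the multinomial logit ARUM, the CPGF is the log-sum-exp function of the utilities, and its gradient recovers the familiar logit choice probabilities.

```python
import numpy as np

def cpgf_logit(u):
    """CPGF of the multinomial logit model: G(u) = log sum_j exp(u_j)."""
    m = u.max()                             # stabilize the exponentials
    return m + np.log(np.sum(np.exp(u - m)))

def choice_probabilities(u, eps=1e-6):
    """Gradient of the CPGF via central differences; for logit this
    reproduces the softmax probabilities exp(u_j) / sum_k exp(u_k)."""
    grad = np.zeros_like(u, dtype=float)
    for j in range(len(u)):
        e = np.zeros_like(u, dtype=float)
        e[j] = eps
        grad[j] = (cpgf_logit(u + e) - cpgf_logit(u - e)) / (2 * eps)
    return grad

u = np.array([1.0, 0.5, -0.2])              # systematic utilities
print(choice_probabilities(u))              # ~ softmax(u), sums to 1
```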

  18. Statistics and Probability

    Directory of Open Access Journals (Sweden)

    Laktineh Imad

    2010-04-01

    Full Text Available This course constitutes a brief introduction to probability applications in high energy physics. First the mathematical tools related to the different probability concepts are introduced. The probability distributions which are commonly used in high energy physics and their characteristics are then shown and commented on. The central limit theorem and its consequences are analysed. Finally some numerical methods used to produce different kinds of probability distributions are presented. The full article (17 p.) corresponding to this lecture is written in French and is provided in the proceedings of the book SOS 2008.

  19. Topos theory

    CERN Document Server

    Johnstone, PT

    2014-01-01

    Focusing on topos theory's integration of geometric and logical ideas into the foundations of mathematics and theoretical computer science, this volume explores internal category theory, topologies and sheaves, geometric morphisms, other subjects. 1977 edition.

  20. [Subjective theories on quality of life and health in old age : An explorative study with nursing home residents and their nursing personnel].

    Science.gov (United States)

    Kada, Olivia; Hedenik, Marina; Griesser, Anna; Mark, Anna-Theresa; Trost, Julia

    2017-01-25

    The terms "quality of life" and "health" are often used interchangeably even though there are indications to suggest that they are distinct constructs. Nevertheless, studies which would help to understand the difference between these constructs on the level of subjective theories of nursing home residents are lacking. Because nursing personnel can essentially contribute to the quality of life of residents, the comparison of subjective theories from residents and from nursing personnel can help to detect and understand potential discrepancies. Semi-structured interviews were conducted with 31 pairs of residents and their nursing personnel. Based on the approach of Fliege and Filipp (2000) one half of the respondents answered the questions using the term "quality of life" and the other half using the term "health". In addition, quality of life and health had to be rated on a visual analogue scale (VAS), whereby residents rated themselves and nurses rated the corresponding resident. Data were analyzed using qualitative content analysis in a team-based approach. Following a mixed methods approach the deductively developed main categories and the inductively developed subcategories were quantified and statistically analyzed together with the VAS ratings. Quality of life was more strongly associated with psychological, social and environmental aspects, whereas health more strongly evoked thoughts on physical functioning. This effect was stronger in nursing personnel, which can be explained by their role concept. In future scientific studies the terms should be used accurately, as they elicit different associations. The term "quality of life" seems to be more suitable to adequately reflect the adaptability of elderly people.

  1. Hegel’s Theory of Moral Action, its Place in his System and the ‘Highest’ Right of the Subject

    Directory of Open Access Journals (Sweden)

    David Rose

    2007-12-01

    Full Text Available There is at present, amongst Hegel scholars and in the interpretative discussions of Hegel's social and political theories, the flavour of old-style 'apology' for his liberal credentials, as though there exists a real need to prove he holds basic liberal views palatable to the hegemonic, contemporary political worldview. Such an approach is no doubt motivated by the need to reconstruct what is left of the modern moral conscience when Hegel has finished discussing the flaws and contradictions of the Kantian model of moral judgement. The main claim made in the following pages is that the critique of 'subjective' moralities is neither the sole nor even the main reason for the adoption of an immanent doctrine of ethics. This paper will look to Hegel's mature theory of action as motivating the critique of transcendentalism rather than merely filling in the hole left when one rejects Kant, and it will discuss what the consequences of this approach are for the role of the moral conscience within the political sphere, arguing that Hegel's own conditions of free action would not be met unless the subjective moral conscience was operative in the rational state.

  2. Elapsed decision time affects the weighting of prior probability in a perceptual decision task

    Science.gov (United States)

    Hanks, Timothy D.; Mazurek, Mark E.; Kiani, Roozbeh; Hopp, Elizabeth; Shadlen, Michael N.

    2012-01-01

    Decisions are often based on a combination of new evidence with prior knowledge of the probable best choice. Optimal combination requires knowledge about the reliability of evidence, but in many realistic situations, this is unknown. Here we propose and test a novel theory: the brain exploits elapsed time during decision formation to combine sensory evidence with prior probability. Elapsed time is useful because (i) decisions that linger tend to arise from less reliable evidence, and (ii) the expected accuracy at a given decision time depends on the reliability of the evidence gathered up to that point. These regularities allow the brain to combine prior information with sensory evidence by weighting the latter in accordance with reliability. To test this theory, we manipulated the prior probability of the rewarded choice while subjects performed a reaction-time discrimination of motion direction using a range of stimulus reliabilities that varied from trial to trial. The theory explains the effect of prior probability on choice and reaction time over a wide range of stimulus strengths. We found that prior probability was incorporated into the decision process as a dynamic bias signal that increases as a function of decision time. This bias signal depends on the speed-accuracy setting of human subjects, and it is reflected in the firing rates of neurons in the lateral intraparietal cortex (LIP) of rhesus monkeys performing this task. PMID:21525274
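    A toy simulation (ours, not the authors' model) illustrates the first regularity: when evidence reliability varies across trials, decisions that take longer are, on average, less accurate, so elapsed time itself carries information that can justify weighting the prior more heavily late in a decision.

```python
import numpy as np

rng = np.random.default_rng(1)

def trial(drift, bound=3.0, dt=0.01, sigma=1.0, t_max=10.0):
    """Accumulate noisy evidence to a bound; return (decision time, correct?)."""
    x, t = 0.0, 0.0
    while abs(x) < bound and t < t_max:
        x += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return t, x > 0  # drift > 0, so x > 0 is the correct choice

# Mix reliable (high-drift) and unreliable (low-drift) trials.
times, correct = zip(*(trial(drift=rng.choice([0.2, 1.5])) for _ in range(4000)))
times, correct = np.array(times), np.array(correct)

# Accuracy falls with elapsed decision time: late decisions come mostly
# from low-reliability trials, which is why a time-growing prior bias helps.
for lo, hi in [(0, 1), (1, 2), (2, 4), (4, 10)]:
    sel = (times >= lo) & (times < hi)
    if sel.any():
        print(f"t in [{lo},{hi}): accuracy = {correct[sel].mean():.2f}")
```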

  3. Climbing Mount Probable

    Science.gov (United States)

    Harper, Marc Allen

    2009-01-01

    This work attempts to explain the relationships between natural selection, information theory, and statistical inference. In particular, a geometric formulation of information theory known as information geometry and its deep connections to evolutionary game theory inform the role of natural selection in evolutionary processes. The goals of this…

  4. Probability on compact Lie groups

    CERN Document Server

    Applebaum, David

    2014-01-01

    Probability theory on compact Lie groups deals with the interaction between “chance” and “symmetry,” a beautiful area of mathematics of great interest in its own right but which is now also finding increasing applications in statistics and engineering (particularly with respect to signal processing). The author gives a comprehensive introduction to some of the principal areas of study, with an emphasis on applicability. The most important topics presented are: the study of measures via the non-commutative Fourier transform, existence and regularity of densities, properties of random walks and convolution semigroups of measures, and the statistical problem of deconvolution. The emphasis on compact (rather than general) Lie groups helps readers to get acquainted with what is widely seen as a difficult field but which is also justified by the wealth of interesting results at this level and the importance of these groups for applications. The book is primarily aimed at researchers working in probability, s...

  5. Analytic methods in applied probability in memory of Fridrikh Karpelevich

    CERN Document Server

    Suhov, Yu M

    2002-01-01

    This volume is dedicated to F. I. Karpelevich, an outstanding Russian mathematician who made important contributions to applied probability theory. The book contains original papers focusing on several areas of applied probability and its uses in modern industrial processes, telecommunications, computing, mathematical economics, and finance. It opens with a review of Karpelevich's contributions to applied probability theory and includes a bibliography of his works. Other articles discuss queueing network theory, in particular, in heavy traffic approximation (fluid models). The book is suitable

  6. On Randomness and Probability

    Indian Academy of Sciences (India)

    casinos and gambling houses? How does one interpret a statement like "there is a 30 per cent chance of rain tonight" - a statement we often hear on the news? Such questions arise in the mind of every student when she/he is taught probability as part of mathematics. Many students who go on to study probability and ...

  7. Dynamic update with probabilities

    NARCIS (Netherlands)

    Van Benthem, Johan; Gerbrandy, Jelle; Kooi, Barteld

    2009-01-01

    Current dynamic-epistemic logics model different types of information change in multi-agent scenarios. We generalize these logics to a probabilistic setting, obtaining a calculus for multi-agent update with three natural slots: prior probability on states, occurrence probabilities in the relevant

  8. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  9. Representing Uncertainty by Probability and Possibility

    DEFF Research Database (Denmark)

    Uncertain parameters in modeling are usually represented by probability distributions reflecting either the objective uncertainty of the parameters or the subjective belief held by the model builder. This approach is particularly suited for representing the statistical nature or variance of uncertain parameters...

  10. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  11. Investigating multiple components of attitude, subjective norm, and perceived control: an examination of the theory of planned behaviour in the exercise domain.

    Science.gov (United States)

    Rhodes, Ryan E; Courneya, Kerry S

    2003-03-01

    The presence of two subcomponents within each theory of planned behaviour (TPB) concept of attitude (affective and instrumental), subjective norm (injunctive and descriptive), and PBC (self-efficacy and controllability) has been widely supported. However, research has not examined whether the commonality of variance between these components (i.e. a general factor) or the specificity of variance within the subcomponents influences intention and behaviour. Therefore, the purpose of this study was to examine the optimal conceptualization of either two subcomponents or a general common factor for each TPB concept within an omnibus model. Further, to test whether conceptualizations may differ by population even within the same behavioural domain, we examined these research questions with 300 undergraduates (M age = 20) and 272 cancer survivors (M age = 61) for exercise behaviour. Results identified that a general subjective norm factor was an optimal predictive conceptualization over two separate injunctive and descriptive norm components. In contrast, a specific self-efficacy component, and not controllability or a general factor of PBC, predicted intention optimally for both samples. Finally, optimal models of attitude differed between the populations, with a general factor best predicting intention for undergraduates but only affective attitude influencing intention for cancer survivors. The findings of these studies underscore the possibility for optimal tailored interventions based on population and behaviour. Finally, a discussion of the theoretical ambiguity of the PBC concept led to suggestions for future research and possible re-conceptualization.

  12. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
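    As a minimal example of the maximum-likelihood principle mentioned above (our illustration, not taken from the chapter), the transition probabilities of a two-state Markov chain are estimated from observed transition counts.

```python
import numpy as np

# Observed state sequence of a two-state Markov chain (toy data).
seq = [0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 1, 0, 0, 0, 1]

# Count transitions i -> j.
counts = np.zeros((2, 2))
for a, b in zip(seq[:-1], seq[1:]):
    counts[a, b] += 1

# Maximum-likelihood estimate: row-normalized transition counts.
P_hat = counts / counts.sum(axis=1, keepdims=True)
print(P_hat)   # P_hat[i, j] ~ P(next = j | current = i)
```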

  13. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study.* Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections

  14. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  15. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are included especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  16. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  17. The control processes and subjective well-being of Chinese teachers: Evidence of convergence with and divergence from the key propositions of the motivational theory of life-span development

    Directory of Open Access Journals (Sweden)

    Wan-Chi Wong

    2014-05-01

    Full Text Available An analytical review of the motivational theory of life-span development reveals that this theory has undergone a series of elegant theoretical integrations. Its claim to universality nonetheless brings forth unresolved controversies. With the purpose of scrutinizing the key propositions of this theory, an empirical study was designed to examine the control processes and subjective well-being of Chinese teachers (N = 637). The OPS-Scales (Optimization in Primary and Secondary Control Scales for the Domain of Teaching) were constructed to assess patterns of control processes. Three facets of subjective well-being were investigated with the Positive and Negative Affect Schedule, the Life Satisfaction Scale, and the Subjective Vitality Scale. The results revealed certain aspects of alignment with and certain divergences from the key propositions of the motivational theory of life-span development. Neither primacy of primary control nor primacy of secondary control was clearly supported. Notably, using different criteria for subjective well-being yielded different subtypes of primary and secondary control as predictors. The hypothesized life-span trajectories of primary and secondary control received limited support. To advance the theory in this area, we recommend incorporating Lakatos’ ideas about sophisticated falsification by specifying the hard core of the motivational theory of life-span development and articulating new auxiliary hypotheses.

  18. Entropy in probability and statistics

    Energy Technology Data Exchange (ETDEWEB)

    Rolke, W.A.

    1992-01-01

    The author develops a theory of entropy, where entropy is defined as the Legendre-Fenchel transform of the logarithmic moment generating function of a probability measure on a Banach space. A variety of properties relating the probability measure and its entropy are proven. It is shown that the entropy of a large class of stochastic processes can be approximated by the entropies of the finite-dimensional distributions of the process. For several types of measures the author finds explicit formulas for the entropy, for example for stochastic processes with independent increments and for Gaussian processes. For the entropy of Markov chains, evaluated at the observations of the process, the author proves a central limit theorem. Theorems relating weak convergence of probability measures on a finite dimensional space and pointwise convergence of their entropies are developed and then used to give a new proof of Donsker's theorem. Finally the use of entropy in statistics is discussed. The author shows the connection between entropy and Kullback's minimum discrimination information. A central limit theorem yields a test for the independence of a sequence of observations.
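    In symbols (our rendering of the standard definition the abstract refers to): for a probability measure with logarithmic moment generating function Λ(θ), the entropy is its Legendre-Fenchel transform; for a Gaussian measure this yields the familiar quadratic form.

```latex
\[
  \Lambda(\theta) = \log \mathbb{E}\, e^{\theta X}, \qquad
  I(x) = \sup_{\theta \in \mathbb{R}} \bigl( \theta x - \Lambda(\theta) \bigr).
\]
% Example: for X ~ N(mu, sigma^2), Lambda(theta) = mu*theta + sigma^2*theta^2/2;
% the supremum is attained at theta* = (x - mu)/sigma^2, giving
\[
  I(x) = \frac{(x - \mu)^2}{2\sigma^2}.
\]
```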

  19. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

    Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results" Special emphases on simulation and discrete decision theory Mathematically-rich, but self-contained text, at a gentle pace Review of calculus and linear algebra in an appendix Mathematical interludes (in each chapter) which examine mathematical techniques in the context of probabilistic or statistical importance Numerous section exercises, summaries, historical notes, and Further Readings for reinforcem

  20. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.
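    A sketch of the kind of experiment the article advocates (our example, not from the article): empirical relative frequencies from repeated trials approach the theoretical probability.

```python
import random

random.seed(42)

# Theoretical probability of rolling a six: 1/6 ~ 0.167.
for n in [10, 100, 1000, 10000]:
    rolls = [random.randint(1, 6) for _ in range(n)]
    freq = rolls.count(6) / n
    print(f"{n:>6} rolls: relative frequency of a six = {freq:.3f}")
```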

  1. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability provided the same data used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  2. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  3. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...

  4. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications....... The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models....

  5. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions: Sample Space and Events; Probabilities; Counting Techniques. Independence and Conditional Probability: Independence; Conditioning; The Borel-Cantelli Theorem. Discrete Random Variables: Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation. Generating Functions; Branching Processes; Random Walk Revisited: Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk. Markov Chains: Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity. Continuous Random Variables: Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  6. Time preference and its relationship with age, health, and survival probability

    Directory of Open Access Journals (Sweden)

    Li-Wei Chao

    2009-02-01

    Full Text Available Although theories from economics and evolutionary biology predict that one's age, health, and survival probability should be associated with one's subjective discount rate (SDR), few studies have empirically tested for these links. Our study analyzes in detail how the SDR is related to age, health, and survival probability, by surveying a sample of individuals in townships around Durban, South Africa. In contrast to previous studies, we find that age is not significantly related to the SDR, but both physical health and survival expectations have a U-shaped relationship with the SDR. Individuals in very poor health have high discount rates, and those in very good health also have high discount rates. Similarly, those with expected survival probability on the extremes have high discount rates. Therefore, health and survival probability, and not age, seem to be predictors of one's SDR in an area of the world with high morbidity and mortality.
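    For readers unfamiliar with the construct, one common way to elicit an SDR (our illustration, not the paper's exact protocol) is from an indifference point between a smaller-sooner and a larger-later amount, assuming exponential discounting.

```python
import math

def subjective_discount_rate(amount_now, amount_later, years):
    """Continuously compounded SDR implied by indifference between
    receiving amount_now today and amount_later after `years`."""
    return math.log(amount_later / amount_now) / years

# Indifference between 1000 now and 1500 in two years => SDR ~ 20% per year.
print(subjective_discount_rate(1000, 1500, 2.0))  # ~0.2027
```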

  7. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  8. Estimating tail probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Carr, D.B.; Tolley, H.D.

    1982-12-01

    This paper investigates procedures for univariate nonparametric estimation of tail probabilities. Extrapolated values for tail probabilities beyond the data are also obtained based on the shape of the density in the tail. Several estimators which use exponential weighting are described. These are compared in a Monte Carlo study to nonweighted estimators, to the empirical cdf, to an integrated kernel, to a Fourier series estimate, to a penalized likelihood estimate and a maximum likelihood estimate. Selected weighted estimators are shown to compare favorably to many of these standard estimators for the sampling distributions investigated.
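    One simple tail estimator in the spirit of the paper (our sketch, not the authors' weighted estimators): fit an exponential to the excesses over a high threshold and extrapolate beyond the range of the data.

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.exponential(scale=2.0, size=500)     # toy sample

def tail_prob(x, level, threshold_quantile=0.9):
    """Estimate P(X > level) by exponential extrapolation of the
    excesses over a high threshold (peaks-over-threshold flavour)."""
    u = np.quantile(x, threshold_quantile)
    exceed = x[x > u]
    mean_excess = np.mean(exceed - u)        # MLE of the exponential scale
    p_u = len(exceed) / len(x)               # empirical P(X > u)
    return p_u * np.exp(-(level - u) / mean_excess)

# Extrapolated estimate beyond the data vs. the true value exp(-12/2).
print(tail_prob(x, 12.0), np.exp(-12 / 2))
```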

  9. Integration, measure and probability

    CERN Document Server

    Pitt, H R

    2012-01-01

    Introductory treatment develops the theory of integration in a general context, making it applicable to other branches of analysis. More specialized topics include convergence theorems and random sequences and functions. 1963 edition.

  10. Huygens' foundations of probability

    NARCIS (Netherlands)

    Freudenthal, Hans

    It is generally accepted that Huygens based probability on expectation. The term “expectation,” however, stems from Van Schooten's Latin translation of Huygens' treatise. A literal translation of Huygens' Dutch text shows more clearly what Huygens actually meant and how he proceeded.

  11. Probably Almost Bayes Decisions

    DEFF Research Database (Denmark)

    Anoulova, S.; Fischer, Paul; Poelt, S.

    1996-01-01

    In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian...

  12. Univariate Probability Distributions

    Science.gov (United States)

    Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.

    2012-01-01

    We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…

  13. probably mostly white

    African Journals Online (AJOL)

    Willem Scholtz

    internet – the (probably mostly white) public's interest in the so-called Border War is ostensibly at an all-time high. By far most of the publications are written by ex- ... understanding of this very important episode in the history of Southern Africa. It was, therefore, with some anticipation that one waited for this book, which.

  14. On Randomness and Probability

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 1; Issue 2. On Randomness and Probability How to Mathematically Model Uncertain Events ... Author Affiliations. Rajeeva L Karandikar1. Statistics and Mathematics Unit, Indian Statistical Institute, 7 S J S Sansanwal Marg, New Delhi 110 016, India.

  15. A modern theory of random variation with applications in stochastic calculus, financial mathematics, and Feynman integration

    CERN Document Server

    Muldowney, Patrick

    2012-01-01

    A Modern Theory of Random Variation is a new and radical re-formulation of the mathematical underpinnings of subjects as diverse as investment, communication engineering, and quantum mechanics. Setting aside the classical theory of probability measure spaces, the book utilizes a mathematically rigorous version of the theory of random variation that bases itself exclusively on finitely additive probability distribution functions. In place of twentieth century Lebesgue integration and measure theory, the author uses the simpler concept of Riemann sums, and the non-absolute Riemann-type integration of Henstock. Readers are supplied with an accessible approach to standard elements of probability theory such as the central limit theorem and Brownian motion as well as remarkable, new results on Feynman diagrams and stochastic integrals. Throughout the book, detailed numerical demonstrations accompany the discussions of abstract mathematical theory, from the simplest elements of the subject to the most complex. I...

  16. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  17. Introduction to probability and statistics for science, engineering, and finance

    CERN Document Server

    Rosenkrantz, Walter A

    2008-01-01

    Data Analysis: Orientation; The Role and Scope of Statistics in Science and Engineering; Types of Data: Examples from Engineering, Public Health, and Finance; The Frequency Distribution of a Variable Defined on a Population; Quantiles of a Distribution; Measures of Location (Central Value) and Variability; Covariance, Correlation, and Regression: Computing a Stock's Beta; Mathematical Details and Derivations; Large Data Sets. Probability Theory: Orientation; Sample Space, Events, Axioms of Probability Theory; Mathematical Models of Random Sampling; Conditional Probability and Baye

  18. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  19. Stochastics introduction to probability and statistics

    CERN Document Server

    Georgii, Hans-Otto; Baake, Ellen

    2008-01-01

    This book is a translation of the third edition of the well accepted German textbook 'Stochastik', which presents the fundamental ideas and results of both probability theory and statistics, and comprises the material of a one-year course. The stochastic concepts, models and methods are motivated by examples and problems and then developed and analysed systematically.

  20. Waste Package Misload Probability

    Energy Technology Data Exchange (ETDEWEB)

    J.K. Knudsen

    2001-11-20

    The objective of this calculation is to calculate the probability of occurrence for fuel assembly (FA) misloads (i.e., FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using the information, a probability of occurrence will be calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.
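    The frequency-based step described above reduces to an estimate of the form p = events / demands; a sketch with hypothetical counts (ours, not the report's data), adding a Jeffreys credible interval to express the uncertainty.

```python
from scipy import stats

# Hypothetical counts for illustration only (not from the report).
misloads, moves = 3, 120000          # observed events, total FA movements

p_hat = misloads / moves             # point estimate of misload probability
# Jeffreys 95% interval: Beta(events + 1/2, non-events + 1/2) quantiles.
lo, hi = stats.beta.ppf([0.025, 0.975], misloads + 0.5, moves - misloads + 0.5)
print(f"p ~ {p_hat:.2e}  (95% CI: {lo:.2e} to {hi:.2e})")
```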

  1. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  2. Quantum probability for probabilists

    CERN Document Server

    Meyer, Paul-André

    1993-01-01

    In recent years, the classical theory of stochastic integration and stochastic differential equations has been extended to a non-commutative set-up to develop models for quantum noises. The author, a specialist of classical stochastic calculus and martingale theory, tries to provide anintroduction to this rapidly expanding field in a way which should be accessible to probabilists familiar with the Ito integral. It can also, on the other hand, provide a means of access to the methods of stochastic calculus for physicists familiar with Fock space analysis.

  3. 7th High Dimensional Probability Meeting

    CERN Document Server

    Mason, David; Reynaud-Bouret, Patricia; Rosinski, Jan

    2016-01-01

    This volume collects selected papers from the 7th High Dimensional Probability meeting held at the Institut d'Études Scientifiques de Cargèse (IESC) in Corsica, France. High Dimensional Probability (HDP) is an area of mathematics that includes the study of probability distributions and limit theorems in infinite-dimensional spaces such as Hilbert spaces and Banach spaces. The most remarkable feature of this area is that it has resulted in the creation of powerful new tools and perspectives, whose range of application has led to interactions with other subfields of mathematics, statistics, and computer science. These include random matrices, nonparametric statistics, empirical processes, statistical learning theory, concentration of measure phenomena, strong and weak approximations, functional estimation, combinatorial optimization, and random graphs. The contributions in this volume show that HDP theory continues to thrive and develop new tools, methods, techniques and perspectives to analyze random phenome...

  4. Measurement uncertainty and probability

    National Research Council Canada - National Science Library

    Willink, Robin

    2013-01-01

    ... and probability models; 3.4 Inference and confidence; 3.5 Two central limit theorems; 3.6 The Monte Carlo method and process simulation; 4 The randomization of systematic errors; The Working Group of 1980; From classical repetition to practica...

  5. Lectures on probability and statistics

    Energy Technology Data Exchange (ETDEWEB)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.

  6. Structural Minimax Probability Machine.

    Science.gov (United States)

    Gu, Bin; Sun, Xingming; Sheng, Victor S

    2017-07-01

    Minimax probability machine (MPM) is an interesting discriminative classifier based on generative prior knowledge. It can directly estimate the probabilistic accuracy bound by minimizing the maximum probability of misclassification. The structural information of data is an effective way to represent prior knowledge, and has been found to be vital for designing classifiers in real-world problems. However, MPM only considers the prior probability distribution of each class with a given mean and covariance matrix, which does not efficiently exploit the structural information of data. In this paper, we use two finite mixture models to capture the structural information of the data from binary classification. For each subdistribution in a finite mixture model, only its mean and covariance matrix are assumed to be known. Based on the finite mixture models, we propose a structural MPM (SMPM). SMPM can be solved effectively by a sequence of the second-order cone programming problems. Moreover, we extend a linear model of SMPM to a nonlinear model by exploiting kernelization techniques. We also show that the SMPM can be interpreted as a large margin classifier and can be transformed to support vector machine and maxi-min margin machine under certain special conditions. Experimental results on both synthetic and real-world data sets demonstrate the effectiveness of SMPM.
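    For context (a standard formulation from the MPM literature, not restated in the abstract): with class means and covariances (μ₊, Σ₊) and (μ₋, Σ₋), MPM seeks the separating hyperplane maximizing the worst-case accuracy, which depends only on the margin-to-deviation ratio κ.

```latex
\[
  \max_{\kappa,\; w \neq 0,\; b} \ \kappa
  \quad \text{s.t.} \quad
  w^{\top}\mu_{+} - b \;\ge\; \kappa \sqrt{w^{\top}\Sigma_{+} w},
  \qquad
  b - w^{\top}\mu_{-} \;\ge\; \kappa \sqrt{w^{\top}\Sigma_{-} w}
\]
% Worst-case accuracy over all distributions sharing these means and
% covariances (multivariate Chebyshev bound):
\[
  \alpha = \frac{\kappa^{2}}{1 + \kappa^{2}}.
\]
```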

  7. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  8. Random walks on three-strand braids and on related hyperbolic groups (PACS 05.40.-a Fluctuation phenomena, random processes, noise, and Brownian motion; 02.50.-r Probability theory, stochastic processes, and statistics; 02.40.Ky Riemannian geometries)

    CERN Document Server

    Nechaev, S

    2003-01-01

    We investigate the statistical properties of random walks on the simplest nontrivial braid group B_3, and on related hyperbolic groups. We provide a method using Cayley graphs of groups allowing us to compute explicitly the probability distribution of the basic statistical characteristics of random trajectories - the drift and the return probability. The action of the groups under consideration in the hyperbolic plane is investigated, and the distribution of a geometric invariant - the hyperbolic distance - is analysed. It is shown that a random walk on B_3 can be viewed as a 'magnetic random walk' on the group PSL(2, Z).

  9. Random walks on three-strand braids and on related hyperbolic groups (PACS 05.40.-a Fluctuation phenomena, random processes, noise, and Brownian motion; 02.50.-r Probability theory, stochastic processes, and statistics; 02.40.Ky Riemannian geometries)

    Energy Technology Data Exchange (ETDEWEB)

    Nechaev, Sergei [Laboratoire de Physique Theorique et Modeles Statistiques, Universite Paris Sud, 91405 Orsay Cedex (France); Voituriez, Raphael [Laboratoire de Physique Theorique et Modeles Statistiques, Universite Paris Sud, 91405 Orsay Cedex (France)

    2003-01-10

    We investigate the statistical properties of random walks on the simplest nontrivial braid group B_3, and on related hyperbolic groups. We provide a method using Cayley graphs of groups allowing us to compute explicitly the probability distribution of the basic statistical characteristics of random trajectories - the drift and the return probability. The action of the groups under consideration in the hyperbolic plane is investigated, and the distribution of a geometric invariant - the hyperbolic distance - is analysed. It is shown that a random walk on B_3 can be viewed as a 'magnetic random walk' on the group PSL(2, Z).

  10. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples from the most general, scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management. Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation.

  11. Decisions under risk in Parkinson's disease: preserved evaluation of probability and magnitude.

    Science.gov (United States)

    Sharp, Madeleine E; Viswanathan, Jayalakshmi; McKeown, Martin J; Appel-Cresswell, Silke; Stoessl, A Jon; Barton, Jason J S

    2013-11-01

    Unmedicated Parkinson's disease patients tend to be risk-averse while dopaminergic treatment causes a tendency to take risks. While dopamine agonists may result in clinically apparent impulse control disorders, treatment with levodopa also causes shift in behaviour associated with an enhanced response to rewards. Two important determinants in decision-making are how subjects perceive the magnitude and probability of outcomes. Our objective was to determine if patients with Parkinson's disease on or off levodopa showed differences in their perception of value when making decisions under risk. The Vancouver Gambling task presents subjects with a choice between one prospect with larger outcome and a second with higher probability. Eighteen age-matched controls and eighteen patients with Parkinson's disease before and after levodopa were tested. In the Gain Phase subjects chose between one prospect with higher probability and another with larger reward to maximize their gains. In the Loss Phase, subjects played to minimize their losses. Patients with Parkinson's disease, on or off levodopa, were similar to controls when evaluating gains. However, in the Loss Phase before levodopa, they were more likely to avoid the prospect with lower probability but larger loss, as indicated by the steeper slope of their group psychometric function (t(24) = 2.21, p = 0.04). Modelling with prospect theory suggested that this was attributable to a 28% overestimation of the magnitude of loss, rather than an altered perception of its probability. While pre-medicated patients with Parkinson's disease show risk-aversion for large losses, patients on levodopa have normal perception of magnitude and probability for both loss and gain. The finding of accurate and normally biased decisions under risk in medicated patients with PD is important because it indicates that, if there is indeed anomalous risk-seeking behaviour in such a cohort, it may derive from abnormalities in components of

  12. Foundations of quantization for probability distributions

    CERN Document Server

    Graf, Siegfried

    2000-01-01

    Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous and special classes of singular probabilities (surface measures, self-similar measures) presenting some new results for the first time. Written for researchers and graduate students in probability theory the monograph is of potential interest to all people working in the disciplines mentioned above.

  13. Music-evoked incidental happiness modulates probability weighting during risky lottery choices

    Science.gov (United States)

    Schulreich, Stefan; Heussen, Yana G.; Gerhardt, Holger; Mohr, Peter N. C.; Binkofski, Ferdinand C.; Koelsch, Stefan; Heekeren, Hauke R.

    2014-01-01

    We often make decisions with uncertain consequences. The outcomes of the choices we make are usually not perfectly predictable but probabilistic, and the probabilities can be known or unknown. Probability judgments, i.e., the assessment of unknown probabilities, can be influenced by evoked emotional states. This suggests that also the weighting of known probabilities in decision making under risk might be influenced by incidental emotions, i.e., emotions unrelated to the judgments and decisions at issue. Probability weighting describes the transformation of probabilities into subjective decision weights for outcomes and is one of the central components of cumulative prospect theory (CPT) that determine risk attitudes. We hypothesized that music-evoked emotions would modulate risk attitudes in the gain domain and in particular probability weighting. Our experiment featured a within-subject design consisting of four conditions in separate sessions. In each condition, the 41 participants listened to a different kind of music—happy, sad, or no music, or sequences of random tones—and performed a repeated pairwise lottery choice task. We found that participants chose the riskier lotteries significantly more often in the “happy” than in the “sad” and “random tones” conditions. Via structural regressions based on CPT, we found that the observed changes in participants' choices can be attributed to changes in the elevation parameter of the probability weighting function: in the “happy” condition, participants showed significantly higher decision weights associated with the larger payoffs than in the “sad” and “random tones” conditions. Moreover, elevation correlated positively with self-reported music-evoked happiness. Thus, our experimental results provide evidence in favor of a causal effect of incidental happiness on risk attitudes that can be explained by changes in probability weighting. PMID:24432007
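    The elevation and curvature language refers to two-parameter probability weighting functions; a common form (our illustration of the general idea, not necessarily the authors' exact specification) is the linear-in-log-odds function, where δ controls elevation and γ controls curvature.

```python
def prob_weight(p, delta=1.0, gamma=0.6):
    """Linear-in-log-odds probability weighting (Gonzalez & Wu style):
    w(p) = delta * p**gamma / (delta * p**gamma + (1 - p)**gamma)."""
    num = delta * p**gamma
    return num / (num + (1 - p)**gamma)

# Higher elevation (delta) raises decision weights across the board,
# mirroring the reported effect of music-evoked happiness.
for p in [0.1, 0.5, 0.9]:
    print(p, round(prob_weight(p, delta=0.8), 3), round(prob_weight(p, delta=1.2), 3))
```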

  14. Music-evoked incidental happiness modulates probability weighting during risky lottery choices

    Directory of Open Access Journals (Sweden)

    Stefan eSchulreich

    2014-01-01

    Full Text Available We often make decisions with uncertain consequences. The outcomes of the choices we make are usually not perfectly predictable but probabilistic, and the probabilities can be known or unknown. Probability judgments, i.e., the assessment of unknown probabilities, can be influenced by evoked emotional states. This suggests that also the weighting of known probabilities in decision making under risk might be influenced by incidental emotions, i.e., emotions unrelated to the judgments and decisions at issue. Probability weighting describes the transformation of probabilities into subjective decision weights for outcomes and is one of the central components of cumulative prospect theory (CPT) that determine risk attitudes. We hypothesized that music-evoked emotions would modulate risk attitudes in the gain domain and in particular probability weighting. Our experiment featured a within-subject design consisting of four conditions in separate sessions. In each condition, the 41 participants listened to a different kind of music—happy, sad, or no music, or sequences of random tones—and performed a repeated pairwise lottery choice task. We found that participants chose the riskier lotteries significantly more often in the happy than in the sad and random tones conditions. Via structural regressions based on CPT, we found that the observed changes in participants’ choices can be attributed to changes in the elevation parameter of the probability weighting function: In the happy condition, participants showed significantly higher decision weights associated with the larger payoffs than in the sad and random tones conditions. Moreover, the elevation parameter correlated positively with self-reported music-evoked happiness. Thus, our experimental results provide evidence in favor of a causal effect of incidental happiness on risk attitudes that can be explained by changes in probability weighting.

  15. Music-evoked incidental happiness modulates probability weighting during risky lottery choices.

    Science.gov (United States)

    Schulreich, Stefan; Heussen, Yana G; Gerhardt, Holger; Mohr, Peter N C; Binkofski, Ferdinand C; Koelsch, Stefan; Heekeren, Hauke R

    2014-01-07

    We often make decisions with uncertain consequences. The outcomes of the choices we make are usually not perfectly predictable but probabilistic, and the probabilities can be known or unknown. Probability judgments, i.e., the assessment of unknown probabilities, can be influenced by evoked emotional states. This suggests that also the weighting of known probabilities in decision making under risk might be influenced by incidental emotions, i.e., emotions unrelated to the judgments and decisions at issue. Probability weighting describes the transformation of probabilities into subjective decision weights for outcomes and is one of the central components of cumulative prospect theory (CPT) that determine risk attitudes. We hypothesized that music-evoked emotions would modulate risk attitudes in the gain domain and in particular probability weighting. Our experiment featured a within-subject design consisting of four conditions in separate sessions. In each condition, the 41 participants listened to a different kind of music—happy, sad, or no music, or sequences of random tones—and performed a repeated pairwise lottery choice task. We found that participants chose the riskier lotteries significantly more often in the "happy" than in the "sad" and "random tones" conditions. Via structural regressions based on CPT, we found that the observed changes in participants' choices can be attributed to changes in the elevation parameter of the probability weighting function: in the "happy" condition, participants showed significantly higher decision weights associated with the larger payoffs than in the "sad" and "random tones" conditions. Moreover, elevation correlated positively with self-reported music-evoked happiness. Thus, our experimental results provide evidence in favor of a causal effect of incidental happiness on risk attitudes that can be explained by changes in probability weighting.

  16. Anthropic decision theory

    CERN Document Server

    Armstrong, Stuart

    2011-01-01

    This paper sets out to solve the Sleeping Beauty problem and various related anthropic problems, not through the calculation of anthropic probabilities, but through finding the correct decision to make. Given certain simple assumptions, it turns out to be possible to do so without knowing the underlying anthropic probabilities. Most common anthropic problems are underspecified from the decision perspective, and this can explain some of the differing intuitions in the subject: selfless and selfish agents, total and average utilitarians, will all reach different decisions in the same problem. These results are formalised into an anthropic decision theory, which is then used to solve many anthropic problems and paradoxes, such as the Presumptuous Philosopher, Adam and Eve, and Doomsday problems.

  17. Random vibrations theory and practice

    CERN Document Server

    Wirsching, Paul H; Ortiz, Keith

    1995-01-01

    Random Vibrations: Theory and Practice covers the theory and analysis of mechanical and structural systems undergoing random oscillations due to any number of phenomena—from engine noise, turbulent flow, and acoustic noise to wind, ocean waves, earthquakes, and rough pavement. For systems operating in such environments, a random vibration analysis is essential to the safety and reliability of the system. By far the most comprehensive text available on random vibrations, Random Vibrations: Theory and Practice is designed for readers who are new to the subject as well as those who are familiar with the fundamentals and wish to study a particular topic or use the text as an authoritative reference. It is divided into three major sections: fundamental background, random vibration development and applications to design, and random signal analysis. Introductory chapters cover topics in probability, statistics, and random processes that prepare the reader for the development of the theory of random vibrations a...

  18. Statistics, Probability and Chaos

    OpenAIRE

    Berliner, L. Mark

    1992-01-01

    The study of chaotic behavior has received substantial attention in many disciplines. Although often based on deterministic models, chaos is associated with complex, "random" behavior and forms of unpredictability. Mathematical models and definitions associated with chaos are reviewed. The relationship between the mathematics of chaos and probabilistic notions, including ergodic theory and uncertainty modeling, is emphasized. Popular data analytic methods appearing in the literature are disc...

  19. The mathematics of various entertaining subjects

    CERN Document Server

    Rosenhouse, Jason

    Volume 1 : The history of mathematics is filled with major breakthroughs resulting from solutions to recreational problems. Problems of interest to gamblers led to the modern theory of probability, for example, and surreal numbers were inspired by the game of Go. Yet even with such groundbreaking findings and a wealth of popular-level books exploring puzzles and brainteasers, research in recreational mathematics has often been neglected. The Mathematics of Various Entertaining Subjects brings together authors from a variety of specialties to present fascinating problems and solutions in recreational mathematics. Contributors to the book show how sophisticated mathematics can help construct mazes that look like famous people, how the analysis of crossword puzzles has much in common with understanding epidemics, and how the theory of electrical circuits is useful in understanding the classic Towers of Hanoi puzzle. The card game SET is related to the theory of error-correcting codes, and simple tic-tac-toe tak...

  20. Atomic theories

    CERN Document Server

    Loring, FH

    2014-01-01

    Summarising the most novel facts and theories which were coming into prominence at the time, particularly those which had not yet been incorporated into standard textbooks, this important work was first published in 1921. The subjects treated cover a wide range of research that was being conducted into the atom, and include Quantum Theory, the Bohr Theory, the Sommerfeld extension of Bohr's work, the Octet Theory and Isotopes, as well as Ionisation Potentials and Solar Phenomena. Because much of the material of Atomic Theories lies on the boundary between experimentally verified fact and spec

  1. Considerations on probability: from games of chance to modern science

    Directory of Open Access Journals (Sweden)

    Paola Monari

    2015-12-01

    Full Text Available The article sets out a number of considerations on the distinction between variability and uncertainty over the centuries. Games of chance have always been useful random experiments which through combinatorial calculation have opened the way to probability theory and to the interpretation of modern science through statistical laws. The article also looks briefly at the stormy nineteenth-century debate concerning the definitions of probability, which went over the same ground – sometimes without any historical awareness – as the debate which arose at the very beginnings of probability theory, when the great probability theorists were open to every possible meaning of the term.

  2. Probabilities for Solar Siblings

    Science.gov (United States)

    Valtonen, Mauri; Bajkova, A. T.; Bobylev, V. V.; Mylläri, A.

    2015-02-01

    We have shown previously (Bobylev et al. Astron Lett 37:550-562, 2011) that some of the stars in the solar neighborhood today may have originated in the same star cluster as the Sun, and could thus be called Solar Siblings. In this work we investigate the sensitivity of this result to galactic models and to parameters of these models, and also extend the sample of orbits. There are a number of good candidates for the sibling category, but due to the long period of orbit evolution since the break-up of the birth cluster of the Sun, one can only attach probabilities of membership. We find that up to 10 % (but more likely around 1 %) of the members of the Sun's birth cluster could still be found within 100 pc from the Sun today.

  3. A seismic probability map

    Directory of Open Access Journals (Sweden)

    J. M. MUNUERA

    1964-06-01

    Full Text Available The material included in two former papers (SB and EF), which sums 3307 shocks corresponding to 2360 years, up to 1960, was reduced to a 50-year period by means of the weight obtained for each epoch. The weighting factor is the ratio of 50 to the number of years in each epoch. The frequency has been referred to base VII of the international seismic intensity scale, for all cases in which the earthquakes are equal to or greater than VI and up to IX. The sum of the products of frequency and the parameters previously described is the probable frequency expected for the 50-year period. For each active small square we have made the corresponding computation and have thus drawn Map No. 1, in percentages. The epicentres with intensity from X to XI are plotted in Map No. 2 in order to present complementary information. A table shows the return periods obtained for all data (VII to XI), and after checking them against others computed from the first to the last shock, a list includes the probable approximate return periods estimated for the area. The solution we suggest is an appropriate form in which to express the seismic contingency phenomenon, and it improves on the conventional maps showing the equal-intensity curves corresponding to the maximal values of a given site.
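
    A sketch of the weighting arithmetic described above, with hypothetical epoch data standing in for the actual catalogue values:

```python
# Reduce epoch-specific shock counts to a common 50-year window.
# Each epoch's weight is 50 divided by the number of years it spans.
epochs = [
    {"years": 200, "shocks_vi_to_ix": 120},  # hypothetical epoch data
    {"years": 500, "shocks_vi_to_ix": 260},
    {"years": 100, "shocks_vi_to_ix": 40},
]

expected_50yr = sum(50 / e["years"] * e["shocks_vi_to_ix"] for e in epochs)
print(f"Probable frequency over a 50-year period: {expected_50yr:.1f} shocks")
```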

  4. Comonotonic Book-Making with Nonadditive Probabilities

    NARCIS (Netherlands)

    Diecidue, E.; Wakker, P.P.

    2000-01-01

    This paper shows how de Finetti's book-making principle, commonly used to justify additive subjective probabilities, can be modified to agree with some nonexpected utility models. More precisely, a new foundation of the rank-dependent models is presented that is based on a comonotonic extension of the

  5. An introduction to information theory

    CERN Document Server

    Reza, Fazlollah M

    1994-01-01

    Graduate-level study for engineering students presents elements of modern probability theory, information theory, coding theory, more. Emphasis on sample space, random variables, capacity, etc. Many reference tables and extensive bibliography. 1961 edition.

  6. 6th International Conference on Soft Methods in Probability and Statistics

    CERN Document Server

    Berthold, Michael; Moewes, Christian; Gil, María; Grzegorzewski, Przemysław; Hryniewicz, Olgierd; Synergies of Soft Computing and Statistics for Intelligent Data Analysis

    2013-01-01

    In recent years there has been a growing interest to extend classical methods for data analysis. The aim is to allow a more flexible modeling of phenomena such as uncertainty, imprecision or ignorance. Such extensions of classical probability theory and statistics are useful in many real-life situations, since uncertainties in data are not only present in the form of randomness --- various types of incomplete or subjective information have to be handled. About twelve years ago the idea of strengthening the dialogue between the various research communities in the field of data analysis was born and resulted in the International Conference Series on Soft Methods in Probability and Statistics (SMPS). This book gathers contributions presented at the SMPS'2012 held in Konstanz, Germany. Its aim is to present recent results illustrating new trends in intelligent data analysis. It gives a comprehensive overview of current research into the fusion of soft computing methods with probability and statistics. Synergies o...

  7. AN EDUCATIONAL THEORY MODEL--(SIGGS), AN INTEGRATION OF SET THEORY, INFORMATION THEORY, AND GRAPH THEORY WITH GENERAL SYSTEMS THEORY.

    Science.gov (United States)

    MACCIA, ELIZABETH S.; AND OTHERS

    AN ANNOTATED BIBLIOGRAPHY OF 20 ITEMS AND A DISCUSSION OF ITS SIGNIFICANCE WAS PRESENTED TO DESCRIBE CURRENT UTILIZATION OF SUBJECT THEORIES IN THE CONSTRUCTION OF AN EDUCATIONAL THEORY. ALSO, A THEORY MODEL WAS USED TO DEMONSTRATE CONSTRUCTION OF A SCIENTIFIC EDUCATIONAL THEORY. THE THEORY MODEL INCORPORATED SET THEORY (S), INFORMATION THEORY…

  8. A High-Order Theory for the Analysis of Circular Cylindrical Composite Sandwich Shells with Transversely Compliant Core Subjected to External Loads

    DEFF Research Database (Denmark)

    Rahmani, Omid; Khalili, S.M.R.; Thomsen, Ole Thybo

    2012-01-01

    A new model based on the high-order sandwich panel theory is proposed to study the effect of external loads on the free vibration of circular cylindrical composite sandwich shells with a transversely compliant core, including also the calculation of the buckling loads. In the present model, in contrast to most of the available sandwich plate and shell theories, no prior assumptions are made with respect to the displacement field in the core. Herein the displacement and stress fields of the core material are determined through a 3D elasticity solution. The performance of the present theory is compared with that of other sandwich theories by the presentation of comparative results obtained for several examples encompassing different material properties and geometric parameters. It is shown that the present model produces results of very high accuracy, and it is suggested that the present model...

  9. Probability of Boulders

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    1997-01-01

    To collect background information for formulating a description of the expected soil properties along the tunnel line, in 1987 Storebælt initiated a statistical investigation of the occurrence and size of boulders in the Great Belt area. The data for the boulder size distribution were obtained by use of aerial photographing of cliff beaches with subsequent stereometric measurement on the photographs. To get information about the density of occurrence, a series of continuous seismic scannings were also made along parallel lines in a corridor with the tunnel line as approximately the central line. The data collection part of the investigation was made on the basis of geological expert advice (Gunnar Larsen, Århus) by the Danish Geotechnical Institute (DGI). The statistical data analysis combined with stochastic modeling based on geometry and sound wave diffraction theory gave a point estimate

  10. Maximum Probability Domains for Hubbard Models

    CERN Document Server

    Acke, Guillaume; Claeys, Pieter W; Van Raemdonck, Mario; Poelmans, Ward; Van Neck, Dimitri; Bultinck, Patrick

    2015-01-01

    The theory of Maximum Probability Domains (MPDs) is formulated for the Hubbard model in terms of projection operators and generating functions for both exact eigenstates as well as Slater determinants. A fast MPD analysis procedure is proposed, which is subsequently used to analyse numerical results for the Hubbard model. It is shown that the essential physics behind the considered Hubbard models can be exposed using MPDs. Furthermore, the MPDs appear to be in line with what is expected from Valence Bond Theory-based knowledge.

  11. Stochastics introduction to probability and statistics

    CERN Document Server

    Georgii, Hans-Otto

    2012-01-01

    This second revised and extended edition presents the fundamental ideas and results of both probability theory and statistics, and comprises the material of a one-year course. It is addressed to students with an interest in the mathematical side of stochastics. Stochastic concepts, models and methods are motivated by examples and developed and analysed systematically. Some measure theory is included, but this is done at an elementary level that is in accordance with the introductory character of the book. A large number of problems offer applications and supplements to the text.

  12. Probability and statistics for particle physics

    CERN Document Server

    Mana, Carlos

    2017-01-01

    This book comprehensively presents the basic concepts of probability and Bayesian inference with sufficient generality to make them applicable to current problems in scientific research. The first chapter provides the fundamentals of probability theory that are essential for the analysis of random phenomena. The second chapter includes a full and pragmatic review of the Bayesian methods that constitute a natural and coherent framework with enough freedom to analyze all the information available from experimental data in a conceptually simple manner. The third chapter presents the basic Monte Carlo techniques used in scientific research, allowing a large variety of problems that are difficult to tackle by other procedures to be handled. The author also introduces a basic algorithm, which enables readers to simulate samples from simple distributions, and describes useful cases for researchers in particle physics. The final chapter is devoted to the basic ideas of Information Theory, which are important in the Bayesian me...
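
    The "basic algorithm" for simulating samples from a simple distribution is presumably inverse-transform sampling (our reading of the description, not the book's own code); a minimal sketch:

```python
import numpy as np

def inverse_transform_sample(inv_cdf, n, rng=None):
    """Draw n samples from the distribution whose inverse CDF is inv_cdf:
    if U ~ Uniform(0,1), then inv_cdf(U) has the target distribution."""
    rng = rng or np.random.default_rng()
    return inv_cdf(rng.uniform(size=n))

# Example: the exponential with rate lam has inverse CDF -ln(1-u)/lam.
lam = 2.0
samples = inverse_transform_sample(lambda u: -np.log1p(-u) / lam, 100_000)
print(samples.mean())  # should be close to 1/lam = 0.5
```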

  13. Selected papers on analysis, probability, and statistics

    CERN Document Server

    Nomizu, Katsumi

    1994-01-01

    This book presents papers that originally appeared in the Japanese journal Sugaku. The papers fall into the general area of mathematical analysis as it pertains to probability and statistics, dynamical systems, differential equations and analytic function theory. Among the topics discussed are: stochastic differential equations, spectra of the Laplacian and Schrödinger operators, nonlinear partial differential equations which generate dissipative dynamical systems, fractal analysis on self-similar sets and the global structure of analytic functions.

  14. Marrakesh International Conference on Probability and Statistics

    CERN Document Server

    Ouassou, Idir; Rachdi, Mustapha

    2015-01-01

    This volume, which highlights recent advances in statistical methodology and applications, is divided into two main parts. The first part presents theoretical results on estimation techniques in functional statistics, while the second examines three key areas of application: estimation problems in queuing theory, an application in signal processing, and the copula approach to epidemiologic modelling. The book’s peer-reviewed contributions are based on papers originally presented at the Marrakesh International Conference on Probability and Statistics held in December 2013.

  15. The Subjectivity of Participation

    DEFF Research Database (Denmark)

    Nissen, Morten

    What is a 'we' – a collective – and how can we use such communal self-knowledge to help people? This book is about collectivity, participation, and subjectivity – and about the social theories that may help us understand these matters. It also seeks to learn from the innovative practices and ideas ... practices. Through this dialogue, it develops an original trans-disciplinary critical theory and practice of collective subjectivity for which the ongoing construction and overcoming of common sense, or ideology, is central. It also points to ways of relating discourse with agency, and fertilizing insights from interactionism and ideology theories in a cultural-historical framework.

  16. Classicality versus quantumness in Born's probability

    Science.gov (United States)

    Luo, Shunlong

    2017-11-01

    Born's rule, which postulates the probability of a measurement outcome in a quantum state, is pivotal to interpretations and applications of quantum mechanics. By exploiting the departure of the product of two Hermitian operators in Born's rule from Hermiticity, we prescribe an intrinsic and natural scheme to decompose Born's probability into a classical part and a quantum part, which have significant implications in quantum information theory. The classical part constitutes the information compatible with the associated measurement operator, while the quantum part represents the quantum coherence of the state with respect to the measurement operator. Fundamental properties of the decomposition are revealed. As applications, we establish several trade-off relations for the classicality and quantumness in Born's probability, which may be interpreted as alternative realizations of Heisenberg's uncertainty principle. The results shed physical light on related issues concerning quantification of complementarity, coherence, and uncertainty, as well as the classical-quantum interplay.

  17. Geometric modeling in probability and statistics

    CERN Document Server

    Calin, Ovidiu

    2014-01-01

    This book covers topics of Informational Geometry, a field which deals with the differential geometric study of manifolds of probability density functions. This is a field that is increasingly attracting the interest of researchers from many different areas of science, including mathematics, statistics, geometry, computer science, signal processing, physics and neuroscience. It is the authors’ hope that the present book will be a valuable reference for researchers and graduate students in one of the aforementioned fields. This textbook is a unified presentation of differential geometry and probability theory, and constitutes a text for a course directed at graduate or advanced undergraduate students interested in applications of differential geometry in probability and statistics. The book contains over 100 proposed exercises meant to help students deepen their understanding, and it is accompanied by software that is able to provide numerical computations of several information geometric objects. The reader...

  18. Python for probability, statistics, and machine learning

    CERN Document Server

    Unpingco, José

    2016-01-01

    This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowl...
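
    In the spirit of the book's reproducible examples (this particular snippet is ours, not taken from the book), a minimal Scikit-learn cross-validation sketch touching the bias/variance trade-off:

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X @ np.array([1.0, -2.0, 0.0, 0.5, 3.0]) + rng.normal(scale=0.5, size=200)

# 5-fold cross-validation of a regularized linear model: varying the
# penalty strength alpha trades variance (small alpha) for bias (large).
for alpha in (0.01, 1.0, 100.0):
    scores = cross_val_score(Ridge(alpha=alpha), X, y, cv=5)
    print(alpha, scores.mean().round(3))
```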

  19. The Influence of Subjective Life Expectancy on Retirement Transition and Planning: A Longitudinal Study

    Science.gov (United States)

    Griffin, Barbara; Hesketh, Beryl; Loh, Vanessa

    2012-01-01

    This study examines the construct of subjective life expectancy (SLE), or the estimation of one's probable age of death. Drawing on the tenets of socioemotional selectivity theory (Carstensen, Isaacowitz, & Charles, 1999), we propose that SLE provides individuals with their own unique mental model of remaining time that is likely to affect their…

  20. Prospect evaluation as a function of numeracy and probability denominator.

    Science.gov (United States)

    Millroth, Philip; Juslin, Peter

    2015-05-01

    This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Designing Multimedia Learning Application with Learning Theories: A Case Study on a Computer Science Subject with 2-D and 3-D Animated Versions

    Science.gov (United States)

    Rias, Riaza Mohd; Zaman, Halimah Badioze

    2011-01-01

    Higher learning based instruction may be primarily concerned in most cases with the content of their academic lessons, and not very much with their instructional delivery. However, the effective application of learning theories and technology in higher education has an impact on student performance. With the rapid progress in the computer and…

  2. Imprecise probability analysis for integrated assessment of climate change

    OpenAIRE

    Kriegler, Elmar

    2005-01-01

    We present an application of imprecise probability theory to the quantification of uncertainty in the integrated assessment of climate change. Our work is motivated by the fact that uncertainty about climate change is pervasive, and therefore requires a thorough treatment in the integrated assessment process. Classical probability theory faces some severe difficulties in this respect, since it cannot capture very poor states of information in a satisfactory manner. A more general framework is...

  3. Information-theoretic methods for estimating of complicated probability distributions

    CERN Document Server

    Zong, Zhi

    2006-01-01

    Mixing up various disciplines frequently produces something profound and far-reaching. Cybernetics is such an often-quoted example. The mix of information theory, statistics and computing technology has proved to be very useful, and has led to the recent development of information-theory based methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is the fundamental task for quite a few fields besides statistics, such as reliability, probabilistic risk analysis (PSA), machine learning, pattern recognition, image processing, neur

  4. On the discretization of probability density functions and the ...

    Indian Academy of Sciences (India)

    In probability theory, statistics, statistical mechanics, communication theory, and other fields of science, the calculation of Rényi and Tsallis entropies [1–3] for a probability density function ρ(x) involves the integral ∫_a^b [ρ(x)]^q dx, where q ≥ 0 is a parameter. The aim of this paper is to present a procedure for the discretization of ...
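
    For reference, the standard definitions built on this integral (quoted here as textbook forms, not taken from the paper itself) are:

```latex
% Renyi and Tsallis entropies of a density rho on [a,b], for q >= 0, q != 1:
H_q^{\mathrm{R}}(\rho) = \frac{1}{1-q} \ln \int_a^b [\rho(x)]^q \, dx,
\qquad
S_q^{\mathrm{T}}(\rho) = \frac{1}{q-1} \left( 1 - \int_a^b [\rho(x)]^q \, dx \right).
```

    Both reduce to the Shannon (differential) entropy in the limit q → 1.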

  5. A Priori Probability Distribution of the Cosmological Constant

    OpenAIRE

    Weinberg, Steven

    2000-01-01

    In calculations of the probability distribution for the cosmological constant, it has been previously assumed that the a priori probability distribution is essentially constant in the very narrow range that is anthropically allowed. This assumption has recently been challenged. Here we identify large classes of theories in which this assumption is justified.

  6. A Quantum Theoretical Explanation for Probability Judgment Errors

    Science.gov (United States)

    Busemeyer, Jerome R.; Pothos, Emmanuel M.; Franco, Riccardo; Trueblood, Jennifer S.

    2011-01-01

    A quantum probability model is introduced and used to explain human probability judgment errors including the conjunction and disjunction fallacies, averaging effects, unpacking effects, and order effects on inference. On the one hand, quantum theory is similar to other categorization and memory models of cognition in that it relies on vector…

  7. Theory of semigroups and applications

    CERN Document Server

    Sinha, Kalyan B

    2017-01-01

    The book presents major topics in semigroups, such as operator theory, partial differential equations, harmonic analysis, probability and statistics, and classical and quantum mechanics, together with their applications. Along with a systematic development of the subject, the book emphasises explorations of the contact areas and interfaces, supported by presentations of explicit computations wherever feasible. Organized into seven chapters and three appendices, the book targets graduate and senior undergraduate students of mathematics, as well as researchers in the respective areas. The book presupposes a good understanding of real analysis with elements of the theory of measure and integration, and a first course in functional analysis and in the theory of operators. Chapters 4 through 6 contain advanced topics, which have many interesting applications such as the Feynman–Kac formula, the central limit theorem and the construction of Markov semigroups. Many examples have been given in...

  8. The emergence of probabilities in anhomomorphic logic

    Science.gov (United States)

    Ghazi-Tabatabai, Yousef; Wallden, Petros

    2009-06-01

    Anhomomorphic logic is a new interpretation of Quantum Theory (due to R. Sorkin). It is a histories formulation (cf. consistent histories, quantum measure theory). In this approach, reality is a co-event, which is essentially an assignment of a truth value {True, False} to each question. The way this assignment is done mimics classical physics as much as possible, while allowing sufficient flexibility to accommodate quantum 'paradoxes', as is shown by the analysis of the Kochen-Specker theorem. In this contribution, after briefly reviewing the approach, we will examine how probabilistic predictions can arise. The Cournot principle and the use of approximate preclusions will play a crucial role. Facing similar problems in interpreting probability as in classical probability theory, we will resort to the weak form of the Cournot principle, where possible realities will be preclusive co-events and the quantum measure is used to obtain predictions. Examples considered include the fair coin and the double slit pattern, arguably one of the most important paradigms for quantum theory.

  9. The emergence of probabilities in anhomomorphic logic

    Energy Technology Data Exchange (ETDEWEB)

    Ghazi-Tabatabai, Yousef [Blackett Laboratory, Imperial College, London, SW7 2AZ (United Kingdom); Wallden, Petros, E-mail: yousef.ghazi05@imperial.ac.u, E-mail: petros.wallden@gmail.co [Raman Research Institute, Theoretical Physics Group Sadashivanagar, Bangalore - 560 080 (India)

    2009-06-01

    Anhomomorphic logic is a new interpretation of Quantum Theory (due to R. Sorkin). It is a histories formulation (cf. consistent histories, quantum measure theory). In this approach, reality is a co-event, which is essentially an assignment of a truth value {True, False} to each question. The way this assignment is done mimics classical physics as much as possible, while allowing sufficient flexibility to accommodate quantum 'paradoxes', as is shown by the analysis of the Kochen-Specker theorem. In this contribution, after briefly reviewing the approach, we will examine how probabilistic predictions can arise. The Cournot principle and the use of approximate preclusions will play a crucial role. Facing similar problems in interpreting probability as in classical probability theory, we will resort to the weak form of the Cournot principle, where possible realities will be preclusive co-events and the quantum measure is used to obtain predictions. Examples considered include the fair coin and the double slit pattern, arguably one of the most important paradigms for quantum theory.

  10. The role of probabilities in physics.

    Science.gov (United States)

    Le Bellac, Michel

    2012-09-01

    Although modern physics was born in the XVIIth century as a fully deterministic theory in the form of Newtonian mechanics, the use of probabilistic arguments turned out later on to be unavoidable. Three main situations can be distinguished. (1) When the number of degrees of freedom is very large, on the order of Avogadro's number, a detailed dynamical description is not possible, and in fact not useful: we do not care about the velocity of a particular molecule in a gas, all we need is the probability distribution of the velocities. This statistical description introduced by Maxwell and Boltzmann allows us to recover equilibrium thermodynamics, gives a microscopic interpretation of entropy and underlies our understanding of irreversibility. (2) Even when the number of degrees of freedom is small (but larger than three) sensitivity to initial conditions of chaotic dynamics makes determinism irrelevant in practice, because we cannot control the initial conditions with infinite accuracy. Although die tossing is in principle predictable, the approach to chaotic dynamics in some limit implies that our ignorance of initial conditions is translated into a probabilistic description: each face comes up with probability 1/6. (3) As is well-known, quantum mechanics is incompatible with determinism. However, quantum probabilities differ in an essential way from the probabilities introduced previously: it has been shown from the work of John Bell that quantum probabilities are intrinsic and cannot be given an ignorance interpretation based on a hypothetical deeper level of description. Copyright © 2012 Elsevier Ltd. All rights reserved.
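
    Situation (1) above is the setting of Maxwell–Boltzmann statistics; for concreteness, the standard speed distribution (a textbook result, quoted here for illustration) is:

```latex
% Maxwell-Boltzmann speed distribution for molecules of mass m
% at temperature T (k_B is Boltzmann's constant):
f(v) = 4\pi \left(\frac{m}{2\pi k_B T}\right)^{3/2} v^2
       \exp\!\left(-\frac{m v^2}{2 k_B T}\right)
```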

  11. Introduction to graph theory

    CERN Document Server

    Wilson, Robin J

    1985-01-01

    Graph Theory has recently emerged as a subject in its own right, as well as being an important mathematical tool in such diverse subjects as operational research, chemistry, sociology and genetics. This book provides a comprehensive introduction to the subject.

  12. The break-up of Ekman theory in a flow subjected to background rotation and driven by a non-conservative body force

    NARCIS (Netherlands)

    Duran-Matute, M.; Di Nitto, G.; Trieling, R.R.; Kamp, L.P.J.; van Heijst, G.J.F.

    2012-01-01

    We present an experimental/numerical study of a dipolar flow structure in a shallow layer of electrolyte driven by electromagnetic forcing and subjected to background rotation. The aim of this study is to determine the influence of a non-conservative body force on the range of applicability of the

  13. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  14. Probability with applications in engineering, science, and technology

    CERN Document Server

    Carlton, Matthew A

    2017-01-01

    This updated and revised first-course textbook in applied probability provides a contemporary and lively post-calculus introduction to the subject of probability. The exposition reflects a desirable balance between fundamental theory and many applications involving a broad range of real problem scenarios. It is intended to appeal to a wide audience, including mathematics and statistics majors, prospective engineers and scientists, and those business and social science majors interested in the quantitative aspects of their disciplines. The textbook contains enough material for a year-long course, though many instructors will use it for a single term (one semester or one quarter). As such, three course syllabi with expanded course outlines are now available for download on the book’s page on the Springer website. A one-term course would cover material in the core chapters (1-4), supplemented by selections from one or more of the remaining chapters on statistical inference (Ch. 5), Markov chains (Ch. 6), stoch...

  15. Probability workshop to be better in probability topic

    Science.gov (United States)

    Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed

    2015-02-01

    The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students at the higher education level have an effect on their performance. 62 fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance in the probability topic is not related to anxiety level; that is, a higher level of statistics anxiety does not lead to a lower score in probability performance. The study also revealed that students who were motivated by the probability workshop showed a positive improvement in their performance compared with before the workshop. In addition, there exists a significant difference in students' performance between genders, with better achievement among female students compared to male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.

  16. Liquefaction Probability Curves for Surficial Geologic Units

    Science.gov (United States)

    Holzer, T. L.; Noce, T. E.; Bennett, M. J.

    2009-12-01

    Liquefaction probability curves that predict the probability of surface manifestations of earthquake-induced liquefaction are developed for 14 different surficial geologic deposits. The geologic units include alluvial fan, beach ridge, river delta, eolian dune, point bar, floodbasin, natural river levee, abandoned river channel, deep-water lake, lagoonal, sandy artificial fill, and valley train deposits. Probability is conditioned on earthquake magnitude and peak ground acceleration. Curves are developed for water table depths of 1.5 and 5.0 m. Probabilities were derived from complementary cumulative frequency distributions of the liquefaction potential index (LPI) that were computed from 935 cone penetration tests. Most of the curves can be fit with a 3-parameter logistic function, which facilitates computations of probability. For natural deposits with a water table at 1.5 m depth and subjected to an M7.5 earthquake with a PGA = 0.25 g, probabilities range up to about 0.5 for fluvial point bar, barrier island beach ridge, and deltaic deposits. Retrospective predictions of liquefaction during historical earthquakes based on the curves compare favorably to post-earthquake observations. We also have used the curves to assign ranges of liquefaction probabilities to the susceptibility categories proposed by Youd and Perkins (1978) for different geologic deposits. For the earthquake loading and conditions described above, probabilities range from 0-0.08 for low, 0.09-0.30 for moderate, 0.31-0.62 for high, to 0.63-1.00 for very high susceptibility. Liquefaction probability curves have two primary practical applications. First, the curves can be combined with seismic source characterizations to transform surficial geologic maps into probabilistic liquefaction hazard maps. Geographically specific curves are clearly desirable, but in the absence of such information, generic liquefaction probability curves provide a first approximation of liquefaction hazard. Such maps are useful both
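
    A sketch of the 3-parameter logistic form mentioned above; the parameter values are invented for illustration and are not the paper's fitted values:

```python
import math

def liquefaction_probability(pga, p_max, slope, midpoint):
    """3-parameter logistic: probability of surface manifestation given
    peak ground acceleration (in g), saturating at p_max."""
    return p_max / (1.0 + math.exp(-slope * (pga - midpoint)))

# Hypothetical curve for a highly susceptible deposit (M7.5, shallow water table).
for pga in (0.1, 0.25, 0.4):
    print(pga, round(liquefaction_probability(pga, p_max=0.6, slope=15.0,
                                              midpoint=0.2), 2))
```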

  17. Limiting values of large deviation probabilities of quadratic statistics

    NARCIS (Netherlands)

    Jeurnink, Gerardus A.M.; Kallenberg, W.C.M.

    1990-01-01

    Application of exact Bahadur efficiencies in testing theory or exact inaccuracy rates in estimation theory needs evaluation of large deviation probabilities. Because of the complexity of the expressions, frequently a local limit of the nonlocal measure is considered. Local limits of large deviation

  18. Greek paideia and terms of probability

    Directory of Open Access Journals (Sweden)

    Fernando Leon Parada

    2016-06-01

    Full Text Available This paper addresses three aspects of the conceptual framework for doctoral dissertation research in progress in the field of Mathematics Education, in particular, in the subfield of teaching and learning basic concepts of Probability Theory at the college level. It intends to contrast, sustain and elucidate the central statement that the meanings of some of the basic terms used in Probability Theory were not formally defined by any specific theory but relate to primordial ideas developed in Western culture from Ancient Greek myths. The first aspect deals with the notion of uncertainty, with which Greek thinkers described several archaic gods and goddesses of destiny, like the Parcae and the Moirai, often personified in the goddess Tyche—Fortuna for the Romans—as regarded in Werner Jaeger's 'Paideia'. The second aspect treats the idea of chance from two different approaches: the first deals with chance, denoted by Plato with the already demythologized term 'tyche', from the viewpoint of innate knowledge, as Jaeger points out; the second deals with chance from a perspective that could be called 'phenomenological', from which Aristotle attempted to articulate uncertainty with a discourse based on the hypothesis of causality. The term 'causal' was opposed both to 'casual' and to 'spontaneous' (as used in the expression 'spontaneous generation'), attributing uncertainty to ignorance of the future, thus respecting causal flow. The third aspect treated in the paper refers to some definitions and etymologies of other modern words that have become technical terms in current Probability Theory, confirming the above-mentioned main proposition of this paper.

  19. Random phenomena fundamentals of probability and statistics for engineers

    CERN Document Server

    Ogunnaike, Babatunde A

    2009-01-01

    Prelude: Approach Philosophy; Four Basic Principles. I. Foundations: Two Motivating Examples (Yield Improvement in a Chemical Process; Quality Assurance in a Glass Sheet Manufacturing Process; Outline of a Systematic Approach); Random Phenomena, Variability, and Uncertainty (Two Extreme Idealizations of Natural Phenomena; Random Mass Phenomena; Introducing Probability; The Probabilistic Framework). II. Probability: Fundamentals of Probability Theory (Building Blocks; Operations; Probability; Conditional Probability; Independence); Random Variables and Distributions (Distributions; Mathematical Expectation; Characterizing Distributions; Special Derived Probability Functions); Multidimensional Random Variables (Distributions of Several Random Variables; Distributional Characteristics of Jointly Distributed Random Variables); Random Variable Transformations (Single Variable Transformations; Bivariate Transformations; General Multivariate Transformations); Application Case Studies I: Probability (Mendel and Heredity; World War II Warship Tactical Response Under Attack). III. Distributions: Ide...

  20. Possibilistic systems within a general information theory

    Energy Technology Data Exchange (ETDEWEB)

    Joslyn, C.

    1999-06-01

    The author surveys possibilistic systems theory and places it in the context of Imprecise Probabilities and General Information Theory (GIT). In particular, he argues that possibilistic systems hold a distinct position within a broadly conceived, synthetic GIT. The focus is on systems and applications which are semantically grounded by empirical measurement methods (statistical counting), rather than by epistemic or subjective knowledge elicitation or assessment methods. Regarding fuzzy measures as special provisions, and evidence measures (belief and plausibility measures) as special fuzzy measures, one can thereby measure imprecise probabilities directly and empirically from set-valued frequencies (random set measurement). More specifically, measurements of random intervals yield empirical fuzzy intervals. In the random set (Dempster-Shafer) context, probability and possibility measures stand as special plausibility measures in that their distributionality (decomposability) maps directly to an aggregable structure of the focal classes of their random sets. Further, possibility measures share with imprecise probabilities the ability to better handle open-world problems where the universe of discourse is not specified in advance. In addition to empirically grounded measurement methods, possibility theory also provides another crucial component of a full systems theory, namely prediction methods in the form of finite (Markov) processes which are also strictly analogous to the probabilistic forms.
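
    A minimal sketch of the evidence measures mentioned above—belief and plausibility computed from a mass assignment over focal sets; the frame and the masses are invented for illustration:

```python
def belief(event, masses):
    """Belief: total mass of focal sets wholly contained in the event."""
    return sum(m for focal, m in masses.items() if set(focal) <= event)

def plausibility(event, masses):
    """Plausibility: total mass of focal sets intersecting the event."""
    return sum(m for focal, m in masses.items() if set(focal) & event)

# Hypothetical mass assignment from set-valued (interval) observations.
masses = {frozenset({"a"}): 0.5,
          frozenset({"a", "b"}): 0.3,
          frozenset({"b", "c"}): 0.2}

event = {"a", "b"}
print(belief(event, masses), plausibility(event, masses))  # 0.8 1.0
```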

  1. THEORY OF SOCIAL CONSTRUCTIVISM AS THE BASIS OF INTEGRATION OF THE FOREIGN LANGUAGE AND SUBJECT CONTENTS OF THE STUDIED DISCIPLINES IN TECHNICAL INSTITUTION

    Directory of Open Access Journals (Sweden)

    Л Л Салехова

    2016-12-01

    Full Text Available There is currently a need to train competitive technical specialists capable of carrying out oral and written communication in professional settings. Experts of this sort can be trained on the basis of integrated subject and language training—Content and Language Integrated Learning. The article describes the essence of this concept, considers its social-constructivist orientation, and gives examples of the use of information and communication technologies.

  2. An introduction to measure-theoretic probability

    CERN Document Server

    Roussas, George G

    2004-01-01

    This book provides in a concise, yet detailed way, the bulk of the probabilistic tools that a student working toward an advanced degree in statistics, probability and other related areas should be equipped with. The approach is classical, avoiding the use of mathematical tools not necessary for carrying out the discussions. All proofs are presented in full detail. * Excellent exposition marked by a clear, coherent and logical development of the subject * Easy to understand, detailed discussion of material * Complete proofs

  3. Probability and Statistics in Aerospace Engineering

    Science.gov (United States)

    Rheinfurth, M. H.; Howell, L. W.

    1998-01-01

    This monograph was prepared to give the practicing engineer a clear understanding of probability and statistics with special consideration to problems frequently encountered in aerospace engineering. It is conceived to be both a desktop reference and a refresher for aerospace engineers in government and industry. It could also be used as a supplement to standard texts for in-house training courses on the subject.

  4. Measure and integration theory

    CERN Document Server

    Burckel, Robert B

    2001-01-01

    This book gives a straightforward introduction to the field as it is nowadays required in many branches of analysis and especially in probability theory. The first three chapters (Measure Theory, Integration Theory, Product Measures) basically follow the clear and approved exposition given in the author's earlier book on "Probability Theory and Measure Theory". Special emphasis is laid on a complete discussion of the transformation of measures and integration with respect to the product measure, convergence theorems, parameter depending integrals, as well as the Radon-Nikodym theorem. The fi

  5. Posterior Probability Matching and Human Perceptual Decision Making.

    Directory of Open Access Journals (Sweden)

    Richard F Murray

    2015-06-01

    Full Text Available Probability matching is a classic theory of decision making that was first developed in models of cognition. Posterior probability matching, a variant in which observers match their response probabilities to the posterior probability of each response being correct, is being used increasingly often in models of perception. However, little is known about whether posterior probability matching is consistent with the vast literature on vision and hearing that has developed within signal detection theory. Here we test posterior probability matching models using two tools from detection theory. First, we examine the models' performance in a two-pass experiment, where each block of trials is presented twice, and we measure the proportion of times that the model gives the same response twice to repeated stimuli. We show that at low performance levels, posterior probability matching models give highly inconsistent responses across repeated presentations of identical trials. We find that practised human observers are more consistent across repeated trials than these models predict, and we find some evidence that less practised observers are more consistent as well. Second, we compare the performance of posterior probability matching models on a discrimination task to the performance of a theoretical ideal observer that achieves the best possible performance. We find that posterior probability matching is very inefficient at low-to-moderate performance levels, and that human observers can be more efficient than is ever possible according to posterior probability matching models. These findings support classic signal detection models, and rule out a broad class of posterior probability matching models for expert performance on perceptual tasks that range in complexity from contrast discrimination to symmetry detection. However, our findings leave open the possibility that inexperienced observers may show posterior probability matching behaviour, and our methods
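
    A toy two-pass simulation in the spirit of the first test described above (our construction under simplifying assumptions, not the authors' code): a posterior-matching observer responds stochastically, so with the internal evidence held fixed it often disagrees with itself on repeated trials, whereas a deterministic MAP observer always repeats its response.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 100_000
# Posterior probability that response "A" is correct on each trial,
# drawn near 0.5 to mimic a low-performance regime (and, simplifying,
# held fixed across the two passes).
posterior_a = rng.uniform(0.4, 0.6, size=n_trials)

# Pass 1 and pass 2 with identical stimuli (identical posteriors).
match_1 = rng.uniform(size=n_trials) < posterior_a
match_2 = rng.uniform(size=n_trials) < posterior_a
map_response = posterior_a > 0.5  # deterministic MAP observer

print("matching observer self-agreement:", (match_1 == match_2).mean())  # ~0.5
print("MAP observer self-agreement: 1.0 (always repeats its response)")
```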

  6. Posterior Probability Matching and Human Perceptual Decision Making

    Science.gov (United States)

    Murray, Richard F.; Patel, Khushbu; Yee, Alan

    2015-01-01

    Probability matching is a classic theory of decision making that was first developed in models of cognition. Posterior probability matching, a variant in which observers match their response probabilities to the posterior probability of each response being correct, is being used increasingly often in models of perception. However, little is known about whether posterior probability matching is consistent with the vast literature on vision and hearing that has developed within signal detection theory. Here we test posterior probability matching models using two tools from detection theory. First, we examine the models’ performance in a two-pass experiment, where each block of trials is presented twice, and we measure the proportion of times that the model gives the same response twice to repeated stimuli. We show that at low performance levels, posterior probability matching models give highly inconsistent responses across repeated presentations of identical trials. We find that practised human observers are more consistent across repeated trials than these models predict, and we find some evidence that less practised observers are more consistent as well. Second, we compare the performance of posterior probability matching models on a discrimination task to the performance of a theoretical ideal observer that achieves the best possible performance. We find that posterior probability matching is very inefficient at low-to-moderate performance levels, and that human observers can be more efficient than is ever possible according to posterior probability matching models. These findings support classic signal detection models, and rule out a broad class of posterior probability matching models for expert performance on perceptual tasks that range in complexity from contrast discrimination to symmetry detection. However, our findings leave open the possibility that inexperienced observers may show posterior probability matching behaviour, and our methods provide new tools

  7. The Object of the Subject

    DEFF Research Database (Denmark)

    Hansen, Brian Benjamin

    2016-01-01

    The article presents a theory of the subject, based on the work of Jacques Lacan, using the concepts of alienation, separation and liberation.

  8. Probability and statistics: selected problems

    OpenAIRE

    Machado, J.A. Tenreiro; Pinto, Carla M. A.

    2014-01-01

    Probability and Statistics—Selected Problems is a unique book for senior undergraduate and graduate students to quickly review basic material in Probability and Statistics. Descriptive statistics are presented first, and probability is reviewed second. Discrete and continuous distributions are presented. Sampling and estimation with hypothesis testing are presented in the last two chapters. The solutions to the proposed exercises are listed for the reader's reference.

  9. The structure of information: from probability to homology

    OpenAIRE

    Vigneaux, Juan Pablo

    2017-01-01

    D. Bennequin and P. Baudot introduced a cohomological construction adapted to information theory, called "information cohomology" (see "The homological nature of Entropy", 2015). Our text serves as a detailed introduction to information cohomology, containing the necessary background in probability theory and homological algebra. It makes explicit the link with topos theory, as introduced by Grothendieck, Verdier and their collaborators in the SGA IV. It also contains several new construction...

  10. Subjective indicators as a gauge for improving organizational well-being. An attempt to apply the cognitive activation theory to organizations.

    Science.gov (United States)

    Arnetz, Bengt B

    2005-11-01

    Globally, organizations are undergoing substantial changes, commonly resulting in significant employee stress. However, when facing similar stressors and challenges, departments within an organization, as well as companies within the same area of business, vary in the way they cope with change. It was hypothesized that collective uncertainty about the future as well as unclear organizational goals contribute to chronic stress in organizations exposed to change. Applying the cognitive activation theory of stress (CATS) model of Ursin and Eriksen at an organizational level, support was found for the above hypothesis. Changes in chronic stress indicators between two assessments were related to clarity of organizational goals. It is suggested that the CATS model might be fruitful not only in understanding variations in individual stress responses and experiences, but also in interpreting and managing organizational stress. By doing so, both organizational health and well-being will improve, creating enterprises with healthy employees and healthy productivity and economic results.

  11. Applied probability and stochastic processes. 2. ed.

    Energy Technology Data Exchange (ETDEWEB)

    Feldman, Richard M. [Texas A and M Univ., College Station, TX (United States). Industrial and Systems Engineering Dept.; Valdez-Flores, Ciriaco [Sielken and Associates Consulting, Inc., Bryan, TX (United States)

    2010-07-01

    This book presents applied probability and stochastic processes in an elementary but mathematically precise manner, with numerous examples and exercises to illustrate the range of engineering and science applications of the concepts. The book is designed to give the reader an intuitive understanding of probabilistic reasoning, in addition to an understanding of mathematical concepts and principles. The initial chapters present a summary of probability and statistics and then Poisson processes, Markov chains, Markov processes and queuing processes are introduced. Advanced topics include simulation, inventory theory, replacement theory, Markov decision theory, and the use of matrix geometric procedures in the analysis of queues. Included in the second edition are appendices at the end of several chapters giving suggestions for the use of Excel in solving the problems of the chapter. Also new in this edition are an introductory chapter on statistics and a chapter on Poisson processes that includes some techniques used in risk assessment. The old chapter on queues has been expanded and broken into two new chapters: one for simple queuing processes and one for queuing networks. Support is provided through the web site http://apsp.tamu.edu where students will have the answers to odd numbered problems and instructors will have access to full solutions and Excel files for homework. (orig.)

  12. Graph theory

    CERN Document Server

    Gould, Ronald

    2012-01-01

    This introduction to graph theory focuses on well-established topics, covering primary techniques and including both algorithmic and theoretical problems. The algorithms are presented with a minimum of advanced data structures and programming details. This thoroughly corrected 1988 edition provides insights to computer scientists as well as advanced undergraduates and graduate students of topology, algebra, and matrix theory. Fundamental concepts and notation and elementary properties and operations are the first subjects, followed by examinations of paths and searching, trees, and networks. S

  13. Rethinking the learning of belief network probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Musick, R.

    1996-03-01

    Belief networks are a powerful tool for knowledge discovery that provide concise, understandable probabilistic models of data. There are methods grounded in probability theory to incrementally update the relationships described by the belief network when new information is seen, to perform complex inferences over any set of variables in the data, to incorporate domain expertise and prior knowledge into the model, and to automatically learn the model from data. This paper concentrates on part of the belief network induction problem, that of learning the quantitative structure (the conditional probabilities), given the qualitative structure. In particular, the current practice of rote learning the probabilities in belief networks can be significantly improved upon. We advance the idea of applying any learning algorithm to the task of conditional probability learning in belief networks, discuss potential benefits, and show results of applying neural networks and other algorithms to a medium sized car insurance belief network. The results demonstrate from 10 to 100% improvements in model error rates over the current approaches.
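
    For concreteness, the sketch below shows the baseline "rote" approach that the paper argues can be improved upon: estimating a node's conditional probability table (CPT) by smoothed relative frequencies from data, given the qualitative structure. The toy data set, variable layout, and smoothing constant are all invented for illustration.

```python
# Maximum-likelihood CPT estimation with Laplace smoothing for one binary node
# with two binary parents, given the network structure.
from collections import Counter
from itertools import product

data = [  # (parent1, parent2, child) observations; invented toy data
    (0, 0, 0), (0, 0, 0), (0, 1, 1), (1, 0, 1),
    (1, 1, 1), (1, 1, 0), (0, 1, 0), (1, 0, 1),
]

def learn_cpt(rows, alpha=1.0):
    """Return P(child | parents) estimated by smoothed relative frequency."""
    joint, margin = Counter(), Counter()
    for *parents, child in rows:
        joint[(tuple(parents), child)] += 1
        margin[tuple(parents)] += 1
    cpt = {}
    for parents in product([0, 1], repeat=2):
        for child in (0, 1):
            cpt[(parents, child)] = (joint[(parents, child)] + alpha) / \
                                    (margin[parents] + 2 * alpha)
    return cpt

for (parents, child), p in sorted(learn_cpt(data).items()):
    print(f"P(child={child} | parents={parents}) = {p:.2f}")
```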

  14. Multidimensional stationary probability distribution for interacting active particles

    National Research Council Canada - National Science Library

    Maggi, Claudio; Marconi, Umberto Marini Bettolo; Gnan, Nicoletta; Di Leonardo, Roberto

    2015-01-01

    We derive the stationary probability distribution for a non-equilibrium system composed by an arbitrary number of degrees of freedom that are subject to Gaussian colored noise and a conservative potential...

  15. How Can Histograms Be Useful for Introducing Continuous Probability Distributions?

    Science.gov (United States)

    Derouet, Charlotte; Parzysz, Bernard

    2016-01-01

    The teaching of probability has changed a great deal since the end of the last century. The development of technologies is indeed part of this evolution. In France, continuous probability distributions began to be studied in 2002 by scientific 12th graders, but this subject was marginal and appeared only as an application of integral calculus.…

  16. Transit probabilities for debris around white dwarfs

    Science.gov (United States)

    Lewis, John Arban; Johnson, John A.

    2017-01-01

    The discovery of WD 1145+017 (Vanderburg et al. 2015), a metal-polluted white dwarf with an infrared excess and transits, confirmed the long-held theory that at least some metal-polluted white dwarfs are actively accreting material from crushed-up planetesimals. A statistical understanding of WD 1145-like systems would inform us about the various pathways for metal pollution and the end states of planetary systems around medium- to high-mass stars. However, we only have one example, and there are presently no published studies of transit detection/discovery probabilities for white dwarfs within this interesting regime. We present a preliminary look at the transit probabilities for metal-polluted white dwarfs and their projected space density in the Solar Neighborhood, which will inform future searches for analogs to WD 1145+017.

  17. Probabilities, causes and propensities in physics

    CERN Document Server

    Suárez, Mauricio

    2010-01-01

    This volume defends a novel approach to the philosophy of physics: it is the first book devoted to a comparative study of probability, causality, and propensity, and their various interrelations, within the context of contemporary physics - particularly quantum and statistical physics. The philosophical debates and distinctions are firmly grounded upon examples from actual physics, thus exemplifying a robustly empiricist approach. The essays, by both prominent scholars in the field and promising young researchers, constitute a pioneer effort in bringing out the connections between probabilistic, causal and dispositional aspects of the quantum domain. This book will appeal to specialists in philosophy and foundations of physics, philosophy of science in general, metaphysics, ontology of physics theories, and philosophy of probability.

  18. Percolation Theory

    OpenAIRE

    Wierman, John C.

    1982-01-01

    An introduction is provided to the mathematical tools and problems of percolation theory. A discussion of Bernoulli percolation models shows the role of graph duality and correlation inequalities in the recent determination of the critical probability in the square, triangular, and hexagonal lattice bond models. An introduction to first passage percolation concentrates on the problems of existence of optimal routes, length of optimal routes, and conditions for convergence of first passage tim...

  19. Training Teachers to Teach Probability

    Science.gov (United States)

    Batanero, Carmen; Godino, Juan D.; Roa, Rafael

    2004-01-01

    In this paper we analyze the reasons why the teaching of probability is difficult for mathematics teachers, describe the contents needed in the didactical preparation of teachers to teach probability and analyze some examples of activities to carry out this training. These activities take into account the experience at the University of Granada,…

  20. Daniel Courgeau: Probability and social science: methodological relationships between the two approaches [Review of: Probability and social science: methodological relationships between the two approaches]

    NARCIS (Netherlands)

    Willekens, F.J.C.

    2013-01-01

    Throughout history, humans engaged in games in which randomness plays a role. In the 17th century, scientists started to approach chance scientifically and to develop a theory of probability. Courgeau describes how the relationship between probability theory and social sciences emerged and evolved

  1. The Probability Model of Expectation Disconfirmation Process

    Directory of Open Access Journals (Sweden)

    Hui-Hsin HUANG

    2015-06-01

    This paper proposes a probability model to explore the dynamic process of customer satisfaction. Based on expectation disconfirmation theory, satisfaction is constructed from the customer's expectation before the buying behavior and the perceived performance after purchase. An experimental method is designed to measure expectation disconfirmation effects, and the collected data are used to estimate overall satisfaction and calibrate the model. The results show a good fit between the model and the real data. The model has applications in business marketing for managing relationship satisfaction.

  2. Estimation of Extreme Response and Failure Probability of Wind Turbines under Normal Operation using Probability Density Evolution Method

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Liu, W. F.

    2013-01-01

    Estimation of the extreme response and failure probability of structures subjected to ultimate design loads is essential for the structural design of wind turbines according to the new standard IEC61400-1. The present paper addresses this task by means of the probability density evolution method (PDEM...

  3. Truth, possibility and probability new logical foundations of probability and statistical inference

    CERN Document Server

    Chuaqui, R

    1991-01-01

    Anyone involved in the philosophy of science is naturally drawn into the study of the foundations of probability. Different interpretations of probability, based on competing philosophical ideas, lead to different statistical techniques, and frequently to mutually contradictory consequences. This unique book presents a new interpretation of probability, rooted in the traditional interpretation that was current in the 17th and 18th centuries. Mathematical models are constructed based on this interpretation, and statistical inference and decision theory are applied, including some examples in artificial intelligence, solving the main foundational problems. Nonstandard analysis is extensively developed for the construction of the models and in some of the proofs. Many nonstandard theorems are proved, some of them new, in particular, a representation theorem that asserts that any stochastic process can be approximated by a process defined over a space with equiprobable outcomes.

  4. Decision making generalized by a cumulative probability weighting function

    Science.gov (United States)

    dos Santos, Lindomar Soares; Destefano, Natália; Martinez, Alexandre Souto

    2018-01-01

    Typical examples of intertemporal decision making involve situations in which individuals must choose between a smaller but more immediate reward and a larger one delivered later. Analogously, probabilistic decision making involves choices between options whose consequences differ in their probability of receipt. In Economics, the expected utility theory (EUT) and the discounted utility theory (DUT) are traditionally accepted as normative models for describing, respectively, probabilistic and intertemporal decision making. A large number of experiments have confirmed that the linearity assumed by the EUT does not explain some observed behaviors, such as nonlinear preference, risk-seeking and loss aversion. That observation led to the development of new theoretical models, called non-expected utility theories (NEUT), which include a nonlinear transformation of the probability scale. An essential feature of the so-called preference function of these theories is that probabilities are transformed into decision weights by means of a (cumulative) probability weighting function, w(p). We obtain in this article a generalized function for the probabilistic discount process. This function has as particular cases mathematical forms already well established in the literature, including discount models that consider effects of psychophysical perception. We also propose a new generalized function for the functional form of w. The limiting cases of this function encompass some parametric forms already proposed in the literature. Far beyond a mere generalization, our function allows the interpretation of probabilistic decision making theories based on the assumption that individuals behave similarly in the face of probabilities and delays, and is supported by phenomenological models.
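
    For concreteness, here are two standard one-parameter weighting functions from this literature, the Tversky-Kahneman (1992) and Prelec (1998) forms. They are among the particular cases that a generalized w(p) is meant to encompass, not the new function proposed in the article; the parameter values are common illustrative choices, not fitted estimates.

```python
# Two classic probability weighting functions w(p), both inverse-S shaped:
# small probabilities are overweighted, moderate-to-large ones underweighted.
import math

def w_tk(p: float, gamma: float = 0.61) -> float:
    """Tversky-Kahneman weighting: p^g / (p^g + (1-p)^g)^(1/g)."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

def w_prelec(p: float, alpha: float = 0.65) -> float:
    """Prelec weighting: w(p) = exp(-(-ln p)^alpha)."""
    return math.exp(-((-math.log(p)) ** alpha)) if p > 0 else 0.0

for p in (0.01, 0.1, 0.5, 0.9, 0.99):
    print(f"p={p:4.2f}  TK={w_tk(p):.3f}  Prelec={w_prelec(p):.3f}")
```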

  5. Nurses' behavioural intentions towards self-poisoning patients: a theory of reasoned action, comparison of attitudes and subjective norms as predictive variables.

    Science.gov (United States)

    McKinlay, A; Couston, M; Cowan, S

    2001-04-01

    The incidence of self-poisoning is on the increase. Most patients who self-poison are dealt with initially in the general hospital. Therefore, the type and quality of care self-poisoning patients receive will depend, in part, on how they are viewed by nursing staff within the general hospital setting. A knowledge and understanding of the attitudes held by nurses towards self-poisoning patients is therefore important to those involved in the planning and delivery of care towards this client group. Previous studies have examined health care professionals' attitudes towards people who self-poison. Usually, however, these have not focused specifically on nurses' attitudes, and they have ignored the relationship between the attitudes expressed by staff and their intentions to engage in subsequent caring behaviour of one sort or another. It is hence unclear how the findings of such studies are relevant or applicable to nursing policy and practice. The present study aims to address these limitations using a methodology informed by the theory of reasoned action. The study aims to separate out the distinctive roles played by nurses' own attitudes, and the social pressures represented by other people's attitudes, in determining the types of caring behaviour in which nurses intend to engage when dealing with self-poisoning patients. The study adopts a questionnaire-based approach incorporating two specially designed vignettes. The results show that nurses' own attitudes, and what they believe about the attitudes of others, predict their behavioural intentions towards self-poisoning patients. The study also shows that nurses with a more positive orientation towards self-poisoning patients differ in behavioural and normative beliefs from nurses who have a less positive orientation. The implications for future attempts to explore the relationship between nurses' attitudes and subsequent caring behaviour are considered, along with implications for nursing policy and practice.

  6. Probability machines: consistent probability estimation using nonparametric learning machines.

    Science.gov (United States)

    Malley, J D; Kruppa, J; Dasgupta, A; Malley, K G; Ziegler, A

    2012-01-01

    Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications.
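
    A minimal probability-machine sketch in Python rather than the R packages the paper points to: a regression forest fit to 0/1 responses estimates P(Y = 1 | x) directly, mirroring the idea that a consistent nonparametric regression machine yields consistent probability estimates. The data-generating model and forest settings below are invented for illustration.

```python
# Regression forest as a probability machine: regress the 0/1 outcome on x,
# then compare the forest's predictions to the known true P(Y=1|x).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

def sample(n):
    X = rng.uniform(-2, 2, size=(n, 2))
    p = 1 / (1 + np.exp(-(X[:, 0] + 0.5 * X[:, 1])))   # true P(Y=1|x), invented
    y = (rng.random(n) < p).astype(float)
    return X, y, p

X_train, y_train, _ = sample(5000)
X_test, _, p_true = sample(1000)

rf = RandomForestRegressor(n_estimators=200, min_samples_leaf=20, random_state=0)
rf.fit(X_train, y_train)
p_hat = rf.predict(X_test)                              # estimated probabilities
print("mean |p_hat - p_true| =", round(float(np.mean(np.abs(p_hat - p_true))), 3))
```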

  7. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Background: Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives: The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods: Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results: Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions: Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433

  8. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.

  9. Statistical methods for solar flare probability forecasting

    Science.gov (United States)

    Vecchia, D. F.; Tryon, P. V.; Caldwell, G. A.; Jones, R. W.

    1980-09-01

    The Space Environment Services Center (SESC) of the National Oceanic and Atmospheric Administration provides probability forecasts of regional solar flare disturbances. This report describes a statistical method useful for obtaining 24 hour solar flare forecasts which, historically, have been subjectively formulated. In Section 1 of this report flare classifications of the SESC and the particular probability forecasts to be considered are defined. In Section 2 we describe the solar flare data base and outline general principles for effective data management. Three statistical techniques for solar flare probability forecasting are discussed in Section 3, viz., discriminant analysis, logistic regression, and multiple linear regression. We also review two scoring measures and suggest the logistic regression approach for obtaining 24 hour forecasts. In Section 4 a heuristic procedure is used to select nine basic predictors from the many available explanatory variables. Using these nine variables, logistic regression is demonstrated by example in Section 5. We conclude in Section 6 with broad suggestions regarding continued development of objective methods for solar flare probability forecasting.
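
    A hedged sketch of the recommended approach: logistic regression mapping a set of predictors to 24-hour event probabilities, scored with the Brier score. The nine synthetic predictors, coefficients, and event rate below are stand-ins for illustration, not the report's actual variables or data.

```python
# Logistic regression as a probability forecaster, evaluated by Brier score.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import brier_score_loss
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
X = rng.normal(size=(2000, 9))                        # nine basic predictors (synthetic)
logit = 0.8 * X[:, 0] - 0.5 * X[:, 1] + 0.3 * X[:, 2] - 1.0
y = (rng.random(2000) < 1 / (1 + np.exp(-logit))).astype(int)  # flare / no flare

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
p_flare = model.predict_proba(X_te)[:, 1]             # forecast probabilities
print("Brier score:", round(brier_score_loss(y_te, p_flare), 3))
```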

  10. Probability, random processes, and ergodic properties

    CERN Document Server

    Gray, Robert M

    1988-01-01

    This book has been written for several reasons, not all of which are academic. This material was for many years the first half of a book in progress on information and ergodic theory. The intent was and is to provide a reasonably self-contained advanced treatment of measure theory, probability theory, and the theory of discrete time random processes with an emphasis on general alphabets and on ergodic and stationary properties of random processes that might be neither ergodic nor stationary. The intended audience was mathematically inclined engineering graduate students and visiting scholars who had not had formal courses in measure theoretic probability. Much of the material is familiar stuff for mathematicians, but many of the topics and results have not previously appeared in books. The original project grew too large and the first part contained much that would likely bore mathematicians and discourage them from the second part. Hence I finally followed the suggestion to separate the material and split...

  11. Considerations on a posteriori probability

    Directory of Open Access Journals (Sweden)

    Corrado Gini

    2015-06-01

    In this first paper of 1911 relating to the sex ratio at birth, Gini recast Laplace's succession rule in a Bayesian version. Gini's intuition consisted in assuming a Beta-type distribution for the prior probability and introducing the "method of results (direct and indirect)" for the determination of prior probabilities according to the statistical frequencies obtained from the data.
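
    The Beta-Binomial computation that Gini's note builds on fits in a few lines: with a Beta(a, b) prior and s successes in n trials, the posterior is Beta(a + s, b + n - s), and a = b = 1 recovers Laplace's rule of succession (s + 1)/(n + 2). The counts and prior parameters below are illustrative only.

```python
# Posterior mean of a success probability under a Beta(a, b) prior.
def posterior_mean(s: int, n: int, a: float = 1.0, b: float = 1.0) -> float:
    """a = b = 1 gives Laplace's rule of succession, (s + 1) / (n + 2)."""
    return (a + s) / (a + b + n)

print(posterior_mean(s=51, n=100))                # flat Beta(1, 1) prior
print(posterior_mean(s=51, n=100, a=50, b=50))    # informative near-symmetric prior
```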

  12. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    A statistical procedure for estimating Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. The influence of the safety factor is also investigated. The DECOFF statistical method is benchmarked against the standard Alpha-factor method defined by (DNV, 2011) and model performance is evaluated. In addition, the effects that weather forecast uncertainty has on the output Probabilities of Failure are analysed and reported.

  13. Transition Probabilities of Gd I

    Science.gov (United States)

    Bilty, Katherine; Lawler, J. E.; Den Hartog, E. A.

    2011-01-01

    Rare earth transition probabilities are needed within the astrophysics community to determine rare earth abundances in stellar photospheres. The current work is part of an on-going study of rare earth element neutrals. Transition probabilities are determined by combining radiative lifetimes measured using time-resolved laser-induced fluorescence on a slow atom beam with branching fractions measured from high resolution Fourier transform spectra. Neutral rare earth transition probabilities will be helpful in improving abundances in cool stars in which a significant fraction of rare earths are neutral. Transition probabilities are also needed for research and development in the lighting industry. Rare earths have rich spectra containing hundreds to thousands of transitions throughout the visible and near UV. This makes rare earths valuable additives in Metal Halide - High Intensity Discharge (MH-HID) lamps, giving them a pleasing white light with good color rendering. This poster presents the work done on neutral gadolinium. We report radiative lifetimes for 135 levels and transition probabilities for upwards of 1500 lines of Gd I. The lifetimes are reported to ±5% and the transition probabilities range from 5% for strong lines to 25% for weak lines. This work is supported by the National Science Foundation under grant CTS 0613277 and the National Science Foundation's REU program through NSF Award AST-1004881.
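
    The arithmetic that combines the two measurements is simple: a line's transition probability (Einstein A coefficient) is its branching fraction divided by the upper level's radiative lifetime, A = BF / tau. The lifetime and branching fractions below are invented placeholders, not Gd I values.

```python
# Transition probabilities from an upper-level lifetime and branching fractions.
lifetime_ns = 12.0                        # assumed upper-level lifetime, in ns
branching_fractions = {"line_a": 0.55,    # assumed BFs; must sum to 1 for one level
                       "line_b": 0.30,
                       "line_c": 0.15}

tau_s = lifetime_ns * 1e-9
for line, bf in branching_fractions.items():
    print(f"{line}: A = {bf / tau_s:.3e} s^-1")
```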

  14. Elementary decision theory

    CERN Document Server

    Chernoff, Herman

    1988-01-01

    This well-respected introduction to statistics and statistical theory covers data processing, probability and random variables, utility and descriptive statistics, computation of Bayes strategies, models, testing hypotheses, and much more. 1959 edition.

  15. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives. Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: how travel time is affected by congestion, driving speed, and traffic lights; why different gambling ...

  16. The Bernoullis and the Origin of Probability Theory: Looking back ...

    Indian Academy of Sciences (India)

    ... Basel in 1987. His major research interests are multivariate time series models, Bayesian and financial econometrics, and computer-intensive statistical methods. Wolfgang Polasek. This paper describes the ... facilitate access to the historical texts for the modern reader by providing interpretative introductions, explanatory ...

  17. Joseph L Doob and Development of Probability Theory

    Indian Academy of Sciences (India)

    IAS Admin

    ...ted with his thesis work. There were excellent mathematicians at Harvard at that time, including Garrett Birkhoff, Kellogg and Morse, all leaders in their fields. But Doob says that he completely missed contact with any of these three and many other good mathematicians at Harvard. Over the last sixty years, most top class US ...

  18. Bayesian Statistics-The Theory of Inverse Probability

    Indian Academy of Sciences (India)

    Author Affiliations. Mohan Delampady1 T Krishnan2. Indian Statistical Institute 8th Mile, Mysore Road Bangalore 560 059, India. Systat Software Asia-Pacific (P) Ltd. Floor 5, 'C' Tower Golden Enclave, Airport Road Bangalore 560 017, India.

  19. Cooperative Control Simulation Validation Using Applied Probability Theory

    National Research Council Canada - National Science Library

    Schulz, Christopher

    2003-01-01

    ...; however, these simulations lack a method to validate their output. This research presents a method to validate the performance of a decentralized cooperative control simulation environment for an autonomous Wide Area Search Munition (WASM...

  20. Contributions of Kolmogorov to the Foundations of Probability Theory

    Indian Academy of Sciences (India)

    ... zero-one law, the three series theorem, Kolmogorov's inequality, consistency theorem, law of the iterated logarithm, Chapman-Kolmogorov equation, Kolmogorov's forward and backward equations, Kolmogorov's criterion for continuity, Kolmogorov-Smirnov test, characterisation of infinitely divisible distributions with finite.

  1. Some results of ruin probability for the classical risk process

    OpenAIRE

    He Yuanjiang; Li Xucheng; John Zhang

    2003-01-01

    The computation of ruin probability is an important problem in the collective risk theory. It has applications in the fields of insurance, actuarial science, and economics. Many mathematical models have been introduced to simulate business activities and ruin probability is studied based on these models. Two of these models are the classical risk model and the Cox model. In the classical model, the counting process is a Poisson process and in the Cox model, the counting process...

  2. Probability of Failure in Hypersonic Engines Using Large Deviations

    OpenAIRE

    Papanicolaou, George; West, Nicholas; Yang, Tzu-Wei

    2012-01-01

    We consider a reduced order model of an air-breathing hypersonic engine with a time-dependent stochastic inflow that may cause the failure of the engine. The probability of failure is analyzed by the Freidlin-Wentzell theory, the large deviation principle for finite dimensional stochastic differential equations. We compute the asymptotic failure probability by numerically solving the constrained optimization related to the large deviation problem. A large-deviation-based importance sampling s...

  3. Negative Binomial and Multinomial States: probability distributions and coherent states

    OpenAIRE

    Fu, Hong-Chen; Sasaki, Ryu

    1996-01-01

    Following the relationship between probability distributions and coherent states, for example the well-known one between the Poisson distribution and the ordinary coherent states, and the relatively less known one between the binomial distribution and the $su(2)$ coherent states, we propose an "interpretation" of $su(1,1)$ and $su(r,1)$ coherent states in terms of probability theory. They will be called the "negative binomial" ("multinomial") "states", which correspond to the negative binomial (multinomial)...

  4. Expanding subjectivities

    DEFF Research Database (Denmark)

    Lundgaard Andersen, Linda; Soldz, Stephen

    2012-01-01

    A major theme in recent psychoanalytic thinking concerns the use of therapist subjectivity, especially “countertransference,” in understanding patients. This thinking converges with and expands developments in qualitative research regarding the use of researcher subjectivity as a tool to understa...

  5. Continuity theory

    CERN Document Server

    Nel, Louis

    2016-01-01

    This book presents a detailed, self-contained theory of continuous mappings. It is mainly addressed to students who have already studied these mappings in the setting of metric spaces, as well as multidimensional differential calculus. The needed background facts about sets, metric spaces and linear algebra are developed in detail, so as to provide a seamless transition between students' previous studies and new material. In view of its many novel features, this book will be of interest also to mature readers who have studied continuous mappings from the subject's classical texts and wish to become acquainted with a new approach. The theory of continuous mappings serves as infrastructure for more specialized mathematical theories like differential equations, integral equations, operator theory, dynamical systems, global analysis, topological groups, topological rings and many more. In light of the centrality of the topic, a book of this kind fits a variety of applications, especially those that contribute to ...

  6. Probability in reasoning: a developmental test on conditionals.

    Science.gov (United States)

    Barrouillet, Pierre; Gauffroy, Caroline

    2015-04-01

    Probabilistic theories have been claimed to constitute a new paradigm for the psychology of reasoning. A key assumption of these theories is captured by what they call the Equation, the hypothesis that the meaning of the conditional is probabilistic in nature and that the probability of If p then q is the conditional probability, in such a way that P(if p then q)=P(q|p). Using the probabilistic truth-table task in which participants are required to evaluate the probability of If p then q sentences, the present study explored the pervasiveness of the Equation through ages (from early adolescence to adulthood), types of conditionals (basic, causal, and inducements) and contents. The results reveal that the Equation is a late developmental achievement only endorsed by a narrow majority of educated adults for certain types of conditionals depending on the content they involve. Age-related changes in evaluating the probability of all the conditionals studied closely mirror the development of truth-value judgements observed in previous studies with traditional truth-table tasks. We argue that our modified mental model theory can account for this development, and hence for the findings related with the probability task, which do not consequently support the probabilistic approach of human reasoning over alternative theories. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. Reconciliation of Decision-Making Heuristics Based on Decision Trees Topologies and Incomplete Fuzzy Probabilities Sets.

    Science.gov (United States)

    Doubravsky, Karel; Dohnal, Mirko

    2015-01-01

    Complex decision making tasks of different natures, e.g. economics, safety engineering, ecology and biology, are based on vague, sparse, partially inconsistent and subjective knowledge. Moreover, decision making economists / engineers are usually not willing to invest too much time into the study of complex formal theories. They require decisions that can be (re)checked by human-like common sense reasoning. One important problem related to realistic decision making tasks is the incomplete data sets required by the chosen decision making algorithm. This paper presents a relatively simple algorithm by which some missing III (input information items) can be generated using mainly decision tree topologies and integrated into incomplete data sets. The algorithm is based on an easy-to-understand heuristic, e.g. that a longer decision tree sub-path is less probable. This heuristic can solve decision problems under total ignorance, i.e. when the decision tree topology is the only information available. In practice, however, isolated information items, e.g. some vaguely known probabilities (e.g. fuzzy probabilities), are usually available. This means that a realistic problem is analysed under partial ignorance. The proposed algorithm reconciles topology-related heuristics and additional fuzzy sets using fuzzy linear programming. The case study, represented by a tree with six lotteries and one fuzzy probability, is presented in detail.

  8. Reconciliation of Decision-Making Heuristics Based on Decision Trees Topologies and Incomplete Fuzzy Probabilities Sets.

    Directory of Open Access Journals (Sweden)

    Karel Doubravsky

    Complex decision making tasks of different natures, e.g. economics, safety engineering, ecology and biology, are based on vague, sparse, partially inconsistent and subjective knowledge. Moreover, decision making economists / engineers are usually not willing to invest too much time into the study of complex formal theories. They require decisions that can be (re)checked by human-like common sense reasoning. One important problem related to realistic decision making tasks is the incomplete data sets required by the chosen decision making algorithm. This paper presents a relatively simple algorithm by which some missing III (input information items) can be generated using mainly decision tree topologies and integrated into incomplete data sets. The algorithm is based on an easy-to-understand heuristic, e.g. that a longer decision tree sub-path is less probable. This heuristic can solve decision problems under total ignorance, i.e. when the decision tree topology is the only information available. In practice, however, isolated information items, e.g. some vaguely known probabilities (e.g. fuzzy probabilities), are usually available. This means that a realistic problem is analysed under partial ignorance. The proposed algorithm reconciles topology-related heuristics and additional fuzzy sets using fuzzy linear programming. The case study, represented by a tree with six lotteries and one fuzzy probability, is presented in detail.

  9. Naive Probability: Model-Based Estimates of Unique Events.

    Science.gov (United States)

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N

    2015-08-01

    We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning. © 2014 Cognitive Science Society, Inc.

  10. Two-slit experiment: quantum and classical probabilities

    Science.gov (United States)

    Khrennikov, Andrei

    2015-06-01

    The inter-relation between quantum and classical probability models is one of the most fundamental problems of quantum foundations. Nowadays this problem also plays an important role in quantum technologies, in quantum cryptography and the theory of quantum random generators. In this letter, we compare the viewpoint of Richard Feynman that the behavior of quantum particles cannot be described by classical probability theory with the viewpoint that the quantum-classical inter-relation is more complicated (cf., in particular, the tomographic model of quantum mechanics developed in detail by Vladimir Man'ko). As a basic example, we consider the two-slit experiment, which played a crucial role in quantum foundational debates at the beginning of quantum mechanics (QM). In particular, its analysis led Niels Bohr to the formulation of the principle of complementarity. First, we demonstrate that, in complete accordance with Feynman's viewpoint, the probabilities for the two-slit experiment have a non-Kolmogorovian structure, since they violate one of the basic laws of classical probability theory, the law of total probability (the heart of the Bayesian analysis). However, we then show that these probabilities can be embedded in a natural way into the classical (Kolmogorov, 1933) probability model. To do this, one has to take into account the randomness of selection of different experimental contexts, the joint consideration of which led Feynman to a conclusion about the non-classicality of quantum probability. We compare this embedding of non-Kolmogorovian quantum probabilities into the Kolmogorov model with well-known embeddings of non-Euclidean geometries into Euclidean space (e.g., the Poincaré disk model for the Lobachevsky plane).
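
    A numeric version of the law-of-total-probability point (a toy sketch, with invented amplitudes and an illustrative normalization): with both slits open, the quantum probability |psi1 + psi2|^2 differs from the classical mixture of the single-slit probabilities by an interference term that depends on the relative phase.

```python
# Quantum vs. classical (total-probability) detection probabilities at one point.
import numpy as np

for phi in np.linspace(0, 2 * np.pi, 5):
    psi1 = np.sqrt(0.5)                       # amplitude via slit 1 (toy value)
    psi2 = np.sqrt(0.5) * np.exp(1j * phi)    # amplitude via slit 2, relative phase phi
    p_quantum = abs(psi1 + psi2) ** 2 / 2     # /2: illustrative normalization to [0, 1]
    p_classical = 0.5 * abs(psi1) ** 2 + 0.5 * abs(psi2) ** 2
    print(f"phi={phi:4.2f}  quantum={p_quantum:.3f}  classical={p_classical:.3f}")
```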

  11. P300, probability, and introverted/extroverted personality types.

    Science.gov (United States)

    Cahill, J M; Polich, J

    1992-05-01

    Extreme introverted and extroverted subject groups (n = 24 each) containing equal numbers of male and females were assessed with the P300 (P3) component of the event-related potential (ERP). A two-tone auditory discrimination task in which the probability of the target stimulus varied systematically in different conditions (.20, .40, .60, .80) was used to elicit the ERPs. The P3 amplitude demonstrated a significant interaction between personality type, probability, and subject gender and was generally smaller for introverts than for extroverts. Female subjects tended to have larger overall P3 components than male subjects. P3 latency was not affected by the personality variable. The results support previous findings for ERP differences between introverts and extroverts and suggest that personality type differentially influences target stimulus probability effects. The findings are discussed in terms of individual differences in cortical activity on P3 amplitude and personality measures.

  12. Fitness Probability Distribution of Bit-Flip Mutation.

    Science.gov (United States)

    Chicano, Francisco; Sutton, Andrew M; Whitley, L Darrell; Alba, Enrique

    2015-01-01

    Bit-flip mutation is a common mutation operator for evolutionary algorithms applied to optimize functions over binary strings. In this paper, we develop results from the theory of landscapes and Krawtchouk polynomials to exactly compute the probability distribution of fitness values of a binary string undergoing uniform bit-flip mutation. We prove that this probability distribution can be expressed as a polynomial in p, the probability of flipping each bit. We analyze these polynomials and provide closed-form expressions for an easy linear problem (Onemax), and an NP-hard problem, MAX-SAT. We also discuss a connection of the results with runtime analysis.
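
    The Onemax case admits a short exact computation without Krawtchouk polynomials: from a parent with k ones among n bits, uniform bit-flip mutation with rate p turns off X ~ Bin(k, p) ones and turns on Y ~ Bin(n - k, p) zeros, so the offspring fitness is k - X + Y. The parameters below are illustrative; note that each resulting probability is, as the paper proves in general, a polynomial in p.

```python
# Exact offspring-fitness distribution of Onemax under uniform bit-flip mutation.
from math import comb

def binom_pmf(m: int, q: float, j: int) -> float:
    return comb(m, j) * q**j * (1 - q) ** (m - j)

def onemax_mutation_pmf(n: int, k: int, p: float) -> dict:
    pmf = {}
    for x in range(k + 1):              # ones flipped to zero
        for y in range(n - k + 1):      # zeros flipped to one
            f = k - x + y
            pmf[f] = pmf.get(f, 0.0) + binom_pmf(k, p, x) * binom_pmf(n - k, p, y)
    return pmf

dist = onemax_mutation_pmf(n=10, k=7, p=0.1)
for fitness in sorted(dist):
    print(f"P(fitness={fitness:2d}) = {dist[fitness]:.4f}")
print("total =", round(sum(dist.values()), 6))
```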

  13. Probability and Statistics The Science of Uncertainty (Revised Edition)

    CERN Document Server

    Tabak, John

    2011-01-01

    Probability and Statistics, Revised Edition deals with the history of probability, describing the modern concept of randomness and examining "pre-probabilistic" ideas of what most people today would characterize as randomness. This revised book documents some historically important early uses of probability to illustrate some very important probabilistic questions. It goes on to explore statistics and the generations of mathematicians and non-mathematicians who began to address problems in statistical analysis, including the statistical structure of data sets as well as the theory of

  14. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    The article deals with the probability analysis of a vibration isolation system for high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. External sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criterion. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance that a vibration isolation system exceeds the vibration criterion is evaluated. Optimal system parameters, damping and natural frequency, are derived such that the probability of exceeding vibration criteria VC-E and VC-D is less than 0.04.
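
    The core Gaussian step can be written in a few lines: for a zero-mean displacement response with RMS value sigma, the probability of staying within a criterion d_max is erf(d_max / (sigma * sqrt(2))). The sigma and criterion values below are placeholders, not the actual VC-D/VC-E limits or the article's parameters.

```python
# Probability that a zero-mean Gaussian displacement stays within a criterion.
from math import erf, sqrt

def prob_within_criterion(sigma_um: float, d_max_um: float) -> float:
    return erf(d_max_um / (sigma_um * sqrt(2)))

sigma = 0.08                        # assumed RMS relative displacement, micrometres
for d_max in (0.1, 0.2, 0.3):       # assumed criteria, micrometres
    p = prob_within_criterion(sigma, d_max)
    print(f"criterion {d_max} um: P(within) = {p:.4f}, P(exceed) = {1 - p:.4f}")
```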

  15. Flood hazard probability mapping method

    Science.gov (United States)

    Kalantari, Zahra; Lyon, Steve; Folkeson, Lennart

    2015-04-01

    In Sweden, spatially explicit approaches have been applied in various disciplines such as landslide modelling based on soil type data and flood risk modelling for large rivers. Regarding flood mapping, most previous studies have focused on complex hydrological modelling on a small scale, whereas just a few studies have used a robust GIS-based approach integrating most physical catchment descriptor (PCD) aspects on a larger scale. The aim of the present study was to develop a methodology for predicting the spatial probability of flooding on a general large scale. Factors such as topography, land use, soil data and other PCDs were analysed in terms of their relative importance for flood generation. The specific objective was to test the methodology using statistical methods to identify factors having a significant role in controlling flooding. A second objective was to generate an index quantifying the flood probability for each cell, based on different weighted factors, in order to provide a more accurate analysis of potential high flood hazards than can be obtained using just a single variable. The ability of indicator covariance to capture flooding probability was determined for different watersheds in central Sweden. Using data from this initial investigation, a method to extract spatial data for multiple catchments and to produce soft data for statistical analysis was developed. It allowed flood probability to be predicted from spatially sparse data without compromising the significant hydrological features on the landscape. By using PCD data, realistic representations of high probability flood regions were made, regardless of the magnitude of the rain events. This in turn allowed objective quantification of the probability of floods at the field scale for future model development and watershed management.

  16. Incompatible Stochastic Processes and Complex Probabilities

    Science.gov (United States)

    Zak, Michail

    1997-01-01

    The definition of conditional probabilities is based upon the existence of a joint probability. However, a reconstruction of the joint probability from given conditional probabilities imposes certain constraints upon the latter, so that if several conditional probabilities are chosen arbitrarily, the corresponding joint probability may not exist.

  17. Image and Morphology in Modern Theory of Architecture

    Science.gov (United States)

    Yankovskaya, Y. S.; Merenkov, A. V.

    2017-11-01

    This paper is devoted to some important and fundamental problems of modern Russian architectural theory. These problems are: methodological and technological retardation; the substitution of modern professional architectural theoretical knowledge by humanitarian concepts; and a preference for traditional historical or historical-theoretical research. One of the most promising ways forward is the formation of useful, modern, subject- (and multi-subject-) oriented concepts in architecture. The criticism and distrust of architectural theory can be overcome by recognizing the important role of the subject (architect, consumer, contractor, ruler, etc.) and by directing the theory towards the practical tasks of forming the human environment in today's rapidly changing world and post-industrial society. In this article we consider the evolution of two basic concepts of architectural theory: image and morphology.

  18. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, their aims and scopes, the future direction of research, and how their work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  19. Can quantum probability provide a new direction for cognitive modeling?

    Science.gov (United States)

    Pothos, Emmanuel M; Busemeyer, Jerome R

    2013-06-01

    Classical (Bayesian) probability (CP) theory has led to an influential research tradition for modeling cognitive processes. Cognitive scientists have been trained to work with CP principles for so long that it is hard even to imagine alternative ways to formalize probabilities. However, in physics, quantum probability (QP) theory has been the dominant probabilistic approach for nearly 100 years. Could QP theory provide us with any advantages in cognitive modeling as well? Note first that both CP and QP theory share the fundamental assumption that it is possible to model cognition on the basis of formal, probabilistic principles. But why consider a QP approach? The answers are that (1) there are many well-established empirical findings (e.g., from the influential Tversky, Kahneman research tradition) that are hard to reconcile with CP principles; and (2) these same findings have natural and straightforward explanations with quantum principles. In QP theory, probabilistic assessment is often strongly context- and order-dependent, individual states can be superposition states (that are impossible to associate with specific values), and composite systems can be entangled (they cannot be decomposed into their subsystems). All these characteristics appear perplexing from a classical perspective. However, our thesis is that they provide a more accurate and powerful account of certain cognitive processes. We first introduce QP theory and illustrate its application with psychological examples. We then review empirical findings that motivate the use of quantum theory in cognitive theory, but also discuss ways in which QP and CP theories converge. Finally, we consider the implications of a QP theory approach to cognition for human rationality.
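
    A toy instance of the order and context dependence described above, under the assumption of a two-dimensional belief state and projector-valued answers (state, angle, and question labels all invented): because the two projectors do not commute, the probability of answering "yes" to question A and then "yes" to question B differs from the reverse order, something a single classical joint distribution cannot produce.

```python
# A minimal quantum-probability order effect with non-commuting projectors.
import numpy as np

def projector(theta: float) -> np.ndarray:
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

state = np.array([1.0, 0.0])                 # initial belief state (invented)
P_A, P_B = projector(0.0), projector(np.pi / 5)

def p_sequence(first: np.ndarray, second: np.ndarray) -> float:
    """P(yes to first question, then yes to second), via sequential projection."""
    return float(np.linalg.norm(second @ first @ state) ** 2)

print("P(A-yes then B-yes):", round(p_sequence(P_A, P_B), 4))
print("P(B-yes then A-yes):", round(p_sequence(P_B, P_A), 4))
```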

  20. A reconsideration of Lotka's extinction probability using a bisexual branching process

    OpenAIRE

    Hull, David M.

    2001-01-01

    It is generally recognized that Alfred Lotka made the first application of standard Galton-Watson branching process theory to calculate an extinction probability in a specific population (using asexual reproduction). This note applies bisexual Galton-Watson branching process theory to the calculation of an extinction probability from Lotka's data, yielding a somewhat higher value.
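
    For reference, the standard asexual Galton-Watson computation that Lotka applied: the extinction probability is the smallest fixed point q = f(q) of the offspring probability generating function f, obtainable by iterating f from 0. The offspring distribution below is a toy example, not Lotka's data.

```python
# Extinction probability of a Galton-Watson process by fixed-point iteration.
def pgf(q: float, probs: list[float]) -> float:
    """Offspring probability generating function f(q) = sum_k p_k q^k."""
    return sum(pk * q**k for k, pk in enumerate(probs))

def extinction_probability(probs: list[float], iters: int = 1000) -> float:
    q = 0.0                        # iterating from 0 converges to the smallest root
    for _ in range(iters):
        q = pgf(q, probs)
    return q

offspring = [0.3, 0.3, 0.2, 0.2]   # P(0..3 offspring); mean 1.3 > 1, so q < 1
print(round(extinction_probability(offspring), 4))
```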

  1. Assessing the Probability of Bankruptcy

    OpenAIRE

    Heaps, Jarom

    2015-01-01

    Knowing whether or not a company is financially stable has always been a top concern for analysts and money managers. This paper compares the effectiveness of default prediction using two different types of measures: accounting and market based. Accounting measures have been the most popular even though, according to theory, a market based measure reflects all available information. Theory goes as far as to say that accounting measures can add no incremental value to a market based measure. In my ...

  2. SUBJECT INDEX

    Indian Academy of Sciences (India)

    Unknown

    Generalized density-functional theory: Conquering the N-representability problem with exact functionals for the electron pair density and the second-order reduced density matrix. 507. Chemical reactivity of hypervalent silicon compounds: The local hard and soft acids and bases principle viewpoint. 525. A philicity based ...

  3. Politics of modern muslim subjectivities

    DEFF Research Database (Denmark)

    Jung, Dietrich; Petersen, Marie Juul; Sparre, Sara Lei

    Examining modern Muslim identity constructions, the authors introduce a novel analytical framework to Islamic Studies, drawing on theories of successive modernities, sociology of religion, and poststructuralist approaches to modern subjectivity, as well as the results of extensive fieldwork in th...

  4. Politics of modern muslim subjectivities

    DEFF Research Database (Denmark)

    Jung, Dietrich; Petersen, Marie Juul; Sparre, Sara Lei

    Examining modern Muslim identity constructions, the authors introduce a novel analytical framework to Islamic Studies, drawing on theories of successive modernities, sociology of religion, and poststructuralist approaches to modern subjectivity, as well as the results of extensive fieldwork...

  5. Thinking about Russia: plausible pasts and probable futures.

    Science.gov (United States)

    Tetlock, P E; Visser, P S

    2000-06-01

    This paper uses both correlational and experimental methods to explore the power of counterfactual cognitions about the past to constrain judgments about the future as well as policy preferences. Study 1 asked 47 specialists on the Soviet Union to assess both the plausibility of controversial counterfactuals and the probability of controversial conditional forecasts. The results reveal deep ideological schisms, with liberals much more likely than conservatives to believe that Stalinism was not inevitable, that the Cold War could have ended earlier, and that Gorbachev might have succeeded in democratizing the Soviet Union if he had been a better tactician, among others. Reactions to these counterfactuals proved to be highly predictive of positions that experts in early 1992 endorsed concerning the advisability of 'shock therapy', expanding NATO eastward, and economic aid to Russia. Study 2 manipulated the salience and plausibility of counterfactual scenarios concerning (a) why the Cold War ended as it did, and (b) how close the US and USSR came to nuclear war. Changes in the counterfactual scenarios that non-experts endorsed produced significant changes in their policy preferences in the direction suggested by the salient counterfactual. Experts, however, were unswayed, often generating counter-arguments against dissonant counterfactuals. Taken together, the studies show that assumptions about what happened in the missing control conditions of history are highly subjective, largely theory-driven and profoundly consequential.

  6. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  7. Exact Probability Distribution versus Entropy

    Directory of Open Access Journals (Sweden)

    Kerstin Andersson

    2014-10-01

    The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
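
    As a toy illustration of the guessing strategy described above, the sketch below enumerates all words of a short length over a hypothetical three-letter alphabet, sorts them by probability, and computes the exact average number of guesses. The letter probabilities and word length are assumptions, not values from the paper.

```python
# Toy illustration (not the paper's code): expected number of guesses when
# words of length n over a small alphabet are guessed in decreasing order
# of probability. The letter probabilities below are hypothetical.
from itertools import product

letter_probs = {"a": 0.5, "b": 0.3, "c": 0.2}  # assumed first-order model
n = 4  # word length

# The probability of a word is the product of its letter probabilities.
word_probs = []
for w in product(letter_probs, repeat=n):
    p = 1.0
    for ch in w:
        p *= letter_probs[ch]
    word_probs.append(p)

# Guessing in decreasing order of probability: the k-th guess (1-based)
# succeeds with the probability of the k-th most likely word.
word_probs.sort(reverse=True)
avg_guesses = sum(k * p for k, p in enumerate(word_probs, start=1))
print(f"words of length {n}: {len(word_probs)}")
print(f"average number of guesses: {avg_guesses:.3f}")
```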

  8. A Novel Approach to Probability

    CERN Document Server

    Kafri, Oded

    2016-01-01

    When P indistinguishable balls are randomly distributed among L distinguishable boxes, and considering the dense system in which P is much greater than L, our natural intuition tells us that the box with the average number of balls has the highest probability and that none of the boxes are empty; in reality, however, the probability of the empty box is always the highest. This stands in contradistinction to the sparse system, in which the number of balls is smaller than the number of boxes (e.g. energy distribution in a gas) and the average value has the highest probability. Here we show that when we postulate the requirement that all possible configurations of balls in the boxes have equal probabilities, a realistic "long tail" distribution is obtained. This formalism, when applied to sparse systems, converges to distributions in which the average is preferred. We calculate some of the distributions resulting from this postulate and obtain most of the known distributions in nature, namely, Zipf law, Benford law, part...
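
    The claim that an empty box is the single most likely outcome in the dense case can be checked by simulation. The sketch below samples configurations uniformly via the stars-and-bars correspondence; the values of P, L, and the trial count are arbitrary assumptions, and this is my illustration rather than the author's calculation.

```python
# Monte Carlo check of the claim: when all configurations of P
# indistinguishable balls in L distinguishable boxes are equally likely,
# the most probable occupancy of any given box is zero.
import random
from collections import Counter

P, L, trials = 100, 10, 50_000  # dense case: P >> L (assumed values)
counts = Counter()
for _ in range(trials):
    # Stars and bars: a uniform configuration corresponds to a uniform
    # choice of L-1 divider positions among P+L-1 slots.
    dividers = sorted(random.sample(range(P + L - 1), L - 1))
    first_box = dividers[0]  # number of balls before the first divider
    counts[first_box] += 1

for k in range(5):
    print(f"P(box holds {k} balls) ~ {counts[k] / trials:.3f}")
# The estimate is largest at k = 0, not at the average P/L = 10.
```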

  9. Probability and statistics: A reminder

    Science.gov (United States)

    Clément, Benoit

    2013-07-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from "data analysis in experimental sciences" given in [1].

  10. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was, say, fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  11. Probability and statistics: A reminder

    OpenAIRE

    Clément Benoit

    2013-01-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from “data analysis in experimental sciences” given in [1].

  12. Estimating Probabilities in Recommendation Systems

    OpenAIRE

    Sun, Mingxuan; Lebanon, Guy; Kidwell, Paul

    2010-01-01

    Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computat...
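
    The paper's censored-data estimator is more involved, but the basic idea of kernel smoothing a sparse empirical distribution can be sketched as follows; the ratings, grid, and bandwidth here are hypothetical, and this is a generic illustration rather than the authors' method.

```python
# Generic kernel-smoothing sketch: smooth an empirical distribution over a
# 1-5 rating scale so that unseen ratings receive nonzero probability.
import numpy as np

ratings = np.array([5, 4, 5, 3, 5, 4])  # hypothetical observed ratings
grid = np.arange(1, 6)                  # the rating scale
h = 0.6                                 # bandwidth (assumed)

# Gaussian kernel weights between every grid point and every observation.
w = np.exp(-0.5 * ((grid[:, None] - ratings[None, :]) / h) ** 2)
probs = w.sum(axis=1)
probs /= probs.sum()  # normalize to a probability distribution

for r, p in zip(grid, probs):
    print(f"P(rating = {r}) ~ {p:.3f}")
```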

  13. Persistence theory

    CERN Document Server

    Oudot, Steve Y

    2015-01-01

    Persistence theory emerged in the early 2000s as a new theory in the area of applied and computational topology. This book provides a broad and modern view of the subject, including its algebraic, topological, and algorithmic aspects. It also elaborates on applications in data analysis. The level of detail of the exposition has been set so as to keep a survey style, while providing sufficient insights into the proofs so the reader can understand the mechanisms at work. The book is organized into three parts. The first part is dedicated to the foundations of persistence and emphasizes its conne

  14. Quantum probability and cognitive modeling: some cautions and a promising direction in modeling physics learning.

    Science.gov (United States)

    Franceschetti, Donald R; Gire, Elizabeth

    2013-06-01

    Quantum probability theory offers a viable alternative to classical probability, although there are some ambiguities inherent in transferring the quantum formalism to a less determined realm. A number of physicists are now looking at the applicability of quantum ideas to the assessment of physics learning, an area particularly suited to quantum probability ideas.

  15. Introduction to Probability, Part 1 - Basic Concepts. Student Text. Revised Edition.

    Science.gov (United States)

    Blakeslee, David W.; And Others

    This book is designed to introduce the reader to some fundamental ideas about probability. The mathematical theory of probability plays an increasingly important role in science, government, industry, business, and economics. An understanding of the basic concepts of probability is essential for the study of statistical methods that are widely…

  16. Subjective Expected Utility Theory with "Small Worlds"

    DEFF Research Database (Denmark)

    Gyntelberg, Jacob; Hansen, Frank

    We model the notion of a "small world" as a context dependent state space embedded into the "grand world". For each situation the decision maker creates a "small world" reflecting the events perceived to be relevant for the act under consideration. The "grand world" is represented by an event space...

  17. Operator theory

    CERN Document Server

    2015-01-01

    A one-sentence definition of operator theory could be: The study of (linear) continuous operations between topological vector spaces, these being in general (but not exclusively) Fréchet, Banach, or Hilbert spaces (or their duals). Operator theory is thus a very wide field, with numerous facets, both applied and theoretical. There are deep connections with complex analysis, functional analysis, mathematical physics, and electrical engineering, to name a few. Fascinating new applications and directions regularly appear, such as operator spaces, free probability, and applications to Clifford analysis. In our choice of the sections, we tried to reflect this diversity. This is a dynamic ongoing project, and more sections are planned, to complete the picture. We hope you enjoy the reading, and profit from this endeavor.

  18. A Fractional Probability Calculus View of Allometry

    Directory of Open Access Journals (Sweden)

    Bruce J. West

    2014-04-01

    The scaling of respiratory metabolism with body size in animals is considered by many to be a fundamental law of nature. An apparent corollary of this law is the scaling of physiologic time with body size, implying that physiologic time is separate and distinct from clock time. However, these are only two of the many allometry relations that emerge from empirical studies in the physical, social and life sciences. Herein, we present a theory of allometry that provides a foundation for the allometry relation between a network function and the size that is entailed by the hypothesis that the fluctuations in the two measures are described by a scaling of the joint probability density. The dynamics of such networks are described by the fractional calculus, whose scaling solutions entail the empirically observed allometry relations.
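
    As a minimal numerical aside (not the paper's fractional-calculus machinery): an allometry relation y = a * x**b is a straight line in log-log coordinates, so the exponent b can be recovered by ordinary regression. The synthetic data and the Kleiber-like exponent in this sketch are assumptions.

```python
# Recover an allometric exponent from noisy synthetic data by fitting a
# line in log-log coordinates: log y = b * log x + log a.
import numpy as np

rng = np.random.default_rng(0)
x = np.logspace(0, 3, 50)                      # body sizes (arbitrary units)
y = 2.0 * x**0.75 * rng.lognormal(0, 0.1, 50)  # assumed 3/4 exponent + noise

b, log_a = np.polyfit(np.log(x), np.log(y), 1)
print(f"estimated exponent b ~ {b:.3f}, prefactor a ~ {np.exp(log_a):.3f}")
```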

  19. Ramsey theory for product spaces

    CERN Document Server

    Dodos, Pandelis

    2016-01-01

    Ramsey theory is a dynamic area of combinatorics that has various applications in analysis, ergodic theory, logic, number theory, probability theory, theoretical computer science, and topological dynamics. This book is devoted to one of the most important areas of Ramsey theory-the Ramsey theory of product spaces. It is a culmination of a series of recent breakthroughs by the two authors and their students who were able to lift this theory to the infinite-dimensional case. The book presents many major results and methods in the area, such as Szemerédi's regularity method, the hypergraph removal lemma, and the density Hales-Jewett theorem. This book addresses researchers in combinatorics but also working mathematicians and advanced graduate students who are interested in Ramsey theory. The prerequisites for reading this book are rather minimal: it only requires familiarity, at the graduate level, with probability theory and real analysis. Some familiarity with the basics of Ramsey theory would be beneficial, ...

  20. On misclassification probabilities of linear and quadratic classifiers ...

    African Journals Online (AJOL)

    We study the theoretical misclassification probability of linear and quadratic classifiers and examine the performance of these classifiers under distributional variations, in theory and using simulation. We derive expressions for the Bayes errors for some competing distributions from the same family under location shift. Keywords: ...
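
    A Monte Carlo sketch in the spirit of the study (not the authors' code): two bivariate normal classes under a location shift with unequal covariances, comparing the error rates of a quadratic rule (the Bayes rule in this setting) and a linear rule built on a pooled covariance. All distribution parameters are assumptions.

```python
# Compare misclassification rates of linear and quadratic classification
# rules on two simulated Gaussian classes with equal priors.
import numpy as np

rng = np.random.default_rng(1)
n = 20_000
shift = np.array([1.0, 0.0])   # location shift between the classes
cov0 = np.eye(2)
cov1 = np.diag([3.0, 0.5])     # unequal covariances favor the quadratic rule

x0 = rng.multivariate_normal([0, 0], cov0, n)
x1 = rng.multivariate_normal(shift, cov1, n)
X = np.vstack([x0, x1])
y = np.repeat([0, 1], n)

def gauss_logpdf(x, mu, cov):
    # Log density of a bivariate normal, evaluated row-wise.
    d = x - mu
    inv = np.linalg.inv(cov)
    quad = np.einsum("ij,jk,ik->i", d, inv, d)
    return -0.5 * (quad + np.log(np.linalg.det(cov)) + 2 * np.log(2 * np.pi))

# Quadratic rule: compare the true class densities (the Bayes rule here).
quad_pred = (gauss_logpdf(X, shift, cov1) > gauss_logpdf(X, [0, 0], cov0)).astype(int)
# Linear rule: pretend both classes share the pooled covariance.
pooled = (cov0 + cov1) / 2
lin_pred = (gauss_logpdf(X, shift, pooled) > gauss_logpdf(X, [0, 0], pooled)).astype(int)

print(f"quadratic rule error ~ {(quad_pred != y).mean():.4f}")
print(f"linear rule error    ~ {(lin_pred != y).mean():.4f}")
```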

  1. On Field Size and Success Probability in Network Coding

    DEFF Research Database (Denmark)

    Geil, Hans Olav; Matsumoto, Ryutaroh; Thomsen, Casper

    2008-01-01

    Using tools from algebraic geometry and Gröbner basis theory we solve two problems in network coding. First we present a method to determine the smallest field size for which linear network coding is feasible. Second we derive improved estimates on the success probability of random linear network...
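
    For orientation (this is the classical estimate, not the paper's improved Groebner-basis bounds): a k x k transfer matrix with i.i.d. uniform entries over GF(q) is invertible with probability prod_{i=1..k} (1 - q**-i), which approaches 1 rapidly as the field size q grows.

```python
# Classical success-probability estimate for random linear network coding:
# probability that a random k x k matrix over GF(q) is invertible.
def invertibility_prob(q: int, k: int) -> float:
    p = 1.0
    for i in range(1, k + 1):
        p *= 1.0 - q ** (-i)
    return p

for q in (2, 4, 16, 256):
    print(f"q={q:>3}: P(invertible, k=4) = {invertibility_prob(q, 4):.4f}")
```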

  2. Introduction to the Interface of Probability and Algorithms

    OpenAIRE

    Aldous, David; Steele, J. Michael

    1993-01-01

    Probability and algorithms enjoy an almost boisterous interaction that has led to an active, extensive literature that touches fields as diverse as number theory and the design of computer hardware. This article offers a gentle introduction to the simplest, most basic ideas that underlie this development.

  3. Potential theory

    CERN Document Server

    Helms, Lester L

    2014-01-01

    Potential Theory presents a clear path from calculus to classical potential theory and beyond, with the aim of moving the reader into the area of mathematical research as quickly as possible. The subject matter is developed from first principles using only calculus. Commencing with the inverse square law for gravitational and electromagnetic forces and the divergence theorem, the author develops methods for constructing solutions of Laplace's equation on a region with prescribed values on the boundary of the region. The latter half of the book addresses more advanced material aimed at those with the background of a senior undergraduate or beginning graduate course in real analysis. Starting with solutions of the Dirichlet problem subject to mixed boundary conditions on the simplest of regions, methods of morphing such solutions onto solutions of Poisson's equation on more general regions are developed using diffeomorphisms and the Perron-Wiener-Brelot method, culminating in application to Brownian motion. In ...

  4. Transition probabilities and radiative lifetimes of levels in F I

    Energy Technology Data Exchange (ETDEWEB)

    Celik, Gueltekin, E-mail: gultekin@selcuk.edu.tr; Dogan, Duygu; Ates, Sule; Taser, Mehmet

    2012-07-15

    The electric dipole transition probabilities and the lifetimes of excited levels have been calculated using the weakest bound electron potential model theory (WBEPMT) and the quantum defect orbital theory (QDOT) in atomic fluorine. In the calculations, many transition arrays, including both multiplet and fine-structure transitions, are considered. We employed Numerical Coulomb Approximation (NCA) wave functions and numerical non-relativistic Hartree-Fock (NRHF) wave functions for the expectation values of radii in the determination of parameters. The necessary energy values have been taken from experimental energy data in the literature. The calculated transition probabilities and lifetimes have been compared with available theoretical and experimental results, and good agreement with the results in the literature has been obtained. Moreover, some transition probability and lifetime values not existing in the literature for some highly excited levels have been obtained using these methods.

  5. Probability and statistics: A reminder

    Directory of Open Access Journals (Sweden)

    Clément Benoit

    2013-07-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from “data analysis in experimental sciences” given in [1].

  6. SUBJECT INDEX

    Indian Academy of Sciences (India)

    Subject Index. Variation of surface electric field during geomagnetic disturbed period at Maitri, Antarctica. 1721. Geomorphology. A simple depression-filling method for raster and irregular elevation datasets. 1653. Decision Support System integrated with Geographic Information System to target restoration actions in water-

  7. Probability Distributions for Random Quantum Operations

    Science.gov (United States)

    Schultz, Kevin

    Motivated by uncertainty quantification and inference of quantum information systems, in this work we draw connections between the notions of random quantum states and operations in quantum information with probability distributions commonly encountered in the field of orientation statistics. This approach identifies natural sample spaces and probability distributions upon these spaces that can be used in the analysis, simulation, and inference of quantum information systems. The theory of exponential families on Stiefel manifolds provides the appropriate generalization to the classical case. Furthermore, this viewpoint motivates a number of additional questions into the convex geometry of quantum operations relative to both the differential geometry of Stiefel manifolds as well as the information geometry of exponential families defined upon them. In particular, we draw on results from convex geometry to characterize which quantum operations can be represented as the average of a random quantum operation. This project was supported by the Intelligence Advanced Research Projects Activity via Department of Interior National Business Center Contract Number 2012-12050800010.
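
    One standard building block for sampling random quantum operations is a Haar-random unitary. The sketch below uses the familiar QR-of-Ginibre recipe with a phase correction (Mezzadri, 2007); the dimension is an arbitrary assumption, and this is my illustration rather than the author's code.

```python
# Sample a Haar-random unitary: QR-decompose a complex Ginibre matrix and
# fix the phases of R's diagonal so the distribution is truly Haar.
import numpy as np

rng = np.random.default_rng(7)
d = 4  # Hilbert-space dimension (assumed)

z = (rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))) / np.sqrt(2)
q, r = np.linalg.qr(z)
u = q * (np.diag(r) / np.abs(np.diag(r)))  # column-wise phase correction

print(np.allclose(u.conj().T @ u, np.eye(d)))  # True: u is unitary
```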

  8. Nevanlinna theory

    CERN Document Server

    Kodaira, Kunihiko

    2017-01-01

    This book deals with the classical theory of Nevanlinna on the value distribution of meromorphic functions of one complex variable, based on minimum prerequisites for complex manifolds. The theory was extended to several variables by S. Kobayashi, T. Ochiai, J. Carleson, and P. Griffiths in the early 1970s. K. Kodaira took up this subject in his course at The University of Tokyo in 1973 and gave an introductory account of this development in the context of his final paper, contained in this book. The first three chapters are devoted to holomorphic mappings from C to complex manifolds. In the fourth chapter, holomorphic mappings between higher dimensional manifolds are covered. The book is a valuable treatise on the Nevanlinna theory, of special interests to those who want to understand Kodaira's unique approach to basic questions on complex manifolds.

  9. Entanglement probabilities of polymers: a white noise functional approach

    CERN Document Server

    Bernido, C C

    2003-01-01

    The entanglement probabilities for a highly flexible polymer to wind n times around a straight polymer are evaluated using white noise analysis. To introduce the white noise functional approach, the one-dimensional random walk problem is taken as an example. The polymer entanglement scenario, viewed as a random walk on a plane, is then treated and the entanglement probabilities are obtained for a magnetic flux confined along the straight polymer, and a case where an entangled polymer is subjected to the potential V = ḟ(s)θ. In the absence of the magnetic flux and the potential V, the entanglement probabilities reduce to a result obtained by Wiegel.
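
    The planar random-walk picture invoked above can be simulated directly. The sketch below estimates the distribution of the winding number of a Gaussian random walk around the origin; the starting offset, step size, and trial count are assumptions, and this is an illustration rather than the white noise calculation.

```python
# Estimate the winding-number distribution of a planar random walk around
# the origin, a discrete stand-in for a flexible polymer winding around a
# straight one.
import numpy as np

rng = np.random.default_rng(3)
trials, steps = 10_000, 500
counts = {}

for _ in range(trials):
    # Small steps keep the angle well sampled between successive points.
    walk = np.cumsum(0.2 * rng.standard_normal((steps, 2)), axis=0)
    walk += np.array([5.0, 0.0])  # start away from the origin
    angles = np.arctan2(walk[:, 1], walk[:, 0])
    total = np.unwrap(angles)[-1] - angles[0]  # continuous winding angle
    n = int(np.round(total / (2 * np.pi)))
    counts[n] = counts.get(n, 0) + 1

for n in sorted(counts):
    print(f"P(winding = {n:+d}) ~ {counts[n] / trials:.4f}")
```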

  10. Winter School on Operator Spaces, Noncommutative Probability and Quantum Groups

    CERN Document Server

    2017-01-01

    Providing an introduction to current research topics in functional analysis and its applications to quantum physics, this book presents three lectures surveying recent progress and open problems.  A special focus is given to the role of symmetry in non-commutative probability, in the theory of quantum groups, and in quantum physics. The first lecture presents the close connection between distributional symmetries and independence properties. The second introduces many structures (graphs, C*-algebras, discrete groups) whose quantum symmetries are much richer than their classical symmetry groups, and describes the associated quantum symmetry groups. The last lecture shows how functional analytic and geometric ideas can be used to detect and to quantify entanglement in high dimensions.  The book will allow graduate students and young researchers to gain a better understanding of free probability, the theory of compact quantum groups, and applications of the theory of Banach spaces to quantum information. The l...

  11. Wigner function and the probability representation of quantum states

    Directory of Open Access Journals (Sweden)

    Man’ko Margarita A.

    2014-01-01

    The relation of the Wigner function to the fair probability distribution called the tomographic distribution, or quantum tomogram, associated with the quantum state is reviewed. The connection of the tomographic picture of quantum mechanics with the integral Radon transform of the Wigner quasidistribution is discussed. The Wigner–Moyal equation for the Wigner function is presented in the form of a kinetic equation for the tomographic probability distribution, both in quantum mechanics and in the classical limit of the Liouville equation. The calculation of moments of physical observables in terms of integrals with the state tomographic probability distributions is constructed, having the standard form of averaging in probability theory. New uncertainty relations for the position and momentum are written in terms of optical tomograms suitable for direct experimental check. Some recent experiments on checking the uncertainty relations, including the entropic uncertainty relations, are discussed.

  12. É possível uma sociologia do sujeito? Uma abordagem sobre as teorias de Foucault e Touraine [Is a sociology of the subject possible? An approach to the theories of Foucault and Touraine]

    Directory of Open Access Journals (Sweden)

    Neiva Furlin

    2012-04-01

    This research essay analyzes the theoretical concepts of the notion of "subject" in the works of two French intellectuals, Touraine and Foucault. This comparative approach seeks to establish a relation between the convergent and the contrasting aspects of their social theories, with the purpose of reflecting on their contributions to the comprehension of contemporary society, especially concerning individuals' subjectivation processes. For this purpose we chose the last works in the scholarly trajectory of these intellectuals. Touraine as well as Foucault offer theoretical contributions to a sociology that does not resort to the great historical phenomena, but puts the efforts of subjects within their social micro-relations on the stage, considering their cultural, economic, political and personal contradictions.

  13. Is Piaget's epistemic subject dead?

    Science.gov (United States)

    Lawson, Anton E.

    Niaz (1990) presents arguments in favor of the retention of Piaget's epistemic subject as a theoretical construct to guide research and practice in science education and psychology. The intent of this article is to point out the weaknesses of those arguments and to suggest that the weight of evidence argues against the existence of the logical thinker postulated by Piaget. Therefore, contrary to Niaz's conclusion that the acceptance of Piaget's epistemic subject will facilitate the development of cognitive theories with greater explanatory power, the conclusion is reached that Piaget's epistemic subject is dead and that continued acceptance of this aspect of Piagetian theory would be counterproductive.

  14. Who plays dice? Subjective uncertainty in deterministic quantum world

    Science.gov (United States)

    Carter, Brandon

    2006-11-01

    Einstein's 1905 recognition that light consists of discrete "quanta" inaugurated the duality (wave versus particle) paradox that was resolved 20 years later by Born's introduction of the probability interpretation on which modern quantum theory is based. Einstein's refusal to abandon the classical notion of deterministic evolution - despite the unqualified success of the new paradigm on a local scale - foreshadowed the restoration of determinism in Everett's attempt to develop a global treatment applicable to cosmology; Everett failed, however, to provide a logically coherent treatment of subjective uncertainty at a local level. This drawback has recently been overcome in an extended formulation allowing a deterministic description of a physical universe in which the uncertainty concerns only our own particular local situations, whose probability is prescribed by an appropriate micro-anthropic principle.

  15. Probability of Detection Demonstration Transferability

    Science.gov (United States)

    Parker, Bradford H.

    2008-01-01

    The ongoing Mars Science Laboratory (MSL) Propellant Tank Penetrant Nondestructive Evaluation (NDE) Probability of Detection (POD) Assessment (NESC activity) has surfaced several issues associated with liquid penetrant POD demonstration testing. This presentation lists factors that may influence the transferability of POD demonstration tests. Initial testing will address the liquid penetrant inspection technique. Some of the factors to be considered in this task are crack aspect ratio, the extent of the crack opening, the material and the distance between the inspection surface and the inspector's eye.

  16. Nash equilibrium with lower probabilities

    DEFF Research Database (Denmark)

    Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1998-01-01

    We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible to distinguish between a player's assessment of ambiguity and his attitude towards ambiguity. We also generalize the concept of trembling hand perfect equilibrium. Finally, we demonstrate that for certain attitudes towards ambiguity it is possible to explain cooperation in the one-shot Prisoner's Dilemma...
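
    A minimal numeric illustration (mine, not the authors' model) of how a belief function yields lower and upper probabilities: mass is assigned to sets of outcomes, Bel(A) sums the mass of subsets of A, and Pl(A) sums the mass of sets intersecting A. The mass values and the cooperate/defect framing are hypothetical.

```python
# Lower and upper probabilities induced by a belief function over a
# two-outcome frame (the opponent cooperates, C, or defects, D).
outcomes = frozenset({"C", "D"})
masses = {                          # assumed basic mass assignment
    frozenset({"C"}): 0.3,
    frozenset({"D"}): 0.3,
    frozenset({"C", "D"}): 0.4,     # 0.4 of the weight is pure ambiguity
}

def bel(event: frozenset) -> float:
    # Belief: total mass committed to subsets of the event.
    return sum(m for s, m in masses.items() if s <= event)

def pl(event: frozenset) -> float:
    # Plausibility: total mass of sets consistent with the event.
    return sum(m for s, m in masses.items() if s & event)

a = frozenset({"C"})
print(f"lower probability Bel(C) = {bel(a):.2f}")  # 0.30
print(f"upper probability Pl(C)  = {pl(a):.2f}")   # 0.70
```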

  17. Blind source separation theory and applications

    CERN Document Server

    Yu, Xianchuan; Xu, Jindong

    2013-01-01

    A systematic exploration of both classic and contemporary algorithms in blind source separation, with practical case studies. The book presents an overview of Blind Source Separation, a relatively new signal processing method. Due to the multidisciplinary nature of the subject, the book has been written so as to appeal to an audience from very different backgrounds. Basic mathematical skills (e.g. on matrix algebra and foundations of probability theory) are essential in order to understand the algorithms, although the book is written in an introductory, accessible style. This book offers
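
    As a quick taste of blind source separation (a generic FastICA demo, not code from the book): two synthetic sources are mixed linearly and then recovered, up to sign and scale. The signal shapes and the mixing matrix are arbitrary assumptions.

```python
# Mix two known signals and recover them blindly with FastICA.
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0, 8, 2000)
s1 = np.sin(2 * t)                      # source 1: sinusoid
s2 = np.sign(np.sin(3 * t))             # source 2: square wave
S = np.c_[s1, s2] + 0.02 * np.random.default_rng(0).standard_normal((2000, 2))

A = np.array([[1.0, 0.5], [0.4, 1.0]])  # "unknown" mixing matrix
X = S @ A.T                             # observed mixtures

ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X)            # recovered sources (up to sign/scale)
print("correlation of recovered vs true sources:")
print(np.corrcoef(S_hat.T, S.T)[:2, 2:].round(2))
```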

  18. Probability and statistics with R

    CERN Document Server

    Ugarte, Maria Dolores; Arnholt, Alan T

    2008-01-01

    -Technometrics, May 2009, Vol. 51, No. 2. "The book is comprehensive and well written. The notation is clear and the mathematical derivations behind nontrivial equations and computational implementations are carefully explained. Rather than presenting a collection of R scripts together with a summary of relevant theoretical results, this book offers a well-balanced mix of theory, examples and R code." -Raquel Prado, University of California, Santa Cruz, The American Statistician, February 2009. "… an impressive book … Overall, this is a good reference book with comprehensive coverage of the details"

  19. Dependence in Probability and Statistics

    CERN Document Server

    Doukhan, Paul; Surgailis, Donatas; Teyssiere, Gilles

    2010-01-01

    This volume collects recent works on weakly dependent, long-memory and multifractal processes and introduces new dependence measures for studying complex stochastic systems. Other topics include the statistical theory for bootstrap and permutation statistics for infinite variance processes, the dependence structure of max-stable processes, and the statistical properties of spectral estimators of the long memory parameter. The asymptotic behavior of Fejer graph integrals and their use for proving central limit theorems for tapered estimators are investigated. New multifractal processes are intr

  20. CRC standard probability and statistics tables and formulae Student ed.

    CERN Document Server

    Kokoska, Stephen

    2000-01-01

    Users of statistics in their professional lives and statistics students will welcome this concise, easy-to-use reference for basic statistics and probability. It contains all of the standardized statistical tables and formulas typically needed, plus material on basic statistics topics, such as probability theory and distributions, regression, analysis of variance, nonparametric statistics, and statistical quality control. For each type of distribution the authors supply: definitions; tables; relationships with other distributions, including limiting forms; statistical parameters, such as variance a