WorldWideScience

Sample records for probable number rapid

  1. Comparing rapid methods for detecting Listeria in seafood and environmental samples using the most probable number (MPN) technique.

    Science.gov (United States)

    Cruz, Cristina D; Win, Jessicah K; Chantarachoti, Jiraporn; Mutukumira, Anthony N; Fletcher, Graham C

    2012-02-15

    The standard Bacteriological Analytical Manual (BAM) protocol for detecting Listeria in food and on environmental surfaces takes about 96 h. Some studies indicate that rapid methods, which produce results within 48 h, may be as sensitive and accurate as the culture protocol. As they only give presence/absence results, it can be difficult to compare the accuracy of results generated. We used the Most Probable Number (MPN) technique to evaluate the performance and detection limits of six rapid kits for detecting Listeria in seafood and on an environmental surface compared with the standard protocol. Three seafood products and an environmental surface were inoculated with similar known cell concentrations of Listeria and analyzed according to the manufacturers' instructions. The MPN was estimated using the MPN-BAM spreadsheet. For the seafood products no differences were observed among the rapid kits and efficiency was similar to the BAM method. On the environmental surface the BAM protocol had a higher recovery rate (sensitivity) than any of the rapid kits tested. Clearview™, Reveal®, TECRA® and VIDAS® LDUO detected the cells but only at high concentrations (>10² CFU/10 cm²). Two kits (VIP™ and Petrifilm™) failed to detect 10⁴ CFU/10 cm². The MPN method was a useful tool for comparing the results generated by these presence/absence test kits. There remains a need to develop a rapid and sensitive method for detecting Listeria in environmental samples that performs as well as the BAM protocol, since none of the rapid tests used in this study achieved a satisfactory result. Copyright © 2011 Elsevier B.V. All rights reserved.

  2. Clan structure analysis and rapidity gap probability

    International Nuclear Information System (INIS)

    Lupia, S.; Giovannini, A.; Ugoccioni, R.

    1995-01-01

    Clan structure analysis in rapidity intervals is generalized from the negative binomial multiplicity distribution to the wide class of compound Poisson distributions. The link of generalized clan structure analysis with correlation functions is also established. These theoretical results are then applied to minimum bias events and reveal new and interesting features, which may prove useful for interpreting data on rapidity gap probability at the TEVATRON and at HERA. (orig.)

  3. Clan structure analysis and rapidity gap probability

    Energy Technology Data Exchange (ETDEWEB)

    Lupia, S. [Turin Univ. (Italy). Ist. di Fisica Teorica]|[Istituto Nazionale di Fisica Nucleare, Turin (Italy); Giovannini, A. [Turin Univ. (Italy). Ist. di Fisica Teorica]|[Istituto Nazionale di Fisica Nucleare, Turin (Italy); Ugoccioni, R. [Turin Univ. (Italy). Ist. di Fisica Teorica]|[Istituto Nazionale di Fisica Nucleare, Turin (Italy)

    1995-03-01

    Clan structure analysis in rapidity intervals is generalized from the negative binomial multiplicity distribution to the wide class of compound Poisson distributions. The link of generalized clan structure analysis with correlation functions is also established. These theoretical results are then applied to minimum bias events and reveal new and interesting features, which may prove useful for interpreting data on rapidity gap probability at the TEVATRON and at HERA. (orig.)

  4. Rapid, single-step most-probable-number method for enumerating fecal coliforms in effluents from sewage treatment plants

    Science.gov (United States)

    Munoz, E. F.; Silverman, M. P.

    1979-01-01

    A single-step most-probable-number method for determining the number of fecal coliform bacteria present in sewage treatment plant effluents is discussed. A single growth medium based on that of Reasoner et al. (1976) and consisting of 5.0 g proteose peptone, 3.0 g yeast extract, 10.0 g lactose, 7.5 g NaCl, 0.2 g sodium lauryl sulfate, and 0.1 g sodium desoxycholate per liter is used. The pH is adjusted to 6.5, and samples are incubated at 44.5 °C. Bacterial growth is detected either by measuring the increase with time in the electrical impedance ratio between the inoculated sample vial and an uninoculated reference vial or by visual examination for turbidity. Results obtained by the single-step method for chlorinated and unchlorinated effluent samples are in excellent agreement with those obtained by the standard method. It is suggested that in automated treatment plants impedance ratio data could be automatically matched by computer programs with the appropriate dilution factors and most-probable-number tables already in the computer memory, with the corresponding result displayed as fecal coliforms per 100 ml of effluent.
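    The MPN estimate behind tables like these is straightforward to compute directly: it is the organism concentration that maximizes the likelihood of the observed pattern of positive and negative tubes across the dilution series. A minimal sketch in Python (the function name and the example dilution series are illustrative, not taken from the paper):

```python
import math

def mpn_estimate(volumes, tubes, positives):
    """Maximum-likelihood Most Probable Number (organisms per unit volume).

    volumes[i]   -- sample volume per tube at dilution i (e.g. grams or mL)
    tubes[i]     -- number of tubes inoculated at dilution i
    positives[i] -- number of those tubes showing growth

    Solves d(log L)/d(lambda) = 0 by bisection; needs at least one
    positive and at least one negative tube to have a finite root.
    """
    def score(lam):
        s = 0.0
        for v, n, g in zip(volumes, tubes, positives):
            q = math.exp(-lam * v)          # P(a tube at this dilution stays sterile)
            if g:
                s += g * v * q / (1.0 - q)  # positive-tube term
            s -= (n - g) * v                # negative-tube term
        return s

    lo, hi = 1e-9, 1e9                      # score() is decreasing in lam
    for _ in range(200):
        mid = math.sqrt(lo * hi)            # geometric bisection over the wide range
        if score(mid) > 0:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)

# Three tubes each at 0.1, 0.01 and 0.001 g with a 3-2-1 positive pattern;
# the standard 3-tube BAM table lists 150 MPN/g for this combination.
print(mpn_estimate([0.1, 0.01, 0.001], [3, 3, 3], [3, 2, 1]))
```

    Note that the impedance-based detection described above only changes how a tube is scored positive; the MPN arithmetic itself is unchanged.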

  5. Probabilities the little numbers that rule our lives

    CERN Document Server

    Olofsson, Peter

    2014-01-01

    Praise for the First Edition: "If there is anything you want to know, or remind yourself, about probabilities, then look no further than this comprehensive, yet wittily written and enjoyable, compendium of how to apply probability calculations in real-world situations." - Keith Devlin, Stanford University, National Public Radio's "Math Guy" and author of The Math Gene and The Unfinished Game. From probable improbabilities to regular irregularities, Probabilities: The Little Numbers That Rule Our Lives, Second Edition investigates the often surprising effects of risk and chance in our lives. …

  6. Talking probabilities: communicating probabilistic information with words and numbers

    NARCIS (Netherlands)

    Renooij, S.; Witteman, C.L.M.

    1999-01-01

    The number of knowledge-based systems that build on Bayesian belief networks is increasing. The construction of such a network, however, requires a large number of probabilities in numerical form. This is often considered a major obstacle, one of the reasons being that experts are reluctant to …

  7. Talking probabilities: communicating probabilistic information with words and numbers

    NARCIS (Netherlands)

    Renooij, S.; Witteman, C.L.M.

    1999-01-01

    The number of knowledge-based systems that build on Bayesian belief networks is increasing. The construction of such a network, however, requires a large number of probabilities in numerical form. This is often considered a major obstacle, one of the reasons being that experts are reluctant to provide …

  8. Learning Binomial Probability Concepts with Simulation, Random Numbers and a Spreadsheet

    Science.gov (United States)

    Rochowicz, John A., Jr.

    2005-01-01

    This paper introduces the reader to the concepts of binomial probability and simulation. A spreadsheet is used to illustrate these concepts. Random number generators are great technological tools for demonstrating the concepts of probability. Ideas of approximation, estimation, and mathematical usefulness provide numerous ways of learning…
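    The spreadsheet exercise described above carries over directly to any language with a uniform random number generator: draw random numbers, count successes, and compare the simulated frequency against the exact binomial formula. A small self-contained sketch (parameter choices are illustrative):

```python
import math
import random

def simulated_binomial_pmf(n, p, k, trials=100_000, seed=42):
    """Estimate P(X = k) for X ~ Binomial(n, p) by direct simulation."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        successes = sum(rng.random() < p for _ in range(n))
        if successes == k:
            hits += 1
    return hits / trials

def exact_binomial_pmf(n, p, k):
    """P(X = k) = C(n, k) * p^k * (1 - p)^(n - k)."""
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

# Ten fair coin flips: exact P(exactly 5 heads) = 252/1024, about 0.246;
# the simulated estimate should land within sampling noise of that value.
print(exact_binomial_pmf(10, 0.5, 5), simulated_binomial_pmf(10, 0.5, 5))
```

    Comparing the two numbers illustrates the approximation-versus-exactness idea the paper builds its lessons on.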

  9. Benchmarking PARTISN with Analog Monte Carlo: Moments of the Neutron Number and the Cumulative Fission Number Probability Distributions

    Energy Technology Data Exchange (ETDEWEB)

    O'Rourke, Patrick Francis [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]

    2016-10-27

    The purpose of this report is to provide the reader with an understanding of how a Monte Carlo neutron transport code was written, developed, and evolved to calculate the probability distribution functions (PDFs) and their moments for the neutron number at a final time as well as the cumulative fission number, along with introducing several basic Monte Carlo concepts.

  10. Serodiagnosis of dengue infection using rapid immunochromatography test in patients with probable dengue infection.

    Science.gov (United States)

    Kidwai, Aneela Altaf; Jamal, Qaiser; Saher; Mehrunnisa; Farooqi, Faiz-ur-rehman; Saleem-Ullah

    2010-11-01

    To determine the frequency of seropositive dengue infection using rapid immunochromatographic assay in patients with probable dengue infection as per WHO criteria. A cross-sectional observational study was conducted at Abbasi Shaheed Hospital, Karachi from July 2008 to January 2009. Patients presenting with acute febrile illness, rashes, bleeding tendencies, leucopenia and/or thrombocytopenia were evaluated according to WHO criteria for probable dengue infection. Acute phase sera were collected after 5 days of the onset of fever as per WHO criteria. Serology was performed using a rapid immunochromatographic (ICT) assay with differential detection of IgM and IgG. A primary dengue infection was defined by a positive IgM band and a negative IgG band, whereas secondary infection was defined by a positive IgG band with or without a positive IgM band. Among 599 patients who met the WHO criteria for dengue infection, 251 (41.9%) were found to be ICT reactive, among whom 42 (16.73%) had primary infection. Secondary infection was reported in 209 (83.26%). Acute phase sera of 348 (58.09%) were ICT non-reactive. Four patients died because of dengue shock syndrome, of whom three had secondary infection. Early identification of secondary infection in acute phase sera using rapid ICT is valuable in terms of disease progression and mortality. However, in highly suspected cases of dengue infection clinical management should not rely on negative serological results.

  11. Void probability scaling in hadron nucleus interactions

    International Nuclear Information System (INIS)

    Ghosh, Dipak; Deb, Argha; Bhattacharyya, Swarnapratim; Ghosh, Jayita; Bandyopadhyay, Prabhat; Das, Rupa; Mukherjee, Sima

    2002-01-01

    Hegyi, while investigating the rapidity gap probability (which measures the chance of finding no particle in the pseudo-rapidity interval Δη), found that the scaling behavior of the rapidity gap probability has a close correspondence with the scaling of the void probability in galaxy correlation studies. The main aim of this paper is to study the scaling behavior of the rapidity gap probability.

  12. Rapid methods for detection of bacteria

    DEFF Research Database (Denmark)

    Corfitzen, Charlotte B.; Andersen, B.Ø.; Miller, M.

    2006-01-01

    Traditional methods for detection of bacteria in drinking water, e.g. Heterotrophic Plate Counts (HPC) or Most Probable Number (MPN), take 48-72 hours to give results. New rapid methods for detection of bacteria are needed to protect consumers against contamination. Two rapid methods...

  13. Adapting a Markov Monte Carlo simulation model for forecasting the number of Coronary Artery Revascularisation Procedures in an era of rapidly changing technology and policy

    Directory of Open Access Journals (Sweden)

    Knuiman Matthew

    2008-06-01

    Full Text Available Abstract Background Treatments for coronary heart disease (CHD) have evolved rapidly over the last 15 years with considerable change in the number and effectiveness of both medical and surgical treatments. This period has seen the rapid development and uptake of statin drugs and coronary artery revascularization procedures (CARPs) that include Coronary Artery Bypass Graft procedures (CABGs) and Percutaneous Coronary Interventions (PCIs). It is difficult in an era of such rapid change to accurately forecast requirements for treatment services such as CARPs. In a previous paper we have described and outlined the use of a Markov Monte Carlo simulation model for analyzing and predicting the requirements for CARPs for the population of Western Australia (Mannan et al., 2007). In this paper, we expand on the use of this model for forecasting CARPs in Western Australia with a focus on the lack of adequate performance of the (standard) model for forecasting CARPs in a period during the mid 1990s when there were considerable changes to CARP technology and implementation policy and an exploration and demonstration of how the standard model may be adapted to achieve better performance. Methods Selected key CARP event model probabilities are modified based on information relating to changes in the effectiveness of CARPs from clinical trial evidence and an awareness of trends in policy and practice of CARPs. These modified model probabilities and the ones obtained by standard methods are used as inputs in our Markov simulation model. Results The projected numbers of CARPs in the population of Western Australia over 1995-99 only improve marginally when modifications to model probabilities are made to incorporate an increase in effectiveness of PCI procedures. However, the projected numbers improve substantially when, in addition, further modifications are incorporated that relate to the increased probability of a PCI procedure and the reduced probability of a CABG procedure …

  14. Adapting a Markov Monte Carlo simulation model for forecasting the number of coronary artery revascularisation procedures in an era of rapidly changing technology and policy.

    Science.gov (United States)

    Mannan, Haider R; Knuiman, Matthew; Hobbs, Michael

    2008-06-25

    Treatments for coronary heart disease (CHD) have evolved rapidly over the last 15 years with considerable change in the number and effectiveness of both medical and surgical treatments. This period has seen the rapid development and uptake of statin drugs and coronary artery revascularization procedures (CARPs) that include Coronary Artery Bypass Graft procedures (CABGs) and Percutaneous Coronary Interventions (PCIs). It is difficult in an era of such rapid change to accurately forecast requirements for treatment services such as CARPs. In a previous paper we have described and outlined the use of a Markov Monte Carlo simulation model for analyzing and predicting the requirements for CARPs for the population of Western Australia (Mannan et al., 2007). In this paper, we expand on the use of this model for forecasting CARPs in Western Australia with a focus on the lack of adequate performance of the (standard) model for forecasting CARPs in a period during the mid 1990s when there were considerable changes to CARP technology and implementation policy and an exploration and demonstration of how the standard model may be adapted to achieve better performance. Selected key CARP event model probabilities are modified based on information relating to changes in the effectiveness of CARPs from clinical trial evidence and an awareness of trends in policy and practice of CARPs. These modified model probabilities and the ones obtained by standard methods are used as inputs in our Markov simulation model. The projected numbers of CARPs in the population of Western Australia over 1995-99 only improve marginally when modifications to model probabilities are made to incorporate an increase in effectiveness of PCI procedures. However, the projected numbers improve substantially when, in addition, further modifications are incorporated that relate to the increased probability of a PCI procedure and the reduced probability of a CABG procedure stemming from changed CARP preference …

  15. Serodiagnosis of dengue infection using rapid immunochromatography test in patients with probable dengue infection

    International Nuclear Information System (INIS)

    Kidwai, A.A.; Jamal, Q.; Mehrunnisa, S.; Farooqi, F.R.

    2010-01-01

    Objective: To determine the frequency of seropositive dengue infection using rapid immunochromatographic assay in patients with probable dengue infection as per WHO criteria. Method: A cross-sectional observational study was conducted at Abbasi Shaheed Hospital, Karachi from July 2008 to January 2009. Patients presenting with acute febrile illness, rashes, bleeding tendencies, leucopenia and/or thrombocytopenia were evaluated according to WHO criteria for probable dengue infection. Acute phase sera were collected after 5 days of the onset of fever as per WHO criteria. Serology was performed using a rapid immunochromatographic (ICT) assay with differential detection of IgM and IgG. A primary dengue infection was defined by a positive IgM band and a negative IgG band, whereas secondary infection was defined by a positive IgG band with or without a positive IgM band. Result: Among 599 patients who met the WHO criteria for dengue infection, 251 (41.9%) were found to be ICT reactive, among whom 42 (16.73%) had primary infection. Secondary infection was reported in 209 (83.26%). Acute phase sera of 348 (58.09%) were ICT non-reactive. Four patients died because of dengue shock syndrome, of whom three had secondary infection. Conclusion: Early identification of secondary infection in acute phase sera using rapid ICT is valuable in terms of disease progression and mortality. However, in highly suspected cases of dengue infection clinical management should not rely on negative serological results. (author)

  16. Hotspots & other hidden targets: Probability of detection, number, frequency and area

    International Nuclear Information System (INIS)

    Vita, C.L.

    1994-01-01

    Concepts and equations are presented for making probability-based estimates of the detection probability, and the number, frequency, and area of hidden targets, including hotspots, at a given site. Targets include hotspots, which are areas of extreme or particular contamination, and any object or feature that is hidden from direct visual observation--including buried objects and geologic or hydrologic details or anomalies. Being Bayesian, results are fundamentally consistent with observational methods. Results are tools for planning or interpreting exploration programs used in site investigation or characterization, remedial design, construction, or compliance monitoring, including site closure. Used skillfully and creatively, these tools can help streamline and expedite environmental restoration, reducing time and cost, making site exploration cost-effective, and providing acceptable risk at minimum cost. 14 refs., 4 figs
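    In the simplest special case the report's concepts reduce to a closed form: if n sampling points are placed independently and uniformly at random over a site of area A, the probability that at least one lands on a hotspot of area a is 1 - (1 - a/A)^n, which can be inverted to give the number of points needed for a target detection probability. A sketch under those simplified (non-Bayesian) assumptions, not the report's full method:

```python
import math

def detection_probability(n, hotspot_area, site_area):
    """P(at least one of n independent uniform random points hits the hotspot)."""
    return 1.0 - (1.0 - hotspot_area / site_area) ** n

def points_required(target_prob, hotspot_area, site_area):
    """Smallest n such that detection_probability(n, ...) >= target_prob."""
    miss = 1.0 - hotspot_area / site_area
    return math.ceil(math.log(1.0 - target_prob) / math.log(miss))

# A hotspot covering 1% of the site: 299 random points give ~95% detection.
n = points_required(0.95, 1.0, 100.0)
print(n, detection_probability(n, 1.0, 100.0))
```

    The steep growth of n as a/A shrinks is the core trade-off behind planning exploration programs of this kind.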

  17. Medicine in words and numbers: a cross-sectional survey comparing probability assessment scales

    Directory of Open Access Journals (Sweden)

    Koele Pieter

    2007-06-01

    Full Text Available Abstract Background In the complex domain of medical decision making, reasoning under uncertainty can benefit from supporting tools. Automated decision support tools often build upon mathematical models, such as Bayesian networks. These networks require probabilities which often have to be assessed by experts in the domain of application. Probability response scales can be used to support the assessment process. We compare assessments obtained with different types of response scale. Methods General practitioners (GPs gave assessments on and preferences for three different probability response scales: a numerical scale, a scale with only verbal labels, and a combined verbal-numerical scale we had designed ourselves. Standard analyses of variance were performed. Results No differences in assessments over the three response scales were found. Preferences for type of scale differed: the less experienced GPs preferred the verbal scale, the most experienced preferred the numerical scale, with the groups in between having a preference for the combined verbal-numerical scale. Conclusion We conclude that all three response scales are equally suitable for supporting probability assessment. The combined verbal-numerical scale is a good choice for aiding the process, since it offers numerical labels to those who prefer numbers and verbal labels to those who prefer words, and accommodates both more and less experienced professionals.

  18. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter Many examples are discussed in detail, and there are a large number of exercises The book is accessible to advanced undergraduates and can be used as a text for self-study This new edition contains substantial revisions and updated references The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures Proofs for a number of some important results which were merely stated in the first edition have been added The author included new material on the probability of large deviations, and on the central limit theorem for sums of dependent random variables

  19. Method for rapidly determining a pulp kappa number using spectrophotometry

    Science.gov (United States)

    Chai, Xin-Sheng; Zhu, Jun Yong

    2002-01-01

    A system and method for rapidly determining the pulp kappa number through direct measurement of the potassium permanganate concentration in a pulp-permanganate solution using spectrophotometry. Specifically, the present invention uses strong acidification to carry out the pulp-permanganate oxidation reaction in the pulp-permanganate solution to prevent the precipitation of manganese dioxide (MnO.sub.2). Consequently, spectral interference from the precipitated MnO.sub.2 is eliminated and the oxidation reaction becomes dominant. The spectral intensity of the oxidation reaction is then analyzed to determine the pulp kappa number.
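    The spectrophotometric step rests on the Beer-Lambert law: the measured absorbance gives the residual permanganate concentration, and the kappa number is by definition the millilitres of 0.1 N (0.02 mol/L) permanganate consumed per gram of oven-dry pulp. A simplified sketch omitting the standard 50%-consumption correction; the molar absorptivity and the sample numbers are assumptions for illustration, not values from the patent:

```python
def residual_permanganate(absorbance, molar_absorptivity=2455.0, path_cm=1.0):
    """Beer-Lambert: c = A / (eps * l), in mol/L.
    eps of roughly 2455 L/(mol*cm) for MnO4- near 525 nm is a literature
    value assumed here, not a figure from the patent."""
    return absorbance / (molar_absorptivity * path_cm)

def kappa_uncorrected(c_initial, c_residual, volume_ml, pulp_g):
    """mL of 0.02 mol/L (0.1 N) KMnO4 consumed per gram of oven-dry pulp,
    without the 50%-consumption correction of the standard method."""
    consumed_mol = (c_initial - c_residual) * volume_ml / 1000.0
    return consumed_mol / 0.02 * 1000.0 / pulp_g

# Hypothetical run: 100 mL reaction volume, 0.5 g pulp, permanganate
# falling from 0.004 to 0.002 mol/L gives an uncorrected kappa of 20.
print(kappa_uncorrected(0.004, 0.002, 100.0, 0.5))
```

    The point of the patent's acidification step is precisely to keep this Beer-Lambert reading valid by preventing MnO₂ precipitate from scattering light.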

  20. An innovative method for offshore wind farm site selection based on the interval number with probability distribution

    Science.gov (United States)

    Wu, Yunna; Chen, Kaifeng; Xu, Hu; Xu, Chuanbo; Zhang, Haobo; Yang, Meng

    2017-12-01

    There is insufficient research relating to offshore wind farm site selection in China, and the current methods for site selection have some defects. First, information loss is caused by two factors: the implicit assumption that the probability distribution on the interval number is uniform, and ignoring the value of decision makers' (DMs') common opinion in evaluating the criteria information. Second, the difference in DMs' utility functions has failed to receive attention. An innovative method is proposed in this article to address these drawbacks. First, a new form of interval number and its weighted operator are proposed to reflect the uncertainty and reduce information loss. Second, a new stochastic dominance degree is proposed to quantify the interval number with a probability distribution. Third, a two-stage method integrating the weighted operator with the stochastic dominance degree is proposed to evaluate the alternatives. Finally, a case from China proves the effectiveness of this method.

  1. The average number of partons per clan in rapidity intervals in parton showers

    Energy Technology Data Exchange (ETDEWEB)

    Giovannini, A. [Turin Univ. (Italy). Ist. di Fisica Teorica; Lupia, S. [Max-Planck-Institut fuer Physik, Muenchen (Germany). Werner-Heisenberg-Institut; Ugoccioni, R. [Lund Univ. (Sweden). Dept. of Theoretical Physics

    1996-04-01

    The dependence of the average number of partons per clan on virtuality and rapidity variables is analytically predicted in the framework of the Generalized Simplified Parton Shower model, based on the idea that clans are genuine elementary subprocesses. The obtained results are found to be qualitatively consistent with experimental trends. This study extends previous results on the behavior of the average number of clans in virtuality and rapidity and shows how important physical quantities can be calculated analytically in a model based on essentials of QCD allowing local violations of the energy-momentum conservation law, still requiring its global validity. (orig.)

  2. The average number of partons per clan in rapidity intervals in parton showers

    International Nuclear Information System (INIS)

    Giovannini, A.; Lupia, S.; Ugoccioni, R.

    1996-01-01

    The dependence of the average number of partons per clan on virtuality and rapidity variables is analytically predicted in the framework of the Generalized Simplified Parton Shower model, based on the idea that clans are genuine elementary subprocesses. The obtained results are found to be qualitatively consistent with experimental trends. This study extends previous results on the behavior of the average number of clans in virtuality and rapidity and shows how important physical quantities can be calculated analytically in a model based on essentials of QCD allowing local violations of the energy-momentum conservation law, still requiring its global validity. (orig.)

  3. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
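    The best-known such problem is the matching (derangement) problem: the probability that a random permutation of n items leaves no item in its original place tends to 1/e as n grows, and is already very close for modest n. A quick simulation (illustrative, not taken from the article):

```python
import math
import random

def prob_no_fixed_point(n, trials=100_000, seed=7):
    """Estimate P(a uniform random permutation of n elements has no fixed point)."""
    rng = random.Random(seed)
    items = list(range(n))
    hits = 0
    for _ in range(trials):
        perm = items[:]
        rng.shuffle(perm)
        if all(p != i for i, p in enumerate(perm)):
            hits += 1
    return hits / trials

# Already at n = 10 the estimate sits within sampling noise of 1/e ~ 0.3679.
print(prob_no_fixed_point(10), 1 / math.e)
```

    The exact probability is the alternating series 1 - 1/1! + 1/2! - ... ± 1/n!, the partial sums of the series for e⁻¹, which is why 1/e appears.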

  4. Reduced probability of smoking cessation in men with increasing number of job losses and partnership breakdowns

    DEFF Research Database (Denmark)

    Kriegbaum, Margit; Larsen, Anne Mette; Christensen, Ulla

    2011-01-01

    and to study joint exposure to both. Methods Birth cohort study of smoking cessation of 6232 Danish men born in 1953 with a follow-up at age 51 (response rate 66.2%). History of unemployment and cohabitation was measured annually using register data. Information on smoking cessation was obtained … by a questionnaire. Results The probability of smoking cessation decreased with the number of job losses (ranging from 1 OR 0.54 (95% CI 0.46 to 0.64) to 3+ OR 0.41 (95% CI 0.30 to 0.55)) and of broken partnerships (ranging from 1 OR 0.74 (95% CI 0.63 to 0.85) to 3+ OR 0.50 (95% CI 0.39 to 0.63)). Furthermore …–23 years (OR 0.44, 95% CI 0.37 to 0.52)). Those who never cohabited and experienced one or more job losses had a particularly low chance of smoking cessation (OR 0.19, 95% CI 0.12 to 0.30). Conclusion The numbers of job losses and of broken partnerships were both inversely associated with the probability of smoking cessation.

  5. A General Probability Formula of the Number of Location Areas' Boundaries Crossed by a Mobile Between Two Successive Call Arrivals

    Institute of Scientific and Technical Information of China (English)

    Yi-Hua Zhu; Ding-Hua Shi; Yong Xiong; Ji Gao; He-Zhi Luo

    2004-01-01

    Mobility management is a challenging topic in mobile computing environment. Studying the situation of mobiles crossing the boundaries of location areas is significant for evaluating the costs and performances of various location management strategies. Hitherto, several formulae were derived to describe the probability of the number of location areas' boundaries crossed by a mobile. Some of them were widely used in analyzing the costs and performances of mobility management strategies. Utilizing the density evolution method of vector Markov processes, we propose a general probability formula of the number of location areas' boundaries crossed by a mobile between two successive calls. Fortunately, several widely-used formulae are special cases of the proposed formula.

  6. Scale-invariant transition probabilities in free word association trajectories

    Directory of Open Access Journals (Sweden)

    Martin Elias Costa

    2009-09-01

    Full Text Available Free-word association has been used as a vehicle to understand the organization of human thoughts. The original studies relied mainly on qualitative assertions, yielding the widely intuitive notion that trajectories of word associations are structured, yet considerably more random than organized linguistic text. Here we set out to determine a precise characterization of this space, generating a large number of word association trajectories in a web-implemented game. We embedded the trajectories in the graph of word co-occurrences from a linguistic corpus. To constrain possible transport models we measured the memory loss and the cycling probability. These two measures could not be reconciled by a bounded diffusive model, since the cycling probability was very high (16% of order-2 cycles), implying a majority of short-range associations, whereas the memory loss was very rapid (converging to the asymptotic value in ∼7 steps), which, in turn, forced a high fraction of long-range associations. We show that memory loss and cycling probabilities of free word association trajectories can be simultaneously accounted for by a model in which transitions are determined by a scale invariant probability distribution.
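    The order-2 cycling probability measured above has a simple baseline: for a walk that moves to a uniformly random neighbor at each step, the chance of immediately returning (an A → B → A cycle) from a node with d neighbors is 1/d whenever the previous node is among them. A toy simulation on the complete graph K4, where that baseline is exactly 1/3 (the graph and function names are illustrative, not from the paper):

```python
import random

def order2_cycle_rate(neighbors, steps=100_000, seed=3):
    """Fraction of transitions of a uniform random walk that immediately
    return to the node visited two steps earlier (an order-2 cycle).
    `neighbors` maps each node to a list of its adjacent nodes."""
    rng = random.Random(seed)
    node = next(iter(neighbors))
    prev = None
    cycles = counted = 0
    for _ in range(steps):
        nxt = rng.choice(neighbors[node])
        if prev is not None:
            counted += 1
            if nxt == prev:
                cycles += 1
        prev, node = node, nxt
    return cycles / counted

# Complete graph on 4 nodes: every node has 3 neighbors, so the
# expected order-2 cycling rate is 1/3.
k4 = {v: [u for u in range(4) if u != v] for v in range(4)}
print(order2_cycle_rate(k4))
```

    Real association graphs deviate from this uniform baseline, which is what makes the measured 16% cycling rate informative about the transition distribution.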

  7. Miniaturized most probable number for the enumeration of Salmonella sp in artificially contaminated chicken meat

    Directory of Open Access Journals (Sweden)

    FL Colla

    2014-03-01

    Full Text Available Salmonella is traditionally identified by conventional microbiological tests, but the enumeration of this bacterium is not used on a routine basis. Methods such as the most probable number (MPN), which utilize an array of multiple tubes, are time-consuming and expensive, whereas miniaturized most probable number (mMPN) methods, which use microplates, can be adapted for the enumeration of bacteria, saving time and materials. The aim of the present paper is to assess two mMPN methods for the enumeration of Salmonella sp in artificially-contaminated chicken meat samples. Microplates containing 24 wells (method A) and 96 wells (method B), both with peptone water as pre-enrichment medium and modified semi-solid Rappaport-Vassiliadis (MSRV) as selective enrichment medium, were used. The meat matrix consisted of 25 g of autoclaved ground chicken breast contaminated with dilutions of up to 10⁻⁶ of Salmonella Typhimurium (ST) and Escherichia coli (EC). In method A, the dilution 10⁻⁵ of Salmonella Typhimurium corresponded to >57 MPN/mL and the dilution 10⁻⁶ was equal to 30 MPN/mL. There was a correlation between the counts used for the artificial contamination of the samples and those recovered by mMPN, indicating that method A was sensitive for the enumeration of different levels of contamination of the meat matrix. In method B, there was no correlation between the inoculated dilutions and the mMPN results.

  8. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  9. Gluon saturation: Survival probability for leading neutrons in DIS

    International Nuclear Information System (INIS)

    Levin, Eugene; Tapia, Sebastian

    2012-01-01

    In this paper we discuss an example of a one-rapidity-gap process: the inclusive cross section of leading neutrons in deep inelastic scattering (DIS) with protons. The equations for this process are proposed and solved, providing an example of a theoretical calculation of the survival probability for one-rapidity-gap processes. It turns out that the value of the survival probability is small and that it decreases with energy.

  10. Estimated probability of the number of buildings damaged by the ...

    African Journals Online (AJOL)

    The analysis shows that the probability estimator of the building damage ... and homeowners) should reserve the cost of repair at least worth the risk of loss, to face ... Keywords: Citarum River; logistic regression; genetic algorithm; losses risk; ...

  11. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  12. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  13. Kualitas Air Sumur Gali Kelurahan Lubuk Buaya Kecamatan Koto Tangah Kota Padang Berdasarkan Indeks Most Probable Number (MPN

    Directory of Open Access Journals (Sweden)

    Randa Novalino

    2016-09-01

    Full Text Available Diarrhea is one of the diseases transmitted through water contaminated by a causative agent such as coliform bacteria. According to data from the Padang City Health Department (DKK Padang) for 2011, the incidence of diarrhea in Kelurahan Lubuk Buaya, Kecamatan Koto Tangah, was the highest in the city of Padang. The objective of this study was to determine the quality of dug-well water in Kelurahan Lubuk Buaya using the Most Probable Number (MPN) index, according to Regulation No. 416 of 1990 of the Indonesian Ministry of Health (Permenkes RI). The samples were dug-well water used in several neighborhood units (RT), drawn at random from previously selected community units (RW), yielding 15 samples. The study was conducted in two stages: collection of well-water samples together with observation of factors affecting water quality, and microbiological examination by the Most Probable Number (MPN) test. The test consisted of presumptive and confirmative stages in accordance with Permenkes RI. Of the wells examined, 73.33% did not meet the Permenkes RI standard because they contained coliforms at >50 per 100 mL of water; only 26.6% of wells met the established standard. Contributing factors include the location of pollution sources, parapet walls, drainage, well covers, and water-drawing equipment. Keywords: dug-well water quality, MPN, coliform

  14. Estimating the concordance probability in a survival analysis with a discrete number of risk groups.

    Science.gov (United States)

    Heller, Glenn; Mo, Qianxing

    2016-04-01

    A clinical risk classification system is an important component of a treatment decision algorithm. A measure used to assess the strength of a risk classification system is discrimination, and when the outcome is survival time, the most commonly applied global measure of discrimination is the concordance probability. The concordance probability represents the pairwise probability of lower patient risk given longer survival time. The c-index and the concordance probability estimate have been used to estimate the concordance probability when patient-specific risk scores are continuous. In the current paper, the concordance probability estimate and an inverse probability of censoring weighted c-index are modified to account for discrete risk scores. Simulations are generated to assess the finite-sample properties of the concordance probability estimate and the weighted c-index. An application of these measures of discriminatory power to a metastatic prostate cancer risk classification system is examined.
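For intuition, a bare-bones concordance calculation for right-censored survival data (a Harrell-type c-index, with the half-credit for tied risk scores that becomes important when risk takes only a few discrete values) can be sketched as follows. This illustrates the general definition only; it is not the authors' modified estimator, and the inverse-censoring weights discussed in the abstract are omitted.

```python
def c_index(times, events, risks):
    """Harrell-type c-index: among comparable pairs (the earlier time
    is an observed event), the fraction where the shorter survivor
    carries the higher risk score; tied risks count 0.5."""
    conc = total = 0.0
    n = len(times)
    for i in range(n):
        if not events[i]:
            continue  # a pair is comparable only if the earlier time is an event
        for j in range(n):
            if times[i] < times[j]:
                total += 1
                if risks[i] > risks[j]:
                    conc += 1.0
                elif risks[i] == risks[j]:
                    conc += 0.5  # ties are common with discrete risk groups
    return conc / total
```

With perfectly anti-concordant data (shorter survival always paired with higher risk) this returns 1.0; with a single risk group it returns 0.5, the no-discrimination baseline.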

  15. Uncertainty about probability: a decision analysis perspective

    International Nuclear Information System (INIS)

    Howard, R.A.

    1988-01-01

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability assigned to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, we find further that the probability of r occurrences in n exchangeable trials depends on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. In terms of coin tossing, this means that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, contrary to the usual thinking. A numerical example for nuclear power shows that the failure of one plant in a group with a low probability of failure can significantly increase the probability that must be assigned to the failure of a second plant in the group

  16. Time dependent and asymptotic neutron number probability distribution calculation using discrete Fourier transform

    International Nuclear Information System (INIS)

    Humbert, Ph.

    2005-01-01

    In this paper we consider the probability distribution of neutrons in a multiplying assembly. The problem is studied using a space independent one group neutron point reactor model without delayed neutrons. We recall the generating function methodology and analytical results obtained by G.I. Bell when the c 2 approximation is used and we present numerical solutions in the general case, without this approximation. The neutron source induced distribution is calculated using the single initial neutron distribution which satisfies a master (Kolmogorov backward) equation. This equation is solved using the generating function method. The generating function satisfies a differential equation and the probability distribution is derived by inversion of the generating function. Numerical results are obtained using the same methodology where the generating function is the Fourier transform of the probability distribution. Discrete Fourier transforms are used to calculate the discrete time dependent distributions and continuous Fourier transforms are used to calculate the asymptotic continuous probability distributions. Numerical applications are presented to illustrate the method. (author)
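The inversion step described above, evaluating the generating function on the unit circle and recovering the probability distribution with a discrete Fourier transform, can be illustrated in a few lines. This is a generic sketch, not the author's reactor code; it assumes the distribution's support is effectively contained in 0..N-1 (otherwise the DFT aliases the tail back onto the low counts).

```python
import numpy as np

def pmf_from_generating_function(G, N):
    """Recover p_0..p_{N-1} from a probability generating function G(z).

    G(z) = sum_n p_n z^n, so sampling G at the N-th roots of unity
    gives an inverse DFT of the p_n; a forward DFT (divided by N)
    recovers the probabilities.
    """
    z = np.exp(2j * np.pi * np.arange(N) / N)   # N-th roots of unity
    values = np.array([G(zk) for zk in z])
    return np.fft.fft(values).real / N
```

As a check, the Poisson generating function G(z) = exp(lambda*(z-1)) with lambda = 2 and N = 64 reproduces p_0 = e^-2 to machine precision, and the recovered probabilities sum to G(1) = 1.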

  17. Development and application of a most probable number-PCR assay to quantify flagellate populations in soil samples

    DEFF Research Database (Denmark)

    Fredslund, Line; Ekelund, Flemming; Jacobsen, Carsten Suhr

    2001-01-01

    This paper reports on the first successful molecular detection and quantification of soil protozoa. Quantification of heterotrophic flagellates and naked amoebae in soil has traditionally relied on dilution culturing techniques, followed by most-probable-number (MPN) calculations. Such methods...... are biased by differences in the culturability of soil protozoa and are unable to quantify specific taxonomic groups, and the results are highly dependent on the choice of media and the skills of the microscopists. Successful detection of protozoa in soil by DNA techniques requires (i) the development...

  18. Some applications of the fractional Poisson probability distribution

    International Nuclear Information System (INIS)

    Laskin, Nick

    2009-01-01

    Physical and mathematical applications of the recently invented fractional Poisson probability distribution have been presented. As a physical application, a new family of quantum coherent states has been introduced and studied. As mathematical applications, we have developed the fractional generalization of Bell polynomials, Bell numbers, and Stirling numbers of the second kind. The appearance of fractional Bell polynomials is natural if one evaluates the diagonal matrix element of the evolution operator in the basis of newly introduced quantum coherent states. Fractional Stirling numbers of the second kind have been introduced and applied to evaluate the skewness and kurtosis of the fractional Poisson probability distribution function. A representation of the Bernoulli numbers in terms of fractional Stirling numbers of the second kind has been found. In the limit case when the fractional Poisson probability distribution becomes the Poisson probability distribution, all of the above listed developments and implementations turn into the well-known results of the quantum optics and the theory of combinatorial numbers.

  19. Rapid kinetics of lysis in human natural cell-mediated cytotoxicity: some implications

    International Nuclear Information System (INIS)

    Bloom, E.T.; Babbitt, J.T.

    1983-01-01

    The entire lytic process of natural cell-mediated cytotoxicity against sensitive target cells can occur rapidly, within minutes. This was demonstrated by ⁵¹Cr release and in single-cell assays. At the cellular level, most of the target cell lysis occurred within 15-30 min after binding to effector cells. The enriched natural killer cell subpopulation of lymphocytes obtained by Percoll density gradient centrifugation (containing greater than 70% large granular lymphocytes (LGL)) was the most rapidly lytic population by ⁵¹Cr release. However, in the single-cell assay, the rate of lysis of bound target cells was quite similar for the LGL-enriched effector subpopulation and the higher-density subpopulation of effector cells recognized previously. Both the light and dense effector cells contained similar numbers of target-binding cells. That the light subpopulation effected lysis more rapidly and to a greater extent than the dense subpopulation therefore suggested that the low-density effector cells probably recycled more rapidly than those of higher density. This was corroborated by the finding that when conjugates were formed at 29 degrees C for the single-cell assay, a significant number of dead unconjugated targets could be observed only on the slides made with the LGL-enriched effector cells, not on those made with the dense effector cells. Lysis continued to increase in the ⁵¹Cr-release assay, probably because of recycling, recruitment, and/or heterogeneity of the effector cells, and/or because of heterogeneity or delayed death of the target cells

  20. A measure of mutual divergence among a number of probability distributions

    Directory of Open Access Journals (Sweden)

    J. N. Kapur

    1987-01-01

    major inequalities due to Shannon, Renyi and Holder. The inequalities are then used to obtain some useful results in information theory. In particular, measures are obtained for the mutual divergence among two or more probability distributions.

  1. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  2. Effect of energy level sequences and neutron–proton interaction on α-particle preformation probability

    International Nuclear Information System (INIS)

    Ismail, M.; Adel, A.

    2013-01-01

    A realistic density-dependent nucleon–nucleon (NN) interaction with a finite-range exchange part, which reproduces the nuclear matter saturation curve and the energy dependence of the nucleon–nucleus optical model potential, is used to calculate the preformation probability, Sα, of α-decay from different isotones with neutron numbers N=124, 126, 128, 130 and 132. We studied the variation of Sα with the proton number, Z, for each isotone and found the effect of the neutron and proton energy levels of the parent nuclei on the behavior of the α-particle preformation probability. We found that Sα increases regularly with the proton number when the proton pair in the α-particle is emitted from the same level and the neutron level sequence is not changed during the Z-variation. In this case the neutron–proton (n–p) interaction of the two levels contributing to the emission process is very small. On the contrary, if the proton or neutron level sequence is changed during the emission process, Sα behaves irregularly; the irregular behavior increases if both proton and neutron levels are changed. This behavior is accompanied by a change or rapid increase in the strength of the n–p interaction

  3. An automated technique for most-probable-number (MPN) analysis of densities of phagotrophic protists with lux-AB labelled bacteria as growth medium

    DEFF Research Database (Denmark)

    Ekelund, Flemming; Christensen, Søren; Rønn, Regin

    1999-01-01

    An automated modification of the most-probable-number (MPN) technique has been developed for enumeration of phagotrophic protozoa. The method is based on detection of prey depletion in micro titre plates rather than on presence of protozoa. A transconjugant Pseudomonas fluorescens DR54 labelled w...

  4. Electrofishing capture probability of smallmouth bass in streams

    Science.gov (United States)

    Dauwalter, D.C.; Fisher, W.L.

    2007-01-01

    Abundance estimation is an integral part of understanding the ecology and advancing the management of fish populations and communities. Mark-recapture and removal methods are commonly used to estimate the abundance of stream fishes. Alternatively, abundance can be estimated by dividing the number of individuals sampled by the probability of capture. We conducted a mark-recapture study and used multiple repeated-measures logistic regression to determine the influence of fish size, sampling procedures, and stream habitat variables on the cumulative capture probability for smallmouth bass Micropterus dolomieu in two eastern Oklahoma streams. The predicted capture probability was used to adjust the number of individuals sampled to obtain abundance estimates. The observed capture probabilities were higher for larger fish and decreased with successive electrofishing passes for larger fish only. Model selection suggested that the number of electrofishing passes, fish length, and mean thalweg depth affected capture probabilities the most; there was little evidence for any effect of electrofishing power density and woody debris density on capture probability. Leave-one-out cross validation showed that the cumulative capture probability model predicts smallmouth abundance accurately. ?? Copyright by the American Fisheries Society 2007.
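The adjustment described above, dividing the number of individuals sampled by the probability of capture, reduces to a one-liner once a cumulative capture probability is in hand. The sketch below assumes a constant per-pass capture probability p over k passes, so the cumulative probability is q = 1 - (1-p)^k; the paper's logistic-regression model instead lets capture probability vary with fish length, pass number, and habitat, which this illustration does not attempt.

```python
def abundance_estimate(catch_total, per_pass_p, passes):
    """Estimate true abundance N = C / q, where C is the total catch
    and q = 1 - (1 - p)^k is the cumulative capture probability over
    k electrofishing passes with constant per-pass probability p."""
    q = 1.0 - (1.0 - per_pass_p) ** passes
    return catch_total / q
```

For example, catching 30 fish over two passes with a per-pass capture probability of 0.5 gives q = 0.75 and an abundance estimate of 40.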

  5. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete ProbabilityThe Cast of Characters Properties of Probability Simulation Random SamplingConditional ProbabilityIndependenceDiscrete DistributionsDiscrete Random Variables, Distributions, and ExpectationsBernoulli and Binomial Random VariablesGeometric and Negative Binomial Random Variables Poisson DistributionJoint, Marginal, and Conditional Distributions More on ExpectationContinuous ProbabilityFrom the Finite to the (Very) Infinite Continuous Random Variables and DistributionsContinuous ExpectationContinuous DistributionsThe Normal Distribution Bivariate Normal DistributionNew Random Variables from OldOrder Statistics Gamma DistributionsChi-Square, Student's t, and F-DistributionsTransformations of Normal Random VariablesAsymptotic TheoryStrong and Weak Laws of Large Numbers Central Limit TheoremStochastic Processes and ApplicationsMarkov ChainsPoisson Processes QueuesBrownian MotionFinancial MathematicsAppendixIntroduction to Mathematica Glossary of Mathematica Commands for Probability Short Answers...

  6. Unitarity corrections to short-range order long-range rapidity correlations

    CERN Document Server

    Capella, A

    1978-01-01

    Although the effective hadronic forces have short range in rapidity space, one nevertheless expects long-range dynamical correlations induced by unitarity constraints. This paper contains a thorough discussion of long-range rapidity correlations in high-multiplicity events. In particular, the authors analyze in detail the forward-backward multiplicity correlations, measured recently in the whole CERN ISR energy range. They find from these data that the normalized variance of the number n of exchanged cut Pomerons, ⟨(n/⟨n⟩ − 1)²⟩, is most probably in the range 0.32 to 0.36. They show that such a number is obtained from Reggeon theory in the eikonal approximation. The authors also predict a very specific violation of local compensation of charge in multiparticle events: the violation should appear in the fourth-order zone correlation function and is absent in the second-order correlation function, the only one measured until now. (48 refs).

  7. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look-out, etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds ... probability, i.e. a study of the navigator's role in resolving critical situations; a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of the energy released for crushing of structures giving...

  8. On the shake-off probability for atomic systems

    Energy Technology Data Exchange (ETDEWEB)

    Santos, A.C.F., E-mail: toniufrj@gmail.com [Instituto de Física, Universidade Federal do Rio de Janeiro, P.O. Box 68528, 21941-972 Rio de Janeiro, RJ (Brazil); Almeida, D.P. [Departamento de Física, Universidade Federal de Santa Catarina, 88040-900 Florianópolis (Brazil)

    2016-07-15

    Highlights: • The scope is to find the relationship among SO probabilities, Z and electron density. • A scaling law is suggested, allowing us to find the SO probabilities for atoms. • SO probabilities have been scaled as a function of target Z and polarizability. - Abstract: The main focus of this work is the relationship between shake-off probabilities, target atomic number, and electron density. By comparing the saturation values of measured double-to-single photoionization ratios from the literature, a simple scaling law has been found which allows us to predict the shake-off probabilities for several elements up to Z = 54 within a factor of 2. The electron shake-off probabilities accompanying valence-shell photoionization have been scaled as a function of the target atomic number, Z, and polarizability, α. This behavior is in qualitative agreement with the experimental results.

  9. Impact of controlling the sum of error probability in the sequential probability ratio test

    Directory of Open Access Journals (Sweden)

    Bijoy Kumarr Pradhan

    2013-05-01

    Full Text Available A generalized modified method is proposed to control the sum of the error probabilities in the sequential probability ratio test so as to minimize the weighted average of the two average sample numbers under a simple null hypothesis and a simple alternative hypothesis, with the restriction that the sum of the error probabilities is a pre-assigned constant, in order to find the optimal sample size. Finally, a comparison is made with the optimal sample size obtained from the fixed-sample-size procedure. The results are applied to the cases where the random variate follows a normal law as well as a Bernoullian law.
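As background, Wald's sequential probability ratio test, on which such modifications build, can be sketched for Bernoulli data as follows. This is a generic illustration, not the paper's generalized method; the stopping thresholds use the standard Wald approximations A ≈ (1−β)/α and B ≈ β/(1−α).

```python
import math

def sprt(samples, p0, p1, alpha, beta):
    """Wald SPRT for Bernoulli observations: H0: p = p0 vs H1: p = p1
    (p1 > p0), with nominal error probabilities alpha and beta.
    Returns ('H0' | 'H1' | 'continue', number of samples used)."""
    lower = math.log(beta / (1.0 - alpha))   # accept H0 at or below this
    upper = math.log((1.0 - beta) / alpha)   # accept H1 at or above this
    llr = 0.0
    for n, x in enumerate(samples, 1):
        # accumulate the log-likelihood ratio for one observation
        llr += math.log((p1 if x else 1.0 - p1) / (p0 if x else 1.0 - p0))
        if llr <= lower:
            return 'H0', n
        if llr >= upper:
            return 'H1', n
    return 'continue', len(samples)
```

With p0 = 0.2, p1 = 0.8 and alpha = beta = 0.05, three successes in a row already cross the upper threshold, illustrating the small average sample numbers that make the weighted-ASN criterion in the abstract meaningful.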

  10. Probability shapes perceptual precision: A study in orientation estimation.

    Science.gov (United States)

    Jabar, Syaheed B; Anderson, Britt

    2015-12-01

    Probability is known to affect perceptual estimations, but an understanding of mechanisms is lacking. Moving beyond binary classification tasks, we had naive participants report the orientation of briefly viewed gratings where we systematically manipulated contingent probability. Participants rapidly developed faster and more precise estimations for high-probability tilts. The shapes of their error distributions, as indexed by a kurtosis measure, also showed a distortion from Gaussian. This kurtosis metric was robust, capturing probability effects that were graded, contextual, and varying as a function of stimulus orientation. Our data can be understood as a probability-induced reduction in the variability or "shape" of estimation errors, as would be expected if probability affects the perceptual representations. As probability manipulations are an implicit component of many endogenous cuing paradigms, changes at the perceptual level could account for changes in performance that might have traditionally been ascribed to "attention." (c) 2015 APA, all rights reserved.

  11. Influence of Coloured Correlated Noises on Probability Distribution and Mean of Tumour Cell Number in the Logistic Growth Model

    Institute of Scientific and Technical Information of China (English)

    HAN Li-Bo; GONG Xiao-Long; CAO Li; WU Da-Jin

    2007-01-01

    An approximate Fokker-Planck equation for the logistic growth model driven by coloured correlated noises is derived by applying the Novikov theorem and the Fox approximation. The steady-state probability distribution (SPD) and the mean of the tumour cell number are analysed. It is found that the SPD is a single-extremum configuration when the degree of correlation between the multiplicative and additive noises, λ, is in −1<λ≤0, and can have double extrema in 0<λ<1. A configuration transition occurs because of the variation of the noise parameters. A minimum appears in the curve of the mean of the steady-state tumour cell number, 〈x〉, versus λ. The position and the value of the minimum are controlled by the noise correlation times.

  12. Elements of probability and statistics an introduction to probability with De Finetti’s approach and to Bayesian statistics

    CERN Document Server

    Biagini, Francesca

    2016-01-01

    This book provides an introduction to elementary probability and to Bayesian statistics using de Finetti's subjectivist approach. One of the features of this approach is that it does not require the introduction of sample space – a non-intrinsic concept that makes the treatment of elementary probability unnecessarily complicated – but instead introduces as fundamental the concept of random numbers directly related to their interpretation in applications. Events become a particular case of random numbers, and probability a particular case of expectation when it is applied to events. The subjective evaluation of expectation and of conditional expectation is based on an economic choice of an acceptable bet or penalty. The properties of expectation and conditional expectation are derived by applying a coherence criterion that the evaluation has to follow. The book is suitable for all introductory courses in probability and statistics for students in Mathematics, Informatics, Engineering, and Physics.

  13. Characterizing the Frequency and Elevation of Rapid Drainage Events in West Greenland

    Science.gov (United States)

    Cooley, S.; Christoffersen, P.

    2016-12-01

    Rapid drainage of supraglacial lakes on the Greenland Ice Sheet is critical for the establishment of surface-to-bed hydrologic connections and the subsequent transfer of water from surface to bed. Yet, estimates of the number and spatial distribution of rapidly draining lakes vary widely due to limitations in the temporal frequency of image collection and obscuration by cloud cover. So far, no study has assessed the impact of these observation biases. In this study, we examine the frequency and elevation of rapidly draining lakes in central West Greenland, from 68°N to 72.6°N, and we make a robust statistical analysis to estimate more accurately the likelihood of lakes draining rapidly. Using MODIS imagery and a fully automated lake detection method, we map more than 500 supraglacial lakes per year over a 63000 km2 study area from 2000-2015. Through testing four different definitions of rapidly draining lakes from previously published studies, we find that the fraction of rapidly draining lakes varies from 3% to 38%. Logistic regression between rapid drainage events and image sampling frequency demonstrates that the number of rapid drainage events is strongly dependent on cloud-free observation percentage. We then develop three new drainage criteria and apply an observation bias correction that suggests a true rapid drainage probability between 36% and 45%, considerably higher than previous studies without bias assessment have reported. We find rapid-draining lakes are on average larger and disappear earlier than slow-draining lakes, and we observe no elevation differences for the lakes detected as rapidly draining. We conclude a) that methodological problems in rapid drainage research caused by observation bias and varying detection methods have obscured large-scale rapid drainage characteristics and b) that the lack of evidence for an elevation limit on rapid drainage suggests surface-to-bed hydrologic connections may continue to propagate inland as the climate warms.

  14. The Most Probable Limit of Detection (MPL) for rapid microbiological methods

    NARCIS (Netherlands)

    Verdonk, G.P.H.T.; Willemse, M.J.; Hoefs, S.G.G.; Cremers, G.; Heuvel, E.R. van den

    Classical microbiological methods nowadays have unacceptably long cycle times. Rapid methods, available on the market for decades, are already applied within the clinical and food industries, but their implementation in the pharmaceutical industry is hampered by, for instance, stringent regulations on

  15. The most probable limit of detection (MPL) for rapid microbiological methods

    NARCIS (Netherlands)

    Verdonk, G.P.H.T.; Willemse, M.J.; Hoefs, S.G.G.; Cremers, G.; Heuvel, van den E.R.

    2010-01-01

    Classical microbiological methods nowadays have unacceptably long cycle times. Rapid methods, available on the market for decades, are already applied within the clinical and food industries, but their implementation in the pharmaceutical industry is hampered by, for instance, stringent regulations on

  16. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic NotionsSample Space and EventsProbabilitiesCounting TechniquesIndependence and Conditional ProbabilityIndependenceConditioningThe Borel-Cantelli TheoremDiscrete Random VariablesRandom Variables and VectorsExpected ValueVariance and Other Moments. Inequalities for DeviationsSome Basic DistributionsConvergence of Random Variables. The Law of Large NumbersConditional ExpectationGenerating Functions. Branching Processes. Random Walk RevisitedBranching Processes Generating Functions Branching Processes Revisited More on Random WalkMarkov ChainsDefinitions and Examples. Probability Distributions of Markov ChainsThe First Step Analysis. Passage TimesVariables Defined on a Markov ChainErgodicity and Stationary DistributionsA Classification of States and ErgodicityContinuous Random VariablesContinuous DistributionsSome Basic Distributions Continuous Multivariate Distributions Sums of Independent Random Variables Conditional Distributions and ExpectationsDistributions in the General Case. SimulationDistribution F...

  17. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities, and their link to quantum computations, are discussed.

  18. Comparative evaluation of direct plating and most probable number for enumeration of low levels of Listeria monocytogenes in naturally contaminated ice cream products.

    Science.gov (United States)

    Chen, Yi; Pouillot, Régis; S Burall, Laurel; Strain, Errol A; Van Doren, Jane M; De Jesus, Antonio J; Laasri, Anna; Wang, Hua; Ali, Laila; Tatavarthy, Aparna; Zhang, Guodong; Hu, Lijun; Day, James; Sheth, Ishani; Kang, Jihun; Sahu, Surasri; Srinivasan, Devayani; Brown, Eric W; Parish, Mickey; Zink, Donald L; Datta, Atin R; Hammack, Thomas S; Macarisin, Dumitru

    2017-01-16

    A precise and accurate method for enumeration of low levels of Listeria monocytogenes in foods is critical to a variety of studies. In this study, a paired comparison of most probable number (MPN) and direct-plating enumeration of L. monocytogenes was conducted on a total of 1730 outbreak-associated ice cream samples that were naturally contaminated with low levels of L. monocytogenes. MPN was performed on all 1730 samples. Direct plating was performed on all samples using RAPID'L.mono (RLM) agar (1600 samples) and agar Listeria Ottaviani and Agosti (ALOA; 130 samples). A probabilistic analysis with a Bayesian inference model was used to compare paired direct-plating and MPN estimates of L. monocytogenes in the ice cream samples, because assumptions implicit in ordinary least squares (OLS) linear regression were not met for such a comparison. The probabilistic analysis revealed good agreement between the MPN and direct-plating estimates, and this agreement showed that the MPN schemes and the direct-plating schemes using ALOA or RLM evaluated in the present study were suitable for enumerating low levels of L. monocytogenes in these ice cream samples. The statistical analysis further revealed that OLS linear regression analyses of direct-plating and MPN data did introduce bias that incorrectly characterized systematic differences between estimates from the two methods. Published by Elsevier B.V.

  19. Finite-size scaling of survival probability in branching processes

    OpenAIRE

    Garcia-Millan, Rosalba; Font-Clos, Francesc; Corral, Alvaro

    2014-01-01

    Branching processes pervade many models in statistical physics. We investigate the survival probability of a Galton-Watson branching process after a finite number of generations. We reveal the finite-size scaling law of the survival probability for a given branching process ruled by a probability distribution of the number of offspring per element whose standard deviation is finite, obtaining the exact scaling function as well as the critical exponents. Our findings prove the universal behavi...
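The survival probability studied in this record is easy to probe numerically. Below is a minimal Monte Carlo sketch (our own, with Poisson offspring as one example of a finite-variance offspring distribution) estimating the probability that a Galton-Watson process is still alive after a finite number of generations.

```python
import random

def survival_probability(mean_offspring, generations, trials=20000, seed=1):
    """Monte Carlo estimate of P(Galton-Watson process alive after
    `generations` generations), with Poisson offspring of the given mean."""
    rng = random.Random(seed)

    def poisson(lam):
        # Knuth's multiplication algorithm; adequate for small lam
        L, k, p = pow(2.718281828459045, -lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= L:
                return k
            k += 1

    alive = 0
    for _ in range(trials):
        pop = 1
        for _ in range(generations):
            pop = sum(poisson(mean_offspring) for _ in range(pop))
            if pop == 0:
                break
            pop = min(pop, 10000)   # cap to keep supercritical runs cheap
        alive += pop > 0
    return alive / trials
```

At criticality (mean offspring 1), the estimate decays roughly like 2/(σ²g) with generation number g, which is the regime the finite-size scaling law in this record describes.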

  20. EDF: Computing electron number probability distribution functions in real space from molecular wave functions

    Science.gov (United States)

    Francisco, E.; Pendás, A. Martín; Blanco, M. A.

    2008-04-01

Computer: 2.80 GHz Intel Pentium IV CPU. Operating system: GNU/Linux. RAM: 55 992 KB. Word size: 32 bits. Classification: 2.7. External routines: Netlib. Nature of problem: Let us have an N-electron molecule and define an exhaustive partition of the physical space into m three-dimensional regions Ω_1, …, Ω_m. The edf program computes the probabilities P(n_1, n_2, …, n_m) ≡ P({n}) of all possible allocations of n_1 electrons to Ω_1, n_2 electrons to Ω_2, …, and n_m electrons to Ω_m, the n_k being integers. Solution method: Let us assume that the N-electron molecular wave function, Ψ(1, …, N), is a linear combination of M Slater determinants, Ψ(1, …, N) = ∑_r^M C_r ψ_r(1, …, N). Calling S_k^rs the overlap matrix over the 3D region Ω_k between the (real) molecular spin-orbitals (MSOs) in ψ_r(χ_1^r, …, χ_N^r) and the MSOs in ψ_s(χ_1^s, …, χ_N^s), edf finds all the P({n})'s by solving the linear system ∑_{n} (∏_k^m t_k^{n_k}) P({n}) = ∑_{r,s}^M C_r C_s det[∑_k^m t_k S_k^rs] (1), where t_1 = 1 and t_2, …, t_m are arbitrary real numbers. Restrictions: The number of {n} sets grows very fast with m and N, so that the dimension of the linear system (1) soon becomes very large. Moreover, the computer time required to obtain the determinants in the second member of Eq. (1) scales quadratically with M. These two facts limit the applicability of the method to relatively small molecules. Unusual features: Most of the real variables are of precision real*16. Running time: 0.030, 2.010, and 0.620 seconds for Test examples 1, 2, and 3, respectively. References: [1] A. Martín Pendás, E. Francisco, M.A. Blanco, Faraday Discuss. 135 (2007) 423-438. [2] A. Martín Pendás, E. Francisco, M.A. Blanco, J. Phys. Chem. A 111 (2007) 1084-1090. [3] A. Martín Pendás, E. Francisco, M.A. Blanco, Phys. Chem. Chem. Phys. 9 (2007) 1087-1092. [4] E. Francisco, A. Martín Pendás, M.A. Blanco, J. Chem. Phys. 126 (2007) 094102. [5] A. Martín Pendás, E. Francisco, M.A. Blanco, C. Gatti, Chemistry: A European Journal 113 (2007) 9362-9371.

  1. Charge transfer and rapidity gap analysis in p(π+)n interactions at 195 GeV/c

    International Nuclear Information System (INIS)

    Eisenberg, Y.; Haber, B.; Hochmann, D.; Karshon, U.; Ronat, E.E.; Shapira, A.; Yekutieli, G.

    1980-01-01

We present charge transfer probabilities between CM hemispheres in pn and π⁺n interactions at 195 GeV/c. The relative probabilities for charge exchanges ΔQ > 1 as a function of rapidity gap length, r, are given. Both results are compared with those of π⁻p interactions at 200 GeV/c. The average of r, viz. ⟨r⟩, is given as a function of the gap number and of ΔQ for various multiplicities, and the reduced average gap lengths ⟨r⟩/y_max for pn interactions are compared with data at a lower energy. (orig.)

  2. Foundations of quantization for probability distributions

    CERN Document Server

    Graf, Siegfried

    2000-01-01

    Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous and special classes of singular probabilities (surface measures, self-similar measures) presenting some new results for the first time. Written for researchers and graduate students in probability theory the monograph is of potential interest to all people working in the disciplines mentioned above.

  3. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  4. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  5. Probabilities on Streams and Reflexive Games

    Directory of Open Access Journals (Sweden)

    Andrew Schumann

    2014-01-01

Probability measures on streams (e.g. on hypernumbers and p-adic numbers) have been defined. It was shown that these probabilities can be used for simulations of reflexive games. In particular, it can be proved that Aumann's agreement theorem does not hold for these probabilities. Instead of this theorem, there is a statement that is called the reflexion disagreement theorem. Based on this theorem, probabilistic and knowledge conditions can be defined for reflexive games at various reflexion levels, up to the infinite level.

  6. Probability as a Physical Motive

    Directory of Open Access Journals (Sweden)

    Peter Martin

    2007-04-01

Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production ("MEP") to the information-theoretical "MaxEnt" principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand "the adjacent possible" as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  7. Calculation of transition probabilities using the multiconfiguration Dirac-Fock method

    International Nuclear Information System (INIS)

    Kim, Yong Ki; Desclaux, Jean Paul; Indelicato, Paul

    1998-01-01

    The performance of the multiconfiguration Dirac-Fock (MCDF) method in calculating transition probabilities of atoms is reviewed. In general, the MCDF wave functions will lead to transition probabilities accurate to ∼ 10% or better for strong, electric-dipole allowed transitions for small atoms. However, it is more difficult to get reliable transition probabilities for weak transitions. Also, some MCDF wave functions for a specific J quantum number may not reduce to the appropriate L and S quantum numbers in the nonrelativistic limit. Transition probabilities calculated from such MCDF wave functions for nonrelativistically forbidden transitions are unreliable. Remedies for such cases are discussed

  8. Re‐estimated effects of deep episodic slip on the occurrence and probability of great earthquakes in Cascadia

    Science.gov (United States)

    Beeler, Nicholas M.; Roeloffs, Evelyn A.; McCausland, Wendy

    2013-01-01

Mazzotti and Adams (2004) estimated that rapid deep slip during typically two-week-long episodes beneath northern Washington and southern British Columbia increases the probability of a great Cascadia earthquake by 30–100 times relative to the probability during the ∼58 weeks between slip events. Because the corresponding absolute probability remains very low at ∼0.03% per week, their conclusion is that though it is more likely that a great earthquake will occur during a rapid slip event than during other times, a great earthquake is unlikely to occur during any particular rapid slip event. This previous estimate used a failure model in which great earthquakes initiate instantaneously at a stress threshold. We refine the estimate, assuming a delayed failure model that is based on laboratory-observed earthquake initiation. Laboratory tests show that failure of intact rock in shear and the onset of rapid slip on pre-existing faults do not occur at a threshold stress. Instead, slip onset is gradual and shows a damped response to stress and loading rate changes. The characteristic time of failure depends on loading rate and effective normal stress. Using this model, the probability enhancement during the period of rapid slip in Cascadia is negligible for effective normal stresses of 10 MPa or more and increases by only a factor of 1.5 for an effective normal stress of 1 MPa. We present arguments that the hypocentral effective normal stress exceeds 1 MPa. In addition, the probability enhancement due to rapid slip extends into the interevent period. With this delayed failure model, for effective normal stresses greater than or equal to 50 kPa, it is more likely that a great earthquake will occur between the periods of rapid deep slip than during them. Our conclusion is that great earthquake occurrence is not significantly enhanced by episodic deep slip events.

  9. Exploring non-signalling polytopes with negative probability

    International Nuclear Information System (INIS)

    Oas, G; Barros, J Acacio de; Carvalhaes, C

    2014-01-01

    Bipartite and tripartite EPR–Bell type systems are examined via joint quasi-probability distributions where probabilities are permitted to be negative. It is shown that such distributions exist only when the no-signalling condition is satisfied. A characteristic measure, the probability mass, is introduced and, via its minimization, limits the number of quasi-distributions describing a given marginal probability distribution. The minimized probability mass is shown to be an alternative way to characterize non-local systems. Non-signalling polytopes for two to eight settings in the bipartite scenario are examined and compared to prior work. Examining perfect cloning of non-local systems within the tripartite scenario suggests defining two categories of signalling. It is seen that many properties of non-local systems can be efficiently described by quasi-probability theory. (paper)

  10. METHOD OF FOREST FIRES PROBABILITY ASSESSMENT WITH POISSON LAW

    Directory of Open Access Journals (Sweden)

    A. S. Plotnikova

    2016-01-01

The article describes a method for estimating forest fire burn probability on the basis of the Poisson distribution. The λ parameter is assumed to be the mean daily number of fires detected for each Forest Fire Danger Index class within a specific period of time; thus, λ was calculated separately for the spring, summer and autumn seasons. Multi-annual daily Forest Fire Danger Index values together with an EO-derived hot-spot map were the input data for the statistical analysis. The major result of the study is the generation of a database on forest fire burn probability. Results were validated against EO daily data on forest fires detected over Irkutsk oblast in 2013. Daily weighted average probability was shown to be linked with the daily number of detected forest fires. Meanwhile, a number of fires were found to have developed when the estimated probability was low. A possible explanation of this phenomenon is provided.
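Under the Poisson model described in this record, the burn probability for a day with mean fire count λ follows directly from the Poisson law, P(at least one fire) = 1 − e^(−λ). A minimal sketch (function names are our own):

```python
import math

def burn_probability(lam_daily, days=1):
    """P(at least one fire in `days` days) under a Poisson model with
    mean daily fire count `lam_daily` for a given danger-index class."""
    return 1.0 - math.exp(-lam_daily * days)

def pois_pmf(k, lam):
    """P(exactly k fires in one day) under the same model."""
    return math.exp(-lam) * lam**k / math.factorial(k)
```

For example, a class with λ = 0.2 detected fires per day yields a daily burn probability of about 18%.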

  11. Using Fuzzy Probability Weights in Cumulative Prospect Theory

    Directory of Open Access Journals (Sweden)

    Užga-Rebrovs Oļegs

    2016-12-01

During the past years, descriptive approaches to decision choice have grown rapidly. As opposed to normative expected utility theory, these approaches are based on individuals' subjective perception of probabilities, which takes place in real situations of risky choice. This kind of perception is modelled on the basis of probability weighting functions. In cumulative prospect theory, which is the focus of this paper, decision weights for prospect outcomes are calculated from the obtained probability weights. If the value functions are constructed on the sets of positive and negative outcomes, then, based on the outcome value evaluations and outcome decision weights, generalised evaluations of prospect value are calculated, which are the basis for choosing an optimal prospect.
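This abstract does not name a specific weighting function; a common concrete choice is the Tversky-Kahneman (1992) form w(p) = p^γ / (p^γ + (1−p)^γ)^(1/γ), with rank-dependent decision weights taken as differences of w over cumulated probabilities. A sketch under that assumption (our own function names):

```python
def tk_weight(p, gamma=0.61):
    """Tversky-Kahneman (1992) probability weighting function."""
    return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

def decision_weights(probs, gamma=0.61):
    """Cumulative (rank-dependent) decision weights for outcomes already
    sorted from best to worst: pi_i = w(p_1+...+p_i) - w(p_1+...+p_{i-1})."""
    weights, cum, prev = [], 0.0, 0.0
    for p in probs:
        cum += p
        w = tk_weight(cum, gamma)
        weights.append(w - prev)
        prev = w
    return weights
```

With γ ≈ 0.61, small probabilities are overweighted (w(0.01) ≈ 0.05) and large ones underweighted, giving the inverse-S shape that cumulative prospect theory relies on.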

  12. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  13. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height...... associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination...... of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions....

  14. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

The influence of “Grundbegriffe” by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory “calling for” an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, the dual maps to random variables) have very different “mathematical nature”. Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events and elementary category theory, covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing “fractions” of classical random events, and we upgrade the notions of probability measure and random variable.

  15. Rapid enumeration of low numbers of moulds in tea based drinks using an automated system.

    Science.gov (United States)

    Tanaka, Kouichi; Yamaguchi, Nobuyasu; Baba, Takashi; Amano, Norihide; Nasu, Masao

    2011-01-31

Aseptically prepared cold drinks based on tea have become popular worldwide. Contamination of these drinks with harmful microbes is a potential health problem because such drinks are kept free from preservatives to maximize aroma and flavour. Heat-tolerant conidia and ascospores of fungi can survive pasteurization, and need to be detected as quickly as possible. We were able to rapidly and accurately detect low numbers of conidia and ascospores in tea-based drinks using fluorescent staining followed by an automated counting system. Conidia or ascospores were inoculated into green tea and oolong tea, and samples were immediately filtered through nitrocellulose membranes (pore size: 0.8 μm) to concentrate fungal propagules. These were transferred onto potato dextrose agar and incubated for 23 h at 28 °C. Fungi germinating on the membranes were fluorescently stained for 30 min. The stained mycelia were counted selectively within 90 s using an automated counting system (MGS-10LD; Chuo Electric Works, Osaka, Japan). Very low numbers (1 CFU/100 ml) of conidia or ascospores could be rapidly counted, in contrast to traditional labour-intensive techniques. All tested mould strains were detected within 24 h, while conventional plate counting required 72 h for colony enumeration. Counts of slow-growing fungi (Cladosporium cladosporioides) obtained by automated counting and by conventional plate counting were close (r² = 0.986). Our combination of methods enables counting of both fast- and slow-growing fungi, and should be useful for microbiological quality control of tea-based and also other drinks. Copyright © 2011 Elsevier B.V. All rights reserved.

  16. Box-particle probability hypothesis density filtering

    OpenAIRE

    Schikora, M.; Gning, A.; Mihaylova, L.; Cremers, D.; Koch, W.

    2014-01-01

    This paper develops a novel approach for multitarget tracking, called box-particle probability hypothesis density filter (box-PHD filter). The approach is able to track multiple targets and estimates the unknown number of targets. Furthermore, it is capable of dealing with three sources of uncertainty: stochastic, set-theoretic, and data association uncertainty. The box-PHD filter reduces the number of particles significantly, which improves the runtime considerably. The small number of box-p...

  17. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real-valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P

  18. A procedure for estimation of pipe break probabilities due to IGSCC

    International Nuclear Information System (INIS)

    Bergman, M.; Brickstad, B.; Nilsson, F.

    1998-06-01

A procedure has been developed for estimating the failure probability of weld joints in nuclear piping susceptible to intergranular stress corrosion cracking. The procedure aims at a robust and rapid estimate of the failure probability for a specific weld with a known stress state. The crack initiation rate, the initial crack length, the in-service inspection efficiency and the leak rate are treated as random properties. A computer realization of the procedure has been developed for user-friendly application by design engineers. Some examples are considered to investigate the sensitivity of the failure probability to different input quantities. (au)

  19. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  20. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  1. Detection probabilities for time-domain velocity estimation

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    1991-01-01

    programs, it is demonstrated that the probability of correct estimation depends on the signal-to-noise ratio, transducer bandwidth, number of A-lines and number of samples used in the correlation estimate. The influence of applying a stationary echo-canceler is explained. The echo canceling can be modeled...

  2. Waste Package Misload Probability

    International Nuclear Information System (INIS)

    Knudsen, J.K.

    2001-01-01

The objective of this calculation is to calculate the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and of FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in each event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a

  3. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)

  4. Probability mapping of contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).

  5. The estimation of collision probabilities in complicated geometries

    International Nuclear Information System (INIS)

    Roth, M.J.

    1969-04-01

    This paper demonstrates how collision probabilities in complicated geometries may be estimated. It is assumed that the reactor core may be divided into a number of cells each with simple geometry so that a collision probability matrix can be calculated for each cell by standard methods. It is then shown how these may be joined together. (author)

  6. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  7. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  8. PROCOPE, Collision Probability in Pin Clusters and Infinite Rod Lattices

    International Nuclear Information System (INIS)

    Amyot, L.; Daolio, C.; Benoist, P.

    1984-01-01

    1 - Nature of physical problem solved: Calculation of directional collision probabilities in pin clusters and infinite rod lattices. 2 - Method of solution: a) Gauss integration of analytical expressions for collision probabilities. b) alternately, an approximate closed expression (not involving integrals) may be used for pin-to-pin interactions. 3 - Restrictions on the complexity of the problem: number of fuel pins must be smaller than 62; maximum number of groups of symmetry is 300

  9. Establishment probability in newly founded populations

    Directory of Open Access Journals (Sweden)

    Gusset Markus

    2012-06-01

    Background: Establishment success in newly founded populations relies on reaching the established phase, which is defined by characteristic fluctuations of the population’s state variables. Stochastic population models can be used to quantify the establishment probability of newly founded populations; however, so far no simple but robust method for doing so existed. To determine a critical initial number of individuals that need to be released to reach the established phase, we used a novel application of the “Wissel plot”, where –ln(1 – P0(t)) is plotted against time t. This plot is based on the equation P0(t) = 1 – c1·e^(–ω1·t), which relates the probability of extinction by time t, P0(t), to two constants: c1 describes the probability of a newly founded population to reach the established phase, whereas ω1 describes the population’s probability of extinction per short time interval once established. Results: For illustration, we applied the method to a previously developed stochastic population model of the endangered African wild dog (Lycaon pictus). A newly founded population reaches the established phase if the intercept of the (extrapolated) linear parts of the “Wissel plot” with the y-axis, which is –ln(c1), is negative. For wild dogs in our model, this is the case if a critical initial number of four packs, consisting of eight individuals each, are released. Conclusions: The method we present to quantify the establishment probability of newly founded populations is generic and inferences thus are transferable to other systems across the field of conservation biology. In contrast to other methods, our approach disaggregates the components of a population’s viability by distinguishing establishment from persistence.
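
    The extinction relation underlying the record above, P0(t) = 1 – c1·e^(–ω1·t), implies –ln(1 – P0(t)) = –ln(c1) + ω1·t, so a straight-line fit to the tail of the "Wissel plot" recovers both constants. A minimal sketch of that fit (the function name and the synthetic data are illustrative assumptions, not from the paper):

```python
import math

def wissel_intercept(ts, p0):
    """Least-squares line through -ln(1 - P0(t)) vs t.
    Returns (intercept, slope): intercept estimates -ln(c1), slope omega1."""
    ys = [-math.log(1.0 - p) for p in p0]
    n = len(ts)
    mx, my = sum(ts) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in ts)
    sxy = sum((x - mx) * (y - my) for x, y in zip(ts, ys))
    slope = sxy / sxx
    return my - slope * mx, slope

# Synthetic check with known constants c1 = 1.5, omega1 = 0.2:
c1, w1 = 1.5, 0.2
ts = list(range(10, 51))
p0 = [1.0 - c1 * math.exp(-w1 * t) for t in ts]
intercept, slope = wissel_intercept(ts, p0)
# intercept estimates -ln(1.5) < 0, i.e. the established phase is reached
```

    Because the synthetic data follow the model exactly, the fit recovers –ln(c1) and ω1 to machine precision; on simulated extinction curves only the (extrapolated) linear tail should be used.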

  10. Failure frequencies and probabilities applicable to BWR and PWR piping

    International Nuclear Information System (INIS)

    Bush, S.H.; Chockie, A.D.

    1996-03-01

    This report deals with failure probabilities and failure frequencies of nuclear plant piping and the failure frequencies of flanges and bellows. Piping failure probabilities are derived from Piping Reliability Analysis Including Seismic Events (PRAISE) computer code calculations based on fatigue and intergranular stress corrosion as failure mechanisms. Values for both failure probabilities and failure frequencies are cited from several sources to yield a better evaluation of the spread in mean and median values as well as the widths of the uncertainty bands. A general conclusion is that the numbers from WASH-1400 often used in PRAs are unduly conservative. Failure frequencies for both leaks and large breaks tend to be higher than would be calculated using the failure probabilities, primarily because the frequencies are based on a relatively small number of operating years. Also, failure probabilities are substantially lower because of the probability distributions used in PRAISE calculations. A general conclusion is that large LOCA probability values calculated using PRAISE will be quite small, on the order of less than 1E-8 per year (<1E-8/year). The values in this report should be recognized as having inherent limitations and should be considered as estimates and not absolute values. 24 refs

  11. Probability of pregnancy as affected by oestrus number and days to first oestrus in dairy cows of three breeds and parities

    DEFF Research Database (Denmark)

    Friggens, N C; Labouriau, R

    2010-01-01

    An improved understanding of the animal factors that affect measures such as conception rate would contribute to solving the problem of impaired reproductive performance in modern dairy cows. A question of particular interest relates to the observed improvement in conception rates from first...... to second and third oestrus cycle: is the increase in conception rate related to cycle number per se or to increasing days from calving? A 5-year study using three breeds (Holstein, Jersey and Danish Red) allowed this issue to be examined. In 560 lactations, from calving until confirmed pregnancy or until...... model. Danish Red cows had a significantly greater rate of occurrence of first oestrus over time. Generalized linear mixed models defined using a binomial distribution and logit link function were used to estimate probability of pregnancy as affected by: breed, parity, oestrus number and days from...

  12. The transmission probability method in one-dimensional cylindrical geometry

    International Nuclear Information System (INIS)

    Rubin, I.E.

    1983-01-01

    The collision probability method widely used in solving neutron transport problems in a reactor cell is reliable for simple cells with a small number of zones. Increasing the number of zones, and also taking into account the anisotropy of scattering, greatly increases the scope of the calculations. In order to reduce the calculation time, it is suggested that the transmission probability method be used for flux calculations in one-dimensional cylindrical geometry, taking into account scattering anisotropy. The efficiency of the suggested method is verified using one-group calculations for cylindrical cells. The use of the transmission probability method allows the angular and spatial dependences of the neutron distributions to be represented completely without increasing the scope of the calculations. The method is especially effective in solving multigroup problems.

  13. Tumor control probability after a radiation of animal tumors

    International Nuclear Information System (INIS)

    Urano, Muneyasu; Ando, Koichi; Koike, Sachiko; Nesumi, Naofumi

    1975-01-01

    Tumor control and regrowth probability of animal tumors irradiated with a single x-ray dose were determined using a spontaneous C3H mouse mammary carcinoma. Cellular radiation sensitivity of tumor cells and tumor control probability of the tumor were examined by the TD50 and TCD50 assays, respectively. Tumor growth kinetics were measured by counting the percentage of labelled mitoses and by measuring the growth curve. A mathematical analysis of tumor control probability was made from these results. The formula proposed accounts for cell population kinetics (a division probability model), cell sensitivity to radiation, and the number of tumor cells. (auth.)
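
    A formula of the kind described, combining cell number and radiosensitivity into a control probability, is commonly written as a Poisson model, TCP = exp(−N·S(D)), with N the initial number of clonogenic cells and S(D) their surviving fraction at dose D. A sketch under these assumptions (single-hit survival S = exp(−D/D0); function names and parameter values are illustrative, not taken from the paper):

```python
import math

def tcp(dose, n_cells, d0):
    """Poisson tumour control probability: P(no clonogenic cell survives)."""
    surviving = n_cells * math.exp(-dose / d0)  # expected number of survivors
    return math.exp(-surviving)

def tcd50(n_cells, d0):
    """Dose at which TCP = 0.5, solving N*exp(-D/D0) = ln 2."""
    return d0 * math.log(n_cells / math.log(2))

# Illustrative numbers only: 1e8 clonogens, D0 = 3 Gy
dose50 = tcd50(1e8, 3.0)
```

    The logarithmic dependence of the TCD50 on N is why the assays of cell number and cell sensitivity are both needed before the control probability can be analysed.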

  14. The risk of major nuclear accident: calculation and perception of probabilities

    International Nuclear Information System (INIS)

    Leveque, Francois

    2013-01-01

    Before the Fukushima accident, eight major accidents had already occurred in nuclear power plants, a number higher than that expected by experts and rather close to the risk as perceived by the public. The author discusses how to understand these differences and reconcile observations, the objective probability of accidents and subjective assessments of risk; why experts have been over-optimistic; whether public opinion is irrational regarding nuclear risk; and how to measure risk and its perception. He thus addresses and discusses the following issues: risk calculation (cost, calculated frequency of major accidents, bias between the number of observed accidents and model predictions); perceived probabilities and aversion to disasters (biases in the perception of probability, perception biases unfavourable to nuclear power); and the Bayesian contribution and its application (the Bayes-Laplace law, statistics, choice of an a priori probability, prediction of the next event, probability of a core melt tomorrow)

  15. Collective probabilities algorithm for surface hopping calculations

    International Nuclear Information System (INIS)

    Bastida, Adolfo; Cruz, Carlos; Zuniga, Jose; Requena, Alberto

    2003-01-01

    General equations are derived that the transition probabilities of hopping algorithms in surface hopping calculations must obey to ensure equality between the average quantum and classical populations. These equations are solved for two particular cases. In the first, it is assumed that the probabilities are the same for all trajectories and that the number of hops is kept to a minimum. These assumptions specify the collective probabilities (CP) algorithm, for which the transition probabilities depend on the average populations over all trajectories. In the second case, the probabilities for each trajectory are supposed to be completely independent of the results from the other trajectories. There is then a unique solution of the general equations assuring that the transition probabilities are equal to the quantum population of the target state, which is referred to as the independent probabilities (IP) algorithm. The fewest switches (FS) algorithm developed by Tully is accordingly understood as an approximate hopping algorithm which takes elements from the accurate CP and IP solutions. A numerical test of all these hopping algorithms is carried out for a one-dimensional two-state problem with two avoided crossings, which shows the accuracy and computational efficiency of the proposed collective probabilities algorithm, the limitations of the FS algorithm, and the similarity between the results offered by the IP algorithm and those obtained with the Ehrenfest method

  16. Phylogenomics of a rapid radiation: is chromosomal evolution linked to increased diversification in North American spiny lizards (genus Sceloporus)?

    Science.gov (United States)

    Leaché, Adam D; Banbury, Barbara L; Linkem, Charles W; de Oca, Adrián Nieto-Montes

    2016-03-22

    Resolving the short phylogenetic branches that result from rapid evolutionary diversification often requires large numbers of loci. We collected targeted sequence capture data from 585 nuclear loci (541 ultraconserved elements and 44 protein-coding genes) to estimate the phylogenetic relationships among iguanian lizards in the North American genus Sceloporus. We tested for diversification rate shifts to determine if rapid radiation in the genus is correlated with chromosomal evolution. The phylogenomic trees that we obtained for Sceloporus using concatenation and coalescent-based species tree inference provide strong support for the monophyly and interrelationships among nearly all major groups. The diversification analysis supported one rate shift on the Sceloporus phylogeny approximately 20-25 million years ago that is associated with the doubling of the speciation rate from 0.06 species/million years (Ma) to 0.15 species/Ma. The posterior probability for this rate shift occurring on the branch leading to the Sceloporus species groups exhibiting increased chromosomal diversity is high (posterior probability = 0.997). Despite high levels of gene tree discordance, we were able to estimate a phylogenomic tree for Sceloporus that solves some of the taxonomic problems caused by previous analyses of fewer loci. The taxonomic changes that we propose using this new phylogenomic tree help clarify the number and composition of the major species groups in the genus. Our study provides new evidence for a putative link between chromosomal evolution and the rapid divergence and radiation of Sceloporus across North America.

  17. Case series of probable sporadic Creutzfeldt-Jakob disease from Eastern India

    Directory of Open Access Journals (Sweden)

    Atanu Biswas

    2013-01-01

    Background: Creutzfeldt-Jakob disease is a rapidly progressive, fatal, transmissible neurodegenerative disorder caused by prion protein. It is still considered rare in countries like India, probably because autopsy studies are unavailable in the majority of centers. The recent European diagnostic criteria for sporadic CJD (sCJD) are useful for making an early diagnosis. Objective: To report a series of patients with probable sCJD from a neurology institute in eastern India. Materials and Methods: Patients with rapidly progressive dementia fulfilling the diagnostic criteria for sCJD were included. All were investigated in detail to rule out any possible treatable cause, including electroencephalography (EEG), magnetic resonance imaging (MRI) of the brain, and cerebrospinal fluid analysis. Results: A total of 10 patients with probable sCJD were diagnosed using the European diagnostic criteria between December 2011 and January 2013. The clinical features are consistent with other reported series. While 60% of patients had the classical EEG findings, 100% had typical MRI features. Eight patients died within a mean duration of 4.56 months from disease onset. Conclusions: The clinical features are similar to other reported series. Our observations raise questions about the prevalence of this disease in India, which needs more elaborate study.

  18. Probability calculations for three-part mineral resource assessments

    Science.gov (United States)

    Ellefsen, Karl J.

    2017-06-27

    Three-part mineral resource assessment is a methodology for predicting, in a specified geographic region, both the number of undiscovered mineral deposits and the amount of mineral resources in those deposits. These predictions are based on probability calculations performed with newly implemented computer software. Compared to the previous implementation, the new implementation includes new features for the probability calculations themselves and for checks of those calculations. The development of the new implementation led to a new understanding of the probability calculations, namely the assumptions inherent in them. Several assumptions strongly affect the mineral resource predictions, so it is crucial that they are checked during an assessment. The evaluation of the new implementation leads to new findings about the probability calculations, namely findings regarding the precision of the computations, the computation time, and the sensitivity of the calculation results to the input.

  19. Assigning probability gain for precursors of four large Chinese earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Cao, T.; Aki, K.

    1983-03-10

    We extend the concept of probability gain associated with a precursor (Aki, 1981) to a set of precursors which may be mutually dependent. Making use of a new formula, we derive a criterion for selecting precursors from a given data set in order to calculate the probability gain. The probabilities per unit time immediately before four large Chinese earthquakes are calculated. They are approximately 0.09, 0.09, 0.07 and 0.08 per day for the 1975 Haicheng (M = 7.3), 1976 Tangshan (M = 7.8), 1976 Longling (M = 7.6), and Songpan (M = 7.2) earthquakes, respectively. These results are encouraging because they suggest that the investigated precursory phenomena may have included the complete information for earthquake prediction, at least for the above earthquakes. With this method, the step-by-step approach to prediction used in China may be quantified in terms of the probability of earthquake occurrence. The ln P versus t curve (where P is the probability of earthquake occurrence at time t) shows that ln P does not increase with t linearly but more rapidly as the time of the earthquake approaches.

  20. Computing exact bundle compliance control charts via probability generating functions.

    Science.gov (United States)

    Chen, Binchao; Matis, Timothy; Benneyan, James

    2016-06-01

    Compliance to evidence-based practices, individually and in 'bundles', remains an important focus of healthcare quality improvement for many clinical conditions. The exact probability distribution of composite bundle compliance measures used to develop corresponding control charts and other statistical tests is based on a fairly large convolution whose direct calculation can be computationally prohibitive. Various series expansions and other approximation approaches have been proposed, each with computational and accuracy tradeoffs, especially in the tails. This same probability distribution also arises in other important healthcare applications, such as for risk-adjusted outcomes and bed demand prediction, with the same computational difficulties. As an alternative, we use probability generating functions to rapidly obtain exact results and illustrate the improved accuracy and detection over other methods. Numerical testing across a wide range of applications demonstrates the computational efficiency and accuracy of this approach.
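
    The convolution referred to above is the distribution of a sum of independent but non-identical compliance indicators — the Poisson-binomial distribution. Its probability generating function is the product Π[(1 − p_i) + p_i·z], and multiplying the factors out yields every probability exactly. A minimal sketch of that expansion (the function name and the example probabilities are hypothetical, not from the paper):

```python
def bundle_pgf(ps):
    """Exact distribution of the number of compliant bundle elements.
    Multiplies the per-element PGF factors (1-p) + p*z as polynomials;
    returns coeffs with coeffs[k] = P(exactly k elements compliant)."""
    coeffs = [1.0]
    for p in ps:
        nxt = [0.0] * (len(coeffs) + 1)
        for k, c in enumerate(coeffs):
            nxt[k] += c * (1.0 - p)   # this element non-compliant
            nxt[k + 1] += c * p       # this element compliant
        coeffs = nxt
    return coeffs

# Four-element bundle with per-element compliance probabilities:
probs = bundle_pgf([0.9, 0.8, 0.95, 0.7])
# probs[4] = P(full bundle compliance) = 0.9 * 0.8 * 0.95 * 0.7
```

    The polynomial multiplication costs O(n²) for an n-element bundle and is exact everywhere, including the tails where series approximations tend to fail; control limits then come directly from the cumulative sums of `probs`.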

  1. A technique of evaluating most probable stochastic valuables from a small number of samples and their accuracies and degrees of confidence

    Energy Technology Data Exchange (ETDEWEB)

    Katoh, K [Ibaraki Pref. Univ. Health Sci., (Japan)

    1997-12-31

    A problem of estimating the stochastic characteristics of a population from a small number of samples is solved as an inverse problem, from the viewpoint of information theory and with Bayesian statistics. For both the Poisson process and the Bernoulli process, the most probable values of the characteristics of the mother population, together with their accuracies and degrees of confidence, are successfully obtained. Mathematical expressions are given for the general case, where a limited amount of information and/or knowledge of the stochastic characteristics is available, and for the special case where no a priori information or knowledge is available. The mathematical properties of the solutions obtained and the practical application of the method to radiation measurement are also discussed.

  2. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  3. Bayesian optimization for computationally extensive probability distributions.

    Science.gov (United States)

    Tamura, Ryo; Hukushima, Koji

    2018-01-01

    An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use extreme values of acquisition functions by Gaussian processes for the next training phase, which should be located near a local maximum or a global maximum of the probability distribution. Our Bayesian optimization technique is applied to the posterior distribution in the effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distributions is fixed to be small, the Bayesian optimization provides a better maximizer of the posterior distributions in comparison to those by the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, the Bayesian optimization improves the results efficiently by combining the steepest descent method and thus it is a powerful tool to search for a better maximizer of computationally extensive probability distributions.

  4. Prospect evaluation as a function of numeracy and probability denominator.

    Science.gov (United States)

    Millroth, Philip; Juslin, Peter

    2015-05-01

    This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. A method for estimating failure rates for low probability events arising in PSA

    International Nuclear Information System (INIS)

    Thorne, M.C.; Williams, M.M.R.

    1995-01-01

    The authors develop a method for predicting failure rates and failure probabilities per event when, over a given test period or number of demands, no failures have occurred. A Bayesian approach is adopted to calculate a posterior probability distribution for the failure rate or failure probability per event subsequent to the test period. This posterior is then used to estimate effective failure rates or probabilities over a subsequent period of time or number of demands. In special circumstances, the authors' results reduce to the well-known rules of thumb, viz. 1/N and 1/T, where N is the number of failure-free demands during the test period and T is the failure-free test period. However, the authors are able to give strict conditions on the validity of these rules of thumb and to improve on them when necessary
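
    The 1/T rule of thumb can be recovered directly: with zero failures observed over time T and a uniform prior on the rate λ, the posterior is Exponential(T), whose mean is 1/T, and marginalising over λ gives a predictive failure probability of 1 − T/(T + t) for a further period t. A sketch under those prior assumptions (illustrative, not the authors' exact derivation):

```python
def posterior_mean_rate(T):
    """Posterior mean of the failure rate after zero failures in time T,
    assuming a uniform prior: posterior density T*exp(-lam*T), mean 1/T."""
    return 1.0 / T

def prob_failure_within(t, T):
    """Posterior predictive P(at least one failure in a further period t):
    integral of (1 - exp(-lam*t)) * T*exp(-lam*T) d(lam) = 1 - T/(T + t)."""
    return 1.0 - T / (T + t)
```

    Note that after a failure-free test period T, the predictive probability of a failure in an equal further period T is exactly 1/2, which illustrates how uncertain the bare rule of thumb really is.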

  6. Burden of high fracture probability worldwide: secular increases 2010-2040.

    Science.gov (United States)

    Odén, A; McCloskey, E V; Kanis, J A; Harvey, N C; Johansson, H

    2015-09-01

    The number of individuals aged 50 years or more at high risk of osteoporotic fracture worldwide in 2010 was estimated at 158 million and is set to double by 2040. The aim of this study was to quantify the number of individuals worldwide aged 50 years or more at high risk of osteoporotic fracture in 2010 and 2040. A threshold of high fracture probability was set at the age-specific 10-year probability of a major fracture (clinical vertebral, forearm, humeral or hip fracture) which was equivalent to that of a woman with a BMI of 24 kg/m² and a prior fragility fracture but no other clinical risk factors. The prevalence of high risk was determined worldwide and by continent using all available country-specific FRAX models, applying the population demography for each country. Twenty-one million men and 137 million women had a fracture probability at or above the threshold in the world for the year 2010. The greatest number of men and women at high risk were from Asia (55 %). Worldwide, the number of high-risk individuals is expected to double over the next 40 years. We conclude that individuals with high probability of osteoporotic fractures comprise a very significant disease burden to society, particularly in Asia, and that this burden is set to increase markedly in the future. These analyses provide a platform for the evaluation of risk assessment and intervention strategies.

  7. Rapid quantitative estimation of chlorinated methane utilizing bacteria in drinking water and the effect of nanosilver on biodegradation of the trichloromethane in the environment.

    Science.gov (United States)

    Zamani, Isaac; Bouzari, Majid; Emtiazi, Giti; Fanaei, Maryam

    2015-03-01

    Halomethanes are toxic and carcinogenic chemicals which are widely used in industry. They can also be formed during water disinfection with chlorine. Biodegradation by methylotrophs is the most important way to remove these pollutants from the environment. This study aimed to present a simple and rapid method for the quantitative study of halomethane-utilizing bacteria in drinking water, and also a method to facilitate the biodegradation of these compounds in the environment compared to cometabolism. Enumeration of chlorinated-methane-utilizing bacteria in drinking water was carried out by the most probable number (MPN) method in two steps. First, the presence and number of methylotroph bacteria were confirmed on a methanol-containing medium. Then, utilization of dichloromethane was determined by measuring the released chloride after the addition of 0.04 mol/L of it to the growth medium. Also, the effect of nanosilver particles on the biodegradation of multiply chlorinated methanes was studied by growing bacteria on Bushnell-Haas Broth containing chloroform (trichloromethane) treated with 0.2 ppm nanosilver. The most probable numbers of methylotrophs and of chlorinated-methane-utilizing bacteria in the tested drinking water were 10 and 4 MPN Index/L, respectively. Chloroform treatment with nanosilver leads to dechlorination and the production of formaldehyde. The highest bacterial growth and formic acid production were observed in the tubes containing 1% chloroform treated with nanosilver. By combining the two tests, a rapid approach to estimating the most probable number of chlorinated-methane-utilizing bacteria is introduced. Treatment with nanosilver particles resulted in easier and faster biodegradation of chloroform by bacteria. Thus, degradation of these chlorinated compounds is more efficient compared to cometabolism.

  8. Probabilistic analysis of Millstone Unit 3 ultimate containment failure probability given high pressure: Chapter 14

    International Nuclear Information System (INIS)

    Bickel, J.H.

    1983-01-01

    The quantification of the containment event trees in the Millstone Unit 3 Probabilistic Safety Study utilizes a conditional probability of failure given high pressure which is based on a new approach. The generation of this conditional probability was based on a weakest link failure mode model which considered contributions from a number of overlapping failure modes. This overlap effect was due to a number of failure modes whose mean failure pressures were clustered within a 5 psi range and which had uncertainties due to variances in material strengths and analytical uncertainties which were between 9 and 15 psi. Based on a review of possible probability laws to describe the failure probability of individual structural failure modes, it was determined that a Weibull probability law most adequately described the randomness in the physical process of interest. The resultant conditional probability of failure is found to have a median failure pressure of 132.4 psia. The corresponding 5-95 percentile values are 112 psia and 146.7 psia respectively. The skewed nature of the conditional probability of failure vs. pressure results in a lower overall containment failure probability for an appreciable number of the severe accident sequences of interest, but also probabilities which are more rigorously traceable from first principles

  9. Finite-size scaling of survival probability in branching processes.

    Science.gov (United States)

    Garcia-Millan, Rosalba; Font-Clos, Francesc; Corral, Álvaro

    2015-04-01

    Branching processes pervade many models in statistical physics. We investigate the survival probability of a Galton-Watson branching process after a finite number of generations. We derive analytically the existence of finite-size scaling for the survival probability as a function of the control parameter and the maximum number of generations, obtaining the critical exponents as well as the exact scaling function, which is G(y) = 2y·e^y/(e^y − 1), with y the rescaled distance to the critical point. Our findings are valid for any branching process of the Galton-Watson type, independently of the distribution of the number of offspring, provided its variance is finite. This proves the universal behavior of the finite-size effects in branching processes, including the universality of the metric factors. The direct relation to mean-field percolation is also discussed.
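
    The survival probability in question is easy to probe numerically: simulate the branching process for a fixed number of generations and count the runs that are still alive. A Monte Carlo sketch (offspring distribution and counts are illustrative; for a critical process with offspring variance 1, the survival probability is known to decay asymptotically like 2/n):

```python
import random

def survival_prob(p_offspring, generations, trials=20000, seed=42):
    """Monte Carlo estimate of P(Galton-Watson process still alive after
    `generations` generations), starting from one individual.
    p_offspring[k] = probability that an individual has k offspring."""
    rng = random.Random(seed)
    ks = list(range(len(p_offspring)))
    alive = 0
    for _ in range(trials):
        n = 1
        for _ in range(generations):
            if n == 0:
                break
            # total offspring of the n current individuals
            n = sum(rng.choices(ks, weights=p_offspring, k=n))
        alive += n > 0
    return alive / trials

# Critical binary branching: 0 or 2 offspring, each with probability 1/2
p_alive = survival_prob([0.5, 0.0, 0.5], generations=10)
```

    For this critical binary case the exact recursion s_{n+1} = s_n − s_n²/2 gives s_10 ≈ 0.139, so a 20 000-trial estimate should land close to that value; tracking it versus the number of generations is exactly the finite-size effect the paper analyses.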

  10. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  11. STRIP: stream learning of influence probabilities

    DEFF Research Database (Denmark)

    Kutzkov, Konstantin

    2013-01-01

    cascades, and developing applications such as viral marketing. Motivated by modern microblogging platforms, such as twitter, in this paper we study the problem of learning influence probabilities in a data-stream scenario, in which the network topology is relatively stable and the challenge of a learning...... algorithm is to keep up with a continuous stream of tweets using a small amount of time and memory. Our contribution is a number of randomized approximation algorithms, categorized according to the available space (superlinear, linear, and sublinear in the number of nodes n) and according to different models...

  12. Effects of amphibian chytrid fungus on individual survival probability in wild boreal toads

    Science.gov (United States)

    Pilliod, D.S.; Muths, E.; Scherer, R. D.; Bartelt, P.E.; Corn, P.S.; Hossack, B.R.; Lambert, B.A.; Mccaffery, R.; Gaughan, C.

    2010-01-01

    Chytridiomycosis is linked to the worldwide decline of amphibians, yet little is known about the demographic effects of the disease. We collected capture-recapture data on three populations of boreal toads (Bufo boreas [Bufo = Anaxyrus]) in the Rocky Mountains (U.S.A.). Two of the populations were infected with chytridiomycosis and one was not. We examined the effect of the presence of amphibian chytrid fungus (Batrachochytrium dendrobatidis [Bd]; the agent of chytridiomycosis) on survival probability and population growth rate. Toads that were infected with Bd had lower average annual survival probability than uninfected individuals at sites where Bd was detected, which suggests chytridiomycosis may reduce survival by 31-42% in wild boreal toads. Toads that were negative for Bd at infected sites had survival probabilities comparable to toads at the uninfected site. Evidence that environmental covariates (particularly cold temperatures during the breeding season) influenced toad survival was weak. The number of individuals in diseased populations declined by 5-7%/year over the 6 years of the study, whereas the uninfected population had comparatively stable population growth. Our data suggest that the presence of Bd in these toad populations is not causing rapid population declines. Rather, chytridiomycosis appears to be functioning as a low-level, chronic disease whereby some infected individuals survive but the overall population effects are still negative. Our results show that some amphibian populations may be coexisting with Bd and highlight the importance of quantitative assessments of survival in diseased animal populations. Journal compilation. © 2010 Society for Conservation Biology. No claim to original US government works.

  13. Limited test data: The choice between confidence limits and inverse probability

    International Nuclear Information System (INIS)

    Nichols, P.

    1975-01-01

For a unit which has been successfully designed to a high standard of reliability, any test programme of reasonable size will result in only a small number of failures. In these circumstances the failure rate estimated from the tests will depend on the statistical treatment applied. When a large number of units is to be manufactured, an unexpectedly high failure rate will certainly result in a large number of failures, so it is necessary to guard against optimistic, unrepresentative test results by using a confidence limit approach. If only a small number of production units is involved, failures may not occur even with a higher than expected failure rate, and so one may be able to accept a method which allows for the possibility of either optimistic or pessimistic test results; in this case an inverse probability approach, based on Bayes' theorem, might be used. The paper first draws attention to an apparently significant difference in the numerical results from the two methods, particularly for the overall probability of several units arranged in redundant logic. It then discusses a possible objection to the inverse method, followed by a demonstration that, for a large population and a very reasonable choice of prior probability, the inverse probability and confidence limit methods give the same numerical result. Finally, it is argued that a confidence limit approach is overpessimistic when a small number of production units is involved, and that both methods give the same answer for a large population. (author)
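The contrast between the two treatments can be sketched numerically. The following minimal example (not from the paper; the 95% level and the uniform prior are illustrative assumptions) computes a Clopper-Pearson upper confidence limit and a Bayesian posterior mean for the same small-failure-count test outcome:

```python
import math

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def upper_confidence_limit(k, n, conf=0.95):
    """Clopper-Pearson upper bound: the p at which P(X <= k | p) drops to 1 - conf."""
    lo, hi = 0.0, 1.0
    for _ in range(60):                      # bisection
        mid = (lo + hi) / 2.0
        if binom_cdf(k, n, mid) > 1.0 - conf:
            lo = mid
        else:
            hi = mid
    return hi

def bayes_posterior_mean(k, n):
    """Posterior mean of the failure probability under a uniform (Beta(1,1)) prior."""
    return (k + 1) / (n + 2)

k, n = 1, 100                                # one failure observed in 100 unit tests
print(f"95% upper confidence limit: {upper_confidence_limit(k, n):.4f}")
print(f"Bayes posterior mean:       {bayes_posterior_mean(k, n):.4f}")
```

The confidence-limit answer is several times larger than the posterior mean, illustrating the pessimism the author attributes to the confidence approach for small populations.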

  14. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

Amosov, G.G.; Man'ko, V.I.

    2004-01-01

Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  15. More efficient integrated safeguards by applying a reasonable detection probability for maintaining low presence probability of undetected nuclear proliferating activities

    International Nuclear Information System (INIS)

    Otsuka, Naoto

    2013-01-01

Highlights: • A theoretical foundation is presented for more efficient Integrated Safeguards (IS). • Probability of undetected nuclear proliferation activities should be maintained low. • For nations under IS, the probability to start proliferation activities is very low. • This fact can decrease the detection probability of IS by dozens of percentage points. • The cost of IS per nation can be cut down by reducing inspection frequencies etc. - Abstract: A theoretical foundation is presented for implementing more efficiently the present International Atomic Energy Agency (IAEA) integrated safeguards (IS) on the basis of fuzzy evaluation of the probability that the evaluated nation will continue peaceful activities. It is shown that by determining the presence probability of undetected nuclear proliferating activities, nations under IS can be maintained at acceptably low proliferation risk levels even if the detection probability of current IS is decreased by dozens of percentage points from the present value. This makes it possible to reduce inspection frequency and the number of collected samples, allowing the IAEA to cut costs per nation. This will contribute to further promotion and application of IS to more nations by the IAEA, and more efficient utilization of IAEA resources from the viewpoint of the whole IS framework

  16. Modelling the Probability of Landslides Impacting Road Networks

    Science.gov (United States)

    Taylor, F. E.; Malamud, B. D.

    2012-04-01

During a landslide triggering event, the threat of landslides blocking roads poses a risk to logistics, rescue efforts and communities dependent on those road networks. Here we present preliminary results of a stochastic model we have developed to evaluate the probability of landslides intersecting a simple road network during a landslide triggering event, and apply simple network indices to measure the state of the road network in the affected region. A 4000 x 4000 cell array with a 5 m x 5 m resolution was used, with a pre-defined simple road network laid onto it, and landslides 'randomly' dropped onto it. Landslide areas (AL) were randomly selected from a three-parameter inverse gamma probability density function, consisting of a power-law decay of about -2.4 for medium and large values of AL and an exponential rollover for small values of AL; the rollover (maximum probability) occurs at about AL = 400 m2. This statistical distribution was chosen based on three substantially complete triggered landslide inventories recorded in existing literature. The number of landslide areas (NL) selected for each triggered event iteration was chosen to have an average density of 1 landslide km-2, i.e. NL = 400 landslide areas chosen randomly for each iteration, and was based on several existing triggered landslide event inventories. A simple road network was chosen, in a 'T' configuration: one road of 1 x 4000 cells (5 m x 20 km) joined by another road of 1 x 2000 cells (5 m x 10 km). The landslide areas were then randomly 'dropped' over the road array and indices such as the location, size (ABL) and number of road blockages (NBL) recorded. This process was performed 500 times (iterations) in a Monte-Carlo type simulation. Initial results show that for a landslide triggering event with 400 landslides over a 400 km2 region, the number of road blocks per iteration, NBL, ranges from 0 to 7. The average blockage area over the 500 iterations (mean ABL) is about 3000 m2.
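The simulation described above can be sketched in miniature. This example is a simplified stand-in (a lognormal replaces the paper's three-parameter inverse-gamma area distribution, landslides are squares, the road is a single straight strip, and all parameter values are illustrative):

```python
import random

random.seed(1)

REGION = 20_000.0          # 20 km x 20 km region, in metres
N_SLIDES = 400             # about 1 landslide per km^2
ROAD_Y = 10_000.0          # a single straight road crossing the region
ROAD_HALF_WIDTH = 2.5      # 5 m wide road

def landslide_area():
    # Stand-in for the paper's three-parameter inverse-gamma distribution:
    # a lognormal with its bulk near ~400 m^2 (purely illustrative).
    return random.lognormvariate(6.0, 1.0)

def blockages_one_event():
    """Drop square landslides at random and count overlaps with the road."""
    hits = 0
    for _ in range(N_SLIDES):
        half_side = landslide_area() ** 0.5 / 2.0
        y = random.uniform(0.0, REGION)          # landslide centre (y-coordinate)
        if abs(y - ROAD_Y) <= half_side + ROAD_HALF_WIDTH:
            hits += 1
    return hits

counts = [blockages_one_event() for _ in range(500)]
print("road blockages per event: min", min(counts),
      "max", max(counts), "mean", sum(counts) / len(counts))
```

Even this toy version reproduces the qualitative result that most triggering events block the road only a handful of times, if at all.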

  17. Counting problems for number rings

    NARCIS (Netherlands)

    Brakenhoff, Johannes Franciscus

    2009-01-01

    In this thesis we look at three counting problems connected to orders in number fields. First we study the probability that for a random polynomial f in Z[X] the ring Z[X]/f is the maximal order in Q[X]/f. Connected to this is the probability that a random polynomial has a squarefree

  18. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions

  19. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  20. The statistical significance of error probability as determined from decoding simulations for long codes

    Science.gov (United States)

    Massey, J. L.

    1976-01-01

    The very low error probability obtained with long error-correcting codes results in a very small number of observed errors in simulation studies of practical size and renders the usual confidence interval techniques inapplicable to the observed error probability. A natural extension of the notion of a 'confidence interval' is made and applied to such determinations of error probability by simulation. An example is included to show the surprisingly great significance of as few as two decoding errors in a very large number of decoding trials.
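A way to make such a statement concrete, sketched here under the assumption that decoding errors are rare enough for a Poisson approximation (an illustrative choice, not the paper's exact construction), is to compute the largest expected error count consistent with the few errors actually observed:

```python
import math

def poisson_cdf(k, lam):
    """P(X <= k) for X ~ Poisson(lam)."""
    return math.exp(-lam) * sum(lam ** i / math.factorial(i) for i in range(k + 1))

def upper_limit(k, conf=0.95):
    """Largest mean consistent with observing k events: solve P(X <= k | lam) = 1 - conf."""
    lo, hi = 0.0, 100.0
    for _ in range(60):                       # bisection
        mid = (lo + hi) / 2.0
        if poisson_cdf(k, mid) > 1.0 - conf:
            lo = mid
        else:
            hi = mid
    return lo

trials, errors = 10 ** 7, 2
print(f"observed error rate:         {errors / trials:.1e}")
print(f"95% upper limit on the rate: {upper_limit(errors) / trials:.1e}")
```

Two errors in ten million trials still pin the true error rate down to within a small factor, which is the "surprisingly great significance" of a few observed errors.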

  1. On the universality of knot probability ratios

    Energy Technology Data Exchange (ETDEWEB)

    Janse van Rensburg, E J [Department of Mathematics and Statistics, York University, Toronto, Ontario M3J 1P3 (Canada); Rechnitzer, A, E-mail: rensburg@yorku.ca, E-mail: andrewr@math.ubc.ca [Department of Mathematics, University of British Columbia, 1984 Mathematics Road, Vancouver, BC V6T 1Z2 (Canada)

    2011-04-22

Let p{sub n} denote the number of self-avoiding polygons of length n on a regular three-dimensional lattice, and let p{sub n}(K) be the number which have knot type K. The probability that a random polygon of length n has knot type K is p{sub n}(K)/p{sub n} and is known to decay exponentially with length (Sumners and Whittington 1988 J. Phys. A: Math. Gen. 21 1689-94, Pippenger 1989 Discrete Appl. Math. 25 273-8). Little is known rigorously about the asymptotics of p{sub n}(K), but there is substantial numerical evidence. It is believed that the entropic exponent, {alpha}, is universal, while the exponential growth rate is independent of the knot type but varies with the lattice. The amplitude, C{sub K}, depends on both the lattice and the knot type. The above asymptotic form determines the relative probability of a random polygon of length n having prime knot type K rather than prime knot type L. In the thermodynamic limit this probability ratio becomes an amplitude ratio; it should be universal and depend only on the knot types K and L. In this communication we examine the universality of these probability ratios for polygons in the simple cubic, face-centred cubic and body-centred cubic lattices. Our results support the hypothesis that these are universal quantities. For example, we estimate that a long random polygon is approximately 28 times more likely to be a trefoil than a figure-eight, independent of the underlying lattice, giving an estimate of the intrinsic entropy associated with knot types in closed curves. (fast track communication)

  2. Generation of Random Numbers and Parallel Random Number Streams for Monte Carlo Simulations

    Directory of Open Access Journals (Sweden)

    L. Yu. Barash

    2012-01-01

Full Text Available Modern methods and libraries for high quality pseudorandom number generation and for generation of parallel random number streams for Monte Carlo simulations are considered. The probability equidistribution property, and the parameters for which it holds in dimensions up to the logarithm of the mesh size, are considered for Multiple Recursive Generators.
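The idea of independent parallel streams can be illustrated with NumPy's SeedSequence spawning (a different generator family than the Multiple Recursive Generators discussed in the paper; shown only to demonstrate the workflow):

```python
import numpy as np

# Spawn statistically independent child streams from a single root seed.
root = np.random.SeedSequence(20120)
streams = [np.random.Generator(np.random.PCG64(s)) for s in root.spawn(4)]

def estimate_pi(rng, n=200_000):
    """Monte Carlo estimate of pi from one stream, as if run on a separate worker."""
    xy = rng.random((n, 2))
    return 4.0 * float(np.mean(np.sum(xy ** 2, axis=1) <= 1.0))

estimates = [estimate_pi(g) for g in streams]
print("per-stream estimates:", [round(e, 4) for e in estimates])
print("combined estimate:   ", round(sum(estimates) / len(estimates), 4))
```

Each spawned stream can be handed to a different process without any risk of overlapping sequences, which is the property parallel Monte Carlo needs.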

  3. Constraints on rapidity-dependent initial conditions from charged-particle pseudorapidity densities and two-particle correlations

    Science.gov (United States)

    Ke, Weiyao; Moreland, J. Scott; Bernhard, Jonah E.; Bass, Steffen A.

    2017-10-01

We study the initial three-dimensional spatial configuration of the quark-gluon plasma (QGP) produced in relativistic heavy-ion collisions using centrality and pseudorapidity-dependent measurements of the medium's charged particle density and two-particle correlations. A cumulant-generating function is first used to parametrize the rapidity dependence of local entropy deposition and extend arbitrary boost-invariant initial conditions to nonzero beam rapidities. The model is then compared to p+Pb and Pb+Pb charged-particle pseudorapidity densities and two-particle pseudorapidity correlations and systematically optimized using Bayesian parameter estimation to extract high-probability initial condition parameters. The optimized initial conditions are then compared to a number of experimental observables including the pseudorapidity-dependent anisotropic flows, event-plane decorrelations, and flow correlations. We find that the form of the initial local longitudinal entropy profile is well constrained by these experimental measurements.

  4. Digital dice computational solutions to practical probability problems

    CERN Document Server

    Nahin, Paul J

    2013-01-01

    Some probability problems are so difficult that they stump the smartest mathematicians. But even the hardest of these problems can often be solved with a computer and a Monte Carlo simulation, in which a random-number generator simulates a physical process, such as a million rolls of a pair of dice. This is what Digital Dice is all about: how to get numerical answers to difficult probability problems without having to solve complicated mathematical equations. Popular-math writer Paul Nahin challenges readers to solve twenty-one difficult but fun problems, from determining the

  5. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    International Nuclear Information System (INIS)

    Vourdas, A.

    2014-01-01

The orthocomplemented modular lattice of subspaces L[H(d)], of a quantum system with d-dimensional Hilbert space H(d), is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H1, H2), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H1), P(H2) onto the subspaces H1, H2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities
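The violated additivity relation can be checked directly in the smallest nontrivial case. This sketch (a hypothetical two-dimensional example, not taken from the paper) compares the quantum probabilities of two non-commuting one-dimensional subspaces with what Kolmogorov additivity would require:

```python
import numpy as np

def projector(theta):
    """Rank-1 projector onto the line at angle theta in a 2-D real Hilbert space."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

P1, P2 = projector(0.0), projector(np.pi / 5)    # non-commuting projectors
rho = np.array([[0.7, 0.2], [0.2, 0.3]])         # a density matrix (trace 1, positive)

p1 = float(np.trace(rho @ P1))                   # quantum probability of subspace H1
p2 = float(np.trace(rho @ P2))
# Distinct lines in 2-D: the meet H1 ∧ H2 is {0}, the join H1 ∨ H2 is the whole space.
p_meet, p_join = 0.0, 1.0

# Kolmogorov additivity would require p_join == p1 + p2 - p_meet; it fails here.
print(f"p1 + p2 - p_meet = {p1 + p2 - p_meet:.3f}, but p_join = {p_join}")
print(f"commutator norm ||[P1, P2]|| = {np.linalg.norm(P1 @ P2 - P2 @ P1):.3f}")
```

The size of the additivity violation tracks the non-vanishing commutator, in line with the role the paper assigns to the operator D(H1, H2).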

  6. Conditional probability of the tornado missile impact given a tornado occurrence

    International Nuclear Information System (INIS)

    Goodman, J.; Koch, J.E.

    1982-01-01

Using an approach based on statistical mechanics, an expression for the probability of the first missile strike is developed. The expression depends on two generic parameters (injection probability eta(F) and height distribution psi(Z,F)), which are developed in this study, and one plant specific parameter (number of potential missiles N/sub p/). The expression for the joint probability of simultaneous impact of multiple targets is also developed. This expression is applicable to calculation of the probability of common cause failure due to tornado missiles. It is shown that the probability of the first missile strike can be determined using a uniform missile distribution model. It is also shown that the conditional probability of the second strike, given the first, is underestimated by the uniform model. The probability of the second strike is greatly increased if the missiles are in clusters large enough to cover both targets

  7. Optimal design of unit hydrographs using probability distribution and ...

    Indian Academy of Sciences (India)


    optimization formulation is solved using binary-coded genetic algorithms. The number of variables to ... Unit hydrograph; rainfall-runoff; hydrology; genetic algorithms; optimization; probability ..... Application of the model. Data derived from the ...

  8. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  9. Most probable number methodology for quantifying dilute concentrations and fluxes of Escherichia coli O157:H7 in surface waters.

    Science.gov (United States)

    Jenkins, M B; Endale, D M; Fisher, D S; Gay, P A

    2009-02-01

To better understand the transport and enumeration of dilute densities of Escherichia coli O157:H7 in agricultural watersheds, we developed a culture-based, five tube-multiple dilution most probable number (MPN) method. The MPN method combined a filtration technique for large volumes of surface water with standard selective media, biochemical and immunological tests, and a TaqMan confirmation step. This method determined E. coli O157:H7 concentrations as low as 0.1 MPN per litre, with a 95% confidence level of 0.01-0.7 MPN per litre. Escherichia coli O157:H7 densities ranged from not detectable to 9 MPN per litre for pond inflow, from not detectable to 0.9 MPN per litre for pond outflow and from not detectable to 8.3 MPN per litre for within pond. The MPN methodology was extended to mass flux determinations. Fluxes of E. coli O157:H7 ranged up to 10(4) MPN per hour. This culture-based method can detect small numbers of viable/culturable E. coli O157:H7 in surface waters of watersheds containing animal agriculture and wildlife. This MPN method will improve our understanding of the transport and fate of E. coli O157:H7 in agricultural watersheds, and can form the basis of collections of environmental E. coli O157:H7 isolates.
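The MPN computation itself is a small maximum-likelihood problem. This is a generic sketch for a hypothetical tube pattern, not the authors' spreadsheet or protocol:

```python
import math

def neg_log_likelihood(lam, outcomes):
    """-log L for concentration lam given (volume, tubes, positives) per dilution."""
    nll = 0.0
    for volume, n_tubes, n_pos in outcomes:
        p_pos = 1.0 - math.exp(-lam * volume)     # P(a tube at this dilution is positive)
        if n_pos > 0:
            nll -= n_pos * math.log(p_pos)
        nll += (n_tubes - n_pos) * lam * volume   # -log of the negative-tube term
    return nll

def mpn_estimate(outcomes):
    """Maximum-likelihood MPN via golden-section search (the likelihood is unimodal)."""
    lo, hi = 1e-6, 1e4
    phi = (math.sqrt(5.0) - 1.0) / 2.0
    for _ in range(200):
        a = hi - phi * (hi - lo)
        b = lo + phi * (hi - lo)
        if neg_log_likelihood(a, outcomes) < neg_log_likelihood(b, outcomes):
            hi = b
        else:
            lo = a
    return (lo + hi) / 2.0

# Hypothetical 5-tube series at three ten-fold dilutions: 5, 3 and 1 positive tubes.
outcomes = [(0.1, 5, 5), (0.01, 5, 3), (0.001, 5, 1)]
print(f"MPN estimate: {mpn_estimate(outcomes):.0f} per unit volume")
```

The estimate for this 5-3-1 pattern lands close to the value tabulated in standard MPN tables, which are themselves derived from the same likelihood.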

  10. Universal critical wrapping probabilities in the canonical ensemble

    Directory of Open Access Journals (Sweden)

    Hao Hu

    2015-09-01

Full Text Available Universal dimensionless quantities, such as Binder ratios and wrapping probabilities, play an important role in the study of critical phenomena. We study the finite-size scaling behavior of the wrapping probability for the Potts model in the random-cluster representation, under the constraint that the total number of occupied bonds is fixed, so that the canonical ensemble applies. We derive that, in the limit L→∞, the critical values of the wrapping probability are different from those of the unconstrained model, i.e. the model in the grand-canonical ensemble, but still universal, for systems with 2yt−d>0 where yt=1/ν is the thermal renormalization exponent and d is the spatial dimension. Similar modifications apply to other dimensionless quantities, such as Binder ratios. For systems with 2yt−d≤0, these quantities share the same critical universal values in the two ensembles. It is also derived that new finite-size corrections are induced. These findings apply more generally to systems in the canonical ensemble, e.g. the dilute Potts model with a fixed total number of vacancies. Finally, we formulate an efficient cluster-type algorithm for the canonical ensemble, and confirm these predictions by extensive simulations.

  11. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  12. A stochastic model for the probability of malaria extinction by mass drug administration.

    Science.gov (United States)

    Pemberton-Ross, Peter; Chitnis, Nakul; Pothin, Emilie; Smith, Thomas A

    2017-09-18

Mass drug administration (MDA) has been proposed as an intervention to achieve local extinction of malaria. Although its effect on the reproduction number is short lived, extinction may subsequently occur in a small population due to stochastic fluctuations. This paper examines how the probability of stochastic extinction depends on population size, MDA coverage and the reproduction number under control, R c . A simple compartmental model is developed which is used to compute the probability of extinction using probability generating functions. The expected time to extinction in small populations after MDA for various scenarios in this model is calculated analytically. The results indicate that MDA can achieve local elimination only under restrictive conditions: R c must be sustained below 1, and coverage must exceed 95%, to have a non-negligible probability of successful elimination. Stochastic fluctuations only significantly affect the probability of extinction in populations of about 1000 individuals or less. The expected time to extinction via stochastic fluctuation is less than 10 years only in populations less than about 150 individuals. Clustering of secondary infections and of MDA distribution both contribute positively to the potential probability of success, indicating that MDA would most effectively be administered at the household level. There are very limited circumstances in which MDA will lead to local malaria elimination with a substantial probability.
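The probability-generating-function calculation can be sketched for the simplest case. Assuming a Poisson offspring distribution with mean R_c (an illustrative choice, not necessarily the paper's model), the per-lineage extinction probability is the smallest fixed point of the PGF:

```python
import math

def extinction_prob(r_c, tol=1e-12):
    """Smallest root of q = exp(r_c * (q - 1)), the PGF fixed point for Poisson offspring."""
    q = 0.0
    for _ in range(10_000):
        q_next = math.exp(r_c * (q - 1.0))
        if abs(q_next - q) < tol:
            break
        q = q_next
    return q_next

for r_c in (0.8, 1.2, 1.5):
    q = extinction_prob(r_c)
    for residual in (5, 20):
        print(f"R_c={r_c}: P(all {residual} residual infections die out) = {q**residual:.3f}")
```

With R_c below 1 extinction is certain for each lineage, while even modestly supercritical R_c makes chance extinction of more than a handful of residual infections very unlikely, which is why sustaining R_c below 1 matters so much.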

  13. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  14. Rapidity gap survival in enhanced Pomeron scheme

    Energy Technology Data Exchange (ETDEWEB)

    Ostapchenko, Sergey [Frankfurt Institute for Advanced Studies, Frankfurt am Main (Germany); Moscow State University, D.V. Skobeltsyn Institute of Nuclear Physics, Moscow (Russian Federation); Bleicher, Marcus [Frankfurt Institute for Advanced Studies, Frankfurt am Main (Germany); Goethe-Universitat, Institute for Theoretical Physics, Frankfurt am Main (Germany)

    2018-01-15

We apply the phenomenological Reggeon field theory framework to investigate rapidity gap survival (RGS) probability for diffractive dijet production in proton-proton collisions. In particular, we study in some detail rapidity gap suppression due to elastic rescatterings of intermediate partons in the underlying parton cascades, described by enhanced (Pomeron-Pomeron interaction) diagrams. We demonstrate that such contributions play a subdominant role, compared to the usual, so-called 'eikonal', rapidity gap suppression due to elastic rescatterings of constituent partons of the colliding protons. On the other hand, the overall RGS factor proves to be sensitive to color fluctuations in the proton. Hence, experimental data on diffractive dijet production can be used to constrain the respective model approaches. (orig.)

  15. On the probability of occurrence of rogue waves

    Directory of Open Access Journals (Sweden)

    E. M. Bitner-Gregersen

    2012-03-01

Full Text Available A number of extreme and rogue wave studies have been conducted theoretically, numerically, experimentally and with field data in recent years, which have significantly advanced our knowledge of ocean waves. So far, however, consensus on the probability of occurrence of rogue waves has not been achieved. The present investigation addresses this topic from the perspective of design needs. The probability of occurrence of extreme and rogue wave crests in deep water is discussed here based on higher order time simulations, experiments and hindcast data. Focus is given to the occurrence of rogue waves in high sea states.
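For orientation, the linear (Rayleigh) benchmark against which rogue-wave statistics are usually compared can be computed directly; the second-order and modulational effects discussed in this literature raise the probability above this baseline:

```python
import math

def rayleigh_crest_exceedance(crest_over_hs):
    """P(crest > c) for linear narrow-band seas: exp(-8 (c/Hs)^2) (Rayleigh model)."""
    return math.exp(-8.0 * crest_over_hs ** 2)

# A common rogue-crest criterion: crest height above 1.25 x significant wave height.
p = rayleigh_crest_exceedance(1.25)
print(f"P(rogue crest) per wave under linear theory: {p:.2e}")
print(f"that is roughly 1 wave in {1.0 / p:,.0f}")
```

The linear model already makes such crests rare events at the per-wave level; the open question the paper addresses is how much more frequent they become in real, nonlinear high sea states.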

  16. Allelic drop-out probabilities estimated by logistic regression

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Asplund, Maria

    2012-01-01

We discuss the model for estimating drop-out probabilities presented by Tvedebrink et al. [7] and the concerns that have been raised. The criticism of the model has demonstrated that the model is not perfect. However, the model is very useful for advanced forensic genetic work, where allelic drop-out is occurring. With this discussion, we hope to improve the drop-out model, so that it can be used for practical forensic genetics and stimulate further discussions. We discuss how to estimate drop-out probabilities when using a varying number of PCR cycles and other experimental conditions.

  17. School and conference on probability theory

    International Nuclear Information System (INIS)

    Lawler, G.F.

    2004-01-01

This volume includes expanded lecture notes from the School and Conference in Probability Theory held at ICTP in May, 2001. Probability theory is a very large area, too large for a single school and conference. The organizers, G. Lawler, C. Newman, and S. Varadhan chose to focus on a number of active research areas that have their roots in statistical physics. The pervasive theme in these lectures is trying to find the large time or large space behaviour of models defined on discrete lattices. Usually the definition of the model is relatively simple: either assigning a particular weight to each possible configuration (equilibrium statistical mechanics) or specifying the rules under which the system evolves (nonequilibrium statistical mechanics). Interacting particle systems is the area of probability that studies the evolution of particles (either finite or infinite in number) under random motions. The evolution of particles depends on the positions of the other particles; often one assumes that it depends only on the particles that are close to the particular particle. Thomas Liggett's lectures give an introduction to this very large area. Claudio Landim follows up by discussing hydrodynamic limits of particle systems. The goal of this area is to describe the long time, large system size dynamics in terms of partial differential equations. The area of random media is concerned with the properties of materials or environments that are not homogeneous. Percolation theory studies one of the simplest stated models for impurities - taking a lattice and removing some of the vertices or bonds. Luiz Renato G. Fontes and Vladas Sidoravicius give a detailed introduction to this area. Random walk in random environment combines two sources of randomness - a particle performing stochastic motion in which the transition probabilities depend on position and have been chosen from some probability distribution. Alain-Sol Sznitman gives a survey of recent developments in this

  18. Probability Maps for the Visualization of Assimilation Ensemble Flow Data

    KAUST Repository

    Hollt, Thomas

    2015-05-25

Ocean forecasts nowadays are created by running ensemble simulations in combination with data assimilation techniques. Most of these techniques resample the ensemble members after each assimilation cycle. This means that in a time series, after resampling, every member can follow up on any of the members before resampling. Tracking behavior over time, such as all possible paths of a particle in an ensemble vector field, becomes very difficult, as the number of combinations rises exponentially with the number of assimilation cycles. In general a single possible path is not of interest, but only the probabilities that any point in space might be reached by a particle at some point in time. In this work we present an approach using probability-weighted piecewise particle trajectories to allow such a mapping interactively, instead of tracing quadrillions of individual particles. We achieve interactive rates by binning the domain and splitting up the tracing process into the individual assimilation cycles, so that particles that fall into the same bin after a cycle can be treated as a single particle with a larger probability as input for the next time step. As a result we lose the ability to track individual particles, but can create probability maps for any desired seed at interactive rates.
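The bin-and-merge idea can be sketched in one dimension. This illustrative example (hypothetical displacement fields, uniform member weights) propagates a probability vector through a few assimilation cycles instead of enumerating all member combinations:

```python
import numpy as np

rng = np.random.default_rng(0)
n_bins, n_members, n_cycles = 50, 8, 4

prob = np.zeros(n_bins)
prob[10] = 1.0                     # all probability mass starts in one seed bin

for cycle in range(n_cycles):
    # Hypothetical per-member displacement fields (in bins) for this cycle.
    displacements = rng.integers(-2, 5, size=(n_members, n_bins))
    new_prob = np.zeros(n_bins)
    # After resampling, each member is equally probable; instead of tracing
    # n_members ** n_cycles paths, merge mass that lands in the same bin.
    for m in range(n_members):
        for b in range(n_bins):
            if prob[b] > 0.0:
                target = int(np.clip(b + displacements[m, b], 0, n_bins - 1))
                new_prob[target] += prob[b] / n_members
    prob = new_prob

print("total probability:", round(float(prob.sum()), 6))
print("reachable bins:", int((prob > 0.0).sum()))
```

Merging mass per bin keeps the cost linear in the number of cycles rather than exponential, which is what makes the interactive mapping feasible.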

  19. Introduction to the Interface of Probability and Algorithms

    OpenAIRE

    Aldous, David; Steele, J. Michael

    1993-01-01

    Probability and algorithms enjoy an almost boisterous interaction that has led to an active, extensive literature that touches fields as diverse as number theory and the design of computer hardware. This article offers a gentle introduction to the simplest, most basic ideas that underlie this development.

  20. Statistical complexity without explicit reference to underlying probabilities

    Science.gov (United States)

    Pennini, F.; Plastino, A.

    2018-06-01

We show that extremely simple systems of a not too large number of particles can be simultaneously thermally stable and complex. To such an end, we extend the notion of statistical complexity to simple configurations of non-interacting particles, without appeal to probabilities, and discuss configurational properties.

  1. Negative values of quasidistributions and quantum wave and number statistics

    Science.gov (United States)

    Peřina, J.; Křepelka, J.

    2018-04-01

    We consider nonclassical wave and number quantum statistics, and perform a decomposition of quasidistributions for nonlinear optical down-conversion processes using Bessel functions. We show that negative values of the quasidistribution do not directly represent probabilities; however, they directly influence measurable number statistics. Negative terms in the decomposition related to the nonclassical behavior with negative amplitudes of probability can be interpreted as positive amplitudes of probability in the negative orthogonal Bessel basis, whereas positive amplitudes of probability in the positive basis describe classical cases. However, probabilities are positive in all cases, including negative values of quasidistributions. Negative and positive contributions of decompositions to quasidistributions are estimated. The approach can be adapted to quantum coherence functions.

  2. Application of quasi-random numbers for simulation

    International Nuclear Information System (INIS)

    Kazachenko, O.N.; Takhtamyshev, G.G.

    1985-01-01

Application of the Monte-Carlo method for multidimensional integration is discussed. The main goal is to check the statement that the application of quasi-random numbers instead of regular pseudo-random numbers provides more rapid convergence. The Sobol, Richtmayer and Halton algorithms for quasi-random sequences are described. Over 50 tests comparing these quasi-random numbers with pseudo-random numbers were performed. In all cases quasi-random numbers clearly demonstrated more rapid convergence than pseudo-random ones. The positive test results for quasi-random sequences in the Monte-Carlo method seem very promising
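The comparison is easy to reproduce for a smooth test integrand. This sketch uses a two-dimensional Halton sequence (one of the algorithms named above) against Python's built-in pseudo-random generator; the integrand and sample size are illustrative:

```python
import math
import random

def halton(index, base):
    """Radical-inverse (van der Corput) value of `index` in `base`."""
    result, f = 0.0, 1.0 / base
    while index > 0:
        result += (index % base) * f
        index //= base
        f /= base
    return result

def integrate(points):
    """Estimate the integral of sin(pi x) sin(pi y) over the unit square (exact: 4/pi^2)."""
    return sum(math.sin(math.pi * x) * math.sin(math.pi * y)
               for x, y in points) / len(points)

n = 10_000
exact = 4.0 / math.pi ** 2

quasi = [(halton(i, 2), halton(i, 3)) for i in range(1, n + 1)]
random.seed(42)
pseudo = [(random.random(), random.random()) for _ in range(n)]

print(f"Halton error:        {abs(integrate(quasi) - exact):.2e}")
print(f"pseudo-random error: {abs(integrate(pseudo) - exact):.2e}")
```

For smooth integrands the quasi-random error shrinks roughly like (log n)^d / n rather than the 1/sqrt(n) of pseudo-random sampling, which is the more rapid convergence the paper reports.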

  3. The optimal number of surveys when detectability varies.

    Directory of Open Access Journals (Sweden)

    Alana L Moore

Full Text Available The survey of plant and animal populations is central to undertaking field ecology. However, detection is imperfect, so the absence of a species cannot be determined with certainty. Methods developed to account for imperfect detectability during surveys do not yet account for stochastic variation in detectability over time or space. When each survey entails a fixed cost that is not spent searching (e.g., time required to travel to the site), stochastic detection rates result in a trade-off between the number of surveys and the length of each survey when surveying a single site. We present a model that addresses this trade-off and use it to determine the number of surveys that: (1) maximizes the expected probability of detection over the entire survey period; and (2) is most likely to achieve a minimally-acceptable probability of detection. We illustrate the applicability of our approach using three practical examples (minimum survey effort protocols, number of frog surveys per season, and number of quadrats per site to detect a plant species) and test our model's predictions using data from experimental plant surveys. We find that when maximizing the expected probability of detection, the optimal survey design is most sensitive to the coefficient of variation in the rate of detection and the ratio of the search budget to the travel cost. When maximizing the likelihood of achieving a particular probability of detection, the optimal survey design is most sensitive to the required probability of detection, the expected number of detections if the budget were spent only on searching, and the expected number of detections that are missed due to travel costs. We find that accounting for stochasticity in detection rates is likely to be particularly important for designing surveys when detection rates are low. Our model provides a framework to do this.
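The trade-off can be sketched with a small simulation. All parameter values below are hypothetical, and the exponentially distributed per-visit detection rate is one illustrative way to model "stochastic detectability":

```python
import math
import random

random.seed(7)

BUDGET = 8.0        # total hours available for a site
TRAVEL = 0.5        # fixed overhead per visit (hours), not spent searching
MEAN_RATE = 0.6     # mean detections per search-hour

def detection_probability(n_surveys, n_sim=20_000):
    """Expected P(at least one detection) when the detection rate varies per visit."""
    search_time = BUDGET / n_surveys - TRAVEL
    if search_time <= 0:
        return 0.0
    total = 0.0
    for _ in range(n_sim):
        p_miss = 1.0
        for _ in range(n_surveys):
            rate = random.expovariate(1.0 / MEAN_RATE)  # stochastic per-visit rate
            p_miss *= math.exp(-rate * search_time)
        total += 1.0 - p_miss
    return total / n_sim

results = {n: round(detection_probability(n), 3) for n in range(1, 9)}
best = max(results, key=results.get)
print(results)
print("best number of surveys:", best)
```

With a deterministic rate one long survey would always win; the variance in per-visit rates is what makes splitting the budget across several shorter visits pay off, up to the point where travel overhead dominates.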

  4. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: (1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; (2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and (3) testing of the theory typically takes the form of confronting the expectation values of an observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  5. Description of atomic burials in compact globular proteins by Fermi-Dirac probability distributions.

    Science.gov (United States)

    Gomes, Antonio L C; de Rezende, Júlia R; Pereira de Araújo, Antônio F; Shakhnovich, Eugene I

    2007-02-01

    We perform a statistical analysis of atomic distributions as a function of the distance R from the molecular geometrical center in a nonredundant set of compact globular proteins. The number of atoms increases quadratically for small R, indicating a constant average density inside the core, reaches a maximum at a size-dependent distance R(max), and falls rapidly for larger R. The empirical curves turn out to be consistent with the volume increase of spherical concentric solid shells and a Fermi-Dirac distribution in which the distance R plays the role of an effective atomic energy epsilon(R) = R. The effective chemical potential mu governing the distribution increases with the number of residues, reflecting the size of the protein globule, while the temperature parameter beta decreases. Interestingly, the product beta*mu is not as strongly dependent on protein size and appears to be tuned to maintain approximately half of the atoms in the high density interior and the other half in the exterior region of rapidly decreasing density. A normalized size-independent distribution was obtained for the atomic probability as a function of the reduced distance, r = R/R(g), where R(g) is the radius of gyration. The global normalized Fermi distribution, F(r), can be reasonably decomposed in Fermi-like subdistributions for different atomic types tau, F(tau)(r), with Sigma(tau)F(tau)(r) = F(r), which depend on two additional parameters mu(tau) and h(tau). The chemical potential mu(tau) affects a scaling prefactor and depends on the overall frequency of the corresponding atomic type, while the maximum position of the subdistribution is determined by h(tau), which appears in a type-dependent atomic effective energy, epsilon(tau)(r) = h(tau)r, and is strongly correlated to available hydrophobicity scales. Better adjustments are obtained when the effective energy is not assumed to be necessarily linear, or epsilon(tau)*(r) = h(tau)*r^alpha, in which case a correlation with hydrophobicity
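
    The shape described here, a quadratic rise in the core followed by a Fermi-Dirac fall-off, can be made concrete in a few lines; the values of mu and beta below are invented for illustration, not fitted protein parameters.

```python
import math

def fermi_shell_count(R, mu=15.0, beta=0.8):
    """Expected atom count in a thin shell at radius R: shell volume
    grows like R**2 while occupancy follows a Fermi-Dirac factor with
    effective energy epsilon(R) = R. mu and beta are illustrative
    values, not fitted protein parameters."""
    occupancy = 1.0 / (math.exp(beta * (R - mu)) + 1.0)
    return R ** 2 * occupancy

counts = [fermi_shell_count(r) for r in range(31)]
peak = counts.index(max(counts))  # quadratic rise, then a rapid fall
```

    Inside the core (R well below mu) occupancy is close to 1 and the count grows as R**2; past mu it drops off rapidly, reproducing the size-dependent maximum R(max).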

  6. Evolution of an array of elements with logistic transition probability

    International Nuclear Information System (INIS)

    Majernik, Vladimir; Surda, Anton

    1996-01-01

    The paper addresses the problem of how the state of an array of elements changes if the transition probabilities of its elements are chosen in the form of a logistic map. This problem leads to a special type of discrete-time Markov chain, which we simulated numerically for different transition probabilities and numbers of elements in the array. We show that the time evolution of the array exhibits a wide range of behavior depending on the value of the total number of its elements and on the logistic constant a. We point out that this problem can be applied to the description of a spin system with a certain type of mean field and of multispecies ecosystems with an internal noise. (authors)
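
    A minimal sketch of such dynamics, under our assumed mean-field reading (not necessarily the authors' exact model): every element flips with a probability given by the logistic map applied to the current fraction m of occupied elements.

```python
import random

def evolve(n=200, a=3.2, steps=100, seed=42):
    """Each element is 0 or 1; at every step each element independently
    becomes 1 with probability p = a*m*(1-m), where m is the current
    fraction of 1s. The mean-field coupling through m is our assumed
    reading of the abstract, used only to illustrate the dynamics."""
    rng = random.Random(seed)
    state = [rng.randint(0, 1) for _ in range(n)]
    history = []
    for _ in range(steps):
        m = sum(state) / n
        p = min(1.0, max(0.0, a * m * (1.0 - m)))  # logistic transition probability
        state = [1 if rng.random() < p else 0 for _ in range(n)]
        history.append(sum(state) / n)
    return history

traj = evolve()
```

    Sweeping a from small values toward 4 and changing n shows the range of behavior the abstract describes, from quiet fixed points to noisy oscillations.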

  7. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned

  8. Lower Confidence Bounds for the Probabilities of Correct Selection

    Directory of Open Access Journals (Sweden)

    Radhey S. Singh

    2011-01-01

    Full Text Available We extend the results of Gupta and Liang (1998), derived for location parameters, to obtain lower confidence bounds for the probability of correctly selecting the t best populations (PCSt) simultaneously for all t=1,…,k−1 for the general scale parameter models, where k is the number of populations involved in the selection problem. The application of the results to the exponential and normal probability models is discussed. The implementation of the simultaneous lower confidence bounds for PCSt is illustrated through real-life datasets.

  9. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introduction

  10. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
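
    The paper's sample code is in R; as a language-neutral illustration of the nearest-neighbour variant of a probability machine, here is a toy one-dimensional estimator (the data and parameters below are made up):

```python
def knn_probability(train, query, k=5):
    """Nearest-neighbour probability machine: estimate P(y=1 | x) as the
    fraction of 1-labels among the k training points nearest to the
    query, rather than returning only a 0/1 classification."""
    neighbours = sorted(train, key=lambda xy: abs(xy[0] - query))[:k]
    return sum(label for _, label in neighbours) / k

# One-dimensional toy problem where P(y=1|x) rises with x.
train = [(x / 20, 1 if x >= 10 else 0) for x in range(21)]
p_low = knn_probability(train, 0.1)   # deep in the label-0 region
p_high = knn_probability(train, 0.9)  # deep in the label-1 region
```

    With enough data and k growing slowly with sample size, this local average converges to the true conditional probability, which is the consistency property the paper discusses.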

  11. Posterior probability of linkage and maximal lod score.

    Science.gov (United States)

    Génin, E; Martinez, M; Clerget-Darpoux, F

    1995-01-01

    To detect linkage between a trait and a marker, Morton (1955) proposed to calculate the lod score z(theta 1) at a given value theta 1 of the recombination fraction. If z(theta 1) reaches +3 then linkage is concluded. However, in practice, lod scores are calculated for different values of the recombination fraction between 0 and 0.5 and the test is based on the maximum value of the lod score Zmax. The impact of this deviation of the test on the probability that in fact linkage does not exist, when linkage was concluded, is documented here. This posterior probability of no linkage can be derived by using Bayes' theorem. It is less than 5% when the lod score at a predetermined theta 1 is used for the test. But, for a Zmax of +3, we showed that it can reach 16.4%. Thus, considering a composite alternative hypothesis instead of a single one decreases the reliability of the test. The reliability decreases rapidly when Zmax is less than +3. Given a Zmax of +2.5, there is a 33% chance that linkage does not exist. Moreover, the posterior probability depends not only on the value of Zmax but also jointly on the family structures and on the genetic model. For a given Zmax, the chance that linkage exists may then vary.
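
    The Bayes computation sketched in this abstract can be reproduced directly: a lod score z corresponds to a likelihood ratio of 10**z in favour of linkage. The 0.05 prior below is a conventional illustrative figure, and the sketch ignores the maximization over theta that raises the posterior probability of no linkage for Zmax.

```python
def posterior_no_linkage(z, prior_linkage=0.05):
    """Posterior probability that linkage does NOT exist given a lod
    score z, via Bayes' theorem in odds form. The prior is an assumed
    illustrative value; the paper's figures for Zmax also reflect the
    maximization over theta, which this sketch does not model."""
    prior_odds = prior_linkage / (1.0 - prior_linkage)
    posterior_odds = prior_odds * 10.0 ** z  # lod z = likelihood ratio 10**z
    return 1.0 / (1.0 + posterior_odds)

p_at_3 = posterior_no_linkage(3.0)    # below 5%, as for a fixed theta
p_at_2_5 = posterior_no_linkage(2.5)  # noticeably larger
```

    This reproduces the qualitative point that reliability degrades quickly as the observed score drops below +3.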

  12. Statistical tests for whether a given set of independent, identically distributed draws comes from a specified probability density.

    Science.gov (United States)

    Tygert, Mark

    2010-09-21

    We discuss several tests for determining whether a given set of independent and identically distributed (i.i.d.) draws does not come from a specified probability density function. The most commonly used are Kolmogorov-Smirnov tests, particularly Kuiper's variant, which focus on discrepancies between the cumulative distribution function for the specified probability density and the empirical cumulative distribution function for the given set of i.i.d. draws. Unfortunately, variations in the probability density function often get smoothed over in the cumulative distribution function, making it difficult to detect discrepancies in regions where the probability density is small in comparison with its values in surrounding regions. We discuss tests without this deficiency, complementing the classical methods. The tests of the present paper are based on the plain fact that it is unlikely to draw a random number whose probability is small, provided that the draw is taken from the same distribution used in calculating the probability (thus, if we draw a random number whose probability is small, then we can be confident that we did not draw the number from the same distribution used in calculating the probability).
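
    For reference, the classical statistic that these tests are meant to complement can be computed in a few lines; the uniform example data below are invented:

```python
def ks_statistic(draws, cdf):
    """Kolmogorov-Smirnov statistic: the largest vertical distance
    between the empirical CDF of the draws and a specified CDF. Both
    sides of each jump of the empirical CDF must be checked."""
    xs = sorted(draws)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        d = max(d, abs((i + 1) / n - cdf(x)), abs(i / n - cdf(x)))
    return d

def uniform_cdf(x):
    return min(1.0, max(0.0, x))

close = ks_statistic([i / 10 + 0.05 for i in range(10)], uniform_cdf)  # spread evenly
far = ks_statistic([0.9 + i / 100 for i in range(10)], uniform_cdf)    # clustered high
```

    The statistic separates these two samples cleanly, but, as the abstract notes, discrepancies confined to regions of low probability density can leave the CDF-based statistic nearly unchanged.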

  13. Emptiness formation probability and quantum Knizhnik-Zamolodchikov equation

    International Nuclear Information System (INIS)

    Boos, H.E.; Korepin, V.E.; Smirnov, F.A.

    2003-01-01

    We consider the one-dimensional XXX spin-1/2 Heisenberg antiferromagnet at zero temperature and zero magnetic field. We are interested in the probability of formation of a ferromagnetic string P(n) in the antiferromagnetic ground state. We call it the emptiness formation probability (EFP). We suggest a new technique for computation of the EFP in the inhomogeneous case. It is based on the quantum Knizhnik-Zamolodchikov equation (qKZ). We calculate the EFP for n≤6 in the inhomogeneous case. The homogeneous limit confirms our hypothesis about the relation of quantum correlations and number theory. We also make a conjecture about the structure of the EFP for arbitrary n.

  14. Heuristics for the Buffer Allocation Problem with Collision Probability Using Computer Simulation

    Directory of Open Access Journals (Sweden)

    Eishi Chiba

    2015-01-01

    Full Text Available The standard manufacturing system for Flat Panel Displays (FPDs) consists of a number of pieces of equipment in series. Each piece of equipment usually has a number of buffers to prevent collision between glass substrates. However, in reality, very few of these buffers seem to be used. This means that redundant buffers exist. In order to reduce cost and space necessary for manufacturing, the number of buffers should be minimized with consideration of possible collisions. In this paper, we focus on an in-line system in which each piece of equipment can have any number of buffers. In this in-line system, we present a computer simulation method for the computation of the probability of a collision occurring. Based on this method, we try to find a buffer allocation that achieves the smallest total number of buffers under an arbitrarily specified collision probability. We also implement our proposed method and present some computational results.
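
    A toy version of the simulation idea, for a single piece of equipment with deterministic arrivals and exponential service times; the queueing assumptions and all parameter values are our illustrative stand-ins, not the paper's in-line model.

```python
import random

def collision_probability(buffers, n=1000, interval=1.0,
                          mean_service=0.7, seed=7):
    """Monte Carlo sketch: glass substrates arrive every `interval` time
    units; an arriving substrate that finds the machine busy and every
    buffer occupied counts as a collision (the substrate is lost)."""
    rng = random.Random(seed)
    in_system = []   # departure times of substrates still present
    t_free = 0.0     # time at which the machine next becomes idle
    collisions = 0
    for i in range(n):
        arrival = i * interval
        in_system = [f for f in in_system if f > arrival]
        # One substrate may be in service; the rest occupy buffers.
        if len(in_system) > buffers:
            collisions += 1
            continue
        start = max(arrival, t_free)
        t_free = start + rng.expovariate(1.0 / mean_service)
        in_system.append(t_free)
    return collisions / n

# Smallest buffer count that keeps the collision probability under 5%:
needed = min(b for b in range(10) if collision_probability(b) < 0.05)
```

    Scanning `buffers` upward until the estimated probability drops below a specified threshold mirrors the paper's search for the smallest allocation meeting a target collision probability.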

  15. Rejecting probability summation for radial frequency patterns, not so Quick!

    Science.gov (United States)

    Baldwin, Alex S; Schmidtmann, Gunnar; Kingdom, Frederick A A; Hess, Robert F

    2016-05-01

    Radial frequency (RF) patterns are used to assess how the visual system processes shape. They are thought to be detected globally. This is supported by studies that have found summation for RF patterns to be greater than would be possible if the parts were detected independently and performance improved with an increasing number of cycles only by probability summation between them. However, the model of probability summation employed in these previous studies was based on High Threshold Theory (HTT), rather than Signal Detection Theory (SDT). We conducted rating scale experiments to investigate the receiver operating characteristics. We find these are of the curved form predicted by SDT, rather than the straight lines predicted by HTT. This means that to test probability summation we must use a model based on SDT. We conducted a set of summation experiments finding that thresholds decrease as the number of modulated cycles increases at approximately the same rate as previously found. As this could be consistent with either additive or probability summation, we performed maximum-likelihood fitting of a set of summation models (Matlab code provided in our Supplementary material) and assessed the fits using cross validation. We find we are not able to distinguish whether the responses to the parts of an RF pattern are combined by additive or probability summation, because the predictions are too similar. We present similar results for summation between separate RF patterns, suggesting that the summation process there may be the same as that within a single RF. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Bernoulli Numbers: from Ada Lovelace to the Debye Functions

    OpenAIRE

    Sparavigna , Amelia Carolina

    2016-01-01

    Jacob Bernoulli owes his fame to his numerous contributions to calculus and to his discoveries in the field of probability. Here we will discuss one of his contributions to the theory of numbers, the Bernoulli numbers. They were proposed as a case study by Ada Lovelace in her analysis of Menabrea's report on Babbage's Analytical Engine. It is probable that it was this work of Lovelace's that inspired Hans Thirring to use the Bernoulli numbers in the calculus of the Debye functions.

  17. Navier--Stokes relaxation to sinh--Poisson states at finite Reynolds numbers

    International Nuclear Information System (INIS)

    Montgomery, D.; Shan, X.; Matthaeus, W.H.

    1993-01-01

    A mathematical framework is proposed in which it seems possible to justify the computationally observed relaxation of a two-dimensional Navier--Stokes fluid to a "most probable," or maximum entropy, state. The relaxation occurs at large but finite Reynolds numbers, and involves substantial decay of higher-order ideal invariants such as enstrophy. A two-fluid formulation, involving interpenetrating positive and negative vorticity fluxes (continuous and square integrable), is developed, and is shown to be intimately related to the passive scalar decay problem. Increasing interpenetration of the two fluids corresponds to the decay of vorticity flux due to viscosity. It is demonstrated numerically that, in two dimensions, passive scalars decay rapidly, relative to mean-square vorticity (enstrophy). This observation provides a basis for assigning initial data to the two-fluid field variables.

  18. Analysis of probability of defects in the disposal canisters

    International Nuclear Information System (INIS)

    Holmberg, J.-E.; Kuusela, P.

    2011-06-01

    This report presents a probability model for the reliability of the spent nuclear waste final disposal canister. Reliability means here that the welding of the canister lid has no critical defects from the long-term safety point of view. From the reliability point of view, both the reliability of the welding process (that no critical defects will arise) and the non-destructive testing (NDT) process (that all critical defects will be detected) are equally important. In the probability model, critical defects in a weld were simplified into a few types. Also the possibility of human errors in the NDT process was taken into account in a simple manner. At this moment there is very little representative data to determine the reliability of welding, and the data on NDT is not well suited to the needs of this study. Therefore calculations presented here are based on expert judgements and on several assumptions that have not been verified yet. The Bayesian probability model shows the importance of the uncertainty in the estimation of the reliability parameters. The effect of uncertainty is that the probability distribution of the number of defective canisters becomes flat for larger numbers of canisters compared to the binomial probability distribution in the case of known parameter values. In order to reduce the uncertainty, more information is needed from both the reliability of the welding and NDT processes. It would also be important to analyse the role of human factors in these processes, since their role is not reflected in typical test data which is used to estimate 'normal process variation'. The reported model should be seen as a tool to quantify the roles of different methods and procedures in the weld inspection process. (orig.)
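
    The flattening effect of parameter uncertainty mentioned above can be demonstrated with a beta-binomial predictive simulation; the Beta(1, 9) prior below is a made-up illustration, not the report's estimate.

```python
import random

def defect_count_distribution(n_canisters=100, alpha=1.0, beta_=9.0,
                              trials=5000, seed=3):
    """Predictive distribution of the number of defective canisters when
    the per-canister defect probability is itself uncertain and modelled
    as Beta(alpha, beta_) (mean 0.1 here). Drawing p afresh for each
    trial mixes binomials over the prior."""
    rng = random.Random(seed)
    counts = [0] * (n_canisters + 1)
    for _ in range(trials):
        p = rng.betavariate(alpha, beta_)
        k = sum(1 for _ in range(n_canisters) if rng.random() < p)
        counts[k] += 1
    return [c / trials for c in counts]

dist = defect_count_distribution()
mean = sum(k * p for k, p in enumerate(dist))
var = sum((k - mean) ** 2 * p for k, p in enumerate(dist))
# var is much larger than the Binomial(100, 0.1) variance of 9:
# the distribution is flatter and wider, as the report describes.
```
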

  19. Out-of-hospital cardiac arrest: Probability of bystander defibrillation relative to distance to nearest automated external defibrillator.

    Science.gov (United States)

    Sondergaard, Kathrine B; Hansen, Steen Moller; Pallisgaard, Jannik L; Gerds, Thomas Alexander; Wissenberg, Mads; Karlsson, Lena; Lippert, Freddy K; Gislason, Gunnar H; Torp-Pedersen, Christian; Folke, Fredrik

    2018-03-01

    Despite wide dissemination of automated external defibrillators (AEDs), bystander defibrillation rates remain low. We aimed to investigate how route distance to the nearest accessible AED was associated with probability of bystander defibrillation in public and residential locations. We used data from the nationwide Danish Cardiac Arrest Registry and the Danish AED Network to identify out-of-hospital cardiac arrests and route distances to nearest accessible registered AED during 2008-2013. The association between route distance and bystander defibrillation was described using restricted cubic spline logistic regression. We included 6971 out-of-hospital cardiac arrest cases. The proportion of arrests according to distance in meters (≤100, 101-200, >200) to the nearest accessible AED was: 4.6% (n=320), 5.3% (n=370), and 90.1% (n=6281), respectively. For cardiac arrests in public locations, the probability of bystander defibrillation at 0, 100 and 200m from the nearest AED was 35.7% (95% confidence interval 28.0%-43.5%), 21.3% (95% confidence interval 17.4%-25.2%), and 13.7% (95% confidence interval 10.1%-16.8%), respectively. The corresponding numbers for cardiac arrests in residential locations were 7.0% (95% confidence interval -2.1%-16.1%), 1.5% (95% confidence interval 0.002%-2.8%), and 0.9% (95% confidence interval 0.0005%-1.7%), respectively. In public locations, the probability of bystander defibrillation decreased rapidly within the first 100m route distance from cardiac arrest to nearest accessible AED whereas the probability of bystander defibrillation was low for all distances in residential areas. Copyright © 2017 Elsevier B.V. All rights reserved.
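
    To make the reported decay concrete, a two-point logistic interpolation through the public-location figures (35.7% at 0 m, 13.7% at 200 m) lands close to the value reported at 100 m. The paper itself used restricted cubic splines, so the plain logistic form below is a simplifying assumption.

```python
import math

def logit(p):
    return math.log(p / (1.0 - p))

def p_defib(distance_m):
    """Probability of bystander defibrillation in public locations as a
    logistic function of route distance, pinned to the two reported
    endpoints (35.7% at 0 m, 13.7% at 200 m). A stand-in for the
    paper's restricted cubic spline model."""
    b0 = logit(0.357)
    b1 = (logit(0.137) - b0) / 200.0
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * distance_m)))

p100 = p_defib(100.0)  # lands near the reported 21.3%
```
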

  20. A Probability-Based Hybrid User Model for Recommendation System

    Directory of Open Access Journals (Sweden)

    Jia Hao

    2016-01-01

    Full Text Available With the rapid development of information communication technology, the available information or knowledge is exponentially increased, and this causes the well-known information overload phenomenon. This problem is more serious in product design corporations because over half of the valuable design time is consumed in knowledge acquisition, which highly extends the design cycle and weakens the competitiveness. Therefore, recommender systems become very important in the domain of product design. This research presents a probability-based hybrid user model, which is a combination of collaborative filtering and content-based filtering. This hybrid model utilizes user ratings and item topics or classes, which are available in the domain of product design, to predict the knowledge requirement. The comprehensive analysis of the experimental results shows that the proposed method gains better performance in most of the parameter settings. This work contributes a probability-based method to the community for implementing recommender systems when only user ratings and item topics are available.

  1. SENSITIVITY OF NEST SUCCESS, YOUNG FLEDGED, AND PROBABILITY OF RENESTING TO SEASONAL FECUNDITY IN MULTI-BROODED SPECIES

    Science.gov (United States)

    A considerable number of avian species can produce multiple broods within a season. Seasonal fecundity in these species can vary by changes in the number of young fledged per nest, the probability of a successful nest, and the probability of initiating additional nests (e.g., re...

  2. Random number generation

    International Nuclear Information System (INIS)

    Coveyou, R.R.

    1974-01-01

    The subject of random number generation is currently controversial. Differing opinions on this subject seem to stem from implicit or explicit differences in philosophy; in particular, from differing ideas concerning the role of probability in the real world of physical processes, electronic computers, and Monte Carlo calculations. An attempt is made here to reconcile these views. The role of stochastic ideas in mathematical models is discussed. In illustration of these ideas, a mathematical model of the use of random number generators in Monte Carlo calculations is constructed. This model is used to set up criteria for the comparison and evaluation of random number generators. (U.S.)

  3. Traffic simulation based ship collision probability modeling

    Energy Technology Data Exchange (ETDEWEB)

    Goerlandt, Floris, E-mail: floris.goerlandt@tkk.f [Aalto University, School of Science and Technology, Department of Applied Mechanics, Marine Technology, P.O. Box 15300, FI-00076 AALTO, Espoo (Finland); Kujala, Pentti [Aalto University, School of Science and Technology, Department of Applied Mechanics, Marine Technology, P.O. Box 15300, FI-00076 AALTO, Espoo (Finland)

    2011-01-15

    Maritime traffic poses various risks in terms of human, environmental and economic loss. In a risk analysis of ship collisions, it is important to get a reasonable estimate for the probability of such accidents and the consequences they lead to. In this paper, a method is proposed to assess the probability of vessels colliding with each other. The method is capable of determining the expected number of accidents, the locations where and the time when they are most likely to occur, while providing input for models concerned with the expected consequences. At the basis of the collision detection algorithm lies an extensive time-domain micro-simulation of vessel traffic in the given area. The Monte Carlo simulation technique is applied to obtain a meaningful prediction of the relevant factors of the collision events. Data obtained through the Automatic Identification System is analyzed in detail to obtain realistic input data for the traffic simulation: traffic routes, the number of vessels on each route, the ship departure times, main dimensions and sailing speed. The results obtained by the proposed method for the studied case of the Gulf of Finland are presented, showing reasonable agreement with registered accident and near-miss data.

  4. Mining of high utility-probability sequential patterns from uncertain databases.

    Directory of Open Access Journals (Sweden)

    Binbin Zhang

    Full Text Available High-utility sequential pattern mining (HUSPM) has become an important issue in the field of data mining. Several HUSPM algorithms have been designed to mine high-utility sequential patterns (HUSPs). They have been applied in several real-life situations such as consumer behavior analysis and event detection in sensor networks. Nonetheless, most studies on HUSPM have focused on mining HUSPs in precise data. But in real life, uncertainty is an important factor as data is collected using various types of sensors that are more or less accurate. Hence, data collected in a real-life database can be annotated with existence probabilities. This paper presents a novel pattern mining framework called high utility-probability sequential pattern mining (HUPSPM) for mining high utility-probability sequential patterns (HUPSPs) in uncertain sequence databases. A baseline algorithm with three optional pruning strategies is presented to mine HUPSPs. Moreover, to speed up the mining process, a projection mechanism is designed to create a database projection for each processed sequence, which is smaller than the original database. Thus, the number of unpromising candidates can be greatly reduced, as well as the execution time for mining HUPSPs. Substantial experiments both on real-life and synthetic datasets show that the designed algorithm performs well in terms of runtime, number of candidates, memory usage, and scalability for different minimum utility and minimum probability thresholds.

  5. Quantum probability and cognitive modeling: some cautions and a promising direction in modeling physics learning.

    Science.gov (United States)

    Franceschetti, Donald R; Gire, Elizabeth

    2013-06-01

    Quantum probability theory offers a viable alternative to classical probability, although there are some ambiguities inherent in transferring the quantum formalism to a less determined realm. A number of physicists are now looking at the applicability of quantum ideas to the assessment of physics learning, an area particularly suited to quantum probability ideas.

  6. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  7. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.

  8. Heuristics can produce surprisingly rational probability estimates: Comment on Costello and Watts (2014).

    Science.gov (United States)

    Nilsson, Håkan; Juslin, Peter; Winman, Anders

    2016-01-01

    Costello and Watts (2014) present a model assuming that people's knowledge of probabilities adheres to probability theory, but that their probability judgments are perturbed by a random noise in the retrieval from memory. Predictions for the relationships between probability judgments for constituent events and their disjunctions and conjunctions, as well as for sums of such judgments were derived from probability theory. Costello and Watts (2014) report behavioral data showing that subjective probability judgments accord with these predictions. Based on the finding that subjective probability judgments follow probability theory, Costello and Watts (2014) conclude that the results imply that people's probability judgments embody the rules of probability theory and thereby refute theories of heuristic processing. Here, we demonstrate the invalidity of this conclusion by showing that all of the tested predictions follow straightforwardly from an account assuming heuristic probability integration (Nilsson, Winman, Juslin, & Hansson, 2009). We end with a discussion of a number of previous findings that harmonize very poorly with the predictions by the model suggested by Costello and Watts (2014). (c) 2015 APA, all rights reserved.

  9. Probability approaching method (PAM) and its application on fuel management optimization

    International Nuclear Information System (INIS)

    Liu, Z.; Hu, Y.; Shi, G.

    2004-01-01

    For the multi-cycle reloading optimization problem, a new solution scheme is presented. The multi-cycle problem is de-coupled into a number of relatively independent mono-cycle issues, and then this non-linear programming problem with complex constraints is solved by a new algorithm, the probability approaching method (PAM), which is based on probability theory. Results on a simplified core model show the effectiveness of this new multi-cycle optimization scheme. (authors)

  10. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.

  11. An empirical probability model of detecting species at low densities.

    Science.gov (United States)

    Delaney, David G; Leung, Brian

    2010-06-01

    False negatives, not detecting things that are actually present, are an important but understudied problem. False negatives are the result of our inability to perfectly detect species, especially those at low density such as endangered species or newly arriving introduced species. They reduce our ability to interpret presence-absence survey data and make sound management decisions (e.g., rapid response). To reduce the probability of false negatives, we need to compare the efficacy and sensitivity of different sampling approaches and quantify an unbiased estimate of the probability of detection. We conducted field experiments in the intertidal zone of New England and New York to test the sensitivity of two sampling approaches (quadrat vs. total area search, TAS), given different target characteristics (mobile vs. sessile). Using logistic regression we built detection curves for each sampling approach that related the sampling intensity and the density of targets to the probability of detection. The TAS approach reduced the probability of false negatives and detected targets faster than the quadrat approach. Mobility of targets increased the time to detection but did not affect detection success. Finally, we interpreted two years of presence-absence data on the distribution of the Asian shore crab (Hemigrapsus sanguineus) in New England and New York, using our probability model for false negatives. The type of experimental approach in this paper can help to reduce false negatives and increase our ability to detect species at low densities by refining sampling approaches, which can guide conservation strategies and management decisions in various areas of ecology such as conservation biology and invasion ecology.
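
    The detection-curve approach described above can be sketched with a logistic regression of detection outcomes on sampling effort and target density. The data below are synthetic and the coefficients, effort and density ranges are invented for illustration; they are not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic survey data: detection probability rises with sampling effort
# and target density (coefficients below are illustrative, not the study's).
n = 500
effort = rng.uniform(0.5, 5.0, n)       # e.g. search time in minutes
density = rng.uniform(0.1, 2.0, n)      # targets per square metre
true_beta = np.array([-3.0, 0.8, 1.5])  # intercept, effort, density
X = np.column_stack([np.ones(n), effort, density])
p_true = 1.0 / (1.0 + np.exp(-X @ true_beta))
detected = rng.random(n) < p_true       # 0/1 detection outcomes

# Fit logistic regression by plain gradient ascent on the log-likelihood.
beta = np.zeros(3)
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    beta += 0.01 * X.T @ (detected - p) / n

def detection_prob(effort, density, beta=beta):
    """Fitted detection curve: P(detect | effort, density)."""
    return 1.0 / (1.0 + np.exp(-(beta[0] + beta[1] * effort + beta[2] * density)))

print(detection_prob(4.0, 1.5))  # high effort, high density -> high probability
```

    A curve like `detection_prob` is what lets presence-absence data be reinterpreted: the probability of a false negative at a given effort and density is one minus the fitted detection probability.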

  12. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  13. Assessment of the Effectiveness of Ectomycorrhizal Inocula to Promote Growth and Root Ectomycorrhizal Colonization in Pinus patula Seedlings Using the Most Probable Number Technique

    Directory of Open Access Journals (Sweden)

    Manuel Restrepo-Llano

    2014-01-01

    The aim of this study was to evaluate the response of Pinus patula seedlings to two inocula types: soil from a Pinus plantation (ES) and an in vitro produced inoculum (EM). The most probable number (MPN) method was used to quantify ectomycorrhizal propagule density (EPD) in both inocula in a 7-order dilution series ranging from 10^0 (undiluted inoculum) to 10^-6 (the most diluted inoculum). The MPN method allowed establishing differences in the infective ectomycorrhizal propagule density (EPD) of the two inocula (ES = 34 per g; EM = 156 per g). The results suggest that the EPD of an inoculum may be a key factor that influences the success of the inoculation. The low EPD of the ES inoculum suggests that soil extracted from forest plantations has very low effectiveness for promoting root colonization and plant growth. In contrast, the high EPD found in the formulated inoculum (EM) reinforces the idea that it is better to use proven high-quality inocula in forest nurseries than soil from a forestry plantation.
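
    An MPN figure of this kind is the maximum-likelihood estimate under the standard assumption that propagules are Poisson-distributed across tubes. A minimal sketch of that estimator follows; the dilution volumes and tube counts below are hypothetical, not the study's design.

```python
import math

def mpn_estimate(volumes, tubes, positives, lo=1e-6, hi=1e6):
    """Maximum-likelihood MPN (organisms per unit volume) from a dilution
    series: volumes[i] is the sample amount at level i, tubes[i] the number
    of replicate tubes, positives[i] how many of them showed growth."""
    def score(lam):  # derivative of the log-likelihood in lam
        s = 0.0
        for v, n, p in zip(volumes, tubes, positives):
            s += p * v * math.exp(-lam * v) / (1.0 - math.exp(-lam * v))
            s -= (n - p) * v
        return s
    for _ in range(200):          # bisection on the score function
        mid = math.sqrt(lo * hi)  # geometric midpoint suits the scale
        if score(mid) > 0:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)

# Hypothetical 10-fold dilution series, 5 tubes per level, pattern 5-3-0:
print(round(mpn_estimate([10.0, 1.0, 0.1], [5, 5, 5], [5, 3, 0]), 2))  # ≈ 0.79
```

    The returned value agrees with the classical MPN table entry for a 5-3-0 positive-tube pattern at these volumes.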

  14. Calculation of the exit probability of a particle from a cylinder of matter

    International Nuclear Information System (INIS)

    Ertaud, A.; Mercier, C.

    1949-02-01

    In the elementary calculation of the ε coefficient and of the slowing-down length inside a nuclear pile made of a network of cylindrical rods, it is necessary to know the exit probability of a neutron initially located inside a cylinder filled with a given substance. This probability is the ratio between the number of neutrons leaving and the number of neutrons produced inside the surface of the cylinder. This report presents the solution of this probabilistic problem (an integral calculation) for both the cylindrical and the spherical case. (J.S.)

  15. No shortcut solution to the problem of Y-STR match probability calculation.

    Science.gov (United States)

    Caliebe, Amke; Jochens, Arne; Willuweit, Sascha; Roewer, Lutz; Krawczak, Michael

    2015-03-01

    Match probability calculation is deemed much more intricate for lineage genetic markers, including Y-chromosomal short tandem repeats (Y-STRs), than for autosomal markers. This is because, owing to the lack of recombination, strong interdependence between markers is likely, which implies that haplotype frequency estimates cannot simply be obtained through the multiplication of allele frequency estimates. As yet, however, the practical relevance of this problem has not been studied in much detail using real data. In fact, such scrutiny appears well warranted because the high mutation rates of Y-STRs and the possibility of backward mutation should have worked against the statistical association of Y-STRs. We examined haplotype data of 21 markers included in the PowerPlex(®)Y23 set (PPY23, Promega Corporation, Madison, WI) originating from six different populations (four European and two Asian). Assessing the conditional entropies of the markers, given different subsets of markers from the same panel, we demonstrate that the PowerPlex(®)Y23 set cannot be decomposed into smaller marker subsets that would be (conditionally) independent. Nevertheless, in all six populations, >94% of the joint entropy of the 21 markers is explained by the seven most rapidly mutating markers. Although this result might render a reduction in marker number a sensible option for practical casework, the partial haplotypes would still be almost as diverse as the full haplotypes. Therefore, match probability calculation remains difficult and calls for the improvement of currently available methods of haplotype frequency estimation. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
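
    The conditional-entropy analysis described above can be illustrated in miniature: from empirical haplotype counts, H(full haplotype | marker subset) = H(full) − H(subset). The four-marker haplotypes below are invented for illustration and are not PPY23 data.

```python
import math
from collections import Counter

def entropy(samples):
    """Shannon entropy (bits) of the empirical distribution of `samples`."""
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in Counter(samples).values())

def conditional_entropy(haplotypes, given):
    """H(full haplotype | markers at positions `given`) =
    H(full) - H(subset), estimated from observed haplotype counts."""
    subset = [tuple(h[i] for i in given) for h in haplotypes]
    return entropy([tuple(h) for h in haplotypes]) - entropy(subset)

# Toy 4-marker haplotypes (allele repeat numbers, purely illustrative):
haps = [(13, 29, 15, 11), (13, 29, 15, 12), (14, 30, 16, 11),
        (14, 31, 16, 11), (13, 29, 15, 11), (14, 30, 16, 12)]

# How much haplotype uncertainty remains once markers 0 and 1 are known?
print(conditional_entropy(haps, given=(0, 1)))
```

    A marker subset is (conditionally) sufficient in the paper's sense when this remaining entropy is small relative to the joint entropy of the full haplotypes.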

  16. Multiple model cardinalized probability hypothesis density filter

    Science.gov (United States)

    Georgescu, Ramona; Willett, Peter

    2011-09-01

    The Probability Hypothesis Density (PHD) filter propagates the first-moment approximation to the multi-target Bayesian posterior distribution while the Cardinalized PHD (CPHD) filter propagates both the posterior likelihood of (an unlabeled) target state and the posterior probability mass function of the number of targets. Extensions of the PHD filter to the multiple model (MM) framework have been published and were implemented either with a Sequential Monte Carlo or a Gaussian Mixture approach. In this work, we introduce the multiple model version of the more elaborate CPHD filter. We present the derivation of the prediction and update steps of the MMCPHD particularized for the case of two target motion models and proceed to show that in the case of a single model, the new MMCPHD equations reduce to the original CPHD equations.

  17. Evaluation of the Implementation of a Rapid Response Treatment Protocol for Patients with Acute Onset Stroke: Can We Increase the Number of Patients Treated and Shorten the Time Needed

    Directory of Open Access Journals (Sweden)

    Rajiv Advani

    2014-06-01

    Aims: This study aims to evaluate the implementation of a rapid response treatment protocol for patients presenting with acute onset ischemic stroke. Improvements of routines surrounding the admission and treatment of patients with intravenous thrombolysis (IVT), such as door-to-needle (DTN) times, and increases in the number of patients treated are discussed. Methods: We conducted a retrospective analysis of all patients (n = 320) treated with IVT for acute onset ischemic stroke at the Stavanger University Hospital, Norway, between 2003 and 2012. In 2009, a succession of changes to pre- and intra-hospital routines was made, as well as an improvement in the education of primary health care physicians, nurses and paramedics involved in the treatment of acute onset stroke patients (rapid response treatment protocol). Analyses of DTN times, onset-to-needle times and the number of patients treated per year were carried out to ascertain the effect of the changes made. The primary aim was to analyze DTN times to look for any changes, and the secondary aim was to analyze changes in the number of patients treated per year. Results: In the years after the implementation of the rapid treatment protocol, we saw an improvement in the median DTN time, with a decrease from 73 to 50 min in the first year (p = 0.03), a decrease of 45 min in the second year (p = 0.01) and a decrease of 31 min in the third year (p …). Conclusions: The implementation of the rapid treatment protocol for acute onset ischemic stroke patients led to a significant decrease in the DTN time at our center. These improvements also produced an increase in the number of patients treated per year. The extension of the therapeutic window from 3 to 4.5 h for the use of intravenous recombinant tissue plasminogen activator also played a role in the increased treatment numbers.

  18. Gray matter volume and rapid decision-making in major depressive disorder.

    Science.gov (United States)

    Nakano, Masayuki; Matsuo, Koji; Nakashima, Mami; Matsubara, Toshio; Harada, Kenichiro; Egashira, Kazuteru; Masaki, Hiroaki; Takahashi, Kanji; Watanabe, Yoshifumi

    2014-01-03

    Reduced motivation and blunted decision-making are key features of major depressive disorder (MDD). Patients with MDD show abnormal decision-making when given negative feedback regarding a reward. The brain mechanisms underpinning this behavior remain unclear. In the present study, we examined the association between rapid decision-making with negative feedback and brain volume in MDD. Thirty-six patients with MDD and 54 age-, sex- and IQ-matched healthy subjects were studied. Subjects performed a rapid decision-making monetary task in which participants could make high- or low-risk choices. We compared between the 2 groups the probability that a high-risk choice followed negative feedback. In addition, we used voxel-based morphometry (VBM) to compare between group differences in gray matter volume, and the correlation between the probability for high-risk choices and brain volume. Compared to the healthy group, the MDD group showed significantly lower probabilities for high-risk choices following negative feedback. VBM analysis revealed that the MDD group had less gray matter volume in the right medial prefrontal cortex and orbitofrontal cortex (OFC) compared to the healthy group. The right OFC volume was negatively correlated with the probability that a high-risk choice followed negative feedback in patients with MDD. We did not observe these trends in healthy subjects. Patients with MDD show reduced motivation for monetary incentives when they were required to make rapid decisions following negative feedback. We observed a correlation between this reduced motivation and gray matter volume in the medial and ventral prefrontal cortex, which suggests that these brain regions are likely involved in the pathophysiology of aberrant decision-making in MDD. © 2013.

  19. STADIC: a computer code for combining probability distributions

    International Nuclear Information System (INIS)

    Cairns, J.J.; Fleming, K.N.

    1977-03-01

    The STADIC computer code uses a Monte Carlo simulation technique for combining probability distributions. The specific function for combining the input distributions is defined by the user by introducing the appropriate FORTRAN statements into the corresponding subroutine. The code generates a Monte Carlo sampling from each of the input distributions and combines these according to the user-supplied function to provide, in essence, a random sampling of the combined distribution. When the desired number of samples is obtained, the output routine calculates the mean, standard deviation, and confidence limits for the resultant distribution. This method of combining probability distributions is particularly useful in cases where analytical approaches are either too difficult or undefined.
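
    STADIC itself is a FORTRAN code, but the sampling scheme it describes is straightforward to sketch. The distributions and combining function below are illustrative assumptions, not STADIC's actual interface.

```python
import random
import statistics

def combine(dists, func, n_samples=100_000, seed=1):
    """Monte Carlo combination of probability distributions, in the spirit
    of STADIC: draw one sample from each input distribution, combine them
    with a user-supplied function, and summarize the result."""
    rng = random.Random(seed)
    samples = [func(*(d(rng) for d in dists)) for _ in range(n_samples)]
    samples.sort()
    return {
        "mean": statistics.fmean(samples),
        "stdev": statistics.stdev(samples),
        "ci90": (samples[int(0.05 * n_samples)], samples[int(0.95 * n_samples)]),
    }

# Example: the product of a lognormal and a normally distributed factor.
dists = [lambda r: r.lognormvariate(0.0, 0.5),
         lambda r: r.gauss(2.0, 0.3)]
result = combine(dists, lambda a, b: a * b)
print(result["mean"])  # close to E[a]*E[b] = exp(0.125) * 2 ≈ 2.27
```

    Sorting the samples makes the empirical quantiles directly available, which is how the confidence limits fall out of the simulation for free.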

  20. Evaluation of the probability distribution of intake from a single measurement on a personal air sampler

    International Nuclear Information System (INIS)

    Birchall, A.; Muirhead, C.R.; James, A.C.

    1988-01-01

    An analytical expression has been derived for the k-sum distribution, formed by summing k random variables from a lognormal population. Poisson statistics are used with this distribution to derive the distribution of intake when breathing an atmosphere with a constant particle number concentration. Bayesian inference is then used to calculate the posterior probability distribution of concentrations from a given measurement. This is combined with the above intake distribution to give the probability distribution of intake resulting from a single measurement of activity made by an ideal sampler. It is shown that the probability distribution of intake is very dependent on the prior distribution used in Bayes' theorem. The usual prior assumption, that all number concentrations are equally probable, leads to an imbalance in the posterior intake distribution. This can be resolved if a new prior proportional to w^(-2/3) is used, where w is the expected number of particles collected. (author)

  1. Application of damping mechanism model and stacking fault probability in Fe-Mn alloy

    International Nuclear Information System (INIS)

    Huang, S.K.; Wen, Y.H.; Li, N.; Teng, J.; Ding, S.; Xu, Y.G.

    2008-01-01

    In this paper, the damping mechanism model of Fe-Mn alloy was analyzed using dislocation theory. Moreover, as an important parameter in Fe-Mn based alloys, the effect of stacking fault probability on the damping capacity of Fe-19.35Mn alloy after deep-cooling or tensile deformation was also studied. The damping capacity was measured using a reversal torsion pendulum. The stacking fault probability of γ-austenite and ε-martensite was determined by means of X-ray diffraction (XRD) profile analysis. The microstructure was observed using a scanning electron microscope (SEM). The results indicated that, with the strain amplitude increasing above a critical value, the damping capacity of the Fe-19.35Mn alloy increased rapidly, which could be explained using the breakaway model of Shockley partial dislocations. Deep-cooling and suitable tensile deformation could improve the damping capacity owing to the increase in the stacking fault probability of the Fe-19.35Mn alloy.

  2. 76 FR 36891 - Guidelines for Determining Probability of Causation Under the Energy Employees Occupational...

    Science.gov (United States)

    2011-06-23

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES 42 CFR Part 81 [Docket Number NIOSH-0209] RIN 0920-AA39 Guidelines for Determining Probability of Causation Under the Energy Employees Occupational Illness...: HHS published a proposed rule entitled ``Guidelines for Determining Probability of Causation Under the...

  3. Rapidity gap survival in the black-disk regime

    International Nuclear Information System (INIS)

    Leonid Frankfurt; Charles Hyde; Mark Strikman; Christian Weiss

    2007-01-01

    We summarize how the approach to the black-disk regime (BDR) of strong interactions at TeV energies influences rapidity gap survival in exclusive hard diffraction pp -> p + H + p (H = dijet, Q̄Q, Higgs). Employing a recently developed partonic description of such processes, we discuss (a) the suppression of diffraction at small impact parameters by soft spectator interactions in the BDR; (b) further suppression by inelastic interactions of hard spectator partons in the BDR; (c) correlations between hard and soft interactions. Hard spectator interactions substantially reduce the rapidity gap survival probability at LHC energies compared to previously reported estimates

  4. Implications of Cognitive Load for Hypothesis Generation and Probability Judgment.

    Directory of Open Access Journals (Sweden)

    Amber M Sprenger

    2011-06-01

    We tested the predictions of HyGene (Thomas, Dougherty, Sprenger, & Harbison, 2008) that divided attention at both encoding and judgment should affect the degree to which participants' probability judgments violate the principle of additivity. In two experiments, we showed that divided attention during judgment leads to an increase in subadditivity, suggesting that the comparison process for probability judgments is capacity limited. Contrary to the predictions of HyGene, a third experiment revealed that divided attention during encoding leads to an increase in later probability judgments made under full attention. The effect of divided attention at encoding on judgment was completely mediated by the number of hypotheses participants generated, indicating that limitations in both encoding and recall can cascade into biases in judgments.

  5. Implications of Cognitive Load for Hypothesis Generation and Probability Judgment

    Science.gov (United States)

    Sprenger, Amber M.; Dougherty, Michael R.; Atkins, Sharona M.; Franco-Watkins, Ana M.; Thomas, Rick P.; Lange, Nicholas; Abbs, Brandon

    2011-01-01

    We tested the predictions of HyGene (Thomas et al., 2008) that divided attention at both encoding and judgment should affect the degree to which participants’ probability judgments violate the principle of additivity. In two experiments, we showed that divided attention during judgment leads to an increase in subadditivity, suggesting that the comparison process for probability judgments is capacity limited. Contrary to the predictions of HyGene, a third experiment revealed that divided attention during encoding leads to an increase in later probability judgments made under full attention. The effect of divided attention during encoding on judgment was completely mediated by the number of hypotheses participants generated, indicating that limitations in both encoding and recall can cascade into biases in judgments. PMID:21734897

  6. Maximizing probable oil field profit: uncertainties on well spacing

    International Nuclear Information System (INIS)

    MacKay, J.A.; Lerche, I.

    1997-01-01

    The influence of uncertainties in field development costs, well costs, lifting costs, selling price, discount factor, and oil field reserves is evaluated for its impact on assessing probable ranges of uncertainty on present day worth (PDW), oil field lifetime τ2/3, optimum number of wells (OWI), and the minimum (n−) and maximum (n+) number of wells to produce a PDW ≥ 0. The relative importance of different factors in contributing to the uncertainties in PDW, τ2/3, OWI, n− and n+ is also analyzed. Numerical illustrations indicate how the maximum PDW depends on the ranges of parameter values, drawn from probability distributions using Monte Carlo simulations. In addition, the procedure illustrates the relative importance of the contributions of individual factors to the total uncertainty, so that one can assess where to place effort to improve ranges of uncertainty, while the volatility of each estimate allows one to determine when such effort is needed. (author)
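
    A Monte Carlo procedure of this kind can be sketched with a deliberately toy PDW model. Everything below is invented for illustration: the parameter ranges, the diminishing-returns recovery term and the fixed field cost have no connection to the paper's actual model.

```python
import random

def pdw(n_wells, reserves, price, well_cost, discount=0.10):
    """Toy present-day-worth model (illustrative only): n_wells recover
    `reserves`; each well costs well_cost; revenue is discounted once
    for simplicity; 50 is an assumed fixed field-development cost."""
    revenue = reserves * price / (1.0 + discount)
    return revenue - n_wells * well_cost - 50.0

rng = random.Random(42)
best_n = []
for _ in range(2000):  # Monte Carlo draw of the uncertain inputs
    reserves = rng.uniform(8.0, 12.0)   # MMbbl
    price = rng.uniform(15.0, 25.0)     # $/bbl
    well_cost = rng.uniform(4.0, 6.0)   # $MM per well
    # recovered fraction improves with wells but saturates:
    values = {n: pdw(n, reserves * (1 - 0.5 ** n), price, well_cost)
              for n in range(1, 15)}
    best_n.append(max(values, key=values.get))

print(min(best_n), max(best_n))  # range of the optimal well count
```

    Collecting the argmax over many draws is exactly what turns a point estimate of the optimal well count into a probable range, which is the quantity of interest above.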

  7. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  8. Communicating through Probabilities: Does Quantum Theory Optimize the Transfer of Information?

    Directory of Open Access Journals (Sweden)

    William K. Wootters

    2013-08-01

    A quantum measurement can be regarded as a communication channel, in which the parameters of the state are expressed only in the probabilities of the outcomes of the measurement. We begin this paper by considering, in a non-quantum-mechanical setting, the problem of communicating through probabilities. For example, a sender, Alice, wants to convey to a receiver, Bob, the value of a continuous variable, θ, but her only means of conveying this value is by sending Bob a coin in which the value of θ is encoded in the probability of heads. We ask what the optimal encoding is when Bob will be allowed to flip the coin only a finite number of times. As the number of tosses goes to infinity, we find that the optimal encoding is the same as what nature would do if we lived in a world governed by real-vector-space quantum theory. We then ask whether the problem might be modified so that the optimal communication strategy would be consistent with standard, complex-vector-space quantum theory.
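
    Alice's coin channel is easy to simulate. The sketch below uses the naive identity encoding (θ itself as the heads probability) with Bob estimating θ by the observed heads fraction; the paper's question is which encoding is optimal, which this sketch does not address.

```python
import random

def transmit(theta, n_tosses, rng):
    """Alice encodes theta in [0,1] as the coin's heads probability;
    Bob's estimate after n tosses is the observed heads fraction.
    (Identity encoding, for illustration only.)"""
    heads = sum(rng.random() < theta for _ in range(n_tosses))
    return heads / n_tosses

rng = random.Random(7)
theta = 0.3
for n in (10, 100, 1000):
    errors = [(transmit(theta, n, rng) - theta) ** 2 for _ in range(2000)]
    print(n, sum(errors) / len(errors))  # MSE shrinks like theta*(1-theta)/n
```

    The finite-toss regime studied in the paper is precisely where this MSE floor matters and where the choice of encoding makes a difference.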

  9. The relationship between the number of loci and the statistical support for the topology of UPGMA trees obtained from genetic distance data.

    Science.gov (United States)

    Highton, R

    1993-12-01

    An analysis of the relationship between the number of loci utilized in an electrophoretic study of genetic relationships and the statistical support for the topology of UPGMA trees is reported for two published data sets: Highton and Larson (Syst. Zool. 28:579-599, 1979), an analysis of the relationships of 28 species of plethodontine salamanders, and Hedges (Syst. Zool. 35:1-21, 1986), a similar study of 30 taxa of Holarctic hylid frogs. The statistical support for the topology at each node of the UPGMA trees was determined by both the bootstrap and jackknife methods as the number of loci increased. The results show that the bootstrap and jackknife probabilities supporting the topology at some nodes of UPGMA trees increase as the number of loci utilized in a study is increased, as expected for nodes whose groupings reflect phylogenetic relationships. The pattern of increase varies and is especially rapid in the case of groups with no close relatives. At nodes that likely do not represent correct phylogenetic relationships, the bootstrap probabilities do not increase and often decline with the addition of more loci.
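
    Bootstrap support over loci can be illustrated in miniature: resample loci with replacement and count how often a candidate grouping is recovered. The three taxa and allele data below are invented, and the simple mismatch distance stands in for the electrophoretic genetic distances of the studies.

```python
import random

def bootstrap_support(alleles_a, alleles_b, alleles_c, reps=2000, seed=3):
    """Bootstrap support (over loci) for the grouping (A,B) vs C:
    resample loci with replacement and count how often A and B are
    more similar to each other than A is to C."""
    rng = random.Random(seed)
    loci = list(range(len(alleles_a)))
    support = 0
    for _ in range(reps):
        sample = [rng.choice(loci) for _ in loci]   # resampled loci
        d_ab = sum(alleles_a[i] != alleles_b[i] for i in sample)
        d_ac = sum(alleles_a[i] != alleles_c[i] for i in sample)
        support += d_ab < d_ac
    return support / reps

# Toy allele data: A and B share most loci; C is more divergent.
a = [1, 1, 2, 3, 1, 2, 1, 3, 2, 1, 1, 2, 3, 1, 2]
b = [1, 1, 2, 3, 1, 2, 1, 3, 2, 2, 1, 2, 3, 1, 1]
c = [2, 3, 2, 1, 1, 3, 2, 3, 1, 2, 2, 2, 1, 1, 2]

print(bootstrap_support(a, b, c))  # high support that A groups with B
```

    Repeating this with progressively longer allele vectors is the miniature analogue of the studies' finding: support for real groupings climbs toward 1 as loci are added, while support for spurious groupings does not.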

  10. Heads or tails an introduction to limit theorems in probability

    CERN Document Server

    Lesigne, Emmanuel

    2005-01-01

    Everyone knows some of the basics of probability, perhaps enough to play cards. Beyond the introductory ideas, there are many wonderful results that are unfamiliar to the layman, but which are well within our grasp to understand and appreciate. Some of the most remarkable results in probability are those that are related to limit theorems--statements about what happens when the trial is repeated many times. The most famous of these is the Law of Large Numbers, which mathematicians, engineers, economists, and many others use every day. In this book, Lesigne has made these limit theorems accessible by stating everything in terms of a game of tossing of a coin: heads or tails. In this way, the analysis becomes much clearer, helping establish the reader's intuition about probability. Moreover, very little generality is lost, as many situations can be modelled from combinations of coin tosses. This book is suitable for anyone who would like to learn more about mathematical probability and has had a one-year underg...

  11. In vitro activity of flomoxef against rapidly growing mycobacteria.

    Science.gov (United States)

    Tsai, Moan-Shane; Tang, Ya-Fen; Eng, Hock-Liew

    2008-06-01

    The aim of this study was to determine the in vitro sensitivity of rapidly growing mycobacteria (RGM) to flomoxef in respiratory secretions collected from 61 consecutive inpatients and outpatients at Chang Gung Memorial Hospital-Kaohsiung medical center between July and December, 2005. Minimal inhibitory concentrations (MIC) of flomoxef were determined by the broth dilution method for the 61 clinical isolates of RGM. The MIC of flomoxef at which 90% of clinical isolates were inhibited was >128 microg/mL in 26 isolates of Mycobacterium abscessus and 4 microg/mL in 31 isolates of M. fortuitum. Three out of 4 clinical M. peregrinum isolates were inhibited by flomoxef at concentrations of 4 microg/mL or less. Although the number of clinical isolates of RGM was small, these preliminary in vitro results demonstrate the potential activity of flomoxef in the management of infections due to M. fortuitum, and probably M. peregrinum, in humans.

  12. Generation of pseudo-random numbers

    Science.gov (United States)

    Howell, L. W.; Rheinfurth, M. H.

    1982-01-01

    Practical methods for generating acceptable random numbers from a variety of probability distributions which are frequently encountered in engineering applications are described. The speed, accuracy, and guarantee of statistical randomness of the various methods are discussed.
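
    Two of the standard methods in this family, inverse-transform sampling and the Box-Muller transform, can be sketched as follows; the rate and sample sizes are arbitrary.

```python
import math
import random

def exponential(rate, rng):
    """Inverse-transform sampling: if U ~ Uniform(0,1), then
    -ln(1-U)/rate follows an Exponential(rate) distribution."""
    return -math.log(1.0 - rng.random()) / rate

def standard_normal(rng):
    """Box-Muller transform: two independent uniforms yield a
    standard normal deviate."""
    u1, u2 = rng.random(), rng.random()
    return math.sqrt(-2.0 * math.log(u1)) * math.cos(2.0 * math.pi * u2)

rng = random.Random(0)
exp_samples = [exponential(2.0, rng) for _ in range(100_000)]
norm_samples = [standard_normal(rng) for _ in range(100_000)]
print(sum(exp_samples) / len(exp_samples))    # ≈ 1/rate = 0.5
print(sum(norm_samples) / len(norm_samples))  # ≈ 0.0
```

    The inverse-transform method works for any distribution with an invertible CDF, while transforms like Box-Muller cover cases, such as the normal, whose CDF has no closed-form inverse.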

  13. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    The purpose of this article is to give a new approach to calculating the probability of returning a loan. Many factors affect the value of this probability; in this article, several influencing factors are identified using statistical and econometric models. The main approach is concerned with applying probit and logit models in loan management institutions. A new aspect of credit risk analysis is given. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, sum granted) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable "the probability of returning a loan". It is shown that the month of signing a contract, the year of signing a contract, and the gender and age of the loan owner do not affect the probability of returning a loan. It is shown that the probability of returning a loan depends on the sum of the contract, the remoteness of the loan owner and the month of birth. The probability of returning a loan increases with the sum granted, decreases with the proximity of the customer, increases for people born at the beginning of the year and decreases for people born at the end of the year.
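
    A binary logit of this kind maps a linear combination of the predictors through the logistic function to a probability. The sketch below uses invented coefficients whose signs merely mirror the reported directions of effect; none of the numbers come from the article.

```python
import math

def p_return(given_sum, remoteness_km, birth_month,
             b0=-1.2, b_sum=0.0004, b_rem=0.004, b_month=-0.05):
    """Binary-logit repayment probability. Coefficients are purely
    illustrative; only their signs mirror the reported findings
    (probability rises with the sum granted and the borrower's
    remoteness, and falls from January- to December-born customers)."""
    z = b0 + b_sum * given_sum + b_rem * remoteness_km + b_month * birth_month
    return 1.0 / (1.0 + math.exp(-z))

# Two hypothetical borrowers differing only in birth month:
print(p_return(5000, 120, birth_month=1))
print(p_return(5000, 120, birth_month=12))  # lower, per the model's signs
```

    In practice the coefficients would be fitted by maximum likelihood from the bank's contract and borrower records rather than set by hand.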

  14. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of …

  15. A probability model for the failure of pressure containing parts

    International Nuclear Information System (INIS)

    Thomas, H.M.

    1978-01-01

    The model provides a method of estimating the order of magnitude of the leakage failure probability of pressure containing parts. It is a fatigue based model which makes use of the statistics available for both specimens and vessels. Some novel concepts are introduced but essentially the model simply quantifies the obvious i.e. that failure probability increases with increases in stress levels, number of cycles, volume of material and volume of weld metal. A further model based on fracture mechanics estimates the catastrophic fraction of leakage failures. (author)

  16. Strategy evolution driven by switching probabilities in structured multi-agent systems

    Science.gov (United States)

    Zhang, Jianlei; Chen, Zengqiang; Li, Zhiqi

    2017-10-01

    Evolutionary mechanism driving the commonly seen cooperation among unrelated individuals is puzzling. Related models for evolutionary games on graphs traditionally assume that players imitate their successful neighbours with higher benefits. Notably, an implicit assumption here is that players are always able to acquire the required pay-off information. To relax this restrictive assumption, a contact-based model has been proposed, where switching probabilities between strategies drive the strategy evolution. However, the explicit and quantified relation between a player's switching probability for her strategies and the number of her neighbours remains unknown. This is especially a key point in heterogeneously structured system, where players may differ in the numbers of their neighbours. Focusing on this, here we present an augmented model by introducing an attenuation coefficient and evaluate its influence on the evolution dynamics. Results show that the individual influence on others is negatively correlated with the contact numbers specified by the network topologies. Results further provide the conditions under which the coexisting strategies can be calculated analytically.

  17. A hierarchical probabilistic model for rapid object categorization in natural scenes.

    Directory of Open Access Journals (Sweden)

    Xiaofu He

    Humans can categorize objects in complex natural scenes within 100-150 ms. This amazing ability of rapid categorization has motivated many computational models. Most of these models require extensive training to obtain a decision boundary in a very high-dimensional feature space (e.g., ∼6,000 dimensions in a leading model) and often categorize objects in natural scenes by categorizing the context that co-occurs with objects when objects do not occupy large portions of the scenes. It is thus unclear how humans achieve rapid scene categorization. To address this issue, we developed a hierarchical probabilistic model for rapid object categorization in natural scenes. In this model, a natural object category is represented by a coarse hierarchical probability distribution (PD), which includes PDs of object geometry and spatial configuration of object parts. Object parts are encoded by PDs of a set of natural object structures, each of which is a concatenation of local object features. Rapid categorization is performed as statistical inference. Since the model uses a very small number (∼100) of structures for even complex object categories such as animals and cars, it requires little training and is robust in the presence of large variations within object categories and in their occurrences in natural scenes. Remarkably, we found that the model categorized animals in natural scenes and cars in street scenes with near human-level performance. We also found that the model located animals and cars in natural scenes, thus overcoming a flaw in many other models, which is to categorize objects in natural context by categorizing contextual features. These results suggest that coarse PDs of object categories based on natural object structures, and statistical operations on these PDs, may underlie the human ability to rapidly categorize scenes.

  18. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Full Text Available Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers' theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises' exceptionally restrictive definition of probability. This paper challenges that definition by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that both the nature of human action and the relative frequency method for calculating numerical probabilities presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  19. Arbuscular mycorrhizal propagules in soils from a tropical forest and an abandoned cornfield in Quintana Roo, Mexico: visual comparison of most-probable-number estimates.

    Science.gov (United States)

    Ramos-Zapata, José A; Guadarrama, Patricia; Navarro-Alberto, Jorge; Orellana, Roger

    2011-02-01

    The present study compared the number of arbuscular mycorrhizal fungi (AMF) propagules found in soil from a mature tropical forest with that found in an abandoned cornfield in Noh-Bec, Quintana Roo, Mexico, during three seasons. Agricultural practices can dramatically reduce the availability and viability of AMF propagules, and in this way delay the regeneration of tropical forests in abandoned agricultural areas. In addition, rainfall seasonality, which characterizes deciduous tropical forests, may strongly influence AMF propagule density. To compare AMF propagule numbers between sites and seasons (summer rainy, winter rainy and dry season), a "most probable number" (MPN) bioassay was conducted under greenhouse conditions employing Sorghum vulgare L. as host plant. Results showed an average value of 3.5 ± 0.41 propagules in 50 ml of soil for the mature forest, while the abandoned cornfield had 15.4 ± 5.03 propagules in 50 ml of soil. Likelihood analysis showed no statistical differences in MPN of propagules between seasons within each site, or between sites, except for the summer rainy season, in which soil from the abandoned cornfield had eight times as many propagules as soil from the mature forest site. Propagules of arbuscular mycorrhizal fungi remained viable throughout the sampling seasons at both sites. Abandoned areas resulting from traditional slash-and-burn agriculture involving maize did not show a lower number of AMF propagules, which should allow the establishment of mycotrophic plants and thus maintain the AMF inoculum potential in these soils.
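The MPN estimate used in bioassays like the one above is the maximum-likelihood solution of a Poisson dilution model: each tube inoculated with volume v is positive with probability 1 - exp(-λv). A minimal sketch (not the MPN spreadsheet cited in these records; the dilution series below is the classic 10/1/0.1 ml, 5-tube example, used purely for illustration):

```python
import math

def mpn_estimate(volumes, positives, tubes):
    """Maximum-likelihood MPN (organisms per ml) from a dilution series.

    volumes   -- ml of sample inoculated per tube at each dilution
    positives -- number of positive (growth) tubes at each dilution
    tubes     -- total number of tubes at each dilution
    """
    def score(lam):
        # derivative of the log-likelihood; zero at the MPN estimate
        s = 0.0
        for v, p, n in zip(volumes, positives, tubes):
            s += p * v * math.exp(-lam * v) / (1.0 - math.exp(-lam * v))
            s -= (n - p) * v
        return s

    lo, hi = 1e-9, 1e6
    for _ in range(200):              # bisection: the score is decreasing in lam
        mid = math.sqrt(lo * hi)      # geometric midpoint, since lam spans decades
        if score(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)

# Classic 5-tube series with 10, 1 and 0.1 ml portions, outcome 5-3-0:
# standard MPN tables give about 0.79 organisms per ml (79 per 100 ml).
est = mpn_estimate([10.0, 1.0, 0.1], [5, 3, 0], [5, 5, 5])
```

The log-likelihood is concave in λ, so bisection on its derivative is sufficient; published MPN tables add bias corrections and confidence intervals on top of this estimate.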

  20. LOGISTIC REGRESSION AS A TOOL FOR DETERMINATION OF THE PROBABILITY OF DEFAULT FOR ENTERPRISES

    Directory of Open Access Journals (Sweden)

    Erika SPUCHLAKOVA

    2017-12-01

    Full Text Available In a rapidly changing world it is necessary to adapt to new conditions, and approaches can vary from day to day. For the proper management of a company it is essential to know its financial situation. Assessment of a company's financial health can be carried out by financial analysis, which provides a number of methods for evaluating it. Analysis indicators are often included in company assessments, in obtaining bank loans, and in securing other financial resources to ensure the functioning of the company. As a company focuses on the future and its planning, it is essential to forecast its future financial situation. According to the results of the prediction of its financial health, the company decides on the extension or limitation of its business. It depends mainly on the capabilities of the company's management how they will use the information obtained from financial analysis in practice. The findings on logistic regression methods were first published in the 1960s as an alternative to the least squares method. The essence of logistic regression is to determine the relationship between the explained (dependent) variable and the explanatory (independent) variables. The basic principle of this statistical method is regression analysis, but unlike linear regression it can predict the probability that a given outcome has occurred or not. The aim of this paper is to determine the probability of bankruptcy of enterprises.
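The core of the method can be illustrated with a from-scratch logistic fit. The single feature (a debt-to-equity ratio) and the default labels below are invented for illustration and are not taken from the paper; the gradient-descent fit is one standard way to estimate the model, not necessarily the authors':

```python
import math

def fit_logistic(X, y, lr=0.5, epochs=2000):
    """Fit P(default=1 | x) = 1 / (1 + exp(-(b0 + b.x))) by gradient descent."""
    n_feat = len(X[0])
    w = [0.0] * (n_feat + 1)               # w[0] is the intercept
    for _ in range(epochs):
        grad = [0.0] * (n_feat + 1)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - yi                   # gradient of the log-loss
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / len(X) for wj, g in zip(w, grad)]
    return w

def predict_proba(w, x):
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], x))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical data: feature = debt-to-equity ratio; label 1 = defaulted
X = [[0.2], [0.5], [0.9], [1.4], [1.8], [2.5], [3.0], [3.6]]
y = [0, 0, 0, 0, 1, 1, 1, 1]
w = fit_logistic(X, y)
```

Unlike linear regression, the fitted model outputs a probability in (0, 1), which is exactly what a probability-of-default score requires.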

  1. The exact probability distribution of the rank product statistics for replicated experiments.

    Science.gov (United States)

    Eisinga, Rob; Breitling, Rainer; Heskes, Tom

    2013-03-18

    The rank product method is a widely accepted technique for detecting differentially regulated genes in replicated microarray experiments. To approximate the sampling distribution of the rank product statistic, the original publication proposed a permutation approach, whereas recently an alternative approximation based on the continuous gamma distribution was suggested. However, both approximations are imperfect for estimating small tail probabilities. In this paper we relate the rank product statistic to number theory and provide a derivation of its exact probability distribution and the true tail probabilities. Copyright © 2013 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
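The paper's number-theoretic derivation is not reproduced here, but for tiny cases the exact null distribution of the rank product (each gene's rank independent and uniform on 1..n in every replicate) can be brute-forced, which is how exact tail probabilities can be sanity-checked; the n = 5, k = 3 example is illustrative only:

```python
from itertools import product

def rank_product_pmf(n, k):
    """Exact null distribution of the rank product over k replicates,
    each rank independent and uniform on 1..n (brute force; small n, k only)."""
    counts = {}
    for ranks in product(range(1, n + 1), repeat=k):
        rp = 1
        for r in ranks:
            rp *= r
        counts[rp] = counts.get(rp, 0) + 1
    total = n ** k
    return {rp: c / total for rp, c in sorted(counts.items())}

pmf = rank_product_pmf(5, 3)                       # 5 genes, 3 replicates
tail = sum(p for rp, p in pmf.items() if rp <= 4)  # exact small-tail probability
```

The brute force costs n**k evaluations, which is exactly why a closed-form exact distribution matters for realistic n and k.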

  2. Modulation of cognitive control levels via manipulation of saccade trial-type probability assessed with event-related BOLD fMRI.

    Science.gov (United States)

    Pierce, Jordan E; McDowell, Jennifer E

    2016-02-01

    Cognitive control supports flexible behavior adapted to meet current goals and can be modeled through investigation of saccade tasks with varying cognitive demands. Basic prosaccades (rapid glances toward a newly appearing stimulus) are supported by neural circuitry, including occipital and posterior parietal cortex, frontal and supplementary eye fields, and basal ganglia. These trials can be contrasted with complex antisaccades (glances toward the mirror image location of a stimulus), which are characterized by greater functional magnetic resonance imaging (fMRI) blood oxygenation level-dependent (BOLD) signal in the aforementioned regions and recruitment of additional regions such as dorsolateral prefrontal cortex. The current study manipulated the cognitive demands of these saccade tasks by presenting three rapid event-related runs of mixed saccades with a varying probability of antisaccade vs. prosaccade trials (25, 50, or 75%). Behavioral results showed an effect of trial-type probability on reaction time, with slower responses in runs with a high antisaccade probability. Imaging results exhibited an effect of probability in bilateral pre- and postcentral gyrus, bilateral superior temporal gyrus, and medial frontal gyrus. Additionally, the interaction between saccade trial type and probability revealed a strong probability effect for prosaccade trials, showing a linear increase in activation parallel to antisaccade probability in bilateral temporal/occipital, posterior parietal, medial frontal, and lateral prefrontal cortex. In contrast, antisaccade trials showed elevated activation across all runs. Overall, this study demonstrated that improbable performance of a typically simple prosaccade task led to augmented BOLD signal to support changing cognitive control demands, resulting in activation levels similar to the more complex antisaccade task. Copyright © 2016 the American Physiological Society.

  3. Pólya number and first return of bursty random walk: Rigorous solutions

    Science.gov (United States)

    Wan, J.; Xu, X. P.

    2012-03-01

    The recurrence properties of random walks can be characterized by the Pólya number, i.e., the probability that the walker has returned to the origin at least once. In this paper, we investigate the Pólya number and first return for a bursty random walk on a line, in which the walk has different step sizes and moving probabilities. Using the concept of the Catalan number, we obtain exact results for the first return probability, the average first return time and the Pólya number for the first time. We show that the Pólya number displays two different functional behaviors when the walk deviates from the recurrent point. By utilizing the Lagrange inversion formula, we interpret our findings by expressing the Pólya number through the closed-form solutions of an inverse function. We also calculate the Pólya number using another approach, which corroborates our results and conclusions. Finally, we consider the recurrence properties and the Pólya number of two variations of the bursty random walk model.
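The Catalan-number route to first-return probabilities is easiest to see in the classic special case of the symmetric simple walk (not the bursty model of the paper): first-return paths of length 2n are counted by 2·C(n-1), where C(n) is the n-th Catalan number, so the first-return probability at step 2n is 2·C(n-1)/4**n. A sketch:

```python
from math import comb

def catalan(n):
    """n-th Catalan number C(n) = binom(2n, n) / (n + 1)."""
    return comb(2 * n, n) // (n + 1)

def first_return_prob(n):
    """P(first return to the origin occurs at step 2n) for the symmetric
    simple random walk; first-return paths are counted by 2 * catalan(n-1)."""
    return 2 * catalan(n - 1) / 4 ** n

# The symmetric walk is recurrent: these probabilities sum to 1,
# but slowly -- the tail decays like n**(-3/2).
partial = sum(first_return_prob(n) for n in range(1, 2001))
```

The slow, heavy-tailed convergence of the partial sums is the same phenomenon that makes the average first return time infinite for the symmetric walk.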

  4. CGC/saturation approach for soft interactions at high energy: survival probability of central exclusive production

    Energy Technology Data Exchange (ETDEWEB)

    Gotsman, E.; Maor, U. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel); Levin, E. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel); Universidad Tecnica Federico Santa Maria, Departemento de Fisica, Centro Cientifico-Tecnologico de Valparaiso, Valparaiso (Chile)

    2016-04-15

    We estimate the value of the survival probability for central exclusive production in a model which is based on the CGC/saturation approach. Hard and soft processes are described in the same framework. At LHC energies, we obtain a small value for the survival probability. The source of the small value is the impact parameter dependence of the hard amplitude. Our model has successfully described a large body of soft data: elastic, inelastic and diffractive cross sections, inclusive production and rapidity correlations, as well as the t-dependence of deep inelastic diffractive production of vector mesons. (orig.)

  5. Recurrence and Polya Number of General One-Dimensional Random Walks

    International Nuclear Information System (INIS)

    Zhang Xiaokun; Wan Jing; Lu Jingju; Xu Xinping

    2011-01-01

    The recurrence properties of random walks can be characterized by the Polya number, i.e., the probability that the walker has returned to the origin at least once. In this paper, we consider recurrence properties for a general 1D random walk on a line, in which at each time step the walker can move to the left or right with probabilities l and r, or remain at the same position with probability o (l + r + o = 1). We calculate the Polya number P of this model and find a simple expression, P = 1 - Δ, where Δ is the absolute difference of l and r (Δ = |l - r|). We prove this rigorous expression by the method of creative telescoping, and our result suggests that the walk is recurrent if and only if the left-moving probability l equals the right-moving probability r. (general)
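The closed form P = 1 - |l - r| can be checked numerically by evolving the position distribution and absorbing all mass that returns to the origin. This finite-horizon sketch is an independent check, not the paper's creative-telescoping proof; convergence is slow in the recurrent case l = r, so the truncated estimate only approaches P from below:

```python
def polya_number_estimate(l, r, steps=1000):
    """Probability of returning to the origin within `steps` steps for a
    1D walk that moves left w.p. l, right w.p. r, stays w.p. o = 1 - l - r.
    As steps grows this converges to P = 1 - |l - r| (slowly when l == r)."""
    o = 1.0 - l - r
    dist = {0: 1.0}        # mass at each position, origin not yet revisited
    returned = 0.0
    for _ in range(steps):
        new = {}
        for x, p in dist.items():
            for dx, w in ((-1, l), (1, r), (0, o)):
                if w == 0.0:
                    continue
                nx = x + dx
                if nx == 0:
                    returned += p * w      # absorb mass on its first return
                else:
                    new[nx] = new.get(nx, 0.0) + p * w
        dist = new
    return returned

p_biased = polya_number_estimate(0.6, 0.4)  # formula predicts 1 - 0.2 = 0.8
p_fair = polya_number_estimate(0.5, 0.5)    # formula predicts 1; slow convergence
```

For l ≠ r the first-return probabilities decay geometrically, so the truncated sum is already accurate; for l = r the heavy n**(-3/2) tail leaves a visible gap at any finite horizon.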

  6. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  7. A staggered conservative scheme for every Froude number in rapidly varied shallow water flows

    Science.gov (United States)

    Stelling, G. S.; Duinmeijer, S. P. A.

    2003-12-01

    This paper proposes a numerical technique that in essence is based upon the classical staggered grids and implicit numerical integration schemes, but that can be applied to problems that include rapidly varied flows as well. Rapidly varied flows occur, for instance, in hydraulic jumps and bores. Inundation of dry land implies sudden flow transitions due to obstacles such as road banks. Near such transitions the grid resolution is often low compared to the gradients of the bathymetry. In combination with the local invalidity of the hydrostatic pressure assumption, conservation properties become crucial. The scheme described here combines the efficiency of staggered grids with conservation properties so as to ensure accurate results for rapidly varied flows, in expansions as well as in contractions. In flow expansions, a numerical approximation is applied that is consistent with the momentum principle. In flow contractions, a numerical approximation is applied that is consistent with the Bernoulli equation. Both approximations are consistent with the shallow water equations, so under sufficiently smooth conditions they converge to the same solution. The resulting method is very efficient for the simulation of large-scale inundations.

  8. Finding upper bounds for software failure probabilities - experiments and results

    International Nuclear Information System (INIS)

    Kristiansen, Monica; Winther, Rune

    2005-09-01

    This report looks into some aspects of using Bayesian hypothesis testing to find upper bounds for software failure probabilities. In the first part, the report evaluates the Bayesian hypothesis testing approach for finding upper bounds for failure probabilities of single software components. The report shows how different choices of prior probability distribution for a software component's failure probability influence the number of tests required to obtain adequate confidence in the component. In the evaluation, both the effect of the shape of the prior distribution and the effect of one's prior confidence in the software component were investigated. In addition, different choices of prior probability distributions are discussed based on their relevance in a software context. In the second part, ideas are given on how the Bayesian hypothesis testing approach can be extended to assess systems consisting of multiple software components. One of the main challenges when assessing such systems is to include dependency aspects in the software reliability models. However, different types of failure dependencies between software components must be modelled differently. Identifying the different types of failure dependencies is therefore an important condition for choosing a prior probability distribution that correctly reflects one's prior belief in the probability of software components failing dependently. In this report, software components include both general in-house software components and pre-developed software components (e.g. COTS, SOUP, etc). (Author)

  9. Stochastics introduction to probability and statistics

    CERN Document Server

    Georgii, Hans-Otto

    2012-01-01

    This second revised and extended edition presents the fundamental ideas and results of both probability theory and statistics, and comprises the material of a one-year course. It is addressed to students with an interest in the mathematical side of stochastics. Stochastic concepts, models and methods are motivated by examples and developed and analysed systematically. Some measure theory is included, but at an elementary level that is in accordance with the introductory character of the book. A large number of problems offer applications and supplements to the text.

  10. An analytical evaluation for spatial-dependent intra-pebble Dancoff factor and escape probability

    International Nuclear Information System (INIS)

    Kim, Songhyun; Kim, Hong-Chul; Kim, Jong Kyung; Kim, Soon Young; Noh, Jae Man

    2009-01-01

    The analytical evaluation of spatial-dependent intra-pebble Dancoff factors and their escape probabilities is pursued with the model developed in this study. Intra-pebble Dancoff factors and their escape probabilities are calculated as a function of fuel kernel radius, number of fuel kernels, and fuel region radius. The method in this study can easily be utilized to analyze the tendency of the spatial-dependent intra-pebble Dancoff factor and the spatial-dependent fuel region escape probability for various geometries, because it is faster than the MCNP method while retaining good accuracy. (author)

  11. Unbiased multi-fidelity estimate of failure probability of a free plane jet

    Science.gov (United States)

    Marques, Alexandre; Kramer, Boris; Willcox, Karen; Peherstorfer, Benjamin

    2017-11-01

    Estimating failure probability related to fluid flows is a challenge because it requires a large number of evaluations of expensive models. We address this challenge by leveraging multiple low fidelity models of the flow dynamics to create an optimal unbiased estimator. In particular, we investigate the effects of uncertain inlet conditions in the width of a free plane jet. We classify a condition as failure when the corresponding jet width is below a small threshold, such that failure is a rare event (failure probability is smaller than 0.001). We estimate failure probability by combining the frameworks of multi-fidelity importance sampling and optimal fusion of estimators. Multi-fidelity importance sampling uses a low fidelity model to explore the parameter space and create a biasing distribution. An unbiased estimate is then computed with a relatively small number of evaluations of the high fidelity model. In the presence of multiple low fidelity models, this framework offers multiple competing estimators. Optimal fusion combines all competing estimators into a single estimator with minimal variance. We show that this combined framework can significantly reduce the cost of estimating failure probabilities, and thus can have a large impact in fluid flow applications. This work was funded by DARPA.

  12. Mental status and suicide probability of young people: A cross-sectional study

    Directory of Open Access Journals (Sweden)

    Selen Ozakar Akca

    Full Text Available Summary Objective: The most important determinant of suicide ideation, tendency and initiative is the presence of mental disorders. Since the number of those who lost their lives to suicide worldwide has risen rapidly among the young population, the World Health Organization emphasizes the importance of assessing young people in the high-risk age group to prevent suicidal behavior. This study aimed to determine psychological symptom levels and suicide probability in young people. Method: The cross-sectional study consisted of 15-24-year-old individuals (N=348) who attended a psychiatric clinic between February and June 2015. The research data were collected using a data collection form, the Suicide Probability Scale (SPS) and the Brief Symptom Inventory (BSI). The SPSS 22.0 statistical package program was used for data analysis. Results: There was a statistically significant difference (p<0.05) between the mean SPS scores according to the education, psychiatric treatment, self-harm, smoking and drinking status of the participants. In addition, there was a statistically significant correlation between SPS scores and the BSI subscales anxiety, depression, negative self and hostility (p<0.001, r=0.739; p<0.001, r=0.729; p<0.001, r=0.747; p<0.001, r=0.715, respectively). Conclusion: The results of our study show that suicide risk is significantly higher in young people with depression, anxiety, negative self-perception and hostility symptoms. In this regard, we suggest assessing the suicide risk of young people attending a psychiatric clinic, with thorough attention to those who have a high potential for suicide.

  13. Survival and compound nucleus probability of super heavy element Z = 117

    Energy Technology Data Exchange (ETDEWEB)

    Manjunatha, H.C. [Government College for Women, Department of Physics, Kolar, Karnataka (India); Sridhar, K.N. [Government First grade College, Department of Physics, Kolar, Karnataka (India)

    2017-05-15

    As a part of a systematic study for predicting the most suitable projectile-target combinations for heavy-ion fusion experiments in the synthesis of {sup 289-297}Ts, we have calculated the transmission probability (T{sub l}), compound nucleus formation probability (P{sub CN}) and survival probability (P{sub sur}) of possible projectile-target combinations. We have also studied the fusion cross section, survival cross section and fission cross sections for different projectile-target combinations of {sup 289-297}Ts. These theoretical parameters are required before the synthesis of the super heavy element. The calculated probabilities and cross sections show that the production of isotopes of the super heavy element with Z = 117 is strongly dependent on the reaction systems. The most probable reactions to synthesize the super heavy nuclei {sup 289-297}Ts are worked out and listed explicitly. We have also studied the variation of P{sub CN} and P{sub sur} with the mass number of the projectile and target nuclei. This work is useful in the synthesis of the super heavy element Z = 117. (orig.)

  14. Knotting probability of self-avoiding polygons under a topological constraint

    Science.gov (United States)

    Uehara, Erica; Deguchi, Tetsuo

    2017-09-01

    We define the knotting probability of a knot K by the probability for a random polygon or self-avoiding polygon (SAP) of N segments having the knot type K. We show fundamental and generic properties of the knotting probability particularly its dependence on the excluded volume. We investigate them for the SAP consisting of hard cylindrical segments of unit length and radius rex. For various prime and composite knots, we numerically show that a compact formula describes the knotting probabilities for the cylindrical SAP as a function of segment number N and radius rex. It connects the small-N to the large-N behavior and even to lattice knots in the case of large values of radius. As the excluded volume increases, the maximum of the knotting probability decreases for prime knots except for the trefoil knot. If it is large, the trefoil knot and its descendants are dominant among the nontrivial knots in the SAP. From the factorization property of the knotting probability, we derive a sum rule among the estimates of a fitting parameter for all prime knots, which suggests the local knot picture and the dominance of the trefoil knot in the case of large excluded volumes. Here we remark that the cylindrical SAP gives a model of circular DNA which is negatively charged and semiflexible, where radius rex corresponds to the screening length.

  16. Conceptual and Statistical Issues Regarding the Probability of Default and Modeling Default Risk

    Directory of Open Access Journals (Sweden)

    Emilia TITAN

    2011-03-01

    Full Text Available In today's rapidly evolving financial markets, risk management offers different techniques for implementing an efficient system against market risk. Probability of default (PD) is an essential part of business intelligence and customer relationship management systems in financial institutions. Recent studies indicate that underestimating this important component, and also the loss given default (LGD), might threaten the stability and smooth running of the financial markets. From the perspective of risk management, the predictive accuracy of the estimated probability of default is more valuable than the standard binary classification into credible or non-credible clients. The Basel II Accord recognizes the methods of reducing credit risk, and also PD and LGD as important components of the advanced Internal Ratings-Based (IRB) approach.

  17. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  18. Evolvement simulation of the probability of neutron-initiating persistent fission chain

    International Nuclear Information System (INIS)

    Wang Zhe; Hong Zhenying

    2014-01-01

    Background: The probability of a neutron initiating a persistent fission chain, which has to be calculated in criticality safety analysis, reactor start-up, and burst waiting time and bursting time on pulsed reactors, is an inherent parameter of a multiplying assembly. Purpose: We aim to derive a time-dependent integro-differential equation for this probability in relative velocity space according to probability conservation, and to develop the deterministic code Dynamic Segment Number Probability (DSNP) based on the multi-group S{sub N} method. Methods: The reliable convergence of the dynamic calculation was analyzed, and numerical simulation of the evolution of the dynamic probability for varying concentration was performed under different initial conditions. Results: On Highly Enriched Uranium (HEU) bare spheres, when the time is long enough, the results of the dynamic calculation approach those of the static calculation. The largest difference in these results between the DSNP and Partisn codes is less than 2%. On the Baker model, over the range of about 1 μs after the first criticality, the largest difference between the dynamic and static calculations is about 300%. As for a supercritical system, the finite fission chains decrease and the persistent fission chains increase as the reactivity increases; the dynamic evolution curve of the initiation probability is close to the static curve, within a difference of 5%, when K{sub eff} is more than 1.2. The cumulative probability curve also indicates that the difference in integral results between the dynamic calculation and the static calculation decreases from 35% to 5% as K{sub eff} increases. This demonstrates that the ability to initiate a self-sustaining fission chain reaction approaches stabilization, while the former difference (35%) shows the important deviation of the dynamic results near the first criticality from the static ones. The DSNP code agrees well with the Partisn code. Conclusions: There are large numbers of

  19. Feynman quasi probability distribution for spin-(1/2), and its generalizations

    International Nuclear Information System (INIS)

    Colucci, M.

    1999-01-01

    Feynman's paper Negative probability is examined, in which, after a discussion of the possibility of attributing a real physical meaning to quasi-probability distributions, he introduces a new kind of distribution for spin-(1/2), with a possible method of generalization to systems with an arbitrary number of states. The principal aim of this article is to shed light upon the method of construction of these distributions, taking into consideration their application to some experiments and discussing their positive and negative aspects

  20. Fermi-Dirac statistics and the number theory

    OpenAIRE

    Kubasiak, A.; Korbicz, J.; Zakrzewski, J.; Lewenstein, M.

    2005-01-01

    We relate the Fermi-Dirac statistics of an ideal Fermi gas in a harmonic trap to partitions of given integers into distinct parts, studied in number theory. Using methods of quantum statistical physics we derive analytic expressions for cumulants of the probability distribution of the number of different partitions.
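The partition connection can be made concrete: the number of partitions of n into distinct parts (the microstate count for n excitation quanta shared by spinless fermions in a 1D harmonic trap, up to the ground-state shift) is computable by a small dynamic program. This is a standard counting sketch, not the analytic cumulant expressions derived in the paper:

```python
def distinct_partitions(n):
    """Number of partitions of n into distinct positive parts.

    0/1-knapsack counting: each candidate part may be used at most once,
    which the descending inner loop enforces.
    """
    dp = [0] * (n + 1)
    dp[0] = 1                     # the empty partition of 0
    for part in range(1, n + 1):
        for m in range(n, part - 1, -1):
            dp[m] += dp[m - part]
    return dp[n]

# q(n) for n = 1..10; matches the classical sequence 1,1,2,2,3,4,5,6,8,10
counts = [distinct_partitions(k) for k in range(1, 11)]
```

The distinctness constraint is exactly the Pauli exclusion principle in this mapping; dropping the descending-loop restriction would instead count bosonic (unrestricted) partitions.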

  1. Maximum parsimony, substitution model, and probability phylogenetic trees.

    Science.gov (United States)

    Weng, J F; Thomas, D A; Mareels, I

    2011-01-01

    The problem of inferring phylogenies (phylogenetic trees) is one of the main problems in computational biology. There are three main methods for inferring phylogenies: Maximum Parsimony (MP), Distance Matrix (DM) and Maximum Likelihood (ML), of which the MP method is the most well-studied and popular. In the MP method the optimization criterion is the number of substitutions of nucleotides, computed from the differences in the investigated nucleotide sequences. However, the MP method is often criticized because it only counts the substitutions observable at the current time, while all the unobservable substitutions that actually occurred in the evolutionary history are omitted. In order to take the unobservable substitutions into account, substitution models have been established and are now widely used in the DM and ML methods, but these substitution models cannot be used within the classical MP method. Recently the authors proposed a probability representation model for phylogenetic trees; the trees reconstructed in this model are called probability phylogenetic trees. One of the advantages of the probability representation model is that it can include a substitution model to infer phylogenetic trees based on the MP principle. In this paper we explain how to use a substitution model in the reconstruction of probability phylogenetic trees and show the advantage of this approach with examples.

  2. Linear and nonlinear optical signals in probability and phase-space representations

    International Nuclear Information System (INIS)

    Man'ko, Margarita A

    2006-01-01

A review of different representations of signals, including phase-space and tomographic representations, is presented. The signals under consideration are either linear or nonlinear. The linear signals satisfy linear quantum-like Schroedinger and von Neumann equations. Nonlinear signals satisfy nonlinear Schroedinger equations as well as the Gross-Pitaevskii equation describing solitons in a Bose-Einstein condensate. The Wigner-Ville distributions for solitons are considered in comparison with the tomographic-probability densities, which describe solitons completely. Different kinds of tomography (symplectic, optical and Fresnel) are reviewed. A new kind of map of signals onto probability distributions of a discrete photon-number-like variable is discussed. Mutual relations between different transformations of signal functions are established in explicit form. Characteristics of the signal-probability distribution, such as its entropy, are discussed.

  3. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to postgraduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes censored-data martingales. The text includes measure theoretic...

  4. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

The theory of probability has been debated for centuries: back in 1600, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution," which is a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied in statistical analysis.
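A minimal sketch of a probability distribution for a categorical variable (a fair die; the numbers are illustrative, not from the paper):

```python
# Probability distribution of a categorical variable: a fair six-sided die.
die = {face: 1 / 6 for face in range(1, 7)}

total = sum(die.values())              # probabilities must sum to 1
p_at_most_2 = die[1] + die[2]          # P(X <= 2) = 1/3
```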

  5. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  6. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  7. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  8. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  9. Efficient Simulation of the Outage Probability of Multihop Systems

    KAUST Repository

    Ben Issaid, Chaouki; Alouini, Mohamed-Slim; Tempone, Raul

    2017-01-01

In this paper, we present an efficient importance sampling estimator for the evaluation of the outage probability of multihop systems with channel-state-information-assisted amplify-and-forward relaying. The proposed estimator is endowed with the bounded relative error property. Simulation results show a significant reduction in the number of simulation runs compared to naive Monte Carlo.
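The following is a generic sketch of the importance-sampling idea for rare outage events, not the paper's estimator: the Exp(1) channel model, the threshold and the uniform proposal are illustrative assumptions. Sampling directly inside the rare region and reweighting gives an accurate estimate where naive Monte Carlo sees almost no outages.

```python
import math
import random

random.seed(1)
eps = 1e-6                          # hypothetical outage threshold
true_p = -math.expm1(-eps)          # exact P(X < eps) for X ~ Exp(1)

# Naive Monte Carlo: with p ~ 1e-6, 10,000 draws almost never see an outage.
n = 10_000
naive = sum(random.expovariate(1.0) < eps for _ in range(n)) / n

# Importance sampling: draw directly inside the rare region (0, eps) from a
# uniform proposal g(y) = 1/eps, and reweight by f(y)/g(y) = exp(-y) * eps.
is_est = sum(math.exp(-random.uniform(0.0, eps)) * eps for _ in range(n)) / n
```

Because every proposal draw lands in the outage region, the weighted estimator has tiny variance here, while the naive estimator is usually exactly zero.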

  10. Efficient Simulation of the Outage Probability of Multihop Systems

    KAUST Repository

    Ben Issaid, Chaouki

    2017-10-23

In this paper, we present an efficient importance sampling estimator for the evaluation of the outage probability of multihop systems with channel-state-information-assisted amplify-and-forward relaying. The proposed estimator is endowed with the bounded relative error property. Simulation results show a significant reduction in the number of simulation runs compared to naive Monte Carlo.

  11. A Short History of Probability Theory and Its Applications

    Science.gov (United States)

    Debnath, Lokenath; Basu, Kanadpriya

    2015-01-01

    This paper deals with a brief history of probability theory and its applications to Jacob Bernoulli's famous law of large numbers and theory of errors in observations or measurements. Included are the major contributions of Jacob Bernoulli and Laplace. It is written to pay the tricentennial tribute to Jacob Bernoulli, since the year 2013…

  12. On the probability of cure for heavy-ion radiotherapy

    International Nuclear Information System (INIS)

    Hanin, Leonid; Zaider, Marco

    2014-01-01

The probability of a cure in radiation therapy (RT)—viewed as the probability of eventual extinction of all cancer cells—is unobservable, and the only way to compute it is through modeling the dynamics of the cancer cell population during and post-treatment. The conundrum at the heart of biophysical models aimed at such prospective calculations is the absence of information on the initial size of the subpopulation of clonogenic cancer cells (also called stem-like cancer cells), which largely determines the outcome of RT in both individual and population settings. Other relevant parameters (e.g. potential doubling time, cell loss factor and survival probability as a function of dose) are, at least in principle, amenable to empirical determination. In this article we demonstrate that, for heavy-ion RT, microdosimetric considerations (justifiably ignored in conventional RT) combined with an expression for the clone extinction probability obtained from a mechanistic model of radiation cell survival lead to useful upper bounds on the size of the pre-treatment population of clonogenic cancer cells as well as upper and lower bounds on the cure probability. The main practical impact of these limiting values is the ability to make predictions about the probability of a cure for a given population of patients treated with newer, still unexplored treatment modalities from the empirically determined probability of a cure for the same or a similar population resulting from conventional low linear energy transfer (typically photon/electron) RT. We also propose that the current trend to deliver a lower total dose in a smaller number of fractions with larger-than-conventional doses per fraction has physical limits that must be understood before embarking on a particular treatment schedule. (paper)

  13. Numeracy moderates the influence of task-irrelevant affect on probability weighting.

    Science.gov (United States)

    Traczyk, Jakub; Fulawka, Kamil

    2016-06-01

Statistical numeracy, defined as the ability to understand and process statistical and probability information, plays a significant role in superior decision making. However, recent research has demonstrated that statistical numeracy goes beyond simple comprehension of numbers and mathematical operations. In contrast to previous studies, which focused on emotions integral to risky prospects, we hypothesized that highly numerate individuals would exhibit more linear probability weighting because they would be less biased by incidental and decision-irrelevant affect. Participants were instructed to make a series of insurance decisions preceded by negative (i.e., fear-inducing) or neutral stimuli. We found that incidental negative affect increased the curvature of the probability weighting function (PWF). Interestingly, this effect was significant only for less numerate individuals, while probability weighting in more numerate people was not altered by decision-irrelevant affect. We propose two candidate mechanisms for the observed effect. Copyright © 2016 Elsevier B.V. All rights reserved.
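For context (an illustration, not the study's fitted model), probability weighting functions such as the one-parameter Tversky-Kahneman form make "curvature" concrete: gamma = 1 gives linear weighting, while gamma < 1 overweights small probabilities and underweights large ones.

```python
def pwf(p, gamma):
    """One-parameter Tversky-Kahneman probability weighting function."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)
```

For instance, with gamma = 0.6 a 1% probability is weighted as if it were several percent, while a 90% probability is weighted well below 0.9.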

  14. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  15. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

It is anticipated that changes in the frequency of surveillance tests, preventive maintenance or parts replacement of safety-related components may change component failure probabilities and, as a result, the core damage probability, and that the change differs depending on the initiating event frequency and the component type. This study assessed the change in core damage probability using a simplified PSA model capable of calculating core damage probability in a short time, developed by the US NRC to process accident sequence precursors, when the failure probabilities of various components were varied between 0 and 1 and when Japanese or American initiating event frequency data were used. The analysis showed the following. (1) The frequency of surveillance tests, preventive maintenance or parts replacement of motor-driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability increases. (2) Core damage probability is insensitive to changes in surveillance test frequency for motor-operated valves and the turbine-driven auxiliary feedwater pump, since it changes little when their failure probabilities vary by an order of magnitude. (3) When Japanese failure probability data are applied to the emergency diesel generator, the change in core damage probability is small even if the failure probability changes by an order of magnitude from the base value; with American failure probability data, on the other hand, the increase in core damage probability is large when the failure probability increases. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency, etc. (author)
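How a change in one component's failure probability propagates to core damage frequency can be sketched with a toy two-train model. The structure and all numbers here are hypothetical illustrations, far simpler than the NRC's precursor model referenced above.

```python
# Toy PSA model (hypothetical numbers and structure):
# core damage = initiating event AND failure of both redundant trains;
# a train fails if its pump OR its valve fails.
def core_damage_freq(ie_freq, p_pump, p_valve):
    p_train = p_pump + p_valve - p_pump * p_valve
    return ie_freq * p_train ** 2          # two independent trains

base = core_damage_freq(1e-2, 1e-3, 1e-4)
worse = core_damage_freq(1e-2, 1e-2, 1e-4)   # pump failure probability x10
```

Because the pump dominates the train failure probability and appears in both trains, a tenfold increase in its failure probability raises the toy core damage frequency by nearly two orders of magnitude, illustrating why pump maintenance frequency must be changed carefully.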

  16. Some relations between entropy and approximation numbers

    Institute of Scientific and Technical Information of China (English)

    郑志明

    1999-01-01

A general result is obtained which relates the entropy numbers of compact maps on Hilbert space to their approximation numbers. Compared with previous works in this area, it is particularly convenient for dealing with cases where the approximation numbers decay rapidly. A useful estimate between the entropy and approximation numbers of noncompact maps is also given.

  17. A comparison of probability of ruin and expected discounted utility ...

    African Journals Online (AJOL)

    Individuals in defined-contribution retirement funds currently have a number of options as to how to finance their post-retirement spending. The paper considers the ranking of selected annuitisation strategies by the probability of ruin and by expected discounted utility under different scenarios. 'Ruin' is defined as occurring ...

  18. Neutron transport by collision probability method in complicated geometries

    International Nuclear Information System (INIS)

    Constantin, Marin

    2000-01-01

For the first-flight collision probability (FFCP) method, memory requirements and execution time increase rapidly with the number of discrete regions, so the use of the method is generally restricted to the cell/supercell level. However, the remarkable developments in both computer hardware and computer architecture allow a real extension of the problem domain and a more detailed treatment of the geometry. Two approaches are discussed in the paper: the direct design of new codes and the improvement of old mainframe versions. The author's experience is focused on improving the performance of the 3D integral transport code PIJXYZ (from an old version to a modern one) and on the design and development of the 2D transport code CP 2 D in recent years. In the first case an optimization process was performed before parallelization. In the second, a modular design and the newest techniques (factorization of the geometry, the macrobands method, the mobile set of chords, automatic calculation of the integration error, optimal algorithms for the innermost programming level, the mixed method for tracking and collision probability calculation, etc.) were adopted. In both cases the parallelization uses a PC network system. Some short examples of CP 2 D and PIJXYZ calculations are presented: the reactivity void effect in typical CANDU cells using a multistratified coolant model, a problem of adjacent fuel assemblies, and 3D simulation of CANDU reactivity devices. (author)

  19. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  20. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...

  1. Rapidity gap survival in the black-disk regime

    International Nuclear Information System (INIS)

    Leonid Frankfurt; Charles Hyde; Mark Strikman; Christian Weiss

    2007-01-01

    We summarize how the approach to the black-disk regime (BDR) of strong interactions at TeV energies influences rapidity gap survival in exclusive hard diffraction pp -> p + H + p (H = dijet, Qbar Q, Higgs). Employing a recently developed partonic description of such processes, we discuss (a) the suppression of diffraction at small impact parameters by soft spectator interactions in the BDR; (b) further suppression by inelastic interactions of hard spectator partons in the BDR; (c) effects of correlations between hard and soft interactions, as suggested by various models of proton structure (color fluctuations, spatial correlations of partons). Hard spectator interactions in the BDR substantially reduce the rapidity gap survival probability at LHC energies compared to previously reported estimates

  2. Predictive probability methods for interim monitoring in clinical trials with longitudinal outcomes.

    Science.gov (United States)

    Zhou, Ming; Tang, Qi; Lang, Lixin; Xing, Jun; Tatsuoka, Kay

    2018-04-17

    In clinical research and development, interim monitoring is critical for better decision-making and minimizing the risk of exposing patients to possible ineffective therapies. For interim futility or efficacy monitoring, predictive probability methods are widely adopted in practice. Those methods have been well studied for univariate variables. However, for longitudinal studies, predictive probability methods using univariate information from only completers may not be most efficient, and data from on-going subjects can be utilized to improve efficiency. On the other hand, leveraging information from on-going subjects could allow an interim analysis to be potentially conducted once a sufficient number of subjects reach an earlier time point. For longitudinal outcomes, we derive closed-form formulas for predictive probabilities, including Bayesian predictive probability, predictive power, and conditional power and also give closed-form solutions for predictive probability of success in a future trial and the predictive probability of success of the best dose. When predictive probabilities are used for interim monitoring, we study their distributions and discuss their analytical cutoff values or stopping boundaries that have desired operating characteristics. We show that predictive probabilities utilizing all longitudinal information are more efficient for interim monitoring than that using information from completers only. To illustrate their practical application for longitudinal data, we analyze 2 real data examples from clinical trials. Copyright © 2018 John Wiley & Sons, Ltd.
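For the simpler univariate binary-outcome case, the Bayesian predictive probability at an interim look can be computed from the beta-binomial distribution. This is a sketch under a conjugate Beta prior, not the longitudinal closed forms derived in the paper; the success threshold and counts below are illustrative.

```python
from math import comb, exp, lgamma

def log_beta(a, b):
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def beta_binom_pmf(x, m, a, b):
    """P(x successes among m future patients) under a Beta(a, b) posterior."""
    return comb(m, x) * exp(log_beta(a + x, b + m - x) - log_beta(a, b))

def predictive_prob(s, n, m, r_total, a=1.0, b=1.0):
    """Predictive probability that the final success count reaches r_total,
    given s successes in n observed patients and m patients still to come."""
    a_post, b_post = a + s, b + n - s
    return sum(beta_binom_pmf(x, m, a_post, b_post)
               for x in range(max(0, r_total - s), m + 1))
```

An interim futility rule would stop the trial when `predictive_prob` falls below a pre-specified boundary.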

  3. Survival and compound nucleus probability of super heavy element Z = 117

    International Nuclear Information System (INIS)

    Manjunatha, H.C.; Sridhar, K.N.

    2017-01-01

As part of a systematic study for predicting the most suitable projectile-target combinations for heavy-ion fusion experiments in the synthesis of 289-297Ts, we have calculated the transmission probability (T_l), compound nucleus formation probability (P_CN) and survival probability (P_sur) of possible projectile-target combinations. We have also studied the fusion, survival and fission cross sections for different projectile-target combinations of 289-297Ts. These theoretical parameters are required before the synthesis of the superheavy element. The calculated probabilities and cross sections show that the production of isotopes of the superheavy element with Z = 117 is strongly dependent on the reaction system. The most probable reactions to synthesize the superheavy nuclei 289-297Ts are worked out and listed explicitly. We have also studied the variation of P_CN and P_sur with the mass numbers of the projectile and target nuclei. This work is useful for the synthesis of the superheavy element Z = 117. (orig.)

  4. Assessing the Probability that a Finding Is Genuine for Large-Scale Genetic Association Studies.

    Science.gov (United States)

    Kuo, Chia-Ling; Vsevolozhskaya, Olga A; Zaykin, Dmitri V

    2015-01-01

    Genetic association studies routinely involve massive numbers of statistical tests accompanied by P-values. Whole genome sequencing technologies increased the potential number of tested variants to tens of millions. The more tests are performed, the smaller P-value is required to be deemed significant. However, a small P-value is not equivalent to small chances of a spurious finding and significance thresholds may fail to serve as efficient filters against false results. While the Bayesian approach can provide a direct assessment of the probability that a finding is spurious, its adoption in association studies has been slow, due in part to the ubiquity of P-values and the automated way they are, as a rule, produced by software packages. Attempts to design simple ways to convert an association P-value into the probability that a finding is spurious have been met with difficulties. The False Positive Report Probability (FPRP) method has gained increasing popularity. However, FPRP is not designed to estimate the probability for a particular finding, because it is defined for an entire region of hypothetical findings with P-values at least as small as the one observed for that finding. Here we propose a method that lets researchers extract probability that a finding is spurious directly from a P-value. Considering the counterpart of that probability, we term this method POFIG: the Probability that a Finding is Genuine. Our approach shares FPRP's simplicity, but gives a valid probability that a finding is spurious given a P-value. In addition to straightforward interpretation, POFIG has desirable statistical properties. The POFIG average across a set of tentative associations provides an estimated proportion of false discoveries in that set. POFIGs are easily combined across studies and are immune to multiple testing and selection bias. We illustrate an application of POFIG method via analysis of GWAS associations with Crohn's disease.

  5. Quantum processes: probability fluxes, transition probabilities in unit time and vacuum vibrations

    International Nuclear Information System (INIS)

    Oleinik, V.P.; Arepjev, Ju D.

    1989-01-01

Transition probabilities in unit time and probability fluxes are compared in studying elementary quantum processes: the decay of a bound state under the action of time-varying and constant electric fields. It is shown that the difference between these quantities may be considerable, so that the use of transition probabilities W instead of probability fluxes Π in calculating particle fluxes may lead to serious errors. The quantity W represents the rate of change with time of the population of the energy levels, relating partly to real states and partly to virtual ones, and it cannot be directly measured in experiment. The vacuum background is shown to be continuously distorted when a perturbation acts on a system. Because of this, the viewpoint of an observer on the physical properties of real particles continuously varies with time. This fact is not taken into consideration in the conventional theory of quantum transitions based on the notion of probability amplitude. As a result, the probability amplitudes lose their physical meaning. All the physical information on the quantum dynamics of a system is contained in the mean values of physical quantities. The existence of considerable differences between the quantities W and Π permits one, in principle, to choose the correct theory of quantum transitions on the basis of experimental data. (author)

  6. Probability and risk criteria for channel depth design and channel operation

    CSIR Research Space (South Africa)

    Moes, H

    2008-05-01

    Full Text Available The paper reviews the various levels of probability of bottom touching and risk criteria which are being used. This leads to a relationship between the statistically expected number of vertical ship motions in the channel during a single shipping...

  7. Quantification of diazotrophs bacteria isolated from cocoa soils (Theobroma cacao L., by the technique of Most Probable Number (MPN

    Directory of Open Access Journals (Sweden)

    Adriana Zulay Argüello Navarro

    2016-07-01

Full Text Available The objective of this research was to quantify diazotrophic bacteria in, and physicochemically compare, rhizospheric soils of three cocoa plantations (Theobroma cacao L.) in Norte de Santander Department, Colombia; the plantations were characterized and differed in cultivated area, agronomic management and crop age. From serial dilutions of the samples, the diazotrophs were quantified using the Most Probable Number (MPN) technique in semisolid culture media (NFb, JMV, LGI, JNFb), scoring as positive the formation of a subsurface film in the medium contained in sealed vials; parallel samples were sent to the Bioambiental laboratory (UNET) for physicochemical analyses. The evaluated samples showed deficiencies in the percentage of organic matter and in elements such as potassium, phosphorus and magnesium. Statistically highly significant differences in MPN were found. The highest diazotroph counts were recorded at the Florilandia farm, which was characterized by drip irrigation, and in the media NFb and JMV, demonstrating a greater presence of the presumptive genera Azospirillum sp. and Burkholderia sp., which are easily isolated from rhizospheric soils, unlike the genera Herbaspirillum sp. and Gluconacetobacter sp., which, given their endophytic character, tend to be less predominant in this type of sample. It is also concluded that the physicochemical characteristics of the soil, and the humidity and climatic conditions at the time of sampling, determine the amount of root exudates and are therefore factors that conditioned the presence of diazotrophs in the samples.
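The MPN value behind such tables is a maximum-likelihood estimate over the dilution series. A minimal sketch, under the standard MPN assumption of Poisson-distributed organisms (a generic illustration, not this study's software):

```python
import math

def mpn_score(lam, tubes):
    """Derivative of the MPN log-likelihood wrt lam (organisms per gram)."""
    s = 0.0
    for volume, n_tubes, n_positive in tubes:
        p_pos = -math.expm1(-lam * volume)        # P(tube is positive)
        s += n_positive * volume * math.exp(-lam * volume) / p_pos
        s -= (n_tubes - n_positive) * volume
    return s

def mpn_estimate(tubes, lo=1e-6, hi=1e6):
    """Maximum-likelihood MPN by bisection (the score is monotone decreasing)."""
    for _ in range(200):
        mid = math.sqrt(lo * hi)                  # bisect on a log scale
        if mpn_score(mid, tubes) > 0:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)

# (sample volume in g, tubes inoculated, tubes positive) per dilution:
series = [(0.1, 3, 3), (0.01, 3, 1), (0.001, 3, 0)]
mpn = mpn_estimate(series)
```

For this classic 3-tube 3-1-0 pattern the estimate lands near the tabulated value of roughly 43 MPN/g.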

  8. Fixation Probabilities of Evolutionary Graphs Based on the Positions of New Appearing Mutants

    Directory of Open Access Journals (Sweden)

    Pei-ai Zhang

    2014-01-01

Full Text Available Evolutionary graph theory is a nice framework for implementing evolutionary dynamics on spatially structured populations. Calculating the fixation probability is usually treated as a Markov chain process, which is affected by the number of individuals, the fitness of the mutant, the game strategy, and the structure of the population. However, the position of the new mutant is also important to its fixation probability, and it is this position that is emphasized here. A method is put forward to calculate the fixation probability of an evolutionary graph (EG) of a single level. Then, for a class of bilevel EGs, the fixation probabilities are calculated and some propositions are discussed. The conclusion is that the bilevel EG is more stable than the corresponding one-rooted EG.
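As a baseline for such fixation-probability calculations (the classical well-mixed Moran result, not the bilevel-EG method of the paper):

```python
def moran_fixation(r, N):
    """Fixation probability of a single mutant of relative fitness r in a
    well-mixed Moran population of size N."""
    if r == 1.0:
        return 1.0 / N              # neutral mutant
    return (1.0 - 1.0 / r) / (1.0 - 1.0 / r ** N)
```

On heterogeneous graphs the fixation probability can deviate from this value and, as the record notes, can depend on where the new mutant arises.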

  9. The estimated lifetime probability of acquiring human papillomavirus in the United States.

    Science.gov (United States)

    Chesson, Harrell W; Dunne, Eileen F; Hariri, Susan; Markowitz, Lauri E

    2014-11-01

    Estimates of the lifetime probability of acquiring human papillomavirus (HPV) can help to quantify HPV incidence, illustrate how common HPV infection is, and highlight the importance of HPV vaccination. We developed a simple model, based primarily on the distribution of lifetime numbers of sex partners across the population and the per-partnership probability of acquiring HPV, to estimate the lifetime probability of acquiring HPV in the United States in the time frame before HPV vaccine availability. We estimated the average lifetime probability of acquiring HPV among those with at least 1 opposite sex partner to be 84.6% (range, 53.6%-95.0%) for women and 91.3% (range, 69.5%-97.7%) for men. Under base case assumptions, more than 80% of women and men acquire HPV by age 45 years. Our results are consistent with estimates in the existing literature suggesting a high lifetime probability of HPV acquisition and are supported by cohort studies showing high cumulative HPV incidence over a relatively short period, such as 3 to 5 years.
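The model structure described above can be sketched as follows; the partner-count distribution and the per-partnership transmission probability here are hypothetical placeholders, not the paper's fitted inputs:

```python
# Hypothetical inputs (placeholders, not the paper's fitted values):
partner_dist = {1: 0.25, 2: 0.15, 4: 0.25, 8: 0.20, 16: 0.15}  # lifetime partners
p_per_partner = 0.4      # per-partnership probability of acquiring HPV

# Lifetime acquisition probability, averaging 1 - (1 - p)^k over the
# distribution of lifetime partner counts k:
lifetime_p = sum(share * (1.0 - (1.0 - p_per_partner) ** k)
                 for k, share in partner_dist.items())
```

Even with these toy numbers the result is high, illustrating why the paper's estimates exceed 80%: the per-partner risk compounds quickly with the number of partners.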

  10. Construction of unitary matrices from observable transition probabilities

    International Nuclear Information System (INIS)

    Peres, A.

    1989-01-01

An ideal measuring apparatus defines an orthonormal basis |u_m⟩ in Hilbert space. Another apparatus defines another basis |v_μ⟩. Both apparatuses together allow one to measure the transition probabilities P_mμ = |⟨u_m|v_μ⟩|². The problem is: given all the elements of a doubly stochastic matrix P_mμ, find a unitary matrix U_mμ such that P_mμ = |U_mμ|². The number of unknown nontrivial phases is equal to the number of independent equations to satisfy. The problem can therefore be solved provided that the values of the P_mμ satisfy some inequalities. (orig.)
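The forward direction is easy to check numerically: the squared moduli of the entries of any unitary matrix form a doubly stochastic matrix. A small sketch with a real 2x2 rotation (the angle is arbitrary):

```python
import math

theta = 0.7
# A real 2x2 unitary (a rotation):
U = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

# Transition probabilities P_mn = |U_mn|^2:
P = [[U[i][j] ** 2 for j in range(2)] for i in range(2)]

row_sums = [sum(row) for row in P]
col_sums = [sum(P[i][j] for i in range(2)) for j in range(2)]
```

The inverse problem considered in the paper (recovering U from P) is the hard direction, since the unknown phases must satisfy consistency conditions.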

  11. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications....

  12. CAN'T MISS--conquer any number task by making important statistics simple. Part 2. Probability, populations, samples, and normal distributions.

    Science.gov (United States)

    Hansen, John P

    2003-01-01

    Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 2, describes probability, populations, and samples. The uses of descriptive and inferential statistics are outlined. The article also discusses the properties and probability of normal distributions, including the standard normal distribution.
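The standard normal probabilities the article discusses can be computed directly from the error function, without tables. A minimal sketch of the familiar 95% interval underlying confidence-interval calculations:

```python
from math import erf, sqrt

def std_normal_cdf(z: float) -> float:
    """P(Z <= z) for a standard normal variable, via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# About 95% of a normal distribution lies within 1.96 standard
# deviations of the mean -- the basis of the usual 95% confidence interval.
central_95 = std_normal_cdf(1.96) - std_normal_cdf(-1.96)
print(f"P(-1.96 < Z < 1.96) = {central_95:.4f}")  # ~0.9500
```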

  13. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

This book reports the 7 presentations made at the third meeting 'physics and fundamental questions', whose theme was probability and prediction. The concept of probability, invented to apprehend random phenomena, has become an important branch of mathematics, and its range of application spreads from radioactivity to species evolution via cosmology or the management of very weak risks. The notion of probability is the basis of quantum mechanics and is thus bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  14. Rapid Reagentless Detection of M. tuberculosis H37Ra in Respiratory Effluents

    International Nuclear Information System (INIS)

    Adams, K.L.; Steele, P.T.; Bogan, M.J.; Sadler, N.M.; Martin, S.; Martin, A.N.; Frank, M.

    2008-01-01

Two similar mycobacteria, Mycobacterium tuberculosis H37Ra and Mycobacterium smegmatis, are rapidly detected and identified within samples containing a complex background of respiratory effluents using Single Particle Aerosol Mass Spectrometry (SPAMS). M. tuberculosis H37Ra (TBa), an avirulent strain, is used as a surrogate for virulent tuberculosis (TBv); M. smegmatis (MSm) is utilized as a near-neighbor confounder for TBa. Bovine lung surfactant and human exhaled breath condensate are used as first-order surrogates for infected human lung expirations from patients with pulmonary tuberculosis. This simulated background sputum is mixed with TBa or MSm and nebulized to produce conglomerate aerosol particles, single particles that contain a bacterium embedded within a background respiratory matrix. Mass spectra of single conglomerate particles exhibit ions associated with both respiratory effluents and mycobacteria. Spectral features distinguishing TBa from MSm in pure and conglomerate particles are shown. SPAMS pattern matching alarm algorithms are able to distinguish TBa-containing particles from background matrix and MSm for >50% of the test particles, which is sufficient to enable a high probability of detection and a low false alarm rate if an adequate number of such particles are present. These results indicate the potential usefulness of SPAMS for rapid, reagentless tuberculosis screening.

  15. Rapid Reagentless Detection of M. tuberculosis H37Ra in Respiratory Effluents

    Energy Technology Data Exchange (ETDEWEB)

    Adams, K L; Steele, P T; Bogan, M J; Sadler, N M; Martin, S; Martin, A N; Frank, M

    2008-01-29

Two similar mycobacteria, Mycobacterium tuberculosis H37Ra and Mycobacterium smegmatis, are rapidly detected and identified within samples containing a complex background of respiratory effluents using Single Particle Aerosol Mass Spectrometry (SPAMS). M. tuberculosis H37Ra (TBa), an avirulent strain, is used as a surrogate for virulent tuberculosis (TBv); M. smegmatis (MSm) is utilized as a near-neighbor confounder for TBa. Bovine lung surfactant and human exhaled breath condensate are used as first-order surrogates for infected human lung expirations from patients with pulmonary tuberculosis. This simulated background sputum is mixed with TBa or MSm and nebulized to produce conglomerate aerosol particles, single particles that contain a bacterium embedded within a background respiratory matrix. Mass spectra of single conglomerate particles exhibit ions associated with both respiratory effluents and mycobacteria. Spectral features distinguishing TBa from MSm in pure and conglomerate particles are shown. SPAMS pattern matching alarm algorithms are able to distinguish TBa-containing particles from background matrix and MSm for >50% of the test particles, which is sufficient to enable a high probability of detection and a low false alarm rate if an adequate number of such particles are present. These results indicate the potential usefulness of SPAMS for rapid, reagentless tuberculosis screening.

  16. Time Dependence of Collision Probabilities During Satellite Conjunctions

    Science.gov (United States)

    Hall, Doyle T.; Hejduk, Matthew D.; Johnson, Lauren C.

    2017-01-01

The NASA Conjunction Assessment Risk Analysis (CARA) team has recently implemented updated software to calculate the probability of collision (P_c) for Earth-orbiting satellites. The algorithm can employ complex dynamical models for orbital motion, and account for the effects of non-linear trajectories as well as both position and velocity uncertainties. This "3D P_c" method entails computing a 3-dimensional numerical integral for each estimated probability. Our analysis indicates that the 3D method provides several new insights over the traditional "2D P_c" method, even when approximating the orbital motion using the relatively simple Keplerian two-body dynamical model. First, the formulation provides the means to estimate variations in the time derivative of the collision probability, or the probability rate, R_c. For close-proximity satellites, such as those orbiting in formations or clusters, R_c variations can show multiple peaks that repeat or blend with one another, providing insight into the ongoing temporal distribution of risk. For single, isolated conjunctions, R_c analysis provides the means to identify and bound the times of peak collision risk. Additionally, analysis of multiple actual archived conjunctions demonstrates that the commonly used "2D P_c" approximation can occasionally provide inaccurate estimates. These include cases in which the 2D method yields negligibly small probabilities (e.g., P_c < 10^-10), but the 3D estimates are sufficiently large to prompt increased monitoring or collision mitigation (e.g., P_c ≥ 10^-5). Finally, the archive analysis indicates that a relatively efficient calculation can be used to identify which conjunctions will have negligibly small probabilities. This small-P_c screening test can significantly speed the overall risk analysis computation for large numbers of conjunctions.

  17. Rapid Spontaneously Resolving Acute Subdural Hematoma

    Science.gov (United States)

    Gan, Qi; Zhao, Hexiang; Zhang, Hanmei; You, Chao

    2017-01-01

Introduction: This study reports a rare case of a rapid spontaneously resolving acute subdural hematoma. In addition, an analysis of potential clues for the phenomenon is presented with a review of the literature. Patient Presentation: A 1-year-and-2-month-old boy fell from a height of approximately 2 m. The patient was in a superficial coma with a Glasgow Coma Scale score of 8 when he was transferred to the authors’ hospital. Computed tomography revealed the presence of an acute subdural hematoma with a midline shift beyond 1 cm. His guardians refused invasive interventions and chose conservative treatment. Repeat imaging after 15 hours showed the evident resolution of the hematoma and midline reversion. Progressive magnetic resonance imaging demonstrated the complete resolution of the hematoma, without redistribution to a remote site. Conclusions: Even though this phenomenon has a low incidence, the probability of a rapid spontaneously resolving acute subdural hematoma should be considered when patients present with the following characteristics: children or elderly individuals suffering from mild to moderate head trauma; stable or rapidly recovered consciousness; and a simple acute subdural hematoma with a moderate thickness and a particularly low-density band on computed tomography scans. PMID:28468224

  18. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  19. Image Watermarking Scheme for Specifying False Positive Probability and Bit-pattern Embedding

    Science.gov (United States)

    Sayama, Kohei; Nakamoto, Masayoshi; Muneyasu, Mitsuji; Ohno, Shuichi

This paper treats a discrete wavelet transform (DWT)-based image watermarking scheme, considering the false positive probability and bit-pattern embedding. We propose an iterative embedding algorithm for watermark signals, which are K sets of pseudo-random numbers generated by a secret key. In the detection, K correlations between the watermarked DWT coefficients and the watermark signals are computed using the secret key. L correlations are used to judge the watermark presence with a specified false positive probability, and the other K-L correlations correspond to the bit-pattern signal. In the experiment, we show the detection results with the specified false positive probability and the bit-pattern recovery, and the robustness of the proposed method against JPEG compression, scaling down and cropping.
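The correlation-detection idea behind such schemes can be sketched independently of the DWT details: a keyed pseudo-random sequence is added to a set of coefficients, and detection correlates against the sequence regenerated from the key. This is an illustrative stand-in, not the paper's algorithm; all parameters are hypothetical.

```python
import numpy as np

# Illustrative correlation detector in the spirit of the abstract: a
# pseudo-random watermark (keyed by a seed) is added to "coefficients",
# and presence is judged by the correlation value. In a real scheme the
# decision threshold is chosen for a target false-positive probability.
n, alpha = 4096, 2.0
key = 42
coeffs = np.random.default_rng(7).normal(0, 10, n)   # stand-in for DWT coefficients

wm = np.sign(np.random.default_rng(key).normal(size=n))   # keyed +/-1 sequence
marked = coeffs + alpha * wm                              # embed

def detect(signal, seed):
    w = np.sign(np.random.default_rng(seed).normal(size=n))
    return float(signal @ w) / n          # ~alpha if present, ~0 if absent

print(detect(marked, key))      # near alpha: watermark present
print(detect(marked, 123))      # near zero: wrong key, no detection
```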

  20. Decision making generalized by a cumulative probability weighting function

    Science.gov (United States)

    dos Santos, Lindomar Soares; Destefano, Natália; Martinez, Alexandre Souto

    2018-01-01

Typical examples of intertemporal decision making involve situations in which individuals must choose between a smaller reward, but more immediate, and a larger one, delivered later. Analogously, probabilistic decision making involves choices between options whose consequences differ in their probability of being received. In Economics, the expected utility theory (EUT) and the discounted utility theory (DUT) are traditionally accepted normative models for describing, respectively, probabilistic and intertemporal decision making. A large number of experiments confirmed that the linearity assumed by the EUT does not explain some observed behaviors, such as nonlinear preference, risk-seeking and loss aversion. That observation led to the development of new theoretical models, called non-expected utility theories (NEUT), which include a nonlinear transformation of the probability scale. An essential feature of the so-called preference function of these theories is that the probabilities are transformed into decision weights by means of a (cumulative) probability weighting function, w(p). In this article we obtain a generalized function for the probabilistic discount process. This function includes as particular cases mathematical forms already established in the literature, including discount models that consider effects of psychophysical perception. We also propose a new generalized function for the functional form of w. The limiting cases of this function encompass some parametric forms already proposed in the literature. Far beyond a mere generalization, our function allows the interpretation of probabilistic decision making theories based on the assumption that individuals behave similarly in the face of probabilities and delays, and is supported by phenomenological models.

  1. Optimal sample size for probability of detection curves

    International Nuclear Information System (INIS)

    Annis, Charles; Gandossi, Luca; Martin, Oliver

    2013-01-01

Highlights: • We investigate the sample size required to develop probability of detection curves. • We develop simulations to determine effective inspection target sizes, number and distribution. • We summarize these findings and provide guidelines for the NDE practitioner. -- Abstract: The use of probability of detection curves to quantify the reliability of non-destructive examination (NDE) systems is common in the aeronautical industry, but relatively less so in the nuclear industry, at least in European countries. Due to the nature of the components being inspected, sample sizes tend to be much lower. This makes the manufacturing of test pieces with representative flaws, in sufficient numbers to draw statistical conclusions on the reliability of the NDT system under investigation, quite costly. The European Network for Inspection and Qualification (ENIQ) has developed an inspection qualification methodology, referred to as the ENIQ Methodology. It has become widely used in many European countries and provides assurance on the reliability of NDE systems, but only qualitatively. The need to quantify the output of inspection qualification has become more important as structural reliability modelling and quantitative risk-informed in-service inspection methodologies become more widely used. A measure of NDE reliability is necessary to quantify risk reduction after inspection, and probability of detection (POD) curves provide such a metric. The Joint Research Centre, Petten, The Netherlands supported ENIQ by investigating the question of the sample size required to determine a reliable POD curve. As mentioned earlier, manufacturing of test pieces with defects that are typically found in nuclear power plants (NPPs) is usually quite expensive. Thus there is a tendency to reduce sample sizes, which in turn increases the uncertainty associated with the resulting POD curve. The main question in conjunction with POD curves is the appropriate sample size.
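A common way to obtain a POD curve from hit/miss inspection data is a logistic model in flaw size, fitted by maximum likelihood. The sketch below simulates such data and fits the curve with Newton-Raphson; the flaw sizes, sample size, and true parameters are invented for illustration, not taken from the ENIQ study.

```python
import numpy as np

# Hedged sketch of the POD setting discussed above: hit/miss inspection
# results modeled as POD(a) = logistic(b0 + b1*a) in flaw size a, fitted
# by Newton-Raphson maximum likelihood. Data here are simulated.
rng = np.random.default_rng(1)
a = rng.uniform(0.5, 5.0, 200)              # flaw sizes (mm), 200 test pieces
true_pod = 1 / (1 + np.exp(-(-4.0 + 2.0 * a)))
hit = rng.uniform(size=a.size) < true_pod   # simulated hit/miss outcomes

X = np.column_stack([np.ones_like(a), a])
beta = np.zeros(2)
for _ in range(25):                          # Newton-Raphson for the MLE
    p = 1 / (1 + np.exp(-X @ beta))
    grad = X.T @ (hit - p)
    hess = X.T @ (X * (p * (1 - p))[:, None])
    beta += np.linalg.solve(hess, grad)

a90 = (np.log(0.9 / 0.1) - beta[0]) / beta[1]   # size detected with POD = 90%
print(f"fitted a90 ~ {a90:.2f} mm")
```

Shrinking the 200 simulated test pieces to, say, 30 and refitting shows directly how the a90 estimate becomes unstable, which is the sample-size question the abstract raises.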

  2. Rapid arc - clinical rationale and results

    International Nuclear Information System (INIS)

    Cozzi, Lucca

    2008-01-01

The presentation will focus on the background of RapidArc, the intensity-modulated volumetric arc therapy from Varian Medical Systems, aiming to highlight, also from a historical perspective, the technical and clinical rationale behind the founding pillars of fast delivery with a minimum number of arcs and a minimum number of monitor units.

  3. Application of probability generating function to the essentials of nondestructive nuclear materials assay system using neutron correlation

    International Nuclear Information System (INIS)

    Hosoma, Takashi

    2017-01-01

In the previous research (JAEA-Research 2015-009), essentials of neutron multiplicity counting mathematics were reconsidered, taking into account experience obtained at the Plutonium Conversion Development Facility, and formulae of the multiplicity distribution were algebraically derived up to septuplets using a probability generating function, to make a strategic move for the future. The principle was reported by K. Böhnel in 1985, but such a high-order expansion was the first of its kind due to its increasing complexity. In this research, characteristics of the high-order correlations were investigated. It was found that higher-order correlation increases rapidly in response to the increase of leakage multiplication, and crosses and leaves lower-order correlations behind when leakage multiplication is > 1.3, depending on detector efficiency and counter setting. In addition, fission rates and doubles count rates by fast neutrons and by thermal neutrons in a system where both coexist were algebraically derived, again using a probability generating function. The principle was reported by I. Pázsit and L. Pál in 2012, but such a physical interpretation, i.e. associating their stochastic variables with fission rate, doubles count rate and leakage multiplication, is the first of its kind. From the Rossi-alpha combined distribution and the measured ratio of each area obtained by Differential Die-Away Self-Interrogation (DDSI) and conventional assay data, it is possible to estimate: the number of induced fissions per unit time by fast neutrons and by thermal neutrons; the number of induced fissions (< 1) by one source neutron; and individual doubles count rates. During the research, a hypothesis introduced in their report was proved to be true. Provisional calculations were done for UO₂ of 1∼10 kgU containing ∼0.009 wt% ²⁴⁴Cm. (author)
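The probability-generating-function machinery behind multiplicity counting rests on one identity: the k-th factorial moment of a count distribution is the k-th derivative of its PGF at z = 1. A toy check with an invented multiplicity distribution (not real nuclear data):

```python
import math

# For a discrete multiplicity distribution p_n, the PGF is
# G(z) = sum_n p_n z^n, and E[n(n-1)...(n-k+1)] = G^(k)(1).
# The distribution below is illustrative only.
p = [0.4, 0.3, 0.2, 0.1]             # P(n = 0..3), sums to 1

def pgf_derivative(p, k, z=1.0):
    """k-th derivative of G(z) = sum p_n z^n, evaluated at z."""
    return sum(pn * math.perm(n, k) * z ** (n - k)
               for n, pn in enumerate(p) if n >= k)

singles = pgf_derivative(p, 1)       # E[n]
doubles = pgf_derivative(p, 2) / 2   # E[n(n-1)]/2, the "doubles" analogue
print(singles, doubles)
```

The singles/doubles rates in neutron correlation analysis are built from exactly these factorial moments of the emission and detection multiplicity distributions.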

  4. USING THE WEB-SERVICES WOLFRAM|ALPHA TO SOLVE PROBLEMS IN PROBABILITY THEORY

    Directory of Open Access Journals (Sweden)

    Taras Kobylnyk

    2015-10-01

Full Text Available The trend towards the use of remote network resources on the Internet is clearly delineated: traditional training is increasingly combined with networked, remote technologies, and cloud computing has become popular. Methods of probability theory are used in various fields; of particular note is their use in psychological and educational research for the statistical analysis of experimental data. Conducting such research is impossible without the use of modern information technology. Given the advantages of web-based software, the article describes the web-service Wolfram|Alpha and gives a detailed analysis of the possibilities of using it for solving problems of probability theory. The case studies describe the results of queries for solving problems of probability theory, in particular from the sections on random events and random variables. The problem of the number of occurrences of event A in n independent trials is considered and analyzed using Wolfram|Alpha, with a detailed analysis of the possibilities of using the service for the study of a continuous random variable that has a normal or uniform probability distribution, including calculating the probability that the value of a random variable falls in a given interval. The article also treats problems involving the binomial and hypergeometric probability distributions of a discrete random variable and demonstrates the possibility of using the service Wolfram|Alpha for solving them.
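The textbook problem the article mentions, the number of occurrences of event A in n independent trials, is the binomial distribution, and can be checked in plain Python as easily as via Wolfram|Alpha:

```python
from math import comb

# Probability that event A occurs exactly k times in n independent
# trials with per-trial probability p (Bernoulli scheme).
def binomial_pmf(n: int, k: int, p: float) -> float:
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# Example: exactly 3 successes in 10 trials with p = 0.5.
print(binomial_pmf(10, 3, 0.5))  # 120/1024 = 0.1171875
```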

  5. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions were suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert space with dimension larger than two. If measurement contexts are included in the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.

  6. A method for generating skewed random numbers using two overlapping uniform distributions

    International Nuclear Information System (INIS)

    Ermak, D.L.; Nasstrom, J.S.

    1995-02-01

The objective of this work was to implement and evaluate a method for generating skewed random numbers using a combination of uniform random numbers. The method provides a simple and accurate way of generating skewed random numbers from the specified first three moments without an a priori specification of the probability density function. We describe the procedure for generating skewed random numbers from uniform random numbers, and show that it accurately produces random numbers with the desired first three moments over a range of skewness values. We also show that in the limit of zero skewness, the distribution of random numbers is an accurate approximation to the Gaussian probability density function. Future work will use this method to provide skewed random numbers for a Langevin equation model for diffusion in skewed turbulence.
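The core idea, that mixing two overlapping uniform distributions of different widths yields a skewed distribution, is easy to demonstrate. This sketch uses an arbitrary mixture rather than the report's moment-matching parameterization; the component ranges and weight are invented for illustration.

```python
import numpy as np

# Illustrative sketch (not the report's exact parameterization): samples
# drawn from two overlapping uniform distributions with different widths
# and weights form a skewed distribution; the first three moments are
# then checked empirically.
rng = np.random.default_rng(0)
n = 200_000
narrow = rng.uniform(-1.0, 1.0, n)      # narrow component
wide = rng.uniform(-1.0, 5.0, n)        # wide component, shifted upward
pick = rng.uniform(size=n) < 0.7        # mixing weight 0.7 / 0.3
x = np.where(pick, narrow, wide)

mean = x.mean()
std = x.std()
skewness = np.mean(((x - mean) / std) ** 3)
print(mean, std, skewness)              # skewness is clearly positive
```

For this mixture the exact moments work out to mean 0.6, standard deviation about 1.40, and skewness about 1.45; the sampled values should be close.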

  7. Safeguarding a Lunar Rover with Wald's Sequential Probability Ratio Test

    Science.gov (United States)

    Furlong, Michael; Dille, Michael; Wong, Uland; Nefian, Ara

    2016-01-01

The virtual bumper is a safeguarding mechanism for autonomous and remotely operated robots. In this paper we take a new approach to the virtual bumper system by using an old statistical test. By using a modified version of Wald's sequential probability ratio test we demonstrate that we can reduce the number of false positives reported by the virtual bumper, thereby saving valuable mission time. We use the concept of the sequential probability ratio to control vehicle speed in the presence of possible obstacles in order to increase certainty about whether or not obstacles are present. Our new algorithm reduces the chances of collision by approximately 98% relative to traditional virtual bumper safeguarding without speed control.
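Wald's sequential probability ratio test itself is compact: accumulate log-likelihood ratios sample by sample until one of two boundaries set by the target error rates is crossed. The sensor model (Gaussian readings under each hypothesis) and all parameters below are illustrative, not the paper's modified version.

```python
import math

# Minimal Wald SPRT in the spirit of the safeguarding scheme above:
# accumulate log-likelihood ratios from noisy readings until an
# "obstacle" (H1) or "clear" (H0) boundary is crossed. The Gaussian
# sensor model and thresholds here are illustrative assumptions.
def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.05, beta=0.05):
    upper = math.log((1 - beta) / alpha)    # accept H1 (obstacle)
    lower = math.log(beta / (1 - alpha))    # accept H0 (clear)
    llr = 0.0
    for i, x in enumerate(samples, 1):
        # log-likelihood ratio of N(mu1, sigma) vs N(mu0, sigma)
        llr += (x * (mu1 - mu0) - 0.5 * (mu1**2 - mu0**2)) / sigma**2
        if llr >= upper:
            return "obstacle", i
        if llr <= lower:
            return "clear", i
    return "undecided", len(samples)

print(sprt([1.2, 0.9, 1.4, 1.1, 0.8, 1.3, 1.0, 1.2]))   # ('obstacle', 6)
print(sprt([0.1, -0.2, 0.0, -0.4, 0.3, -0.1, 0.2, -0.3]))  # ('clear', 6)
```

The appeal for a rover is visible in the second return value: the test stops as soon as the evidence suffices, rather than after a fixed number of readings.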

  8. A fluctuation relation for the probability of energy backscatter

    Science.gov (United States)

    Vela-Martin, Alberto; Jimenez, Javier

    2017-11-01

We simulate the large scales of an inviscid turbulent flow in a triply periodic box using a dynamic Smagorinsky model for the sub-grid stresses. The flow, which is forced to constant kinetic energy, is fully reversible and can develop a sustained inverse energy cascade. However, due to the large number of degrees of freedom, the probability of a spontaneous mean inverse energy flux is negligible. In order to quantify the probability of inverse energy cascades, we test a local fluctuation relation of the form log P(A) = -c(V,t) A, where P(A) = p(⟨Cs⟩_{V,t} = A) / p(⟨Cs⟩_{V,t} = -A), p is probability, and ⟨Cs⟩_{V,t} is the average of the least-squares dynamic model coefficient over volume V and time t. This is confirmed when Cs is averaged over sufficiently large domains and long times, and c is found to depend linearly on V and t. In the limit in which V^{1/3} is of the order of the integral scale and t is of the order of the eddy-turnover time, we recover a global fluctuation relation that predicts a negligible probability of a sustained inverse energy cascade. For smaller V and t, the local fluctuation relation provides useful predictions on the occurrence of local energy backscatter. Funded by the ERC COTURB project.
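The structure of such a fluctuation relation, a log-ratio of densities at +A and -A that is linear in A, can be illustrated with a toy distribution. For a Gaussian of mean -m and spread s the relation holds exactly with c = 2m/s²; this is a hedged analogy, not the turbulence statistics of the abstract.

```python
import numpy as np

# The fluctuation relation log P(A) = -c A says the log-ratio of the
# densities at +A and -A is linear in A. Toy check: for a Gaussian with
# mean -m and std s, log[p(A)/p(-A)] = -2 m A / s^2, i.e. c = 2 m / s^2.
m, s = 0.5, 1.5                      # illustrative mean magnitude and spread
A = np.linspace(0.1, 2.0, 8)

def log_density(x):                  # log-pdf of N(-m, s), up to a constant
    return -0.5 * ((x + m) / s) ** 2

log_ratio = log_density(A) - log_density(-A)
c = -log_ratio / A                   # should be constant = 2*m/s**2
print(c)                             # all entries ~ 0.4444
```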

  9. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  10. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  11. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics have widely utilized probability weighting functions, the psychophysical foundations of the probability weighting functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (0 < α < 1; w(1/e) = 1/e, w(1) = 1), which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
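Prelec's one-parameter weighting function is short enough to evaluate directly; the α value below is an arbitrary illustrative choice within the admissible range 0 < α < 1:

```python
import math

# Prelec's probability weighting function w(p) = exp(-(-ln p)^alpha),
# 0 < alpha < 1. It overweights small probabilities, underweights large
# ones, and has fixed points w(1/e) = 1/e and w(1) = 1.
def prelec_w(p: float, alpha: float = 0.65) -> float:
    return math.exp(-((-math.log(p)) ** alpha))

print(prelec_w(1 / math.e))             # = 1/e, for any alpha
print(prelec_w(0.01), prelec_w(0.99))   # small p overweighted, large p underweighted
```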

  12. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and respective tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability." Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation. The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a subjective "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples, from the most general scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management.

  13. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  14. High temperature triggers latent variation among individuals: oviposition rate and probability for outbreaks.

    Directory of Open Access Journals (Sweden)

    Christer Björkman

    2011-01-01

Full Text Available It is anticipated that extreme population events, such as extinctions and outbreaks, will become more frequent as a consequence of climate change. To evaluate the increased probability of such events, it is crucial to understand the mechanisms involved. Variation between individuals in their response to climatic factors is an important consideration, especially if microevolution is expected to change the composition of populations. Here we present data on a willow leaf beetle species, showing high variation among individuals in oviposition rate at a high temperature (20 °C). It is particularly noteworthy that not all individuals responded to changes in temperature; individuals laying few eggs at 20 °C continued to do so when transferred to 12 °C, whereas individuals that laid many eggs at 20 °C reduced their oviposition and laid the same number of eggs as the others when transferred to 12 °C. When transferred back to 20 °C most individuals reverted to their original oviposition rate. Thus, high variation among individuals was only observed at the higher temperature. Using a simple population model and based on regional climate change scenarios we show that the probability of outbreaks increases if there is a realistic increase in the number of warm summers. The probability of outbreaks also increased with increasing heritability of the ability to respond to increased temperature. If climate becomes warmer and there is latent variation among individuals in their temperature response, the probability of outbreaks may increase. However, the likelihood for microevolution to play a role may be low. This conclusion is based on the fact that it has been difficult to show that microevolution affects the probability of extinctions. Our results highlight the need for caution when predicting future probabilities of extreme population events.

  15. On random number generators providing convergence more rapid than 1/√N

    International Nuclear Information System (INIS)

    Belov, V.A.

    1982-01-01

    For the simulation of processes in high energy physics, a practical test of the efficiency of applying quasirandom numbers to multiple integration by the Monte Carlo method is presented, together with a comparison of well-known generators of quasirandom and pseudorandom numbers [ru]
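The kind of comparison the abstract describes can be illustrated with a Halton sequence (a common quasirandom construction; the paper's actual generators are not specified here) against a pseudorandom generator, estimating the integral of xy over the unit square, whose exact value is 1/4:

```python
import random

def halton(i, base):
    """i-th element (1-indexed) of the van der Corput sequence in `base`."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def mc_error(points, f=lambda x, y: x * y, exact=0.25):
    """Absolute error of the Monte Carlo estimate of the integral of f."""
    est = sum(f(x, y) for x, y in points) / len(points)
    return abs(est - exact)

n = 4096
rng = random.Random(0)
pseudo = [(rng.random(), rng.random()) for _ in range(n)]
quasi = [(halton(i, 2), halton(i, 3)) for i in range(1, n + 1)]
print(mc_error(pseudo), mc_error(quasi))
```

The quasirandom error shrinks roughly like (log N)^2/N rather than the 1/√N of pseudorandom sampling, which is the effect the title refers to.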

  16. Probability versus representativeness in infancy: can infants use naïve physics to adjust population base rates in probabilistic inference?

    Science.gov (United States)

    Denison, Stephanie; Trikutam, Pallavi; Xu, Fei

    2014-08-01

    A rich tradition in developmental psychology explores physical reasoning in infancy. However, no research to date has investigated whether infants can reason about physical objects that behave probabilistically, rather than deterministically. Physical events are often quite variable, in that similar-looking objects can be placed in similar contexts with different outcomes. Can infants rapidly acquire probabilistic physical knowledge, such as that some leaves fall and some glasses break, simply by observing the statistical regularity with which objects behave, and apply that knowledge in subsequent reasoning? We taught 11-month-old infants physical constraints on objects and asked them to reason about the probability of different outcomes when objects were drawn from a large distribution. Infants could have reasoned either by using the perceptual similarity between the samples and larger distributions or by applying physical rules to adjust base rates and estimate the probabilities. Infants learned the physical constraints quickly and used them to estimate probabilities, rather than relying on similarity, a version of the representativeness heuristic. These results indicate that infants can rapidly and flexibly acquire physical knowledge about objects following very brief exposure and apply it in subsequent reasoning. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  17. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable

  18. Estimating probable flaw distributions in PWR steam generator tubes

    International Nuclear Information System (INIS)

    Gorman, J.A.; Turner, A.P.L.

    1997-01-01

    This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary to secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the numbers of tubes with detectable flaws of various types as a function of time, and (2) the distributions in size of these flaws. Results are provided for hypothetical severely affected, moderately affected and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses

  19. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  20. Rapidity distributions of secondary particles in hadron-nucleus collisions

    International Nuclear Information System (INIS)

    Alaverdyan, G.B.; Pak, A.S.; Tarasov, A.V.; Tseren, Ch.; Uzhinsky, V.V.

    1979-01-01

    In the framework of the leading-particle cascade model, the rapidity distributions of secondary particles in hadron-nucleus interactions are considered. The energy-loss fluctuations of leading particles in successive collisions have been taken into account. It is shown that the centre of the rapidity distribution shifts towards smaller rapidities as the atomic number A of the target nucleus grows. The model reproduces well the energy and A dependences of the rapidity distributions.

  1. A Repetition Test for Pseudo-Random Number Generators

    OpenAIRE

    Gil, Manuel; Gonnet, Gaston H.; Petersen, Wesley P.

    2017-01-01

    A new statistical test for uniform pseudo-random number generators (PRNGs) is presented. The idea is that a sequence of pseudo-random numbers should have numbers reappear with a certain probability. The expected time until a repetition occurs provides the metric for the test. For linear congruential generators (LCGs), failure can be shown theoretically. Empirical test results for a number of commonly used PRNGs are reported, showing that some PRNGs considered to have good statistical propert...
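A minimal sketch of the idea, assuming the classic birthday-problem asymptotic for the expected time to the first repetition over a range of size m, namely √(πm/2); the paper's exact statistic and acceptance bounds may differ:

```python
import math
import random

def first_repeat_index(rng, m):
    """Draw integers uniformly from range(m) until one reappears;
    return how many draws that took."""
    seen = set()
    k = 0
    while True:
        k += 1
        x = rng.randrange(m)
        if x in seen:
            return k
        seen.add(x)

m = 10**6
rng = random.Random(42)
trials = 2000
mean_obs = sum(first_repeat_index(rng, m) for _ in range(trials)) / trials
mean_theory = math.sqrt(math.pi * m / 2)  # birthday-problem expectation
print(mean_obs, mean_theory)
```

A generator that never repeats within its period (as an LCG over its full range cannot) would drift far above the theoretical mean, which is the theoretical failure mode mentioned for LCGs.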

  2. Optimal Release Time and Sensitivity Analysis Using a New NHPP Software Reliability Model with Probability of Fault Removal Subject to Operating Environments

    Directory of Open Access Journals (Sweden)

    Kwang Yoon Song

    2018-05-01

    Full Text Available With the latest technological developments, the software industry is at the center of the fourth industrial revolution. In today’s complex and rapidly changing environment, where software applications must be developed quickly and easily, software must be focused on rapidly changing information technology. The basic goal of software engineering is to produce high-quality software at low cost. However, because of the complexity of software systems, software development can be time consuming and expensive. Software reliability models (SRMs are used to estimate and predict the reliability, number of remaining faults, failure intensity, total and development cost, etc., of software. Additionally, it is very important to decide when, how, and at what cost to release the software to users. In this study, we propose a new nonhomogeneous Poisson process (NHPP SRM with a fault detection rate function affected by the probability of fault removal on failure subject to operating environments and discuss the optimal release time and software reliability with the new NHPP SRM. The example results show a good fit to the proposed model, and we propose an optimal release time for a given change in the proposed model.
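As a concrete classical instance of an NHPP SRM, the Goel-Okumoto model below illustrates how a mean value function yields remaining-fault and reliability estimates. The paper's proposed model adds a fault-removal probability and operating-environment factor not reproduced here, and the parameter values are invented:

```python
import math

def m(t, a=100.0, b=0.3):
    """Goel-Okumoto mean value function: expected faults detected by time t
    (a = total expected faults, b = fault detection rate; illustrative values)."""
    return a * (1.0 - math.exp(-b * t))

def reliability(x, t, a=100.0, b=0.3):
    """Probability of no failure in (t, t+x] after testing up to time t,
    for an NHPP: R(x|t) = exp(-(m(t+x) - m(t)))."""
    return math.exp(-(m(t + x, a, b) - m(t, a, b)))

remaining = 100.0 - m(10.0)   # expected remaining faults after t = 10
r = reliability(1.0, 10.0)    # reliability over the next unit of time
print(remaining, r)
```

Release-time optimization then balances the cost of further testing against the expected cost of faults remaining at release, using exactly these two quantities.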

  3. Subjective probability appraisal of uranium resources in the state of New Mexico

    International Nuclear Information System (INIS)

    Ellis, J.R.; Harris, D.P.; VanWie, N.H.

    1975-12-01

    This report presents an estimate of undiscovered uranium resources in New Mexico of 226,681,000 tons of material containing 455,480 tons U3O8. The basis for this estimate was a survey of the expectations of 36 geologists, in terms of subjective probabilities of number of deposits, ore tonnage, and grade. Weighting of the geologists' estimates to derive a mean value used a self-appraisal index of their knowledge within the field. Detailed estimates are presented for the state, for each of 62 subdivisions (cells), and for an aggregation of eight cells encompassing the San Juan Basin, which is estimated to contain 92 percent of the undiscovered uranium resources in New Mexico. Ore-body attributes stated as probability distributions enabled the application of Monte Carlo methods to the analysis of the data. Sampling of the estimates of material and contained U3O8, which are provided as probability distributions, indicates a 10 percent probability of there being at least 600,000 tons U3O8 remaining undiscovered in deposits virtually certain to number between 500 and 565. An indicated probability of 99.5 percent that the ore grade is greater than 0.12 percent U3O8 suggests that this survey may not provide reliable estimates of the abundance of material in very low-grade categories. Extrapolation to examine the potential for such deposits indicates more than 1,000,000 tons U3O8 may be available down to a grade of 0.05 percent U3O8. Supplemental point estimates of ore depth and thickness allowed derivative estimates of the cost of development, extraction, and milling: about 80 percent of the U3O8 is estimated to be available at a cost of less than $15/lb (1974) and about 98 percent at less than $30/lb.
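The Monte Carlo aggregation step can be sketched as follows. The per-deposit tonnage and grade distributions below are invented lognormals, not the survey's elicited distributions; only the mechanics of sampling deposit counts and attributes and reading off percentiles are illustrated.

```python
import random

def simulate_endowment(rng, n_trials):
    """Toy Monte Carlo aggregation of per-deposit tonnage x grade; the
    distributions below are hypothetical, not the survey's elicited ones."""
    totals = []
    for _ in range(n_trials):
        n_deposits = rng.randint(500, 565)           # 'virtually certain' range
        total = 0.0
        for _ in range(n_deposits):
            tonnage = rng.lognormvariate(11.0, 1.5)  # tons of ore (hypothetical)
            grade = rng.lognormvariate(-6.0, 0.5)    # U3O8 fraction (hypothetical)
            total += tonnage * grade
        totals.append(total)
    totals.sort()
    return totals

totals = simulate_endowment(random.Random(7), 1000)
p10, p50, p90 = (totals[int(q * len(totals))] for q in (0.10, 0.50, 0.90))
print(p10, p50, p90)  # tons U3O8 at the 10th, 50th, and 90th percentiles
```

The survey's "10 percent probability of at least 600,000 tons" statement corresponds to reading the 90th percentile of such an aggregated distribution.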

  4. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)

  5. Predicting the probability of slip in gait: methodology and distribution study.

    Science.gov (United States)

    Gragg, Jared; Yang, James

    2016-01-01

    The likelihood of a slip is related to the available and required friction for a certain activity, here gait. Classical slip and fall analysis presumed that a walking surface was safe if the difference between the mean available and required friction coefficients exceeded a certain threshold. Previous research was dedicated to reformulating the classical slip and fall theory to include the stochastic variation of the available and required friction when predicting the probability of slip in gait. However, when predicting the probability of a slip, previous researchers have either ignored the variation in the required friction or assumed the available and required friction to be normally distributed. Also, there are no published results that actually give the probability of slip for various combinations of required and available frictions. This study proposes a modification to the equation for predicting the probability of slip, reducing the previous equation from a double-integral to a more convenient single-integral form. Also, a simple numerical integration technique is provided to predict the probability of slip in gait: the trapezoidal method. The effect of the random variable distributions on the probability of slip is also studied. It is shown that both the required and available friction distributions cannot automatically be assumed as being normally distributed. The proposed methods allow for any combination of distributions for the available and required friction, and numerical results are compared to analytical solutions for an error analysis. The trapezoidal method is shown to be highly accurate and efficient. The probability of slip is also shown to be sensitive to the input distributions of the required and available friction. Lastly, a critical value for the probability of slip is proposed based on the number of steps taken by an average person in a single day.
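The single-integral form and the trapezoidal method can be sketched directly. Assuming normal available and required friction (one of the cases the study considers; the friction parameters below are illustrative), the numerical result can be checked against the normal-normal closed form:

```python
import math

def npdf(x, mu, sd):
    """Normal probability density."""
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def ncdf(x, mu, sd):
    """Normal cumulative distribution, via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sd * math.sqrt(2))))

def p_slip_trapezoid(mu_req, sd_req, mu_av, sd_av, lo=0.0, hi=2.0, n=4000):
    """P(slip) = P(available < required) = integral of f_req(x) * F_avail(x) dx,
    evaluated by the trapezoidal rule (the single-integral form)."""
    h = (hi - lo) / n
    xs = [lo + i * h for i in range(n + 1)]
    ys = [npdf(x, mu_req, sd_req) * ncdf(x, mu_av, sd_av) for x in xs]
    return h * (0.5 * ys[0] + sum(ys[1:-1]) + 0.5 * ys[-1])

# Illustrative friction parameters (not from the paper)
mu_req, sd_req, mu_av, sd_av = 0.25, 0.05, 0.45, 0.10
p_num = p_slip_trapezoid(mu_req, sd_req, mu_av, sd_av)
# Normal-normal closed form: P(A - R < 0) with A - R ~ N(mu_av-mu_req, ...)
p_exact = ncdf(0.0, mu_av - mu_req, math.sqrt(sd_av**2 + sd_req**2))
print(p_num, p_exact)
```

The point of the single-integral form is that the trapezoidal evaluation works unchanged for any pair of distributions, not just the normal case used for the check above.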

  6. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
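A small simulation illustrates the central claim: when the threshold is set from estimated parameters, failures occur more often than the nominal rate, and for a location-scale family the excess does not depend on the true parameters (so a standard normal is used without loss of generality). The sample size and nominal level below are illustrative:

```python
import random
from statistics import mean, stdev

def observed_failure_rate(n=20, reps=50000, seed=3):
    """Set the threshold at the plug-in 95% point (normal model, parameters
    estimated from n observations); count how often a new loss exceeds it."""
    rng = random.Random(seed)
    z = 1.6449  # standard normal 95% quantile, nominal failure prob 5%
    failures = 0
    for _ in range(reps):
        sample = [rng.gauss(0.0, 1.0) for _ in range(n)]
        threshold = mean(sample) + z * stdev(sample)
        if rng.gauss(0.0, 1.0) > threshold:
            failures += 1
    return failures / reps

rate = observed_failure_rate()
print(rate)  # exceeds the nominal 5% because of parameter uncertainty
```

For this normal case the exact failure probability follows from a Student-t tail (about 6.2% for n = 20), which is the kind of explicit calculation the paper provides for the whole location-scale class.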

  7. Baryon number transfer in hadronic interactions

    International Nuclear Information System (INIS)

    Arakelyan, G.H.; Capella, A.; Kaidalov, A.B.; Shabelski, Yu.M.

    2002-01-01

    The process of baryon number transfer due to string junction propagation in rapidity space is analyzed. It has a significant effect on the net baryon production in pp collisions at mid-rapidities and an even larger effect in the forward hemisphere in the cases of πp and γp interactions. The results of numerical calculations in the framework of the quark-gluon string model are in reasonable agreement with the data. (orig.)

  8. Clinical utility of RapidArc™ radiotherapy technology

    International Nuclear Information System (INIS)

    Infusino, Erminia

    2015-01-01

    RapidArc™ is a radiation technique that delivers highly conformal dose distributions through the complete rotation (360°) and speed variation of the linear accelerator gantry. This technique, called volumetric modulated arc therapy (VMAT), compared with conventional radiotherapy techniques, can achieve high target-volume coverage while sparing normal tissues. RapidArc delivers dose distributions with precision and conformity similar to or greater than intensity-modulated radiation therapy in a short time, generally a few minutes, to which image-guided radiation therapy is added. RapidArc is now in routine use in many centers, which use it to treat large numbers of patients. Large and small hospitals use it for the most challenging cases, but more and more frequently also for the most common cancers. The clinical use of RapidArc and VMAT technology is constantly growing. At present, limited clinical data have been published, mostly planning and feasibility studies. Clinical outcome data are accumulating for a few tumor sites, though slowly. The purpose of this work is to discuss the current status of VMAT techniques in clinical use through a review of the published data of planning systems and clinical outcomes in several tumor sites. The study consisted of a systematic review based on analysis of manuscripts retrieved from the PubMed, BioMed Central, and Scopus databases by searching for the keywords “RapidArc”, “Volumetric modulated arc radiotherapy”, and “Intensity-modulated radiotherapy”

  9. K-forbidden transition probabilities

    International Nuclear Information System (INIS)

    Saitoh, T.R.; Sletten, G.; Bark, R.A.; Hagemann, G.B.; Herskind, B.; Saitoh-Hashimoto, N.; Tsukuba Univ., Ibaraki

    2000-01-01

    Reduced hindrance factors of K-forbidden transitions are compiled for nuclei with A≈180 where γ-vibrational states are observed. Correlations between these reduced hindrance factors and Coriolis forces, statistical level mixing and γ-softness have been studied. It is demonstrated that the K-forbidden transition probabilities are related to γ-softness. The decay of the high-K bandheads has been studied by means of the two-state mixing, which would be induced by the γ-softness, with the use of a number of K-forbidden transitions compiled in the present work, where high-K bandheads are depopulated by both E2 and ΔI=1 transitions. The validity of the two-state mixing scheme has been examined by using the proposed identity of the B(M1)/B(E2) ratios of transitions depopulating high-K bandheads and levels of low-K bands. A breakdown of the identity might indicate that other levels would mediate transitions between high- and low-K states. (orig.)

  10. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  11. Superthermal photon bunching in terms of simple probability distributions

    Science.gov (United States)

    Lettau, T.; Leymann, H. A. M.; Melcher, B.; Wiersig, J.

    2018-05-01

    We analyze the second-order photon autocorrelation function g(2) with respect to the photon probability distribution and discuss the generic features of a distribution that results in superthermal photon bunching [g(2)(0) > 2]. Superthermal photon bunching has been reported for a number of optical microcavity systems that exhibit processes such as superradiance or mode competition. We show that a superthermal photon number distribution cannot be constructed from the principle of maximum entropy if only the intensity and the second-order autocorrelation are given. However, for bimodal systems, an unbiased superthermal distribution can be constructed from second-order correlations and the intensities alone. Our findings suggest modeling superthermal single-mode distributions by a mixture of a thermal and a lasinglike state and thus reveal a generic mechanism in the photon probability distribution responsible for creating superthermal photon bunching. We relate our general considerations to a physical system, i.e., a (single-emitter) bimodal laser, and show that its statistics can be approximated and understood within our proposed model. Furthermore, the excellent agreement of the statistics of the bimodal laser and our model reveals that the bimodal laser is an ideal source of bunched photons, in the sense that it can generate statistics that contain no other features but the superthermal bunching.
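The thermal-plus-lasing mixture mechanism can be checked numerically. The mixture weight and mean photon numbers below are illustrative, not fitted to the bimodal-laser data in the paper:

```python
import math

def g2(probs):
    """Second-order autocorrelation g2(0) = <n(n-1)> / <n>^2 for a photon
    number distribution given as probs[n]."""
    n_mean = sum(n * p for n, p in enumerate(probs))
    n2 = sum(n * (n - 1) * p for n, p in enumerate(probs))
    return n2 / n_mean**2

def thermal(nbar, nmax):
    """Bose-Einstein (thermal) distribution, built iteratively to avoid overflow."""
    p, out = 1.0 / (1.0 + nbar), []
    for _ in range(nmax + 1):
        out.append(p)
        p *= nbar / (1.0 + nbar)
    return out

def poisson(nbar, nmax):
    """Poisson (coherent/lasing-like) distribution."""
    p, out = math.exp(-nbar), []
    for n in range(nmax + 1):
        out.append(p)
        p *= nbar / (n + 1)
    return out

nmax = 400
# Mixture: mostly a bright thermal state, occasionally a dim lasing-like state
a = 0.9
mix = [a * t + (1 - a) * c for t, c in zip(thermal(10.0, nmax), poisson(1.0, nmax))]
print(g2(thermal(10.0, nmax)), g2(poisson(1.0, nmax)), g2(mix))
```

A pure thermal state gives g2 = 2 and a Poisson state gives g2 = 1, yet this mixture exceeds 2, which is exactly the generic mechanism for superthermal bunching the paper identifies.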

  12. Atomic Transition Probabilities Scandium through Manganese

    International Nuclear Information System (INIS)

    Martin, G.A.; Fuhr, J.R.; Wiese, W.L.

    1988-01-01

    Atomic transition probabilities for about 8,800 spectral lines of five iron-group elements, Sc(Z = 21) to Mn(Z = 25), are critically compiled, based on all available literature sources. The data are presented in separate tables for each element and stage of ionization and are further subdivided into allowed (i.e., electric dipole-E1) and forbidden (magnetic dipole-M1, electric quadrupole-E2, and magnetic quadrupole-M2) transitions. Within each data table the spectral lines are grouped into multiplets, which are in turn arranged according to parent configurations, transition arrays, and ascending quantum numbers. For each line the transition probability for spontaneous emission and the line strength are given, along with the spectroscopic designation, the wavelength, the statistical weights, and the energy levels of the upper and lower states. For allowed lines the absorption oscillator strength is listed, while for forbidden transitions the type of transition is identified (M1, E2, etc.). In addition, the estimated accuracy and the source are indicated. In short introductions, which precede the tables for each ion, the main justifications for the choice of the adopted data and for the accuracy rating are discussed. A general introduction contains a discussion of our method of evaluation and the principal criteria for our judgements

  13. Transition probabilities for general birth-death processes with applications in ecology, genetics, and evolution

    Science.gov (United States)

    Crawford, Forrest W.; Suchard, Marc A.

    2011-01-01

    A birth-death process is a continuous-time Markov chain that counts the number of particles in a system over time. In the general process with n current particles, a new particle is born with instantaneous rate λ_n and a particle dies with instantaneous rate μ_n. Currently no robust and efficient method exists to evaluate the finite-time transition probabilities in a general birth-death process with arbitrary birth and death rates. In this paper, we first revisit the theory of continued fractions to obtain expressions for the Laplace transforms of these transition probabilities and make explicit an important derivation connecting transition probabilities and continued fractions. We then develop an efficient algorithm for computing these probabilities that analyzes the error associated with approximations in the method. We demonstrate that this error-controlled method agrees with known solutions and outperforms previous approaches to computing these probabilities. Finally, we apply our novel method to several important problems in ecology, evolution, and genetics. PMID:21984359
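The paper's continued-fraction algorithm is not reproduced here; instead, for a truncated state space, the same finite-time transition probabilities can be computed by uniformization (a standard alternative, adequate only for small state spaces) and checked against the closed-form Yule (pure-birth) solution:

```python
import math

def bd_transition_probs(t, n0, birth, death, nmax, terms=400):
    """Finite-time transition probabilities of a birth-death chain truncated
    at nmax, via uniformization: P(t) = sum_k Poisson(lam*t; k) * M^k applied
    to the initial distribution, with M the uniformized jump chain."""
    lam = max(birth(n) + death(n) for n in range(nmax + 1)) or 1.0

    def step(v):  # one step of the uniformized discrete-time chain
        w = [0.0] * (nmax + 1)
        for n, pn in enumerate(v):
            if pn == 0.0:
                continue
            b = birth(n) if n < nmax else 0.0  # truncation: no births past nmax
            d = death(n)
            if n < nmax:
                w[n + 1] += pn * b / lam
            if n > 0:
                w[n - 1] += pn * d / lam
            w[n] += pn * (1.0 - (b + d) / lam)
        return w

    v = [0.0] * (nmax + 1)
    v[n0] = 1.0
    weight = math.exp(-lam * t)  # Poisson(lam*t) weights, built iteratively
    result = [weight * x for x in v]
    for k in range(1, terms):
        v = step(v)
        weight *= lam * t / k
        result = [r + weight * x for r, x in zip(result, v)]
    return result

# Yule process (pure birth, rate n*lam_b from one particle) is geometric:
# P_{1n}(t) = e^{-lam_b t} (1 - e^{-lam_b t})^{n-1}
lam_b, t = 1.0, 1.0
p = bd_transition_probs(t, 1, lambda n: lam_b * n, lambda n: 0.0, nmax=60)
exact = [0.0] + [math.exp(-lam_b * t) * (1 - math.exp(-lam_b * t)) ** (n - 1)
                 for n in range(1, 61)]
print(max(abs(a - b) for a, b in zip(p, exact)))
```

Uniformization scales poorly when rates grow with n (the number of terms tracks the largest rate), which is precisely the regime where the paper's continued-fraction approach pays off.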

  14. An introduction to probability and statistical inference

    CERN Document Server

    Roussas, George G

    2003-01-01

    "The text is wonderfully written and has the most comprehensive range of exercise problems that I have ever seen." - Tapas K. Das, University of South Florida "The exposition is great; a mixture between conversational tones and formal mathematics; the appropriate combination for a math text at [this] level. In my examination I could find no instance where I could improve the book." - H. Pat Goeters, Auburn University, Alabama * Contains more than 200 illustrative examples discussed in detail, plus scores of numerical examples and applications * Chapters 1-8 can be used independently for an introductory course in probability * Provides a substantial number of proofs

  15. Notes on the Lumped Backward Master Equation for the Neutron Extinction/Survival Probability

    Energy Technology Data Exchange (ETDEWEB)

    Prinja, Anil K [Los Alamos National Laboratory

    2012-07-02

    chains (a fission chain is defined as the initial source neutron and all its subsequent progeny) in which some chains are short lived while others propagate for unusually long times. Under these conditions, fission chains do not overlap strongly and this precludes the cancellation of neutron number fluctuations necessary for the mean to become established as the dominant measure of the neutron population. The fate of individual chains then plays a defining role in the evolution of the neutron population in strongly stochastic systems, and of particular interest and importance in supercritical systems is the extinction probability, defined as the probability that the neutron chain (initiating neutron and its progeny) will be extinguished at a particular time, or its complement, the time-dependent survival probability. The time-asymptotic limit of the latter, the probability of divergence, gives the probability that the neutron population will grow without bound, and is more commonly known as the probability of initiation or just POI. The ability to numerically compute these probabilities, with high accuracy and without overly restricting the underlying physics (e.g., fission neutron multiplicity, reactivity variation) is clearly essential in developing an understanding of the behavior of strongly stochastic systems.
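The extinction probability the notes describe can be computed, in the simplest lumped single-type setting (no delayed neutrons, constant multiplicity law), as the minimal fixed point of the fission-multiplicity probability generating function; the Poisson multiplicity with mean 1.5 below is an illustrative stand-in for a real multiplicity distribution:

```python
import math

def extinction_probability(pgf, iters=200):
    """Minimal root of q = pgf(q): the probability that the chain started by
    one neutron eventually dies out (Galton-Watson approximation).
    Iterating from q = 0 converges to the smallest fixed point."""
    q = 0.0
    for _ in range(iters):
        q = pgf(q)
    return q

# Illustrative: neutrons produced per chain step ~ Poisson(1.5), supercritical
nu_bar = 1.5
pgf = lambda s: math.exp(nu_bar * (s - 1.0))
q = extinction_probability(pgf)
poi = 1.0 - q  # probability of initiation: the chain diverges
print(q, poi)
```

For a subcritical mean (nu_bar ≤ 1) the same iteration returns q = 1, i.e. extinction is certain and the POI is zero, matching the qualitative picture in the abstract.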

  16. Compositional cokriging for mapping the probability risk of groundwater contamination by nitrates.

    Science.gov (United States)

    Pardo-Igúzquiza, Eulogio; Chica-Olmo, Mario; Luque-Espinar, Juan A; Rodríguez-Galiano, Víctor

    2015-11-01

    Contamination by nitrates is an important cause of groundwater pollution and represents a potential risk to human health. Management decisions must be made using probability maps that assess the potential of the nitrate concentration exceeding regulatory thresholds. However these maps are obtained with only a small number of sparse monitoring locations where the nitrate concentrations have been measured. It is therefore of great interest to have an efficient methodology for obtaining those probability maps. In this paper, we make use of the fact that the discrete probability density function is a compositional variable. The spatial discrete probability density function is estimated by compositional cokriging. There are several advantages in using this approach: (i) problems of classical indicator cokriging, like estimates outside the interval (0,1) and order relations, are avoided; (ii) secondary variables (e.g. aquifer parameters) can be included in the estimation of the probability maps; (iii) uncertainty maps of the probability maps can be obtained; (iv) finally, there are modelling advantages, because the variograms and cross-variograms are those of real variables and do not have the restrictions of indicator variograms and indicator cross-variograms. The methodology was applied to the Vega de Granada aquifer in Southern Spain and the advantages of the compositional cokriging approach were demonstrated. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. RAPID TRANSFER ALIGNMENT USING FEDERATED KALMAN FILTER

    Institute of Scientific and Technical Information of China (English)

    GU Dong-qing; QIN Yong-yuan; PENG Rong; LI Xin

    2005-01-01

    The dimension of the centralized Kalman filter (CKF) for rapid transfer alignment (TA) is as high as 21 if aircraft wing flexure motion is considered. The 21-dimensional CKF imposes a heavy computational burden and makes it difficult to meet the high filter update rate required by rapid TA. A federated Kalman filter (FKF) for rapid TA is proposed to solve this dilemma. The structure and algorithm of the FKF, which permits parallel computation and carries a lower computational burden, are designed. The wing flexure motion is modeled, and then the 12-order velocity-matching local filter and the 15-order attitude-matching local filter are devised. Simulation results show that the proposed FKF for rapid TA has almost the same performance as the CKF, while its computational burden is markedly decreased.
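The master-filter step that gives the FKF its low cost is an information-weighted fusion of the local filter estimates. A scalar sketch (the paper's actual local filters are 12- and 15-dimensional, and the toy numbers below are invented):

```python
def fuse(estimates):
    """Master-filter fusion step of a federated Kalman filter (scalar case):
    information-weighted combination of local (estimate, covariance) pairs,
    P = (sum P_i^-1)^-1 and x = P * sum P_i^-1 x_i."""
    info = sum(1.0 / P for _, P in estimates)  # total information
    x = sum(x / P for x, P in estimates) / info
    return x, 1.0 / info                       # fused estimate and covariance

# Two local filters (e.g. velocity-matching and attitude-matching), toy numbers
x_f, P_f = fuse([(1.0, 1.0), (3.0, 1.0)])
print(x_f, P_f)  # → 2.0 0.5
```

Because each local filter runs independently and only these cheap combinations happen at the master rate, the per-cycle cost stays far below propagating one 21-state filter.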

  18. Triangulation based inclusion probabilities: a design-unbiased sampling approach

    OpenAIRE

    Fehrmann, Lutz; Gregoire, Timothy; Kleinn, Christoph

    2011-01-01

    A probabilistic sampling approach for design-unbiased estimation of area-related quantitative characteristics of spatially dispersed population units is proposed. The developed field protocol includes a fixed number of 3 units per sampling location and is based on partial triangulations over their natural neighbors to derive the individual inclusion probabilities. The performance of the proposed design is tested in comparison to fixed area sample plots in a simulation with two forest stands. ...

  19. Occupation probabilities and fluctuations in the asymmetric simple inclusion process

    Science.gov (United States)

    Reuveni, Shlomi; Hirschberg, Ori; Eliazar, Iddo; Yechiali, Uri

    2014-04-01

    The asymmetric simple inclusion process (ASIP), a lattice-gas model of unidirectional transport and aggregation, was recently proposed as an "inclusion" counterpart of the asymmetric simple exclusion process. In this paper we present an exact closed-form expression for the probability that a given number of particles occupies a given set of consecutive lattice sites. Our results are expressed in terms of the entries of Catalan's trapezoids—number arrays which generalize Catalan's numbers and Catalan's triangle. We further prove that the ASIP is asymptotically governed by the following: (i) an inverse square-root law of occupation, (ii) a square-root law of fluctuation, and (iii) a Rayleigh law for the distribution of interexit times. The universality of these results is discussed.
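Catalan's triangle, which the trapezoids generalize, is easy to tabulate from its recurrence; this sketch covers only the triangle case (Catalan's trapezoids add a width parameter not implemented here):

```python
def catalan_triangle(rows):
    """Catalan's triangle via C(n, k) = C(n, k-1) + C(n-1, k) with C(n, 0) = 1
    and entries absent for k > n; the last entry of row n, C(n, n), is the
    n-th Catalan number."""
    tri = []
    for n in range(rows):
        row = [1]
        for k in range(1, n + 1):
            above = tri[n - 1][k] if k < n else 0  # C(n-1, k), zero off-triangle
            row.append(row[k - 1] + above)
        tri.append(row)
    return tri

tri = catalan_triangle(6)
print([row[-1] for row in tri])  # Catalan numbers: [1, 1, 2, 5, 14, 42]
```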

  20. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  1. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments...... that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  2. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are included especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability, across psychological aspects of formulating subjective probability statements and abstract measure-theoretical considerations, to contributions to theoretical statistics an...

  3. Potential applications of rapid/elementary nonparametric statistical techniques (NST) to electrochemical problems

    International Nuclear Information System (INIS)

    Fahidy, Thomas Z.

    2009-01-01

    A major advantage of NST lies in the unimportance of the probability distribution of observations. In this paper, the sign test, the rank-sum test, the Kruskal-Wallis test, the Friedman test, and the runs test illustrate the potential of certain rapid NST for the evaluation of electrochemical process performance.
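    As an illustration of how quickly such nonparametric tests can be evaluated, here is the sign test from the list above in pure Python; the paired differences are invented for the example (e.g. differences in performance between two electrode coatings), not data from the paper:

    ```python
    from math import comb

    def sign_test_p(diffs):
        """Two-sided sign test: under H0 the median difference is zero,
        so the number of positive differences is Binomial(n, 0.5).
        Zero differences are discarded, as is conventional."""
        d = [x for x in diffs if x != 0]
        n, plus = len(d), sum(x > 0 for x in d)
        k = min(plus, n - plus)
        tail = sum(comb(n, i) for i in range(k + 1)) / 2 ** n
        return min(1.0, 2 * tail)

    # Hypothetical paired differences between two process conditions.
    print(sign_test_p([0.8, 1.2, -0.3, 0.5, 0.9, 1.1, 0.4, -0.2]))  # 0.2890625
    ```

    Note that, as the abstract stresses, no assumption about the distribution of the observations enters the calculation.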

  4. A closer look at the probabilities of the notorious three prisoners.

    Science.gov (United States)

    Falk, R

    1992-06-01

    The "problem of three prisoners", a counterintuitive teaser, is analyzed. It is representative of a class of probability puzzles where the correct solution depends on explication of underlying assumptions. Spontaneous beliefs concerning the problem and intuitive heuristics are reviewed. The psychological background of these beliefs is explored. Several attempts to find a simple criterion to predict whether and how the probability of the target event will change as a result of obtaining evidence are examined. However, despite the psychological appeal of these attempts, none proves to be valid in general. A necessary and sufficient condition for change in the probability of the target event, following observation of new data, is proposed. That criterion is an extension of the likelihood-ratio principle (which holds in the case of only two complementary alternatives) to any number of alternatives. Some didactic implications concerning the significance of the chance set-up and reliance on analogies are discussed.
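    The set-up the abstract analyzes can be checked by direct enumeration. The sketch below uses the standard assumptions (the pardon is uniform over the three prisoners, and when both B and C are to be executed the warden names one of them uniformly at random); under these assumptions the warden's answer leaves A's probability unchanged:

    ```python
    from fractions import Fraction

    half, third = Fraction(1, 2), Fraction(1, 3)

    # Joint probability of (pardoned prisoner, name the warden gives A).
    joint = {
        ("A", "B"): third * half,  # A pardoned: warden names B or C at random
        ("A", "C"): third * half,
        ("B", "C"): third,         # B pardoned: warden must name C
        ("C", "B"): third,         # C pardoned: warden must name B
    }

    # Condition on the warden saying "B will be executed".
    says_B = {k: p for k, p in joint.items() if k[1] == "B"}
    posterior_A = says_B[("A", "B")] / sum(says_B.values())
    print(posterior_A)  # 1/3 - the answer carries no news about A
    ```

    Changing the warden's tie-breaking rule (the `half` factor) changes the posterior, which is exactly the dependence on underlying assumptions the paper emphasizes.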

  5. Market-implied risk-neutral probabilities, actual probabilities, credit risk and news

    Directory of Open Access Journals (Sweden)

    Shashidhar Murthy

    2011-09-01

    Full Text Available Motivated by the credit crisis, this paper investigates links between risk-neutral probabilities of default implied by markets (e.g. from yield spreads) and their actual counterparts (e.g. from ratings). It discusses differences between the two and clarifies the underlying economic intuition using simple representations of credit risk pricing. Observed large differences across bonds in the ratio of the two probabilities are shown to imply that apparently safer securities can be more sensitive to news.

  6. Path probability distribution of stochastic motion of non dissipative systems: a classical analog of Feynman factor of path integral

    International Nuclear Information System (INIS)

    Lin, T.L.; Wang, R.; Bi, W.P.; El Kaabouchi, A.; Pujos, C.; Calvayrac, F.; Wang, Q.A.

    2013-01-01

    We investigate, by numerical simulation, the path probability of non dissipative mechanical systems undergoing stochastic motion. The aim is to search for the relationship between this probability and the usual mechanical action. The model of simulation is a one-dimensional particle subject to conservative force and Gaussian random displacement. The probability that a sample path between two fixed points is taken is computed from the number of particles moving along this path, an output of the simulation, divided by the total number of particles arriving at the final point. It is found that the path probability decays exponentially with increasing action of the sample paths. The decay rate increases with decreasing randomness. This result supports the existence of a classical analog of the Feynman factor in the path integral formulation of quantum mechanics for Hamiltonian systems

  7. Probability Distribution for Flowing Interval Spacing

    International Nuclear Information System (INIS)

    Kuzio, S.

    2001-01-01

    The purpose of this analysis is to develop a probability distribution for flowing interval spacing. A flowing interval is defined as a fractured zone that transmits flow in the Saturated Zone (SZ), as identified through borehole flow meter surveys (Figure 1). This analysis uses the term "flowing interval spacing" as opposed to fracture spacing, which is typically used in the literature. The term fracture spacing was not used in this analysis because the data used identify a zone (or a flowing interval) that contains fluid-conducting fractures but do not distinguish how many or which fractures comprise the flowing interval. The flowing interval spacing is measured between the midpoints of each flowing interval. Fracture spacing within the SZ is defined as the spacing between fractures, with no regard to which fractures are carrying flow. The Development Plan associated with this analysis is entitled "Probability Distribution for Flowing Interval Spacing" (CRWMS M and O 2000a). The parameter from this analysis may be used in the TSPA SR/LA Saturated Zone Flow and Transport Work Direction and Planning Documents: (1) "Abstraction of Matrix Diffusion for SZ Flow and Transport Analyses" (CRWMS M and O 1999a) and (2) "Incorporation of Heterogeneity in SZ Flow and Transport Analyses" (CRWMS M and O 1999b). A limitation of this analysis is that the probability distribution of flowing interval spacing may underestimate the effect of incorporating matrix diffusion processes in the SZ transport model because of the possible overestimation of the flowing interval spacing. Larger flowing interval spacing results in a decrease in the matrix diffusion processes. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be determined from the data. Because each flowing interval probably has more than one fracture contributing to it, the true flowing interval spacing could be

  8. On the number of encoder states for a type of RLL codes

    NARCIS (Netherlands)

    Cai, K.; Schouhamer Immink, K.A.

    2006-01-01

    The relationship between the number of encoder states and the probable size of certain runlength-limited (RLL) codes is derived analytically. By associating the number of encoder states with (generalized) Fibonacci numbers, the minimum number of encoder states is obtained, which maximizes the rate
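    The Fibonacci connection can be illustrated for the simplest runlength constraint, no two adjacent ones (d = 1, k = ∞); the code below is a generic counting sketch, not the paper's encoder construction:

    ```python
    from math import log2

    def count_rll_sequences(n):
        """Count binary words of length n with no two adjacent 1s,
        i.e. the (d, k) = (1, inf) runlength constraint.  The count
        obeys the Fibonacci recurrence a(n) = a(n-1) + a(n-2)."""
        if n < 1:
            return 1
        a, b = 1, 2  # counts for lengths 0 and 1
        for _ in range(n - 1):
            a, b = b, a + b
        return b

    print([count_rll_sequences(n) for n in range(1, 8)])  # 2, 3, 5, 8, 13, 21, 34
    # The achievable code rate log2(count)/n approaches the constraint's
    # capacity log2(golden ratio) ~ 0.694 from above as n grows.
    print(log2(count_rll_sequences(40)) / 40)
    ```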

  9. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  10. Unequal Probability Marking Approach to Enhance Security of Traceback Scheme in Tree-Based WSNs.

    Science.gov (United States)

    Huang, Changqin; Ma, Ming; Liu, Xiao; Liu, Anfeng; Zuo, Zhengbang

    2017-06-17

    Fog (from core to edge) computing is a newly emerging computing platform, which utilizes a large number of network devices at the edge of a network to provide ubiquitous computing, thus having great development potential. However, the issue of security poses an important challenge for fog computing. In particular, the Internet of Things (IoT) that constitutes the fog computing platform is crucial for preserving the security of a huge number of wireless sensors, which are vulnerable to attack. In this paper, a new unequal probability marking approach is proposed to enhance the security performance of logging and migration traceback (LM) schemes in tree-based wireless sensor networks (WSNs). The main contribution of this paper is to overcome the deficiency of the LM scheme that has a higher network lifetime and large storage space. In the unequal probability marking logging and migration (UPLM) scheme of this paper, different marking probabilities are adopted for different nodes according to their distances to the sink. A large marking probability is assigned to nodes in remote areas (areas at a long distance from the sink), while a small marking probability is applied to nodes in nearby area (areas at a short distance from the sink). This reduces the consumption of storage and energy in addition to enhancing the security performance, lifetime, and storage capacity. Marking information will be migrated to nodes at a longer distance from the sink for increasing the amount of stored marking information, thus enhancing the security performance in the process of migration. The experimental simulation shows that for general tree-based WSNs, the UPLM scheme proposed in this paper can store 1.12-1.28 times the amount of stored marking information that the equal probability marking approach achieves, and has 1.15-1.26 times the storage utilization efficiency compared with other schemes.
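    A toy version of the distance-dependent marking rule described above; the linear interpolation and the endpoint probabilities are illustrative assumptions for this sketch, not the UPLM formula from the paper:

    ```python
    def marking_probability(hops_to_sink, max_hops, p_near=0.1, p_far=0.9):
        """Assign a packet-marking probability that grows with a node's
        distance from the sink: nodes in remote areas mark with high
        probability, nodes near the sink with low probability
        (linear interpolation, purely illustrative)."""
        frac = hops_to_sink / max_hops
        return p_near + (p_far - p_near) * frac

    for hops in (1, 5, 10):
        print(hops, marking_probability(hops, max_hops=10))
    ```

    Any monotone increasing profile would capture the scheme's qualitative idea: heavily loaded nodes near the sink spend less energy and storage on marking, while remote nodes contribute more traceback information.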

  11. Fitting the Probability Distribution Functions to Model Particulate Matter Concentrations

    International Nuclear Information System (INIS)

    El-Shanshoury, Gh.I.

    2017-01-01

    The main objective of this study is to identify the best probability distribution and plotting position formula for modeling the concentrations of Total Suspended Particles (TSP) as well as Particulate Matter with an aerodynamic diameter < 10 μm (PM10). The best distribution provides the estimated probabilities of exceeding the threshold limit given by the Egyptian Air Quality Limit Value (EAQLV), and the number of exceedance days is estimated as well. The standard limits of the EAQLV for TSP and PM10 concentrations are 24-h averages of 230 μg/m³ and 70 μg/m³, respectively. Five frequency distribution functions combined with seven plotting-position formulae (empirical cumulative distribution functions) are compared to fit the averages of daily TSP and PM10 concentrations in the year 2014 for Ain Sokhna city. The Quantile-Quantile plot (Q-Q plot) is used as a method for assessing how closely a data set fits a particular distribution. A proper probability distribution that represents the TSP and PM10 data has been chosen based on the statistical performance indicator values. The results show that the Hosking and Wallis plotting position combined with the Frechet distribution gave the best fit for TSP and PM10 concentrations; the Burr distribution with the same plotting position follows the Frechet distribution. The exceedance probability and days over the EAQLV are predicted using the Frechet distribution. In 2014, the exceedance probability and days for TSP concentrations are 0.052 and 19 days, respectively. Furthermore, the PM10 concentration is found to exceed the threshold limit on 174 days
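    Once a distribution is fitted, the exceedance calculation reduces to evaluating its CDF at the EAQLV threshold. A sketch with made-up Frechet shape and scale parameters (not the values fitted in the study):

    ```python
    from math import exp

    def frechet_cdf(x, alpha, scale, loc=0.0):
        """CDF of the Frechet (inverse Weibull) distribution,
        F(x) = exp(-((x - loc)/scale)**(-alpha)) for x > loc."""
        if x <= loc:
            return 0.0
        return exp(-((x - loc) / scale) ** (-alpha))

    # Illustrative parameters only, not fitted to the Ain Sokhna data.
    alpha, scale = 3.0, 120.0
    p_exceed = 1.0 - frechet_cdf(230.0, alpha, scale)  # TSP limit, ug/m3
    print(p_exceed, round(p_exceed * 365))  # probability and expected days/year
    ```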

  12. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
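    The maximum-entropy assignment under an average constraint can be sketched numerically: with a fixed mean "energy" the solution has the familiar Gibbs/Maxwell-Boltzmann form p_i ∝ exp(-βE_i), with β chosen to satisfy the constraint. The three-level system below is an invented example, not one from the paper:

    ```python
    from math import exp

    def maxent_probs(energies, mean_target, lo=-50.0, hi=50.0):
        """Maximum-entropy distribution over discrete levels subject to a
        fixed mean: p_i ~ exp(-beta * E_i), with beta found by bisection
        (the constrained mean is monotone decreasing in beta)."""
        def mean(beta):
            w = [exp(-beta * e) for e in energies]
            return sum(e * wi for e, wi in zip(energies, w)) / sum(w)

        for _ in range(200):
            mid = (lo + hi) / 2
            if mean(mid) > mean_target:
                lo = mid  # mean too high: beta must increase
            else:
                hi = mid
        beta = (lo + hi) / 2
        w = [exp(-beta * e) for e in energies]
        z = sum(w)
        return [wi / z for wi in w]

    p = maxent_probs([0.0, 1.0, 2.0], mean_target=0.5)
    print(p, sum(e * pi for e, pi in zip([0, 1, 2], p)))
    ```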

  13. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  14. The probability of the creation of extra dimensions in nuclear collisions

    International Nuclear Information System (INIS)

    Nazarenko, A.V.

    2008-01-01

    The minisuperspace model in 3+d spatial dimensions with matter described by the bag model is considered with the aim of estimating the probability of creation of compactified extra dimensions in nuclear collisions. The amplitude of transition from three- to (3+d)-dimensional space has been calculated both in the case of completely confined matter, when the contribution of radiation is ignored, and in the case of radiation domination, when the bag constant is negligible. It turns out that the number of additional dimensions is limited in the first regime, while it is infinite in the second one. It is shown that the probability of creation of extra dimensions is finite in both regimes. (author)

  15. Failure probability estimate of type 304 stainless steel piping

    International Nuclear Information System (INIS)

    Daugherty, W.L.; Awadalla, N.G.; Sindelar, R.L.; Mehta, H.S.; Ranganath, S.

    1989-01-01

    The primary source of in-service degradation of the SRS production reactor process water piping is intergranular stress corrosion cracking (IGSCC). IGSCC has occurred in a limited number of weld heat affected zones, areas known to be susceptible to IGSCC. A model has been developed to combine crack growth rates, crack size distributions, in-service examination reliability estimates and other considerations to estimate the pipe large-break frequency. This frequency estimates the probability that an IGSCC crack will initiate, escape detection by ultrasonic (UT) examination, and grow to instability prior to extending through-wall and being detected by the sensitive leak detection system. These events are combined as the product of four factors: (1) the probability that a given weld heat affected zone contains IGSCC; (2) the conditional probability, given the presence of IGSCC, that the cracking will escape detection during UT examination; (3) the conditional probability, given a crack escapes detection by UT, that it will not grow through-wall and be detected by leakage; (4) the conditional probability, given a crack is not detected by leakage, that it grows to instability prior to the next UT exam. These four factors estimate the occurrence of several conditions that must coexist in order for a crack to lead to a large break of the process water piping. When evaluated for the SRS production reactors, they produce an extremely low break frequency. The objective of this paper is to present the assumptions, methodology, results and conclusions of a probabilistic evaluation for the direct failure of the primary coolant piping resulting from normal operation and seismic loads. This evaluation was performed to support the ongoing PRA effort and to complement deterministic analyses addressing the credibility of a double-ended guillotine break

  16. Knock probability estimation through an in-cylinder temperature model with exogenous noise

    Science.gov (United States)

    Bares, P.; Selmanaj, D.; Guardiola, C.; Onder, C.

    2018-01-01

    This paper presents a new knock model which combines a deterministic knock model based on the in-cylinder temperature with an exogenous noise disturbing this temperature. The autoignition of the end-gas is modelled by an Arrhenius-like function, and the knock probability is estimated by propagating a virtual error probability distribution. Results show that the random nature of knock can be explained by uncertainties in the in-cylinder temperature estimation. The model has only one parameter for calibration and thus can be easily adapted online. In order to reduce the measurement uncertainties associated with the air mass flow sensor, the trapped mass is derived from the in-cylinder pressure resonance, which improves the knock probability estimation and reduces the number of sensors needed by the model. A four-stroke SI engine was used for model validation. By varying the intake temperature, the engine speed, the injected fuel mass, and the spark advance, specific tests were conducted which furnished data with various knock intensities and probabilities. The new model is able to predict the knock probability within a sufficient range at various operating conditions. The trapped mass obtained by the acoustical model was compared in steady conditions using a fuel balance and a lambda sensor, and differences below 1% were found.
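    The core idea, propagating a Gaussian temperature uncertainty through an autoignition threshold, reduces in the simplest reading to a Gaussian tail probability. This is a toy sketch of that idea, not the paper's calibrated model (the temperatures and noise level below are invented):

    ```python
    from math import erf, sqrt

    def knock_probability(t_det, t_crit, sigma):
        """Knock probability when the in-cylinder temperature is the
        deterministic estimate t_det plus zero-mean Gaussian noise with
        standard deviation sigma, and knock is declared when the noisy
        temperature exceeds the critical value t_crit:
        P(knock) = P(T > t_crit) = 1 - Phi((t_crit - t_det)/sigma)."""
        z = (t_crit - t_det) / sigma
        return 0.5 * (1.0 - erf(z / sqrt(2.0)))

    print(knock_probability(t_det=900.0, t_crit=950.0, sigma=25.0))
    ```

    With the deterministic estimate 50 K below the critical temperature and sigma = 25 K, the result is the 2-sigma Gaussian tail, about 2.3%.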

  17. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  18. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...

  19. The prime numbers and their distribution

    CERN Document Server

    Tenenbaum, Gerald

    2000-01-01

    One notable new direction this century in the study of primes has been the influx of ideas from probability. The goal of this book is to provide insights into the prime numbers and to describe how a sequence so tautly determined can incorporate such a striking amount of randomness. The book opens with some classic topics of number theory. It ends with a discussion of some of the outstanding conjectures in number theory. In between are an excellent chapter on the stochastic properties of primes and a walk through an elementary proof of the Prime Number Theorem. This book is suitable for anyone who has had a little number theory and some advanced calculus involving estimates. Its engaging style and invigorating point of view will make refreshing reading for advanced undergraduates through research mathematicians.

  20. A mass spectrometer for the rapid analysis of gaseous mixtures; Spectrometre de masse pour l'analyse rapide des melanges gazeux

    Energy Technology Data Exchange (ETDEWEB)

    Cassignol, C; Ortel, Y; Taieb, J

    1950-07-01

    A mass spectrometer for leak detection and rapid gas analysis was constructed, having the characteristics and several structural features of a simple instrument described by Siry in Rev. Sci. Instrum. 540 (1947). Although exhibiting a good resolving power, the apparatus, which has no ion lenses and whose electrodes can be adjusted during operation, has not been sufficiently tested. Since several design defects have been discovered, it will probably be rebuilt with various improvements (ion source outside the magnetic field, modified circuits, etc.). (author)

  1. Probability of climatic change. Identification of key questions

    International Nuclear Information System (INIS)

    Fransen, W.

    1995-01-01

    Addressing the question of what the probability is of an anthropogenically induced change in the climate leads to a number of other, underlying questions. These questions, which deal with the characteristics of climate, of climatic change, and of probabilistic statements on climatic change, should be addressed first. The long-term objective of the underlying study, i.e. a quantitative assessment of the risks and opportunities of the predicted climatic change, sets the context within which those questions should be answered. In addition, this context induces extra questions, i.e. about the characteristics of risk

  2. Normal tissue complication probability for salivary glands

    International Nuclear Information System (INIS)

    Rana, B.S.

    2008-01-01

    The purpose of radiotherapy is to strike a profitable balance between morbidity (due to side effects of radiation) and cure of the malignancy. To achieve this, one needs to know the relation between NTCP (normal tissue complication probability) and the various treatment variables of a schedule, viz. daily dose, duration of treatment, total dose and fractionation, along with tissue conditions. Prospective studies require that a large number of patients be treated with varied schedule parameters and that a statistically acceptable number of patients develop complications, so that a true relation between NTCP and a particular variable is established. In this study, salivary gland complications have been considered. The cases treated on a 60Co teletherapy machine during the period 1994 to 2002 were analyzed, and the clinician's judgement in ascertaining the end points was the only means of observation. The only end points were early and late xerostomia, which were considered for NTCP evaluation over a period of 5 years

  3. Ruin probabilities with compounding assets for discrete time finite horizon problems, independent period claim sizes and general premium structure

    NARCIS (Netherlands)

    Kok, de A.G.

    2003-01-01

    In this paper we present fast and accurate approximations for the probability of ruin over a finite number of periods, assuming inhomogeneous independent claim size distributions and arbitrary premium income in subsequent periods. We develop exact recursive expressions for the non-ruin probabilities
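    The recursive structure of such finite-horizon non-ruin probabilities can be sketched for a toy discrete model; the claim distribution and premium below are invented for illustration and the recursion is generic, not the paper's approximation scheme:

    ```python
    from functools import lru_cache

    # Illustrative discrete model: integer claims per period with
    # P(0) = 0.5, P(1) = 0.3, P(2) = 0.2, and premium income 1 per period.
    CLAIMS = {0: 0.5, 1: 0.3, 2: 0.2}
    PREMIUM = 1

    @lru_cache(maxsize=None)
    def non_ruin(u, n):
        """Probability of surviving n periods from integer reserve u:
        each period the reserve gains the premium and pays one claim;
        ruin occurs as soon as the reserve drops below zero."""
        if u < 0:
            return 0.0
        if n == 0:
            return 1.0
        return sum(p * non_ruin(u + PREMIUM - c, n - 1) for c, p in CLAIMS.items())

    print(non_ruin(0, 1))  # 0.8: only the claim of size 2 ruins in one period
    print(non_ruin(0, 5))
    ```

    Period-dependent claim distributions and premiums (as in the paper) fit the same recursion by indexing `CLAIMS` and `PREMIUM` with the period.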

  4. Probability Estimates of Solar Particle Event Doses During a Period of Low Sunspot Number for Thinly-Shielded Spacecraft and Short Duration Missions

    Science.gov (United States)

    Atwell, William; Tylka, Allan J.; Dietrich, William; Rojdev, Kristina; Matzkind, Courtney

    2016-01-01

    In an earlier paper (Atwell, et al., 2015), we investigated solar particle event (SPE) radiation exposures (absorbed dose) to small, thinly-shielded spacecraft during a period when the sunspot number (SSN) was less than 30. These SPEs contain Ground Level Events (GLE), sub-GLEs, and sub-sub-GLEs (Tylka and Dietrich, 2009, Tylka and Dietrich, 2008, and Atwell, et al., 2008). GLEs are extremely energetic solar particle events having proton energies extending into the several GeV range and producing secondary particles in the atmosphere, mostly neutrons, observed with ground station neutron monitors. Sub-GLE events are less energetic, extending into the several hundred MeV range, but do not produce secondary atmospheric particles. Sub-sub GLEs are even less energetic with an observable increase in protons at energies greater than 30 MeV, but no observable proton flux above 300 MeV. In this paper, we consider those SPEs that occurred during 1973-2010 when the SSN was greater than 30 but less than 50. In addition, we provide probability estimates of absorbed dose based on mission duration with a 95% confidence level (CL). We also discuss the implications of these data and provide some recommendations that may be useful to spacecraft designers of these smaller spacecraft.

  5. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  6. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. ... The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models. ...

  7. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  8. The probability outcome correpondence principle : a dispositional view of the interpretation of probability statements

    NARCIS (Netherlands)

    Keren, G.; Teigen, K.H.

    2001-01-01

    This article presents a framework for lay people's internal representations of probabilities, which supposedly reflect the strength of underlying dispositions, or propensities, associated with the predicted event. From this framework, we derive the probability-outcome correspondence principle, which

  9. p-adic probability prediction of correlations between particles in the two-slit and neutron interferometry experiments

    International Nuclear Information System (INIS)

    Khrennikov, A.

    1998-01-01

    The author starts from Feynman's idea of using negative probabilities to describe the two-slit experiment and other quantum interference experiments. Formally, by using negative probability distributions, the results of the two-slit experiment can be explained on the basis of the purely corpuscular picture of quantum mechanics. However, negative probabilities are absurd objects in the framework of the standard Kolmogorov theory of probability. The author presents a large class of non-Kolmogorovean probability models in which negative probabilities are well defined on a frequency basis. These are models with probabilities belonging to the so-called field of p-adic numbers. However, these models are characterized by correlations between trials. The author therefore predicts correlations between particles in interference experiments. In fact, the predictions are similar to those of the so-called nonergodic interpretation of quantum mechanics, which was proposed by V. Buonomano. The author proposes concrete experiments (in particular, in the framework of neutron interferometry) to verify these predictions of correlations

  10. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step by step procedure for constructing a (compound) free Poisso...

  11. Solving probability reasoning based on DNA strand displacement and probability modules.

    Science.gov (United States)

    Zhang, Qiang; Wang, Xiaobiao; Wang, Xiaojun; Zhou, Changjun

    2017-12-01

    In computational biology, DNA strand displacement technology is used to simulate the computation process and has shown strong computing ability. Most researchers use it to solve logic problems, but it is only rarely used in probabilistic reasoning. To process probabilistic reasoning, a conditional probability derivation model and a total probability model based on DNA strand displacement were established in this paper. The models were assessed through the game "read your mind". They have been shown to enable the application of probabilistic reasoning in genetic diagnosis. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Genefer: Programs for Finding Large Probable Generalized Fermat Primes

    Directory of Open Access Journals (Sweden)

    Iain Arthur Bethune

    2015-11-01

    Full Text Available Genefer is a suite of programs for performing Probable Primality (PRP) tests of Generalised Fermat numbers b^(2^n)+1 (GFNs) using a Fermat test. Optimised implementations are available for modern CPUs using single instruction, multiple data (SIMD) instructions, as well as for GPUs using CUDA or OpenCL. Genefer has been extensively used by PrimeGrid – a volunteer computing project searching for large prime numbers of various kinds, including GFNs. Genefer's architecture separates the high-level logic, such as checkpointing and the user interface, from the architecture-specific performance-critical parts of the implementation, which are suitable for re-use. Genefer is released under the MIT license. Source and binaries are available from www.assembla.com/spaces/genefer.
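    The Fermat test at Genefer's core is simple to state. The following is a minimal, unoptimised sketch of our own (nothing like Genefer's SIMD/GPU transforms) of a base-3 Fermat PRP test for generalised Fermat numbers:

```python
# Fermat probable-primality (PRP) test for N = b^(2^n) + 1:
# N passes if base^(N-1) ≡ 1 (mod N). A pass means "probably prime";
# a failure proves compositeness.

def is_gfn_prp(b: int, n: int, base: int = 3) -> bool:
    N = b ** (2 ** n) + 1
    return pow(base, N - 1, N) == 1  # built-in modular exponentiation

print(is_gfn_prp(2, 2))  # True  (2^(2^2)+1 = 17, a Fermat prime)
print(is_gfn_prp(2, 4))  # True  (65537, also prime)
print(is_gfn_prp(3, 1))  # False (3^2+1 = 10, composite)
```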

  13. Sensitivity of the probability of failure to probability of detection curve regions

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2016-01-01

    Non-destructive inspection (NDI) techniques have been shown to play a vital role in fracture control plans, structural health monitoring, and ensuring availability and reliability of piping, pressure vessels, mechanical and aerospace equipment. Probabilistic fatigue simulations are often used in order to determine the efficacy of an inspection procedure with the NDI method modeled as a probability of detection (POD) curve. These simulations can be used to determine the most advantageous NDI method for a given application. As an aid to this process, a first order sensitivity method of the probability-of-failure (POF) with respect to regions of the POD curve (lower tail, middle region, right tail) is developed and presented here. The sensitivity method computes the partial derivative of the POF with respect to a change in each region of a POD or multiple POD curves. The sensitivities are computed at no cost by reusing the samples from an existing Monte Carlo (MC) analysis. A numerical example is presented considering single and multiple inspections. - Highlights: • Sensitivities of probability-of-failure to a region of probability-of-detection curve. • The sensitivities are computed with negligible cost. • Sensitivities identify the important region of a POD curve. • Sensitivities can be used as a guide to selecting the optimal POD curve.

  14. Truth, possibility and probability new logical foundations of probability and statistical inference

    CERN Document Server

    Chuaqui, R

    1991-01-01

    Anyone involved in the philosophy of science is naturally drawn into the study of the foundations of probability. Different interpretations of probability, based on competing philosophical ideas, lead to different statistical techniques, and frequently to mutually contradictory consequences. This unique book presents a new interpretation of probability, rooted in the traditional interpretation that was current in the 17th and 18th centuries. Mathematical models are constructed based on this interpretation, and statistical inference and decision theory are applied, including some examples in artificial intelligence, solving the main foundational problems. Nonstandard analysis is extensively developed for the construction of the models and in some of the proofs. Many nonstandard theorems are proved, some of them new, in particular, a representation theorem that asserts that any stochastic process can be approximated by a process defined over a space with equiprobable outcomes.

  15. Unification of field theory and maximum entropy methods for learning probability densities

    Science.gov (United States)

    Kinney, Justin B.

    2015-09-01

    The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy density estimate can be recovered in the infinite smoothness limit of an appropriate Bayesian field theory. I also show that Bayesian field theory estimation can be performed without imposing any boundary conditions on candidate densities, and that the infinite smoothness limit of these theories recovers the most common types of maximum entropy estimates. Bayesian field theory thus provides a natural test of the maximum entropy null hypothesis and, furthermore, returns an alternative (lower entropy) density estimate when the maximum entropy hypothesis is falsified. The computations necessary for this approach can be performed rapidly for one-dimensional data, and software for doing this is provided.

  16. Unification of field theory and maximum entropy methods for learning probability densities.

    Science.gov (United States)

    Kinney, Justin B

    2015-09-01

    The need to estimate smooth probability distributions (a.k.a. probability densities) from finite sampled data is ubiquitous in science. Many approaches to this problem have been described, but none is yet regarded as providing a definitive solution. Maximum entropy estimation and Bayesian field theory are two such approaches. Both have origins in statistical physics, but the relationship between them has remained unclear. Here I unify these two methods by showing that every maximum entropy density estimate can be recovered in the infinite smoothness limit of an appropriate Bayesian field theory. I also show that Bayesian field theory estimation can be performed without imposing any boundary conditions on candidate densities, and that the infinite smoothness limit of these theories recovers the most common types of maximum entropy estimates. Bayesian field theory thus provides a natural test of the maximum entropy null hypothesis and, furthermore, returns an alternative (lower entropy) density estimate when the maximum entropy hypothesis is falsified. The computations necessary for this approach can be performed rapidly for one-dimensional data, and software for doing this is provided.

  17. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions.

  19. Evaluations of Structural Failure Probabilities and Candidate Inservice Inspection Programs

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.; Simonen, Fredric A.

    2009-05-01

    The work described in this report applies probabilistic structural mechanics models to predict the reliability of nuclear pressure boundary components. These same models are then applied to evaluate the effectiveness of alternative programs for inservice inspection to reduce these failure probabilities. Results of the calculations support the development and implementation of risk-informed inservice inspection of piping and vessels. Studies have specifically addressed the potential benefits of ultrasonic inspections to reduce failure probabilities associated with fatigue crack growth and stress-corrosion cracking. Parametric calculations were performed with the computer code pc-PRAISE to generate an extensive set of plots to cover a wide range of pipe wall thicknesses, cyclic operating stresses, and inspection strategies. The studies have also addressed critical inputs to fracture mechanics calculations, such as the parameters that characterize the number and sizes of fabrication flaws in piping welds. Other calculations quantify uncertainties associated with the inputs to the calculations, the uncertainties in the fracture mechanics models, and the uncertainties in the resulting calculated failure probabilities. A final set of calculations addresses the effects of flaw sizing errors on the effectiveness of inservice inspection programs.

  20. An optimized Line Sampling method for the estimation of the failure probability of nuclear passive systems

    International Nuclear Information System (INIS)

    Zio, E.; Pedroni, N.

    2010-01-01

    The quantitative reliability assessment of a thermal-hydraulic (T-H) passive safety system of a nuclear power plant can be obtained by (i) Monte Carlo (MC) sampling the uncertainties of the system model and parameters, (ii) computing, for each sample, the system response by a mechanistic T-H code and (iii) comparing the system response with pre-established safety thresholds, which define the success or failure of the safety function. The computational effort involved can be prohibitive because of the large number of (typically long) T-H code simulations that must be performed (one for each sample) for the statistical estimation of the probability of success or failure. In this work, Line Sampling (LS) is adopted for efficient MC sampling. In the LS method, an 'important direction' pointing towards the failure domain of interest is determined and a number of conditional one-dimensional problems are solved along that direction; this allows a significant reduction of the variance of the failure probability estimator with respect, for example, to standard random sampling. Two issues remain open with respect to LS: first, the method relies on the determination of the 'important direction', which requires additional runs of the T-H code; second, although the method has been shown to improve computational efficiency by reducing the variance of the failure probability estimator, no evidence has yet been given that accurate and precise failure probability estimates can be obtained with a number of samples reduced to below a few hundred, which may be required in the case of long-running models. The work presented in this paper addresses the first issue by (i) quantitatively comparing the efficiency of the methods proposed in the literature to determine the LS important direction; (ii) employing artificial neural network (ANN) regression models as fast-running surrogates of the original, long-running T-H code to reduce the computational cost associated to the

  1. Transitional Probabilities Are Prioritized over Stimulus/Pattern Probabilities in Auditory Deviance Detection: Memory Basis for Predictive Sound Processing.

    Science.gov (United States)

    Mittag, Maria; Takegata, Rika; Winkler, István

    2016-09-14

    Representations encoding the probabilities of auditory events do not directly support predictive processing. In contrast, information about the probability with which a given sound follows another (transitional probability) allows predictions of upcoming sounds. We tested whether behavioral and cortical auditory deviance detection (the latter indexed by the mismatch negativity event-related potential) relies on probabilities of sound patterns or on transitional probabilities. We presented healthy adult volunteers with three types of rare tone-triplets among frequent standard triplets of high-low-high (H-L-H) or L-H-L pitch structure: proximity deviant (H-H-H/L-L-L), reversal deviant (L-H-L/H-L-H), and first-tone deviant (L-L-H/H-H-L). If deviance detection was based on pattern probability, reversal and first-tone deviants should be detected with similar latency because both differ from the standard at the first pattern position. If deviance detection was based on transitional probabilities, then reversal deviants should be the most difficult to detect because, unlike the other two deviants, they contain no low-probability pitch transitions. The data clearly showed that both behavioral and cortical auditory deviance detection uses transitional probabilities. Thus, the memory traces underlying cortical deviance detection may provide a link between stimulus probability-based change/novelty detectors operating at lower levels of the auditory system and higher auditory cognitive functions that involve predictive processing. Our research presents the first definite evidence for the auditory system prioritizing transitional probabilities over probabilities of individual sensory events. Forming representations for transitional probabilities paves the way for predictions of upcoming sounds. Several recent theories suggest that predictive processing provides the general basis of human perception, including important auditory functions, such as auditory scene analysis. Our
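    The distinction the study draws can be made concrete in code. Below is our own toy sketch (not the study's analysis code) estimating first-order transitional probabilities P(next tone | current tone) from a stream of frequent H-L-H standard triplets:

```python
from collections import Counter, defaultdict

def transition_probs(seq):
    """Estimate P(next | current) from adjacent pairs in a sequence."""
    counts = defaultdict(Counter)
    for cur, nxt in zip(seq, seq[1:]):
        counts[cur][nxt] += 1
    return {cur: {nxt: c / sum(ctr.values()) for nxt, c in ctr.items()}
            for cur, ctr in counts.items()}

stream = list("HLH" * 50)      # frequent H-L-H standard triplets
tp = transition_probs(stream)
print(tp['L'])                 # an L is always followed by an H in this stream...
print(tp['L'].get('L', 0.0))   # ...so an L->L pitch transition has probability 0
```

A deviant triplet containing an L->L transition would thus be flagged by a transition-based detector even when the overall triplet pattern is only mildly improbable.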

  2. Rapidity gap survival in central exclusive diffraction: Dynamical mechanisms and uncertainties

    International Nuclear Information System (INIS)

    Strikman, Mark; Weiss, Christian

    2009-01-01

    We summarize our understanding of the dynamical mechanisms governing rapidity gap survival in central exclusive diffraction, pp -> p + H + p (H = high-mass system), and discuss the uncertainties in present estimates of the survival probability. The main suppression of diffractive scattering is due to inelastic soft spectator interactions at small pp impact parameters and can be described in a mean-field approximation (independent hard and soft interactions). Moderate extra suppression results from fluctuations of the partonic configurations of the colliding protons. At LHC energies absorptive interactions of hard spectator partons associated with the gg -> H process reach the black-disk regime and cause substantial additional suppression, pushing the survival probability below 0.01.

  3. Contribution to the neutronic theory of random stacks (diffusion coefficient and first-flight collision probabilities) with a general theorem on collision probabilities

    International Nuclear Information System (INIS)

    Dixmier, Marc.

    1980-10-01

    A general expression for the diffusion coefficient (d.c.) of neutrons was given, with stress put on symmetries. A system of first-flight collision probabilities was built for the case of a random stack of any number of types of one- and two-zoned spherical pebbles, with an albedo at the boundaries of the elements or, alternatively, consideration of the interstitial medium; to that end, the bases of collision probability theory were reviewed, and a wide generalisation of the reciprocity theorem for those probabilities was demonstrated. The migration area of neutrons was expressed for any random stack of convex, 'simple' and 'regular-contact' elements, taking into account the correlations between free paths; the average cosine of re-emission of neutrons by an element was expressed for the case of a homogeneous spherical pebble and the transport approximation, and the superiority of the result thus obtained over Behrens' theory, for the type of media under consideration, was established. The 'fine structure current term' of the d.c. was also expressed, and it was shown that its 'polarisation term' is negligible. Numerical applications showed that the global heterogeneity effect on the d.c. of pebble-bed reactors is comparable with that for graphite-moderated, carbon-gas-cooled, natural uranium reactors. The code CARACOLE, which integrates all the results obtained here, was introduced. [fr]

  4. Fuzzy probability based fault tree analysis to propagate and quantify epistemic uncertainty

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry; Sony Tjahyani, D.T.; Ekariansyah, Andi Sofrany; Tjahjono, Hendro

    2015-01-01

    Highlights: • Fuzzy probability based fault tree analysis is used to evaluate epistemic uncertainty in fuzzy fault tree analysis. • Fuzzy probabilities represent likelihood occurrences of all events in a fault tree. • A fuzzy multiplication rule quantifies epistemic uncertainty of minimal cut sets. • A fuzzy complement rule estimates epistemic uncertainty of the top event. • The proposed FPFTA has successfully evaluated the U.S. Combustion Engineering RPS. - Abstract: A number of fuzzy fault tree analysis approaches, which integrate fuzzy concepts into the quantitative phase of conventional fault tree analysis, have been proposed to study the reliability of engineering systems. These new approaches apply expert judgments to overcome the limitation of conventional fault tree analysis when basic events do not have probability distributions. Since expert judgments may come with epistemic uncertainty, it is important to quantify the overall uncertainty of a fuzzy fault tree analysis. Monte Carlo simulation is commonly used to quantify the overall uncertainty of conventional fault tree analysis. However, since Monte Carlo simulation is based on probability distributions, this technique is not appropriate for fuzzy fault tree analysis, which is based on fuzzy probabilities. The objective of this study is to develop a fuzzy probability based fault tree analysis that overcomes this limitation. To demonstrate the applicability of the proposed approach, a case study is performed and its results are compared to those of a conventional fault tree analysis. The results confirm that the proposed fuzzy probability based fault tree analysis is feasible for propagating and quantifying epistemic uncertainties in fault tree analysis.

  5. Basic gambling mathematics the numbers behind the neon

    CERN Document Server

    Bollman, Mark

    2014-01-01

    Introduction HISTORICAL BACKGROUND MATHEMATICAL BACKGROUND WHAT DOES IT MEAN TO BE RANDOM? Fundamental Ideas DEFINITIONS AXIOMS OF PROBABILITY ELEMENTARY COUNTING ARGUMENTS ADVANCED COUNTING ARGUMENTS ODDS Compound Events THE ADDITION RULES THE MULTIPLICATION RULES AND CONDITIONAL PROBABILITY Probability Distributions and Expectation RANDOM VARIABLES EXPECTED VALUE THE BINOMIAL DISTRIBUTION Modified Casino Games ROULETTE DICE GAMES CARD GAMES CASINO PROMOTIONS Blackjack: The Mathematical Exception RULES OF BLACKJACK THE MATHEMATICS OF BLACKJACK BASIC STRATEGY CARD COUNTING Betting Strategies: Why They Don't Work ROULETTE STRATEGIES CRAPS STRATEGIES SLOT MACHINE STRATEGIES BLACKJACK STRATEGIES AND ONE THAT DOES: LOTTERY STRATEGIES HOW TO DOUBLE YOUR MONEY Appendix A: House Advantages Appendix B: Mathematical Induction Appendix C: Internet Resources Answers to Odd-Numbered Exercises Bibliography Index Exercises appear at the end of each chapter.

  6. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  7. Monte Carlo simulation of the sequential probability ratio test for radiation monitoring

    International Nuclear Information System (INIS)

    Coop, K.L.

    1984-01-01

    A computer program simulates the Sequential Probability Ratio Test (SPRT) using Monte Carlo techniques. The program, SEQTEST, performs random-number sampling of either a Poisson or normal distribution to simulate radiation monitoring data. The results are in terms of the detection probabilities and the average time required for a trial. The computed SPRT results can be compared with tabulated single interval test (SIT) values to determine the better statistical test for particular monitoring applications. Use of the SPRT in a hand-and-foot alpha monitor shows that the SPRT provides better detection probabilities while generally requiring less counting time. Calculations are also performed for a monitor where the SPRT is not permitted to take longer than the single interval test. Although the performance of the SPRT is degraded by this restriction, the detection probabilities are still similar to the SIT values, and average counting times are always less than 75% of the SIT time. Some optimal conditions for use of the SPRT are described. The SPRT should be the test of choice in many radiation monitoring situations. 6 references, 8 figures, 1 table
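    The simulation idea can be sketched in a few lines. This is our own illustrative version for Poisson counting data (hypothetical parameters, not SEQTEST itself): each trial accumulates a log-likelihood ratio between the background hypothesis H0 and the source hypothesis H1 until a Wald boundary is crossed.

```python
import math
import random

def poisson(lam, rng):
    """Knuth's method; adequate for the small means used here."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def sprt_trial(true_mean, m0, m1, alpha=0.05, beta=0.05, rng=random, max_n=10_000):
    """One SPRT trial: H0 mean=m0 vs H1 mean=m1. Returns (decision, intervals)."""
    lo = math.log(beta / (1 - alpha))    # accept-H0 boundary
    hi = math.log((1 - beta) / alpha)    # accept-H1 boundary
    llr, step = 0.0, math.log(m1 / m0)
    for i in range(1, max_n + 1):
        k = poisson(true_mean, rng)
        llr += k * step - (m1 - m0)      # per-interval log-likelihood ratio
        if llr <= lo:
            return 'H0', i
        if llr >= hi:
            return 'H1', i
    return 'H0', max_n

# Detection probability and average trial length when a source is present
rng = random.Random(0)
trials = [sprt_trial(5.0, m0=1.0, m1=5.0, rng=rng) for _ in range(500)]
detect = sum(d == 'H1' for d, _ in trials) / len(trials)
avg_len = sum(n for _, n in trials) / len(trials)
print(detect > 0.9)  # True: detection probability near the design value 1 - beta
```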

  8. Allelic drop-out probabilities estimated by logistic regression--Further considerations and practical implementation

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Asplund, Maria

    2012-01-01

    We discuss the model for estimating drop-out probabilities presented by Tvedebrink et al. [7] and the concerns that have been raised. The criticism of the model has demonstrated that the model is not perfect. However, the model is very useful for advanced forensic genetic work, where allelic drop-out is occurring. With this discussion, we hope to improve the drop-out model so that it can be used for practical forensic genetics, and to stimulate further discussion. We discuss how to estimate drop-out probabilities when using a varying number of PCR cycles and other experimental conditions.

  9. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  10. Digital simulation of two-dimensional random fields with arbitrary power spectra and non-Gaussian probability distribution functions.

    Science.gov (United States)

    Yura, Harold T; Hanson, Steen G

    2012-04-01

    Methods for simulation of two-dimensional signals with arbitrary power spectral densities and signal amplitude probability density functions are disclosed. The method relies on initially transforming a white noise sample set of random Gaussian distributed numbers into a corresponding set with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative examples with relevance for optics are given.
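    The two-step recipe can be sketched compactly. The following is a minimal sketch of our own (assumed details such as the Gaussian-shaped spectral filter and the exponential target, not the authors' code): (1) spectrally shape white Gaussian noise with an FFT filter, then (2) map the Gaussian marginal to the target distribution through the Gaussian CDF and the inverse target CDF.

```python
import math
import numpy as np

def correlated_field(n=64, corr_len=8.0, seed=0):
    """Spectrally shape white Gaussian noise into a smooth correlated field."""
    rng = np.random.default_rng(seed)
    white = rng.standard_normal((n, n))
    kx = np.fft.fftfreq(n)[:, None]
    ky = np.fft.fftfreq(n)[None, :]
    filt = np.exp(-(kx**2 + ky**2) * (corr_len * math.pi) ** 2)  # assumed spectrum
    field = np.fft.ifft2(np.fft.fft2(white) * filt).real
    return (field - field.mean()) / field.std()   # re-standardise marginal

def to_exponential(gauss_field, rate=1.0):
    """Probability integral transform: Gaussian -> uniform -> exponential."""
    u = 0.5 * (1.0 + np.vectorize(math.erf)(gauss_field / math.sqrt(2.0)))
    u = np.clip(u, 1e-12, 1 - 1e-12)
    return -np.log1p(-u) / rate

g = correlated_field()
e = to_exponential(g)
print(e.shape, bool(e.min() >= 0.0))  # (64, 64) True
```

The inverse-CDF step is monotone, so the spatial correlation structure imposed in step (1) is largely preserved, which is why the method works as an engineering approach.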

  11. Persistent current and transmission probability in the Aharonov-Bohm ring with an embedded quantum dot

    International Nuclear Information System (INIS)

    Wu Suzhi; Li Ning; Jin Guojun; Ma Yuqiang

    2008-01-01

    Persistent current and transmission probability in the Aharonov-Bohm (AB) ring with an embedded quantum dot (QD) are studied using the scattering matrix technique. For the first time, we find that a persistent current can arise in the absence of magnetic flux in a ring with an embedded QD. The persistent current and the transmission probability are sensitive to the lead-ring coupling and the short-range potential barrier. It is shown that increasing the lead-ring coupling or the short-range potential barrier suppresses the persistent current and increases the resonance width of the transmission probability. The effect of the potential barrier on the number of transmission peaks is also investigated. The dependence of the persistent current and the transmission probability on the magnetic flux is periodic, with a period of one flux quantum.

  12. Data-driven probability concentration and sampling on manifold

    Energy Technology Data Exchange (ETDEWEB)

    Soize, C., E-mail: christian.soize@univ-paris-est.fr [Université Paris-Est, Laboratoire Modélisation et Simulation Multi-Echelle, MSME UMR 8208 CNRS, 5 bd Descartes, 77454 Marne-La-Vallée Cedex 2 (France); Ghanem, R., E-mail: ghanem@usc.edu [University of Southern California, 210 KAP Hall, Los Angeles, CA 90089 (United States)

    2016-09-15

    A new methodology is proposed for generating realizations of a random vector with values in a finite-dimensional Euclidean space that are statistically consistent with a dataset of observations of this vector. The probability distribution of this random vector, while a priori not known, is presumed to be concentrated on an unknown subset of the Euclidean space. A random matrix is introduced whose columns are independent copies of the random vector and for which the number of columns is the number of data points in the dataset. The approach is based on the use of (i) the multidimensional kernel-density estimation method for estimating the probability distribution of the random matrix, (ii) a MCMC method for generating realizations for the random matrix, (iii) the diffusion-maps approach for discovering and characterizing the geometry and the structure of the dataset, and (iv) a reduced-order representation of the random matrix, which is constructed using the diffusion-maps vectors associated with the first eigenvalues of the transition matrix relative to the given dataset. The convergence aspects of the proposed methodology are analyzed and a numerical validation is explored through three applications of increasing complexity. The proposed method is found to be robust to noise levels and data complexity as well as to the intrinsic dimension of data and the size of experimental datasets. Both the methodology and the underlying mathematical framework presented in this paper contribute new capabilities and perspectives at the interface of uncertainty quantification, statistical data analysis, stochastic modeling and associated statistical inverse problems.

  13. Survival probability of diffusion with trapping in cellular neurobiology

    Science.gov (United States)

    Holcman, David; Marchewka, Avi; Schuss, Zeev

    2005-09-01

    The problem of diffusion with absorption and trapping sites arises in the theory of molecular signaling inside and on the membranes of biological cells. In particular, this problem arises in the case of spine-dendrite communication, where the number of calcium ions, modeled as random particles, is regulated across the spine microstructure by pumps, which play the role of killing sites, while the end of the dendritic shaft is an absorbing boundary. We develop a general mathematical framework for diffusion in the presence of absorption and killing sites and apply it to the computation of the time-dependent survival probability of ions. We also compute the ratio of the number of absorbed particles at a specific location to the number of killed particles. We show that the ratio depends on the distribution of killing sites. The biological consequence is that the position of the pumps regulates the fraction of calcium ions that reach the dendrite.

  14. Estimating the probability of rare events: addressing zero failure data.

    Science.gov (United States)

    Quigley, John; Revie, Matthew

    2011-07-01

    Traditional statistical procedures for estimating the probability of an event result in an estimate of zero when no events are realized. Alternative inferential procedures have been proposed for the situation where zero events have been realized, but often these are ad hoc, relying on selecting methods dependent on the data that have been realized. Such data-dependent inference decisions violate fundamental statistical principles, resulting in estimation procedures whose benefits are difficult to assess. In this article, we propose estimating the probability of an event occurring through minimax inference on the probability that future samples of equal size realize no more events than that in the data on which the inference is based. Although motivated by inference on rare events, the method is not restricted to zero-event data and closely approximates the maximum likelihood estimate (MLE) for nonzero data. The use of the minimax procedure provides a risk-averse inferential procedure where no events are realized. A comparison is made with the MLE, and regions of the underlying probability are identified where this approach is superior. Moreover, a comparison is made with three standard approaches to supporting inference where no event data are realized, which we argue are unduly pessimistic. We show that for situations of zero events the estimator can be simply approximated with 1/(2.5n), where n is the number of trials. © 2011 Society for Risk Analysis.
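    The closing approximation is easy to check numerically. A toy comparison of our own (illustrative numbers only) of the traditional MLE with the 1/(2.5n) rule for zero-event data:

```python
def mle(events, trials):
    """Traditional estimate: observed relative frequency."""
    return events / trials

def zero_event_estimate(trials):
    """Approximation quoted in the abstract for zero realized events."""
    return 1.0 / (2.5 * trials)

n = 200
print(mle(0, n))               # 0.0 -- unhelpfully certain the event never occurs
print(zero_event_estimate(n))  # 0.002
```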

  15. Knot probability of polygons subjected to a force: a Monte Carlo study

    International Nuclear Information System (INIS)

    Rensburg, E J Janse van; Orlandini, E; Tesi, M C; Whittington, S G

    2008-01-01

    We use Monte Carlo methods to study the knot probability of lattice polygons on the cubic lattice in the presence of an external force f. The force is coupled to the span of the polygons along a lattice direction, say the z-direction. If the force is negative, polygons are squeezed (the compressive regime), while positive forces tend to stretch the polygons along the z-direction (the tensile regime). For sufficiently large positive forces we verify that the Pincus scaling law in the force-extension curve holds. At a fixed number of edges n the knot probability is a decreasing function of the force. For a fixed force the knot probability approaches unity as 1 − exp(−α₀(f)n + o(n)), where α₀(f) is positive and a decreasing function of f. We also examine the average of the absolute value of the writhe, and we verify the square-root growth law (known for f = 0) for all values of f.

  16. Rapid evolution and copy number variation of primate RHOXF2, an X-linked homeobox gene involved in male reproduction and possibly brain function.

    Science.gov (United States)

    Niu, Ao-lei; Wang, Yin-qiu; Zhang, Hui; Liao, Cheng-hong; Wang, Jin-kai; Zhang, Rui; Che, Jun; Su, Bing

    2011-10-12

    Homeobox genes are key regulators during development, and they are in general highly conserved, with only a few reported cases of rapid evolution. RHOXF2 is an X-linked homeobox gene in primates. It is highly expressed in the testicle and may play an important role in spermatogenesis. As the male reproductive system is often the target of natural and/or sexual selection during evolution, in this study we aim to dissect the pattern of molecular evolution of RHOXF2 in primates and its potential functional consequences. We studied sequences and copy number variation of RHOXF2 in humans and 16 nonhuman primate species, as well as the expression patterns in human, chimpanzee, white-browed gibbon and rhesus macaque. The gene copy number analysis showed that there had been parallel gene duplications/losses in multiple primate lineages. Our evidence suggests that 11 nonhuman primate species have one RHOXF2 copy, two copies are present in humans and four Old World monkey species, and at least 6 copies in chimpanzees. Further analysis indicated that the gene duplications in primates had likely been mediated by endogenous retrovirus (ERV) sequences flanking the gene regions. In striking contrast to non-human primates, humans appear to have homogenized their two RHOXF2 copies by the ERV-mediated non-allelic recombination mechanism. Coding sequence and phylogenetic analysis suggested multi-lineage strong positive selection on RHOXF2 during primate evolution, especially during the origins of humans and chimpanzees. All 8 coding-region polymorphic sites in human populations are non-synonymous, implying on-going selection. Gene expression analysis demonstrated that besides the preferential expression in the reproductive system, RHOXF2 is also expressed in the brain. The quantitative data suggest expression pattern divergence among primate species. RHOXF2 is a fast-evolving homeobox gene in primates. The rapid evolution and copy number changes of RHOXF2 had been driven by

  17. Rapid evolution and copy number variation of primate RHOXF2, an X-linked homeobox gene involved in male reproduction and possibly brain function

    Directory of Open Access Journals (Sweden)

    Zhang Rui

    2011-10-01

    Full Text Available Abstract Background Homeobox genes are key regulators during development, and they are in general highly conserved, with only a few reported cases of rapid evolution. RHOXF2 is an X-linked homeobox gene in primates. It is highly expressed in the testicle and may play an important role in spermatogenesis. As the male reproductive system is often the target of natural and/or sexual selection during evolution, in this study we aim to dissect the pattern of molecular evolution of RHOXF2 in primates and its potential functional consequences. Results We studied sequences and copy number variation of RHOXF2 in humans and 16 nonhuman primate species, as well as the expression patterns in human, chimpanzee, white-browed gibbon and rhesus macaque. The gene copy number analysis showed that there had been parallel gene duplications/losses in multiple primate lineages. Our evidence suggests that 11 nonhuman primate species have one RHOXF2 copy, two copies are present in humans and four Old World monkey species, and at least 6 copies in chimpanzees. Further analysis indicated that the gene duplications in primates had likely been mediated by endogenous retrovirus (ERV) sequences flanking the gene regions. In striking contrast to non-human primates, humans appear to have homogenized their two RHOXF2 copies by the ERV-mediated non-allelic recombination mechanism. Coding sequence and phylogenetic analysis suggested multi-lineage strong positive selection on RHOXF2 during primate evolution, especially during the origins of humans and chimpanzees. All 8 coding-region polymorphic sites in human populations are non-synonymous, implying on-going selection. Gene expression analysis demonstrated that besides the preferential expression in the reproductive system, RHOXF2 is also expressed in the brain. The quantitative data suggest expression pattern divergence among primate species. Conclusions RHOXF2 is a fast-evolving homeobox gene in primates. The rapid

  18. A Comprehensive Probability Project for the Upper Division One-Semester Probability Course Using Yahtzee

    Science.gov (United States)

    Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa

    2011-01-01

    This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…

  19. Distribution function of faint galaxy numbers

    International Nuclear Information System (INIS)

    Fesenko, L.M.

    1981-01-01

    The Lick Observatory counts of galaxies are considered. The distribution of the number of galaxies in elementary regions (ERs) of 1° × 1° is investigated; each field of 6° × 6° was treated separately. At b > 40° the probability of observing n galaxies in an ER is an exponentially decreasing function of n, provided the inequality n > … is fulfilled. The mean apparent multiplicity of a galaxy, 2.8 ± 0.9, was derived. A simple model for the galaxy number distribution, based on the numbers of various systems of galaxies, was constructed; superclustering of galaxies was not introduced. Based on this model, an approximate expression for the galaxy number distribution was obtained and compared with the observed distributions. The agreement between these distributions becomes better as the interstellar absorption of light is reduced.

  20. Linking probabilities of off-lattice self-avoiding polygons and the effects of excluded volume

    International Nuclear Information System (INIS)

    Hirayama, Naomi; Deguchi, Tetsuo; Tsurusaki, Kyoichi

    2009-01-01

    We evaluate numerically the probability of linking, i.e. the probability of a given pair of self-avoiding polygons (SAPs) being entangled and forming a nontrivial link type L. In the simulation we generate pairs of SAPs of N spherical segments of radius r_d such that they have no overlaps among the segments and each of the SAPs has the trivial knot type. We evaluate the probability of a self-avoiding pair of SAPs forming a given link type L for various link types with fixed distance R between the centers of mass of the two SAPs. We define the normalized distance r by r = R/R_{g,0_1}, where R_{g,0_1} denotes the square root of the mean-square radius of gyration of a SAP of the trivial knot type 0_1. We introduce formulae expressing the linking probability as a function of the normalized distance r, which give good fitting curves with respect to χ^2 values. We also investigate the dependence of linking probabilities on the excluded-volume parameter r_d and the number of segments, N. Quite interestingly, the graph of linking probability versus normalized distance r shows no N-dependence at a particular value of the excluded-volume parameter, r_d = 0.2.

  1. Survival probability for diffractive dijet production in p anti p collisions from next-to-leading order calculations

    International Nuclear Information System (INIS)

    Klasen, M.; Kramer, G.

    2009-08-01

    We perform next-to-leading order calculations of the single-diffractive and non-diffractive cross sections for dijet production in proton-antiproton collisions at the Tevatron. By comparing their ratio to the data published by the CDF collaboration for two different center-of-mass energies, we deduce the rapidity-gap survival probability as a function of the momentum fraction of the parton in the antiproton. Assuming Regge factorization, this probability can be interpreted as a suppression factor for the diffractive structure function measured in deep-inelastic scattering at HERA. In contrast to the observations for photoproduction, the suppression factor in proton-antiproton collisions depends on the momentum fraction of the parton in the Pomeron even at next-to-leading order. (orig.)

  2. PDE-Foam - a probability-density estimation method using self-adapting phase-space binning

    CERN Document Server

    Dannheim, Dominik; Voigt, Alexander; Grahn, Karl-Johan; Speckmayer, Peter

    2009-01-01

    Probability-Density Estimation (PDE) is a multivariate discrimination technique based on sampling signal and background densities defined by event samples from data or Monte-Carlo (MC) simulations in a multi-dimensional phase space. To efficiently use large event samples to estimate the probability density, a binary search tree (range searching) is used in the PDE-RS implementation. It is a generalisation of standard likelihood methods and a powerful classification tool for problems with highly non-linearly correlated observables. In this paper, we present an innovative improvement of the PDE method that uses a self-adapting binning method to divide the multi-dimensional phase space in a finite number of hyper-rectangles (cells). The binning algorithm adjusts the size and position of a predefined number of cells inside the multidimensional phase space, minimizing the variance of the signal and background densities inside the cells. The binned density information is stored in binary trees, allowing for a very ...

  3. Is probability of frequency too narrow?

    International Nuclear Information System (INIS)

    Martz, H.F.

    1993-01-01

    Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed

  4. An Alternative Version of Conditional Probabilities and Bayes' Rule: An Application of Probability Logic

    Science.gov (United States)

    Satake, Eiki; Amato, Philip P.

    2008-01-01

    This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…
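
    The truth-table view of conditional probability described above can be illustrated with a small worked example. The joint probabilities below are hypothetical, chosen only to make the arithmetic visible:

```python
from fractions import Fraction

# Assign a probability to each joint truth-value assignment of statements
# A and B, then read conditional probabilities off the table and check
# that Bayes' rule holds. The numbers are illustrative only.
joint = {
    (True, True): Fraction(3, 10),
    (True, False): Fraction(1, 10),
    (False, True): Fraction(2, 10),
    (False, False): Fraction(4, 10),
}

def marginal(var_index, value):
    """Marginal probability of one statement taking a truth value."""
    return sum(p for row, p in joint.items() if row[var_index] == value)

def conditional(a_value, b_value):
    """P(A = a_value | B = b_value), read directly from the table."""
    return joint[(a_value, b_value)] / marginal(1, b_value)

p_a_given_b = conditional(True, True)
# Bayes' rule: P(A|B) = P(B|A) * P(A) / P(B).
p_b_given_a = joint[(True, True)] / marginal(0, True)
bayes = p_b_given_a * marginal(0, True) / marginal(1, True)
print(p_a_given_b, bayes)  # 3/5 3/5
```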

  5. A Trust-Based Adaptive Probability Marking and Storage Traceback Scheme for WSNs

    Science.gov (United States)

    Liu, Anfeng; Liu, Xiao; Long, Jun

    2016-01-01

    Security is a pivotal issue for wireless sensor networks (WSNs), which are emerging as a promising platform that enables a wide range of military, scientific, industrial and commercial applications. Traceback, a key cyber-forensics technology, can play an important role in tracing and locating a malicious source to guarantee cybersecurity. In this work a trust-based adaptive probability marking and storage (TAPMS) traceback scheme is proposed to enhance security for WSNs. In a TAPMS scheme, the marking probability is adaptively adjusted according to the security requirements of the network and can substantially reduce the number of marking tuples and improve network lifetime. More importantly, a high trust node is selected to store marking tuples, which can avoid the problem of marking information being lost. Experimental results show that the total number of marking tuples can be reduced in a TAPMS scheme, thus improving network lifetime. At the same time, since the marking tuples are stored in high trust nodes, storage reliability can be guaranteed, and the traceback time can be reduced by more than 80%. PMID:27043566

  6. Probability-Based Determination Methods for Service Waiting in Service-Oriented Computing Environments

    Science.gov (United States)

    Zeng, Sen; Huang, Shuangxi; Liu, Yang

    Cooperative business processes (CBP)-based service-oriented enterprise networks (SOEN) are emerging with the significant advances of enterprise integration and service-oriented architecture. Performance prediction and optimization for CBP-based SOEN is very complex. To meet these challenges, one key point is to reduce the number of physical services an abstract service must wait for. This paper introduces a probability-based determination method (PBDM) for an abstract service's waiting number, M_i, and time span, τ_i, for its physical services. The determination of M_i and τ_i is based on the physical services' arrival rule and the distribution functions of their overall performance. In PBDM, the probability that a physical service with the best overall performance value arrives is a pre-defined reliability. PBDM makes thorough use of the information in the physical services' arrival rule and performance distribution functions, which improves the computational efficiency of scheme design and performance optimization for collaborative business processes in service-oriented computing environments.
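
    The reliability-driven choice of a waiting number can be sketched under a simplifying assumption that is not taken from the paper: that each arriving physical service independently has probability p of achieving the best overall performance. The smallest M with 1 − (1 − p)^M ≥ R is then:

```python
import math

# Hypothetical sketch of the core calculation behind a PBDM-style
# waiting number. Assumption: each arriving physical service
# independently achieves the best overall performance with probability
# p, so M = ceil(log(1 - R) / log(1 - p)). The paper's M_i also depends
# on the services' arrival rule, which is not modeled here.

def waiting_number(p: float, reliability: float) -> int:
    """Smallest M such that 1 - (1 - p)**M >= reliability."""
    if not 0 < p < 1 or not 0 < reliability < 1:
        raise ValueError("p and reliability must lie strictly in (0, 1)")
    return math.ceil(math.log(1 - reliability) / math.log(1 - p))

# With a 10% chance per arrival and a 0.95 reliability target, the
# abstract service should wait for at most M physical services.
print(waiting_number(0.10, 0.95))  # 29
```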

  7. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1 − α. These simultaneous 1 − α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several tests based on the normal probability plot (graphical tests) and of the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in which circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
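
    The construction described above can be approximated by simulation. The sketch below uses an assumed calibration scheme, not necessarily the authors' exact construction: it finds a common half-width c so that, under normality, all standardized order statistics fall within c of their plotting positions with probability about 1 − α:

```python
import random
from statistics import NormalDist

# Monte Carlo sketch of simultaneous intervals for a normal probability
# plot. Assumption: a single calibrated half-width c is applied around
# every plotting position; the paper's exact interval shapes may differ.

def simultaneous_envelope(n: int, alpha: float = 0.05,
                          sims: int = 2000, seed: int = 1):
    rng = random.Random(seed)
    nd = NormalDist()
    # Expected plotting positions for a standard normal sample of size n.
    positions = [nd.inv_cdf((i - 0.375) / (n + 0.25)) for i in range(1, n + 1)]
    max_devs = []
    for _ in range(sims):
        xs = sorted(rng.gauss(0.0, 1.0) for _ in range(n))
        m = sum(xs) / n
        sd = (sum((x - m) ** 2 for x in xs) / (n - 1)) ** 0.5
        # Largest deviation of the standardized order statistics from
        # their plotting positions, for this simulated sample.
        max_devs.append(max(abs((x - m) / sd - p)
                            for x, p in zip(xs, positions)))
    max_devs.sort()
    # The (1 - alpha) quantile of the max deviation is the common half-width.
    c = max_devs[int((1 - alpha) * sims) - 1]
    return [(p - c, p + c) for p in positions]

env = simultaneous_envelope(20)
print(len(env))  # one interval per plotted point
```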

  8. Applied probability and stochastic processes. 2. ed.

    Energy Technology Data Exchange (ETDEWEB)

    Feldman, Richard M. [Texas A and M Univ., College Station, TX (United States). Industrial and Systems Engineering Dept.; Valdez-Flores, Ciriaco [Sielken and Associates Consulting, Inc., Bryan, TX (United States)

    2010-07-01

    This book presents applied probability and stochastic processes in an elementary but mathematically precise manner, with numerous examples and exercises to illustrate the range of engineering and science applications of the concepts. The book is designed to give the reader an intuitive understanding of probabilistic reasoning, in addition to an understanding of mathematical concepts and principles. The initial chapters present a summary of probability and statistics and then Poisson processes, Markov chains, Markov processes and queuing processes are introduced. Advanced topics include simulation, inventory theory, replacement theory, Markov decision theory, and the use of matrix geometric procedures in the analysis of queues. Included in the second edition are appendices at the end of several chapters giving suggestions for the use of Excel in solving the problems of the chapter. Also new in this edition are an introductory chapter on statistics and a chapter on Poisson processes that includes some techniques used in risk assessment. The old chapter on queues has been expanded and broken into two new chapters: one for simple queuing processes and one for queuing networks. Support is provided through the web site http://apsp.tamu.edu where students will have the answers to odd numbered problems and instructors will have access to full solutions and Excel files for homework. (orig.)

  9. Estimating the Probabilities of Low-Weight Differential and Linear Approximations on PRESENT-like Ciphers

    DEFF Research Database (Denmark)

    Abdelraheem, Mohamed Ahmed

    2012-01-01

    We use large but sparse correlation and transition-difference-probability submatrices to find the best linear and differential approximations respectively on PRESENT-like ciphers. This outperforms the branch and bound algorithm when the number of low-weight differential and linear characteristics...

  10. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  11. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  12. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  13. Rapid clozapine titration in treatment-refractory bipolar disorder.

    Science.gov (United States)

    Ifteni, Petru; Correll, Christoph U; Nielsen, Jimmi; Burtea, Victoria; Kane, John M; Manu, Peter

    2014-09-01

    Clozapine is effective in treatment-refractory bipolar disorder (BD). Guidelines recommend slow titration to prevent seizures, hypotension and myocarditis, but this stance is not supported by comparative data. To evaluate the safety and effectiveness of rapid clozapine titration in BD, we analyzed a consecutive cohort of treatment-refractory BD patients with mixed/manic episode admitted on alternate days to one of two units of a psychiatric hospital. On one unit, clozapine was started at 25 mg followed by 25-50 mg as needed every 6 h (maximum = 100 mg/day) on day 1, followed by increases of 25-100 mg/day. On the other unit, clozapine was initiated with 25 mg on day 1, followed by increases of 25-50 mg/day. The primary outcome was the number of days from starting clozapine until readiness for discharge, adjusted in logistic regression for the number of antipsychotics tried during the hospitalization, psychotropic co-treatments and presence of psychotic features. Patients subject to rapid (N=44) and standard (N=23) titration were similar in age, gender, smoking status, body mass index, illness severity at baseline and discharge, and highest clozapine dose. Clozapine was discontinued due to hypotension (N=1) and pneumonia (N=1) during rapid titration, and for excessive sedation (N=1) in each titration group. The number of hospital days from starting clozapine until readiness for discharge was 3.8 days shorter in the rapid titration group (12.7±6.3 vs. 16.5±5.8, p=0.0077). Rapid clozapine titration appeared safe and effective for treatment-refractory BD. The potential for shorter hospital stays justifies prospective trials of this method. Copyright © 2014 Elsevier B.V. All rights reserved.

  14. Comparative analysis through probability distributions of a data set

    Science.gov (United States)

    Cristea, Gabriel; Constantinescu, Dan Mihai

    2018-02-01

    In practice, probability distributions are applied in such diverse fields as risk analysis, reliability engineering, chemical engineering, hydrology, image processing, physics, market research, business and economic research, customer support, medicine, sociology, demography, etc. This article highlights important aspects of fitting probability distributions to data and applying the analysis results to make informed decisions. A number of statistical methods are available to help select the best-fitting model. Some graphs display both the input data and the fitted distributions at the same time, as probability density and cumulative distribution functions. Goodness-of-fit tests can be used to determine whether a certain distribution is a good fit. The main idea is to measure the "distance" between the data and the tested distribution, and to compare that distance to some threshold values. Calculating the goodness-of-fit statistics also enables us to order the fitted distributions according to how well they fit the data. This particular feature is very helpful for comparing the fitted models. The paper presents a comparison of the most commonly used goodness-of-fit tests: Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared. A large set of data is analyzed and conclusions are drawn by visualizing the data, comparing multiple fitted distributions, and selecting the best model. These graphs should be viewed as an addition to the goodness-of-fit tests.
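
    The "distance" idea behind these tests is concrete enough to compute directly. Below is a minimal sketch of the Kolmogorov-Smirnov statistic against a fitted normal distribution; critical values and p-values are omitted, and the data are made up for illustration:

```python
from statistics import NormalDist, mean, stdev

# Kolmogorov-Smirnov statistic: the largest gap between the empirical
# CDF of the data and the CDF of the fitted distribution. In practice
# D would be compared against a threshold (critical value).

def ks_statistic(data, dist) -> float:
    xs = sorted(data)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs, start=1):
        cdf = dist.cdf(x)
        # The empirical CDF jumps at each point: check both sides of the jump.
        d = max(d, abs(i / n - cdf), abs((i - 1) / n - cdf))
    return d

data = [4.9, 5.1, 5.0, 4.8, 5.2, 5.3, 4.7, 5.0, 5.1, 4.9]
fitted = NormalDist(mean(data), stdev(data))
print(round(ks_statistic(data, fitted), 3))
```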

  15. Applications of Fuss-Catalan Numbers to Success Runs of Bernoulli Trials

    Directory of Open Access Journals (Sweden)

    S. J. Dilworth

    2016-01-01

    Full Text Available In a recent paper, the authors derived the exact solution for the probability mass function of the geometric distribution of order k, expressing the roots of the associated auxiliary equation in terms of generating functions for Fuss-Catalan numbers. This paper applies the above formalism for the Fuss-Catalan numbers to treat additional problems pertaining to occurrences of success runs. New exact analytical expressions for the probability mass function and probability generating function and so forth are derived. First, we treat sequences of Bernoulli trials with r ≥ 1 occurrences of success runs of length k with l-overlapping. The case l < 0, where there must be a gap of at least |l| trials between success runs, is also studied. Next we treat the distribution of the waiting time for the rth nonoverlapping appearance of a pair of successes separated by at most k − 2 failures (k ≥ 2).
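
    Both ingredients of the abstract can be illustrated with short computations: the Fuss-Catalan numbers A_n(p, r) = r/(np + r) · C(np + r, n), and the waiting-time distribution for the first success run of length k. The latter is computed here by dynamic programming rather than through the paper's closed-form root expressions:

```python
from math import comb

# Fuss-Catalan numbers: A_n(p, r) = r / (np + r) * C(np + r, n).
def fuss_catalan(n: int, p: int, r: int) -> int:
    return r * comb(n * p + r, n) // (n * p + r)

def waiting_time_pmf(k: int, p: float, n_max: int):
    """P(first run of k successes ends exactly at trial n), n = 1..n_max.

    Dynamic-programming sketch, not the paper's closed-form solution.
    """
    q = 1.0 - p
    # state[j] = P(trailing success run has length j, no k-run completed yet)
    state = [1.0] + [0.0] * (k - 1)
    pmf = []
    for _ in range(n_max):
        pmf.append(state[k - 1] * p)      # the run completes at this trial
        new = [0.0] * k
        new[0] = sum(state) * q           # a failure resets the run
        for j in range(1, k):
            new[j] = state[j - 1] * p     # a success extends the run
        state = new
    return pmf

print(fuss_catalan(3, 2, 1))  # 5, the ordinary Catalan number C_3
pmf = waiting_time_pmf(2, 0.5, 50)
print(round(sum(pmf), 4))     # the mass accumulates toward 1
```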

  16. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  17. Global parameter optimization for maximizing radioisotope detection probabilities at fixed false alarm rates

    Energy Technology Data Exchange (ETDEWEB)

    Portnoy, David, E-mail: david.portnoy@jhuapl.edu [Johns Hopkins University Applied Physics Laboratory, 11100 Johns Hopkins Road, Laurel, MD 20723 (United States); Feuerbach, Robert; Heimberg, Jennifer [Johns Hopkins University Applied Physics Laboratory, 11100 Johns Hopkins Road, Laurel, MD 20723 (United States)

    2011-10-01

    Today there is a tremendous amount of interest in systems that can detect radiological or nuclear threats. Many of these systems operate in extremely high throughput situations where delays caused by false alarms can have a significant negative impact. Thus, calculating the tradeoff between detection rates and false alarm rates is critical for their successful operation. Receiver operating characteristic (ROC) curves have long been used to depict this tradeoff. The methodology was first developed in the field of signal detection. In recent years it has been used increasingly in machine learning and data mining applications. It follows that this methodology could be applied to radiological/nuclear threat detection systems. However, many of these systems do not fit into the classic principles of statistical detection theory because they tend to lack tractable likelihood functions and have many parameters, which, in general, do not have a one-to-one correspondence with the detection classes. This work proposes a strategy to overcome these problems by empirically finding parameter values that maximize the probability of detection for a selected number of probabilities of false alarm. To find these parameter values, a statistical global optimization technique that seeks to estimate portions of a ROC curve is proposed. The optimization combines elements of simulated annealing with elements of genetic algorithms. Genetic algorithms were chosen because they can reduce the risk of getting stuck in local minima. However, classic genetic algorithms operate on arrays of Boolean values or bit strings, so simulated annealing is employed to perform mutation in the genetic algorithm. The presented initial results were generated using an isotope identification algorithm developed at Johns Hopkins University Applied Physics Laboratory. The algorithm has 12 parameters: 4 real-valued and 8 Boolean. A simulated dataset was used for the optimization study; the 'threat' set of

  18. Global parameter optimization for maximizing radioisotope detection probabilities at fixed false alarm rates

    International Nuclear Information System (INIS)

    Portnoy, David; Feuerbach, Robert; Heimberg, Jennifer

    2011-01-01

    Today there is a tremendous amount of interest in systems that can detect radiological or nuclear threats. Many of these systems operate in extremely high throughput situations where delays caused by false alarms can have a significant negative impact. Thus, calculating the tradeoff between detection rates and false alarm rates is critical for their successful operation. Receiver operating characteristic (ROC) curves have long been used to depict this tradeoff. The methodology was first developed in the field of signal detection. In recent years it has been used increasingly in machine learning and data mining applications. It follows that this methodology could be applied to radiological/nuclear threat detection systems. However, many of these systems do not fit into the classic principles of statistical detection theory because they tend to lack tractable likelihood functions and have many parameters, which, in general, do not have a one-to-one correspondence with the detection classes. This work proposes a strategy to overcome these problems by empirically finding parameter values that maximize the probability of detection for a selected number of probabilities of false alarm. To find these parameter values, a statistical global optimization technique that seeks to estimate portions of a ROC curve is proposed. The optimization combines elements of simulated annealing with elements of genetic algorithms. Genetic algorithms were chosen because they can reduce the risk of getting stuck in local minima. However, classic genetic algorithms operate on arrays of Boolean values or bit strings, so simulated annealing is employed to perform mutation in the genetic algorithm. The presented initial results were generated using an isotope identification algorithm developed at Johns Hopkins University Applied Physics Laboratory. The algorithm has 12 parameters: 4 real-valued and 8 Boolean. A simulated dataset was used for the optimization study; the 'threat' set of spectra

  19. Global parameter optimization for maximizing radioisotope detection probabilities at fixed false alarm rates

    Science.gov (United States)

    Portnoy, David; Feuerbach, Robert; Heimberg, Jennifer

    2011-10-01

    Today there is a tremendous amount of interest in systems that can detect radiological or nuclear threats. Many of these systems operate in extremely high throughput situations where delays caused by false alarms can have a significant negative impact. Thus, calculating the tradeoff between detection rates and false alarm rates is critical for their successful operation. Receiver operating characteristic (ROC) curves have long been used to depict this tradeoff. The methodology was first developed in the field of signal detection. In recent years it has been used increasingly in machine learning and data mining applications. It follows that this methodology could be applied to radiological/nuclear threat detection systems. However, many of these systems do not fit into the classic principles of statistical detection theory because they tend to lack tractable likelihood functions and have many parameters, which, in general, do not have a one-to-one correspondence with the detection classes. This work proposes a strategy to overcome these problems by empirically finding parameter values that maximize the probability of detection for a selected number of probabilities of false alarm. To find these parameter values, a statistical global optimization technique that seeks to estimate portions of a ROC curve is proposed. The optimization combines elements of simulated annealing with elements of genetic algorithms. Genetic algorithms were chosen because they can reduce the risk of getting stuck in local minima. However, classic genetic algorithms operate on arrays of Boolean values or bit strings, so simulated annealing is employed to perform mutation in the genetic algorithm. The presented initial results were generated using an isotope identification algorithm developed at Johns Hopkins University Applied Physics Laboratory. The algorithm has 12 parameters: 4 real-valued and 8 Boolean. A simulated dataset was used for the optimization study; the "threat" set of spectra
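
    The hybrid search described above, a genetic algorithm whose mutation step is annealed, can be sketched on a toy objective. Everything below (the objective, the cooling schedule, the parameter values) is illustrative and not the isotope-identification setup from the paper:

```python
import random

# Toy sketch of a genetic algorithm with an annealed mutation step: the
# per-bit flip probability decays with a temperature schedule. The
# objective is a stand-in, not a detection probability.

def toy_objective(bits):
    # Reward bits matching a hidden target pattern (illustrative only).
    target = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1]
    return sum(b == t for b, t in zip(bits, target))

def annealed_ga(n_bits=12, pop_size=20, generations=60, seed=7):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for gen in range(generations):
        temp = 1.0 - gen / generations        # linear cooling schedule
        flip_p = 0.05 + 0.25 * temp           # annealed mutation rate
        scored = sorted(pop, key=toy_objective, reverse=True)
        parents = scored[: pop_size // 2]     # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_bits)    # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - bit if rng.random() < flip_p else bit
                     for bit in child]
            children.append(child)
        pop = parents + children
    return max(pop, key=toy_objective)

best = annealed_ga()
print(toy_objective(best))  # fitness of the best bit string (12 is optimal)
```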

  20. Efficient computation of the joint probability of multiple inherited risk alleles from pedigree data.

    Science.gov (United States)

    Madsen, Thomas; Braun, Danielle; Peng, Gang; Parmigiani, Giovanni; Trippa, Lorenzo

    2018-06-25

    The Elston-Stewart peeling algorithm enables estimation of an individual's probability of harboring germline risk alleles based on pedigree data, and serves as the computational backbone of important genetic counseling tools. However, it remains limited to the analysis of risk alleles at a small number of genetic loci because its computing time grows exponentially with the number of loci considered. We propose a novel, approximate version of this algorithm, dubbed the peeling and paring algorithm, which scales polynomially in the number of loci. This allows peeling-based models to be extended to many genetic loci. The algorithm creates a trade-off between accuracy and speed, and allows the user to control this trade-off. We provide exact bounds on the approximation error and evaluate it in realistic simulations. Results show that the loss of accuracy due to the approximation is negligible in important applications. This algorithm will improve genetic counseling tools by increasing the number of pathogenic risk alleles that can be addressed. To illustrate, we create an extended five-gene version of BRCAPRO, a widely used model for estimating carrier probabilities of BRCA1 and BRCA2 risk alleles, and assess its computational properties. © 2018 WILEY PERIODICALS, INC.

  1. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables. The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  2. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus. Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  3. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  4. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  5. Rapid Methods for the Laboratory Identification of Pathogenic Microorganisms.

    Science.gov (United States)

    1982-09-01

    coli, Hemophilus influenzae, Bacillus anthracis, Bacillus circulans, Bacillus coagulans, Bacillus cereus, Candida albicans, Cryptococcus neoformans, Legionel... Keywords: lectins; rapid identification; Bacillus anthracis; Cryptococcus neoformans; Neisseria... A field-type kit for the rapid identification of Bacillus anthracis. We have shown that certain lectins will selectively interact with B. anthracis

  6. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.

  7. Nuclear data uncertainties: I, Basic concepts of probability

    International Nuclear Information System (INIS)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs

  8. Economic choices reveal probability distortion in macaque monkeys.

    Science.gov (United States)

    Stauffer, William R; Lak, Armin; Bossaerts, Peter; Schultz, Wolfram

    2015-02-18

    Economic choices are largely determined by two principal elements, reward value (utility) and probability. Although nonlinear utility functions have been acknowledged for centuries, nonlinear probability weighting (probability distortion) was only recently recognized as a ubiquitous aspect of real-world choice behavior. Even when outcome probabilities are known and acknowledged, human decision makers often overweight low probability outcomes and underweight high probability outcomes. Whereas recent studies measured utility functions and their corresponding neural correlates in monkeys, it is not known whether monkeys distort probability in a manner similar to humans. Therefore, we investigated economic choices in macaque monkeys for evidence of probability distortion. We trained two monkeys to predict reward from probabilistic gambles with constant outcome values (0.5 ml or nothing). The probability of winning was conveyed using explicit visual cues (sector stimuli). Choices between the gambles revealed that the monkeys used the explicit probability information to make meaningful decisions. Using these cues, we measured probability distortion from choices between the gambles and safe rewards. Parametric modeling of the choices revealed classic probability weighting functions with inverted-S shape. Therefore, the animals overweighted low probability rewards and underweighted high probability rewards. Empirical investigation of the behavior verified that the choices were best explained by a combination of nonlinear value and nonlinear probability distortion. Together, these results suggest that probability distortion may reflect evolutionarily preserved neuronal processing. Copyright © 2015 Stauffer et al.
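    The inverted-S weighting reported above is commonly parameterized by, for example, the one-parameter Tversky-Kahneman form; the functional form and the gamma value below are illustrative assumptions, not the paper's fitted model:

```python
def weight(p, gamma=0.61):
    """Inverted-S probability weighting (Tversky-Kahneman one-parameter form)."""
    return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

# gamma < 1 overweights low probabilities and underweights high ones,
# matching the choice pattern described in the abstract.
low, high = weight(0.05), weight(0.95)
print(low, high)
```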

  9. Probable essential thrombocythemia in a dog

    Energy Technology Data Exchange (ETDEWEB)

    Hopper, P.E.; Mandell, C.P.; Turrel, J.M.; Jain, N.C.; Tablin, F.; Zinkl, J.G.

    1989-04-01

    Essential thrombocythemia (ET) in an 11-year-old dog was characterized by persistently high platelet counts (range, 4.19 X 10(6)/microliters to 4.95 X 10(6)/microliters), abnormal platelet morphology, marked megakaryocytic hyperplasia in the bone marrow, absence of circulating megakaryoblasts, and a history of splenomegaly and gastrointestinal bleeding. Increased numbers of megakaryocytes and megakaryoblasts (15% to 20%) in the bone marrow were confirmed by a positive acetylcholinesterase reaction. Another significant finding was the presence of basophilia in the blood (4,836/microliters) and bone marrow. The marked persistent thrombocytosis, absence of reactive (secondary) thrombocytosis, abnormal platelet morphology, and quantitative and qualitative changes in the megakaryocytic series in the bone marrow suggested the presence of a myeloproliferative disease. Cytochemical and ultrastructural findings aided in the diagnosis of ET. The dog was treated with radiophosphorus. The result was a rapid decline in the numbers of megakaryoblasts and megakaryocytes in the bone marrow and of platelets and basophils in the peripheral blood. The dog died unexpectedly of acute necrotizing pancreatitis and diabetes mellitus before a complete remission was achieved.

  10. Flipping between Languages? An Exploratory Analysis of the Usage by Spanish-Speaking English Language Learner Tertiary Students of a Bilingual Probability Applet

    Science.gov (United States)

    Lesser, Lawrence M.; Wagler, Amy E.; Salazar, Berenice

    2016-01-01

    English language learners (ELLs) are a rapidly growing part of the student population in many countries. Studies on resources for language learners--especially Spanish-speaking ELLs--have focused on areas such as reading, writing, and mathematics, but not introductory probability and statistics. Semi-structured qualitative interviews investigated…

  11. Link importance incorporated failure probability measuring solution for multicast light-trees in elastic optical networks

    Science.gov (United States)

    Li, Xin; Zhang, Lu; Tang, Ying; Huang, Shanguo

    2018-03-01

    The light-tree-based optical multicasting (LT-OM) scheme provides a spectrum- and energy-efficient method to accommodate emerging multicast services. Some studies focus on survivability technologies for LTs against a fixed number of link failures, such as single-link failure. However, few studies involve failure probability constraints when building LTs. It is worth noting that each link of an LT plays a differently important role under failure scenarios. When calculating the failure probability of an LT, the importance of each of its links should therefore be considered. We design a link importance incorporated failure probability measuring solution (LIFPMS) for multicast LTs under an independent failure model and a shared risk link group failure model. Based on the LIFPMS, we put forward the minimum failure probability (MFP) problem for the LT-OM scheme. Heuristic approaches are developed to address the MFP problem in elastic optical networks. Numerical results show that the LIFPMS provides an accurate metric for calculating the failure probability of multicast LTs and enhances the reliability of the LT-OM scheme while accommodating multicast services.
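    The role of link importance can be sketched on a tiny hypothetical light-tree under the independent failure model. The topology, failure probabilities, and the first-order importance weighting below are illustrative assumptions; the abstract does not give the LIFPMS formulas themselves.

```python
from math import prod

# Hypothetical light-tree rooted at source "s": child -> parent links with
# independent failure probabilities (illustrative numbers).
parent = {"a": "s", "b": "s", "d1": "a", "d2": "a", "d3": "b"}
p_fail = {("a", "s"): 0.01, ("b", "s"): 0.02,
          ("d1", "a"): 0.005, ("d2", "a"): 0.005, ("d3", "b"): 0.01}
destinations = {"d1", "d2", "d3"}

def importance(link):
    """Number of destinations disconnected if this link fails."""
    child, _ = link
    lost = 0
    for d in destinations:
        node = d
        while node != "s":
            if node == child:
                lost += 1
                break
            node = parent[node]
    return lost

# Plain measure: probability that any link of the tree fails.
p_any_failure = 1.0 - prod(1.0 - p for p in p_fail.values())

# Importance-weighted measure (first order in the small failure
# probabilities): expected fraction of destinations losing the service.
expected_lost_fraction = sum(
    p * importance(link) for link, p in p_fail.items()) / len(destinations)
print(p_any_failure, expected_lost_fraction)
```

    The two measures rank trees differently: a link near the root carries more destinations, so its failure probability is weighted more heavily, which is the intuition behind incorporating link importance.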

  12. Rapidity gaps in jet events at D0

    International Nuclear Information System (INIS)

    Abbott, B.; Abolins, M.; Acharya, B.S.

    1997-07-01

    Preliminary results from the D0 experiment on jet production with rapidity gaps in p anti p collisions are presented. A class of dijet events with a forward rapidity gap is observed at center-of-mass energies √s = 1800 GeV and 630 GeV. The number of events with rapidity gaps at both center-of-mass energies is significantly greater than the expectation from multiplicity fluctuations and is consistent with a hard diffractive process. A class of events with two forward gaps and central dijets is also observed at 1800 GeV. This topology is consistent with hard double pomeron exchange

  13. Mathematical conversations multicolor problems, problems in the theory of numbers, and random walks

    CERN Document Server

    Dynkin, E B

    2006-01-01

    Comprises Multicolor Problems, dealing with map-coloring problems; Problems in the Theory of Numbers, an elementary introduction to algebraic number theory; Random Walks, addressing basic problems in probability theory. 1963 edition.

  14. First-passage Probability Estimation of an Earthquake Response of Seismically Isolated Containment Buildings

    International Nuclear Information System (INIS)

    Hahm, Dae-Gi; Park, Kwan-Soon; Koh, Hyun-Moo

    2008-01-01

    Awareness of seismic hazard and risk is increasing rapidly following the frequent occurrence of huge earthquakes, such as the 2008 Sichuan earthquake, which caused about 70,000 confirmed casualties and an economic loss of some 20 billion U.S. dollars. Since earthquake loads naturally contain various uncertainties, the safety of a structural system under earthquake excitation has been assessed by probabilistic approaches. In many structural applications of probabilistic safety assessment, failure of a system is regarded as occurring when the response of the structure first crosses a limit barrier within a specified interval of time. The determination of such a failure probability is usually called the 'first-passage problem' and has been studied extensively during the last few decades. However, especially for structures that show significantly nonlinear dynamic behavior, an effective and accurate method for estimating such a failure probability is not yet fully established. In this study, we present a new approach to evaluate the first-passage probability of the earthquake response of seismically isolated structures. The proposed method is applied to the seismic isolation system of the containment buildings of a nuclear power plant. From the numerical example, we verified that the proposed method gives accurate results with less computational effort than conventional approaches
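    A brute-force baseline for the first-passage problem is direct Monte Carlo: simulate many response histories and count how often the response first crosses the barrier within the time window. The oscillator below (a Duffing-type hardening spring under white-noise excitation) and all numbers are illustrative stand-ins, not the paper's isolator model or its more efficient estimator.

```python
import numpy as np

rng = np.random.default_rng(1)

def first_passage_prob(barrier, n_paths=2000, T=10.0, dt=0.01):
    """Fraction of sample paths whose |displacement| crosses the limit
    barrier at least once within [0, T] (crude Euler scheme)."""
    x = np.zeros(n_paths)
    v = np.zeros(n_paths)
    surviving = np.ones(n_paths, dtype=bool)
    for _ in range(int(T / dt)):
        # Duffing-type hardening spring as a stand-in for isolator
        # nonlinearity, driven by white-noise base excitation.
        noise = rng.normal(0.0, 1.0, n_paths) / np.sqrt(dt)
        accel = -0.2 * v - x - 0.5 * x**3 + noise
        v = v + accel * dt
        x = x + v * dt
        surviving &= np.abs(x) <= barrier   # once crossed, a path stays failed
    return 1.0 - surviving.mean()

p_low = first_passage_prob(barrier=1.5)
p_high = first_passage_prob(barrier=3.0)
print(p_low, p_high)   # a higher barrier is harder to cross
```

    Direct simulation like this becomes expensive for small failure probabilities, which is exactly the computational burden more efficient first-passage estimators aim to reduce.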

  15. Probability-neighbor method of accelerating geometry treatment in reactor Monte Carlo code RMC

    International Nuclear Information System (INIS)

    She, Ding; Li, Zeguang; Xu, Qi; Wang, Kan; Yu, Ganglin

    2011-01-01

    The probability neighbor method (PNM) is proposed in this paper to accelerate the geometry treatment of Monte Carlo (MC) simulation, and is validated in the self-developed reactor Monte Carlo code RMC. During MC simulation by either the ray-tracking or the delta-tracking method, a large amount of time is spent determining which cell a particle is located in. The traditional way is to search cells one by one in a fixed, previously defined sequence. However, this procedure becomes very time-consuming when the system contains a large number of cells. Considering that particles enter different cells with different probabilities, the PNM optimizes the search sequence, i.e., cells with larger probability are searched preferentially. The PNM is implemented in the RMC code, and numerical results show that it saves considerable geometry-treatment time in MC calculations of complicated systems; it is especially effective in delta-tracking simulation. (author)
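    The idea can be illustrated with a toy 1-D "geometry" (the cells, sizes, and tallying scheme below are hypothetical, not RMC's actual data structures): tally where earlier particles landed, then search the most probable cells first and compare the average number of containment tests against a fixed search order.

```python
import random
from collections import Counter

random.seed(0)

# Hypothetical 1-D geometry: 70% of sampled points land in the "fuel" cell.
cells = {"fuel": range(0, 70), "clad": range(70, 85), "water": range(85, 100)}

def find_cell(point, order):
    """Return (cell name, number of containment tests) for a search order."""
    for checks, name in enumerate(order, start=1):
        if point in cells[name]:
            return name, checks

points = [random.randrange(100) for _ in range(10_000)]

# Traditional search: one fixed, possibly unlucky, sequence.
fixed_order = ["water", "clad", "fuel"]
fixed_cost = sum(find_cell(p, fixed_order)[1] for p in points) / len(points)

# PNM idea: tally where earlier particles landed, then search the most
# probable cells first.
counts = Counter(find_cell(p, fixed_order)[0] for p in points)
pnm_order = [name for name, _ in counts.most_common()]
pnm_cost = sum(find_cell(p, pnm_order)[1] for p in points) / len(points)

print(fixed_cost, pnm_cost)   # probability-ordered search needs fewer tests
```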

  16. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  17. Striatal activity is modulated by target probability.

    Science.gov (United States)

    Hon, Nicholas

    2017-06-14

    Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of the fact that different components of the striatum are sensitive to different types of task-relevant information.

  18. Failure probability analysis of optical grid

    Science.gov (United States)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, an integrated computing environment based on optical networks, is expected to be an efficient infrastructure for supporting advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. As optical-network-based distributed computing systems become widely used for data processing, the application failure probability has become an important indicator of application quality and an important aspect for operators to consider. This paper presents a task-based method for analyzing the application failure probability in an optical grid. The failure probability of an entire application can then be quantified, and the effectiveness of different backup strategies in reducing the application failure probability can be compared, so that the differing requirements of clients can be satisfied. In an optical grid, when an application modeled as a DAG (directed acyclic graph) is executed under different backup strategies, both the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm guarantees the required failure probability while improving network resource utilization, realizing a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in an optical grid.

  19. Saliency Detection via Absorbing Markov Chain With Learnt Transition Probability.

    Science.gov (United States)

    Lihe Zhang; Jianwu Ai; Bowen Jiang; Huchuan Lu; Xiukui Li

    2018-02-01

    In this paper, we propose a bottom-up saliency model based on an absorbing Markov chain (AMC). First, a sparsely connected graph is constructed to capture the local context information of each node. All image boundary nodes are treated as the absorbing nodes, and all other nodes as the transient nodes, of the absorbing Markov chain. Then, the expected number of steps that a random walk starting from each transient node spends among the transient nodes before absorption (the absorbed time) is used to represent the saliency value of that node. The absorbed time depends on the weights on the path and their spatial coordinates, which are completely encoded in the transition probability matrix. Given the importance of this matrix, we adopt different hierarchies of deep features extracted from fully convolutional networks and learn a transition probability matrix, called the learnt transition probability matrix. Although this significantly improves performance, salient objects are still not uniformly highlighted. To solve this problem, an angular embedding technique is investigated to refine the saliency results. Based on pairwise local orderings, which are produced by the saliency maps of the AMC and boundary maps, we rearrange the global orderings (saliency values) of all nodes. Extensive experiments demonstrate that the proposed algorithm outperforms state-of-the-art methods on six publicly available benchmark data sets.
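    The absorbed time is a standard absorbing-chain quantity: with the transient-to-transient block Q of the transition matrix, the fundamental matrix N = (I - Q)^-1 gives expected visit counts, and N·1 gives the expected number of steps before absorption. The tiny chain below is an illustrative stand-in; in the paper the transition probabilities come from learnt deep-feature affinities.

```python
import numpy as np

# Tiny chain: nodes 0-2 are transient (image regions), node 3 is absorbing
# (a boundary node). Transition probabilities are illustrative only.
P = np.array([
    [0.0, 0.5, 0.3, 0.2],
    [0.4, 0.0, 0.4, 0.2],
    [0.1, 0.1, 0.0, 0.8],
    [0.0, 0.0, 0.0, 1.0],   # absorbing boundary node
])
Q = P[:3, :3]                           # transient-to-transient block
N = np.linalg.inv(np.eye(3) - Q)        # fundamental matrix
absorbed_time = N @ np.ones(3)          # expected steps before absorption
print(absorbed_time)   # node 2, which escapes to the boundary fastest,
                       # gets the smallest absorbed time (lowest saliency)
```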

  20. Quantization and Quantum-Like Phenomena: A Number Amplitude Approach

    Science.gov (United States)

    Robinson, T. R.; Haven, E.

    2015-12-01

    Historically, quantization has meant turning the dynamical variables of classical mechanics that are represented by numbers into their corresponding operators. Thus the relationships between classical variables determine the relationships between the corresponding quantum mechanical operators. Here, we take a radically different approach to this conventional quantization procedure. Our approach does not rely on any relations based on classical Hamiltonian or Lagrangian mechanics nor on any canonical quantization relations, nor even on any preconceptions of particle trajectories in space and time. Instead we examine the symmetry properties of certain Hermitian operators with respect to phase changes. This introduces harmonic operators that can be identified with a variety of cyclic systems, from clocks to quantum fields. These operators are shown to have the characteristics of creation and annihilation operators that constitute the primitive fields of quantum field theory. Such an approach not only allows us to recover the Hamiltonian equations of classical mechanics and the Schrödinger wave equation from the fundamental quantization relations, but also, by freeing the quantum formalism from any physical connotation, makes it more directly applicable to non-physical, so-called quantum-like systems. Over the past decade or so, there has been a rapid growth of interest in such applications. These include the use of the Schrödinger equation in finance, second quantization and the number operator in social interactions, population dynamics and financial trading, and quantum probability models in cognitive processes and decision-making. In this paper we try to look beyond physical analogies to provide a foundational underpinning of such applications.
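    The creation/annihilation algebra invoked above can be written down concretely in a finite truncation (a standard quantum-mechanics construction, shown here as a generic sketch rather than the paper's own derivation): the lowering operator a has entries sqrt(n) on its superdiagonal, and the number operator is a†a.

```python
import numpy as np

d = 6                                        # truncation dimension
a = np.diag(np.sqrt(np.arange(1, d)), k=1)   # annihilation (lowering) operator
adag = a.conj().T                            # creation (raising) operator
N = adag @ a                                 # number operator

# In the untruncated theory [a, a†] = 1; a finite truncation reproduces the
# identity everywhere except the last diagonal entry.
comm = a @ adag - adag @ a
print(np.diag(N))        # occupation numbers 0, 1, ..., d-1
```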

  1. Computation of Probabilities in Causal Models of History of Science

    Directory of Open Access Journals (Sweden)

    Osvaldo Pessoa Jr.

    2006-12-01

    Full Text Available The aim of this paper is to investigate the ascription of probabilities in a causal model of an episode in the history of science. The aim of such a quantitative approach is to allow the implementation of the causal model in a computer, to run simulations. As an example, we look at the beginning of the science of magnetism, "explaining" in a probabilistic way, in terms of a single causal model, why the field advanced in China but not in Europe (the difference is due to different prior probabilities of certain cultural manifestations). Given the number of years between the occurrences of two causally connected advances X and Y, one proposes a criterion for stipulating the value p(Y|X) of the conditional probability of an advance Y occurring, given X. Next, one must assume a specific form for the cumulative probability function p(Y|X)(t), which we take to be the time integral of an exponential distribution function, as is done in the physics of radioactive decay. Rules for calculating the cumulative functions for more than two events are mentioned, involving composition, disjunction and conjunction of causes. We also consider the problems involved in supposing that the appearance of events in time follows an exponential distribution, which arise from the fact that a composition of causes does not follow an exponential distribution but a "hypoexponential" one. We suggest that a gamma distribution function might more adequately represent the appearance of advances.
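    The exponential form of the cumulative conditional probability can be sketched directly. The stipulation level of 0.5 below is an assumption for illustration, not necessarily the paper's chosen criterion: given that advance Y followed advance X after dt years, the rate is chosen so p(Y|X)(t) reaches that level at t = dt.

```python
import math

def rate_from_gap(dt_years, level=0.5):
    """Rate lambda such that the cumulative probability hits `level` at dt."""
    return -math.log(1.0 - level) / dt_years

def p_y_given_x(t, lam):
    """Cumulative probability that Y has occurred by time t after X:
    the time integral of an exponential density, 1 - exp(-lambda * t)."""
    return 1.0 - math.exp(-lam * t)

lam = rate_from_gap(20.0)            # e.g. 20 years between X and Y
print(p_y_given_x(20.0, lam), p_y_given_x(40.0, lam))
```

    Composing several such causes yields a hypoexponential rather than exponential waiting time, which is the difficulty the abstract notes and the motivation for the gamma-distribution alternative.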

  2. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  3. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  4. Probability dynamics of a repopulating tumor in case of fractionated external radiotherapy.

    Science.gov (United States)

    Stavreva, Nadia; Stavrev, Pavel; Fallone, B Gino

    2009-12-01

    In this work, two analytical methods are developed for computing the probability distribution of the number of surviving cells of a repopulating tumor during a fractionated external radio-treatment. Both methods are developed for the case of pure birth processes. Both allow description of the tumor dynamics when cell radiosensitivity changes in time, and for treatment schedules with variable dose per fraction and variable time intervals between fractions. The first method is based on a direct solution of the set of differential equations describing the tumor dynamics. The second method is based on the works of Hanin et al. [Hanin LG, Zaider M, Yakovlev AY. Distribution of the number of clonogens surviving fractionated radiotherapy: a long-standing problem revisited. Int J Radiat Biol 2001;77:205-13; Hanin LG. Iterated birth and death process as a model of radiation cell survival. Math Biosci 2001;169:89-107; Hanin LG. A stochastic model of tumor response to fractionated radiation: limit theorems and rate of convergence. Math Biosci 2004;191:1-17], where probability generating functions are used. In addition, a Monte Carlo algorithm for simulating the probability distributions is developed for the same treatment conditions as for the analytical methods. The probability distributions predicted by the three methods are compared graphically for a certain set of values of the model parameters, and excellent agreement is found between all three results, thus confirming the correct implementation of the methods. However, numerical difficulties were encountered with both analytical methods, depending on the values of the model parameters. Therefore, the Poisson approximation is also studied and compared to the exact methods for several different combinations of the model parameter values. It is concluded that the Poisson approximation works sufficiently well only for slowly repopulating tumors and a low cell survival probability and that it
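    A Monte Carlo scheme of the kind described can be sketched for a pure-birth (Yule) tumor: each fraction kills cells independently (binomial thinning), and between fractions each surviving cell founds a clone whose size is geometrically distributed. All parameter values below are illustrative, not fitted values from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Monte Carlo sketch of the surviving-cell distribution for a repopulating
# (pure-birth) tumor under fractionated irradiation.
N0, surv_frac, birth_rate, tau, n_fractions = 100, 0.5, 0.05, 1.0, 10
runs = 5000

p_geo = np.exp(-birth_rate * tau)   # Yule process: clone size ~ Geometric
final = np.zeros(runs, dtype=np.int64)
for r in range(runs):
    n = N0
    for _ in range(n_fractions):
        n = rng.binomial(n, surv_frac)               # independent cell kill
        if n == 0:
            break
        n = int(rng.geometric(p_geo, size=n).sum())  # repopulation in gap tau
    final[r] = n

mean_cells = final.mean()
tcp_mc = np.mean(final == 0)        # simulated tumor control probability
tcp_poisson = np.exp(-mean_cells)   # Poisson approximation with same mean
print(tcp_mc, tcp_poisson)
```

    Comparing the simulated control probability against exp(-mean) at various birth rates and survival fractions is one way to probe where the Poisson approximation breaks down.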

  5. Probability of primordial black hole formation and its dependence on the radial profile of initial configurations

    International Nuclear Information System (INIS)

    Hidalgo, J. C.; Polnarev, A. G.

    2009-01-01

    In this paper we derive the probability of the radial profiles of spherically symmetric inhomogeneities in order to provide an improved estimation of the number density of primordial black holes (PBHs). We demonstrate that the probability of PBH formation depends sensitively on the radial profile of the initial configuration. We do this by characterizing this profile with two parameters chosen heuristically: the amplitude of the inhomogeneity and the second radial derivative, both evaluated at the center of the configuration. We calculate the joint probability of initial cosmological inhomogeneities as a function of these two parameters and then find a correspondence between these parameters and those used in numerical computations of PBH formation. Finally, we extend our heuristic study to evaluate the probability of PBH formation taking into account for the first time the radial profile of curvature inhomogeneities.

  6. Defining Probability in Sex Offender Risk Assessment.

    Science.gov (United States)

    Elwood, Richard W

    2016-12-01

    There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.

  7. UT Biomedical Informatics Lab (BMIL) probability wheel

    Science.gov (United States)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.

  8. Integrating spatial, temporal, and size probabilities for the annual landslide hazard maps in the Shihmen watershed, Taiwan

    Directory of Open Access Journals (Sweden)

    C. Y. Wu

    2013-09-01

    Full Text Available Landslide spatial, temporal, and size probabilities were used to perform a landslide hazard assessment in this study. Eleven intrinsic geomorphological factors and two extrinsic rainfall factors were evaluated as landslide-susceptibility-related factors using success rate curves, landslide ratio plots, frequency distributions of landslide and non-landslide groups, and probability–probability plots. Data on landslides caused by Typhoon Aere in the Shihmen watershed were selected to train the susceptibility model. The landslide area probability, based on the power-law relationship between landslide area and noncumulative number, was analyzed using the Pearson type 5 probability density function. The exceedance probabilities of rainfall with various recurrence intervals, including 2, 5, 10, 20, 50, 100 and 200 yr, were used to determine the temporal probabilities of the events. The study was conducted in the Shihmen watershed, which has an area of 760 km2 and is one of the main water sources for northern Taiwan. Validation against Typhoon Krosa demonstrated that this landslide hazard model can be used to predict landslide probabilities. The results suggest that integrating spatial, area, and exceedance probabilities to estimate the annual probability of each slope unit is feasible. The advantage of this annual landslide probability model lies in its ability to estimate the annual landslide risk, instead of a scenario-based risk.
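    One simple way to combine the three probability components for a single slope unit can be sketched as follows. The conditional failure probabilities, the size probability, and the banding of rainfall scenarios below are hypothetical illustrations of the integration idea, not the study's actual model.

```python
# Annual exceedance probability for a rainfall of recurrence interval T is
# 1/T; the chance that the year's critical rainfall falls within a given
# recurrence band is the difference of successive exceedance probabilities.
recurrences = [2, 5, 10, 20, 50, 100, 200]          # years, as in the study
exceed = [1.0 / T for T in recurrences]
band = [exceed[i] - (exceed[i + 1] if i + 1 < len(exceed) else 0.0)
        for i in range(len(exceed))]

# Hypothetical values for one slope unit: spatial failure probability given
# each rainfall scenario, and the probability the landslide exceeds the
# size of concern.
p_fail_given_rain = [0.01, 0.03, 0.06, 0.10, 0.18, 0.25, 0.35]
p_size = 0.3

annual = sum(b * pf * p_size for b, pf in zip(band, p_fail_given_rain))
print(annual)   # annual landslide probability for this slope unit
```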

  9. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  10. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  11. Selective genomic copy number imbalances and probability of recurrence in early-stage breast cancer.

    Directory of Open Access Journals (Sweden)

    Patricia A Thompson

    Full Text Available A number of studies of copy number imbalances (CNIs) in breast tumors support associations between individual CNIs and patient outcomes. However, no pattern or signature of CNIs has emerged for clinical use. We determined copy number (CN) gains and losses using high-density molecular inversion probe (MIP) arrays for 971 stage I/II breast tumors and applied a boosting strategy to fit hazards models for CN and recurrence, treating chromosomal segments in a dose-specific fashion (−1 [loss], 0 [no change] and +1 [gain]). The concordance index (C-Index) was used to compare prognostic accuracy between a training (n = 728) and test (n = 243) set and across models. Twelve novel prognostic CNIs were identified: losses at 1p12, 12q13.13, 13q12.3, 22q11, and Xp21, and gains at 2p11.1, 3q13.12, 10p11.21, 10q23.1, 11p15, 14q13.2-q13.3, and 17q21.33. In addition, seven CNIs previously implicated as prognostic markers were selected: losses at 8p22 and 16p11.2 and gains at 10p13, 11q13.5, 12p13, 20q13, and Xq28. For all breast cancers combined, the final full model including 19 CNIs, clinical covariates, and tumor marker-approximated subtypes (estrogen receptor [ER], progesterone receptor, ERBB2 amplification, and Ki67) significantly outperformed a model containing only clinical covariates and tumor subtypes (C-Index, full model, train[test] = 0.72[0.71] ± 0.02 vs. C-Index, clinical + subtype model, train[test] = 0.62[0.62] ± 0.02; p < 10⁻⁶). In addition, the full model containing 19 CNIs significantly improved prognostication separately for ER-, HER2+, luminal B, and triple negative tumors over clinical variables alone. In summary, we show that a set of 19 CNIs discriminates risk of recurrence among early-stage breast tumors, independent of ER status. Further, our data suggest the presence of specific CNIs that promote and, in some cases, limit tumor spread.
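    The concordance index (C-Index) used to compare the models measures how often a higher predicted risk coincides with earlier recurrence. A minimal, hedged sketch of the idea — a naive O(n²) version that ignores the censoring details of the actual study; the function and variable names are ours:

```python
def concordance_index(risk, time, event):
    """Naive concordance index for survival predictions.

    Among comparable pairs (the earlier observation experienced the event),
    count how often the model assigns the earlier failure the higher risk;
    tied risk scores count one half.
    """
    concordant, permissible = 0.0, 0
    n = len(risk)
    for i in range(n):
        for j in range(n):
            if time[i] < time[j] and event[i] == 1:   # comparable pair
                permissible += 1
                if risk[i] > risk[j]:
                    concordant += 1.0
                elif risk[i] == risk[j]:
                    concordant += 0.5
    return concordant / permissible

# a perfectly ranked toy example: highest risk recurs first
c = concordance_index([3.0, 2.0, 1.0], [1.0, 2.0, 3.0], [1, 1, 1])
```

A C-Index of 0.5 corresponds to random ranking and 1.0 to perfect ranking, which is why 0.72 vs. 0.62 in the abstract is a meaningful improvement.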

  12. [Biometric bases: basic concepts of probability calculation].

    Science.gov (United States)

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The bases of event algebra, the definition of probability, the classical probability model, and the random variable are presented.

  13. Accelerating rejection-based simulation of biochemical reactions with bounded acceptance probability

    Energy Technology Data Exchange (ETDEWEB)

    Thanh, Vo Hong, E-mail: vo@cosbi.eu [The Microsoft Research - University of Trento Centre for Computational and Systems Biology, Piazza Manifattura 1, Rovereto 38068 (Italy); Priami, Corrado, E-mail: priami@cosbi.eu [The Microsoft Research - University of Trento Centre for Computational and Systems Biology, Piazza Manifattura 1, Rovereto 38068 (Italy); Department of Mathematics, University of Trento, Trento (Italy); Zunino, Roberto, E-mail: roberto.zunino@unitn.it [Department of Mathematics, University of Trento, Trento (Italy)

    2016-06-14

    Stochastic simulation of large biochemical reaction networks is often computationally expensive due to the disparate reaction rates and high variability of the populations of chemical species. An approach to accelerate the simulation is to allow multiple reaction firings before performing an update, assuming that reaction propensities change by a negligible amount during the time interval. Species with small populations involved in the firings of fast reactions significantly affect both the performance and accuracy of this simulation approach, and even more so when these small-population species take part in a large number of reactions. We present in this paper a new approximate algorithm to cope with this problem. It is based on bounding the acceptance probability of a reaction selected by the exact rejection-based simulation algorithm, which employs propensity bounds and a rejection mechanism to select the next reaction firing. Each reaction is guaranteed to be selected to fire with an acceptance rate greater than a predefined probability; the selection becomes exact if that probability is set to one. Our new algorithm reduces the computational cost both of selecting the next reaction firing and of updating the reaction propensities.
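    The selection step described here can be sketched as follows. This is a hedged illustration of rejection-based selection with propensity upper bounds, not the authors' complete algorithm (which also handles firing times and bound updates); the function names are hypothetical.

```python
import random

def select_reaction(propensity_bounds, exact_propensity):
    """One selection step of a rejection-based stochastic simulation:
    pick a candidate reaction cheaply using upper bounds on the propensities,
    then accept it with probability (exact propensity) / (bound).
    The exact propensity is evaluated lazily, only for candidates."""
    total = sum(propensity_bounds)
    while True:
        # candidate j chosen with probability bound_j / total
        r = random.uniform(0.0, total)
        j, acc = 0, propensity_bounds[0]
        while acc < r:
            j += 1
            acc += propensity_bounds[j]
        # acceptance test against the exact propensity
        if random.random() <= exact_propensity(j) / propensity_bounds[j]:
            return j
```

With tight bounds the acceptance probability is one and the step reduces to exact categorical sampling; looser bounds trade extra rejections for cheaper propensity bookkeeping.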

  14. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions... and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate... multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions)...

  15. Measures, Probability and Holography in Cosmology

    Science.gov (United States)

    Phillips, Daniel

    comment on the general implications of this view, and specifically question the application of classical probability theory to cosmology in cases where key questions are known to have no quantum answer. We argue that the ideas developed here may offer a way out of the notorious measure problems of eternal inflation. The fourth project looks at finite universes as alternatives to multiverse theories of cosmology. We compare two holographic arguments that impose especially strong bounds on the amount of inflation. One comes from the de Sitter Equilibrium cosmology and the other from the work of Banks and Fischler. We find that simple versions of these two approaches yield the same bound on the number of e-foldings. A careful examination reveals that while these pictures are similar in spirit, they are not necessarily identical prescriptions. We apply the two pictures to specific cosmologies which expose potentially important differences and which also demonstrate ways these seemingly simple proposals can be tricky to implement in practice.

  16. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels 'possible to occur' or 'impossible to occur' to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated by the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  17. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords : Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics OBOR OECD: Statistics and probability Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  18. Systematics of the breakup probability function for ⁶Li and ⁷Li projectiles

    Energy Technology Data Exchange (ETDEWEB)

    Capurro, O.A., E-mail: capurro@tandar.cnea.gov.ar [Laboratorio TANDAR, Comisión Nacional de Energía Atómica, Av. General Paz 1499, B1650KNA San Martín, Buenos Aires (Argentina); Pacheco, A.J.; Arazi, A. [Laboratorio TANDAR, Comisión Nacional de Energía Atómica, Av. General Paz 1499, B1650KNA San Martín, Buenos Aires (Argentina); CONICET, Av. Rivadavia 1917, C1033AAJ Buenos Aires (Argentina); Carnelli, P.F.F. [CONICET, Av. Rivadavia 1917, C1033AAJ Buenos Aires (Argentina); Instituto de Investigación e Ingeniería Ambiental, Universidad Nacional de San Martín, 25 de Mayo y Francia, B1650BWA San Martín, Buenos Aires (Argentina); Fernández Niello, J.O. [Laboratorio TANDAR, Comisión Nacional de Energía Atómica, Av. General Paz 1499, B1650KNA San Martín, Buenos Aires (Argentina); CONICET, Av. Rivadavia 1917, C1033AAJ Buenos Aires (Argentina); Instituto de Investigación e Ingeniería Ambiental, Universidad Nacional de San Martín, 25 de Mayo y Francia, B1650BWA San Martín, Buenos Aires (Argentina); and others

    2016-01-15

    Experimental non-capture breakup cross sections can be used to determine the probability of projectile and ejectile fragmentation in nuclear reactions involving weakly bound nuclei. Recently, the probability of both types of dissociation has been analyzed in nuclear reactions involving ⁹Be projectiles on various heavy targets at sub-barrier energies. In the present work we extend this kind of systematic analysis to the case of ⁶Li and ⁷Li projectiles, with the purpose of investigating general features of projectile-like breakup probabilities in reactions induced by stable weakly bound nuclei. For that purpose we have obtained the probabilities of projectile and ejectile breakup for a large number of systems, starting from a compilation of the corresponding reported non-capture breakup cross sections. We parametrize the results in accordance with the previous studies of beryllium projectiles, and we discuss their systematic behavior as a function of the projectile, the target mass and the reaction Q-value.

  19. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems

  20. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-01-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, L_st, P_st) for stochastic uncertainty, a probability space (S_su, L_su, P_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, L_st, P_st) and (S_su, L_su, P_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the US Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems

  1. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs. We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.

  2. JINR rapid communications

    International Nuclear Information System (INIS)

    1996-01-01

    The present collection of rapid communications from JINR, Dubna, contains five separate reports on analytic QCD running coupling with finite IR behaviour and universal ᾱ_s(0) value, quark condensate in the interacting pion–nucleon medium at finite temperature and baryon number density, γ–π⁰ discrimination with a shower maximum detector using neural networks for the solenoidal tracker at RHIC, off-specular neutron reflection from magnetic media with nondiagonal reflectivity matrices, and molecular cytogenetics of radiation-induced gene mutations in Drosophila melanogaster. 21 fig., 1 tab

  3. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections

  4. The enigma of probability and physics

    International Nuclear Information System (INIS)

    Mayants, L.

    1984-01-01

    This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (science of probability) and probabilistic physics (application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed; quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)

  5. Uncertainties and quantification of common cause failure rates and probabilities for system analyses

    International Nuclear Information System (INIS)

    Vaurio, Jussi K.

    2005-01-01

    Simultaneous failures of multiple components due to common causes at random times are modelled by constant multiple-failure rates. A procedure is described for quantification of common cause failure (CCF) basic event probabilities for system models using plant-specific and multiple-plant failure-event data. Methodology is presented for estimating CCF-rates from event data contaminated with assessment uncertainties. Generalised impact vectors determine the moments for the rates of individual systems or plants. These moments determine the effective numbers of events and observation times to be input to a Bayesian formalism to obtain plant-specific posterior CCF-rates. The rates are used to determine plant-specific common cause event probabilities for the basic events of explicit fault tree models depending on test intervals, test schedules and repair policies. Three methods are presented to determine these probabilities such that the correct time-average system unavailability can be obtained with single fault tree quantification. Recommended numerical values are given and examples illustrate different aspects of the methodology
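    When the effective numbers of events and observation times feed a conjugate gamma prior, the Bayesian step described above reduces to a simple closed-form update. The sketch below is illustrative only: the prior and data values are invented, and the half-interval unavailability formula is the standard small-rate approximation for a periodically tested standby component, not necessarily the paper's procedure.

```python
def posterior_ccf_rate(prior_shape, prior_time, n_events, obs_time):
    """Gamma-Poisson conjugate update for a constant failure rate:
    prior Gamma(a, b) plus n events in observation time T gives
    posterior Gamma(a + n, b + T).  Returns the posterior mean rate."""
    return (prior_shape + n_events) / (prior_time + obs_time)

def standby_unavailability(rate, test_interval):
    """Time-average unavailability of a periodically tested standby
    component, small-rate approximation: q ~ rate * T_test / 2."""
    return rate * test_interval / 2.0

# illustrative numbers: Jeffreys-type prior, 2 events in 400 component-years
rate = posterior_ccf_rate(0.5, 100.0, 2, 400.0)
q = standby_unavailability(rate, 0.25)   # quarterly tests (0.25 yr)
```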

  6. Imaging multipole gravity anomaly sources by 3D probability tomography

    International Nuclear Information System (INIS)

    Alaia, Raffaele; Patella, Domenico; Mauriello, Paolo

    2009-01-01

    We present a generalized theory of probability tomography applied to the gravity method, assuming that any Bouguer anomaly data set can be caused by a discrete number of monopoles, dipoles, quadrupoles and octopoles. These elementary sources are used to characterize, in as much detail as possible and without any a priori assumption, the shape and position of the most probable minimum structure of the gravity sources compatible with the observed data set, by picking out the location of their centres and of peculiar points on their boundaries related to faces, edges and vertices. A few synthetic examples using simple geometries are discussed in order to demonstrate the notably enhanced resolution power of the new approach, compared with a previous formulation that used only monopoles and dipoles. A field example related to a gravity survey carried out in the volcanic area of Mount Etna (Sicily, Italy) is presented, aimed at imaging the geometry of the minimum gravity structure down to a depth of 8 km below sea level.

  7. Hydra-Ring: a computational framework to combine failure probabilities

    Science.gov (United States)

    Diermanse, Ferdinand; Roscoe, Kathryn; IJmker, Janneke; Mens, Marjolein; Bouwer, Laurens

    2013-04-01

    This presentation discusses the development of a new computational framework for the safety assessment of flood defence systems: Hydra-Ring. Hydra-Ring computes the failure probability of a flood defence system, which is composed of a number of elements (e.g., dike segments, dune segments or hydraulic structures), taking all relevant uncertainties explicitly into account. This is a major step forward in comparison with the current Dutch practice, in which the safety assessment is done separately for each individual flood defence section. The main advantage of the new approach is that it will result in a more balanced prioritization of required mitigating measures ('more value for money'). Failure of the flood defence system occurs if any element within the system fails. Hydra-Ring thus computes and combines failure probabilities of the following elements: - Failure mechanisms: a flood defence system can fail due to different failure mechanisms. - Time periods: failure probabilities are first computed for relatively small time scales. ... Besides the assessment of flood defence systems, Hydra-Ring can also be used to derive fragility curves, to assess the efficiency of flood mitigating measures, and to quantify the impact of climate change and land subsidence on flood risk. Hydra-Ring is being developed in the context of the Dutch situation. However, the computational concept is generic and the model is set up in such a way that it can be applied to other areas as well. The presentation will focus on the model concept and probabilistic computation techniques.
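    The core combination rule (the system fails if any element fails) can be sketched under an independence assumption. Hydra-Ring itself accounts for dependence between elements, mechanisms and time periods, so this is only the independent limiting case; the function name is ours.

```python
def system_failure_probability(element_probs):
    """Failure probability of a series system that fails if any element
    fails, assuming the element failures are independent (a simplifying
    assumption; real flood defence elements are usually correlated)."""
    p_survive = 1.0
    for p in element_probs:
        p_survive *= (1.0 - p)
    return 1.0 - p_survive

# two dike segments with annual failure probabilities of 1% and 2%
p_sys = system_failure_probability([0.01, 0.02])
```

Note that with positively correlated elements the true system failure probability lies between max(p_i) and this independent-case value.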

  8. Mutant number distribution in an exponentially growing population

    Science.gov (United States)

    Keller, Peter; Antal, Tibor

    2015-01-01

    We present an explicit solution to a classic model of cell-population growth introduced by Luria and Delbrück (1943 Genetics 28 491-511) 70 years ago to study the emergence of mutations in bacterial populations. In this model a wild-type population is assumed to grow exponentially in a deterministic fashion. Proportional to the wild-type population size, mutants arrive randomly and initiate new sub-populations of mutants that grow stochastically according to a supercritical birth and death process. We give an exact expression for the generating function of the total number of mutants at a given wild-type population size. We present a simple expression for the probability of finding no mutants, and a recursion formula for the probability of finding a given number of mutants. In the ‘large population-small mutation’ limit we recover recent results of Kessler and Levine (2014 J. Stat. Phys. doi:10.1007/s10955-014-1143-3) for a fully stochastic version of the process.
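    The structure of the model lends itself to a compact Monte Carlo sketch. The code below assumes the pure-birth special case (no mutant death): mutation events are Poisson in number, arrive uniformly in wild-type population size, and each Yule clone's final size is geometric with mean n_final/N_founding. In this special case the probability of finding no mutants is exp(−m), with m the expected number of mutation events. The function names and this simplified setting are ours, not the paper's.

```python
import math
import random

def sample_mutant_count(n_final, mu, rng=random):
    """One draw of the total mutant number in a Luria-Delbrueck-type model
    (pure-birth sketch: deterministic wild type, Yule mutant clones)."""
    m = mu * n_final                        # expected number of mutation events
    # Poisson(m) by inversion (adequate for small m)
    k, term, cum, u = 0, math.exp(-m), math.exp(-m), rng.random()
    while u > cum:
        k += 1
        term *= m / k
        cum += term
    total = 0
    for _ in range(k):
        founding = n_final * (1.0 - rng.random())   # uniform in (0, n_final]
        mean_clone = n_final / founding             # expected final clone size
        if mean_clone <= 1.0:
            total += 1
            continue
        q = 1.0 / mean_clone
        # geometric clone size with mean 1/q, by inverse transform
        total += max(1, math.ceil(math.log(1.0 - rng.random())
                                  / math.log(1.0 - q)))
    return total
```

The heavy (power-law) tail of the mutant-number distribution, the hallmark of the Luria-Delbrück model, comes from the 1/founding factor: early mutations found very large clones.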

  9. The intensity detection of single-photon detectors based on photon counting probability density statistics

    International Nuclear Information System (INIS)

    Zhang Zijing; Song Jie; Zhao Yuan; Wu Long

    2017-01-01

    Single-photon detectors possess ultra-high sensitivity, but they cannot directly respond to signal intensity. Conventional methods adopt sampling gates of fixed width and count the number of triggered gates, from which the photon counting probability can be obtained to estimate the echo signal intensity. In this paper, we not only count the number of triggered sampling gates, but also record the time positions of the photon counting pulses. The photon counting probability density distribution is obtained through the statistics of a series of triggered time positions. The Minimum Variance Unbiased Estimation (MVUE) method is then used to estimate the echo signal intensity. Compared with conventional methods, this method can improve the estimation accuracy of the echo signal intensity, owing to the acquisition of more detection information. Finally, a proof-of-principle laboratory system is established. The estimation accuracy of the echo signal intensity is discussed, and a high-accuracy intensity image is acquired under low-light-level conditions. (paper)
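    For a Poisson-distributed echo, the triggering probability of a gate relates to the mean photon number per gate by P(click) = 1 − exp(−λ). Inverting this gives a standard intensity estimator for click detectors; the sketch below illustrates that inversion only, not the paper's MVUE construction over time-resolved statistics, and the function name is ours.

```python
import math

def estimate_mean_photons(n_gates, n_triggered):
    """Estimate the mean detected photon number per gate from the fraction
    of triggered sampling gates, inverting the Poisson no-click probability:
        P(click) = 1 - exp(-lam)   =>   lam = -ln(1 - k/n)
    """
    p_click = n_triggered / n_gates
    if p_click >= 1.0:
        raise ValueError("all gates triggered: intensity not identifiable")
    return -math.log(1.0 - p_click)

# e.g. 632 triggered gates out of 1000 corresponds to roughly one photon/gate
lam = estimate_mean_photons(1000, 632)
```

The estimator saturates as p_click approaches 1, which is one reason time-resolved statistics (as in the paper) carry more information at higher intensities.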

  10. Integrated failure probability estimation based on structural integrity analysis and failure data: Natural gas pipeline case

    International Nuclear Information System (INIS)

    Dundulis, Gintautas; Žutautaitė, Inga; Janulionis, Remigijus; Ušpuras, Eugenijus; Rimkevičius, Sigitas; Eid, Mohamed

    2016-01-01

    In this paper, the authors present an approach as an overall framework for the estimation of the failure probability of pipelines based on: the results of the deterministic-probabilistic structural integrity analysis (taking into account loads, material properties, geometry, boundary conditions, crack size, and defected zone thickness), the corrosion rate, the number of defects, and failure data (involved in the model via application of the Bayesian method). The proposed approach is applied to estimate the failure probability of a selected part of the Lithuanian natural gas transmission network. The presented approach for the estimation of the integrated failure probability is a combination of several different analyses, allowing us to obtain the critical crack length and depth, the failure probability as a function of the defected zone thickness, and the dependency of the failure probability on the age of the natural gas transmission pipeline. A model uncertainty analysis and an uncertainty propagation analysis are performed as well. - Highlights: • Degradation mechanisms of natural gas transmission pipelines. • Fracture mechanics analysis of the pipe with a crack. • Stress evaluation of the pipe with a critical crack. • Deterministic-probabilistic structural integrity analysis of a gas pipeline. • Integrated estimation of pipeline failure probability by the Bayesian method.

  11. F.Y. Edgeworth’s Treatise on Probabilities

    OpenAIRE

    Alberto Baccini

    2007-01-01

    Probability theory has a central role in Edgeworth’s thought; this paper examines the philosophical foundation of the theory. Starting from a frequentist position, Edgeworth introduced some innovations on the definition of primitive probabilities. He distinguished between primitive probabilities based on experience of statistical evidence, and primitive a priori probabilities based on a more general and less precise kind of experience, inherited by the human race through evolution. Given prim...

  12. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  13. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  14. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  15. Gaussian distribution of LMOV numbers

    Directory of Open Access Journals (Sweden)

    A. Mironov

    2017-11-01

    Full Text Available Recent advances in knot polynomial calculus allowed us to obtain a huge variety of LMOV integers counting the degeneracy of the BPS spectrum of topological theories on the resolved conifold and appearing in the genus expansion of the plethystic logarithm of the Ooguri–Vafa partition functions. Already the very first look at this data reveals that the LMOV numbers are randomly distributed in genus (!) and are very well parameterized by just three parameters depending on the representation, an integer and the knot. We present an accurate formulation and evidence in support of this new puzzling observation about the old puzzling quantities. It probably implies that the BPS states counted by the LMOV numbers can actually be composites made from some still more elementary objects.

  16. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels 'possible to occur' or 'impossible to occur' to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated by the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  17. Assessing the clinical probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Miniati, M.; Pistolesi, M.

    2001-01-01

    Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to predict parameters associated with PE. A score ≤ 4 identified patients with low probability of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3
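The score-based stratification described in the abstract can be written down directly (thresholds and observed PE prevalences are taken from the abstract; the function name is mine):

```python
def clinical_probability(score: int) -> tuple[str, float]:
    """Map a clinical score to its probability class and the PE prevalence
    reported in the abstract: score <= 4 low (10%), 5-8 intermediate (38%),
    >= 9 high (81%)."""
    if score <= 4:
        return ("low", 0.10)
    elif score <= 8:
        return ("intermediate", 0.38)
    return ("high", 0.81)

print(clinical_probability(3))  # ('low', 0.1)
print(clinical_probability(9))  # ('high', 0.81)
```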

  18. Participatory design of probability-based decision support tools for in-hospital nurses.

    Science.gov (United States)

    Jeffery, Alvin D; Novak, Laurie L; Kennedy, Betsy; Dietrich, Mary S; Mion, Lorraine C

    2017-11-01

    To describe nurses' preferences for the design of a probability-based clinical decision support (PB-CDS) tool for in-hospital clinical deterioration. A convenience sample of bedside nurses, charge nurses, and rapid response nurses (n = 20) from adult and pediatric hospitals completed participatory design sessions with researchers in a simulation laboratory to elicit preferred design considerations for a PB-CDS tool. Following theme-based content analysis, we shared findings with user interface designers and created a low-fidelity prototype. Three major themes and several considerations for design elements of a PB-CDS tool surfaced from end users. Themes focused on "painting a picture" of the patient condition over time, promoting empowerment, and aligning probability information with what a nurse already believes about the patient. The most notable design element consideration included visualizing a temporal trend of the predicted probability of the outcome along with user-selected overlapping depictions of vital signs, laboratory values, and outcome-related treatments and interventions. Participants expressed that the prototype adequately operationalized requests from the design sessions. Participatory design served as a valuable method in taking the first step toward developing PB-CDS tools for nurses. This information about preferred design elements of tools that support, rather than interrupt, nurses' cognitive workflows can benefit future studies in this field as well as nurses' practice. Published by Oxford University Press on behalf of the American Medical Informatics Association 2017. This work is written by US Government employees and is in the public domain in the United States.

  19. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.
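The contrast between interpretations can be demonstrated in a few lines: the aprioristic probability of drawing a heart is a ratio of cases, while the frequentist value is a long-run relative frequency. This is a sketch of the kind of card activity described, not the author's exact exercise:

```python
import random

random.seed(1)
deck = [(rank, suit) for rank in range(13) for suit in "SHDC"]

# Aprioristic (classical): favourable cases / possible cases.
p_classical = sum(1 for _, suit in deck if suit == "H") / len(deck)

# Frequentist: relative frequency over many simulated draws with replacement.
n = 100_000
hits = sum(1 for _ in range(n) if random.choice(deck)[1] == "H")
p_frequentist = hits / n

print(p_classical)              # 0.25
print(round(p_frequentist, 2))  # ≈ 0.25
```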

  20. Towards a Categorical Account of Conditional Probability

    Directory of Open Access Journals (Sweden)

    Robert Furber

    2015-11-01

Full Text Available This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and also there are extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.

  1. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C. [Arizona State Univ., Tempe, AZ (United States)

    1996-03-01

A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.
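In this notation the complementary cumulative distribution function (CCDF) construction can be sketched as follows (my paraphrase of the standard two-space formulation, using the symbols above):

```latex
% For a fixed element s_su of the subjective-uncertainty space, stochastic
% uncertainty induces the CCDF
\mathrm{CCDF}(v \mid s_{su}) \;=\; p_{st}\bigl(\{\, s_{st} \in S_{st} : f(s_{st}, s_{su}) > v \,\}\bigr),
% and subjective uncertainty induces a distribution over such CCDFs,
% summarized for example by the expected CCDF
\overline{\mathrm{CCDF}}(v) \;=\; \int_{S_{su}} p_{st}\bigl(\{\, s_{st} \in S_{st} : f(s_{st}, s_{su}) > v \,\}\bigr)\, \mathrm{d}p_{su}(s_{su}).
```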

  2. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    Energy Technology Data Exchange (ETDEWEB)

    Garza, J. [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States); Millwater, H., E-mail: harry.millwater@utsa.edu [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States)

    2012-04-15

A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.
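The sample-reuse idea can be illustrated with a toy Monte Carlo model (my construction, not the authors' code): failure occurs when a flaw both escapes detection and exceeds a critical size, the POD is a lognormal curve POD(a) = Φ((ln a − μ)/σ), and the sensitivity ∂POF/∂μ is accumulated over the same samples used for the POF estimate by differentiating the POD analytically, then checked against a finite difference:

```python
import math
import random

random.seed(42)

def phi(u):  # standard normal pdf
    return math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)

def Phi(u):  # standard normal cdf
    return 0.5 * (1.0 + math.erf(u / math.sqrt(2)))

mu, sigma, a_crit = math.log(2.0), 0.5, 2.5          # POD parameters, critical flaw size
flaws = [random.lognormvariate(math.log(2.2), 0.4) for _ in range(200_000)]

def pof(mu_):
    # POF = mean over sampled flaws of P(miss) * 1{flaw is critical}
    return sum(1.0 - Phi((math.log(a) - mu_) / sigma) for a in flaws if a > a_crit) / len(flaws)

# Sensitivity dPOF/dmu from the *same* samples: d/dmu [1 - Phi(u)] = phi(u) / sigma.
sens = sum(phi((math.log(a) - mu) / sigma) / sigma for a in flaws if a > a_crit) / len(flaws)

fd = (pof(mu + 1e-4) - pof(mu - 1e-4)) / 2e-4        # finite-difference check
print(sens, fd)  # the two estimates agree closely
```

Because the sensitivity reuses the POF samples, it comes at essentially no extra cost, which is the point the abstract makes.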

  3. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2012-01-01

A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.

  4. Complex architecture of primes and natural numbers.

    Science.gov (United States)

    García-Pérez, Guillermo; Serrano, M Ángeles; Boguñá, Marián

    2014-08-01

Natural numbers can be divided into two nonoverlapping infinite sets, primes and composites, with composites factorizing into primes. Despite their apparent simplicity, the elucidation of the architecture of natural numbers with primes as building blocks remains elusive. Here, we propose a new approach to decoding the architecture of natural numbers based on complex networks and stochastic processes theory. We introduce a parameter-free non-Markovian dynamical model that naturally generates random primes and their relation with composite numbers with remarkable accuracy. Our model satisfies the prime number theorem as an emerging property and a refined version of Cramér's conjecture about the statistics of gaps between consecutive primes that seems closer to reality than Cramér's original version. Regarding composites, the model helps us to derive the prime factors counting function, giving the probability of distinct prime factors for any integer. Probabilistic models like ours can help to get deeper insights about primes and the complex architecture of natural numbers.
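The prime number theorem invoked above, π(n) ≈ n / ln n, is easy to verify numerically with a standard sieve; this illustrates the emergent property the model reproduces, not the authors' stochastic model itself:

```python
import math

def prime_count(n):
    """pi(n): count of primes <= n, via a simple sieve of Eratosthenes."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = bytearray(len(sieve[p * p :: p]))
    return sum(sieve)

n = 1_000_000
ratio = prime_count(n) / (n / math.log(n))
print(prime_count(n), round(ratio, 3))  # 78498 primes below 1e6; ratio tends to 1 slowly
```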

  5. PROBABILITY CALIBRATION BY THE MINIMUM AND MAXIMUM PROBABILITY SCORES IN ONE-CLASS BAYES LEARNING FOR ANOMALY DETECTION

    Data.gov (United States)

    National Aeronautics and Space Administration — PROBABILITY CALIBRATION BY THE MINIMUM AND MAXIMUM PROBABILITY SCORES IN ONE-CLASS BAYES LEARNING FOR ANOMALY DETECTION GUICHONG LI, NATHALIE JAPKOWICZ, IAN HOFFMAN,...

  6. Calculating failure probabilities for TRISO-coated fuel particles using an integral formulation

    International Nuclear Information System (INIS)

    Miller, Gregory K.; Maki, John T.; Knudson, Darrell L.; Petti, David A.

    2010-01-01

    The fundamental design for a gas-cooled reactor relies on the safe behavior of the coated particle fuel. The coating layers surrounding the fuel kernels in these spherical particles, termed the TRISO coating, act as a pressure vessel that retains fission products. The quality of the fuel is reflected in the number of particle failures that occur during reactor operation, where failed particles become a source for fission products that can then diffuse through the fuel element. The failure probability for any batch of particles, which has traditionally been calculated using the Monte Carlo method, depends on statistical variations in design parameters and on variations in the strengths of coating layers among particles in the batch. An alternative approach to calculating failure probabilities is developed herein that uses direct numerical integration of a failure probability integral. Because this is a multiple integral where the statistically varying parameters become integration variables, a fast numerical integration approach is also developed. In sample cases analyzed involving multiple failure mechanisms, results from the integration methods agree closely with Monte Carlo results. Additionally, the fast integration approach, particularly, is shown to significantly improve efficiency of failure probability calculations. These integration methods have been implemented in the PARFUME fuel performance code along with the Monte Carlo method, where each serves to verify accuracy of the others.
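The contrast between direct integration and Monte Carlo can be sketched for a one-dimensional stress-strength problem (a toy example with parameters of my choosing, not the PARFUME model): strength is Weibull-distributed, stress is normal, and the failure probability P(stress > strength) is computed both ways:

```python
import math
import random

random.seed(0)

# Weibull strength (shape k, scale lam) and normal stress (mu, sigma) - illustrative values.
k, lam = 6.0, 400.0
mu, sigma = 250.0, 40.0

def weibull_pdf(s):
    return (k / lam) * (s / lam) ** (k - 1) * math.exp(-((s / lam) ** k))

def stress_exceeds(s):  # P(stress > s) for normal stress
    return 0.5 * math.erfc((s - mu) / (sigma * math.sqrt(2)))

# Direct numerical integration of the failure-probability integral (trapezoid rule):
# POF = integral of f_strength(s) * P(stress > s) ds.
n, hi = 20_000, 1000.0
h = hi / n
pof_int = h * sum(weibull_pdf(i * h) * stress_exceeds(i * h) for i in range(1, n))

# Monte Carlo estimate of the same quantity.
m = 200_000
pof_mc = sum(random.gauss(mu, sigma) > random.weibullvariate(lam, k) for _ in range(m)) / m

print(pof_int, pof_mc)  # the two estimates agree to sampling accuracy
```

As in the paper, the deterministic integral gives a smooth, sampling-noise-free answer, while Monte Carlo serves as an independent verification.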

  7. Foundations of the theory of probability

    CERN Document Server

    Kolmogorov, AN

    2018-01-01

    This famous little book remains a foundational text for the understanding of probability theory, important both to students beginning a serious study of probability and to historians of modern mathematics. 1956 second edition.

  8. Fluid dynamic propagation of initial baryon number perturbations on a Bjorken flow background

    CERN Document Server

    Floerchinger, Stefan

    2015-01-01

    Baryon number density perturbations offer a possible route to experimentally measure baryon number susceptibilities and heat conductivity of the quark gluon plasma. We study the fluid dynamical evolution of local and event-by-event fluctuations of baryon number density, flow velocity and energy density on top of a (generalized) Bjorken expansion. To that end we use a background-fluctuation splitting and a Bessel-Fourier decomposition for the fluctuating part of the fluid dynamical fields with respect to the azimuthal angle, the radius in the transverse plane and rapidity. We examine how the time evolution of linear perturbations depends on the equation of state as well as on shear viscosity, bulk viscosity and heat conductivity for modes with different azimuthal, radial and rapidity wave numbers. Finally we discuss how this information is accessible to experiments in terms of the transverse and rapidity dependence of correlation functions for baryonic particles in high energy nuclear collisions.

  9. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition""This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory.""  - The StatisticianThoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  10. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    Science.gov (United States)

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome , which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the

  11. Transition probability spaces in loop quantum gravity

    Science.gov (United States)

    Guo, Xiao-Kan

    2018-03-01

    We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.

  12. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  13. RapidRMSD: Rapid determination of RMSDs corresponding to motions of flexible molecules.

    Science.gov (United States)

    Neveu, Emilie; Popov, Petr; Hoffmann, Alexandre; Migliosi, Angelo; Besseron, Xavier; Danoy, Grégoire; Bouvry, Pascal; Grudinin, Sergei

    2018-03-15

The root mean square deviation (RMSD) is one of the most used similarity criteria in structural biology and bioinformatics. Standard computation of the RMSD has a linear complexity with respect to the number of atoms in a molecule, making RMSD calculations time-consuming for the large-scale modeling applications, such as assessment of molecular docking predictions or clustering of spatially proximate molecular conformations. Previously we introduced the RigidRMSD algorithm to compute the RMSD corresponding to the rigid-body motion of a molecule. In this study we go beyond the limits of the rigid-body approximation by taking into account conformational flexibility of the molecule. We model the flexibility with a reduced set of collective motions computed with e.g. normal modes or principal component analysis. The initialization of our algorithm is linear in the number of atoms and all the subsequent evaluations of RMSD values between flexible molecular conformations depend only on the number of collective motions that are selected to model the flexibility. Therefore, our algorithm is much faster compared to the standard RMSD computation for large-scale modeling applications. We demonstrate the efficiency of our method on several clustering examples, including clustering of flexible docking results and molecular dynamics (MD) trajectories. We also demonstrate how to use the presented formalism to generate pseudo-random constant-RMSD structural molecular ensembles and how to use these in cross-docking. We provide the algorithm written in C++ as the open-source RapidRMSD library governed by the BSD-compatible license, which is available at http://team.inria.fr/nano-d/software/RapidRMSD/. The constant-RMSD structural ensemble application and clustering of MD trajectories is available at http://team.inria.fr/nano-d/software/nolb-normal-modes/. sergei.grudinin@inria.fr. Supplementary data are available at Bioinformatics.
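The key trick stated in the abstract, evaluations that depend only on the number of collective motions after a linear-in-atoms initialization, can be sketched in a few lines (my notation, not the RapidRMSD code): if conformations are x = x0 + A·c with a 3N×k mode matrix A, precompute the k×k Gram matrix G = AᵀA once; then RMSD(c1, c2) = sqrt((Δc)ᵀ G Δc / N) costs O(k²) per pair instead of O(N):

```python
import math
import random

random.seed(7)
N, k = 500, 5  # atoms, collective modes
A = [[random.gauss(0, 1) for _ in range(k)] for _ in range(3 * N)]  # mode matrix, 3N x k

# One-time initialization, linear in the number of atoms: Gram matrix G = A^T A (k x k).
G = [[sum(A[r][i] * A[r][j] for r in range(3 * N)) for j in range(k)] for i in range(k)]

def rmsd_fast(c1, c2):
    """RMSD between conformations x0 + A c1 and x0 + A c2, O(k^2) per evaluation."""
    d = [a - b for a, b in zip(c1, c2)]
    return math.sqrt(sum(d[i] * G[i][j] * d[j] for i in range(k) for j in range(k)) / N)

def rmsd_slow(c1, c2):
    """Reference O(N) computation in Cartesian coordinates."""
    d = [sum(A[r][i] * (c1[i] - c2[i]) for i in range(k)) for r in range(3 * N)]
    return math.sqrt(sum(x * x for x in d) / N)

c1 = [random.gauss(0, 1) for _ in range(k)]
c2 = [random.gauss(0, 1) for _ in range(k)]
print(abs(rmsd_fast(c1, c2) - rmsd_slow(c1, c2)) < 1e-9)  # True: identical results
```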

  14. Two-slit experiment: quantum and classical probabilities

    International Nuclear Information System (INIS)

    Khrennikov, Andrei

    2015-01-01

    Inter-relation between quantum and classical probability models is one of the most fundamental problems of quantum foundations. Nowadays this problem also plays an important role in quantum technologies, in quantum cryptography and the theory of quantum random generators. In this letter, we compare the viewpoint of Richard Feynman that the behavior of quantum particles cannot be described by classical probability theory with the viewpoint that quantum–classical inter-relation is more complicated (cf, in particular, with the tomographic model of quantum mechanics developed in detail by Vladimir Man'ko). As a basic example, we consider the two-slit experiment, which played a crucial role in quantum foundational debates at the beginning of quantum mechanics (QM). In particular, its analysis led Niels Bohr to the formulation of the principle of complementarity. First, we demonstrate that in complete accordance with Feynman's viewpoint, the probabilities for the two-slit experiment have the non-Kolmogorovian structure, since they violate one of basic laws of classical probability theory, the law of total probability (the heart of the Bayesian analysis). However, then we show that these probabilities can be embedded in a natural way into the classical (Kolmogorov, 1933) probability model. To do this, one has to take into account the randomness of selection of different experimental contexts, the joint consideration of which led Feynman to a conclusion about the non-classicality of quantum probability. We compare this embedding of non-Kolmogorovian quantum probabilities into the Kolmogorov model with well-known embeddings of non-Euclidean geometries into Euclidean space (e.g., the Poincaré disk model for the Lobachvesky plane). (paper)
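The violation of the law of total probability mentioned above can be made concrete numerically (a schematic illustration with made-up single-slit probabilities, not Khrennikov's tomographic construction): with contexts chosen with probability 1/2 each, the quantum two-slit formula adds an interference term sqrt(p1·p2)·cos(δ) that the classical law lacks:

```python
import math

# Single-slit detection probabilities at one detector point, and the relative phase.
p1, p2, delta = 0.2, 0.3, 1.0
w = 0.5  # each slit context selected with probability 1/2

# Classical law of total probability:
p_classical = w * p1 + w * p2

# Quantum two-slit formula carries an extra interference term:
p_quantum = p_classical + math.sqrt(p1 * p2) * math.cos(delta)

print(round(p_classical, 3), round(p_quantum, 3))
# The interference term sqrt(p1*p2)*cos(delta) != 0, so the law of total probability fails.
```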

  15. Integrating Preventive Maintenance Scheduling As Probability Machine Failure And Batch Production Scheduling

    Directory of Open Access Journals (Sweden)

    Zahedi Zahedi

    2016-06-01

Full Text Available This paper discusses an integrated model of batch production scheduling and machine maintenance scheduling. Batch production scheduling uses the criterion of minimizing total actual flow time, and machine maintenance scheduling uses the probability of machine failure based on the Weibull distribution. The model assumes no nonconforming parts in the planning horizon. The model shows that an increase in the number of batches (length of production run) up to a certain limit will minimize the total actual flow time. Meanwhile, an increase in the length of the production run will imply an increase in the number of PM actions. An example is given to show how the model and algorithm work.
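The machine-failure ingredient follows directly from the Weibull assumption (parameter values below are mine, for illustration): the probability that the machine fails during a production run of length t is F(t) = 1 − exp(−(t/η)^β), so longer runs raise the failure probability and hence the expected number of preventive-maintenance (PM) actions:

```python
import math

beta, eta = 2.0, 100.0  # Weibull shape and scale (illustrative values)

def failure_probability(t):
    """Probability the machine fails within a production run of length t."""
    return 1.0 - math.exp(-((t / eta) ** beta))

for t in (25, 50, 100):
    print(t, round(failure_probability(t), 3))
# Failure probability increases with the length of the production run.
```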

  16. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

"This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines." - CHOICE. Providing cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  17. Computation of the Complex Probability Function

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-22

The complex probability function is important in many areas of physics and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the nth degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements on the Gauss-Hermite quadrature for the complex probability function.
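The Gauss-Hermite idea is compact: the n Hermite roots x_k and weights w_k turn the defining integral w(z) = (i/π) ∫ e^(−t²)/(z − t) dt (for Im z > 0) into the sum (i/π) Σ w_k/(z − x_k). A minimal sketch using NumPy's `hermgauss`, checked against the exact value w(iy) = e^(y²) erfc(y) (my example point z = 2i; accuracy degrades as z approaches the real axis):

```python
import math
import numpy as np

def faddeeva_gh(z, n=64):
    """Gauss-Hermite approximation of the complex probability (Faddeeva) function,
    w(z) ~ (i/pi) * sum_k w_k / (z - x_k), valid for Im(z) > 0."""
    x, w = np.polynomial.hermite.hermgauss(n)
    return (1j / math.pi) * np.sum(w / (z - x))

z = 2j
exact = math.exp(4.0) * math.erfc(2.0)  # w(i*y) = exp(y^2) * erfc(y) for real y > 0
approx = faddeeva_gh(z)
print(abs(approx - exact) / exact)  # small relative error away from the real axis
```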

  18. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

Max Lotstein and Phil Johnson-Laird, Department of Psychology, Princeton University, Princeton, NJ, USA, August 30th 2012. ...social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as... retorted that such a flagrant violation of the probability calculus was a result of a psychological experiment that obscured the rationality of the

  19. Measurement of the Mis-identification Probability of τ Leptons from Hadronic Jets and from Electrons

    CERN Document Server

    The ATLAS collaboration

    2011-01-01

Measurements of the mis-identification probability of QCD jets and electrons as hadronically decaying τ leptons using tag-and-probe methods are described. The analyses are based on 35 pb−1 of proton-proton collision data, taken by the ATLAS experiment at a center-of-mass energy of sqrt(s) = 7 TeV. The mis-identification probabilities range between 10% and 0.1% for QCD jets, and about (1 − 2)% for electrons. They depend on the identification algorithm chosen, the pT and the number of prongs of the τ candidate, and on the amount of pile up present in the event.

  20. Rate of atrazine mineralisation in New Zealand topsoils and subsoils depends on numbers of specialist atrazine-degrading microorganisms

    International Nuclear Information System (INIS)

    Sparling, G.; Fraser, R.; Aislabie, J.; Dragten, R.

    1998-01-01

Full text: The herbicide atrazine (2-chloro-4-ethylamino-6-isopropylamino-1,3,5-s-triazine) is widely used in horticulture and arable farming in New Zealand and there is a trend towards increasing concentrations in aquifers and ground waters. Microbial degradation is considered a major route whereby atrazine is decomposed in soil. Microbial activity declines rapidly with depth of soil, so to predict the risks of atrazine reaching aquifers, we need to know the rates of mineralisation at different depths in the soil profile. We measured the rates of mineralisation of [U-¹⁴C]-ring-labelled atrazine in topsoils and subsoils of two sandy loam soils and an allophanic soil under a range of temperature and moisture conditions. The numbers of atrazine-degrading organisms were measured using a most probable number (MPN) method based on the mineralisation of [U-¹⁴C]-ring-labelled atrazine to ¹⁴CO₂. Numbers of atrazine-degraders and rates of mineralisation were generally very low in subsoils. However, one subsoil had unusually high numbers of atrazine-degrading microbes and showed mineralisation rates equivalent to those in the surface soil. The rate of atrazine mineralisation could be predicted from the number of atrazine-degrading microbes and the cation exchange capacity of the soil (R² = 0.86). A large amount (54-77%) of ¹⁴C remained in the soil as non-extractable residues after 263 days but only trace amounts of atrazine were detectable
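The most probable number estimate itself is a maximum-likelihood computation over a dilution series and is easy to reproduce (a generic MPN solver, not the authors' 14C-based assay protocol): given n_i tubes per dilution with inoculum v_i and p_i positives, the MLE for the density λ solves Σ p_i·v_i/(e^(λ·v_i) − 1) = Σ (n_i − p_i)·v_i:

```python
import math

def mpn(volumes, tubes, positives, lo=1e-6, hi=1e6):
    """Maximum-likelihood most probable number (organisms per unit of sample),
    solved by bisection on the standard MPN score equation."""
    def score(lam):
        left = 0.0
        for v, p in zip(volumes, positives):
            if p and lam * v < 700:            # guard against exp overflow
                left += p * v / math.expm1(lam * v)
        right = sum((n - p) * v for v, n, p in zip(volumes, tubes, positives))
        return left - right                    # decreasing in lam
    for _ in range(200):
        mid = math.sqrt(lo * hi)               # geometric bisection suits the wide range
        if score(mid) > 0:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)

# Classic 3-tube series with 0.1, 0.01 and 0.001 g portions and positive pattern 3-2-0:
est = mpn([0.1, 0.01, 0.001], [3, 3, 3], [3, 2, 0])
print(round(est))  # ~93 organisms per gram, matching the standard MPN table value
```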

  1. A mass spectrometer for the rapid analysis of gaseous mixtures

    International Nuclear Information System (INIS)

    Cassignol, C.; Ortel, Y.; Taieb, J.

    1950-01-01

A mass spectrometer for leak detection and rapid gas analysis was constructed, having the characteristics and several structural features of a simple instrument described by Siry in Rev. Sci. Instrum. 540 (1947). Although exhibiting a good resolving power, the apparatus, which has no ion lenses and whose electrodes can be regulated during operation, has not been sufficiently tested. Since several design defects have been discovered, it will probably be rebuilt with various improvements (ion source outside the magnetic field, modified circuits, etc.). (author)

  2. Probability theory plus noise: Replies to Crupi and Tentori (2016) and to Nilsson, Juslin, and Winman (2016).

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2016-01-01

A standard assumption in much of current psychology is that people do not reason about probability using the rules of probability theory but instead use various heuristics or "rules of thumb," which can produce systematic reasoning biases. In Costello and Watts (2014), we showed that a number of these biases can be explained by a model where people reason according to probability theory but are subject to random noise. More importantly, that model also predicted agreement with probability theory for certain expressions that cancel the effects of random noise: Experimental results strongly confirmed this prediction, showing that probabilistic reasoning is simultaneously systematically biased and "surprisingly rational." In their commentaries on that paper, both Crupi and Tentori (2016) and Nilsson, Juslin, and Winman (2016) point to various experimental results that, they suggest, our model cannot explain. In this reply, we show that our probability theory plus noise model can in fact explain every one of the results identified by these authors. This gives a degree of additional support to the view that people's probability judgments embody the rational rules of probability theory and that biases in those judgments can be explained as simply effects of random noise. (c) 2015 APA, all rights reserved.
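The noise model's signature prediction is easy to simulate (a simplified version of the probability-theory-plus-noise idea; parameters are mine): if each judgment is the mean of K noisy reads that flip with probability d, then E[p̂] = (1 − 2d)p + d is biased, while the addition-law expression p̂(A) + p̂(B) − p̂(A∧B) − p̂(A∨B) has expectation zero because the noise terms cancel:

```python
import random

random.seed(3)
d, K, trials = 0.1, 20, 20_000
pA, pB = 0.3, 0.4  # independent events, so pAB = 0.12, pAorB = 0.58

def noisy_estimate(p):
    """Mean of K Bernoulli(p) reads, each flipped with probability d."""
    reads = 0
    for _ in range(K):
        bit = random.random() < p
        if random.random() < d:
            bit = not bit
        reads += bit
    return reads / K

sum_expr, sum_A = 0.0, 0.0
for _ in range(trials):
    eA, eB = noisy_estimate(pA), noisy_estimate(pB)
    eAB, eAorB = noisy_estimate(pA * pB), noisy_estimate(pA + pB - pA * pB)
    sum_expr += eA + eB - eAB - eAorB
    sum_A += eA

print(round(sum_A / trials, 2))     # ~0.34 = (1 - 2d)*0.3 + d: biased away from 0.3
print(round(sum_expr / trials, 2))  # ~0.00: the addition-law expression cancels the noise
```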

  3. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under and over fitting data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.
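
The scoring idea can be made concrete with a simplified stand-in: under a correct trial CDF F, the transformed order statistics F(x_(i)) of an i.i.d. sample should resemble uniform order statistics with means i/(n+1), so atypical deviations flag a poor density estimate. The score below is a deliberately crude KS-style proxy, not the paper's sample-size-invariant scoring function:

```python
import math
import random

def uniformity_score(sample, trial_cdf):
    """Maximum deviation of the transformed order statistics from their
    expected uniform positions i/(n+1); smaller means a better trial CDF."""
    u = sorted(trial_cdf(x) for x in sample)
    n = len(u)
    return max(abs(ui - (i + 1) / (n + 1)) for i, ui in enumerate(u))

def Phi(x):
    """Standard normal CDF (the correct model for the synthetic data)."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

rng = random.Random(0)
data = [rng.gauss(0, 1) for _ in range(500)]

good_score = uniformity_score(data, Phi)            # correct CDF
bad_score = uniformity_score(data, lambda x: Phi(x / 2))  # wrong scale
```

Iteratively adjusting a trial CDF to shrink such a score is the spirit of the procedure; the paper's actual score is empirically calibrated to be invariant to sample size.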

  4. Probability and rational choice

    Directory of Open Access Journals (Sweden)

    David Botting

    2014-05-01

    Full Text Available http://dx.doi.org/10.5007/1808-1711.2014v18n1p1 In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis and inferring to the next instance (singular predictive inference can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.

  5. "I Don't Really Understand Probability at All": Final Year Pre-Service Teachers' Understanding of Probability

    Science.gov (United States)

    Maher, Nicole; Muir, Tracey

    2014-01-01

    This paper reports on one aspect of a wider study that investigated a selection of final year pre-service primary teachers' responses to four probability tasks. The tasks focused on foundational ideas of probability including sample space, independence, variation and expectation. Responses suggested that strongly held intuitions appeared to…

  6. On Z-dependence of probability of atomic capture of mesons in matter

    International Nuclear Information System (INIS)

    Vasil'ev, V.A.; Petrukhin, V.I.; Suvorov, V.M.; Khorvat, D.

    1976-01-01

    All experimental data available on the atomic capture of negative muons and pions are systematically studied to find a more appropriate empirical expression for the capture probability as a function of the atomic number. It is shown that Z-dependence, as a rule, does not hold. Zsup(1/3)-dependence gives more satisfactory results. A modified Zsup(1/3)-dependence is proposed which is more appropriate for hydrogen-containing compounds

  7. Probability of spent fuel transportation accidents

    International Nuclear Information System (INIS)

    McClure, J.D.

    1981-07-01

    The transported volume of spent fuel, incident/accident experience and accident environment probabilities were reviewed in order to provide an estimate of spent fuel accident probabilities. In particular, the accident review assessed the accident experience for large casks of the type that could transport spent (irradiated) nuclear fuel. This review determined that since 1971, the beginning of official US Department of Transportation record keeping for accidents/incidents, there has been one spent fuel transportation accident. This information, coupled with estimated annual shipping volumes for spent fuel, indicated an estimated annual probability of a spent fuel transport accident of 5 x 10⁻⁷ spent fuel accidents per mile. This is consistent with ordinary truck accident rates. A comparison of accident environments and regulatory test environments suggests that the probability of truck accidents exceeding regulatory test for impact is approximately 10⁻⁹/mile
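
The per-mile rates quoted above translate directly into per-shipment expectations; the shipment distance below is a hypothetical example, not a figure from the review:

```python
# Rates from the review above (per mile travelled)
accident_rate_per_mile = 5e-7   # any spent fuel transport accident
severe_rate_per_mile = 1e-9     # accident exceeding regulatory impact test conditions

shipment_miles = 1_000          # hypothetical one-way shipment distance

expected_accidents = accident_rate_per_mile * shipment_miles  # 5e-4 per shipment
expected_severe = severe_rate_per_mile * shipment_miles       # 1e-6 per shipment
```

Even over a thousand-mile route, the expected number of accidents exceeding the regulatory impact test is on the order of one in a million per shipment.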

  8. Knotting probabilities after a local strand passage in unknotted self-avoiding polygons

    International Nuclear Information System (INIS)

    Szafron, M L; Soteros, C E

    2011-01-01

    We investigate, both theoretically and numerically, the knotting probabilities after a local strand passage is performed in an unknotted self-avoiding polygon (SAP) on the simple cubic lattice. In the polygons studied, it is assumed that two polygon segments have already been brought close together for the purpose of performing a strand passage. This restricts the polygons considered to those that contain a specific pattern called Θ at a fixed location; an unknotted polygon containing Θ is called a Θ-SAP. It is proved that the number of n-edge Θ-SAPs grows exponentially (with n) at the same rate as the total number of n-edge unknotted SAPs (those with no prespecified strand passage structure). Furthermore, it is proved that the same holds for subsets of n-edge Θ-SAPs that yield a specific after-strand-passage knot-type. Thus, the probability of a given after-strand-passage knot-type does not grow (or decay) exponentially with n. Instead, it is conjectured that these after-strand-passage knot probabilities approach, as n goes to infinity, knot-type dependent amplitude ratios lying strictly between 0 and 1. This conjecture is supported by numerical evidence from Monte Carlo data generated using a composite (aka multiple) Markov chain Monte Carlo BFACF algorithm developed to study Θ-SAPs. A new maximum likelihood method is used to estimate the critical exponents relevant to this conjecture. We also obtain strong numerical evidence that the after-strand-passage knotting probability depends on the local structure around the strand-passage site. If the local structure and the crossing sign at the strand-passage site are considered, then we observe that the more 'compact' the local structure, the less likely the after-strand-passage polygon is to be knotted. This trend for compactness versus knotting probability is consistent with results obtained for other strand-passage models; however, we are the first to note the influence of the crossing-sign information. We

  9. Direct probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.

    1993-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration
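
The post-processing step described here, converting a stack of equally likely realizations into a map of exceedance probabilities, amounts to counting, per grid cell, the fraction of realizations above an action level. A toy sketch with synthetic lognormal values standing in for simulated contaminant concentrations (the grid size, number of realizations, and threshold are all invented):

```python
import random

# Stack of equally likely simulated contamination maps (realizations),
# each a small grid of synthetic concentration values.
rng = random.Random(42)
n_real, rows, cols = 500, 4, 4
realizations = [[[rng.lognormvariate(0, 1) for _ in range(cols)]
                 for _ in range(rows)] for _ in range(n_real)]

threshold = 1.0  # hypothetical cleanup action level

# Probability map: per cell, the fraction of realizations exceeding the threshold.
prob_map = [[sum(r[i][j] > threshold for r in realizations) / n_real
             for j in range(cols)] for i in range(rows)]
```

Each cell of `prob_map` directly shows the probability of exceeding the specified contamination level, which is the quantity used to decide whether a selective remediation unit needs treatment.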

  10. Effect of Reynolds Number on Aerodynamics of Airfoil with Gurney Flap

    Directory of Open Access Journals (Sweden)

    Shubham Jain

    2015-01-01

    Full Text Available Steady state, two-dimensional computational investigations performed on a NACA 0012 airfoil to analyze the effect of variation in Reynolds number on the aerodynamics of the airfoil without and with a Gurney flap of height of 3% chord are presented in this paper. A RANS-based one-equation Spalart-Allmaras model is used for the computations. Both lift and drag coefficients increase with the Gurney flap compared to those without it, at all Reynolds numbers and all angles of attack. The zero-lift angle of attack seems to become more negative as Reynolds number increases, due to the effective increase of the airfoil camber. However, the stall angle of attack decreased by 2° for the airfoil with the Gurney flap. The lift coefficient decreases rapidly and the drag coefficient increases rapidly when the Reynolds number is decreased below the critical range. This occurs due to a change in flow pattern near the Gurney flap at low Reynolds numbers.

  11. Teaching Probability: A Socio-Constructivist Perspective

    Science.gov (United States)

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.

  12. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...

  13. Probability Matching, Fast and Slow

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2014-01-01

    A prominent point of contention among researchers regarding the interpretation of probability-matching behavior is whether it represents a cognitively sophisticated, adaptive response to the inherent uncertainty of the tasks or settings in which it is observed, or whether instead it represents a fundamental shortcoming in the heuristics that support and guide human decision making. Put crudely, researchers disagree on whether probability matching is "smart" or "dumb." Here, we consider eviden...

  14. Rapid scanning system for fuel drawers

    International Nuclear Information System (INIS)

    Caldwell, J.T.; Fehlau, P.E.; France, S.W.

    1981-01-01

    A nondestructive method for uniquely distinguishing among and quantifying the mass of individual fuel plates in situ in fuel drawers utilized in nuclear reactors is described. The method is both rapid and passive, eliminating the personnel hazard of the commonly used irradiation techniques, which require that the analysis be performed in proximity to an intense neutron source such as a reactor. In the present technique, only normally decaying nuclei are observed, which allows the analysis to be performed anywhere. This feature, combined with rapid scanning of a given fuel drawer (in approximately 30 s) and computer data analysis, allows large numbers of fuel drawers to be processed efficiently in the event of a loss alert

  15. Posterior Probability Matching and Human Perceptual Decision Making.

    Directory of Open Access Journals (Sweden)

    Richard F Murray

    2015-06-01

    Full Text Available Probability matching is a classic theory of decision making that was first developed in models of cognition. Posterior probability matching, a variant in which observers match their response probabilities to the posterior probability of each response being correct, is being used increasingly often in models of perception. However, little is known about whether posterior probability matching is consistent with the vast literature on vision and hearing that has developed within signal detection theory. Here we test posterior probability matching models using two tools from detection theory. First, we examine the models' performance in a two-pass experiment, where each block of trials is presented twice, and we measure the proportion of times that the model gives the same response twice to repeated stimuli. We show that at low performance levels, posterior probability matching models give highly inconsistent responses across repeated presentations of identical trials. We find that practised human observers are more consistent across repeated trials than these models predict, and we find some evidence that less practised observers are more consistent as well. Second, we compare the performance of posterior probability matching models on a discrimination task to the performance of a theoretical ideal observer that achieves the best possible performance. We find that posterior probability matching is very inefficient at low-to-moderate performance levels, and that human observers can be more efficient than is ever possible according to posterior probability matching models. These findings support classic signal detection models, and rule out a broad class of posterior probability matching models for expert performance on perceptual tasks that range in complexity from contrast discrimination to symmetry detection.
However, our findings leave open the possibility that inexperienced observers may show posterior probability matching behaviour, and our methods
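
The two-pass logic above is easy to make concrete: if a matching observer responds "A" with probability q independently on each pass of a repeated trial, the chance of giving the same response twice is q² + (1 - q)², which bottoms out at 0.5 when performance is at chance, far below the perfect self-consistency of a deterministic decision rule. A minimal sketch:

```python
def matching_agreement(q):
    """P(same response on both passes) for an observer who responds 'A'
    with probability q, independently on each presentation."""
    return q * q + (1 - q) * (1 - q)

# Near chance (q = 0.5) a matching model repeats itself only half the time;
# a deterministic observer would always agree (agreement = 1.0).
chance_agreement = matching_agreement(0.5)   # 0.5
skilled_agreement = matching_agreement(0.9)  # 0.82
```

Comparing such predicted agreement rates against observed human consistency across repeated blocks is exactly the two-pass test described in the abstract.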

  16. Network-level reproduction number and extinction threshold for vector-borne diseases.

    Science.gov (United States)

    Xue, Ling; Scoglio, Caterina

    2015-06-01

    The basic reproduction number of deterministic models is an essential quantity for predicting whether an epidemic will spread or not. Thresholds for disease extinction contribute crucial knowledge for the control, elimination, and mitigation of infectious diseases. Relationships between the basic reproduction numbers of two deterministic network-based ordinary differential equation vector-host models and the extinction thresholds of the corresponding stochastic continuous-time Markov chain models are derived under some assumptions. Numerical simulation results for malaria and Rift Valley fever transmission on heterogeneous networks agree with the analytical results even without those assumptions, suggesting that the relationships may always exist and posing the mathematical problem of proving their existence in general. Moreover, numerical simulations show that the basic reproduction number does not monotonically increase or decrease with the extinction threshold. Consistent trends in extinction probability observed through numerical simulations provide novel insights into mitigation strategies that increase the disease extinction probability. These findings may improve understanding of thresholds for disease persistence and thereby help control vector-borne diseases.
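
For deterministic compartment models of this kind, the basic reproduction number is conventionally computed as the spectral radius of the next-generation matrix FV⁻¹, where F holds new-infection rates and V holds transition (removal) rates. A single-patch vector-host sketch with arbitrary placeholder coefficients, not the paper's network model:

```python
import numpy as np

# Assumed transmission coefficients (vector-to-host, host-to-vector)
beta_hv, beta_vh = 0.3, 0.2
# Assumed removal rates for infected hosts and vectors
gamma_h, gamma_v = 0.1, 0.25

F = np.array([[0.0, beta_hv],
              [beta_vh, 0.0]])   # new infections in (host, vector) compartments
V = np.diag([gamma_h, gamma_v])  # transitions out of the infected compartments

# Basic reproduction number: spectral radius of the next-generation matrix F V^{-1}
R0 = max(abs(np.linalg.eigvals(F @ np.linalg.inv(V))))
```

For this 2x2 case the spectral radius reduces to the closed form sqrt(beta_hv * beta_vh / (gamma_h * gamma_v)), the usual square-root structure of vector-borne reproduction numbers; the network models in the paper generalize F and V to block matrices over patches.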

  17. Multiple-event probability in general-relativistic quantum mechanics

    International Nuclear Information System (INIS)

    Hellmann, Frank; Mondragon, Mauricio; Perez, Alejandro; Rovelli, Carlo

    2007-01-01

    We discuss the definition of quantum probability in the context of 'timeless' general-relativistic quantum mechanics. In particular, we study the probability of sequences of events, or multievent probability. In conventional quantum mechanics this can be obtained by means of the 'wave function collapse' algorithm. We first point out certain difficulties of some natural definitions of multievent probability, including the conditional probability widely considered in the literature. We then observe that multievent probability can be reduced to single-event probability, by taking into account the quantum nature of the measuring apparatus. In fact, by exploiting the von-Neumann freedom of moving the quantum/classical boundary, one can always trade a sequence of noncommuting quantum measurements at different times, with an ensemble of simultaneous commuting measurements on the joint system+apparatus system. This observation permits a formulation of quantum theory based only on single-event probability, where the results of the wave function collapse algorithm can nevertheless be recovered. The discussion also bears on the nature of the quantum collapse

  18. Rapid detection, characterization, and enumeration of foodborne pathogens

    DEFF Research Database (Denmark)

    Hoorfar, Jeffrey

    2011-01-01

    The present review discusses the reasons for the increasing interest in rapid methods, current developments in the field, the research needs, and the future trends. The advent of biotechnology has introduced new technologies that led to the emergence of rapid diagnostic methods and altered food testing...... of rapid methods is for fast screening of a large number of samples, where most of them are expected to be test-negative, leading to faster product release for sale. This has been the main strength of rapid methods such as real-time Polymerase Chain Reaction (PCR). Enrichment PCR, where a primary culture...... of pathogen in a contaminated product. Another key issue is automation, where the key drivers are miniaturization and multiple testing, which mean that not only is one instrument flexible enough to test for many pathogens but also many pathogens can be detected with one test. The review is mainly based...

  19. Evaluation of probability and hazard in nuclear energy

    International Nuclear Information System (INIS)

    Novikov, V.Ya.; Romanov, N.L.

    1979-01-01

    Various methods for evaluating accident probability at NPPs are proposed, because statistical estimates of NPP safety are unreliable. The concept of subjective probability for quantitative analysis of safety and hazard is described. Interpretation of probability as an expert's actual degree of belief is assumed as the basis of the concept. It is suggested to study event uncertainty in the framework of subjective probability theory, which not only permits but requires taking expert opinions into account when evaluating the probability. These subjective expert evaluations affect to a certain extent the calculation of the usual mathematical event probability. The above technique is advantageous for considering a separate experiment or random event

  20. A statistical model for investigating binding probabilities of DNA nucleotide sequences using microarrays.

    Science.gov (United States)

    Lee, Mei-Ling Ting; Bulyk, Martha L; Whitmore, G A; Church, George M

    2002-12-01

    There is considerable scientific interest in knowing the probability that a site-specific transcription factor will bind to a given DNA sequence. Microarray methods provide an effective means for assessing the binding affinities of a large number of DNA sequences, as demonstrated by Bulyk et al. (2001, Proceedings of the National Academy of Sciences, USA 98, 7158-7163) in their study of the DNA-binding specificities of Zif268 zinc fingers using microarray technology. In a follow-up investigation, Bulyk, Johnson, and Church (2002, Nucleic Acids Research 30, 1255-1261) studied the interdependence of nucleotides on the binding affinities of transcription proteins. Our article is motivated by this pair of studies. We present a general statistical methodology for analyzing microarray intensity measurements reflecting DNA-protein interactions. The log probability of a protein binding to a DNA sequence on an array is modeled using a linear ANOVA model. This model is convenient because it employs familiar statistical concepts and procedures and also because it is effective for investigating the probability structure of the binding mechanism.
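
The ANOVA formulation can be sketched directly: the log binding measure is modeled as a grand mean plus a per-(position, nucleotide) effect, which is just an ordinary linear model over one-hot indicator columns. Everything below is synthetic and illustrative, not the Zif268 data:

```python
import numpy as np

rng = np.random.default_rng(0)
n_pos, n_seqs = 3, 200  # binding-site length and number of probed sequences

# Synthetic per-(position, nucleotide) effects and grand mean (assumed values)
true_effects = rng.normal(0, 0.5, size=(n_pos, 4))  # rows: positions, cols: A,C,G,T
mu = -2.0

# Random sequences (encoded 0..3) and their noisy log-binding intensities
seqs = rng.integers(0, 4, size=(n_seqs, n_pos))
y = mu + true_effects[np.arange(n_pos), seqs].sum(axis=1) + rng.normal(0, 0.05, n_seqs)

# One-hot ANOVA design matrix: intercept plus an indicator per (position, base)
X = np.zeros((n_seqs, 1 + 4 * n_pos))
X[:, 0] = 1.0
for i, s in enumerate(seqs):
    for p, b in enumerate(s):
        X[i, 1 + 4 * p + b] = 1.0

# Least-squares fit (lstsq handles the rank deficiency of the dummy coding)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ coef
```

With the additive model correct, the fitted values recover the synthetic log probabilities up to the noise level; interaction (nucleotide interdependence) terms would add indicator columns for pairs of positions.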

  1. Assessing the probability of carbon and greenhouse gas benefit from the management of peat soils

    International Nuclear Information System (INIS)

    Worrall, F.; Bell, M.J.; Bhogal, A.

    2010-01-01

    This study proposes a method for assessing the probability that land management interventions will lead to an improvement in the carbon sink represented by peat soils. The method is able to: combine studies of different carbon uptake and release pathways in order to assess changes in the overall carbon or greenhouse gas budget; calculate the probability of the management or restoration leading to an improvement in the budget; calculate the uncertainty in that probability estimate; estimate the equivalent number of complete budgets available from the combination of the literature; test the difference in the outcome of different land management interventions; and provide a method for updating the predicted probabilities as new studies become available. Using this methodology, this study considered the impact of afforestation, managed burning, drainage, drain-blocking, grazing removal, and revegetation on the carbon budget of peat soils in the UK. The study showed that afforestation, drain-blocking, revegetation, grazing removal and cessation of managed burning would bring a carbon benefit, whereas deforestation, managed burning and drainage would bring a disbenefit. The predicted probabilities of a benefit are often equivocal, as each management type or restoration often leads to an increase in uptake in one pathway while increasing losses in another.

  2. p-adic probability interpretation of Bell's inequality

    International Nuclear Information System (INIS)

    Khrennikov, A.

    1995-01-01

    We study the violation of Bell's inequality using a p-adic generalization of the theory of probability. p-adic probability is introduced as a limit of relative frequencies but this limit exists with respect to a p-adic metric. In particular, negative probability distributions are well defined on the basis of the frequency definition. This new type of stochastics can be used to describe hidden-variables distributions of some quantum models. If the hidden variables have a p-adic probability distribution, Bell's inequality is not valid and it is not necessary to discuss the experimental violations of this inequality. ((orig.))

  3. Cosmological constraints from the convergence 1-point probability distribution

    Energy Technology Data Exchange (ETDEWEB)

    Patton, Kenneth [The Ohio State Univ., Columbus, OH (United States); Blazek, Jonathan [The Ohio State Univ., Columbus, OH (United States); Ecole Polytechnique Federale de Lausanne (EPFL), Versoix (Switzerland); Honscheid, Klaus [The Ohio State Univ., Columbus, OH (United States); Huff, Eric [The Ohio State Univ., Columbus, OH (United States); California Inst. of Technology (CalTech), Pasadena, CA (United States); Melchior, Peter [Princeton Univ., Princeton, NJ (United States); Ross, Ashley J. [The Ohio State Univ., Columbus, OH (United States); Suchyta, Eric D. [The Ohio State Univ., Columbus, OH (United States); Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-06-29

    Here, we examine the cosmological information available from the 1-point probability density function (PDF) of the weak-lensing convergence field, utilizing fast L-PICOLA simulations and a Fisher analysis. We find competitive constraints in the Ωm–σ8 plane from the convergence PDF with 188 arcmin² pixels compared to the cosmic shear power spectrum with an equivalent number of modes (ℓ < 886). The convergence PDF also partially breaks the degeneracy cosmic shear exhibits in that parameter space. A joint analysis of the convergence PDF and shear 2-point function also reduces the impact of shape measurement systematics, to which the PDF is less susceptible, and improves the total figure of merit by a factor of 2–3, depending on the level of systematics. Finally, we present a correction factor necessary for calculating the unbiased Fisher information from finite differences using a limited number of cosmological simulations.

  4. Dynamic SEP event probability forecasts

    Science.gov (United States)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
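
The decaying forecast can be written as a Bayesian update: with prior event probability p0 and survivor function S(t) for the flare-to-onset delay, the probability that an event is still coming after t hours with no onset is p0·S(t) / (p0·S(t) + 1 - p0). The sketch below assumes an exponential delay distribution purely for illustration; the paper's algorithm is derived from the observed longitude-dependent delay times:

```python
import math

def updated_sep_probability(p0, t_hours, mean_delay=6.0):
    """Bayes-updated probability that a >=10 pfu SEP event will still occur,
    given no threshold onset by t_hours after the X-ray peak.
    Assumes an exponential flare-to-onset delay with the given mean (hours)."""
    s = math.exp(-t_hours / mean_delay)   # P(onset later than t | event occurs)
    return p0 * s / (p0 * s + (1.0 - p0))

p_now = updated_sep_probability(0.5, 0.0)    # = 0.5: no waiting time elapsed yet
p_later = updated_sep_probability(0.5, 6.0)  # lower: one mean delay with no onset
```

As the waiting time grows with no onset, the forecast decays smoothly toward zero instead of sitting at its initial value for a fixed 24 h window.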

  5. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations, the relation between the seismic strength of the earthquake, focal depth, distance and ground acceleration is calculated. We found that the largest Swedish earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10⁻⁵ has been proposed as interesting. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site. For consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get the epicentral acceleration of this earthquake to be 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)

  6. Comment on 'The meaning of probability in probabilistic safety analysis'

    International Nuclear Information System (INIS)

    Yellman, Ted W.; Murray, Thomas M.

    1995-01-01

    A recent article in Reliability Engineering and System Safety argues that there is 'fundamental confusion over how to interpret the numbers which emerge from a Probabilistic Safety Analysis [PSA]', [Watson, S. R., The meaning of probability in probabilistic safety analysis. Reliab. Engng and System Safety, 45 (1994) 261-269.] As a standard for comparison, the author employs the 'realist' interpretation that a PSA output probability should be a 'physical property' of the installation being analyzed, 'objectively measurable' without controversy. The author finds all the other theories and philosophies discussed wanting by this standard. Ultimately, he argues that the outputs of a PSA should be considered to be no more than constructs of the computational procedure chosen - just an 'argument' or a 'framework for the debate about safety' rather than a 'representation of truth'. He even suggests that 'competing' PSAs be done - each trying to 'argue' for a different message. The commenters suggest that the position the author arrives at is an overreaction to the subjectivity that is part of any complex PSA, and that that overreaction could in fact easily lead to the belief that PSAs are meaningless. They suggest a broader interpretation, one based strictly on relative frequency - a concept which the commenters believe the author abandoned too quickly. Their interpretation does not require any 'tests' to determine whether a statement of likelihood is qualified to be a 'true' probability, and it applies equally well to pure analytical models. It allows anyone's proper numerical statement of the likelihood of an event to be considered a probability. It recognizes that the quality of PSAs and their results will vary. But, unlike the author, the commenters contend that a PSA should always be a search for truth - not a vehicle for adversarial pleadings

  7. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Full Text Available Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic as a model within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
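
A logistic model of this kind maps building attributes to a fire probability through the standard sigmoid link. The predictors and coefficients below are invented placeholders (the study's actual covariates and fitted values are not given in the abstract):

```python
import math

# Hypothetical fitted coefficients: intercept, effect of building height
# (number of floors), and an indicator for combustible construction.
b0, b_floors, b_combustible = -6.0, 0.15, 1.2

def fire_probability(floors, combustible):
    """Logistic model: P(fire) = 1 / (1 + exp(-(b0 + b_floors*floors
    + b_combustible*combustible)))."""
    z = b0 + b_floors * floors + b_combustible * combustible
    return 1.0 / (1.0 + math.exp(-z))

# Per-building probabilities like these are what get rasterized into maps.
low_risk = fire_probability(2, 0)    # small non-combustible building
high_risk = fire_probability(12, 1)  # tall combustible building
```

Fitting such a model to incident/no-incident records for each building is a standard logistic regression, and evaluating it per building yields the values visualized in the probability maps.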

  8. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  9. Trial type probability modulates the cost of antisaccades

    Science.gov (United States)

    Chiau, Hui-Yan; Tseng, Philip; Su, Jia-Han; Tzeng, Ovid J. L.; Hung, Daisy L.; Muggleton, Neil G.

    2011-01-01

    The antisaccade task, where eye movements are made away from a target, has been used to investigate the flexibility of cognitive control of behavior. Antisaccades usually have longer saccade latencies than prosaccades, the so-called antisaccade cost. Recent studies have shown that this antisaccade cost can be modulated by event probability. This may mean that the antisaccade cost can be reduced, or even reversed, if the probability of surrounding events favors the execution of antisaccades. The probabilities of prosaccades and antisaccades were systematically manipulated by changing the proportion of a certain type of trial in an interleaved pro/antisaccades task. We aimed to disentangle the intertwined relationship between trial type probabilities and the antisaccade cost with the ultimate goal of elucidating how probabilities of trial types modulate human flexible behaviors, as well as the characteristics of such modulation effects. To this end, we examined whether implicit trial type probability can influence saccade latencies and also manipulated the difficulty of cue discriminability to see how effects of trial type probability would change when the demand on visual perceptual analysis was high or low. A mixed-effects model was applied to the analysis to dissect the factors contributing to the modulation effects of trial type probabilities. Our results suggest that the trial type probability is one robust determinant of antisaccade cost. These findings highlight the importance of implicit probability in the flexibility of cognitive control of behavior. PMID:21543748

  10. Mutant number distribution in an exponentially growing population

    International Nuclear Information System (INIS)

    Keller, Peter; Antal, Tibor

    2015-01-01

    We present an explicit solution to a classic model of cell-population growth introduced by Luria and Delbrück (1943 Genetics 28 491–511) 70 years ago to study the emergence of mutations in bacterial populations. In this model a wild-type population is assumed to grow exponentially in a deterministic fashion. Proportional to the wild-type population size, mutants arrive randomly and initiate new sub-populations of mutants that grow stochastically according to a supercritical birth and death process. We give an exact expression for the generating function of the total number of mutants at a given wild-type population size. We present a simple expression for the probability of finding no mutants, and a recursion formula for the probability of finding a given number of mutants. In the ‘large population-small mutation’ limit we recover recent results of Kessler and Levine (2014 J. Stat. Phys. doi:10.1007/s10955-014-1143-3) for a fully stochastic version of the process. (paper)
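In the classical limit the abstract's recursion for the mutant-number probabilities has a well-known form, the Ma-Sandri-Sarkar recursion. The sketch below implements that classical (pure-birth, large-population) case, not the authors' exact birth-and-death generating function.

```python
# Classical Luria-Delbruck mutant-number distribution via the
# Ma-Sandri-Sarkar recursion:
#   p_0 = exp(-m),  p_n = (m/n) * sum_{j<n} p_j / (n - j + 1),
# where m is the expected number of mutation events.
import math

def luria_delbruck_pmf(m: float, n_max: int) -> list:
    """Return [P(0 mutants), P(1 mutant), ..., P(n_max mutants)]."""
    p = [math.exp(-m)]  # probability of finding no mutants
    for n in range(1, n_max + 1):
        p.append((m / n) * sum(p[j] / (n - j + 1) for j in range(n)))
    return p
```

The distribution is heavy-tailed (the tail decays roughly like 1/n^2), so the probabilities up to any finite n_max sum to slightly less than one.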

  11. Joint probabilities and quantum cognition

    International Nuclear Information System (INIS)

    Acacio de Barros, J.

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.
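A standard minimal example consistent with the abstract's claim: three ±1-valued random variables whose pairwise correlations cannot arise from any joint distribution. A small enumeration shows why.

```python
# Contextual +/-1-valued random variables A, B, C: perfect pairwise
# anti-correlation E[AB] = E[BC] = E[CA] = -1 admits no joint distribution,
# because every joint outcome satisfies ab + bc + ca >= -1.
from itertools import product

atoms = list(product([-1, 1], repeat=3))  # all 8 joint outcomes (a, b, c)
min_sum = min(a * b + b * c + c * a for a, b, c in atoms)
# Any joint distribution gives E[AB] + E[BC] + E[CA] >= min_sum = -1,
# so the target value -3 is unachievable.
```

Pairwise, each anti-correlation is realizable on its own; it is only the three together that no single joint distribution can reproduce, which is the hallmark of contextuality the abstract discusses.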

  12. Joint probabilities and quantum cognition

    Energy Technology Data Exchange (ETDEWEB)

    Acacio de Barros, J. [Liberal Studies, 1600 Holloway Ave., San Francisco State University, San Francisco, CA 94132 (United States)

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  13. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
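Two of the textbook methods the abstract alludes to, the chain rule of conditional probabilities and direct counting, can be checked against each other on a simple urn example (the urn composition here is arbitrary).

```python
# P(both draws red) when drawing 2 without replacement from 5 red, 3 blue.
from fractions import Fraction
from math import comb

red, blue = 5, 3
total = red + blue

# Method 1: chain rule of conditional probabilities, P(R1) * P(R2 | R1).
p_chain = Fraction(red, total) * Fraction(red - 1, total - 1)

# Method 2: direct counting with combinations (hypergeometric form).
p_count = Fraction(comb(red, 2), comb(total, 2))
```

Both methods give 5/14; using exact fractions makes the agreement an identity rather than a floating-point coincidence.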

  14. Selected papers on probability and statistics

    CERN Document Server

    2009-01-01

    This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.

  15. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship

  16. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed. The present notes outline a method for evaluation of the probability...

  17. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation...

  18. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a) All calculations shall be...

  19. Orthogonal Algorithm of Logic Probability and Syndrome-Testable Analysis

    Institute of Scientific and Technical Information of China (English)

    1990-01-01

    A new method, the orthogonal algorithm, is presented to compute logic probabilities (i.e. signal probabilities) accurately. The transfer properties of logic probabilities are studied first, which are useful for the calculation of the logic probability of a circuit with random independent inputs. Then the orthogonal algorithm is described to compute the logic probability of a Boolean function realized by a combinational circuit. This algorithm makes the Boolean function “orthogonal” so that the logic probabilities can be easily calculated by summing up the logic probabilities of all orthogonal terms of the Boolean function.
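The orthogonal-terms idea can be illustrated on a toy function, f = x1 OR x2, with assumed independent input probabilities: once f is rewritten as mutually disjoint terms, the term probabilities simply add. This is only a hand-worked illustration of the principle, not the paper's algorithm.

```python
# Signal probability of f = x1 OR x2 via an orthogonal (disjoint) expansion,
# cross-checked by exhaustive enumeration. Input probabilities are assumed.
from itertools import product

p1, p2 = 0.3, 0.6  # signal probabilities of the independent inputs x1, x2

# Orthogonal decomposition f = x1 + (NOT x1)*x2; disjoint terms add:
p_orthogonal = p1 + (1.0 - p1) * p2

# Cross-check: enumerate all input combinations where f evaluates to 1.
p_enum = sum(
    (p1 if a else 1.0 - p1) * (p2 if b else 1.0 - p2)
    for a, b in product([0, 1], repeat=2)
    if a or b
)
```

The naive expansion x1 + x2 would double-count the case where both inputs are 1; making the terms disjoint is exactly what removes that overlap.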

  20. Probability sampling in legal cases: Kansas cellphone users

    Science.gov (United States)

    Kadane, Joseph B.

    2012-10-01

    Probability sampling is a standard statistical technique. This article introduces the basic ideas of probability sampling, and shows in detail how probability sampling was used in a particular legal case.