WorldWideScience

Sample records for calculating age-conditional probabilities

  1. Estimating age conditional probability of developing disease from surveillance data

    Directory of Open Access Journals (Sweden)

    Fay Michael P

    2004-07-01

    Full Text Available Abstract Fay, Pfeiffer, Cronin, Le, and Feuer (Statistics in Medicine 2003; 22: 1837–1848) developed a formula to calculate the age-conditional probability of developing a disease for the first time (ACPDvD) for a hypothetical cohort. The novelty of the formula of Fay et al. (2003) is that one need not know the rates of first incidence of disease per person-years alive and disease-free, but may input the rates of first incidence per person-years alive only. Similarly, the formula uses rates of death from disease and death from other causes per person-years alive. The rates per person-years alive are much easier to estimate than those per person-years alive and disease-free. Fay et al. (2003) used simple piecewise constant models for all three rate functions, which have constant rates within each age group. In this paper, we detail a method for estimating rate functions which does not have jumps at the beginning of age groupings, and need not be constant within age groupings. We call this method the mid-age group joinpoint (MAJ) model for the rates. The drawback of the MAJ model is that numerical integration must be used to estimate the resulting ACPDvD. To increase computational speed, we offer a piecewise approximation to the MAJ model, which we call the piecewise mid-age group joinpoint (PMAJ) model. The PMAJ model for the rates input into the formula for ACPDvD described in Fay et al. (2003) is the current method used in the freely available DevCan software made available by the National Cancer Institute.
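
    The record above describes estimating an age-conditional probability from rate inputs. As a minimal sketch of the generic calculation (assuming piecewise-constant yearly rates and hypothetical rate values, not the exact Fay et al. formula or the MAJ/PMAJ models):

    ```python
    import numpy as np

    # Hypothetical piecewise-constant rates per person-year for ages 0-99.
    incidence  = np.full(100, 0.002)   # first incidence per person-year alive
    mort_other = np.full(100, 0.010)   # death from other causes per person-year alive

    def acpd(a, b, lam=incidence, mu=mort_other):
        """P(first disease in [a, b) | alive and disease-free at age a)."""
        total = lam + mu                                          # hazard of leaving the state
        surv = np.exp(-np.cumsum(np.r_[0.0, total[a:b]]))[:-1]    # survival to each year start
        # per year: P(some event occurs) * P(the event is disease rather than death)
        return float(np.sum(surv * (lam[a:b] / total[a:b]) * (1.0 - np.exp(-total[a:b]))))

    print(f"P(disease in [50,70) | healthy at 50) = {acpd(50, 70):.4f}")
    ```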

  2. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    ... Mass Media Services General Procedures § 1.1623 Probability calculation. (a) All calculations shall be... determine their new intermediate probabilities. (g) Multiply each applicant's probability pursuant...

  3. Calculating Cumulative Binomial-Distribution Probabilities

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, CUMBIN, one of set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), used independently of one another. Reliabilities and availabilities of k-out-of-n systems analyzed. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Used for calculations of reliability and availability. Program written in C.
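
    As an illustration of what a cumulative-binomial routine of this kind computes, here is a short sketch of the k-out-of-n reliability calculation (a generic formulation, not the CUMBIN source):

    ```python
    from math import comb

    def binom_cdf_upper(k, n, p):
        """P(X >= k) for X ~ Binomial(n, p): the reliability of a k-out-of-n
        system whose components work independently with probability p."""
        return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

    # e.g. a 2-out-of-3 system with component reliability 0.95
    print(binom_cdf_upper(2, 3, 0.95))  # -> 0.99275
    ```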

  4. Necessity of Exact Calculation for Transition Probability

    Institute of Scientific and Technical Information of China (English)

    LIU Fu-Sui; CHEN Wan-Fang

    2003-01-01

    This paper shows that exact calculation for transition probability can make some systems deviate from Fermi golden rule seriously. This paper also shows that the corresponding exact calculation of hopping rate induced by phonons for deuteron in Pd-D system with the many-body electron screening, proposed by Ichimaru, can explain the experimental fact observed in Pd-D system, and predicts that perfection and low-dimension of Pd lattice are very important for the phonon-induced hopping rate enhancement in Pd-D system.

  5. Calculation of fractional electron capture probabilities

    CERN Document Server

    Schoenfeld, E

    1998-01-01

    A 'Table of Radionuclides' is being prepared which will supersede the 'Table de Radionucléides' formerly issued by the LMRI/LPRI (France). In this effort it is desirable to have a uniform basis for calculating theoretical values of fractional electron capture probabilities. A table has been compiled which allows one to calculate conveniently and quickly the fractional probabilities P_K, P_L, P_M, P_N and P_O, their ratios and the assigned uncertainties for allowed and non-unique first forbidden electron capture transitions of known transition energy for radionuclides with atomic numbers from Z=3 to 102. These results have been applied to a total of 28 transitions of 14 radionuclides (7Be, 22Na, 51Cr, 54Mn, 55Fe, 68Ge, 68Ga, 75Se, 109Cd, 125I, 139Ce, 169Yb, 197Hg, 202Tl). The values are in reasonable agreement with measure...

  6. Calculation Model and Simulation of Warship Damage Probability

    Institute of Scientific and Technical Information of China (English)

    TENG Zhao-xin; ZHANG Xu; YANG Shi-xing; ZHU Xiao-ping

    2008-01-01

    The combat efficiency of a mine obstacle is the focus of the present research. Based on the main factors that affect the target warship damage probability, such as the features of mines with maneuverability, the success rate of mine-laying, the hit probability, mine reliability and action probability, a calculation model of target warship mine-encounter probability is put forward under the condition that the route selection of target warships follows a uniform distribution and the course of target warships follows a normal distribution. A damage probability model of mines with maneuverability against target warships is then set up, and a simulation showed the model to be highly practical.

  7. A Monte Carlo Method for Calculating Initiation Probability

    Energy Technology Data Exchange (ETDEWEB)

    Greenman, G M; Procassini, R J; Clouse, C J

    2007-03-05

    A Monte Carlo method for calculating the probability of initiating a self-sustaining neutron chain reaction has been developed. In contrast to deterministic codes which solve a non-linear, adjoint form of the Boltzmann equation to calculate initiation probability, this new method solves the forward (standard) form of the equation using a modified source calculation technique. Results from this new method are compared with results obtained from several deterministic codes for a suite of historical test problems. The level of agreement between these code predictions is quite good, considering the use of different numerical techniques and nuclear data. A set of modifications to the historical test problems has also been developed which reduces the impact of neutron source ambiguities on the calculated probabilities.
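
    For intuition, the initiation probability can be illustrated with a toy Galton-Watson branching model, where a chain survives with probability 1 - q and the extinction probability q is a fixed point of the offspring generating function. This is only a zero-dimensional stand-in for the report's forward Monte Carlo transport calculation, with made-up offspring probabilities:

    ```python
    # Offspring distribution: P(a neutron produces k next-generation neutrons).
    # Mean is 1.15 (supercritical), so a nonzero initiation probability exists.
    p = [0.35, 0.30, 0.20, 0.15]

    q = 0.5                                   # extinction probability, iterated
    for _ in range(200):                      # fixed-point iteration q = G(q)
        q = sum(pk * q**k for k, pk in enumerate(p))

    print(f"initiation probability ≈ {1 - q:.3f}")
    ```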

  8. Calculating the probability of detecting radio signals from alien civilizations

    CERN Document Server

    Horvat, Marko

    2006-01-01

    Although it might not be self-evident, it is in fact entirely possible to calculate the probability of detecting alien radio signals by understanding what types of extraterrestrial radio emissions can be expected and what properties these emissions can have. Using the Drake equation as the obvious starting point, and by logically identifying and enumerating the constraints of interstellar radio communications, one can derive the probability of detecting a genuine alien radio signal.

  9. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    Full Text Available The purpose of this article is to give a new approach to calculating the probability of returning a loan. A lot of factors affect the value of the probability. In this article, by using statistical and econometric models, some influencing factors are proved. The main approach is concerned with applying probit and logit models in loan management institutions. A new aspect of credit risk analysis is given. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with a dependent variable "the probability of returning a loan". It is proved that the month of signing a contract, the year of signing a contract, the gender and the age of the loan owner do not affect the probability of returning a loan. It is proved that the probability of returning a loan depends on the sum of the contract, the remoteness of the loan owner and the month of birth. The probability of returning a loan increases with the increase of the given sum, decreases with the proximity of the customer, increases for people born in the beginning of the year and decreases for people born at the end of the year.
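
    A sketch of the binary logit setup described above, fitted to synthetic data (the article's loan data are not available here; variables and coefficients are illustrative only):

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 1000
    loan_sum   = rng.uniform(1e3, 2e4, n)    # given sum of the contract
    remoteness = rng.uniform(0, 300, n)      # remoteness of the borrower, km
    birth_mon  = rng.integers(1, 13, n)      # month of birth

    # Synthetic outcome loosely following the article's reported effects.
    lin = -1.0 + 1e-4 * loan_sum + 3e-3 * remoteness - 0.05 * birth_mon
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-lin)))

    X = sm.add_constant(np.column_stack([loan_sum, remoteness, birth_mon]))
    model = sm.Logit(y, X).fit(disp=0)
    print(model.params)           # fitted coefficients
    print(model.predict(X[:5]))   # estimated probabilities of returning a loan
    ```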

  10. Fostering Positive Attitude in Probability Learning Using Graphing Calculator

    Science.gov (United States)

    Tan, Choo-Kim; Harji, Madhubala Bava; Lau, Siong-Hoe

    2011-01-01

    Although a plethora of research evidence highlights positive and significant outcomes of the incorporation of the Graphing Calculator (GC) in mathematics education, its use in the teaching and learning process appears to be limited. The obvious need to revisit the teaching and learning of Probability has resulted in this study, i.e. to incorporate…

  11. Calculating the probability of injected carbon dioxide plumes encountering faults

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, P.D.

    2011-04-01

    One of the main concerns of storage in saline aquifers is leakage via faults. In the early stages of site selection, site-specific fault coverages are often not available for these aquifers. This necessitates a method using available fault data to estimate the probability of injected carbon dioxide encountering and migrating up a fault. The probability of encounter can be calculated from areal fault density statistics from available data, and carbon dioxide plume dimensions from numerical simulation. Given a number of assumptions, the dimension of the plume perpendicular to a fault times the areal density of faults with offsets greater than some threshold of interest provides the probability of the plume encountering such a fault. Application of this result to a previously planned large-scale pilot injection in the southern portion of the San Joaquin Basin yielded a 3% and 7% chance of the plume encountering a full and a half seal-offsetting fault, respectively. Subsequently available data indicated a half seal-offsetting fault at a distance from the injection well that implied a 20% probability of encounter for a plume sufficiently large to reach it.
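
    A back-of-the-envelope version of this estimate, with made-up numbers rather than the San Joaquin Basin values, and assuming the fault density is expressed as an expected count per kilometre measured across strike:

    ```python
    from math import exp

    plume_extent_km = 3.0   # plume dimension perpendicular to fault strike
    fault_density = 0.01    # faults with offset above threshold, per km across strike

    expected_hits = plume_extent_km * fault_density
    p_encounter = 1.0 - exp(-expected_hits)      # ~ expected_hits when small
    print(f"P(encounter) ≈ {p_encounter:.1%}")   # -> 3.0%
    ```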

  12. CALCULATION OF PER PARCEL PROBABILITY FOR DUD BOMBS IN GERMANY

    Directory of Open Access Journals (Sweden)

    S. M. Tavakkoli Sabour

    2014-10-01

    Full Text Available Unexploded aerial bombs, also known as duds or unfused bombs, from the bombardments of past wars remain explosive for decades after the war under the earth's surface, threatening civil activities, especially if dredging works are involved. Interpretation of aerial photos taken shortly after bombardments has been proven to be useful for finding the duds. Unfortunately, the reliability of this method is limited by some factors. The chance of finding a dud on an aerial photo depends strongly on the photography system, the size of the bomb and the landcover. On the other hand, exploded bombs are considerably better detectable on aerial photos and confidently represent the extent and density of a bombardment. Considering an empirical quota of unfused bombs, the expected number of duds can be calculated from the number of exploded bombs. This can help to better calculate the cost-risk ratio and to classify areas for clearance. This article presents a method for calculating a per parcel probability of dud bombs according to the distribution and density of exploded bombs. No similar work has been reported in this field by other authors.
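
    The core arithmetic of the approach can be sketched as follows, with a hypothetical dud quota and crater count; the Poisson step for the per-parcel probability is an added assumption, not necessarily the author's exact formulation:

    ```python
    from math import exp

    dud_quota = 0.15            # empirical fraction of dropped bombs that were duds
    exploded_in_parcel = 12     # craters counted on the aerial photo for this parcel

    # If a fraction q of all bombs are duds, the craters represent the (1 - q)
    # share, so the expected number of duds in the parcel is:
    expected_duds = exploded_in_parcel * dud_quota / (1 - dud_quota)

    # Added assumption: duds scattered as a Poisson process within the parcel.
    p_at_least_one_dud = 1 - exp(-expected_duds)
    print(f"{expected_duds:.2f} expected duds, "
          f"P(at least one) ≈ {p_at_least_one_dud:.2f}")
    ```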

  13. Calculation of Probability Maps Directly from Ordinary Kriging Weights

    Directory of Open Access Journals (Sweden)

    Jorge Kazuo Yamamoto

    2010-03-01

    Full Text Available Probability maps are useful to analyze ores or contaminants in soils and they are helpful for making decisions during exploration work. These probability maps are usually derived from the indicator kriging approach. Ordinary kriging weights can be used to derive probability maps as well. For testing these two approaches a sample data base was randomly drawn from an exhaustive data set. From the exhaustive data set actual cumulative distribution functions were determined. Thus, estimated and actual conditional cumulative distribution functions were compared. The vast majority of correlation coefficients between estimated and actual probability maps is greater than 0.75. Not only does the ordinary kriging approach work, but it also gives slightly better results than median indicator kriging. Moreover, probability maps from ordinary kriging weights are much easier to derive than with the traditional approach based on either indicator kriging or median indicator kriging.

  14. Flipping Out: Calculating Probability with a Coin Game

    Science.gov (United States)

    Degner, Kate

    2015-01-01

    In the author's experience with this activity, students struggle with the idea of representativeness in probability. Therefore, this student misconception is part of the classroom discussion about the activities in this lesson. Representativeness is related to the (incorrect) idea that outcomes that seem more random are more likely to happen. This…

  15. Calculation of paternity probabilities from multilocus DNA profiles.

    Science.gov (United States)

    Brenner, C H; Rittner, C; Schneider, P M

    1994-02-01

    We describe a procedure for evaluation of paternity evidence from multi-locus DNA probe patterns. A computer program abstracts a "+/-" notation description from the multilocus profile and then calculates a paternity index based on observed phenotypic fragment frequencies. The biostatistical evaluation considers only bands found in the child and missing from the mother--a simplified approach that is at once robust and conservative. Mutations are of course taken into account. Particular features lending objectivity to the interpretation include computer reading and matching decisions, and specific recognition and statistical compensation for ambiguities ("faint orphans").

  16. Constrained Mathematics for Calculating Logical Safety and Reliability Probabilities with Uncertain Inputs

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, D.K.; Cooper, J.A.; Ferson, S.

    1999-01-21

    Calculating safety and reliability probabilities with functions of uncertain variables can yield incorrect or misleading results if some precautions are not taken. One important consideration is the application of constrained mathematics for calculating probabilities for functions that contain repeated variables. This paper includes a description of the problem and develops a methodology for obtaining an accurate solution.
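
    A toy illustration of the repeated-variable problem referred to above (not the authors' constrained-mathematics method): for f(p) = p(1 - p), naive interval arithmetic treats the two occurrences of p as independent and inflates the bounds, while constraining them to be the same variable tightens the result:

    ```python
    import numpy as np

    lo, hi = 0.2, 0.6                        # uncertain probability p in [lo, hi]

    # Naive interval arithmetic for f(p) = p * (1 - p): the two p's vary freely.
    naive = (lo * (1 - hi), hi * (1 - lo))   # -> (0.08, 0.48), too wide

    # Constrained: both occurrences are the same variable, scanned over its range.
    p = np.linspace(lo, hi, 1001)
    f = p * (1 - p)
    constrained = (f.min(), f.max())         # -> (0.16, 0.25)

    print(naive, constrained)
    ```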

  17. Quantum dynamics calculation of reaction probability for H+Cl2→HCl+Cl

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    We present in this paper a time-dependent quantum wave packet calculation of the initial state selected reaction probability for H + Cl2 based on the GHNS potential energy surface with total angular momentum J = 0. The effects of the translational, vibrational and rotational excitation of Cl2 on the reaction probability have been investigated. In a broad region of the translational energy, the rotational excitation enhances the reaction probability while the vibrational excitation depresses the reaction probability. The theoretical results agree well with the fact that it is an early down-hill reaction.

  18. Quantum dynamics calculation of reaction probability for H+Cl2→HCl+Cl

    Institute of Scientific and Technical Information of China (English)

    王胜龙; 赵新生

    2001-01-01

    We present in this paper a time-dependent quantum wave packet calculation of the initial state selected reaction probability for H + Cl2 based on the GHNS potential energy surface with total angular momentum J = 0. The effects of the translational, vibrational and rotational excitation of Cl2 on the reaction probability have been investigated. In a broad region of the translational energy, the rotational excitation enhances the reaction probability while the vibrational excitation depresses the reaction probability. The theoretical results agree well with the fact that it is an early down-hill reaction.

  19. An alternative approach to calculate the posterior probability of GNSS integer ambiguity resolution

    Science.gov (United States)

    Yu, Xianwen; Wang, Jinling; Gao, Wang

    2016-10-01

    When precise positioning is carried out via GNSS carrier phases, it is important to make use of the property that every ambiguity should be an integer. With the known float solution, any integer vector, which has the same degree of freedom as the ambiguity vector, is the ambiguity vector in probability. For both integer aperture estimation and integer equivariant estimation, it is of great significance to know the posterior probabilities. However, to calculate the posterior probability, we have to face the thorny problem that the equation involves an infinite number of integer vectors. In this paper, using the float solution of ambiguity and its variance matrix, a new approach to rapidly and accurately calculate the posterior probability is proposed. The proposed approach consists of four steps. First, the ambiguity vector is transformed via decorrelation. Second, the range of the adopted integer of every component is directly obtained via formulas, and a finite number of integer vectors are obtained via combination. Third, using the integer vectors, the principal value of posterior probability and the correction factor are worked out. Finally, the posterior probability of every integer vector and its error upper bound can be obtained. In the paper, the detailed process to calculate the posterior probability and the derivations of the formulas are presented. The theory and numerical examples indicate that the proposed approach has the advantages of small amount of computations, high calculation accuracy and strong adaptability.
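
    The Gaussian weighting that underlies such posterior probabilities can be sketched as follows, for a hypothetical two-dimensional float solution; the paper's decorrelation step and error bounds are omitted, and the enumeration box is chosen by hand:

    ```python
    import numpy as np
    from itertools import product

    a_hat = np.array([1.2, -0.3])                   # float ambiguity solution
    Q = np.array([[0.09, 0.02],
                  [0.02, 0.06]])                    # its variance matrix
    Qi = np.linalg.inv(Q)

    def weight(z):
        d = a_hat - z
        return np.exp(-0.5 * d @ Qi @ d)            # Gaussian likelihood weight

    # Enumerate a finite box of integer candidates around the float solution.
    cands = [np.array(z) for z in product(range(-2, 5), range(-4, 3))]
    w = np.array([weight(z) for z in cands])
    posterior = w / w.sum()                         # normalized over the box

    best = cands[int(np.argmax(posterior))]
    print(best, float(posterior.max()))
    ```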

  1. probably

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    【Examples】1. He can probably tell us the truth. 2. Will it rain this afternoon? Probably. 【Explanation】Used as an adverb meaning "probably; very likely", it indicates a strong possibility, usually a positive inference or judgment based on the present situation.

  2. Notes on Critical Assessment of Theoretical Calculations of Atomic Structure and Transition Probabilities

    OpenAIRE

    Hyun-Kyung Chung; Per Jönsson; Alexander Kramida

    2013-01-01

    Atomic structure and transition probabilities are fundamental physical data required in many fields of science and technology. Atomic physics codes are freely available to other community users to generate atomic data for their interest, but the quality of these data is rarely verified. This special issue addresses estimation of uncertainties in atomic structure and transition probability calculations, and discusses methods and strategies to assess and ensure the quality of theoretical atomic...

  3. Torpedo's Search Trajectory Design Based on Acquisition and Hit Probability Calculation

    Institute of Scientific and Technical Information of China (English)

    LI Wen-zhe; ZHANG Yu-wen; FAN Hui; WANG Yong-hu

    2008-01-01

    Focusing on the search trajectory characteristics of a light torpedo used against warships, common torpedo search trajectories are analyzed and an improved torpedo search trajectory is designed, a mathematical model is built, and a simulation calculation taking the MK46 torpedo as an example is carried out. The calculation results show that this method can increase the acquisition probability and hit probability by about 10%-30% in some situations and is feasible for torpedo trajectory design. The research is of great reference value for acoustic homing torpedo trajectory design and torpedo combat efficiency research.

  4. Duality-based calculations for transition probabilities in stochastic chemical reactions

    Science.gov (United States)

    Ohkubo, Jun

    2017-02-01

    An idea for evaluating transition probabilities in chemical reaction systems is proposed, which is efficient for repeated calculations with various rate constants. The idea is based on duality relations; instead of direct time evolutions of the original reaction system, the dual process is dealt with. Usually, if one changes rate constants of the original reaction system, the direct time evolutions should be performed again, using the new rate constants. On the other hand, only one solution of an extended dual process can be reused to calculate the transition probabilities for various rate constant cases. The idea is demonstrated in a parameter estimation problem for the Lotka-Volterra system.

  5. Calculation of Quantum Probability in O(2,2) String Cosmology with a Dilaton Potential

    Institute of Scientific and Technical Information of China (English)

    YAN Jun

    2006-01-01

    The quantum properties of O(2,2) string cosmology with a dilaton potential are studied in this paper. The cosmological solutions are obtained on three-dimensional space-time. Moreover, the quantum probability of transition between the two dual universes is calculated through a Wheeler-De Witt approach.

  6. PNO-CEPA and MCSCF-SCEP calculations of transition probabilities in OH, HF + , and HCl +

    Science.gov (United States)

    Werner, Hans-Joachim; Rosmus, Pavel; Schätzl, Wolfgang; Meyer, Wilfried

    1984-01-01

    Electronic transition moment functions for the A 2Σ+-X2Π transitions in OH, HF+, and HCl+ have been calculated using RHF, PNO-CI, PNO-CEPA, MCSCF, and MCSCF-SCEP wave functions. The vibrational band transition probabilities are obtained, and the resulting radiative lifetimes are compared with measured values. For OH and HCl+ the deviations are smaller than 10%, but the theoretical lifetimes for HF+ are larger by about 300% than the experimental values. For the electronic ground states of HF+ and HCl+ vibrational transition probabilities have been calculated from MCSCF-SCEP dipole moment functions. Both ions are predicted to be excellent absorbers and emitters in the infrared spectral region.

  7. Calculation of Radar Probability of Detection in K-Distributed Sea Clutter and Noise

    Science.gov (United States)

    2011-04-01

    There is no closed-form solution for the probability of detection in K-distributed clutter, so numerical methods are required. The K distribution is a compound model... the integration, with the nodes and weights calculated using matrix methods, so that a general-purpose numerical integration routine is not required.

  8. Efficient Probability of Failure Calculations for QMU using Computational Geometry LDRD 13-0144 Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Scott A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ebeida, Mohamed Salah [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Romero, Vicente J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rushdi, Ahmad A. [Univ. of Texas, Austin, TX (United States); Abdelkader, Ahmad [Univ. of Maryland, College Park, MD (United States)

    2015-09-01

    This SAND report summarizes our work on the Sandia National Laboratory LDRD project titled "Efficient Probability of Failure Calculations for QMU using Computational Geometry" which was project #165617 and proposal #13-0144. This report merely summarizes our work. Those interested in the technical details are encouraged to read the full published results, and contact the report authors for the status of the software and follow-on projects.

  9. Impact of temporal probability in 4D dose calculation for lung tumors.

    Science.gov (United States)

    Rouabhi, Ouided; Ma, Mingyu; Bayouth, John; Xia, Junyi

    2015-11-08

    The purpose of this study was to evaluate the dosimetric uncertainty in 4D dose calculation using three temporal probability distributions: uniform distribution, sinusoidal distribution, and patient-specific distribution derived from the patient respiratory trace. Temporal probability, defined as the fraction of time a patient spends in each respiratory amplitude, was evaluated in nine lung cancer patients. Four-dimensional computed tomography (4D CT), along with deformable image registration, was used to compute 4D dose incorporating the patient's respiratory motion. First, the dose of each of 10 phase CTs was computed using the same planning parameters as those used in 3D treatment planning based on the breath-hold CT. Next, deformable image registration was used to deform the dose of each phase CT to the breath-hold CT using the deformation map between the phase CT and the breath-hold CT. Finally, the 4D dose was computed by summing the deformed phase doses using their corresponding temporal probabilities. In this study, 4D dose calculated from the patient-specific temporal probability distribution was used as the ground truth. The dosimetric evaluation metrics included: 1) 3D gamma analysis, 2) mean tumor dose (MTD), 3) mean lung dose (MLD), and 4) lung V20. For seven out of nine patients, both uniform and sinusoidal temporal probability dose distributions were found to have an average gamma passing rate > 95% for both the lung and PTV regions. Compared with 4D dose calculated using the patient respiratory trace, doses using uniform and sinusoidal distribution showed a percentage difference on average of -0.1% ± 0.6% and -0.2% ± 0.4% in MTD, -0.2% ± 1.9% and -0.2% ± 1.3% in MLD, 0.09% ± 2.8% and -0.07% ± 1.8% in lung V20, -0.1% ± 2.0% and 0.08% ± 1.34% in lung V10, 0.47% ± 1.8% and 0.19% ± 1.3% in lung V5, respectively. We concluded that four-dimensional dose computed using either a uniform or sinusoidal temporal probability distribution can
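
    The phase-weighted accumulation step described above reduces to a weighted sum of deformed phase doses. A minimal sketch with placeholder arrays (the sinusoidal weighting here is a simple cosine-based stand-in, not necessarily the study's exact definition):

    ```python
    import numpy as np

    n_phases, shape = 10, (4, 4, 4)
    rng = np.random.default_rng(1)
    phase_dose = rng.uniform(18, 22, (n_phases, *shape))   # deformed doses, Gy

    uniform = np.full(n_phases, 1.0 / n_phases)
    t = np.linspace(0, 2 * np.pi, n_phases, endpoint=False)
    sinusoidal = 1.0 + np.cos(t)
    sinusoidal /= sinusoidal.sum()
    patient = rng.dirichlet(np.ones(n_phases))             # stand-in for a trace

    for name, w in [("uniform", uniform), ("sinusoidal", sinusoidal), ("patient", patient)]:
        dose_4d = np.tensordot(w, phase_dose, axes=1)      # probability-weighted sum
        print(name, float(dose_4d.mean()))
    ```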

  10. A semiclassical model for the calculation of nonadiabatic transition probabilities for classically forbidden transitions.

    Science.gov (United States)

    Dang, Phuong-Thanh; Herman, Michael F

    2009-02-01

    A semiclassical surface hopping model is presented for the calculation of nonadiabatic transition probabilities for the case in which the avoided crossing point is in the classically forbidden regions. The exact potentials and coupling are replaced with simple functional forms that are fitted to the values, evaluated at the turning point in the classical motion, of the Born-Oppenheimer potentials, the nonadiabatic coupling, and their first few derivatives. For the one-dimensional model considered, reasonably accurate results for transition probabilities are obtained down to around 10^-10. The possible extension of this model to many-dimensional problems is discussed. The fact that the model requires only information at the turning point, a point that the trajectories encounter, would be a significant advantage in many-dimensional problems over Landau-Zener type models, which require information at the avoided crossing seam, which is in the forbidden region where the trajectories do not go.

  11. Calculation of the number of Monte Carlo histories for a planetary protection probability of impact estimation

    Science.gov (United States)

    Barengoltz, Jack

    2016-07-01

    Monte Carlo (MC) is a common method to estimate probability, effectively by a simulation. For planetary protection, it may be used to estimate the probability of impact P_I by a launch vehicle (upper stage) of a protected planet. The object of the analysis is to provide a value for P_I with a given level of confidence (LOC) that the true value does not exceed the maximum allowed value of P_I. In order to determine the number of MC histories required, one must also guess the maximum number of hits that will occur in the analysis. This extra parameter is needed because a LOC is desired. If more hits occur, the MC analysis would indicate that the true value may exceed the specification value with a higher probability than the LOC. (In the worst case, even the mean value of the estimated P_I might exceed the specification value.) After the analysis is conducted, the actual number of hits is, of course, the mean. The number of hits arises from a small probability per history and a large number of histories; these are the classic requirements for a Poisson distribution. For a known Poisson distribution (the mean is the only parameter), the probability for some interval in the number of hits is calculable. Before the analysis, this is not possible. Fortunately, there are methods that can bound the unknown mean for a Poisson distribution. F. Garwood (1936, "Fiduciary limits for the Poisson distribution," Biometrika 28, 437-442) published an appropriate method that uses the chi-squared function, actually its inverse (the integral chi-squared function would yield probability α as a function of the mean μ and an actual value n), despite the notation used: μ_lower = (1/2)χ²(α/2; 2n) and μ_upper = (1/2)χ²(1-α/2; 2n+2), where χ²(p; ν) is the p-quantile of the chi-squared distribution with ν degrees of freedom. This formula for the upper and lower limits of the mean μ with the two-tailed probability 1-α depends on the LOC α and an estimated value of the number of "successes" n. In a MC analysis for planetary protection, only the upper limit is of interest, i.e., the single
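
    Under the Garwood upper limit quoted above, sizing the Monte Carlo run is straightforward. A sketch with illustrative numbers (scipy's chi2.ppf supplies the inverse chi-squared function):

    ```python
    from scipy.stats import chi2

    loc = 0.99      # level of confidence
    p_max = 1e-4    # maximum allowed probability of impact
    n_max = 3       # guessed maximum number of hits in the analysis

    # Garwood upper limit on the Poisson mean given n_max hits, then the number
    # of histories needed so that mu_upper / N stays below the requirement.
    mu_upper = 0.5 * chi2.ppf(loc, 2 * n_max + 2)
    N = mu_upper / p_max
    print(f"run at least {N:.0f} histories")
    ```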

  12. Theoretical Calculations of Transition Probabilities and Oscillator Strengths for Sc(Ⅲ) and Y(Ⅲ)

    Institute of Scientific and Technical Information of China (English)

    Tian-yi Zhang; Neng-wu Zheng

    2009-01-01

    The Weakest Bound Electron Potential Model theory is used to calculate transition probability values and oscillator strength values for individual lines of Sc(Ⅲ) and Y(Ⅲ). In this method, by solving the Schrödinger equation of the weakest bound electron, the expressions of the energy eigenvalue and the radial function can be obtained. And a coupled equation is used to determine the parameters which are needed in the calculations. The obtained results of Sc(Ⅲ) from this work agree very well with the accepted values taken from the National Institute of Standards and Technology (NIST) database; most deviations are within the accepted level. For Y(Ⅲ) there are no accepted values reported in the NIST database, so we compared our results of Y(Ⅲ) with other theoretical results, and good agreement is also obtained.

  13. Calculating inspector probability of detection using performance demonstration program pass rates

    Science.gov (United States)

    Cumblidge, Stephen; D'Agostino, Amy

    2016-02-01

    The United States Nuclear Regulatory Commission (NRC) staff has been working since the 1970s to ensure that nondestructive testing performed on nuclear power plants in the United States will provide reasonable assurance of structural integrity of the nuclear power plant components. One tool used by the NRC has been the development and implementation of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code Section XI Appendix VIII[1] (Appendix VIII) blind testing requirements for ultrasonic procedures, equipment, and personnel. Some concerns have been raised, over the years, by the relatively low pass rates for the Appendix VIII qualification testing. The NRC staff has applied statistical tools and simulations to determine the expected probability of detection (POD) for ultrasonic examinations under ideal conditions based on the pass rates for the Appendix VIII qualification tests for the ultrasonic testing personnel. This work was primarily performed to answer three questions. First, given a test design and pass rate, what is the expected overall POD for inspectors? Second, can we calculate the probability of detection for flaws of different sizes using this information? Finally, if a previously qualified inspector fails a requalification test, does this call their earlier inspections into question? The calculations have shown that one can expect good performance from inspectors who have passed Appendix VIII testing in a laboratory-like environment, and the requalification pass rates show that the inspectors have maintained their skills between tests. While these calculations showed that the PODs for the ultrasonic inspections are very good under laboratory conditions, the field inspections are conducted in a very different environment. The NRC staff has initiated a project to systematically analyze the human factors differences between qualification testing and field examinations. This work will be used to evaluate and prioritize
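
    The kind of inference described above can be sketched by assuming a simple test design in which an inspector passes by detecting at least k of n flaws; the observed pass rate then pins down an implied per-flaw POD (test parameters here are hypothetical, not the actual Appendix VIII design):

    ```python
    from math import comb
    from scipy.optimize import brentq

    n, k = 10, 8                 # hypothetical test: detect >= 8 of 10 flaws to pass
    observed_pass_rate = 0.6     # observed fraction of candidates who pass

    def pass_rate(pod):
        """Pass probability if each flaw is detected independently with prob. pod."""
        return sum(comb(n, j) * pod**j * (1 - pod)**(n - j) for j in range(k, n + 1))

    pod = brentq(lambda p: pass_rate(p) - observed_pass_rate, 1e-6, 1 - 1e-6)
    print(f"implied per-flaw POD ≈ {pod:.3f}")
    ```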

  14. Peculiarities of high-overtone transition probabilities in carbon monoxide revealed by high-precision calculation

    Energy Technology Data Exchange (ETDEWEB)

    Medvedev, Emile S., E-mail: esmedved@orc.ru [The Institute of Problems of Chemical Physics, Russian Academy of Sciences, Prospect Akademika Semenova 1, 142432 Chernogolovka (Russian Federation); Meshkov, Vladimir V.; Stolyarov, Andrey V. [Department of Chemistry, Lomonosov Moscow State University, Leninskie gory 1/3, 119991 Moscow (Russian Federation); Gordon, Iouli E. [Atomic and Molecular Physics Division, Harvard-Smithsonian Center for Astrophysics, 60 Garden St, Cambridge, Massachusetts 02138 (United States)

    2015-10-21

    In the recent work devoted to the calculation of the rovibrational line list of the CO molecule [G. Li et al., Astrophys. J., Suppl. Ser. 216, 15 (2015)], rigorous validation of the calculated parameters including intensities was carried out. In particular, the Normal Intensity Distribution Law (NIDL) [E. S. Medvedev, J. Chem. Phys. 137, 174307 (2012)] was employed for the validation purposes, and it was found that, in the original CO line list calculated for large changes of the vibrational quantum number up to Δn = 41, intensities with Δn > 11 were unphysical. Therefore, very high overtone transitions were removed from the published list in Li et al. Here, we show how this type of validation is carried out and prove that the quadruple precision is indispensably required to predict the reliable intensities using the conventional 32-bit computers. Based on these calculations, the NIDL is shown to hold up for the 0 → n transitions till the dissociation limit around n = 83, covering 45 orders of magnitude in the intensity. The low-intensity 0 → n transition predicted in the work of Medvedev [Determination of a new molecular constant for diatomic systems. Normal intensity distribution law for overtone spectra of diatomic and polyatomic molecules and anomalies in overtone absorption spectra of diatomic molecules, Institute of Chemical Physics, Russian Academy of Sciences, Chernogolovka, 1984] at n = 5 is confirmed, and two additional “abnormal” intensities are found at n = 14 and 23. Criteria for the appearance of such “anomalies” are formulated. The results could be useful to revise the high-overtone molecular transition probabilities provided in spectroscopic databases.

  15. Burnup calculation by the method of first-flight collision probabilities using average chords prior to the first collision

    Science.gov (United States)

    Karpushkin, T. Yu.

    2012-12-01

    A technique to calculate the burnup of materials of cells and fuel assemblies using the matrices of first-flight neutron collision probabilities rebuilt at a given burnup step is presented. A method to rebuild and correct first collision probability matrices using average chords prior to the first neutron collision, which are calculated with the help of geometric modules of constructed stochastic neutron trajectories, is described. Results of calculation of the infinite multiplication factor for elementary cells with a modified material composition compared to the reference one as well as calculation of material burnup in the cells and fuel assemblies of a VVER-1000 are presented.

  16. Theoretical Calculations of Thermal Broadenings and Transition Probabilities of R, R' and B Line-Groups for Ruby

    Institute of Scientific and Technical Information of China (English)

    MA Dong-Ping; LIU Yan-Yun; CHEN Ju-Rong

    2001-01-01

    On the basis of the unified calculation of the thermal shifts of the R1 line, the R2 line and the ground-state splitting, the transition probabilities of direct and Raman processes have been calculated theoretically. The thermal broadenings of the R, R' and B line-groups have also been obtained. The theoretically predicted transition probabilities are in good agreement with the experimental ones. PACS numbers: 71.70.Ch, 78.20.Nv, 63.20.Mt, 63.20.Kr

  17. Improved techniques for outgoing wave variational principle calculations of converged state-to-state transition probabilities for chemical reactions

    Science.gov (United States)

    Mielke, Steven L.; Truhlar, Donald G.; Schwenke, David W.

    1991-01-01

    Improved techniques and well-optimized basis sets are presented for application of the outgoing wave variational principle to calculate converged quantum mechanical reaction probabilities. They are illustrated with calculations for the reactions D + H2 yields HD + H with total angular momentum J = 3 and F + H2 yields HF + H with J = 0 and 3. The optimization involves the choice of distortion potential, the grid for calculating half-integrated Green's functions, the placement, width, and number of primitive distributed Gaussians, and the computationally most efficient partition between dynamically adapted and primitive basis functions. Benchmark calculations with 224-1064 channels are presented.

  18. Calculation of Fire Severity Factors and Fire Non-Suppression Probabilities For A DOE Facility Fire PRA

    Energy Technology Data Exchange (ETDEWEB)

    Tom Elicson; Bentley Harwood; Jim Bouchard; Heather Lucek

    2011-03-01

    Over a 12 month period, a fire PRA was developed for a DOE facility using the NUREG/CR-6850 EPRI/NRC fire PRA methodology. The fire PRA modeling included calculation of fire severity factors (SFs) and fire non-suppression probabilities (PNS) for each safe shutdown (SSD) component considered in the fire PRA model. The SFs were developed by performing detailed fire modeling through a combination of CFAST fire zone model calculations and Latin Hypercube Sampling (LHS). Component damage times and automatic fire suppression system actuation times calculated in the CFAST LHS analyses were then input to a time-dependent model of fire non-suppression probability. The fire non-suppression probability model is based on the modeling approach outlined in NUREG/CR-6850 and is supplemented with plant specific data. This paper presents the methodology used in the DOE facility fire PRA for modeling fire-induced SSD component failures and includes discussions of modeling techniques for: • Development of time-dependent fire heat release rate profiles (required as input to CFAST), • Calculation of fire severity factors based on CFAST detailed fire modeling, and • Calculation of fire non-suppression probabilities.
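
    The time-dependent non-suppression model referred to above treats suppression as a constant-rate process, so the probability that a fire is not suppressed before a component's damage time t is exp(-λt). A sketch with an illustrative rate and stand-in sampled damage times (not the facility's CFAST results):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    lam = 0.1                                          # suppression rate, per minute
    t_damage = rng.normal(15.0, 3.0, 10_000).clip(0)   # sampled damage times, minutes

    p_ns = np.exp(-lam * t_damage).mean()              # scenario non-suppression prob.
    print(f"P_NS ≈ {p_ns:.2f}")
    ```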

  19. ELIPGRID-PC: A PC program for calculating hot spot probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Davidson, J.R.

    1994-10-01

    ELIPGRID-PC, a new personal computer program, has been developed to provide easy access to Singer's 1972 ELIPGRID algorithm for hot-spot detection probabilities. Three features of the program are the ability to determine: (1) the grid size required for specified conditions, (2) the smallest hot spot that can be sampled with a given probability, and (3) the approximate grid size resulting from specified conditions and sampling cost. ELIPGRID-PC also provides probability of hit versus cost data for graphing with spreadsheets or graphics software. The program has been successfully tested using Singer's published ELIPGRID results. An apparent error in the original ELIPGRID code has been uncovered and an appropriate modification incorporated into the new program.
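
    A hot-spot hit probability of this kind is easy to cross-check by simulation. The sketch below is a generic Monte Carlo check for a circular hot spot on a square grid, not Singer's ELIPGRID algorithm:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    g, r, trials = 10.0, 6.0, 100_000          # grid spacing, hot-spot radius
    centers = rng.uniform(0, g, (trials, 2))   # hot-spot centers in one grid cell

    # By symmetry the nearest grid node to a point in [0, g)^2 is a cell corner.
    corners = np.array([[0, 0], [0, g], [g, 0], [g, g]], dtype=float)
    d2 = ((centers[:, None, :] - corners[None, :, :]) ** 2).sum(axis=-1)
    hit = d2.min(axis=1) <= r * r              # some node falls inside the hot spot

    print(f"P(detect) ≈ {hit.mean():.3f}")
    ```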

  20. Chronology of Postglacial Eruptive Activity and Calculation of Eruption Probabilities for Medicine Lake Volcano, Northern California

    Science.gov (United States)

    Nathenson, Manuel; Donnelly-Nolan, Julie M.; Champion, Duane E.; Lowenstern, Jacob B.

    2007-01-01

    Medicine Lake volcano has had 4 eruptive episodes in its postglacial history (since 13,000 years ago) comprising 16 eruptions. Time intervals between events within the episodes are relatively short, whereas time intervals between the episodes are much longer. An updated radiocarbon chronology for these eruptions is presented that uses paleomagnetic data to constrain the choice of calibrated ages. This chronology is used with exponential, Weibull, and mixed-exponential probability distributions to model the data for time intervals between eruptions. The mixed exponential distribution is the best match to the data and provides estimates for the conditional probability of a future eruption given the time since the last eruption. The probability of an eruption at Medicine Lake volcano in the next year from today is 0.00028.
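
    With a mixed-exponential model for repose intervals, the conditional probability of an eruption given the time since the last one follows directly from the mixture survival function. A sketch with illustrative mixture parameters (not the paper's fitted values):

    ```python
    import numpy as np

    w = np.array([0.7, 0.3])                  # mixture weights
    rate = np.array([1 / 200.0, 1 / 3000.0])  # rates = 1 / mean interval, per year

    def sf(t):
        """Survival function: P(interval between eruptions > t years)."""
        return float(np.sum(w * np.exp(-rate * t)))

    def p_next(t_since, dt=1.0):
        """P(eruption within dt years | t_since years since the last eruption)."""
        return (sf(t_since) - sf(t_since + dt)) / sf(t_since)

    print(f"P(eruption in next year) ≈ {p_next(1000.0):.5f}")
    ```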

  1. Remark about Transition Probabilities Calculation for Single Server Queues with Lognormal Inter-Arrival or Service Time Distributions

    Science.gov (United States)

    Lee, Moon Ho; Dudin, Alexander; Shaban, Alexy; Pokhrel, Subash Shree; Ma, Wen Ping

    Formulae required for accurate approximate calculation of transition probabilities of embedded Markov chain for single-server queues of the GI/M/1, GI/M/1/K, M/G/1, M/G/1/K type with heavy-tail lognormal distribution of inter-arrival or service time are given.

  2. Calculation of absolute concentrations and probability of resonant absorption for iron-bearing precipitates in zirconium alloys

    NARCIS (Netherlands)

    Filippov, V. P.; Petrov, V. I.; Lauer, D. E.; Shikanova, Yu. A.

    2006-01-01

    In order to find the absolute concentrations and the probability of resonant absorption, the theoretical dependence of effective thickness from Mossbauer absorption line area has been obtained. Calculations of absolute concentrations of secondary phase precipitate in zirconium alloys with natural ir

  3. Methodologies and Comparisons for Lund's Two Methods for Calculating Probability of Cloud-Free Line-of-Sight.

    Science.gov (United States)

    Yu, Shawn; Case, Kenneth E.; Chernick, Julian

    1986-03-01

    To help in the implementation of Lund's probability of cloud-free line-of-sight (PCFLOS) calculations (method A and method B) for limited altitudes, a methodology for cumulative cloud cover calculation (required for both methods) is introduced and a methodology for cumulative cloud form determination (required for method B) is developed. To study the PCFLOS differences between the two methods, Lund's master matrices are investigated and the derived PCFLOS results of Hamburg, Germany, are compared and analyzed for variations in selected environmental parameters. Based upon numerical studies performed in this research effort, it is strongly recommended that Lund's method B should always be adopted for general purpose worldwide PCFLOS calculations.

  4. Special Issue on Critical Assessment of Theoretical Calculations of Atomic Structure and Transition Probabilities

    OpenAIRE

    Per Jönsson; Hyun-Kyung Chung

    2013-01-01

    There exist several codes in the atomic physics community to generate atomic structure and transition probabilities, freely and readily distributed to researchers outside the atomic physics community, in the plasma, astrophysical or nuclear physics communities. Users take these atomic physics codes to generate the necessary atomic data or modify the codes for their own applications. However, there has been very little effort to validate and verify the data sets generated by non-expert users. [...

  5. Syntax for calculation of discounting indices from the monetary choice questionnaire and probability discounting questionnaire.

    Science.gov (United States)

    Gray, Joshua C; Amlung, Michael T; Palmer, Abraham A; MacKillop, James

    2016-09-01

    The 27-item Monetary Choice Questionnaire (MCQ; Kirby, Petry, & Bickel, 1999) and 30-item Probability Discounting Questionnaire (PDQ; Madden, Petry, & Johnson, 2009) are widely used, validated measures of preferences for immediate versus delayed rewards and guaranteed versus risky rewards, respectively. The MCQ measures delayed discounting by asking individuals to choose between rewards available immediately and larger rewards available after a delay. The PDQ measures probability discounting by asking individuals to choose between guaranteed rewards and a chance at winning larger rewards. Numerous studies have implicated these measures in addiction and other health behaviors. Unlike typical self-report measures, the MCQ and PDQ generate inferred hyperbolic temporal and probability discounting functions by comparing choice preferences to arrays of functions to which the individual items are preconfigured. This article provides R and SPSS syntax for processing the MCQ and PDQ. Specifically, for the MCQ, the syntax generates k values, consistency of the inferred k, and immediate choice ratios; for the PDQ, the syntax generates h indices, consistency of the inferred h, and risky choice ratios. The syntax is intended to increase the accessibility of these measures, expedite the data processing, and reduce risk for error.
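
    The inference step described above, matching choices against arrays of preconfigured discounting functions, can be sketched as a consistency scan over candidate k values for the hyperbolic value function V = A/(1 + kD). Items and responses below are hypothetical:

    ```python
    import numpy as np

    # Hypothetical MCQ-style items: (immediate amount, delayed amount, delay days).
    items = [(55, 75, 61), (31, 85, 7), (14, 25, 19), (47, 50, 160), (25, 60, 14)]
    chose_delayed = [0, 1, 1, 0, 1]          # hypothetical participant responses

    k_grid = np.logspace(-4, 0, 200)         # candidate discounting rates

    def consistency(k):
        """Fraction of responses matching hyperbolic predictions V = A/(1+kD)."""
        pred = [a_d / (1 + k * d) > a_i for a_i, a_d, d in items]
        return float(np.mean([p == bool(c) for p, c in zip(pred, chose_delayed)]))

    scores = [consistency(k) for k in k_grid]
    k_hat = k_grid[int(np.argmax(scores))]
    print(f"inferred k ≈ {k_hat:.4f} (consistency {max(scores):.2f})")
    ```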

  6. Calculation of identity-by-descent probabilities of short chromosome segments.

    Science.gov (United States)

    Tuchscherer, A; Teuscher, F; Reinsch, N

    2012-12-01

    For some purposes, identity-by-descent (IBD) probabilities for entire chromosome segments are required. Making use of pedigree information, length of the segment and the assumption of no crossing-over, a generalization of a previously published graph theory oriented algorithm accounting for nonzero IBD of common ancestors is given, which can be viewed as method of path coefficients for entire chromosome segments. Furthermore, rules for setting up a gametic version of a segmental IBD matrix are presented. Results from the generalized graph theory oriented method, the gametic segmental IBD matrix and the segmental IBD matrix for individuals are identical.

  7. Survival probability for diffractive dijet production in p-pbar collisions from next-to-leading order calculations

    CERN Document Server

    Klasen, M

    2009-01-01

    We perform next-to-leading order calculations of the single-diffractive and non-diffractive cross sections for dijet production in proton-antiproton collisions at the Tevatron. By comparing their ratio to the data published by the CDF collaboration for two different center-of-mass energies, we deduce the rapidity-gap survival probability as a function of the momentum fraction of the parton in the antiproton. Assuming Regge factorization, this probability can be interpreted as a suppression factor for the diffractive structure function measured in deep-inelastic scattering at HERA. In contrast to the observations for photoproduction, the suppression factor in proton-antiproton collisions depends on the momentum fraction of the parton in the Pomeron even at next-to-leading order.

  8. Survival probability for diffractive dijet production in pp¯ collisions from next-to-leading order calculations

    Science.gov (United States)

    Klasen, Michael; Kramer, Gustav

    2009-10-01

    We perform next-to-leading order calculations of the single-diffractive and nondiffractive cross sections for dijet production in proton-antiproton collisions at the Tevatron. By comparing their ratio to the data published by the CDF collaboration for two different center-of-mass energies, we deduce the rapidity-gap survival probability as a function of the momentum fraction of the parton in the antiproton. Assuming Regge factorization, this probability can be interpreted as a suppression factor for the diffractive structure function measured in deep-inelastic scattering at HERA. In contrast to the observations for photoproduction, the suppression factor in proton-antiproton collisions depends on the momentum fraction of the parton in the Pomeron even at next-to-leading order.

  9. Survival probability for diffractive dijet production in p anti p collisions from next-to-leading order calculations

    Energy Technology Data Exchange (ETDEWEB)

    Klasen, M. [Univ. Joseph Fourier, Laboratoire de Physique Subatomique et de Cosmologie, Grenoble (France); Kramer, G. [Univ. Hamburg, II. Inst. fuer Theoretische Physik (Germany)

    2009-08-15

    We perform next-to-leading order calculations of the single-diffractive and non-diffractive cross sections for dijet production in proton-antiproton collisions at the Tevatron. By comparing their ratio to the data published by the CDF collaboration for two different center-of-mass energies, we deduce the rapidity-gap survival probability as a function of the momentum fraction of the parton in the antiproton. Assuming Regge factorization, this probability can be interpreted as a suppression factor for the diffractive structure function measured in deep-inelastic scattering at HERA. In contrast to the observations for photoproduction, the suppression factor in proton-antiproton collisions depends on the momentum fraction of the parton in the Pomeron even at next-to-leading order. (orig.)

  10. A Graph Model for calculating the probability for a moving cyclic disturbance interacting at a particular spatial position

    CERN Document Server

    Brown, D

    2003-01-01

    The analysis follows an earlier paper - Brown (2003) - which analysed a moving disturbance using a directed cyclic graph defined as Interrelated Fluctuating Entities (IFEs) of /STATE/, /SPACE/, /alphaTIME/, /betaTIME/. This paper provides a statistical analysis of the alternative positions in space and state of an IFE for a defined total time magnitude. The probability for a freely moving entity interacting in a particular spatial position is calculated and a formulation is derived for the minimum locus of uncertainty in position and momentum. The model has proven amenable to computer modelling (the assistance of University College London Computer Science department is gratefully acknowledged). A computer model is available on request.

  11. Web Service for Calculating the Probability of Returning a Loan – Design, Implementation and Deployment

    Directory of Open Access Journals (Sweden)

    Julian VASILEV

    2014-01-01

    Full Text Available The purpose of this paper is to describe the process of designing, creating, implementing and deploying a real web service. A basic theory approach is used to analyze the implementation of web services. An existing profit model is used. Its business logic is integrated within a web service. Another desktop application is created to demonstrate the use of the recently created web service. This study shows a methodology for fast development and deployment of web services. The methodology has wide practical implications – in credit institutions and banks when giving a loan. This study is the first of its kind to show the design, implementation and deployment of a web service for calculating the probability of returning a loan. The methodology may be used for the encapsulation of other business logic into web services.

  12. GTNEUT: A code for the calculation of neutral particle transport in plasmas based on the Transmission and Escape Probability method

    Science.gov (United States)

    Mandrekas, John

    2004-08-01

    GTNEUT is a two-dimensional code for the calculation of the transport of neutral particles in fusion plasmas. It is based on the Transmission and Escape Probabilities (TEP) method and can be considered a computationally efficient alternative to traditional Monte Carlo methods. The code has been benchmarked extensively against Monte Carlo and has been used to model the distribution of neutrals in fusion experiments. Program summary: Title of program: GTNEUT. Catalogue identifier: ADTX. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADTX. Computer for which the program is designed and others on which it has been tested: The program was developed on a SUN Ultra 10 workstation and has been tested on other Unix workstations and PCs. Operating systems or monitors under which the program has been tested: Solaris 8, 9, HP-UX 11i, Linux Red Hat v8.0, Windows NT/2000/XP. Programming language used: Fortran 77. Memory required to execute with typical data: 6 219 388 bytes. No. of bits in a word: 32. No. of processors used: 1. Has the code been vectorized or parallelized?: No. No. of bytes in distributed program, including test data, etc.: 300 709. No. of lines in distributed program, including test data, etc.: 17 365. Distribution format: compressed tar gzip file. Keywords: Neutral transport in plasmas, Escape probability methods. Nature of physical problem: This code calculates the transport of neutral particles in thermonuclear plasmas in two-dimensional geometric configurations. Method of solution: The code is based on the Transmission and Escape Probability (TEP) methodology [1], which is part of the family of integral transport methods for neutral particles and neutrons. The resulting linear system of equations is solved by standard direct linear system solvers (sparse and non-sparse versions are included). Restrictions on the complexity of the problem: The current version of the code can

  13. PAPIN: A Fortran-IV program to calculate cross section probability tables, Bondarenko and transmission self-shielding factors for fertile isotopes in the unresolved resonance region

    Energy Technology Data Exchange (ETDEWEB)

    Munoz-Cobos, J.G.

    1981-08-01

    The Fortran IV code PAPIN has been developed to calculate cross section probability tables, Bondarenko self-shielding factors and average self-indication ratios for non-fissile isotopes, below the inelastic threshold, on the basis of the ENDF/B prescriptions for the unresolved resonance region. Monte-Carlo methods are utilized to generate ladders of resonance parameters in the unresolved resonance region, from average resonance parameters and their appropriate distribution functions. The neutron cross-sections are calculated by the single level Breit-Wigner (SLBW) formalism, with s, p and d-wave contributions. The cross section probability tables are constructed by sampling the Doppler-broadened cross sections. The various self-shielded factors are computed numerically as Lebesgue integrals over the cross section probability tables. The program PAPIN has been validated through extensive comparisons with several deterministic codes.

  14. Tumor control probability and the utility of 4D vs 3D dose calculations for stereotactic body radiotherapy for lung cancer

    Energy Technology Data Exchange (ETDEWEB)

    Valdes, Gilmer, E-mail: gilmer.valdes@uphs.upenn.edu [Department of Radiation Oncology, Perelman Center for Advanced Medicine, University of Pennsylvania, Philadelphia, PA (United States); Robinson, Clifford [Department of Radiation Oncology, Siteman Cancer Center, Washington University in St. Louis, St. Louis, MO (United States); Lee, Percy [Department of Radiation Oncology, David Geffen School of Medicine, UCLA, Los Angeles, CA (United States); Morel, Delphine [Department of Biomedical Engineering, AIX Marseille 2 University, Marseille (France); Department of Medical Physics, Joseph Fourier University, Grenoble (France); Low, Daniel; Iwamoto, Keisuke S.; Lamb, James M. [Department of Radiation Oncology, David Geffen School of Medicine, UCLA, Los Angeles, CA (United States)

    2015-04-01

    Four-dimensional (4D) dose calculations for lung cancer radiotherapy have been technically feasible for a number of years but have not become standard clinical practice. The purpose of this study was to determine if clinically significant differences in tumor control probability (TCP) exist between 3D and 4D dose calculations so as to inform the decision whether 4D dose calculations should be used routinely for treatment planning. Radiotherapy plans for Stage I-II lung cancer were created for 8 patients. Clinically acceptable treatment plans were created with dose calculated on the end-exhale 4D computed tomography (CT) phase using a Monte Carlo algorithm. Dose was then projected onto the remaining 9 phases of 4D-CT using the Monte Carlo algorithm and accumulated onto the end-exhale phase using commercially available deformable registration software. The resulting dose-volume histograms (DVH) of the gross tumor volume (GTV), planning tumor volume (PTV), and PTV_setup were compared according to target coverage and dose. The PTV_setup was defined as a volume including the GTV and a margin for setup uncertainties but not for respiratory motion. TCPs resulting from these DVHs were estimated using a wide range of alphas, betas, and tumor cell densities. Differences of up to 5 Gy were observed between 3D and 4D calculations for a PTV with highly irregular shape. When the TCP was calculated using the resulting DVHs for fractionation schedules typically used in stereotactic body radiation therapy (SBRT), the TCP differed at most by 5% between 4D and 3D cases, and in most cases, it was by less than 1%. We conclude that 4D dose calculations are not necessary for most cases treated with SBRT, but they might be valuable for irregularly shaped target volumes. If 4D calculations are used, 4D DVHs should be evaluated on volumes that include margin for setup uncertainty but not respiratory motion.
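
For readers unfamiliar with the TCP model used above, here is a minimal Poisson-TCP sketch: the tumour control probability is the probability that no clonogenic cell survives, computed from a per-voxel dose distribution. The radiobiological parameter values and the toy DVHs are invented; they are not the paper's.

```python
import numpy as np

def tcp_poisson(dose_gy, volume_cc, rho=1e7, alpha=0.15, beta=0.015, n_frac=3):
    """Poisson TCP: exp(-expected number of surviving clonogens).

    dose_gy, volume_cc: per-voxel dose and volume taken from a DVH;
    rho: clonogen density per cc; LQ survival with n_frac fractions.
    All parameter values here are illustrative placeholders.
    """
    dose_gy = np.asarray(dose_gy, dtype=float)
    d = dose_gy / n_frac                          # dose per fraction
    sf = np.exp(-(alpha + beta * d) * dose_gy)    # LQ surviving fraction
    return np.exp(-np.sum(rho * np.asarray(volume_cc) * sf))

# Two toy DVHs differing by a 5 Gy cold spot, as a 3D vs 4D dose
# calculation of an irregular target might produce:
v = np.full(100, 0.3)                 # 100 voxels of 0.3 cc
d3 = np.full(100, 54.0)               # SBRT-like 3 x 18 Gy
d4 = d3.copy(); d4[:10] -= 5.0        # cold spot in 10 voxels
print(f"TCP(3D) = {tcp_poisson(d3, v):.3f}, TCP(4D) = {tcp_poisson(d4, v):.3f}")
```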

  15. Relativistic calculations of charge transfer probabilities in U92+ - U91+(1s) collisions using the basis set of cubic Hermite splines

    CERN Document Server

    Maltsev, I A; Tupitsyn, I I; Shabaev, V M; Kozhedub, Y S; Plunien, G; Stoehlker, Th

    2013-01-01

    A new approach for solving the time-dependent two-center Dirac equation is presented. The method is based on using the finite basis set of cubic Hermite splines on a two-dimensional lattice. The Dirac equation is treated in a rotating reference frame. The collision of U92+ (as a projectile) and U91+ (as a target) is considered at the energy E_lab = 6 MeV/u. The charge transfer probabilities are calculated for different values of the impact parameter. The obtained results are compared with the previous calculations [I. I. Tupitsyn et al., Phys. Rev. A 82, 042701 (2010)], where a method based on atomic-like Dirac-Sturm orbitals was employed. This work can provide a new tool for the investigation of quantum electrodynamics effects in heavy-ion collisions near the supercritical regime.

  16. Ab initio quantum mechanical calculation of the reaction probability for the Cl-+PH2Cl→ClPH2+Cl- reaction

    Science.gov (United States)

    Farahani, Pooria; Lundberg, Marcus; Karlsson, Hans O.

    2013-11-01

    SN2 substitution reactions at phosphorus play a key role in organic and biological processes. Quantum molecular dynamics simulations have been performed to study the prototype reaction Cl-+PH2Cl→ClPH2+Cl-, using one- and two-dimensional models. A potential energy surface, showing an energy well for a transition complex, was generated using ab initio electronic structure calculations. The one-dimensional model is essentially reflection free, whereas the more realistic two-dimensional model displays involved resonance structures in the reaction probability. The reaction rate is almost two orders of magnitude smaller for the two-dimensional model than for the one-dimensional model. Energetic errors in the potential energy surface are estimated to affect the rate by only a factor of two. This shows that for these types of reactions it is more important to increase the dimensionality of the modeling than to increase the accuracy of the electronic structure calculation.

  17. Continuous-Energy Adjoint Flux and Perturbation Calculation using the Iterated Fission Probability Method in Monte Carlo Code TRIPOLI-4® and Underlying Applications

    Science.gov (United States)

    Truchet, G.; Leconte, P.; Peneliau, Y.; Santamarina, A.; Malvagi, F.

    2014-06-01

    Pile-oscillation experiments are performed in the MINERVE reactor at the CEA Cadarache to improve nuclear data accuracy. In order to precisely calculate the small reactivity variations involved in these experiments, a reference calculation needs to be achieved. This calculation may be accomplished using the continuous-energy Monte Carlo code TRIPOLI-4® by using the eigenvalue difference method. This "direct" method has shown limitations in the evaluation of very small reactivity effects because it needs to reach a very small variance associated with the reactivity in both states. To address this problem, it was decided to implement the exact perturbation theory in TRIPOLI-4® and, consequently, to calculate a continuous-energy adjoint flux. The Iterated Fission Probability (IFP) method was chosen because it has shown great results in some other Monte Carlo codes. The IFP method uses a forward calculation to compute the adjoint flux, and consequently it does not rely on complex code modifications but on the physical definition of the adjoint flux as a phase-space neutron importance. In the first part of this paper, the IFP method implemented in TRIPOLI-4® is described. To illustrate the efficiency of the method, several adjoint fluxes are calculated and compared with their equivalents obtained by the deterministic code APOLLO-2. The new implementation can calculate the angular adjoint flux. In the second part, a procedure to carry out an exact perturbation calculation is described. A single-cell benchmark has been used to test the accuracy of the method, compared with the "direct" estimation of the perturbation. Once again the method based on the IFP shows good agreement for a calculation time far shorter than that of the "direct" method. The main advantage of the method is that the relative accuracy of the reactivity variation does not depend on the magnitude of the variation itself, which allows us to calculate very small reactivity perturbations with high precision. Other applications of

  18. Calculation Methods of Mathematical Expectation in Probability Theory

    Institute of Scientific and Technical Information of China (English)

    李晓燕; 黄丽莉

    2014-01-01

    Through an account of the historical development of probability theory and of mathematical expectation, this article shows the necessity of studying probability and mathematical expectation and their wide range of applications. It then discusses several calculation methods for mathematical expectation that are of practical significance, with the aim of making this mathematical model easier to apply in teaching and in practice.
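
A short hedged example of two of the standard calculation routes the article surveys: the definition E[X] = Σ xᵢ pᵢ for a discrete distribution, and simulation via the law of large numbers. The distribution is invented.

```python
import numpy as np

# 1) Definition: E[X] = sum x_i * p_i  (a made-up loaded die)
values = np.array([1, 2, 3, 4, 5, 6])
probs  = np.array([0.10, 0.10, 0.15, 0.15, 0.20, 0.30])
print("definition:", np.sum(values * probs))          # 4.15

# 2) Law of large numbers: the sample mean of simulated draws
#    converges to the same expectation.
rng = np.random.default_rng(0)
sample = rng.choice(values, size=200_000, p=probs)
print("simulation:", sample.mean())
```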

  19. Efficient calculation of detection probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Thoreson, Gregory G., E-mail: gthoreson@mail.utexas.ed [University of Texas - Austin, Pickle Research Campus, R-9000, Austin, TX 78712 (United States); Schneider, Erich A. [University of Texas - Austin, Pickle Research Campus, R-9000, Austin, TX 78712 (United States)

    2010-04-11

    Radiation transport simulations have found wide use as a detector and system design tool for smuggled nuclear material interdiction applications. A major obstacle to the utility of Monte Carlo radiation transport to this class of problems is the computational burden associated with simulating a spanning set of threat scenarios. One common method for circumventing this obstacle models a subset of detailed scenarios which are considered representative of the system. Another simplifies the threat scenarios, enabling many cases to be simulated at the cost of a loss of fidelity. This paper demonstrates a new approach to the problem of modeling a very large scenario set. The scenario is disaggregated into components in which radiation transport may be simulated independently. Green's functions for each submodel are generated, parameterized with respect to major scenario variables, and convolved to create a depiction of the radiation transport within the entire scenario. With this approach, the computation time required to model many different scenarios is greatly reduced. The theoretical basis of this algorithm is presented along with validation results that show it to be comparable in fidelity to more computationally intensive methods, in particular brute-force simulation.
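
The following toy sketch, with entirely invented spectra, illustrates the decomposition idea: simulate each submodel once, store its Green's function (impulse response), and obtain the full-scenario result by convolution instead of re-running transport end to end for every scenario.

```python
import numpy as np

energy = np.linspace(0.0, 3.0, 301)                      # MeV grid
source = np.exp(-0.5 * ((energy - 1.33) / 0.02) ** 2)    # fake 1.33 MeV line

def shield_green(thickness_cm: float) -> np.ndarray:
    """Fake shield impulse response, parameterized by thickness:
    exponential attenuation plus a crude scatter tail (invented shapes)."""
    tail = np.exp(-np.arange(301) / 40.0)
    return np.exp(-0.1 * thickness_cm) * tail / tail.sum()

# Emerging spectrum for one scenario = source convolved with the shield
# response; a detector response could be chained on by another convolution.
emerging = np.convolve(source, shield_green(5.0))[:301]
print("transmitted fraction:", emerging.sum() / source.sum())
```

Because each Green's function is parameterized (here only by thickness), many scenario variants can be evaluated from the same precomputed pieces, which is the source of the speed-up the abstract describes.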

  20. Simultaneous analysis of matter radii, transition probabilities, and excitation energies of Mg isotopes by angular-momentum-projected configuration-mixing calculations

    Science.gov (United States)

    Shimada, Mitsuhiro; Watanabe, Shin; Tagami, Shingo; Matsumoto, Takuma; Shimizu, Yoshifumi R.; Yahiro, Masanobu

    2016-06-01

    We perform simultaneous analysis of (1) matter radii, (2) B(E2; 0+ → 2+) transition probabilities, and (3) excitation energies, E(2+) and E(4+), for 24-40Mg by using the beyond-mean-field (BMF) framework with angular-momentum-projected configuration mixing with respect to the axially symmetric β2 deformation with infinitesimal cranking. The BMF calculations successfully reproduce all of the data for r_m, B(E2), and E(2+) and E(4+), indicating that it is quite useful for data analysis, particularly for low-lying states. We also discuss the absolute value of the deformation parameter β2 deduced from measured values of B(E2) and r_m. This framework makes it possible to investigate the effects of β2 deformation, the change in β2 due to restoration of rotational symmetry, β2 configuration mixing, and the inclusion of time-odd components by infinitesimal cranking. Under the assumption of axial deformation and parity conservation, we clarify which effect is important for each of the three measurements and propose the kinds of BMF calculations that are practical for each of the three kinds of observables.

  1. Simultaneous analysis of matter radii, transition probabilities, and excitation energies of Mg isotopes by angular-momentum-projected configuration-mixing calculations

    CERN Document Server

    Shimada, Mitsuhiro; Tagami, Shingo; Matsumoto, Takuma; Shimizu, Yoshifumi R; Yahiro, Masanobu

    2016-01-01

    We perform simultaneous analysis of (1) matter radii, (2) $B(E2; 0^+ \\rightarrow 2^+ )$ transition probabilities, and (3) excitation energies, $E(2^+)$ and $E(4^+)$, for $^{24-40}$Mg by using the beyond mean-field (BMF) framework with angular-momentum-projected configuration mixing with respect to the axially symmetric $\\beta_2$ deformation with infinitesimal cranking. The BMF calculations successfully reproduce all of the data for $r_{\\rm m}$, $B(E2)$, and $E(2^+)$ and $E(4^+)$, indicating that it is quite useful for data analysis, particularly for low-lying states. We also discuss the absolute value of the deformation parameter $\\beta_2$ deduced from measured values of $B(E2)$ and $r_{\\rm m}$. This framework makes it possible to investigate the effects of $\\beta_2$ deformation, the change in $\\beta_2$ due to restoration of rotational symmetry, $\\beta_2$ configuration mixing, and the inclusion of time-odd components by infinitesimal cranking. Under the assumption of axial deformation and parity conservation,...

  2. Monte Carlo dose calculations and radiobiological modelling: analysis of the effect of the statistical noise of the dose distribution on the probability of tumour control.

    Science.gov (United States)

    Buffa, F M; Nahum, A E

    2000-10-01

    The aim of this work is to investigate the influence of the statistical fluctuations of Monte Carlo (MC) dose distributions on the dose volume histograms (DVHs) and radiobiological models, in particular the Poisson model for tumour control probability (tcp). The MC matrix is characterized by a mean dose in each scoring voxel, d, and a statistical error on the mean dose, sigma(d); whilst the quantities d and sigma(d) depend on many statistical and physical parameters, here we consider only their dependence on the phantom voxel size and the number of histories from the radiation source. Dose distributions from high-energy photon beams have been analysed. It has been found that the DVH broadens when increasing the statistical noise of the dose distribution, and the tcp calculation systematically underestimates the real tumour control value, defined here as the value of tumour control when the statistical error of the dose distribution tends to zero. When increasing the number of energy deposition events, either by increasing the voxel dimensions or increasing the number of histories from the source, the DVH broadening decreases and tcp converges to the 'correct' value. It is shown that the underestimation of the tcp due to the noise in the dose distribution depends on the degree of heterogeneity of the radiobiological parameters over the population; in particular this error decreases with increasing the biological heterogeneity, whereas it becomes significant in the hypothesis of a radiosensitivity assay for single patients, or for subgroups of patients. It has been found, for example, that when the voxel dimension is changed from a cube with sides of 0.5 cm to a cube with sides of 0.25 cm (with a fixed number of histories of 10^8 from the source), the systematic error in the tcp calculation is about 75% in the homogeneous hypothesis, and it decreases to a minimum value of about 15% in a case of high radiobiological heterogeneity. The possibility of using the error
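
The underestimation mechanism described above is easy to reproduce numerically. In this hedged toy model (all values invented), zero-mean Gaussian noise is added to a uniform voxel dose distribution; because the cell-survival exponential is convex, the noise inflates the expected number of surviving cells and the Poisson tcp drops.

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, rho_v = 0.3, 1e5            # radiosensitivity, clonogens per voxel (invented)
true_dose = np.full(10_000, 70.0)  # uniform 70 Gy over 10^4 voxels

for sigma in (0.0, 1.0, 2.0, 4.0):                 # Gy, MC noise level
    noisy = true_dose + rng.normal(0.0, sigma, true_dose.size)
    survivors = np.sum(rho_v * np.exp(-alpha * noisy))
    print(f"sigma = {sigma:3.1f} Gy   tcp = {np.exp(-survivors):.3f}")
```

The printed tcp decreases monotonically with the noise level even though the mean dose is unchanged, which is exactly the systematic bias the paper quantifies.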

  3. Dynamical Simulation of Probabilities

    Science.gov (United States)

    Zak, Michail

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed. Special attention was focused upon coupled stochastic processes, defined in terms of conditional probabilities, for which a joint probability does not exist. Simulations of quantum probabilities are also discussed.

  4. Full-dimensional and reduced-dimensional calculations of initial state-selected reaction probabilities studying the H + CH4 → H2 + CH3 reaction on a neural network PES

    Science.gov (United States)

    Welsch, Ralph; Manthe, Uwe

    2015-02-01

    Initial state-selected reaction probabilities of the H + CH4 → H2 + CH3 reaction are calculated in full and reduced dimensionality on a recent neural network potential [X. Xu, J. Chen, and D. H. Zhang, Chin. J. Chem. Phys. 27, 373 (2014)]. The quantum dynamics calculation employs the quantum transition state concept and the multi-layer multi-configurational time-dependent Hartree approach and rigorously studies the reaction for vanishing total angular momentum (J = 0). The calculations investigate the accuracy of the neural network potential and study the effect resulting from a reduced-dimensional treatment. Very good agreement is found between the present results obtained on the neural network potential and previous results obtained on a Shepard interpolated potential energy surface. The reduced-dimensional calculations only consider motion in eight degrees of freedom and retain the C3v symmetry of the methyl fragment. Considering reaction starting from the vibrational ground state of methane, the reaction probabilities calculated in reduced dimensionality are moderately shifted in energy compared to the full-dimensional ones but otherwise agree rather well. Similar agreement is also found if reaction probabilities averaged over similar types of vibrational excitation of the methane reactant are considered. In contrast, significant differences between reduced and full-dimensional results are found for reaction probabilities starting specifically from symmetric stretching, asymmetric (f2-symmetric) stretching, or e-symmetric bending excited states of methane.

  5. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially...

  6. Quantum probability

    CERN Document Server

    Gudder, Stanley P

    2014-01-01

    Quantum probability is a subtle blend of quantum mechanics and classical probability theory. Its important ideas can be traced to the pioneering work of Richard Feynman in his path integral formalism.Only recently have the concept and ideas of quantum probability been presented in a rigorous axiomatic framework, and this book provides a coherent and comprehensive exposition of this approach. It gives a unified treatment of operational statistics, generalized measure theory and the path integral formalism that can only be found in scattered research articles.The first two chapters survey the ne

  7. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  8. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  9. Lexicographic Probability, Conditional Probability, and Nonstandard Probability

    Science.gov (United States)

    2009-11-11

    the following conditions: CP1. µ(U|U) = 1 if U ∈ F′. CP2. µ(V1 ∪ V2 |U) = µ(V1|U) + µ(V2|U) if V1 ∩ V2 = ∅, U ∈ F′, and V1, V2 ∈ F. CP3. µ(V|U) = µ(V|X) × µ(X|U) if V ⊆ X ⊆ U, U, X ∈ F′, V ∈ F. Note that it follows from CP1 and CP2 that µ(·|U) is a probability measure on (W, F) (and, in... CP2 hold. This is easily seen to determine µ. Moreover, µ vacuously satisfies CP3, since there do not exist distinct sets U and X in F′ such that U

  10. The comparison of calculated transition probabilities with luminescence characteristics of erbium(III) in fluoride glasses and in the mixed yttrium-zirconium oxide crystal

    Science.gov (United States)

    Reisfeld, R.; Katz, G.; Jacoboni, C.; De Pape, R.; Drexhage, M. G.; Brown, R. N.; Jørgensen, C. K.

    1983-07-01

    Fluorozirconate glasses containing 2 mole% ErF 3 were prepared by melting the binary fluorides with ammonium bifluoride under an atmosphere of carbon tetrachloride and argon at 850°C. Absorption spectra of these glasses were obtained and the Judd-Ofelt parameters were calculated. Emission spectra and lifetimes of erbium in fluorozirconate glass, in lead-gallium-zinc fluoride glass, and in yttrium-zirconium oxide crystal were measured and compared with the theoretical calculations. Laser emission lines in these materials are deduced from these measurements. It is suggested that materials doped with erbium may serve as light sources for fiber optic waveguides made from the undoped materials.

  11. Improving Ranking Using Quantum Probability

    CERN Document Server

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, given the same data used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields a higher probability of detection than ranking by classical probability, for a given probability of false alarm and the same parameter estimation data. As quantum probability has provided more effective detectors than classical probability in domains other than data management, we conjecture that a system implementing subspace-based detectors will be more effective than one implementing set-based detectors, the effectiveness being calculated as expected recall estimated over the probability of detection and expected fallout estimated over the probability of false alarm.

  12. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think of insurance companies facing losses due to natural disasters, banks seeking protection against huge losses, failures in expensive and sophisticated systems or loss of valuable information in electronic systems. The main difficulty when dealing with this kind of problems is the unavailability of a closed...

  13. Monte Carlo transition probabilities

    OpenAIRE

    Lucy, L. B.

    2001-01-01

    Transition probabilities governing the interaction of energy packets and matter are derived that allow Monte Carlo NLTE transfer codes to be constructed without simplifying the treatment of line formation. These probabilities are such that the Monte Carlo calculation asymptotically recovers the local emissivity of a gas in statistical equilibrium. Numerical experiments with one-point statistical equilibrium problems for Fe II and Hydrogen confirm this asymptotic behaviour. In addition, the re...

  14. Relativistic calculations of the K-K charge transfer and K-vacancy production probabilities in low-energy ion-atom collisions

    CERN Document Server

    Tupitsyn, I I; Shabaev, V M; Bondarev, A I; Deyneka, G B; Maltsev, I A; Hagmann, S; Plunien, G; Stoehlker, Th

    2011-01-01

    The previously developed technique for the evaluation of charge-transfer and electron-excitation processes in low-energy heavy-ion collisions [I.I. Tupitsyn et al., Phys. Rev. A 82, 042701 (2010)] is extended to collisions of ions with neutral atoms. The method employs the active electron approximation, in which only the active electron participates in the charge transfer and excitation processes while the passive electrons provide the screening DFT potential. The time-dependent Dirac wave function of the active electron is represented as a linear combination of atomic-like Dirac-Fock-Sturm orbitals localized at the ions (atoms). The screening DFT potential is calculated using the overlapping densities of the ions (atoms), derived from the atomic orbitals of the passive electrons. The atomic orbitals are generated by solving numerically the one-center Dirac-Fock and Dirac-Fock-Sturm equations by means of a finite-difference approach, with the potential taken as the sum of the exact reference ion (atom) Dirac-Fock...

  15. Effect of physicochemical aging conditions on the composite-composite repair bond strength

    NARCIS (Netherlands)

    Brendeke, Johannes; Ozcan, Mutlu

    2007-01-01

    Purpose: This study evaluated the effect of different physicochemical aging methods and surface conditioning techniques on the repair bond strength of composite. It was hypothesized that the aging conditions would decrease the repair bond strength and surface conditioning methods would perform simil

  16. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...

  17. Landau-Zener Probability Reviewed

    CERN Document Server

    Valencia, C

    2008-01-01

    We examine the survival probability for neutrino propagation through matter with variable density. We present a new method to calculate the level-crossing probability that differs from Landau's method by a constant factor, which is relevant to the interpretation of the neutrino flux from a supernova explosion.

  18. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  19. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  20. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes' theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  1. Effect of age condition on fatigue properties of 2E12 aluminum alloy

    Institute of Scientific and Technical Information of China (English)

    YAN Liang; DU Feng-shan; DAI Sheng-long; YANG Shou-jie

    2010-01-01

    The fatigue behaviors of 2E12 aluminum alloy in T3 and T6 conditions at room temperature in air were investigated. The microstructures and fatigue fracture surfaces of the alloy were examined by transmission electron microscopy (TEM) and scanning electron microscopy (SEM). The results show that the alloy exhibits higher fatigue crack propagation (FCP) resistance in the T3 condition than in the T6 condition; the fatigue life is increased by 54% and the fatigue crack growth rate (FCGR) decreases significantly. The fatigue fractures of the alloy in T3 and T6 conditions are transgranular. But in the T3 condition, secondary cracks occur and fatigue striations are not clear. In the T6 condition, ductile fatigue striations are observed. The effect of aging conditions on fatigue behaviors is explained in terms of the slip planarity of dislocations and the cyclic slip reversibility.

  2. Quantitative Analysis of Ageing Condition of Insulating Paper Using Infrared Spectroscopy

    Directory of Open Access Journals (Sweden)

    R. Saldivar-Guerrero

    2016-01-01

    Transformers are very expensive apparatuses and are vital to make the whole power system run normally. Failures in such apparatuses could put them out of service, causing severe economic losses. The life of a transformer can be effectively determined by the life of the insulating paper. In the present work, we show an alternative diagnostic technique to determine the ageing condition of transformer paper by the use of FTIR spectroscopy and an empirical model. This method has the advantage of using a microsample that can be extracted from the transformer on-site. The proposed technique offers an approximate quantitative evaluation of the degree of polymerization of dielectric papers and could be used for transformer diagnosis and remaining-life estimation.

  3. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  4. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
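
One of the classic 1/e problems is the derangement: the probability that a random permutation leaves no element in its original place tends to 1/e. A quick simulation (illustrative, not taken from the article):

```python
import math
import random

def is_derangement(n: int) -> bool:
    """Shuffle 0..n-1 and check that no element stays in place."""
    perm = list(range(n))
    random.shuffle(perm)
    return all(perm[i] != i for i in range(n))

trials = 100_000
hits = sum(is_derangement(20) for _ in range(trials))
print("simulated:", hits / trials, "  1/e =", 1 / math.e)
```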

  5. Cluster pre-existence probability

    Energy Technology Data Exchange (ETDEWEB)

    Rajeswari, N.S.; Vijayaraghavan, K.R.; Balasubramaniam, M. [Bharathiar University, Department of Physics, Coimbatore (India)

    2011-10-15

    The pre-existence probabilities of the fragments for the complete binary spectrum of different systems such as ^56Ni, ^116Ba, ^226Ra and ^256Fm are calculated from the overlapping part of the interaction potential using the WKB approximation. The role of the reduced mass as well as the classical hydrodynamical mass in the WKB method is analysed. Within WKB, the pre-existence probability is calculated even for negative Q-value systems. The calculations reveal rich structural information. The calculated results are compared with the values of the preformed cluster model of Gupta and collaborators. The mass asymmetry motion is shown here for the first time as a part of the relative separation motion. (orig.)

  6. Emptiness Formation Probability

    Science.gov (United States)

    Crawford, Nicholas; Ng, Stephen; Starr, Shannon

    2016-08-01

    We present rigorous upper and lower bounds on the emptiness formation probability for the ground state of a spin-1/2 Heisenberg XXZ quantum spin system. For a d-dimensional system we find a rate of decay of the order exp(-c L^{d+1}), where L is the sidelength of the box in which we ask for the emptiness formation event to occur. In the d = 1 case this confirms previous predictions made in the integrable systems community, though our bounds do not achieve the precision predicted by Bethe ansatz calculations. On the other hand, our bounds in the case d ≥ 2 are new. The main tools we use are reflection positivity and a rigorous path integral expansion, which is a variation on those previously introduced by Toth, Aizenman-Nachtergaele and Ueltschi.

  7. Nuclear structure of tellurium 133 via beta decay and shell model calculations in the doubly magic tin 132 region. [J, π, transition probabilities, neutron and proton separation, g factors]

    Energy Technology Data Exchange (ETDEWEB)

    Lane, S.M.

    1979-08-01

    An experimental investigation of the level structure of ^133Te was performed by spectroscopy of gamma-rays following the beta-decay of 2.7 min ^133Sb. Multiscaled gamma-ray singles spectra and 2.5 × 10^7 gamma-gamma coincidence events were used in the assignment of 105 of the approximately 400 observed gamma-rays to ^133Sb decay and in the construction of the ^133Te level scheme with 29 excited levels. One hundred twenty-two gamma-rays were identified as originating in the decay of other isotopes of Sb or their daughter products. The remaining gamma-rays were associated with the decay of impurity atoms or have as yet not been identified. A new computer program based on the Lanczos tridiagonalization algorithm using an uncoupled m-scheme basis and vector manipulations was written. It was used to calculate energy levels, parities, spins, model wavefunctions, neutron and proton separation energies, and some electromagnetic transition probabilities for the following nuclei in the ^132Sn region: ^128Sn, ^129Sn, ^130Sn, ^131Sn, ^130Sb, ^131Sb, ^132Sb, ^133Sb, ^132Te, ^133Te, ^134Te, ^134I, ^135I, ^135Xe, and ^136Xe. The results are compared with experiment and the agreement is generally good. For non-magic nuclei, the 1g_{7/2}, 2d_{5/2}, 2d_{3/2}, 1h_{11/2}, and 3s_{1/2} orbitals are available to valence protons and the 2d_{5/2}, 2d_{3/2}, 1h_{11/2}, and 3s_{1/2} orbitals are available to valence neutron holes. The present CDC7600 computer code can accommodate 59 single particle states and vectors comprised of 30,000 Slater determinants. The effective interaction used was that of Petrovich, McManus, and Madsen, a modification of the Kallio-Kolltveit realistic force. Single particle energies, effective charges and effective g-factors were determined from experimental data for nuclei in the ^132Sn region. 116 references.

  8. Average Transmission Probability of a Random Stack

    Science.gov (United States)

    Lu, Yin; Miniatura, Christian; Englert, Berthold-Georg

    2010-01-01

    The transmission through a stack of identical slabs that are separated by gaps with random widths is usually treated by calculating the average of the logarithm of the transmission probability. We show how to calculate the average of the transmission probability itself with the aid of a recurrence relation and derive analytical upper and lower…
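
The distinction the abstract draws can be seen in a toy simulation that ignores the interference physics of the actual problem: for stacks with independent random per-slab transmissions, the average transmission <T> differs noticeably from the log-average exp(<ln T>). The slab model and numbers below are invented.

```python
import numpy as np

rng = np.random.default_rng(7)
n_slabs, n_stacks = 30, 100_000

# Independent random per-slab transmission probabilities (toy model,
# no interference): total transmission of a stack is their product.
t = rng.uniform(0.6, 0.95, size=(n_stacks, n_slabs))
T = t.prod(axis=1)

print("<T>         =", T.mean())                  # average of T itself
print("exp(<ln T>) =", np.exp(np.log(T).mean()))  # "typical" (log) average
```

The first number systematically exceeds the second, which is why averaging T and averaging ln T are genuinely different calculations.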

  9. Diurnal distribution of sunshine probability

    Energy Technology Data Exchange (ETDEWEB)

    Aydinli, S.

    1982-01-01

    The diurnal distribution of the sunshine probability is essential for the predetermination of average irradiances and illuminances produced by solar radiation on sloping surfaces. Most meteorological stations have only monthly average values of the sunshine duration available. It is, therefore, necessary to compute the diurnal distribution of sunshine probability starting from the average monthly values. It is shown how the symmetric component of the distribution of the sunshine probability, which is a consequence of a "sidescene effect" of the clouds, can be calculated. The asymmetric components of the sunshine probability, which depend on the location and the seasons, and their influence on the predetermination of the global radiation are investigated and discussed.

  10. Are early onset aging conditions correlated to daily activity functions in youth and adults with Down syndrome?

    Science.gov (United States)

    Lin, Jin-Ding; Lin, Lan-Ping; Hsu, Shang-Wei; Chen, Wen-Xiu; Lin, Fu-Gong; Wu, Jia-Ling; Chu, Cordia

    2014-11-13

    This study aims to answer the research question "Are early onset aging conditions correlated to daily activity functions in youth and adults with Down syndrome (DS)?" A cross-sectional survey was employed to recruit 216 individuals with DS over 15 years of age for the analyses. A structured questionnaire including demographic data, brief self-reported aging conditions, the Dementia Screening Questionnaire for Individuals with Intellectual Disabilities (DSQIID) and activity of daily living (ADL) scales was completed by the primary caregivers, who were well suited to providing information on the functioning of the DS individuals. Results showed that the five most frequent aging conditions (sometimes, usually or always) were frailty (20.2%), vision problems (15.8%), loss of language ability (15.3%), sleep problems (14.9%) and memory impairment (14.5%). Other onset aging conditions included more chronic diseases (13.9%), hearing loss (13%), declining chewing ability and tooth loss (12.5%), incontinence (11.1%), depressive syndrome (7.7%), falls and gait disorder (7.2%), and loss of taste and smell (7.2%). The data also showed that DSQIID scores, onset aging conditions and ADL scores were significantly correlated with one another in Pearson's correlation tests. Finally, multiple linear regression analyses indicated that onset aging conditions (β = -0.735, p < 0.001) significantly predicted the variation in ADL scores after adjusting for other factors (R² = 0.381). This study suggests that the authorities should initiate early intervention programs aimed at improving healthy aging and ADL functions for people with DS.

  11. Evaluating probability forecasts

    CERN Document Server

    Lai, Tze Leung; Shen, David Bo; 10.1214/11-AOS902

    2012-01-01

    Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability forecasts. This approach uses loss functions relating the predicted to the actual probabilities of the events and applies martingale theory to exploit the temporal structure between the forecast and the subsequent occurrence or nonoccurrence of the event.

  12. Elements of probability theory

    CERN Document Server

    Rumshiskii, L Z

    1965-01-01

    Elements of Probability Theory presents the methods of the theory of probability. This book is divided into seven chapters that discuss the general rule for the multiplication of probabilities, the fundamental properties of the subject matter, and the classical definition of probability. The introductory chapters deal with the functions of random variables; continuous random variables; numerical characteristics of probability distributions; center of the probability distribution of a random variable; definition of the law of large numbers; stability of the sample mean and the method of moments

  13. Effect of surface conditioning methods on the microtensile bond strength of resin composite to composite after aging conditions

    NARCIS (Netherlands)

    Ozcan, Mutlu; Barbosa, Silvia Helena; Melo, Renata Marques; Galhano, Graziela Avila Prado; Bottino, Marco Antonio

    2007-01-01

    Objectives. This study evaluated the effect of two different surface conditioning methods on the repair bond strength of a bis-GMA-adduct/bis-EMA/TEGDMA based resin composite after three aging conditions. Methods. Thirty-six composite resin blocks (Esthet X, Dentsply) were prepared (5 mm x 6 mm x 6

  14. Introduction to probability

    CERN Document Server

    Roussas, George G

    2006-01-01

    Roussas's Introduction to Probability features exceptionally clear explanations of the mathematics of probability theory and explores its diverse applications through numerous interesting and motivational examples. It provides a thorough introduction to the subject for professionals and advanced students taking their first course in probability. The content is based on the introductory chapters of Roussas's book, An Intoduction to Probability and Statistical Inference, with additional chapters and revisions. Written by a well-respected author known for great exposition an

  15. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    2013-01-01

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned prob

  16. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  17. Dependent Probability Spaces

    Science.gov (United States)

    Edwards, William F.; Shiflett, Ray C.; Shultz, Harris

    2008-01-01

    The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…

  18. Stage line diagram: an age-conditional reference diagram for tracking development.

    NARCIS (Netherlands)

    Van Buuren, S.; Ooms, J.C.L.

    2009-01-01

    This paper presents a method for calculating stage line diagrams, a novel type of reference diagram useful for tracking developmental processes over time. Potential fields of applications include: dentistry (tooth eruption), oncology (tumor grading, cancer staging), virology (HIV infection and disea

  19. Philosophy and probability

    CERN Document Server

    Childers, Timothy

    2013-01-01

    Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability-frequentist, propensity, classical, Bayesian, and objective Bayesian-and uses stimulating examples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,

  20. Monte Carlo calculation of the total probability for gamma-ray interaction in toluene; Aplicacion del metodo de Monte Carlo al calculo de la probabilidad de interaccion fotonica en tolueno

    Energy Technology Data Exchange (ETDEWEB)

    Grau Malonda, A.; Garcia-Torano, E.

    1983-07-01

    Interaction and absorption probabilities for gamma-rays with energies between 1 and 1000 keV have been computed and tabulated. A toluene-based scintillator solution has been assumed in the computation. Both point sources and homogeneously dispersed radioactive material have been assumed. These tables may be applied to cylinders with radii between 1.25 cm and 0.25 cm and heights between 4.07 cm and 0.20 cm. (Author) 26 refs.
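
A hedged Monte Carlo sketch of the same kind of calculation: a point gamma source at the centre of a cylinder of the largest tabulated size, isotropic emission directions, and the interaction probability estimated as the mean of 1 - exp(-mu*s) over the path length s to the wall. The attenuation coefficient is an assumed placeholder, not a value from the report.

```python
import numpy as np

rng = np.random.default_rng(3)
R, H_half = 1.25, 4.07 / 2     # largest tabulated cylinder (cm)
mu = 0.07                      # assumed total attenuation coefficient (1/cm)

n = 1_000_000
cos_t = rng.uniform(-1.0, 1.0, n)          # isotropic emission from the centre
sin_t = np.sqrt(1.0 - cos_t**2)
s_side = R / np.maximum(sin_t, 1e-12)                 # distance to curved wall
s_cap = H_half / np.maximum(np.abs(cos_t), 1e-12)     # distance to an end cap
s = np.minimum(s_side, s_cap)              # actual path length to escape

print("interaction probability ~", np.mean(1.0 - np.exp(-mu * s)))
```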

  1. Insurance Calculation of the Ruin Probability of a Constant Interest Rate Model under Negative Dependence

    Institute of Scientific and Technical Information of China (English)

    李明倩

    2014-01-01

    This paper studies the ruin problem on random intervals for a risk model with constant interest rate and negatively dependent claims, and finally derives an asymptotic expression for the ruin probability of the model.

  2. Probability and radical behaviorism

    Science.gov (United States)

    Espinosa, James M.

    1992-01-01

    The concept of probability appears to be very important in the radical behaviorism of Skinner. Yet, it seems that this probability has not been accurately defined and is still ambiguous. I give a strict, relative frequency interpretation of probability and its applicability to the data from the science of behavior as supplied by cumulative records. Two examples of stochastic processes are given that may model the data from cumulative records that result under conditions of continuous reinforcement and extinction, respectively. PMID:22478114

  3. Probability and radical behaviorism

    OpenAIRE

    Espinosa, James M.

    1992-01-01

    The concept of probability appears to be very important in the radical behaviorism of Skinner. Yet, it seems that this probability has not been accurately defined and is still ambiguous. I give a strict, relative frequency interpretation of probability and its applicability to the data from the science of behavior as supplied by cumulative records. Two examples of stochastic processes are given that may model the data from cumulative records that result under conditions of continuous reinforc...

  4. PROBABILITY AND STATISTICS.

    Science.gov (United States)

    (*STATISTICAL ANALYSIS, REPORTS), (*PROBABILITY, REPORTS), INFORMATION THEORY, DIFFERENTIAL EQUATIONS, STATISTICAL PROCESSES, STOCHASTIC PROCESSES, MULTIVARIATE ANALYSIS, DISTRIBUTION THEORY, DECISION THEORY, MEASURE THEORY, OPTIMIZATION

  5. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  6. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
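
As a concrete instance of the conditional-probability reasoning involved, here is a small example (not from the article): the probability of drawing two aces without replacement, computed by the multiplication rule P(A1)P(A2|A1) and verified by enumerating ordered pairs.

```python
from itertools import permutations

# Multiplication rule with a conditional probability:
p_rule = (4 / 52) * (3 / 51)

# Brute-force enumeration over ordered pairs of distinct cards,
# with cards 0-3 standing in for the four aces:
pairs = list(permutations(range(52), 2))
p_enum = sum(1 for a, b in pairs if a < 4 and b < 4) / len(pairs)

print(p_rule, p_enum)   # both 12/2652 ~ 0.00452
```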

  7. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  8. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice prob...

  9. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  10. On Quantum Conditional Probability

    Directory of Open Access Journals (Sweden)

    Isabel Guerra Bobo

    2013-02-01

    We argue that quantum theory does not allow for a generalization of the notion of classical conditional probability by showing that the probability defined by the Lüders rule, standardly interpreted in the literature as the quantum-mechanical conditionalization rule, cannot be interpreted as such.

  11. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...

  12. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  13. Understanding Y haplotype matching probability.

    Science.gov (United States)

    Brenner, Charles H

    2014-01-01

    The Y haplotype population-genetic terrain is better explored from a fresh perspective rather than by analogy with the more familiar autosomal ideas. For haplotype matching probabilities, versus for autosomal matching probabilities, explicit attention to modelling - such as how evolution got us where we are - is much more important, while consideration of population frequency is much less so. This paper explores, extends, and explains some of the concepts of "Fundamental problem of forensic mathematics - the evidential strength of a rare haplotype match". That earlier paper presented and validated a "kappa method" formula for the evidential strength when a suspect matches a previously unseen haplotype (such as a Y-haplotype) at the crime scene. Mathematical implications of the kappa method are intuitive and reasonable. Suspicions to the contrary that have been raised rest on elementary errors. Critical to deriving the kappa method or any sensible evidential calculation is understanding that thinking about haplotype population frequency is a red herring; the pivotal question is one of matching probability. But confusion between the two is unfortunately institutionalized in much of the forensic world. Examples make clear why (matching) probability is not (population) frequency and why uncertainty intervals on matching probabilities are merely confused thinking. Forensic matching calculations should be based on a model, on stipulated premises. The model inevitably only approximates reality, and any error in the results comes only from error in the model, the inexactness of the approximation. Sampling variation does not measure that inexactness and hence is not helpful in explaining evidence and is in fact an impediment. Alternative haplotype matching probability approaches that various authors have considered are reviewed. Some are based on no model and cannot be taken seriously. For the others, some evaluation of the models is discussed. Recent evidence supports the adequacy of

  14. The estimation of tree posterior probabilities using conditional clade probability distributions.

    Science.gov (United States)

    Larget, Bret

    2013-07-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample.
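
A minimal sketch of the idea, with an invented four-taxon posterior sample rather than Larget's actual software: each rooted tree is a set of clade splits, conditional split probabilities are tabulated from the sample, and a tree's posterior probability is estimated as the product of the conditional probabilities of its splits.

```python
from collections import Counter

# Toy posterior sample of rooted trees as nested tuples (invented data).
sample = [
    (("A", "B"), ("C", "D")),
    (("A", "B"), ("C", "D")),
    ((("A", "B"), "C"), "D"),
    ((("A", "C"), "B"), "D"),
]

def taxa(node):
    return frozenset([node]) if isinstance(node, str) else taxa(node[0]) | taxa(node[1])

def splits(node, out):
    """Collect (parent clade, pair of child clades) for every internal node."""
    if isinstance(node, str):
        return
    left, right = taxa(node[0]), taxa(node[1])
    out.append((left | right, frozenset((left, right))))
    splits(node[0], out)
    splits(node[1], out)

clade_counts, split_counts = Counter(), Counter()
for tree in sample:
    s = []
    splits(tree, s)
    for clade, split in s:
        clade_counts[clade] += 1
        split_counts[(clade, split)] += 1

def tree_probability(tree):
    """Product of conditional clade (split) probabilities."""
    s = []
    splits(tree, s)
    p = 1.0
    for clade, split in s:
        p *= split_counts[(clade, split)] / clade_counts[clade]
    return p

print(tree_probability((("A", "B"), ("C", "D"))))  # 0.5, its sample frequency here
```

For high-probability trees the estimate tracks the sample relative frequency, while trees assembled from sampled clades in unsampled combinations can still receive a nonzero estimate, which is the advantage the abstract describes.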

  15. Orthogonal Algorithm of Logic Probability and Syndrome-Testable Analysis

    Institute of Scientific and Technical Information of China (English)

    1990-01-01

    A new method, the orthogonal algorithm, is presented to compute logic probabilities (i.e. signal probabilities) accurately. The transfer properties of logic probabilities are studied first, which are useful for the calculation of the logic probability of a circuit with random independent inputs. Then the orthogonal algorithm is described to compute the logic probability of a Boolean function realized by a combinational circuit. This algorithm can make the Boolean function "orthogonal" so that the logic probabilities can be easily calculated by summing up the logic probabilities of all orthogonal terms of the Boolean function.
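
To make the idea concrete, here is a hedged toy version (not the paper's algorithm itself): once f = a OR b is rewritten as the orthogonal sum a + a'b, whose terms are pairwise disjoint, the signal probability of f is just the sum of the term probabilities.

```python
# Independent input signal probabilities (invented values).
p = {"a": 0.3, "b": 0.6}

# Orthogonal (pairwise disjoint) product terms of f = a OR b = a + a'b,
# each term a list of (variable, is_complemented) literals.
terms = [
    [("a", False)],                 # a
    [("a", True), ("b", False)],    # a'b, disjoint from the first term
]

def term_probability(term):
    prob = 1.0
    for var, complemented in term:
        prob *= (1.0 - p[var]) if complemented else p[var]
    return prob

p_f = sum(term_probability(t) for t in terms)
print(p_f, 1 - (1 - p["a"]) * (1 - p["b"]))   # both 0.72
```

Summing term probabilities is only valid because the terms are disjoint; making an arbitrary Boolean function orthogonal is exactly what the paper's algorithm automates.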

  16. A Novel Approach to Probability

    CERN Document Server

    Kafri, Oded

    2016-01-01

    When P indistinguishable balls are randomly distributed among L distinguishable boxes, and considering the dense system in which P is much greater than L, our natural intuition tells us that the box with the average number of balls has the highest probability and that none of the boxes are empty; however, in reality, the probability of the empty box is always the highest. This fact is in contradistinction to the sparse system, in which the number of balls is smaller than the number of boxes (e.g., energy distribution in a gas), where the average value has the highest probability. Here we show that when we postulate the requirement that all possible configurations of balls in the boxes have equal probabilities, a realistic "long tail" distribution is obtained. This formalism, when applied to sparse systems, converges to distributions in which the average is preferred. We calculate some of the distributions resulting from this postulate and obtain most of the known distributions in nature, namely, Zipf law, Benford law, part...
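
    The claim is easy to check numerically. The sketch below draws configurations of P indistinguishable balls in L distinguishable boxes uniformly at random - the stars-and-bars construction realizes the postulate that all configurations are equally probable - and tabulates the occupancy of one box; the parameter values are arbitrary.

```python
import random
from collections import Counter

def random_configuration(P, L):
    """Uniformly random placement of P indistinguishable balls into L
    distinguishable boxes: choose L - 1 bar positions among P + L - 1 slots."""
    bars = sorted(random.sample(range(P + L - 1), L - 1))
    edges = [-1] + bars + [P + L - 1]
    return [edges[i + 1] - edges[i] - 1 for i in range(L)]

P, L, trials = 100, 5, 200_000          # dense system: P much greater than L
occupancy = Counter()
for _ in range(trials):
    occupancy[random_configuration(P, L)[0]] += 1   # occupancy of box 0

for k in (0, 1, 2, P // L, 2 * P // L):
    print(k, occupancy[k] / trials)
# k = 0 comes out most probable and the probabilities fall off in a long
# tail, even though the average occupancy is P / L = 20.
```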

  17. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  18. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  19. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  20. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  1. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  2. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  3. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  4. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  5. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making.

  6. The Probability of Default Calculation Model of Listed Banks Based on Long Term Liability Coefficient Optimization

    Institute of Scientific and Technical Information of China (English)

    迟国泰; 曹勇; 党均章

    2012-01-01

    When the long term liability coefficient γ takes different values, the default probabilities of listed banks calculated by the KMV model are quite different. The actual default probabilities of listed banks PD_i,CS can be measured from the credit spreads of financial bonds issued by the banks. The theoretical default probabilities of listed banks PD_i,KMV can be calculated by the KMV model with a given long term liability coefficient γ. A programming model is established to calculate the optimal value of the long term liability coefficient γ, following the idea of minimizing the total difference Σ_{i=1}^{n} |PD_i,KMV − PD_i,CS| between the theoretical and the actual default probabilities. Using the optimal coefficient γ, a default probability calculation model is established for listed banks that have not issued bonds, and the default probabilities of 14 Chinese listed banks are empirically calculated. The contributions of this paper are as follows. First, a programming model minimizing the total difference Σ |PD_i,KMV − PD_i,CS| is used to determine the optimal long term liability coefficient γ of the KMV model, so that the coefficient conforms to the actual credit spreads in the capital market; this solves the problem in existing research that the calculated default probabilities differ drastically when different long term liability coefficients γ between 0 and 1 are adopted. Second, the empirical study shows that when γ = 0.7654, the default probabilities of Chinese listed banks calculated by the KMV model are closest to the default probabilities accepted by the Chinese bond market. Third, the empirical study shows that state-owned listed banks have the lowest default probability, regional listed banks have a relatively high default probability, and the other listed banks lie in between.
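
    A heavily simplified sketch of the optimization idea, with invented bank data: given each bank's theoretical default probability as a function of γ and its spread-implied default probability, a one-dimensional grid search picks the γ minimizing the total absolute difference. The distance-to-default formula is a textbook-style stand-in, not the paper's exact KMV specification.

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Hypothetical per-bank inputs: asset value V, asset volatility sigma,
# short- and long-term liabilities, and the actual default probability
# pd_cs backed out of observed bond credit spreads.
banks = [
    dict(V=120.0, sigma=0.05, std=70.0, ltd=30.0, pd_cs=0.004),
    dict(V=200.0, sigma=0.04, std=110.0, ltd=60.0, pd_cs=0.002),
    dict(V=90.0, sigma=0.06, std=50.0, ltd=25.0, pd_cs=0.008),
]

def pd_kmv(bank, gamma):
    """Simplified KMV-style default probability: the default point is
    short-term debt plus gamma times long-term debt."""
    dp = bank["std"] + gamma * bank["ltd"]
    dd = (bank["V"] - dp) / (bank["sigma"] * bank["V"])   # distance to default
    return norm_cdf(-dd)

def total_gap(gamma):
    return sum(abs(pd_kmv(b, gamma) - b["pd_cs"]) for b in banks)

# one-dimensional grid search for the optimal long-term liability coefficient
best_gamma = min((g / 1000 for g in range(1001)), key=total_gap)
print(best_gamma, total_gap(best_gamma))
```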

  7. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  8. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more - these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  9. Probabilities from Envariance

    CERN Document Server

    Zurek, W H

    2004-01-01

    I show how probabilities arise in quantum physics by exploring implications of environment-assisted invariance, or envariance, a recently discovered symmetry exhibited by entangled quantum systems. Envariance of perfectly entangled states can be used to rigorously justify complete ignorance of the observer about the outcome of any measurement on either of the members of the entangled pair. Envariance leads to Born's rule, $p_k \propto |\psi_k|^2$. Probabilities derived in this manner are an objective reflection of the underlying state of the system -- they reflect experimentally verifiable symmetries, and not just a subjective "state of knowledge" of the observer. The envariance-based approach is compared with and found superior to the key pre-quantum definitions of probability, including the standard definition based on the 'principle of indifference' due to Laplace, and the relative frequency approach advocated by von Mises. Implications of envariance for the interpretation of quantu...

  10. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  11. Estimating tail probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Carr, D.B.; Tolley, H.D.

    1982-12-01

    This paper investigates procedures for univariate nonparametric estimation of tail probabilities. Extrapolated values for tail probabilities beyond the data are also obtained based on the shape of the density in the tail. Several estimators which use exponential weighting are described. These are compared in a Monte Carlo study to nonweighted estimators, to the empirical cdf, to an integrated kernel, to a Fourier series estimate, to a penalized likelihood estimate and a maximum likelihood estimate. Selected weighted estimators are shown to compare favorably to many of these standard estimators for the sampling distributions investigated.

  12. Negative Probabilities and Contextuality

    CERN Document Server

    de Barros, J Acacio; Oas, Gary

    2015-01-01

    There has been a growing interest, both in physics and psychology, in understanding contextuality in experimentally observed quantities. Different approaches have been proposed to deal with contextual systems, and a promising one is contextuality-by-default, put forth by Dzhafarov and Kujala. The goal of this paper is to present a tutorial on a different approach: negative probabilities. We do so by presenting the overall theory of negative probabilities in a way that is consistent with contextuality-by-default and by examining with this theory some simple examples where contextuality appears, both in physics and psychology.

  13. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  14. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications...

  15. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  16. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  17. Varga: On Probability.

    Science.gov (United States)

    Varga, Tamas

    This booklet resulted from a 1980 visit by the author, a Hungarian mathematics educator, to the Teachers' Center Project at Southern Illinois University at Edwardsville. Included are activities and problems that make probability concepts accessible to young children. The topics considered are: two probability games; choosing two beads; matching…

  18. On Probability Domains

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2010-12-01

    Motivated by IF-probability theory (intuitionistic fuzzy), we study n-component probability domains in which each event represents a body of competing components and the range of a state represents a simplex S_n of n-tuples of possible rewards - the sum of the rewards is a number from [0,1]. For n=1 we get fuzzy events, for example a bold algebra, and the corresponding fuzzy probability theory can be developed within the category ID of D-posets (equivalently effect algebras) of fuzzy sets and sequentially continuous D-homomorphisms. For n=2 we get IF-events, i.e., pairs (μ, ν) of fuzzy sets μ, ν ∈ [0,1]^X such that μ(x) + ν(x) ≤ 1 for all x ∈ X, but we order our pairs (events) coordinatewise. Hence the structure of IF-events (where (μ_1, ν_1) ≤ (μ_2, ν_2) whenever μ_1 ≤ μ_2 and ν_2 ≤ ν_1) is different and, consequently, the resulting IF-probability theory models a different principle. The category ID is cogenerated by I = [0,1] (objects of ID are subobjects of powers I^X), has nice properties, and basic probabilistic notions and constructions are categorical. For example, states are morphisms. We introduce the category S_nD cogenerated by S_n = {(x_1, x_2, ..., x_n) ∈ I^n : Σ_{i=1}^n x_i ≤ 1}, carrying the coordinatewise partial order, difference, and sequential convergence, and we show how basic probability notions can be defined within S_nD.

  19. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  20. Survival probability for open spherical billiards

    Science.gov (United States)

    Dettmann, Carl P.; Rahman, Mohammed R.

    2014-12-01

    We study the survival probability for long times in an open spherical billiard, extending previous work on the circular billiard. We provide details of calculations regarding two billiard configurations, specifically a sphere with a circular hole and a sphere with a square hole. The constant terms of the long-time survival probability expansions have been derived analytically. Terms that vanish in the long time limit are investigated analytically and numerically, leading to connections with the Riemann hypothesis.

  1. Superpositions of probability distributions.

    Science.gov (United States)

    Jizba, Petr; Kleinert, Hagen

    2008-09-01

    Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v=σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much easier than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
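
    A quick numerical illustration of such superpositions (not the paper's derivation): smearing the variance of a zero-mean Gaussian with an exponential distribution produces a Laplace-like heavy-tailed law, which shows up as a large excess kurtosis.

```python
import math
import random

def sample_smeared_gaussian(smear_v, n=100_000):
    """Draw from a superposition of zero-mean Gaussians whose variance v is
    itself random, generated by smear_v()."""
    return [random.gauss(0.0, math.sqrt(smear_v())) for _ in range(n)]

def excess_kurtosis(xs):
    n = len(xs)
    m = sum(xs) / n
    m2 = sum((x - m) ** 2 for x in xs) / n
    m4 = sum((x - m) ** 4 for x in xs) / n
    return m4 / m2 ** 2 - 3.0

# Exponentially smeared variance: the mixture is a Laplace distribution.
xs = sample_smeared_gaussian(lambda: random.expovariate(1.0))
print(excess_kurtosis(xs))   # ~3 for a Laplace law; 0 for a pure Gaussian
```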

  2. Fractal probability laws.

    Science.gov (United States)

    Eliazar, Iddo; Klafter, Joseph

    2008-06-01

    We explore six classes of fractal probability laws defined on the positive half-line: Weibull, Fréchet, Lévy, hyper Pareto, hyper beta, and hyper shot noise. Each of these classes admits a unique statistical power-law structure, and is uniquely associated with a certain operation of renormalization. All six classes turn out to be one-dimensional projections of underlying Poisson processes which, in turn, are the unique fixed points of Poissonian renormalizations. The first three classes correspond to linear Poissonian renormalizations and are intimately related to extreme value theory (Weibull, Fréchet) and to the central limit theorem (Lévy). The other three classes correspond to nonlinear Poissonian renormalizations. Pareto's law--commonly perceived as the "universal fractal probability distribution"--is merely a special case of the hyper Pareto class.

  3. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  4. Superpositions of probability distributions

    Science.gov (United States)

    Jizba, Petr; Kleinert, Hagen

    2008-09-01

    Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v=σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much easier than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.

  5. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  6. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  7. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  8. Calculation of the pipes failure probability of the Rcic system of a nuclear power station by means of software WinPRAISE 07; Calculo de la probabilidad de falla de tuberias del sistema RCIC de una central nuclear mediante el software WinPRAISE 07

    Energy Technology Data Exchange (ETDEWEB)

    Jasso G, J.; Diaz S, A.; Mendoza G, G.; Sainz M, E. [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico); Garcia de la C, F. M., E-mail: angeles.diaz@inin.gob.mx [Comision Federal de Electricidad, Central Nucleoelectrica Laguna Verde, Km 44.5 Carretera Cardel-Nautla, 91476 Laguna Verde, Alto Lucero, Veracruz (Mexico)

    2014-10-15

    Growth and crack propagation by fatigue are a typical degradation mechanism present in the nuclear industry as in conventional industry; the unstable propagation of a crack can cause the catastrophic failure of a metallic component even with high ductility. For this reason, programmed maintenance activities have been established in the industry using visual and/or ultrasonic inspection techniques with an established periodicity, allowing these growths to be followed up and the undesirable effects controlled; however, these activities increase operation costs, and in the peculiar case of the nuclear industry, they increase the radiation exposure of the participating personnel. The use of mathematical processes that integrate concepts of uncertainty, material properties, and the probability associated with inspection results has become a powerful tool for evaluating component reliability, reducing costs and exposure levels. In this work the evaluation of the failure probability due to growth of preexisting fatigue cracks is presented, in pipes of a Reactor Core Isolation Cooling (Rcic) system in a nuclear power station. The software WinPRAISE 07 (Piping Reliability Analysis Including Seismic Events) was used, which is based on probabilistic fracture mechanics principles. The obtained failure probabilities evidenced good behavior of the analyzed pipes, with a maximum on the order of 1.0E-6; therefore it is concluded that the performance of these pipe lines is reliable even extrapolating the calculations to 10, 20, 30 and 40 years of service. (Author)

  9. Empirical and Computational Tsunami Probability

    Science.gov (United States)

    Geist, E. L.; Parsons, T.; ten Brink, U. S.; Lee, H. J.

    2008-12-01

    A key component in assessing the hazard posed by tsunamis is quantification of tsunami likelihood or probability. To determine tsunami probability, one needs to know the distribution of tsunami sizes and the distribution of inter-event times. Both empirical and computational methods can be used to determine these distributions. Empirical methods rely on an extensive tsunami catalog and hence, the historical data must be carefully analyzed to determine whether the catalog is complete for a given runup or wave height range. Where site-specific historical records are sparse, spatial binning techniques can be used to perform a regional, empirical analysis. Global and site-specific tsunami catalogs suggest that tsunami sizes are distributed according to a truncated or tapered power law and inter-event times are distributed according to an exponential distribution modified to account for clustering of events in time. Computational methods closely follow Probabilistic Seismic Hazard Analysis (PSHA), where size and inter-event distributions are determined for tsunami sources, rather than tsunamis themselves as with empirical analysis. In comparison to PSHA, a critical difference in the computational approach to tsunami probabilities is the need to account for far-field sources. The three basic steps in computational analysis are (1) determination of parameter space for all potential sources (earthquakes, landslides, etc.), including size and inter-event distributions; (2) calculation of wave heights or runup at coastal locations, typically performed using numerical propagation models; and (3) aggregation of probabilities from all sources and incorporation of uncertainty. It is convenient to classify two different types of uncertainty: epistemic (or knowledge-based) and aleatory (or natural variability). Correspondingly, different methods have been traditionally used to incorporate uncertainty during aggregation, including logic trees and direct integration. Critical
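
    As a toy version of the empirical ingredients named above, the sketch below combines a truncated power-law size distribution with exponential (Poissonian) inter-event times to get an exceedance probability; it omits the clustering correction and the tapered variant, and all parameter values are hypothetical.

```python
import math

def sf_truncated_pareto(s, s_min, s_max, alpha):
    """Survival function P(size >= s) for a power law truncated at s_max."""
    if s <= s_min:
        return 1.0
    if s >= s_max:
        return 0.0
    return (s ** -alpha - s_max ** -alpha) / (s_min ** -alpha - s_max ** -alpha)

def prob_exceedance(rate, T, s, s_min, s_max, alpha):
    """Probability of at least one event of size >= s within T years, assuming
    exponential inter-event times (Poissonian occurrence) at the given
    annual rate."""
    return 1.0 - math.exp(-rate * T * sf_truncated_pareto(s, s_min, s_max, alpha))

# hypothetical site: 0.1 tsunamis/yr, runup range 0.5-20 m, exponent 1.0
print(prob_exceedance(rate=0.1, T=50, s=5.0, s_min=0.5, s_max=20.0, alpha=1.0))
```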

  10. Applying Popper's Probability

    CERN Document Server

    Whiting, Alan B

    2014-01-01

    Professor Sir Karl Popper (1902-1994) was one of the most influential philosophers of science of the twentieth century, best known for his doctrine of falsifiability. His axiomatic formulation of probability, however, is unknown to current scientists, though it is championed by several current philosophers of science as superior to the familiar version. Applying his system to problems identified by himself and his supporters, it is shown that it does not have some features he intended and does not solve the problems they have identified.

  11. Probability for physicists

    CERN Document Server

    Sirca, Simon

    2016-01-01

    This book is designed as a practical and intuitive introduction to probability, statistics and random quantities for physicists. The book aims at getting to the main points by a clear, hands-on exposition supported by well-illustrated and worked-out examples. A strong focus on applications in physics and other natural sciences is maintained throughout. In addition to basic concepts of random variables, distributions, expected values and statistics, the book discusses the notions of entropy, Markov processes, and fundamentals of random number generation and Monte-Carlo methods.

  12. Probability & Perception: The Representativeness Heuristic in Action

    Science.gov (United States)

    Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.

    2014-01-01

    If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…
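
    A classroom-style simulation of the two-roll activity mentioned above (the target sequences are chosen for illustration): both ordered outcomes occur with probability 1/36, even though the representativeness heuristic suggests that a "mixed" pair should be more common than a double.

```python
import random

def roll_twice(trials=600_000):
    """Roll a fair die twice and tally two specific ordered sequences."""
    counts = {(3, 5): 0, (4, 4): 0}
    for _ in range(trials):
        pair = (random.randint(1, 6), random.randint(1, 6))
        if pair in counts:
            counts[pair] += 1
    return {pair: n / trials for pair, n in counts.items()}

# Both relative frequencies come out near 1/36 ~ 0.0278.
print(roll_twice())
```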

  13. Learning unbelievable marginal probabilities

    CERN Document Server

    Pitkow, Xaq; Miller, Ken D

    2011-01-01

    Loopy belief propagation performs approximate inference on graphical models with loops. One might hope to compensate for the approximation by adjusting model parameters. Learning algorithms for this purpose have been explored previously, and the claim has been made that every set of locally consistent marginals can arise from belief propagation run on a graphical model. On the contrary, here we show that many probability distributions have marginals that cannot be reached by belief propagation using any set of model parameters or any learning algorithm. We call such marginals `unbelievable.' This problem occurs whenever the Hessian of the Bethe free energy is not positive-definite at the target marginals. All learning algorithms for belief propagation necessarily fail in these cases, producing beliefs or sets of beliefs that may even be worse than the pre-learning approximation. We then show that averaging inaccurate beliefs, each obtained from belief propagation using model parameters perturbed about some le...

  14. Measure, integral and probability

    CERN Document Server

    Capiński, Marek

    2004-01-01

    Measure, Integral and Probability is a gentle introduction that makes measure and integration theory accessible to the average third-year undergraduate student. The ideas are developed at an easy pace in a form that is suitable for self-study, with an emphasis on clear explanations and concrete examples rather than abstract theory. For this second edition, the text has been thoroughly revised and expanded. New features include: · a substantial new chapter, featuring a constructive proof of the Radon-Nikodym theorem, an analysis of the structure of Lebesgue-Stieltjes measures, the Hahn-Jordan decomposition, and a brief introduction to martingales · key aspects of financial modelling, including the Black-Scholes formula, discussed briefly from a measure-theoretical perspective to help the reader understand the underlying mathematical framework. In addition, further exercises and examples are provided to encourage the reader to become directly involved with the material.

  15. Probabilities for Solar Siblings

    Science.gov (United States)

    Valtonen, Mauri; Bajkova, A. T.; Bobylev, V. V.; Mylläri, A.

    2015-02-01

    We have shown previously (Bobylev et al. Astron Lett 37:550-562, 2011) that some of the stars in the solar neighborhood today may have originated in the same star cluster as the Sun, and could thus be called Solar Siblings. In this work we investigate the sensitivity of this result to galactic models and to parameters of these models, and also extend the sample of orbits. There are a number of good candidates for the sibling category, but due to the long period of orbit evolution since the break-up of the birth cluster of the Sun, one can only attach probabilities of membership. We find that up to 10 % (but more likely around 1 %) of the members of the Sun's birth cluster could be still found within 100 pc from the Sun today.

  16. Probabilities for Solar Siblings

    CERN Document Server

    Valtonen, M; Bobylev, V V; Myllari, A

    2015-01-01

    We have shown previously (Bobylev et al 2011) that some of the stars in the Solar neighborhood today may have originated in the same star cluster as the Sun, and could thus be called Solar Siblings. In this work we investigate the sensitivity of this result to Galactic models and to parameters of these models, and also extend the sample of orbits. There are a number of good candidates for the Sibling category, but due to the long period of orbit evolution since the break-up of the birth cluster of the Sun, one can only attach probabilities of membership. We find that up to 10% (but more likely around 1 %) of the members of the Sun's birth cluster could be still found within 100 pc from the Sun today.

  17. People's conditional probability judgments follow probability theory (plus noise).

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2016-09-01

    A common view in current psychology is that people estimate probabilities using various 'heuristics' or rules of thumb that do not follow the normative rules of probability theory. We present a model where people estimate conditional probabilities such as P(A|B) (the probability of A given that B has occurred) by a process that follows standard frequentist probability theory but is subject to random noise. This model accounts for various results from previous studies of conditional probability judgment. This model predicts that people's conditional probability judgments will agree with a series of fundamental identities in probability theory whose form cancels the effect of noise, while deviating from probability theory in other expressions whose form does not allow such cancellation. Two experiments strongly confirm these predictions, with people's estimates on average agreeing with probability theory for the noise-cancelling identities, but deviating from probability theory (in just the way predicted by the model) for other identities. This new model subsumes an earlier model of unconditional or 'direct' probability judgment which explains a number of systematic biases seen in direct probability judgment (Costello & Watts, 2014). This model may thus provide a fully general account of the mechanisms by which people estimate probabilities.
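
    A minimal sketch of the noise mechanism behind this family of models, assuming each remembered instance is read with a symmetric error rate d: the mean estimate then regresses toward 0.5 as p + d(1 - 2p). This illustrates the earlier direct-judgment model that the paper subsumes, not the conditional-identity experiments themselves.

```python
import random

def noisy_probability_estimate(p, d, n=1_000):
    """Frequentist estimate of P(A) from n remembered instances, where each
    stored flag is read with a symmetric error rate d (flipped w.p. d)."""
    hits = 0
    for _ in range(n):
        a = random.random() < p              # true occurrence of A
        a_read = a != (random.random() < d)  # noisy readout
        hits += a_read
    return hits / n

p, d = 0.8, 0.1
mean_est = sum(noisy_probability_estimate(p, d) for _ in range(500)) / 500
print(mean_est, p + d * (1 - 2 * p))   # both ~0.74: bias toward 0.5
```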

  18. Savage's Concept of Probability

    Institute of Scientific and Technical Information of China (English)

    熊卫

    2003-01-01

    Starting with personal preference, Savage [3] constructs a foundation theory for probability, from qualitative probability to quantitative probability and on to utility. There are profound logical connections between the three steps in Savage's theory; that is, quantitative concepts properly represent qualitative concepts. Moreover, Savage's definition of subjective probability is in accordance with probability theory, and the theory gives us a rational decision model only if we assume that the weak ...

  19. Probability Theory without Bayes' Rule

    OpenAIRE

    Rodriques, Samuel G.

    2014-01-01

    Within the Kolmogorov theory of probability, Bayes' rule allows one to perform statistical inference by relating conditional probabilities to unconditional probabilities. As we show here, however, there is a continuous set of alternative inference rules that yield the same results, and that may have computational or practical advantages for certain problems. We formulate generalized axioms for probability theory, according to which the reverse conditional probability distribution P(B|A) is no...

  20. Chemical immobilization of adult female Weddell seals with tiletamine and zolazepam: effects of age, condition and stage of lactation

    Directory of Open Access Journals (Sweden)

    Harcourt Robert G

    2006-02-01

    Abstract Background Chemical immobilization of Weddell seals (Leptonychotes weddellii) has previously been, for the most part, problematic and this has been mainly attributed to the type of immobilizing agent used. In addition to individual sensitivity, physiological status may play an important role. We investigated the use of the intravenous administration of a 1:1 mixture of tiletamine and zolazepam (Telazol®) to immobilize adult females at different points during a physiologically demanding 5-6 week lactation period. We also compared performance between IV and IM injection of the same mixture. Results The tiletamine:zolazepam mixture administered intravenously was an effective method for immobilization with no fatalities or pronounced apnoeas in 106 procedures; however, there was a 25% (one animal in four) mortality rate with intramuscular administration. Induction time was slightly longer for females at the end of lactation (54.9 ± 2.3 seconds) than at post-parturition (48.2 ± 2.9 seconds). In addition, the number of previous captures had a positive effect on induction time. There was no evidence for effects due to age, condition (total body lipid), stage of lactation or number of captures on recovery time. Conclusion We suggest that intravenous administration of tiletamine and zolazepam is an effective and safe immobilizing agent for female Weddell seals. Although individual traits could not explain variation in recovery time, we suggest careful monitoring of recovery times during longitudinal studies (> 2 captures). We show that physiological pressures do not substantially affect response to chemical immobilization with this mixture; however, consideration must be taken for differences that may exist for immobilization of adult males and juveniles. Nevertheless, we recommend a mass-specific dose of 0.50-0.65 mg/kg for future procedures with adult female Weddell seals and a starting dose of 0.50 mg/kg for other age classes and other

  1. RANDOM VARIABLE WITH FUZZY PROBABILITY

    Institute of Scientific and Technical Information of China (English)

    吕恩琳; 钟佑明

    2003-01-01

    The mathematical description of the second kind of fuzzy random variable, namely a random variable with crisp events and fuzzy probabilities, was studied. Based on interval probability and using the fuzzy resolution theorem, the feasibility condition for a probability fuzzy number set was given; going a step further, the definition and characteristics of a random variable with fuzzy probability (RVFP), and the fuzzy distribution function and fuzzy probability distribution sequence of the RVFP, were put forward. The fuzzy probability resolution theorem with the closing operation of fuzzy probability was given and proved. The definition and characteristics of the mathematical expectation and variance of the RVFP were also studied. All mathematical descriptions of the RVFP have the closing operation for fuzzy probability; as a result, the foundation for perfecting fuzzy probability operation methods is laid.

  2. Probability and rational choice

    Directory of Open Access Journals (Sweden)

    David Botting

    2014-04-01

    In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.

  3. Probability state modeling theory.

    Science.gov (United States)

    Bagwell, C Bruce; Hunsberger, Benjamin C; Herbert, Donald J; Munson, Mark E; Hill, Beth L; Bray, Chris M; Preffer, Frederic I

    2015-07-01

    As the technology of cytometry matures, there is mounting pressure to address two major issues with data analyses. The first issue is to develop new analysis methods for high-dimensional data that can directly reveal and quantify important characteristics associated with complex cellular biology. The other issue is to replace subjective and inaccurate gating with automated methods that objectively define subpopulations and account for population overlap due to measurement uncertainty. Probability state modeling (PSM) is a technique that addresses both of these issues. The theory and important algorithms associated with PSM are presented along with simple examples and general strategies for autonomous analyses. PSM is leveraged to better understand B-cell ontogeny in bone marrow in a companion Cytometry Part B manuscript. Three short relevant videos are available in the online supporting information for both of these papers. PSM avoids the dimensionality barrier normally associated with high-dimensionality modeling by using broadened quantile functions instead of frequency functions to represent the modulation of cellular epitopes as cells differentiate. Since modeling programs ultimately minimize or maximize one or more objective functions, they are particularly amenable to automation and, therefore, represent a viable alternative to subjective and inaccurate gating approaches.

  4. Computing Earthquake Probabilities on Global Scales

    Science.gov (United States)

    Holliday, James R.; Graves, William R.; Rundle, John B.; Turcotte, Donald L.

    2016-03-01

    Large devastating events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities for large events in systems such as these. This method counts the number of small events since the last large event and then converts this count into a probability by using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near mean field systems having long-range interactions, an example of which is earthquakes and elastic interactions. We then construct an application of the method and show examples of computed earthquake probabilities.
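
    A schematic version of the counting step; the Weibull form and the calibration constants below are assumptions for illustration, not the paper's fitted values.

```python
import math

def large_event_probability(n_small, beta, alpha):
    """Convert the count of small events since the last large event into a
    probability of the next large event via an assumed Weibull law:
    P = 1 - exp(-(n / beta) ** alpha)."""
    return 1.0 - math.exp(-((n_small / beta) ** alpha))

# hypothetical calibration: ~150 small events between large ones on average
beta, alpha = 150.0, 1.4
for n in (50, 150, 300):
    print(n, round(large_event_probability(n, beta, alpha), 3))
```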

  5. Calculation of Parameter Failure Probability of Thermodynamic System by Response Surface and Importance Sampling Method

    Institute of Scientific and Technical Information of China (English)

    尚彦龙; 蔡琦; 陈力生; 张杨伟

    2012-01-01

    In this paper, the combined method of response surface and importance sampling was applied to the calculation of the parameter failure probability of a thermodynamic system. A mathematical model was presented for parameter failure of the physical process in the thermodynamic system, on which basis the combined algorithm model of response surface and importance sampling was established; the performance degradation model of the components and the simulation process of parameter failure based on importance sampling were also presented. The parameter failure probability of the purification water system in a nuclear reactor was then obtained by the combined method. The results show that, for the calculation of the parameter failure probability of a complex thermodynamic system with high dimensionality, distinct non-linearity, and performance degradation, the importance sampling method obtains results of satisfactory precision with higher efficiency than direct sampling, while the response surface method alone has limitations; the combination of response surface and importance sampling is an effective method for analyzing parameter failure of the physical processes in a thermodynamic system.
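
    A bare-bones illustration of the importance-sampling half of the combined method, with an invented linear limit state and independent standard normal inputs: the sampling density is shifted toward the failure region and each failing sample is reweighted by the likelihood ratio. A real application would first fit a response surface to the expensive system model and sample the surrogate instead.

```python
import math
import random

def limit_state(x1, x2):
    """Hypothetical performance function: the system fails when g < 0."""
    return 6.0 - x1 - x2

def normal_pdf(x, mu=0.0):
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2.0 * math.pi)

def importance_sampling(n=50_000, shift=3.0):
    """Estimate P(g(X1, X2) < 0) by sampling from normals shifted toward the
    failure region and reweighting with the likelihood ratio."""
    total = 0.0
    for _ in range(n):
        x1 = random.gauss(shift, 1.0)
        x2 = random.gauss(shift, 1.0)
        if limit_state(x1, x2) < 0:
            total += (normal_pdf(x1) * normal_pdf(x2)) / (
                normal_pdf(x1, shift) * normal_pdf(x2, shift))
    return total / n

print(importance_sampling())
# ~Phi(-6 / sqrt(2)) ~ 1.1e-5 - far beyond what 5e4 direct samples can resolve.
```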

  6. Exact probability distribution functions for Parrondo's games

    Science.gov (United States)

    Zadourian, Rubina; Saakian, David B.; Klümper, Andreas

    2016-12-01

    We study the discrete time dynamics of Brownian ratchet models and Parrondo's games. Using the Fourier transform, we calculate the exact probability distribution functions for both the capital dependent and history dependent Parrondo's games. In certain cases we find strong oscillations near the maximum of the probability distribution with two limiting distributions for odd and even number of rounds of the game. Indications of such oscillations first appeared in the analysis of real financial data, but now we have found this phenomenon in model systems and a theoretical understanding of the phenomenon. The method of our work can be applied to Brownian ratchets, molecular motors, and portfolio optimization.
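
    The paradox itself is easy to reproduce by simulation. The sketch below uses the standard capital-dependent parameters with ε = 0.005 and estimates means by Monte Carlo, rather than the exact Fourier-transform calculation of the paper.

```python
import random

def play(game, capital, eps=0.005):
    """Capital-dependent Parrondo games: A is a slightly losing coin flip;
    B uses a bad coin when capital is divisible by 3, a good coin otherwise."""
    if game == "A":
        p = 0.5 - eps
    elif capital % 3 == 0:
        p = 0.1 - eps
    else:
        p = 0.75 - eps
    return capital + (1 if random.random() < p else -1)

def average_final_capital(strategy, rounds=1000, runs=2000):
    total = 0
    for _ in range(runs):
        c = 0
        for _ in range(rounds):
            c = play(strategy(), c)
        total += c
    return total / runs

print("A only:", average_final_capital(lambda: "A"))                   # loses
print("B only:", average_final_capital(lambda: "B"))                   # loses
print("random:", average_final_capital(lambda: random.choice("AB")))   # wins
```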

  7. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  8. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable

  9. Fusion probability in heavy nuclei

    Science.gov (United States)

    Banerjee, Tathagata; Nath, S.; Pal, Santanu

    2015-03-01

    Background: Fusion between two massive nuclei is a very complex process and is characterized by three stages: (a) capture inside the potential barrier, (b) formation of an equilibrated compound nucleus (CN), and (c) statistical decay of the CN leading to a cold evaporation residue (ER) or fission. The second stage is the least understood of the three and is the most crucial in predicting the yield of superheavy elements (SHE) formed in complete fusion reactions. Purpose: A systematic study of the average fusion probability, ⟨P_CN⟩, is undertaken to obtain a better understanding of its dependence on various reaction parameters. The study may also help to clearly demarcate the onset of non-CN fission (NCNF), which causes the fusion probability, P_CN, to deviate from unity. Method: ER excitation functions for 52 reactions leading to CN in the mass region 170-220, which are available in the literature, have been compared with statistical model (SM) calculations. Capture cross sections have been obtained from a coupled-channels code. In the SM, shell corrections in both the level density and the fission barrier have been included. ⟨P_CN⟩ for these reactions has been extracted by comparing experimental and theoretical ER excitation functions in the energy range ~5%-35% above the potential barrier, where known effects of nuclear structure are insignificant. Results: ⟨P_CN⟩ has been shown to vary with entrance channel mass asymmetry, η (or charge product, Z_p Z_t), as well as with fissility of the CN, χ_CN. No parameter has been found to be adequate as a single scaling variable to determine ⟨P_CN⟩. Approximate boundaries have been obtained from where ⟨P_CN⟩ starts deviating from unity. Conclusions: This study quite clearly reveals the limits of applicability of the SM in interpreting experimental observables from fusion reactions involving two massive nuclei. Deviation of ⟨P_CN⟩ from unity marks the beginning of the domain of dynamical models of fusion. Availability of precise ER cross sections

  10. Approximation of Failure Probability Using Conditional Sampling

    Science.gov (United States)

    Giesy. Daniel P.; Crespo, Luis G.; Kenney, Sean P.

    2008-01-01

    In analyzing systems which depend on uncertain parameters, one technique is to partition the uncertain parameter domain into a failure set and its complement, and judge the quality of the system by estimating the probability of failure. If this is done by a sampling technique such as Monte Carlo and the probability of failure is small, accurate approximation can require so many sample points that the computational expense is prohibitive. Previous work of the authors has shown how to bound the failure event by sets of such simple geometry that their probabilities can be calculated analytically. In this paper, it is shown how to make use of these failure bounding sets and conditional sampling within them to substantially reduce the computational burden of approximating failure probability. It is also shown how the use of these sampling techniques improves the confidence intervals for the failure probability estimate for a given number of sample points and how they reduce the number of sample point analyses needed to achieve a given level of confidence.
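
    A small sketch of the idea under stated assumptions: the failure event is known to lie inside a half-space whose probability is analytic, so only the conditional factor needs sampling, and every sample point lands in the informative region. The limit state and the bounding set are invented for the example.

```python
import math
import random

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def failure(x1, x2):
    """Hypothetical failure event, known to lie inside the half-space x1 > 2."""
    return x1 > 2.0 and abs(x2) < 0.5

a = 2.0
p_bound = 1.0 - norm_cdf(a)    # analytic probability of the bounding set

def conditional_failure_fraction(n=20_000):
    """Estimate P(failure | x1 > a) by rejection sampling inside the bound."""
    fails = 0
    for _ in range(n):
        while True:            # crude conditional sampler for x1 > a
            x1 = random.gauss(0.0, 1.0)
            if x1 > a:
                break
        x2 = random.gauss(0.0, 1.0)
        fails += failure(x1, x2)
    return fails / n

# failure probability = P(bounding set) * P(failure | bounding set)
print(p_bound * conditional_failure_fraction())
# exact: (1 - Phi(2)) * (2 * Phi(0.5) - 1) ~ 0.0087
```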

  11. Hidden Variables or Positive Probabilities?

    CERN Document Server

    Rothman, T; Rothman, Tony

    2001-01-01

    Despite claims that Bell's inequalities are based on the Einstein locality condition, or equivalent, all derivations make an identical mathematical assumption: that local hidden-variable theories produce a set of positive-definite probabilities for detecting a particle with a given spin orientation. The standard argument is that because quantum mechanics assumes that particles are emitted in a superposition of states the theory cannot produce such a set of probabilities. We examine a paper by Eberhard who claims to show that a generalized Bell inequality, the CHSH inequality, can be derived solely on the basis of the locality condition, without recourse to hidden variables. We point out that he nonetheless assumes a set of positive-definite probabilities, which supports the claim that hidden variables or "locality" is not at issue here, positive-definite probabilities are. We demonstrate that quantum mechanics does predict a set of probabilities that violate the CHSH inequality; however these probabilities ar...
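
    For illustration, the quantum violation discussed here can be reproduced from the standard singlet-state correlation E(a, b) = -cos(a - b); the short sketch below uses the textbook angle choices (not taken from the paper) and gives |S| = 2√2 > 2:

    ```python
    import numpy as np

    # Singlet-state correlation for spin measurements along angles a and b:
    # E(a, b) = -cos(a - b); the underlying joint outcome probabilities are
    # P(++) = P(--) = sin^2((a-b)/2)/2 and P(+-) = P(-+) = cos^2((a-b)/2)/2.
    def E(a, b):
        return -np.cos(a - b)

    # Standard angle choices that maximize the quantum CHSH combination.
    a, a2 = 0.0, np.pi / 2
    b, b2 = np.pi / 4, 3 * np.pi / 4

    S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
    print(abs(S))   # 2*sqrt(2) ~ 2.828 > 2, violating the CHSH bound
    ```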

  12. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  13. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  14. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory to charac…

  15. Exciton-Dependent Pre-formation Probability of Composite Particles

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jing-Shang; WANG Ji-Min; DUAN Jun-Feng

    2007-01-01

    In the Iwamoto-Harada model the whole phase space is filled with fermions. When the momentum distributions of the exciton states are taken into account, the pre-formation probability of light composite particles can be improved, and an exciton-state-dependent pre-formation probability has been proposed. The calculated results indicate that taking the momentum distribution into consideration enhances the pre-formation probability of the [1, m] configuration and seriously suppresses that of [l > 1, m] configurations.

  16. Approximating the Probability of Mortality Due to Protracted Radiation Exposures

    Science.gov (United States)

    2016-06-01

    Because the RIPD code is computationally intensive, it is useful to have an easier, approximate calculation for estimating the probability of mortality.

  17. Declination Calculator

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Declination is calculated using the current International Geomagnetic Reference Field (IGRF) model or the current World Magnetic Model...

  18. Understanding Students' Beliefs about Probability.

    Science.gov (United States)

    Konold, Clifford

    The concept of probability is not an easy concept for high school and college students to understand. This paper identifies and analyzes the students' alternative frameworks from the viewpoint of constructivism. There are various interpretations of probability through mathematical history: classical, frequentist, and subjectivist interpretation.…

  19. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits...

  20. A graduate course in probability

    CERN Document Server

    Tucker, Howard G

    2014-01-01

    Suitable for a graduate course in analytic probability, this text requires only a limited background in real analysis. Topics include probability spaces and distributions, stochastic independence, basic limiting operations, strong limit theorems for independent random variables, the central limit theorem, conditional expectation and martingale theory, and an introduction to stochastic processes.

  1. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  2. Linear Positivity and Virtual Probability

    CERN Document Server

    Hartle, J B

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. A quantum theory of closed systems requires two elements: 1) a condition specifying which sets of histories may be assigned probabilities that are consistent with the rules of probability theory, and 2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time-neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to i...

  3. Survival probability and ruin probability of a risk model

    Institute of Scientific and Technical Information of China (English)

    LUO Jian-hua

    2008-01-01

    In this paper, a new risk model is studied in which the rate of premium income is regarded as a random variable, the arrival of insurance policies is a Poisson process, and the process of claim occurrence is a p-thinning process. The integral representations of the survival probability are obtained. The explicit formula of the survival probability on the infinite interval is obtained in the special case of the exponential distribution. The Lundberg inequality and the general formula for the ruin probability are obtained by means of techniques from martingale theory.

  4. Calculation of Confidence Limits for the Probability Distribution of Extreme Values of Type I for Two Populations

    Directory of Open Access Journals (Sweden)

    J.A Raynal

    2004-01-01

    Full Text Available A methodology for the calculation of confidence limits for the probability distribution of extreme values of type I for two populations is presented. The methodology is based on the application of the maximum likelihood method for estimating the parameters of the distribution and the confidence limits of the design values. The confidence limits are obtained by the use of the variance-covariance matrix of the parameters, assuming a normal distribution of the design values. Given the complexity of the likelihood function, an optimization code is used to maximize such function to produce the maximum likelihood estimators of the parameters of the distribution. The results obtained show that this is a promising methodology given that in several cases a reduction in the width of the confidence limits has been observed, in addition to an improvement in the fit of the proposed probability distribution to the data used.

  5. Probability

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    People much given to gambling usually manage to work out rough-and-ready ways of measuring the likelihood of certain situations so as to know which way to bet their money, and how much. If they did not do this, they would quickly lose all their money to those who did.

  6. Holographic probabilities in eternal inflation.

    Science.gov (United States)

    Bousso, Raphael

    2006-11-10

    In the global description of eternal inflation, probabilities for vacua are notoriously ambiguous. The local point of view is preferred by holography and naturally picks out a simple probability measure. It is insensitive to large expansion factors or lifetimes and so resolves a recently noted paradox. Any cosmological measure must be complemented with the probability for observers to emerge in a given vacuum. In lieu of anthropic criteria, I propose to estimate this by the entropy that can be produced in a local patch. This allows for prior-free predictions.

  7. Probability Ranking in Vector Spaces

    CERN Document Server

    Melucci, Massimo

    2011-01-01

    The Probability Ranking Principle states that the document set with the highest values of probability of relevance optimizes information retrieval effectiveness given the probabilities are estimated as accurately as possible. The key point of the principle is the separation of the document set into two subsets with a given level of fallout and with the highest recall. The paper introduces the separation between two vector subspaces and shows that the separation yields a more effective performance than the optimal separation into subsets with the same available evidence, the performance being measured with recall and fallout. The result is proved mathematically and exemplified experimentally.

  8. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  9. Local Causality, Probability and Explanation

    CERN Document Server

    Healey, Richard A

    2016-01-01

    In papers published in the 25 years following his famous 1964 proof John Bell refined and reformulated his views on locality and causality. Although his formulations of local causality were in terms of probability, he had little to say about that notion. But assumptions about probability are implicit in his arguments and conclusions. Probability does not conform to these assumptions when quantum mechanics is applied to account for the particular correlations Bell argues are locally inexplicable. This account involves no superluminal action and there is even a sense in which it is local, but it is in tension with the requirement that the direct causes and effects of events are nearby.

  10. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application…

  11. Evaluation for Success Probability of Chaff Centroid Jamming

    Institute of Scientific and Technical Information of China (English)

    GAO Dong-hua; SHI Xiu-hua

    2008-01-01

    As chaff centroid jamming can introduce a guiding error into an anti-warship missile's seeker and decrease its hit probability, a new quantitative analysis method and a mathematical model are proposed in this paper to evaluate the probability of successful jamming. Using this method, the optimal decision scheme for chaff centroid jamming in different threat situations can be found, and the success probability of this scheme can be calculated quantitatively. Thus, the operation rules of centroid jamming and the tactical approach for increasing the success probability can be determined.

  12. Considerations on probability: from games of chance to modern science

    Directory of Open Access Journals (Sweden)

    Paola Monari

    2015-12-01

    Full Text Available The article sets out a number of considerations on the distinction between variability and uncertainty over the centuries. Games of chance have always been useful random experiments which through combinatorial calculation have opened the way to probability theory and to the interpretation of modern science through statistical laws. The article also looks briefly at the stormy nineteenth-century debate concerning the definitions of probability which went over the same grounds – sometimes without any historical awareness – as the debate which arose at the very beginnings of probability theory, when the great probability theorists were open to every possible meaning of the term.

  13. Probabilities the little numbers that rule our lives

    CERN Document Server

    Olofsson, Peter

    2014-01-01

    Praise for the First Edition"If there is anything you want to know, or remind yourself, about probabilities, then look no further than this comprehensive, yet wittily written and enjoyable, compendium of how to apply probability calculations in real-world situations."- Keith Devlin, Stanford University, National Public Radio's "Math Guy" and author of The Math Gene and The Unfinished GameFrom probable improbabilities to regular irregularities, Probabilities: The Little Numbers That Rule Our Lives, Second Edition investigates the often surprising effects of risk and chance in our lives. Featur

  14. Methodology for assessing probability of extreme hydrologic events coincidence

    Directory of Open Access Journals (Sweden)

    Prohaska Stevan

    2010-01-01

    Full Text Available The aim of the presented research is to improve the methodology for calculating the probability that extreme historic floods and droughts occur in the same year. The original procedure was developed in order to determine the occurrence probability of such an extreme historic event. There are two phases in the calculation procedure for assessing the probability that both an extreme drought and a flood occur in the same year. In the first phase, outliers are detected as indicators of extreme events, their return periods are calculated, and the series' statistics are adjusted. In the second phase, conditional probabilities are calculated: empirical points are plotted, and the probability that both an extreme drought and a flood occur in the same year is assessed based on the plot. Outlier detection is performed for the territory of Serbia. Results are shown as maps of regions (basins) prone to floods, hydrologic drought, or both. A step-by-step numeric example is given for assessing the conditional probability of flood and drought occurrence for GS Raska on the river Raska. Results of the assessment of conditional probability in two more cases are given for combinations of extreme flood and 30-day minimum flow.
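
    A toy Python sketch of the second phase, using synthetic annual series and invented extreme-event thresholds in place of the paper's outlier criteria and empirical plots:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical annual series for one gauging station: peak flow (flood
    # indicator) and 30-day minimum flow (drought indicator). Numbers invented.
    years = 200
    peak = rng.gumbel(loc=300.0, scale=80.0, size=years)   # annual maxima
    low = rng.lognormal(mean=2.0, sigma=0.4, size=years)   # annual 30-day minima

    # Phase 1 (outline): flag extreme years by a threshold criterion.
    flood = peak > np.quantile(peak, 0.90)      # top 10% of peak flows
    drought = low < np.quantile(low, 0.10)      # bottom 10% of minimum flows

    # Phase 2: empirical joint and conditional probabilities of coincidence.
    p_joint = np.mean(flood & drought)
    p_flood = flood.mean()
    p_drought_given_flood = p_joint / p_flood if p_flood > 0 else float("nan")

    print(f"P(flood & drought same year) = {p_joint:.3f}")
    print(f"P(drought | flood)           = {p_drought_given_flood:.3f}")
    ```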

  15. THE SURVIVAL PROBABILITY IN FINITE TIME PERIOD IN FULLY DISCRETE RISK MODEL

    Institute of Scientific and Technical Information of China (English)

    Cheng Shixue; Wu Biao

    1999-01-01

    The probabilities of the following events are first discussed in this paper: the insurance company survives to any fixed time k, and the surplus at time k equals x ≥ 1. The formulas for calculating such probabilities are deduced through analytical and probabilistic arguments, respectively. Finally, other probability laws relating to risk are determined based on the probabilities mentioned above.
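
    A minimal sketch of the fully discrete setting (assumed conventions, not the paper's derivation): unit premium per period, i.i.d. integer-valued claims with an invented pmf, and survival meaning the surplus never becomes negative. The finite-time survival probability then satisfies a simple recursion:

    ```python
    from functools import lru_cache

    # Hypothetical integer claim distribution with mean 0.70 < 1 (positive drift).
    p = {0: 0.55, 1: 0.25, 2: 0.15, 3: 0.05}

    @lru_cache(maxsize=None)
    def survive(u: int, k: int) -> float:
        """Probability of surviving the next k periods from integer surplus u."""
        if k == 0:
            return 1.0
        total = 0.0
        for x, px in p.items():
            nxt = u + 1 - x                # collect unit premium, pay claim x
            if nxt >= 0:                   # ruin if the surplus goes negative
                total += px * survive(nxt, k - 1)
        return total

    for k in (1, 5, 20, 50):
        print(k, round(survive(2, k), 4))
    ```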

  16. Probability representation of classical states

    NARCIS (Netherlands)

    Man'ko, OV; Man'ko, VI; Pilyavets, OV

    2005-01-01

    Probability representation of classical states described by symplectic tomograms is discussed. Tomographic symbols of classical observables which are functions on phase-space are studied. Explicit form of kernel of commutative star-product of the tomographic symbols is obtained.

  17. The probabilities of unique events.

    Directory of Open Access Journals (Sweden)

    Sangeet S Khemlani

    Full Text Available Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable.

  18. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction.

  19. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...

  20. Joint probabilities and quantum cognition

    CERN Document Server

    de Barros, J Acacio

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantum-like response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  1. Three lectures on free probability

    OpenAIRE

    2012-01-01

    These are notes from a three-lecture mini-course on free probability given at MSRI in the Fall of 2010 and repeated a year later at Harvard. The lectures were aimed at mathematicians and mathematical physicists working in combinatorics, probability, and random matrix theory. The first lecture was a staged rediscovery of free independence from first principles, the second dealt with the additive calculus of free random variables, and the third focused on random matrix models.

  2. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  3. MATHEMATICAL EXPECTATION ABOUT DISCRETE RANDOM VARIABLE WITH INTERVAL PROBABILITY OR FUZZY PROBABILITY

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    The characteristics of and an algorithm for the DRVIP (discrete random variable with interval probability) and the second kind of DRVFP (discrete random variable with crisp-event fuzzy probability) are investigated. Using the fuzzy resolution theorem, solving the mathematical expectation of a DRVFP can be translated into solving the mathematical expectation of a series of DRVIPs. Solving the mathematical expectation of a DRVIP is a typical linear programming problem. A very practical calculating formula for the mathematical expectation of a DRVIP was obtained by using Dantzig's simplex method. The example indicates that the result obtained by using the derived formula agrees completely with the result obtained by using the linear programming method, but the process using the formula is simpler.
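
    The linear-programming step can be illustrated directly. The sketch below uses hypothetical values and probability intervals, and scipy's linprog in place of a hand-coded simplex method:

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # The pmf of a discrete random variable is known only up to intervals,
    # so the expectation is bounded by a small linear program over all
    # admissible pmfs. Values and intervals below are invented.
    x = np.array([0.0, 1.0, 2.0, 3.0])          # values of the random variable
    lo = np.array([0.10, 0.20, 0.15, 0.05])     # lower probability bounds
    hi = np.array([0.40, 0.50, 0.45, 0.30])     # upper probability bounds

    A_eq, b_eq = np.ones((1, 4)), np.array([1.0])   # probabilities sum to 1
    bounds = list(zip(lo, hi))

    e_min = linprog(c=x, A_eq=A_eq, b_eq=b_eq, bounds=bounds).fun
    e_max = -linprog(c=-x, A_eq=A_eq, b_eq=b_eq, bounds=bounds).fun

    print(f"E[X] lies in [{e_min:.3f}, {e_max:.3f}]")
    ```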

  4. Cluster Membership Probability: Polarimetric Approach

    CERN Document Server

    Medhi, Biman J

    2013-01-01

    Interstellar polarimetric data of the six open clusters Hogg 15, NGC 6611, NGC 5606, NGC 6231, NGC 5749 and NGC 6250 have been used to estimate the membership probability for the stars within them. For proper-motion member stars, the membership probability estimated using the polarimetric data is in good agreement with the proper-motion cluster membership probability. However, for proper-motion non-member stars, the membership probability estimated by the polarimetric method is in total disagreement with the proper-motion cluster membership probability. The inconsistencies in the determined memberships may be because of the fundamental differences between the two methods of determination: one is based on stellar proper-motion in space and the other is based on selective extinction of the stellar output by the asymmetric aligned dust grains present in the interstellar medium. The results and analysis suggest that the scatter of the Stokes vectors q(%) and u(%) for the proper-motion member stars depends on the ...

  5. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods.
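
    A rough Python sketch of the plotting idea, using pointwise beta-distribution intervals for the order statistics rather than the authors' simultaneous 1-α construction:

    ```python
    import numpy as np
    from scipy import stats

    # Pointwise 95% intervals for a normal probability plot: the i-th uniform
    # order statistic is Beta(i, n-i+1), mapped through the normal quantile
    # function. This is an approximation, not the paper's simultaneous band.
    rng = np.random.default_rng(2)
    x = np.sort(rng.normal(size=50))
    n = len(x)
    i = np.arange(1, n + 1)

    z = (x - x.mean()) / x.std(ddof=1)              # standardized sample
    q_lo = stats.norm.ppf(stats.beta.ppf(0.025, i, n - i + 1))
    q_hi = stats.norm.ppf(stats.beta.ppf(0.975, i, n - i + 1))

    outside = (z < q_lo) | (z > q_hi)
    print(f"{outside.sum()} of {n} plotted points fall outside their interval")
    ```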

  6. MEMS Calculator

    Science.gov (United States)

    SRD 166 MEMS Calculator (Web, free access)   This MEMS Calculator determines the following thin film properties from data taken with an optical interferometer or comparable instrument: a) residual strain from fixed-fixed beams, b) strain gradient from cantilevers, c) step heights or thicknesses from step-height test structures, and d) in-plane lengths or deflections. Then, residual stress and stress gradient calculations can be made after an optical vibrometer or comparable instrument is used to obtain Young's modulus from resonating cantilevers or fixed-fixed beams. In addition, wafer bond strength is determined from micro-chevron test structures using a material test machine.

  7. Objective Lightning Probability Forecast Tool Phase II

    Science.gov (United States)

    Lambert, Winnie

    2007-01-01

    This presentation describes the improvement of a set of lightning probability forecast equations that are used by the 45th Weather Squadron forecasters for their daily 1100 UTC (0700 EDT) weather briefing during the warm season months of May-September. This information is used for general scheduling of operations at Cape Canaveral Air Force Station and Kennedy Space Center. Forecasters at the Spaceflight Meteorology Group also make thunderstorm forecasts during Shuttle flight operations. Five modifications were made by the Applied Meteorology Unit: increased the period of record from 15 to 17 years, changed the method of calculating the flow regime of the day, calculated a new optimal layer relative humidity, used a new smoothing technique for the daily climatology, and used a new valid area. The test results indicated that the modified equations showed an increase in skill over the current equations, good reliability, and an ability to distinguish between lightning and non-lightning days.

  8. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Taking the Gauss distribution into account, the author estimates the probability of the relative displacement of the isolated mass remaining below the vibration criteria. This problem is solved in a three-dimensional space spanned by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are derived so that the probability of exceeding vibration criteria VC-E and VC-D remains below 0.04.
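
    A minimal sketch of the exceedance calculation with invented numbers, assuming the relative displacement is a zero-mean Gaussian with known RMS value:

    ```python
    from scipy.stats import norm

    # If the relative displacement X is zero-mean Gaussian with RMS sigma,
    # the two-sided probability of exceeding a vibration criterion vc is
    # P(|X| > vc) = 2 * (1 - Phi(vc / sigma)). Numbers below are assumed.
    sigma = 0.8e-6    # RMS displacement, m (hypothetical)
    vc = 2.0e-6       # vibration criterion amplitude, m (hypothetical)

    p_exceed = 2 * norm.sf(vc / sigma)
    print(f"P(|displacement| > criterion) = {p_exceed:.4f}")
    ```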

  9. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  10. Detonation probabilities of high explosives

    Energy Technology Data Exchange (ETDEWEB)

    Eisenhawer, S.W.; Bott, T.F.; Bement, T.R.

    1995-07-01

    The probability of a high explosive violent reaction (HEVR) following various events is an extremely important aspect of estimating accident-sequence frequency for nuclear weapons dismantlement. In this paper, we describe the development of response curves for insults to PBX 9404, a conventional high-performance explosive used in US weapons. The insults during dismantlement include drops of high explosive (HE), strikes of tools and components on HE, and abrasion of the explosive. In the case of drops, we combine available test data on HEVRs and the results of flooring certification tests to estimate the HEVR probability. For other insults, it was necessary to use expert opinion. We describe the expert solicitation process and the methods used to consolidate the responses. The HEVR probabilities obtained from both approaches are compared.

  11. Probability on real Lie algebras

    CERN Document Server

    Franz, Uwe

    2016-01-01

    This monograph is a progressive introduction to non-commutativity in probability theory, summarizing and synthesizing recent results about classical and quantum stochastic processes on Lie algebras. In the early chapters, focus is placed on concrete examples of the links between algebraic relations and the moments of probability distributions. The subsequent chapters are more advanced and deal with Wigner densities for non-commutative couples of random variables, non-commutative stochastic processes with independent increments (quantum Lévy processes), and the quantum Malliavin calculus. This book will appeal to advanced undergraduate and graduate students interested in the relations between algebra, probability, and quantum theory. It also addresses a more advanced audience by covering other topics related to non-commutativity in stochastic calculus, Lévy processes, and the Malliavin calculus.

  12. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  13. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  14. Volcano shapes, entropies, and eruption probabilities

    Science.gov (United States)

    Gudmundsson, Agust; Mohajeri, Nahid

    2014-05-01

    We propose that the shapes of polygenetic volcanic edifices reflect the shapes of the associated probability distributions of eruptions. In this view, the peak of a given volcanic edifice coincides roughly with the peak of the probability (or frequency) distribution of its eruptions. The broadness and slopes of the edifices vary widely, however. The shapes of volcanic edifices can be approximated by various distributions, either discrete (binning or histogram approximation) or continuous. For a volcano shape (profile) approximated by a normal curve, for example, the broadness would be reflected in its standard deviation (spread). Entropy (S) of a discrete probability distribution is a measure of the absolute uncertainty as to the next outcome/message: in this case, the uncertainty as to time and place of the next eruption. A uniform discrete distribution (all bins of equal height), representing a flat volcanic field or zone, has the largest entropy or uncertainty. For continuous distributions, we use differential entropy, which is a measure of relative uncertainty, or uncertainty change, rather than absolute uncertainty. Volcano shapes can be approximated by various distributions, from which the entropies and thus the uncertainties as regards future eruptions can be calculated. We use the Gibbs-Shannon formula for the discrete entropies and the analogous general formula for the differential entropies and compare their usefulness for assessing the probabilities of eruptions in volcanoes. We relate the entropies to the work done by the volcano during an eruption using the Helmholtz free energy. Many factors other than the frequency of eruptions determine the shape of a volcano. These include erosion, landslides, and the properties of the erupted materials (including their angle of repose). The exact functional relation between the volcano shape and the eruption probability distribution must be explored for individual volcanoes but, once established, can be used to…
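
    A toy illustration of the entropy calculation with invented bin heights: a peaked cone gives a lower Gibbs-Shannon entropy (less uncertainty about the next eruption's location) than a flat volcanic field:

    ```python
    import numpy as np

    # Approximate a volcano profile by binned heights, normalize the bins to a
    # discrete distribution, and compute S = -sum(p_i * log(p_i)).
    heights_steep = np.array([1, 2, 6, 20, 6, 2, 1], dtype=float)  # peaked cone
    heights_flat = np.ones(7)                                      # flat field

    def entropy(h):
        p = h / h.sum()                 # bin heights -> probabilities
        return -np.sum(p * np.log(p))

    print("steep cone :", entropy(heights_steep))
    print("flat field :", entropy(heights_flat))   # maximal: log(7) ~ 1.946
    ```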

  15. Exact Probability Distribution versus Entropy

    Directory of Open Access Journals (Sweden)

    Kerstin Andersson

    2014-10-01

    Full Text Available The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
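
    The basic guessing calculation is easy to sketch: with word probabilities sorted in decreasing order, the average number of guesses is the sum of i·p(i) over ranks i. The toy example below uses an invented Zipf-like distribution, not the paper's language models:

    ```python
    import numpy as np

    # Invented Zipf-like word distribution; guessing proceeds in decreasing
    # order of probability, so rank i is found after exactly i guesses.
    ranks = np.arange(1, 10_001)
    p = 1.0 / ranks
    p /= p.sum()                        # normalized, already sorted descending

    avg_guesses = np.sum(ranks * p)     # E[number of guesses] = sum_i i * p_(i)
    print(f"average guesses: {avg_guesses:.1f} of {len(p)} words")
    ```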

  16. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  17. Estimating Probabilities in Recommendation Systems

    OpenAIRE

    Sun, Mingxuan; Lebanon, Guy; Kidwell, Paul

    2010-01-01

    Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computat...

  18. Fuzzy Markov chains: uncertain probabilities

    OpenAIRE

    2002-01-01

    We consider finite Markov chains where there are uncertainties in some of the transition probabilities. These uncertainties are modeled by fuzzy numbers. Using a restricted fuzzy matrix multiplication we investigate the properties of regular, and absorbing, fuzzy Markov chains and show that the basic properties of these classical Markov chains generalize to fuzzy Markov chains.

  19. Conditional Independence in Applied Probability.

    Science.gov (United States)

    Pfeiffer, Paul E.

    This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…

  20. Comments on quantum probability theory.

    Science.gov (United States)

    Sloman, Steven

    2014-01-01

    Quantum probability theory (QP) is the best formal representation available of the most common form of judgment involving attribute comparison (inside judgment). People are capable, however, of judgments that involve proportions over sets of instances (outside judgment). Here, the theory does not do so well. I discuss the theory both in terms of descriptive adequacy and normative appropriateness.

  1. Probability representations of fuzzy systems

    Institute of Scientific and Technical Information of China (English)

    LI Hongxing

    2006-01-01

    In this paper, the probability significance of fuzzy systems is revealed. It is pointed out that COG method, a defuzzification technique used commonly in fuzzy systems, is reasonable and is the optimal method in the sense of mean square. Based on different fuzzy implication operators, several typical probability distributions such as Zadeh distribution, Mamdani distribution, Lukasiewicz distribution, etc. are given. Those distributions act as "inner kernels" of fuzzy systems. Furthermore, by some properties of probability distributions of fuzzy systems, it is also demonstrated that CRI method, proposed by Zadeh, for constructing fuzzy systems is basically reasonable and effective. Besides, the special action of uniform probability distributions in fuzzy systems is characterized. Finally, the relationship between CRI method and triple I method is discussed. In the sense of construction of fuzzy systems, when restricting three fuzzy implication operators in triple I method to the same operator, CRI method and triple I method may be related in the following three basic ways: 1) Two methods are equivalent; 2) the latter is a degeneration of the former; 3) the latter is trivial whereas the former is not. When three fuzzy implication operators in triple I method are not restricted to the same operator, CRI method is a special case of triple I method; that is, triple I method is a more comprehensive algorithm. Since triple I method has a good logical foundation and comprises an idea of optimization of reasoning, triple I method will possess a beautiful vista of application.
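
    A minimal sketch of the COG step referred to above: for a discretized output fuzzy set, the defuzzified value is the membership-weighted mean, which is what licenses the probabilistic (mean-square-optimal) reading of COG:

    ```python
    import numpy as np

    # Center-of-gravity (COG) defuzzification on a grid: normalize the
    # membership function and take its mean, i.e. a weighted average.
    y = np.linspace(0.0, 10.0, 501)                  # output universe
    mu = np.maximum(0.0, 1.0 - np.abs(y - 6.0) / 2)  # triangular fuzzy output

    y_crisp = np.sum(y * mu) / np.sum(mu)            # COG = weighted mean
    print(y_crisp)   # ~6.0 for this symmetric membership function
    ```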

  2. Calculating Quenching Weights

    CERN Document Server

    Salgado, C A; Salgado, Carlos A.; Wiedemann, Urs Achim

    2003-01-01

    We calculate the probability ("quenching weight") that a hard parton radiates an additional energy fraction due to scattering in spatially extended QCD matter. This study is based on an exact treatment of finite in-medium path length, it includes the case of a dynamically expanding medium, and it extends to the angular dependence of the medium-induced gluon radiation pattern. All calculations are done in the multiple soft scattering approximation (Baier-Dokshitzer-Mueller-Peigné-Schiff-Zakharov "BDMPS-Z" formalism) and in the single hard scattering approximation (N=1 opacity approximation). By comparison, we establish a simple relation between transport coefficient, Debye screening mass and opacity, for which both approximations lead to comparable results. Together with this paper, a CPU-inexpensive numerical subroutine for calculating quenching weights is provided electronically. To illustrate its applications, we discuss the suppression of hadronic transverse momentum spectra in nucleus-nucleus colli...

  3. Systematic study of survival probability of excited superheavy nuclei

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    The stability of excited superheavy nuclei (SHN) with 100 ≤ Z ≤ 134 against neutron emission and fission is investigated by using a statistical model. In particular, a systematic study of the survival probability against fission in the 1n-channel of these SHN is made. The present calculations consistently take the neutron separation energies and shell correction energies from the calculated results of the finite range droplet model, which predicts an island of stability of SHN around Z = 115 and N = 179. It turns out that this island of stability persists for excited SHN in the sense that the calculated survival probabilities in the 1n-channel of excited SHN at the optimal excitation energy are maximized around Z = 115 and N = 179. This indicates that the survival probability in the 1n-channel is mainly determined by the nuclear shell effects.

  4. Reach/frequency for printed media: Personal probabilities or models

    DEFF Research Database (Denmark)

    Mortensen, Peter Stendahl

    2000-01-01

    The author evaluates two different ways of estimating reach and frequency of plans for printed media. The first assigns reading probabilities to groups of respondents and calculates reach and frequency by simulation. The second estimates parameters of a model for reach/frequency. It is concluded that, in order to prevent bias, ratings per group must be used as reading probabilities. Nevertheless, in most cases, the estimates are still biased compared with panel data, thus overestimating net reach. Models with the same assumptions as with assignments of reading probabilities are presented.

  5. Probability distributions for multimeric systems.

    Science.gov (United States)

    Albert, Jaroslav; Rooman, Marianne

    2016-01-01

    We propose a fast and accurate method of obtaining the equilibrium mono-modal joint probability distributions for multimeric systems. The method necessitates only two assumptions: the copy number of all species of molecule may be treated as continuous, and the probability density functions (pdf) are well-approximated by multivariate skew normal distributions (MSND). Starting from the master equation, we convert the problem into a set of equations for the statistical moments, which are then expressed in terms of the parameters intrinsic to the MSND. Using an optimization package on Mathematica, we minimize a Euclidean distance function comprising a sum of the squared differences between the left and the right hand sides of these equations. Comparison of results obtained via our method with those rendered by the Gillespie algorithm demonstrates our method to be highly accurate as well as efficient.

  6. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  7. Sm Transition Probabilities and Abundances

    CERN Document Server

    Lawler, J E; Sneden, C; Cowan, J J

    2005-01-01

    Radiative lifetimes, accurate to +/- 5%, have been measured for 212 odd-parity levels of Sm II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier-transform spectrometry to determine transition probabilities for more than 900 lines of Sm II. This work is the largest-scale laboratory study to date of Sm II transition probabilities using modern methods. This improved data set has been used to determine a new solar photospheric Sm abundance, log epsilon = 1.00 +/- 0.03, from 26 lines. The spectra of three very metal-poor, neutron-capture-rich stars also have been analyzed, employing between 55 and 72 Sm II lines per star. The abundance ratios of Sm relative to other rare earth elements in these stars are in agreement, and are consistent with ratios expected from rapid neutron-capture nucleosynthesis (the r-process).

  8. Knot probabilities in random diagrams

    Science.gov (United States)

    Cantarella, Jason; Chapman, Harrison; Mastin, Matt

    2016-10-01

    We consider a natural model of random knotting—choose a knot diagram at random from the finite set of diagrams with n crossings. We tabulate diagrams with 10 and fewer crossings and classify the diagrams by knot type, allowing us to compute exact probabilities for knots in this model. As expected, most diagrams with 10 and fewer crossings are unknots (about 78% of the roughly 1.6 billion 10 crossing diagrams). For these crossing numbers, the unknot fraction is mostly explained by the prevalence of ‘tree-like’ diagrams which are unknots for any assignment of over/under information at crossings. The data shows a roughly linear relationship between the log of knot type probability and the log of the frequency rank of the knot type, analogous to Zipf’s law for word frequency. The complete tabulation and all knot frequencies are included as supplementary data.

  9. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    Andre C. R. Martins

    2006-11-01

    Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.

  10. Conditional Probability Analyses of the Spike Activity of Single Neurons

    Science.gov (United States)

    Gray, Peter R.

    1967-01-01

    With the objective of separating stimulus-related effects from refractory effects in neuronal spike data, various conditional probability analyses have been developed. These analyses are introduced and illustrated with examples based on electrophysiological data from auditory nerve fibers. The conditional probability analyses considered here involve the estimation of the conditional probability of a firing in a specified time interval (defined relative to the time of the stimulus presentation), given that the last firing occurred during an earlier specified time interval. This calculation enables study of the stimulus-related effects in the spike data with the time-since-the-last-firing as a controlled variable. These calculations indicate that auditory nerve fibers “recover” from the refractory effects that follow a firing in the following sense: after a “recovery time” of approximately 20 msec, the firing probabilities no longer depend on the time-since-the-last-firing. Probabilities conditional on this minimum time since the last firing are called “recovered probabilities.” The recovered probabilities presented in this paper are contrasted with the corresponding poststimulus time histograms, and the differences are related to the refractory properties of the nerve fibers. PMID:19210997
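
    A schematic version of this analysis on synthetic data (Bernoulli-binned spike trains rather than auditory-nerve recordings), with hypothetical window choices:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic spike trains: n_trials repeated presentations, 1 ms bins over
    # 100 ms, constant firing rate (Poisson-like). All numbers are invented.
    n_trials, T, rate, dt = 2000, 0.100, 80.0, 0.001   # s, spikes/s, bin width
    spikes = rng.random((n_trials, int(T / dt))) < rate * dt

    W = slice(60, 65)      # target window: 60-65 ms after stimulus onset
    V = slice(40, 45)      # candidate "last firing" window: 40-45 ms

    fired_V = spikes[:, V].any(axis=1)
    quiet_between = ~spikes[:, 45:60].any(axis=1)   # no firing between V and W
    cond = fired_V & quiet_between                  # last firing fell in V
    fired_W = spikes[:, W].any(axis=1)

    p = fired_W[cond].mean()
    print(f"P(spike in W | last spike in V) = {p:.3f}  (n = {cond.sum()})")
    ```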

  11. Off-site ignition probability of flammable gases.

    Science.gov (United States)

    Rew, P J; Spencer, H; Daycock, J

    2000-01-07

    A key step in the assessment of risk for installations where flammable liquids or gases are stored is the estimation of ignition probability. A review of current modelling and data confirmed that ignition probability values used in risk analyses tend to be based on extrapolation of limited incident data or, in many cases, on the judgement of those conducting the safety assessment. Existing models tend to assume that ignition probability is a function of release rate (or flammable gas cloud size) alone and they do not consider location, density or type of ignition source. An alternative mathematical framework for calculating ignition probability is outlined in which the approach used is to model the distribution of likely ignition sources and to calculate ignition probability by considering whether the flammable gas cloud will reach these sources. Data are collated on the properties of ignition sources within three generic land-use types: industrial, urban and rural. These data are then incorporated into a working model for ignition probability in a form capable of being implemented within risk analysis models. The sensitivity of the model results to assumptions made in deriving the ignition source properties is discussed and the model is compared with other available ignition probability methods.
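
    A hedged toy version of such a framework, not the authors' calibrated model: assuming ignition sources scattered with a land-use-dependent density and an independent per-source ignition probability, the chance that a cloud covering area A finds an active source is 1 - exp(-density · A · p):

    ```python
    import math

    # Invented source densities (per m^2) and per-source ignition probabilities
    # for three generic land-use types; purely illustrative values.
    land_use = {
        "rural": (1e-6, 0.3),
        "urban": (1e-4, 0.3),
        "industrial": (5e-4, 0.5),
    }

    def ignition_probability(cloud_area_m2: float, land: str) -> float:
        density, p_src = land_use[land]
        # Poisson-source assumption: P(ignition) = 1 - exp(-lambda * A * p)
        return 1.0 - math.exp(-density * cloud_area_m2 * p_src)

    for land in land_use:
        print(land, round(ignition_probability(5_000.0, land), 4))
    ```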

  12. Probability Properties of Multi-contact in Protein Molecules

    Institute of Scientific and Technical Information of China (English)

    WANG Xiang-hong; KE Jian-hong; HU Min-xiao; ZHANG Lin-xi

    2003-01-01

    The compact conformations of polymers are important because the native conformations of all biopolymers with specific functions are highly compact. The properties of multi-contact biopolymer chains were studied using Gaussian statistics of the random-flight chain. Theoretical expressions were given, and the probability distributions and correlation functions for different topological cases were derived and calculated. A comparison between single, double, and triple contacts was also made. By appropriate settings of the parameters, the current calculations for multiple contacts reproduce the results calculated for single, double, or triple contacts separately. This is a useful method for investigating native conformations of biopolymers. The probabilities of multi-contacts and the correlation functions between chain contacts were calculated for Gaussian chains. Because the bond probability distributions are Gaussian, the probability distributions of the separations of various points along the chains are always continuous. All the contacts may break up into several groups, and each group consists of many contacts. Here we investigated the probability distributions from one group to three groups of contacts.

  13. Logic, Probability, and Human Reasoning

    Science.gov (United States)

    2015-01-01

    ...and they underlie mathematics, science, and technology [7–10]. Plato claimed that emotions upset reasoning. However, individuals in the grip... P.N. Johnson-Laird (Princeton University), Sangeet S. Khemlani, and Geoffrey P. Goodwin.

  14. Probability and statistics: A reminder

    Directory of Open Access Journals (Sweden)

    Clément Benoit

    2013-07-01

    Full Text Available The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from “data analysis in experimental sciences” given in [1].

  15. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  16. Objective probability and quantum fuzziness

    CERN Document Server

    Mohrhoff, U

    2007-01-01

    This paper offers a critique of the Bayesian approach to quantum mechanics in general and of a recent paper by Caves, Fuchs, and Schack in particular (quant-ph/0608190 v2). In that paper the Bayesian interpretation of Born probabilities is defended against what the authors call the "objective-preparations view". The fact that Caves et al. and the proponents of this view equally misconstrue the time dependence of quantum states voids the arguments pressed by the former against the latter. After tracing the genealogy of this common error, I argue that the real oxymoron is not an unknown quantum state, as the Bayesians hold, but an unprepared quantum state. I further argue that the essential role of probability in quantum theory is to define and quantify an objective fuzziness. This, more than anything, legitimizes conjoining "objective" to "probability". The measurement problem is essentially the problem of finding a coherent way of thinking about this objective fuzziness, and about the supervenience of the ma...

  17. Evaluations of Structural Failure Probabilities and Candidate Inservice Inspection Programs

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.; Simonen, Fredric A.

    2009-05-01

    The work described in this report applies probabilistic structural mechanics models to predict the reliability of nuclear pressure boundary components. These same models are then applied to evaluate the effectiveness of alternative programs for inservice inspection to reduce these failure probabilities. Results of the calculations support the development and implementation of risk-informed inservice inspection of piping and vessels. Studies have specifically addressed the potential benefits of ultrasonic inspections to reduce failure probabilities associated with fatigue crack growth and stress-corrosion cracking. Parametric calculations were performed with the computer code pc-PRAISE to generate an extensive set of plots to cover a wide range of pipe wall thicknesses, cyclic operating stresses, and inspection strategies. The studies have also addressed critical inputs to fracture mechanics calculations such as the parameters that characterize the number and sizes of fabrication flaws in piping welds. Other calculations quantify uncertainties associated with the inputs to the calculations, the uncertainties in the fracture mechanics models, and the uncertainties in the resulting calculated failure probabilities. A final set of calculations addresses the effects of flaw sizing errors on the effectiveness of inservice inspection programs.

  18. Application of the diagrams of phase transformations during aging for optimizing the aging conditions for V1469 and 1441 Al-Li alloys

    Science.gov (United States)

    Lukina, E. A.; Alekseev, A. A.; Antipov, V. V.; Zaitsev, D. V.; Klochkova, Yu. Yu.

    2009-12-01

    To describe the changes in the phase composition of alloys during aging, it is convenient to construct TTT diagrams on the temperature-aging time coordinates in which time-temperature regions of the existence of nonequilibrium phases that form during aging are indicated. As a rule, in constructing the diagrams of phase transformations during aging (DPTA), time-temperature maps of properties are plotted. A comparison of the diagrams with maps of properties allows one to analyze the effect of the structure on the properties. In this study, we analyze the DPTAs of V1469 (Al-1.2 Li-0.46 Ag-3.4 Cu-0.66 Mg) and 1441 (Al-1.8 Li-1.1 Mg-1.6 Cu, C_Mg/C_Cu ≈ 1) alloys. Examples of the application of DPTA for the development of steplike aging conditions are reported.

  19. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  20. PROBABILITY MODEL OF GUNTHER GENERATOR

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    This paper first constructs the probability model of the Gunther generator, and the finite-dimensional joint distribution of the output sequence is presented. The result shows that the output sequence is an independent and uniformly distributed 0-1 random variable sequence. This gives the theoretical foundation for why the Gunther generator can avoid the statistical weaknesses of the output sequence of the stop-and-go generator, and the coincidence between the output sequence and the input sequences of the Gunther generator is analyzed. The conclusions of this paper offer theoretical references for designers and analyzers of clock-controlled generators.

  1. Estimating Probabilities in Recommendation Systems

    CERN Document Server

    Sun, Mingxuan; Kidwell, Paul

    2010-01-01

    Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computation schemes using combinatorial properties of generating functions. We demonstrate our approach with several case studies involving real world movie recommendation data. The results are comparable with state-of-the-art techniques while also providing probabilistic preference estimates outside the scope of traditional recommender systems.

  2. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.
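
    Markov chains are among the models the chapter treats in detail; as a minimal generic illustration (not code from the chapter), the maximum-likelihood estimate of a transition matrix is simply a table of normalized transition counts:

        import numpy as np

        def estimate_transition_matrix(seq, n_states):
            """ML estimate of a Markov transition matrix from one
            observed state sequence (row i = current state i)."""
            counts = np.zeros((n_states, n_states))
            for a, b in zip(seq[:-1], seq[1:]):
                counts[a, b] += 1
            row_sums = counts.sum(axis=1, keepdims=True)
            return counts / np.maximum(row_sums, 1)   # avoid 0/0 rows

        P = estimate_transition_matrix([0, 0, 1, 0, 1, 1, 0, 0, 1], 2)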

  3. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  4. Probability of Detection Demonstration Transferability

    Science.gov (United States)

    Parker, Bradford H.

    2008-01-01

    The ongoing Mars Science Laboratory (MSL) Propellant Tank Penetrant Nondestructive Evaluation (NDE) Probability of Detection (POD) Assessment (NESC activity) has surfaced several issues associated with liquid penetrant POD demonstration testing. This presentation lists factors that may influence the transferability of POD demonstration tests. Initial testing will address the liquid penetrant inspection technique. Some of the factors to be considered in this task are crack aspect ratio, the extent of the crack opening, the material and the distance between the inspection surface and the inspector's eye.

  5. Exact feature probabilities in images with occlusion

    CERN Document Server

    Pitkow, Xaq

    2010-01-01

    To understand the computations of our visual system, it is important to understand also the natural environment it evolved to interpret. Unfortunately, existing models of the visual environment are either unrealistic or too complex for mathematical description. Here we describe a naturalistic image model and present a mathematical solution for the statistical relationships between the image features and model variables. The world described by this model is composed of independent, opaque, textured objects which occlude each other. This simple structure allows us to calculate the joint probability distribution of image values sampled at multiple arbitrarily located points, without approximation. This result can be converted into probabilistic relationships between observable image features as well as between the unobservable properties that caused these features, including object boundaries and relative depth. Using these results we explain the causes of a wide range of natural scene properties, including high...

  6. An Alternative Teaching Method of Conditional Probabilities and Bayes' Rule: An Application of the Truth Table

    Science.gov (United States)

    Satake, Eiki; Vashlishan Murray, Amy

    2015-01-01

    This paper presents a comparison of three approaches to the teaching of probability to demonstrate how the truth table of elementary mathematical logic can be used to teach the calculations of conditional probabilities. Students are typically introduced to the topic of conditional probabilities--especially the ones that involve Bayes' rule--with…
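
    To make the truth-table idea concrete, the generic sketch below (not the authors' classroom material) enumerates the joint outcomes of two events together with their probabilities and reads P(A|B) off the table, which is exactly Bayes' rule; the input numbers are hypothetical:

        from itertools import product

        p_A = 0.01             # hypothetical prior P(A)
        p_B_given_A = 0.95     # hypothetical P(B|A)
        p_B_given_notA = 0.05  # hypothetical P(B|not A)

        rows = []
        for a, b in product([True, False], repeat=2):
            pa = p_A if a else 1 - p_A
            pb = p_B_given_A if a else p_B_given_notA
            pb = pb if b else 1 - pb
            rows.append((a, b, pa * pb))  # one truth-table row with its probability

        p_B = sum(p for a, b, p in rows if b)
        p_A_given_B = sum(p for a, b, p in rows if a and b) / p_B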

  7. Hf Transition Probabilities and Abundances

    CERN Document Server

    Lawler, J E; Labby, Z E; Sneden, C; Cowan, J J; Ivans, I I

    2006-01-01

    Radiative lifetimes from laser-induced fluorescence measurements, accurate to about +/- 5 percent, are reported for 41 odd-parity levels of Hf II. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 150 lines of Hf II. Approximately half of these new transition probabilities overlap with recent independent measurements using a similar approach. The two sets of measurements are found to be in good agreement for measurements in common. Our new laboratory data are applied to refine the hafnium photospheric solar abundance and to determine hafnium abundances in 10 metal-poor giant stars with enhanced r-process abundances. For the Sun we derive log epsilon (Hf) = 0.88 +/- 0.08 from four lines; the uncertainty is dominated by the weakness of the lines and their blending by other spectral features. Within the uncertainties of our analysis, the r-process-rich stars possess constant Hf/La and Hf/Eu abundance ratios, log epsilon (Hf...

  8. Gd Transition Probabilities and Abundances

    CERN Document Server

    Den Hartog, E A; Sneden, C; Cowan, J J

    2006-01-01

    Radiative lifetimes, accurate to +/- 5%, have been measured for 49 even-parity and 14 odd-parity levels of Gd II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 611 lines of Gd II. This work is the largest-scale laboratory study to date of Gd II transition probabilities and the first using a high performance Fourier transform spectrometer. This improved data set has been used to determine a new solar photospheric Gd abundance, log epsilon = 1.11 +/- 0.03. Revised Gd abundances have also been derived for the r-process-rich metal-poor giant stars CS 22892-052, BD+17 3248, and HD 115444. The resulting Gd/Eu abundance ratios are in very good agreement with the solar-system r-process ratio. We have employed the increasingly accurate stellar abundance determinations, resulting in large part from the more precise laboratory atomic data, to predict directly the Solar System r-process elemental...

  9. Calculator calculus

    CERN Document Server

    McCarty, George

    1982-01-01

    How this book differs: This book is about the calculus. What distinguishes it, however, from other books is that it uses the pocket calculator to illustrate the theory. A computation that requires hours of labor when done by hand with tables is quite inappropriate as an example or exercise in a beginning calculus course. But that same computation can become a delicate illustration of the theory when the student does it in seconds on his calculator. Furthermore, the student's own personal involvement and easy accomplishment give him reassurance and encouragement. The machine is like a microscope, and its magnification is a hundred millionfold. We shall be interested in limits, and no stage of numerical approximation proves anything about the limit. However, the derivative of f(x) = 67.5^x, for instance, acquires real meaning when a student first appreciates its values as numbers, as limits. A quick example is 1.1^10, 1.01^100, 1.001^1000, .... Another example is t = 0.1, 0.01, in the functio...

  10. The Inductive Applications of Probability Calculus

    Directory of Open Access Journals (Sweden)

    Corrado Gini

    2015-06-01

    Full Text Available The author goes back to the founders of probability calculus to investigate their original interpretation of the probability measure in applications of probability theory to real problems. The author highlights some misunderstandings related to the inversion of deductions derived from the use of probability distributions for investigating the causes of events.

  11. THE TRANSITION PROBABILITY MATRIX OF A MARKOV CHAIN MODEL IN AN ATM NETWORK

    Institute of Scientific and Technical Information of China (English)

    YUE Dequan; ZHANG Huachen; TU Fengsheng

    2003-01-01

    In this paper we consider a Markov chain model in an ATM network, which has been studied by Dag and Stavrakakis. On the basis of the iterative formulas obtained by Dag and Stavrakakis, we obtain the explicit analytical expression of the transition probability matrix. It is very simple to calculate the transition probabilities of the Markov chain by these expressions. In addition, we obtain some results about the structure of the transition probability matrix, which are helpful in numerical calculation and theoretical analysis.
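
    As a small numeric illustration of working with an explicit transition probability matrix (the matrix below is hypothetical, not the ATM model's), n-step transition probabilities follow from matrix powers, and the stationary distribution is the left eigenvector belonging to eigenvalue 1:

        import numpy as np

        P = np.array([[0.9, 0.1],
                      [0.4, 0.6]])           # hypothetical one-step matrix

        P10 = np.linalg.matrix_power(P, 10)  # 10-step transition probabilities

        vals, vecs = np.linalg.eig(P.T)      # left eigenvectors of P
        pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
        pi = pi / pi.sum()                   # stationary distribution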

  12. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  13. Associativity and normative credal probability.

    Science.gov (United States)

    Snow, P

    2002-01-01

    Cox's Theorem is a widely cited motivation for probabilistic models of uncertain belief. The theorem relates the associativity of the logical connectives to that of the arithmetic operations of probability. Recent questions about the correctness of Cox's Theorem have been resolved, but there are new questions about one functional equation used by Cox in 1946. This equation is missing from his later work. Advances in knowledge since 1946 and changes in Cox's research interests explain the equation's disappearance. Other associativity-based motivations avoid functional equations altogether, and so may be more transparently applied to finite domains and discrete beliefs. A discrete counterpart of Cox's Theorem can be assembled from results that have been in the literature since 1959.

  14. Wigner function and the probability representation of quantum states

    Directory of Open Access Journals (Sweden)

    Man’ko Margarita A.

    2014-01-01

    Full Text Available The relation of the Wigner function with the fair probability distribution called the tomographic distribution, or quantum tomogram, associated with the quantum state is reviewed. The connection of the tomographic picture of quantum mechanics with the integral Radon transform of the Wigner quasidistribution is discussed. The Wigner–Moyal equation for the Wigner function is presented in the form of a kinetic equation for the tomographic probability distribution, both in quantum mechanics and in the classical limit of the Liouville equation. The calculation of moments of physical observables in terms of integrals with the state tomographic probability distributions is constructed, having the standard form of averaging in probability theory. New uncertainty relations for position and momentum are written in terms of optical tomograms suitable for direct experimental check. Some recent experiments on checking the uncertainty relations, including the entropic uncertainty relations, are discussed.
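
    For reference, the Radon-transform relation reviewed above can be written (normalization conventions for the Wigner function vary; this form assumes the tomogram is normalized to unity in X) as

        w(X,\theta) = \int W(q,p)\,\delta\bigl(X - q\cos\theta - p\sin\theta\bigr)\,\frac{\mathrm{d}q\,\mathrm{d}p}{2\pi},

    so the tomogram w(X, θ) is the marginal distribution of the rotated quadrature X = q cos θ + p sin θ.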

  15. Probability landscapes for integrative genomics

    Directory of Open Access Journals (Sweden)

    Benecke Arndt

    2008-05-01

    Full Text Available Abstract Background The comprehension of the gene regulatory code in eukaryotes is one of the major challenges of systems biology, and is a requirement for the development of novel therapeutic strategies for multifactorial diseases. Its bi-fold degeneration precludes brute force and statistical approaches based on the genomic sequence alone. Rather, recursive integration of systematic, whole-genome experimental data with advanced statistical regulatory sequence predictions needs to be developed. Such experimental approaches as well as the prediction tools are only starting to become available and increasing numbers of genome sequences and empirical sequence annotations are under continual discovery-driven change. Furthermore, given the complexity of the question, a decade(s long multi-laboratory effort needs to be envisioned. These constraints need to be considered in the creation of a framework that can pave a road to successful comprehension of the gene regulatory code. Results We introduce here a concept for such a framework, based entirely on systematic annotation in terms of probability profiles of genomic sequence using any type of relevant experimental and theoretical information and subsequent cross-correlation analysis in hypothesis-driven model building and testing. Conclusion Probability landscapes, which include as reference set the probabilistic representation of the genomic sequence, can be used efficiently to discover and analyze correlations amongst initially heterogeneous and un-relatable descriptions and genome-wide measurements. Furthermore, this structure is usable as a support for automatically generating and testing hypotheses for alternative gene regulatory grammars and the evaluation of those through statistical analysis of the high-dimensional correlations between genomic sequence, sequence annotations, and experimental data. Finally, this structure provides a concrete and tangible basis for attempting to formulate a

  16. The feature on the posterior conditional probability of finite state Markov channel

    Institute of Scientific and Technical Information of China (English)

    MU Li-hua; SHEN Ji-hong; YUAN Yan-hua

    2005-01-01

    The properties of the probability distribution of a finite-state Markov channel are discussed, on the condition that the original input/output sequences are known. This probability is called the posterior conditional probability. It is also proved, by the Bayes formula, that the posterior conditional probability forms a stationary Markov sequence if the channel input is independently and identically distributed. On the contrary, the Markov property of the posterior conditional probability is not kept if the input is not independently and identically distributed, and a numerical example is used to illustrate this case. The properties of the posterior conditional probability will aid the study of the numerically calculated recurrence formula for finite-state Markov channel capacity.

  17. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  18. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  19. E1M1 and E1E2 transition probabilities in one-electron ions

    Science.gov (United States)

    Labzowsky, L. N.; Shonin, A. V.

    2004-12-01

    The quantum electrodynamical (QED) theory of the two-photon transitions in hydrogenlike ions is presented. The emission probability for 2s→2γ(E1)+1s transitions is calculated and compared to the results of the previous calculations. The emission probabilities 2p→γ(E1)+γ(E2)+1s and 2p→γ(E1)+γ(M1)+1s are also calculated for the nuclear charge Z values 1⩽Z⩽100. This is the first calculation of the two latter probabilities. The results are given in two different gauges.

  20. Relativistic calculations of 3s2 1S0-3s3p 1P1 and 3s2 1S0-3s3p 3P1,2 transition probabilities in the Mg isoelectronic sequence

    Institute of Scientific and Technical Information of China (English)

    Cheng Cheng; Gao Xiang; Qing Bo; Zhang Xiao-Le; Li Jia-Ming

    2011-01-01

    Using the multi-configuration Dirac-Fock self-consistent field method and the relativistic configuration-interaction method, calculations of transition energies, oscillator strengths and rates are performed for the 3s2 1S0-3s3p 1P1 spin-allowed transition and the 3s2 1S0-3s3p 3P1,2 intercombination and magnetic quadrupole transitions in the Mg isoelectronic sequence (Mg Ⅰ, Al Ⅱ, Si Ⅲ, P Ⅳ and S Ⅴ). Electron correlations are treated adequately, including intravalence electron correlations. The influence of the Breit interaction on oscillator strengths and transition energies is investigated. Quantum electrodynamics corrections are also included. The calculated results are found to be in good agreement with the experimental data and other theoretical calculations.

  1. Evaluation of the Permanent Deformations and Aging Conditions of Batu Pahat Soft Clay-Modified Asphalt Mixture by Using a Dynamic Creep Test

    Directory of Open Access Journals (Sweden)

    Al Allam A. M.

    2016-01-01

    Full Text Available This study aimed to evaluate the permanent deformation and aging conditions of Batu Pahat soft clay-modified asphalt mixture, using Batu Pahat soft clay (BPSC) particles in powder form as an additive to hot-mix asphalt. In this experiment, five percentage compositions of BPSC (0%, 2%, 4%, 6%, and 8% by weight of bitumen) were used. A novel design was established to modify the hot-mix asphalt by using the Superpave method for each additive ratio. Several laboratory tests evaluating different properties, such as indirect tensile strength, resilient stiffness modulus, and dynamic creep, were conducted to assess the performance of the samples mixed through the Superpave method. In the resilient modulus test, fatigue and rutting resistance were reduced by the BPSC particles. The added BPSC particles increased the indirect tensile strength. Among the mixtures, 4% BPSC particles yielded the highest performance. In the dynamic creep test, 4% BPSC particles added to the unaged and short-term aged specimens also showed the highest performance. Based on these results, our conclusion is that the BPSC particles can alleviate the permanent deformation (rutting) of roads.

  2. Measuring Robustness of Timetables in Stations using a Probability Distribution

    DEFF Research Database (Denmark)

    Jensen, Lars Wittrup; Landex, Alex

    Currently, three methods to calculate the complexity of a station exist: 1. complexity of a station based on the track layout; 2. complexity of a station based on the probability of a conflict using a plan of operation; 3. complexity of a station based on the plan of operation and the minimum headway times. However, none of these methods takes a given timetable into account when the complexity of the station is calculated. E.g., if two timetable candidates following the same plan of operation in a station are given, one will be more vulnerable to delays (less robust) while the other will be less vulnerable (more robust), but this cannot be measured by the above methods. In the light of this, the article describes a new method where the complexity of a given station with a given timetable can be calculated based on a probability distribution, making it possible to identify the timetable candidate that reduces delays caused by interdependencies and results in a more robust operation.

  3. Brief communication: On direct impact probability of landslides on vehicles

    Science.gov (United States)

    Nicolet, Pierrick; Jaboyedoff, Michel; Cloutier, Catherine; Crosta, Giovanni B.; Lévy, Sébastien

    2016-04-01

    When calculating the risk of railway or road users being killed by a natural hazard, one has to calculate a temporal-spatial probability, i.e. the probability of a vehicle being in the path of the falling mass when the mass falls, or the expected number of affected vehicles in case of such an event. To calculate this, different methods are used in the literature, and, most of the time, they consider only the dimensions of the falling mass or the dimensions of the vehicles. Some authors do, however, consider both dimensions at the same time, and the use of their approach is recommended. Finally, a method considering an impact on the front of the vehicle is discussed.
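
    The approach of considering both dimensions admits a compact estimate: the expected number of vehicles in the path equals the traffic flow multiplied by the time a vehicle spends inside a window whose length is the landslide width plus the vehicle length. The sketch below is illustrative, with assumed parameter values rather than the authors' figures.

        def expected_vehicles_hit(aadt, v_kmh, w_landslide_m, l_vehicle_m=5.0):
            """Expected number of vehicles in the path of the falling mass,
            counting both the landslide width and the vehicle length."""
            flow_per_h = aadt / 24.0                             # mean hourly flow
            stretch_km = (w_landslide_m + l_vehicle_m) / 1000.0  # exposed window
            return flow_per_h * stretch_km / v_kmh               # vehicles present

        n = expected_vehicles_hit(aadt=8000, v_kmh=80, w_landslide_m=30)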

  4. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics have widely utilized probability weighting functions, the psychophysical foundations of these functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (0 < α < 1), which satisfies w(1/e) = 1/e and w(1) = 1 and has been studied extensively in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
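
    A direct implementation of Prelec's function, as written above, is a one-liner; the parameter values below are illustrative, not estimates from the paper:

        import math

        def prelec_weight(p, alpha=0.65, beta=1.0):
            """Prelec's weighting function w(p) = exp(-beta*(-ln p)**alpha);
            alpha < 1 gives the usual inverse-S shape."""
            if p == 0.0:
                return 0.0
            return math.exp(-beta * (-math.log(p)) ** alpha)

        # with beta = 1 the fixed point sits at p = 1/e
        assert abs(prelec_weight(math.exp(-1)) - math.exp(-1)) < 1e-12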

  5. The Black Hole Formation Probability

    CERN Document Server

    Clausen, Drew; Ott, Christian D

    2014-01-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. Using the observed BH mass distribution from Galactic X-ray binaries, we derive the probability that a star will make a BH as a function of its ZAMS mass, $P_{\rm BH}(M_{\rm ZAMS})$. We explore possible biases in the observed BH mass distribution and find that this sample is best suited for studying BH formation in stars with ZAMS masses in the range $12-...

  6. Probable Linezolid-Induced Pancytopenia

    Directory of Open Access Journals (Sweden)

    Nita Lakhani

    2005-01-01

    Full Text Available A 75-year-old male outpatient with cardiac disease, diabetes, chronic renal insufficiency and iron deficiency anemia was prescribed linezolid 600 mg twice daily for a methicillin-resistant Staphylococcus aureus diabetic foot osteomyelitis. After one week, his blood counts were consistent with baseline values. The patient failed to return for subsequent blood work. On day 26, he was admitted to hospital with acute renal failure secondary to dehydration, and was found to be pancytopenic (erythrocytes 2.5 x 10^12/L, leukocytes 2.9 x 10^9/L, platelets 59 x 10^9/L, hemoglobin 71 g/L). The patient was transfused, and linezolid was discontinued. His blood counts improved over the week and remained at baseline two months later. The patient's decline in blood counts from baseline levels met previously established criteria for clinical significance. Application of the Naranjo scale indicated a probable relationship between pancytopenia and linezolid. Clinicians should be aware of this rare effect with linezolid, and prospectively identify patients at risk and emphasize weekly hematological monitoring.

  7. Probability output of multi-class support vector machines

    Institute of Scientific and Technical Information of China (English)

    忻栋; 吴朝晖; 潘云鹤

    2002-01-01

    A novel approach to interpreting the outputs of multi-class support vector machines is proposed in this paper. Using the geometrical interpretation of the classifying hyperplane and the distance of the pattern from the hyperplane, one can calculate the posterior probability in the binary classification case. This paper focuses on the probability output in the multi-class case, where both the one-against-one and one-against-rest strategies are considered. Experiments on speaker verification showed that this method has high performance.
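
    As a rough stand-in for the paper's geometric mapping (which is not reproduced here), per-class decision values, i.e., signed distances to each one-against-rest hyperplane, can be turned into posterior-like probabilities with a softmax:

        import numpy as np

        def decision_values_to_probabilities(scores):
            """Map one-against-rest SVM decision values to probabilities.
            A generic softmax stand-in, not the paper's exact formula."""
            z = np.asarray(scores, dtype=float)
            z = z - z.max()          # shift for numerical stability
            e = np.exp(z)
            return e / e.sum()

        probs = decision_values_to_probabilities([1.7, -0.3, 0.2])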

  8. Probability of statistical L-H transition in tokamaks

    Energy Technology Data Exchange (ETDEWEB)

    Itoh, Sanae-I. [Kyushu Univ., Research Institute for Applied Mechanics, Kasuga, Fukuoka (Japan); Itoh, Kimitaka; Toda, Shinichiro [National Inst. for Fusion Science, Toki, Gifu (Japan)

    2002-08-01

    A statistical model of bifurcation of radial electric field E{sub r} is analyzed in relation with L-H transitions of tokamaks. A noise from micro fluctuations leads to random noise for E{sub r}. The transition of E{sub r} occurs in a probabilistic manner. Probability density function and ensemble average of E{sub r} are obtained, when hysteresis of E{sub r} exists. Forward- and backward-transition probabilities are calculated. The phase boundary is shown. Due to the suppression of turbulence by E{sub r} shear, the boundary deviates from the Maxwell's construction rule. (author)

  9. The estimation of yearly probability gain for seismic statistical model

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Based on the calculation method for information gain in stochastic processes presented by Vere-Jones, the relation between information gain and probability gain, which is very common in earthquake prediction, is studied, and the yearly probability gain for seismic statistical models is proposed. The method is applied to the non-stationary Poisson model with whole-process exponential increase and to the stress release model. In addition, the prediction method for the stress release model is obtained, based on the inverse function simulation method for stochastic variables.

  10. Non-Equilibrium Random Matrix Theory : Transition Probabilities

    CERN Document Server

    Pedro, Francisco Gil

    2016-01-01

    In this letter we present an analytic method for calculating the transition probability between two random Gaussian matrices with given eigenvalue spectra in the context of Dyson Brownian motion. We show that in the Coulomb gas language, in large $N$ limit, memory of the initial state is preserved in the form of a universal linear potential acting on the eigenvalues. We compute the likelihood of any given transition as a function of time, showing that as memory of the initial state is lost, transition probabilities converge to those of the static ensemble.

  11. Detection probabilities for time-domain velocity estimation

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    1991-01-01

    Estimation of blood velocities by time-domain cross-correlation of successive high-frequency sampled ultrasound signals is investigated. It is shown that any velocity can result from the estimator regardless of the true velocity, due to the nonlinear technique employed. Using a simple simulation model, the estimation process is shown to act as a filter with a transfer function depending on the actual velocity. This influences the detection probability, which gets lower at certain velocities. An index directly reflecting the probability of detection can easily be calculated from the estimated cross-correlation. This makes it possible to assess the reliability of the velocity estimate in real time...
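
    A bare-bones version of the estimator may clarify the idea: find the lag that maximizes the cross-correlation of two successive RF lines and convert it to an axial velocity. Parameter names are assumptions; practical systems also interpolate around the correlation peak and process depth windows separately.

        import numpy as np

        def velocity_from_xcorr(line1, line2, fs, fprf, c=1540.0):
            """Axial velocity from the lag maximizing the cross-correlation
            of two successive RF lines; fs = sampling rate (Hz),
            fprf = pulse repetition frequency (Hz), c = sound speed (m/s)."""
            xc = np.correlate(line2, line1, mode="full")
            lag = np.argmax(xc) - (len(line1) - 1)  # lag in samples
            t_shift = lag / fs                      # seconds between lines
            return c * t_shift * fprf / 2.0         # m/s along the beam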

  12. Conditional probability modulates visual search efficiency.

    Science.gov (United States)

    Cort, Bryan; Anderson, Britt

    2013-01-01

    We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability-the likelihood of a particular color given a particular combination of two cues-varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  13. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  14. Effect of conditioning methods on the microtensile bond strength of phosphate monomer-based cement on zirconia ceramic in dry and aged conditions.

    Science.gov (United States)

    Amaral, Regina; Ozcan, Mutlu; Valandro, Luiz Felipe; Balducci, Ivan; Bottino, Marco Antonio

    2008-04-01

    The objective of this study was to evaluate the durability of the bond strength between a resin cement and an aluminous ceramic submitted to various surface conditioning methods. Twenty-four blocks (5 x 5 x 4 mm(3)) of a glass-infiltrated zirconia-alumina ceramic (In-Ceram Zirconia Classic) were randomly divided into three surface treatment groups: ST1-air-abrasion with 110-microm Al2O3 particles + silanization; ST2-laboratory tribochemical silica coating method (110-microm Al2O3, 110-microm silica) (Rocatec) + silanization; ST3-chairside tribochemical silica coating method (30-microm SiO(x)) (CoJet) + silanization. Each treated ceramic block was placed in its silicone mold with the treated surface exposed. The resin cement (Panavia F) was prepared and injected into the mold over the treated surface. Specimens were sectioned to achieve nontrimmed bar specimens (14 per block) that were randomly divided into two conditions: (a) dry-microtensile test after sectioning; (b) thermocycling (TC) (6,000x, 5-55 degrees C) and water storage (150 days). Thus, six experimental groups were obtained (n = 50): Gr1-ST1 + dry; Gr2-ST1 + TC; Gr3-ST2 + dry; Gr4-ST2 + TC; Gr5-ST3 + dry; Gr6-ST3 + TC. After microtensile testing, the failure types were noted. ST2 (25.1 +/- 11) and ST3 (24.1 +/- 7.4) presented statistically higher bond strength (MPa) than ST1 (17.5 +/- 8) regardless of aging conditions. Tribochemical silica coating + silanization showed durable bond strength; after aging, air-abrasion with 110-microm Al2O3 + silanization showed the largest decrease, indicating that aging is fundamental in bond strength testing of acid-resistant zirconia ceramics in order to estimate their long-term performance in the mouth.

  15. Inferring Beliefs as Subjectively Imprecise Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.;

    2012-01-01

    We propose a method for estimating subjective beliefs, viewed as a subjective probability distribution. The key insight is to characterize beliefs as a parameter to be estimated from observed choices in a well-defined experimental task and to estimate that parameter as a random coefficient. The estimates show that subjective probabilities are indeed best characterized as probability distributions with non-zero variance.

  16. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd;

    Under standard scoring rules, risk-averse agents need not report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this, one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk-averse agents have...

  17. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  18. The trajectory of the target probability effect.

    Science.gov (United States)

    Hon, Nicholas; Yap, Melvin J; Jabar, Syaheed B

    2013-05-01

    The effect of target probability on detection times is well-established: Even when detection accuracy is high, lower probability targets are detected more slowly than higher probability ones. Although this target probability effect on detection times has been well-studied, one aspect of it has remained largely unexamined: How the effect develops over the span of an experiment. Here, we investigated this issue with two detection experiments that assessed different target probability ratios. Conventional block segment analysis and linear mixed-effects modeling converged on two key findings. First, we found that the magnitude of the target probability effect increases as one progresses through a block of trials. Second, we found, by examining the trajectories of the low- and high-probability targets, that this increase in effect magnitude was driven by the low-probability targets. Specifically, we found that low-probability targets were detected more slowly as a block of trials progressed. Performance to high-probability targets, on the other hand, was largely invariant across the block. The latter finding is of particular interest because it cannot be reconciled with accounts that propose that the target probability effect is driven by the high-probability targets.

  19. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  20. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axiomatics.

  1. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  2. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs. We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.

  3. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...

  4. Influence of dose calculation algorithms on the predicted dose distribution and NTCP values for NSCLC patients

    DEFF Research Database (Denmark)

    Nielsen, Tine B; Wieslander, Elinore; Fogliata, Antonella;

    2011-01-01

    To investigate differences in calculated doses and normal tissue complication probability (NTCP) values between different dose algorithms.

  5. Helicity probabilities for heavy quark fragmentation into excited mesons

    CERN Document Server

    Yuan, T C

    1995-01-01

    In the fragmentation of a heavy quark into a heavy meson whose light degrees of freedom have angular momentum 3/2, all the helicity probabilities are completely determined in the heavy quark limit up to a single probability w_{3/2}. We point out that this probability depends on the longitudinal momentum fraction z of the meson and on its transverse momentum p_\bot relative to the jet axis. We calculate w_{3/2} as a function of scaling variables corresponding to z and p_\bot for the heavy quark limit of the perturbative QCD fragmentation functions for b quark to fragment into (b \bar c) mesons. In this model, the light degrees of freedom prefer to have their angular momentum aligned transverse to, rather than along, the jet axis. Implications for the production of excited heavy mesons, like D^{**} and B^{**}, are discussed.

  6. PROBABILITY DISTRIBUTION FUNCTION OF NEAR-WALL TURBULENT VELOCITY FLUCTUATIONS

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    By large eddy simulation (LES), turbulent databases of channel flows at different Reynolds numbers were established. Then, the probability distribution functions of the streamwise and wall-normal velocity fluctuations were obtained and compared with the corresponding normal distributions. By hypothesis testing, the deviation from the normal distribution was analyzed quantitatively. The skewness and flatness factors were also calculated, and the variations of these two factors in the viscous sublayer, buffer layer and log-law layer were discussed. Also illustrated were the relations between the probability distribution functions and the burst events (sweep of high-speed fluids and ejection of low-speed fluids) in the viscous sublayer, buffer layer and log-law layer. Finally, the variations of the probability distribution functions with Reynolds number were examined.
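
    The skewness and flatness factors used above are simple sample moments normalized by the RMS fluctuation (a Gaussian gives S = 0 and F = 3). A generic sketch:

        import numpy as np

        def skewness_flatness(u):
            """Skewness and flatness factors of a velocity-fluctuation
            sample; departures from (0, 3) measure non-Gaussianity."""
            u = np.asarray(u, dtype=float)
            up = u - u.mean()                       # fluctuating part
            rms = np.sqrt((up ** 2).mean())
            return (up ** 3).mean() / rms ** 3, (up ** 4).mean() / rms ** 4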

  7. On the computability of conditional probability

    CERN Document Server

    Ackerman, Nathanael L; Roy, Daniel M

    2010-01-01

    We study the problem of computing conditional probabilities, a fundamental operation in statistics and machine learning. In the elementary discrete setting, a ratio of probabilities defines conditional probability. In the abstract setting, conditional probability is defined axiomatically and the search for more constructive definitions is the subject of a rich literature in probability theory and statistics. In the discrete or dominated setting, under suitable computability hypotheses, conditional probabilities are computable. However, we show that in general one cannot compute conditional probabilities. We do this by constructing a pair of computable random variables in the unit interval whose conditional distribution encodes the halting problem at almost every point. We show that this result is tight, in the sense that given an oracle for the halting problem, one can compute this conditional distribution. On the other hand, we show that conditioning in abstract settings is computable in the presence of cert...

  8. Integrated statistical modelling of spatial landslide probability

    Science.gov (United States)

    Mergili, M.; Chu, H.-J.

    2015-09-01

    Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) the landslide inventory is subset into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e. the spatial probability that at least one landslide pixel occurs within a zone of defined size. We quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km2 study area in southern Taiwan, using an inventory of 1399 landslides triggered by Typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability; (iv) removing the largest landslides from the analysis leads to an enhanced model performance.
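
    Step (v) reduces to a pixel-wise combination that is easy to state in code; the array names are placeholders, not those of the authors' implementation:

        import numpy as np

        def integrated_probability(p_release, p_impact, p_zonal):
            """Integrated spatial landslide probability per pixel: the
            maximum of the release probability and the product of the
            impact probability and the zonal release probability."""
            return np.maximum(p_release, p_impact * p_zonal)

        p_int = integrated_probability(np.array([0.02, 0.10]),
                                       np.array([0.30, 0.05]),
                                       np.array([0.40, 0.60]))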

  9. Time Dependence of Collision Probabilities During Satellite Conjunctions

    Science.gov (United States)

    Hall, Doyle T.; Hejduk, Matthew D.; Johnson, Lauren C.

    2017-01-01

    The NASA Conjunction Assessment Risk Analysis (CARA) team has recently implemented updated software to calculate the probability of collision (P (sub c)) for Earth-orbiting satellites. The algorithm can employ complex dynamical models for orbital motion, and account for the effects of non-linear trajectories as well as both position and velocity uncertainties. This “3D P (sub c)” method entails computing a 3-dimensional numerical integral for each estimated probability. Our analysis indicates that the 3D method provides several new insights over the traditional “2D P (sub c)” method, even when approximating the orbital motion using the relatively simple Keplerian two-body dynamical model. First, the formulation provides the means to estimate variations in the time derivative of the collision probability, or the probability rate, R (sub c). For close-proximity satellites, such as those orbiting in formations or clusters, R (sub c) variations can show multiple peaks that repeat or blend with one another, providing insight into the ongoing temporal distribution of risk. For single, isolated conjunctions, R (sub c) analysis provides the means to identify and bound the times of peak collision risk. Additionally, analysis of multiple actual archived conjunctions demonstrates that the commonly used “2D P (sub c)” approximation can occasionally provide inaccurate estimates. These include cases in which the 2D method yields negligibly small probabilities (e.g., P (sub c) less than 10 (sup -10)), but the 3D estimates are sufficiently large to prompt increased monitoring or collision mitigation (e.g., P (sub c) greater than or equal to 10 (sup -5)). Finally, the archive analysis indicates that a relatively efficient calculation can be used to identify which conjunctions will have negligibly small probabilities. This small-P (sub c) screening test can significantly speed the overall risk analysis computation for large numbers of conjunctions.
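
    For contrast with the 3D method, the traditional 2D P (sub c) is a Gaussian integral over the hard-body circle in the encounter plane. The crude midpoint-quadrature sketch below assumes an uncorrelated covariance and is illustrative only, not CARA's software:

        import numpy as np

        def pc_2d(miss, sigma_x, sigma_y, hbr, n=400):
            """2D collision probability: integrate the Gaussian relative-
            position density over the circle of radius `hbr` (hard-body
            radius), offset from the mean by the nominal miss vector."""
            xs = np.linspace(-hbr, hbr, n)
            dx = xs[1] - xs[0]
            total = 0.0
            for x in xs:
                half = np.sqrt(hbr**2 - x**2)       # circle half-height at x
                ys = np.linspace(-half, half, n)
                dy = 2 * half / n
                gx = np.exp(-0.5 * ((x - miss[0]) / sigma_x) ** 2)
                gy = np.exp(-0.5 * ((ys - miss[1]) / sigma_y) ** 2)
                total += gx * gy.sum() * dx * dy
            return total / (2 * np.pi * sigma_x * sigma_y)

        p = pc_2d(miss=(120.0, 40.0), sigma_x=200.0, sigma_y=80.0, hbr=20.0)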

  10. Measuring Robustness of Timetables at Stations using a Probability Distribution

    DEFF Research Database (Denmark)

    Jensen, Lars Wittrup; Landex, Alex

    Stations are often the limiting capacity factor in a railway network. This induces interdependencies, especially at at-grade junctions, causing network effects. This paper presents three traditional methods that can be used to measure the complexity of a station, indicating the robustness...... infrastructure layouts given a timetable. These two methods provide different precision at the expense of a more complex calculation process. The advanced and more precise method is based on a probability distribution that can describe the expected delay between two trains as a function of the buffer time....... This paper proposes to use the exponential distribution, only taking non-negative delays into account, but any probability distribution can be used. Furthermore, the paper proposes that the calculation parameters are estimated from existing delay data, at a station, to achieve a higher precision. As delay...
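
    As a small worked example of the exponential assumption proposed above (the mean primary delay and buffer times are invented; the closed form exp(-lam*b)/lam for the expected knock-on delay is the standard result for exponentially distributed non-negative delays, not a formula quoted from the paper):

```python
import math

lam = 1 / 3.0  # rate parameter: mean primary delay of 3 minutes
for b in (0.0, 1.0, 2.0, 5.0):  # buffer times between two trains, in minutes
    # E[max(X - b, 0)] for X ~ Exp(lam) is exp(-lam*b)/lam.
    expected_knock_on = math.exp(-lam * b) / lam
    print(f"buffer {b:4.1f} min -> expected knock-on delay {expected_knock_on:.2f} min")
```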

  11. Archimedes' calculations of square roots

    CERN Document Server

    Davies, E B

    2011-01-01

    We reconsider Archimedes' evaluations of several square roots in 'Measurement of a Circle'. We show that several methods proposed over the last century or so for his evaluations fail one or more criteria of plausibility. We also provide internal evidence that he probably used an interpolation technique. The conclusions are relevant to the precise calculations by which he obtained upper and lower bounds on pi.

  12. Calculating Speed of Sound

    Science.gov (United States)

    Bhatnagar, Shalabh

    2017-01-01

    Sound is an emerging source of renewable energy, but it has some limitations. The main limitation is that the amount of energy that can be extracted from sound is very small, and that is because of the velocity of sound. The velocity of sound changes with the medium. If we could increase the velocity of sound in a medium, we would probably be able to extract more energy from sound and transfer it at a higher rate. To increase the velocity of sound we must first know the speed of sound. In classical mechanics, speed is the distance travelled by a particle divided by time, whereas velocity is the displacement of the particle divided by time. The speed of sound in dry air at 20 °C (68 °F) is considered to be 343.2 meters per second, and it would be more accurate to call 343.2 meters per second the velocity of sound rather than the speed, as it reflects the displacement of the sound, not the total distance the sound wave covered. Sound travels in the form of a mechanical wave, so when calculating the speed of sound the whole path of the wave should be considered, not just the distance travelled by the sound. In this paper I focus on calculating the actual speed of the sound wave, which can help us extract more energy and make sound travel at a higher velocity.
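
    For reference, the textbook ideal-gas calculation behind the 343.2 m/s figure quoted above (this is the standard formula, not the alternative computation the author proposes):

```python
import math

gamma = 1.4    # adiabatic index of dry air
R = 8.314      # universal gas constant, J/(mol*K)
T = 293.15     # 20 degrees C in kelvin
M = 0.028964   # molar mass of dry air, kg/mol

# Ideal-gas speed of sound: c = sqrt(gamma * R * T / M)
c = math.sqrt(gamma * R * T / M)
print(f"speed of sound ~ {c:.1f} m/s")  # ~343.2 m/s
```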

  13. Collision strengths and transition probabilities for Co III forbidden lines

    Science.gov (United States)

    Storey, P. J.; Sochi, Taha

    2016-07-01

    In this paper we compute the collision strengths and their thermally averaged Maxwellian values for electron transitions between the 15 lowest levels of doubly ionized cobalt, Co2+, which give rise to forbidden emission lines in the visible and infrared regions of the spectrum. The calculations also include transition probabilities and predicted relative line emissivities. The data are particularly useful for analysing the thermodynamic conditions of supernova ejecta.

  14. Collision strengths and transition probabilities for Co III forbidden lines

    CERN Document Server

    Storey, P J

    2016-01-01

    In this paper we compute the collision strengths and their thermally-averaged Maxwellian values for electron transitions between the fifteen lowest levels of doubly-ionised cobalt, Co^{2+}, which give rise to forbidden emission lines in the visible and infrared regions of the spectrum. The calculations also include transition probabilities and predicted relative line emissivities. The data are particularly useful for analysing the thermodynamic conditions of supernova ejecta.

  15. Bell Could Become the Copernicus of Probability

    Science.gov (United States)

    Khrennikov, Andrei

    2016-07-01

    Our aim is to emphasize the role of mathematical models in physics, especially models of geometry and probability. We briefly compare developments of geometry and probability by pointing to similarities and differences: from Euclid to Lobachevsky and from Kolmogorov to Bell. In probability, Bell could play the same role as Lobachevsky in geometry. In fact, violation of Bell’s inequality can be treated as implying the impossibility to apply the classical probability model of Kolmogorov (1933) to quantum phenomena. Thus the quantum probabilistic model (based on Born’s rule) can be considered as the concrete example of the non-Kolmogorovian model of probability, similarly to the Lobachevskian model — the first example of the non-Euclidean model of geometry. This is the “probability model” interpretation of the violation of Bell’s inequality. We also criticize the standard interpretation—an attempt to add to rigorous mathematical probability models additional elements such as (non)locality and (un)realism. Finally, we compare embeddings of non-Euclidean geometries into the Euclidean space with embeddings of the non-Kolmogorovian probabilities (in particular, quantum probability) into the Kolmogorov probability space. As an example, we consider the CHSH-test.

  16. Probability of exclusion in paternity testing: time to reassess.

    Science.gov (United States)

    Cifuentes, Lucía O; Martínez, Eliseo H; Acuña, Mónica P; Jonquera, Hugo G

    2006-03-01

    The average exclusion probability is a measure of efficiency in paternity testing; it refers to the a priori ability of a battery of tests to detect paternity inconsistencies. This parameter measures the capacity of the system to detect a false accusation of paternity. Traditionally, this average exclusion probability has been estimated as the probability of excluding a man who is not the father by an inconsistency in at least one of the studied loci. We suggest that this criterion should be corrected, as currently the presumed father is excluded when at least three genetic inconsistencies are found with the child being tested, not just one. This change of criterion has occurred because of the use of microsatellite loci, whose mutation rates are much greater than those of the coding genes used previously in paternity studies. We propose the use of the average probability of exclusion for at least three loci (not only one), as an honest measure of the combined probability of exclusion of several loci, and we propose an algebraic expression to calculate it.
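
    A sketch of the proposed measure under the simplifying assumption of independent loci with invented per-locus exclusion probabilities; the probability of exclusion at at least k loci then follows a Poisson-binomial distribution, computable by dynamic programming:

```python
import numpy as np

def prob_at_least_k(p, k):
    """P(at least k successes) for independent Bernoulli trials with
    success probabilities p, via dynamic programming over the count."""
    dist = np.zeros(len(p) + 1)
    dist[0] = 1.0
    for pi in p:
        # new[j] = old[j]*(1-pi) + old[j-1]*pi for j >= 1
        dist[1:] = dist[1:] * (1 - pi) + dist[:-1] * pi
        dist[0] *= (1 - pi)
    return dist[k:].sum()

per_locus_pe = [0.55] * 15  # invented exclusion probability at each of 15 STR loci
print(prob_at_least_k(per_locus_pe, 1))  # classic "at least one locus" criterion
print(prob_at_least_k(per_locus_pe, 3))  # proposed "at least three loci" criterion
```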

  17. On the probability of cure for heavy-ion radiotherapy.

    Science.gov (United States)

    Hanin, Leonid; Zaider, Marco

    2014-07-21

    The probability of a cure in radiation therapy (RT), viewed as the probability of eventual extinction of all cancer cells, is unobservable, and the only way to compute it is through modeling the dynamics of the cancer cell population during and post-treatment. The conundrum at the heart of biophysical models aimed at such prospective calculations is the absence of information on the initial size of the subpopulation of clonogenic cancer cells (also called stem-like cancer cells), which largely determines the outcome of RT, in both individual and population settings. Other relevant parameters (e.g. potential doubling time, cell loss factor and survival probability as a function of dose) are, at least in principle, amenable to empirical determination. In this article we demonstrate that, for heavy-ion RT, microdosimetric considerations (justifiably ignored in conventional RT) combined with an expression for the clone extinction probability obtained from a mechanistic model of radiation cell survival lead to useful upper bounds on the size of the pre-treatment population of clonogenic cancer cells as well as upper and lower bounds on the cure probability. The main practical impact of these limiting values is the ability to make predictions about the probability of a cure for a given population of patients treated with newer, still unexplored treatment modalities from the empirically determined probability of a cure for the same or a similar population resulting from conventional low linear energy transfer (typically photon/electron) RT. We also propose that the current trend to deliver a lower total dose in a smaller number of fractions with larger-than-conventional doses per fraction has physical limits that must be understood before embarking on a particular treatment schedule.
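
    For orientation, a generic Poissonian tumor-control sketch of the "extinction of all clonogens" idea (this is the common textbook model with invented parameters, not the authors' mechanistic microdosimetric bounds):

```python
import math

N = 1e7                   # pre-treatment clonogen number (unknown in practice)
alpha, beta = 0.3, 0.03   # linear-quadratic parameters, Gy^-1 and Gy^-2
d, n_fx = 2.0, 30         # dose per fraction (Gy) and number of fractions

# Linear-quadratic survival after n_fx fractions of dose d.
surviving_fraction = math.exp(-n_fx * (alpha * d + beta * d * d))
# Poisson model: cure iff zero clonogens survive.
tcp = math.exp(-N * surviving_fraction)
print(f"surviving fraction {surviving_fraction:.2e}, cure probability {tcp:.3f}")
```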

  18. Calculation of Hunan Power Grid Icing Recurrence Interval Based on Extreme-value Type I Probability Distribution Model

    Institute of Scientific and Technical Information of China (English)

    陆佳政; 张红先; 彭继文; 方针; 李波

    2012-01-01

    Analyzing the icing recurrence interval of a power grid is of great significance for understanding the laws governing ice disasters and thereby guiding ice-resistance work. We therefore propose a calculation method for the icing recurrence interval of a power grid based on the extreme-value type I distribution and, using observations of icing days from 97 weather stations for 1951-2008, calculate the number of icing days for recurrence periods of 15, 30, 50 and 100 years at each station. The results show that the number of icing days for a recurrence period of 100 years at the Changsha Mapoling weather station is 14.78. Under the criterion that especially severe icing corresponds to more than 11 icing days, the recurrence period of especially severe icing for the Hunan power grid is 24.8 years in the extreme-value type I model. Based on the calculated results, an icing distribution map of Hunan for multi-year recurrence periods is plotted, showing that the areas of severe icing are concentrated in the southwest and southeast of Hunan. The icing distribution map provides guidance for future anti-icing design of the power grid.
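
    A sketch of the extreme-value type I (Gumbel) return-level calculation described above, fitting location and scale by the method of moments to a synthetic series of annual maximum icing days (the data are invented, not the 97-station observations):

```python
import numpy as np

# Synthetic annual maxima of icing days (illustrative values only).
annual_max_icing_days = np.array([3, 5, 2, 8, 4, 6, 1, 9, 5, 7,
                                  2, 4, 6, 3, 10, 5, 4, 7, 3, 6], float)

mean = annual_max_icing_days.mean()
std = annual_max_icing_days.std(ddof=1)
beta = std * np.sqrt(6) / np.pi   # Gumbel scale parameter (method of moments)
mu = mean - 0.5772 * beta         # Gumbel location parameter (Euler-Mascheroni constant)

# T-year return level: the value exceeded with annual probability 1/T.
for T in (15, 30, 50, 100):
    x_T = mu - beta * np.log(-np.log(1 - 1 / T))
    print(f"{T:3d}-year return level: {x_T:.2f} icing days")
```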

  19. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. Highlights: a good and solid introduction to probability theory and stochastic processes; logically organized, with the writing presented in a clear manner; a comprehensive choice of topics within the area of probability; ample homework problems organized into chapter sections.

  20. UT Biomedical Informatics Lab (BMIL) probability wheel

    Directory of Open Access Journals (Sweden)

    Sheng-Cheng Huang

    2016-01-01

    Full Text Available A probability wheel app is intended to facilitate communication between two people, an “investigator” and a “participant”, about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.

  1. Towards a Categorical Account of Conditional Probability

    Directory of Open Access Journals (Sweden)

    Robert Furber

    2015-11-01

    Full Text Available This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and also there are extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.

  2. Total variation denoising of probability measures using iterated function systems with probabilities

    Science.gov (United States)

    La Torre, Davide; Mendivil, Franklin; Vrscay, Edward R.

    2017-01-01

    In this paper we present a total variation denoising problem for probability measures using the set of fixed-point probability measures of iterated function systems with probabilities (IFSP). By means of the Collage Theorem for contraction mappings, we provide an upper bound for this problem that can be solved by determining a set of probabilities.

  3. Bayesian Probabilities and the Histories Algebra

    OpenAIRE

    Marlow, Thomas

    2006-01-01

    We attempt a justification of a generalisation of the consistent histories programme using a notion of probability that is valid for all complete sets of history propositions. This consists of introducing Cox's axioms of probability theory and showing that our candidate notion of probability obeys them. We also give a generalisation of Bayes' theorem and comment upon how Bayesianism should be useful for the quantum gravity/cosmology programmes.

  4. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high-probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high-probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high-probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  5. Non-Boolean probabilities and quantum measurement

    Energy Technology Data Exchange (ETDEWEB)

    Niestegge, Gerd

    2001-08-03

    A non-Boolean extension of the classical probability model is proposed. The non-Boolean probabilities reproduce typical quantum phenomena. The proposed model is more general and more abstract, but easier to interpret, than the quantum mechanical Hilbert space formalism and exhibits a particular phenomenon (state-independent conditional probabilities) which may provide new opportunities for an understanding of the quantum measurement process. Examples of the proposed model are provided, using Jordan operator algebras. (author)

  6. Data analysis recipes: Probability calculus for inference

    CERN Document Server

    Hogg, David W

    2012-01-01

    In this pedagogical text aimed at those wanting to start thinking about or brush up on probabilistic inference, I review the rules by which probability distribution functions can (and cannot) be combined. I connect these rules to the operations performed in probabilistic data analysis. Dimensional analysis is emphasized as a valuable tool for helping to construct non-wrong probabilistic statements. The applications of probability calculus in constructing likelihoods, marginalized likelihoods, posterior probabilities, and posterior predictions are all discussed.

  7. Probabilities are single-case, or nothing

    CERN Document Server

    Appleby, D M

    2004-01-01

    Physicists have, hitherto, mostly adopted a frequentist conception of probability, according to which probability statements apply only to ensembles. It is argued that we should, instead, adopt an epistemic, or Bayesian conception, in which probabilities are conceived as logical constructs rather than physical realities, and in which probability statements do apply directly to individual events. The question is closely related to the disagreement between the orthodox school of statistical thought and the Bayesian school. It has important technical implications (it makes a difference, what statistical methodology one adopts). It may also have important implications for the interpretation of the quantum state.

  8. Real analysis and probability solutions to problems

    CERN Document Server

    Ash, Robert P

    1972-01-01

    Real Analysis and Probability: Solutions to Problems presents solutions to problems in real analysis and probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability; the interplay between measure theory and topology; conditional probability and expectation; the central limit theorem; and strong laws of large numbers in terms of martingale theory.Comprised of eight chapters, this volume begins with problems and solutions for the theory of measure and integration, followed by various applications of the basic integration theory.

  9. Some New Results on Transition Probability

    Institute of Scientific and Technical Information of China (English)

    Yu Quan XIE

    2008-01-01

    In this paper, we study the basic properties of the stationary transition probability of Markov processes on a general measurable space (E, ε), such as continuity, maximum probability, zero points and the standardization of positive-probability sets, and we obtain a series of important results, such as the Continuity Theorem, the Representation Theorem and the Lévy Theorem. These results are very useful for studying the stationary tri-point transition probability on a general measurable space (E, ε). Our main tools, such as Egoroff's Theorem, Vitali-Hahn-Saks's Theorem and the theory of atomic sets and the well-posedness of measures, are also very interesting and fashionable.

  10. Study on the Winning Probability for a Bid in Procurement Combinational Auction with Tree Structure

    Institute of Scientific and Technical Information of China (English)

    CHEN Jian; HUANG He

    2004-01-01

    In this paper, the processes to determine the winning probability for the corresponding bidder's deterministic bid are presented. The analysis of the winning probability is crucial for studying bidding equilibria and designing the mechanism of procurement combinational auctions (CAs), and it also provides decision-making support for bidders operating in an environment of commercial synergies. Finally, an example is used to illustrate the feasibility and the detailed processes of calculating the winning probability.

  11. Probability of graphs with large spectral gap by multicanonical Monte Carlo

    Science.gov (United States)

    Saito, Nen; Iba, Yukito

    2011-01-01

    Graphs with large spectral gap are important in various fields such as biology, sociology and computer science. In designing such graphs, an important question is how the probability of graphs with large spectral gap behaves. A method based on multicanonical Monte Carlo is introduced to quantify the behavior of this probability, which enables us to calculate extreme tails of the distribution. The proposed method is successfully applied to random 3-regular graphs, and the large-deviation probability is estimated.
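
    For context, a naive direct-sampling estimate of the quantity studied above, the spectral gap of random 3-regular graphs (direct sampling cannot reach the extreme tails of the distribution; that is precisely what the multicanonical method is for; networkx is assumed available):

```python
import networkx as nx
import numpy as np

rng = np.random.default_rng(1)
gaps = []
for _ in range(200):
    # Sample a random 3-regular graph on 50 nodes.
    g = nx.random_regular_graph(3, 50, seed=int(rng.integers(1 << 30)))
    eig = np.sort(np.linalg.eigvalsh(nx.adjacency_matrix(g).toarray()))[::-1]
    gaps.append(eig[0] - eig[1])  # spectral gap of the adjacency matrix

gaps = np.array(gaps)
print(f"mean gap {gaps.mean():.3f}, P(gap > 0.3) ~ {np.mean(gaps > 0.3):.3f}")
```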

  12. When Index Term Probability Violates the Classical Probability Axioms Quantum Probability can be a Necessary Theory for Information Retrieval

    CERN Document Server

    Melucci, Massimo

    2012-01-01

    Probabilistic models require the notion of event space for defining a probability measure. An event space has a probability measure which satisfies the Kolmogorov axioms. However, the probabilities observed from distinct sources, such as those of relevance of documents, may not admit a single event space, thus causing some issues. In this article, some results are introduced for checking whether the observed probabilities of relevance of documents admit a single event space. Moreover, an alternative framework of probability is introduced, thus challenging the use of classical probability for ranking documents. Some reflections are made on the convenience of extending classical probabilistic retrieval toward a more general framework which encompasses these issues.

  13. Probability theory and statistical applications a profound treatise for self-study

    CERN Document Server

    Zörnig, Peter

    2016-01-01

    This accessible and easy-to-read book provides many examples to illustrate diverse topics in probability and statistics, from initial concepts up to advanced calculations. Special attention is devoted, e.g., to the independence of events, inequalities in probability and functions of random variables. The book is directed to students of mathematics, statistics, engineering, and other quantitative sciences.

  14. Inclusive spectra of hadrons created by color tube fission; 1, Probability of tube fission

    CERN Document Server

    Gedalin, E V

    1997-01-01

    The probability of color tube fission, including corrections for small oscillations of the tube surface, is obtained to pre-exponential-factor accuracy on the basis of the previously constructed color tube model. Using these expressions, the probability of tube fission at $n$ points is obtained, which is the basis for calculating the inclusive spectra of the produced hadrons.

  15. Quantum Theory and Probability Theory: Their Relationship and Origin in Symmetry

    Directory of Open Access Journals (Sweden)

    Philip Goyal

    2011-04-01

    Full Text Available Quantum theory is a probabilistic calculus that enables the calculation of the probabilities of the possible outcomes of a measurement performed on a physical system. But what is the relationship between this probabilistic calculus and probability theory itself? Is quantum theory compatible with probability theory? If so, does it extend or generalize probability theory? In this paper, we answer these questions, and precisely determine the relationship between quantum theory and probability theory, by explicitly deriving both theories from first principles. In both cases, the derivation depends upon identifying and harnessing the appropriate symmetries that are operative in each domain. We prove, for example, that quantum theory is compatible with probability theory by explicitly deriving quantum theory on the assumption that probability theory is generally valid.

  16. Analytical Study of Thermonuclear Reaction Probability Integrals

    CERN Document Server

    Chaudhry, M A; Mathai, A M

    2000-01-01

    An analytic study of the reaction probability integrals corresponding to the various forms of the slowly varying cross-section factor $S(E)$ is attempted. Exact expressions for the reaction probability integrals are obtained in terms of extended gamma functions.

  17. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more...... these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons....

  18. Stimulus Probability Effects in Absolute Identification

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  19. Recent Developments in Applied Probability and Statistics

    CERN Document Server

    Devroye, Luc; Kohler, Michael; Korn, Ralf

    2010-01-01

    This book presents surveys on recent developments in applied probability and statistics. The contributions include topics such as nonparametric regression and density estimation, option pricing, probabilistic methods for multivariate interpolation, robust graphical modelling and stochastic differential equations. Due to its broad coverage of different topics the book offers an excellent overview of recent developments in applied probability and statistics.

  20. Probability: A Matter of Life and Death

    Science.gov (United States)

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…

  1. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed.The present notes outline a method for evaluation of the probability...

  2. Teaching Probability: A Socio-Constructivist Perspective

    Science.gov (United States)

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.

  3. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The presnt paper outlines a method for evaluation of the probability of ship...

  4. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  5. Transition probability spaces in loop quantum gravity

    CERN Document Server

    Guo, Xiao-Kan

    2016-01-01

    We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is achieved by first checking such structures in covariant quantum mechanics, and then passing to spin foam models via the general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the Hilbert space of the canonical theory and the relevant quantum logical structure. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize property transitions and causality in this categorical context in connection with presheaves on quantaloids and respectively causal categories. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.

  6. Multinomial mixture model with heterogeneous classification probabilities

    Science.gov (United States)

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial and correct classification probability estimates when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.

  7. Laboratory-Tutorial activities for teaching probability

    CERN Document Server

    Wittmann, M C; Morgan, J T; Feeley, Roger E.; Morgan, Jeffrey T.; Wittmann, Michael C.

    2006-01-01

    We report on the development of students' ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain "touchstone" examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler's fallacy, and use expectati...

  8. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling.

  9. An improved probability mapping approach to assess genome mosaicism

    Directory of Open Access Journals (Sweden)

    Gogarten J Peter

    2003-09-01

    Full Text Available Background: Maximum likelihood and posterior probability mapping are useful visualization techniques that are used to ascertain the mosaic nature of prokaryotic genomes. However, posterior probabilities, especially when calculated for four-taxon cases, tend to overestimate the support for tree topologies. Furthermore, because of poor taxon sampling, four-taxon analyses suffer from sensitivity to the long branch attraction artifact. Here we extend the probability mapping approach by improving taxon sampling of the analyzed datasets, and by using bootstrap support values, a more conservative tool to assess reliability. Results: Quartets of orthologous proteins were complemented with homologs from selected reference genomes. The mapping of bootstrap support values from these extended datasets gives results similar to the original maximum likelihood and posterior probability mapping. The more conservative nature of the plotted support values allows further analyses to focus on those protein families that strongly disagree with the majority or plurality of genes present in the analyzed genomes. Conclusion: Posterior probability is a non-conservative measure of support, and posterior probability mapping only provides a quick estimation of the phylogenetic information content of four genomes. This approach can be utilized as a pre-screen to select genes that might have been horizontally transferred. Better taxon sampling combined with subtree analyses prevents the inconsistencies associated with four-taxon analyses, but retains the power of visual representation. Nevertheless, a case-by-case inspection of individual multi-taxon phylogenies remains necessary to differentiate unrecognized paralogy and shared phylogenetic reconstruction artifacts from horizontal gene transfer events.

  10. Survival probability in patients with liver trauma.

    Science.gov (United States)

    Buci, Skender; Kukeli, Agim

    2016-08-01

    Purpose - The purpose of this paper is to assess the survival probability among patients with liver trauma injury using the anatomical and psychological scores of conditions, characteristics and treatment modes. Design/methodology/approach - A logistic model is used to estimate 173 patients' survival probability. Data are taken from patient records. Only emergency room patients admitted to University Hospital of Trauma (former Military Hospital) in Tirana are included. Data are recorded anonymously, preserving the patients' privacy. Findings - When correctly predicted, the logistic models show that survival probability varies from 70.5 percent up to 95.4 percent. The degree of trauma injury, trauma with liver and other organs, total days the patient was hospitalized, and treatment method (conservative vs intervention) are statistically important in explaining survival probability. Practical implications - The study gives patients, their relatives and physicians ample and sound information they can use to predict survival chances, the best treatment and resource management. Originality/value - This study, which has not been done previously, explores survival probability, success probability for conservative and non-conservative treatment, and success probability for single vs multiple injuries from liver trauma.

  11. Computation of Probabilities in Causal Models of History of Science

    Directory of Open Access Journals (Sweden)

    Osvaldo Pessoa Jr.

    2006-12-01

    Full Text Available : The aim of this paper is to investigate the ascription of probabilities in a causal model of an episode in the history of science. The aim of such a quantitative approach is to allow the implementation of the causal model in a computer, to run simulations. As an example, we look at the beginning of the science of magnetism, “explaining” — in a probabilistic way, in terms of a single causal model — why the field advanced in China but not in Europe (the difference is due to different prior probabilities of certain cultural manifestations. Given the number of years between the occurrences of two causally connected advances X and Y, one proposes a criterion for stipulating the value pY=X of the conditional probability of an advance Y occurring, given X. Next, one must assume a specific form for the cumulative probability function pY=X(t, which we take to be the time integral of an exponential distribution function, as is done in physics of radioactive decay. Rules for calculating the cumulative functions for more than two events are mentioned, involving composition, disjunction and conjunction of causes. We also consider the problems involved in supposing that the appearance of events in time follows an exponential distribution, which are a consequence of the fact that a composition of causes does not follow an exponential distribution, but a “hypoexponential” one. We suggest that a gamma distribution function might more adequately represent the appearance of advances.

  12. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more...... of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate...... these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons....

  13. Basic Probability Theory for Biomedical Engineers

    CERN Document Server

    Enderle, John

    2006-01-01

    This is the first in a series of short books on probability theory and random processes for biomedical engineers. This text is written as an introduction to probability theory. The goal was to prepare students, engineers and scientists at all levels of background and experience for the application of this theory to a wide variety of problems, as well as to pursue these topics at a more advanced level. The approach is to present a unified treatment of the subject. There are only a few key concepts involved in the basic theory of probability. These key concepts are all presented in the first

  14. Advanced Probability Theory for Biomedical Engineers

    CERN Document Server

    Enderle, John

    2006-01-01

    This is the third in a series of short books on probability theory and random processes for biomedical engineers. This book focuses on standard probability distributions commonly encountered in biomedical engineering. The exponential, Poisson and Gaussian distributions are introduced, as well as important approximations to the Bernoulli PMF and Gaussian CDF. Many important properties of jointly Gaussian random variables are presented. The primary subjects of the final chapter are methods for determining the probability distribution of a function of a random variable. We first evaluate the prob

  15. Tomographic probability representation for quantum fermion fields

    CERN Document Server

    Andreev, V A; Man'ko, V I; Son, Nguyen Hung; Thanh, Nguyen Cong; Timofeev, Yu P; Zakharov, S D

    2009-01-01

    Tomographic probability representation is introduced for fermion fields. The states of the fermions are mapped onto probability distribution of discrete random variables (spin projections). The operators acting on the fermion states are described by fermionic tomographic symbols. The product of the operators acting on the fermion states is mapped onto star-product of the fermionic symbols. The kernel of the star-product is obtained. The antisymmetry of the fermion states is formulated as the specific symmetry property of the tomographic joint probability distribution associated with the states.

  16. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  17. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  18. Are All Probabilities Fundamentally Quantum Mechanical?

    CERN Document Server

    Pradhan, Rajat Kumar

    2011-01-01

    The subjective and the objective aspects of probabilities are incorporated in a simple duality axiom inspired by observer participation in quantum theory. Transcending the classical notion of probabilities, it is proposed and demonstrated that all probabilities may be fundamentally quantum mechanical in the sense that they may all be derived from the corresponding amplitudes. The classical coin-toss and the quantum double-slit interference experiments are discussed as illustrative prototype examples. The absence of multi-order quantum interference effects in multiple-slit experiments and the experimental tests of complementarity in Wheeler's delayed-choice type experiments are explained using the involvement of the observer.

  19. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

    ""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  20. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of {\\em probabilistic} OLAP queries that operate on aggregate values that are probability distributions...... and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate...... multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions)....

  1. Computer routines for probability distributions, random numbers, and related functions

    Science.gov (United States)

    Kirby, W.H.

    1980-01-01

    Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor's F tests. Other mathematical functions include the Bessel function I0, gamma and log-gamma functions, error functions, and the exponential integral. Auxiliary services include sorting and printer plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)
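
    The report describes Fortran subroutines; as a brief modern analogue (not the report's code), the same building blocks are available in Python's scipy.stats:

```python
from scipy import stats

print(stats.norm.cdf(1.96))                    # Gaussian (normal) CDF
print(stats.gamma(a=2.0).ppf(0.99))            # gamma quantile
print(stats.weibull_min(c=1.5).mean())         # Weibull mean
print(stats.t.sf(2.1, df=12))                  # Student's t upper tail
print(stats.norm.rvs(size=5, random_state=0))  # normal random numbers
```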

  2. On the Low False Positive Probabilities of Kepler Planet Candidates

    CERN Document Server

    Morton, Timothy D

    2011-01-01

    We present a framework to conservatively estimate the probability that any particular planet-like transit signal observed by the Kepler mission is in fact a planet, prior to any ground-based follow-up efforts. We use Monte Carlo methods based on stellar population synthesis and Galactic structure models, and we provide empirical analytic fits to our results that may be applied to the as-yet-unconfirmed Kepler candidates. We find that the false positive probability for candidates that pass preliminary Kepler vetting procedures is generally low, ranging from 20% down to less than 2%, assuming a continuous power law for the planet mass function with index alpha = -1.5. Since Kepler will detect many more planetary signals than can be positively confirmed with ground-based follow-up efforts in the near term, these calculations will be crucial to using the ensemble of Kepler data to determine population characteristics of planetary systems.

  3. Bridge pier failure probabilities under combined hazard effects of scour, truck and earthquake. Part I: occurrence probabilities

    Science.gov (United States)

    Liang, Zach; Lee, George C.

    2013-06-01

    In many regions of the world, a bridge will experience multiple extreme hazards during its expected service life. The current American Association of State Highway and Transportation Officials (AASHTO) load and resistance factor design (LRFD) specifications are formulated based on failure probabilities, which are fully calibrated for dead load and nonextreme live loads. Design against earthquake loads is established separately. Design against scour effect is also formulated separately by using the concept of capacity reduction (or increased scour depth). Furthermore, scour effect cannot be linked directly to an LRFD limit state equation, because the latter is formulated using force-based analysis. This paper (in two parts) presents a probability-based procedure to estimate the combined hazard effects on bridges due to truck, earthquake and scour, by treating the effect of scour as an equivalent load effect so that it can be included in reliability-based bridge failure calculations. In Part I of this series, the general principle of treating the scour depth as an equivalent load effect is presented. The individual and combined partial failure probabilities due to truck, earthquake and scour effects are described. To explain the method of including non-force-based natural hazards effects, two types of common scour failures are considered. In Part II, the corresponding bridge failure probability, the occurrence of scour as well as simultaneously having both truck load and equivalent scour load are quantitatively discussed.

  4. Bridge pier failure probabilities under combined hazard effects of scour, truck and earthquake. Part II: failure probabilities

    Science.gov (United States)

    Liang, Zach; Lee, George C.

    2013-06-01

    In many regions of the world, a bridge will experience multiple extreme hazards during its expected service life. The current American Association of State Highway and Transportation Officials (AASHTO) load and resistance factor design (LRFD) specifications are formulated based on failure probabilities, which are fully calibrated for dead load and non-extreme live loads. Design against earthquake load effect is established separately. Design against scour effect is also formulated separately by using the concept of capacity reduction (or increased scour depth). Furthermore, scour effect cannot be linked directly to an LRFD limit state equation because the latter is formulated using force-based analysis. This paper (in two parts) presents a probability-based procedure to estimate the combined hazard effects on bridges due to truck, earthquake and scour, by treating the effect of scour as an equivalent load effect so that it can be included in reliability-based failure calculations. In Part I of this series, the general principle for treating the scour depth as an equivalent load effect is presented. In Part II, the corresponding bridge failure probability, and the occurrence of scour as well as of simultaneously having both truck load and equivalent scour load effect, are quantitatively discussed. The key formulae of the conditional partial failure probabilities and the necessary conditions are established. In order to illustrate the methodology, an example of dead, truck, earthquake and scour effects on a simple bridge pile foundation is presented.
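
    A toy Monte Carlo version of the idea of treating scour as an equivalent load effect in a reliability calculation, estimating P(R < L_truck + L_scour) with invented distributions and parameters (not the formulae of the paper):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000
# Sample resistance and the two load effects (all distributions invented).
R = rng.lognormal(mean=np.log(100.0), sigma=0.10, size=n)  # resistance
L_truck = rng.gumbel(loc=40.0, scale=6.0, size=n)          # truck load effect
L_scour = rng.gamma(shape=2.0, scale=8.0, size=n)          # equivalent scour load effect

# Failure occurs when the combined load effect exceeds resistance.
pf = np.mean(R < L_truck + L_scour)
print(f"estimated failure probability ~ {pf:.2e}")
```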

  5. Inclusion probability with dropout: an operational formula.

    Science.gov (United States)

    Milot, E; Courteau, J; Crispino, F; Mailly, F

    2015-05-01

    In forensic genetics, a mixture of two or more contributors to a DNA profile is often interpreted using the inclusion probabilities theory. In this paper, we present a general formula for estimating the probability of inclusion (PI, also known as the RMNE probability) from a subset of visible alleles when dropouts are possible. This one-locus formula can easily be extended to multiple loci using the cumulative probability of inclusion. We show that an exact formulation requires fixing the number of contributors, hence to slightly modify the classic interpretation of the PI. We discuss the implications of our results for the enduring debate over the use of PI vs likelihood ratio approaches within the context of low template amplifications.
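
    For reference, the classic no-dropout version of the quantity discussed above: the per-locus probability of inclusion is the squared sum of the visible-allele frequencies, and loci combine multiplicatively into the cumulative PI (the paper's dropout correction is not reproduced here; the allele frequencies are invented):

```python
import numpy as np

# Frequencies of the alleles visible in the mixture at each locus
# (hypothetical illustrative values).
visible_allele_freqs = [
    [0.10, 0.22, 0.08],        # locus 1
    [0.15, 0.30],              # locus 2
    [0.05, 0.12, 0.20, 0.09],  # locus 3
]

# Per-locus PI: probability a random person carries only visible alleles.
per_locus_pi = [sum(f) ** 2 for f in visible_allele_freqs]
cpi = np.prod(per_locus_pi)  # cumulative probability of inclusion
print(per_locus_pi, f"cumulative PI = {cpi:.4f}")
```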

  6. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Full Text Available Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic as a model within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. The probability of fire calculated by means of the model parameters and the attributes of specific buildings can subsequently be visualized in probability maps.
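
    A minimal sketch of the modelling approach described above, logistic regression mapping building attributes to a fire probability (the feature names and data are invented stand-ins for the Czech building parameters; scikit-learn is assumed available):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 500
# Hypothetical building attributes: storeys, floor area, residential flag.
X = np.column_stack([
    rng.integers(1, 12, n),
    rng.uniform(50, 5000, n),
    rng.integers(0, 2, n),
])
# Synthetic labels: bigger, taller buildings burn slightly more often.
logit = -4.0 + 0.08 * X[:, 0] + 0.0003 * X[:, 1] + 0.3 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression(max_iter=1000).fit(X, y)
# Predicted fire probability for one hypothetical building.
print(model.predict_proba([[8, 1200.0, 1]])[0, 1])
```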

  7. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique.The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc, all wrapped up in the scientific method.See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698)* Incorporates more than 1,000 engaging problems with answers* Includes more than 300 solved examples* Uses varied problem solving methods

  8. Teaching Elementary Probability Through its History.

    Science.gov (United States)

    Kunoff, Sharon; Pines, Sylvia

    1986-01-01

    Historical problems are presented which can readily be solved by students once some elementary probability concepts are developed. The Duke of Tuscany's Problem; the problem of points; and the question of proportions, divination, and Bertrand's Paradox are included. (MNS)

  9. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how...... the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work...... is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions....

  10. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results". Special emphases on simulation and discrete decision theory. Mathematically rich, but self-contained text, at a gentle pace. Review of calculus and linear algebra in an appendix. Mathematical interludes (in each chapter) examine mathematical techniques in the context of probabilistic or statistical importance. Numerous section exercises, summaries, historical notes, and Further Readings for reinforcement.

  11. Characteristic Functions over C*-Probability Spaces

    Institute of Scientific and Technical Information of China (English)

    王勤; 李绍宽

    2003-01-01

    Various properties of the characteristic functions of random variables in a non-commutative C*-probability space are studied in this paper. It turns out that the distributions of random variables are uniquely determined by their characteristic functions. By using the properties of characteristic functions, a central limit theorem for a sequence of independent identically distributed random variables in a C*-probability space is established as well.

  12. De Finetti's contribution to probability and statistics

    OpenAIRE

    Cifarelli, Donato Michele; Regazzini, Eugenio

    1996-01-01

    This paper summarizes the scientific activity of de Finetti in probability and statistics. It falls into three sections: Section 1 includes an essential biography of de Finetti and a survey of the basic features of the scientific milieu in which he took the first steps of his scientific career; Section 2 concerns de Finetti's work in probability: (a) foundations, (b) processes with independent increments, (c) sequences of exchangeable random variables, and (d) contributions which fall within ...

  13. Probability, clinical decision making and hypothesis testing

    Directory of Open Access Journals (Sweden)

    A Banerjee

    2009-01-01

Full Text Available Few clinicians grasp the true concept of probability expressed in the 'P value'. For most, a statistically significant P value is the end of the search for truth. In fact, the opposite is the case. The present paper attempts to put the P value in proper perspective by explaining different types of probabilities, their role in clinical decision making, medical research and hypothesis testing.

  14. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  15. Ruin Probability in Linear Time Series Model

    Institute of Scientific and Technical Information of China (English)

    ZHANG Lihong

    2005-01-01

This paper analyzes a continuous time risk model in which a linear model is used for the claim process. Time is discretized stochastically at the instants when claims occur, and Doob's stopping time theorem and martingale inequalities are used to obtain expressions for the ruin probability, as well as both exponential and non-exponential upper bounds on the ruin probability over an infinite time horizon. Numerical results are included to illustrate the accuracy of the non-exponential bound.
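For intuition on the exponential bound in such models, the classical Cramér-Lundberg setting (not the paper's linear time series model) already shows the pattern: the adjustment coefficient R solves lam + c*R = lam*M_X(R), and psi(u) <= exp(-R*u). A sketch with exponential claims, where R and the exact ruin probability have closed forms:

    import math

    lam, mu, c = 1.0, 1.0, 1.5   # claim rate, mean claim size, premium rate (c > lam*mu)

    # Adjustment coefficient for exponential(mean mu) claims.
    R = 1.0 / mu - lam / c

    for u in (0.0, 2.0, 5.0, 10.0):                  # initial capital
        bound = math.exp(-R * u)                     # Lundberg exponential upper bound
        exact = (lam * mu / c) * math.exp(-R * u)    # exact psi(u), exponential claims
        print(f"u={u:5.1f}  bound={bound:.4f}  exact={exact:.4f}")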

  16. Data analysis recipes: Probability calculus for inference

    OpenAIRE

    Hogg, David W.

    2012-01-01

    In this pedagogical text aimed at those wanting to start thinking about or brush up on probabilistic inference, I review the rules by which probability distribution functions can (and cannot) be combined. I connect these rules to the operations performed in probabilistic data analysis. Dimensional analysis is emphasized as a valuable tool for helping to construct non-wrong probabilistic statements. The applications of probability calculus in constructing likelihoods, marginalized likelihoods,...

  17. Representing Uncertainty by Probability and Possibility

    DEFF Research Database (Denmark)

Uncertain parameters in modeling are usually represented by probability distributions reflecting either the objective uncertainty of the parameters or the subjective belief held by the model builder. This approach is particularly suited for representing the statistical nature or variance of uncertain parameters.

  18. Generic Degraded Configuration Probability Analysis for DOE Codisposal Waste Package

    Energy Technology Data Exchange (ETDEWEB)

    S.F.A. Deng; M. Saglam; L.J. Gratton

    2001-05-23

In accordance with the technical work plan, 'Technical Work Plan For: Department of Energy Spent Nuclear Fuel Work Packages' (CRWMS M&O 2000c), this Analysis/Model Report (AMR) is developed for the purpose of screening out degraded configurations for U.S. Department of Energy (DOE) spent nuclear fuel (SNF) types. It performs, for qualifying configurations, the degraded configuration parameter and probability evaluations of the overall methodology specified in the 'Disposal Criticality Analysis Methodology Topical Report' (YMP 2000, Section 3). Degradation analyses are performed to assess realizable parameter ranges and physical regimes for configurations. Probability calculations are then performed for configurations characterized by k_eff in excess of the Critical Limit (CL). The scope of this document is to develop a generic set of screening criteria or models to screen out degraded configurations having the potential to exceed a criticality limit. The developed screening criteria include arguments based on physical/chemical processes and probability calculations, and they apply to DOE SNF types when codisposed with high-level waste (HLW) glass inside a waste package. The degradation takes place inside the waste package, long after repository licensing has expired. The emphasis of this AMR is on degraded configuration screening, and the probability analysis is one of the approaches used for screening. The intended use of the model is to apply the developed screening criteria to each DOE SNF type following the completion of the degraded-mode criticality analysis internal to the waste package.

  19. Site occupancy models with heterogeneous detection probabilities

    Science.gov (United States)

    Royle, J. Andrew

    2006-01-01

Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these 'site occupancy' models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123–1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
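A concrete instance of the zero-inflated binomial mixture: with occupancy probability psi and detection probability p ~ Beta(a, b), the marginal over p is beta-binomial in closed form, so a site's likelihood needs no numerical integration. A minimal sketch with illustrative parameter values, not Royle's data or full model:

    from math import comb, lgamma, exp

    def log_beta(a, b):
        return lgamma(a) + lgamma(b) - lgamma(a + b)

    def beta_binom_pmf(y, J, a, b):
        """Binomial(J, p) marginalized over p ~ Beta(a, b)."""
        return comb(J, y) * exp(log_beta(y + a, J - y + b) - log_beta(a, b))

    def site_likelihood(y, J, psi, a, b):
        """Zero-inflated mixture: the site is occupied with probability psi."""
        lik = psi * beta_binom_pmf(y, J, a, b)
        if y == 0:
            lik += 1.0 - psi      # never detected because never present
        return lik

    # y detections out of J = 5 visits at a site.
    for y in (0, 1, 3):
        print(y, site_likelihood(y, J=5, psi=0.6, a=2.0, b=5.0))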

  20. Methods for Calculating the Probability Distribution of Sums of Independent Random Variables

    Science.gov (United States)

    1986-07-01

(Scanned abstract largely illegible; the recoverable fragments describe a routine for obtaining the inverse of an N-by-N matrix via the compact form of the Gauss-Jordan method.)

  1. Optimization and Calculation of Probability Performances of Processes of Storage and Processing of Refrigerator Containerized Cargoes

    Science.gov (United States)

    Nyrkov, A. P.; Sokolov, S. S.; Chernyi, S. G.; Shnurenko, A. A.; Pavlova, L. A.

    2016-08-01

This work considers a disconnected multi-channel queueing system to which irregular, uniform or non-uniform flows of requests with an unlimited latency period arrive. The system is considered using the example of a container terminal having conditional functional sections with a definite mark-to-space ratio, on which an irregular, inhomogeneous traffic flow with a resultant intensity acts.

  2. Tables of Probability of Damage Calculations for Point and Circular Normal Area Targets.

    Science.gov (United States)

    1976-07-19

(Scanned abstract largely illegible; the recoverable fragments indicate damage-probability tables for point and circular normal area targets, with the joint distribution derived from a circular normal distribution whose R95 radius is one nautical mile.)

  3. Critical Assessment of Theoretical Calculations of Atomic Structure and Transition Probabilities: An Experimenter’s View

    Directory of Open Access Journals (Sweden)

    Elmar Träbert

    2014-03-01

    Full Text Available The interpretation of atomic observations by theory and the testing of computational predictions by experiment are interactive processes. It is necessary to gain experience with “the other side” before claims of achievement can be validated and judged. The discussion covers some general problems in the field as well as many specific examples, mostly organized by isoelectronic sequence, of what level of accuracy recently has been reached or which atomic structure or level lifetime problem needs more attention.

  4. Calculation of ruin probabilities for a dense class of heavy tailed distributions

    DEFF Research Database (Denmark)

    Bladt, Mogens; Nielsen, Bo Friis; Samorodnitsky, Gennady

    2015-01-01

In this paper, we propose a class of infinite-dimensional phase-type distributions with finitely many parameters as models for heavy tailed distributions. The class of finite-dimensional phase-type distributions is dense in the class of distributions on the positive reals and may hence approximate any such distribution.

  5. Calculating the Probability of Radiation from the Dissociated States of Diatomic Molecules,

    Science.gov (United States)

    1983-07-06

(Scanned abstract only partially legible; the recoverable fragments describe evaluating the transition matrix element with the wave function of the dissociated state by treating a slowly varying factor as smooth and taking it out from under the integral sign at the maximum point of the integrand, whose vicinity is determined by equation (2).)

  6. A First-Order Methodology for Calculating Probability of Mission Success

    Science.gov (United States)

    1979-01-31

  7. Estimating probable flaw distributions in PWR steam generator tubes

    Energy Technology Data Exchange (ETDEWEB)

    Gorman, J.A.; Turner, A.P.L. [Dominion Engineering, Inc., McLean, VA (United States)

    1997-02-01

    This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary to secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the numbers of tubes with detectable flaws of various types as a function of time, and (2) the distributions in size of these flaws. Results are provided for hypothetical severely affected, moderately affected and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses.

  8. Collective fluctuations in magnetized plasma: Transition probability approach

    Energy Technology Data Exchange (ETDEWEB)

    Sosenko, P.P. [International Centre of Physics and M.M.Boholiubov Inst. for Theoretical Physics, Kyiv (Ukraine)]|[Ecole Polytechnique, Palaiseau (France)]|[Univ. Henri Poincare, Vandoeuvre (France)

    1997-10-01

    Statistical plasma electrodynamics is elaborated with special emphasis on the transition probability approach and quasi-particles, and on modern applications to magnetized plasmas. Fluctuation spectra in the magnetized plasma are calculated in the range of low frequencies (with respect to the cyclotron one), and the conditions for the transition from incoherent to collective fluctuations are established. The role of finite-Larmor-radius effects and particle polarization drift in such a transition is explained. The ion collective features in fluctuation spectra are studied. 63 refs., 30 figs.

  9. E1M1 and E1E2 transition probabilities in one-electron ions

    Energy Technology Data Exchange (ETDEWEB)

    Labzowsky, L.N. [Institute of Physics, St. Petersburg State University, Uljanovskaya 1, Petrodvorets, 198904 St. Petersburg (Russian Federation) and Petersburg Nuclear Physics Institute, Gatchina, 188350 St. Petersburg (Russian Federation)]. E-mail: leonti@landau.phys.spbu.ru; Shonin, A.V. [Institute of Physics, St. Petersburg State University, Uljanovskaya 1, Petrodvorets, 198904 St. Petersburg (Russian Federation)

    2004-12-06

The quantum electrodynamical (QED) theory of two-photon transitions in hydrogenlike ions is presented. The emission probability for the 2s1/2 → 2γ(E1) + 1s1/2 transition is calculated and compared to the results of previous calculations. The emission probabilities for 2p1/2 → γ(E1) + γ(E2) + 1s1/2 and 2p1/2 → γ(E1) + γ(M1) + 1s1/2 are also calculated for nuclear charge values from Z = 1 upward. The results are given in two different gauges.

  10. Calculation Methods for Wallenius’ Noncentral Hypergeometric Distribution

    DEFF Research Database (Denmark)

    Fog, Agner

    2008-01-01

Fisher's noncentral hypergeometric distribution is the conditional distribution of independent binomial variates given their sum. No reliable calculation method for Wallenius' noncentral hypergeometric distribution has hitherto been described in the literature. Several new methods for calculating probabilities from Wallenius' noncentral hypergeometric distribution are derived. Range of applicability, numerical problems, and efficiency are discussed for each method. Approximations to the mean and variance are also discussed. This distribution has important applications in models of biased sampling and in models of evolutionary systems.
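One representation underlying such methods is the integral form of Wallenius' pmf: with m1 marked and m2 unmarked items, n draws, and odds w, f(x) = C(m1, x) * C(m2, n-x) * Integral[0,1] of (1 - t^(w/D))^x * (1 - t^(1/D))^(n-x) dt, where D = w*(m1 - x) + (m2 - n + x). A direct numerical sketch, not one of Fog's refined methods, which are designed for the numerically hard parameter regions:

    from math import comb
    from scipy.integrate import quad

    def wallenius_pmf(x, n, m1, m2, omega):
        """Wallenius' noncentral hypergeometric pmf via its integral representation."""
        D = omega * (m1 - x) + (m2 - (n - x))
        integrand = lambda t: (1 - t**(omega / D))**x * (1 - t**(1 / D))**(n - x)
        integral, _ = quad(integrand, 0.0, 1.0)
        return comb(m1, x) * comb(m2, n - x) * integral

    # Sanity check: omega = 1 reduces to the central hypergeometric distribution.
    print(wallenius_pmf(2, n=5, m1=6, m2=8, omega=1.0))  # C(6,2)C(8,3)/C(14,5)
    print(wallenius_pmf(2, n=5, m1=6, m2=8, omega=2.0))  # biased sampling
    print(sum(wallenius_pmf(x, 5, 6, 8, 2.0) for x in range(6)))  # ~= 1.0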

  11. Laboratory-tutorial activities for teaching probability

    Directory of Open Access Journals (Sweden)

    Roger E. Feeley

    2006-08-01

    Full Text Available We report on the development of students’ ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain “touchstone” examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler’s fallacy, and use expectations about ensemble results rather than expectation values to predict future events. We map the development of their thinking to provide examples of problems rather than evidence of a curriculum’s success.

  12. Causal inference, probability theory, and graphical insights.

    Science.gov (United States)

    Baker, Stuart G

    2013-11-10

    Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design.

  13. Using the Reliability Theory for Assessing the Decision Confidence Probability for Comparative Life Cycle Assessments.

    Science.gov (United States)

    Wei, Wei; Larrey-Lassalle, Pyrène; Faure, Thierry; Dumoulin, Nicolas; Roux, Philippe; Mathias, Jean-Denis

    2016-03-01

Comparative decision making is widely used to identify which option (system, product, service, etc.) has the smaller environmental footprint and to provide recommendations that help stakeholders take future decisions. However, uncertainty complicates the comparison and the decision making. Probability-based decision support in LCA is a way to help stakeholders in their decision-making process. It calculates the decision confidence probability, which expresses the probability that one option has a smaller environmental impact than another. Here we apply reliability theory to approximate the decision confidence probability. We compare the traditional Monte Carlo method with a reliability method called the FORM method. The Monte Carlo method needs high computational time to calculate the decision confidence probability. The FORM method enables us to approximate the decision confidence probability with fewer simulations than the Monte Carlo method by approximating the response surface. Moreover, the FORM method calculates the associated importance factors, which correspond to a sensitivity analysis with respect to the probability. The importance factors allow stakeholders to determine which factors influence their decision. Our results clearly show that the reliability method provides additional useful information to stakeholders as well as reducing the computational time.
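The decision confidence probability itself is just P(impact_A < impact_B). A minimal Monte Carlo sketch with lognormal impact uncertainties; the distributions are illustrative, and the authors' FORM approximation replaces this sampling with a first-order reliability calculation:

    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000

    # Hypothetical uncertain environmental impacts of two options (e.g. kg CO2-eq).
    impact_a = rng.lognormal(mean=2.00, sigma=0.30, size=N)
    impact_b = rng.lognormal(mean=2.15, sigma=0.25, size=N)

    # Decision confidence probability: chance that option A beats option B.
    confidence = np.mean(impact_a < impact_b)
    print(f"P(A has the smaller impact) = {confidence:.3f}")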

  14. Magnetic Field Calculator

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Magnetic Field Calculator will calculate the total magnetic field, including components (declination, inclination, horizontal intensity, northerly intensity,...

  15. Uncertainty the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...

  16. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  17. A Thermodynamical Approach for Probability Estimation

    CERN Document Server

    Isozaki, Takashi

    2012-01-01

The issue of discrete probability estimation for samples of small size is addressed in this study. The maximum likelihood method often suffers from over-fitting when insufficient data are available. Although the Bayesian approach can avoid over-fitting by using prior distributions, it still has problems with objective analysis. In response to these drawbacks, a new theoretical framework based on thermodynamics, in which energy and temperature are introduced, was developed. Entropy and likelihood are placed at the center of this method. The key principle of inference for probability mass functions is the minimum free energy, which is shown to unify the two principles of maximum likelihood and maximum entropy. Our method can robustly estimate probability functions from small samples.
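The minimum free energy principle can be sketched directly: with observed counts n_k, minimize F(p) = U(p) - T*S(p), where U is the cross-entropy of p against the empirical distribution and S is the entropy of p; T -> 0 recovers maximum likelihood and large T pushes toward maximum entropy. A toy sketch under that reading; the paper's exact functional may differ:

    import numpy as np
    from scipy.optimize import minimize

    counts = np.array([7.0, 2.0, 1.0, 0.0])   # small sample over 4 categories
    emp = counts / counts.sum()

    def free_energy(z, T):
        p = np.exp(z - z.max()); p /= p.sum()      # softmax keeps p on the simplex
        U = -np.sum(emp * np.log(p + 1e-12))       # cross-entropy ("energy")
        S = -np.sum(p * np.log(p + 1e-12))         # entropy
        return U - T * S

    for T in (0.0, 0.3, 3.0):
        res = minimize(free_energy, np.zeros_like(counts), args=(T,))
        p = np.exp(res.x - res.x.max()); p /= p.sum()
        print(f"T={T}: {np.round(p, 3)}")   # T=0 -> ML; large T -> near-uniform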

  18. Probabilities and Signalling in Quantum Field Theory

    CERN Document Server

    Dickinson, Robert; Millington, Peter

    2016-01-01

    We present an approach to computing probabilities in quantum field theory for a wide class of source-detector models. The approach works directly with probabilities and not with squared matrix elements, and the resulting probabilities can be written in terms of expectation values of nested commutators and anti-commutators. We present results that help in the evaluation of these, including an expression for the vacuum expectation values of general nestings of commutators and anti-commutators in scalar field theory. This approach allows one to see clearly how faster-than-light signalling is prevented, because it leads to a diagrammatic expansion in which the retarded propagator plays a prominent role. We illustrate the formalism using the simple case of the much-studied Fermi two-atom problem.

  19. Channel Capacity Estimation using Free Probability Theory

    CERN Document Server

    Ryan, Øyvind

    2007-01-01

    In many channel measurement applications, one needs to estimate some characteristics of the channels based on a limited set of measurements. This is mainly due to the highly time varying characteristics of the channel. In this contribution, it will be shown how free probability can be used for channel capacity estimation in MIMO systems. Free probability has already been applied in various application fields such as digital communications, nuclear physics and mathematical finance, and has been shown to be an invaluable tool for describing the asymptotic behaviour of many systems when the dimensions of the system get large (i.e. the number of antennas). In particular, introducing the notion of free deconvolution, we provide hereafter an asymptotically (in the number of antennas) unbiased capacity estimator (w.r.t. the number of observations) for MIMO channels impaired with noise. Another unbiased estimator (for any number of observations) is also constructed by slightly modifying the free probability based est...

  20. Probability, Arrow of Time and Decoherence

    CERN Document Server

    Bacciagaluppi, G

    2007-01-01

    This paper relates both to the metaphysics of probability and to the physics of time asymmetry. Using the formalism of decoherent histories, it investigates whether intuitions about intrinsic time directedness that are often associated with probability can be justified in the context of no-collapse approaches to quantum mechanics. The standard (two-vector) approach to time symmetry in the decoherent histories literature is criticised, and an alternative approach is proposed, based on two decoherence conditions ('forwards' and 'backwards') within the one-vector formalism. In turn, considerations of forwards and backwards decoherence and of decoherence and recoherence suggest that a time-directed interpretation of probabilities, if adopted, should be both contingent and perspectival.

  1. A basic course in probability theory

    CERN Document Server

    Bhattacharya, Rabi

    2016-01-01

    This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added and basic theory has been expanded. General Markov dependent sequences and their convergence to equilibrium is the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and/or concentration inequalities ranging from those of Chebyshev, Cramer–Chernoff, Bahadur–Rao, to Hoeffding have been added, with illustrative comparisons of thei...

  2. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.
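For a single predictor the derived correlation has a simple closed form: with latent y* = b*x + e, e ~ N(0, 1), corr(y*, x) = b*sd(x) / sqrt(b^2*var(x) + 1). A minimal sketch on simulated probit data, assuming statsmodels; the article's method generalizes this to multiple predictors and adds significance tests:

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n, beta = 5000, 0.8
    x = rng.normal(size=n)
    y = (beta * x + rng.normal(size=n) > 0).astype(int)   # probit data-generating process

    fit = sm.Probit(y, sm.add_constant(x)).fit(disp=0)
    b = fit.params[1]

    # Correlation between the latent y* and x implied by the probit coefficient.
    rho = b * x.std() / np.sqrt(b**2 * x.var() + 1.0)
    print(f"beta-hat = {b:.3f}, implied corr(y*, x) = {rho:.3f}")
    print(f"true corr = {beta / np.sqrt(beta**2 + 1):.3f}")   # sd(x) = 1 here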

  3. A Revisit to Probability - Possibility Consistency Principles

    Directory of Open Access Journals (Sweden)

    Mamoni Dhar

    2013-03-01

Full Text Available In this article, our main intention is to highlight the fact that the probable links between probability and possibility which were established by different authors at different points in time on the basis of some well-known consistency principles cannot provide the desired result. That is why the paper discusses some prominent works on transformations between probability and possibility and finally aims to suggest a new principle, because none of the existing principles yields a unique transformation. The new consistency principle suggested here would in turn replace all others that exist in the literature by providing a reliable estimate of consistency between the two. Furthermore, some properties of the entropy of fuzzy numbers are also presented in this article.

  4. Python for probability, statistics, and machine learning

    CERN Document Server

    Unpingco, José

    2016-01-01

    This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowl...

  5. 7th High Dimensional Probability Meeting

    CERN Document Server

    Mason, David; Reynaud-Bouret, Patricia; Rosinski, Jan

    2016-01-01

    This volume collects selected papers from the 7th High Dimensional Probability meeting held at the Institut d'Études Scientifiques de Cargèse (IESC) in Corsica, France. High Dimensional Probability (HDP) is an area of mathematics that includes the study of probability distributions and limit theorems in infinite-dimensional spaces such as Hilbert spaces and Banach spaces. The most remarkable feature of this area is that it has resulted in the creation of powerful new tools and perspectives, whose range of application has led to interactions with other subfields of mathematics, statistics, and computer science. These include random matrices, nonparametric statistics, empirical processes, statistical learning theory, concentration of measure phenomena, strong and weak approximations, functional estimation, combinatorial optimization, and random graphs. The contributions in this volume show that HDP theory continues to thrive and develop new tools, methods, techniques and perspectives to analyze random phenome...

  6. Pointwise probability reinforcements for robust statistical inference.

    Science.gov (United States)

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include e.g. outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR, and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation.

  7. Explosion probability of unexploded ordnance: expert beliefs.

    Science.gov (United States)

    MacDonald, Jacqueline Anne; Small, Mitchell J; Morgan, M G

    2008-08-01

    This article reports on a study to quantify expert beliefs about the explosion probability of unexploded ordnance (UXO). Some 1,976 sites at closed military bases in the United States are contaminated with UXO and are slated for cleanup, at an estimated cost of $15-140 billion. Because no available technology can guarantee 100% removal of UXO, information about explosion probability is needed to assess the residual risks of civilian reuse of closed military bases and to make decisions about how much to invest in cleanup. This study elicited probability distributions for the chance of UXO explosion from 25 experts in explosive ordnance disposal, all of whom have had field experience in UXO identification and deactivation. The study considered six different scenarios: three different types of UXO handled in two different ways (one involving children and the other involving construction workers). We also asked the experts to rank by sensitivity to explosion 20 different kinds of UXO found at a case study site at Fort Ord, California. We found that the experts do not agree about the probability of UXO explosion, with significant differences among experts in their mean estimates of explosion probabilities and in the amount of uncertainty that they express in their estimates. In three of the six scenarios, the divergence was so great that the average of all the expert probability distributions was statistically indistinguishable from a uniform (0, 1) distribution-suggesting that the sum of expert opinion provides no information at all about the explosion risk. The experts' opinions on the relative sensitivity to explosion of the 20 UXO items also diverged. The average correlation between rankings of any pair of experts was 0.41, which, statistically, is barely significant (p= 0.049) at the 95% confidence level. Thus, one expert's rankings provide little predictive information about another's rankings. The lack of consensus among experts suggests that empirical studies

  8. High-Probability Neurotransmitter Release Sites Represent an Energy-Efficient Design.

    Science.gov (United States)

    Lu, Zhongmin; Chouhan, Amit K; Borycz, Jolanta A; Lu, Zhiyuan; Rossano, Adam J; Brain, Keith L; Zhou, You; Meinertzhagen, Ian A; Macleod, Gregory T

    2016-10-10

    Nerve terminals contain multiple sites specialized for the release of neurotransmitters. Release usually occurs with low probability, a design thought to confer many advantages. High-probability release sites are not uncommon, but their advantages are not well understood. Here, we test the hypothesis that high-probability release sites represent an energy-efficient design. We examined release site probabilities and energy efficiency at the terminals of two glutamatergic motor neurons synapsing on the same muscle fiber in Drosophila larvae. Through electrophysiological and ultrastructural measurements, we calculated release site probabilities to differ considerably between terminals (0.33 versus 0.11). We estimated the energy required to release and recycle glutamate from the same measurements. The energy required to remove calcium and sodium ions subsequent to nerve excitation was estimated through microfluorimetric and morphological measurements. We calculated energy efficiency as the number of glutamate molecules released per ATP molecule hydrolyzed, and high-probability release site terminals were found to be more efficient (0.13 versus 0.06). Our analytical model indicates that energy efficiency is optimal (∼0.15) at high release site probabilities (∼0.76). As limitations in energy supply constrain neural function, high-probability release sites might ameliorate such constraints by demanding less energy. Energy efficiency can be viewed as one aspect of nerve terminal function, in balance with others, because high-efficiency terminals depress significantly during episodic bursts of activity.

  9. Quantum probability and quantum decision-making.

    Science.gov (United States)

    Yukalov, V I; Sornette, D

    2016-01-13

    A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary.

  10. Atomic transition probabilities of Nd I

    Science.gov (United States)

    Stockett, M. H.; Wood, M. P.; Den Hartog, E. A.; Lawler, J. E.

    2011-12-01

    Fourier transform spectra are used to determine emission branching fractions for 236 lines of the first spectrum of neodymium (Nd i). These branching fractions are converted to absolute atomic transition probabilities using radiative lifetimes from time-resolved laser-induced fluorescence measurements (Den Hartog et al 2011 J. Phys. B: At. Mol. Opt. Phys. 44 225001). The wavelength range of the data set is from 390 to 950 nm. These transition probabilities from emission and laser measurements are compared to relative absorption measurements in order to assess the importance of unobserved infrared branches from selected upper levels.

  11. Survival probability in diffractive dijet photoproduction

    CERN Document Server

    Klasen, M

    2009-01-01

    We confront the latest H1 and ZEUS data on diffractive dijet photoproduction with next-to-leading order QCD predictions in order to determine whether a rapidity gap survival probability of less than one is supported by the data. We find evidence for this hypothesis when assuming global factorization breaking for both the direct and resolved photon contributions, in which case the survival probability would have to be E_T^jet-dependent, and for the resolved or in addition the related direct initial-state singular contribution only, where it would be independent of E_T^jet.

  12. Conditional Probabilities and Collapse in Quantum Measurements

    Science.gov (United States)

    Laura, Roberto; Vanni, Leonardo

    2008-09-01

    We show that including both the system and the apparatus in the quantum description of the measurement process, and using the concept of conditional probabilities, it is possible to deduce the statistical operator of the system after a measurement with a given result, which gives the probability distribution for all possible consecutive measurements on the system. This statistical operator, representing the state of the system after the first measurement, is in general not the same that would be obtained using the postulate of collapse.

  13. Harmonic analysis and the theory of probability

    CERN Document Server

    Bochner, Salomon

    2005-01-01

    Nineteenth-century studies of harmonic analysis were closely linked with the work of Joseph Fourier on the theory of heat and with that of P. S. Laplace on probability. During the 1920s, the Fourier transform developed into one of the most effective tools of modern probabilistic research; conversely, the demands of the probability theory stimulated further research into harmonic analysis.Mathematician Salomon Bochner wrote a pair of landmark books on the subject in the 1930s and 40s. In this volume, originally published in 1955, he adopts a more probabilistic view and emphasizes stochastic pro

  14. Duelling idiots and other probability puzzlers

    CERN Document Server

    Nahin, Paul J

    2002-01-01

    What are your chances of dying on your next flight, being called for jury duty, or winning the lottery? We all encounter probability problems in our everyday lives. In this collection of twenty-one puzzles, Paul Nahin challenges us to think creatively about the laws of probability as they apply in playful, sometimes deceptive, ways to a fascinating array of speculative situations. Games of Russian roulette, problems involving the accumulation of insects on flypaper, and strategies for determining the odds of the underdog winning the World Series all reveal intriguing dimensions to the worki

  15. Lady luck the theory of probability

    CERN Document Server

    Weaver, Warren

    1982-01-01

    ""Should I take my umbrella?"" ""Should I buy insurance?"" ""Which horse should I bet on?"" Every day ― in business, in love affairs, in forecasting the weather or the stock market questions arise which cannot be answered by a simple ""yes"" or ""no."" Many of these questions involve probability. Probabilistic thinking is as crucially important in ordinary affairs as it is in the most abstruse realms of science. This book is the best nontechnical introduction to probability ever written. Its author, the late Dr. Warren Weaver, was a professor of mathematics, active in the Rockefeller and Sloa

  16. Probabilities for separating sets of order statistics.

    Science.gov (United States)

    Glueck, D H; Karimpour-Fard, A; Mandel, J; Muller, K E

    2010-04-01

Consider a set of order statistics that arise from sorting samples from two different populations, each with their own, possibly different distribution functions. The probability that these order statistics fall in disjoint, ordered intervals, and that, of the smallest statistics, a certain number come from the first population, is given in terms of the two distribution functions. The result is applied to computing the joint probability of the number of rejections and the number of false rejections for the Benjamini-Hochberg false discovery rate procedure.
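Such formulas are easy to sanity-check by simulation: pool samples from the two populations, sort, and count how often, among the k smallest order statistics, a given number come from the first population. A minimal Monte Carlo sketch with illustrative normal populations:

    import numpy as np

    rng = np.random.default_rng(7)
    n1, n2, k, j = 6, 8, 5, 3   # sample sizes; k smallest; j from population 1
    trials, hits = 50_000, 0

    for _ in range(trials):
        s1 = rng.normal(0.0, 1.0, n1)    # population 1
        s2 = rng.normal(0.5, 1.0, n2)    # population 2
        labels = np.concatenate([np.zeros(n1), np.ones(n2)])
        order = np.argsort(np.concatenate([s1, s2]))
        # Among the k smallest of the pooled sample, count population-1 members.
        if (labels[order[:k]] == 0).sum() == j:
            hits += 1

    print(f"P(exactly {j} of the {k} smallest come from pop. 1) ~= {hits / trials:.4f}")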

  17. Concepts of probability in radiocarbon analysis

    Directory of Open Access Journals (Sweden)

    Bernhard Weninger

    2011-12-01

    Full Text Available In this paper we explore the meaning of the word probability, not in general terms, but restricted to the field of radiocarbon dating, where it has the meaning of ‘dating probability assigned to calibrated 14C-ages’. The intention of our study is to improve our understanding of certain properties of radiocarbon dates, which – although mathematically abstract – are fundamental both for the construction of age models in prehistoric archaeology, as well as for an adequate interpretation of their reliability.

  18. Fifty challenging problems in probability with solutions

    CERN Document Server

    Mosteller, Frederick

    1987-01-01

Can you solve the problem of "The Unfair Subway"? Marvin gets off work at random times between 3 and 5 p.m. His mother lives uptown, his girlfriend downtown. He takes the first subway that comes in either direction and eats dinner with the one he is delivered to. His mother complains that he never comes to see her, but he says she has a 50-50 chance. He has had dinner with her twice in the last 20 working days. Explain. Marvin's adventures in probability are one of the fifty intriguing puzzles that illustrate both elementary and advanced aspects of probability, each problem designed to chall

  19. Probability, statistics, and decision for civil engineers

    CERN Document Server

    Benjamin, Jack R

    2014-01-01

    Designed as a primary text for civil engineering courses, as a supplementary text for courses in other areas, or for self-study by practicing engineers, this text covers the development of decision theory and the applications of probability within the field. Extensive use of examples and illustrations helps readers develop an in-depth appreciation for the theory's applications, which include strength of materials, soil mechanics, construction planning, and water-resource design. A focus on fundamentals includes such subjects as Bayesian statistical decision theory, subjective probability, and

  20. Probability densities and Lévy densities

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler

For positive Lévy processes (i.e. subordinators) formulae are derived that express the probability density or the distribution function in terms of power series in time t. The applicability of the results to finance and to turbulence is briefly indicated.

  1. Proposal for Modified Damage Probability Distribution Functions

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup; Hansen, Peter Friis

    1996-01-01

Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project, the present proposal for modified damage stability probability distribution functions has been developed and submitted to the "Sub-committee on St...".

  2. Probability in biology: overview of a comprehensive theory of probability in living systems.

    Science.gov (United States)

    Nakajima, Toshiyuki

    2013-09-01

    Probability is closely related to biological organization and adaptation to the environment. Living systems need to maintain their organizational order by producing specific internal events non-randomly, and must cope with the uncertain environments. These processes involve increases in the probability of favorable events for these systems by reducing the degree of uncertainty of events. Systems with this ability will survive and reproduce more than those that have less of this ability. Probabilistic phenomena have been deeply explored using the mathematical theory of probability since Kolmogorov's axiomatization provided mathematical consistency for the theory. However, the interpretation of the concept of probability remains both unresolved and controversial, which creates problems when the mathematical theory is applied to problems in real systems. In this article, recent advances in the study of the foundations of probability from a biological viewpoint are reviewed, and a new perspective is discussed toward a comprehensive theory of probability for understanding the organization and adaptation of living systems.

  3. Domestic wells have high probability of pumping septic tank leachate

    Directory of Open Access Journals (Sweden)

    J. E. Bremer

    2012-08-01

Full Text Available Onsite wastewater treatment systems are common in rural and semi-rural areas around the world; in the US, about 25–30% of households are served by a septic (onsite wastewater treatment) system, and many property owners also operate their own domestic well nearby. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. In areas with small lots (and thus high spatial septic system densities), shallow domestic wells are prone to contamination by septic system leachate. Mass balance approaches have been used to determine a maximum septic system density that would prevent contamination of groundwater resources. In this study, a source area model based on detailed groundwater flow and transport modeling is applied for a stochastic analysis of domestic well contamination by septic leachate. Specifically, we determine the probability that a source area overlaps with a septic system drainfield as a function of aquifer properties, septic system density and drainfield size. We show that high spatial septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We find that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and for those that are harmful even at low concentrations (e.g., pathogens).
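The headline result has a simple back-of-envelope counterpart: if drainfield centers are scattered as a spatial Poisson process with density lam and a well's source area is A, the probability that the source area overlaps at least one drainfield is 1 - exp(-lam*A). A hedged sketch; the study's probabilities come from detailed flow-and-transport modeling, not from this idealization:

    import math

    source_area_m2 = 2000.0          # hypothetical well source (capture) area

    # Septic system densities, in systems per hectare, converted to per m^2.
    for density_per_ha in (0.5, 1.0, 2.0, 4.0):
        lam = density_per_ha / 10_000.0
        p_overlap = 1.0 - math.exp(-lam * source_area_m2)
        print(f"{density_per_ha:4.1f} systems/ha -> P(overlap) = {p_overlap:.2f}")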

  4. Clusters galore: insights about environmental clusters from probability theory.

    Science.gov (United States)

    Neutra, R; Swan, S; Mack, T

    1992-12-15

    The posterior probability of a causal explanation given that an environmental cancer cluster is statistically significant depends on the prior probability of an environmentally caused cluster, the sensitivity of the statistical test and its specificity. The prior probability is low, because it is rare to have enough carcinogen in the general environment to cause a relative risk of cancer high enough to achieve statistical significance in a small geographic area. The sensitivity and specificity are not great. The likelihood that a census tract escapes statistically significant elevations in all 80 types of cancer can be calculated. Many of the thousands of census tracts will, by chance alone, have at least one type of cancer whose elevation is statistically significant. Actual observation from a large cancer registry confirms this probabilistic prediction. Applying the principles of Bayes' Theorem would suggest that most statistically significant environmental cancer clusters are not due to environmental carcinogens. One would have to investigate hundreds of environmental cancer clusters to find one with a true environmental cause.
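The dependence described above is Bayes' theorem for a positive predictive value: P(cause | significant) = sens*prior / (sens*prior + (1 - spec)*(1 - prior)). A small sketch with illustrative numbers showing why a low prior dominates:

    def posterior_causal(prior, sensitivity, specificity):
        """P(environmental cause | cluster is statistically significant)."""
        true_pos = sensitivity * prior
        false_pos = (1.0 - specificity) * (1.0 - prior)
        return true_pos / (true_pos + false_pos)

    # Even a decent test yields a small posterior when causal clusters are rare.
    for prior in (0.001, 0.01, 0.1):
        print(prior, round(posterior_causal(prior, sensitivity=0.8, specificity=0.95), 3))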

  5. Steady-state distributions of probability fluxes on complex networks

    Science.gov (United States)

    Chełminiak, Przemysław; Kurzyński, Michał

    2017-02-01

    We consider a simple model of the Markovian stochastic dynamics on complex networks to examine the statistical properties of the probability fluxes. The additional transition, called hereafter a gate, powered by the external constant force breaks a detailed balance in the network. We argue, using a theoretical approach and numerical simulations, that the stationary distributions of the probability fluxes emergent under such conditions converge to the Gaussian distribution. By virtue of the stationary fluctuation theorem, its standard deviation depends directly on the square root of the mean flux. In turn, the nonlinear relation between the mean flux and the external force, which provides the key result of the present study, allows us to calculate the two parameters that entirely characterize the Gaussian distribution of the probability fluxes both close to as well as far from the equilibrium state. Also, the other effects that modify these parameters, such as the addition of shortcuts to the tree-like network, the extension and configuration of the gate and a change in the network size studied by means of computer simulations are widely discussed in terms of the rigorous theoretical predictions.

  6. Probability of causation: Implications for radiological protection and dose limitation

    Energy Technology Data Exchange (ETDEWEB)

    Fabrikant, J.I.

    1987-05-01

This report on the probability of causation of radiation-induced cancer is an attempt to bring together biology, chemistry, physics and statistics to calculate a value in the form of a ratio expressed as a percentage. It involves the interactions of numerous cancer risk factors, and all are fraught with technical difficulties and uncertainties. It is a computational approach to a societal problem that should be resolved in the political arena by men and women of government and law. But it must be examined, because at present we have no reasonable method to explain the complexity of the mechanism of radiation-induced cancer and the probability of injury to an individual exposed in the past to ionizing radiation, and because society does not know how to compensate such a person who may have been injured by radiation, particularly low-level radiation. Five questions are discussed that concern the probability of causation of radiation-induced cancer. First, what is it and how can we best define the concept? Second, what are the methods of estimation and cancer causation? Third, what are the uncertainties involved? Fourth, what are the strengths and limitations of the computational approach? And fifth, what are the implications for radiological protection and dose limitation?
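The ratio in question is conventionally computed as PC = ERR / (1 + ERR), where ERR is the excess relative risk attributable to the dose. A minimal sketch assuming a linear ERR model with a placeholder risk coefficient; real PC calculations use site-, age-, and sex-specific risk models:

    def probability_of_causation(dose_sv, err_per_sv):
        """PC = radiation-attributable risk / total risk = ERR / (1 + ERR)."""
        err = err_per_sv * dose_sv          # linear excess-relative-risk model
        return err / (1.0 + err)

    # Illustrative only: the coefficient and doses are placeholders.
    for dose in (0.01, 0.1, 0.5, 1.0):      # sieverts
        pc = probability_of_causation(dose, err_per_sv=0.5)
        print(f"dose = {dose:4.2f} Sv -> PC = {100 * pc:4.1f}%")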

  8. Microscopic Calculations of 240Pu Fission

    Energy Technology Data Exchange (ETDEWEB)

    Younes, W; Gogny, D

    2007-09-11

Hartree-Fock-Bogoliubov calculations have been performed with the Gogny finite-range effective interaction for 240Pu out to scission, using a new code developed at LLNL. A first set of calculations was performed with the quadrupole moment constrained along the path of most probable fission, assuming axial symmetry but allowing for the spontaneous breaking of reflection symmetry of the nucleus. At a quadrupole moment of 345 b, the nucleus was found to spontaneously scission into two fragments. A second set of calculations, with all nuclear moments up to hexadecapole constrained, was performed to approach the scission configuration in a controlled manner. Calculated energies, moments, and representative plots of the total nuclear density are shown. The present calculations serve as a proof-of-principle, a blueprint, and starting-point solutions for a planned series of more comprehensive calculations to map out a large set of scission configurations and the associated fission-fragment properties.

  9. Transition probabilities in neutron-rich 84,86Se

    Science.gov (United States)

    Litzinger, J.; Blazhev, A.; Dewald, A.; Didierjean, F.; Duchêne, G.; Fransen, C.; Lozeva, R.; Sieja, K.; Verney, D.; de Angelis, G.; Bazzacco, D.; Birkenbach, B.; Bottoni, S.; Bracco, A.; Braunroth, T.; Cederwall, B.; Corradi, L.; Crespi, F. C. L.; Désesquelles, P.; Eberth, J.; Ellinger, E.; Farnea, E.; Fioretto, E.; Gernhäuser, R.; Goasduff, A.; Görgen, A.; Gottardo, A.; Grebosz, J.; Hackstein, M.; Hess, H.; Ibrahim, F.; Jolie, J.; Jungclaus, A.; Kolos, K.; Korten, W.; Leoni, S.; Lunardi, S.; Maj, A.; Menegazzo, R.; Mengoni, D.; Michelagnoli, C.; Mijatovic, T.; Million, B.; Möller, O.; Modamio, V.; Montagnoli, G.; Montanari, D.; Morales, A. I.; Napoli, D. R.; Niikura, M.; Pollarolo, G.; Pullia, A.; Quintana, B.; Recchia, F.; Reiter, P.; Rosso, D.; Sahin, E.; Salsac, M. D.; Scarlassara, F.; Söderström, P.-A.; Stefanini, A. M.; Stezowski, O.; Szilner, S.; Theisen, Ch.; Valiente Dobón, J. J.; Vandone, V.; Vogt, A.

    2015-12-01

    Reduced quadrupole transition probabilities for low-lying transitions in neutron-rich 84,86Se are investigated with a recoil distance Doppler shift (RDDS) experiment. The experiment was performed at the Istituto Nazionale di Fisica Nucleare (INFN) Laboratori Nazionali di Legnaro using the Cologne Plunger device for the RDDS technique and the AGATA Demonstrator array for the γ-ray detection, coupled to the PRISMA magnetic spectrometer for an event-by-event particle identification. In 86Se the level lifetime of the yrast 2_1^+ state and an upper limit for the lifetime of the 4_1^+ state are determined for the first time. The results for 86Se are in agreement with previously reported predictions of large-scale shell-model calculations using the Ni78-I and Ni78-II effective interactions. In addition, intrinsic shape parameters of the lowest yrast states in 86Se are calculated. In semimagic 84Se level lifetimes of the yrast 4_1^+ and 6_1^+ states are determined for the first time. Large-scale shell-model calculations using the effective interactions Ni78-II, JUN45, jj4b, and jj4pna are performed. The calculations describe B(E2; 2_1^+ → 0_1^+) and B(E2; 6_1^+ → 4_1^+) fairly well and point out problems in reproducing the experimental B(E2; 4_1^+ → 2_1^+).

  10. Comparing coefficients of nested nonlinear probability models

    DEFF Research Database (Denmark)

    Kohler, Ulrich; Karlson, Kristian Bernt; Holm, Anders

    2011-01-01

    In a series of recent articles, Karlson, Holm and Breen have developed a method for comparing the estimated coefficients of two nested nonlinear probability models. This article describes this method and the user-written program khb that implements the method. The KHB method is a general decomposi...

  11. Probability of boundary conditions in quantum cosmology

    Science.gov (United States)

    Suenobu, Hiroshi; Nambu, Yasusada

    2017-02-01

    One of the main interests in quantum cosmology is to determine boundary conditions for the wave function of the universe that can predict observational data of our universe. For this purpose, we solve the Wheeler-DeWitt equation for a closed universe with a scalar field numerically and evaluate probabilities for boundary conditions of the wave function of the universe. To impose boundary conditions of the wave function, we use exact solutions of the Wheeler-DeWitt equation with a constant scalar field potential. These exact solutions include wave functions with well known boundary condition proposals, the no-boundary proposal and the tunneling proposal. We specify the exact solutions by introducing two real parameters to discriminate boundary conditions, and obtain the probability for these parameters under the requirement of sufficient e-foldings of the inflation. The probability distribution of boundary conditions prefers the tunneling boundary condition to the no-boundary condition. Furthermore, for large values of a model parameter related to the inflaton mass and the cosmological constant, the probability of boundary conditions selects a unique boundary condition different from the tunneling type.

  12. Phonotactic Probability Effects in Children Who Stutter

    Science.gov (United States)

    Anderson, Julie D.; Byrd, Courtney T.

    2008-01-01

    Purpose: The purpose of this study was to examine the influence of "phonotactic probability", which is the frequency of different sound segments and segment sequences, on the overall fluency with which words are produced by preschool children who stutter (CWS) as well as to determine whether it has an effect on the type of stuttered disfluency…

  13. Comonotonic Book-Making with Nonadditive Probabilities

    NARCIS (Netherlands)

    Diecidue, E.; Wakker, P.P.

    2000-01-01

    This paper shows how de Finetti's book-making principle, commonly used to justify additive subjective probabilities, can be modified to agree with some nonexpected utility models. More precisely, a new foundation of the rank-dependent models is presented that is based on a comonotonic extension of the

  14. Probability & Statistics: Modular Learning Exercises. Teacher Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…

  15. Probability & Statistics: Modular Learning Exercises. Student Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…

  16. Interstitial lung disease probably caused by imipramine.

    Science.gov (United States)

    Deshpande, Prasanna R; Ravi, Ranjani; Gouda, Sinddalingana; Stanley, Weena; Hande, Manjunath H

    2014-01-01

    Drugs are rarely associated with causing interstitial lung disease (ILD). We report a case of a 75-year-old woman who developed ILD after exposure to imipramine. To our knowledge, this is one of the rare cases of ILD probably caused by imipramine. There is a need to report such rare adverse effects linking drugs to ILD for better management of ILD.

  17. PROBABILITY SAMPLING DESIGNS FOR VETERINARY EPIDEMIOLOGY

    OpenAIRE

    Xhelil Koleci; Coryn, Chris L.S.; Kristin A. Hobson; Rruzhdi Keci

    2011-01-01

    The objective of sampling is to estimate population parameters, such as incidence or prevalence, from information contained in a sample. In this paper, the authors describe sources of error in sampling; basic probability sampling designs, including simple random sampling, stratified sampling, systematic sampling, and cluster sampling; estimating a population size if unknown; and factors influencing sample size determination for epidemiological studies in veterinary medicine.

  18. STRIP: stream learning of influence probabilities

    DEFF Research Database (Denmark)

    Kutzkov, Konstantin

    2013-01-01

    cascades, and developing applications such as viral marketing. Motivated by modern microblogging platforms, such as twitter, in this paper we study the problem of learning influence probabilities in a data-stream scenario, in which the network topology is relatively stable and the challenge of a learning...

  19. Rethinking the learning of belief network probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Musick, R.

    1996-03-01

    Belief networks are a powerful tool for knowledge discovery that provide concise, understandable probabilistic models of data. There are methods grounded in probability theory to incrementally update the relationships described by the belief network when new information is seen, to perform complex inferences over any set of variables in the data, to incorporate domain expertise and prior knowledge into the model, and to automatically learn the model from data. This paper concentrates on part of the belief network induction problem, that of learning the quantitative structure (the conditional probabilities), given the qualitative structure. In particular, the current practice of rote learning the probabilities in belief networks can be significantly improved upon. We advance the idea of applying any learning algorithm to the task of conditional probability learning in belief networks, discuss potential benefits, and show results of applying neural networks and other algorithms to a medium sized car insurance belief network. The results demonstrate from 10 to 100% improvements in model error rates over the current approaches.
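
    A minimal sketch of the conditional-probability learning task itself, contrasting plain "rote" maximum-likelihood counting with a smoothed estimate (data, names, and the smoothing choice are invented for illustration; this is not the report's algorithm):

      # Estimate the conditional probability table P(child | parents) from data.
      # alpha = 0 gives the rote (maximum-likelihood) counts the report criticizes;
      # alpha > 0 smooths sparse counts toward uniform.
      from collections import Counter
      from itertools import product

      def learn_cpt(records, parent_vals, child_vals, alpha=1.0):
          """records: list of (parent_tuple, child_value) observations."""
          joint = Counter(records)
          cpt = {}
          for pa in product(*parent_vals):
              total = sum(joint[(pa, c)] for c in child_vals)
              for c in child_vals:
                  cpt[(pa, c)] = (joint[(pa, c)] + alpha) / (total + alpha * len(child_vals))
          return cpt

      data = [(("rain",), "wet"), (("rain",), "wet"), (("dry",), "dry_ground")]
      print(learn_cpt(data, [("rain", "dry")], ("wet", "dry_ground")))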

  20. Entanglement Mapping VS. Quantum Conditional Probability Operator

    Science.gov (United States)

    Chruściński, Dariusz; Kossakowski, Andrzej; Matsuoka, Takashi; Ohya, Masanori

    2011-01-01

    The relation between two methods which construct the density operator on a composite system is shown. One of them is called an entanglement mapping and the other is called a quantum conditional probability operator. On the basis of this relation we discuss the quantum correlation by means of some types of quantum entropy.

  1. Probable Bright Supernova discovered by PSST

    Science.gov (United States)

    Smith, K. W.; Wright, D.; Smartt, S. J.; Young, D. R.; Huber, M.; Chambers, K. C.; Flewelling, H.; Willman, M.; Primak, N.; Schultz, A.; Gibson, B.; Magnier, E.; Waters, C.; Tonry, J.; Wainscoat, R. J.; Foley, R. J.; Jha, S. W.; Rest, A.; Scolnic, D.

    2016-09-01

    A bright transient, which is a probable supernova, has been discovered as part of the Pan-STARRS Survey for Transients (PSST). Information on all objects discovered by the Pan-STARRS Survey for Transients is available at http://star.pst.qub.ac.uk/ps1threepi/ (see Huber et al. ATel #7153).

  2. Error probabilities in default Bayesian hypothesis testing

    NARCIS (Netherlands)

    Gu, Xin; Hoijtink, Herbert; Mulder, J,

    2016-01-01

    This paper investigates the classical type I and type II error probabilities of default Bayes factors for a Bayesian t test. Default Bayes factors quantify the relative evidence between the null hypothesis and the unrestricted alternative hypothesis without needing to specify prior distributions for

  3. Updating piping probabilities with survived historical loads

    NARCIS (Netherlands)

    Schweckendiek, T.; Kanning, W.

    2009-01-01

    Piping, also called under-seepage, is an internal erosion mechanism, which can cause the failure of dikes or other flood defence structures. The uncertainty in the resistance of a flood defence against piping is usually large, causing high probabilities of failure for this mechanism. A considerable

  4. Assessing Schematic Knowledge of Introductory Probability Theory

    Science.gov (United States)

    Birney, Damian P.; Fogarty, Gerard J.; Plank, Ashley

    2005-01-01

    The ability to identify schematic knowledge is an important goal for both assessment and instruction. In the current paper, schematic knowledge of statistical probability theory is explored from the declarative-procedural framework using multiple methods of assessment. A sample of 90 undergraduate introductory statistics students was required to…

  5. Independent Events in Elementary Probability Theory

    Science.gov (United States)

    Csenki, Attila

    2011-01-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E[subscript 1],…

  6. Statistical physics of pairwise probability models

    DEFF Research Database (Denmark)

    Roudi, Yasser; Aurell, Erik; Hertz, John

    2009-01-01

    (no Danish abstract available) Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data...

  7. Probability from a Socio-Cultural Perspective

    Science.gov (United States)

    Sharma, Sashi

    2016-01-01

    There exists considerable and rich literature on students' misconceptions about probability; less attention has been paid to the development of students' probabilistic thinking in the classroom. Grounded in an analysis of the literature, this article offers a lesson sequence for developing students' probabilistic understanding. In particular, a…

  8. Applied probability models with optimization applications

    CERN Document Server

    Ross, Sheldon M

    1992-01-01

    Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous time optimization models, much more. Problems and references at chapter ends. "Excellent introduction." - Journal of the American Statistical Association. Bibliography. 1970 edition.

  9. Quantitative non-monotonic modeling of economic uncertainty by probability and possibility distributions

    DEFF Research Database (Denmark)

    Schjær-Jacobsen, Hans

    2012-01-01

    to the understanding of similarities and differences of the two approaches as well as practical applications. The probability approach offers a good framework for representation of randomness and variability. Once the probability distributions of uncertain parameters and their correlations are known, the resulting uncertainty can be calculated. The possibility approach is particularly well suited for representation of uncertainty of a non-statistical nature due to lack of knowledge and requires less information than the probability approach. Based on the kind of uncertainty and knowledge present, these aspects… Calculation of non-monotonic functions of uncertainty represented by probability distributions is readily done by means of Monte Carlo simulation. Calculation of non-monotonic functions of possibility distributions is done within the theoretical framework of fuzzy intervals, but straightforward application of fuzzy arithmetic in general results in overestimation of interval…
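
    A minimal Monte Carlo sketch of the probability approach described here (the distributions and the non-monotonic profit function are invented purely for illustration):

      # Propagate two uncertain economic parameters through a non-monotonic
      # function by Monte Carlo sampling and summarize the output distribution.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000
      price = rng.normal(10.0, 1.0, n)              # assumed uncertain unit price
      volume = rng.triangular(800, 1000, 1300, n)   # assumed uncertain sales volume

      # profit is deliberately non-monotonic in price
      profit = volume * price - 0.05 * (price - 8.0) ** 2 * volume

      print("mean   :", profit.mean())
      print("5-95%  :", np.percentile(profit, [5, 95]))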

  10. Establishment probability in newly founded populations

    Directory of Open Access Journals (Sweden)

    Gusset Markus

    2012-06-01

    Full Text Available Abstract Background Establishment success in newly founded populations relies on reaching the established phase, which is defined by characteristic fluctuations of the population’s state variables. Stochastic population models can be used to quantify the establishment probability of newly founded populations; however, so far no simple but robust method for doing so existed. To determine a critical initial number of individuals that need to be released to reach the established phase, we used a novel application of the “Wissel plot”, where −ln(1 − P0(t)) is plotted against time t. This plot is based on the equation P0(t) = 1 − c1·e^(−ω1·t), which relates the probability of extinction by time t, P0(t), to two constants: c1 describes the probability of a newly founded population to reach the established phase, whereas ω1 describes the population’s probability of extinction per short time interval once established. Results For illustration, we applied the method to a previously developed stochastic population model of the endangered African wild dog (Lycaon pictus). A newly founded population reaches the established phase if the intercept of the (extrapolated) linear parts of the “Wissel plot” with the y-axis, which is −ln(c1), is negative. For wild dogs in our model, this is the case if a critical initial number of four packs, consisting of eight individuals each, are released. Conclusions The method we present to quantify the establishment probability of newly founded populations is generic and inferences thus are transferable to other systems across the field of conservation biology. In contrast to other methods, our approach disaggregates the components of a population’s viability by distinguishing establishment from persistence.
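
    A short Python sketch of the "Wissel plot" fit (synthetic P0(t) values stand in for the output of a stochastic population model; the c1 and ω1 values are invented):

      # Fit the linear part of y(t) = -ln(1 - P0(t)) = -ln(c1) + omega1*t
      # to recover the establishment probability c1 and the rate omega1.
      import numpy as np

      t = np.arange(5, 50)                      # years; early transient excluded
      p0 = 1.0 - 0.7 * np.exp(-0.02 * t)        # synthetic P0(t): c1=0.7, omega1=0.02
      y = -np.log(1.0 - p0)

      omega1, neg_ln_c1 = np.polyfit(t, y, 1)   # slope, intercept
      c1 = np.exp(-neg_ln_c1)
      print(f"c1 = {c1:.3f}, omega1 = {omega1:.4f}")
      # per the abstract, the established phase is reached when the
      # intercept -ln(c1) of the plot is negative, i.e. c1 > 1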

  11. Momentum Probabilities for a Single Quantum Particle in Three-Dimensional Regular "Infinite" Wells: One Way of Promoting Understanding of Probability Densities

    Science.gov (United States)

    Riggs, Peter J.

    2013-01-01

    Students often wrestle unsuccessfully with the task of correctly calculating momentum probability densities and have difficulty in understanding their interpretation. In the case of a particle in an "infinite" potential well, its momentum can take values that are not just those corresponding to the particle's quantised energies but…
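
    As a numerical illustration (ħ = 1, grid sizes arbitrary), the momentum probability density of an infinite-well eigenstate can be computed by Fourier transforming the position-space wave function; the result is a smooth density rather than spikes at the "quantised" momenta:

      # phi(p) = (1/sqrt(2*pi)) * Integral_0^L psi_n(x) exp(-i p x) dx, hbar = 1
      import numpy as np

      L, n = 1.0, 3
      x = np.linspace(0.0, L, 2001)
      dx = x[1] - x[0]
      psi = np.sqrt(2.0 / L) * np.sin(n * np.pi * x / L)   # infinite-well eigenstate

      p = np.linspace(-40.0, 40.0, 801)
      phi = (np.exp(-1j * np.outer(p, x)) * psi).sum(axis=1) * dx / np.sqrt(2.0 * np.pi)
      density = np.abs(phi) ** 2                           # momentum probability density

      dp = p[1] - p[0]
      print("normalization ~", density.sum() * dp)         # close to 1
      print("the density is continuous in p, not delta spikes at +/- n*pi/L")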

  12. Assessment of seismic margin calculation methods

    Energy Technology Data Exchange (ETDEWEB)

    Kennedy, R.P.; Murray, R.C.; Ravindra, M.K.; Reed, J.W.; Stevenson, J.D.

    1989-03-01

    Seismic margin review of nuclear power plants requires that the High Confidence of Low Probability of Failure (HCLPF) capacity be calculated for certain components. The candidate methods for calculating the HCLPF capacity as recommended by the Expert Panel on Quantification of Seismic Margins are the Conservative Deterministic Failure Margin (CDFM) method and the Fragility Analysis (FA) method. The present study evaluated these two methods using some representative components in order to provide further guidance in conducting seismic margin reviews. It is concluded that either of the two methods could be used for calculating HCLPF capacities. 21 refs., 9 figs., 6 tabs.

  13. Using High-Probability Foods to Increase the Acceptance of Low-Probability Foods

    Science.gov (United States)

    Meier, Aimee E.; Fryling, Mitch J.; Wallace, Michele D.

    2012-01-01

    Studies have evaluated a range of interventions to treat food selectivity in children with autism and related developmental disabilities. The high-probability instructional sequence is one intervention with variable results in this area. We evaluated the effectiveness of a high-probability sequence using 3 presentations of a preferred food on…

  14. Exact probability distribution for the Bernoulli-Malthus-Verhulst model driven by a multiplicative colored noise

    Science.gov (United States)

    Calisto, H.; Bologna, M.

    2007-05-01

    We report an exact result for the calculation of the probability distribution of the Bernoulli-Malthus-Verhulst model driven by a multiplicative colored noise. We study the conditions under which the probability distribution of the Malthus-Verhulst model can exhibit a transition from a unimodal to a bimodal distribution depending on the value of a critical parameter. Also we show that the mean value of x(t) in the latter model always approaches asymptotically the value 1.
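
    An illustrative Euler-scheme simulation of Verhulst-type dynamics driven by multiplicative Ornstein-Uhlenbeck (colored) noise; this is not the paper's exact parameterization, and all constants are invented:

      # x' = x(1 - x) + x*eta(t), with eta an OU noise of correlation time tau.
      import numpy as np

      rng = np.random.default_rng(1)
      dt, steps, tau, D = 1e-3, 200_000, 0.5, 0.3
      x, eta = 0.5, 0.0
      samples = []
      for i in range(steps):
          # OU update: d(eta) = -(eta/tau) dt + (sqrt(2D)/tau) dW
          eta += (-eta / tau) * dt + (np.sqrt(2 * D) / tau) * np.sqrt(dt) * rng.standard_normal()
          x += (x * (1.0 - x) + x * eta) * dt
          x = max(x, 1e-12)                 # keep the trajectory positive
          if i > steps // 2:
              samples.append(x)
      # compare with the asymptotic mean value of 1 reported in the abstract
      print("long-time mean of x(t):", np.mean(samples))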

  15. Probability of graphs with large spectral gap by multicanonical Monte Carlo

    OpenAIRE

    Saito, Nen; Iba, Yukito

    2010-01-01

    Graphs with large spectral gap are important in various fields such as biology, sociology and computer science. In designing such graphs, an important question is how the probability of graphs with large spectral gap behaves. A method based on multicanonical Monte Carlo is introduced to quantify the behavior of this probability, which enables us to calculate extreme tails of the distribution. The proposed method is successfully applied to random 3-regular graphs and large deviation probabilit...

  16. Probability of fixation of an advantageous mutant in a viral quasispecies.

    OpenAIRE

    Wilke, Claus O.

    2003-01-01

    The probability that an advantageous mutant rises to fixation in a viral quasispecies is investigated in the framework of multi-type branching processes. Whether fixation is possible depends on the overall growth rate of the quasispecies that will form if invasion is successful, rather than on the individual fitness of the invading mutant. The exact fixation probability can only be calculated if the fitnesses of all potential members of the invading quasispecies are known. Quasispecies fixati...

  17. Constructing the probability distribution function for the total capacity of a power system

    Energy Technology Data Exchange (ETDEWEB)

    Vasin, V.P.; Prokhorenko, V.I.

    1980-01-01

    The difficulties involved in constructing the probability distribution function for the total capacity of a power system consisting of numerous power plants are discussed. A method is considered for the approximate determination of such a function by a Monte Carlo method and by exact calculation based on special recursion formulas on a particular grid of argument values. It is shown that there may be significant deviations between the true probability distribution and a normal distribution.
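
    A sketch of the exact-calculation idea: a capacity outage probability table built by recursive convolution over independent two-state units (capacities and forced outage rates invented; the paper's own recursion formulas may differ in detail):

      # Exact distribution of total available capacity for independent
      # two-state generating units, built unit by unit.
      from collections import defaultdict

      units = [(100, 0.05), (100, 0.05), (150, 0.08)]  # (capacity MW, outage rate)

      dist = {0: 1.0}                        # P(total available capacity = c)
      for cap, q in units:
          new = defaultdict(float)
          for c, p in dist.items():
              new[c] += p * q                # unit forced out
              new[c + cap] += p * (1 - q)    # unit available
          dist = dict(new)

      for c in sorted(dist):
          print(f"{c:4d} MW : {dist[c]:.6f}")
      print("check:", sum(dist.values()))    # == 1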

  18. Role of Cluster Deformations on Their Preformation Probabilities in Radioactive Cluster-Decay Studies

    Science.gov (United States)

    Săndulescu, Aurel; Gupta, Raj K.; Greiner, Walter; Carstoiu, Florin; Horoi, Mihai

    The folded Michigan-3-Yukawa (M3Y) potential is used for the first time for calculating the WKB penetration probabilities, with deformation effects of the emitted clusters also included for the first time. For spherical clusters, our calculations of the (empirical) preformation probabilities P0 show that, depending rather sensitively on the choice of nuclear potential, P0 decreases with the mass Ac of the emitted cluster, as is the case for the two other model calculations of Blendowske-Walliser and Malik-Gupta. The deformation effects of the emitted clusters are shown to reduce the preformation probabilities by about three orders of magnitude (10^3), which invalidates the simple straight-line relation between -log10 P0 and Ac.
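
    As a hedged illustration of the straight-line relation for spherical clusters, a Blendowske-Walliser-type scaling P0(Ac) = P_alpha^((Ac-1)/3) makes -log10 P0 linear in Ac (the value of P_alpha below is only an assumed input, not a result of this record):

      import math

      P_alpha = 6.3e-3                      # assumed alpha preformation probability
      for Ac in (4, 12, 14, 20, 28, 34):
          P0 = P_alpha ** ((Ac - 1) / 3.0)  # scaling with cluster mass number Ac
          print(f"Ac = {Ac:2d}   -log10(P0) = {-math.log10(P0):6.2f}")
      # -log10(P0) = ((Ac - 1)/3) * (-log10(P_alpha)): linear in Ac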

  19. Cluster formation probability in the trans-tin and trans-lead nuclei

    Energy Technology Data Exchange (ETDEWEB)

    Santhosh, K.P. [School of Pure and Applied Physics, Kannur University, Payyanur Campus, Payyanur 670 327 (India)], E-mail: drkpsanthosh@gmail.com; Biju, R.K.; Sahadevan, Sabina [P.G. Department of Physics and Research Centre, Payyanur College, Payyanur 670 327 (India)

    2010-07-01

    Within our fission model, the Coulomb and proximity potential model (CPPM), cluster formation probabilities are calculated for different clusters ranging from carbon to silicon for the parents in the trans-tin and trans-lead regions. It is found that in the trans-tin region the {sup 12}C, {sup 16}O, {sup 20}Ne and {sup 24}Mg clusters have maximum cluster formation probability and lowest half lives as compared to other clusters. In the trans-lead region the {sup 14}C, {sup 18,20}O, {sup 23}F, {sup 24,26}Ne, {sup 28,30}Mg and {sup 34}Si clusters have the maximum cluster formation probability and minimum half life, which shows that alpha-like clusters are most probable for emission from the trans-tin region while non-alpha clusters are probable from the trans-lead region. These results stress the role of neutron proton symmetry and asymmetry of the daughter nuclei in these two cases.

  20. Cluster formation probability in the trans-tin and trans-lead nuclei

    CERN Document Server

    Santhosh, K P; Sahadevan, Sabina; 10.1016/j.nuclphysa.2010.03.004

    2010-01-01

    Within our fission model, the Coulomb and proximity potential model (CPPM), cluster formation probabilities are calculated for different clusters ranging from carbon to silicon for the parents in the trans-tin and trans-lead regions. It is found that in the trans-tin region the 12^C, 16^O, 20^Ne and 24^Mg clusters have maximum cluster formation probability and lowest half lives as compared to other clusters. In the trans-lead region the 14^C, 18,20^O, 23^F, 24,26^Ne, 28,30^Mg and 34^Si clusters have the maximum cluster formation probability and minimum half life, which shows that alpha-like clusters are most probable for emission from the trans-tin region while non-alpha clusters are probable from the trans-lead region. These results stress the role of neutron proton symmetry and asymmetry of the daughter nuclei in these two cases.

  1. Probability of fixation of an advantageous mutant in a viral quasispecies

    CERN Document Server

    Wilke, C O

    2002-01-01

    The probability that an advantageous mutant rises to fixation in a viral quasispecies is investigated in the framework of multi-type branching processes. Whether fixation is possible depends on the overall growth rate of the quasispecies that will form if invasion is successful, rather than on the individual fitness of the invading mutant. The exact fixation probability can only be calculated if the fitnesses of all potential members of the invading quasispecies are known. Quasispecies fixation has two important characteristics: First, a sequence with negative selection coefficient has a positive fixation probability as long as it has the potential to grow into a quasispecies with an overall growth rate that exceeds the one of the established quasispecies. Second, the fixation probabilities of sequences with identical fitnesses can nevertheless vary over many orders of magnitudes. Two approximations for the probability of fixation are introduced. Both approximations require only partial knowledge about the po...
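
    A simplified single-type sketch of the underlying branching-process calculation (the record treats multi-type processes; Poisson offspring is assumed here for illustration):

      # For Poisson(lam) offspring, the extinction probability q is the smallest
      # fixed point of q = exp(lam*(q - 1)); fixation probability = 1 - q.
      import math

      def fixation_probability(lam, iters=200):
          if lam <= 1.0:
              return 0.0                 # subcritical/critical: extinction certain
          q = 0.0                        # iterate the offspring generating function
          for _ in range(iters):
              q = math.exp(lam * (q - 1.0))
          return 1.0 - q

      for lam in (0.9, 1.0, 1.1, 1.5, 2.0):
          print(f"growth rate {lam:.1f}: fixation probability "
                f"{fixation_probability(lam):.4f}")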

  2. Quantum Zeno and anti-Zeno effects measured by transition probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Wenxian, E-mail: wxzhang@whu.edu.cn [School of Physics and Technology, Wuhan University, Wuhan, Hubei 430072 (China); Department of Optical Science and Engineering, Fudan University, Shanghai 200433 (China); CEMS, RIKEN, Saitama 351-0198 (Japan); Kavli Institute for Theoretical Physics China, CAS, Beijing 100190 (China); Kofman, A.G. [CEMS, RIKEN, Saitama 351-0198 (Japan); Department of Physics, The University of Michigan, Ann Arbor, MI 48109-1040 (United States); Zhuang, Jun [Department of Optical Science and Engineering, Fudan University, Shanghai 200433 (China); You, J.Q. [Beijing Computational Science Research Center, Beijing 10084 (China); Department of Physics, Fudan University, Shanghai 200433 (China); CEMS, RIKEN, Saitama 351-0198 (Japan); Nori, Franco [CEMS, RIKEN, Saitama 351-0198 (Japan); Department of Physics, The University of Michigan, Ann Arbor, MI 48109-1040 (United States)

    2013-10-30

    Using numerical calculations, we compare the transition probabilities of many spins in random magnetic fields, subject to either frequent projective measurements, frequent phase modulations, or a mix of modulations and measurements. For various distribution functions, we find the transition probability under frequent modulations is suppressed most if the pulse delay is short and the evolution time is larger than a critical value. Furthermore, decay freezing occurs only under frequent modulations as the pulse delay approaches zero. In the large pulse-delay region, however, the transition probabilities under frequent modulations are highest among the three control methods.

  3. Probability and statistics for particle physics

    CERN Document Server

    Mana, Carlos

    2017-01-01

    This book comprehensively presents the basic concepts of probability and Bayesian inference with sufficient generality to make them applicable to current problems in scientific research. The first chapter provides the fundamentals of probability theory that are essential for the analysis of random phenomena. The second chapter includes a full and pragmatic review of the Bayesian methods that constitute a natural and coherent framework with enough freedom to analyze all the information available from experimental data in a conceptually simple manner. The third chapter presents the basic Monte Carlo techniques used in scientific research, allowing a large variety of problems that are difficult to tackle by other procedures to be handled. The author also introduces a basic algorithm, which enables readers to simulate samples from simple distributions, and describes useful cases for researchers in particle physics. The final chapter is devoted to the basic ideas of Information Theory, which are important in the Bayesian me...

  4. Probability of Boundary Conditions in Quantum Cosmology

    CERN Document Server

    Suenobu, Hiroshi

    2016-01-01

    One of the main interests in quantum cosmology is to determine which type of boundary conditions for the wave function of the universe can predict observational data of our universe. For this purpose, we solve the Wheeler-DeWitt equation numerically and evaluate probabilities for an observable representing evolution of the classical universe, especially the number of e-foldings of the inflation. To express boundary conditions of the wave function, we use exact solutions of the Wheeler-DeWitt equation with a constant scalar field potential. These exact solutions include wave functions with well known boundary condition proposals, the no-boundary proposal and the tunneling proposal. We specify them by introducing two real parameters which discriminate boundary conditions and estimate the values of these parameters resulting in observationally preferable predictions. We obtain the probability for these parameters under the requirement of sufficient e-foldings of the inflation.

  5. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions, and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate computations of aggregate values. The paper also reports on the experiments with the methods. The work is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. No previous work considers the combination of the aspects of uncertain...

  6. Foundations of quantization for probability distributions

    CERN Document Server

    Graf, Siegfried

    2000-01-01

    Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous and special classes of singular probabilities (surface measures, self-similar measures) presenting some new results for the first time. Written for researchers and graduate students in probability theory the monograph is of potential interest to all people working in the disciplines mentioned above.

  7. Quantum probabilities: an information-theoretic interpretation

    CERN Document Server

    Bub, Jeffrey

    2010-01-01

    This Chapter develops a realist information-theoretic interpretation of the nonclassical features of quantum probabilities. On this view, what is fundamental in the transition from classical to quantum physics is the recognition that information in the physical sense has new structural features, just as the transition from classical to relativistic physics rests on the recognition that space-time is structurally different than we thought. Hilbert space, the event space of quantum systems, is interpreted as a kinematic (i.e., pre-dynamic) framework for an indeterministic physics, in the sense that the geometric structure of Hilbert space imposes objective probabilistic or information-theoretic constraints on correlations between events, just as the geometric structure of Minkowski space in special relativity imposes spatio-temporal kinematic constraints on events. The interpretation of quantum probabilities is more subjectivist in spirit than other discussions in this book (e.g., the chapter by Timpson)...

  8. Generating pseudo-random discrete probability distributions

    Energy Technology Data Exchange (ETDEWEB)

    Maziero, Jonas, E-mail: jonasmaziero@gmail.com [Universidade Federal de Santa Maria (UFSM), RS (Brazil). Departamento de Fisica

    2015-08-15

    The generation of pseudo-random discrete probability distributions is of paramount importance for a wide range of stochastic simulations spanning from Monte Carlo methods to the random sampling of quantum states for investigations in quantum information science. In spite of its significance, a thorough exposition of such a procedure is lacking in the literature. In this article, we present relevant details concerning the numerical implementation and applicability of what we call the iid, normalization, and trigonometric methods for generating an unbiased probability vector p=(p{sub 1},⋯ ,p{sub d}). An immediate application of these results regarding the generation of pseudo-random pure quantum states is also described. (author)
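
    One standard construction of such a vector, sketched below, normalizes i.i.d. exponential variates to obtain a point uniformly distributed on the probability simplex (a flat Dirichlet); this is an illustration of the goal only, and the paper's own iid, normalization, and trigonometric methods differ in their details:

      import numpy as np

      def random_probability_vector(d, rng):
          e = rng.exponential(scale=1.0, size=d)   # i.i.d. Exp(1) variates
          return e / e.sum()                       # uniform on the simplex

      rng = np.random.default_rng(42)
      p = random_probability_vector(5, rng)
      print(p, p.sum())                            # components >= 0, sum == 1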

  9. Estimation of transition probabilities of credit ratings

    Science.gov (United States)

    Peng, Gan Chew; Hin, Pooi Ah

    2015-12-01

    The present research is based on the quarterly credit ratings of ten companies over 15 years taken from the database of the Taiwan Economic Journal. The components of the vector mi = (mi1, mi2, …, mi10) may first be used to denote the credit ratings of the ten companies in the i-th quarter. The vector mi+1 in the next quarter is modelled to be dependent on the vector mi via a conditional distribution which is derived from a 20-dimensional power-normal mixture distribution. The transition probability Pkl(i, j) for getting mi+1,j = l given that mi,j = k is then computed from the conditional distribution. It is found that the variation of the transition probability Pkl(i, j) as i varies is able to give an indication of the possible transition of the credit rating of the j-th company in the near future.
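
    For comparison, the baseline count-based (maximum-likelihood) estimator of a rating transition matrix can be sketched as follows; the toy data are invented, and the record's power-normal mixture model is considerably more elaborate:

      # P[k][l] = (# transitions k -> l) / (# visits to state k)
      import numpy as np

      ratings = [[2, 2, 1, 1, 0, 1], [1, 1, 2, 2, 2, 1]]   # toy quarterly ratings
      K = 3
      counts = np.zeros((K, K))
      for seq in ratings:
          for a, b in zip(seq[:-1], seq[1:]):
              counts[a, b] += 1

      row_sums = counts.sum(axis=1, keepdims=True)
      P = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)
      print(P)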

  10. Earthquake probabilities: theoretical assessments and reality

    Science.gov (United States)

    Kossobokov, V. G.

    2013-12-01

    It is common knowledge that earthquakes are complex phenomena whose classification and sizing remain serious problems of contemporary seismology. In general, their frequency-magnitude distributions exhibit power law scaling. This scaling differs significantly when different time and/or space domains are considered. At the scale of a particular earthquake rupture zone the frequency of similar size events is usually estimated to be about once in several hundred years. Evidently, contemporary seismology does not possess enough reported instrumental data for any reliable quantification of an earthquake probability at a given place of expected event. Regretfully, most of the state-of-the-art theoretical approaches to assessing the probability of seismic events are based on trivial (e.g. Poisson, periodic, etc.) or, conversely, delicately designed (e.g. STEP, ETAS, etc.) models of earthquake sequences. Some of these models are evidently erroneous, some can be rejected by the existing statistics, and some are hardly testable in our lifetime. Nevertheless, such probabilistic counts, including seismic hazard assessment and earthquake forecasting, when used in practice eventually lead to scientifically groundless advice communicated to decision makers and inappropriate decisions. As a result, the population of seismic regions continues facing unexpected risk and losses. The international project Global Earthquake Model (GEM) is on the wrong track if it continues to base seismic risk estimates on the standard, mainly probabilistic, methodology to assess seismic hazard. It is generally accepted that earthquakes are infrequent, low-probability events. However, they keep occurring at earthquake-prone areas with 100% certainty. Given the expectation of a seismic event once per hundred years, the daily probability of occurrence on a certain date may range from 0 to 100% depending on a choice of probability space (which is yet unknown and, therefore, made by a subjective lucky chance

  11. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.
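
    A worked illustration of Bayes' theorem in the same spirit (all numbers invented): the probability that a detector pulse is signal, given that it passed a selection cut.

      # P(S|pass) = P(pass|S) P(S) / [P(pass|S) P(S) + P(pass|B) P(B)]
      p_signal = 0.01            # prior: 1% of pulses are signal
      p_pass_signal = 0.95       # cut efficiency for signal
      p_pass_background = 0.10   # cut acceptance for background

      posterior = (p_pass_signal * p_signal) / (
          p_pass_signal * p_signal + p_pass_background * (1 - p_signal)
      )
      print(f"P(signal | passed cut) = {posterior:.3f}")   # ~ 0.088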

  12. Marrakesh International Conference on Probability and Statistics

    CERN Document Server

    Ouassou, Idir; Rachdi, Mustapha

    2015-01-01

    This volume, which highlights recent advances in statistical methodology and applications, is divided into two main parts. The first part presents theoretical results on estimation techniques in functional statistics, while the second examines three key areas of application: estimation problems in queuing theory, an application in signal processing, and the copula approach to epidemiologic modelling. The book’s peer-reviewed contributions are based on papers originally presented at the Marrakesh International Conference on Probability and Statistics held in December 2013.

  13. Interpreting Prediction Market Prices as Probabilities

    OpenAIRE

    Wolfers, Justin; Zitzewitz, Eric

    2006-01-01

    While most empirical analysis of prediction markets treats prices of binary options as predictions of the probability of future events, Manski (2004) has recently argued that there is little existing theory supporting this practice. We provide relevant analytic foundations, describing sufficient conditions under which prediction markets prices correspond with mean beliefs. Beyond these specific sufficient conditions, we show that for a broad class of models prediction market prices are usuall...

  14. Probable Unusual Transmission of Zika Virus

    Centers for Disease Control (CDC) Podcasts

    2011-05-23

    This podcast discusses a study about the probable unusual transmission of Zika Virus Infection from a scientist to his wife, published in the May 2011 issue of Emerging Infectious Diseases. Dr. Brian Foy, Associate Professor at Colorado State University, shares details of this event.  Created: 5/23/2011 by National Center for Emerging Zoonotic and Infectious Diseases (NCEZID).   Date Released: 5/25/2011.

  15. The Origin of Probability and Entropy

    Science.gov (United States)

    Knuth, Kevin H.

    2008-11-01

    Measuring is the quantification of ordering. Thus the process of ordering elements of a set is a more fundamental activity than measuring. Order theory, also known as lattice theory, provides a firm foundation on which to build measure theory. The result is a set of new insights that cast probability theory and information theory in a new light, while simultaneously opening the door to a better understanding of measures as a whole.

  16. Non-signalling Theories and Generalized Probability

    Science.gov (United States)

    Tylec, Tomasz I.; Kuś, Marek; Krajczok, Jacek

    2016-09-01

    We provide a mathematically rigorous justification for using the term probability in connection with the so-called non-signalling theories, known also as Popescu's and Rohrlich's box worlds. Not only do we prove the correctness of these models (in the sense that they describe a composite system of two independent subsystems) but we obtain new properties of non-signalling boxes and expose new tools for further investigation. Moreover, it allows straightforward generalization to more complicated systems.

  17. Probable Cause: A Decision Making Framework.

    Science.gov (United States)

    1984-08-01

    draw upon several approaches to the study of causality; specifically, work in attribution theory (Heider, 1958; Kelley, 1973), methodology (Cook... psychology (Michotte, 1946; Piaget, 1974). From our perspective, much of the difficulty in assessing causality is due to the fact that judgments of... that violate probability and statistical theory. We therefore consider such cases because they highlight the various characteristics of each system

  18. SureTrak Probability of Impact Display

    Science.gov (United States)

    Elliott, John

    2012-01-01

    The SureTrak Probability of Impact Display software was developed for use during rocket launch operations. The software displays probability of impact information for each ship near the hazardous area during the time immediately preceding the launch of an unguided vehicle. Wallops range safety officers need to be sure that the risk to humans is below a certain threshold during each use of the Wallops Flight Facility Launch Range. Under the variable conditions that can exist at launch time, the decision to launch must be made in a timely manner to ensure a successful mission while not exceeding those risk criteria. Range safety officers need a tool that can give them the needed probability of impact information quickly, and in a format that is clearly understandable. This application is meant to fill that need. The software is a reuse of part of software developed for an earlier project: Ship Surveillance Software System (S4). The S4 project was written in C++ using Microsoft Visual Studio 6. The data structures and dialog templates from it were copied into a new application that calls the implementation of the algorithms from S4 and displays the results as needed. In the S4 software, the list of ships in the area was received from one local radar interface and from operators who entered the ship information manually. The SureTrak Probability of Impact Display application receives ship data from two local radars as well as the SureTrak system, eliminating the need for manual data entry.

  19. A quantum probability model of causal reasoning.

    Science.gov (United States)

    Trueblood, Jennifer S; Busemeyer, Jerome R

    2012-01-01

    People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects thus proving to be a viable new candidate for modeling human judgment.

  20. Probability of metastable states in Yukawa clusters

    Science.gov (United States)

    Ludwig, Patrick; Kaehlert, Hanno; Baumgartner, Henning; Bonitz, Michael

    2008-11-01

    Finite strongly coupled systems of charged particles in external traps are of high interest in many fields. Here we analyze the occurrence probabilities of ground- and metastable states of spherical, three-dimensional Yukawa clusters by means of molecular dynamics and Monte Carlo simulations and an analytical method. We find that metastable states can occur with a higher probability than the ground state, thus confirming recent dusty plasma experiments with so-called Yukawa balls [1]. The analytical method [2], based on the harmonic approximation of the potential energy, allows for a very intuitive explanation of the probabilities when combined with the simulation results [3].[1] D. Block, S. Käding, A. Melzer, A. Piel, H. Baumgartner, and M. Bonitz, Physics of Plasmas 15, 040701 (2008)[2] F. Baletto and R. Ferrando, Reviews of Modern Physics 77, 371 (2005)[3] H. Kählert, P. Ludwig, H. Baumgartner, M. Bonitz, D. Block, S. Käding, A. Melzer, and A. Piel, submitted for publication (2008)

  1. A Quantum Probability Model of Causal Reasoning

    Science.gov (United States)

    Trueblood, Jennifer S.; Busemeyer, Jerome R.

    2012-01-01

    People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects thus proving to be a viable new candidate for modeling human judgment. PMID:22593747

  2. Bacteria survival probability in bactericidal filter paper.

    Science.gov (United States)

    Mansur-Azzam, Nura; Hosseinidoust, Zeinab; Woo, Su Gyeong; Vyhnalkova, Renata; Eisenberg, Adi; van de Ven, Theo G M

    2014-05-01

    Bactericidal filter papers offer the simplicity of gravity filtration to simultaneously eradicate microbial contaminants and particulates. We previously detailed the development of biocidal block copolymer micelles that could be immobilized on a filter paper to actively eradicate bacteria. Despite the many advantages offered by this system, its widespread use is hindered by its unknown mechanism of action which can result in non-reproducible outcomes. In this work, we sought to investigate the mechanism by which a certain percentage of Escherichia coli cells survived when passing through the bactericidal filter paper. Through the process of elimination, the possibility that the bacterial survival probability was controlled by the initial bacterial load or the existence of resistant sub-populations of E. coli was dismissed. It was observed that increasing the thickness or the number of layers of the filter significantly decreased bacterial survival probability for the biocidal filter paper but did not affect the efficiency of the blank filter paper (no biocide). The survival probability of bacteria passing through the antibacterial filter paper appeared to depend strongly on the number of collision between each bacterium and the biocide-loaded micelles. It was thus hypothesized that during each collision a certain number of biocide molecules were directly transferred from the hydrophobic core of the micelle to the bacterial lipid bilayer membrane. Therefore, each bacterium must encounter a certain number of collisions to take up enough biocide to kill the cell and cells that do not undergo the threshold number of collisions are expected to survive.

  3. Probability of Default and Default Correlations

    Directory of Open Access Journals (Sweden)

    Weiping Li

    2016-07-01

    Full Text Available We consider a system where the asset values of firms are correlated with the default thresholds. We first evaluate the probability of default of a single firm under the correlated-assets assumption. This extends Merton’s probability of default of a single firm under the independent asset values assumption. At any time, the distance-to-default for a single firm is derived in the system, and this distance-to-default should provide a different measure for credit rating, taking the correlated asset values into consideration. We then derive a closed formula for the joint default probability and a general closed formula for the default correlation via the correlated multivariate process of the first-passage-time default correlation model. Our structural model encodes the sensitivities of default correlations with respect to the underlying correlation among firms’ asset values. Based on our results, we propose a different approach to credit risk management, in contrast to the commonly used risk measurement methods that take default correlations into consideration.

  4. A quantum probability model of causal reasoning

    Directory of Open Access Journals (Sweden)

    Jennifer S Trueblood

    2012-05-01

    Full Text Available People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects thus proving to be a viable new candidate for modeling human judgment.

  5. Principles of failure probability assessment (PoF)

    Energy Technology Data Exchange (ETDEWEB)

    Giribone, R. [Bureau Veritas, Energy and Process Business Line, 17 bis Place des Reflets, Courbevoie 92400 (France); Valette, B. [Bureau Veritas, Energy and Process Business Line, 17 bis Place des Reflets, Courbevoie 92400 (France)]. E-mail: bernard.valette@bureauveritas.com

    2004-11-01

    This abstract presents a method for computing the Probability of Failure (PoF), namely a method integrating the so-called 'Bayesian approach'. PoF, along with the assessment of the consequences of failure, is required when it comes to assessing 'risks'. More and more frequently, in modern industries, the trend is to rely on the use of risk-based approaches for the scheduling of the inspection of static pressure vessels. Equipment PoF is the main driver for scheduling periodical inspections. Within the Bayesian approach, it is expected that the performance of inspection, provided effective techniques are used, increases the knowledge we have of the equipment condition and helps us gain confidence in the planning of future inspections. The paper thus describes the theoretical principles leading to the calculation of the PoF prior to conducting an inspection and after its performance. PoF calculation within a Risk-Based Inspection (RBI) planning is one of the aspects covered by the EU project called 'RIMAP' (Risk-Based Inspection and Maintenance Procedure). PoF calculation using Bayes' theorem is the cornerstone of the RBI methodology described in American Petroleum Institute reference 'API 581'.
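
    A minimal sketch of the prior-to-posterior update idea when an inspection with a given probability of detection (POD) finds no defect; the numbers are invented and this is not the API 581 procedure itself:

      # Bayesian update of the probability that a vessel contains a critical
      # defect after an inspection reports "no defect found".
      p_defect_prior = 0.05
      pod = 0.80                              # assumed inspection effectiveness

      # P(defect | no detection) via Bayes' theorem:
      p_no_detect = (1 - pod) * p_defect_prior + 1.0 * (1 - p_defect_prior)
      p_defect_posterior = (1 - pod) * p_defect_prior / p_no_detect
      print(f"prior {p_defect_prior:.3f} -> posterior {p_defect_posterior:.4f}")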

  6. Landslide Probability Assessment by the Derived Distributions Technique

    Science.gov (United States)

    Muñoz, E.; Ochoa, A.; Martínez, H.

    2012-12-01

    is used along with the soil characteristic curve (suction vs. moisture) and the Mohr-Coulomb failure criterion in order to calculate the FOS of the slope. Data from two slopes located in steep tropical regions of the cities of Medellín (Colombia) and Rio de Janeiro (Brazil) were used to verify the model's performance. The results indicated significant differences between the obtained FOS values and the behavior observed in the field. The model shows relatively high values of FOS that do not reflect the instability of the analyzed slopes. For the two cases studied, the application of a simpler reliability concept (such as the Probability of Failure, PR, and Reliability Index, β) instead of a FOS could lead to more realistic results.

  7. Geochemical Calculations Using Spreadsheets.

    Science.gov (United States)

    Dutch, Steven Ian

    1991-01-01

    Spreadsheets are well suited to many geochemical calculations, especially those that are highly repetitive. Some of the kinds of problems that can be conveniently solved with spreadsheets include elemental abundance calculations, equilibrium abundances in nuclear decay chains, and isochron calculations. (Author/PR)

  8. Autistic Savant Calendar Calculators.

    Science.gov (United States)

    Patti, Paul J.

    This study identified 10 savants with developmental disabilities and an exceptional ability to calculate calendar dates. These "calendar calculators" were asked to demonstrate their abilities, and their strategies were analyzed. The study found that the ability to calculate dates into the past or future varied widely among these…

  9. How Do Calculators Calculate Trigonometric Functions?

    Science.gov (United States)

    Underwood, Jeremy M.; Edwards, Bruce H.

    How does your calculator quickly produce values of trigonometric functions? You might be surprised to learn that it does not use series or polynomial approximations, but rather the so-called CORDIC method. This paper will focus on the geometry of the CORDIC method, as originally developed by Volder in 1959. This algorithm is a wonderful…
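
    A minimal rotation-mode CORDIC sketch in Python (floating point for clarity; hardware implementations use fixed-point shifts and adds):

      # Computes (cos t, sin t) for |t| <= ~1.74 rad by rotating the vector
      # (1, 0) through a sequence of angles atan(2^-i), tracking the residual
      # angle z and correcting for the cumulative CORDIC gain at the end.
      import math

      ANGLES = [math.atan(2.0 ** -i) for i in range(32)]
      GAIN = 1.0
      for i in range(32):
          GAIN /= math.sqrt(1.0 + 2.0 ** (-2 * i))   # cumulative scale factor

      def cordic_cos_sin(t):
          x, y, z = 1.0, 0.0, t
          for i, a in enumerate(ANGLES):
              d = 1.0 if z >= 0.0 else -1.0          # rotate toward z = 0
              x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
              z -= d * a
          return x * GAIN, y * GAIN

      print(cordic_cos_sin(1.0))            # ~ (0.540302, 0.841471)
      print(math.cos(1.0), math.sin(1.0))   # reference values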

  10. Market-implied risk-neutral probabilities, actual probabilities, credit risk and news

    Directory of Open Access Journals (Sweden)

    Shashidhar Murthy

    2011-09-01

    Full Text Available Motivated by the credit crisis, this paper investigates links between risk-neutral probabilities of default implied by markets (e.g. from yield spreads and their actual counterparts (e.g. from ratings. It discusses differences between the two and clarifies underlying economic intuition using simple representations of credit risk pricing. Observed large differences across bonds in the ratio of the two probabilities are shown to imply that apparently safer securities can be more sensitive to news.

  11. A new estimator of the discovery probability.

    Science.gov (United States)

    Favaro, Stefano; Lijoi, Antonio; Prünster, Igor

    2012-12-01

    Species sampling problems have a long history in ecological and biological studies, where a number of issues, including the evaluation of species richness, the design of sampling experiments, and the estimation of rare species variety, need to be addressed. Such inferential problems have recently emerged in genomic applications as well; however, they exhibit some peculiar features that make them more challenging: specifically, one has to deal with very large populations (genomic libraries) containing a huge number of distinct species (genes), of which only a small portion has been sampled (sequenced). These aspects motivate the Bayesian nonparametric approach we undertake, since it allows one to achieve the degree of flexibility typically needed in this framework. Based on an observed sample of size n, the focus is on predicting a key aspect of the outcome from an additional sample of size m, namely, the so-called discovery probability. In particular, conditionally on an observed basic sample of size n, we derive a novel estimator of the probability of detecting, at the (n+m+1)th observation, species that have been observed with any given frequency in the enlarged sample of size n+m. Such an estimator admits a closed-form expression that can be exactly evaluated. The result we obtain allows us to quantify both the rate at which rare species are detected and the achieved sample coverage of abundant species, as m increases. Natural applications are represented by the estimation of the probability of discovering rare genes within genomic libraries, and the results are illustrated by means of two expressed sequence tags datasets.
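
    The paper's Bayesian nonparametric estimator is not reproduced here; as a simpler classical baseline, the Good-Turing estimate of the discovery probability (the chance that the next observation is a previously unseen species) is n1/n, where n1 is the number of species seen exactly once. A sketch with toy data:

        from collections import Counter

        # Good-Turing estimate of the discovery probability (classical
        # baseline, not the estimator derived in the paper). Toy data.
        sample = ["geneA", "geneB", "geneA", "geneC", "geneD", "geneB", "geneE"]

        counts = Counter(sample)
        n = len(sample)
        n1 = sum(1 for c in counts.values() if c == 1)  # singletons

        print(f"Good-Turing P(new species at draw {n + 1}) = {n1 / n:.3f}")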

  12. Uncertainty squared: Choosing among multiple input probability distributions and interpreting multiple output probability distributions in Monte Carlo climate risk models

    Science.gov (United States)

    Baer, P.; Mastrandrea, M.

    2006-12-01

    Simple probabilistic models which attempt to estimate likely transient temperature change from specified CO2 emissions scenarios must make assumptions about at least six uncertain aspects of the causal chain between emissions and temperature: current radiative forcing (including but not limited to aerosols), current land use emissions, carbon sinks, future non-CO2 forcing, ocean heat uptake, and climate sensitivity. Of these, multiple PDFs (probability density functions) have been published for the climate sensitivity, a couple for current forcing and ocean heat uptake, one for future non-CO2 forcing, and none for current land use emissions or carbon cycle uncertainty (which are interdependent). Different assumptions about these parameters, as well as different model structures, will lead to different estimates of likely temperature increase from the same emissions pathway. Thus policymakers will be faced with a range of temperature probability distributions for the same emissions scenarios, each described by a central tendency and spread. Because our conventional understanding of uncertainty and probability requires that a probabilistically defined variable of interest have only a single mean (or median, or modal) value and a well-defined spread, this "multidimensional" uncertainty defies straightforward utilization in policymaking. We suggest that there are no simple solutions to the questions raised. Crucially, we must dispel the notion that there is a "true" probability: probabilities of this type are necessarily subjective, and reasonable people may disagree. Indeed, we suggest that what is at stake is precisely the question: what is it reasonable to believe, and to act as if we believe? As a preliminary suggestion, we demonstrate how the output of a simple probabilistic climate model might be evaluated regarding the reasonableness of the outputs it calculates with different input PDFs. We suggest further that where there is insufficient evidence to clearly
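
    A minimal sketch of the "multiple input PDFs" problem: pushing the same assumed forcing through two different priors for climate sensitivity yields two different output temperature distributions. The equilibrium relation dT = S * dF / F2x and all distribution parameters are illustrative simplifications, not the authors' model.

        import numpy as np

        # Same scenario forcing, two priors for climate sensitivity S,
        # two output temperature PDFs. All numbers are illustrative.
        rng = np.random.default_rng(0)
        n = 100_000
        F2X = 3.7  # W/m^2 forcing for doubled CO2
        dF = 4.0   # W/m^2, assumed scenario forcing

        S_lognormal = rng.lognormal(mean=np.log(3.0), sigma=0.4, size=n)  # prior 1
        S_uniform = rng.uniform(1.5, 6.0, size=n)                         # prior 2

        for name, S in [("lognormal prior", S_lognormal), ("uniform prior", S_uniform)]:
            dT = S * dF / F2X
            print(f"{name}: median dT = {np.median(dT):.2f} K, "
                  f"90% range = ({np.percentile(dT, 5):.2f}, "
                  f"{np.percentile(dT, 95):.2f}) K")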

  13. Estimating the exceedance probability of extreme rainfalls up to the probable maximum precipitation

    Science.gov (United States)

    Nathan, Rory; Jordan, Phillip; Scorah, Matthew; Lang, Simon; Kuczera, George; Schaefer, Melvin; Weinmann, Erwin

    2016-12-01

    If risk-based criteria are used in the design of high hazard structures (such as dam spillways and nuclear power stations), then it is necessary to estimate the annual exceedance probability (AEP) of extreme rainfalls up to and including the Probable Maximum Precipitation (PMP). This paper describes the development and application of two largely independent methods to estimate the frequencies of such extreme rainfalls. One method is based on stochastic storm transposition (SST), which combines the "arrival" and "transposition" probabilities of an extreme storm using the total probability theorem. The second method, based on "stochastic storm regression" (SSR), combines frequency curves of point rainfalls with regression estimates of local and transposed areal rainfalls; rainfall maxima are generated by stochastically sampling the independent variates, where the required exceedance probabilities are obtained using the total probability theorem. The methods are applied to two large catchments (with areas of 3550 km² and 15,280 km²) located in inland southern Australia. Both methods were found to provide similar estimates of the frequency of extreme areal rainfalls for the two study catchments. The best estimates of the AEP of the PMP for the smaller and larger of the catchments were found to be 10⁻⁷ and 10⁻⁶, respectively, but the uncertainty of these estimates spans one to two orders of magnitude. Additionally, the SST method was applied to a range of locations within a meteorologically homogeneous region to investigate the nature of the relationship between the AEP of PMP and catchment area.
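
    A minimal sketch of the total-probability combination that underlies the SST method: the AEP of a given rainfall depth is assembled from storm "arrival" probabilities and "transposition" exceedance probabilities. The storm catalogue below is an illustrative assumption, not the paper's data.

        import numpy as np

        # AEP(d) = sum over storms of P(storm i occurs in a year) *
        #          P(storm i, once transposed, drops >= d on the catchment).
        p_arrival = np.array([0.020, 0.010, 0.002])          # annual occurrence
        p_exceed_given_storm = np.array([0.05, 0.30, 0.60])  # transposition

        aep = np.sum(p_arrival * p_exceed_given_storm)
        print(f"AEP of depth d = {aep:.2e}")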

  14. Data analysis & probability task & drill sheets

    CERN Document Server

    Cook, Tanya

    2011-01-01

    For grades 3-5, our State Standards-based combined resource meets the data analysis & probability concepts addressed by the NCTM standards and encourages your students to review the concepts in unique ways. The task sheets introduce the mathematical concepts to the students around a central problem taken from real-life experiences, while the drill sheets provide warm-up and timed practice questions for the students to strengthen their procedural proficiency skills. Included in our resource are activities to help students learn how to collect, organize, analyze, interpret, and predict data pro

  15. Normativity And Probable Reasoning: Hume On Induction

    OpenAIRE

    Tejedor, Chon

    2011-01-01

    In this article I examine the debate between epistemic and descriptivist interpreters of Hume's discussion of induction and probable reasoning. Epistemic interpreters regard Hume as concerned primarily with questions about the epistemic authority and justification of our inductive principles and beliefs. Descriptivist interpreters, by contrast, suggest that what Hume aims to do is explain how our beliefs are produced, not to rule on whether...

  16. Elemental mercury poisoning probably causes cortical myoclonus.

    Science.gov (United States)

    Ragothaman, Mona; Kulkarni, Girish; Ashraf, Valappil V; Pal, Pramod K; Chickabasavaiah, Yasha; Shankar, Susarla K; Govindappa, Srikanth S; Satishchandra, Parthasarthy; Muthane, Uday B

    2007-10-15

    Mercury toxicity causes postural tremors, commonly referred to as "mercurial tremors," and cerebellar dysfunction. A 23-year-old woman developed disabling generalized myoclonus and ataxia 2 years after injecting herself with elemental mercury. Electrophysiological studies indicated that the myoclonus was probably of cortical origin. Her deficits progressed over 2 years and improved after subcutaneous mercury deposits at the injection site were surgically cleared. Myoclonus of cortical origin has never been described in mercury poisoning. It is important to ask patients presenting with jerks about exposure to elemental mercury, even in a progressive illness, as the condition is potentially reversible, as in our patient.

  17. Atomic transition probabilities of Gd i

    Science.gov (United States)

    Lawler, J. E.; Bilty, K. A.; Den Hartog, E. A.

    2011-05-01

    Fourier transform spectra are used to determine emission branching fractions for 1290 lines of the first spectrum of gadolinium (Gd i). These branching fractions are converted to absolute atomic transition probabilities using previously reported radiative lifetimes from time-resolved laser-induced-fluorescence measurements (Den Hartog et al 2011 J. Phys. B: At. Mol. Opt. Phys. 44 055001). The wavelength range of the data set is from 300 to 1850 nm. A least squares technique for separating blends of first- and second-spectrum lines is also described and demonstrated in this work.
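
    A minimal sketch of the normalization step described above: branching fractions BF together with an upper-level radiative lifetime tau give absolute transition probabilities via A = BF / tau, since the A-values out of a level sum to 1/tau. The lifetime and branching fractions below are illustrative, not the Gd i data.

        # Convert branching fractions to absolute transition probabilities.
        tau_u = 8.0e-9  # s, upper-level lifetime from LIF (illustrative)
        branching_fractions = {"line_451nm": 0.62, "line_512nm": 0.30,
                               "line_633nm": 0.08}  # must sum to 1

        for line, bf in branching_fractions.items():
            A = bf / tau_u  # transition probability, s^-1
            print(f"{line}: A = {A:.3e} s^-1")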

  18. Atomic transition probabilities of Er i

    Science.gov (United States)

    Lawler, J. E.; Wyart, J.-F.; Den Hartog, E. A.

    2010-12-01

    Atomic transition probabilities for 562 lines of the first spectrum of erbium (Er i) are reported. These data are from new branching fraction measurements on Fourier transform spectra normalized with previously reported radiative lifetimes from time-resolved laser-induced fluorescence measurements (Den Hartog et al 2010 J. Phys. B: At. Mol. Opt. Phys. 43 155004). The wavelength range of the data set is from 298 to 1981 nm. In this work we explore the utility of parametric fits based on the Cowan code in assessing branching fraction errors due to lines connecting to unobserved lower levels.

  19. Intermediate Probability Theory for Biomedical Engineers

    CERN Document Server

    Enderle, John

    2006-01-01

    This is the second in a series of three short books on probability theory and random processes for biomedical engineers. This volume focuses on expectation, standard deviation, moments, and the characteristic function. In addition, conditional expectation, conditional moments and the conditional characteristic function are also discussed. Jointly distributed random variables are described, along with joint expectation, joint moments, and the joint characteristic function. Convolution is also developed. A considerable effort has been made to develop the theory in a logical manner--developing sp

  20. Probability of inflation in loop quantum cosmology

    Science.gov (United States)

    Ashtekar, Abhay; Sloan, David

    2011-12-01

    Inflationary models of the early universe provide a natural mechanism for the formation of large scale structure. This success brings to the forefront the question of naturalness: does a sufficiently long slow roll inflation occur generically, or does it require a careful fine tuning of initial parameters? In recent years there has been considerable controversy on this issue (Hollands and Wald in Gen Relativ Gravit, 34:2043, 2002; Kofman et al. in J High Energy Phys 10:057, 2002; Gibbons and Turok in Phys Rev D 77:063516, 2008). In particular, for a quadratic potential, Kofman et al. (J High Energy Phys 10:057, 2002) have argued that the probability of inflation with at least 65 e-foldings is close to one, while Gibbons and Turok (Phys Rev D 77:063516, 2008) have argued that this probability is suppressed by a factor of ~10⁻⁸⁵. We first clarify that such dramatically different predictions can arise because the required measure on the space of solutions is intrinsically ambiguous in general relativity. We then show that this ambiguity can be naturally resolved in loop quantum cosmology (LQC) because the big bang is replaced by a big bounce and the bounce surface can be used to introduce the structure necessary to specify a satisfactory measure. The second goal of the paper is to present a detailed analysis of the inflationary dynamics of LQC using analytical and numerical methods. By combining this information with the measure on the space of solutions, we address a sharper question than those investigated in Kofman et al. (J High Energy Phys 10:057, 2002), Gibbons and Turok (Phys Rev D 77:063516, 2008), and Ashtekar and Sloan (Phys Lett B 694:108, 2010): what is the probability of a sufficiently long slow roll inflation which is compatible with the seven-year WMAP data? We show that the probability is very close to 1. The material is so organized that cosmologists who may be more interested in the inflationary dynamics in LQC than in the subtleties associated with

  1. Numerical Ultimate Ruin Probabilities under Interest Force

    Directory of Open Access Journals (Sweden)

    Juma Kasozi

    2005-01-01

    Full Text Available This work addresses the issue of ruin of an insurer whose portfolio is exposed to insurance risk arising from the classical surplus process. The availability of a positive interest rate in the financial world forces the insurer to invest in a risk-free asset. We derive a linear Volterra integral equation of the second kind and apply a fourth-order block-by-block method in conjunction with the Simpson rule to solve the Volterra equation for ultimate ruin. This probability is arrived at by taking a linear combination of two solutions to the Volterra integral equation. The several numerical examples given show that our results are excellent and reliable.
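
    For illustration, the sketch below solves a linear Volterra integral equation of the second kind with the trapezoidal rule, a simpler (second-order) scheme than the fourth-order block-by-block method used in the paper; the kernel and forcing form a toy test problem with known solution exp(t), not the ruin equation itself.

        import numpy as np

        # Solve f(t) = g(t) + int_0^t K(t,s) f(s) ds by the trapezoidal rule.
        def solve_volterra(g, K, t):
            h = t[1] - t[0]
            f = np.empty_like(t)
            f[0] = g(t[0])
            for i in range(1, len(t)):
                acc = 0.5 * K(t[i], t[0]) * f[0]
                acc += sum(K(t[i], t[j]) * f[j] for j in range(1, i))
                # Solve for f[i], which appears in the trapezoidal sum itself.
                f[i] = (g(t[i]) + h * acc) / (1.0 - 0.5 * h * K(t[i], t[i]))
            return f

        # Test problem: f(t) = 1 + int_0^t f(s) ds has exact solution exp(t).
        t = np.linspace(0.0, 1.0, 101)
        f = solve_volterra(lambda t: 1.0, lambda t, s: 1.0, t)
        print(f[-1], np.exp(1.0))  # ~2.71828 vs 2.71828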

  2. Stochastics introduction to probability and statistics

    CERN Document Server

    Georgii, Hans-Otto

    2012-01-01

    This second revised and extended edition presents the fundamental ideas and results of both probability theory and statistics, and comprises the material of a one-year course. It is addressed to students with an interest in the mathematical side of stochastics. Stochastic concepts, models and methods are motivated by examples and developed and analysed systematically. Some measure theory is included, but this is done at an elementary level that is in accordance with the introductory character of the book. A large number of problems offer applications and supplements to the text.

  3. An introduction to probability and statistical inference

    CERN Document Server

    Roussas, George G

    2003-01-01

    "The text is wonderfully written and has the mostcomprehensive range of exercise problems that I have ever seen." - Tapas K. Das, University of South Florida"The exposition is great; a mixture between conversational tones and formal mathematics; the appropriate combination for a math text at [this] level. In my examination I could find no instance where I could improve the book." - H. Pat Goeters, Auburn, University, Alabama* Contains more than 200 illustrative examples discussed in detail, plus scores of numerical examples and applications* Chapters 1-8 can be used independently for an introductory course in probability* Provides a substantial number of proofs

  4. Acceleration Detection of Large (Probably Prime) Numbers

    Directory of Open Access Journals (Sweden)

    Dragan Vidakovic

    2013-02-01

    Full Text Available In order to avoid unnecessary applications of the Miller-Rabin algorithm to the number in question, we resort to trial division by a few initial prime numbers, since such divisions take less time. How far we should go with such divisions is the question we try to answer in this paper. In theory the matter is fully resolved; in practice, however, that result is of little use. We therefore present a solution that is probably irrelevant to theorists, but very useful to people who have spent many nights producing large (probably prime) numbers using their own software.
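
    A minimal sketch of the screening strategy: trial-divide by a list of small primes and run Miller-Rabin only on the survivors. The cutoff (primes below 600) is an illustrative choice, not the paper's answer to how far the trial division should go.

        import random

        def small_primes(limit):
            sieve = [True] * limit
            sieve[0] = sieve[1] = False
            for i in range(2, int(limit ** 0.5) + 1):
                if sieve[i]:
                    sieve[i * i::i] = [False] * len(sieve[i * i::i])
            return [i for i, p in enumerate(sieve) if p]

        SMALL = small_primes(600)  # ~first 100 primes

        def miller_rabin(n, rounds=40):
            d, r = n - 1, 0
            while d % 2 == 0:
                d //= 2
                r += 1
            for _ in range(rounds):
                a = random.randrange(2, n - 1)
                x = pow(a, d, n)
                if x in (1, n - 1):
                    continue
                for _ in range(r - 1):
                    x = pow(x, 2, n)
                    if x == n - 1:
                        break
                else:
                    return False  # composite witness found
            return True  # probably prime

        def probably_prime(n):
            if n < 2:
                return False
            for p in SMALL:
                if n % p == 0:
                    return n == p  # cheap rejection before Miller-Rabin
            return miller_rabin(n)

        print(probably_prime(2 ** 89 - 1))  # True: 2^89 - 1 is a Mersenne prime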

  5. The Probability Model of Expectation Disconfirmation Process

    Directory of Open Access Journals (Sweden)

    Hui-Hsin HUANG

    2015-06-01

    Full Text Available This paper proposes a probability model to explore the dynamic process of customer satisfaction. Based on expectation disconfirmation theory, satisfaction is constructed from the customer's expectation before purchase and the perceived performance after purchase. An experiment is designed to measure expectation disconfirmation effects, and the collected data are used to estimate overall satisfaction and calibrate the model. The results show a good fit between the model and the real data. The model has applications in business marketing for managing relationship satisfaction.

  6. Random iteration with place dependent probabilities

    CERN Document Server

    Kapica, R

    2011-01-01

    Markov chains arising from the random iteration of functions $S_{\theta}:X\to X$, $\theta \in \Theta$, where $X$ is a Polish space and $\Theta$ is an arbitrary set of indices, are considered. At $x\in X$, $\theta$ is sampled from a distribution $\theta_x$ on $\Theta$, and the $\theta_x$ are different for different $x$. Exponential convergence to a unique invariant measure is proved. This result is applied to the case of random affine transformations on ${\mathbb R}^d$, giving the existence of exponentially attractive perpetuities with place-dependent probabilities.
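
    A minimal sketch of such a chain on the real line: two affine maps whose selection probability depends on the current state x. The maps and the place-dependent probability function are illustrative assumptions, not the paper's construction.

        import random

        # Random iteration with place-dependent probabilities on [0, 1].
        maps = [lambda x: 0.5 * x, lambda x: 0.5 * x + 0.5]

        def p_first(x):
            # Probability of choosing maps[0] depends on the current point x.
            return 0.25 + 0.5 * x

        x = random.random()
        samples = []
        for _ in range(10_000):
            f = maps[0] if random.random() < p_first(x) else maps[1]
            x = f(x)
            samples.append(x)

        print(f"empirical mean of the chain: {sum(samples) / len(samples):.3f}")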

  7. Modulation Based on Probability Density Functions

    Science.gov (United States)

    Williams, Glenn L.

    2009-01-01

    A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.
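
    A minimal sketch of the observation the method rests on: histogramming samples of a sinusoid taken over a full cycle recovers its amplitude PDF (the arcsine density, which peaks near the extremes). The tone frequency and bin count are illustrative.

        import numpy as np

        t = np.linspace(0.0, 1.0, 10_000)  # one full cycle of a 1 Hz tone
        samples = np.sin(2.0 * np.pi * 1.0 * t)

        # Sort samples by frequency of occurrence: a histogram as a PDF estimate.
        hist, edges = np.histogram(samples, bins=20, density=True)
        centers = 0.5 * (edges[:-1] + edges[1:])
        for c, h in zip(centers, hist):
            print(f"amplitude {c:+.2f}: density {h:.2f}")  # peaks near +/-1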

  8. Optimal Reinsurance with Heterogeneous Reference Probabilities

    Directory of Open Access Journals (Sweden)

    Tim J. Boonen

    2016-07-01

    Full Text Available This paper studies the problem of optimal reinsurance contract design. We let the insurer use dual utility, and the premium is an extended Wang’s premium principle. The novel contribution is that we allow for heterogeneity in the beliefs regarding the underlying probability distribution. We characterize layer-reinsurance as an optimal reinsurance contract. Moreover, we characterize layer-reinsurance as optimal contracts when the insurer faces costs of holding regulatory capital. We illustrate this in cases where both firms use the Value-at-Risk or the conditional Value-at-Risk.

  9. Predicting Cumulative Incidence Probability by Direct Binomial Regression

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    Binomial modelling; cumulative incidence probability; cause-specific hazards; subdistribution hazard

  10. Nanotube Tunneling as a Consequence of Probable Discrete Trajectories

    Science.gov (United States)

    Robinson, Daryl C.

    2001-01-01

    It has been recently reported that the electrical charge in a semiconductive carbon nanotube is not evenly distributed, but is divided into charge "islands." A clear understanding of tunneling phenomena can be useful in elucidating the mechanism for electrical conduction in nanotubes. This paper represents the first attempt to shed light on the aforementioned phenomenon by viewing tunneling as a natural consequence of "discrete trajectories." The relevance of this analysis is that it may provide further insight into the higher rate of tunneling processes, which makes tunneling devices attractive. For particles impinging on a classically impenetrable barrier, the quantum mechanical result that the probability of detecting transmitted particles falls off exponentially is derived without wave theory. This paper should provide a basis for calculating the charge profile over the length of the tube so that the conductive properties of nanoscale devices may be fully exploited.
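
    For reference, the standard quantum mechanical result alluded to here can be illustrated numerically: the transmission probability through a rectangular barrier falls off roughly as T = exp(-2*kappa*L), with kappa = sqrt(2m(V-E))/hbar. The barrier height, energy and widths are illustrative values, not a nanotube model.

        import math

        HBAR = 1.054_571_817e-34  # J*s
        M_E = 9.109_383_7015e-31  # kg, electron mass
        EV = 1.602_176_634e-19    # J per eV

        V, E = 1.0 * EV, 0.5 * EV  # barrier height and particle energy
        kappa = math.sqrt(2.0 * M_E * (V - E)) / HBAR

        for L_nm in (0.5, 1.0, 2.0):
            L = L_nm * 1e-9
            T = math.exp(-2.0 * kappa * L)
            print(f"L = {L_nm} nm: T = {T:.3e}")  # exponential fall-off with L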

  11. Cosmological constraints from the convergence 1-point probability distribution

    CERN Document Server

    Patton, Kenneth; Honscheid, Klaus; Huff, Eric; Melchior, Peter; Ross, Ashley J; Suchyta, Eric

    2016-01-01

    We examine the cosmological information available from the 1-point probability distribution function (PDF) of the weak-lensing convergence field, utilizing fast L-PICOLA simulations and a Fisher analysis. We find competitive constraints in the $\Omega_m$-$\sigma_8$ plane from the convergence PDF with $188\ arcmin^2$ pixels compared to the cosmic shear power spectrum with an equivalent number of modes ($\ell < 886$). The convergence PDF also partially breaks the degeneracy that cosmic shear exhibits in that parameter space. A joint analysis of the convergence PDF and the shear 2-point function also reduces the impact of shape measurement systematics, to which the PDF is less susceptible, and improves the total figure of merit by a factor of $2-3$, depending on the level of systematics. Finally, we present a correction factor necessary for calculating the unbiased Fisher information from finite differences using a limited number of cosmological simulations.

  12. Experimental and theoretical lifetimes and transition probabilities in Sb I

    CERN Document Server

    Hartman, Henrik; Engström, Lars; Lundberg, Hans; Palmeri, Patrick; Quinet, Pascal; Biémont, Emile; 10.1103/PhysRevA.82.052512

    2010-01-01

    We present experimental atomic lifetimes for 12 levels in Sb I, of which seven are reported for the first time. The levels belong to the 5p$^2$($^3$P)6s $^{2}$P, $^{4}$P and 5p$^2$($^3$P)5d $^{4}$P, $^{4}$F and $^{2}$F terms. The lifetimes were measured using time-resolved laser-induced fluorescence. In addition, we report new calculations of transition probabilities in Sb I using a multiconfigurational Dirac-Hartree-Fock method. The physical model is tested through comparisons between theoretical and experimental lifetimes for the 5d and 6s levels. The lifetimes of the 5d $^4$F$_{3/2, 5/2, 7/2}$ levels (19.5, 7.8 and 54 ns, respectively) depend strongly on the $J$-value. This is explained by different degrees of level mixing for the different levels of the $^4$F term.

  13. Application of Risk Probability Evaluation Method to Offshore Platform Construction

    Institute of Scientific and Technical Information of China (English)

    YU Jianxing; TAN Zhendong

    2005-01-01

    Offshore project risk involves many influencing factors with complex relationships, and traditional methods cannot be used to evaluate the risk probability. To deal with this problem, a new method was developed by combining an improved technique for order preference by similarity to ideal solution (TOPSIS) method, the analytical hierarchy process (AHP) method and the network response surface method. The risk probability was calculated by network response surface analysis based on the state variable of a known event and its degree of membership. This quantification method was applied to an offshore platform project, the Bonan oil and gas field project in Bohai Bay, in June 2004. There were 7 sub-projects, each including 4 risk factors. The values of the 28 risk factors, ranging from 10⁻⁶ to 10⁻⁴, were obtained. This precision conforms to the international principle of as low as reasonably practicable (ALARP). The evaluation indicates that, among all risk factors, the comprehensive level of the construction group and the ability of technical personnel on the spot have relatively high values, so these two factors should be paid more attention to in offshore platform construction.

  14. Probability of failure of waste disposals sites in Žirovski vrh uranium mine

    Directory of Open Access Journals (Sweden)

    Tomaž Beguš

    2002-12-01

    Full Text Available The only uranium mine in Slovenia, Žirovski vrh, was closed in 1990 for economic reasons. After the closure, extensive decommissioning works began in the mine and its surroundings. Soon after the closure, a large landslide occurred at the mill tailings site, and the stability of the existing and alternative sites was recalculated. In these calculations I used the statistical scatter of the input variables and calculated the probability of failure of the sites.

  15. Collision strengths and transition probabilities for Co II infrared forbidden lines

    CERN Document Server

    Storey, P J; Sochi, Taha

    2016-01-01

    We calculate collision strengths and their thermally averaged Maxwellian values for electron excitation and de-excitation between the fifteen lowest levels of singly ionised cobalt, Co+, which give rise to emission lines in the near- and mid-infrared. Transition probabilities are also calculated and relative line intensities predicted for conditions typical of supernova ejecta. The diagnostic potential of the 10.52, 15.46 and 14.74 μm lines is briefly discussed.

  16. Determination of riverbank erosion probability using Locally Weighted Logistic Regression

    Science.gov (United States)

    Ioannidou, Elena; Flori, Aikaterini; Varouchakis, Emmanouil A.; Giannakis, Georgios; Vozinaki, Anthi Eirini K.; Karatzas, George P.; Nikolaidis, Nikolaos

    2015-04-01

    erosion occurrence probability can be calculated in conjunction with the model deviance regarding the independent variables tested. The most straightforward measure of goodness of fit is the G statistic. It is a simple and effective way to study and evaluate the efficiency of the Logistic Regression model and the reliability of each independent variable. The developed statistical model is applied to the Koiliaris River Basin on the island of Crete, Greece. Two datasets of river bank slope, river cross-section width and indications of erosion were available for the analysis (12 and 8 locations). Two different types of spatial dependence functions, exponential and tricubic, were examined to determine the local spatial dependence of the independent variables at the measurement locations. The results show a significant improvement when the tricubic function is applied, as the erosion probability is accurately predicted at all eight validation locations. Results for the model deviance show that cross-section width is more important than bank slope in the estimation of erosion probability along the Koiliaris riverbanks. The proposed statistical model is a useful tool that quantifies the erosion probability along the riverbanks and can be used to assist in managing erosion and flooding events. Acknowledgements: This work is part of an on-going THALES project (CYBERSENSORS - High Frequency Monitoring System for Integrated Water Resources Management of Rivers). The project has been co-financed by the European Union (European Social Fund - ESF) and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF) - Research Funding Program: THALES. Investing in knowledge society through the European Social Fund.
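
    A minimal sketch of locally weighted logistic regression with a tricubic spatial weight, w(d) = (1 - (d/dmax)^3)^3, so that observations near the prediction site dominate the fit. The site coordinates, predictors and bandwidth are synthetic illustrations, not the Koiliaris measurements.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)
        coords = rng.uniform(0.0, 10.0, size=(12, 2))   # site locations (km)
        X = rng.normal(size=(12, 2))                    # bank slope, width
        y = (X[:, 0] > np.median(X[:, 0])).astype(int)  # erosion indicator

        def erosion_probability(site, x_new, dmax=8.0):
            d = np.linalg.norm(coords - site, axis=1)
            w = np.clip(1.0 - (d / dmax) ** 3, 0.0, None) ** 3  # tricubic weights
            model = LogisticRegression().fit(X, y, sample_weight=w)
            return model.predict_proba(np.array([x_new]))[0, 1]

        p = erosion_probability(np.array([5.0, 5.0]), [0.3, -0.1])
        print(f"P(erosion) = {p:.2f}")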

  17. Probability Density and CFAR Threshold Estimation for Hyperspectral Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Clark, G A

    2004-09-21

    implementations to compute the decision threshold r0* that will provide the desired Probability of False Alarm P_FA for the matched filter. The goal is to use prior knowledge of the background data to generate an estimate of the probability density function (pdf) [13] of the matched filter threshold r for the case in which the data measurement contains only background data (we call this case the null hypothesis, or H0) [10, 11]. We call the pdf estimate f̂(r|H0). In this report, we use histograms and Parzen pdf estimators [14, 15, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27]. Once the estimate is obtained, it can be integrated to compute an estimate of P_FA as a function of the matched filter detection threshold r. We can then interpolate r vs. P_FA to obtain a curve that gives the threshold r0* that will provide the desired Probability of False Alarm P_FA for the matched filter. Processing results have been computed using both simulated and real LASI data sets. The algorithms and codes have been validated, and the results using LASI data are presented here. Future work includes applying the pdf estimation and CFAR threshold calculation algorithms to the LASI matched filter based upon global background statistics, and developing a new adaptive matched filter algorithm based upon local background statistics. Another goal is to implement the 4-Gamma pdf modeling method proposed by Stocker et al. [4] and to compare results using histograms and the Parzen pdf estimators.
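
    A minimal sketch of the threshold recipe described above: estimate f̂(r|H0) with a histogram of background-only matched-filter outputs, integrate the tail to obtain P_FA(r), and interpolate to find the threshold r0* at a desired false-alarm rate. The background samples here are synthetic Gaussian draws, not LASI data.

        import numpy as np

        rng = np.random.default_rng(7)
        r_background = rng.normal(0.0, 1.0, 200_000)  # H0 matched-filter outputs

        # Histogram pdf estimate f(r|H0), then right-tail integral P_FA(r).
        hist, edges = np.histogram(r_background, bins=200, density=True)
        widths = np.diff(edges)
        pfa = np.concatenate(([1.0], 1.0 - np.cumsum(hist * widths)))

        desired_pfa = 1e-3
        # Interpolate on the (reversed, increasing) P_FA vs. threshold curve.
        r0 = np.interp(desired_pfa, pfa[::-1], edges[::-1])
        print(f"threshold r0* = {r0:.3f} for P_FA = {desired_pfa}")
        # Sanity check against the empirical false-alarm rate:
        print(f"empirical P_FA = {(r_background > r0).mean():.4f}")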

  18. Core calculations of JMTR

    Energy Technology Data Exchange (ETDEWEB)

    Nagao, Yoshiharu [Japan Atomic Energy Research Inst., Oarai, Ibaraki (Japan). Oarai Research Establishment

    1998-03-01

    In material testing reactors like the JMTR (Japan Material Testing Reactor, 50 MW) of the Japan Atomic Energy Research Institute, the neutron flux and neutron energy spectra of irradiated samples show complex distributions. It is necessary to assess the neutron flux and neutron energy spectra of an irradiation field by carrying out a nuclear calculation of the core for every operation cycle. In order to improve core calculation, the application of MCNP to the assessment of core reactivity and of neutron flux and spectra has been investigated in the JMTR. In this study, in order to reduce the computation time and the variance, calculations using the K code and a fixed source, and calculations using the Weight Window technique, were compared. As to the calculation method, the modeling of the whole JMTR core, the conditions for the calculation and the adopted variance reduction technique are explained, and the results of the calculations are shown. No significant difference was observed in the calculated neutron fluxes arising from the different modeling of the fuel region in the K-code and fixed-source calculations. The method of assessing the results of the neutron flux calculation is also described. (K.I.)

  19. Applied probability and stochastic processes. 2. ed.

    Energy Technology Data Exchange (ETDEWEB)

    Feldman, Richard M. [Texas A and M Univ., College Station, TX (United States). Industrial and Systems Engineering Dept.; Valdez-Flores, Ciriaco [Sielken and Associates Consulting, Inc., Bryan, TX (United States)

    2010-07-01

    This book presents applied probability and stochastic processes in an elementary but mathematically precise manner, with numerous examples and exercises to illustrate the range of engineering and science applications of the concepts. The book is designed to give the reader an intuitive understanding of probabilistic reasoning, in addition to an understanding of mathematical concepts and principles. The initial chapters present a summary of probability and statistics and then Poisson processes, Markov chains, Markov processes and queuing processes are introduced. Advanced topics include simulation, inventory theory, replacement theory, Markov decision theory, and the use of matrix geometric procedures in the analysis of queues. Included in the second edition are appendices at the end of several chapters giving suggestions for the use of Excel in solving the problems of the chapter. Also new in this edition are an introductory chapter on statistics and a chapter on Poisson processes that includes some techniques used in risk assessment. The old chapter on queues has been expanded and broken into two new chapters: one for simple queuing processes and one for queuing networks. Support is provided through the web site http://apsp.tamu.edu where students will have the answers to odd numbered problems and instructors will have access to full solutions and Excel files for homework. (orig.)

  20. Probability-consistent spectrum and code spectrum

    Institute of Scientific and Technical Information of China (English)

    沈建文; 石树中

    2004-01-01

    In the seismic safety evaluation (SSE) for key projects, the probability-consistent spectrum (PCS), usually obtained from probabilistic seismic hazard analysis (PSHA), is not consistent with the design response spectrum given by the Code for Seismic Design of Buildings (GB50011-2001); sometimes there may be a remarkable difference between them. If the PCS is lower than the corresponding code design response spectrum (CDS), the seismic fortification criterion for key projects would be lower than that for general industrial and civil buildings. In this paper, the relation between PCS and CDS is discussed using an idealized simple potential seismic source. The results show that in most areas influenced mainly by potential sources of epicentral and regional earthquakes, the PCS is generally lower than the CDS at long periods. We point out that the long-period response spectra of the code should be studied further and combined with the probability method of seismic zoning as much as possible. Because of the uncertainties in SSE, it should be prudent to use the long-period response spectra given by SSE for key projects when they are lower than the CDS.