WorldWideScience

Sample records for calculating age-conditional probabilities

  1. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Probability calculation. 1.1623 Section 1.1623... Mass Media Services General Procedures § 1.1623 Probability calculation. (a) All calculations shall be computed to no less than three significant digits. Probabilities will be truncated to the number...
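
    The rule's requirement, compute to at least three significant digits and then truncate, can be sketched as follows. This is a minimal illustration of truncation to significant digits, not the regulatory procedure itself, and the helper name `truncate_sig` is ours:

```python
import math

def truncate_sig(x: float, sig: int = 3) -> float:
    """Truncate (not round) a value to `sig` significant digits.

    Illustrative sketch of the kind of rule 47 CFR 1.1623 describes; the
    exact regulatory procedure should be taken from the rule text itself.
    """
    if x == 0.0:
        return 0.0
    # Position of the most significant digit.
    exp = math.floor(math.log10(abs(x)))
    factor = 10 ** (sig - 1 - exp)
    return math.trunc(x * factor) / factor
```

    Note that truncation, unlike rounding, never increases the reported probability: 0.123456 becomes 0.123, and 0.999999 becomes 0.999 rather than 1.000.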

  2. Validation of fluorescence transition probability calculations

    OpenAIRE

    M. G. Pia (INFN, Sezione di Genova); P. Saracco (INFN, Sezione di Genova); Manju Sudhaka (INFN, Sezione di Genova)

    2015-01-01

    A systematic and quantitative validation of the K and L shell X-ray transition probability calculations according to different theoretical methods has been performed against experimental data. This study is relevant to the optimization of data libraries used by software systems, namely Monte Carlo codes, dealing with X-ray fluorescence. The results support the adoption of transition probabilities calculated according to the Hartree-Fock approach, which manifest better agreement with experimen...

  3. Necessity of Exact Calculation for Transition Probability

    Institute of Scientific and Technical Information of China (English)

    LIU Fu-Sui; CHEN Wan-Fang

    2003-01-01

    This paper shows that exact calculation of the transition probability can make some systems deviate seriously from the Fermi golden rule. It also shows that the corresponding exact calculation of the phonon-induced hopping rate for deuterons in the Pd-D system with many-body electron screening, proposed by Ichimaru, can explain the experimental facts observed in the Pd-D system, and it predicts that the perfection and low dimensionality of the Pd lattice are very important for the phonon-induced hopping-rate enhancement in the Pd-D system.

  4. Calculation of radiative transition probabilities and lifetimes

    Science.gov (United States)

    Zemke, W. T.; Verma, K. K.; Stwalley, W. C.

    1982-01-01

    Procedures for calculating bound-bound and bound-continuum (free) radiative transition probabilities and radiative lifetimes are summarized. Calculations include rotational dependence and R-dependent electronic transition moments (no Franck-Condon or R-centroid approximation). Detailed comparisons of theoretical results with experimental measurements are made for bound-bound transitions in the A-X systems of LiH and Na2. New bound-free results are presented for LiH. New bound-free results and comparisons with very recent fluorescence experiments are presented for Na2.

  5. Calculating nuclear accident probabilities from empirical frequencies

    OpenAIRE

    Ha-Duong, Minh; Journé, V.

    2014-01-01

    Since there is no authoritative, comprehensive and public historical record of nuclear power plant accidents, we reconstructed a nuclear accident data set from peer-reviewed and other literature. We found that, in a sample of five random years, the worldwide historical frequency of a major nuclear accident, defined as an INES level 7 event, is 14%. The probability of at least one nuclear accident rated at level ≥4 on the INES scale is 67%. These numbers are subject...
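
    Under the simplifying assumption of independent years with a fixed annual accident probability, the probability of at least one event in an n-year window follows directly. The function below is our sketch, and the numbers in the example are hypothetical, not taken from the paper:

```python
def prob_at_least_one(annual_p: float, years: int) -> float:
    """P(at least one event in `years` independent years), assuming a
    constant probability `annual_p` of at least one event in any single year."""
    return 1.0 - (1.0 - annual_p) ** years

# Hypothetical example: a 3% annual probability over a 5-year window.
print(prob_at_least_one(0.03, 5))  # ~0.141
```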

  6. Computational methods for probability of instability calculations

    Science.gov (United States)

    Wu, Y.-T.; Burnside, O. H.

    1990-01-01

    This paper summarizes the development of methods and a computer program to compute the probability of instability of a dynamic system that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria, based on the roots of the characteristic equation or on Routh-Hurwitz test functions, are investigated. Computational methods based on system reliability analysis methods and importance sampling concepts are proposed to perform efficient probabilistic analysis. Numerical examples are provided to demonstrate the methods.

  7. A Method for Calculating Collision Probability Between Space Objects

    OpenAIRE

    Xu, Xiaoli; Xiong, Yongqing

    2013-01-01

    A method is developed to calculate collision probability in this paper. Based on the encounter geometric features of space objects, it is reasonable to separate the radial orbital motions from that in the cross section for most encounter events in near circular orbit. Therefore, the collision probability caused by orbit altitude difference in the radial direction and the collision probability caused by arrival time difference in the cross section are calculated respectively. The net collision...

  8. Calculation Model and Simulation of Warship Damage Probability

    Institute of Scientific and Technical Information of China (English)

    TENG Zhao-xin; ZHANG Xu; YANG Shi-xing; ZHU Xiao-ping

    2008-01-01

    The combat efficiency of mine obstacles is the focus of the present research. Based on the main factors affecting the target-warship damage probability, such as the features of mines with maneuverability, the success rate of mine-laying, the hit probability, mine reliability and action probability, a calculation model of the target warship's mine-encounter probability is put forward, under the condition that the route selection of target warships follows a uniform distribution and their course follows a normal distribution. A damage probability model of maneuverable mines against target warships is then set up, and a simulation shows the model to be highly practical.

  9. A Method for Calculating Collision Probability Between Space Objects

    CERN Document Server

    Xu, Xiaoli

    2013-01-01

    A method is developed to calculate collision probability in this paper. Based on the encounter geometric features of space objects, it is reasonable to separate the radial orbital motions from that in the cross section for most encounter events in near circular orbit. Therefore, the collision probability caused by orbit altitude difference in the radial direction and the collision probability caused by arrival time difference in the cross section are calculated respectively. The net collision probability is expressed as an explicit expression by multiplying the above two components. Numerical cases are applied to test this method by comparing the results with the general method. The results indicate that this method is valid for most near circular orbital encounter events.
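
    The decoupling described in the abstract can be illustrated with one-dimensional Gaussian components: each component is the probability that the separation in that direction falls within the combined hard-body radius, and the net probability is their product. This is a sketch under an independence assumption, not the authors' exact formulation:

```python
import math

def component_probability(miss: float, sigma: float, radius: float) -> float:
    """P(|X| <= radius) for X ~ Normal(miss, sigma^2): the chance that the
    separation in one decoupled direction is within the combined radius."""
    def Phi(t: float) -> float:
        # Standard normal CDF via the error function.
        return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))
    return Phi((radius - miss) / sigma) - Phi((-radius - miss) / sigma)

def net_collision_probability(p_radial: float, p_cross: float) -> float:
    """Net probability as the product of the radial (altitude-difference) and
    cross-section (arrival-time-difference) components, assumed independent."""
    return p_radial * p_cross
```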

  10. Calculating the probability of detecting radio signals from alien civilizations

    CERN Document Server

    Horvat, Marko

    2006-01-01

    Although it might not be self-evident, it is in fact entirely possible to calculate the probability of detecting alien radio signals by understanding what types of extraterrestrial radio emissions can be expected and what properties these emissions can have. Using the Drake equation as the obvious starting point, and logically identifying and enumerating constraints of interstellar radio communications can yield the probability of detecting a genuine alien radio signal.

  11. Calculating state-to-state transition probabilities within TDDFT

    OpenAIRE

    Rohringer, Nina; Peter, Simone; Burgdörfer, Joachim

    2005-01-01

    The determination of the elements of the S-matrix within the framework of time-dependent density-functional theory (TDDFT) has remained a widely open question. We explore two different methods to calculate state-to-state transition probabilities. The first method closely follows the extraction of the S-matrix from the time-dependent Hartree-Fock approximation. This method suffers from cross-channel correlations resulting in oscillating transition probabilities in the asymptotic channels. An a...

  12. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    The purpose of this article is to give a new approach to calculating the probability of returning a loan. Many factors affect the value of this probability, and several influencing factors are established here using statistical and econometric models. The main approach applies probit and logit models in loan-management institutions, giving a new aspect of credit risk analysis. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable "the probability of returning a loan". It is shown that the month of signing a contract, the year of signing a contract, and the gender and age of the loan owner do not affect the probability of returning a loan. It is shown that the probability of returning a loan depends on the sum of the contract, the remoteness of the loan owner and the month of birth. The probability of returning a loan increases with the given sum, decreases with the proximity of the customer, increases for people born at the beginning of the year and decreases for people born at the end of the year.
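
    Once a logit model of this kind has been fitted, converting a borrower's attributes into a repayment probability is just a sigmoid of the linear predictor. The coefficients and feature values below are hypothetical placeholders, not estimates from the article:

```python
import math

def logit_prob(intercept: float, coefs: list, x: list) -> float:
    """P(y = 1 | x) under a fitted binary logit model:
    the sigmoid of the linear predictor intercept + coefs . x."""
    z = intercept + sum(b * xi for b, xi in zip(coefs, x))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients for (loan sum, distance to customer, birth month).
p_repay = logit_prob(-0.5, [0.002, -0.01, -0.03], [500.0, 12.0, 6.0])
```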

  13. Fostering Positive Attitude in Probability Learning Using Graphing Calculator

    Science.gov (United States)

    Tan, Choo-Kim; Harji, Madhubala Bava; Lau, Siong-Hoe

    2011-01-01

    Although a plethora of research evidence highlights positive and significant outcomes of the incorporation of the Graphing Calculator (GC) in mathematics education, its use in the teaching and learning process appears to be limited. The obvious need to revisit the teaching and learning of Probability has resulted in this study, i.e. to incorporate…

  14. Revised transition probabilities for Fe XXV: Relativistic CI calculations

    International Nuclear Information System (INIS)

    Revised data are provided for transition probabilities between fine-structure components of levels with n ≤ 6 in Fe XXV. Earlier published data for transitions between fine-structure levels in Fe XXV are found to be in error, especially for certain classes of transitions. The purpose of the present note is to provide a corrected database for transitions in Fe XXV. Wavefunctions and energies for states with n ≤ 6 and J = 0, 1, 2, 3 are determined using a relativistic configuration interaction (CI) expansion that includes the Breit interaction. To measure and control the numerical accuracy of the calculations, we compare our CI energies and matrix elements with values calculated using relativistic second-order many-body perturbation theory (MBPT), also including the Breit interaction. We obtain good agreement between our CI and MBPT calculations but disagree with earlier calculations for transitions with ΔL = 2 and for intercombination transitions (ΔS = 1). We provide wavelengths, line strengths, and transition rates for fine-structure transitions between levels with n ≤ 6 in Fe XXV.

  15. CALCULATION OF PER PARCEL PROBABILITY FOR DUD BOMBS IN GERMANY

    Directory of Open Access Journals (Sweden)

    S. M. Tavakkoli Sabour

    2014-10-01

    Unexploded aerial bombs, also known as duds or unfused bombs, left over from the bombardments of past wars remain explosive for decades under the earth's surface, threatening civil activities, especially where dredging work is involved. Interpretation of aerial photos taken shortly after bombardments has proven useful for finding duds. Unfortunately, the reliability of this method is limited by several factors. The chance of finding a dud on an aerial photo depends strongly on the photography system, the size of the bomb and the land cover. On the other hand, exploded bombs are considerably easier to detect on aerial photos and reliably represent the extent and density of a bombardment. Assuming an empirical quota of unfused bombs, the expected number of duds can be calculated from the number of exploded bombs. This can help to obtain a better cost-risk ratio and to classify areas for clearance. This article describes a method for calculating a per-parcel probability of dud bombs according to the distribution and density of exploded bombs. No similar work has been reported in this field by other authors.
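
    The core calculation, inferring the expected number of duds in a parcel from the exploded bombs counted there and an empirical dud quota, can be sketched with a Poisson model. The quota handling below (quota defined as the dud fraction of all bombs dropped) and the function name are our assumptions for illustration:

```python
import math

def parcel_dud_probability(exploded_in_parcel: int, dud_quota: float) -> float:
    """P(at least one dud in a parcel), given the count of exploded bombs
    detected there and an empirical dud quota (fraction of all dropped
    bombs that failed to explode). Expected duds follow from
    total dropped ~ exploded / (1 - quota); a Poisson model gives P(>=1)."""
    expected_duds = exploded_in_parcel * dud_quota / (1.0 - dud_quota)
    return 1.0 - math.exp(-expected_duds)

# Hypothetical parcel: 9 exploded bombs identified, 10% dud quota.
p_dud = parcel_dud_probability(9, 0.1)  # ~0.632
```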

  16. Relative Velocity as a Metric for Probability of Collision Calculations

    Science.gov (United States)

    Frigm, Ryan Clayton; Rohrbaugh, Dave

    2008-01-01

    Collision risk assessment metrics, such as the probability of collision calculation, are based largely on assumptions about the interaction of two objects during their close approach. Specifically, the approach to probabilistic risk assessment can be performed more easily if the relative trajectories of the two close approach objects are assumed to be linear during the encounter. It is shown in this analysis that one factor in determining linearity is the relative velocity of the two encountering bodies, in that the assumption of linearity breaks down at low relative approach velocities. The first part of this analysis is the determination of the relative velocity threshold below which the assumption of linearity becomes invalid. The second part is a statistical study of conjunction interactions between representative asset spacecraft and the associated debris field environment to determine the likelihood of encountering a low relative velocity close approach. This analysis is performed for both the LEO and GEO orbit regimes. Both parts comment on the resulting effects to collision risk assessment operations.

  17. Calculation of paternity probabilities from multilocus DNA profiles.

    Science.gov (United States)

    Brenner, C H; Rittner, C; Schneider, P M

    1994-02-01

    We describe a procedure for evaluation of paternity evidence from multi-locus DNA probe patterns. A computer program abstracts a "+/-" notation description from the multilocus profile and then calculates a paternity index based on observed phenotypic fragment frequencies. The biostatistical evaluation considers only bands found in the child and missing from the mother--a simplified approach that is at once robust and conservative. Mutations are of course taken into account. Particular features lending objectivity to the interpretation include computer reading and matching decisions, and specific recognition and statistical compensation for ambiguities ("faint orphans").

  18. Calculation of transition probabilities using the multiconfiguration Dirac-Fock method

    International Nuclear Information System (INIS)

    The performance of the multiconfiguration Dirac-Fock (MCDF) method in calculating transition probabilities of atoms is reviewed. In general, the MCDF wave functions will lead to transition probabilities accurate to ∼ 10% or better for strong, electric-dipole allowed transitions for small atoms. However, it is more difficult to get reliable transition probabilities for weak transitions. Also, some MCDF wave functions for a specific J quantum number may not reduce to the appropriate L and S quantum numbers in the nonrelativistic limit. Transition probabilities calculated from such MCDF wave functions for nonrelativistically forbidden transitions are unreliable. Remedies for such cases are discussed

  19. Calculation of transition probabilities using the multiconfiguration Dirac-Fock method

    International Nuclear Information System (INIS)

    The performance of the multiconfiguration Dirac-Fock (MCDF) method in calculating transition probabilities of atoms is reviewed. In general, the MCDF wave functions will lead to transition probabilities accurate to ∼10% or better for strong, electric-dipole allowed transitions for small atoms. However, it is more difficult to get reliable transition probabilities for weak transitions. Also, some MCDF wave functions for a specific J quantum number may not reduce to the appropriate L and S quantum numbers in the nonrelativistic limit. Transition probabilities calculated from such MCDF wave functions for nonrelativistically forbidden transitions are unreliable. Remedies for such cases are discussed.

  20. Duality-based calculations for transition probabilities in birth-death processes

    OpenAIRE

    Ohkubo, Jun

    2015-01-01

    Transition probabilities in birth-death processes are formulated via the corresponding dual birth-death processes. In order to obtain the corresponding dual processes, the Doi-Peliti formalism is employed. Conventional numerical evaluation enables us to obtain the transition probabilities from a fixed initial state; on the other hand, the duality relation gives us a useful method to calculate the transition probabilities to a fixed final state. Furthermore, it is clarified that the transition ...

  1. probably

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    [Usage] 1. He can probably tell us the truth. 2. Will it rain this afternoon? Probably. [Explanation] Used as an adverb meaning "probably, perhaps", it indicates a strong likelihood, usually a positive inference or judgment based on the present situation.

  2. Quantum dynamics calculation of reaction probability for H+Cl2→HCl+Cl

    Institute of Scientific and Technical Information of China (English)

    王胜龙; 赵新生

    2001-01-01

    We present in this paper a time-dependent quantum wave packet calculation of the initial state selected reaction probability for H + Cl2 based on the GHNS potential energy surface with total angular momentum J = 0. The effects of the translational, vibrational and rotational excitation of Cl2 on the reaction probability have been investigated. In a broad region of the translational energy, the rotational excitation enhances the reaction probability while the vibrational excitation depresses the reaction probability. The theoretical results agree well with the fact that it is an early down-hill reaction.

  3. Quantum dynamics calculation of reaction probability for H+Cl2→HCl+Cl

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    We present in this paper a time-dependent quantum wave packet calculation of the initial state selected reaction probability for H + Cl2 based on the GHNS potential energy surface with total angular momentum J = 0. The effects of the translational, vibrational and rotational excitation of Cl2 on the reaction probability have been investigated. In a broad region of the translational energy, the rotational excitation enhances the reaction probability while the vibrational excitation depresses the reaction probability. The theoretical results agree well with the fact that it is an early down-hill reaction.

  4. The risk of major nuclear accident: calculation and perception of probabilities

    International Nuclear Information System (INIS)

    Before the Fukushima accident, eight major accidents had already occurred in nuclear power plants, a number higher than experts expected and rather close to the public's perception of the risk. The author discusses how to understand these differences and reconcile observations, the objective probability of accidents and subjective assessments of risk; why experts have been over-optimistic; whether public opinion is irrational regarding nuclear risk; and how to measure risk and its perception. He thus addresses the following issues: risk calculation (cost, calculated frequency of major accidents, bias between the number of observed accidents and model predictions), perceived probabilities and aversion to disasters (perception biases of probability, perception biases unfavourable to nuclear), and the Bayesian contribution and its application (the Bayes-Laplace law, statistics, choice of an a priori probability, prediction of the next event, the probability of a core meltdown tomorrow)
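
    The Bayes-Laplace machinery mentioned at the end can be illustrated by the rule of succession: with a uniform prior on the accident probability, observing k events in n trials gives a posterior predictive probability of (k+1)/(n+2) for the next trial. The numbers in the example are hypothetical, not the book's:

```python
def laplace_rule(successes: int, trials: int) -> float:
    """Bayes-Laplace rule of succession: posterior predictive probability
    of an event on the next trial, under a uniform Beta(1,1) prior."""
    return (successes + 1) / (trials + 2)

# Hypothetical: 8 events observed over 60 observation years.
p_next = laplace_rule(8, 60)  # 9/62, ~0.145
```

    With no data at all, the rule returns 1/2, which is exactly the uniform prior's prediction; each observation then pulls the estimate toward the empirical frequency.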

  5. Notes on Critical Assessment of Theoretical Calculations of Atomic Structure and Transition Probabilities

    OpenAIRE

    Hyun-Kyung Chung; Per Jönsson; Alexander Kramida

    2013-01-01

    Atomic structure and transition probabilities are fundamental physical data required in many fields of science and technology. Atomic physics codes are freely available to other community users to generate atomic data for their interest, but the quality of these data is rarely verified. This special issue addresses estimation of uncertainties in atomic structure and transition probability calculations, and discusses methods and strategies to assess and ensure the quality of theoretical atomic...

  6. 'PRIZE': A program for calculating collision probabilities in R-Z geometry

    International Nuclear Information System (INIS)

    PRIZE is an IBM7090 program which computes collision probabilities for systems with axial symmetry and outputs them on cards in suitable format for the PIP1 program. Its method of working, data requirements, output, running time and accuracy are described. The program has been used to compute non-escape (self-collision) probabilities of finite circular cylinders, and a table is given by which non-escape probabilities of slabs, finite and infinite circular cylinders, infinite square cylinders, cubes, spheres and hemispheres may quickly be calculated to 1/2% or better. (author)

  7. Torpedo's Search Trajectory Design Based on Acquisition and Hit Probability Calculation

    Institute of Scientific and Technical Information of China (English)

    LI Wen-zhe; ZHANG Yu-wen; FAN Hui; WANG Yong-hu

    2008-01-01

    Focusing on the search trajectory characteristics of lightweight torpedoes launched against warships, and by analyzing commonly used torpedo search trajectories, a better search trajectory is designed, a mathematical model is built, and a simulation calculation taking the MK46 torpedo as an example is carried out. The results show that this method can increase the acquisition probability and hit probability by about 10%-30% in some situations and is feasible for torpedo trajectory design. The research is of great reference value for acoustic homing torpedo trajectory design and torpedo combat efficiency research.

  8. A semi-mechanistic approach to calculate the probability of fuel defects

    International Nuclear Information System (INIS)

    In this paper the authors describe the status of a semi-mechanistic approach to calculating the probability of fuel defects. This approach expresses the defect probability in terms of fundamental parameters such as local stresses, local strains, and fission product concentration. The calculations of defect probability continue to reflect the influences of the conventional parameters like power ramp, burnup and CANLUB. In addition, the new approach provides a mechanism to account for the impacts of additional factors involving detailed fuel design and reactor operation, for example pellet density, pellet shape and size, sheath diameter and thickness, pellet/sheath clearance, and coolant temperature and pressure. The approach has been validated against a previous empirical correlation. An illustrative example shows how the defect thresholds are influenced by changes in the internal design of the element and in the coolant pressure. (Author) (7 figs., tab., 12 refs.)

  9. Theoretical calculation of the rotational excitation probability of the lithium chloride molecule in terahertz frequency combs

    International Nuclear Information System (INIS)

    We investigated how the pulse parameters of optical frequency combs affect the rotational excitation probability of the lithium chloride (7Li37Cl) molecule. Time evolution of the rotational population distribution was calculated by the close-coupling method. It was confirmed that the rotational excitation is restricted owing to the centrifugal distortion of the rotating molecule. (author)

  10. Calculation of Quantum Probability in O(2,2) String Cosmology with a Dilaton Potential

    Institute of Scientific and Technical Information of China (English)

    YAN Jun

    2006-01-01

    The quantum properties of O(2,2) string cosmology with a dilaton potential are studied in this paper. The cosmological solutions are obtained on three-dimensional space-time. Moreover, the quantum probability of transition between the two dual universes is calculated through a Wheeler-De Witt approach.

  11. PABS: A Computer Program to Normalize Emission Probabilities and Calculate Realistic Uncertainties

    International Nuclear Information System (INIS)

    The program PABS normalizes relative particle emission probabilities to an absolute scale and calculates the relevant uncertainties on this scale. The program is written in Java using the JDK 1.6 library. For additional information about system requirements, the code itself, and compiling from source, see the README file distributed with this program. The mathematical procedures used are given.
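
    The normalization step such a program performs can be sketched as follows. This is a deliberately simplified version: it only rescales the relative emission probabilities and their quoted uncertainties by the normalization constant, whereas a realistic treatment (as in PABS) must also propagate the uncertainty of that constant itself:

```python
def normalize_emissions(relative: list, sigmas: list):
    """Rescale relative particle emission probabilities so they sum to 1,
    scaling each quoted uncertainty by the same normalization factor.

    Simplified sketch: ignores the uncertainty of the normalization
    constant and any correlations between the measured relative values."""
    total = sum(relative)
    probs = [r / total for r in relative]
    errs = [s / total for s in sigmas]
    return probs, errs

# Hypothetical relative intensities with 10% uncertainties.
probs, errs = normalize_emissions([30.0, 60.0, 10.0], [3.0, 6.0, 1.0])
```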

  12. Calculation of the escape probabilities of Fe XVII resonance lines for the Voigt profile

    Institute of Scientific and Technical Information of China (English)

    Jian HE; Qing-guo ZHANG

    2008-01-01

    Using the Voigt profile we obtained, we calculate the escape probabilities of Fe XVII resonance lines at 15.02, 13.28, 12.12, 11.13, 11.02 and 10.12 A for optically thick plasma, for both slab and cylindrical geometry. The oscillator strength, the number density of the absorbing atoms in the ground state, and the optical depth at the line center are discussed in this calculation. Results show that the escape probabilities for the slab geometry are larger than those for the cylindrical geometry. This calculation is useful for the study of the Fe XVII resonance lines.

  13. Accurate multiconfiguration Dirac–Hartree–Fock calculations of transition probabilities for magnesium-like ions

    International Nuclear Information System (INIS)

    Results from multiconfiguration Dirac–Hartree–Fock (MCDHF) and relativistic configuration interaction (RCI) calculations are presented for the n=3 to n′=3 transitions in the Mg isoelectronic sequence. The calculated values for the lowest 35 levels including core–valence correlation are found to be similar and to compare very well with other theoretical and experimental values. The Breit interaction and leading quantum electrodynamic effects are included as perturbations. The calculations can provide useful data for the experimental study of determining the fine structure levels in future work. - Highlights: • Multiconfiguration Dirac–Hartree–Fock (MCDHF) and relativistic configuration interaction calculations were used. • The valence–valence and core–valence correlations are considered. • Energy levels and transition probabilities are calculated for 35 levels of magnesium-like ions. • Detailed QED and total energies for four configurations are presented

  14. Corrections to vibrational transition probabilities calculated from a three-dimensional model.

    Science.gov (United States)

    Stallcop, J. R.

    1972-01-01

    Corrections to the collision-induced vibration transition probability calculated by Hansen and Pearson from a three-dimensional semiclassical model are examined. These corrections come from the retention of higher order terms in the expansion of the interaction potential and the use of the actual value of the deflection angle in the calculation of the transition probability. It is found that the contribution to the transition cross section from previously neglected potential terms can be significant for short range potentials and for the large relative collision velocities encountered at high temperatures. The correction to the transition cross section obtained from the use of actual deflection angles will not be appreciable unless the change in the rotational quantum number is large.

  15. Efficient Probability of Failure Calculations for QMU using Computational Geometry LDRD 13-0144 Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Scott A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ebeida, Mohamed Salah [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Romero, Vicente J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rushdi, Ahmad A. [Univ. of Texas, Austin, TX (United States); Abdelkader, Ahmad [Univ. of Maryland, College Park, MD (United States)

    2015-09-01

    This SAND report summarizes our work on the Sandia National Laboratory LDRD project titled "Efficient Probability of Failure Calculations for QMU using Computational Geometry" which was project #165617 and proposal #13-0144. This report merely summarizes our work. Those interested in the technical details are encouraged to read the full published results, and contact the report authors for the status of the software and follow-on projects.

  16. The Calculations of Oscillator Strengths and Transition Probabilities for Atomic Fluorine

    OpenAIRE

    ÇELİK, Gültekin; KILIÇ, H. Şükür; Akin, Erhan

    2006-01-01

    Oscillator strengths for transitions between individual lines belonging to some doublet and quartet terms, and multiplet transition probabilities of atomic fluorine have been calculated using weakest bound electron potential model theory (WBEPMT). In the determination of relevant parameters, we employed numerical non-relativistic Hartree-Fock (NRHF) wave functions for expectation values of radii and the necessary energy values have been taken from experimental energy data in the liter...

  17. EROS --- automated software system for ephemeris calculation and estimation of probability domain (Abstract)

    Science.gov (United States)

    Skripnichenko, P.; Galushina, T.; Loginova, M.

    2015-08-01

    This work describes the software EROS (Ephemeris Research and Observation Services), which is being developed jointly by the astronomy departments of Ural Federal University and Tomsk State University. The software provides ephemeris support for positional observations. Its most interesting feature is the automation of the entire preparation process for observations, from determining the night duration to calculating ephemerides and forming an observation schedule. The accuracy of an ephemeris calculation depends mostly on the precision of the initial data, which is determined by the errors of the observations used to derive the orbital elements. If an object has only a small number of observations spread over a short orbital arc, it is necessary to calculate not only the nominal orbit but the probability domain as well. In this paper, a review ephemeris is understood as a region on the celestial sphere calculated from the probability domain. EROS provides the functionality needed to estimate review ephemerides. This work contains a description of the software system and results of its use.

  18. Impact of temporal probability in 4D dose calculation for lung tumors.

    Science.gov (United States)

    Rouabhi, Ouided; Ma, Mingyu; Bayouth, John; Xia, Junyi

    2015-11-08

    The purpose of this study was to evaluate the dosimetric uncertainty in 4D dose calculation using three temporal probability distributions: uniform distribution, sinusoidal distribution, and patient-specific distribution derived from the patient respiratory trace. Temporal probability, defined as the fraction of time a patient spends in each respiratory amplitude, was evaluated in nine lung cancer patients. Four-dimensional computed tomography (4D CT), along with deformable image registration, was used to compute 4D dose incorporating the patient's respiratory motion. First, the dose of each of 10 phase CTs was computed using the same planning parameters as those used in 3D treatment planning based on the breath-hold CT. Next, deformable image registration was used to deform the dose of each phase CT to the breath-hold CT using the deformation map between the phase CT and the breath-hold CT. Finally, the 4D dose was computed by summing the deformed phase doses using their corresponding temporal probabilities. In this study, 4D dose calculated from the patient-specific temporal probability distribution was used as the ground truth. The dosimetric evaluation metrics included: 1) 3D gamma analysis, 2) mean tumor dose (MTD), 3) mean lung dose (MLD), and 4) lung V20. For seven out of nine patients, both uniform and sinusoidal temporal probability dose distributions were found to have an average gamma passing rate > 95% for both the lung and PTV regions. Compared with 4D dose calculated using the patient respiratory trace, doses using uniform and sinusoidal distribution showed a percentage difference on average of -0.1% ± 0.6% and -0.2% ± 0.4% in MTD, -0.2% ± 1.9% and -0.2% ± 1.3% in MLD, 0.09% ± 2.8% and -0.07% ± 1.8% in lung V20, -0.1% ± 2.0% and 0.08% ± 1.34% in lung V10, 0.47% ± 1.8% and 0.19% ± 1.3% in lung V5, respectively. We concluded that four-dimensional dose computed using either a uniform or sinusoidal temporal probability distribution can
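
    The final step described above, summing the deformed phase doses with their temporal probabilities, reduces per voxel to a weighted average. A minimal sketch, where the function name and the uniform example weights are ours:

```python
def temporal_weighted_dose(phase_doses, weights):
    """4D dose at one voxel: deformed per-phase doses summed with the
    temporal probability (fraction of breathing time) of each phase."""
    assert abs(sum(weights) - 1.0) < 1e-9, "temporal probabilities must sum to 1"
    return sum(d * w for d, w in zip(phase_doses, weights))

# Uniform temporal probability across 10 respiratory phases.
dose = temporal_weighted_dose([2.0] * 10, [0.1] * 10)
```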

  19. A semiclassical model for the calculation of nonadiabatic transition probabilities for classically forbidden transitions.

    Science.gov (United States)

    Dang, Phuong-Thanh; Herman, Michael F

    2009-02-01

    A semiclassical surface hopping model is presented for the calculation of nonadiabatic transition probabilities for the case in which the avoided crossing point lies in the classically forbidden region. The exact potentials and coupling are replaced with simple functional forms that are fitted to the values of the Born-Oppenheimer potentials, the nonadiabatic coupling, and their first few derivatives, evaluated at the turning point in the classical motion. For the one-dimensional model considered, reasonably accurate results for transition probabilities are obtained down to around 10^-10. The possible extension of this model to many-dimensional problems is discussed. The fact that the model requires only information at the turning point, a point that the trajectories do encounter, would be a significant advantage in many-dimensional problems over Landau-Zener-type models, which require information at the avoided crossing seam, which lies in the forbidden region where the trajectories do not go.

  20. Calculations of hydrogen atom multiphoton energy level shifts, transition amplitudes and ionization probabilities

    International Nuclear Information System (INIS)

    Analyses of the resonant multiphoton ionization of atoms require knowledge of ac Stark energy shifts and of multiphoton, bound-to-bound state, transition amplitudes. In this paper, we consider the three-photon photoionization of hydrogen atoms at frequencies at and surrounding the two-photon 1s to 2s resonance. AC energy shift sums of both the 1s and 2s states are calculated as a function of the laser frequency, along with two-photon 1s → 2s resonant transition amplitude sums. These quantities are calculated using an extended version of an often-employed method in which the sums are expressed in terms of solutions to differential equations derived from the different sums being evaluated. We demonstrate how exact solutions are obtained to these differential equations, which lead to exact evaluations of the corresponding sums. A variety of cases are analysed: some involving analytic continuation, some involving real-number analysis and some involving complex-number analysis. A dc Stark sum calculation for the 2s state is carried out to illustrate the case where analytic continuation, pole isolation and pole subtraction are required and where the calculation can be carried out analytically; the 2s-state ac Stark shift sum calculations involve a case where no analytic continuation is required, but where the solution to the differential equation produces complex numbers owing to the finite photoionization lifetime of the 2s state. Results from these calculations are then used to calculate three-photon ionization probabilities of relevance to an analysis of the multiphoton ionization data published by Kyrala and Nichols (1991 Phys. Rev. A 44 R1450).

  1. Failure probability calculation of the energy supply of the Angra-1 reactor rod assembly

    International Nuclear Information System (INIS)

    This work analyses the electric power system of the Angra 1 PWR plant. It is demonstrated that this system is closely coupled with the engineered safety features, i.e., the equipment provided to prevent, limit, or mitigate the release of radioactive material and to permit safe reactor shutdown. Event trees are used to analyse the operation of those systems whose failure can lead to the release of radioactivity following a specified initiating event. The fault tree technique is used to calculate the failure probability of the on-site electric power system.
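
    The fault-tree combination of basic-event probabilities can be illustrated with a minimal sketch. The gate structure and numbers below are hypothetical, not taken from the Angra 1 model, and independence of basic events is assumed throughout.

```python
# Minimal fault-tree sketch: independent basic events combined through OR
# gates (failure if any input fails) and AND gates (failure only if all
# inputs fail, e.g. redundant power trains).

def p_or(*probs):
    """P(at least one input event occurs), assuming independence."""
    q = 1.0
    for p in probs:
        q *= 1.0 - p
    return 1.0 - q

def p_and(*probs):
    """P(all input events occur), assuming independence."""
    q = 1.0
    for p in probs:
        q *= p
    return q

# Hypothetical numbers: offsite grid fails with p = 1e-2 per demand; each of
# two redundant diesel generators fails to start with p = 0.1.
p_onsite = p_and(0.1, 0.1)          # both diesels fail
p_no_power = p_and(1e-2, p_onsite)  # offsite AND onsite power lost
print(p_no_power)                   # ~1e-4
```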

  2. Calculation of the number of Monte Carlo histories for a planetary protection probability of impact estimation

    Science.gov (United States)

    Barengoltz, Jack

    2016-07-01

    Monte Carlo (MC) is a common method to estimate probability, effectively by a simulation. For planetary protection, it may be used to estimate the probability of impact P_I of a protected planet by a launch vehicle (upper stage). The object of the analysis is to provide a value for P_I with a given level of confidence (LOC) that the true value does not exceed the maximum allowed value of P_I. In order to determine the number of MC histories required, one must also guess the maximum number of hits that will occur in the analysis. This extra parameter is needed because a LOC is desired. If more hits occur, the MC analysis would indicate that the true value may exceed the specification value with a higher probability than the LOC. (In the worst case, even the mean value of the estimated P_I might exceed the specification value.) After the analysis is conducted, the actual number of hits is, of course, the mean. The number of hits arises from a small probability per history and a large number of histories; these are the classic requirements for a Poisson distribution. For a known Poisson distribution (the mean is the only parameter), the probability for some interval in the number of hits is calculable; before the analysis, this is not possible. Fortunately, there are methods that can bound the unknown mean of a Poisson distribution. F. Garwood ["Fiducial limits for the Poisson distribution," Biometrika 28, 437-442 (1936)] published an appropriate method that uses the inverse of the chi-squared function (the integral chi-squared function would yield the probability α as a function of the mean μ and an observed value n). Garwood's formula for the upper and lower limits of the mean μ with two-tailed probability 1-α depends on the LOC α and an estimated number of "successes" n. In a MC analysis for planetary protection, only the upper limit is of interest, i.e., the single
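
    The bound can be sketched without special functions: the one-sided upper limit μ_up for an observed count n at confidence 1-α solves P(Poisson(μ_up) ≤ n) = α, which is equivalent to the inverse chi-squared form μ_up = χ²(1-α; 2(n+1))/2. The specification value of 1e-4 below is an illustrative assumption, not a number from the paper.

```python
import math

# One-sided upper confidence bound on a Poisson mean, found by bisection on
# the condition P(Poisson(mu) <= n) = alpha, then converted into a required
# number of Monte Carlo histories for a given impact-probability limit.

def poisson_cdf(n, mu):
    """P(X <= n) for X ~ Poisson(mu), summed term by term."""
    term = total = math.exp(-mu)
    for k in range(1, n + 1):
        term *= mu / k
        total += term
    return total

def poisson_upper_limit(n, alpha, hi=1e6):
    """Bisection: poisson_cdf(n, mu) decreases monotonically in mu."""
    lo = 0.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if poisson_cdf(n, mid) > alpha:
            lo = mid  # bound still too small
        else:
            hi = mid
    return 0.5 * (lo + hi)

def histories_required(n, alpha, p_max):
    """Histories needed so that n observed hits still bound P_I below p_max."""
    return math.ceil(poisson_upper_limit(n, alpha) / p_max)

# With zero hits the bound reduces to -ln(alpha) (the "rule of three" at 95%).
print(poisson_upper_limit(0, 0.05))       # ~2.996
print(histories_required(0, 0.05, 1e-4))  # ~30,000 histories
```

    Guessing a larger maximum hit count n before the run, as the abstract describes, simply increases the bound and hence the number of histories.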

  3. Theoretical Calculations of Transition Probabilities and Oscillator Strengths for Sc(Ⅲ) and Y(Ⅲ)

    Institute of Scientific and Technical Information of China (English)

    Tian-yi Zhang; Neng-wu Zheng

    2009-01-01

    The Weakest Bound Electron Potential Model theory is used to calculate transition probabilities and oscillator strengths for individual lines of Sc(Ⅲ) and Y(Ⅲ). In this method, the expressions for the energy eigenvalue and the radial function are obtained by solving the Schrödinger equation of the weakest bound electron, and a coupled equation is used to determine the parameters needed in the calculations. The results obtained for Sc(Ⅲ) in this work agree very well with the accepted values taken from the National Institute of Standards and Technology (NIST) database; most deviations are within the accepted level. For Y(Ⅲ), no accepted values are reported in the NIST database, so we compared our results with other theoretical results, and good agreement is also obtained.

  4. Direct calculation of the probability of pionium ionization in the target

    International Nuclear Information System (INIS)

    The goal of the DIRAC experiment at CERN is the lifetime measurement of pionium (the π+π- atom). Its lifetime is mainly defined by the charge-exchange process π+π- → π0π0. The value of the lifetime in the ground state is predicted in the framework of chiral perturbation theory with high precision: τ1S = (2.9 ± 0.1) x 10^-15 s. The method used by DIRAC is based on analysis of π+π--pair spectra with small relative momenta in their center-of-mass system in order to find the signal from pionium ionization (breakup) in the target. Pionium atoms are produced in proton-nucleus collisions and have relativistic velocities (γ > 10). For fixed values of the pionium momentum and the target thickness, the probability of pionium ionization in the target depends on its lifetime in a unique way; thus, the pionium lifetime can be deduced from the experimentally determined probability of pionium ionization. On the basis of ionization cross sections of pionium with target atoms, we perform the first direct calculation of the pionium ionization probability in the target.

  5. New energy levels, calculated lifetimes and transition probabilities in Xe IX

    Energy Technology Data Exchange (ETDEWEB)

    Gallardo, M; Raineri, M; Reyna Almandos, J [Centro de Investigaciones Opticas (CIOp), CC 3 (1897) Gonnet, La Plata (Argentina); Biemont, E [IPNAS, Universite de Liege, B15 Sart Tilman, B-4000 Liege (Belgium)

    2011-02-28

    Twenty-one new experimental energy levels belonging to the 4d⁹6p, 4d⁹4f and 4d⁹5f configurations of Xe IX are presented. They have been deduced from 75 newly classified lines involving the configurations 4d⁹5p, 4d⁹6p, 4d⁹4f, 4d⁹5f and 4d⁹5d, 4d⁹5s, 4d⁹6s for the odd and even parities, respectively. The radiative lifetimes of these levels, as well as the weighted oscillator strengths and transition probabilities for all the observed spectral lines, have been calculated with optimized parameters deduced from a least-squares fitting procedure applied in the framework of a relativistic Hartree-Fock method including core-polarization effects. The scale of transition probabilities has also been assessed through comparisons with lifetimes calculated using a relativistic multiconfigurational Dirac-Fock approach.

  7. Calculating inspector probability of detection using performance demonstration program pass rates

    Science.gov (United States)

    Cumblidge, Stephen; D'Agostino, Amy

    2016-02-01

    The United States Nuclear Regulatory Commission (NRC) staff has been working since the 1970s to ensure that nondestructive testing performed on nuclear power plants in the United States will provide reasonable assurance of structural integrity of the nuclear power plant components. One tool used by the NRC has been the development and implementation of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code Section XI Appendix VIII [1] (Appendix VIII) blind testing requirements for ultrasonic procedures, equipment, and personnel. Some concerns have been raised over the years by the relatively low pass rates for the Appendix VIII qualification testing. The NRC staff has applied statistical tools and simulations to determine the expected probability of detection (POD) for ultrasonic examinations under ideal conditions based on the pass rates for the Appendix VIII qualification tests for the ultrasonic testing personnel. This work was primarily performed to answer three questions. First, given a test design and pass rate, what is the expected overall POD for inspectors? Second, can we calculate the probability of detection for flaws of different sizes using this information? Finally, if a previously qualified inspector fails a requalification test, does this call their earlier inspections into question? The calculations have shown that one can expect good performance from inspectors who have passed Appendix VIII testing in a laboratory-like environment, and the requalification pass rates show that the inspectors have maintained their skills between tests. While these calculations showed that the PODs for the ultrasonic inspections are very good under laboratory conditions, the field inspections are conducted in a very different environment. The NRC staff has initiated a project to systematically analyze the human factors differences between qualification testing and field examinations. This work will be used to evaluate and prioritize
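
    The first question, inferring POD from a test design and pass rate, can be sketched with a binomial model. The test design below (10 flawed grading units, pass at 8 or more detections) is a hypothetical illustration, not the actual Appendix VIII requirements, and it ignores false calls and flaw-size effects.

```python
import math

# Given a per-flaw POD p, the probability of passing a blind test requiring
# at least k detections out of n flaws is a binomial tail sum; inverting
# that relation maps an observed pass rate back to an implied POD.

def pass_probability(pod, n=10, k=8):
    """P(detect >= k of n flaws) when each flaw is detected with prob. pod."""
    return sum(math.comb(n, i) * pod**i * (1.0 - pod)**(n - i)
               for i in range(k, n + 1))

def pod_from_pass_rate(pass_rate, n=10, k=8):
    """Invert pass_probability by bisection (it is increasing in pod)."""
    lo, hi = 0.0, 1.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if pass_probability(mid, n, k) < pass_rate:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# A 60% pass rate on this hypothetical design implies a per-flaw POD of
# roughly 0.77 -- a "low" pass rate can coexist with a fairly high POD.
print(round(pod_from_pass_rate(0.60), 3))
```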

  8. Calculation of probability density functions for temperature and precipitation change under global warming

    International Nuclear Information System (INIS)

    Full text: The IPCC Fourth Assessment Report (Meehl et al. 2007) presents multi-model means of the CMIP3 simulations as projections of the global climate change over the 21st century under several SRES emission scenarios. To assess the possible range of change for Australia based on the CMIP3 ensemble, we can follow Whetton et al. (2005) and use the 'pattern scaling' approach, which separates the uncertainty in the global mean warming from that in the local change per degree of warming. This study presents several ways of representing these two factors as probability density functions (PDFs). The beta distribution, a smooth, bounded function allowing skewness, is found to provide a useful representation of the range of CMIP3 results. A weighting of models based on their skill in simulating seasonal means in the present climate over Australia is included. Dessai et al. (2005) and others have used Monte Carlo sampling to recombine such global warming and scaled change factors into values of net change. Here, we use a direct integration of the product across the joint probability space defined by the two PDFs. The result is a cumulative distribution function (CDF) for change, for each variable, location, and season. The median of this distribution provides a best estimate of change, while the 10th and 90th percentiles represent a likely range. The probability of exceeding a specified threshold can also be extracted from the CDF. The presentation focuses on changes in Australian temperature and precipitation at 2070 under the A1B scenario. However, the assumption of linearity behind pattern scaling allows results for different scenarios and times to be simply obtained. In the case of precipitation, which must remain non-negative, a simple modification of the calculations (based on decreases being exponential with warming) is used to avoid unrealistic results. These approaches are currently being used for the new CSIRO/Bureau of Meteorology climate projections
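
    The direct-integration step can be sketched as follows. Triangular distributions stand in for the fitted beta distributions, and the ranges are illustrative assumptions, not the CMIP3 values; only the structure of the calculation follows the abstract.

```python
# Direct integration of the product of two PDFs: net change C = W * S, with
# W the global mean warming and S the local change per degree. For
# independent, positive W:
#     F_C(z) = integral over w of  f_W(w) * F_S(z / w) dw.

def make_triangular(a, b):
    """PDF and CDF of a symmetric triangular distribution on [a, b]."""
    c = 0.5 * (a + b)
    def pdf(x):
        if x <= a or x >= b:
            return 0.0
        h = 2.0 / (b - a)
        return h * (x - a) / (c - a) if x <= c else h * (b - x) / (b - c)
    def cdf(x):
        if x <= a:
            return 0.0
        if x >= b:
            return 1.0
        if x <= c:
            return (x - a) ** 2 / ((b - a) * (c - a))
        return 1.0 - (b - x) ** 2 / ((b - a) * (b - c))
    return pdf, cdf

w_pdf, _ = make_triangular(1.0, 3.0)   # global mean warming (deg C), assumed
_, s_cdf = make_triangular(-2.0, 6.0)  # local change per degree (%/deg C), assumed

def change_cdf(z, n=2000):
    """F_C(z) by midpoint-rule integration over the warming range [1, 3]."""
    dx = 2.0 / n
    return dx * sum(w_pdf(w) * s_cdf(z / w)
                    for w in (1.0 + (i + 0.5) * dx for i in range(n)))

# A point on the CDF and the exceedance probability for a +5% threshold:
print(change_cdf(4.0), 1.0 - change_cdf(5.0))
```

    The median and the 10th/90th percentiles mentioned in the abstract follow by inverting `change_cdf` numerically.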

  9. Peculiarities of high-overtone transition probabilities in carbon monoxide revealed by high-precision calculation

    Energy Technology Data Exchange (ETDEWEB)

    Medvedev, Emile S., E-mail: esmedved@orc.ru [The Institute of Problems of Chemical Physics, Russian Academy of Sciences, Prospect Akademika Semenova 1, 142432 Chernogolovka (Russian Federation); Meshkov, Vladimir V.; Stolyarov, Andrey V. [Department of Chemistry, Lomonosov Moscow State University, Leninskie gory 1/3, 119991 Moscow (Russian Federation); Gordon, Iouli E. [Atomic and Molecular Physics Division, Harvard-Smithsonian Center for Astrophysics, 60 Garden St, Cambridge, Massachusetts 02138 (United States)

    2015-10-21

    In the recent work devoted to the calculation of the rovibrational line list of the CO molecule [G. Li et al., Astrophys. J., Suppl. Ser. 216, 15 (2015)], rigorous validation of the calculated parameters including intensities was carried out. In particular, the Normal Intensity Distribution Law (NIDL) [E. S. Medvedev, J. Chem. Phys. 137, 174307 (2012)] was employed for the validation purposes, and it was found that, in the original CO line list calculated for large changes of the vibrational quantum number up to Δn = 41, intensities with Δn > 11 were unphysical. Therefore, very high overtone transitions were removed from the published list in Li et al. Here, we show how this type of validation is carried out and prove that the quadruple precision is indispensably required to predict the reliable intensities using the conventional 32-bit computers. Based on these calculations, the NIDL is shown to hold up for the 0 → n transitions till the dissociation limit around n = 83, covering 45 orders of magnitude in the intensity. The low-intensity 0 → n transition predicted in the work of Medvedev [Determination of a new molecular constant for diatomic systems. Normal intensity distribution law for overtone spectra of diatomic and polyatomic molecules and anomalies in overtone absorption spectra of diatomic molecules, Institute of Chemical Physics, Russian Academy of Sciences, Chernogolovka, 1984] at n = 5 is confirmed, and two additional “abnormal” intensities are found at n = 14 and 23. Criteria for the appearance of such “anomalies” are formulated. The results could be useful to revise the high-overtone molecular transition probabilities provided in spectroscopic databases.

  12. Burnup calculation by the method of first-flight collision probabilities using average chords prior to the first collision

    Science.gov (United States)

    Karpushkin, T. Yu.

    2012-12-01

    A technique to calculate the burnup of materials of cells and fuel assemblies using the matrices of first-flight neutron collision probabilities rebuilt at a given burnup step is presented. A method to rebuild and correct first collision probability matrices using average chords prior to the first neutron collision, which are calculated with the help of geometric modules of constructed stochastic neutron trajectories, is described. Results of calculation of the infinite multiplication factor for elementary cells with a modified material composition compared to the reference one as well as calculation of material burnup in the cells and fuel assemblies of a VVER-1000 are presented.

  13. Theoretical Calculations of Thermal Broadenings and Transition Probabilities of R, R' and B Line-Groups for Ruby

    Institute of Scientific and Technical Information of China (English)

    MA Dong-Ping; LIU Yan-Yun; CHEN Ju-Rong

    2001-01-01

    On the basis of a unified calculation of the thermal shifts of the R1 line, R2 line and the ground-state splitting, the transition probabilities of direct and Raman processes have been calculated theoretically, together with the thermal broadenings of the R, R' and B line-groups. The theoretically predicted transition probabilities are in good agreement with the experimental ones. PACS numbers: 71.70.Ch, 78.20.Nv, 63.20.Mt, 63.20.Kr

  14. UMTS Uplink Loading Probability Calculation Using Log-Normal Interferers Contributions

    Directory of Open Access Journals (Sweden)

    Mosleh M. Al-Harthi

    2012-10-01

    In this paper we introduce the probabilistic notion of uplink loading in a UMTS network subject to uplink interferers from both the home cell and neighbouring cells. Our study is based on the assumption that, for a given UMTS cell, all the interferers in the uplink are log-normally distributed; Gaussian noise is also taken into account, treated as a constant contribution added to the sum of all interferers, which are summed using Wilkinson's calculation method [1]. The uplink loading h is regarded as a random variable, as it depends directly on the uplink log-normal interferers. We propose a new way of assessing this uplink loading for a given User Equipment spatial distribution snapshot. The probability that the uplink loading exceeds a given threshold, say h0, is then derived for different correlation figures and different noise levels.
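
    Wilkinson's method referenced above is a moment-matching approximation: the sum of independent log-normal interferers is treated as a single log-normal whose first two moments match those of the sum. A minimal sketch, with parameters given as the mean and standard deviation of the underlying normal (natural-log units); the interferer values are illustrative.

```python
import math

# Wilkinson moment matching for a sum of independent log-normal variables:
# match E[sum] and E[sum^2], then solve for the equivalent (mu, sigma).

def wilkinson_sum(params):
    """params: list of (mu_i, sigma_i); returns (mu, sigma) of the
    equivalent log-normal for the sum, assuming independence."""
    m1 = sum(math.exp(mu + 0.5 * s * s) for mu, s in params)      # E[sum]
    var = sum((math.exp(s * s) - 1.0) * math.exp(2.0 * mu + s * s)
              for mu, s in params)                                # Var[sum]
    m2 = var + m1 * m1                                            # E[sum^2]
    sigma2 = math.log(m2 / (m1 * m1))
    return math.log(m1) - 0.5 * sigma2, math.sqrt(sigma2)

# Three equal-power interferers; in the paper's setting a constant noise
# contribution would be added on top of the matched sum.
print(wilkinson_sum([(0.0, 1.0), (0.0, 1.0), (0.0, 1.0)]))
```

    A single-element input returns its own parameters unchanged, which is a quick sanity check on the moment algebra.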

  15. Prospective validation of a risk calculator which calculates the probability of a positive prostate biopsy in a contemporary clinical cohort

    NARCIS (Netherlands)

    van Vugt, Heidi A.; Kranse, Ries; Steyerberg, Ewout W.; van der Poel, Henk G.; Busstra, Martijn; Kil, Paul; Oomens, Eric H.; de Jong, Igle J.; Bangma, Chris H.; Roobol, Monique J.

    2012-01-01

    Background: Prediction models need validation to assess their value outside the development setting. Objective: To assess the external validity of the European Randomised study of Screening for Prostate Cancer (ERSPC) Risk Calculator (RC) in a contemporary clinical cohort. Methods: The RC calculates

  16. Calculation of ruin probabilities for a dense class of heavy tailed distributions

    DEFF Research Database (Denmark)

    Bladt, Mogens; Nielsen, Bo Friis; Samorodnitsky, Gennady

    2015-01-01

    of distributions with a slowly varying tail. An example from risk theory, comparing ruin probabilities for a classical risk process with Pareto distributed claim sizes, is presented and exact known ruin probabilities for the Pareto case are compared to the ones obtained by approximating by an infinite...

  17. An analytical calculation of neighbourhood order probabilities for high dimensional Poissonian processes and mean field models

    International Nuclear Information System (INIS)

    Consider that the coordinates of N points are randomly generated along the edges of a d-dimensional hypercube (random point problem). The probability P(d,N)m,n that an arbitrary point is the mth nearest neighbour to its own nth nearest neighbour (Cox probabilities) plays an important role in spatial statistics. It has also been useful in the description of physical processes in disordered media. Here we propose a simpler derivation of the Cox probabilities, in which we stress the role played by the system dimensionality d. In the limit d → ∞, the distances between pairs of points become independent (random link model) and closed analytical forms for the neighbourhood probabilities are obtained, both in the thermodynamic limit and for finite-size systems. Breaking the distance symmetry constraint leads us to the random map model, for which the Cox probabilities are obtained for two cases: whether a point is its own nearest neighbour or not.
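
    The quantity being derived can be checked by brute-force simulation. This sketch is ours, not the paper's analytical derivation, and the point counts and trial numbers are illustrative.

```python
import random

# Monte Carlo estimate of the Cox probability P(d, N)_{m,n}: the chance that
# an arbitrary point is the m-th nearest neighbour of its own n-th nearest
# neighbour, for N points uniform in the d-dimensional unit hypercube.

def cox_probability(d, n_points, m, n, trials=2000, seed=1):
    rng = random.Random(seed)

    def rank_order(pts, i):
        """Indices of the other points, sorted by distance to point i."""
        return sorted((j for j in range(len(pts)) if j != i),
                      key=lambda j: sum((a - b) ** 2
                                        for a, b in zip(pts[i], pts[j])))

    hits = 0
    for _ in range(trials):
        pts = [[rng.random() for _ in range(d)] for _ in range(n_points)]
        j = rank_order(pts, 0)[n - 1]        # n-th neighbour of point 0
        if rank_order(pts, j)[m - 1] == 0:   # is point 0 its m-th neighbour?
            hits += 1
    return hits / trials

# Mutual-nearest-neighbour probability P_{1,1} in one dimension (about 2/3):
print(cox_probability(d=1, n_points=10, m=1, n=1))
```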

  18. A polynomial time algorithm for calculating the probability of a ranked gene tree given a species tree

    OpenAIRE

    Stadler, Tanja; Degnan, James H

    2012-01-01

    Background: The ancestries of genes form gene trees which do not necessarily have the same topology as the species tree due to incomplete lineage sorting. Available algorithms determining the probability of a gene tree given a species tree require exponential computational runtime. Results: In this paper, we provide a polynomial time algorithm to calculate the probability of a ranked gene tree topology for a given species tree, where a ranked tree topology is a tree topology with the internal v...

  19. Generalized probability model for calculation of interference to the Deep Space Network due to circularly Earth-orbiting satellites

    Science.gov (United States)

    Ruggier, C. J.

    1992-01-01

    The probability of exceeding interference power levels and the duration of interference at the Deep Space Network (DSN) antenna is calculated parametrically when the state vector of an Earth-orbiting satellite over the DSN station view area is not known. A conditional probability distribution function is derived, transformed, and then convolved with the interference signal uncertainties to yield the probability distribution of interference at any given instant during the orbiter's mission period. The analysis is applicable to orbiting satellites having circular orbits with known altitude and inclination angle.

  20. The risk of a major nuclear accident: calculation and perception of probabilities

    International Nuclear Information System (INIS)

    The accident at Fukushima Daiichi, Japan, occurred on 11 March 2011. This nuclear disaster, the third on such a scale, left a lasting mark in the minds of hundreds of millions of people. Much as Three Mile Island or Chernobyl, yet another place will be permanently associated with a nuclear power plant which went out of control. Fukushima Daiichi revived the issue of the hazards of civil nuclear power, stirring up all the associated passion and emotion. The whole of this paper is devoted to the risk of a major nuclear accident. By this we mean a failure initiating core meltdown, a situation in which the fuel rods melt and mix with the metal in their cladding. Such accidents are classified as at least level 5 on the International Nuclear Event Scale. The Three Mile Island accident, which occurred in 1979 in the United States, reached this level of severity. The explosion of reactor 4 at the Chernobyl plant in Ukraine in 1986 and the recent accident in Japan were classified as level 7, the highest grade on this logarithmic scale. The main difference between the top two levels and level 5 relates to a significant or major release of radioactive material to the environment. In the event of a level-5 accident, damage is restricted to the inside of the plant, whereas, in the case of level-7 accidents, huge areas of land, above or below the surface, and/or sea may be contaminated. Before the meltdown of reactors 1, 2 and 3 at Fukushima Daiichi, eight major accidents affecting nuclear power plants had occurred worldwide. This is a high figure compared with the one calculated by the experts. Observations in the field do not appear to fit the results of the probabilistic models of nuclear accidents produced since the 1970s. Oddly enough, the number of major accidents is closer to the risk as perceived by the general public. In general we tend to overestimate any risk relating to rare, fearsome accidents. What are we to make of this divergence? How are we to reconcile

  1. Improved techniques for outgoing wave variational principle calculations of converged state-to-state transition probabilities for chemical reactions

    Science.gov (United States)

    Mielke, Steven L.; Truhlar, Donald G.; Schwenke, David W.

    1991-01-01

    Improved techniques and well-optimized basis sets are presented for application of the outgoing wave variational principle to calculate converged quantum mechanical reaction probabilities. They are illustrated with calculations for the reactions D + H2 → HD + H with total angular momentum J = 3 and F + H2 → HF + H with J = 0 and 3. The optimization involves the choice of distortion potential, the grid for calculating half-integrated Green's functions, the placement, width, and number of primitive distributed Gaussians, and the computationally most efficient partition between dynamically adapted and primitive basis functions. Benchmark calculations with 224-1064 channels are presented.

  2. Calculation of Fire Severity Factors and Fire Non-Suppression Probabilities For A DOE Facility Fire PRA

    Energy Technology Data Exchange (ETDEWEB)

    Tom Elicson; Bentley Harwood; Jim Bouchard; Heather Lucek

    2011-03-01

    Over a 12-month period, a fire PRA was developed for a DOE facility using the NUREG/CR-6850 EPRI/NRC fire PRA methodology. The fire PRA modeling included calculation of fire severity factors (SFs) and fire non-suppression probabilities (PNS) for each safe-shutdown (SSD) component considered in the fire PRA model. The SFs were developed by performing detailed fire modeling through a combination of CFAST fire zone model calculations and Latin hypercube sampling (LHS). Component damage times and automatic fire suppression system actuation times calculated in the CFAST LHS analyses were then input to a time-dependent model of fire non-suppression probability. The fire non-suppression probability model is based on the modeling approach outlined in NUREG/CR-6850 and is supplemented with plant-specific data. This paper presents the methodology used in the DOE facility fire PRA for modeling fire-induced SSD component failures and includes discussions of modeling techniques for:
    • Development of time-dependent fire heat release rate profiles (required as input to CFAST),
    • Calculation of fire severity factors based on CFAST detailed fire modeling, and
    • Calculation of fire non-suppression probabilities.
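
    The time-dependent non-suppression model described above can be sketched as a simple exponential race between manual suppression and component damage. The suppression rate and times below are illustrative values, not the facility's data.

```python
import math

def non_suppression_probability(t_damage_min, t_detect_min, lam_per_min):
    """Probability that suppression fails before component damage.

    Exponential model in the spirit of NUREG/CR-6850:
    P_ns = exp(-lambda * t), where t is the window between fire
    detection and component damage. All numbers are illustrative.
    """
    t_avail = max(t_damage_min - t_detect_min, 0.0)
    return math.exp(-lam_per_min * t_avail)

# Example: damage at 25 min, detection at 5 min, suppression rate 0.1/min
p_ns = non_suppression_probability(25.0, 5.0, 0.1)
```

    If damage would occur before detection, the available window is zero and non-suppression is certain (P_ns = 1).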

  3. Chronology of Postglacial Eruptive Activity and Calculation of Eruption Probabilities for Medicine Lake Volcano, Northern California

    Science.gov (United States)

    Nathenson, Manuel; Donnelly-Nolan, Julie M.; Champion, Duane E.; Lowenstern, Jacob B.

    2007-01-01

    Medicine Lake volcano has had 4 eruptive episodes in its postglacial history (since 13,000 years ago) comprising 16 eruptions. Time intervals between events within the episodes are relatively short, whereas time intervals between the episodes are much longer. An updated radiocarbon chronology for these eruptions is presented that uses paleomagnetic data to constrain the choice of calibrated ages. This chronology is used with exponential, Weibull, and mixed-exponential probability distributions to model the data for time intervals between eruptions. The mixed exponential distribution is the best match to the data and provides estimates for the conditional probability of a future eruption given the time since the last eruption. The probability of an eruption at Medicine Lake volcano in the next year from today is 0.00028.
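
    A conditional probability of the kind quoted above is a survival-function ratio. A minimal sketch for a two-component mixed exponential follows; the weights and mean recurrence intervals are made up for illustration, not the paper's fitted parameters.

```python
import math

def survival_mixed_exp(t, p1, tau1, tau2):
    """Survival function of a two-component mixed exponential
    (weights p1 and 1-p1, mean intervals tau1 and tau2, in years)."""
    return p1 * math.exp(-t / tau1) + (1 - p1) * math.exp(-t / tau2)

def conditional_eruption_prob(t_since, dt, p1, tau1, tau2):
    """P(eruption within dt years | no eruption for t_since years)."""
    s_now = survival_mixed_exp(t_since, p1, tau1, tau2)
    s_later = survival_mixed_exp(t_since + dt, p1, tau1, tau2)
    return (s_now - s_later) / s_now

# Illustrative parameters only (not the Medicine Lake fit)
p = conditional_eruption_prob(t_since=950.0, dt=1.0, p1=0.7, tau1=200.0, tau2=4000.0)
```

    With p1 = 1 the expression reduces to the familiar memoryless result 1 - exp(-dt/tau1), which is a convenient sanity check.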

  4. ELIPGRID-PC: A PC program for calculating hot spot probabilities

    International Nuclear Information System (INIS)

    ELIPGRID-PC, a new personal-computer program, has been developed to provide easy access to Singer's 1972 ELIPGRID algorithm for hot-spot detection probabilities. Three features of the program are the ability to determine: (1) the grid size required for specified conditions, (2) the smallest hot spot that can be sampled with a given probability, and (3) the approximate grid size resulting from specified conditions and sampling cost. ELIPGRID-PC also provides probability-of-hit versus cost data for graphing with spreadsheets or graphics software. The program has been successfully tested using Singer's published ELIPGRID results. An apparent error in the original ELIPGRID code has been uncovered and an appropriate modification incorporated into the new program
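
    The underlying question, the chance that a square sampling grid hits an elliptical hot spot, can be estimated by brute force. The sketch below is a Monte Carlo stand-in for the problem ELIPGRID solves analytically; it is not Singer's algorithm, and all parameters are illustrative.

```python
import math
import random

def hit_probability(grid_spacing, semi_major, shape, angle_deg,
                    n_trials=20000, seed=1):
    """Monte Carlo estimate of the probability that at least one node of
    a square sampling grid falls inside an elliptical hot spot.

    shape is the semi-minor/semi-major axis ratio; the hot-spot centre
    is placed uniformly at random within one grid cell each trial.
    """
    rng = random.Random(seed)
    a, b = semi_major, semi_major * shape
    ct, st = math.cos(math.radians(angle_deg)), math.sin(math.radians(angle_deg))
    reach = int(a / grid_spacing) + 2  # grid nodes worth checking
    hits = 0
    for _ in range(n_trials):
        cx = rng.uniform(0, grid_spacing)
        cy = rng.uniform(0, grid_spacing)
        hit = False
        for i in range(-reach, reach + 1):
            for j in range(-reach, reach + 1):
                dx, dy = i * grid_spacing - cx, j * grid_spacing - cy
                # Rotate the offset into the ellipse frame
                u = dx * ct + dy * st
                v = -dx * st + dy * ct
                if (u / a) ** 2 + (v / b) ** 2 <= 1.0:
                    hit = True
                    break
            if hit:
                break
        if hit:
            hits += 1
    return hits / n_trials
```

    For a small spot the result approaches the spot area divided by the cell area, and a circular spot whose radius equals the grid spacing is always detected.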

  5. Calculation of rotational transition probabilities in molecular collisions - Application to N2 + N2

    Science.gov (United States)

    Itikawa, Y.

    1975-01-01

    A computational method is proposed to obtain rotational transition probabilities in collisions between two diatomic molecules. The potential method of Rabitz and an exponential approximation are used to solve the semiclassical coupled equations without invoking any perturbational technique. The collision trajectory is determined in the classical modified-wave-number approximation. The method can treat systems involving strong interactions and provide probabilities for transitions even with a multiquantum jump. A simultaneous transition in the rotational states of both molecules, i.e., the rotational-rotational energy transfer, is taken into account. An application to the system N2 + N2 is presented.

  6. Optimization of next-event estimation probability in Monte Carlo shielding calculations

    International Nuclear Information System (INIS)

    In Monte Carlo radiation transport calculations with point detectors, the next-event estimation is employed to estimate the response to each detector from all collision sites. The computation time required for this estimation process is substantial and often exceeds the time required to generate and process particle histories in a calculation. This estimation from all collision sites is, therefore, very wasteful in Monte Carlo shielding calculations. For example, in the source region and in regions far away from the detectors, the next-event contribution of a particle is often very small and insignificant. A method for reducing this inefficiency is described
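
    The per-collision contribution to a point detector, and one generic way to skip negligible estimates without biasing the answer, can be sketched as follows. The roulette-on-small-contributions device shown here is a common illustration of the idea, not the specific optimization the report describes, and all parameter values are invented.

```python
import math
import random

def next_event_contribution(weight, r, sigma_t, p_mu):
    """Point-detector next-event estimate from one collision site:
    (scatter pdf toward the detector) * exp(-optical depth) / (4*pi*r^2),
    scaled by the particle weight. Homogeneous-medium sketch."""
    return weight * p_mu * math.exp(-sigma_t * r) / (4.0 * math.pi * r * r)

def scored_contribution(weight, r, sigma_t, p_mu, cutoff, rng):
    """Play Russian roulette on contributions below `cutoff`: survive
    with probability c/cutoff and score the cutoff value, so the
    estimator stays unbiased while most tiny estimates cost nothing."""
    c = next_event_contribution(weight, r, sigma_t, p_mu)
    if c >= cutoff:
        return c
    return cutoff if rng.random() < c / cutoff else 0.0
```

    Averaged over many histories the rouletted score reproduces the exact contribution, which is what makes the shortcut admissible.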

  7. On a best-estimate approach to the calculation of dryout probability during BWR transients

    International Nuclear Information System (INIS)

    A method is proposed whereby uncertainty of any dryout margin measure (figure of merit) may be quantified when the only experimental information available for validation is whether dryout has occurred or not. The method does not involve the heater temperature, except as a discrete dryout indicator. This is an advantage when analysing anticipated operational occurrences for which the acceptance criterion refers exclusively to the probability of dryout occurrence. The derived uncertainty provides a direct relation between the simulated dryout margin and the aforementioned probability. Furthermore, the method, which is based on logistic regression, has been designed to be consistent with more common parametric methods of uncertainty analysis that are likely to be used for other parts of a thermal hydraulic model. One example is provided where the method is utilized to assess statistical properties, which would have been difficult to quantify by other means. (author)
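
    The logistic-regression link between a simulated dryout margin and dryout probability can be sketched on synthetic data. The data, coefficients, and plain gradient-ascent fit below are illustrative only, not the paper's validation machinery.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(margins, dryout, lr=0.5, epochs=2000):
    """Fit P(dryout | margin) = sigmoid(b0 + b1*margin) by gradient
    ascent on the log-likelihood. With sensible data b1 comes out
    negative: larger margin, lower dryout probability."""
    b0 = b1 = 0.0
    n = len(margins)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(margins, dryout):
            e = y - sigmoid(b0 + b1 * x)
            g0 += e
            g1 += e * x
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# Synthetic pass/fail data: dryout observed mostly at low margin
rng = random.Random(0)
margins = [rng.uniform(-2, 2) for _ in range(300)]
dryout = [1 if rng.random() < sigmoid(-3 * m) else 0 for m in margins]
b0, b1 = fit_logistic(margins, dryout)
```

    The fitted curve then converts any simulated dryout margin directly into a dryout probability, which is exactly the relation the abstract describes.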

  8. Monte Carlo calculation of the total probability for gamma-Ray interaction in toluene

    International Nuclear Information System (INIS)

    Interaction and absorption probabilities for gamma rays with energies between 1 and 1000 keV have been computed and tabulated for a toluene-based scintillator solution. Both point sources and homogeneously dispersed radioactive material have been considered. The tables may be applied to cylinders with radii between 0.25 cm and 1.25 cm and heights between 0.20 cm and 4.07 cm. (Author) 26 refs

  9. Special Issue on Critical Assessment of Theoretical Calculations of Atomic Structure and Transition Probabilities

    OpenAIRE

    Per Jönsson; Hyun-Kyung Chung

    2013-01-01

    Several codes exist in the atomic physics community for generating atomic structure and transition probabilities; they are freely and readily distributed to researchers outside the atomic physics community, in the plasma, astrophysical, or nuclear physics communities. Users take these atomic physics codes to generate the necessary atomic data, or modify the codes for their own applications. However, there has been very little effort to validate and verify the data sets generated by non-expert users. [...

  10. Syntax for calculation of discounting indices from the monetary choice questionnaire and probability discounting questionnaire.

    Science.gov (United States)

    Gray, Joshua C; Amlung, Michael T; Palmer, Abraham A; MacKillop, James

    2016-09-01

    The 27-item Monetary Choice Questionnaire (MCQ; Kirby, Petry, & Bickel, 1999) and 30-item Probability Discounting Questionnaire (PDQ; Madden, Petry, & Johnson, 2009) are widely used, validated measures of preferences for immediate versus delayed rewards and guaranteed versus risky rewards, respectively. The MCQ measures delayed discounting by asking individuals to choose between rewards available immediately and larger rewards available after a delay. The PDQ measures probability discounting by asking individuals to choose between guaranteed rewards and a chance at winning larger rewards. Numerous studies have implicated these measures in addiction and other health behaviors. Unlike typical self-report measures, the MCQ and PDQ generate inferred hyperbolic temporal and probability discounting functions by comparing choice preferences to arrays of functions to which the individual items are preconfigured. This article provides R and SPSS syntax for processing the MCQ and PDQ. Specifically, for the MCQ, the syntax generates k values, consistency of the inferred k, and immediate choice ratios; for the PDQ, the syntax generates h indices, consistency of the inferred h, and risky choice ratios. The syntax is intended to increase the accessibility of these measures, expedite the data processing, and reduce risk for error.
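
    The inferred-k logic can be sketched as follows. The items, candidate k values, and tie-breaking below are toy stand-ins, not Kirby's published 27-item set or the exact scoring rules of the syntax described in the article.

```python
def hyperbolic_value(amount, delay, k):
    """Hyperbolic discounting: V = A / (1 + k*D)."""
    return amount / (1.0 + k * delay)

def infer_k(items, choices, candidate_ks):
    """Pick the candidate discount rate k whose predicted choices agree
    with the most observed choices (1 = chose the immediate reward).

    items: list of (immediate_amount, delayed_amount, delay_days).
    Returns the best-fitting k and the consistency (fraction agreeing).
    """
    best_k, best_score = None, -1
    for k in candidate_ks:
        score = 0
        for (a_now, a_later, delay), chose_now in zip(items, choices):
            predicted_now = a_now > hyperbolic_value(a_later, delay, k)
            score += (predicted_now == bool(chose_now))
        if score > best_score:
            best_k, best_score = k, score
    return best_k, best_score / len(items)

# Toy items and a perfectly consistent responder with k = 0.05
items = [(25, 60, 14), (54, 55, 117), (19, 25, 53), (31, 85, 7), (14, 25, 19)]
true_k = 0.05
choices = [1 if a > hyperbolic_value(b, d, true_k) else 0 for a, b, d in items]
k_hat, consistency = infer_k(items, choices, [0.0016, 0.01, 0.05, 0.25])
```

    A perfectly consistent responder yields consistency 1.0 at the generating k; real data are scored the same way, with consistency reported alongside the inferred k.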

  11. Large-scale Breit-Pauli R-matrix calculations for transition probabilities of Fe V

    OpenAIRE

    Nahar, Sultana N.; Pradhan, Anil K.

    2000-01-01

    Ab initio theoretical calculations are reported for the electric dipole (E1) allowed and intercombination fine-structure transitions in Fe V using the Breit-Pauli R-matrix (BPRM) method. We obtain 3865 bound fine-structure levels of Fe V and 1.46 × 10^6 oscillator strengths, Einstein A-coefficients, and line strengths. In addition to the relativistic effects, the intermediate coupling calculations include extensive electron correlation effects that represent the complex configuration interac...

  12. Calculation of identity-by-descent probabilities of short chromosome segments.

    Science.gov (United States)

    Tuchscherer, A; Teuscher, F; Reinsch, N

    2012-12-01

    For some purposes, identity-by-descent (IBD) probabilities for entire chromosome segments are required. Making use of pedigree information, length of the segment and the assumption of no crossing-over, a generalization of a previously published graph theory oriented algorithm accounting for nonzero IBD of common ancestors is given, which can be viewed as method of path coefficients for entire chromosome segments. Furthermore, rules for setting up a gametic version of a segmental IBD matrix are presented. Results from the generalized graph theory oriented method, the gametic segmental IBD matrix and the segmental IBD matrix for individuals are identical.
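
    For intuition, the probability that a segment descends unbroken along a single chain of meioses has a simple closed form. The expression below assumes a Poisson (Haldane, no-interference) crossover model and is only a back-of-the-envelope sketch; the paper's algorithm additionally handles pedigree loops and nonzero IBD of common ancestors.

```python
import math

def intact_transmission_prob(length_morgan, n_meioses):
    """Probability that a chromosome segment of the given map length
    (in Morgans) descends from one specific ancestral homolog, unbroken,
    through a chain of n meioses.

    Per meiosis: no crossover inside the segment (exp(-l) under a
    Poisson model) and the correct homolog chosen (factor 1/2).
    """
    return (0.5 * math.exp(-length_morgan)) ** n_meioses
```

    Setting the length to zero recovers the classic single-locus factor (1/2)^n used in ordinary path counting.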

  13. [Probabilities cannot be calculated retrospectively--not even in the courtroom].

    Science.gov (United States)

    van Gijn, J

    2005-12-24

    Chance events are part of everyday life, but coincidence of diseases often raises suspicions about hidden causes, for example when power lines are blamed for the geographical clustering of cancer. Recently, criminal procedures in the Netherlands have revolved around the question of whether statistical 'predictions' are a valid reason to hold a hospital nurse accountable for the occurrence of excess deaths during her duty hours, or a kindergarten employee for unexplained respiratory problems in several infants. In both cases, the appeals court judges did not accept the statistical 'argument' in the absence of other evidence. In the UK, however, Sally Clark's initial life sentence for the double murder of her two babies was largely based on 'probabilities in retrospect', put forward by the paediatrician Sir Roy Meadow as an expert witness. Four years later she was acquitted, whereas Meadow was struck off the medical register on a charge of professional misconduct. There is no Bayesian or other mathematical solution to the problem of chance events. Only the detection of causal factors that are plausible and supported by new evidence can help to reinterpret coincidences as relationships. Scrupulous reasoning about probabilities is required, not only of physicians but also of judges and politicians. PMID:16402517

  14. Internationally comparable diagnosis-specific survival probabilities for calculation of the ICD-10-based Injury Severity Score

    DEFF Research Database (Denmark)

    Gedeborg, R.; Warner, M.; Chen, L. H.;

    2014-01-01

    BACKGROUND: The International Statistical Classification of Diseases, 10th Revision (ICD-10) -based Injury Severity Score (ICISS) performs well but requires diagnosis-specific survival probabilities (DSPs), which are empirically derived, for its calculation. The objective was to examine if DSPs...... country's own DSPs for ICISS calculation, the pooled DSPs resulted in somewhat reduced discrimination in predicting mortality (difference in c statistic varied from 0.006 to 0.04). Calibration was generally good when the predicted mortality risk was less than 20%. When Danish and Swedish data were used...

  15. Relativistic Calculation of Kβ Hypersatellite Energies and Transition Probabilities for Selected Atoms with 13 ≤ Z ≤ 80

    CERN Document Server

    Costa, A M; Santos, J P; Indelicato, P J; Parente, F; Indelicato, Paul

    2006-01-01

    Energies and transition probabilities of Kβ hypersatellite lines are computed using the Dirac-Fock model for several values of Z throughout the periodic table. The influence of the Breit interaction on the energy shifts from the corresponding diagram lines and on the Kβ1ʰ/Kβ3ʰ intensity ratio is evaluated. The widths of the double-K-hole levels are calculated for Al and Sc. The results are compared to experiment and to other theoretical calculations.

  16. Calculated level energies, transition probabilities, and lifetimes of silicon-like ions

    International Nuclear Information System (INIS)

    The authors present theoretical excitation energies and lifetimes for the 27 low-lying levels of silicon-like ions of S, Ar, Ca, Ti, Fe, Zn, and Kr (16 ≤ Z ≤ 36). Special attention has been paid to provide a complete tabulation of all electric-dipole (E1) allowed transitions from levels of the 3s3p³ and 3s²3p3d excited configurations to those of the 3s²3p² ground-state configuration, including all weak and intercombination transitions. Large-scale multiconfiguration Dirac-Fock wave functions are applied to compute transition energies and probabilities. They further investigate the decay of the 3s²3p3d J = 4 level, which is connected to the ground-state configuration only via forbidden M2 transitions but otherwise mainly decays via M1 to lower-lying levels of the same parity. For a few selected data, they compare the results with experiment and with previous computations

  17. Numerical calculation of vibrational transition probability for the forced morse oscillator by use of the anharmonic boson operators

    International Nuclear Information System (INIS)

    The vibrational transition probability expressions for the forced Morse oscillator have been derived using the commutation relations of the anharmonic boson operators. The formulation is based on the collinear collision model with an exponential repulsive potential in the framework of semiclassical collision dynamics. Sample calculation results for the H2 + He collision system, where the anharmonicity is large, are in excellent agreement with those from an exact numerical quantum mechanical study by Clark and Dickinson using the reactance matrix. Our results, however, are markedly different from those of Ree, Kim, and Shin, who approximate the commutation operator I0 as unity, the harmonic-oscillator limit. We conclude that the quantum-number dependence in I0 must be retained to obtain accurate vibrational transition probabilities for the Morse oscillator

  18. Calculation of transition probabilities and ac Stark shifts in two-photon laser transitions of antiprotonic helium

    OpenAIRE

    HORI, MASAKI; Korobov, Vladimir I.

    2010-01-01

    Numerical ab initio variational calculations of the transition probabilities and ac Stark shifts in two-photon transitions of antiprotonic helium atoms driven by two counter-propagating laser beams are presented. We found that sub-Doppler spectroscopy is in principle possible by exciting transitions of the type (n,L) → (n-2,L-2) between antiprotonic states of principal and angular momentum quantum numbers n ~ L-1 ~ 35, first by using highly monochromatic, nanosecond laser beams of intensities 10^4...

  19. Fine-structure calculations of energy levels, oscillator strengths, and transition probabilities for sulfur-like iron, Fe XI

    International Nuclear Information System (INIS)

    Energy levels, oscillator strengths, and transition probabilities for transitions among the 14 LS states belonging to configurations of sulfur-like iron, Fe XI, have been calculated. These states are represented by configuration-interaction wavefunctions and have configurations 3s²3p⁴, 3s3p⁵, 3s²3p³3d, 3s²3p³4s, 3s²3p³4p, and 3s²3p³4d, which give rise to 123 fine-structure energy levels. Extensive configuration-interaction calculations using the CIV3 code have been performed. To assess the importance of relativistic effects, the intermediate-coupling scheme by means of the Breit–Pauli Hamiltonian terms, such as the one-body mass correction and Darwin term, and spin–orbit, spin–other-orbit, and spin–spin corrections, is incorporated within the code. These corrections adjusted the energy levels, so the calculated values are close to the available experimental data. Comparisons between the present calculated energy levels as well as oscillator strengths and both experimental and theoretical data have been performed. Our results show good agreement with earlier works, and they might be useful in thermonuclear fusion research and astrophysical applications.
    -- Highlights:
    • Accurate atomic data of iron ions are needed for identification of solar corona lines.
    • Extensive configuration-interaction wavefunctions including 123 fine-structure levels have been calculated.
    • The relativistic effects by means of the Breit–Pauli Hamiltonian terms are incorporated.
    • This incorporation adjusts the energy levels, so the calculated values are close to experimental values

  20. Web Service for Calculating the Probability of Returning a Loan - Design, Implementation and Deployment

    Directory of Open Access Journals (Sweden)

    Julian VASILEV

    2014-01-01

    The purpose of this paper is to describe the process of designing, creating, implementing and deploying a real web service. A basic theory approach is used to analyze the implementation of web services. An existing profit model is used, and its business logic is integrated within a web service. A desktop application is created to demonstrate the use of the newly created web service. This study shows a methodology for fast development and deployment of web services. The methodology has wide practical implications in credit institutions and banks when granting a loan. This study is the first of its kind to show the design, implementation and deployment of a web service for calculating the probability of returning a loan. The methodology may be used for the encapsulation of other business logic into web services.

  1. Calculation of transition probabilities and ac Stark shifts in two-photon laser transitions of antiprotonic helium

    International Nuclear Information System (INIS)

    Numerical ab initio variational calculations of the transition probabilities and ac Stark shifts in two-photon transitions of antiprotonic helium atoms driven by two counter-propagating laser beams are presented. We found that sub-Doppler spectroscopy is, in principle, possible by exciting transitions of the type (n,L) → (n-2,L-2) between antiprotonic states of principal and angular momentum quantum numbers n ~ L-1 ~ 35, first by using highly monochromatic, nanosecond laser beams of intensities 10^4-10^5 W/cm^2, and then by tuning the virtual intermediate state close (e.g., within 10-20 GHz) to the real state (n-1,L-1) to enhance the nonlinear transition probability. We expect that ac Stark shifts of a few MHz or more will become an important source of systematic error at fractional precisions of better than a few parts in 10^9. These shifts can, in principle, be minimized and even canceled by selecting an optimum combination of laser intensities and frequencies. We simulated the resonance profiles of some two-photon transitions in the regions n = 30-40 of the p̄⁴He⁺ and p̄³He⁺ isotopes to find the best conditions that would allow this.

  2. Relativistic Many-body Moller-Plesset Perturbation Theory Calculations of the Energy Levels and Transition Probabilities in Na- to P-like Xe Ions

    Energy Technology Data Exchange (ETDEWEB)

    Vilkas, M J; Ishikawa, Y; Trabert, E

    2007-03-27

    Relativistic multireference many-body perturbation theory calculations have been performed on Xe⁴³⁺-Xe³⁹⁺ ions, resulting in energy levels, electric dipole transition probabilities, and level lifetimes. The second-order many-body perturbation theory calculation of energy levels included mass shifts, the frequency-dependent Breit correction, and Lamb shifts. The calculated transition energies and E1 transition rates are used to present synthetic spectra in the extreme ultraviolet range for some of the Xe ions.

  3. Combining scenarios in a calculation of the overall probability distribution of cumulative releases of radioactivity from the Waste Isolation Pilot Plant, southeastern New Mexico

    International Nuclear Information System (INIS)

    The Waste Isolation Pilot Plant (WIPP), in southeastern New Mexico, is a research and development facility intended to demonstrate safe disposal of defense-generated transuranic waste. The US Department of Energy will designate WIPP as a disposal facility if it meets the US Environmental Protection Agency's standard for disposal of such waste; the standard includes a requirement that estimates of cumulative releases of radioactivity to the accessible environment be incorporated in an overall probability distribution. The WIPP Project has chosen an approach to the calculation of an overall probability distribution that employs the concept of scenarios for release and transport of radioactivity to the accessible environment. This report reviews the use of Monte Carlo methods in the calculation of an overall probability distribution and presents a logical and mathematical foundation for use of the scenario concept in such calculations. The report also draws preliminary conclusions regarding the shape of the probability distribution for the WIPP system, based on the possible occurrence of three events and the presence of one feature: namely, the events "attempted boreholes over rooms and drifts," "mining alters ground-water regime," and "water-withdrawal wells provide alternate pathways," and the feature "brine pocket below room or drift." These preliminary conclusions rest on calculation of the WIPP system's overall probability distribution for only five of the sixteen possible scenario classes that can be obtained by combining the four postulated events or features
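
    The Monte Carlo construction of an overall probability distribution from scenario-defining events can be sketched as follows. The event probabilities and normalized release values are invented for illustration and are not the WIPP analysis numbers.

```python
import random

def simulate_ccdf(event_probs, release_given_event, n_futures=20000, seed=7):
    """Toy Monte Carlo construction of a complementary cumulative
    distribution (CCDF) of cumulative release: each simulated future
    independently activates the scenario-defining events and sums their
    (hypothetical, normalized) release contributions."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_futures):
        total = 0.0
        for p, r in zip(event_probs, release_given_event):
            if rng.random() < p:
                total += r
        totals.append(total)

    def ccdf(threshold):
        """Probability that the cumulative release exceeds threshold."""
        return sum(1 for t in totals if t > threshold) / n_futures

    return ccdf

# Four postulated events/features, probabilities and releases invented
ccdf = simulate_ccdf(event_probs=[0.03, 0.10, 0.05, 0.20],
                     release_given_event=[1.0, 0.3, 0.5, 0.1])
prob_exceed_1 = ccdf(1.0)
```

    Each combination of occurring/non-occurring events corresponds to one of the 2^4 = 16 scenario classes mentioned above; the CCDF aggregates them into the single overall curve the standard asks for.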

  4. PAPIN: A Fortran-IV program to calculate cross section probability tables, Bondarenko and transmission self-shielding factors for fertile isotopes in the unresolved resonance region

    Energy Technology Data Exchange (ETDEWEB)

    Munoz-Cobos, J.G.

    1981-08-01

    The Fortran IV code PAPIN has been developed to calculate cross-section probability tables, Bondarenko self-shielding factors, and average self-indication ratios for non-fissile isotopes, below the inelastic threshold, on the basis of the ENDF/B prescriptions for the unresolved resonance region. Monte Carlo methods are utilized to generate ladders of resonance parameters in the unresolved resonance region from average resonance parameters and their appropriate distribution functions. The neutron cross sections are calculated by the single-level Breit-Wigner (SLBW) formalism, with s-, p- and d-wave contributions. The cross-section probability tables are constructed by sampling the Doppler-broadened cross sections. The various self-shielding factors are computed numerically as Lebesgue integrals over the cross-section probability tables. The program PAPIN has been validated through extensive comparisons with several deterministic codes.

  5. Calculation of probabilities of transfer, recurrence intervals, and positional indices for linear compartment models. Environmental Sciences Division Publication no. 1544

    Energy Technology Data Exchange (ETDEWEB)

    Carney, J.H.; DeAngelis, D.L.; Gardner, R.H.; Mankin, J.B.; Post, W.M.

    1981-02-01

    Six indices are presented for linear compartment systems that quantify the probable pathways of matter or energy transfer, the likelihood of recurrence if the model contains feedback loops, and the number of steps (transfers) through the system. General examples are used to illustrate how these indices can simplify the comparison of complex systems or organisms in unrelated systems.

  6. Calculation of the transition probabilities of superfluid Fermi gas with orbital angular momentum l=1 at low temperatures

    Directory of Open Access Journals (Sweden)

    S Nasirimoghadam

    2011-09-01

    Ultracold atomic Fermi gases such as 6Li undergo a transition to a superfluid state. The transport quantities of these fluids depend directly on the transition probabilities. Here, by enumerating the possible processes in a p-wave superfluid, we show that only binary processes are dominant at low temperatures.

  7. Calculation of probabilities of transfer, recurrence intervals, and positional indices for linear compartment models. Environmental Sciences Division Publication no. 1544

    International Nuclear Information System (INIS)

    Six indices are presented for linear compartment systems that quantify the probable pathways of matter or energy transfer, the likelihood of recurrence if the model contains feedback loops, and the number of steps (transfers) through the system. General examples are used to illustrate how these indices can simplify the comparison of complex systems or organisms in unrelated systems

  8. Grit-mediated frictional ignition of a polymer-bonded explosive during oblique impacts: Probability calculations for safety engineering

    International Nuclear Information System (INIS)

    Frictional heating of high-melting-point grit particles during oblique impacts of consolidated explosives is considered to be the major source of ignition in accidents involving dropped explosives. It has been shown in other work that the lower of the melting points of two frictionally interacting surfaces caps the maximum temperature reached, which provides a simple way to mitigate the danger in facilities by implementing surfaces with melting points below the ignition temperature of the explosive. However, a recent series of skid-testing experiments has shown that ignition can occur on low-melting-point surfaces with a high concentration of grit particles, most likely due to a grit–grit collision mechanism. For risk-based safety engineering purposes, the authors present a method to estimate the probability of grit contact and/or grit–grit collision during an oblique impact. These expressions are applied to potentially high-consequence oblique impact scenarios in order to give the probability of striking one or more grit particles (for high-melting-point surfaces), or the probability of one or more grit–grit collisions occurring (for low-melting-point surfaces). The probability depends on a variety of factors, many of which can be controlled to achieve acceptable risk levels for safe explosives handling operations.
    - Highlights:
    • Unexpectedly, grit-mediated ignition of a PBX occurred on low-melting-point surfaces.
    • On high-melting surfaces frictional heating is due to a grit–surface interaction.
    • For low-melting-point surfaces the heating mechanism is grit–grit collisions.
    • A method for estimating the probability of ignition is presented for both surfaces
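
    "One or more" probabilities of this kind take the familiar Poisson form under an assumption of spatially random grit. The sketch below illustrates that form only; the density and contact-area values are invented, and the paper's expressions account for more of the impact geometry.

```python
import math

def prob_at_least_one_contact(grit_density_per_mm2, contact_area_mm2):
    """If grit particles are scattered randomly (Poisson) on a surface,
    the chance that a swept contact patch touches at least one particle
    is 1 - exp(-density * area)."""
    expected_hits = grit_density_per_mm2 * contact_area_mm2
    return 1.0 - math.exp(-expected_hits)

# Illustrative: 0.01 particles/mm^2 over a 100 mm^2 swept patch
p_contact = prob_at_least_one_contact(0.01, 100.0)
```

    The same complement-of-exponential structure applies to grit–grit collisions, with the expected number of collisions replacing the expected number of contacts.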

  9. Gamma probability distribution calculation using orthogonal polynomials

    Institute of Scientific and Technical Information of China (English)

    高钰婧; 宋松柏

    2013-01-01

    A numerical method for calculating Gamma distribution values is presented. Taking the two-parameter gamma distribution commonly used in hydrological analysis as an example, and exploiting the high-precision arithmetic of Mathematica, the paper calculates the recurrence-relation coefficients of the non-classical orthogonal polynomials of the weight function and uses them to compute the hydrological probability distribution. The results show that the method has high accuracy, is a general numerical-integration algorithm for arbitrary weight functions and intervals, and can provide a practical route for hydrological probability calculations.
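
    As a cross-check on any such routine, the two-parameter gamma CDF can be computed with the standard series for the regularized lower incomplete gamma function. This sketch uses ordinary double precision rather than Mathematica's arbitrary-precision arithmetic, and it is not the orthogonal-polynomial method of the paper.

```python
import math

def gamma_cdf(x, shape, scale=1.0, tol=1e-14):
    """CDF of the two-parameter gamma distribution via the power series
    for the regularized lower incomplete gamma function P(a, x).
    Adequate for moderate x; production code would switch to a
    continued-fraction evaluation when x >> shape."""
    if x <= 0:
        return 0.0
    t = x / scale
    # Series: P(a,t) = t^a e^-t / Gamma(a) * sum_n t^n / (a(a+1)...(a+n))
    term = 1.0 / shape
    total = term
    n = 0
    while term > tol * total:
        n += 1
        term *= t / (shape + n)
        total += term
    log_p = shape * math.log(t) - t - math.lgamma(shape) + math.log(total)
    return math.exp(log_p)
```

    For shape 1 the gamma distribution reduces to the exponential, so gamma_cdf(x, 1.0) must equal 1 - exp(-x), a convenient check on the series.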

  10. Lab Retriever: a software tool for calculating likelihood ratios incorporating a probability of drop-out for forensic DNA profiles

    OpenAIRE

    Inman, Keith; Rudin, Norah; Cheng, Ken; Robinson, Chris; Kirschner, Adam; Inman-Semerau, Luke; Lohmueller, Kirk E.

    2015-01-01

    Background Technological advances have enabled the analysis of very small amounts of DNA in forensic cases. However, the DNA profiles from such evidence are frequently incomplete and can contain contributions from multiple individuals. The complexity of such samples confounds the assessment of the statistical weight of such evidence. One approach to account for this uncertainty is to use a likelihood ratio framework to compare the probability of the evidence profile under different scenarios....
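
    A flavor of a drop-out-aware likelihood ratio can be given for a single locus. The model below (independent per-allele drop-out with probability d, no drop-in, no peak heights, homozygotes visible unless both copies drop) is a deliberately simplified illustration, not Lab Retriever's actual implementation.

```python
def lr_dropout(p_a, d):
    """Single-locus likelihood ratio for an evidence profile showing
    only allele A, with a suspect of genotype AB.

    Hp (suspect is the source): A seen and B dropped -> (1-d)*d.
    Hd (unknown person is the source): sum over genotypes that could
    leave only A visible -- AA with not-both-copies-dropped, or AX
    (X != A, frequency-weighted) with X dropped and A seen.
    p_a is the population frequency of allele A.
    """
    p_hp = (1.0 - d) * d
    p_hd = (p_a ** 2) * (1.0 - d ** 2) \
         + 2.0 * p_a * (1.0 - p_a) * (1.0 - d) * d
    return p_hp / p_hd
```

    Note that with d = 0 the evidence is impossible under Hp (the suspect's B allele should have appeared), so the LR collapses to zero, which is the qualitative behavior a drop-out model is meant to capture.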

  11. Are classical molecular dynamics calculations accurate for state-to-state transition probabilities in the H + D2 reaction?

    International Nuclear Information System (INIS)

    We present converged quantum dynamics for the H + D2 reaction at a total energy high enough to produce HD in the v' = 3, j' = 7 vibrational-rotational state and for total angular momenta J = 0, 1, and 2. We compare state-to-state partial cross sections for H + D2 (v = 0-2, j = 0, J = 0-2) → HD (v' = 0-2, j') + H and H + D2 (v = 1, j = 6, J = 0-2) → HD (v' = 0-2, j') + H as calculated from classical trajectory calculations with quantized initial conditions, i.e., a quasiclassical trajectory (QCT) simulation, to the results of converged quantum dynamics calculations involving up to 654 coupled channels. Final states in the QCT calculations are assigned by the quadratic smooth sampling (QSS) method. Since the quasiclassical and quantal calculations are carried out with the same potential energy surface, the comparison provides a direct test of the accuracy of the quasiclassical simulations as a function of the initial vibrational-rotational state and the final vibrational-rotational state

  12. Qualification of the calculational methods of the fluence in the pressurised water reactors. Improvement of the cross sections treatment by the probability table method

    International Nuclear Information System (INIS)

    It is indispensable to know the fluence on the nuclear reactor pressure vessel. The cross sections and their treatment play an important role in this problem. In this study, two benchmarks have been interpreted with the Monte Carlo transport program TRIPOLI to qualify the calculational method and the cross sections used in the calculations. For the treatment of the cross sections, the multigroup method is usually used, but it suffers from some problems, such as the difficulty of choosing the weighting function and the need for a large number of energy groups to represent the fluctuations of the cross section well. In this thesis, we propose a new method, called the probability table method, to treat the neutron cross sections. For the qualification, a program simulating neutron transport by the Monte Carlo method in one dimension has been written; the comparison of the multigroup results and the probability table results shows the advantages of this new method. The probability table has also been introduced into the TRIPOLI program; the calculational results for the iron deep-penetration benchmark are improved in comparison with the experimental results. It is therefore of interest to use this new method in shielding and neutronics calculations. (author). 42 refs., 109 figs., 36 tabs
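
    The tabulation step at the heart of the probability table method can be sketched as follows: cross-section values sampled across the resonance structure are grouped into equal-probability bands, each represented by a band probability and a band-average cross section. This is a minimal illustration of the tabulation idea only; real processing codes build the samples from resonance ladders and preserve additional moments.

```python
def probability_table(samples, n_bands):
    """Build a cross-section probability table: sort the sampled values,
    split them into n_bands equal-probability bands, and represent each
    band by (band probability, band-average cross section).

    The table preserves the mean cross section by construction and lets
    a transport code sample a representative value per band instead of
    resolving the full resonance fluctuation.
    """
    xs = sorted(samples)
    size = len(xs) // n_bands
    table = []
    for b in range(n_bands):
        # Last band absorbs any remainder when len(xs) % n_bands != 0
        band = xs[b * size:(b + 1) * size] if b < n_bands - 1 else xs[b * size:]
        table.append((len(band) / len(xs), sum(band) / len(band)))
    return table
```

    Summing probability times band average recovers the overall mean cross section, which is the basic consistency requirement on any such table.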

  13. Relativistic many-body calculations of excitation energy and radiative transition probabilities for many-electron ions

    International Nuclear Information System (INIS)

    Energy levels, line strengths, oscillator strengths, and transition rates are calculated for electric dipole nl1nl2[LSJ]-nl3nl4[L'S'J'] transitions in Be- (n=2), Mg- (n=3), Zn- (n=4) and Sm- (n=5) like ions with nuclear charges ranging from Z=N to 100, where N is the number of electrons in the system. (author)

  14. Relativistic calculation of the beta decay probabilities in the optimized Dirac-Kohn-Sham atom model and a chemical environment effect

    Energy Technology Data Exchange (ETDEWEB)

    Glushkov, Alexander [Odessa University (Ukraine); Russian Academy of Sciences, Troitsk (Russian Federation); Khetselius, Olga; Dubrovskaya, Yuliya [Odessa University (Ukraine); Lovett, Ludmila [UK National Academy of Sciences and Bookdata Co., London (United Kingdom)

    2009-07-01

    A new theoretical scheme for calculating the beta decay characteristics, and for accounting for the effect of the chemical environment on them, is developed. As methods for calculating the relativistic fields and electron wave functions, the gauge-invariant Dirac-Fock and Dirac-Kohn-Sham approaches are used. The results of calculating the decay probabilities for the beta decays {sup 33}P-{sup 33}S, {sup 35}S-{sup 35}Cl, {sup 63}Ni-{sup 63}Cu, and {sup 241}Pu-{sup 241}Am are presented. A comparison of the Fermi function values is carried out for different approximations of the exchange-effect account: calculation using the wave functions on the boundary of the charged spherical nucleus, and using the squares of the amplitudes of the expansion of these functions near zero.

  15. Relativistic many-body calculations of excitation energy and radiative transition probabilities for many-electron ions

    Energy Technology Data Exchange (ETDEWEB)

    Safronova, U.I.; Johnson, W.R. [Dept. of Physics, Univ. of Notre Dame, IN (United States)

    2000-01-01

    Energy levels, line strengths, oscillator strengths, and transition rates are calculated for electric dipole nl{sub 1}nl{sub 2}[LSJ]-nl{sub 3}nl{sub 4}[L'S'J'] transitions in Be- (n=2), Mg- (n=3), Zn- (n=4) and Sm- (n=5) like ions with nuclear charges ranging from Z=N to 100, where N is the number of electrons in the system. (author)

  16. Electron-ion recombination in nuclear recoils tracks in nonpolar liquids. Calculation of the effect of external electric field on the escape probability

    Science.gov (United States)

    Mateja, Piotr; Wojcik, Mariusz

    2016-07-01

    A computer simulation method is applied to study electron-ion recombination in tracks of low-energy nuclear recoils in nonpolar liquids in which the electron transport can be described as ideal diffusion. The electron escape probability is calculated as a function of the applied electric field, both for the field parallel to the track and for the field perpendicular to it. The longer the ionization track, the stronger the dependence of the escape probability on the field direction; a significant effect is found already for tracks of ~100 nm length. The results are discussed in the context of possible applications of nonpolar molecular liquids as target media in directional dark matter detectors.
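The simulation idea can be sketched for a single geminate pair (all parameters below are assumed round numbers, and a one-pair walk stands in for the paper's full multi-pair track model):

```python
import math
import random

# Minimal random-walk sketch of geminate escape: a single electron performs
# an ideal-diffusion walk in the Coulomb field of its parent ion, with an
# optional uniform external field along +z. It recombines inside R_REACT
# and is counted as escaped beyond R_ESCAPE. Units are arbitrary.
R_REACT = 1.0     # recombination radius
R_ESCAPE = 30.0   # radius treated as escape
STEP = 0.5        # rms diffusive step per hop, per axis
R_C = 10.0        # Onsager-radius-like constant setting Coulomb attraction

def walk(r0, field, rng, max_steps=10_000):
    x, y, z = 0.0, 0.0, r0            # start displaced from the ion at origin
    for _ in range(max_steps):
        r = math.sqrt(x * x + y * y + z * z)
        if r < R_REACT:
            return False              # recombined
        if r > R_ESCAPE:
            return True               # escaped
        # deterministic drift toward the ion (Coulomb) plus external field
        coul = -R_C * STEP ** 2 / (2.0 * r ** 3)
        x += rng.gauss(0.0, STEP) + coul * x
        y += rng.gauss(0.0, STEP) + coul * y
        z += rng.gauss(0.0, STEP) + coul * z + field
    return False                      # unresolved walks counted as recombined

def escape_probability(r0, field, n=1500, seed=1):
    rng = random.Random(seed)
    return sum(walk(r0, field, rng) for _ in range(n)) / n

p0 = escape_probability(5.0, 0.00)    # no external field
pE = escape_probability(5.0, 0.05)    # field pushing the electron outward
print(p0, pE)  # the field should raise the escape probability
```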

  17. Tumor control probability and the utility of 4D vs 3D dose calculations for stereotactic body radiotherapy for lung cancer

    Energy Technology Data Exchange (ETDEWEB)

    Valdes, Gilmer, E-mail: gilmer.valdes@uphs.upenn.edu [Department of Radiation Oncology, Perelman Center for Advanced Medicine, University of Pennsylvania, Philadelphia, PA (United States); Robinson, Clifford [Department of Radiation Oncology, Siteman Cancer Center, Washington University in St. Louis, St. Louis, MO (United States); Lee, Percy [Department of Radiation Oncology, David Geffen School of Medicine, UCLA, Los Angeles, CA (United States); Morel, Delphine [Department of Biomedical Engineering, AIX Marseille 2 University, Marseille (France); Department of Medical Physics, Joseph Fourier University, Grenoble (France); Low, Daniel; Iwamoto, Keisuke S.; Lamb, James M. [Department of Radiation Oncology, David Geffen School of Medicine, UCLA, Los Angeles, CA (United States)

    2015-04-01

    Four-dimensional (4D) dose calculations for lung cancer radiotherapy have been technically feasible for a number of years but have not become standard clinical practice. The purpose of this study was to determine if clinically significant differences in tumor control probability (TCP) exist between 3D and 4D dose calculations so as to inform the decision whether 4D dose calculations should be used routinely for treatment planning. Radiotherapy plans for Stage I-II lung cancer were created for 8 patients. Clinically acceptable treatment plans were created with dose calculated on the end-exhale 4D computed tomography (CT) phase using a Monte Carlo algorithm. Dose was then projected onto the remaining 9 phases of 4D-CT using the Monte Carlo algorithm and accumulated onto the end-exhale phase using commercially available deformable registration software. The resulting dose-volume histograms (DVHs) of the gross tumor volume (GTV), planning target volume (PTV), and PTV{sub setup} were compared according to target coverage and dose. The PTV{sub setup} was defined as a volume including the GTV and a margin for setup uncertainties but not for respiratory motion. TCPs resulting from these DVHs were estimated using a wide range of alphas, betas, and tumor cell densities. Differences of up to 5 Gy were observed between 3D and 4D calculations for a PTV with highly irregular shape. When the TCP was calculated using the resulting DVHs for fractionation schedules typically used in stereotactic body radiation therapy (SBRT), the TCP differed at most by 5% between 4D and 3D cases, and in most cases by less than 1%. We conclude that 4D dose calculations are not necessary for most cases treated with SBRT, but they might be valuable for irregularly shaped target volumes. If 4D calculations are used, 4D DVHs should be evaluated on volumes that include margin for setup uncertainty but not respiratory motion.
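A hedged sketch of how a Poisson TCP is obtained from a dose-volume histogram; the LQ parameters, clonogen density, and toy DVHs below are illustrative values, not those of the study:

```python
import math

# Poisson TCP from a (dose, volume) histogram under the linear-quadratic
# model. ALPHA, BETA, RHO, the fractionation, and both DVHs are invented.
ALPHA = 0.35       # 1/Gy
BETA = 0.035       # 1/Gy^2
RHO = 1e7          # clonogen density, cells/cm^3
N_FRACTIONS = 3    # SBRT-like schedule

def tcp(dvh):
    """dvh: list of (total_dose_Gy, volume_cm3) bins covering the target."""
    ln_tcp = 0.0
    for D, v in dvh:
        d = D / N_FRACTIONS                       # dose per fraction
        surviving_fraction = math.exp(-(ALPHA * D + BETA * D * d))
        ln_tcp += -RHO * v * surviving_fraction   # Poisson: ln P = -<survivors>
    return math.exp(ln_tcp)

dvh_3d = [(54.0, 10.0), (52.0, 5.0)]   # e.g. dose computed on a single phase
dvh_4d = [(50.0, 10.0), (49.0, 5.0)]   # e.g. accumulated 4D dose, slightly lower
print(tcp(dvh_3d), tcp(dvh_4d))
```

With SBRT-scale doses both TCPs sit near 1, so even a few-Gy DVH shift moves the TCP very little, which is consistent with the study's sub-1% differences in most cases.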

  18. Relativistic calculations of charge transfer probabilities in U92+ - U91+(1s) collisions using the basis set of cubic Hermite splines

    CERN Document Server

    Maltsev, I A; Tupitsyn, I I; Shabaev, V M; Kozhedub, Y S; Plunien, G; Stoehlker, Th

    2013-01-01

    A new approach for solving the time-dependent two-center Dirac equation is presented. The method is based on using a finite basis set of cubic Hermite splines on a two-dimensional lattice. The Dirac equation is treated in a rotating reference frame. The collision of U92+ (as a projectile) and U91+ (as a target) is considered at the energy E_lab=6 MeV/u. The charge transfer probabilities are calculated for different values of the impact parameter. The obtained results are compared with the previous calculations [I. I. Tupitsyn et al., Phys. Rev. A 82, 042701 (2010)], where a method based on atomic-like Dirac-Sturm orbitals was employed. This work can provide a new tool for investigating quantum electrodynamics effects in heavy-ion collisions near the supercritical regime.

  19. Comparison Analysis for Calculating Spacecraft Collision Probability

    Institute of Scientific and Technical Information of China (English)

    齐征; 殷建丰; 韩潮

    2012-01-01

    General methods for calculating the collision probability between two orbiting objects were developed. The probability was calculated from the given state vectors and error covariance matrices, and verified numerically by the Monte Carlo method; a parallel implementation of the Monte Carlo verification was also presented. Seven different methods were compared with respect to procedure and computation time, and the effects of uncertainty on the collision probability were analyzed. A test case based on the International Space Station was computed with all seven methods and the results were compared with the actual event. The comparison shows that calculating the collision probability by three-dimensional numerical integration keeps the computation simple and fast.
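The core computation can be sketched for the standard short-term encounter model (the miss vector, covariance, and hard-body radius below are invented; the real problem also involves mapping state vectors into the encounter plane):

```python
import math
import random

# In the encounter plane the relative position is Gaussian with a diagonal
# combined covariance; a collision is a miss distance inside the combined
# hard-body radius R. The same probability is computed two ways: direct 2D
# quadrature over the hard-body disc, and Monte Carlo sampling.
MU = (120.0, 40.0)     # projected miss vector, m (illustrative)
SIG = (80.0, 30.0)     # principal standard deviations, m (illustrative)
R = 20.0               # combined hard-body radius, m

def pdf(x, y):
    gx = math.exp(-0.5 * ((x - MU[0]) / SIG[0]) ** 2) / (SIG[0] * math.sqrt(2 * math.pi))
    gy = math.exp(-0.5 * ((y - MU[1]) / SIG[1]) ** 2) / (SIG[1] * math.sqrt(2 * math.pi))
    return gx * gy

def p_quadrature(n=400):
    # midpoint rule over the disc of radius R centered at the origin
    h = 2.0 * R / n
    total = 0.0
    for i in range(n):
        x = -R + (i + 0.5) * h
        half_chord = math.sqrt(max(R * R - x * x, 0.0))
        dy = 2.0 * half_chord / n
        for j in range(n):
            y = -half_chord + (j + 0.5) * dy
            total += pdf(x, y) * h * dy
    return total

def p_monte_carlo(n=200_000, seed=7):
    rng = random.Random(seed)
    hits = sum(
        1 for _ in range(n)
        if math.hypot(rng.gauss(MU[0], SIG[0]), rng.gauss(MU[1], SIG[1])) <= R
    )
    return hits / n

print(p_quadrature(), p_monte_carlo())  # the two estimates should agree closely
```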

  20. Impact of aging conditions on mechanical properties of thermoplastic polyurethane

    International Nuclear Information System (INIS)

    In this study, the impact of environmental aging conditions on the mechanical properties of thermoplastic polyurethane (TPU) was investigated; in particular, the effect of temperature on water diffusion was studied. Water-sorption experiments, tensile tests and dynamic mechanical thermal analysis (DMTA) were performed after immersion in distilled water at different temperatures (25, 70 and 90 °C). The sorption process was analyzed by gravimetric measurements at the different temperatures, and the diffusion coefficients of solvent molecules in the TPU samples were identified, from which the activation energy and the mixing enthalpy were deduced. The impact of aging on some mechanical properties of this material was investigated after various aging cycles, and degradation of the mechanical properties was observed: the elastic modulus and the stress at 200% strain decreased. It was also shown that such degradation largely depends on both the aging temperature and the immersion duration. The storage modulus (E') was also affected by the hygrothermal (HT) environment. The modification of the mechanical properties appears well correlated with structural observations obtained from scanning electron microscopy (SEM) photographs. Finally, thermal aging experiments indicated that the combination of temperature with water is a major factor in TPU degradation.
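The step from temperature-dependent diffusion coefficients to an activation energy is a standard Arrhenius fit, sketched below with invented diffusion coefficients (the study's actual values are not given in the abstract):

```python
import math

# Arrhenius fit ln D = ln D0 - Ea/(R*T) through diffusion coefficients
# identified at the three immersion temperatures. The D values are assumed,
# order-of-magnitude-plausible numbers for water in a polymer.
R_GAS = 8.314  # J/(mol*K)
data = [(25.0, 1.1e-12), (70.0, 8.0e-12), (90.0, 2.0e-11)]  # (deg C, D in m^2/s)

xs = [1.0 / (T + 273.15) for T, _ in data]   # 1/T in 1/K
ys = [math.log(D) for _, D in data]          # ln D

# least-squares slope of ln D versus 1/T
n = len(data)
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sum(
    (x - xbar) ** 2 for x in xs
)
ea_kj_mol = -slope * R_GAS / 1000.0          # activation energy, kJ/mol
print(round(ea_kj_mol, 1), "kJ/mol")
```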

  1. Simultaneous analysis of matter radii, transition probabilities, and excitation energies of Mg isotopes by angular-momentum-projected configuration-mixing calculations

    Science.gov (United States)

    Shimada, Mitsuhiro; Watanabe, Shin; Tagami, Shingo; Matsumoto, Takuma; Shimizu, Yoshifumi R.; Yahiro, Masanobu

    2016-06-01

    We perform simultaneous analysis of (1) matter radii, (2) B(E2; 0+ → 2+) transition probabilities, and (3) excitation energies, E(2+) and E(4+), for 24-40Mg by using the beyond-mean-field (BMF) framework with angular-momentum-projected configuration mixing with respect to the axially symmetric β2 deformation with infinitesimal cranking. The BMF calculations successfully reproduce all of the data for r_m, B(E2), E(2+), and E(4+), indicating that the framework is quite useful for data analysis, particularly for low-lying states. We also discuss the absolute value of the deformation parameter β2 deduced from measured values of B(E2) and r_m. This framework makes it possible to investigate the effects of β2 deformation, the change in β2 due to restoration of rotational symmetry, β2 configuration mixing, and the inclusion of time-odd components by infinitesimal cranking. Under the assumption of axial deformation and parity conservation, we clarify which effect is important for each of the three measurements and propose the kinds of BMF calculations that are practical for each of the three kinds of observables.

   2. Progress in Calculation Methods for Collision Probability of Spacecraft

    Institute of Scientific and Technical Information of China (English)

    杨维维; 赵勇; 陈小前; 王振国

    2012-01-01

    To investigate collision detection and prediction, the status and development of spacecraft collision probability computations were reviewed. First, the characteristics and applicability of typical formulations were summarized for both linear and nonlinear encounters. Second, to meet the real-time collision-warning requirements of certain missions, calculation methods for the instantaneous collision probability and the maximum instantaneous collision probability were discussed. Finally, some open problems and future directions were proposed based on the analysis of recent work, providing a reference for future research and technical development in trajectory safety.

  3. Uncertainty calculation in the RIO air quality interpolation model and aggregation to yearly average and exceedance probability taking into account the temporal auto-correlation.

    Science.gov (United States)

    Maiheu, Bino; Nele, Veldeman; Janssen, Stijn; Fierens, Frans; Trimpeneers, Elke

    2010-05-01

    RIO is an operational air quality interpolation model developed by VITO and IRCEL-CELINE and produces hourly maps for different pollutant concentrations such as O3, PM10 and NO2 measured in Belgium [1]. The RIO methodology consists of residual interpolation by Ordinary Kriging of the residuals of the measured concentrations and pre-determined trend functions which express the relation between land cover information derived from the CORINE dataset and measured time-averaged concentrations [2]. RIO is an important tool for the Flemish administration and is among others used to report, as is required by each member state, on the air quality status in Flanders to the European Union. We feel that a good estimate of the uncertainty of the yearly average concentration maps and the probability of norm-exceedance are both as important as the values themselves. In this contribution we will discuss the uncertainties specific to the RIO methodology, where we have both contributions from the Ordinary Kriging technique as well as the trend functions. Especially the parameterisation of the uncertainty w.r.t. the trend functions will be the key indicator for the degree of confidence the model puts into using land cover information for spatial interpolation of pollutant concentrations. Next, we will propose a method which enables us to calculate the uncertainty on the yearly average concentrations as well as the number of exceedance days, taking into account the temporal auto-correlation of the concentration fields. It is clear that the autocorrelation will have a strong impact on the uncertainty estimation [3] of yearly averages. The method we propose is based on a Monte Carlo technique that generates an ensemble of interpolation maps with the correct temporal auto-correlation structure. From a generated ensemble, the calculation of norm-exceedance probability at each interpolation location becomes quite straightforward. 
A comparison with the ad-hoc method proposed in [3], where
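The ensemble idea described above can be sketched with an AR(1) surrogate for the temporal autocorrelation (all statistics below are assumed, PM10-like numbers; this is not the RIO implementation):

```python
import math
import random

# Generate an ensemble of daily concentration years with a prescribed lag-1
# autocorrelation, then estimate the probability of exceeding a 50 ug/m3
# daily limit on more than 35 days per year. MEAN, SD, and PHI are assumed.
MEAN, SD = 34.0, 12.0   # daily mean and standard deviation, ug/m3
PHI = 0.5               # lag-1 temporal autocorrelation
LIMIT, MAX_DAYS = 50.0, 35

def year_series(rng, days=365):
    x = rng.gauss(0.0, 1.0)
    series = []
    for _ in range(days):
        # AR(1) update keeps the marginal N(0,1) while correlating days
        x = PHI * x + math.sqrt(1.0 - PHI ** 2) * rng.gauss(0.0, 1.0)
        series.append(MEAN + SD * x)
    return series

def exceedance_probability(n_ensemble=2000, seed=3):
    rng = random.Random(seed)
    over = 0
    for _ in range(n_ensemble):
        days_over = sum(1 for c in year_series(rng) if c > LIMIT)
        if days_over > MAX_DAYS:
            over += 1
    return over / n_ensemble

p = exceedance_probability()
print(f"P(more than {MAX_DAYS} exceedance days) ~ {p:.2f}")
# with PHI = 0 the same mean number of exceedance days gives a narrower
# distribution, i.e. ignoring autocorrelation misstates this probability
```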

  4. URR [Unresolved Resonance Region] computer code: A code to calculate resonance neutron cross-section probability tables, Bondarenko self-shielding factors, and self-indication ratios for fissile and fertile nuclides

    International Nuclear Information System (INIS)

    The URR computer code has been developed to calculate cross-section probability tables, Bondarenko self-shielding factors, and self-indication ratios for fertile and fissile isotopes in the unresolved resonance region. Monte Carlo methods are utilized to select appropriate resonance parameters and to compute the cross sections at the desired reference energy. The neutron cross sections are calculated by the single-level Breit-Wigner formalism with s-, p-, and d-wave contributions. The cross-section probability tables are constructed by sampling the Doppler-broadened cross sections. The various self-shielding factors are computed numerically as Lebesgue integrals over the cross-section probability tables.
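Once a probability table exists, a Bondarenko self-shielding factor reduces to a weighted sum over its bands. The table below is an invented example, not URR output:

```python
# Bondarenko self-shielding factor from a cross-section probability table:
# (probability, cross_section) bands for one energy group; sigma0 is the
# background ("dilution") cross section. All numbers are illustrative.
table = [(0.30, 3.0), (0.40, 15.0), (0.20, 80.0), (0.10, 400.0)]

def sigma_eff(sigma0):
    # narrow-resonance flux weighting: flux ~ 1 / (sigma + sigma0)
    num = sum(p * s / (s + sigma0) for p, s in table)
    den = sum(p / (s + sigma0) for p, s in table)
    return num / den

sigma_inf = sum(p * s for p, s in table)  # infinite-dilution cross section

def bondarenko_factor(sigma0):
    return sigma_eff(sigma0) / sigma_inf

for s0 in (1.0, 100.0, 100000.0):
    print(s0, round(bondarenko_factor(s0), 3))
# the factor rises monotonically toward 1 as the dilution sigma0 grows
```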

  5. Simultaneous analysis of matter radii, transition probabilities, and excitation energies of Mg isotopes by angular-momentum-projected configuration-mixing calculations

    CERN Document Server

    Shimada, Mitsuhiro; Tagami, Shingo; Matsumoto, Takuma; Shimizu, Yoshifumi R; Yahiro, Masanobu

    2016-01-01

    We perform simultaneous analysis of (1) matter radii, (2) $B(E2; 0^+ \\rightarrow 2^+ )$ transition probabilities, and (3) excitation energies, $E(2^+)$ and $E(4^+)$, for $^{24-40}$Mg by using the beyond mean-field (BMF) framework with angular-momentum-projected configuration mixing with respect to the axially symmetric $\\beta_2$ deformation with infinitesimal cranking. The BMF calculations successfully reproduce all of the data for $r_{\\rm m}$, $B(E2)$, and $E(2^+)$ and $E(4^+)$, indicating that it is quite useful for data analysis, particularly for low-lying states. We also discuss the absolute value of the deformation parameter $\\beta_2$ deduced from measured values of $B(E2)$ and $r_{\\rm m}$. This framework makes it possible to investigate the effects of $\\beta_2$ deformation, the change in $\\beta_2$ due to restoration of rotational symmetry, $\\beta_2$ configuration mixing, and the inclusion of time-odd components by infinitesimal cranking. Under the assumption of axial deformation and parity conservation,...

  6. Application of multi-dimensional discrimination diagrams and probability calculations to Paleoproterozoic acid rocks from Brazilian cratons and provinces to infer tectonic settings

    Science.gov (United States)

    Verma, Sanjeet K.; Oliveira, Elson P.

    2013-08-01

    In the present work, we applied two sets of new multi-dimensional geochemical diagrams (Verma et al., 2013), obtained from linear discriminant analysis (LDA) of natural logarithm-transformed ratios of major elements and immobile major and trace elements in acid magmas, to decipher plate tectonic settings and corresponding probability estimates for Paleoproterozoic rocks from the Amazonian craton, São Francisco craton, São Luís craton, and Borborema province of Brazil. The robustness of LDA minimizes the effects of petrogenetic processes and maximizes the separation among the different tectonic groups. The probability-based boundaries further provide an objective statistical method in comparison to the commonly used subjective method of determining the boundaries by eye judgment. The use of major element data readjusted to 100% on an anhydrous basis from the SINCLAS computer program also helps to minimize the effects of post-emplacement compositional changes and analytical errors on these tectonic discrimination diagrams. Fifteen case studies of acid suites highlighted the application of these diagrams and probability calculations. The first case study, on the Jamon and Musa granites, Carajás area (Central Amazonian Province, Amazonian craton), shows a collision setting (previously thought anorogenic). A collision setting was clearly inferred for the Bom Jardim granite, Xingú area (Central Amazonian Province, Amazonian craton). The third case study, on the Older São Jorge, Younger São Jorge and Maloquinha granites, Tapajós area (Ventuari-Tapajós Province, Amazonian craton), indicated a within-plate setting (previously transitional between volcanic arc and within-plate). We also recognized a within-plate setting for the next three case studies, on the Aripuanã and Teles Pires granites (SW Amazonian craton) and the Pitinga area granites (Mapuera Suite, NW Amazonian craton), which were all previously suggested to have been emplaced in post-collision to within-plate settings. The seventh case

   7. Calculation on probability of fire caused by lightning for external floating roof oil tanks

    Institute of Scientific and Technical Information of China (English)

    刘健; 杨仲江; 卢慧慧

    2016-01-01

    Fire accidents caused by lightning strikes on large external floating roof (EFR) oil tanks have occurred many times, so it is of practical significance to evaluate their safety objectively and to calculate the probability of fire caused by lightning. The damage modes of lightning on oil tanks were presented. The annual lightning strike incidence for external floating roof oil tanks was calculated by means of the Monte Carlo method combined with the electro-geometric model (EGM). The difference in protection effect between conventional electrostatic conductors and retractable grounding assemblies (RGA) was analyzed, and the annual accident rate of spark discharge caused by lightning on oil tanks fitted with RGAs was discussed. The results showed that the annual strike incidence increases with the diameter and wall height of the tank; that the protection effect of RGAs is clearly better than that of conventional electrostatic conductors; and that installing multiple RGAs significantly decreases the probability of spark generation and the annual accident rate. With only two RGAs installed, the annual accident rate of spark discharge caused by lightning can be reduced below 10^-5.
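The Monte Carlo/EGM combination can be sketched as follows. The striking-distance law r = 10·I^0.65 is the standard IEEE form; the tank dimensions, ground flash density, peak-current distribution, and the simple rim-attachment criterion are all assumptions of this sketch, not the paper's model:

```python
import math
import random

# Monte Carlo electro-geometric estimate of the annual lightning strike
# rate on a cylindrical tank.
TANK_RADIUS = 40.0        # m (assumed)
TANK_HEIGHT = 20.0        # m, wall height (assumed)
FLASH_DENSITY = 2.0e-6    # ground flashes per m^2 per year (assumed)
HALF = 300.0              # half-size of the sampled ground square, m

def striking_distance(i_ka):
    return 10.0 * i_ka ** 0.65        # r in m, I in kA

def attaches_to_tank(d, r):
    # Rolling-sphere capture radius of a flat-topped cylinder: the leader
    # jumps to the rim if it gets within r of the top edge before it gets
    # within r of the ground.
    if r <= TANK_HEIGHT:
        return d <= TANK_RADIUS + r
    return d <= TANK_RADIUS + math.sqrt(TANK_HEIGHT * (2.0 * r - TANK_HEIGHT))

def annual_strike_rate(n=200_000, seed=11):
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x = rng.uniform(-HALF, HALF)
        y = rng.uniform(-HALF, HALF)
        i_ka = 31.0 * math.exp(0.74 * rng.gauss(0.0, 1.0))  # ~log-normal, median 31 kA
        if attaches_to_tank(math.hypot(x, y), striking_distance(i_ka)):
            hits += 1
    return FLASH_DENSITY * (2.0 * HALF) ** 2 * hits / n

rate = annual_strike_rate()
print(f"~{rate:.3f} tank strikes per year")
```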

  8. Monte Carlo dose calculations and radiobiological modelling: analysis of the effect of the statistical noise of the dose distribution on the probability of tumour control

    International Nuclear Information System (INIS)

    The aim of this work is to investigate the influence of the statistical fluctuations of Monte Carlo (MC) dose distributions on the dose volume histograms (DVHs) and radiobiological models, in particular the Poisson model for tumour control probability (tcp). The MC matrix is characterized by a mean dose in each scoring voxel, d, and a statistical error on the mean dose, σd; whilst the quantities d and σd depend on many statistical and physical parameters, here we consider only their dependence on the phantom voxel size and the number of histories from the radiation source. Dose distributions from high-energy photon beams have been analysed. It has been found that the DVH broadens when increasing the statistical noise of the dose distribution, and the tcp calculation systematically underestimates the real tumour control value, defined here as the value of tumour control when the statistical error of the dose distribution tends to zero. When increasing the number of energy deposition events, either by increasing the voxel dimensions or increasing the number of histories from the source, the DVH broadening decreases and tcp converges to the 'correct' value. It is shown that the underestimation of the tcp due to the noise in the dose distribution depends on the degree of heterogeneity of the radiobiological parameters over the population; in particular this error decreases with increasing the biological heterogeneity, whereas it becomes significant in the hypothesis of a radiosensitivity assay for single patients, or for subgroups of patients. It has been found, for example, that when the voxel dimension is changed from a cube with sides of 0.5 cm to a cube with sides of 0.25 cm (with a fixed number of 10^8 histories from the source), the systematic error in the tcp calculation is about 75% in the homogeneous hypothesis, and it decreases to a minimum value of about 15% in a case of high radiobiological heterogeneity.
The possibility of using the error on the tcp to

  9. URR [Unresolved Resonance Region] computer code: A code to calculate resonance neutron cross-section probability tables, Bondarenko self-shielding factors, and self-indication ratios for fissile and fertile nuclides

    International Nuclear Information System (INIS)

    The URR computer code has been developed to calculate cross-section probability tables, Bondarenko self-shielding factors, and self-indication ratios for fertile and fissile isotopes in the unresolved resonance region. Monte Carlo methods are utilized to select appropriate resonance parameters and to compute the cross sections at the desired reference energy. The neutron cross sections are calculated by the single-level Breit-Wigner formalism with s-, p-, and d-wave contributions. The cross-section probability tables are constructed by sampling the Doppler-broadened cross sections. The various self-shielding factors are computed numerically as Lebesgue integrals over the cross-section probability tables. 6 refs

  10. Quantum probability

    CERN Document Server

    Gudder, Stanley P

    2014-01-01

    Quantum probability is a subtle blend of quantum mechanics and classical probability theory. Its important ideas can be traced to the pioneering work of Richard Feynman in his path integral formalism. Only recently have the concept and ideas of quantum probability been presented in a rigorous axiomatic framework, and this book provides a coherent and comprehensive exposition of this approach. It gives a unified treatment of operational statistics, generalized measure theory and the path integral formalism that can only be found in scattered research articles. The first two chapters survey the ne

  11. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.

  12. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  13. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  14. Continuous-Energy Adjoint Flux and Perturbation Calculation using the Iterated Fission Probability Method in Monte Carlo Code TRIPOLI-4® and Underlying Applications

    Science.gov (United States)

    Truchet, G.; Leconte, P.; Peneliau, Y.; Santamarina, A.; Malvagi, F.

    2014-06-01

    Pile-oscillation experiments are performed in the MINERVE reactor at the CEA Cadarache to improve nuclear data accuracy, in order to precisely calculate small reactivity variations, kinetic parameters (βeff, Λeff) or sensitivity parameters.

  15. Waste Package Misload Probability

    Energy Technology Data Exchange (ETDEWEB)

    J.K. Knudsen

    2001-11-20

    The objective of this calculation is to determine the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and of FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants; the categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in each event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.
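The arithmetic described above is a simple frequency estimate. The event counts and the denominator below are placeholders, not the Framatome ANP 2001a data:

```python
# Categorize fuel-handling events, count affected assemblies per category,
# and divide by the total number of FA movements. All numbers are invented.
events = [
    {"type": "misload", "assemblies": 1},
    {"type": "damage", "assemblies": 2},
    {"type": "misload", "assemblies": 1},
]
TOTAL_FA_MOVEMENTS = 250_000   # assumed industry-wide denominator

misloads = sum(e["assemblies"] for e in events if e["type"] == "misload")
damaged = sum(e["assemblies"] for e in events if e["type"] == "damage")

p_misload = misloads / TOTAL_FA_MOVEMENTS
p_damage = damaged / TOTAL_FA_MOVEMENTS
print(f"P(misload) ~ {p_misload:.1e} per FA movement")
print(f"P(damage)  ~ {p_damage:.1e} per FA movement")
```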

  16. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think of insurance companies facing losses due to natural disasters, banks seeking protection against huge losses, failures in expensive and sophisticated systems or loss of valuable information in electronic systems. The main difficulty when dealing with these kinds of problems is the unavailability of a closed

  17. The comparison of calculated transition probabilities with luminescence characteristics of erbium(III) in fluoride glasses and in the mixed yttrium-zirconium oxide crystal

    Science.gov (United States)

    Reisfeld, R.; Katz, G.; Jacoboni, C.; De Pape, R.; Drexhage, M. G.; Brown, R. N.; Jørgensen, C. K.

    1983-07-01

    Fluorozirconate glasses containing 2 mole% ErF3 were prepared by melting the binary fluorides with ammonium bifluoride under an atmosphere of carbon tetrachloride and argon at 850°C. Absorption spectra of these glasses were obtained and the Judd-Ofelt parameters were calculated. Emission spectra and lifetimes of erbium in fluorozirconate glass, in lead-gallium-zinc fluoride glass, and in yttrium-zirconium oxide crystal were measured and compared with the theoretical calculations. Laser emission lines in these materials are deduced from these measurements. It is suggested that materials doped with erbium may serve as light sources for fiber optic waveguides made from the undoped materials.
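The link between calculated transition probabilities and measured lifetimes used in such comparisons is direct: the radiative lifetime of a level is the inverse of the sum of its spontaneous emission rates. The rates below are invented placeholders, not Judd-Ofelt results from the paper:

```python
# Radiative lifetime and branching ratios from spontaneous emission rates
# A_ij of one emitting level (Er3+-style level labels, invented values).
rates = {
    "4S3/2 -> 4I15/2": 1800.0,   # s^-1, assumed
    "4S3/2 -> 4I13/2": 480.0,    # s^-1, assumed
    "4S3/2 -> 4I11/2": 60.0,     # s^-1, assumed
}

a_total = sum(rates.values())
tau_rad_ms = 1000.0 / a_total                       # lifetime in ms
branching = {line: a / a_total for line, a in rates.items()}

print(round(tau_rad_ms, 3), "ms")
print({k: round(v, 3) for k, v in branching.items()})
```

Comparing such a calculated radiative lifetime with the measured one gives the radiative quantum efficiency of the level.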

  18. Calculation method research of geometric collision probability for ship and ice

    Institute of Scientific and Technical Information of China (English)

    张健; 万正权; 张充霖; 尹群

    2013-01-01

    Based on the probability theory of two-dimensional continuous random variables and on the composite-body concept from aerospace engineering, a typical channel in a specific sea area is chosen as the object of study, and a mathematical model for calculating the geometric collision probability between ships and ice is put forward. A program was written in Matlab to calculate the geometric collision probability. Finally, by studying the sizes of ships and ice floes and the relevant parameters of the ship distribution function, the relationship between the collision probability and each parameter is revealed.
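A minimal cross-channel version of such a geometric model can be sketched as follows (the distributions and dimensions are assumed for illustration; they are not the paper's parameters):

```python
import random

# Ship lateral position in the channel is Gaussian, a floe's position is
# uniform across the channel; "collision" means overlap of two bounding
# circles whose radii stand in for ship and floe size.
CHANNEL_HALF_WIDTH = 500.0   # m (assumed)
SHIP_SIGMA = 120.0           # std dev of ship lateral position, m (assumed)
SHIP_RADIUS = 15.0           # m, ship bounding circle (assumed)
ICE_RADIUS = 25.0            # m, floe bounding circle (assumed)

def collision_probability(n=200_000, seed=5):
    rng = random.Random(seed)
    limit = SHIP_RADIUS + ICE_RADIUS
    hits = 0
    for _ in range(n):
        ship_y = rng.gauss(0.0, SHIP_SIGMA)
        ice_y = rng.uniform(-CHANNEL_HALF_WIDTH, CHANNEL_HALF_WIDTH)
        if abs(ship_y - ice_y) <= limit:
            hits += 1
    return hits / n

p = collision_probability()
print(round(p, 4))
# analytic check: while the ship stays well inside the channel, the answer
# is ~ 2*(SHIP_RADIUS + ICE_RADIUS) / channel width
print(round(2 * (SHIP_RADIUS + ICE_RADIUS) / (2 * CHANNEL_HALF_WIDTH), 4))
```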

  19. Monte Carlo transition probabilities

    OpenAIRE

    Lucy, L. B.

    2001-01-01

    Transition probabilities governing the interaction of energy packets and matter are derived that allow Monte Carlo NLTE transfer codes to be constructed without simplifying the treatment of line formation. These probabilities are such that the Monte Carlo calculation asymptotically recovers the local emissivity of a gas in statistical equilibrium. Numerical experiments with one-point statistical equilibrium problems for Fe II and Hydrogen confirm this asymptotic behaviour. In addition, the re...

  20. Asteroidal collision probabilities

    Science.gov (United States)

    Bottke, William F., Jr.; Greenberg, Richard

    1993-01-01

    Several past calculations of collision probabilities between pairs of bodies on independent orbits have yielded inconsistent results. We review the methodologies and identify their various problems. Greenberg's (1982) collision probability formalism (now with a corrected symmetry assumption) is equivalent to Wetherill's (1967) approach, except that it includes a way to avoid singularities near apsides. That method shows that the procedure by Namiki and Binzel (1991) was accurate for those cases where singularities did not arise.

  1. Probable approaches to develop particle beam energy drivers and to calculate wall material ablation with X ray radiation from imploded targets

    International Nuclear Information System (INIS)

    The first subject was the development of a future ion beam driver with a medium-mass ion species. This may enable us to develop a compromise driver from the point of view of the micro-divergence angle and the cost. We produced nitrogen ion beams and measured the micro-divergence angle on the anode surface. The measured value was 5-6 mrad for a beam with 300-400 keV energy, 300 A peak current and 50 ns duration. This value was small enough to be tolerable for the future energy driver. The corresponding value for a proton beam with higher peak current was 20-30 mrad, which was too large. Therefore, a scale-up experiment with this kind of medium-mass ion beam must be realized urgently to clarify the beam characteristics in more detail. The reactor wall ablation caused by the implosion X-rays was also calculated, as the second subject of this paper. (author)

  2. Relativistic calculations of the K-K charge transfer and K-vacancy production probabilities in low-energy ion-atom collisions

    CERN Document Server

    Tupitsyn, I I; Shabaev, V M; Bondarev, A I; Deyneka, G B; Maltsev, I A; Hagmann, S; Plunien, G; Stoehlker, Th

    2011-01-01

    The previously developed technique for evaluation of charge-transfer and electron-excitation processes in low-energy heavy-ion collisions [I.I. Tupitsyn et al., Phys. Rev. A 82, 042701 (2010)] is extended to collisions of ions with neutral atoms. The method employs the active electron approximation, in which only the active electron participates in the charge transfer and excitation processes while the passive electrons provide the screening DFT potential. The time-dependent Dirac wave function of the active electron is represented as a linear combination of atomic-like Dirac-Fock-Sturm orbitals, localized at the ions (atoms). The screening DFT potential is calculated using the overlapping densities of each ion (atom), derived from the atomic orbitals of the passive electrons. The atomic orbitals are generated by solving numerically the one-center Dirac-Fock and Dirac-Fock-Sturm equations by means of a finite-difference approach with the potential taken as the sum of the exact reference ion (atom) Dirac-Fock...

  3. Effect of physicochemical aging conditions on the composite-composite repair bond strength

    NARCIS (Netherlands)

    Brendeke, Johannes; Ozcan, Mutlu

    2007-01-01

    Purpose: This study evaluated the effect of different physicochemical aging methods and surface conditioning techniques on the repair bond strength of composite. It was hypothesized that the aging conditions would decrease the repair bond strength and surface conditioning methods would perform simil

  4. Research on Collision Probability Calculation of Space Debris for Nonlinear Relative Motion

    Institute of Scientific and Technical Information of China (English)

    许晓丽; 熊永清

    2011-01-01

    The calculation of collision probability is the foundation of collision detection and avoidance maneuvering for space objects. At present, an assumption of linear relative motion is usually applied, so that the complex three-dimensional problem reduces to a two-dimensional integral of the probability density function over a circular area. However, when the relative velocity is very small, the assumption of linear relative motion is no longer valid, and the collision probability must be calculated for the true nonlinear relative motion. A method for this calculation is studied, and test cases are designed to prove its validity. It is applicable to collision probability problems in which the relative velocity and error covariance vary with time. The results indicate that the nonlinear method is necessary under certain circumstances. For example, for elliptical relative motion in satellite formation flying, when the relative velocity is below 100 m/s, the relative error between the linear and nonlinear methods exceeds 5%; for conjunction analysis of two satellites in circular orbits, when the relative velocity is below 10 m/s, the relative error is larger than 1%. Some significant conclusions are obtained for the collision detection system of our country.
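
The linear-method reduction described above, integrating the position-error density over a circular area in the encounter plane, can be sketched numerically. The midpoint-rule integrator below assumes an uncorrelated two-dimensional Gaussian; all parameter names are illustrative:

```python
import math

def collision_prob_2d(R, sigma_x, sigma_y, mx=0.0, my=0.0, n=400):
    """Linear-method collision probability: integrate a 2-D Gaussian
    position-error density over a disc of radius R (the combined hard-body
    radius) in the encounter plane.  Midpoint rule in polar coordinates;
    assumes an uncorrelated Gaussian with offsets (mx, my)."""
    total = 0.0
    dr = R / n
    dth = 2.0 * math.pi / n
    norm = 1.0 / (2.0 * math.pi * sigma_x * sigma_y)
    for i in range(n):
        r = (i + 0.5) * dr
        for j in range(n):
            th = (j + 0.5) * dth
            x, y = r * math.cos(th), r * math.sin(th)
            pdf = norm * math.exp(-0.5 * (((x - mx) / sigma_x) ** 2 +
                                          ((y - my) / sigma_y) ** 2))
            total += pdf * r * dr * dth   # polar area element r dr dtheta
    return total
```

For the isotropic zero-mean case the closed form 1 - exp(-R^2 / (2 sigma^2)) provides a direct check of the integrator.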

  5. Method for calculating collision probability between space objects with unknown position uncertainty

    Institute of Scientific and Technical Information of China (English)

    吴波; 赵拥军; 胡德秀

    2011-01-01

    A method is proposed to calculate the collision probability between space objects when the position uncertainty of the spacecraft and space debris is unknown. Based on an orthogonal projection transformation, the three-dimensional problem is reduced to two dimensions, and the collision probability is then obtained by variable substitution. The influence of the shape and size of the position-error ellipsoid on the collision probability is discussed, and the method is validated against an actual collision case.
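
The orthogonal-projection step described above, reducing the 3-D problem to 2-D, can be illustrated as follows. The construction of the encounter-plane basis is a standard one and not necessarily the authors' exact formulation:

```python
import math

def cross(a, b):
    """Cross product of two 3-vectors (tuples)."""
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def unit(a):
    """Normalize a 3-vector."""
    n = math.sqrt(sum(x*x for x in a))
    return tuple(x / n for x in a)

def project_covariance(C, v_rel):
    """Project a 3x3 position-error covariance (nested list) onto the
    encounter plane perpendicular to the relative velocity v_rel,
    returning the 2x2 covariance M C M^T in that plane."""
    v = unit(v_rel)
    # pick a helper axis not parallel to v, then build an orthonormal basis
    a = (1.0, 0.0, 0.0) if abs(v[0]) <= 0.9 else (0.0, 1.0, 0.0)
    u1 = unit(cross(v, a))
    u2 = cross(v, u1)

    def quad(p, q):   # p^T C q
        return sum(p[i] * C[i][j] * q[j] for i in range(3) for j in range(3))

    return [[quad(u1, u1), quad(u1, u2)],
            [quad(u2, u1), quad(u2, u2)]]
```

Projecting a diagonal covariance along one of its axes should simply drop the component along the relative velocity, which makes the function easy to verify.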

  6. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...

  7. Study on Probability Calculation for Oil Tank Fire and Explosion Caused by Lightning

    Institute of Scientific and Technical Information of China (English)

    苏伯尼; 黄弘; 李云涛

    2013-01-01

    In order to perform a quantitative risk assessment of lightning-caused fire in oil tank areas, the probabilities of lightning-induced floating-roof tank fire and explosion are estimated. According to domestic and international standards, models of lightning and floating-roof tanks were established, and the probability of lightning striking tanks of different sizes was calculated with these models. The probability of fire and explosion when lightning strikes a tank was then estimated from existing experimental results, considering two paths: lightning burning through the tank shell, and lightning igniting a rim-seal spark. The results show that, under normal circumstances, the probability of a rim-seal spark causing fire is greater than that of lightning burning through the tank shell, and that the probability of lightning-induced fire and explosion is higher for tanks with a mechanical primary seal than for tanks with a soft primary seal.

  8. Probability calculation method of design flow rate for domestic hot-water supply

    Institute of Scientific and Technical Information of China (English)

    邹平华; 赵金玲

    2000-01-01

    The basic formula of the design flow rate for domestic hot-water supply is investigated on the basis of the probability method. The relationship between the water-supply assurance factor and the number of sanitary fittings operated simultaneously, which is used to determine the design flow rate, is analyzed. The design flow rate calculated by the probability method accounts for the effects of combining different types of consumption units, the consumption load per unit, and the total number of sanitary fittings. The formulas for the design flow rate of domestic hot-water supply used in the former USSR and Russia are introduced briefly. It is indicated that the probability method is an important research direction.
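
The core probabilistic idea, choosing the number of simultaneously operating fittings so that a water-supply assurance factor is met, can be sketched with a binomial model. The usage probability and per-fixture flow below are illustrative assumptions, not values from the paper:

```python
from math import comb

def fixtures_in_use(n_fixtures, p_use, assurance=0.99):
    """Smallest m such that P(at most m of n fixtures run simultaneously)
    meets the assurance factor, assuming each fixture is independently in
    use with probability p_use (a simple binomial model)."""
    cum = 0.0
    for m in range(n_fixtures + 1):
        cum += comb(n_fixtures, m) * p_use**m * (1 - p_use)**(n_fixtures - m)
        if cum >= assurance:
            return m
    return n_fixtures

# Design flow = simultaneous fixtures x flow per fixture (0.2 L/s illustrative)
m = fixtures_in_use(50, 0.1, 0.99)
design_flow = m * 0.2
```

Raising the assurance factor raises the design number of simultaneous fixtures, which is exactly the trade-off the abstract describes.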

  9. Calculation method for the probability of tailings dam failure caused by dam slope instability, considering correlation of variables

    Institute of Scientific and Technical Information of China (English)

    郑欣; 李全明; 许开立; 耿丽艳

    2015-01-01

    The accidents of tailings dam failure at home and abroad were analyzed, showing that dam slope instability is a major cause of dam failure. The slope instability was analyzed with catastrophe theory, which proved that the internal friction angle and cohesion strength are the main influencing factors of slope instability and failure. Abandoning the traditional fixed-value calculation of the anti-slide stability safety coefficient of a tailings dam, a probability method that comprehensively considers the uncertainty of each factor was selected to calculate the probability of dam failure. The internal friction angle and cohesion strength were taken as random parameters, the limit-state function of dam slope instability and failure was established, and the number of simulations was determined. A Monte Carlo method accounting for the correlation between random variables was applied to calculate the probability of dam failure. This method overcomes the shortcoming that the traditional Matlab subset simulation can only handle random variables that are normally distributed and mutually independent. The method is illustrated by calculating the failure probability of an actual tailings dam due to slope instability.
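
Monte Carlo sampling with correlated random variables, the ingredient the abstract highlights, can be sketched with an explicit 2x2 Cholesky factor. The limit-state function and all numbers below are schematic illustrations, not the paper's actual model:

```python
import math
import random

def dam_failure_probability(n=200_000, rho=-0.4, seed=0):
    """Monte Carlo failure probability with correlated normal variables.

    Correlation between cohesion c and friction coefficient tan(phi) is
    imposed through the 2x2 Cholesky factor written out explicitly:
        x2 = mu2 + sd2 * (rho*z1 + sqrt(1 - rho^2)*z2).
    The limit state g = c*A + tan(phi)*W - S is a schematic stability
    margin with illustrative numbers."""
    rng = random.Random(seed)
    mu_c, sd_c = 30.0, 6.0     # cohesion, kPa (illustrative)
    mu_t, sd_t = 0.45, 0.07    # tan(phi) (illustrative)
    k = math.sqrt(1.0 - rho * rho)
    failures = 0
    for _ in range(n):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        c = mu_c + sd_c * z1
        t = mu_t + sd_t * (rho * z1 + k * z2)
        g = c * 1.0 + t * 80.0 - 55.0   # resistance minus load
        if g < 0:
            failures += 1
    return failures / n
```

With a negative correlation the variance of the margin shrinks, so the failure probability drops relative to the independent case, a qualitative effect worth checking.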

  10. Critical effect of aging condition on mesostructural ordering in mesoporous titania thin film

    Energy Technology Data Exchange (ETDEWEB)

    Oveisi, Hamid [World Premier International (WPI) Research Center for Materials Nanoarchitectonics (MANA), National Institute for Materials Science - NIMS, 1-1 Namiki, Tsukuba 305-0044 (Japan); Center of Excellence in Advanced Materials and Processing, Department of Metallurgy and Materials Engineering, Iran University of Science and Technology (IUST), Narmak, Tehran 16844 (Iran, Islamic Republic of); Suzuki, Norihiro; Nemoto, Yoshihiro; Srinivasu, Pavuluri [World Premier International (WPI) Research Center for Materials Nanoarchitectonics (MANA), National Institute for Materials Science - NIMS, 1-1 Namiki, Tsukuba 305-0044 (Japan); Beitollahi, Ali [Center of Excellence in Advanced Materials and Processing, Department of Metallurgy and Materials Engineering, Iran University of Science and Technology (IUST), Narmak, Tehran 16844 (Iran, Islamic Republic of); Yamauchi, Yusuke, E-mail: Yamauchi.Yusuke@nims.go.j [World Premier International (WPI) Research Center for Materials Nanoarchitectonics (MANA), National Institute for Materials Science - NIMS, 1-1 Namiki, Tsukuba 305-0044 (Japan); Precursory Research for Embryonic Science and Technology (PRESTO), Japan Science and Technology Agency (JST), 4-1-8 Honcho, Kawaguchi, Saitama 332-0012 (Japan)

    2010-09-30

    Here we demonstrate a facile synthesis method for the formation of highly ordered mesoporous cubic Im-3m titania thin films. The mesostructural ordering is strongly dependent on the aging condition after spin-coating: aging under low temperature and low humidity is found to be the optimum condition for achieving a highly ordered mesostructure. The effects of other important synthetic parameters, such as the pH of the precursor solutions and the aging period, on the mesostructural ordering are carefully examined. After calcination, continuous mesoporous films with partially crystallized frameworks are formed without any cracks; the mesostructure of the calcined films results from thermal shrinkage of the original Im-3m mesostructure. The mesostructural changes in films calcined at various temperatures are studied using grazing-incidence small-angle X-ray scattering (GI-SAXS). The GI-SAXS patterns show strong shrinkage perpendicular to the substrate with increasing calcination temperature.

  11. Survivability of integrated PVDF film sensors to accelerated ageing conditions in aeronautical/aerospace structures

    International Nuclear Information System (INIS)

    This work validates the use of integrated polyvinylidene fluoride (PVDF) film sensors for dynamic testing, even after being subjected to UV-thermo-hygro-mechanical accelerated ageing conditions. The verification of PVDF sensors’ survivability in these environmental conditions, typically confronted by civil and military aircraft, is the main concern of the study. The evaluation of survivability is made by a comparison of dynamic testing results provided by the PVDF patch sensors subjected to an accelerated ageing protocol, and those provided by neutral non-aged sensors (accelerometers). The available measurements are the time-domain response signals issued from a modal analysis procedure, and the corresponding frequency response functions (FRF). These are in turn used to identify the constitutive properties of the samples by extraction of the modal parameters, in particular the natural frequencies. The composite specimens in this study undergo different accelerated ageing processes. After several weeks of experimentation, the samples exhibit a loss of stiffness, represented by a decrease in the elastic moduli down to 10%. Despite the ageing, the integrated PVDF sensors, subjected to the same ageing conditions, are still capable of providing reliable data to carry out a close follow-up of these changes. This survivability is a decisive asset for using integrated PVDF sensors to perform structural health monitoring (SHM) in the future of full-scale composite aeronautical structures. (paper)

  12. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
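
The Monte Carlo principle for simulating experiments that the book describes can be sketched as follows. The measurement model V = I*R and its standard uncertainties are our illustrative choices, not an example from the book:

```python
import random
import statistics

def mc_uncertainty(n=200_000, seed=11):
    """Monte Carlo evaluation of measurement uncertainty: draw the input
    quantities from their assigned distributions, propagate them through
    the measurement model, and read off a 95 % coverage interval from the
    empirical quantiles.  Model: V = I * R with illustrative uncertainties."""
    rng = random.Random(seed)
    samples = sorted(rng.gauss(2.0, 0.02) * rng.gauss(50.0, 0.5)  # I [A] * R [ohm]
                     for _ in range(n))
    mean = statistics.fmean(samples)
    lo = samples[int(0.025 * n)]   # 2.5 % empirical quantile
    hi = samples[int(0.975 * n)]   # 97.5 % empirical quantile
    return mean, (lo, hi)
```

For small relative uncertainties the interval agrees closely with the first-order propagation result, so the two approaches can be cross-checked.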

  13. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
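
The calculation the article describes can be checked by simulation: if the chance of landing on side i is proportional to a weight (for example, a sector angle), the empirical frequencies should converge to the normalized weights. A minimal sketch:

```python
import random
from collections import Counter

def spinner_frequencies(weights, n=100_000, seed=3):
    """Simulate a biased spinner whose probability of landing on side i is
    proportional to weights[i], and return the empirical frequencies."""
    rng = random.Random(seed)
    draws = rng.choices(range(len(weights)), weights=weights, k=n)
    counts = Counter(draws)
    return [counts[i] / n for i in range(len(weights))]
```

With weights (1, 2, 3) the exact probabilities are 1/6, 1/3 and 1/2, so the simulation gives a direct classroom check of the calculation.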

  14. Effect of age condition on fatigue properties of 2E12 aluminum alloy

    Institute of Scientific and Technical Information of China (English)

    YAN Liang; DU Feng-shan; DAI Sheng-long; YANG Shou-jie

    2010-01-01

    The fatigue behaviors of 2E12 aluminum alloy in T3 and T6 conditions at room temperature in air were investigated. The microstructures and fatigue fracture surfaces of the alloy were examined by transmission electron microscopy (TEM) and scanning electron microscopy (SEM). The results show that the alloy exhibits higher fatigue crack propagation (FCP) resistance in T3 condition than in T6 condition; the fatigue life is increased by 54% and the fatigue crack growth rate (FCGR) decreases significantly. The fatigue fractures of the alloy in T3 and T6 conditions are transgranular. But in T3 condition, secondary cracks occur and fatigue striations are not clear. In T6 condition, ductile fatigue striations are observed. The effect of aging conditions on fatigue behaviors is explained in terms of the slip planarity of dislocations and the cyclic slip reversibility.

  15. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  16. Fusion Probability in Dinuclear System

    CERN Document Server

    Hong, Juhee

    2015-01-01

    Fusion can be described by the time evolution of a dinuclear system with two degrees of freedom, the relative motion and transfer of nucleons. In the presence of the coupling between two collective modes, we solve the Fokker-Planck equation in a locally harmonic approximation. The potential of a dinuclear system has the quasifission barrier and the inner fusion barrier, and the escape rates can be calculated by the Kramers' model. To estimate the fusion probability, we calculate the quasifission rate and the fusion rate. We investigate the coupling effects on the fusion probability and the cross section of evaporation residue.

  17. Calculation of Leakage Probability of Pipe Connecting Flange Based on Monte Carlo Method

    Institute of Scientific and Technical Information of China (English)

    王程龙; 谢禹钧; 韦权权; 于小泽

    2016-01-01

    Bolted flange joints were simulated with the finite element software Ansys to obtain the gasket stress distribution under preload and operating conditions, and the gasket stress was calculated for different internal pressures. Because the pipeline working pressure fluctuates constantly and the pipe stress state is complex, uncertain factors such as working pressure and temperature, and their constraints on tightness, should be fully considered. The Monte Carlo method is used for reliability analysis under fluctuating working pressure: a limit-state equation is created, and Matlab is used to repeatedly draw random samples according to the distribution type of each variable, calculating the probability of bolted-flange leakage under different working pressures. The results show that the additional load produced by fluctuations of the working pressure has a large effect on bolted-flange leakage and must be given sufficient attention.
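
When the limit state is linear in independent normal variables, the leak probability can also be estimated in closed form through a reliability index, which complements the sampling approach in the abstract. A textbook first-order sketch with illustrative numbers, not the paper's finite-element model:

```python
import math

def leak_probability(mu_seat, sd_seat, mu_stress, sd_stress):
    """First-order estimate for a linear limit state
        g = gasket seating stress - operating stress
    with independent normal variables:
        beta = (mu_R - mu_S) / sqrt(sd_R^2 + sd_S^2),  P_leak = Phi(-beta).
    Phi is evaluated via math.erf.  All inputs are illustrative."""
    beta = (mu_seat - mu_stress) / math.hypot(sd_seat, sd_stress)
    p_fail = 0.5 * (1.0 + math.erf(-beta / math.sqrt(2.0)))
    return beta, p_fail

# Illustrative numbers (MPa): required seating stress vs. operating stress
beta, p_leak = leak_probability(70.0, 8.0, 45.0, 6.0)
```

For nonlinear limit states or non-normal variables this closed form no longer applies, which is precisely where the Monte Carlo sampling of the abstract earns its keep.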

  18. Probability Calculation of Blowout of Drilling Platform Based on Fuzzy Fault Tree Method

    Institute of Scientific and Technical Information of China (English)

    董海波; 顾学康

    2013-01-01

    Based on fuzzy theory, a quantitative risk assessment method called fuzzy fault tree analysis is presented. By referring to a risk database or relying on expert judgment, the occurrence possibility of each basic event in the fault tree model is expressed in the form of fuzzy numbers. Since experts may hold different opinions, an algorithm to aggregate expert opinions and a method to determine the importance weight of each opinion are developed. A fault tree model is constructed with blowout of a semi-submersible drilling platform as the top event and, according to the fuzzy fault tree method, the probability of blowout during drilling or cementing is calculated.
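
The fuzzy arithmetic underlying such an analysis can be sketched with triangular fuzzy numbers (a, m, b). The weighted-average aggregation rule and the component-wise gate formulas below are common textbook choices; the abstract does not specify the authors' exact operators:

```python
def weighted_fuzzy(opinions, weights):
    """Aggregate expert opinions given as triangular fuzzy numbers (a, m, b)
    by a weighted average, one common aggregation rule."""
    s = sum(weights)
    return tuple(sum(w * o[i] for o, w in zip(opinions, weights)) / s
                 for i in range(3))

def or_gate(p, q):
    """OR gate on fuzzy probabilities: 1 - (1-p)(1-q), component-wise."""
    return tuple(1.0 - (1.0 - p[i]) * (1.0 - q[i]) for i in range(3))

def and_gate(p, q):
    """AND gate on fuzzy probabilities: component-wise product."""
    return tuple(p[i] * q[i] for i in range(3))

# Two experts on one basic event, then a tiny two-event tree:
e1 = weighted_fuzzy([(0.1, 0.2, 0.3), (0.2, 0.3, 0.4)], [1.0, 1.0])
top = or_gate(e1, and_gate((0.1, 0.2, 0.3), (0.5, 0.5, 0.5)))
```

The component-wise rules are exact for these monotone gate functions at each alpha-cut endpoint, which is why they are a standard shortcut for triangular numbers.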

  19. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.

  20. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  1. Emptiness Formation Probability

    Science.gov (United States)

    Crawford, Nicholas; Ng, Stephen; Starr, Shannon

    2016-08-01

    We present rigorous upper and lower bounds on the emptiness formation probability for the ground state of a spin-1/2 Heisenberg XXZ quantum spin system. For a d-dimensional system we find a rate of decay of the order {exp(-c L^{d+1})} where L is the sidelength of the box in which we ask for the emptiness formation event to occur. In the {d=1} case this confirms previous predictions made in the integrable systems community, though our bounds do not achieve the precision predicted by Bethe ansatz calculations. On the other hand, our bounds in the case {d ≥ 2} are new. The main tools we use are reflection positivity and a rigorous path integral expansion, which is a variation on those previously introduced by Toth, Aizenman-Nachtergaele and Ueltschi.

  2. Average Transmission Probability of a Random Stack

    Science.gov (United States)

    Lu, Yin; Miniatura, Christian; Englert, Berthold-Georg

    2010-01-01

    The transmission through a stack of identical slabs that are separated by gaps with random widths is usually treated by calculating the average of the logarithm of the transmission probability. We show how to calculate the average of the transmission probability itself with the aid of a recurrence relation and derive analytical upper and lower…
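
The distinction the abstract draws, averaging the transmission probability itself rather than its logarithm, can be illustrated with a toy model. The random per-gap factor below is an ad-hoc assumption, not the paper's recurrence relation:

```python
import math
import random

def sample_transmissions(n=50_000, t_slab=0.8, n_slabs=20, seed=7):
    """Contrast <T> with exp(<ln T>) (the 'typical' transmission) for a
    product of random per-gap transmission factors.  By Jensen's
    inequality <T> >= exp(<ln T>), so the two averages genuinely differ."""
    rng = random.Random(seed)
    mean_T = 0.0
    mean_lnT = 0.0
    for _ in range(n):
        T = 1.0
        for _ in range(n_slabs):
            T *= t_slab * rng.uniform(0.5, 1.0)   # ad-hoc random gap factor
        mean_T += T
        mean_lnT += math.log(T)
    return mean_T / n, math.exp(mean_lnT / n)
```

The gap between the two averages is exactly why averaging log-transmissions, the usual shortcut the abstract mentions, does not give the average transmission probability.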

  3. Nuclear structure of tellurium 133 via beta decay and shell model calculations in the doubly magic tin 132 region. [J, π, transition probabilities, neutron and proton separation, g factors]

    Energy Technology Data Exchange (ETDEWEB)

    Lane, S.M.

    1979-08-01

    An experimental investigation of the level structure of ^133Te was performed by spectroscopy of gamma-rays following the beta-decay of 2.7 min ^133Sb. Multiscaled gamma-ray singles spectra and 2.5 x 10^7 gamma-gamma coincidence events were used in the assignment of 105 of the approximately 400 observed gamma-rays to ^133Sb decay and in the construction of the ^133Te level scheme with 29 excited levels. One hundred twenty-two gamma-rays were identified as originating in the decay of other isotopes of Sb or their daughter products. The remaining gamma-rays were associated with the decay of impurity atoms or have as yet not been identified. A new computer program based on the Lanczos tridiagonalization algorithm, using an uncoupled m-scheme basis and vector manipulations, was written. It was used to calculate energy levels, parities, spins, model wavefunctions, neutron and proton separation energies, and some electromagnetic transition probabilities for the following nuclei in the ^132Sn region: ^128Sn, ^129Sn, ^130Sn, ^131Sn, ^130Sb, ^131Sb, ^132Sb, ^133Sb, ^132Te, ^133Te, ^134Te, ^134I, ^135I, ^135Xe, and ^136Xe. The results are compared with experiment and the agreement is generally good. For non-magic nuclei, the 1g_7/2, 2d_5/2, 2d_3/2, 1h_11/2, and 3s_1/2 orbitals are available to valence protons, and the 2d_5/2, 2d_3/2, 1h_11/2, and 3s_1/2 orbitals are available to valence neutron holes. The present CDC7600 computer code can accommodate 59 single-particle states and vectors comprised of 30,000 Slater determinants. The effective interaction used was that of Petrovich, McManus, and Madsen, a modification of the Kallio-Kolltveit realistic force. Single-particle energies, effective charges, and effective g-factors were determined from experimental data for nuclei in the ^132Sn region. 116 references.

  4. Effect of surface conditioning methods on the microtensile bond strength of resin composite to composite after aging conditions

    NARCIS (Netherlands)

    Ozcan, Mutlu; Barbosa, Silvia Helena; Melo, Renata Marques; Galhano, Graziela Avila Prado; Bottino, Marco Antonio

    2007-01-01

    Objectives. This study evaluated the effect of two different surface conditioning methods on the repair bond strength of a bis-GMA-adduct/bis-EMA/TEGDMA based resin composite after three aging conditions. Methods. Thirty-six composite resin blocks (Esthet X, Dentsply) were prepared (5 mm x 6 mm x 6

  5. Elements of probability theory

    CERN Document Server

    Rumshiskii, L Z

    1965-01-01

    Elements of Probability Theory presents the methods of the theory of probability. This book is divided into seven chapters that discuss the general rule for the multiplication of probabilities, the fundamental properties of the subject matter, and the classical definition of probability. The introductory chapters deal with the functions of random variables; continuous random variables; numerical characteristics of probability distributions; center of the probability distribution of a random variable; definition of the law of large numbers; stability of the sample mean and the method of moments

  6. Evaluating probability forecasts

    CERN Document Server

    Lai, Tze Leung; Shen, David Bo; 10.1214/11-AOS902

    2012-01-01

    Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability forecasts. This approach uses loss functions relating the predicted to the actual probabilities of the events and applies martingale theory to exploit the temporal structure between the forecast and the subsequent occurrence or nonoccurrence of the event.
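
A classic scoring rule of the kind this line of work builds on is the Brier score, the mean squared difference between forecast probabilities and binary outcomes. A minimal sketch (the paper's own loss-function approach is more general):

```python
def brier_score(forecasts, outcomes):
    """Quadratic scoring rule: mean squared error between forecast
    probabilities in [0, 1] and observed 0/1 outcomes (lower is better)."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# A perfectly sharp, correct forecaster scores 0; a constant 0.5
# forecaster scores 0.25 regardless of the outcome sequence.
```

Because the Brier score is a proper scoring rule, reporting one's true probabilities minimizes the expected score, the property that makes such rules useful for evaluation.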

  7. Non-Archimedean Probability

    CERN Document Server

    Benci, Vieri; Wenmackers, Sylvia

    2011-01-01

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and zero- and unit-probability events pose no particular epistemological problems. We use a non-Archimedean field as the range of the probability function. As a result, the property of countable additivity in Kolmogorov's axiomatization of probability is replaced by a different type of infinite additivity.

  8. Introduction to probability

    CERN Document Server

    Roussas, George G

    2006-01-01

    Roussas's Introduction to Probability features exceptionally clear explanations of the mathematics of probability theory and explores its diverse applications through numerous interesting and motivational examples. It provides a thorough introduction to the subject for professionals and advanced students taking their first course in probability. The content is based on the introductory chapters of Roussas's book, An Intoduction to Probability and Statistical Inference, with additional chapters and revisions. Written by a well-respected author known for great exposition an

  9. Stage line diagram: an age-conditional reference diagram for tracking development.

    NARCIS (Netherlands)

    Van Buuren, S.; Ooms, J.C.L.

    2009-01-01

    This paper presents a method for calculating stage line diagrams, a novel type of reference diagram useful for tracking developmental processes over time. Potential fields of applications include: dentistry (tooth eruption), oncology (tumor grading, cancer staging), virology (HIV infection and disea

  11. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  12. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    2013-01-01

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned prob

  13. Dependent Probability Spaces

    Science.gov (United States)

    Edwards, William F.; Shiflett, Ray C.; Shultz, Harris

    2008-01-01

    The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…

  14. Philosophy and probability

    CERN Document Server

    Childers, Timothy

    2013-01-01

    Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability-frequentist, propensity, classical, Bayesian, and objective Bayesian-and uses stimulatingexamples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,

  15. Introduction to probability models

    CERN Document Server

    Ross, Sheldon M

    2006-01-01

    Introduction to Probability Models, Tenth Edition, provides an introduction to elementary probability theory and stochastic processes. There are two approaches to the study of probability theory. One is heuristic and nonrigorous, and attempts to develop in students an intuitive feel for the subject that enables him or her to think probabilistically. The other approach attempts a rigorous development of probability by using the tools of measure theory. The first approach is employed in this text. The book begins by introducing basic concepts of probability theory, such as the random v

  16. Molecular contingencies: reinforcement probability.

    Science.gov (United States)

    Hale, J M; Shimp, C P

    1975-11-01

    Pigeons obtained food by responding in a discrete-trials two-choice probability-learning experiment involving temporal stimuli. A given response alternative, a left- or right-key peck, had 11 associated reinforcement probabilities within each session. Reinforcement probability for a choice was an increasing or a decreasing function of the time interval immediately preceding the choice. The 11 equiprobable temporal stimuli ranged from 1 to 11 sec in 1-sec classes. Preference tended to deviate from probability matching in the direction of maximizing; i.e., the percentage of choices of the preferred response alternative tended to exceed the probability of reinforcement for that alternative. This result was qualitatively consistent with probability-learning experiments using visual stimuli. The result is consistent with a molecular analysis of operant behavior and poses a difficulty for molar theories holding that local variations in reinforcement probability may safely be disregarded in the analysis of behavior maintained by operant paradigms. PMID:16811883

  17. Monte Carlo calculation of the total probability for gamma-Ray interaction in toluene; Aplicacion del metodo de Monte Carlo al calculo de la probabilidad de interaccion fotonica en tolueno

    Energy Technology Data Exchange (ETDEWEB)

    Grau Malonda, A.; Garcia-Torano, E.

    1983-07-01

    Interaction and absorption probabilities for gamma-rays with energies between 1 and 1000 keV have been computed and tabulated. A toluene-based scintillator solution has been assumed in the computation. Both point sources and homogeneously dispersed radioactive material have been considered. These tables may be applied to cylinders with radii between 0.25 cm and 1.25 cm and heights between 0.20 cm and 4.07 cm. (Author) 26 refs.
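
    The chord-length computation behind such tables can be illustrated with a small Monte Carlo routine. The sketch below assumes an isotropic point source at the centre of the cylinder and a hypothetical total attenuation coefficient mu (in 1/cm); it illustrates the method and is not the authors' code.

```python
import math
import random

def interaction_probability(mu, radius, height, n=100_000, seed=1):
    """Estimate the photon interaction probability for an isotropic point
    source at the centre of a cylinder (mu is a hypothetical attenuation
    coefficient in 1/cm; radius and height in cm)."""
    random.seed(seed)
    total = 0.0
    for _ in range(n):
        cos_t = random.uniform(-1.0, 1.0)      # isotropic polar angle
        sin_t = math.sqrt(1.0 - cos_t * cos_t)
        # distance from the centre to the lateral surface and to a cap
        s_side = radius / sin_t if sin_t > 1e-12 else float("inf")
        s_cap = (height / 2.0) / abs(cos_t) if abs(cos_t) > 1e-12 else float("inf")
        path = min(s_side, s_cap)              # chord length to the surface
        total += 1.0 - math.exp(-mu * path)    # interaction along that chord
    return total / n
```

    For the largest tabulated geometry (radius 1.25 cm, height 4.07 cm) the estimate grows monotonically with mu, as expected.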

  18. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.

  19. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  20. Qubit persistence probability

    International Nuclear Information System (INIS)

    In this work, I formulate the persistence probability for a qubit device as the probability of measuring its computational degrees of freedom in the unperturbed state without the decoherence arising from environmental interactions. A decoherence time can be obtained from the persistence probability. Drawing on recent work of Garg, and also Palma, Suomine, and Ekert, I apply the persistence probability formalism to a generic single-qubit device coupled to a thermal environment, and also apply it to a trapped-ion quantum register coupled to the ion vibrational modes. (author)

  1. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  2. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.
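
    A heavily simplified sketch of this kind of calculation, under assumptions the record itself does not state (the relative position at closest approach is a circular Gaussian with standard deviation sigma, and the two objects are summarized by a single combined radius):

```python
import math
import random

def collision_probability(miss, sigma, combined_radius, n=200_000, seed=2):
    """Monte Carlo sketch: probability that the relative position, modelled
    as a circular Gaussian (std sigma) centred at the nominal miss distance,
    falls inside the combined hard-body radius. All inputs hypothetical."""
    random.seed(seed)
    hits = 0
    for _ in range(n):
        x = random.gauss(miss, sigma)   # in-plane error along the miss vector
        y = random.gauss(0.0, sigma)    # in-plane error across the miss vector
        if math.hypot(x, y) < combined_radius:
            hits += 1
    return hits / n
```

    The probability falls off quickly once the nominal miss distance exceeds a few sigma, which is why the parametric dependence on miss distance dominates.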

  3. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  4. Probability Analysis and Calculation of Kinematic Accuracy for Slider-Crank Mechanism

    Institute of Scientific and Technical Information of China (English)

    陈胜军; 贾方

    2013-01-01

    A general model for precision analysis of the slider-crank mechanism is established based on matrix analysis theory, and a probability analysis model of its kinematic accuracy is obtained from state functions. Taking the centric slider-crank mechanism as the specific object, models of the motion output accuracy and of its probability analysis are established. The case study indicates that, under a given design accuracy, the centric slider-crank mechanism has different kinematic errors and different reliabilities in different motion states. The models can quantitatively give the failure probability of the slider-crank mechanism in different motion states, which is of practical value for the design and manufacture of slider-crank mechanisms.
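
    The kind of kinematic-accuracy analysis described can be illustrated by propagating hypothetical Gaussian manufacturing tolerances through the closed-form slider position of a centric slider-crank; the dimensions, tolerances and error limit below are invented for the example and are not taken from the paper.

```python
import math
import random

def slider_error_probability(theta, r=50.0, l=200.0, tol_r=0.05, tol_l=0.05,
                             limit=0.15, n=100_000, seed=3):
    """Monte Carlo failure probability for a centric slider-crank: crank r and
    rod l (mm) carry independent Gaussian tolerances; "failure" means the
    slider-position error exceeds `limit` mm. All numbers are hypothetical."""
    def slider_x(r_, l_):
        s = r_ * math.sin(theta)
        return r_ * math.cos(theta) + math.sqrt(l_ * l_ - s * s)

    random.seed(seed)
    nominal = slider_x(r, l)
    failures = 0
    for _ in range(n):
        x = slider_x(random.gauss(r, tol_r), random.gauss(l, tol_l))
        if abs(x - nominal) > limit:
            failures += 1
    return failures / n
```

    Evaluating this at several crank angles shows the failure probability varying with the motion state, which is the paper's central observation.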

  5. Insurance Calculation of the Ruin Probability for a Constant Interest Rate Model under Negative Dependence

    Institute of Scientific and Technical Information of China (English)

    李明倩

    2014-01-01

    This paper studies the ruin problem on random intervals for a risk model with constant interest rate under negatively dependent claims, and finally obtains an asymptotic expression for the ruin probability of the model.
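
    The ruin problem itself is easy to illustrate by simulation. The sketch below uses independent exponential claims and a fixed finite horizon rather than the paper's negatively dependent claims and random intervals, so it only conveys the shape of the model; every parameter is hypothetical.

```python
import math
import random

def ruin_probability(u=10.0, premium=1.2, r=0.03, lam=1.0, mean_claim=1.0,
                     horizon=50.0, n=20_000, seed=4):
    """Monte Carlo sketch of a finite-horizon ruin probability: the surplus
    earns constant interest r between claim arrivals (Poisson rate lam) and
    premiums accrue at rate `premium`. All parameters are hypothetical."""
    random.seed(seed)
    ruins = 0
    for _ in range(n):
        t, surplus = 0.0, u
        while t < horizon:
            wait = random.expovariate(lam)              # time to next claim
            t += wait
            if t > horizon:
                break
            surplus = surplus * math.exp(r * wait) + premium * wait
            surplus -= random.expovariate(1.0 / mean_claim)  # exponential claim
            if surplus < 0:                             # ruin at a claim epoch
                ruins += 1
                break
    return ruins / n
```

    As the asymptotic results suggest, the estimated ruin probability decays rapidly in the initial capital u.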

  6. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities, and their link to quantum computations, are discussed.

  7. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
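
    The standard device in this setting is the chain rule P(A and B) = P(A)P(B|A), which can be checked against brute-force enumeration. A small sketch with a hypothetical urn of 3 red and 2 black balls, drawn twice without replacement:

```python
from fractions import Fraction
from itertools import permutations

def chain_rule():
    """P(both draws red) via the chain rule: P(R1) * P(R2 | R1)."""
    return Fraction(3, 5) * Fraction(2, 4)

def enumerate_draws():
    """Same probability by enumerating all ordered draws of two balls."""
    balls = "RRRBB"
    outcomes = list(permutations(balls, 2))
    favourable = sum(1 for a, b in outcomes if a == b == "R")
    return Fraction(favourable, len(outcomes))
```

    Both routines return 3/10, confirming that the conditional-probability calculation agrees with direct counting.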

  8. Relativistic calculations of 3s² ¹S₀-3s3p ¹P₁ and 3s² ¹S₀-3s3p ³P₁,₂ transition probabilities in the Mg isoelectronic sequence

    International Nuclear Information System (INIS)

    Using the multi-configuration Dirac-Fock self-consistent field method and the relativistic configuration-interaction method, calculations of transition energies, oscillator strengths and rates are performed for the 3s² ¹S₀-3s3p ¹P₁ spin-allowed transition and for the 3s² ¹S₀-3s3p ³P₁,₂ intercombination and magnetic quadrupole transitions in the Mg isoelectronic sequence (Mg I, Al II, Si III, P IV and S V). Electron correlations, including intravalence electron correlations, are treated adequately. The influence of the Breit interaction on oscillator strengths and transition energies is investigated. Quantum electrodynamics contributions are added as corrections. The calculated results are found to be in good agreement with the experimental data and other theoretical calculations. (atomic and molecular physics)

  9. Relativistic many-body calculations of transition probabilities for the 2l12l2[LSJ]-2l32l4[L'S'J'] lines in Be-like ions

    International Nuclear Information System (INIS)

    Reduced matrix elements, oscillator strengths, and transition rates are calculated for all allowed and forbidden 2s-2p electric dipole transitions in berylliumlike ions with nuclear charges ranging from Z = 4 to 100. Many-body perturbation theory (MBPT), including the Breit interaction, is used to evaluate retarded E1 matrix elements in length and velocity forms. The calculations start with a 1s2 Dirac-Fock potential and include all possible n = 2 configurations, leading to 4 odd-parity and 6 even-parity states. First-order perturbation theory is used to obtain intermediate coupling coefficients. Second-order MBPT is used to determine the matrix elements, which are evaluated for the 16 possible E1 transitions. The transition energies used in the calculation of oscillator strengths and transition rates are evaluated using second-order MBPT. The importance of virtual electron-positron pair (negative energy) contributions to the transition amplitudes is discussed. (orig.)

  10. A case concerning the improved transition probability

    OpenAIRE

    Tang, Jian; Wang, An Min

    2006-01-01

    As is well known, the existing perturbation theory can be applied to calculations of energy, states, and transition probabilities in many quantum systems. However, in our view there are different paths and methods for improving its calculational precision and efficiency. According to an improved scheme of perturbation theory proposed by [An Min Wang, quant-ph/0611217], we reconsider the transition probability and perturbed energy for a hydrogen atom in a constant magnetic field. We find the results obt...

  11. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  12. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  13. Dynamic update with probabilities

    NARCIS (Netherlands)

    J. van Benthem; J. Gerbrandy; B. Kooi

    2009-01-01

    Current dynamic-epistemic logics model different types of information change in multi-agent scenarios. We generalize these logics to a probabilistic setting, obtaining a calculus for multi-agent update with three natural slots: prior probability on states, occurrence probabilities in the relevant pr

  14. Economy, probability and risk

    Directory of Open Access Journals (Sweden)

    Elena Druica

    2007-05-01

    The science of probabilities has earned a special place because it tried, through its concepts, to build a bridge between theory and experiment. As a formal notion which by definition does not lead to polemic, probability nevertheless meets a series of difficulties of interpretation whenever it must be applied to particular situations. Usually, the economic literature brings into discussion two interpretations of the concept of probability: the objective interpretation, often found under the name of the frequency or statistical interpretation, and the subjective or personal interpretation. Surprisingly, a third approach is excluded: the logical interpretation. The purpose of the present paper is to study some aspects of the subjective and logical interpretations of probability, as well as their implications in economics.

  15. Abstract Models of Probability

    Science.gov (United States)

    Maximov, V. M.

    2001-12-01

    Probability theory presents a mathematical formalization of intuitive ideas of independent events and of probability as a measure of randomness. It is based on axioms 1-5 of A.N. Kolmogorov [1] and their generalizations [2]. Different formalized refinements were proposed for such notions as events, independence, random value, etc. [2,3], whereas the measure of randomness, i.e. numbers from [0,1], remained unchanged. To be precise, we mention some attempts to generalize probability theory with negative probabilities [4]. From another side, physicists tried to use negative and even complex values of probability to explain some paradoxes in quantum mechanics [5,6,7]. Only recently the necessity of formalizing quantum mechanics and its foundations [8] led to the construction of p-adic probabilities [9,10,11], which essentially extended our concept of probability and randomness. Therefore, a natural question arises: how to describe algebraic structures whose elements can be used as a measure of randomness. As a consequence, a necessity arises to define the types of randomness corresponding to every such algebraic structure. Possibly this leads to another concept of randomness, of a nature different from the combinatorial-metric conception of Kolmogorov. Apparently, a discrepancy between the real type of randomness corresponding to some experimental data and the model of randomness used for data processing leads to paradoxes [12]. An algebraic structure whose elements can be used to estimate some randomness will be called a probability set Φ. Naturally, the elements of Φ are the probabilities.

  16. A Novel Approach to Probability

    CERN Document Server

    Kafri, Oded

    2016-01-01

    When P indistinguishable balls are randomly distributed among L distinguishable boxes, and considering the dense system in which P is much greater than L, our natural intuition tells us that the box with the average number of balls has the highest probability and that none of the boxes are empty; in reality, however, the probability of the empty box is always the highest. This fact stands in contradistinction to the sparse system, in which the number of balls is smaller than the number of boxes (e.g. energy distribution in a gas) and the average value has the highest probability. Here we show that when we postulate the requirement that all possible configurations of balls in the boxes have equal probabilities, a realistic "long tail" distribution is obtained. This formalism, when applied to sparse systems, converges to distributions in which the average is preferred. We calculate some of the distributions resulting from this postulate and obtain most of the known distributions in nature, namely, Zipf's law, Benford's law, part...
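
    Under the postulate that every configuration of P indistinguishable balls in L distinguishable boxes is equally likely, the occupancy of one box follows a simple counting formula, P(k) = C(P-k+L-2, L-2) / C(P+L-1, L-1), which is maximal at k = 0. A short sketch (the parameters below are chosen arbitrarily for illustration):

```python
from math import comb

def occupancy_pmf(P, L):
    """Probability that a given box holds k balls, k = 0..P, when all
    configurations of P indistinguishable balls in L boxes are equiprobable:
    fix k balls in the box, count weak compositions of the remaining P-k
    balls into the other L-1 boxes, and normalize by all configurations."""
    total = comb(P + L - 1, L - 1)
    return [comb(P - k + L - 2, L - 2) / total for k in range(P + 1)]
```

    For P = 100 balls in L = 5 boxes the distribution is monotonically decreasing in k, so the empty box is indeed the most probable occupancy, in line with the abstract's claim.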

  17. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  18. The concept of probability

    International Nuclear Information System (INIS)

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our every day life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  19. Orthogonal Algorithm of Logic Probability and Syndrome-Testable Analysis

    Institute of Scientific and Technical Information of China (English)

    1990-01-01

    A new method, the orthogonal algorithm, is presented to compute logic probabilities (i.e. signal probabilities) accurately. The transfer properties of logic probabilities are studied first; these are useful for calculating the logic probability of a circuit with random independent inputs. The orthogonal algorithm is then described for computing the logic probability of a Boolean function realized by a combinational circuit. This algorithm makes the Boolean function "orthogonal", so that the logic probability can be easily calculated by summing the logic probabilities of all orthogonal terms of the Boolean function.
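
    The idea of summing disjoint terms can be shown on the smallest example, f = a OR b, written orthogonally as a + a'b so that the probabilities of the two terms simply add. The helper below, with hypothetical input probabilities, checks the orthogonal sum against weighted enumeration:

```python
from itertools import product

def prob_or_orthogonal(pa, pb):
    """Signal probability of f = a OR b from the disjoint (orthogonal)
    expansion f = a + a'b: P(f) = P(a) + P(a')P(b)."""
    return pa + (1 - pa) * pb

def prob_or_enumerated(pa, pb):
    """Same quantity by summing input-pattern weights where f = 1."""
    total = 0.0
    for a, b in product((0, 1), repeat=2):
        w = (pa if a else 1 - pa) * (pb if b else 1 - pb)
        total += w * (a | b)
    return total
```

    With P(a) = 0.3 and P(b) = 0.6 both routines give 0.72; for larger circuits the orthogonal expansion avoids enumerating all input patterns.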

  20. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  1. Stochastic Programming with Probability

    CERN Document Server

    Andrieu, Laetitia; Vázquez-Abad, Felisa

    2007-01-01

    In this work we study optimization problems subject to a failure constraint. This constraint is expressed in terms of a condition that causes failure, representing a physical or technical breakdown. We formulate the problem in terms of a probability constraint, where the level of "confidence" is a modelling parameter and has the interpretation that the probability of failure should not exceed that level. Application of the stochastic Arrow-Hurwicz algorithm poses two difficulties: one is structural and arises from the lack of convexity of the probability constraint, and the other is the estimation of the gradient of the probability constraint. We develop two gradient estimators with decreasing bias via a convolution method and a finite difference technique, respectively, and we provide a full analysis of convergence of the algorithms. Convergence results are used to tune the parameters of the numerical algorithms in order to achieve best convergence rates, and numerical results are included via an example of ...

  2. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  3. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum, ranging from foundations of probability, across psychological aspects of formulating subjective probability statements and abstract measure-theoretical considerations, to contributions to theoretical statistics an...

  4. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  5. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton‘s laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world‘s foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their  explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive. 

  6. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.;

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake “calibrating adjustments” to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments that...

  7. Probability with Roulette

    Science.gov (United States)

    Marshall, Jennings B.

    2007-01-01

    This article describes how roulette can be used to teach basic concepts of probability. Various bets are used to illustrate the computation of expected value. A betting system shows variations in patterns that often appear in random events.
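
    As a concrete instance of the expected-value computation described, consider two American-roulette bets (38 pockets): a straight-up number paying 35:1 and red paying 1:1 on 18 of the 38 pockets. A minimal sketch using exact fractions:

```python
from fractions import Fraction

def expected_value(p_win, payout):
    """Expected profit per unit staked: win `payout` with probability p_win,
    lose the unit stake otherwise."""
    return p_win * payout - (1 - p_win) * 1

straight_up = expected_value(Fraction(1, 38), 35)  # single number, pays 35:1
red = expected_value(Fraction(18, 38), 1)          # red, pays 1:1
```

    Both bets have the same expected value of -1/19 per unit staked, about -5.26%, which is why different bets can illustrate the same house edge.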

  8. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  9. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  10. Launch Collision Probability

    Science.gov (United States)

    Bollenbacher, Gary; Guptill, James D.

    1999-01-01

    This report analyzes the probability of a launch vehicle colliding with one of the nearly 10,000 tracked objects orbiting the Earth, given that an object on a near-collision course with the launch vehicle has been identified. Knowledge of the probability of collision throughout the launch window can be used to avoid launching at times when the probability of collision is unacceptably high. The analysis in this report assumes that the positions of the orbiting objects and the launch vehicle can be predicted as a function of time and therefore that any tracked object which comes close to the launch vehicle can be identified. The analysis further assumes that the position uncertainty of the launch vehicle and the approaching space object can be described with position covariance matrices. With these and some additional simplifying assumptions, a closed-form solution is developed using two approaches. The solution shows that the probability of collision is a function of position uncertainties, the size of the two potentially colliding objects, and the nominal separation distance at the point of closest approach. The impact of the simplifying assumptions on the accuracy of the final result is assessed and the application of the results to the Cassini mission, launched in October 1997, is described. Other factors that affect the probability of collision are also discussed. Finally, the report offers alternative approaches that can be used to evaluate the probability of collision.

  11. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  12. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more - these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  13. Probabilities from Envariance

    CERN Document Server

    Zurek, W H

    2004-01-01

    I show how probabilities arise in quantum physics by exploring implications of environment-assisted invariance, or envariance, a recently discovered symmetry exhibited by entangled quantum systems. Envariance of perfectly entangled states can be used to rigorously justify complete ignorance of the observer about the outcome of any measurement on either of the members of the entangled pair. Envariance leads to Born's rule, $p_k \propto |\psi_k|^2$. Probabilities derived in this manner are an objective reflection of the underlying state of the system: they reflect experimentally verifiable symmetries, and not just a subjective "state of knowledge" of the observer. The envariance-based approach is compared with and found superior to the key pre-quantum definitions of probability, including the standard definition based on the 'principle of indifference' due to Laplace, and the relative frequency approach advocated by von Mises. Implications of envariance for the interpretation of quantu...

  14. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.

  15. The Probability of Default Calculation Model of Listed Banks Based on Long-Term Liability Coefficient Optimization

    Institute of Scientific and Technical Information of China (English)

    迟国泰; 曹勇; 党均章

    2012-01-01

    When the long-term liability coefficient γ takes different values, the default probabilities of listed banks calculated with the KMV model differ widely. The actual default probability PD(i),CS of a listed bank can be inferred from the credit spreads of the bonds it has issued, while the theoretical default probability PDi,KMV can be calculated with the KMV model for a given long-term liability coefficient γ. This paper establishes a programming model that determines the optimal long-term liability coefficient γ of the KMV model by minimizing the total difference Σi=1..n |PDi,KMV − PDi,CS| between the theoretical and the actual default probabilities; the optimal coefficient is then used to build a default-probability model for listed banks that have not issued bonds, and the default probabilities of 14 listed Chinese banks are calculated empirically. The contributions of the paper are threefold. First, determining γ by minimizing the total difference between KMV-model default probabilities and credit-spread-implied default probabilities anchors the coefficient in actual capital-market spreads, resolving the problem in the existing literature that different choices of γ between 0 and 1 yield sharply different default probabilities. Second, the empirical study shows that with γ = 0.7654 the default probabilities of listed banks calculated by the KMV model come closest to those accepted by the Chinese bond market. Third, the empirical results show that state-owned listed banks have the lowest default probabilities, regional listed banks the highest, and the other listed banks intermediate values.
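The KMV calculation the abstract refers to follows the Merton structural model: the default point is taken as short-term debt plus γ times long-term debt, a distance to default is computed from asset value and asset volatility, and the default probability is the normal tail beyond it. A minimal sketch under standard Merton assumptions (all parameter values and names here are illustrative, not from the paper):

```python
import math

def norm_cdf(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def kmv_default_probability(asset_value, asset_vol, short_debt, long_debt,
                            gamma, drift=0.0, horizon=1.0):
    """Merton-style default probability with the KMV default point:
    short-term debt plus gamma times long-term debt."""
    default_point = short_debt + gamma * long_debt
    dd = (math.log(asset_value / default_point)
          + (drift - 0.5 * asset_vol ** 2) * horizon) \
         / (asset_vol * math.sqrt(horizon))  # distance to default
    return norm_cdf(-dd)
```

Raising γ raises the default point, shrinks the distance to default, and so raises the computed default probability, which is exactly why the choice of γ dominates the results the abstract describes.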

  16. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  17. Negative Probabilities and Contextuality

    CERN Document Server

    de Barros, J Acacio; Oas, Gary

    2015-01-01

    There has been a growing interest, both in physics and psychology, in understanding contextuality in experimentally observed quantities. Different approaches have been proposed to deal with contextual systems, and a promising one is contextuality-by-default, put forth by Dzhafarov and Kujala. The goal of this paper is to present a tutorial on a different approach: negative probabilities. We do so by presenting the overall theory of negative probabilities in a way that is consistent with contextuality-by-default and by examining with this theory some simple examples where contextuality appears, both in physics and psychology.

  18. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  19. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrödinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  20. Logic and probability

    OpenAIRE

    Quznetsov, G. A.

    2003-01-01

    The propositional logic is generalized on the real numbers field. The logical analog of the Bernoulli independent tests scheme is constructed. The variant of the nonstandard analysis is adopted for the definition of the logical function, which has all properties of the classical probability function. The logical analog of the Large Number Law is deduced from properties of this function.

  1. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  2. Logic, Truth and Probability

    OpenAIRE

    Quznetsov, Gunn

    1998-01-01

    The propositional logic is generalized on the real numbers field. The logical analog of the Bernoulli independent tests scheme is constructed. The variant of the nonstandard analysis is adopted for the definition of the logical function, which has all properties of the classical probability function. The logical analog of the Large Number Law is deduced from properties of this function.

  3. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.;

    2014-01-01

    that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  4. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods

  5. On Probability Domains

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2010-12-01

    Motivated by IF-probability theory (intuitionistic fuzzy), we study n-component probability domains in which each event represents a body of competing components and the range of a state represents a simplex S_n of n-tuples of possible rewards; the sum of the rewards is a number from [0,1]. For n=1 we get fuzzy events, for example a bold algebra, and the corresponding fuzzy probability theory can be developed within the category ID of D-posets (equivalently, effect algebras) of fuzzy sets and sequentially continuous D-homomorphisms. For n=2 we get IF-events, i.e., pairs (μ,ν) of fuzzy sets μ,ν ∈ [0,1]^X such that μ(x)+ν(x) ≤ 1 for all x ∈ X, but we order our pairs (events) coordinatewise. Hence the structure of IF-events (where (μ1,ν1) ≤ (μ2,ν2) whenever μ1 ≤ μ2 and ν2 ≤ ν1) is different and, consequently, the resulting IF-probability theory models a different principle. The category ID is cogenerated by I=[0,1] (objects of ID are subobjects of powers I^X), has nice properties, and basic probabilistic notions and constructions are categorical; for example, states are morphisms. We introduce the category S_nD cogenerated by S_n = {(x_1, x_2, …, x_n) ∈ I^n : x_1 + … + x_n ≤ 1}, carrying the coordinatewise partial order, difference, and sequential convergence, and we show how basic probability notions can be defined within S_nD.

  6. Semiclassical transition probabilities for interacting oscillators

    OpenAIRE

    Khlebnikov, S. Yu.

    1994-01-01

    Semiclassical transition probabilities characterize transfer of energy between "hard" and "soft" modes in various physical systems. We establish the boundary problem for singular euclidean solutions used to calculate such probabilities. Solutions are found numerically for a system of two interacting quartic oscillators. In the double-well case, we find numerical evidence that certain regular Minkowskian trajectories have approximate stopping points or, equivalently, are approximately pe...

  7. Transition probabilities for argon I

    International Nuclear Information System (INIS)

    Transition probabilities for ArI lines have been calculated on the basis of the (j,k)-coupling scheme for more than 16000 spectral lines belonging to the transition arrays 4s-np (n=4 to n=9), 5s-np (n=5 to n=9), 6s-np (n=6 to n=9), 7s-np (n=8 to n=9), 4p-ns (n=5 to n=10), 5p-ns (n=6 to n=9), 6p-ns (n=7 to n=8), 4p-nd (n=3 to n=9), 5p-nd (n=4 to n=9), 3d-np (n=5 to n=9), 4d-np (n=6 to n=9), 5d-np (n=7 to n=9), 3d-nf (n=4 to n=9), 4d-nf (n=4 to n=9), 5d-nf (n=5 to n=9), 4f-nd (n=5 to n=9), 5f-nd (n=6 to n=9), 4f-ng (n=5 to n=9), 5f-ng (n=6 to n=9). Insofar as values by other authors exist, comparisons are made with them. It turns out that the results obtained in (j,k)-coupling are close to those obtained in intermediate coupling except for intercombination lines. For high principal and/or orbital quantum numbers the transition probabilities for a multiplet approach those of the corresponding transitions in atomic hydrogen. The calculated values are applied to construct a simplified argon-atom model, which reflects the real transition properties and allows simplified but realistic non-equilibrium calculations for argon plasmas that deviate from local thermodynamic equilibrium (LTE)

  8. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, in general, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  9. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  10. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, in general, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  11. Fractal probability laws.

    Science.gov (United States)

    Eliazar, Iddo; Klafter, Joseph

    2008-06-01

    We explore six classes of fractal probability laws defined on the positive half-line: Weibull, Fréchet, Lévy, hyper Pareto, hyper beta, and hyper shot noise. Each of these classes admits a unique statistical power-law structure, and is uniquely associated with a certain operation of renormalization. All six classes turn out to be one-dimensional projections of underlying Poisson processes which, in turn, are the unique fixed points of Poissonian renormalizations. The first three classes correspond to linear Poissonian renormalizations and are intimately related to extreme value theory (Weibull, Fréchet) and to the central limit theorem (Lévy). The other three classes correspond to nonlinear Poissonian renormalizations. Pareto's law, commonly perceived as the "universal fractal probability distribution", is merely a special case of the hyper Pareto class.

  12. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  13. Angles as probabilities

    CERN Document Server

    Feldman, David V

    2008-01-01

    We use a probabilistic interpretation of solid angles to generalize the well-known fact that the inner angles of a triangle sum to 180 degrees. For the 3-dimensional case, we show that the sum of the solid inner vertex angles of a tetrahedron T, divided by 2*pi, gives the probability that an orthogonal projection of T onto a random 2-plane is a triangle. More generally, it is shown that the sum of the (solid) inner vertex angles of an n-simplex S, normalized by the area of the unit (n-1)-hemisphere, gives the probability that an orthogonal projection of S onto a random hyperplane is an (n-1)-simplex. Applications to more general polytopes are treated briefly, as is the related Perles-Shephard proof of the classical Gram-Euler relations.

  14. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  15. Calculation of the pipes failure probability of the Rcic system of a nuclear power station by means of software WinPRAISE 07; Calculo de la probabilidad de falla de tuberias del sistema RCIC de una central nuclear mediante el software WinPRAISE 07

    Energy Technology Data Exchange (ETDEWEB)

    Jasso G, J.; Diaz S, A.; Mendoza G, G.; Sainz M, E. [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico); Garcia de la C, F. M., E-mail: angeles.diaz@inin.gob.mx [Comision Federal de Electricidad, Central Nucleoelectrica Laguna Verde, Km 44.5 Carretera Cardel-Nautla, 91476 Laguna Verde, Alto Lucero, Veracruz (Mexico)

    2014-10-15

    The growth and propagation of cracks by fatigue is a typical degradation mechanism in the nuclear industry as in conventional industry; the unstable propagation of a crack can cause the catastrophic failure of a metallic component, even one of high ductility. For this reason, programmed maintenance activities have been established in industry, using visual and/or ultrasonic inspection techniques at an established periodicity, which allow these growths to be followed up and their undesirable effects controlled; however, these activities increase operating costs and, in the particular case of the nuclear industry, the radiation exposure of the participating personnel. Mathematical methods that integrate concepts of uncertainty, material properties, and the probability associated with inspection results have become a powerful tool for evaluating component reliability, reducing costs and exposure levels. This work presents the evaluation of the failure probability due to fatigue growth of pre-existing cracks in pipes of the Reactor Core Isolation Cooling (RCIC) system of a nuclear power station. The software WinPRAISE 07 (Piping Reliability Analysis Including Seismic Events), based on the principles of probabilistic fracture mechanics, was used. The failure probabilities obtained evidenced good behavior of the analyzed pipes, with a maximum on the order of 1.0 E-6; it is therefore concluded that the performance of these pipe lines is reliable, even when the calculations are extrapolated to 10, 20, 30 and 40 years of service. (Author)
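Probabilistic fracture mechanics codes of this kind typically work by Monte Carlo: uncertain quantities (for example the initial crack depth) are sampled, each crack is grown deterministically with a fatigue law such as the Paris law da/dN = C·ΔK^m with ΔK = Δσ·√(πa), and the failure probability is the fraction of samples whose crack reaches a critical depth. A much simplified sketch of that idea, with entirely illustrative parameter values that are not taken from the report or from WinPRAISE:

```python
import math
import random

def failure_probability(n_samples, cycles, stress_range=150.0,
                        paris_c=1e-12, paris_m=3.0,
                        critical_depth=0.02, block=1000, seed=1):
    """Monte Carlo failure probability: sample initial crack depths (m),
    grow each one with the Paris law integrated in blocks of load cycles
    (stress in MPa), and count the fraction reaching the critical depth."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        a = rng.lognormvariate(math.log(1e-3), 0.5)  # initial depth, ~1 mm
        for _ in range(cycles // block):
            delta_k = stress_range * math.sqrt(math.pi * a)  # MPa*sqrt(m)
            a += paris_c * delta_k ** paris_m * block        # Euler step
            if a >= critical_depth:
                failures += 1
                break
    return failures / n_samples
```

As expected, the estimated failure probability grows with the number of accumulated load cycles, which is the behaviour behind extrapolating the assessment to 10, 20, 30 and 40 years of service.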

  16. Probably Almost Bayes Decisions

    DEFF Research Database (Denmark)

    Anoulova, S.; Fischer, Paul; Poelt, S.;

    1996-01-01

    In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian discriminant functions for this purpose. We analyze this approach for different classes of distribution functions of Boolean features: kth order Bahadur-Lazarsfeld expansions and kth order Chow expansions. In both cases, we obtain upper bounds for the required sample size which are small polynomials...

  17. Probability for physicists

    CERN Document Server

    Sirca, Simon

    2016-01-01

    This book is designed as a practical and intuitive introduction to probability, statistics and random quantities for physicists. The book aims at getting to the main points by a clear, hands-on exposition supported by well-illustrated and worked-out examples. A strong focus on applications in physics and other natural sciences is maintained throughout. In addition to basic concepts of random variables, distributions, expected values and statistics, the book discusses the notions of entropy, Markov processes, and fundamentals of random number generation and Monte-Carlo methods.

  18. Applying Popper's Probability

    CERN Document Server

    Whiting, Alan B

    2014-01-01

    Professor Sir Karl Popper (1902-1994) was one of the most influential philosophers of science of the twentieth century, best known for his doctrine of falsifiability. His axiomatic formulation of probability, however, is unknown to current scientists, though it is championed by several current philosophers of science as superior to the familiar version. Applying his system to problems identified by himself and his supporters, it is shown that it does not have some features he intended and does not solve the problems they have identified.

  19. Integration, measure and probability

    CERN Document Server

    Pitt, H R

    2012-01-01

    This text provides undergraduate mathematics students with an introduction to the modern theory of probability as well as the roots of the theory's mathematical ideas and techniques. Centered around the concept of measure and integration, the treatment is applicable to other branches of analysis and explores more specialized topics, including convergence theorems and random sequences and functions.The initial part is devoted to an exploration of measure and integration from first principles, including sets and set functions, general theory, and integrals of functions of real variables. These t

  20. Measurement Uncertainty and Probability

    Science.gov (United States)

    Willink, Robin

    2013-02-01

    Part I. Principles: 1. Introduction; 2. Foundational ideas in measurement; 3. Components of error or uncertainty; 4. Foundational ideas in probability and statistics; 5. The randomization of systematic errors; 6. Beyond the standard confidence interval; Part II. Evaluation of Uncertainty: 7. Final preparation; 8. Evaluation using the linear approximation; 9. Evaluation without the linear approximations; 10. Uncertainty information fit for purpose; Part III. Related Topics: 11. Measurement of vectors and functions; 12. Why take part in a measurement comparison?; 13. Other philosophies; 14. An assessment of objective Bayesian methods; 15. A guide to the expression of uncertainty in measurement; 16. Measurement near a limit - an insoluble problem?; References; Index.

  1. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another
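The a-priori dice calculation mentioned in these notes is a simple enumeration: with every face of every fair die equally likely, the probability of any specified outcome is the number of favourable elementary outcomes divided by the total count. A small sketch (the event chosen here, a total of 7 with two dice, is just an example):

```python
from fractions import Fraction
from itertools import product

def probability(event, n_dice=2, sides=6):
    """Probability of `event` (a predicate on a tuple of face values) for
    a roll of fair dice, by enumerating all equally likely outcomes."""
    outcomes = list(product(range(1, sides + 1), repeat=n_dice))
    favourable = sum(1 for roll in outcomes if event(roll))
    return Fraction(favourable, len(outcomes))
```

For instance, `probability(lambda roll: sum(roll) == 7)` counts 6 favourable outcomes out of 36, i.e. 1/6; the inverse problem of the notes, inferring the dice from observed rolls, is what the statistics half addresses.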

  2. Lectures on probability and statistics

    Energy Technology Data Exchange (ETDEWEB)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.

  3. Calculation of decision making probability using probit and logit models

    OpenAIRE

    Barbara Futryn; Marek Fura

    2005-01-01

    The aim of this article is to present logit and probit models and their wide application in many different sciences. Logit and probit regression are used for analyzing the relationship between one or more independent variables and a categorical dependent variable. Logit (probit) models have a number of advantages over linear multiple regression. These methods imply that the dependent variable is actually the result of a transformation of an underlying variable, which is not restricted...

  4. Measure, integral and probability

    CERN Document Server

    Capiński, Marek

    2004-01-01

    Measure, Integral and Probability is a gentle introduction that makes measure and integration theory accessible to the average third-year undergraduate student. The ideas are developed at an easy pace in a form that is suitable for self-study, with an emphasis on clear explanations and concrete examples rather than abstract theory. For this second edition, the text has been thoroughly revised and expanded. New features include: · a substantial new chapter, featuring a constructive proof of the Radon-Nikodym theorem, an analysis of the structure of Lebesgue-Stieltjes measures, the Hahn-Jordan decomposition, and a brief introduction to martingales · key aspects of financial modelling, including the Black-Scholes formula, discussed briefly from a measure-theoretical perspective to help the reader understand the underlying mathematical framework. In addition, further exercises and examples are provided to encourage the reader to become directly involved with the material.

  5. Probability & Perception: The Representativeness Heuristic in Action

    Science.gov (United States)

    Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.

    2014-01-01

    If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…

  6. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions.
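Under the Rayleigh assumption mentioned in the abstract, an individual wave height H in a sea state with significant wave height Hs has exceedance probability P(H > h) = exp(−2(h/Hs)²); the probability that the largest of N waves exceeds h is then 1 − (1 − exp(−2(h/Hs)²))^N, and the most probable maximum is roughly Hs·√(ln N / 2). A sketch of that arithmetic (this illustrates only the Rayleigh step, not the paper's long-term extreme-value method):

```python
import math

def exceedance(h, hs):
    # Rayleigh exceedance probability of an individual wave height h
    return math.exp(-2.0 * (h / hs) ** 2)

def prob_max_exceeds(h, hs, n_waves):
    # probability that the largest of n independent waves exceeds h
    return 1.0 - (1.0 - exceedance(h, hs)) ** n_waves

def modal_max_height(hs, n_waves):
    # most probable maximum individual wave height among n waves
    return hs * math.sqrt(math.log(n_waves) / 2.0)
```

At the modal maximum the per-wave exceedance probability is exactly 1/N, so the probability that it is exceeded at least once among the N waves is 1 − (1 − 1/N)^N, about 63%: designing to the expected maximum still leaves a substantial encounter probability, which is the gap the paper addresses.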

  7. People's conditional probability judgments follow probability theory (plus noise).

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2016-09-01

    A common view in current psychology is that people estimate probabilities using various 'heuristics' or rules of thumb that do not follow the normative rules of probability theory. We present a model where people estimate conditional probabilities such as P(A|B) (the probability of A given that B has occurred) by a process that follows standard frequentist probability theory but is subject to random noise. This model accounts for various results from previous studies of conditional probability judgment. This model predicts that people's conditional probability judgments will agree with a series of fundamental identities in probability theory whose form cancels the effect of noise, while deviating from probability theory in other expressions whose form does not allow such cancellation. Two experiments strongly confirm these predictions, with people's estimates on average agreeing with probability theory for the noise-cancelling identities, but deviating from probability theory (in just the way predicted by the model) for other identities. This new model subsumes an earlier model of unconditional or 'direct' probability judgment which explains a number of systematic biases seen in direct probability judgment (Costello & Watts, 2014). This model may thus provide a fully general account of the mechanisms by which people estimate probabilities. PMID:27570097
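    A minimal simulation of the "probability theory plus noise" idea (parameter values assumed, not taken from the paper): each retrieved memory sample is misread with probability d, so a frequency estimate of a true probability p has expectation (1-2d)p + d, and an identity such as P(A) + P(B) - P(A and B) - P(A or B) = 0 cancels the noise term exactly.

```python
import random

random.seed(2)

d, m = 0.1, 100          # noise rate and sample size (assumed)
pA, pB = 0.6, 0.5        # true marginals, A and B independent here (assumed)

def noisy_estimate(p):
    """Frequency estimate of p from m samples, each flipped with prob d."""
    hits = 0
    for _ in range(m):
        event = random.random() < p
        if random.random() < d:      # read error flips the sample
            event = not event
        hits += event
    return hits / m

trials = 2000
pAB = pA * pB
pAorB = pA + pB - pAB
# Noise-cancelling identity: expectation is (1-2d)*0 + d*(1+1-1-1) = 0.
acc = 0.0
for _ in range(trials):
    acc += (noisy_estimate(pA) + noisy_estimate(pB)
            - noisy_estimate(pAB) - noisy_estimate(pAorB))
print(f"mean of noise-cancelling identity ≈ {acc / trials:.4f}")  # near 0
```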

  8. People's conditional probability judgments follow probability theory (plus noise).

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2016-09-01

    A common view in current psychology is that people estimate probabilities using various 'heuristics' or rules of thumb that do not follow the normative rules of probability theory. We present a model where people estimate conditional probabilities such as P(A|B) (the probability of A given that B has occurred) by a process that follows standard frequentist probability theory but is subject to random noise. This model accounts for various results from previous studies of conditional probability judgment. This model predicts that people's conditional probability judgments will agree with a series of fundamental identities in probability theory whose form cancels the effect of noise, while deviating from probability theory in other expressions whose form does not allow such cancellation. Two experiments strongly confirm these predictions, with people's estimates on average agreeing with probability theory for the noise-cancelling identities, but deviating from probability theory (in just the way predicted by the model) for other identities. This new model subsumes an earlier model of unconditional or 'direct' probability judgment which explains a number of systematic biases seen in direct probability judgment (Costello & Watts, 2014). This model may thus provide a fully general account of the mechanisms by which people estimate probabilities.

  9. Savage's Concept of Probability

    Institute of Scientific and Technical Information of China (English)

    熊卫

    2003-01-01

    Starting with personal preference, Savage [3] constructs a foundation theory for probability, from qualitative probability to quantitative probability and on to utility. There are profound logical connections between the three steps in Savage's theory; that is, the quantitative concepts properly represent the qualitative concepts. Moreover, Savage's definition of subjective probability is in accordance with probability theory, and the theory gives us a rational decision model only if we assume that the weak ...

  10. Transition probabilities for diffusion equations by means of path integrals

    OpenAIRE

    Goovaerts, Marc; DE SCHEPPER, Ann; Decamps, Marc

    2002-01-01

    In this paper, we investigate the transition probabilities for diffusion processes. In a first part, we show how transition probabilities for rather general diffusion processes can always be expressed by means of a path integral. For several classical models, an exact calculation is possible, leading to analytical expressions for the transition probabilities and for the maximum probability paths. A second part consists of the derivation of an analytical approximation for the transition probab...

  11. Transition probabilities for diffusion equations by means of path integrals.

    OpenAIRE

    Goovaerts, Marc; De Schepper, A; Decamps, M.

    2002-01-01

    In this paper, we investigate the transition probabilities for diffusion processes. In a first part, we show how transition probabilities for rather general diffusion processes can always be expressed by means of a path integral. For several classical models, an exact calculation is possible, leading to analytical expressions for the transition probabilities and for the maximum probability paths. A second part consists of the derivation of an analytical approximation for the transition probab...

  12. Chemical immobilization of adult female Weddell seals with tiletamine and zolazepam: effects of age, condition and stage of lactation

    Directory of Open Access Journals (Sweden)

    Harcourt Robert G

    2006-02-01

    Full Text Available Abstract Background Chemical immobilization of Weddell seals (Leptonychotes weddellii has previously been, for the most part, problematic and this has been mainly attributed to the type of immobilizing agent used. In addition to individual sensitivity, physiological status may play an important role. We investigated the use of the intravenous administration of a 1:1 mixture of tiletamine and zolazepam (Telazol® to immobilize adult females at different points during a physiologically demanding 5–6 week lactation period. We also compared performance between IV and IM injection of the same mixture. Results The tiletamine:zolazepam mixture administered intravenously was an effective method for immobilization with no fatalities or pronounced apnoeas in 106 procedures; however, there was a 25% (one animal in four) mortality rate with intramuscular administration. Induction time was slightly longer for females at the end of lactation (54.9 ± 2.3 seconds than at post-parturition (48.2 ± 2.9 seconds. In addition, the number of previous captures had a positive effect on induction time. There was no evidence for effects due to age, condition (total body lipid, stage of lactation or number of captures on recovery time. Conclusion We suggest that intravenous administration of tiletamine and zolazepam is an effective and safe immobilizing agent for female Weddell seals. Although individual traits could not explain variation in recovery time, we suggest careful monitoring of recovery times during longitudinal studies (> 2 captures. We show that physiological pressures do not substantially affect response to chemical immobilization with this mixture; however, consideration must be taken for differences that may exist for immobilization of adult males and juveniles. Nevertheless, we recommend a mass-specific dose of 0.50 – 0.65 mg/kg for future procedures with adult female Weddell seals and a starting dose of 0.50 mg/kg for other age classes and other

  13. RANDOM VARIABLE WITH FUZZY PROBABILITY

    Institute of Scientific and Technical Information of China (English)

    吕恩琳; 钟佑明

    2003-01-01

    A mathematical description of the second kind of fuzzy random variable, namely the random variable with crisp events and fuzzy probability, was studied. Based on interval probability and using the fuzzy resolution theorem, the feasible condition on a probability fuzzy number set was given; building on this, the definition and properties of the random variable with fuzzy probability (RVFP) and its fuzzy distribution function and fuzzy probability distribution sequence were put forward. The fuzzy probability resolution theorem with the closing operation of fuzzy probability was given and proved. The definition and properties of the mathematical expectation and variance of the RVFP were also studied. All the mathematical descriptions of the RVFP have the closing operation for fuzzy probability; as a result, the foundation for perfecting the fuzzy probability operation method is laid.

  14. Probability and rational choice

    Directory of Open Access Journals (Sweden)

    David Botting

    2014-04-01

    Full Text Available In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis and inferring to the next instance (singular predictive inference can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.

  15. Approaches to Evaluating Probability of Collision Uncertainty

    Science.gov (United States)

    Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done this mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to try to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented; and an initial proposal for operationally-useful display and interpretation of these data for a particular conjunction is given.
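    The resampling idea, turning a point Pc estimate into a distribution by propagating covariance uncertainty, can be sketched with a Monte Carlo Pc in the encounter plane. All numbers below (miss vector, covariance, hard-body radius, and the lognormal covariance-scale model) are illustrative assumptions, not operational values:

```python
import numpy as np

rng = np.random.default_rng(0)

mu = np.array([120.0, 40.0])          # miss vector [m] (assumed)
C = np.array([[2500.0, 300.0],
              [300.0, 900.0]])        # combined covariance [m^2] (assumed)
R = 20.0                              # combined hard-body radius [m] (assumed)

def mc_pc(mu, C, R, n=200_000):
    """Estimate Pc = P(|X| < R) for X ~ N(mu, C) by direct sampling."""
    x = rng.multivariate_normal(mu, C, size=n)
    return np.mean(np.hypot(x[:, 0], x[:, 1]) < R)

point_pc = mc_pc(mu, C, R)

# Covariance realism: resample a scale factor k and recompute Pc, yielding
# a spread of Pc values instead of a single point estimate.
ks = rng.lognormal(mean=0.0, sigma=0.3, size=30)   # assumed uncertainty model
pcs = np.array([mc_pc(mu, k * C, R, n=50_000) for k in ks])
print(f"point Pc ≈ {point_pc:.2e}; resampled Pc range "
      f"[{pcs.min():.2e}, {pcs.max():.2e}]")
```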

  16. The Logic of Parametric Probability

    CERN Document Server

    Norman, Joseph W

    2012-01-01

    The computational method of parametric probability analysis is introduced. It is demonstrated how to embed logical formulas from the propositional calculus into parametric probability networks, thereby enabling sound reasoning about the probabilities of logical propositions. An alternative direct probability encoding scheme is presented, which allows statements of implication and quantification to be modeled directly as constraints on conditional probabilities. Several example problems are solved, from Johnson-Laird's aces to Smullyan's zombies. Many apparently challenging problems in logic turn out to be simple problems in algebra and computer science; often just systems of polynomial equations or linear optimization problems. This work extends the mathematical logic and parametric probability methods invented by George Boole.

  17. Computing Earthquake Probabilities on Global Scales

    Science.gov (United States)

    Holliday, James R.; Graves, William R.; Rundle, John B.; Turcotte, Donald L.

    2016-03-01

    Large devastating events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities for large events in systems such as these. This method counts the number of small events since the last large event and then converts this count into a probability by using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near mean field systems having long-range interactions, an example of which is earthquakes and elastic interactions. We then construct an application of the method and show examples of computed earthquake probabilities.
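    The count-to-probability conversion described above can be sketched directly: the number n of small events since the last large one is fed through a Weibull law. The shape and scale parameters below are illustrative, not fitted values from the study:

```python
import math

beta, tau = 1.5, 300.0      # Weibull shape and scale, in counts (assumed)

def large_event_probability(n):
    """P(large event by the time n small events have occurred): Weibull CDF."""
    return 1.0 - math.exp(-((n / tau) ** beta))

for n in (50, 300, 900):
    print(f"n = {n:4d} small events -> P ≈ {large_event_probability(n):.3f}")
```

    At n equal to the scale parameter the probability is 1 - 1/e ≈ 0.632, and it rises monotonically with the count.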

  18. Steering in spin tomographic probability representation

    Science.gov (United States)

    Man'ko, V. I.; Markovich, L. A.

    2016-09-01

    The steering property known for two-qubit state in terms of specific inequalities for the correlation function is translated for the state of qudit with the spin j = 3 / 2. Since most steering detection inequalities are based on the correlation functions we introduce analogs of such functions for the single qudit systems. The tomographic probability representation for the qudit states is applied. The connection between the correlation function in the two-qubit system and the single qudit is presented in an integral form with an intertwining kernel calculated explicitly in tomographic probability terms.

  19. Electric quadrupole transition probabilities for atomic lithium

    International Nuclear Information System (INIS)

    Electric quadrupole transition probabilities for atomic lithium have been calculated using the weakest bound electron potential model theory (WBEPMT). We have employed numerical non-relativistic Hartree–Fock wavefunctions for expectation values of radii, and the necessary energy values have been taken from the compilation at NIST. The results obtained with the present method agree very well with the Coulomb approximation results given by Caves (1975). Moreover, electric quadrupole transition probability values not existing in the literature for some highly excited levels have been obtained using the WBEPMT.

  20. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  1. Calculation of Parameter Failure Probability of Thermodynamic System by Response Surface and Importance Sampling Method

    Institute of Scientific and Technical Information of China (English)

    尚彦龙; 蔡琦; 陈力生; 张杨伟

    2012-01-01

    In this paper, the combined method of response surface and importance sampling was applied to the calculation of the parameter failure probability of a thermodynamic system. A mathematical model was presented for the parameter failure of the physical process in the thermodynamic system, on which the combined algorithm of response surface and importance sampling was established; the performance degradation model of the components and the simulation process of parameter failure in the physical process of the thermodynamic system were also presented. The parameter failure probability of the purification water system in a nuclear reactor was then obtained by the combined method. The results show that the combined method is effective for calculating the parameter failure probability of a thermodynamic system with high dimensionality and pronounced non-linear characteristics, achieving satisfactory precision with less computing time than direct sampling while avoiding the drawbacks of the response surface method alone.
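    The combination of a response surface with importance sampling can be sketched on a toy problem. Here a cheap linear limit-state function in standard-normal space stands in for the expensive thermodynamic model (an assumed form, chosen so the exact failure probability is Φ(-3)); a linear surrogate is fitted from a small experimental design, its design point located, and importance sampling is centred there:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(1)

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# Toy limit-state function standing in for an expensive model (assumed form):
# failure when g <= 0; exact failure probability here is Phi(-3).
def g(u):
    return 3.0 - (u[..., 0] + u[..., 1]) / np.sqrt(2.0)

# 1) Response surface: fit a linear surrogate g_hat = c0 + c . u from a
#    small experimental design in standard-normal space.
design = rng.normal(size=(20, 2))
A = np.hstack([np.ones((20, 1)), design])
coef, *_ = np.linalg.lstsq(A, g(design), rcond=None)
c0, c = coef[0], coef[1:]

# 2) Approximate design point: closest point on g_hat = 0 to the origin.
u_star = -c0 * c / np.dot(c, c)

# 3) Importance sampling centred at the design point; the indicator uses
#    the true g, the surrogate only chooses the sampling centre.
n = 50_000
u = rng.normal(size=(n, 2)) + u_star
# Likelihood ratio N(u; 0, I) / N(u; u_star, I):
w = np.exp(-u @ u_star + 0.5 * np.dot(u_star, u_star))
pf = np.mean(w * (g(u) <= 0.0))
print(f"IS estimate Pf ≈ {pf:.3e}, exact {norm_cdf(-3.0):.3e}")
```

    Direct sampling would need millions of samples to see a failure probability of about 1.3e-3 with comparable accuracy, which is the efficiency argument made in the abstract.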

  2. Calculation Method for Injury Probability of Living Beings Caused by Electric Shock Due to Touch and Step Voltages

    Institute of Scientific and Technical Information of China (English)

    冯鹤

    2016-01-01

    In order to improve the pertinence and accuracy of life-loss risk calculation and to offer constructive guidance on the safe installation of artificial earthing devices, this paper analyses the methods for calculating touch voltage and step voltage when an artificial earthing device discharges, and establishes a quantitative method for the probability of injury caused by these voltages. In the calculation, the computed touch and step voltages are compared with the human tolerance thresholds to determine the minimum lightning current amplitude that reaches those thresholds. Then, from the lightning activity characteristics at the project site, the frequency of lightning currents whose amplitude exceeds this disaster threshold is determined, and this frequency is taken as the injury probability of living beings due to touch and step voltages. The calculation of touch and step voltages for the artificial earthing device takes a vertical earthing electrode as an example, and the influence of impulse current, as compared with power-frequency current, on the calculation method is also considered.
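    The threshold-then-frequency logic of the abstract can be sketched as follows. The surface-potential formula for a vertical rod, the soil and geometry values, and the tolerance threshold are all illustrative assumptions; only the current-amplitude distribution P(I ≥ i) = 1/(1 + (i/31)^2.6) is a standard engineering formula (IEEE):

```python
import math

rho = 100.0     # soil resistivity [ohm*m] (assumed)
l = 2.5         # rod length [m] (assumed)
step = 1.0      # step length [m] (assumed)
U_tol = 25_000  # tolerable step voltage [V] (assumed threshold)

def surface_potential(x, I):
    """Surface potential [V] at distance x from the rod (textbook model)."""
    return rho * I / (2.0 * math.pi * l) * math.log((l + math.hypot(l, x)) / x)

def step_voltage(x, I):
    return surface_potential(x, I) - surface_potential(x + step, I)

# Smallest stroke current (kA) whose step voltage at x = 1 m reaches U_tol:
I_min_kA = next(i for i in range(1, 400)
                if step_voltage(1.0, i * 1000.0) >= U_tol)

# Probability a stroke reaches that amplitude (IEEE amplitude distribution):
p_injury = 1.0 / (1.0 + (I_min_kA / 31.0) ** 2.6)
print(f"I_min ≈ {I_min_kA} kA, injury probability ≈ {p_injury:.3f}")
```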

  3. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable

  4. Trajectory versus probability density entropy

    Science.gov (United States)

    Bologna, Mauro; Grigolini, Paolo; Karagiorgis, Markos; Rosa, Angelo

    2001-07-01

    We show that the widely accepted conviction that a connection can be established between the probability density entropy and the Kolmogorov-Sinai (KS) entropy is questionable. We adopt the definition of density entropy as a functional of a distribution density whose time evolution is determined by a transport equation, conceived as the only prescription to use for the calculation. Although the transport equation is built up for the purpose of affording a picture equivalent to that stemming from trajectory dynamics, no direct use of trajectory time evolution is allowed, once the transport equation is defined. With this definition in mind we prove that the detection of a time regime of increase of the density entropy with a rate identical to the KS entropy is possible only in a limited number of cases. The proposals made by some authors to establish a connection between the two entropies in general, violate our definition of density entropy and imply the concept of trajectory, which is foreign to that of density entropy.

  5. Match probabilities in racially admixed populations.

    Science.gov (United States)

    Lange, K

    1993-02-01

    The calculation of match probabilities is the most contentious issue dividing prosecution and defense experts in the forensic applications of DNA fingerprinting. In particular, defense experts question the applicability of the population genetic laws of Hardy-Weinberg and linkage equilibrium to racially admixed American populations. Linkage equilibrium justifies the product rule for computing match probabilities across loci. The present paper suggests a method of bounding match probabilities that depends on modeling gene descent from ancestral populations to contemporary populations under the assumptions of Hardy-Weinberg and linkage equilibrium only in the ancestral populations. Although these bounds are conservative from the defendant's perspective, they should be small enough in practice to satisfy prosecutors.
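    The product rule that the abstract refers to is simple to state in code: Hardy-Weinberg gives each single-locus genotype frequency (2pq for a heterozygote, p² for a homozygote), and linkage equilibrium justifies multiplying across loci. The allele frequencies below are invented examples, not real database values:

```python
profile = [
    ("D3S1358", 0.21, 0.14),   # heterozygous: alleles with freqs p, q
    ("vWA",     0.11, 0.11),   # homozygous: p == q
    ("FGA",     0.08, 0.19),
]

match_probability = 1.0
for locus, p, q in profile:
    # Hardy-Weinberg genotype frequency at one locus:
    genotype_freq = p * p if p == q else 2.0 * p * q
    match_probability *= genotype_freq

print(f"Profile match probability ≈ {match_probability:.2e}")
```

    The bounds proposed in the paper replace exactly this per-locus equilibrium assumption in the contemporary admixed population with equilibrium assumed only in the ancestral populations.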

  6. Probability workshop to be better in probability topic

    Science.gov (United States)

    Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed

    2015-02-01

    The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students in higher education have an effect on their performance. 62 fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance in the probability topic is not related to anxiety level; that is, a higher level of statistics anxiety does not cause a lower score in probability performance. The study also revealed that students who were motivated by the probability workshop showed a positive improvement in their probability performance compared with before the workshop. In addition, there is a significant difference in performance between genders, with better achievement among female students than male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.

  7. Investigation of probable decays in rhenium isotopes

    International Nuclear Information System (INIS)

    Making use of the effective liquid drop model (ELDM), the feasibility of proton, alpha and various cluster decays is analysed theoretically. For different neutron-rich and neutron-deficient isotopes of Rhenium in the mass range 150 < A < 200, the half-lives of proton and alpha decays and probable cluster decays are calculated, taking the barrier potential as the effective liquid drop one, which is the sum of the Coulomb, surface and centrifugal potentials. The calculated half-lives for proton decay from various Rhenium isotopes are then compared with the universal decay law (UDL) model to assess the efficiency of the present formalism. Geiger-Nuttall plots of the probable decays are analysed and their respective slopes and intercepts are evaluated

  8. Fusion probability in heavy nuclei

    Science.gov (United States)

    Banerjee, Tathagata; Nath, S.; Pal, Santanu

    2015-03-01

    Background: Fusion between two massive nuclei is a very complex process and is characterized by three stages: (a) capture inside the potential barrier, (b) formation of an equilibrated compound nucleus (CN), and (c) statistical decay of the CN leading to a cold evaporation residue (ER) or fission. The second stage is the least understood of the three and is the most crucial in predicting yield of superheavy elements (SHE) formed in complete fusion reactions. Purpose: A systematic study of the average fusion probability, ⟨PCN⟩, is undertaken to obtain a better understanding of its dependence on various reaction parameters. The study may also help to clearly demarcate onset of non-CN fission (NCNF), which causes the fusion probability, PCN, to deviate from unity. Method: ER excitation functions for 52 reactions leading to CN in the mass region 170-220, which are available in the literature, have been compared with statistical model (SM) calculations. Capture cross sections have been obtained from a coupled-channels code. In the SM, shell corrections in both the level density and the fission barrier have been included. ⟨PCN⟩ for these reactions has been extracted by comparing experimental and theoretical ER excitation functions in the energy range ˜5%-35% above the potential barrier, where known effects of nuclear structure are insignificant. Results: ⟨PCN⟩ has been shown to vary with entrance channel mass asymmetry, η (or charge product, ZpZt), as well as with fissility of the CN, χCN. No parameter has been found to be adequate as a single scaling variable to determine ⟨PCN⟩. Approximate boundaries have been obtained from where ⟨PCN⟩ starts deviating from unity. Conclusions: This study quite clearly reveals the limits of applicability of the SM in interpreting experimental observables from fusion reactions involving two massive nuclei. Deviation of ⟨PCN⟩ from unity marks the beginning of the domain of dynamical models of fusion. Availability of precise ER cross sections

  9. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  10. Probabilities of multiple quantum teleportation

    OpenAIRE

    Woesler, Richard

    2002-01-01

    Using quantum teleportation a quantum state can be teleported with a certain probability. Here the probabilities for multiple teleportation are derived, i.e., for the case that a teleported quantum state is teleported again or even more than two times, for the two-dimensional case, e.g., for the two orthogonal directions of the polarization of photons. It is shown that the probability for an exact teleportation, except for an irrelevant phase factor, is 25%, i.e., surprisingly, this resul...

  11. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  12. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte;

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory to charac...

  13. Exciton-Dependent Pre-formation Probability of Composite Particles

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jing-Shang; WANG Ji-Min; DUAN Jun-Feng

    2007-01-01

    In the Iwamoto-Harada model the whole phase space is full of fermions. When the momentum distributions of the exciton states are taken into account, the pre-formation probability of light composite particles can be improved, and an exciton-state-dependent pre-formation probability has been proposed. The calculated results indicate that consideration of the momentum distribution enhances the pre-formation probability of the [1,m] configuration and strongly suppresses that of the [l > 1, m] configurations.

  14. The albedo effect on neutron transmission probability.

    Science.gov (United States)

    Khanouchi, A; Sabir, A; Boulkheir, M; Ichaoui, R; Ghassoun, J; Jehouani, A

    1997-01-01

    The aim of this study is to evaluate the albedo effect on the neutron transmission probability through slab shields. For this reason we have considered an infinite homogeneous slab having a fixed thickness equal to 20 lambda (lambda is the mean free path of the neutron in the slab). This slab is characterized by the factor Ps (scattering probability) and contains a vacuum channel which is formed by two horizontal parts and an inclined one (David, M. C. (1962) Ducts and Voids in Shields. In Reactor Handbook, Vol. III, Part B, p. 166). The thickness of the vacuum channel is taken equal to 2 lambda. An infinite plane source of neutrons is placed on the first face of the slab (left face) and detectors, having windows equal to 2 lambda, are placed on the second face of the slab (right face). Neutron histories are sampled by the Monte Carlo method (Booth, T. E. and Hendricks, J. S. (1994) Nuclear Technology 5) using exponential biasing in order to increase the Monte Carlo calculation efficiency (Levitt, L. B. (1968) Nuclear Science and Engineering 31, 500-504; Jehouani, A., Ghassoun, J. and Abouker, A. (1994) In Proceedings of the 6th International Symposium on Radiation Physics, Rabat, Morocco), and we have applied the statistical weight method, which supposes that the neutron is born at the source with a unit statistical weight which is corrected after each collision. For different values of the scattering probability and for different slopes of the inclined part of the channel we have calculated the neutron transmission probability for different positions of the detectors versus the albedo at the vacuum channel-medium interface. Some analytical representations are also presented for these transmission probabilities. PMID:9463883
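    The statistical-weight method described above can be sketched for a plain homogeneous slab with no channel: exponential free flights in mean-free-path units, isotropic re-emission, and survival biasing in which the weight is multiplied by Ps at each collision instead of killing the particle on absorption. The slab here is thinner than the paper's 20 mean free paths so that an unbiased run still scores (an assumed demonstration value), and Ps is likewise assumed:

```python
import math
import random

random.seed(3)

thickness = 5.0    # slab thickness in mean free paths (assumed, < paper's 20)
Ps = 0.8           # scattering probability (assumed value)

def transmission_probability(n_histories=20_000):
    transmitted = 0.0
    for _ in range(n_histories):
        x, mu, w = 0.0, 1.0, 1.0        # depth, direction cosine, weight
        while True:
            x += mu * (-math.log(1.0 - random.random()))  # exponential flight
            if x >= thickness:
                transmitted += w        # score the surviving weight
                break
            if x < 0.0:
                break                   # escaped back through the front face
            w *= Ps                     # survival biasing instead of absorption
            if w < 1e-12:
                break
            mu = 2.0 * random.random() - 1.0              # isotropic re-emission
    return transmitted / n_histories

print(f"Estimated transmission probability ≈ {transmission_probability():.2e}")
```

    For the deep-penetration case of the paper (20 mean free paths) analog scoring like this would almost never register a transmission, which is exactly why the authors add exponential biasing on top of the weight scheme.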

  15. Assault frequency and preformation probability of the alpha emission process

    OpenAIRE

    Zhang, H.F.; Royer, G.; Li, J.Q.

    2011-01-01

    A study of the assault frequency and preformation factor of the α-decay description is performed from the experimental α-decay constant and the penetration probabilities calculated from the generalized liquid-drop model (GLDM) potential barriers. To determine the assault frequency a quantum-mechanical method using a harmonic oscillator is introduced and leads to values of around 10^21 s^-1, similar to the ones calculated within the classical method. The preformation probability is around 10^-1-1...
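    The quoted order of magnitude is easy to check: for a harmonic-oscillator model of the α cluster, the assault frequency is ν = ω/(2π) with ħω of a few MeV (the 5 MeV below is an assumed typical value, not one taken from the paper):

```python
import math

hbar = 1.054_571_8e-34       # J*s
MeV = 1.602_176_6e-13        # J
hbar_omega = 5.0 * MeV       # assumed oscillator quantum

omega = hbar_omega / hbar            # rad/s
nu = omega / (2.0 * math.pi)         # assaults per second
print(f"assault frequency ≈ {nu:.1e} s^-1")   # of order 1e21, as quoted
```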

  16. Probability theory and its models

    OpenAIRE

    Humphreys, Paul

    2008-01-01

    This paper argues for the status of formal probability theory as a mathematical, rather than a scientific, theory. David Freedman and Philip Stark's concept of model based probabilities is examined and is used as a bridge between the formal theory and applications.

  17. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits...

  18. Varieties of Belief and Probability

    NARCIS (Netherlands)

    Eijck, D.J.N. van; Ghosh, S.; Szymanik, J.

    2015-01-01

    For reasoning about uncertain situations, we have probability theory, and we have logics of knowledge and belief. How does elementary probability theory relate to epistemic logic and the logic of belief? The paper focuses on the notion of betting belief, and interprets a language for knowledge and b

  19. Subjective probability models for lifetimes

    CERN Document Server

    Spizzichino, Fabio

    2001-01-01

    Bayesian methods in reliability cannot be fully utilized and understood without full comprehension of the essential differences that exist between frequentist probability and subjective probability. Switching from the frequentist to the subjective approach requires that some fundamental concepts be rethought and suitably redefined. Subjective Probability Models for Lifetimes details those differences and clarifies aspects of subjective probability that have a direct influence on modeling and drawing inference from failure and survival data. In particular, within a framework of Bayesian theory, the author considers the effects of different levels of information in the analysis of the phenomena of positive and negative aging.The author coherently reviews and compares the various definitions and results concerning stochastic ordering, statistical dependence, reliability, and decision theory. He offers a detailed but accessible mathematical treatment of different aspects of probability distributions for exchangea...

  20. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  1. Survival probability and ruin probability of a risk model

    Institute of Scientific and Technical Information of China (English)

    LUO Jian-hua

    2008-01-01

    In this paper, a new risk model is studied in which the rate of premium income is regarded as a random variable, the arrival of insurance policies is a Poisson process, and the claim-occurrence process is a p-thinning process. Integral representations of the survival probability are obtained. An explicit formula for the survival probability on the infinite interval is obtained in the special case of an exponential distribution. The Lundberg inequality and the general formula for the ruin probability are obtained using techniques from martingale theory.
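
    The paper's model (random premium rate, p-thinned claims) is more general than anything sketched here, but the role of the exponential special case can be illustrated with the classical constant-premium compound Poisson surplus process, for which the infinite-horizon survival probability has the well-known closed form phi(u) = 1 - (lam*m/c) * exp(-(1/m - lam/c)*u) for exponential claims with mean m and c > lam*m. A Monte Carlo sketch (all parameter names are illustrative):

```python
import math
import random

def survival_probability(u, c, lam, mean_claim, horizon,
                         n_paths=10000, seed=7):
    # Monte Carlo estimate of the finite-horizon survival probability for
    # the classical surplus process U(t) = u + c*t - S(t): claims arrive
    # as a Poisson(lam) process and are exponential with the given mean.
    # Ruin = the surplus falling below zero at a claim epoch.
    rng = random.Random(seed)
    survived = 0
    for _ in range(n_paths):
        t, total_claims, ruined = 0.0, 0.0, False
        while True:
            t += rng.expovariate(lam)              # next claim epoch
            if t > horizon:
                break                              # reached the horizon unruined
            total_claims += rng.expovariate(1.0 / mean_claim)
            if u + c * t - total_claims < 0.0:
                ruined = True
                break
        survived += not ruined
    return survived / n_paths
```

    For a long horizon and positive safety loading, the estimate approaches the closed-form infinite-horizon value.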

  2. Probability

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    People much given to gambling usually manage to work out rough-and-ready ways of measuring the likelihood of certain situations so as to know which way to bet their money, and how much. If they did not do this, they would quickly lose all their money to those who did.

  3. The collision probability modules of WIMS-E

    International Nuclear Information System (INIS)

    This report describes how flat source first flight collision probabilities are calculated and used in the WIMS-E modular program. It includes a description of the input to the modules W-FLU, W-THES, W-PIP, W-PERS and W-MERGE. Input to other collision probability modules are described in separate reports. WIMS-E is capable of calculating collision probabilities in a wide variety of geometries, some of them quite complicated. It can also use them for a variety of purposes. (author)

  4. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  5. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  6. Holographic probabilities in eternal inflation.

    Science.gov (United States)

    Bousso, Raphael

    2006-11-10

    In the global description of eternal inflation, probabilities for vacua are notoriously ambiguous. The local point of view is preferred by holography and naturally picks out a simple probability measure. It is insensitive to large expansion factors or lifetimes and so resolves a recently noted paradox. Any cosmological measure must be complemented with the probability for observers to emerge in a given vacuum. In lieu of anthropic criteria, I propose to estimate this by the entropy that can be produced in a local patch. This allows for prior-free predictions.

  7. Quantum processes: probability fluxes, transition probabilities in unit time and vacuum vibrations

    International Nuclear Information System (INIS)

    Transition probabilities in unit time and probability fluxes are compared in studying the elementary quantum processes -the decay of a bound state under the action of time-varying and constant electric fields. It is shown that the difference between these quantities may be considerable, and so the use of transition probabilities W instead of probability fluxes Π, in calculating the particle fluxes, may lead to serious errors. The quantity W represents the rate of change with time of the population of the energy levels relating partly to the real states and partly to the virtual ones, and it cannot be directly measured in experiment. The vacuum background is shown to be continuously distorted when a perturbation acts on a system. Because of this the viewpoint of an observer on the physical properties of real particles continuously varies with time. This fact is not taken into consideration in the conventional theory of quantum transitions based on using the notion of probability amplitude. As a result, the probability amplitudes lose their physical meaning. All the physical information on quantum dynamics of a system is contained in the mean values of physical quantities. The existence of considerable differences between the quantities W and Π permits one in principle to make a choice of the correct theory of quantum transitions on the basis of experimental data. (author)

  8. Evaluation for Success Probability of Chaff Centroid Jamming

    Institute of Scientific and Technical Information of China (English)

    GAO Dong-hua; SHI Xiu-hua

    2008-01-01

    As chaff centroid jamming can introduce a guiding error into the anti-warship missile's seeker and decrease its hit probability, a new quantitative analysis method and a mathematical model are proposed in this paper to evaluate the probability of successful jamming. Using this method, the optimal decision scheme for chaff centroid jamming in different threat situations can be found, and the success probability of this scheme can be calculated quantitatively. Thus, the operation rules of centroid jamming and the tactical approach for increasing the success probability can be determined.

  9. Probabilities the little numbers that rule our lives

    CERN Document Server

    Olofsson, Peter

    2014-01-01

    Praise for the First Edition"If there is anything you want to know, or remind yourself, about probabilities, then look no further than this comprehensive, yet wittily written and enjoyable, compendium of how to apply probability calculations in real-world situations."- Keith Devlin, Stanford University, National Public Radio's "Math Guy" and author of The Math Gene and The Unfinished GameFrom probable improbabilities to regular irregularities, Probabilities: The Little Numbers That Rule Our Lives, Second Edition investigates the often surprising effects of risk and chance in our lives. Featur

  10. Methodology for assessing probability of extreme hydrologic events coincidence

    Directory of Open Access Journals (Sweden)

    Prohaska Stevan

    2010-01-01

    The aim of the presented research is improvement of the methodology for calculating the probability of coinciding occurrence of historic floods and droughts in the same year. The original procedure was developed in order to determine the occurrence probability of such an extreme historic event. The calculation procedure for assessing the probability of both an extreme drought and a flood occurring in the same year has two phases. In the first phase, outliers are detected as indicators of extreme events, their return periods are calculated, and the series' statistics are adjusted. In the second phase, conditional probabilities are calculated: empirical points are plotted, and the probability of both an extreme drought and a flood occurring in the same year is assessed from the plot. Outlier detection is performed for the territory of Serbia. Results are shown as maps of regions (basins) prone to floods, hydrologic drought, or both. A step-by-step numerical example is given for assessing the conditional probability of occurrence of flood and drought for GS Raska on the river Raska. Results of the assessment of conditional probability in two more cases are given for combinations of extreme flood and 30-day minimum flow.
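
    The second-phase idea, estimating joint and conditional probabilities of extremes from empirical points, reduces at its core to a counting exercise. The sketch below, with invented thresholds and argument names, illustrates only the empirical estimator, not the authors' full procedure (return periods, statistics adjustment):

```python
def joint_extreme_probability(annual_max, annual_min, flood_thr, drought_thr):
    # Empirical probability that a flood (annual maximum above flood_thr)
    # and a hydrologic drought (annual minimum below drought_thr) occur in
    # the same year, plus the conditional probability of a drought given
    # that a flood occurred. All names here are illustrative.
    years = len(annual_max)
    flood = [mx > flood_thr for mx in annual_max]
    drought = [mn < drought_thr for mn in annual_min]
    both = sum(1 for f, d in zip(flood, drought) if f and d)
    n_flood = sum(flood)
    p_joint = both / years
    p_drought_given_flood = both / n_flood if n_flood else 0.0
    return p_joint, p_drought_given_flood
```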

  11. Probability representation of classical states

    NARCIS (Netherlands)

    Man'ko, OV; Man'ko, [No Value; Pilyavets, OV

    2005-01-01

    Probability representation of classical states described by symplectic tomograms is discussed. Tomographic symbols of classical observables which are functions on phase-space are studied. Explicit form of kernel of commutative star-product of the tomographic symbols is obtained.

  12. Transition probabilities of Br II

    Science.gov (United States)

    Bengtson, R. D.; Miller, M. H.

    1976-01-01

    Absolute transition probabilities of the three most prominent visible Br II lines are measured in emission. Results compare well with Coulomb approximations and with line strengths extrapolated from trends in homologous atoms.

  13. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction.

  14. THE SURVIVAL PROBABILITY IN FINITE TIME PERIOD IN FULLY DISCRETE RISK MODEL

    Institute of Scientific and Technical Information of China (English)

    Cheng Shixue; Wu Biao

    1999-01-01

    The probabilities of the following events are first discussed in this paper: the insurance company survives to any fixed time k and the surplus at time k equals x≥1. The formulas for calculating such probabilities are deduced through analytical and probabilistic arguments respectively. Finally, other probability laws relating to risk are determined based on the probabilities mentioned above.
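
    In the fully discrete model (premium 1 per period, i.i.d. integer-valued claims), the finite-time survival probability can be computed exactly by propagating the surplus distribution period by period. This recursion is a standard textbook device, sketched here under the convention that ruin means the surplus dropping below zero, which may differ from the paper's exact setup:

```python
def finite_time_survival(u, claim_pmf, n_periods):
    # Exact finite-time survival probability in the fully discrete risk
    # model U_k = u + k - (X_1 + ... + X_k): premium 1 per period, i.i.d.
    # integer claims with pmf claim_pmf = {value: probability}. The
    # distribution of the not-yet-ruined surplus is propagated period by
    # period; probability mass that falls below zero is dropped as ruin.
    dist = {u: 1.0}
    for _ in range(n_periods):
        nxt = {}
        for s, p in dist.items():
            for x, px in claim_pmf.items():
                s2 = s + 1 - x            # collect premium, pay the claim
                if s2 >= 0:
                    nxt[s2] = nxt.get(s2, 0.0) + p * px
        dist = nxt
    return sum(dist.values())
```

    The same dictionary `dist` also answers the paper's first question: `dist[x]` is the probability of surviving to time k with surplus exactly x.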

  15. Logical, conditional, and classical probability

    OpenAIRE

    Quznetsov, G. A.

    2005-01-01

    Propositional logic is generalized to the field of real numbers, and a logical function with all the properties of the classical probability function is obtained. The logical analog of the Bernoulli independent-trials scheme is constructed, and the logical analog of the Law of Large Numbers is deduced from the properties of these functions. The logical analog of the conditional probability is defined. Consistency is ensured by a model within a suitable variant of nonstandard analysis.

  16. Compliance with endogenous audit probabilities

    OpenAIRE

    Konrad, Kai A.; Lohse, Tim; Qari, Salmai

    2015-01-01

    This paper studies the effect of endogenous audit probabilities on reporting behavior in a face-to-face compliance situation such as at customs. In an experimental setting in which underreporting has a higher expected payoff than truthful reporting we find an increase in compliance of about 80% if subjects have reason to believe that their behavior towards an officer influences their endogenous audit probability. Higher compliance is driven by considerations about how own appearance and perfo...

  17. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations with other loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...

  18. Joint probabilities and quantum cognition

    CERN Document Server

    de Barros, J Acacio

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantum-like response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  19. Novel Bounds on Marginal Probabilities

    OpenAIRE

    Mooij, Joris M.; Kappen, Hilbert J

    2008-01-01

    We derive two related novel bounds on single-variable marginal probability distributions in factor graphs with discrete variables. The first method propagates bounds over a subtree of the factor graph rooted in the variable, and the second method propagates bounds over the self-avoiding walk tree starting at the variable. By construction, both methods not only bound the exact marginal probability distribution of a variable, but also its approximate Belief Propagation marginal (``belief''). Th...

  20. Trajectory probability hypothesis density filter

    OpenAIRE

    García-Fernández, Ángel F.; Svensson, Lennart

    2016-01-01

    This paper presents the probability hypothesis density (PHD) filter for sets of trajectories. The resulting filter, referred to as the trajectory probability hypothesis density (TPHD) filter, is capable of estimating trajectories in a principled way without requiring the evaluation of all measurement-to-target association hypotheses. Like the PHD filter, the TPHD filter is based on recursively obtaining the best Poisson approximation to the multitrajectory filtering density in the sense of minimising the K...

  1. Transition probabilities between levels of K and K+

    International Nuclear Information System (INIS)

    In this work, transition probabilities between levels with n < 11 are calculated for K and for the known levels of K+. Two computer programs based on the Coulomb approximation and the most suitable coupling schemes have been used. Lifetimes of all these levels are also calculated. (Author)

  2. Energy-shifting formulae yield reliable reaction and capture probabilities

    International Nuclear Information System (INIS)

    Predictions of energy-shifting formulae for partial reaction and capture probabilities are compared with coupled-channels calculations. The quality of the agreement notably improves with increasing mass of the system and/or decreasing mass asymmetry in the heavy-ion collision. The formulae are reliable and useful for circumventing impracticable reaction calculations at low energies.

  3. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  4. Improved Ar(II) transition probabilities

    OpenAIRE

    Danzmann, K.; de Kock, M

    1986-01-01

    Precise Ar(II) branching ratios have been measured on a high-current hollow cathode with a 1 m Fourier transform spectrometer. Absolute transition probabilities for 11 Ar(II) lines were calculated from these branching ratios and the lifetime measurements published by Mohamed et al. For the prominent 4806 Å line, the present result is A_ik = 7.12×10⁷ s⁻¹ ± 2.8%, which is in excellent agreement with recent literature data derived from pure argon diagnostics, two-wavelength interferometry, and Hβ-diagn...

  5. MATHEMATICAL EXPECTATION ABOUT DISCRETE RANDOM VARIABLE WITH INTERVAL PROBABILITY OR FUZZY PROBABILITY

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    The characteristics and an algorithm for DRVIP (discrete random variable with interval probability) and the second kind, DRVFP (discrete random variable with crisp-event fuzzy probability), are researched. Using the fuzzy resolution theorem, solving the mathematical expectation of a DRVFP can be translated into solving the mathematical expectation of a series of DRVIPs. Solving the mathematical expectation of a DRVIP is a typical linear programming problem. A very practical calculating formula for the mathematical expectation of a DRVIP was obtained by using Dantzig's simplex method. The example indicates that the result obtained with this formula agrees completely with the result obtained by the linear programming method, while the process using the formula is simpler.
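
    The linear program behind the expectation of a variable with interval probabilities has a simple greedy solution (it is a continuous-knapsack problem), which can stand in for the simplex computation mentioned above. Function and argument names below are invented for illustration:

```python
def expectation_bounds(values, lower, upper):
    # Tight lower/upper bounds on E[X] when each outcome probability is
    # only known to lie in [lower[i], upper[i]] (interval probability),
    # with sum(lower) <= 1 <= sum(upper). The LP has a greedy solution:
    # start every p_i at its lower bound, then assign the leftover mass
    # to the smallest outcomes first (lower bound on E[X]) or to the
    # largest outcomes first (upper bound on E[X]).
    def extreme(largest_first):
        p = list(lower)
        rest = 1.0 - sum(lower)
        for i in sorted(range(len(values)), key=lambda i: values[i],
                        reverse=largest_first):
            add = min(rest, upper[i] - lower[i])
            p[i] += add
            rest -= add
        return sum(pi * v for pi, v in zip(p, values))
    return extreme(False), extreme(True)
```

    When every interval degenerates to a point (lower = upper), both bounds collapse to the ordinary expectation.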

  6. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: they do if and only if all the points fall into the corresponding intervals. The powers of several normal-probability-plot-based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods.
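
    A Monte Carlo stand-in for such a simultaneous band is easy to sketch: simulate the maximum Q-Q deviation under the normal hypothesis and compare the observed deviation with its (1-alpha) quantile. This illustrates the general idea only, not the paper's specific interval construction:

```python
import math
import random
import statistics

def normal_quantile(p):
    # Standard normal quantile by bisection on Phi(x) = 0.5*(1+erf(x/sqrt(2))).
    lo, hi = -10.0, 10.0
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if 0.5 * (1.0 + math.erf(mid / math.sqrt(2.0))) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def max_qq_deviation(sample, quantiles):
    # Largest vertical gap between the standardized order statistics and
    # the normal plotting-position quantiles: how far the Q-Q plot strays
    # from the straight line.
    m, s = statistics.mean(sample), statistics.stdev(sample)
    z = sorted((x - m) / s for x in sample)
    return max(abs(zi - qi) for zi, qi in zip(z, quantiles))

def qq_envelope_test(sample, alpha=0.05, n_sim=300, seed=3):
    # Monte Carlo analogue of a simultaneous Q-Q band: accept normality
    # iff the observed deviation is below the (1-alpha) quantile of the
    # same deviation under repeated N(0,1) sampling of the same size.
    rng = random.Random(seed)
    n = len(sample)
    qs = [normal_quantile((i + 0.5) / n) for i in range(n)]
    sims = sorted(max_qq_deviation([rng.gauss(0.0, 1.0) for _ in range(n)], qs)
                  for _ in range(n_sim))
    return max_qq_deviation(sample, qs) <= sims[int((1 - alpha) * n_sim) - 1]
```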

  7. Calculation Methods for Wallenius’ Noncentral Hypergeometric Distribution

    DEFF Research Database (Denmark)

    Fog, Agner

    2008-01-01

    conditional distribution of independent binomial variates given their sum. No reliable calculation method for Wallenius' noncentral hypergeometric distribution has hitherto been described in the literature. Several new methods for calculating probabilities from Wallenius' noncentral hypergeometric...
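
    Although the record above concerns exact calculation methods, the defining sequential-draw mechanism of Wallenius' distribution is easy to simulate, and simulation gives a useful cross-check for any exact method. A minimal sketch with assumed parameter names:

```python
import random

def wallenius_sample(m1, m2, n, omega, rng):
    # One variate of Wallenius' noncentral hypergeometric distribution:
    # n balls are taken one at a time, without replacement, from an urn
    # with m1 type-1 balls of weight omega and m2 type-2 balls of weight 1;
    # the odds are recomputed after every single draw.
    r1, r2, x = m1, m2, 0
    for _ in range(n):
        w1 = r1 * omega
        if rng.random() * (w1 + r2) < w1:
            r1 -= 1
            x += 1
        else:
            r2 -= 1
    return x

def wallenius_pmf_mc(m1, m2, n, omega, n_sim=50000, seed=11):
    # Monte Carlo estimate of the pmf of the number of type-1 balls drawn.
    rng = random.Random(seed)
    counts = [0] * (n + 1)
    for _ in range(n_sim):
        counts[wallenius_sample(m1, m2, n, omega, rng)] += 1
    return [k / n_sim for k in counts]
```

    With omega = 1 the distribution reduces to the central hypergeometric, which gives an exact check on the simulator.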

  8. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  9. Born Rule and Noncontextual Probability

    CERN Document Server

    Logiurato, Fabrizio

    2012-01-01

    The probabilistic rule that links the formalism of Quantum Mechanics (QM) to the real world was stated by Born in 1926. Since then, there have been many attempts to derive the Born postulate as a theorem, Gleason's being the most prominent. The Gleason derivation, however, is generally considered rather intricate, and its physical meaning, in particular its relation to the noncontextuality of probability (NP), is not quite evident. More recently, we are witnessing a revival of interest in possible demonstrations of the Born rule, like Zurek's and Deutsch's, based on decoherence and on the theory of decisions, respectively. Despite an ongoing debate about the presence of hidden assumptions and circular reasoning, these have the merit of prompting more physically oriented approaches to the problem. Here we suggest a new proof of the Born rule based on the noncontextuality of probability. Within the theorem we also demonstrate the continuity of probability with respect to the amplitudes, which has been sug...

  10. Probability on real Lie algebras

    CERN Document Server

    Franz, Uwe

    2016-01-01

    This monograph is a progressive introduction to non-commutativity in probability theory, summarizing and synthesizing recent results about classical and quantum stochastic processes on Lie algebras. In the early chapters, focus is placed on concrete examples of the links between algebraic relations and the moments of probability distributions. The subsequent chapters are more advanced and deal with Wigner densities for non-commutative couples of random variables, non-commutative stochastic processes with independent increments (quantum Lévy processes), and the quantum Malliavin calculus. This book will appeal to advanced undergraduate and graduate students interested in the relations between algebra, probability, and quantum theory. It also addresses a more advanced audience by covering other topics related to non-commutativity in stochastic calculus, Lévy processes, and the Malliavin calculus.

  11. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    The article deals with probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. External sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Taking the Gaussian distribution into account, the author estimates the probability that the relative displacement of the isolated mass remains lower than the vibration criteria. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters, damping and natural frequency, are derived so that the probability of exceeding vibration criteria VC-E and VC-D is less than 0.04.
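
    The central computation, the chance that a zero-mean Gaussian response stays within a criterion, can be sketched as follows. The white-noise response-variance formula sigma^2 = pi*S0/(2*zeta*omega_n^3) is a standard random-vibration result and is an assumption here, not necessarily the paper's excitation model:

```python
import math

def exceedance_probability(criterion, zeta, omega_n, psd):
    # Probability that the zero-mean Gaussian displacement response of a
    # single-degree-of-freedom isolator exceeds +/- criterion in magnitude.
    # Response variance to white-noise excitation with two-sided PSD `psd`
    # uses the standard result sigma^2 = pi * psd / (2 * zeta * omega_n**3).
    sigma = math.sqrt(math.pi * psd / (2.0 * zeta * omega_n ** 3))
    phi = 0.5 * (1.0 + math.erf(criterion / (sigma * math.sqrt(2.0))))  # Phi(c/sigma)
    return 2.0 * (1.0 - phi)
```

    Increasing the damping ratio or raising the allowable criterion both lower the exceedance probability, which is the direction of the parameter optimization described above.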

  12. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  13. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  14. Probability as a physical motive

    CERN Document Server

    Martin, P

    2007-01-01

    Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production ("MEP") to the information-theoretical "MaxEnt" principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand "the adjacent possible" as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  15. Estimating Probabilities in Recommendation Systems

    OpenAIRE

    Sun, Mingxuan; Lebanon, Guy; Kidwell, Paul

    2010-01-01

    Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computat...

  16. Pollock on probability in epistemology

    OpenAIRE

    Fitelson, Branden

    2010-01-01

    In Thinking and Acting John Pollock offers some criticisms of Bayesian epistemology, and he defends an alternative understanding of the role of probability in epistemology. Here, I defend the Bayesian against some of Pollock's criticisms, and I discuss a potential problem for Pollock's alternative account.

  17. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  18. Quantum correlations; quantum probability approach

    OpenAIRE

    Majewski, W A

    2014-01-01

    This survey gives a comprehensive account of quantum correlations understood as a phenomenon stemming from the rules of quantization. Centered on quantum probability it describes the physical concepts related to correlations (both classical and quantum), mathematical structures, and their consequences. These include the canonical form of classical correlation functionals, general definitions of separable (entangled) states, definition and analysis of quantumness of correlations, description o...

  19. Probability representations of fuzzy systems

    Institute of Scientific and Technical Information of China (English)

    LI Hongxing

    2006-01-01

    In this paper, the probability significance of fuzzy systems is revealed. It is pointed out that COG method, a defuzzification technique used commonly in fuzzy systems, is reasonable and is the optimal method in the sense of mean square. Based on different fuzzy implication operators, several typical probability distributions such as Zadeh distribution, Mamdani distribution, Lukasiewicz distribution, etc. are given. Those distributions act as "inner kernels" of fuzzy systems. Furthermore, by some properties of probability distributions of fuzzy systems, it is also demonstrated that CRI method, proposed by Zadeh, for constructing fuzzy systems is basically reasonable and effective. Besides, the special action of uniform probability distributions in fuzzy systems is characterized. Finally, the relationship between CRI method and triple I method is discussed. In the sense of construction of fuzzy systems, when restricting three fuzzy implication operators in triple I method to the same operator, CRI method and triple I method may be related in the following three basic ways: 1) Two methods are equivalent; 2) the latter is a degeneration of the former; 3) the latter is trivial whereas the former is not. When three fuzzy implication operators in triple I method are not restricted to the same operator, CRI method is a special case of triple I method; that is, triple I method is a more comprehensive algorithm. Since triple I method has a good logical foundation and comprises an idea of optimization of reasoning, triple I method will possess a beautiful vista of application.

  20. Asbestos and Probable Microscopic Polyangiitis

    OpenAIRE

    George S Rashed Philteos; Kelly Coverett; Rajni Chibbar; Ward, Heather A; Cockcroft, Donald W

    2004-01-01

Several inorganic dust lung diseases (pneumoconioses) are associated with autoimmune diseases. Although autoimmune serological abnormalities are common in asbestosis, clinical autoimmune/collagen vascular diseases are not commonly reported. A case of pulmonary asbestosis complicated by perinuclear-antineutrophil cytoplasmic antibody (myeloperoxidase) positive probable microscopic polyangiitis (glomerulonephritis, pericarditis, alveolitis, mononeuritis multiplex) is described and the possible...

  1. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  2. Exact Probability Distribution versus Entropy

    Directory of Open Access Journals (Sweden)

    Kerstin Andersson

    2014-10-01

Full Text Available The problem addressed concerns the determination of the average number of successive attempts at guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
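As a toy illustration of the guessing strategy described above (guessing words in decreasing order of probability), the sketch below computes the exact average number of guesses for a hypothetical three-letter alphabet under the first-order (independent-letters) approximation. The letter probabilities and word length are invented for illustration, not taken from the paper.

```python
from itertools import product

# Hypothetical letter probabilities (a toy first-order language model).
letters = {"a": 0.5, "b": 0.3, "c": 0.2}
word_len = 3

# First-order approximation: letters are independent, so the probability
# of a word is the product of its letter probabilities.
words = []
for combo in product(letters, repeat=word_len):
    p = 1.0
    for ch in combo:
        p *= letters[ch]
    words.append(("".join(combo), p))

# Guess words in decreasing order of probability.
words.sort(key=lambda wp: -wp[1])

# Average number of guesses = sum over words of (rank * probability).
avg_guesses = sum(rank * p for rank, (_, p) in enumerate(words, start=1))
print(f"average guesses: {avg_guesses:.3f}")
```

For realistic alphabet and word sizes the word list is far too large to enumerate, which is exactly why the paper develops approximations instead.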

  3. ESTIMATION OF AGE TRANSITION PROBABILITIES.

    Science.gov (United States)

    ZINTER, JUDITH R.

This note describes the procedures used in determining DYNAMOD II age transition matrices. A separate matrix for each sex-race group is developed. These matrices will be used as an aid in estimating the transition probabilities in the larger DYNAMOD II matrix relating age to occupational categories. Three steps were used in the procedure--(1)…

  4. Transition probability and preferential gauge

    OpenAIRE

    Chen, C.Y.

    1999-01-01

    This paper is concerned with whether or not the preferential gauge can ensure the uniqueness and correctness of results obtained from the standard time-dependent perturbation theory, in which the transition probability is formulated in terms of matrix elements of Hamiltonian.

  5. Investigation of Flood Inundation Probability in Taiwan

    Science.gov (United States)

    Wang, Chia-Ho; Lai, Yen-Wei; Chang, Tsang-Jung

    2010-05-01

Taiwan lies in the path of typhoons from the northwest Pacific Ocean and is situated in a tropical-subtropical transition zone. As a result, rainfall is abundant all year round, especially in summer and autumn. For flood inundation analysis in Taiwan, there are many uncertainties in hydrological, hydraulic, and land-surface topography characteristics that can change flood inundation behavior. According to the 7th work item of Article 22 of the Disaster Prevention and Protection Act in Taiwan, to keep flood disasters from worsening, investigation and analysis of disaster potential, hazard severity, and scenario simulation must be carried out with scientific approaches. However, the existing flood potential analysis uses a deterministic approach that defines flood inundation without considering data uncertainties. This research incorporates the concept of data uncertainty into flood inundation maps to show the flood probability in each grid cell; such maps can serve as a basis for emergency evacuation when typhoons approach and torrential rain begins. The Hebauyu watershed of Chiayi County is selected as the demonstration area. Owing to the uncertainties in the data used, a sensitivity analysis is first conducted using Latin hypercube sampling (LHS). The LHS data sets are then input into an integrated numerical model, developed here to assess flood inundation hazards in coastal lowlands, based on the extension of a 1-D river routing model and a 2-D inundation routing model. Finally, the probability of flood inundation is calculated, and flood inundation probability maps are obtained. These probability maps can replace the older flood potential maps as a reference for building new hydraulic infrastructure in the future.

  6. MEMS Calculator

    Science.gov (United States)

    SRD 166 MEMS Calculator (Web, free access)   This MEMS Calculator determines the following thin film properties from data taken with an optical interferometer or comparable instrument: a) residual strain from fixed-fixed beams, b) strain gradient from cantilevers, c) step heights or thicknesses from step-height test structures, and d) in-plane lengths or deflections. Then, residual stress and stress gradient calculations can be made after an optical vibrometer or comparable instrument is used to obtain Young's modulus from resonating cantilevers or fixed-fixed beams. In addition, wafer bond strength is determined from micro-chevron test structures using a material test machine.
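The residual-stress step mentioned above (stress from strain once Young's modulus is known) follows from small-strain Hooke's law, σ = E·ε. The sketch below uses hypothetical values for the modulus and the measured residual strain; it is not the MEMS Calculator's implementation, just the underlying arithmetic.

```python
# Hypothetical measured values, for illustration only.
youngs_modulus_gpa = 160.0   # e.g., from resonating cantilevers (assumed)
residual_strain = -2.5e-4    # from fixed-fixed beam interferometry (compressive)

# Small-strain Hooke's law: sigma = E * epsilon.
# Convert GPa -> MPa (x1000) so the stress comes out in MPa.
residual_stress_mpa = youngs_modulus_gpa * 1e3 * residual_strain
print(f"residual stress: {residual_stress_mpa:.1f} MPa")  # -40.0 MPa
```

A negative strain (compressive) yields a negative stress, matching the usual sign convention for compressive residual stress in thin films.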

  7. Unified set of atomic transition probabilities for neutral argon

    OpenAIRE

    Wiese, W.; Brault, J.; Danzmann, K.; Helbig, V.; de Kock, M

    1989-01-01

The atomic transition probabilities and radiative lifetimes of neutral argon have been the subject of numerous experiments and calculations, but the results exhibit many discrepancies and inconsistencies. We present a unified set of atomic transition probabilities, which is consistent with essentially all recent results, albeit sometimes only after critical reanalysis. The data consistency and scale confirmation have been achieved in two ways. (i) We have carried out some lifetime–branching-ra...

  8. Evaluation of photoexcitation and photoionization probabilities by the trajectory method

    International Nuclear Information System (INIS)

A new trajectory-based method for evaluating transition probabilities in quantum systems was developed. It is based on a path-integral representation of the probability and uses Weyl symbols for the initial and final states. The method belongs to the efficient initial value representation (IVR) schemes. The pre-exponential factor specific to the semi-classical method is equal to one and does not need to be calculated separately; this eliminates problems with caustics and Maslov indices of trajectories. The method is equally efficient for evaluating transition probabilities into separate states and into groups of states, including an entire ionization continuum, for example. The capabilities of the method are demonstrated by evaluating the photo-excitation and photo-ionization probabilities in the hydrogen atom exposed to an ultrashort photo-pulse, and the total photo-ionization probability in the helium atom. (authors)

  9. Collision probability at low altitudes resulting from elliptical orbits

    Science.gov (United States)

    Kessler, Donald J.

    1990-01-01

    The probability of collision between a spacecraft and another object is calculated for various altitude and orbit conditions, and factors affecting the probability are discussed. It is shown that a collision can only occur when the spacecraft is located at an altitude which is between the perigee and apogee altitudes of the object and that the probability per unit time is largest when the orbit of the object is nearly circular. However, at low altitudes, the atmospheric drag causes changes with time of the perigee and the apogee, such that circular orbits have a much shorter lifetime than many of the elliptical orbits. Thus, when the collision probability is integrated over the lifetime of the orbiting object, some elliptical orbits are found to have much higher total collision probability than circular orbits. Rocket bodies used to boost payloads from low earth orbit to geosynchronous orbit are an example of objects in these elliptical orbits.

  10. Systematic study of survival probability of excited superheavy nuclei

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

The stability of excited superheavy nuclei (SHN) with 100 ≤ Z ≤ 134 against neutron emission and fission is investigated by using a statistical model. In particular, a systematic study of the survival probability against fission in the 1n-channel of these SHN is made. The present calculations consistently take the neutron separation energies and shell correction energies from the calculated results of the finite-range droplet model, which predicts an island of stability of SHN around Z = 115 and N = 179. It turns out that this island of stability persists for excited SHN in the sense that the calculated survival probabilities in the 1n-channel of excited SHN at the optimal excitation energy are maximized around Z = 115 and N = 179. This indicates that the survival probability in the 1n-channel is mainly determined by the nuclear shell effects.

  11. Transition probabilities and radiative lifetimes of levels in F I

    Energy Technology Data Exchange (ETDEWEB)

    Celik, Gueltekin, E-mail: gultekin@selcuk.edu.tr; Dogan, Duygu; Ates, Sule; Taser, Mehmet

    2012-07-15

The electric dipole transition probabilities and the lifetimes of excited levels have been calculated using the weakest bound electron potential model theory (WBEPMT) and the quantum defect orbital theory (QDOT) in atomic fluorine. In the calculations, many transition arrays, including both multiplet and fine-structure transitions, are considered. We employed numerical Coulomb approximation (NCA) wave functions and numerical non-relativistic Hartree-Fock (NRHF) wave functions for the expectation values of radii in the determination of parameters. The necessary energy values have been taken from experimental energy data in the literature. The calculated transition probabilities and lifetimes have been compared with available theoretical and experimental results, and good agreement with the literature has been obtained. Moreover, some transition probability and lifetime values not previously available in the literature have been obtained for some highly excited levels using these methods.

  12. Reliability Calculations

    DEFF Research Database (Denmark)

    Petersen, Kurt Erling

    1986-01-01

    approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability Monte Carlo simulation programs are used especially in analysis of very...... complex systems. In order to increase the applicability of the programs variance reduction techniques can be applied to speed up the calculation process. Variance reduction techniques have been studied and procedures for implementation of importance sampling are suggested....

  13. Classical Probability and Quantum Outcomes

    Directory of Open Access Journals (Sweden)

    James D. Malley

    2014-05-01

    Full Text Available There is a contact problem between classical probability and quantum outcomes. Thus, a standard result from classical probability on the existence of joint distributions ultimately implies that all quantum observables must commute. An essential task here is a closer identification of this conflict based on deriving commutativity from the weakest possible assumptions, and showing that stronger assumptions in some of the existing no-go proofs are unnecessary. An example of an unnecessary assumption in such proofs is an entangled system involving nonlocal observables. Another example involves the Kochen-Specker hidden variable model, features of which are also not needed to derive commutativity. A diagram is provided by which user-selected projectors can be easily assembled into many new, graphical no-go proofs.

  14. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    Andre; C. R. Martins

    2006-11-01

Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as has been observed.

  15. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes; the author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  16. Knot probabilities in random diagrams

    Science.gov (United States)

    Cantarella, Jason; Chapman, Harrison; Mastin, Matt

    2016-10-01

    We consider a natural model of random knotting—choose a knot diagram at random from the finite set of diagrams with n crossings. We tabulate diagrams with 10 and fewer crossings and classify the diagrams by knot type, allowing us to compute exact probabilities for knots in this model. As expected, most diagrams with 10 and fewer crossings are unknots (about 78% of the roughly 1.6 billion 10 crossing diagrams). For these crossing numbers, the unknot fraction is mostly explained by the prevalence of ‘tree-like’ diagrams which are unknots for any assignment of over/under information at crossings. The data shows a roughly linear relationship between the log of knot type probability and the log of the frequency rank of the knot type, analogous to Zipf’s law for word frequency. The complete tabulation and all knot frequencies are included as supplementary data.
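The reported Zipf-like relation can be illustrated by fitting a least-squares line to log probability versus log frequency rank. The knot-type counts below are synthetic placeholders, not the paper's tabulated data; only the fitting procedure is the point.

```python
import math

# Synthetic, Zipf-like knot-type counts (hypothetical, for illustration).
counts = [780, 310, 150, 90, 55, 40, 28, 20]
total = sum(counts)
probs = [c / total for c in counts]

# log(probability) vs log(frequency rank), ranks starting at 1.
xs = [math.log(rank) for rank in range(1, len(probs) + 1)]
ys = [math.log(p) for p in probs]

# Ordinary least-squares fit y = slope * x + intercept.
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx
print(f"log-log slope: {slope:.2f}")
```

A roughly constant negative slope across ranks is what "analogous to Zipf's law" means here; the actual slope for knot diagrams would come from the paper's tabulated frequencies.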

  17. Probability distributions for multimeric systems.

    Science.gov (United States)

    Albert, Jaroslav; Rooman, Marianne

    2016-01-01

We propose a fast and accurate method of obtaining the equilibrium mono-modal joint probability distributions for multimeric systems. The method necessitates only two assumptions: the copy number of all species of molecule may be treated as continuous; and the probability density functions (pdf) are well-approximated by multivariate skew normal distributions (MSND). Starting from the master equation, we convert the problem into a set of equations for the statistical moments which are then expressed in terms of the parameters intrinsic to the MSND. Using an optimization package on Mathematica, we minimize a Euclidian distance function comprising a sum of the squared differences between the left and the right hand sides of these equations. Comparison of results obtained via our method with those rendered by the Gillespie algorithm demonstrates our method to be highly accurate as well as efficient.

  18. Probability as a Physical Motive

    Directory of Open Access Journals (Sweden)

    Peter Martin

    2007-04-01

Full Text Available Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production (“MEP”) to the information-theoretical “MaxEnt” principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand “the adjacent possible” as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  19. Probability, Information and Statistical Physics

    Science.gov (United States)

    Kuzemsky, A. L.

    2016-03-01

In this short survey review we discuss foundational issues of the probabilistic approach to information theory and statistical mechanics from a unified standpoint. Emphasis is on the interrelations between theories. The basic aim is tutorial, i.e. to carry out a basic introduction to the analysis and applications of probabilistic concepts in the description of various aspects of complexity and stochasticity. We consider probability as a foundational concept in statistical mechanics and review selected advances in the theoretical understanding of the interrelation of probability, information, and statistical description with regard to basic notions of statistical mechanics of complex systems. It also includes a synthesis of past and present research and a survey of methodology. The purpose of this terse overview is to discuss and partially describe those probabilistic methods and approaches that are used in statistical mechanics, with the aim of making these ideas easier to understand and apply.

  20. Subjective probability and quantum certainty

    CERN Document Server

    Caves, C M; Schack, R; Caves, Carlton M.; Fuchs, Christopher A.; Schack, Ruediger

    2006-01-01

    In the Bayesian approach to quantum mechanics, probabilities--and thus quantum states--represent an agent's degrees of belief, rather than corresponding to objective properties of physical systems. In this paper we investigate the concept of certainty in quantum mechanics. Our analysis reveals fundamental differences between our Bayesian approach on the one hand and the Copenhagen interpretation and similar interpretations of quantum states on the other hand. We first review the main arguments for the general claim that probabilities always represent degrees of belief. We then show that a quantum state prepared by some physical device always depends on an agent's prior beliefs, implying that with-certainty predictions derived from such a state also depend on the agent's prior beliefs. Quantum certainty is therefore always some agent's certainty. Conversely, if facts about an experimental setup could imply certainty for a measurement outcome, that outcome would effectively correspond to a preexisting system pr...

  1. The probability of extraterrestrial life

    International Nuclear Information System (INIS)

Since the beginning of time, human beings have needed to live in the company of other humans, developing what we now know as human societies. Following this idea, there has been speculation, especially in the present century, about the possibility that human society has the company of other thinking creatures living on other planets somewhere in our galaxy. In this talk we use only reliable data from scientific observers in order to establish a probability. We explain the analysis of the physico-chemical principles which allow the evolution of organic molecules on our planet and establish these as the forerunners of life here. On the other hand, the physical processes governing stars, their characteristics and their effects on planets are also explained, as well as the amount of energy that a planet receives, its mass, atmosphere and kind of orbit. Finally, considering all this information, a probability of life from outer space is given. (Author)

  2. Double K-shell ionization probability in 54Mn

    International Nuclear Information System (INIS)

We have measured the probability of double K-shell vacancy production in the electron capture decay of 54Mn to the 835-keV level of 54Cr. The probability was deduced from the number of triple coincidences among the Cr hypersatellite and satellite x rays emitted in filling the double vacancy and the 835-keV γ ray. The probability of double K-shell vacancy production per K-shell electron capture (PKK) was found to be (2.3 +0.8/−0.5) × 10^−4. Comparisons to previous experimental results and theoretical calculations are discussed.

  3. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  4. Tight Bernoulli tail probability bounds

    OpenAIRE

    Dzindzalieta, Dainius

    2014-01-01

The purpose of the dissertation is to prove universal tight bounds for deviation-from-the-mean probability inequalities for functions of random variables. Universal means that the bounds are uniform with respect to some class of distributions, the number of variables, and other parameters. The bounds are called tight if we can construct a sequence of random variables such that the upper bounds are achieved. Such inequalities are useful, for example, in insurance mathematics, for constructing...

  5. Asbestos and Probable Microscopic Polyangiitis

    Directory of Open Access Journals (Sweden)

    George S Rashed Philteos

    2004-01-01

Full Text Available Several inorganic dust lung diseases (pneumoconioses) are associated with autoimmune diseases. Although autoimmune serological abnormalities are common in asbestosis, clinical autoimmune/collagen vascular diseases are not commonly reported. A case of pulmonary asbestosis complicated by perinuclear-antineutrophil cytoplasmic antibody (myeloperoxidase) positive probable microscopic polyangiitis (glomerulonephritis, pericarditis, alveolitis, mononeuritis multiplex) is described, and the possible immunological mechanisms whereby asbestos fibres might be relevant in the induction of antineutrophil cytoplasmic antibodies are reviewed in the present report.

  6. Relative transition probabilities of cobalt

    Science.gov (United States)

    Roig, R. A.; Miller, M. H.

    1974-01-01

    Results of determinations of neutral-cobalt transition probabilities measured relative to Co I 4150.43 A and Co II 4145.15 A, using a gas-driven shock tube as the spectroscopic light source. Results are presented for 139 Co I lines in the range from 3940 to 6640 A and 11 Co II lines in the range from 3840 to 4730 A, which are estimated to have reliabilities ranging from 8 to 50%.

  7. Probability Properties of Multi-contact in Protein Molecules

    Institute of Scientific and Technical Information of China (English)

    WANG Xiang-hong; KE Jian-hong; HU Min-xiao; ZHANG Lin-xi

    2003-01-01

The compact conformations of polymers are important because the native conformations of all biopolymers with specific functions are highly compact. The properties of multi-contact biopolymer chains were studied by Gaussian statistics of the random-flight chain. Theoretical expressions were given, and probability distributions and correlation functions for different topological cases were derived and calculated. A comparison between single, double, and triple contacts was also made. By a suitable choice of parameters, the present calculations for multiple contacts reproduce the results calculated for single, double, or triple contacts separately. This is a useful method for investigating native conformations of biopolymers. The probabilities of multi-contacts and the correlation functions between chain contacts were calculated for Gaussian chains. Because the bond probability distributions are Gaussian, the probability distributions of the separations of various points along the chains are always consecutive. All the contacts may break up into several groups, each group consisting of many contacts. Here we investigated the probability distributions from one group to three groups of contacts.
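Under standard Gaussian-chain (random-flight) statistics, the probability that two monomers i and j come into contact decays as |i−j|^(−3/2). The sketch below illustrates this single-contact scaling with an arbitrary bond length and a hypothetical contact volume; the multi-contact distributions of the paper build on this same Gaussian kernel.

```python
import math

# Gaussian-chain parameters (arbitrary units, hypothetical values).
b = 1.0          # Kuhn (bond) length
v = 0.1 * b**3   # small contact volume within which monomers "touch"

def contact_probability(i, j, b=b, v=v):
    """Probability that monomers i and j of an ideal Gaussian chain coincide.

    The end-to-end density of a Gaussian subchain of n = |i - j| bonds,
    evaluated at zero separation, is (3 / (2*pi*n*b^2))^(3/2); multiplying
    by the contact volume v gives a dimensionless contact probability.
    """
    n = abs(i - j)
    return v * (3.0 / (2.0 * math.pi * n * b * b)) ** 1.5

# Contacts between nearby monomers are far more likely than distant ones:
p_short = contact_probability(10, 20)   # separation n = 10
p_long = contact_probability(10, 90)    # separation n = 80
print(p_short / p_long)  # scales as (80/10)^(3/2) = 8^1.5 ≈ 22.6
```

The prefactor v is a modeling assumption; only the |i−j|^(−3/2) scaling of the ratio is parameter-free.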

  8. Application of the diagrams of phase transformations during aging for optimizing the aging conditions for V1469 and 1441 Al-Li alloys

    Science.gov (United States)

    Lukina, E. A.; Alekseev, A. A.; Antipov, V. V.; Zaitsev, D. V.; Klochkova, Yu. Yu.

    2009-12-01

    To describe the changes in the phase composition of alloys during aging, it is convenient to construct TTT diagrams on the temperature-aging time coordinates in which time-temperature regions of the existence of nonequilibrium phases that form during aging are indicated. As a rule, in constructing the diagrams of phase transformations during aging (DPTA), time-temperature maps of properties are plotted. A comparison of the diagrams with maps of properties allows one to analyze the effect of the structure on the properties. In this study, we analyze the DPTAs of V1469 (Al-1.2 Li-0.46 Ag-3.4 Cu-0.66 Mg) and 1441 (Al-1.8 Li-1.1 Mg-1.6 Cu, C Mg/ C Cu ≈ 1) alloys. Examples of the application of DPTA for the development of steplike aging conditions are reported.

  9. Electronic factors for K-shell-electron conversion probability and electron-positron pair formation probability in electric monopole transitions

    International Nuclear Information System (INIS)

This paper presents, in tabular form, the electronic factors ΩK,π(Z,k) of the electric monopole transition probability associated with the internal conversion of an electron from the atomic K shell (IC;K) and with internal pair formation (IPF;π). The Ωπ values are calculated by taking the nuclear Coulomb effects into account. The corrections to ΩK due to finite nuclear size and bound-state atomic screening are not included in the present calculations. The calculated ratio of the K-shell-electron conversion probability to the electron-positron pair formation probability is found to be in good agreement with the available experimental data for Z ≤ 40.

  10. Electric quadrupole transition probabilities and line strengths of Ti11+

    International Nuclear Information System (INIS)

    Electric quadrupole transition probabilities and line strengths have been calculated using the weakest bound electron potential model for sodium-like titanium, considering many transition arrays. We employed numerical Coulomb approximation and non-relativistic Hartree–Fock wavefunctions for the expectation values of radii in determination of parameters of the model. The necessary energy values have been taken from experimental data in the literature. The calculated electric quadrupole line strengths have been compared with available data in the literature and good agreement has been obtained. Moreover, some electric quadrupole transition probability and line strength values not existing in the literature for some highly excited levels have been obtained using this method

  11. Evaluations of Structural Failure Probabilities and Candidate Inservice Inspection Programs

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.; Simonen, Fredric A.

    2009-05-01

The work described in this report applies probabilistic structural mechanics models to predict the reliability of nuclear pressure boundary components. These same models are then applied to evaluate the effectiveness of alternative programs for inservice inspection to reduce these failure probabilities. Results of the calculations support the development and implementation of risk-informed inservice inspection of piping and vessels. Studies have specifically addressed the potential benefits of ultrasonic inspections to reduce failure probabilities associated with fatigue crack growth and stress-corrosion cracking. Parametric calculations were performed with the computer code pc-PRAISE to generate an extensive set of plots to cover a wide range of pipe wall thicknesses, cyclic operating stresses, and inspection strategies. The studies have also addressed critical inputs to fracture mechanics calculations, such as the parameters that characterize the number and sizes of fabrication flaws in piping welds. Other calculations quantify the uncertainties associated with the inputs to the calculations, the uncertainties in the fracture mechanics models, and the uncertainties in the resulting calculated failure probabilities. A final set of calculations addresses the effects of flaw sizing errors on the effectiveness of inservice inspection programs.

  12. Evaluations of Structural Failure Probabilities and Candidate Inservice Inspection Programs

    International Nuclear Information System (INIS)

The work described in this report applies probabilistic structural mechanics models to predict the reliability of nuclear pressure boundary components. These same models are then applied to evaluate the effectiveness of alternative programs for inservice inspection to reduce these failure probabilities. Results of the calculations support the development and implementation of risk-informed inservice inspection of piping and vessels. Studies have specifically addressed the potential benefits of ultrasonic inspections to reduce failure probabilities associated with fatigue crack growth and stress-corrosion cracking. Parametric calculations were performed with the computer code pc-PRAISE to generate an extensive set of plots to cover a wide range of pipe wall thicknesses, cyclic operating stresses, and inspection strategies. The studies have also addressed critical inputs to fracture mechanics calculations, such as the parameters that characterize the number and sizes of fabrication flaws in piping welds. Other calculations quantify the uncertainties associated with the inputs to the calculations, the uncertainties in the fracture mechanics models, and the uncertainties in the resulting calculated failure probabilities. A final set of calculations addresses the effects of flaw sizing errors on the effectiveness of inservice inspection programs.

  13. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation, or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design in support of

  14. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  15. Nonlocality, Bell's Ansatz and Probability

    CERN Document Server

    Kracklauer, A F

    2006-01-01

    Quantum Mechanics lacks an intuitive interpretation, which is the cause of a generally formalistic approach to its use. This in turn has led to a certain insensitivity to the actual meaning of many words used in its description and interpretation. Herein, we analyze carefully the possible meanings of those terms used in analysis of EPR's contention, that Quantum Mechanics is incomplete, as well as Bell's work descendant therefrom. As a result, many inconsistencies and errors in contemporary discussions of nonlocality, as well as in Bell's Ansatz with respect to the laws of probability, are identified. Evading these errors precludes serious conflicts between Quantum Mechanics and Special Relativity and Philosophy.

  16. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  17. Estimating Probabilities in Recommendation Systems

    CERN Document Server

    Sun, Mingxuan; Kidwell, Paul

    2010-01-01

    Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computation schemes using combinatorial properties of generating functions. We demonstrate our approach with several case studies involving real world movie recommendation data. The results are comparable with state-of-the-art techniques while also providing probabilistic preference estimates outside the scope of traditional recommender systems.

  18. Simulator data on human error probabilities

    International Nuclear Information System (INIS)

    Analysis of operator errors on NPP simulators is being used to determine Human Error Probabilities (HEPs) for task elements defined in NUREG/CR-1278. Simulator data tapes from research conducted by EPRI and ORNL are being analyzed for operator error rates. The tapes, collected using Performance Measurement System software developed for EPRI, contain a history of all operator manipulations during simulated casualties. Analysis yields a time history or Operational Sequence Diagram and a manipulation summary, both stored in computer data files. Data searches yield information on operator errors of omission and commission. This work experimentally determined HEPs for Probabilistic Risk Assessment calculations. It is the only practical experimental source of this data to date.

  19. Generalized Bures products from free probability

    CERN Document Server

    Jarosz, Andrzej

    2012-01-01

    Inspired by the theory of quantum information, I use two non-Hermitian random matrix models - a weighted sum of circular unitary ensembles and a product of rectangular Ginibre unitary ensembles - as building blocks of three new products of random matrices which are generalizations of the Bures model. I apply the tools of both Hermitian and non-Hermitian free probability to calculate the mean densities of their eigenvalues and singular values in the thermodynamic limit, along with their divergences at zero; the results are supported by Monte Carlo simulations. I pose and test conjectures concerning the relationship between the two densities (exploiting the notion of the N-transform), the shape of the mean domain of the eigenvalues (an extension of the single ring theorem), and the universal behavior of the mean spectral density close to the domain's borderline (using the complementary error function).

  20. Some improved transition probabilities for neutral carbon

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, Charlotte Froese [Atomic Physics Division, National Institute of Standards and Technology, Gaithersburg, MD 20899-8422 (United States)

    2006-05-14

    An earlier paper (Zatsarinny O and Froese Fischer C 2002 J. Phys. B: At. Mol. Opt. Phys. 35 4669) presented oscillator strengths for transitions from the 2p² ³P term to high-lying excited states of carbon. The emphasis was on the accurate prediction of energy levels relative to the ionization limit and allowed transition data from the ground state. The present paper reports some refined transition probability calculations for transitions from the 2p² ³P, ¹D, and ¹S terms to all odd levels up to 2p3d ³P. Particular attention is given to intercombination lines, where relativistic effects are most important.

  1. Using L/E Oscillation Probability Distributions

    CERN Document Server

    Aguilar-Arevalo, A A; Bugel, L; Cheng, G; Church, E D; Conrad, J M; Dharmapalan, R; Djurcic, Z; Finley, D A; Ford, R; Garcia, F G; Garvey, G T; Grange, J; Huelsnitz, W; Ignarra, C; Imlay, R; Johnson, R A; Karagiorgi, G; Katori, T; Kobilarcik, T; Louis, W C; Mariani, C; Marsh, W; Mills, G B; Mirabal, J; Moore, C D; Mousseau, J; Nienaber, P; Osmanov, B; Pavlovic, Z; Perevalov, D; Polly, C C; Ray, H; Roe, B P; Russell, A D; Shaevitz, M H; Spitz, J; Stancu, I; Tayloe, R; Van de Water, R G; White, D H; Wickremasinghe, D A; Zeller, G P; Zimmerman, E D

    2014-01-01

    This paper explores the use of $L/E$ oscillation probability distributions to compare experimental measurements and to evaluate oscillation models. In this case, $L$ is the distance of neutrino travel and $E$ is a measure of the interacting neutrino's energy. While comparisons using allowed and excluded regions for oscillation model parameters are likely the only rigorous method for these comparisons, the $L/E$ distributions are shown to give qualitative information on the agreement of an experiment's data with a simple two-neutrino oscillation model. In more detail, this paper also outlines how the $L/E$ distributions can be best calculated and used for model comparisons. Specifically, the paper presents the $L/E$ data points for the final MiniBooNE data samples and, in the Appendix, explains and corrects the mistaken analysis published by the ICARUS collaboration.
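
    The two-neutrino model referred to above has the standard closed form P(survival) = 1 − sin²(2θ) sin²(1.27 Δm² L/E). A minimal sketch of how such an L/E curve can be evaluated numerically; the parameter values are illustrative placeholders, not MiniBooNE's fitted values:

```python
import math

def survival_probability(l_over_e, sin2_2theta=0.9, dm2=2.4e-3):
    """Two-neutrino survival probability:
    P = 1 - sin^2(2*theta) * sin^2(1.27 * dm2 * L/E),
    with L/E in km/GeV and dm2 in eV^2 (illustrative values only).
    """
    return 1.0 - sin2_2theta * math.sin(1.27 * dm2 * l_over_e) ** 2

# At L/E = 0 no oscillation has developed, so the survival probability is 1
assert survival_probability(0.0) == 1.0
```

    Plotting this function against binned data in L/E gives the kind of qualitative model comparison the paper describes.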

  2. An Alternative Teaching Method of Conditional Probabilities and Bayes' Rule: An Application of the Truth Table

    Science.gov (United States)

    Satake, Eiki; Vashlishan Murray, Amy

    2015-01-01

    This paper presents a comparison of three approaches to the teaching of probability to demonstrate how the truth table of elementary mathematical logic can be used to teach the calculations of conditional probabilities. Students are typically introduced to the topic of conditional probabilities--especially the ones that involve Bayes' rule--with…
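
    Whatever the pedagogy, the computation students must master is the same. A minimal worked sketch of Bayes' rule for a binary hypothesis; the screening numbers are a classic textbook illustration, not data from the paper:

```python
def bayes(prior, likelihood_given_h, likelihood_given_not_h):
    """Posterior P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)P(~H)]."""
    numerator = likelihood_given_h * prior
    evidence = numerator + likelihood_given_not_h * (1.0 - prior)
    return numerator / evidence

# Screening example: 1% prevalence, 95% sensitivity, 5% false-positive rate
posterior = bayes(0.01, 0.95, 0.05)
assert abs(posterior - 0.0095 / 0.059) < 1e-12  # roughly 0.161
```

    The truth-table approach in the paper enumerates exactly the four joint outcomes that appear in the numerator and the evidence term.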

  3. Electric quadrupole transition probabilities for singly ionized magnesium

    International Nuclear Information System (INIS)

    Electric quadrupole transition probabilities for Mg II have been calculated within the weakest bound electron potential model (WBEPM) theory using experimental energy levels and theoretical expectation values of orbital radii corresponding to those energy levels under the assumption of the LS coupling scheme. In this work, the WBEPM theory has been applied to forbidden transitions for the first time. The present results are consistent with earlier theoretical calculations. Some of these results are reported for the first time.

  4. Predicting most probable conformations of a given peptide sequence in the random coil state.

    Science.gov (United States)

    Bayrak, Cigdem Sevim; Erman, Burak

    2012-11-01

    In this work, we present a computational scheme for finding high probability conformations of peptides. The scheme calculates the probability of a given conformation of the given peptide sequence using the probability distribution of torsion states. Dependence of the states of a residue on the states of its first neighbors along the chain is considered. Prior probabilities of torsion states are obtained from a coil library. Posterior probabilities are calculated by the matrix multiplication Rotational Isomeric States Model of polymer theory. The conformation of a peptide with highest probability is determined by using a hidden Markov model Viterbi algorithm. First, the probability distribution of the torsion states of the residues is obtained. Using the highest probability torsion state, one can generate, step by step, states with lower probabilities. To validate the method, the highest probability state of residues in a given sequence is calculated and compared with probabilities obtained from the Coil Databank. Predictions based on the method are 32% better than predictions based on the most probable states of residues. The ensemble of "n" high probability conformations of a given protein is also determined using the Viterbi algorithm with multistep backtracking. PMID:22955874
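
    The dynamic-programming step described above, choosing each residue's state given the states of its neighbors, is the standard Viterbi recursion. A self-contained sketch with a hypothetical two-state model; the state names, transition and emission numbers are placeholders, not the paper's coil-library statistics:

```python
import math

def viterbi(states, start_p, trans_p, emit_p, observations):
    """Most probable hidden-state path, computed in log space."""
    # best[s] = (log-probability, path) of the best path ending in state s
    best = {s: (math.log(start_p[s]) + math.log(emit_p[s][observations[0]]), [s])
            for s in states}
    for obs in observations[1:]:
        new_best = {}
        for s in states:
            logp, prev = max(
                (best[r][0] + math.log(trans_p[r][s]), r) for r in states)
            new_best[s] = (logp + math.log(emit_p[s][obs]), best[prev][1] + [s])
        best = new_best
    return max(best.values())[1]

# Toy two-state chain (hypothetical numbers)
states = ("helix", "coil")
start = {"helix": 0.5, "coil": 0.5}
trans = {"helix": {"helix": 0.8, "coil": 0.2},
         "coil": {"helix": 0.3, "coil": 0.7}}
emit = {"helix": {"A": 0.7, "B": 0.3},
        "coil": {"A": 0.2, "B": 0.8}}
path = viterbi(states, start, trans, emit, ["A", "A", "B"])
```

    The multistep backtracking mentioned in the abstract generalizes this by retaining the k best paths at each step instead of only the single best one.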

  5. Burnout calculation

    International Nuclear Information System (INIS)

    The effect of different system parameters on the critical heat flux density is reviewed in order to give an initial view of the influence of the several parameters. A thorough analysis of different equations is carried out for calculating burnout in steam-water flows in uniformly heated tubes, annular and rectangular channels, and rod bundles. The effects of heat flux density distribution and flux twisting on burnout, and margin determination with respect to burnout, are also discussed.

  6. The Inductive Applications of Probability Calculus

    Directory of Open Access Journals (Sweden)

    Corrado Gini

    2015-06-01

    The author goes back to the founders of probability calculus to investigate their original interpretation of the probability measure in the applications of probability theory to real problems. The author highlights some misunderstandings related to the inversion of deductions derived from the use of probability distributions in investigating the causes of events.

  7. Failure probability of PWR reactor coolant loop piping

    International Nuclear Information System (INIS)

    This paper describes the results of assessments performed on the PWR coolant loop piping of Westinghouse and Combustion Engineering plants. For direct double-ended guillotine break (DEGB), consideration was given to crack existence probability, initial crack size distribution, hydrostatic proof test, preservice inspection, leak detection probability, crack growth characteristics, and failure criteria based on the net section stress failure and tearing modulus stability concept. For indirect DEGB, fragilities of major component supports were estimated. The system level fragility was then calculated based on the Boolean expression involving these fragilities. Indirect DEGB due to seismic effects was calculated by convolving the system level fragility and the seismic hazard curve. The results indicate that the probability of occurrence of both direct and indirect DEGB is extremely small, thus, postulation of DEGB in design should be eliminated and replaced by more realistic criteria
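
    The convolution of a system-level fragility with a seismic hazard curve, mentioned above, is typically evaluated numerically as P_f = -∫ F(a) dH(a), where F is the conditional failure probability and H the annual exceedance frequency. A sketch under assumed lognormal fragility and power-law hazard parameters (all numbers hypothetical):

```python
import numpy as np
from math import erf, log, sqrt

def fragility(a, median=1.5, beta=0.4):
    """Lognormal fragility: P(failure | peak ground acceleration a, in g)."""
    return 0.5 * (1.0 + erf(log(a / median) / (beta * sqrt(2.0))))

def annual_failure_probability(accels, hazard):
    """P_f = -integral of F(a) dH(a), by a midpoint sum over the discretized curve."""
    frag = np.array([fragility(a) for a in accels])
    dH = np.diff(hazard)  # negative increments: H decreases with a
    mid = 0.5 * (frag[:-1] + frag[1:])
    return float(-(mid * dH).sum())

accels = np.linspace(0.1, 3.0, 30)
hazard = 1e-3 * (accels / 0.1) ** -2  # hypothetical power-law hazard curve
p_f = annual_failure_probability(accels, hazard)
assert 0.0 < p_f < hazard[0]  # bounded by the total exceedance frequency
```

    With a realistic fragility median well above the frequent accelerations, the result is dominated by rare strong shaking, which is why the paper finds indirect DEGB probabilities to be extremely small.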

  8. Importance function by collision probabilities for Monte Carlo code TRIPOLI

    International Nuclear Information System (INIS)

    We present a completely automatic biasing technique where the parameters of the biased simulation are deduced from the solution of the adjoint transport equation calculated by collision probabilities. In this study we estimate the importance function through the collision probabilities method and evaluate its possibilities by means of a Monte Carlo calculation. We have run simulations with this new biasing method for one-group transport problems with isotropic shocks (one-dimensional geometry and X-Y geometry) and for multigroup problems with anisotropic shocks (one-dimensional geometry). For the anisotropic problems we solve the adjoint equation with anisotropic collision probabilities. The results show that for the one-group and homogeneous geometry transport problems the method is quite optimal without the Splitting and Russian Roulette technique, but for the multigroup and heterogeneous X-Y geometry problems the figures of merit are higher if we add the Splitting and Russian Roulette technique. (orig.)

  9. Importance function by collision probabilities for Monte Carlo code Tripoli

    International Nuclear Information System (INIS)

    We present a completely automatic biasing technique where the parameters of the biased simulation are deduced from the solution of the adjoint transport equation calculated by collision probabilities. In this study we shall estimate the importance function through collision probabilities method and we shall evaluate its possibilities thanks to a Monte Carlo calculation. We have run simulations with this new biasing method for one-group transport problems with isotropic shocks (one dimension geometry and X-Y geometry) and for multigroup problems with anisotropic shocks (one dimension geometry). For the anisotropic problems we solve the adjoint equation with anisotropic collision probabilities. The results show that for the one-group and homogeneous geometry transport problems the method is quite optimal without Splitting and Russian Roulette technique but for the multigroup and heterogeneous X-Y geometry ones the figures of merit are higher if we add Splitting and Russian Roulette technique

  10. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  11. Associativity and normative credal probability.

    Science.gov (United States)

    Snow, P

    2002-01-01

    Cox's Theorem is a widely cited motivation for probabilistic models of uncertain belief. The theorem relates the associativity of the logical connectives to that of the arithmetic operations of probability. Recent questions about the correctness of Cox's Theorem have been resolved, but there are new questions about one functional equation used by Cox in 1946. This equation is missing from his later work. Advances in knowledge since 1946 and changes in Cox's research interests explain the equation's disappearance. Other associativity-based motivations avoid functional equations altogether, and so may be more transparently applied to finite domains and discrete beliefs. A discrete counterpart of Cox's Theorem can be assembled from results that have been in the literature since 1959. PMID:18238098

  12. Measuring Robustness of Timetables in Stations using a Probability Distribution

    DEFF Research Database (Denmark)

    Jensen, Lars Wittrup; Landex, Alex

    vulnerable to delays (less robust) while the other will be less vulnerable (more robust), but this cannot be measured by the above methods. In the light of this, the article will describe a new method where the complexity of a given station with a given timetable can be calculated based on a probability...

  13. Probability of induced nuclear fission in diffusion model

    International Nuclear Information System (INIS)

    The apparatus of the fission diffusion model, taking into account the nonequilibrium stage of the process, is applied to the description of the probability of induced nuclear fission. The results of calculating the energy dependence of ²¹²Po nuclear fissility according to the new approach are presented.

  14. Modeling collision probability for Earth-impactor 2008 TC3

    CERN Document Server

    Oszkiewicz, Dagmara; Virtanen, Jenni; Granvik, Mikael; Bowell, Edward

    2012-01-01

    We study the evolution of the Earth collision probability of asteroid 2008 TC3 using a short observational arc and small numbers of observations. To assess impact probability, we use techniques that rely on the orbital-element probability density function characterized using both Markov-chain Monte-Carlo (MCMC) orbital ranging and Monte-Carlo ranging. First, we evaluate the orbital uncertainties for the object from the night of discovery onwards and examine the collapse of the orbital-element distributions in time. Second, we examine the sensitivity of the results to the assumed astrometric noise. Each of the orbits obtained from the MCMC ranging method is propagated into the future (within chosen time bounds of the expected impact), and the collision probability is calculated as a weighted fraction of the orbits leading to a collision with the Earth. We compare the results obtained with both methods.
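
    The final step described, a collision probability computed as a weighted fraction of propagated orbits, reduces to a simple weighted average. A minimal sketch; the weights and hit flags are hypothetical, not values from the 2008 TC3 analysis:

```python
def collision_probability(samples):
    """Impact probability as the weighted fraction of sampled orbits hitting Earth.

    samples : iterable of (weight, hits_earth) pairs from an orbit ensemble,
    e.g. MCMC sample weights paired with propagation outcomes.
    """
    total = sum(w for w, _ in samples)
    hits = sum(w for w, hit in samples if hit)
    return hits / total

# Hypothetical ensemble of four weighted orbit samples
orbits = [(0.4, True), (0.3, True), (0.2, False), (0.1, False)]
assert abs(collision_probability(orbits) - 0.7) < 1e-12
```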

  15. Wigner function and the probability representation of quantum states

    Directory of Open Access Journals (Sweden)

    Man’ko Margarita A.

    2014-01-01

    The relation of the Wigner function to the fair probability distribution called the tomographic distribution, or quantum tomogram, associated with a quantum state is reviewed. The connection of the tomographic picture of quantum mechanics with the integral Radon transform of the Wigner quasidistribution is discussed. The Wigner-Moyal equation for the Wigner function is presented in the form of a kinetic equation for the tomographic probability distribution, both in quantum mechanics and in the classical limit of the Liouville equation. The calculation of moments of physical observables in terms of integrals with the state tomographic probability distributions is constructed, having the standard form of averaging in probability theory. New uncertainty relations for position and momentum are written in terms of optical tomograms suitable for direct experimental check. Some recent experiments on checking the uncertainty relations, including the entropic uncertainty relations, are discussed.
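
    In the standard optical-tomography convention (the normalization shown here is the usual one, not necessarily the article's), the tomogram is the Radon transform of the Wigner function and is itself a normalized probability density in X for each rotation angle θ:

```latex
w(X\mid\theta) = \int W(q,p)\,
  \delta\!\left(X - q\cos\theta - p\sin\theta\right)
  \frac{dq\,dp}{2\pi},
\qquad
\int w(X\mid\theta)\,dX = 1 .
```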

  16. Prior probabilities modulate cortical surprise responses: A study of event-related potentials.

    Science.gov (United States)

    Seer, Caroline; Lange, Florian; Boos, Moritz; Dengler, Reinhard; Kopp, Bruno

    2016-07-01

    The human brain predicts events in its environment based on expectations, and unexpected events are surprising. When probabilistic contingencies in the environment are precisely instructed, the individual can form expectations based on quantitative probabilistic information ('inference-based learning'). In contrast, when probabilistic contingencies are imprecisely instructed, expectations are formed based on the individual's cumulative experience ('experience-based learning'). Here, we used the urn-ball paradigm to investigate how variations in prior probabilities and in the precision of information about these priors modulate choice behavior and event-related potential (ERP) correlates of surprise. In the urn-ball paradigm, participants are repeatedly forced to infer hidden states responsible for generating observable events, given small samples of factual observations. We manipulated prior probabilities of the states, and we rendered the priors calculable or incalculable, respectively. The analysis of choice behavior revealed that the tendency to consider prior probabilities when making decisions about hidden states was stronger when prior probabilities were calculable, at least in some of our participants. Surprise-related P3b amplitudes were observed in both the calculable and the incalculable prior probability condition. In contrast, calculability of prior probabilities modulated anteriorly distributed ERP amplitudes: when prior probabilities were calculable, surprising events elicited enhanced P3a amplitudes. However, when prior probabilities were incalculable, surprise was associated with enhanced N2 amplitudes. Furthermore, interindividual variability in reliance on prior probabilities was associated with attenuated P3b surprise responses under calculable in comparison to incalculable prior probabilities. Our results suggest two distinct neural systems for probabilistic learning that are recruited depending on contextual cues such as the precision of

  17. THE TRANSITION PROBABILITY MATRIX OF A MARKOV CHAIN MODEL IN AN ATM NETWORK

    Institute of Scientific and Technical Information of China (English)

    YUE Dequan; ZHANG Huachen; TU Fengsheng

    2003-01-01

    In this paper we consider a Markov chain model in an ATM network, which has been studied by Dag and Stavrakakis. On the basis of the iterative formulas obtained by Dag and Stavrakakis, we obtain the explicit analytical expression of the transition probability matrix. It is very simple to calculate the transition probabilities of the Markov chain by these expressions. In addition, we obtain some results about the structure of the transition probability matrix, which are helpful in numerical calculation and theoretical analysis.
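
    Whatever the closed-form expressions, a computed transition matrix can be checked numerically: each row must be a probability distribution, and n-step probabilities follow from matrix powers by the Chapman-Kolmogorov relation. A generic sketch; the two-state chain below is illustrative, not the paper's ATM model:

```python
import numpy as np

def is_stochastic(P, tol=1e-12):
    """True if P is a valid transition matrix: nonnegative rows summing to 1."""
    P = np.asarray(P, dtype=float)
    return bool((P >= -tol).all() and np.allclose(P.sum(axis=1), 1.0, atol=1e-9))

def n_step(P, n):
    """n-step transition probabilities via Chapman-Kolmogorov: P^n."""
    return np.linalg.matrix_power(np.asarray(P, dtype=float), n)

# Hypothetical two-state chain
P = [[0.9, 0.1], [0.4, 0.6]]
assert is_stochastic(P)
assert is_stochastic(n_step(P, 5))  # powers of a stochastic matrix stay stochastic
```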

  18. Relative transition probabilities for krypton.

    Science.gov (United States)

    Miller, M. H.; Roig, R. A.; Bengtson, R. D.

    1972-01-01

    First experimental line strength data for the visible Kr II lines and for several of the more prominent Kr I lines are given. The spectroscopic light source used is the thermal plasma behind the reflected shock wave in a gas-driven shock tube. A 3/4-m spectrograph and a 1-m spectrograph were employed simultaneously to provide redundant photometry. The data are compared with other measurements and with theoretical calculations.

  19. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  20. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  1. Cosmological dynamics in tomographic probability representation

    OpenAIRE

    Man'ko, V. I.; G. Marmo(Università di Napoli and INFN, Napoli, Italy); Stornaiolo, C.

    2004-01-01

    The probability representation for quantum states of the universe in which the states are described by a fair probability distribution instead of wave function (or density matrix) is developed to consider cosmological dynamics. The evolution of the universe state is described by standard positive transition probability (tomographic transition probability) instead of the complex transition probability amplitude (Feynman path integral) of the standard approach. The latter one is expressed in te...

  2. Probability fracture mechanics analysis of plates with surface cracks

    International Nuclear Information System (INIS)

    Background: The uncertainties of input parameters in a deterministic structural integrity assessment of pressure vessels may affect the assessment results. This can be improved by performing probability fracture mechanics (PFM) analysis. Purpose: This work investigates the effect of uncertainties of load, defect size, fracture toughness and failure criteria on the failure probability of semi-elliptical surface cracks in plates under combined tension and bending. Methods: The correction factor method provided by EPRI is used to estimate the stress intensity factor (SIF). The J-integral values at the deepest point of the surface crack tip are evaluated using the reference stress method and the global limit load solution developed by Goodall and Webster and Lei. PFM analysis is performed considering the uncertainty of crack size, yield strength and fracture toughness, and Monte-Carlo (MC) simulation is used to calculate the failure probability. Results: Failure probability increases with increasing load level, Lr, for all load ratio values considered in this work for a given failure criterion. However, the failure probability based on the elastic-plastic fracture criterion is higher than that based on the linear elastic fracture criterion for a given load level, Lr. Conclusions: The load level and the failure criteria have a significant effect on the failure probability. However, the load ratio makes little contribution to the failure probability for a given failure criterion. (authors)
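
    The Monte-Carlo step described above, sampling uncertain inputs and counting limit-state exceedances, can be sketched generically. The distributions, stress level, and the K = Y·σ·√(πa) limit state below are illustrative placeholders, not the paper's actual models:

```python
import math
import random

def mc_failure_probability(n_samples=100_000, seed=1):
    """Monte-Carlo failure probability for a simple fracture limit state.

    Failure when the applied stress intensity K = Y*sigma*sqrt(pi*a) exceeds
    the sampled toughness K_Ic.  All numbers are hypothetical placeholders.
    """
    rng = random.Random(seed)
    sigma, geometry_factor = 200.0, 1.12  # MPa, assumed geometry factor Y
    failures = 0
    for _ in range(n_samples):
        a = rng.lognormvariate(-5.0, 0.5)      # crack depth in metres
        k_ic = rng.normalvariate(80.0, 10.0)   # toughness in MPa*sqrt(m)
        k = geometry_factor * sigma * math.sqrt(math.pi * a)
        if k > k_ic:
            failures += 1
    return failures / n_samples

p_f = mc_failure_probability()
assert 0.0 <= p_f <= 1.0
```

    For the very small failure probabilities typical of pressure boundary components, importance sampling or other variance reduction is usually needed on top of this plain sampling scheme.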

  3. The feature on the posterior conditional probability of finite state Markov channel

    Institute of Scientific and Technical Information of China (English)

    MU Li-hua; SHEN Ji-hong; YUAN Yan-hua

    2005-01-01

    The probability distribution of a finite state Markov channel is discussed under the condition that the original input and output are known. This probability is called the posterior conditional probability. It is proved by the Bayes formula that the posterior conditional probability forms a stationary Markov sequence if the channel input is independently and identically distributed. On the contrary, the Markov property of the posterior conditional probability is not kept if the input is not independently and identically distributed, and a numerical example is used to illustrate this case. The properties of the posterior conditional probability will aid the study of numerically calculated recurrence formulas for finite state Markov channel capacity.

  4. Evaluation of the Permanent Deformations and Aging Conditions of Batu Pahat Soft Clay-Modified Asphalt Mixture by Using a Dynamic Creep Test

    Directory of Open Access Journals (Sweden)

    Al Allam A. M.

    2016-01-01

    This study aimed to evaluate the permanent deformation and aging conditions of Batu Pahat soft clay-modified asphalt mixture, using Batu Pahat soft clay (BPSC) particles in powder form as an additive to hot-mix asphalt. In this experiment, five percentage compositions of BPSC (0%, 2%, 4%, 6%, and 8% by weight of bitumen) were used. A novel design was established to modify the hot-mix asphalt by using the Superpave method for each additive ratio. Several laboratory tests evaluating different properties, such as indirect tensile strength, resilient stiffness modulus, and dynamic creep, were conducted to assess the performance of the samples mixed through the Superpave method. In the resilient modulus test, fatigue and rutting resistance were reduced by the BPSC particles. The added BPSC particles increased the indirect tensile strength. Among the mixtures, 4% BPSC particles yielded the highest performance. In the dynamic creep test, 4% BPSC particles added to the unaged and short-term aged specimens also showed the highest performance. Based on these results, our conclusion is that the BPSC particles can alleviate the permanent deformation (rutting) of roads.

  5. Calculator calculus

    CERN Document Server

    McCarty, George

    1982-01-01

How THIS BOOK DIFFERS This book is about the calculus. What distinguishes it, however, from other books is that it uses the pocket calculator to illustrate the theory. A computation that requires hours of labor when done by hand with tables is quite inappropriate as an example or exercise in a beginning calculus course. But that same computation can become a delicate illustration of the theory when the student does it in seconds on his calculator. Furthermore, the student's own personal involvement and easy accomplishment give him reassurance and encouragement. The machine is like a microscope, and its magnification is a hundred millionfold. We shall be interested in limits, and no stage of numerical approximation proves anything about the limit. However, the derivative of f(x) = 67.5^x, for instance, acquires real meaning when a student first appreciates its values as numbers, as limits. A quick example is 1.1^10, 1.01^100, 1.001^1000, ...; another example is t = 0.1, 0.01, in the functio...

  6. Reliability calculations

    International Nuclear Information System (INIS)

Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during operation, with the purpose of improving safety or reliability. Due to plant complexity and to safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and have to be continuously refined to meet growing requirements. Two different areas of application were analysed. In structural reliability, probabilistic approaches have been introduced in some cases for calculating the reliability of structures or components; a new computer program has been developed, based upon numerical integration in several variables. In systems reliability, Monte Carlo simulation programs are used, especially in the analysis of very complex systems. In order to increase the applicability of these programs, variance reduction techniques can be applied to speed up the calculation process. Variance reduction techniques have been studied, and procedures for the implementation of importance sampling are suggested. (author)
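The numerical-integration approach to structural reliability can be illustrated with a one-dimensional stress-strength sketch; the normal distributions and all parameter values below are invented for illustration and are not from the report:

```python
import math

# Structural reliability by direct numerical integration:
# P_f = integral of F_R(s) * f_S(s) ds, i.e. the probability that
# strength R falls below the acting stress S. Normal R and S are an
# illustrative modelling choice.
mu_r, sr = 500.0, 40.0    # strength (mean, std)
mu_s, ss = 350.0, 35.0    # stress   (mean, std)

def phi_cdf(x, mu, s):
    """Normal CDF via the complementary error function."""
    return 0.5 * math.erfc(-(x - mu) / (s * math.sqrt(2)))

def phi_pdf(x, mu, s):
    return math.exp(-0.5 * ((x - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))

# Simple quadrature over a wide stress range (endpoint terms negligible)
n, lo, hi = 4000, mu_s - 8 * ss, mu_s + 8 * ss
h = (hi - lo) / n
pf = sum(phi_cdf(lo + i * h, mu_r, sr) * phi_pdf(lo + i * h, mu_s, ss)
         for i in range(n + 1)) * h

# For normal R and S the exact answer is Phi(-beta):
beta = (mu_r - mu_s) / math.hypot(sr, ss)
print(pf, 0.5 * math.erfc(beta / math.sqrt(2)))  # both ~ 2.4e-3
```

For higher-dimensional limit states this integral is exactly where the multivariate integration tools the record mentions come in.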

  7. Avoiding Negative Probabilities in Quantum Mechanics

    CERN Document Server

    Nyambuya, Golden Gadzirayi

    2013-01-01

    As currently understood since its discovery, the bare Klein-Gordon theory consists of negative quantum probabilities which are considered to be physically meaningless if not outright obsolete. Despite this annoying setback, these negative probabilities are what led the great Paul Dirac in 1928 to the esoteric discovery of the Dirac Equation. The Dirac Equation led to one of the greatest advances in our understanding of the physical world. In this reading, we ask the seemingly senseless question, "Do negative probabilities exist in quantum mechanics?" In an effort to answer this question, we arrive at the conclusion that depending on the choice one makes of the quantum probability current, one will obtain negative probabilities. We thus propose a new quantum probability current of the Klein-Gordon theory. This quantum probability current leads directly to positive definite quantum probabilities. Because these negative probabilities are in the bare Klein-Gordon theory, intrinsically a result of negative energie...

  8. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, the psychophysical foundations of the probability weighting functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^alpha) (0 < alpha < 1; w(0) = 0, w(1/e) = 1/e, w(1) = 1), which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
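A minimal numerical sketch of Prelec's weighting function; the value of alpha here is illustrative, not one fitted in the paper:

```python
import math

def prelec_w(p, alpha=0.65):
    """Prelec's weighting function w(p) = exp(-(-ln p)**alpha), 0 < alpha < 1.
    alpha = 0.65 is an illustrative value."""
    if p == 0.0:
        return 0.0
    return math.exp(-((-math.log(p)) ** alpha))

# Fixed points of the functional form: w(0) = 0, w(1/e) = 1/e, w(1) = 1.
print(prelec_w(1 / math.e), 1 / math.e)
# Characteristic inverse-S shape: small probabilities are overweighted,
# large ones underweighted.
print(prelec_w(0.01))  # > 0.01
print(prelec_w(0.9))   # < 0.9
```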

  9. Probable Linezolid-Induced Pancytopenia

    Directory of Open Access Journals (Sweden)

    Nita Lakhani

    2005-01-01

Full Text Available A 75-year-old male outpatient with cardiac disease, diabetes, chronic renal insufficiency and iron deficiency anemia was prescribed linezolid 600 mg twice daily for a methicillin-resistant Staphylococcus aureus diabetic foot osteomyelitis. After one week, his blood counts were consistent with baseline values. The patient failed to return for subsequent blood work. On day 26, he was admitted to hospital with acute renal failure secondary to dehydration, and was found to be pancytopenic (erythrocytes 2.5 × 10^12/L, leukocytes 2.9 × 10^9/L, platelets 59 × 10^9/L, hemoglobin 71 g/L). The patient was transfused, and linezolid was discontinued. His blood counts improved over the week and remained at baseline two months later. The patient's decline in blood counts from baseline levels met previously established criteria for clinical significance. Application of the Naranjo scale indicated a probable relationship between pancytopenia and linezolid. Clinicians should be aware of this rare effect with linezolid, and prospectively identify patients at risk and emphasize weekly hematological monitoring.

  10. Brief communication: On direct impact probability of landslides on vehicles

    Science.gov (United States)

    Nicolet, Pierrick; Jaboyedoff, Michel; Cloutier, Catherine; Crosta, Giovanni B.; Lévy, Sébastien

    2016-04-01

When calculating the risk of railway or road users being killed by a natural hazard, one has to calculate a temporal spatial probability, i.e. the probability of a vehicle being in the path of the falling mass when the mass falls, or the expected number of affected vehicles in case of such an event. To calculate this, different methods are used in the literature and, most of the time, they consider only the dimensions of the falling mass or the dimensions of the vehicles. Some authors do, however, consider both dimensions at the same time, and the use of their approach is recommended. Finally, a method considering an impact on the front of the vehicle is discussed.
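A back-of-the-envelope version of the recommended "both dimensions" approach can be sketched as follows; all traffic and geometry values are invented, and the brief's actual formulation may differ:

```python
def expected_vehicles_hit(aadt, speed_kmh, vehicle_len_m, slide_width_m):
    """Expected number of vehicles in the path of a falling mass, treating
    the exposed road length as vehicle length + slide width (the 'both
    dimensions' idea): traffic intensity times the exposure window."""
    veh_per_s = aadt / 86400.0            # average traffic intensity (veh/s)
    speed_ms = speed_kmh / 3.6
    exposure_s = (vehicle_len_m + slide_width_m) / speed_ms
    return veh_per_s * exposure_s

# 5000 vehicles/day at 80 km/h, 5 m vehicles, 20 m wide slide:
n_hit = expected_vehicles_hit(5000, 80, 5.0, 20.0)
print(n_hit)  # ~ 0.065 vehicles per event
```

Ignoring either the vehicle length or the slide width, as much of the literature does, simply drops one term from the exposure window and underestimates the result.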

  11. Transition probabilities, oscillator strengths and lifetimes for singly ionized magnesium

    International Nuclear Information System (INIS)

The electric dipole transition probabilities, oscillator strengths and lifetimes have been calculated using the weakest bound electron potential model theory (WBEPMT) for singly ionized magnesium. In the calculations, both multiplet and fine-structure transitions are studied. We have employed both the numerical Coulomb approximation (NCA) method and numerical non-relativistic Hartree-Fock (NRHF) wave functions for expectation values of radii. The calculated oscillator strengths and lifetimes have been compared with the MCHF results given by Fischer et al. (2006), and good agreement has been obtained. Moreover, some new transition probability, oscillator strength and lifetime values, not existing in the databases for highly excited levels of singly ionized magnesium, have been obtained using this method.
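The link between transition probabilities and the lifetimes such records report is the reciprocal of the total decay rate; a minimal sketch with invented Einstein A coefficients (not values from the paper):

```python
def radiative_lifetime(a_coeffs):
    """Radiative lifetime of an upper level: tau = 1 / sum_k A_ik, where
    the A_ik are transition probabilities (s^-1) to all lower levels k."""
    return 1.0 / sum(a_coeffs)

# Illustrative (made-up) Einstein A coefficients for one excited level:
a = [2.6e8, 1.1e7, 4.0e5]      # s^-1
tau = radiative_lifetime(a)
print(tau)                      # ~ 3.7 ns
```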

  12. Relativistic calculations of 3s2 1S0-3s3p 1P1 and 3s2 1S0-3s3p 3P1,2 transition probabilities in the Mg isoelectronic sequence

    Institute of Scientific and Technical Information of China (English)

    Cheng Cheng; Gao Xiang; Qing Bo; Zhang Xiao-Le; Li Jia-Ming

    2011-01-01

Using the multi-configuration Dirac-Fock self-consistent field method and the relativistic configuration-interaction method, calculations of transition energies, oscillator strengths and rates are performed for the 3s2 1S0-3s3p 1P1 spin-allowed transition and the 3s2 1S0-3s3p 3P1,2 intercombination and magnetic quadrupole transitions in the Mg isoelectronic sequence (Mg Ⅰ, Al Ⅱ, Si Ⅲ, P Ⅳ and S Ⅴ). Electron correlations, including intravalence electron correlations, are treated adequately. The influence of the Breit interaction on oscillator strengths and transition energies is investigated, and quantum electrodynamics corrections are added. The calculated results are found to be in good agreement with the experimental data and with other theoretical calculations.

  13. Domestic wells have high probability of pumping septic tank leachate

    Science.gov (United States)

    Horn, J. E.; Harter, T.

    2011-06-01

Onsite wastewater treatment systems such as septic systems are common in rural and semi-rural areas around the world; in the US, about 25-30 % of households are served by a septic system and a private drinking water well. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. Particularly in areas with small lots, and thus a high septic system density, these typically shallow wells are prone to contamination by septic system leachate. Typically, mass balance approaches are used to determine the maximum septic system density that would prevent contamination of the aquifer. In this study, we estimate the probability that a well pumps water partially composed of septic system leachate. A detailed groundwater flow and transport model is used to calculate the capture zone of a typical drinking water well, and a spatial probability analysis is performed to assess the probability that this capture zone overlaps with a septic system drainfield, depending on aquifer properties and on lot and drainfield size. We show that a high septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We conclude that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and for those that are harmful even at low concentrations.
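The overlap idea behind the spatial probability analysis can be sketched with a one-dimensional Monte Carlo toy; the geometry and dimensions are invented, and the paper itself uses a full groundwater flow and transport model:

```python
import random

def overlap_probability(lot=100.0, capture_w=30.0, drain_w=10.0,
                        trials=100_000, seed=1):
    """Toy sketch (not the paper's model): probability that a drainfield
    of width drain_w, placed uniformly along a lot of length `lot`,
    intersects a fixed capture-zone interval of width capture_w."""
    rng = random.Random(seed)
    cap_lo, cap_hi = 0.0, capture_w       # capture zone at one end of the lot
    hits = 0
    for _ in range(trials):
        d_lo = rng.uniform(0.0, lot - drain_w)
        if d_lo < cap_hi and d_lo + drain_w > cap_lo:
            hits += 1
    return hits / trials

p = overlap_probability()
print(p)  # analytic value: capture_w / (lot - drain_w) = 30/90 ~ 0.333
```

Even this toy shows the paper's qualitative point: the intersection probability grows quickly with capture-zone size (driven by hydraulic conductivity) and with drainfield density.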

  14. Domestic wells have high probability of pumping septic tank leachate

    Directory of Open Access Journals (Sweden)

    J. E. Horn

    2011-06-01

Full Text Available Onsite wastewater treatment systems such as septic systems are common in rural and semi-rural areas around the world; in the US, about 25–30 % of households are served by a septic system and a private drinking water well. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. Particularly in areas with small lots, and thus a high septic system density, these typically shallow wells are prone to contamination by septic system leachate. Typically, mass balance approaches are used to determine the maximum septic system density that would prevent contamination of the aquifer. In this study, we estimate the probability that a well pumps water partially composed of septic system leachate. A detailed groundwater flow and transport model is used to calculate the capture zone of a typical drinking water well, and a spatial probability analysis is performed to assess the probability that this capture zone overlaps with a septic system drainfield, depending on aquifer properties and on lot and drainfield size. We show that a high septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We conclude that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and for those that are harmful even at low concentrations.

  15. THE BLACK HOLE FORMATION PROBABILITY

    International Nuclear Information System (INIS)

A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment

  16. Estimation of failure probabilities of reactor pressure vessel

    International Nuclear Information System (INIS)

Full text: Probabilistic structural analysis of components used in the nuclear industry is finding increasing popularity. One of the uses of this analysis is the estimation of the probability of failure over the lifetime of the structure, considering the time-dependent deteriorating mechanisms. The estimation of the probability of failure of nuclear reactor components over their service life is a very important issue. It is used to optimize the design, optimize the schedules of in-service inspections, make decisions regarding fitness for service, and estimate residual life. This has traditionally been evaluated using sophisticated Monte Carlo simulation programs on the fastest available computers or on parallel processing machines. The time taken to make these calculations runs into days, as the expected probability of failure is less than 10^-6. The probability calculations involve the solution of a multi-dimensional definite integral. This paper proposes the use of Lepage's VEGAS numerical integration algorithm for the solution of these integrals. It essentially uses Monte Carlo simulation with adaptive importance sampling as the solution technique. The method is reliable and converges quickly. The paper demonstrates the use of this algorithm in estimating the failure probability of reactor components. The failure mode considered is analysed using fracture mechanics, and the deteriorating mechanisms considered are fatigue and embrittlement due to nuclear radiation. The probability of failure is obtained over the lifetime of the reactor. The results are compared with those obtained from Monte Carlo simulation reported in the literature and show a very good match. The time taken for calculations by the VEGAS algorithm is a few minutes on a Pentium-based personal computer
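VEGAS itself adapts its sampling grid iteratively; the kernel idea, importance sampling for a small failure probability, can be sketched in one dimension. The stress-strength model and all parameter values below are invented and are not the paper's:

```python
import math
import random

# Stress-strength sketch: failure when strength R < stress S.
# R ~ N(mu_r, sr), S ~ N(mu_s, ss); the margin M = R - S is normal, so
# the exact failure probability is Phi(-beta), beta = E[M]/std(M).
mu_r, sr = 600.0, 30.0
mu_s, ss = 400.0, 40.0
beta = (mu_r - mu_s) / math.hypot(sr, ss)
p_exact = 0.5 * math.erfc(beta / math.sqrt(2))   # Phi(-beta) ~ 3e-5

# Importance sampling: draw the margin M from a density centred on the
# limit state M = 0 instead of its true mean, and reweight each sample.
mu_m, sm = mu_r - mu_s, math.hypot(sr, ss)

def normal_pdf(x, mu, s):
    return math.exp(-0.5 * ((x - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))

rng = random.Random(42)
n, acc = 200_000, 0.0
for _ in range(n):
    m = rng.gauss(0.0, sm)                       # biased sampling density
    if m < 0:                                    # failure region
        acc += normal_pdf(m, mu_m, sm) / normal_pdf(m, 0.0, sm)
p_is = acc / n
print(p_exact, p_is)   # both ~ 3e-5
```

Plain Monte Carlo would need on the order of 10^7 samples to see a handful of failures here; the reweighted estimator concentrates the effort where failures occur, which is why adaptive schemes like VEGAS finish in minutes.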

  17. The estimation of yearly probability gain for seismic statistical model

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

Based on the calculation method for information gain in stochastic processes presented by Vere-Jones, the relation between information gain and probability gain, which is commonly used in earthquake prediction, is studied, and the yearly probability gain for seismic statistical models is proposed. The method is applied to the non-stationary Poisson model with whole-process exponential increase and to the stress release model. In addition, a prediction method for the stress release model is obtained, based on the inverse-function simulation method for stochastic variables.
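A minimal sketch of the two gain quantities involved; the definitions are simplified in the spirit of Vere-Jones, and the model-specific yearly gains in the paper are more elaborate:

```python
import math

def probability_gain(p_model, p_background):
    """Probability gain of a forecast relative to a reference model, and
    the corresponding information gain in nats (simplified definitions,
    not the paper's exact formulas)."""
    gain = p_model / p_background
    return gain, math.log(gain)

# Forecast gives a 30 % yearly probability where the reference model
# implies 5 % (both numbers invented):
g, i = probability_gain(0.30, 0.05)
print(g, i)  # gain ~ 6, information gain ~ ln 6 ~ 1.79
```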

  18. Detection probabilities for time-domain velocity estimation

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    1991-01-01

    Estimation of blood velocities by time-domain cross-correlation of successive high frequency sampled ultrasound signals is investigated. It is shown that any velocity can result from the estimator regardless of the true velocity due to the nonlinear technique employed. Using a simple simulation...... as a filter with a transfer function depending on the actual velocity. This influences the detection probability, which gets lower at certain velocities. An index directly reflecting the probability of detection can easily be calculated from the cross-correlation estimated. This makes it possible to assess...... the reliability of the velocity estimate in real time...
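The cross-correlation estimator described above can be sketched on synthetic data; the sampling rate, pulse repetition frequency, and signal model are invented for illustration:

```python
import numpy as np

# Time-domain velocity estimation sketch: the sample shift that maximises
# the cross-correlation of two successive echo lines gives the scatterer
# movement between pulse emissions.
fs = 100e6          # sampling rate (Hz)
c = 1540.0          # speed of sound in tissue (m/s)
t_prf = 1 / 5e3     # pulse repetition interval (s)

rng = np.random.default_rng(3)
line1 = rng.standard_normal(400)      # first received line (white-noise model)
true_shift = 7                        # samples of delay between emissions
line2 = np.roll(line1, true_shift)    # second line: shifted copy

lags = np.arange(-20, 21)
xc = [np.dot(line1, np.roll(line2, -l)) for l in lags]
est_shift = int(lags[int(np.argmax(xc))])

# Axial velocity from the estimated time shift: v = c * t_shift / (2 * T_prf)
v = c * (est_shift / fs) / (2 * t_prf)
print(est_shift, v)  # est_shift = 7, v ~ 0.27 m/s
```

The nonlinearity the abstract warns about is visible here: a spurious correlation peak at the wrong lag would yield an arbitrary velocity, which is why a detection-probability index over the correlation function is useful.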

  19. Non-Equilibrium Random Matrix Theory : Transition Probabilities

    CERN Document Server

    Pedro, Francisco Gil

    2016-01-01

In this letter we present an analytic method for calculating the transition probability between two random Gaussian matrices with given eigenvalue spectra in the context of Dyson Brownian motion. We show that in the Coulomb gas language, in the large $N$ limit, memory of the initial state is preserved in the form of a universal linear potential acting on the eigenvalues. We compute the likelihood of any given transition as a function of time, showing that as memory of the initial state is lost, transition probabilities converge to those of the static ensemble.

  20. Probability output of multi-class support vector machines

    Institute of Scientific and Technical Information of China (English)

    忻栋; 吴朝晖; 潘云鹤

    2002-01-01

A novel approach to interpreting the outputs of multi-class support vector machines is proposed in this paper. Using the geometrical interpretation of the classifying hyperplane and the distance of the pattern from the hyperplane, one can calculate the posterior probability in the binary classification case. This paper focuses on the probability output in the multi-class case, where both the one-against-one and one-against-rest strategies are considered. Experiments on speaker verification showed that this method has high performance.
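The distance-to-probability mapping can be sketched as follows; the sigmoid parameters and the renormalisation across one-against-rest classifiers are illustrative assumptions in the style of Platt scaling, not the paper's exact formulation:

```python
import math

def sigmoid_prob(distance, a=-1.5, b=0.0):
    """Platt-style mapping from a signed hyperplane distance to a
    posterior probability. a < 0 so larger positive margins give higher
    probability; a and b are illustrative (normally fitted on held-out data)."""
    return 1.0 / (1.0 + math.exp(a * distance + b))

def one_against_rest_probs(distances):
    """One-against-rest sketch: map each classifier's margin to a
    probability, then renormalise across the classes."""
    raw = [sigmoid_prob(d) for d in distances]
    s = sum(raw)
    return [r / s for r in raw]

# Signed distances of one pattern from three one-vs-rest hyperplanes:
probs = one_against_rest_probs([2.1, -0.3, -1.7])
print(probs)  # the first class dominates
```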

  1. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  2. Estimating Small Probabilities for Langevin Dynamics

    OpenAIRE

    Aristoff, David

    2012-01-01

    The problem of estimating small transition probabilities for overdamped Langevin dynamics is considered. A simplification of Girsanov's formula is obtained in which the relationship between the infinitesimal generator of the underlying diffusion and the change of probability measure corresponding to a change in the potential energy is made explicit. From this formula an asymptotic expression for transition probability densities is derived. Separately the problem of estimating the probability ...

  3. The Cognitive Substrate of Subjective Probability

    Science.gov (United States)

    Nilsson, Hakan; Olsson, Henrik; Juslin, Peter

    2005-01-01

    The prominent cognitive theories of probability judgment were primarily developed to explain cognitive biases rather than to account for the cognitive processes in probability judgment. In this article the authors compare 3 major theories of the processes and representations in probability judgment: the representativeness heuristic, implemented as…

  4. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  5. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  6. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd;

    report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have...

  7. On the measurement probability of quantum phases

    OpenAIRE

    Schürmann, Thomas

    2006-01-01

We consider the probability by which quantum phase measurements of a given precision can be done successfully. The least upper bound of this probability is derived and the associated optimal state vectors are determined. The probability bound represents a unique and continuous transition between macroscopic and microscopic measurement precisions.

  8. Uniqueness in ergodic decomposition of invariant probabilities

    OpenAIRE

    Zimmermann, Dieter

    1992-01-01

    We show that for any set of transition probabilities on a common measurable space and any invariant probability, there is at most one representing measure on the set of extremal, invariant probabilities with the $\\sigma$-algebra generated by the evaluations. The proof uses nonstandard analysis.

  9. Transition probabilities and radiative lifetimes of Mg III

    Science.gov (United States)

    Alonso-Medina, A.; Colón, C.; Moreno-Díaz, C.

    2015-03-01

Transition probabilities have been calculated for 365 lines arising from the 2p5 ns (n = 3, 4, 5), 2p5 np (n = 3, 4), 2p5 nd (n = 3, 4), 2p5 nf (n = 4, 5) and 2p5 5g configurations of Mg III, together with radiative lifetimes for 89 levels. These values were obtained in intermediate coupling (IC) by using ab initio relativistic Hartree-Fock (HFR) calculations. We then use the standard method of least-squares fitting of experimental energy levels for the IC calculations by means of Cowan's computer codes. The vast majority of the calculated transition probabilities correspond to lines lying in the ultraviolet (UV) range, which is of high interest in astrophysics. Our results are compared to those previously reported in the literature. Furthermore, the transition probabilities of the 2p5 4d, 2p5 nf (n = 4, 5) and 2p5 5g configuration levels are presented for the first time. These findings make it possible to extend the range of wavelengths usable for estimating temperatures in plasma diagnostics. In addition, our results for radiative lifetimes have been compared to the available experimental values.

  10. Lectures on probability and statistics. Revision

    Energy Technology Data Exchange (ETDEWEB)

    Yost, G.P.

    1985-06-01

These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. They begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. They finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another. Hopefully, the reader will come away from these notes with a feel for some of the problems and uncertainties involved. Although there are standard approaches, most of the time there is no cut and dried ''best'' solution - ''best'' according to every criterion.
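The a priori dice calculation described in the notes can be made concrete, counting favourable outcomes over equally likely total outcomes (a generic illustration, not an example taken from the lectures):

```python
from fractions import Fraction
from itertools import product

def prob_sum(n_dice, target):
    """A priori probability that n fair dice sum to `target`:
    (favourable outcomes) / (total equally likely outcomes)."""
    outcomes = list(product(range(1, 7), repeat=n_dice))
    fav = sum(1 for o in outcomes if sum(o) == target)
    return Fraction(fav, len(outcomes))

print(prob_sum(2, 7))   # 1/6
```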

  11. Lectures on probability and statistics. Revision

    International Nuclear Information System (INIS)

These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. They begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. They finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another. Hopefully, the reader will come away from these notes with a feel for some of the problems and uncertainties involved. Although there are standard approaches, most of the time there is no cut and dried ''best'' solution - ''best'' according to every criterion

  12. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  13. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  14. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs. We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.
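For reference, the classical conditionalisation that the paraconsistent version extends is standard Bayes updating; a minimal sketch with invented numbers:

```python
from fractions import Fraction

def bayes_update(prior, likelihoods):
    """Classical conditionalisation via Bayes' theorem (the baseline the
    paraconsistent version generalises): P(H_i | E) is proportional to
    P(E | H_i) * P(H_i), renormalised over the hypotheses."""
    joint = [p * l for p, l in zip(prior, likelihoods)]
    total = sum(joint)
    return [j / total for j in joint]

# Two hypotheses, evidence twice as likely under the first (invented):
post = bayes_update([Fraction(1, 2), Fraction(1, 2)],
                    [Fraction(2, 3), Fraction(1, 3)])
print(post)  # posterior 2/3 vs 1/3
```

The LFI-based version replaces the underlying consequence relation so that the update remains meaningful even when the evidence set is contradictory.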

  15. Effect of conditioning methods on the microtensile bond strength of phosphate monomer-based cement on zirconia ceramic in dry and aged conditions.

    Science.gov (United States)

    Amaral, Regina; Ozcan, Mutlu; Valandro, Luiz Felipe; Balducci, Ivan; Bottino, Marco Antonio

    2008-04-01

The objective of this study was to evaluate the durability of the bond strength between a resin cement and an aluminous ceramic submitted to various surface conditioning methods. Twenty-four blocks (5 × 5 × 4 mm³) of a glass-infiltrated zirconia-alumina ceramic (In-Ceram Zirconia Classic) were randomly divided into three surface treatment groups: ST1, air-abrasion with 110-µm Al2O3 particles + silanization; ST2, laboratory tribochemical silica coating method (110-µm Al2O3, 110-µm silica) (Rocatec) + silanization; ST3, chairside tribochemical silica coating method (30-µm SiOx) (CoJet) + silanization. Each treated ceramic block was placed in its silicone mold with the treated surface exposed. The resin cement (Panavia F) was prepared and injected into the mold over the treated surface. Specimens were sectioned to achieve non-trimmed bar specimens (14 per block) that were randomly divided into two conditions: (a) dry, microtensile test after sectioning; (b) thermocycling (TC) (6,000 cycles, 5-55 °C) and water storage (150 days). Thus, six experimental groups were obtained (n = 50): Gr1, ST1 + dry; Gr2, ST1 + TC; Gr3, ST2 + dry; Gr4, ST2 + TC; Gr5, ST3 + dry; Gr6, ST3 + TC. After microtensile testing, the failure types were noted. ST2 (25.1 ± 11) and ST3 (24.1 ± 7.4) presented statistically higher bond strength (MPa) than ST1 (17.5 ± 8) regardless of aging conditions (p < 0.0001). While Gr2 revealed the lowest results (13.3 ± 6.4), the other groups (21.7 ± 7.4 to 25.9 ± 9.1) showed no statistically significant differences (two-way ANOVA and Tukey's test, α = 0.05). The majority of the failures were mixed (82%), followed by adhesive failures (18%). Gr2 presented a significantly higher incidence of adhesive failures (54%) than the other groups (p = 0.0001). Both laboratory and chairside silica coating plus silanization showed durable bond strength. After aging, air-abrasion with 110-µm Al2O3 + silanization showed the largest decrease

  16. Helicity probabilities for heavy quark fragmentation into excited mesons

    CERN Document Server

    Yuan, T C

    1995-01-01

Abstract: In the fragmentation of a heavy quark into a heavy meson whose light degrees of freedom have angular momentum 3/2, all the helicity probabilities are completely determined in the heavy quark limit up to a single probability w_{3/2}. We point out that this probability depends on the longitudinal momentum fraction z of the meson and on its transverse momentum p_⊥ relative to the jet axis. We calculate w_{3/2} as a function of scaling variables corresponding to z and p_⊥ for the heavy quark limit of the perturbative QCD fragmentation functions for a b quark to fragment into (b c̄) mesons. In this model, the light degrees of freedom prefer to have their angular momentum aligned transverse to, rather than along, the jet axis. Implications for the production of excited heavy mesons, such as D** and B**, are discussed.

  17. Integrated statistical modelling of spatial landslide probability

    Science.gov (United States)

    Mergili, M.; Chu, H.-J.

    2015-09-01

Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) the landslide inventory is split into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e., the spatial probability that at least one landslide pixel occurs within a zone of defined size. We quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km² study area in southern Taiwan, using an inventory of 1399 landslides triggered by Typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability; (iv) removing the largest landslides from the analysis leads to an enhanced model performance.
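Step (v) of the procedure above combines three pixel rasters with a single max/product rule. A minimal sketch in plain Python (all raster values below are invented for illustration):

```python
def integrated_probability(p_release, p_impact, p_zonal):
    # Step (v): pixel-wise maximum of the release probability and the
    # product of the impact probability and the zonal release probability.
    return [[max(r, i * z) for r, i, z in zip(row_r, row_i, row_z)]
            for row_r, row_i, row_z in zip(p_release, p_impact, p_zonal)]

# Toy 2x2 rasters (hypothetical values, not data from the study):
p_release = [[0.30, 0.05], [0.10, 0.00]]
p_impact  = [[0.20, 0.90], [0.50, 0.80]]
p_zonal   = [[0.40, 0.40], [0.40, 0.40]]

p_int = integrated_probability(p_release, p_impact, p_zonal)
```

In this toy example, the second pixel of the first row is dominated by runout (impact × zonal = 0.36 exceeds its release probability of 0.05), illustrating how the rule adds impact areas to release areas.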

  18. Some theoretical aspects of the group-IIIA-ion atomic clocks: Intercombination transition probabilities

    International Nuclear Information System (INIS)

    The main focus of this paper is the theoretical study of the 3P1→1S0 intercombination transition probabilities of the group-IIIA ions that are excellent candidates for high-accuracy atomic clocks. The importance of relativistic effects on the intercombination transition probabilities is made apparent by comparing their calculated values with those of the allowed 1P1→1S0 transition probabilities. In striking contrast to the allowed transition probabilities, the intercombination transition probabilities exhibit a strong Z dependence

  19. Generic Degraded Configuration Probability Analysis for the Codisposal Waste Package

    International Nuclear Information System (INIS)

In accordance with the technical work plan, "Technical Work Plan For: Department of Energy Spent Nuclear Fuel Work Packages" (CRWMS M&O 2000c), this Analysis/Model Report (AMR) is developed for the purpose of screening out degraded configurations for U.S. Department of Energy (DOE) spent nuclear fuel (SNF) types. It performs the degraded-configuration parameter and probability evaluations of the overall methodology specified in the "Disposal Criticality Analysis Methodology Topical Report" (YMP 2000, Section 3) on qualifying configurations. Degradation analyses are performed to assess realizable parameter ranges and physical regimes for configurations. Probability calculations are then performed for configurations characterized by keff in excess of the Critical Limit (CL). The scope of this document is to develop a generic set of screening criteria or models to screen out degraded configurations with the potential to exceed a criticality limit. The developed screening criteria include arguments based on physical/chemical processes and probability calculations, and they apply to DOE SNF types codisposed with high-level waste (HLW) glass inside a waste package. The degradation takes place inside the waste package, long after the repository licensing period has expired. The emphasis of this AMR is on degraded-configuration screening; the probability analysis is one of the approaches used for screening. The intended use of the model is to apply the developed screening criteria to each DOE SNF type following completion of the degraded-mode criticality analysis internal to the waste package.

  20. Bell Could Become the Copernicus of Probability

    Science.gov (United States)

    Khrennikov, Andrei

    2016-07-01

    Our aim is to emphasize the role of mathematical models in physics, especially models of geometry and probability. We briefly compare developments of geometry and probability by pointing to similarities and differences: from Euclid to Lobachevsky and from Kolmogorov to Bell. In probability, Bell could play the same role as Lobachevsky in geometry. In fact, violation of Bell’s inequality can be treated as implying the impossibility to apply the classical probability model of Kolmogorov (1933) to quantum phenomena. Thus the quantum probabilistic model (based on Born’s rule) can be considered as the concrete example of the non-Kolmogorovian model of probability, similarly to the Lobachevskian model — the first example of the non-Euclidean model of geometry. This is the “probability model” interpretation of the violation of Bell’s inequality. We also criticize the standard interpretation—an attempt to add to rigorous mathematical probability models additional elements such as (non)locality and (un)realism. Finally, we compare embeddings of non-Euclidean geometries into the Euclidean space with embeddings of the non-Kolmogorovian probabilities (in particular, quantum probability) into the Kolmogorov probability space. As an example, we consider the CHSH-test.

  1. Collision strengths and transition probabilities for Co III forbidden lines

    CERN Document Server

    Storey, P J

    2016-01-01

In this paper we compute the collision strengths and their thermally-averaged Maxwellian values for electron transitions between the fifteen lowest levels of doubly-ionised cobalt, Co^{2+}, which give rise to forbidden emission lines in the visible and infrared regions of the spectrum. The calculations also include transition probabilities and predicted relative line emissivities. The data are particularly useful for analysing the thermodynamic conditions of supernova ejecta.

  2. Collision strengths and transition probabilities for Co III forbidden lines

    Science.gov (United States)

    Storey, P. J.; Sochi, Taha

    2016-07-01

In this paper we compute the collision strengths and their thermally averaged Maxwellian values for electron transitions between the 15 lowest levels of doubly ionized cobalt, Co2+, which give rise to forbidden emission lines in the visible and infrared regions of the spectrum. The calculations also include transition probabilities and predicted relative line emissivities. The data are particularly useful for analysing the thermodynamic conditions of supernova ejecta.

  3. Classification of Forest Type From Space Using Recollision Probability

    Science.gov (United States)

    Schull, M. A.; Ganguly, S.; Samanta, A.; Myneni, R. B.; Knyazikhin, Y.

    2008-12-01

A classification of forest types has been produced based on the scattering characteristics within the forest canopy. The scattering processes are described by the spectral invariants of radiative transfer theory. The spectral invariants, the recollision and escape probabilities, explain the effect that canopy structural hierarchy has on the bidirectional reflectance factor (BRF). Here we show that the recollision probability can delineate between needleleaf and broadleaf forest types given the same effective LAI. Since the recollision probability reflects the multiple scattering in the canopy, we have found that it is sensitive to hierarchical canopy structure. Given that needleleaf canopies have one more hierarchical level (needles within the shoot, as opposed to a flat leaf), there is more scattering within a needleleaf than a broadleaf forest for the same effective LAI, allowing for separation between forest types. Promising results were attained, with a high level of confidence, by simply applying a threshold to recollision probability values calculated from AVIRIS hyperspectral data. The results are shown for AVIRIS campaigns in the Northeast region of the US flown in August of 2003.

  4. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. Features: a good and solid introduction to probability theory and stochastic processes; logical organization and clear writing; a comprehensive choice of topics within the area of probability; ample homework problems organized into chapter sections.

  5. Probability and Quantum Paradigms: the Interplay

    Science.gov (United States)

    Kracklauer, A. F.

    2007-12-01

Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely a non-Boolean structure and non-positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken, a variant interpretation of wave functions based on photo-detection physics is proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well-tested concepts and technology.

  6. Introduction: Research and Developments in Probability Education

    OpenAIRE

    Manfred Borovcnik; Ramesh Kapadia

    2009-01-01

In the topic study group on probability at ICME 11 a variety of ideas on probability education were presented. Some of the papers have been developed further through the driving ideas of interactivity and use of the potential of electronic publishing. As often happens, the medium of research influences the results, and thus, not surprisingly, the research changed its character during this process. This paper provides a summary of the main threads of research in probability education across the wor...

  7. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  8. Time and probability in quantum cosmology

    Energy Technology Data Exchange (ETDEWEB)

    Greensite, J. (San Francisco State Univ., CA (USA). Dept. of Physics and Astronomy)

    1990-10-01

A time function, an exactly conserved probability measure, and a time-evolution equation (related to the Wheeler-DeWitt equation) are proposed for quantum cosmology. The time-integral of the probability measure is the measure proposed by Hawking and Page. The evolution equation reduces to the Schroedinger equation, and the probability measure to the Born measure, in the WKB approximation. The existence of this 'Schroedinger limit', which involves a cancellation of time-dependencies in the probability density between the WKB prefactor and the integration measure, is a consequence of Laplacian factor ordering in the Wheeler-DeWitt equation.

  9. Real analysis and probability solutions to problems

    CERN Document Server

    Ash, Robert P

    1972-01-01

    Real Analysis and Probability: Solutions to Problems presents solutions to problems in real analysis and probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability; the interplay between measure theory and topology; conditional probability and expectation; the central limit theorem; and strong laws of large numbers in terms of martingale theory.Comprised of eight chapters, this volume begins with problems and solutions for the theory of measure and integration, followed by various applications of the basic integration theory.

  10. Bayesian logistic betting strategy against probability forecasting

    CERN Document Server

    Kumon, Masayuki; Takemura, Akimichi; Takeuchi, Kei

    2012-01-01

    We propose a betting strategy based on Bayesian logistic regression modeling for the probability forecasting game in the framework of game-theoretic probability by Shafer and Vovk (2001). We prove some results concerning the strong law of large numbers in the probability forecasting game with side information based on our strategy. We also apply our strategy for assessing the quality of probability forecasting by the Japan Meteorological Agency. We find that our strategy beats the agency by exploiting its tendency of avoiding clear-cut forecasts.
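The capital process underlying the Shafer-Vovk forecasting game can be illustrated with a much simpler strategy than the paper's Bayesian logistic one: a Skeptic who places a fixed stake on (or against) each forecast. The fixed-stake rule below is an assumption for illustration, not the authors' strategy; it only shows the game's bookkeeping.

```python
def skeptic_capital(rounds, stake):
    """Skeptic's capital in the probability forecasting game.

    Each round the forecaster announces p and reality reveals y in {0, 1};
    Skeptic's capital is multiplied by 1 + stake*(y - p).  For any fixed
    stake in [-1, 1] this factor stays positive, so capital never goes
    negative (a 'prudent' strategy).  A fixed stake is a toy rule, not the
    paper's Bayesian logistic strategy.
    """
    capital = 1.0
    for p, y in rounds:
        capital *= 1.0 + stake * (y - p)
    return capital

# A forecaster who keeps announcing 80% rain on days that stay dry is
# beaten by a Skeptic betting against the forecast (stake < 0):
dry_spell = [(0.8, 0)] * 3
rich = skeptic_capital(dry_spell, stake=-1.0)  # multiplies by 1.8 per round
poor = skeptic_capital(dry_spell, stake=+1.0)  # multiplies by 0.2 per round
```

A growing capital is the game-theoretic evidence that the forecasts are poor, which is the sense in which the paper's strategy "beats" a forecaster.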

  11. Some New Results on Transition Probability

    Institute of Scientific and Technical Information of China (English)

    Yu Quan XIE

    2008-01-01

    In this paper, we study the basic properties of stationary transition probability of Markov processes on a general measurable space (E, ε), such as the continuity, maximum probability, zero point, positive probability set standardization, and obtain a series of important results such as Continuity Theorem, Representation Theorem, Levy Theorem and so on. These results are very useful for us to study stationary tri-point transition probability on a general measurable space (E, ε). Our main tools such as Egoroff's Theorem, Vitali-Hahn-Saks's Theorem and the theory of atomic set and well-posedness of measure are also very interesting and fashionable.

  12. Quantum Statistical Mechanics. III. Equilibrium Probability

    OpenAIRE

    Attard, Phil

    2014-01-01

Given are a first-principles derivation and formulation of the probabilistic concepts that underlie equilibrium quantum statistical mechanics. The transition to non-equilibrium probability is traversed briefly.

  13. Photon recollision probability in heterogeneous forest canopies: Compatibility with a hybrid GO model

    Science.gov (United States)

    Mõttus, Matti; Stenberg, Pauline; Rautiainen, Miina

    2007-02-01

    Photon recollision probability, or the probability by which a photon scattered from a phytoelement in the canopy will interact within the canopy again, has previously been shown to approximate well the fractions of radiation scattered and absorbed by homogeneous plant covers. To test the applicability of the recollision probability theory to more complicated canopy structures, a set of modeled stands was generated using allometric relations for Scots pine trees growing in central Finland. A hybrid geometric-optical model (FRT, or the Kuusk-Nilson model) was used to simulate the reflectance and transmittance of the modeled forests consisting of ellipsoidal tree crowns and, on the basis of the simulations, the recollision probability (p) was calculated for the canopies. As the recollision probability theory assumes energy conservation, a method to check and ensure energy conservation in the model was first developed. The method enabled matching the geometric-optical and two-stream submodels of the hybrid FRT model, and more importantly, allowed calculation of the recollision probability from model output. Next, to assess the effect of canopy structure on the recollision probability, the obtained p-values were compared to those calculated for structureless (homogeneous) canopies with similar effective LAI using a simple two-stream radiation transfer model. Canopy structure was shown to increase the recollision probability, implying that structured canopies absorb more efficiently the radiation interacting with the canopy, and it also changed the escape probabilities for different scattering orders. Most importantly, the study demonstrated that the concept of recollision probability is coherent with physically based canopy reflectance models which use the classical radiative transfer theory. Furthermore, it was shown that as a first approximation, the recollision probability can be considered to be independent of wavelength. Finally, different algorithms for
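A common first-approximation relation in the spectral-invariants literature, consistent with the wavelength-independence noted above, links the leaf single-scattering albedo w, the recollision probability p, and the fraction s of intercepted radiation scattered by the canopy: s = w(1 - p)/(1 - pw). The sketch below assumes this form holds exactly and recovers p from simulated scattering at two hypothetical wavelengths.

```python
def canopy_scattering(w, p):
    # Spectral-invariant approximation: fraction of intercepted radiation
    # eventually scattered out of the canopy, s = w*(1 - p) / (1 - p*w).
    return w * (1.0 - p) / (1.0 - p * w)

def recollision_probability(w, s):
    # Invert the relation above for p (valid when the approximation holds).
    return (w - s) / (w * (1.0 - s))

# The structural parameter p should come out wavelength-independent:
p_true = 0.6                  # hypothetical canopy structure parameter
albedos = (0.9, 0.2)          # NIR-like and red-like leaf albedos (invented)
recovered = [recollision_probability(w, canopy_scattering(w, p_true))
             for w in albedos]
```

The round-trip returns the same p for both albedos, mirroring the paper's point that p is a structural quantity, approximately independent of wavelength.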

  14. Study on the Winning Probability for a Bid in Procurement Combinational Auction with Tree Structure

    Institute of Scientific and Technical Information of China (English)

    CHEN Jian; HUANG He

    2004-01-01

In this paper, the process of determining the winning probability for a corresponding bidder's deterministic bid is presented. Analysis of the winning probability is crucial for studying bidding equilibria and designing the mechanism of procurement combinational auctions (CAs), and it also provides decision-making support for bidders operating in commercial-synergy surroundings. Finally, an example is used to illustrate the feasibility and the detailed process of calculating the winning probability.

  15. Noise figure and photon probability distribution in Coherent Anti-Stokes Raman Scattering (CARS)

    OpenAIRE

    Dimitropoulos, D.; Solli, D. R.; Claps, R.; Jalali, B.

    2006-01-01

The noise figure and photon probability distribution are calculated for coherent anti-Stokes Raman scattering (CARS), where an anti-Stokes signal is converted to Stokes. We find that the minimum noise figure is ~3 dB.

  16. Probability of paternity in paternity testing using the DNA fingerprint procedure.

    Science.gov (United States)

    Honma, M; Ishiyama, I

    1989-01-01

    For the purpose of applying DNA fingerprinting to paternity testing, we established a general formula to calculate the probability of paternity and evaluated the ability of DNA fingerprinting to determine paternity. PMID:2591980
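The abstract does not reproduce the paper's general formula, but probability-of-paternity formulas of this kind typically take the Bayesian (Essen-Moller-style) form below, combining a paternity index with a prior. This generic form is a sketch, not necessarily the exact formula derived in the cited paper.

```python
def probability_of_paternity(pi, prior=0.5):
    """Posterior probability of paternity by Bayes' theorem, where
    pi = P(genetic evidence | alleged father) / P(evidence | random man)
    is the (combined) paternity index.  The prior of 0.5 is the
    conventional Essen-Moller choice; this is a generic formulation,
    not necessarily the paper's exact formula."""
    return pi * prior / (pi * prior + (1.0 - prior))

w = probability_of_paternity(19.0)   # PI of 19 with a 0.5 prior gives W = 0.95
```

A paternity index of 1 (evidence equally likely either way) leaves the prior unchanged, while large indices push the posterior toward 1.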

  17. CONDOR: neutronic code for fuel elements calculation with rods

    International Nuclear Information System (INIS)

The CONDOR neutronic code is used for the calculation of fuel elements formed by fuel rods. The method employed to obtain the neutronic flux is that of collision probabilities in a multigroup scheme in two-dimensional geometry. This code utilizes new calculation algorithms and normalization of such collision probabilities. Burn-up calculations can be made, with the alternative of applying variational methods for response flux calculations or those corresponding to collision normalization. (Author)
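The elementary quantity behind collision-probability methods is the chance that a neutron's first flight ends inside a region of optical thickness τ, namely 1 - e^(-τ). The sketch below checks that quantity by Monte Carlo; it is only this basic building block, not CONDOR's two-dimensional multigroup algorithm, which the abstract does not detail.

```python
import math
import random

def collision_probability_mc(sigma_t, thickness, n=200_000, seed=1):
    """Monte Carlo estimate of the probability that a neutron entering a
    homogeneous slab along the normal collides before traversing it.
    Free path lengths are sampled from the exponential distribution with
    mean 1/sigma_t (sigma_t is the macroscopic total cross-section)."""
    rng = random.Random(seed)
    hits = sum(
        1 for _ in range(n)
        if -math.log(1.0 - rng.random()) / sigma_t < thickness
    )
    return hits / n

p_mc = collision_probability_mc(sigma_t=0.5, thickness=2.0)  # tau = 1
p_exact = 1.0 - math.exp(-1.0)                               # analytic value
```

With 200,000 histories the estimate agrees with the analytic value to well under a percent.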

  18. Technique for calculating the leak probability of a steam generator due to destruction of the studs of a collector cover

    International Nuclear Information System (INIS)

An approach for estimating the leak probability of a flanged joint due to the destruction of its fastening studs is described. The approach consists of two stages: in the first stage, the probability of destroying one stud is calculated; in the second, the probability of the various arrangements of intact and destroyed studs is calculated. A leak-probability calculation for the collector cover area of the PGV-1000 steam generator is presented as an example of the developed approach.
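The two-stage structure can be sketched directly: stage 1 supplies a per-stud failure probability q, and stage 2 sums over arrangements of intact and destroyed studs. The leak criterion used below (two adjacent failed studs on a circular flange) is a hypothetical stand-in for the real PGV-1000 criterion, which the abstract does not specify.

```python
from itertools import product

def leak_probability(n_studs, q):
    """Stage 2 of the two-stage approach: enumerate all 2^n intact/failed
    arrangements of studs on a circular flange and sum the probabilities
    of those that cause a leak.  Here a leak is (hypothetically) assumed
    whenever two neighbouring studs have both failed; q is the per-stud
    failure probability obtained in stage 1, studs failing independently."""
    total = 0.0
    for states in product((0, 1), repeat=n_studs):   # 1 = destroyed stud
        if any(states[i] and states[(i + 1) % n_studs]
               for i in range(n_studs)):
            k = sum(states)
            total += q ** k * (1.0 - q) ** (n_studs - k)
    return total
```

For a ring of 3 studs every pair is adjacent, so the result reduces to the probability of at least two failures, 3q²(1-q) + q³, which gives a quick sanity check of the enumeration.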

  19. Probability theory and statistical applications a profound treatise for self-study

    CERN Document Server

    Zörnig, Peter

    2016-01-01

This accessible and easy-to-read book provides many examples to illustrate diverse topics in probability and statistics, from initial concepts up to advanced calculations. Special attention is devoted, for example, to independence of events, inequalities in probability, and functions of random variables. The book is directed to students of mathematics, statistics, engineering, and other quantitative sciences.

  20. Interelectronic-interaction effect on the transition probability in high-Z He-like ions

    OpenAIRE

    Indelicato, Paul; Shabaev, V. M.; Volotka, A. V.

    2004-01-01

    The interelectronic-interaction effect on the transition probabilities in high-Z He-like ions is investigated within a systematic quantum electrodynamic approach. The calculation formulas for the interelectronic-interaction corrections of first order in $1/Z$ are derived using the two-time Green function method. These formulas are employed for numerical evaluations of the magnetic transition probabilities in heliumlike ions. The results of the calculations are compared with experimental value...

  1. Recent Developments in Applied Probability and Statistics

    CERN Document Server

    Devroye, Luc; Kohler, Michael; Korn, Ralf

    2010-01-01

    This book presents surveys on recent developments in applied probability and statistics. The contributions include topics such as nonparametric regression and density estimation, option pricing, probabilistic methods for multivariate interpolation, robust graphical modelling and stochastic differential equations. Due to its broad coverage of different topics the book offers an excellent overview of recent developments in applied probability and statistics.

  2. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  3. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed.The present notes outline a method for evaluation of the probability...

  4. Stimulus Probability Effects in Absolute Identification

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  5. Teaching Probability: A Socio-Constructivist Perspective

    Science.gov (United States)

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.

  6. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  7. Probability: A Matter of Life and Death

    Science.gov (United States)

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…

  8. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The presnt paper outlines a method for evaluation of the probability of ship-ship c...

  9. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out...

  10. The probability premium: a graphical representation

    NARCIS (Netherlands)

    L.R. Eeckhoudt; R.J.A. Laeven

    2015-01-01

    We illustrate that Pratt’s probability premium can be given a simple graphical representation allowing a direct comparison to the equivalent but more prevalent concept of risk premium under expected utility. We also show that the probability premium’s graphical representation under the dual theory m

  11. Conditional probability of the tornado missile impact given a tornado occurrence

    International Nuclear Information System (INIS)

Using an approach based on statistical mechanics, an expression for the probability of the first missile strike is developed. The expression depends on two generic parameters (injection probability eta(F) and height distribution psi(Z,F)), which are developed in this study, and one plant-specific parameter (the number of potential missiles N_p). The expression for the joint probability of simultaneous impact of multiple targets is also developed. This expression is applicable to calculation of the probability of common-cause failure due to tornado missiles. It is shown that the probability of the first missile strike can be determined using a uniform missile distribution model. It is also shown that the conditional probability of the second strike, given the first, is underestimated by the uniform model. The probability of the second strike is greatly increased if the missiles are in clusters large enough to cover both targets.
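The uniform missile distribution model can be illustrated with independent missiles, collapsing the generic parameters eta(F) and psi(Z,F) into a single per-missile hit chance. This is only a schematic reading of the abstract, not the study's actual expression.

```python
def first_strike_probability(n_missiles, injection_prob, target_area, spread_area):
    """Schematic uniform-distribution model: each of the N_p potential
    missiles independently becomes airborne with probability
    `injection_prob` and, if airborne, lands uniformly over `spread_area`,
    so it hits a target of area `target_area` with chance
    injection_prob * target_area / spread_area.  All parameter values
    used with this function are hypothetical."""
    p_hit = injection_prob * target_area / spread_area
    return 1.0 - (1.0 - p_hit) ** n_missiles

# e.g. 1000 potential missiles, 10% injection, 10 m^2 target, 10^5 m^2 spread:
p_first = first_strike_probability(1000, 0.1, 10.0, 1.0e5)
```

Note that this independence assumption is exactly what the abstract warns about for the second strike: clustered missiles make joint impacts far more likely than the uniform model predicts.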

  12. Quantum Theory and Probability Theory: Their Relationship and Origin in Symmetry

    Directory of Open Access Journals (Sweden)

    Philip Goyal

    2011-04-01

Quantum theory is a probabilistic calculus that enables the calculation of the probabilities of the possible outcomes of a measurement performed on a physical system. But what is the relationship between this probabilistic calculus and probability theory itself? Is quantum theory compatible with probability theory? If so, does it extend or generalize probability theory? In this paper, we answer these questions, and precisely determine the relationship between quantum theory and probability theory, by explicitly deriving both theories from first principles. In both cases, the derivation depends upon identifying and harnessing the appropriate symmetries that are operative in each domain. We prove, for example, that quantum theory is compatible with probability theory by explicitly deriving quantum theory on the assumption that probability theory is generally valid.
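The CHSH test mentioned above is easy to reproduce numerically: with the singlet-state correlation E(a,b) = -cos(a - b) predicted by quantum mechanics, the CHSH combination reaches 2√2, beyond the bound of 2 that holds in any single Kolmogorov probability space.

```python
import math

def chsh(E, a, a2, b, b2):
    # CHSH combination S = E(a,b) + E(a,b') + E(a',b) - E(a',b').
    return E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)

# Quantum singlet-state correlation (standard result):
E_qm = lambda x, y: -math.cos(x - y)

# Optimal angle choices push |S| to 2*sqrt(2) (the Tsirelson bound),
# above the classical (single-probability-space) bound of 2:
S = chsh(E_qm, 0.0, math.pi / 2, math.pi / 4, -math.pi / 4)
```

The violation |S| = 2√2 > 2 is the quantitative content of the claim that quantum statistics cannot be embedded in one classical probability model.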

  13. Laboratory-Tutorial activities for teaching probability

    CERN Document Server

    Feeley, Roger E.; Morgan, Jeffrey T.; Wittmann, Michael C.

    2006-01-01

    We report on the development of students' ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain "touchstone" examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler's fallacy, and use expectati...

  14. Angular anisotropy representation by probability tables

    International Nuclear Information System (INIS)

    In this paper, we improve point-wise or group-wise angular anisotropy representation by using probability tables. The starting point of this study was to give more flexibility (sensitivity analysis) and more accuracy (ray effect) to group-wise anisotropy representation by Dirac functions, independently introduced at CEA (Mao, 1998) and at IRSN (Le Cocq, 1998) ten years ago. Basing ourselves on our experience of cross-section description, acquired in CALENDF (Sublet et al., 2006), we introduce two kinds of moment based probability tables, Dirac (DPT) and Step-wise (SPT) Probability Tables where the angular probability distribution is respectively represented by Dirac functions or by a step-wise function. First, we show how we can improve equi-probable cosine representation of point-wise anisotropy by using step-wise probability tables. Then we show, by Monte Carlo techniques, how we can obtain a more accurate description of group-wise anisotropy than the one usually given by a finite expansion on a Legendre polynomial basis (that can induce negative values) and finally, we describe it by Dirac probability tables. This study is carried out in the framework of GALILEE project R and D activities (Coste-Delclaux, 2008). (authors)

  15. Survival probability in patients with liver trauma.

    Science.gov (United States)

    Buci, Skender; Kukeli, Agim

    2016-08-01

    Purpose - The purpose of this paper is to assess the survival probability among patients with liver trauma injury using the anatomical and psychological scores of conditions, characteristics and treatment modes. Design/methodology/approach - A logistic model is used to estimate 173 patients' survival probability. Data are taken from patient records. Only emergency room patients admitted to University Hospital of Trauma (former Military Hospital) in Tirana are included. Data are recorded anonymously, preserving the patients' privacy. Findings - When correctly predicted, the logistic models show that survival probability varies from 70.5 percent up to 95.4 percent. The degree of trauma injury, trauma with liver and other organs, total days the patient was hospitalized, and treatment method (conservative vs intervention) are statistically important in explaining survival probability. Practical implications - The study gives patients, their relatives and physicians ample and sound information they can use to predict survival chances, the best treatment and resource management. Originality/value - This study, which has not been done previously, explores survival probability, success probability for conservative and non-conservative treatment, and success probability for single vs multiple injuries from liver trauma.
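
    The logistic model described above maps a linear combination of patient covariates to a survival probability. A hedged sketch of that mapping (the intercept, coefficients and covariate values below are invented for illustration, not the paper's fitted estimates):

```python
import math

def survival_probability(intercept, coefficients, covariates):
    """Logistic model: P(survival) = 1 / (1 + exp(-(b0 + sum(b_i * x_i))))."""
    linear = intercept + sum(b * x for b, x in zip(coefficients, covariates))
    return 1.0 / (1.0 + math.exp(-linear))

# Hypothetical covariates: injury grade, days hospitalized,
# treatment mode (1 = intervention, 0 = conservative)
p = survival_probability(2.0, [-0.8, 0.05, 0.6], [3, 10, 1])
print(round(p, 3))  # 0.668
```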

  16. Probability assessment of results of radiological analysis of surface waters affected by radioactive raw materials mining

    International Nuclear Information System (INIS)

    Water quality for classification purposes is determined by the average of the prescribed number of most unfavourable values of the indicator. In hydrology data are processed using a probability evaluation. The results are given of the probability evaluation of the occurrence of natural radionuclides for the example of a set of values obtained in a model catchment (Berounka river). The calculation of volume activities of radionuclides with a chosen non-exceedance probability may be completed with a calculation of reliability for the chosen significance level. (M.D.)
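
    The non-exceedance calculation described above amounts to reading a chosen quantile off the observed distribution of volume activities. A minimal sketch, assuming Weibull plotting positions i/(n+1), a common convention in hydrology (the sample activities are invented):

```python
def non_exceedance_value(samples, p):
    """Empirical value with non-exceedance probability p, using Weibull
    plotting positions P_i = i / (n + 1) and linear interpolation."""
    s = sorted(samples)
    n = len(s)
    pos = p * (n + 1)   # fractional rank of the requested probability
    i = int(pos)
    if i < 1:
        return s[0]
    if i >= n:
        return s[-1]
    return s[i - 1] + (pos - i) * (s[i] - s[i - 1])

# Hypothetical volume activities (Bq/L) from repeated sampling of a catchment
activities = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]
v80 = non_exceedance_value(activities, 0.80)
print(v80)
```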

  17. Are All Probabilities Fundamentally Quantum Mechanical?

    CERN Document Server

    Pradhan, Rajat Kumar

    2011-01-01

    The subjective and the objective aspects of probabilities are incorporated in a simple duality axiom inspired by observer participation in quantum theory. Transcending the classical notion of probabilities, it is proposed and demonstrated that all probabilities may be fundamentally quantum mechanical in the sense that they may all be derived from the corresponding amplitudes. The classical coin-toss and the quantum double slit interference experiments are discussed as illustrative prototype examples. The absence of multi-order quantum interference effects in multiple-slit experiments and the experimental tests of complementarity in Wheeler's delayed-choice type experiments are explained using the involvement of the observer.
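
    The amplitude-to-probability link invoked in this record is the Born rule, p = |a|². A minimal sketch of a fair "quantum coin" whose two outcomes each carry an amplitude of modulus 1/√2 (the particular phases chosen below are arbitrary illustrative assumptions):

```python
import math

# Hypothetical two-outcome amplitudes for a fair "quantum coin";
# the relative phase (here a factor of i) does not affect the probabilities.
amplitudes = [complex(1 / math.sqrt(2), 0), complex(0, 1 / math.sqrt(2))]

# Born rule: probability of each outcome is the squared modulus of its amplitude.
probabilities = [abs(a) ** 2 for a in amplitudes]
print([round(p, 6) for p in probabilities])  # [0.5, 0.5]
```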

  18. Advanced Probability Theory for Biomedical Engineers

    CERN Document Server

    Enderle, John

    2006-01-01

    This is the third in a series of short books on probability theory and random processes for biomedical engineers. This book focuses on standard probability distributions commonly encountered in biomedical engineering. The exponential, Poisson and Gaussian distributions are introduced, as well as important approximations to the Bernoulli PMF and Gaussian CDF. Many important properties of jointly Gaussian random variables are presented. The primary subjects of the final chapter are methods for determining the probability distribution of a function of a random variable. We first evaluate the prob

  19. Basic Probability Theory for Biomedical Engineers

    CERN Document Server

    Enderle, John

    2006-01-01

    This is the first in a series of short books on probability theory and random processes for biomedical engineers. This text is written as an introduction to probability theory. The goal was to prepare students, engineers and scientists at all levels of background and experience for the application of this theory to a wide variety of problems--as well as pursue these topics at a more advanced level. The approach is to present a unified treatment of the subject. There are only a few key concepts involved in the basic theory of probability theory. These key concepts are all presented in the first

  20. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition""This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory.""  - The StatisticianThoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  1. Probability Distributions for a Surjective Unimodal Map

    Institute of Scientific and Technical Information of China (English)

    Hongyan SUN; Long WANG

    1996-01-01

    In this paper we show that the probability distributions for a surjective unimodal map can be classified into three types, δ function, asymmetric and symmetric type, by identifying the binary structures of its initial values. Borel's normal number theorem is equivalent or prior to the Frobenius-Perron operator in analyzing the probability distributions for this kind of map, and in particular we can constitute a multifractal probability distribution from the surjective tent map by selecting a non-Borel's normal number as the initial value.

  2. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

    ""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  3. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more...... of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate...... these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons....

  4. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions...... and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate...... multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions)....

  5. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    2014-01-01

    We evaluate a binary lottery procedure for inducing risk neutral behavior in a subjective belief elicitation task. Prior research has shown this procedure to robustly induce risk neutrality when subjects are given a single risk task defined over objective probabilities. Drawing a sample from...... the same subject population, we find evidence that the binary lottery procedure also induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation of subjective probabilities in subjects...

  6. Size constrained unequal probability sampling with a non-integer sum of inclusion probabilities

    OpenAIRE

    Grafström, Anton; Qualité, Lionel; Tillé, Yves; Matei, Alina

    2012-01-01

    More than 50 methods have been developed to draw unequal probability samples with fixed sample size. All these methods require the sum of the inclusion probabilities to be an integer number. There are cases, however, where the sum of desired inclusion probabilities is not an integer. Then, classical algorithms for drawing samples cannot be directly applied. We present two methods to overcome the problem of sample selection with unequal inclusion probabilities when their sum is not an integer ...

  7. Choosing information variables for transition probabilities in a time-varying transition probability Markov switching model

    OpenAIRE

    Andrew J. Filardo

    1998-01-01

    This paper discusses a practical estimation issue for time-varying transition probability (TVTP) Markov switching models. Time-varying transition probabilities allow researchers to capture important economic behavior that may be missed using constant (or fixed) transition probabilities. Despite its use, Hamilton’s (1989) filtering method for estimating fixed transition probability Markov switching models may not apply to TVTP models. This paper provides a set of sufficient conditions to justi...

  8. Estimation of long-term probabilities for inadvertent intrusion into radioactive waste management areas

    International Nuclear Information System (INIS)

    The risk to human health from radioactive waste management sites can be calculated as the product of the probability of accidental exposure (intrusion) times the probability of a health effect from such exposure. This report reviews the literature and evaluates methods used to predict the probabilities for unintentional intrusion into radioactive waste management areas in Canada over a 10,000-year period. Methods to predict such probabilities are available. They generally assume a long-term stability in terms of existing resource uses and society in the management area. The major potential for errors results from the unlikeliness of these assumptions holding true over such lengthy periods of prediction.
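
    The risk expression in this record is a simple product of two probabilities. A sketch with invented numbers (neither figure is from the report):

```python
# Hypothetical annual probability of inadvertent intrusion into the site
p_intrusion = 1.0e-4
# Hypothetical probability of a health effect given such an exposure
p_effect_given_intrusion = 1.0e-2

# Risk = P(exposure) * P(health effect | exposure)
annual_risk = p_intrusion * p_effect_given_intrusion
print(annual_risk)
```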

  9. Probability of spent fuel transportation accidents

    Energy Technology Data Exchange (ETDEWEB)

    McClure, J. D.

    1981-07-01

    The transported volume of spent fuel, incident/accident experience and accident environment probabilities were reviewed in order to provide an estimate of spent fuel accident probabilities. In particular, the accident review assessed the accident experience for large casks of the type that could transport spent (irradiated) nuclear fuel. This review determined that since 1971, the beginning of official US Department of Transportation record keeping for accidents/incidents, there has been one spent fuel transportation accident. This information, coupled with estimated annual shipping volumes for spent fuel, indicated an estimated annual probability of a spent fuel transport accident of 5 x 10^-7 spent fuel accidents per mile. This is consistent with ordinary truck accident rates. A comparison of accident environments and regulatory test environments suggests that the probability of truck accidents exceeding regulatory tests for impact is approximately 10^-9/mile.
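
    The cited per-mile rate converts directly into an expected annual number of accidents once a shipment volume is assumed; the mileage below is invented for illustration, not taken from the review:

```python
accident_rate_per_mile = 5e-7      # per-mile accident probability cited above
annual_shipping_miles = 200_000    # hypothetical total annual spent fuel shipment mileage

# Expected number of accidents per year under these assumptions
expected_accidents_per_year = accident_rate_per_mile * annual_shipping_miles
print(expected_accidents_per_year)
```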

  10. Certainties and probabilities of the IPCC

    International Nuclear Information System (INIS)

    Based on an analysis of information about the climate evolution, simulations of a global warming and the snow coverage monitoring of Meteo-France, the IPCC presented its certainties and probabilities concerning the greenhouse effect. (A.L.B.)

  11. Transition probabilities in superfluid He4

    International Nuclear Information System (INIS)

    The transition probabilities between various states of superfluid helium-4 are found by using the approximation method of Bogolyubov and making use of his canonical transformations for different states of transitions. (author)

  12. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic as a model within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.

  13. Inclusion probability with dropout: an operational formula.

    Science.gov (United States)

    Milot, E; Courteau, J; Crispino, F; Mailly, F

    2015-05-01

    In forensic genetics, a mixture of two or more contributors to a DNA profile is often interpreted using the inclusion probabilities theory. In this paper, we present a general formula for estimating the probability of inclusion (PI, also known as the RMNE probability) from a subset of visible alleles when dropouts are possible. This one-locus formula can easily be extended to multiple loci using the cumulative probability of inclusion. We show that an exact formulation requires fixing the number of contributors, hence to slightly modify the classic interpretation of the PI. We discuss the implications of our results for the enduring debate over the use of PI vs likelihood ratio approaches within the context of low template amplifications.
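
    Under the standard assumption of independent loci, per-locus inclusion probabilities combine multiplicatively into the cumulative PI mentioned above. This sketch shows only that combination step, with invented per-locus values; it is not the paper's dropout-adjusted one-locus formula:

```python
from functools import reduce

def cumulative_pi(per_locus_pi):
    """Cumulative probability of inclusion: the product of per-locus PI
    values, assuming the loci are independent."""
    return reduce(lambda acc, p: acc * p, per_locus_pi, 1.0)

# Hypothetical per-locus PI values for a three-locus profile
cpi = cumulative_pi([0.12, 0.30, 0.08])
print(round(cpi, 5))  # 0.00288
```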

  14. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how...... the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work...... is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions....

  15. Asymmetry of the work probability distribution

    OpenAIRE

    Saha, Arnab; Bhattacharjee, J. K.

    2006-01-01

    We show, both analytically and numerically, that for a nonlinear system making a transition from one equilibrium state to another under the action of an external time dependent force, the work probability distribution is in general asymmetric.

  16. Transition Probability and the ESR Experiment

    Science.gov (United States)

    McBrierty, Vincent J.

    1974-01-01

    Discusses the use of a modified electron spin resonance apparatus to demonstrate some features of the expression for the transition probability per second between two energy levels. Applications to the third year laboratory program are suggested. (CC)

  17. Transition Probability Estimates for Reversible Markov Chains

    OpenAIRE

    Telcs, Andras

    2000-01-01

    This paper provides transition probability estimates of transient reversible Markov chains. The key condition of the result is the spatial symmetry and polynomial decay of the Green's function of the chain.

  18. The Animism Controversy Revisited: A Probability Analysis

    Science.gov (United States)

    Smeets, Paul M.

    1973-01-01

    Considers methodological issues surrounding the Piaget-Huang controversy. A probability model, based on the difference between the expected and observed animistic and deanimistic responses is applied as an improved technique for the assessment of animism. (DP)

  19. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique.The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc, all wrapped up in the scientific method.See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698)* Incorporates more than 1,000 engaging problems with answers* Includes more than 300 solved examples* Uses varied problem solving methods

  20. Improving Sensitivity to Weak Pulsations with Photon Probability Weighting

    CERN Document Server

    Kerr, Matthew

    2011-01-01

    All gamma-ray telescopes suffer from source confusion due to their inability to focus incident high-energy radiation, and the resulting background contamination can obscure the periodic emission from faint pulsars. In the context of the Fermi Large Area Telescope, we outline enhanced statistical tests for pulsation in which each photon is weighted by its probability to have originated from the candidate pulsar. The probabilities are calculated using the instrument response function and a full spectral model, enabling powerful background rejection. With Monte Carlo methods, we demonstrate that the new tests increase the sensitivity to pulsars by more than 50% under a wide range of conditions. This improvement may appreciably increase the completeness of the sample of radio-loud gamma-ray pulsars. Finally, we derive the asymptotic null distribution for the H-test, expanding its domain of validity to arbitrarily complex light curves.

  1. On the Low False Positive Probabilities of Kepler Planet Candidates

    CERN Document Server

    Morton, Timothy D

    2011-01-01

    We present a framework to conservatively estimate the probability that any particular planet-like transit signal observed by the Kepler mission is in fact a planet, prior to any ground-based follow-up efforts. We use Monte Carlo methods based on stellar population synthesis and Galactic structure models, and we provide empirical analytic fits to our results that may be applied to the as-yet-unconfirmed Kepler candidates. We find that the false positive probability for candidates that pass preliminary Kepler vetting procedures is generally 20% to < 2%, assuming a continuous power law for the planet mass function with index alpha = -1.5. Since Kepler will detect many more planetary signals than can be positively confirmed with ground-based follow-up efforts in the near term, these calculations will be crucial to using the ensemble of Kepler data to determine population characteristics of planetary systems.

  2. Ground state occupation probabilities of neutrinoless double beta decay candidates

    Science.gov (United States)

    Kotila, Jenni; Barea, Jose

    2015-10-01

    A better understanding of nuclear structure can offer important constraints on the calculation of 0 νββ nuclear matrix elements. A simple way to consider differences between initial and final states of neutrinoless double beta decay candidates is to look at the ground state occupation probabilities of the initial and final nuclei. As is well known, the microscopic interacting boson model (IBM-2) has been found to be very useful in the description of detailed aspects of nuclear structure. In this talk I will present results for ground state occupation probabilities obtained using IBM-2 for several interesting candidates of 0 νββ decay. Comparison with recent experimental results is also made. This work was supported by the Academy of Finland (Project 266437) and the Chilean Ministry of Education (Fondecyt Grant No. 1150564).

  3. The Dental Trauma Internet Calculator

    DEFF Research Database (Denmark)

    Gerds, Thomas Alexander; Lauridsen, Eva Fejerskov; Christensen, Søren Steno Ahrensburg;

    2012-01-01

    Background/Aim Prediction tools are increasingly used to inform patients about the future dental health outcome. Advanced statistical methods are required to arrive at unbiased predictions based on follow-up studies. Material and Methods The Internet risk calculator at the Dental Trauma Guide......) in the period between 1972 and 1991. Subgroup analyses and estimates of event probabilities were based on the Kaplan-Meier and the Aalen-Johansen method. Results The Internet risk calculator shows individualized prognoses for the short and long-term healing outcome of traumatized teeth with the following...... were based on the tooth’s root development stage and other risk factors at the time of the injury. Conclusions This article explains the data base, the functionality and the statistical approach of the Internet risk calculator....
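
    The Kaplan-Meier method used by the risk calculator multiplies, at each observed event time, the fraction of at-risk teeth that pass that time without the event. A generic textbook sketch with invented follow-up data (not the Dental Trauma Guide records):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate: S(t) = product over event times t_i <= t
    of (1 - d_i / n_i), where d_i is the number of events at t_i and
    n_i is the number still at risk just before t_i."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, ev in data if tt == t and ev)
        if deaths:
            survival *= 1.0 - deaths / at_risk
            curve.append((t, survival))
        removed = sum(1 for tt, _ev in data if tt == t)  # events + censorings at t
        at_risk -= removed
        i += removed
    return curve

# Hypothetical follow-up times in years; True = event observed, False = censored
curve = kaplan_meier([1, 2, 2, 3, 5], [True, True, False, False, True])
print(curve)
```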

  4. Avoiding Negative Probabilities in Quantum Mechanics

    OpenAIRE

    2013-01-01

    As currently understood since its discovery, the bare Klein-Gordon theory consists of negative quantum probabilities which are considered to be physically meaningless if not outright obsolete. Despite this annoying setback, these negative probabilities are what led the great Paul Dirac in 1928 to the esoteric discovery of the Dirac Equation. The Dirac Equation led to one of the greatest advances in our understanding of the physical world. In this reading, we ask the seemingly senseless questi...

  5. Probability, clinical decision making and hypothesis testing

    Directory of Open Access Journals (Sweden)

    A Banerjee

    2009-01-01

    Few clinicians grasp the true concept of probability expressed in the 'P value.' For most, a statistically significant P value is the end of the search for truth. In fact, the opposite is the case. The present paper attempts to put the P value in proper perspective by explaining different types of probabilities, their role in clinical decision making, medical research and hypothesis testing.

  6. Breakdown Point Theory for Implied Probability Bootstrap

    OpenAIRE

    Lorenzo Camponovo; Taisuke Otsu

    2011-01-01

    This paper studies robustness of bootstrap inference methods under moment conditions. In particular, we compare the uniform weight and implied probability bootstraps by analyzing behaviors of the bootstrap quantiles when outliers take arbitrarily large values, and derive the breakdown points for those bootstrap quantiles. The breakdown point properties characterize the situation where the implied probability bootstrap is more robust than the uniform weight bootstrap against outliers. Simulati...

  7. Characteristic Functions over C*-Probability Spaces

    Institute of Scientific and Technical Information of China (English)

    王勤; 李绍宽

    2003-01-01

    Various properties of the characteristic functions of random variables in a non-commutative C*-probability space are studied in this paper. It turns out that the distributions of random variables are uniquely determined by their characteristic functions. By using the properties of characteristic functions, a central limit theorem for a sequence of independent identically distributed random variables in a C*-probability space is established as well.

  8. The Pauli equation for probability distributions

    International Nuclear Information System (INIS)

    The tomographic-probability distribution for a measurable coordinate and spin projection is introduced to describe quantum states as an alternative to the density matrix. An analogue of the Pauli equation for the spin-1/2 particle is obtained for such a probability distribution instead of the usual equation for the wavefunction. Examples of the tomographic description of Landau levels and coherent states of a charged particle moving in a constant magnetic field are presented. (author)

  9. The Pauli equation for probability distributions

    Energy Technology Data Exchange (ETDEWEB)

    Mancini, S. [INFM, Dipartimento di Fisica, Universita di Milano, Milan (Italy). E-mail: Stefano.Mancini@mi.infn.it; Man' ko, O.V. [P.N. Lebedev Physical Institute, Moscow (Russian Federation). E-mail: Olga.Manko@sci.lebedev.ru; Man' ko, V.I. [INFM, Dipartimento di Matematica e Fisica, Universita di Camerino, Camerino (Italy). E-mail: Vladimir.Manko@sci.lebedev.ru; Tombesi, P. [INFM, Dipartimento di Matematica e Fisica, Universita di Camerino, Camerino (Italy). E-mail: Paolo.Tombesi@campus.unicam.it

    2001-04-27

    The tomographic-probability distribution for a measurable coordinate and spin projection is introduced to describe quantum states as an alternative to the density matrix. An analogue of the Pauli equation for the spin-1/2 particle is obtained for such a probability distribution instead of the usual equation for the wavefunction. Examples of the tomographic description of Landau levels and coherent states of a charged particle moving in a constant magnetic field are presented. (author)

  10. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

    Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: ""to present the mathematical analysis underlying probability results"" Special emphases on simulation and discrete decision theory Mathematically-rich, but self-contained text, at a gentle pace Review of calculus and linear algebra in an appendix Mathematical interludes (in each chapter) which examine mathematical techniques in the context of probabilistic or statistical importance Numerous section exercises, summaries, historical notes, and Further Readings for reinforcem

  11. Ruin Probability in Linear Time Series Model

    Institute of Scientific and Technical Information of China (English)

    ZHANG Lihong

    2005-01-01

    This paper analyzes a continuous time risk model with a linear model used to model the claim process. The time is discretized stochastically using the times when claims occur, using Doob's stopping time theorem and martingale inequalities to obtain expressions for the ruin probability as well as both exponential and non-exponential upper bounds for the ruin probability for an infinite time horizon. Numerical results are included to illustrate the accuracy of the non-exponential bound.
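
    Ruin probabilities and their bounds are often checked by simulation. The following generic sketch estimates a finite-horizon ruin probability for a simple surplus process with i.i.d. exponential claims and Poisson arrivals; it is a standard classical-risk-model illustration, not the paper's linear time series claim model:

```python
import random

def ruin_probability(u, premium_rate, claim_mean, n_claims, n_paths=5000, seed=1):
    """Monte Carlo estimate of the probability that the surplus process
    u + premiums - claims drops below zero within the first n_claims claims.
    Assumes i.i.d. exponential claims and unit-rate Poisson arrivals."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        surplus = u
        for _ in range(n_claims):
            surplus += premium_rate * rng.expovariate(1.0)  # premiums until next claim
            surplus -= rng.expovariate(1.0 / claim_mean)    # claim amount
            if surplus < 0:
                ruined += 1
                break
    return ruined / n_paths

# Hypothetical parameters: initial capital 5, 50% premium loading
est = ruin_probability(u=5.0, premium_rate=1.5, claim_mean=1.0, n_claims=50)
print(round(est, 3))
```

As expected, raising the initial capital u drives the estimate toward zero.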

  12. Representing Uncertainty by Probability and Possibility

    DEFF Research Database (Denmark)

    Uncertain parameters in modeling are usually represented by probability distributions reflecting either the objective uncertainty of the parameters or the subjective belief held by the model builder. This approach is particularly suited for representing the statistical nature or variance of uncer...

  13. Atomic transition probabilities of neutral samarium

    International Nuclear Information System (INIS)

    Absolute atomic transition probabilities from a combination of new emission branching fraction measurements using Fourier transform spectrometer data with radiative lifetimes from recent laser induced fluorescence measurements are reported for 299 lines of the first spectrum of samarium (Sm I). Improved values for the upper and lower energy levels of these lines are also reported. Comparisons to published transition probabilities from earlier experiments show satisfactory and good agreement with two of the four published data sets. (paper)

  14. Generalized couplings and convergence of transition probabilities

    OpenAIRE

    Kulik, Alexei; Scheutzow, Michael

    2015-01-01

    We provide sufficient conditions for the uniqueness of an invariant measure of a Markov process as well as for the weak convergence of transition probabilities to the invariant measure. Our conditions are formulated in terms of generalized couplings. We apply our results to several SPDEs for which unique ergodicity has been proven in a recent paper by Glatt-Holtz, Mattingly, and Richards and show that under essentially the same assumptions the weak convergence of transition probabilities actu...

  15. Country Default Probabilities: Assessing and Backtesting

    OpenAIRE

    Vogl, Konstantin; Maltritz, Dominik; Huschens, Stefan; Karmann, Alexander

    2006-01-01

    We address the problem how to estimate default probabilities for sovereign countries based on market data of traded debt. A structural Merton-type model is applied to a sample of emerging market and transition countries. In this context, only few and heterogeneous default probabilities are derived, which is problematic for backtesting. To deal with this problem, we construct likelihood ratio test statistics and quick backtesting procedures.
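The structural Merton-type approach the abstract refers to maps a country's (or firm's) asset value and debt into a default probability. A minimal sketch of the textbook Merton formula follows; the function name and all parameter values are illustrative assumptions, not the paper's calibration:

```python
import math

def merton_default_probability(V, D, mu, sigma, T):
    """Probability that asset value V falls below the face value of debt D
    at horizon T, assuming geometric Brownian motion with drift mu and
    volatility sigma (textbook Merton model; illustrative only)."""
    d2 = (math.log(V / D) + (mu - 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    # standard normal CDF of -d2, via the error function
    return 0.5 * (1.0 + math.erf(-d2 / math.sqrt(2.0)))

# Example: assets 20% above debt, 5% drift, 30% volatility, 1-year horizon
pd_1y = merton_default_probability(V=120.0, D=100.0, mu=0.05, sigma=0.30, T=1.0)
```

Higher asset volatility raises the default probability, which is the qualitative behaviour any such structural model must reproduce.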

  16. Transition probability studies in 175Au

    OpenAIRE

    Grahn, Tuomas; Watkins, H.; Joss, David; Page, Robert; Carroll, R. J.; Dewald, A.; Greenlees, Paul; Hackstein, M.; Herzberg, Rolf-Dietmar; Jakobsson, Ulrika; Jones, Peter; Julin, Rauno; Juutinen, Sakari; Ketelhut, Steffen; Kröll, Th

    2013-01-01

    Transition probabilities have been measured between the low-lying yrast states in 175Au by employing the recoil distance Doppler-shift method combined with the selective recoil-decay tagging technique. Reduced transition probabilities and magnitudes of transition quadrupole moments have been extracted from measured lifetimes, allowing dramatic changes in nuclear structure within a low excitation-energy range to be probed. The transition quadrupole moment data are discussed in terms...

  17. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporating uncertainties in weapons certification are subject to rigorous external peer review, and in this regard certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  18. Site occupancy models with heterogeneous detection probabilities

    Science.gov (United States)

    Royle, J. Andrew

    2006-01-01

    Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these ?site occupancy? models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123?1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
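The zero-inflated binomial mixture likelihood described above can be sketched for one common choice of mixing distribution, a Beta on the detection probability p, which makes the integral closed-form (a beta-binomial). This is a minimal illustration; the function names and parameter values are ours, not from the article:

```python
import math

def beta_binom_pmf(y, J, a, b):
    """P(y detections in J visits) when the detection probability p is
    Beta(a, b)-distributed across sites (beta-binomial pmf)."""
    log_beta = lambda x, z: math.lgamma(x) + math.lgamma(z) - math.lgamma(x + z)
    return math.comb(J, y) * math.exp(log_beta(y + a, J - y + b) - log_beta(a, b))

def zib_likelihood(y, J, psi, a, b):
    """Zero-inflated (beta-)binomial likelihood for one site: the site is
    occupied with probability psi; an unoccupied site yields y = 0."""
    lik = psi * beta_binom_pmf(y, J, a, b)
    if y == 0:
        lik += 1.0 - psi  # extra mass at zero from unoccupied sites
    return lik
```

Summing `zib_likelihood` over y = 0..J gives 1, and the excess mass at y = 0 relative to the plain beta-binomial is the zero inflation contributed by unoccupied sites.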

  19. Generic Degraded Configuration Probability Analysis for DOE Codisposal Waste Package

    Energy Technology Data Exchange (ETDEWEB)

    S.F.A. Deng; M. Saglam; L.J. Gratton

    2001-05-23

    In accordance with the technical work plan, "Technical Work Plan For: Department of Energy Spent Nuclear Fuel Work Packages" (CRWMS M&O 2000c), this Analysis/Model Report (AMR) is developed for the purpose of screening out degraded configurations for U.S. Department of Energy (DOE) spent nuclear fuel (SNF) types. It applies the degraded-configuration parameter and probability evaluations of the overall methodology specified in the "Disposal Criticality Analysis Methodology Topical Report" (YMP 2000, Section 3) to qualifying configurations. Degradation analyses are performed to assess realizable parameter ranges and physical regimes for configurations. Probability calculations are then performed for configurations characterized by k_eff in excess of the Critical Limit (CL). The scope of this document is to develop a generic set of screening criteria or models to screen out degraded configurations having the potential to exceed a criticality limit. The developed screening criteria include arguments based on physical/chemical processes as well as probability calculations, and apply to DOE SNF types when codisposed with high-level waste (HLW) glass inside a waste package. The degradation takes place inside the waste package and occurs long after repository licensing has expired. The emphasis of this AMR is on degraded-configuration screening, and the probability analysis is one of the approaches used for screening. The intended use of the model is to apply the developed screening criteria to each DOE SNF type following the completion of the degraded-mode criticality analysis internal to the waste package.

  20. HENRY'S LAW CALCULATOR

    Science.gov (United States)

    On-Site was developed to provide modelers and model reviewers with prepackaged tools ("calculators") for performing site assessment calculations. The philosophy behind OnSite is that the convenience of the prepackaged calculators helps provide consistency for simple calculations,...

  1. Analysis of probability of defects in the disposal canisters

    International Nuclear Information System (INIS)

    This report presents a probability model for the reliability of the spent nuclear waste final disposal canister. Reliability here means that the welding of the canister lid has no defects that are critical from the long-term safety point of view. From this standpoint, the reliability of the welding process (that no critical defects are produced) and of the non-destructive testing (NDT) process (that all critical defects are detected) are equally important. In the probability model, critical defects in a weld were simplified into a few types. The possibility of human error in the NDT process was also taken into account in a simple manner. At present there is very little representative data for determining the reliability of welding, and the available NDT data is not well suited to the needs of this study; the calculations presented here are therefore based on expert judgement and on several assumptions that have not yet been verified. The Bayesian probability model shows the importance of uncertainty in the estimation of the reliability parameters. The effect of this uncertainty is that the probability distribution of the number of defective canisters becomes flatter for larger numbers of canisters than the binomial probability distribution obtained with known parameter values. To reduce the uncertainty, more information is needed on the reliability of both the welding and NDT processes. It would also be important to analyse the role of human factors in these processes, since their role is not reflected in the typical test data used to estimate 'normal process variation'. The reported model should be seen as a tool to quantify the roles of different methods and procedures in the weld inspection process. (orig.)
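The flattening effect the abstract describes is easy to see numerically: if the per-canister defect probability is itself uncertain, say Beta-distributed with the same mean (our illustrative prior, not values from the report), the variance of the defect count grows well beyond the known-parameter binomial value.

```python
# Illustrative numbers only, not values from the report
n, p = 1000, 0.01                 # canisters, point estimate of defect probability

# Known-parameter case: defect count ~ Binomial(n, p)
var_known = n * p * (1 - p)

# Uncertain-parameter case: p ~ Beta(a, b) with the same mean p = a/(a+b),
# giving a beta-binomial (flatter) defect-count distribution
a, b = 1.0, 99.0
mean_bb = n * a / (a + b)
var_bb = n * a * b * (a + b + n) / ((a + b) ** 2 * (a + b + 1))
```

With these numbers the mean defect count is unchanged (10 canisters) while the variance grows by an order of magnitude, which is exactly the flattening relative to the binomial that the report attributes to parameter uncertainty.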

  2. Estimating probable flaw distributions in PWR steam generator tubes

    Energy Technology Data Exchange (ETDEWEB)

    Gorman, J.A.; Turner, A.P.L. [Dominion Engineering, Inc., McLean, VA (United States)

    1997-02-01

    This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary to secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the numbers of tubes with detectable flaws of various types as a function of time, and (2) the distributions in size of these flaws. Results are provided for hypothetical severely affected, moderately affected and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses.

  3. A local collision probability approximation for predicting momentum transfer cross sections.

    Science.gov (United States)

    Bleiholder, Christian

    2015-10-21

    The local collision probability approximation (LCPA) method is introduced to compute molecular momentum transfer cross sections for comparison to ion mobility experiments. The LCPA replaces the (non-local) scattering trajectory used in the trajectory method to describe the collision process by a (local) collision probability function. This momentum transfer probability is computed using the exact same analyte-buffer interaction potential as used in the trajectory method. Subsequently, the momentum transfer cross section Ω_LCPA(T) is calculated in a projection-type manner (corrected for shape effects through a shape factor). Benchmark calculations on a set of 208 carbon clusters with a range of molecular size and degree of concavity demonstrate that LCPA and trajectory calculations agree closely with one another. The results discussed here indicate that the LCPA is suitable to efficiently calculate momentum transfer cross sections for use in ion mobility spectrometry in conjunction with different buffer gases.

  4. Evolvement simulation of the probability of neutron-initiating persistent fission chain

    International Nuclear Information System (INIS)

    Background: The probability of a neutron initiating a persistent fission chain, which has to be calculated in analyses of criticality safety, reactor start-up, and burst waiting time and bursting time on pulse reactors, is an inherent parameter of a multiplying assembly. Purpose: We aim to derive a time-dependent integro-differential equation for this probability in relative velocity space from probability conservation, and to develop the deterministic code Dynamic Segment Number Probability (DSNP) based on the multi-group SN method. Methods: The convergence of the dynamic calculation was analyzed, and the evolution of the dynamic probability for varying concentration was simulated numerically under different initial conditions. Results: For Highly Enriched Uranium (HEU) bare spheres, when the time is long enough the results of the dynamic calculation approach those of the static calculation; the largest difference between DSNP and the Partisn code is less than 2%. For the Baker model, over a range of about 1 μs after first criticality, the largest difference between the dynamic and static calculations is about 300%. For a supercritical system, finite fission chains decrease and persistent fission chains increase as the reactivity grows; the dynamic evolution curve of the initiation probability stays within 5% of the static curve once Keff exceeds 1.2. The cumulative probability curve likewise shows the difference between the dynamic and static integral results decreasing from 35% to 5% as Keff increases; the former difference (35%) reflects the substantial discrepancy between dynamic and static results near first criticality. The DSNP code agrees well with the Partisn code. Conclusions: There are large numbers of finite

  5. Laboratory-tutorial activities for teaching probability

    Directory of Open Access Journals (Sweden)

    Roger E. Feeley

    2006-08-01

    Full Text Available We report on the development of students’ ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain “touchstone” examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler’s fallacy, and use expectations about ensemble results rather than expectation values to predict future events. We map the development of their thinking to provide examples of problems rather than evidence of a curriculum’s success.
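The target concept of "ratio of time in a region to total time" can be made concrete with the classical harmonic oscillator, a standard touchstone example; the code below is our illustration, not material from the course:

```python
import math

def time_fraction(x1, x2, A=1.0, steps=100_000):
    """Fraction of a half period that the oscillator x(t) = A*cos(t)
    spends with x in [x1, x2], found by sampling the trajectory in time."""
    inside = 0
    for i in range(steps):
        t = (i + 0.5) * math.pi / steps   # midpoints of one half period
        if x1 <= A * math.cos(t) <= x2:
            inside += 1
    return inside / steps

def analytic_fraction(x1, x2, A=1.0):
    """Same quantity from the classical probability density
    P(x) = 1/(pi*sqrt(A^2 - x^2))."""
    cdf = lambda x: math.asin(x / A) / math.pi
    return cdf(x2) - cdf(x1)
```

For example, `time_fraction(-0.5, 0.5)` agrees with `analytic_fraction(-0.5, 0.5)` (exactly 1/3), while the fraction near a turning point such as [0.9, 1.0] is larger than near the centre, which is the "edge effect" students notice: the oscillator moves slowly where it spends the most time.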

  6. Analytical formulas for calculating the blocking probability of a dynamic star network

    Institute of Scientific and Technical Information of China (English)

    Jiajia Chen; Xiang Lü; Sailing He

    2005-01-01

    For dynamic routing and wavelength assignment (RWA), a star topology is shown to be more efficient than a ring topology. Analytical formulas for the blocking probability under dynamic RWA in a star network are presented and verified by simulation.
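The abstract does not reproduce the paper's star-network formulas, but the classical Erlang B recursion is the standard building block for this kind of blocking-probability calculation on a single link, and gives the flavor (a hedged sketch with illustrative numbers, not the paper's derivation):

```python
def erlang_b(servers, traffic):
    """Erlang B blocking probability via the numerically stable recursion
    B(0) = 1,  B(m) = a*B(m-1) / (m + a*B(m-1)),
    where a is the offered traffic in Erlangs (classical formula)."""
    b = 1.0
    for m in range(1, servers + 1):
        b = traffic * b / (m + traffic * b)
    return b

# Example: 8 wavelengths on a link carrying 4 Erlangs of offered traffic
blocking = erlang_b(8, 4.0)
```

Blocking falls rapidly as wavelengths are added at fixed load, which is the kind of trade-off such analytical formulas make cheap to explore compared with simulation.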

  7. Optimization and Calculation of Probability Performances of Processes of Storage and Processing of Refrigerator Containerized Cargoes

    Science.gov (United States)

    Nyrkov, A. P.; Sokolov, S. S.; Chernyi, S. G.; Shnurenko, A. A.; Pavlova, L. A.

    2016-08-01

    This work considers a disconnected multi-channel queueing system that receives irregular, uniform or non-uniform flows of requests with an unlimited latency period. The system is illustrated by the example of a container terminal with conditionally functional sections of a definite mark-to-space ratio, on which an irregular, inhomogeneous traffic flow with a resultant intensity acts.

  8. Critical Assessment of Theoretical Calculations of Atomic Structure and Transition Probabilities: An Experimenter’s View

    OpenAIRE

    Elmar Träbert

    2014-01-01

    The interpretation of atomic observations by theory and the testing of computational predictions by experiment are interactive processes. It is necessary to gain experience with “the other side” before claims of achievement can be validated and judged. The discussion covers some general problems in the field as well as many specific examples, mostly organized by isoelectronic sequence, of what level of accuracy recently has been reached or which atomic structure or level lifetime problem need...

  9. UMTS Uplink Loading Probability Calculation Using Log-Normal Interferers Contributions

    OpenAIRE

    Mosleh M. Al-Harthi

    2012-01-01

    In this paper we introduce the probabilistic notion of uplink loading in a UMTS network subject to uplink interferers from both the home cell and other neighbouring cells. Our study is based on the assumption that, for a given UMTS cell, all the interferers in the uplink are log-normally distributed and that the Gaussian noise is also taken into account. The latter is considered as a constant contribution to be added to the sum of all interferers that are summed up using Wilkinson’s [1] calcu...

  10. Critical Assessment of Theoretical Calculations of Atomic Structure and Transition Probabilities: An Experimenter’s View

    Directory of Open Access Journals (Sweden)

    Elmar Träbert

    2014-03-01

    Full Text Available The interpretation of atomic observations by theory and the testing of computational predictions by experiment are interactive processes. It is necessary to gain experience with “the other side” before claims of achievement can be validated and judged. The discussion covers some general problems in the field as well as many specific examples, mostly organized by isoelectronic sequence, of what level of accuracy recently has been reached or which atomic structure or level lifetime problem needs more attention.

  11. Probability distribution function for reorientations in Maier-Saupe potential

    Science.gov (United States)

    Sitnitsky, A. E.

    2016-06-01

    Exact analytic solution for the probability distribution function of the non-inertial rotational diffusion equation, i.e., of the Smoluchowski one, in a symmetric Maier-Saupe uniaxial potential of mean torque is obtained via the confluent Heun's function. Both the ordinary Maier-Saupe potential and the double-well one with variable barrier width are considered. Thus, the present article substantially extends the scope of the potentials amenable to the treatment by reducing Smoluchowski equation to the confluent Heun's one. The solution is uniformly valid for any barrier height. We use it for the calculation of the mean first passage time. Also the higher eigenvalues for the relaxation decay modes in the case of ordinary Maier-Saupe potential are calculated. The results obtained are in full agreement with those of the approach developed by Coffey, Kalmykov, Déjardin and their coauthors in the whole range of barrier heights.
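The mean first passage time referred to above has, for a one-dimensional Smoluchowski equation with potential U(x), diffusion coefficient D, a reflecting boundary at x_min and an absorbing one at x_b, the standard textbook double-integral form (written schematically here; the rotational problem treated in the article carries an additional sin θ metric factor):

```latex
\tau(x_0) \;=\; \frac{1}{D}\int_{x_0}^{x_b} e^{\beta U(x)}
  \left[ \int_{x_{\min}}^{x} e^{-\beta U(y)}\, dy \right] dx,
\qquad \beta = \frac{1}{k_B T}.
```

For a high barrier this integral reduces to the familiar Kramers-type escape rate, which is the regime where the exact Heun-function solution and the approximate approaches cited in the abstract can be compared.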

  12. Channel Capacity Estimation using Free Probability Theory

    CERN Document Server

    Ryan, Øyvind

    2007-01-01

    In many channel measurement applications, one needs to estimate some characteristics of the channels based on a limited set of measurements. This is mainly due to the highly time varying characteristics of the channel. In this contribution, it will be shown how free probability can be used for channel capacity estimation in MIMO systems. Free probability has already been applied in various application fields such as digital communications, nuclear physics and mathematical finance, and has been shown to be an invaluable tool for describing the asymptotic behaviour of many systems when the dimensions of the system get large (i.e. the number of antennas). In particular, introducing the notion of free deconvolution, we provide hereafter an asymptotically (in the number of antennas) unbiased capacity estimator (w.r.t. the number of observations) for MIMO channels impaired with noise. Another unbiased estimator (for any number of observations) is also constructed by slightly modifying the free probability based est...

  13. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  14. Consistent probabilities in loop quantum cosmology

    CERN Document Server

    Craig, David A

    2013-01-01

    A fundamental issue for any quantum cosmological theory is to specify how probabilities can be assigned to various quantum events or sequences of events such as the occurrence of singularities or bounces. In previous work, we have demonstrated how this issue can be successfully addressed within the consistent histories approach to quantum theory for Wheeler-DeWitt-quantized cosmological models. In this work, we generalize that analysis to the exactly solvable loop quantization of a spatially flat, homogeneous and isotropic cosmology sourced with a massless, minimally coupled scalar field known as sLQC. We provide an explicit, rigorous and complete decoherent histories formulation for this model and compute the probabilities for the occurrence of a quantum bounce vs. a singularity. Using the scalar field as an emergent internal time, we show for generic states that the probability for a singularity to occur in this model is zero, and that of a bounce is unity, complementing earlier studies of the expectation v...

  15. Ignition probabilities for Compact Ignition Tokamak designs

    International Nuclear Information System (INIS)

    A global power balance code employing Monte Carlo techniques has been developed to study the 'probability of ignition' and has been applied to several different configurations of the Compact Ignition Tokamak (CIT). Probability distributions for the critical physics parameters in the code were estimated using existing experimental data, including a statistical evaluation of the uncertainty in extrapolating the energy confinement time. A substantial probability of ignition is predicted for CIT if peaked density profiles can be achieved or if one of the two higher-plasma-current configurations is employed. In other cases, values of the energy multiplication factor Q of order 10 are generally obtained. The Ignitor-U and ARIES designs are also examined briefly. Comparisons of our empirically based confinement assumptions with two theory-based transport models yield conflicting results. 41 refs., 11 figs
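A Monte Carlo "probability of ignition" estimate of the kind described can be sketched as follows. Everything here is a toy: the distributions, the 0.8 prefactor, and the unit threshold are made-up stand-ins for the code's physics parameters, not CIT data:

```python
import random

def ignition_probability(trials=20_000, seed=1):
    """Toy Monte Carlo in the spirit of a global power-balance study:
    sample uncertain physics multipliers and count the fraction of
    samples whose figure of merit exceeds an assumed ignition threshold.
    All numbers are illustrative, not CIT parameters."""
    rng = random.Random(seed)
    threshold = 1.0                                # assumed ignition threshold
    ignited = 0
    for _ in range(trials):
        h_factor = rng.lognormvariate(0.0, 0.3)    # confinement-time uncertainty
        density_peaking = rng.uniform(1.0, 1.5)    # profile uncertainty
        figure_of_merit = 0.8 * h_factor * density_peaking
        if figure_of_merit >= threshold:
            ignited += 1
    return ignited / trials

p_ign = ignition_probability()
```

The output is the fraction of sampled parameter sets that ignite; in the real study the sampled quantities are the critical physics parameters and the threshold comes from the power balance, but the estimator has this same counting structure.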

  16. 7th High Dimensional Probability Meeting

    CERN Document Server

    Mason, David; Reynaud-Bouret, Patricia; Rosinski, Jan

    2016-01-01

    This volume collects selected papers from the 7th High Dimensional Probability meeting held at the Institut d'Études Scientifiques de Cargèse (IESC) in Corsica, France. High Dimensional Probability (HDP) is an area of mathematics that includes the study of probability distributions and limit theorems in infinite-dimensional spaces such as Hilbert spaces and Banach spaces. The most remarkable feature of this area is that it has resulted in the creation of powerful new tools and perspectives, whose range of application has led to interactions with other subfields of mathematics, statistics, and computer science. These include random matrices, nonparametric statistics, empirical processes, statistical learning theory, concentration of measure phenomena, strong and weak approximations, functional estimation, combinatorial optimization, and random graphs. The contributions in this volume show that HDP theory continues to thrive and develop new tools, methods, techniques and perspectives to analyze random phenome...

  17. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.

  18. A Revisit to Probability - Possibility Consistency Principles

    Directory of Open Access Journals (Sweden)

    Mamoni Dhar

    2013-03-01

    Full Text Available In this article, our main intention is to highlight the fact that the links between probability and possibility established by different authors at different times on the basis of some well-known consistency principles cannot provide the desired result. The paper therefore discusses some prominent works on transformations between probability and possibility and finally suggests a new principle, since none of the existing ones yields a unique transformation. The new consistency principle suggested here would in turn replace the others in the literature by providing a reliable estimate of the consistency between the two. Furthermore, some properties of the entropy of fuzzy numbers are also presented in this article.

  19. Uncertainty the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...

  20. Probabilities and Signalling in Quantum Field Theory

    CERN Document Server

    Dickinson, Robert; Millington, Peter

    2016-01-01

    We present an approach to computing probabilities in quantum field theory for a wide class of source-detector models. The approach works directly with probabilities and not with squared matrix elements, and the resulting probabilities can be written in terms of expectation values of nested commutators and anti-commutators. We present results that help in the evaluation of these, including an expression for the vacuum expectation values of general nestings of commutators and anti-commutators in scalar field theory. This approach allows one to see clearly how faster-than-light signalling is prevented, because it leads to a diagrammatic expansion in which the retarded propagator plays a prominent role. We illustrate the formalism using the simple case of the much-studied Fermi two-atom problem.