WorldWideScience

Sample records for calculating age-conditional probabilities

  1. Estimating age conditional probability of developing disease from surveillance data

    Directory of Open Access Journals (Sweden)

    Fay Michael P

    2004-07-01

    Fay, Pfeiffer, Cronin, Le, and Feuer (Statistics in Medicine 2003; 22: 1837–1848) developed a formula to calculate the age-conditional probability of developing a disease for the first time (ACPDvD) for a hypothetical cohort. The novelty of the formula of Fay et al. (2003) is that one need not know the rates of first incidence of disease per person-years alive and disease-free, but may input the rates of first incidence per person-years alive only. Similarly, the formula uses rates of death from disease and death from other causes per person-years alive. The rates per person-years alive are much easier to estimate than those per person-years alive and disease-free. Fay et al. (2003) used simple piecewise constant models for all three rate functions, which have constant rates within each age group. In this paper, we detail a method for estimating rate functions which does not have jumps at the beginning of age groupings, and need not be constant within age groupings. We call this method the mid-age group joinpoint (MAJ) model for the rates. The drawback of the MAJ model is that numerical integration must be used to estimate the resulting ACPDvD. To increase computational speed, we offer a piecewise approximation to the MAJ model, which we call the piecewise mid-age group joinpoint (PMAJ) model. The PMAJ model for the rates input into the formula for ACPDvD described in Fay et al. (2003) is the current method used in the freely available DevCan software made available by the National Cancer Institute.
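
    As a minimal, hedged illustration of the quantity being estimated (not the Fay et al. rate conversion, nor the MAJ/PMAJ models or the DevCan implementation), the sketch below evaluates the standard competing-risks integral for the age-conditional probability of developing disease, using hypothetical piecewise constant hazards per person-years alive and disease-free; the point of Fay et al.'s formula is precisely to avoid needing these disease-free denominators directly.

      import numpy as np

      # Hypothetical piecewise-constant hazards per person-year for 10-year
      # age groups 0-9, 10-19, ..., 90-99 (illustrative values only).
      ages = np.arange(0, 101, 10)   # age-group boundaries
      h_disease = np.array([1e-5, 1e-5, 5e-5, 1e-4, 5e-4, 1e-3, 2e-3, 3e-3, 4e-3, 4e-3])
      h_death = np.array([2e-3, 5e-4, 1e-3, 1.5e-3, 3e-3, 6e-3, 1.5e-2, 4e-2, 1e-1, 2e-1])

      def acpd(a, b, step=0.1):
          """P(first disease in (a, b] | alive and disease-free at age a)."""
          t = np.arange(a, b, step)
          idx = np.searchsorted(ages, t, side="right") - 1
          hd, hm = h_disease[idx], h_death[idx]
          # Disease-free survival from age a, left-endpoint rule.
          surv = np.exp(-np.concatenate(([0.0], np.cumsum((hd + hm) * step)[:-1])))
          return float(np.sum(hd * surv * step))

      print(f"P(disease in (50, 70] | healthy at 50) ~ {acpd(50.0, 70.0):.4f}")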

  2. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    Title 47 (Telecommunication), Vol. 1, 2010-10-01. FEDERAL COMMUNICATIONS COMMISSION, GENERAL PRACTICE AND PROCEDURE, Random Selection Procedures for Mass Media Services, General Procedures, § 1.1623 Probability calculation. (a) All calculations shall...

  3. Validation of fluorescence transition probability calculations

    OpenAIRE

    M. G. Pia (INFN, Sezione di Genova); P. Saracco (INFN, Sezione di Genova); Manju Sudhakar (INFN, Sezione di Genova)

    2015-01-01

    A systematic and quantitative validation of the K and L shell X-ray transition probability calculations according to different theoretical methods has been performed against experimental data. This study is relevant to the optimization of data libraries used by software systems, namely Monte Carlo codes, dealing with X-ray fluorescence. The results support the adoption of transition probabilities calculated according to the Hartree-Fock approach, which manifest better agreement with experimental data.

  4. Necessity of Exact Calculation for Transition Probability

    Institute of Scientific and Technical Information of China (English)

    LIU Fu-Sui; CHEN Wan-Fang

    2003-01-01

    This paper shows that exact calculation for transition probability can make some systems deviate from Fermi golden rule seriously. This paper also shows that the corresponding exact calculation of hopping rate induced by phonons for deuteron in Pd-D system with the many-body electron screening, proposed by Ichimaru, can explain the experimental fact observed in Pd-D system, and predicts that perfection and low-dimension of Pd lattice are very important for the phonon-induced hopping rate enhancement in Pd-D system.

  5. Calculation of radiative transition probabilities and lifetimes

    Science.gov (United States)

    Zemke, W. T.; Verma, K. K.; Stwalley, W. C.

    1982-01-01

    Procedures for calculating bound-bound and bound-continuum (free) radiative transition probabilities and radiative lifetimes are summarized. Calculations include rotational dependence and R-dependent electronic transition moments (no Franck-Condon or R-centroid approximation). Detailed comparisons of theoretical results with experimental measurements are made for bound-bound transitions in the A-X systems of LiH and Na2. New bound-free results are presented for LiH. New bound-free results and comparisons with very recent fluorescence experiments are presented for Na2.

  6. Pade approximant calculations for neutron escape probability

    International Nuclear Information System (INIS)

    The neutron escape probability from a non-multiplying slab containing an internal source is defined in terms of a functional relation for the scattering function of the diffuse reflection problem. The Pade approximant technique is used to get numerical results which are compared with exact results. (author)

  7. Calculating nuclear accident probabilities from empirical frequencies

    OpenAIRE

    Ha-Duong, Minh; Journé, V.

    2014-01-01

    Since there is no authoritative, comprehensive and public historical record of nuclear power plant accidents, we reconstructed a nuclear accident data set from peer-reviewed and other literature. We found that, in a sample of five random years, the worldwide historical frequency of a major nuclear accident, defined as an INES level 7 event, is 14 %. The probability of at least one nuclear accident rated at level ≥4 on the INES scale is 67 %. These numbers are subject...
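
    Under the abstract's quoted figures, the horizon scaling is a one-liner; the sketch below (assuming independent years, an assumption the authors' empirical approach does not itself require) applies the complement rule to the stated 14% yearly frequency.

      # P(at least one INES level 7 accident in n years) = 1 - (1 - p)^n,
      # with p taken from the abstract; independence across years is assumed.
      p = 0.14
      for n in (1, 5, 10, 20):
          print(f"{n:2d} years: {1 - (1 - p) ** n:.3f}")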

  8. Computational methods for probability of instability calculations

    Science.gov (United States)

    Wu, Y.-T.; Burnside, O. H.

    1990-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of a dynamic system that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the roots of the characteristic equation or Routh-Hurwitz test functions are investigated. Computational methods based on system reliability analysis methods and importance sampling concepts are proposed to perform efficient probabilistic analysis. Numerical examples are provided to demonstrate the methods.
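
    A brute-force Monte Carlo sketch of the idea (plain sampling rather than the paper's importance-sampling and system-reliability machinery, with hypothetical parameter distributions): for a second-order system the Routh-Hurwitz test reduces to a sign check on the coefficients, so the probability of instability is the probability mass on the unstable region.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000
      # Hypothetical uncertain coefficients of m*x'' + c*x' + k*x = 0.
      m = rng.normal(1.0, 0.05, n)    # mass
      c = rng.normal(0.08, 0.05, n)   # damping; may go negative
      k = rng.normal(4.0, 0.5, n)     # stiffness

      # For m > 0, Routh-Hurwitz reduces to: stable iff c > 0 and k > 0.
      unstable = (c <= 0) | (k <= 0)
      print("P(instability) ~", unstable.mean())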

  9. Cross section probability tables in multi-group transport calculations

    International Nuclear Information System (INIS)

    The use of cross section probability tables in multigroup transport calculations is presented. Emphasis is placed on how probability table parameters are generated in a multigroup cross section processor and how existing transport codes must be modifed to use them. In order to illustrate the accuracy obtained by using probability tables, results are presented for a variety of neutron and photon transport problems

  10. Calculating state-to-state transition probabilities within TDDFT

    OpenAIRE

    Rohringer, Nina; Peter, Simone; Burgdörfer, Joachim

    2005-01-01

    The determination of the elements of the S-matrix within the framework of time-dependent density-functional theory (TDDFT) has remained a widely open question. We explore two different methods to calculate state-to-state transition probabilities. The first method closely follows the extraction of the S-matrix from the time-dependent Hartree-Fock approximation. This method suffers from cross-channel correlations resulting in oscillating transition probabilities in the asymptotic channels. An a...

  11. Calculating the probability of detecting radio signals from alien civilizations

    CERN Document Server

    Horvat, Marko

    2006-01-01

    Although it might not be self-evident, it is in fact entirely possible to calculate the probability of detecting alien radio signals by understanding what types of extraterrestrial radio emissions can be expected and what properties these emissions can have. Using the Drake equation as the obvious starting point, and logically identifying and enumerating constraints of interstellar radio communications can yield the probability of detecting a genuine alien radio signal.
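
    The starting point the abstract names is just a product of factors; a toy evaluation (all factor values are placeholders, not the paper's) looks like this.

      # Drake-equation point estimate with placeholder factor values.
      R_star = 1.5                     # star formation rate, per year
      f_p, n_e = 0.9, 0.4              # planet fraction, habitable planets per system
      f_l, f_i, f_c = 0.5, 0.02, 0.1   # life, intelligence, communication fractions
      L = 1e4                          # communicative lifetime, years
      N = R_star * f_p * n_e * f_l * f_i * f_c * L
      print("communicating civilizations N ~", N)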

  12. Calculating the probability of detecting radio signals from alien civilizations

    OpenAIRE

    Horvat, Marko

    2007-01-01

    Although it might not be self-evident, it is in fact entirely possible to calculate the probability of detecting alien radio signals by understanding what types of extraterrestrial radio emissions can be expected and what properties these emissions can have. Using the Drake equation as the obvious starting point, and logically identifying and enumerating constraints of interstellar radio communications can yield the probability of detecting a genuine alien radio signal.

  13. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    The purpose of this article is to give a new approach to calculating the probability of returning a loan. A lot of factors affect the value of this probability. In this article, by using statistical and econometric models, some influencing factors are proved. The main approach is concerned with applying probit and logit models in loan management institutions. A new aspect of credit risk analysis is given. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable "the probability of returning a loan". It is proved that the month of signing a contract, the year of signing a contract, the gender and the age of the loan owner do not affect the probability of returning a loan. It is proved that the probability of returning a loan depends on the sum of the contract, the remoteness of the loan owner and the month of birth. The probability of returning a loan increases with the increase of the given sum, decreases with the proximity of the customer, increases for people born at the beginning of the year and decreases for people born at the end of the year.
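
    A minimal sketch of the kind of binary (logit) model the article applies, on synthetic data; the predictors mirror the article's fields (given sum, remoteness, month of birth), but the data, coefficients and library choice are illustrative assumptions, not the authors' estimates.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(1)
      n = 1000
      X = np.column_stack([
          rng.uniform(500, 20000, n),   # given sum
          rng.uniform(0, 300, n),       # remoteness of the borrower, km
          rng.integers(1, 13, n),       # month of birth
      ])
      # Synthetic outcome (returned = 1) with effects shaped like the article's
      # findings: probability rises with sum and remoteness, falls with month.
      logit = 0.0001 * X[:, 0] + 0.004 * X[:, 1] - 0.05 * X[:, 2] - 1.0
      y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

      model = LogisticRegression().fit(X, y)
      print("P(return) for sum=8000, 120 km, born March:",
            model.predict_proba([[8000.0, 120.0, 3.0]])[0, 1])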

  14. Fostering Positive Attitude in Probability Learning Using Graphing Calculator

    Science.gov (United States)

    Tan, Choo-Kim; Harji, Madhubala Bava; Lau, Siong-Hoe

    2011-01-01

    Although a plethora of research evidence highlights positive and significant outcomes of the incorporation of the Graphing Calculator (GC) in mathematics education, its use in the teaching and learning process appears to be limited. The obvious need to revisit the teaching and learning of Probability has resulted in this study, i.e. to incorporate…

  15. Calculation of the isotope cluster for polypeptides by probability grouping.

    Science.gov (United States)

    Olson, Matthew T; Yergey, Alfred L

    2009-02-01

    This paper presents a novel theoretical basis for accurately calculating the isotope cluster of polypeptides. In contrast to previous approaches to this problem, which consider exhaustive or near exhaustive combinations of isotopic species, the program, Neutron Cluster, groups probabilities to yield highly accurate information without elucidating any fine structure within a nominal mass unit. This is a fundamental difference from any previously described algorithm for calculating the isotope cluster. As a result of this difference, the accurate isotope clusters for high molecular weight polypeptides can be calculated rapidly without any pruning. When applied to isotope enriched polypeptides, the algorithm introduces "grouping error", which is described, quantified, and avoided by using probability partitioning. PMID:19026561
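
    For orientation (this is not the Neutron Cluster grouping algorithm, which avoids exhaustive combination work while keeping accuracy), the sketch below shows what an aggregated nominal-mass isotope cluster is: a convolution of per-atom isotope distributions, with approximate natural abundances and a hypothetical elemental composition.

      import numpy as np

      # Approximate natural abundances, indexed by nominal-mass offset from
      # the lightest isotope.
      ISOTOPES = {
          "C": [0.9893, 0.0107],
          "H": [0.999885, 0.000115],
          "N": [0.99636, 0.00364],
          "O": [0.99757, 0.00038, 0.00205],
          "S": [0.9499, 0.0075, 0.0425, 0.0, 0.0001],
      }

      def isotope_cluster(formula, max_peaks=12):
          """Relative nominal-mass peak intensities by repeated convolution."""
          dist = np.array([1.0])
          for elem, count in formula.items():
              for _ in range(count):
                  dist = np.convolve(dist, ISOTOPES[elem])[:max_peaks]
          return dist / dist.max()

      # Hypothetical peptide-sized composition: C50 H80 N14 O14 S
      print(np.round(isotope_cluster({"C": 50, "H": 80, "N": 14, "O": 14, "S": 1}), 3))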

  16. Selection of minimum earthquake intensity in calculating pipe failure probabilities

    International Nuclear Information System (INIS)

    In a piping reliability analysis, it is sometimes necessary to specify a minimum ground motion intensity, usually the peak acceleration, below which the ground motions are not considered as earthquakes and, hence, are neglected. The calculated probability of failure of a piping system is dependent on this selected minimum earthquake intensity chosen for the analysis. A study was conducted to determine the effects of the minimum earthquake intensity on the probability of pipe failure. The results indicated that the probability of failure of the piping system is not very sensitive to the variations of the selected minimum peak ground acceleration. However, it does have significant effects on various scenarios that make up the system failure

  17. Selection of minimum earthquake intensity in calculating pipe failure probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Lo, T.Y.

    1985-01-01

    In a piping reliability analysis, it is sometimes necessary to specify a minimum ground motion intensity, usually the peak acceleration, below which the ground motions are not considered as earthquakes and, hence, are neglected. The calculated probability of failure of a piping system is dependent on this selected minimum earthquake intensity chosen for the analysis. A study was conducted to determine the effects of the minimum earthquake intensity on the probability of pipe failure. The results indicated that the probability of failure of the piping system is not very sensitive to the variations of the selected minimum peak ground acceleration. However, it does have significant effects on various scenarios that make up the system failure.

  18. Calculating the probability of injected carbon dioxide plumes encountering faults

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, P.D.

    2011-04-01

    One of the main concerns of storage in saline aquifers is leakage via faults. In the early stages of site selection, site-specific fault coverages are often not available for these aquifers. This necessitates a method using available fault data to estimate the probability of injected carbon dioxide encountering and migrating up a fault. The probability of encounter can be calculated from areal fault density statistics from available data, and carbon dioxide plume dimensions from numerical simulation. Given a number of assumptions, the dimension of the plume perpendicular to a fault times the areal density of faults with offsets greater than some threshold of interest provides the probability of the plume encountering such a fault. Application of this result to a previously planned large-scale pilot injection in the southern portion of the San Joaquin Basin yielded a 3% and 7% chance of the plume encountering a fully seal-offsetting and a half seal-offsetting fault, respectively. Subsequently available data indicated a half seal-offsetting fault at a distance from the injection well that implied a 20% probability of encounter for a plume sufficiently large to reach it.
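
    The abstract's estimate is essentially one multiplication; the sketch below restates it with illustrative numbers (not the San Joaquin Basin values).

      # Expected number of seal-offsetting faults encountered ~ rho * w, where
      # rho is the areal density of qualifying fault traces (km of trace per
      # km^2) and w is the plume dimension perpendicular to fault strike (km);
      # for small values this approximates the encounter probability.
      rho = 0.023
      w = 3.0
      print("P(encounter) ~", rho * w)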

  19. On calculating the probability of a set of orthologous sequences

    Directory of Open Access Journals (Sweden)

    Junfeng Liu

    2009-02-01

    Junfeng Liu (1,2), Liang Chen (3), Hongyu Zhao (4), Dirk F Moore (1,2), Yong Lin (1,2), Weichung Joe Shih (1,2). 1 Biometrics Division, The Cancer Institute of New Jersey, New Brunswick, NJ, USA; 2 Department of Biostatistics, School of Public Health, University of Medicine and Dentistry of New Jersey, Piscataway, NJ, USA; 3 Department of Biological Sciences, University of Southern California, Los Angeles, CA, USA; 4 Department of Epidemiology and Public Health, Yale University School of Medicine, New Haven, CT, USA. Abstract: Probabilistic DNA sequence models have been intensively applied to genome research. Within the evolutionary biology framework, this article investigates the feasibility of rigorously estimating the probability of a set of orthologous DNA sequences which evolve from a common progenitor. We propose Monte Carlo integration algorithms to sample the unknown ancestral and/or root sequences a posteriori, conditional on a reference sequence, and apply pairwise Needleman–Wunsch alignment between the sampled and non-reference species sequences to estimate the probability. We test our algorithms on both simulated and real sequences and compare calculated probabilities from Monte Carlo integration to those induced by a single multiple alignment. Keywords: evolution, Jukes–Cantor model, Monte Carlo integration, Needleman–Wunsch alignment, orthologous

  20. Calculation of cranial nerve complication probability for acoustic neuroma radiosurgery

    International Nuclear Information System (INIS)

    Purpose: Estimations of complications from stereotactic radiosurgery usually rely simply on dose-volume or dose-diameter isoeffect curves. Due to the sparse clinical data available, these curves have typically not considered the target location in the brain, target histology, or treatment plan conformality as parameters in the calculation. In this study, a predictive model was generated to estimate the probability of cranial neuropathies as a result of acoustic schwannoma radiosurgery. Methods and Materials: The dose-volume histogram reduction scheme was used to calculate the normal tissue complication probability (NTCP) from brainstem dose-volume histograms. The model's fitting parameters were optimized to provide the best fit to the observed complication data for acoustic neuroma patients treated with stereotactic radiosurgery at the University of Florida. The calculation was then applied to the remainder of the patients in the database. Results: The best fit to our clinical data was obtained using n = 0.04, m = 0.15, and α/β = 2.1 Gy⁻¹. Although the fitting parameter m is relatively consistent with ranges found in the literature, both the volume parameter, n, and α/β are much smaller than the values quoted in the literature. The fit to our clinical data indicates that the brainstem, or possibly a specific portion of the brainstem, is more radiosensitive than the parameters in the literature indicate, and that there is very little volume effect; in other words, irradiation of a small fraction of the brainstem yields NTCPs that are nearly as high as those calculated for entire volume irradiation. These new fitting parameters are specific to acoustic neuroma radiosurgery, and the small volume effect that we observe may be an artifact of the fixed relationship of acoustic tumors to specific regions of the brainstem. Applying the model to our patient database, we calculate an average NTCP of 7.2% for patients who had no cranial...
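
    A minimal sketch of the Lyman-Kutcher-Burman NTCP evaluation the abstract describes (generalized-mean DVH reduction plus a probit dose-response); the DVH and the TD50 value are hypothetical, while n and m are the fitted values quoted above.

      import numpy as np
      from math import erf, sqrt

      def lkb_ntcp(doses, volumes, n, m, td50):
          """LKB NTCP from a differential DVH (dose bins in Gy, fractional volumes)."""
          d_eff = np.sum(volumes * doses ** (1.0 / n)) ** n   # effective dose
          t = (d_eff - td50) / (m * td50)
          return 0.5 * (1.0 + erf(t / sqrt(2.0)))             # standard normal CDF

      doses = np.array([2.0, 6.0, 10.0, 14.0])    # hypothetical brainstem DVH
      vols = np.array([0.90, 0.06, 0.03, 0.01])
      print("NTCP =", lkb_ntcp(doses, vols, n=0.04, m=0.15, td50=15.0))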

  1. Revised transition probabilities for Fe XXV: Relativistic CI calculations

    International Nuclear Information System (INIS)

    Revised data are provided for transition probabilities between fine-structure components of levels with n ≤ 6 in Fe XXV. Earlier published data for transitions between fine-structure levels in Fe XXV are found to be in error, especially for certain classes of transitions. The purpose of the present note is to provide a corrected database for transitions in Fe XXV. Wavefunctions and energies for states with n ≤ 6 and J = 0, 1, 2, 3 are determined using a relativistic configuration interaction (CI) expansion that includes the Breit interaction. To measure and control the numerical accuracy of the calculations, we compare our CI energies and matrix elements with values calculated using relativistic second-order many-body perturbation theory (MBPT), also including the Breit interaction. We obtain good agreement between our CI and MBPT calculations but disagree with earlier calculations for transitions with ΔL = 2 and for intercombination transitions (ΔS = 1). We provide wavelengths, line strengths, and transition rates for fine-structure transitions between levels with n ≤ 6 in Fe XXV

  2. CALCULATION OF PER PARCEL PROBABILITY FOR DUD BOMBS IN GERMANY

    Directory of Open Access Journals (Sweden)

    S. M. Tavakkoli Sabour

    2014-10-01

    Unexploded aerial bombs, also known as duds or unfused bombs, from the bombardments of past wars remain explosive for decades after the war under the earth's surface, threatening civil activities, especially if dredging works are involved. Interpretation of aerial photos taken shortly after bombardments has been proven to be useful for finding the duds. Unfortunately, the reliability of this method is limited by some factors. The chance of finding a dud on an aerial photo depends strongly on the photography system, the size of the bomb and the landcover. On the other hand, exploded bombs are considerably better detectable on aerial photos and confidently represent the extent and density of a bombardment. Considering an empirical quota of unfused bombs, the expected number of duds can be calculated from the number of exploded bombs. This can help to achieve a better calculation of the cost-risk ratio and to classify the areas for clearance. This article describes a method for the calculation of a per parcel probability of dud bombs according to the distribution and density of exploded bombs. No similar work has been reported in this field by other authors.
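
    A toy per-parcel calculation in the spirit of the article (the dud quota, crater count and Poisson assumption are all illustrative; the paper works with mapped distributions and densities rather than a single count).

      import math

      q = 0.10            # empirical share of dropped bombs that failed to explode
      n_exploded = 12     # exploded bombs interpreted on the parcel's aerial photo
      expected_duds = n_exploded * q / (1.0 - q)   # duds implied by the craters
      p_any_dud = 1.0 - math.exp(-expected_duds)   # Poisson: P(at least one dud)
      print(f"expected duds = {expected_duds:.2f}, P(>=1 dud) = {p_any_dud:.2f}")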

  3. Jet identification based on probability calculations using Bayes' theorem

    International Nuclear Information System (INIS)

    The problem of identifying jets at LEP and HERA has been studied. Identification using jet energies and fragmentation properties was treated separately in order to investigate the degree of quark-gluon separation that can be achieved by either of these approaches. In the case of the fragmentation-based identification, a neural network was used, and a test of the dependence on the jet production process and the fragmentation model was done. Instead of working with the separation variables directly, these have been used to calculate probabilities of having a specific type of jet, according to Bayes' theorem. This offers a direct interpretation of the performance of the jet identification and provides a simple means of combining the results of the energy- and fragmentation-based identifications. (orig.)

  4. Calculation of transition probabilities using the multiconfiguration Dirac-Fock method

    International Nuclear Information System (INIS)

    The performance of the multiconfiguration Dirac-Fock (MCDF) method in calculating transition probabilities of atoms is reviewed. In general, the MCDF wave functions will lead to transition probabilities accurate to ∼ 10% or better for strong, electric-dipole allowed transitions for small atoms. However, it is more difficult to get reliable transition probabilities for weak transitions. Also, some MCDF wave functions for a specific J quantum number may not reduce to the appropriate L and S quantum numbers in the nonrelativistic limit. Transition probabilities calculated from such MCDF wave functions for nonrelativistically forbidden transitions are unreliable. Remedies for such cases are discussed

  5. Calculation of transition probabilities using the multiconfiguration Dirac-Fock method

    International Nuclear Information System (INIS)

    The performance of the multiconfiguration Dirac-Fock (MCDF) method in calculating transition probabilities of atoms is reviewed. In general, the MCDF wave functions will lead to transition probabilities accurate to ∼10% or better for strong, electric-dipole allowed transitions for small atoms. However, it is more difficult to get reliable transition probabilities for weak transitions. Also, some MCDF wave functions for a specific J quantum number may not reduce to the appropriate L and S quantum numbers in the nonrelativistic limit. Transition probabilities calculated from such MCDF wave functions for nonrelativistically forbidden transitions are unreliable. Remedies for such cases are discussed.

  6. A method to improve cutset probability calculation in probabilistic safety assessment of nuclear power plants

    International Nuclear Information System (INIS)

    In order to calculate a more accurate top event probability from cutsets or minimal cut sets (MCSs) than the conventional method that adopts the rare event approximation (REA) or min cut upper bound (MCUB) calculation, the advanced cutset upper bound estimator (ACUBE) software was developed several years ago and soon became a vital tool for calculating the accurate core damage frequency of nuclear power plants in probabilistic safety assessment (PSA). Usually, the whole set of cutsets in industry PSA models cannot be converted into a binary decision diagram (BDD) due to limited computational memory. So, ACUBE selects the major cutsets, whose probabilities are larger than those of the others, and then converts these major cutsets into a BDD in order to calculate a more accurate top event probability from the cutsets. This study (1) suggests when and where ACUBE should be employed by predicting the amount of overestimation of the top event probability depending on the cutset structure, (2) explains the details of the ACUBE algorithm, and (3) demonstrates the efficiency of ACUBE by calculating the top event probability of some PSA cutsets. - Highlights: • EPRI report [32] introduces many successful events in the seismic PSA cutsets. • This results in a drastically overestimated top event probability. • In order to overcome this problem, the author developed the ACUBE software. • Whether to apply ACUBE can be determined according to the cutset structure (Section 4). • ACUBE calculation removes unnecessary conservatism in the top event probability
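
    For context, the two conventional quantifications that ACUBE improves on can be stated in a few lines; the sketch below computes both for hypothetical cutsets (the exact BDD step that ACUBE adds is omitted).

      import numpy as np

      p = {"A": 0.1, "B": 0.2, "C": 0.15, "D": 0.3}     # basic event probabilities
      cutsets = [("A", "B"), ("A", "C"), ("C", "D")]    # minimal cut sets

      probs = np.array([np.prod([p[e] for e in cs]) for cs in cutsets])
      rea = probs.sum()                    # rare event approximation
      mcub = 1.0 - np.prod(1.0 - probs)    # min cut upper bound
      print(f"REA = {rea:.4f}, MCUB = {mcub:.4f}")  # both overestimate when cutsets share events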

  7. Duality-based calculations for transition probabilities in birth-death processes

    OpenAIRE

    Ohkubo, Jun

    2015-01-01

    Transition probabilities in birth-death processes are formulated via the corresponding dual birth-death processes. In order to obtain the corresponding dual processes, the Doi-Peliti formalism is employed. Conventional numerical evaluation enables us to obtain the transition probabilities from a fixed initial state; on the other hand, the duality relation gives us a useful method to calculate the transition probabilities to a fixed final state. Furthermore, it is clarified that the transition ...

  8. Inclusive probability calculations for the K-vacancy transfer in collisions of S^15+ on Ar

    OpenAIRE

    Kürpick, Peter; Sepp, Wolf-Dieter; Fricke, Burkhard

    1992-01-01

    Using the single-particle amplitudes from a 20-level coupled-channel calculation with ab initio relativistic self-consistent LCAO-MO Dirac-Fock-Slater energy eigenvalues and matrix elements, we calculate within the frame of the inclusive probability formalism impact-parameter-dependent K-hole transfer probabilities. As an example we show results for the heavy asymmetric collision system S^15+ on Ar for impact energies from 4.7 to 16 MeV. The inclusive probability formalism whi...

  9. Quantum dynamics calculation of reaction probability for H+Cl2→HCl+Cl

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    We present in this paper a time-dependent quantum wave packet calculation of the initial state selected reaction probability for H + Cl2 based on the GHNS potential energy surface with total angular momentum J = 0. The effects of the translational, vibrational and rotational excitation of Cl2 on the reaction probability have been investigated. In a broad region of the translational energy, the rotational excitation enhances the reaction probability while the vibrational excitation depresses the reaction probability. The theoretical results agree well with the fact that it is an early down-hill reaction.

  10. Quantum dynamics calculation of reaction probability for H+Cl2→HCl+Cl

    Institute of Scientific and Technical Information of China (English)

    王胜龙; 赵新生

    2001-01-01

    We present in this paper a time-dependent quantum wave packet calculation of the initial state selected reaction probability for H + Cl2 based on the GHNS potential energy surface with total angular momentum J = 0. The effects of the translational, vibrational and rotational excitation of Cl2 on the reaction probability have been investigated. In a broad region of the translational energy, the rotational excitation enhances the reaction probability while the vibrational excitation depresses the reaction probability. The theoretical results agree well with the fact that it is an early down-hill reaction.

  11. The risk of major nuclear accident: calculation and perception of probabilities

    International Nuclear Information System (INIS)

    Before the Fukushima accident, eight major accidents had already occurred in nuclear power plants, a number which is higher than that expected by experts and rather close to that corresponding to public perception of risk. The author discusses how to understand these differences and reconcile observations, the objective probability of accidents and the subjective assessment of risks, why experts have been over-optimistic, whether public opinion is irrational regarding nuclear risk, and how to measure risk and its perception. Thus, he addresses and discusses the following issues: risk calculation (cost, calculated frequency of major accident, bias between the number of observed accidents and model predictions), perceived probabilities and aversion to disasters (perception biases of probability, perception biases unfavourable to nuclear), and the Bayes contribution and its application (Bayes-Laplace law, statistics, choice of an a priori probability, prediction of the next event, probability of a core melt tomorrow)

  12. Calculating Probability Tables for the Unresolved-Resonance Region Using Monte Carlo Methods

    International Nuclear Information System (INIS)

    A new module, Probability tables for the Unresolved Region using Monte Carlo (PURM), has been developed for the AMPX-2000 cross-section-processing system. PURM uses a Monte Carlo approach to calculate probability tables on an evaluator-defined energy grid in the unresolved-resonance region. For each probability table, PURM samples a Wigner spacing distribution for pairs of resonances surrounding the reference energy (i.e., the energy specified in the cross-section evaluation). The resonance distribution is sampled for each spin sequence (i.e., l-J pair), and PURM uses the Δ3-statistics test to determine the number of resonances to sample for each spin sequence. For each resonance, PURM samples the resonance widths from a chi-square distribution with a specified number of degrees of freedom. Once the resonance parameters are sampled, PURM calculates the total, capture, fission, and scatter cross sections at the reference energy using the single-level Breit-Wigner formalism with appropriate treatment for temperature effects. Probability tables have been calculated and compared with NJOY. The probability tables and cross-section values that are calculated by PURM and NJOY are in agreement, and the verification studies with NJOY establish the computational capability for generating probability tables using the new AMPX module PURM
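
    A stripped-down sketch of the Monte Carlo idea behind such probability tables (not PURM itself: no SLBW cross sections, temperature treatment, spin sequences or Δ3 test; the resonance model is a toy): sample ladders with Wigner spacings and chi-square (Porter-Thomas-like) widths, evaluate a cross-section surrogate at the reference energy, and condense its distribution into band probabilities and band averages.

      import numpy as np

      rng = np.random.default_rng(42)

      def wigner_spacing(n, mean=1.0):
          """Sample n level spacings from the Wigner surmise."""
          u = rng.random(n)
          return mean * np.sqrt(-4.0 / np.pi * np.log(1.0 - u))

      def sample_sigma(e0=0.0, n_res=20, d_mean=1.0, g_mean=0.05):
          spac = wigner_spacing(n_res, d_mean)
          e_res = np.cumsum(spac) - 0.5 * spac.sum()      # ladder centered on e0
          gam = g_mean * rng.chisquare(1, n_res)          # 1-dof width sampling
          x = 2.0 * (e0 - e_res) / gam
          return np.sum((gam / g_mean) / (1.0 + x * x))   # toy Lorentzian sum

      sigmas = np.array([sample_sigma() for _ in range(20000)])
      edges = np.quantile(sigmas, [0.0, 0.5, 0.8, 0.95, 1.0])   # 4 bands
      for lo, hi in zip(edges[:-1], edges[1:]):
          band = sigmas[(sigmas >= lo) & (sigmas <= hi)]
          print(f"band prob = {band.size / sigmas.size:.2f}, band mean = {band.mean():.3f}")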

  13. Notes on Critical Assessment of Theoretical Calculations of Atomic Structure and Transition Probabilities

    OpenAIRE

    Hyun-Kyung Chung; Per Jönsson; Alexander Kramida

    2013-01-01

    Atomic structure and transition probabilities are fundamental physical data required in many fields of science and technology. Atomic physics codes are freely available to other community users to generate atomic data for their interest, but the quality of these data is rarely verified. This special issue addresses estimation of uncertainties in atomic structure and transition probability calculations, and discusses methods and strategies to assess and ensure the quality of theoretical atomic...

  14. 'PRIZE': A program for calculating collision probabilities in R-Z geometry

    International Nuclear Information System (INIS)

    PRIZE is an IBM7090 program which computes collision probabilities for systems with axial symmetry and outputs them on cards in suitable format for the PIP1 program. Its method of working, data requirements, output, running time and accuracy are described. The program has been used to compute non-escape (self-collision) probabilities of finite circular cylinders, and a table is given by which non-escape probabilities of slabs, finite and infinite circular cylinders, infinite square cylinders, cubes, spheres and hemispheres may quickly be calculated to 1/2% or better. (author)

  15. Torpedo's Search Trajectory Design Based on Acquisition and Hit Probability Calculation

    Institute of Scientific and Technical Information of China (English)

    LI Wen-zhe; ZHANG Yu-wen; FAN Hui; WANG Yong-hu

    2008-01-01

    Aiming at the search trajectory characteristics of a lightweight torpedo against a warship target, commonly used torpedo search trajectories are analyzed and an improved search trajectory is designed, a mathematical model is built up, and a simulation calculation, taking the MK46 torpedo as an example, is carried out. The calculation results show that this method can increase the acquisition probability and hit probability by about 10%-30% in some situations and is feasible for torpedo trajectory design. The research is of great reference value for acoustic homing torpedo trajectory design and torpedo combat efficiency research.

  16. Calculation of failure probability of a redundant system with maximum time allowed for repair: an application

    International Nuclear Information System (INIS)

    The paper illustrates an application of new mathematical formulae to calculate the failure probability of a system when the time for repair is limited to a maximum value. Failure of all redundant sub-systems does not cause instant damage; the system must remain in the failed state for a given finite time T for damage to occur. If repair to any sub-system is completed within T, the damage will be averted; otherwise the system will suffer irreparable damage. The example chosen does not refer to any particular system, although the data used are taken from real components. It consists of a vessel accommodating a constant heat source for a period of 6 months. The system failure considered is damage caused by excessive temperature. The probability of exceeding this temperature for the system is calculated and the results are compared with simple unreliability calculations. The effect of different repair conditions on the calculations is also considered. (author)

  17. A semi-mechanistic approach to calculate the probability of fuel defects

    International Nuclear Information System (INIS)

    In this paper the authors describe the status of a semi-mechanistic approach to the calculation of the probability of fuel defects. This approach expresses the defect probability in terms of fundamental parameters such as local stresses, local strains, and fission product concentration. The calculations of defect probability continue to reflect the influences of the conventional parameters like power ramp, burnup and CANLUB. In addition, the new approach provides a mechanism to account for the impacts of additional factors involving detailed fuel design and reactor operation, for example pellet density, pellet shape and size, sheath diameter and thickness, pellet/sheath clearance, and coolant temperature and pressure. The approach has been validated against a previous empirical correlation. An illustrative example shows how the defect thresholds are influenced by changes in the internal design of the element and in the coolant pressure. (Author) (7 figs., tab., 12 refs.)

  18. Calculation of ruin probabilities for a dense class of heavy tailed distributions

    DEFF Research Database (Denmark)

    Bladt, Mogens; Nielsen, Bo Friis; Samorodnitsky, Gennady

    2015-01-01

    any such distribution. We prove that formulas from renewal theory, with particular attention to ruin probabilities, which are true for common phase-type distributions also hold true for the infinite-dimensional case. We provide algorithms for calculating functionals of interest such as the renewal density and the ruin probability. It might be of interest to approximate a given heavy-tailed distribution of some other type by a distribution from the class of infinite-dimensional phase-type distributions, and to this end we provide a calibration procedure which works for the approximation of distributions with a slowly varying tail. An example from risk theory, comparing ruin probabilities for a classical risk process with Pareto distributed claim sizes, is presented, and exact known ruin probabilities for the Pareto case are compared to the ones obtained by approximating by an infinite-dimensional phase-type distribution.
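
    As a cross-check of the kind of target quantity involved (a finite-horizon Monte Carlo stand-in, not the paper's infinite-dimensional phase-type machinery; all parameter values are hypothetical), ruin of a classical risk process with Pareto claims can be estimated directly, since ruin can only occur at claim epochs.

      import numpy as np

      rng = np.random.default_rng(7)

      def ruin_prob(u=2.0, c=1.2, lam=1.0, alpha=2.5, x_m=0.5, T=100.0, n_paths=4000):
          """Finite-horizon ruin probability, Poisson arrivals, Pareto claims."""
          ruined = 0
          for _ in range(n_paths):
              t, s = 0.0, 0.0                       # time, aggregate claims
              while True:
                  t += rng.exponential(1.0 / lam)   # next claim arrival
                  if t > T:
                      break
                  s += x_m * (rng.pareto(alpha) + 1.0)   # classical Pareto claim
                  if u + c * t - s < 0.0:           # reserve at the claim epoch
                      ruined += 1
                      break
          return ruined / n_paths

      print("psi(u=2, T=100) ~", ruin_prob())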

  19. PABS: A Computer Program to Normalize Emission Probabilities and Calculate Realistic Uncertainties

    International Nuclear Information System (INIS)

    The program PABS normalizes relative particle emission probabilities to an absolute scale and calculates the relevant uncertainties on this scale. The program is written in Java using the JDK 1.6 library. For additional information about system requirements, the code itself, and compiling from source, see the README file distributed with this program. The mathematical procedures used are given.

  20. Calculation of Quantum Probability in O(2,2) String Cosmology with a Dilaton Potential

    Institute of Scientific and Technical Information of China (English)

    YAN Jun

    2006-01-01

    The quantum properties of O(2,2) string cosmology with a dilaton potential are studied in this paper. The cosmological solutions are obtained on three-dimensional space-time. Moreover, the quantum probability of transition between two dual universes is calculated through a Wheeler-De Witt approach.

  1. Theoretical calculation of the rotational excitation probability of the lithium chloride molecule in terahertz frequency combs

    International Nuclear Information System (INIS)

    We investigated how the pulse parameters of optical frequency combs affect the rotational excitation probability of the lithium chloride (7Li37Cl) molecule. Time evolution of the rotational population distribution was calculated by the close-coupling method. It was confirmed that the rotational excitation is restricted owing to the centrifugal distortion of the rotating molecule. (author)

  2. Calculation of the Multivariate Probability Distribution Function Values and their Gradient Vectors

    OpenAIRE

    Szantai, T.

    1987-01-01

    The described collection of subroutines, developed for the calculation of values of multivariate normal, Dirichlet and gamma distribution functions and their gradient vectors, is a unique tool that can be used e.g. to compute the Loss-of-Load Probability of electric networks and to solve optimization problems with a reliability constraint.
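
    A small usage sketch of the kind of computation described (via SciPy's multivariate normal CDF rather than the paper's subroutine collection; all numbers are hypothetical): if correlated zone demands are jointly normal, the Loss-of-Load Probability is one minus the CDF evaluated at the capacity vector.

      import numpy as np
      from scipy.stats import multivariate_normal

      mean = np.array([100.0, 80.0, 60.0])          # mean zone demands
      cov = np.array([[90.0, 30.0, 15.0],
                      [30.0, 60.0, 10.0],
                      [15.0, 10.0, 40.0]])          # demand covariance
      capacity = np.array([120.0, 95.0, 75.0])      # zone capacities

      p_served = multivariate_normal.cdf(capacity, mean=mean, cov=cov)
      print("LOLP ~", 1.0 - p_served)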

  3. PABS: A Computer Program to Normalize Emission Probabilities and Calculate Realistic Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Caron, D. S.; Browne, E.; Norman, E. B.

    2009-08-21

    The program PABS normalizes relative particle emission probabilities to an absolute scale and calculates the relevant uncertainties on this scale. The program is written in Java using the JDK 1.6 library. For additional information about system requirements, the code itself, and compiling from source, see the README file distributed with this program. The mathematical procedures used are given below.

  4. Calculation of the escape probabilities of Fe ⅹⅦ resonance lines for the Voigt profile

    Institute of Scientific and Technical Information of China (English)

    Jian HE; Qing-guo ZHANG

    2008-01-01

    Using the Voigt profile we obtained, we cal-culate the escape probabilities of Fe ⅹⅦ resonance lines at 15.02, 13.28, 12.12, 11.13, 11.02 and 10.12 A for op-tically thick plasma, both for slab and cylindrical ge-ometry. The oscillator strength, the number density of the absorbing atoms in the ground state, and the optical depth in the line center are discussed in this calculation. Results show that the escape probabilities for the slab geometry are larger than that for the cylindrical geom-etry. This calculation is useful for the study of the Fe ⅹⅦ resonance lines.

  5. An algorithm for calculating steady state probabilities of $M|E_r|c|K$ queueing systems

    OpenAIRE

    Hochrainer, Stefan; Hochreiter, Ronald; Pflug, Georg

    2014-01-01

    This paper presents a method for calculating steady state probabilities of $M|E_r|c|K$ queueing systems. The infinitesimal generator matrix is used to define all possible states in the system and their transition probabilities. While this matrix can be written down immediately for many other $M|PH|c|K$ queueing systems with phase-type service times (e.g. Coxian, hypoexponential, ...), it requires a more careful analysis for systems with Erlangian service times. The constructed matrix may t...
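
    To show the generator-matrix step the abstract refers to in the simplest setting, the sketch below solves the M|M|c|K case, where the state is just the number of customers; the paper's point is precisely that Erlang service phases require a more carefully constructed state space than this. All rates are hypothetical.

      import numpy as np

      lam, mu, c, K = 3.0, 1.0, 2, 6        # arrival rate, service rate, servers, capacity
      Q = np.zeros((K + 1, K + 1))          # infinitesimal generator
      for s in range(K + 1):
          if s < K:
              Q[s, s + 1] = lam             # arrival
          if s > 0:
              Q[s, s - 1] = min(s, c) * mu  # service completion
          Q[s, s] = -Q[s].sum()

      # Solve pi Q = 0 with sum(pi) = 1: replace one balance equation.
      A = Q.T.copy()
      A[-1, :] = 1.0
      b = np.zeros(K + 1); b[-1] = 1.0
      pi = np.linalg.solve(A, b)
      print("steady state:", np.round(pi, 4), " blocking P(K):", round(pi[-1], 4))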

  6. Accurate multiconfiguration Dirac–Hartree–Fock calculations of transition probabilities for magnesium-like ions

    International Nuclear Information System (INIS)

    Results from multiconfiguration Dirac–Hartree–Fock (MCDHF) and relativistic configuration interaction (RCI) calculations are presented for the n=3 to n′=3 transitions in the Mg isoelectronic sequence. The calculated values for the lowest 35 levels, including core–valence correlation, are found to be similar and to compare very well with other theoretical and experimental values. The Breit interaction and leading quantum electrodynamic effects are included as perturbations. The calculations can provide useful data for experimental studies determining the fine structure levels in future work. - Highlights: • Multiconfiguration Dirac–Hartree–Fock (MCDHF) and relativistic configuration interaction calculations were used. • The valence–valence and core–valence correlations are considered. • Energy levels and transition probabilities are calculated for 35 levels of magnesium-like ions. • Detailed QED contributions and total energies for four configurations are presented

  7. Emission of delayed neutrons: calculation of the energy spectra and emission probabilities of the precursors

    International Nuclear Information System (INIS)

    The calculations given in this paper are intended to explain the delayed neutron emission (energy spectra and emission probabilities Pn) which follows the β- disintegration of the precursors produced by fission. The probability of β- transition, the level density ω(E, J) of the emitter, and the competition between (β-, γ) and (β-, n) de-excitations are analysed for each precursor studied. All the possible channels open to the process of neutron emission on grounds of energy considerations (Qβ-, Bn) are taken into account through the introduction of the spin and parity selection rules at each stage of the sequence: precursor, emitter, final nucleus. The results of the calculations are compared with the known experimental measurements of the neutron energy spectra and probabilities Pn. The precursors 87Br, 88Br, 137I and 93-97Rb were selected for this examination. This comparison shows in particular that the structure of the experimental energy spectra can be well reproduced by the calculations given in this paper. Moreover, it emerges that the calculated spectra are very sensitive to the choice of the spins of the precursor and the final nucleus. (author)

  8. Corrections to vibrational transition probabilities calculated from a three-dimensional model.

    Science.gov (United States)

    Stallcop, J. R.

    1972-01-01

    Corrections to the collision-induced vibration transition probability calculated by Hansen and Pearson from a three-dimensional semiclassical model are examined. These corrections come from the retention of higher order terms in the expansion of the interaction potential and the use of the actual value of the deflection angle in the calculation of the transition probability. It is found that the contribution to the transition cross section from previously neglected potential terms can be significant for short range potentials and for the large relative collision velocities encountered at high temperatures. The correction to the transition cross section obtained from the use of actual deflection angles will not be appreciable unless the change in the rotational quantum number is large.

  9. Comparison of three methods for calculation of electron transfer probability in H+ + Ne

    International Nuclear Information System (INIS)

    We have developed a theoretical model of ion-atom collisions where we described electron dynamics by the time-dependent density-functional theory (TDDFT) and the ion dynamics by classical mechanics through the Ehrenfest method. We have compared three methods to calculate the probability of electron transfer during H+ + Ne collision. By discussing these issues we shall be able to understand how these methods work, what their limitations are and whether they admit of any improvements. -- Highlights: ► We have developed a theoretical model of ion-atom collisions based on TDDFT. ► We have compared three methods to calculate the probability of electron transfer in H+ + Ne. ► Electron transfer cross sections showed a good agreement with available experimental data.

  10. The Calculations of Oscillator Strengths and Transition Probabilities for Atomic Fluorine

    OpenAIRE

    ÇELİK, Gültekin; KILIÇ, H. Şükür; Akin, Erhan

    2006-01-01

    Oscillator strengths for transitions between individual lines belonging to some doublet and quartet terms, and multiplet transition probabilities of atomic fluorine have been calculated using the weakest bound electron potential model theory (WBEPMT). In the determination of the relevant parameters, we employed numerical non-relativistic Hartree-Fock (NRHF) wave functions for expectation values of radii, and the necessary energy values have been taken from experimental energy data in the literature.

  11. Analytical calculation of nonadiabatic transition probabilities from the monodromy of differential equations

    International Nuclear Information System (INIS)

    The nonadiabatic transition probabilities in the two-level systems are calculated analytically by using the monodromy matrix determining the global feature of the underlying differential equation. We study the time-dependent 2 x 2 Hamiltonian with the tanh-type plus sech-type energy difference and with constant off-diagonal elements as an example to show the efficiency of the monodromy approach. We also discuss the application of this method to multi-level systems

  12. Efficient Probability of Failure Calculations for QMU using Computational Geometry LDRD 13-0144 Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Scott A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ebeida, Mohamed Salah [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Romero, Vicente J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rushdi, Ahmad A. [Univ. of Texas, Austin, TX (United States); Abdelkader, Ahmad [Univ. of Maryland, College Park, MD (United States)

    2015-09-01

    This SAND report summarizes our work on the Sandia National Laboratory LDRD project titled "Efficient Probability of Failure Calculations for QMU using Computational Geometry" which was project #165617 and proposal #13-0144. This report merely summarizes our work. Those interested in the technical details are encouraged to read the full published results, and contact the report authors for the status of the software and follow-on projects.

  13. EROS --- automated software system for ephemeris calculation and estimation of probability domain (Abstract)

    Science.gov (United States)

    Skripnichenko, P.; Galushina, T.; Loginova, M.

    2015-08-01

    This work is devoted to the description of the software EROS (Ephemeris Research and Observation Services), which is being developed jointly by the astronomy department of Ural Federal University and Tomsk State University. This software provides ephemeris support for positional observations. The most interesting feature of the software is the automation of all processes of preparation for observations, from the determination of the night duration to the ephemeris calculation and the forming of an observation schedule. The accuracy of the ephemeris calculation depends mostly on the precision of the initial data, which is determined by the errors of the observations used for the determination of the orbital elements. In the case where an object has a small number of observations spread over a short arc of its orbit, there is a real necessity to calculate not only the nominal orbit but the probability domain as well. In this paper, by review ephemeris we understand a field on the celestial sphere which is calculated on the basis of the probability domain. Our software EROS has the relevant functionality for estimation of the review ephemeris. This work contains a description of the software system and results of its use.

  14. Validation of the ROVER-F code for ROP trip probability calculations

    International Nuclear Information System (INIS)

    An important task in the operation of CANDU reactors is the prevention of fuel damage as a result of fuel dryout that can occur when the fuel sheath temperature exceeds the temperature at which the coolant can efficiently remove heat. The power at which fuel dryout is expected to occur is called the critical channel power and is a function of flux shape and the fuel channel thermalhydraulics. In CANDU reactors, protection against overpowers large enough to cause dryout is provided by two regional overpower protection (ROP) systems of in-core flux detectors, arrayed through the core, each organized into three safety (or logic) channels. Each of the two independent ROP systems is associated with one of the two independent shutdown systems (SDS-1 and SDS-2). The detectors in one ROP system (associated with SDS-1) are placed in vertical penetrations, whereas the other system (associated with SDS-2) uses detectors in horizontal penetrations in the core. Each ROP system is capable of initiating the shutdown of the reactor by actuating the corresponding shutdown system. Each ROP system must be so designed that in each safety channel at least one detector will reach its setpoint before there is damaging overpower in any fuel channel. The trip of a single detector in a safety channel will trip that channel, and the trip of two of the three safety channels in an ROP system will trip that ROP system. ROVER-F is a FORTRAN program which calculates the trip probability and the setpoint adjustment required to attain the target trip probability, for a given set of flux shapes. This calculation is performed with the assumption that the most effective safety channel is unavailable and therefore the remaining two safety channels must both trip. The calculation of trip probability itself is non-iterative, but once the trip probability of the specified system has been calculated, a convergence iteration using a binomial search is used to determine the adjustment to the trip setpoints.
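
    The channel logic described above reduces to elementary combinatorics; the sketch below (with a hypothetical per-channel trip probability, not a ROVER-F output) contrasts the normal 2-out-of-3 logic with the conservative case assumed in the code, where the most effective channel is unavailable and both remaining channels must trip.

      p = 0.95                                 # per-channel trip probability
      p_2oo3 = 3 * p**2 * (1 - p) + p**3       # any 2 of 3 channels trip
      p_both_of_two = p**2                     # best channel assumed unavailable
      print(f"2-of-3: {p_2oo3:.4f}, both of remaining two: {p_both_of_two:.4f}")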

  15. Calculation of Auger-neutralization probabilities for He{sup +}-ions in LEIS

    Energy Technology Data Exchange (ETDEWEB)

    Goebl, D., E-mail: dominik.goebl@jku.at [Institut Fuer Experimentalphysik, Abt. Atom- und Oberflaechenphysik, Johannes Kepler Universitaet, A-4040 Linz (Austria); Monreal, R.C.; Valdes, D.; Primetzhofer, D. [Departamento de Fisica Teorica de la Materia Condensada, Universidad Autonoma de Madrid, E-28049 Madrid (Spain); Bauer, P. [Institut Fuer Experimentalphysik, Abt. Atom- und Oberflaechenphysik, Johannes Kepler Universitaet, A-4040 Linz (Austria)

    2011-06-01

    In Low Energy Ion Scattering (LEIS), Auger neutralization is an omnipresent charge exchange mechanism, especially when noble gas ions are used as projectiles with a primary energy below the threshold energy, E_th, for collision induced charge exchange processes (neutralization and reionization). Recent experiments revealed a significant dependence of the ion survival probability, P+, on the crystal plane when He+ ions are scattered from a metal surface. This is in contrast to the fact that the neutralization probability in LEIS is usually assumed to be independent of the chemical environment of the collision partner (absence of matrix effects). In order to investigate this crystal effect, an existing theory of Auger neutralization (based on a Linear Combination of Atomic Orbitals) is adapted to the LEIS geometry. With this model, Auger neutralization rates are calculated for a Ag(110) surface. Trajectories for He particles scattered from this surface into different azimuth directions are obtained by means of Molecular Dynamics simulations. Subsequently, the ion survival probability is calculated and compared to measurements. Good agreement is obtained, which gives confidence in the applicability of this model in the LEIS regime. Moreover, it was possible to obtain detailed information on the properties of the neutralization process.

  16. Computing Moment-Based Probability Tables for Self-Shielding Calculations in Lattice Codes

    International Nuclear Information System (INIS)

    As part of the self-shielding model used in the APOLLO2 lattice code, probability tables are required to compute self-shielded cross sections for coarse energy groups (typically with 99 or 172 groups). This paper describes the replacement of the multiband tables (typically with 51 subgroups) with moment-based tables in release 2.5 of APOLLO2. An improved Ribon method is proposed to compute moment-based probability tables, allowing important savings in CPU resources while maintaining the accuracy of the self-shielding algorithm. Finally, a validation is presented where the absorption rates obtained with each of these techniques are compared with exact values obtained using a fine-group elastic slowing-down calculation in the resolved energy domain. Other results, relative to Rowland's benchmark and to three assembly production cases, are also presented

  17. Improved collision probability method for thermal-neutron-flux calculation in a cylindrical reactor cell

    International Nuclear Information System (INIS)

    An improved collision probability method for thermal-neutron-flux calculation in a cylindrical reactor cell has been developed. Expanding the neutron flux and source into a series of even powers of the radius, one gets a convenient method for integration of the one-energy-group integral transport equation. It is shown that it is possible to perform an analytical integration in the x-y plane in one variable and to use an effective Gaussian integration over the other. By choosing a convenient distribution of space points in fuel and moderator, the transport matrix calculation and cell reaction rate integration were condensed. On the basis of the proposed method, the computer program DISKRET for the ZUSE-Z 23 K computer has been written. The suitability of the proposed method for the calculation of the thermal-neutron-flux distribution in a reactor cell can be seen from the test results obtained. Compared with the other collision probability methods, the proposed treatment excels in mathematical simplicity and faster convergence. (author)

  18. Calculations of hydrogen atom multiphoton energy level shifts, transition amplitudes and ionization probabilities

    International Nuclear Information System (INIS)

    Analyses of the resonant multiphoton ionization of atoms require knowledge of ac Stark energy shifts and of multiphoton, bound-to-bound state, transition amplitudes. In this paper, we consider the three-photon photoionization of hydrogen atoms at frequencies that are at and surrounding the two-photon 1s to 2s resonance. AC energy shift sums of both the 1s and 2s states are calculated as a function of the laser frequency along with two-photon 1s → 2s resonant transition amplitude sums. These quantities are calculated using an extended version of a method, which has often been employed in a variety of ways, of calculating these sums by expressing them in terms of solutions to a variety of differential equations that are derived from the different sums being evaluated. We demonstrate how exact solutions are obtained to these differential equations, which lead to exact evaluations of the corresponding sums. A variety of different cases are analysed, some involving analytic continuation, some involving real number analysis and some involving complex number analysis. A dc Stark sum calculation of the 2s state is carried out to illustrate the case where analytic continuation, pole isolation and pole subtraction are required and where the calculation can be carried out analytically; the 2s state, ac Stark shift sum calculations involve a case where no analytic continuation is required, but where the solution to the differential equation produces complex numbers owing to the finite photoionization lifetime of the 2s state. Results from these calculations are then used to calculate three-photon ionization probabilities of relevance to an analysis of the multiphoton ionization data published by Kyrala and Nichols (1991 Phys. Rev. A 44, R1450)

  19. Calculation of oscillation probabilities of atmospheric neutrinos using nuCraft

    CERN Document Server

    Wallraff, Marius

    2014-01-01

    NuCraft (nucraft.hepforge.org) is an open-source Python project that calculates oscillation probabilities for neutrinos from cosmic-ray interactions in the atmosphere as they propagate through the Earth. The solution is obtained by numerically solving the Schrödinger equation. The code supports an arbitrary number of neutrino flavors (including additional sterile neutrinos), CP violation, arbitrary mass hierarchies, and matter effects with a configurable Earth model, and it takes into account the production height distribution of neutrinos in the Earth's atmosphere.
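
    As a minimal illustration of the underlying technique, the sketch below solves the two-flavour vacuum oscillation problem as a Schrödinger equation with scipy; the mixing parameters and baseline are typical but arbitrary, and nuCraft itself handles three or more flavours, matter effects and the Earth model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Two-flavour vacuum oscillation: i dpsi/dL = H psi
dm2, theta, E = 2.5e-3, 0.72, 10.0       # eV^2, mixing angle (rad), GeV
omega = 1.267 * dm2 / E                  # phase advance per km
c2, s2 = np.cos(2 * theta), np.sin(2 * theta)
H = omega * np.array([[-c2, s2], [s2, c2]])

def rhs(L, psi):
    return -1j * H @ psi

L_max = 12742.0                          # Earth diameter in km
sol = solve_ivp(rhs, (0.0, L_max), np.array([1.0 + 0j, 0.0 + 0j]),
                rtol=1e-8, atol=1e-10)
print("P(mu->mu) numeric :", abs(sol.y[0, -1]) ** 2)
# Analytic cross-check: 1 - sin^2(2 theta) sin^2(omega L)
print("P(mu->mu) analytic:", 1 - s2**2 * np.sin(omega * L_max)**2)
```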

  20. Failure probability calculation of the energy supply of the Angra-1 reactor rod assembly

    International Nuclear Information System (INIS)

    This work analyses the electric power system of the Angra I PWR plant. It is demonstrated that this system is closely coupled with the safety engineering features, the equipment provided to prevent, limit, or mitigate the release of radioactive material and to permit safe reactor shutdown. Event trees are used to analyse the operation of those systems whose failure can lead to the release of radioactivity following a specified initiating event. The fault tree technique is used to calculate the failure probability of the on-site electric power system.
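
    The fault-tree arithmetic for independent basic events reduces to products over AND gates and complements over OR gates, as in the minimal sketch below; the event names and probabilities are invented for illustration, not Angra-1 data.

```python
# Minimal fault-tree evaluation for independent basic events.
def AND(*p):          # output fails only if all inputs fail
    out = 1.0
    for x in p:
        out *= x
    return out

def OR(*p):           # output fails if at least one input fails
    ok = 1.0
    for x in p:
        ok *= 1.0 - x
    return 1.0 - ok

p_offsite = 1e-2      # loss of off-site power (hypothetical)
p_dg_a    = 5e-2      # diesel generator A fails on demand (hypothetical)
p_dg_b    = 5e-2      # diesel generator B fails on demand (hypothetical)
p_bus     = 1e-4      # common distribution-bus fault (hypothetical)

# Top event: loss of electric power to the safety buses
p_top = OR(AND(p_offsite, p_dg_a, p_dg_b), p_bus)
print(f"P(top event) = {p_top:.3e}")   # ~1.25e-4 with these numbers
```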

  1. Theoretical Calculations of Transition Probabilities and Oscillator Strengths for Sc(Ⅲ) and Y(Ⅲ)

    Institute of Scientific and Technical Information of China (English)

    Tian-yi Zhang; Neng-wu Zheng

    2009-01-01

    The Weakest Bound Electron Potential Model theory is used to calculate transition probabilities and oscillator strengths for individual lines of Sc(Ⅲ) and Y(Ⅲ). In this method, the expressions for the energy eigenvalue and the radial function are obtained by solving the Schrödinger equation of the weakest bound electron, and a coupled equation is used to determine the parameters needed in the calculations. The results obtained for Sc(Ⅲ) agree very well with the accepted values taken from the National Institute of Standards and Technology (NIST) database; most deviations are within the accepted level. For Y(Ⅲ) no accepted values are reported in the NIST database, so we compared our results with other theoretical results, and good agreement is also obtained.

  2. Research on calculation modes of tumor control probability following internal radionuclide therapy

    International Nuclear Information System (INIS)

    Objective: To construct a calculation model of tumor control probability (TCP) that takes into account dose and biological factors, so that the TCP calculated for internal radionuclide therapy is close to the actual value. Methods: Referring to the TCP models of external-beam radiotherapy and to the cell-survival models reported by other authors, a TCP model for internal radionuclide therapy (model 3), which considers dose, repair of DNA single-strand damage, and radiosensitivity, was set up. Using the nuclide ³²P as an example, model 3 was then compared with two other models (models 1 and 2) by calculating the TCP under the same treatment conditions. Results: Compared with models 1 and 2, model 3 shows that: (1) for a given treatment activity (3.7 × 10⁷ Bq), the radius of a curable tumor is closer to the experimental result; (2) if the given activity is infinitely large, the largest curable tumor radius is smaller and coincides with the maximum range of ³²P. The result calculated with model 3 therefore fits the actual value better than those of the other two models. Conclusion: Model 3 considers more factors than the others, so if the biological parameters are known it can be used to calculate the TCP under a given treatment condition, to predict the outcome of internal radionuclide therapy and to assess whether a treatment plan is reasonable.
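
    For orientation, a generic Poisson-statistics TCP takes the form TCP = exp(-N·S(D)). The sketch below uses linear-quadratic cell survival with hypothetical parameters; it is not the paper's model 3, which additionally folds in repair of DNA single-strand damage and radiosensitivity.

```python
import numpy as np

def tcp(dose_gy, n_clonogens, alpha, beta):
    """Poisson TCP with linear-quadratic cell survival:
    TCP = exp(-N * exp(-alpha*D - beta*D^2))."""
    survival = np.exp(-alpha * dose_gy - beta * dose_gy**2)
    return np.exp(-n_clonogens * survival)

# Hypothetical tumour: 1e7 clonogens, alpha = 0.3 /Gy, beta = 0.03 /Gy^2
for d in (10, 15, 20, 25):
    print(d, "Gy ->", f"{tcp(d, 1e7, 0.3, 0.03):.3f}")
```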

  3. Direct calculation of the probability of pionium ionization in the target

    International Nuclear Information System (INIS)

    The goal of the DIRAC experiment at CERN is the lifetime measurement of pionium (the π⁺π⁻ atom). Its lifetime is mainly determined by the charge-exchange process π⁺π⁻ → π⁰π⁰. The value of the lifetime in the ground state is predicted in the framework of chiral perturbation theory with high precision: τ(1S) = (2.9 ± 0.1) × 10⁻¹⁵ s. The method used by DIRAC is based on the analysis of π⁺π⁻-pair spectra with small relative momenta in their center-of-mass system, in order to extract the signal from pionium ionization (breakup) in the target. Pionium atoms are produced in proton-nucleus collisions and have relativistic velocities (γ > 10). For fixed values of the pionium momentum and the target thickness, the probability of pionium ionization in the target depends on its lifetime in a unique way; thus, the pionium lifetime can be deduced from the experimentally determined ionization probability. On the basis of ionization cross sections of pionium with target atoms, we perform the first direct calculation of the pionium ionization probability in the target.

  4. New energy levels, calculated lifetimes and transition probabilities in Xe IX

    Energy Technology Data Exchange (ETDEWEB)

    Gallardo, M; Raineri, M; Reyna Almandos, J [Centro de Investigaciones Opticas (CIOp), CC 3 (1897) Gonnet, La Plata (Argentina); Biemont, E [IPNAS, Universite de Liege, B15 Sart Tilman, B-4000 Liege (Belgium)

    2011-02-28

    Twenty-one new experimental energy levels belonging to the 4d⁹6p, 4d⁹4f and 4d⁹5f configurations of Xe IX are presented. They have been deduced from 75 newly classified lines involving the configurations 4d⁹5p, 4d⁹6p, 4d⁹4f, 4d⁹5f and 4d⁹5d, 4d⁹5s, 4d⁹6s for the odd and even parities, respectively. The radiative lifetimes of these levels as well as the weighted oscillator strengths and transition probabilities for all the observed spectral lines have been calculated with optimized parameters deduced from a least-squares fitting procedure applied in the framework of a relativistic Hartree-Fock method including core-polarization effects. The scale of transition probabilities has also been assessed through comparisons with lifetimes calculated using a relativistic multiconfigurational Dirac-Fock approach.

  5. Calculation of the number of Monte Carlo histories for a planetary protection probability of impact estimation

    Science.gov (United States)

    Barengoltz, Jack

    2016-07-01

    Monte Carlo (MC) is a common method to estimate probability, effectively by a simulation. For planetary protection, it may be used to estimate the probability of impact P_I of a protected planet by a launch vehicle (upper stage). The object of the analysis is to provide a value for P_I with a given level of confidence (LOC) that the true value does not exceed the maximum allowed value of P_I. In order to determine the number of MC histories required, one must also guess the maximum number of hits that will occur in the analysis. This extra parameter is needed because a LOC is desired. If more hits occur, the MC analysis would indicate that the true value may exceed the specification value with a higher probability than the LOC. (In the worst case, even the mean value of the estimated P_I might exceed the specification value.) After the analysis is conducted, the actual number of hits determines the estimated mean. The number of hits arises from a small probability per history and a large number of histories; these are the classic requirements for a Poisson distribution. For a known Poisson distribution (the mean is the only parameter), the probability for some interval in the number of hits is calculable; before the analysis, this is not possible. Fortunately, there are methods that can bound the unknown mean of a Poisson distribution. F. Garwood (1936, "Fiduciary limits for the Poisson distribution," Biometrika 28, 437-442) published an appropriate method that uses the chi-squared function, actually its inverse: the resulting formula for the upper and lower limits of the mean μ with two-tailed probability 1-α depends on the LOC α and an estimated value of the number of "successes" n. In a MC analysis for planetary protection, only the upper limit is of interest, i.e., the single-sided limit.
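
    A short sketch of this sizing logic using the Garwood chi-squared relation is given below; the P_I limit, LOC and hit budget are arbitrary example values, not requirements from any specific mission.

```python
import math
from scipy.stats import chi2

def garwood_upper(n_hits, loc):
    """One-sided upper confidence limit on a Poisson mean given n_hits,
    via the inverse chi-squared relation (Garwood 1936)."""
    return 0.5 * chi2.ppf(loc, 2 * n_hits + 2)

def histories_required(p_max, loc, n_hits_budget):
    """Smallest N such that observing up to n_hits_budget impacts still
    bounds the true impact probability below p_max at the given LOC."""
    return math.ceil(garwood_upper(n_hits_budget, loc) / p_max)

# Hypothetical numbers: P_I <= 1e-4 at 95% LOC, allowing up to 2 hits
print(histories_required(p_max=1e-4, loc=0.95, n_hits_budget=2))  # ~63000
```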

  6. Calculating inspector probability of detection using performance demonstration program pass rates

    Science.gov (United States)

    Cumblidge, Stephen; D'Agostino, Amy

    2016-02-01

    The United States Nuclear Regulatory Commission (NRC) staff has been working since the 1970s to ensure that nondestructive testing performed on nuclear power plants in the United States will provide reasonable assurance of the structural integrity of nuclear power plant components. One tool used by the NRC has been the development and implementation of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code Section XI Appendix VIII [1] (Appendix VIII) blind testing requirements for ultrasonic procedures, equipment, and personnel. Over the years, some concerns have been raised about the relatively low pass rates for the Appendix VIII qualification testing. The NRC staff has applied statistical tools and simulations to determine the expected probability of detection (POD) for ultrasonic examinations under ideal conditions, based on the pass rates of the Appendix VIII qualification tests for ultrasonic testing personnel. This work was primarily performed to answer three questions. First, given a test design and pass rate, what is the expected overall POD for inspectors? Second, can the probability of detection for flaws of different sizes be calculated from this information? Finally, if a previously qualified inspector fails a requalification test, does this call their earlier inspections into question? The calculations have shown that one can expect good performance from inspectors who have passed Appendix VIII testing in a laboratory-like environment, and the requalification pass rates show that the inspectors have maintained their skills between tests. While these calculations showed that the PODs for ultrasonic inspections are very good under laboratory conditions, field inspections are conducted in a very different environment. The NRC staff has initiated a project to systematically analyze the human factors differences between qualification testing and field examinations. This work will be used to evaluate and prioritize these differences.
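
    The link between a pass rate and a per-flaw POD can be inverted with a simple binomial model, as sketched below. The test size and pass threshold here are hypothetical illustrations, not the actual Appendix VIII test design, and the model assumes independent, equally detectable flaws.

```python
from scipy.optimize import brentq
from scipy.stats import binom

def pod_from_pass_rate(pass_rate, n_flaws, min_detected):
    """Back out a single per-flaw POD from an observed pass rate,
    assuming each flaw is detected independently with the same POD."""
    f = lambda p: binom.sf(min_detected - 1, n_flaws, p) - pass_rate
    return brentq(f, 1e-9, 1 - 1e-9)

# E.g. a 75% pass rate on a 10-flaw test requiring 8 detections
print(f"implied per-flaw POD: {pod_from_pass_rate(0.75, 10, 8):.3f}")
```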

  7. Calculations of subsonic and supersonic turbulent reacting mixing layers using probability density function methods

    Science.gov (United States)

    Delarue, B. J.; Pope, S. B.

    1998-02-01

    A particle method applying the probability density function (PDF) approach to turbulent compressible reacting flows is presented. The method is applied to low and high Mach number reacting plane mixing layers. Good agreement is obtained between the model calculations and the available experimental data. The PDF equation is solved using a Lagrangian Monte Carlo method. To represent the effects of compressibility on the flow, the velocity PDF formulation is extended to include thermodynamic variables such as the pressure and the internal energy. Full closure of the joint PDF transport equation is made possible in this way without coupling to a finite-difference-type solver. The stochastic differential equations (SDE) that model the evolution of Lagrangian particle properties are based on existing models for the effects of compressibility on turbulence. The chemistry studied is the fast hydrogen-fluorine reaction. For the low Mach number runs, low heat release calculations are performed with equivalence ratios different from one. Heat release is then increased to study the effect of chemical reaction on the mixing layer growth rate. The subsonic results are compared with experimental data, and good overall agreement is obtained. The calculations are then performed at a higher Mach number, and the results are compared with the subsonic results. Our purpose in this paper is not to assess the performance of existing models for compressible or reacting flows. It is rather to present a new approach extending the domain of applicability of PDF methods to high-speed combustion.

  8. Calculation of probability density functions for temperature and precipitation change under global warming

    International Nuclear Information System (INIS)

    Full text: The IPCC Fourth Assessment Report (Meehl et al. 2007) presents multi-model means of the CMIP3 simulations as projections of global climate change over the 21st century under several SRES emission scenarios. To assess the possible range of change for Australia based on the CMIP3 ensemble, we can follow Whetton et al. (2005) and use the 'pattern scaling' approach, which separates the uncertainty in the global mean warming from that in the local change per degree of warming. This study presents several ways of representing these two factors as probability density functions (PDFs). The beta distribution, a smooth, bounded function allowing skewness, is found to provide a useful representation of the range of CMIP3 results. A weighting of models based on their skill in simulating seasonal means in the present climate over Australia is included. Dessai et al. (2005) and others have used Monte Carlo sampling to recombine such global warming and scaled change factors into values of net change. Here, we use a direct integration of the product across the joint probability space defined by the two PDFs. The result is a cumulative distribution function (CDF) for change, for each variable, location, and season. The median of this distribution provides a best estimate of change, while the 10th and 90th percentiles represent a likely range. The probability of exceeding a specified threshold can also be extracted from the CDF. The presentation focuses on changes in Australian temperature and precipitation at 2070 under the A1B scenario. However, the assumption of linearity behind pattern scaling allows results for different scenarios and times to be obtained simply. In the case of precipitation, which must remain non-negative, a simple modification of the calculations (based on decreases being exponential with warming) is used to avoid unrealistic results. These approaches are currently being used for the new CSIRO/Bureau of Meteorology climate projections.
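
    The direct-integration step can be written as P(G·S ≤ c) = ∫ F_S(c/g) f_G(g) dg, where G is the global warming and S the local scaling factor. The sketch below evaluates this numerically with invented parameter values (not the CSIRO/BoM fits), using a normal warming PDF and a bounded beta scaling PDF.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.stats import beta, norm

g = np.linspace(0.5, 4.5, 400)                  # global warming grid, deg C
f_g = norm.pdf(g, loc=2.0, scale=0.6)           # hypothetical warming PDF
scaling = beta(a=3, b=4, loc=-0.1, scale=0.25)  # hypothetical bounded, skewed
                                                # PDF of change per degree

def cdf_net_change(c):
    """P(G * S <= c) = integral of F_S(c/g) f_G(g) dg, with G > 0."""
    return trapezoid(scaling.cdf(c / g) * f_g, g)

cs = np.linspace(-0.5, 1.5, 1000)
cdf = np.array([cdf_net_change(c) for c in cs])
for q, name in ((0.1, "10th"), (0.5, "median"), (0.9, "90th")):
    print(name, round(float(np.interp(q, cdf, cs)), 3))
```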

  9. Free Probability based Capacity Calculation of Multiantenna Gaussian Fading Channels with Cochannel Interference

    CERN Document Server

    Chatzinotas, Symeon

    2010-01-01

    During the last decade, it has been well understood that communication over multiple antennas can linearly increase the multiplexing gain and provide large spectral-efficiency improvements. However, the majority of studies in this area were carried out ignoring cochannel interference. Only a small number of investigations have considered cochannel interference, and even there simple channel models were employed, assuming identically distributed fading coefficients. In this paper, a generic model for a multi-antenna channel is presented incorporating four impairments, namely additive white Gaussian noise, flat fading, path loss and cochannel interference. Both point-to-point and multiple-access MIMO channels are considered, including the case of cooperating Base Station clusters. The asymptotic capacity limit of this channel is calculated based on an asymptotic free probability approach which exploits the additive and multiplicative free convolution in the R- and S-transform domains, respectively, as ...

  10. Peculiarities of high-overtone transition probabilities in carbon monoxide revealed by high-precision calculation

    International Nuclear Information System (INIS)

    In the recent work devoted to the calculation of the rovibrational line list of the CO molecule [G. Li et al., Astrophys. J., Suppl. Ser. 216, 15 (2015)], rigorous validation of the calculated parameters, including intensities, was carried out. In particular, the Normal Intensity Distribution Law (NIDL) [E. S. Medvedev, J. Chem. Phys. 137, 174307 (2012)] was employed for the validation purposes, and it was found that, in the original CO line list calculated for large changes of the vibrational quantum number up to Δn = 41, intensities with Δn > 11 were unphysical. Therefore, very high overtone transitions were removed from the published list in Li et al. Here, we show how this type of validation is carried out and demonstrate that quadruple precision is indispensable for predicting reliable intensities on conventional 32-bit computers. Based on these calculations, the NIDL is shown to hold for the 0 → n transitions up to the dissociation limit around n = 83, covering 45 orders of magnitude in intensity. The low-intensity 0 → n transition predicted in the work of Medvedev [Determination of a new molecular constant for diatomic systems. Normal intensity distribution law for overtone spectra of diatomic and polyatomic molecules and anomalies in overtone absorption spectra of diatomic molecules, Institute of Chemical Physics, Russian Academy of Sciences, Chernogolovka, 1984] at n = 5 is confirmed, and two additional “abnormal” intensities are found at n = 14 and 23. Criteria for the appearance of such “anomalies” are formulated. The results could be useful for revising the high-overtone molecular transition probabilities provided in spectroscopic databases.

  11. Peculiarities of high-overtone transition probabilities in carbon monoxide revealed by high-precision calculation

    Science.gov (United States)

    Medvedev, Emile S.; Meshkov, Vladimir V.; Stolyarov, Andrey V.; Gordon, Iouli E.

    2015-10-01

    In the recent work devoted to the calculation of the rovibrational line list of the CO molecule [G. Li et al., Astrophys. J., Suppl. Ser. 216, 15 (2015)], rigorous validation of the calculated parameters, including intensities, was carried out. In particular, the Normal Intensity Distribution Law (NIDL) [E. S. Medvedev, J. Chem. Phys. 137, 174307 (2012)] was employed for the validation purposes, and it was found that, in the original CO line list calculated for large changes of the vibrational quantum number up to Δn = 41, intensities with Δn > 11 were unphysical. Therefore, very high overtone transitions were removed from the published list in Li et al. Here, we show how this type of validation is carried out and demonstrate that quadruple precision is indispensable for predicting reliable intensities on conventional 32-bit computers. Based on these calculations, the NIDL is shown to hold for the 0 → n transitions up to the dissociation limit around n = 83, covering 45 orders of magnitude in intensity. The low-intensity 0 → n transition predicted in the work of Medvedev [Determination of a new molecular constant for diatomic systems. Normal intensity distribution law for overtone spectra of diatomic and polyatomic molecules and anomalies in overtone absorption spectra of diatomic molecules, Institute of Chemical Physics, Russian Academy of Sciences, Chernogolovka, 1984] at n = 5 is confirmed, and two additional "abnormal" intensities are found at n = 14 and 23. Criteria for the appearance of such "anomalies" are formulated. The results could be useful for revising the high-overtone molecular transition probabilities provided in spectroscopic databases.

  12. Peculiarities of high-overtone transition probabilities in carbon monoxide revealed by high-precision calculation

    Energy Technology Data Exchange (ETDEWEB)

    Medvedev, Emile S., E-mail: esmedved@orc.ru [The Institute of Problems of Chemical Physics, Russian Academy of Sciences, Prospect Akademika Semenova 1, 142432 Chernogolovka (Russian Federation); Meshkov, Vladimir V.; Stolyarov, Andrey V. [Department of Chemistry, Lomonosov Moscow State University, Leninskie gory 1/3, 119991 Moscow (Russian Federation); Gordon, Iouli E. [Atomic and Molecular Physics Division, Harvard-Smithsonian Center for Astrophysics, 60 Garden St, Cambridge, Massachusetts 02138 (United States)

    2015-10-21

    In the recent work devoted to the calculation of the rovibrational line list of the CO molecule [G. Li et al., Astrophys. J., Suppl. Ser. 216, 15 (2015)], rigorous validation of the calculated parameters, including intensities, was carried out. In particular, the Normal Intensity Distribution Law (NIDL) [E. S. Medvedev, J. Chem. Phys. 137, 174307 (2012)] was employed for the validation purposes, and it was found that, in the original CO line list calculated for large changes of the vibrational quantum number up to Δn = 41, intensities with Δn > 11 were unphysical. Therefore, very high overtone transitions were removed from the published list in Li et al. Here, we show how this type of validation is carried out and demonstrate that quadruple precision is indispensable for predicting reliable intensities on conventional 32-bit computers. Based on these calculations, the NIDL is shown to hold for the 0 → n transitions up to the dissociation limit around n = 83, covering 45 orders of magnitude in intensity. The low-intensity 0 → n transition predicted in the work of Medvedev [Determination of a new molecular constant for diatomic systems. Normal intensity distribution law for overtone spectra of diatomic and polyatomic molecules and anomalies in overtone absorption spectra of diatomic molecules, Institute of Chemical Physics, Russian Academy of Sciences, Chernogolovka, 1984] at n = 5 is confirmed, and two additional “abnormal” intensities are found at n = 14 and 23. Criteria for the appearance of such “anomalies” are formulated. The results could be useful for revising the high-overtone molecular transition probabilities provided in spectroscopic databases.

  13. Theoretical Calculations of Thermal Broadenings and Transition Probabilities of R, R' and B Line-Groups for Ruby

    Institute of Scientific and Technical Information of China (English)

    MA Dong-Ping; LIU Yan-Yun; CHEN Ju-Rong

    2001-01-01

    On the basis of a unified calculation of the thermal shifts of the R1 and R2 lines, the thermal broadenings of the R, R' and B line-groups and the transition probabilities of the direct and Raman processes of the ground-state splitting have been calculated theoretically. The theoretically predicted transition probabilities are in good agreement with the experimental ones. PACS numbers: 71.70.Ch, 78.20.Nv, 63.20.Mt, 63.20.Kr

  14. A simple method to calculate the influence of dose inhomogeneity and fractionation in normal tissue complication probability evaluation

    International Nuclear Information System (INIS)

    Since volumetric dose distributions are available with 3-dimensional radiotherapy treatment planning, they can be used in the statistical evaluation of response to radiation. This report presents a method to calculate the influence of dose inhomogeneity and fractionation in normal tissue complication probability evaluation. The mathematical expression for the calculation of normal tissue complication probability has been derived by combining the Lyman model with the histogram reduction method of Kutcher et al. and using the normalized total dose (NTD) instead of the total dose. The fitting of published tolerance data, in the case of homogeneous or partial brain irradiation, has been considered. For the same total- or partial-volume homogeneous irradiation of the brain, curves of normal tissue complication probability have been calculated with fraction sizes of 1.5 Gy and 3 Gy instead of 2 Gy, to show the influence of fraction size. The influence of dose distribution inhomogeneity and of the α/β value has also been simulated: considering α/β = 1.6 Gy or α/β = 4.1 Gy for kidney clinical nephritis, the calculated curves of normal tissue complication probability are shown. Combining NTD calculations and histogram reduction techniques, normal tissue complication probability can be estimated taking into account the most relevant contributing factors, including the volume effect. (orig.)
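
    A compact sketch of this combination (Lyman model, Kutcher-style effective-volume histogram reduction, and NTD rescaling of fraction size) is given below; the DVH and the parameter values are hypothetical, chosen only to exercise the formulas.

```python
import numpy as np
from scipy.special import erf

def ntd(total_dose, dose_per_fraction, alpha_beta, ref_fx=2.0):
    """Normalized total dose: rescale to 2-Gy-fraction equivalence."""
    return total_dose * (dose_per_fraction + alpha_beta) / (ref_fx + alpha_beta)

def lkb_ntcp(doses, volumes, td50, m, n):
    """Lyman NTCP with effective-volume histogram reduction.
    doses/volumes: differential DVH (Gy, fractional volume)."""
    v_eff = np.sum(volumes * (doses / doses.max()) ** (1.0 / n))
    td50_v = td50 * v_eff ** (-n)            # volume-scaled tolerance dose
    t = (doses.max() - td50_v) / (m * td50_v)
    return 0.5 * (1.0 + erf(t / np.sqrt(2.0)))

# Hypothetical kidney DVH, 3 Gy fractions rescaled with alpha/beta = 1.6 Gy
d = ntd(np.array([10.0, 20.0, 30.0]), 3.0, 1.6)
v = np.array([0.2, 0.5, 0.3])
print(f"NTCP = {lkb_ntcp(d, v, td50=28.0, m=0.1, n=0.7):.3f}")
```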

  15. An analytical calculation of neighbourhood order probabilities for high dimensional Poissonian processes and mean field models

    International Nuclear Information System (INIS)

    Consider that the coordinates of N points are randomly generated along the edges of a d-dimensional hypercube (the random point problem). The probability P_{m,n}(d,N) that an arbitrary point is the mth nearest neighbour to its own nth nearest neighbour (a Cox probability) plays an important role in spatial statistics. It has also been useful in the description of physical processes in disordered media. Here we propose a simpler derivation of the Cox probabilities, in which we stress the role played by the system dimensionality d. In the limit d → ∞, the distances between pairs of points become independent (the random link model) and closed analytical forms for the neighbourhood probabilities are obtained, both in the thermodynamic limit and for finite-size systems. Breaking the distance symmetry constraint leads to the random map model, for which the Cox probabilities are obtained for two cases: whether a point is its own nearest neighbour or not.
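
    A brute-force Monte Carlo estimate of P_{m,n}(d,N) is easy to set up and useful for checking the analytical forms; the sketch below uses plain Euclidean distance in the unit hypercube (no periodic boundaries) with arbitrary parameter choices.

```python
import numpy as np

def cox_probability(d, N, m, n, trials=2000, rng=np.random.default_rng(1)):
    """Estimate P_{m,n}(d,N): the probability that a point is the m-th
    nearest neighbour of its own n-th nearest neighbour."""
    hits = 0
    for _ in range(trials):
        pts = rng.random((N, d))
        dist = np.linalg.norm(pts[None, :, :] - pts[:, None, :], axis=-1)
        np.fill_diagonal(dist, np.inf)
        j = np.argsort(dist[0])[n - 1]               # n-th NN of point 0
        rank = np.where(np.argsort(dist[j]) == 0)[0][0] + 1
        hits += (rank == m)
    return hits / trials

print(cox_probability(d=2, N=50, m=1, n=1))          # mutual nearest neighbours
```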

  16. Prospective validation of a risk calculator which calculates the probability of a positive prostate biopsy in a contemporary clinical cohort

    NARCIS (Netherlands)

    van Vugt, Heidi A.; Kranse, Ries; Steyerberg, Ewout W.; van der Poel, Henk G.; Busstra, Martijn; Kil, Paul; Oomens, Eric H.; de Jong, Igle J.; Bangma, Chris H.; Roobol, Monique J.

    2012-01-01

    Background: Prediction models need validation to assess their value outside the development setting. Objective: To assess the external validity of the European Randomised study of Screening for Prostate Cancer (ERSPC) Risk Calculator (RC) in a contemporary clinical cohort. Methods: The RC calculates

  17. A polynomial time algorithm for calculating the probability of a ranked gene tree given a species tree

    OpenAIRE

    Stadler, Tanja; Degnan, James H

    2012-01-01

    Background The ancestries of genes form gene trees which do not necessarily have the same topology as the species tree due to incomplete lineage sorting. Available algorithms determining the probability of a gene tree given a species tree require exponential computational runtime. Results In this paper, we provide a polynomial time algorithm to calculate the probability of a ranked gene tree topology for a given species tree, where a ranked tree topology is a tree topology with the internal v...

  18. The risk of a major nuclear accident: calculation and perception of probabilities

    International Nuclear Information System (INIS)

    The accident at Fukushima Daiichi, Japan, occurred on 11 March 2011. This nuclear disaster, the third on such a scale, left a lasting mark in the minds of hundreds of millions of people. Much as with Three Mile Island or Chernobyl, yet another place name will be permanently associated with a nuclear power plant that went out of control. Fukushima Daiichi revived the issue of the hazards of civil nuclear power, stirring up all the associated passion and emotion. The whole of this paper is devoted to the risk of a major nuclear accident. By this we mean a failure initiating core meltdown, a situation in which the fuel rods melt and mix with the metal in their cladding. Such accidents are classified as at least level 5 on the International Nuclear Event Scale. The Three Mile Island accident, which occurred in 1979 in the United States, reached this level of severity. The explosion of reactor 4 at the Chernobyl plant in Ukraine in 1986 and the recent accident in Japan were classified as level 7, the highest grade on this logarithmic scale. The main difference between the top two levels and level 5 relates to a significant or major release of radioactive material to the environment. In the event of a level-5 accident, damage is restricted to the inside of the plant, whereas in the case of level-7 accidents huge areas of land, above or below the surface, and/or sea may be contaminated. Before the meltdown of reactors 1, 2 and 3 at Fukushima Daiichi, eight major accidents affecting nuclear power plants had occurred worldwide. This is a high figure compared with the one calculated by the experts. Observations in the field do not appear to fit the results of the probabilistic models of nuclear accidents produced since the 1970s. Oddly enough, the number of major accidents is closer to the risk as perceived by the general public. In general we tend to overestimate any risk relating to rare, fearsome accidents. What are we to make of this divergence? How are we to reconcile

  19. On the calculation of transition probabilities with correlated angular-momentum-projected wave functions and realistic forces

    International Nuclear Information System (INIS)

    In this paper we propose the use of angular-momentum-projected generator coordinate method (GCM) wave functions for the evaluation of transition probabilities in heavy nuclei. We derive the relevant equations and discuss ways to cope with the technical difficulties which appear in the application of the theory. We show the feasibility of the method by applying it to the calculation of B(E3) transition probabilities in light nuclei within the GCM, in the Gaussian overlap approximation (GOA). In the calculations we use the density-dependent Gogny force. The theoretical projected results are in much better agreement with experiment than the unprojected ones. (orig.)

  20. Improved techniques for outgoing wave variational principle calculations of converged state-to-state transition probabilities for chemical reactions

    Science.gov (United States)

    Mielke, Steven L.; Truhlar, Donald G.; Schwenke, David W.

    1991-01-01

    Improved techniques and well-optimized basis sets are presented for application of the outgoing wave variational principle to calculate converged quantum mechanical reaction probabilities. They are illustrated with calculations for the reactions D + H2 yields HD + H with total angular momentum J = 3 and F + H2 yields HF + H with J = 0 and 3. The optimization involves the choice of distortion potential, the grid for calculating half-integrated Green's functions, the placement, width, and number of primitive distributed Gaussians, and the computationally most efficient partition between dynamically adapted and primitive basis functions. Benchmark calculations with 224-1064 channels are presented.

  1. ELIPGRID-PC: A PC program for calculating hot spot probabilities

    International Nuclear Information System (INIS)

    ELIPGRID-PC, a new personal computer program, has been developed to provide easy access to Singer's 1972 ELIPGRID algorithm for hot-spot detection probabilities. Three features of the program are the ability to determine: (1) the grid size required for specified conditions, (2) the smallest hot spot that can be sampled with a given probability, and (3) the approximate grid size resulting from specified conditions and sampling cost. ELIPGRID-PC also provides probability-of-hit versus cost data for graphing with spreadsheets or graphics software. The program has been successfully tested using Singer's published ELIPGRID results. An apparent error in the original ELIPGRID code has been uncovered and an appropriate modification incorporated into the new program.
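
    A brute-force Monte Carlo check of a grid hit probability is a useful stand-in for the analytic ELIPGRID result. The sketch below handles the simplest case, a circular hot spot on a square grid with arbitrary radius and spacing; the real algorithm covers elliptical targets and other grid patterns.

```python
import numpy as np

def hit_probability(radius, spacing, trials=200_000,
                    rng=np.random.default_rng(4)):
    """P(at least one grid node falls inside a circular hot spot whose
    centre is uniformly random within one grid cell)."""
    centers = rng.random((trials, 2)) * spacing
    # nearest grid node (nodes sit at integer multiples of the spacing)
    nearest = np.round(centers / spacing) * spacing
    return float(np.mean(np.linalg.norm(centers - nearest, axis=1) <= radius))

print(hit_probability(radius=0.4, spacing=1.0))  # ~ pi*r^2 for r <= spacing/2
```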

  2. Chronology of Postglacial Eruptive Activity and Calculation of Eruption Probabilities for Medicine Lake Volcano, Northern California

    Science.gov (United States)

    Nathenson, Manuel; Donnelly-Nolan, Julie M.; Champion, Duane E.; Lowenstern, Jacob B.

    2007-01-01

    Medicine Lake volcano has had 4 eruptive episodes in its postglacial history (since 13,000 years ago) comprising 16 eruptions. Time intervals between events within the episodes are relatively short, whereas time intervals between the episodes are much longer. An updated radiocarbon chronology for these eruptions is presented that uses paleomagnetic data to constrain the choice of calibrated ages. This chronology is used with exponential, Weibull, and mixed-exponential probability distributions to model the data for time intervals between eruptions. The mixed exponential distribution is the best match to the data and provides estimates for the conditional probability of a future eruption given the time since the last eruption. The probability of an eruption at Medicine Lake volcano in the next year from today is 0.00028.
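
    The conditional-probability step follows directly from the fitted repose-interval distribution: P(eruption within Δt | quiet for T) = [S(T) - S(T+Δt)]/S(T), with S the survival function. The sketch below evaluates this for a mixed-exponential model with invented weights and rates, not the paper's fitted values.

```python
import numpy as np

w   = np.array([0.7, 0.3])          # hypothetical mixture weights
lam = np.array([1/200.0, 1/3000.0]) # hypothetical rates per year

def survival(t):
    """S(t) for a mixed-exponential repose-interval distribution."""
    return np.sum(w * np.exp(-lam * t))

def p_eruption(t_since, window=1.0):
    """P(eruption within `window` yr | quiet for t_since yr)."""
    return (survival(t_since) - survival(t_since + window)) / survival(t_since)

print(f"{p_eruption(t_since=950.0):.5f}")   # e.g. 950 yr since last eruption
```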

  3. Calculation of rotational transition probabilities in molecular collisions - Application to N2 + N2

    Science.gov (United States)

    Itikawa, Y.

    1975-01-01

    A computational method is proposed to obtain rotational transition probabilities in collisions between two diatomic molecules. The potential method of Rabitz and an exponential approximation are used to solve the semiclassical coupled equations without invoking any perturbational technique. The collision trajectory is determined in the classical modified-wave-number approximation. The method can treat systems involving strong interactions and provide probabilities for transitions even with a multiquantum jump. A simultaneous transition in the rotational states of both molecules, i.e., the rotational-rotational energy transfer, is taken into account. An application to the system N2 + N2 is presented.

  4. On a best-estimate approach to the calculation of dryout probability during BWR transients

    International Nuclear Information System (INIS)

    A method is proposed whereby uncertainty of any dryout margin measure (figure of merit) may be quantified when the only experimental information available for validation is whether dryout has occurred or not. The method does not involve the heater temperature, except as a discrete dryout indicator. This is an advantage when analysing anticipated operational occurrences for which the acceptance criterion refers exclusively to the probability of dryout occurrence. The derived uncertainty provides a direct relation between the simulated dryout margin and the aforementioned probability. Furthermore, the method, which is based on logistic regression, has been designed to be consistent with more common parametric methods of uncertainty analysis that are likely to be used for other parts of a thermal hydraulic model. One example is provided where the method is utilized to assess statistical properties, which would have been difficult to quantify by other means. (author)
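
    Since the method rests on logistic regression between a margin measure and binary dryout observations, a minimal sketch looks as follows; the margins, the hidden relation and the outcomes are synthetic stand-ins, not data from any dryout test campaign.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
margin = rng.uniform(-0.2, 0.4, 200).reshape(-1, 1)   # figure of merit
p_true = 1.0 / (1.0 + np.exp(25.0 * margin.ravel()))  # hidden relation
dryout = rng.random(200) < p_true                     # binary observations

# Fit P(dryout | margin) from occurrence data alone
model = LogisticRegression().fit(margin, dryout)
print("P(dryout | margin=0.05):",
      model.predict_proba([[0.05]])[0, 1].round(3))
```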

  5. Monte Carlo calculation of the total probability for gamma-Ray interaction in toluene

    International Nuclear Information System (INIS)

    Interaction and absorption probabilities for gamma rays with energies between 1 and 1000 keV have been computed and tabulated. A toluene-based scintillator solution is assumed in the computation, and both point sources and homogeneously dispersed radioactive material are considered. The tables may be applied to cylinders with radii between 1.25 cm and 0.25 cm and heights between 4.07 cm and 0.20 cm. (Author) 26 refs.

  6. Special Issue on Critical Assessment of Theoretical Calculations of Atomic Structure and Transition Probabilities

    OpenAIRE

    Per Jönsson; Hyun-Kyung Chung

    2013-01-01

    Several codes exist in the atomic physics community to generate atomic structure and transition probabilities; they are freely and readily distributed to researchers outside the atomic physics community, in the plasma, astrophysical or nuclear physics communities. Users take these atomic physics codes to generate the necessary atomic data, or modify the codes for their own applications. However, there has been very little effort to validate and verify the data sets generated by non-expert users. [...

  7. Optimization of next-event estimation probability in Monte Carlo shielding calculations

    International Nuclear Information System (INIS)

    In Monte Carlo radiation transport calculations with point detectors, the next-event estimation is employed to estimate the response to each detector from all collision sites. The computation time required for this estimation process is substantial and often exceeds the time required to generate and process particle histories in a calculation. This estimation from all collision sites is, therefore, very wasteful in Monte Carlo shielding calculations. For example, in the source region and in regions far away from the detectors, the next-event contribution of a particle is often very small and insignificant. A method for reducing this inefficiency is described

  8. Integral transport multiregion geometrical shadowing factor for the approximate collision probability matrix calculation of infinite closely packed lattices

    International Nuclear Information System (INIS)

    An integral transport method for calculating the geometrical shadowing factor in multiregion annular cells for infinite closely packed lattices in cylindrical geometry is developed. This analytical method has been programmed in the TPGS code. The method is based upon a consideration of the properties of the integral transport method for a nonuniform body, which together with Bonalumi's approximations allows the determination of the approximate multiregion collision probability matrix for infinite closely packed lattices with sufficient accuracy. The multiregion geometrical shadowing factors have been calculated for variations in fuel-pin annular segment rings in a geometry of annular cells. These shadowing factors can then be used in the calculation of neutron transport from one annulus to another in an infinite lattice. The results of this new geometrical shadowing and collision probability matrix are compared with the Dancoff-Ginsburg correction and with the probability matrix using constant shadowing on Yankee fuel elements in an infinite lattice. In these cases the Dancoff-Ginsburg correction factor and the collision probability matrix using constant shadowing differ by at most 6.2% and 6%, respectively.

  9. Large-scale Breit-Pauli R-matrix calculations for transition probabilities of Fe V

    OpenAIRE

    Nahar, Sultana N.; Pradhan, Anil K.

    2000-01-01

    Ab initio theoretical calculations are reported for the electric dipole (E1) allowed and intercombination fine-structure transitions in Fe V using the Breit-Pauli R-matrix (BPRM) method. We obtain 3865 bound fine-structure levels of Fe V and 1.46 × 10⁶ oscillator strengths, Einstein A-coefficients and line strengths. In addition to the relativistic effects, the intermediate coupling calculations include extensive electron correlation effects that represent the complex configuration interac...

  10. Relativistic calculation of Kβ hypersatellite energies and transition probabilities for selected atoms with 13 ≤ Z ≤ 80

    CERN Document Server

    Costa, A M; Santos, J P; Indelicato, P J; Parente, F; Indelicato, Paul

    2006-01-01

    Energies and transition probabilities of Kβ hypersatellite lines are computed using the Dirac-Fock model for several values of Z throughout the periodic table. The influence of the Breit interaction on the energy shifts from the corresponding diagram lines and on the Kβ₁ʰ/Kβ₃ʰ intensity ratio is evaluated. The widths of the double-K-hole levels are calculated for Al and Sc. The results are compared to experiment and to other theoretical calculations.

  11. Internationally comparable diagnosis-specific survival probabilities for calculation of the ICD-10-based Injury Severity Score

    DEFF Research Database (Denmark)

    Gedeborg, R.; Warner, M.; Chen, L. H.;

    2014-01-01

    BACKGROUND: The International Statistical Classification of Diseases, 10th Revision (ICD-10)-based Injury Severity Score (ICISS) performs well but requires diagnosis-specific survival probabilities (DSPs), which are empirically derived, for its calculation. The objective was to examine if DSPs pooled across several countries could be used in place of each country's own DSPs. … Compared with using each country's own DSPs for ICISS calculation, the pooled DSPs resulted in somewhat reduced discrimination in predicting mortality (the difference in c statistic varied from 0.006 to 0.04). Calibration was generally good when the predicted mortality risk was less than 20%. When Danish and Swedish data were used...

  12. Calculated level energies, transition probabilities, and lifetimes of silicon-like ions

    International Nuclear Information System (INIS)

    The authors present theoretical excitation energies and lifetimes for the 27 low-lying levels of the silicon-like ions of S, Ar, Ca, Ti, Fe, Zn, and Kr (16 ≤ Z ≤ 36). Special attention has been paid to providing a complete tabulation of all electric-dipole (E1) allowed transitions from levels of the 3s3p³ and 3s²3p3d excited configurations to those of the 3s²3p² ground-state configuration, including all weak and intercombination transitions. Large-scale multiconfiguration Dirac-Fock wave functions are applied to compute transition energies and probabilities. They further investigate the decay of the 3s²3p3d J = 4 level, which is connected to the ground-state configuration only via forbidden M2 transitions but otherwise mainly decays via M1 transitions to lower-lying levels of the same parity. For a few selected data, they compare the results with experiment and with previous computations.

  13. Numerical calculation of vibrational transition probability for the forced morse oscillator by use of the anharmonic boson operators

    International Nuclear Information System (INIS)

    The vibrational transition probability expressions for the forced Morse oscillator have been derived using the commutation relations of the anharmonic boson operators. The formulation is based on the collinear collision model with an exponential repulsive potential, in the framework of semiclassical collision dynamics. The sample calculation results for the H2 + He collision system, where the anharmonicity is large, are in excellent agreement with those from an exact numerical quantum mechanical study by Clark and Dickinson using the reactance matrix. Our results, however, are markedly different from those of Ree, Kim, and Shin, who approximate the commutation operator I₀ as unity, the harmonic oscillator limit. We conclude that the quantum number dependence of I₀ must be retained to obtain accurate vibrational transition probabilities for the Morse oscillator.

  14. A massively parallel algorithm for the collision probability calculations in the Apollo-II code using the PVM library

    International Nuclear Information System (INIS)

    The collision probability method in neutron transport, as applied to 2D geometries, consumes a great amount of computer time; for a typical 2D assembly calculation, about 90% of the computing time is spent in the collision probability evaluations. Consequently, RZ or 3D calculations become prohibitive. In this paper the author presents a simple but efficient parallel algorithm based on the message-passing host/node programming model. Parallelization was applied to the energy group treatment. Such an approach permits parallelization of the existing code with only limited modifications. Sequential/parallel code portability is preserved, which is a necessary condition for an industrial code; sequential performance is also preserved. The algorithm is implemented on a CRAY 90 coupled to a 128-processor T3D computer, a 16-processor IBM SP1 and a network of workstations, using the public domain PVM library. The tests were executed for a 2D geometry with the standard 99-group library. All results were very satisfactory, the best ones being obtained with the IBM SP1. Because of the heterogeneity of the workstation network, the author did not expect high performance from this architecture. The same source code was used for all computers. A more impressive advantage of this algorithm will appear in the calculations of the SAPHYR project (with the future fine multigroup library of about 8000 groups) on a massively parallel computer with several hundred processors.

  15. Calculation of transition probabilities and ac Stark shifts in two-photon laser transitions of antiprotonic helium

    OpenAIRE

    HORI, MASAKI; Korobov, Vladimir I.

    2010-01-01

    Numerical ab initio variational calculations of the transition probabilities and ac Stark shifts in two-photon transitions of antiprotonic helium atoms driven by two counter-propagating laser beams are presented. We found that sub-Doppler spectroscopy is in principle possible by exciting transitions of the type (n,L) → (n-2,L-2) between antiprotonic states of principal and angular momentum quantum numbers n ∼ L-1 ∼ 35, first by using highly monochromatic, nanosecond laser beams of intensities 10^4...

  16. Fine-structure calculations of energy levels, oscillator strengths, and transition probabilities for sulfur-like iron, Fe XI

    International Nuclear Information System (INIS)

    Energy levels, oscillator strengths, and transition probabilities for transitions among the 14 LS states belonging to configurations of sulfur-like iron, Fe XI, have been calculated. These states are represented by configuration interaction wavefunctions and have the configurations 3s²3p⁴, 3s3p⁵, 3s²3p³3d, 3s²3p³4s, 3s²3p³4p, and 3s²3p³4d, which give rise to 123 fine-structure energy levels. Extensive configuration interaction calculations using the CIV3 code have been performed. To assess the importance of relativistic effects, the intermediate coupling scheme by means of the Breit–Pauli Hamiltonian terms, such as the one-body mass correction and Darwin term and the spin–orbit, spin–other-orbit, and spin–spin corrections, is incorporated within the code. These corrections adjusted the energy levels, so the calculated values are close to the available experimental data. Comparisons of the present energy levels and oscillator strengths with both experimental and theoretical data have been performed. Our results show good agreement with earlier works, and they might be useful in thermonuclear fusion research and astrophysical applications. -- Highlights: •Accurate atomic data of iron ions are needed for identification of solar corona lines. •Extensive configuration interaction wavefunctions including 123 fine-structure levels have been calculated. •The relativistic effects by means of the Breit–Pauli Hamiltonian terms are incorporated. •This incorporation adjusts the energy levels, so the calculated values are close to experimental values.

  17. Calculation of transition probabilities and ac Stark shifts in two-photon laser transitions of antiprotonic helium

    International Nuclear Information System (INIS)

    Numerical ab initio variational calculations of the transition probabilities and ac Stark shifts in two-photon transitions of antiprotonic helium atoms driven by two counter-propagating laser beams are presented. We found that sub-Doppler spectroscopy is, in principle, possible by exciting transitions of the type (n,L) → (n-2,L-2) between antiprotonic states of principal and angular momentum quantum numbers n ∼ L-1 ∼ 35, first by using highly monochromatic, nanosecond laser beams of intensities 10⁴-10⁵ W/cm², and then by tuning the virtual intermediate state close (e.g., within 10-20 GHz) to the real state (n-1,L-1) to enhance the nonlinear transition probability. We expect that ac Stark shifts of a few MHz or more will become an important source of systematic error at fractional precisions better than a few parts in 10⁹. These shifts can, in principle, be minimized and even canceled by selecting an optimum combination of laser intensities and frequencies. We simulated the resonance profiles of some two-photon transitions in the regions n = 30-40 of the p̄⁴He⁺ and p̄³He⁺ isotopes to find the best conditions that would allow this.

  18. Seismic Hazard Evaluation in Western Turkey as Revealed by Stress Transfer and Time-dependent Probability Calculations

    Science.gov (United States)

    Paradisopoulou, P. M.; Papadimitriou, E. E.; Karakostas, V. G.; Taymaz, T.; Kilias, A.; Yolsal, S.

    2010-08-01

    … hazards is given by translating the calculated stress changes into earthquake probability using an earthquake nucleation constitutive relation, which includes permanent and transient effects of the sudden stress changes.

  19. Neutron Flux Interpolation with Finite Element Method in the Nuclear Fuel Cell Calculation using Collision Probability Method

    International Nuclear Information System (INIS)

    Nuclear reactor design and the analysis of next-generation reactors require comprehensive computing, which is best executed on high-performance computers. The flat-flux (FF) approach is a common approach to solving the integral transport equation with the collision probability (CP) method. In fact, the neutron flux distribution is not flat, even though the neutron cross section is assumed to be equal in all regions and the neutron source is uniform throughout the nuclear fuel cell. In the non-flat-flux (NFF) approach, the distribution of neutrons in each region differs depending on the chosen interpolation model. In this study, linear interpolation using the finite element method (FEM) has been carried out to treat the neutron distribution. The CP method is well suited to solving the neutron transport equation for cylindrical geometry, because the angular integration can be done analytically. The distribution of neutrons in each region can be described by the NFF approach with FEM, and the calculation results are in good agreement with those from the SRAC code. In this study, the effects of the mesh on k_eff and other parameters are also investigated.

  20. Configuration-interaction plus many-body-perturbation-theory calculations of Si I transition probabilities, oscillator strengths, and lifetimes

    Science.gov (United States)

    Savukov, I. M.

    2016-02-01

    The precision of the mixed configuration-interaction plus many-body-perturbation-theory (CI+MBPT) method is limited in multivalence atoms by the large size of the valence CI space. Previously, to study this problem, the CI+MBPT method was applied to calculations of energies in a four-valence-electron atom, Si I. It was found that by using a relatively small cavity of 30 a.u. and by choosing the configuration space carefully, quite accurate agreement between theory and experiment, at the level of 100 cm⁻¹, can be obtained, especially after subtraction of systematic shifts for groups of states of the same J and parity. However, other properties are also important to investigate. In this work, the CI+MBPT method is applied to studies of transition probabilities, oscillator strengths, and lifetimes. Close agreement with accurate experimental measurements and other elaborate theories is obtained. The long-term goal is to extend the CI+MBPT approach to applications in more complex atoms, such as lanthanides and actinides.

  1. Relativistic Many-body Møller-Plesset Perturbation Theory Calculations of the Energy Levels and Transition Probabilities in Na- to P-like Xe Ions

    Energy Technology Data Exchange (ETDEWEB)

    Vilkas, M J; Ishikawa, Y; Trabert, E

    2007-03-27

    Relativistic multireference many-body perturbation theory calculations have been performed on Xe⁴³⁺-Xe³⁹⁺ ions, resulting in energy levels, electric dipole transition probabilities, and level lifetimes. The second-order many-body perturbation theory calculation of energy levels included mass shifts, the frequency-dependent Breit correction and Lamb shifts. The calculated transition energies and E1 transition rates are used to present synthetic spectra in the extreme ultraviolet range for some of the Xe ions.

  2. Theoretical calculation for forbidden transition probability of ΔL=±2, ΔS=±1 lines in N II

    International Nuclear Information System (INIS)

    The forbidden transition probabilities have been calculated for the 2p3d-2p² and 2p3d-2p3p (ΔL = ±2, ΔS = ±1) spectra of N II by using a large-scale multiconfiguration Dirac-Fock (MCDF) method. The most important effects of relativity, correlation, and relaxation are considered in the calculation. Compared with other calculations, a remarkable improvement is achieved.

  3. Calculation of radiation induced complication probabilities for brain, liver and kidney, and the use of a reliability model to estimate critical volume fractions

    International Nuclear Information System (INIS)

    Radiation-induced normal tissue complication probability is calculated for three different organs: brain, liver and kidney. The model applied is a reliability model in which the volume effect of the tissue is described by the structural parameter k, which reflects the architecture of the functional subunits of the organ. The complication probability depends on k, the inactivation probability of the functional subunits (p) and the irradiated volume fraction (n). For partial, homogeneous irradiation of the brain, a k-value close to unity was found; the respective values for liver and kidney were 0.92 and 0.77. An extension of the reliability model to account for individual inactivation probabilities of the subunits allows calculation of the complication probability for inhomogeneous dose distributions. For the brain, intercomparison of a three-field and a two-field technique demonstrated a small reduction in complication probability for the former at low total doses. At high total doses a minimum complication probability was achieved with the three-field technique, being three times lower than that associated with the two-field technique. (author)

  4. Combining scenarios in a calculation of the overall probability distribution of cumulative releases of radioactivity from the Waste Isolation Pilot Plant, southeastern New Mexico

    International Nuclear Information System (INIS)

    The Waste Isolation Pilot Plant (WIPP), in southeastern New Mexico, is a research and development facility to demonstrate safe disposal of defense-generated transuranic waste. The US Department of Energy will designate WIPP as a disposal facility if it meets the US Environmental Protection Agency's standard for disposal of such waste; the standard includes a requirement that estimates of cumulative releases of radioactivity to the accessible environment be incorporated in an overall probability distribution. The WIPP Project has chosen an approach to the calculation of an overall probability distribution that employs the concept of scenarios for release and transport of radioactivity to the accessible environment. This report reviews the use of Monte Carlo methods in the calculation of an overall probability distribution and presents a logical and mathematical foundation for the use of the scenario concept in such calculations. The report also draws preliminary conclusions regarding the shape of the probability distribution for the WIPP system, based on the possible occurrence of three events and the presence of one feature: namely, the events "attempted boreholes over rooms and drifts," "mining alters ground-water regime," and "water-withdrawal wells provide alternate pathways," and the feature "brine pocket below room or drift." These preliminary conclusions rest on calculation of the WIPP system's overall probability distribution for only five of the sixteen possible scenario classes that can be obtained by combining the four postulated events and features.
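
    The combinatorics behind the sixteen scenario classes, and the weighting of per-scenario release distributions into one overall exceedance curve, can be sketched as below. The event probabilities and the lognormal release model are invented placeholders, not WIPP performance-assessment values.

```python
import itertools
import numpy as np

events = {"boreholes": 0.1, "mining": 0.05, "wells": 0.02, "brine": 0.3}
rng = np.random.default_rng(3)

def release_sample(active, n=10_000):
    """Hypothetical per-scenario release model: each active event adds a
    lognormal contribution to the cumulative normalized release."""
    out = np.zeros(n)
    for name, on in active.items():
        if on:
            out += rng.lognormal(mean=-3.0, sigma=1.0, size=n)
    return out

grid = np.logspace(-4, 1, 50)            # normalized-release thresholds
ccdf = np.zeros_like(grid)
for combo in itertools.product([False, True], repeat=4):  # 16 classes
    active = dict(zip(events, combo))
    p = np.prod([events[k] if on else 1 - events[k]
                 for k, on in active.items()])             # class weight
    rel = release_sample(active)
    ccdf += p * np.array([(rel > r).mean() for r in grid])

print("P(release > 1):", round(float(ccdf[np.searchsorted(grid, 1.0)]), 4))
```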

  5. PAPIN: A Fortran-IV program to calculate cross section probability tables, Bondarenko and transmission self-shielding factors for fertile isotopes in the unresolved resonance region

    International Nuclear Information System (INIS)

    The Fortran IV code PAPIN has been developed to calculate cross section probability tables, Bondarenko self-shielding factors and average self-indication ratios for non-fissile isotopes, below the inelastic threshold, on the basis of the ENDF/B prescriptions for the unresolved resonance region. Monte-Carlo methods are utilized to generate ladders of resonance parameters in the unresolved resonance region, from average resonance parameters and their appropriate distribution functions. The neutron cross-sections are calculated by the single level Breit-Wigner (SLBW) formalism, with s, p and d-wave contributions. The cross section probability tables are constructed by sampling the Doppler-broadened cross sections. The various self-shielded factors are computed numerically as Lebesgue integrals over the cross section probability tables. The program PAPIN has been validated through extensive comparisons with several deterministic codes

  6. PAPIN: A FORTRAN-4 program to calculate cross section probability tables, Bondarenko and transmission self-shielding factors for fertile isotopes in the unresolved resonance region

    Science.gov (United States)

    Munoz-Cobos, J. G.

    1981-08-01

    A FORTRAN 4 code was developed to calculate cross section probability tables, Bondarenko self-shielding factors, and average self-indication ratios for non-fissile isotopes, below the inelastic threshold, on the basis of prescriptions for the unresolved resonance region. Monte-Carlo methods are utilized to generate ladders of resonance parameters in the unresolved resonance region, from average resonance parameters and their appropriate distribution functions. The neutron cross sections are calculated by the single level Breit-Wigner formalism, with s, p and d-wave contributions. The cross section probability tables are constructed by sampling the Doppler-broadened cross sections. The various self-shielded factors are computed numerically as Lebesgue integrals over the cross section probability tables. The program was validated through extensive comparisons with several deterministic codes.

  7. A program for calculation of the E1, E2, and M1 transition probabilities in odd-odd nuclei taking the Coriolis mixing into account

    International Nuclear Information System (INIS)

    The program makes it possible to calculate the E1, E2 and M1 reduced transition probabilities in odd-odd deformed nuclei. The mixed wave functions used result from a least-squares fit of energy levels (taking the Coriolis effect into account) to the experimental ones, performed with the modified ODDODDCORI subprogram. (orig./HSI)

  8. Calculation of the transition probabilities of superfluid Fermi gas with orbital angular momentum l=1 at low temperatures

    Directory of Open Access Journals (Sweden)

    S Nasirimoghadam

    2011-09-01

    Full Text Available  Ultracold fermionic atom gases such as 6Li undergo a transition to a superfluid state. The transport quantities of these fluids depend directly on the transition probabilities. Here, by working out the possible processes in a p-wave superfluid, we have shown that only binary processes are dominant at low temperatures.

  9. Calculation of probabilities of transfer, recurrence intervals, and positional indices for linear compartment models. Environmental Sciences Division Publication no. 1544

    International Nuclear Information System (INIS)

    Six indices are presented for linear compartment systems that quantify the probable pathways of matter or energy transfer, the likelihood of recurrence if the model contains feedback loops, and the number of steps (transfers) through the system. General examples are used to illustrate how these indices can simplify the comparison of complex systems or organisms in unrelated systems
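
    Indices of this kind follow from standard absorbing-Markov-chain algebra for a linear compartment system; a small sketch (the three-compartment transfer matrix is invented for illustration):

```python
import numpy as np

# Transition probabilities among 3 transient compartments per transfer step;
# row sums < 1, the deficit being the probability of leaving the system.
Q = np.array([[0.0, 0.6, 0.1],
              [0.2, 0.0, 0.5],
              [0.0, 0.3, 0.0]])

N = np.linalg.inv(np.eye(3) - Q)   # fundamental matrix: expected visits to j from i

steps = N.sum(axis=1)              # expected number of transfers before leaving
p_reach = N / np.diag(N)           # probability of ever reaching j from i
                                   # (diagonal is 1 by convention: you start there)

print("expected visits (recurrence):\n", np.round(N, 3))
print("expected steps through system:", np.round(steps, 3))
print("probability of ever reaching j from i:\n", np.round(p_reach, 3))
```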

  10. Calculation of probabilities of transfer, recurrence intervals, and positional indices for linear compartment models. Environmental Sciences Division Publication no. 1544

    Energy Technology Data Exchange (ETDEWEB)

    Carney, J.H.; DeAngelis, D.L.; Gardner, R.H.; Mankin, J.B.; Post, W.M.

    1981-02-01

    Six indices are presented for linear compartment systems that quantify the probable pathways of matter or energy transfer, the likelihood of recurrence if the model contains feedback loops, and the number of steps (transfers) through the system. General examples are used to illustrate how these indices can simplify the comparison of complex systems or organisms in unrelated systems.

  11. Grit-mediated frictional ignition of a polymer-bonded explosive during oblique impacts: Probability calculations for safety engineering

    International Nuclear Information System (INIS)

    Frictional heating of high-melting-point grit particles during oblique impacts of consolidated explosives is considered to be the major source of ignition in accidents involving dropped explosives. It has been shown in other work that the lower temperature melting point of two frictionally interacting surfaces will cap the maximum temperature reached, which provides a simple way to mitigate the danger in facilities by implementing surfaces with melting points below the ignition temperature of the explosive. However, a recent series of skid testing experiments has shown that ignition can occur on low-melting-point surfaces with a high concentration of grit particles, most likely due to a grit–grit collision mechanism. For risk-based safety engineering purposes, the authors present a method to estimate the probability of grit contact and/or grit–grit collision during an oblique impact. These expressions are applied to potentially high-consequence oblique impact scenarios in order to give the probability of striking one or more grit particles (for high-melting-point surfaces), or the probability of one or more grit–grit collisions occurring (for low-melting-point surfaces). The probability is dependent on a variety of factors, many of which can be controlled for mitigation to achieve acceptable risk levels for safe explosives handling operations. - Highlights: • Unexpectedly, grit-mediated ignition of a PBX occurred on low-melting point surfaces. • On high-melting surfaces frictional heating is due to a grit–surface interaction. • For low-melting point surfaces the heating mechanism is grit–grit collisions. • A method for estimating the probability of ignition is presented for both surfaces
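
    The "one or more grit particles" probability lends itself to a simple Poisson sketch: with grit at areal density n and a swept contact area A, P(at least one contact) = 1 - exp(-nA). A hedged illustration (all numbers invented; the paper derives more detailed expressions):

```python
import math

def p_at_least_one(areal_density_per_mm2, swept_area_mm2):
    """Poisson probability of >= 1 grit contact in the swept area."""
    expected = areal_density_per_mm2 * swept_area_mm2
    return 1.0 - math.exp(-expected)

# Example: 0.02 particles/mm^2 on the surface, 50 mm^2 swept during the skid.
print(p_at_least_one(0.02, 50.0))    # ~0.63

# Grit-grit collisions (low-melting-point surface): both the surface and the
# explosive face carry grit, so the expected count scales with the product of
# the two densities and an effective interaction cross section.
def p_grit_grit(n_surface, n_explosive, sigma_mm2, swept_area_mm2):
    expected = n_surface * n_explosive * sigma_mm2 * swept_area_mm2
    return 1.0 - math.exp(-expected)

print(p_grit_grit(0.02, 0.01, 0.5, 50.0))
```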

  12. Lab Retriever: a software tool for calculating likelihood ratios incorporating a probability of drop-out for forensic DNA profiles

    OpenAIRE

    Inman, Keith; Rudin, Norah; Cheng, Ken; Robinson, Chris; Kirschner, Adam; Inman-Semerau, Luke; Lohmueller, Kirk E.

    2015-01-01

    Background Technological advances have enabled the analysis of very small amounts of DNA in forensic cases. However, the DNA profiles from such evidence are frequently incomplete and can contain contributions from multiple individuals. The complexity of such samples confounds the assessment of the statistical weight of such evidence. One approach to account for this uncertainty is to use a likelihood ratio framework to compare the probability of the evidence profile under different scenarios....

  13. Qualification of the calculational methods of the fluence in the pressurised water reactors. Improvement of the cross sections treatment by the probability table method

    International Nuclear Information System (INIS)

    It is indispensable to know the fluence on the nuclear reactor pressure vessel. The cross sections and their treatment play an important role in this problem. In this study, two ''benchmarks'' have been interpreted with the Monte Carlo transport program TRIPOLI to qualify the calculational method and the cross sections used in the calculations. For the treatment of the cross sections, the multigroup method is usually used, but it presents some problems, such as the difficulty of choosing the weighting function and the need for a large number of energy groups to represent the cross-section fluctuations well. In this thesis, we propose a new method, called the ''Probability Table Method'', to treat the neutron cross sections. For the qualification, a one-dimensional Monte Carlo neutron transport simulation program has been written; the comparison of the multigroup results and the probability table results shows the advantages of this new method. The probability table has also been introduced into the TRIPOLI program; the calculated results for the iron deep-penetration benchmark are improved when compared with the experimental results. It is therefore of interest to use this new method in shielding and neutronics calculations. (author). 42 refs., 109 figs., 36 tabs

  14. Concise calculation of the scaling function, exponents, and probability functional of the Edwards-Wilkinson equation with correlated noise

    International Nuclear Information System (INIS)

    The linear Langevin equation proposed by Edwards and Wilkinson [Proc. R. Soc. London A 381, 17 (1982)] is solved in closed form for noise of arbitrary space and time correlation. Furthermore, the temporal development of the full probability functional describing the height fluctuations is derived exactly, exhibiting an interesting evolution between two distinct Gaussian forms. We determine explicitly the dynamic scaling function for the interfacial width for any given initial condition, isolate the early-time behavior, and discover an invariance that was unsuspected in this problem of arbitrary spatiotemporal noise
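
    For reference, the equation being solved is the Edwards-Wilkinson equation; written here in its standard uncorrelated-noise special case (the paper treats noise of arbitrary space and time correlation, which replaces the delta-correlated covariance below with a general correlator):

```latex
\frac{\partial h(\mathbf{x},t)}{\partial t}
  = \nu \, \nabla^{2} h(\mathbf{x},t) + \eta(\mathbf{x},t),
\qquad
\langle \eta(\mathbf{x},t)\,\eta(\mathbf{x}',t') \rangle
  = 2D\,\delta^{d}(\mathbf{x}-\mathbf{x}')\,\delta(t-t').
```

    In this special case the interfacial width obeys the Family-Vicsek scaling W(L,t) ~ L^alpha f(t/L^z) with alpha = (2-d)/2 and z = 2 for substrate dimension d < 2.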

  15. Are classical molecular dynamics calculations accurate for state-to-state transition probabilities in the H + D2 reaction?

    International Nuclear Information System (INIS)

    We present converged quantum dynamics for the H + D2 reaction at a total energy high enough to produce HD in the v' = 3, j' = 7 vibrational-rotational state and for total angular momenta J = 0, 1, and 2. We compare state-to-state partial cross sections for H + D2 (v = 0-2, j = 0, J = 0-2) → HD (v' = 0-2, j') + H and H + D2 (v = 1, j = 6, J = 0-2) → HD (v' = 0-2, j') + H as calculated from classical trajectory calculations with quantized initial conditions, i.e., a quasiclassical trajectory (QCT) simulation, to the results of converged quantum dynamics calculations involving up to 654 coupled channels. Final states in the QCT calculations are assigned by the quadratic smooth sampling (QSS) method. Since the quasiclassical and quantal calculations are carried out with the same potential energy surface, the comparison provides a direct test of the accuracy of the quasiclassical simulations as a function of the initial vibrational-rotational state and the final vibrational-rotational state

  16. Relativistic many-body calculations of excitation energy and radiative transition probabilities for many-electron ions

    International Nuclear Information System (INIS)

    Energy levels, line strengths, oscillator strengths, and transition rates are calculated for electric dipole nl{sub 1}nl{sub 2}[LSJ]-nl{sub 3}nl{sub 4}[L'S'J'] transitions in Be- (n=2), Mg- (n=3), Zn- (n=4) and Sm- (n=5) like ions with nuclear charges ranging from Z=N to 100, where N is the number of electrons in the system. (author)

  17. Relativistic calculation of the beta decay probabilities in the optimized Dirac-Kohn-Sham atom model and a chemical environment effect

    Energy Technology Data Exchange (ETDEWEB)

    Glushkov, Alexander [Odessa University (Ukraine); Russian Academy of Sciences, Troitsk (Russian Federation); Khetselius, Olga; Dubrovskaya, Yuliya [Odessa University (Ukraine); Lovett, Ludmila [UK National Academy of Sciences and Bookdata Co., London (United Kingdom)

    2009-07-01

    A new theoretical scheme for calculating the beta decay characteristics, and for accounting for the chemical environment effect on them, is developed. As the method of calculation of the relativistic fields and electron wave functions, the gauge-invariant Dirac-Fock and Dirac-Kohn-Sham approaches are used. The results of calculating the decay probabilities for the beta decays {sup 33}P-{sup 33}S, {sup 35}S-{sup 35}Cl, {sup 63}Ni-{sup 63}Cu, and {sup 241}Pu-{sup 241}Am are presented. A comparison of the Fermi function values is carried out for different approximations of the exchange-effect treatment: calculations using the wave functions on the boundary of the charged spherical nucleus and using the squares of the amplitudes of the expansion of these functions near zero.

  18. Relativistic many-body calculations of excitation energy and radiative transition probabilities for many-electron ions

    Energy Technology Data Exchange (ETDEWEB)

    Safronova, U.I.; Johnson, W.R. [Dept. of Physics, Univ. of Notre Dame, IN (United States)

    2000-01-01

    Energy levels, line strengths, oscillator strengths, and transition rates are calculated for electric dipole nl{sub 1}nl{sub 2}[LSJ]-nl{sub 3}nl{sub 4}[L'S'J'] transitions in Be- (n=2), Mg- (n=3), Zn- (n=4) and Sm- (n=5) like ions with nuclear charges ranging from Z=N to 100, where N is the number of electrons in the system. (author)

  19. Continuous-Energy Adjoint Flux and Perturbation Calculation using the Iterated Fission Probability Method in Monte Carlo Code TRIPOLI-4® and Underlying Applications

    Science.gov (United States)

    Truchet, G.; Leconte, P.; Peneliau, Y.; Santamarina, A.; Malvagi, F.

    2014-06-01

    Pile-oscillation experiments are performed in the MINERVE reactor at the CEA Cadarache to improve nuclear data accuracy. In order to precisely calculate small reactivity variations, reactivity has so far been computed in TRIPOLI-4® by using the eigenvalue difference method. This "direct" method has shown limitations in the evaluation of very small reactivity effects because it needs to reach a very small variance associated with the reactivity in both states. To answer this problem, it has been decided to implement the exact perturbation theory in TRIPOLI-4® and, consequently, to calculate a continuous-energy adjoint flux. The Iterated Fission Probability (IFP) method was chosen because it has shown great results in some other Monte Carlo codes. The IFP method uses a forward calculation to compute the adjoint flux, and consequently, it does not rely on complex code modifications but on the physical definition of the adjoint flux as a phase-space neutron importance. In the first part of this paper, the IFP method implemented in TRIPOLI-4® is described. To illustrate the efficiency of the method, several adjoint fluxes are calculated and compared with their equivalents obtained by the deterministic code APOLLO-2. The new implementation can also calculate the angular adjoint flux. In the second part, a procedure to carry out an exact perturbation calculation is described. A single-cell benchmark has been used to test the accuracy of the method, compared with the "direct" estimation of the perturbation. Once again the method based on the IFP shows good agreement for a calculation time far shorter than the "direct" method. The main advantage of the method is that the relative accuracy of the reactivity variation does not depend on the magnitude of the variation itself, which allows us to calculate very small reactivity perturbations with high precision. Other applications of this perturbation method are presented and tested, such as the calculation of exact kinetic parameters (βeff, Λeff) or sensitivity parameters.

  20. Electron-ion recombination in nuclear recoils tracks in nonpolar liquids. Calculation of the effect of external electric field on the escape probability

    Science.gov (United States)

    Mateja, Piotr; Wojcik, Mariusz

    2016-07-01

    A computer simulation method is applied to study electron-ion recombination in tracks of low-energy nuclear recoils in nonpolar liquids in which the electron transport can be described as ideal diffusion. The electron escape probability is calculated as a function of applied electric field, both for the field parallel to the track and for the field perpendicular to the track. The dependence of the escape probability on the field direction becomes stronger as the ionization track becomes longer, with a significant effect already found for tracks of ~100 nm length. The results are discussed in the context of possible applications of nonpolar molecular liquids as target media in directional dark matter detectors.
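
    A stripped-down, single-pair version of such a simulation, in dimensionless Onsager units (length in units of the Onsager radius r_c, time in units of r_c^2/D); the step size, cutoffs, and initial separation are illustrative assumptions rather than the paper's many-pair track model:

```python
import numpy as np

rng = np.random.default_rng(1)

def escape_probability(x0=0.5, field=0.0, n_pairs=200, dt=1e-3,
                       r_reaction=0.1, r_escape=4.0):
    """Escape probability of a single electron-cation pair by Euler-Maruyama
    diffusion in dimensionless Onsager units (length in r_c, time in r_c^2/D).
    field: dimensionless external field e*E*r_c/kT, applied along +z.
    Drift/D = force/kT; the Coulomb term is -r_c/r^2 toward the cation."""
    escapes = 0
    ext = np.array([0.0, 0.0, field])
    for _ in range(n_pairs):
        r = np.array([0.0, 0.0, x0])
        while True:
            d = np.linalg.norm(r)
            if d < r_reaction:            # recombined
                break
            if d > r_escape:              # escaped the Coulomb well
                escapes += 1
                break
            coulomb = -r / d**3           # dimensionless attractive drift
            r = r + (coulomb + ext) * dt + np.sqrt(2 * dt) * rng.standard_normal(3)
    return escapes / n_pairs

# Onsager's field-free result for an isolated pair, exp(-r_c/r0), is ~0.14 here.
print(escape_probability(field=0.0))
print(escape_probability(field=0.5))      # the field raises the escape fraction
```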

  1. Impact of aging conditions on mechanical properties of thermoplastic polyurethane

    International Nuclear Information System (INIS)

    In this study, the impact of environmental aging conditions on the mechanical properties of thermoplastic polyurethane (TPU) was investigated; in particular, the effect of temperature on water diffusion was studied. Water-sorption experiments, tensile tests and dynamic mechanical thermal analysis (DMTA) were performed after immersion in distilled water at different temperatures (25, 70 and 90 °C). The sorption process was analyzed by gravimetric measurements at different temperatures, and the diffusion coefficients of solvent molecules in the TPU samples were identified, from which the activation energy and the mixing enthalpy were deduced. The impact of aging on some mechanical properties of this material was investigated after various aging cycles. Degradation of the mechanical properties was observed: the elastic modulus and the stress at 200% strain decreased. It was also shown that such degradation depends largely on both the aging temperature and the immersion duration. The storage modulus (E') was also affected by the hygrothermal (HT) environment. The modification of mechanical properties appears to be well correlated with structural observations obtained from scanning electron microscopy (SEM) photographs. Finally, through thermal aging experiments, it was deduced that the combination of temperature with water is a major factor in TPU degradation.

  2. Model of Large-format EO-IR sensor for calculating the probability of true and false detection and tracking for moving and fixed objects

    Science.gov (United States)

    Korb, Andrew R.; Grossman, Stanley I.

    2015-05-01

    A model was developed to understand the effects of spatial resolution and signal-to-noise ratio (SNR) on the detection and tracking performance of wide-field, diffraction-limited electro-optic and infrared motion imagery systems. The false positive detection probability and the false positive rate per frame were calculated as functions of target-to-background contrast and object size. Results showed that moving objects are fundamentally more difficult to detect than stationary objects: for fixed objects the SNR increases and the false positive detection probability diminishes rapidly with successive frames, whereas for moving objects the false detection rate remains constant or increases with successive frames. The model specifies that the desired performance of a detection system, measured by the false positive detection rate, can be achieved by imaging system designs with different combinations of SNR and spatial resolution, usually requiring several pixels resolving the object; this capability to trade off resolution and SNR enables system design trades and cost optimization. For operational use, the detection thresholds required to achieve a particular false detection rate can be calculated. Interestingly, for moderate-size images the model converges to the Johnson criteria. Johnson found that an imaging system with an SNR >3.5 has a probability of detection >50% when the resolution on the object is 4 pixels or more. Under these conditions our model finds the false positive rate is less than one per hundred image frames, and the ratio of the probability of object detection to false positive detection is much greater than one. The model was programmed into Matlab to generate simulated image frames for visualization.
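
    The single-frame detection/false-alarm trade in a model of this kind can be sketched with Gaussian statistics: for a threshold T in noise-sigma units, the per-pixel false-alarm probability is the Gaussian tail Q(T), and the detection probability of a point object with signal-to-noise ratio s is roughly Q(T - s). A hedged illustration (the frame size and operating points are assumptions, not the paper's values):

```python
import math

def q(x):
    """Gaussian tail probability Q(x) = P(N(0,1) > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

n_pixels = 4000 * 4000          # large-format focal plane (illustrative)
snr = 3.5                       # Johnson-like operating point

for t in (4.0, 5.0, 6.0):
    p_fa = q(t)                          # per-pixel false alarm probability
    fa_per_frame = n_pixels * p_fa       # expected false positives per frame
    p_d = q(t - snr)                     # single-pixel detection probability
    print(f"T={t}: P_d={p_d:.2f}, false alarms/frame={fa_per_frame:.2f}")

# For a stationary object, averaging n frames raises the effective SNR by
# sqrt(n); for a moving object it does not, so its detection/false-alarm
# trade-off stays fixed frame after frame.
print("10-frame stationary SNR:", snr * math.sqrt(10))
```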

  3. Tumor control probability and the utility of 4D vs 3D dose calculations for stereotactic body radiotherapy for lung cancer

    Energy Technology Data Exchange (ETDEWEB)

    Valdes, Gilmer, E-mail: gilmer.valdes@uphs.upenn.edu [Department of Radiation Oncology, Perelman Center for Advanced Medicine, University of Pennsylvania, Philadelphia, PA (United States); Robinson, Clifford [Department of Radiation Oncology, Siteman Cancer Center, Washington University in St. Louis, St. Louis, MO (United States); Lee, Percy [Department of Radiation Oncology, David Geffen School of Medicine, UCLA, Los Angeles, CA (United States); Morel, Delphine [Department of Biomedical Engineering, AIX Marseille 2 University, Marseille (France); Department of Medical Physics, Joseph Fourier University, Grenoble (France); Low, Daniel; Iwamoto, Keisuke S.; Lamb, James M. [Department of Radiation Oncology, David Geffen School of Medicine, UCLA, Los Angeles, CA (United States)

    2015-04-01

    Four-dimensional (4D) dose calculations for lung cancer radiotherapy have been technically feasible for a number of years but have not become standard clinical practice. The purpose of this study was to determine if clinically significant differences in tumor control probability (TCP) exist between 3D and 4D dose calculations so as to inform the decision whether 4D dose calculations should be used routinely for treatment planning. Radiotherapy plans for Stage I-II lung cancer were created for 8 patients. Clinically acceptable treatment plans were created with dose calculated on the end-exhale 4D computed tomography (CT) phase using a Monte Carlo algorithm. Dose was then projected onto the remaining 9 phases of 4D-CT using the Monte Carlo algorithm and accumulated onto the end-exhale phase using commercially available deformable registration software. The resulting dose-volume histograms (DVH) of the gross tumor volume (GTV), planning tumor volume (PTV), and PTV{sub setup} were compared according to target coverage and dose. The PTV{sub setup} was defined as a volume including the GTV and a margin for setup uncertainties but not for respiratory motion. TCPs resulting from these DVHs were estimated using a wide range of alphas, betas, and tumor cell densities. Differences of up to 5 Gy were observed between 3D and 4D calculations for a PTV with highly irregular shape. When the TCP was calculated using the resulting DVHs for fractionation schedules typically used in stereotactic body radiation therapy (SBRT), the TCP differed at most by 5% between 4D and 3D cases, and in most cases, it was by less than 1%. We conclude that 4D dose calculations are not necessary for most cases treated with SBRT, but they might be valuable for irregularly shaped target volumes. If 4D calculations are used, 4D DVHs should be evaluated on volumes that include margin for setup uncertainty but not respiratory motion.

  4. Tumor control probability and the utility of 4D vs 3D dose calculations for stereotactic body radiotherapy for lung cancer

    International Nuclear Information System (INIS)

    Four-dimensional (4D) dose calculations for lung cancer radiotherapy have been technically feasible for a number of years but have not become standard clinical practice. The purpose of this study was to determine if clinically significant differences in tumor control probability (TCP) exist between 3D and 4D dose calculations so as to inform the decision whether 4D dose calculations should be used routinely for treatment planning. Radiotherapy plans for Stage I-II lung cancer were created for 8 patients. Clinically acceptable treatment plans were created with dose calculated on the end-exhale 4D computed tomography (CT) phase using a Monte Carlo algorithm. Dose was then projected onto the remaining 9 phases of 4D-CT using the Monte Carlo algorithm and accumulated onto the end-exhale phase using commercially available deformable registration software. The resulting dose-volume histograms (DVH) of the gross tumor volume (GTV), planning tumor volume (PTV), and PTVsetup were compared according to target coverage and dose. The PTVsetup was defined as a volume including the GTV and a margin for setup uncertainties but not for respiratory motion. TCPs resulting from these DVHs were estimated using a wide range of alphas, betas, and tumor cell densities. Differences of up to 5 Gy were observed between 3D and 4D calculations for a PTV with highly irregular shape. When the TCP was calculated using the resulting DVHs for fractionation schedules typically used in stereotactic body radiation therapy (SBRT), the TCP differed at most by 5% between 4D and 3D cases, and in most cases, it was by less than 1%. We conclude that 4D dose calculations are not necessary for most cases treated with SBRT, but they might be valuable for irregularly shaped target volumes. If 4D calculations are used, 4D DVHs should be evaluated on volumes that include margin for setup uncertainty but not respiratory motion

  5. Calculation of the pipes failure probability of the Rcic system of a nuclear power station by means of software WinPRAISE 07

    International Nuclear Information System (INIS)

    Crack growth and propagation by fatigue is a typical degradation mechanism found in the nuclear industry as well as in conventional industry; the unstable propagation of a crack can cause the catastrophic failure of a metallic component, even one of high ductility. For this reason, programmed maintenance activities using visual and/or ultrasonic inspection techniques have been established in industry, with a set periodicity that allows this growth to be followed up and its undesirable effects controlled; however, these activities increase operating costs and, in the particular case of the nuclear industry, the radiation exposure of the personnel involved. Mathematical methods that integrate concepts of uncertainty, material properties, and the probabilities associated with inspection results have become a powerful tool for evaluating component reliability, reducing costs and exposure levels. This work presents the evaluation of the failure probability due to fatigue growth of pre-existing cracks in pipes of the Reactor Core Isolation Cooling (Rcic) system of a nuclear power station. The software WinPRAISE 07 (Piping Reliability Analysis Including Seismic Events) was used, based on probabilistic fracture mechanics principles. The failure probabilities obtained showed good behavior of the analyzed pipes, with a maximum on the order of 1.0E-6; it is therefore concluded that these pipe lines remain reliable even when the calculations are extrapolated to 10, 20, 30 and 40 years of service. (Author)
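
    The core of a probabilistic fracture mechanics calculation of this type can be sketched as a Monte Carlo loop over sampled initial crack depths grown by the Paris law; this is a generic sketch with invented parameter values, not the models implemented in WinPRAISE:

```python
import numpy as np

rng = np.random.default_rng(7)

# Paris law: da/dN = C * (dK)^m, with dK = Y * dS * sqrt(pi * a)
C, m = 1e-11, 3.0          # Paris constants (illustrative; MPa*sqrt(m), m/cycle)
Y = 1.1                    # geometry factor
d_stress = 50.0            # cyclic stress range, MPa
a_crit = 0.010             # critical crack depth ~ wall thickness, m
cycles_per_year = 5.0e4

def failure_probability(years, n_samples=200_000, block=5_000):
    # Lognormal initial crack depths (median 0.5 mm) as a stand-in for the
    # distribution of pre-existing fabrication flaws.
    a = rng.lognormal(mean=np.log(5e-4), sigma=0.6, size=n_samples)
    for _ in range(int(years * cycles_per_year) // block):
        dK = Y * d_stress * np.sqrt(np.pi * np.minimum(a, a_crit))
        a = a + C * dK**m * block
    return np.mean(a >= a_crit)

for yrs in (10, 20, 30, 40):
    print(f"{yrs} years of service: P_f ~ {failure_probability(yrs):.1e}")
```

    A real analysis would add inspection and repair events, weld residual stresses, and seismic loading on top of this basic growth loop.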

  6. Position-probability-sampled Monte Carlo calculation of VMAT, 3DCRT, step-shoot IMRT, and helical tomotherapy dose distributions using BEAMnrc/DOSXYZnrc

    International Nuclear Information System (INIS)

    Purpose: The commercial release of volumetric modulated arc therapy techniques using a conventional linear accelerator and the growing number of helical tomotherapy users have triggered renewed interest in dose verification methods, and also in tools for exploring the impact of machine tolerance and patient motion on dose distributions without the need to approximate time-varying parameters such as gantry position, MLC leaf motion, or patient motion. To this end we have developed a Monte Carlo-based calculation method capable of simulating a wide variety of treatment techniques without the need to resort to discretization approximations. Methods: The ability to perform complete position-probability-sampled Monte Carlo dose calculations was implemented in the BEAMnrc/DOSXYZnrc user codes of EGSnrc. The method includes full accelerator head simulations of our tomotherapy and Elekta linacs, and a realistic representation of continuous motion via the sampling of a time variable. The functionality of this algorithm was tested via comparisons with both measurements and treatment planning dose distributions for four types of treatment techniques: 3D conformal, step-shoot intensity modulated radiation therapy, helical tomotherapy, and volumetric modulated arc therapy. Results: For static fields, the absolute dose agreement between the EGSnrc Monte Carlo calculations and measurements is within 2%/1 mm. Absolute dose agreement between Monte Carlo calculations and the treatment planning system for the four different treatment techniques is within 3%/3 mm. Discrepancies with the tomotherapy TPS on the order of 10%/5 mm were observed for the extreme example of a small target located 15 cm off-axis and planned with a low modulation factor. The increase in simulation time associated with using position-probability sampling, as opposed to the discretization approach, was less than 2% in most cases. Conclusions: A single Monte Carlo simulation method can be used to calculate patient

  7. Probability and paternity testing.

    OpenAIRE

    Elston, R C

    1986-01-01

    A probability can be viewed as an estimate of a variable that is sometimes 1 and sometimes 0. To have validity, the probability must equal the expected value of that variable. To have utility, the average squared deviation of the probability from the value of that variable should be small. It is shown that probabilities of paternity calculated by the use of Bayes' theorem under appropriate assumptions are valid, but they can vary in utility. In particular, a recently proposed probability of p...

  8. IBM-1 calculations of low-lying excited levels and electric transition probabilities B(E2) on the even-even 174-180Hf isotopes

    International Nuclear Information System (INIS)

    In this paper, the interacting boson model (IBM-1) is discussed and employed for calculating the energy levels and the electric transition probabilities B(E2) of the even-even 174-180Hf isotopes. These isotopes have been investigated on the basis of two different dynamical symmetries: the SU(3) symmetry (deformed nuclei) for the 176-180Hf isotopes, and the SU(3)-O(6) transition region for the 174Hf isotope. The values determined using the IBM-1 Hamiltonian showed significant agreement with the experimentally reported energy levels and B(E2) values. The model provides a fast and accurate method of predicting energy levels and B(E2) values. (authors).

  9. Simultaneous analysis of matter radii, transition probabilities, and excitation energies of Mg isotopes by angular-momentum-projected configuration-mixing calculations

    Science.gov (United States)

    Shimada, Mitsuhiro; Watanabe, Shin; Tagami, Shingo; Matsumoto, Takuma; Shimizu, Yoshifumi R.; Yahiro, Masanobu

    2016-06-01

    We perform simultaneous analysis of (1) matter radii, (2) B(E2; 0+ → 2+) transition probabilities, and (3) excitation energies, E(2+) and E(4+), for 24-40Mg by using the beyond-mean-field (BMF) framework with angular-momentum-projected configuration mixing with respect to the axially symmetric β2 deformation with infinitesimal cranking. The BMF calculations successfully reproduce all of the data for r_m, B(E2), E(2+), and E(4+), indicating that the framework is quite useful for data analysis, particularly for low-lying states. We also discuss the absolute value of the deformation parameter β2 deduced from measured values of B(E2) and r_m. This framework makes it possible to investigate the effects of β2 deformation, the change in β2 due to restoration of rotational symmetry, β2 configuration mixing, and the inclusion of time-odd components by infinitesimal cranking. Under the assumption of axial deformation and parity conservation, we clarify which effect is important for each of the three measurements and propose the kinds of BMF calculations that are practical for each of the three kinds of observables.

  10. Lexicographic probability, conditional probability, and nonstandard probability

    OpenAIRE

    Halpern, Joseph Y.

    2003-01-01

    The relationship between Popper spaces (conditional probability spaces that satisfy some regularity conditions), lexicographic probability systems (LPS's), and nonstandard probability spaces (NPS's) is considered. If countable additivity is assumed, Popper spaces and a subclass of LPS's are equivalent; without the assumption of countable additivity, the equivalence no longer holds. If the state space is finite, LPS's are equivalent to NPS's. However, if the state space is infinite, NPS's are ...

  11. Uncertainty calculation in the RIO air quality interpolation model and aggregation to yearly average and exceedance probability taking into account the temporal auto-correlation.

    Science.gov (United States)

    Maiheu, Bino; Nele, Veldeman; Janssen, Stijn; Fierens, Frans; Trimpeneers, Elke

    2010-05-01

    RIO is an operational air quality interpolation model developed by VITO and IRCEL-CELINE that produces hourly maps of different pollutant concentrations, such as O3, PM10 and NO2, measured in Belgium [1]. The RIO methodology consists of residual interpolation by Ordinary Kriging of the residuals of the measured concentrations and pre-determined trend functions which express the relation between land cover information derived from the CORINE dataset and measured time-averaged concentrations [2]. RIO is an important tool for the Flemish administration and is used, among other things, to report the air quality status in Flanders to the European Union, as is required of each member state. We feel that a good estimate of the uncertainty of the yearly average concentration maps and of the probability of norm exceedance is just as important as the values themselves. In this contribution we will discuss the uncertainties specific to the RIO methodology, with contributions both from the Ordinary Kriging technique and from the trend functions. In particular, the parameterisation of the uncertainty w.r.t. the trend functions will be the key indicator for the degree of confidence the model puts into using land cover information for spatial interpolation of pollutant concentrations. Next, we will propose a method which enables us to calculate the uncertainty on the yearly average concentrations as well as on the number of exceedance days, taking into account the temporal auto-correlation of the concentration fields. It is clear that the autocorrelation will have a strong impact on the uncertainty estimation [3] of yearly averages. The method we propose is based on a Monte Carlo technique that generates an ensemble of interpolation maps with the correct temporal auto-correlation structure. From a generated ensemble, the calculation of the norm-exceedance probability at each interpolation location becomes quite straightforward. A comparison with the ad-hoc method proposed in [3], where
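
    A minimal sketch of the Monte Carlo idea: generate an ensemble of daily concentration series with a prescribed temporal auto-correlation (here a simple AR(1) process; all parameters are invented), then read the uncertainty of the yearly average and the exceedance statistics directly off the ensemble:

```python
import numpy as np

rng = np.random.default_rng(3)

n_days, n_ensemble = 365, 2000
mean, sigma = 30.0, 12.0      # daily PM10-like mean and spread, ug/m3
rho = 0.6                     # lag-1 temporal autocorrelation
limit = 50.0                  # daily limit value, ug/m3

# AR(1) ensemble: x_t = rho * x_{t-1} + sqrt(1 - rho^2) * eps_t
eps = rng.standard_normal((n_ensemble, n_days))
x = np.empty_like(eps)
x[:, 0] = eps[:, 0]
for t in range(1, n_days):
    x[:, t] = rho * x[:, t - 1] + np.sqrt(1 - rho**2) * eps[:, t]
conc = mean + sigma * x       # temporally correlated daily concentrations

yearly_mean = conc.mean(axis=1)
exceed_days = (conc > limit).sum(axis=1)

print("std of yearly mean:", round(yearly_mean.std(), 2))                # inflated by rho
print("iid (no autocorrelation) prediction:", round(sigma / np.sqrt(n_days), 2))
print("P(more than 35 exceedance days):", np.mean(exceed_days > 35))
```

    The first two printed numbers show why the auto-correlation matters: ignoring it underestimates the uncertainty of the yearly average by a factor of roughly sqrt((1+rho)/(1-rho)).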

  12. URR [Unresolved Resonance Region] computer code: A code to calculate resonance neutron cross-section probability tables, Bondarenko self-shielding factors, and self-indication ratios for fissile and fertile nuclides

    International Nuclear Information System (INIS)

    The URR computer code has been developed to calculate cross-section probability tables, Bondarenko self-shielding factors, and self-indication ratios for fertile and fissile isotopes in the unresolved resonance region. Monte Carlo methods are utilized to select appropriate resonance parameters and to compute the cross sections at the desired reference energy. The neutron cross sections are calculated by the single-level Breit-Wigner formalism with s-, p-, and d-wave contributions. The cross-section probability tables are constructed by sampling the Doppler-broadened cross sections. The various self-shielding factors are computed numerically as Lebesgue integrals over the cross-section probability tables

  13. Simultaneous analysis of matter radii, transition probabilities, and excitation energies of Mg isotopes by angular-momentum-projected configuration-mixing calculations

    CERN Document Server

    Shimada, Mitsuhiro; Tagami, Shingo; Matsumoto, Takuma; Shimizu, Yoshifumi R; Yahiro, Masanobu

    2016-01-01

    We perform simultaneous analysis of (1) matter radii, (2) $B(E2; 0^+ \\rightarrow 2^+ )$ transition probabilities, and (3) excitation energies, $E(2^+)$ and $E(4^+)$, for $^{24-40}$Mg by using the beyond mean-field (BMF) framework with angular-momentum-projected configuration mixing with respect to the axially symmetric $\\beta_2$ deformation with infinitesimal cranking. The BMF calculations successfully reproduce all of the data for $r_{\\rm m}$, $B(E2)$, and $E(2^+)$ and $E(4^+)$, indicating that it is quite useful for data analysis, particularly for low-lying states. We also discuss the absolute value of the deformation parameter $\\beta_2$ deduced from measured values of $B(E2)$ and $r_{\\rm m}$. This framework makes it possible to investigate the effects of $\\beta_2$ deformation, the change in $\\beta_2$ due to restoration of rotational symmetry, $\\beta_2$ configuration mixing, and the inclusion of time-odd components by infinitesimal cranking. Under the assumption of axial deformation and parity conservation,...

  14. Application of multi-dimensional discrimination diagrams and probability calculations to Paleoproterozoic acid rocks from Brazilian cratons and provinces to infer tectonic settings

    Science.gov (United States)

    Verma, Sanjeet K.; Oliveira, Elson P.

    2013-08-01

    In the present work, we applied two sets of new multi-dimensional geochemical diagrams (Verma et al., 2013), obtained from linear discriminant analysis (LDA) of natural-logarithm-transformed ratios of major elements and immobile major and trace elements in acid magmas, to decipher plate tectonic settings and corresponding probability estimates for Paleoproterozoic rocks from the Amazonian craton, São Francisco craton, São Luís craton, and Borborema province of Brazil. The robustness of LDA minimizes the effects of petrogenetic processes and maximizes the separation among the different tectonic groups. The probability-based boundaries further provide an objective statistical method, in contrast to the commonly used subjective method of determining the boundaries by eye. The use of major element data readjusted to 100% on an anhydrous basis with the SINCLAS computer program also helps to minimize the effects of post-emplacement compositional changes and analytical errors on these tectonic discrimination diagrams. Fifteen case studies of acid suites highlight the application of these diagrams and probability calculations. The first case study, on the Jamon and Musa granites, Carajás area (Central Amazonian Province, Amazonian craton), shows a collision setting (previously thought anorogenic). A collision setting was also clearly inferred for the Bom Jardim granite, Xingú area (Central Amazonian Province, Amazonian craton). The third case study, on the Older São Jorge, Younger São Jorge and Maloquinha granites, Tapajós area (Ventuari-Tapajós Province, Amazonian craton), indicated a within-plate setting (previously transitional between volcanic arc and within-plate). We also recognized a within-plate setting for the next three case studies, on the Aripuanã and Teles Pires granites (SW Amazonian craton) and the Pitinga area granites (Mapuera Suite, NW Amazonian craton), which were all previously suggested to have been emplaced in post-collision to within-plate settings. The seventh case

  15. Calculation on probability of fire caused by lightning for external floating roof oil tanks

    Institute of Scientific and Technical Information of China (English)

    刘健; 杨仲江; 卢慧慧

    2016-01-01

    Fire accidents caused by lightning strikes on large external floating roof (EFR) oil tanks have occurred many times, so it is of practical significance to evaluate their safety objectively and to calculate the probability of fire caused by lightning. The ways in which lightning harms oil tanks are presented. The annual incidence of lightning strikes on external floating roof oil tanks is calculated by means of the Monte Carlo method combined with the electro-geometric model (EGM). The difference in protection effect between conventional electrostatic conductors and retractable grounding assemblies (RGA) is analyzed. Finally, the annual accident rates of spark discharge caused by lightning on oil tanks fitted with RGAs are calculated. The results show that the annual incidence of lightning strikes increases with the diameter and wall height of the oil tank; that the protection effect of RGAs is clearly better than that of conventional electrostatic conductors; and that installing multiple RGAs can significantly decrease the probability of spark generation and the annual accident rate. With only two RGAs, the annual accident rate of lightning-induced spark discharge can be reduced to below 10^-5.
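
    A toy version of the Monte Carlo / electro-geometric model step: sample stroke currents from a lognormal distribution, convert each to a striking distance r = 10 I^0.65, turn that into an attractive radius for the tank, and accumulate the expected annual strike count. The geometry and all parameter values are simplified assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(5)

Ng = 2.0e-6           # ground flash density, strikes/(m^2 * year) (= 2 per km^2/yr)
h = 20.0              # tank shell height, m
R_tank = 40.0         # tank radius, m
n = 100_000

# IEEE-style lognormal stroke current: median 31 kA, sigma_ln ~ 0.48
I = rng.lognormal(mean=np.log(31.0), sigma=0.48, size=n)     # kA
r_s = 10.0 * I**0.65                                         # striking distance, m

# EGM attractive radius of a structure of height h (simplified mast formula),
# added to the tank's own radius.
r_a = np.where(r_s > h, np.sqrt(2 * r_s * h - h**2), r_s) + R_tank

annual_strikes = Ng * np.pi * np.mean(r_a**2)
print(f"expected lightning strikes per year: {annual_strikes:.4f}")
print(f"return period: {1.0 / annual_strikes:.0f} years")
```

    Multiplying the strike rate by a conditional probability of spark generation (with or without RGAs) would then give the annual fire-accident rate the abstract discusses.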

  16. Monte Carlo dose calculations and radiobiological modelling: analysis of the effect of the statistical noise of the dose distribution on the probability of tumour control

    International Nuclear Information System (INIS)

    The aim of this work is to investigate the influence of the statistical fluctuations of Monte Carlo (MC) dose distributions on the dose volume histograms (DVHs) and radiobiological models, in particular the Poisson model for tumour control probability (tcp). The MC matrix is characterized by a mean dose in each scoring voxel, d, and a statistical error on the mean dose, σd; whilst the quantities d and σd depend on many statistical and physical parameters, here we consider only their dependence on the phantom voxel size and the number of histories from the radiation source. Dose distributions from high-energy photon beams have been analysed. It has been found that the DVH broadens when increasing the statistical noise of the dose distribution, and the tcp calculation systematically underestimates the real tumour control value, defined here as the value of tumour control when the statistical error of the dose distribution tends to zero. When increasing the number of energy deposition events, either by increasing the voxel dimensions or increasing the number of histories from the source, the DVH broadening decreases and tcp converges to the 'correct' value. It is shown that the underestimation of the tcp due to the noise in the dose distribution depends on the degree of heterogeneity of the radiobiological parameters over the population; in particular this error decreases with increasing the biological heterogeneity, whereas it becomes significant in the hypothesis of a radiosensitivity assay for single patients, or for subgroups of patients. It has been found, for example, that when the voxel dimension is changed from a cube with sides of 0.5 cm to a cube with sides of 0.25 cm (with a fixed number of histories of 10^8 from the source), the systematic error in the tcp calculation is about 75% in the homogeneous hypothesis, and it decreases to a minimum value of about 15% in a case of high radiobiological heterogeneity. The possibility of using the error on the tcp to
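
    The systematic effect is easy to reproduce: because cell survival exp(-αd) is convex in dose, zero-mean statistical noise on the voxel doses inflates the expected number of surviving clonogens and therefore lowers the expected tcp. A small sketch with a uniform target dose and illustrative radiobiological parameters (not the paper's):

```python
import numpy as np

rng = np.random.default_rng(11)

alpha = 0.3              # LQ alpha, 1/Gy (illustrative)
n_clonogens = 1.0e7      # clonogens in the whole tumour
n_voxels = 1000
dose = 55.0              # true uniform voxel dose, Gy

def tcp(d):
    """Poisson tcp: exp(-expected number of surviving clonogens)."""
    surviving = (n_clonogens / n_voxels) * np.exp(-alpha * d)
    return np.exp(-np.sum(surviving))

print("noise-free tcp:", round(tcp(np.full(n_voxels, dose)), 3))

for rel_noise in (0.01, 0.02, 0.05):      # MC statistical error on the voxel dose
    sigma = rel_noise * dose
    vals = [tcp(dose + sigma * rng.standard_normal(n_voxels))
            for _ in range(200)]
    print(f"sigma_d = {100 * rel_noise:.0f}%: mean tcp = {np.mean(vals):.3f}")
```

    The mean tcp falls systematically below the noise-free value as the relative dose noise grows, which is the underestimation the abstract quantifies.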

  17. Adjustment of the thermohydraulic NUCIRC 2.0 code to the present aging conditions of the Embalse nuclear power plant

    International Nuclear Information System (INIS)

    This work describes the process of adjusting the thermohydraulic code NUCIRC to the present aging conditions of the Embalse nuclear power plant. For this adjustment, the flows of the fuel channels of the primary heat transport system (PHTS) are calculated using the channel heat balance flow (CHBF) methodology. The roughness and the localized pressure losses are then modified in the NUCIRC code for different groups of channels. These adjustments are made so as to fit, region by region, the channel flows calculated with NUCIRC to the CHBF flows. The fitting results in a discrepancy by region of less than 0.1% and a mean quadratic error of approximately 5%. These values indicate that the NUCIRC code is properly adjusted for critical channel power calculations and for aging tracking of the PHTS. (author)

  18. Quantum probability

    CERN Document Server

    Gudder, Stanley P

    2014-01-01

    Quantum probability is a subtle blend of quantum mechanics and classical probability theory. Its important ideas can be traced to the pioneering work of Richard Feynman in his path integral formalism. Only recently have the concept and ideas of quantum probability been presented in a rigorous axiomatic framework, and this book provides a coherent and comprehensive exposition of this approach. It gives a unified treatment of operational statistics, generalized measure theory and the path integral formalism that can only be found in scattered research articles. The first two chapters survey the ne

  19. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.

  20. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  1. URR [Unresolved Resonance Region] computer code: A code to calculate resonance neutron cross-section probability tables, Bondarenko self-shielding factors, and self-indication ratios for fissile and fertile nuclides

    International Nuclear Information System (INIS)

    The URR computer code has been developed to calculate cross-section probability tables, Bondarenko self-shielding factors, and self-indication ratios for fertile and fissile isotopes in the unresolved resonance region. Monte Carlo methods are utilized to select appropriate resonance parameters and to compute the cross sections at the desired reference energy. The neutron cross sections are calculated by the single-level Breit-Wigner formalism with s-, p-, and d-wave contributions. The cross-section probability tables are constructed by sampling the Doppler-broadened cross sections. The various self-shielded factors are computed numerically as Lebesgue integrals over the cross-section probability tables. 6 refs

  2. Waste Package Misload Probability

    Energy Technology Data Exchange (ETDEWEB)

    J.K. Knudsen

    2001-11-20

    The objective of this calculation is to calculate the probability of occurrence for fuel assembly (FA) misloads (i.e., FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the events. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.
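
    The final step reduces to estimating an event probability from historical counts; a hedged sketch of that arithmetic (the counts below are placeholders, not data from the Framatome ANP report; the Jeffreys adjustment is one common way to avoid a zero estimate when no events have been observed):

```python
# Probability of occurrence per fuel assembly (FA) movement, estimated from
# categorized event counts. All counts are invented for illustration.
n_misloads = 3            # categorized misload events
n_damage = 12             # categorized damage events
n_fa_moves = 250_000      # total FA movements in the operating experience

p_misload = (n_misloads + 0.5) / (n_fa_moves + 1)   # Jeffreys-adjusted estimate
p_damage = (n_damage + 0.5) / (n_fa_moves + 1)

print(f"P(misload per FA move) ~ {p_misload:.2e}")
print(f"P(damage per FA move)  ~ {p_damage:.2e}")
```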

  3. Waste Package Misload Probability

    International Nuclear Information System (INIS)

    The objective of this calculation is to calculate the probability of occurrence for fuel assembly (FA) misloads (i.e., FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the events. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a

  4. Monte Carlo transition probabilities

    OpenAIRE

    Lucy, L. B.

    2001-01-01

    Transition probabilities governing the interaction of energy packets and matter are derived that allow Monte Carlo NLTE transfer codes to be constructed without simplifying the treatment of line formation. These probabilities are such that the Monte Carlo calculation asymptotically recovers the local emissivity of a gas in statistical equilibrium. Numerical experiments with one-point statistical equilibrium problems for Fe II and Hydrogen confirm this asymptotic behaviour. In addition, the re...

  5. Durability of CFRP strengthened concrete structures under accelerated or environmental ageing conditions

    OpenAIRE

    Benzarti, Karim; QUIERTANT, Marc; Chataigner, Sylvain; Aubagnac, Christophe; Nishizaki, Itaru; Kato, Y.

    2008-01-01

    The durability of concrete slabs strengthened by bonded composite materials has been investigated in the framework of an international cooperation between two French and Japanese research institutes. Time evolution of the concrete/composite adhesive bond strength was studied under both controlled and environmental ageing conditions, by using different mechanical characterization methods. The first results of this ongoing experimental campaign are presented.

  6. Effect of physicochemical aging conditions on the composite-composite repair bond strength

    NARCIS (Netherlands)

    Brendeke, Johannes; Ozcan, Mutlu

    2007-01-01

    Purpose: This study evaluated the effect of different physicochemical aging methods and surface conditioning techniques on the repair bond strength of composite. It was hypothesized that the aging conditions would decrease the repair bond strength and that the surface conditioning methods would perform similarly.

  7. Comparison of Acuros (AXB) and Anisotropic Analytical Algorithm (AAA) for dose calculation in treatment of oesophageal cancer: effects on modelling tumour control probability

    International Nuclear Information System (INIS)

    To investigate systematic changes in dose arising when treatment plans optimised using the Anisotropic Analytical Algorithm (AAA) are recalculated using Acuros XB (AXB) in patients treated with definitive chemoradiotherapy (dCRT) for locally advanced oesophageal cancers. We have compared treatment plans created using AAA with those recalculated using AXB. Although the Anisotropic Analytical Algorithm (AAA) is currently more widely used in clinical routine, Acuros XB (AXB) has been shown to more accurately calculate the dose distribution, particularly in heterogeneous regions. Studies to predict clinical outcome should be based on modelling the dose delivered to the patient as accurately as possible. CT datasets from ten patients were selected for this retrospective study. VMAT (Volumetric modulated arc therapy) plans with 2 arcs, collimator rotation ± 5-10° and dose prescription 50 Gy / 25 fractions were created using Varian Eclipse (v10.0). The initial dose calculation was performed with AAA, and AXB plans were created by re-calculating the dose distribution using the same number of monitor units (MU) and multileaf collimator (MLC) files as the original plan. The difference in calculated dose to organs at risk (OAR) was compared using dose-volume histogram (DVH) statistics and p values were calculated using the Wilcoxon signed rank test. The potential clinical effect of dosimetric differences in the gross tumour volume (GTV) was evaluated using three different TCP models from the literature. PTV Median dose was apparently 0.9 Gy lower (range: 0.5 Gy - 1.3 Gy; p < 0.05) for VMAT AAA plans re-calculated with AXB and GTV mean dose was reduced by on average 1.0 Gy (0.3 Gy −1.5 Gy; p < 0.05). An apparent difference in TCP of between 1.2% and 3.1% was found depending on the choice of TCP model. OAR mean dose was lower in the AXB recalculated plan than the AAA plan (on average, dose reduction: lung 1.7%, heart 2.4%). Similar trends were seen for CRT plans

  8. Probable approaches to develop particle beam energy drivers and to calculate wall material ablation with X ray radiation from imploded targets

    International Nuclear Information System (INIS)

    The first subject was the development of a future ion beam driver with a medium-mass ion species. This may enable us to develop a driver that is a compromise from the points of view of the micro-divergence angle and the cost. We produced nitrogen ion beams and measured the micro-divergence angle on the anode surface. The measured value was 5-6 mrad for the above beam with 300-400 keV energy, 300 A peak current and 50 ns duration. This value was small enough and tolerable for the future energy driver. The corresponding value for the proton beam with higher peak current was 20-30 mrad, which was too large. Therefore, a scale-up experiment with this kind of medium-mass ion beam must be realized urgently to clarify the beam characteristics in more detail. The reactor wall ablation by the X rays from imploded targets was also calculated as the second subject of this paper. (author)

  9. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look-out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations; a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of the energy released for crushing of structures giving a...

  10. Survivability of integrated PVDF film sensors to accelerated ageing conditions in aeronautical/aerospace structures

    International Nuclear Information System (INIS)

    This work validates the use of integrated polyvinylidene fluoride (PVDF) film sensors for dynamic testing, even after being subjected to UV-thermo-hygro-mechanical accelerated ageing conditions. The verification of the PVDF sensors' survivability under these environmental conditions, typically confronted by civil and military aircraft, is the main concern of the study. The evaluation of survivability is made by comparing the dynamic testing results provided by the PVDF patch sensors subjected to an accelerated ageing protocol with those provided by neutral, non-aged sensors (accelerometers). The available measurements are the time-domain response signals issued from a modal analysis procedure, and the corresponding frequency response functions (FRF). These are in turn used to identify the constitutive properties of the samples by extraction of the modal parameters, in particular the natural frequencies. The composite specimens in this study undergo different accelerated ageing processes. After several weeks of experimentation, the samples exhibit a loss of stiffness, represented by a decrease of up to 10% in the elastic moduli. Despite the ageing, the integrated PVDF sensors, subjected to the same ageing conditions, are still capable of providing reliable data to carry out a close follow-up of these changes. This survivability is a decisive asset for the future use of integrated PVDF sensors to perform structural health monitoring (SHM) of full-scale composite aeronautical structures. (paper)

  11. Study on Probability Calculation for Oil Tank Fire and Explosion Caused by Lightning

    Institute of Scientific and Technical Information of China (English)

    苏伯尼; 黄弘; 李云涛

    2013-01-01

    In order to perform a quantitative risk assessment of fire caused by lightning in oil tank areas, the probabilities of lightning-induced floating roof tank fire and explosion are estimated. According to domestic and international standards, models of lightning and floating roof tanks were established. The risk probability of lightning hitting oil tanks of different sizes was calculated using these models. After that, the risk probability of fire and explosion when lightning hits oil tanks was estimated according to existing experimental results, considering two paths: burn-through of the tank shell and spark generation at the rim seal. The results show that, under normal circumstances, the probability of a rim-seal spark causing fire is greater than that of lightning burning through the oil tank shell, and that the probability of lightning-induced fire and explosion is higher for tanks using a mechanical primary seal than for tanks using a soft primary seal.

  12. Effect of age condition on fatigue properties of 2E12 aluminum alloy

    Institute of Scientific and Technical Information of China (English)

    YAN Liang; DU Feng-shan; DAI Sheng-long; YANG Shou-jie

    2010-01-01

    The fatigue behaviors of 2E12 aluminum alloy in T3 and T6 conditions at room temperature in air were investigated. The microstructures and fatigue fracture surfaces of the alloy were examined by transmission electron microscopy (TEM) and scanning electron microscopy (SEM). The results show that the alloy exhibits higher fatigue crack propagation (FCP) resistance in the T3 condition than in the T6 condition: the fatigue life is increased by 54% and the fatigue crack growth rate (FCGR) decreases significantly. The fatigue fractures of the alloy in the T3 and T6 conditions are transgranular, but in the T3 condition secondary cracks occur and fatigue striations are not clear, while in the T6 condition ductile fatigue striations are observed. The effect of aging conditions on fatigue behavior is explained in terms of the slip planarity of dislocations and the cyclic slip reversibility.

  13. Calculation method for the probability of tailings dam failure caused by dam slope instability considering correlation of variables

    Institute of Scientific and Technical Information of China (English)

    郑欣; 李全明; 许开立; 耿丽艳

    2015-01-01

    Tailings dam failure accidents at home and abroad were analyzed, showing that dam slope instability is a major cause of dam failure. The slope instability was analyzed with catastrophe theory, which shows that the internal friction angle and the cohesion are the main factors influencing slope instability and failure. Abandoning the traditional deterministic calculation of the anti-slide stability safety factor, a probabilistic method that comprehensively accounts for the uncertainty of each factor was chosen to calculate the probability of dam failure: the internal friction angle and cohesion were taken as random parameters, the limit-state function for slope instability was established, and the number of simulations was determined. Monte Carlo simulation accounting for the correlation between the random variables was then applied to calculate the failure probability. This method overcomes the limitation of the traditional Matlab subset-simulation approach, which can only handle random variables that are normally distributed and mutually independent; the probability of dam failure caused by slope instability was calculated for an example tailings dam.
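
    A minimal sketch of the approach described above: Monte Carlo estimation of a failure probability with correlated random inputs (friction angle and cohesion), with the correlation induced by a Cholesky factor. The limit-state function g() and all parameter values below are placeholders, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

mean = np.array([30.0, 20.0])             # assumed means: phi [deg], c [kPa]
sd = np.array([3.0, 4.0])                 # assumed standard deviations
rho = -0.4                                # assumed phi-c correlation
cov = np.array([[sd[0]**2, rho * sd[0] * sd[1]],
                [rho * sd[0] * sd[1], sd[1]**2]])

L = np.linalg.cholesky(cov)               # induce correlation on iid normals
z = rng.standard_normal((n, 2))
phi, c = (mean + z @ L.T).T

def g(phi, c):
    # Placeholder limit state: failure when g < 0 (not the paper's function).
    return 0.9 * np.tan(np.radians(phi)) + 0.02 * c - 0.75

p_fail = np.mean(g(phi, c) < 0.0)
print(f"Estimated failure probability: {p_fail:.4f}")
```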

  14. Interpretations of Negative Probabilities

    OpenAIRE

    Burgin, Mark

    2010-01-01

    In this paper, we give a frequency interpretation of negative probability, as well as of extended probability, demonstrating that to a great extent, these new types of probabilities, behave as conventional probabilities. Extended probability comprises both conventional probability and negative probability. The frequency interpretation of negative probabilities gives supportive evidence to the axiomatic system built in (Burgin, 2009; arXiv:0912.4767) for extended probability as it is demonstra...

  15. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  16. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
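
    The Monte Carlo principle mentioned above can be illustrated in a few lines: propagate assumed input distributions through a measurement model and read off a 95% interval. The model (a resistance from voltage and current readings) and all numbers are invented for the example, not taken from the book.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

v = rng.normal(12.0, 0.05, n)   # voltage reading, assumed N(12.0, 0.05)
i = rng.normal(2.0, 0.02, n)    # current reading, assumed N(2.0, 0.02)
r = v / i                       # measurand: resistance R = V / I

lo, hi = np.percentile(r, [2.5, 97.5])
print(f"R = {r.mean():.4f} ohm, 95% interval [{lo:.4f}, {hi:.4f}]")
```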

  17. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
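
    A toy illustration of the article's theme, assuming a made-up bias: estimate the landing probability of one side of a biased spinner from its relative frequency over many spins.

```python
import random

sides = ["A", "B", "C", "D"]
weights = [0.4, 0.3, 0.2, 0.1]   # assumed bias of the spinner

trials = 100_000
hits = sum(random.choices(sides, weights)[0] == "A" for _ in range(trials))
print(f"Estimated P(A) = {hits / trials:.3f} (true value 0.4)")
```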

  18. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  19. Gaussian Probabilities and Expectation Propagation

    OpenAIRE

    Cunningham, John P.; Hennig, Philipp; Lacoste-Julien, Simon

    2011-01-01

    While Gaussian probability densities are omnipresent in applied mathematics, Gaussian cumulative probabilities are hard to calculate in any but the univariate case. We study the utility of Expectation Propagation (EP) as an approximate integration method for this problem. For rectangular integration regions, the approximation is highly accurate. We also extend the derivations to the more general case of polyhedral integration regions. However, we find that in this polyhedral case, EP's answer...
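
    A sketch of the integration problem the paper studies: the probability that a multivariate Gaussian falls in a rectangle. EP itself is not implemented here; plain Monte Carlo serves as the baseline for a bivariate case with an assumed correlation.

```python
import numpy as np

rng = np.random.default_rng(2)
mean = np.zeros(2)
cov = np.array([[1.0, 0.6],
                [0.6, 1.0]])      # assumed correlation 0.6

# Monte Carlo estimate of the rectangle probability.
x = rng.multivariate_normal(mean, cov, size=1_000_000)
inside = np.all((x > [-1.0, -0.5]) & (x < [2.0, 1.5]), axis=1)
print(f"P(-1 < X1 < 2, -0.5 < X2 < 1.5) ≈ {inside.mean():.4f}")
```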

  20. On Probability Leakage

    OpenAIRE

    Briggs, William M

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
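
    A small numerical illustration of the definition above, with invented numbers: a normal regression model for a strictly positive quantity assigns positive probability to the impossible region y < 0, and that mass is the leakage.

```python
from math import erf, sqrt

mu, sigma = 1.2, 0.8   # fitted predictive mean and sd (assumed)

def normal_cdf(x, mu, sigma):
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

leakage = normal_cdf(0.0, mu, sigma)   # P_M(y < 0), impossible given E
print(f"Probability leakage: {leakage:.4f}")
```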

  1. Are early onset aging conditions correlated to daily activity functions in youth and adults with Down syndrome?

    Science.gov (United States)

    Lin, Jin-Ding; Lin, Lan-Ping; Hsu, Shang-Wei; Chen, Wen-Xiu; Lin, Fu-Gong; Wu, Jia-Ling; Chu, Cordia

    2014-11-13

    This study aims to answer the research question "Are early onset aging conditions correlated to daily activity functions in youth and adults with Down syndrome (DS)?" A cross-sectional survey was employed to recruit 216 individuals with DS aged over 15 years for the analyses. A structured questionnaire comprising demographic data, brief self-reported aging conditions, the Dementia Screening Questionnaire for Individuals with Intellectual Disabilities (DSQIID) and activity of daily living (ADL) scales was completed by the primary caregivers, who were well placed to provide information on the functioning of the DS individuals. Results showed that the five most frequent aging conditions (sometimes, usually and always) were frailty (20.2%), vision problems (15.8%), loss of language ability (15.3%), sleep problems (14.9%) and memory impairment (14.5%). Other onset aging conditions included more chronic diseases (13.9%), hearing loss (13%), chewing ability and tooth loss (12.5%), incontinence (11.1%), depressive syndrome (7.7%), falls and gait disorder (7.2%), and loss of taste and smell (7.2%). The data also showed that DSQIID scores, onset aging conditions and ADL had significant pairwise relationships in Pearson correlation tests. Finally, multiple linear regression analyses indicated that onset aging conditions (β=-0.735, p<0.001) significantly predicted the variation in ADL scores after adjusting for other factors (R²=0.381). This study suggests that the authorities should initiate early intervention programs aimed at improving healthy aging and ADL functions for people with DS. PMID:25462513

  2. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
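
    One classic instance of the 1/e phenomenon, checked by simulation: the probability that a random permutation has no fixed point (a derangement) tends to 1/e ≈ 0.3679.

```python
import random

n, trials = 20, 200_000
count = 0
for _ in range(trials):
    p = list(range(n))
    random.shuffle(p)
    if all(p[i] != i for i in range(n)):   # no element stays in place
        count += 1
print(f"P(no fixed point) ≈ {count / trials:.4f}  (1/e ≈ 0.3679)")
```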

  3. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    2013-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  4. Fusion Probability in Dinuclear System

    CERN Document Server

    Hong, Juhee

    2015-01-01

    Fusion can be described by the time evolution of a dinuclear system with two degrees of freedom: the relative motion and the transfer of nucleons. In the presence of coupling between the two collective modes, we solve the Fokker-Planck equation in a locally harmonic approximation. The potential of a dinuclear system has a quasifission barrier and an inner fusion barrier, and the escape rates can be calculated with Kramers' model. To estimate the fusion probability, we calculate the quasifission rate and the fusion rate, and we investigate the coupling effects on the fusion probability and the evaporation-residue cross section.

  5. Emptiness Formation Probability

    Science.gov (United States)

    Crawford, Nicholas; Ng, Stephen; Starr, Shannon

    2016-08-01

    We present rigorous upper and lower bounds on the emptiness formation probability for the ground state of a spin-1/2 Heisenberg XXZ quantum spin system. For a d-dimensional system we find a rate of decay of order exp(-c L^{d+1}), where L is the side length of the box in which we ask for the emptiness formation event to occur. In the d = 1 case this confirms previous predictions made in the integrable systems community, though our bounds do not achieve the precision predicted by Bethe ansatz calculations. On the other hand, our bounds in the case d ≥ 2 are new. The main tools we use are reflection positivity and a rigorous path integral expansion, which is a variation on those previously introduced by Toth, Aizenman-Nachtergaele and Ueltschi.

  6. Effect of surface conditioning methods on the microtensile bond strength of resin composite to composite after aging conditions

    NARCIS (Netherlands)

    Ozcan, Mutlu; Barbosa, Silvia Helena; Melo, Renata Marques; Galhano, Graziela Avila Prado; Bottino, Marco Antonio

    2007-01-01

    Objectives. This study evaluated the effect of two different surface conditioning methods on the repair bond strength of a bis-GMA-adduct/bis-EMA/TEGDMA based resin composite after three aging conditions. Methods. Thirty-six composite resin blocks (Esthet X, Dentsply) were prepared (5 mm x 6 mm x 6

  7. Probability Calculation of Blowout of Drilling Platform Based on Fuzzy Fault Tree Method

    Institute of Scientific and Technical Information of China (English)

    董海波; 顾学康

    2013-01-01

    Based on fuzzy theory, a quantitative risk assessment method called fuzzy fault tree analysis is presented. By referring to historical databases or drawing on expert judgement, the occurrence possibility of each basic event in the fault tree model is expressed in the form of fuzzy numbers. Since experts may hold differing opinions, an algorithm for aggregating expert opinion and a method for determining the importance weight of each expert are developed. A fault tree model is constructed with blowout of a semi-submersible drilling platform as the top event, and, according to the fuzzy fault tree method, the probability of blowout during drilling or cementing on a semi-submersible platform is calculated.
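
    A minimal sketch of the fuzzy fault tree idea described above: expert opinions on a basic event are represented as triangular fuzzy numbers (a, m, b), aggregated by a weighted average, and gate probabilities are combined componentwise. The expert weights, opinions, and the two-event OR tree are all invented here; the paper's tree and numbers are not reproduced.

```python
import numpy as np

def weighted_fuzzy(opinions, weights):
    """Aggregate triangular fuzzy numbers (a, m, b) by expert weight."""
    w = np.asarray(weights) / np.sum(weights)
    return tuple(np.asarray(opinions).T @ w)

# Two basic events feeding an OR gate (top event = blowout, say).
e1 = weighted_fuzzy([(0.001, 0.002, 0.004),
                     (0.002, 0.003, 0.005)], weights=[0.6, 0.4])
e2 = weighted_fuzzy([(0.0005, 0.001, 0.002),
                     (0.001, 0.002, 0.003)], weights=[0.5, 0.5])

# OR gate on (near-)independent events: P = 1 - (1-p1)(1-p2), componentwise.
top = tuple(1 - (1 - a) * (1 - b) for a, b in zip(e1, e2))
print("Top event (a, m, b):", tuple(round(t, 5) for t in top))
```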

  8. Calculation of Leakage Probability of Pipe Connecting Flange Based on Monte Carlo Method

    Institute of Scientific and Technical Information of China (English)

    王程龙; 谢禹钧; 韦权权; 于小泽

    2016-01-01

    Bolted flange joints were simulated with the finite element software Ansys to obtain the gasket stress distribution under preload and operating conditions, and the gasket stress was calculated for different internal pressures. Because the pipeline pressure fluctuates constantly and the pipe stress state is complex, uncertain factors such as working pressure and temperature, and their constraints on tightness, must be fully considered. The Monte Carlo method is used for the reliability analysis under fluctuating working pressure: a limit-state equation is created and, according to the distribution types of the variables, Matlab is used for repeated random sampling to calculate the probability of bolted-flange leakage under different working pressures. The results show that the additional load produced by working-pressure fluctuation has a large effect on bolted-flange leakage and must be given sufficient attention.
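
    A sketch of the reliability step described above: sample a fluctuating working pressure, evaluate a limit-state function comparing residual gasket stress with a required sealing stress, and count failures. The limit equation and all parameters are placeholders, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500_000

p = rng.normal(2.0, 0.25, n)            # working pressure [MPa], assumed
sigma_seal = rng.normal(40.0, 3.0, n)   # residual gasket stress [MPa], assumed

m_factor, required = 6.5, 20.0          # assumed gasket factor and threshold
g = sigma_seal - m_factor * p - required    # leakage when g < 0

print(f"Estimated leakage probability: {np.mean(g < 0):.5f}")
```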

  9. Average Transmission Probability of a Random Stack

    Science.gov (United States)

    Lu, Yin; Miniatura, Christian; Englert, Berthold-Georg

    2010-01-01

    The transmission through a stack of identical slabs that are separated by gaps with random widths is usually treated by calculating the average of the logarithm of the transmission probability. We show how to calculate the average of the transmission probability itself with the aid of a recurrence relation and derive analytical upper and lower…

  10. Non-Archimedean Probability

    OpenAIRE

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    2011-01-01

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and zero- and unit-probability events pose no particular epistemological problems. We use a non-Archimedean field as the range of the probability function. As a result, the property of countable additivity in Kolmogorov's axiomatization o...

  11. Logical Probability Preferences

    OpenAIRE

    Saad, Emad

    2013-01-01

    We present a unified logical framework for representing and reasoning about both quantitative and qualitative probability preferences in probability answer set programming, called probability answer set optimization programs. The proposed framework is vital for allowing the definition of quantitative probability preferences over the possible outcomes of qualitative preferences. We show the application of probability answer set optimization programs to a variant of the well-known nurse rostering problem, c...

  12. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  13. Agreeing Probability Measures for Comparative Probability Structures

    OpenAIRE

    Wakker, Peter

    1981-01-01

    It is proved that fine and tight comparative probability structures (where the set of events is assumed to be an algebra, not necessarily a $\sigma$-algebra) have agreeing probability measures. Although this was often claimed in the literature, all proofs the author encountered are not valid for the general case, but only for $\sigma$-algebras. Here the proof of Niiniluoto (1972) is supplemented. Furthermore an example is presented that reveals many misunderstandings in the literature. At the...

  14. Non-Archimedean Probability

    CERN Document Server

    Benci, Vieri; Wenmackers, Sylvia

    2011-01-01

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and zero- and unit-probability events pose no particular epistemological problems. We use a non-Archimedean field as the range of the probability function. As a result, the property of countable additivity in Kolmogorov's axiomatization of probability is replaced by a different type of infinite additivity.

  15. Elements of probability theory

    CERN Document Server

    Rumshiskii, L Z

    1965-01-01

    Elements of Probability Theory presents the methods of the theory of probability. This book is divided into seven chapters that discuss the general rule for the multiplication of probabilities, the fundamental properties of the subject matter, and the classical definition of probability. The introductory chapters deal with the functions of random variables; continuous random variables; numerical characteristics of probability distributions; center of the probability distribution of a random variable; definition of the law of large numbers; stability of the sample mean and the method of moments

  16. Evaluating probability forecasts

    CERN Document Server

    Lai, Tze Leung; Shen, David Bo; 10.1214/11-AOS902

    2012-01-01

    Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability forecasts. This approach uses loss functions relating the predicted to the actual probabilities of the events and applies martingale theory to exploit the temporal structure between the forecast and the subsequent occurrence or nonoccurrence of the event.
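
    Two standard scoring rules of the kind the paper evaluates, computed on invented forecast data: the Brier score and the logarithmic score (lower is better for both).

```python
import numpy as np

def brier(p, y):
    """Mean squared error between forecast probabilities and outcomes."""
    return np.mean((p - y) ** 2)

def log_score(p, y, eps=1e-12):
    """Negative mean log-likelihood of the observed binary outcomes."""
    p = np.clip(p, eps, 1 - eps)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

p = np.array([0.9, 0.7, 0.2, 0.5])   # forecast probabilities (example)
y = np.array([1, 1, 0, 0])           # observed outcomes (example)
print(f"Brier: {brier(p, y):.3f}, log score: {log_score(p, y):.3f}")
```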

  17. Nuclear structure of tellurium 133 via beta decay and shell model calculations in the doubly magic tin 132 region. [J,. pi. , transition probabilities, neutron and proton separation, g factors

    Energy Technology Data Exchange (ETDEWEB)

    Lane, S.M.

    1979-08-01

    An experimental investigation of the level structure of ¹³³Te was performed by spectroscopy of gamma-rays following the beta-decay of 2.7 min ¹³³Sb. Multiscaled gamma-ray singles spectra and 2.5 × 10⁷ gamma-gamma coincidence events were used in the assignment of 105 of the approximately 400 observed gamma-rays to ¹³³Sb decay and in the construction of the ¹³³Te level scheme with 29 excited levels. One hundred twenty-two gamma-rays were identified as originating in the decay of other isotopes of Sb or their daughter products. The remaining gamma-rays are associated with the decay of impurity atoms or have as yet not been identified. A new computer program based on the Lanczos tridiagonalization algorithm using an uncoupled m-scheme basis and vector manipulations was written. It was used to calculate energy levels, parities, spins, model wavefunctions, neutron and proton separation energies, and some electromagnetic transition probabilities for the following nuclei in the ¹³²Sn region: ¹²⁸Sn, ¹²⁹Sn, ¹³⁰Sn, ¹³¹Sn, ¹³⁰Sb, ¹³¹Sb, ¹³²Sb, ¹³³Sb, ¹³²Te, ¹³³Te, ¹³⁴Te, ¹³⁴I, ¹³⁵I, ¹³⁵Xe, and ¹³⁶Xe. The results are compared with experiment and the agreement is generally good. For non-magic nuclei, the 1g7/2, 2d5/2, 2d3/2, 1h11/2, and 3s1/2 orbitals are available to valence protons, and the 2d5/2, 2d3/2, 1h11/2, and 3s1/2 orbitals are available to valence neutron holes. The present CDC7600 computer code can accommodate 59 single-particle states and vectors comprising 30,000 Slater determinants. The effective interaction used was that of Petrovich, McManus, and Madsen, a modification of the Kallio-Kolltveit realistic force. Single-particle energies, effective charges and effective g-factors were determined from experimental data for nuclei in the ¹³²Sn region. 116 references.

  18. Estimating extreme flood probabilities

    International Nuclear Information System (INIS)

    Estimates of the exceedance probabilities of extreme floods are needed for the assessment of flood hazard at Department of Energy facilities. A new approach using a joint probability distribution of extreme rainfalls and antecedent soil moisture conditions, along with a rainfall-runoff model, provides estimates of probabilities for floods approaching the probable maximum flood. This approach is illustrated for a 570 km² catchment in Wisconsin and a 260 km² catchment in Tennessee.

  19. Introduction to probability

    CERN Document Server

    Roussas, George G

    2006-01-01

    Roussas's Introduction to Probability features exceptionally clear explanations of the mathematics of probability theory and explores its diverse applications through numerous interesting and motivational examples. It provides a thorough introduction to the subject for professionals and advanced students taking their first course in probability. The content is based on the introductory chapters of Roussas's book, An Intoduction to Probability and Statistical Inference, with additional chapters and revisions. Written by a well-respected author known for great exposition an

  20. Dependent Probability Spaces

    Science.gov (United States)

    Edwards, William F.; Shiflett, Ray C.; Shultz, Harris

    2008-01-01

    The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…

  1. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  2. Philosophy and probability

    CERN Document Server

    Childers, Timothy

    2013-01-01

    Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability (frequentist, propensity, classical, Bayesian, and objective Bayesian) and uses stimulating examples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,

  3. Introduction to probability models

    CERN Document Server

    Ross, Sheldon M

    2006-01-01

    Introduction to Probability Models, Tenth Edition, provides an introduction to elementary probability theory and stochastic processes. There are two approaches to the study of probability theory. One is heuristic and nonrigorous, and attempts to develop in students an intuitive feel for the subject that enables him or her to think probabilistically. The other approach attempts a rigorous development of probability by using the tools of measure theory. The first approach is employed in this text. The book begins by introducing basic concepts of probability theory, such as the random v

  4. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.
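
    The gradient property stated above can be checked numerically for the simplest case. Assuming the log-sum-exp function associated with multinomial logit as the CPGF (an illustrative choice, not the paper's general construction), its gradient reproduces the softmax choice probabilities.

```python
import numpy as np

v = np.array([1.0, 0.5, -0.2])           # systematic utilities (example)

def G(v):
    return np.log(np.sum(np.exp(v)))     # logit CPGF: log-sum-exp

# Central-difference gradient of G at v.
eps = 1e-6
grad = np.array([(G(v + eps * np.eye(3)[j]) - G(v - eps * np.eye(3)[j]))
                 / (2 * eps) for j in range(3)])

softmax = np.exp(v) / np.sum(np.exp(v))
print("gradient of CPGF:   ", grad.round(4))
print("choice probabilities:", softmax.round(4))   # identical to the gradient
```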

  5. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  6. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to...

  7. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  8. Qubit persistence probability

    International Nuclear Information System (INIS)

    In this work, I formulate the persistence probability for a qubit device as the probability of measuring its computational degrees of freedom in the unperturbed state without the decoherence arising from environmental interactions. A decoherence time can be obtained from the persistence probability. Drawing on recent work of Garg, and also Palma, Suomine, and Ekert, I apply the persistence probability formalism to a generic single-qubit device coupled to a thermal environment, and also apply it to a trapped-ion quantum register coupled to the ion vibrational modes. (author)

  9. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  10. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
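
    A worked example of the article's theme: drawing 2 balls without replacement from an urn with 5 red and 3 blue, with two equivalent calculations of P(both red), chained conditional probabilities and the hypergeometric count.

```python
from fractions import Fraction
from math import comb

chained = Fraction(5, 8) * Fraction(4, 7)       # P(R1) * P(R2 | R1)
hypergeom = Fraction(comb(5, 2), comb(8, 2))    # C(5,2) / C(8,2)
print(chained, hypergeom, chained == hypergeom)  # 5/14 both ways
```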

  11. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  12. On Quantum Conditional Probability

    Directory of Open Access Journals (Sweden)

    Isabel Guerra Bobo

    2013-02-01

    We argue that quantum theory does not allow for a generalization of the notion of classical conditional probability by showing that the probability defined by the Lüders rule, standardly interpreted in the literature as the quantum-mechanical conditionalization rule, cannot be interpreted as such.
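
    The Lüders conditionalization discussed above, computed for a qubit: p(B|A) = Tr(P_B P_A ρ P_A) / Tr(P_A ρ). The state and the non-commuting projectors below are arbitrary illustrative choices.

```python
import numpy as np

rho = np.array([[0.7, 0.2], [0.2, 0.3]], dtype=complex)   # a density matrix

P_A = np.array([[1, 0], [0, 0]], dtype=complex)           # projector onto |0>
plus = np.array([[1], [1]], dtype=complex) / np.sqrt(2)
P_B = plus @ plus.conj().T                                # projector onto |+>

num = np.trace(P_B @ P_A @ rho @ P_A).real
den = np.trace(P_A @ rho).real
print(f"Lüders p(B|A) = {num / den:.4f}")                 # 0.5 here
```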

  13. Topics in probability

    CERN Document Server

    Prabhu, Narahari

    2011-01-01

    Recent research in probability has been concerned with applications such as data mining and finance models. Some aspects of the foundations of probability theory have receded into the background. Yet, these aspects are very important and have to be brought back into prominence.

  14. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  15. Economy, probability and risk

    Directory of Open Access Journals (Sweden)

    Elena Druica

    2007-05-01

    The science of probabilities has earned a special place because it has tried, through its concepts, to build a bridge between theory and experiment. As a formal notion which by definition does not lead to polemic, probability nevertheless meets a series of difficulties of interpretation whenever it must be applied to particular situations. Usually, the economic literature brings into discussion two interpretations of the concept of probability: the objective interpretation, often found under the name of frequency or statistical interpretation, and the subjective or personal interpretation. Surprisingly, a third approach is excluded: the logical interpretation. The purpose of the present paper is to study some aspects of the subjective and logical interpretations of probability, as well as their implications in economics.

  16. Heterogeneous Calculation of ε

    International Nuclear Information System (INIS)

    A heterogeneous method of calculating the fast fission factor given by Naudet has been applied to the Carlvik-Pershagen definition of ε. An exact calculation of the collision probabilities is included in the programme developed for the Ferranti Mercury computer.

  17. A case concerning the improved transition probability

    OpenAIRE

    Tang, Jian; Wang, An Min

    2006-01-01

    As is well known, existing perturbation theory can be applied to calculations of energy, states and transition probabilities in many quantum systems. However, in our view there are different paths and methods for improving its calculational precision and efficiency. According to an improved scheme of perturbation theory proposed by [An Min Wang, quant-ph/0611217], we reconsider the transition probability and perturbed energy for a hydrogen atom in a constant magnetic field. We find the results obt...

  18. Probability Analysis and Calculation of Kinematic Accuracy for Slider-Crank Mechanism

    Institute of Scientific and Technical Information of China (English)

    陈胜军; 贾方

    2013-01-01

    A general model for precision analysis of the slider-crank mechanism is established based on matrix analysis theory, and a probability model of its kinematic accuracy is obtained from state functions. Taking the centric slider-crank mechanism as the specific object, the movement output accuracy model and the corresponding probability model are established. A case study shows that, under a given design accuracy, the centric slider-crank mechanism has different kinematic errors and different reliabilities in different motion states. The models can quantitatively give the failure probability of the mechanism in different motion states, which is of practical value for the design and manufacture of slider-crank mechanisms.
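
    A sketch of a probabilistic accuracy analysis for a centric slider-crank, in the spirit of the study above: the slider position is x = r cos θ + sqrt(l² - r² sin²θ); the link lengths r and l receive random manufacturing errors (assumed normal), and Monte Carlo sampling estimates the probability that the output error stays within a tolerance. All dimensions and tolerances below are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200_000
theta = np.pi / 3                            # crank angle under study

r = rng.normal(50.0, 0.02, n)                # crank radius [mm], assumed sd
l = rng.normal(200.0, 0.05, n)               # rod length [mm], assumed sd

def slider_x(r, l, t):
    return r * np.cos(t) + np.sqrt(l**2 - (r * np.sin(t))**2)

x = slider_x(r, l, theta)
x_nom = slider_x(50.0, 200.0, theta)         # nominal output position
p_ok = np.mean(np.abs(x - x_nom) < 0.08)     # tolerance 0.08 mm, assumed
print(f"P(|error| < 0.08 mm) ≈ {p_ok:.4f}")
```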

  19. A Novel Approach to Probability

    CERN Document Server

    Kafri, Oded

    2016-01-01

    When P indistinguishable balls are randomly distributed among L distinguishable boxes, and considering the dense system in which P is much greater than L, our natural intuition tells us that the box with the average number of balls has the highest probability and that none of the boxes are empty; however, in reality the probability of the empty box is always the highest. This stands in contradistinction to the sparse system, in which the number of balls is smaller than the number of boxes (i.e. energy distribution in a gas), where the average value has the highest probability. Here we show that when we postulate the requirement that all possible configurations of balls in the boxes have equal probabilities, a realistic "long tail" distribution is obtained. This formalism, when applied to sparse systems, converges to distributions in which the average is preferred. We calculate some of the distributions resulting from this postulate and obtain most of the known distributions in nature, namely Zipf's law, Benford's law, part...

  20. The concept of probability

    International Nuclear Information System (INIS)

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our every day life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  1. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  2. Relativistic many-body calculations of transition probabilities for the 2l₁2l₂[LSJ]-2l₃2l₄[L'S'J'] lines in Be-like ions

    International Nuclear Information System (INIS)

    Reduced matrix elements, oscillator strengths, and transition rates are calculated for all allowed and forbidden 2s-2p electric dipole transitions in berylliumlike ions with nuclear charges ranging from Z = 4 to 100. Many-body perturbation theory (MBPT), including the Breit interaction, is used to evaluate retarded E1 matrix elements in length and velocity forms. The calculations start with a 1s² Dirac-Fock potential and include all possible n = 2 configurations, leading to 4 odd-parity and 6 even-parity states. First-order perturbation theory is used to obtain intermediate coupling coefficients. Second-order MBPT is used to determine the matrix elements, which are evaluated for the 16 possible E1 transitions. The transition energies used in the calculation of oscillator strengths and transition rates are evaluated using second-order MBPT. The importance of virtual electron-positron pair (negative energy) contributions to the transition amplitudes is discussed. (orig.)

  3. Relativistic calculations of 3s² ¹S₀-3s3p ¹P₁ and 3s² ¹S₀-3s3p ³P₁,₂ transition probabilities in the Mg isoelectronic sequence

    International Nuclear Information System (INIS)

    Using the multi-configuration Dirac-Fock self-consistent field method and the relativistic configuration-interaction method, calculations of transition energies, oscillator strengths and rates are performed for the 3s² ¹S₀-3s3p ¹P₁ spin-allowed transition and the 3s² ¹S₀-3s3p ³P₁,₂ intercombination and magnetic quadrupole transitions in the Mg isoelectronic sequence (Mg I, Al II, Si III, P IV and S V). Electron correlations, including intravalence electron correlations, are treated adequately. The influence of the Breit interaction on oscillator strengths and transition energies is investigated, and quantum electrodynamics corrections are added as corrections. The results are found to be in good agreement with experimental data and other theoretical calculations. (atomic and molecular physics)

  4. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to his memory and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability, across psychological aspects of formulating subjective probability statements and abstract measure-theoretical considerations, to contributions to theoretical statistics an...

  5. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  6. Stochastic Programming with Probability

    CERN Document Server

    Andrieu, Laetitia; Vázquez-Abad, Felisa

    2007-01-01

    In this work we study optimization problems subject to a failure constraint. This constraint is expressed in terms of a condition that causes failure, representing a physical or technical breakdown. We formulate the problem in terms of a probability constraint, where the level of "confidence" is a modelling parameter and has the interpretation that the probability of failure should not exceed that level. Application of the stochastic Arrow-Hurwicz algorithm poses two difficulties: one is structural and arises from the lack of convexity of the probability constraint, and the other is the estimation of the gradient of the probability constraint. We develop two gradient estimators with decreasing bias via a convolution method and a finite difference technique, respectively, and we provide a full analysis of convergence of the algorithms. Convergence results are used to tune the parameters of the numerical algorithms in order to achieve best convergence rates, and numerical results are included via an example of ...

  7. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  8. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  9. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.;

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments that...

  10. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.;

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake “calibrating adjustments” to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments that...

  11. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  12. Probability with Roulette

    Science.gov (United States)

    Marshall, Jennings B.

    2007-01-01

    This article describes how roulette can be used to teach basic concepts of probability. Various bets are used to illustrate the computation of expected value. A betting system shows variations in patterns that often appear in random events.
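
    The expected-value computation the article builds on, for the simplest case: a $1 straight-up bet on one number of an American (38-slot) wheel pays 35 to 1.

```python
from fractions import Fraction

p_win = Fraction(1, 38)
ev = p_win * 35 + (1 - p_win) * (-1)
print(f"Expected value per $1 bet: {float(ev):.4f}  ({ev})")   # -1/19
```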

  13. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  14. Bayesian default probability models

    OpenAIRE

    Andrlíková, Petra

    2014-01-01

    This paper proposes a methodology for default probability estimation for low default portfolios, where the statistical inference may become troublesome. The author suggests using logistic regression models with the Bayesian estimation of parameters. The piecewise logistic regression model and Box-Cox transformation of credit risk score is used to derive the estimates of probability of default, which extends the work by Neagu et al. (2009). The paper shows that the Bayesian models are more acc...

  15. Probability densities in strong turbulence

    Science.gov (United States)

    Yakhot, Victor

    2006-03-01

    In this work, using Mellin's transform combined with the Gaussian large-scale boundary condition, we calculate probability densities (PDFs) of velocity increments P(δu,r), of velocity derivatives, and the PDF of the fluctuating dissipation scales Q(η,Re), where Re is the large-scale Reynolds number. The resulting expressions strongly deviate from the log-normal PDF often quoted in the literature. It is shown that the probability density of the small-scale velocity fluctuations includes information about the large (integral) scale dynamics, which is responsible for the deviation of P(δu,r) from log-normality. An expression for the function D(h) of the multifractal theory, free from the spurious logarithms recently discussed in [U. Frisch, M. Martins Afonso, A. Mazzino, V. Yakhot, J. Fluid Mech. 542 (2005) 97], is also obtained.

  16. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  17. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science, which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  18. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability provided the same data used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  19. Probability of causation

    International Nuclear Information System (INIS)

    New Zealand population and cancer statistics have been used to derive the probability that an existing cancer in an individual was the result of a known exposure to radiation. Hypothetical case histories illustrate how sex, race, age at exposure, age at presentation with disease, and the type of cancer affect this probability. The method can be used now to identify claims in which a link between exposure and disease is very strong or very weak, and the types of cancer and population sub-groups for which radiation is most likely to be the causative agent. Advantages and difficulties in using a probability of causation approach in legal or compensation hearings are outlined. The approach is feasible for any carcinogen for which reasonable risk estimates can be made
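
    The usual textbook relation behind this kind of analysis (stated here as the standard formula, not necessarily the report's exact procedure) is PC = ERR / (1 + ERR), where ERR is the excess relative risk for the given exposure, cancer type, and demographic group.

```python
def probability_of_causation(err: float) -> float:
    """Standard relation PC = ERR / (1 + ERR); err values are illustrative."""
    return err / (1.0 + err)

for err in (0.1, 0.5, 1.0, 3.0):
    print(f"ERR = {err:4.1f}  ->  PC = {probability_of_causation(err):.3f}")
```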

  20. Minimum Probability Flow Learning

    CERN Document Server

    Sohl-Dickstein, Jascha; DeWeese, Michael R

    2009-01-01

    Learning in probabilistic models is often severely hampered by the general intractability of the normalization factor and its derivatives. Here we propose a new learning technique that obviates the need to compute an intractable normalization factor or sample from the equilibrium distribution of the model. This is achieved by establishing dynamics that would transform the observed data distribution into the model distribution, and then setting as the objective the minimization of the initial flow of probability away from the data distribution. Score matching, minimum velocity learning, and certain forms of contrastive divergence are shown to be special cases of this learning technique. We demonstrate the application of minimum probability flow learning to parameter estimation in Ising models, deep belief networks, multivariate Gaussian distributions and a continuous model with a highly general energy function defined as a power series. In the Ising model case, minimum probability flow learning outperforms cur...

  1. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  2. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications...

  3. Negative Probabilities and Contextuality

    CERN Document Server

    de Barros, J Acacio; Oas, Gary

    2015-01-01

    There has been a growing interest, both in physics and psychology, in understanding contextuality in experimentally observed quantities. Different approaches have been proposed to deal with contextual systems, and a promising one is contextuality-by-default, put forth by Dzhafarov and Kujala. The goal of this paper is to present a tutorial on a different approach: negative probabilities. We do so by presenting the overall theory of negative probabilities in a way that is consistent with contextuality-by-default and by examining with this theory some simple examples where contextuality appears, both in physics and psychology.

  4. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  5. Probably Almost Bayes Decisions

    DEFF Research Database (Denmark)

    Anoulova, S.; Fischer, Paul; Poelt, S.; Simon, H.- U.

    1996-01-01

    In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian...

  6. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  7. Logic, Truth and Probability

    OpenAIRE

    Quznetsov, Gunn

    1998-01-01

    The propositional logic is generalized on the real numbers field. The logical analog of the Bernoulli independent tests scheme is constructed. The variant of the nonstandard analysis is adopted for the definition of the logical function, which has all properties of the classical probability function. The logical analog of the Large Number Law is deduced from properties of this function.

  8. Logic and probability

    OpenAIRE

    Quznetsov, G. A.

    2003-01-01

    The propositional logic is generalized on the real numbers field. The logical analog of the Bernoulli independent tests scheme is constructed. The variant of the nonstandard analysis is adopted for the definition of the logical function, which has all properties of the classical probability function. The logical analog of the Large Number Law is deduced from properties of this function.

  9. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods

  10. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  11. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  12. Semiclassical transition probabilities for interacting oscillators

    OpenAIRE

    Khlebnikov, S. Yu.

    1994-01-01

    Semiclassical transition probabilities characterize transfer of energy between "hard" and "soft" modes in various physical systems. We establish the boundary problem for singular euclidean solutions used to calculate such probabilities. Solutions are found numerically for a system of two interacting quartic oscillators. In the double-well case, we find numerical evidence that certain regular Minkowskian trajectories have approximate stopping points or, equivalently, are approximately pe...

  13. Failure probability of ceramic coil springs

    OpenAIRE

    Nohut, Serkan; Schneider, Gerold A.

    2009-01-01

    Ceramic springs are commercially available, and a detailed reliability analysis of these components would be useful for their introduction in new applications. In this paper, analytical and numerical analyses of the failure probability of coil springs under compression are presented. Based on analytically derived relationships and numerically calculated results, fitting functions for volume and surface flaws are introduced which provide the prediction of the failure probability of cera...

  14. Aging, condition monitoring, and loss-of-coolant accident (LOCA) tests of Class 1E electrical cables: Summary of results

    International Nuclear Information System (INIS)

    This paper summarizes the results of aging, condition monitoring, and accident testing of various nuclear power plant cable products. Four sets of cables were aged under simultaneous thermal (≅95°C) and radiation (≅0.10 kGy/hr) conditions. One set of cables was aged for 3 months, a second set was aged for 6 months, a third set was aged for 9 months, and a fourth set was not aged. A sequential accident consisting of high dose rate irradiation (≅6 kGy/hr) and high temperature steam was then performed on each set of cables. The results of the tests indicate that the feasibility of life extension of some popular cable products is promising. Mechanical measurements, primarily elongation, modulus, and density, were more effective than electrical measurements for monitoring age-related degradation. The broad objectives of this experimental program were twofold: (a) to determine the life extension potential of popular cable products used in nuclear power plants and (b) to determine the potential of condition monitoring for residual life assessment

  15. Aging, condition monitoring, and loss-of-coolant accident (LOCA) tests of Class 1E electrical cables: Summary of results

    International Nuclear Information System (INIS)

    This paper summarizes the results of aging, condition monitoring, and accident testing of Class 1E cables used in nuclear power generating stations. Three sets of cables were aged for up to 9 months under simultaneous thermal (≅100 degrees C) and radiation (≅0.10 kGy/hr) conditions. After the aging, the cables were exposed to a simulated accident consisting of high dose rate irradiation (≅6 kGy/hr) followed by a high temperature steam exposure. A fourth set of cables, which were unaged, was also exposed to the accident conditions. The cables that were aged for 3 months and then accident tested were subsequently exposed to a high temperature steam fragility test (up to 400 degrees C), while the cables that were aged for 6 months and then accident tested were subsequently exposed to a 1000-hour submergence test in a chemical solution. The results of the tests indicate that the feasibility of life extension of many popular nuclear power plant cable products is promising and that mechanical measurements (primarily elongation, modulus, and density) were more effective than electrical measurements for monitoring age-related degradation. In the high temperature steam test, ethylene propylene rubber (EPR) cable materials generally survived to higher temperatures than crosslinked polyolefin (XLPO) cable materials. In dielectric testing after the submergence testing, the XLPO materials performed better than the EPR materials. This paper presents some recent experimental data that are not yet available elsewhere and a summary of findings from the entire experimental program

  16. Aging, condition monitoring, and loss-of-coolant accident (LOCA) tests of Class 1E electrical cables: Summary of results

    International Nuclear Information System (INIS)

    This paper summarizes the results of aging, condition monitoring, and accident testing of Class 1E cables used in nuclear power generating stations. Three sets of cables were aged for up to 9 months under simultaneous thermal (≅100°C) and radiation (≅0.10 kGy/hr) conditions. After the aging, the cables were exposed to a simulated accident consisting of high dose rate irradiation (≅6 kGy/hr) followed by a high temperature steam exposure. A fourth set of cables, which were unaged, was also exposed to the accident conditions. The cables that were aged for 3 months and then accident tested were subsequently exposed to a high temperature steam fragility test (up to 400°C), while the cables that were aged for 6 months and then accident tested were subsequently exposed to a 1,000-hour submergence test in a chemical solution. The results of the tests indicate that the feasibility of life extension of many popular nuclear power plant cable products is promising and that mechanical measurements (primarily elongation, modulus, and density) were more effective than electrical measurements for monitoring age-related degradation. In the high temperature steam test, ethylene propylene rubber (EPR) cable materials generally survived to higher temperatures than crosslinked polyolefin (XLPO) cable materials. In dielectric testing after the submergence testing, the XLPO materials performed better than the EPR materials. This paper presents some recent experimental data that are not yet available elsewhere and a summary of findings from the entire experimental program

  17. Aging, condition monitoring, and loss-of-coolant accident (LOCA) tests of class 1E electrical cables

    International Nuclear Information System (INIS)

    This report describes the results of aging, condition monitoring, and accident testing of ethylene propylene rubber (EPR) cables. Three sets of cables were aged for up to 9 months under simultaneous thermal (≅100 degrees C) and radiation (≅0.10 kGy/hr) conditions. A sequential accident consisting of high dose rate irradiation (≅6 kGy/hr) and high temperature steam followed the aging. Also exposed to the accident conditions was a fourth set of cables, which were unaged. The test results indicate that most properly installed EPR cables should be able to survive an accident after 60 years for total aging doses of at least 150–200 kGy and for moderate ambient temperatures on the order of 45–55 degrees C (potentially higher or lower, depending on material specific activation energies and total radiation doses). Mechanical measurements (primarily elongation, modulus, and density) were more effective than electrical measurements for monitoring age-related degradation

  18. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  19. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  20. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved; however, the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  1. Transition probabilities for argon I

    International Nuclear Information System (INIS)

    Transition probabilities for ArI lines have been calculated on the basis of the (j,k)-coupling scheme for more than 16000 spectral lines belonging to the transition arrays 4s-np (n=4 to n=9), 5s-np (n=5 to n=9), 6s-np (n=6 to n=9), 7s-np (n=8 to n=9), 4p-ns (n=5 to n=10), 5p-ns (n=6 to n=9), 6p-ns (n=7 to n=8), 4p-nd (n=3 to n=9), 5p-nd (n=4 to n=9), 3d-np (n=5 to n=9), 4d-np (n=6 to n=9), 5d-np (n=7 to n=9), 3d-nf (n=4 to n=9), 4d-nf (n=4 to n=9), 5d-nf (n=5 to n=9), 4f-nd (n=5 to n=9), 5f-nd (n=6 to n=9), 4f-ng (n=5 to n=9), 5f-ng (n=6 to n=9). Insofar as values by other authors exist, comparison is made with these values. It turns out that the results obtained in (j,k)-coupling are close to those obtained in intermediate coupling except for intercombination lines. For high principal and/or orbital quantum numbers the transition probabilities for a multiplet approach those of the corresponding transitions in atomic hydrogen. The calculated values are applied to construct a simplified argon-atom model, which reflects the real transition properties and which allows simplified but realistic non-equilibrium calculations for argon plasmas which deviate from local thermodynamic equilibrium (LTE)

  2. Objectifying Subjective Probabilities

    Czech Academy of Sciences Publication Activity Database

    Childers, Timothy

    Dordrecht: Springer, 2012 (Weber, M.; Dieks, D.; Gonzalez, W.; Hartman, S.; Stadler, F.; Stöltzner, M.), pp. 19-28. (The Philosophy of Science in a European Perspective, 3). ISBN 978-94-007-3029-8. [Pluralism in the Foundations of Statistics, Canterbury (GB), 09.09.2010-10.09.2010] R&D Projects: GA ČR(CZ) GAP401/10/1504. Institutional support: RVO:67985955. Keywords: probabilities; direct inference. Subject RIV: AA - Philosophy; Religion

  3. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  4. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)
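
    The post-processing step described above reduces to a simple frequency count over the ensemble. A hedged sketch with synthetic numbers (the grid size, distribution and threshold are made up, not taken from the Fernald data):

```python
import numpy as np

# Given an ensemble of equally likely geostatistical simulations, the
# probability map is the per-parcel fraction of simulations exceeding
# a clean-up threshold.

rng = np.random.default_rng(42)
n_sims, ny, nx = 200, 50, 50
sims = rng.lognormal(mean=3.0, sigma=1.0, size=(n_sims, ny, nx))  # stand-in simulations

threshold = 35.0                              # hypothetical clean-up level
prob_map = (sims > threshold).mean(axis=0)    # P(exceeding threshold) per parcel
print(prob_map.shape, float(prob_map.max()))
```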

  5. Probability for physicists

    CERN Document Server

    Sirca, Simon

    2016-01-01

    This book is designed as a practical and intuitive introduction to probability, statistics and random quantities for physicists. The book aims at getting to the main points by a clear, hands-on exposition supported by well-illustrated and worked-out examples. A strong focus on applications in physics and other natural sciences is maintained throughout. In addition to basic concepts of random variables, distributions, expected values and statistics, the book discusses the notions of entropy, Markov processes, and fundamentals of random number generation and Monte-Carlo methods.

  6. Measurement Uncertainty and Probability

    Science.gov (United States)

    Willink, Robin

    2013-02-01

    Part I. Principles: 1. Introduction; 2. Foundational ideas in measurement; 3. Components of error or uncertainty; 4. Foundational ideas in probability and statistics; 5. The randomization of systematic errors; 6. Beyond the standard confidence interval; Part II. Evaluation of Uncertainty: 7. Final preparation; 8. Evaluation using the linear approximation; 9. Evaluation without the linear approximations; 10. Uncertainty information fit for purpose; Part III. Related Topics: 11. Measurement of vectors and functions; 12. Why take part in a measurement comparison?; 13. Other philosophies; 14. An assessment of objective Bayesian methods; 15. A guide to the expression of uncertainty in measurement; 16. Measurement near a limit - an insoluble problem?; References; Index.

  7. Integration, measure and probability

    CERN Document Server

    Pitt, H R

    2012-01-01

    This text provides undergraduate mathematics students with an introduction to the modern theory of probability as well as the roots of the theory's mathematical ideas and techniques. Centered around the concept of measure and integration, the treatment is applicable to other branches of analysis and explores more specialized topics, including convergence theorems and random sequences and functions.The initial part is devoted to an exploration of measure and integration from first principles, including sets and set functions, general theory, and integrals of functions of real variables. These t

  8. Chemical immobilization of adult female Weddell seals with tiletamine and zolazepam: effects of age, condition and stage of lactation

    Directory of Open Access Journals (Sweden)

    Harcourt Robert G

    2006-02-01

    Background: Chemical immobilization of Weddell seals (Leptonychotes weddellii) has previously been, for the most part, problematic, and this has been mainly attributed to the type of immobilizing agent used. In addition to individual sensitivity, physiological status may play an important role. We investigated the use of the intravenous administration of a 1:1 mixture of tiletamine and zolazepam (Telazol®) to immobilize adult females at different points during a physiologically demanding 5–6 week lactation period. We also compared performance between IV and IM injection of the same mixture. Results: The tiletamine:zolazepam mixture administered intravenously was an effective method for immobilization, with no fatalities or pronounced apnoeas in 106 procedures; however, there was a 25% (one animal in four) mortality rate with intramuscular administration. Induction time was slightly longer for females at the end of lactation (54.9 ± 2.3 seconds) than at post-parturition (48.2 ± 2.9 seconds). In addition, the number of previous captures had a positive effect on induction time. There was no evidence for effects due to age, condition (total body lipid), stage of lactation or number of captures on recovery time. Conclusion: We suggest that intravenous administration of tiletamine and zolazepam is an effective and safe immobilizing agent for female Weddell seals. Although individual traits could not explain variation in recovery time, we suggest careful monitoring of recovery times during longitudinal studies (> 2 captures). We show that physiological pressures do not substantially affect response to chemical immobilization with this mixture; however, consideration must be taken for differences that may exist for immobilization of adult males and juveniles. Nevertheless, we recommend a mass-specific dose of 0.50–0.65 mg/kg for future procedures with adult female Weddell seals and a starting dose of 0.50 mg/kg for other age classes and other

  9. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another

  10. Lectures on probability and statistics

    Energy Technology Data Exchange (ETDEWEB)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.

  11. Calculation of the pipes failure probability of the Rcic system of a nuclear power station by means of software WinPRAISE 07; Calculo de la probabilidad de falla de tuberias del sistema RCIC de una central nuclear mediante el software WinPRAISE 07

    Energy Technology Data Exchange (ETDEWEB)

    Jasso G, J.; Diaz S, A.; Mendoza G, G.; Sainz M, E. [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico); Garcia de la C, F. M., E-mail: angeles.diaz@inin.gob.mx [Comision Federal de Electricidad, Central Nucleoelectrica Laguna Verde, Km 44.5 Carretera Cardel-Nautla, 91476 Laguna Verde, Alto Lucero, Veracruz (Mexico)

    2014-10-15

    Crack growth and propagation by fatigue is a typical degradation mechanism found in the nuclear industry as well as in conventional industry; the unstable propagation of a crack can cause the catastrophic failure of a metallic component, even one of high ductility. For this reason, programmed maintenance activities have been established in industry, using visual and/or ultrasonic inspection techniques at an established periodicity, making it possible to follow up these growths and control their undesirable effects; however, these activities increase operating costs and, in the particular case of the nuclear industry, increase the radiation exposure of the personnel involved. The use of mathematical methods that integrate concepts of uncertainty, material properties and the probability associated with inspection results has become a powerful tool for evaluating component reliability, reducing costs and exposure levels. In this work, the evaluation of the failure probability due to growth of pre-existing fatigue cracks is presented for pipes of the Reactor Core Isolation Cooling (Rcic) system of a nuclear power station. The software WinPRAISE 07 (Piping Reliability Analysis Including Seismic Events) was used, which is based on the principles of probabilistic fracture mechanics. The obtained failure probabilities evidenced good behavior of the analyzed pipes, with a maximum of the order of 1.0 E-6; it is therefore concluded that the performance of these pipe lines is reliable even when the calculations are extrapolated to 10, 20, 30 and 40 years of service. (Author)

  12. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers’ theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises’ exceptionally restrictive definition of probability. This paper challenges Richard von Mises’ definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that both the nature of human action and the relative frequency method for calculating numerical probabilities presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  13. Probable maximum flood control

    International Nuclear Information System (INIS)

    This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Flood protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility

  14. Accidents, probabilities and consequences

    International Nuclear Information System (INIS)

    Following brief discussion of the safety of wind-driven power plants and solar power plants, some aspects of the safety of fast breeder and thermonuclear power plants are presented. It is pointed out that no safety evaluation of breeders comparable to the Rasmussen investigation has been carried out and that discussion of the safety aspects of thermonuclear power has only just begun. Finally, as an illustration of the varying interpretations of risk and safety analyses, four examples are given of predicted probabilities and consequences in Copenhagen of the maximum credible accident at the Barsebaeck plant, under the most unfavourable meteorological conditions. These are made by the Environment Commission, Risoe Research Establishment, REO (a pro-nuclear group) and OOA (an anti-nuclear group), and vary by a factor of over 1000. (JIW)

  15. Measure, integral and probability

    CERN Document Server

    Capiński, Marek

    2004-01-01

    Measure, Integral and Probability is a gentle introduction that makes measure and integration theory accessible to the average third-year undergraduate student. The ideas are developed at an easy pace in a form that is suitable for self-study, with an emphasis on clear explanations and concrete examples rather than abstract theory. For this second edition, the text has been thoroughly revised and expanded. New features include: · a substantial new chapter, featuring a constructive proof of the Radon-Nikodym theorem, an analysis of the structure of Lebesgue-Stieltjes measures, the Hahn-Jordan decomposition, and a brief introduction to martingales · key aspects of financial modelling, including the Black-Scholes formula, discussed briefly from a measure-theoretical perspective to help the reader understand the underlying mathematical framework. In addition, further exercises and examples are provided to encourage the reader to become directly involved with the material.

  16. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions.
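
    For orientation, the two quantities named above have simple closed forms under the stated assumptions. The sketch below uses the standard Rayleigh exceedance law P(H > h) = exp(-2(h/Hs)^2) and a first-order approximation for the expected maximum; the numbers are illustrative, not from the paper:

```python
import math

def encounter_probability(return_period_years, lifetime_years):
    # Probability that the T-year event is exceeded at least once
    # during a structure lifetime of L years.
    return 1.0 - (1.0 - 1.0 / return_period_years) ** lifetime_years

def expected_max_wave(Hs, n_waves):
    # Expected maximum of n Rayleigh-distributed individual wave heights,
    # with the usual Euler-Mascheroni correction term.
    g = 0.5772156649
    ln_n = math.log(n_waves)
    return Hs * (math.sqrt(ln_n / 2.0) + g / math.sqrt(8.0 * ln_n))

print(encounter_probability(100, 50))   # ~0.39 for a 100-year event, 50-year life
print(expected_max_wave(10.0, 1000))    # ~19.4 m for Hs = 10 m and 1000 waves
```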

  17. The Britannica Guide to Statistics and Probability

    CERN Document Server

    2011-01-01

    By observing patterns and repeated behaviors, mathematicians have devised calculations to significantly reduce human potential for error. This volume introduces the historical and mathematical basis of statistics and probability as well as their application to everyday situations. Readers will also meet the prominent thinkers who advanced the field and established a numerical basis for prediction

  18. Probability & Perception: The Representativeness Heuristic in Action

    Science.gov (United States)

    Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.

    2014-01-01

    If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…

  19. Savage's Concept of Probability

    Institute of Scientific and Technical Information of China (English)

    熊卫

    2003-01-01

    Starting with personal preference, Savage [3] constructs a foundation theory for probability, moving from qualitative probability to quantitative probability and then to utility. There are profound logical connections between the three steps in Savage's theory; that is, the quantitative concepts properly represent the qualitative ones. Moreover, Savage's definition of subjective probability is in accordance with probability theory, and the theory gives us a rational decision model only if we assume that the weak ...

  20. Transition probabilities for diffusion equations by means of path integrals

    OpenAIRE

    Goovaerts, Marc; DE SCHEPPER, Ann; Decamps, Marc

    2002-01-01

    In this paper, we investigate the transition probabilities for diffusion processes. In a first part, we show how transition probabilities for rather general diffusion processes can always be expressed by means of a path integral. For several classical models, an exact calculation is possible, leading to analytical expressions for the transition probabilities and for the maximum probability paths. A second part consists of the derivation of an analytical approximation for the transition probab...

  1. Transition probabilities for diffusion equations by means of path integrals.

    OpenAIRE

    Goovaerts, Marc; De Schepper, A; Decamps, M.

    2002-01-01

    In this paper, we investigate the transition probabilities for diffusion processes. In a first part, we show how transition probabilities for rather general diffusion processes can always be expressed by means of a path integral. For several classical models, an exact calculation is possible, leading to analytical expressions for the transition probabilities and for the maximum probability paths. A second part consists of the derivation of an analytical approximation for the transition probab...

  2. Calculation of decision making probability using probit and logit models

    OpenAIRE

    Barbara Futryn; Marek Fura

    2005-01-01

    The aim of this article is to present logit and probit models and their wide application in many different sciences. Logit and probit regression are used for analyzing the relationship between one or more independent variables and a categorical dependent variable. Logit (probit) models have a number of advantages over linear multiple regression. These methods imply that the dependent variable is actually the result of a transformation of an underlying variable, which is not restricted...
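
    The two link functions differ only in the CDF used to map the linear predictor to a probability. A minimal sketch (the predictor value is arbitrary):

```python
import math

def logit_probability(xb):
    # Logistic CDF: P(y = 1) = 1 / (1 + exp(-x'beta))
    return 1.0 / (1.0 + math.exp(-xb))

def probit_probability(xb):
    # Standard normal CDF via the error function: P(y = 1) = Phi(x'beta)
    return 0.5 * (1.0 + math.erf(xb / math.sqrt(2.0)))

xb = 0.8                          # illustrative value of the linear predictor
print(logit_probability(xb))      # ~0.69
print(probit_probability(xb))     # ~0.79
```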

  3. RANDOM VARIABLE WITH FUZZY PROBABILITY

    Institute of Scientific and Technical Information of China (English)

    吕恩琳; 钟佑明

    2003-01-01

    A mathematical description of the second kind of fuzzy random variable, namely the random variable with crisp events and fuzzy probability, was studied. Based on interval probability and using the fuzzy resolution theorem, a feasibility condition for a fuzzy probability number set was given; going a step further, the definition and properties of the random variable with fuzzy probability (RVFP), together with its fuzzy distribution function and fuzzy probability distribution sequence, were put forward. The fuzzy probability resolution theorem with the closing operation of fuzzy probability was given and proved. The definition and properties of the mathematical expectation and variance of the RVFP were also studied. All the mathematical descriptions of the RVFP are closed under fuzzy probability operations; as a result, the foundation for perfecting the fuzzy probability operation method is laid.

  4. The Logic of Parametric Probability

    CERN Document Server

    Norman, Joseph W

    2012-01-01

    The computational method of parametric probability analysis is introduced. It is demonstrated how to embed logical formulas from the propositional calculus into parametric probability networks, thereby enabling sound reasoning about the probabilities of logical propositions. An alternative direct probability encoding scheme is presented, which allows statements of implication and quantification to be modeled directly as constraints on conditional probabilities. Several example problems are solved, from Johnson-Laird's aces to Smullyan's zombies. Many apparently challenging problems in logic turn out to be simple problems in algebra and computer science; often just systems of polynomial equations or linear optimization problems. This work extends the mathematical logic and parametric probability methods invented by George Boole.

  5. Approaches to Evaluating Probability of Collision Uncertainty

    Science.gov (United States)

    Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done so mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented; and an initial proposal for operationally-useful display and interpretation of these data for a particular conjunction is given.
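
    One way to read the resampling idea is as an outer loop over uncertain inputs around an inner Pc calculation. The sketch below is an assumed setup, not the authors' code: Pc is estimated by Monte Carlo integration of a bivariate Gaussian miss distribution over the combined hard-body circle, and the covariance scale and hard-body radius are resampled to produce a spread of Pc values:

```python
import numpy as np

rng = np.random.default_rng(0)

def pc_mc(miss, cov, radius, n=100_000):
    # Inner 2-D Pc estimate: fraction of Gaussian samples inside the
    # combined hard-body circle centred at the origin.
    pts = rng.multivariate_normal(miss, cov, size=n)
    return float(np.mean(np.hypot(pts[:, 0], pts[:, 1]) < radius))

miss = np.array([150.0, 80.0])           # nominal miss vector [m] (made up)
cov = np.array([[90.0**2, 0.0],
                [0.0, 60.0**2]])         # nominal covariance [m^2] (made up)

pcs = []
for _ in range(50):                      # outer resampling loop
    k = rng.lognormal(0.0, 0.3)          # covariance scale-factor uncertainty
    radius = rng.uniform(15.0, 25.0)     # hard-body radius uncertainty [m]
    pcs.append(pc_mc(miss, k * cov, radius))

print(np.percentile(pcs, [5, 50, 95]))   # a density of Pc, not a point value
```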

  6. Computing Earthquake Probabilities on Global Scales

    Science.gov (United States)

    Holliday, James R.; Graves, William R.; Rundle, John B.; Turcotte, Donald L.

    2016-03-01

    Large devastating events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities for large events in systems such as these. This method counts the number of small events since the last large event and then converts this count into a probability by using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near mean field systems having long-range interactions, an example of which is earthquakes and elastic interactions. We then construct an application of the method and show examples of computed earthquake probabilities.
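
    The counting-plus-Weibull step admits a compact statement: if the number of small events at which the next large event arrives is Weibull distributed in "natural time", the conditional probability of a large event within the next dn small events follows from the survival function. The shape and scale parameters below are placeholders, not fitted values from the study:

```python
import math

def survival(n, alpha, beta):
    # Weibull survival function in natural time (event counts).
    return math.exp(-((n / beta) ** alpha))

def conditional_probability(n, dn, alpha=1.4, beta=500.0):
    # P(next large event within dn more small events | n observed so far)
    return 1.0 - survival(n + dn, alpha, beta) / survival(n, alpha, beta)

print(conditional_probability(n=400, dn=100))   # ~0.24 with these toy parameters
```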

  7. Electric quadrupole transition probabilities for atomic lithium

    International Nuclear Information System (INIS)

    Electric quadrupole transition probabilities for atomic lithium have been calculated using the weakest bound electron potential model theory (WBEPMT). We have employed numerical non-relativistic Hartree–Fock wavefunctions for expectation values of radii and the necessary energy values have been taken from the compilation at NIST. The results obtained with the present method agree very well with the Coulomb approximation results given by Caves (1975). Moreover, electric quadrupole transition probability values not existing in the literature for some highly excited levels have been obtained using the WBEPMT

  8. Physics with exotic probability theory

    OpenAIRE

    Youssef, Saul

    2001-01-01

    Probability theory can be modified in essentially one way while maintaining consistency with the basic Bayesian framework. This modification results in copies of standard probability theory for real, complex or quaternion probabilities. These copies, in turn, allow one to derive quantum theory while restoring standard probability theory in the classical limit. The argument leading to these three copies constrain physical theories in the same sense that Cox's original arguments constrain alter...

  9. Quantum Foundations : Is Probability Ontological ?

    OpenAIRE

    Rosinger, Elemer E

    2007-01-01

    It is argued that the Copenhagen Interpretation of Quantum Mechanics, founded ontologically on the concept of probability, may be questionable in view of the fact that, within Probability Theory itself, the ontological status of the concept of probability has always been, and still is, under discussion.

  10. Trajectory versus probability density entropy

    Science.gov (United States)

    Bologna, Mauro; Grigolini, Paolo; Karagiorgis, Markos; Rosa, Angelo

    2001-07-01

    We show that the widely accepted conviction that a connection can be established between the probability density entropy and the Kolmogorov-Sinai (KS) entropy is questionable. We adopt the definition of density entropy as a functional of a distribution density whose time evolution is determined by a transport equation, conceived as the only prescription to use for the calculation. Although the transport equation is built up for the purpose of affording a picture equivalent to that stemming from trajectory dynamics, no direct use of trajectory time evolution is allowed, once the transport equation is defined. With this definition in mind we prove that the detection of a time regime of increase of the density entropy with a rate identical to the KS entropy is possible only in a limited number of cases. The proposals made by some authors to establish a connection between the two entropies in general, violate our definition of density entropy and imply the concept of trajectory, which is foreign to that of density entropy.

  11. Probability workshop to be better in probability topic

    Science.gov (United States)

    Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed

    2015-02-01

    The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students at the higher education level have an effect on their performance. 62 fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance in the probability topic is not related to anxiety level; that is, a higher level of statistics anxiety does not cause a lower score in probability performance. The study also revealed that motivated students gained from the probability workshop, their performance in the probability topic showing a positive improvement compared with before the workshop. In addition, there exists a significant difference in students' performance between genders, with better achievement among female students compared to male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.

  12. Calculation Method for Injury Probability of Living Beings Caused by Electric Shock Due to Touch and Step Voltages

    Institute of Scientific and Technical Information of China (English)

    冯鹤

    2016-01-01

    In order to improve the pertinence and accuracy of life-loss risk calculations, and to offer constructive guidance on the safe installation of artificial earthing devices, this paper analyses the calculation of the touch and step voltages produced when an artificial earthing device discharges lightning current, and establishes a quantitative method for calculating the resulting probability of human injury. In the calculation, the computed touch and step voltages are compared with the human tolerance thresholds for touch and step voltage to determine the minimum lightning-current amplitude capable of reaching those thresholds. By then analysing the lightning-activity characteristics at the project site, the frequency of lightning currents whose amplitude exceeds this disaster threshold, i.e. the frequency of currents capable of producing the risk, is determined, and this frequency is taken as the injury probability caused by touch and step voltages. The calculation of the touch and step voltages of the artificial earthing device takes a vertical earthing electrode as an example, and the influence of impulse current, as opposed to power-frequency current, on the calculation method is also considered.
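
    The last step, converting the minimum damaging amplitude into a probability, can be sketched with a standard approximation of the first-stroke peak-current distribution (the Anderson-Eriksson form with a 31 kA median); this particular distribution is an assumption here, not necessarily the one used in the paper:

```python
def prob_current_exceeds(i_kA, median=31.0, k=2.6):
    # Fraction of first strokes whose peak current exceeds i_kA.
    return 1.0 / (1.0 + (i_kA / median) ** k)

i_min = 55.0                          # illustrative minimum damaging amplitude [kA]
print(prob_current_exceeds(i_min))    # ~0.18: taken as the injury probability
```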

  13. Investigation of probable decays in rhenium isotopes

    International Nuclear Information System (INIS)

    Making use of the effective liquid drop model (ELDM), the feasibility of proton, alpha and various cluster decays is analysed theoretically. For different neutron-rich and neutron-deficient isotopes of Rhenium in the mass range 150 < A < 200, the half-lives of proton, alpha and probable cluster decays are calculated, taking the barrier potential as the effective liquid drop one, which is the sum of the Coulomb, surface and centrifugal potentials. The calculated half-lives for proton decay from various Rhenium isotopes are then compared with the universal decay law (UDL) model to assess the efficiency of the present formalism. Geiger-Nuttall plots of the probable decays are analysed and their respective slopes and intercepts are evaluated
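
    A Geiger-Nuttall plot rests on an approximately linear relation between the logarithm of the half-life and Z_d / sqrt(Q). The sketch below shows the functional form only; the slope and intercept are illustrative placeholders, not the values evaluated in the paper:

```python
import math

def log10_half_life(Z_daughter, Q_MeV, a=1.8, b=-50.0):
    # Geiger-Nuttall form: log10(T_1/2) = a * Z_d / sqrt(Q) + b
    return a * Z_daughter / math.sqrt(Q_MeV) + b

# Example: alpha decay with Q = 5.5 MeV leaving a Z = 73 daughter
print(log10_half_life(73, 5.5))   # order-of-magnitude estimate of log10(T1/2 / s)
```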

  14. Fusion probability in heavy nuclei

    Science.gov (United States)

    Banerjee, Tathagata; Nath, S.; Pal, Santanu

    2015-03-01

    Background: Fusion between two massive nuclei is a very complex process and is characterized by three stages: (a) capture inside the potential barrier, (b) formation of an equilibrated compound nucleus (CN), and (c) statistical decay of the CN leading to a cold evaporation residue (ER) or fission. The second stage is the least understood of the three and is the most crucial in predicting yield of superheavy elements (SHE) formed in complete fusion reactions. Purpose: A systematic study of average fusion probability, ⟨PCN⟩, is undertaken to obtain a better understanding of its dependence on various reaction parameters. The study may also help to clearly demarcate onset of non-CN fission (NCNF), which causes fusion probability, PCN, to deviate from unity. Method: ER excitation functions for 52 reactions leading to CN in the mass region 170-220, which are available in the literature, have been compared with statistical model (SM) calculations. Capture cross sections have been obtained from a coupled-channels code. In the SM, shell corrections in both the level density and the fission barrier have been included. ⟨PCN⟩ for these reactions has been extracted by comparing experimental and theoretical ER excitation functions in the energy range ˜5%-35% above the potential barrier, where known effects of nuclear structure are insignificant. Results: ⟨PCN⟩ has been shown to vary with entrance channel mass asymmetry, η (or charge product, ZpZt), as well as with fissility of the CN, χCN. No parameter has been found to be adequate as a single scaling variable to determine ⟨PCN⟩. Approximate boundaries have been obtained from where ⟨PCN⟩ starts deviating from unity. Conclusions: This study quite clearly reveals the limits of applicability of the SM in interpreting experimental observables from fusion reactions involving two massive nuclei. Deviation of ⟨PCN⟩ from unity marks the beginning of the domain of dynamical models of fusion. Availability of precise ER cross sections over a wider energy range for

  15. Approximation of Failure Probability Using Conditional Sampling

    Science.gov (United States)

    Giesy. Daniel P.; Crespo, Luis G.; Kenney, Sean P.

    2008-01-01

    In analyzing systems which depend on uncertain parameters, one technique is to partition the uncertain parameter domain into a failure set and its complement, and judge the quality of the system by estimating the probability of failure. If this is done by a sampling technique such as Monte Carlo and the probability of failure is small, accurate approximation can require so many sample points that the computational expense is prohibitive. Previous work of the authors has shown how to bound the failure event by sets of such simple geometry that their probabilities can be calculated analytically. In this paper, it is shown how to make use of these failure bounding sets and conditional sampling within them to substantially reduce the computational burden of approximating failure probability. It is also shown how the use of these sampling techniques improves the confidence intervals for the failure probability estimate for a given number of sample points and how they reduce the number of sample point analyses needed to achieve a given level of confidence.
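
    A minimal sketch of the idea, under assumed names and a made-up failure criterion: if the failure set F lies inside a bounding set B whose probability is known analytically, then P(F) = P(B) P(F|B), and only the conditional factor needs sampling:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

def fails(x0, x1):
    # Hypothetical failure criterion; note F is a subset of B = {x0 > 2.2}.
    return (x0 > 2.2) & (x0 + x1 > 5.0)

p_B = norm.sf(2.2)                    # analytic probability of the bounding set
n = 100_000

# Sample x0 conditionally on x0 > 2.2 by inverse-CDF truncation.
u = rng.uniform(size=n)
x0 = norm.ppf(norm.cdf(2.2) + u * p_B)
x1 = rng.standard_normal(n)

p_F = p_B * np.mean(fails(x0, x1))    # P(F) = P(B) * P(F | B)
print(p_F)
```

    Crude Monte Carlo would need orders of magnitude more samples to see this rare event at all; sampling only inside B spends every sample where failure can actually occur.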

  16. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  17. Probabilities of multiple quantum teleportation

    OpenAIRE

    Woesler, Richard

    2002-01-01

    Using quantum teleportation a quantum state can be teleported with a certain probability. Here the probabilities for multiple teleportation are derived, i.e. for the case that a teleported quantum state is teleported again, or even more than two times, for the two-dimensional case, e.g., for the two orthogonal directions of the polarization of photons. It is shown that the probability for an exact teleportation, except for an irrelevant phase factor, is 25 %, i.e., surprisingly, this resul...

  18. Probability and statistics: selected problems

    OpenAIRE

    Machado, J.A. Tenreiro; Pinto, Carla M. A.

    2014-01-01

    Probability and Statistics—Selected Problems is a unique book for senior undergraduate and graduate students to quickly review basic material in probability and statistics. Descriptive statistics are presented first, and probability is reviewed second. Discrete and continuous distributions are presented. Sampling and estimation with hypothesis testing are presented in the last two chapters. The solutions to the proposed exercises are listed for the reader's reference.

  19. Free Probability on a Direct Product of Noncommutative Probability Spaces

    OpenAIRE

    Cho, Ilwoo

    2005-01-01

    In this paper, we observe the amalgamated free probability of a direct product of noncommutative probability spaces. We define the amalgamated R-transforms, amalgamated moment series and the amalgamated boxed convolution. These enable us to do the amalgamated R-transform calculus, as in the scalar-valued case.

  20. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte;

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory to...

  1. Probability theory and its models

    OpenAIRE

    Humphreys, Paul

    2008-01-01

    This paper argues for the status of formal probability theory as a mathematical, rather than a scientific, theory. David Freedman and Philip Stark's concept of model based probabilities is examined and is used as a bridge between the formal theory and applications.

  2. Decision analysis with approximate probabilities

    Science.gov (United States)

    Whalen, Thomas

    1992-01-01

    This paper concerns decisions under uncertainty in which the probabilities of the states of nature are only approximately known. Decision problems involving three states of nature are studied. This is due to the fact that some key issues do not arise in two-state problems, while probability spaces with more than three states of nature are essentially impossible to graph. The primary focus is on two levels of probabilistic information. In one level, the three probabilities are separately rounded to the nearest tenth. This can lead to sets of rounded probabilities which add up to 0.9, 1.0, or 1.1. In the other level, probabilities are rounded to the nearest tenth in such a way that the rounded probabilities are forced to sum to 1.0. For comparison, six additional levels of probabilistic information, previously analyzed, were also included in the present analysis. A simulation experiment compared four criteria for decision making using linearly constrained probabilities (Maximin, Midpoint, Standard Laplace, and Extended Laplace) under the eight different levels of information about probability. The Extended Laplace criterion, which uses a second order maximum entropy principle, performed best overall.
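
    As an illustration only (the payoff matrix and the renormalization step are invented here, not taken from the paper), two of the four criteria named above, Maximin and a midpoint-style rule applied to rounded probabilities summing to 0.9, might look like this:

        import numpy as np

        # Hypothetical payoffs: rows are actions, columns are 3 states of nature.
        payoff = np.array([[10.0, 2.0, 4.0],
                           [6.0, 6.0, 5.0],
                           [3.0, 9.0, 7.0]])

        # Probabilities known only to the nearest tenth; here they sum to 0.9.
        p_rounded = np.array([0.3, 0.5, 0.1])

        # Maximin ignores the probabilities: best worst-case payoff.
        maximin_action = payoff.min(axis=1).argmax()

        # A midpoint-style rule: renormalize the rounded probabilities and
        # maximize expected payoff (one simple way to handle sums != 1).
        p_mid = p_rounded / p_rounded.sum()
        midpoint_action = (payoff @ p_mid).argmax()

        print("maximin:", maximin_action, "midpoint:", midpoint_action)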

  3. A graduate course in probability

    CERN Document Server

    Tucker, Howard G

    2014-01-01

    Suitable for a graduate course in analytic probability, this text requires only a limited background in real analysis. Topics include probability spaces and distributions, stochastic independence, basic limiting operations, strong limit theorems for independent random variables, the central limit theorem, conditional expectation and martingale theory, and an introduction to stochastic processes.

  4. The albedo effect on neutron transmission probability.

    Science.gov (United States)

    Khanouchi, A; Sabir, A; Boulkheir, M; Ichaoui, R; Ghassoun, J; Jehouani, A

    1997-01-01

    The aim of this study is to evaluate the albedo effect on the neutron transmission probability through slab shields. For this reason we have considered an infinite homogeneous slab having a fixed thickness equal to 20 lambda (lambda is the mean free path of the neutron in the slab). This slab is characterized by the factor Ps (scattering probability) and contains a vacuum channel which is formed by two horizontal parts and an inclined one (David, M. C. (1962) Ducts and Voids in Shields. In Reactor Handbook, Vol. III, Part B, p. 166). The thickness of the vacuum channel is taken equal to 2 lambda. An infinite plane source of neutrons is placed on the first face of the slab (left face) and detectors, having windows equal to 2 lambda, are placed on the second face of the slab (right face). Neutron histories are sampled by the Monte Carlo method (Booth, T. E. and Hendricks, J. S. (1994) Nuclear Technology 5) using exponential biasing in order to increase the Monte Carlo calculation efficiency (Levitt, L. B. (1968) Nuclear Science and Engineering 31, 500-504; Jehouani, A., Ghassoun, J. and Abouker, A. (1994) In Proceedings of the 6th International Symposium on Radiation Physics, Rabat, Morocco), and we have applied the statistical weight method, which supposes that the neutron is born at the source with a unit statistical weight and after each collision this weight is corrected. For different values of the scattering probability and for different slopes of the inclined part of the channel we have calculated the neutron transmission probability for different positions of the detectors versus the albedo at the vacuum channel-medium interface. Some analytical representations are also presented for these transmission probabilities. PMID:9463883
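
    A heavily simplified sketch of this kind of sampling, assuming a plain homogeneous slab with no vacuum channel, no albedo boundary and no exponential biasing; it keeps only the statistical-weight idea (survival biasing), with all distances in mean free paths:

        import numpy as np

        rng = np.random.default_rng(1)

        def transmission(thickness=20.0, ps=0.8, n_hist=20_000):
            # Neutrons start normal to the left face.  Absorption is handled
            # by weight reduction (survival biasing), not by killing histories.
            transmitted = 0.0
            for _ in range(n_hist):
                x, mu, w = 0.0, 1.0, 1.0
                while w > 1e-6:
                    x += mu * rng.exponential(1.0)   # flight to next collision
                    if x >= thickness:               # escape through right face
                        transmitted += w
                        break
                    if x <= 0.0:                     # escape back out left face
                        break
                    w *= ps                          # survive collision with prob. ps
                    mu = rng.uniform(-1.0, 1.0)      # isotropic scattering cosine
            return transmitted / n_hist

        print(transmission())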

  5. Subjective probability models for lifetimes

    CERN Document Server

    Spizzichino, Fabio

    2001-01-01

    Bayesian methods in reliability cannot be fully utilized and understood without full comprehension of the essential differences that exist between frequentist probability and subjective probability. Switching from the frequentist to the subjective approach requires that some fundamental concepts be rethought and suitably redefined. Subjective Probability Models for Lifetimes details those differences and clarifies aspects of subjective probability that have a direct influence on modeling and drawing inference from failure and survival data. In particular, within a framework of Bayesian theory, the author considers the effects of different levels of information in the analysis of the phenomena of positive and negative aging.The author coherently reviews and compares the various definitions and results concerning stochastic ordering, statistical dependence, reliability, and decision theory. He offers a detailed but accessible mathematical treatment of different aspects of probability distributions for exchangea...

  6. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  7. Survival probability and ruin probability of a risk model

    Institute of Scientific and Technical Information of China (English)

    LUO Jian-hua

    2008-01-01

    In this paper, a new risk model is studied in which the rate of premium income is regarded as a random variable, the arrival of insurance policies is a Poisson process and the process of claim occurrence is a p-thinning process. Integral representations of the survival probability are obtained. The explicit formula of the survival probability on the infinite interval is obtained in the special case of an exponential distribution. The Lundberg inequality and the general formula of the ruin probability are obtained using techniques from martingale theory.
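
    The record's model (random premium rate, p-thinned claims) is more general than anything reproduced here; the sketch below simulates only the classical Cramér-Lundberg special case, with constant premium rate c and exponential claims, for which the ruin probability has a closed form to check against:

        import numpy as np

        rng = np.random.default_rng(2)

        def ruin_prob(u=10.0, lam=1.0, mu=1.0, c=1.5,
                      horizon=300.0, n_paths=3_000):
            # Surplus U(t) = u + c*t - S(t), with claims of mean mu arriving
            # as a Poisson(lam) process; a finite horizon approximates
            # infinite-time ruin because ruin, if it happens, happens early.
            ruined = 0
            for _ in range(n_paths):
                t, surplus = 0.0, u
                while t < horizon:
                    dt = rng.exponential(1.0 / lam)  # time to next claim
                    t += dt
                    surplus += c * dt                # premiums earned meanwhile
                    surplus -= rng.exponential(mu)   # claim amount
                    if surplus < 0.0:
                        ruined += 1
                        break
            return ruined / n_paths

        # Exponential claims admit the closed form psi(u) = exp(-R*u)/(1+theta),
        # with safety loading theta = c/(lam*mu) - 1 and adjustment
        # coefficient R = theta/((1+theta)*mu).
        theta = 1.5 / (1.0 * 1.0) - 1.0
        R = theta / ((1.0 + theta) * 1.0)
        print(ruin_prob(), np.exp(-R * 10.0) / (1.0 + theta))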

  8. Assault frequency and preformation probability of the alpha emission process

    OpenAIRE

    Zhang, H.F.; Royer, G.; Li, J.Q.

    2011-01-01

    A study of the assault frequency and preformation factor of the α-decay description is performed from the experimental α-decay constant and the penetration probabilities calculated from the generalized liquid-drop model (GLDM) potential barriers. To determine the assault frequency a quantum-mechanical method using a harmonic oscillator is introduced and leads to values of around 10^21 s^-1, similar to the ones calculated within the classical method. The preformation probability is around 10^-1-1...

  9. Probability

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    People much given to gambling usually manage to work out rough-and-ready ways of measuring the likelihood of certain situations so as to know which way to bet their money, and how much. If they did not do this, they would quickly lose all their money to those who did.

  10. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  11. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  12. Quantum processes: probability fluxes, transition probabilities in unit time and vacuum vibrations

    International Nuclear Information System (INIS)

    Transition probabilities in unit time and probability fluxes are compared in studying the elementary quantum processes: the decay of a bound state under the action of time-varying and constant electric fields. It is shown that the difference between these quantities may be considerable, and so the use of transition probabilities W instead of probability fluxes Π in calculating the particle fluxes may lead to serious errors. The quantity W represents the rate of change with time of the population of the energy levels relating partly to the real states and partly to the virtual ones, and it cannot be directly measured in experiment. The vacuum background is shown to be continuously distorted when a perturbation acts on a system. Because of this the viewpoint of an observer on the physical properties of real particles continuously varies with time. This fact is not taken into consideration in the conventional theory of quantum transitions based on using the notion of probability amplitude. As a result, the probability amplitudes lose their physical meaning. All the physical information on quantum dynamics of a system is contained in the mean values of physical quantities. The existence of considerable differences between the quantities W and Π permits one in principle to make a choice of the correct theory of quantum transitions on the basis of experimental data. (author)

  13. The collision probability modules of WIMS-E

    International Nuclear Information System (INIS)

    This report describes how flat source first flight collision probabilities are calculated and used in the WIMS-E modular program. It includes a description of the input to the modules W-FLU, W-THES, W-PIP, W-PERS and W-MERGE. Input to other collision probability modules are described in separate reports. WIMS-E is capable of calculating collision probabilities in a wide variety of geometries, some of them quite complicated. It can also use them for a variety of purposes. (author)
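
    WIMS-E's geometry handling is far beyond a sketch, but the underlying notion of a flat-source first-flight collision probability can be illustrated for the simplest case: a homogeneous slab with a spatially flat, isotropic source, using the textbook escape-probability formula based on the exponential integral E3 (assuming scipy is available):

        from scipy.special import expn  # exponential integral E_n

        def slab_collision_probability(tau):
            # Complement of the classical escape probability for a homogeneous
            # slab of optical thickness tau with a flat, isotropic source:
            # P_esc = (1 - 2*E3(tau)) / (2*tau).
            p_escape = (1.0 - 2.0 * expn(3, tau)) / (2.0 * tau)
            return 1.0 - p_escape

        for tau in (0.1, 1.0, 10.0):
            print(tau, slab_collision_probability(tau))

    The limits behave as expected: the collision probability tends to 0 for an optically thin slab and to 1 for an optically thick one.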

  14. Probabilities the little numbers that rule our lives

    CERN Document Server

    Olofsson, Peter

    2014-01-01

    Praise for the First Edition"If there is anything you want to know, or remind yourself, about probabilities, then look no further than this comprehensive, yet wittily written and enjoyable, compendium of how to apply probability calculations in real-world situations."- Keith Devlin, Stanford University, National Public Radio's "Math Guy" and author of The Math Gene and The Unfinished GameFrom probable improbabilities to regular irregularities, Probabilities: The Little Numbers That Rule Our Lives, Second Edition investigates the often surprising effects of risk and chance in our lives. Featur

  15. Transition probabilities of Br II

    Science.gov (United States)

    Bengtson, R. D.; Miller, M. H.

    1976-01-01

    Absolute transition probabilities of the three most prominent visible Br II lines are measured in emission. Results compare well with Coulomb approximations and with line strengths extrapolated from trends in homologous atoms.

  16. Induction, of and by Probability

    OpenAIRE

    Rendell, Larry

    2013-01-01

    This paper examines some methods and ideas underlying the author's successful probabilistic learning systems (PLS), which have proven uniquely effective and efficient in generalization learning or induction. While the emerging principles are generally applicable, this paper illustrates them in heuristic search, which demands noise management and incremental learning. In our approach, both task performance and learning are guided by probability. Probabilities are incrementally normalized and re...

  17. Trajectory probability hypothesis density filter

    OpenAIRE

    García-Fernández, Ángel F.; Svensson, Lennart

    2016-01-01

    This paper presents the probability hypothesis density (PHD) filter for sets of trajectories. The resulting filter, which is referred to as the trajectory probability hypothesis density (TPHD) filter, is capable of estimating trajectories in a principled way without requiring the evaluation of all measurement-to-target association hypotheses. As the PHD filter, the TPHD filter is based on recursively obtaining the best Poisson approximation to the multitrajectory filtering density in the sense of minimising the K...

  18. Hf Transition Probabilities and Abundances

    OpenAIRE

    Lawler, J. E.; Hartog, E.A. den; Labby, Z. E.; Sneden, C.; Cowan, J. J.; Ivans, I. I.

    2006-01-01

    Radiative lifetimes from laser-induced fluorescence measurements, accurate to about +/- 5 percent, are reported for 41 odd-parity levels of Hf II. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 150 lines of Hf II. Approximately half of these new transition probabilities overlap with recent independent measurements using a similar approach. The two sets of measurements are found to be in good agreement...

  19. Gd Transition Probabilities and Abundances

    OpenAIRE

    Hartog, E.A. den; Lawler, J. E.; Sneden, C.; Cowan, J. J.

    2006-01-01

    Radiative lifetimes, accurate to +/- 5%, have been measured for 49 even-parity and 14 odd-parity levels of Gd II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 611 lines of Gd II. This work is the largest-scale laboratory study to date of Gd II transition probabilities and the first using a high performance Fourier transform spectrometer. This improved data set has be...

  20. Sm Transition Probabilities and Abundances

    OpenAIRE

    Lawler, J. E.; Hartog, E.A. den; Sneden, C.; Cowan, J. J.

    2005-01-01

    Radiative lifetimes, accurate to +/- 5%, have been measured for 212 odd-parity levels of Sm II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier-transform spectrometry to determine transition probabilities for more than 900 lines of Sm II. This work is the largest-scale laboratory study to date of Sm II transition probabilities using modern methods. This improved data set has been used to determine a new solar photospheric Sm abundanc...

  1. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...

  2. Compliance with endogenous audit probabilities

    OpenAIRE

    Konrad, Kai A.; Lohse, Tim; Qari, Salmai

    2015-01-01

    This paper studies the effect of endogenous audit probabilities on reporting behavior in a face-to-face compliance situation such as at customs. In an experimental setting in which underreporting has a higher expected payoff than truthful reporting we find an increase in compliance of about 80% if subjects have reason to believe that their behavior towards an officer influences their endogenous audit probability. Higher compliance is driven by considerations about how own appearance and perfo...

  3. Novel Bounds on Marginal Probabilities

    OpenAIRE

    Mooij, Joris M.; Kappen, Hilbert J

    2008-01-01

    We derive two related novel bounds on single-variable marginal probability distributions in factor graphs with discrete variables. The first method propagates bounds over a subtree of the factor graph rooted in the variable, and the second method propagates bounds over the self-avoiding walk tree starting at the variable. By construction, both methods not only bound the exact marginal probability distribution of a variable, but also its approximate Belief Propagation marginal ("belief"). Th...

  4. Measurement of the resonance escape probability

    International Nuclear Information System (INIS)

    The average cadmium ratio in natural uranium rods has been measured, using equal diameter natural uranium disks. These values, correlated with independent measurements of the lattice buckling, enabled us to calculate values of the resonance escape probability for the G1 reactor with one or the other of two definitions. Measurements were performed on 26 mm and 32 mm rods, giving the following values for the resonance escape probability p: 0.8976 ± 0.005 and 0.912 ± 0.006 (d. 26 mm), 0.8627 ± 0.009 and 0.884 ± 0.01 (d. 32 mm). The influence of either definition on the lattice parameters is discussed, leading to values of the effective integral. Similar experiments have been performed with thorium rods. (author)

  5. Collision probabilities in spatially stochastic media II

    International Nuclear Information System (INIS)

    An improved model for calculating collision probabilities in spatially stochastic media is described based upon a method developed by Cassell and Williams [Cassell, J.S., Williams, M.M.R., in press. An approximate method for solving radiation and neutron transport problems in spatially stochastic media. Annals of Nuclear Energy] and is applicable to three-dimensional problems. We shall show how to evaluate the collision probability in an arbitrarily shaped non-re-entrant lump, consisting of a random dispersal of two phases, for any form of autocorrelation function. Specific examples, with numerical values, are given for a sphere and a slab. In the case of the slab we allow the material to have different stochastic properties in the x, y and z directions

  6. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  7. Joint probability distributions for projection probabilities of random orthonormal states

    International Nuclear Information System (INIS)

    The quantum chaos conjecture applied to a finite dimensional quantum system implies that such a system has eigenstates that show similar statistical properties as the column vectors of random orthogonal or unitary matrices. Here, we consider the different probabilities for obtaining a specific outcome in a projective measurement, provided the system is in one of its eigenstates. We then give analytic expressions for the joint probability density for these probabilities, with respect to the ensemble of random matrices. In the case of the unitary group, our results can be applied, also, to the phenomenon of universal conductance fluctuations, where the same mathematical quantities describe partial conductances in a two-terminal mesoscopic scattering problem with a finite number of modes in each terminal. (paper)

  8. Joint probability distributions for projection probabilities of random orthonormal states

    Science.gov (United States)

    Alonso, L.; Gorin, T.

    2016-04-01

    The quantum chaos conjecture applied to a finite dimensional quantum system implies that such a system has eigenstates that show similar statistical properties as the column vectors of random orthogonal or unitary matrices. Here, we consider the different probabilities for obtaining a specific outcome in a projective measurement, provided the system is in one of its eigenstates. We then give analytic expressions for the joint probability density for these probabilities, with respect to the ensemble of random matrices. In the case of the unitary group, our results can be applied, also, to the phenomenon of universal conductance fluctuations, where the same mathematical quantities describe partial conductances in a two-terminal mesoscopic scattering problem with a finite number of modes in each terminal.
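
    The ensemble described in these two records can be sampled directly; the sketch below (an illustration, not the authors' derivation) draws Haar-random unitaries via the standard QR-based method, forms the projection probabilities from one column, and checks the mean of a single component against the analytic value 1/n:

        import numpy as np

        rng = np.random.default_rng(3)

        def haar_unitary(n):
            # Haar-distributed unitary: QR of a complex Ginibre matrix with
            # the phases of R's diagonal absorbed into Q (Mezzadri's recipe).
            z = (rng.standard_normal((n, n))
                 + 1j * rng.standard_normal((n, n))) / np.sqrt(2.0)
            q, r = np.linalg.qr(z)
            d = np.diag(r)
            return q * (d / np.abs(d))

        n, samples = 8, 5_000
        p_first = np.empty(samples)
        for k in range(samples):
            state = haar_unitary(n)[:, 0]   # one random orthonormal "eigenstate"
            probs = np.abs(state) ** 2      # projective-measurement probabilities
            p_first[k] = probs[0]

        # Under the unitary ensemble a single projection probability p has
        # density (n-1)*(1-p)**(n-2), hence mean 1/n.
        print(p_first.mean(), 1.0 / n)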

  9. Structure-factor probabilities for related structures

    International Nuclear Information System (INIS)

    Probability relationships between structure factors from related structures have allowed previously only for either differences in atomic scattering factors (isomorphous replacement case) or differences in atomic positions (coordinate error case). In the coordinate error case, only errors drawn from a single probability distribution have been considered, in spite of the fact that errors vary widely through models of macromolecular structures. It is shown that the probability relationships can be extended to cover more general cases. Either the atomic parameters or the reciprocal-space vectors may be chosen as the random variables to derive probability relationships. However, the relationships turn out to be very similar for either choice. The most intuitive is the expected electron-density formalism, which arises from considering the atomic parameters as random variables. In this case, the centroid of the structure-factor distribution is the Fourier transform of the expected electron-density function, which is obtained by smearing each atom over its possible positions. The centroid estimate has a phase different from, and more accurate than, that obtained from the unweighted atoms. The assumption that there is a sufficient number of independent errors allows the application of the central limit theorem. This gives a one- (centric case) or two-dimensional (non-centric) Gaussian distribution about the centroid estimate. The general probability expression reduces to those derived previously when the appropriate simplifying assumptions are made. The revised theory has implications for calculating more accurate phases and maps, optimizing molecular replacement models, refining structures, estimating coordinate errors and interpreting refined B factors. (orig.)

  10. Transition probabilities between levels of K and K+

    International Nuclear Information System (INIS)

    In this work transition probabilities between levels with n < 11 for K and for the known levels of K+ are calculated. Two computer programs based on the Coulomb approximation and the most suitable coupling schemes have been used. Lifetimes of all these levels are also calculated. (Author)

  11. Energy-shifting formulae yield reliable reaction and capture probabilities

    International Nuclear Information System (INIS)

    Predictions of energy-shifting formulae for partial reaction and capture probabilities are compared with coupled channels calculations. The quality of the agreement notably improves with increasing mass of the system and/or decreasing mass asymmetry in the heavy-ion collision. The formulae are reliable and useful for circumventing impracticable reaction calculations at low energies

  12. Improved Ar(II) transition probabilities

    OpenAIRE

    Danzmann, K.; de Kock, M

    1986-01-01

    Precise Ar(II) branching ratios have been measured on a high current hollow cathode with a 1-m Fourier transform spectrometer. Absolute transition probabilities for 11 Ar(II) lines were calculated from these branching ratios and lifetime measurements published by Mohamed et al. For the prominent 4806 Å line, the present result is Aik = 7.12×10^7 s^-1 ± 2.8%, which is in excellent agreement with recent literature data derived from pure argon diagnostics, two-wavelength interferometry, and Hβ-diagn...

  13. Probability on real Lie algebras

    CERN Document Server

    Franz, Uwe

    2016-01-01

    This monograph is a progressive introduction to non-commutativity in probability theory, summarizing and synthesizing recent results about classical and quantum stochastic processes on Lie algebras. In the early chapters, focus is placed on concrete examples of the links between algebraic relations and the moments of probability distributions. The subsequent chapters are more advanced and deal with Wigner densities for non-commutative couples of random variables, non-commutative stochastic processes with independent increments (quantum Lévy processes), and the quantum Malliavin calculus. This book will appeal to advanced undergraduate and graduate students interested in the relations between algebra, probability, and quantum theory. It also addresses a more advanced audience by covering other topics related to non-commutativity in stochastic calculus, Lévy processes, and the Malliavin calculus.

  14. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  15. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Taking the Gaussian distribution into account, the author estimates the probability of the relative displacement of the isolated mass staying below the vibration criteria. This problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters, damping and natural frequency, are derived such that the probability of exceeding the vibration criteria VC-E and VC-D is kept below 0.04.
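
    A toy version of the exceedance calculation, assuming (as the article does) a Gaussian relative displacement; the scaling of the displacement's standard deviation with damping ratio and natural frequency below is invented purely for illustration:

        from math import erf, sqrt

        def p_within(sigma, criterion):
            # P(|x| < criterion) for a zero-mean Gaussian displacement.
            return erf(criterion / (sigma * sqrt(2.0)))

        # Invented response model: the displacement's standard deviation (in
        # microns) falls as damping and natural frequency rise.
        for zeta in (0.05, 0.1, 0.2):
            for f_n in (1.0, 2.0, 4.0):  # natural frequency, Hz
                sigma = 0.2 / (zeta * f_n)
                p_exceed = 1.0 - p_within(sigma, criterion=3.0)
                print(f"zeta={zeta:.2f} f_n={f_n:.1f} Hz P(exceed)={p_exceed:.3f}")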

  16. Born Rule and Noncontextual Probability

    CERN Document Server

    Logiurato, Fabrizio

    2012-01-01

    The probabilistic rule that links the formalism of Quantum Mechanics (QM) to the real world was stated by Born in 1926. Since then, there have been many attempts to derive the Born postulate as a theorem, Gleason's being the most prominent. The Gleason derivation, however, is generally considered rather intricate, and its physical meaning, in particular in relation to the noncontextuality of probability (NP), is not quite evident. More recently, we are witnessing a revival of interest in possible demonstrations of the Born rule, like Zurek's and Deutsch's, based on decoherence and on the theory of decisions, respectively. Despite an ongoing debate about the presence of hidden assumptions and circular reasoning, these have the merit of prompting more physically oriented approaches to the problem. Here we suggest a new proof of the Born rule based on the noncontextuality of probability. Within the theorem we also demonstrate the continuity of probability with respect to the amplitudes, which has been sug...

  17. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  18. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  19. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  20. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  1. Interference of probabilities in dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Zak, Michail, E-mail: michail.zak@gmail.com [Jet Propulsion Laboratory California Institute of Technology, Pasadena, CA 91109 (United States)

    2014-08-15

    A new class of dynamical systems with a preset type of interference of probabilities is introduced. It is obtained from the extension of the Madelung equation by replacing the quantum potential with a specially selected feedback from the Liouville equation. It has been proved that these systems are different from both Newtonian and quantum systems, but they can be useful for modeling spontaneous collective novelty phenomena when emerging outputs are qualitatively different from the weighted sum of individual inputs. Formation of language and fast decision-making process as potential applications of the probability interference is discussed.

  2. Objective Lightning Probability Forecast Tool Phase II

    Science.gov (United States)

    Lambert, Winnie

    2007-01-01

    This presentation describes the improvement of a set of lightning probability forecast equations that are used by the 45th Weather Squadron forecasters for their daily 1100 UTC (0700 EDT) weather briefing during the warm season months of May-September. This information is used for general scheduling of operations at Cape Canaveral Air Force Station and Kennedy Space Center. Forecasters at the Spaceflight Meteorology Group also make thunderstorm forecasts during Shuttle flight operations. Five modifications were made by the Applied Meteorology Unit: increased the period of record from 15 to 17 years, changed the method of calculating the flow regime of the day, calculated a new optimal layer relative humidity, used a new smoothing technique for the daily climatology, and used a new valid area. The test results indicated that the modified equations showed an increase in skill over the current equations, good reliability, and an ability to distinguish between lightning and non-lightning days.

  3. Application of the diagrams of phase transformations during aging for optimizing the aging conditions for V1469 and 1441 Al-Li alloys

    Science.gov (United States)

    Lukina, E. A.; Alekseev, A. A.; Antipov, V. V.; Zaitsev, D. V.; Klochkova, Yu. Yu.

    2009-12-01

    To describe the changes in the phase composition of alloys during aging, it is convenient to construct TTT diagrams on the temperature-aging time coordinates in which time-temperature regions of the existence of nonequilibrium phases that form during aging are indicated. As a rule, in constructing the diagrams of phase transformations during aging (DPTA), time-temperature maps of properties are plotted. A comparison of the diagrams with maps of properties allows one to analyze the effect of the structure on the properties. In this study, we analyze the DPTAs of V1469 (Al-1.2 Li-0.46 Ag-3.4 Cu-0.66 Mg) and 1441 (Al-1.8 Li-1.1 Mg-1.6 Cu, C Mg/ C Cu ≈ 1) alloys. Examples of the application of DPTA for the development of steplike aging conditions are reported.

  4. Pollock on probability in epistemology

    OpenAIRE

    Fitelson, Branden

    2010-01-01

    In Thinking and Acting John Pollock offers some criticisms of Bayesian epistemology, and he defends an alternative understanding of the role of probability in epistemology. Here, I defend the Bayesian against some of Pollock's criticisms, and I discuss a potential problem for Pollock's alternative account.

  5. ESTIMATION OF AGE TRANSITION PROBABILITIES.

    Science.gov (United States)

    ZINTER, JUDITH R.

    THIS NOTE DESCRIBES THE PROCEDURES USED IN DETERMINING DYNAMOD II AGE TRANSITION MATRICES. A SEPARATE MATRIX FOR EACH SEX-RACE GROUP IS DEVELOPED. THESE MATRICES WILL BE USED AS AN AID IN ESTIMATING THE TRANSITION PROBABILITIES IN THE LARGER DYNAMOD II MATRIX RELATING AGE TO OCCUPATIONAL CATEGORIES. THREE STEPS WERE USED IN THE PROCEDURE--(1)…

  6. Transition probability and preferential gauge

    OpenAIRE

    Chen, C.Y.

    1999-01-01

    This paper is concerned with whether or not the preferential gauge can ensure the uniqueness and correctness of results obtained from the standard time-dependent perturbation theory, in which the transition probability is formulated in terms of matrix elements of the Hamiltonian.

  7. Quantum correlations; quantum probability approach

    OpenAIRE

    Majewski, W A

    2014-01-01

    This survey gives a comprehensive account of quantum correlations understood as a phenomenon stemming from the rules of quantization. Centered on quantum probability it describes the physical concepts related to correlations (both classical and quantum), mathematical structures, and their consequences. These include the canonical form of classical correlation functionals, general definitions of separable (entangled) states, definition and analysis of quantumness of correlations, description o...

  8. Diverse Consequences of Algorithmic Probability

    OpenAIRE

    Özkural, Eray

    2011-01-01

    We reminisce and discuss applications of algorithmic probability to a wide range of problems in artificial intelligence, philosophy and technological society. We propose that Solomonoff has effectively axiomatized the field of artificial intelligence, therefore establishing it as a rigorous scientific discipline. We also relate to our own work in incremental machine learning and philosophy of complexity.

  9. Asbestos and Probable Microscopic Polyangiitis

    OpenAIRE

    George S Rashed Philteos; Kelly Coverett; Rajni Chibbar; Ward, Heather A; Cockcroft, Donald W

    2004-01-01

    Several inorganic dust lung diseases (pneumoconioses) are associated with autoimmune diseases. Although autoimmune serological abnormalities are common in asbestosis, clinical autoimmune/collagen vascular diseases are not commonly reported. A case of pulmonary asbestosis complicated by perinuclear-antineutrophil cytoplasmic antibody (myeloperoxidase) positive probable microscopic polyangiitis (glomerulonephritis, pericarditis, alveolitis, multineuritis multiplex) is described and the possible...

  10. Exact Probability Distribution versus Entropy

    Directory of Open Access Journals (Sweden)

    Kerstin Andersson

    2014-10-01

    The problem addressed concerns the determination of the average number of successive attempts needed to guess a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
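
    For small alphabets the average number of guesses can be computed exactly, which is the baseline the paper's approximations are judged against. A sketch with a made-up three-letter, first-order language model:

        import itertools
        import numpy as np

        letters = {"a": 0.5, "b": 0.3, "c": 0.2}   # toy letter probabilities
        word_len = 4

        # Each word's probability is the product of its letter probabilities
        # (a first-order approximation to a language).
        probs = np.array([np.prod([letters[ch] for ch in w])
                          for w in itertools.product(letters, repeat=word_len)])

        probs[::-1].sort()                   # sort descending, in place
        ranks = np.arange(1, probs.size + 1)
        avg_guesses = (ranks * probs).sum()  # expected guesses, optimal order

        entropy_bits = -(probs * np.log2(probs)).sum()
        print(avg_guesses, 2.0 ** entropy_bits)  # exact count vs. entropy-based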

  11. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  12. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  13. Fuzzy Markov chains: uncertain probabilities

    OpenAIRE

    James J. Buckley; Eslami, Esfandiar

    2002-01-01

    We consider finite Markov chains where there are uncertainties in some of the transition probabilities. These uncertainties are modeled by fuzzy numbers. Using a restricted fuzzy matrix multiplication we investigate the properties of regular, and absorbing, fuzzy Markov chains and show that the basic properties of these classical Markov chains generalize to fuzzy Markov chains.
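
    The restricted fuzzy matrix multiplication can be illustrated, for crisp possibility degrees rather than the fuzzy-number entries of the paper, by the usual max-min composition:

        import numpy as np

        def maxmin(a, b):
            # Max-min composition: (a o b)[i, j] = max_k min(a[i, k], b[k, j]).
            return np.maximum.reduce(
                np.minimum(a[:, :, None], b[None, :, :]), axis=1)

        # A fuzzy transition matrix: entries are possibility degrees,
        # so rows need not sum to one.
        P = np.array([[1.0, 0.4, 0.1],
                      [0.3, 1.0, 0.5],
                      [0.2, 0.6, 1.0]])

        Pk = P.copy()
        for _ in range(10):                 # iterate toward the limiting relation
            Pk = maxmin(Pk, P)
        print(Pk)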

  14. Probability representations of fuzzy systems

    Institute of Scientific and Technical Information of China (English)

    LI Hongxing

    2006-01-01

    In this paper, the probabilistic significance of fuzzy systems is revealed. It is pointed out that the COG (center of gravity) defuzzification method, used commonly in fuzzy systems, is reasonable and is the optimal method in the mean-square sense. Based on different fuzzy implication operators, several typical probability distributions, such as the Zadeh distribution, the Mamdani distribution and the Lukasiewicz distribution, are given. Those distributions act as "inner kernels" of fuzzy systems. Furthermore, from properties of the probability distributions of fuzzy systems, it is also demonstrated that the CRI method, proposed by Zadeh for constructing fuzzy systems, is basically reasonable and effective. Besides, the special role of uniform probability distributions in fuzzy systems is characterized. Finally, the relationship between the CRI method and the triple I method is discussed. In the sense of construction of fuzzy systems, when the three fuzzy implication operators in the triple I method are restricted to the same operator, the CRI method and the triple I method may be related in the following three basic ways: 1) the two methods are equivalent; 2) the latter is a degeneration of the former; 3) the latter is trivial whereas the former is not. When the three fuzzy implication operators in the triple I method are not restricted to the same operator, the CRI method is a special case of the triple I method; that is, the triple I method is the more comprehensive algorithm. Since the triple I method has a good logical foundation and embodies an idea of optimizing the reasoning, it promises broad prospects for application.
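
    The probabilistic reading of COG is easy to see in code: normalizing the output membership function turns it into a density, and COG is exactly its mean, the mean-square-optimal point estimate. A minimal sketch on a discretized universe:

        import numpy as np

        def cog_defuzzify(y, mu):
            # Center of gravity on a discretized universe: the
            # membership-weighted mean of y.  Normalizing mu to sum to one
            # turns it into a probability distribution whose mean this is.
            return (y * mu).sum() / mu.sum()

        y = np.linspace(0.0, 10.0, 1001)
        mu = np.clip(1.0 - np.abs(y - 6.0) / 3.0, 0.0, None)  # triangular set
        print(cog_defuzzify(y, mu))  # ~6.0, the mean of the induced density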

  15. Investigation of Flood Inundation Probability in Taiwan

    Science.gov (United States)

    Wang, Chia-Ho; Lai, Yen-Wei; Chang, Tsang-Jung

    2010-05-01

    Taiwan is located at a special point in the path of typhoons from the northeast Pacific Ocean, and is also situated in a tropical-subtropical transition zone. As a result, rainfall is abundant all year round, especially in summer and autumn. For flood inundation analysis in Taiwan, there are many uncertainties in hydrological, hydraulic and land-surface topography characteristics, which can change flood inundation characteristics. According to the 7th work item of article 22 in the Disaster Prevention and Protection Act in Taiwan, to keep flood disasters from worsening, investigation and analysis of disaster potential, degree of hazard and situation simulation must proceed with scientific approaches. However, the existing flood potential analysis uses a deterministic approach to define flood inundation without considering data uncertainties. This research incorporates the concept of data uncertainty into flood inundation maps, showing a flood probability for each grid cell, which can serve as a basis for emergency evacuation as typhoons approach and extremely torrential rain begins. The research selects the Hebauyu watershed of Chiayi County as the demonstration area. Owing to the uncertainties of the data used, sensitivity analysis is first conducted using Latin Hypercube sampling (LHS). The LHS data sets are next input into an integrated numerical model, which is herein developed to assess flood inundation hazards in coastal lowlands, based on the extension of the 1-D river routing model and the 2-D inundation routing model. Finally, the probability of flood inundation is calculated, and the flood inundation probability maps are obtained. Flood inundation probability maps can be an alternative to the old flood potential maps and a reference for building new hydraulic infrastructure in the future.
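
    A sketch of the Latin Hypercube step, with an invented two-parameter "inundation" criterion standing in for the 1-D/2-D routing model (which is far too heavy to reproduce here):

        import numpy as np

        rng = np.random.default_rng(4)

        def latin_hypercube(n_samples, n_dims):
            # One stratified point per equal-probability stratum in each
            # dimension, with strata shuffled independently across dimensions.
            u = (rng.random((n_samples, n_dims))
                 + np.arange(n_samples)[:, None]) / n_samples
            for d in range(n_dims):
                rng.shuffle(u[:, d])
            return u

        # Invented stand-in for the routing model: "inundation" occurs when
        # a roughness/rainfall combination is large.
        u = latin_hypercube(200, 2)
        roughness = 0.02 + 0.03 * u[:, 0]     # Manning's n
        rain_factor = 0.8 + 0.4 * u[:, 1]     # rainfall multiplier
        flooded = roughness * rain_factor > 0.033
        print("inundation probability ~", flooded.mean())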

  16. Unified set of atomic transition probabilities for neutral argon

    OpenAIRE

    Wiese, W.; Brault, J.; Danzmann, K.; Helbig, V.; de Kock, M

    1989-01-01

    The atomic transition probabilities and radiative lifetimes of neutral argon have been the subject of numerous experiments and calculations, but the results exhibit many discrepancies and inconsistencies. We present a unified set of atomic transition probabilities, which is consistent with essentially all recent results, albeit sometimes only after critical reanalysis. The data consistency and scale confirmation has been achieved in two ways. (i) We have carried out some lifetime–branching-ra...

  17. Survival probability of drug resistant mutants in malaria parasites.

    OpenAIRE

    Mackinnon, M. J.

    1997-01-01

    This study predicts the ultimate probability of survival of a newly arisen drug resistant mutant in a population of malaria parasites, with a view to understanding what conditions favour the evolution of drug resistance. Using branching process theory and a population genetics transmission model, the probabilities of survival of one- and two-locus new mutants are calculated as functions of the degree of drug pressure, the mean and variation in transmission rate, and the degree of natural sele...
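
    The branching-process part of such a calculation reduces to finding the extinction probability as the smallest fixed point of the offspring probability generating function, with survival as its complement. A sketch assuming Poisson-distributed offspring (the paper's transmission model is more elaborate):

        import numpy as np

        def survival_probability(mean_offspring, tol=1e-10):
            # The extinction probability q of a Galton-Watson process with
            # Poisson(mean_offspring) offspring is the smallest fixed point of
            # the generating function f(s) = exp(mean_offspring*(s - 1));
            # iterating f from 0 converges to it.  Survival = 1 - q.
            q = 0.0
            while True:
                q_next = np.exp(mean_offspring * (q - 1.0))
                if abs(q_next - q) < tol:
                    return 1.0 - q_next
                q = q_next

        for R in (0.9, 1.1, 1.5, 2.0):   # effective reproductive number
            print(R, survival_probability(R))

    Survival is possible only when the mean offspring number exceeds one, which is why drug pressure and transmission rate enter the paper's analysis through their effect on that mean.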

  18. Quantum probabilities and entanglement for multimode quantum systems

    International Nuclear Information System (INIS)

    Quantum probabilities are defined for several important physical cases characterizing measurements with multimode quantum systems. These are the probabilities for operationally testable measurements, for operationally uncertain measurements, and for entangled composite events. The role of the prospect and state entanglement is emphasized. Numerical modeling is presented for a two-mode Bose-condensed system of trapped atoms. The interference factor is calculated by invoking the channel-state duality.

  19. Evaluation of photoexcitation and photoionization probabilities by the trajectory method

    International Nuclear Information System (INIS)

    A new trajectory-based method of transition probability evaluation in quantum systems was developed. It is based on a path integral representation of the probability and uses Weyl symbols for the initial and final states. The method belongs to the efficient initial value representation (IVR) schemes. The pre-exponential factor specific to the semi-classical method is equal to one, and does not need to be separately calculated. This eliminates problems with caustics and Maslov indices of trajectories. The method is equally efficient for evaluation of the transition probabilities into separate states and groups of states, including an entire ionization continuum, for example. The capabilities of the method are demonstrated by the evaluation of the photo-excitation and photo-ionization probabilities in the hydrogen atom exposed to an ultrashort photo-pulse, and the total photo-ionization probability in the helium atom. (authors)

  20. Calculation Methods for Wallenius’ Noncentral Hypergeometric Distribution

    DEFF Research Database (Denmark)

    Fog, Agner

    2008-01-01

    conditional distribution of independent binomial variates given their sum. No reliable calculation method for Wallenius' noncentral hypergeometric distribution has hitherto been described in the literature. Several new methods for calculating probabilities from Wallenius' noncentral hypergeometric...
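
    One straightforward way to compute two-color Wallenius probabilities, distinct from and cruder than the methods of the paper, is dynamic programming over the draw sequence, since after x red and k - x white draws the odds for the next draw are fully determined:

        def wallenius_pmf(n, m1, m2, omega):
            # P(x red after k draws), built up draw by draw: with x red and
            # k - x white already taken, the remaining total weights are
            # omega*(m1 - x) for red and (m2 - (k - x)) for white.
            f = {0: 1.0}
            for k in range(n):
                g = {}
                for x, p in f.items():
                    red_w = omega * (m1 - x)
                    white_w = float(m2 - (k - x))
                    total = red_w + white_w
                    if red_w > 0:
                        g[x + 1] = g.get(x + 1, 0.0) + p * red_w / total
                    if white_w > 0:
                        g[x] = g.get(x, 0.0) + p * white_w / total
                f = g
            return f

        # 5 draws from 6 red balls (weight 2) and 4 white balls (weight 1).
        print(wallenius_pmf(n=5, m1=6, m2=4, omega=2.0))

    This enumeration costs O(n^2) states and becomes impractical for large urns, which is exactly the regime the paper's methods are designed for.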

  1. Probability as a Physical Motive

    Directory of Open Access Journals (Sweden)

    Peter Martin

    2007-04-01

    Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production ("MEP") to the information-theoretical "MaxEnt" principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand "the adjacent possible" as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  2. Probability, Information and Statistical Physics

    Science.gov (United States)

    Kuzemsky, A. L.

    2016-03-01

    In this short survey review we discuss foundational issues of the probabilistic approach to information theory and statistical mechanics from a unified standpoint. Emphasis is on the interrelations between theories. The basic aim is tutorial, i.e. to give a basic introduction to the analysis and applications of probabilistic concepts in the description of various aspects of complexity and stochasticity. We consider probability as a foundational concept in statistical mechanics and review selected advances in the theoretical understanding of the interrelation of probability, information and statistical description with regard to basic notions of the statistical mechanics of complex systems. It also includes a synthesis of past and present research and a survey of methodology. The purpose of this terse overview is to discuss and partially describe those probabilistic methods and approaches that are used in statistical mechanics, with the purpose of making these ideas easier to understand and to apply.

  3. Sm Transition Probabilities and Abundances

    CERN Document Server

    Lawler, J E; Sneden, C; Cowan, J J

    2005-01-01

    Radiative lifetimes, accurate to +/- 5%, have been measured for 212 odd-parity levels of Sm II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier-transform spectrometry to determine transition probabilities for more than 900 lines of Sm II. This work is the largest-scale laboratory study to date of Sm II transition probabilities using modern methods. This improved data set has been used to determine a new solar photospheric Sm abundance, log epsilon = 1.00 +/- 0.03, from 26 lines. The spectra of three very metal-poor, neutron-capture-rich stars also have been analyzed, employing between 55 and 72 Sm II lines per star. The abundance ratios of Sm relative to other rare earth elements in these stars are in agreement, and are consistent with ratios expected from rapid neutron-capture nucleosynthesis (the r-process).

  4. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    Andre; C. R. Martins

    2006-11-01

    In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as has been observed.

  5. The probability of extraterrestrial life

    International Nuclear Information System (INIS)

    Since the beginning of time, human beings have needed to live in the company of other humans, developing what we now know as human societies. In line with this idea there has been speculation, especially in the present century, about the possibility that human society has the company of other thinking creatures living on other planets somewhere in our galaxy. In this talk we will only use reliable data from scientific observers in order to establish a probability. We will explain the analysis of the physico-chemical principles which allow the evolution of organic molecules on our planet and establish these as the forerunners of life. On the other hand, the physical processes governing stars, their characteristics and their effects on planets will also be explained, as well as the amount of energy that a planet receives, its mass, atmosphere and kind of orbit. Finally, considering all this information, a probability of life from outer space will be given. (Author)

  6. Classical Probability and Quantum Outcomes

    Directory of Open Access Journals (Sweden)

    James D. Malley

    2014-05-01

    There is a contact problem between classical probability and quantum outcomes. Thus, a standard result from classical probability on the existence of joint distributions ultimately implies that all quantum observables must commute. An essential task here is a closer identification of this conflict based on deriving commutativity from the weakest possible assumptions, and showing that stronger assumptions in some of the existing no-go proofs are unnecessary. An example of an unnecessary assumption in such proofs is an entangled system involving nonlocal observables. Another example involves the Kochen-Specker hidden variable model, features of which are also not needed to derive commutativity. A diagram is provided by which user-selected projectors can be easily assembled into many new, graphical no-go proofs.

  7. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  8. Relative transition probabilities of cobalt

    Science.gov (United States)

    Roig, R. A.; Miller, M. H.

    1974-01-01

    Transition probabilities for neutral and singly ionized cobalt were measured relative to Co I 4150.43 A and Co II 4145.15 A, using a gas-driven shock tube as the spectroscopic light source. Results are presented for 139 Co I lines in the range from 3940 to 6640 A and 11 Co II lines in the range from 3840 to 4730 A, with estimated reliabilities ranging from 8 to 50%.

  9. Probability for primordial black holes

    Science.gov (United States)

    Bousso, R.; Hawking, S. W.

    1995-11-01

    We consider two quantum cosmological models with a massive scalar field: an ordinary Friedmann universe and a universe containing primordial black holes. For both models we discuss the complex solutions to the Euclidean Einstein equations. Using the probability measure obtained from the Hartle-Hawking no-boundary proposal we find that the only unsuppressed black holes start at the Planck size but can grow with the horizon scale during the roll down of the scalar field to the minimum.

  10. Tight Bernoulli tail probability bounds

    OpenAIRE

    Dzindzalieta, Dainius

    2014-01-01

    The purpose of the dissertation is to prove universal tight bounds for deviation-from-the-mean probability inequalities for functions of random variables. Universal means that the bounds are uniform with respect to some class of distributions, the number of variables, and other parameters. The bounds are called tight if we can construct a sequence of random variables such that the upper bounds are achieved. Such inequalities are useful for example in insurance mathematics, for constructing...
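
    For illustration only (this standard Hoeffding bound is not one of the tight bounds studied in the dissertation): comparing the exact tail probability of a sum S of n Bernoulli(1/2) variables with the universal bound P(S - n/2 >= t) <= exp(-2t^2/n) shows how much room such generic bounds leave for improvement.

      import math

      def binom_tail(n, k):
          """Exact P(S >= k) for S ~ Binomial(n, 1/2)."""
          return sum(math.comb(n, j) for j in range(k, n + 1)) / 2.0 ** n

      n, t = 100, 15
      exact = binom_tail(n, n // 2 + t)        # P(S >= 65)
      hoeffding = math.exp(-2.0 * t * t / n)
      print(f"exact tail = {exact:.4e}, Hoeffding bound = {hoeffding:.4e}")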

  11. Asbestos and Probable Microscopic Polyangiitis

    Directory of Open Access Journals (Sweden)

    George S Rashed Philteos

    2004-01-01

    Full Text Available Several inorganic dust lung diseases (pneumoconioses) are associated with autoimmune diseases. Although autoimmune serological abnormalities are common in asbestosis, clinical autoimmune/collagen vascular diseases are not commonly reported. A case of pulmonary asbestosis complicated by perinuclear antineutrophil cytoplasmic antibody (myeloperoxidase) positive probable microscopic polyangiitis (glomerulonephritis, pericarditis, alveolitis, mononeuritis multiplex) is described, and the possible immunological mechanisms whereby asbestos fibres might be relevant in the induction of antineutrophil cytoplasmic antibodies are reviewed in the present report.

  12. Probability distributions of landslide volumes

    OpenAIRE

    M. T. Brunetti; Guzzetti, F.; M. Rossi

    2009-01-01

    We examine 19 datasets with measurements of landslide volume, VL, for sub-aerial, submarine, and extraterrestrial mass movements. Individual datasets include from 17 to 1019 landslides of different types, including rock fall, rock slide, rock avalanche, soil slide, slide, and debris flow, with individual landslide volumes ranging over 10−4 m3≤VL≤1013 m3. We determine the probability density of landslide volumes, p(VL), using kernel density estimation. Each landslide...
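
    A schematic of the estimation step, on synthetic data rather than the 19 real datasets: the density of landslide volumes is estimated with a Gaussian kernel density estimator, working in log10(VL) because the volumes span many orders of magnitude.

      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(0)
      log_v = rng.normal(loc=3.0, scale=1.5, size=500)   # hypothetical log10 volumes

      kde = gaussian_kde(log_v)               # kernel density estimate of p(log10 VL)
      grid = np.linspace(log_v.min(), log_v.max(), 200)
      density = kde(grid)
      print("mode of log10(VL) near", grid[np.argmax(density)])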

  13. Transition probabilities and radiative lifetimes of levels in F I

    Energy Technology Data Exchange (ETDEWEB)

    Celik, Gueltekin, E-mail: gultekin@selcuk.edu.tr; Dogan, Duygu; Ates, Sule; Taser, Mehmet

    2012-07-15

    The electric dipole transition probabilities and the lifetimes of excited levels have been calculated using the weakest bound electron potential model theory (WBEPMT) and the quantum defect orbital theory (QDOT) in atomic fluorine. In the calculations, many transition arrays including both multiplet and fine-structure transitions are considered. We employed Numerical Coulomb Approximation (NCA) wave functions and numerical non-relativistic Hartree-Fock (NRHF) wave functions for the expectation values of radii in the determination of parameters. The necessary energy values have been taken from experimental energy data in the literature. The calculated transition probabilities and lifetimes have been compared with available theoretical and experimental results, and good agreement with the literature has been obtained. Moreover, some transition probability and lifetime values not existing in the literature for some highly excited levels have been obtained using these methods.

  14. Clusters Emission and Preformation Probability in Superheavy Nuclei

    International Nuclear Information System (INIS)

    Exotic decays of superheavy elements with atomic number Z = 114, 118, and 120 are studied taking the Coulomb and proximity potential as the interacting barrier. Within a fission model with a power-law potential, we computed the formation probability for different clusters, and we also calculated, in the WKB approach, the half-lives for the cluster radioactivities most likely to be preformed. We find that some of these clusters lie within the present limit of measurability of 10^30 s, and the half-lives also show a strong dependence on the formation probability. (Author)

  15. Detection probabilities for time-domain velocity estimation

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    programs, it is demonstrated that the probability of correct estimation depends on the signal-to-noise ratio, transducer bandwidth, number of A-lines and number of samples used in the correlation estimate. The influence of applying a stationary echo-canceler is explained. The echo canceling can be modeled...... as a filter with a transfer function depending on the actual velocity. This influences the detection probability, which gets lower at certain velocities. An index directly reflecting the probability of detection can easily be calculated from the cross-correlation estimate. This makes it possible to...

  16. Reach/frequency for printed media: Personal probabilities or models

    DEFF Research Database (Denmark)

    Mortensen, Peter Stendahl

    2000-01-01

    The author evaluates two different ways of estimating the reach and frequency of plans for printed media. The first assigns reading probabilities to groups of respondents and calculates reach and frequency by simulation. The second estimates parameters of a model for reach/frequency. It is concluded...... that, in order to prevent bias, ratings per group must be used as reading probabilities. Nevertheless, in most cases, the estimates are still biased compared with panel data, thus overestimating net reach. Models with the same assumptions as with assignments of reading probabilities are presented and...
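
    A minimal sketch of the first approach, with invented groups and probabilities: each respondent group gets a reading probability per insertion, and reach (at least one exposure) and average frequency are estimated by simulation.

      import random

      random.seed(1)
      groups = [(0.6, 400), (0.2, 600)]   # (reading probability, group size), hypothetical
      insertions = 4

      exposed, contacts_total, population = 0, 0, 0
      for p, size in groups:
          for _ in range(size):
              contacts = sum(random.random() < p for _ in range(insertions))
              exposed += contacts >= 1
              contacts_total += contacts
              population += 1

      reach = exposed / population
      print(f"net reach = {reach:.3f}, average frequency = {contacts_total / exposed:.2f}")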

  17. Double K-shell ionization probability in 54Mn

    International Nuclear Information System (INIS)

    We have measured the probability of double K-shell vacancy production in the electron capture decay of 54Mn to the 835-keV level of 54Cr. The probability was deduced from the number of triple coincidences among the Cr hypersatellite and satellite x rays emitted in filling the double vacancy and the 835-keV γ ray. The probability of double K-shell vacancy production per K-shell electron capture (PKK) was found to be (2.3 +0.8/-0.5) × 10^-4. Comparisons to previous experimental results and theoretical calculations are discussed.
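
    Schematically, with invented counts and an assumed detection efficiency (the actual analysis involves detailed efficiency and background corrections): PKK is the efficiency-corrected ratio of triple-coincidence counts to the number of K-shell electron-capture decays.

      n_triple = 58          # hypothetical triple-coincidence counts
      n_k_capture = 2.1e6    # hypothetical number of K-capture decays observed
      efficiency = 0.12      # assumed combined detection efficiency

      p_kk = n_triple / (n_k_capture * efficiency)
      print(f"P_KK ~ {p_kk:.1e}")    # ~2.3e-4 with these made-up numbers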

  18. Generating target probability sequences and events

    OpenAIRE

    Ella, Vaignana Spoorthy

    2013-01-01

    Cryptography and simulation of systems require that events of pre-defined probability be generated. This paper presents methods to generate target probability events based on the oblivious transfer protocol and target probabilistic sequences using probability distribution functions.

  19. Electronic factors for K-shell-electron conversion probability and electron-positron pair formation probability in electric monopole transitions

    International Nuclear Information System (INIS)

    This paper presents, in tabular form, the electronic factors ΩK,π(Z,k) of the electric monopole transition probability associated with the internal conversion of an electron from the atomic K shell (IC; K) and with internal pair formation (IPF; π). The Ωπ values are calculated by taking the nuclear Coulomb effects into account. The corrections to ΩK due to finite nuclear size and bound-state atomic screening are not included in the present calculations. The calculated ratio of the K-shell-electron conversion probability to the electron-positron pair formation probability is found to be in good agreement with the available experimental data for Z ≤ 40.

  20. Electric quadrupole transition probabilities and line strengths of Ti11+

    International Nuclear Information System (INIS)

    Electric quadrupole transition probabilities and line strengths have been calculated using the weakest bound electron potential model for sodium-like titanium, considering many transition arrays. We employed numerical Coulomb approximation and non-relativistic Hartree–Fock wavefunctions for the expectation values of radii in determination of parameters of the model. The necessary energy values have been taken from experimental data in the literature. The calculated electric quadrupole line strengths have been compared with available data in the literature and good agreement has been obtained. Moreover, some electric quadrupole transition probability and line strength values not existing in the literature for some highly excited levels have been obtained using this method

  1. Evaluations of Structural Failure Probabilities and Candidate Inservice Inspection Programs

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.; Simonen, Fredric A.

    2009-05-01

    The work described in this report applies probabilistic structural mechanics models to predict the reliability of nuclear pressure boundary components. These same models are then applied to evaluate the effectiveness of alternative programs for inservice inspection to reduce these failure probabilities. Results of the calculations support the development and implementation of risk-informed inservice inspection of piping and vessels. Studies have specifically addressed the potential benefits of ultrasonic inspections to reduce failure probabilities associated with fatigue crack growth and stress-corrosion cracking. Parametric calculations were performed with the computer code pc-PRAISE to generate an extensive set of plots to cover a wide range of pipe wall thicknesses, cyclic operating stresses, and inspection strategies. The studies have also addressed critical inputs to fracture mechanics calculations such as the parameters that characterize the number and sizes of fabrication flaws in piping welds. Other calculations quantify uncertainties associated with the inputs to the calculations, the uncertainties in the fracture mechanics models, and the uncertainties in the resulting calculated failure probabilities. A final set of calculations addresses the effects of flaw sizing errors on the effectiveness of inservice inspection programs.

  2. Evaluations of Structural Failure Probabilities and Candidate Inservice Inspection Programs

    International Nuclear Information System (INIS)

    The work described in this report applies probabilistic structural mechanics models to predict the reliability of nuclear pressure boundary components. These same models are then applied to evaluate the effectiveness of alternative programs for inservice inspection to reduce these failure probabilities. Results of the calculations support the development and implementation of risk-informed inservice inspection of piping and vessels. Studies have specifically addressed the potential benefits of ultrasonic inspections to reduce failure probabilities associated with fatigue crack growth and stress-corrosion cracking. Parametric calculations were performed with the computer code pc-PRAISE to generate an extensive set of plots to cover a wide range of pipe wall thicknesses, cyclic operating stresses, and inspection strategies. The studies have also addressed critical inputs to fracture mechanics calculations such as the parameters that characterize the number and sizes of fabrication flaws in piping welds. Other calculations quantify uncertainties associated with the inputs to the calculations, the uncertainties in the fracture mechanics models, and the uncertainties in the resulting calculated failure probabilities. A final set of calculations addresses the effects of flaw sizing errors on the effectiveness of inservice inspection programs.

  3. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  4. Probability of Detection Demonstration Transferability

    Science.gov (United States)

    Parker, Bradford H.

    2008-01-01

    The ongoing Mars Science Laboratory (MSL) Propellant Tank Penetrant Nondestructive Evaluation (NDE) Probability of Detection (POD) Assessment (NESC activity) has surfaced several issues associated with liquid penetrant POD demonstration testing. This presentation lists factors that may influence the transferability of POD demonstration tests. Initial testing will address the liquid penetrant inspection technique. Some of the factors to be considered in this task are crack aspect ratio, the extent of the crack opening, the material and the distance between the inspection surface and the inspector's eye.

  5. Estimating Probabilities in Recommendation Systems

    CERN Document Server

    Sun, Mingxuan; Kidwell, Paul

    2010-01-01

    Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computation schemes using combinatorial properties of generating functions. We demonstrate our approach with several case studies involving real world movie recommendation data. The results are comparable with state-of-the-art techniques while also providing probabilistic preference estimates outside the scope of traditional recommender systems.

  6. Evaluation of the Permanent Deformations and Aging Conditions of Batu Pahat Soft Clay-Modified Asphalt Mixture by Using a Dynamic Creep Test

    Directory of Open Access Journals (Sweden)

    Al Allam A. M.

    2016-01-01

    Full Text Available This study aimed to evaluate the permanent deformation and aging conditions of Batu Pahat soft clay-modified asphalt mixture, using Batu Pahat soft clay (BPSC) particles in powder form as an additive to hot-mix asphalt. In this experiment, five percentage compositions of BPSC (0%, 2%, 4%, 6%, and 8% by weight of bitumen) were used. A novel design was established to modify the hot-mix asphalt by using the Superpave method for each additive ratio. Several laboratory tests evaluating different properties, such as indirect tensile strength, resilient stiffness modulus, and dynamic creep, were conducted to assess the performance of the samples mixed through the Superpave method. In the resilient modulus test, fatigue and rutting resistance were reduced by the BPSC particles. The added BPSC particles increased the indirect tensile strength. Among the mixtures, 4% BPSC particles yielded the highest performance. In the dynamic creep test, 4% BPSC particles added to the unaged and short-term aged specimens also showed the highest performance. Based on these results, our conclusion is that the BPSC particles can alleviate the permanent deformation (rutting) of roads.

  7. Exact feature probabilities in images with occlusion

    CERN Document Server

    Pitkow, Xaq

    2010-01-01

    To understand the computations of our visual system, it is important to understand also the natural environment it evolved to interpret. Unfortunately, existing models of the visual environment are either unrealistic or too complex for mathematical description. Here we describe a naturalistic image model and present a mathematical solution for the statistical relationships between the image features and model variables. The world described by this model is composed of independent, opaque, textured objects which occlude each other. This simple structure allows us to calculate the joint probability distribution of image values sampled at multiple arbitrarily located points, without approximation. This result can be converted into probabilistic relationships between observable image features as well as between the unobservable properties that caused these features, including object boundaries and relative depth. Using these results we explain the causes of a wide range of natural scene properties, including high...

  8. Simulator data on human error probabilities

    International Nuclear Information System (INIS)

    Analysis of operator errors on NPP simulators is being used to determine Human Error Probabilities (HEPs) for task elements defined in NUREG/CR-1278. Simulator data tapes from research conducted by EPRI and ORNL are being analyzed for operator error rates. The tapes, collected using Performance Measurement System software developed for EPRI, contain a history of all operator manipulations during simulated casualties. Analysis yields a time history or Operational Sequence Diagram and a manipulation summary, both stored in computer data files. Data searches yield information on operator errors of omission and commission. This work experimentally determined HEPs for Probabilistic Risk Assessment calculations. It is the only practical experimental source of this data to date.

  9. Some improved transition probabilities for neutral carbon

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, Charlotte Froese [Atomic Physics Division, National Institute of Standards and Technology, Gaithersburg, MD 20899-8422 (United States)

    2006-05-14

    An earlier paper (Zatsarinny O and Froese Fischer C 2002 J. Phys. B: At. Mol. Opt. Phys. 35 4669) presented oscillator strengths for transitions from the 2p^2 3P term to high-lying excited states of carbon. The emphasis was on the accurate prediction of energy levels relative to the ionization limit and allowed transition data from the ground state. The present paper reports some refined transition probability calculations for transitions from the 2p^2 3P, 1D, and 1S terms to all odd levels up to 2p3d 3P. Particular attention is given to intercombination lines where relativistic effects are most important.

  10. Generalized Bures products from free probability

    CERN Document Server

    Jarosz, Andrzej

    2012-01-01

    Inspired by the theory of quantum information, I use two non-Hermitian random matrix models - a weighted sum of circular unitary ensembles and a product of rectangular Ginibre unitary ensembles - as building blocks of three new products of random matrices which are generalizations of the Bures model. I apply the tools of both Hermitian and non-Hermitian free probability to calculate the mean densities of their eigenvalues and singular values in the thermodynamic limit, along with their divergences at zero; the results are supported by Monte Carlo simulations. I pose and test conjectures concerning the relationship between the two densities (exploiting the notion of the N-transform), the shape of the mean domain of the eigenvalues (an extension of the single ring theorem), and the universal behavior of the mean spectral density close to the domain's borderline (using the complementary error function).

  11. Constraints on probability distributions of grammatical forms

    Directory of Open Access Journals (Sweden)

    Kostić Aleksandar

    2007-01-01

    Full Text Available In this study we investigate the constraints on probability distribution of grammatical forms within morphological paradigms of Serbian language, where paradigm is specified as a coherent set of elements with defined criteria for inclusion. Thus, for example, in Serbian all feminine nouns that end with the suffix "a" in their nominative singular form belong to the third declension, the declension being a paradigm. The notion of a paradigm could be extended to other criteria as well, hence, we can think of noun cases, irrespective of grammatical number and gender, or noun gender, irrespective of case and grammatical number, also as paradigms. We took the relative entropy as a measure of homogeneity of probability distribution within paradigms. The analysis was performed on 116 morphological paradigms of typical Serbian and for each paradigm the relative entropy has been calculated. The obtained results indicate that for most paradigms the relative entropy values fall within a range of 0.75 - 0.9. Nonhomogeneous distribution of relative entropy values allows for estimating the relative entropy of the morphological system as a whole. This value is 0.69 and can tentatively be taken as an index of stability of the morphological system.
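
    A small illustration of the homogeneity measure (the form frequencies below are invented, not taken from the 116 Serbian paradigms): the relative entropy is the Shannon entropy of the within-paradigm frequency distribution divided by its maximum, log2 of the number of forms.

      import math

      def relative_entropy(freqs):
          total = sum(freqs)
          probs = [f / total for f in freqs if f > 0]
          h = -sum(p * math.log2(p) for p in probs)
          return h / math.log2(len(freqs))     # H / H_max

      paradigm_freqs = [120, 95, 60, 44, 30, 18, 10]   # hypothetical form counts
      print(f"relative entropy = {relative_entropy(paradigm_freqs):.3f}")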

  12. Hf Transition Probabilities and Abundances

    CERN Document Server

    Lawler, J E; Labby, Z E; Sneden, C; Cowan, J J; Ivans, I I

    2006-01-01

    Radiative lifetimes from laser-induced fluorescence measurements, accurate to about +/- 5 percent, are reported for 41 odd-parity levels of Hf II. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 150 lines of Hf II. Approximately half of these new transition probabilities overlap with recent independent measurements using a similar approach. The two sets of measurements are found to be in good agreement for measurements in common. Our new laboratory data are applied to refine the hafnium photospheric solar abundance and to determine hafnium abundances in 10 metal-poor giant stars with enhanced r-process abundances. For the Sun we derive log epsilon (Hf) = 0.88 +/- 0.08 from four lines; the uncertainty is dominated by the weakness of the lines and their blending by other spectral features. Within the uncertainties of our analysis, the r-process-rich stars possess constant Hf/La and Hf/Eu abundance ratios, log epsilon (Hf...

  13. Gd Transition Probabilities and Abundances

    CERN Document Server

    Den Hartog, E A; Sneden, C; Cowan, J J

    2006-01-01

    Radiative lifetimes, accurate to +/- 5%, have been measured for 49 even-parity and 14 odd-parity levels of Gd II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 611 lines of Gd II. This work is the largest-scale laboratory study to date of Gd II transition probabilities and the first using a high performance Fourier transform spectrometer. This improved data set has been used to determine a new solar photospheric Gd abundance, log epsilon = 1.11 +/- 0.03. Revised Gd abundances have also been derived for the r-process-rich metal-poor giant stars CS 22892-052, BD+17 3248, and HD 115444. The resulting Gd/Eu abundance ratios are in very good agreement with the solar-system r-process ratio. We have employed the increasingly accurate stellar abundance determinations, resulting in large part from the more precise laboratory atomic data, to predict directly the Solar System r-process elemental...

  14. An Alternative Teaching Method of Conditional Probabilities and Bayes' Rule: An Application of the Truth Table

    Science.gov (United States)

    Satake, Eiki; Vashlishan Murray, Amy

    2015-01-01

    This paper presents a comparison of three approaches to the teaching of probability to demonstrate how the truth table of elementary mathematical logic can be used to teach the calculations of conditional probabilities. Students are typically introduced to the topic of conditional probabilities--especially the ones that involve Bayes' rule--with…
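
    A sketch of the truth-table idea with a made-up diagnostic-test example: enumerating the joint outcomes with their probabilities lets students read off a conditional probability directly, which is exactly what Bayes' rule computes.

      rows = [
          # (disease, test_positive, probability)
          (True,  True,  0.01 * 0.95),
          (True,  False, 0.01 * 0.05),
          (False, True,  0.99 * 0.10),
          (False, False, 0.99 * 0.90),
      ]

      p_positive = sum(p for d, t, p in rows if t)
      p_disease_and_positive = sum(p for d, t, p in rows if d and t)
      print(f"P(disease | positive) = {p_disease_and_positive / p_positive:.3f}")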

  15. Predicting most probable conformations of a given peptide sequence in the random coil state.

    Science.gov (United States)

    Bayrak, Cigdem Sevim; Erman, Burak

    2012-11-01

    In this work, we present a computational scheme for finding high probability conformations of peptides. The scheme calculates the probability of a given conformation of the given peptide sequence using the probability distribution of torsion states. Dependence of the states of a residue on the states of its first neighbors along the chain is considered. Prior probabilities of torsion states are obtained from a coil library. Posterior probabilities are calculated by the matrix multiplication Rotational Isomeric States Model of polymer theory. The conformation of a peptide with highest probability is determined by using a hidden Markov model Viterbi algorithm. First, the probability distribution of the torsion states of the residues is obtained. Using the highest probability torsion state, one can generate, step by step, states with lower probabilities. To validate the method, the highest probability state of residues in a given sequence is calculated and compared with probabilities obtained from the Coil Databank. Predictions based on the method are 32% better than predictions based on the most probable states of residues. The ensemble of "n" high probability conformations of a given protein is also determined using the Viterbi algorithm with multistep backtracking. PMID:22955874
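
    A generic first-order Viterbi sketch with toy numbers (the real scheme uses neighbor-dependent torsion-state statistics from a coil library): find the most probable sequence of states given a prior for the first residue and transition probabilities for the rest.

      import math

      states = ["alpha", "beta", "coil"]
      prior = {"alpha": 0.4, "beta": 0.3, "coil": 0.3}                       # toy values
      trans = {s: {"alpha": 0.5, "beta": 0.2, "coil": 0.3} for s in states}  # toy values

      def viterbi(n_residues):
          best = {s: math.log(prior[s]) for s in states}
          back = []
          for _ in range(1, n_residues):
              new_best, pointers = {}, {}
              for s in states:
                  prev = max(states, key=lambda r: best[r] + math.log(trans[r][s]))
                  new_best[s] = best[prev] + math.log(trans[prev][s])
                  pointers[s] = prev
              back.append(pointers)
              best = new_best
          path = [max(best, key=best.get)]    # best final state, then backtrack
          for pointers in reversed(back):
              path.append(pointers[path[-1]])
          return list(reversed(path))

      print(viterbi(5))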

  16. Electric quadrupole transition probabilities for singly ionized magnesium

    International Nuclear Information System (INIS)

    Electric quadrupole transition probabilities for Mg II have been calculated within the weakest bound electron potential model (WBEPM) theory using experimental energy levels and theoretical expectation values of orbital radii corresponding to those energy levels under the assumption of the LS coupling scheme. In this work, the WBEPM theory has been applied to forbidden transitions for the first time. The present results are consistent with earlier theoretical calculations. Some of these results are reported for the first time.

  17. LOFT fuel rod transient DNB probability density function studies

    International Nuclear Information System (INIS)

    Significantly improved calculated DNB safety margins were defined by the development and use of probability density functions (PDF) for transient MDNBR nuclear fuel rods in the Loss of Fluid Test (LOFT) reactor. Calculations for limiting transients and response surface methods were used thereby including transient interactions and trip uncertainties in the MDNBR PDF. Applicability and sensitivity studies determined that the PDF and resulting nominal MDNBR limits are stable, applicable over a wide range of potential input parameters, and applicable to most transients

  18. Lévy laws in free probability

    OpenAIRE

    Barndorff-Nielsen, Ole E.; Thorbjørnsen, Steen

    2002-01-01

    This article and its sequel outline recent developments in the theory of infinite divisibility and Lévy processes in free probability, a subject area belonging to noncommutative (or quantum) probability. The present paper discusses the classes of infinitely divisible probability measures in classical and free probability, respectively, via a study of the Bercovici–Pata bijection between these classes.

  19. The Inductive Applications of Probability Calculus

    Directory of Open Access Journals (Sweden)

    Corrado Gini

    2015-06-01

    Full Text Available The Author goes back to the founders of probability calculus to investigate their original interpretation of the probability measure in the applications of probability theory to real problems. The Author highlights some misunderstandings related to the inversion of deductions derived from the use of probability distributions for investigating the causes of events.

  20. MEMS Calculator

    Science.gov (United States)

    SRD 166 MEMS Calculator (Web, free access)   This MEMS Calculator determines the following thin film properties from data taken with an optical interferometer or comparable instrument: a) residual strain from fixed-fixed beams, b) strain gradient from cantilevers, c) step heights or thicknesses from step-height test structures, and d) in-plane lengths or deflections. Then, residual stress and stress gradient calculations can be made after an optical vibrometer or comparable instrument is used to obtain Young's modulus from resonating cantilevers or fixed-fixed beams. In addition, wafer bond strength is determined from micro-chevron test structures using a material test machine.

  1. Delayed neutron emission probability measurements

    International Nuclear Information System (INIS)

    Some neutrons are emitted from fission fragments several seconds to several minutes after fission occurs. These delayed neutrons play a key role in the operation and safety of nuclear reactors [1]. But the probabilities of emitting such neutrons (Pn) are not well known. A summary of different databases and compilations of Pn values is presented to show these discrepancies and uncertainties. Experiments are carried out at the Lohengrin mass spectrometer (at the Institut Laue-Langevin in Grenoble) and at the ISOLDE facility (CERN) in order to measure some Pn values. Two different techniques are used: either gamma-ray detection or neutron emission detection. These two techniques and some preliminary results are presented. (authors)

  2. Associativity and normative credal probability.

    Science.gov (United States)

    Snow, P

    2002-01-01

    Cox's Theorem is a widely cited motivation for probabilistic models of uncertain belief. The theorem relates the associativity of the logical connectives to that of the arithmetic operations of probability. Recent questions about the correctness of Cox's Theorem have been resolved, but there are new questions about one functional equation used by Cox in 1946. This equation is missing from his later work. Advances in knowledge since 1946 and changes in Cox's research interests explain the equation's disappearance. Other associativity-based motivations avoid functional equations altogether, and so may be more transparently applied to finite domains and discrete beliefs. A discrete counterpart of Cox's Theorem can be assembled from results that have been in the literature since 1959. PMID:18238098

  3. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  4. Transition Probabilities in 189Os

    International Nuclear Information System (INIS)

    The level structure of 189Os has been studied from the decay of 189Ir (13.3 days) produced by proton spallation at CERN and mass separated in the ISOLDE on-line facility. The gamma-ray spectrum has been recorded both with a high resolution Si(Li) detector and with Ge(Li) detectors. Three previously unreported transitions were observed, defining a new level at 348.5 keV. Special attention was given to the low energy level band structure. Several multipolarity mixing ratios were deduced from measured L-subshell ratios which, together with measured level half-lives, gave absolute transition probabilities. The low level decay properties are discussed in terms of the Nilsson model with the inclusion of Coriolis coupling.

  5. Estimate of Extinction Probability of Bisexual Galton-Watson Branching Process

    OpenAIRE

    Z. Zarabi Zadeh; R. Farnoosh

    2010-01-01

    In this paper a bisexual Galton-Watson branching process is studied. A Monte Carlo method is proposed to calculate the extinction probability. For a certain class of processes $\{Z_n\}$ the extinction probability is calculated and simulated for different values of the initial population size $Z_0$, and the results of the two methods are compared.
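
    A Monte Carlo sketch for the ordinary (asexual) Galton-Watson process -- the bisexual variant adds a mating function, omitted here -- with a hypothetical offspring law whose exact extinction probability from one ancestor is 0.5:

      import random

      random.seed(2)

      def offspring():
          # hypothetical offspring law: mean 1.25, extinction probability 0.5
          return random.choices([0, 1, 2], weights=[0.25, 0.25, 0.5])[0]

      def goes_extinct(z0, max_generations=200, cap=10_000):
          z = z0
          for _ in range(max_generations):
              if z == 0:
                  return True
              if z > cap:     # a population this large almost surely survives
                  return False
              z = sum(offspring() for _ in range(z))
          return z == 0

      trials = 5_000
      for z0 in (1, 2, 5):
          estimate = sum(goes_extinct(z0) for _ in range(trials)) / trials
          print(f"Z0 = {z0}: extinction probability ~ {estimate:.3f}")  # exact: 0.5**z0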

  6. Importance function by collision probabilities for Monte Carlo code TRIPOLI

    International Nuclear Information System (INIS)

    We present a completely automatic biasing technique where the parameters of the biased simulation are deduced from the solution of the adjoint transport equation calculated by collision probabilities. In this study we shall estimate the importance function through the collision probabilities method and we shall evaluate its possibilities thanks to a Monte Carlo calculation. We have run simulations with this new biasing method for one-group transport problems with isotropic shocks (one dimension geometry and X-Y geometry) and for multigroup problems with anisotropic shocks (one dimension geometry). For the anisotropic problems we solve the adjoint equation with anisotropic collision probabilities. The results show that for the one-group and homogeneous geometry transport problems the method is quite optimal without the Splitting and Russian Roulette technique, but for the multigroup and heterogeneous X-Y geometry ones the figures of merit are higher if we add the Splitting and Russian Roulette technique. (orig.)

  7. Failure probability of PWR reactor coolant loop piping

    International Nuclear Information System (INIS)

    This paper describes the results of assessments performed on the PWR coolant loop piping of Westinghouse and Combustion Engineering plants. For direct double-ended guillotine break (DEGB), consideration was given to crack existence probability, initial crack size distribution, hydrostatic proof test, preservice inspection, leak detection probability, crack growth characteristics, and failure criteria based on the net section stress failure and tearing modulus stability concept. For indirect DEGB, fragilities of major component supports were estimated. The system level fragility was then calculated based on the Boolean expression involving these fragilities. Indirect DEGB due to seismic effects was calculated by convolving the system level fragility and the seismic hazard curve. The results indicate that the probability of occurrence of both direct and indirect DEGB is extremely small, thus, postulation of DEGB in design should be eliminated and replaced by more realistic criteria

  8. Importance function by collision probabilities for Monte Carlo code Tripoli

    International Nuclear Information System (INIS)

    We present a completely automatic biasing technique where the parameters of the biased simulation are deduced from the solution of the adjoint transport equation calculated by collision probabilities. In this study we shall estimate the importance function through collision probabilities method and we shall evaluate its possibilities thanks to a Monte Carlo calculation. We have run simulations with this new biasing method for one-group transport problems with isotropic shocks (one dimension geometry and X-Y geometry) and for multigroup problems with anisotropic shocks (one dimension geometry). For the anisotropic problems we solve the adjoint equation with anisotropic collision probabilities. The results show that for the one-group and homogeneous geometry transport problems the method is quite optimal without Splitting and Russian Roulette technique but for the multigroup and heterogeneous X-Y geometry ones the figures of merit are higher if we add Splitting and Russian Roulette technique

  9. Wigner function and the probability representation of quantum states

    Directory of Open Access Journals (Sweden)

    Man’ko Margarita A.

    2014-01-01

    Full Text Available The relation of the Wigner function with the fair probability distribution called tomographic distribution or quantum tomogram associated with the quantum state is reviewed. The connection of the tomographic picture of quantum mechanics with the integral Radon transform of the Wigner quasidistribution is discussed. The Wigner-Moyal equation for the Wigner function is presented in the form of a kinetic equation for the tomographic probability distribution both in quantum mechanics and in the classical limit of the Liouville equation. The calculation of moments of physical observables in terms of integrals with the state tomographic probability distributions is constructed, having the standard form of averaging in probability theory. New uncertainty relations for the position and momentum are written in terms of optical tomograms suitable for direct experimental check. Some recent experiments on checking the uncertainty relations, including the entropic uncertainty relations, are discussed.

  10. Probability of induced nuclear fission in diffusion model

    International Nuclear Information System (INIS)

    The apparatus of the fission diffusion model, taking into account the nonequilibrium stage of the process, is described as applied to the description of the probability of induced nuclear fission. The results of calculating the energy dependence of 212Po nuclear fissility according to the new approach are presented.

  11. LOFT fuel-rod-transient DNB probability density function studies

    International Nuclear Information System (INIS)

    Significantly improved DNB safety margins were calculated for LOFT reactor fuel rods by use of probability density functions (PDFs) for the transient MDNBR. Applicability and sensitivity studies determined that the PDFs and resulting nominal MDNBR limits are stable, applicable over a wide range of potential input parameters, and applicable to most transients

  12. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    2014-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  13. Cosmological dynamics in tomographic probability representation

    OpenAIRE

    Man'ko, V. I.; G. Marmo(Università di Napoli and INFN, Napoli, Italy); Stornaiolo, C.

    2004-01-01

    The probability representation for quantum states of the universe in which the states are described by a fair probability distribution instead of wave function (or density matrix) is developed to consider cosmological dynamics. The evolution of the universe state is described by standard positive transition probability (tomographic transition probability) instead of the complex transition probability amplitude (Feynman path integral) of the standard approach. The latter one is expressed in te...

  14. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus. Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  15. Probability fracture mechanics analysis of plates with surface cracks

    International Nuclear Information System (INIS)

    Background: The uncertainties of input parameters in a deterministic structural integrity assessment of pressure vessels may affect the assessment results. This can be improved by performing probabilistic fracture mechanics (PFM) analysis. Purpose: This work investigates the effect of uncertainties of load, defect size, fracture toughness and failure criteria on the failure probability of semi-elliptical surface cracks in plates under combined tension and bending. Methods: The correction factor method provided by EPRI is used to estimate the stress intensity factor (SIF). The J-integral values at the deepest point of the surface crack tip are evaluated using the reference stress method and the global limit load solution developed by Goodall and Webster and by Lei. PFM analysis is performed considering the uncertainty of crack size, yield strength and fracture toughness, and Monte-Carlo (MC) simulation is used to calculate the failure probability. Results: The failure probability increases with the load level, Lr, for all load ratio values considered in this work for a given failure criterion. However, the failure probability based on the elastic-plastic fracture criterion is higher than that based on the linear elastic fracture criterion for a given load level, Lr. Conclusions: The load level and the failure criteria have a significant effect on the failure probability. However, the load ratio makes little contribution to the failure probability for a given failure criterion. (authors)
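
    A bare-bones PFM sketch in the same spirit (all numbers and the simple surface-crack SIF solution are illustrative, not the EPRI correction factor method used by the authors): sample crack depth and fracture toughness, compute KI = Y*sigma*sqrt(pi*a), and count failures.

      import math
      import random

      random.seed(3)
      sigma = 200.0    # applied stress, MPa (held fixed for simplicity)
      Y = 1.12         # approximate geometry factor for a shallow surface crack

      def one_trial():
          a = random.lognormvariate(math.log(2e-3), 1.0)   # crack depth, m
          k_ic = random.normalvariate(80.0, 10.0)          # toughness, MPa*sqrt(m)
          k_i = Y * sigma * math.sqrt(math.pi * a)         # stress intensity factor
          return k_i >= k_ic                               # linear elastic criterion

      n = 200_000
      p_fail = sum(one_trial() for _ in range(n)) / n
      print(f"estimated failure probability = {p_fail:.2e}")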

  16. THE TRANSITION PROBABILITY MATRIX OF A MARKOV CHAIN MODEL IN AN ATM NETWORK

    Institute of Scientific and Technical Information of China (English)

    YUE Dequan; ZHANG Huachen; TU Fengsheng

    2003-01-01

    In this paper we consider a Markov chain model in an ATM network, which has been studied by Dag and Stavrakakis. On the basis of the iterative formulas obtained by Dag and Stavrakakis, we obtain the explicit analytical expression of the transition probability matrix. It is very simple to calculate the transition probabilities of the Markov chain by these expressions. In addition, we obtain some results about the structure of the transition probability matrix, which are helpful in numerical calculation and theoretical analysis.
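
    A generic illustration with a toy 3-state chain (not the ATM model's explicit expressions): once the one-step transition probability matrix P is known, the n-step transition probabilities are simply the entries of the matrix power P^n.

      import numpy as np

      P = np.array([[0.90, 0.08, 0.02],
                    [0.10, 0.80, 0.10],
                    [0.05, 0.15, 0.80]])

      P10 = np.linalg.matrix_power(P, 10)            # 10-step transition probabilities
      print(P10.round(3))
      print("row sums:", P10.sum(axis=1).round(6))   # still a stochastic matrix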

  17. Prior probabilities modulate cortical surprise responses: A study of event-related potentials.

    Science.gov (United States)

    Seer, Caroline; Lange, Florian; Boos, Moritz; Dengler, Reinhard; Kopp, Bruno

    2016-07-01

    The human brain predicts events in its environment based on expectations, and unexpected events are surprising. When probabilistic contingencies in the environment are precisely instructed, the individual can form expectations based on quantitative probabilistic information ('inference-based learning'). In contrast, when probabilistic contingencies are imprecisely instructed, expectations are formed based on the individual's cumulative experience ('experience-based learning'). Here, we used the urn-ball paradigm to investigate how variations in prior probabilities and in the precision of information about these priors modulate choice behavior and event-related potential (ERP) correlates of surprise. In the urn-ball paradigm, participants are repeatedly forced to infer hidden states responsible for generating observable events, given small samples of factual observations. We manipulated prior probabilities of the states, and we rendered the priors calculable or incalculable, respectively. The analysis of choice behavior revealed that the tendency to consider prior probabilities when making decisions about hidden states was stronger when prior probabilities were calculable, at least in some of our participants. Surprise-related P3b amplitudes were observed in both the calculable and the incalculable prior probability condition. In contrast, calculability of prior probabilities modulated anteriorly distributed ERP amplitudes: when prior probabilities were calculable, surprising events elicited enhanced P3a amplitudes. However, when prior probabilities were incalculable, surprise was associated with enhanced N2 amplitudes. Furthermore, interindividual variability in reliance on prior probabilities was associated with attenuated P3b surprise responses under calculable in comparison to incalculable prior probabilities. Our results suggest two distinct neural systems for probabilistic learning that are recruited depending on contextual cues such as the precision of
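
    The inference-based (calculable) condition can be captured in a few lines; the urn compositions, prior, and sample below are hypothetical, not the actual experimental parameters: given the prior over hidden urns and an observed sample of balls, Bayes' rule gives the posterior over urns.

      def posterior_urn_a(prior_a, p_red_a, p_red_b, n_red, n_total):
          """P(urn A | sample) for sampling with replacement; the binomial
          coefficient cancels between the two likelihoods."""
          like_a = p_red_a ** n_red * (1 - p_red_a) ** (n_total - n_red)
          like_b = p_red_b ** n_red * (1 - p_red_b) ** (n_total - n_red)
          joint_a = prior_a * like_a
          return joint_a / (joint_a + (1 - prior_a) * like_b)

      # a precisely instructed prior of 0.7 versus an uninformative 0.5 guess
      for prior in (0.7, 0.5):
          print(f"prior {prior:.1f} -> posterior {posterior_urn_a(prior, 0.8, 0.2, 3, 4):.3f}")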

  18. Relative transition probabilities for krypton.

    Science.gov (United States)

    Miller, M. H.; Roig, R. A.; Bengtson, R. D.

    1972-01-01

    First experimental line strength data for the visible Kr II lines and for several of the more prominent Kr I lines are given. The spectroscopic light source used is the thermal plasma behind the reflected shock wave in a gas-driven shock tube. A 3/4-m spectrograph and a 1-m spectrograph were employed simultaneously to provide redundant photometry. The data are compared with other measurements and with theoretical calculations.

  19. The survival probability of neutrons in supercritical convex bodies using a time-dependent collision probability method

    International Nuclear Information System (INIS)

    We consider the probability of the survival of the neutron population when one neutron is injected into a supercritical fissile convex body. The formalism developed by Pal and Bell is used and the equations arising for the survival probability are solved by using a time-dependent collision probability technique. In principle, this method can be used for arbitrarily shaped convex bodies. A simple one-region case is seen to lead to reasonably accurate results when compared with the work of Gregson and Prinja [Gregson, M.W., Prinja, A.K., 2008. Time-dependent non-extinction probability for fast burst reactors. Transactions of the American Nuclear Society 98, 533 (Anaheim, CA)]. The calculations are extended to the case where a steady background neutron source is present. The time-dependent, self-collision probabilities are evaluated for slab, sphere and infinite cylindrical geometries. A method due to Lefvert [Lefvert, T., 1979. New applications of the collision probability method in neutron transport theory. Progress in Nuclear Energy 4, 97] for solving time-dependent collision probability equations is shown to give accurate results. The usefulness of diffusion theory to solve this problem is also investigated

  20. Avoiding Negative Probabilities in Quantum Mechanics

    CERN Document Server

    Nyambuya, Golden Gadzirayi

    2013-01-01

    As currently understood since its discovery, the bare Klein-Gordon theory consists of negative quantum probabilities which are considered to be physically meaningless if not outright obsolete. Despite this annoying setback, these negative probabilities are what led the great Paul Dirac in 1928 to the esoteric discovery of the Dirac Equation. The Dirac Equation led to one of the greatest advances in our understanding of the physical world. In this reading, we ask the seemingly senseless question, "Do negative probabilities exist in quantum mechanics?" In an effort to answer this question, we arrive at the conclusion that depending on the choice one makes of the quantum probability current, one will obtain negative probabilities. We thus propose a new quantum probability current of the Klein-Gordon theory. This quantum probability current leads directly to positive definite quantum probabilities. Because these negative probabilities are in the bare Klein-Gordon theory, intrinsically a result of negative energie...

  1. Direct probability mapping of contaminants

    International Nuclear Information System (INIS)

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration
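
    The post-processing step in miniature, with synthetic stand-ins for the geostatistical realizations (all numbers below are illustrative): given many equally likely simulated contamination maps, the probability map is the per-cell fraction of realizations exceeding the action level.

      import numpy as np

      rng = np.random.default_rng(4)
      n_real, ny, nx = 500, 20, 20
      action_level = 35.0              # illustrative threshold

      # stand-in for conditioned geostatistical simulations of concentration
      realizations = rng.lognormal(mean=3.0, sigma=0.6, size=(n_real, ny, nx))

      prob_exceed = (realizations > action_level).mean(axis=0)  # per-cell probability
      print("maximum exceedance probability on the map:", float(prob_exceed.max().round(3)))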

  2. The Black Hole Formation Probability

    CERN Document Server

    Clausen, Drew; Ott, Christian D

    2014-01-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. Using the observed BH mass distribution from Galactic X-ray binaries, we derive the probability that a star will make a BH as a function of its ZAMS mass, $P_{\rm BH}(M_{\rm ZAMS})$. We explore possible biases in the observed BH mass distribution and find that this sample is best suited for studying BH formation in stars with ZAMS masses in the range $12-...

  3. Exact Bures Probabilities of Separability

    CERN Document Server

    Slater, P B

    1999-01-01

    We reexamine the question of what constitutes the conditional Bures or "quantum Jeffreys" prior for a certain four-dimensional convex subset (P) of the eight-dimensional convex set (Q) of 3 x 3 density matrices (rho_{Q}). We find that two competing procedures yield related but not identical priors - the prior previously reported (J. Phys. A 29, L271 [1996]) being normalizable over P, the new prior here, not. Both methods rely upon the same formula of Dittmann for the Bures metric tensor g, but differ in the parameterized form of rho employed. In the earlier approach, the input is a member of P, that is rho_{P}, while here it is rho_{Q}, and only after this computation is the conditioning on P performed. Then, we investigate several one-dimensional subsets of the fifteen-dimensional set of 4 x 4 density matrices, to which we apply, in particular, the first methodology. Doing so, we determine exactly the conditional Bures probabilities of separability into product states of 2 x 2 density matrices. We find that ...

  4. Reliability Calculations

    DEFF Research Database (Denmark)

    Petersen, Kurt Erling

    1986-01-01

    probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability Monte Carlo simulation programs are used especially in analysis...... of very complex systems. In order to increase the applicability of the programs variance reduction techniques can be applied to speed up the calculation process. Variance reduction techniques have been studied and procedures for implementation of importance sampling are suggested....
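
    A minimal importance-sampling sketch of the kind of variance reduction referred to: the small probability P(X > 4) for X ~ N(0,1) is estimated by sampling from a density shifted into the rare-event region and reweighting by the likelihood ratio.

      import math
      import random

      random.seed(5)
      n, shift = 20_000, 4.0

      def std_normal_pdf(x):
          return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

      total = 0.0
      for _ in range(n):
          x = random.gauss(shift, 1.0)     # proposal: N(shift, 1)
          if x > 4.0:
              # weight = target density / proposal density
              total += std_normal_pdf(x) / std_normal_pdf(x - shift)
      print(f"importance-sampling estimate = {total / n:.3e} (exact ~ 3.17e-5)")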

  5. Transition probabilities, oscillator strengths and lifetimes for singly ionized magnesium

    International Nuclear Information System (INIS)

    The electric dipole transition probabilities, oscillator strengths and lifetimes have been calculated using the weakest bound electron potential model theory (WBEPMT) for singly ionized magnesium. In the calculations both multiplet and fine structure transitions are studied. We have employed both the numerical Coulomb approximation (NCA) method and numerical non-relativistic Hartree-Fock (NRHF) wave functions for expectation values of radii. The calculated oscillator strengths and lifetimes have been compared with MCHF results given by Fischer et al. (2006). A good agreement has been obtained with the MCHF results. Moreover, some new transition probabilities, oscillator strengths and lifetime values, not existing in the data bases for highly excited levels in singly ionized magnesium, have been obtained using this method.

  6. Brief communication: On direct impact probability of landslides on vehicles

    Science.gov (United States)

    Nicolet, Pierrick; Jaboyedoff, Michel; Cloutier, Catherine; Crosta, Giovanni B.; Lévy, Sébastien

    2016-04-01

    When calculating the risk of railway or road users being killed by a natural hazard, one has to calculate a temporal-spatial probability, i.e. the probability of a vehicle being in the path of the falling mass when the mass falls, or the expected number of affected vehicles in case of such an event. To calculate this, different methods are used in the literature, and, most of the time, they consider only the dimensions of the falling mass or the dimensions of the vehicles. Some authors do, however, consider both dimensions at the same time, and the use of their approach is recommended. Finally, a method considering an impact on the front of the vehicle is discussed.
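
    A back-of-the-envelope version of the recommended approach, with illustrative numbers: the expected number of affected vehicles scales with the combined length of the falling mass and the vehicle, since a vehicle occupies the hazard zone for (vehicle length + mass width) / speed seconds.

      traffic = 5000.0 / (24 * 3600)   # vehicles per second (5000 per day, assumed)
      speed = 80.0 / 3.6               # vehicle speed in m/s (80 km/h, assumed)
      l_vehicle = 4.5                  # vehicle length, m
      l_mass = 10.0                    # width of the falling mass along the road, m

      n_affected = traffic * (l_vehicle + l_mass) / speed
      print(f"expected affected vehicles per event ~ {n_affected:.4f}")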

  7. Domestic wells have high probability of pumping septic tank leachate

    Science.gov (United States)

    Horn, J. E.; Harter, T.

    2011-06-01

    Onsite wastewater treatment systems such as septic systems are common in rural and semi-rural areas around the world; in the US, about 25-30 % of households are served by a septic system and a private drinking water well. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. Particularly in areas with small lots, and thus a high septic system density, these typically shallow wells are prone to contamination by septic system leachate. Typically, mass balance approaches are used to determine a maximum septic system density that would prevent contamination of the aquifer. In this study, we estimate the probability of a well partially pumping septic system leachate. A detailed groundwater and transport model is used to calculate the capture zone of a typical drinking water well. A spatial probability analysis is performed to assess the probability that a capture zone overlaps with a septic system drainfield, depending on aquifer properties, lot and drainfield size. We show that a high septic system density implies a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We conclude that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances which experience limited attenuation, and for those that are harmful even at low concentrations.
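
    The overlap step can be caricatured geometrically. The sketch below is emphatically not the study's groundwater and transport model: it idealises the capture zone as a rectangular strip extending up-gradient from the well and places one drainfield uniformly at random in each lot of a grid, with all dimensions illustrative.

      import numpy as np

      rng = np.random.default_rng(42)

      def overlap_probability(n_lots=25, lot_m=60.0, zone_len_m=150.0,
                              zone_width_m=6.0, field_w_m=12.0, n_trials=20_000):
          """Monte Carlo probability that the up-gradient capture-zone strip
          of a centrally located well overlaps at least one drainfield."""
          domain = n_lots * lot_m
          well = np.array([domain / 2, domain / 2])
          gx, gy = np.meshgrid(np.arange(n_lots), np.arange(n_lots))
          gx, gy = gx.ravel(), gy.ravel()
          half_w = (zone_width_m + field_w_m) / 2
          hits = 0
          for _ in range(n_trials):
              fx = (gx + rng.uniform(0, 1, gx.size)) * lot_m
              fy = (gy + rng.uniform(0, 1, gy.size)) * lot_m
              dx, dy = fx - well[0], np.abs(fy - well[1])
              # drainfield centre inside the strip extending up-gradient (+x)
              if np.any((dx > 0) & (dx < zone_len_m) & (dy < half_w)):
                  hits += 1
          return hits / n_trials

      print(overlap_probability())   # high overlap probability at this density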

  8. Domestic wells have high probability of pumping septic tank leachate

    Directory of Open Access Journals (Sweden)

    J. E. Horn

    2011-06-01

    Full Text Available Onsite wastewater treatment systems such as septic systems are common in rural and semi-rural areas around the world; in the US, about 25–30 % of households are served by a septic system and a private drinking water well. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. Particularly in areas with small lots, and thus a high septic system density, these typically shallow wells are prone to contamination by septic system leachate. Typically, mass balance approaches are used to determine a maximum septic system density that would prevent contamination of the aquifer. In this study, we estimate the probability of a well partially pumping septic system leachate. A detailed groundwater and transport model is used to calculate the capture zone of a typical drinking water well. A spatial probability analysis is performed to assess the probability that a capture zone overlaps with a septic system drainfield, depending on aquifer properties, lot and drainfield size. We show that a high septic system density implies a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We conclude that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances which experience limited attenuation, and for those that are harmful even at low concentrations.

  9. THE BLACK HOLE FORMATION PROBABILITY

    International Nuclear Information System (INIS)

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  10. THE BLACK HOLE FORMATION PROBABILITY

    Energy Technology Data Exchange (ETDEWEB)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D., E-mail: dclausen@tapir.caltech.edu [TAPIR, Walter Burke Institute for Theoretical Physics, California Institute of Technology, Mailcode 350-17, Pasadena, CA 91125 (United States)

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  11. Exact Chi-Square and Fisher's Exact Probability Test for 3 by 2 Cross-Classification Tables.

    Science.gov (United States)

    Berry, Kenneth J.; Mielke, Paul W., Jr.

    1987-01-01

    Subroutines to calculate exact chi square and Fisher's exact probability tests are presented for 3 by 2 cross-classification tables. A nondirectional probability value for each test is computed recursively. (Author/GDC)
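
    The subroutines themselves are not reproduced in the record, but the same quantity can be computed by direct enumeration - the Freeman-Halton extension of Fisher's test - as in this sketch (suitable for small counts only, since it uses exact factorials):

      from math import factorial
      from itertools import product

      def fisher_exact_3x2(table):
          """Nondirectional exact probability for a 3x2 table (three [a, b]
          rows): the sum of hypergeometric point probabilities of all tables
          with the observed margins that are no more probable than the
          observed table."""
          rows = [sum(r) for r in table]
          cols = [sum(r[j] for r in table) for j in range(2)]
          n = sum(rows)

          def prob(t):
              num = 1.0
              for r in rows:
                  num *= factorial(r)
              for c in cols:
                  num *= factorial(c)
              den = float(factorial(n))
              for row in t:
                  for x in row:
                      den *= factorial(x)
              return num / den

          p_obs = prob(table)
          p_value = 0.0
          # With fixed margins, a 3x2 table is determined by its first column.
          for a, b, c in product(range(rows[0] + 1), range(rows[1] + 1),
                                 range(rows[2] + 1)):
              if a + b + c != cols[0]:
                  continue
              t = [[a, rows[0] - a], [b, rows[1] - b], [c, rows[2] - c]]
              p = prob(t)
              if p <= p_obs + 1e-12:   # tolerance for floating-point ties
                  p_value += p
          return p_value

      print(fisher_exact_3x2([[3, 1], [1, 3], [0, 4]]))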

  12. Estimation of failure probabilities of reactor pressure vessel

    International Nuclear Information System (INIS)

    Full text: Probabilistic structural analysis of components used in the nuclear industry is finding increasing popularity. One of the uses of this analysis is the estimation of the probability of failure over the lifetime of the structure, considering the time-dependent deteriorating mechanisms. The estimation of the probability of failure of nuclear reactor components over their service life is a very important issue. It is being used to optimize the design, optimize the schedules of in-service inspections, make decisions regarding fitness for service, and estimate residual life. This has traditionally been evaluated using sophisticated Monte Carlo simulation programs on the fastest available computers or on parallel processing machines. The time taken to make these calculations runs into days, as the probability of failure expected is less than 10^-6. The probability calculations involve the solution of a multi-dimensional definite integral. This paper proposes the use of Lepage's VEGAS numerical integration algorithm for the solution of these integrals. It essentially uses Monte Carlo simulation with adaptive importance sampling as the solution technique. The method is reliable and converges quickly. The paper demonstrates the use of this algorithm in estimating the failure probability of reactor components. The mode of failure considered is fracture mechanics. The deteriorating mechanisms considered are fatigue and embrittlement due to nuclear radiation. The probability of failure is obtained over the lifetime of the reactor. The results are compared with those obtained from Monte Carlo simulation, reported in the literature. The results show a very good match with the published literature. The time taken for calculations by the VEGAS algorithm is a few minutes on a Pentium-based personal computer.
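
    The VEGAS algorithm is available as G. P. Lepage's open-source Python package vegas (pip install vegas), so the approach can be tried directly. The integrand below is a toy stand-in for the paper's fracture-mechanics model: a failure indicator weighted by a bivariate standard normal density.

      import numpy as np
      import vegas  # G. P. Lepage's adaptive importance-sampling integrator

      def integrand(u):
          """Failure indicator times a bivariate standard normal density.
          Toy limit state: failure when 4 + u0 - 0.5*u1 < 0 (illustrative)."""
          pdf = np.exp(-0.5 * (u[0] ** 2 + u[1] ** 2)) / (2.0 * np.pi)
          return pdf if 4.0 + u[0] - 0.5 * u[1] < 0.0 else 0.0

      integ = vegas.Integrator([[-10.0, 10.0], [-10.0, 10.0]])
      integ(integrand, nitn=10, neval=50_000)            # adaptation runs (discarded)
      result = integ(integrand, nitn=10, neval=50_000)   # final estimate
      print(result)   # failure probability, roughly Phi(-4/sqrt(1.25)) ~ 1.7e-4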

  13. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan Cort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  14. Limit Theorems in Free Probability Theory I

    OpenAIRE

    Chistyakov, G. P.; Götze, F.

    2006-01-01

    Based on a new analytical approach to the definition of additive free convolution of probability measures on the real line, we prove free analogs of limit theorems for sums of non-identically distributed random variables in classical Probability Theory.

  15. Non-Equilibrium Random Matrix Theory : Transition Probabilities

    CERN Document Server

    Pedro, Francisco Gil

    2016-01-01

    In this letter we present an analytic method for calculating the transition probability between two random Gaussian matrices with given eigenvalue spectra in the context of Dyson Brownian motion. We show that in the Coulomb gas language, in the large $N$ limit, memory of the initial state is preserved in the form of a universal linear potential acting on the eigenvalues. We compute the likelihood of any given transition as a function of time, showing that as memory of the initial state is lost, transition probabilities converge to those of the static ensemble.

  16. Probability output of multi-class support vector machines

    Institute of Scientific and Technical Information of China (English)

    忻栋; 吴朝晖; 潘云鹤

    2002-01-01

    A novel approach to interpret the outputs of multi-class support vector machines is proposed in this paper. Using the geometrical interpretation of the classifying hyperplane and the distance of the pattern from the hyperplane, one can calculate the posterior probability in the binary classification case. This paper focuses on the probability output in the multi-class phase, where both the one-against-one and one-against-rest strategies are considered. Experiments on speaker verification showed that this method has high performance.
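
    The record does not give the exact mapping, so the sketch below shows a generic variant of the idea: signed distances from the one-against-rest hyperplanes are passed through a sigmoid and normalised. The fixed scale parameter stands in for a Platt-style fit on held-out data.

      import numpy as np
      from sklearn import datasets, svm

      X, y = datasets.load_iris(return_X_y=True)
      clf = svm.SVC(kernel="linear", decision_function_shape="ovr").fit(X, y)

      # Signed distances of each pattern from the one-against-rest hyperplanes.
      d = clf.decision_function(X[:3])

      # Sigmoid per class, then normalise across classes to get probabilities.
      scale = 1.0   # would normally be fitted on held-out data (Platt scaling)
      p = 1.0 / (1.0 + np.exp(-scale * d))
      p /= p.sum(axis=1, keepdims=True)
      print(np.round(p, 3))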

  17. Estimating Small Probabilities for Langevin Dynamics

    OpenAIRE

    Aristoff, David

    2012-01-01

    The problem of estimating small transition probabilities for overdamped Langevin dynamics is considered. A simplification of Girsanov's formula is obtained in which the relationship between the infinitesimal generator of the underlying diffusion and the change of probability measure corresponding to a change in the potential energy is made explicit. From this formula an asymptotic expression for transition probability densities is derived. Separately the problem of estimating the probability ...

  18. Probability distributions of landslide volumes

    Directory of Open Access Journals (Sweden)

    M. T. Brunetti

    2009-03-01

    Full Text Available We examine 19 datasets with measurements of landslide volume, V_L, for sub-aerial, submarine, and extraterrestrial mass movements. Individual datasets include from 17 to 1019 landslides of different types, including rock fall, rock slide, rock avalanche, soil slide, slide, and debris flow, with individual landslide volumes ranging over 10^-4 m^3 ≤ V_L ≤ 10^13 m^3. We determine the probability density of landslide volumes, p(V_L), using kernel density estimation. Each landslide dataset exhibits heavy-tailed (self-similar) behaviour in its frequency-size distribution, p(V_L) as a function of V_L, for failures exceeding a different threshold volume, V_L*, for each dataset. These non-cumulative heavy-tailed distributions for each dataset are negative power laws, with exponents 1.0 ≤ β ≤ 1.9, averaging β ≈ 1.3. The scaling behaviour of V_L for the ensemble of the 19 datasets extends over 17 orders of magnitude, and is independent of lithological characteristics, morphological settings, triggering mechanisms, length of period and extent of the area covered by the datasets, presence or lack of water in the failed materials, and magnitude of gravitational fields. We argue that the statistics of landslide volume is conditioned primarily on the geometrical properties of the slope or rock mass where failures occur. Differences in the values of the scaling exponents reflect the primary landslide types, with rock falls exhibiting a smaller scaling exponent (1.1 ≤ β ≤ 1.4) than slides and soil slides (1.5 ≤ β ≤ 1.9). We argue that the difference is a consequence of the disparity in the mechanics of rock falls and slides.
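
    The exponent β of such a non-cumulative power law can be estimated directly by maximum likelihood (the Hill estimator). The sketch below validates the estimator on synthetic volumes drawn from a known power law; it does not use the 19 datasets of the study.

      import numpy as np

      def powerlaw_exponent(volumes, v_min):
          """Maximum-likelihood (Hill) estimate of the exponent beta of a
          non-cumulative power law p(V) ~ V^(-beta) for V >= v_min, with its
          approximate standard error."""
          v = np.asarray(volumes, dtype=float)
          v = v[v >= v_min]
          beta = 1.0 + v.size / np.sum(np.log(v / v_min))
          stderr = (beta - 1.0) / np.sqrt(v.size)
          return beta, stderr

      # Synthetic check: inverse-CDF sampling from p(V) ~ V^-1.3 above v_min.
      rng = np.random.default_rng(7)
      v_min, beta_true = 1.0, 1.3
      u = rng.uniform(size=5000)
      samples = v_min * (1 - u) ** (-1.0 / (beta_true - 1.0))
      print(powerlaw_exponent(samples, v_min))   # ~ (1.3, small stderr)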

  19. Development of direct transmission probability method for criticality safety analysis

    International Nuclear Information System (INIS)

    We have developed a new deterministic Sn-type two-dimensional transport calculation method using direct transmission probabilities. The present method has an advantage in calculation accuracy for geometries with much void or many neutron-absorbing regions, because the paths of neutrons are calculated from generation to reaction without approximation. Checking calculations are carried out for a criticality safety problem of fuel assemblies in a spent fuel storage pool with neutron-absorbing materials, which show differences between the present method and the conventional Sn method of DOT3.5 in eigenvalues and flux distributions. Other checking calculations for a neutron shielding problem show the advantage of the present method over the conventional Sn methods from the viewpoint of ray effects. (author)

  20. On the measurement probability of quantum phases

    OpenAIRE

    Schürmann, Thomas

    2006-01-01

    We consider the probability by which quantum phase measurements of a given precision can be done successfully. The least upper bound of this probability is derived and the associated optimal state vectors are determined. The probability bound represents an unique and continuous transition between macroscopic and microscopic measurement precisions.

  1. Uniqueness in ergodic decomposition of invariant probabilities

    OpenAIRE

    Zimmermann, Dieter

    1992-01-01

    We show that for any set of transition probabilities on a common measurable space and any invariant probability, there is at most one representing measure on the set of extremal, invariant probabilities with the $\\sigma$-algebra generated by the evaluations. The proof uses nonstandard analysis.

  2. Equivalence of two orthogonalities between probability measures

    OpenAIRE

    Takatsu, Asuka

    2011-01-01

    Given any two probability measures on a Euclidean space with mean 0 and finite variance, we demonstrate that the two probability measures are orthogonal in the sense of Wasserstein geometry if and only if the two spaces spanned by the supports of each probability measure are orthogonal.

  3. Inferring Beliefs as Subjectively Imprecise Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.; Hole, Arna Risa; Rutström, E. Elisabeth

    2012-01-01

    We propose a method for estimating subjective beliefs, viewed as a subjective probability distribution. The key insight is to characterize beliefs as a parameter to be estimated from observed choices in a well-defined experimental task and to estimate that parameter as a random coefficient. The...... probabilities are indeed best characterized as probability distributions with non-zero variance....

  4. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  5. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  6. SPEI Calculator

    OpenAIRE

    Beguería, Santiago; Vicente Serrano, Sergio M.

    2009-01-01

    [EN] *Objectives: The program calculates time series of the Standardised Precipitation-Evapotranspiration Index (SPEI). *Technical Characteristics: The program is executed from the Windows console. From an input data file containing monthly time series of precipitation and mean temperature, plus the geographic coordinates of the observatory, the program computes the SPEI accumulated at the time interval specified by the user, and generates a new data file with the SPEI time series...
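
    For orientation, a deliberately simplified SPEI-like computation is sketched below: it accumulates the climatic water balance P - PET over a chosen number of months and standardises each calendar month empirically. The published SPEI instead fits a log-logistic distribution to the accumulated balance, so this is a structural sketch only.

      import numpy as np

      def simplified_spei(precip, pet, scale=3):
          """SPEI-like index: k-month backward sums of D = P - PET,
          standardised per calendar month with empirical z-scores."""
          d = np.asarray(precip, float) - np.asarray(pet, float)
          dk = np.convolve(d, np.ones(scale), mode="valid")      # sliding sums
          dk = np.concatenate([np.full(scale - 1, np.nan), dk])  # align with input
          spei = np.full_like(dk, np.nan)
          months = np.arange(d.size) % 12
          for m in range(12):
              idx = (months == m) & ~np.isnan(dk)
              mu, sigma = dk[idx].mean(), dk[idx].std(ddof=1)
              spei[idx] = (dk[idx] - mu) / sigma
          return spei

      # 30 years of synthetic monthly data (mm); values are illustrative:
      rng = np.random.default_rng(0)
      p = rng.gamma(2.0, 30.0, size=360)
      pet = 60 + 40 * np.sin(2 * np.pi * np.arange(360) / 12)
      s = simplified_spei(p, pet)
      print(np.nanmin(s), np.nanmax(s))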

  7. Burnout calculation

    International Nuclear Information System (INIS)

    The effect of different system parameters on the critical heat flux density is reviewed in order to give an initial view of the role of several parameters. A thorough analysis of different equations is carried out for calculating burnout in steam-water flows in uniformly heated tubes, annular and rectangular channels, and rod bundles. The effect of heat flux density distribution and flux twisting on burnout, and storage determination according to burnout, are commented on

  8. Relativistic calculations of 3s2 1S0-3s3p 1P1 and 3s2 1S0-3s3p 3P1,2 transition probabilities in the Mg isoelectronic sequence

    Institute of Scientific and Technical Information of China (English)

    Cheng Cheng; Gao Xiang; Qing Bo; Zhang Xiao-Le; Li Jia-Ming

    2011-01-01

    Using the multi-configuration Dirac-Fock self-consistent field method and the relativistic configuration-interaction method, calculations of transition energies, oscillator strengths and rates are performed for the 3s2 1S0-3s3p 1P1 spin-allowed transition and the 3s2 1S0-3s3p 3P1,2 intercombination and magnetic quadrupole transitions in the Mg isoelectronic sequence (Mg I, Al II, Si III, P IV and S V). Electron correlations are treated adequately, including intravalence electron correlations. The influence of the Breit interaction on oscillator strengths and transition energies is investigated. Quantum electrodynamics corrections are also included. The calculated results are found to be in good agreement with the experimental data and other theoretical calculations.

  9. Transition probabilities and radiative lifetimes of Mg III

    Science.gov (United States)

    Alonso-Medina, A.; Colón, C.; Moreno-Díaz, C.

    2015-03-01

    Transition probabilities have been calculated for 365 lines arising from the 2p5 ns (n = 3, 4, 5), 2p5 np (n = 3, 4), 2p5 nd (n = 3, 4), 2p5 nf (n = 4, 5) and 2p5 5g configurations of Mg III, together with radiative lifetimes corresponding to 89 levels. These values were obtained in intermediate coupling (IC) by using ab initio relativistic Hartree-Fock (HFR) calculations. Later, we use the standard method of least-squares fitting of experimental energy levels for the IC calculations by means of Cowan's computer codes. The vast majority of the calculated transition probabilities correspond to lines lying in the ultraviolet (UV) range, which are of high interest in astrophysics. Our results are compared to those previously reported in the literature. Furthermore, the values of transition probabilities for levels of the 2p5 4d, 2p5 nf (n = 4, 5) and 2p5 5g configurations are presented for the first time. In light of these findings, it is possible to extend the range of wavelengths that can be used to estimate the temperature in plasma diagnostics. In addition, our results for radiative lifetimes have been compared to the available experimental values.

  10. Positron creation probabilities in low-energy heavy-ion collisions

    International Nuclear Information System (INIS)

    The previously developed technique for the calculation of ionization probabilities in low-energy heavy-ion collisions [A.I. Bondarev et al., Phys. Scr. T156, 014054 (2013)] is extended to the evaluation of positron creation probabilities. The differential probabilities are evaluated by two alternative methods. The first one uses hydrogenic continuum wave functions, while the second one uses discretized continuum wave functions corresponding to a finite basis expansion. These methods are applied to the calculation of the differential positron creation probabilities in the U91+(1s)-U92+ collision. The results obtained by both methods are found to be in good agreement. (authors)

  11. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  12. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  13. Lectures on probability and statistics. Revision

    Energy Technology Data Exchange (ETDEWEB)

    Yost, G.P.

    1985-06-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. They begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. They finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another. Hopefully, the reader will come away from these notes with a feel for some of the problems and uncertainties involved. Although there are standard approaches, most of the time there is no cut and dried "best" solution - "best" according to every criterion.

  14. Lectures on probability and statistics. Revision

    International Nuclear Information System (INIS)

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. They begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. They finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another. Hopefully, the reader will come away from these notes with a feel for some of the problems and uncertainties involved. Although there are standard approaches, most of the time there is no cut and dried "best" solution - "best" according to every criterion.

  15. Negative probabilities and information gain in weak measurements

    International Nuclear Information System (INIS)

    We study the outcomes in a general measurement with postselection, and derive upper bounds for the pointer readings in weak measurement. The probabilities inferred from weak measurements change along with the coupling strength; the true probabilities can be obtained when the coupling is strong enough. By calculating the information gain of the measuring device about which path the particles pass through, we show that the "negative probabilities" only emerge in cases where the information gain is small due to very weak coupling between the measuring device and the particles. When the coupling strength increases, we can unambiguously determine whether a particle passes through a given path every time, hence the average shifts always represent true probabilities, and the strange "negative probabilities" disappear.

  16. Some theoretical aspects of the group-IIIA-ion atomic clocks: Intercombination transition probabilities

    International Nuclear Information System (INIS)

    The main focus of this paper is the theoretical study of the 3P1→1S0 intercombination transition probabilities of the group-IIIA ions that are excellent candidates for high-accuracy atomic clocks. The importance of relativistic effects on the intercombination transition probabilities is made apparent by comparing their calculated values with those of the allowed 1P1→1S0 transition probabilities. In striking contrast to the allowed transition probabilities, the intercombination transition probabilities exhibit a strong Z dependence.

  17. Calculator calculus

    CERN Document Server

    McCarty, George

    1982-01-01

    How THIS BOOK DIFFERS This book is about the calculus. What distinguishes it, however, from other books is that it uses the pocket calculator to illustrate the theory. A computation that requires hours of labor when done by hand with tables is quite inappropriate as an example or exercise in a beginning calculus course. But that same computation can become a delicate illustration of the theory when the student does it in seconds on his calculator. Furthermore, the student's own personal involvement and easy accomplishment give him reassurance and encouragement. The machine is like a microscope, and its magnification is a hundred millionfold. We shall be interested in limits, and no stage of numerical approximation proves anything about the limit. However, the derivative of f(x) = 67.5^x, for instance, acquires real meaning when a student first appreciates its values as numbers, as limits (a quick example is 1.1^10, 1.01^100, 1.001^1000, ...). Another example is t = 0.1, 0.01, ... in the functio...

  18. Reliability calculations

    International Nuclear Information System (INIS)

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability Monte Carlo simulation programs are used especially in analysis of very complex systems. In order to increase the applicability of the programs variance reduction techniques can be applied to speed up the calculation process. Variance reduction techniques have been studied and procedures for implementation of importance sampling are suggested. (author)

  19. Generic Degraded Configuration Probability Analysis for the Codisposal Waste Package

    International Nuclear Information System (INIS)

    In accordance with the technical work plan, "Technical Work Plan For: Department of Energy Spent Nuclear Fuel Work Packages" (CRWMS M and O 2000c), this Analysis/Model Report (AMR) is developed for the purpose of screening out degraded configurations for U.S. Department of Energy (DOE) spent nuclear fuel (SNF) types. It performs the degraded configuration parameter and probability evaluations of the overall methodology specified in the "Disposal Criticality Analysis Methodology Topical Report" (YMP 2000, Section 3) to qualifying configurations. Degradation analyses are performed to assess realizable parameter ranges and physical regimes for configurations. Probability calculations are then performed for configurations characterized by k_eff in excess of the Critical Limit (CL). The scope of this document is to develop a generic set of screening criteria or models to screen out degraded configurations having potential for exceeding a criticality limit. The developed screening criteria include arguments based on physical/chemical processes and probability calculations and apply to DOE SNF types when codisposed with the high-level waste (HLW) glass inside a waste package. The degradation takes place inside the waste package and is long after repository licensing has expired. The emphasis of this AMR is on degraded configuration screening and the probability analysis is one of the approaches used for screening. The intended use of the model is to apply the developed screening criteria to each DOE SNF type following the completion of the degraded mode criticality analysis internal to the waste package.

  20. Cisplatin and radiation: Interaction probabilities and therapeutic possibilities

    International Nuclear Information System (INIS)

    This paper examines the probability of interactions occurring between drug lesions and radiation lesions in DNA for the cytotoxic and radiosensitizing agent cisplatin. The number of cisplatin-induced DNA adducts and radiation-induced strand breaks after a given dose of each agent are known for given cell systems, from which the probability that these lesions will interact can be estimated. Results of these calculations indicate that the probability of interaction could be high, depending on the distance over which two lesions can interact and the probability of repair of the interaction lesion. Calculated lesion numbers have been compared with known data on radiation modification, including illustrations of inconsistencies. In the second part of the paper, ways in which combined therapy with cisplatin and radiation can be improved are described. Development of methods to predict which types of tumor and which individual tumors within a given type are sensitive to the cytotoxic and radiosensitizing effects of the drug would aid rational selection of patients for combination treatments. Immunocytochemical methods sensitive enough to monitor cisplatin-DNA interactions in patients are available and may be useful in this context. The delivery and maintenance of higher tumor concentrations of radiosensitizer offers a further possibility for improvement. Studies of intratumoral injection of cisplatin have shown promise for achieving this goal while limiting normal tissue toxicity.

  1. Collision strengths and transition probabilities for Co III forbidden lines

    Science.gov (United States)

    Storey, P. J.; Sochi, Taha

    2016-07-01

    In this paper we compute the collision strengths and their thermally averaged Maxwellian values for electron transitions between the 15 lowest levels of doubly ionized cobalt, Co2+, which give rise to forbidden emission lines in the visible and infrared regions of the spectrum. The calculations also include transition probabilities and predicted relative line emissivities. The data are particularly useful for analysing the thermodynamic conditions of supernova ejecta.

  2. Collision strengths and transition probabilities for Co III forbidden lines

    CERN Document Server

    Storey, P J

    2016-01-01

    In this paper we compute the collision strengths and their thermally-averaged Maxwellian values for electron transitions between the fifteen lowest levels of doubly-ionised cobalt, Co^{2+}, which give rise to forbidden emission lines in the visible and infrared regions of the spectrum. The calculations also include transition probabilities and predicted relative line emissivities. The data are particularly useful for analysing the thermodynamic conditions of supernova ejecta.

  3. Potential energy surface and fusion probability in DNS model

    International Nuclear Information System (INIS)

    The Potential Energy Surface (PES) for particle exchange in the di-nuclear system (DNS) is studied in detail. It is found that the nuclear deformation effect can change the shape of the PES significantly. The dynamical deformation as a function of the reaction time is investigated in a simple model, and the authors found that its variation with time is dramatic. The fusion probabilities P_CN of some reaction channels based on the mechanism of cold fusion are also calculated

  4. Absolute Kr I and Kr II transition probabilities

    International Nuclear Information System (INIS)

    Transition probabilities for 11 KrI and 9 KrII lines between 366.5 and 599.3nm were obtained from measurements with a wall-stabilised arc at atmospheric pressure in pure krypton. The population densities of the excited krypton levels were calculated under the assumption of LTE from electron densities measured by laser interferometry. The uncertainties for the KrI and the KrII data are 15 and 25% respectively. (author)

  5. Classification of Forest Type From Space Using Recollision Probability

    Science.gov (United States)

    Schull, M. A.; Ganguly, S.; Samanta, A.; Myneni, R. B.; Knyazikhin, Y.

    2008-12-01

    A classification of forest types has been produced based on the scattering characteristics within the forest canopy. The scattering processes are described by the spectral invariants of radiative transfer theory. The spectral invariants, the recollision and escape probabilities, explain the effect that canopy structural hierarchy has on the bidirectional reflectance factor (BRF). Here we show that the recollision probability can delineate between needleleaf and broadleaf forest types given the same effective LAI. Since the recollision probability tells about the multiple scattering in the canopy, we have found that the recollision probability is sensitive to hierarchical canopy structure. Given that needleleaf canopies have one more hierarchical level (needles within the shoot, as opposed to a flat leaf), there is more scattering within a needleleaf than a broadleaf forest for the same effective LAI, allowing for separation between forest types. Promising results were attained, yielding a high level of confidence, by simply applying a threshold to recollision probability values calculated from AVIRIS hyperspectral data. The results are shown for AVIRIS campaigns in the Northeast region of the US flown in August of 2003.

  6. On the probability of cure for heavy-ion radiotherapy

    Science.gov (United States)

    Hanin, Leonid; Zaider, Marco

    2014-07-01

    The probability of a cure in radiation therapy (RT)—viewed as the probability of eventual extinction of all cancer cells—is unobservable, and the only way to compute it is through modeling the dynamics of the cancer cell population during and post-treatment. The conundrum at the heart of biophysical models aimed at such prospective calculations is the absence of information on the initial size of the subpopulation of clonogenic cancer cells (also called stem-like cancer cells) that largely determines the outcome of RT, in both individual and population settings. Other relevant parameters (e.g. potential doubling time, cell loss factor and survival probability as a function of dose) are, at least in principle, amenable to empirical determination. In this article we demonstrate that, for heavy-ion RT, microdosimetric considerations (justifiably ignored in conventional RT) combined with an expression for the clone extinction probability obtained from a mechanistic model of radiation cell survival lead to useful upper bounds on the size of the pre-treatment population of clonogenic cancer cells as well as upper and lower bounds on the cure probability. The main practical impact of these limiting values is the ability to make predictions about the probability of a cure for a given population of patients treated with newer, still unexplored treatment modalities from the empirically determined probability of a cure for the same or similar population resulting from conventional low linear energy transfer (typically photon/electron) RT. We also propose that the current trend to deliver a lower total dose in a smaller number of fractions with larger-than-conventional doses per fraction has physical limits that must be understood before embarking on a particular treatment schedule.
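
    The clone-extinction picture described here is commonly formalised with the Poisson tumour-control model, which makes the dependence on the unknown initial clonogen number N0 explicit. The sketch below uses the standard linear-quadratic survival curve with illustrative parameters; it is not the microdosimetric bound construction of the paper.

      import numpy as np

      def cure_probability(n0, alpha, beta, dose_per_fx, n_fractions):
          """Poisson tumour-control (clone-extinction) probability under the
          linear-quadratic model: TCP = exp(-N0 * SF_total), where SF per
          fraction is exp(-alpha*d - beta*d^2)."""
          sf_per_fx = np.exp(-alpha * dose_per_fx - beta * dose_per_fx**2)
          return np.exp(-n0 * sf_per_fx**n_fractions)

      # Sensitivity to the unknown clonogen number (30 x 2 Gy, illustrative
      # alpha = 0.3 /Gy, beta = 0.03 /Gy^2):
      for n0 in (1e5, 1e7, 1e9):
          print(f"N0 = {n0:.0e}: TCP = {cure_probability(n0, 0.3, 0.03, 2.0, 30):.3f}")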

  7. On the probability of cure for heavy-ion radiotherapy

    International Nuclear Information System (INIS)

    The probability of a cure in radiation therapy (RT)—viewed as the probability of eventual extinction of all cancer cells—is unobservable, and the only way to compute it is through modeling the dynamics of the cancer cell population during and post-treatment. The conundrum at the heart of biophysical models aimed at such prospective calculations is the absence of information on the initial size of the subpopulation of clonogenic cancer cells (also called stem-like cancer cells) that largely determines the outcome of RT, in both individual and population settings. Other relevant parameters (e.g. potential doubling time, cell loss factor and survival probability as a function of dose) are, at least in principle, amenable to empirical determination. In this article we demonstrate that, for heavy-ion RT, microdosimetric considerations (justifiably ignored in conventional RT) combined with an expression for the clone extinction probability obtained from a mechanistic model of radiation cell survival lead to useful upper bounds on the size of the pre-treatment population of clonogenic cancer cells as well as upper and lower bounds on the cure probability. The main practical impact of these limiting values is the ability to make predictions about the probability of a cure for a given population of patients treated with newer, still unexplored treatment modalities from the empirically determined probability of a cure for the same or similar population resulting from conventional low linear energy transfer (typically photon/electron) RT. We also propose that the current trend to deliver a lower total dose in a smaller number of fractions with larger-than-conventional doses per fraction has physical limits that must be understood before embarking on a particular treatment schedule. (paper)

  8. Probability and Quantum Paradigms: the Interplay

    International Nuclear Information System (INIS)

    Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non Boolean structure and non positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken and a suggested variant interpretation of wave functions based on photo detection physics proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well tested concepts and technology

  9. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections

  10. Probability and Quantum Paradigms: the Interplay

    Science.gov (United States)

    Kracklauer, A. F.

    2007-12-01

    Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non Boolean structure and non positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken and a suggested variant interpretation of wave functions based on photo detection physics proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well tested concepts and technology.

  11. Introduction: Research and Developments in Probability Education

    OpenAIRE

    Manfred Borovcnik; Ramesh Kapadia

    2009-01-01

    In the topic study group on probability at ICME 11 a variety of ideas on probability education were presented. Some of the papers have been developed further by the driving ideas of interactivity and use of the potential of electronic publishing. As often happens, the medium of research influences the results, and thus – not surprisingly – the research changed its character during this process. This paper provides a summary of the main threads of research in probability education across the wor...

  12. Probabilities and signalling in quantum field theory

    OpenAIRE

    Dickinson, Robert; Forshaw, Jeff; Millington, Peter

    2016-01-01

    We present an approach to computing probabilities in quantum field theory for a wide class of source-detector models. The approach works directly with probabilities and not with squared matrix elements, and the resulting probabilities can be written in terms of expectation values of nested commutators and anti-commutators. We present results that help in the evaluation of these, including an expression for the vacuum expectation values of general nestings of commutators and anti-commutators i...

  13. Local Percolation Probabilities for a Natural Sandstone

    OpenAIRE

    Hilfer, R.; Rag, T.; Virgi, B.

    1996-01-01

    Local percolation probabilities are used to characterize the connectivity in porous and heterogeneous media. Together with local porosity distributions, they allow one to predict transport properties \cite{hil91d}. While local porosity distributions are readily obtained, measurements of the local percolation probabilities are more difficult and have not been attempted previously. First measurements of three-dimensional local porosity distributions and percolation probabilities from a pore space re...

  14. Are All Probabilities Fundamentally Quantum Mechanical?

    OpenAIRE

    Pradhan, Rajat Kumar

    2011-01-01

    The subjective and the objective aspects of probabilities are incorporated in a simple duality axiom inspired by observer participation in quantum theory. Transcending the classical notion of probabilities, it is proposed and demonstrated that all probabilities may be fundamentally quantum mechanical in the sense that they may all be derived from the corresponding amplitudes. The classical coin-toss and the quantum double slit interference experiments are discussed as illustrative prototype e...

  15. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how the...... motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions....

  16. Lagrangian Probability Distributions of Turbulent Flows

    OpenAIRE

    Friedrich, R.

    2002-01-01

    We outline a statistical theory of turbulence based on the Lagrangian formulation of fluid motion. We derive a hierarchy of evolution equations for Lagrangian N-point probability distributions as well as a functional equation for a suitably defined probability functional which is the analog of Hopf's functional equation. Furthermore, we address the derivation of a generalized Fokker-Planck equation for the joint velocity-position probability density of N fluid particles.

  17. Quantum Statistical Mechanics. III. Equilibrium Probability

    OpenAIRE

    Attard, Phil

    2014-01-01

    Given are a first-principles derivation and formulation of the probabilistic concepts that underlie equilibrium quantum statistical mechanics. The transition to non-equilibrium probability is traversed briefly.

  18. Some New Results on Transition Probability

    Institute of Scientific and Technical Information of China (English)

    Yu Quan XIE

    2008-01-01

    In this paper, we study the basic properties of stationary transition probability of Markov processes on a general measurable space (E, ε), such as the continuity, maximum probability, zero point, positive probability set standardization, and obtain a series of important results such as Continuity Theorem, Representation Theorem, Levy Theorem and so on. These results are very useful for us to study stationary tri-point transition probability on a general measurable space (E, ε). Our main tools such as Egoroff's Theorem, Vitali-Hahn-Saks's Theorem and the theory of atomic set and well-posedness of measure are also very interesting and fashionable.

  19. Time and probability in quantum cosmology

    Energy Technology Data Exchange (ETDEWEB)

    Greensite, J. (San Francisco State Univ., CA (USA). Dept. of Physics and Astronomy)

    1990-10-01

    A time function, an exactly conserved probability measure, and a time-evolution equation (related to the Wheeler-DeWitt equation) are proposed for quantum cosmology. The time-integral of the probability measure is the measure proposed by Hawking and Page. The evolution equation reduces to the Schroedinger equation, and the probability measure to the Born measure, in the WKB approximation. The existence of this 'Schroedinger limit', which involves a cancellation of time-dependencies in the probability density between the WKB prefactor and integration measure, is a consequence of Laplacian factor ordering in the Wheeler-DeWitt equation. (orig.).

  20. Real analysis and probability solutions to problems

    CERN Document Server

    Ash, Robert P

    1972-01-01

    Real Analysis and Probability: Solutions to Problems presents solutions to problems in real analysis and probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability; the interplay between measure theory and topology; conditional probability and expectation; the central limit theorem; and strong laws of large numbers in terms of martingale theory.Comprised of eight chapters, this volume begins with problems and solutions for the theory of measure and integration, followed by various applications of the basic integration theory.

  1. Bayesian logistic betting strategy against probability forecasting

    CERN Document Server

    Kumon, Masayuki; Takemura, Akimichi; Takeuchi, Kei

    2012-01-01

    We propose a betting strategy based on Bayesian logistic regression modeling for the probability forecasting game in the framework of game-theoretic probability by Shafer and Vovk (2001). We prove some results concerning the strong law of large numbers in the probability forecasting game with side information based on our strategy. We also apply our strategy to assessing the quality of probability forecasting by the Japan Meteorological Agency. We find that our strategy beats the agency by exploiting its tendency to avoid clear-cut forecasts.

  2. Entailment in Probability of Thresholded Generalizations

    OpenAIRE

    Bamber, Donald

    2013-01-01

    A nonmonotonic logic of thresholded generalizations is presented. Given propositions A and B from a language L and a positive integer k, the thresholded generalization A=>B{k} means that the conditional probability P(B|A) falls short of one by no more than c*d^k. A two-level probability structure is defined. At the lower level, a model is defined to be a probability function on L. At the upper level, there is a probability distribution over models. A definition is given of what it means for a...

  3. Advantages of the probability amplitude over the probability density in quantum mechanics

    OpenAIRE

    Kurihara, Yoshimasa; Quach, Nhi My Uyen

    2013-01-01

    We discuss reasons why a probability amplitude, which becomes a probability density after squaring, is considered as one of the most basic ingredients of quantum mechanics. First, the Heisenberg/Schrodinger equation, an equation of motion in quantum mechanics, describes a time evolution of the probability amplitude rather than of a probability density. There may be reasons why dynamics of a physical system are described by amplitude. In order to investigate one role of the probability amplitu...

  4. Photon recollision probability in heterogeneous forest canopies: Compatibility with a hybrid GO model

    Science.gov (United States)

    Mõttus, Matti; Stenberg, Pauline; Rautiainen, Miina

    2007-02-01

    Photon recollision probability, or the probability by which a photon scattered from a phytoelement in the canopy will interact within the canopy again, has previously been shown to approximate well the fractions of radiation scattered and absorbed by homogeneous plant covers. To test the applicability of the recollision probability theory to more complicated canopy structures, a set of modeled stands was generated using allometric relations for Scots pine trees growing in central Finland. A hybrid geometric-optical model (FRT, or the Kuusk-Nilson model) was used to simulate the reflectance and transmittance of the modeled forests consisting of ellipsoidal tree crowns and, on the basis of the simulations, the recollision probability (p) was calculated for the canopies. As the recollision probability theory assumes energy conservation, a method to check and ensure energy conservation in the model was first developed. The method enabled matching the geometric-optical and two-stream submodels of the hybrid FRT model, and more importantly, allowed calculation of the recollision probability from model output. Next, to assess the effect of canopy structure on the recollision probability, the obtained p-values were compared to those calculated for structureless (homogeneous) canopies with similar effective LAI using a simple two-stream radiation transfer model. Canopy structure was shown to increase the recollision probability, implying that structured canopies absorb more efficiently the radiation interacting with the canopy, and it also changed the escape probabilities for different scattering orders. Most importantly, the study demonstrated that the concept of recollision probability is coherent with physically based canopy reflectance models which use the classical radiative transfer theory. Furthermore, it was shown that as a first approximation, the recollision probability can be considered to be independent of wavelength. Finally, different algorithms for
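
    The central relation behind the recollision-probability theory can be stated in one line: if omega is the single-scattering leaf albedo and p the recollision probability, the fraction of intercepted radiation that eventually escapes the canopy is W = omega(1 - p)/(1 - p*omega). The sketch below evaluates this standard spectral-invariant relation for illustrative values, showing how the larger p of structured canopies increases absorption, as found above.

      def canopy_scattering(leaf_albedo, p):
          """Fraction of intercepted radiation that eventually escapes the
          canopy, from the spectral-invariant relation
          W = omega * (1 - p) / (1 - p * omega)."""
          return leaf_albedo * (1.0 - p) / (1.0 - p * leaf_albedo)

      # Illustrative NIR leaf albedo ~0.85; structured (e.g. needleleaf)
      # canopies have larger p than homogeneous ones at the same effective LAI.
      for p in (0.4, 0.6, 0.8):
          w = canopy_scattering(0.85, p)
          print(f"p = {p:.1f}: scattered {w:.2f}, absorbed {1 - w:.2f}")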

  5. Probability of paternity in paternity testing using the DNA fingerprint procedure.

    Science.gov (United States)

    Honma, M; Ishiyama, I

    1989-01-01

    For the purpose of applying DNA fingerprinting to paternity testing, we established a general formula to calculate the probability of paternity and evaluated the ability of DNA fingerprinting to determine paternity. PMID:2591980
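
    The general formula itself is not reproduced in the abstract. A standard Bayesian form that such calculations take (Essen-Moller, with likelihood ratios multiplied over independent bands) is sketched below with made-up band ratios; it illustrates the type of calculation, not the authors' specific DNA fingerprint statistics.

      def paternity_probability(pi, prior=0.5):
          """Essen-Moller probability of paternity from a paternity index PI,
          the likelihood ratio of the genetic evidence under paternity versus
          non-paternity: W = PI*prior / (PI*prior + (1 - prior))."""
          return pi * prior / (pi * prior + (1 - prior))

      # Likelihood ratios multiply over independent bands/loci (illustrative):
      band_ratios = [2.0, 4.0, 1.5, 8.0]
      pi = 1.0
      for r in band_ratios:
          pi *= r
      print(f"combined PI = {pi:.0f}, "
            f"probability of paternity = {paternity_probability(pi):.4f}")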

  6. Noise figure and photon probability distribution in Coherent Anti-Stokes Raman Scattering (CARS)

    OpenAIRE

    Dimitropoulos, D.; Solli, D. R.; Claps, R.; Jalali, B.

    2006-01-01

    The noise figure and photon probability distribution are calculated for coherent anti-Stokes Raman scattering (CARS) where an anti-Stokes signal is converted to Stokes. We find that the minimum noise figure is ~ 3dB.

  7. Probability theory and statistical applications a profound treatise for self-study

    CERN Document Server

    Zörnig, Peter

    2016-01-01

    This accessible and easy-to-read book provides many examples to illustrate diverse topics in probability and statistics, from initial concepts up to advanced calculations. Special attention is devoted, e.g., to independence of events, inequalities in probability, and functions of random variables. The book is directed to students of mathematics, statistics, engineering, and other quantitative sciences.

  8. Impact-parameter dependence of ionization probabilities of the L sub-shells

    International Nuclear Information System (INIS)

    Ionization probabilities of the L(I) and L(III) subshells have been calculated for Pb (Z = 82) under proton impact within the framework of binary encounter approximation (BEA) theory. The results obtained have been compared with probabilities based on semi-classical approximation (SCA) theory. (Auth.)

  9. Inclusive spectra of hadrons created by color tube fission 1. Probability of tube fission

    OpenAIRE

    Gedalin, E. V.

    1997-01-01

    The probability of color tube fission, including corrections for small oscillations of the tube surface, is obtained to pre-exponential accuracy on the basis of a previously constructed color tube model. Using these expressions, the probability of tube fission at $n$ points is obtained, which is the basis for calculating the inclusive spectra of produced hadrons.

  10. Technique for calculating the leak probability of a steam generator due to destruction of the studs of a collector cover

    International Nuclear Information System (INIS)

    An approach for estimating the leak probability of a flanged joint due to the destruction of fastening studs is described. The approach consists of two stages: at the first stage, the probability of destroying a single stud is calculated; at the second, the probability of the different arrangements of intact and destroyed studs. The calculation of the leak probability in the collector cover area of the PGV-1000 steam generator is given as an example of the developed approach.
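
    A minimal sketch of the two-stage logic, under loudly hypothetical assumptions: the per-stud destruction probability q stands in for the stage-one result (in the paper it would come from a structural analysis), and the leak criterion used below (two or more adjacent destroyed studs on a circular flange) is an illustrative choice, not the paper's criterion.

```python
# Stage 1: per-stud destruction probability q (assumed given here).
# Stage 2: sum the probabilities of stud arrangements that open a leak path.
# The "two adjacent destroyed studs" criterion is purely illustrative.

from itertools import product

def leak_probability(n_studs: int, q: float) -> float:
    total = 0.0
    for pattern in product((0, 1), repeat=n_studs):      # 1 = destroyed stud
        k = sum(pattern)
        p_pattern = q**k * (1.0 - q)**(n_studs - k)      # independent failures
        adjacent = any(pattern[i] and pattern[(i + 1) % n_studs]
                       for i in range(n_studs))          # circular flange
        if adjacent:
            total += p_pattern
    return total

print(leak_probability(n_studs=12, q=1e-3))   # ~1.2e-5 for this toy criterion
```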

  11. Probability: A Matter of Life and Death

    Science.gov (United States)

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…
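
    As a concrete illustration of the life-table arithmetic, the sketch below computes survivorship and life expectancy at birth from hypothetical one-year death probabilities q_x; the schedule is invented, not taken from any real population.

```python
# From one-year death probabilities q_x, build survivorship and the period
# life expectancy at birth. The q_x schedule below is a toy example.

def life_expectancy_at_birth(qx):
    alive = 1.0             # proportion of the cohort alive at age 0
    person_years = 0.0
    for q in qx:
        person_years += alive * (1.0 - 0.5 * q)   # deaths spread over the year
        alive *= (1.0 - q)
    return person_years

# Toy mortality schedule rising with age; the final q = 1 closes the table.
qx = [0.01] * 40 + [0.02] * 30 + [0.10] * 19 + [1.0]
print(f"e0 = {life_expectancy_at_birth(qx):.1f} years")
```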

  12. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing ... as well as for bimodal processes with two dominating frequencies in the structural response.

  13. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship

  14. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed. The present notes outline a method for evaluation of the probability of...

  15. Correlation as Probability of Common Descent.

    Science.gov (United States)

    Falk, Ruma; Well, Arnold D.

    1996-01-01

    One interpretation of the Pearson product-moment correlation ("r"), correlation as the probability of originating from common descent, important to the genetic measurement of inbreeding, is examined. The conditions under which "r" can be interpreted as the probability of "identity by descent" are specified, and the possibility of generalizing this…

  16. Stimulus Probability Effects in Absolute Identification

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  17. Recent Developments in Applied Probability and Statistics

    CERN Document Server

    Devroye, Luc; Kohler, Michael; Korn, Ralf

    2010-01-01

    This book presents surveys on recent developments in applied probability and statistics. The contributions include topics such as nonparametric regression and density estimation, option pricing, probabilistic methods for multivariate interpolation, robust graphical modelling and stochastic differential equations. Due to its broad coverage of different topics the book offers an excellent overview of recent developments in applied probability and statistics.

  18. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  19. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  20. Conditional probability of the tornado missile impact given a tornado occurrence

    International Nuclear Information System (INIS)

    Using an approach based on statistical mechanics, an expression for the probability of the first missile strike is developed. The expression depends on two generic parameters (injection probability η(F) and height distribution ψ(Z,F)), which are developed in this study, and one plant-specific parameter (the number of potential missiles N_p). The expression for the joint probability of simultaneous impact of multiple targets is also developed. This expression is applicable to calculation of the probability of common-cause failure due to tornado missiles. It is shown that the probability of the first missile strike can be determined using a uniform missile distribution model. It is also shown that the conditional probability of the second strike, given the first, is underestimated by the uniform model. The probability of the second strike is greatly increased if the missiles are in clusters large enough to cover both targets.
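
    The clustering effect on the second strike can be checked with a quick Monte Carlo sketch; the geometry, cluster size, and missile count below are hypothetical, chosen only to reproduce the qualitative conclusion.

```python
# Hypothetical geometry: when missiles arrive in spatial clusters, the
# conditional probability of striking a second target, given the first was
# struck, exceeds the uniform-distribution prediction.

import random

def p_second_given_first(clustered: bool, trials: int = 100_000) -> float:
    # Two unit-width targets on a 100-unit strip: [0, 1) and [5, 6).
    hits_first = hits_both = 0
    for _ in range(trials):
        if clustered:
            center = random.uniform(0.0, 100.0)
            xs = [center + random.uniform(-3.0, 3.0) for _ in range(10)]
        else:
            xs = [random.uniform(0.0, 100.0) for _ in range(10)]
        t1 = any(0.0 <= x < 1.0 for x in xs)
        t2 = any(5.0 <= x < 6.0 for x in xs)
        if t1:
            hits_first += 1
            hits_both += t2
    return hits_both / hits_first

random.seed(7)
print("uniform  :", round(p_second_given_first(False), 3))  # ~ P(t2) alone
print("clustered:", round(p_second_given_first(True), 3))   # noticeably larger
```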

  1. Multinomial mixture model with heterogeneous classification probabilities

    Science.gov (United States)

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of the multinomial parameters and correct-classification probabilities when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.
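
    To make the setting concrete, the sketch below generates binary categorical data measured with error in which each sampling unit's correct-classification probability is logit-normal, as in the elaborated model; parameter values are illustrative, and the printed raw proportion shows why a model-based correction is needed.

```python
# Data-generating sketch for the heterogeneous-misclassification setting.
# Per-unit correct-classification probabilities are logit-normal; all
# parameter values below are made up for illustration.

import numpy as np

def simulate_unit(pi, mu, sigma, n_items, rng):
    """Observed binary categories for one sampling unit."""
    theta = 1.0 / (1.0 + np.exp(-rng.normal(mu, sigma)))  # logit-normal P(correct)
    truth = rng.random(n_items) < pi                      # true category A?
    correct = rng.random(n_items) < theta
    return np.where(correct, truth, ~truth)               # flip when misclassified

rng = np.random.default_rng(1)
pi_true, mu, sigma = 0.3, 1.5, 1.0
obs = [simulate_unit(pi_true, mu, sigma, 50, rng).mean() for _ in range(200)]
print("raw observed proportion:", round(float(np.mean(obs)), 3))
# ~0.40 here, not the true 0.30: raw proportions are biased for pi,
# which is what the model-based correction addresses.
```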

  2. Laboratory-Tutorial activities for teaching probability

    CERN Document Server

    Wittmann, M C; Morgan, J T; Feeley, Roger E.; Morgan, Jeffrey T.; Wittmann, Michael C.

    2006-01-01

    We report on the development of students' ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain "touchstone" examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler's fallacy, and use expectati...

  3. The enigma of probability and physics

    International Nuclear Information System (INIS)

    This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (science of probability) and probabilistic physics (application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed - quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)

  4. Quantum probability assignment limited by relativistic causality.

    Science.gov (United States)

    Han, Yeong Deok; Choi, Taeseung

    2016-01-01

    Quantum theory has nonlocal correlations, which bothered Einstein but were found to satisfy relativistic causality. Correlation for a shared quantum state manifests itself, in the standard quantum framework, through joint probability distributions that can be obtained by applying state reduction and the probability assignment called the Born rule. Quantum correlations, which show nonlocality when the shared state is entangled, can be changed if we apply a different probability assignment rule. As a result, the amount of nonlocality in the quantum correlation will change. The issue is whether changing the rule of quantum probability assignment breaks relativistic causality. We show that the Born rule for quantum measurement is derived by requiring the relativistic causality condition. This shows how relativistic causality limits the upper bound of quantum nonlocality through quantum probability assignment. PMID:26971717

  5. Interelectronic-interaction effect on the transition probability in high-Z He-like ions

    OpenAIRE

    Indelicato, Paul; Shabaev, V. M.; Volotka, A. V.

    2004-01-01

    The interelectronic-interaction effect on the transition probabilities in high-Z He-like ions is investigated within a systematic quantum electrodynamic approach. The calculation formulas for the interelectronic-interaction corrections of first order in $1/Z$ are derived using the two-time Green function method. These formulas are employed for numerical evaluations of the magnetic transition probabilities in heliumlike ions. The results of the calculations are compared with experimental value...

  6. Assessing the clinical probability of pulmonary embolism

    Energy Technology Data Exchange (ETDEWEB)

    Miniati, M. [Consiglio Nazionale delle Ricerche, Institute of Clinical Physiology, Pisa (Italy); Pistolesi, M. [University of Florence, Dept. of Section of Nuclear Medicine Critical Care, Florence (Italy)

    2001-12-01

    Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to predict parameters associated with PE. A score ≤ 4 identified patients with low probability of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was

  7. Assessing the clinical probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to predict parameters associated with PE. A score ≤ 4 identified patients with low probability of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3
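
    The reported stratification can be written down directly; the helper below encodes only the thresholds and observed prevalences quoted in the abstract, and assumes the score itself has already been computed from the (unreproduced) score items.

```python
# Encodes the score bands and observed PE prevalences quoted above;
# the underlying score computation is not reproduced in the record.

def pe_clinical_probability(score: int) -> tuple[str, float]:
    """Map a clinical score to (category, observed PE prevalence)."""
    if score <= 4:
        return "low", 0.10
    if score <= 8:
        return "intermediate", 0.38
    return "high", 0.81

print(pe_clinical_probability(6))   # ('intermediate', 0.38)
```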

  8. Angular anisotropy representation by probability tables

    International Nuclear Information System (INIS)

    In this paper, we improve point-wise or group-wise angular anisotropy representation by using probability tables. The starting point of this study was to give more flexibility (sensitivity analysis) and more accuracy (ray effect) to the group-wise anisotropy representation by Dirac functions, independently introduced at CEA (Mao, 1998) and at IRSN (Le Cocq, 1998) ten years ago. Building on our experience of cross-section description acquired in CALENDF (Sublet et al., 2006), we introduce two kinds of moment-based probability tables, Dirac (DPT) and Step-wise (SPT) Probability Tables, where the angular probability distribution is represented by Dirac functions or by a step-wise function, respectively. First, we show how we can improve the equi-probable cosine representation of point-wise anisotropy by using step-wise probability tables. Then we show, by Monte Carlo techniques, how we can obtain a more accurate description of group-wise anisotropy than the one usually given by a finite expansion on a Legendre polynomial basis (which can induce negative values), and finally we describe it by Dirac probability tables. This study is carried out in the framework of GALILEE project R and D activities (Coste-Delclaux, 2008). (authors)
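
    The step-wise construction can be illustrated briefly: build K equal-probability cosine bins from a tabulated angular density and sample uniformly within a randomly chosen bin. The density below is an arbitrary positive P0 + P1 + P2 example, not an evaluated anisotropy, and the construction is a generic sketch rather than the paper's SPT algorithm.

```python
# Equal-probability cosine bins for an angular density on mu in [-1, 1],
# sampled by picking a bin uniformly and then a point uniformly within it.

import numpy as np

def equiprobable_edges(pdf, k=8, n_grid=10_001):
    """Bin edges at the k-quantiles of the angular CDF."""
    mu = np.linspace(-1.0, 1.0, n_grid)
    cdf = np.cumsum(pdf(mu))
    cdf = (cdf - cdf[0]) / (cdf[-1] - cdf[0])    # normalize to [0, 1]
    return np.interp(np.linspace(0.0, 1.0, k + 1), cdf, mu)

def sample_mu(edges, n, rng):
    i = rng.integers(0, len(edges) - 1, size=n)  # each bin is equally likely
    return rng.uniform(edges[i], edges[i + 1])   # uniform within the bin

pdf = lambda mu: 1.0 + 0.6 * mu + 0.3 * (1.5 * mu**2 - 0.5)  # positive example
rng = np.random.default_rng(0)
edges = equiprobable_edges(pdf, k=8)
print(edges.round(3))                  # narrower bins where the density is larger
print(sample_mu(edges, 5, rng).round(3))
```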

  9. Uncertainty about probability: a decision analysis perspective

    Energy Technology Data Exchange (ETDEWEB)

    Howard, R.A.

    1988-03-01

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, they find further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to usual thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group.

  10. Uncertainty about probability: a decision analysis perspective

    International Nuclear Information System (INIS)

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, they find further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to usual thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group
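
    The coin-tossing point admits a short worked example: if the "definitive number" (the long-run heads fraction) is Beta-distributed, the probability to assign to one toss is its mean, and observing a head raises that mean by an amount that grows with the prior uncertainty. The Beta parameters below are illustrative.

```python
# Beta-Bernoulli worked example of "seeing a head must increase the
# probability of heads" unless the coin's fairness is known with certainty.

def posterior_mean_after_head(a: float, b: float) -> tuple[float, float]:
    prior_mean = a / (a + b)
    post_mean = (a + 1.0) / (a + b + 1.0)   # conjugate update after one head
    return prior_mean, post_mean

print(posterior_mean_after_head(50, 50))   # 0.5 -> ~0.505: nearly sure it's fair
print(posterior_mean_after_head(2, 2))     # 0.5 -> 0.6: quite uncertain
```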

  11. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition""This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory.""  - The StatisticianThoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  12. Radiative lifetimes and atomic transition probabilities

    International Nuclear Information System (INIS)

    Radiative lifetimes and atomic transition probabilities have been measured for over 35 neutral and singly ionized species in the Wisconsin Atomic Transition Probabilities (WATP) Program since it began in 1980. Radiative lifetimes are measured using time-resolved laser-induced fluorescence of a slow atomic/ionic beam. These lifetimes are combined with branching fractions to yield absolute atomic transition probabilities for neutral and singly ionized species. The branching fractions are determined from emission spectra recorded using the 1.0 m Fourier-transform spectrometer at the National Solar Observatory. The current focus of the WATP Program is on the rare-earth elements, in particular Tm, Dy, and Ho

  13. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    2014-01-01

    We evaluate a binary lottery procedure for inducing risk neutral behavior in a subjective belief elicitation task. Prior research has shown this procedure to robustly induce risk neutrality when subjects are given a single risk task defined over objective probabilities. Drawing a sample from the ... same subject population, we find evidence that the binary lottery procedure also induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation of subjective probabilities in subjects with...

  14. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

    ""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  15. Probability Distributions for a Surjective Unimodal Map

    Institute of Scientific and Technical Information of China (English)

    Hongyan SUN; Long WANG

    1996-01-01

    In this paper we show that the probability distributions for a surjective unimodal map can be classified into three types, δ-function, asymmetric, and symmetric, by identifying the binary structures of its initial values. Borel's normal number theorem is equivalent, or prior, to the Frobenius-Perron operator in analyzing the probability distributions for this kind of map, and in particular we can construct a multifractal probability distribution from the surjective tent map by selecting a non-Borel-normal number as the initial value.
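
    A quick numerical illustration of the classification (assuming the standard slope-2 tent map): a generic initial value yields an approximately uniform orbit histogram, while special initial values give periodic, δ-type orbits. A tiny perturbation is added each step to avoid the well-known finite-precision collapse of the exact slope-2 map in floating-point arithmetic.

```python
# Iterate x -> 2*min(x, 1-x) and histogram the orbit. Parameters and the
# choice of initial values are illustrative.

import numpy as np

rng = np.random.default_rng(0)

def tent_orbit(x0: float, n: int, noise: float = 0.0) -> np.ndarray:
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = 2.0 * x if x < 0.5 else 2.0 * (1.0 - x)
        if noise:
            x = (x + rng.uniform(0.0, noise)) % 1.0   # avoid float collapse
        xs[i] = x
    return xs

# Generic initial value: approximately uniform invariant density on [0, 1].
orbit = tent_orbit(np.sqrt(2) - 1.0, 100_000, noise=1e-12)
hist, _ = np.histogram(orbit, bins=10, range=(0, 1), density=True)
print(hist.round(2))             # all entries close to 1.0

# Special initial value: a periodic, delta-function-type distribution.
print(tent_orbit(0.4, 6))        # ~0.8, 0.4, 0.8, 0.4, ... (period 2)
```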

  16. Are All Probabilities Fundamentally Quantum Mechanical?

    CERN Document Server

    Pradhan, Rajat Kumar

    2011-01-01

    The subjective and the objective aspects of probabilities are incorporated in a simple duality axiom inspired by observer participation in quantum theory. Transcending the classical notion of probabilities, it is proposed and demonstrated that all probabilities may be fundamentally quantum mechanical in the sense that they may all be derived from the corresponding amplitudes. The classical coin-toss and the quantum double-slit interference experiments are discussed as illustrative prototype examples. The absence of multi-order quantum interference effects in multiple-slit experiments and the experimental tests of complementarity in Wheeler's delayed-choice type experiments are explained using the involvement of the observer.

  17. Miscorrection probability beyond the minimum distance

    OpenAIRE

    Cassuto, Yuval; Bruck, Jehoshua

    2004-01-01

    The miscorrection probability of a list decoder is the probability that the decoder will have at least one non-causal codeword in its decoding sphere. Evaluating this probability is important when using a list decoder as a conventional decoder, since in that case we require the list to contain at most one codeword for most of the errors. A lower bound on the miscorrection probability is the main result. The key ingredient in the proof is a new combinatorial upper bound on the list size for a general q-ary...

  18. Size constrained unequal probability sampling with a non-integer sum of inclusion probabilities

    OpenAIRE

    Grafström, Anton; Qualité, Lionel; Tillé, Yves; Matei, Alina

    2012-01-01

    More than 50 methods have been developed to draw unequal probability samples with fixed sample size. All these methods require the sum of the inclusion probabilities to be an integer. There are cases, however, where the sum of desired inclusion probabilities is not an integer. Then, classical algorithms for drawing samples cannot be directly applied. We present two methods to overcome the problem of sample selection with unequal inclusion probabilities when their sum is not an integer ...
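
    One natural construction in this spirit (not necessarily either of the paper's two methods) appends a phantom unit carrying the fractional complement so the sum becomes an integer, draws a fixed-size systematic unequal-probability sample, and discards the phantom if selected; real units keep their target inclusion probabilities and the realized sample size is floor(n) or ceil(n).

```python
# Phantom-unit workaround for a non-integer sum of inclusion probabilities,
# built on ordered systematic unequal-probability sampling.

import numpy as np

def systematic_sample(pi, rng):
    """Ordered systematic sampling; requires sum(pi) to be an integer."""
    cum = np.concatenate(([0.0], np.cumsum(pi)))
    picks = np.arange(rng.uniform(), cum[-1], 1.0)      # one pick per unit of mass
    return np.searchsorted(cum, picks, side="right") - 1

def sample_noninteger_sum(pi, rng):
    alpha = np.sum(pi) - np.floor(np.sum(pi))           # fractional part of the sum
    pi_aug = np.append(pi, 1.0 - alpha)                 # phantom unit
    s = systematic_sample(pi_aug, rng)
    return s[s < len(pi)]                               # discard the phantom

rng = np.random.default_rng(3)
pi = np.array([0.2, 0.7, 0.5, 0.9, 0.4])                # sum = 2.7, not an integer
print(sample_noninteger_sum(pi, rng))                   # size 2 w.p. 0.3, else size 3
```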

  19. Choosing information variables for transition probabilities in a time-varying transition probability Markov switching model

    OpenAIRE

    Andrew J. Filardo

    1998-01-01

    This paper discusses a practical estimation issue for time-varying transition probability (TVTP) Markov switching models. Time-varying transition probabilities allow researchers to capture important economic behavior that may be missed using constant (or fixed) transition probabilities. Despite its use, Hamilton’s (1989) filtering method for estimating fixed transition probability Markov switching models may not apply to TVTP models. This paper provides a set of sufficient conditions to justi...
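
    What "time-varying transition probabilities" means operationally can be shown in a few lines: each diagonal element of the transition matrix is a logistic function of the information variable z_t. The coefficients below are illustrative, not estimates from the paper.

```python
# Two-regime TVTP transition matrix with logistic dependence on z_t.
# Coefficient values are invented for illustration.

import numpy as np

def transition_matrix(z, a11=2.0, b11=0.8, a22=1.5, b22=-0.5):
    """2x2 matrix; row i gives P(s_t = j | s_{t-1} = i, z_t)."""
    p11 = 1.0 / (1.0 + np.exp(-(a11 + b11 * z)))   # stay in regime 1
    p22 = 1.0 / (1.0 + np.exp(-(a22 + b22 * z)))   # stay in regime 2
    return np.array([[p11, 1.0 - p11],
                     [1.0 - p22, p22]])

for z in (-2.0, 0.0, 2.0):
    print(f"z = {z:+.0f}\n{transition_matrix(z).round(3)}")
```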

  20. Probability assessment of results of radiological analysis of surface waters affected by radioactive raw materials mining

    International Nuclear Information System (INIS)

    Water quality for classification purposes is determined as the average of a prescribed number of the most unfavourable values of the indicator. In hydrology, data are processed using a probability evaluation. The results of the probability evaluation of the occurrence of natural radionuclides are given for the example of a set of values obtained in a model catchment (the Berounka river). The calculation of volume activities of radionuclides with a chosen non-exceedance probability may be complemented by a calculation of reliability for the chosen significance level. (M.D.)
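
    A sketch of this kind of probability evaluation, with invented volume activities: empirical non-exceedance probabilities from Weibull plotting positions i/(n+1), plus the quantile not exceeded with a chosen probability.

```python
# Empirical non-exceedance probabilities for measured volume activities.
# The activity values (Bq/L) are invented for illustration.

import numpy as np

activity = np.array([0.12, 0.30, 0.08, 0.55, 0.21,
                     0.40, 0.18, 0.95, 0.26, 0.33])
x = np.sort(activity)
p_nonexceed = np.arange(1, len(x) + 1) / (len(x) + 1)   # Weibull i/(n+1)

for xi, pi in zip(x, p_nonexceed):
    print(f"{xi:5.2f} Bq/L   P(X <= x) ~ {pi:.2f}")

print("90% non-exceedance quantile:", np.quantile(activity, 0.9))
```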