WorldWideScience

Sample records for k-electron capture probabilities

  1. K-electron capture probability in 171Lu

    International Nuclear Information System (INIS)

    Mishra, N.R.; Vara Prasad, N.V.S.; Chandrasekhara Rao, M.V.S.; Satyanarayana, G.; Sastry, D.L.; Chintalapudi, S.N.

    1999-01-01

    The K-electron capture probability in the decay of 171Lu to the 835.06 keV level of the daughter nucleus 171Yb is measured to be 0.822 ± 0.027 involving two transitions, in agreement with the theoretical value of 0.833. The experimental value is seen to be consistent with the mass prediction of the relationship due to Wapstra and Bos. (author)
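
    For context, the theoretical P_K values quoted in records of this kind come from the ratio of the K-shell capture rate to the total capture rate. A schematic form, in standard electron-capture notation rather than anything taken from this record, is:

        P_K = \frac{\lambda_K}{\lambda_K + \lambda_L + \lambda_M + \cdots},
        \qquad \lambda_x \propto q_x^2 \, \beta_x^2 \, B_x,
        \qquad q_x = Q_{EC} - |E_x|,

    where q_x is the neutrino energy for capture from shell x, \beta_x the bound-electron amplitude at the nucleus, and B_x an overlap-and-exchange correction. When Q_EC greatly exceeds the binding energies, q_K ≈ q_L and P_K becomes nearly energy-independent, which is the insensitivity noted in the 97Ru records below.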

  2. Measurement of K-electron capture probability in the decay of 87Y

    International Nuclear Information System (INIS)

    Prasad, N.V.S.V.; Murty, G.S.K.; Rao, M.V.S.C.; Sastry, D.L.

    1993-01-01

    The K-electron capture probability for the 1/2⁻ to 3/2⁻ transition in the decay of 87Y to the 873.0 keV level in the daughter 87Sr was measured for the first time using an x-γ summing method. The experimental P_K value was found to be 0.911 ± 0.047, in agreement with the theoretical value of 0.878. (author)
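
    A minimal sketch of how P_K is extracted in an x-γ summing measurement, assuming a single K x-ray plus γ cascade and neglecting angular-correlation and chance-summing corrections (the notation is ours: N denotes peak areas, ω_K the K fluorescence yield, ε_x and ε_γ detection efficiencies):

        \frac{N_{sum}}{N_\gamma} \approx P_K \, \omega_K \, \varepsilon_x
        \quad \Longrightarrow \quad
        P_K \approx \frac{N_{sum}}{N_\gamma \, \omega_K \, \varepsilon_x},

    where N_sum is the area of the (K x-ray + γ) sum peak and N_γ that of the γ singles peak in the same detector. Variants of the analysis (e.g. record 9 below) form ratios in which ω_K and the efficiencies cancel.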

  3. Measurement of K-electron capture probability in the decay of 87Y

    Energy Technology Data Exchange (ETDEWEB)

    Prasad, N.V.S.V.; Murty, G.S.K.; Rao, M.V.S.C.; Sastry, D.L. (Andhra Univ., Visakhapatnam (India). Labs. for Nuclear Research); Chintalapudi, S.N. (Inter University Consortium for DAE Facilities, Calcutta (India))

    1993-04-01

    The K-electron capture probability for the 1/2⁻ to 3/2⁻ transition in the decay of 87Y to the 873.0 keV level in the daughter 87Sr was measured for the first time using an x-γ summing method. The experimental P_K value was found to be 0.911 ± 0.047, in agreement with the theoretical value of 0.878. (author)

  4. Experimental study of K-electron capture probability in the decay of 111In

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, N.R.; Kalyani, V.D.M.L.; Madhusudhana Rao, P.V.; Vara Prasad, N.V.; Chandrasekhar Rao, M.V.S.; Satyanarayana, G.; Sastry, D.L. [Visakhapatnam, Andhra Univ. (India). Swami Jnanananda Lab. for Nuclear Research]; Chintalapudi, S.N. [Calcutta, Inter-Univ. Consortium for DAE Facilities (India)]

    1998-03-01

    The K-electron capture probability in the decay of 111In to the 416.64 keV level of the daughter nucleus 111Cd was measured to be 0.853 ± 0.037, in agreement with the theoretical value of 0.865. The experimental value is seen to be consistent with the mass prediction of the relationship due to Wapstra and Bos.

  5. Measurement on K-electron capture probability in the decay of 97Ru

    Energy Technology Data Exchange (ETDEWEB)

    Kalyani, V.D.M.L.; Vara Prasad, N.V.S.; Chandrasekhar Rao, M.V.S.; Satyanarayana, G.; Sastry, D.L. [Swami Jnanananda Laboratories for Nuclear Research, Andhra University, Visakhapatnam (India)]; Chintalapudi, S.N. [Inter University Consortium for DAE Facilities, Calcutta (India)]

    1999-08-01

    The K-electron capture probabilities of two strong allowed transitions, 5/2⁺ → 5/2⁺ and 5/2⁺ → 7/2⁺, were measured in the decay of 97Ru employing the X-γ internal summing technique. The two experimental P_K values were found to be 0.884 ± 0.046 and 0.886 ± 0.018, in agreement with the theoretical values of 0.878 and 0.878, respectively. The theoretical values are seen to be insensitive to Q_EC values above 200 keV.

  6. Measurement on K-electron capture probability in the decay of 97Ru

    International Nuclear Information System (INIS)

    Kalyani, V.D.M.L.; Vara Prasad, N.V.S.; Chandrasekhar Rao, M.V.S.; Satyanarayana, G.; Sastry, D.L.; Chintalapudi, S.N.

    1999-01-01

    The K-electron capture probabilities of two strong allowed transitions, 5/2⁺ → 5/2⁺ and 5/2⁺ → 7/2⁺, were measured in the decay of 97Ru employing the X-γ internal summing technique. The two experimental P_K values were found to be 0.884 ± 0.046 and 0.886 ± 0.018, in agreement with the theoretical values of 0.878 and 0.878, respectively. The theoretical values are seen to be insensitive to Q_EC values above 200 keV.

  7. Relative K-electron capture probabilities in the decay of 99Rh

    International Nuclear Information System (INIS)

    Mishra, N.R.; Chandrasekhar Rao, M.V.S.; Satyanarayana, G.; Sastry, D.L.; Chintalapudi, S.N.

    2000-01-01

    The relative K-electron capture probabilities (P_K) to the 1383.23, 896.98, 618.09, 442.78 and 322.43 keV levels in 99Ru in the decay of 99Rh are measured employing the X-γ internal sum-coincidence technique. The measured values P_K(1383.23) = 0.851 ± 0.066, P_K(896.98) = 0.834 ± 0.061, P_K(618.09) = 0.870 ± 0.01, P_K(442.78) = 0.882 ± 0.035 and P_K(322.43) = 0.852 ± 0.061 are found to be in good agreement with the theoretical values. The dependence of P_K on the EC transition energy is discussed. (author)

  8. Measurement on K-electron capture probabilities in the decay of 183Re and 168Tm

    Energy Technology Data Exchange (ETDEWEB)

    Prasad, N.V.S.V.; Rao, M.V.S.C.; Reddy, S.B.; Satyanarayana, G.; Sastry, D.L. (Andhra Univ., Visakhapatnam (India). Swami Jnanananda Labs. for Nuclear Research); Murty, G.S.K. (UMDNJ, Newark, NJ (United States). Dept. of Radiology); Chintalapudi, S.N. (Inter University Consortium for DAE Facilities, Calcutta (India))

    1994-03-01

    The K-electron capture probabilities for the 5/2⁺ to 3/2⁻ transition in the electron capture decay of 183Re to the 208.805 keV level in the daughter 183W, and for the 3⁽⁺⁾ to 3⁻ and 3⁽⁺⁾ to 4⁻ transitions in the electron capture decay of 168Tm to the 1541.4 keV and 1093.0 keV levels, respectively, in the daughter 168Er, were measured for the first time using an x-γ summing method. The experimental P_K values are reported in this paper, together with those due to theory, and discussed. (author)

  9. New approach to K-electron-capture probabilities to the 437 and 384 keV levels in the decay of 133Ba

    Energy Technology Data Exchange (ETDEWEB)

    Singh, K.; Sahota, H.S. [Punjabi Univ., Patiala (India). Dept. of Physics]

    1983-12-01

    The K-electron-capture probabilities to the 437 and 384 keV levels in the decay of /sup 133/Ba have been determined from a measurement of gamma-ray intensities in conjunction with an analysis of the K x-ray-gamma-ray sum peaks. The results are independent of fluorescence yield and detector efficiency.

  10. Electron capture probabilities in 105Ag

    Energy Technology Data Exchange (ETDEWEB)

    Chandrasekhar Rao, M.V.S.; Sree Krishna Murty, G.; Radha Krishna, K.; Bhuloka Reddy, S.; Satyanarayana, G.; Raghavaiah, C.V.; Sastry, D.L. (Andhra Univ., Visakhapatnam (India). Labs. for Nuclear Research); Chintalapudi, S.N. (Bhabha Atomic Research Centre, Calcutta (India). Variable Energy Cyclotron Centre)

    1990-01-01

    The K-electron capture probabilities for the 1/2⁻ → 3/2⁻ and 1/2⁻ → 1/2⁺ transitions in the decay of 105Ag were measured for the first time using the sum coincidence method. The experimental P_K values were estimated to be 0.824 ± 0.042 and 0.851 ± 0.046 for the allowed and first-forbidden beta transitions, respectively, in agreement with theory. The experimental P_L values to these two levels were also computed using the experimental P_L/P_K values reported by earlier authors. These results are also found to be consistent with the theoretical P_L values. (orig.)

  11. Electrofishing capture probability of smallmouth bass in streams

    Science.gov (United States)

    Dauwalter, D.C.; Fisher, W.L.

    2007-01-01

    Abundance estimation is an integral part of understanding the ecology and advancing the management of fish populations and communities. Mark-recapture and removal methods are commonly used to estimate the abundance of stream fishes. Alternatively, abundance can be estimated by dividing the number of individuals sampled by the probability of capture. We conducted a mark-recapture study and used multiple repeated-measures logistic regression to determine the influence of fish size, sampling procedures, and stream habitat variables on the cumulative capture probability for smallmouth bass Micropterus dolomieu in two eastern Oklahoma streams. The predicted capture probability was used to adjust the number of individuals sampled to obtain abundance estimates. The observed capture probabilities were higher for larger fish and decreased with successive electrofishing passes for larger fish only. Model selection suggested that the number of electrofishing passes, fish length, and mean thalweg depth affected capture probabilities the most; there was little evidence for any effect of electrofishing power density and woody debris density on capture probability. Leave-one-out cross validation showed that the cumulative capture probability model predicts smallmouth abundance accurately. © Copyright by the American Fisheries Society 2007.
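
    As a rough illustration of the adjustment step described above, here is a sketch (not the authors' code; the logistic coefficients and covariates are invented placeholders) of dividing catch by a modeled cumulative capture probability:

        import math

        # Hypothetical fitted logistic-regression coefficients for the
        # per-pass capture probability (illustrative values only).
        INTERCEPT = -0.5
        B_LENGTH = 0.004    # per mm of fish total length
        B_DEPTH = -1.2      # per m of mean thalweg depth

        def capture_prob_per_pass(length_mm, depth_m):
            """Per-pass capture probability from the logistic model."""
            z = INTERCEPT + B_LENGTH * length_mm + B_DEPTH * depth_m
            return 1.0 / (1.0 + math.exp(-z))

        def abundance_estimate(catch, length_mm, depth_m, n_passes):
            """Divide the raw count by the cumulative capture probability,
            i.e. the chance of being caught in at least one pass."""
            p = capture_prob_per_pass(length_mm, depth_m)
            p_cum = 1.0 - (1.0 - p) ** n_passes
            return catch / p_cum

        # e.g. 37 smallmouth bass (~200 mm) caught in 3 passes at 0.6 m depth:
        print(abundance_estimate(37, 200.0, 0.6, 3))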

  12. K-electron capture probability in 167Tm

    International Nuclear Information System (INIS)

    Sree Krishna Murty, G.; Chandrasekhar Rao, M.V.S.; Radha Krishna, K.; Bhuloka Reddy, S.; Satyanarayana, G.; Ramana Rao, P.V.; Sastry, D.L.

    1990-01-01

    The K-electron capture probability in the decay of 167Tm for the first-forbidden transition 1/2⁺ → 3/2⁻ was measured using the sum-coincidence method and employing a hyper-pure Ge system. The P_K value is found to be 0.835 ± 0.029, in agreement with the theoretical value of 0.829. (author)

  13. K-electron capture probability in 167Tm

    Energy Technology Data Exchange (ETDEWEB)

    Sree Krishna Murty, G.; Chandrasekhar Rao, M.V.S.; Radha Krishna, K.; Bhuloka Reddy, S.; Satyanarayana, G.; Ramana Rao, P.V.; Sastry, D.L. (Andhra Univ., Visakhapatnam (India). Labs. for Nuclear Research); Chintalapudi, S.N. (Variable Energy Cyclotron Centre, Calcutta (India))

    1990-07-01

    The K-electron capture probability in the decay of 167Tm for the first-forbidden transition 1/2⁺ → 3/2⁻ was measured using the sum-coincidence method and employing a hyper-pure Ge system. The P_K value is found to be 0.835 ± 0.029, in agreement with the theoretical value of 0.829. (author)

  14. On Z-dependence of probability of atomic capture of mesons in matter

    International Nuclear Information System (INIS)

    Vasil'ev, V.A.; Petrukhin, V.I.; Suvorov, V.M.; Khorvat, D.

    1976-01-01

    All experimental data available on the atomic capture of negative muons and pions are systematically studied to find a more appropriate empirical expression for the capture probability as a function of atomic number. It is shown that the linear Z-dependence, as a rule, does not hold; a Z^(1/3) dependence gives more satisfactory results. A modified Z^(1/3) dependence is proposed which is more appropriate for hydrogen-containing compounds.
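
    Concretely, for two elements Z_1 and Z_2 in a compound, these empirical laws compare per-atom capture probabilities as a power of the atomic number (schematic notation of ours, not the paper's):

        \frac{W(Z_1)}{W(Z_2)} = \left(\frac{Z_1}{Z_2}\right)^{n},
        \qquad n = 1 \ \text{(Fermi-Teller Z law)},
        \qquad n = \tfrac{1}{3} \ \text{(the dependence favored here)}.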

  15. Probabilities and energies to obtain the counting efficiency of electron-capture nuclides, KLMN model

    International Nuclear Information System (INIS)

    Casas Galiano, G.; Grau Malonda, A.

    1994-01-01

    An intelligent computer program has been developed to obtain the mathematical formulae to compute the probabilities and reduced energies of the different atomic rearrangement pathways following electron-capture decay. Creation and annihilation operators for Auger and X processes have been introduced. Taking into account the symmetries associated with each process, 262 different pathways were obtained. This model allows us to obtain the influence of M-electron capture on the counting efficiency when the atomic number of the nuclide is high.

  16. K-capture probabilities in the decay of 133Ba

    Energy Technology Data Exchange (ETDEWEB)

    Singh, K.; Sahota, H.S.

    1983-07-01

    The K-capture probabilities in the decay of 133Ba to the 437, 383, 161 and 81 keV levels have been determined from the analysis of the K x-ray–gamma-ray sum peaks observed with an intrinsic Ge detector. The measurements on the 161 and 81 keV levels are reported for the first time in the literature.

  17. Exact capture probability analysis of GSC receivers over i.n.d. Rayleigh fading channels

    KAUST Repository

    Nam, Sungsik

    2013-07-01

    A closed-form expression of the capture probability of generalized selection combining (GSC) RAKE receivers was introduced in [1]. The idea behind this new performance metric is to quantify how the remaining set of uncombined paths affects the overall performance both in terms of loss in power and increase in interference levels. In this previous work, the assumption was made that the fading is both independent and identically distributed from path to path. However, the average strength of each path is different in reality. In order to derive a closed-form expression of the capture probability over independent and non-identically distributed (i.n.d.) fading channels, we need to derive the joint statistics of ordered non-identical exponential variates. With this motivation in mind, we first provide in this paper some new order statistics results in terms of both moment generating function (MGF) and probability density function (PDF) expressions under an i.n.d. assumption and then derive a new exact closed-form expression for the capture probability of GSC RAKE receivers in this more realistic scenario. © 2013 IEEE.
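
    The metric itself is straightforward to estimate by simulation, which is useful for checking any closed-form result; a minimal Monte Carlo sketch under the i.n.d. Rayleigh assumption (path powers exponential with distinct means; the power profile below is arbitrary):

        import random

        def gsc_capture_probability(path_means, n_combined, n_trials=50_000):
            """Monte Carlo estimate of the GSC capture probability: the mean
            ratio of the power captured by the n_combined strongest paths to
            the total power in all resolvable paths."""
            total = 0.0
            for _ in range(n_trials):
                # Rayleigh fading => per-path powers are exponentially
                # distributed; i.n.d. means each path keeps its own mean.
                powers = [random.expovariate(1.0 / m) for m in path_means]
                powers.sort(reverse=True)
                total += sum(powers[:n_combined]) / sum(powers)
            return total / n_trials

        # e.g. five resolvable paths with a decaying power profile,
        # a GSC receiver combining the two strongest fingers:
        print(gsc_capture_probability([1.0, 0.7, 0.5, 0.35, 0.25], 2))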

  18. Probabilities and energies to obtain the counting efficiency of electron-capture nuclides. KLMN model

    International Nuclear Information System (INIS)

    Galiano, G.; Grau, A.

    1994-01-01

    An intelligent computer program has been developed to obtain the mathematical formulae to compute the probabilities and reduced energies of the different atomic rearrangement pathways following electron-capture decay. Creation and annihilation operators for Auger and X processes have been introduced. Taking into account the symmetries associated with each process, 262 different pathways were obtained. This model allows us to obtain the influence of the M-electro capture in the counting efficiency when the atomic number of the nuclide is high. (Author)

  19. Exact capture probability analysis of GSC receivers over Rayleigh fading channel

    KAUST Repository

    Nam, Sungsik

    2010-01-01

    For third-generation systems and ultrawideband systems, RAKE receivers have been introduced due to their ability to combine different replicas of the transmitted signal arriving at different delays in a rich multipath environment. In principle, RAKE receivers combine all resolvable paths, which gives the best performance in a rich diversity environment. However, this is usually costly in terms of the hardware required as the number of RAKE fingers increases. Therefore, generalized selection combining (GSC) RAKE reception was proposed and has been studied by many researchers as an alternative to the two classical fundamental diversity schemes: maximal ratio combining and selection combining. Previous work on performance analyses of GSC RAKE receivers based on the signal-to-noise ratio focused on the development of methodologies to derive exact closed-form expressions for various performance measures. However, the remaining set of uncombined paths affects the overall performance, both in terms of loss in power and increase in interference levels. Therefore, to have a full understanding of the performance of GSC RAKE receivers, we introduce in this paper the notion of capture probability, which is defined as the ratio of the captured power (essentially the combined paths' power) to the total available power. The major difficulty in these problems is to derive the joint statistics of ordered exponential variates. With this motivation in mind, we capitalize in this paper on some new order statistics results to derive exact closed-form expressions for the capture probability over independent and identically distributed Rayleigh fading channels. © 2010 IEEE.

  20. Capturing alternative secondary structures of RNA by decomposition of base-pairing probabilities.

    Science.gov (United States)

    Hagio, Taichi; Sakuraba, Shun; Iwakiri, Junichi; Mori, Ryota; Asai, Kiyoshi

    2018-02-19

    It is known that functional RNAs often switch their functions by forming different secondary structures. Popular tools for RNA secondary structure prediction, however, predict a single 'best' structure and do not produce alternative structures. There are bioinformatics tools to predict suboptimal structures, but it is difficult to detect which alternative secondary structures are essential. We proposed a new computational method to detect essential alternative secondary structures from RNA sequences by decomposing the base-pairing probability matrix. The decomposition is calculated by a newly implemented software tool, RintW, which efficiently computes the base-pairing probability distributions over the Hamming distance from arbitrary reference secondary structures. The proposed approach has been demonstrated on the ROSE element RNA thermometer sequence and the Lysine RNA riboswitch, showing that the proposed approach captures conformational changes in secondary structures. We have shown that alternative secondary structures are captured by decomposing base-pairing probabilities over Hamming distance. Source code is available from http://www.ncRNA.org/RintW .

  1. Derivation of capture and reaction cross sections from experimental quasi-elastic and elastic backscattering probabilities

    International Nuclear Information System (INIS)

    Sargsyan, V.V.; Adamian, G.G.; Antonenko, N.V.; Gomes, P.R.S.

    2014-01-01

    We suggest simple and useful methods to extract reaction and capture (fusion) cross sections from experimental elastic and quasi-elastic backscattering data. The direct measurement of the reaction or capture (fusion) cross section is a difficult task, since it would require the measurement of individual cross sections of many reaction channels, most of which could be reached only by specific experiments. This would require different experimental setups not always available at the same laboratory; consequently, such direct measurements would demand a large amount of beam time and would probably take some years to complete. Because of that, measurements of elastic scattering angular distributions that cover full angular ranges, together with optical-model analysis, have been used for the determination of reaction cross sections. This traditional method consists of deriving the parameters of the complex optical potentials which fit the experimental elastic scattering angular distributions, and then deriving the reaction cross sections predicted by these potentials. Even so, both the experimental part and the analysis of this latter method are not so simple. In the present work we present a much simpler method to determine reaction and capture (fusion) cross sections. It consists of measuring only elastic or quasi-elastic scattering at one backward angle, from which the reaction or capture cross sections can easily be extracted. (author)
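
    A schematic of the relation such methods exploit, in our paraphrase rather than the authors' exact formulation: the quasi-elastic backscattering probability is measured as a ratio to Rutherford scattering, and its complement estimates the reaction probability,

        P_{qe}(E, \theta_{back}) = \frac{d\sigma_{qe}/d\Omega}{d\sigma_{Ru}/d\Omega},
        \qquad P_R(E) \approx 1 - P_{qe}(E, \theta_{back}),

    after which the reaction (or capture) cross section follows by weighting reaction probabilities over partial waves.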

  2. Sampling the stream landscape: Improving the applicability of an ecoregion-level capture probability model for stream fishes

    Science.gov (United States)

    Mollenhauer, Robert; Mouser, Joshua B.; Brewer, Shannon K.

    2018-01-01

    Temporal and spatial variability in streams result in heterogeneous gear capture probability (i.e., the proportion of available individuals identified) that confounds interpretation of data used to monitor fish abundance. We modeled tow-barge electrofishing capture probability at multiple spatial scales for nine Ozark Highland stream fishes. In addition to fish size, we identified seven reach-scale environmental characteristics associated with variable capture probability: stream discharge, water depth, conductivity, water clarity, emergent vegetation, wetted width–depth ratio, and proportion of riffle habitat. The magnitude of the relationship between capture probability and both discharge and depth varied among stream fishes. We also identified lithological characteristics among stream segments as a coarse-scale source of variable capture probability. The resulting capture probability model can be used to adjust catch data and derive reach-scale absolute abundance estimates across a wide range of sampling conditions with similar effort as used in more traditional fisheries surveys (i.e., catch per unit effort). Adjusting catch data based on variable capture probability improves the comparability of data sets, thus promoting both well-informed conservation and management decisions and advances in stream-fish ecology.

  3. Estimating the population size and colony boundary of subterranean termites by using the density functions of directionally averaged capture probability.

    Science.gov (United States)

    Su, Nan-Yao; Lee, Sang-Hee

    2008-04-01

    Marked termites were released in a linear-connected foraging arena, and the spatial heterogeneity of their capture probabilities was averaged for both directions at distance r from the release point to obtain a symmetrical distribution, from which the density function of directionally averaged capture probability P(x) was derived. We hypothesized that as marked termites move into the population and given sufficient time, the directionally averaged capture probability may reach an equilibrium P(e) over the distance r and thus satisfy the equal-mixing assumption of the mark-recapture protocol. The equilibrium capture probability P(e) was used to estimate the population size N. The hypothesis was tested in a 50-m extended foraging arena to simulate the distance factor of field colonies of subterranean termites. Over the 42-d test period, the density functions of directionally averaged capture probability P(x) exhibited four phases: an exponential decline phase, a linear decline phase, an equilibrium phase, and a postequilibrium phase. The equilibrium capture probability P(e), derived as the intercept of the linear regression during the equilibrium phase, correctly projected N estimates that were not significantly different from the known number of workers in the arena. Because the area beneath the probability density function is a constant (50% in this study), preequilibrium regression parameters and P(e) were used to estimate the population boundary distance l, which is the distance between the release point and the boundary beyond which the population is absent.

  4. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations, and on the central limit theorem for sums of dependent random variables.

  5. Exact capture probability analysis of GSC receivers over i.n.d. Rayleigh fading channels

    KAUST Repository

    Nam, Sungsik; Yang, Hongchuan; Alouini, Mohamed-Slim; Kim, Dongin

    2013-01-01

    The major difficulty in deriving a closed-form expression of the capture probability over independent and non-identically distributed (i.n.d.) fading channels is obtaining the joint statistics of ordered non-identical exponential variates. With this motivation in mind, we first provide in this paper some new order statistics results in terms of both moment generating function (MGF) and probability density function (PDF) expressions under an i.n.d. assumption and then derive a new exact closed-form expression for the capture probability of GSC RAKE receivers in this more realistic scenario.

  6. Revisiting the U-238 thermal capture cross section and gamma-ray emission probabilities from Np-239 decay

    Energy Technology Data Exchange (ETDEWEB)

    Trkov, A.; Molnar, G.L.; Revay, Zs.; Mughabghab, S.F.; Firestone, R.B.; Pronyaev, V.G.; Nichols, A.L.; Moxon, M.C.

    2005-03-03

    The precise value of the thermal capture cross section of 238U is uncertain, and evaluated cross sections from various sources differ by more than their assigned uncertainties. A number of the original publications have been reviewed to assess the discrepant data, corrections were made for more recent standard cross sections and other constants, and one new measurement was analyzed. Due to the strong correlations in activation measurements, the gamma-ray emission probabilities from the beta decay of 239Np were also analyzed. As a result of the analysis, a value of 2.683 ± 0.012 barns was derived for the thermal capture cross section of 238U. A new evaluation of the gamma-ray emission probabilities from 239Np decay was also undertaken.

  7. Towards saturation of the electron-capture delayed fission probability: The new isotopes 240Es and 236Bk

    Directory of Open Access Journals (Sweden)

    J. Konki

    2017-01-01

    The new neutron-deficient nuclei 240Es and 236Bk were synthesised at the gas-filled recoil separator RITU. They were identified by their radioactive decay chains starting from 240Es produced in the fusion–evaporation reaction 209Bi(34S,3n)240Es. Half-lives of 6(2) s and 22(+13/−6) s were obtained for 240Es and 236Bk, respectively. Two groups of α particles with energies Eα = 8.19(3) MeV and 8.09(3) MeV were unambiguously assigned to 240Es. Electron-capture delayed fission branches with probabilities of 0.16(6) and 0.04(2) were measured for 240Es and 236Bk, respectively. These new data show a continuation of the exponential increase of ECDF probabilities in more neutron-deficient isotopes.

  8. On the use of mean groundwater age, life expectancy and capture probability for defining aquifer vulnerability and time-of-travel zones for source water protection.

    Science.gov (United States)

    Molson, J W; Frind, E O

    2012-01-01

    Protection and sustainability of water supply wells requires the assessment of vulnerability to contamination and the delineation of well capture zones. Capture zones, or more generally, time-of-travel zones corresponding to specific contaminant travel times, are most commonly delineated using advective particle tracking. More recently, the capture probability approach has been used in which a probability of capture of P=1 is assigned to the well and the growth of a probability-of-capture plume is tracked backward in time using an advective-dispersive transport model. This approach accounts for uncertainty due to local-scale heterogeneities through the use of macrodispersion. In this paper, we develop an alternative approach to capture zone delineation by applying the concept of mean life expectancy E (time remaining before being captured by the well), and we show how life expectancy E is related to capture probability P. Either approach can be used to delineate time-of-travel zones corresponding to specific travel times, as well as the ultimate capture zone. The related concept of mean groundwater age A (time since recharge) can also be applied in the context of defining the vulnerability of a pumped aquifer. In the same way as capture probability, mean life expectancy and groundwater age account for local-scale uncertainty or unresolved heterogeneities through macrodispersion, which standard particle tracking neglects. The approach is tested on 2D and 3D idealized systems, as well as on several watershed-scale well fields within the Regional Municipality of Waterloo, Ontario, Canada. Copyright © 2011 Elsevier B.V. All rights reserved.
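
    The capture-probability field used in this approach is typically computed with a backward-in-time advection-dispersion equation; a sketch in standard notation (our rendering, assuming steady-state flow, with v the mean pore-water velocity and D the macrodispersion tensor):

        \frac{\partial P}{\partial \tau} = \nabla \cdot (D \nabla P) + v \cdot \nabla P,

    solved in backward time τ with P = 1 imposed at the well and P = 0 initially elsewhere; the reversed sign of the advective term relative to forward transport is what propagates the probability-of-capture plume upgradient from the well.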

  9. Double K-shell ionization in electron capture decay

    International Nuclear Information System (INIS)

    Intemann, R.L.

    1985-01-01

    Using a semirelativistic theory previously developed by the author, we have computed the total probability per K-capture event for the ionization of the remaining K electron for a dozen nuclides of interest. Based on hydrogenic wave functions and accurate to relative order (Zα)², the theory takes into account the correlation between the two initial K electrons and permits adjustments for screening. Numerical results exhibiting the effects of screening are presented. A comprehensive comparison of the predictions of this theory, as well as those of other theoretical models, with recent experimental data is also given.

  10. K-capture probabilities to 400.56 and 279.48 keV levels in the decay of 75Se

    Energy Technology Data Exchange (ETDEWEB)

    Singh, K.; Sahota, H.S.

    1984-02-01

    A precisely calibrated high-resolution intrinsic semiconductor detector has been used to find the K-capture probabilities to the 400.56 and 279.48 keV levels in the decay of 75Se. The measurement to the 279.48 keV level is reported for the first time in the literature.

  11. Probabilities and energies to obtain the counting efficiency of electron-capture nuclides. KLMN model

    Energy Technology Data Exchange (ETDEWEB)

    Galiano, G.; Grau, A.

    1994-07-01

    An intelligent computer program has been developed to obtain the mathematical formulae to compute the probabilities and reduced energies of the different atomic rearrangement pathways following electron-capture decay. Creation and annihilation operators for Auger and X processes have been introduced. Taking into account the symmetries associated with each process, 262 different pathways were obtained. This model allows us to obtain the influence of M-electron capture on the counting efficiency when the atomic number of the nuclide is high. (Author)

  12. Survival probabilities of loggerhead sea turtles (Caretta caretta) estimated from capture-mark-recapture data in the Mediterranean Sea

    Directory of Open Access Journals (Sweden)

    Paolo Casale

    2007-06-01

    Survival probabilities of loggerhead sea turtles (Caretta caretta) are estimated for the first time in the Mediterranean by analysing 3254 tagging and 134 re-encounter data from this region. Most of these turtles were juveniles found at sea. Re-encounters were live resightings and dead recoveries, and data were analysed with Barker's model, a modified version of the Cormack-Jolly-Seber model which can combine recapture, live resighting and dead recovery data. An annual survival probability of 0.73 (95% CI = 0.67-0.78; n = 3254) was obtained, and should be considered a conservative estimate due to an unknown, though not negligible, tag loss rate. This study makes a preliminary estimate of the survival probabilities of in-water developmental stages for the Mediterranean population of endangered loggerhead sea turtles and provides the first insights into the magnitude of the suspected human-induced mortality in the region. The model used here for the first time on sea turtles could be used to obtain survival estimates from other data sets with few or no true recaptures but with other types of re-encounter data, which are a common output of tagging programmes involving these wide-ranging animals.

  13. Stopping power of K electrons at extreme relativistic energies

    International Nuclear Information System (INIS)

    Leung, P.T.; Rustgi, M.L.

    1983-01-01

    The recent work of Anholt on K-vacancy production by relativistic projectiles has been applied to calculate the stopping power of the K electrons. The results show that for protons of energy ≈10³ GeV and heavy target elements, the relativistic contributions to the stopping power amount to several times the results due to the longitudinal terms obtained from Walske's work.

  14. fatalityCMR: capture-recapture software to correct raw counts of wildlife fatalities using trial experiments for carcass detection probability and persistence time

    Science.gov (United States)

    Peron, Guillaume; Hines, James E.

    2014-01-01

    Many industrial and agricultural activities involve wildlife fatalities by collision, poisoning or other involuntary harvest: wind turbines, highway network, utility network, tall structures, pesticides, etc. Impacted wildlife may benefit from official protection, including the requirement to monitor the impact. Carcass counts can often be conducted to quantify the number of fatalities, but they need to be corrected for carcass persistence time (removal by scavengers and decay) and detection probability (searcher efficiency). In this article we introduce a new piece of software that fits a superpopulation capture-recapture model to raw count data. It uses trial data to estimate detection and daily persistence probabilities. A recurrent issue is that fatalities of rare, protected species are infrequent, in which case the software offers the option to switch to an ‘evidence of absence’ mode, i.e., estimate the number of carcasses that may have been missed by field crews. The software allows distinguishing between different turbine types (e.g. different vegetation cover under turbines, or different technical properties), as well between two carcass age-classes or states, with transition between those classes (e.g, fresh and dry). There is a data simulation capacity that may be used at the planning stage to optimize sampling design. Resulting mortality estimates can be used 1) to quantify the required amount of compensation, 2) inform mortality projections for proposed development sites, and 3) inform decisions about management of existing sites.
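
    The correction logic can be illustrated with a toy calculation (a deliberately simplified Horvitz-Thompson-style sketch, not the package's actual superpopulation capture-recapture model; all numbers are invented):

        def corrected_fatalities(carcasses_found, detection_prob, persistence_prob):
            """Naive correction: divide the raw count by the probability that
            a carcass both persists until a search and is then detected."""
            return carcasses_found / (detection_prob * persistence_prob)

        # e.g. 12 carcasses found, searchers detect 60% of trial carcasses,
        # and 50% of trial carcasses persist until the next search:
        print(corrected_fatalities(12, 0.60, 0.50))   # -> 40.0 estimated fatalities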

  15. Estimating reach-specific fish movement probabilities in rivers with a Bayesian state-space model: application to sea lamprey passage and capture at dams

    Science.gov (United States)

    Holbrook, Christopher M.; Johnson, Nicholas S.; Steibel, Juan P.; Twohey, Michael B.; Binder, Thomas R.; Krueger, Charles C.; Jones, Michael L.

    2014-01-01

    Improved methods are needed to evaluate barriers and traps for control and assessment of invasive sea lamprey (Petromyzon marinus) in the Great Lakes. A Bayesian state-space model provided reach-specific probabilities of movement, including trap capture and dam passage, for 148 acoustic tagged invasive sea lamprey in the lower Cheboygan River, Michigan, a tributary to Lake Huron. Reach-specific movement probabilities were combined to obtain estimates of spatial distribution and abundance needed to evaluate a barrier and trap complex for sea lamprey control and assessment. Of an estimated 21 828 – 29 300 adult sea lampreys in the river, 0%–2%, or 0–514 untagged lampreys, could have passed upstream of the dam, and 46%–61% were caught in the trap. Although no tagged lampreys passed above the dam (0/148), our sample size was not sufficient to consider the lock and dam a complete barrier to sea lamprey. Results also showed that existing traps are in good locations because 83%–96% of the population was vulnerable to existing traps. However, only 52%–69% of lampreys vulnerable to traps were caught, suggesting that traps can be improved. The approach used in this study was a novel use of Bayesian state-space models that may have broader applications, including evaluation of barriers for other invasive species (e.g., Asian carp (Hypophthalmichthys spp.)) and fish passage structures for other diadromous fishes.
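
    The multiplicative way reach-specific probabilities combine into estimates of spatial distribution can be sketched with a toy chain (invented probabilities; the actual model estimates these jointly, with uncertainty, in a Bayesian state-space framework):

        # Hypothetical reach-specific probabilities for an upstream migrant.
        p_ascend = [0.9, 0.8]   # probability of moving up through reaches 1 and 2
        p_trap = 0.55           # probability of capture in the trap at the dam
        p_pass = 0.01           # probability of passing the dam uncaptured

        p_reach_dam = 1.0
        for p in p_ascend:
            p_reach_dam *= p    # movement probabilities chain multiplicatively

        print("P(reach the dam) =", p_reach_dam)            # 0.72
        print("P(caught in trap) =", p_reach_dam * p_trap)  # ~0.40
        print("P(pass above dam) =", p_reach_dam * p_pass)  # ~0.007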

  16. Role of projectile K̄ electrons in single and double K → K̄ transfer: Comparison of passive K̄-electron models and of the IFPM with data for Cl(17+, 16+, ≤14+) + Ti

    International Nuclear Information System (INIS)

    Becker, R.L.

    1987-01-01

    Electron transfer between a neutral target and a projectile ion is one of the more interesting and difficult processes to calculate. Experimentally, there is no simple yet clean way to measure transfer from a given shell to a given shell. For the case of K → K̄ transfer (the bar designating the projectile), an indirect method is common. One measures K-vacancy cross sections for projectiles with ionic charges q = Z, Z−1, and ≤(Z−2). Then, with the assumption that the initial K̄ electrons are inert, one infers the single- and double-transfer (K¹ → K̄¹ and K² → K̄²) cross sections from linear combinations of the measured cross sections. The postulate that K̄ electrons are inert is brought into doubt by noting that the probability of inverse (K̄ → K) transfer is equal, by time-reversal invariance, to that for K → K̄ transfer. An extensive set of such measurements has been reported recently by Hall et al. for the nearly symmetric, strongly interacting systems ₁₇Cl(q+) + ₂₂Ti. We have performed coupled-channels calculations for these systems and have compared results of various forms of the independent Fermi particle model (IFPM), with and without the assumption that any initially present K̄ electron is passive. The passive K̄-electron models provide only a fair approximation to the results of the full IFPM. (orig.)

  17. Multiple electron capture in close ion-atom collisions

    International Nuclear Information System (INIS)

    Schlachter, A.S.; Stearns, J.W.; Berkner, K.H.

    1989-01-01

    Collisions in which a fast highly charged ion passes within the orbit of a K electron of a target gas atom are selected by the emission of a K x-ray from the projectile or target. Measurement of the projectile charge state after the collision, in coincidence with the K x-ray, allows measurement of the charge-transfer probability during these close collisions. When the projectile velocity is approximately the same as that of the target electrons, a large number of electrons can be transferred to the projectile in a single collision. The electron-capture probability is found to be a linear function of the number of vacancies in the projectile L shell for 47-MeV calcium ions in an Ar target. 18 refs., 9 figs

  18. Capturing Thoughts, Capturing Minds?

    DEFF Research Database (Denmark)

    Nielsen, Janni

    2004-01-01

    Think Aloud is cost-effective, promises access to the user's mind, and is a widely applied usability technique. But 'keep talking' is difficult; besides, the multimodal interface is visual, not verbal. Eye-tracking seems to get around the verbalisation problem. It captures the visual focus of attention...

  19. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.

  20. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P plots.

  1. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  2. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  3. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  4. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think ... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...

  5. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  6. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  7. Gravitational capture

    International Nuclear Information System (INIS)

    Bondi, H.

    1979-01-01

    In spite of the strength of gravitational forces between celestial bodies, gravitational capture is not a simple concept. The principles of conservation of linear momentum and of conservation of angular momentum always impose severe constraints, while conservation of energy and the vital distinction between dissipative and non-dissipative systems allows one to rule out capture in a wide variety of cases. In complex systems, especially those without dissipation, long dwell time is a more significant concept than permanent capture. (author)

  8. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  9. Negative muon capture in noble gas mixtures

    International Nuclear Information System (INIS)

    Hutson, R.L.; Knight, J.D.; Leon, M.; Schillaci, M.E.; Knowles, H.B.; Reidy, J.J.

    1980-01-01

    We have determined the probabilities of atomic negative muon capture in binary mixtures of the gases He, Ne, Ar, and Kr at partial pressures near five atmospheres. Relative capture rates were deduced from measured muonic X-ray yields. (orig.)

  10. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming framework that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  11. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which include subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2), and that i...

  12. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events y, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  13. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.

  14. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  15. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, G.G.; Man'ko, V.I.

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the...

  16. Negative meson capture in hydrogen

    International Nuclear Information System (INIS)

    Baird, T.J.

    1977-01-01

    The processes of deexcitation and capture of negative mesons and hadrons in atomic hydrogen are investigated. Only slow collisions in which the projectile-atom relative velocity is less than one atomic unit are considered, and the motion of the incident particle is treated classically. For each classical trajectory the probability of ionizing the hydrogen atom is determined, together with the energy spectrum of the emitted electron. Ionization probabilities are calculated using the time-dependent formulation of the perturbed stationary state method. Exact two-center electronic wave functions are used for both bound and continuum states. The total ionization cross section and electron energy spectrum have been calculated for negative muons, kaons and antiprotons at incident relative velocities between 0.04 and 1.0 atomic units. The electron energy spectrum has a sharp peak for electron kinetic energies on the order of 10⁻³ Rydbergs. The ionization process thus favors the emission of very slow electrons. The cross section for ionization with capture of the incident particle was calculated for relative kinetic energies greater than 1.0 Rydberg. Since ionization was found to occur with the emission of electrons of nearly zero kinetic energy, the fraction of ionizing collisions which result in capture decreases very rapidly with projectile kinetic energy. The energy distributions of slowed-down muons and hadrons were also computed. These distributions were used together with the capture cross section to determine the distribution of kinetic energies at which capture takes place. It was found that most captures occur at kinetic energies slightly less than 1.0 Rydberg, with relatively little capture at thermal energies. The captured particles therefore tend to go into very large and loosely bound orbits with binding energies less than 0.1 Rydberg.

  17. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  18. Risk estimation using probability machines

    Science.gov (United States)

    2014-01-01

    Background: Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results: We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions: The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
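
    A minimal sketch of a probability machine in this sense, using scikit-learn's random forest on synthetic data; the data-generating model, sample size and hyperparameters are illustrative choices of ours, and the effect size is read off as a counterfactual difference of predicted probabilities:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        n = 2000
        x1 = rng.normal(size=n)             # continuous predictor
        x2 = rng.integers(0, 2, size=n)     # binary exposure of interest
        logit = -0.5 + 1.0 * x1 + 0.8 * x2
        y = rng.random(n) < 1 / (1 + np.exp(-logit))  # logistic generating model

        X = np.column_stack([x1, x2])
        rf = RandomForestClassifier(n_estimators=500, min_samples_leaf=20,
                                    random_state=0).fit(X, y)

        # Counterfactual effect size: predicted risk with the exposure set
        # to 1 versus 0 for every observation, then averaged.
        X1, X0 = X.copy(), X.copy()
        X1[:, 1], X0[:, 1] = 1, 0
        risk_diff = rf.predict_proba(X1)[:, 1] - rf.predict_proba(X0)[:, 1]
        print("average risk difference for x2:", risk_diff.mean())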

  19. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  20. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.

  1. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  2. Quantum probability and conceptual combination in conjunctions.

    Science.gov (United States)

    Hampton, James A

    2013-06-01

    I consider the general problem of category conjunctions in the light of Pothos & Busemeyer (P&B)'s quantum probability (QP) account of the conjunction fallacy. I argue that their account as presented cannot capture the "guppy effect" - the case in which a class is a better member of a conjunction A∧B than it is of either A or B alone.

  3. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  4. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  5. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.
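    The gradient property can be checked directly in the simplest special case, the multinomial logit, where the CPGF is the log-sum-exp of the utilities and its gradient is the softmax vector of choice probabilities. The numerical check below illustrates that one case only; it is not the authors' code.

        import numpy as np

        def cpgf(u):
            # Log-sum-exp CPGF of the multinomial logit ARUM.
            m = u.max()
            return m + np.log(np.exp(u - m).sum())

        def choice_probs(u):
            # Gradient of the CPGF = softmax = logit choice probabilities.
            e = np.exp(u - u.max())
            return e / e.sum()

        u = np.array([1.0, 0.5, -0.3])
        eps = 1e-6
        grad = np.array([(cpgf(u + eps * np.eye(3)[i]) - cpgf(u - eps * np.eye(3)[i])) / (2 * eps)
                         for i in range(3)])
        assert np.allclose(grad, choice_probs(u), atol=1e-8)  # gradient = choice probabilities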

  6. Device of capturing for radioactive corrosion products

    International Nuclear Information System (INIS)

    Ohara, Atsushi; Fukushima, Kimichika.

    1984-01-01

    Purpose: To increase the area of contact between the coolants and the materials that capture the radioactive corrosion products contained in the coolants, by producing stirred turbulent flow in the coolant flow channel of LMFBR type reactors. Constitution: Constituent materials of the nuclear fuel elements or the reactor core structures are activated under neutron irradiation, corroded, and transferred into the coolants. Capturing devices made of pure nickel metal are used to remove the corrosion products, but since the coolants form laminar flows, due to their viscosity, near the surface of the capturing materials, the probability that corrosion products in the coolants flowing through the middle of the channel contact the capturing materials is reduced. In this invention, rotating rolls and flow channels in which balls are rotated are disposed upstream of the capturing device to forcibly disturb the flow of the liquid sodium, whereby the radioactive corrosion products can be captured effectively. (Kamimura, M.)

  7. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.
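    For intuition only, a kinetic-theory style estimate treats the background population as a uniform gas, so that P = 1 - exp(-S·v_rel·A·T); the numbers below are hypothetical, and the paper's own method integrates over the actual satellite population rather than assuming uniformity.

        import math

        S = 1e-9                   # satellites per km^3 near the station (hypothetical)
        v_rel = 7.0                # mean relative velocity, km/s (hypothetical)
        A = 1e-2                   # combined collision cross-section, km^2 (hypothetical)
        T = 10 * 365.25 * 86400    # mission duration: 10 years, in seconds

        expected_hits = S * v_rel * A * T       # expected number of encounters
        p_collision = 1 - math.exp(-expected_hits)
        print(f"P(collision over mission) ~ {p_collision:.2e}")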

  8. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  9. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introduction ...

  10. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory. Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of various ...

  11. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  12. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particular ...

  13. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  14. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  15. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  16. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  17. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or mathematics. Includes problems with answers and six appendixes. 1965 edition.

  18. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements and abstract measure-theoretical considerations to contributions to theoretical statistics and ...

  19. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  20. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fields ...

  1. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit theorem ...

  2. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  3. Negative pion capture in chemical compounds

    International Nuclear Information System (INIS)

    Butsev, V.S.; Chultem, D.; Gavrilov, Yu.K.; Ganzorig, Dz.; Norseev, Yu.V.; Presperin, V.

    1976-01-01

    The results are reported of an experiment to determine the probability of capture of stopped negative pions by iodine nuclei in alkali metal iodides (LiI, NaI, KI, RbI, CsI). The yield of the high-spin isomer ¹¹⁶ᵐSb (8⁻), formed in the reaction ¹²⁷I(π⁻, 1p10n), allows the relative probability of nuclear capture of pions in the above compounds to be determined. The results obtained are compared with the predictions of the Fermi-Teller Z-law
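    The Fermi-Teller Z-law mentioned above predicts that, in a binary compound, the probability of nuclear capture on each atomic species is proportional to its atomic number. A quick sketch of that prediction for the alkali iodides (the comparison with the measured yields is what the paper tests):

        # Fermi-Teller Z-law prediction for the fraction of stopped pions
        # captured on iodine (Z = 53) in the alkali iodides.
        Z_I = 53
        alkalis = {"LiI": 3, "NaI": 11, "KI": 19, "RbI": 37, "CsI": 55}
        for salt, Z_M in alkalis.items():
            w_iodine = Z_I / (Z_I + Z_M)   # capture probability proportional to Z
            print(f"{salt}: predicted capture fraction on iodine = {w_iodine:.2f}")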

  4. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  5. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  6. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  7. Radiative electron capture by channeled ions

    International Nuclear Information System (INIS)

    Pitarke, J.M.; Ritchie, R.H.; Tennessee Univ., Knoxville, TN

    1989-01-01

    Considerable experimental data have been accumulated relative to the emission of photons accompanying electron capture by swift, highly stripped atoms penetrating crystalline matter under channeling conditions. Recent data suggest that the photon energies may be less than that expected from simple considerations of transitions from the valence band of the solid to hydrogenic states on the moving ion. We have studied theoretically the impact parameter dependence of the radiative electron capture (REC) process, the effect of the ion's wake and the effect of capture from inner shells of the solid on the photon emission probability, using a statistical approach. Numerical comparisons of our results with experiment are made. 13 refs., 6 figs

  8. Radiative electron capture

    International Nuclear Information System (INIS)

    Biggerstaff, J.A.; Appleton, B.R.; Datz, S.; Moak, C.D.; Neelavathi, V.N.; Noggle, T.S.; Ritchie, R.H.; VerBeek, H.

    1975-01-01

    Some data are presented for radiative electron capture by fast moving ions. The radiative electron capture spectrum is shown for O⁸⁺ in Ag, along with the energy dependence of the capture cross-section. A discrepancy between earlier data, theoretical prediction, and the present data is pointed out. (3 figs) (U.S.)

  9. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)
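    The structural parallel between conditional entropies and conditional probabilities is easy to make concrete; the joint distribution below is an arbitrary example, chosen only to show that the two conditional entropies H(Y|X) and H(X|Y) generally differ.

        import numpy as np

        # Arbitrary example joint distribution p(x, y) (rows: x, columns: y).
        p = np.array([[0.30, 0.10],
                      [0.05, 0.55]])

        def cond_entropy(joint, axis):
            # H(A|B) = -sum_{a,b} p(a, b) * log2 p(a|b), where the marginal
            # p(b) is obtained by summing the joint over the given axis.
            marg = joint.sum(axis=axis, keepdims=True)
            cond = joint / marg
            return -(joint * np.log2(cond)).sum()

        H_y_given_x = cond_entropy(p, axis=1)   # H(Y|X): marginalize over y
        H_x_given_y = cond_entropy(p, axis=0)   # H(X|Y): marginalize over x
        print(H_y_given_x, H_x_given_y)         # the two values differ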

  10. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more - these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science, which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairly ...

  11. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  12. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability at a given probability of false alarm.

  13. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.

  14. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation; Generating Functions; Branching Processes; Random Walk Revisited; Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  15. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment, for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out, etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds ...... For the causation probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...
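    The two-step structure described above reduces to a product of a geometric encounter frequency and a causation factor; the sketch below is schematic, with hypothetical numbers standing in for the quantities the report derives from traffic simulation and navigator studies.

        # Two-step collision probability model (schematic, hypothetical numbers):
        # step 1: expected number of geometric collision candidates per year,
        # i.e. situations that end in collision if no evasive action is taken;
        # step 2: causation factor = probability that the navigators fail to act.
        n_geometric = 40.0      # geometric collision candidates per year (hypothetical)
        p_causation = 2e-4      # causation factor (hypothetical)

        collision_rate = n_geometric * p_causation
        print(f"expected collision frequency: {collision_rate:.1e} per year")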

  16. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    To recover the latent subjective probability one must either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  17. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, including ...

  18. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  19. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  20. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrödinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general.

  1. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    Kim, Y.K.

    1980-01-01

    Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods

  2. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extended ...

  3. Stimulus-driven capture and contingent capture

    NARCIS (Netherlands)

    Theeuwes, J.; Olivers, C.N.L.; Belopolsky, A.V.

    2010-01-01

    Whether or not certain physical events can capture attention has been one of the most debated issues in the study of attention. This discussion is concerned with how goal-directed and stimulus-driven processes interact in perception and cognition. On one extreme of the spectrum is the idea that

  4. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  5. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramér-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  6. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  7. Waste Package Misload Probability

    International Nuclear Information System (INIS)

    Knudsen, J.K.

    2001-01-01

    The objective of this calculation is to determine the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and of FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the events. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a
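    The calculation described is, at bottom, a frequency estimate; the counts below are made up purely to show the arithmetic, the real inputs being the categorized events and movement totals from the Framatome ANP 2001a data.

        # Frequency estimate of FA misload and damage probabilities
        # (made-up counts, for illustration of the arithmetic only).
        misload_events = 5            # hypothetical number of FA misload events
        damage_events = 12            # hypothetical number of FA damage events
        total_fa_moves = 2_000_000    # hypothetical total number of FA movements

        p_misload = misload_events / total_fa_moves
        p_damage = damage_events / total_fa_moves
        print(f"P(misload per FA movement) ~ {p_misload:.1e}")
        print(f"P(damage per FA movement)  ~ {p_damage:.1e}")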

  8. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  9. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  10. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  11. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses represent different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example.

  12. Retrocausality and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two accounts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagen-like disavowal of realism in quantum mechanics. 6 refs. (Author)

  13. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability, which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  14. Capture ready study

    Energy Technology Data Exchange (ETDEWEB)

    Minchener, A.

    2007-07-15

    There are a large number of ways in which the capture of carbon as carbon dioxide (CO{sub 2}) can be integrated into fossil fuel power stations, most being applicable to both gas and coal feedstocks. Added to this choice of technology is the question of whether an existing plant should be retrofitted for capture, or whether it is more attractive to build a totally new plant. This miscellany of choices adds considerably to the commercial risk of investing in a large power station. An intermediate stage between the non-capture and full-capture states would be advantageous in helping to determine the best way forward and hence reduce those risks. In recent years the term 'carbon capture ready' or 'capture ready' has been coined to describe such an intermediate-stage plant and is now widely used. However, a detailed and all-encompassing definition of this term has never been published. All fossil-fuel-consuming plants produce a carbon dioxide byproduct, and there is the possibility of scrubbing it with an appropriate CO{sub 2} solvent. Hence it could be said that all fossil fuel plant is in a condition for removal of its CO{sub 2} effluent and therefore already in a 'capture ready' state. Evidently, the practical reality of solvent scrubbing could cost more than the rewards offered by schemes such as the EU Emissions Trading Scheme (ETS). In that case, although the possibility exists of capturing CO{sub 2}, it is not a commercially viable option and the plant could not be described as ready for CO{sub 2} capture. The boundary between a capture ready and a non-capture ready condition using this definition cannot be determined in an objective and therefore universally acceptable way, and criteria must be found which are less onerous and less potentially contentious to assess. 16 refs., 2 annexes.

  15. Capture of a quantum particle by a moving trapping potential

    International Nuclear Information System (INIS)

    Shegelski, Mark R A; Poole, Tyler; Thompson, Cole

    2013-01-01

    We investigate the capture of a quantum particle in free space by a moving trapping potential. The capture is investigated for various initial conditions. We examine the dependence of the probability of capture on the shape and depth of the trapping potential, the initial speed and mass of the particle, and other parameters. We take the trapping potential to move with an initial speed v₀ and to decelerate with constant acceleration a_c until the well stops moving. We study the motion during the time the well is moving and after the well has stopped. We compare the probability of capture to the probability that the particle is in a stationary state (while the well is moving) and the probability that the particle is in a bound state (after the well's motion has stopped). Our work could be of interest to instructors and students in upper-year undergraduate quantum mechanics courses. (paper)
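    A minimal one-dimensional split-step sketch of this kind of setup - a wave packet at rest overtaken by a decelerating square well, with the norm left in the well region as a crude proxy for the capture probability. All parameters are illustrative assumptions, not the paper's model.

        import numpy as np

        hbar = m = 1.0
        N, L = 2048, 400.0
        x = np.linspace(-L / 2, L / 2, N, endpoint=False)
        k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)
        dt, steps = 0.05, 5000

        V0, w = 2.0, 4.0          # well depth and half-width (assumed)
        v0, a_c = 1.0, 0.005      # well's initial speed and deceleration (assumed)
        t_stop = v0 / a_c         # time at which the well stops

        def well_center(t):
            if t < t_stop:
                return -100.0 + v0 * t - 0.5 * a_c * t ** 2
            return -100.0 + 0.5 * v0 * t_stop      # at rest after t_stop

        # Particle initially at rest at x = -50; the well sweeps over it.
        psi = (1 / np.pi) ** 0.25 * np.exp(-0.5 * (x + 50.0) ** 2)
        psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * (L / N))

        kinetic = np.exp(-1j * hbar * k ** 2 / (2 * m) * dt)
        for n in range(steps):
            V = np.where(np.abs(x - well_center(n * dt)) < w, -V0, 0.0)
            psi *= np.exp(-1j * V * dt / (2 * hbar))        # half potential kick
            psi = np.fft.ifft(kinetic * np.fft.fft(psi))    # full kinetic step
            psi *= np.exp(-1j * V * dt / (2 * hbar))        # half potential kick

        # Crude capture proxy: probability remaining near the stopped well.
        inside = np.abs(x - well_center(steps * dt)) < 2 * w
        p_capture = np.sum(np.abs(psi[inside]) ** 2) * (L / N)
        print(f"capture probability (well-region norm): {p_capture:.3f}")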

  16. Carbon Capture and Storage

    NARCIS (Netherlands)

    Benson, S.M.; Bennaceur, K.; Cook, P.; Davison, J.; Coninck, H. de; Farhat, K.; Ramirez, C.A.; Simbeck, D.; Surles, T.; Verma, P.; Wright, I.

    2012-01-01

    Emissions of carbon dioxide, the most important long-lived anthropogenic greenhouse gas, can be reduced by Carbon Capture and Storage (CCS). CCS involves the integration of four elements: CO2 capture, compression of the CO2 from a gas to a liquid or a denser gas, transportation of pressurized CO2, and storage.

  17. CAPTURED India Country Evaluation

    NARCIS (Netherlands)

    O'Donoghue, R.; Brouwers, J.H.A.M.

    2012-01-01

    This report provides the findings of the India Country Evaluation and is produced as part of the overall CAPTURED End Evaluation. After five years of support by the CAPTURED project, the End Evaluation has assessed that results are commendable. I-AIM was able to design an approach in which health

  18. Interatomic Coulombic electron capture

    International Nuclear Information System (INIS)

    Gokhberg, K.; Cederbaum, L. S.

    2010-01-01

    In a previous publication [K. Gokhberg and L. S. Cederbaum, J. Phys. B 42, 231001 (2009)] we presented the interatomic Coulombic electron capture process--an efficient electron capture mechanism by atoms and ions in the presence of an environment. In the present work we derive and discuss the mechanism in detail. We demonstrate thereby that this mechanism belongs to a family of interatomic electron capture processes driven by electron correlation. In these processes the excess energy released in the capture event is transferred to the environment and used to ionize (or to excite) it. This family includes the processes where the capture is into the lowest or into an excited unoccupied orbital of an atom or ion and proceeds in step with the ionization (or excitation) of the environment, as well as the process where an intermediate autoionizing excited resonance state is formed in the capturing center which subsequently deexcites to a stable state transferring its excess energy to the environment. Detailed derivation of the asymptotic cross sections of these processes is presented. The derived expressions make clear that the environment assisted capture processes can be important for many systems. Illustrative examples are presented for a number of model systems for which the data needed to construct the various capture cross sections are available in the literature.

  19. Probability mapping of contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).
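    The post-processing step at the end amounts to averaging indicator values across the equally likely realizations; a minimal sketch, with synthetic random fields standing in for the geostatistical simulations and a made-up threshold.

        import numpy as np

        rng = np.random.default_rng(1)

        # Stand-ins for the geostatistical realizations: n_real equally likely
        # simulated concentration maps on an ny-by-nx grid (synthetic).
        n_real, ny, nx = 200, 50, 50
        realizations = rng.lognormal(mean=3.0, sigma=0.8, size=(n_real, ny, nx))

        threshold = 35.0   # hypothetical clean-up threshold

        # Probability map: fraction of realizations exceeding the threshold
        # in each cell, i.e. P(concentration > threshold).
        p_exceed = (realizations > threshold).mean(axis=0)
        print("cells with P(exceed) > 0.5:", int((p_exceed > 0.5).sum()))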

  20. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)

  1. Probability of causation approach

    International Nuclear Information System (INIS)

    Jose, D.E.

    1988-01-01

    Probability of causation (PC) is sometimes viewed as a great improvement by those persons who are not happy with the present rulings of courts in radiation cases. The author does not share that hope and expects that PC will not play a significant role in these issues for at least the next decade. If it is ever adopted in a legislative compensation scheme, it will be used in a way that is unlikely to please most scientists. Consequently, PC is a false hope for radiation scientists, and its best contribution may well lie in some of the spin-off effects, such as an influence on medical practice

  2. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

    From the integration of non-symmetric hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable for generalizing some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments are also obtained analytically.
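    A common one-parameter family of this kind (the paper's parametrization may differ) is ln_q(x) = (x**q - 1)/q, which recovers ln(x) as q -> 0 and has inverse exp_q(y) = (1 + q*y)**(1/q); a small sketch verifying the inverse pair:

        import numpy as np

        def gen_log(x, q):
            # Generalized logarithm: (x**q - 1)/q, tending to ln(x) as q -> 0.
            return np.log(x) if q == 0 else (np.power(x, q) - 1.0) / q

        def gen_exp(y, q):
            # Inverse of gen_log: (1 + q*y)**(1/q), tending to exp(y) as q -> 0.
            return np.exp(y) if q == 0 else np.power(1.0 + q * y, 1.0 / q)

        x = 2.5
        for q in (-0.5, 0.0, 0.5):
            assert np.isclose(gen_exp(gen_log(x, q), q), x)   # inverse pair holds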

  3. Probability in High Dimension

    Science.gov (United States)

    2014-06-30

  4. Optional carbon capture

    Energy Technology Data Exchange (ETDEWEB)

    Alderson, T.; Scott, S.; Griffiths, J. [Jacobs Engineering, London (United Kingdom)

    2007-07-01

    In the case of IGCC power plants, carbon capture can be carried out before combustion: the carbon monoxide in the syngas is catalytically shifted to carbon dioxide, which is then captured in a standard gas absorption system. However, the insertion of a shift converter into an existing IGCC plant with no shift would mean a near total rebuild of the gasification waste heat recovery, gas treatment system and HRSG, with only the gasifier and gas turbine retaining most of their original features. To reduce the extent, cost and time taken for the revamp, the original plant could incorporate the shift; the plant would then be operated to advantage without capture, and converted to capture operation when commercially appropriate. This paper examines this concept of placing a shift converter into an IGCC plant before capture is required, and operating the same plant first without and later with CO{sub 2} capture, in a European context. The advantages and disadvantages of this 'capture ready' option are discussed. 6 refs., 2 figs., 4 tabs.

  5. Probability shapes perceptual precision: A study in orientation estimation.

    Science.gov (United States)

    Jabar, Syaheed B; Anderson, Britt

    2015-12-01

    Probability is known to affect perceptual estimations, but an understanding of mechanisms is lacking. Moving beyond binary classification tasks, we had naive participants report the orientation of briefly viewed gratings where we systematically manipulated contingent probability. Participants rapidly developed faster and more precise estimations for high-probability tilts. The shapes of their error distributions, as indexed by a kurtosis measure, also showed a distortion from Gaussian. This kurtosis metric was robust, capturing probability effects that were graded, contextual, and varying as a function of stimulus orientation. Our data can be understood as a probability-induced reduction in the variability or "shape" of estimation errors, as would be expected if probability affects the perceptual representations. As probability manipulations are an implicit component of many endogenous cuing paradigms, changes at the perceptual level could account for changes in performance that might have traditionally been ascribed to "attention." (c) 2015 APA, all rights reserved.

  6. Probable maximum flood control

    International Nuclear Information System (INIS)

    DeGabriele, C.E.; Wu, C.L.

    1991-11-01

    This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility.

  7. CAPTURE OF TROJANS BY JUMPING JUPITER

    International Nuclear Information System (INIS)

    Nesvorný, David; Vokrouhlický, David; Morbidelli, Alessandro

    2013-01-01

    Jupiter Trojans are thought to be survivors of a much larger population of planetesimals that existed in the planetary region when planets formed. They can provide important constraints on the mass and properties of the planetesimal disk, and its dispersal during planet migration. Here, we tested the possibility that the Trojans were captured during the early dynamical instability among the outer planets (aka the Nice model), when the semimajor axis of Jupiter was changing as a result of scattering encounters with an ice giant. The capture occurs in this model when Jupiter's orbit and its Lagrange points become radially displaced in a scattering event and fall into a region populated by planetesimals (that previously evolved from their natal transplanetary disk to ∼5 AU during the instability). Our numerical simulations of the new capture model, hereafter jump capture, satisfactorily reproduce the orbital distribution of the Trojans and their total mass. The jump capture is potentially capable of explaining the observed asymmetry in the number of leading and trailing Trojans. We find that the capture probability is (6-8) × 10⁻⁷ for each particle in the original transplanetary disk, implying that the disk contained (3-4) × 10⁷ planetesimals with absolute magnitude H < 9. The disk mass inferred from this, M_disk ∼ 14-28 M_Earth, is consistent with the mass deduced from recent dynamical simulations of the planetary instability.

  8. Probability and rational choice

    Directory of Open Access Journals (Sweden)

    David Botting

    2014-05-01

    Full Text Available http://dx.doi.org/10.5007/1808-1711.2014v18n1p1 In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance) and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.

  9. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
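
    COVAL's numerical-transformation algorithm is not reproduced here, but the same question, the distribution of a function of random variables, can be approximated by plain Monte Carlo sampling. The load and resistance distributions and their parameters below are invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 200_000

    # Hypothetical structural-reliability setting (illustrative distributions):
    # resistance R and random load S; the derived quantity is the margin g = R - S.
    R = rng.normal(loc=300.0, scale=30.0, size=n)   # resistance
    S = rng.gumbel(loc=180.0, scale=25.0, size=n)   # load
    g = R - S

    # The empirical distribution of g approximates the compound distribution;
    # the failure probability is the mass below zero.
    print("P(failure) ~", (g < 0.0).mean())
    print("5th percentile of the margin:", round(np.percentile(g, 5), 1))
    ```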

  10. Stability of a slotted ALOHA system with capture effect

    Science.gov (United States)

    Onozato, Yoshikuni; Liu, Jin; Noguchi, Shoichi

    1989-02-01

    The stability of a slotted ALOHA system with capture effect is investigated under a general communication environment where terminals are divided into two groups (low-power and high-power) and the capture effect is modeled by capture probabilities. An approximate analysis is developed using catastrophe theory, in which the effects of system and user parameters on the stability are characterized by the cusp catastrophe. Particular attention is given to the low-power group, since it must bear the strain under the capture effect. The stability conditions of the two groups are given explicitly by bifurcation sets.
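
    A minimal brute-force simulation of such a two-group channel might look like the sketch below. The group sizes, transmission probability, and the single capture probability are illustrative assumptions; the paper itself works analytically with catastrophe theory rather than simulation:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def aloha_throughput(n_lo, n_hi, p_tx, p_capture, n_slots=100_000):
        """Per-slot success rates for the low- and high-power groups.

        A slot succeeds outright if exactly one terminal transmits; in a
        collision involving at least one high-power terminal, one high-power
        packet is captured with probability `p_capture` (simplified model).
        """
        ok_lo = ok_hi = 0
        for _ in range(n_slots):
            tx_lo = rng.binomial(n_lo, p_tx)
            tx_hi = rng.binomial(n_hi, p_tx)
            total = tx_lo + tx_hi
            if total == 1:
                if tx_lo == 1:
                    ok_lo += 1
                else:
                    ok_hi += 1
            elif total > 1 and tx_hi >= 1 and rng.random() < p_capture:
                ok_hi += 1  # capture favours the high-power group
        return ok_lo / n_slots, ok_hi / n_slots

    print(aloha_throughput(n_lo=20, n_hi=5, p_tx=0.05, p_capture=0.4))
    ```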

  11. US Spacesuit Knowledge Capture

    Science.gov (United States)

    Chullen, Cinda; Thomas, Ken; McMann, Joe; Dolan, Kristi; Bitterly, Rose; Lewis, Cathleen

    2011-01-01

    The ability to learn from both the mistakes and successes of the past is vital to assuring success in the future. Due to the close physical interaction between spacesuit systems and human beings as users, spacesuit technology and usage lends itself rather uniquely to the benefits realized from the skillful organization of historical information; its dissemination; the collection and identification of artifacts; and the education of those in the field. The National Aeronautics and Space Administration (NASA), other organizations and individuals have been performing United States (U.S.) Spacesuit Knowledge Capture since the beginning of space exploration. Avenues used to capture the knowledge have included publication of reports; conference presentations; specialized seminars; and classes usually given by veterans in the field. More recently the effort has been more concentrated and formalized whereby a new avenue of spacesuit knowledge capture has been added to the archives in which videotaping occurs engaging both current and retired specialists in the field presenting technical scope specifically for education and preservation of knowledge. With video archiving, all these avenues of learning can now be brought to life with the real experts presenting their wealth of knowledge on screen for future learners to enjoy. Scope and topics of U.S. spacesuit knowledge capture have included lessons learned in spacesuit technology, experience from the Gemini, Apollo, Skylab and Shuttle programs, hardware certification, design, development and other program components, spacesuit evolution and experience, failure analysis and resolution, and aspects of program management. Concurrently, U.S. spacesuit knowledge capture activities have progressed to a level where NASA, the National Air and Space Museum (NASM), Hamilton Sundstrand (HS) and the spacesuit community are now working together to provide a comprehensive closed-loop spacesuit knowledge capture system which includes

  12. Adiabatic capture and debunching

    International Nuclear Information System (INIS)

    Ng, K.Y.

    2012-01-01

    In the study of beam preparation for the g-2 experiment, adiabatic debunching and adiabatic capture are revisited. The voltage programs for these adiabatic processes are derived and their properties discussed. Comparison is made with some other form of adiabatic capture program. The muon g-2 experiment at Fermilab calls for intense proton bunches for the creation of muons. A booster batch of 84 bunches is injected into the Recycler Ring, where it is debunched and captured into 4 intense bunches with the 2.5-MHz rf. The experiment requires short bunches with total width less than 100 ns. The transport line from the Recycler to the muon-production target has a low momentum aperture of ∼ ±22 MeV. Thus each of the 4 intense proton bunches is required to have an emittance less than ∼ 3.46 eVs. The incoming booster bunches have total emittance ∼ 8.4 eVs, or ∼ 0.1 eVs each. However, there is always emittance increase when the 84 booster bunches are debunched. There will be even larger emittance increase during adiabatic capture into the buckets of the 2.5-MHz rf. In addition, the incoming booster bunches may have emittances larger than 0.1 eVs. In this article, we will concentrate on the analysis of the adiabatic capture process with the intention of preserving the beam emittance as much as possible. At this moment, a beam preparation experiment is being performed at the Main Injector. Since the Main Injector and the Recycler Ring have roughly the same lattice properties, our discussions refer to adiabatic capture in the Main Injector instead.
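
    For concreteness, one standard iso-adiabatic voltage program, obtained by holding the adiabaticity parameter constant while the synchrotron frequency scales as the square root of the rf voltage, can be sketched as below. This is a textbook form, not necessarily the exact program derived in the paper, and the numbers are illustrative:

    ```python
    import numpy as np

    def iso_adiabatic_voltage(t, T, V_i, V_f):
        """Iso-adiabatic rf-voltage ramp V(t) for 0 <= t <= T.

        With the synchrotron frequency proportional to sqrt(V), holding the
        adiabaticity parameter constant gives
            V(t) = V_i / (1 - (1 - sqrt(V_i/V_f)) * t/T)**2,
        so that V(0) = V_i and V(T) = V_f.
        """
        t = np.asarray(t, dtype=float)
        k = 1.0 - np.sqrt(V_i / V_f)
        return V_i / (1.0 - k * t / T) ** 2

    # Example: ramp the rf from a small initial voltage to full voltage
    t = np.linspace(0.0, 0.09, 10)   # seconds (illustrative)
    print(iso_adiabatic_voltage(t, T=0.09, V_i=2e3, V_f=8e4))  # volts (illustrative)
    ```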

  13. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  14. Motion Capturing Emotions

    OpenAIRE

    Wood Karen; Cisneros Rosemary E.; Whatley Sarah

    2017-01-01

    The paper explores the activities conducted as part of WhoLoDancE: Whole Body Interaction Learning for Dance Education which is an EU-funded Horizon 2020 project. In particular, we discuss the motion capture sessions that took place at Motek, Amsterdam as well as the dancers’ experience of being captured and watching themselves or others as varying visual representations through the HoloLens. HoloLens is Microsoft’s first holographic computer that you wear as you would a pair of glasses. The ...

  15. Nuclear muon capture

    CERN Document Server

    Mukhopadhyay, N C

    1977-01-01

    Our present knowledge of the nuclear muon capture reactions is surveyed. Starting from the formation of the muonic atom, various phenomena, having a bearing on the nuclear capture, are reviewed. The nuclear reactions are then studied from two angles: to learn about the basic muon+nucleon weak interaction process, and to obtain new insights on the nuclear dynamics. Future experimental prospects with the newer generation muon 'factories' are critically examined. Possible modification of the muon+nucleon weak interaction in complex nuclei remains the most important open problem in this field. (380 refs).

  16. Proton capture resonance studies

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, G.E. [North Carolina State University, Raleigh, North Carolina (United States) 27695]|[Triangle Universities Nuclear Laboratory, Durham, North Carolina (United States) 27708; Bilpuch, E.G. [Duke University, Durham, North Carolina (United States) 27708]|[Triangle Universities Nuclear Laboratory, Durham, North Carolina (United States) 27708; Bybee, C.R. [North Carolina State University, Raleigh, North Carolina (United States) 27695]|[Triangle Universities Nuclear Laboratory, Durham, North Carolina (United States) 27708; Cox, J.M.; Fittje, L.M. [Tennessee Technological University, Cookeville, Tennessee (United States) 38505]|[Triangle Universities Nuclear Laboratory, Durham, North Carolina (United States) 27708; Labonte, M.A.; Moore, E.F.; Shriner, J.D. [North Carolina State University, Raleigh, North Carolina (United States) 27695]|[Triangle Universities Nuclear Laboratory, Durham, North Carolina (United States) 27708; Shriner, J.F. Jr. [Tennessee Technological University, Cookeville, Tennessee (United States) 38505]|[Triangle Universities Nuclear Laboratory, Durham, North Carolina (United States) 27708; Vavrina, G.A. [North Carolina State University, Raleigh, North Carolina (United States) 27695]|[Triangle Universities Nuclear Laboratory, Durham, North Carolina (United States) 27708; Wallace, P.M. [Duke University, Durham, North Carolina (United States) 27708]|[Triangle Universities Nuclear Laboratory, Durham, North Carolina (United States) 27708

    1997-02-01

    The fluctuation properties of quantum systems now are used as a signature of quantum chaos. The analyses require data of extremely high quality. The {sup 29}Si(p,{gamma}) reaction is being used to establish a complete level scheme of {sup 30}P to study chaos and isospin breaking in this nuclide. Determination of the angular momentum J, the parity {pi}, and the isospin T from resonance capture data is considered. Special emphasis is placed on the capture angular distributions and on a geometric description of these angular distributions. {copyright} {ital 1997 American Institute of Physics.}

  17. Probability Theory Plus Noise: Descriptive Estimation and Inferential Judgment.

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2018-01-01

    We describe a computational model of two central aspects of people's probabilistic reasoning: descriptive probability estimation and inferential probability judgment. This model assumes that people's reasoning follows standard frequentist probability theory, but it is subject to random noise. This random noise has a regressive effect in descriptive probability estimation, moving probability estimates away from normative probabilities and toward the center of the probability scale. This random noise has an anti-regressive effect in inferential judgement, however. These regressive and anti-regressive effects explain various reliable and systematic biases seen in people's descriptive probability estimation and inferential probability judgment. This model predicts that these contrary effects will tend to cancel out in tasks that involve both descriptive estimation and inferential judgement, leading to unbiased responses in those tasks. We test this model by applying it to one such task, described by Gallistel et al. Participants' median responses in this task were unbiased, agreeing with normative probability theory over the full range of responses. Our model captures the pattern of unbiased responses in this task, while simultaneously explaining systematic biases away from normatively correct probabilities seen in other tasks. Copyright © 2018 Cognitive Science Society, Inc.
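
    A toy version of the noise mechanism is easy to simulate: retrieve N remembered samples, misread each with probability d, and average. The parameter values below are assumptions for illustration; the expected estimate is p(1 - 2d) + d, which regresses toward 0.5 as the model predicts:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def mean_noisy_estimate(p_true, n_samples, d, n_reps=20_000):
        """Average descriptive estimate of P(A) when each of `n_samples`
        retrieved samples is misread with probability d (d is assumed here)."""
        samples = rng.random((n_reps, n_samples)) < p_true   # true sample values
        flips = rng.random((n_reps, n_samples)) < d          # random read errors
        return (samples ^ flips).mean()                      # XOR applies the noise

    for p in (0.1, 0.5, 0.9):
        # Theory: E[estimate] = p*(1 - 2d) + d, i.e. regression toward 0.5.
        print(p, round(mean_noisy_estimate(p, n_samples=20, d=0.15), 3))
    ```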

  18. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD; PREFACE; Sets, Events, and Probability: The Algebra of Sets; The Bernoulli Sample Space; The Algebra of Multisets; The Concept of Probability; Properties of Probability Measures; Independent Events; The Bernoulli Process; The R Language. Finite Processes: The Basic Models; Counting Rules; Computing Factorials; The Second Rule of Counting; Computing Probabilities. Discrete Random Variables: The Bernoulli Process: Tossing a Coin; The Bernoulli Process: Random Walk; Independence and Joint Distributions; Expectations; The Inclusion-Exclusion Principle; General Random Variable

  19. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  20. Dependent Human Error Probability Assessment

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2006-01-01

    This paper presents an assessment of the dependence between dynamic operator actions modeled in a Nuclear Power Plant (NPP) PRA and estimates the associated impact on Core damage frequency (CDF). This assessment was done to improve the implementation of HEP dependencies inside the existing PRA. All of the dynamic operator actions modeled in the NPP PRA are included in this assessment. Determining the level of HEP dependence and the associated influence on CDF are the major steps of this assessment. A decision on how to apply the results, i.e., should permanent HEP model changes be made, is based on the resulting relative CDF increase. Some CDF increase was selected as a threshold based on the NPP base CDF value and acceptance guidelines from the Regulatory Guide 1.174. HEP dependencies resulting in a CDF increase of > 5E-07 would be considered potential candidates for specific incorporation into the baseline model. The approach used to judge the level of dependence between operator actions is based on dependency level categories and conditional probabilities developed in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications NUREG/CR-1278. To simplify the process, NUREG/CR-1278 identifies five levels of dependence: ZD (zero dependence), LD (low dependence), MD (moderate dependence), HD (high dependence), and CD (complete dependence). NUREG/CR-1278 also identifies several qualitative factors that could be involved in determining the level of dependence. Based on the NUREG/CR-1278 information, Time, Function, and Spatial attributes were judged to be the most important considerations when determining the level of dependence between operator actions within an accident sequence. These attributes were used to develop qualitative criteria (rules) that were used to judge the level of dependence (CD, HD, MD, LD, ZD) between the operator actions. After the level of dependence between the various HEPs is judged, quantitative values associated with the

  1. Muon capture in deuterium

    Czech Academy of Sciences Publication Activity Database

    Ricci, P.; Truhlík, Emil; Mosconi, B.; Smejkal, J.

    2010-01-01

    Roč. 837, - (2010), s. 110-144 ISSN 0375-9474 Institutional research plan: CEZ:AV0Z10480505 Keywords : Negative muon capture * Deuteron * Potential models Subject RIV: BE - Theoretical Physics Impact factor: 1.986, year: 2010

  2. Capture Matrices Handbook

    Science.gov (United States)

    2014-04-01

    materials, the affinity ligand would need identification, as well as chemistries that graft the affinity ligand onto the surface of magnetic... ACTIVE CAPTURE MATRICES FOR THE DETECTION/IDENTIFICATION OF PHARMACEUTICALS... 6 As shown in Figure 2.3-1a, the spectra exhibit similar baselines and the spectral peaks line up. Under these circumstances, the spectral

  3. Capacitance for carbon capture

    International Nuclear Information System (INIS)

    Landskron, Kai

    2018-01-01

    Metal recycling: A sustainable, capacitance-assisted carbon capture and sequestration method (Supercapacitive Swing Adsorption) can turn scrap metal and CO₂ into metal carbonates at an attractive energy cost. (copyright 2018 Wiley-VCH Verlag GmbH and Co. KGaA, Weinheim)

  4. Capacitance for carbon capture

    Energy Technology Data Exchange (ETDEWEB)

    Landskron, Kai [Department of Chemistry, Lehigh University, Bethlehem, PA (United States)

    2018-03-26

    Metal recycling: A sustainable, capacitance-assisted carbon capture and sequestration method (Supercapacitive Swing Adsorption) can turn scrap metal and CO{sub 2} into metal carbonates at an attractive energy cost. (copyright 2018 Wiley-VCH Verlag GmbH and Co. KGaA, Weinheim)

  5. Embedded enzymes catalyse capture

    Science.gov (United States)

    Kentish, Sandra

    2018-05-01

    Membrane technologies for carbon capture can offer economic and environmental advantages over conventional amine-based absorption, but can suffer from limited gas flux and selectivity to CO2. Now, a membrane based on enzymes embedded in hydrophilic pores is shown to exhibit combined flux and selectivity that challenges the state of the art.

  6. Attention Capture by Faces

    Science.gov (United States)

    Langton, Stephen R. H.; Law, Anna S.; Burton, A. Mike; Schweinberger, Stefan R.

    2008-01-01

    We report three experiments that investigate whether faces are capable of capturing attention when in competition with other non-face objects. In Experiment 1a participants took longer to decide that an array of objects contained a butterfly target when a face appeared as one of the distracting items than when the face did not appear in the array.…

  7. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in 1600, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", which is a function providing the probabilities of occurrence of different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied to statistical analysis.

  8. Formation of double galaxies by tidal capture

    International Nuclear Information System (INIS)

    Alladin, S.M.; Potdar, A.; Sastry, K.S.

    1975-01-01

    The conditions under which double galaxies may be formed by tidal capture are considered. Estimates for the increase in the internal energy of colliding galaxies due to tidal effects are used to determine the magnitudes Vsub(cap) and Vsub(dis) of the maximum relative velocities at infinite separation required for tidal capture and tidal disruption respectively. A double galaxy will be formed by tidal capture without tidal disruption of a component if Vsub(cap)>Vsub(i) and Vsub(cap)>Vsub(dis), where Vsub(i) is the initial relative speed of the two galaxies at infinite separation. If the two galaxies are of the same dimension, formation of double galaxies by tidal capture is possible in a close collision either if the two galaxies do not differ much in mass and density distribution or if the more massive galaxy is less centrally concentrated than the other. If it is assumed, as statistics suggest, that the mass of a galaxy is proportional to the square of its radius, it follows that the probability of the formation of double galaxies by tidal capture increases with the increase in mass of the galaxies and tidal disruption does not occur in a single collision for any distance of closest approach of the two galaxies. (Auth.)

  9. Experimental study on pion capture by hydrogen bound in molecules

    International Nuclear Information System (INIS)

    Horvath, D.; Aniol, K.A.; Entezami, F.; Measday, D.F.; Noble, A.J.; Stanislaus, S.; Virtue, C.J.

    1988-08-01

    An experiment was performed at TRIUMF to study the formation of pionic hydrogen atoms and molecules in solids, particularly in groups of organic molecules of slightly different structure in order to help further clarify the problem. The nuclear capture of pions by hydrogen was measured using the charge exchange of stopped pions. The coincident photons emitted by the decaying π⁰ mesons were detected by TRIUMF's two large NaI spectrometers. New experimental results were obtained for the capture probability of stopped π⁻ mesons in the nuclei of hydrogen atoms, chemically bound in molecules of some simple hydrides, acid anhydrides, and sugar isomers. A linear relation was found between pion capture in hydrogen and melting point in sugar isomers. The pion capture probability in acid anhydrides is fairly well described by a simple atomic capture model in which the capture probability on the hydrogen dramatically increases as the hydrogen atom is separated from the strongly electronegative C₂O₃ group. Both effects are consistent with a correlation between pion capture and electron density on hydrogen atoms. (Author) (38 refs., 4 tabs., 7 figs.)

  10. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  11. A Generalized Estimating Equations Approach to Model Heterogeneity and Time Dependence in Capture-Recapture Studies

    Directory of Open Access Journals (Sweden)

    Akanda Md. Abdus Salam

    2017-03-01

    Full Text Available Individual heterogeneity in capture probabilities and time dependence are fundamentally important for estimating the closed animal population parameters in capture-recapture studies. A generalized estimating equations (GEE approach accounts for linear correlation among capture-recapture occasions, and individual heterogeneity in capture probabilities in a closed population capture-recapture individual heterogeneity and time variation model. The estimated capture probabilities are used to estimate animal population parameters. Two real data sets are used for illustrative purposes. A simulation study is carried out to assess the performance of the GEE estimator. A Quasi-Likelihood Information Criterion (QIC is applied for the selection of the best fitting model. This approach performs well when the estimated population parameters depend on the individual heterogeneity and the nature of linear correlation among capture-recapture occasions.
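
    The final estimation step, turning capture probabilities into a population-size estimate, can be sketched with a Horvitz-Thompson-style estimator. For simplicity the simulation below plugs in the true individual capture probabilities where a GEE analysis would substitute fitted values; all parameter choices are illustrative:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Simulate a closed population with heterogeneous capture probabilities.
    N, t = 500, 6                              # true size, capture occasions
    p_i = rng.beta(2.0, 8.0, size=N)           # individual capture probabilities
    histories = rng.random((N, t)) < p_i[:, None]
    seen = histories.any(axis=1)

    # Horvitz-Thompson-style estimate using (here, known) capture probabilities;
    # a GEE analysis would use the model's fitted p-hats instead.
    p_ever = 1.0 - (1.0 - p_i[seen]) ** t      # P(captured at least once)
    N_hat = np.sum(1.0 / p_ever)
    print("observed:", int(seen.sum()), "estimated N:", round(N_hat, 1), "true N:", N)
    ```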

  12. Gadolinium neutron capture therapy

    International Nuclear Information System (INIS)

    Akine, Yasuyuki; Tokita, Nobuhiko; Tokuuye, Koichi; Satoh, Michinao; Churei, Hisahiko

    1993-01-01

    Gadolinium neutron capture therapy makes use of photons and electrons produced by nuclear reactions between gadolinium and lower-energy neutrons which occur within the tumor. The results of our studies have shown that its radiation effect is mostly of low LET and that the electrons are the significant component in the over-all dose. The dose from gadolinium neutron capture reactions does not seem to increase in proportion to the gadolinium concentration, and a Gd-157 concentration of about 100 μg/ml appears optimal for therapy. Close contact between gadolinium and the cell is not necessarily required for cell inactivation; however, the effect of electrons released from intracellular gadolinium may be significant. Experimental studies on tumor-bearing mice and rabbits have shown that this is a very promising modality though further improvements in gadolinium delivery to tumors are needed. (author)

  13. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the 7 presentations made at the third meeting 'physics and fundamental questions' whose theme was probability and prediction. The concept of probability that was invented to apprehend random phenomena has become an important branch of mathematics and its application range spreads from radioactivity to species evolution via cosmology or the management of very weak risks. The notion of probability is the basis of quantum mechanics and then is bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  14. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  15. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability, and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step by step procedure for constructing a (compound) free Poisso...

  16. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  17. Capture and transfer of stopped pions in alcohols

    International Nuclear Information System (INIS)

    Harston, M.R.; Armstrong, D.S.; Measday, D.F.; Stanislaus, S.; Weber, P.; Horvath, D.

    1990-02-01

    The pion charge exchange probability in hydrogen for stopped π⁻ has been measured for a series of alcohols. The relative atomic capture probabilities for hydrogen in different chemical environments as well as for the other molecular constituents were extracted from the data using a phenomenological approach. The results allow the prediction of the charge exchange probability in other molecules of similar chemical structure. The charge exchange probability in deuterated methanols was measured and compared to the prediction of our model. A comprehensive picture is obtained if pion transfer from hydrogen to deuterium is included.

  18. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords : Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics OBOR OECD: Statistics and probability Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  19. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  20. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  1. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence. Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation. Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  2. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  3. Motion Capturing Emotions

    Directory of Open Access Journals (Sweden)

    Wood Karen

    2017-12-01

    Full Text Available The paper explores the activities conducted as part of WhoLoDancE: Whole Body Interaction Learning for Dance Education which is an EU-funded Horizon 2020 project. In particular, we discuss the motion capture sessions that took place at Motek, Amsterdam as well as the dancers’ experience of being captured and watching themselves or others as varying visual representations through the HoloLens. HoloLens is Microsoft’s first holographic computer that you wear as you would a pair of glasses. The study embraced four dance genres: Ballet, Contemporary, Flamenco and Greek Folk dance. We are specifically interested in the kinesthetic and emotional engagement with the moving body and what new corporeal awareness may be experienced. Positioning the moving, dancing body as fundamental to technological advancements, we discuss the importance of considering the dancer’s experience in the real and virtual space. Some of the artists involved in the project have offered their experiences, which are included, and they form the basis of the discussion. In addition, we discuss the affect of immersive environments, how these environments expand reality and what effect (emotionally and otherwise) that has on the body. The research reveals insights into relationships between emotion, movement and technology and what new sensorial knowledge this evokes for the dancer.

  4. Synovectomy by Neutron capture

    International Nuclear Information System (INIS)

    Vega C, H.R.; Torres M, C.

    1998-01-01

    Synovectomy by neutron capture is intended for the treatment of rheumatoid arthritis, an illness for which at present there is no definitive cure. This therapy requires a neutron source for irradiating the affected articulation. The energy spectrum and intensity of these neutrons are fundamental, since the neutrons induce capture reactions with Boron-10 inside the articulation, and the energy released by these reactions is transferred to the tissue that produces the synovial liquid, destroying it. In this work we present the neutron spectra obtained with moderator packings of spherical geometry containing a Pu-239/Be source at the center. The calculations were carried out with the Monte Carlo method. The moderators assayed were light water, heavy water, and combinations of the two. The spectra obtained, the average energy, the total number of neutrons per neutron emitted by the source, the thermal-neutron percentage and the dose equivalent allow us to suggest that the most adequate moderator packing is one with a 0.5 cm thickness of light water (radius 2 cm) and 24.5 cm of heavy water (radius 26.5 cm). (Author)

  5. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
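
    The recipe amounts to regressing the 0/1 response rather than classifying it. Below is a minimal sketch with scikit-learn's random forest regressor on synthetic data; the hyperparameters are illustrative, not the paper's:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)

    # Synthetic patients: the true P(y=1 | x) is a smooth function of one covariate.
    X = rng.uniform(-3, 3, size=(5000, 1))
    p_true = 1.0 / (1.0 + np.exp(-X[:, 0]))
    y = (rng.random(5000) < p_true).astype(float)

    # Regression (not classification) on the 0/1 response: fitted values are
    # estimates of the conditional probability itself.
    rf = RandomForestRegressor(n_estimators=300, min_samples_leaf=25, random_state=0)
    rf.fit(X, y)

    X_test = np.array([[-2.0], [0.0], [2.0]])
    print(rf.predict(X_test))                  # estimated probabilities
    print(1.0 / (1.0 + np.exp(-X_test[:, 0]))) # true probabilities for comparison
    ```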

  6. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  7. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying of the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
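
    The inflation effect is easy to reproduce by simulation for the log-normal case: estimate the parameters from a small sample, set the threshold at the estimated (1 - eps) quantile, and check how often a fresh draw from the true distribution exceeds it. All numerical settings below are illustrative assumptions:

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    mu, sigma = 0.0, 1.0        # true (unknown to the decisionmaker) parameters
    eps = 0.01                  # nominal required failure probability
    z = 2.3263                  # standard normal quantile at 1 - eps
    n_data, n_trials = 30, 50_000

    failures = 0
    for _ in range(n_trials):
        logs = np.log(rng.lognormal(mu, sigma, size=n_data))
        mu_hat, s_hat = logs.mean(), logs.std(ddof=1)
        threshold = np.exp(mu_hat + z * s_hat)           # estimated quantile
        failures += rng.lognormal(mu, sigma) > threshold  # test on a fresh draw

    # The realised frequency typically exceeds the nominal eps, illustrating
    # how parameter uncertainty inflates the expected failure frequency.
    print("nominal:", eps, "realised frequency:", failures / n_trials)
    ```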

  8. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  9. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  10. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  12. Introduction to probability and measure

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.

  13. The problem of latent attentional capture: Easy visual search conceals capture by task-irrelevant abrupt onsets.

    Science.gov (United States)

    Gaspelin, Nicholas; Ruthruff, Eric; Lien, Mei-Ching

    2016-08-01

    Researchers are sharply divided regarding whether irrelevant abrupt onsets capture spatial attention. Numerous studies report that they do and a roughly equal number report that they do not. This puzzle has inspired numerous attempts at reconciliation, none gaining general acceptance. The authors propose that abrupt onsets routinely capture attention, but the size of observed capture effects depends critically on how long attention dwells on distractor items which, in turn, depends critically on search difficulty. In a series of spatial cuing experiments, the authors show that irrelevant abrupt onsets produce robust capture effects when visual search is difficult, but not when search is easy. Critically, this effect occurs even when search difficulty varies randomly across trials, preventing any strategic adjustments of the attentional set that could modulate probability of capture by the onset cue. The authors argue that easy visual search provides an insensitive test for stimulus-driven capture by abrupt onsets: even though onsets truly capture attention, the effects of capture can be latent. This observation helps to explain previous failures to find capture by onsets, nearly all of which used an easy visual search. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  14. Robust automated knowledge capture.

    Energy Technology Data Exchange (ETDEWEB)

    Stevens-Adams, Susan Marie; Abbott, Robert G.; Forsythe, James Chris; Trumbo, Michael Christopher Stefan; Haass, Michael Joseph; Hendrickson, Stacey M. Langfitt

    2011-10-01

    This report summarizes research conducted through the Sandia National Laboratories Robust Automated Knowledge Capture Laboratory Directed Research and Development project. The objective of this project was to advance scientific understanding of the influence of individual cognitive attributes on decision making. The project has developed a quantitative model known as RumRunner that has proven effective in predicting the propensity of an individual to shift strategies on the basis of task and experience related parameters. Three separate studies are described which have validated the basic RumRunner model. This work provides a basis for better understanding human decision making in high-consequence national security applications, and in particular, the individual characteristics that underlie adaptive thinking.

  15. Fragment capture device

    Science.gov (United States)

    Payne, Lloyd R.; Cole, David L.

    2010-03-30

    A fragment capture device for use in explosive containment. The device comprises an assembly of at least two rows of bars positioned to eliminate line-of-sight trajectories between the generation point of fragments and a surrounding containment vessel or asset. The device comprises an array of at least two rows of bars, wherein each row is staggered with respect to the adjacent row, and wherein a lateral dimension of each bar and a relative position of each bar in combination provides blockage of a straight-line passage of a solid fragment through the adjacent rows of bars, wherein a generation point of the solid fragment is located within a cavity at least partially enclosed by the array of bars.

  16. Capturing the Daylight Dividend

    Energy Technology Data Exchange (ETDEWEB)

    Peter Boyce; Claudia Hunter; Owen Howlett

    2006-04-30

    Capturing the Daylight Dividend conducted activities to build market demand for daylight as a means of improving indoor environmental quality, overcoming technological barriers to effective daylighting, and informing and assisting state and regional market transformation and resource acquisition program implementation efforts. The program clarified the benefits of daylight by examining whole building systems energy interactions between windows, lighting, heating, and air conditioning in daylit buildings, and daylighting's effect on the human circadian system and productivity. The project undertook work to advance photosensors, dimming systems, and ballasts, and provided technical training in specifying and operating daylighting controls in buildings. Future daylighting work is recommended in metric development, technology development, testing, training, education, and outreach.

  17. Joint probabilities and quantum cognition

    International Nuclear Information System (INIS)

    Acacio de Barros, J.

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  18. Joint probabilities and quantum cognition

    Energy Technology Data Exchange (ETDEWEB)

    Acacio de Barros, J. [Liberal Studies, 1600 Holloway Ave., San Francisco State University, San Francisco, CA 94132 (United States)

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  19. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...

  20. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

    Max Lotstein and Phil Johnson-Laird, Department of Psychology, Princeton University, Princeton, NJ, USA, August 30th 2012. ...social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as... retorted that such a flagrant violation of the probability calculus was a result of a psychological experiment that obscured the rationality of the

  1. Probability Matching, Fast and Slow

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2014-01-01

    A prominent point of contention among researchers regarding the interpretation of probability-matching behavior is whether it represents a cognitively sophisticated, adaptive response to the inherent uncertainty of the tasks or settings in which it is observed, or whether instead it represents a fundamental shortcoming in the heuristics that support and guide human decision making. Put crudely, researchers disagree on whether probability matching is "smart" or "dumb." Here, we consider eviden...

  2. A Lakatosian Encounter with Probability

    Science.gov (United States)

    Chick, Helen

    2010-01-01

    There is much to be learned and pondered by reading "Proofs and Refutations" by Imre Lakatos (Lakatos, 1976). It highlights the importance of mathematical definitions, and how definitions evolve to capture the essence of the object they are defining. It also provides an exhilarating encounter with the ups and downs of the mathematical reasoning…

  3. Rcapture: Loglinear Models for Capture-Recapture in R

    Directory of Open Access Journals (Sweden)

    Sophie Baillargeon

    2007-04-01

    Full Text Available This article introduces Rcapture, an R package for capture-recapture experiments. The data for analysis consist of the frequencies of the observable capture histories over the t capture occasions of the experiment. A capture history is a vector of zeros and ones, where one stands for a capture and zero for a miss. Rcapture can fit three types of models. With a closed population model, the goal of the analysis is to estimate the size N of the population, which is assumed to be constant throughout the experiment. The estimator depends on the way in which the capture probabilities of the animals vary. Rcapture features several models for these capture probabilities that lead to different estimators for N. In an open population model, immigration and death occur between sampling periods. The estimation of survival rates is of primary interest. Rcapture can fit the basic Cormack-Jolly-Seber and Jolly-Seber models to such data. The third type of model fitted by Rcapture is the robust design model, which features two levels of sampling: closed population models apply within primary periods and an open population model applies between periods. Most models in Rcapture have a loglinear form; they are fitted by carrying out a Poisson regression with the R function glm. Estimates of the demographic parameters of interest are derived from the loglinear parameter estimates; their variances are obtained by linearization. The novel feature of this package is the provision of several new options for modeling heterogeneity in capture probabilities between animals, in both closed population models and the primary periods of a robust design. It also implements many of the techniques developed by R. M. Cormack for open population models.
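
    For readers without R at hand, the following Python sketch (hypothetical frequencies; statsmodels' Poisson GLM standing in for R's glm) shows the loglinear closed-population model M0 in miniature: regress the capture-history frequencies on the number of captures, then add the predicted count of the unobservable all-zero history to the number of animals actually seen.

```python
# Loglinear model M0 for a closed population, t = 3 capture occasions.
import itertools
import numpy as np
import statsmodels.api as sm

t = 3
# Observable capture histories (all binary vectors except 000) and
# hypothetical observed frequencies, one per history.
histories = [h for h in itertools.product([0, 1], repeat=t) if any(h)]
freqs = np.array([30, 22, 15, 18, 11, 9, 6], dtype=float)

# M0: log(mu_h) = b0 + b1 * (number of captures in h)
X = sm.add_constant(np.array([sum(h) for h in histories], dtype=float))
fit = sm.GLM(freqs, X, family=sm.families.Poisson()).fit()

n_observed = freqs.sum()
n_unseen = np.exp(fit.params[0])   # fitted count of the all-zero history
print(f"estimated population size: {n_observed + n_unseen:.1f}")
```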

  4. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  5. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
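
    As a rough illustration of the augmented plot (pointwise bands from the Beta distribution of uniform order statistics, not the simultaneous 1-α intervals the paper develops), with simulated data:

```python
# Pointwise 95% bands for the ordered sample under a fitted normal model.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = np.sort(rng.normal(loc=10.0, scale=2.0, size=50))
n = len(x)
i = np.arange(1, n + 1)

# The i-th order statistic of n uniforms is Beta(i, n - i + 1);
# map its quantiles through the fitted normal to get bands.
mu, sd = x.mean(), x.std(ddof=1)
band_lo = mu + sd * stats.norm.ppf(stats.beta.ppf(0.025, i, n - i + 1))
band_hi = mu + sd * stats.norm.ppf(stats.beta.ppf(0.975, i, n - i + 1))

print("all points inside pointwise bands:",
      bool(np.all((x >= band_lo) & (x <= band_hi))))
```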

  6. Neutron capture therapy

    International Nuclear Information System (INIS)

    Jun, B. J.

    1998-11-01

    The overall state of the art related to neutron capture therapy (NCT) is surveyed. Since the field related to NCT is very wide, it is not intended to survey all related subjects in depth. The primary objective of this report is to help those working on the installation of an NCT facility and a PGNAA (prompt gamma ray neutron activation analysis) system for boron analysis understand overall NCT at Hanaro. Therefore, while the parts on the reactor neutron source and PGNAA are dealt with in detail, other parts are limited to the level necessary to understand related fields. For example, the subject of chemical compounds, which requires intensive knowledge of chemistry, is not dealt with as a separate item. However, the requirements of a compound for NCT, currently available compounds, their characteristics, etc. can be understood through this report. Although the subject of cancers treated by NCT is beyond the capability of the author, it is dealt with by focussing on its characteristics related to the success of NCT. Each detailed subject is expected to be dealt with in more detail by specialists in the future. This report should be helpful for researchers working on NCT to understand related fields. (author). 128 refs., 3 tabs., 12 figs

  7. Captured by Aliens

    Science.gov (United States)

    Achenbach, Joel

    2000-03-01

    Captured by Aliens is a long and twisted voyage from science to the supernatural and back again. I hung out in Roswell, N.M., spent time with the Mars Society, met a guy who was figuring out the best way to build a spaceship to go to Alpha Centauri. I visited the set of the X-Files and talked to Mulder and Scully. One day over breakfast I was told by NASA administrator Dan Goldin, We live in a fog, man! He wants the big answers to the big questions. I spent a night in the base of a huge radio telescope in the boondocks of West Virginia, awaiting the signal from the aliens. I was hypnotized in a hotel room by someone who suspected that I'd been abducted by aliens and that this had triggered my interest in the topic. In the last months of his life, I talked to Carl Sagan, who believed that the galaxy riots with intelligent civilizations. He's my hero, for his steadfast adherence to the scientific method. What I found in all this is that the big question that needs immediate attention is not what's out THERE, but what's going on HERE, on Earth, and why we think the way we do, and how we came to be here in the first place.

  8. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the groundwork to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, the Erdos-Kac invariance principle, the functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a textbook for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  9. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Taking the Gaussian distribution into account, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criterion. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of exceeding the vibration criterion for a vibration isolation system is evaluated. Optimal system parameters, damping and natural frequency, are derived so that the probability of exceeding vibration criteria VC-E and VC-D is kept below 0.04.
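
    A back-of-envelope sketch of the exceedance calculation described above, assuming the relative displacement is zero-mean Gaussian; the RMS value and the criterion below are illustrative numbers, not the paper's.

```python
# Probability that a zero-mean Gaussian displacement exceeds a criterion.
from scipy.stats import norm

sigma = 0.4e-6   # RMS relative displacement, m (assumed)
vc = 1.0e-6      # vibration criterion, m (assumed)

p_exceed = 2.0 * (1.0 - norm.cdf(vc / sigma))   # two-sided exceedance
print(f"probability of exceeding the criterion: {p_exceed:.4f}")
```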

  10. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  11. Model uncertainty: Probabilities for models?

    International Nuclear Information System (INIS)

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising

  12. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system, there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data are scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (e.g., Dempster-Shafer theory or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider whether experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  13. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  14. Statistical probability tables CALENDF program

    International Nuclear Information System (INIS)

    Ribon, P.

    1989-01-01

    The purpose of the probability tables is twofold: to obtain a dense data representation and to calculate integrals by quadratures. They are mainly used in the USA for Monte Carlo calculations and in the USSR and Europe for self-shielding calculations by the subgroup method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity
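
    A toy Python sketch of the probability-table idea (synthetic cross-section data and equal-probability subgroups; CALENDF's moment-based construction is more sophisticated): condense a pointwise cross-section distribution into a few (probability, value) pairs, then evaluate a self-shielded average by quadrature.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = np.exp(rng.normal(2.0, 1.0, 100_000))   # synthetic cross-sections

# 4-subgroup probability table: equal-probability bands, band-mean values.
edges = np.quantile(sigma, [0.25, 0.5, 0.75])
bands = np.digitize(sigma, edges)
probs = np.array([np.mean(bands == k) for k in range(4)])
values = np.array([sigma[bands == k].mean() for k in range(4)])

# Flux-weighted (self-shielded) cross-section with background sigma_0:
# exact average versus the 4-point table quadrature.
sigma0 = 10.0
exact = np.mean(sigma / (sigma + sigma0)) / np.mean(1.0 / (sigma + sigma0))
table = (np.sum(probs * values / (values + sigma0))
         / np.sum(probs / (values + sigma0)))
print(f"exact: {exact:.3f}   table: {table:.3f}")
```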

  15. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  16. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  17. Capture into resonance and phase space dynamics in optical centrifuge

    Science.gov (United States)

    Armon, Tsafrir; Friedland, Lazar

    2016-05-01

    The process of capture of a molecular ensemble into rotational resonance in the optical centrifuge is investigated. Adiabaticity and phase space incompressibility are used to find the resonant capture probability in terms of two dimensionless parameters P1,2 characterising the driving strength and the nonlinearity, and related to three characteristic time scales in the problem. The analysis is based on the transformation to action-angle variables and the single resonance approximation, yielding reduction of the three-dimensional rotation problem to one degree of freedom. The analytic results for the capture probability are in good agreement with simulations. The existing experiments satisfy the validity conditions of the theory. This work was supported by the Israel Science Foundation Grant 30/14.

  18. The Generic Data Capture Facility

    Science.gov (United States)

    Connell, Edward B.; Barnes, William P.; Stallings, William H.

    1987-01-01

    The Generic Data Capture Facility, which can provide data capture support for a variety of different types of spacecraft while enabling operations costs to be carefully controlled, is discussed. The data capture functions, data protection, isolation of users from data acquisition problems, data reconstruction, and quality and accounting are addressed. The TDM and packet data formats utilized by the system are described, and the development of generic facilities is considered.

  19. Neutron capture therapy. Principles and applications

    International Nuclear Information System (INIS)

    Sauerwein, Wolfgang A.G.; Moss, Raymond; Wittig, Andrea; Nakagawa, Yoshinobu

    2012-01-01

    State of the art report on neutron capture therapy. Summarizes the progress made in recent decades. Multidisciplinary approach. Written by the most experienced specialists Neutron capture therapy (NCT) is based on the ability of the non-radioactive isotope boron-10 to capture thermal neutrons with very high probability and immediately to release heavy particles with a path length of one cell diameter. This in principle allows for tumor cell-selective high-LET particle radiotherapy. NCT is exciting scientifically but challenging clinically, and a key factor in success is close collaboration among very different disciplines. This book provides a comprehensive summary of the progress made in NCT in recent years. Individual sections cover all important aspects, including neutron sources, boron chemistry, drugs for NCT, dosimetry, and radiation biology. The use of NCT in a variety of malignancies and also some non-malignant diseases is extensively discussed. NCT is clearly shown to be a promising modality at the threshold of wider clinical application. All of the chapters are written by experienced specialists in language that will be readily understood by all participating disciplines.

  20. Carbon captured from the air

    Energy Technology Data Exchange (ETDEWEB)

    Keith, D. [Calgary Univ., AB (Canada)

    2008-10-15

    This article presented an innovative way to achieve the efficient capture of atmospheric carbon. A team of scientists from the University of Calgary's Institute for Sustainable Energy, Environment and Economy have shown that it is possible to reduce carbon dioxide (CO{sub 2}) using a simple machine that can capture the trace amount of CO{sub 2} present in ambient air at any place on the planet. The thermodynamics of capturing the small concentrations of CO{sub 2} from the air is only slightly more difficult than capturing much larger concentrations of CO{sub 2} from power plants. The research is significant because it offers a way to capture CO{sub 2} emissions from transportation sources such as vehicles and airplanes, which represent more than half of the greenhouse gases emitted on Earth. The energy efficient and cost effective air capture technology could complement other approaches for reducing emissions from the transportation sector, such as biofuels and electric vehicles. Air capture differs from carbon capture and storage (CCS) technology used at coal-fired power plants where CO{sub 2} is captured and pipelined for permanent storage underground. Air capture can capture the CO{sub 2} that is present in ambient air and store it wherever it is cheapest. The team at the University of Calgary showed that CO{sub 2} could be captured directly from the air with less than 100 kWhrs of electricity per tonne of CO{sub 2}. A custom-built tower was able to capture the equivalent of 20 tonnes per year of CO{sub 2} on a single square meter of scrubbing material. The team devised a way to use a chemical process from the pulp and paper industry to cut the energy cost of air capture in half. Although the technology is only in its early stage, it appears that CO{sub 2} could be captured from the air with an energy demand comparable to that needed for CO{sub 2} capture from conventional power plants, but costs will be higher. The simple, reliable and scalable technology

  1. Resource capture by single leaves

    Energy Technology Data Exchange (ETDEWEB)

    Long, S.P.

    1992-05-01

    Leaves show a variety of strategies for maximizing CO{sub 2} and light capture. These are more meaningfully explained if they are considered in the context of maximizing capture relative to the utilization of water, nutrients and carbohydrate reserves. There is considerable variation between crops in their efficiency of CO{sub 2} and light capture at the leaf level. Understanding of these mechanisms indicates some ways in which the efficiency of resource capture could be improved. Resource capture at the leaf level cannot be meaningfully considered without simultaneous understanding of the implications at the canopy level. 36 refs., 5 figs., 1 tab.

  2. Carbon captured from the air

    International Nuclear Information System (INIS)

    Keith, D.

    2008-01-01

    This article presented an innovative way to achieve the efficient capture of atmospheric carbon. A team of scientists from the University of Calgary's Institute for Sustainable Energy, Environment and Economy have shown that it is possible to reduce carbon dioxide (CO2) using a simple machine that can capture the trace amount of CO2 present in ambient air at any place on the planet. The thermodynamics of capturing the small concentrations of CO2 from the air is only slightly more difficult than capturing much larger concentrations of CO2 from power plants. The research is significant because it offers a way to capture CO2 emissions from transportation sources such as vehicles and airplanes, which represent more than half of the greenhouse gases emitted on Earth. The energy efficient and cost effective air capture technology could complement other approaches for reducing emissions from the transportation sector, such as biofuels and electric vehicles. Air capture differs from carbon capture and storage (CCS) technology used at coal-fired power plants where CO2 is captured and pipelined for permanent storage underground. Air capture can capture the CO2 that is present in ambient air and store it wherever it is cheapest. The team at the University of Calgary showed that CO2 could be captured directly from the air with less than 100 kWhrs of electricity per tonne of CO2. A custom-built tower was able to capture the equivalent of 20 tonnes per year of CO2 on a single square meter of scrubbing material. The team devised a way to use a chemical process from the pulp and paper industry to cut the energy cost of air capture in half. Although the technology is only in its early stage, it appears that CO2 could be captured from the air with an energy demand comparable to that needed for CO2 capture from conventional power plants, but costs will be higher. The simple, reliable and scalable technology offers an opportunity to build a commercial-scale plant. 1 fig

  3. Dynamic SEP event probability forecasts

    Science.gov (United States)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
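
    The paper's algorithm is fitted to the observed delay-time distributions; the following Python sketch is only a schematic analogue (assumed exponential flare-to-onset delays, updated by Bayes' rule) of how an event probability should decay as time passes with no SEP onset.

```python
import math

def dynamic_probability(p0: float, t_hours: float,
                        mean_delay_hours: float = 6.0) -> float:
    """P(event still to come | none observed by t), prior p0 (assumed model)."""
    f_t = 1.0 - math.exp(-t_hours / mean_delay_hours)   # CDF of the delay
    return p0 * (1.0 - f_t) / (1.0 - p0 * f_t)

for t in (0, 6, 12, 24):
    print(f"t = {t:2d} h:  P = {dynamic_probability(0.5, t):.3f}")
```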

  4. Conditional Independence in Applied Probability.

    Science.gov (United States)

    Pfeiffer, Paul E.

    This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…

  5. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  6. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  7. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations, the relations between the seismic strength of the earthquake, focal depth, distance and ground acceleration are calculated. We found that most Swedish earthquakes are shallow; the largest earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as interesting. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site; for consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get the epicentral acceleration of this earthquake to be 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with the existing distant instrumental data. (author)

  8. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2015-01-01

    A statistical procedure for estimating Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. The influence of the safety factor is also investigated. The DECOFF statistical method is benchmarked against the standard Alpha-factor...

  9. Probability and statistics: A reminder

    International Nuclear Information System (INIS)

    Clement, B.

    2013-01-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from 'data analysis in experimental sciences' given in [1]. (authors)

  10. Nash equilibrium with lower probabilities

    DEFF Research Database (Denmark)

    Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1998-01-01

    We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible...

  11. On probability-possibility transformations

    Science.gov (United States)

    Klir, George J.; Parviz, Behzad

    1992-01-01

    Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.

  12. Electron capture rate of a composite of partially ionized atomic nuclei

    International Nuclear Information System (INIS)

    Yokoi, K.; Takahashi, K.

    1979-01-01

    Electron captures (or more generally β-transitions) are known to play key roles at various stages of stellar evolution and in many nucleosynthesis processes. With decreasing temperatures and densities, bound electron captures start to compete with free electron captures, and eventually, in the low-temperature, low-density limit, the total capture rate converges to that of the orbital electrons observed in the laboratory. The authors calculate the occupation probabilities of the electron orbits and the electron capture rates in a mixture of atoms and ions which are supposedly under chemical equilibrium. (orig./AH)
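
    For the chemical-equilibrium occupation fractions mentioned above, the standard tool is the Saha equation; a textbook form (our notation, not necessarily the paper's exact formulation) for the balance between ionization states i and i+1 is:

```latex
% Saha ionization balance (standard form; notation ours)
\[
\frac{n_{i+1}\, n_e}{n_i}
  = \frac{2\, g_{i+1}}{g_i}
    \left( \frac{2\pi m_e k T}{h^2} \right)^{3/2}
    e^{-\chi_i / kT}
\]
% n_e: electron density, g_i: statistical weights,
% chi_i: ionization energy of state i.
```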

  13. Iodine neutron capture therapy

    Science.gov (United States)

    Ahmed, Kazi Fariduddin

    A new technique, Iodine Neutron Capture Therapy (INCT), is proposed to treat hyperthyroidism in people. Present thyroid therapies, surgical removal and 131I treatment, result in hypothyroidism and, for 131I, involve protracted treatment times and excessive whole-body radiation doses. The new technique involves using a low-energy neutron beam to convert a fraction of the natural iodine stored in the thyroid to radioactive 128I, which has a 24-minute half-life and decays by emitting 2.12-MeV beta particles. The beta particles are absorbed in and damage some thyroid tissue cells and consequently reduce the production and release of thyroid hormones to the blood stream. Treatment times and whole-body radiation doses are thus reduced substantially. This dissertation addresses the first of several steps needed to obtain medical profession acceptance and regulatory approval to implement this therapy. As with other such programs, initial feasibility is established by performing experiments on suitable small mammals. Laboratory rats were used, and their thyroids were exposed to the beta particles coming from small encapsulated amounts of 128I. Masses of 89.0 mg reagent-grade elemental iodine crystals have been activated in the ISU AGN-201 reactor to provide 0.033 mBq of 128I. This activity delivers 0.2 Gy to the thyroid gland of 300-g male rats having fresh thyroid tissue masses of ~20 mg. Larger iodine masses are used to provide greater doses. The activated iodine is encapsulated to form a thin (0.16 cm2/mg) patch that is then applied directly to the surgically exposed thyroid of an anesthetized rat. Direct neutron irradiation of a rat's thyroid was not possible due to its small size. Direct in-vivo exposure of the thyroid of the rat to the emitted radiation from 128I is allowed to continue for 2.5 hours (6 half-lives). Pre- and post-exposure blood samples are taken to quantify thyroid hormone levels. The serum T4 concentration is measured by radioimmunoassay at

  14. Exact capture probability analysis of GSC receivers over Rayleigh fading channel

    KAUST Repository

    Nam, Sungsik; Alouini, Mohamed-Slim; Hasna, Mazen Omar

    2010-01-01

    combined paths power) to that of the total available power. The major difficulty in these problems is to derive some joint statistics of ordered exponential variates. With this motivation in mind, we capitalize in this paper on some new order statistics

  15. Do key dimensions of seed and seedling functional trait variation capture variation in recruitment probability?

    Science.gov (United States)

    1. Plant functional traits provide a mechanistic basis for understanding ecological variation among plant species and the implications of this variation for species distribution, community assembly and restoration. 2. The bulk of our functional trait understanding, however, is centered on traits rel...

  16. Fish welfare in capture fisheries

    NARCIS (Netherlands)

    Veldhuizen, L.J.L.; Berentsen, P.B.M.; Boer, de I.J.M.; Vis, van de J.W.; Bokkers, E.A.M.

    2018-01-01

    Concerns about the welfare of production animals have extended from farm animals to fish, but an overview of the impact of especially capture fisheries on fish welfare is lacking. This review provides a synthesis of 85 articles, which demonstrates that research interest in fish welfare in capture

  17. Calculating the albedo characteristics by the method of transmission probabilities

    International Nuclear Information System (INIS)

    Lukhvich, A.A.; Rakhno, I.L.; Rubin, I.E.

    1983-01-01

    The possibility of using the method of transmission probabilities for calculating the albedo characteristics of homogeneous and heterogeneous zones is studied. The transmission probabilities method is a numerical method for solving the transport equation in its integral form. All calculations have been conducted in a one-group approximation for planes and rods with different optical thicknesses and capture-to-scattering ratios. The calculations for plane and cylindrical geometries have shown that the numerical method of transmission probabilities can be used for calculating the albedo characteristics of homogeneous and heterogeneous zones with high accuracy. In this case the computer time consumption is minimal, even for cylindrical geometry, if the characteristics for the first-flight neutrons are obtained by interpolation

  18. Selection of risk reduction portfolios under interval-valued probabilities

    International Nuclear Information System (INIS)

    Toppila, Antti; Salo, Ahti

    2017-01-01

    A central problem in risk management is that of identifying the optimal combination (or portfolio) of improvements that enhance the reliability of the system most through reducing failure event probabilities, subject to the availability of resources. This optimal portfolio can be sensitive with regard to epistemic uncertainties about the failure events' probabilities. In this paper, we develop an optimization model to support the allocation of resources to improvements that mitigate risks in coherent systems in which interval-valued probabilities defined by lower and upper bounds are employed to capture epistemic uncertainties. Decision recommendations are based on portfolio dominance: a resource allocation portfolio is dominated if there exists another portfolio that improves system reliability (i) at least as much for all feasible failure probabilities and (ii) strictly more for some feasible probabilities. Based on non-dominated portfolios, recommendations about improvements to implement are derived by inspecting in how many non-dominated portfolios a given improvement is contained. We present an exact method for computing the non-dominated portfolios. We also present an approximate method that simplifies the reliability function using total order interactions so that larger problem instances can be solved with reasonable computational effort. - Highlights: • Reliability allocation under epistemic uncertainty about probabilities. • Comparison of alternatives using dominance. • Computational methods for generating the non-dominated alternatives. • Deriving decision recommendations that are robust with respect to epistemic uncertainty.
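
    A brute-force Python sketch of the dominance test defined above (a toy three-component series system with assumed probability intervals, scanned on a grid; the paper's exact and approximate methods are more refined): portfolio a dominates b if its system reliability is never lower over the feasible set and strictly higher somewhere.

```python
import itertools
import numpy as np

intervals = [(0.01, 0.05), (0.02, 0.08), (0.01, 0.10)]  # epistemic bounds

def reliability(p):
    return float(np.prod([1.0 - pi for pi in p]))       # series system

def improved(p, portfolio, factor=0.5):
    # A funded improvement halves the component's failure probability.
    return [pi * factor if i in portfolio else pi for i, pi in enumerate(p)]

def dominates(a, b, grid=5):
    axes = [np.linspace(lo, hi, grid) for lo, hi in intervals]
    diffs = [reliability(improved(p, a)) - reliability(improved(p, b))
             for p in itertools.product(*axes)]
    return min(diffs) >= 0.0 and max(diffs) > 0.0

print(dominates({0, 2}, {0}))   # True: funding a superset dominates
print(dominates({0}, {1}))      # False: neither dominates the other
```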

  19. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  20. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    Andre; C. R. Martins

    2006-11-01

    Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated to them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as has been observed.

  1. Probability matching and strategy availability.

    Science.gov (United States)

    Koehler, Derek J; James, Greta

    2010-09-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought to their attention, more participants subsequently engage in maximizing. Third, matchers are more likely than maximizers to base decisions in other tasks on their initial intuitions, suggesting that they are more inclined to use a choice strategy that comes to mind quickly. These results indicate that a substantial subset of probability matchers are victims of "underthinking" rather than "overthinking": They fail to engage in sufficient deliberation to generate a superior alternative to the matching strategy that comes so readily to mind.

  2. Probability as a Physical Motive

    Directory of Open Access Journals (Sweden)

    Peter Martin

    2007-04-01

    Full Text Available Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production ("MEP") to the information-theoretical "MaxEnt" principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand "the adjacent possible" as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  3. Logic, Probability, and Human Reasoning

    Science.gov (United States)

    2015-01-01

    accordingly suggest a way to integrate probability and deduction. The nature of deductive reasoning: to be rational is to be able to make deductions... [3–6] and they underlie mathematics, science, and technology [7–10]. Plato claimed that emotions upset reasoning. However, individuals in the grip... fundamental to human rationality. So, if counterexamples to its principal predictions occur, the theory will at least explain its own refutation

  4. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  5. Probability matching and strategy availability

    OpenAIRE

    J. Koehler, Derek; Koehler, Derek J.; James, Greta

    2010-01-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought...

  6. Materials For Gas Capture, Methods Of Making Materials For Gas Capture, And Methods Of Capturing Gas

    KAUST Repository

    Polshettiwar, Vivek

    2013-06-20

    In accordance with the purpose(s) of the present disclosure, as embodied and broadly described herein, embodiments of the present disclosure, in one aspect, relate to materials that can be used for gas (e.g., CO2) capture, methods of making such materials, methods of capturing gas (e.g., CO2), and the like.

  7. Electron capture and stellar collapse

    International Nuclear Information System (INIS)

    Chung, K.C.

    1979-01-01

    In order to investigate the role of electron capture in the phenomenon of pre-supernova gravitational collapse, a hydrodynamic calculation was carried out, coupling the capture, decay and nuclear reaction equation system. A simplified stellar model (homogeneous model) was adopted, using the Fermi ideal gas approximation for the sea of free electrons and neutrons. The non-simplified treatment from quasi-static evolution to collapse is presented. The capture and beta decay rates, as well as delayed neutron emission, were calculated with the crude theory of beta decay, while the other reaction rates were determined by the usual theories. Preliminary results are presented. (M.C.K.) [pt

  8. Proton capture by magnetic monopoles

    International Nuclear Information System (INIS)

    Olaussen, K.; Olsen, H.A.; Oeverboe, I.; Osland, P.

    1983-09-01

    In the Kazama-Yang approximation, the lowest monopole-proton bound states have binding energies of 938 MeV, 263 keV, 105 eV, and 0.04 eV. For velocities β = 10^-5 - 10^-3, the cross section for radiative capture to these states is found to be of the order of 10^-28 - 10^-26 cm^2. For the state that has a binding energy of 263 keV, the capture length in water is 171 x (β/10^-4)^0.48 m. Observation of photons from the capture process would indicate the presence of monopoles. (orig.)
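
    Plugging representative velocities into the capture-length formula quoted above is simple arithmetic (Python):

```python
# Capture length in water: L = 171 * (beta / 1e-4) ** 0.48 metres.
for beta in (1e-5, 1e-4, 1e-3):
    L = 171.0 * (beta / 1e-4) ** 0.48
    print(f"beta = {beta:.0e}:  capture length ~ {L:.0f} m")
```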

  9. [Biometric bases: basic concepts of probability calculation].

    Science.gov (United States)

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The bases of event algebra, the definition of probability, the classical probability model and the random variable are presented.

  10. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  11. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  12. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  13. Sensitivity analysis using probability bounding

    International Nuclear Information System (INIS)

    Ferson, Scott; Troy Tucker, W.

    2006-01-01

    Probability bounds analysis (PBA) provides analysts a convenient means to characterize the neighborhood of possible results that would be obtained from plausible alternative inputs in probabilistic calculations. We show the relationship between PBA and the methods of interval analysis and probabilistic uncertainty analysis from which it is jointly derived, and indicate how the method can be used to assess the quality of probabilistic models such as those developed in Monte Carlo simulations for risk analyses. We also illustrate how a sensitivity analysis can be conducted within a PBA by pinching inputs to precise distributions or real values

  14. Atomic and nuclear parameters of single electron capture decaying nuclides

    International Nuclear Information System (INIS)

    Grau, A.

    1981-01-01

    Atomic and nuclear parameters of the following nuclides which decay by electron capture have been calculated: 37Ar, 41Ca, 49V, 53Mn, 55Fe, 59Ni, 68Ge, 82Sr, 97Tc, 118Te, 131Cs, 137La, 140Nd, 157Tb, 165Er, 193Pt, 194Hg, and 205Pb. The evaluation rules are included in the first part of the paper. The values and the associated uncertainties of the following parameters have been tabulated: decay energy, electron capture probabilities, fluorescence yield, electron emission and X-ray emission. (Author) 27 refs

  15. Carbon capture and sequestration (CCS)

    Science.gov (United States)

    2009-06-19

    Carbon capture and sequestration (or storage), known as CCS, has attracted interest as a measure for mitigating global climate change because large amounts of carbon dioxide (CO2) emitted from fossil fuel use in the United States are potentially...

  16. Enzymes in CO2 Capture

    DEFF Research Database (Denmark)

    Fosbøl, Philip Loldrup; Gladis, Arne; Thomsen, Kaj

    The enzyme Carbonic Anhydrase (CA) can accelerate the absorption rate of CO2 into aqueous solutions by several-fold. It exists in almost all living organisms and catalyses important processes like CO2 transport, respiration and the acid-base balance. A new technology in the field of carbon capture is the application of enzymes to accelerate typically slow tertiary amines or inorganic carbonates. There is a hidden potential to revive currently infeasible amines which have attractively low energy consumption for regeneration but too slow kinetics for viable CO2 capture. The aim of this work is to discuss the measurement of kinetic properties for CA-promoted CO2 capture solvent systems. The development of a rate-based model for enzymes will be discussed, showing the principles of implementation and the results of using a well-known tertiary amine for CO2 capture. Conclusions...

  17. Alignment in double capture processes

    International Nuclear Information System (INIS)

    Moretto-Capelle, P.; Benhenni, M.; Bordenave-Montesquieu, A.; Benoit-Cattin, P.; Gleizes, A.

    1993-01-01

    The electron spectra emitted when a double capture occurs in N^7+ + He and Ne^8+ + He systems at 10q keV collisional energy allow us to determine the angular distributions of the 3ℓ3ℓ' lines through a special spectra fitting procedure which includes interferences between neighbouring states. It is found that the doubly excited states populated in double capture processes are generally aligned

  18. Alignment in double capture processes

    Energy Technology Data Exchange (ETDEWEB)

    Moretto-Capelle, P.; Benhenni, M.; Bordenave-Montesquieu, A.; Benoit-Cattin, P.; Gleizes, A. (IRSAMC, URA CNRS 770, Univ. Paul Sabatier, 118 rte de Narbonne, 31062 Toulouse Cedex (France))

    1993-06-05

    The electron spectra emitted when a double capture occurs in N^7+ + He and Ne^8+ + He systems at 10q keV collisional energy allow us to determine the angular distributions of the 3ℓ3ℓ' lines through a special spectra fitting procedure which includes interferences between neighbouring states. It is found that the doubly excited states populated in double capture processes are generally aligned.

  19. Toward transformational carbon capture systems

    Energy Technology Data Exchange (ETDEWEB)

    Miller, David C. [National Energy Technology Laboratory, U.S. Dept. of Energy, Pittsburgh PA (United States); Litynski, John T. [Office of Fossil Energy, U.S. Dept. of Energy, Washington DC (United States); Brickett, Lynn A. [National Energy Technology Laboratory, U.S. Dept. of Energy, Pittsburgh PA (United States); Morreale, Bryan D. [National Energy Technology Laboratory, U.S. Dept. of Energy, Pittsburgh PA (United States)

    2015-10-28

    This paper will briefly review the history and current state of Carbon Capture and Storage (CCS) research and development and describe the technical barriers to carbon capture. It will argue forcefully for a new approach to R&D, one which leverages both simulation and physical systems at the laboratory and pilot scales to more rapidly move the best technologies forward, prune less advantageous approaches, and simultaneously develop materials and processes.

  20. Carbon Capture: A Technology Assessment

    Science.gov (United States)

    2013-10-21

    whereas laboratory-scale experiments typically seek to validate or obtain data for specific components of a system. Laboratory- and bench-scale processes... materials have pore sizes

  1. Radiative muon capture on hydrogen

    International Nuclear Information System (INIS)

    Bertl, W.; Ahmad, S.; Chen, C.Q.; Gumplinger, P.; Hasinoff, M.D.; Larabee, A.J.; Sample, D.G.; Schott, W.; Wright, D.H.; Armstrong, D.S.; Blecher, M.; Azuelos, G.; Depommier, P.; Jonkmans, G.; Gorringe, T.P.; Henderson, R.; Macdonald, J.A.; Poutissou, J.M.; Poutissou, R.; Von Egidy, T.; Zhang, N.S.; Robertson, B.D.

    1992-01-01

    The radiative capture of negative muons by protons can be used to measure the weak induced pseudoscalar form factor. Brief arguments why this method is preferable to ordinary muon capture are given, followed by a discussion of the experimental difficulties. The solution to these problems as attempted by experiment no. 452 at TRIUMF is presented, together with preliminary results from the first run in August 1990. An outlook on the expected final precision and the experimental schedule is also given. (orig.)

  2. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem: statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.
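
    The two directions described above can be illustrated in a few lines of code. This is a minimal sketch, not taken from the lecture notes themselves: the forward (a priori) step enumerates the equally likely outcomes of two fair dice, and the inverse (statistical) step estimates a die's side probabilities from simulated rolls.

    ```python
    from itertools import product
    from collections import Counter
    import random

    # Forward problem: a priori probability of a specified outcome
    # for a roll of two fair dice (each side equally likely).
    outcomes = list(product(range(1, 7), repeat=2))
    p_sum_7 = sum(1 for a, b in outcomes if a + b == 7) / len(outcomes)
    print(f"P(sum = 7) = {p_sum_7:.4f}")  # 6/36 = 0.1667

    # Inverse problem (statistics): infer the side probabilities of a
    # single die from observed rolls; the empirical frequency is the
    # maximum-likelihood estimate.
    random.seed(0)
    rolls = [random.randint(1, 6) for _ in range(1000)]
    counts = Counter(rolls)
    estimates = {side: counts[side] / len(rolls) for side in range(1, 7)}
    print(estimates)
    ```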

  3. Short-term capture of the Earth-Moon system

    Science.gov (United States)

    Qi, Yi; de Ruiter, Anton

    2018-06-01

    In this paper, the short-term capture (STC) of an asteroid in the Earth-Moon system is proposed and investigated. First, the space condition of STC is analysed and five subsets of the feasible region are defined and discussed. Then, the time condition of STC is studied by parameter scanning in the Sun-Earth-Moon-asteroid restricted four-body problem. Numerical results indicate that there is a clear association between the distributions of the time probability of STC and the five subsets. Next, the influence of the Jacobi constant on STC is examined using the space and time probabilities of STC. Combining the space and time probabilities of STC, we propose an STC index to evaluate the probability of STC comprehensively. Finally, three potential STC asteroids are found and analysed.

  4. Probability in reasoning: a developmental test on conditionals.

    Science.gov (United States)

    Barrouillet, Pierre; Gauffroy, Caroline

    2015-04-01

    Probabilistic theories have been claimed to constitute a new paradigm for the psychology of reasoning. A key assumption of these theories is captured by what they call the Equation, the hypothesis that the meaning of the conditional is probabilistic in nature and that the probability of If p then q is the conditional probability, in such a way that P(if p then q) = P(q|p). Using the probabilistic truth-table task, in which participants are required to evaluate the probability of If p then q sentences, the present study explored the pervasiveness of the Equation across ages (from early adolescence to adulthood), types of conditionals (basic, causal, and inducements) and contents. The results reveal that the Equation is a late developmental achievement, endorsed only by a narrow majority of educated adults for certain types of conditionals depending on the content they involve. Age-related changes in evaluating the probability of all the conditionals studied closely mirror the development of truth-value judgements observed in previous studies with traditional truth-table tasks. We argue that our modified mental model theory can account for this development, and hence for the findings related to the probability task, which consequently do not support the probabilistic approach to human reasoning over alternative theories.
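
    As a concrete illustration of the Equation, the sketch below computes P(q|p) from a hypothetical frequency table over the four truth-value combinations, the kind of distribution used in probabilistic truth-table tasks; the counts are invented for the example.

    ```python
    # Hypothetical frequency table for a conditional "if p then q":
    # counts of the four truth-value combinations of (p, q).
    counts = {("p", "q"): 40, ("p", "not-q"): 10,
              ("not-p", "q"): 25, ("not-p", "not-q"): 25}

    total = sum(counts.values())
    p_p = (counts[("p", "q")] + counts[("p", "not-q")]) / total
    p_p_and_q = counts[("p", "q")] / total

    # The Equation: P(if p then q) = P(q | p) = P(p and q) / P(p)
    p_if_p_then_q = p_p_and_q / p_p
    print(f"P(q|p) = {p_if_p_then_q:.2f}")  # 40 / 50 = 0.80
    ```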

  5. Capture into resonance and phase-space dynamics in an optical centrifuge

    Science.gov (United States)

    Armon, Tsafrir; Friedland, Lazar

    2016-04-01

    The process of capture of a molecular ensemble into rotational resonance in the optical centrifuge is investigated. Adiabaticity and phase-space incompressibility are used to find the resonant capture probability in terms of two dimensionless parameters P1,2 characterizing the driving strength and the nonlinearity, and related to three characteristic time scales in the problem. The analysis is based on the transformation to action-angle variables and the single-resonance approximation, yielding a reduction of the three-dimensional rotation problem to one degree of freedom. The analytic results for the capture probability are in good agreement with simulations. The existing experiments satisfy the validity conditions of the theory.

  6. Capture by colour: evidence for dimension-specific singleton capture.

    Science.gov (United States)

    Harris, Anthony M; Becker, Stefanie I; Remington, Roger W

    2015-10-01

    Previous work on attentional capture has shown the attentional system to be quite flexible in the stimulus properties it can be set to respond to. Several different attentional "modes" have been identified. Feature search mode allows attention to be set for specific features of a target (e.g., red). Singleton detection mode sets attention to respond to any discrepant item ("singleton") in the display. Relational search sets attention for the relative properties of the target in relation to the distractors (e.g., redder, larger). Recently, a new attentional mode was proposed that sets attention to respond to any singleton within a particular feature dimension (e.g., colour; Folk & Anderson, 2010). We tested this proposal against the predictions of previously established attentional modes. In a spatial cueing paradigm, participants searched for a colour target that was randomly either red or green. The nature of the attentional control setting was probed by presenting an irrelevant singleton cue prior to the target display and assessing whether it attracted attention. In all experiments, the cues were red, green, blue, or a white stimulus rapidly rotated (motion cue). The results of three experiments support the existence of a "colour singleton set," finding that all colour cues captured attention strongly, while motion cues captured attention only weakly or not at all. Notably, we also found that capture by motion cues in search for colour targets was moderated by their frequency; rare motion cues captured attention (weakly), while frequent motion cues did not.

  7. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  8. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after the Born probability for a single observable. Instead, various definitions have been suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert spaces with dimension larger than two. If measurement contexts are included in the definition, joint probabilities are no longer excluded, but they are still constrained by imprecise probabilities.

  9. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  10. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  11. CHAOTIC CAPTURE OF NEPTUNE TROJANS

    International Nuclear Information System (INIS)

    Nesvorny, David; Vokrouhlicky, David

    2009-01-01

    Neptune Trojans (NTs) are swarms of outer solar system objects that lead/trail planet Neptune during its revolutions around the Sun. Observations indicate that NTs form a thick cloud of objects with a population perhaps ∼10 times more numerous than that of Jupiter Trojans and orbital inclinations reaching ∼25 deg. The high inclinations of NTs are indicative of capture instead of in situ formation. Here we study a model in which NTs were captured by Neptune during planetary migration when secondary resonances associated with the mean-motion commensurabilities between Uranus and Neptune swept over Neptune's Lagrangian points. This process, known as chaotic capture, is similar to that previously proposed to explain the origin of Jupiter's Trojans. We show that chaotic capture of planetesimals from an ∼35 Earth-mass planetesimal disk can produce a population of NTs that is at least comparable in number to that inferred from current observations. The large orbital inclinations of NTs are a natural outcome of chaotic capture. To obtain the ∼4:1 ratio between high- and low-inclination populations suggested by observations, planetary migration into a dynamically excited planetesimal disk may be required. The required stirring could have been induced by Pluto-sized and larger objects that have formed in the disk.

  12. Neutron capture cross section measurements: case of lutetium isotopes

    International Nuclear Information System (INIS)

    Roig, O.; Meot, V.; Belier, G.

    2011-01-01

    Neutron radiative capture is a nuclear reaction that occurs in the presence of neutrons on all isotopes and over a wide energy range. The neutron capture work on lutetium isotopes presented here illustrates the variety of measurements leading to the determination of cross sections. These measurements provide valuable fundamental data needed for the stockpile stewardship program, as well as for nuclear astrophysics and nuclear structure. Measurements, made in France or in the United States, involving complex detectors associated with very rare targets have significantly improved the international databases and validated models of nuclear reactions. We present results concerning the measurement of neutron radiative capture on ¹⁷³Lu, ¹⁷⁵Lu, ¹⁷⁶Lu and ¹⁷⁷ᵐLu, and the measurement of the probability of gamma emission in the substitution (surrogate) reaction ¹⁷⁴Yb(³He,pγ)¹⁷⁶Lu. The measurements of neutron cross sections on ¹⁷⁷ᵐLu have made it possible to highlight the process of super-elastic scattering.

  13. General considerations for neutron capture therapy at a reactor facility

    International Nuclear Information System (INIS)

    Binney, S.E.

    2001-01-01

    In addition to neutron beam intensity and quality, there are also a number of other significant criteria related to a nuclear reactor that contribute to a successful neutron capture therapy (NCT) facility. These criteria are classified into four main categories: Nuclear design factors, facility management and operations factors, facility resources, and non-technical factors. Important factors to consider are given for each of these categories. In addition to an adequate neutron beam intensity and quality, key requirements for a successful neutron capture therapy facility include necessary finances to construct or convert a facility for NCT, a capable medical staff to perform the NCT, and the administrative support for the facility. The absence of any one of these four factors seriously jeopardizes the overall probability of success of the facility. Thus nuclear reactor facility management considering becoming involved in neutron capture therapy, should it be proven clinically successful, should take all these factors into consideration. (author)

  14. Neutron capture on nitrogen as a means of detecting explosives

    International Nuclear Information System (INIS)

    Thompson, M.N.; Rassool, R.P.

    1995-01-01

    A research prototype based on neutron capture on nitrogen was developed and is demonstrated to be able to detect parcel and letter bombs. It is the gamma radiation following capture that is detected, as an indication of the presence of nitrogen and hence the probable presence of a nitrogen-containing explosive. The conceptual design of the explosive detector and some experimental results are briefly presented. figs., ills

  15. Carbon capture and storage (CCS)

    International Nuclear Information System (INIS)

    Martin-Amouroux, Jean-Marie

    2016-01-01

    The author first defines what carbon capture and storage (CCS) is, then describes more precisely the various technologies, methods and processes involved in carbon capture, carbon transport, and geological carbon storage. He briefly evokes the various applications and uses of CCS. In the second part, he proposes an overview of advances and deadlocks of CCS in the world, of the status of installations and projects, of the development of capture practices in industry, of some existing and important storage sites, and of some pilot installations developed by various industrial actors in different countries (26 installations in the world). He indicates power stations equipped for CCS (in Canada, the USA, the United Kingdom, the Netherlands, Norway, China, South Korea and the United Arab Emirates). He evokes projects which have been given up or postponed. He proposes an overview of policies implemented in different countries (USA, Canada, European Union, Australia, and others) to promote CCS.

  16. Alpha-particle and electron capture decay of 209Po

    International Nuclear Information System (INIS)

    Schima, F.J.; Colle, R.

    1996-01-01

    Gamma-ray and Kα X-ray emissions have been measured from a very pure ²⁰⁹Po source containing less than 0.13% ²⁰⁸Po activity and no detectable ²¹⁰Po (≤2 × 10⁻⁴ %). The alpha-particle emission rate for this source has previously been determined. Data are presented that confirm alpha decay to the ²⁰⁵Pb excited level at 262.8 keV, with an alpha-particle emission probability (± standard uncertainty) of 0.00559 ± 0.00008. The ratio of K-shell electron capture to total electron capture for the second forbidden unique electron capture decay to the 896.6 keV level in ²⁰⁹Bi was determined to be 0.594 ± 0.018. The electron capture decay fraction was found to be 0.00454 ± 0.00007, while the probabilities per decay for the 896.6, 262.8, and 260.5 keV gamma rays and the Bi Kα and Pb Kα X-rays were measured as 0.00445 ± 0.00007, 0.00085 ± 0.00002, 0.00254 ± 0.00003, 0.00202 ± 0.00005, and 0.00136 ± 0.00005, respectively. (orig.)

  17. K-forbidden transition probabilities

    International Nuclear Information System (INIS)

    Saitoh, T.R.; Sletten, G.; Bark, R.A.; Hagemann, G.B.; Herskind, B.; Saitoh-Hashimoto, N.; Tsukuba Univ., Ibaraki

    2000-01-01

    Reduced hindrance factors of K-forbidden transitions are compiled for nuclei with A ≈ 180 where γ-vibrational states are observed. Correlations between these reduced hindrance factors and Coriolis forces, statistical level mixing and γ-softness have been studied. It is demonstrated that the K-forbidden transition probabilities are related to γ-softness. The decay of the high-K bandheads has been studied by means of two-state mixing, which would be induced by the γ-softness, using a number of K-forbidden transitions compiled in the present work, where high-K bandheads are depopulated by both E2 and ΔI=1 transitions. The validity of the two-state mixing scheme has been examined by using the proposed identity of the B(M1)/B(E2) ratios of transitions depopulating high-K bandheads and levels of low-K bands. A breakdown of the identity might indicate that other levels mediate transitions between high- and low-K states. (orig.)

  18. Direct probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.

    1993-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration.
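
    The post-processing step described above is straightforward to sketch. The following is a minimal illustration with invented data, not the project's actual workflow: a stack of equally likely simulated realizations is reduced to a map of the probability of exceeding a contamination threshold, plus an expected-magnitude map.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Stand-in for a set of equally likely geostatistical realizations of
    # contaminant concentration on a grid (n_realizations x ny x nx). In
    # practice these would come from a conditional simulation that honors
    # the measured sample values.
    realizations = rng.lognormal(mean=3.0, sigma=0.8, size=(200, 50, 50))

    threshold = 35.0  # hypothetical action level, arbitrary units

    # The fraction of realizations exceeding the threshold at each cell
    # is a direct map of the probability of contamination above that level.
    prob_exceed = (realizations > threshold).mean(axis=0)

    # Expected magnitude of contamination at each cell.
    expected = realizations.mean(axis=0)
    print(prob_exceed.shape, prob_exceed.max(), expected.mean())
    ```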

  19. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, psychophysical foundations of the probability weighting functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) with 0 < α < 1, which satisfies w(0) = 0, w(1/e) = 1/e, and w(1) = 1, and has been studied extensively in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
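
    For readers who want to experiment with the function, here is a minimal sketch of Prelec's weighting function; the value alpha = 0.65 is an arbitrary illustrative choice, not a parameter fitted in this study.

    ```python
    import numpy as np

    def prelec_w(p, alpha=0.65):
        """Prelec (1998) probability weighting: w(p) = exp(-(-ln p)**alpha)."""
        p = np.asarray(p, dtype=float)
        return np.exp(-(-np.log(p)) ** alpha)

    p = np.array([0.01, 0.1, 1 / np.e, 0.5, 0.9, 0.99])
    print(prelec_w(p))
    # The fixed point w(1/e) = 1/e holds for every alpha:
    print(np.isclose(prelec_w(1 / np.e), 1 / np.e))  # True
    ```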

  20. CO2 Capture and Reuse

    International Nuclear Information System (INIS)

    Thambimuthu, K.; Gupta, M.; Davison, J.

    2003-01-01

    CO2 capture and storage, including its utilization or reuse, presents an opportunity to achieve deep reductions in greenhouse gas emissions from fossil energy use. The development and deployment of this option could significantly assist in meeting a future goal of achieving stabilization of the presently rising atmospheric concentration of greenhouse gases. CO2 capture from process streams is an established concept that has achieved industrial practice. Examples of current applications include the use of primarily solvent-based capture technologies for the recovery of pure CO2 streams for chemical synthesis, for utilization as a food additive, for use as a miscible agent in enhanced oil recovery operations, and for removal of CO2 as an undesired contaminant from gaseous process streams for the production of fuel gases such as hydrogen and methane. In these applications, the technologies deployed for CO2 capture have focused on gas separation from high purity, high pressure streams and in reducing (or oxygen deficient) environments, where the energy penalties and cost for capture are moderately low. However, application of the same capture technologies for large scale abatement of greenhouse gas emissions from fossil fuel use poses significant challenges in achieving (at comparably low energy penalty and cost) gas separation in large volume, dilute concentration and/or low pressure flue gas streams. This paper will focus on a review of existing commercial methods of CO2 capture and the technology stretch, process integration and energy system pathways needed for their large scale deployment in fossil fueled processes. The assessment of potential capture technologies for the latter purpose will also be based on published literature data that are both 'transparent' and 'systematic' in their evaluation of the overall cost and energy penalties of CO2 capture. In view of the fact that many of the existing commercial processes for CO2 capture have seen applications in

  1. Evidence for Truncated Exponential Probability Distribution of Earthquake Slip

    KAUST Repository

    Thingbaijam, Kiran Kumar; Mai, Paul Martin

    2016-01-01

    Earthquake ruptures comprise spatially varying slip on the fault surface, where slip represents the displacement discontinuity between the two sides of the rupture plane. In this study, we analyze the probability distribution of coseismic slip, which provides important information to better understand earthquake source physics. Although the probability distribution of slip is crucial for generating realistic rupture scenarios for simulation-based seismic and tsunami-hazard analysis, the statistical properties of earthquake slip have received limited attention so far. Here, we use the online database of earthquake source models (SRCMOD) to show that the probability distribution of slip follows the truncated exponential law. This law agrees with rupture-specific physical constraints limiting the maximum possible slip on the fault, similar to physical constraints on maximum earthquake magnitudes. We show that the parameters of the best-fitting truncated exponential distribution scale with average coseismic slip. This scaling property reflects the control of the underlying stress distribution and fault strength on the rupture dimensions, which determines the average slip. Thus, the scale-dependent behavior of slip heterogeneity is captured by the probability distribution of slip. We conclude that the truncated exponential law accurately quantifies the coseismic slip distribution and therefore allows for more realistic modeling of rupture scenarios.
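
    To make the truncated exponential law concrete, here is a minimal sketch (with invented parameters, not values fitted to SRCMOD) that draws slip samples from an exponential distribution truncated at a physical maximum slip, using inverse-CDF sampling.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def sample_truncated_exponential(rate, x_max, size):
        """Inverse-CDF sampling from an exponential truncated to [0, x_max]."""
        u = rng.uniform(size=size)
        return -np.log(1 - u * (1 - np.exp(-rate * x_max))) / rate

    # Hypothetical parameters: rate in 1/m, physical upper bound on slip in m.
    rate, slip_max = 0.8, 10.0
    slip = sample_truncated_exponential(rate, slip_max, 100_000)

    print(slip.max() <= slip_max)  # True: the truncation is respected
    print(slip.mean())             # scales with the average coseismic slip
    ```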

  2. Evidence for Truncated Exponential Probability Distribution of Earthquake Slip

    KAUST Repository

    Thingbaijam, Kiran K. S.

    2016-07-13

    Earthquake ruptures comprise spatially varying slip on the fault surface, where slip represents the displacement discontinuity between the two sides of the rupture plane. In this study, we analyze the probability distribution of coseismic slip, which provides important information to better understand earthquake source physics. Although the probability distribution of slip is crucial for generating realistic rupture scenarios for simulation-based seismic and tsunami-hazard analysis, the statistical properties of earthquake slip have received limited attention so far. Here, we use the online database of earthquake source models (SRCMOD) to show that the probability distribution of slip follows the truncated exponential law. This law agrees with rupture-specific physical constraints limiting the maximum possible slip on the fault, similar to physical constraints on maximum earthquake magnitudes. We show that the parameters of the best-fitting truncated exponential distribution scale with average coseismic slip. This scaling property reflects the control of the underlying stress distribution and fault strength on the rupture dimensions, which determines the average slip. Thus, the scale-dependent behavior of slip heterogeneity is captured by the probability distribution of slip. We conclude that the truncated exponential law accurately quantifies the coseismic slip distribution and therefore allows for more realistic modeling of rupture scenarios.

  3. THE BLACK HOLE FORMATION PROBABILITY

    Energy Technology Data Exchange (ETDEWEB)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D., E-mail: dclausen@tapir.caltech.edu [TAPIR, Walter Burke Institute for Theoretical Physics, California Institute of Technology, Mailcode 350-17, Pasadena, CA 91125 (United States)

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  4. THE BLACK HOLE FORMATION PROBABILITY

    International Nuclear Information System (INIS)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-01-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment

  5. The Black Hole Formation Probability

    Science.gov (United States)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.
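
    The probabilistic framework lends itself directly to population synthesis, as the record notes. Below is a purely illustrative sketch: the sigmoid form and every number in p_bh are hypothetical stand-ins, not the P_BH(M_ZAMS) constrained in the paper; the point is only how stochastic NS-versus-BH outcomes would be sampled from such a function.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def p_bh(m_zams):
        """Toy, purely illustrative P_BH(M_ZAMS): a smooth ramp from
        NS-dominated outcomes near 20 Msun to BH-dominated near 40 Msun.
        The real function is what studies like this one try to constrain."""
        return 1.0 / (1.0 + np.exp(-(m_zams - 30.0) / 4.0))

    # Draw stochastic NS-vs-BH outcomes for a population of massive stars.
    masses = rng.uniform(15.0, 60.0, size=10_000)
    is_bh = rng.uniform(size=masses.size) < p_bh(masses)
    print(f"BH fraction: {is_bh.mean():.3f}")
    ```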

  6. Electron capture in asymmetric collisions

    International Nuclear Information System (INIS)

    Graviele, M.S.; Miraglia, J.E.

    1988-01-01

    The K-shell electron capture by protons is calculated using on-shell impulse wave functions, exact and eikonal, in the initial and final channels respectively. Both wave functions are normalized and have the correct asymptotic conditions. Good agreement with the experimental data is found. (A.C.A.S.)

  7. Capturing Attention When Attention "Blinks"

    Science.gov (United States)

    Wee, Serena; Chua, Fook K.

    2004-01-01

    Four experiments addressed the question of whether attention may be captured when the visual system is in the midst of an attentional blink (AB). Participants identified 2 target letters embedded among distractor letters in a rapid serial visual presentation sequence. In some trials, a square frame was inserted between the targets; as the only…

  8. Radiative muon capture on hydrogen

    International Nuclear Information System (INIS)

    Schott, W.; Ahmad, S.; Chen, C.Q.; Gumplinger, P.; Hasinoff, M.D.; Larabee, A.J.; Sample, D.G.; Zhang, N.S.; Armstrong, D.S.; Blecher, M.; Serna-Angel, A.; Azuelos, G.; von Egidy, T.; Macdonald, J.A.; Poutissou, J.M.; Poutissou, R.; Wright, D.H.; Henderson, R.S.; McDonald, S.C.; Taylor, G.N.; Doyle, B.; Depommier, P.; Jonkmans, G.; Bertl, W.; Gorringe, T.P.; Robertson, B.C.

    1991-03-01

    The induced pseudoscalar coupling constant, g_P, of the weak hadronic current can be determined from the measurement of the branching ratio of radiative muon capture (RMC) on hydrogen. This rare process is being investigated in the TRIUMF RMC experiment, which is now taking data. This paper describes the experiment and indicates the status of the data analysis. (Author) 8 refs., 7 figs

  9. Foundations of the theory of probability

    CERN Document Server

    Kolmogorov, AN

    2018-01-01

    This famous little book remains a foundational text for the understanding of probability theory, important both to students beginning a serious study of probability and to historians of modern mathematics. 1956 second edition.

  10. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
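
    The underlying connection is simple enough to state in code: for a spinner, the probability of landing on a side is that sector's central angle divided by 360°, and a biased spinner simply has unequal angles. The four-sector spinner below is a made-up example.

    ```python
    # Hypothetical biased spinner with four sectors of unequal angles.
    angles = {"A": 150, "B": 90, "C": 80, "D": 40}
    assert sum(angles.values()) == 360

    # Probability of each side = central angle / full circle.
    probabilities = {side: angle / 360 for side, angle in angles.items()}
    print(probabilities)  # {'A': 0.416.., 'B': 0.25, 'C': 0.222.., 'D': 0.111..}
    ```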

  11. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan Cort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  12. Analytic Neutrino Oscillation Probabilities in Matter: Revisited

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT

    2018-01-02

    We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.
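
    The record above is only a summary, but the quantity being approximated can be illustrated with the textbook two-flavour vacuum oscillation formula (matter effects, the paper's subject, modify this expression); the baseline, energy and oscillation parameters below are representative values chosen only for the example.

    ```python
    import numpy as np

    def p_oscillation(L_km, E_GeV, sin2_2theta=0.85, dm2_eV2=2.5e-3):
        """Two-flavour vacuum oscillation probability:
        P = sin^2(2 theta) * sin^2(1.267 * dm^2[eV^2] * L[km] / E[GeV])."""
        return sin2_2theta * np.sin(1.267 * dm2_eV2 * L_km / E_GeV) ** 2

    # Representative long-baseline values (roughly T2K-like), for illustration.
    print(p_oscillation(L_km=295.0, E_GeV=0.6))
    ```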

  13. Void probability scaling in hadron nucleus interactions

    International Nuclear Information System (INIS)

    Ghosh, Dipak; Deb, Argha; Bhattacharyya, Swarnapratim; Ghosh, Jayita; Bandyopadhyay, Prabhat; Das, Rupa; Mukherjee, Sima

    2002-01-01

    Hegyi, while investigating the rapidity gap probability (which measures the chance of finding no particle in the pseudo-rapidity interval Δη), found that a scaling behavior in the rapidity gap probability has a close correspondence with the scaling of the void probability in galaxy correlation studies. The main aim of this paper is to study the scaling behavior of the rapidity gap probability.

  14. Identification of Indicators’ Applicability to Settle Borrowers’ Probability of Default

    Directory of Open Access Journals (Sweden)

    Jurevičienė Daiva

    2016-06-01

    Full Text Available Borrowers' default risk is one of the most relevant types of risk in commercial banking, and its assessment is important to secure business profitability and avoid huge losses during economic turbulence. This leads to the necessity of investigating topics related to the assessment of borrowers' default probability and the applicability of factors that would capture the newest trends in borrowers' markets. Leading economic indicators (in addition to financial and other economic indicators) are often suggested as forward-looking in the scientific literature. However, there is still a discussion going on about the applicability of financial ratios and economic indicators. As the problem is relevant from a theoretical view as well as for practitioners, this article aims to identify the applicability of leading economic indicators for the estimation of default probability. Further, qualitative criteria for factor selection were identified and applied using detailing, grouping and SWOT analysis methods. Based on current scientific literature analysis, this paper concludes that although leading economic indicators are able to capture forward-looking signals, they should be used with careful analysis of their drawbacks and in combination with financial factors in order to avoid overshooting effects. The limitation of the article is that the analysis of factors is based on theoretical analysis rather than the estimation of quantitative criteria. This suggests that every use of leading economic indicators requires an empirical study of the particular indicators' set.

  15. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  16. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  17. Materials For Gas Capture, Methods Of Making Materials For Gas Capture, And Methods Of Capturing Gas

    KAUST Repository

    Polshettiwar, Vivek; Patil, Umesh

    2013-01-01

    In accordance with the purpose(s) of the present disclosure, as embodied and broadly described herein, embodiments of the present disclosure, in one aspect, relate to materials that can be used for gas (e.g., CO.sub.2) capture, methods of making

  18. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  19. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability.
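
    The record describes an integral-equation method; as a quick numerical counterpart (not the paper's method), the sketch below estimates a first-passage probability by Monte Carlo for an Ornstein-Uhlenbeck process standing in for a randomly vibrating response, with all parameters invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def first_passage_probability(barrier=2.5, T=10.0, dt=0.01, n_paths=10_000):
        """Monte Carlo estimate of P(max_{t<=T} X(t) >= barrier) for an
        Ornstein-Uhlenbeck process dX = -X dt + sqrt(2) dW (unit variance)."""
        x = np.zeros(n_paths)
        crossed = np.zeros(n_paths, dtype=bool)
        for _ in range(int(T / dt)):
            x += -x * dt + np.sqrt(2 * dt) * rng.standard_normal(n_paths)
            crossed |= x >= barrier
        return crossed.mean()

    print(first_passage_probability())  # first-passage probability within T
    ```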

  20. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axiomatics

  1. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes' theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.
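
    The classical Bayes' theorem that the paper's paraconsistent conditionalization generalizes is worth seeing in its simplest form; the sketch below covers only the ordinary (consistent) case, with made-up numbers.

    ```python
    # Classical Bayesian updating (the special case that a paraconsistent
    # conditionalization would extend): posterior from prior and likelihoods.
    def bayes_update(prior, likelihood_given_h, likelihood_given_not_h):
        """P(H | E) via Bayes' theorem."""
        numerator = prior * likelihood_given_h
        return numerator / (numerator + (1 - prior) * likelihood_given_not_h)

    # Hypothetical numbers: P(H) = 0.3, P(E|H) = 0.9, P(E|~H) = 0.2.
    print(bayes_update(0.3, 0.9, 0.2))  # ~0.658
    ```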

  2. Stress evaluation in hares (Lepus europaeus Pallas) captured for translocation

    Directory of Open Access Journals (Sweden)

    Antonio Lavazza

    2010-01-01

    Full Text Available With the aim of evaluating capture techniques, some haematic and physiological parameters were studied to discriminate stressed hares from non-stressed hares. A total of 66 wild hares (experimental group) were sampled in 14 different non-hunting areas, where hares are usually captured for later release in low-density areas. In the same season a total of 30 hares (about 1 year old, reared in cages and thus showing a reduced fear of man) were sampled (control group). In each area the hares were captured by coursing with 3-4 dogs (greyhounds or lurchers). The dogs were released by the different hunter teams to find and drive into trammel nets any hare that was seen running. After capture, the hares remained inside darkened, wooden capture-boxes for a variable period of time before blood drawing. For blood sample collection all the hares were physically restrained and their eyes immediately covered. Blood, always collected within 1-2 minutes, was drawn from the auricular vein. Blood samples (plasma) were analysed for glucose, AST, ALT, CPK and cortisol concentrations. Body temperature, heart and respiratory rate, sex, and age were evaluated in each hare. The effect of origin, sex and age on haematic and physiological parameters was analysed by ANOVA. Every measured parameter of the hares belonging to the capture group or the control group (reared) was then subjected to stepwise and to discriminant analysis, in order to select the groups of stressed (discriminated from the controls) and non-stressed hares. CPK, AST and glucose were found to be the best parameters for distinguishing stressed from non-stressed hares. The intensive exercise suffered by the wild hares induced a depletion of energetic reserves, so that most of the captured hares showed lower glucose and higher CPK activity in the plasma, probably due to muscle damage (P < 0.05). After reclassifying the hares in the two groups of stressed and non-stressed hares, the reference values (means

  3. Experimental studies of electron capture

    International Nuclear Information System (INIS)

    Pedersen, E.H.

    1983-01-01

    This thesis discusses the main results of recent experimental studies of electron capture in asymmetric collisions. Most of the results have been published, but the thesis also contains yet unpublished data, or data presented only in unrefereed conference proceedings. The thesis aims at giving a coherent discussion of the understanding of the experimental results, based first on simple qualitative considerations and subsequently on quantitative comparisons with the best theoretical calculations currently available. (Auth.)

  4. Ubiquitous log odds: a common representation of probability and frequency distortion in perception, action and cognition

    Directory of Open Access Journals (Sweden)

    Hang Zhang

    2012-01-01

    Full Text Available In decision from experience, the source of probability information affects how probability is distorted in the decision task. Understanding how and why probability is distorted is a key issue in understanding the peculiar character of experience-based decision. We consider how probability information is used not just in decision making but also in a wide variety of cognitive, perceptual and motor tasks. Very similar patterns of distortion of probability/frequency information have been found in visual frequency estimation, frequency estimation based on memory, signal detection theory, and in the use of probability information in decision-making under risk and uncertainty. We show that distortion of probability in all cases is well captured as linear transformations of the log odds of frequency and/or probability, a model with a slope parameter and an intercept parameter. We then consider how task and experience influence these two parameters and the resulting distortion of probability. We review how the probability distortions change in systematic ways with task and report three experiments on frequency distortion where the distortions change systematically in the same task. We found that the slope of frequency distortions decreases with the sample size, which is echoed by findings in decision from experience. We review previous models of the representation of uncertainty and find that none can account for the empirical findings.
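
    The linear-in-log-odds model mentioned above has exactly two parameters and is easy to sketch; the slope and intercept below are arbitrary illustrative values, not estimates from the reviewed experiments.

    ```python
    import numpy as np

    def distorted_probability(p, slope=0.6, intercept=0.4):
        """Linear-in-log-odds model of probability distortion:
        logit(w(p)) = slope * logit(p) + intercept."""
        p = np.asarray(p, dtype=float)
        log_odds = np.log(p / (1 - p))
        return 1 / (1 + np.exp(-(slope * log_odds + intercept)))

    p = np.array([0.01, 0.1, 0.5, 0.9, 0.99])
    # With slope < 1, small probabilities are overweighted and large
    # probabilities underweighted, the classic inverse-S pattern.
    print(distorted_probability(p))
    ```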

  5. Saliency Detection via Absorbing Markov Chain With Learnt Transition Probability.

    Science.gov (United States)

    Lihe Zhang; Jianwu Ai; Bowen Jiang; Huchuan Lu; Xiukui Li

    2018-02-01

    In this paper, we propose a bottom-up saliency model based on an absorbing Markov chain (AMC). First, a sparsely connected graph is constructed to capture the local context information of each node. All image boundary nodes and other nodes are, respectively, treated as the absorbing nodes and transient nodes in the absorbing Markov chain. Then, the expected number of visits, starting from each transient node, to all other transient nodes can be used to represent the saliency value of this node. The absorbed time depends on the weights on the path and their spatial coordinates, which are completely encoded in the transition probability matrix. Considering the importance of this matrix, we adopt different hierarchies of deep features extracted from fully convolutional networks and learn a transition probability matrix, which is called the learnt transition probability matrix. Although the performance is significantly promoted, salient objects are not uniformly highlighted very well. To solve this problem, an angular embedding technique is investigated to refine the saliency results. Based on pairwise local orderings, which are produced by the saliency maps of AMC and boundary maps, we rearrange the global orderings (saliency values) of all nodes. Extensive experiments demonstrate that the proposed algorithm outperforms the state-of-the-art methods on six publicly available benchmark data sets.
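
    The absorbed-time computation at the heart of such a model reduces to the fundamental matrix of an absorbing chain. The toy sketch below uses a hand-made 3-node transient block (no image graph and no learnt weights, which the paper adds on top) to illustrate the mechanics.

    ```python
    import numpy as np

    # Toy absorbing Markov chain: Q holds transition probabilities among
    # transient (non-boundary) nodes; rows need not sum to 1 because the
    # remaining mass goes to the absorbing (boundary) nodes.
    Q = np.array([[0.0, 0.5, 0.2],
                  [0.4, 0.0, 0.3],
                  [0.2, 0.3, 0.0]])

    # Fundamental matrix N = (I - Q)^(-1); N[i, j] is the expected number
    # of visits to transient node j starting from transient node i.
    N = np.linalg.inv(np.eye(3) - Q)

    # Expected time to absorption per node (row sums), used as saliency:
    # salient (object) nodes take longer to reach the boundary.
    saliency = N.sum(axis=1)
    print(saliency)
    ```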

  6. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle, from drug development to manufacture, marketing to product discontinuation. The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a subjective "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples, from the most general, scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management.

  7. Transition probability spaces in loop quantum gravity

    Science.gov (United States)

    Guo, Xiao-Kan

    2018-03-01

    We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.

  8. Towards a Categorical Account of Conditional Probability

    Directory of Open Access Journals (Sweden)

    Robert Furber

    2015-11-01

    Full Text Available This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and there are also extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.

  9. UT Biomedical Informatics Lab (BMIL) probability wheel

    Science.gov (United States)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.

  10. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes and probabilities defined for all events. A reformulation in terms of propositions allows to use the maximum entropy method to assign the probabilities taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, choosing the constraints and making the probability assignment by the maximum entropy method. This approach shows, how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related with well-known classical distributions. The relation between the conditional probability density, given some averages as constraints and the appropriate ensemble is elucidated.
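
    The maximum-entropy assignment sketched in the abstract can be reproduced in a few lines for the simplest case of a single average constraint; the outcomes and target mean below are invented, and the exponential (Boltzmann-like) form of the solution is the standard maximum-entropy result.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    # Maximum-entropy assignment over outcomes x with a mean constraint
    # <x> = m: the solution is p_i proportional to exp(-lam * x_i), with
    # lam chosen so the constraint holds (hypothetical values throughout).
    x = np.array([1.0, 2.0, 3.0, 4.0])
    m = 1.8

    def mean_minus_target(lam):
        w = np.exp(-lam * x)
        return (w * x).sum() / w.sum() - m

    lam = brentq(mean_minus_target, -50, 50)  # root gives the multiplier
    p = np.exp(-lam * x)
    p /= p.sum()
    print(p, (p * x).sum())  # probabilities and their mean (= 1.8)
    ```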

  11. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections

  12. Striatal activity is modulated by target probability.

    Science.gov (United States)

    Hon, Nicholas

    2017-06-14

    Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of the fact that different components of the striatum are sensitive to different types of task-relevant information.

  13. Defining Probability in Sex Offender Risk Assessment.

    Science.gov (United States)

    Elwood, Richard W

    2016-12-01

    There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.

  14. Realistic costs of carbon capture

    Energy Technology Data Exchange (ETDEWEB)

    Al Juaied, Mohammed (Harvard Univ., Cambridge, MA (US). Belfer Center for Science and International Affiaris); Whitmore, Adam (Hydrogen Energy International Ltd., Weybridge (GB))

    2009-07-01

    There is a growing interest in carbon capture and storage (CCS) as a means of reducing carbon dioxide (CO2) emissions. However there are substantial uncertainties about the costs of CCS. Costs for pre-combustion capture with compression (i.e. excluding costs of transport and storage and any revenue from EOR associated with storage) are examined in this discussion paper for First-of-a-Kind (FOAK) plant and for more mature technologies, or Nth-of-a-Kind plant (NOAK). For FOAK plant using solid fuels the levelised cost of electricity on a 2008 basis is approximately 10 cents/kWh higher with capture than for conventional plants (with a range of 8-12 cents/kWh). Costs of abatement are found typically to be approximately US$150/tCO2 avoided (with a range of US$120-180/tCO2 avoided). For NOAK plants the additional cost of electricity with capture is approximately 2-5 cents/kWh, with costs of the range of US$35-70/tCO2 avoided. Costs of abatement with carbon capture for other fuels and technologies are also estimated for NOAK plants. The costs of abatement are calculated with reference to conventional SCPC plant for both emissions and costs of electricity. Estimates for both FOAK and NOAK are mainly based on cost data from 2008, which was at the end of a period of sustained escalation in the costs of power generation plant and other large capital projects. There are now indications of costs falling from these levels. This may reduce the costs of abatement and costs presented here may be 'peak of the market' estimates. If general cost levels return, for example, to those prevailing in 2005 to 2006 (by which time significant cost escalation had already occurred from previous levels), then costs of capture and compression for FOAK plants are expected to be US$110/tCO2 avoided (with a range of US$90-135/tCO2 avoided). For NOAK plants costs are expected to be US$25-50/tCO2. Based on these considerations a likely representative range of costs of abatement from CCS
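
    The arithmetic behind such abatement figures can be checked directly: the cost per tonne of CO2 avoided is the added cost of electricity divided by the emissions avoided per unit of electricity. The sketch below uses assumed emission intensities (0.8 tCO2/MWh for the reference plant, 0.1 tCO2/MWh with capture) that are illustrative rather than taken from the paper.

        def abatement_cost(delta_lcoe_cents_per_kwh, e_ref, e_cap):
            """Cost of abatement in US$/tCO2 avoided.
            delta_lcoe: added cost of electricity with capture (cents/kWh);
            e_ref, e_cap: emission intensities (tCO2/MWh) of the reference
            plant and the capture plant."""
            delta_usd_per_mwh = delta_lcoe_cents_per_kwh * 10.0  # 1 c/kWh = $10/MWh
            return delta_usd_per_mwh / (e_ref - e_cap)

        # Illustrative FOAK case: +10 c/kWh, assumed intensities 0.8 vs 0.1 tCO2/MWh.
        print(abatement_cost(10.0, 0.8, 0.1))   # ~143 $/tCO2, inside the quoted range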

  15. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  16. Algal Energy Conversion and Capture

    Science.gov (United States)

    Hazendonk, P.

    2015-12-01

    We address the potential for energy conversion and capture for: energy generation; reduction in energy use; reduction in greenhouse gas emissions; remediation of water and air pollution; protection and enhancement of soil fertility. These processes have the potential to sequester carbon at scales that may have global impact. Energy conversion and capture strategies evaluate energy use and production from agriculture, urban areas and industries, and apply existing and emerging technologies to reduce and recapture energy embedded in waste products. The basis of biocrude production from micro-algal feedstocks: 1) The nutrients from the liquid fraction of waste streams are concentrated and fed into photobioreactors (essentially large vessels in which microalgae are grown) along with CO2 from flue gases from downstream processes. 2) The algae are processed to remove high-value products such as proteins and beta-carotenes. The advantage of algal feedstocks is that their biomass productivity is 30-50 times that of land-based crops, and the remaining biomass contains minimal components that are difficult to convert to biocrude. 3) The remaining biomass undergoes hydrothermal liquefaction to produce biocrude and biochar. The flue gases of this process can be used to produce electricity (fuel cell) and subsequently fed back into the photobioreactor. The thermal energy required for this process is small, hence readily obtained from solar-thermal sources, and furthermore no drying or preprocessing is required, keeping the energy overhead extremely small. 4) The biocrude can be upgraded and refined as conventional crude oil, creating a range of liquid fuels. In principle this process can be applied from the farm scale to the municipal scale. Overall, our primary food production is too dependent on fossil fuels. Energy conversion and capture can make food production sustainable.

  17. Is probability of frequency too narrow?

    International Nuclear Information System (INIS)

    Martz, H.F.

    1993-01-01

    Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed
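
    A common concrete reading of "probability of frequency" is a Bayesian posterior over an event rate. The sketch below, with invented hyperparameters and data, uses the conjugate gamma-Poisson pair to turn an observed event count into a distribution over the underlying frequency.

        from scipy import stats

        # Prior belief about an initiating-event frequency (per year):
        a, b = 0.5, 10.0            # Gamma(shape, rate) hyperparameters (assumed)
        k, T = 2, 25.0              # 2 events observed in 25 reactor-years (assumed)

        post = stats.gamma(a + k, scale=1.0 / (b + T))  # conjugate Gamma posterior
        print(post.mean())          # point estimate of the frequency
        print(post.sf(0.2))         # probability that the frequency exceeds 0.2/yr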

  18. EJECTION AND CAPTURE DYNAMICS IN RESTRICTED THREE-BODY ENCOUNTERS

    International Nuclear Information System (INIS)

    Kobayashi, Shiho; Hainick, Yanir; Sari, Re'em; Rossi, Elena M.

    2012-01-01

    We study the tidal disruption of binaries by a massive point mass (e.g., the black hole at the Galactic center), and we discuss how the ejection and capture preference between unequal-mass binary members depends on which orbit they approach the massive object. We show that the restricted three-body approximation provides a simple and clear description of the dynamics. The orbit of a binary with mass m around a massive object M should be almost parabolic, with an eccentricity |1 - e| ∼ (m/M)^(1/3), for a member to be captured. If a binary approaches the massive object with a velocity much larger than (M/m)^(1/3) times the binary rotation velocity, it would be abruptly disrupted, and the energy change at the encounter can be evaluated in a simple disruption model. We evaluate the probability distributions for the ejection and capture of circular binary members and for the final energies. In principle, for any hyperbolic (elliptic) orbit, the heavier member has more chance to be ejected (captured), because it carries a larger fraction of the orbital energy. However, if the orbital energy is close to zero, the difference between the two members becomes small, and there is practically no ejection and capture preference. The preference becomes significant when the orbital energy is comparable to the typical energy change at the encounter. We discuss the implications for hypervelocity stars and irregular satellites around giant planets.

  19. Active Traffic Capture for Network Forensics

    Science.gov (United States)

    Slaviero, Marco; Granova, Anna; Olivier, Martin

    Network traffic capture is an integral part of network forensics, but current traffic capture techniques are typically passive in nature. Under heavy loads, it is possible for a sniffer to miss packets, which affects the quality of forensic evidence.

  20. CAPTURING REALITY AT CENTRE BLOCK

    Directory of Open Access Journals (Sweden)

    C. Boulanger

    2017-08-01

    Full Text Available The Centre Block of Canada’s Parliament buildings, National Historic Site of Canada, is set to undergo a major rehabilitation project that will take approximately 10 years to complete. In preparation for this work, Heritage Conservation Services (HCS) of Public Services and Procurement Canada has been completing heritage documentation of the entire site, which includes laser scanning of all interior rooms and accessible confined spaces such as attics and other similar areas. Other documentation completed includes detailed photogrammetric documentation of rooms and areas of high heritage value. Some of these high heritage value spaces present certain challenges, such as accessibility due to the height and the size of the spaces. Another challenge is the poor lighting conditions, requiring the use of flash or strobe lighting to either complement or completely eliminate the available ambient lighting. All the spaces captured at this higher level of detail were also captured with laser scanning. This allowed the team to validate the information and conduct a quality review of the photogrammetric data. As a result of this exercise, the team realized that in most, if not all cases, the photogrammetric data was more detailed and of a higher quality than the terrestrial laser scanning data. The purpose and motivation of this paper is to present these findings, as well as to provide the advantages and disadvantages of the two methods and data sets.

  1. Adaptive capture of expert knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Barrett, C.L.; Jones, R.D. [Los Alamos National Lab., NM (United States); Hand, Un Kyong [Los Alamos National Lab., NM (United States)]|[US Navy (United States)

    1995-05-01

    A method is introduced that can directly acquire knowledge-engineered, rule-based logic in an adaptive network. This adaptive representation of the rule system can then replace the rule system in simulated intelligent agents and thereby permit further performance-based adaptation of the rule system. The approach described provides both weight-fitting network adaptation and potentially powerful rule mutation and selection mechanisms. Nonlinear terms are generated implicitly in the mutation process through the emergent interaction of multiple linear terms. By this method it is possible to acquire nonlinear relations that exist in the training data without the addition of hidden layers or the imposition of explicit nonlinear terms in the network. We smoothed and captured a set of expert rules with an adaptive network. The motivation for this was to (1) realize a speed advantage over traditional rule-based simulations; (2) obtain variability in the intelligent objects that is not possible with rule-based systems but is provided by adaptive systems; and (3) maintain the understandability of rule-based simulations. A set of binary rules was smoothed and converted into a simple set of arithmetic statements, where continuous, non-binary rules are permitted. A neural network, called the expert network, was developed to capture this rule set, which it was able to do with zero error. The expert network is also capable of learning a nonmonotonic term without a hidden layer. The trained network in feedforward operation is fast running, compact, and traceable to the rule base.

  2. Capturing Reality at Centre Block

    Science.gov (United States)

    Boulanger, C.; Ouimet, C.; Yeomans, N.

    2017-08-01

    The Centre Block of Canada's Parliament buildings, National Historic Site of Canada, is set to undergo a major rehabilitation project that will take approximately 10 years to complete. In preparation for this work, Heritage Conservation Services (HCS) of Public Services and Procurement Canada has been completing heritage documentation of the entire site, which includes laser scanning of all interior rooms and accessible confined spaces such as attics and other similar areas. Other documentation completed includes detailed photogrammetric documentation of rooms and areas of high heritage value. Some of these high heritage value spaces present certain challenges, such as accessibility due to the height and the size of the spaces. Another challenge is the poor lighting conditions, requiring the use of flash or strobe lighting to either complement or completely eliminate the available ambient lighting. All the spaces captured at this higher level of detail were also captured with laser scanning. This allowed the team to validate the information and conduct a quality review of the photogrammetric data. As a result of this exercise, the team realized that in most, if not all cases, the photogrammetric data was more detailed and of a higher quality than the terrestrial laser scanning data. The purpose and motivation of this paper is to present these findings, as well as to provide the advantages and disadvantages of the two methods and data sets.

  3. Giant resonance effects in radiative capture

    International Nuclear Information System (INIS)

    Snover, K.A.

    1979-01-01

    The technique of capture reaction studies of giant resonance properties is described, and a number of examples are given. Most of the recent work of interest has been in proton capture, in part because of the great utility (and availability) of polarized beams; most of the discussion concerns this reaction. Alpha capture, which has been a useful tool for exploring isoscalar E2 strength, and neutron capture are, however, also treated. 46 references, 14 figures

  4. The Effectiveness of Classroom Capture Technology

    Science.gov (United States)

    Ford, Maire B.; Burns, Colleen E.; Mitch, Nathan; Gomez, Melissa M.

    2012-01-01

    The use of classroom capture systems (systems that capture audio and video footage of a lecture and attempt to replicate a classroom experience) is becoming increasingly popular at the university level. However, research on the effectiveness of classroom capture systems in the university classroom has been limited due to the recent development and…

  5. Exploratory investigations of hypervelocity intact capture spectroscopy

    Science.gov (United States)

    Tsou, P.; Griffiths, D. J.

    1993-01-01

    The ability to capture hypervelocity projectiles intact makes a new technique available for hypervelocity research. A determination of the reactions taking place between the projectile and the capture medium during the process of intact capture is extremely important to an understanding of the intact capture phenomenon, to improving the capture technique, and to developing a theory describing the phenomenon. The intact capture of hypervelocity projectiles by underdense media generates spectra characteristic of the material species of projectile and capture medium involved. Initial exploratory results on real-time spectroscopic characterization of hypervelocity intact capture include ultraviolet and visible spectra obtained by use of reflecting gratings, transmitting gratings, and prisms, and recorded by photographic and electronic means. Spectrometry proved to be a valuable real-time diagnostic tool for hypervelocity intact capture events, offering understanding of the interactions of the projectile and the capture medium during the initial period and providing information not obtainable by other characterizations. Preliminary results and analyses of spectra produced by the intact capture of hypervelocity aluminum spheres in polyethylene (PE), polystyrene (PS), and polyurethane (PU) foams are presented. Included are tentative emission species identifications, as well as gray body temperatures produced in the intact capture process.

  6. Marker-Free Human Motion Capture

    DEFF Research Database (Denmark)

    Grest, Daniel

    Human Motion Capture is a widely used technique to obtain motion data for animation of virtual characters. Commercial optical motion capture systems are marker-based. This book is about marker-free motion capture and the possibility of acquiring motion from a single viewing direction. The focus...

  7. POD evaluation for joint angles from inertial and optical motion capturing system

    International Nuclear Information System (INIS)

    Shimizu, Kai; Kobayashi, Futoshi; Nakamoto, Hiroyuki; Kojima, Fumio

    2016-01-01

    It has been recognized that advances in preventive maintenance can improve the sustainment of systems, facilities, and infrastructure. Robot technologies have also received attention for maintenance applications. To perform delicate tasks, multi-fingered robot hands have been proposed for cases where human capability is deficient. This paper deals with motion capturing systems for controlling the hand/arm robot remotely. Several types of motion capturing systems have been developed so far. However, it is difficult for individual motion capturing systems to measure precise joint angles of a human arm. Therefore, in this paper, we integrate the inertial motion capturing system with the optical motion capturing system to capture a human arm posture. The integration is carried out by evaluating the reliability of each motion capturing system. The probability of detection (POD) is applied to evaluate and compare the reliability of the datasets measured by each motion capturing system. POD is one of the most widely used statistical techniques to determine reliability. We apply the â-versus-a analysis to determine the POD(a) function from the data set. Based on the POD evaluation, the two motion capturing systems are integrated. (author)
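
    For readers unfamiliar with POD(a) curves, the sketch below evaluates the lognormal form commonly used in â-versus-a analysis, POD(a) = Phi((ln a - mu) / sigma); the parameter values are illustrative placeholders, not values fitted to the authors' motion capture data.

        import numpy as np
        from scipy.stats import norm

        def pod(a, mu=np.log(2.0), sigma=0.5):
            """Lognormal POD(a) curve used in a-hat-versus-a analysis:
            POD(a) = Phi((ln a - mu) / sigma). Parameters are illustrative."""
            return norm.cdf((np.log(a) - mu) / sigma)

        for a in (1.0, 2.0, 4.0):   # "flaw size" / signal level, arbitrary units
            print(a, pod(a))        # POD rises through 0.5 at a = exp(mu) = 2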

  8. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship...

  9. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed.The present notes outline a method for evaluation of the probability...

  10. Introducing Disjoint and Independent Events in Probability.

    Science.gov (United States)

    Kelly, I. W.; Zwiers, F. W.

    Two central concepts in probability theory are those of independence and mutually exclusive events. This document is intended to provide suggestions to teachers that can be used to equip students with an intuitive, comprehensive understanding of these basic concepts in probability. The first section of the paper delineates mutually exclusive and…
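
    A two-dice example makes the distinction concrete: disjoint (mutually exclusive) events can never be independent unless one of them has probability zero, while independence is a statement about products of probabilities. The following sketch checks both facts by enumeration.

        import itertools

        # Sample space: two fair dice. A = "first die is 6", B = "second die is 6",
        # C = "first die is 1". A and B are independent; A and C are disjoint.
        space = list(itertools.product(range(1, 7), repeat=2))
        P = lambda event: sum(1 for w in space if event(w)) / len(space)

        A = lambda w: w[0] == 6
        B = lambda w: w[1] == 6
        C = lambda w: w[0] == 1

        print(P(lambda w: A(w) and B(w)), P(A) * P(B))  # 1/36 == 1/36: independent
        print(P(lambda w: A(w) and C(w)))               # 0: disjoint, hence dependent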

  11. Selected papers on probability and statistics

    CERN Document Server

    2009-01-01

    This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.

  12. Collective probabilities algorithm for surface hopping calculations

    International Nuclear Information System (INIS)

    Bastida, Adolfo; Cruz, Carlos; Zuniga, Jose; Requena, Alberto

    2003-01-01

    General equations are derived that the transition probabilities of the hopping algorithms in surface hopping calculations must obey to ensure equality between the average quantum and classical populations. These equations are solved for two particular cases. In the first, it is assumed that the probabilities are the same for all trajectories and that the number of hops is kept to a minimum. These assumptions specify the collective probabilities (CP) algorithm, for which the transition probabilities depend on the average populations over all trajectories. In the second case, the probabilities for each trajectory are supposed to be completely independent of the results from the other trajectories. There is, then, a unique solution of the general equations assuring that the transition probabilities are equal to the quantum population of the target state, which is referred to as the independent probabilities (IP) algorithm. The fewest switches (FS) algorithm developed by Tully is accordingly understood as an approximate hopping algorithm which takes elements from the accurate CP and IP solutions. A numerical test of all these hopping algorithms is carried out for a one-dimensional two-state problem with two avoided crossings, which shows the accuracy and computational efficiency of the collective probabilities algorithm proposed, the limitations of the FS algorithm and the similarity between the results offered by the IP algorithm and those obtained with the Ehrenfest method.
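
    To make the IP idea concrete, the sketch below (an illustration under stated assumptions, not the authors' implementation) hops each trajectory independently so that the probability of landing in a given state equals that state's quantum population; over many trajectories the classical fractions then track the quantum populations.

        import numpy as np

        rng = np.random.default_rng(0)

        def ip_hop(populations):
            """'Independent probabilities'-style hop: each trajectory picks its new
            state so that the hop probability to state j equals the quantum
            population of j. A sketch of the idea only, not the full algorithm."""
            return rng.choice(len(populations), p=populations)

        # Two-state example: quantum populations 0.9 / 0.1 at some time step.
        pops = np.array([0.9, 0.1])
        states = [ip_hop(pops) for _ in range(10_000)]
        print(np.bincount(states) / len(states))   # classical fractions ~ [0.9, 0.1]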

  13. Examples of Neutrosophic Probability in Physics

    Directory of Open Access Journals (Sweden)

    Fu Yuhua

    2015-01-01

    Full Text Available This paper re-discusses the problems of the so-called “law of nonconservation of parity” and “accelerating expansion of the universe”, and presents the examples of determining Neutrosophic Probability of the experiment of Chien-Shiung Wu et al in 1957, and determining Neutrosophic Probability of accelerating expansion of the partial universe.

  14. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation...

  15. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
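
    A standard without-replacement calculation of the kind such courses teach: the probability of drawing two aces from a deck, computed as a product of conditional probabilities and contrasted with the with-replacement (independent) case.

        from fractions import Fraction

        # Drawing two cards without replacement: P(both aces)
        p_first = Fraction(4, 52)                 # 4 aces in 52 cards
        p_second_given_first = Fraction(3, 51)    # 3 aces left in 51 cards
        print(p_first * p_second_given_first)     # 1/221

        # With replacement the draws would be independent: (4/52)^2 = 1/169
        print(Fraction(4, 52) ** 2)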

  16. Some open problems in noncommutative probability

    International Nuclear Information System (INIS)

    Kruszynski, P.

    1981-01-01

    A generalization of probability measures to non-Boolean structures is discussed. The starting point of the theory is the Gleason theorem about the form of measures on closed subspaces of a Hilbert space. The problems are formulated in terms of probability on lattices of projections in arbitrary von Neumann algebras. (Auth.)

  17. Probability: A Matter of Life and Death

    Science.gov (United States)

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…
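
    A toy life-table computation illustrates the idea: from one-year death probabilities q(x) one builds survival probabilities l(x) and a (curtate) life expectancy at birth. The mortality numbers below are invented for the example, not taken from any real life table.

        # Toy life table: q[x] is the probability of dying between ages x and x+1.
        q = [0.01, 0.002, 0.002, 0.003, 0.005, 0.01, 0.02, 0.05, 0.12, 0.30, 1.0]

        l = [1.0]                      # l[x]: probability of surviving to age x
        for qx in q:
            l.append(l[-1] * (1.0 - qx))

        # Curtate life expectancy at birth: expected number of whole years lived.
        e0 = sum(l[1:])
        print(e0)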

  18. Teaching Probability: A Socio-Constructivist Perspective

    Science.gov (United States)

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.

  19. Stimulus Probability Effects in Absolute Identification

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  20. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    47 CFR 1.1623 (2010-10-01), Telecommunication, Federal Communications Commission, General Practice and Procedure, Random Selection Procedures for Mass Media Services, General Procedures: Probability calculation. (a) All calculations shall be...

  1. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  2. Against All Odds: When Logic Meets Probability

    NARCIS (Netherlands)

    van Benthem, J.; Katoen, J.-P.; Langerak, R.; Rensink, A.

    2017-01-01

    This paper is a light walk along interfaces between logic and probability, triggered by a chance encounter with Ed Brinksma. It is not a research paper, or a literature survey, but a pointer to issues. I discuss both direct combinations of logic and probability and structured ways in which logic can

  3. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  4. The probability of the false vacuum decay

    International Nuclear Information System (INIS)

    Kiselev, V.; Selivanov, K.

    1983-01-01

    The closed expression for the probability of the false vacuum decay in (1+1) dimensions is given. The probability of false vacuum decay is expressed as the product of an exponential quasiclassical factor and a functional determinant of the given form. The method for calculation of this determinant is developed and a complete answer for (1+1) dimensions is given

  5. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  6. The transition probabilities of the reciprocity model

    NARCIS (Netherlands)

    Snijders, T.A.B.

    1999-01-01

    The reciprocity model is a continuous-time Markov chain model used for modeling longitudinal network data. A new explicit expression is derived for its transition probability matrix. This expression can be checked relatively easily. Some properties of the transition probabilities are given, as well
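
    For a generic continuous-time Markov chain (of which the reciprocity model is a special case; the rate matrix below is invented, not the model's), the transition probability matrix over time t is the matrix exponential P(t) = exp(Qt), which is easy to evaluate and check numerically.

        import numpy as np
        from scipy.linalg import expm

        # Assumed 3-state rate matrix Q (rows sum to zero), e.g. dyad states
        # null / asymmetric / mutual in a longitudinal network model.
        Q = np.array([[-0.3,  0.3,  0.0],
                      [ 0.2, -0.5,  0.3],
                      [ 0.0,  0.4, -0.4]])

        t = 2.0
        P = expm(Q * t)        # transition probability matrix P(t) = exp(Qt)
        print(P)
        print(P.sum(axis=1))   # each row sums to 1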

  7. Probability numeracy and health insurance purchase

    NARCIS (Netherlands)

    Dillingh, Rik; Kooreman, Peter; Potters, Jan

    2016-01-01

    This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.

  8. The enigma of probability and physics

    International Nuclear Information System (INIS)

    Mayants, L.

    1984-01-01

    This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (science of probability) and probabilistic physics (application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed - quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)

  9. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws such as cracks and crack-like flaws are desired to be detected using these NDE methods. A reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper provides a discussion on optimizing probability of detection (POD) demonstration experiments using the point estimate method. The POD point estimate method is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size within some tolerance is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false (POF) calls while keeping the flaw sizes in the set as small as possible.
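
    The binomial arithmetic behind the point estimate demonstration is easy to reproduce: passing 29 of 29 trials demonstrates 0.90 POD at 95% confidence, and the probability of passing the demonstration (PPD) for a given true POD is that POD raised to the 29th power. A sketch:

        # Point-estimate demonstration: 29 hits out of 29 flaws demonstrates
        # 0.90 POD at 95% confidence, because a flaw detected with probability
        # 0.90 would pass all 29 trials only ~4.7% of the time.
        n = 29
        print(0.90 ** n)              # ~0.047 < 0.05, hence the 90/95 criterion

        # Probability of passing the demonstration (PPD) for better procedures:
        for true_pod in (0.95, 0.98, 0.99):
            print(true_pod, true_pod ** n)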

  10. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  11. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

    It is anticipated that changes in the frequency of surveillance tests, preventive maintenance or parts replacement of safety-related components may change component failure probabilities and hence the core damage probability, and that the change differs depending on the initiating event frequency and the component type. This study assessed the change in core damage probability using a simplified PSA model, developed by the US NRC to process accident sequence precursors, that is capable of calculating core damage probability in a short time, when various component failure probabilities are varied between 0 and 1 and when Japanese or American initiating event frequency data are used. As a result of the analysis: (1) It was clarified that the frequency of surveillance tests, preventive maintenance or parts replacement of motor-driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability increases. (2) Core damage probability is insensitive to changes in surveillance test frequency for motor-operated valves and the turbine-driven auxiliary feedwater pump, since the change in core damage probability is small even when their failure probabilities change by about an order of magnitude. (3) The change in core damage probability is small when Japanese failure probability data are applied to the emergency diesel generator, even if the failure probability changes by an order of magnitude from the base value. On the other hand, when American failure probability data are applied, the increase in core damage probability is large when the failure probability increases. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency, etc. (author)

  12. Integrating resource selection information with spatial capture--recapture

    Science.gov (United States)

    Royle, J. Andrew; Chandler, Richard B.; Sun, Catherine C.; Fuller, Angela K.

    2013-01-01

    1. Understanding space usage and resource selection is a primary focus of many studies of animal populations. Usually, such studies are based on location data obtained from telemetry, and resource selection functions (RSFs) are used for inference. Another important focus of wildlife research is estimation and modeling population size and density. Recently developed spatial capture–recapture (SCR) models accomplish this objective using individual encounter history data with auxiliary spatial information on location of capture. SCR models include encounter probability functions that are intuitively related to RSFs, but to date, no one has extended SCR models to allow for explicit inference about space usage and resource selection.
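
    SCR encounter probability functions are typically distance-decaying, which is what makes them intuitively related to RSFs. The sketch below evaluates the common half-normal form p = p0 * exp(-d^2 / (2 sigma^2)) with placeholder parameter values; it illustrates the functional form only, not the authors' extended model.

        import numpy as np

        def encounter_prob(trap_xy, center_xy, p0=0.5, sigma=1.0):
            """Half-normal encounter-probability function typical of SCR models:
            p(trap, s) = p0 * exp(-d^2 / (2 sigma^2)), where s is an animal's
            activity center. Parameter values are arbitrary placeholders."""
            d2 = np.sum((np.asarray(trap_xy) - np.asarray(center_xy)) ** 2)
            return p0 * np.exp(-d2 / (2.0 * sigma ** 2))

        print(encounter_prob((0.0, 0.0), (0.0, 0.0)))   # at the center: p0
        print(encounter_prob((2.0, 0.0), (0.0, 0.0)))   # decays with distance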

  13. Assessing the clinical probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Miniati, M.; Pistolesi, M.

    2001-01-01

    Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to predict parameters associated with PE. A score ≤ 4 identified patients with low probability of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3

  14. Neutron capture therapy for melanoma

    Energy Technology Data Exchange (ETDEWEB)

    Coderre, J.A.; Glass, J.D.; Micca, P.; Fairchild, R.G.

    1988-01-01

    The development of boron-containing compounds which localize selectively in tumor may require a tumor-by-tumor type of approach that exploits any metabolic pathways unique to the particular type of tumor. Melanin-producing melanomas actively transport and metabolize aromatic amino acids for use as precursors in the synthesis of the pigment melanin. It has been shown that the boron-containing amino acid analog p-borono-phenylalanine (BPA) is selectively accumulated in melanoma tissue, producing boron concentrations in tumor that are within the range estimated to be necessary for successful boron neutron capture therapy (BNCT). We report here the results of therapy experiments carried out at the Brookhaven Medical Research Reactor (BMRR). 21 refs., 5 figs., 3 tabs.

  15. Brownian motion using video capture

    International Nuclear Information System (INIS)

    Salmon, Reese; Robbins, Candace; Forinash, Kyle

    2002-01-01

    Although other researchers had previously observed the random motion of pollen grains suspended in water through a microscope, Robert Brown's name is associated with this behaviour based on observations he made in 1828. It was not until Einstein's work in the early 1900s however, that the origin of this irregular motion was established to be the result of collisions with molecules which were so small as to be invisible in a light microscope (Einstein A 1965 Investigations on the Theory of the Brownian Movement ed R Furth (New York: Dover) (transl. Cowper A D) (5 papers)). Jean Perrin in 1908 (Perrin J 1923 Atoms (New York: Van Nostrand-Reinhold) (transl. Hammick D)) was able, through a series of painstaking experiments, to establish the validity of Einstein's equation. We describe here the details of a junior level undergraduate physics laboratory experiment where students used a microscope, a video camera and video capture software to verify Einstein's famous calculation of 1905. (author)
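
    Students can recover the diffusion coefficient from captured video tracks via Einstein's relation, which for two-dimensional motion gives a mean squared displacement of 4Dt. The sketch below applies that estimator to a synthetic random walk standing in for tracked particle positions.

        import numpy as np

        def estimate_D(xy, dt):
            """Estimate the diffusion coefficient from a 2-D track using
            Einstein's relation <|r(t+dt) - r(t)|^2> = 4 D dt."""
            steps = np.diff(np.asarray(xy), axis=0)
            msd = np.mean(np.sum(steps ** 2, axis=1))
            return msd / (4.0 * dt)

        # Synthetic stand-in for a tracked pollen grain (D_true = 0.5 units^2/s):
        rng = np.random.default_rng(1)
        dt, D_true, n = 0.1, 0.5, 5000
        track = np.cumsum(rng.normal(0, np.sqrt(2 * D_true * dt), size=(n, 2)), axis=0)
        print(estimate_D(track, dt))   # ~ 0.5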

  16. Cage-based performance capture

    CERN Document Server

    Savoye, Yann

    2014-01-01

    Nowadays, highly-detailed animations of live-actor performances are increasingly easier to acquire, and 3D Video has received considerable attention in visual media production. In this book, we address the problem of extracting or acquiring and then reusing non-rigid parametrization for video-based animations. At first sight, a crucial challenge is to reproduce plausible boneless deformations while preserving global and local captured properties of dynamic surfaces with a limited number of controllable, flexible and reusable parameters. To solve this challenge, we directly rely on a skin-detached dimension reduction thanks to the well-known cage-based paradigm. First, we achieve Scalable Inverse Cage-based Modeling by transposing the inverse kinematics paradigm on surfaces. Thus, we introduce a cage inversion process with user-specified screen-space constraints. Secondly, we convert non-rigid animated surfaces into a sequence of optimal cage parameters via Cage-based Animation Conversion. Building upon this re...

  17. Workshop on neutron capture therapy

    Energy Technology Data Exchange (ETDEWEB)

    Fairchild, R.G.; Bond, V.P. (eds.)

    1986-01-01

    Potentially optimal conditions for Neutron Capture Therapy (NCT) may soon be in hand due to the anticipated development of band-pass filtered beams relatively free of fast neutron contaminations, and of broadly applicable biomolecules for boron transport such as porphyrins and monoclonal antibodies. Consequently, a number of groups in the US are now devoting their efforts to exploring NCT for clinical application. The purpose of this Workshop was to bring these groups together to exchange views on significant problems of mutual interest, and to assure a unified and effective approach to the solutions. Several areas of preclinical investigation were deemed to be necessary before it would be possible to initiate clinical studies. As neither the monomer nor the dimer of sulfhydryl boron hydride is unequivocally preferable at this time, studies on both compounds should be continued until one is proven superior.

  18. Neutron capture therapy for melanoma

    International Nuclear Information System (INIS)

    Coderre, J.A.; Glass, J.D.; Micca, P.; Fairchild, R.G.

    1988-01-01

    The development of boron-containing compounds which localize selectively in tumor may require a tumor-by-tumor type of approach that exploits any metabolic pathways unique to the particular type of tumor. Melanin-producing melanomas actively transport and metabolize aromatic amino acids for use as precursors in the synthesis of the pigment melanin. It has been shown that the boron-containing amino acid analog p-borono-phenylalanine (BPA) is selectively accumulated in melanoma tissue, producing boron concentrations in tumor that are within the range estimated to be necessary for successful boron neutron capture therapy (BNCT). We report here the results of therapy experiments carried out at the Brookhaven Medical Research Reactor (BMRR). 21 refs., 5 figs., 3 tabs

  19. Muon capture by helium-3

    International Nuclear Information System (INIS)

    Pascual de Sanz, R.

    1966-01-01

    In this paper we study the capture of a negative muon by He 3 in the channel μ - + He 3 → H 3 + ν. Following Primakoff we use the V-A theory of the weak interactions. We also include first-order relativistic terms. To describe the initial and final nuclei we have used the most general wave function allowed by the Pauli exclusion principle, assuming that these nuclei are a mixture of an isospin doublet and quadruplet. For the part of the wave function depending on the internucleonic distances, we have taken four different functions without hard core: a Gaussian and three of Irving type. We present in several tables the results obtained by varying g p /g v and g A /g v as well as the amplitudes of the fourteen terms forming the nuclear wave function. (Author) 35 refs

  20. Subsurface capture of carbon dioxide

    Science.gov (United States)

    Blount, Gerald; Siddal, Alvin A.; Falta, Ronald W.

    2014-07-22

    A process and apparatus for separating CO2 gas from an industrial off-gas source in which the CO2-containing off-gas is introduced deep within an injection well. The CO2 gases are dissolved in the liquid within the injection well while non-CO2 gases, typically being insoluble in water or brine, are returned to the surface. Once the CO2-saturated liquid is present within the injection well, the injection well may be used for long-term geologic storage of CO2, or the CO2-saturated liquid can be returned to the surface for capturing a purified CO2 gas.

  1. Opportunity Captures 'Lion King' Panorama

    Science.gov (United States)

    2004-01-01

    This approximate true-color panorama, dubbed 'Lion King,' shows 'Eagle Crater' and the surrounding plains of Meridiani Planum. It was obtained by the Mars Exploration Rover Opportunity's panoramic camera on sols 58 and 60 using infrared (750-nanometer), green (530-nanometer) and blue (430-nanometer) filters. This is the largest panorama yet obtained by either rover. It was taken in eight segments using six filters per segment, for a total of 558 images and more than 75 megabytes of data. Additional lower elevation tiers were added to ensure that the entire crater was covered in the mosaic. This panorama depicts a story of exploration including the rover's lander, a thorough examination of the outcrop, a study of the soils at the near side of the lander, a successful exit from Eagle Crater and finally the rover's next destination, the large crater dubbed 'Endurance'.

  2. Workshop on neutron capture therapy

    International Nuclear Information System (INIS)

    Fairchild, R.G.; Bond, V.P.

    1986-01-01

    Potentially optimal conditions for Neutron Capture Therapy (NCT) may soon be in hand due to the anticipated development of band-pass filtered beams relatively free of fast neutron contaminations, and of broadly applicable biomolecules for boron transport such as porphyrins and monoclonal antibodies. Consequently, a number of groups in the US are now devoting their efforts to exploring NCT for clinical application. The purpose of this Workshop was to bring these groups together to exchange views on significant problems of mutual interest, and to assure a unified and effective approach to the solutions. Several areas of preclinical investigation were deemed to be necessary before it would be possible to initiate clinical studies. As neither the monomer nor the dimer of sulfhydryl boron hydride is unequivocally preferable at this time, studies on both compounds should be continued until one is proven superior

  3. State-selective electron capture

    International Nuclear Information System (INIS)

    Dunford, R.W.; Liu, C.J.; Berry, H.G.; Pardo, R.C.; Raphaelian, M.L.A.

    1988-01-01

    We report results from a new atomic physics program using the Argonne PII ECR ion source which is being built as part of the upgrade of the Argonne Tandem-Linear Accelerator (ATLAS). Our initial experiments have been aimed at studying state-selective electron capture in ion-atom collisions using the technique of Photon Emission Spectroscopy. We are extending existing low-energy cross-section measurements for O 6+ and O 7+ on He and H 2 targets into the energy range from 1-105 keV/amu. We also present UV spectra obtained in collisions of O 6+ , O 5+ and N 5+ on a sodium target. 4 refs., 2 figs., 1 tab

  4. Radiative capture versus Coulomb dissociation

    International Nuclear Information System (INIS)

    Esbensen, H.; Physics

    2006-01-01

    Measurements of the Coulomb dissociation of 8 B have been used to infer the rate of the inverse radiative proton capture on 7 Be. The analysis is usually based on the assumptions that the two processes are related by detailed balance and described by E1 transitions. However, there are corrections to this relation. The Coulomb form factors for the two processes, for example, are not identical. There are also E2 transitions and higher-order effects in the Coulomb dissociation, and the nuclear induced breakup cannot always be ignored. While adding first-order E2 transitions enhances the decay energy spectrum, the other mechanisms cause a suppression at low relative energies. The net result may accidentally be close to the conventional first-order E1 calculation, but there are differences which cannot be ignored if accuracies of 10% or better are needed

  5. Radiative Capture versus Coulomb Dissociation

    International Nuclear Information System (INIS)

    Esbensen, Henning

    2006-01-01

    Measurements of the Coulomb dissociation of 8B have been used to infer the rate of the inverse radiative proton capture on 7Be. The analysis is usually based on the assumptions that the two processes are related by detailed balance and described by E1 transitions. However, there are corrections to this relation. The Coulomb form factors for the two processes, for example, are not identical. There are also E2 transitions and higher-order effects in the Coulomb dissociation, and the nuclear induced breakup cannot always be ignored. While adding first-order E2 transitions enhances the decay energy spectrum, the other mechanisms cause a suppression at low relative energies. The net result may accidentally be close to the conventional first-order E1 calculation, but there are differences which cannot be ignored if accuracies of 10% or better are needed

  6. Collisional Cascades Following Triton's Capture

    Science.gov (United States)

    Cuk, Matija; Hamilton, Douglas P.; Stewart-Mukhopadhyay, Sarah T.

    2017-10-01

    Neptune's moon Triton is widely thought to have been captured from heliocentric orbit, most likely through binary dissociation (Agnor and Hamilton, 2006). Triton's original eccentric orbit must have been subsequently circularized by satellite tides (Goldreich et al. 1989). Cuk and Gladman (2005) found that Kozai oscillations make early tidal evolution inefficient, and have proposed that collisions between Triton and debris from pre-existing satellites were the dominant mechanism of shrinking Triton's large post-capture orbit. However, Cuk and Hamilton (DPS 2016), using numerical simulations and results of Stewart and Leinhardt (2012), have found that collisions between regular satellites are unlikely to be destructive, while collisions between prograde moons and Triton are certainly erosive if not catastrophic. An obvious outcome would be pre-existing moon material gradually grinding down Triton and making it reaccrete in the local Laplace plane, in conflict with Triton's large current inclination. We propose that the crucial ingredient for understanding the early evolution of the Neptunian system is the collisions between the moons and the prograde and retrograde debris originating from the pre-existing moons and Triton. In particular, we expect early erosive impact(s) on Triton to generate debris that will, in subsequent collisions, disrupt the regular satellites. If the retrograde material were to dominate at some planetocentric distances, the end result may be a large cloud or disk of retrograde debris that would be accreted by Triton, shrinking Triton's orbit. Some of the prograde debris could survive in a compact disk interior to Triton's pericenter, eventually forming the inner moons of Neptune. We will present results of numerical modeling of these complex dynamical processes at the meeting.

  7. Radiative muon capture on hydrogen

    International Nuclear Information System (INIS)

    Wright, D.H.; Ahmad, S.; Gorringe, T.P.; Hasinoff, M.D.; Larabee, A.J.; Waltham, C.E.; Armstrong, D.S.; Blecher, M.; Serna-Angel, A.; Azuelos, G.; Macdonald, J.A.; Poutissou, J.M.; Bertl, W.; Chen, C.Q.; Ding, Z.H.; Zhang, N.S.; Henderson, R.; McDonald, S.; Taylor, G.N.; Robertson, B.C.

    1989-01-01

    In the Standard Model, the weak interaction is purely V-A in character. However, in semileptonic reactions the strong force induces additional couplings. One of these, the induced pseudoscalar coupling g p , is still very poorly determined experimentally. Using PCAC and the Goldberger-Treiman relation, one can obtain the estimate g p /g a = 6.8 for the nucleon. At present, the world average of 5 measurements of the rate of ordinary muon capture (each with an error in excess of 40%) yields g p /g a = 6.9 ± 1.5. Radiative Muon Capture (RMC) is considerably more sensitive to the pseudoscalar coupling. Due to the extremely small branching ratio (∼ 6 x 10 -8 ), the elementary reaction μ - p → ν n γ has never been measured. Effort to date has concentrated on nuclear RMC, where the branching ratio is much larger, but the interpretation of these results is hindered by nuclear structure uncertainties. A measurement is being carried out at TRIUMF to determine the rate of RMC on hydrogen to a precision of 8%, leading to a determination of g p with an error of 10%. The detection system is based on a large-volume drift chamber acting as a pair spectrometer. The drift chamber covers a solid angle of about 2π. At a magnetic field of 2.4 kG the acceptance for 70 MeV photons is about 0.9% using a 1.2 mm thick Pb photon converter. The expected photon energy resolution is about 10% FWHM. A detailed discussion of the systematic errors expected in the experiment and the preliminary results on the performance of the detector will be presented

  8. Formation of S-type planets in close binaries: scattering induced tidal capture of circumbinary planets

    Science.gov (United States)

    Gong, Yan-Xiang; Ji, Jianghui

    2018-05-01

    Although several S-type and P-type planets in binary systems have been discovered in recent years, S-type planets have not yet been found in close binaries with orbital separations of no more than 5 au. Recent studies suggest that S-type planets in close binaries may be detected through high-accuracy observations. However, current planet formation theories imply that it is difficult for S-type planets in close binary systems to form in situ. In this work, we perform extensive numerical simulations to explore scenarios of planet-planet scattering among circumbinary planets and subsequent tidal capture in various binary configurations, to examine whether this mechanism can play a part in producing such planets. Our results show that this mechanism is robust. The maximum capture probability is ~10%, which is comparable to the tidal capture probability of hot Jupiters in single star systems. The capture probability is related to the binary configuration: a smaller eccentricity or a lower mass ratio of the binary leads to a larger probability of capture, and vice versa. Furthermore, we find that S-type planets with retrograde orbits can be naturally produced via the capture process. These planets on retrograde orbits can help us distinguish between in situ formation and post-capture origin for S-type planets in close binary systems. Forthcoming missions such as PLATO will provide the opportunity to detect such planets. Our work provides several suggestions for selecting target binaries in the search for S-type planets in the near future.

  9. The calculation of resonance capture in granular fuels

    Energy Technology Data Exchange (ETDEWEB)

    Askew, J R; Harris, D W.G.; Hutton, J L

    1971-02-15

    The methods used in the UK for the calculation of resonance capture in granular HTR fuels follow the long-established path of determining a 'geometric equivalence' which equates the resonance shielding to that in a homogeneous mixture of the resonance absorber in hydrogen. Simple collision probability arguments, usually for the black limit, were used for AGR and SGHW systems. For granular fuel a 'grey' equivalence, convenient for numerical use, has been adopted, and the geometric solution is performed in two ways: by a synthetic collision probability model, which is rapid and appropriate for design work, and by a Monte Carlo model, which allows details of the grain lattice structure to be studied. The two solutions are in good agreement with each other and compare well with measured relative conversion ratios in the NESTOR stack experiments.
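
    The 'geometric equivalence' invoked here is conventionally built from the fuel-lump escape probability; a minimal sketch in the Wigner rational approximation (notation mine, not the report's): for a lump of volume V, surface area S and total cross section Σ_t,

        \bar{\ell} = \frac{4V}{S}, \qquad P_{\mathrm{esc}}(\Sigma_t) \approx \frac{1}{1 + \Sigma_t\,\bar{\ell}},

    which maps the heterogeneous lump onto a homogeneous mixture by adding a background ('escape') cross section of roughly σ_e = 1/(N·\bar{\ell}) per absorber nucleus, N being the absorber number density.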

  10. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    The influence of “Grundbegriffe” by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory “calling for” an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables – dual maps to random variables) have very different “mathematical nature”. Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events, elementary category theory, and covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing “fractions” of classical random events, and we upgrade the notions of probability measure and random variable.

  11. Failure probability analysis of optical grid

    Science.gov (United States)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, an integrated computing environment based on optical networks, is expected to be an efficient infrastructure for supporting advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. As optical-network-based distributed computing systems become widely applied to data processing, the application failure probability has become an important indicator of application quality and an important aspect that operators consider. This paper presents a task-based method for analyzing the application failure probability in an optical grid. The failure probability of an entire application can then be quantified, and the performance of different backup strategies in reducing the application failure probability can be compared, so that the different requirements of different clients can be satisfied. When an application described by a DAG (directed acyclic graph) is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm can guarantee the required failure probability and improve network resource utilization, realizing a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in an optical grid.
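
    As a toy illustration of the task-based idea (this is not the paper's MDSA algorithm, which the record does not specify; names, numbers and the independence assumption are mine): if each task of a DAG-structured application fails independently with probability p_i, and a replicated task fails only when all of its copies fail, the application failure probability composes as follows.

        # Sketch: task-based application failure probability,
        # assuming independent task failures (a simplifying assumption).

        def app_failure_probability(task_fail_probs, replicas=None):
            """Probability that at least one task (with all its replicas) fails.

            task_fail_probs: per-task failure probabilities p_i
            replicas: optional per-task replica counts (1 = no backup)
            """
            replicas = replicas or [1] * len(task_fail_probs)
            p_success = 1.0
            for p, r in zip(task_fail_probs, replicas):
                p_success *= 1.0 - p ** r   # all r independent copies must fail
            return 1.0 - p_success

        # Hypothetical 4-task application, without and with a backup of task 2:
        tasks = [0.01, 0.05, 0.02, 0.01]
        print(app_failure_probability(tasks))                # ~0.088
        print(app_failure_probability(tasks, [1, 2, 1, 1]))  # ~0.042

    Comparing such numbers across backup strategies is the kind of trade-off (failure probability versus resource usage) that a differentiated-services scheduler would negotiate.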

  12. Uncertainty about probability: a decision analysis perspective

    International Nuclear Information System (INIS)

    Howard, R.A.

    1988-01-01

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, one finds further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to usual thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group.
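
    The coin-tossing claim admits a one-line justification worth recording (a sketch in standard exchangeability notation, not taken from the paper): if θ is the definitive long-run probability of heads, then

        P(H_2 \mid H_1) = \frac{E[\theta^2]}{E[\theta]} = E[\theta] + \frac{\mathrm{Var}(\theta)}{E[\theta]} \;\geq\; E[\theta] = P(H_1),

    with equality only when the distribution of θ is a point mass, i.e. only when one is absolutely sure of the coin's bias.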

  13. Small Particles Intact Capture Experiment (SPICE)

    Science.gov (United States)

    Nishioka, Ken-Ji; Carle, G. C.; Bunch, T. E.; Mendez, David J.; Ryder, J. T.

    1994-01-01

    The Small Particles Intact Capture Experiment (SPICE) will develop technologies and engineering techniques necessary to capture nearly intact, uncontaminated cosmic and interplanetary dust particles (IDP's). Successful capture of such particles will benefit the exobiology and planetary science communities by providing particulate samples that may have survived unaltered since the formation of the solar system. Characterization of these particles may contribute fundamental data to our knowledge of how these particles could have formed into our planet Earth and, perhaps, contributed to the beginnings of life. The term 'uncontaminated' means that captured cosmic and IDP particles are free of organic contamination from the capture process and the term 'nearly intact capture' means that their chemical and elemental components are not materially altered during capture. The key to capturing cosmic and IDP particles that are organic-contamination free and nearly intact is the capture medium. Initial screening of capture media included organic foams, multiple thin foil layers, and aerogel (a silica gel); but, with the exception of aerogel, the requirements of no contamination or nearly intact capture were not met. To ensure no contamination of particles in the capture process, high-purity aerogel was chosen. High-purity aerogel results in high clarity (visual clearness), a useful quality in detection and recovery of embedded captured particles from the aerogel. P. Tsou at the Jet Propulsion Laboratory (JPL) originally described the use of aerogel for this purpose and reported laboratory test results. He has flown aerogel as a 'GAS-can Lid' payload on STS-47 and is evaluating the results. The Timeband Capture Cell Experiment (TICCE), a Eureca 1 experiment, is also flying aerogel and is scheduled for recovery in late April.

  14. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." – The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h…

  15. Dependency models and probability of joint events

    International Nuclear Information System (INIS)

    Oerjasaeter, O.

    1982-08-01

    Probabilistic dependencies between components/systems are discussed with reference to a broad classification of potential failure mechanisms. Further, a generalized time-dependency model, based on conditional probabilities for estimation of the probability of joint events and event sequences is described. The applicability of this model is clarified/demonstrated by various examples. It is concluded that the described model of dependency is a useful tool for solving a variety of practical problems concerning the probability of joint events and event sequences where common cause and time-dependent failure mechanisms are involved. (Auth.)

  16. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

    ""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  17. Probabilities on Streams and Reflexive Games

    Directory of Open Access Journals (Sweden)

    Andrew Schumann

    2014-01-01

    Probability measures on streams (e.g. on hypernumbers and p-adic numbers) have been defined. It was shown that these probabilities can be used for simulations of reflexive games. In particular, it can be proved that Aumann's agreement theorem does not hold for these probabilities. Instead of this theorem, there is a statement that is called the reflexion disagreement theorem. Based on this theorem, probabilistic and knowledge conditions can be defined for reflexive games at various reflexion levels, up to the infinite level. (original abstract)

  18. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  19. Computation of the Complex Probability Function

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-22

    The complex probability function is important in many areas of physics, and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the nth-degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements on the Gauss-Hermite quadrature for the complex probability function.
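
    A minimal sketch of the quadrature idea (my own illustration, not the report's code): for Im z > 0 the complex probability (Faddeeva) function has the integral representation w(z) = (i/π) ∫ e^(−t²)/(z − t) dt, so an n-point Gauss-Hermite rule with nodes x_k and weights w_k gives w(z) ≈ (i/π) Σ_k w_k/(z − x_k); the scipy call is included only as a reference value.

        import numpy as np
        from numpy.polynomial.hermite import hermgauss
        from scipy.special import wofz  # reference Faddeeva implementation

        def w_gauss_hermite(z, n=64):
            """Gauss-Hermite approximation of the complex probability function;
            valid for Im(z) > 0 and increasingly inaccurate near the real axis."""
            x, wts = hermgauss(n)  # nodes/weights for the weight exp(-t^2)
            return (1j / np.pi) * np.sum(wts / (z - x))

        z = 1.5 + 1.0j
        print(w_gauss_hermite(z))  # quadrature approximation
        print(wofz(z))             # reference value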

  20. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions...... and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate...... multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions)....

  1. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.

  2. Modeling experiments using quantum and Kolmogorov probability

    International Nuclear Information System (INIS)

    Hess, Karl

    2008-01-01

    Criteria are presented that permit a straightforward partition of experiments into sets that can be modeled using both quantum probability and the classical probability framework of Kolmogorov. These new criteria concentrate on the operational aspects of the experiments and lead beyond the commonly appreciated partition by relating experiments to commuting and non-commuting quantum operators as well as non-entangled and entangled wavefunctions. In other words the space of experiments that can be understood using classical probability is larger than usually assumed. This knowledge provides advantages for areas such as nanoscience and engineering or quantum computation.

  3. Estimating temporary emigration and breeding proportions using capture-recapture data with Pollock's robust design

    Science.gov (United States)

    Kendall, W.L.; Nichols, J.D.; Hines, J.E.

    1997-01-01

    Statistical inference for capture-recapture studies of open animal populations typically relies on the assumption that all emigration from the studied population is permanent. However, there are many instances in which this assumption is unlikely to be met. We define two general models for the process of temporary emigration, completely random and Markovian. We then consider effects of these two types of temporary emigration on Jolly-Seber (Seber 1982) estimators and on estimators arising from the full-likelihood approach of Kendall et al. (1995) to robust design data. Capture-recapture data arising from Pollock's (1982) robust design provide the basis for obtaining unbiased estimates of demographic parameters in the presence of temporary emigration and for estimating the probability of temporary emigration. We present a likelihood-based approach to dealing with temporary emigration that permits estimation under different models of temporary emigration and yields tests for completely random and Markovian emigration. In addition, we use the relationship between capture probability estimates based on closed and open models under completely random temporary emigration to derive three ad hoc estimators for the probability of temporary emigration, two of which should be especially useful in situations where capture probabilities are heterogeneous among individual animals. Ad hoc and full-likelihood estimators are illustrated for small mammal capture-recapture data sets. We believe that these models and estimators will be useful for testing hypotheses about the process of temporary emigration, for estimating demographic parameters in the presence of temporary emigration, and for estimating probabilities of temporary emigration. These latter estimates are frequently of ecological interest as indicators of animal movement and, in some sampling situations, as direct estimates of breeding probabilities and proportions.
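
    One of the ad hoc estimators alluded to above can be sketched from the stated closed/open-model relationship: under completely random temporary emigration the open-model (Jolly-Seber) capture probability conflates availability with detection, p_open = (1 − γ)·p_closed, which suggests γ̂ = 1 − p̂_open/p̂_closed. The sketch below is illustrative only (invented numbers, and only one of the paper's three estimators):

        def temporary_emigration_estimate(p_open, p_closed):
            """Ad hoc estimate of the temporary-emigration probability gamma,
            assuming completely random temporary emigration and
            p_open = (1 - gamma) * p_closed."""
            if not 0.0 < p_open <= p_closed <= 1.0:
                raise ValueError("need 0 < p_open <= p_closed <= 1")
            return 1.0 - p_open / p_closed

        # Hypothetical primary period: closed-model detection estimate 0.40,
        # Jolly-Seber estimate 0.28 -> gamma-hat = 0.30
        print(temporary_emigration_estimate(0.28, 0.40))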

  4. The probability outcome correspondence principle: a dispositional view of the interpretation of probability statements

    NARCIS (Netherlands)

    Keren, G.; Teigen, K.H.

    2001-01-01

    This article presents a framework for lay people's internal representations of probabilities, which supposedly reflect the strength of underlying dispositions, or propensities, associated with the predicted event. From this framework, we derive the probability-outcome correspondence principle, which…

  5. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic as a model within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
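
    A minimal sketch of the kind of model described (the feature names are invented for illustration; the study's actual predictors are not listed in the record):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Hypothetical training data: one row per building,
        # columns = [floor_area_m2, building_age_years, has_gas_heating]
        X = np.array([[120, 35, 1], [450, 10, 0], [80, 60, 1],
                      [300, 25, 0], [200, 45, 1], [150, 5, 0]])
        y = np.array([1, 0, 1, 0, 1, 0])  # 1 = fire incident recorded

        model = LogisticRegression().fit(X, y)

        # Predicted fire probability for a new building; a probability map
        # would evaluate this for every building and plot it spatially.
        print(model.predict_proba([[250, 40, 1]])[0, 1])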

  6. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height...... associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination...... of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions....
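
    For context, the two standard ingredients behind such a calculation can be written compactly (textbook forms, not quoted from the paper): the encounter probability of exceeding the return-period-T_R significant wave height at least once during a structure lifetime of T_L years, and the Rayleigh law for an individual wave height H given the significant wave height H_s:

        p_e = 1 - \left(1 - \frac{1}{T_R}\right)^{T_L}, \qquad P(H > h \mid H_s) = \exp\!\left[-2\left(\frac{h}{H_s}\right)^{2}\right].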

  7. Predicting binary choices from probability phrase meanings.

    Science.gov (United States)

    Wallsten, Thomas S; Jang, Yoonhee

    2008-08-01

    The issues of how individuals decide which of two events is more likely and of how they understand probability phrases both involve judging relative likelihoods. In this study, we investigated whether derived scales representing probability phrase meanings could be used within a choice model to predict independently observed binary choices. If they can, this simultaneously provides support for our model and suggests that the phrase meanings are measured meaningfully. The model assumes that, when deciding which of two events is more likely, judges take a single sample from memory regarding each event and respond accordingly. The model predicts choice probabilities by using the scaled meanings of individually selected probability phrases as proxies for confidence distributions associated with sampling from memory. Predictions are sustained for 34 of 41 participants but, nevertheless, are biased slightly low. Sequential sampling models improve the fit. The results have both theoretical and applied implications.

  8. Certainties and probabilities of the IPCC

    International Nuclear Information System (INIS)

    2004-01-01

    Based on an analysis of information about the climate evolution, simulations of a global warming and the snow coverage monitoring of Meteo-France, the IPCC presented its certainties and probabilities concerning the greenhouse effect. (A.L.B.)

  9. The probability factor in establishing causation

    International Nuclear Information System (INIS)

    Hebert, J.

    1988-01-01

    This paper discusses the possibilities and limitations of methods using the probability factor in establishing the causal link between bodily injury, whether immediate or delayed, and the nuclear incident presumed to have caused it. (NEA) [fr]

  10. Bayesian optimization for computationally extensive probability distributions.

    Science.gov (United States)

    Tamura, Ryo; Hukushima, Koji

    2018-01-01

    An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use extreme values of acquisition functions by Gaussian processes for the next training phase, which should be located near a local maximum or a global maximum of the probability distribution. Our Bayesian optimization technique is applied to the posterior distribution in the effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distributions is fixed to be small, the Bayesian optimization provides a better maximizer of the posterior distributions in comparison to those by the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, the Bayesian optimization improves the results efficiently by combining the steepest descent method and thus it is a powerful tool to search for a better maximizer of computationally extensive probability distributions.
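
    A minimal sketch of the generic loop (a plain GP surrogate with an upper-confidence-bound acquisition; the authors' specific acquisition strategy and posterior are not given in the record, so the target density below is a stand-in):

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor

        def target_log_density(x):
            # Stand-in for an expensive-to-evaluate log probability density.
            return -0.5 * (x - 2.0) ** 2 + 0.3 * np.sin(5.0 * x)

        rng = np.random.default_rng(0)
        X = list(rng.uniform(-5, 5, 3))  # small initial design
        y = [target_log_density(x) for x in X]
        grid = np.linspace(-5, 5, 500)

        for _ in range(20):
            gp = GaussianProcessRegressor(alpha=1e-6).fit(np.array(X)[:, None], y)
            mu, sd = gp.predict(grid[:, None], return_std=True)
            x_next = grid[np.argmax(mu + 2.0 * sd)]  # UCB acquisition
            X.append(float(x_next))
            y.append(target_log_density(x_next))

        print("best maximizer found:", X[int(np.argmax(y))])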

  11. Characteristic length of the knotting probability revisited

    International Nuclear Information System (INIS)

    Uehara, Erica; Deguchi, Tetsuo

    2015-01-01

    We present a self-avoiding polygon (SAP) model for circular DNA in which the radius of impermeable cylindrical segments corresponds to the screening length of double-stranded DNA surrounded by counter ions. For the model we evaluate the probability for a generated SAP with N segments having a given knot K through simulation. We call it the knotting probability of a knot K with N segments for the SAP model. We show that when N is large the most significant factor in the knotting probability is given by the exponentially decaying part exp(−N/N_K), where the estimates of the parameter N_K are consistent with the same value for all the different knots we investigated. We thus call it the characteristic length of the knotting probability. We give formulae expressing the characteristic length as a function of the cylindrical radius r_ex, i.e. the screening length of double-stranded DNA. (paper)
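
    The asymptotic form implied here is commonly parametrized in the knotting-probability literature as (the abstract itself asserts only the exponential factor; the prefactor, with knot-dependent constants C_K and m(K), is the conventional completion):

        P_K(N) \approx C_K\, N^{m(K)}\, e^{-N/N_K},

    so that on a semi-logarithmic plot all knots share the same large-N slope −1/N_K.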

  12. Probability of Survival Decision Aid (PSDA)

    National Research Council Canada - National Science Library

    Xu, Xiaojiang; Amin, Mitesh; Santee, William R

    2008-01-01

    A Probability of Survival Decision Aid (PSDA) is developed to predict survival time for hypothermia and dehydration during prolonged exposure at sea in both air and water for a wide range of environmental conditions...

  13. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698). * Incorporates more than 1,000 engaging problems with answers. * Includes more than 300 solved examples. * Uses varied problem-solving methods.

  14. Determining probabilities of geologic events and processes

    International Nuclear Information System (INIS)

    Hunter, R.L.; Mann, C.J.; Cranwell, R.M.

    1985-01-01

    The Environmental Protection Agency has recently published a probabilistic standard for releases of high-level radioactive waste from a mined geologic repository. The standard sets limits for contaminant releases with more than one chance in 100 of occurring within 10,000 years, and less strict limits for releases of lower probability. The standard offers no methods for determining probabilities of geologic events and processes, and no consensus exists in the waste-management community on how to do this. Sandia National Laboratories is developing a general method for determining probabilities of a given set of geologic events and processes. In addition, we will develop a repeatable method for dealing with events and processes whose probability cannot be determined. 22 refs., 4 figs

  15. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how...... the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work...... is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions....

  16. Probability of spent fuel transportation accidents

    International Nuclear Information System (INIS)

    McClure, J.D.

    1981-07-01

    The transported volume of spent fuel, incident/accident experience and accident environment probabilities were reviewed in order to provide an estimate of spent fuel accident probabilities. In particular, the accident review assessed the accident experience for large casks of the type that could transport spent (irradiated) nuclear fuel. This review determined that since 1971, the beginning of official US Department of Transportation record keeping for accidents/incidents, there has been one spent fuel transportation accident. This information, coupled with estimated annual shipping volumes for spent fuel, indicated an estimated probability of a spent fuel transport accident of 5 × 10⁻⁷ spent fuel accidents per mile. This is consistent with ordinary truck accident rates. A comparison of accident environments and regulatory test environments suggests that the probability of truck accidents exceeding the regulatory test for impact is approximately 10⁻⁹ per mile.

  17. Sampling, Probability Models and Statistical Reasoning – Statistical Inference

    Indian Academy of Sciences (India)

    Sampling, Probability Models and Statistical Reasoning – Statistical Inference. Mohan Delampady and V. R. Padmawar. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49–58.

  18. Geometry of q-Exponential Family of Probability Distributions

    Directory of Open Access Journals (Sweden)

    Shun-ichi Amari

    2011-06-01

    The Gibbs distribution of statistical physics is an exponential family of probability distributions, which has a mathematical basis of duality in the form of the Legendre transformation. Recent studies of complex systems have found many distributions obeying the power law rather than the standard Gibbs-type distributions. The Tsallis q-entropy is a typical example capturing such phenomena. We treat the q-Gibbs distribution or the q-exponential family by generalizing the exponential function to the q-family of power functions, which is useful for studying various complex or non-standard physical phenomena. We give a new mathematical structure to the q-exponential family different from those previously given. It has a dually flat geometrical structure derived from the Legendre transformation, and conformal geometry is useful for understanding it. The q-version of the maximum entropy theorem is naturally induced from the q-Pythagorean theorem. We also show that the maximizer of the q-escort distribution is a Bayesian MAP (Maximum A Posteriori Probability) estimator.
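
    The generalization referred to replaces the exponential by the Tsallis q-exponential (standard definition; it recovers exp(x) as q → 1), so that, schematically, a q-exponential family has densities of the form p(x; θ) = exp_q(θ·x − ψ_q(θ)), with ψ_q playing the role of the free-energy (log-partition) function:

        \exp_q(x) = \bigl[\,1 + (1-q)\,x\,\bigr]_{+}^{1/(1-q)}.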

  19. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  20. Escape and transmission probabilities in cylindrical geometry

    International Nuclear Information System (INIS)

    Bjerke, M.A.

    1980-01-01

    An improved technique for the generation of escape and transmission probabilities in cylindrical geometry was applied to the existing resonance cross section processing code ROLAIDS. The algorithm of Hwang and Toppel, [ANL-FRA-TM-118] (with modifications) was employed. The probabilities generated were found to be as accurate as those given by the method previously applied in ROLAIDS, while requiring much less computer core storage and CPU time

  1. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

    Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results." Special emphases on simulation and discrete decision theory. Mathematically rich, but self-contained text, at a gentle pace. Review of calculus and linear algebra in an appendix. Mathematical interludes (in each chapter) which examine mathematical techniques in the context of probabilistic or statistical importance. Numerous section exercises, summaries, historical notes, and Further Readings for reinforcem…

  2. Collision Probabilities for Finite Cylinders and Cuboids

    Energy Technology Data Exchange (ETDEWEB)

    Carlvik, I

    1967-05-15

    Analytical formulae have been derived for the collision probabilities of homogeneous finite cylinders and cuboids. The formula for the finite cylinder contains double integrals, and the formula for the cuboid only single integrals. Collision probabilities have been calculated by means of the formulae and compared with values obtained by other authors. It was found that the calculations using the analytical formulae are much quicker and give higher accuracy than Monte Carlo calculations.

  3. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and over-fitting data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.

  4. 8 Good Reasons for Reconsidering Lecture Capturing (of which Cost-Effectiveness is One)

    DEFF Research Database (Denmark)

    Godsk, Mikkel

    …(e.g. expressed by Smithers, 2011; Hallett & Faria, 2006; and others). Some of these presuppositions are probably valid in some contexts; however, in other contexts the technology may be a reasonable and cost-effective alternative. This article presents my initial experiences with implementing lecture captures...

  5. Carbon dioxide capture and storage

    International Nuclear Information System (INIS)

    Durand, B.

    2011-01-01

    The author first highlights the reasons why storing carbon dioxide in geological formations could be a solution in the struggle against global warming and climate change. He comments on various trends and prospective data regarding carbon emissions and fossil energy consumption, as well as various studies performed by international bodies and agencies which show the interest of carbon dioxide storage. He comments on the evolution of the CO₂ contributions of different industrial sectors and activities, notably in France. He presents the different storage modes and methods, which concern different geological formations (saline aquifers, abandoned oil or gas fields, unexploitable coal seams) and different processes (sorption, carbonation). He discusses the risks associated with these storages and the storable quantities, and evokes some existing installations in different countries. He comments on different ways to capture carbon dioxide (in post-combustion, through oxy-combustion, by pre-combustion) and briefly evokes some existing installations. He raises the issue of transport, discusses efficiency and cost aspects, and finally closes with a few words on legal aspects and social acceptability.

  6. Neutrinoless double electron capture decay of 54-Fe

    International Nuclear Information System (INIS)

    Bikit, I; Krmar, M.; Slivka, J.; Anicin, I.; Veskovic, M.; Convie, L.

    1994-01-01

    Double electron capture is the only decay mode of 54-Fe to 54-Cr. The most probable KK capture in the 0ν case would lead to an otherwise unpopulated excited state of 54-Cr with an energy of 668 keV. This process has not yet been investigated, probably because of the lack of theoretical arguments on the nature of the excited state which could favour the decay. On the other hand, if we suppose that the gamma transition from this state to the ground state is allowed, the 668 keV gamma ray would be a definite signature of the process. Bearing in mind the relatively large abundance of 54-Fe in natural iron, the large quantity of iron in some shields of low-level gamma spectroscopy systems, and the low and flat background in the 668 keV spectral region, we easily estimate that the sensitivity for measuring the half-life of this process is quite high. With our equipment, consisting of a 25% efficiency commercial HPGe spectrometer placed in a cube-shaped iron shield with a wall thickness of 25 cm, from the background spectrum measured for only 100 days we calculated the lower limit for the half-life of the 0ν ECEC decay of 54-Fe, at the 68% confidence level, to be T > 3.1 × 10²² years. 2 figs., 5 refs. (author)

  7. Suitability of digital camcorders for virtual reality image data capture

    Science.gov (United States)

    D'Apuzzo, Nicola; Maas, Hans-Gerd

    1998-12-01

    Today's consumer-market digital camcorders offer features which make them appear to be quite interesting devices for virtual reality data capture. The paper compares a digital camcorder with an analogue camcorder and a machine vision type CCD camera and discusses the suitability of these three cameras for virtual reality applications. Besides the discussion of technical features of the cameras, this includes a detailed accuracy test in order to define the range of applications. In combination with the cameras, three different framegrabbers are tested. The geometric accuracy potential of all three cameras turned out to be surprisingly large, and no problems were noticed in the radiometric performance. On the other hand, some disadvantages have to be reported: from the photogrammetrist's point of view, the major disadvantage of most camcorders is the missing possibility to synchronize multiple devices, limiting their suitability for 3-D motion data capture. Moreover, the standard video format is interlaced, which is also undesirable for all applications dealing with moving objects or moving cameras. Further disadvantages are computer interfaces whose functionality is still suboptimal. While custom-made solutions to these problems are probably rather expensive (and will make potential users turn back to machine-vision-like equipment), this functionality could probably be included by the manufacturers at almost zero cost.

  8. A survey of the Carbon Capture

    International Nuclear Information System (INIS)

    Jokrllova, J.; Cik, G.; Takacova, A.; Smolinska, M.

    2014-01-01

    The concentration of carbon dioxide, one of the most important greenhouse gases, continues to rise in the atmosphere. Fossil fuels burned in thermal power plants currently represent 80% of total energy production around the world, and such plants are the largest point sources of CO₂, accounting for approximately 40% of total CO₂ emissions. There are several options for reducing CO₂ emissions: reducing demand, improving production efficiency, and carbon capture and storage (CCS). Capture and storage of carbon dioxide is generally a three-step process: (1) capture and compression of combustion products, (2) transport (mostly by pipeline), and (3) utilization (e.g. production of urea, the beverage industry, production of dry ice, etc.). Technologies for CO₂ capture used in power plants burning fossil fuels can be divided into four groups, each of which requires a completely different approach to CO₂ capture.

  9. Does imminent threat capture and hold attention?

    Science.gov (United States)

    Koster, Ernst H W; Crombez, Geert; Van Damme, Stefaan; Verschuere, Bruno; De Houwer, Jan

    2004-09-01

    According to models of attention and emotion, threat captures and holds attention. In behavioral tasks, robust evidence has been found for attentional holding but not for attentional capture by threat. An important explanation for the absence of attentional capture effects is that the visual stimuli used posed no genuine threat. The present study investigated whether visual cues that signal an aversive white noise can elicit attentional capture and holding effects. Cues presented in an attentional task were simultaneously provided with a threat value through an aversive conditioning procedure. Response latencies showed that threatening cues captured and held attention. These results support recent views on attention to threat, proposing that imminent threat captures attention in everyone. (c) 2004 APA, all rights reserved

  10. Techniques for capturing bighorn sheep lambs

    Science.gov (United States)

    Smith, Joshua B.; Walsh, Daniel P.; Goldstein, Elise J.; Parsons, Zachary D.; Karsch, Rebekah C.; Stiver, Julie R.; Cain, James W.; Raedeke, Kenneth J.; Jenks, Jonathan A.

    2014-01-01

    Low lamb recruitment is a major challenge facing managers attempting to mitigate the decline of bighorn sheep (Ovis canadensis), and investigations into the underlying mechanisms are limited because of the inability to readily capture and monitor bighorn sheep lambs. We evaluated 4 capture techniques for bighorn sheep lambs: 1) hand-capture of lambs from radiocollared adult females fitted with vaginal implant transmitters (VITs), 2) hand-capture of lambs of intensively monitored radiocollared adult females, 3) helicopter net-gunning, and 4) hand-capture of lambs from helicopters. During 2010–2012, we successfully captured 90% of lambs from females that retained VITs to ≤1 day of parturition, although we noted differences in capture rates between an area of high road density in the Black Hills (92–100%) of South Dakota, USA, and less accessible areas of New Mexico (71%), USA. Retention of VITs was 78% with pre-partum expulsion the main cause of failure. We were less likely to capture lambs from females that expelled VITs ≥1 day of parturition (range = 80–83%) or females that were collared without VITs (range = 60–78%). We used helicopter net-gunning at several sites in 1999, 2001–2002, and 2011, and it proved a useful technique; however, at one site, attempts to capture lambs led to lamb predation by golden eagles (Aquila chrysaetos). We attempted helicopter hand-captures at one site in 1999, and they also were successful in certain circumstances and avoided risk of physical trauma from net-gunning; however, application was limited. In areas of low accessibility or if personnel lack the ability to monitor females and/or VITs for extended periods, helicopter capture may provide a viable option for lamb capture.

  11. Measurements of neutron capture cross sections

    International Nuclear Information System (INIS)

    Nakajima, Yutaka

    1984-01-01

    A review of measurement techniques for neutron capture cross sections is presented. The shell transmission method, activation method, and prompt gamma-ray detection method are described using examples of capture cross section measurements. The capture cross sections of ²³⁸U measured by three different prompt gamma-ray detection methods (large liquid scintillator, Moxon-Rae detector, and pulse height weighting method) are compared and their discrepancies are resolved. A method for deriving the covariance is described. (author)

  12. Stream capture to form Red Pass, northern Soda Mountains, California

    Science.gov (United States)

    Miller, David; Mahan, Shannon

    2014-01-01

    Red Pass, a narrow cut through the Soda Mountains important for prehistoric and early historic travelers, is quite young geologically. Its history of downcutting to capture streams west of the Soda Mountains, thereby draining much of eastern Fort Irwin, is told by the contrast in alluvial fan sediments on either side of the pass. Old alluvial fan deposits (>500 ka) were shed westward off an intact ridge of the Soda Mountains but by middle Pleistocene time, intermediate-age alluvial fan deposits (~100 ka) were laid down by streams flowing east through the pass into Silurian Valley. The pass was probably formed by stream capture driven by high levels of groundwater on the west side. This is evidenced by widespread wetland deposits west of the Soda Mountains. Sapping and spring discharge into Silurian Valley over millennia formed a low divide in the mountains that eventually was overtopped and incised by a stream. Lessons include the importance of groundwater levels for stream capture and the relatively youthful appearance of this ~100-200 ka feature in the slowly changing Mojave Desert landscape.

  13. Predicting Flow Breakdown Probability and Duration in Stochastic Network Models: Impact on Travel Time Reliability

    Energy Technology Data Exchange (ETDEWEB)

    Dong, Jing [ORNL; Mahmassani, Hani S. [Northwestern University, Evanston

    2011-01-01

    This paper proposes a methodology to produce random flow breakdown endogenously in a mesoscopic operational model by capturing breakdown probability and duration. It builds on previous research findings that the probability of flow breakdown can be represented as a function of flow rate and that the duration can be characterized by a hazard model. By generating random flow breakdowns at various levels and capturing the traffic characteristics at the onset of the breakdown, the stochastic network simulation model provides a tool for evaluating travel time variability. The proposed model can be used for (1) providing reliability-related traveler information; (2) designing ITS (intelligent transportation systems) strategies to improve reliability; and (3) evaluating reliability-related performance measures of the system.
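
    A toy sketch of the two ingredients (a logistic breakdown-probability curve in the flow rate and a Weibull duration model; the functional forms and all parameter values are invented for illustration, not taken from the paper):

        import numpy as np

        rng = np.random.default_rng(42)

        def breakdown_probability(flow_vph, a=-12.0, b=0.006):
            """Per-step probability of flow breakdown, increasing with flow
            rate (logistic form; a and b are hypothetical parameters)."""
            return 1.0 / (1.0 + np.exp(-(a + b * flow_vph)))

        def sample_breakdown_duration(shape=1.5, scale=12.0):
            """Breakdown duration in minutes from a Weibull hazard model
            (hypothetical shape and scale)."""
            return scale * rng.weibull(shape)

        # One simulation step at a flow of 2100 veh/h:
        if rng.random() < breakdown_probability(2100.0):
            print("breakdown, lasting %.1f min" % sample_breakdown_duration())
        else:
            print("no breakdown this step")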

  14. Attentional capture under high perceptual load.

    Science.gov (United States)

    Cosman, Joshua D; Vecera, Shaun P

    2010-12-01

    Attentional capture by abrupt onsets can be modulated by several factors, including the complexity, or perceptual load, of a scene. We have recently demonstrated that observers are less likely to be captured by abruptly appearing, task-irrelevant stimuli when they perform a search that is high, as opposed to low, in perceptual load (Cosman & Vecera, 2009), consistent with perceptual load theory. However, recent results indicate that onset frequency can influence stimulus-driven capture, with infrequent onsets capturing attention more often than did frequent onsets. Importantly, in our previous task, an abrupt onset was present on every trial, and consequently, attentional capture might have been affected by both onset frequency and perceptual load. In the present experiment, we examined whether onset frequency influences attentional capture under conditions of high perceptual load. When onsets were presented frequently, we replicated our earlier results; attentional capture by onsets was modulated under conditions of high perceptual load. Importantly, however, when onsets were presented infrequently, we observed robust capture effects. These results conflict with a strong form of load theory and, instead, suggest that exposure to the elements of a task (e.g., abrupt onsets) combines with high perceptual load to modulate attentional capture by task-irrelevant information.

  15. What Determines State Capture in Poland?

    Directory of Open Access Journals (Sweden)

    Stanisław Alwasiak

    2013-12-01

    Purpose: This study examines the determinants of ex-ante state capture in Poland. Methodology: In order to establish the determinants of ex-ante state capture, a logistic regression is estimated. Findings: The study shows that in Poland the majority of legal acts were passed with the aim to satisfy the interest of particular groups. Furthermore, the regression analysis shows that the likelihood of state capture increases during the period of higher economic growth and local elections. The likelihood of state capture, however, declines during presidential elections. The results we attribute to different interests of political parties in the period of local and presidential elections. Finally, we find that the state capture increased over the years in Poland. Additionally, we show that the EU accession did not prevent state capture in Poland. In contrast, the financial crisis of 2007 resulted in a wake-up effect and the likelihood of state capture declined in Poland. Research limitations: In the study we employ proxies for state capture, yet we assume that corruption is a widespread phenomenon in Poland. However, due to its nature, corruption is very difficult to assess and measure. Originality: The study uses a unique dataset on ex-ante state capture that was identified in the legal acts that have been passed in the period 1990–2011 in Poland.

  16. Causal inference, probability theory, and graphical insights.

    Science.gov (United States)

    Baker, Stuart G

    2013-11-10

    Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design. Published 2013. This article is a US Government work and is in the public domain in the USA.

  17. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers' theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises' exceptionally restrictive definition of probability. This paper challenges Richard von Mises' definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that both the nature of human action and the relative frequency method for calculating numerical probabilities presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  18. Captures of Boll Weevils (Coleoptera: Curculionidae) in Relation to Trap Orientation and Distance From Brush Lines.

    Science.gov (United States)

    Spurgeon, Dale W

    2016-04-01

    Eradication programs for the boll weevil (Anthonomus grandis grandis Boheman) rely on pheromone-baited traps to trigger insecticide treatments and monitor program progress. A key objective of monitoring in these programs is the timely detection of incipient weevil populations to limit or prevent re-infestation. Therefore, improvements in the effectiveness of trapping would enhance efforts to achieve and maintain eradication. Association of pheromone traps with woodlots and other prominent vegetation are reported to increase captures of weevils, but the spatial scale over which this effect occurs is unknown. The influences of trap distance (0, 10, and 20 m) and orientation (leeward or windward) to brush lines on boll weevil captures were examined during three noncropping seasons (October to February) in the Rio Grande Valley of Texas. Differences in numbers of captured weevils and in the probability of capture between traps at 10 or 20 m from brush, although often statistically significant, were generally small and variable. Variations in boll weevil population levels, wind directions, and wind speeds apparently contributed to this variability. In contrast, traps closely associated with brush (0 m) generally captured larger numbers of weevils, and offered a higher probability of weevil capture compared with traps away from brush. These increases in the probability of weevil capture were as high as 30%. Such increases in the ability of traps to detect low-level boll weevil populations indicate trap placement with respect to prominent vegetation is an important consideration in maximizing the effectiveness of trap-based monitoring for the boll weevil.

  19. Uncertainty relation and probability. Numerical illustration

    International Nuclear Information System (INIS)

    Fujikawa, Kazuo; Umetsu, Koichiro

    2011-01-01

    The uncertainty relation and the probability interpretation of quantum mechanics are intrinsically connected, as is evidenced by the evaluation of standard deviations. It is thus natural to ask if one can associate a very small uncertainty product of suitably sampled events with a very small probability. We have shown elsewhere that some examples of the evasion of the uncertainty relation noted in the past are in fact understood in this way. We here numerically illustrate that a very small uncertainty product is realized if one performs a suitable sampling of measured data that occur with a very small probability. We introduce a notion of cyclic measurements. It is also shown that our analysis is consistent with the Landau-Pollak-type uncertainty relation. It is suggested that the present analysis may help reconcile the contradicting views about the 'standard quantum limit' in the detection of gravitational waves. (author)

  20. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    The theoretical literature has a rich characterization of scoring rules for eliciting the subjective beliefs that an individual has for continuous events, but under the restrictive assumption of risk neutrality. It is well known that risk aversion can dramatically affect the incentives to correctly...... report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have...... to distort reports. We characterize the comparable implications of the general case of a risk averse agent when facing a popular scoring rule over continuous events, and find that these concerns do not apply with anything like the same force. For empirically plausible levels of risk aversion, one can......

  1. Comparing coefficients of nested nonlinear probability models

    DEFF Research Database (Denmark)

    Kohler, Ulrich; Karlson, Kristian Bernt; Holm, Anders

    2011-01-01

    In a series of recent articles, Karlson, Holm and Breen have developed a method for comparing the estimated coefficients of two nested nonlinear probability models. This article describes this method and the user-written program khb that implements it. The KHB method is a general decomposition method that is unaffected by the rescaling or attenuation bias that arises in cross-model comparisons in nonlinear models. It recovers the degree to which a control variable, Z, mediates or explains the relationship between X and a latent outcome variable, Y*, underlying the nonlinear probability…
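
    A minimal sketch of the decomposition idea on simulated data (the residualization route to the KHB method; variable names and data are made up here, and the khb program itself is a Stata command rather than the Python below).

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      n = 5000
      x = rng.normal(size=n)
      z = 0.5 * x + rng.normal(size=n)               # mediator Z correlated with X
      y = (0.4 * x + 0.6 * z + rng.logistic(size=n) > 0).astype(int)

      # full model: y ~ x + z
      b_full = sm.Logit(y, sm.add_constant(np.column_stack([x, z]))).fit(disp=0).params[1]

      # reduced model: replace z by its residual from a regression on x, so both
      # logits share the same scale and the x coefficients become comparable
      e_z = z - sm.OLS(z, sm.add_constant(x)).fit().predict()
      b_red = sm.Logit(y, sm.add_constant(np.column_stack([x, e_z]))).fit(disp=0).params[1]

      print(b_red, b_full, b_red - b_full)   # total, direct, and mediated effect of x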

  2. A basic course in probability theory

    CERN Document Server

    Bhattacharya, Rabi

    2016-01-01

    This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added and basic theory has been expanded. General Markov dependent sequences and their convergence to equilibrium is the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and/or concentration inequalities ranging from those of Chebyshev, Cramer–Chernoff, Bahadur–Rao, to Hoeffding have been added, with illustrative comparisons of thei...

  3. Ignition probabilities for Compact Ignition Tokamak designs

    International Nuclear Information System (INIS)

    Stotler, D.P.; Goldston, R.J.

    1989-09-01

    A global power balance code employing Monte Carlo techniques has been developed to study the "probability of ignition" and has been applied to several different configurations of the Compact Ignition Tokamak (CIT). Probability distributions for the critical physics parameters in the code were estimated using existing experimental data. This included a statistical evaluation of the uncertainty in extrapolating the energy confinement time. A substantial probability of ignition is predicted for CIT if peaked density profiles can be achieved or if one of the two higher plasma current configurations is employed. In other cases, values of the energy multiplication factor Q of order 10 are generally obtained. The Ignitor-U and ARIES designs are also examined briefly. Comparisons of our empirically based confinement assumptions with two theory-based transport models yield conflicting results. 41 refs., 11 figs
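
    The Monte Carlo idea can be sketched with a toy power balance (purely illustrative parameters and scalings, not the actual CIT global power balance code): sample the uncertain physics inputs from assumed distributions and count the fraction of draws that reach an ignition criterion.

      import numpy as np

      rng = np.random.default_rng(2)
      n = 100_000

      h_factor = rng.lognormal(0.0, 0.25, n)    # confinement-time uncertainty
      peaking = rng.uniform(1.0, 2.0, n)        # density-profile uncertainty

      q = 10.0 * h_factor**2 * peaking          # toy energy multiplication factor Q
      print(np.mean(q > 50.0))                  # "probability of ignition" (toy threshold)
      print(np.median(q))                       # median Q of the sampled ensemble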

  4. Independent events in elementary probability theory

    Science.gov (United States)

    Csenki, Attila

    2011-07-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): if the n events E1, E2, …, En are jointly independent, then any two events A and B built in finitely many steps from two disjoint subsets of E1, E2, …, En are also independent. The operations 'union', 'intersection' and 'complementation' are permitted only when forming the events A and B. Here we examine this statement from the point of view of elementary probability theory. The approach described here is accessible also to users of probability theory and is believed to be novel.
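
    The quoted statement can be checked concretely by enumerating a small probability space; in the sketch below (a hypothetical example), A is built from {E1, E2} and B from {E3, E4} for four fair coins.

      from fractions import Fraction
      from itertools import product

      omega = list(product([0, 1], repeat=4))    # four independent fair coins
      P = Fraction(1, len(omega))                # uniform probability measure

      heads = lambda w, i: w[i] == 1             # event Ei: coin i shows heads
      A = [w for w in omega if heads(w, 0) or not heads(w, 1)]   # E1 union (complement of E2)
      B = [w for w in omega if heads(w, 2) and heads(w, 3)]      # E3 intersect E4

      pA, pB = P * len(A), P * len(B)
      pAB = P * sum(1 for w in omega if w in A and w in B)
      print(pAB == pA * pB)                      # True: A and B are independent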

  5. Pointwise probability reinforcements for robust statistical inference.

    Science.gov (United States)

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include, e.g., outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR, and a regularisation allows controlling the amount of reinforcement, which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method that can be formulated as a likelihood maximisation. Experiments show that PPRs can easily be used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually, since an abnormality degree is obtained for each observation.
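
    A loose sketch of the idea for a robust mean (an L1-penalized slack per observation, which is one simple way to realize pointwise reinforcements; this is an assumption-laden toy, not the paper's exact formulation).

      import numpy as np

      def soft_threshold(v, lam):
          return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

      def robust_mean(x, lam=1.0, iters=50):
          # each point gets a slack r_i (its "reinforcement"), penalized in L1,
          # so outliers absorb their own abnormality instead of biasing mu
          mu = np.median(x)
          for _ in range(iters):
              r = soft_threshold(x - mu, lam)    # abnormality degree per observation
              mu = np.mean(x - r)                # likelihood step on corrected data
          return mu, r

      x = np.append(np.random.default_rng(3).normal(0, 1, 100), [25.0, 30.0])
      mu, r = robust_mean(x)
      print(mu)        # close to 0 despite the two outliers
      print(r[-2:])    # large reinforcements flag the outliers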

  6. Uncertainty the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...

  7. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  8. Python for probability, statistics, and machine learning

    CERN Document Server

    Unpingco, José

    2016-01-01

    This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowl...

  9. EARLY HISTORY OF GEOMETRIC PROBABILITY AND STEREOLOGY

    Directory of Open Access Journals (Sweden)

    Magdalena Hykšová

    2012-03-01

    The paper provides an account of the history of geometric probability and stereology from the time of Newton to the early 20th century. It depicts the development along two parallel paths: on one hand, the theory of geometric probability was formed with minor attention paid to other applications than those concerning spatial chance games. On the other hand, practical rules for the estimation of area or volume fraction and other characteristics, easily deducible from geometric probability theory, were proposed without knowledge of this branch. Special attention is paid to the paper of J.-É. Barbier published in 1860, which contained the fundamental stereological formulas but remained almost unnoticed by both mathematicians and practitioners.

  10. Probability analysis of nuclear power plant hazards

    International Nuclear Information System (INIS)

    Kovacs, Z.

    1985-01-01

    The probability analysis of risk, used for quantifying the risk of complex technological systems, especially nuclear power plants, is described. Risk is defined as the product of the probability of the occurrence of a dangerous event and the significance of its consequences. The process of the analysis may be divided into the stage of power plant analysis to the point of release of harmful material into the environment (reliability analysis) and the stage of the analysis of the consequences of this release and the assessment of the risk. The sequence of operations is characterized in the individual stages. The tasks are listed which Czechoslovakia faces in the development of the probability analysis of risk, and the composition of the work team for coping with the task is recommended. (J.C.)

  11. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.

  12. Geometric modeling in probability and statistics

    CERN Document Server

    Calin, Ovidiu

    2014-01-01

    This book covers topics of Informational Geometry, a field which deals with the differential geometric study of the manifold probability density functions. This is a field that is increasingly attracting the interest of researchers from many different areas of science, including mathematics, statistics, geometry, computer science, signal processing, physics and neuroscience. It is the authors’ hope that the present book will be a valuable reference for researchers and graduate students in one of the aforementioned fields. This textbook is a unified presentation of differential geometry and probability theory, and constitutes a text for a course directed at graduate or advanced undergraduate students interested in applications of differential geometry in probability and statistics. The book contains over 100 proposed exercises meant to help students deepen their understanding, and it is accompanied by software that is able to provide numerical computations of several information geometric objects. The reader...

  13. Fixation probability on clique-based graphs

    Science.gov (United States)

    Choi, Jeong-Ok; Yu, Unjong

    2018-02-01

    The fixation probability of a mutant in the evolutionary dynamics of Moran process is calculated by the Monte-Carlo method on a few families of clique-based graphs. It is shown that the complete suppression of fixation can be realized with the generalized clique-wheel graph in the limit of small wheel-clique ratio and infinite size. The family of clique-star is an amplifier, and clique-arms graph changes from amplifier to suppressor as the fitness of the mutant increases. We demonstrate that the overall structure of a graph can be more important to determine the fixation probability than the degree or the heat heterogeneity. The dependence of the fixation probability on the position of the first mutant is discussed.
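
    For the well-mixed (complete-graph) baseline, the record's Monte Carlo estimate is easy to reproduce and compare with the closed-form Moran result; a sketch with illustrative parameters:

      import numpy as np

      def fixation_probability(N=20, r=1.5, trials=10_000, seed=4):
          # Moran birth-death process on a complete graph: a reproducer is
          # chosen proportionally to fitness; its offspring replaces a
          # uniformly chosen individual.
          rng = np.random.default_rng(seed)
          fixed = 0
          for _ in range(trials):
              m = 1                                # a single initial mutant
              while 0 < m < N:
                  birth = rng.random() < r * m / (r * m + N - m)
                  death = rng.random() < m / N
                  m += int(birth) - int(death)
              fixed += m == N
          return fixed / trials

      N, r = 20, 1.5
      print(fixation_probability(N, r))
      print((1 - 1 / r) / (1 - r ** -N))           # exact complete-graph value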

  14. Duelling idiots and other probability puzzlers

    CERN Document Server

    Nahin, Paul J

    2002-01-01

    What are your chances of dying on your next flight, being called for jury duty, or winning the lottery? We all encounter probability problems in our everyday lives. In this collection of twenty-one puzzles, Paul Nahin challenges us to think creatively about the laws of probability as they apply in playful, sometimes deceptive, ways to a fascinating array of speculative situations. Games of Russian roulette, problems involving the accumulation of insects on flypaper, and strategies for determining the odds of the underdog winning the World Series all reveal intriguing dimensions to the worki

  15. Proposal for Modified Damage Probability Distribution Functions

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup; Hansen, Peter Friis

    1996-01-01

    Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project, the present proposal for modified damage stability probability distribution functions has been developed and submitted to the "Sub-committee on St…

  16. Probability densities and Lévy densities

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler

    For positive Lévy processes (i.e. subordinators) formulae are derived that express the probability density or the distribution function in terms of power series in time t. The applicability of the results to finance and to turbulence is briefly indicated.

  17. Probabilities from entanglement, Born's rule from envariance

    International Nuclear Information System (INIS)

    Zurek, W.

    2005-01-01

    Full text: I shall discuss consequences of envariance (environment-assisted invariance), a symmetry exhibited by entangled quantum states. I shall focus on the implications of envariance for the understanding of the origins and nature of ignorance, and, hence, for the origin of probabilities in physics. While the derivation of Born's rule for probabilities (p_k = |ψ_k|^2) is the principal accomplishment of this research, I shall explore the possibility that several other symptoms of the quantum-classical transition that are a consequence of decoherence can be justified directly by envariance -- i.e., without invoking Born's rule. (author)

  18. Risk Probability Estimating Based on Clustering

    DEFF Research Database (Denmark)

    Chen, Yong; Jensen, Christian D.; Gray, Elizabeth

    2003-01-01

    …of prior experiences, recommendations from a trusted entity or the reputation of the other entity. In this paper we propose a dynamic mechanism for estimating the risk probability of a certain interaction in a given environment using hybrid neural networks. We argue that traditional risk assessment models from the insurance industry do not directly apply to ubiquitous computing environments. Instead, we propose a dynamic mechanism for risk assessment, which is based on pattern matching, classification and prediction procedures. This mechanism uses an estimator of risk probability, which is based…

  19. Fifty challenging problems in probability with solutions

    CERN Document Server

    Mosteller, Frederick

    1987-01-01

    Can you solve the problem of "The Unfair Subway"? Marvin gets off work at random times between 3 and 5 p.m. His mother lives uptown, his girlfriend downtown. He takes the first subway that comes in either direction and eats dinner with the one he is delivered to. His mother complains that he never comes to see her, but he says she has a 50-50 chance. He has had dinner with her twice in the last 20 working days. Explain. Marvin's adventures in probability are one of the fifty intriguing puzzles that illustrate both elementary and advanced aspects of probability, each problem designed to chall…
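
    One classic resolution of "The Unfair Subway" assumes the uptown trains are scheduled one minute after the downtown ones in every ten-minute cycle; a Monte Carlo sketch under that hypothetical schedule reproduces Marvin's 2-in-20 record.

      import numpy as np

      rng = np.random.default_rng(5)
      arrivals = rng.uniform(0, 10, 1_000_000)   # arrival within a 10-minute cycle

      # downtown (girlfriend) trains at minutes 0, 10, 20, ...;
      # uptown (mother) trains one minute later
      wait_downtown = 10 - arrivals
      wait_uptown = (1 - arrivals) % 10

      print(np.mean(wait_uptown < wait_downtown))   # ~0.10: 2 dinners in 20 days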

  20. Path probabilities of continuous time random walks

    International Nuclear Information System (INIS)

    Eule, Stephan; Friedrich, Rudolf

    2014-01-01

    Employing the path integral formulation of a broad class of anomalous diffusion processes, we derive the exact relations for the path probability densities of these processes. In particular, we obtain a closed analytical solution for the path probability distribution of a Continuous Time Random Walk (CTRW) process. This solution is given in terms of its waiting time distribution and short time propagator of the corresponding random walk as a solution of a Dyson equation. Applying our analytical solution we derive generalized Feynman–Kac formulae. (paper)
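
    The record's results are analytical, but path ensembles of a CTRW are easy to simulate for comparison; a sketch with classical Pareto waiting times and Gaussian jumps (illustrative parameters) exhibits the subdiffusive spreading characteristic of such walks.

      import numpy as np

      rng = np.random.default_rng(6)

      def ctrw_positions(t_max, alpha=0.7, walkers=50_000):
          # waiting times: classical Pareto (infinite mean for alpha <= 1);
          # a walker only jumps if the wait completes before t_max
          pos = np.zeros(walkers)
          t = np.zeros(walkers)
          active = np.ones(walkers, dtype=bool)
          while active.any():
              t[active] += rng.pareto(alpha, active.sum()) + 1.0
              idx = np.flatnonzero(active)
              done = t[idx] > t_max
              pos[idx[~done]] += rng.normal(size=(~done).sum())
              active[idx[done]] = False
          return pos

      for t_max in (50.0, 100.0, 200.0):
          print(t_max, np.var(ctrw_positions(t_max)))   # variance grows sublinearly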

  1. Probable Gastrointestinal Toxicity of Kombucha Tea

    Science.gov (United States)

    Srinivasan, Radhika; Smolinske, Susan; Greenbaum, David

    1997-01-01

    Kombucha tea is a health beverage made by incubating the Kombucha “mushroom” in tea and sugar. Although therapeutic benefits have been attributed to the drink, neither its beneficial effects nor adverse side effects have been reported widely in the scientific literature. Side effects probably related to consumption of Kombucha tea are reported in four patients. Two presented with symptoms of allergic reaction, the third with jaundice, and the fourth with nausea, vomiting, and head and neck pain. In all four, use of Kombucha tea in proximity to onset of symptoms and symptom resolution on cessation of tea drinking suggest a probable etiologic association. PMID:9346462

  2. Quantum probability and quantum decision-making.

    Science.gov (United States)

    Yukalov, V I; Sornette, D

    2016-01-13

    A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary. © 2015 The Author(s).

  3. Lady luck the theory of probability

    CERN Document Server

    Weaver, Warren

    1982-01-01

    ""Should I take my umbrella?"" ""Should I buy insurance?"" ""Which horse should I bet on?"" Every day ― in business, in love affairs, in forecasting the weather or the stock market questions arise which cannot be answered by a simple ""yes"" or ""no."" Many of these questions involve probability. Probabilistic thinking is as crucially important in ordinary affairs as it is in the most abstruse realms of science. This book is the best nontechnical introduction to probability ever written. Its author, the late Dr. Warren Weaver, was a professor of mathematics, active in the Rockefeller and Sloa

  4. Bayesian estimation of core-melt probability

    International Nuclear Information System (INIS)

    Lewis, H.W.

    1984-01-01

    A very simple application of the canonical Bayesian algorithm is made to the problem of estimation of the probability of core melt in a commercial power reactor. An approximation to the results of the Rasmussen study on reactor safety is used as the prior distribution, and the observation that there has been no core melt yet is used as the single experiment. The result is a substantial decrease in the mean probability of core melt--factors of 2 to 4 for reasonable choices of parameters. The purpose is to illustrate the procedure, not to argue for the decrease
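
    A toy version of the update (a conjugate gamma prior standing in for the Rasmussen-study distribution of core-melt frequency; the numbers are illustrative, chosen only so that the decrease lands in the factor 2-4 range quoted).

      prior_alpha, prior_beta = 0.5, 1_000.0   # gamma prior, mean 5e-4 per reactor-year
      T = 2_000.0                              # observed reactor-years, zero core melts

      # Poisson likelihood of "no events in T"; the gamma prior is conjugate
      post_alpha, post_beta = prior_alpha + 0, prior_beta + T

      prior_mean = prior_alpha / prior_beta
      post_mean = post_alpha / post_beta
      print(prior_mean, post_mean, prior_mean / post_mean)   # mean drops by a factor of 3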

  5. Visual Field Asymmetry in Attentional Capture

    Science.gov (United States)

    Du, Feng; Abrams, Richard A.

    2010-01-01

    The present study examined the spatial distribution of involuntary attentional capture over the two visual hemi-fields. A new experiment, and an analysis of three previous experiments showed that distractors in the left visual field that matched a sought-for target in color produced a much larger capture effect than identical distractors in the…

  6. Capturing, annotating and reflecting video footage

    DEFF Research Database (Denmark)

    Eckardt, Max Roald; Wagner, Johannes

    A presentation of field-data capture setups for uninterrupted long-term recording of interaction. Two setups are described: the AMU forklift driving school with 17 cameras, and the Digital Days 2016 at University College Nord in Aalborg with 16 cameras, 14 audio recorders, and two HDMI recorders.

  7. Atomic capture of negative mesons in hydrogen

    International Nuclear Information System (INIS)

    Leon, M.

    1979-01-01

    After a brief description of the present state of theoretical understanding of atomic capture of negative mesons, a very simple model calculation of negative muon capture by the simplest atoms, hydrogen is described. Also the possibility of generalizing these results to more complicated atoms and even molecules is noted. 15 references

  8. Contingent Attentional Capture by Conceptually Relevant Images

    Science.gov (United States)

    Wyble, Brad; Folk, Charles; Potter, Mary C.

    2013-01-01

    Attentional capture is an unintentional shift of visuospatial attention to the location of a distractor that is either highly salient, or relevant to the current task set. The latter situation is referred to as contingent capture, in that the effect is contingent on a match between characteristics of the stimuli and the task-defined…

  9. Screen captures to support switching attention.

    NARCIS (Netherlands)

    Gellevij, M.R.M.; van der Meij, Hans

    2002-01-01

    The study set out to validate the supportive role of screen captures for switching attention. Forty-two participants learned how to work with Microsoft Excel with a paper manual. There were three types of manuals: a textual manual, a visual manual with full-screen captures, and a visual manual with

  10. Capture, transport and storage of CO2

    International Nuclear Information System (INIS)

    De Boer, B.

    2008-01-01

    The emission of the greenhouse gas CO2 in industrial processes and electricity production can be reduced on a large scale. Available techniques include post-combustion, pre-combustion, the oxy-fuel process, CO2 fixation in industrial processes and CO2 mineralization. In the Netherlands, plans for CO2 capture and storage (CCS) are not developing rapidly.

  11. Carbon capture by hybrid separation processes

    NARCIS (Netherlands)

    van Benthum, R.J.; van Kemenade, H.P.; Brouwers, J.J.H.

    2014-01-01

    Even though there is an increasing development of carbon capture technology over the last decade, large-scale implementation is still far from common practice, mainly caused by the energy intensiveness of carbon capture processes and the lack of regulation. In absence of strict regulation, less

  12. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    International Nuclear Information System (INIS)

    Vourdas, A.

    2014-01-01

    The orthocomplemented modular lattice of subspaces L[H(d)] of a quantum system with d-dimensional Hilbert space H(d) is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H1, H2), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H1), P(H2) onto the subspaces H1, H2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities
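
    The failure of the Kolmogorov additivity relation on the full lattice can be verified directly in the smallest nontrivial case; a sketch with two non-commuting rank-1 projectors in C^2, where the join of the two subspaces is the whole space and their meet is {0}.

      import numpy as np

      def projector(v):
          v = v / np.linalg.norm(v)
          return np.outer(v, v.conj())

      P1 = projector(np.array([1.0, 0.0]))    # H1 = span{(1, 0)}
      P2 = projector(np.array([1.0, 1.0]))    # H2 = span{(1, 1)}

      P_join, P_meet = np.eye(2), np.zeros((2, 2))   # H1 v H2 = C^2, H1 ^ H2 = {0}

      psi = np.array([np.cos(0.3), np.sin(0.3)])     # a pure state
      prob = lambda P: float(psi @ P @ psi)

      print(prob(P_join) + prob(P_meet))   # 1.0
      print(prob(P1) + prob(P2))           # != 1.0: additivity fails on the lattice
      print(P1 @ P2 - P2 @ P1)             # the nonzero commutator behind the deviation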

  13. Confusion noise from LISA capture sources

    International Nuclear Information System (INIS)

    Barack, Leor; Cutler, Curt

    2004-01-01

    Captures of compact objects (COs) by massive black holes (MBHs) in galactic nuclei will be an important source for LISA, the proposed space-based gravitational wave (GW) detector. However, a large fraction of captures will not be individually resolvable - either because they are too distant, have unfavorable orientation, or have too many years to go before final plunge - and so will constitute a source of 'confusion noise', obscuring other types of sources. In this paper we estimate the shape and overall magnitude of the GW background energy spectrum generated by CO captures. This energy spectrum immediately translates to a spectral density S_h^capt(f) for the amplitude of capture-generated GWs registered by LISA. The overall magnitude of S_h^capt(f) is linear in the CO capture rates, which are rather uncertain; therefore we present results for a plausible range of rates. S_h^capt(f) includes the contributions from both resolvable and unresolvable captures, and thus represents an upper limit on the confusion noise level. We then estimate what fraction of S_h^capt(f) is due to unresolvable sources and hence constitutes confusion noise. We find that almost all of the contribution to S_h^capt(f) coming from white dwarf and neutron star captures, and at least ∼30% of the contribution from black hole captures, is from sources that cannot be individually resolved. Nevertheless, we show that the impact of capture confusion noise on the total LISA noise curve ranges from insignificant to modest, depending on the rates. Capture rates at the high end of estimated ranges would raise LISA's overall (effective) noise level [f·S_h^eff(f)]^(1/2) by at most a factor ∼2 in the frequency range 1-10 mHz, where LISA is most sensitive. While this slightly elevated noise level would somewhat decrease LISA's sensitivity to other classes of sources, we argue that, overall, this would be a pleasant problem for LISA to have: It would also imply that detection rates for CO captures…

  14. Capture of Planetesimals into a Circumterrestrial Swarm

    Science.gov (United States)

    Weidenschilling, S. J.

    1985-01-01

    The lunar origin model considered in this report involves processing of protolunar material through a circumterrestrial swarm of particles. Once such a swarm has formed, it can gain mass by capturing infalling planetesimals and ejecta from giant impacts on the Earth, although the angular momentum supply from these sources remains a problem. The first stage of formation of a geocentric swarm by capture of planetesimals from initially heliocentric orbits is examined. The only plausible capture mechanism that is not dependent on very low approach velocities is the mutual collision of planetesimals passing within Earth's sphere of influence. The dissipation of energy in inelastic collisions or accretion events changes the value of the Jacobi parameter, allowing capture into bound geocentric orbits. This capture scenario was tested directly by many body numerical integration of planetesimal orbits in near Earth space.

  15. Capture cross sections on unstable nuclei

    Science.gov (United States)

    Tonchev, A. P.; Escher, J. E.; Scielzo, N.; Bedrossian, P.; Ilieva, R. S.; Humby, P.; Cooper, N.; Goddard, P. M.; Werner, V.; Tornow, W.; Rusev, G.; Kelley, J. H.; Pietralla, N.; Scheck, M.; Savran, D.; Löher, B.; Yates, S. W.; Crider, B. P.; Peters, E. E.; Tsoneva, N.; Goriely, S.

    2017-09-01

    Accurate neutron-capture cross sections on unstable nuclei near the line of beta stability are crucial for understanding the s-process nucleosynthesis. However, neutron-capture cross sections for short-lived radionuclides are difficult to measure due to the fact that the measurements require both highly radioactive samples and intense neutron sources. Essential ingredients for describing the γ decays following neutron capture are the γ-ray strength function and level densities. We will compare different indirect approaches for obtaining the most relevant observables that can constrain Hauser-Feshbach statistical-model calculations of capture cross sections. Specifically, we will consider photon scattering using monoenergetic and 100% linearly polarized photon beams. Challenges that exist on the path to obtaining neutron-capture cross sections for reactions on isotopes near and far from stability will be discussed.

  16. Capture cross sections on unstable nuclei

    Directory of Open Access Journals (Sweden)

    Tonchev A.P.

    2017-01-01

    Accurate neutron-capture cross sections on unstable nuclei near the line of beta stability are crucial for understanding the s-process nucleosynthesis. However, neutron-capture cross sections for short-lived radionuclides are difficult to measure due to the fact that the measurements require both highly radioactive samples and intense neutron sources. Essential ingredients for describing the γ decays following neutron capture are the γ-ray strength function and level densities. We will compare different indirect approaches for obtaining the most relevant observables that can constrain Hauser-Feshbach statistical-model calculations of capture cross sections. Specifically, we will consider photon scattering using monoenergetic and 100% linearly polarized photon beams. Challenges that exist on the path to obtaining neutron-capture cross sections for reactions on isotopes near and far from stability will be discussed.

  17. Tropical Cyclone Wind Probability Forecasting (WINDP).

    Science.gov (United States)

    1981-04-01

    llq. h. ,c ilrac (t’ small probabilities (below 107c) is limited II(t’h, numb(r o!, significant digits given: therefore 1t( huld lU r~ruidvd as being...APPLIED SCI. CORP. ENGLAMD ;7MOS. SCIENCES OEPT., LIBRARY ATTN: LIBARY , SUITE 500 400 WASHINGTON AVE. 6811 KENILWORTH AVE. EUROPEAN CENTRE FOR MEDIUM

  18. The Probability Heuristics Model of Syllogistic Reasoning.

    Science.gov (United States)

    Chater, Nick; Oaksford, Mike

    1999-01-01

    Proposes a probability heuristic model for syllogistic reasoning and confirms the rationality of this heuristic by an analysis of the probabilistic validity of syllogistic reasoning that treats logical inference as a limiting case of probabilistic inference. Meta-analysis and two experiments involving 40 adult participants and using generalized…

  19. Probability & Perception: The Representativeness Heuristic in Action

    Science.gov (United States)

    Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.

    2014-01-01

    If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…

  20. Critique of `Elements of Quantum Probability'

    NARCIS (Netherlands)

    Gill, R.D.

    1998-01-01

    We analyse the thesis of Kummerer and Maassen that classical probability is unable to model the stochastic nature of the Aspect experiment, in which violation of Bell's inequality was experimentally demonstrated. According to these authors, the experiment shows the need to introduce the extension…

  1. Independent Events in Elementary Probability Theory

    Science.gov (United States)

    Csenki, Attila

    2011-01-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E[subscript 1],…

  2. Probable Unusual Transmission of Zika Virus

    Centers for Disease Control (CDC) Podcasts

    This podcast discusses a study about the probable unusual transmission of Zika Virus Infection from a scientist to his wife, published in the May 2011 issue of Emerging Infectious Diseases. Dr. Brian Foy, Associate Professor at Colorado State University, shares details of this event.

  3. Error probabilities in default Bayesian hypothesis testing

    NARCIS (Netherlands)

    Gu, Xin; Hoijtink, Herbert; Mulder, J.

    2016-01-01

    This paper investigates the classical type I and type II error probabilities of default Bayes factors for a Bayesian t test. Default Bayes factors quantify the relative evidence between the null hypothesis and the unrestricted alternative hypothesis without needing to specify prior distributions for

  4. Spatial Probability Cuing and Right Hemisphere Damage

    Science.gov (United States)

    Shaqiri, Albulena; Anderson, Britt

    2012-01-01

    In this experiment we studied statistical learning, inter-trial priming, and visual attention. We assessed healthy controls and right brain damaged (RBD) patients with and without neglect, on a simple visual discrimination task designed to measure priming effects and probability learning. All participants showed a preserved priming effect for item…

  5. Sampling, Probability Models and Statistical Reasoning -RE ...

    Indian Academy of Sciences (India)

    …random sampling allows data to be modelled with the help of probability … based on different trials to get an estimate of the experimental error. … if e is indeed the true value of the proportion of defectives in the…

  6. Virus isolation: Specimen type and probable transmission

    Indian Academy of Sciences (India)

    Virus isolation: Specimen type and probable transmission. Over 500 CHIK virus isolations were made: 4 from male Ae. aegypti (?TOT), 6 from CSF (neurological involvement), and 1 from a 4-day-old child (transplacental transmission).

  7. Estimating the Probability of Negative Events

    Science.gov (United States)

    Harris, Adam J. L.; Corner, Adam; Hahn, Ulrike

    2009-01-01

    How well we are attuned to the statistics of our environment is a fundamental question in understanding human behaviour. It seems particularly important to be able to provide accurate assessments of the probability with which negative events occur so as to guide rational choice of preventative actions. One question that arises here is whether or…

  8. Concurrency meets probability: theory and practice (abstract)

    NARCIS (Netherlands)

    Katoen, Joost P.

    Treating random phenomena in concurrency theory has a long tradition. Petri nets [18, 10] and process algebras [14] have been extended with probabilities. The same applies to behavioural semantics such as strong and weak (bi)simulation [1], and testing pre-orders [5]. Beautiful connections between

  9. Confusion between Odds and Probability, a Pandemic?

    Science.gov (United States)

    Fulton, Lawrence V.; Mendez, Francis A.; Bastian, Nathaniel D.; Musal, R. Muzaffer

    2012-01-01

    This manuscript discusses the common confusion between the terms probability and odds. To emphasize the importance and responsibility of being meticulous in the dissemination of information and knowledge, this manuscript reveals five cases of sources of inaccurate statistical language embedded in the dissemination of information to the general…

  10. Probability in Action: The Red Traffic Light

    Science.gov (United States)

    Shanks, John A.

    2007-01-01

    Emphasis on problem solving in mathematics has gained considerable attention in recent years. While statistics teaching has always been problem driven, the same cannot be said for the teaching of probability where discrete examples involving coins and playing cards are often the norm. This article describes an application of simple probability…

  11. Probability & Statistics: Modular Learning Exercises. Teacher Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…

  12. Probability & Statistics: Modular Learning Exercises. Student Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…

  13. Conditional probability on MV-algebras

    Czech Academy of Sciences Publication Activity Database

    Kroupa, Tomáš

    2005-01-01

    Roč. 149, č. 2 (2005), s. 369-381 ISSN 0165-0114 R&D Projects: GA AV ČR IAA2075302 Institutional research plan: CEZ:AV0Z10750506 Keywords : conditional probability * tribe * MV-algebra Subject RIV: BA - General Mathematics Impact factor: 1.039, year: 2005

  14. Investigating Probability with the NBA Draft Lottery.

    Science.gov (United States)

    Quinn, Robert J.

    1997-01-01

    Investigates an interesting application of probability in the world of sports. Considers the role of permutations in the lottery system used by the National Basketball Association (NBA) in the United States to determine the order in which nonplayoff teams select players from the college ranks. Presents a lesson on this topic in which students work…

  15. Probability from a Socio-Cultural Perspective

    Science.gov (United States)

    Sharma, Sashi

    2016-01-01

    There exists considerable and rich literature on students' misconceptions about probability; less attention has been paid to the development of students' probabilistic thinking in the classroom. Grounded in an analysis of the literature, this article offers a lesson sequence for developing students' probabilistic understanding. In particular, a…

  16. Neutrosophic Probability, Set, And Logic (first version)

    OpenAIRE

    Smarandache, Florentin

    2000-01-01

    This project is a part of a National Science Foundation interdisciplinary project proposal. Starting from a new viewpoint in philosophy, the neutrosophy, one extends the classical "probability theory", "fuzzy set" and "fuzzy logic" to "neutrosophic probability", "neutrosophic set" and "neutrosophic logic", respectively. They are useful in artificial intelligence, neural networks, evolutionary programming, neutrosophic dynamic systems, and quantum mechanics.

  17. Pade approximant calculations for neutron escape probability

    International Nuclear Information System (INIS)

    El Wakil, S.A.; Saad, E.A.; Hendi, A.A.

    1984-07-01

    The neutron escape probability from a non-multiplying slab containing internal source is defined in terms of a functional relation for the scattering function for the diffuse reflection problem. The Pade approximant technique is used to get numerical results which compare with exact results. (author)

  18. On a paradox of probability theory

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard's proposal concerning physical retrocausality has been shown to fail on two crucial points. However, it is argued that his proposal still merits serious attention. The argument arises from showing that his proposal reveals a paradox involving relations between conditional probabilities, statistical correlations and reciprocal causalities of the type exhibited by cooperative dynamics in physical systems. 4 refs. (Author)

  19. Escape probabilities for fluorescent x-rays

    International Nuclear Information System (INIS)

    Dance, D.R.; Day, G.J.

    1985-01-01

    Computation of the energy absorption efficiency of an x-ray photon detector involves consideration of the histories of the secondary particles produced in any initial or secondary interaction which may occur within the detector. In particular, the K or higher shell fluorescent x-rays which may be emitted following a photoelectric interaction can carry away a large fraction of the energy of the incident photon, especially if this energy is just above an absorption edge. The effects of such photons cannot be ignored and a correction term, depending upon the probability that the fluorescent x-rays will escape from the detector, must be applied to the energy absorption efficiency. For detectors such as x-ray intensifying screens, it has been usual to calculate this probability by numerical integration. In this note analytic expressions are derived for the escape probability of fluorescent photons from planar detectors in terms of exponential integral functions. Rational approximations for these functions are readily available and these analytic expressions therefore facilitate the computation of photon absorption efficiencies. A table is presented which should obviate the need for calculating the escape probability for most cases of interest. (author)
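
    A sketch of the kind of computation described here, assuming the standard slab-geometry result for isotropic emission at a given depth (the record's exact expressions may differ); scipy evaluates the required exponential integral E2.

      import numpy as np
      from scipy.special import expn

      def escape_probability(mu, depth, thickness):
          # chance that a fluorescent photon emitted isotropically at 'depth'
          # leaves the planar detector through either face; mu is the linear
          # attenuation coefficient at the fluorescent-photon energy
          return 0.5 * (expn(2, mu * depth) + expn(2, mu * (thickness - depth)))

      mu, t = 5.0, 0.02                     # illustrative: mu = 5 /cm, 0.02 cm screen
      for d in np.linspace(0.001, t - 0.001, 5):
          print(round(escape_probability(mu, d, t), 3))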

  20. Sequential Probability Ratio Tests: Conservative and Robust

    NARCIS (Netherlands)

    Kleijnen, J.P.C.; Shi, Wen

    2017-01-01

    In practice, most computers generate simulation outputs sequentially, so it is attractive to analyze these outputs through sequential statistical methods such as sequential probability ratio tests (SPRTs). We investigate several SPRTs for choosing between two hypothesized values for the mean output

  1. Applied probability models with optimization applications

    CERN Document Server

    Ross, Sheldon M

    1992-01-01

    Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous time optimization models, much more. Problems and references at chapter ends. ""Excellent introduction."" - Journal of the American Statistical Association. Bibliography. 1970 edition.

  2. Monte Carlo methods to calculate impact probabilities

    Science.gov (United States)

    Rickman, H.; Wiśniowski, T.; Wajer, P.; Gabryszewski, R.; Valsecchi, G. B.

    2014-09-01

    Context. Unraveling the events that took place in the solar system during the period known as the late heavy bombardment requires the interpretation of the cratered surfaces of the Moon and terrestrial planets. This, in turn, requires good estimates of the statistical impact probabilities for different source populations of projectiles, a subject that has received relatively little attention, since the works of Öpik (1951, Proc. R. Irish Acad. Sect. A, 54, 165) and Wetherill (1967, J. Geophys. Res., 72, 2429). Aims: We aim to work around the limitations of the Öpik and Wetherill formulae, which are caused by singularities due to zero denominators under special circumstances. Using modern computers, it is possible to make good estimates of impact probabilities by means of Monte Carlo simulations, and in this work, we explore the available options. Methods: We describe three basic methods to derive the average impact probability for a projectile with a given semi-major axis, eccentricity, and inclination with respect to a target planet on an elliptic orbit. One is a numerical averaging of the Wetherill formula; the next is a Monte Carlo super-sizing method using the target's Hill sphere. The third uses extensive minimum orbit intersection distance (MOID) calculations for a Monte Carlo sampling of potentially impacting orbits, along with calculations of the relevant interval for the timing of the encounter allowing collision. Numerical experiments are carried out for an intercomparison of the methods and to scrutinize their behavior near the singularities (zero relative inclination and equal perihelion distances). Results: We find an excellent agreement between all methods in the general case, while there appear large differences in the immediate vicinity of the singularities. With respect to the MOID method, which is the only one that does not involve simplifying assumptions and approximations, the Wetherill averaging impact probability departs by diverging toward

  3. Measurement of low energy neutrino absorption probability in thallium 205

    International Nuclear Information System (INIS)

    Freedman, M.S.

    1986-01-01

    A major aspect of the P-P neutrino flux determination using thallium 205 is the very difficult problem of experimentally demonstrating the neutrino reaction cross section with about 10% accuracy. One will soon be able to completely strip the electrons from atomic thallium 205 and to maintain the bare nucleus in this state in the heavy storage ring to be built at GSI Darmstadt. This nucleus can decay by emitting a beta-minus particle into the bound K-level of the daughter lead 205 ion as the only energetically open decay channel, (plus, of course, an antineutrino). This single channel beta decay explores the same nuclear wave functions of initial and final states as does the neutrino capture in atomic thallium 205, and thus its probability or rate is governed by the same nuclear matrix elements that affect both weak interactions. Measuring the rate of accumulation of lead 205 ions in the circulating beam of thallium 205 ions gives directly the cross section of the neutrino capture reaction. The calculations of the expected rates under realistic experimental conditions will be shown to be very favorable for the measurement. A special calibration experiment to verify this method and check the theoretical calculations will be suggested. Finally, the neutrino cross section calculation based on the observed rate of the single channel beta-minus decay reaction will be shown. Demonstrating bound state beta decay may be the first verification of the theory of this very important process that influences beta decay rates of several isotopes in stellar interiors, e.g., Re-187, that play important roles in geologic and cosmologic dating and nucleosynthesis. 21 refs., 2 figs

  4. Bounding probabilistic safety assessment probabilities by reality

    International Nuclear Information System (INIS)

    Fragola, J.R.; Shooman, M.L.

    1991-01-01

    The investigation of failures in systems where failure is a rare event makes continual comparison between the developed probabilities and empirical evidence difficult. The comparison of the predictions of rare event risk assessments with historical reality is essential to prevent probabilistic safety assessment (PSA) predictions from drifting into fantasy. One approach to performing such comparisons is to search out and assign probabilities to natural events which, while extremely rare, have a basis in the history of natural phenomena or human activities. For example, the Segovian aqueduct and some of the Roman fortresses in Spain have existed for several millennia and in many cases show no physical signs of earthquake damage. This evidence could be used to bound the probability of earthquakes above a certain magnitude to less than 10^-3 per year. On the other hand, there is evidence that some repetitive actions can be performed with extremely low historical probabilities when operators are properly trained and motivated, and sufficient warning indicators are provided. The point is not that low probability estimates are impossible, but that the analysis assumptions must be continually reassessed and the analysis predictions bounded by historical reality. This paper reviews the probabilistic predictions of PSA in this light, attempts to develop, in a general way, the limits which can be historically established and the consequent bounds that these limits place upon the predictions, and illustrates the methodology used in computing such limits. Further, the paper discusses the use of empirical evidence and the requirement for disciplined systematic approaches within the bounds of reality and the associated impact on PSA probabilistic estimates

  5. Establishment probability in newly founded populations

    Directory of Open Access Journals (Sweden)

    Gusset Markus

    2012-06-01

    Abstract Background: Establishment success in newly founded populations relies on reaching the established phase, which is defined by characteristic fluctuations of the population's state variables. Stochastic population models can be used to quantify the establishment probability of newly founded populations; however, so far no simple but robust method for doing so existed. To determine a critical initial number of individuals that need to be released to reach the established phase, we used a novel application of the "Wissel plot", where -ln(1 - P0(t)) is plotted against time t. This plot is based on the equation P0(t) = 1 - c1·e^(-ω1·t), which relates the probability of extinction by time t, P0(t), to two constants: c1 describes the probability of a newly founded population to reach the established phase, whereas ω1 describes the population's probability of extinction per short time interval once established. Results: For illustration, we applied the method to a previously developed stochastic population model of the endangered African wild dog (Lycaon pictus). A newly founded population reaches the established phase if the intercept of the (extrapolated) linear part of the "Wissel plot" with the y-axis, which is -ln(c1), is negative. For wild dogs in our model, this is the case if a critical initial number of four packs, consisting of eight individuals each, is released. Conclusions: The method we present to quantify the establishment probability of newly founded populations is generic and inferences thus are transferable to other systems across the field of conservation biology. In contrast to other methods, our approach disaggregates the components of a population's viability by distinguishing establishment from persistence.
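
    A sketch of how the two constants could be read off a Wissel plot, using a synthetic extinction curve generated from the stated equation (values are invented for illustration).

      import numpy as np

      c1_true, omega1_true = 0.8, 0.05
      t = np.arange(1.0, 101.0)
      P0 = 1.0 - c1_true * np.exp(-omega1_true * t)      # P0(t) = 1 - c1*exp(-omega1*t)

      # Wissel plot: y = -ln(1 - P0(t)) = -ln(c1) + omega1*t is linear in t
      y = -np.log(1.0 - P0)
      omega1, neg_ln_c1 = np.polyfit(t[50:], y[50:], 1)  # fit the linear tail

      print(omega1)               # ~0.05: extinction rate once established
      print(np.exp(-neg_ln_c1))   # ~0.8 = c1: probability of reaching establishment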

  6. The probability and severity of decompression sickness

    Science.gov (United States)

    Hada, Ethan A.; Vann, Richard D.; Denoble, Petar J.

    2017-01-01

    Decompression sickness (DCS), which is caused by inert gas bubbles in tissues, is an injury of concern for scuba divers, compressed air workers, astronauts, and aviators. Case reports for 3322 air and N2-O2 dives, resulting in 190 DCS events, were retrospectively analyzed and the outcomes were scored as (1) serious neurological, (2) cardiopulmonary, (3) mild neurological, (4) pain, (5) lymphatic or skin, and (6) constitutional or nonspecific manifestations. Following standard U.S. Navy medical definitions, the data were grouped into mild (Type I: manifestations 4-6) and serious (Type II: manifestations 1-3). Additionally, we considered an alternative grouping of mild (Type A: manifestations 3-6) and serious (Type B: manifestations 1 and 2). The current U.S. Navy guidance allows for a 2% probability of mild DCS and a 0.1% probability of serious DCS. We developed a hierarchical trinomial (3-state) probabilistic DCS model that simultaneously predicts the probability of mild and serious DCS given a dive exposure. Both the Type I/II and Type A/B discriminations of mild and serious DCS resulted in a highly significant (p …) … probability of 'mild' DCS resulted in a longer allowable bottom time for the same 2% limit. However, for the 0.1% serious DCS limit, we found a vastly decreased allowable bottom dive time for all dive depths. If the Type A/B scoring was assigned to outcome severity, the no decompression limits (NDL) for air dives were still controlled by the acceptable serious DCS risk limit rather than the acceptable mild DCS risk limit. However, in this case, longer NDL limits were allowed than with the Type I/II scoring. The trinomial model mild and serious probabilities agree reasonably well with the current air NDL only with the Type A/B scoring and when 0.2% risk of serious DCS is allowed. PMID:28296928

  7. Fusion probability and survivability in estimates of heaviest nuclei production

    International Nuclear Information System (INIS)

    Sagaidak, Roman

    2012-01-01

    A number of theoretical models have been recently developed to predict production cross sections for the heaviest nuclei in fusion-evaporation reactions. All the models reproduce cross sections obtained in experiments quite well. At the same time they give fusion probability values P_fus ≡ P_CN that differ by several orders of magnitude. This difference implies a corresponding distinction in the calculated values of survivability. The production of the heaviest nuclei (from Cm to the region of superheavy elements (SHE) close to Z = 114 and N = 184) in fusion-evaporation reactions induced by heavy ions has been considered in a systematic way within the framework of the barrier-passing (fusion) model coupled with the standard statistical model (SSM) of the compound nucleus (CN) decay. Both models are incorporated into the HIVAP code. Available data on the excitation functions for fission and evaporation residues (ER) produced in very asymmetric combinations can be described rather well within the framework of HIVAP. Cross-section data obtained in these reactions allow one to choose model parameters quite definitely. Thus one can scale and fix macroscopic (liquid-drop) fission barriers for nuclei involved in the evaporation-fission cascade. In less asymmetric combinations (with 22Ne and heavier projectiles) effects of fusion suppression caused by quasi-fission start to appear in the entrance channel of reactions. The P_fus values derived from the capture-fission and fusion-fission cross-sections obtained at energies above the Bass barrier were plotted as a function of the Coulomb parameter. For more symmetric combinations one can deduce the P_fus values semi-empirically, using the ER and fission excitation functions measured in experiments, and applying the SSM model with parameters obtained in the analysis of a very asymmetric combination leading to the production of (nearly) the same CN, as was done for reactions leading to the pre-actinide nuclei formation

  8. Systematics of the breakup probability function for {sup 6}Li and {sup 7}Li projectiles

    Energy Technology Data Exchange (ETDEWEB)

    Capurro, O.A., E-mail: capurro@tandar.cnea.gov.ar [Laboratorio TANDAR, Comisión Nacional de Energía Atómica, Av. General Paz 1499, B1650KNA San Martín, Buenos Aires (Argentina); Pacheco, A.J.; Arazi, A. [Laboratorio TANDAR, Comisión Nacional de Energía Atómica, Av. General Paz 1499, B1650KNA San Martín, Buenos Aires (Argentina); CONICET, Av. Rivadavia 1917, C1033AAJ Buenos Aires (Argentina); Carnelli, P.F.F. [CONICET, Av. Rivadavia 1917, C1033AAJ Buenos Aires (Argentina); Instituto de Investigación e Ingeniería Ambiental, Universidad Nacional de San Martín, 25 de Mayo y Francia, B1650BWA San Martín, Buenos Aires (Argentina); Fernández Niello, J.O. [Laboratorio TANDAR, Comisión Nacional de Energía Atómica, Av. General Paz 1499, B1650KNA San Martín, Buenos Aires (Argentina); CONICET, Av. Rivadavia 1917, C1033AAJ Buenos Aires (Argentina); Instituto de Investigación e Ingeniería Ambiental, Universidad Nacional de San Martín, 25 de Mayo y Francia, B1650BWA San Martín, Buenos Aires (Argentina); and others

    2016-01-15

    Experimental non-capture breakup cross-sections can be used to determine the probability of projectile and ejectile fragmentation in nuclear reactions involving weakly bound nuclei. Recently, the probabilities of both types of dissociation were analyzed for nuclear reactions of {sup 9}Be projectiles on various heavy targets at sub-barrier energies. In the present work we extend this kind of systematic analysis to {sup 6}Li and {sup 7}Li projectiles, with the purpose of investigating general features of projectile-like breakup probabilities in reactions induced by stable weakly bound nuclei. To that end we obtained the probabilities of projectile and ejectile breakup for a large number of systems, starting from a compilation of the corresponding reported non-capture breakup cross-sections. We parametrize the results in accordance with the previous studies of beryllium projectiles, and we discuss their systematic behavior as a function of the projectile, the target mass, and the reaction Q-value.

  9. An Alternative Version of Conditional Probabilities and Bayes' Rule: An Application of Probability Logic

    Science.gov (United States)

    Satake, Eiki; Amato, Philip P.

    2008-01-01

    This paper presents an alternative version of the formulas for conditional probabilities and Bayes' rule, demonstrating how the truth table of elementary mathematical logic applies to the derivation of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…

  10. A Comprehensive Probability Project for the Upper Division One-Semester Probability Course Using Yahtzee

    Science.gov (United States)

    Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa

    2011-01-01

    This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…

  11. Exploiting the Capture Effect to Enhance RACH Performance in Cellular-Based M2M Communications

    Directory of Open Access Journals (Sweden)

    Jonghun Kim

    2017-09-01

    Full Text Available Cellular-based machine-to-machine (M2M) communication is expected to facilitate services for the Internet of Things (IoT). However, because cellular networks are designed for human users, they have some limitations. Random access channel (RACH) congestion caused by massive access from M2M devices is one of the biggest factors hindering cellular-based M2M services, because RACH congestion causes random access (RA) throughput degradation and connection failures for the devices. In this paper, we show the possibility of exploiting the capture effect, which is known to have a positive impact on wireless network systems, in the RA procedure to improve the RA performance of M2M devices. For this purpose, we analyze an RA procedure using a capture model. Through this analysis, we examine the effects of capture on RA performance and propose an Msg3 power-ramping (Msg3 PR) scheme to increase the capture probability (and thereby the RA success probability) even when severe RACH congestion occurs. The proposed analysis models are validated using simulations. The results show that the proposed scheme, with proper parameters, further improves the RA throughput and reduces the connection failure probability, at the cost of a slight increase in energy consumption. Finally, we demonstrate the effects of coexistence with other RA-related schemes through simulation results.
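
    To make the capture effect concrete, here is a small Monte Carlo sketch (an assumption-laden toy, not the analysis model of this record): several devices collide on the same resource, each with transmit power ramped by its retry count, and the strongest arrival is captured when its signal-to-interference ratio clears a threshold. All names and parameter values are invented for illustration.

```python
import random

def capture_prob(n_contenders=4, ramp_db=2.0, max_retries=3,
                 threshold_db=6.0, trials=100_000):
    """Monte Carlo estimate of the probability that the strongest of
    n_contenders colliding transmissions is captured (decoded). Each
    device's power is ramped by ramp_db per previous attempt; Rayleigh
    fading gives exponentially distributed channel gains."""
    thr = 10 ** (threshold_db / 10)
    captures = 0
    for _ in range(trials):
        rx = []
        for _ in range(n_contenders):
            k = random.randint(0, max_retries)       # device's retry count
            p_tx = 10 ** (k * ramp_db / 10)          # ramped transmit power
            rx.append(p_tx * random.expovariate(1))  # fading channel gain
        strongest = max(rx)
        interference = sum(rx) - strongest
        if strongest >= thr * interference:          # SIR capture condition
            captures += 1
    return captures / trials

print(f"capture probability ~ {capture_prob():.3f}")
```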

  12. Capture of irregular satellites at Jupiter

    International Nuclear Information System (INIS)

    Nesvorný, David; Vokrouhlický, David; Deienno, Rogerio

    2014-01-01

    The irregular satellites of the outer planets are thought to have been captured from heliocentric orbits. The exact nature of the capture process, however, remains uncertain. We examine the possibility that the irregular satellites were captured from the planetesimal disk during the early solar system instability, when encounters between the outer planets occurred. Nesvorný et al. already showed that the irregular satellites of Saturn, Uranus, and Neptune were plausibly captured during planetary encounters. Here we find that the current instability models present favorable conditions for the capture of irregular satellites at Jupiter as well, mainly because Jupiter undergoes a phase of close encounters with an ice giant. We show that the orbital distribution of bodies captured during planetary encounters provides a good match to the observed distribution of irregular satellites at Jupiter. The capture efficiency for each particle in the original transplanetary disk is found to be (1.3-3.6) × 10^-8. This is roughly enough to explain the observed population of jovian irregular moons. We also confirm Nesvorný et al.'s results for the irregular satellites of Saturn, Uranus, and Neptune.

  13. Recent development of capture of CO2

    CERN Document Server

    Chavez, Rosa Hilda

    2014-01-01

    "Recent Technologies in the capture of CO2" provides a comprehensive summary on the latest technologies available to minimize the emission of CO2 from large point sources like fossil-fuel power plants or industrial facilities. This ebook also covers various techniques that could be developed to reduce the amount of CO2 released into the atmosphere. The contents of this book include chapters on oxy-fuel combustion in fluidized beds, gas separation membrane used in post-combustion capture, minimizing energy consumption in CO2 capture processes through process integration, characterization and application of structured packing for CO2 capture, calcium looping technology for CO2 capture and many more. Recent Technologies in capture of CO2 is a valuable resource for graduate students, process engineers and administrative staff looking for real-case analysis of pilot plants. This eBook brings together the research results and professional experiences of the most renowned work groups in the CO2 capture field...

  14. VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES

    International Nuclear Information System (INIS)

    G.A. Valentine; F.V. Perry; S. Dartevelle

    2005-01-01

    Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but they must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework, used in many fields such as performance assessment for hazardous and/or radioactive waste disposal sites, that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of the processes that affect probability and consequences drives the decision-level results, but in turn these results can drive focused research in the areas that cause the greatest uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) the probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) the probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on the consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for "closure" of wall-rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flows entering underground structures, and vulnerability criteria for structures subjected to the conditions of an explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of an explosive "eruption" of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision...

  15. USING RASCH ANALYSIS TO EXPLORE WHAT STUDENTS LEARN ABOUT PROBABILITY CONCEPTS

    Directory of Open Access Journals (Sweden)

    Zamalia Mahmud

    2015-01-01

    Full Text Available Students' understanding of probability concepts has been investigated from various perspectives. This study set out to investigate the perceived understanding of probability concepts of forty-four students from the STAT131 Understanding Uncertainty and Variation course at the University of Wollongong, NSW. Rasch measurement, which is based on a probabilistic model, was used to identify concepts that students find easy, moderate, and difficult to understand. Data were captured from the e-learning Moodle platform, where students provided their responses through an on-line quiz. As illustrated in the Rasch map, 96% of the students could understand sample space, simple events, mutually exclusive events, and tree diagrams, while 67% of the students found the concepts of conditional and independent events rather easy to understand. Keywords: Perceived Understanding, Probability Concepts, Rasch Measurement Model. DOI: dx.doi.org/10.22342/jme.61.1
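
    For context, the Rasch model underlying this analysis assigns each student an ability and each concept a difficulty on a common logit scale; the probability of a correct response depends only on their difference. A minimal sketch (dichotomous form, illustrative numbers only):

```python
import math

def rasch_p_correct(theta, difficulty):
    """Dichotomous Rasch model: probability that a person of ability theta
    answers an item of the given difficulty correctly (logit scale)."""
    return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

# An item sitting 1 logit below a student's ability is answered
# correctly about 73% of the time:
print(f"{rasch_p_correct(theta=0.5, difficulty=-0.5):.2f}")
```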

  16. Understanding Motion Capture for Computer Animation

    CERN Document Server

    Menache, Alberto

    2010-01-01

    The power of today's motion capture technology has taken animated characters and special effects to amazing new levels of reality. And with blockbusters like Avatar and Tintin, audiences continually expect more from each new release. To live up to these expectations, film and game makers, particularly technical animators and directors, need to be at the forefront of motion capture technology. In this extensively updated edition of Understanding Motion Capture for Computer Animation and Video Games, an industry insider explains the latest research developments in digital design.

  17. The carbon dioxide capture and geological storage

    International Nuclear Information System (INIS)

    2006-06-01

    This road-map, proposed by the Total Group, aims to inform the public about carbon dioxide capture and geological storage. One possible means of climate change mitigation consists of capturing and storing the CO2 generated by large emission sources in order to stabilize atmospheric concentrations. This sheet presents the capture of CO2 from large fossil-fueled combustion installations, the three capture techniques and the CO2 transport options, the geological storage of CO2, and Total's commitments in the domain. (A.L.B.)

  18. Transition probabilities for lines of Cr II, Na II and Sb I by laser produced plasma atomic emission spectroscopy

    International Nuclear Information System (INIS)

    Gonzalez, A. M.; Ortiz, M.; Campos, J.

    1995-01-01

    Absolute transition probabilities for lines of Cr II, Na II and Sb I were determined by emission spectroscopy of laser-induced plasmas. The plasma was produced by focusing the emission of a pulsed Nd:YAG laser on solid samples containing the atom under study. The light arising from the plasma region was collected by a spectrometer. The detector used was a time-resolved optical multichannel analyzer (OMA III, EG&G). The wavelengths of the measured transitions range from 2000 to 4100 A. The spectral resolution of the system was 0.2 A. The method can be used on insulating materials, such as NaCl crystals, and on metallic samples, such as Al-Cr and Sn-Sb alloys. To avoid self-absorption effects the alloys were made with low Sb or Cr content. Relative transition probabilities have been determined from measurements of emission-line intensities and were placed on an absolute scale by using, where possible, accurate experimental lifetime values from the literature or theoretical data. From these measurements, values for the plasma temperature (8000-24000 K), electron densities (≈ 10^16 cm^-3) and self-absorption coefficients have been obtained. (Author) 56 refs

  19. Transition probabilities for lines of Cr II, Na II and Sb I by laser produced plasma atomic emission spectroscopy

    International Nuclear Information System (INIS)

    Gonzalez, A.M.; Ortiz, M.; Campos, J.

    1995-09-01

    Absolute transition probabilities for lines of Cr II, Na II and Sb I were determined by emission spectroscopy of laser-induced plasmas. The plasma was produced by focusing the emission of a pulsed Nd:YAG laser on solid samples containing the atom under study. The light arising from the plasma region was collected by a spectrometer. The detector used was a time-resolved optical multichannel analyzer (OMA III, EG&G). The wavelengths of the measured transitions range from 2000 to 4100 A. The spectral resolution of the system was 0.2 A. The method can be used on insulating materials, such as NaCl crystals, and on metallic samples, such as Al-Cr and Sn-Sb alloys. To avoid self-absorption effects the alloys were made with low Sb or Cr content. Relative transition probabilities have been determined from measurements of emission-line intensities and were placed on an absolute scale by using, where possible, accurate experimental lifetime values from the literature or theoretical data. From these measurements, values for the plasma temperature (8000-24000 K), electron densities (approximately 10^16 cm^-3) and self-absorption coefficients have been obtained.

  20. Joint survival probability via truncated invariant copula

    International Nuclear Information System (INIS)

    Kim, Jeong-Hoon; Ma, Yong-Ki; Park, Chan Yeol

    2016-01-01

    Highlights: • We study the dependence structure between default intensities. • We use a multivariate shot noise intensity process, where jumps occur simultaneously and their sizes are correlated. • We obtain the joint survival probability of the integrated intensities by using a copula. • We apply our theoretical result to pricing basket default swap spreads. - Abstract: Given an intensity-based credit risk model, this paper studies the dependence structure between default intensities. To model this structure, we use a multivariate shot noise intensity process, where jumps occur simultaneously and their sizes are correlated. Through very lengthy algebra, we obtain an explicit expression for the joint survival probability of the integrated intensities by using the truncated invariant Farlie–Gumbel–Morgenstern copula with exponential marginal distributions. We also apply our theoretical result to pricing basket default swap spreads. This result can provide a useful guide for credit risk management.
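
    As a sketch of the construction described in the abstract, using the standard FGM copula rather than the paper's truncated invariant variant, a joint survival probability for two exponential lifetimes can be assembled as follows; the rates and dependence parameter are illustrative.

```python
import math

def fgm_copula(u, v, theta):
    """Farlie-Gumbel-Morgenstern copula; valid for -1 <= theta <= 1."""
    return u * v * (1.0 + theta * (1.0 - u) * (1.0 - v))

def joint_survival(x, y, lam1, lam2, theta):
    """P(X > x, Y > y): exponential marginal survival functions coupled
    by the FGM copula (a survival-copula construction)."""
    s1 = math.exp(-lam1 * x)  # P(X > x) for X ~ Exp(lam1)
    s2 = math.exp(-lam2 * y)  # P(Y > y) for Y ~ Exp(lam2)
    return fgm_copula(s1, s2, theta)

# Positive theta induces positive dependence between the two lifetimes:
print(joint_survival(1.0, 2.0, lam1=0.5, lam2=0.3, theta=0.4))
```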

  1. Probabilities, causes and propensities in physics

    CERN Document Server

    Suárez, Mauricio

    2010-01-01

    This volume defends a novel approach to the philosophy of physics: it is the first book devoted to a comparative study of probability, causality, and propensity, and their various interrelations, within the context of contemporary physics, particularly quantum and statistical physics. The philosophical debates and distinctions are firmly grounded in examples from actual physics, exemplifying a robustly empiricist approach. The essays, by both prominent scholars in the field and promising young researchers, constitute a pioneering effort to bring out the connections between the probabilistic, causal, and dispositional aspects of the quantum domain. This book will appeal to specialists in the philosophy and foundations of physics, philosophy of science in general, metaphysics, the ontology of physical theories, and the philosophy of probability.

  2. Foundations of quantization for probability distributions

    CERN Document Server

    Graf, Siegfried

    2000-01-01

    Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous and special classes of singular probabilities (surface measures, self-similar measures), presenting some new results for the first time. Written for researchers and graduate students in probability theory, the monograph is of potential interest to all people working in the disciplines mentioned above.

  3. Measurement of the resonance escape probability

    International Nuclear Information System (INIS)

    Anthony, J.P.; Bacher, P.; Lheureux, L.; Moreau, J.; Schmitt, A.P.

    1957-01-01

    The average cadmium ratio in natural uranium rods has been measured using equal-diameter natural uranium disks. These values, correlated with independent measurements of the lattice buckling, enabled us to calculate values of the resonance escape probability for the G1 reactor under one or the other of two definitions. Measurements were performed on 26 mm and 32 mm rods, giving the following values for the resonance escape probability p: 0.8976 ± 0.005 and 0.912 ± 0.006 (d = 26 mm), 0.8627 ± 0.009 and 0.884 ± 0.01 (d = 32 mm). The influence of either definition on the lattice parameters is discussed, leading to values of the effective integral. Similar experiments have been performed with thorium rods. (author) [fr]

  4. Approaches to Evaluating Probability of Collision Uncertainty

    Science.gov (United States)

    Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done so mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented, and an initial proposal for an operationally useful display and interpretation of these data for a particular conjunction is given.
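
    A toy version of the resampling idea, assuming for simplicity an isotropic Gaussian relative-position uncertainty in the encounter plane (all numbers invented):

```python
import math
import random

def pc_estimate(miss_m, sigma_m, hard_body_radius_m, samples=20_000):
    """Monte Carlo 2-D Pc: sample the relative position in the encounter
    plane from an isotropic Gaussian centred on the nominal miss vector and
    count the fraction falling inside the combined hard-body radius."""
    hits = 0
    for _ in range(samples):
        x = random.gauss(miss_m, sigma_m)
        y = random.gauss(0.0, sigma_m)
        if math.hypot(x, y) <= hard_body_radius_m:
            hits += 1
    return hits / samples

# Resampling: perturb an uncertain input (here just the covariance scale)
# to obtain a distribution of Pc values instead of a single point estimate.
pcs = [pc_estimate(miss_m=300.0, sigma_m=random.uniform(80.0, 160.0),
                   hard_body_radius_m=20.0) for _ in range(30)]
print(f"Pc spread across resamples: {min(pcs):.1e} .. {max(pcs):.1e}")
```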

  5. Multiple model cardinalized probability hypothesis density filter

    Science.gov (United States)

    Georgescu, Ramona; Willett, Peter

    2011-09-01

    The Probability Hypothesis Density (PHD) filter propagates the first-moment approximation to the multi-target Bayesian posterior distribution while the Cardinalized PHD (CPHD) filter propagates both the posterior likelihood of (an unlabeled) target state and the posterior probability mass function of the number of targets. Extensions of the PHD filter to the multiple model (MM) framework have been published and were implemented either with a Sequential Monte Carlo or a Gaussian Mixture approach. In this work, we introduce the multiple model version of the more elaborate CPHD filter. We present the derivation of the prediction and update steps of the MMCPHD particularized for the case of two target motion models and proceed to show that in the case of a single model, the new MMCPHD equations reduce to the original CPHD equations.

  6. Conditional probabilities in Ponzano-Regge minisuperspace

    International Nuclear Information System (INIS)

    Petryk, Roman; Schleich, Kristin

    2003-01-01

    We examine the Hartle-Hawking no-boundary initial state for the Ponzano-Regge formulation of gravity in three dimensions. We consider the behavior of conditional probabilities and expectation values for geometrical quantities in this initial state for a simple minisuperspace model consisting of a two-parameter set of anisotropic geometries on a 2-sphere boundary. We find dependence on the cutoff used in the construction of Ponzano-Regge amplitudes for expectation values of edge lengths. However, these expectation values are cutoff independent when computed in certain, but not all, conditional probability distributions. Conditions that yield cutoff independent expectation values are those that constrain the boundary geometry to a finite range of edge lengths. We argue that such conditions have a correspondence to fixing a range of local time, as classically associated with the area of a surface for spatially closed cosmologies. Thus these results may hint at how classical spacetime emerges from quantum amplitudes

  7. Bayesian Prior Probability Distributions for Internal Dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Miller, G.; Inkret, W.C.; Little, T.T.; Martz, H.F.; Schillaci, M.E

    2001-07-01

    The problem of choosing a prior distribution for the Bayesian interpretation of measurements (specifically internal dosimetry measurements) is considered using a theoretical analysis and by examining historical tritium and plutonium urine bioassay data from Los Alamos. Two models for the prior probability distribution are proposed: (1) the log-normal distribution, when there is some additional information to determine the scale of the true result, and (2) the 'alpha' distribution (a simplified variant of the gamma distribution) when there is not. These models have been incorporated into version 3 of the Bayesian internal dosimetric code in use at Los Alamos (downloadable from our web site). Plutonium internal dosimetry at Los Alamos is now being done using prior probability distribution parameters determined self-consistently from population averages of Los Alamos data. (author)
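
    A minimal sketch of the first option, a log-normal prior on the true value combined with a Gaussian measurement model and a grid-based posterior; none of this reproduces the Los Alamos code's actual likelihoods or parameters.

```python
import math

def posterior_mean(measurement, meas_sigma, prior_mu, prior_sigma,
                   grid_max=50.0, n=5000):
    """Grid-based Bayesian update: log-normal prior on the true (positive)
    quantity, Gaussian measurement error. All values are illustrative."""
    def lognorm_pdf(x):
        return (math.exp(-(math.log(x) - prior_mu) ** 2
                         / (2 * prior_sigma ** 2))
                / (x * prior_sigma * math.sqrt(2 * math.pi)))

    def likelihood(x):
        return math.exp(-(measurement - x) ** 2 / (2 * meas_sigma ** 2))

    xs = [(i + 0.5) * grid_max / n for i in range(n)]  # midpoint grid, x > 0
    ws = [lognorm_pdf(x) * likelihood(x) for x in xs]  # unnormalized posterior
    return sum(x * w for x, w in zip(xs, ws)) / sum(ws)

# A noisy bioassay-like reading of 2.0 is shrunk toward the prior median 1.0:
print(posterior_mean(measurement=2.0, meas_sigma=1.0,
                     prior_mu=0.0, prior_sigma=1.0))
```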

  8. Probability and statistics for particle physics

    CERN Document Server

    Mana, Carlos

    2017-01-01

    This book comprehensively presents the basic concepts of probability and Bayesian inference with sufficient generality to make them applicable to current problems in scientific research. The first chapter provides the fundamentals of probability theory that are essential for the analysis of random phenomena. The second chapter includes a full and pragmatic review of the Bayesian methods that constitute a natural and coherent framework, with enough freedom to analyze all the information available from experimental data in a conceptually simple manner. The third chapter presents the basic Monte Carlo techniques used in scientific research, which allow a large variety of problems that are difficult to tackle by other procedures to be handled. The author also introduces a basic algorithm that enables readers to simulate samples from simple distributions, and describes useful cases for researchers in particle physics. The final chapter is devoted to the basic ideas of Information Theory, which are important in the Bayesian me...

  9. Probability Weighting as Evolutionary Second-best

    OpenAIRE

    Herold, Florian; Netzer, Nick

    2011-01-01

    The economic concept of the second-best involves the idea that multiple simultaneous deviations from a hypothetical first-best optimum may be optimal once the first-best itself can no longer be achieved, since one distortion may partially compensate for another. Within an evolutionary framework, we translate this concept to behavior under uncertainty. We argue that the two main components of prospect theory, the value function and the probability weighting function, are complements in the sec...

  10. Bayesian probability theory and inverse problems

    International Nuclear Information System (INIS)

    Kopec, S.

    1994-01-01

    Bayesian probability theory is applied to the approximate solving of inverse problems. In order to solve the moment problem with noisy data, the entropic prior is used. Expressions for the solution and its error bounds are presented. When the noise level tends to zero, the Bayesian solution tends to the classic maximum entropy solution in the L2 norm. The use of a spline prior is also shown. (author)

  11. Probability and Statistics in Aerospace Engineering

    Science.gov (United States)

    Rheinfurth, M. H.; Howell, L. W.

    1998-01-01

    This monograph was prepared to give the practicing engineer a clear understanding of probability and statistics with special consideration to problems frequently encountered in aerospace engineering. It is conceived to be both a desktop reference and a refresher for aerospace engineers in government and industry. It could also be used as a supplement to standard texts for in-house training courses on the subject.

  12. Statistical models based on conditional probability distributions

    International Nuclear Information System (INIS)

    Narayanan, R.S.

    1991-10-01

    We present a formulation of statistical mechanics models based on conditional probability distributions rather than a Hamiltonian. We show that it is possible to realize critical phenomena through this procedure. Closely linked with this formulation is a Monte Carlo algorithm in which a generated configuration is guaranteed to be statistically independent from any other configuration for all values of the parameters, in particular near the critical point. (orig.)

  13. Marrakesh International Conference on Probability and Statistics

    CERN Document Server

    Ouassou, Idir; Rachdi, Mustapha

    2015-01-01

    This volume, which highlights recent advances in statistical methodology and applications, is divided into two main parts. The first part presents theoretical results on estimation techniques in functional statistics, while the second examines three key areas of application: estimation problems in queuing theory, an application in signal processing, and the copula approach to epidemiologic modelling. The book’s peer-reviewed contributions are based on papers originally presented at the Marrakesh International Conference on Probability and Statistics held in December 2013.

  14. Selected papers on analysis, probability, and statistics

    CERN Document Server

    Nomizu, Katsumi

    1994-01-01

    This book presents papers that originally appeared in the Japanese journal Sugaku. The papers fall into the general area of mathematical analysis as it pertains to probability and statistics, dynamical systems, differential equations and analytic function theory. Among the topics discussed are: stochastic differential equations, spectra of the Laplacian and Schrödinger operators, nonlinear partial differential equations which generate dissipative dynamical systems, fractal analysis on self-similar sets and the global structure of analytic functions.

  15. Clan structure analysis and rapidity gap probability

    International Nuclear Information System (INIS)

    Lupia, S.; Giovannini, A.; Ugoccioni, R.

    1995-01-01

    Clan structure analysis in rapidity intervals is generalized from the negative binomial multiplicity distribution to the wide class of compound Poisson distributions. The link of generalized clan structure analysis with correlation functions is also established. These theoretical results are then applied to minimum bias events and reveal new and interesting features, which can be inspiring and useful for discussing data on rapidity gap probabilities at TEVATRON and HERA. (orig.)
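
    A concrete instance of the clan picture behind this analysis: the negative binomial arises when a Poisson number of clans each fragments into a logarithmically distributed number of particles, which makes it a compound Poisson distribution. A small sampler sketch (parameters illustrative):

```python
import math
import random

def sample_logarithmic(p):
    """Sample K ~ Logarithmic(p): P(K=k) = -p**k / (k*log(1-p)), k >= 1."""
    u = random.random()
    k = 1
    pk = -p / math.log(1.0 - p)
    cum = pk
    while u > cum:
        k += 1
        pk *= p * (k - 1) / k
        cum += pk
    return k

def sample_multiplicity(n_clans_mean, p):
    """Clan picture of the negative binomial: a Poisson number of clans,
    each fragmenting into a logarithmic number of particles; the total
    multiplicity is then negative-binomially distributed."""
    L = math.exp(-n_clans_mean)          # Knuth's Poisson sampler
    n_clans, prod = 0, random.random()
    while prod > L:
        prod *= random.random()
        n_clans += 1
    return sum(sample_logarithmic(p) for _ in range(n_clans))

counts = [sample_multiplicity(3.0, 0.6) for _ in range(50_000)]
print(sum(counts) / len(counts))  # ~ <clans> x <particles per clan> ~ 4.9
```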

  16. Clan structure analysis and rapidity gap probability

    Energy Technology Data Exchange (ETDEWEB)

    Lupia, S. [Turin Univ. (Italy). Ist. di Fisica Teorica]|[Istituto Nazionale di Fisica Nucleare, Turin (Italy); Giovannini, A. [Turin Univ. (Italy). Ist. di Fisica Teorica]|[Istituto Nazionale di Fisica Nucleare, Turin (Italy); Ugoccioni, R. [Turin Univ. (Italy). Ist. di Fisica Teorica]|[Istituto Nazionale di Fisica Nucleare, Turin (Italy)

    1995-03-01

    Clan structure analysis in rapidity intervals is generalized from the negative binomial multiplicity distribution to the wide class of compound Poisson distributions. The link of generalized clan structure analysis with correlation functions is also established. These theoretical results are then applied to minimum bias events and reveal new and interesting features, which can be inspiring and useful for discussing data on rapidity gap probabilities at TEVATRON and HERA. (orig.)

  17. Introduction to tensorial resistivity probability tomography

    OpenAIRE

    Mauriello, Paolo; Patella, Domenico

    2005-01-01

    The probability tomography approach developed for the scalar resistivity method is extended here to the 2D tensorial apparent resistivity acquisition mode. The rotational invariant derived from the trace of the apparent resistivity tensor is considered, since on the datum plane it yields anomalies confined above the buried objects. First, a departure function is introduced as the difference between the tensorial invariant measured over the real structure and that computed for a reference uni...

  18. Interaction probability value calculi for some scintillators

    International Nuclear Information System (INIS)

    Garcia-Torano Martinez, E.; Grau Malonda, A.

    1989-01-01

    Interaction probabilities for 17 gamma-ray energies between 1 and 1000 keV have been computed and tabulated. The tables may be applied to the case of cylindrical vials with a radius of 1.25 cm and volumes of 5, 10 and 15 ml. Toluene, Toluene/Alcohol, Dioxane-Naphthalene, PCS, INSTAGEL and HISAFE II scintillators are considered. Graphical results for 10 ml are also given. (Author) 11 refs

  19. Probability of collective excited state decay

    International Nuclear Information System (INIS)

    Manykin, Eh.A.; Ozhovan, M.I.; Poluehktov, P.P.

    1987-01-01

    Decay mechanisms of a condensed excited state formed of highly excited (Rydberg) atoms are considered, i.e., the stability of the so-called Rydberg substance is analyzed. It is shown that Auger recombination and radiative transitions are the basic processes. The corresponding probabilities are calculated and compared. It is ascertained that the "Rydberg substance" possesses a macroscopic lifetime (several seconds) and is, in a sense, metastable.

  20. SureTrak Probability of Impact Display

    Science.gov (United States)

    Elliott, John

    2012-01-01

    The SureTrak Probability of Impact Display software was developed for use during rocket launch operations. The software displays probability of impact information for each ship near the hazardous area during the time immediately preceding the launch of an unguided vehicle. Wallops range safety officers need to be sure that the risk to humans is below a certain threshold during each use of the Wallops Flight Facility Launch Range. Under the variable conditions that can exist at launch time, the decision to launch must be made in a timely manner to ensure a successful mission while not exceeding those risk criteria. Range safety officers need a tool that can give them the needed probability of impact information quickly, and in a format that is clearly understandable. This application is meant to fill that need. The software is a reuse of part of software developed for an earlier project: Ship Surveillance Software System (S4). The S4 project was written in C++ using Microsoft Visual Studio 6. The data structures and dialog templates from it were copied into a new application that calls the implementation of the algorithms from S4 and displays the results as needed. In the S4 software, the list of ships in the area was received from one local radar interface and from operators who entered the ship information manually. The SureTrak Probability of Impact Display application receives ship data from two local radars as well as the SureTrak system, eliminating the need for manual data entry.