WorldWideScience

Sample records for calculating age-conditional probabilities

  1. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    47 CFR 1.1623 (Mass Media Services, General Procedures), Probability calculation: (a) All calculations shall be computed to no less than three significant digits. Probabilities will be truncated to the number of...

  2. Calculation of fractional electron capture probabilities

    CERN Document Server

    Schoenfeld, E

    1998-01-01

    A 'Table of Radionuclides' is being prepared which will supersede the 'Table de Radionucléides' formerly issued by the LMRI/LPRI (France). In this effort it is desirable to have a uniform basis for calculating theoretical values of fractional electron capture probabilities. A table has been compiled which allows one to calculate conveniently and quickly the fractional probabilities P_K, P_L, P_M, P_N and P_O, their ratios and the assigned uncertainties for allowed and non-unique first forbidden electron capture transitions of known transition energy for radionuclides with atomic numbers from Z=3 to 102. These results have been applied to a total of 28 transitions of 14 radionuclides (⁷Be, ²²Na, ⁵¹Cr, ⁵⁴Mn, ⁵⁵Fe, ⁶⁸Ge, ⁶⁸Ga, ⁷⁵Se, ¹⁰⁹Cd, ¹²⁵I, ¹³⁹Ce, ¹⁶⁹Yb, ¹⁹⁷Hg, ²⁰²Tl). The values are in reasonable agreement with measure...

  3. Computational methods for probability of instability calculations

    Science.gov (United States)

    Wu, Y.-T.; Burnside, O. H.

    1990-01-01

    This paper summarizes the development of methods and a computer program to compute the probability of instability of a dynamic system that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria, based upon the roots of the characteristic equation or on Routh-Hurwitz test functions, are investigated. Computational methods based on system reliability analysis methods and importance sampling concepts are proposed to perform efficient probabilistic analysis. Numerical examples are provided to demonstrate the methods.
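
    The abstract does not give the authors' algorithm in detail, so the following is only a minimal Python sketch of the general idea: sample an uncertain coefficient of a single second-order equation and check the roots of its characteristic equation for a positive real part. All parameter values are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(0)

        def is_unstable(m, c, k):
            # Characteristic equation of m*x'' + c*x' + k*x = 0 is
            # m*s^2 + c*s + k = 0; unstable if any root has positive real part.
            roots = np.roots([m, c, k])
            return np.any(roots.real > 0.0)

        # Treat the damping coefficient as uncertain (normal around a small mean)
        # and estimate P(instability) by direct Monte Carlo sampling.
        n_samples = 100_000
        c_samples = rng.normal(loc=0.05, scale=0.10, size=n_samples)
        p_unstable = np.mean([is_unstable(1.0, c, 4.0) for c in c_samples])
        print(f"Estimated probability of instability: {p_unstable:.4f}")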

  4. Probability calculations for three-part mineral resource assessments

    Science.gov (United States)

    Ellefsen, Karl J.

    2017-06-27

    Three-part mineral resource assessment is a methodology for predicting, in a specified geographic region, both the number of undiscovered mineral deposits and the amount of mineral resources in those deposits. These predictions are based on probability calculations performed with newly implemented computer software. Compared to the previous implementation, the new implementation includes new features for the probability calculations themselves and for checks of those calculations. The development of the new implementation led to a new understanding of the probability calculations, namely of the assumptions inherent in them. Several assumptions strongly affect the mineral resource predictions, so it is crucial that they are checked during an assessment. The evaluation of the new implementation leads to new findings about the probability calculations, namely findings regarding the precision of the computations, the computation time, and the sensitivity of the calculation results to the input.
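
    As a rough illustration of the kind of probability calculation involved (not the authors' algorithm itself), the Python sketch below combines a hypothetical elicited distribution for the number of undiscovered deposits with an assumed lognormal tonnage model by Monte Carlo simulation; all numbers are invented.

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical elicited probabilities for the number of undiscovered deposits.
        n_deposits = np.array([0, 1, 2, 5, 10])
        p_deposits = np.array([0.2, 0.3, 0.25, 0.15, 0.1])

        n_trials = 50_000
        totals = np.empty(n_trials)
        for i in range(n_trials):
            n = rng.choice(n_deposits, p=p_deposits)
            # Tonnage per deposit drawn from an assumed lognormal grade-tonnage model.
            totals[i] = rng.lognormal(mean=np.log(5e5), sigma=1.0, size=n).sum()

        # Empirical distribution of total undiscovered resource.
        for q in (0.1, 0.5, 0.9):
            print(f"P{int(q*100)} total tonnage: {np.quantile(totals, q):,.0f} t")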

  5. Fostering Positive Attitude in Probability Learning Using Graphing Calculator

    Science.gov (United States)

    Tan, Choo-Kim; Harji, Madhubala Bava; Lau, Siong-Hoe

    2011-01-01

    Although a plethora of research evidence highlights positive and significant outcomes of the incorporation of the Graphing Calculator (GC) in mathematics education, its use in the teaching and learning process appears to be limited. The obvious need to revisit the teaching and learning of Probability has resulted in this study, i.e. to incorporate…

  6. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    The purpose of this article is to give a new approach to calculating the probability of returning a loan. Many factors affect the value of this probability, and in this article some influencing factors are identified using statistical and econometric models. The main approach applies probit and logit models in loan management institutions, giving a new aspect to credit risk analysis. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable "the probability of returning a loan". It is shown that the month of signing a contract, the year of signing a contract, and the gender and age of the loan owner do not affect the probability of returning a loan, while the probability does depend on the sum of the contract, the remoteness of the loan owner, and the month of birth. The probability of returning a loan increases with the given sum, decreases with the proximity of the customer, and is higher for people born at the beginning of the year and lower for people born at the end of the year.
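
    A minimal sketch of the kind of binary logistic model described here, using synthetic data and the statsmodels package; the variable names, coefficients, and data are invented, and the article's actual dataset and estimates are not reproduced.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        n = 2000

        # Synthetic borrower/contract data (invented for illustration).
        loan_sum = rng.uniform(1_000, 50_000, n)
        birth_month = rng.integers(1, 13, n)
        remoteness_km = rng.uniform(0, 300, n)

        # Assumed "true" model: repayment odds rise with sum and remoteness,
        # fall for borrowers born later in the year (as the article reports).
        lin = -1.0 + 4e-5 * loan_sum + 0.004 * remoteness_km - 0.05 * birth_month
        y = (rng.random(n) < 1.0 / (1.0 + np.exp(-lin))).astype(float)

        X = sm.add_constant(np.column_stack([loan_sum, birth_month, remoteness_km]))
        fit = sm.Logit(y, X).fit(disp=False)
        print(fit.summary(xname=["const", "loan_sum", "birth_month", "remoteness_km"]))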

  7. Calculating the probability of detecting radio signals from alien civilizations

    Science.gov (United States)

    Horvat, Marko

    2006-09-01

    Although it might not be self-evident, it is in fact entirely possible to calculate the probability of detecting alien radio signals by understanding what types of extraterrestrial radio emissions can be expected and what properties these emissions can have. Using the Drake equation as the obvious starting point, and logically identifying and enumerating the constraints of interstellar radio communication, yields an estimate of the probability of detecting a genuine alien radio signal.

  8. Application of random match probability calculations to mixed STR profiles.

    Science.gov (United States)

    Bille, Todd; Bright, Jo-Anne; Buckleton, John

    2013-03-01

    Mixed DNA profiles are being encountered more frequently as laboratories analyze increasing amounts of touch evidence. If it is determined that an individual could be a possible contributor to the mixture, it is necessary to perform a statistical analysis to allow an assignment of weight to the evidence. Currently, the combined probability of inclusion (CPI) and the likelihood ratio (LR) are the most commonly used methods to perform the statistical analysis. A third method, random match probability (RMP), is available. This article compares the advantages and disadvantages of the CPI and LR methods to the RMP method. We demonstrate that although the LR method is still considered the most powerful of the binary methods, the RMP and LR methods make similar use of the observed data, such as peak height, assumed number of contributors, and known contributors, whereas the CPI calculation tends to waste information and be less informative. © 2013 American Academy of Forensic Sciences.

  9. Exact numerical calculation of fixation probability and time on graphs.

    Science.gov (United States)

    Hindersin, Laura; Möller, Marius; Traulsen, Arne; Bauer, Benedikt

    2016-12-01

    The Moran process on graphs is a popular model to study the dynamics of evolution in a spatially structured population. Exact analytical solutions for the fixation probability and time of a new mutant have been found for only a few classes of graphs so far. Simulations are time-expensive and many realizations are necessary, as the variance of the fixation times is high. We present an algorithm that numerically computes these quantities for arbitrary small graphs by an approach based on the transition matrix. The advantage over simulations is that the calculation has to be executed only once. Building the transition matrix is automated by our algorithm. This enables a fast and interactive study of different graph structures and their effect on fixation probability and time. We provide a fast implementation in C with this note (Hindersin et al., 2016). Our code is very flexible, as it can handle two different update mechanisms (Birth-death or death-Birth), as well as arbitrary directed or undirected graphs. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
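
    The authors provide their implementation in C; the following Python sketch illustrates the same transition-matrix idea for a Birth-death update with mutant fitness r on an arbitrary small undirected graph, solving the linear system for the fixation probability of a single randomly placed mutant (details such as update rules and normalization follow the description in the abstract, not the authors' code).

        import itertools
        import numpy as np
        import networkx as nx

        def fixation_probability(G, r=1.1):
            # Exact fixation probability of a Birth-death Moran process on graph G,
            # computed from the full transition matrix over all 2^N mutant sets.
            nodes = list(G.nodes)
            n = len(nodes)
            states = [frozenset(s) for k in range(n + 1)
                      for s in itertools.combinations(nodes, k)]
            index = {s: i for i, s in enumerate(states)}
            A = np.eye(len(states))
            b = np.zeros(len(states))
            for s in states:
                i = index[s]
                if len(s) == n:          # all mutants: fixation (absorbing)
                    b[i] = 1.0
                    continue
                if len(s) == 0:          # extinction (absorbing)
                    continue
                total_f = sum(r if v in s else 1.0 for v in nodes)
                for v in nodes:          # v reproduces with prob ~ fitness
                    fv = r if v in s else 1.0
                    nbrs = list(G.neighbors(v))
                    for w in nbrs:       # offspring replaces a random neighbor w
                        p = (fv / total_f) * (1.0 / len(nbrs))
                        t = s | {w} if v in s else s - {w}
                        A[i, index[t]] -= p
            x = np.linalg.solve(A, b)
            # Average over uniformly chosen single-mutant starting nodes.
            return np.mean([x[index[frozenset([v])]] for v in nodes])

        print(fixation_probability(nx.cycle_graph(5), r=1.1))

    For the 5-cycle this reproduces the well-mixed Moran result (1 - 1/r)/(1 - 1/r⁵) ≈ 0.240, as expected for a regular graph under Birth-death updating.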

  10. Probability density functions for use when calculating standardised drought indices

    Science.gov (United States)

    Svensson, Cecilia; Prosdocimi, Ilaria; Hannaford, Jamie

    2015-04-01

    Time series of drought indices like the standardised precipitation index (SPI) and standardised flow index (SFI) require a statistical probability density function to be fitted to the observed (generally monthly) precipitation and river flow data. Once fitted, the quantiles are transformed to a Normal distribution with mean = 0 and standard deviation = 1. These transformed data are the SPI/SFI, which are widely used in drought studies, including for drought monitoring and early warning applications. Different distributions were fitted to rainfall and river flow data accumulated over 1, 3, 6 and 12 months for 121 catchments in the United Kingdom. These catchments represent a range of catchment characteristics in a mid-latitude climate. Both rainfall and river flow data have a lower bound at 0, as rains and flows cannot be negative. Their empirical distributions also tend to have positive skewness, and therefore the Gamma distribution has often been a natural and suitable choice for describing the data statistically. However, after transformation of the data to Normal distributions to obtain the SPIs and SFIs for the 121 catchments, the distributions are rejected in 11% and 19% of cases, respectively, by the Shapiro-Wilk test. Three-parameter distributions traditionally used in hydrological applications, such as the Pearson type 3 for rainfall and the Generalised Logistic and Generalised Extreme Value distributions for river flow, tend to make the transformed data fit better, with rejection rates of 5% or less. However, none of these three-parameter distributions have a lower bound at zero. This means that the lower tail of the fitted distribution may potentially go below zero, which would result in a lower limit to the calculated SPI and SFI values (as observations can never reach into this lower tail of the theoretical distribution). The Tweedie distribution can overcome the problems found when using either the Gamma or the above three-parameter distributions. The…
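
    A compact sketch of the standard SPI construction discussed here, using scipy with synthetic gamma-distributed "precipitation" (the paper's catchment data are not reproduced):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)

        # Monthly precipitation accumulations (synthetic, gamma-like, in mm).
        precip = rng.gamma(shape=2.0, scale=30.0, size=360)

        # Fit a Gamma distribution (location fixed at the lower bound, zero),
        # then transform the fitted quantiles to a standard Normal: this is the SPI.
        shape, loc, scale = stats.gamma.fit(precip, floc=0.0)
        quantiles = stats.gamma.cdf(precip, shape, loc=loc, scale=scale)
        spi = stats.norm.ppf(quantiles)

        print(f"SPI mean {spi.mean():.3f}, std {spi.std():.3f}")  # ~0 and ~1 if the fit is good
        # A goodness check in the spirit of the paper: Shapiro-Wilk on the transformed data.
        print(stats.shapiro(spi))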

  11. Flipping Out: Calculating Probability with a Coin Game

    Science.gov (United States)

    Degner, Kate

    2015-01-01

    In the author's experience with this activity, students struggle with the idea of representativeness in probability. Therefore, this student misconception is part of the classroom discussion about the activities in this lesson. Representativeness is related to the (incorrect) idea that outcomes that seem more random are more likely to happen. This…

  12. A Failure Probability Calculation Method for Power Equipment Based on Multi-Characteristic Parameters

    Directory of Open Access Journals (Sweden)

    Hang Liu

    2017-05-01

    Although traditional fault diagnosis methods can qualitatively identify the failure modes of power equipment, it is difficult to evaluate the failure probability quantitatively. In this paper, a failure probability calculation method for power equipment based on multi-characteristic parameters is proposed. After collecting the historical data of different fault characteristic parameters, the distribution functions and the cumulative distribution functions of each parameter, which are applied to discretize the parameters and calculate the differential warning values, are calculated using the two-parameter Weibull model. To calculate the membership functions of the parameters for each failure mode, the Apriori algorithm is chosen to mine the association rules between parameters and failure modes. After that, the failure probability of each failure mode is obtained by integrating the membership functions of different parameters by a weighted method, where the weight of each parameter is calculated from the differential warning values. According to the failure probability calculation result, a series model is established to estimate the failure probability of the equipment. Finally, an application example for two 220 kV transformers is presented to show the detailed process of the method. Compared with traditional fault diagnosis methods, the calculation results not only identify the failure modes correctly but also accurately reflect the trend of the equipment's failure probability.

  13. Exact calculation of loop formation probability identifies folding motifs in RNA secondary structures.

    Science.gov (United States)

    Sloma, Michael F; Mathews, David H

    2016-12-01

    RNA secondary structure prediction is widely used to analyze RNA sequences. In an RNA partition function calculation, free energy nearest neighbor parameters are used in a dynamic programming algorithm to estimate statistical properties of the secondary structure ensemble. Previously, partition functions have largely been used to estimate the probability that a given pair of nucleotides form a base pair, the conditional stacking probability, the accessibility to binding of a continuous stretch of nucleotides, or a representative sample of RNA structures. Here it is demonstrated that an RNA partition function can also be used to calculate the exact probability of formation of hairpin loops, internal loops, bulge loops, or multibranch loops at a given position. This calculation can also be used to estimate the probability of formation of specific helices. Benchmarking on a set of RNA sequences with known secondary structures indicated that loops that were calculated to be more probable were more likely to be present in the known structure than less probable loops. Furthermore, highly probable loops are more likely to be in the known structure than the set of loops predicted in the lowest free energy structures. © 2016 Sloma and Mathews; Published by Cold Spring Harbor Laboratory Press for the RNA Society.

  14. A web-based interface to calculate phonotactic probability for words and nonwords in English.

    Science.gov (United States)

    Vitevitch, Michael S; Luce, Paul A

    2004-08-01

    Phonotactic probability refers to the frequency with which phonological segments and sequences of phonological segments occur in words in a given language. We describe one method of estimating phonotactic probabilities based on words in American English. These estimates of phonotactic probability have been used in a number of previous studies and are now being made available to other researchers via a Web-based interface. Instructions for using the interface, as well as details regarding how the measures were derived, are provided in the present article. The Phonotactic Probability Calculator can be accessed at http://www.people.ku.edu/~mvitevit/PhonoProbHome.html.
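
    A toy Python sketch of the general approach (position-specific segment and biphone probabilities estimated from a corpus); the actual calculator uses a large frequency-weighted American English dictionary and its own counting conventions, so this is illustrative only.

        from collections import Counter

        # Toy corpus of phonemically transcribed words (one symbol per segment).
        # Real calculators use large dictionaries weighted by word frequency.
        corpus = ["kat", "kab", "bat", "tak", "kit", "bit"]

        max_len = max(map(len, corpus))
        pos_counts = [Counter() for _ in range(max_len)]
        biphone_counts = [Counter() for _ in range(max_len - 1)]
        for word in corpus:
            for i, seg in enumerate(word):
                pos_counts[i][seg] += 1
            for i in range(len(word) - 1):
                biphone_counts[i][word[i:i + 2]] += 1

        def phonotactic_probability(word):
            # Sum of position-specific segment probabilities plus sum of
            # position-specific biphone probabilities.
            p_seg = sum(pos_counts[i][s] / sum(pos_counts[i].values())
                        for i, s in enumerate(word))
            p_bi = sum(biphone_counts[i][word[i:i + 2]] / sum(biphone_counts[i].values())
                       for i in range(len(word) - 1))
            return p_seg, p_bi

        print(phonotactic_probability("kat"))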

  15. Calculation of the spatial distribution of defects and cascade- probability functions in the materials

    Science.gov (United States)

    Kupchishin, A. I.; Kupchishin, A. A.; Shmygalev, E. V.; Shmygaleva, T. A.; Tlebaev, K. B.

    2014-11-01

    In this article we calculated the depth distribution of implanted arsenic and indium ions, the energy losses, and the cascade-probability functions in silicon. The calculations are in satisfactory agreement with the experimental data. Computer simulation and analysis of the ion characteristics as functions of penetration depth and number of interactions were also carried out.

  16. A study of the absence of arbitrage opportunities without calculating the risk-neutral probability

    Directory of Open Access Journals (Sweden)

    Dani S.

    2016-12-01

    In this paper, we establish the property of conditional full support for two processes: the Ornstein-Uhlenbeck process and the stochastic integral with the Brownian bridge as integrator, and we establish the absence of arbitrage opportunities without calculating the risk-neutral probability.

  17. Assigning stereochemistry to single diastereoisomers by GIAO NMR calculation: the DP4 probability.

    Science.gov (United States)

    Smith, Steven G; Goodman, Jonathan M

    2010-09-22

    GIAO NMR shift calculation has been applied to the challenging task of reliably assigning stereochemistry with quantifiable confidence when only one set of experimental data is available. We have compared several approaches for assigning a probability to each candidate structure and have tested the ability of these methods to distinguish up to 64 possible diastereoisomers of 117 different molecules, using NMR shifts obtained in rapid and computationally inexpensive single-point calculations on molecular mechanics geometries without time-consuming ab initio geometry optimization. We show that a probability analysis based on the errors in each (13)C or (1)H shift is significantly more successful at making correct assignments with high confidence than are probabilities based on the correlation coefficient and mean absolute error parameters. Our new probability measure, which we have termed DP4, complements the probabilities obtained from our previously developed CP3 parameter, which applies to the case of assigning a pair of diastereoisomers when one has both experimental data sets. We illustrate the application of DP4 to assigning the stereochemistry or structure of 21 natural products that were originally misassigned in the literature or that required extensive synthesis of diastereoisomers to establish their stereochemistry.
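
    A sketch of the flavor of the calculation in Python: each candidate's shift errors are scored with a Student-t model and candidate probabilities follow from Bayes' theorem with equal priors. The sigma and nu values are the published 13C DP4 parameters as best recalled here and should be checked against the paper; all shift values are invented.

        import numpy as np
        from scipy import stats

        def dp4_probabilities(calc_shifts, exp_shifts, sigma=2.306, nu=11.38):
            # calc_shifts: list of arrays, one per candidate stereoisomer
            # (scaled, computed 13C shifts); exp_shifts: measured 13C shifts.
            # sigma/nu: assumed error-distribution parameters; verify before use.
            likelihoods = []
            for calc in calc_shifts:
                errors = np.asarray(calc) - np.asarray(exp_shifts)
                likelihoods.append(np.prod(stats.t.pdf(errors / sigma, df=nu)))
            likelihoods = np.array(likelihoods)
            return likelihoods / likelihoods.sum()   # equal priors over candidates

        exp = [71.2, 35.6, 22.1]
        candA = [71.8, 35.1, 22.4]   # small errors
        candB = [69.0, 38.2, 20.0]   # larger errors
        print(dp4_probabilities([candA, candB], exp))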

  18. Calculation of ruin probabilities for a dense class of heavy tailed distributions

    DEFF Research Database (Denmark)

    Bladt, Mogens; Nielsen, Bo Friis; Samorodnitsky, Gennady

    2015-01-01

    In this paper, we propose a class of infinite-dimensional phase-type distributions with finitely many parameters as models for heavy tailed distributions. The class of finite-dimensional phase-type distributions is dense in the class of distributions on the positive reals and may hence approximate any such distribution. We prove that formulas from renewal theory, with particular attention to ruin probabilities, which are true for common phase-type distributions also hold true for the infinite-dimensional case. We provide algorithms for calculating functionals of interest … of distributions with a slowly varying tail. An example from risk theory, comparing ruin probabilities for a classical risk process with Pareto distributed claim sizes, is presented and exact known ruin probabilities for the Pareto case are compared to the ones obtained by approximating by an infinite…

  19. Efficient calculation of steady state probability distribution for stochastic biochemical reaction network.

    Science.gov (United States)

    Karim, Shahriar; Buzzard, Gregery T; Umulis, David M

    2012-01-01

    The Steady State (SS) probability distribution is an important quantity needed to characterize the steady state behavior of many stochastic biochemical networks. In this paper, we propose an efficient and accurate approach to calculating an approximate SS probability distribution from solution of the Chemical Master Equation (CME) under the assumption of the existence of a unique deterministic SS of the system. To find the approximate solution to the CME, a truncated state-space representation is used to reduce the state-space of the system and translate it to a finite dimension. The subsequent ill-posed eigenvalue problem of a linear system for the finite state-space can be converted to a well-posed system of linear equations and solved. The proposed strategy yields efficient and accurate estimation of noise in stochastic biochemical systems. To demonstrate the approach, we applied the method to characterize the noise behavior of a set of biochemical networks of ligand-receptor interactions for Bone Morphogenetic Protein (BMP) signaling. We found that recruitment of type II receptors during receptor oligomerization does not by itself tend to lower noise in receptor signaling, but regulation by a secreted co-factor may provide a substantial improvement in signaling relative to noise. The steady state probability approximation method shortened the time necessary to calculate the probability distributions compared to earlier approaches, such as Gillespie's Stochastic Simulation Algorithm (SSA), while maintaining high accuracy.
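
    A minimal stand-in for the approach, in Python: truncate the state space of a birth-death chemical master equation, then solve the (otherwise singular) stationarity equations with the normalization constraint substituted in, as the abstract describes. The reaction network and rates are invented for illustration.

        import numpy as np

        # CME for a birth-death process (production rate k, degradation rate g*n),
        # truncated at n_max.
        k, g, n_max = 10.0, 1.0, 60
        Q = np.zeros((n_max + 1, n_max + 1))
        for n in range(n_max + 1):
            if n < n_max:
                Q[n, n + 1] = k          # birth: n -> n+1
            if n > 0:
                Q[n, n - 1] = g * n      # death: n -> n-1
            Q[n, n] = -Q[n].sum()

        # Steady state solves pi Q = 0 with sum(pi) = 1; replacing one balance
        # equation by the normalization turns the singular system into a
        # well-posed one.
        A = Q.T.copy()
        A[-1, :] = 1.0
        b = np.zeros(n_max + 1)
        b[-1] = 1.0
        pi = np.linalg.solve(A, b)
        print(f"mean = {pi @ np.arange(n_max + 1):.3f} (Poisson k/g = {k/g:.3f})")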

  20. Efficient Probability of Failure Calculations for QMU using Computational Geometry LDRD 13-0144 Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Scott A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ebeida, Mohamed Salah [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Romero, Vicente J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rushdi, Ahmad A. [Univ. of Texas, Austin, TX (United States); Abdelkader, Ahmad [Univ. of Maryland, College Park, MD (United States)

    2015-09-01

    This SAND report summarizes our work on the Sandia National Laboratory LDRD project titled "Efficient Probability of Failure Calculations for QMU using Computational Geometry" which was project #165617 and proposal #13-0144. This report merely summarizes our work. Those interested in the technical details are encouraged to read the full published results, and contact the report authors for the status of the software and follow-on projects.

  1. Effects of subpopulation structure on probability calculations of DNA profiles from forensic PCR analysis.

    Science.gov (United States)

    Gallo, J C; Thomas, E; Novick, G E; Herrera, R J

    1997-01-01

    DNA typing for forensic identification is a two-step process. The first step involves determining the profiles of samples collected at the crime scene and comparing them with the profiles obtained from suspects and the victims. In the case of a match that includes the suspect as the potential source of the material collected at the crime scene, the last step in the process is to answer the question: what is the likelihood that someone in addition to the suspect could match the profile of the sample studied? This likelihood is calculated by determining the frequency of the suspect's profile in the relevant population databases. The design of forensic databases and the criteria for comparison have been addressed by the NRC report of 1996 (National Research Council, 1996). However, the fact that geographical proximity, migrational patterns, and even cultural and social practices have effects on subpopulation structure establishes the grounds for further study into its effects on the calculation of probability of occurrence values. The issue becomes more relevant in the case of discrete polymorphic markers that show higher probability of occurrence in the reference populations, where several orders of magnitude of difference between the databases may have an impact on the jury. In this study, we calculated G values for all possible pairwise comparisons of allelic frequencies in the different databases from the races or subpopulations examined. In addition, we analyzed a set of 24 unrelated Caucasian, 37 unrelated African-American, and 96 unrelated Sioux/Chippewa individuals for seven polymorphic loci (DQA1, LDLR, GYPA, HBGG, D7S8, GC, and D1S80). All three sets of individuals were sampled from Minnesota. The probabilities of occurrence for all seven loci were calculated with respect to nine different databases: Caucasian, Arabic, Korean, Sioux/Chippewa, Navajo, Pueblo, African American, Southeastern Hispanic, and Southwestern Hispanic. Analysis of the results demonstrated…

  2. Impact of temporal probability in 4D dose calculation for lung tumors.

    Science.gov (United States)

    Rouabhi, Ouided; Ma, Mingyu; Bayouth, John; Xia, Junyi

    2015-11-08

    The purpose of this study was to evaluate the dosimetric uncertainty in 4D dose calculation using three temporal probability distributions: uniform distribution, sinusoidal distribution, and patient-specific distribution derived from the patient respiratory trace. Temporal probability, defined as the fraction of time a patient spends in each respiratory amplitude, was evaluated in nine lung cancer patients. Four-dimensional computed tomography (4D CT), along with deformable image registration, was used to compute 4D dose incorporating the patient's respiratory motion. First, the dose of each of 10 phase CTs was computed using the same planning parameters as those used in 3D treatment planning based on the breath-hold CT. Next, deformable image registration was used to deform the dose of each phase CT to the breath-hold CT using the deformation map between the phase CT and the breath-hold CT. Finally, the 4D dose was computed by summing the deformed phase doses using their corresponding temporal probabilities. In this study, 4D dose calculated from the patient-specific temporal probability distribution was used as the ground truth. The dosimetric evaluation metrics included: 1) 3D gamma analysis, 2) mean tumor dose (MTD), 3) mean lung dose (MLD), and 4) lung V20. For seven out of nine patients, both uniform and sinusoidal temporal probability dose distributions were found to have an average gamma passing rate > 95% for both the lung and PTV regions. Compared with 4D dose calculated using the patient respiratory trace, doses using uniform and sinusoidal distribution showed a percentage difference on average of -0.1% ± 0.6% and -0.2% ± 0.4% in MTD, -0.2% ± 1.9% and -0.2% ± 1.3% in MLD, 0.09% ± 2.8% and -0.07% ± 1.8% in lung V20, -0.1% ± 2.0% and 0.08% ± 1.34% in lung V10, 0.47% ± 1.8% and 0.19% ± 1.3% in lung V5, respectively. We concluded that four-dimensional dose computed using either a uniform or sinusoidal temporal probability distribution can…
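
    A schematic Python sketch of the final accumulation step, weighting deformed phase doses by temporal probabilities; the arrays and the sinusoidal weighting below are invented, and the clinical pipeline's registration and dose engines are not reproduced.

        import numpy as np

        rng = np.random.default_rng(4)

        # Ten phase doses already deformed onto the breath-hold geometry
        # (synthetic 3D arrays here; in practice these come from the deformable
        # registration step described in the abstract).
        phase_doses = [50.0 + rng.normal(0, 1.0, (32, 32, 16)) for _ in range(10)]

        def accumulate_4d(phase_doses, temporal_probability):
            w = np.asarray(temporal_probability, dtype=float)
            w = w / w.sum()                       # fractions of time must sum to 1
            return sum(wi * d for wi, d in zip(w, phase_doses))

        uniform = np.full(10, 0.1)
        # Crude stand-in for sinusoidal breathing (more time spent near exhale).
        phases = np.linspace(0, 2 * np.pi, 10, endpoint=False)
        sinusoidal = np.abs(np.sin(phases / 2))
        d_uniform = accumulate_4d(phase_doses, uniform)
        d_sin = accumulate_4d(phase_doses, sinusoidal)
        print(f"mean dose difference: {(d_uniform - d_sin).mean():.3f} Gy")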

  3. Probability density of tunneled carrier states near heterojunctions calculated numerically by the scattering method.

    Energy Technology Data Exchange (ETDEWEB)

    Wampler, William R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Myers, Samuel M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Modine, Normand A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-09-01

    The energy-dependent probability density of tunneled carrier states for arbitrarily specified longitudinal potential-energy profiles in planar bipolar devices is numerically computed using the scattering method. Results agree accurately with a previous treatment based on solution of the localized eigenvalue problem, where computation times are much greater. These developments enable quantitative treatment of tunneling-assisted recombination in irradiated heterojunction bipolar transistors, where band offsets may enhance the tunneling effect by orders of magnitude. The calculations also reveal the density of non-tunneled carrier states in spatially varying potentials, and thereby test the common approximation of uniform-bulk values for such densities.

  4. Calculating Absolute Transition Probabilities for Deformed Nuclei in the Rare-Earth Region

    Science.gov (United States)

    Stratman, Anne; Casarella, Clark; Aprahamian, Ani

    2017-09-01

    Absolute transition probabilities are the cornerstone of understanding nuclear structure physics in comparison to nuclear models. We have developed a code to calculate absolute transition probabilities from measured lifetimes, using a Python script and a Mathematica notebook. Both of these methods take pertinent quantities such as the lifetime of a given state, the energy and intensity of the emitted gamma ray, and the multipolarities of the transitions to calculate the appropriate B(E1), B(E2), B(M1) or, in general, any B(σλ) values. The program allows for the inclusion of mixing ratios of different multipolarities and the electron conversion of gamma rays to correct their intensities, and yields results in absolute units or results normalized to Weisskopf units. The code has been tested against available data in a wide range of nuclei from the rare earth region (28 in total), including ¹⁴⁶⁻¹⁵⁴Sm, ¹⁵⁴⁻¹⁶⁰Gd, ¹⁵⁸⁻¹⁶⁴Dy, ¹⁶²⁻¹⁷⁰Er, ¹⁶⁸⁻¹⁷⁶Yb, and ¹⁷⁴⁻¹⁸²Hf. It will be available from the Notre Dame Nuclear Science Laboratory webpage for use by the community. This work was supported by the University of Notre Dame College of Science, and by the National Science Foundation, under Contract PHY-1419765.

  5. Quantum wave packet calculation of reaction probabilities, cross sections, and rate constants for the C(¹D) + HD reaction

    Science.gov (United States)

    Gogtas, Fahrettin; Bulut, Niyazi; Akpinar, Sinan

    The time-dependent real wave packet method has been used to study the C(¹D) + HD reaction. The state-to-state and state-to-all reactive scattering probabilities for a broad range of energies are calculated at zero total angular momentum. The probabilities for J > 0 are estimated from accurately computed J = 0 probabilities by using the J-shifting approximation. Integral cross sections over a large energy range and thermal rate constants are calculated.
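
    A toy illustration of the J-shifting step in Python, with an invented J = 0 probability curve and an assumed rotational constant B for the transition state (values are not from the paper).

        import numpy as np

        def p0(E):
            # Toy J = 0 reaction probability: smooth step above a 0.1 eV threshold.
            return np.where(E > 0.1, 1.0 - np.exp(-(E - 0.1) / 0.05), 0.0)

        # J-shifting: evaluate the J = 0 probability at an energy reduced by the
        # rigid-rotor centrifugal term, P_J(E) ~ P_0(E - B*J*(J+1)).
        B = 2.0e-4   # eV, assumed rotational constant

        def p_j(E, J):
            return p0(E - B * J * (J + 1))

        E = 0.25
        partial_wave_sum = sum((2 * J + 1) * p_j(E, J) for J in range(80))
        print(f"partial-wave sum at E = {E} eV: {partial_wave_sum:.1f}")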

  6. Calculating inspector probability of detection using performance demonstration program pass rates

    Science.gov (United States)

    Cumblidge, Stephen; D'Agostino, Amy

    2016-02-01

    The United States Nuclear Regulatory Commission (NRC) staff has been working since the 1970s to ensure that nondestructive testing performed on nuclear power plants in the United States will provide reasonable assurance of structural integrity of the nuclear power plant components. One tool used by the NRC has been the development and implementation of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code Section XI Appendix VIII [1] (Appendix VIII) blind testing requirements for ultrasonic procedures, equipment, and personnel. Some concerns have been raised, over the years, by the relatively low pass rates for the Appendix VIII qualification testing. The NRC staff has applied statistical tools and simulations to determine the expected probability of detection (POD) for ultrasonic examinations under ideal conditions based on the pass rates for the Appendix VIII qualification tests for the ultrasonic testing personnel. This work was primarily performed to answer three questions. First, given a test design and pass rate, what is the expected overall POD for inspectors? Second, can we calculate the probability of detection for flaws of different sizes using this information? Finally, if a previously qualified inspector fails a requalification test, does this call their earlier inspections into question? The calculations have shown that one can expect good performance from inspectors who have passed Appendix VIII testing in a laboratory-like environment, and the requalification pass rates show that the inspectors have maintained their skills between tests. While these calculations showed that the PODs for the ultrasonic inspections are very good under laboratory conditions, the field inspections are conducted in a very different environment. The NRC staff has initiated a project to systematically analyze the human factors differences between qualification testing and field examinations. This work will be used to evaluate and prioritize…
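
    The staff's actual statistical tools are not detailed in this abstract; the Python sketch below shows one simple binomial version of the idea, inverting an observed pass rate to an implied per-flaw POD under an invented test design (n flaws presented, at least k detections required to pass).

        import numpy as np
        from scipy import stats

        n, k = 10, 8   # hypothetical test design

        def pass_probability(pod):
            # For an inspector with per-flaw probability of detection pod,
            # passing is a binomial tail event: P(detections >= k).
            return stats.binom.sf(k - 1, n, pod)

        # Invert numerically: what POD is consistent with an observed pass rate?
        pods = np.linspace(0.0, 1.0, 10_001)
        observed_pass_rate = 0.60
        implied_pod = pods[np.argmin(np.abs(pass_probability(pods) - observed_pass_rate))]
        print(f"pass rate {observed_pass_rate:.0%} implies POD ~ {implied_pod:.3f}")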

  7. Free Probability based Capacity Calculation of Multiantenna Gaussian Fading Channels with Cochannel Interference

    CERN Document Server

    Chatzinotas, Symeon

    2010-01-01

    During the last decade, it has been well understood that communication over multiple antennas can linearly increase the multiplexing capacity gain and provide large spectral efficiency improvements. However, the majority of studies in this area were carried out ignoring cochannel interference. Only a small number of investigations have considered cochannel interference, but even therein simple channel models were employed, assuming identically distributed fading coefficients. In this paper, a generic model for a multi-antenna channel is presented incorporating four impairments, namely additive white Gaussian noise, flat fading, path loss and cochannel interference. Both point-to-point and multiple-access MIMO channels are considered, including the case of cooperating Base Station clusters. The asymptotic capacity limit of this channel is calculated based on an asymptotic free probability approach which exploits the additive and multiplicative free convolution in the R- and S-transform domain respectively, as ...

  8. Improved ischemic stroke outcome prediction using model estimation of outcome probability: the THRIVE-c calculation.

    Science.gov (United States)

    Flint, Alexander C; Rao, Vivek A; Chan, Sheila L; Cullen, Sean P; Faigeles, Bonnie S; Smith, Wade S; Bath, Philip M; Wahlgren, Nils; Ahmed, Niaz; Donnan, Geoff A; Johnston, S Claiborne

    2015-08-01

    The Totaled Health Risks in Vascular Events (THRIVE) score is a previously validated ischemic stroke outcome prediction tool. Although simplified scoring systems like the THRIVE score facilitate ease of use, when computers or devices are available at the point of care, a more accurate and patient-specific estimation of outcome probability should be possible by computing the logistic equation with patient-specific continuous variables. We used data from 12 207 subjects from the Virtual International Stroke Trials Archive and the Safe Implementation of Thrombolysis in Stroke - Monitoring Study to develop and validate the performance of a model-derived estimation of outcome probability, the THRIVE-c calculation. Models were built with logistic regression using the underlying predictors from the THRIVE score: age, National Institutes of Health Stroke Scale score, and the Chronic Disease Scale (presence of hypertension, diabetes mellitus, or atrial fibrillation). Receiver operator characteristic analysis was used to assess model performance and compare the THRIVE-c model to the traditional THRIVE score, using a two-tailed Chi-squared test. The THRIVE-c model performed similarly in the randomly chosen development cohort (n = 6194, area under the curve = 0·786, 95% confidence interval 0·774-0·798) and validation cohort (n = 6013, area under the curve = 0·784, 95% confidence interval 0·772-0·796) (P = 0·79). Similar performance was also seen in two separate external validation cohorts. The THRIVE-c model (area under the curve = 0·785, 95% confidence interval 0·777-0·793) had superior performance when compared with the traditional THRIVE score (area under the curve = 0·746, 95% confidence interval 0·737-0·755) (P < …). By computing the logistic equation with patient-specific continuous variables in the THRIVE-c calculation, outcomes at the individual patient level are more accurately estimated. Given the widespread availability of…

  9. Web Service for Calculating the Probability of Returning a Loan – Design, Implementation and Deployment

    National Research Council Canada - National Science Library

    Julian VASILEV

    2014-01-01

    ... – in credit institutions and banks when giving a loan. This study is the first of its kind to show the design, implementation and deployment of a web service for calculating the probability of returning a loan...

  10. An on-line calculator to compute phonotactic probability and neighborhood density based on child corpora of spoken American English

    Science.gov (United States)

    Storkel, Holly L.; Hoover, Jill R.

    2010-01-01

    An on-line calculator was developed (http://www.bncdnet.ku.edu/cml/info_ccc.vi) to compute phonotactic probability, the likelihood of occurrence of a sound sequence, and neighborhood density, the number of phonologically similar words, based on child corpora of American English (Kolson, 1960; Moe, Hopkins, & Rush, 1982) and compared to an adult calculator. Phonotactic probability and neighborhood density were computed for a set of 380 nouns (Fenson et al., 1993) using both the child and adult corpora. Child and adult raw values were significantly correlated. However, significant differences were detected. Specifically, child phonotactic probability was higher than adult phonotactic probability, especially for high probability words; and child neighborhood density was lower than adult neighborhood density, especially for high density words. These differences were reduced or eliminated when relative measures (i.e., z scores) were used. Suggestions are offered regarding which values to use in future research. PMID:20479181

  11. Beyond DP4: an Improved Probability for the Stereochemical Assignment of Isomeric Compounds using Quantum Chemical Calculations of NMR Shifts.

    Science.gov (United States)

    Grimblat, Nicolás; Zanardi, María M; Sarotti, Ariel M

    2015-12-18

    The DP4 probability is one of the most sophisticated and popular approaches for the stereochemical assignment of organic molecules using GIAO NMR chemical shift calculations when only one set of experimental data is available. In order to improve the performance of the method, we have developed a modified probability (DP4+), whose main differences from the original DP4 are the inclusion of unscaled data and the use of higher levels of theory for the NMR calculation procedure. With these modifications, a significant improvement in the overall performance was achieved, providing accurate and confident results in establishing the stereochemistry of 48 challenging isomeric compounds.

  12. Remark about Transition Probabilities Calculation for Single Server Queues with Lognormal Inter-Arrival or Service Time Distributions

    Science.gov (United States)

    Lee, Moon Ho; Dudin, Alexander; Shaban, Alexy; Pokhrel, Subash Shree; Ma, Wen Ping

    Formulae required for accurate approximate calculation of the transition probabilities of the embedded Markov chain for single-server queues of the GI/M/1, GI/M/1/K, M/G/1, and M/G/1/K type with heavy-tailed lognormal distribution of inter-arrival or service time are given.
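
    For the GI/M/1 case, the required quantities are mixed-Poisson integrals over the inter-arrival distribution: the probability that exactly k exponential services complete during one inter-arrival time T. A Python sketch with an assumed lognormal inter-arrival time (parameters invented); the heavy lognormal tail is what makes careless quadrature inaccurate, hence the generous integration settings.

        import math
        import numpy as np
        from scipy import stats
        from scipy.integrate import quad

        mu = 1.0                                        # exponential service rate
        T = stats.lognorm(s=1.5, scale=math.exp(0.0))   # lognormal inter-arrival time

        def p_k(k):
            # P(k services during an inter-arrival time, server busy throughout)
            #   = integral of exp(-mu*t) * (mu*t)^k / k! dA(t).
            integrand = lambda t: (math.exp(-mu * t) * (mu * t) ** k
                                   / math.factorial(k) * T.pdf(t))
            val, _ = quad(integrand, 0, np.inf, limit=500)
            return val

        probs = [p_k(k) for k in range(8)]
        print(np.round(probs, 5), "tail:", 1 - sum(probs))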

  13. An alternative method for the calculation of joint probability distributions. Application to the expectation of the triplet invariant.

    Science.gov (United States)

    Brosius, J

    2015-01-01

    This paper presents a completely new method for the calculation of expectations (and thus joint probability distributions) of structure factors or phase invariants. As an example, a first approximation of the expectation of the triplet invariant (up to a constant) is given and a complex number is obtained. Instead of considering the atomic vector positions or reciprocal vectors as the fundamental random variables, the method samples over all functions (distributions) with a given number of atoms and given Patterson function. The aim of this paper was to explore the feasibility of the method, so the easiest problem was chosen: the calculation of the expectation value of the triplet invariant in P1. Calculation of the joint probability distribution of the triplet is not performed here but will be done in the future.

  14. An online calculator to compute phonotactic probability and neighborhood density on the basis of child corpora of spoken American English.

    Science.gov (United States)

    Storkel, Holly L; Hoover, Jill R

    2010-05-01

    An online calculator was developed (www.bncdnet.ku.edu/cml/info_ccc.vi) to compute phonotactic probability--the likelihood of occurrence of a sound sequence--and neighborhood density--the number of phonologically similar words--on the basis of child corpora of American English (Kolson, 1960; Moe, Hopkins, & Rush, 1982) and to compare its results to those of an adult calculator. Phonotactic probability and neighborhood density were computed for a set of 380 nouns (Fenson et al., 1993) using both the child and adult corpora. The child and adult raw values were significantly correlated. However, significant differences were detected. Specifically, child phonotactic probability was higher than adult phonotactic probability, especially for high-probability words, and child neighborhood density was lower than adult neighborhood density, especially for words with high-density neighborhoods. These differences were reduced or eliminated when relative measures (i.e., z scores) were used. Suggestions are offered regarding which values to use in future research.

  15. Effects of the Application of Graphing Calculator on Students' Probability Achievement

    Science.gov (United States)

    Tan, Choo-Kim

    2012-01-01

    A Graphing Calculator (GC) is one of the most portable and affordable technology in mathematics education. It quickens the mechanical procedure in solving mathematical problems and creates a highly interactive learning environment, which makes learning a seemingly difficult subject, easy. Since research on the use of GCs for the teaching and…

  16. A Method for Calculating the Probability of Successfully Completing a Rocket Propulsion Ground Test

    Science.gov (United States)

    Messer, Bradley

    2007-01-01

    Propulsion ground test facilities face the daily challenge of scheduling multiple customers into limited facility space and successfully completing their propulsion test projects. Over the last decade NASA's propulsion test facilities have performed hundreds of tests, collected thousands of seconds of test data, and exceeded the capabilities of numerous test facility and test article components. A logistic regression mathematical modeling technique has been developed to predict the probability of successfully completing a rocket propulsion test. A logistic regression model is a mathematical modeling approach that can be used to describe the relationship of several independent predictor variables X_1, X_2, ..., X_k to a binary or dichotomous dependent variable Y, where Y can take only one of two possible outcomes, in this case success or failure in accomplishing a full-duration test. The use of logistic regression modeling is not new; however, modeling propulsion ground test facilities using logistic regression is both a new and unique application of the statistical technique. Results from this type of model provide project managers with insight into and confidence in the effectiveness of rocket propulsion ground testing.

  17. A Web-based interface to calculate phonotactic probability for words and nonwords in Modern Standard Arabic.

    Science.gov (United States)

    Aljasser, Faisal; Vitevitch, Michael S

    2017-03-24

    A number of databases (Storkel, Behavior Research Methods, 45, 1159-1167, 2013) and online calculators (Vitevitch & Luce, Behavior Research Methods, Instruments, and Computers, 36, 481-487, 2004) have been developed to provide statistical information about various aspects of language, and these have proven to be invaluable assets to researchers, clinicians, and instructors in the language sciences. The number of such resources for English is quite large and continues to grow, whereas the number of such resources for other languages is much smaller. This article describes the development of a Web-based interface to calculate phonotactic probability in Modern Standard Arabic (MSA). A full description of how the calculator can be used is provided. It can be freely accessed at http://phonotactic.drupal.ku.edu/.

  18. Lab Retriever: a software tool for calculating likelihood ratios incorporating a probability of drop-out for forensic DNA profiles.

    Science.gov (United States)

    Inman, Keith; Rudin, Norah; Cheng, Ken; Robinson, Chris; Kirschner, Adam; Inman-Semerau, Luke; Lohmueller, Kirk E

    2015-09-18

    Technological advances have enabled the analysis of very small amounts of DNA in forensic cases. However, the DNA profiles from such evidence are frequently incomplete and can contain contributions from multiple individuals. The complexity of such samples confounds the assessment of the statistical weight of such evidence. One approach to account for this uncertainty is to use a likelihood ratio framework to compare the probability of the evidence profile under different scenarios. While researchers favor the likelihood ratio framework, few open-source software solutions with a graphical user interface implementing these calculations are available for practicing forensic scientists. To address this need, we developed Lab Retriever, an open-source, freely available program that forensic scientists can use to calculate likelihood ratios for complex DNA profiles. Lab Retriever adds a graphical user interface, written primarily in JavaScript, on top of a C++ implementation of the previously published R code of Balding. We redesigned parts of the original Balding algorithm to improve computational speed. In addition to incorporating a probability of allelic drop-out and other critical parameters, Lab Retriever computes likelihood ratios for hypotheses that can include up to four unknown contributors to a mixed sample. These computations are completed nearly instantaneously on a modern PC or Mac computer. Lab Retriever provides a practical software solution to forensic scientists who wish to assess the statistical weight of evidence for complex DNA profiles. Executable versions of the program are freely available for Mac OSX and Windows operating systems.

  19. On the calculation of steady-state loss probabilities in the GI/G/2/0 queue

    Directory of Open Access Journals (Sweden)

    Igor N. Kovalenko

    1994-01-01

    This paper considers methods for calculating the steady-state loss probability in the GI/G/2/0 queue. A previous study analyzed this queue in discrete time, which led to an efficient numerical approximation scheme for continuous-time systems. The primary aim of the present work is to provide an alternative approach by analyzing the GI/ME/2/0 queue, i.e., assuming that the service time can be represented by a matrix-exponential distribution. An efficient computational scheme based on this method is developed and some numerical examples are studied. Some comparisons are made with the discrete-time approach, and the two methods are seen to be complementary.
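
    A useful reference point for such calculations: with Poisson arrivals, the loss probability of the two-server loss system is insensitive to the service-time distribution and reduces to the Erlang-B formula, which gives a quick sanity check for any GI/G/2/0 or GI/ME/2/0 scheme. A small Python version:

        def erlang_b(servers, offered_load):
            # Standard Erlang-B recurrence: B(0) = 1,
            # B(k) = a*B(k-1) / (k + a*B(k-1)).
            b = 1.0
            for k in range(1, servers + 1):
                b = offered_load * b / (k + offered_load * b)
            return b

        print(erlang_b(2, 1.0))   # M/G/2/0 with offered load a = 1 erlang -> 0.2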

  20. Calculation of transition probabilities and ac Stark shifts in two-photon laser transitions of antiprotonic helium

    CERN Document Server

    Hori, Masaki

    2010-01-01

    Numerical ab initio variational calculations of the transition probabilities and ac Stark shifts in two-photon transitions of antiprotonic helium atoms driven by two counter-propagating laser beams are presented. We found that sub-Doppler spectroscopy is in principle possible by exciting transitions of the type (n,L) → (n−2,L−2) between antiprotonic states of principal and angular momentum quantum numbers n ≈ L−1 ≈ 35, first by using highly monochromatic, nanosecond laser beams of intensities 10⁴-10⁵ W/cm², and then by tuning the virtual intermediate state close (e.g., within 10-20 GHz) to the real state (n−1,L−1) to enhance the nonlinear transition probability. We expect that ac Stark shifts of a few MHz or more will become an important source of systematic error at fractional precisions of better than a few parts in 10⁹. These shifts can in principle be minimized and even canceled by selecting an optimum combination of laser intensities and frequencies. We simulated the resonance profiles of some two-photon ...

  1. GTNEUT: A code for the calculation of neutral particle transport in plasmas based on the Transmission and Escape Probability method

    Science.gov (United States)

    Mandrekas, John

    2004-08-01

    GTNEUT is a two-dimensional code for the calculation of the transport of neutral particles in fusion plasmas. It is based on the Transmission and Escape Probabilities (TEP) method and can be considered a computationally efficient alternative to traditional Monte Carlo methods. The code has been benchmarked extensively against Monte Carlo and has been used to model the distribution of neutrals in fusion experiments.
    Program summary:
    Title of program: GTNEUT
    Catalogue identifier: ADTX
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADTX
    Computer for which the program is designed and others on which it has been tested: developed on a SUN Ultra 10 workstation; tested on other Unix workstations and PCs
    Operating systems under which the program has been tested: Solaris 8, 9; HP-UX 11i; Linux Red Hat v8.0; Windows NT/2000/XP
    Programming language used: Fortran 77
    Memory required to execute with typical data: 6 219 388 bytes
    No. of bits in a word: 32
    No. of processors used: 1
    Has the code been vectorized or parallelized?: No
    No. of bytes in distributed program, including test data, etc.: 300 709
    No. of lines in distributed program, including test data, etc.: 17 365
    Distribution format: compressed tar gzip file
    Keywords: Neutral transport in plasmas, Escape probability methods
    Nature of physical problem: This code calculates the transport of neutral particles in thermonuclear plasmas in two-dimensional geometric configurations.
    Method of solution: The code is based on the Transmission and Escape Probability (TEP) methodology [1], which is part of the family of integral transport methods for neutral particles and neutrons. The resulting linear system of equations is solved by standard direct linear system solvers (sparse and non-sparse versions are included).
    Restrictions on the complexity of the problem: The current version of the code can…

  2. Internationally comparable diagnosis-specific survival probabilities for calculation of the ICD-10-based Injury Severity Score

    DEFF Research Database (Denmark)

    Gedeborg, R.; Warner, M.; Chen, L. H.

    2014-01-01

    BACKGROUND: The International Statistical Classification of Diseases, 10th Revision (ICD-10)-based Injury Severity Score (ICISS) performs well but requires diagnosis-specific survival probabilities (DSPs), which are empirically derived, for its calculation. The objective was to examine if DSPs based on data pooled from several countries could increase accuracy, precision, utility, and international comparability of DSPs and ICISS. METHODS: Australia, Argentina, Austria, Canada, Denmark, New Zealand, and Sweden provided ICD-10-coded injury hospital discharge data, including in-hospital mortality status. Data from the seven countries were pooled using four different methods to create an international collaborative effort ICISS (ICE-ICISS). The ability of the ICISS to predict mortality using the country-specific DSPs and the pooled DSPs was estimated and compared. RESULTS: The pooled DSPs … generated empirically derived DSPs. These pooled DSPs facilitate international comparisons and enable the use of ICISS in all settings where ICD-10 hospital discharge diagnoses are available. The modest reduction in performance of the ICE-ICISS compared with the country-specific scores is unlikely…
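
    The score itself is simple once DSPs are available: a patient's ICISS is the product of the DSPs of all recorded injury diagnoses. A Python sketch with invented DSP values (real ones are estimated from pooled discharge data):

        import numpy as np

        # Diagnosis-specific survival probabilities (values invented for illustration).
        dsp = {"S06.5": 0.85,   # traumatic subdural haemorrhage
               "S27.0": 0.95,   # traumatic pneumothorax
               "S72.0": 0.98}   # fracture of neck of femur

        def iciss(diagnoses):
            # ICISS = product of the DSPs of all of the patient's diagnoses.
            return float(np.prod([dsp[code] for code in diagnoses]))

        patient = ["S06.5", "S27.0", "S72.0"]
        print(f"ICISS = {iciss(patient):.4f}")   # predicted survival probability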

  3. User’s guide for MapMark4—An R package for the probability calculations in three-part mineral resource assessments

    Science.gov (United States)

    Ellefsen, Karl J.

    2017-06-27

    MapMark4 is a software package that implements the probability calculations in three-part mineral resource assessments. Functions within the software package are written in the R statistical programming language. These functions, their documentation, and a copy of this user’s guide are bundled together in R’s unit of shareable code, which is called a “package.” This user’s guide includes step-by-step instructions showing how the functions are used to carry out the probability calculations. The calculations are demonstrated using test data, which are included in the package.

  4. Relativistic Many-body Møller-Plesset Perturbation Theory Calculations of the Energy Levels and Transition Probabilities in Na- to P-like Xe Ions

    Energy Technology Data Exchange (ETDEWEB)

    Vilkas, M J; Ishikawa, Y; Trabert, E

    2007-03-27

    Relativistic multireference many-body perturbation theory calculations have been performed on Xe⁴³⁺-Xe³⁹⁺ ions, resulting in energy levels, electric dipole transition probabilities, and level lifetimes. The second-order many-body perturbation theory calculation of energy levels included mass shifts, frequency-dependent Breit correction and Lamb shifts. The calculated transition energies and E1 transition rates are used to present synthetic spectra in the extreme ultraviolet range for some of the Xe ions.

  5. Combining scenarios in a calculation of the overall probability distribution of cumulative releases of radioactivity from the Waste Isolation Pilot Plant, southeastern New Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Tierney, M.S.

    1991-11-01

    The Waste Isolation Pilot Plant (WIPP), in southeastern New Mexico, is a research and development facility to demonstrate safe disposal of defense-generated transuranic waste. The US Department of Energy will designate WIPP as a disposal facility if it meets the US Environmental Protection Agency's standard for disposal of such waste; the standard includes a requirement that estimates of cumulative releases of radioactivity to the accessible environment be incorporated in an overall probability distribution. The WIPP Project has chosen an approach to calculation of an overall probability distribution that employs the concept of scenarios for release and transport of radioactivity to the accessible environment. This report reviews the use of Monte Carlo methods in the calculation of an overall probability distribution and presents a logical and mathematical foundation for use of the scenario concept in such calculations. The report also draws preliminary conclusions regarding the shape of the probability distribution for the WIPP system; these preliminary conclusions are based on the possible occurrence of three events and the presence of one feature: namely, the events "attempted boreholes over rooms and drifts," "mining alters ground-water regime," and "water-withdrawal wells provide alternate pathways," and the feature "brine pocket below room or drift." The WIPP system's overall probability distribution was calculated for only five of the sixteen possible scenario classes that can be obtained by combining the four postulated events and features.
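
    A schematic Python sketch of the scenario-combination idea: the overall complementary cumulative distribution function (CCDF) of cumulative release is the probability-weighted mixture of conditional scenario CCDFs, P(R > r) = sum over s of P(s) * P(R > r | s). Scenario names, probabilities, and release distributions below are invented, not taken from the WIPP analysis.

        import numpy as np

        rng = np.random.default_rng(5)

        # Hypothetical scenario probabilities and lognormal conditional releases.
        scenarios = {
            "undisturbed":        (0.90, dict(mean=np.log(0.001), sigma=1.0)),
            "borehole intrusion": (0.08, dict(mean=np.log(0.1),   sigma=1.5)),
            "mining + wells":     (0.02, dict(mean=np.log(1.0),   sigma=1.0)),
        }

        r_grid = np.logspace(-5, 2, 200)
        ccdf = np.zeros_like(r_grid)
        for name, (p_s, lognorm_args) in scenarios.items():
            releases = rng.lognormal(size=100_000, **lognorm_args)
            cond_ccdf = (releases[None, :] > r_grid[:, None]).mean(axis=1)
            ccdf += p_s * cond_ccdf   # probability-weighted mixture

        # e.g. probability that cumulative release exceeds 1.0 (arbitrary unit):
        print(f"P(R > 1) = {ccdf[np.searchsorted(r_grid, 1.0)]:.4f}")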

  6. PAPIN: A Fortran-IV program to calculate cross section probability tables, Bondarenko and transmission self-shielding factors for fertile isotopes in the unresolved resonance region

    Energy Technology Data Exchange (ETDEWEB)

    Munoz-Cobos, J.G.

    1981-08-01

    The Fortran IV code PAPIN has been developed to calculate cross section probability tables, Bondarenko self-shielding factors and average self-indication ratios for non-fissile isotopes, below the inelastic threshold, on the basis of the ENDF/B prescriptions for the unresolved resonance region. Monte-Carlo methods are utilized to generate ladders of resonance parameters in the unresolved resonance region, from average resonance parameters and their appropriate distribution functions. The neutron cross-sections are calculated by the single level Breit-Wigner (SLBW) formalism, with s, p and d-wave contributions. The cross section probability tables are constructed by sampling the Doppler-broadened cross sections. The various self-shielded factors are computed numerically as Lebesgue integrals over the cross section probability tables. The program PAPIN has been validated through extensive comparisons with several deterministic codes.

  7. Numerical calculations of the probabilities for quantum transitions in atoms and molecules by the path integral method

    Science.gov (United States)

    Biryukov, Alexander; Degtyareva, Yana

    2017-10-01

    The probabilities of molecular quantum transitions induced by an electromagnetic field are expressed as path integrals of a real alternating functional. We propose a new method for computing these integrals by means of recurrence relations, and we apply this approach to the description of two-photon Rabi oscillations.

  8. Evaluation of forensic DNA mixture evidence: protocol for evaluation, interpretation, and statistical calculations using the combined probability of inclusion.

    Science.gov (United States)

    Bieber, Frederick R; Buckleton, John S; Budowle, Bruce; Butler, John M; Coble, Michael D

    2016-08-31

    The evaluation and interpretation of forensic DNA mixture evidence face growing challenges as mixture evidence becomes increasingly complex. Such challenges include: casework involving low quantity or degraded evidence leading to allele and locus dropout; allele sharing of contributors leading to allele stacking; and differentiation of PCR stutter artifacts from true alleles. There is variation in statistical approaches used to evaluate the strength of the evidence when inclusion of a specific known individual(s) is determined, and the approaches used must be supportable. There are concerns that methods utilized for interpretation of complex forensic DNA mixtures may not be implemented properly in some casework. Similar questions are being raised in a number of U.S. jurisdictions, leading to some confusion about mixture interpretation for current and previous casework. Key elements necessary for the interpretation and statistical evaluation of forensic DNA mixtures are described. The most common method for statistical evaluation of DNA mixtures in many parts of the world, including the USA, is the Combined Probability of Inclusion/Exclusion (CPI/CPE); exposition and elucidation of this method and a protocol for its use are the focus of this article. Formulae and other supporting materials are provided. Guidance and details of a DNA mixture interpretation protocol are provided for application of the CPI/CPE method in the analysis of more complex forensic DNA mixtures. This description, in turn, should help reduce the variability of interpretation with application of this methodology and thereby improve the quality of DNA mixture interpretation throughout the forensic community.
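
    As a minimal sketch of the statistic itself, under the usual assumptions (Hardy-Weinberg proportions, independent loci) and with invented allele frequencies, the CPI multiplies locus-wise inclusion probabilities:

        # Minimal CPI/CPE sketch; loci names and allele frequencies are hypothetical.
        from math import prod

        # alleles observed in the mixture at each locus -> population frequencies
        mixture = {
            "D8S1179": [0.10, 0.21, 0.33],
            "D21S11":  [0.25, 0.18],
            "FGA":     [0.14, 0.09, 0.20, 0.07],
        }

        def cpi(profile):
            """CPI = product over loci of (sum of observed-allele frequencies)^2."""
            return prod(sum(freqs) ** 2 for freqs in profile.values())

        value = cpi(mixture)
        print(f"CPI = {value:.3e}")      # probability a random person is included
        print(f"CPE = {1 - value:.6f}")  # combined probability of exclusion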

  9. Tumor control probability and the utility of 4D vs 3D dose calculations for stereotactic body radiotherapy for lung cancer

    Energy Technology Data Exchange (ETDEWEB)

    Valdes, Gilmer, E-mail: gilmer.valdes@uphs.upenn.edu [Department of Radiation Oncology, Perelman Center for Advanced Medicine, University of Pennsylvania, Philadelphia, PA (United States); Robinson, Clifford [Department of Radiation Oncology, Siteman Cancer Center, Washington University in St. Louis, St. Louis, MO (United States); Lee, Percy [Department of Radiation Oncology, David Geffen School of Medicine, UCLA, Los Angeles, CA (United States); Morel, Delphine [Department of Biomedical Engineering, AIX Marseille 2 University, Marseille (France); Department of Medical Physics, Joseph Fourier University, Grenoble (France); Low, Daniel; Iwamoto, Keisuke S.; Lamb, James M. [Department of Radiation Oncology, David Geffen School of Medicine, UCLA, Los Angeles, CA (United States)

    2015-04-01

    Four-dimensional (4D) dose calculations for lung cancer radiotherapy have been technically feasible for a number of years but have not become standard clinical practice. The purpose of this study was to determine if clinically significant differences in tumor control probability (TCP) exist between 3D and 4D dose calculations, so as to inform the decision whether 4D dose calculations should be used routinely for treatment planning. Radiotherapy plans for Stage I-II lung cancer were created for 8 patients. Clinically acceptable treatment plans were created with dose calculated on the end-exhale 4D computed tomography (CT) phase using a Monte Carlo algorithm. Dose was then projected onto the remaining 9 phases of 4D-CT using the Monte Carlo algorithm and accumulated onto the end-exhale phase using commercially available deformable registration software. The resulting dose-volume histograms (DVH) of the gross tumor volume (GTV), planning target volume (PTV), and PTV_setup were compared according to target coverage and dose. The PTV_setup was defined as a volume including the GTV and a margin for setup uncertainties but not for respiratory motion. TCPs resulting from these DVHs were estimated using a wide range of alphas, betas, and tumor cell densities. Differences of up to 5 Gy were observed between 3D and 4D calculations for a PTV with highly irregular shape. When the TCP was calculated using the resulting DVHs for fractionation schedules typically used in stereotactic body radiation therapy (SBRT), the TCP differed at most by 5% between 4D and 3D cases, and in most cases by less than 1%. We conclude that 4D dose calculations are not necessary for most cases treated with SBRT, but they might be valuable for irregularly shaped target volumes. If 4D calculations are used, 4D DVHs should be evaluated on volumes that include a margin for setup uncertainty but not respiratory motion.
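
    For orientation, a common way to turn a DVH into a TCP estimate is the Poisson/linear-quadratic model sketched below. The parameter values and DVH bins are invented, and the study's actual TCP model may differ in detail.

        # Hedged sketch of a Poisson/LQ TCP estimate from a differential DVH.
        import math

        alpha, beta = 0.35, 0.035        # Gy^-1, Gy^-2 (assumed)
        n_fractions = 3                  # SBRT-like schedule (assumed)
        rho = 1e7                        # clonogen density per cm^3 (assumed)

        # differential DVH: (volume in cm^3, total dose in Gy) bins (invented)
        dvh = [(2.0, 54.0), (5.0, 56.0), (3.0, 60.0)]

        def tcp(dvh_bins):
            """TCP = exp(-sum_i N_i * SF_i), LQ survival with dose/fraction d = D/n."""
            ln_tcp = 0.0
            for vol, dose in dvh_bins:
                d = dose / n_fractions
                surviving_fraction = math.exp(-(alpha * dose + beta * d * dose))
                ln_tcp -= rho * vol * surviving_fraction
            return math.exp(ln_tcp)

        print(f"TCP ~ {tcp(dvh):.3f}")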

  10. Compressive behavior of laminated neoprene bridge bearing pads under thermal aging condition

    Science.gov (United States)

    Jun, Xie; Zhang, Yannian; Shan, Chunhong

    2017-10-01

    The present study was conducted to obtain a better understanding of how the mechanical properties of laminated neoprene bridge bearing pads vary under thermal aging, using compression tests. A total of 5 specimens were processed in a high-temperature chamber and then tested under axial load. The main parameter considered was the duration of thermal aging. The compression tests show that the thermally aged specimens are more prone to brittle failure than the standard specimen. Moreover, exposure of the steel plates, cracking and other failure phenomena are more severe than in the standard specimen. The compressive capacity, ultimate compressive strength and compressive elastic modulus of the laminated neoprene bridge bearing pads decreased dramatically with increasing aging time. The attenuation trends of ultimate compressive strength and compressive elastic modulus under thermal aging follow a power function. The attenuation models were obtained by regressing the experimental data with the least squares method. The models fit the data well, which shows that this approach is applicable and promising for assessing the performance of laminated neoprene bridge bearing pads under thermal aging conditions.
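
    The power-function regression referred to above can be reproduced with an ordinary least-squares fit in log-log coordinates; the data points below are invented for illustration only.

        # Illustrative least-squares fit of y = a * t^b, linearized as
        # ln y = ln a + b ln t; data are hypothetical, not from the study.
        import numpy as np

        t = np.array([24.0, 48.0, 96.0, 192.0])   # aging time, h (hypothetical)
        y = np.array([32.0, 28.5, 24.9, 21.7])    # ultimate strength, MPa (hypothetical)

        b, ln_a = np.polyfit(np.log(t), np.log(y), 1)  # slope, intercept
        a = np.exp(ln_a)
        print(f"y = {a:.2f} * t^({b:.3f})")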

  11. Application of multi-dimensional discrimination diagrams and probability calculations to Paleoproterozoic acid rocks from Brazilian cratons and provinces to infer tectonic settings

    Science.gov (United States)

    Verma, Sanjeet K.; Oliveira, Elson P.

    2013-08-01

    In the present work, we applied two sets of new multi-dimensional geochemical diagrams (Verma et al., 2013) obtained from linear discriminant analysis (LDA) of natural-logarithm-transformed ratios of major elements and immobile major and trace elements in acid magmas to decipher plate tectonic settings and corresponding probability estimates for Paleoproterozoic rocks from the Amazonian craton, São Francisco craton, São Luís craton, and Borborema province of Brazil. The robustness of LDA minimizes the effects of petrogenetic processes and maximizes the separation among the different tectonic groups. The probability-based boundaries further provide an objective statistical method, in contrast to the commonly used subjective method of drawing boundaries by eye. The use of major element data readjusted to 100% on an anhydrous basis with the SINCLAS computer program also helps to minimize the effects of post-emplacement compositional changes and analytical errors on these tectonic discrimination diagrams. Fifteen case studies of acid suites highlighted the application of these diagrams and probability calculations. The first case study, on the Jamon and Musa granites, Carajás area (Central Amazonian Province, Amazonian craton), shows a collision setting (previously thought anorogenic). A collision setting was also clearly inferred for the Bom Jardim granite, Xingú area (Central Amazonian Province, Amazonian craton). The third case study, on the Older São Jorge, Younger São Jorge and Maloquinha granites, Tapajós area (Ventuari-Tapajós Province, Amazonian craton), indicated a within-plate setting (previously transitional between volcanic arc and within-plate). We also recognized a within-plate setting for the next three case studies, on the Aripuanã and Teles Pires granites (SW Amazonian craton) and the Pitinga area granites (Mapuera Suite, NW Amazonian craton), which were all previously suggested to have been emplaced in post-collision to within-plate settings. The seventh case
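
    Schematically, the diagram methodology amounts to LDA on log-transformed ratios followed by posterior-probability estimates for each tectonic group. The sketch below uses synthetic data and generic group labels, not the published ratio sets or boundaries.

        # Schematic LDA on natural-log-transformed element ratios (synthetic data).
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(42)
        # columns stand in for ln-ratios, e.g. ln(TiO2/SiO2), ln(Zr/SiO2), ...
        X = rng.normal(size=(120, 4)) + np.repeat([[0], [1], [2]], 40, axis=0)
        y = np.repeat(["arc", "within-plate", "collision"], 40)  # tectonic settings

        lda = LinearDiscriminantAnalysis().fit(X, y)
        sample = rng.normal(size=(1, 4)) + 2.0
        # posterior probability of each tectonic setting for the new sample
        print(dict(zip(lda.classes_, lda.predict_proba(sample)[0].round(3))))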

  12. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber-Shiu functions and dependence.

  13. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber-Shiu functions and dependence.

  14. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  15. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  16. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think....... By doing so, we will obtain a deeper insight into how events involving large values of sums of heavy-tailed random variables are likely to occur....

  17. Waste Package Misload Probability

    Energy Technology Data Exchange (ETDEWEB)

    J.K. Knudsen

    2001-11-20

    The objective of this calculation is to determine the probability of occurrence for fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using this information, a probability of occurrence will be calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.
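
    The underlying estimate is a simple event frequency. A sketch with invented counts (not the values from the calculation) might look like:

        # Frequency-based misload probability with a rough uncertainty band;
        # the counts below are hypothetical placeholders.
        import math

        misloads, movements = 5, 2_000_000   # hypothetical event count / FA moves

        p_hat = misloads / movements
        # ~95% interval from the normal approximation to the binomial
        half_width = 1.96 * math.sqrt(p_hat * (1 - p_hat) / movements)
        print(f"P(misload per movement) ~ {p_hat:.2e} +/- {half_width:.2e}")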

  18. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  19. Calculation of the ultracold neutron upscattering loss probability in fluid walled storage bottles using experimental measurements of the liquid thermomechanical properties of fomblin

    Science.gov (United States)

    Lamoreaux, S. K.; Golub, R.

    2002-10-01

    Presently, the most accurate values of the free neutron beta-decay lifetime result from measurements using fluid-coated ultracold neutron (UCN) storage bottles. The purpose of this work is to investigate the temperature-dependent UCN loss rate from these storage systems. To verify that the surface properties of fomblin films are the same as the bulk properties, we present experimental measurements of the properties of a liquid "fomblin" surface obtained by the quasielastic scattering of laser light. The properties include the surface tension and viscosity as functions of temperature. The results are compared to measurements of the bulk fluid properties. We then calculate the upscattering rate of UCNs from thermally excited surface capillary waves on the liquid surface and compare the results to experimental measurements of the UCN lifetime in fomblin-fluid-walled UCN storage bottles, and show that the excess storage loss rate for UCN energies near the fomblin potential can be explained. The rapid temperature dependence of the fomblin storage lifetime is explained by our analysis.

  20. Probability theory

    CERN Document Server

    Varadhan, S R S

    2001-01-01

    This volume presents topics in probability theory covered during a first-year graduate course given at the Courant Institute of Mathematical Sciences. The necessary background material in measure theory is developed, including the standard topics, such as extension theorem, construction of measures, integration, product spaces, Radon-Nikodym theorem, and conditional expectation. In the first part of the book, characteristic functions are introduced, followed by the study of weak convergence of probability distributions. Then both the weak and strong limit theorems for sums of independent rando

  1. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds... probability, i.e. a study of the navigator's role in resolving critical situations; a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...

  2. Performance of asphaltic concrete incorporating styrene butadiene rubber subjected to varying aging condition

    Science.gov (United States)

    Salah, Faisal Mohammed; Jaya, Ramadhansyah Putra; Mohamed, Azman; Hassan, Norhidayah Abdul; Rosni, Nurul Najihah Mad; Mohamed, Abdullahi Ali; Agussabti

    2017-12-01

    The influence of styrene butadiene rubber (SBR) on asphaltic concrete properties at different aging conditions is presented in this study. The aging conditions were designated un-aged, short-term, and long-term aging. A conventional asphalt binder of penetration grade 60/70 was used in this work. Four levels of SBR addition were employed (0 %, 1 %, 3 %, and 5 % by binder weight), and asphalt concrete mixes were prepared at the selected optimum asphalt content (5 %). Performance was evaluated using Marshall stability, resilient modulus, and dynamic creep tests. The results indicate that mixes modified with SBR polymer retain improved stability and permanent-deformation characteristics under aging; the long-term aged mixes showed the highest stability, resilient modulus, and dynamic creep values compared with the short-term aged and un-aged samples. Thus, the use of 5 % SBR can produce more durable asphalt concrete mixtures with better serviceability.

  3. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  4. Effect of physicochemical aging conditions on the composite-composite repair bond strength

    NARCIS (Netherlands)

    Brendeke, Johannes; Ozcan, Mutlu

    2007-01-01

    Purpose: This study evaluated the effect of different physicochemical aging methods and surface conditioning techniques on the repair bond strength of composite. It was hypothesized that the aging conditions would decrease the repair bond strength and surface conditioning methods would perform

  5. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  6. A spectrophotometric analysis of extraoral aging conditions on the color stability of maxillofacial silicone.

    Science.gov (United States)

    Mehta, Siddharth; Nandeeshwar, D B

    2017-01-01

    Surveys have reported color fading as the most frequent reason patients give for disliking their prostheses. The aim of this study is to compare the color variation between two maxillofacial silicone elastomers after subjecting them to extraoral aging conditions. A total of 80 samples were made from M511 Maxillofacial Rubber (Part A:Part B = 10:1) and Z004 Platinum Silicone Rubber (Part A:Part B = 1:1) and divided into two main groups, A and B (40 each). These main groups were then subdivided into five subgroups (A1B1, A2B2, A3B3, A4B4, and A5B5) (n = 8): outdoor weathering, acidic perspiration, sebum (for 6 months), and neutral soap and disinfectant (for 30 h), respectively. Baseline L*a*b* values were recorded. The samples were subjected to the extraoral aging conditions, and the L*a*b* values were recorded after the aging period using a spectrophotometer. The intergroup comparison was done by the Kruskal-Wallis test, and the intragroup comparison by the Mann-Whitney test. All groups except A4B4 exhibited visually detectable mean color differences, ranging from 3.06 to 5.21. There was no statistically significant difference between the two materials when subjected to extraoral aging conditions. Visually perceptible and clinically unacceptable color changes occur on exposure to the various extraoral aging conditions except immersion in neutral soap solution, for which the ΔE* values were clinically acceptable (ΔE < 3). For all practical purposes, therefore, the clinical choice between M511 Maxillofacial Rubber (Part A:Part B = 10:1) and Z004 Platinum Silicone Rubber (Part A:Part B = 1:1) would yield more or less the same results, with unacceptable color stability under extraoral aging conditions.

  7. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  8. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  9. Onset aging conditions of adults with an intellectual disability associated with primary caregiver depression.

    Science.gov (United States)

    Lin, Lan-Ping; Hsu, Shang-Wei; Kuo, Meng-Ting; Wu, Jia-Lin; Chu, Cordia; Lin, Jin-Ding

    2014-03-01

    Caregivers of adults with an intellectual disability experience depressive symptoms, but the aging factors of the care recipients associated with these symptoms are unknown. The objective of this study was to analyze the onset aging conditions of adults with an intellectual disability that are associated with the depression scores of their primary caregivers. A cross-sectional survey was administered to gather information from 455 caregivers of adults with an intellectual disability about their symptoms of depression, assessed by the 9-item Patient Health Questionnaire (PHQ-9). The 12 aging conditions of adults with an intellectual disability cover physical and mental health. The results indicate that 78% of adults with an intellectual disability demonstrate aging conditions. Physical conditions associated with aging include hearing decline (66.3%), vision decline (63.6%), incontinence (44%), articulation and bone degeneration (57.9%), teeth loss (80.4%), physical strength decline (81.2%), decline in the senses of taste and smell (52.8%), and accompanying chronic illnesses (74.6%). Mental conditions associated with aging include memory loss (77%), language ability deterioration (74.4%), poor sleep quality (74.2%), and easy onset of depression and sadness (50.3%). Aging conditions of adults with an intellectual disability (p<0.05) were significantly associated with depressive symptoms among caregivers after controlling for demographic characteristics. In particular, poor sleep quality of adults with an intellectual disability (yes vs. no, OR=3.807, p=0.002) was statistically correlated with the occurrence of significant depressive symptoms among their caregivers. This study suggests that the authorities should reorient community services and future policies toward the needs of family caregivers to decrease the burdens associated with caregiving. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. Nuclear structure of tellurium 133 via beta decay and shell model calculations in the doubly magic tin 132 region. [J, pi, transition probabilities, neutron and proton separation, g factors]

    Energy Technology Data Exchange (ETDEWEB)

    Lane, S.M.

    1979-08-01

    An experimental investigation of the level structure of 133Te was performed by spectroscopy of gamma-rays following the beta-decay of 2.7 min 133Sb. Multiscaled gamma-ray singles spectra and 2.5 x 10^7 gamma-gamma coincidence events were used in the assignment of 105 of the approximately 400 observed gamma-rays to 133Sb decay and in the construction of the 133Te level scheme with 29 excited levels. One hundred twenty-two gamma-rays were identified as originating in the decay of other isotopes of Sb or their daughter products. The remaining gamma-rays were associated with the decay of impurity atoms or have as yet not been identified. A new computer program based on the Lanczos tridiagonalization algorithm using an uncoupled m-scheme basis and vector manipulations was written. It was used to calculate energy levels, parities, spins, model wavefunctions, neutron and proton separation energies, and some electromagnetic transition probabilities for the following nuclei in the 132Sn region: 128Sn, 129Sn, 130Sn, 131Sn, 130Sb, 131Sb, 132Sb, 133Sb, 132Te, 133Te, 134Te, 134I, 135I, 135Xe, and 136Xe. The results are compared with experiment and the agreement is generally good. For non-magic nuclei, the 1g7/2, 2d5/2, 2d3/2, 1h11/2, and 3s1/2 orbitals are available to valence protons and the 2d5/2, 2d3/2, 1h11/2, and 3s1/2 orbitals are available to valence neutron holes. The present CDC7600 computer code can accommodate 59 single-particle states and vectors comprised of 30,000 Slater determinants. The effective interaction used was that of Petrovich, McManus, and Madsen, a modification of the Kallio-Kolltveit realistic force. Single-particle energies, effective charges and effective g-factors were determined from experimental data for nuclei in the 132Sn region. 116 references.

  11. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  12. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
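
    One classic instance is the derangement (matching) problem: the probability that a random permutation leaves no element fixed tends to 1/e, which a quick simulation confirms.

        # Simulate the derangement probability and compare with 1/e.
        import math
        import random

        random.seed(0)
        n, trials = 20, 100_000
        hits = 0
        for _ in range(trials):
            perm = list(range(n))
            random.shuffle(perm)
            if all(perm[i] != i for i in range(n)):  # no fixed point
                hits += 1
        print(f"simulated: {hits / trials:.4f}   1/e = {1 / math.e:.4f}")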

  13. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying of the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.

  14. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  15. Agreeing Probability Measures for Comparative Probability Structures

    NARCIS (Netherlands)

    P.P. Wakker (Peter)

    1981-01-01

    textabstractIt is proved that fine and tight comparative probability structures (where the set of events is assumed to be an algebra, not necessarily a σ-algebra) have agreeing probability measures. Although this was often claimed in the literature, all proofs the author encountered are not valid

  16. Survivability of integrated PVDF film sensors to accelerated ageing conditions in aeronautical/aerospace structures

    Science.gov (United States)

    Guzman, E.; Cugnoni, J.; Gmür, T.; Bonhôte, P.; Schorderet, A.

    2013-06-01

    This work validates the use of integrated polyvinylidene fluoride (PVDF) film sensors for dynamic testing, even after being subjected to UV-thermo-hygro-mechanical accelerated ageing conditions. The verification of the PVDF sensors' survivability in these environmental conditions, typically confronted by civil and military aircraft, is the main concern of the study. The evaluation of survivability is made by comparing dynamic testing results provided by the PVDF patch sensors subjected to an accelerated ageing protocol with those provided by neutral non-aged sensors (accelerometers). The available measurements are the time-domain response signals issued from a modal analysis procedure, and the corresponding frequency response functions (FRF). These are in turn used to identify the constitutive properties of the samples by extraction of the modal parameters, in particular the natural frequencies. The composite specimens in this study undergo different accelerated ageing processes. After several weeks of experimentation, the samples exhibit a loss of stiffness, represented by a decrease of up to 10% in the elastic moduli. Despite the ageing, the integrated PVDF sensors, subjected to the same ageing conditions, are still capable of providing reliable data to follow these changes closely. This survivability is a decisive asset for the future use of integrated PVDF sensors for structural health monitoring (SHM) of full-scale composite aeronautical structures.

  17. Stationary algorithmic probability

    National Research Council Canada - National Science Library

    Müller, Markus

    2010-01-01

    ..., since their actual values depend on the choice of the universal reference computer. In this paper, we analyze a natural approach to eliminate this machine-dependence. Our method is to assign algorithmic probabilities to the different...

  18. Factual and cognitive probability

    OpenAIRE

    Chuaqui, Rolando

    2012-01-01

    This modification separates the two aspects of probability: probability as a part of physical theories (factual), and as a basis for statistical inference (cognitive). Factual probability is represented by probability structures as in the earlier papers, but now built independently of the language. Cognitive probability is interpreted as a form of "partial truth". The paper also contains a discussion of the Principle of Insufficient Reason and of Bayesian and classical statistical methods, in...

  19. Evaluating probability forecasts

    OpenAIRE

    Lai, Tze Leung; Gross, Shulamith T.; Shen, David Bo

    2011-01-01

    Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability forecasts.
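
    As a minimal illustration of a scoring rule, the Brier score averages squared differences between forecast probabilities and binary outcomes; the numbers below are made up.

        # Brier score for a set of probability forecasts (hypothetical data).
        forecasts = [0.9, 0.7, 0.2, 0.4, 0.8]   # forecast P(event)
        outcomes  = [1,   1,   0,   1,   0  ]   # 1 = event occurred

        brier = sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(outcomes)
        print(f"Brier score = {brier:.3f}  (0 is perfect; 0.25 matches a constant 0.5)")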

  20. Monte Carlo calculation of the total probability for gamma-ray interaction in toluene; Aplicacion del metodo de Monte Carlo al calculo de la probabilidad de interaccion fotonica en tolueno

    Energy Technology Data Exchange (ETDEWEB)

    Grau Malonda, A.; Garcia-Torano, E.

    1983-07-01

    Interaction and absorption probabilities for gamma-rays with energies between 1 and 1000 keV have been computed and tabulated. A toluene-based scintillator solution has been assumed in the computation. Both point sources and homogeneously dispersed radioactive material have been assumed. These tables may be applied to cylinders with radii between 1.25 cm and 0.25 cm and heights between 4.07 cm and 0.20 cm. (Author) 26 refs.
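
    A schematic Monte Carlo in the same spirit: estimate the probability that a photon from a centred point source interacts before escaping a cylindrical volume. The attenuation coefficient is a placeholder and the geometry handling is simplified; the cylinder dimensions echo the table limits quoted above.

        # Monte Carlo interaction probability for photons in a cylinder.
        import math
        import random

        random.seed(0)
        mu = 0.01            # placeholder total attenuation coefficient (1/cm)
        R, H = 1.25, 4.07    # cylinder radius and height (cm), from the table limits

        def path_to_wall(x, y, z, ux, uy, uz):
            """Distance from an interior point along (ux,uy,uz) to the cylinder wall."""
            t_cap = math.inf                       # top/bottom caps
            if uz > 0:
                t_cap = (H / 2 - z) / uz
            elif uz < 0:
                t_cap = (-H / 2 - z) / uz
            a = ux * ux + uy * uy                  # lateral surface: |(x,y)+t(ux,uy)| = R
            if a > 0:
                b = 2 * (x * ux + y * uy)
                c = x * x + y * y - R * R
                t_side = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)
            else:
                t_side = math.inf
            return min(t_cap, t_side)

        n, hits = 100_000, 0
        for _ in range(n):
            uz = random.uniform(-1, 1)             # isotropic direction from the centre
            phi = random.uniform(0, 2 * math.pi)
            s = math.sqrt(1 - uz * uz)
            ux, uy = s * math.cos(phi), s * math.sin(phi)
            L = path_to_wall(0.0, 0.0, 0.0, ux, uy, uz)
            if random.expovariate(mu) < L:         # free path shorter than exit distance
                hits += 1
        print(f"interaction probability ~ {hits / n:.4f}")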

  1. Influence of aging conditions on the quality of red Sangiovese wine.

    Science.gov (United States)

    Castellari, M; Piermattei, B; Arfelli, G; Amati, A

    2001-08-01

    A red Sangiovese wine was stored in barrels of different woods (oak and chestnut) and types (225-L "barriques" and 1000-L barrels) at 12 and 22 degrees C for 320 days to evaluate the effects of different aging conditions on wine quality. Chestnut barrels led to wines richer in phenolics, and which were more tannic, colored, and fruity. Oak barrels gave wines with more monomeric phenolics, but less astringent, with higher vanilla smell, and more harmonious. The type of barrel could be used as a parameter to regulate the extraction of wood components and the polymerization of monomeric phenolics. Storage at 22 degrees C favored the formation of polymerized phenolics and the increase of color density and color hue. The temperature produced less pronounced effects on aroma and taste, even if wines stored at 12 degrees C showed more harmony.

  2. Quantitative Analysis of Ageing Condition of Insulating Paper Using Infrared Spectroscopy

    Directory of Open Access Journals (Sweden)

    R. Saldivar-Guerrero

    2016-01-01

    Full Text Available Transformers are very expensive apparatuses and are vital to keep the whole power system running normally. Failures in such apparatuses could leave them out of service, causing severe economic losses. The life of a transformer can be effectively determined by the life of its insulating paper. In the present work, we show an alternative diagnostic technique to determine the ageing condition of transformer paper by the use of FTIR spectroscopy and an empirical model. This method has the advantage of using a microsample that can be extracted from the transformer on-site. The proposed technique offers an approximate quantitative evaluation of the degree of polymerization of dielectric papers and could be used for transformer diagnosis and remaining-life estimation.

  3. Probability machines: consistent probability estimation using nonparametric learning machines.

    Science.gov (United States)

    Malley, J D; Kruppa, J; Dasgupta, A; Malley, K G; Ziegler, A

    2012-01-01

    Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications.
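
    A small sketch of consistent probability estimation with a random forest: the paper points to R packages, but the same idea in scikit-learn, with synthetic data, many trees, and a minimum leaf size to stabilize the estimates, looks like this.

        # Random forest as a probability machine (synthetic data, scikit-learn).
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        X = rng.normal(size=(600, 4))
        p_true = 1 / (1 + np.exp(-X[:, 0] + 0.5 * X[:, 1]))   # latent risk
        y = rng.random(600) < p_true

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        # many trees + leaf-size control help stabilize probability estimates
        rf = RandomForestClassifier(n_estimators=500, min_samples_leaf=10,
                                    random_state=0)
        rf.fit(X_tr, y_tr)
        probs = rf.predict_proba(X_te)[:, 1]   # individual P(y=1 | x)
        print(probs[:5].round(3))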

  4. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary. Background: Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives: The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods: Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results: Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions: Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433

  5. What Are Probability Surveys?

    Science.gov (United States)

    The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.

  6. Efficient probability sequence

    OpenAIRE

    Regnier, Eva

    2014-01-01

    A probability sequence is an ordered set of probability forecasts for the same event. Although single-period probabilistic forecasts and methods for evaluating them have been extensively analyzed, we are not aware of any prior work on evaluating probability sequences. This paper proposes an efficiency condition for probability sequences and shows properties of efficient forecasting systems, including memorylessness and increasing discrimination. These results suggest tests for efficiency and ...

  7. Efficient probability sequences

    OpenAIRE

    Regnier, Eva

    2014-01-01

    DRMI working paper A probability sequence is an ordered set of probability forecasts for the same event. Although single-period probabilistic forecasts and methods for evaluating them have been extensively analyzed, we are not aware of any prior work on evaluating probability sequences. This paper proposes an efficiency condition for probability sequences and shows properties of efficient forecasting systems, including memorylessness and increasing discrimination. These res...

  8. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  9. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    Subjective probabilities play a central role in many economic decisions, and act as an immediate confound of inferences about behavior, unless controlled for. Several procedures to recover subjective probabilities have been proposed, but in order to recover the correct latent probability one must...

  10. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    Subjective probabilities play a central role in many economic decisions and act as an immediate confound of inferences about behavior, unless controlled for. Several procedures to recover subjective probabilities have been proposed, but in order to recover the correct latent probability one must ...

  11. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.

  12. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  13. Oxygen boundary crossing probabilities.

    Science.gov (United States)

    Busch, N A; Silver, I A

    1987-01-01

    The probability that an oxygen particle will reach a time dependent boundary is required in oxygen transport studies involving solution methods based on probability considerations. A Volterra integral equation is presented, the solution of which gives directly the boundary crossing probability density function. The boundary crossing probability is the probability that the oxygen particle will reach a boundary within a specified time interval. When the motion of the oxygen particle may be described as strongly Markovian, then the Volterra integral equation can be rewritten as a generalized Abel equation, the solution of which has been widely studied.

  14. Philosophy and probability

    CERN Document Server

    Childers, Timothy

    2013-01-01

    Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability-frequentist, propensity, classical, Bayesian, and objective Bayesian-and uses stimulating examples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,

  15. In All Probability, Probability is not All

    Science.gov (United States)

    Helman, Danny

    2004-01-01

    The national lottery is often portrayed as a game of pure chance with no room for strategy. This misperception seems to stem from the application of probability instead of expectancy considerations, and can be utilized to introduce the statistical concept of expectation.

  16. Crosslinking of SAVY-4000 O-rings as a Function of Aging Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Van Buskirk, Caleb Griffith [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-08

    SAVY-4000 containers were developed as a part of DOE M 441.1-1 to protect workers who handle stored nuclear material from exposure due to loss of containment. The SAVY-4000 is comprised of three parts: a lid, a container, and a cross-linked fluoropolymer O-ring. Degradation of the O-ring during use could limit the lifetime of the SAVY-4000. In order to quantify the chemical changes of the O-ring over time, the molecular weight between crosslinks was determined as a function of aging conditions using a swelling technique. Because the O-ring is a cross-linked polymer, it will absorb solvent into its matrix without dissolving. The relative amount of solvent uptake can be related to the degree of crosslinking using an equation developed by Paul Flory and John Rehner Jr. This method was used to analyze O-ring samples aged under thermal and ionizing-radiation conditions. It was found that under the harsher thermal aging conditions, in the absence of ionizing radiation, the average molecular weight between crosslinks decreased, indicating a rise in crosslinks, which may be attributable to advanced aging with no ionizing radiation present. Inversely, in the presence of ionizing radiation the material was found to have a higher level of cross-linking with age. This information could be used to help predict the lifetime of the O-rings in SAVY-4000 containers under service conditions.
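
    For reference, the Flory-Rehner relation used with such swelling data is commonly written as follows (a sketch; sign and notation conventions vary between sources):

        \[
        M_c = \frac{-\rho_p V_s \left( \phi^{1/3} - \phi/2 \right)}
                   {\ln(1-\phi) + \phi + \chi\,\phi^2}
        \]

    where \(\phi\) is the polymer volume fraction at equilibrium swelling, \(V_s\) the molar volume of the solvent, \(\rho_p\) the polymer density, and \(\chi\) the polymer-solvent interaction parameter.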

  17. Influence of Aging Conditions on Fatigue Fracture Behaviour of 6063 Aluminum Alloy

    Directory of Open Access Journals (Sweden)

    Rafiq Ahmed Siddiqui

    2001-12-01

    Full Text Available Aluminum-Magnesium-Silicon (Al-Mg-Si) 6063 alloy was heat-treated at under-aging, peak-aging and over-aging temperatures. The number of cycles required to cause fatigue fracture at constant stress was taken as the criterion for fatigue resistance. Moreover, the fracture surfaces of the alloy at different aging conditions were evaluated by optical microscopy and scanning electron microscopy (SEM). The SEM micrographs confirmed cleavage surfaces with well-defined fatigue striations. It was observed that different aging times and temperatures of the 6063 Al-alloy produce different modes of fracture. The most suitable age-hardening treatment was found to be 4 to 5 hours at 460 K. The increase in fatigue fracture resistance of the alloy due to aging could be attributed to a vacancy-assisted diffusion mechanism or to the pinning of dislocation movement by the precipitates produced during aging. The decrease in fatigue resistance for the over-aged alloys, however, might be due to the coalescence of precipitates into larger grains.

  18. Comparison Study on Interlaminar Shear Strength Testing Methods of CFRP under Hygrothermal Aging Conditions

    Directory of Open Access Journals (Sweden)

    SHUANG Chao

    2017-10-01

    Full Text Available The effects of hygrothermal aging on the interlaminar shear strength of T700/TDE-85 composites were studied by the short-beam method and the double-incision method. The relationship between moisture absorption and aging time was discussed, and the fracture surface morphology was analyzed. The experimental results show that the moisture absorption of both types of specimens follows Fick's second law, but the saturated moisture absorption rates and absorption times differ. The moisture absorption rates and saturated moisture absorption rates of the double-incision specimens are higher than those of the short-beam specimens. The degradation of interlaminar shear strength under hygrothermal aging is more pronounced for the double-incision test than for the short-beam test: the interlaminar shear strength retention rates of the short-beam specimens are 74.5%, 61.0%, 53.2% and 50.6% at 500 h intervals, and those of the double-incision specimens are 60.9%, 38.3%, 42.6% and 33.0% at 500 h intervals. The failure mode of the short-beam specimens becomes more complicated than that of the double-incision specimens as hygrothermal aging time increases.
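
    For reference, the Fickian absorption law invoked above has, for a plane sheet of thickness \(h\), the familiar series form (a sketch; real specimens may need edge and geometry corrections):

        \[
        \frac{M_t}{M_\infty} = 1 - \frac{8}{\pi^2} \sum_{n=0}^{\infty}
        \frac{1}{(2n+1)^2}
        \exp\!\left( -\frac{(2n+1)^2 \pi^2 D t}{h^2} \right)
        \]

    where \(M_t\) is the moisture uptake at time \(t\), \(M_\infty\) the saturated uptake, and \(D\) the diffusion coefficient.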

  19. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications....

  20. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  1. Statistics and Probability

    Directory of Open Access Journals (Sweden)

    Laktineh Imad

    2010-04-01

    Full Text Available This course constitutes a brief introduction to probability applications in high energy physics. First the mathematical tools related to the different probability concepts are introduced. The probability distributions which are commonly used in high energy physics and their characteristics are then shown and commented. The central limit theorem and its consequences are analysed. Finally some numerical methods used to produce different kinds of probability distribution are presented. The full article (17 p.) corresponding to this lecture is written in French and is provided in the proceedings of the book SOS 2008.

  2. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  3. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  4. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  5. Stage line diagram: An age-conditional reference diagram for tracking development

    NARCIS (Netherlands)

    Buuren, S. van; Ooms, J.C.L.

    2009-01-01

    This paper presents a method for calculating stage line diagrams, a novel type of reference diagram useful for tracking developmental processes over time. Potential fields of applications include: dentistry (tooth eruption), oncology (tumor grading, cancer staging), virology (HIV infection and

  6. On Probability Domains IV

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2017-12-01

    Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerate (pure) state to a non-degenerate one (a quantum phenomenon) and, dually, an observable can map a crisp random event to a genuine fuzzy random event (a fuzzy phenomenon). The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.

  7. Difficulties related to Probabilities

    OpenAIRE

    Rosinger, Elemer Elad

    2010-01-01

    Probability theory is often used as if it had the same ontological status as, for instance, Euclidean geometry or Peano arithmetic. In this regard, several highly questionable aspects of probability theory are mentioned which have earlier been presented in two arXiv papers.

  8. On Randomness and Probability

    Indian Academy of Sciences (India)

    casinos and gambling houses? How does one interpret a statement like "there is a 30 per cent chance of rain tonight" - a statement we often hear on the news? Such questions arise in the mind of every student when she/he is taught probability as part of mathematics. Many students who go on to study probability and ...

  9. Dynamic update with probabilities

    NARCIS (Netherlands)

    Van Benthem, Johan; Gerbrandy, Jelle; Kooi, Barteld

    2009-01-01

    Current dynamic-epistemic logics model different types of information change in multi-agent scenarios. We generalize these logics to a probabilistic setting, obtaining a calculus for multi-agent update with three natural slots: prior probability on states, occurrence probabilities in the relevant ...

  10. Elements of quantum probability

    NARCIS (Netherlands)

    Kummerer, B.; Maassen, H.

    1996-01-01

    This is an introductory article presenting some basic ideas of quantum probability. From a discussion of simple experiments with polarized light and a card game we deduce the necessity of extending the body of classical probability theory. For a class of systems, containing classical systems with finitely many states, a probabilistic model is developed. ...

  11. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  12. On Probability Domains IV

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2017-06-01

    Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerate (pure) state to a non-degenerate one (a quantum phenomenon) and, dually, an observable can map a crisp random event to a genuine fuzzy random event (a fuzzy phenomenon). The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.

  13. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered one of the most important contributions to the analysis of probability interpretation in the last 10-15 years.

  14. The estimation of tree posterior probabilities using conditional clade probability distributions.

    Science.gov (United States)

    Larget, Bret

    2013-07-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample.
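
    To make the mechanics concrete (a toy sketch under my own reading of the abstract, not Larget's software): approximate the posterior probability of a rooted binary tree as the product, over its splits, of conditional clade frequencies estimated from a posterior sample. Trees are written as nested tuples of taxon names; the sample below is hypothetical.

        from collections import Counter

        def splits(tree):
            """Yield (parent_clade, split) pairs for a rooted binary tree
            given as nested tuples, e.g. (("a", "b"), ("c", "d"))."""
            def taxa(t):
                return frozenset([t]) if isinstance(t, str) else taxa(t[0]) | taxa(t[1])
            def walk(t):
                if isinstance(t, str):
                    return
                left, right = taxa(t[0]), taxa(t[1])
                yield (left | right, frozenset([left, right]))
                yield from walk(t[0])
                yield from walk(t[1])
            yield from walk(tree)

        def conditional_clade_counts(sample):
            """Count clades and clade splits across a posterior sample."""
            clade_counts, split_counts = Counter(), Counter()
            for tree in sample:
                for clade, split in splits(tree):
                    clade_counts[clade] += 1
                    split_counts[(clade, split)] += 1
            return clade_counts, split_counts

        def tree_probability(tree, clade_counts, split_counts):
            """Product of estimated conditional clade probabilities; works for
            unsampled trees whose clades all occur in sampled trees."""
            p = 1.0
            for clade, split in splits(tree):
                if clade_counts[clade] == 0:
                    return 0.0
                p *= split_counts[(clade, split)] / clade_counts[clade]
            return p

        sample = [(("a", "b"), ("c", "d"))] * 8 + [(("a", "c"), ("b", "d"))] * 2
        cc, sc = conditional_clade_counts(sample)
        print(tree_probability((("a", "b"), ("c", "d")), cc, sc))   # 0.8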

  15. A Novel Approach to Probability

    CERN Document Server

    Kafri, Oded

    2016-01-01

    When P indistinguishable balls are randomly distributed among L distinguishable boxes, and considering the dense system in which P is much greater than L, our natural intuition tells us that the box with the average number of balls has the highest probability and that none of the boxes are empty; however, in reality, the probability of the empty box is always the highest. This fact is in contradistinction to the sparse system, in which the number of balls is smaller than the number of boxes (i.e. energy distribution in gas), where the average value has the highest probability. Here we show that when we postulate the requirement that all possible configurations of balls in the boxes have equal probabilities, a realistic "long tail" distribution is obtained. This formalism, when applied to sparse systems, converges to distributions in which the average is preferred. We calculate some of the distributions resulting from this postulate and obtain most of the known distributions in nature, namely, Zipf law, Benford law, part...
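
    The empty-box claim can be checked directly with a stars-and-bars count (a quick verification sketch, not the paper's own code): among the comb(P+L-1, L-1) equally weighted configurations, the number in which a designated box holds exactly k balls is comb(P-k+L-2, L-2), which is largest at k = 0.

        from math import comb

        def occupancy_probability(P, L, k):
            """Probability that a designated box holds exactly k of P
            indistinguishable balls in L distinguishable boxes, with all
            configurations weighted equally."""
            return comb(P - k + L - 2, L - 2) / comb(P + L - 1, L - 1)

        P, L = 100, 10                       # dense system: P much greater than L
        probs = [occupancy_probability(P, L, k) for k in range(P + 1)]
        print(max(range(P + 1), key=probs.__getitem__))   # 0: the empty box wins
        print(probs[0], probs[P // L])       # k = 0 beats the average occupancy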

  16. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this...

  17. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fields...

  18. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  19. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit theorem...

  20. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  1. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability, across psychological aspects of formulating subjective probability statements, abstract measure-theoretical considerations, contributions to theoretical statistics and...

  2. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or mathematics. Includes problems with answers and six appendixes. 1965 edition.

  3. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  4. Elements of quantum probability

    OpenAIRE

    Kummerer, B.; Maassen, Hans

    1996-01-01

    This is an introductory article presenting some basic ideas of quantum probability. From a discussion of simple experiments with polarized light and a card game we deduce the necessity of extending the body of classical probability theory. For a class of systems, containing classical systems with finitely many states, a probabilistic model is developed. It can describe, in particular, the polarization experiments. Some examples of ‘quantum coin tosses’ are discussed, closely related to V.F.R....

  5. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  6. Effect of surface conditioning methods on the microtensile bond strength of resin composite to composite after aging conditions

    NARCIS (Netherlands)

    Ozcan, Mutlu; Barbosa, Silvia Helena; Melo, Renata Marques; Galhano, Graziela Avila Prado; Bottino, Marco Antonio

    2007-01-01

    Objectives. This study evaluated the effect of two different surface conditioning methods on the repair bond strength of a bis-GMA-adduct/bis-EMA/TEGDMA based resin composite after three aging conditions. Methods. Thirty-six composite resin blocks (Esthet X, Dentsply) were prepared (5 mm x 6 mm x 6 ...

  7. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  8. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  9. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairly...

  10. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability, provided a given probability of ...

  11. Calculation of the pipes failure probability of the Rcic system of a nuclear power station by means of software WinPRAISE 07; Calculo de la probabilidad de falla de tuberias del sistema RCIC de una central nuclear mediante el software WinPRAISE 07

    Energy Technology Data Exchange (ETDEWEB)

    Jasso G, J.; Diaz S, A.; Mendoza G, G.; Sainz M, E. [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico); Garcia de la C, F. M., E-mail: angeles.diaz@inin.gob.mx [Comision Federal de Electricidad, Central Nucleoelectrica Laguna Verde, Km 44.5 Carretera Cardel-Nautla, 91476 Laguna Verde, Alto Lucero, Veracruz (Mexico)

    2014-10-15

    The growth and propagation of cracks by fatigue is a typical degradation mechanism present in the nuclear industry as in conventional industry; the unstable propagation of a crack can cause the catastrophic failure of a metallic component even with high ductility. For this reason, programmed maintenance activities have been established in the industry using visual and/or ultrasonic inspection techniques with an established periodicity, allowing these crack growths to be followed up and their undesirable effects controlled; however, these activities increase operation costs, and in the particular case of the nuclear industry, they increase the radiation exposure of the participating personnel. The use of mathematical processes that integrate concepts of uncertainty, material properties and the probability associated with inspection results has become a powerful tool for evaluating component reliability, reducing costs and exposure levels. In this work the evaluation of the failure probability due to fatigue growth of preexisting cracks in pipes of the Reactor Core Isolation Cooling (RCIC) system of a nuclear power station is presented. The software WinPRAISE 07 (Piping Reliability Analysis Including Seismic Events) was used, supported by probabilistic fracture mechanics principles. The obtained failure probabilities evidenced good behavior of the analyzed pipes, with a maximum on the order of 1.0E-6; it is therefore concluded that the performance of these pipe lines is reliable even when extrapolating the calculations to 10, 20, 30 and 40 years of service. (Author)

  12. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.

  13. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions: Sample Space and Events; Probabilities; Counting Techniques. Independence and Conditional Probability: Independence; Conditioning; The Borel-Cantelli Theorem. Discrete Random Variables: Random Variables and Vectors; Expected Value; Variance and Other Moments, Inequalities for Deviations; Some Basic Distributions. Convergence of Random Variables, The Law of Large Numbers; Conditional Expectation. Generating Functions, Branching Processes, Random Walk Revisited: Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk. Markov Chains: Definitions and Examples, Probability Distributions of Markov Chains; The First Step Analysis, Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity. Continuous Random Variables: Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case, Simulation; Distribution F...

  14. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexing...

  15. Estimating tail probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Carr, D.B.; Tolley, H.D.

    1982-12-01

    This paper investigates procedures for univariate nonparametric estimation of tail probabilities. Extrapolated values for tail probabilities beyond the data are also obtained based on the shape of the density in the tail. Several estimators which use exponential weighting are described. These are compared in a Monte Carlo study to nonweighted estimators, to the empirical cdf, to an integrated kernel, to a Fourier series estimate, to a penalized likelihood estimate and a maximum likelihood estimate. Selected weighted estimators are shown to compare favorably to many of these standard estimators for the sampling distributions investigated.
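
    To give a rough flavor of such tail extrapolation (my sketch; the paper's weighted estimators are more refined): fit an exponential tail above a high threshold and extend the exceedance probability beyond the data.

        import numpy as np

        def tail_probability(data, x, threshold_quantile=0.9):
            """Estimate P(X > x) by fitting an exponential tail above a high
            threshold u: P(X > x) ~ P(X > u) * exp(-(x - u) / scale)."""
            data = np.asarray(data)
            u = np.quantile(data, threshold_quantile)
            exceedances = data[data > u] - u
            p_u = exceedances.size / data.size   # empirical P(X > u)
            scale = exceedances.mean()           # MLE of the exponential scale
            return p_u * np.exp(-(x - u) / scale)

        rng = np.random.default_rng(2)
        sample = rng.exponential(scale=1.0, size=5000)
        print(tail_probability(sample, 8.0), np.exp(-8.0))  # estimate vs truth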

  16. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, including...

  17. An Alternative Version of Conditional Probabilities and Bayes' Rule: An Application of Probability Logic

    Science.gov (United States)

    Satake, Eiki; Amato, Philip P.

    2008-01-01

    This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…
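
    A minimal numeric companion (my own illustration, not the paper's truth-table derivation): Bayes' rule turns a prior P(H) into a posterior P(H|E) through the likelihoods P(E|H) and P(E|not H).

        def posterior(prior, p_e_given_h, p_e_given_not_h):
            """Bayes' rule: P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)P(~H)]."""
            joint_h = p_e_given_h * prior
            joint_not_h = p_e_given_not_h * (1.0 - prior)
            return joint_h / (joint_h + joint_not_h)

        # hypothetical diagnostic test: 1% prevalence, 95% sensitivity,
        # 5% false-positive rate
        print(posterior(0.01, 0.95, 0.05))   # ~0.161: still far from certain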

  18. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrödinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general.

  19. Huygens' foundations of probability

    NARCIS (Netherlands)

    Freudenthal, Hans

    It is generally accepted that Huygens based probability on expectation. The term “expectation,” however, stems from Van Schooten's Latin translation of Huygens' treatise. A literal translation of Huygens' Dutch text shows more clearly what Huygens actually meant and how he proceeded.

  20. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  1. Probably Almost Bayes Decisions

    DEFF Research Database (Denmark)

    Anoulova, S.; Fischer, Paul; Poelt, S.

    1996-01-01

    In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian ...

  2. Univariate Probability Distributions

    Science.gov (United States)

    Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.

    2012-01-01

    We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…

  3. The Theory of Probability

    Indian Academy of Sciences (India)

    The Theory of Probability. Andrei Nikolaevich Kolmogorov. Classics, Resonance – Journal of Science Education, Volume 3, Issue 4, April 1998, pp. 103-112. Permanent link: http://www.ias.ac.in/article/fulltext/reso/003/04/0103-0112

  4. Probability Theory Without Tears!

    Indian Academy of Sciences (India)

    Probability Theory Without Tears! S Ramasubramanian. Book Review, Resonance – Journal of Science Education, Volume 1, Issue 2, February 1996, pp. 115-116. Permanent link: http://www.ias.ac.in/article/fulltext/reso/001/02/0115-0116

  5. probably mostly white

    African Journals Online (AJOL)

    Willem Scholtz

    internet – the (probably mostly white) public's interest in the so-called Border War is ostensibly at an all-time high. By far most of the publications are written by ex- ... understanding of this very important episode in the history of Southern Africa. It was, therefore, with some anticipation that one waited for this book, which...

  6. the theory of probability

    Indian Academy of Sciences (India)

    important practical applications in statistical quality control. Of a similar kind are the laws of probability for the scattering of missiles, which are basic in the ... deviations for different ranges for each type of gun and of shell are found empirically in firing practice on an artillery range. But the subsequent solution of all possible ...

  7. On Randomness and Probability

    Indian Academy of Sciences (India)

    On Randomness and Probability: How to Mathematically Model Uncertain Events. Rajeeva L Karandikar, Statistics and Mathematics Unit, Indian Statistical Institute, 7 S J S Sansanwal Marg, New Delhi 110 016, India. Resonance – Journal of Science Education, Volume 1, Issue 2.

  8. On Probability Domains

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2010-12-01

    Motivated by IF-probability theory (intuitionistic fuzzy), we study n-component probability domains in which each event represents a body of competing components and the range of a state represents a simplex S_n of n-tuples of possible rewards; the sum of the rewards is a number from [0,1]. For n=1 we get fuzzy events, for example a bold algebra, and the corresponding fuzzy probability theory can be developed within the category ID of D-posets (equivalently effect algebras) of fuzzy sets and sequentially continuous D-homomorphisms. For n=2 we get IF-events, i.e., pairs (μ, ν) of fuzzy sets μ, ν ∈ [0,1]^X such that μ(x) + ν(x) ≤ 1 for all x ∈ X, but we order our pairs (events) coordinatewise. Hence the structure of IF-events (where (μ1, ν1) ≤ (μ2, ν2) whenever μ1 ≤ μ2 and ν2 ≤ ν1) is different and, consequently, the resulting IF-probability theory models a different principle. The category ID is cogenerated by I=[0,1] (objects of ID are subobjects of powers I^X), has nice properties, and basic probabilistic notions and constructions are categorical. For example, states are morphisms. We introduce the category S_nD cogenerated by S_n = {(x_1, x_2, ..., x_n) ∈ I^n : x_1 + ... + x_n ≤ 1}, carrying the coordinatewise partial order, difference, and sequential convergence, and we show how basic probability notions can be defined within S_nD.

  9. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extended...

  10. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    Full Text Available The influence of "Grundbegriffe" by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory "calling for" an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables - dual maps to random variables) have very different "mathematical nature". Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events, elementary category theory, and covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0,1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0,1]-valued functions), the minimal domain of probability containing "fractions" of classical random events, and we upgrade the notions of probability measure and random variable.
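
    A small sketch of the upgrade being described, under my reading of the abstract: classical indicator functions embed into [0,1]-valued fuzzy events, and the Łukasiewicz truncated sum and product play the roles of union and intersection.

        def luk_or(a, b):
            """Lukasiewicz (truncated) sum: generalizes the union of events."""
            return min(1.0, a + b)

        def luk_and(a, b):
            """Lukasiewicz product: generalizes the intersection of events."""
            return max(0.0, a + b - 1.0)

        def luk_not(a):
            """Lukasiewicz negation: generalizes the complement."""
            return 1.0 - a

        # On {0,1}-valued (Boolean) events these reduce to the classical
        # operations...
        for a in (0.0, 1.0):
            for b in (0.0, 1.0):
                assert luk_or(a, b) == max(a, b)
                assert luk_and(a, b) == min(a, b)

        # ...while "fractions" of events, such as a = 0.4, are now admissible:
        print(luk_or(0.4, 0.3), luk_and(0.4, 0.3), luk_not(0.4))  # 0.7 0.0 0.6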

  11. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  12. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present; in particular, there is no categorical distributive law between them. We introduce the powerdomain of indexed valuations, which modifies the usual probabilistic powerdomain to take more detailed account of where probabilistic choices are made. We show the existence of a distributive law involving the powerdomain of indexed valuations, which reveals the computational intuition lying behind the mathematics. In the second part of the thesis we provide an operational reading of continuous valuations on certain domains (the distributive concrete domains of Kahn and Plotkin) through the model of probabilistic event structures.

  13. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, in general, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a...

  14. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  15. Superpositions of probability distributions.

    Science.gov (United States)

    Jizba, Petr; Kleinert, Hagen

    2008-09-01

    Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much easier than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
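
    One standard instance of such a superposition (my example, not the authors'): smearing centered Gaussians with an exponential distribution over the variance v yields a heavy-tailed Laplace distribution, which a quick Monte Carlo check confirms.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 200_000
        v = rng.exponential(scale=2.0, size=n)   # smearing distribution over v
        x = rng.normal(0.0, np.sqrt(v))          # Gaussian draws, random variance

        # An exponential variance mixture of centered Gaussians is Laplace with
        # scale b where E[v] = 2 b^2; compare sample and theoretical moments.
        b = np.sqrt(2.0 / 2.0)
        print(np.mean(np.abs(x)), b)             # E|X| = b for the Laplace law
        print(np.var(x), 2 * b**2)               # Var X = 2 b^2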

  16. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  17. Measurement uncertainty and probability

    National Research Council Canada - National Science Library

    Willink, Robin

    2013-01-01

    Contents (excerpt): ... and probability models; 3.4 Inference and confidence; 3.5 Two central limit theorems; 3.6 The Monte Carlo method and process simulation; 4 The randomization of systematic errors; 4.1 The Working Group of 1980; 4.2 From classical repetition to practica...

  18. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples, from the most general scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management.

  19. Structural Minimax Probability Machine.

    Science.gov (United States)

    Gu, Bin; Sun, Xingming; Sheng, Victor S

    2017-07-01

    Minimax probability machine (MPM) is an interesting discriminative classifier based on generative prior knowledge. It can directly estimate the probabilistic accuracy bound by minimizing the maximum probability of misclassification. The structural information of data is an effective way to represent prior knowledge, and has been found to be vital for designing classifiers in real-world problems. However, MPM only considers the prior probability distribution of each class with a given mean and covariance matrix, which does not efficiently exploit the structural information of data. In this paper, we use two finite mixture models to capture the structural information of the data from binary classification. For each subdistribution in a finite mixture model, only its mean and covariance matrix are assumed to be known. Based on the finite mixture models, we propose a structural MPM (SMPM). SMPM can be solved effectively by a sequence of the second-order cone programming problems. Moreover, we extend a linear model of SMPM to a nonlinear model by exploiting kernelization techniques. We also show that the SMPM can be interpreted as a large margin classifier and can be transformed to support vector machine and maxi-min margin machine under certain special conditions. Experimental results on both synthetic and real-world data sets demonstrate the effectiveness of SMPM.

  20. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  1. Lectures on probability and statistics

    Energy Technology Data Exchange (ETDEWEB)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.
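
    The a priori dice computation mentioned here can be done by direct enumeration (a trivial sketch of that "complete a priori understanding"):

        from itertools import product
        from fractions import Fraction

        def probability_of_sum(n_dice, target):
            """A priori probability that n fair six-sided dice sum to target,
            by enumerating all equally likely outcomes."""
            outcomes = list(product(range(1, 7), repeat=n_dice))
            hits = sum(1 for roll in outcomes if sum(roll) == target)
            return Fraction(hits, len(outcomes))

        print(probability_of_sum(2, 7))    # 6/36 = 1/6
        print(probability_of_sum(3, 10))   # 27/216 = 1/8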

  2. Probability & Perception: The Representativeness Heuristic in Action

    Science.gov (United States)

    Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.

    2014-01-01

    If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…
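
    The activity described here is easy to mirror in code (a sketch of the suggested exercise; the particular sequences compared are my own choice): simulate many double rolls and compare the frequencies of two specific sequences, say (6, 6) and (6, 3).

        import random

        rng = random.Random(42)
        n = 100_000
        counts = {(6, 6): 0, (6, 3): 0}
        for _ in range(n):
            roll = (rng.randint(1, 6), rng.randint(1, 6))
            if roll in counts:
                counts[roll] += 1

        # Both specific sequences occur with probability 1/36 ~ 0.0278, contrary
        # to the representativeness intuition that the mixed one is likelier.
        print({k: v / n for k, v in counts.items()})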

  3. The Britannica Guide to Statistics and Probability

    CERN Document Server

    2011-01-01

    By observing patterns and repeated behaviors, mathematicians have devised calculations to significantly reduce human potential for error. This volume introduces the historical and mathematical basis of statistics and probability as well as their application to everyday situations. Readers will also meet the prominent thinkers who advanced the field and established a numerical basis for prediction

  4. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Full Text Available Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers' theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises' exceptionally restrictive definition of probability. This paper challenges Richard von Mises' definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that the nature of human action and the relative frequency method for calculating numerical probabilities both presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  5. Declination Calculator

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Declination is calculated using the current International Geomagnetic Reference Field (IGRF) model. Declination is calculated using the current World Magnetic Model...

  6. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions.
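
    For orientation, the standard formulas behind these quantities (a hedged sketch using textbook relations, not the paper's derivation): the encounter probability of the T_R-year event over an L-year lifetime is 1 - (1 - 1/T_R)^L, and under the Rayleigh assumption the expected maximum of N individual wave heights is roughly H_s * sqrt(ln(N) / 2).

        import math

        def encounter_probability(return_period_years, lifetime_years):
            """Probability that the T_R-year event is exceeded at least once
            during the structure lifetime."""
            return 1.0 - (1.0 - 1.0 / return_period_years) ** lifetime_years

        def expected_max_wave_height(h_s, n_waves):
            """Approximate expected maximum of n Rayleigh-distributed individual
            wave heights in a sea state with significant wave height h_s."""
            return h_s * math.sqrt(math.log(n_waves) / 2.0)

        print(encounter_probability(100, 50))        # ~0.395 for a 100-year event
        print(expected_max_wave_height(6.0, 1000))   # ~11.2 m over 1000 waves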

  7. Probabilities for Solar Siblings

    Science.gov (United States)

    Valtonen, Mauri; Bajkova, A. T.; Bobylev, V. V.; Mylläri, A.

    2015-02-01

    We have shown previously (Bobylev et al. Astron Lett 37:550-562, 2011) that some of the stars in the solar neighborhood today may have originated in the same star cluster as the Sun, and could thus be called Solar Siblings. In this work we investigate the sensitivity of this result to galactic models and to parameters of these models, and also extend the sample of orbits. There are a number of good candidates for the sibling category, but due to the long period of orbit evolution since the break-up of the birth cluster of the Sun, one can only attach probabilities of membership. We find that up to 10 % (but more likely around 1 %) of the members of the Sun's birth cluster could be still found within 100 pc from the Sun today.

  8. Measure, integral and probability

    CERN Document Server

    Capiński, Marek

    2004-01-01

    Measure, Integral and Probability is a gentle introduction that makes measure and integration theory accessible to the average third-year undergraduate student. The ideas are developed at an easy pace in a form that is suitable for self-study, with an emphasis on clear explanations and concrete examples rather than abstract theory. For this second edition, the text has been thoroughly revised and expanded. New features include: · a substantial new chapter, featuring a constructive proof of the Radon-Nikodym theorem, an analysis of the structure of Lebesgue-Stieltjes measures, the Hahn-Jordan decomposition, and a brief introduction to martingales · key aspects of financial modelling, including the Black-Scholes formula, discussed briefly from a measure-theoretical perspective to help the reader understand the underlying mathematical framework. In addition, further exercises and examples are provided to encourage the reader to become directly involved with the material.

  9. A seismic probability map

    Directory of Open Access Journals (Sweden)

    J. M. MUNUERA

    1964-06-01

    Full Text Available The material included in two former papers (SB and EF), which sums 3,307 shocks corresponding to 2,360 years, up to 1960, was reduced to a 50-year period by means of the weight obtained for each epoch. The weighting factor is the ratio of 50 to the number of years in each epoch. The frequency has been referred to basis VII of the international seismic intensity scale, for all cases in which the earthquakes are equal to or greater than VI and up to IX. The sum of the products of frequency and the parameters previously described is the probable frequency expected for the 50-year period. On each active small square we have made the corresponding computation and so we have drawn Map No. 1, in percentage. The epicenters with intensity from X to XI are plotted in Map No. 2 in order to present complementary information. A table shows the return periods obtained for all data (VII to XI), and after checking them against others computed from the first to the last shock, a list includes the probable approximate return periods estimated for the area. The solution we suggest is an appropriate form in which to express the contingent seismic phenomenon, and it improves on the conventional maps showing equal-intensity curves corresponding to the maximal values of a given site.

  10. Method, system, and computer-readable medium for determining performance characteristics of an object undergoing one or more arbitrary aging conditions

    Science.gov (United States)

    Gering, Kevin L.

    2017-04-18

    A method, system, and computer-readable medium are described for characterizing performance loss of an object undergoing an arbitrary aging condition. Baseline aging data may be collected from the object for at least one known baseline aging condition over time, baseline multiple sigmoid model parameters may be determined from the baseline data, and performance loss of the object may be determined over time through multiple sigmoid model parameters associated with the object undergoing the arbitrary aging condition, using a differential deviation-from-baseline approach from the baseline multiple sigmoid model parameters. The system may include an object, monitoring hardware configured to sample performance characteristics of the object, and a processor coupled to the monitoring hardware. The processor is configured to determine performance loss for the arbitrary aging condition from a comparison of the performance characteristics of the object deviating from baseline performance characteristics associated with a baseline aging condition.
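
    The abstract does not give the functional form of the multiple sigmoid model, so the following is only an illustrative stand-in: a two-term sigmoid loss curve fitted to synthetic baseline data, standing in for the baseline-parameter determination step:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def msm(t, a1, k1, t1, a2, k2, t2):
        """A two-term sigmoid model of cumulative performance loss over time t.
        (Illustrative stand-in; the patent's exact form is not in the abstract.)"""
        return (a1 / (1.0 + np.exp(-k1 * (t - t1)))
                + a2 / (1.0 + np.exp(-k2 * (t - t2))))

    # Synthetic "baseline aging" data: weeks vs. fractional capacity loss.
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 100.0, 25)
    loss = msm(t, 0.08, 0.15, 20.0, 0.12, 0.08, 70.0) + rng.normal(0.0, 0.002, t.size)

    p0 = [0.1, 0.1, 25.0, 0.1, 0.1, 60.0]          # rough initial guesses
    params, _ = curve_fit(msm, t, loss, p0=p0, maxfev=20000)
    print("fitted baseline model parameters:", np.round(params, 3))
    ```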

  11. Nonequilibrium random matrix theory: Transition probabilities

    Science.gov (United States)

    Pedro, Francisco Gil; Westphal, Alexander

    2017-03-01

    In this paper we present an analytic method for calculating the transition probability between two random Gaussian matrices with given eigenvalue spectra in the context of Dyson Brownian motion. We show that in the Coulomb gas language, in the large-N limit, memory of the initial state is preserved in the form of a universal linear potential acting on the eigenvalues. We compute the likelihood of any given transition as a function of time, showing that as memory of the initial state is lost, transition probabilities converge to those of the static ensemble.

  12. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  13. MEMS Calculator

    Science.gov (United States)

    SRD 166 MEMS Calculator (Web, free access)   This MEMS Calculator determines the following thin film properties from data taken with an optical interferometer or comparable instrument: a) residual strain from fixed-fixed beams, b) strain gradient from cantilevers, c) step heights or thicknesses from step-height test structures, and d) in-plane lengths or deflections. Then, residual stress and stress gradient calculations can be made after an optical vibrometer or comparable instrument is used to obtain Young's modulus from resonating cantilevers or fixed-fixed beams. In addition, wafer bond strength is determined from micro-chevron test structures using a material test machine.

  14. Fixation probability on clique-based graphs

    Science.gov (United States)

    Choi, Jeong-Ok; Yu, Unjong

    2018-02-01

    The fixation probability of a mutant in the evolutionary dynamics of the Moran process is calculated by the Monte-Carlo method on a few families of clique-based graphs. It is shown that complete suppression of fixation can be realized with the generalized clique-wheel graph in the limit of small wheel-clique ratio and infinite size. The family of clique-stars is an amplifier, and the clique-arms graph changes from amplifier to suppressor as the fitness of the mutant increases. We demonstrate that the overall structure of a graph can be more important in determining the fixation probability than the degree or the heat heterogeneity. The dependence of the fixation probability on the position of the first mutant is discussed.
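
    The clique-based graphs studied in the paper are not reproduced here, but the Monte-Carlo approach itself is easy to sketch for the well-mixed (complete-graph) Moran process, where the exact fixation probability (1 - 1/r)/(1 - 1/r^N) is available as a check. All numbers are illustrative:

    ```python
    import random

    def moran_fixation_mc(n, r, trials=20000):
        """Monte-Carlo estimate of the fixation probability of a single mutant
        with relative fitness r in a well-mixed (complete-graph) Moran process."""
        fixed = 0
        for _ in range(trials):
            m = 1  # number of mutants
            while 0 < m < n:
                # Pick the reproducer proportionally to fitness.
                mutant_reproduces = random.random() < r * m / (r * m + (n - m))
                # The offspring replaces an individual chosen uniformly at random.
                dies_mutant = random.random() < m / n
                if mutant_reproduces and not dies_mutant:
                    m += 1
                elif not mutant_reproduces and dies_mutant:
                    m -= 1
            fixed += (m == n)
        return fixed / trials

    n, r = 20, 1.5
    exact = (1 - 1 / r) / (1 - 1 / r ** n)  # known result for the complete graph
    print(f"MC estimate: {moran_fixation_mc(n, r):.4f}  exact: {exact:.4f}")
    ```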

  15. Probability workshop to be better in probability topic

    Science.gov (United States)

    Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed

    2015-02-01

    The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students at higher-education level have an effect on their performance. 62 fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance in the probability topic is not related to anxiety level; that is, a higher level of statistics anxiety does not entail a lower score in probability performance. The study also revealed that motivated students gained from the probability workshop, their performance in the probability topic showing a positive improvement compared with before the workshop. In addition, there is a significant difference in students' performance between genders, with better achievement among female students than among male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher-learning institutions.

  16. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  17. Probability and statistics: selected problems

    OpenAIRE

    Machado, J.A. Tenreiro; Pinto, Carla M. A.

    2014-01-01

    Probability and Statistics—Selected Problems is a unique book for senior undergraduate and graduate students to review basic material in probability and statistics quickly. Descriptive statistics are presented first, and probability is reviewed second. Discrete and continuous distributions are presented. Sampling and estimation with hypothesis testing are presented in the last two chapters. Solutions to the proposed exercises are listed for the reader's reference.

  18. The probability density function of completed length of service (CLS ...

    African Journals Online (AJOL)

    By investigating the existing relationships between the probability density function of the CLS distribution and some other wastage functions, this paper estimates the functions for some secondary schools in Enugu State. Wastage probabilities are calculated and survivor functions estimated. The accompanying standard errors are ...

  19. On the Provenance of Judgments of Conditional Probability

    Science.gov (United States)

    Zhao, Jiaying; Shah, Anuj; Osherson, Daniel

    2009-01-01

    In standard treatments of probability, Pr(A|B) is defined as the ratio of Pr(A∩B) to Pr(B), provided that Pr(B) > 0. This account of conditional probability suggests a psychological question, namely, whether estimates of Pr(A|B) arise in the mind via implicit calculation of…
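
    The ratio definition is easy to verify on a toy sample space, e.g. two fair dice:

    ```python
    from fractions import Fraction
    from itertools import product

    # Toy sample space: two fair dice. A = "sum is 8", B = "first die is even".
    outcomes = list(product(range(1, 7), repeat=2))
    A = {o for o in outcomes if sum(o) == 8}
    B = {o for o in outcomes if o[0] % 2 == 0}

    def pr(event):
        return Fraction(len(event), len(outcomes))

    # Pr(A|B) = Pr(A ∩ B) / Pr(B) = (3/36) / (18/36) = 1/6
    print(pr(A & B) / pr(B))
    ```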

  20. A Priori Probability Distribution of the Cosmological Constant

    OpenAIRE

    Weinberg, Steven

    2000-01-01

    In calculations of the probability distribution for the cosmological constant, it has been previously assumed that the a priori probability distribution is essentially constant in the very narrow range that is anthropically allowed. This assumption has recently been challenged. Here we identify large classes of theories in which this assumption is justified.

  1. Calculating Quenching Weights

    CERN Document Server

    Salgado, C A; Salgado, Carlos A.; Wiedemann, Urs Achim

    2003-01-01

    We calculate the probability ("quenching weight") that a hard parton radiates an additional energy fraction due to scattering in spatially extended QCD matter. This study is based on an exact treatment of finite in-medium path length, it includes the case of a dynamically expanding medium, and it extends to the angular dependence of the medium-induced gluon radiation pattern. All calculations are done in the multiple soft scattering approximation (Baier-Dokshitzer-Mueller-Peigné-Schiff-Zakharov "BDMPS-Z" formalism) and in the single hard scattering approximation (N=1 opacity approximation). By comparison, we establish a simple relation between transport coefficient, Debye screening mass and opacity, for which both approximations lead to comparable results. Together with this paper, a CPU-inexpensive numerical subroutine for calculating quenching weights is provided electronically. To illustrate its applications, we discuss the suppression of hadronic transverse momentum spectra in nucleus-nucleus colli...

  2. Chemical immobilization of adult female Weddell seals with tiletamine and zolazepam: effects of age, condition and stage of lactation

    Directory of Open Access Journals (Sweden)

    Harcourt Robert G

    2006-02-01

    Full Text Available Abstract Background Chemical immobilization of Weddell seals (Leptonychotes weddellii) has previously been, for the most part, problematic and this has been mainly attributed to the type of immobilizing agent used. In addition to individual sensitivity, physiological status may play an important role. We investigated the use of the intravenous administration of a 1:1 mixture of tiletamine and zolazepam (Telazol®) to immobilize adult females at different points during a physiologically demanding 5–6 week lactation period. We also compared performance between IV and IM injection of the same mixture. Results The tiletamine:zolazepam mixture administered intravenously was an effective method for immobilization with no fatalities or pronounced apnoeas in 106 procedures; however, there was a 25 % (one animal in four) mortality rate with intramuscular administration. Induction time was slightly longer for females at the end of lactation (54.9 ± 2.3 seconds) than at post-parturition (48.2 ± 2.9 seconds). In addition, the number of previous captures had a positive effect on induction time. There was no evidence for effects due to age, condition (total body lipid), stage of lactation or number of captures on recovery time. Conclusion We suggest that intravenous administration of tiletamine and zolazepam is an effective and safe immobilizing agent for female Weddell seals. Although individual traits could not explain variation in recovery time, we suggest careful monitoring of recovery times during longitudinal studies (> 2 captures). We show that physiological pressures do not substantially affect response to chemical immobilization with this mixture; however, consideration must be taken for differences that may exist for immobilization of adult males and juveniles. Nevertheless, we recommend a mass-specific dose of 0.50 – 0.65 mg/kg for future procedures with adult female Weddell seals and a starting dose of 0.50 mg/kg for other age classes and other

  3. Effect of aging conditions on the repair bond strength of a microhybrid and a nanohybrid resin composite.

    Science.gov (United States)

    Ozcan, Mutlu; Cura, Cenk; Brendeke, Johannes

    2010-12-01

    This study evaluated the effect of different aging methods on the repair bond strength and failure types of a microhybrid and a nanohybrid composite. Disk-shaped microhybrid (Quadrant Anterior Shine-QA) and nanohybrid (Tetric EvoCeram-TE) resin composite specimens (N = 192, n = 12 per group) were photopolymerized and randomly assigned to one of three aging conditions: (1) immersion in deionized water (37°C, 2 months), (2) thermocycling (5000 times, 5 to 55°C), (3) immersion in citric acid (pH 3.0; 1 week). The control group was stored dry for 24 h at 37°C. After the aging procedures, the specimens were silica coated (30 μm SiO2) (CoJet-Sand) using an intraoral air-abrasion device, silanized (ESPE Sil), and an intermediate adhesive resin was applied (Visio-Bond, 3M ESPE). Resin composites, once of the same kind as the substrate (QA-QA, TE-TE) and once other than the substrate material (QA-TE, TE-QA), were adhered onto the conditioned substrates. Shear force was applied to the adhesive interface in a universal testing machine (cross-head speed: 1 mm/min). A significant influence of the aging method was observed (p < 0.05) (chi-square). Citric acid aging yielded a significantly lower incidence of score A (8-75%) compared with the control group in all composite combinations (p < 0.05). Both microhybrid and nanohybrid composites could be used either as substrates or as relayering composites in early repairs. Aging factors may diminish the repair quality.

  4. Training Teachers to Teach Probability

    Science.gov (United States)

    Batanero, Carmen; Godino, Juan D.; Roa, Rafael

    2004-01-01

    In this paper we analyze the reasons why the teaching of probability is difficult for mathematics teachers, describe the contents needed in the didactical preparation of teachers to teach probability and analyze some examples of activities to carry out this training. These activities take into account the experience at the University of Granada,…

  5. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  6. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence. Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation. Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  7. Calculation Software

    Science.gov (United States)

    1994-01-01

    MathSoft Plus 5.0 is a calculation software package for electrical engineers and computer scientists who need advanced math functionality. It incorporates SmartMath, an expert system that determines a strategy for solving difficult mathematical problems. SmartMath was the result of the integration into Mathcad of CLIPS, a NASA-developed shell for creating expert systems. By using CLIPS, MathSoft, Inc. was able to save the time and money involved in writing the original program.

  8. Probabilities the little numbers that rule our lives

    CERN Document Server

    Olofsson, Peter

    2014-01-01

    Praise for the First Edition"If there is anything you want to know, or remind yourself, about probabilities, then look no further than this comprehensive, yet wittily written and enjoyable, compendium of how to apply probability calculations in real-world situations."- Keith Devlin, Stanford University, National Public Radio's "Math Guy" and author of The Math Gene and The Unfinished GameFrom probable improbabilities to regular irregularities, Probabilities: The Little Numbers That Rule Our Lives, Second Edition investigates the often surprising effects of risk and chance in our lives. Featur

  9. Considerations on probability: from games of chance to modern science

    Directory of Open Access Journals (Sweden)

    Paola Monari

    2015-12-01

    Full Text Available The article sets out a number of considerations on the distinction between variability and uncertainty over the centuries. Games of chance have always been useful random experiments which through combinatorial calculation have opened the way to probability theory and to the interpretation of modern science through statistical laws. The article also looks briefly at the stormy nineteenth-century debate concerning the definitions of probability which went over the same grounds – sometimes without any historical awareness – as the debate which arose at the very beginnings of probability theory, when the great probability theorists were open to every possible meaning of the term.

  10. Sample size and the probability of a successful trial.

    Science.gov (United States)

    Chuang-Stein, Christy

    2006-01-01

    This paper describes the distinction between the concept of statistical power and the probability of getting a successful trial. While one can choose a very high statistical power to detect a certain treatment effect, the high statistical power does not necessarily translate to a high success probability if the treatment effect to detect is based on the perceived ability of the drug candidate. The crucial factor hinges on our knowledge of the drug's ability to deliver the effect used to power the study. The paper discusses a framework to calculate the 'average success probability' and demonstrates how uncertainty about the treatment effect could affect the average success probability for a confirmatory trial. It complements an earlier work by O'Hagan et al. (Pharmaceutical Statistics 2005; 4:187-201) published in this journal. Computer codes to calculate the average success probability are included.
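
    In the spirit of the framework described (and of O'Hagan et al.), the "average success probability" can be sketched as power averaged over a prior on the treatment effect. The two-arm normal setting, prior parameters and sample size below are illustrative assumptions, not the paper's worked example:

    ```python
    import numpy as np
    from scipy.stats import norm

    def average_success_probability(n_per_arm, sigma, prior_mean, prior_sd,
                                    alpha=0.025, draws=100_000, seed=1):
        """Monte-Carlo average of the power function over a normal prior
        on the true treatment effect (a sketch of 'assurance')."""
        rng = np.random.default_rng(seed)
        delta = rng.normal(prior_mean, prior_sd, draws)  # prior draws of the effect
        se = sigma * np.sqrt(2.0 / n_per_arm)            # SE of the mean difference
        power = 1.0 - norm.cdf(norm.ppf(1 - alpha) - delta / se)
        return power.mean()

    # Fixed-effect power vs. average success probability under effect uncertainty:
    print(average_success_probability(100, sigma=1.0, prior_mean=0.3, prior_sd=0.0))
    print(average_success_probability(100, sigma=1.0, prior_mean=0.3, prior_sd=0.2))
    ```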

  11. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  12. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  13. Calculator calculus

    CERN Document Server

    McCarty, George

    1982-01-01

    How THIS BOOK DIFFERS This book is about the calculus. What distinguishes it, however, from other books is that it uses the pocket calculator to illustrate the theory. A computation that requires hours of labor when done by hand with tables is quite inappropriate as an example or exercise in a beginning calculus course. But that same computation can become a delicate illustration of the theory when the student does it in seconds on his calculator. Furthermore, the student's own personal involvement and easy accomplishment give him reassurance and encouragement. The machine is like a microscope, and its magnification is a hundred millionfold. We shall be interested in limits, and no stage of numerical approximation proves anything about the limit. However, the derivative of f(x) = 67.5^x, for instance, acquires real meaning when a student first appreciates its values as numbers, as limits of a sequence such as 1.1^10, 1.01^100, 1.001^1000, .... Another example is t = 0.1, 0.01, ... in the functio...
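
    The sequence in the excerpt is the classic calculator experiment (1 + 1/n)^n approaching e; in code rather than on a pocket calculator:

    ```python
    import math

    # 1.1^10, 1.01^100, 1.001^1000, ... are (1 + 1/n)^n, which marches toward e.
    for n in (10, 100, 1000, 10**6):
        print(f"(1 + 1/{n})^{n} = {(1 + 1/n)**n:.6f}")
    print(f"e            = {math.e:.6f}")
    ```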

  14. Introduction to probability and measure

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.

  15. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  16. Considerations on a posteriori probability

    Directory of Open Access Journals (Sweden)

    Corrado Gini

    2015-06-01

    Full Text Available In this first paper of 1911, relating to the sex ratio at birth, Gini recast Laplace's rule of succession in a Bayesian form. Gini's intuition consisted in assuming a Beta-type distribution for the prior probability and in introducing the "method of results (direct and indirect)" for the determination of prior probabilities according to the statistical frequency obtained from statistical data.

  17. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    A statistical procedure for the estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. The influence of the safety factor is also investigated. The DECOFF statistical method is benchmarked against the standard Alpha-factor method defined by (DNV, 2011) and model performance is evaluated. Also, the effects that weather forecast uncertainty has on the output Probabilities of Failure are analysed and reported.

  18. Transition Probabilities of Gd I

    Science.gov (United States)

    Bilty, Katherine; Lawler, J. E.; Den Hartog, E. A.

    2011-01-01

    Rare earth transition probabilities are needed within the astrophysics community to determine rare earth abundances in stellar photospheres. The current work is part of an ongoing study of rare earth element neutrals. Transition probabilities are determined by combining radiative lifetimes measured using time-resolved laser-induced fluorescence on a slow atom beam with branching fractions measured from high resolution Fourier transform spectra. Neutral rare earth transition probabilities will be helpful in improving abundances in cool stars in which a significant fraction of rare earths are neutral. Transition probabilities are also needed for research and development in the lighting industry. Rare earths have rich spectra containing hundreds to thousands of transitions throughout the visible and near UV. This makes rare earths valuable additives in Metal Halide - High Intensity Discharge (MH-HID) lamps, giving them a pleasing white light with good color rendering. This poster presents the work done on neutral gadolinium. We will report radiative lifetimes for 135 levels and transition probabilities for upwards of 1500 lines of Gd I. The lifetimes are reported to ±5% and the transition probabilities range from 5% for strong lines to 25% for weak lines. This work is supported by the National Science Foundation under grant CTS 0613277 and the National Science Foundation's REU program through NSF Award AST-1004881.
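
    The combination step described above (lifetimes plus branching fractions giving transition probabilities) follows A_i = BF_i / τ for each line i decaying from a common upper level. The numbers below are illustrative, not measured Gd I values:

    ```python
    # Combining a radiative lifetime with branching fractions: the transition
    # probability of line i from an upper level is A_i = BF_i / tau.
    tau = 25e-9                                 # upper-level radiative lifetime [s]
    branching_fractions = [0.60, 0.30, 0.10]    # must sum to 1 over all decay lines

    for i, bf in enumerate(branching_fractions, 1):
        print(f"line {i}: A = {bf / tau:.3e} s^-1")
    ```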

  19. Reliability Calculations

    DEFF Research Database (Denmark)

    Petersen, Kurt Erling

    1986-01-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability, probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability, Monte Carlo simulation programs are used especially in analysis of very...

  20. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: how travel time is affected by congestion, driving speed, and traffic lights; why different gambling ...

  1. Study on Quantitative Correlations between the Ageing Condition of Transformer Cellulose Insulation and the Large Time Constant Obtained from the Extended Debye Model

    Directory of Open Access Journals (Sweden)

    Yiyi Zhang

    2017-11-01

    Full Text Available Polarization-depolarization current (PDC) measurements are now being used as a diagnostic tool to predict the ageing condition of transformer oil-paper insulation. Unfortunately, it is somewhat difficult to obtain the ageing condition of transformer cellulose insulation using the PDC technique due to variation in transformer insulation geometry. In this paper, to quantify the ageing condition of transformer cellulose insulation using the PDC technique, we first designed a series of experiments under controlled laboratory conditions, and then obtained the branch parameters of an extended Debye model by curve-fitting the PDC data. Finally, the effects of ageing and of water on the parameters of the large-time-constant branches were systematically investigated. It is observed that there is a good exponential correlation between the large time constants and the degree of polymerization (DP). The authors therefore believe that the large time constants may be regarded as a sensitive ageing indicator, and that these correlations might be utilized for the quantitative assessment of the ageing condition of transformer cellulose insulation in the future, since the large time constants are independent of the insulation geometry. In addition, it is found that the water in cellulose pressboards has a predominant effect on the large time constants.
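
    The reported exponential correlation between the large time constants and DP suggests a simple curve fit. A sketch with invented (τ, DP) pairs, not the paper's measurements:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Invented (large time constant, DP) pairs -- not the paper's measurements.
    tau = np.array([120.0, 250.0, 480.0, 900.0, 1500.0])   # seconds
    dp  = np.array([420.0, 560.0, 730.0, 900.0, 1050.0])   # degree of polymerization

    def expo(x, a, b):
        """Exponential trend DP = a * exp(b * tau)."""
        return a * np.exp(b * x)

    (a, b), _ = curve_fit(expo, tau, dp, p0=(400.0, 1e-3))
    print(f"DP ~= {a:.1f} * exp({b:.2e} * tau)")
    ```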

  2. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
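
    The paper's simultaneous 1-α intervals require its specific construction; as a simpler illustration, pointwise intervals for each order statistic can be obtained from the Beta(k, n-k+1) distribution of plotting positions. A sketch of that approximation, not the paper's method:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    x = np.sort(rng.normal(size=50))
    n = len(x)
    k = np.arange(1, n + 1)

    # Pointwise 95% bands: the k-th order statistic's plotting position is
    # Beta(k, n-k+1); map its quantiles through the fitted normal distribution.
    lo = stats.norm.ppf(stats.beta.ppf(0.025, k, n - k + 1), x.mean(), x.std(ddof=1))
    hi = stats.norm.ppf(stats.beta.ppf(0.975, k, n - k + 1), x.mean(), x.std(ddof=1))
    print("all points inside their pointwise intervals:",
          bool(np.all((x >= lo) & (x <= hi))))
    ```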

  3. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the ground work to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, Erdos-Kac invariance principle, functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a text book for students pursuing graduate programs in Mathematics and or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  4. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  5. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Taking the Gaussian distribution into account, the author estimates the probability of the relative displacement of the isolated mass remaining below the vibration criteria. This problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters (damping and natural frequency) are developed so that the probability of exceeding vibration criteria VC-E and VC-D is kept below 0.04.
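
    The core probability statement (a Gaussian relative displacement staying below a vibration criterion) reduces to a normal exceedance calculation. The RMS displacement and criterion below are placeholders; in the paper they would follow from damping and natural frequency:

    ```python
    from scipy.stats import norm

    sigma = 0.18   # RMS relative displacement [um], illustrative
    vc = 0.50      # vibration criterion amplitude [um], illustrative

    # Two-sided exceedance probability for a zero-mean Gaussian displacement.
    p_exceed = 2.0 * (1.0 - norm.cdf(vc / sigma))
    print(f"P(|displacement| > criterion) = {p_exceed:.4f}")
    ```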

  6. Probability on real Lie algebras

    CERN Document Server

    Franz, Uwe

    2016-01-01

    This monograph is a progressive introduction to non-commutativity in probability theory, summarizing and synthesizing recent results about classical and quantum stochastic processes on Lie algebras. In the early chapters, focus is placed on concrete examples of the links between algebraic relations and the moments of probability distributions. The subsequent chapters are more advanced and deal with Wigner densities for non-commutative couples of random variables, non-commutative stochastic processes with independent increments (quantum Lévy processes), and the quantum Malliavin calculus. This book will appeal to advanced undergraduate and graduate students interested in the relations between algebra, probability, and quantum theory. It also addresses a more advanced audience by covering other topics related to non-commutativity in stochastic calculus, Lévy processes, and the Malliavin calculus.

  7. Flood hazard probability mapping method

    Science.gov (United States)

    Kalantari, Zahra; Lyon, Steve; Folkeson, Lennart

    2015-04-01

    In Sweden, spatially explicit approaches have been applied in various disciplines such as landslide modelling based on soil type data and flood risk modelling for large rivers. Regarding flood mapping, most previous studies have focused on complex hydrological modelling on a small scale whereas just a few studies have used a robust GIS-based approach integrating most physical catchment descriptor (PCD) aspects on a larger scale. The aim of the present study was to develop methodology for predicting the spatial probability of flooding on a general large scale. Factors such as topography, land use, soil data and other PCDs were analysed in terms of their relative importance for flood generation. The specific objective was to test the methodology using statistical methods to identify factors having a significant role on controlling flooding. A second objective was to generate an index quantifying flood probability value for each cell, based on different weighted factors, in order to provide a more accurate analysis of potential high flood hazards than can be obtained using just a single variable. The ability of indicator covariance to capture flooding probability was determined for different watersheds in central Sweden. Using data from this initial investigation, a method to subtract spatial data for multiple catchments and to produce soft data for statistical analysis was developed. It allowed flood probability to be predicted from spatially sparse data without compromising the significant hydrological features on the landscape. By using PCD data, realistic representations of high probability flood regions was made, despite the magnitude of rain events. This in turn allowed objective quantification of the probability of floods at the field scale for future model development and watershed management.

  8. Incompatible Stochastic Processes and Complex Probabilities

    Science.gov (United States)

    Zak, Michail

    1997-01-01

    The definition of conditional probabilities is based upon the existence of a joint probability. However, a reconstruction of the joint probability from given conditional probabilities imposes certain constraints upon the latter, so that if several conditional probabilities are chosen arbitrarily, the corresponding joint probability may not exist.
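
    A small feasibility check makes the point concrete: some pairs of conditional probabilities admit no joint distribution at all. For instance Pr(A|B) = 1 together with Pr(B|A) = 0 forces Pr(A∩B) to be simultaneously positive and zero. A brute-force sketch:

    ```python
    import numpy as np

    def joint_exists(p_a_given_b, p_b_given_a, grid=200):
        """Brute-force feasibility check: is there any joint distribution on
        (A, B) reproducing both given conditional probabilities?"""
        for p_b in np.linspace(0.005, 1.0, grid):
            p_ab = p_a_given_b * p_b                  # forced by Pr(A|B)
            if p_b_given_a == 0.0:
                if np.isclose(p_ab, 0.0):             # then Pr(A,B) must be 0
                    return True
                continue
            p_a = p_ab / p_b_given_a                  # forced by Pr(B|A)
            if p_a <= 1.0 and p_ab <= min(p_a, p_b) and p_a + p_b - p_ab <= 1.0:
                return True
        return False

    print(joint_exists(1.0, 0.0))    # False: these conditionals are incompatible
    print(joint_exists(0.5, 0.25))   # True: e.g. P(B)=0.4, P(A,B)=0.2, P(A)=0.8
    ```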

  9. Probability measures on metric spaces

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    In this book, the author gives a cohesive account of the theory of probability measures on complete metric spaces (which is viewed as an alternative approach to the general theory of stochastic processes). After a general description of the basics of topology on the set of measures, the author discusses regularity, tightness, and perfectness of measures, properties of sampling distributions, and metrizability and compactness theorems. Next, he describes arithmetic properties of probability measures on metric groups and locally compact abelian groups. Covered in detail are notions such as decom

  10. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  11. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  12. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  13. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  14. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  15. Exact Probability Distribution versus Entropy

    Directory of Open Access Journals (Sweden)

    Kerstin Andersson

    2014-10-01

    Full Text Available The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
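
    The guessing strategy itself is one line of arithmetic: with candidates tried in decreasing order of probability, the average number of guesses is E[G] = Σ_i i·p_(i), where p_(i) is the i-th largest probability. A sketch on a random distribution, illustrative rather than the paper's language model:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    p = rng.dirichlet(np.ones(10_000))        # a random distribution over "words"
    p_sorted = np.sort(p)[::-1]               # guess in decreasing probability
    expected_guesses = np.sum(np.arange(1, p.size + 1) * p_sorted)
    print(f"E[guesses] = {expected_guesses:.1f} out of {p.size} words")
    ```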

  16. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords: Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  17. Probability and statistics: A reminder

    Science.gov (United States)

    Clément, Benoit

    2013-07-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from "data analysis in experimental sciences" given in [1].

  18. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  19. Probability and statistics: A reminder

    OpenAIRE

    Clément Benoit

    2013-01-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from "data analysis in experimental sciences" given in [1].

  20. Estimating Probabilities in Recommendation Systems

    OpenAIRE

    Sun, Mingxuan; Lebanon, Guy; Kidwell, Paul

    2010-01-01

    Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computat...

  1. On the discretization of probability density functions and the ...

    Indian Academy of Sciences (India)

    In probability theory, statistics, statistical mechanics, communication theory, and other fields of science, the calculation of Rényi and Tsallis entropies [1–3] for a probability density function ρ(x) involves the integral ∫_a^b [ρ(x)]^q dx, where q ≥ 0 is a parameter. The aim of this paper is to present a procedure for the discretization of ...
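
    The integral is straightforward to evaluate numerically before any discretization; for a standard normal density and q = 2 the analytic value 1/(2√π) is available as a check:

    ```python
    import numpy as np
    from scipy.integrate import quad

    # I_q = integral of rho(x)^q over [a, b], here for a standard normal density.
    rho = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
    q = 2.0

    I_q, _ = quad(lambda x: rho(x) ** q, -5, 5)
    renyi_2 = np.log(I_q) / (1 - q)           # Renyi entropy of order 2
    print(f"I_q = {I_q:.6f} (analytic 1/(2*sqrt(pi)) = {1/(2*np.sqrt(np.pi)):.6f})")
    print(f"Renyi-2 entropy = {renyi_2:.6f}")
    ```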

  2. Transition probabilities and radiative lifetimes of levels in F I

    Energy Technology Data Exchange (ETDEWEB)

    Celik, Gueltekin, E-mail: gultekin@selcuk.edu.tr; Dogan, Duygu; Ates, Sule; Taser, Mehmet

    2012-07-15

    The electric dipole transition probabilities and the lifetimes of excited levels have been calculated using the weakest bound electron potential model theory (WBEPMT) and the quantum defect orbital theory (QDOT) in atomic fluorine. In the calculations, many transition arrays, including both multiplet and fine-structure transitions, are considered. We employed Numerical Coulomb Approximation (NCA) wave functions and numerical non-relativistic Hartree-Fock (NRHF) wave functions for the expectation values of radii in the determination of parameters. The necessary energy values have been taken from experimental energy data in the literature. The calculated transition probabilities and lifetimes have been compared with available theoretical and experimental results. Good agreement with results in the literature has been obtained. Moreover, some transition probability and lifetime values not available in the literature for some highly excited levels have been obtained using these methods.

  3. Detection probabilities for time-domain velocity estimation

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    1991-01-01

    programs, it is demonstrated that the probability of correct estimation depends on the signal-to-noise ratio, transducer bandwidth, number of A-lines and number of samples used in the correlation estimate. The influence of applying a stationary echo-canceler is explained. The echo canceling can be modeled as a filter with a transfer function depending on the actual velocity. This influences the detection probability, which gets lower at certain velocities. An index directly reflecting the probability of detection can easily be calculated from the cross-correlation estimate. This makes it possible to assess...

  4. Reach/frequency for printed media: Personal probabilities or models

    DEFF Research Database (Denmark)

    Mortensen, Peter Stendahl

    2000-01-01

    The author evaluates two different ways of estimating reach and frequency of plans for printed media. The first assigns reading probabilities to groups of respondents and calculates reach and frequency by simulation. The second estimates parameters to a model for reach/frequency. It is concluded that, in order to prevent bias, ratings per group must be used as reading probabilities. Nevertheless, in most cases, the estimates are still biased compared with panel data, thus overestimating net reach. Models with the same assumptions as with assignments of reading probabilities are presented...
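
    The first, simulation-based approach can be sketched directly: assign each respondent group a reading probability and simulate exposures to a schedule of insertions. The groups, probabilities and insertion count below are invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    # group name -> (reading probability per insertion, group size); illustrative.
    groups = {"light": (0.10, 5000), "medium": (0.35, 3000), "heavy": (0.70, 2000)}
    insertions = 4

    exposures = np.concatenate([
        rng.binomial(insertions, p_read, size) for p_read, size in groups.values()
    ])
    reach = np.mean(exposures >= 1)                 # saw at least one insertion
    avg_freq = exposures[exposures >= 1].mean()     # mean exposures among reached
    print(f"net reach = {reach:.3f}, average frequency = {avg_freq:.2f}")
    ```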

  5. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  6. Probability, Information and Statistical Physics

    Science.gov (United States)

    Kuzemsky, A. L.

    2016-03-01

    In this short survey review we discuss foundational issues of the probabilistic approach to information theory and statistical mechanics from a unified standpoint. Emphasis is on the interrelations between theories. The basic aim is tutorial, i.e. to provide a basic introduction to the analysis and applications of probabilistic concepts in the description of various aspects of complexity and stochasticity. We consider probability as a foundational concept in statistical mechanics and review selected advances in the theoretical understanding of the interrelation of probability, information and statistical description with regard to basic notions of the statistical mechanics of complex systems. The review also includes a synthesis of past and present research and a survey of methodology. The purpose of this terse overview is to discuss and partially describe those probabilistic methods and approaches that are used in statistical mechanics, with the aim of making these ideas easier to understand and apply.

  7. Probability on compact Lie groups

    CERN Document Server

    Applebaum, David

    2014-01-01

    Probability theory on compact Lie groups deals with the interaction between “chance” and “symmetry,” a beautiful area of mathematics of great interest in its own sake but which is now also finding increasing applications in statistics and engineering (particularly with respect to signal processing). The author gives a comprehensive introduction to some of the principle areas of study, with an emphasis on applicability. The most important topics presented are: the study of measures via the non-commutative Fourier transform, existence and regularity of densities, properties of random walks and convolution semigroups of measures, and the statistical problem of deconvolution. The emphasis on compact (rather than general) Lie groups helps readers to get acquainted with what is widely seen as a difficult field but which is also justified by the wealth of interesting results at this level and the importance of these groups for applications. The book is primarily aimed at researchers working in probability, s...

  8. Cross Check of NOvA Oscillation Probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States). Dept. of Theoretical Physics; Messier, Mark D. [Indiana Univ., Bloomington, IN (United States). Dept. of Physics

    2018-01-12

    In this note we perform a cross check of the programs used by NOvA to calculate the 3-flavor oscillation probabilities with an independent program using a different method. The comparison is performed at 6 significant figures and the agreement, $|\Delta P|/P$, is better than $10^{-5}$, as good as can be expected with 6 significant figures. In addition, a simple and accurate alternative method to calculate the oscillation probabilities is outlined and compared in the L/E range and matter density relevant for the NOvA experiment.
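
    The full 3-flavor calculation in matter that the note cross-checks is well beyond a snippet, but the basic L/E structure is visible already in the two-flavor vacuum formula; the parameter values below are illustrative:

    ```python
    import math

    def p_mu_to_e(L_km, E_GeV, sin2_2theta=0.085, dm2_eV2=2.5e-3):
        """Two-flavor vacuum appearance probability:
        P(nu_mu -> nu_e) = sin^2(2theta) * sin^2(1.267 * dm^2 * L / E)."""
        return sin2_2theta * math.sin(1.267 * dm2_eV2 * L_km / E_GeV) ** 2

    print(f"P at NOvA-like L=810 km, E=2 GeV: {p_mu_to_e(810, 2.0):.4f}")
    ```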

  9. Measuring Robustness of Timetables at Stations using a Probability Distribution

    DEFF Research Database (Denmark)

    Jensen, Lars Wittrup; Landex, Alex

    infrastructure layouts given a timetable. These two methods provide different precision at the expense of a more complex calculation process. The advanced and more precise method is based on a probability distribution that can describe the expected delay between two trains as a function of the buffer time. This paper proposes to use the exponential distribution, only taking non-negative delays into account, but any probability distribution can be used. Furthermore, the paper proposes that the calculation parameters are estimated from existing delay data, at a station, to achieve a higher precision. As delay...
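
    With the proposed exponential distribution, the expected delay transferred across a buffer time b has the closed form E[(D - b)+] = exp(-λb)/λ, which makes the buffer-time dependence explicit. The mean delay below is an assumed figure:

    ```python
    import math

    mean_delay = 3.0                 # mean delay of preceding train [min], assumed
    lam = 1.0 / mean_delay           # rate of the exponential delay distribution

    for buffer_min in (0, 1, 2, 4, 8):
        # Expected delay passed on to the following train across this buffer.
        transferred = math.exp(-lam * buffer_min) / lam
        print(f"buffer {buffer_min:>2} min -> expected transferred delay "
              f"{transferred:.2f} min")
    ```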

  10. Comments on quantum probability theory.

    Science.gov (United States)

    Sloman, Steven

    2014-01-01

    Quantum probability theory (QP) is the best formal representation available of the most common form of judgment involving attribute comparison (inside judgment). People are capable, however, of judgments that involve proportions over sets of instances (outside judgment). Here, the theory does not do so well. I discuss the theory both in terms of descriptive adequacy and normative appropriateness. Copyright © 2013 Cognitive Science Society, Inc.

  11. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  12. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

    compensation (a $10 lottery) on Amazon Mechanical Turk, an online platform hosted on Amazon.com [31]. All of the participants stated that they were native...

  13. Probability and statistics: A reminder

    Directory of Open Access Journals (Sweden)

    Clément Benoit

    2013-07-01

    Full Text Available The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from "data analysis in experimental sciences" given in [1].

  14. Probability, Statistics, and Computational Science

    OpenAIRE

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient infe...

  15. Entropy in probability and statistics

    Energy Technology Data Exchange (ETDEWEB)

    Rolke, W.A.

    1992-01-01

    The author develops a theory of entropy, where entropy is defined as the Legendre-Fenchel transform of the logarithmic moment generating function of a probability measure on a Banach space. A variety of properties relating the probability measure and its entropy are proven. It is shown that the entropy of a large class of stochastic processes can be approximated by the entropies of the finite-dimensional distributions of the process. For several types of measures the author finds explicit formulas for the entropy, for example for stochastic processes with independent increments and for Gaussian processes. For the entropy of Markov chains, evaluated at the observations of the process, the author proves a central limit theorem. Theorems relating weak convergence of probability measures on a finite dimensional space and pointwise convergence of their entropies are developed and then used to give a new proof of Donsker's theorem. Finally the use of entropy in statistics is discussed. The author shows the connection between entropy and Kullback's minimum discrimination information. A central limit theorem yields a test for the independence of a sequence of observations.

  16. Evaluations of Structural Failure Probabilities and Candidate Inservice Inspection Programs

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.; Simonen, Fredric A.

    2009-05-01

    The work described in this report applies probabilistic structural mechanics models to predict the reliability of nuclear pressure boundary components. These same models are then applied to evaluate the effectiveness of alternative programs for inservice inspection to reduce these failure probabilities. Results of the calculations support the development and implementation of risk-informed inservice inspection of piping and vessels. Studies have specifically addressed the potential benefits of ultrasonic inspections to reduce failure probabilities associated with fatigue crack growth and stress-corrosion cracking. Parametric calculations were performed with the computer code pc-PRAISE to generate an extensive set of plots to cover a wide range of pipe wall thicknesses, cyclic operating stresses, and inspection strategies. The studies have also addressed critical inputs to fracture mechanics calculations, such as the parameters that characterize the number and sizes of fabrication flaws in piping welds. Other calculations quantify the uncertainties associated with the inputs to the calculations, the uncertainties in the fracture mechanics models, and the uncertainties in the resulting calculated failure probabilities. A final set of calculations addresses the effects of flaw sizing errors on the effectiveness of inservice inspection programs.

  17. Comparative Investigation on the Performance of Modified System Poles and Traditional System Poles Obtained from PDC Data for Diagnosing the Ageing Condition of Transformer Polymer Insulation Materials

    Directory of Open Access Journals (Sweden)

    Jiefeng Liu

    2018-02-01

    Full Text Available The life expectancy of a transformer largely depends on the service life of transformer polymer insulation materials. Several papers have reported that the traditional system poles obtained from polarization and depolarization current (PDC) data can be used to assess the condition of transformer insulation systems. However, the traditional system poles technique provides only limited ageing information for transformer polymer insulation. In this paper, the modified system poles obtained from PDC data are proposed to assess the ageing condition of transformer polymer insulation. The aim of the work is to report a comparative investigation on the performance of modified system poles and traditional system poles for assessing the ageing condition of a transformer polymer insulation system. In the present work, a series of experiments has been performed under controlled laboratory conditions. The PDC measurement data, degree of polymerization (DP) and moisture content of the oil-immersed polymer pressboard specimens were carefully monitored. It is observed that, compared to the relationships between traditional system poles and DP values, there are better correlations between the modified system poles and DP values, because the modified system poles capture much more ageing information on transformer polymer insulation. Therefore, the modified system poles proposed in the paper are more suitable for diagnosing the ageing condition of transformer polymer insulation.

  18. Frequentist probability and frequentist statistics

    Energy Technology Data Exchange (ETDEWEB)

    Neyman, J.

    1977-01-01

    A brief, nontechnical outline is given of the author's views on the ''frequentist'' theory of probability and the ''frequentist'' theory of statistics; their applications are illustrated in a few domains of study of nature. The phenomenon of apparently stable relative frequencies as the source of the frequentist theories of probability and statistics is taken up first. Three steps are set out: empirical establishment of apparently stable long-run relative frequencies of events judged interesting, as they develop in nature; guessing and then verifying the chance mechanism, the repeated operation of which produced the observed frequencies--this is a problem of frequentist probability theory; using the hypothetical chance mechanism of the phenomenon studied to deduce rules of adjusting our actions to the observations to ensure the highest ''measure'' of ''success''. Illustrations of the three steps are given. The theory of testing statistical hypotheses is sketched: basic concepts, simple and composite hypotheses, hypothesis tested, importance of the power of the test used, practical applications of the theory of testing statistical hypotheses. Basic ideas and an example of the randomization of experiments are discussed, and an ''embarrassing'' example is given. The problem of statistical estimation is sketched: example of an isolated problem, example of connected problems treated routinely, empirical Bayes theory, point estimation. The theory of confidence intervals is outlined: basic concepts, anticipated misunderstandings, construction of confidence intervals: regions of acceptance. Finally, the theory of estimation by confidence intervals or regions is considered briefly. 4 figures. (RWR)

  19. Response-probability volume histograms and iso-probability of response charts in treatment plan evaluation.

    Science.gov (United States)

    Mavroidis, Panayiotis; Ferreira, Brigida Costa; Lopes, Maria do Carmo

    2011-05-01

    This study aims at demonstrating a new method for treatment plan evaluation and comparison based on the radiobiological response of individual voxels. This is performed by applying the method to three different cancer types and treatment plans of different conformalities. Furthermore, its usefulness is examined in conjunction with traditionally applied radiobiological and dosimetric treatment plan evaluation criteria. Three different cancer types (head and neck, breast and prostate) were selected to quantify the benefits of the proposed treatment plan evaluation method. In each case, conventional conformal radiotherapy (CRT) and intensity modulated radiotherapy (IMRT) treatment configurations were planned. Iso-probability of response charts were produced by calculating the response probability in every voxel using the linear-quadratic-Poisson model and the dose-response parameters of the corresponding structure to which each voxel belongs. The overall probabilities of target and normal tissue responses were calculated using the Poisson and the relative seriality models, respectively. The 3D dose distribution converted to a 2 Gy fractionation, D2Gy, and iso-BED distributions are also shown and compared with the proposed methodology. Response-probability volume histograms (RVH) were derived and compared with common dose volume histograms (DVH). The different dose distributions were also compared using the complication-free tumor control probability, P+, the biologically effective uniform dose, D, and common dosimetric criteria. 3D iso-probability of response distributions are very useful for plan evaluation, since their visual information focuses on the doses that are likely to have a larger clinical effect in that particular organ. The graphical display becomes independent of the prescription dose, highlighting the local radiation therapy effect in each voxel without the loss of important spatial information. For example, due to the exponential nature of the Poisson…
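
    A minimal per-voxel sketch of the linear-quadratic-Poisson response calculation described above; the radiobiological parameters and clonogen number are assumed for illustration, not taken from the paper:

        import math

        def voxel_response_probability(d, n, alpha, beta, clonogens):
            # Linear-quadratic survival over n fractions of dose d, fed into a Poisson model
            surviving = clonogens * math.exp(-n * (alpha * d + beta * d * d))
            return math.exp(-surviving)

        # Assumed parameters: 35 x 2 Gy, alpha = 0.3 /Gy, beta = 0.03 /Gy^2, 1e7 clonogens
        print(voxel_response_probability(2.0, 35, 0.3, 0.03, 1e7))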

  20. An Alternative Teaching Method of Conditional Probabilities and Bayes' Rule: An Application of the Truth Table

    Science.gov (United States)

    Satake, Eiki; Vashlishan Murray, Amy

    2015-01-01

    This paper presents a comparison of three approaches to the teaching of probability to demonstrate how the truth table of elementary mathematical logic can be used to teach the calculations of conditional probabilities. Students are typically introduced to the topic of conditional probabilities--especially the ones that involve Bayes' rule--with…
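
    A worked example of the kind of conditional-probability calculation the record refers to; the prevalence, sensitivity and specificity figures are illustrative, not from the paper:

        def bayes_posterior(prior, sensitivity, specificity):
            # Total probability of a positive result, then Bayes' rule
            p_pos = sensitivity * prior + (1 - specificity) * (1 - prior)
            return sensitivity * prior / p_pos

        # P(disease | positive test) with 1% prevalence, 90% sensitivity, 95% specificity
        print(bayes_posterior(0.01, 0.90, 0.95))  # ~0.154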

  1. Probability of Detection Demonstration Transferability

    Science.gov (United States)

    Parker, Bradford H.

    2008-01-01

    The ongoing Mars Science Laboratory (MSL) Propellant Tank Penetrant Nondestructive Evaluation (NDE) Probability of Detection (POD) Assessment (NESC activity) has surfaced several issues associated with liquid penetrant POD demonstration testing. This presentation lists factors that may influence the transferability of POD demonstration tests. Initial testing will address the liquid penetrant inspection technique. Some of the factors to be considered in this task are crack aspect ratio, the extent of the crack opening, the material and the distance between the inspection surface and the inspector's eye.

  2. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  3. Nash equilibrium with lower probabilities

    DEFF Research Database (Denmark)

    Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1998-01-01

    We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible … to distinguish between a player's assessment of ambiguity and his attitude towards ambiguity. We also generalize the concept of trembling hand perfect equilibrium. Finally, we demonstrate that for certain attitudes towards ambiguity it is possible to explain cooperation in the one-shot Prisoner's Dilemma…

  4. Constraints on probability distributions of grammatical forms

    Directory of Open Access Journals (Sweden)

    Kostić Aleksandar

    2007-01-01

    Full Text Available In this study we investigate the constraints on probability distribution of grammatical forms within morphological paradigms of Serbian language, where paradigm is specified as a coherent set of elements with defined criteria for inclusion. Thus, for example, in Serbian all feminine nouns that end with the suffix "a" in their nominative singular form belong to the third declension, the declension being a paradigm. The notion of a paradigm could be extended to other criteria as well, hence, we can think of noun cases, irrespective of grammatical number and gender, or noun gender, irrespective of case and grammatical number, also as paradigms. We took the relative entropy as a measure of homogeneity of probability distribution within paradigms. The analysis was performed on 116 morphological paradigms of typical Serbian and for each paradigm the relative entropy has been calculated. The obtained results indicate that for most paradigms the relative entropy values fall within a range of 0.75 - 0.9. Nonhomogeneous distribution of relative entropy values allows for estimating the relative entropy of the morphological system as a whole. This value is 0.69 and can tentatively be taken as an index of stability of the morphological system.
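
    A minimal sketch of the relative entropy measure used above (Shannon entropy of the form distribution, normalized by its maximum); the frequency counts are hypothetical:

        import math

        def relative_entropy(counts):
            # Entropy of the paradigm's form distribution, normalized by log2(n) so 1.0 = uniform
            total = sum(counts)
            h = -sum((c / total) * math.log2(c / total) for c in counts if c > 0)
            return h / math.log2(len(counts))

        # Hypothetical frequencies of seven grammatical forms in one paradigm
        print(relative_entropy([120, 80, 60, 40, 30, 20, 10]))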

  5. A reconsideration of Lotka's extinction probability using a bisexual branching process

    OpenAIRE

    Hull, David M.

    2001-01-01

    It is generally recognized that Alfred Lotka made the first application of standard Galton-Watson branching process theory to calculate an extinction probability in a specific population (using asexual reproduction). This note applies bisexual Galton-Watson branching process theory to the calculation of an extinction probability from Lotka's data, yielding a somewhat higher value.
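
    For the standard (asexual) Galton-Watson case mentioned above, the extinction probability is the smallest fixed point of the offspring probability generating function; a sketch under an assumed Poisson offspring distribution (not Lotka's actual data, and not the bisexual variant):

        import math

        def extinction_probability(pgf, tol=1e-12, max_iter=100000):
            # Iterate q -> f(q) from 0; converges to the smallest fixed point of the PGF
            q = 0.0
            for _ in range(max_iter):
                q_next = pgf(q)
                if abs(q_next - q) < tol:
                    return q_next
                q = q_next
            return q

        # Assumed Poisson(1.2) offspring distribution, PGF f(s) = exp(1.2 * (s - 1))
        print(extinction_probability(lambda s: math.exp(1.2 * (s - 1))))  # ~0.686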

  6. Calculating Speed of Sound

    Science.gov (United States)

    Bhatnagar, Shalabh

    2017-01-01

    Sound is an emerging source of renewable energy, but it has some limitations. The main limitation is that the amount of energy that can be extracted from sound is very small, because of the velocity of sound. The velocity of sound changes with the medium. If we could increase the velocity of sound in a medium, we would probably be able to extract more energy from sound and transfer it at a higher rate. To increase the velocity of sound we should know the speed of sound. In classical mechanics, speed is the distance travelled by a particle divided by time, whereas velocity is the displacement of the particle divided by time. The speed of sound in dry air at 20 °C (68 °F) is considered to be 343.2 meters per second, and it would be more accurate to call 343.2 meters per second the velocity of sound rather than the speed, as it reflects the displacement of the sound rather than the total distance the sound wave covered. Sound travels in the form of a mechanical wave, so when calculating the speed of sound the whole path of the wave should be considered, not just the distance traveled by the sound. In this paper I focus on calculating the actual speed of the sound wave, which can help us extract more energy and make sound travel with faster velocity.
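
    The figure quoted above can be reproduced with the standard ideal-gas formula for the speed of sound in dry air, c = sqrt(gamma * R * T / M); a minimal sketch:

        import math

        def speed_of_sound(t_celsius):
            gamma = 1.4           # ratio of specific heats for air
            R = 8.314462618       # universal gas constant, J/(mol K)
            M = 0.0289647         # molar mass of dry air, kg/mol
            return math.sqrt(gamma * R * (t_celsius + 273.15) / M)

        print(speed_of_sound(20.0))  # ~343.2 m/s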

  7. Approximation of ruin probabilities via Erlangized scale mixtures

    DEFF Research Database (Denmark)

    Peralta, Oscar; Rojas-Nandayapa, Leonardo; Xie, Wangyue

    2018-01-01

    In this paper, we extend an existing scheme for numerically calculating the probability of ruin of a classical Cramér–Lundberg reserve process having absolutely continuous but otherwise general claim size distributions. We employ a dense class of distributions that we denominate Erlangized scale m…
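
    For reference, the classical Cramér–Lundberg model that this scheme generalizes admits a closed form when claims are exponential; a sketch of that textbook special case (not the paper's Erlangization algorithm):

        import math

        def ruin_probability(u, lam, mu, c):
            # Exponential claims with mean mu: psi(u) = rho * exp(-(1 - rho) * u / mu)
            rho = lam * mu / c    # requires positive safety loading, rho < 1
            return rho * math.exp(-(1 - rho) * u / mu)

        # Illustrative parameters: claim rate 1, mean claim 1, premium rate 1.5, capital 5
        print(ruin_probability(5.0, lam=1.0, mu=1.0, c=1.5))  # ~0.126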

  8. Simplified Freeman-Tukey test statistics for testing probabilities in ...

    African Journals Online (AJOL)

    This paper presents the simplified version of the Freeman-Tukey test statistic for testing hypothesis about multinomial probabilities in one, two and multidimensional contingency tables that does not require calculating the expected cell frequencies before test of significance. The simplified method established new criteria of ...

  9. Measuring Robustness of Timetables in Stations using a Probability Distribution

    DEFF Research Database (Denmark)

    Jensen, Lars Wittrup; Landex, Alex

    delays caused by interdependencies, and result in a more robust operation. Currently, three methods to calculate the complexity of a station exist: 1) complexity of a station based on the track layout; 2) complexity of a station based on the probability of a conflict using a plan of operation; 3) complexity…

  10. On estimating the fracture probability of nuclear graphite components

    Science.gov (United States)

    Srinivasan, Makuteswara

    2008-10-01

    The properties of nuclear grade graphites exhibit anisotropy and could vary considerably within a manufactured block. Graphite strength is affected by the direction of alignment of the constituent coke particles, which is dictated by the forming method, coke particle size, and the size, shape, and orientation distribution of pores in the structure. In this paper, a Weibull failure probability analysis for components is presented using the American Society of Testing Materials strength specification for nuclear grade graphites for core components in advanced high-temperature gas-cooled reactors. The risk of rupture (probability of fracture) and survival probability (reliability) of large graphite blocks are calculated for varying and discrete values of service tensile stresses. The limitations in these calculations are discussed from considerations of actual reactor environmental conditions that could potentially degrade the specification properties because of damage due to complex interactions between irradiation, temperature, stress, and variability in reactor operation.
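
    A minimal sketch of the Weibull risk-of-rupture calculation described above; the characteristic strength and Weibull modulus below are illustrative placeholders, since actual values are grade-specific:

        import math

        def fracture_probability(stress, sigma0, m):
            # Two-parameter Weibull risk of rupture: Pf = 1 - exp(-(stress / sigma0)**m)
            return 1.0 - math.exp(-((stress / sigma0) ** m))

        # Survival probability (reliability) is the complement, exp(-(stress / sigma0)**m)
        for s in (10.0, 15.0, 20.0):  # assumed service tensile stresses, MPa
            print(s, fracture_probability(s, sigma0=25.0, m=10.0))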

  11. [Pre-test and post-test probabilities. Who cares?].

    Science.gov (United States)

    Steurer, Johann

    2009-01-01

    The accuracy of a diagnostic test, e.g. abdominal ultrasound in patients with suspected acute appendicitis, is described in terms of sensitivity and specificity. According to eminent textbooks, physicians should use the values of the sensitivity and specificity of a test in their diagnostic reasoning: estimate, after taking the history, the pre-test probability of the suspected illness, order one or more tests, and then calculate the respective post-test probability. In practice, physicians almost never follow this line of thinking. The main reasons are that estimating concrete illness probabilities is difficult, the values of the sensitivity and specificity of a test are most often not known to physicians, and calculations during daily practice are intricate. Helpful for busy physicians are trustworthy expert recommendations on which test to apply in which clinical situation.
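
    The pre-test to post-test calculation referred to above is conveniently done with odds and the positive likelihood ratio; a sketch with assumed test accuracy figures:

        def posttest_probability(pretest, sensitivity, specificity):
            # Positive likelihood ratio; pre-test odds -> post-test odds -> probability
            lr_pos = sensitivity / (1 - specificity)
            pre_odds = pretest / (1 - pretest)
            post_odds = pre_odds * lr_pos
            return post_odds / (1 + post_odds)

        # Assumed: 30% pre-test probability, a test with 85% sensitivity, 90% specificity
        print(posttest_probability(0.30, sensitivity=0.85, specificity=0.90))  # ~0.78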

  12. Wigner function and the probability representation of quantum states

    Directory of Open Access Journals (Sweden)

    Man’ko Margarita A.

    2014-01-01

    Full Text Available The relation of the Wigner function with the fair probability distribution called the tomographic distribution or quantum tomogram associated with the quantum state is reviewed. The connection of the tomographic picture of quantum mechanics with the integral Radon transform of the Wigner quasidistribution is discussed. The Wigner–Moyal equation for the Wigner function is presented in the form of a kinetic equation for the tomographic probability distribution, both in quantum mechanics and in the classical limit of the Liouville equation. The calculation of moments of physical observables in terms of integrals with the state tomographic probability distributions is constructed, having the standard form of averaging in probability theory. New uncertainty relations for position and momentum are written in terms of optical tomograms suitable for direct experimental checks. Some recent experiments on checking the uncertainty relations, including the entropic uncertainty relations, are discussed.

  13. MARKOV MODELS IN CALCULATING CLV

    OpenAIRE

    DECEWICZ, Anna

    2015-01-01

    The paper presents a method of calculating customer lifetime value and finding an optimal remarketing strategy based on a Markov model with short-term memory of the client's activity. Furthermore, a sensitivity analysis of the optimal strategy is conducted for two types of retention rate functional forms defining the transition probabilities.
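
    A minimal sketch of a Markov-chain CLV calculation of the kind described above (a simple two-state chain without the paper's short-term memory extension; all numbers are hypothetical):

        import numpy as np

        def markov_clv(P, margins, start, horizon=200, discount=0.9):
            # Discounted sum of expected per-state margins as the state distribution evolves
            state = np.asarray(start, dtype=float)
            clv = 0.0
            for t in range(horizon):
                clv += (discount ** t) * (state @ margins)
                state = state @ P
            return clv

        P = np.array([[0.8, 0.2],    # active -> active / churned (retention rate 0.8)
                      [0.0, 1.0]])   # churned is absorbing
        print(markov_clv(P, margins=np.array([100.0, 0.0]), start=[1.0, 0.0]))  # ~357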

  14. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  15. Lévy laws in free probability

    OpenAIRE

    Barndorff-Nielsen, Ole E.; Thorbjørnsen, Steen

    2002-01-01

    This article and its sequel outline recent developments in the theory of infinite divisibility and Lévy processes in free probability, a subject area belonging to noncommutative (or quantum) probability. The present paper discusses the classes of infinitely divisible probability measures in classical and free probability, respectively, via a study of the Bercovici–Pata bijection between these classes.

  16. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  17. A practical overview on probability distributions

    OpenAIRE

    Viti, Andrea; Terzi, Alberto; Bertolaccini, Luca

    2015-01-01

    Aim of this paper is a general definition of probability, of its main mathematical features and the features it presents under particular circumstances. The behavior of probability is linked to the features of the phenomenon we would predict. This link can be defined as a probability distribution. Given the characteristics of phenomena (which we can also call variables), there are corresponding probability distributions. For categorical (or discrete) variables, the probability can be described by a bino…

  18. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  19. Non-equilibrium random matrix theory. Transition probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Pedro, Francisco Gil [Univ. Autonoma de Madrid (Spain). Dept. de Fisica Teorica; Westphal, Alexander [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany). Gruppe Theorie

    2016-06-15

    In this letter we present an analytic method for calculating the transition probability between two random Gaussian matrices with given eigenvalue spectra in the context of Dyson Brownian motion. We show that in the Coulomb gas language, in the large-N limit, memory of the initial state is preserved in the form of a universal linear potential acting on the eigenvalues. We compute the likelihood of any given transition as a function of time, showing that as memory of the initial state is lost, transition probabilities converge to those of the static ensemble.

  20. Analytic Neutrino Oscillation Probabilities in Matter: Revisited

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT

    2018-01-02

    We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.

  1. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how...

  2. Probability output modeling for support vector machines

    Science.gov (United States)

    Zhang, Xiang; Xiao, Xiaoling; Tian, Jinwen; Liu, Jian

    2007-11-01

    In this paper we propose an approach to modeling the posterior probability output of multi-class SVMs. The sigmoid function is used to estimate the posterior probability output in binary classification. The posterior probability output of multi-class SVMs is then obtained by directly solving equations based on combining the probability outputs of the binary classifiers using Bayes' rule. The differences and different weights among these two-class SVM classifiers, based on the posterior probability, are considered when combining their probability outputs. The comparative experimental results show that our method achieves better classification precision and a better distribution of the posterior probability than the pairwise coupling method and Hastie's optimization method.
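
    The binary building block referred to above is the sigmoid (Platt-style) map from an SVM decision value to a posterior; a sketch with assumed fitted parameters (the paper's multi-class combination via Bayes' rule is not reproduced here):

        import math

        def platt_probability(f, a, b):
            # P(y = 1 | f) = 1 / (1 + exp(a * f + b)), with a, b fitted on held-out data
            return 1.0 / (1.0 + math.exp(a * f + b))

        # Assumed parameters a = -2.0, b = 0.1 for illustration
        print(platt_probability(1.2, a=-2.0, b=0.1))  # ~0.91, deep inside class 1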

  3. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  4. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this, one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk-averse agents have…

  5. Probability of flooding: An uncertainty analysis

    NARCIS (Netherlands)

    Slijkhuis, K.A.H.; Frijters, M.P.C.; Cooke, R.M.; Vrouwenvelder, A.C.W.M.

    1998-01-01

    In the Netherlands a new safety approach concerning the flood defences will probably be implemented in the near future. Therefore, an uncertainty analysis is currently being carried out to determine the uncertainty in the probability of flooding. The uncertainty of the probability of flooding could…

  6. Lévy processes in free probability

    OpenAIRE

    Barndorff-Nielsen, Ole E.; Thorbjørnsen, Steen

    2002-01-01

    This is the continuation of a previous article that studied the relationship between the classes of infinitely divisible probability measures in classical and free probability, respectively, via the Bercovici–Pata bijection. Drawing on the results of the preceding article, the present paper outlines recent developments in the theory of Lévy processes in free probability.

  7. The trajectory of the target probability effect.

    Science.gov (United States)

    Hon, Nicholas; Yap, Melvin J; Jabar, Syaheed B

    2013-05-01

    The effect of target probability on detection times is well-established: Even when detection accuracy is high, lower probability targets are detected more slowly than higher probability ones. Although this target probability effect on detection times has been well-studied, one aspect of it has remained largely unexamined: How the effect develops over the span of an experiment. Here, we investigated this issue with two detection experiments that assessed different target probability ratios. Conventional block segment analysis and linear mixed-effects modeling converged on two key findings. First, we found that the magnitude of the target probability effect increases as one progresses through a block of trials. Second, we found, by examining the trajectories of the low- and high-probability targets, that this increase in effect magnitude was driven by the low-probability targets. Specifically, we found that low-probability targets were detected more slowly as a block of trials progressed. Performance to high-probability targets, on the other hand, was largely invariant across the block. The latter finding is of particular interest because it cannot be reconciled with accounts that propose that the target probability effect is driven by the high-probability targets.

  8. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  9. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  10. Lectures on probability and statistics. Revision

    Energy Technology Data Exchange (ETDEWEB)

    Yost, G.P.

    1985-06-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. They begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. They finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another. Hopefully, the reader will come away from these notes with a feel for some of the problems and uncertainties involved. Although there are standard approaches, most of the time there is no cut and dried ''best'' solution - ''best'' according to every criterion.

  11. Caustic-induced features in microlensing magnification probability distributions

    Science.gov (United States)

    Rauch, Kevin P.; Mao, Shude; Wambsganss, Joachim; Paczynski, Bohdan

    1992-01-01

    Numerical simulations have uncovered a previously unrecognized 'bump' in the macroimage magnification probabilities produced by a planar distribution of point masses. The result could be relevant to cases of microlensing by star fields in single galaxies, for which this lensing geometry is an excellent approximation. The bump is produced by bright pairs of microimages formed by sources lying near the caustics of the lens. The numerically calculated probabilities for the magnifications in the range between 3 and 30 are significantly higher than those given by the asymptotic relation derived by Schneider. The bump present in the two-dimensional lenses appears not to exist in the magnification probability distribution produced by a fully three-dimensional lens.

  12. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma…

  13. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs. We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.

  14. Adolescents' misinterpretation of health risk probability expressions.

    Science.gov (United States)

    Cohn, L D; Schydlower, M; Foley, J; Copeland, R L

    1995-05-01

    To determine if differences exist between adolescents and physicians in their numerical translation of 13 commonly used probability expressions (eg, possibly, might). Cross-sectional. Adolescent medicine and pediatric orthopedic outpatient units. 150 adolescents and 51 pediatricians, pediatric orthopedic surgeons, and nurses. Numerical ratings of the degree of certainty implied by 13 probability expressions (eg, possibly, probably). Adolescents were significantly more likely than physicians to display comprehension errors, reversing or equating the meaning of terms such as probably/possibly and likely/possibly. Numerical expressions of uncertainty (eg, 30% chance) elicited less variability in ratings than lexical expressions of uncertainty (eg, possibly). Physicians should avoid using probability expressions such as probably, possibly, and likely when communicating health risks to children and adolescents. Numerical expressions of uncertainty may be more effective for conveying the likelihood of an illness than lexical expressions of uncertainty (eg, probably).

  15. A practical overview on probability distributions.

    Science.gov (United States)

    Viti, Andrea; Terzi, Alberto; Bertolaccini, Luca

    2015-03-01

    Aim of this paper is a general definition of probability, of its main mathematical features and the features it presents under particular circumstances. The behavior of probability is linked to the features of the phenomenon we would predict. This link can be defined as a probability distribution. Given the characteristics of phenomena (which we can also call variables), there are corresponding probability distributions. For categorical (or discrete) variables, the probability can be described by a binomial or Poisson distribution in the majority of cases. For continuous variables, the probability can be described by the most important distribution in statistics, the normal distribution. Distributions of probability are briefly described together with some examples of their possible applications.

  16. Two-dimensional imaging of tumour control probabilities and normal tissue complication probabilities.

    Science.gov (United States)

    Szlag, Marta; Slosarek, Krzysztof

    2010-01-01

    To create a presentation method for TCP and NTCP distributions calculated from the dose distribution of a selected CT slice. Three 24-bit colour maps - of dose distribution, delineated structures and CT information - were converted into m-by-n-by-3 data arrays, containing intensities of red, green, and blue colour components for each pixel. All calculations were performed with Matlab v.6.5. The transformation function, which consists of five linear functions, was prepared to translate the colour map into a one-dimensional data array of dose values. A menu-driven application based on the transformation function and mathematical models of complication risk (NTCP) and tumour control probability (TCP) was designed to allow pixel-by-pixel translation of colour maps into one-dimensional arrays of TCP and NTCP values. The result of this work is an application created to visualize the TCP and NTCP distribution for a single CT scan based on the spatial dose distribution calculated in the treatment planning system. The application allows 10 targets (PTV) and 10 organs at risk (OaR) to be defined. The interface allows alpha/beta values to be inserted for each delineated structure. The application computes TCP and NTCP matrices, which are presented as colour maps superimposed on the corresponding CT slice. There is a set of parameters used for the TCP/NTCP calculations which can be defined by the user. Our application is a prototype of an evaluation tool. Although limited to a single plane of the treatment plan, it is believed to be a starting point for further development.

  17. [Inverse probability weighting (IPW) for evaluating and "correcting" selection bias].

    Science.gov (United States)

    Narduzzi, Silvia; Golini, Martina Nicole; Porta, Daniela; Stafoggia, Massimo; Forastiere, Francesco

    2014-01-01

    Inverse probability weighting (IPW) is a methodology developed to account for missingness and selection bias caused by non-random selection of observations, or non-random lack of some information in a subgroup of the population. We provide an overview of the IPW methodology and an application in a cohort study of the association between exposure to traffic air pollution (nitrogen dioxide, NO₂) and 7-year children IQ. This methodology allows the analysis to be corrected by weighting the observations by the inverse of the probability of being selected. IPW is based on the assumption that individual information that can predict the probability of inclusion (non-missingness) is available for the entire study population, so that, after taking account of it, we can make inferences about the entire target population starting from the non-missing observations alone. The procedure for the calculation is the following: first, we consider the entire population at study and calculate the probability of non-missing information using a logistic regression model, where the response is the non-missingness and the covariates are its possible predictors. The weight of each subject is given by the inverse of the predicted probability. The analysis is then performed only on the non-missing observations, using a weighted model. IPW is a technique that allows the selection process to be embedded in the analysis of the estimates, but its effectiveness in "correcting" the selection bias depends on the availability of enough information, for the entire population, to predict the non-missingness probability. In the example proposed, the IPW application showed that the effect of exposure to NO₂ on the verbal intelligence quotient of children is stronger than the effect shown by the analysis performed without regard to the selection processes.
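
    A minimal sketch of the two-step procedure described above, using pandas and statsmodels; the file name, column names and predictors are hypothetical:

        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("cohort.csv")                    # hypothetical cohort file
        df["observed"] = df["iq"].notna().astype(int)     # 1 if the outcome was measured

        # Step 1: model the probability of being observed from predictors known for everyone
        selection = smf.logit("observed ~ maternal_education + parity", data=df).fit()
        df["w"] = 1.0 / selection.predict(df)             # inverse-probability weights

        # Step 2: weighted outcome analysis on the non-missing observations only
        obs = df[df["observed"] == 1]
        outcome = smf.wls("iq ~ no2", data=obs, weights=obs["w"]).fit()
        print(outcome.params)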

  18. Time Dependence of Collision Probabilities During Satellite Conjunctions

    Science.gov (United States)

    Hall, Doyle T.; Hejduk, Matthew D.; Johnson, Lauren C.

    2017-01-01

    The NASA Conjunction Assessment Risk Analysis (CARA) team has recently implemented updated software to calculate the probability of collision (Pc) for Earth-orbiting satellites. The algorithm can employ complex dynamical models for orbital motion, and account for the effects of non-linear trajectories as well as both position and velocity uncertainties. This “3D Pc” method entails computing a 3-dimensional numerical integral for each estimated probability. Our analysis indicates that the 3D method provides several new insights over the traditional “2D Pc” method, even when approximating the orbital motion using the relatively simple Keplerian two-body dynamical model. First, the formulation provides the means to estimate variations in the time derivative of the collision probability, or the probability rate, Rc. For close-proximity satellites, such as those orbiting in formations or clusters, Rc variations can show multiple peaks that repeat or blend with one another, providing insight into the ongoing temporal distribution of risk. For single, isolated conjunctions, Rc analysis provides the means to identify and bound the times of peak collision risk. Additionally, analysis of multiple actual archived conjunctions demonstrates that the commonly used “2D Pc” approximation can occasionally provide inaccurate estimates. These include cases in which the 2D method yields negligibly small probabilities (e.g., Pc < 10^-10), but the 3D estimates are sufficiently large to prompt increased monitoring or collision mitigation (e.g., Pc >= 10^-5). Finally, the archive analysis indicates that a relatively efficient calculation can be used to identify which conjunctions will have negligibly small probabilities. This small-Pc screening test can significantly speed the overall risk analysis computation for large numbers of conjunctions.
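
    For contrast with the 3D method, a minimal numerical sketch of the traditional 2D Pc: integrate the Gaussian density of the relative miss position over the combined hard-body disc (a diagonal covariance and all input values are assumed for illustration):

        import math

        def pc_2d(miss, sigma_x, sigma_y, hbr, n=400):
            # Midpoint-rule integral of the Gaussian miss density over the hard-body circle
            mx, my = miss
            h = 2.0 * hbr / n
            total = 0.0
            for i in range(n):
                x = -hbr + (i + 0.5) * h
                for j in range(n):
                    y = -hbr + (j + 0.5) * h
                    if x * x + y * y <= hbr * hbr:
                        total += math.exp(-0.5 * (((x - mx) / sigma_x) ** 2
                                                  + ((y - my) / sigma_y) ** 2))
            return total * h * h / (2.0 * math.pi * sigma_x * sigma_y)

        # Illustrative encounter: 200 m miss, 100 m x 80 m uncertainty, 20 m hard body
        print(pc_2d((200.0, 0.0), sigma_x=100.0, sigma_y=80.0, hbr=20.0))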

  19. Integrated statistical modelling of spatial landslide probability

    Science.gov (United States)

    Mergili, M.; Chu, H.-J.

    2015-09-01

    Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) the landslide inventory is subset into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e. the spatial probability that at least one landslide pixel occurs within a zone of defined size. We quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km² study area in southern Taiwan, using an inventory of 1399 landslides triggered by the typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability; (iv) removing the largest landslides from the analysis leads to an enhanced model performance.
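
    Step (v) above reduces to a simple per-pixel combination rule; a sketch with hypothetical pixel values:

        def integrated_probability(p_release, p_impact, p_zonal_release):
            # Maximum of the release probability and impact x zonal release probability
            return max(p_release, p_impact * p_zonal_release)

        # Hypothetical pixel: low local release probability, but downslope of a likely source zone
        print(integrated_probability(0.02, 0.6, 0.25))  # 0.15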

  20. Detection probability of vocalizing dugongs during playback of conspecific calls.

    Science.gov (United States)

    Ichikawa, Kotaro; Akamatsu, Tomonari; Shinke, Tomio; Sasamori, Kotoe; Miyauchi, Yukio; Abe, Yuki; Adulyanukosol, Kanjana; Arai, Nobuaki

    2009-10-01

    Dugongs (Dugong dugon) were monitored using simultaneous passive acoustic methods and visual observations in Thai waters during January 2008. Chirp and trill calls were detected by a towed stereo hydrophone array system. Two teams of experienced observers conducted standard visual observations on the same boat. Detection probabilities of acoustic and visual monitoring by the two independent observer teams were compared. Acoustic and visual detection probabilities were 15.1% and 15.7%, respectively, employing a 300 s matching time interval. When conspecific chirp calls were broadcast from an underwater speaker deployed on the side of the observation boat, the detection probability of acoustic monitoring rose to 19.2%. The visual detection probability was 12.5%. Vocal hot spots characterized by frequent acoustic detection of calls were suggested by dispersion analysis, while dugongs were visually observed constantly throughout the focal area. Acoustic monitoring assisted the survey, since it showed detection performance similar to that of experienced visual observers. Playback of conspecific chirps appeared to increase the detection probability, which could be beneficial for future field surveys using passive acoustics in order to ensure the attendance of dugongs in the focal area.

  1. Magnetic Field Calculator

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Magnetic Field Calculator will calculate the total magnetic field, including components (declination, inclination, horizontal intensity, northerly intensity,...

  2. Alcohol Calorie Calculator

    Science.gov (United States)

  3. On the probability of cure for heavy-ion radiotherapy.

    Science.gov (United States)

    Hanin, Leonid; Zaider, Marco

    2014-07-21

    The probability of a cure in radiation therapy (RT), viewed as the probability of eventual extinction of all cancer cells, is unobservable, and the only way to compute it is through modeling the dynamics of the cancer cell population during and post-treatment. The conundrum at the heart of biophysical models aimed at such prospective calculations is the absence of information on the initial size of the subpopulation of clonogenic cancer cells (also called stem-like cancer cells), which largely determines the outcome of RT, in both individual and population settings. Other relevant parameters (e.g. potential doubling time, cell loss factor and survival probability as a function of dose) are, at least in principle, amenable to empirical determination. In this article we demonstrate that, for heavy-ion RT, microdosimetric considerations (justifiably ignored in conventional RT) combined with an expression for the clone extinction probability obtained from a mechanistic model of radiation cell survival lead to useful upper bounds on the size of the pre-treatment population of clonogenic cancer cells, as well as upper and lower bounds on the cure probability. The main practical impact of these limiting values is the ability to make predictions about the probability of a cure for a given population of patients treated with newer, still unexplored treatment modalities, from the empirically determined probability of a cure for the same or a similar population resulting from conventional low linear energy transfer (typically photon/electron) RT. We also propose that the current trend to deliver a lower total dose in a smaller number of fractions with larger-than-conventional doses per fraction has physical limits that must be understood before embarking on a particular treatment schedule.
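
    The extinction view of cure described above leads, under Poisson statistics, to P(cure) = exp(-N * S), which makes the sensitivity to the unknown clonogen number N explicit; a sketch with assumed values:

        import math

        def cure_probability(n_clonogens, survival_fraction):
            # Probability that no clonogenic cell survives the full treatment
            return math.exp(-n_clonogens * survival_fraction)

        # Assumed end-of-treatment survival 1e-7; N is the unknown the paper seeks to bound
        for n in (1e4, 1e6, 1e8):
            print(n, cure_probability(n, survival_fraction=1e-7))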

  4. Experience Matters: Information Acquisition Optimizes Probability Gain

    Science.gov (United States)

    Nelson, Jonathan D.; McKenzie, Craig R.M.; Cottrell, Garrison W.; Sejnowski, Terrence J.

    2010-01-01

    Deciding which piece of information to acquire or attend to is fundamental to perception, categorization, medical diagnosis, and scientific inference. Four statistical theories of the value of information—information gain, Kullback-Leibler distance, probability gain (error minimization), and impact—are equally consistent with extant data on human information acquisition. Three experiments, designed via computer optimization to be maximally informative, tested which of these theories best describes human information search. Experiment 1, which used natural sampling and experience-based learning to convey environmental probabilities, found that probability gain explained subjects’ information search better than the other statistical theories or the probability-of-certainty heuristic. Experiments 1 and 2 found that subjects behaved differently when the standard method of verbally presented summary statistics (rather than experience-based learning) was used to convey environmental probabilities. Experiment 3 found that subjects’ preference for probability gain is robust, suggesting that the other models contribute little to subjects’ search behavior. PMID:20525915

  5. Experience matters: information acquisition optimizes probability gain.

    Science.gov (United States)

    Nelson, Jonathan D; McKenzie, Craig R M; Cottrell, Garrison W; Sejnowski, Terrence J

    2010-07-01

    Deciding which piece of information to acquire or attend to is fundamental to perception, categorization, medical diagnosis, and scientific inference. Four statistical theories of the value of information-information gain, Kullback-Leibler distance, probability gain (error minimization), and impact-are equally consistent with extant data on human information acquisition. Three experiments, designed via computer optimization to be maximally informative, tested which of these theories best describes human information search. Experiment 1, which used natural sampling and experience-based learning to convey environmental probabilities, found that probability gain explained subjects' information search better than the other statistical theories or the probability-of-certainty heuristic. Experiments 1 and 2 found that subjects behaved differently when the standard method of verbally presented summary statistics (rather than experience-based learning) was used to convey environmental probabilities. Experiment 3 found that subjects' preference for probability gain is robust, suggesting that the other models contribute little to subjects' search behavior.

  6. UT Biomedical Informatics Lab (BMIL) Probability Wheel.

    Science.gov (United States)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B; Sun, Clement; Fan, Kaili; Reece, Gregory P; Kim, Min Soon; Markey, Mia K

    2016-01-01

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant," about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.

  7. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections

  8. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, choosing the constraints and making the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.

  9. Fracture probability along a fatigue crack path

    Energy Technology Data Exchange (ETDEWEB)

    Makris, P. [Technical Univ., Athens (Greece)

    1995-03-01

    Long experience has shown that the strength of materials under fatigue load has a stochastic behavior, which can be expressed through the fracture probability. This paper deals with a new analytically derived law for the distribution of the fracture probability along a fatigue crack path. Knowledge of the distribution of the fatigue fracture probability along the crack path helps connect stress conditions with the expected fatigue life of a structure under stochastically varying loads. (orig.)

  10. Probability and statistics: models for research

    National Research Council Canada - National Science Library

    Bailey, Daniel Edgar

    1971-01-01

    This book is an interpretative presentation of the mathematical and logical basis of probability and statistics, indulging in some mathematics, but concentrating on the logical and scientific meaning...

  11. Numerical Computation of Multivariate Normal and Multivariate t Probabilities over Ellipsoidal Regions

    OpenAIRE

    Somerville, Paul N.

    2001-01-01

    An algorithm for the computation of multivariate normal and multivariate t probabilities over general hyperellipsoidal regions is given. A special case is the calculation of probabilities for central and noncentral F and χ2 distributions. A FORTRAN 90 program MVELPS.FOR incorporates the algorithm.

  12. Numerical Computation of Multivariate Normal and Multivariate t Probabilities over Ellipsoidal Regions

    Directory of Open Access Journals (Sweden)

    Paul N. Somerville

    2001-06-01

    Full Text Available An algorithm for the computation of multivariate normal and multivariate t probabilities over general hyperellipsoidal regions is given. A special case is the calculation of probabilities for central and noncentral F and χ2 distributions. A FORTRAN 90 program MVELPS.FOR incorporates the algorithm.
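
    Somerville's algorithm is not reproduced here, but the chi-square special case mentioned in the abstract is easy to check directly: for a central multivariate normal, the squared Mahalanobis distance follows a chi-square distribution, so the probability content of the ellipsoid (x - mu)' Sigma^{-1} (x - mu) <= r^2 is a chi-square CDF value. The sketch below confirms this with a Monte Carlo estimate; all numbers are illustrative.

        # Central case: P[(X-mu)' Sigma^{-1} (X-mu) <= r^2] = chi2.cdf(r^2, k).
        import numpy as np
        from scipy.stats import chi2

        rng = np.random.default_rng(0)
        k, r2 = 3, 4.0
        mu = np.zeros(k)
        Sigma = np.array([[2.0, 0.3, 0.0],
                          [0.3, 1.0, 0.2],
                          [0.0, 0.2, 0.5]])

        exact = chi2.cdf(r2, df=k)  # squared Mahalanobis distance ~ chi2(k)

        x = rng.multivariate_normal(mu, Sigma, size=200_000)
        d2 = np.einsum('ij,jk,ik->i', x - mu, np.linalg.inv(Sigma), x - mu)
        print(exact, (d2 <= r2).mean())  # the two values should agree closely

    Noncentral cases and ellipsoids not aligned with the covariance are where a dedicated algorithm such as the one above becomes necessary.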

  13. Probability theory and statistical applications a profound treatise for self-study

    CERN Document Server

    Zörnig, Peter

    2016-01-01

    This accessible and easy-to-read book provides many examples to illustrate diverse topics in probability and statistics, from initial concepts up to advanced calculations. Special attention is devoted, for example, to the independence of events, inequalities in probability, and functions of random variables. The book is directed to students of mathematics, statistics, engineering, and other quantitative sciences.

  14. Advantages of the probability amplitude over the probability density in quantum mechanics

    OpenAIRE

    Kurihara, Yoshimasa; Quach, Nhi My Uyen

    2013-01-01

    We discuss reasons why a probability amplitude, which becomes a probability density after squaring, is considered as one of the most basic ingredients of quantum mechanics. First, the Heisenberg/Schrodinger equation, an equation of motion in quantum mechanics, describes a time evolution of the probability amplitude rather than of a probability density. There may be reasons why dynamics of a physical system are described by amplitude. In order to investigate one role of the probability amplitu...

  15. Evaluation of the Permanent Deformations and Aging Conditions of Batu Pahat Soft Clay-Modified Asphalt Mixture by Using a Dynamic Creep Test

    Directory of Open Access Journals (Sweden)

    Al Allam A. M.

    2016-01-01

    Full Text Available This study aimed to evaluate the permanent deformation and aging conditions of Batu Pahat soft clay–modified asphalt mixture, also called Batu Pahat soft clay (BPSC) particles; these particles are used in powder form as an additive to hot-mix asphalt mixture. In this experiment, five percentage compositions of BPSC (0%, 2%, 4%, 6%, and 8%) by weight of bitumen were used. A novel design was established to modify the hot-mix asphalt by using the Superpave method for each additive ratio. Several laboratory tests evaluating different properties, such as indirect tensile strength, resilient stiffness modulus, and dynamic creep, were conducted to assess the performance of the samples mixed through the Superpave method. In the resilient modulus test, fatigue and rutting resistance were reduced by the BPSC particles. The added BPSC particles increased the indirect tensile strength. Among the mixtures, 4% BPSC particles yielded the highest performance. In the dynamic creep test, 4% BPSC particles added to the unaged and short-term aged specimens also showed the highest performance. Based on these results, our conclusion is that the BPSC particles can alleviate the permanent deformation (rutting) of roads.

  16. Adhesive quality of self-adhesive and conventional adhesive resin cement to Y-TZP ceramic before and after aging conditions.

    Science.gov (United States)

    Passos, Sheila Pestana; May, Liliana Gressler; Barca, Diana Capelli; Ozcan, Mutlu; Bottino, Marco Antonio; Valandro, Luiz Felipe

    2010-01-01

    This study evaluated the adhesive quality of simplified self-adhesive and conventional resin cements to Y-TZP in dry and aged conditions. Y-TZP ceramic blocks (N = 192) (5 x 5 x 2 mm) were embedded in acrylic resin and randomly divided into two groups, based on surface conditioning: 96% isopropanol or chairside tribochemical silica coating and silanization. Conditioned ceramics were divided into four groups to receive the resin cements (Panavia F 2.0, Variolink II, RelyX U100 and Maxcem). After 24 hours, half of the specimens (n = 12) from each group were submitted to shear bond strength testing (0.5 mm/minute). The remaining specimens were tested after 90 days of water storage at 37 degrees C and thermocycling (12,000x, 5 degrees C-55 degrees C). Failure types were then assessed. The data were analyzed using three-way ANOVA and the Tukey's test (alpha = 0.05). Significant effects of ceramic conditioning, cement type and storage conditions were observed (p < 0.05). Adhesion varied, depending on the adhesive cement type.

  17. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed.The present notes outline a method for evaluation of the probability...

  18. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship...

  19. Selected papers on probability and statistics

    CERN Document Server

    2009-01-01

    This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.

  20. Analytical Study of Thermonuclear Reaction Probability Integrals

    OpenAIRE

    Chaudhry, M.A.; Haubold, H. J.; Mathai, A. M.

    2000-01-01

    An analytic study of the reaction probability integrals corresponding to the various forms of the slowly varying cross-section factor $S(E)$ is attempted. Exact expressions for reaction probability integrals are expressed in terms of the extended gamma functions.
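
    For readers who want to reproduce the flavor of such integrals numerically, the sketch below evaluates a reaction probability integral with a constant cross-section factor S(E) = S0; the exponent combines a Maxwell-Boltzmann factor with a Gamow-type barrier penetration term. The parameter values are arbitrary dimensionless choices, not taken from the paper.

        # Numerical evaluation of a Gamow-peak-type reaction probability integral
        # I = integral_0^inf S0 * exp(-E/kT - b/sqrt(E)) dE (illustrative values).
        import numpy as np
        from scipy.integrate import quad

        S0, kT, b = 1.0, 1.0, 5.0   # assumed constants

        def integrand(E):
            return S0 * np.exp(-E / kT - b / np.sqrt(E)) if E > 0 else 0.0

        I, err = quad(integrand, 0.0, np.inf)
        print(I, err)

    The analytic results of the paper express this same kind of quantity in closed form via extended gamma functions.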

  1. Examples of Neutrosophic Probability in Physics

    Directory of Open Access Journals (Sweden)

    Fu Yuhua

    2015-01-01

    Full Text Available This paper re-discusses the problems of the so-called “law of nonconservation of parity” and “accelerating expansion of the universe”, and presents the examples of determining Neutrosophic Probability of the experiment of Chien-Shiung Wu et al in 1957, and determining Neutrosophic Probability of accelerating expansion of the partial universe.

  2. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out...

  3. Probability numeracy and health insurance purchase

    NARCIS (Netherlands)

    Dillingh, Rik; Kooreman, Peter; Potters, Jan

    2016-01-01

    This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.

  4. Teaching Probability: A Socio-Constructivist Perspective

    Science.gov (United States)

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in a socio-constructivist perspective, for teaching probability.

  5. Probability: A Matter of Life and Death

    Science.gov (United States)

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…
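
    The probability calculations behind a life table are simple enough to sketch directly. The example below uses invented age-specific death probabilities q[x] (the probability of dying between ages x and x+1), not data from the article.

        # A minimal life-table computation with made-up death probabilities.
        q = [0.005, 0.001, 0.001, 0.002, 0.003, 0.005, 0.009, 0.015, 0.030]

        l = [1.0]                      # l[x]: probability of surviving to age x
        for qx in q:
            l.append(l[-1] * (1.0 - qx))

        # Probability a newborn survives to age 5, and conditional survival 5 -> 9:
        print(l[5], l[9] / l[5])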

  6. Stimulus Probability Effects in Absolute Identification

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  7. prep misestimates the probability of replication

    NARCIS (Netherlands)

    Iverson, G.; Lee, M.D.; Wagenmakers, E.-J.

    2009-01-01

    The probability of "replication," prep, has been proposed as a means of identifying replicable and reliable effects in the psychological sciences. We conduct a basic test of prep that reveals that it misestimates the true probability of replication, especially for small effects. We show how these

  8. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  9. Recent Developments in Applied Probability and Statistics

    CERN Document Server

    Devroye, Luc; Kohler, Michael; Korn, Ralf

    2010-01-01

    This book presents surveys on recent developments in applied probability and statistics. The contributions include topics such as nonparametric regression and density estimation, option pricing, probabilistic methods for multivariate interpolation, robust graphical modelling and stochastic differential equations. Due to its broad coverage of different topics the book offers an excellent overview of recent developments in applied probability and statistics.

  10. Critical Assessment of Theoretical Calculations of Atomic Structure and Transition Probabilities: An Experimenter’s View

    Directory of Open Access Journals (Sweden)

    Elmar Träbert

    2014-03-01

    Full Text Available The interpretation of atomic observations by theory and the testing of computational predictions by experiment are interactive processes. It is necessary to gain experience with “the other side” before claims of achievement can be validated and judged. The discussion covers some general problems in the field as well as many specific examples, mostly organized by isoelectronic sequence, of what level of accuracy recently has been reached or which atomic structure or level lifetime problem needs more attention.

  11. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws such as cracks and crack-like flaws are desired to be detected using these NDE methods. A reliably detectable crack size is required for safe life analysis of fracture critical parts. The paper provides discussion on optimizing probability of detection (POD) demonstration experiments using the Point Estimate Method. The POD point estimate method is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for probability density. Normally, a set of 29 flaws of the same size within some tolerance is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false (POF) calls while keeping the flaw sizes in the set as small as possible.
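
    The binomial arithmetic behind the point estimate method is compact enough to sketch. Assuming a demonstration in which all n flaws must be detected (the familiar 29-of-29 setup), the probability of passing the demonstration for a given true POD p is simply p**n.

        # Probability of passing an n-of-n POD demonstration for a given true POD.
        from scipy.stats import binom

        n = 29
        for p in (0.90, 0.95, 0.99):
            ppd = binom.pmf(n, n, p)   # == p**n: all n flaws detected
            print(p, ppd)

    With a true POD of 0.90 the passing probability is about 0.047, below 5%, which is why a successful 29-of-29 demonstration supports a 90% POD claim at 95% confidence.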

  12. Multinomial mixture model with heterogeneous classification probabilities

    Science.gov (United States)

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial and correct classification probability estimates when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.

  13. Probability of large explosive volcanic eruptions in the Cascades

    Science.gov (United States)

    Nathenson, M.; Clynne, M. A.

    2011-12-01

    Estimating the probability of large explosive eruptions in the Cascades is problematic because they occur relatively infrequently. Although some volcanic centers have been more likely to have large eruptions than others, the calculation of the probability of large eruptions for individual volcanic centers is inappropriate. A center that has had a large eruption in the past will not necessarily have a large eruption in the future, and the occurrence for individual volcanic centers is too infrequent to have much confidence in a probability estimate. The sources of some large eruptions are ambiguous (e.g. Shevlin Park Tuff, Oregon) or unknown (Dibekulewe ash), but because the effects of large eruptions are quite widespread, the precise location of the source is less important in terms of hazards. Thus, we focus on the calculation of probability of large eruptions for the Cascade arc as a whole. To estimate the probability, we have chosen a time period for documenting eruptions of 1.15 Ma (the age of the eruption of Kulshan caldera) as a balance between the likelihood of there being good information but with a long enough time period to get a reasonable number of occurrences. We have compiled data from the literature on eruptions larger than 5 km3 in erupted volume to exclude the relatively frequent eruptions ~1-2 km3. The largest eruptions are clearly or likely to have been associated with caldera formation. For erupted volumes greater than 5 km3, 19 events have occurred in the last 1.15 Ma. A plot of event number versus age shows a high rate of occurrence since 13.5 ka and a much lower rate before then. Most of the events since 13.5 ka are 5-10 km3. Events 10 km3 and larger have occurred at a reasonably constant rate since 630 ka. The difference between the two data sets is probably the poor preservation of deposits for events between 5 and 10 km3 that occurred prior to the ending of the glaciation at about 15 ka. Before 630 ka, the only eruption > 10 km3 is Kulshan
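
    Using only the counts quoted above (19 eruptions larger than 5 km3 in 1.15 Ma), a back-of-envelope Poisson calculation is easy to sketch. The abstract itself notes that the rate has not been constant in time, so treat this as an illustration, not the authors' estimate.

        # Poisson sketch: P(at least one large eruption) over various horizons.
        import math

        events, years = 19, 1_150_000
        lam = events / years                 # mean annual rate (assumed constant)
        for horizon in (100, 1_000, 10_000):
            print(horizon, 1.0 - math.exp(-lam * horizon))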

  14. Sticking probability for hydrogen atoms on the surface of liquid ⁴He

    Energy Technology Data Exchange (ETDEWEB)

    Zimmerman, D.S.; Berlinsky, A.J.

    1983-03-01

    A calculation is presented of the sticking probability for hydrogen atoms colliding with a liquid ⁴He surface. The calculation is based on a model potential for the H-liquid ⁴He interaction which is used to derive both bound and free atom wave functions and the linear H atom-ripplon coupling. Results are presented in terms of the energy- and angle-dependent sticking probability s(E,θ) and the thermally averaged probability s(T), and comparison is made to the experimental results s(T) = 0.035 ± 0.00 for 0.18 < T < 0.27 K.

  15. METHOD OF FOREST FIRES PROBABILITY ASSESSMENT WITH POISSON LAW

    Directory of Open Access Journals (Sweden)

    A. S. Plotnikova

    2016-01-01

    Full Text Available The article describes a method for forest fire burn probability estimation on the basis of the Poisson distribution. The λ parameter is assumed to be the mean daily number of fires detected for each Forest Fire Danger Index class within a specific period of time. Thus, λ was calculated for the spring, summer and autumn seasons separately. Multi-annual daily Forest Fire Danger Index values together with an EO-derived hot spot map were input data for the statistical analysis. The major result of the study is the generation of a database on forest fire burn probability. Results were validated against EO daily data on forest fires detected over Irkutsk oblast in 2013. The daily weighted average probability was shown to be linked with the daily number of detected forest fires. Meanwhile, a number of fires were found to have developed when the estimated probability was low. A possible explanation of this phenomenon is provided.
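
    A minimal sketch of the Poisson step described above: given a mean daily number of detected fires λ for each Forest Fire Danger Index class (the values below are invented), the daily probability of at least one fire is 1 - exp(-λ).

        # Daily probability of at least one fire per danger class (toy values).
        import math

        mean_daily_fires = {"I": 0.02, "II": 0.08, "III": 0.25, "IV": 0.7, "V": 1.5}
        for danger_class, lam in mean_daily_fires.items():
            print(danger_class, 1.0 - math.exp(-lam))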

  16. A two-locus forensic match probability for subdivided populations.

    Science.gov (United States)

    Ayres, K L

    2000-01-01

    A two-locus match probability is presented that incorporates the effects of within-subpopulation inbreeding (consanguinity) in addition to population subdivision. The usual practice of calculating multi-locus match probabilities as the product of single-locus probabilities assumes independence between loci. There are a number of population genetics phenomena that can violate this assumption: in addition to consanguinity, which increases homozygosity at all loci simultaneously, gametic disequilibrium will introduce dependence into DNA profiles. However, in forensics the latter problem is usually addressed in part by the careful choice of unlinked loci. Hence, as is conventional, we assume gametic equilibrium here, and focus instead on between-locus dependence due to consanguinity. The resulting match probability formulae are an extension of existing methods in the literature, and are shown to be more conservative than these methods in the case of double homozygote matches. For two-locus profiles involving one or more heterozygous genotypes, results are similar to, or smaller than, the existing approaches.
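
    For orientation, the conventional calculation that this work extends can be sketched as follows: single-locus match probabilities with the subdivision correction theta (NRC II-style formulas), multiplied across loci under the independence assumption the paper questions. Ayres' consanguinity correction itself is not reproduced here, and the allele frequencies are invented.

        # Single-locus match probabilities in a subdivided population (theta
        # correction), combined by the conventional product rule.
        def match_homozygote(p, theta):
            # P(suspect is AA | offender is AA)
            return ((2*theta + (1-theta)*p) * (3*theta + (1-theta)*p)) / \
                   ((1 + theta) * (1 + 2*theta))

        def match_heterozygote(p, q, theta):
            # P(suspect is AB | offender is AB)
            return 2 * (theta + (1-theta)*p) * (theta + (1-theta)*q) / \
                   ((1 + theta) * (1 + 2*theta))

        theta = 0.01
        locus1 = match_homozygote(0.10, theta)          # allele frequency assumed
        locus2 = match_heterozygote(0.05, 0.12, theta)  # allele frequencies assumed
        print(locus1 * locus2)   # product rule: valid only if loci are independent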

  17. Robust Model-Free Multiclass Probability Estimation

    Science.gov (United States)

    Wu, Yichao; Zhang, Hao Helen; Liu, Yufeng

    2010-01-01

    Classical statistical approaches for multiclass probability estimation are typically based on regression techniques such as multiple logistic regression, or density estimation approaches such as linear discriminant analysis (LDA) and quadratic discriminant analysis (QDA). These methods often make certain assumptions on the form of probability functions or on the underlying distributions of subclasses. In this article, we develop a model-free procedure to estimate multiclass probabilities based on large-margin classifiers. In particular, the new estimation scheme is employed by solving a series of weighted large-margin classifiers and then systematically extracting the probability information from these multiple classification rules. A main advantage of the proposed probability estimation technique is that it does not impose any strong parametric assumption on the underlying distribution and can be applied for a wide range of large-margin classification methods. A general computational algorithm is developed for class probability estimation. Furthermore, we establish asymptotic consistency of the probability estimates. Both simulated and real data examples are presented to illustrate competitive performance of the new approach and compare it with several other existing methods. PMID:21113386

  18. Test Your Calculator IQ.

    Science.gov (United States)

    Williams, David E.

    1981-01-01

    This short quiz for teachers is intended to help them to brush up on their calculator operating skills and to prepare for the types of questions their students will ask about calculator idiosyncrasies. (SJL)

  19. A new convolution algorithm for loss probability analysis in multiservice networks

    DEFF Research Database (Denmark)

    Huang, Qian; Ko, King-Tim; Iversen, Villy Bæk

    2011-01-01

    Performance analysis in multiservice loss systems generally focuses on accurate and efficient calculation methods for traffic loss probability. The convolution algorithm is one of the existing efficient numerical methods; exact loss probabilities are obtainable from the convolution algorithm in such systems. We present a new Permutational Convolution Algorithm (PCA) for loss probability approximation in multiservice systems with trunk reservation. This method extends the application of the convolution algorithm and overcomes the problems of approximation accuracy in systems with a large number of traffic flows. It is verified that the loss probabilities obtained by PCA are very close to the exact solutions obtained by Markov chain models, and the accuracy outperforms the ACA approximation.
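
    The PCA itself is not reproduced in this record. As background, the Kaufman-Roberts recursion, a classical exact method for a multiservice link without trunk reservation and a close relative of the convolution algorithm discussed above, is short enough to sketch (traffic values are illustrative).

        # Kaufman-Roberts recursion: exact blocking on a C-channel link shared by
        # classes with offered traffic a[k] erlangs and bandwidth demand b[k].
        def kaufman_roberts(C, a, b):
            q = [0.0] * (C + 1)
            q[0] = 1.0
            for j in range(1, C + 1):
                q[j] = sum(a[k] * b[k] * q[j - b[k]]
                           for k in range(len(a)) if b[k] <= j) / j
            norm = sum(q)
            q = [x / norm for x in q]
            # class-k blocking: probability that fewer than b[k] channels are free
            return [sum(q[C - b[k] + 1:]) for k in range(len(a))]

        print(kaufman_roberts(C=20, a=[5.0, 2.0], b=[1, 4]))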

  20. Uncertainty about probability: a decision analysis perspective

    Energy Technology Data Exchange (ETDEWEB)

    Howard, R.A.

    1988-03-01

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, we find further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to usual thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group.
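
    The coin-tossing claim above can be checked numerically: if the "definitive number" has any non-degenerate distribution, the probability assigned to heads after seeing a head exceeds the prior mean. The Beta(50, 50) prior below is an arbitrary choice standing in for "fairly confident, but not absolutely sure, the coin is fair".

        # Posterior predictive update for a coin with a Beta prior on its bias.
        alpha, beta = 50.0, 50.0
        p_head = alpha / (alpha + beta)                       # prior mean: 0.5
        p_head_after_head = (alpha + 1) / (alpha + beta + 1)  # after one head
        print(p_head, p_head_after_head)  # 0.5 -> ~0.505: a head raises it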

  1. Calculating correct compilers

    OpenAIRE

    Bahr, Patrick; Hutton, Graham

    2015-01-01

    In this article we present a new approach to the problem of calculating compilers. In particular, we develop a simple but general technique that allows us to derive correct compilers from high- level semantics by systematic calculation, with all details of the implementation of the compilers falling naturally out of the calculation process. Our approach is based upon the use of standard equational reasoning techniques, and has been applied to calculate compilers for a wide range of language f...

  2. A Course on Elementary Probability Theory

    OpenAIRE

    Lo, Gane Samb

    2017-01-01

    This book introduces to the theory of probabilities from the beginning. Assuming that the reader possesses the normal mathematical level acquired at the end of the secondary school, we aim to equip him with a solid basis in probability theory. The theory is preceded by a general chapter on counting methods. Then, the theory of probabilities is presented in a discrete framework. Two objectives are sought. The first is to give the reader the ability to solve a large number of problems related t...

  3. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.

  4. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions, and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate multidimensional data analysis (i.e., approximate processing of probabilistic OLAP queries over probability distributions).

  5. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  6. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

    ""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  7. Computation of the Complex Probability Function

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-22

    The complex probability function is important in many areas of physics and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the nth degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements on the Gauss-Hermite quadrature for the complex probability function.
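
    The Gauss-Hermite idea is compact enough to sketch. For Im(z) > 0 the complex probability function can be written w(z) = (i/pi) * integral of exp(-t^2)/(z - t) dt, so Hermite nodes x_k and weights w_k give the approximation below; scipy's wofz serves only as a reference value. Consistent with the shortcomings discussed in the document, the accuracy of this simple form degrades as z approaches the real axis.

        # Gauss-Hermite approximation of the complex probability function w(z).
        import numpy as np
        from numpy.polynomial.hermite import hermgauss
        from scipy.special import wofz   # reference implementation

        z = 1.0 + 0.5j                   # test point with Im(z) > 0 (assumed)
        for n in (8, 16, 32, 64):
            x, w = hermgauss(n)
            approx = (1j / np.pi) * np.sum(w / (z - x))
            print(n, approx, abs(approx - wofz(z)))  # error shrinks as n grows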

  8. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  9. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    2014-01-01

    We evaluate a binary lottery procedure for inducing risk neutral behavior in a subjective belief elicitation task. Prior research has shown this procedure to robustly induce risk neutrality when subjects are given a single risk task defined over objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure also induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation of subjective probabilities in subjects...

  10. Fixation Probabilities of Evolutionary Graphs Based on the Positions of New Appearing Mutants

    Directory of Open Access Journals (Sweden)

    Pei-ai Zhang

    2014-01-01

    Full Text Available Evolutionary graph theory is a nice measure to implement evolutionary dynamics on spatial structures of populations. Calculating the fixation probability is usually treated as a Markov chain process, which is affected by the number of individuals, the fitness of the mutant, the game strategy, and the structure of the population. However, the position of the new mutant is also important to its fixation probability, and that position is the emphasis here. A method is put forward to calculate the fixation probability of an evolutionary graph (EG) of single level. Then, for a class of bilevel EGs, their fixation probabilities are calculated and some propositions are discussed. The conclusion is that the bilevel EG is more stable than the corresponding one-rooted EG.
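
    As a baseline for the position dependence studied above: on an isothermal graph (the well-mixed Moran process included), the starting position does not matter, and a single mutant of relative fitness r among N individuals fixes with the classical probability (1 - 1/r) / (1 - 1/r^N).

        # Fixation probability of a single mutant in a Moran process on an
        # isothermal graph; position-independent, unlike the bilevel EGs above.
        def moran_fixation(r, N):
            if r == 1.0:
                return 1.0 / N        # neutral mutant
            return (1.0 - 1.0 / r) / (1.0 - 1.0 / r**N)

        for r in (0.9, 1.0, 1.1):
            print(r, moran_fixation(r, 100))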

  11. Autistic Savant Calendar Calculators.

    Science.gov (United States)

    Patti, Paul J.

    This study identified 10 savants with developmental disabilities and an exceptional ability to calculate calendar dates. These "calendar calculators" were asked to demonstrate their abilities, and their strategies were analyzed. The study found that the ability to calculate dates into the past or future varied widely among these…

  12. Flexible Mental Calculation.

    Science.gov (United States)

    Threlfall, John

    2002-01-01

    Suggests that strategy choice is a misleading characterization of efficient mental calculation and that teaching mental calculation methods as a whole is not conducive to flexibility. Proposes an alternative in which calculation is thought of as an interaction between noticing and knowledge. Presents an associated teaching approach to promote…

  13. On the joint probability of correlated physical occurrences

    Science.gov (United States)

    Costa de Beauregard, O.

    1996-03-01

    Correlation meaning interaction for physical occurrences, the joint probability formalizes this interaction and conceptualizes a stochastic causality. Bayesian reversibility then expresses action-reaction symmetry for spacelike, and cause-effect symmetry for timelike, separations. Information-negentropy equivalence (that is, reversibility of the twin-faced information concept) extends Mehlberg's “lawlike reversibility” and vindicates Wigner's claim that psychokinesis is reciprocal to gain in knowledge. A covariant axiomatization of probabilities as expressing physical interaction, and displaying the spacetime propagation of information, is proposed. Its correspondence (but essential difference) with the quantum calculation recipe is evidenced. The unfolding paradigm of a twin-faced reality-and-representation universe is stressed, and Pauli's hints in this direction are mentioned.

  14. Computer routines for probability distributions, random numbers, and related functions

    Science.gov (United States)

    Kirby, W.

    1983-01-01

    Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor F. Other mathematical functions include the Bessel function I₀, gamma and log-gamma functions, error functions, and the exponential integral. Auxiliary services include sorting and printer-plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)

  15. Voltage dependency of transmission probability of aperiodic DNA molecule

    Science.gov (United States)

    Wiliyanti, V.; Yudiarsah, E.

    2017-07-01

    Characteristics of electron transport in aperiodic DNA molecules have been studied. A double-stranded DNA model with the sequence of bases GCTAGTACGTGACGTAGCTAGGATATGCCTGA in one chain and its complement on the other chain has been used. A tight-binding Hamiltonian is used to model the DNA molecule. In the model, we consider that the on-site energy of each base depends linearly on the applied electric field. The Slater-Koster scheme is used to model the electron hopping constant between bases. The transmission probability of an electron from one electrode to the other is calculated using a transfer matrix technique and a scattering matrix method simultaneously. The results show that, generally, higher voltage gives a slightly larger value of the transmission probability. The applied voltage seems to shift extended states to lower energy. Meanwhile, the value of the transmission increases as the twisting motion frequency increases.

  16. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Full Text Available Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic as a model within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
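
    The final step described above, turning fitted logistic-regression parameters and building attributes into a fire probability, looks like the sketch below. The coefficients and attributes are invented placeholders, not the study's estimates.

        # Logistic model: log-odds are linear in the attributes; the logistic
        # link maps them to a probability (all numbers hypothetical).
        import math

        beta = {"intercept": -6.0, "floor_area_100m2": 0.08,
                "age_decades": 0.15, "wooden_construction": 0.9}
        building = {"floor_area_100m2": 12.0, "age_decades": 5.0,
                    "wooden_construction": 1.0}

        eta = beta["intercept"] + sum(beta[k] * building[k] for k in building)
        p_fire = 1.0 / (1.0 + math.exp(-eta))
        print(p_fire)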

  17. Probability Analysis of a Quantum Computer

    OpenAIRE

    Einarsson, Göran

    2003-01-01

    The quantum computer algorithm by Peter Shor for factorization of integers is studied. The quantum nature of a QC makes its outcome random. The output probability distribution is investigated and the chances of a successful operation are determined.

  18. Nanoformulations and Clinical Trial Candidates as Probably ...

    African Journals Online (AJOL)

    Nanoformulations and Clinical Trial Candidates as Probably Effective and Safe Therapy for Tuberculosis. Madeeha Laghari, Yusrida Darwis, Abdul Hakeem Memon, Arshad Ali Khan, Ibrahim Mohammed Tayeb Abdulbaqi, Reem Abou Assi ...

  19. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique.The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc, all wrapped up in the scientific method.See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698)* Incorporates more than 1,000 engaging problems with answers* Includes more than 300 solved examples* Uses varied problem solving methods

  20. Sampling, Probability Models and Statistical Reasoning Statistical ...

    Indian Academy of Sciences (India)

    Sampling, Probability Models and Statistical Reasoning: Statistical Inference. Mohan Delampady, V R Padmawar. General Article, Volume 1, Issue 5, May 1996, pp 49-58.

  1. Zika Probably Not Spread Through Saliva: Study

    Science.gov (United States)

    https://medlineplus.gov/news/fullstory_167531.html (HealthDay News) -- Scientists have some interesting news about Zika: You're unlikely to get the virus from ...

  2. Liquefaction Probability Curves for Surficial Geologic Units

    Science.gov (United States)

    Holzer, T. L.; Noce, T. E.; Bennett, M. J.

    2009-12-01

    Liquefaction probability curves that predict the probability of surface manifestations of earthquake-induced liquefaction are developed for 14 different surficial geologic deposits. The geologic units include alluvial fan, beach ridge, river delta, eolian dune, point bar, floodbasin, natural river levee, abandoned river channel, deep-water lake, lagoonal, sandy artificial fill, and valley train deposits. Probability is conditioned on earthquake magnitude and peak ground acceleration. Curves are developed for water table depths of 1.5 and 5.0 m. Probabilities were derived from complementary cumulative frequency distributions of the liquefaction potential index (LPI) that were computed from 935 cone penetration tests. Most of the curves can be fit with a 3-parameter logistic function, which facilitates computations of probability. For natural deposits with a water table at 1.5 m depth and subjected to an M7.5 earthquake with a PGA = 0.25 g, probabilities reach 0.5 for fluvial point bar, barrier island beach ridge, and deltaic deposits. Retrospective predictions of liquefaction during historical earthquakes based on the curves compare favorably to post-earthquake observations. We also have used the curves to assign ranges of liquefaction probabilities to the susceptibility categories proposed by Youd and Perkins (1978) for different geologic deposits. For the earthquake loading and conditions described above, probabilities range from 0-0.08 for low, 0.09-0.30 for moderate, 0.31-0.62 for high, to 0.63-1.00 for very high susceptibility. Liquefaction probability curves have two primary practical applications. First, the curves can be combined with seismic source characterizations to transform surficial geologic maps into probabilistic liquefaction hazard maps. Geographically specific curves are clearly desirable, but in the absence of such information, generic liquefaction probability curves provide a first approximation of liquefaction hazard. Such maps are useful both
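
    A 3-parameter logistic probability curve of the kind mentioned above is easy to write down; the parameters below (upper asymptote, slope, midpoint) are placeholders, not fitted values for any geologic unit.

        # 3-parameter logistic curve: probability of surface manifestations of
        # liquefaction as a function of PGA (parameter values hypothetical).
        import math

        def liquefaction_probability(pga, a=0.6, b=12.0, c=0.2):
            # a: upper asymptote, b: slope, c: PGA (g) at the curve midpoint
            return a / (1.0 + math.exp(-b * (pga - c)))

        for pga in (0.1, 0.25, 0.5):
            print(pga, liquefaction_probability(pga))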

  3. Representing Uncertainty by Probability and Possibility

    DEFF Research Database (Denmark)

    Uncertain parameters in modeling are usually represented by probability distributions reflecting either the objective uncertainty of the parameters or the subjective belief held by the model builder. This approach is particularly suited for representing the statistical nature or variance...

  4. On $\\varphi$-families of probability distributions

    OpenAIRE

    Rui F. Vigelis; Cavalcante, Charles C.

    2011-01-01

    We generalize the exponential family of probability distributions. In our approach, the exponential function is replaced by a $\\varphi$-function, resulting in a $\\varphi$-family of probability distributions. We show how $\\varphi$-families are constructed. In a $\\varphi$-family, the analogue of the cumulant-generating function is a normalizing function. We define the $\\varphi$-divergence as the Bregman divergence associated to the normalizing function, providing a generalization of the Kullbac...

  5. An Illustrative Problem in Computational Probability.

    Science.gov (United States)

    1980-06-01

    easily evaluated. In general, the probabilities φj(t) may be computed by the numerical solution of simple differential equations ... algorithmically tractable solutions to problems in probability adds an interesting new dimension to their analysis. In the construction of efficient ... significance. This serves to illustrate our first point: mathematically equivalent solutions may be vastly different in their suitability for

  6. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  7. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

    Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results". Special emphases on simulation and discrete decision theory. Mathematically-rich, but self-contained text, at a gentle pace. Review of calculus and linear algebra in an appendix. Mathematical interludes (in each chapter) which examine mathematical techniques in the context of probabilistic or statistical importance. Numerous section exercises, summaries, historical notes, and Further Readings for reinforcem

  8. Assessing magnitude probability distribution through physics-based rupture scenarios

    Science.gov (United States)

    Hok, Sébastien; Durand, Virginie; Bernard, Pascal; Scotti, Oona

    2016-04-01

    When faced with a complex network of faults in a seismic hazard assessment study, the first question raised is to what extent the fault network is connected and what is the probability that an earthquake ruptures simultaneously a series of neighboring segments. Physics-based dynamic rupture models can provide useful insight as to which rupture scenario is most probable, provided that an exhaustive exploration of the variability of the input parameters necessary for the dynamic rupture modeling is accounted for. Given the random nature of some parameters (e.g. hypocenter location) and the limitation of our knowledge, we used a logic-tree approach in order to build the different scenarios and to be able to associate them with a probability. The methodology is applied to the three main faults located along the southern coast of the West Corinth rift. Our logic tree takes into account different hypotheses for: fault geometry, location of hypocenter, seismic cycle position, and fracture energy on the fault plane. The variability of these parameters is discussed, and the different values tested are weighted accordingly. 64 scenarios resulting from 64 parameter combinations were included. Sensitivity studies were done to illustrate which parameters control the variability of the results. Given the weights of the input parameters, we evaluated the probability of obtaining a full network break to be 15%, while single-segment ruptures represent 50% of the scenarios. These rupture scenario probability distributions along the three faults of the West Corinth rift fault network can then be used as input to a seismic hazard calculation.
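
    The logic-tree bookkeeping described above amounts to enumerating branch combinations and multiplying branch weights; scenario probabilities then sum over the combinations of interest. The toy tree below, with invented branches, weights, and full-rupture rule, shows the mechanics.

        # Toy logic tree: scenario probability = product of its branch weights.
        from itertools import product

        branches = {
            "geometry":   [("connected", 0.5), ("segmented", 0.5)],
            "hypocenter": [("west", 0.25), ("center", 0.5), ("east", 0.25)],
            "cycle":      [("early", 0.3), ("late", 0.7)],
        }

        p_full_break = 0.0
        for combo in product(*branches.values()):
            weight = 1.0
            for _, w in combo:
                weight *= w
            names = {name for name, _ in combo}
            # assumed rule: only connected geometry late in the cycle breaks fully
            if {"connected", "late"} <= names:
                p_full_break += weight

        print(p_full_break)   # 0.5 * 0.7 = 0.35 in this toy tree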

  9. Modeling highway travel time distribution with conditional probability models

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira Neto, Francisco Moraes [ORNL; Chin, Shih-Miao [ORNL; Hwang, Ho-Ling [ORNL; Han, Lee [University of Tennessee, Knoxville (UTK)

    2014-01-01

    Under the sponsorship of the Federal Highway Administration's Office of Freight Management and Operations, the American Transportation Research Institute (ATRI) has developed performance measures through the Freight Performance Measures (FPM) initiative. Under this program, travel speed information is derived from data collected using wireless-based global positioning systems. These telemetric data systems are subscribed to and used by the trucking industry as an operations management tool. More than one telemetric operator submits their data dumps to ATRI on a regular basis. Each data transmission contains the truck location, its travel time, and a clock time/date stamp. Data from the FPM program provide a unique opportunity for studying the upstream-downstream speed distributions at different locations, as well as different times of the day and days of the week. This research is focused on the stochastic nature of successive link travel speed data on the continental United States Interstate network. Specifically, a method to estimate route probability distributions of travel time is proposed. This method uses the concepts of convolution of probability distributions and bivariate, link-to-link, conditional probability to estimate the expected distributions for the route travel time. A major contribution of this study is the consideration of speed correlation between upstream and downstream contiguous Interstate segments through conditional probability. The established conditional probability distributions, between successive segments, can be used to provide travel time reliability measures. This study also suggests an adaptive method for calculating and updating the route travel time distribution as new data or information are added. This methodology can be useful to estimate performance measures as required by the recent Moving Ahead for Progress in the 21st Century Act (MAP-21).
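
    The core idea, convolving link travel-time distributions while conditioning each link on its upstream neighbor, can be sketched with a two-link toy example (all probabilities invented).

        # Conditional convolution of two link travel-time distributions.
        from collections import defaultdict

        p_t1 = {10: 0.7, 20: 0.3}               # P(T1 = t1), minutes
        p_t2_given_t1 = {                       # P(T2 = t2 | T1 = t1)
            10: {8: 0.8, 15: 0.2},              # free flow tends to persist
            20: {8: 0.3, 15: 0.7},              # so does congestion
        }

        route = defaultdict(float)
        for t1, p1 in p_t1.items():
            for t2, p2 in p_t2_given_t1[t1].items():
                route[t1 + t2] += p1 * p2
        print(dict(route))   # e.g. P(route time = 18 min) = 0.7 * 0.8 = 0.56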

  10. Estimation of first excursion probability for mechanical appendage system subjected to nonstationary earthquake excitation

    Energy Technology Data Exchange (ETDEWEB)

    Aoki, Shigeru; Suzuki, Kohei (Tokyo Metropolitan Univ. (Japan))

    1984-06-01

    An estimation technique whereby the first excursion probability of a mechanical appendage system subjected to nonstationary seismic excitation can be conveniently calculated is proposed. The first excursion probability of the appendage system is estimated by using this method and the following results are obtained. (1) The probability from this technique is more conservative than that from a simulation technique taking artificial time histories compatible with the design spectrum as input excitation. (2) The first excursion probability is practically independent of the natural period of the appendage system when the tolerable barrier level is normalized by the response amplification factor given by the design spectrum. (3) The first excursion probability decreases as the damping ratio of the appendage system increases. It also decreases as the mass ratio of the appendage system to the supporting system increases. (4) For the inelastic appendage system, the first excursion probability is reduced if an appropriate elongation is permitted.

  11. Estimating probable flaw distributions in PWR steam generator tubes

    Energy Technology Data Exchange (ETDEWEB)

    Gorman, J.A.; Turner, A.P.L. [Dominion Engineering, Inc., McLean, VA (United States)

    1997-02-01

    This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary to secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the numbers of tubes with detectable flaws of various types as a function of time, and (2) the distributions in size of these flaws. Results are provided for hypothetical severely affected, moderately affected and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses.

  12. Core calculations of JMTR

    Energy Technology Data Exchange (ETDEWEB)

    Nagao, Yoshiharu [Japan Atomic Energy Research Inst., Oarai, Ibaraki (Japan). Oarai Research Establishment

    1998-03-01

    In material testing reactors like the JMTR (Japan Material Testing Reactor), a 50 MW reactor at the Japan Atomic Energy Research Institute, the neutron flux and neutron energy spectra of irradiated samples show complex distributions. It is necessary to assess the neutron flux and neutron energy spectra of an irradiation field by carrying out a nuclear calculation of the core for every operation cycle. In order to advance core calculation in the JMTR, the application of MCNP to the assessment of core reactivity, neutron flux and spectra has been investigated. In this study, in order to reduce the computation time and variance, a comparison of the results of calculations using the K code and a fixed source, and the use of the Weight Window technique, was investigated. As to the calculation method, the modeling of the total JMTR core, the conditions for calculation and the adopted variance reduction technique are explained, and the results of the calculation are shown. No significant difference was observed in the results of the neutron flux calculations arising from the difference in the modeling of the fuel region between the K-code and fixed-source calculations. The method of assessing the results of the neutron flux calculation is also described. (K.I.)

  13. Estimation of post-test probabilities by residents: Bayesian reasoning versus heuristics?

    Science.gov (United States)

    Hall, Stacey; Phang, Sen Han; Schaefer, Jeffrey P; Ghali, William; Wright, Bruce; McLaughlin, Kevin

    2014-08-01

    Although the process of diagnosing invariably begins with a heuristic, we encourage our learners to support their diagnoses by analytical cognitive processes, such as Bayesian reasoning, in an attempt to mitigate the effects of heuristics on diagnosing. There are, however, limited data on the use and impact of Bayesian reasoning on the accuracy of disease probability estimates. In this study our objective was to explore whether Internal Medicine residents use a Bayesian process to estimate disease probabilities by comparing their disease probability estimates to literature-derived Bayesian post-test probabilities. We gave 35 Internal Medicine residents four clinical vignettes in the form of a referral letter and asked them to estimate the post-test probability of the target condition in each case. We then compared these to literature-derived probabilities. For each vignette the estimated probability was significantly different from the literature-derived probability. For the two cases with low literature-derived probability our participants significantly overestimated the probability of these target conditions being the correct diagnosis, whereas for the two cases with high literature-derived probability the estimated probability was significantly lower than the calculated value. Our results suggest that residents generate inaccurate post-test probability estimates. Possible explanations for this include ineffective application of Bayesian reasoning, attribute substitution whereby a complex cognitive task is replaced by an easier one (e.g., a heuristic), or systematic rater bias, such as central tendency bias. Further studies are needed to identify the reasons for inaccuracy of disease probability estimates and to explore ways of improving accuracy.
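
    The Bayesian calculation that the residents' estimates were compared against is conveniently expressed in odds form: post-test odds equal pre-test odds times the likelihood ratio. A sketch with illustrative numbers:

        # Pre-test probability -> post-test probability via the likelihood ratio.
        def post_test_probability(pre_test_p, likelihood_ratio):
            pre_odds = pre_test_p / (1.0 - pre_test_p)
            post_odds = pre_odds * likelihood_ratio
            return post_odds / (1.0 + post_odds)

        print(post_test_probability(0.10, 8.0))   # positive test, LR+ = 8 -> ~0.47
        print(post_test_probability(0.10, 0.2))   # negative test, LR- = 0.2 -> ~0.02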

  14. Stark shifts and transition probabilities within the Kr I spectrum

    Science.gov (United States)

    Milosavljević, V.; Simić, Z.; Daniels, S.; Dimitrijević, M. S.

    2012-05-01

    On the basis of 28 experimentally determined prominent neutral krypton (Kr I) line shapes (in the 5s-5p and 5s-6p transitions), we have obtained the electron (de) and ion (di) contributions to the total Stark shifts (dt). Stark shifts are also calculated using the semiclassical perturbation formalism (SCPF) for electrons, protons and helium ions as perturbers, for electron temperatures up to 50 000 K. Transition probabilities of spontaneous emission (Einstein's Ak,i values) have been obtained using the relative line-intensity ratio method. The separate electron (de) and ion (di) contributions to the total Stark shifts are presented, as well as the ion-broadening parameters, which describe the influence of the ion-dynamical effect on the shift of the line shape, for neutral krypton spectral lines. We compare our measured and calculated de data, and compare both of these with other available experimental and theoretical de values.

  15. Multiphase flow calculation software

    Science.gov (United States)

    Fincke, James R.

    2003-04-15

    Multiphase flow calculation software and computer-readable media carrying computer executable instructions for calculating liquid and gas phase mass flow rates of high void fraction multiphase flows. The multiphase flow calculation software employs various given, or experimentally determined, parameters in conjunction with a plurality of pressure differentials of a multiphase flow, preferably supplied by a differential pressure flowmeter or the like, to determine liquid and gas phase mass flow rates of the high void fraction multiphase flows. Embodiments of the multiphase flow calculation software are suitable for use in a variety of applications, including real-time management and control of an object system.

  16. Radar Signature Calculation Facility

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: The calculation, analysis, and visualization of the spatially extended radar signatures of complex objects such as ships in a sea multipath environment and...

  17. Waste Package Lifting Calculation

    Energy Technology Data Exchange (ETDEWEB)

    H. Marr

    2000-05-11

    The objective of this calculation is to evaluate the structural response of the waste package during the horizontal and vertical lifting operations in order to support the waste package lifting feature design. The scope of this calculation includes the evaluation of the 21 PWR UCF (pressurized water reactor uncanistered fuel) waste package, naval waste package, 5 DHLW/DOE SNF (defense high-level waste/Department of Energy spent nuclear fuel)--short waste package, and 44 BWR (boiling water reactor) UCF waste package. Procedure AP-3.12Q, Revision 0, ICN 0, Calculations, is used to develop and document this calculation.

  18. Electrical installation calculations advanced

    CERN Document Server

    Kitcher, Christopher

    2013-01-01

    All the essential calculations required for advanced electrical installation work. The Electrical Installation Calculations series has proved an invaluable reference for over forty years, for both apprentices and professional electrical installation engineers alike. The book provides a step-by-step guide to the successful application of electrical installation calculations required in day-to-day electrical engineering practice. A step-by-step guide to everyday calculations used on the job. An essential aid to the City & Guilds certificates at Levels 2 and 3. For apprentices and electrical installatio...

  19. Evapotranspiration Calculator Desktop Tool

    Science.gov (United States)

    The Evapotranspiration Calculator estimates evapotranspiration time series data for hydrological and water quality models for the Hydrologic Simulation Program - Fortran (HSPF) and the Stormwater Management Model (SWMM).

  20. Electronics Environmental Benefits Calculator

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Electronics Environmental Benefits Calculator (EEBC) was developed to assist organizations in estimating the environmental benefits of greening their purchase,...

  1. Electrical installation calculations basic

    CERN Document Server

    Kitcher, Christopher

    2013-01-01

    All the essential calculations required for basic electrical installation work. The Electrical Installation Calculations series has proved an invaluable reference for over forty years, for both apprentices and professional electrical installation engineers alike. The book provides a step-by-step guide to the successful application of electrical installation calculations required in day-to-day electrical engineering practice. A step-by-step guide to everyday calculations used on the job. An essential aid to the City & Guilds certificates at Levels 2 and 3. Fo...

  2. Probability of pregnancy in beef heifers

    Directory of Open Access Journals (Sweden)

    D.P. Faria

    2014-12-01

    This study aimed to evaluate the influence of initial weight, initial age, average daily gain in initial weight, average daily gain in total weight, and genetic group on the probability of pregnancy in primiparous females of the Nellore, 1/2 Simmental + 1/2 Nellore, and 3/4 Nellore + 1/4 Simmental genetic groups. Data were collected from the livestock file of the Farpal Farm, located in the municipality of Jaíba, Minas Gerais State, Brazil. The pregnancy diagnosis results (success = 1 and failure = 0) were used to determine the probability of pregnancy, which was modeled using logistic regression with the Proc Logistic procedure available in SAS (Statistical..., 2004) software, from the regressor variables initial weight, average daily gain in initial weight, average daily gain in total weight, and genetic group. Initial weight (IW) was the most important variable in the probability of pregnancy in heifers, and 1-kg increments in IW allowed for increases of 5.8, 9.8 and 3.4% in the probability of pregnancy in Nellore, 1/2 Simmental + 1/2 Nellore, and 3/4 Nellore + 1/4 Simmental heifers, respectively. Initial age influenced the probability of pregnancy in Nellore heifers. From the estimates of the effects of each variable it was possible to determine the minimum initial weights for each genetic group. This information can be used to monitor the development of heifers until the breeding season and increase the pregnancy rate.
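
    The modeling step described above can be sketched briefly; the following Python example uses scikit-learn in place of SAS Proc Logistic, with synthetic data and a single regressor standing in for the study's variables (all numbers are illustrative assumptions):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 500
        initial_weight = rng.normal(290.0, 25.0, n)   # kg; synthetic heifer weights

        # Synthetic "truth": heavier heifers are more likely to conceive.
        logit = 0.05 * (initial_weight - 290.0)
        pregnant = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

        model = LogisticRegression().fit(initial_weight.reshape(-1, 1), pregnant)

        # Approximate change in pregnancy probability per 1-kg increase in initial
        # weight, evaluated near the mean weight (analogous to the per-kg increases
        # reported in the study):
        b0, b1 = model.intercept_[0], model.coef_[0, 0]
        p = lambda w: 1.0 / (1.0 + np.exp(-(b0 + b1 * w)))
        print(p(291.0) - p(290.0))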

  3. Laboratory-tutorial activities for teaching probability

    Directory of Open Access Journals (Sweden)

    Michael C. Wittmann

    2006-08-01

    We report on the development of students’ ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain “touchstone” examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler’s fallacy, and use expectations about ensemble results rather than expectation values to predict future events. We map the development of their thinking to provide examples of problems rather than evidence of a curriculum’s success.
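
    The first target concept mentioned above, probability as the ratio of time spent in a region to total time, can be made concrete with a short simulation (a sketch of my own, not course material; the oscillator and region widths are arbitrary choices):

        import numpy as np

        # A classical particle oscillating in a well: x(t) = A*sin(w*t). The
        # probability of finding it in a region is (time in region)/(total time).
        A, w = 1.0, 2.0 * np.pi
        t = np.linspace(0.0, 1.0, 100_000, endpoint=False)   # one full period
        x = A * np.sin(w * t)

        # Fraction of time spent in a central region vs. near a turning point;
        # the particle moves fastest at the center, so the probability density
        # is lowest there and highest at the turning points.
        print(np.mean(np.abs(x) < 0.1 * A))   # ~0.06 (region of width 0.2*A)
        print(np.mean(x > 0.9 * A))           # ~0.14, despite a narrower region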

  4. Failure probability of regional flood defences

    Directory of Open Access Journals (Sweden)

    Lendering Kasper

    2016-01-01

    Polders in the Netherlands are protected from flooding by primary and regional flood defence systems. During the last decade, scientific research in flood risk focused on the development of a probabilistic approach to quantify the probability of flooding of the primary flood defence system. This paper proposes a methodology to quantify the probability of flooding of regional flood defence systems, which requires several additions to the methodology used for the primary flood defence system. These additions focus on a method to account for regulation of regional water levels, the possibility of reduced intrusion resistance due to maintenance dredging in regional waters, the probability of traffic loads, and the influence of dependence between regional water levels and the phreatic surface of a regional flood defence. In addition, reliability updating is used to demonstrate the potential for updating the probability of failure of regional flood defences with performance observations. The results demonstrate that the proposed methodology can be used to determine the probability of flooding of a regional flood defence system. In doing so, the methodology contributes to improving flood risk management in these systems.

  5. The role of probabilities in physics.

    Science.gov (United States)

    Le Bellac, Michel

    2012-09-01

    Although modern physics was born in the XVIIth century as a fully deterministic theory in the form of Newtonian mechanics, the use of probabilistic arguments turned out later on to be unavoidable. Three main situations can be distinguished. (1) When the number of degrees of freedom is very large, on the order of Avogadro's number, a detailed dynamical description is not possible, and in fact not useful: we do not care about the velocity of a particular molecule in a gas, all we need is the probability distribution of the velocities. This statistical description introduced by Maxwell and Boltzmann allows us to recover equilibrium thermodynamics, gives a microscopic interpretation of entropy and underlies our understanding of irreversibility. (2) Even when the number of degrees of freedom is small (but larger than three) sensitivity to initial conditions of chaotic dynamics makes determinism irrelevant in practice, because we cannot control the initial conditions with infinite accuracy. Although die tossing is in principle predictable, the approach to chaotic dynamics in some limit implies that our ignorance of initial conditions is translated into a probabilistic description: each face comes up with probability 1/6. (3) As is well-known, quantum mechanics is incompatible with determinism. However, quantum probabilities differ in an essential way from the probabilities introduced previously: it has been shown from the work of John Bell that quantum probabilities are intrinsic and cannot be given an ignorance interpretation based on a hypothetical deeper level of description.

  6. Methodology for embedded transport core calculation

    Science.gov (United States)

    Ivanov, Boyan D.

    The progress in the nuclear engineering field leads to developing new generations of Nuclear Power Plants (NPP) with complex reactor core designs, such as cores loaded partially with mixed-oxide (MOX) fuel, high burn-up loadings, and cores with advanced designs of fuel assemblies and control rods. Such heterogeneous cores introduce challenges for the diffusion theory that has been used for several decades for calculations of current Pressurized Water Reactor (PWR) cores. To address the difficulties the diffusion approximation encounters, new core calculation methodologies need to be developed that improve accuracy while preserving the efficiency of current reactor core calculations. In this thesis, an advanced core calculation methodology is introduced, based on embedded transport calculations. Two different approaches are investigated. The first approach is based on an embedded finite element method (FEM), simplified P3 approximation (SP3), fuel assembly (FA) homogenization calculation within the framework of the diffusion core calculation with the NEM code (Nodal Expansion Method). The second approach involves an embedded FA lattice physics eigenvalue calculation based on the collision probability method (CPM), again within the framework of the NEM diffusion core calculation. The second approach is superior to the first because most of the uncertainties introduced by the off-line cross-section generation are eliminated.

  7. Chemical calculations and chemicals that might calculate

    Science.gov (United States)

    Barnett, Michael P.

    I summarize some applications of symbolic calculation to the evaluation of molecular integrals over Slater orbitals, and discuss some spin-offs of this work that have wider potential. These include the exploration of the mechanized use of analogy. I explain the methods that I use to do this, in relation to mathematical proofs and to modeling step by step processes such as organic syntheses and NMR pulse sequences. Another spin-off relates to biological information processing. Some challenges and opportunities in the information infrastructure of interdisciplinary research are discussed.

  8. Comparing coefficients of nested nonlinear probability models

    DEFF Research Database (Denmark)

    Kohler, Ulrich; Karlson, Kristian Bernt; Holm, Anders

    2011-01-01

    In a series of recent articles, Karlson, Holm and Breen have developed a method for comparing the estimated coefficients of two nested nonlinear probability models. This article describes this method and the user-written program khb that implements the method. The KHB-method is a general decomposition method that is unaffected by the rescaling or attenuation bias that arise in cross-model comparisons in nonlinear models. It recovers the degree to which a control variable, Z, mediates or explains the relationship between X and a latent outcome variable, Y*, underlying the nonlinear probability model.

  9. Classicality versus quantumness in Born's probability

    Science.gov (United States)

    Luo, Shunlong

    2017-11-01

    Born's rule, which postulates the probability of a measurement outcome in a quantum state, is pivotal to interpretations and applications of quantum mechanics. By exploiting the departure of the product of two Hermitian operators in Born's rule from Hermiticity, we prescribe an intrinsic and natural scheme to decompose Born's probability into a classical part and a quantum part, which have significant implications in quantum information theory. The classical part constitutes the information compatible with the associated measurement operator, while the quantum part represents the quantum coherence of the state with respect to the measurement operator. Fundamental properties of the decomposition are revealed. As applications, we establish several trade-off relations for the classicality and quantumness in Born's probability, which may be interpreted as alternative realizations of Heisenberg's uncertainty principle. The results shed physical light on related issues concerning quantification of complementarity, coherence, and uncertainty, as well as the classical-quantum interplay.

  10. Uncertainty: the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...

  11. 7th High Dimensional Probability Meeting

    CERN Document Server

    Mason, David; Reynaud-Bouret, Patricia; Rosinski, Jan

    2016-01-01

    This volume collects selected papers from the 7th High Dimensional Probability meeting held at the Institut d'Études Scientifiques de Cargèse (IESC) in Corsica, France. High Dimensional Probability (HDP) is an area of mathematics that includes the study of probability distributions and limit theorems in infinite-dimensional spaces such as Hilbert spaces and Banach spaces. The most remarkable feature of this area is that it has resulted in the creation of powerful new tools and perspectives, whose range of application has led to interactions with other subfields of mathematics, statistics, and computer science. These include random matrices, nonparametric statistics, empirical processes, statistical learning theory, concentration of measure phenomena, strong and weak approximations, functional estimation, combinatorial optimization, and random graphs. The contributions in this volume show that HDP theory continues to thrive and develop new tools, methods, techniques and perspectives to analyze random phenome...

  12. Pointwise probability reinforcements for robust statistical inference.

    Science.gov (United States)

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include, e.g., outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR, and a regularisation allows controlling the amount of reinforcement, which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually, since an abnormality degree is obtained for each observation.
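
    A toy rendering of the reinforcement idea for a Gaussian location estimate (my own sketch of the scheme as described above, not the authors' code; the penalty weight lam and the data are illustrative assumptions):

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        # 50 inliers plus two gross outliers (abnormally frequent data in miniature).
        x = np.concatenate([np.random.default_rng(1).normal(0.0, 1.0, 50), [8.0, 9.0]])
        lam = 10.0   # regularisation: larger values allow less reinforcement

        def objective(params):
            mu, r = params[0], params[1:]
            # Each observation's likelihood is "reinforced" by r_i >= 0; the
            # penalty term controls how much data the model may explain away.
            return -np.sum(np.log(norm.pdf(x, mu, 1.0) + r)) + lam * np.sum(r)

        bounds = [(None, None)] + [(0.0, None)] * len(x)
        x0 = np.concatenate([[np.mean(x)], np.zeros(len(x))])
        res = minimize(objective, x0=x0, bounds=bounds, method="L-BFGS-B")

        print(res.x[0])        # reinforced estimate, close to 0
        print(np.mean(x))      # plain mean, dragged towards the outliers
        print(res.x[1:][-2:])  # reinforcements act as abnormality degrees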

  13. A basic course in probability theory

    CERN Document Server

    Bhattacharya, Rabi

    2016-01-01

    This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added and basic theory has been expanded. General Markov dependent sequences and their convergence to equilibrium is the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and/or concentration inequalities ranging from those of Chebyshev, Cramer–Chernoff, Bahadur–Rao, to Hoeffding have been added, with illustrative comparisons of thei...

  14. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  15. Geometric modeling in probability and statistics

    CERN Document Server

    Calin, Ovidiu

    2014-01-01

    This book covers topics of Informational Geometry, a field which deals with the differential-geometric study of manifolds of probability density functions. This is a field that is increasingly attracting the interest of researchers from many different areas of science, including mathematics, statistics, geometry, computer science, signal processing, physics and neuroscience. It is the authors’ hope that the present book will be a valuable reference for researchers and graduate students in one of the aforementioned fields. This textbook is a unified presentation of differential geometry and probability theory, and constitutes a text for a course directed at graduate or advanced undergraduate students interested in applications of differential geometry in probability and statistics. The book contains over 100 proposed exercises meant to help students deepen their understanding, and it is accompanied by software that is able to provide numerical computations of several information geometric objects. The reader...

  16. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.

  17. Python for probability, statistics, and machine learning

    CERN Document Server

    Unpingco, José

    2016-01-01

    This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowl...

  18. Explosion probability of unexploded ordnance: expert beliefs.

    Science.gov (United States)

    MacDonald, Jacqueline Anne; Small, Mitchell J; Morgan, M G

    2008-08-01

    This article reports on a study to quantify expert beliefs about the explosion probability of unexploded ordnance (UXO). Some 1,976 sites at closed military bases in the United States are contaminated with UXO and are slated for cleanup, at an estimated cost of $15-140 billion. Because no available technology can guarantee 100% removal of UXO, information about explosion probability is needed to assess the residual risks of civilian reuse of closed military bases and to make decisions about how much to invest in cleanup. This study elicited probability distributions for the chance of UXO explosion from 25 experts in explosive ordnance disposal, all of whom have had field experience in UXO identification and deactivation. The study considered six different scenarios: three different types of UXO handled in two different ways (one involving children and the other involving construction workers). We also asked the experts to rank, by sensitivity to explosion, 20 different kinds of UXO found at a case study site at Fort Ord, California. We found that the experts do not agree about the probability of UXO explosion, with significant differences among experts in their mean estimates of explosion probabilities and in the amount of uncertainty that they express in their estimates. In three of the six scenarios, the divergence was so great that the average of all the expert probability distributions was statistically indistinguishable from a uniform (0, 1) distribution, suggesting that the sum of expert opinion provides no information at all about the explosion risk. The experts' opinions on the relative sensitivity to explosion of the 20 UXO items also diverged. The average correlation between rankings of any pair of experts was 0.41, which, statistically, is barely significant (p = 0.049) at the 95% confidence level. Thus, one expert's rankings provide little predictive information about another's rankings. The lack of consensus among experts suggests that empirical studies...

  19. Domestic wells have high probability of pumping septic tank leachate

    Directory of Open Access Journals (Sweden)

    J. E. Bremer

    2012-08-01

    Onsite wastewater treatment systems are common in rural and semi-rural areas around the world; in the US, about 25–30% of households are served by a septic (onsite wastewater treatment) system, and many property owners also operate their own domestic well nearby. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. In areas with small lots (and thus high spatial septic system densities), shallow domestic wells are prone to contamination by septic system leachate. Mass balance approaches have been used to determine a maximum septic system density that would prevent contamination of groundwater resources. In this study, a source area model based on detailed groundwater flow and transport modeling is applied for a stochastic analysis of domestic well contamination by septic leachate. Specifically, we determine the probability that a source area overlaps with a septic system drainfield as a function of aquifer properties, septic system density and drainfield size. We show that a high spatial septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We find that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and for those that are harmful even at low concentrations (e.g., pathogens).

  20. Landslide hazard mapping considering rainfall probability in Inje, Korea

    Directory of Open Access Journals (Sweden)

    M.J. Lee

    2016-01-01

    This study evaluated the landslide hazard at Inje, Korea, using a geographic information system (GIS) and rainfall probability data. The locations of landslides were identified in the study area by aerial photograph interpretation and field surveys. Data about rainfall probability, topography, and geology were collected, processed, and compiled in a spatial database using GIS. Then, the probability of landslides in the study area in future recurrence-interval years was calculated, assuming that landslides are triggered by a daily rainfall of 202 mm or a three-day cumulative rainfall of 449 mm. Twelve factors that influence landslide occurrence were chosen from a database of topography, soil, and forest cover. Landslide susceptibilities were analysed and mapped according to these landslide-occurrence factors, employing the frequency ratio method. Of the total landslide locations, 50% were used for hazard analysis and the remaining 50% were used for model validation. Validation results for the daily rainfall of 202 mm and the three-day cumulative rainfall of 449 mm for the recurrence-interval years ranged from 89.22% to 91.80% and from 89.38% to 93.80%, respectively. This analysis of landslide hazards took rainfall probability into account. Rainfall, including heavy rainfall, is expected to increase in the future.
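
    The frequency ratio method employed above is simple enough to sketch directly. In the following Python example the slope classes and cell counts are made-up numbers, not values from the Inje database:

        # Frequency ratio for one factor (e.g. a slope class):
        #   FR(class) = (landslide cells in class / all landslide cells)
        #             / (cells in class / all cells)
        # FR > 1 means landslides are over-represented in that class.

        slope_classes = ["0-10", "10-20", "20-30", ">30"]   # degrees, illustrative
        cells = [50_000, 30_000, 15_000, 5_000]             # map cells per class
        landslide_cells = [20, 60, 90, 30]                  # observed landslide cells

        total_cells = sum(cells)
        total_ls = sum(landslide_cells)
        for cls, n, ls in zip(slope_classes, cells, landslide_cells):
            fr = (ls / total_ls) / (n / total_cells)
            print(f"slope {cls:>5}: FR = {fr:.2f}")

        # A susceptibility index for each map cell is then the sum of the FR
        # values of the classes that cell falls into, over all twelve factors.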

  1. The probability of forming hypervelocity stars in the Galaxy

    Science.gov (United States)

    Dremova, G. N.; Dremov, V. V.; Orlov, V. V.; Tutukov, A. V.; Shirokova, K. S.

    2015-11-01

    The probability of forming a Galactic hypervelocity star is estimated for the scenario of Hills, which describes the dynamical capture of one component of a binary star by the gravitational field of the supermassive black hole in the Galactic center, leading to the ejection of the other component. Ten thousand initial orientations of the binary orbits were considered, and the semi-major axes of the binary orbits were varied in a wide range from 11.3 R⊙ to 425 R⊙. Two series of computations were carried out, in which the mass of the supermassive black hole was taken to be 10⁶ M⊙ and 3.4 × 10⁶ M⊙. Numerical simulations of encounters of the binary and black hole in the framework of the three-body and N-body problems are used to localize regions favorable for the formation of hypervelocity stars. The motion of the ejected star in the regular field of the Galaxy is calculated, and the conditions under which the star escapes the Galaxy are defined. The probability of escaping the Galaxy is calculated as a function of various parameters: the initial separation of the binary components and the distance of the binary from the black hole. On average, the probability of forming a hypervelocity star is higher for closer encounters and more tightly bound binary pairs.

  2. Emission probability and photon statistics of a coherently driven mazer

    Energy Technology Data Exchange (ETDEWEB)

    Xiong Jin [Department of Physics, Shanghai Jiao Tong University, Shanghai (China)]. E-mail: xiongjint@hotmail.com; Zhang Zhiming [Department of Physics, Shanghai Jiao Tong University, Shanghai (China)

    2002-05-14

    The idea of a mazer is put forward with particular reference to the question of the driving-induced atomic coherence, which is established by a coherent driving field. The interaction of a quantum cavity field with an ultracold V-type three-level atom, in which two levels are coupled by a coherent driving field, is derived. Its general quantum theory is established and the atomic emission probability and photon statistics are calculated and analysed. It is found that the mazer based on this driving-induced atomic coherence shows new features. There is a non-vanishing probability for the atom emitting a photon in the cavity even when the resonance condition is not fulfilled (here the resonance condition means that the cavity length is an integer multiple of half the atomic de Broglie wavelength). Under the resonance condition, the atomic emission probability has two sets of resonance peaks. For a very strong coherent driving field, the emission of the atom can be forbidden. As to the photon statistics, when the driving field is not very strong, the driving-induced atomic coherence reduces the photon number fluctuations of the cavity field. The photon statistics exhibits strong sub-Poissonian behaviour. In the region considered here, it can even be sub-Poissonian for any cavity length. However, when the driving field is too strong, the sub-Poissonian property may disappear. (author)

  3. [Understanding dosage calculations].

    Science.gov (United States)

    Benlahouès, Daniel

    2016-01-01

    The calculation of dosages in paediatrics is the concern of the whole medical and paramedical team. This activity must generate a minimum of risks in order to prevent care-related adverse events. In this context, the calculation of dosages is a practice which must be understood by everyone.

  4. Probabilities for separating sets of order statistics.

    Science.gov (United States)

    Glueck, D H; Karimpour-Fard, A; Mandel, J; Muller, K E

    2010-04-01

    Consider a set of order statistics that arise from sorting samples from two different populations, each with its own, possibly different distribution function. The probability that these order statistics fall in disjoint, ordered intervals, and that, of the smallest statistics, a certain number come from the first population, is given in terms of the two distribution functions. The result is applied to computing the joint probability of the number of rejections and the number of false rejections for the Benjamini-Hochberg false discovery rate procedure.

  5. Harmonic analysis and the theory of probability

    CERN Document Server

    Bochner, Salomon

    2005-01-01

    Nineteenth-century studies of harmonic analysis were closely linked with the work of Joseph Fourier on the theory of heat and with that of P. S. Laplace on probability. During the 1920s, the Fourier transform developed into one of the most effective tools of modern probabilistic research; conversely, the demands of the probability theory stimulated further research into harmonic analysis.Mathematician Salomon Bochner wrote a pair of landmark books on the subject in the 1930s and 40s. In this volume, originally published in 1955, he adopts a more probabilistic view and emphasizes stochastic pro

  6. Proposal for Modified Damage Probability Distribution Functions

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup; Hansen, Peter Friis

    1996-01-01

    Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project, the present proposal for modified damage stability probability distribution functions has been developed and submitted to the "Sub-committee on st...

  7. Probability densities and Lévy densities

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler

    For positive Lévy processes (i.e. subordinators) formulae are derived that express the probability density or the distribution function in terms of power series in time t. The applicability of the results to finance and to turbulence is briefly indicated.

  8. Lady luck: the theory of probability

    CERN Document Server

    Weaver, Warren

    1982-01-01

    ""Should I take my umbrella?"" ""Should I buy insurance?"" ""Which horse should I bet on?"" Every day ― in business, in love affairs, in forecasting the weather or the stock market questions arise which cannot be answered by a simple ""yes"" or ""no."" Many of these questions involve probability. Probabilistic thinking is as crucially important in ordinary affairs as it is in the most abstruse realms of science. This book is the best nontechnical introduction to probability ever written. Its author, the late Dr. Warren Weaver, was a professor of mathematics, active in the Rockefeller and Sloa

  9. Probability Inequalities for a Gladiator Game

    OpenAIRE

    Yosef Rinott; Marco Scarsini; Yaming Yu

    2011-01-01

    Based on a model introduced by Kaminsky, Luks, and Nelson (1984), we consider a zero-sum allocation game called the Gladiator Game, where two teams of gladiators engage in a sequence of one-to-one fights in which the probability of winning is a function of the gladiators' strengths. Each team's strategy consists of the allocation of its total strength among its gladiators. We find the Nash equilibria of the game and compute its value. To do this, we study interesting majorization-type probability...

  10. Risk Probability Estimating Based on Clustering

    DEFF Research Database (Denmark)

    Chen, Yong; Jensen, Christian D.; Gray, Elizabeth

    2003-01-01

    ...of prior experiences, recommendations from a trusted entity or the reputation of the other entity. In this paper we propose a dynamic mechanism for estimating the risk probability of a certain interaction in a given environment using hybrid neural networks. We argue that traditional risk assessment models from the insurance industry do not directly apply to ubiquitous computing environments. Instead, we propose a dynamic mechanism for risk assessment, which is based on pattern matching, classification and prediction procedures. This mechanism uses an estimator of risk probability, which is based...

  11. Probable Gastrointestinal Toxicity of Kombucha Tea

    Science.gov (United States)

    Srinivasan, Radhika; Smolinske, Susan; Greenbaum, David

    1997-01-01

    Kombucha tea is a health beverage made by incubating the Kombucha “mushroom” in tea and sugar. Although therapeutic benefits have been attributed to the drink, neither its beneficial effects nor adverse side effects have been reported widely in the scientific literature. Side effects probably related to consumption of Kombucha tea are reported in four patients. Two presented with symptoms of allergic reaction, the third with jaundice, and the fourth with nausea, vomiting, and head and neck pain. In all four, use of Kombucha tea in proximity to onset of symptoms and symptom resolution on cessation of tea drinking suggest a probable etiologic association. PMID:9346462

  12. Fifty challenging problems in probability with solutions

    CERN Document Server

    Mosteller, Frederick

    1987-01-01

    Can you solve the problem of "The Unfair Subway"? Marvin gets off work at random times between 3 and 5 p.m. His mother lives uptown, his girlfriend downtown. He takes the first subway that comes in either direction and eats dinner with the one he is delivered to. His mother complains that he never comes to see her, but he says she has a 50-50 chance. He has had dinner with her twice in the last 20 working days. Explain. Marvin's adventures in probability are one of the fifty intriguing puzzles that illustrate both elementary and advanced aspects of probability, each problem designed to chall...
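
    The resolution of the subway puzzle hinges on the relative timing of the two trains, and a quick simulation makes it plausible (the 10-minute headway and 1-minute offset below are assumptions chosen to reproduce Marvin's 1-in-10 rate, not numbers from Mosteller's text):

        import random

        # Suppose both trains run every 10 minutes, but the uptown train arrives
        # 1 minute after the downtown train. Arriving at a uniformly random time,
        # Marvin catches the uptown train only if he lands in that 1-minute window.
        trials, uptown = 100_000, 0
        for _ in range(trials):
            t = random.uniform(0.0, 10.0)     # arrival time within one cycle
            downtown_wait = (0.0 - t) % 10.0  # downtown trains at t = 0, 10, ...
            uptown_wait = (1.0 - t) % 10.0    # uptown trains at t = 1, 11, ...
            if uptown_wait < downtown_wait:
                uptown += 1
        print(uptown / trials)   # ~0.1: dinner with mother about 1 day in 10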

  13. Concepts of probability in radiocarbon analysis

    Directory of Open Access Journals (Sweden)

    Bernhard Weninger

    2011-12-01

    In this paper we explore the meaning of the word probability, not in general terms, but restricted to the field of radiocarbon dating, where it has the meaning of ‘dating probability assigned to calibrated 14C-ages’. The intention of our study is to improve our understanding of certain properties of radiocarbon dates, which – although mathematically abstract – are fundamental both for the construction of age models in prehistoric archaeology, as well as for an adequate interpretation of their reliability.

  14. STRIP: stream learning of influence probabilities

    DEFF Research Database (Denmark)

    Kutzkov, Konstantin

    2013-01-01

    ...cascades, and developing applications such as viral marketing. Motivated by modern microblogging platforms, such as twitter, in this paper we study the problem of learning influence probabilities in a data-stream scenario, in which the network topology is relatively stable and the challenge is for a learning algorithm to process the stream of user actions under two standard stream models (landmark and sliding window). Among several results, we show that we can learn influence probabilities with one pass over the data, using O(n log n) space, in both the landmark model and the sliding-window model, and we further show that our algorithm is within a logarithmic factor of optimal. For truly...

  15. Display advertising: Estimating conversion probability efficiently

    OpenAIRE

    Safari, Abdollah; Altman, Rachel MacKay; Loughin, Thomas M.

    2017-01-01

    The goal of online display advertising is to entice users to "convert" (i.e., take a pre-defined action such as making a purchase) after clicking on the ad. An important measure of the value of an ad is the probability of conversion. The focus of this paper is the development of a computationally efficient, accurate, and precise estimator of conversion probability. The challenges associated with this estimation problem are the delays in observing conversions and the size of the data set (both...

  16. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    We evaluate the binary lottery procedure for inducing risk neutral behavior in a subjective belief elicitation task. Harrison, Martínez-Correa and Swarthout [2013] found that the binary lottery procedure works robustly to induce risk neutrality when subjects are given one risk task defined over objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation...

  17. Return probability: Exponential versus Gaussian decay

    Energy Technology Data Exchange (ETDEWEB)

    Izrailev, F.M. [Instituto de Fisica, BUAP, Apdo. Postal J-48, 72570 Puebla (Mexico)]. E-mail: izrailev@sirio.ifuap.buap.mx; Castaneda-Mendoza, A. [Instituto de Fisica, BUAP, Apdo. Postal J-48, 72570 Puebla (Mexico)

    2006-02-13

    We analyze, both analytically and numerically, the time-dependence of the return probability in closed systems of interacting particles. Main attention is paid to the interplay between two regimes, one of which is characterized by the Gaussian decay of the return probability, and another one is the well-known regime of the exponential decay. Our analytical estimates are confirmed by the numerical data obtained for two models with random interaction. In view of these results, we also briefly discuss the dynamical model which was recently proposed for the implementation of a quantum computation.

  18. Duelling idiots and other probability puzzlers

    CERN Document Server

    Nahin, Paul J

    2002-01-01

    What are your chances of dying on your next flight, being called for jury duty, or winning the lottery? We all encounter probability problems in our everyday lives. In this collection of twenty-one puzzles, Paul Nahin challenges us to think creatively about the laws of probability as they apply in playful, sometimes deceptive, ways to a fascinating array of speculative situations. Games of Russian roulette, problems involving the accumulation of insects on flypaper, and strategies for determining the odds of the underdog winning the World Series all reveal intriguing dimensions to the worki

  19. Energy levels and transition probabilities for Fe XXV ions

    Energy Technology Data Exchange (ETDEWEB)

    Norrington, P.H.; Kingston, A.E.; Boone, A.W. [Department of Applied Maths and Theoretical Physics, Queen's University, Belfast BT7 1NN (United Kingdom)

    2000-05-14

    The energy levels of the 1s², 1s2l and 1s3l states of helium-like iron Fe XXV have been calculated using two sets of configuration-interaction wavefunctions. One set of wavefunctions was generated using the fully relativistic GRASP code and the other was obtained using CIV3, in which relativistic effects are introduced using the Breit-Pauli approximation. For transitions from the ground state to the n=2 and 3 states and for transitions between the n=2 and 3 states, the calculated excitation energies obtained by these two independent methods are in very good agreement and there is good agreement between these results and recent theoretical and experimental results. However, there is considerable disagreement between the various excitation energies for the transitions among the n=2 and also among the n=3 states. The two sets of wavefunctions are also used to calculate the E1, E2, M1 and M2 transition probabilities between all of the 1s², 1s2l and 1s3l states of helium-like iron Fe XXV. The results from the two calculations are found to be similar and to compare very well with other recent results for Δn=1 or 2 transitions. For Δn=0 transitions the agreement is much less satisfactory; this is mainly due to differences in the excitation energies. (author)

  20. Transition probabilities in neutron-rich 84,86Se

    Science.gov (United States)

    Litzinger, J.; Blazhev, A.; Dewald, A.; Didierjean, F.; Duchêne, G.; Fransen, C.; Lozeva, R.; Sieja, K.; Verney, D.; de Angelis, G.; Bazzacco, D.; Birkenbach, B.; Bottoni, S.; Bracco, A.; Braunroth, T.; Cederwall, B.; Corradi, L.; Crespi, F. C. L.; Désesquelles, P.; Eberth, J.; Ellinger, E.; Farnea, E.; Fioretto, E.; Gernhäuser, R.; Goasduff, A.; Görgen, A.; Gottardo, A.; Grebosz, J.; Hackstein, M.; Hess, H.; Ibrahim, F.; Jolie, J.; Jungclaus, A.; Kolos, K.; Korten, W.; Leoni, S.; Lunardi, S.; Maj, A.; Menegazzo, R.; Mengoni, D.; Michelagnoli, C.; Mijatovic, T.; Million, B.; Möller, O.; Modamio, V.; Montagnoli, G.; Montanari, D.; Morales, A. I.; Napoli, D. R.; Niikura, M.; Pollarolo, G.; Pullia, A.; Quintana, B.; Recchia, F.; Reiter, P.; Rosso, D.; Sahin, E.; Salsac, M. D.; Scarlassara, F.; Söderström, P.-A.; Stefanini, A. M.; Stezowski, O.; Szilner, S.; Theisen, Ch.; Valiente Dobón, J. J.; Vandone, V.; Vogt, A.

    2015-12-01

    Reduced quadrupole transition probabilities for low-lying transitions in neutron-rich 84,86Se are investigated with a recoil distance Doppler shift (RDDS) experiment. The experiment was performed at the Istituto Nazionale di Fisica Nucleare (INFN) Laboratori Nazionali di Legnaro using the Cologne Plunger device for the RDDS technique and the AGATA Demonstrator array for the γ-ray detection, coupled to the PRISMA magnetic spectrometer for an event-by-event particle identification. In 86Se the level lifetime of the yrast 2₁⁺ state and an upper limit for the lifetime of the 4₁⁺ state are determined for the first time. The results for 86Se are in agreement with previously reported predictions of large-scale shell-model calculations using the Ni78-I and Ni78-II effective interactions. In addition, intrinsic shape parameters of the lowest yrast states in 86Se are calculated. In semimagic 84Se the level lifetimes of the yrast 4₁⁺ and 6₁⁺ states are determined for the first time. Large-scale shell-model calculations using the effective interactions Ni78-II, JUN45, jj4b, and jj4pna are performed. The calculations describe B(E2; 2₁⁺→0₁⁺) and B(E2; 6₁⁺→4₁⁺) fairly well and point out problems in reproducing the experimental B(E2; 4₁⁺→2₁⁺).

  1. Probability learning and Piagetian probability conceptions in children 5 to 12 years old.

    Science.gov (United States)

    Kreitler, S; Zigler, E; Kreitler, H

    1989-11-01

    This study focused on the relations between performance on a three-choice probability-learning task and conceptions of probability as outlined by Piaget concerning mixture, normal distribution, random selection, odds estimation, and permutations. The probability-learning task and four Piagetian tasks were administered randomly to 100 male and 100 female, middle SES, average IQ children in three age groups (5 to 6, 8 to 9, and 11 to 12 years old) from different schools. Half the children were from Middle Eastern backgrounds, and half were from European or American backgrounds. As predicted, developmental level of probability thinking was related to performance on the probability-learning task. The more advanced the child's probability thinking, the higher his or her level of maximization and hypothesis formulation and testing and the lower his or her level of systematically patterned responses. The results suggest that the probability-learning and Piagetian tasks assess similar cognitive skills and that performance on the probability-learning task reflects a variety of probability concepts.

  2. Momentum Probabilities for a Single Quantum Particle in Three-Dimensional Regular "Infinite" Wells: One Way of Promoting Understanding of Probability Densities

    Science.gov (United States)

    Riggs, Peter J.

    2013-01-01

    Students often wrestle unsuccessfully with the task of correctly calculating momentum probability densities and have difficulty in understanding their interpretation. In the case of a particle in an "infinite" potential well, its momentum can take values that are not just those corresponding to the particle's quantised energies but…

  3. Multiple Auger decay probabilities of neon from the 1s core-hole state

    Science.gov (United States)

    Ma, Yulong; Zhou, Fuyang; Liu, Ling; Qu, Yizhi

    2017-10-01

    The multiple Auger decays of the Ne 1s⁻¹ state, including double and triple Auger processes, are investigated within the framework of perturbation theory. The contributions of the cascade and direct processes are determined for the double Auger decay. In the cascade processes, the choice of the orbital sets and configuration interaction can strongly affect the partial probabilities for the specific configurations of Ne³⁺. The multistep approaches, i.e., the knockout and shakeoff mechanisms, are implemented to deal with the direct double Auger processes, for which the total and partial probabilities corresponding to specific configurations of Ne³⁺ are calculated and reveal that knockout is dominant. Finally, the probabilities of the triple Auger decays, which are decomposed into a double Auger process and a subsequent emission of a single electron, are obtained using the cascade and knockout mechanisms. The calculated probabilities agree reasonably well with the available experimental data.

  4. Refresher Course in Probability, Stochastic Processes and ...

    Indian Academy of Sciences (India)

    Information and Announcements. Resonance – Journal of Science Education, Volume 10, Issue 5, May 2005, p. 100.

  5. Statistical methods for solar flare probability forecasting

    Science.gov (United States)

    Vecchia, D. F.; Tryon, P. V.; Caldwell, G. A.; Jones, R. W.

    1980-09-01

    The Space Environment Services Center (SESC) of the National Oceanic and Atmospheric Administration provides probability forecasts of regional solar flare disturbances. This report describes a statistical method useful for obtaining 24-hour solar flare forecasts which, historically, have been subjectively formulated. In Section 1 of this report the flare classifications of the SESC and the particular probability forecasts to be considered are defined. In Section 2 we describe the solar flare data base and outline general principles for effective data management. Three statistical techniques for solar flare probability forecasting are discussed in Section 3, viz., discriminant analysis, logistic regression, and multiple linear regression. We also review two scoring measures and suggest the logistic regression approach for obtaining 24-hour forecasts. In Section 4 a heuristic procedure is used to select nine basic predictors from the many available explanatory variables. Using these nine variables, logistic regression is demonstrated by example in Section 5. We conclude in Section 6 with broad suggestions regarding continued development of objective methods for solar flare probability forecasting.
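
    Probability forecasts of this kind are judged with scoring measures. A common choice is the Brier score, shown below on made-up forecasts as an illustration (the abstract does not name the report's two measures, so this is an assumption about the general approach, not the report's method):

        import numpy as np

        def brier_score(p_forecast, occurred):
            """Mean squared difference between forecast probability and outcome."""
            p = np.asarray(p_forecast, dtype=float)
            o = np.asarray(occurred, dtype=float)   # 1 if a flare occurred, else 0
            return np.mean((p - o) ** 2)

        # Ten hypothetical 24-hour flare forecasts and observed outcomes:
        forecasts = [0.1, 0.2, 0.7, 0.05, 0.9, 0.3, 0.6, 0.1, 0.8, 0.2]
        outcomes  = [0,   0,   1,   0,    1,   0,   1,   0,   1,   0]
        print(brier_score(forecasts, outcomes))                 # lower is better
        print(brier_score([np.mean(outcomes)] * 10, outcomes))  # climatology baseline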

  6. Probability in Action: The Red Traffic Light

    Science.gov (United States)

    Shanks, John A.

    2007-01-01

    Emphasis on problem solving in mathematics has gained considerable attention in recent years. While statistics teaching has always been problem driven, the same cannot be said for the teaching of probability where discrete examples involving coins and playing cards are often the norm. This article describes an application of simple probability…

  7. Haavelmo's Probability Approach and the Cointegrated VAR

    DEFF Research Database (Denmark)

    Juselius, Katarina

    ...dependent residuals, normalization, reduced rank, model selection, missing variables, simultaneity, autonomy and identification. Specifically, the paper discusses (1) the conditions under which the VAR model represents a full probability formulation of a sample of time-series observations, (2...

  8. Critique of `Elements of Quantum Probability'

    NARCIS (Netherlands)

    Gill, R.D.

    1998-01-01

    We analyse the thesis of Kummerer and Maassen that classical probability is unable to model the stochastic nature of the Aspect experiment, in which violation of Bell's inequality was experimentally demonstrated. According to these authors, the experiment shows the need to introduce the extension...

  9. Statistical physics of pairwise probability models

    DEFF Research Database (Denmark)

    Roudi, Yasser; Aurell, Erik; Hertz, John

    2009-01-01

    Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data...

  10. Teaching Mathematics with Technology: Probability Simulations.

    Science.gov (United States)

    Bright, George W.

    1989-01-01

    Discussed is the use of probability simulations in the mathematics classroom. Computer simulations using regular dice and special dice are described. Sample programs used to generate 100 rolls of a pair of dice in the BASIC and Logo languages are provided. (YP)
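
    The BASIC and Logo listings are not reproduced in this record; a modern equivalent of the 100-roll simulation, written here in Python purely as an illustration, might read:

        import random
        from collections import Counter

        # Simulate 100 rolls of a pair of dice and tally the sums, mirroring the
        # classroom exercise described above.
        rolls = [random.randint(1, 6) + random.randint(1, 6) for _ in range(100)]
        tally = Counter(rolls)

        for total in range(2, 13):
            expected = (6 - abs(total - 7)) / 36   # theoretical probability of each sum
            print(f"{total:2d}: observed {tally.get(total, 0):3d}/100, "
                  f"expected {expected:.3f}")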

  11. Applied probability models with optimization applications

    CERN Document Server

    Ross, Sheldon M

    1992-01-01

    Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous time optimization models, much more. Problems and references at chapter ends. ""Excellent introduction."" - Journal of the American Statistical Association. Bibliography. 1970 edition.

  12. Stochastics: introduction to probability and statistics

    CERN Document Server

    Georgii, Hans-Otto; Baake, Ellen; Georgii, Hans-Otto

    2008-01-01

    This book is a translation of the third edition of the well accepted German textbook 'Stochastik', which presents the fundamental ideas and results of both probability theory and statistics, and comprises the material of a one-year course. The stochastic concepts, models and methods are motivated by examples and problems and then developed and analysed systematically.

  13. Laplace's 1774 Memoir on Inverse Probability

    OpenAIRE

    Stigler, Stephen M.

    1986-01-01

    Laplace's first major article on mathematical statistics was published in 1774. It is arguably the most influential article in this field to appear before 1800, being the first widely read presentation of inverse probability and its application to both binomial and location parameter estimation. After a brief introduction, an English translation of this epochal memoir is given.

  14. A Probability Model for Belady's Anomaly

    Science.gov (United States)

    McMaster, Kirby; Sambasivam, Samuel E.; Anderson, Nicole

    2010-01-01

    In demand paging virtual memory systems, the page fault rate of a process varies with the number of memory frames allocated to the process. When an increase in the number of allocated frames leads to an increase in the number of page faults, Belady's anomaly is said to occur. In this paper, we present a probability model for Belady's anomaly. We…
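
    The paper's probability model is not reproduced in this record, but the anomaly itself is easy to exhibit. The following sketch uses the classic textbook reference string with a FIFO simulator of my own:

        from collections import deque

        def fifo_page_faults(reference_string, n_frames):
            """Count page faults under FIFO replacement with n_frames frames."""
            frames, queue, faults = set(), deque(), 0
            for page in reference_string:
                if page not in frames:
                    faults += 1
                    if len(frames) == n_frames:          # evict the oldest page
                        frames.remove(queue.popleft())
                    frames.add(page)
                    queue.append(page)
            return faults

        refs = [1, 2, 3, 4, 1, 2, 5, 1, 2, 3, 4, 5]
        print(fifo_page_faults(refs, 3))   # 9 faults
        print(fifo_page_faults(refs, 4))   # 10 faults: more memory, more faults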

  15. Probability Theories and the Justification of Theism

    OpenAIRE

    Portugal, Agnaldo Cuoco

    2003-01-01

    In the present paper I intend to analyse, criticise and suggest an alternative to Richard Swinburne"s use of Bayes"s theorem to justify the belief that there is a God. Swinburne"s contribution here lies in the scope of his project and the interpretation he adopts for Bayes"s formula, a very important theorem of the probability calculus.

  16. Probability as a theory dependent concept

    NARCIS (Netherlands)

    Atkinson, D; Peijnenburg, J

    1999-01-01

    It is argued that probability should be defined implicitly by the distributions of possible measurement values characteristic of a theory. These distributions are tested by, but not defined in terms of, relative frequencies of occurrences of events of a specified kind. The adoption of an a priori

  17. Probability from a Socio-Cultural Perspective

    Science.gov (United States)

    Sharma, Sashi

    2016-01-01

    There exists considerable and rich literature on students' misconceptions about probability; less attention has been paid to the development of students' probabilistic thinking in the classroom. Grounded in an analysis of the literature, this article offers a lesson sequence for developing students' probabilistic understanding. In particular, a…

  18. Concurrency meets probability: theory and practice (abstract)

    NARCIS (Netherlands)

    Katoen, Joost P.

    Treating random phenomena in concurrency theory has a long tradition. Petri nets [18, 10] and process algebras [14] have been extended with probabilities. The same applies to behavioural semantics such as strong and weak (bi)simulation [1], and testing pre-orders [5]. Beautiful connections between

  19. Investigating Probability with the NBA Draft Lottery.

    Science.gov (United States)

    Quinn, Robert J.

    1997-01-01

    Investigates an interesting application of probability in the world of sports. Considers the role of permutations in the lottery system used by the National Basketball Association (NBA) in the United States to determine the order in which nonplayoff teams select players from the college ranks. Presents a lesson on this topic in which students work…

  20. Probability & Statistics: Modular Learning Exercises. Student Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…

  1. Probability & Statistics: Modular Learning Exercises. Teacher Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…

  2. Probability in traffic : A challenge for modelling

    NARCIS (Netherlands)

    Calvert, S.C.; Taale, H.; Snelder, M.; Hoogendoorn, S.P.

    2012-01-01

    In the past decade an increase in research regarding stochasticity and probability in traffic modelling has occurred. The realisation has grown that simple presumptions and basic stochastic elements are insufficient to give accurate modelling results in many cases. This paper puts forward a strong

  4. Probability and Statistics for Particle Physicists

    CERN Document Server

    Ocariz, J.

    2014-01-01

    Lectures presented at the 1st CERN Asia-Europe-Pacific School of High-Energy Physics, Fukuoka, Japan, 14-27 October 2012. A pedagogical selection of topics in probability and statistics is presented. Choice and emphasis are driven by the author's personal experience, predominantly in the context of physics analyses using experimental data from high-energy physics detectors.

  5. Sampling, Probability Models and Statistical Reasoning -RE ...

    Indian Academy of Sciences (India)

    eligible voters who support a particular political party. A random sample of size n is selected from this population, and suppose k voters support this party. What is a good estimate of the required proportion? How do we obtain a probability model for the experiment just conducted? Let us examine the following simple example...

  6. A nonparametric method for predicting survival probabilities

    NARCIS (Netherlands)

    van der Klaauw, B.; Vriend, S.

    2015-01-01

    Public programs often use statistical profiling to assess the risk that applicants will become long-term dependent on the program. The literature uses linear probability models and (Cox) proportional hazard models to predict duration outcomes. These either focus on one threshold duration or impose

  7. Conceptual Variation and Coordination in Probability Reasoning

    Science.gov (United States)

    Nilsson, Per

    2009-01-01

    This study investigates students' conceptual variation and coordination among theoretical and experimental interpretations of probability. In the analysis we follow how Swedish students (12-13 years old) interact with a dice game, specifically designed to offer the students opportunities to elaborate on the logic of sample space,…

  8. Probability distribution functions in turbulent convection

    Science.gov (United States)

    Balachandar, S.; Sirovich, L.

    1991-01-01

    Results of an extensive investigation of probability distribution functions (pdf's) for Rayleigh-Benard convection, in the hard turbulence regime, are presented. It is seen that the pdf's exhibit a high degree of internal universality. In certain cases this universality is established within two Kolmogorov scales of a boundary. A discussion of the factors leading to universality is presented.

  9. Learning a Probability Distribution Efficiently and Reliably

    Science.gov (United States)

    Laird, Philip; Gamble, Evan

    1988-01-01

    A new algorithm, called the CDF-Inversion Algorithm, is described. Using it, one can efficiently learn a probability distribution over a finite set to a specified accuracy and confidence. The algorithm can be extended to learn joint distributions over a vector space. Some implementation results are described.

  10. Five-Parameter Bivariate Probability Distribution

    Science.gov (United States)

    Tubbs, J.; Brewer, D.; Smith, O. W.

    1986-01-01

    NASA technical memorandum presents four papers about five-parameter bivariate gamma class of probability distributions. With some overlap of subject matter, papers address different aspects of theories of these distributions and use in forming statistical models of such phenomena as wind gusts. Provides acceptable results for defining constraints in problems designing aircraft and spacecraft to withstand large wind-gust loads.

  11. Rethinking the learning of belief network probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Musick, R.

    1996-03-01

    Belief networks are a powerful tool for knowledge discovery that provide concise, understandable probabilistic models of data. There are methods grounded in probability theory to incrementally update the relationships described by the belief network when new information is seen, to perform complex inferences over any set of variables in the data, to incorporate domain expertise and prior knowledge into the model, and to automatically learn the model from data. This paper concentrates on part of the belief network induction problem, that of learning the quantitative structure (the conditional probabilities), given the qualitative structure. In particular, the current practice of rote learning the probabilities in belief networks can be significantly improved upon. We advance the idea of applying any learning algorithm to the task of conditional probability learning in belief networks, discuss potential benefits, and show results of applying neural networks and other algorithms to a medium sized car insurance belief network. The results demonstrate from 10 to 100% improvements in model error rates over the current approaches.
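
    The record contrasts its learning-based approach with the usual "rote" estimation of conditional probabilities. As background only, that rote baseline amounts to normalized co-occurrence counting; a minimal sketch, with a hypothetical car-insurance-style record format, is:

```python
from collections import Counter, defaultdict

def estimate_cpt(samples, child, parents):
    """'Rote' maximum-likelihood estimate of the conditional probability
    table P(child | parents): normalized co-occurrence counts."""
    counts = defaultdict(Counter)
    for s in samples:
        counts[tuple(s[p] for p in parents)][s[child]] += 1
    return {cfg: {v: n / sum(c.values()) for v, n in c.items()}
            for cfg, c in counts.items()}

# Hypothetical records in the spirit of the car-insurance network mentioned above
data = [{"age": "young", "car": "sport", "claim": "yes"},
        {"age": "young", "car": "sport", "claim": "no"},
        {"age": "young", "car": "sport", "claim": "yes"},
        {"age": "older", "car": "sedan", "claim": "no"}]
print(estimate_cpt(data, "claim", ["age", "car"]))
```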

  12. Failure probability of regional flood defences

    NARCIS (Netherlands)

    Lendering, K.T.; lang, M.; Klijn, F.; Samuels, P.

    2016-01-01

    Polders in the Netherlands are protected from flooding by primary and regional flood defence systems. During the last decade, scientific research in flood risk focused on the development of a probabilistic approach to quantify the probability of flooding of the primary flood defence system. This

  13. Comonotonic Book-Making with Nonadditive Probabilities

    NARCIS (Netherlands)

    Diecidue, E.; Wakker, P.P.

    2000-01-01

    This paper shows how de Finetti's book-making principle, commonly used to justify additive subjective probabilities, can be modified to agree with some nonexpected utility models. More precisely, a new foundation of the rank-dependent models is presented that is based on a comonotonic extension of the...

  14. Good Practices in Free-energy Calculations

    Science.gov (United States)

    Pohorille, Andrew; Jarzynski, Christopher; Chipot, Christopher

    2013-01-01

    As access to computational resources continues to increase, free-energy calculations have emerged as a powerful tool that can play a predictive role in drug design. Yet, in a number of instances, the reliability of these calculations can be improved significantly if a number of precepts, or good practices, are followed. For the most part, the theory upon which these good practices rely has been known for many years, but is often overlooked, or simply ignored. In other cases, the theoretical developments are too recent for their potential to be fully grasped and merged into popular platforms for the computation of free-energy differences. The current best practices for carrying out free-energy calculations will be reviewed, demonstrating that, at little to no additional cost, free-energy estimates could be markedly improved and bounded by meaningful error estimates. In free-energy perturbation and nonequilibrium work methods, monitoring the probability distributions that underlie the transformation between the states of interest, performing the calculation bidirectionally, stratifying the reaction pathway, and choosing the most appropriate paradigms and algorithms for transforming between states offer significant gains in both accuracy and precision. In thermodynamic integration and probability distribution (histogramming) methods, properly designed adaptive techniques yield nearly uniform sampling of the relevant degrees of freedom and, by doing so, can markedly improve the efficiency and accuracy of free-energy calculations without incurring any additional computational expense.

  15. Establishment probability in newly founded populations

    Directory of Open Access Journals (Sweden)

    Gusset Markus

    2012-06-01

    Background: Establishment success in newly founded populations relies on reaching the established phase, which is defined by characteristic fluctuations of the population's state variables. Stochastic population models can be used to quantify the establishment probability of newly founded populations; however, so far no simple but robust method for doing so existed. To determine a critical initial number of individuals that need to be released to reach the established phase, we used a novel application of the "Wissel plot", where -ln(1 - P0(t)) is plotted against time t. This plot is based on the equation P0(t) = 1 - c1*exp(-omega1*t), which relates the probability of extinction by time t, P0(t), to two constants: c1 describes the probability of a newly founded population reaching the established phase, whereas omega1 describes the population's probability of extinction per short time interval once established. Results: For illustration, we applied the method to a previously developed stochastic population model of the endangered African wild dog (Lycaon pictus). A newly founded population reaches the established phase if the intercept of the (extrapolated) linear part of the "Wissel plot" with the y-axis, which is -ln(c1), is negative. For wild dogs in our model, this is the case if a critical initial number of four packs, consisting of eight individuals each, is released. Conclusions: The method we present to quantify the establishment probability of newly founded populations is generic, and inferences thus are transferable to other systems across the field of conservation biology. In contrast to other methods, our approach disaggregates the components of a population's viability by distinguishing establishment from persistence.
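
    A minimal numerical sketch of the Wissel-plot construction, assuming the extinction probabilities P0(t) have already been produced by some stochastic population model (all numbers below are synthetic):

```python
import numpy as np

def wissel_intercept(t, p0):
    """Fit the linear tail of the 'Wissel plot' y(t) = -ln(1 - P0(t)).
    With P0(t) = 1 - c1*exp(-omega1*t), the fitted line has slope omega1
    and y-intercept -ln(c1); a negative intercept signals establishment."""
    y = -np.log(1.0 - np.asarray(p0, dtype=float))
    tail = slice(len(t) // 2, None)          # use the later, linear part
    slope, intercept = np.polyfit(np.asarray(t)[tail], y[tail], 1)
    return intercept, slope

# Synthetic extinction curve with c1 = 1.2, omega1 = 0.05 (values hypothetical):
# the recovered intercept -ln(1.2) is negative, i.e. establishment is reached.
t = np.arange(1, 101)
p0 = 1.0 - 1.2 * np.exp(-0.05 * t)
print(wissel_intercept(t, p0))
```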

  16. Neural representation of probabilities for Bayesian inference.

    Science.gov (United States)

    Rich, Dylan; Cazettes, Fanny; Wang, Yunyan; Peña, José Luis; Fischer, Brian J

    2015-04-01

    Bayesian models are often successful in describing perception and behavior, but the neural representation of probabilities remains in question. There are several distinct proposals for the neural representation of probabilities, but they have not been directly compared in an example system. Here we consider three models: a non-uniform population code where the stimulus-driven activity and distribution of preferred stimuli in the population represent a likelihood function and a prior, respectively; the sampling hypothesis which proposes that the stimulus-driven activity over time represents a posterior probability and that the spontaneous activity represents a prior; and the class of models which propose that a population of neurons represents a posterior probability in a distributed code. It has been shown that the non-uniform population code model matches the representation of auditory space generated in the owl's external nucleus of the inferior colliculus (ICx). However, the alternative models have not been tested, nor have the three models been directly compared in any system. Here we tested the three models in the owl's ICx. We found that spontaneous firing rate and the average stimulus-driven response of these neurons were not consistent with predictions of the sampling hypothesis. We also found that neural activity in ICx under varying levels of sensory noise did not reflect a posterior probability. On the other hand, the responses of ICx neurons were consistent with the non-uniform population code model. We further show that Bayesian inference can be implemented in the non-uniform population code model using one spike per neuron when the population is large and is thus able to support the rapid inference that is necessary for sound localization.

  17. Exact probability distribution for the Bernoulli-Malthus-Verhulst model driven by a multiplicative colored noise

    Science.gov (United States)

    Calisto, H.; Bologna, M.

    2007-05-01

    We report an exact result for the calculation of the probability distribution of the Bernoulli-Malthus-Verhulst model driven by a multiplicative colored noise. We study the conditions under which the probability distribution of the Malthus-Verhulst model can exhibit a transition from a unimodal to a bimodal distribution, depending on the value of a critical parameter. We also show that the mean value of x(t) in the latter model always approaches the value 1 asymptotically.

  18. Calculativeness and trust

    DEFF Research Database (Denmark)

    Frederiksen, Morten

    2014-01-01

    Williamson's characterisation of calculativeness as inimical to trust contradicts most sociological trust research. However, a similar argument is found within trust phenomenology. This paper re-investigates Williamson's argument from the perspective of Løgstrup's phenomenological theory of trust. Contrary to Williamson, however, Løgstrup's contention is that trust, not calculativeness, is the default attitude, and only when suspicion is awoken does trust falter. The paper argues that while Williamson's distinction between calculativeness and trust is supported by phenomenology, the analysis needs to take actual subjective experience into consideration. It points out that, first, Løgstrup places trust alongside calculativeness as a different mode of engaging in social interaction, rather than conceiving of trust as a state or the outcome of a decision-making process. Secondly, the analysis must take...

  19. Unit Cost Compendium Calculations

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Unit Cost Compendium (UCC) Calculations raw data set was designed to provide for greater accuracy and consistency in the use of unit costs across the USEPA...

  20. Magnetic Field Grid Calculator

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Magnetic Field Properties Calculator will computes the estimated values of Earth's magnetic field(declination, inclination, vertical component, northerly...

  1. National Stormwater Calculator

    Science.gov (United States)

    EPA’s National Stormwater Calculator (SWC) is a desktop application that estimates the annual amount of rainwater and frequency of runoff from a specific site anywhere in the United States (including Puerto Rico).

  2. On The Left Tail-End Probabilities and the Probability Generating ...

    African Journals Online (AJOL)

    end probabilities, P(X ≤ i) = Π_i. The resulting function, π_X(t), is continuous and converges uniformly within the unit circle, |t| < 1. A clear functional link is established between π_X(t) and two other well-known versions of the probability generating ...

  3. A Comprehensive Probability Project for the Upper Division One-Semester Probability Course Using Yahtzee

    Science.gov (United States)

    Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa

    2011-01-01

    This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…

  4. Using High-Probability Foods to Increase the Acceptance of Low-Probability Foods

    Science.gov (United States)

    Meier, Aimee E.; Fryling, Mitch J.; Wallace, Michele D.

    2012-01-01

    Studies have evaluated a range of interventions to treat food selectivity in children with autism and related developmental disabilities. The high-probability instructional sequence is one intervention with variable results in this area. We evaluated the effectiveness of a high-probability sequence using 3 presentations of a preferred food on…

  5. Statistics of adaptive optics speckles: From probability cloud to probability density function

    OpenAIRE

    Yaitskova, Natalia; Gladysz, Szymon

    2016-01-01

    The complex amplitude in the focal plane of an adaptive optics system is modelled as an elliptical complex random variable. The geometrical properties of the probability density function of such a variable relate directly to the statistics of the residual phase. Building solely on the two-dimensional geometry, the expression for the probability density function of speckle intensity is derived.

  6. Calculation Tool for Engineering

    OpenAIRE

    Lampinen, Samuli

    2016-01-01

    The study was conducted as qualitative research for K-S Konesuunnittelu Oy. The company provides mechanical engineering for technology suppliers in the Finnish export industries. The main objective was to study whether the competitiveness of the case company could be improved using a self-made calculation tool (Excel Tool). The mission was to clarify processes in the case company to see the possibilities of the Excel Tool and to compare it with other potential calculation applications. In addition,...

  7. Current interruption transients calculation

    CERN Document Server

    Peelo, David F

    2014-01-01

    Provides an original, detailed and practical description of current interruption transients, their origins, the circuits involved, and how they can be calculated. Current Interruption Transients Calculation is a comprehensive resource for the understanding, calculation and analysis of the transient recovery voltages (TRVs) and related re-ignition or re-striking transients associated with fault current interruption and the switching of inductive and capacitive load currents in circuits. This book provides an original, detailed and practical description of current interruption transients, origins,...

  8. Transmission probability of poly(dA)-poly(dT) DNA in electric field

    Science.gov (United States)

    Rahmi, K. A.; Yudiarsah, E.

    2017-07-01

    The transmission probability of poly(dA)-poly(dT) DNA in an electric field has been studied for several voltages. The DNA molecule, 32 base pairs long, is modeled using a tight-binding Hamiltonian and is contacted to electrodes at both ends. The voltage applied at the electrodes is assumed to change the base onsite energies linearly, thereby influencing charge transmission along the DNA chain. The transmission probability is calculated using the transfer-matrix and scattering-matrix methods, and the results are compared at different temperatures and twisting-motion frequencies. The results show that as the voltage increases, the transmission probability in the region with energies above the Fermi energy increases. This increase with voltage becomes larger at higher twisting-motion frequencies but smaller at higher temperatures.
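
    The record's model details (base onsite energies, hopping strengths, twist coupling) are not given in the abstract; the following is a generic transfer-matrix transmission calculation for a 32-site tight-binding chain with hypothetical parameters, illustrating the technique rather than reproducing the paper's results:

```python
import numpy as np

def transmission(E, onsite, t=1.0):
    """Transmission probability T(E) through a 1D tight-binding chain with
    site energies `onsite`, attached to ideal leads (dispersion E = 2t cos k),
    via the transfer-matrix method plus plane-wave matching in the leads."""
    k = np.arccos(E / (2.0 * t))             # lead wavenumber; |E| < 2t assumed
    a = np.exp(1j * k)
    M = np.eye(2, dtype=complex)
    for eps in onsite:                       # accumulate M = M_N ... M_1
        M = np.array([[(E - eps) / t, -1.0], [1.0, 0.0]]) @ M
    N = len(onsite)
    # unknowns x = (tau, r) from matching psi_n at both ends of the chain
    A = np.array([[a ** (N + 1), -(M[0, 0] / a + M[0, 1])],
                  [a ** N,       -(M[1, 0] / a + M[1, 1])]])
    b = np.array([M[0, 0] * a + M[0, 1], M[1, 0] * a + M[1, 1]])
    tau, _r = np.linalg.solve(A, b)
    return abs(tau) ** 2

# 32-site chain with alternating onsite energies standing in for the A/T
# base-pair pattern (all parameter values hypothetical)
onsite = np.tile([0.25, -0.25], 16)
print(transmission(0.5, onsite))
```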

  9. Probability Map Viewer: near real-time probability map generator of serial block electron microscopy collections.

    Science.gov (United States)

    Churas, Christopher; Perez, Alex J; Hakozaki, Hiroyuki; Wong, Willy; Lee, David; Peltier, Steven T; Ellisman, Mark H

    2017-10-01

    To expedite the review of semi-automated probability maps of organelles and other features from 3D electron microscopy data we have developed Probability Map Viewer, a Java-based web application that enables the computation and visualization of probability map generation results in near real-time as the data are being collected from the microscope. Probability Map Viewer allows the user to select one or more voxel classifiers, apply them on a sub-region of an active collection, and visualize the results as overlays on the raw data via any web browser using a personal computer or mobile device. Thus, Probability Map Viewer accelerates and informs the image analysis workflow by providing a tool for experimenting with and optimizing dataset-specific segmentation strategies during imaging. https://github.com/crbs/probabilitymapviewer. mellisman@ucsd.edu. Supplementary data are available at Bioinformatics online.

  10. Tracking of pitch probabilities in congenital amusia.

    Science.gov (United States)

    Omigie, Diana; Pearce, Marcus T; Stewart, Lauren

    2012-06-01

    Auditory perception involves not only hearing a series of sounds but also making predictions about future ones. For typical listeners, these predictions are formed on the basis of long-term schematic knowledge, gained over a lifetime of exposure to the auditory environment. Individuals with a developmental disorder known as congenital amusia show marked difficulties with music perception and production. The current study investigated whether these difficulties can be explained, either by a failure to internalise the statistical regularities present in music, or by a failure to consciously access this information. Two versions of a melodic priming paradigm were used to probe participants' abilities to form melodic pitch expectations, in an implicit and an explicit manner. In the implicit version (Experiment 1), participants made speeded, forced-choice discriminations concerning the timbre of a cued target note. In the explicit version (Experiment 2), participants used a 1-7 rating scale to indicate the degree to which the pitch of the cued target note was expected or unexpected. Target notes were chosen to have high or low probability in the context of the melody, based on the predictions of a computational model of melodic expectation. Analysis of the data from the implicit task revealed a melodic priming effect in both amusic and control participants whereby both groups showed faster responses to high probability than low probability notes rendered in the same timbre as the context. However, analysis of the data from the explicit task revealed that amusic participants were significantly worse than controls at using explicit ratings to differentiate between high and low probability events in a melodic context. Taken together, findings from the current study make an important contribution in demonstrating that amusic individuals track melodic pitch probabilities at an implicit level despite an impairment, relative to controls, when required to make explicit

  11. Probability and statistics for particle physics

    CERN Document Server

    Mana, Carlos

    2017-01-01

    This book comprehensively presents the basic concepts of probability and Bayesian inference with sufficient generality to make them applicable to current problems in scientific research. The first chapter provides the fundamentals of probability theory that are essential for the analysis of random phenomena. The second chapter includes a full and pragmatic review of the Bayesian methods that constitute a natural and coherent framework, with enough freedom to analyze all the information available from experimental data in a conceptually simple manner. The third chapter presents the basic Monte Carlo techniques used in scientific research, allowing a large variety of problems that are difficult to tackle by other procedures to be handled. The author also introduces a basic algorithm, which enables readers to simulate samples from simple distributions, and describes useful cases for researchers in particle physics. The final chapter is devoted to the basic ideas of Information Theory, which are important in the Bayesian me...

  12. Transit probabilities for debris around white dwarfs

    Science.gov (United States)

    Lewis, John Arban; Johnson, John A.

    2017-01-01

    The discovery of WD 1145+017 (Vanderburg et al. 2015), a metal-polluted white dwarf with an infrared excess and transits, confirmed the long-held theory that at least some metal-polluted white dwarfs are actively accreting material from crushed-up planetesimals. A statistical understanding of WD 1145-like systems would inform us about the various pathways for metal pollution and the end states of planetary systems around medium- to high-mass stars. However, we have only one example, and there are presently no published studies of transit detection/discovery probabilities for white dwarfs within this interesting regime. We present a preliminary look at the transit probabilities for metal-polluted white dwarfs and their projected space density in the Solar Neighborhood, which will inform future searches for analogs to WD 1145+017.

  13. Estimation of transition probabilities of credit ratings

    Science.gov (United States)

    Peng, Gan Chew; Hin, Pooi Ah

    2015-12-01

    The present research is based on the quarterly credit ratings of ten companies over 15 years, taken from the database of the Taiwan Economic Journal. The components of the vector m_i = (m_{i1}, m_{i2}, ..., m_{i10}) denote the credit ratings of the ten companies in the i-th quarter. The vector m_{i+1} in the next quarter is modelled as dependent on the vector m_i via a conditional distribution derived from a 20-dimensional power-normal mixture distribution. The transition probability P_kl(i, j) of getting m_{i+1,j} = l given that m_{i,j} = k is then computed from the conditional distribution. It is found that the variation of the transition probability P_kl(i, j) as i varies is able to give an indication of the possible transition of the credit rating of the j-th company in the near future.
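
    The power-normal mixture machinery is beyond a short sketch, but the empirical counterpart of a transition probability P_kl, estimated by counting quarter-to-quarter rating moves, can be illustrated as follows (ratings and grades hypothetical):

```python
import numpy as np

def empirical_transition_matrix(ratings, n_states):
    """Estimate P[k, l] = Pr(next rating = l | current rating = k)
    from a sequence of integer-coded ratings (0 .. n_states-1)."""
    counts = np.zeros((n_states, n_states))
    for k, l in zip(ratings[:-1], ratings[1:]):
        counts[k, l] += 1
    rows = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

# Hypothetical quarterly ratings of one company on a 4-grade scale
history = [2, 2, 3, 3, 2, 1, 1, 2, 2, 2, 3, 3, 3, 2]
print(empirical_transition_matrix(history, n_states=4))
```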

  14. Foundations of quantization for probability distributions

    CERN Document Server

    Graf, Siegfried

    2000-01-01

    Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous and special classes of singular probabilities (surface measures, self-similar measures) presenting some new results for the first time. Written for researchers and graduate students in probability theory the monograph is of potential interest to all people working in the disciplines mentioned above.

  15. Probabilities, causes and propensities in physics

    CERN Document Server

    Suárez, Mauricio

    2010-01-01

    This volume defends a novel approach to the philosophy of physics: it is the first book devoted to a comparative study of probability, causality, and propensity, and their various interrelations, within the context of contemporary physics - particularly quantum and statistical physics. The philosophical debates and distinctions are firmly grounded upon examples from actual physics, thus exemplifying a robustly empiricist approach. The essays, by both prominent scholars in the field and promising young researchers, constitute a pioneer effort in bringing out the connections between probabilistic, causal and dispositional aspects of the quantum domain. This book will appeal to specialists in philosophy and foundations of physics, philosophy of science in general, metaphysics, ontology of physics theories, and philosophy of probability.

  16. A probability distribution model for rain rate

    Science.gov (United States)

    Kedem, Benjamin; Pavlopoulos, Harry; Guan, Xiaodong; Short, David A.

    1994-01-01

    A systematic approach is suggested for modeling the probability distribution of rain rate. Rain rate, conditional on rain and averaged over a region, is modeled as a temporally homogeneous diffusion process with appropriate boundary conditions. The approach requires a drift coefficient (the conditional average instantaneous rate of change of rain intensity) as well as a diffusion coefficient (the conditional average magnitude of the rate of growth and decay of rain rate about its drift). Under certain assumptions on the drift and diffusion coefficients compatible with rain rate, a new parametric family, containing the lognormal distribution, is obtained for the continuous part of the stationary limit probability distribution. The family is fitted to tropical rainfall from Darwin and Florida, and it is found that the lognormal distribution provides adequate fits as compared with other members of the family and also with the gamma distribution.
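
    As a point of reference for the lognormal member of the family mentioned above, fitting a lognormal distribution to positive (conditional-on-rain) rain rates by maximum likelihood is a two-line computation; the sketch below uses synthetic data, not the Darwin or Florida records:

```python
import numpy as np

def fit_lognormal(rain_rates):
    """Maximum-likelihood lognormal fit to positive rain rates:
    log(R) ~ Normal(mu, sigma); returns (mu, sigma)."""
    logs = np.log(np.asarray(rain_rates, dtype=float))
    return logs.mean(), logs.std(ddof=1)

rng = np.random.default_rng(7)
sample = rng.lognormal(mean=1.0, sigma=0.8, size=1000)   # synthetic rain rates
print(fit_lognormal(sample))
```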

  17. Transition probabilities for Be I, Be II, Mg I, and Mg II

    CERN Document Server

    Zheng Neng Wu; Yangru Yi; Zhou Tao; Ma Dong Xia; Wu Yong Gang; Xu Hai Ta

    2001-01-01

    The Weakest Bound Electron Potential Model (WBEPM) is used to calculate transition probabilities between LS multiplets of Be I, Be II, Mg I, and Mg II. In this calculation, a coupled set of equations is employed to determine effective charges Z* and effective quantum numbers n* and l* using, as input data, experimental energy levels and radial expectation values obtained with the numerical Coulomb approximation. Transition probabilities between highly excited states are evaluated using modified hydrogenic wavefunctions. Good agreement is seen in comparisons of the present results with those from other works.

  18. Selected papers on analysis, probability, and statistics

    CERN Document Server

    Nomizu, Katsumi

    1994-01-01

    This book presents papers that originally appeared in the Japanese journal Sugaku. The papers fall into the general area of mathematical analysis as it pertains to probability and statistics, dynamical systems, differential equations and analytic function theory. Among the topics discussed are: stochastic differential equations, spectra of the Laplacian and Schrödinger operators, nonlinear partial differential equations which generate dissipative dynamical systems, fractal analysis on self-similar sets and the global structure of analytic functions.

  19. An introduction to measure-theoretic probability

    CERN Document Server

    Roussas, George G

    2004-01-01

    This book provides, in a concise yet detailed way, the bulk of the probabilistic tools that a student working toward an advanced degree in statistics, probability and other related areas should be equipped with. The approach is classical, avoiding the use of mathematical tools not necessary for carrying out the discussions. All proofs are presented in full detail. Excellent exposition marked by a clear, coherent and logical development of the subject; easy-to-understand, detailed discussion of material; complete proofs.

  20. Quantization of Prior Probabilities for Hypothesis Testing

    OpenAIRE

    Varshney, Kush R.; Varshney, Lav R.

    2008-01-01

    Bayesian hypothesis testing is investigated when the prior probabilities of the hypotheses, taken as a random vector, are quantized. Nearest neighbor and centroid conditions are derived using mean Bayes risk error as a distortion measure for quantization. A high-resolution approximation to the distortion-rate function is also obtained. Human decision making in segregated populations is studied assuming Bayesian hypothesis testing with quantized priors.

  1. Marrakesh International Conference on Probability and Statistics

    CERN Document Server

    Ouassou, Idir; Rachdi, Mustapha

    2015-01-01

    This volume, which highlights recent advances in statistical methodology and applications, is divided into two main parts. The first part presents theoretical results on estimation techniques in functional statistics, while the second examines three key areas of application: estimation problems in queuing theory, an application in signal processing, and the copula approach to epidemiologic modelling. The book’s peer-reviewed contributions are based on papers originally presented at the Marrakesh International Conference on Probability and Statistics held in December 2013.

  2. Multiple decomposability of probabilities on contractible locally ...

    Indian Academy of Sciences (India)

    Definition 3.1). As mentioned before, μ is n-times τ-decomposable iff μ has a representation as an (n + 1)-times iterated convolution product. To be allowed to ..... Then the classical version of the equivalence theorem holds: if ν_i, i ≥ 0, and ν are probabilities and X_i, i ≥ 0, and Y are independent G-valued random variables with ...

  3. SureTrak Probability of Impact Display

    Science.gov (United States)

    Elliott, John

    2012-01-01

    The SureTrak Probability of Impact Display software was developed for use during rocket launch operations. The software displays probability of impact information for each ship near the hazardous area during the time immediately preceding the launch of an unguided vehicle. Wallops range safety officers need to be sure that the risk to humans is below a certain threshold during each use of the Wallops Flight Facility Launch Range. Under the variable conditions that can exist at launch time, the decision to launch must be made in a timely manner to ensure a successful mission while not exceeding those risk criteria. Range safety officers need a tool that can give them the needed probability of impact information quickly, and in a format that is clearly understandable. This application is meant to fill that need. The software is a reuse of part of software developed for an earlier project: Ship Surveillance Software System (S4). The S4 project was written in C++ using Microsoft Visual Studio 6. The data structures and dialog templates from it were copied into a new application that calls the implementation of the algorithms from S4 and displays the results as needed. In the S4 software, the list of ships in the area was received from one local radar interface and from operators who entered the ship information manually. The SureTrak Probability of Impact Display application receives ship data from two local radars as well as the SureTrak system, eliminating the need for manual data entry.

  4. Probability and Statistics in Aerospace Engineering

    Science.gov (United States)

    Rheinfurth, M. H.; Howell, L. W.

    1998-01-01

    This monograph was prepared to give the practicing engineer a clear understanding of probability and statistics with special consideration to problems frequently encountered in aerospace engineering. It is conceived to be both a desktop reference and a refresher for aerospace engineers in government and industry. It could also be used as a supplement to standard texts for in-house training courses on the subject.

  5. Probable Unusual Transmission of Zika Virus

    Centers for Disease Control (CDC) Podcasts

    2011-05-23

    This podcast discusses a study about the probable unusual transmission of Zika Virus Infection from a scientist to his wife, published in the May 2011 issue of Emerging Infectious Diseases. Dr. Brian Foy, Associate Professor at Colorado State University, shares details of this event.  Created: 5/23/2011 by National Center for Emerging Zoonotic and Infectious Diseases (NCEZID).   Date Released: 5/25/2011.

  6. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.
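
    As a compact reminder of the last topic in that list, Bayes' theorem for mutually exclusive, exhaustive hypotheses H_j and evidence E reads (standard form, not quoted from the report itself):

```latex
P(H_i \mid E) = \frac{P(E \mid H_i)\, P(H_i)}{\sum_j P(E \mid H_j)\, P(H_j)}
```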

  7. Atomic Transition Probabilities in TiI

    Science.gov (United States)

    Nitz, David E.; Siewert, Lowell K.; Schneider, Matthew N.

    2001-05-01

    We have measured branching fractions and atomic transition probabilities in TiI for 50 visible and near-IR transitions which connect odd-parity levels lying 25000 cm-1 to 27000 cm-1 above the ground state to low-lying even parity levels. Branching fractions are obtained from the analysis of six hollow cathode emission spectra recorded using the Fourier transform spectrometer at the National Solar Observatory, supplemented in cases susceptible to radiation-trapping problems by conventional emission spectroscopy using a commercial sealed lamp operated at very low discharge current. The absolute scale for normalizing the branching fractions is established using radiative lifetimes from time-resolved laser-induced fluorescence measurements.(S. Salih and J.E. Lawler, Astronomy and Astrophysics 239, 407 (1990).) Uncertainties of the transition probabilities range from ±5% for the stronger branches to ±20% for the weaker ones. Among the 16 lines for which previously-measured transition probabilities are listed in the NIST critical compilation,(G. A. Martin, J. R. Fuhr, and W. L. Wiese, J. Phys. Chem. Ref. Data 17, Suppl. 3, 85 (1988).) several significant discrepancies are noted.

  8. Probable doxycycline-induced acute pancreatitis.

    Science.gov (United States)

    Moy, Brian T; Kapila, Nikhil

    2016-03-01

    A probable case of doxycycline-induced pancreatitis is reported. A 51-year-old man was admitted to the emergency department with a one-week history of extreme fatigue, malaise, and confusion. Three days earlier he had been started on empirical doxycycline therapy for presumed Lyme disease; he was taking no other medications at the time of admission. A physical examination was remarkable for abdominal tenderness. Relevant laboratory data included a lipase concentration of 5410 units/L (normal range, 13-60 units/L), an amylase concentration of 1304 units/L (normal range, 28-100 units/L), and an elevated glycosylated hemoglobin concentration of 15.2%; doxycycline was discontinued. With vasopressor support, aggressive fluid resuscitation, hemodialysis, and an insulin infusion, the patient's clinical course rapidly improved over five days. Scoring of the case via the method of Naranjo et al. yielded a score of 6, indicating a probable adverse reaction to doxycycline. A man developed acute pancreatitis (AP) three days after starting therapy with oral doxycycline, and the association between drug and reaction was determined to be probable. His case appears to be the third reported case of doxycycline-associated AP, although tigecycline, tetracycline, and minocycline have also been implicated as causes of AP. Copyright © 2016 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  9. Theoretical oscillator strengths, transition probabilities, and radiative lifetimes of levels in Pb V

    Energy Technology Data Exchange (ETDEWEB)

    Colón, C., E-mail: cristobal.colon@upm.es [Dpto. Física Aplicada. E.U.I.T. Industrial, Universidad Politécnica de Madrid, Ronda de Valencia 3, 28012 Madrid (Spain); Alonso-Medina, A. [Dpto. Física Aplicada. E.U.I.T. Industrial, Universidad Politécnica de Madrid, Ronda de Valencia 3, 28012 Madrid (Spain); Porcher, P. [Laboratoire de Chimie Appliquée de l’Etat Solide, CNRS-UMR 7574, Paris (France)

    2014-01-15

    Theoretical values of oscillator strengths and transition probabilities for 306 spectral lines arising from the 5d⁹ns (n = 7, 8, 9), 5d⁹np (n = 6, 7), 5d⁹6d, and 5d⁹5f configurations, and radiative lifetimes of 9 levels of Pb V, have been obtained. These values were obtained in intermediate coupling (IC) and using ab initio relativistic Hartree–Fock calculations including core-polarization effects. For the IC calculations we use the standard method of least-squares fitting of experimental energy levels by means of computer codes from Cowan. We included in these calculations the 5d⁸6s6p and 5d⁸6s² configurations. These calculations have facilitated the identification of the 214.25, 216.79, and 227.66 nm spectral lines of Pb V. In the absence of experimental results for oscillator strengths and transition probabilities, we could not make a direct comparison with our results. However, the Stark broadening parameters calculated from these values are in excellent agreement with the experimental widths found in the literature. -- Highlights: • Theoretical values of transition probabilities of Pb V have been obtained. • We use for the IC calculations the standard method of least squares. • The parameters calculated from these values are in agreement with the experimental values.

  10. Online plasma calculator

    Science.gov (United States)

    Wisniewski, H.; Gourdain, P.-A.

    2017-10-01

    APOLLO is an online, Linux-based plasma calculator. Users can input variables that correspond to their specific plasma, such as ion and electron densities, temperatures, and external magnetic fields. The system is based on a webserver where a FastCGI protocol computes key plasma parameters, including frequencies, lengths, velocities, and dimensionless numbers. FastCGI was chosen to overcome security problems caused by Java-based plugins; it also speeds up calculations over PHP-based systems. APOLLO is built upon the Wt library, which turns any web browser into a versatile, fast graphical user interface. All values with units are expressed in SI units except temperature, which is in electron-volts. SI units were chosen over cgs units because of the gradual shift toward SI units within the plasma community. APOLLO is intended to be a fast calculator that also provides the user with the proper equations used to calculate the plasma parameters. This system is intended to be used by undergraduates taking plasma courses as well as graduate students and researchers who need a quick reference calculation.

  11. THE ACCOUNTING POSTEMPLOYMENT BENEFITS BASED ON ACTUARIAL CALCULATIONS

    OpenAIRE

    Anna CEBOTARI

    2017-01-01

    Accounting for post-employment benefits based on actuarial calculations currently remains a subject studied in Moldova only theoretically. Applying actuarial calculations in accounting in fact reflects its evolving character. Because national accounting standards have been adapted to international standards, which in turn require the valuation of assets and liabilities at fair value, there is a need to draw up exact calculations grounded in the theory of probability and ...

  12. Impact of controlling the sum of error probability in the sequential probability ratio test

    Directory of Open Access Journals (Sweden)

    Bijoy Kumarr Pradhan

    2013-05-01

    A generalized modified method is proposed to control the sum of error probabilities in the sequential probability ratio test. The method minimizes the weighted average of the two average sample numbers under a simple null hypothesis and a simple alternative hypothesis, with the restriction that the sum of the error probabilities is a pre-assigned constant, in order to find the optimal sample size. Finally, a comparison is made with the optimal sample size found from the fixed-sample-size procedure. The results are applied to the cases when the random variate follows a normal law as well as a Bernoullian law.
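
    The generalized method itself is not reproduced in the record; for orientation, the classical Wald SPRT that it modifies, with the standard threshold approximations, can be sketched as follows (example data synthetic):

```python
import math
import random

def sprt(log_lr_stream, alpha, beta):
    """Wald's classical SPRT with the standard threshold approximations
    A = (1-beta)/alpha and B = beta/(1-alpha); the record's method modifies
    this scheme to constrain the sum alpha + beta directly."""
    upper = math.log((1.0 - beta) / alpha)   # crossing -> accept H1
    lower = math.log(beta / (1.0 - alpha))   # crossing -> accept H0
    s, n = 0.0, 0
    for z in log_lr_stream:
        n += 1
        s += z
        if s >= upper:
            return "accept H1", n
        if s <= lower:
            return "accept H0", n
    return "undecided", n

# Example: N(0,1) vs N(1,1); the per-observation log-likelihood ratio is x - 1/2
random.seed(1)
obs = (random.gauss(1.0, 1.0) for _ in range(1000))    # data generated under H1
print(sprt((x - 0.5 for x in obs), alpha=0.05, beta=0.05))
```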

  13. LHC Bellows Impedance Calculations

    CERN Document Server

    Dyachkov, M

    1997-01-01

    To compensate for thermal expansion the LHC ring has to accommodate about 2500 bellows which, together with beam position monitors, are the main contributors to the LHC broad-band impedance budget. In order to reduce this impedance to an acceptable value the bellows have to be shielded. In this paper we compare different designs proposed for the bellows and calculate their transverse and longitudinal wakefields and impedances. Owing to the 3D geometry of the bellows, the code MAFIA was used for the wakefield calculations; when possible the MAFIA results were compared to those obtained with ABCI. The results presented in this paper indicate that the latest bellows design, in which shielding is provided by sprung fingers which can slide along the beam screen, has impedances smaller than those previously estimated according to a rather conservative scaling of SSC calculations and LEP measurements. Several failure modes, such as missing fingers and imperfect RF contact, have also been studied.

  14. INVAP's Nuclear Calculation System

    Directory of Open Access Journals (Sweden)

    Ignacio Mochi

    2011-01-01

    Since its origins in 1976, INVAP has continuously developed the calculation system used for the design and optimization of nuclear reactors. The calculation codes have been polished and enhanced with new capabilities as they were needed or useful for the new challenges that the market imposed. The present state of the code packages enables INVAP to design nuclear installations with complex geometries using a set of easy-to-use input files that minimize user errors due to confusion or misinterpretation. A set of intuitive graphic postprocessors has also been developed, providing a fast and complete visualization tool for the parameters obtained in the calculations. The capabilities and general characteristics of this deterministic software package are presented throughout the paper, including several examples of its recent application.

  15. Graphing Calculator Mini Course

    Science.gov (United States)

    Karnawat, Sunil R.

    1996-01-01

    The "Graphing Calculator Mini Course" project provided a mathematically-intensive technologically-based summer enrichment workshop for teachers of American Indian students on the Turtle Mountain Indian Reservation. Eleven such teachers participated in the six-day workshop in summer of 1996 and three Sunday workshops in the academic year. The project aimed to improve science and mathematics education on the reservation by showing teachers effective ways to use high-end graphing calculators as teaching and learning tools in science and mathematics courses at all levels. In particular, the workshop concentrated on applying TI-82's user-friendly features to understand the various mathematical and scientific concepts.

  16. Probability sampling in legal cases: Kansas cellphone users

    Science.gov (United States)

    Kadane, Joseph B.

    2012-10-01

    Probability sampling is a standard statistical technique. This article introduces the basic ideas of probability sampling, and shows in detail how probability sampling was used in a particular legal case.

  17. The paradigm of complex probability and Chebyshev's inequality

    National Research Council Canada - National Science Library

    Abou Jaoude, Abdo

    2016-01-01

    ... an additional three axioms. Therefore, we create the complex probability set, which is the sum of the real set with its corresponding real probability, and the imaginary set with its corresponding imaginary probability...

  18. Probability mapping of scarred myocardium using texture and intensity features in CMR images.

    Science.gov (United States)

    Kotu, Lasya Priya; Engan, Kjersti; Skretting, Karl; Måløy, Frode; Orn, Stein; Woie, Leik; Eftestøl, Trygve

    2013-09-22

    The myocardium exhibits a heterogeneous nature due to scarring after Myocardial Infarction (MI). In Cardiac Magnetic Resonance (CMR) imaging, Late Gadolinium (LG) contrast agent enhances the intensity of the scarred area in the myocardium. In this paper, we propose a probability mapping technique using texture and intensity features to describe the heterogeneous nature of the scarred myocardium in CMR images after MI. Scarred tissue and non-scarred tissue are represented with high and low probabilities, respectively; intermediate values possibly indicate areas where the scarred and healthy tissues are interwoven. The probability map of scarred myocardium is calculated by using a probability function based on Bayes' rule, and any set of features can be used in the probability function. In the present study, we demonstrate the use of two different types of features: one based on the mean pixel intensity and the other on the underlying texture information of the scarred and non-scarred myocardium. Examples of probability maps computed using the mean pixel intensity and the underlying texture information are presented. We hypothesize that the probability mapping of the myocardium offers an alternative visualization, possibly showing details of physiological significance that are difficult to detect visually in the original CMR image. The probability mapping obtained from the two features provides a way to define different cardiac segments, which offers a way to identify areas in the myocardium of diagnostic importance (like core and border areas in scarred myocardium).
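
    As an illustration of the Bayes-rule mapping described above, reduced to a single intensity feature with Gaussian class-conditional models (all parameter values hypothetical; the paper additionally uses texture features):

```python
import numpy as np

def gaussian(x, mu, sigma):
    """Gaussian probability density evaluated elementwise."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def scar_probability_map(intensity, mu_scar, sd_scar, mu_healthy, sd_healthy,
                         prior_scar=0.5):
    """Per-pixel posterior P(scar | intensity) via Bayes' rule, assuming
    Gaussian class-conditional intensity models for scar and healthy tissue."""
    w_scar = gaussian(intensity, mu_scar, sd_scar) * prior_scar
    w_healthy = gaussian(intensity, mu_healthy, sd_healthy) * (1.0 - prior_scar)
    return w_scar / (w_scar + w_healthy)

# Toy 4x4 "myocardium" intensity patch (arbitrary units)
img = np.array([[30, 35, 90, 95],
                [32, 40, 85, 92],
                [28, 60, 70, 88],
                [31, 33, 38, 80]], dtype=float)
pmap = scar_probability_map(img, mu_scar=85, sd_scar=10, mu_healthy=35, sd_healthy=8)
print(np.round(pmap, 2))
```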

  19. Probability mapping of scarred myocardium using texture and intensity features in CMR images

    Science.gov (United States)

    2013-01-01

    Background: The myocardium exhibits a heterogeneous nature due to scarring after Myocardial Infarction (MI). In Cardiac Magnetic Resonance (CMR) imaging, Late Gadolinium (LG) contrast agent enhances the intensity of the scarred area in the myocardium. Methods: In this paper, we propose a probability mapping technique using texture and intensity features to describe the heterogeneous nature of the scarred myocardium in CMR images after MI. Scarred tissue and non-scarred tissue are represented with high and low probabilities, respectively. Intermediate values possibly indicate areas where the scarred and healthy tissues are interwoven. The probability map of scarred myocardium is calculated by using a probability function based on Bayes' rule. Any set of features can be used in the probability function. Results: In the present study, we demonstrate the use of two different types of features. One is based on the mean pixel intensity and the other on the underlying texture information of the scarred and non-scarred myocardium. Examples of probability maps computed using the mean pixel intensity and the underlying texture information are presented. We hypothesize that the probability mapping of the myocardium offers an alternative visualization, possibly showing details of physiological significance that are difficult to detect visually in the original CMR image. Conclusion: The probability mapping obtained from the two features provides a way to define different cardiac segments, which offers a way to identify areas in the myocardium of diagnostic importance (like core and border areas in scarred myocardium). PMID:24053280

  20. Solution of a torsional Schrödinger equation with a periodic potential of general form. The probability amplitude and probability density

    Science.gov (United States)

    Turovtsev, V. V.; Orlov, M. Yu.; Orlov, Yu. D.

    2017-08-01

    Analytic expressions for the probability density of states of a molecule with internal rotation and the probability of finding the state in the potential well are derived for the first time. Two methods are proposed for assigning conformers to potential wells. A quantitative measure of localization and delocalization of a state in the potential well is introduced. The rotational symmetry number is generalized to the case of asymmetric rotation. On the basis of the localization criterion, a model is developed for calculating the internal rotation contribution to thermodynamic properties of individual conformers with low rotational barriers and/or at a high temperature.

  1. Gravitational constant calculation methodologies

    OpenAIRE

    Shakhparonov, V. M.; Karagioz, O. V.; Izmailov, V. P.

    2011-01-01

    We consider the gravitational constant calculation methodologies for a rectangular block of the torsion balance body presented in the papers Phys. Rev. Lett. 102, 240801 (2009) and Phys. Rev. D 82, 022001 (2010). We have established the influence of non-equilibrium gas flows on the obtained values of G.

  2. Cosmological constraints from the convergence 1-point probability distribution

    Science.gov (United States)

    Patton, Kenneth; Blazek, Jonathan; Honscheid, Klaus; Huff, Eric; Melchior, Peter; Ross, Ashley J.; Suchyta, Eric

    2017-11-01

    We examine the cosmological information available from the 1-point probability density function (PDF) of the weak-lensing convergence field, utilizing fast L-PICOLA simulations and a Fisher analysis. We find competitive constraints in the Ωm-σ8 plane from the convergence PDF with 188 arcmin² pixels compared to the cosmic shear power spectrum with an equivalent number of modes (ℓ < ...). The PDF also partially breaks the degeneracy cosmic shear exhibits in that parameter space. A joint analysis of the convergence PDF and shear 2-point function also reduces the impact of shape measurement systematics, to which the PDF is less susceptible, and improves the total figure of merit by a factor of 2-3, depending on the level of systematics. Finally, we present a correction factor necessary for calculating the unbiased Fisher information from finite differences using a limited number of cosmological simulations.

  3. Microstructure density generation for backlight display using probability analysis method

    Science.gov (United States)

    Lin, Shang-fei; Su, Cheng-yue; Feng, Zu-yong; Li, Xiao-duan

    2017-11-01

    This paper proposes a new 1D density generation method for micro prisms applied in light guide plate (LGP) design. It has three parameters: the width of the micro prisms, the ratio of cells, and the probability values. After calculating the new densities, we apply them to a dynamical low-discrepancy sequence method to generate the micro-prism distribution. We build models of 2.7-inch, 7-inch and 14-inch panels in optical software. Our simulation results indicate that the luminance uniformity ranges between 89.9% and 92.2%, and the light utilization ranges between 79.5% and 80.0%. The new density generation method is shown to be practicable for small and medium-size backlight unit (BLU) designs. It also helps shorten the design period by replacing the determination of prism numbers during initialization.
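
    The generic step of turning a 1D target density into deterministic dot positions can be sketched as inverse-transform mapping of a low-discrepancy (van der Corput) sequence; this illustrates the idea, not the authors' exact dynamical method:

        import numpy as np

        def van_der_corput(n, base=2):
            """First n terms of the base-b van der Corput low-discrepancy sequence."""
            seq = np.zeros(n)
            for i in range(n):
                f, k = 1.0, i + 1
                while k > 0:
                    f /= base
                    seq[i] += f * (k % base)
                    k //= base
            return seq

        # Illustrative prism density along the LGP (denser far from the light source).
        x = np.linspace(0.0, 1.0, 512)
        density = 0.2 + 0.8 * x ** 2
        cdf = np.cumsum(density)
        cdf /= cdf[-1]

        # Map uniform low-discrepancy points through the inverse CDF.
        positions = np.interp(van_der_corput(200), cdf, x)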

  4. Market-implied risk-neutral probabilities, actual probabilities, credit risk and news

    Directory of Open Access Journals (Sweden)

    Shashidhar Murthy

    2011-09-01

    Motivated by the credit crisis, this paper investigates links between risk-neutral probabilities of default implied by markets (e.g. from yield spreads) and their actual counterparts (e.g. from ratings). It discusses differences between the two and clarifies the underlying economic intuition using simple representations of credit risk pricing. The large differences observed across bonds in the ratio of the two probabilities are shown to imply that apparently safer securities can be more sensitive to news.
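
    A back-of-the-envelope sketch of the quantity compared in the paper: under a flat-hazard approximation, the market-implied (risk-neutral) default probability is roughly the yield spread divided by the loss given default, while the actual probability comes from, e.g., rating histories. All numbers below are illustrative assumptions:

        # q ~= spread / (1 - recovery): risk-neutral default probability from a spread.
        def risk_neutral_pd(spread, recovery=0.4):
            return spread / (1.0 - recovery)

        spread = 0.02      # 200 bp yield spread over the riskless rate
        p_actual = 0.005   # historical default frequency for the rating class
        q = risk_neutral_pd(spread)
        print(f"q = {q:.4f}, q/p = {q / p_actual:.1f}")
        # A large q/p ratio indicates a bond whose price reacts to news far more
        # strongly than its rating alone would suggest.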

  5. Correlation between the clinical pretest probability score and the lung ventilation and perfusion scan probability

    OpenAIRE

    Bhoobalan, Shanmugasundaram; Chakravartty, Riddhika; Dolbear, Gill; Al-Janabi, Mazin

    2013-01-01

    Purpose: The aim of the study was to determine the accuracy of the clinical pretest probability (PTP) score and its association with the lung ventilation and perfusion (VQ) scan. Materials and Methods: A retrospective analysis of 510 patients who had a lung VQ scan between 2008 and 2010 was included in the study. Of the 510 studies, the numbers of normal, low, and high probability VQ scans were 155 (30%), 289 (57%), and 55 (11%), respectively. Results: A total of 103 patients underwent computed tomog...

  6. Calibration of weather radar using region probability matching method (RPMM)

    Science.gov (United States)

    Ayat, Hooman; Reza Kavianpour, M.; Moazami, Saber; Hong, Yang; Ghaemi, Esmail

    2017-09-01

    This research develops a novel method, the region probability matching method (RPMM), for calibrating the Amir-Abad weather radar located in the north of Iran. The approach can also overcome the limitations of the probability matching method (PMM), the window probability matching method (WPMM), and the window correlation matching method (WCMM), all of which are associated with large errors when used to calibrate radar in light precipitation. Additionally, in developing countries like Iran, where ground stations have low temporal resolution, these methods cannot be applied effectively. In these circumstances, RPMM, utilizing 18 synoptic stations with a temporal resolution of 6 h and radar data with a temporal resolution of 15 min, produced an accurate estimate of cumulative precipitation over the entire study area in a specific period. In a comparison of RPMM with the traditional matching method (TMM) on March 22, 2014, the correlation coefficients obtained for TMM and RPMM were 0.13 and 0.95, respectively; the cumulative precipitation of all rain gauges and the calibrated radar precipitation at the same pixels were 38.5 and 36.9 mm, respectively. These results demonstrate the inefficiency of TMM and the capability of RPMM in calibrating the Amir-Abad weather radar. In addition, to determine the uncertainty associated with the calculated values of A and B in the Z_e-R relation, a sensitivity analysis was carried out for the estimation of cumulative light precipitation over the period from 2014 to 2015. The results showed that, in the worst case, 69% of the radar data are converted to R values with a maximum error of less than 30%.
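
    Probability matching methods pair radar reflectivity and gauge rain rate at equal cumulative probabilities rather than at matched times and places, then fit the Z_e = A * R**B relation. A generic sketch of that idea with synthetic data (not the specific RPMM regionalization):

        import numpy as np

        def probability_match(z_dbz, r_gauge, n_quantiles=50):
            """Pair Z and R at equal CDF levels, then fit Z = A * R**B."""
            q = np.linspace(0.5, 0.995, n_quantiles)
            z_lin = 10.0 ** (np.quantile(z_dbz, q) / 10.0)   # dBZ -> linear Z
            r_q = np.quantile(r_gauge, q)
            B, log_A = np.polyfit(np.log(r_q), np.log(z_lin), 1)
            return np.exp(log_A), B

        # Synthetic stand-ins for radar pixels and gauge records.
        rng = np.random.default_rng(1)
        r = rng.gamma(2.0, 2.0, 5000)                        # rain rate, mm/h
        z = 10.0 * np.log10(200.0 * r ** 1.6) + rng.normal(0.0, 1.0, 5000)
        A, B = probability_match(z, r)
        print(f"Z = {A:.0f} * R^{B:.2f}")                    # recovers roughly 200, 1.6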

  7. Probability of failure of waste disposal sites in Žirovski vrh uranium mine

    Directory of Open Access Journals (Sweden)

    Tomaž Beguš

    2002-12-01

    The only uranium mine in Slovenia, Žirovski vrh, was closed in 1990 for economic reasons. After the closure, extensive decommissioning works began in the mine and its surroundings. Soon after the closure, a large landslide occurred at the mill tailings site, and the stability of the existing and alternative sites was recalculated. In these calculations I used the statistical scatter of the input variables and calculated the probability of failure of the sites.
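
    A minimal Monte Carlo sketch of this approach: propagate the statistical scatter of the input variables through a stability model and count the fraction of samples whose factor of safety falls below 1. The infinite-slope model and all parameter values are illustrative assumptions, not the site's data:

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000

        # Scattered inputs: cohesion (kPa) and friction angle (degrees).
        c = rng.normal(18.0, 4.0, n)
        phi = np.radians(rng.normal(32.0, 3.0, n))
        gamma, h = 20.0, 8.0            # unit weight (kN/m^3) and depth (m)
        beta = np.radians(33.0)         # slope angle

        # Infinite-slope factor of safety (no pore pressure).
        fos = (c + gamma * h * np.cos(beta) ** 2 * np.tan(phi)) / (
            gamma * h * np.sin(beta) * np.cos(beta)
        )
        print(f"P(failure) = {np.mean(fos < 1.0):.4f}")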

  8. Probability of failure of waste disposal sites in Žirovski vrh uranium mine

    OpenAIRE

    Tomaž Beguš

    2002-01-01

    The only uranium mine in Slovenia, Žirovski vrh, was closed in 1990 for economic reasons. After the closure, extensive decommissioning works began in the mine and its surroundings. Soon after the closure, a large landslide occurred at the mill tailings site, and the stability of the existing and alternative sites was recalculated. In these calculations I used the statistical scatter of the input variables and calculated the probability of failure of the sites.

  9. Multidimensional rare event probability estimation algorithm

    Directory of Open Access Journals (Sweden)

    Leonidas Sakalauskas

    2013-09-01

    This work presents a Markov chain Monte Carlo algorithm for estimating the frequencies of multidimensional rare events. The logits of the rare-event likelihood are modeled with a Poisson distribution whose parameters are distributed according to a multivariate normal law with unknown parameters: the mean vector and the covariance matrix. The estimates of the unknown parameters are calculated by the maximum likelihood method. Equations are derived that must be satisfied by the model's maximum likelihood parameter estimates. Positive definiteness of the evaluated covariance matrices is controlled by calculating the ratio between the maximum and minimum eigenvalues of each matrix.
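
    A small sketch of the positive-definiteness control mentioned in the abstract: monitor the ratio between the maximum and minimum eigenvalues of the estimated covariance matrix. The sample data are illustrative:

        import numpy as np

        def covariance_ok(cov, max_ratio=1e8):
            """Positive definiteness check via the eigenvalue spectrum."""
            eig = np.linalg.eigvalsh(cov)          # ascending order
            if eig[0] <= 0.0:
                return False, float("inf")
            ratio = eig[-1] / eig[0]
            return ratio < max_ratio, ratio

        rng = np.random.default_rng(3)
        samples = rng.multivariate_normal([0.0, 0.0, 0.0], np.eye(3), size=200)
        ok, ratio = covariance_ok(np.cov(samples.T))
        print(ok, f"lambda_max / lambda_min = {ratio:.1f}")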

  10. Greek paideia and terms of probability

    Directory of Open Access Journals (Sweden)

    Fernando Leon Parada

    2016-06-01

    This paper addresses three aspects of the conceptual framework for doctoral dissertation research in progress in the field of Mathematics Education, in particular in the subfield of teaching and learning basic concepts of Probability Theory at the college level. It intends to contrast, sustain and elucidate the central statement that the meanings of some of the basic terms used in Probability Theory were not formally defined by any specific theory but relate to primordial ideas developed in Western culture from Ancient Greek myths. The first aspect deals with the notion of uncertainty, with which Greek thinkers described several archaic gods and goddesses of Destiny, like the Parcae and the Moirai, often personified in the goddess Tyche (Fortuna for the Romans), as regarded in Werner Jaeger's "Paideia". The second aspect treats the idea of hazard from two different approaches: the first deals with hazard, denoted by Plato with the already demythologized term 'tyche', from the viewpoint of innate knowledge, as Jaeger points out; the second deals with hazard from a perspective that could be called "phenomenological", from which Aristotle attempted to articulate uncertainty with a discourse based on the hypothesis of causality. The term 'causal' was opposed both to 'casual' and to 'spontaneous' (as used in the expression "spontaneous generation"), attributing uncertainty to ignorance of the future and thus respecting causal flow. The third aspect treated in the paper refers to definitions and etymologies of some other modern words that have become technical terms in current Probability Theory, confirming the above-mentioned main proposition of this paper.

  11. Probability in biology: overview of a comprehensive theory of probability in living systems.

    Science.gov (United States)

    Nakajima, Toshiyuki

    2013-09-01

    Probability is closely related to biological organization and adaptation to the environment. Living systems need to maintain their organizational order by producing specific internal events non-randomly, and must cope with uncertain environments. These processes involve increasing the probability of events favorable to these systems by reducing the degree of uncertainty of events. Systems with this ability will survive and reproduce more than those with less of it. Probabilistic phenomena have been deeply explored using the mathematical theory of probability since Kolmogorov's axiomatization provided mathematical consistency for the theory. However, the interpretation of the concept of probability remains unresolved and controversial, which creates problems when the mathematical theory is applied to real systems. In this article, recent advances in the study of the foundations of probability from a biological viewpoint are reviewed, and a new perspective is discussed toward a comprehensive theory of probability for understanding the organization and adaptation of living systems. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. Probability-based TCP congestion control mechanism

    Science.gov (United States)

    Xu, Changbiao; Yang, Shizhong; Xian, Yongju

    2005-11-01

    To mitigate TCP global synchronization and improve network throughput, an improved TCP congestion control mechanism, P-TCP, is proposed; it adopts a probability-based approach to adjust the congestion window independently when network congestion occurs. As a result, some P-TCP connections may decrease their congestion window greatly while other P-TCP connections decrease it only slightly. Simulation results show that TCP global synchronization can be effectively mitigated, which leads to efficient utilization of network resources as well as effective relief of network congestion. The simulation results also provide valuable references for determining the related parameters of P-TCP.
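
    A toy sketch of the probability-based adjustment described here: on a congestion signal, each connection draws its multiplicative decrease factor at random instead of always halving. The uniform [0.5, 1.0) draw is an assumption for illustration, not P-TCP's actual parameterization:

        import random

        def ptcp_on_congestion(cwnd):
            """Probabilistic multiplicative decrease of the congestion window."""
            beta = random.uniform(0.5, 1.0)   # classic TCP always uses 0.5
            return max(1.0, cwnd * beta)

        # Two connections see the same congestion event but back off by different
        # amounts, which desynchronizes their windows over time.
        print(ptcp_on_congestion(40.0), ptcp_on_congestion(40.0))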

  13. Probability based calibration of pressure coefficients

    DEFF Research Database (Denmark)

    Hansen, Svend Ole; Pedersen, Marie Louise; Sørensen, John Dalsgaard

    2015-01-01

    … does not depend on the type of variable action. A probability-based calibration of pressure coefficients has been carried out using pressure measurements on the standard CAARC building modelled at a scale of 1:383. The extreme pressures measured on the CAARC building model in the wind tunnel have been fitted to Gumbel distributions, and these fits are found to represent the measured data with good accuracy. The pressure distributions found have been used in a calibration of partial factors intended to achieve a given theoretical target reliability index. For a target annual reliability index of 4…
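
    The Gumbel fitting step can be sketched with scipy; the extreme pressure coefficients below are synthetic stand-ins for the wind-tunnel measurements, and the 98% fractile is an illustrative characteristic value:

        import numpy as np
        from scipy.stats import gumbel_r

        # Synthetic per-run extreme pressure coefficients on the model.
        rng = np.random.default_rng(7)
        extremes = gumbel_r.rvs(loc=1.8, scale=0.25, size=300, random_state=rng)

        loc, scale = gumbel_r.fit(extremes)
        cp_char = gumbel_r.ppf(0.98, loc=loc, scale=scale)
        print(f"loc = {loc:.3f}, scale = {scale:.3f}, 98% fractile = {cp_char:.3f}")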

  14. Maximum Probability Domains for Hubbard Models

    CERN Document Server

    Acke, Guillaume; Claeys, Pieter W; Van Raemdonck, Mario; Poelmans, Ward; Van Neck, Dimitri; Bultinck, Patrick

    2015-01-01

    The theory of Maximum Probability Domains (MPDs) is formulated for the Hubbard model in terms of projection operators and generating functions, for both exact eigenstates and Slater determinants. A fast MPD analysis procedure is proposed, which is subsequently used to analyse numerical results for the Hubbard model. It is shown that the essential physics behind the considered Hubbard models can be exposed using MPDs. Furthermore, the MPDs appear to be in line with what is expected from Valence Bond Theory-based knowledge.

  15. Modulation Based on Probability Density Functions

    Science.gov (United States)

    Williams, Glenn L.

    2009-01-01

    A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.
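
    The PDF underlying the proposed modulation can be seen by histogramming samples of one full cycle of a sinusoid, which yields the arcsine-shaped density 1/(pi*sqrt(1 - x**2)); a minimal sketch:

        import numpy as np

        t = np.linspace(0.0, 1.0, 100_000, endpoint=False)
        x = np.sin(2.0 * np.pi * t)                   # one full carrier cycle

        hist, edges = np.histogram(x, bins=21, range=(-1, 1), density=True)
        centers = 0.5 * (edges[:-1] + edges[1:])
        # The empirical PDF piles up near +/-1, as the arcsine density predicts.
        for c, h in zip(centers, hist):
            print(f"{c:+.2f}  {h:.2f}")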

  16. Atomic transition probabilities of Er I

    Energy Technology Data Exchange (ETDEWEB)

    Lawler, J E; Den Hartog, E A [Department of Physics, University of Wisconsin, 1150 University Ave., Madison, WI 53706 (United States); Wyart, J-F, E-mail: jelawler@wisc.ed, E-mail: jean-francois.wyart@lac.u-psud.f, E-mail: eadenhar@wisc.ed [Laboratoire Aime Cotton, CNRS (UPR3321), Bat. 505, Centre Universitaire Paris-Sud, 91405-Orsay (France)

    2010-12-14

    Atomic transition probabilities for 562 lines of the first spectrum of erbium (Er I) are reported. These data are from new branching fraction measurements on Fourier transform spectra normalized with previously reported radiative lifetimes from time-resolved laser-induced fluorescence measurements (Den Hartog et al 2010 J. Phys. B: At. Mol. Opt. Phys. 43 155004). The wavelength range of the data set is from 298 to 1981 nm. In this work we explore the utility of parametric fits based on the Cowan code in assessing branching fraction errors due to lines connecting to unobserved lower levels.
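
    The conversion behind such measurements is compact: a line's transition probability is its branching fraction divided by the radiative lifetime of the upper level, A_ki = BF_ki / tau_k. A sketch with illustrative numbers (not measured Er I data):

        def transition_probability(branching_fraction, lifetime_s):
            """A_ki = BF_ki / tau_k, in s^-1."""
            return branching_fraction / lifetime_s

        # Illustrative line: 40% branching fraction, 300 ns upper-level lifetime.
        print(f"A = {transition_probability(0.40, 300e-9):.3e} s^-1")  # ~1.3e6 s^-1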

  17. Atomic transition probabilities of Gd I

    Energy Technology Data Exchange (ETDEWEB)

    Lawler, J E; Den Hartog, E A [Department of Physics, University of Wisconsin, 1150 University Avenue, Madison, WI 53706 (United States); Bilty, K A, E-mail: jelawler@wisc.edu, E-mail: biltyka@uwec.edu, E-mail: eadenhar@wisc.edu [Department of Physics and Astronomy, University of Wisconsin-Eau Claire, Eau Claire, WI 54702 (United States)

    2011-05-14

    Fourier transform spectra are used to determine emission branching fractions for 1290 lines of the first spectrum of gadolinium (Gd I). These branching fractions are converted to absolute atomic transition probabilities using previously reported radiative lifetimes from time-resolved laser-induced-fluorescence measurements (Den Hartog et al 2011 J. Phys. B: At. Mol. Opt. Phys. 44 055001). The wavelength range of the data set is from 300 to 1850 nm. A least squares technique for separating blends of the first and second spectra lines is also described and demonstrated in this work.

  18. Probability density of quantum expectation values

    Science.gov (United States)

    Campos Venuti, L.; Zanardi, P.

    2013-10-01

    We consider the quantum expectation value ⟨A⟩ = ⟨ψ|A|ψ⟩ of an observable A over the state |ψ⟩. We derive the exact probability distribution of ⟨A⟩ seen as a random variable when |ψ⟩ varies over the set of all pure states equipped with the Haar-induced measure. To illustrate our results we compare the exact predictions for a few concrete examples with the concentration bounds obtained using Lévy's lemma. We also comment on the relevance of the central limit theorem and finally draw some results on an alternative statistical mechanics based on the uniform measure on the energy shell.
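
    The Haar-induced distribution of ⟨A⟩ is straightforward to sample, since a normalized vector of i.i.d. complex Gaussians is Haar-distributed over pure states; a minimal sketch with an illustrative diagonal observable:

        import numpy as np

        rng = np.random.default_rng(0)
        d = 16                                      # Hilbert-space dimension
        A = np.diag(np.linspace(-1.0, 1.0, d))      # illustrative observable

        def haar_state(d, rng):
            v = rng.normal(size=d) + 1j * rng.normal(size=d)
            return v / np.linalg.norm(v)

        vals = np.array([np.real(np.conj(p) @ A @ p)
                         for p in (haar_state(d, rng) for _ in range(20_000))])
        # For large d the values concentrate around tr(A)/d, consistent with
        # Levy's-lemma-type concentration bounds.
        print(vals.mean(), vals.std())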

  19. Snell Envelope with Small Probability Criteria

    Energy Technology Data Exchange (ETDEWEB)

    Del Moral, Pierre, E-mail: Pierre.Del-Moral@inria.fr; Hu, Peng, E-mail: Peng.Hu@inria.fr [Universite de Bordeaux I, Centre INRIA Bordeaux et Sud-Ouest and Institut de Mathematiques de Bordeaux (France); Oudjane, Nadia, E-mail: Nadia.Oudjane@edf.fr [EDF R and D Clamart (France)

    2012-12-15

    We present a new algorithm to compute the Snell envelope in the specific case where the criterion to be optimized is associated with a small probability or a rare event. This new approach combines the stochastic mesh approach of Broadie and Glasserman with a particle approximation scheme based on a specific change of measure designed to concentrate the computational effort in regions pointed out by the criterion. The theoretical analysis of this new algorithm provides non-asymptotic convergence estimates. Finally, numerical tests confirm the practical interest of this approach.

  20. An introduction to probability and statistical inference

    CERN Document Server

    Roussas, George G

    2003-01-01

    "The text is wonderfully written and has the mostcomprehensive range of exercise problems that I have ever seen." - Tapas K. Das, University of South Florida"The exposition is great; a mixture between conversational tones and formal mathematics; the appropriate combination for a math text at [this] level. In my examination I could find no instance where I could improve the book." - H. Pat Goeters, Auburn, University, Alabama* Contains more than 200 illustrative examples discussed in detail, plus scores of numerical examples and applications* Chapters 1-8 can be used independently for an introductory course in probability* Provides a substantial number of proofs