WorldWideScience

Sample records for high phonotactic probability

  1. The Influence of Phonotactic Probability on Word Recognition in Toddlers

    Science.gov (United States)

    MacRoy-Higgins, Michelle; Shafer, Valerie L.; Schwartz, Richard G.; Marton, Klara

    2014-01-01

    This study examined the influence of phonotactic probability on word recognition in English-speaking toddlers. Typically developing toddlers completed a preferential looking paradigm using familiar words, which consisted of either high or low phonotactic probability sound sequences. The participants' looking behavior was recorded in response to…

  2. The effect of incremental changes in phonotactic probability and neighborhood density on word learning by preschool children

    Science.gov (United States)

    Storkel, Holly L.; Bontempo, Daniel E.; Aschenbrenner, Andrew J.; Maekawa, Junko; Lee, Su-Yeon

    2013-01-01

    Purpose: Phonotactic probability or neighborhood density has predominantly been defined using gross distinctions (i.e., low vs. high). The current studies examined the influence of finer changes in probability (Experiment 1) and density (Experiment 2) on word learning. Method: The full range of probability or density was examined by sampling five nonwords from each of four quartiles. Three- and 5-year-old children received training on nonword-nonobject pairs. Learning was measured in a picture-naming task immediately following training and 1 week after training. Results were analyzed using multi-level modeling. Results: A linear spline model best captured nonlinearities in phonotactic probability. Specifically, word learning improved as probability increased in the lowest quartile, worsened as probability increased in the mid-low quartile, and then remained stable and poor in the two highest quartiles. An ordinary linear model sufficiently described neighborhood density: word learning improved as density increased across all quartiles. Conclusion: Given these different patterns, phonotactic probability and neighborhood density appear to influence different word learning processes. Specifically, phonotactic probability may affect recognition that a sound sequence is an acceptable word in the language and is a novel word for the child, whereas neighborhood density may influence creation of a new representation in long-term memory. PMID:23882005
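
    To make the "linear spline" idea concrete, the following is a minimal, hypothetical sketch (not the authors' code, which used multi-level models of child-level data): a piecewise-linear regression with knots at the quartile boundaries of the predictor, compared against an ordinary linear fit. All data and variable names are random placeholders.

```python
# Hypothetical sketch of a linear spline (piecewise-linear) regression with
# knots at quartile boundaries, versus an ordinary linear model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
prob = rng.uniform(0.0, 1.0, 200)          # placeholder: standardized phonotactic probability
accuracy = rng.uniform(0.0, 1.0, 200)      # placeholder: picture-naming accuracy

knots = np.quantile(prob, [0.25, 0.50, 0.75])   # quartile boundaries as knots

# Truncated power basis: the slope is allowed to change at each knot.
X = np.column_stack([prob] + [np.maximum(prob - k, 0.0) for k in knots])
X = sm.add_constant(X)

spline_fit = sm.OLS(accuracy, X).fit()
linear_fit = sm.OLS(accuracy, sm.add_constant(prob)).fit()

# Compare the spline against the ordinary linear model, e.g. via AIC.
print("AIC spline:", spline_fit.aic, " AIC linear:", linear_fit.aic)
```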

  3. Phonotactic probability of brand names: I'd buy that!

    Science.gov (United States)

    Vitevitch, Michael S; Donoso, Alexander J

    2012-11-01

    Psycholinguistic research shows that word characteristics influence the speed and accuracy of various language-related processes. Analogous characteristics of brand names influence the retrieval of product information and the perception of risks associated with that product. In the present experiment we examined how phonotactic probability (the frequency with which phonological segments and sequences of segments appear in a word) might influence consumer behavior. Participants rated brand names that varied in phonotactic probability on the likelihood that they would buy the product. Participants indicated that they were more likely to purchase a product if the brand name comprised common segments and sequences of segments rather than less common segments and sequences of segments. This result suggests that word characteristics may influence higher-level cognitive processes, in addition to language-related processes. Furthermore, the benefits of using objective measures of word characteristics in the design of brand names are discussed.

  4. The influence of phonotactic probability and neighborhood density on children's production of newly learned words.

    Science.gov (United States)

    Heisler, Lori; Goffman, Lisa

    A word learning paradigm was used to teach children novel words that varied in phonotactic probability and neighborhood density. The effects of frequency and density on speech production were examined when phonetic forms were non-referential (i.e., when no referent was attached) and when phonetic forms were referential (i.e., when a referent was attached through fast mapping). Two methods of analysis were included: (1) kinematic variability of speech movement patterning; and (2) measures of segmental accuracy. Results showed that phonotactic frequency influenced the stability of movement patterning whereas neighborhood density influenced phoneme accuracy. Motor learning was observed in both non-referential and referential novel words. Forms with low phonotactic probability and low neighborhood density showed a word learning effect when a referent was assigned during fast mapping. These results elaborate on and specify the nature of interactivity observed across lexical, phonological, and articulatory domains.

  5. The Effect of Incremental Changes in Phonotactic Probability and Neighborhood Density on Word Learning by Preschool Children

    Science.gov (United States)

    Storkel, Holly L.; Bontempo, Daniel E.; Aschenbrenner, Andrew J.; Maekawa, Junko; Lee, Su-Yeon

    2013-01-01

    Purpose: Phonotactic probability or neighborhood density has predominantly been defined through the use of gross distinctions (i.e., low vs. high). In the current studies, the authors examined the influence of finer changes in probability (Experiment 1) and density (Experiment 2) on word learning. Method: The authors examined the full range of…

  6. Phonotactics, Neighborhood Activation, and Lexical Access for Spoken Words

    Science.gov (United States)

    Vitevitch, Michael S.; Luce, Paul A.; Pisoni, David B.; Auer, Edward T.

    2012-01-01

    Probabilistic phonotactics refers to the relative frequencies of segments and sequences of segments in spoken words. Neighborhood density refers to the number of words that are phonologically similar to a given word. Despite a positive correlation between phonotactic probability and neighborhood density, nonsense words with high probability segments and sequences are responded to more quickly than nonsense words with low probability segments and sequences, whereas real words occurring in dense similarity neighborhoods are responded to more slowly than real words occurring in sparse similarity neighborhoods. This contradiction may be resolved by hypothesizing that effects of probabilistic phonotactics have a sublexical focus and that effects of similarity neighborhood density have a lexical focus. The implications of this hypothesis for models of spoken word recognition are discussed. PMID:10433774

  7. Grammars Leak: Modeling How Phonotactic Generalizations Interact within the Grammar

    Science.gov (United States)

    Martin, Andrew

    2011-01-01

    I present evidence from Navajo and English that weaker, gradient versions of morpheme-internal phonotactic constraints, such as the ban on geminate consonants in English, hold even across prosodic word boundaries. I argue that these lexical biases are the result of a MAXIMUM ENTROPY phonotactic learning algorithm that maximizes the probability of…

  8. Growth of verbal short-term memory of nonwords varying in phonotactic probability : A longitudinal study with monolingual and bilingual children

    NARCIS (Netherlands)

    Messer, Marielle H.; Verhagen, Josje; Boom, Jan; Mayo, Aziza Y.; Leseman, Paul P. M.

    2015-01-01

    This study investigates the hypothesis that verbal short-term memory growth in young children can be explained by increases in long-term linguistic knowledge. To this aim, we compare children's recall of nonwords varying in phonotactic probability. If our assumption holds, there should be growth in

  9. Word Recognition and Nonword Repetition in Children with Language Disorders: The Effects of Neighborhood Density, Lexical Frequency, and Phonotactic Probability

    Science.gov (United States)

    Rispens, Judith; Baker, Anne; Duinmeijer, Iris

    2015-01-01

    Purpose: The effects of neighborhood density (ND) and lexical frequency on word recognition and the effects of phonotactic probability (PP) on nonword repetition (NWR) were examined to gain insight into processing at the lexical and sublexical levels in typically developing (TD) children and children with developmental language problems. Method:…

  10. The Influence of Phonotactic Probability on Nonword Repetition and Fast Mapping in 3-Year-Olds with a History of Expressive Language Delay

    Science.gov (United States)

    MacRoy-Higgins, Michelle; Dalton, Kevin Patrick

    2015-01-01

    Purpose: The purpose of this study was to examine the influence of phonotactic probability on sublexical (phonological) and lexical representations in 3-year-olds who had a history of being late talkers in comparison with their peers with typical language development. Method: Ten 3-year-olds who were late talkers and 10 age-matched typically…

  11. Phonotactic awareness deficit following left-hemisphere stroke

    Directory of Open Access Journals (Sweden)

    Maryam Ghaleh

    2015-04-01

    Likert-type scale responses were z-transformed and coded accurate for positive z-values in condition 3 trials and negative z-values in condition 1 trials. Accuracy was analyzed using binomial mixed effects models, and z-transformed scale responses were analyzed using linear mixed effects models. For both analyses, the fixed effects of stimulus, trial number, group (patient/control), education, age, response time, phonotactic regularity (condition 1 vs. 3), and gender were examined along with all relevant interactions. Random effects for participant and stimuli as well as random slopes were also included. Model fitting was performed in a backward-stepwise iterative fashion, followed by forward fitting of a maximal random effects structure. Models were evaluated by model fitness comparisons using the Akaike Information Criterion and Bayesian Information Criterion. Accuracy analysis revealed that healthy participants were significantly more accurate than patients [β = 0.47, p < 0.001] in Englishness rating. Scale response analysis revealed a significant effect of phonotactic regularity [β = 1.65, p < 0.0001], indicating that participants were sensitive to phonotactic regularity differences among non-words. However, the significant interaction of group and phonotactic regularity [β = -0.5, p = 0.02] further demonstrated that, compared to healthy adults, patients were less able to recognize the phonotactic regularity differences between non-words. Results suggest that left-hemisphere lesions cause impaired phonotactic processing and that the left hemisphere might be necessary for phonotactic awareness. These preliminary findings will be followed up by further analyses investigating the interactions between phonotactic processing and participants' scores on other linguistic/cognitive tasks, as well as lesion-symptom mapping.
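
    As a rough illustration of the kind of model described above (not the study's actual analysis, which also included crossed random effects for stimuli, random slopes, and the stepwise AIC/BIC comparison), a linear mixed-effects fit of z-transformed ratings with a random intercept per participant might look like the following. All data and column names are random placeholders.

```python
# Hypothetical sketch of a linear mixed-effects model for z-transformed
# Englishness ratings with a participant random intercept (statsmodels).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 400
ratings = pd.DataFrame({
    "z_rating": rng.normal(size=n),                       # z-transformed scale response
    "group": rng.choice(["patient", "control"], size=n),  # patient vs. control
    "regularity": rng.choice([1, 3], size=n),             # phonotactic regularity condition
    "trial": rng.integers(1, 61, size=n),
    "participant": rng.choice([f"p{i}" for i in range(20)], size=n),
})

model = smf.mixedlm("z_rating ~ group * regularity + trial",
                    data=ratings,
                    groups=ratings["participant"])
result = model.fit()
print(result.summary())
```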

  12. Japanese Listeners' Perceptions of Phonotactic Violations

    Science.gov (United States)

    Fais, Laurel; Kajikawa, Sachiyo; Werker, Janet; Amano, Shigeaki

    2005-01-01

    The canonical form for Japanese words is (Consonant)Vowel(Consonant) Vowel[approximately]. However, a regular process of high vowel devoicing between voiceless consonants and word-finally after voiceless consonants results in consonant clusters and word-final consonants, apparent violations of that phonotactic pattern. We investigated Japanese…

  13. Phonotactic Probability Effect in Nonword Recall and Its Relationship with Vocabulary in Monolingual and Bilingual Preschoolers

    Science.gov (United States)

    Messer, Marielle H.; Leseman, Paul P. M.; Boom, Jan; Mayo, Aziza Y.

    2010-01-01

    The current study examined to what extent information in long-term memory concerning the distribution of phoneme clusters in a language, so-called long-term phonotactic knowledge, increased the capacity of verbal short-term memory in young language learners and, through increased verbal short-term memory capacity, supported these children's first…

  14. Phonotactic Constraints: Implications for Models of Oral Reading in Russian

    Science.gov (United States)

    Ulicheva, Anastasia; Coltheart, Max; Saunders, Steven; Perry, Conrad

    2016-01-01

    The present article investigates how phonotactic rules constrain oral reading in the Russian language. The pronunciation of letters in Russian is regular and consistent, but it is subject to substantial phonotactic influence: the position of a phoneme and its phonological context within a word can alter its pronunciation. In Part 1 of the article,…

  15. Lexical mediation of phonotactic frequency effects on spoken word recognition: A Granger causality analysis of MRI-constrained MEG/EEG data.

    Science.gov (United States)

    Gow, David W; Olson, Bruna B

    2015-07-01

    Phonotactic frequency effects play a crucial role in a number of debates over language processing and representation. It is unclear, however, whether these effects reflect prelexical sensitivity to phonotactic frequency or lexical "gang effects" in speech perception. In this paper, we use Granger causality analysis of MRI-constrained MEG/EEG data to understand how phonotactic frequency influences neural processing dynamics during auditory lexical decision. Effective connectivity analysis showed weaker feedforward influence from brain regions involved in acoustic-phonetic processing (superior temporal gyrus) to lexical areas (supramarginal gyrus) for high phonotactic frequency words, but stronger top-down lexical influence for the same items. Low entropy nonwords (nonwords judged to closely resemble real words) showed a similar pattern of interactions between brain regions involved in lexical and acoustic-phonetic processing. These results contradict the predictions of a feedforward model of phonotactic frequency facilitation, but support the predictions of a lexically mediated account.

  16. Modelling the phonotactic structure of natural language words with simple recurrent networks

    NARCIS (Netherlands)

    Stoianov; Nerbonne, J; Bouma, H; Coppen, PA; van Halteren, H; Teunissen, L

    1998-01-01

    Simple Recurrent Networks (SRN) are Neural Network (connectionist) models able to process natural language. Phonotactics concerns the order of symbols in words. We continued an earlier unsuccessful trial to model the phonotactics of Dutch words with SRNs. In order to overcome the previously reported

  17. Beyond Phonotactic Frequency: Presentation Frequency Effects Word Productions in Specific Language Impairment

    Science.gov (United States)

    Plante, Elena; Bahl, Megha; Vance, Rebecca; Gerken, LouAnn

    2011-01-01

    Phonotactic frequency effects on word production are thought to reflect accumulated experience with a language. Here we demonstrate that frequency effects can also be obtained through short-term manipulations of the input to children. We presented children with nonwords in an experiment that systematically manipulated English phonotactic frequency…

  18. Phonotactics Constraints and the Spoken Word Recognition of Chinese Words in Speech

    Science.gov (United States)

    Yip, Michael C.

    2016-01-01

    Two word-spotting experiments were conducted to examine the question of whether native Cantonese listeners are constrained by phonotactics information in spoken word recognition of Chinese words in speech. Because no legal consonant clusters occurred within an individual Chinese word, this kind of categorical phonotactics information of Chinese…

  19. A Vowel Is a Vowel: Generalizing Newly Learned Phonotactic Constraints to New Contexts

    Science.gov (United States)

    Chambers, Kyle E.; Onishi, Kristine H.; Fisher, Cynthia

    2010-01-01

    Adults can learn novel phonotactic constraints from brief listening experience. We investigated the representations underlying phonotactic learning by testing generalization to syllables containing new vowels. Adults heard consonant-vowel-consonant study syllables in which particular consonants were artificially restricted to the onset or coda…

  20. Repair or Violation Detection? Pre-Attentive Processing Strategies of Phonotactic Illegality Demonstrated on the Constraint of g-Deletion in German

    Science.gov (United States)

    Steinberg, Johanna; Jacobsen, Thomas Konstantin; Jacobsen, Thomas

    2016-01-01

    Purpose: Effects of categorical phonotactic knowledge on pre-attentive speech processing were investigated by presenting illegal speech input that violated a phonotactic constraint in German called "g-deletion." The present study aimed to extend previous findings of automatic processing of phonotactic violations and to investigate the…

  1. Phonotactic spoken language identification with limited training data

    CSIR Research Space (South Africa)

    Peche, M

    2007-08-01

    The authors investigate the addition of a new language, for which limited resources are available, to a phonotactic language identification system. Two classes of approaches are studied: in the first class, only existing phonetic recognizers...

  2. Segmentation cues in conversational speech: Robust semantics and fragile phonotactics

    Directory of Open Access Journals (Sweden)

    Laurence eWhite

    2012-10-01

    Multiple cues influence listeners' segmentation of connected speech into words, but most previous studies have used stimuli elicited in careful readings rather than natural conversation. Discerning word boundaries in conversational speech may differ from the laboratory setting. In particular, a speaker's articulatory effort – hyperarticulation vs hypoarticulation (H&H) – may vary according to communicative demands, suggesting a compensatory relationship whereby acoustic-phonetic cues are attenuated when other information sources strongly guide segmentation. We examined how listeners' interpretation of segmentation cues is affected by speech style (spontaneous conversation vs read), using cross-modal identity priming. To elicit spontaneous stimuli, we used a map task in which speakers discussed routes around stylised landmarks. These landmarks were two-word phrases in which the strength of potential segmentation cues – semantic likelihood and cross-boundary diphone phonotactics – was systematically varied. Landmark-carrying utterances were transcribed and later re-recorded as read speech. Independent of speech style, we found an interaction between cue valence (favourable/unfavourable) and cue type (phonotactics/semantics). Thus, there was an effect of semantic plausibility, but no effect of cross-boundary phonotactics, indicating that the importance of phonotactic segmentation may have been overstated in studies where lexical information was artificially suppressed. These patterns were unaffected by whether the stimuli were elicited in a spontaneous or read context, even though the difference in speech styles was evident in a main effect. Durational analyses suggested speaker-driven cue trade-offs congruent with an H&H account, but these modulations did not impact on listener behaviour. We conclude that previous research exploiting read speech is reliable in indicating the primacy of lexically-based cues in the segmentation of natural

  3. The roles of long-term phonotactic and lexical prosodic knowledge in phonological short-term memory.

    Science.gov (United States)

    Tanida, Yuki; Ueno, Taiji; Lambon Ralph, Matthew A; Saito, Satoru

    2015-04-01

    Many previous studies have explored and confirmed the influence of long-term phonological representations on phonological short-term memory. In most investigations, phonological effects have been explored with respect to phonotactic constraints or frequency. If interaction between long-term memory and phonological short-term memory is a generalized principle, then other phonological characteristics-that is, suprasegmental aspects of phonology-should also exert similar effects on phonological short-term memory. We explored this hypothesis through three immediate serial-recall experiments that manipulated Japanese nonwords with respect to lexical prosody (pitch-accent type, reflecting suprasegmental characteristics) as well as phonotactic frequency (reflecting segmental characteristics). The results showed that phonotactic frequency affected the retention not only of the phonemic sequences, but also of pitch-accent patterns, when participants were instructed to recall both the phoneme sequence and accent pattern of nonwords. In addition, accent pattern typicality influenced the retention of the accent pattern: Typical accent patterns were recalled more accurately than atypical ones. These results indicate that both long-term phonotactic and lexical prosodic knowledge contribute to phonological short-term memory performance.

  4. Infants Learn Phonotactic Regularities from Brief Auditory Experience.

    Science.gov (United States)

    Chambers, Kyle E.; Onishi, Kristine H.; Fisher, Cynthia

    2003-01-01

    Two experiments investigated whether novel phonotactic regularities, not present in English, could be acquired by 16.5-month-olds from brief auditory experience. Subjects listened to consonant-vowel-consonant syllables in which particular consonants were artificially restricted to either initial or final position. Findings in a subsequent…

  5. The different time course of phonotactic constraint learning in children and adults: Evidence from speech errors.

    Science.gov (United States)

    Smalle, Eleonore H M; Muylle, Merel; Szmalec, Arnaud; Duyck, Wouter

    2017-11-01

    Speech errors typically respect the speaker's implicit knowledge of language-wide phonotactics (e.g., /ŋ/ cannot be a syllable onset in English). Previous work demonstrated that adults can learn novel experimentally induced phonotactic constraints by producing syllable strings in which the allowable position of a phoneme depends on another phoneme within the sequence (e.g., /t/ can only be an onset if the medial vowel is /i/), but not earlier than the second day of training. Thus far, no work has been done with children. In the current 4-day experiment, a group of Dutch-speaking adults and 9-year-old children were asked to rapidly recite sequences of novel word forms (e.g., kieng nief siet hiem) that were consistent with the phonotactics of spoken Dutch. Within the procedure of the experiment, some consonants (i.e., /t/ and /k/) were restricted to the onset or coda position depending on the medial vowel (i.e., /i/ or "ie" vs. /øː/ or "eu"). Speech errors in adults revealed a learning effect for the novel constraints on the second day of learning, consistent with earlier findings. A post hoc analysis at the trial level showed that learning was statistically reliable after an exposure of 120 sequence trials (including a consolidation period). However, children already started learning the constraints on the first day. More precisely, the effect appeared significantly after an exposure of 24 sequences. These findings indicate that children are rapid implicit learners of novel phonotactics, which bears important implications for theorizing about developmental sensitivities in language learning.

  6. The Precise Time Course of Lexical Activation: MEG Measurements of the Effects of Frequency, Probability, and Density in Lexical Decision

    Science.gov (United States)

    Stockall, Linnaea; Stringfellow, Andrew; Marantz, Alec

    2004-01-01

    Visually presented letter strings consistently yield three MEG response components: the M170, associated with letter-string processing (Tarkiainen, Helenius, Hansen, Cornelissen, & Salmelin, 1999); the M250, affected by phonotactic probability (Pylkkanen, Stringfellow, & Marantz, 2002); and the M350, responsive to lexical frequency (Embick,…

  7. The role of consolidation in learning context-dependent phonotactic patterns in speech and digital sequence production.

    Science.gov (United States)

    Anderson, Nathaniel D; Dell, Gary S

    2018-04-03

    Speakers implicitly learn novel phonotactic patterns by producing strings of syllables. The learning is revealed in their speech errors. First-order patterns, such as "/f/ must be a syllable onset," can be distinguished from contingent, or second-order, patterns, such as "/f/ must be an onset if the vowel is /a/, but a coda if the vowel is /o/." A meta-analysis of 19 experiments clearly demonstrated that first-order patterns affect speech errors to a very great extent in a single experimental session, but second-order vowel-contingent patterns only affect errors on the second day of testing, suggesting the need for a consolidation period. Two experiments tested an analogue to these studies involving sequences of button pushes, with fingers as "consonants" and thumbs as "vowels." The button-push errors revealed two of the key speech-error findings: first-order patterns are learned quickly, but second-order thumb-contingent patterns are only strongly revealed in the errors on the second day of testing. The influence of computational complexity on the implicit learning of phonotactic patterns in speech production may be a general feature of sequence production.

  8. Phonotactic diversity predicts the time depth of the world's language families.

    Directory of Open Access Journals (Sweden)

    Taraka Rama

    The ASJP (Automated Similarity Judgment Program) described an automated, lexical similarity-based method for dating the world's language groups using 52 archaeological, epigraphic and historical calibration date points. The present paper describes a new automated dating method, based on phonotactic diversity. Unlike ASJP, our method does not require any information on the internal classification of a language group. Also, the method can use all the available word lists for a language and its dialects, eschewing the debate on 'language' vs. 'dialect'. We further combine these dates and provide a new baseline which, to our knowledge, is the best one. We make a systematic comparison of our method, ASJP's dating procedure, and combined dates. We predict time depths for the world's language families and sub-families using this new baseline. Finally, we explain our results in the model of language change given by Nettle.

  9. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and overfitting the data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.
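
    The order-statistics diagnostic described above can be illustrated with a simplified stand-in (not the authors' implementation): apply a trial CDF to the sorted sample and compare the result with uniform order statistics, whose i-th value follows a Beta(i, n + 1 - i) distribution. The trial estimate below is just a fitted normal, used only to show the mechanics.

```python
# Simplified illustration of a scaled-quantile-residual style diagnostic:
# if the trial CDF is accurate, F_hat(sorted sample) should behave like
# uniform order statistics.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
sample = rng.normal(loc=0.0, scale=1.0, size=500)

# Trial CDF: a simple fitted normal (the paper instead refines trial CDFs
# iteratively against a maximum-entropy-based scoring function).
mu, sigma = sample.mean(), sample.std(ddof=1)
u = stats.norm.cdf(np.sort(sample), loc=mu, scale=sigma)

n = len(u)
i = np.arange(1, n + 1)
mean = i / (n + 1)                                    # E[U_(i)]
var = i * (n + 1 - i) / ((n + 1) ** 2 * (n + 2))      # Var[U_(i)]
sqr = (u - mean) / np.sqrt(var)                       # scaled quantile residuals

# Large |sqr| values flag regions where the trial CDF misfits the data.
print("max |scaled residual|:", np.abs(sqr).max())
```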

  10. Burden of high fracture probability worldwide: secular increases 2010-2040.

    Science.gov (United States)

    Odén, A; McCloskey, E V; Kanis, J A; Harvey, N C; Johansson, H

    2015-09-01

    The number of individuals aged 50 years or more at high risk of osteoporotic fracture worldwide in 2010 was estimated at 158 million and is set to double by 2040. The aim of this study was to quantify the number of individuals worldwide aged 50 years or more at high risk of osteoporotic fracture in 2010 and 2040. A threshold of high fracture probability was set at the age-specific 10-year probability of a major fracture (clinical vertebral, forearm, humeral or hip fracture) which was equivalent to that of a woman with a BMI of 24 kg/m² and a prior fragility fracture but no other clinical risk factors. The prevalence of high risk was determined worldwide and by continent using all available country-specific FRAX models, applying the population demography of each country. Twenty-one million men and 137 million women had a fracture probability at or above the threshold in the world for the year 2010. The greatest number of men and women at high risk were from Asia (55%). Worldwide, the number of high-risk individuals is expected to double over the next 40 years. We conclude that individuals with high probability of osteoporotic fractures comprise a very significant disease burden to society, particularly in Asia, and that this burden is set to increase markedly in the future. These analyses provide a platform for the evaluation of risk assessment and intervention strategies.

  11. Mining of high utility-probability sequential patterns from uncertain databases.

    Directory of Open Access Journals (Sweden)

    Binbin Zhang

    High-utility sequential pattern mining (HUSPM) has become an important issue in the field of data mining. Several HUSPM algorithms have been designed to mine high-utility sequential patterns (HUSPs). They have been applied in several real-life situations such as consumer behavior analysis and event detection in sensor networks. Nonetheless, most studies on HUSPM have focused on mining HUSPs in precise data. But in real life, uncertainty is an important factor, as data is collected using various types of sensors that are more or less accurate. Hence, data collected in a real-life database can be annotated with existence probabilities. This paper presents a novel pattern mining framework called high utility-probability sequential pattern mining (HUPSPM) for mining high utility-probability sequential patterns (HUPSPs) in uncertain sequence databases. A baseline algorithm with three optional pruning strategies is presented to mine HUPSPs. Moreover, to speed up the mining process, a projection mechanism is designed to create a database projection for each processed sequence, which is smaller than the original database. Thus, the number of unpromising candidates can be greatly reduced, as well as the execution time for mining HUPSPs. Substantial experiments on both real-life and synthetic datasets show that the designed algorithm performs well in terms of runtime, number of candidates, memory usage, and scalability for different minimum utility and minimum probability thresholds.

  12. High probability of disease in angina pectoris patients

    DEFF Research Database (Denmark)

    Høilund-Carlsen, Poul F.; Johansen, Allan; Vach, Werner

    2007-01-01

    BACKGROUND: According to most current guidelines, stable angina pectoris patients with a high probability of having coronary artery disease can be reliably identified clinically. OBJECTIVES: To examine the reliability of clinical evaluation with or without an at-rest electrocardiogram (ECG) in patients with a high probability of coronary artery disease. PATIENTS AND METHODS: A prospective series of 357 patients referred for coronary angiography (CA) for suspected stable angina pectoris were examined by a trained physician who judged their type of pain and Canadian Cardiovascular Society grade… on CA. Of the patients who also had an abnormal at-rest ECG, 14% to 21% of men and 42% to 57% of women had normal MPS. Sex-related differences were statistically significant. CONCLUSIONS: Clinical prediction appears to be unreliable. Addition of at-rest ECG data results in some improvement, particularly…

  13. Decomposition of conditional probability for high-order symbolic Markov chains

    Science.gov (United States)

    Melnik, S. S.; Usatenko, O. V.

    2017-07-01

    The main goal of this paper is to develop an estimate for the conditional probability function of random stationary ergodic symbolic sequences with elements belonging to a finite alphabet. We elaborate on a decomposition procedure for the conditional probability function of sequences considered to be high-order Markov chains. We represent the conditional probability function as the sum of multilinear memory function monomials of different orders (from zero up to the chain order). This allows us to introduce a family of Markov chain models and to construct artificial sequences via a method of successive iterations, taking into account at each step increasingly high correlations among random elements. At weak correlations, the memory functions are uniquely expressed in terms of the high-order symbolic correlation functions. The proposed method fills the gap between two approaches, namely the likelihood estimation and the additive Markov chains. The obtained results may have applications for sequential approximation of artificial neural network training.

  14. Focus in High School Mathematics: Statistics and Probability

    Science.gov (United States)

    National Council of Teachers of Mathematics, 2009

    2009-01-01

    Reasoning about and making sense of statistics and probability are essential to students' future success. This volume belongs to a series that supports National Council of Teachers of Mathematics' (NCTM's) "Focus in High School Mathematics: Reasoning and Sense Making" by providing additional guidance for making reasoning and sense making part of…

  15. Re-assessment of road accident data-analysis policy : applying theory from involuntary, high-consequence, low-probability events like nuclear power plant meltdowns to voluntary, low-consequence, high-probability events like traffic accidents

    Science.gov (United States)

    2002-02-01

    This report examines the literature on involuntary, high-consequence, low-probability (IHL) events like nuclear power plant meltdowns to determine what can be applied to the problem of voluntary, low-consequence high-probability (VLH) events like tra...

  16. Domestic wells have high probability of pumping septic tank leachate

    Science.gov (United States)

    Bremer, J. E.; Harter, T.

    2012-08-01

    Onsite wastewater treatment systems are common in rural and semi-rural areas around the world; in the US, about 25-30% of households are served by a septic (onsite) wastewater treatment system, and many property owners also operate their own domestic well nearby. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. In areas with small lots (thus high spatial septic system densities), shallow domestic wells are prone to contamination by septic system leachate. Mass balance approaches have been used to determine a maximum septic system density that would prevent contamination of groundwater resources. In this study, a source area model based on detailed groundwater flow and transport modeling is applied for a stochastic analysis of domestic well contamination by septic leachate. Specifically, we determine the probability that a source area overlaps with a septic system drainfield as a function of aquifer properties, septic system density and drainfield size. We show that high spatial septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We find that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and those that are harmful even at low concentrations (e.g., pathogens).

  17. Domestic wells have high probability of pumping septic tank leachate

    Directory of Open Access Journals (Sweden)

    J. E. Bremer

    2012-08-01

    Onsite wastewater treatment systems are common in rural and semi-rural areas around the world; in the US, about 25–30% of households are served by a septic (onsite) wastewater treatment system, and many property owners also operate their own domestic well nearby. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. In areas with small lots (thus high spatial septic system densities), shallow domestic wells are prone to contamination by septic system leachate. Mass balance approaches have been used to determine a maximum septic system density that would prevent contamination of groundwater resources. In this study, a source area model based on detailed groundwater flow and transport modeling is applied for a stochastic analysis of domestic well contamination by septic leachate. Specifically, we determine the probability that a source area overlaps with a septic system drainfield as a function of aquifer properties, septic system density and drainfield size. We show that high spatial septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We find that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and those that are harmful even at low concentrations (e.g., pathogens).
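
    A much-simplified analytical analogue of the overlap probability studied above (the paper itself relies on detailed flow-and-transport modeling) treats drainfields as a homogeneous spatial Poisson process of areal density ρ; if the well's source area sweeps a footprint of size A_s, the chance that it intersects at least one drainfield is

```latex
P(\text{overlap}) \;=\; 1 - e^{-\rho A_s}
```

    which rises quickly toward 1 as septic-system density or the source-area footprint grows.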

  18. Identifying Changes in the Probability of High Temperature, High Humidity Heat Wave Events

    Science.gov (United States)

    Ballard, T.; Diffenbaugh, N. S.

    2016-12-01

    Understanding how heat waves will respond to climate change is critical for adequate planning and adaptation. While temperature is the primary determinant of heat wave severity, humidity has been shown to play a key role in heat wave intensity with direct links to human health and safety. Here we investigate the individual contributions of temperature and specific humidity to extreme heat wave conditions in recent decades. Using global NCEP-DOE Reanalysis II daily data, we identify regional variability in the joint probability distribution of humidity and temperature. We also identify a statistically significant positive trend in humidity over the eastern U.S. during heat wave events, leading to an increased probability of high humidity, high temperature events. The extent to which we can expect this trend to continue under climate change is complicated due to variability between CMIP5 models, in particular among projections of humidity. However, our results support the notion that heat wave dynamics are characterized by more than high temperatures alone, and understanding and quantifying the various components of the heat wave system is crucial for forecasting future impacts.

  19. Increasing Classroom Compliance: Using a High-Probability Command Sequence with Noncompliant Students

    Science.gov (United States)

    Axelrod, Michael I.; Zank, Amber J.

    2012-01-01

    Noncompliance is one of the most problematic behaviors within the school setting. One strategy to increase compliance of noncompliant students is a high-probability command sequence (HPCS; i.e., a set of simple commands in which an individual is likely to comply immediately prior to the delivery of a command that has a lower probability of…

  20. Multidetector computed tomographic pulmonary angiography in patients with a high clinical probability of pulmonary embolism.

    Science.gov (United States)

    Moores, L; Kline, J; Portillo, A K; Resano, S; Vicente, A; Arrieta, P; Corres, J; Tapson, V; Yusen, R D; Jiménez, D

    2016-01-01

    Essentials: When the clinical probability of pulmonary embolism (PE) is high, the sensitivity of computed tomography (CT) is unclear. We investigated the sensitivity of multidetector CT among 134 patients with a high probability of PE. A normal CT alone may not safely exclude PE in patients with a high clinical pretest probability. In patients with no clear alternative diagnosis after CTPA, further testing should be strongly considered. Whether patients with a negative multidetector computed tomographic pulmonary angiography (CTPA) result and a high clinical pretest probability of pulmonary embolism (PE) should be further investigated is controversial. This was a prospective investigation of the sensitivity of multidetector CTPA among patients with a priori clinical assessment of a high probability of PE according to the Wells criteria. Among patients with a negative CTPA result, the diagnosis of PE required at least one of the following conditions: a ventilation/perfusion lung scan showing a high probability of PE in a patient with no history of PE, abnormal findings on venous ultrasonography in a patient without previous deep vein thrombosis at that site, or the occurrence of venous thromboembolism (VTE) in a 3-month follow-up period after anticoagulation was withheld because of a negative multidetector CTPA result. We identified 498 patients with a priori clinical assessment of a high probability of PE and a completed CTPA study. CTPA excluded PE in 134 patients; in these patients, the pooled incidence of VTE was 5.2% (seven of 134 patients; 95% confidence interval [CI] 1.5-9.0). Five patients had VTEs that were confirmed by an additional imaging test despite a negative CTPA result (five of 48 patients; 10.4%; 95% CI 1.8-19.1), and two patients had objectively confirmed VTEs that occurred during clinical follow-up of at least 3 months (two of 86 patients; 2.3%; 95% CI 0-5.5). None of the patients had a fatal PE during follow-up. A normal multidetector CTPA result alone may not safely exclude PE in patients with a high clinical pretest probability of the disease.
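
    The pooled incidence quoted above is consistent with a standard normal-approximation (Wald) confidence interval for a proportion, which can be checked directly:

```latex
\hat{p} = \tfrac{7}{134} \approx 0.052, \qquad
\widehat{\mathrm{SE}} = \sqrt{\tfrac{\hat{p}(1-\hat{p})}{134}} \approx 0.019, \qquad
\hat{p} \pm 1.96\,\widehat{\mathrm{SE}} \approx (0.015,\ 0.090)
```

    i.e. roughly the reported 95% CI of 1.5–9.0%.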

  1. High But Not Low Probability of Gain Elicits a Positive Feeling Leading to the Framing Effect

    Science.gov (United States)

    Gosling, Corentin J.; Moutier, Sylvain

    2017-01-01

    Human risky decision-making is known to be highly susceptible to profit-motivated responses elicited by the way in which options are framed. In fact, studies investigating the framing effect have shown that the choice between sure and risky options depends on how these options are presented. Interestingly, the probability of gain of the risky option has been highlighted as one of the main factors causing variations in susceptibility to the framing effect. However, while it has been shown that high probabilities of gain of the risky option systematically lead to framing bias, questions remain about the influence of low probabilities of gain. Therefore, the first aim of this paper was to clarify the respective roles of high and low probabilities of gain in the framing effect. Due to the difference between studies using a within- or between-subjects design, we conducted a first study investigating the respective roles of these designs. For both designs, we showed that trials with a high probability of gain led to the framing effect whereas those with a low probability did not. Second, as emotions are known to play a key role in the framing effect, we sought to determine whether they are responsible for such a debiasing effect of the low probability of gain. Our second study thus investigated the relationship between emotion and the framing effect depending on high and low probabilities. Our results revealed that positive emotion was related to risk-seeking in the loss frame, but only for trials with a high probability of gain. Taken together, these results support the interpretation that low probabilities of gain suppress the framing effect because they prevent the positive emotion of gain anticipation. PMID:28232808

  2. High But Not Low Probability of Gain Elicits a Positive Feeling Leading to the Framing Effect.

    Science.gov (United States)

    Gosling, Corentin J; Moutier, Sylvain

    2017-01-01

    Human risky decision-making is known to be highly susceptible to profit-motivated responses elicited by the way in which options are framed. In fact, studies investigating the framing effect have shown that the choice between sure and risky options depends on how these options are presented. Interestingly, the probability of gain of the risky option has been highlighted as one of the main factors causing variations in susceptibility to the framing effect. However, while it has been shown that high probabilities of gain of the risky option systematically lead to framing bias, questions remain about the influence of low probabilities of gain. Therefore, the first aim of this paper was to clarify the respective roles of high and low probabilities of gain in the framing effect. Due to the difference between studies using a within- or between-subjects design, we conducted a first study investigating the respective roles of these designs. For both designs, we showed that trials with a high probability of gain led to the framing effect whereas those with a low probability did not. Second, as emotions are known to play a key role in the framing effect, we sought to determine whether they are responsible for such a debiasing effect of the low probability of gain. Our second study thus investigated the relationship between emotion and the framing effect depending on high and low probabilities. Our results revealed that positive emotion was related to risk-seeking in the loss frame, but only for trials with a high probability of gain. Taken together, these results support the interpretation that low probabilities of gain suppress the framing effect because they prevent the positive emotion of gain anticipation.

  3. The probability distribution of intergranular stress corrosion cracking life for sensitized 304 stainless steels in high temperature, high purity water

    International Nuclear Information System (INIS)

    Akashi, Masatsune; Kenjyo, Takao; Matsukura, Shinji; Kawamoto, Teruaki

    1984-01-01

    In order to discuss the probability distribution of intergranular stress corrosion cracking life for sensitized 304 stainless steels, a series of creviced bent beam (CBB) and uni-axial constant load tests were carried out in oxygenated high temperature, high purity water. The following conclusions were reached: (1) The initiation process of intergranular stress corrosion cracking has been assumed to be approximated by a Poisson stochastic process, based on the CBB test results. (2) The probability distribution of intergranular stress corrosion cracking life may consequently be approximated by the exponential probability distribution. (3) The experimental data could be fitted to the exponential probability distribution. (author)
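
    The step linking conclusions (1) and (2) is the standard relation between a Poisson initiation process and an exponential life distribution: if initiation events occur at a constant rate λ, the number of initiations in time t is Poisson(λt), so the time T to the first initiation satisfies

```latex
P(T > t) \;=\; P(\text{no initiation in } [0,t]) \;=\; e^{-\lambda t}
```

    that is, T is exponentially distributed with mean 1/λ.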

  4. Probability in High Dimension

    Science.gov (United States)

    2014-06-30


  5. High temperature triggers latent variation among individuals: oviposition rate and probability for outbreaks.

    Directory of Open Access Journals (Sweden)

    Christer Björkman

    2011-01-01

    It is anticipated that extreme population events, such as extinctions and outbreaks, will become more frequent as a consequence of climate change. To evaluate the increased probability of such events, it is crucial to understand the mechanisms involved. Variation between individuals in their response to climatic factors is an important consideration, especially if microevolution is expected to change the composition of populations. Here we present data on a willow leaf beetle species, showing high variation among individuals in oviposition rate at a high temperature (20 °C). It is particularly noteworthy that not all individuals responded to changes in temperature; individuals laying few eggs at 20 °C continued to do so when transferred to 12 °C, whereas individuals that laid many eggs at 20 °C reduced their oviposition and laid the same number of eggs as the others when transferred to 12 °C. When transferred back to 20 °C, most individuals reverted to their original oviposition rate. Thus, high variation among individuals was only observed at the higher temperature. Using a simple population model and based on regional climate change scenarios, we show that the probability of outbreaks increases if there is a realistic increase in the number of warm summers. The probability of outbreaks also increased with increasing heritability of the ability to respond to increased temperature. If the climate becomes warmer and there is latent variation among individuals in their temperature response, the probability of outbreaks may increase. However, the likelihood of microevolution playing a role may be low. This conclusion is based on the fact that it has been difficult to show that microevolution affects the probability of extinctions. Our results highlight the need for caution when predicting the future concerning probabilities of extreme population events.

  6. The Effect of High Frequency Pulse on the Discharge Probability in Micro EDM

    Science.gov (United States)

    Liu, Y.; Qu, Y.; Zhang, W.; Ma, F.; Sha, Z.; Wang, Y.; Rolfe, B.; Zhang, S.

    2017-12-01

    High frequency pulses improve the machining efficiency of micro electric discharge machining (micro EDM), but they also change the micro EDM process. This paper focuses on the influence of the skin effect under high frequency pulses on energy distribution and transmission in micro EDM, and on that basis analyses the distribution of discharge probability over the electrode end face. Starting from the electrical discharge process under high frequency pulse conditions in micro EDM, COMSOL Multiphysics software is used to establish an energy transmission model of the micro electrode. The discharge energy distribution and transmission within the tool electrode under different pulse frequencies, currents, and permeabilities are studied in order to obtain the distribution of current density and electric field intensity on the electrode end face as the electrical parameters change. The electric field intensity distribution is taken as the parameter governing discharge probability on the electrode end. Finally, MATLAB is used to fit the curves and obtain the distribution of discharge probability over the electrode end face.
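
    For reference, the skin effect invoked above is governed by the standard skin-depth relation (not stated in the abstract itself): for a pulse component of frequency f in a conductor of resistivity ρ (conductivity σ = 1/ρ) and permeability μ,

```latex
\delta \;=\; \sqrt{\frac{2\rho}{\omega\mu}} \;=\; \frac{1}{\sqrt{\pi f \mu \sigma}}, \qquad \omega = 2\pi f
```

    so higher pulse frequencies shrink δ and concentrate the current, and hence the discharge energy, toward the surface and edges of the electrode end face.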

  7. Probabilistic analysis of Millstone Unit 3 ultimate containment failure probability given high pressure: Chapter 14

    International Nuclear Information System (INIS)

    Bickel, J.H.

    1983-01-01

    The quantification of the containment event trees in the Millstone Unit 3 Probabilistic Safety Study utilizes a conditional probability of failure given high pressure which is based on a new approach. The generation of this conditional probability was based on a weakest link failure mode model which considered contributions from a number of overlapping failure modes. This overlap effect was due to a number of failure modes whose mean failure pressures were clustered within a 5 psi range and which had uncertainties due to variances in material strengths and analytical uncertainties which were between 9 and 15 psi. Based on a review of possible probability laws to describe the failure probability of individual structural failure modes, it was determined that a Weibull probability law most adequately described the randomness in the physical process of interest. The resultant conditional probability of failure is found to have a median failure pressure of 132.4 psia. The corresponding 5-95 percentile values are 112 psia and 146.7 psia respectively. The skewed nature of the conditional probability of failure vs. pressure results in a lower overall containment failure probability for an appreciable number of the severe accident sequences of interest, but also probabilities which are more rigorously traceable from first principles
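
    For orientation, the two-parameter Weibull law referred to above has the form (the study's fit may additionally include a location shift, so the symbols below are illustrative only)

```latex
F(p) \;=\; 1 - \exp\!\left[-\left(\frac{p}{\eta}\right)^{\beta}\right], \qquad
p_q \;=\; \eta\,\bigl(-\ln(1-q)\bigr)^{1/\beta}
```

    with median p_{0.5} = η (ln 2)^{1/β}; the quoted 132.4 psia median and 112/146.7 psia values are then the 0.5, 0.05 and 0.95 quantiles of the fitted conditional failure distribution.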

  8. Vocabulary Facilitates Speech Perception in Children With Hearing Aids.

    Science.gov (United States)

    Klein, Kelsey E; Walker, Elizabeth A; Kirby, Benjamin; McCreery, Ryan W

    2017-08-16

    We examined the effects of vocabulary, lexical characteristics (age of acquisition and phonotactic probability), and auditory access (aided audibility and daily hearing aid [HA] use) on speech perception skills in children with HAs. Participants included 24 children with HAs and 25 children with normal hearing (NH), ages 5-12 years. Groups were matched on age, expressive and receptive vocabulary, articulation, and nonverbal working memory. Participants repeated monosyllabic words and nonwords in noise. Stimuli varied on age of acquisition, lexical frequency, and phonotactic probability. Performance in each condition was measured by the signal-to-noise ratio at which the child could accurately repeat 50% of the stimuli. Children from both groups with larger vocabularies showed better performance than children with smaller vocabularies on nonwords and late-acquired words but not early-acquired words. Overall, children with HAs showed poorer performance than children with NH. Auditory access was not associated with speech perception for the children with HAs. Children with HAs show deficits in sensitivity to phonological structure but appear to take advantage of vocabulary skills to support speech perception in the same way as children with NH. Further investigation is needed to understand the causes of the gap that exists between the overall speech perception abilities of children with HAs and children with NH.

  9. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  10. Probability modeling of high flow extremes in Yingluoxia watershed, the upper reaches of Heihe River basin

    Science.gov (United States)

    Li, Zhanling; Li, Zhanjie; Li, Chengcheng

    2014-05-01

    Probability modeling of hydrological extremes is one of the major research areas in hydrological science. Such analyses of high flow extremes have mostly concerned basins in the humid and semi-humid south and east of China, whereas for the inland river basins, which occupy about 35% of the country's area, such studies are scarce, partly because of limited data availability and relatively low mean annual flows. The objective of this study is to carry out probability modeling of high flow extremes in the upper reaches of the Heihe River basin, the second largest inland river basin in China, using the peak over threshold (POT) method and the Generalized Pareto Distribution (GPD); the selection of the threshold and the inherent assumptions for the POT series are elaborated in detail. For comparison, other widely used probability distributions, including the generalized extreme value (GEV), Lognormal, Log-logistic and Gamma, are employed as well. Maximum likelihood estimation is used for parameter estimation. Daily flow data at Yingluoxia station from 1978 to 2008 are used. Results show that, synthesizing the mean excess plot, the stability of model parameters, the return level plot and the inherent independence assumption of the POT series, an optimum threshold of 340 m3/s is determined for high flow extremes in the Yingluoxia watershed. The resulting POT series is shown to be stationary and independent based on the Mann-Kendall test, the Pettitt test and an autocorrelation test. In terms of the Kolmogorov-Smirnov test, the Anderson-Darling test and several graphical diagnostics such as quantile and cumulative distribution function plots, the GPD provides the best fit to high flow extremes in the study area. The estimated high flows for long return periods demonstrate that, as the return period increases, the return level estimates become more uncertain. The frequency of high flow extremes exhibits a very slight but not significant decreasing trend from 1978 to
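
    A minimal sketch of the POT/GPD fit described above, using synthetic daily flows; the record length, the 340 m3/s threshold and the return period are placeholders standing in for the Yingluoxia data, not the study's actual values.

    ```python
    # Hedged sketch: fit a Generalized Pareto Distribution (GPD) to peaks over threshold
    # and derive a return level, as in the POT analysis described above.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    daily_flow = rng.gamma(shape=2.0, scale=60.0, size=31 * 365)  # synthetic flows (m3/s)

    threshold = 340.0                                  # candidate threshold from the mean-excess plot
    excesses = daily_flow[daily_flow > threshold] - threshold

    # Fit the GPD to the excesses; loc is fixed at 0 because excesses start at the threshold.
    shape, loc, scale = stats.genpareto.fit(excesses, floc=0.0)

    # Return level for a T-year return period: the flow exceeded on average once in T years.
    events_per_year = len(excesses) / 31.0             # mean number of exceedances per year
    T = 100.0                                          # return period in years
    p_exceed = 1.0 / (T * events_per_year)             # exceedance probability per peak
    return_level = threshold + stats.genpareto.ppf(1.0 - p_exceed, shape, loc=0.0, scale=scale)
    print(f"GPD shape={shape:.3f}, scale={scale:.1f}, {T:.0f}-yr return level ~ {return_level:.0f} m3/s")
    ```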

  11. A prototype method for diagnosing high ice water content probability using satellite imager data

    Science.gov (United States)

    Yost, Christopher R.; Bedka, Kristopher M.; Minnis, Patrick; Nguyen, Louis; Strapp, J. Walter; Palikonda, Rabindra; Khlopenkov, Konstantin; Spangenberg, Douglas; Smith, William L., Jr.; Protat, Alain; Delanoe, Julien

    2018-03-01

    Recent studies have found that ingestion of high mass concentrations of ice particles in regions of deep convective storms, with radar reflectivity considered safe for aircraft penetration, can adversely impact aircraft engine performance. Previous aviation industry studies have used the term high ice water content (HIWC) to define such conditions. Three airborne field campaigns were conducted in 2014 and 2015 to better understand how HIWC is distributed in deep convection, both as a function of altitude and proximity to convective updraft regions, and to facilitate development of new methods for detecting HIWC conditions, in addition to many other research and regulatory goals. This paper describes a prototype method for detecting HIWC conditions using geostationary (GEO) satellite imager data coupled with in situ total water content (TWC) observations collected during the flight campaigns. Three satellite-derived parameters were determined to be most useful for determining HIWC probability: (1) the horizontal proximity of the aircraft to the nearest overshooting convective updraft or textured anvil cloud, (2) tropopause-relative infrared brightness temperature, and (3) daytime-only cloud optical depth. Statistical fits between collocated TWC and GEO satellite parameters were used to determine the membership functions for the fuzzy logic derivation of HIWC probability. The products were demonstrated using data from several campaign flights and validated using a subset of the satellite-aircraft collocation database. The daytime HIWC probability was found to agree quite well with TWC time trends and identified extreme TWC events with high probability. Discrimination of HIWC was more challenging at night with IR-only information. The products show the greatest capability for discriminating TWC ≥ 0.5 g m-3. Product validation remains challenging due to vertical TWC uncertainties and the typically coarse spatio-temporal resolution of the GEO data.
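
    A minimal sketch, under purely illustrative assumptions, of the fuzzy-logic combination described above; the piecewise-linear membership functions, breakpoints and averaging rule are placeholders, not the membership functions actually fitted from the collocated TWC-satellite data.

    ```python
    # Hedged sketch: combine satellite-derived parameters into an HIWC probability
    # via simple fuzzy membership functions and averaging defuzzification.
    import numpy as np

    def ramp(x, lo, hi):
        """Piecewise-linear membership rising from 0 at `lo` to 1 at `hi`."""
        return float(np.clip((x - lo) / (hi - lo), 0.0, 1.0))

    def hiwc_probability(dist_to_updraft_km, tropopause_rel_bt_k, optical_depth=None):
        # Closer to an overshooting updraft / textured anvil -> higher membership.
        m_dist = 1.0 - ramp(dist_to_updraft_km, 10.0, 100.0)
        # Colder (more negative) tropopause-relative brightness temperature -> higher membership.
        m_bt = 1.0 - ramp(tropopause_rel_bt_k, -10.0, 10.0)
        memberships = [m_dist, m_bt]
        # Cloud optical depth is only available in daytime; drop it at night.
        if optical_depth is not None:
            memberships.append(ramp(optical_depth, 20.0, 100.0))
        return sum(memberships) / len(memberships)   # simple averaging defuzzification

    print(hiwc_probability(dist_to_updraft_km=15.0, tropopause_rel_bt_k=-8.0, optical_depth=80.0))
    ```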

  12. Probability of fracture and life extension estimate of the high-flux isotope reactor vessel

    International Nuclear Information System (INIS)

    Chang, S.J.

    1998-01-01

    The state of vessel steel embrittlement as a result of neutron irradiation can be measured by the increase in the ductile-brittle transition temperature (DBTT) for fracture, often denoted by RT NDT for carbon steel. This transition temperature can be calibrated by the drop-weight test and, sometimes, by the Charpy impact test. The life extension for the high-flux isotope reactor (HFIR) vessel is calculated using fracture mechanics incorporating the effect of the DBTT change. The failure probability of the HFIR vessel, and hence the life of the vessel, is limited by the reactor core melt probability of 10⁻⁴. The operating safety of the reactor is ensured by a periodic hydrostatic pressure test (hydrotest), which is performed in order to determine a safe vessel static pressure. The fracture probability resulting from the hydrostatic pressure test is calculated and used to determine the life of the vessel; failure to perform the hydrotest imposes a limit on the life of the vessel. Conventional methods of fracture probability calculation, such as those used by the NRC-sponsored PRAISE CODE and the FAVOR CODE developed in this Laboratory, are based on Monte Carlo simulation and require heavy computation. An alternative method of fracture probability calculation by direct probability integration is developed in this paper. The present approach offers a simple and expedient way to obtain numerical results without losing any generality. Numerical results on (1) the probability of vessel fracture, (2) the hydrotest time interval, and (3) the hydrotest pressure as a result of the DBTT increase are obtained
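
    A minimal sketch of the contrast drawn above between Monte Carlo simulation and direct probability integration for a fracture probability; the flaw-depth distribution, toughness scatter and stress-intensity model are toy assumptions for illustration, not the HFIR vessel analysis.

    ```python
    # Hedged sketch: the same fracture probability computed by direct integration
    # (quadrature, no sampling noise) and by Monte Carlo simulation.
    import numpy as np
    from scipy import stats, integrate

    flaw_depth = stats.lognorm(s=0.5, scale=2.0)       # flaw depth a (mm), assumed distribution

    def failure_given_depth(a_mm, applied_stress=200.0, toughness=60.0):
        """Probability of fracture for a flaw of depth a under the hydrotest stress (toy model)."""
        k_applied = applied_stress * np.sqrt(np.pi * a_mm / 1000.0)  # crude stress intensity (MPa*sqrt(m))
        k_ic = stats.norm(loc=toughness, scale=8.0)                  # scattered fracture toughness
        return k_ic.cdf(k_applied)                                   # P(K_applied > K_Ic)

    # Direct integration: P_f = integral of P(fracture | a) * f(a) da.
    p_direct, _ = integrate.quad(lambda a: failure_given_depth(a) * flaw_depth.pdf(a), 0.0, 50.0)

    # Monte Carlo estimate of the same quantity, for comparison.
    a_samples = flaw_depth.rvs(size=200_000, random_state=1)
    p_mc = failure_given_depth(a_samples).mean()
    print(f"direct integration: {p_direct:.2e}, Monte Carlo: {p_mc:.2e}")
    ```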

  13. Risk-averse decision-making for civil infrastructure exposed to low-probability, high-consequence events

    International Nuclear Information System (INIS)

    Cha, Eun Jeong; Ellingwood, Bruce R.

    2012-01-01

    Quantitative analysis and assessment of risk to civil infrastructure has two components: probability of a potentially damaging event and consequence of damage, measured in terms of financial or human losses. Decision models that have been utilized during the past three decades take into account the probabilistic component rationally, but address decision-maker attitudes toward consequences and risk only to a limited degree. The application of models reflecting these attitudes to decisions involving low-probability, high-consequence events that may impact civil infrastructure requires a fundamental understanding of risk acceptance attitudes and how they affect individual and group choices. In particular, the phenomenon of risk aversion may be a significant factor in decisions for civil infrastructure exposed to low-probability events with severe consequences, such as earthquakes, hurricanes or floods. This paper utilizes cumulative prospect theory to investigate the role and characteristics of risk-aversion in assurance of structural safety.
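
    A minimal sketch of the cumulative prospect theory ingredients referred to above: an inverse-S probability weighting function and a value function with loss aversion. The Tversky-Kahneman parameter values are generic textbook defaults, not estimates for civil-infrastructure decisions.

    ```python
    # Hedged sketch: CPT-style valuation of a low-probability, high-consequence loss.
    import numpy as np

    def weight(p, gamma=0.61):
        """Inverse-S probability weighting: overweights small p, underweights large p."""
        return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

    def value(x, alpha=0.88, lam=2.25):
        """S-shaped value function, steeper for losses (loss aversion lam > 1)."""
        x = np.asarray(x, dtype=float)
        gains = np.clip(x, 0, None) ** alpha
        losses = -lam * np.clip(-x, 0, None) ** alpha
        return np.where(x >= 0, gains, losses)

    p, loss = 0.01, -10_000_000.0                  # 1% chance of a $10M loss (assumed numbers)
    cpt = weight(p) * value(loss)                  # subjective valuation with probability distortion
    ev = p * value(loss)                           # valuation without probability distortion
    print(f"weighted valuation: {float(cpt):,.0f}, unweighted: {float(ev):,.0f}")
    # The 1% probability is weighted up to roughly 5.5%, which is one way risk aversion
    # toward rare, severe events can be represented.
    ```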

  14. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem for real-valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P
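
    A minimal sketch of the classical one-sample P-P plot that these generalized plots extend; the normal sample and hypothesized distribution below are arbitrary illustrations.

    ```python
    # Hedged sketch: classical P-P plot comparing empirical probabilities to the
    # hypothesized distribution evaluated at the ordered data.
    import numpy as np
    from scipy import stats
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(2)
    x = np.sort(rng.normal(loc=1.0, scale=2.0, size=200))    # sample to be tested

    F0 = stats.norm(loc=1.0, scale=2.0)                       # hypothesized distribution
    empirical = (np.arange(1, len(x) + 1) - 0.5) / len(x)     # empirical probabilities
    theoretical = F0.cdf(x)                                   # model probabilities at the data

    plt.plot(theoretical, empirical, ".", markersize=3)
    plt.plot([0, 1], [0, 1], "k--", linewidth=1)              # perfect-fit reference line
    plt.xlabel("F0(x) (hypothesized)")
    plt.ylabel("empirical CDF")
    plt.title("Classical P-P plot")
    plt.show()
    ```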

  15. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  16. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown to be effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  17. Neutron emission probability at high excitation and isospin

    International Nuclear Information System (INIS)

    Aggarwal, Mamta

    2005-01-01

    One-neutron and two-neutron emission probabilities at different excitations and varying isospin have been studied. Several degrees of freedom, such as deformation, rotation, temperature, isospin fluctuations and shell structure, are incorporated via the statistical theory of hot rotating nuclei

  18. High-resolution urban flood modelling - a joint probability approach

    Science.gov (United States)

    Hartnett, Michael; Olbert, Agnieszka; Nash, Stephen

    2017-04-01

    (Divoky et al., 2005). Nevertheless, such events occur, and in Ireland alone there are several cases of serious damage due to flooding resulting from a combination of high sea water levels and river flows driven by the same meteorological conditions (e.g. Olbert et al. 2015). The November 2009 fluvial-coastal flooding of Cork City, which brought a €100m loss, was one such incident. This event was used by Olbert et al. (2015) to determine the processes controlling urban flooding and is further explored in this study to elaborate on coastal and fluvial flood mechanisms and their roles in controlling water levels. The objective of this research is to develop a methodology to assess the combined effect of multiple-source flooding on flood probability and severity in urban areas, and to establish a set of conditions that dictate urban flooding due to extreme climatic events. These conditions broadly combine physical flood drivers (such as coastal and fluvial processes), their mechanisms, and thresholds defining flood severity. The two main physical processes controlling urban flooding, high sea water levels (coastal flooding) and high river flows (fluvial flooding), and the threshold values for which flooding is likely to occur, are considered in this study. The contributions of coastal and fluvial drivers to flooding and their impacts are assessed in a two-step process. The first step involves frequency analysis and extreme value statistical modelling of storm surges, tides and river flows, and ultimately the application of the joint probability method to estimate joint exceedance return periods for combinations of surge, tide and river flow. In the second step, a numerical model of Cork Harbour, MSN_Flood, comprising a cascade of four nested high-resolution models, is used to simulate flood inundation under numerous hypothetical coastal and fluvial flood scenarios. The risk of flooding is quantified based on a range of physical aspects such as the extent and depth of inundation (Apel et al
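
    A minimal sketch of the joint-probability step described above, assuming synthetic correlated surge and river-flow series and arbitrary thresholds; it only illustrates how ignoring dependence between the two drivers overstates the joint return period.

    ```python
    # Hedged sketch: empirical joint exceedance of sea level and river flow,
    # compared with the return period implied by assuming the drivers are independent.
    import numpy as np

    rng = np.random.default_rng(3)
    n_days = 100 * 365
    # Correlated drivers: the same storms tend to raise both surge and river flow.
    z = rng.multivariate_normal(mean=[0, 0], cov=[[1.0, 0.6], [0.6, 1.0]], size=n_days)
    sea_level = 2.0 + 0.4 * z[:, 0]          # m
    river_flow = 150.0 + 80.0 * z[:, 1]      # m3/s

    sea_thr, flow_thr = 2.8, 310.0           # flood-relevant thresholds (assumed)
    p_sea = np.mean(sea_level > sea_thr)
    p_flow = np.mean(river_flow > flow_thr)
    p_joint = np.mean((sea_level > sea_thr) & (river_flow > flow_thr))

    def to_years(p):
        """Daily exceedance probability -> return period in years."""
        return 1.0 / (p * 365.0)

    print(f"independence assumption: {to_years(p_sea * p_flow):.1f} yr, "
          f"observed joint: {to_years(p_joint):.1f} yr")
    ```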

  19. Domain general constraints on statistical learning.

    Science.gov (United States)

    Thiessen, Erik D

    2011-01-01

    All theories of language development suggest that learning is constrained. However, theories differ on whether these constraints arise from language-specific processes or have domain-general origins such as the characteristics of human perception and information processing. The current experiments explored constraints on statistical learning of patterns, such as the phonotactic patterns of an infant's native language. Infants in these experiments were presented with a visual analog of a phonotactic learning task used by J. R. Saffran and E. D. Thiessen (2003). Saffran and Thiessen found that infants' phonotactic learning was constrained such that some patterns were learned more easily than others. The current results indicate that infants' learning of visual patterns shows the same constraints as infants' learning of phonotactic patterns. This is consistent with theories suggesting that constraints arise from domain-general sources and, as such, should operate over many kinds of stimuli in addition to linguistic stimuli. © 2011 The Author. Child Development © 2011 Society for Research in Child Development, Inc.

  20. Prognostic value of stress echocardiography in women with high (⩾80%) probability of coronary artery disease

    OpenAIRE

    Davar, J; Roberts, E; Coghlan, J; Evans, T; Lipkin, D

    2001-01-01

    OBJECTIVE—To assess the prognostic significance of stress echocardiography in women with a high probability of coronary artery disease (CAD).
SETTING—Secondary and tertiary cardiology unit at a university teaching hospital.
PARTICIPANTS—A total of 135 women (mean (SD) age 63 (9) years) with pre-test probability of CAD ⩾80% were selected from a database of patients investigated by treadmill or dobutamine stress echocardiography between 1995 and 1998.
MAIN OUTCOME MEASURES—Patients were followe...

  1. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  2. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  3. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  4. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  5. Learning difficulties of senior high school students based on probability understanding levels

    Science.gov (United States)

    Anggara, B.; Priatna, N.; Juandi, D.

    2018-05-01

    Identifying students' difficulties in learning the concept of probability is important for teachers so that they can prepare appropriate learning processes and overcome obstacles that may arise in later learning. This study revealed the level of students' understanding of the concept of probability and identified their difficulties as part of identifying epistemological obstacles to the concept of probability. The study employed a qualitative, descriptive approach involving 55 students of class XII. Data were collected through a diagnostic test of probability learning difficulties, observation, and interviews, and were used to determine the levels of understanding and the learning difficulties experienced by the students. The students' test results and the learning observations showed that the mean cognitive level was level 2. The findings indicated that students had appropriate quantitative information about the concept of probability, but it might be incomplete or incorrectly used. The difficulties identified concerned constructing sample spaces, events, and mathematical models related to probability problems. Students also had difficulty understanding the principles of events and prerequisite concepts.

  6. Assessing the clinical probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Miniati, M.; Pistolesi, M.

    2001-01-01

    Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as the pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to identify parameters associated with PE. A score ≤ 4 identified patients with low probability, of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38% and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3
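
    A minimal sketch of the pretest-to-post-test probability update mentioned above, using the odds form of Bayes' rule; the likelihood ratios are assumed values for illustration, not those of any validated PE test.

    ```python
    # Hedged sketch: turn a pretest (clinical) probability into a post-test probability.
    def post_test_probability(pretest_p, likelihood_ratio):
        """Update a pretest probability with a test result via the odds form of Bayes' rule."""
        pre_odds = pretest_p / (1.0 - pretest_p)
        post_odds = pre_odds * likelihood_ratio
        return post_odds / (1.0 + post_odds)

    # e.g. intermediate clinical probability (38%) followed by a positive test with LR+ = 8,
    # or a negative test with LR- = 0.1 (assumed likelihood ratios).
    print(post_test_probability(0.38, 8.0))   # -> about 0.83
    print(post_test_probability(0.38, 0.1))   # -> about 0.06
    ```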

  7. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  8. Economic choices reveal probability distortion in macaque monkeys.

    Science.gov (United States)

    Stauffer, William R; Lak, Armin; Bossaerts, Peter; Schultz, Wolfram

    2015-02-18

    Economic choices are largely determined by two principal elements, reward value (utility) and probability. Although nonlinear utility functions have been acknowledged for centuries, nonlinear probability weighting (probability distortion) was only recently recognized as a ubiquitous aspect of real-world choice behavior. Even when outcome probabilities are known and acknowledged, human decision makers often overweight low probability outcomes and underweight high probability outcomes. Whereas recent studies measured utility functions and their corresponding neural correlates in monkeys, it is not known whether monkeys distort probability in a manner similar to humans. Therefore, we investigated economic choices in macaque monkeys for evidence of probability distortion. We trained two monkeys to predict reward from probabilistic gambles with constant outcome values (0.5 ml or nothing). The probability of winning was conveyed using explicit visual cues (sector stimuli). Choices between the gambles revealed that the monkeys used the explicit probability information to make meaningful decisions. Using these cues, we measured probability distortion from choices between the gambles and safe rewards. Parametric modeling of the choices revealed classic probability weighting functions with inverted-S shape. Therefore, the animals overweighted low probability rewards and underweighted high probability rewards. Empirical investigation of the behavior verified that the choices were best explained by a combination of nonlinear value and nonlinear probability distortion. Together, these results suggest that probability distortion may reflect evolutionarily preserved neuronal processing. Copyright © 2015 Stauffer et al.

  9. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

    It is anticipated that changes in the frequency of surveillance tests, preventive maintenance or parts replacement of safety-related components may change component failure probabilities and, as a result, the core damage probability. It is also anticipated that the change differs depending on the initiating event frequency and the component type. This study assessed the change in core damage probability using a simplified PSA model capable of calculating core damage probability in a short time period, developed by the US NRC to process accident sequence precursors, when various components' failure probabilities are changed between 0 and 1, and when Japanese or American initiating event frequency data are used. As a result of the analysis: (1) it was clarified that the frequency of surveillance tests, preventive maintenance or parts replacement of motor-driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability changes in the increasing direction. (2) Core damage probability is insensitive to changes in surveillance test frequency for motor-operated valves and the turbine-driven auxiliary feedwater pump, since the change in core damage probability is small when their failure probabilities change by about one order of magnitude. (3) The change in core damage probability is small when Japanese failure probability data are applied to the emergency diesel generator, even if the failure probability changes by one order of magnitude from the base value. On the other hand, when American failure probability data are applied, the increase in core damage probability is large when the failure probability changes in the increasing direction. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency, etc. (author)
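
    A minimal sketch of this kind of sensitivity study on a toy cut-set model; the initiating event frequencies, cut sets and failure probabilities are invented for illustration and are not the values of the simplified ASP model referred to above.

    ```python
    # Hedged sketch: recompute a core damage frequency from a simplified cut-set model
    # while sweeping one component's failure probability from 0.001 to 1.
    import numpy as np

    def core_damage_frequency(p_hpi_pump, p_afw_pump=5e-3, p_diesel=2e-2,
                              f_loca=1e-3, f_loop=5e-2):
        """CDF per year from two illustrative minimal cut sets (rare-event approximation)."""
        seq_loca = f_loca * p_hpi_pump                       # LOCA and high-pressure injection fails
        seq_loop = f_loop * p_diesel * p_afw_pump            # loss of offsite power, EDG and AFW fail
        return seq_loca + seq_loop

    base = core_damage_frequency(p_hpi_pump=1e-3)
    for p in np.logspace(-3, 0, 4):                          # sweep HPI pump failure probability
        cdf = core_damage_frequency(p)
        print(f"p(HPI fails)={p:.0e}  CDF={cdf:.2e}/yr  x{cdf / base:.1f} vs base")
    ```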

  10. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs of a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables

  11. Wordlikeness and Word Learning in Children with Hearing Loss

    Science.gov (United States)

    Stiles, Derek J.; McGregor, Karla K.; Bentler, Ruth A.

    2013-01-01

    Background: The more a novel word conforms to the phonotactics of the language, the more wordlike it is and the easier it is to learn. It is unknown to what extent children with hearing loss (CHL) take advantage of phonotactic cues to support word learning. Aims: This study investigated whether CHL had similar sensitivities to wordlikeness during…

  12. Recent trends in the probability of high out-of-pocket medical expenses in the United States

    Directory of Open Access Journals (Sweden)

    Katherine E Baird

    2016-09-01

    Full Text Available Objective: This article measures the probability that out-of-pocket expenses in the United States exceed a threshold share of income. It calculates this probability separately by individuals' health condition, income, and elderly status and estimates changes occurring in these probabilities between 2010 and 2013. Data and Method: This article uses nationally representative household survey data on 344,000 individuals. Logistic regressions estimate the probabilities that out-of-pocket expenses exceed 5% and alternatively 10% of income in the two study years. These probabilities are calculated for individuals based on their income, health status, and elderly status. Results: Despite favorable changes in both health policy and the economy, large numbers of Americans continue to be exposed to high out-of-pocket expenditures. For instance, the results indicate that in 2013 over a quarter of nonelderly low-income citizens in poor health spent 10% or more of their income on out-of-pocket expenses, and over 40% of this group spent more than 5%. Moreover, for Americans as a whole, the probability of spending in excess of 5% of income on out-of-pocket costs increased by 1.4 percentage points between 2010 and 2013, with the largest increases occurring among low-income Americans; the probability of Americans spending more than 10% of income grew from 9.3% to 9.6%, with the largest increases also occurring among the poor. Conclusion: The magnitude of the out-of-pocket financial burden and the most recent upward trends in it underscore a need to develop good measures of the degree to which health care policy exposes individuals to financial risk, and to closely monitor the Affordable Care Act's success in reducing Americans' exposure to large medical bills.

  13. Transitional Probabilities Are Prioritized over Stimulus/Pattern Probabilities in Auditory Deviance Detection: Memory Basis for Predictive Sound Processing.

    Science.gov (United States)

    Mittag, Maria; Takegata, Rika; Winkler, István

    2016-09-14

    Representations encoding the probabilities of auditory events do not directly support predictive processing. In contrast, information about the probability with which a given sound follows another (transitional probability) allows predictions of upcoming sounds. We tested whether behavioral and cortical auditory deviance detection (the latter indexed by the mismatch negativity event-related potential) relies on probabilities of sound patterns or on transitional probabilities. We presented healthy adult volunteers with three types of rare tone-triplets among frequent standard triplets of high-low-high (H-L-H) or L-H-L pitch structure: proximity deviant (H-H-H/L-L-L), reversal deviant (L-H-L/H-L-H), and first-tone deviant (L-L-H/H-H-L). If deviance detection was based on pattern probability, reversal and first-tone deviants should be detected with similar latency because both differ from the standard at the first pattern position. If deviance detection was based on transitional probabilities, then reversal deviants should be the most difficult to detect because, unlike the other two deviants, they contain no low-probability pitch transitions. The data clearly showed that both behavioral and cortical auditory deviance detection uses transitional probabilities. Thus, the memory traces underlying cortical deviance detection may provide a link between stimulus probability-based change/novelty detectors operating at lower levels of the auditory system and higher auditory cognitive functions that involve predictive processing. Our research presents the first definite evidence for the auditory system prioritizing transitional probabilities over probabilities of individual sensory events. Forming representations for transitional probabilities paves the way for predictions of upcoming sounds. Several recent theories suggest that predictive processing provides the general basis of human perception, including important auditory functions, such as auditory scene analysis. Our
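
    A minimal sketch contrasting the two candidate representations tested above, pattern (triplet) probabilities versus transitional probabilities, estimated from a toy tone stream standing in for the H-L-H sequences.

    ```python
    # Hedged sketch: estimate triplet probabilities and transitional probabilities
    # from the same sequence of high (H) and low (L) tones.
    from collections import Counter
    from itertools import pairwise   # Python 3.10+

    stream = list("HLH" * 40 + "HHH" + "HLH" * 40 + "LHL" + "HLH" * 40)  # mostly standard triplets

    # Pattern probabilities: relative frequency of each non-overlapping triplet.
    triplets = ["".join(stream[i:i + 3]) for i in range(0, len(stream) - 2, 3)]
    pattern_p = {t: c / len(triplets) for t, c in Counter(triplets).items()}

    # Transitional probabilities: P(next tone | current tone), estimated over all adjacent pairs.
    pair_counts = Counter(pairwise(stream))
    tone_counts = Counter(stream[:-1])
    transitional_p = {f"{a}->{b}": c / tone_counts[a] for (a, b), c in pair_counts.items()}

    print(pattern_p)
    print(transitional_p)
    ```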

  14. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  15. Analytic Neutrino Oscillation Probabilities in Matter: Revisited

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT

    2018-01-02

    We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.

  16. Striatal activity is modulated by target probability.

    Science.gov (United States)

    Hon, Nicholas

    2017-06-14

    Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of that fact that different components of the striatum are sensitive to different types of task-relevant information.

  17. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  18. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  19. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition""This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory.""  - The StatisticianThoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  20. Climate drives inter-annual variability in probability of high severity fire occurrence in the western United States

    Science.gov (United States)

    Keyser, Alisa; Westerling, Anthony LeRoy

    2017-05-01

    A long history of fire suppression in the western United States has significantly changed forest structure and ecological function, leading to increasingly uncharacteristic fires in terms of size and severity. Prior analyses of fire severity in California forests showed that time since last fire and fire weather conditions predicted fire severity very well, while a larger regional analysis showed that topography and climate were important predictors of high severity fire. There has not yet been a large-scale study that incorporates topography, vegetation and fire-year climate to determine regional scale high severity fire occurrence. We developed models to predict the probability of high severity fire occurrence for the western US. We predict high severity fire occurrence with some accuracy, and identify the relative importance of predictor classes in determining the probability of high severity fire. The inclusion of both vegetation and fire-year climate predictors was critical for model skill in identifying fires with high fractional fire severity. The inclusion of fire-year climate variables allows this model to forecast inter-annual variability in areas at future risk of high severity fire, beyond what slower-changing fuel conditions alone can accomplish. This allows for more targeted land management, including resource allocation for fuels reduction treatments to decrease the risk of high severity fire.

  1. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a subjective "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples, from the most general, scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle, from drug development to manufacture, and from marketing to product discontinuation. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management.

  2. Approximation of rejective sampling inclusion probabilities and application to high order correlations

    NARCIS (Netherlands)

    Boistard, H.; Lopuhää, H.P.; Ruiz-Gazen, A.

    2012-01-01

    This paper is devoted to rejective sampling. We provide an expansion of joint inclusion probabilities of any order in terms of the inclusion probabilities of order one, extending previous results by Hájek (1964) and Hájek (1981) and making the remainder term more precise. Following Hájek (1981), the

  3. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  4. A 'new' Cromer-related high frequency antigen probably antithetical to WES.

    Science.gov (United States)

    Daniels, G L; Green, C A; Darr, F W; Anderson, H; Sistonen, P

    1987-01-01

    An antibody to a high frequency antigen, made in a WES+ Black antenatal patient (Wash.), failed to react with the red cells of a presumed WES+ homozygote and is, therefore, probably antithetical to anti-WES. Like anti-WES, it reacted with papain, ficin, trypsin or neuraminidase treated cells but not with alpha-chymotrypsin or pronase treated cells and was specifically inhibited by concentrated serum. It also reacted more strongly in titration with WES- cells than with WES+ cells. The antibody is Cromer-related as it failed to react with Inab phenotype (IFC-) cells and reacted only weakly with Dr(a-) cells. Wash. cells and those of the other possible WES+ homozygote are Cr(a+) Tc(a+b-c-) Dr(a+) IFC+ but reacted only very weakly with anti-Esa.

  5. Advanced RESTART method for the estimation of the probability of failure of highly reliable hybrid dynamic systems

    International Nuclear Information System (INIS)

    Turati, Pietro; Pedroni, Nicola; Zio, Enrico

    2016-01-01

    The efficient estimation of system reliability characteristics is of paramount importance for many engineering applications. Real world system reliability modeling calls for the capability of treating systems that are: i) dynamic, ii) complex, iii) hybrid and iv) highly reliable. Advanced Monte Carlo (MC) methods offer a way to solve these types of problems that remains feasible despite the potentially high computational costs. In this paper, the REpetitive Simulation Trials After Reaching Thresholds (RESTART) method is employed, extending it to hybrid systems for the first time (to the authors' knowledge). The estimation accuracy and precision of RESTART highly depend on the choice of the Importance Function (IF) indicating how close the system is to failure: in this respect, proper IFs are originally proposed here to improve the performance of RESTART for the analysis of hybrid systems. The resulting overall simulation approach is applied to estimate the probability of failure of the control system of a liquid hold-up tank and of a pump-valve subsystem subject to degradation induced by fatigue. The results are compared to those obtained by standard MC simulation and by RESTART with classical IFs available in the literature. The comparison shows the improvement in performance obtained by our approach. - Highlights: • We consider the issue of estimating small failure probabilities in dynamic systems. • We employ the RESTART method to estimate the failure probabilities. • New Importance Functions (IFs) are introduced to increase the method performance. • We adopt two dynamic, hybrid, highly reliable systems as case studies. • A comparison with literature IFs proves the effectiveness of the new IFs.
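
    A minimal sketch of the splitting idea behind RESTART on a toy random-walk system; the importance function (here simply the current level), the thresholds and the cloning factor are arbitrary choices for illustration, not the IFs proposed in the paper.

    ```python
    # Hedged sketch: trajectories that cross an intermediate threshold of the importance
    # function are cloned, concentrating simulation effort near the rare failure region.
    import numpy as np

    rng = np.random.default_rng(4)

    def run_from(x, steps=150):
        """Simulate the toy dynamics from state x; return the visited levels."""
        path = [x]
        for _ in range(steps):
            x = x + rng.normal(-0.1, 1.0)   # slight downward drift makes high levels rare
            path.append(x)
        return np.array(path)

    thresholds = [2.5, 5.0, 7.5]            # importance-function levels; the last one is "failure"
    n0, clones = 1_000, 5                   # initial trajectories and clones per threshold crossing

    states, weight, p_fail = [0.0] * n0, 1.0 / n0, 0.0
    for thr in thresholds:
        survivors = []
        for x in states:
            path = run_from(x)
            if (path >= thr).any():
                survivors.append(path[np.argmax(path >= thr)])   # state at first crossing
        if not survivors:
            break
        if thr == thresholds[-1]:
            p_fail = weight * len(survivors)                     # sum of surviving weights
            break
        states = [x for x in survivors for _ in range(clones)]   # clone survivors (RESTART step)
        weight /= clones                                         # each clone carries 1/clones of the weight
    print(f"estimated probability of reaching {thresholds[-1]}: {p_fail:.2e}")
    ```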

  6. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  7. Development of risk assessment simulation tool for optimal control of a low probability-high consequence disaster

    International Nuclear Information System (INIS)

    Yotsumoto, Hiroki; Yoshida, Kikuo; Genchi, Hiroshi

    2011-01-01

    In order to control a low probability-high consequence disaster that causes huge social and economic damage, it is necessary to develop a simulation tool for simultaneous risk assessment based on a disaster risk scheme that includes the diverse effects of the primary disaster and secondary damage. We propose the scheme for this risk simulation tool. (author)

  8. Male responses to conspecific advertisement signals in the field cricket Gryllus rubens (Orthoptera: Gryllidae).

    Science.gov (United States)

    Jang, Yikweon

    2011-01-20

    In many species males aggregate and produce long-range advertisement signals to attract conspecific females. The majority of the receivers of these signals are probably other males most of the time, and male responses to competitors' signals can structure the spatial and temporal organization of the breeding aggregation and affect male mating tactics. I quantified male responses to a conspecific advertisement stimulus repeatedly over three age classes in Gryllus rubens (Orthoptera: Gryllidae) in order to estimate the type and frequency of male responses to the broadcast stimulus and to determine the factors affecting them. Factors tested included body size, wing dimorphism, age, and intensity of the broadcast stimulus. Overall, males employed acoustic response more often than positive phonotactic response. As males aged, the frequency of positive phonotactic response decreased but that of the acoustic response increased. That is, males may use positive phonotaxis in the early stages of their adult lives, possibly to find suitable calling sites or parasitize calling males, and then later in life switch to acoustic responses in response to conspecific advertisement signals. Males with smaller body size more frequently exhibited acoustic responses. This study suggests that individual variation, more than any factors measured, is critical for age-dependent male responses to conspecific advertisement signals.

  9. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
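
    A minimal sketch of the problem COVAL addresses, here by plain Monte Carlo rather than the code's numerical transformation of distributions: propagating the distributions of a random load and a random resistance through a function (the safety margin). The input distributions are illustrative assumptions, not those of any particular structure.

    ```python
    # Hedged sketch: distribution of a function of random variables, and the resulting
    # failure probability P(load > resistance).
    import numpy as np
    from scipy import stats

    load = stats.gumbel_r(loc=100.0, scale=15.0).rvs(size=500_000, random_state=1)    # random load
    resistance = stats.lognorm(s=0.1, scale=180.0).rvs(size=500_000, random_state=2)  # random capacity

    margin = resistance - load                      # function of the two random variables
    p_failure = np.mean(margin < 0.0)               # P(load exceeds resistance)

    # The full distribution of the margin can then be summarized or tabulated.
    print(f"P(failure) ~ {p_failure:.2e}; margin quantiles:",
          np.percentile(margin, [1, 50, 99]).round(1))
    ```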

  10. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  11. Intelligent tutorial system for teaching of probability and statistics at high school in Mexico

    Directory of Open Access Journals (Sweden)

    Fernando Gudino Penaloza, Miguel Gonzalez Mendoza, Neil Hernandez Gress, Jaime Mora Vargas

    2009-12-01

    Full Text Available This paper describes the implementation of an intelligent tutoring system dedicated to teaching probability and statistics at the preparatory school (or high school) in Mexico. The system was implemented as a desktop application and adapted to a mobile environment for the implementation of mobile learning, or m-learning. The system complies with the idea of being adaptable to the needs of each student and is able to adapt to three different teaching models that meet the criteria of three student profiles.

  12. Influence of the Probability Level on the Framing Effect

    Directory of Open Access Journals (Sweden)

    Kaja Damnjanovic

    2016-11-01

    Full Text Available Research on the framing effect in risky choice has mostly used tasks that examine the effect of a single probability or risk level on the choice between non-risky and risky options. The present research examined the framing effect as a function of the probability level of the risky option's outcome in three decision-making domains: health, money and human lives. It confirmed that the decision-making domain moderates the framing effect. In the monetary domain, general risk aversion was confirmed, as registered in earlier research. At high probability levels, the framing effect is registered in both frames, while no framing effect is registered at lower probability levels. In the domain of decision-making about human lives, the framing effect is registered at medium-high and medium-low probability levels. In the domain of decision-making about health, the framing effect is registered across almost the entire probability range, and this domain differs from the former two. The results show that the attitude to risk is not identical at different probability levels, that the dynamics of the attitude to risk influence the framing effect, and that the framing effect pattern differs across decision-making domains. In other words, the linguistic manipulation representing the frame in the tasks affects the change in preference order only when the possibility of gain (expressed as a probability) is estimated to be sufficiently high.

  13. Inferior Frontal Sensitivity to Common Speech Sounds Is Amplified by Increasing Word Intelligibility

    Science.gov (United States)

    Vaden, Kenneth I., Jr.; Kuchinsky, Stefanie E.; Keren, Noam I.; Harris, Kelly C.; Ahlstrom, Jayne B.; Dubno, Judy R.; Eckert, Mark A.

    2011-01-01

    The left inferior frontal gyrus (LIFG) exhibits increased responsiveness when people listen to words composed of speech sounds that frequently co-occur in the English language (Vaden, Piquado, & Hickok, 2011), termed high phonotactic frequency (Vitevitch & Luce, 1998). The current experiment aimed to further characterize the relation of…

  14. Probability and containment of turbine missiles

    International Nuclear Information System (INIS)

    Yeh, G.C.K.

    1976-01-01

    With the trend toward ever larger power generating plants with large high-speed turbines, an important plant design consideration is the potential for and consequences of mechanical failure of turbine rotors. Such rotor failure could result in high-velocity disc fragments (turbine missiles) perforating the turbine casing and jeopardizing vital plant systems. The designer must first estimate the probability of any turbine missile damaging any safety-related plant component for his turbine and his plant arrangement. If the probability is not low enough to be acceptable to the regulatory agency, he must design a shield to contain the postulated turbine missiles. Alternatively, the shield could be designed to retard (to reduce the velocity of) the missiles such that they would not damage any vital plant system. In this paper, some of the presently available references that can be used to evaluate the probability, containment and retardation of turbine missiles are reviewed; various alternative methods are compared; and subjects for future research are recommended. (Auth.)

  15. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    International Nuclear Information System (INIS)

    Vourdas, A.

    2014-01-01

    The orthocomplemented modular lattice of subspaces L[H(d)], of a quantum system with d-dimensional Hilbert space H(d), is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H₁, H₂), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H₁), P(H₂) onto the subspaces H₁, H₂. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin 1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities

  16. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  17. Failure probability analysis of optical grid

    Science.gov (United States)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, the integrated computing environment based on optical networks, is expected to be an efficient infrastructure to support advanced data-intensive grid applications. In optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. As optical-network-based distributed computing systems are extensively applied to data processing, the application failure probability has become an important indicator of application quality and an important aspect for operators to consider. This paper presents a task-based method for analyzing the application failure probability in optical grid. The failure probability of the entire application can then be quantified, and the effectiveness of different backup strategies in reducing it can be compared, so that the different requirements of different clients on application failure probability can be satisfied. In optical grid, when a DAG-based (directed acyclic graph) application is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm can guarantee failure probability requirements, improve network resource utilization, and realize a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in optical grid.
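
    A minimal sketch of a task-based application failure probability calculation of the kind described above; the per-resource failure probabilities, independence assumption and backup strategy are illustrative, not the MDSA algorithm itself.

    ```python
    # Hedged sketch: an application (DAG of tasks) fails if any task fails on both its
    # primary resource and all of its backups; backups lower the application failure probability.
    from math import prod

    def task_failure(primary_p, backup_ps):
        """A task fails only if the primary and every assigned backup resource fail."""
        return primary_p * prod(backup_ps)          # prod of an empty list is 1 (no backups)

    def application_failure(tasks):
        """The application fails if any of its tasks fails (tasks assumed independent)."""
        return 1.0 - prod(1.0 - task_failure(p, b) for p, b in tasks)

    # (primary failure probability, [backup failure probabilities]) for each task of the DAG
    no_backup   = [(1e-3, []), (2e-3, []), (5e-4, [])]
    with_backup = [(1e-3, [1e-3]), (2e-3, [2e-3]), (5e-4, [])]
    print(f"no backups: {application_failure(no_backup):.2e}, "
          f"one backup on two tasks: {application_failure(with_backup):.2e}")
    ```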

  18. Robust estimation of the expected survival probabilities from high-dimensional Cox models with biomarker-by-treatment interactions in randomized clinical trials

    Directory of Open Access Journals (Sweden)

    Nils Ternès

    2017-05-01

    Full Text Available Abstract Background Thanks to advances in genomics and targeted treatments, more and more prediction models based on biomarkers are being developed to predict potential benefit from treatments in a randomized clinical trial. Although the methodological framework for the development and validation of prediction models in a high-dimensional setting is becoming increasingly well established, no clear guidance exists yet on how to estimate expected survival probabilities in a penalized model with biomarker-by-treatment interactions. Methods Based on a parsimonious biomarker selection in a penalized high-dimensional Cox model (lasso or adaptive lasso), we propose a unified framework to: estimate internally the predictive accuracy metrics of the developed model (using double cross-validation); estimate the individual survival probabilities at a given timepoint; construct confidence intervals thereof (analytical or bootstrap); and visualize them graphically (pointwise or smoothed with splines). We compared these strategies through a simulation study covering scenarios with or without biomarker effects. We applied the strategies to a large randomized phase III clinical trial that evaluated the effect of adding trastuzumab to chemotherapy in 1574 early breast cancer patients, for which the expression of 462 genes was measured. Results In our simulations, penalized regression models using the adaptive lasso estimated the survival probability of new patients with low bias and standard error; bootstrapped confidence intervals had empirical coverage probability close to the nominal level across very different scenarios. The double cross-validation performed on the training data set closely mimicked the predictive accuracy of the selected models in external validation data. We also propose a useful visual representation of the expected survival probabilities using splines. In the breast cancer trial, the adaptive lasso penalty selected a prediction model with 4

  19. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, the binomial theorem, probability distributions, means, standard deviations, the probability function of the binomial distribution, and more. Includes 360 problems, with answers to half of them.

  20. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.

  1. Determining probabilities of geologic events and processes

    International Nuclear Information System (INIS)

    Hunter, R.L.; Mann, C.J.; Cranwell, R.M.

    1985-01-01

    The Environmental Protection Agency has recently published a probabilistic standard for releases of high-level radioactive waste from a mined geologic repository. The standard sets limits for contaminant releases with more than one chance in 100 of occurring within 10,000 years, and less strict limits for releases of lower probability. The standard offers no methods for determining probabilities of geologic events and processes, and no consensus exists in the waste-management community on how to do this. Sandia National Laboratories is developing a general method for determining probabilities of a given set of geologic events and processes. In addition, we will develop a repeatable method for dealing with events and processes whose probability cannot be determined. 22 refs., 4 figs

  2. Serial follow up V/P scanning in assessment of treatment response in high probability scans for pulmonary embolism

    Energy Technology Data Exchange (ETDEWEB)

    Moustafa, H; Elhaddad, SH; Wagih, SH; Ziada, G; Samy, A; Saber, R [Department of Nuclear Medicine and Radiology, Faculty of Medicine, Cairo University, Cairo (Egypt)]

    1995-10-01

    138 patients shown by V/P scanning to have different probabilities of a pulmonary embolic event underwent serial follow-up scanning after 3 days, 2 weeks, 1 month and 3 months, with anticoagulant therapy. Of the remaining 10 patients, 6 died with P.E. documented by post-mortem study, and 4 were lost to follow-up. Complete response, with disappearance of all perfusion defects after 2 weeks, was detected in 37 patients (49.3%); partial improvement of lesions after 3 months was elicited in 32%. The overall incidence of response was 81.3%; the response was complete in the low probability group (100%), 84.2% in the intermediate group and 79.3% in the high probability group, with partial response in 45.3%. New lesions were evident in 18.7% of this series. In conclusion, serial follow-up V/P scanning is mandatory for the evaluation of response to anticoagulant therapy, especially in the first 3 months. 2 figs., 3 tabs.

  3. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criteria. The problem is solved in a three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are then determined such that the probability of exceeding the vibration criteria VC-E and VC-D is kept below 0.04.
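
    A minimal sketch of the underlying Gaussian calculation, assuming a zero-mean Gaussian relative displacement whose standard deviation depends on damping and natural frequency; the numbers are illustrative, not the actual VC-D/VC-E criteria:

```python
from scipy.stats import norm

# Probability that a zero-mean Gaussian displacement with standard deviation sigma
# stays within a symmetric vibration criterion vc (both in the same units).
def prob_within_criterion(sigma, vc):
    return norm.cdf(vc / sigma) - norm.cdf(-vc / sigma)

sigma = 1.0    # hypothetical response standard deviation for one (damping, frequency) pair
vc = 3.0       # hypothetical vibration criterion
print(1.0 - prob_within_criterion(sigma, vc))   # probability of exceeding the criterion (~0.0027 here)
```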

  4. Measurements of atomic transition probabilities in highly ionized atoms by fast ion beams

    International Nuclear Information System (INIS)

    Martinson, I.; Curtis, L.J.; Lindgaerd, A.

    1977-01-01

    A summary is given of the beam-foil method by which level lifetimes and transition probabilities can be determined in atoms and ions. Results are presented for systems of particular interest for fusion research, such as the Li, Be, Na, Mg, Cu and Zn isoelectronic sequences. The available experimental material is compared to theoretical transition probabilities. (author)

  5. Oil spill contamination probability in the southeastern Levantine basin.

    Science.gov (United States)

    Goldman, Ron; Biton, Eli; Brokovich, Eran; Kark, Salit; Levin, Noam

    2015-02-15

    Recent gas discoveries in the eastern Mediterranean Sea led to multiple operations with substantial economic interest, and with them there is a risk of oil spills and their potential environmental impacts. To examine the potential spatial distribution of this threat, we created seasonal maps of the probability of oil spill pollution reaching an area in the Israeli coastal and exclusive economic zones, given knowledge of its initial sources. We performed simulations of virtual oil spills using realistic atmospheric and oceanic conditions. The resulting maps show dominance of the alongshore northerly current, which causes the high probability areas to be stretched parallel to the coast, increasing contamination probability downstream of source points. The seasonal westerly wind forcing determines how wide the high probability areas are, and may also restrict these to a small coastal region near source points. Seasonal variability in probability distribution, oil state, and pollution time is also discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. Modulation of Brain Activity during Phonological Familiarization

    Science.gov (United States)

    Majerus, S.; Van der Linden, M.; Collette, F.; Laureys, S.; Poncelet, M.; Degueldre, C.; Delfiore, G.; Luxen, A.; Salmon, E.

    2005-01-01

    We measured brain activity in 12 adults for the repetition of auditorily presented words and nonwords, before and after repeated exposure to their phonological form. The nonword phoneme combinations were either of high (HF) or low (LF) phonotactic frequency. After familiarization, we observed, for both word and nonword conditions, decreased…

  7. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of an observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  8. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned

  9. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  10. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  11. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
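
    The paper provides sample R code; as a hedged Python analogue (not the authors' implementation), the same idea of reading individual class probabilities off consistent learners looks roughly like this, with a stand-in data set:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Binary outcome data standing in for the clinical examples discussed in the paper.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Both a random forest and a nearest-neighbour learner expose per-subject probabilities
# through predict_proba, which is the "probability machine" use of these learners.
for model in (RandomForestClassifier(n_estimators=500, random_state=0),
              KNeighborsClassifier(n_neighbors=25)):
    model.fit(X_train, y_train)
    print(type(model).__name__, model.predict_proba(X_test[:3])[:, 1])
```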

  12. Hierarchical Decompositions for the Computation of High-Dimensional Multivariate Normal Probabilities

    KAUST Repository

    Genton, Marc G.

    2017-09-07

    We present a hierarchical decomposition scheme for computing the n-dimensional integral of multivariate normal probabilities that appear frequently in statistics. The scheme exploits the fact that the formally dense covariance matrix can be approximated by a matrix with a hierarchical low rank structure. It allows the reduction of the computational complexity per Monte Carlo sample from O(n²) to O(mn + kn log(n/m)), where k is the numerical rank of off-diagonal matrix blocks and m is the size of small diagonal blocks in the matrix that are not well-approximated by low rank factorizations and treated as dense submatrices. This hierarchical decomposition leads to substantial efficiencies in multivariate normal probability computations and allows integrations in thousands of dimensions to be practical on modern workstations.
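
    A plain Monte Carlo sketch of the quantity being computed, without the hierarchical low-rank machinery that is the paper's actual contribution; the covariance matrix and integration box below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5                                        # kept small for illustration
A = rng.standard_normal((n, n))
Sigma = A @ A.T + n * np.eye(n)              # a synthetic positive-definite covariance
L = np.linalg.cholesky(Sigma)

# Estimate P(lower <= X <= upper) for X ~ N(0, Sigma) by sampling.
samples = rng.standard_normal((200_000, n)) @ L.T
lower, upper = -np.ones(n), np.ones(n)
inside = np.all((samples >= lower) & (samples <= upper), axis=1)
print(inside.mean())                         # Monte Carlo estimate of the box probability
```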

  13. Hierarchical Decompositions for the Computation of High-Dimensional Multivariate Normal Probabilities

    KAUST Repository

    Genton, Marc G.; Keyes, David E.; Turkiyyah, George

    2017-01-01

    We present a hierarchical decomposition scheme for computing the n-dimensional integral of multivariate normal probabilities that appear frequently in statistics. The scheme exploits the fact that the formally dense covariance matrix can be approximated by a matrix with a hierarchical low rank structure. It allows the reduction of the computational complexity per Monte Carlo sample from O(n²) to O(mn + kn log(n/m)), where k is the numerical rank of off-diagonal matrix blocks and m is the size of small diagonal blocks in the matrix that are not well-approximated by low rank factorizations and treated as dense submatrices. This hierarchical decomposition leads to substantial efficiencies in multivariate normal probability computations and allows integrations in thousands of dimensions to be practical on modern workstations.

  14. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  15. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    Science.gov (United States)

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the

  16. Assessment of local variability by high-throughput e-beam metrology for prediction of patterning defect probabilities

    Science.gov (United States)

    Wang, Fuming; Hunsche, Stefan; Anunciado, Roy; Corradi, Antonio; Tien, Hung Yu; Tang, Peng; Wei, Junwei; Wang, Yongjun; Fang, Wei; Wong, Patrick; van Oosten, Anton; van Ingen Schenau, Koen; Slachter, Bram

    2018-03-01

    We present an experimental study of pattern variability and defectivity, based on a large data set with more than 112 million SEM measurements from an HMI high-throughput e-beam tool. The test case is a 10nm node SRAM via array patterned with a DUV immersion LELE process, where we see a variation in mean size and litho sensitivities between different unique via patterns that leads to seemingly qualitative differences in defectivity. The large available data volume enables further analysis to reliably distinguish global and local CDU variations, including a breakdown into local systematics and stochastics. A closer inspection of the tail end of the distributions and estimation of defect probabilities concludes that there is a common defect mechanism and defect threshold despite the observed differences of specific pattern characteristics. We expect that the analysis methodology can be applied for defect probability modeling as well as general process qualification in the future.
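
    A toy sketch (not the study's analysis) of a tail-based defect-probability estimate, assuming a Gaussian fit to local CD measurements; all numbers are invented:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
cd_nm = rng.normal(loc=20.0, scale=1.5, size=100_000)   # simulated local CD measurements (nm)
mu, sigma = cd_nm.mean(), cd_nm.std(ddof=1)

threshold = 14.0                                         # hypothetical failure (e.g. missing-via) threshold
print(norm.cdf(threshold, loc=mu, scale=sigma))          # estimated per-feature defect probability
```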

  17. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.

  18. Why segmentation matters: experience-driven segmentation errors impair “morpheme” learning

    Science.gov (United States)

    Finn, Amy S.; Hudson Kam, Carla L.

    2015-01-01

    We ask whether an adult learner’s knowledge of their native language impedes statistical learning in a new language beyond just word segmentation (as previously shown). In particular, we examine the impact of native-language word-form phonotactics on learners’ ability to segment words into their component morphemes and learn phonologically triggered variation of morphemes. We find that learning is impaired when words and component morphemes are structured to conflict with a learner’s native-language phonotactic system, but not when native-language phonotactics do not conflict with morpheme boundaries in the artificial language. A learner’s native-language knowledge can therefore have a cascading impact affecting word segmentation and the morphological variation that relies upon proper segmentation. These results show that getting word segmentation right early in learning is deeply important for learning other aspects of language, even those (morphology) that are known to pose a great difficulty for adult language learners. PMID:25730305

  19. Why segmentation matters: Experience-driven segmentation errors impair "morpheme" learning.

    Science.gov (United States)

    Finn, Amy S; Hudson Kam, Carla L

    2015-09-01

    We ask whether an adult learner's knowledge of their native language impedes statistical learning in a new language beyond just word segmentation (as previously shown). In particular, we examine the impact of native-language word-form phonotactics on learners' ability to segment words into their component morphemes and learn phonologically triggered variation of morphemes. We find that learning is impaired when words and component morphemes are structured to conflict with a learner's native-language phonotactic system, but not when native-language phonotactics do not conflict with morpheme boundaries in the artificial language. A learner's native-language knowledge can therefore have a cascading impact affecting word segmentation and the morphological variation that relies upon proper segmentation. These results show that getting word segmentation right early in learning is deeply important for learning other aspects of language, even those (morphology) that are known to pose a great difficulty for adult language learners. (c) 2015 APA, all rights reserved.

  20. Does probability of occurrence relate to population dynamics?

    Science.gov (United States)

    Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M; Edwards, Thomas C; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E; Zurell, Damaris; Schurr, Frank M

    2014-12-01

    Hutchinson defined species' realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In terms of demography, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species' niches and ranges and to develop reliable mechanistic models of species geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (Pocc). We hypothesized that these relationships vary with species' competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, Western US, France and Switzerland). We used published information of shade tolerance as indicators of light competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with Pocc, while N, and for most regions K, was generally positively correlated with Pocc. Thus, in temperate forest trees the regions of highest occurrence

  1. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  2. Probability shapes perceptual precision: A study in orientation estimation.

    Science.gov (United States)

    Jabar, Syaheed B; Anderson, Britt

    2015-12-01

    Probability is known to affect perceptual estimations, but an understanding of mechanisms is lacking. Moving beyond binary classification tasks, we had naive participants report the orientation of briefly viewed gratings where we systematically manipulated contingent probability. Participants rapidly developed faster and more precise estimations for high-probability tilts. The shapes of their error distributions, as indexed by a kurtosis measure, also showed a distortion from Gaussian. This kurtosis metric was robust, capturing probability effects that were graded, contextual, and varying as a function of stimulus orientation. Our data can be understood as a probability-induced reduction in the variability or "shape" of estimation errors, as would be expected if probability affects the perceptual representations. As probability manipulations are an implicit component of many endogenous cuing paradigms, changes at the perceptual level could account for changes in performance that might have traditionally been ascribed to "attention." (c) 2015 APA, all rights reserved.
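
    A small sketch of the kurtosis index used to flag non-Gaussian error shapes, with invented error samples standing in for the orientation-estimation data:

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(4)
gaussian_errors = rng.normal(0, 10, 5000)                       # baseline: Gaussian estimation errors
mostly_small = rng.random(5000) < 0.8                           # high-probability tilts: mostly small errors...
peaked_errors = np.where(mostly_small, rng.normal(0, 3, 5000),  # ...with occasional large misses
                         rng.normal(0, 25, 5000))

print(kurtosis(gaussian_errors))   # excess kurtosis near 0 for a Gaussian error distribution
print(kurtosis(peaked_errors))     # clearly positive, i.e. a peaked, heavy-tailed error shape
```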

  3. Trial type probability modulates the cost of antisaccades

    Science.gov (United States)

    Chiau, Hui-Yan; Tseng, Philip; Su, Jia-Han; Tzeng, Ovid J. L.; Hung, Daisy L.; Muggleton, Neil G.

    2011-01-01

    The antisaccade task, where eye movements are made away from a target, has been used to investigate the flexibility of cognitive control of behavior. Antisaccades usually have longer saccade latencies than prosaccades, the so-called antisaccade cost. Recent studies have shown that this antisaccade cost can be modulated by event probability. This may mean that the antisaccade cost can be reduced, or even reversed, if the probability of surrounding events favors the execution of antisaccades. The probabilities of prosaccades and antisaccades were systematically manipulated by changing the proportion of a certain type of trial in an interleaved pro/antisaccades task. We aimed to disentangle the intertwined relationship between trial type probabilities and the antisaccade cost with the ultimate goal of elucidating how probabilities of trial types modulate human flexible behaviors, as well as the characteristics of such modulation effects. To this end, we examined whether implicit trial type probability can influence saccade latencies and also manipulated the difficulty of cue discriminability to see how effects of trial type probability would change when the demand on visual perceptual analysis was high or low. A mixed-effects model was applied to the analysis to dissect the factors contributing to the modulation effects of trial type probabilities. Our results suggest that the trial type probability is one robust determinant of antisaccade cost. These findings highlight the importance of implicit probability in the flexibility of cognitive control of behavior. PMID:21543748

  4. Internal Medicine residents use heuristics to estimate disease probability.

    Science.gov (United States)

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Training in Bayesian reasoning may have limited impact on accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. We randomized 55 Internal Medicine residents to different versions of four clinical vignettes and asked them to estimate probabilities of target conditions. We manipulated the clinical data for each vignette to be consistent with either 1) using a representative heuristic, by adding non-discriminating prototypical clinical features of the target condition, or 2) using anchoring with adjustment heuristic, by providing a high or low anchor for the target condition. When presented with additional non-discriminating data the odds of diagnosing the target condition were increased (odds ratio (OR) 2.83, 95% confidence interval [1.30, 6.15], p = 0.009). Similarly, the odds of diagnosing the target condition were increased when a high anchor preceded the vignette (OR 2.04, [1.09, 3.81], p = 0.025). Our findings suggest that despite previous exposure to the use of Bayesian reasoning, residents use heuristics, such as the representative heuristic and anchoring with adjustment, to estimate probabilities. Potential reasons for attribute substitution include the relative cognitive ease of heuristics compared with Bayesian reasoning, or the possibility that residents in their clinical practice use gist traces rather than precise probability estimates when diagnosing.
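
    For reference, the Bayesian updating the residents had been trained on can be written in a few lines; the pre-test probabilities and likelihood ratios below are illustrative, not those used in the vignettes:

```python
# Convert a pre-test probability and a likelihood ratio into a post-test probability via odds.
def post_test_probability(pre_test_p, likelihood_ratio):
    pre_odds = pre_test_p / (1 - pre_test_p)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# A non-discriminating feature has a likelihood ratio close to 1 and should barely move
# the estimate, which is exactly what the heuristic-driven responses failed to respect.
print(post_test_probability(0.20, 1.0))   # stays ~0.20
print(post_test_probability(0.20, 4.0))   # rises to ~0.50
```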

  5. Internal Medicine residents use heuristics to estimate disease probability

    OpenAIRE

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Background: Training in Bayesian reasoning may have limited impact on accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. Method: We randomized 55 In...

  6. Semantic and associative factors in probability learning with words.

    Science.gov (United States)

    Schipper, L M; Hanson, B L; Taylor, G; Thorpe, J A

    1973-09-01

    Using a probability-learning technique with a single word as the cue and with the probability of a given event following this word fixed at .80, it was found that (1) neither high nor low associates to the original word and (2) neither synonyms nor antonyms showed differential learning curves subsequent to original learning when the probability of the following event was shifted to .20. In a second study, when feedback in the form of knowledge of results was withheld, there was a clear-cut similarity of predictions to the originally trained word and the synonyms of both high and low association value, and a dissimilarity of these words to a set of antonyms of both high and low association value. Two additional studies confirmed the importance of the semantic dimension as compared with association value as traditionally measured.

  7. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields a higher probability of detection than ranking by classical probability provided a given probability of ...

  8. Addendum to ‘Understanding risks in the light of uncertainty: low-probability, high-impact coastal events in cities’

    Science.gov (United States)

    Galarraga, Ibon; Sainz de Murieta, Elisa; Markandya, Anil; María Abadie, Luis

    2018-02-01

    This addendum adds to the analysis presented in ‘Understanding risks in the light of uncertainty: low-probability, high-impact coastal events in cities’ Abadie et al (2017 Environ. Res. Lett. 12 014017). We propose to use the framework developed earlier to enhance communication and understanding of risks, with the aim of bridging the gap between highly technical risk management discussion to the public risk aversion debate. We also propose that the framework could be used for stress-testing resilience.

  9. Phonological Concept Learning.

    Science.gov (United States)

    Moreton, Elliott; Pater, Joe; Pertsova, Katya

    2017-01-01

    Linguistic and non-linguistic pattern learning have been studied separately, but we argue for a comparative approach. Analogous inductive problems arise in phonological and visual pattern learning. Evidence from three experiments shows that human learners can solve them in analogous ways, and that human performance in both cases can be captured by the same models. We test GMECCS (Gradual Maximum Entropy with a Conjunctive Constraint Schema), an implementation of the Configural Cue Model (Gluck & Bower, ) in a Maximum Entropy phonotactic-learning framework (Goldwater & Johnson, ; Hayes & Wilson, ) with a single free parameter, against the alternative hypothesis that learners seek featurally simple algebraic rules ("rule-seeking"). We study the full typology of patterns introduced by Shepard, Hovland, and Jenkins () ("SHJ"), instantiated as both phonotactic patterns and visual analogs, using unsupervised training. Unlike SHJ, Experiments 1 and 2 found that both phonotactic and visual patterns that depended on fewer features could be more difficult than those that depended on more features, as predicted by GMECCS but not by rule-seeking. GMECCS also correctly predicted performance differences between stimulus subclasses within each pattern. A third experiment tried supervised training (which can facilitate rule-seeking in visual learning) to elicit simple rule-seeking phonotactic learning, but cue-based behavior persisted. We conclude that similar cue-based cognitive processes are available for phonological and visual concept learning, and hence that studying either kind of learning can lead to significant insights about the other. Copyright © 2015 Cognitive Science Society, Inc.

  10. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    Full Text Available The purpose of this article is to give a new approach to calculating the probability of returning a loan. A lot of factors affect the value of this probability. In this article, the influence of some factors is demonstrated by using statistical and econometric models. The main approach is concerned with applying probit and logit models in loan management institutions. A new aspect of credit risk analysis is given. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable “the probability of returning a loan”. It is proved that the month of signing a contract, the year of signing a contract, the gender and the age of the loan owner do not affect the probability of returning a loan. It is proved that the probability of returning a loan depends on the sum of the contract, the remoteness of the loan owner and the month of birth. The probability of returning a loan increases with the increase of the given sum, decreases with the proximity of the customer, increases for people born at the beginning of the year and decreases for people born at the end of the year.
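
    A hedged sketch of the binary-logit idea on synthetic data; the variable names mirror the fields mentioned in the abstract, but the data and coefficients are invented, not the article's:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000
sum_given   = rng.uniform(1_000, 20_000, n)       # contract sum
remoteness  = rng.uniform(0, 300, n)              # distance of the borrower, km
birth_month = rng.integers(1, 13, n)              # month of birth

# Synthetic "true" relationship, used only to generate outcomes for this example.
logit = -1.0 + 0.0002 * sum_given - 0.004 * remoteness - 0.05 * birth_month
returned = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([sum_given, remoteness, birth_month])
model = LogisticRegression(max_iter=1000).fit(X, returned)
print(model.predict_proba([[10_000, 50, 3]])[0, 1])   # estimated probability of returning the loan
```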

  11. Posterior Probability Matching and Human Perceptual Decision Making.

    Directory of Open Access Journals (Sweden)

    Richard F Murray

    2015-06-01

    Full Text Available Probability matching is a classic theory of decision making that was first developed in models of cognition. Posterior probability matching, a variant in which observers match their response probabilities to the posterior probability of each response being correct, is being used increasingly often in models of perception. However, little is known about whether posterior probability matching is consistent with the vast literature on vision and hearing that has developed within signal detection theory. Here we test posterior probability matching models using two tools from detection theory. First, we examine the models' performance in a two-pass experiment, where each block of trials is presented twice, and we measure the proportion of times that the model gives the same response twice to repeated stimuli. We show that at low performance levels, posterior probability matching models give highly inconsistent responses across repeated presentations of identical trials. We find that practised human observers are more consistent across repeated trials than these models predict, and we find some evidence that less practised observers are more consistent as well. Second, we compare the performance of posterior probability matching models on a discrimination task to the performance of a theoretical ideal observer that achieves the best possible performance. We find that posterior probability matching is very inefficient at low-to-moderate performance levels, and that human observers can be more efficient than is ever possible according to posterior probability matching models. These findings support classic signal detection models, and rule out a broad class of posterior probability matching models for expert performance on perceptual tasks that range in complexity from contrast discrimination to symmetry detection. However, our findings leave open the possibility that inexperienced observers may show posterior probability matching behaviour, and our methods
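
    A minimal simulation of the two-pass consistency argument, with invented posterior probabilities in a low-performance regime:

```python
import numpy as np

rng = np.random.default_rng(2)
posteriors = rng.uniform(0.5, 0.7, 10_000)        # low performance: posteriors only slightly above chance

def agreement(observer):
    """Proportion of identical responses across two passes through the same trials."""
    return np.mean(observer(posteriors) == observer(posteriors))

matching = lambda p: rng.random(p.size) < p        # posterior probability matching: respond stochastically
map_rule = lambda p: p > 0.5                       # deterministic rule: always pick the more probable response

print("probability matching:", agreement(matching))   # well below 1 at low performance levels
print("deterministic (MAP):", agreement(map_rule))    # exactly 1 on repeated identical trials
```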

  12. Developing a probability-based model of aquifer vulnerability in an agricultural region

    Science.gov (United States)

    Chen, Shih-Kai; Jang, Cheng-Shin; Peng, Yi-Huei

    2013-04-01

    Summary Hydrogeological settings of aquifers strongly influence regional groundwater movement and pollution processes. Establishing a map of aquifer vulnerability is critical for planning a scheme of groundwater quality protection. This study developed a novel probability-based DRASTIC model of aquifer vulnerability in the Choushui River alluvial fan, Taiwan, using indicator kriging, and determined various risk categories of contamination potential based on estimated vulnerability indexes. Categories and ratings of six parameters in the probability-based DRASTIC model were probabilistically characterized according to the parameter classification methods of selecting a maximum estimation probability and calculating an expected value. Moreover, the probability-based estimation and assessment gave an excellent insight into propagating the uncertainty of parameters due to limited observation data. To examine the capacity of the developed probability-based DRASTIC model to predict pollution, medium, high, and very high risk categories of contamination potential were compared with observed nitrate-N exceeding 0.5 mg/L, indicating anthropogenic groundwater pollution. The results reveal that the developed probability-based DRASTIC model is capable of predicting high nitrate-N groundwater pollution and characterizing the parameter uncertainty via the probability estimation processes.
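
    A toy sketch of a DRASTIC-style vulnerability index as a weighted sum of parameter ratings; the weights and ratings are illustrative, and the indicator-kriging, probability-based characterization developed in the study is omitted:

```python
# One grid cell's vulnerability index: sum of (parameter weight x parameter rating).
weights = {"depth": 5, "recharge": 4, "aquifer": 3, "soil": 2, "topography": 1, "conductivity": 3}
ratings = {"depth": 7, "recharge": 6, "aquifer": 8, "soil": 5, "topography": 9, "conductivity": 4}

index = sum(weights[k] * ratings[k] for k in weights)
print(index)   # a higher index maps to a higher contamination-potential risk category
```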

  13. Highly enhanced avalanche probability using sinusoidally-gated silicon avalanche photodiode

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Shingo; Namekata, Naoto, E-mail: nnao@phys.cst.nihon-u.ac.jp; Inoue, Shuichiro [Institute of Quantum Science, Nihon University, 1-8-14 Kanda-Surugadai, Chiyoda-ku, Tokyo 101-8308 (Japan); Tsujino, Kenji [Tokyo Women's Medical University, 8-1 Kawada-cho, Shinjuku-ku, Tokyo 162-8666 (Japan)]

    2014-01-27

    We report on visible light single photon detection using a sinusoidally-gated silicon avalanche photodiode. Detection efficiency of 70.6% was achieved at a wavelength of 520 nm when an electrically cooled silicon avalanche photodiode with a quantum efficiency of 72.4% was used, which implies that a photo-excited single charge carrier in a silicon avalanche photodiode can trigger a detectable avalanche (charge) signal with a probability of 97.6%.
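
    A quick arithmetic check of the relation implied by the abstract, namely detection efficiency = quantum efficiency x avalanche (trigger) probability:

```python
quantum_efficiency = 0.724      # of the silicon APD at 520 nm, as quoted
avalanche_probability = 0.976   # probability that a photo-excited carrier triggers a detectable avalanche
print(quantum_efficiency * avalanche_probability)   # ~0.706, i.e. the reported 70.6% detection efficiency
```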

  14. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Full Text Available Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers’ theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises’ exceptionally restrictive definition of probability. This paper challenges Richard von Mises’ definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that the nature of human action and the relative frequency method for calculating numerical probabilities both presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  15. Skin damage probabilities using fixation materials in high-energy photon beams

    International Nuclear Information System (INIS)

    Carl, J.; Vestergaard, A.

    2000-01-01

    Patient fixation, such as thermoplastic masks, carbon-fibre support plates and polystyrene bead vacuum cradles, is used to reproduce patient positioning in radiotherapy. Consequently, low-density materials may be introduced into high-energy photon beams. The aim of this study was to measure the increase in skin dose when low-density materials are present and to calculate the radiobiological consequences in terms of probabilities of early and late skin damage. An experimental thin-windowed plane-parallel ion chamber was used. Skin doses were measured using various overlying low-density fixation materials. A fixed geometry of a 10 x 10 cm field, a SSD = 100 cm and photon energies of 4, 6 and 10 MV on Varian Clinac 2100C accelerators were used for all measurements. Radiobiological consequences of introducing these materials into the high-energy photon beams were evaluated in terms of early and late damage of the skin, based on the measured surface doses and the LQ model. The experimental ion chamber gave results consistent with other studies. A relationship between skin dose and material thickness in mg/cm² was established and used to calculate skin doses in scenarios assuming radiotherapy treatment with opposed fields. Conventional radiotherapy may apply mid-point doses of up to 60-66 Gy in daily 2-Gy fractions with opposed fields. Using thermoplastic fixation and high-energy photons as low as 4 MV does increase the dose to the skin considerably. However, using thermoplastic materials with a thickness of less than 100 mg/cm², skin doses are comparable with those produced by variation in source-to-skin distance, field size or blocking trays within clinical treatment set-ups. The use of polystyrene cradles and carbon-fibre materials with thickness less than 100 mg/cm² should be avoided at 4 MV at doses above 54-60 Gy. (author)
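
    A minimal sketch of the LQ-model bookkeeping behind such damage estimates, using the biologically effective dose BED = n·d·(1 + d/(α/β)); the α/β values are generic textbook ones and the skin-dose fraction is hypothetical, not taken from the measurements:

```python
# Biologically effective dose of a fractionated schedule under the LQ model.
def bed(n_fractions, dose_per_fraction, alpha_beta):
    return n_fractions * dose_per_fraction * (1 + dose_per_fraction / alpha_beta)

skin_fraction = 0.7                 # hypothetical fraction of the 2 Gy midpoint dose reaching the skin
d_skin = 2.0 * skin_fraction        # skin dose per fraction (Gy)
print(bed(30, d_skin, alpha_beta=10.0))   # early (acute) skin reactions
print(bed(30, d_skin, alpha_beta=3.0))    # late skin damage
```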

  16. Does probability of occurrence relate to population dynamics?

    Science.gov (United States)

    Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H.; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M.; Edwards, Thomas C.; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E.; Zurell, Damaris; Schurr, Frank M.

    2014-01-01

    Hutchinson defined species' realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In terms of demography, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species' niches and ranges and to develop reliable mechanistic models of species geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (Pocc). We hypothesized that these relationships vary with species' competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, western USA, France and Switzerland). We used published information of shade tolerance as indicators of light competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with Pocc, while N, and for most regions K, was generally positively correlated with Pocc. Thus, in temperate forest trees the regions of highest occurrence

  17. Poor concordance of spiral CT (SCT) and high probability ventilation-perfusion (V/Q) studies in the diagnosis of pulmonary embolism (PE)

    International Nuclear Information System (INIS)

    Roman, M.R.; Angelides, S.; Chen, N.

    2000-01-01

    Full text: Despite its limitations, V/Q scintigraphy remains the favoured non-invasive technique for the diagnosis of pulmonary embolism (PE). PE is present in 85-90% and 30-40% of high and intermediate probability V/Q studies respectively. The value of spiral CT (SCT), a newer imaging modality, has yet to be determined. The aims of this study were to determine the frequency of positive SCT for PE in high and intermediate probability V/Q studies performed within 24 hr of each other. 15 patients (6M, 9F, mean age - 70.2) with a high probability study were included. Six (40%) SCT were reported as positive (four with emboli present in the main pulmonary arteries), seven as negative, one equivocal and one was technically sub-optimal. Pulmonary angiography was not performed in any patient. In all seven negative studies, the SCT was performed before the V/Q study. Of these, two studies were revised to positive once the result of the V/Q study was known, while three others had resolving mismatch V/Q defects on follow-up studies (performed 5-14 days later); two of these three also had a positive duplex scan of the lower limbs. One other was most likely due to chronic thromboembolic disease. Only three patients had a V/Q scan prior to the SCT; all were positive for PE on both imaging modalities. Of 26 patients (11M, 15F, mean age - 68.5) with an intermediate probability V/Q study, SCT was positive in only two (8%). Thus the low detection rate of PE by SCT in this, albeit small, series raises doubts as to its role in the diagnosis of PE. Copyright (2000) The Australian and New Zealand Society of Nuclear Medicine Inc

  18. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  19. The external costs of low probability-high consequence events: Ex ante damages and lay risks

    International Nuclear Information System (INIS)

    Krupnick, A.J.; Markandya, A.; Nickell, E.

    1994-01-01

    This paper provides an analytical basis for characterizing key differences between two perspectives on how to estimate the expected damages of low probability - high consequence events. One perspective is the conventional method used in the U.S.-EC fuel cycle reports [e.g., ORNL/RFF (1994a,b)]. This paper articulates another perspective, using economic theory. The paper makes a strong case for considering this approach as an alternative, or at least as a complement, to the conventional approach. This alternative approach is an important area for future research. Interest has been growing worldwide in embedding the external costs of productive activities, particularly the fuel cycles resulting in electricity generation, into prices. In any attempt to internalize these costs, one must take into account explicitly the remote but real possibilities of accidents and the wide gap between lay perceptions and expert assessments of such risks. In our fuel cycle analyses, we estimate damages and benefits by simply monetizing expected consequences, based on pollution dispersion models, exposure-response functions, and valuation functions. For accidents, such as mining and transportation accidents, natural gas pipeline accidents, and oil barge accidents, we use historical data to estimate the rates of these accidents. For extremely severe accidents--such as severe nuclear reactor accidents and catastrophic oil tanker spills--events are extremely rare and they do not offer a sufficient sample size to estimate their probabilities based on past occurrences. In those cases the conventional approach is to rely on expert judgments about both the probability of the consequences and their magnitude. As an example of standard practice, which we term here an expert expected damage (EED) approach to estimating damages, consider how evacuation costs are estimated in the nuclear fuel cycle report

  20. The external costs of low probability-high consequence events: Ex ante damages and lay risks

    Energy Technology Data Exchange (ETDEWEB)

    Krupnick, A J; Markandya, A; Nickell, E

    1994-07-01

    This paper provides an analytical basis for characterizing key differences between two perspectives on how to estimate the expected damages of low probability - high consequence events. One perspective is the conventional method used in the U.S.-EC fuel cycle reports [e.g., ORNL/RFF (1994a,b)]. This paper articulates another perspective, using economic theory. The paper makes a strong case for considering this approach as an alternative, or at least as a complement, to the conventional approach. This alternative approach is an important area for future research. Interest has been growing worldwide in embedding the external costs of productive activities, particularly the fuel cycles resulting in electricity generation, into prices. In any attempt to internalize these costs, one must take into account explicitly the remote but real possibilities of accidents and the wide gap between lay perceptions and expert assessments of such risks. In our fuel cycle analyses, we estimate damages and benefits by simply monetizing expected consequences, based on pollution dispersion models, exposure-response functions, and valuation functions. For accidents, such as mining and transportation accidents, natural gas pipeline accidents, and oil barge accidents, we use historical data to estimate the rates of these accidents. For extremely severe accidents--such as severe nuclear reactor accidents and catastrophic oil tanker spills--events are extremely rare and they do not offer a sufficient sample size to estimate their probabilities based on past occurrences. In those cases the conventional approach is to rely on expert judgments about both the probability of the consequences and their magnitude. As an example of standard practice, which we term here an expert expected damage (EED) approach to estimating damages, consider how evacuation costs are estimated in the nuclear fuel cycle report.

  1. An analytical calculation of neighbourhood order probabilities for high dimensional Poissonian processes and mean field models

    International Nuclear Information System (INIS)

    Tercariol, Cesar Augusto Sangaletti; Kiipper, Felipe de Moura; Martinez, Alexandre Souto

    2007-01-01

    Consider that the coordinates of N points are randomly generated along the edges of a d-dimensional hypercube (random point problem). The probability P_{m,n}^{(d,N)} that an arbitrary point is the m-th nearest neighbour to its own n-th nearest neighbour (the Cox probabilities) plays an important role in spatial statistics. Also, it has been useful in the description of physical processes in disordered media. Here we propose a simpler derivation of the Cox probabilities, where we stress the role played by the system dimensionality d. In the limit d → ∞, the distances between pairs of points become independent (random link model) and closed analytical forms for the neighbourhood probabilities are obtained both for the thermodynamic limit and for finite-size systems. Breaking the distance symmetry constraint drives us to the random map model, for which the Cox probabilities are obtained for two cases: whether a point is its own nearest neighbour or not
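
    A brute-force Monte Carlo sketch of the Cox probabilities for small d and N; the paper's contribution is the analytical derivation, which this does not reproduce, and the parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
d, N, trials = 2, 50, 2000
m, n = 1, 1                                    # e.g. the "mutual nearest neighbour" case

hits = 0
for _ in range(trials):
    pts = rng.random((N, d))                   # N random points in the d-dimensional unit hypercube
    dist = np.linalg.norm(pts[None, :, :] - pts[:, None, :], axis=-1)
    np.fill_diagonal(dist, np.inf)
    j = np.argsort(dist[0])[n - 1]             # the n-th nearest neighbour of point 0
    rank_of_0 = int(np.where(np.argsort(dist[j]) == 0)[0][0]) + 1
    hits += (rank_of_0 == m)

print(hits / trials)                           # Monte Carlo estimate of P_{m,n}^{(d,N)}
```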

  2. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  3. Conditional probability of intense rainfall producing high ground concentrations from radioactive plumes

    International Nuclear Information System (INIS)

    Wayland, J.R.

    1977-03-01

    The overlap of the expanding plume of radioactive material from a hypothetical nuclear accident with rainstorms over dense population areas is considered. The conditional probability of the occurrence of hot spots from intense cellular rainfall is presented

  4. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  5. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution," which is a function providing the probabilities of occurrence of different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied to statistical analysis.

  6. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  7. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  8. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  9. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  10. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  11. Subsequent investigation and management of patients with intermediate-category and -probability ventilation-perfusion scintigraphy

    International Nuclear Information System (INIS)

    Walsh, G.; Jones, D.N.

    2000-01-01

    The authors wished to determine the proportion of patients with intermediate-category and intermediate-probability ventilation-perfusion scintigraphy (IVQS) who proceed to further imaging for investigation of thromboembolism, to identify the defining clinical parameters and to determine the proportion of patients who have a definite imaging diagnosis of thromboembolism prior to discharge from hospital on anticoagulation therapy. One hundred and twelve VQS studies performed at the Flinders Medical Centre over a 9-month period were reported as having intermediate category and probability for pulmonary embolism. Medical case notes were available for review in 99 of these patients and from these the pretest clinical probability, subsequent patient progress and treatment were recorded. Eight cases were excluded because they were already receiving anticoagulation therapy. In the remaining 91 patients the pretest clinical probability was considered to be low in 25; intermediate in 30; and high in 36 cases. In total, 51.6% (n = 47) of these patients (8% (n = 2) with low, 66% (n = 20) with intermediate, and 69.4% (n = 25) with high pretest probability) proceeded to CT pulmonary angiography (CTPA) and/or lower limb duplex Doppler ultrasound (DUS) evaluation. Of the patients with IVQS results, 30.7% (n = 28) were evaluated with CTPA. No patient with a low, all patients with a high and 46% of patients with an intermediate pretest probability initially received anticoagulation therapy. This was discontinued in three patients with high and in 12 patients with intermediate clinical probability prior to discharge from hospital. Overall, 40% of patients discharged on anticoagulation therapy (including 39% of those with a high pretest probability) had a positive imaging diagnosis of thromboembolism. The results suggest that, although the majority of patients with intermediate-to-high pretest probability and IVQS proceed to further imaging investigation, CTPA is relatively underused in

  12. Probability of misclassifying biological elements in surface waters.

    Science.gov (United States)

    Loga, Małgorzata; Wierzchołowska-Dziedzic, Anna

    2017-11-24

    Measurement uncertainties are inherent to assessment of biological indices of water bodies. The effect of these uncertainties on the probability of misclassification of ecological status is the subject of this paper. Four Monte-Carlo (M-C) models were applied to simulate the occurrence of random errors in the measurements of metrics corresponding to four biological elements of surface waters: macrophytes, phytoplankton, phytobenthos, and benthic macroinvertebrates. Long series of error-prone measurement values of these metrics, generated by M-C models, were used to identify cases in which values of any of the four biological indices lay outside of the "true" water body class, i.e., outside the class assigned from the actual physical measurements. Fraction of such cases in the M-C generated series was used to estimate the probability of misclassification. The method is particularly useful for estimating the probability of misclassification of the ecological status of surface water bodies in the case of short sequences of measurements of biological indices. The results of the Monte-Carlo simulations show a relatively high sensitivity of this probability to measurement errors of the river macrophyte index (MIR) and high robustness to measurement errors of the benthic macroinvertebrate index (MMI). The proposed method of using Monte-Carlo models to estimate the probability of misclassification has significant potential for assessing the uncertainty of water body status reported to the EC by the EU member countries according to WFD. The method can be readily applied also in risk assessment of water management decisions before adopting the status dependent corrective actions.
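    The core Monte-Carlo idea described here can be sketched in a few lines; the index value, error standard deviation, and class boundaries below are hypothetical placeholders rather than the metrics or boundaries used in the paper.

        # Sketch of the Monte-Carlo idea: estimate the probability that measurement
        # error moves a biological index outside its "true" status class.
        # The index value, error standard deviation and class boundaries are
        # hypothetical, not values from the paper.
        import numpy as np

        def misclassification_probability(true_value, error_sd, class_bounds, n_sim=100_000, seed=0):
            rng = np.random.default_rng(seed)
            noisy = true_value + rng.normal(0.0, error_sd, size=n_sim)   # error-prone measurements
            true_class = np.searchsorted(class_bounds, true_value)       # class of the actual measurement
            simulated_class = np.searchsorted(class_bounds, noisy)
            return np.mean(simulated_class != true_class)

        # Five status classes separated by boundaries at 0.2, 0.4, 0.6, 0.8 (hypothetical):
        print(misclassification_probability(true_value=0.55, error_sd=0.05,
                                            class_bounds=[0.2, 0.4, 0.6, 0.8]))

    The closer the true value lies to a class boundary, and the larger the measurement error, the higher the estimated misclassification probability.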

  13. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  14. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  15. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  16. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  17. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...

  18. Scale-invariant transition probabilities in free word association trajectories

    Directory of Open Access Journals (Sweden)

    Martin Elias Costa

    2009-09-01

    Full Text Available Free-word association has been used as a vehicle to understand the organization of human thoughts. The original studies relied mainly on qualitative assertions, yielding the widely intuitive notion that trajectories of word associations are structured, yet considerably more random than organized linguistic text. Here we set out to determine a precise characterization of this space, generating a large number of word association trajectories in a web-implemented game. We embedded the trajectories in the graph of word co-occurrences from a linguistic corpus. To constrain possible transport models we measured the memory loss and the cycling probability. These two measures could not be reconciled by a bounded diffusive model, since the cycling probability was very high (16% of order-2 cycles), implying a majority of short-range associations, whereas the memory loss was very rapid (converging to the asymptotic value in ∼7 steps), which, in turn, forced a high fraction of long-range associations. We show that memory loss and cycling probabilities of free word association trajectories can be simultaneously accounted for by a model in which transitions are determined by a scale-invariant probability distribution.

  19. Unrelated Hematopoietic Stem Cell Donor Matching Probability and Search Algorithm

    Directory of Open Access Journals (Sweden)

    J.-M. Tiercy

    2012-01-01

    Full Text Available In transplantation of hematopoietic stem cells (HSCs) from unrelated donors, a high HLA compatibility level decreases the risk of acute graft-versus-host disease and mortality. The diversity of the HLA system at the allelic and haplotypic level and the heterogeneity of HLA typing data of the registered donors render the search process a complex task. This paper summarizes our experience with a search algorithm that includes at the start of the search a probability estimate (high/intermediate/low) to identify an HLA-A, B, C, DRB1, DQB1-compatible donor (a 10/10 match). Based on 2002–2011 searches, about 30% of patients have a high, 30% an intermediate, and 40% a low probability search. Search success rate and duration are presented and discussed in light of the experience of other centers. Overall, a 9–10/10 matched HSC donor can now be identified for 60–80% of patients of European descent. For high-probability searches, donors can be selected on the basis of DPB1-matching with an estimated success rate of >40%. For low-probability searches there is no consensus on which HLA incompatibilities are more permissive, although HLA-DQB1 mismatches are generally considered acceptable. Models for the discrimination of more detrimental mismatches based on specific amino acid residues rather than specific HLA alleles are presented.

  20. Jihadist Foreign Fighter Phenomenon in Western Europe: A Low-Probability, High-Impact Threat

    Directory of Open Access Journals (Sweden)

    Edwin Bakker

    2015-11-01

    Full Text Available The phenomenon of foreign fighters in Syria and Iraq is making headlines. Their involvement in the atrocities committed by terrorist groups such as the so-called "Islamic State" and Jabhat al-Nusra has caused grave concern and public outcry in the foreign fighters' European countries of origin. While much has been written about these foreign fighters and the possible threat they pose, the impact of this phenomenon on Western European societies has yet to be documented. This Research Paper explores four particular areas where this impact is most visible: (a) violent incidents associated with (returned) foreign fighters, (b) official and political responses linked to these incidents, (c) public opinion, and (d) anti-Islam reactions linked to these incidents. The authors conclude that the phenomenon of jihadist foreign fighters in European societies should be primarily regarded as a social and political threat, not a physical one. They consider the phenomenon of European jihadist foreign fighters a "low-probability, high-impact" threat.

  1. Quantum processes: probability fluxes, transition probabilities in unit time and vacuum vibrations

    International Nuclear Information System (INIS)

    Oleinik, V.P.; Arepjev, Ju D.

    1989-01-01

    Transition probabilities in unit time and probability fluxes are compared in studying the elementary quantum processes -the decay of a bound state under the action of time-varying and constant electric fields. It is shown that the difference between these quantities may be considerable, and so the use of transition probabilities W instead of probability fluxes Π, in calculating the particle fluxes, may lead to serious errors. The quantity W represents the rate of change with time of the population of the energy levels relating partly to the real states and partly to the virtual ones, and it cannot be directly measured in experiment. The vacuum background is shown to be continuously distorted when a perturbation acts on a system. Because of this the viewpoint of an observer on the physical properties of real particles continuously varies with time. This fact is not taken into consideration in the conventional theory of quantum transitions based on using the notion of probability amplitude. As a result, the probability amplitudes lose their physical meaning. All the physical information on quantum dynamics of a system is contained in the mean values of physical quantities. The existence of considerable differences between the quantities W and Π permits one in principle to make a choice of the correct theory of quantum transitions on the basis of experimental data. (author)

  2. Tuned by experience: How orientation probability modulates early perceptual processing.

    Science.gov (United States)

    Jabar, Syaheed B; Filipowicz, Alex; Anderson, Britt

    2017-09-01

    Probable stimuli are more often and more quickly detected. While stimulus probability is known to affect decision-making, it can also be explained as a perceptual phenomenon. Using spatial gratings, we have previously shown that probable orientations are also more precisely estimated, even while participants remained naive to the manipulation. We conducted an electrophysiological study to investigate the effect that probability has on perception and visual-evoked potentials. In line with previous studies on oddballs and stimulus prevalence, low-probability orientations were associated with a greater late positive 'P300' component which might be related to either surprise or decision-making. However, the early 'C1' component, thought to reflect V1 processing, was dampened for high-probability orientations while later P1 and N1 components were unaffected. Exploratory analyses revealed a participant-level correlation between C1 and P300 amplitudes, suggesting a link between perceptual processing and decision-making. We discuss how these probability effects could be indicative of sharpening of neurons preferring the probable orientations, due either to perceptual learning, or to feature-based attention. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. K-forbidden transition probabilities

    International Nuclear Information System (INIS)

    Saitoh, T.R.; Sletten, G.; Bark, R.A.; Hagemann, G.B.; Herskind, B.; Saitoh-Hashimoto, N.; Tsukuba Univ., Ibaraki

    2000-01-01

    Reduced hindrance factors of K-forbidden transitions are compiled for nuclei with A ≈ 180 where γ-vibrational states are observed. Correlations between these reduced hindrance factors and Coriolis forces, statistical level mixing and γ-softness have been studied. It is demonstrated that the K-forbidden transition probabilities are related to γ-softness. The decay of the high-K bandheads has been studied by means of the two-state mixing, which would be induced by the γ-softness, with the use of a number of K-forbidden transitions compiled in the present work, where high-K bandheads are depopulated by both E2 and ΔI=1 transitions. The validity of the two-state mixing scheme has been examined by using the proposed identity of the B(M1)/B(E2) ratios of transitions depopulating high-K bandheads and levels of low-K bands. A breakdown of the identity might indicate that other levels would mediate transitions between high- and low-K states. (orig.)

  4. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  5. Human Error Probability Assessment During Maintenance Activities of Marine Systems

    Directory of Open Access Journals (Sweden)

    Rabiul Islam

    2018-03-01

    Full Text Available Background: Maintenance operations on-board ships are highly demanding. Maintenance operations are intensive activities requiring high man–machine interactions in challenging and evolving conditions. The evolving conditions are weather conditions, workplace temperature, ship motion, noise and vibration, and workload and stress. For example, extreme weather conditions affect seafarers' performance, increasing the chances of error and, consequently, can cause injuries or fatalities to personnel. An effective human error probability model is required to better manage maintenance on-board ships. The developed model would assist in developing and maintaining effective risk management protocols. Thus, the objective of this study is to develop a human error probability model considering various internal and external factors affecting seafarers' performance. Methods: The human error probability model is developed using probability theory applied to a Bayesian network. The model is tested using the data received through the developed questionnaire survey of >200 experienced seafarers with >5 years of experience. The model developed in this study is used to find out the reliability of human performance on particular maintenance activities. Results: The developed methodology is tested on the maintenance of a marine engine's cooling water pump for the engine department and an anchor windlass for the deck department. In the considered case studies, human error probabilities are estimated in various scenarios and the results are compared between the scenarios and the different seafarer categories. The results of the case studies for both departments are also compared. Conclusion: The developed model is effective in assessing human error probabilities. These probabilities would be dynamically updated as and when new information is available on changes in either internal (i.e., training, experience, and fatigue) or external (i.e., environmental and operational) conditions.

  6. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications....

  7. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the 7 presentations made at the third meeting 'physics and fundamental questions' whose theme was probability and prediction. The concept of probability that was invented to apprehend random phenomena has become an important branch of mathematics and its application range spreads from radioactivity to species evolution via cosmology or the management of very weak risks. The notion of probability is the basis of quantum mechanics and then is bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  8. Entanglement probabilities of polymers: a white noise functional approach

    International Nuclear Information System (INIS)

    Bernido, Christopher C; Carpio-Bernido, M Victoria

    2003-01-01

    The entanglement probabilities for a highly flexible polymer to wind n times around a straight polymer are evaluated using white noise analysis. To introduce the white noise functional approach, the one-dimensional random walk problem is taken as an example. The polymer entanglement scenario, viewed as a random walk on a plane, is then treated and the entanglement probabilities are obtained for a magnetic flux confined along the straight polymer, and a case where an entangled polymer is subjected to the potential V = ḟ(s)θ. In the absence of the magnetic flux and the potential V, the entanglement probabilities reduce to a result obtained by Wiegel

  9. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  10. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions were suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert space with dimension larger than two. If measurement contexts are included in the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.

  11. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  12. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more - these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  13. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  14. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, psychophysical foundations of the probability weighting functions have been unknown. Notably, a behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (0 < α < 1; w(1/e) = 1/e, w(1) = 1), which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
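    For reference, the Prelec form is straightforward to evaluate numerically; the parameter value in the sketch below is an arbitrary choice, not one estimated in the study.

        # Prelec (1998) one-parameter probability weighting function
        #   w(p) = exp(-(-ln p)^alpha),  0 < alpha < 1,
        # with fixed points w(1/e) = 1/e and w(1) = 1.  alpha = 0.65 is arbitrary.
        import math

        def prelec_w(p, alpha=0.65):
            if p == 0.0:
                return 0.0
            return math.exp(-((-math.log(p)) ** alpha))

        for p in (0.01, 0.1, 1 / math.e, 0.5, 0.9, 0.99):
            print(f"w({p:.3f}) = {prelec_w(p):.3f}")   # overweights small p, underweights large p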

  15. Ruin probability with claims modeled by a stationary ergodic stable process

    NARCIS (Netherlands)

    Mikosch, T.; Samorodnitsky, G.

    2000-01-01

    For a random walk with negative drift we study the exceedance probability (ruin probability) of a high threshold. The steps of this walk (claim sizes) constitute a stationary ergodic stable process. We study how ruin occurs in this situation and evaluate the asymptotic behavior of the ruin probability.

  16. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  17. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable

  18. Estimation of post-test probabilities by residents: Bayesian reasoning versus heuristics?

    Science.gov (United States)

    Hall, Stacey; Phang, Sen Han; Schaefer, Jeffrey P; Ghali, William; Wright, Bruce; McLaughlin, Kevin

    2014-08-01

    Although the process of diagnosing invariably begins with a heuristic, we encourage our learners to support their diagnoses by analytical cognitive processes, such as Bayesian reasoning, in an attempt to mitigate the effects of heuristics on diagnosing. There are, however, limited data on the use ± impact of Bayesian reasoning on the accuracy of disease probability estimates. In this study our objective was to explore whether Internal Medicine residents use a Bayesian process to estimate disease probabilities by comparing their disease probability estimates to literature-derived Bayesian post-test probabilities. We gave 35 Internal Medicine residents four clinical vignettes in the form of a referral letter and asked them to estimate the post-test probability of the target condition in each case. We then compared these to literature-derived probabilities. For each vignette the estimated probability was significantly different from the literature-derived probability. For the two cases with low literature-derived probability our participants significantly overestimated the probability of these target conditions being the correct diagnosis, whereas for the two cases with high literature-derived probability the estimated probability was significantly lower than the calculated value. Our results suggest that residents generate inaccurate post-test probability estimates. Possible explanations for this include ineffective application of Bayesian reasoning, attribute substitution whereby a complex cognitive task is replaced by an easier one (e.g., a heuristic), or systematic rater bias, such as central tendency bias. Further studies are needed to identify the reasons for inaccuracy of disease probability estimates and to explore ways of improving accuracy.
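    The literature-derived benchmark used in such comparisons is an application of Bayes' theorem in odds form; a minimal sketch follows, with an illustrative pre-test probability and likelihood ratio rather than the study's actual vignettes.

        # Post-test probability via Bayes' theorem expressed with likelihood ratios:
        #   post-test odds = pre-test odds * likelihood ratio.
        # The pre-test probability and likelihood ratio below are illustrative only.

        def post_test_probability(pre_test_prob, likelihood_ratio):
            pre_odds = pre_test_prob / (1.0 - pre_test_prob)
            post_odds = pre_odds * likelihood_ratio
            return post_odds / (1.0 + post_odds)

        # A positive test with LR+ = 8 applied to a 10% pre-test probability:
        print(post_test_probability(0.10, 8.0))   # ~0.47, lower than intuition often suggests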

  19. Probability of collective excited state decay

    International Nuclear Information System (INIS)

    Manykin, Eh.A.; Ozhovan, M.I.; Poluehktov, P.P.

    1987-01-01

    Decay mechanisms of a condensed excited state formed of highly excited (Rydberg) atoms are considered, i.e. the stability of the so-called Rydberg substance is analyzed. It is shown that Auger recombination and radiative transitions are the basic processes. The corresponding probabilities are calculated and compared. It is ascertained that the "Rydberg substance" possesses a macroscopic lifetime (several seconds) and in a sense is metastable

  20. The mediating effect of psychosocial factors on suicidal probability among adolescents.

    Science.gov (United States)

    Hur, Ji-Won; Kim, Won-Joong; Kim, Yong-Ku

    2011-01-01

    Suicidal probability is an actual tendency including negative self-evaluation, hopelessness, suicidal ideation, and hostility. The purpose of this study was to examine the role of psychosocial variances in the suicidal probability of adolescents, especially the role of mediating variance. This study investigated the mediating effects of psychosocial factors such as depression, anxiety, self-esteem, stress, and social support on the suicidal probability among 1,586 adolescents attending middle and high schools in the Kyunggi Province area of South Korea. The relationship between depression and anxiety/suicidal probability was mediated by both social resources and self-esteem. Furthermore, the influence of social resources was mediated by interpersonal and achievement stress as well as self-esteem. This study suggests that suicidal probability in adolescents has various relationships, including mediating relations, with several psychosocial factors. The interventions on suicidal probability in adolescents should focus on social factors as well as clinical symptoms.

  1. Acquisition of speech rhythm in first language.

    Science.gov (United States)

    Polyanskaya, Leona; Ordin, Mikhail

    2015-09-01

    Analysis of English rhythm in speech produced by children and adults revealed that speech rhythm becomes increasingly more stress-timed as language acquisition progresses. Children reach the adult-like target by 11 to 12 years. The employed speech elicitation paradigm ensured that the sentences produced by adults and children at different ages were comparable in terms of lexical content, segmental composition, and phonotactic complexity. Detected differences between child and adult rhythm and between rhythm in child speech at various ages cannot be attributed to acquisition of phonotactic language features or vocabulary, and indicate the development of language-specific phonetic timing in the course of acquisition.

  2. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look-out, etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...

  3. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information' since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)

  4. Production of 147Eu for gamma-ray emission probability measurement

    International Nuclear Information System (INIS)

    Katoh, Keiji; Marnada, Nada; Miyahara, Hiroshi

    2002-01-01

    Gamma-ray emission probability is one of the most important decay parameters of a radionuclide, and many researchers are working to improve its certainty. The certainties of γ-ray emission probabilities for neutron-rich nuclides are being improved little by little, but the improvements of those for proton-rich nuclides are still insufficient. Europium-147, which decays by electron capture or β+-particle emission, is a proton-rich nuclide, and the γ-ray emission probabilities evaluated by Mateosian and Peker have large uncertainties. They referred to only one report concerning γ-ray emission probabilities. Our final purpose is to determine the precise γ-ray emission probabilities of 147Eu from disintegration rates and γ-ray intensities by using a 4πβ-γ coincidence apparatus. Impurity nuclides strongly affect the determination of the disintegration rate; therefore, a highly pure 147Eu source is required. This short note will describe the most suitable energy for 147Eu production through the 147Sm(p,n) reaction. (author)

  5. Probability intervals for the top event unavailability of fault trees

    International Nuclear Information System (INIS)

    Lee, Y.T.; Apostolakis, G.E.

    1976-06-01

    The evaluation of probabilities of rare events is of major importance in the quantitative assessment of the risk from large technological systems. In particular, for nuclear power plants the complexity of the systems, their high reliability and the lack of significant statistical records have led to the extensive use of logic diagrams in the estimation of low probabilities. The estimation of probability intervals for the probability of existence of the top event of a fault tree is examined. Given the uncertainties of the primary input data, a method is described for the evaluation of the first four moments of the top event occurrence probability. These moments are then used to estimate confidence bounds by several approaches which are based on standard inequalities (e.g., Tchebycheff, Cantelli, etc.) or on empirical distributions (the Johnson family). Several examples indicate that the Johnson family of distributions yields results which are in good agreement with those produced by Monte Carlo simulation
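    One of the standard inequalities mentioned, Cantelli's one-sided bound, already yields a conservative upper bound from the first two moments alone. The sketch below illustrates that single step only (not the Johnson-family fit described in the abstract), and the moment values used are hypothetical.

        # One-sided Cantelli (Chebyshev-Cantelli) bound: for a random top-event
        # probability Q with mean mu and variance var,
        #   P(Q >= mu + k*sigma) <= 1 / (1 + k^2),
        # so mu + k*sigma is an upper bound exceeded with probability at most 1/(1+k^2).
        # The moment values below are hypothetical.
        import math

        def cantelli_upper_bound(mu, var, confidence=0.95):
            k = math.sqrt(confidence / (1.0 - confidence))   # solves 1/(1+k^2) = 1 - confidence
            return mu + k * math.sqrt(var)

        print(cantelli_upper_bound(mu=1e-4, var=4e-9))       # 95% upper bound on the top-event probability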

  6. Risk Preferences, Probability Weighting, and Strategy Tradeoffs in Wildfire Management.

    Science.gov (United States)

    Hand, Michael S; Wibbenmeyer, Matthew J; Calkin, David E; Thompson, Matthew P

    2015-10-01

    Wildfires present a complex applied risk management environment, but relatively little attention has been paid to behavioral and cognitive responses to risk among public agency wildfire managers. This study investigates responses to risk, including probability weighting and risk aversion, in a wildfire management context using a survey-based experiment administered to federal wildfire managers. Respondents were presented with a multiattribute lottery-choice experiment where each lottery is defined by three outcome attributes: expenditures for fire suppression, damage to private property, and exposure of firefighters to the risk of aviation-related fatalities. Respondents choose one of two strategies, each of which includes "good" (low cost/low damage) and "bad" (high cost/high damage) outcomes that occur with varying probabilities. The choice task also incorporates an information framing experiment to test whether information about fatality risk to firefighters alters managers' responses to risk. Results suggest that managers exhibit risk aversion and nonlinear probability weighting, which can result in choices that do not minimize expected expenditures, property damage, or firefighter exposure. Information framing tends to result in choices that reduce the risk of aviation fatalities, but exacerbates nonlinear probability weighting. © 2015 Society for Risk Analysis.

  7. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence; Discrete Distributions; Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation; Continuous Distributions; The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  8. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying of the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
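    The qualitative claim - that a threshold estimated from a finite sample yields an expected failure frequency above the nominal level - can be checked by simulation for the normal/log-normal case. The parameters, sample size, and nominal level in the sketch below are arbitrary choices, not values from the article.

        # Simulation sketch: a controller sets the threshold at the (1 - eps) quantile
        # *estimated* from n observations of a log-normal risk factor.  Averaged over
        # samples, the realised exceedance frequency exceeds the nominal eps.
        # All parameter values are arbitrary illustrations.
        import math
        import numpy as np

        def normal_cdf(x):
            return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

        def expected_failure_frequency(mu=0.0, sigma=1.0, n=30, n_sim=20_000, seed=0):
            rng = np.random.default_rng(seed)
            z = 2.326  # standard normal quantile for 1 - eps with eps = 0.01
            freqs = np.empty(n_sim)
            for i in range(n_sim):
                sample = rng.normal(mu, sigma, size=n)               # log of the observed risk factor
                mu_hat, sd_hat = sample.mean(), sample.std(ddof=1)
                threshold = mu_hat + z * sd_hat                      # estimated (1 - eps) quantile, log scale
                freqs[i] = 1.0 - normal_cdf((threshold - mu) / sigma)  # true exceedance probability
            return freqs.mean()

        print(expected_failure_frequency())   # noticeably above the nominal eps = 0.01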

  9. The estimated lifetime probability of acquiring human papillomavirus in the United States.

    Science.gov (United States)

    Chesson, Harrell W; Dunne, Eileen F; Hariri, Susan; Markowitz, Lauri E

    2014-11-01

    Estimates of the lifetime probability of acquiring human papillomavirus (HPV) can help to quantify HPV incidence, illustrate how common HPV infection is, and highlight the importance of HPV vaccination. We developed a simple model, based primarily on the distribution of lifetime numbers of sex partners across the population and the per-partnership probability of acquiring HPV, to estimate the lifetime probability of acquiring HPV in the United States in the time frame before HPV vaccine availability. We estimated the average lifetime probability of acquiring HPV among those with at least 1 opposite sex partner to be 84.6% (range, 53.6%-95.0%) for women and 91.3% (range, 69.5%-97.7%) for men. Under base case assumptions, more than 80% of women and men acquire HPV by age 45 years. Our results are consistent with estimates in the existing literature suggesting a high lifetime probability of HPV acquisition and are supported by cohort studies showing high cumulative HPV incidence over a relatively short period, such as 3 to 5 years.
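    The structure of such a model can be sketched directly; the partner-count distribution and per-partnership acquisition probability below are made-up placeholders, not the inputs used in the paper.

        # Sketch of a lifetime-acquisition model: the probability of ever acquiring
        # the infection is averaged over the distribution of lifetime partner counts,
        #   P(acquire) = sum_n P(n partners) * (1 - (1 - p)^n),
        # where p is the per-partnership acquisition probability.
        # The distribution and p below are hypothetical placeholders.

        partner_distribution = {1: 0.25, 2: 0.15, 4: 0.25, 8: 0.20, 15: 0.15}   # must sum to 1
        p_per_partnership = 0.4

        lifetime_prob = sum(weight * (1.0 - (1.0 - p_per_partnership) ** n)
                            for n, weight in partner_distribution.items())
        print(f"lifetime probability of acquisition: {lifetime_prob:.2%}")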

  10. On the probability of occurrence of rogue waves

    Directory of Open Access Journals (Sweden)

    E. M. Bitner-Gregersen

    2012-03-01

    Full Text Available A number of extreme and rogue wave studies have been conducted theoretically, numerically, experimentally and based on field data in recent years, which have significantly advanced our knowledge of ocean waves. So far, however, consensus on the probability of occurrence of rogue waves has not been achieved. The present investigation addresses this topic from the perspective of design needs. Probability of occurrence of extreme and rogue wave crests in deep water is here discussed based on higher order time simulations, experiments and hindcast data. Focus is given to occurrence of rogue waves in high sea states.

  11. INVESTIGATION OF INFLUENCE OF ENCODING FUNCTION COMPLEXITY ON DISTRIBUTION OF ERROR MASKING PROBABILITY

    Directory of Open Access Journals (Sweden)

    A. B. Levina

    2016-03-01

    Full Text Available Error detection codes are mechanisms that enable robust delivery of data in unreliable communication channels and devices. Unreliable channels and devices are error-prone objects, and error detection codes allow such errors to be detected. There are two classes of error detecting codes - classical codes and security-oriented codes. The classical codes have a high percentage of detected errors; however, they have a high probability of missing an error under algebraic manipulation. In turn, security-oriented codes are codes with a small Hamming distance and high protection against algebraic manipulation. The probability of error masking is a fundamental parameter of security-oriented codes. A detailed study of this parameter allows analyzing the behavior of the error-correcting code in the case of error injection into the encoding device. In turn, the complexity of the encoding function plays an important role in security-oriented codes. Encoding functions with lower computational complexity and a low probability of masking offer the best protection of the encoding device against malicious acts. This paper investigates the influence of encoding function complexity on the error masking probability distribution. It will be shown that a more complex encoding function reduces the maximum of the error masking probability. It is also shown in the paper that increasing the function complexity changes the error masking probability distribution. In particular, increasing the computational complexity decreases the difference between the maximum and average values of the error masking probability. Our results have shown that functions with greater complexity have smoothed maxima of the error masking probability, which significantly complicates the analysis of the error-correcting code by an attacker. As a result, in the case of a complex encoding function the probability of algebraic manipulation is reduced. The paper discusses an approach for measuring the error masking
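    The error-masking probability itself can be computed by brute force for a small toy encoding; the cubic-style check function below is a hypothetical illustration, not a code analyzed in the paper.

        # Brute-force computation of the error-masking probability for a toy systematic
        # encoding x -> (x, f(x)) with K information bits.  An injected error
        # e = (e_x, e_y) is masked on input x when f(x XOR e_x) == f(x) XOR e_y.
        # The check function f below is a hypothetical toy example.
        from itertools import product

        K = 3                                    # number of information bits

        def f(x):
            return (x * x * x) % (1 << K)        # toy nonlinear check function on K bits

        def masking_probability(e_x, e_y):
            masked = sum(f(x ^ e_x) == (f(x) ^ e_y) for x in range(1 << K))
            return masked / (1 << K)

        worst = max(masking_probability(e_x, e_y)
                    for e_x, e_y in product(range(1 << K), repeat=2)
                    if (e_x, e_y) != (0, 0))
        print(f"maximum error-masking probability: {worst:.3f}")   # smaller is better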

  12. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  13. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another
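 
    A small illustration, in the spirit of the lecture notes, of the two directions described above: the forward problem (probability of a specified dice outcome, known a priori) and the inverse problem (comparing likelihoods of hypothetical dice given observed rolls). The rolls and the loaded-die probabilities are made up for the example.

    from itertools import product
    from math import prod

    # Forward problem: probability that two fair dice sum to 7.
    outcomes = list(product(range(1, 7), repeat=2))
    p_sum7 = sum(1 for a, b in outcomes if a + b == 7) / len(outcomes)
    print(p_sum7)  # 1/6

    # Inverse problem: likelihood of observed rolls under two hypothetical dice.
    rolls = [6, 6, 3, 6, 5]                                      # assumed observations
    fair = {face: 1 / 6 for face in range(1, 7)}
    loaded = {face: 0.30 if face == 6 else 0.14 for face in range(1, 7)}
    print(prod(fair[r] for r in rolls), prod(loaded[r] for r in rolls))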

  14. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments...... that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  15. A Multidisciplinary Approach for Teaching Statistics and Probability

    Science.gov (United States)

    Rao, C. Radhakrishna

    1971-01-01

    The author presents a syllabus for an introductory (first year after high school) course in statistics and probability and some methods of teaching statistical techniques. The description comes basically from the procedures used at the Indian Statistical Institute, Calcutta. (JG)

  16. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially by the relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows growing importance of probability as coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  17. Market-implied risk-neutral probabilities, actual probabilities, credit risk and news

    Directory of Open Access Journals (Sweden)

    Shashidhar Murthy

    2011-09-01

    Full Text Available Motivated by the credit crisis, this paper investigates links between risk-neutral probabilities of default implied by markets (e.g. from yield spreads and their actual counterparts (e.g. from ratings. It discusses differences between the two and clarifies underlying economic intuition using simple representations of credit risk pricing. Observed large differences across bonds in the ratio of the two probabilities are shown to imply that apparently safer securities can be more sensitive to news.
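 
    A simple one-period illustration (not the paper's model) of the gap the abstract discusses between market-implied risk-neutral default probabilities and actual, ratings-based ones. The spread, recovery rate and actual default probability below are assumed values.

    def risk_neutral_default_prob(spread, recovery):
        """One-period approximation: investors break even when spread ~ q * (1 - recovery)."""
        return spread / (1.0 - recovery)

    spread = 0.02      # 200 bp yield spread over the riskless rate (assumed)
    recovery = 0.40    # assumed recovery rate on default
    actual_p = 0.005   # assumed actual default probability, e.g. from a rating's historical table

    q = risk_neutral_default_prob(spread, recovery)
    print(q, q / actual_p)  # ratio > 1: the risk-neutral probability embeds a risk premium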

  18. Dopaminergic Drug Effects on Probability Weighting during Risky Decision Making.

    Science.gov (United States)

    Ojala, Karita E; Janssen, Lieneke K; Hashemi, Mahur M; Timmer, Monique H M; Geurts, Dirk E M; Ter Huurne, Niels P; Cools, Roshan; Sescousse, Guillaume

    2018-01-01

    Dopamine has been associated with risky decision-making, as well as with pathological gambling, a behavioral addiction characterized by excessive risk-taking behavior. However, the specific mechanisms through which dopamine might act to foster risk-taking and pathological gambling remain elusive. Here we test the hypothesis that this might be achieved, in part, via modulation of subjective probability weighting during decision making. Human healthy controls (n = 21) and pathological gamblers (n = 16) played a decision-making task involving choices between sure monetary options and risky gambles both in the gain and loss domains. Each participant played the task twice, either under placebo or the dopamine D2/D3 receptor antagonist sulpiride, in a double-blind counterbalanced design. A prospect theory modelling approach was used to estimate subjective probability weighting and sensitivity to monetary outcomes. Consistent with prospect theory, we found that participants presented a distortion in the subjective weighting of probabilities, i.e., they overweighted low probabilities and underweighted moderate to high probabilities, both in the gain and loss domains. Compared with placebo, sulpiride attenuated this distortion in the gain domain. Across drugs, the groups did not differ in their probability weighting, although gamblers consistently underweighted losing probabilities in the placebo condition. Overall, our results reveal that dopamine D2/D3 receptor antagonism modulates the subjective weighting of probabilities in the gain domain, in the direction of more objective, economically rational decision making.
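 
    A sketch of the kind of one-parameter probability-weighting function used in prospect-theory models (the Tversky-Kahneman form); the gamma value below is illustrative. With gamma < 1 it overweights small probabilities and underweights moderate-to-high ones, the distortion described in the abstract.

    def weight(p, gamma):
        """Tversky-Kahneman probability weighting: w(p) = p^g / (p^g + (1-p)^g)^(1/g)."""
        return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

    for p in (0.01, 0.1, 0.5, 0.9, 0.99):
        print(p, round(weight(p, 0.6), 3))   # distorted weights for gamma = 0.6 (assumed)
    # An attenuation of the distortion, as reported for sulpiride in the gain domain,
    # would correspond to gamma moving closer to 1.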

  19. Dopaminergic Drug Effects on Probability Weighting during Risky Decision Making

    Science.gov (United States)

    Timmer, Monique H. M.; ter Huurne, Niels P.

    2018-01-01

    Abstract Dopamine has been associated with risky decision-making, as well as with pathological gambling, a behavioral addiction characterized by excessive risk-taking behavior. However, the specific mechanisms through which dopamine might act to foster risk-taking and pathological gambling remain elusive. Here we test the hypothesis that this might be achieved, in part, via modulation of subjective probability weighting during decision making. Human healthy controls (n = 21) and pathological gamblers (n = 16) played a decision-making task involving choices between sure monetary options and risky gambles both in the gain and loss domains. Each participant played the task twice, either under placebo or the dopamine D2/D3 receptor antagonist sulpiride, in a double-blind counterbalanced design. A prospect theory modelling approach was used to estimate subjective probability weighting and sensitivity to monetary outcomes. Consistent with prospect theory, we found that participants presented a distortion in the subjective weighting of probabilities, i.e., they overweighted low probabilities and underweighted moderate to high probabilities, both in the gain and loss domains. Compared with placebo, sulpiride attenuated this distortion in the gain domain. Across drugs, the groups did not differ in their probability weighting, although gamblers consistently underweighted losing probabilities in the placebo condition. Overall, our results reveal that dopamine D2/D3 receptor antagonism modulates the subjective weighting of probabilities in the gain domain, in the direction of more objective, economically rational decision making. PMID:29632870

  20. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes and probabilities defined for all events. A reformulation in terms of propositions allows to use the maximum entropy method to assign the probabilities taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, choosing the constraints and making the probability assignment by the maximum entropy method. This approach shows, how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related with well-known classical distributions. The relation between the conditional probability density, given some averages as constraints and the appropriate ensemble is elucidated.
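 
    A minimal numerical sketch of a maximum-entropy probability assignment of the sort described above: with discrete outcomes and a single average-value constraint, the MaxEnt solution has an exponential (Boltzmann-like) form p_i proportional to exp(-beta * E_i). The outcome values and constraint are hypothetical.

    from math import exp

    levels = [0.0, 1.0, 2.0, 3.0]   # hypothetical outcome values ("energies")
    target_mean = 1.2               # assumed average-value constraint

    def mean_for(beta):
        w = [exp(-beta * e) for e in levels]
        z = sum(w)
        return sum(e * wi for e, wi in zip(levels, w)) / z

    # Bisection on beta: mean_for is decreasing in beta.
    lo, hi = -50.0, 50.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    z = sum(exp(-beta * e) for e in levels)
    print([round(exp(-beta * e) / z, 4) for e in levels])   # MaxEnt probabilities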

  1. Probability and statistics in particle physics

    International Nuclear Information System (INIS)

    Frodesen, A.G.; Skjeggestad, O.

    1979-01-01

    Probability theory is entered into at an elementary level and given a simple and detailed exposition. The material on statistics has been organised with an eye to the experimental physicist's practical need, which is likely to be statistical methods for estimation or decision-making. The book is intended for graduate students and research workers in experimental high energy and elementary particle physics, and numerous examples from these fields are presented. (JIW)

  2. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  3. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  4. An Upper Bound on High Speed Satellite Collision Probability When Only One Object has Position Uncertainty Information

    Science.gov (United States)

    Frisbee, Joseph H., Jr.

    2015-01-01

    Upper bounds on high speed satellite collision probability, Pc, have been investigated. Previous methods assume that an individual position error covariance matrix is available for each object, the two matrices being combined into a single relative position error covariance matrix. Components of the combined error covariance are then varied to obtain a maximum Pc. If error covariance information was available for only one of the two objects, either some default shape was used or nothing could be done. An alternative is presented that uses the known covariance information along with a critical value of the missing covariance to obtain an approximate but potentially useful Pc upper bound.
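 
    Not the paper's method, but a toy illustration of the underlying idea of bounding collision probability by maximizing over an unknown uncertainty scale. It uses the common small-object, isotropic approximation Pc ~ (R^2 / (2s)) * exp(-d^2 / (2s)), with s the relative position variance; R and d are assumed values.

    from math import exp

    R = 0.01   # combined hard-body radius, km (assumed)
    d = 0.5    # predicted miss distance, km (assumed)

    def pc(s):
        return (R ** 2 / (2.0 * s)) * exp(-d ** 2 / (2.0 * s))

    scales = [1e-4 * 1.05 ** i for i in range(400)]                 # grid of candidate variances
    s_star, pc_max = max(((s, pc(s)) for s in scales), key=lambda t: t[1])
    print(s_star, pc_max)                    # numeric maximum over the unknown variance
    print(R ** 2 / d ** 2 * exp(-1))         # analytic check for this toy model (s* = d^2 / 2)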

  5. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out......-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...

  6. High-resolution elastic recoil detection utilizing Bayesian probability theory

    International Nuclear Information System (INIS)

    Neumaier, P.; Dollinger, G.; Bergmaier, A.; Genchev, I.; Goergens, L.; Fischer, R.; Ronning, C.; Hofsaess, H.

    2001-01-01

    Elastic recoil detection (ERD) analysis is improved with respect to depth resolution and the reliability of the measured spectra. Good statistics even at low ion fluences are obtained utilizing a large solid angle of 5 msr at the Munich Q3D magnetic spectrograph and a 40 MeV 197Au beam. In this way the elemental depth profiles are not essentially altered during analysis, even if distributions with area densities below 1x10^14 atoms/cm^2 are measured. As the energy spread due to the angular acceptance is fully eliminated by ion-optical and numerical corrections, an accurate and reliable apparatus function is derived. It allows the measured spectra to be deconvoluted using the adaptive kernel method, a maximum entropy concept in the framework of Bayesian probability theory. In addition, the uncertainty of the reconstructed spectra is quantified. The concepts are demonstrated on 13C depth profiles measured in ultra-thin films of tetrahedral amorphous carbon (ta-C). Depth scales of those profiles are given with an accuracy of 1.4x10^15 atoms/cm^2

  7. CGC/saturation approach for soft interactions at high energy: survival probability of central exclusive production

    Energy Technology Data Exchange (ETDEWEB)

    Gotsman, E.; Maor, U. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel); Levin, E. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel); Universidad Tecnica Federico Santa Maria, Departemento de Fisica, Centro Cientifico-Tecnologico de Valparaiso, Valparaiso (Chile)

    2016-04-15

    We estimate the value of the survival probability for central exclusive production in a model which is based on the CGC/saturation approach. Hard and soft processes are described in the same framework. At LHC energies, we obtain a small value for the survival probability. The source of the small value is the impact parameter dependence of the hard amplitude. Our model has successfully described a large body of soft data: elastic, inelastic and diffractive cross sections, inclusive production and rapidity correlations, as well as the t-dependence of deep inelastic diffractive production of vector mesons. (orig.)

  8. Impact probabilities of meteoroid streams with artificial satellites: An assessment

    International Nuclear Information System (INIS)

    Foschini, L.; Cevolani, G.

    1997-01-01

    Impact probabilities of artificial satellites with meteoroid streams were calculated using data collected with the CNR forward scatter (FS) bistatic radar over the Bologna-Lecce baseline (about 700 km). Results show that impact probabilities are 2 times higher than previously calculated values. Nevertheless, although catastrophic impacts remain rare even under meteor storm conditions, high meteoroid fluxes are expected to erode satellite surfaces and weaken their external structures

  9. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  10. Probability & Statistics: Modular Learning Exercises. Teacher Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…

  11. Probability & Statistics: Modular Learning Exercises. Student Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…

  12. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  13. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications....... The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models....
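 
    A small sketch of the generating-function idea for the simplest random utility model, the multinomial logit: with G(v) = log(sum_j exp(v_j)), the choice probabilities are the gradient of G. The utility values are hypothetical.

    from math import exp

    def choice_probabilities(v):
        """Gradient of the log-sum-exp generating function = multinomial logit probabilities."""
        m = max(v)                          # stabilise the log-sum-exp
        denom = sum(exp(x - m) for x in v)
        return [exp(x - m) / denom for x in v]

    utilities = [1.0, 0.5, -0.2]            # hypothetical systematic utilities
    print(choice_probabilities(utilities))
    # The CPGF property the paper generalises: the same construction extends to mixtures
    # and to cross-nested logit models.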

  14. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  15. The probability outcome correspondence principle: a dispositional view of the interpretation of probability statements

    NARCIS (Netherlands)

    Keren, G.; Teigen, K.H.

    2001-01-01

    This article presents a framework for lay people's internal representations of probabilities, which supposedly reflect the strength of underlying dispositions, or propensities, associated with the predicted event. From this framework, we derive the probability-outcome correspondence principle, which

  16. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability, and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson process explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We proved that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson processes. We give a step by step procedure for constructing a (compound) free Poisso...

  17. Solving probability reasoning based on DNA strand displacement and probability modules.

    Science.gov (United States)

    Zhang, Qiang; Wang, Xiaobiao; Wang, Xiaojun; Zhou, Changjun

    2017-12-01

    In computational biology, DNA strand displacement technology is used to simulate computation processes and has shown strong computing ability. Most researchers use it to solve logic problems; it is only rarely used for probabilistic reasoning. To support probabilistic reasoning, a conditional probability derivation model and a total probability model based on DNA strand displacement were established in this paper. The models were assessed through the game "read your mind" and shown to enable the application of probabilistic reasoning in genetic diagnosis. Copyright © 2017 Elsevier Ltd. All rights reserved.
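 
    For reference, the two probability operations the molecular modules are said to implement, written out in ordinary code: the law of total probability and the conditional-probability (Bayes) derivation. The prior and likelihood values are hypothetical.

    def total_probability(p_b_given_a, p_a):
        """P(B) = sum_i P(B|A_i) * P(A_i) over a partition {A_i}."""
        return sum(pb * pa for pb, pa in zip(p_b_given_a, p_a))

    def posterior(p_b_given_a, p_a):
        """P(A_i|B) = P(B|A_i) * P(A_i) / P(B)."""
        pb = total_probability(p_b_given_a, p_a)
        return [pbi * pai / pb for pbi, pai in zip(p_b_given_a, p_a)]

    p_a = [0.25, 0.25, 0.25, 0.25]          # hypothetical prior over four hidden choices
    p_b_given_a = [0.9, 0.6, 0.3, 0.1]      # assumed likelihood of the observed answer
    print(total_probability(p_b_given_a, p_a), posterior(p_b_given_a, p_a))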

  18. Sensitivity of the probability of failure to probability of detection curve regions

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2016-01-01

    Non-destructive inspection (NDI) techniques have been shown to play a vital role in fracture control plans, structural health monitoring, and ensuring availability and reliability of piping, pressure vessels, mechanical and aerospace equipment. Probabilistic fatigue simulations are often used in order to determine the efficacy of an inspection procedure with the NDI method modeled as a probability of detection (POD) curve. These simulations can be used to determine the most advantageous NDI method for a given application. As an aid to this process, a first order sensitivity method of the probability-of-failure (POF) with respect to regions of the POD curve (lower tail, middle region, right tail) is developed and presented here. The sensitivity method computes the partial derivative of the POF with respect to a change in each region of a POD or multiple POD curves. The sensitivities are computed at no cost by reusing the samples from an existing Monte Carlo (MC) analysis. A numerical example is presented considering single and multiple inspections. - Highlights: • Sensitivities of probability-of-failure to a region of probability-of-detection curve. • The sensitivities are computed with negligible cost. • Sensitivities identify the important region of a POD curve. • Sensitivities can be used as a guide to selecting the optimal POD curve.

  19. Recommendations for the tuning of rare event probability estimators

    International Nuclear Information System (INIS)

    Balesdent, Mathieu; Morio, Jérôme; Marzat, Julien

    2015-01-01

    Being able to accurately estimate rare event probabilities is a challenging issue in improving the reliability of complex systems. Several powerful methods such as importance sampling, importance splitting or extreme value theory have been proposed in order to reduce the computational cost and to improve the accuracy of extreme probability estimation. However, the performance of these methods is highly correlated with the choice of tuning parameters, which are very difficult to determine. In order to highlight recommended tunings for such methods, an empirical campaign of automatic tuning on a set of representative test cases is conducted for splitting methods. It provides a reduced set of tuning parameters that may lead to reliable estimation of rare event probabilities for various problems. The relevance of the obtained results is assessed on a series of real-world aerospace problems
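 
    A compact importance-sampling sketch (not the paper's splitting method) for a rare event probability P(X > t) with X ~ N(0, 1), using a mean-shifted Gaussian proposal. The shift is exactly the kind of tuning parameter whose choice such methods depend on; the values below are illustrative.

    import math
    import random

    def estimate(t, shift, n=100_000, seed=0):
        rng = random.Random(seed)
        acc = 0.0
        for _ in range(n):
            y = rng.gauss(shift, 1.0)                            # sample from the proposal N(shift, 1)
            if y > t:
                acc += math.exp(-shift * y + 0.5 * shift ** 2)   # likelihood ratio phi(y) / phi(y - shift)
        return acc / n

    t = 5.0
    exact = 0.5 * math.erfc(t / math.sqrt(2.0))
    # shift = t concentrates samples near the rare region; shift = 0 is plain Monte Carlo
    # and typically returns 0.0 at this sample size.
    print(exact, estimate(t, shift=t), estimate(t, shift=0.0))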

  20. Truth, possibility and probability new logical foundations of probability and statistical inference

    CERN Document Server

    Chuaqui, R

    1991-01-01

    Anyone involved in the philosophy of science is naturally drawn into the study of the foundations of probability. Different interpretations of probability, based on competing philosophical ideas, lead to different statistical techniques, and frequently to mutually contradictory consequences. This unique book presents a new interpretation of probability, rooted in the traditional interpretation that was current in the 17th and 18th centuries. Mathematical models are constructed based on this interpretation, and statistical inference and decision theory are applied, including some examples in artificial intelligence, solving the main foundational problems. Nonstandard analysis is extensively developed for the construction of the models and in some of the proofs. Many nonstandard theorems are proved, some of them new, in particular, a representation theorem that asserts that any stochastic process can be approximated by a process defined over a space with equiprobable outcomes.

  1. Kepler Planet Reliability Metrics: Astrophysical Positional Probabilities for Data Release 25

    Science.gov (United States)

    Bryson, Stephen T.; Morton, Timothy D.

    2017-01-01

    This document is very similar to KSCI-19092-003, Planet Reliability Metrics: Astrophysical Positional Probabilities, which describes the previous release of the astrophysical positional probabilities for Data Release 24. The important changes for Data Release 25 are: 1. The computation of the astrophysical positional probabilities uses the Data Release 25 processed pixel data for all Kepler Objects of Interest. 2. Computed probabilities now have associated uncertainties, whose computation is described in Section 4.1.3. 3. The scene modeling described in Section 4.1.2 uses background stars detected via ground-based high-resolution imaging, described in Section 5.1, that are not in the Kepler Input Catalog or UKIRT catalog. These newly detected stars are presented in Appendix B. Otherwise the text describing the algorithms and examples is largely unchanged from KSCI-19092-003.

  2. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how...... the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work...... is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions....

  4. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments. Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables. The Law of Large Numbers; Conditional Expectation; Generating Functions. Branching Processes. Random Walk Revisited; Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples. Probability Distributions of Markov Chains; The First Step Analysis. Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case. Simulation; Distribution F...

  5. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  6. Estimates of annual survival probabilities for adult Florida manatees (Trichechus manatus latirostris)

    Science.gov (United States)

    Langtimm, C.A.; O'Shea, T.J.; Pradel, R.; Beck, C.A.

    1998-01-01

    The population dynamics of large, long-lived mammals are particularly sensitive to changes in adult survival. Understanding factors affecting survival patterns is therefore critical for developing and testing theories of population dynamics and for developing management strategies aimed at preventing declines or extinction in such taxa. Few studies have used modern analytical approaches for analyzing variation and testing hypotheses about survival probabilities in large mammals. This paper reports a detailed analysis of annual adult survival in the Florida manatee (Trichechus manatus latirostris), an endangered marine mammal, based on a mark-recapture approach. Natural and boat-inflicted scars distinctively 'marked' individual manatees that were cataloged in a computer-based photographic system. Photo-documented resightings provided 'recaptures.' Using open population models, annual adult-survival probabilities were estimated for manatees observed in winter in three areas of Florida: Blue Spring, Crystal River, and the Atlantic coast. After using goodness-of-fit tests in Program RELEASE to search for violations of the assumptions of mark-recapture analysis, survival and sighting probabilities were modeled under several different biological hypotheses with Program SURGE. Estimates of mean annual probability of sighting varied from 0.948 for Blue Spring to 0.737 for Crystal River and 0.507 for the Atlantic coast. At Crystal River and Blue Spring, annual survival probabilities were best estimated as constant over the study period at 0.96 (95% CI = 0.951-0.975 and 0.900-0.985, respectively). On the Atlantic coast, where manatees are impacted more by human activities, annual survival probabilities had a significantly lower mean estimate of 0.91 (95% CI = 0.887-0.926) and varied unpredictably over the study period. For each study area, survival did not differ between sexes and was independent of relative adult age. The high constant adult-survival probabilities estimated

  7. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  8. Calculating the albedo characteristics by the method of transmission probabilities

    International Nuclear Information System (INIS)

    Lukhvich, A.A.; Rakhno, I.L.; Rubin, I.E.

    1983-01-01

    The possibility to use the method of transmission probabilities for calculating the albedo characteristics of homogeneous and heterogeneous zones is studied. The transmission probabilities method is a numerical method for solving the transport equation in the integrated form. All calculations have been conducted as a one-group approximation for the planes and rods with different optical thicknesses and capture-to-scattering ratios. Above calculations for plane and cylindrical geometries have shown the possibility to use the numerical method of transmission probabilities for calculating the albedo characteristics of homogeneous and heterogeneous zones with high accuracy. In this case the computer time consumptions are minimum even with the cylindrical geometry, if the interpolation calculation of characteristics is used for the neutrons of the first path

  9. Impact of controlling the sum of error probability in the sequential probability ratio test

    Directory of Open Access Journals (Sweden)

    Bijoy Kumarr Pradhan

    2013-05-01

    Full Text Available A generalized modified method is proposed to control the sum of error probabilities in the sequential probability ratio test. The method minimizes the weighted average of the two average sample numbers, under a simple null hypothesis and a simple alternative hypothesis, subject to the restriction that the sum of the error probabilities is a pre-assigned constant, in order to find the optimal sample size; finally, a comparison is made with the optimal sample size obtained from the fixed-sample-size procedure. The results are applied to cases where the random variate follows a normal law as well as a Bernoulli law.
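 
    A bare-bones Wald sequential probability ratio test for Bernoulli data, as background to the modification studied above (the paper additionally constrains the sum of the two error probabilities). The hypothesized parameters, error rates and simulated data are illustrative.

    import math
    import random

    def sprt(samples, p0, p1, alpha, beta):
        """Classic Wald SPRT for Bernoulli observations; returns (decision, samples used)."""
        a = math.log(beta / (1 - alpha))        # lower log-threshold (accept H0)
        b = math.log((1 - beta) / alpha)        # upper log-threshold (accept H1)
        llr = 0.0
        for n, x in enumerate(samples, start=1):
            llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
            if llr <= a:
                return "accept H0", n
            if llr >= b:
                return "accept H1", n
        return "undecided", len(samples)

    rng = random.Random(1)
    data = [rng.random() < 0.6 for _ in range(1000)]     # simulated data with true p = 0.6
    print(sprt(data, p0=0.5, p1=0.6, alpha=0.05, beta=0.05))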

  10. Tailoring single-photon and multiphoton probabilities of a single-photon on-demand source

    International Nuclear Information System (INIS)

    Migdall, A.L.; Branning, D.; Castelletto, S.

    2002-01-01

    As typically implemented, single-photon sources cannot be made to produce single photons with high probability, while simultaneously suppressing the probability of yielding two or more photons. Because of this, single-photon sources cannot really produce single photons on demand. We describe a multiplexed system that allows the probabilities of producing one and more photons to be adjusted independently, enabling a much better approximation of a source of single photons on demand

  11. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    Full Text Available The influence of “Grundbegriffe” by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory “calling for” an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables - dual maps to random variables) have very different “mathematical nature”. Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events, elementary category theory, and covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing “fractions” of classical random events, and we upgrade the notions of probability measure and random variable.

  12. A Comprehensive Probability Project for the Upper Division One-Semester Probability Course Using Yahtzee

    Science.gov (United States)

    Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa

    2011-01-01

    This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…

  13. Dynamic encoding of speech sequence probability in human temporal cortex.

    Science.gov (United States)

    Leonard, Matthew K; Bouchard, Kristofer E; Tang, Claire; Chang, Edward F

    2015-05-06

    Sensory processing involves identification of stimulus features, but also integration with the surrounding sensory and cognitive context. Previous work in animals and humans has shown fine-scale sensitivity to context in the form of learned knowledge about the statistics of the sensory environment, including relative probabilities of discrete units in a stream of sequential auditory input. These statistics are a defining characteristic of one of the most important sequential signals humans encounter: speech. For speech, extensive exposure to a language tunes listeners to the statistics of sound sequences. To address how speech sequence statistics are neurally encoded, we used high-resolution direct cortical recordings from human lateral superior temporal cortex as subjects listened to words and nonwords with varying transition probabilities between sound segments. In addition to their sensitivity to acoustic features (including contextual features, such as coarticulation), we found that neural responses dynamically encoded the language-level probability of both preceding and upcoming speech sounds. Transition probability first negatively modulated neural responses, followed by positive modulation of neural responses, consistent with coordinated predictive and retrospective recognition processes, respectively. Furthermore, transition probability encoding was different for real English words compared with nonwords, providing evidence for online interactions with high-order linguistic knowledge. These results demonstrate that sensory processing of deeply learned stimuli involves integrating physical stimulus features with their contextual sequential structure. Despite not being consciously aware of phoneme sequence statistics, listeners use this information to process spoken input and to link low-level acoustic representations with linguistic information about word identity and meaning. Copyright © 2015 the authors 0270-6474/15/357203-12$15.00/0.
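 
    A toy maximum-likelihood estimate of the segment-to-segment transition probabilities that the neural responses were found to track: bigram counts over a tiny segmented "corpus". The corpus below is fabricated for illustration.

    from collections import Counter

    corpus = [["k", "ae", "t"], ["k", "ae", "p"], ["t", "ae", "p"]]   # hypothetical segmented words

    bigrams = Counter()
    unigrams = Counter()
    for word in corpus:
        for a, b in zip(word, word[1:]):
            bigrams[(a, b)] += 1
            unigrams[a] += 1

    def transition_probability(a, b):
        """Maximum-likelihood P(b follows a) from the bigram and context counts."""
        return bigrams[(a, b)] / unigrams[a] if unigrams[a] else 0.0

    print(transition_probability("k", "ae"))   # 1.0 in this toy corpus
    print(transition_probability("ae", "t"))   # 1/3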

  14. Fragility estimation for seismically isolated nuclear structures by high confidence low probability of failure values and bi-linear regression

    International Nuclear Information System (INIS)

    Carausu, A.

    1996-01-01

    A method for the fragility estimation of seismically isolated nuclear power plant structure is proposed. The relationship between the ground motion intensity parameter (e.g. peak ground velocity or peak ground acceleration) and the response of isolated structures is expressed in terms of a bi-linear regression line, whose coefficients are estimated by the least-square method in terms of available data on seismic input and structural response. The notion of high confidence low probability of failure (HCLPF) value is also used for deriving compound fragility curves for coupled subsystems. (orig.)

  15. On estimating the fracture probability of nuclear graphite components

    International Nuclear Information System (INIS)

    Srinivasan, Makuteswara

    2008-01-01

    The properties of nuclear grade graphites exhibit anisotropy and could vary considerably within a manufactured block. Graphite strength is affected by the direction of alignment of the constituent coke particles, which is dictated by the forming method, coke particle size, and the size, shape, and orientation distribution of pores in the structure. In this paper, a Weibull failure probability analysis for components is presented using the American Society for Testing and Materials strength specification for nuclear grade graphites for core components in advanced high-temperature gas-cooled reactors. The risk of rupture (probability of fracture) and survival probability (reliability) of large graphite blocks are calculated for varying and discrete values of service tensile stresses. The limitations in these calculations are discussed from considerations of actual reactor environmental conditions that could potentially degrade the specification properties because of damage due to complex interactions between irradiation, temperature, stress, and variability in reactor operation
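 
    A minimal two-parameter Weibull sketch of the risk-of-rupture calculation described above, for a uniformly stressed component. The characteristic strength and Weibull modulus are illustrative values, not the specification values used in the paper.

    from math import exp

    def probability_of_fracture(stress, sigma_0, m):
        """Pf = 1 - exp(-(stress / sigma_0)^m) for a uniformly stressed component."""
        return 1.0 - exp(-((stress / sigma_0) ** m))

    sigma_0 = 20.0   # characteristic strength, MPa (assumed)
    m = 10.0         # Weibull modulus (assumed)
    for stress in (5.0, 10.0, 15.0, 18.0):
        pf = probability_of_fracture(stress, sigma_0, m)
        print(stress, round(pf, 6), round(1.0 - pf, 6))   # risk of rupture and reliability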

  16. Development of probabilistic thinking-oriented learning tools for probability materials at junior high school students

    Science.gov (United States)

    Sari, Dwi Ivayana; Hermanto, Didik

    2017-08-01

    This is a developmental study of probabilistic-thinking-oriented learning tools for probability material at the ninth grade, aimed at producing good probabilistic-thinking-oriented learning tools. The subjects were IX-A students of MTs Model Bangkalan. The development followed a 4-D model modified into three stages: define, design and develop. The teaching and learning tools consist of a lesson plan, students' worksheet, teaching media and a students' achievement test. The research instruments were a learning-tools validation sheet, a teachers' activity sheet, a students' activity sheet, a students' response questionnaire and the students' achievement test. Results from these instruments were analyzed descriptively to answer the research objectives. The outcome was a valid set of probabilistic-thinking-oriented learning tools for teaching probability to ninth grade students: after revision based on the validation and a classroom trial, the teacher's classroom management was effective, students' activities were good, students' responses to the learning tools were positive, and the achievement test met the validity, sensitivity and reliability criteria. In summary, these teaching and learning tools can be used by teachers to teach probability and to develop students' probabilistic thinking.

  17. Is probability of frequency too narrow?

    International Nuclear Information System (INIS)

    Martz, H.F.

    1993-01-01

    Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed

  18. An Alternative Version of Conditional Probabilities and Bayes' Rule: An Application of Probability Logic

    Science.gov (United States)

    Satake, Eiki; Amato, Philip P.

    2008-01-01

    This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…

  19. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  1. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  2. High-severity fire: evaluating its key drivers and mapping its probability across western US forests

    Science.gov (United States)

    Parks, Sean A.; Holsinger, Lisa M.; Panunto, Matthew H.; Jolly, W. Matt; Dobrowski, Solomon Z.; Dillon, Gregory K.

    2018-04-01

    Wildland fire is a critical process in forests of the western United States (US). Variation in fire behavior, which is heavily influenced by fuel loading, terrain, weather, and vegetation type, leads to heterogeneity in fire severity across landscapes. The relative influence of these factors in driving fire severity, however, is poorly understood. Here, we explore the drivers of high-severity fire for forested ecoregions in the western US over the period 2002–2015. Fire severity was quantified using a satellite-inferred index of severity, the relativized burn ratio. For each ecoregion, we used boosted regression trees to model high-severity fire as a function of live fuel, topography, climate, and fire weather. We found that live fuel, on average, was the most important factor driving high-severity fire among ecoregions (average relative influence = 53.1%) and was the most important factor in 14 of 19 ecoregions. Fire weather was the second most important factor among ecoregions (average relative influence = 22.9%) and was the most important factor in five ecoregions. Climate (13.7%) and topography (10.3%) were less influential. We also predicted the probability of high-severity fire, were a fire to occur, using recent (2016) satellite imagery to characterize live fuel for a subset of ecoregions in which the model skill was deemed acceptable (n = 13). These ‘wall-to-wall’ gridded ecoregional maps provide relevant and up-to-date information for scientists and managers who are tasked with managing fuel and wildland fire. Lastly, we provide an example of the predicted likelihood of high-severity fire under moderate and extreme fire weather before and after fuel reduction treatments, thereby demonstrating how our framework and model predictions can potentially serve as a performance metric for land management agencies tasked with reducing hazardous fuel across large landscapes.
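 
    A schematic of the modelling step described above (boosted trees for a binary high-severity outcome), using scikit-learn's gradient boosting as a stand-in for the boosted regression trees in the study. The predictors and responses below are randomly generated placeholders for live fuel, fire weather, climate and topography, not the remote-sensing data.

    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier

    rng = np.random.default_rng(0)
    n = 500
    X = rng.normal(size=(n, 4))                    # columns: live fuel, fire weather, climate, topography (synthetic)
    logit = 1.5 * X[:, 0] + 0.8 * X[:, 1] + 0.3 * X[:, 2] + 0.2 * X[:, 3]
    y = rng.random(n) < 1 / (1 + np.exp(-logit))   # synthetic high-severity indicator

    model = GradientBoostingClassifier().fit(X, y)
    print(model.feature_importances_)              # analogous to the paper's relative influence
    print(model.predict_proba(X[:3])[:, 1])        # predicted probability of high-severity fire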

  3. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  4. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  5. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  6. Substrate texture affects female cricket walking response to male calling song

    Science.gov (United States)

    Sarmiento-Ponce, E. J.; Sutcliffe, M. P. F.; Hedwig, B.

    2018-03-01

    Field crickets are extensively used as a model organism to study female phonotactic walking behaviour, i.e. their attraction to the male calling song. Laboratory-based phonotaxis experiments generally rely on arena or trackball-based settings; however, no attention has been paid to the effect of substrate texture on the response. Here, we tested phonotaxis in female Gryllus bimaculatus, walking on trackballs machined from methyl-methacrylate foam with different cell sizes. Surface height variations of the trackballs, due to the cellular composition of the material, were measured with profilometry and characterized as smooth, medium or rough, with roughness amplitudes of 7.3, 16 and 180 µm. Female phonotaxis was best on a rough and medium trackball surface, a smooth surface resulted in a significant lower phonotactic response. Claws of the cricket foot were crucial for effective walking. Females insert their claws into the surface pores to allow mechanical interlocking with the substrate texture and a high degree of attachment, which cannot be established on smooth surfaces. These findings provide insight to the biomechanical basis of insect walking and may inform behavioural studies that the surface texture on which walking insects are tested is crucial for the resulting behavioural response.

  7. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  8. Midcourse Guidance Law Based on High Target Acquisition Probability Considering Angular Constraint and Line-of-Sight Angle Rate Control

    Directory of Open Access Journals (Sweden)

    Xiao Liu

    2016-01-01

    Full Text Available Random disturbance factors would lead to variation of the target acquisition point during long distance flight. To achieve a high target acquisition probability and improve impact precision, missiles should be guided to an appropriate target acquisition position with certain attitude angles and line-of-sight (LOS) angle rate. This paper presents a new midcourse guidance law considering the influences of random disturbances, detection distance restraint, and target acquisition probability with Monte Carlo simulation. Detailed analyses of the impact points on the ground and the random distribution of the target acquisition position in 3D space are given to obtain the appropriate attitude angles and the end position for the midcourse guidance. Then, a new formulation of the biased proportional navigation (BPN) guidance law with angular constraint and LOS angle rate control is derived to ensure tracking ability when attacking a maneuvering target. Numerical simulations demonstrate that, compared with the proportional navigation guidance (PNG) law and the near-optimal spatial midcourse guidance (NSMG) law, the BPN guidance law shows satisfactory performance and meets both the midcourse terminal angular constraint and the LOS angle rate requirement.

  9. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.

  10. Nuclear data uncertainties: I, Basic concepts of probability

    International Nuclear Information System (INIS)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs
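    To make one of the listed relationships between probability laws concrete, the short Python sketch below (with illustrative numbers only, not taken from the report) shows that for many trials with a small per-trial probability, the binomial distribution is closely approximated by the Poisson distribution with the same mean.

        from math import comb, exp, factorial

        def binomial_pmf(k, n, p):
            return comb(n, k) * p**k * (1 - p)**(n - k)

        def poisson_pmf(k, mean):
            return exp(-mean) * mean**k / factorial(k)

        # hypothetical example: many trials, small per-trial probability
        n, p = 10_000, 3e-4
        for k in range(6):
            print(k, round(binomial_pmf(k, n, p), 5), round(poisson_pmf(k, n * p), 5))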

  11. Codificação estatística das categorias fonéticas: vestígio da dinâmica da fala na fonotaxe lexical

    Directory of Open Access Journals (Sweden)

    Eleonora C. Albano

    2012-11-01

    Full Text Available This paper presents a new, intriguing finding and discusses some of its theoretical implications: phonetic categorization, including major class membership, is entirely predictable from phonotactic biases in three Brazilian Portuguese word databases. The predictors are log frequencies of ‘VC, ‘CV and V’CV sequences consisting of the 7 stressed vowels combined with the 19 onset consonants, plus the 5 Southeastern pre-stressed vowels. Correct vowel categorization arises through discriminant analysis of ‘VC and ‘CV data. Correct consonant categorization arises through discriminant analysis of V’CV data. Results are consistent across databases and, thus, strongly suggest that statistical biases in the lexicon can be so stable as to code phonetic categories. The findings and their corollaries bear on the issue of the relationship of lexical phonotactics to speech dynamics.

  12. Qubit-qutrit separability-probability ratios

    International Nuclear Information System (INIS)

    Slater, Paul B.

    2005-01-01

    Paralleling our recent computationally intensive (quasi-Monte Carlo) work for the case N=4 (e-print quant-ph/0308037), we undertake the task for N=6 of computing to high numerical accuracy, the formulas of Sommers and Zyczkowski (e-print quant-ph/0304041) for the (N^2 - 1)-dimensional volume and (N^2 - 2)-dimensional hyperarea of the (separable and nonseparable) NxN density matrices, based on the Bures (minimal monotone) metric--and also their analogous formulas (e-print quant-ph/0302197) for the (nonmonotone) flat Hilbert-Schmidt metric. With the same seven 10^9 well-distributed ('low-discrepancy') sample points, we estimate the unknown volumes and hyperareas based on five additional (monotone) metrics of interest, including the Kubo-Mori and Wigner-Yanase. Further, we estimate all of these seven volume and seven hyperarea (unknown) quantities when restricted to the separable density matrices. The ratios of separable volumes (hyperareas) to separable plus nonseparable volumes (hyperareas) yield estimates of the separability probabilities of generically rank-6 (rank-5) density matrices. The (rank-6) separability probabilities obtained based on the 35-dimensional volumes appear to be--independently of the metric (each of the seven inducing Haar measure) employed--twice as large as those (rank-5 ones) based on the 34-dimensional hyperareas. (An additional estimate--33.9982--of the ratio of the rank-6 Hilbert-Schmidt separability probability to the rank-4 one is quite clearly close to integral too.) The doubling relationship also appears to hold for the N=4 case for the Hilbert-Schmidt metric, but not the others. We fit simple exact formulas to our estimates of the Hilbert-Schmidt separable volumes and hyperareas in both the N=4 and N=6 cases

  13. The probability and severity of decompression sickness

    Science.gov (United States)

    Hada, Ethan A.; Vann, Richard D.; Denoble, Petar J.

    2017-01-01

    Decompression sickness (DCS), which is caused by inert gas bubbles in tissues, is an injury of concern for scuba divers, compressed air workers, astronauts, and aviators. Case reports for 3322 air and N2-O2 dives, resulting in 190 DCS events, were retrospectively analyzed and the outcomes were scored as (1) serious neurological, (2) cardiopulmonary, (3) mild neurological, (4) pain, (5) lymphatic or skin, and (6) constitutional or nonspecific manifestations. Following standard U.S. Navy medical definitions, the data were grouped into mild—Type I (manifestations 4–6)–and serious–Type II (manifestations 1–3). Additionally, we considered an alternative grouping of mild–Type A (manifestations 3–6)–and serious–Type B (manifestations 1 and 2). The current U.S. Navy guidance allows for a 2% probability of mild DCS and a 0.1% probability of serious DCS. We developed a hierarchical trinomial (3-state) probabilistic DCS model that simultaneously predicts the probability of mild and serious DCS given a dive exposure. Both the Type I/II and Type A/B discriminations of mild and serious DCS resulted in a highly significant (p probability of ‘mild’ DCS resulted in a longer allowable bottom time for the same 2% limit. However, for the 0.1% serious DCS limit, we found a vastly decreased allowable bottom dive time for all dive depths. If the Type A/B scoring was assigned to outcome severity, the no decompression limits (NDL) for air dives were still controlled by the acceptable serious DCS risk limit rather than the acceptable mild DCS risk limit. However, in this case, longer NDL limits were allowed than with the Type I/II scoring. The trinomial model mild and serious probabilities agree reasonably well with the current air NDL only with the Type A/B scoring and when 0.2% risk of serious DCS is allowed. PMID:28296928

  14. Probability of detection of clinical seizures using heart rate changes.

    Science.gov (United States)

    Osorio, Ivan; Manly, B F J

    2015-08-01

    Heart rate-based seizure detection is a viable complement or alternative to ECoG/EEG. This study investigates the role of various biological factors on the probability of clinical seizure detection using heart rate. Regression models were applied to 266 clinical seizures recorded from 72 subjects to investigate if factors such as age, gender, years with epilepsy, etiology, seizure site origin, seizure class, and data collection centers, among others, shape the probability of EKG-based seizure detection. Clinical seizure detection probability based on heart rate changes, is significantly (pprobability of detecting clinical seizures (>0.8 in the majority of subjects) using heart rate is highest for complex partial seizures, increases with a patient's years with epilepsy, is lower for females than for males and is unrelated to the side of hemisphere origin. Clinical seizure detection probability using heart rate is multi-factorially dependent and sufficiently high (>0.8) in most cases to be clinically useful. Knowledge of the role that these factors play in shaping said probability will enhance its applicability and usefulness. Heart rate is a reliable and practical signal for extra-cerebral detection of clinical seizures originating from or spreading to central autonomic network structures. Copyright © 2015 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.

  15. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  16. A Probability-based Evolutionary Algorithm with Mutations to Learn Bayesian Networks

    Directory of Open Access Journals (Sweden)

    Sho Fukuda

    2014-12-01

    Full Text Available Bayesian networks are regarded as one of the essential tools to analyze causal relationships between events from data. Learning the structure of highly reliable Bayesian networks from data as quickly as possible is an important problem that several studies have tried to address. In recent years, probability-based evolutionary algorithms have been proposed as a new efficient approach to learn Bayesian networks. In this paper, we focus on one of the probability-based evolutionary algorithms, PBIL (Probability-Based Incremental Learning), and propose a new mutation operator. Through performance evaluation, we found that the proposed mutation operator performs well in learning Bayesian networks.
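    For readers unfamiliar with PBIL, the following minimal Python sketch illustrates the general idea on a toy one-max problem; it is not the authors' implementation, and the mutation shown is a generic perturbation of the probability vector rather than the specific operator proposed in the paper.

        import random

        def pbil_onemax(n_bits=20, pop_size=30, generations=100,
                        lr=0.1, mut_prob=0.02, mut_shift=0.05):
            """Toy PBIL: learn a probability vector that generates good bit strings."""
            p = [0.5] * n_bits                              # the probability model
            for _ in range(generations):
                pop = [[1 if random.random() < pi else 0 for pi in p]
                       for _ in range(pop_size)]
                best = max(pop, key=sum)                    # one-max fitness = number of 1s
                # shift the model towards the best sample
                p = [(1 - lr) * pi + lr * bi for pi, bi in zip(p, best)]
                # mutate the probability vector itself (a generic PBIL-style mutation)
                p = [(1 - mut_shift) * pi + mut_shift * random.random()
                     if random.random() < mut_prob else pi
                     for pi in p]
            return p

        print([round(pi, 2) for pi in pbil_onemax()])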

  17. Uncertainty plus prior equals rational bias: an intuitive Bayesian probability weighting function.

    Science.gov (United States)

    Fennell, John; Baddeley, Roland

    2012-10-01

    Empirical research has shown that when making choices based on probabilistic options, people behave as if they overestimate small probabilities, underestimate large probabilities, and treat positive and negative outcomes differently. These distortions have been modeled using a nonlinear probability weighting function, which is found in several nonexpected utility theories, including rank-dependent models and prospect theory; here, we propose a Bayesian approach to the probability weighting function and, with it, a psychological rationale. In the real world, uncertainty is ubiquitous and, accordingly, the optimal strategy is to combine probability statements with prior information using Bayes' rule. First, we show that any reasonable prior on probabilities leads to 2 of the observed effects; overweighting of low probabilities and underweighting of high probabilities. We then investigate 2 plausible kinds of priors: informative priors based on previous experience and uninformative priors of ignorance. Individually, these priors potentially lead to large problems of bias and inefficiency, respectively; however, when combined using Bayesian model comparison methods, both forms of prior can be applied adaptively, gaining the efficiency of empirical priors and the robustness of ignorance priors. We illustrate this for the simple case of generic good and bad options, using Internet blogs to estimate the relevant priors of inference. Given this combined ignorant/informative prior, the Bayesian probability weighting function is not only robust and efficient but also matches all of the major characteristics of the distortions found in empirical research. PsycINFO Database Record (c) 2012 APA, all rights reserved.
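    One simple way to see the qualitative effect described above is to shrink a stated probability towards 0.5 with a symmetric Beta prior. The sketch below is illustrative only and is not the authors' model (which combines informative and ignorance priors via Bayesian model comparison); the sample-size and prior parameters are hypothetical.

        # A stated probability p is treated as if estimated from n noisy observations
        # and combined with a symmetric Beta(a, a) prior. The posterior mean pulls p
        # towards 0.5: small probabilities are overweighted, large ones underweighted.
        def weighted(p, n=10.0, a=1.0):
            k = p * n                          # "virtual successes" implied by p
            return (k + a) / (n + 2 * a)       # posterior mean under the Beta(a, a) prior

        for p in (0.01, 0.1, 0.5, 0.9, 0.99):
            print(f"stated p = {p:4.2f}  ->  weighted p = {weighted(p):.3f}")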

  18. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics.

  19. Estimation of functional failure probability of passive systems based on subset simulation method

    International Nuclear Information System (INIS)

    Wang Dongqing; Wang Baosheng; Zhang Jianmin; Jiang Jing

    2012-01-01

    In order to solve the problem of multi-dimensional epistemic uncertainties and the small functional failure probability of passive systems, an innovative reliability analysis algorithm called subset simulation, based on Markov chain Monte Carlo, was presented. The method is founded on the idea that a small failure probability can be expressed as a product of larger conditional failure probabilities by introducing a proper choice of intermediate failure events. Markov chain Monte Carlo simulation was implemented to efficiently generate conditional samples for estimating the conditional failure probabilities. Taking the AP1000 passive residual heat removal system as an example, the uncertainties related to the model of a passive system and the numerical values of its input parameters were considered in this paper. The probability of functional failure was then estimated with the subset simulation method. The numerical results demonstrate that the subset simulation method has high computing efficiency and excellent accuracy compared with traditional probability analysis methods. (authors)
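    The core idea, a small failure probability written as a product of larger conditional probabilities estimated by conditional Markov chain Monte Carlo sampling, can be sketched generically as follows. This toy Python example is not the authors' implementation; the proposal spread, level probability p0 and sample sizes are arbitrary choices.

        import math, random

        def subset_simulation(limit_state, dim, n=1000, p0=0.1, seed=1):
            """Toy subset simulation: estimate P(limit_state(x) > 0) for x ~ N(0, I)."""
            rng = random.Random(seed)
            normal = lambda: rng.gauss(0.0, 1.0)
            samples = [[normal() for _ in range(dim)] for _ in range(n)]  # level 0: plain MC
            prob = 1.0
            while True:
                vals = sorted(((limit_state(x), x) for x in samples), reverse=True)
                n_keep = int(p0 * n)
                threshold = vals[n_keep - 1][0]
                if threshold > 0:                  # failure region reached at this level
                    n_fail = sum(1 for v, _ in vals if v > 0)
                    return prob * n_fail / n
                prob *= p0                         # accumulate conditional level probability
                seeds = [x for _, x in vals[:n_keep]]
                samples = []                       # regrow the population above the threshold
                for s in seeds:                    # modified Metropolis, component by component
                    x = list(s)
                    for _ in range(n // n_keep):
                        cand = list(x)
                        for i in range(dim):
                            c = x[i] + 0.8 * normal()
                            if rng.random() < math.exp(min(0.0, -(c * c - x[i] * x[i]) / 2.0)):
                                cand[i] = c
                        if limit_state(cand) > threshold:
                            x = cand               # otherwise stay, keeping the chain conditional
                        samples.append(list(x))

        # rare event: P(X1 + X2 > 6) for independent standard normals; exact value ~1.1e-5
        print(subset_simulation(lambda x: x[0] + x[1] - 6.0, dim=2))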

  20. Defining Probability in Sex Offender Risk Assessment.

    Science.gov (United States)

    Elwood, Richard W

    2016-12-01

    There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.

  1. UT Biomedical Informatics Lab (BMIL) probability wheel

    Science.gov (United States)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.
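    The arithmetic behind the wheel is simple; the sketch below (names are illustrative, not taken from the app) maps a probability and its complement to the angular sizes of the two slices.

        # Minimal sketch of the probability-wheel mapping (illustrative only).
        def wheel_angles(p_event):
            return {"event_slice_degrees": 360.0 * p_event,
                    "complement_slice_degrees": 360.0 * (1.0 - p_event)}

        print(wheel_angles(0.3))   # event slice 108 degrees, complement slice 252 degrees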

  2. Subspace Learning via Local Probability Distribution for Hyperspectral Image Classification

    Directory of Open Access Journals (Sweden)

    Huiwu Luo

    2015-01-01

    Full Text Available The computational processing of hyperspectral images (HSI) is extremely complex, not only due to the high dimensional information, but also due to the highly correlated data structure. The need for effective processing and analysis of HSI has therefore met many difficulties. Dimensionality reduction has proven to be a powerful tool for high dimensional data analysis. Local Fisher’s linear discriminant analysis (LFDA) is an effective method for HSI processing. In this paper, a novel approach, called PD-LFDA, is proposed to overcome the weakness of LFDA. PD-LFDA emphasizes the probability distribution (PD) in LFDA, where the maximum distance is replaced with local variance for the construction of the weight matrix and the class prior probability is applied to compute the affinity matrix. The proposed approach increases the discriminant ability of the transformed features in low dimensional space. Experimental results on the Indian Pines 1992 data indicate that the proposed approach significantly outperforms the traditional alternatives.

  3. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  4. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  5. [Biometric bases: basic concepts of probability calculation].

    Science.gov (United States)

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The basics of event algebra, the definition of probability, the classical probability model and the random variable are presented.

  6. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions...... and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate...... multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions)....

  7. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  8. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords : Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics OBOR OECD: Statistics and probability Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  9. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems

  10. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-01-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, L_st, P_st) for stochastic uncertainty, a probability space (S_su, L_su, P_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, L_st, P_st) and (S_su, L_su, P_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the US Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems
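    As an illustration of the summary object discussed in the two records above, the sketch below builds an empirical complementary cumulative distribution function (CCDF) from Monte Carlo samples; the variable names and lognormal values are invented for illustration and are not taken from the WIPP performance assessments.

        import random

        def empirical_ccdf(samples):
            """Return (value, estimated P(X > value)) pairs for an empirical CCDF."""
            xs = sorted(samples)
            n = len(xs)
            return [(x, (n - i - 1) / n) for i, x in enumerate(xs)]

        # hypothetical "release" values drawn from a lognormal distribution
        releases = [random.lognormvariate(0.0, 1.0) for _ in range(1000)]
        for x, p in empirical_ccdf(releases)[::200]:
            print(f"value {x:8.3f}   exceedance probability {p:.3f}")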

  11. Application of escape probability to line transfer in laser-produced plasmas

    International Nuclear Information System (INIS)

    Lee, Y.T.; London, R.A.; Zimmerman, G.B.; Haglestein, P.L.

    1989-01-01

    In this paper the authors apply the escape probability method to treat the transfer of optically thick lines in laser-produced plasmas in plane-parallel geometry. They investigate the effect of self-absorption on the ionization balance and ion level populations. In addition, they calculate this effect on the laser gains in an exploding foil target heated by an optical laser. Due to the large ion streaming motion in laser-produced plasmas, absorption of an emitted photon occurs only over the length in which the Doppler shift is equal to the line width. They find that the escape probability calculated with the Doppler shift is larger than the escape probability for a static plasma. Therefore, the ion streaming motion contributes significantly to the line transfer process in laser-produced plasmas. As examples, they have applied the escape probability method to calculate the transfer of optically thick lines in both ablating slab and exploding foil targets under irradiation by a high-power optical laser

  12. Numeracy moderates the influence of task-irrelevant affect on probability weighting.

    Science.gov (United States)

    Traczyk, Jakub; Fulawka, Kamil

    2016-06-01

    Statistical numeracy, defined as the ability to understand and process statistical and probability information, plays a significant role in superior decision making. However, recent research has demonstrated that statistical numeracy goes beyond simple comprehension of numbers and mathematical operations. In contrast to previous studies that focused on emotions integral to risky prospects, we hypothesized that highly numerate individuals would exhibit more linear probability weighting because they would be less biased by incidental and decision-irrelevant affect. Participants were instructed to make a series of insurance decisions preceded by negative (i.e., fear-inducing) or neutral stimuli. We found that incidental negative affect increased the curvature of the probability weighting function (PWF). Interestingly, this effect was significant only for less numerate individuals, while probability weighting in more numerate people was not altered by decision-irrelevant affect. We propose two candidate mechanisms for the observed effect. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Probable high prevalence of limb-girdle muscular dystrophy type 2D in Taiwan.

    Science.gov (United States)

    Liang, Wen-Chen; Chou, Po-Ching; Hung, Chia-Cheng; Su, Yi-Ning; Kan, Tsu-Min; Chen, Wan-Zi; Hayashi, Yukiko K; Nishino, Ichizo; Jong, Yuh-Jyh

    2016-03-15

    Limb-girdle muscular dystrophy type 2D (LGMD2D), an autosomal-recessive inherited LGMD, is caused by mutations in SGCA. SGCA encodes alpha-sarcoglycan (SG), which forms a heterotetramer with other SGs in the sarcolemma and comprises part of the dystrophin-glycoprotein complex. The frequency of LGMD2D is variable among different ethnic backgrounds, and so far only a few patients have been reported in Asia. We identified five patients with a novel homozygous mutation of c.101G>T (p.Arg34Leu) in SGCA from a large aboriginal family ethnically comprising two tribes in Taiwan. Patient 3 is the maternal uncle of patients 1 and 2. All their parents, heterozygous for c.101G>T, denied consanguineous marriages although they were from the same tribe. The heterozygous parents of patients 4 and 5 were from two different tribes, originally residing in different geographic regions in Taiwan. Haplotype analysis showed that all five patients shared the same mutation-associated haplotype, indicating a probable founder effect and consanguinity. The results suggest that the carrier rate of c.101G>T in SGCA may be high in Taiwan, especially in the aboriginal population regardless of tribe. It is important to investigate the prevalence of LGMD2D in Taiwan for early diagnosis and treatment. Copyright © 2016. Published by Elsevier B.V.

  14. Void probability scaling in hadron nucleus interactions

    International Nuclear Information System (INIS)

    Ghosh, Dipak; Deb, Argha; Bhattacharyya, Swarnapratim; Ghosh, Jayita; Bandyopadhyay, Prabhat; Das, Rupa; Mukherjee, Sima

    2002-01-01

    Hegyi, while investigating the rapidity gap probability (which measures the chance of finding no particle in the pseudo-rapidity interval Δη), found that the scaling behavior of the rapidity gap probability has a close correspondence with the scaling of the void probability in galaxy correlation studies. The main aim of this paper is to study the scaling behavior of the rapidity gap probability

  15. Improving information extraction using a probability-based approach

    DEFF Research Database (Denmark)

    Kim, S.; Ahmed, Saeema; Wallace, K.

    2007-01-01

    Information plays a crucial role during the entire life-cycle of a product. It has been shown that engineers frequently consult colleagues to obtain the information they require to solve problems. However, the industrial world is now more transient and key personnel move to other companies...... or retire. It is becoming essential to retrieve vital information from archived product documents, if it is available. There is, therefore, great interest in ways of extracting relevant and sharable information from documents. A keyword-based search is commonly used, but studies have shown...... the recall, while maintaining the high precision, a learning approach that makes identification decisions based on a probability model, rather than simply looking up the presence of the pre-defined variations, looks promising. This paper presents the results of developing such a probability-based entity...

  16. Failure probability analysis on mercury target vessel

    International Nuclear Information System (INIS)

    Ishikura, Syuichi; Futakawa, Masatoshi; Kogawa, Hiroyuki; Sato, Hiroshi; Haga, Katsuhiro; Ikeda, Yujiro

    2005-03-01

    Failure probability analysis was carried out to estimate the lifetime of the mercury target which will be installed into the JSNS (Japan spallation neutron source) in J-PARC (Japan Proton Accelerator Research Complex). The lifetime was estimated taking the loading conditions and materials degradation into account. The loads considered on the target vessel were the static stresses due to thermal expansion and static pre-pressure of the He gas and mercury, and the dynamic stresses due to the thermally shocked pressure waves generated repeatedly at 25 Hz. Materials used in the target vessel will be degraded by fatigue, neutron and proton irradiation, mercury immersion, pitting damage, etc. The imposed stresses were evaluated through static and dynamic structural analyses. The material degradations were deduced from published experimental data. As a result, it was quantitatively confirmed that the failure probability of the safety hull over the expected design lifetime is very low, about 10^-11, meaning that it will hardly fail during the design lifetime. On the other hand, the beam window of the mercury vessel, which suffers high-pressure waves, exhibits a failure probability of 12%. It was concluded, therefore, that mercury leaking from a failed area at the beam window is adequately kept in the space between the safety hull and the mercury vessel, by using mercury-leakage sensors. (author)

  17. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.

  18. Soybean P34 Probable Thiol Protease Probably Has Proteolytic Activity on Oleosins.

    Science.gov (United States)

    Zhao, Luping; Kong, Xiangzhen; Zhang, Caimeng; Hua, Yufei; Chen, Yeming

    2017-07-19

    P34 probable thiol protease (P34) and Gly m Bd 30K (30K) are closely linked to the protease activity against the 24 kDa oleosin of soybean oil bodies. In this study, 9-day-germinated soybean was used to separate bioprocessed P34 (P32) from bioprocessed 30K (28K). Interestingly, P32 existed as a dimer, whereas 28K existed as a monomer; a P32-rich sample had proteolytic activity and high cleavage site specificity (Lys-Thr of the 24 kDa oleosin), whereas a 28K-rich sample showed low proteolytic activity; the P32-rich sample contained one thiol protease. After mixing with purified oil bodies, all P32 dimers were dissociated and bound to 24 kDa oleosins to form P32-24 kDa oleosin complexes. Upon incubation, the 24 kDa oleosin was preferentially hydrolyzed, and two hydrolyzed products (HPs; 17 and 7 kDa) were confirmed. After most of the 24 kDa oleosin was hydrolyzed, some P32 existed as dimer, and the remainder as P32-17 kDa HP. It was suggested that P32 was the protease.

  19. Exploring non-signalling polytopes with negative probability

    International Nuclear Information System (INIS)

    Oas, G; Barros, J Acacio de; Carvalhaes, C

    2014-01-01

    Bipartite and tripartite EPR–Bell type systems are examined via joint quasi-probability distributions where probabilities are permitted to be negative. It is shown that such distributions exist only when the no-signalling condition is satisfied. A characteristic measure, the probability mass, is introduced and, via its minimization, limits the number of quasi-distributions describing a given marginal probability distribution. The minimized probability mass is shown to be an alternative way to characterize non-local systems. Non-signalling polytopes for two to eight settings in the bipartite scenario are examined and compared to prior work. Examining perfect cloning of non-local systems within the tripartite scenario suggests defining two categories of signalling. It is seen that many properties of non-local systems can be efficiently described by quasi-probability theory. (paper)

  20. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections

  1. The enigma of probability and physics

    International Nuclear Information System (INIS)

    Mayants, L.

    1984-01-01

    This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (science of probability) and probabilistic physics (application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed - quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)

  2. Targeting the probability versus cost of feared outcomes in public speaking anxiety.

    Science.gov (United States)

    Nelson, Elizabeth A; Deacon, Brett J; Lickel, James J; Sy, Jennifer T

    2010-04-01

    Cognitive-behavioral theory suggests that social phobia is maintained, in part, by overestimates of the probability and cost of negative social events. Indeed, empirically supported cognitive-behavioral treatments directly target these cognitive biases through the use of in vivo exposure or behavioral experiments. While cognitive-behavioral theories and treatment protocols emphasize the importance of targeting probability and cost biases in the reduction of social anxiety, few studies have examined specific techniques for reducing probability and cost bias, and thus the relative efficacy of exposure to the probability versus cost of negative social events is unknown. In the present study, 37 undergraduates with high public speaking anxiety were randomly assigned to a single-session intervention designed to reduce either the perceived probability or the perceived cost of negative outcomes associated with public speaking. Compared to participants in the probability treatment condition, those in the cost treatment condition demonstrated significantly greater improvement on measures of public speaking anxiety and cost estimates for negative social events. The superior efficacy of the cost treatment condition was mediated by greater treatment-related changes in social cost estimates. The clinical implications of these findings are discussed. Published by Elsevier Ltd.

  3. F.Y. Edgeworth’s Treatise on Probabilities

    OpenAIRE

    Alberto Baccini

    2007-01-01

    Probability theory has a central role in Edgeworth’s thought; this paper examines the philosophical foundation of the theory. Starting from a frequentist position, Edgeworth introduced some innovations on the definition of primitive probabilities. He distinguished between primitive probabilities based on experience of statistical evidence, and primitive a priori probabilities based on a more general and less precise kind of experience, inherited by the human race through evolution. Given prim...

  4. Constructing diagnostic likelihood: clinical decisions using subjective versus statistical probability.

    Science.gov (United States)

    Kinnear, John; Jackson, Ruth

    2017-07-01

    Although physicians are highly trained in the application of evidence-based medicine, and are assumed to make rational decisions, there is evidence that their decision making is prone to biases. One of the biases that has been shown to affect accuracy of judgements is that of representativeness and base-rate neglect, where the saliency of a person's features leads to overestimation of their likelihood of belonging to a group. This results in the substitution of 'subjective' probability for statistical probability. This study examines clinicians' propensity to make estimations of subjective probability when presented with clinical information that is considered typical of a medical condition. The strength of the representativeness bias is tested by presenting choices in textual and graphic form. Understanding of statistical probability is also tested by omitting all clinical information. For the questions that included clinical information, 46.7% and 45.5% of clinicians made judgements of statistical probability, respectively. Where the question omitted clinical information, 79.9% of clinicians made a judgement consistent with statistical probability. There was a statistically significant difference in responses to the questions with and without representativeness information (χ2 (1, n=254)=54.45, pprobability. One of the causes for this representativeness bias may be the way clinical medicine is taught where stereotypic presentations are emphasised in diagnostic decision making. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
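    A worked Bayes' rule example (with hypothetical numbers, not taken from the study) shows how far a subjective, representativeness-based judgement can drift from the statistical probability when the base rate is low.

        # Hypothetical numbers: a clinical picture is "typical" of a rare disease,
        # i.e. seen in 90% of patients who have it but also in 10% of those who do
        # not, while the disease affects only 1 in 500 patients.
        def posterior(prior, p_typical_given_disease, p_typical_given_healthy):
            num = p_typical_given_disease * prior
            return num / (num + p_typical_given_healthy * (1.0 - prior))

        print(posterior(prior=1 / 500,
                        p_typical_given_disease=0.9,
                        p_typical_given_healthy=0.1))
        # -> about 0.018: despite the salient, "representative" presentation, the
        #    statistical probability stays below 2% because the base rate is so low.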

  5. Decreased Serum Lipids in Patients with Probable Alzheimer's Disease

    Directory of Open Access Journals (Sweden)

    Orhan Lepara

    2009-08-01

    Full Text Available Alzheimer’s disease (AD) is a multifactorial disease but its aetiology and pathophysiology are still not fully understood. Epidemiologic studies examining the association between lipids and dementia have reported conflicting results. High total cholesterol has been associated with both an increased and a decreased risk of AD and/or vascular dementia (VAD), whereas other studies found no association. The aim of this study was to investigate serum lipid concentrations in patients with probable AD, as well as a possible correlation between serum lipid concentrations and cognitive impairment. Our cross-sectional study included 30 patients with probable AD and 30 age- and sex-matched control subjects. The probable AD was clinically diagnosed by NINCDS-ADRDA criteria. Serum total cholesterol (TC), high-density lipoprotein cholesterol (HDL-C) and triglyceride (TG) levels were determined at the initial assessment using standard enzymatic colorimetric techniques. Low-density lipoprotein cholesterol (LDL-C) and very low density lipoprotein cholesterol (VLDL-C) levels were calculated. Subjects with probable AD had significantly lower serum TG (p<0.01), TC (p<0.05), LDL-C (p<0.05) and VLDL-C (p<0.01) compared to the control group. We did not observe a significant difference in HDL-C level between patients with probable AD and control subjects. A negative, although not significant, correlation between TG, TC and VLDL-C and MMSE in patients with AD was observed. In the control group of subjects there was a negative correlation between TC and MMSE but it was not statistically significant (r = -0.28). Further studies are required to explore the possibility for serum lipids to serve as diagnostic and therapeutic markers of AD.

  6. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  7. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  8. Modelling detection probabilities to evaluate management and control tools for an invasive species

    Science.gov (United States)

    Christy, M.T.; Yackel Adams, A.A.; Rodda, G.H.; Savidge, J.A.; Tyrrell, C.L.

    2010-01-01

    For most ecologists, detection probability (p) is a nuisance variable that must be modelled to estimate the state variable of interest (i.e. survival, abundance, or occupancy). However, in the realm of invasive species control, the rate of detection and removal is the rate-limiting step for management of this pervasive environmental problem. For strategic planning of an eradication (removal of every individual), one must identify the least likely individual to be removed, and determine the probability of removing it. To evaluate visual searching as a control tool for populations of the invasive brown treesnake Boiga irregularis, we designed a mark-recapture study to evaluate detection probability as a function of time, gender, size, body condition, recent detection history, residency status, searcher team and environmental covariates. We evaluated these factors using 654 captures resulting from visual detections of 117 snakes residing in a 5-ha semi-forested enclosure on Guam, fenced to prevent immigration and emigration of snakes but not their prey. Visual detection probability was low overall (= 0.07 per occasion) but reached 0.18 under optimal circumstances. Our results supported sex-specific differences in detectability that were a quadratic function of size, with both small and large females having lower detection probabilities than males of those sizes. There was strong evidence for individual periodic changes in detectability of a few days duration, roughly doubling detection probability (comparing peak to non-elevated detections). Snakes in poor body condition had estimated mean detection probabilities greater than snakes with high body condition. Search teams with high average detection rates exhibited detection probabilities about twice that of search teams with low average detection rates. Surveys conducted with bright moonlight and strong wind gusts exhibited moderately decreased probabilities of detecting snakes. Synthesis and applications. By
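    Using the per-occasion detection probabilities reported above, a rough planning calculation (assuming, as a simplification the record does not make, independent surveys with equal detection probability) converts them into the number of surveys needed for a high cumulative probability of detecting a given individual.

        # Cumulative detection probability over repeated, independent surveys.
        def p_detected_at_least_once(p_per_occasion, n_surveys):
            return 1.0 - (1.0 - p_per_occasion) ** n_surveys

        # per-occasion probabilities from the record: ~0.07 overall, up to ~0.18
        for p in (0.07, 0.18):
            surveys_needed = next(k for k in range(1, 1000)
                                  if p_detected_at_least_once(p, k) >= 0.95)
            print(f"p = {p:.2f}: {surveys_needed} surveys for a 95% chance of detection")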

  9. Exact calculation of loop formation probability identifies folding motifs in RNA secondary structures

    Science.gov (United States)

    Sloma, Michael F.; Mathews, David H.

    2016-01-01

    RNA secondary structure prediction is widely used to analyze RNA sequences. In an RNA partition function calculation, free energy nearest neighbor parameters are used in a dynamic programming algorithm to estimate statistical properties of the secondary structure ensemble. Previously, partition functions have largely been used to estimate the probability that a given pair of nucleotides form a base pair, the conditional stacking probability, the accessibility to binding of a continuous stretch of nucleotides, or a representative sample of RNA structures. Here it is demonstrated that an RNA partition function can also be used to calculate the exact probability of formation of hairpin loops, internal loops, bulge loops, or multibranch loops at a given position. This calculation can also be used to estimate the probability of formation of specific helices. Benchmarking on a set of RNA sequences with known secondary structures indicated that loops that were calculated to be more probable were more likely to be present in the known structure than less probable loops. Furthermore, highly probable loops are more likely to be in the known structure than the set of loops predicted in the lowest free energy structures. PMID:27852924

  10. Measurement of probability distributions for internal stresses in dislocated crystals

    Energy Technology Data Exchange (ETDEWEB)

    Wilkinson, Angus J.; Tarleton, Edmund; Vilalta-Clemente, Arantxa; Collins, David M. [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PH (United Kingdom); Jiang, Jun; Britton, T. Benjamin [Department of Materials, Imperial College London, Royal School of Mines, Exhibition Road, London SW7 2AZ (United Kingdom)

    2014-11-03

    Here, we analyse residual stress distributions obtained from various crystal systems using high resolution electron backscatter diffraction (EBSD) measurements. Histograms showing stress probability distributions exhibit tails extending to very high stress levels. We demonstrate that these extreme stress values are consistent with the functional form that should be expected for dislocated crystals. Analysis initially developed by Groma and co-workers for X-ray line profile analysis and based on the so-called “restricted second moment of the probability distribution” can be used to estimate the total dislocation density. The generality of the results are illustrated by application to three quite different systems, namely, face centred cubic Cu deformed in uniaxial tension, a body centred cubic steel deformed to larger strain by cold rolling, and hexagonal InAlN layers grown on misfitting sapphire and silicon carbide substrates.

  11. Failure Probability Estimation of Wind Turbines by Enhanced Monte Carlo

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Naess, Arvid

    2012-01-01

    This paper discusses the estimation of the failure probability of wind turbines required by codes of practice for designing them. Standard Monte Carlo (SMC) simulation may conceptually be used for this purpose as an alternative to the popular Peaks-Over-Threshold (POT) method. However......, estimation of very low failure probabilities with SMC simulations leads to unacceptably high computational costs. In this study, an Enhanced Monte Carlo (EMC) method is proposed that overcomes this obstacle. The method has advantages over both POT and SMC in terms of its low computational cost and accuracy...... is controlled by the pitch controller. This provides a fair framework for comparison of the behavior and failure event of the wind turbine with emphasis on the effect of the pitch controller. The Enhanced Monte Carlo method is then applied to the model and the failure probabilities of the model are estimated...
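    A back-of-the-envelope sketch (illustrative only) of why standard Monte Carlo is impractical here: the coefficient of variation of the SMC estimator of a failure probability p from n samples is sqrt((1 - p) / (n p)), so the sample size needed for a fixed accuracy grows roughly like 1/p.

        # Required SMC sample size for a target coefficient of variation (c.o.v.).
        def samples_for_cov(p_fail, target_cov=0.1):
            return (1.0 - p_fail) / (p_fail * target_cov ** 2)

        for p in (1e-3, 1e-5, 1e-7):
            print(f"p_f = {p:g}: about {samples_for_cov(p):.2e} samples for 10% c.o.v.")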

  12. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  13. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.
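    The contrast can be made concrete with a short simulation (illustrative only): the classical probability of drawing an ace is fixed at 4/52, while the frequentist estimate is the relative frequency observed over repeated simulated draws.

        import random

        classical = 4 / 52
        deck = ["ace"] * 4 + ["other"] * 48
        draws = 100_000
        frequentist = sum(random.choice(deck) == "ace" for _ in range(draws)) / draws
        print(f"classical = {classical:.4f}, frequentist estimate = {frequentist:.4f}")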

  14. Towards a Categorical Account of Conditional Probability

    Directory of Open Access Journals (Sweden)

    Robert Furber

    2015-11-01

    Full Text Available This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and also there are extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.

  15. Dual Diagnosis and Suicide Probability in Poly-Drug Users.

    Science.gov (United States)

    Youssef, Ismail M; Fahmy, Magda T; Haggag, Wafaa L; Mohamed, Khalid A; Baalash, Amany A

    2016-02-01

    To determine the frequency of suicidal thoughts and suicidal probability among poly-substance abusers in the Saudi population, and to examine the relation between dual diagnosis and suicidal thoughts. Case control study. Al-Baha Psychiatric Hospital, Saudi Arabia, from May 2011 to June 2012. Participants were 239 subjects, aged 18 - 45 years. We reviewed 122 individuals who fulfilled the DSM-IV-TR criteria of substance abuse for two or more substances, and their data were compared with that collected from 117 control persons. Suicidal cases were highly prevalent among poly-substance abusers (64.75%). Amphetamine and cannabis were the most abused substances (87.7% and 70.49%, respectively). A statistically significant association with suicidality was found with longer duration of substance abuse (p Suicidal cases showed significantly higher scores (p suicide probability scale and higher scores in Beck depressive inventory (p Abusing certain substances for long duration, in addition to comorbid psychiatric disorders especially with disturbed-mood element, may trigger suicidal thoughts in poly-substance abusers. Depression and suicide probability are common consequences of substance abuse.

  16. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C. [Arizona State Univ., Tempe, AZ (United States)

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.

  17. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    Energy Technology Data Exchange (ETDEWEB)

    Garza, J. [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States); Millwater, H., E-mail: harry.millwater@utsa.edu [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States)

    2012-04-15

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.

  18. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2012-01-01

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.
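    The sample-reuse idea described in these two records can be sketched compactly: if the POF is estimated as a Monte Carlo mean of "the flaw would cause failure and the inspection misses it", then differentiating the parametric POD curve inside the same expectation yields the sensitivity from the very same samples. The log-logistic POD form, the flaw-size and critical-size distributions, and the failure criterion below are illustrative assumptions for the sketch, not the inputs used in the paper.

    import numpy as np

    rng = np.random.default_rng(1)

    def pod(a, a50, beta):
        # Log-logistic POD curve (a common parametric choice; the paper only
        # requires that the POD be parametric, not this particular form).
        return 1.0 / (1.0 + np.exp(-beta * (np.log(a) - np.log(a50))))

    # One Monte Carlo sample set, reused for both the POF and its sensitivity.
    n = 200_000
    a_crack = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=n)  # flaw size [mm]
    a_crit = rng.lognormal(mean=np.log(5.0), sigma=0.3, size=n)   # critical size [mm]
    fails = a_crack > a_crit                    # failure if the flaw exceeds the critical size

    a50, beta = 3.0, 4.0
    miss = 1.0 - pod(a_crack, a50, beta)        # flaw survives the inspection undetected
    pof = np.mean(fails * miss)

    # Sensitivity d(POF)/d(a50) from the same samples, using the analytic POD derivative.
    p = pod(a_crack, a50, beta)
    dpod_da50 = -beta / a50 * p * (1.0 - p)
    sens_a50 = np.mean(fails * (-dpod_da50))
    print(pof, sens_a50)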

  19. PROBABILITY CALIBRATION BY THE MINIMUM AND MAXIMUM PROBABILITY SCORES IN ONE-CLASS BAYES LEARNING FOR ANOMALY DETECTION

    Data.gov (United States)

    National Aeronautics and Space Administration — PROBABILITY CALIBRATION BY THE MINIMUM AND MAXIMUM PROBABILITY SCORES IN ONE-CLASS BAYES LEARNING FOR ANOMALY DETECTION GUICHONG LI, NATHALIE JAPKOWICZ, IAN HOFFMAN,...

  20. Collective probabilities algorithm for surface hopping calculations

    International Nuclear Information System (INIS)

    Bastida, Adolfo; Cruz, Carlos; Zuniga, Jose; Requena, Alberto

    2003-01-01

    General equations that the transition probabilities of hopping algorithms in surface hopping calculations must obey in order to ensure equality between the average quantum and classical populations are derived. These equations are solved for two particular cases. In the first, it is assumed that probabilities are the same for all trajectories and that the number of hops is kept to a minimum. These assumptions specify the collective probabilities (CP) algorithm, for which the transition probabilities depend on the average populations for all trajectories. In the second case, the probabilities for each trajectory are supposed to be completely independent of the results from the other trajectories. There is, then, a unique solution of the general equations assuring that the transition probabilities are equal to the quantum population of the target state, which is referred to as the independent probabilities (IP) algorithm. The fewest switches (FS) algorithm developed by Tully is accordingly understood as an approximate hopping algorithm which takes elements from the accurate CP and IP solutions. A numerical test of all these hopping algorithms is carried out for a one-dimensional two-state problem with two avoided crossings, which shows the accuracy and computational efficiency of the collective probabilities algorithm proposed, the limitations of the FS algorithm and the similarity between the results offered by the IP algorithm and those obtained with the Ehrenfest method.

  1. Changes in the high-mountain vegetation of the Central Iberian Peninsula as a probable sign of global warming.

    Science.gov (United States)

    Sanz-Elorza, Mario; Dana, Elías D; González, Alberto; Sobrino, Eduardo

    2003-08-01

    Aerial images of the high summits of the Spanish Central Range reveal significant changes in vegetation over the period 1957 to 1991. These changes include the replacement of high-mountain grassland communities dominated by Festuca aragonensis, typical of the Cryoro-Mediterranean belt, by shrub patches of Juniperus communis ssp. alpina and Cytisus oromediterraneus from lower altitudes (Oro-Mediterranean belt). Climatic data indicate a shift towards warmer conditions in this mountainous region since the 1940s, with the shift being particularly marked from 1960. Changes include significantly higher minimum and maximum temperatures, fewer days with snow cover and a redistribution of monthly rainfall. Total yearly precipitation showed no significant variation. There were no marked changes in land use during the time frame considered, although there were minor changes in grazing species in the 19th century. It is hypothesized that the advance of woody species into higher altitudes is probably related to climate change, which could have acted in conjunction with discrete variations in landscape management. The pronounced changes observed in the plant communities of the area reflect the susceptibility of high-mountain Mediterranean species to environmental change.

  2. Foundations of the theory of probability

    CERN Document Server

    Kolmogorov, AN

    2018-01-01

    This famous little book remains a foundational text for the understanding of probability theory, important both to students beginning a serious study of probability and to historians of modern mathematics. 1956 second edition.

  3. Predicting the probability of slip in gait: methodology and distribution study.

    Science.gov (United States)

    Gragg, Jared; Yang, James

    2016-01-01

    The likelihood of a slip is related to the available and required friction for a certain activity, here gait. Classical slip and fall analysis presumed that a walking surface was safe if the difference between the mean available and required friction coefficients exceeded a certain threshold. Previous research was dedicated to reformulating the classical slip and fall theory to include the stochastic variation of the available and required friction when predicting the probability of slip in gait. However, when predicting the probability of a slip, previous researchers have either ignored the variation in the required friction or assumed the available and required friction to be normally distributed. Also, there are no published results that actually give the probability of slip for various combinations of required and available frictions. This study proposes a modification to the equation for predicting the probability of slip, reducing the previous equation from a double-integral to a more convenient single-integral form. Also, a simple numerical integration technique is provided to predict the probability of slip in gait: the trapezoidal method. The effect of the random variable distributions on the probability of slip is also studied. It is shown that both the required and available friction distributions cannot automatically be assumed as being normally distributed. The proposed methods allow for any combination of distributions for the available and required friction, and numerical results are compared to analytical solutions for an error analysis. The trapezoidal method is shown to be highly accurate and efficient. The probability of slip is also shown to be sensitive to the input distributions of the required and available friction. Lastly, a critical value for the probability of slip is proposed based on the number of steps taken by an average person in a single day.
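    The single-integral form proposed in this study amounts to P(slip) = P(available < required) = ∫ f_required(u) F_available(u) du, which the trapezoidal rule handles directly. The sketch below assumes, purely for illustration, a normal required-friction distribution and a lognormal available-friction distribution; the study's point is precisely that any combination of distributions can be plugged in.

    import numpy as np
    from scipy import stats
    from scipy.integrate import trapezoid

    def prob_slip(f_required_pdf, F_available_cdf, grid):
        # P(slip) = P(available < required) = integral of f_req(u) * F_avail(u) du,
        # evaluated with the trapezoidal rule on a user-supplied grid.
        integrand = f_required_pdf(grid) * F_available_cdf(grid)
        return trapezoid(integrand, grid)

    # Illustrative distributions (not taken from the study).
    grid = np.linspace(0.0, 2.0, 2001)
    p = prob_slip(stats.norm(loc=0.25, scale=0.05).pdf,
                  stats.lognorm(s=0.3, scale=0.5).cdf,
                  grid)
    print(f"Estimated probability of slip per step: {p:.3e}")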

  4. Transition probability spaces in loop quantum gravity

    Science.gov (United States)

    Guo, Xiao-Kan

    2018-03-01

    We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.

  5. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  6. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height...... associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination...... of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions....

  7. A hydroclimatological approach to predicting regional landslide probability using Landlab

    Directory of Open Access Journals (Sweden)

    R. Strauch

    2018-02-01

    Full Text Available We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.

  8. A hydroclimatological approach to predicting regional landslide probability using Landlab

    Science.gov (United States)

    Strauch, Ronda; Istanbulluoglu, Erkan; Nudurupati, Sai Siddhartha; Bandaragoda, Christina; Gasparini, Nicole M.; Tucker, Gregory E.

    2018-02-01

    We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.
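    The core probability calculation described here (drawing soil, vegetation, and recharge parameters from their distributions and counting how often the infinite-slope factor of safety drops below one) can be sketched in a few lines. The parameter ranges and the particular dimensionless form of the factor-of-safety equation below are illustrative assumptions, not the actual inputs or implementation of the Landlab component.

    import numpy as np

    rng = np.random.default_rng(0)

    def annual_landslide_probability(n=10_000):
        # Minimal Monte Carlo sketch of infinite-slope stability for one grid cell.
        theta = np.radians(35.0)                     # slope angle
        phi = np.radians(rng.uniform(28, 40, n))     # internal friction angle
        c = rng.uniform(2e3, 8e3, n)                 # combined cohesion [Pa]
        h = rng.uniform(0.5, 2.0, n)                 # soil depth [m]
        w = rng.uniform(0.0, 1.0, n)                 # relative wetness, driven by recharge
        gamma, gamma_w = 18e3, 9.81e3                # soil and water unit weights [N/m^3]

        fs = (c / (gamma * h * np.sin(theta) * np.cos(theta))
              + (1.0 - w * gamma_w / gamma) * np.tan(phi) / np.tan(theta))
        return np.mean(fs < 1.0)                     # fraction of unstable realizations

    print(f"P(landslide initiation) ~ {annual_landslide_probability():.3f}")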

  9. Two-slit experiment: quantum and classical probabilities

    International Nuclear Information System (INIS)

    Khrennikov, Andrei

    2015-01-01

    Inter-relation between quantum and classical probability models is one of the most fundamental problems of quantum foundations. Nowadays this problem also plays an important role in quantum technologies, in quantum cryptography and the theory of quantum random generators. In this letter, we compare the viewpoint of Richard Feynman that the behavior of quantum particles cannot be described by classical probability theory with the viewpoint that quantum–classical inter-relation is more complicated (cf., in particular, the tomographic model of quantum mechanics developed in detail by Vladimir Man'ko). As a basic example, we consider the two-slit experiment, which played a crucial role in quantum foundational debates at the beginning of quantum mechanics (QM). In particular, its analysis led Niels Bohr to the formulation of the principle of complementarity. First, we demonstrate that in complete accordance with Feynman's viewpoint, the probabilities for the two-slit experiment have a non-Kolmogorovian structure, since they violate one of the basic laws of classical probability theory, the law of total probability (the heart of the Bayesian analysis). However, then we show that these probabilities can be embedded in a natural way into the classical (Kolmogorov, 1933) probability model. To do this, one has to take into account the randomness of selection of different experimental contexts, the joint consideration of which led Feynman to a conclusion about the non-classicality of quantum probability. We compare this embedding of non-Kolmogorovian quantum probabilities into the Kolmogorov model with well-known embeddings of non-Euclidean geometries into Euclidean space (e.g., the Poincaré disk model for the Lobachevsky plane). (paper)
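    For readers who want the violated identity spelled out: in the two-slit setting the classical law of total probability acquires an interference term. The expression below is a standard way of writing this effect in the literature this record belongs to; the notation is generic and may differ from the paper's.

    % A = which slit (a_1 or a_2), B = detector position; classically the first two
    % terms alone would hold, while the quantum statistics add the cosine term.
    \begin{align*}
    P(B=b) ={}& P(A=a_1)\,P(B=b \mid A=a_1) + P(A=a_2)\,P(B=b \mid A=a_2) \\
              & + 2\cos\theta\,\sqrt{P(A=a_1)\,P(B=b \mid A=a_1)\,P(A=a_2)\,P(B=b \mid A=a_2)}
    \end{align*}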

  10. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

    ""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  11. Computation of the Complex Probability Function

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-22

    The complex probability function is important in many areas of physics and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the nth degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements on the Gauss-Hermite quadrature for the complex probability function.
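    A common way the quadrature is applied starts from the integral representation w(z) = (i/π) ∫ exp(-t²)/(z − t) dt, valid for Im z > 0, whose Gauss-Hermite discretization is w(z) ≈ (i/π) Σ_k w_k/(z − x_k). The sketch below assumes this particular representation (the report may use a different variant) and compares the result against SciPy's reference implementation.

    import numpy as np
    from numpy.polynomial.hermite import hermgauss
    from scipy.special import wofz  # reference Faddeeva implementation for comparison

    def w_gauss_hermite(z, n=20):
        # n-point Gauss-Hermite approximation of the complex probability function,
        # valid for Im(z) > 0: w(z) ~ (i/pi) * sum_k w_k / (z - x_k).
        x, wts = hermgauss(n)
        return (1j / np.pi) * np.sum(wts / (z - x))

    z = 1.5 + 0.8j
    print(w_gauss_hermite(z), wofz(z))  # the two values should agree to a few digits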

  12. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

    Max Lotstein and Phil Johnson-Laird, Department of Psychology, Princeton University, Princeton, NJ, USA, August 30th 2012. ...social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as...retorted that such a flagrant violation of the probability calculus was a result of a psychological experiment that obscured the rationality of the

  13. Diagnostic accuracy of the MMSE in detecting probable and possible Alzheimer's disease in ethnically diverse highly educated individuals: an analysis of the NACC database.

    Science.gov (United States)

    Spering, Cynthia C; Hobson, Valerie; Lucas, John A; Menon, Chloe V; Hall, James R; O'Bryant, Sid E

    2012-08-01

    To validate and extend the findings of a raised cut score of O'Bryant and colleagues (O'Bryant SE, Humphreys JD, Smith GE, et al. Detecting dementia with the mini-mental state examination in highly educated individuals. Arch Neurol. 2008;65(7):963-967.) for the Mini-Mental State Examination in detecting cognitive dysfunction in a bilingual sample of highly educated ethnically diverse individuals. Archival data were reviewed from participants enrolled in the National Alzheimer's Coordinating Center minimum data set. Data on 7,093 individuals with 16 or more years of education were analyzed, including 2,337 cases with probable and possible Alzheimer's disease, 1,418 mild cognitive impairment patients, and 3,088 nondemented controls. Ethnic composition was characterized as follows: 6,296 Caucasians, 581 African Americans, 4 American Indians or Alaska natives, 2 native Hawaiians or Pacific Islanders, 149 Asians, 43 "Other," and 18 of unknown origin. Diagnostic accuracy estimates (sensitivity, specificity, and likelihood ratio) of Mini-Mental State Examination cut scores in detecting probable and possible Alzheimer's disease were examined. A standard Mini-Mental State Examination cut score of 24 (≤23) yielded a sensitivity of 0.58 and a specificity of 0.98 in detecting probable and possible Alzheimer's disease across ethnicities. A cut score of 27 (≤26) resulted in an improved balance of sensitivity and specificity (0.79 and 0.90, respectively). In the cognitively impaired group (mild cognitive impairment and probable and possible Alzheimer's disease), the standard cut score yielded a sensitivity of 0.38 and a specificity of 1.00 while raising the cut score to 27 resulted in an improved balance of 0.59 and 0.96 of sensitivity and specificity, respectively. These findings cross-validate our previous work and extend them to an ethnically diverse cohort. A higher cut score is needed to maximize diagnostic accuracy of the Mini-Mental State Examination in individuals

  14. On the probability distribution of the stochastic saturation scale in QCD

    International Nuclear Information System (INIS)

    Marquet, C.; Soyez, G.; Xiao Bowen

    2006-01-01

    It was recently noticed that high-energy scattering processes in QCD have a stochastic nature. An event-by-event scattering amplitude is characterised by a saturation scale which is a random variable. The statistical ensemble of saturation scales formed with all the events is distributed according to a probability law whose cumulants have been recently computed. In this work, we obtain the probability distribution from the cumulants. We prove that it can be considered as Gaussian over a large domain that we specify and our results are confirmed by numerical simulations

  15. "I Don't Really Understand Probability at All": Final Year Pre-Service Teachers' Understanding of Probability

    Science.gov (United States)

    Maher, Nicole; Muir, Tracey

    2014-01-01

    This paper reports on one aspect of a wider study that investigated a selection of final year pre-service primary teachers' responses to four probability tasks. The tasks focused on foundational ideas of probability including sample space, independence, variation and expectation. Responses suggested that strongly held intuitions appeared to…

  16. Probability of spent fuel transportation accidents

    International Nuclear Information System (INIS)

    McClure, J.D.

    1981-07-01

    The transported volume of spent fuel, incident/accident experience and accident environment probabilities were reviewed in order to provide an estimate of spent fuel accident probabilities. In particular, the accident review assessed the accident experience for large casks of the type that could transport spent (irradiated) nuclear fuel. This review determined that since 1971, the beginning of official US Department of Transportation record keeping for accidents/incidents, there has been one spent fuel transportation accident. This information, coupled with estimated annual shipping volumes for spent fuel, indicated an estimated annual probability of a spent fuel transport accident of 5 x 10^-7 spent fuel accidents per mile. This is consistent with ordinary truck accident rates. A comparison of accident environments and regulatory test environments suggests that the probability of truck accidents exceeding the regulatory test for impact is approximately 10^-9 per mile.

  17. Probabilities on Streams and Reflexive Games

    Directory of Open Access Journals (Sweden)

    Andrew Schumann

    2014-01-01

    Full Text Available Probability measures on streams (e.g. on hypernumbers and p-adic numbers) have been defined. It was shown that these probabilities can be used for simulations of reflexive games. In particular, it can be proved that Aumann's agreement theorem does not hold for these probabilities. Instead of this theorem, there is a statement that is called the reflexion disagreement theorem. Based on this theorem, probabilistic and knowledge conditions can be defined for reflexive games at various reflexion levels up to the infinite level. (original abstract)

  18. Uncertainty about probability: a decision analysis perspective

    International Nuclear Information System (INIS)

    Howard, R.A.

    1988-01-01

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, it is further found that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to the usual view. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group.
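    The updating argument in this abstract can be written out directly. If p is the definitive number with some distribution, exchangeability gives the following identity, which makes explicit why observing one occurrence must raise the probability of the next unless Var(p) = 0, i.e. unless one is absolutely sure of the coin's fairness:

    \[
    P(\text{first occurrence}) = \mathbb{E}[p], \qquad
    P(\text{second occurrence} \mid \text{first}) = \frac{\mathbb{E}[p^{2}]}{\mathbb{E}[p]}
    = \mathbb{E}[p] + \frac{\operatorname{Var}(p)}{\mathbb{E}[p]} \ge \mathbb{E}[p].
    \]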

  19. Teaching Probability: A Socio-Constructivist Perspective

    Science.gov (United States)

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.

  20. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...
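    As background for the starting point named here, the single-firm default probability in the basic Merton framework can be sketched as below: default occurs when the firm's asset value, modelled as geometric Brownian motion, ends the horizon below the face value of debt. The numbers are purely illustrative, and the abstract's own analytics (correlations between loans) are not reproduced.

    from math import log, sqrt
    from scipy.stats import norm

    def merton_default_probability(V0, D, mu, sigma, T):
        # P(V_T < D) when V_T = V0 * exp((mu - sigma^2/2) * T + sigma * W_T).
        d = (log(V0 / D) + (mu - 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
        return norm.cdf(-d)

    # Illustrative firm: asset value 120, debt 100, 8% drift, 25% volatility, 1 year.
    print(round(merton_default_probability(V0=120, D=100, mu=0.08, sigma=0.25, T=1.0), 4))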

  1. Probability Matching, Fast and Slow

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2014-01-01

    A prominent point of contention among researchers regarding the interpretation of probability-matching behavior is whether it represents a cognitively sophisticated, adaptive response to the inherent uncertainty of the tasks or settings in which it is observed, or whether instead it represents a fundamental shortcoming in the heuristics that support and guide human decision making. Put crudely, researchers disagree on whether probability matching is "smart" or "dumb." Here, we consider eviden...

  2. Probability distributions with truncated, log and bivariate extensions

    CERN Document Server

    Thomopoulos, Nick T

    2018-01-01

    This volume presents a concise and practical overview of statistical methods and tables not readily available in other publications. It begins with a review of the commonly used continuous and discrete probability distributions. Several useful distributions that are not so common and less understood are described with examples and applications in full detail: discrete normal, left-partial, right-partial, left-truncated normal, right-truncated normal, lognormal, bivariate normal, and bivariate lognormal. Table values are provided with examples that enable researchers to easily apply the distributions to real applications and sample data. The left- and right-truncated normal distributions offer a wide variety of shapes in contrast to the symmetrically shaped normal distribution, and a newly developed spread ratio enables analysts to determine which of the three distributions best fits a particular set of sample data. The book will be highly useful to anyone who does statistical and probability analysis. This in...

  3. Multiple-event probability in general-relativistic quantum mechanics

    International Nuclear Information System (INIS)

    Hellmann, Frank; Mondragon, Mauricio; Perez, Alejandro; Rovelli, Carlo

    2007-01-01

    We discuss the definition of quantum probability in the context of 'timeless' general-relativistic quantum mechanics. In particular, we study the probability of sequences of events, or multievent probability. In conventional quantum mechanics this can be obtained by means of the 'wave function collapse' algorithm. We first point out certain difficulties of some natural definitions of multievent probability, including the conditional probability widely considered in the literature. We then observe that multievent probability can be reduced to single-event probability, by taking into account the quantum nature of the measuring apparatus. In fact, by exploiting the von-Neumann freedom of moving the quantum/classical boundary, one can always trade a sequence of noncommuting quantum measurements at different times, with an ensemble of simultaneous commuting measurements on the joint system+apparatus system. This observation permits a formulation of quantum theory based only on single-event probability, where the results of the wave function collapse algorithm can nevertheless be recovered. The discussion also bears on the nature of the quantum collapse

  4. Evaluation of probability and hazard in nuclear energy

    International Nuclear Information System (INIS)

    Novikov, V.Ya.; Romanov, N.L.

    1979-01-01

    Because statistical evaluation of NPP safety is unreliable, various methods for evaluating accident probability at NPPs are proposed. The concept of subjective probability for the quantitative analysis of safety and hazard is described. Interpretation of probability as the real belief of an expert is taken as the basis of this concept. It is suggested to study event uncertainty in the framework of subjective probability theory, which not only permits but demands that expert opinions be taken into account when evaluating the probability. These subjective expert evaluations affect to a certain extent the calculation of the usual mathematical event probability. The above technique is advantageous for consideration of a separate experiment or random event.

  5. p-adic probability interpretation of Bell's inequality

    International Nuclear Information System (INIS)

    Khrennikov, A.

    1995-01-01

    We study the violation of Bell's inequality using a p-adic generalization of the theory of probability. p-adic probability is introduced as a limit of relative frequencies but this limit exists with respect to a p-adic metric. In particular, negative probability distributions are well defined on the basis of the frequency definition. This new type of stochastics can be used to describe hidden-variables distributions of some quantum models. If the hidden variables have a p-adic probability distribution, Bell's inequality is not valid and it is not necessary to discuss the experimental violations of this inequality. ((orig.))

  6. Dynamic SEP event probability forecasts

    Science.gov (United States)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
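    One simple way to realize the "decrease the forecast as no event appears" behaviour described here is a Bayesian survival update against the delay-time distribution between the X-ray peak and the 10 pfu onset. The sketch below assumes, for illustration only, a lognormal delay-time distribution with a 6 h median; the paper instead derives its dynamic forecast Pd from the NOAA 1976-2014 delay-time statistics as a function of source longitude, so this is a stand-in, not the authors' algorithm.

    import numpy as np
    from scipy.stats import lognorm

    def dynamic_sep_probability(p0, t_hours, median_delay=6.0, sigma=1.0):
        # Posterior P(SEP event | no 10 pfu onset seen t_hours after the flare),
        # assuming a hypothetical lognormal onset-delay distribution for real events.
        f_elapsed = lognorm(s=sigma, scale=median_delay).cdf(t_hours)
        still_coming = p0 * (1.0 - f_elapsed)          # event will occur but is delayed
        return still_coming / (still_coming + (1.0 - p0))

    for t in (0, 6, 12, 24, 48):
        print(t, round(dynamic_sep_probability(0.4, t), 3))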

  7. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any nearfield instrumental data is available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations, the relation between seismic strength of the earthquake, focal depth, distance and ground acceleration is calculated. We found that the most significant Swedish earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as interesting. This probability gives ground accelerations in the range 5-20 % for the sites. This acceleration is for a free bedrock site. For consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get the epicentral acceleration of this earthquake to be 5-15 % g. The results above are based on an analysis of macroseismic data as relevant instrumental data is lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)

  8. An Efficient Simulation Scheme of the Outage Probability with Co-Channel Interference

    KAUST Repository

    Rached, Nadhir B.

    2016-03-28

    © 2015 IEEE. The outage probability (OP) of the signal-to-interference-plus-noise ratio (SINR) is an important metric used to evaluate the performance of wireless communication systems operating over fading channels. One major difficulty toward assessing the OP is that, in most of the realistic scenarios, closed-form expressions cannot be derived. This is for instance the case of Log-normal fading environments, in which evaluating the OP of the SINR amounts to computing the probability that a sum of correlated Log-normal variates exceeds a given threshold. Since such a probability is not known to admit a closed-form expression, it has thus far been evaluated by several approximation techniques, the accuracies of which are unfortunately not guaranteed in the interesting region of small outage probabilities. For these regions, simulation techniques based on variance reduction algorithms can represent a good alternative, being well-recognized to be quick and highly accurate for estimating rare event probabilities. This constitutes the major motivation behind our work. More specifically, we propose an efficient importance sampling approach which is based on a covariance matrix scaling technique and illustrate its computational gain over naive Monte Carlo simulations through some selected simulation results.
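    The estimator described here can be sketched as follows: draw the underlying correlated Gaussian vector from a distribution whose covariance has been scaled to make the rare exceedance more frequent, and reweight each sample by the likelihood ratio so the estimate stays unbiased. The fixed scaling factor, the equi-correlated covariance, and the threshold below are illustrative guesses, not the optimized choices of the paper.

    import numpy as np
    from scipy.stats import multivariate_normal

    def outage_prob_is(mu, Sigma, gamma, scale=2.0, n=100_000, rng=None):
        # Importance-sampling estimate of P(sum_i exp(X_i) > gamma) with X ~ N(mu, Sigma),
        # i.e. the tail probability of a sum of correlated Log-normal interferers.
        rng = rng or np.random.default_rng(0)
        p = multivariate_normal(mean=mu, cov=Sigma)
        q = multivariate_normal(mean=mu, cov=scale * Sigma)   # covariance-scaled proposal
        x = q.rvs(size=n, random_state=rng)
        w = np.exp(p.logpdf(x) - q.logpdf(x))                 # likelihood ratios
        hit = np.exp(x).sum(axis=1) > gamma                   # rare exceedance event
        return np.mean(w * hit)

    mu = np.zeros(4)
    Sigma = 0.5 * np.eye(4) + 0.5        # equi-correlated example covariance
    print(outage_prob_is(mu, Sigma, gamma=60.0))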

  9. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Full Text Available Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic as a model within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
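    The prediction step described here, fitting a logistic regression to incident records and then mapping predicted fire probabilities onto buildings, can be sketched as follows. The three building attributes and the tiny training set are invented for illustration; the actual model uses attributes of buildings in the Liberec and Hradec Králové regions.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical attributes: floor area [m^2], building age [years], solid-fuel heating flag.
    X = np.array([[120, 35, 1], [450, 10, 0], [80, 60, 1], [300, 5, 0],
                  [200, 45, 1], [90, 20, 0], [600, 50, 1], [150, 15, 0]])
    y = np.array([1, 0, 1, 0, 1, 0, 1, 0])   # 1 = a fire incident was recorded

    model = LogisticRegression().fit(X, y)
    p_fire = model.predict_proba([[250, 40, 1]])[0, 1]   # probability for a new building
    print(round(p_fire, 3))                              # value to plot on the probability map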

  10. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  11. Pretest probability of a normal echocardiography: validation of a simple and practical algorithm for routine use.

    Science.gov (United States)

    Hammoudi, Nadjib; Duprey, Matthieu; Régnier, Philippe; Achkar, Marc; Boubrit, Lila; Preud'homme, Gisèle; Healy-Brucker, Aude; Vignalou, Jean-Baptiste; Pousset, Françoise; Komajda, Michel; Isnard, Richard

    2014-02-01

    Management of increased referrals for transthoracic echocardiography (TTE) examinations is a challenge. Patients with normal TTE examinations take less time to explore than those with heart abnormalities. A reliable method for assessing pretest probability of a normal TTE may optimize management of requests. To establish and validate, based on requests for examinations, a simple algorithm for defining pretest probability of a normal TTE. In a retrospective phase, factors associated with normality were investigated and an algorithm was designed. In a prospective phase, patients were classified in accordance with the algorithm as being at high or low probability of having a normal TTE. In the retrospective phase, 42% of 618 examinations were normal. In multivariable analysis, age and absence of cardiac history were associated to normality. Low pretest probability of normal TTE was defined by known cardiac history or, in case of doubt about cardiac history, by age>70 years. In the prospective phase, the prevalences of normality were 72% and 25% in high (n=167) and low (n=241) pretest probability of normality groups, respectively. The mean duration of normal examinations was significantly shorter than abnormal examinations (13.8 ± 9.2 min vs 17.6 ± 11.1 min; P=0.0003). A simple algorithm can classify patients referred for TTE as being at high or low pretest probability of having a normal examination. This algorithm might help to optimize management of requests in routine practice. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
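    The classification rule validated in this study is simple enough to state as code: a request is assigned to the low-probability-of-normal group when there is a known cardiac history or, when the history is in doubt, when the patient is older than 70 years; everything else is high probability. The function below restates that rule; the function name and the use of None for an uncertain history are illustrative choices only.

    from typing import Optional

    def pretest_probability_group(age: int, cardiac_history: Optional[bool]) -> str:
        # Known cardiac history, or doubt about the history (None) combined with
        # age > 70, puts the request in the low-probability-of-normal-TTE group.
        if cardiac_history is True:
            return "low"
        if cardiac_history is None and age > 70:
            return "low"
        return "high"

    print(pretest_probability_group(55, cardiac_history=False))  # -> "high"
    print(pretest_probability_group(78, cardiac_history=None))   # -> "low"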

  12. Bayesian probability analysis: a prospective demonstration of its clinical utility in diagnosing coronary disease

    International Nuclear Information System (INIS)

    Detrano, R.; Yiannikas, J.; Salcedo, E.E.; Rincon, G.; Go, R.T.; Williams, G.; Leatherman, J.

    1984-01-01

    One hundred fifty-four patients referred for coronary arteriography were prospectively studied with stress electrocardiography, stress thallium scintigraphy, cine fluoroscopy (for coronary calcifications), and coronary angiography. Pretest probabilities of coronary disease were determined based on age, sex, and type of chest pain. These and pooled literature values for the conditional probabilities of test results based on disease state were used in Bayes theorem to calculate posttest probabilities of disease. The results of the three noninvasive tests were compared for statistical independence, a necessary condition for their simultaneous use in Bayes theorem. The test results were found to demonstrate pairwise independence in patients with and those without disease. Some dependencies that were observed between the test results and the clinical variables of age and sex were not sufficient to invalidate application of the theorem. Sixty-eight of the study patients had at least one major coronary artery obstruction of greater than 50%. When these patients were divided into low-, intermediate-, and high-probability subgroups according to their pretest probabilities, noninvasive test results analyzed by Bayesian probability analysis appropriately advanced 17 of them by at least one probability subgroup while only seven were moved backward. Of the 76 patients without disease, 34 were appropriately moved into a lower probability subgroup while 10 were incorrectly moved up. We conclude that posttest probabilities calculated from Bayes theorem more accurately classified patients with and without disease than did pretest probabilities, thus demonstrating the utility of the theorem in this application
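    The calculation applied in this study, combining a pretest probability with the results of pairwise-independent noninvasive tests via Bayes theorem, reduces to multiplying odds by likelihood ratios. The sketch below shows that arithmetic for two positive test results; the sensitivities and specificities are illustrative stand-ins, not the pooled literature values the authors used.

    def posttest_probability(pretest, test_results):
        # test_results: list of (sensitivity, specificity) pairs for positive results.
        # Assumes conditional independence of the tests, as verified in the study.
        odds = pretest / (1.0 - pretest)
        for sens, spec in test_results:
            odds *= sens / (1.0 - spec)     # likelihood ratio of a positive result
        return odds / (1.0 + odds)

    # e.g. 40% pretest probability, positive stress ECG and positive thallium scan
    print(round(posttest_probability(0.40, [(0.65, 0.85), (0.85, 0.90)]), 2))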

  13. Joint probabilities and quantum cognition

    International Nuclear Information System (INIS)

    Acacio de Barros, J.

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  14. Joint probabilities and quantum cognition

    Energy Technology Data Exchange (ETDEWEB)

    Acacio de Barros, J. [Liberal Studies, 1600 Holloway Ave., San Francisco State University, San Francisco, CA 94132 (United States)

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  15. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
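    The two calculation routes such texts compare, sequential conditioning versus direct counting, give the same answer, which a short check makes concrete. The playing-card example below is a standard illustration, not one taken from the article.

    from fractions import Fraction
    from math import comb

    # P(both cards drawn without replacement from a standard deck are aces):
    sequential = Fraction(4, 52) * Fraction(3, 51)          # conditional-probability route
    counting = Fraction(comb(4, 2), comb(52, 2))            # hypergeometric counting route
    print(sequential, counting, sequential == counting)     # 1/221 either way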

  16. Selected papers on probability and statistics

    CERN Document Server

    2009-01-01

    This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.

  17. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship...

  18. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed.The present notes outline a method for evaluation of the probability...

  19. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation...

  20. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    Title 47 Telecommunication, FEDERAL COMMUNICATIONS COMMISSION, GENERAL PRACTICE AND PROCEDURE, Random Selection Procedures for Mass Media Services, General Procedures, § 1.1623 Probability calculation. (a) All calculations shall be...

  1. Orthogonal Algorithm of Logic Probability and Syndrome-Testable Analysis

    Institute of Scientific and Technical Information of China (English)

    1990-01-01

    A new method, the orthogonal algorithm, is presented to compute logic probabilities (i.e. signal probabilities) accurately. The transfer properties of logic probabilities are studied first, which are useful for the calculation of the logic probability of a circuit with random independent inputs. Then the orthogonal algorithm is described to compute the logic probability of a Boolean function realized by a combinational circuit. This algorithm can make the Boolean function "orthogonal" so that the logic probabilities can be easily calculated by summing up the logic probabilities of all orthogonal terms of the Boolean function.
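    The summation step is easy to illustrate once a function has been rewritten as mutually exclusive ("orthogonal") product terms: the signal probability is then just the sum of the term probabilities. The tiny example below, with f = a OR (b AND c) and independent inputs at probability 0.5, shows that principle only; the orthogonalization procedure itself is the subject of the paper.

    def term_probability(term, p):
        # Probability of one product term; term maps each variable to 1 (true) or 0 (false).
        prob = 1.0
        for var, val in term.items():
            prob *= p[var] if val == 1 else (1.0 - p[var])
        return prob

    # f = a OR (b AND c), rewritten as two orthogonal (mutually exclusive) terms:
    #   a        and        (not a) AND b AND c
    orthogonal_terms = [{"a": 1}, {"a": 0, "b": 1, "c": 1}]
    p = {"a": 0.5, "b": 0.5, "c": 0.5}   # input signal probabilities, independent inputs

    signal_prob = sum(term_probability(t, p) for t in orthogonal_terms)
    print(signal_prob)   # 0.5 + 0.5*0.5*0.5 = 0.625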

  2. Probability sampling in legal cases: Kansas cellphone users

    Science.gov (United States)

    Kadane, Joseph B.

    2012-10-01

    Probability sampling is a standard statistical technique. This article introduces the basic ideas of probability sampling, and shows in detail how probability sampling was used in a particular legal case.

  3. Probability mapping of scarred myocardium using texture and intensity features in CMR images

    Science.gov (United States)

    2013-01-01

    Background The myocardium exhibits heterogeneous nature due to scarring after Myocardial Infarction (MI). In Cardiac Magnetic Resonance (CMR) imaging, Late Gadolinium (LG) contrast agent enhances the intensity of scarred area in the myocardium. Methods In this paper, we propose a probability mapping technique using Texture and Intensity features to describe heterogeneous nature of the scarred myocardium in Cardiac Magnetic Resonance (CMR) images after Myocardial Infarction (MI). Scarred tissue and non-scarred tissue are represented with high and low probabilities, respectively. Intermediate values possibly indicate areas where the scarred and healthy tissues are interwoven. The probability map of scarred myocardium is calculated by using a probability function based on Bayes rule. Any set of features can be used in the probability function. Results In the present study, we demonstrate the use of two different types of features. One is based on the mean intensity of pixel and the other on underlying texture information of the scarred and non-scarred myocardium. Examples of probability maps computed using the mean intensity of pixel and the underlying texture information are presented. We hypothesize that the probability mapping of myocardium offers alternate visualization, possibly showing the details with physiological significance difficult to detect visually in the original CMR image. Conclusion The probability mapping obtained from the two features provides a way to define different cardiac segments which offer a way to identify areas in the myocardium of diagnostic importance (like core and border areas in scarred myocardium). PMID:24053280

  4. Limited test data: The choice between confidence limits and inverse probability

    International Nuclear Information System (INIS)

    Nichols, P.

    1975-01-01

    For a unit which has been successfully designed to a high standard of reliability, any test programme of reasonable size will result in only a small number of failures. In these circumstances the failure rate estimated from the tests will depend on the statistical treatment applied. When a large number of units is to be manufactured, an unexpected high failure rate will certainly result in a large number of failures, so it is necessary to guard against optimistic unrepresentative test results by using a confidence limit approach. If only a small number of production units is involved, failures may not occur even with a higher than expected failure rate, and so one may be able to accept a method which allows for the possibility of either optimistic or pessimistic test results, and in this case an inverse probability approach, based on Bayes' theorem, might be used. The paper first draws attention to an apparently significant difference in the numerical results from the two methods, particularly for the overall probability of several units arranged in redundant logic. It then discusses a possible objection to the inverse method, followed by a demonstration that, for a large population and a very reasonable choice of prior probability, the inverse probability and confidence limit methods give the same numerical result. Finally, it is argued that a confidence limit approach is overpessimistic when a small number of production units is involved, and that both methods give the same answer for a large population. (author)

  5. Analysis of HIV-1 intersubtype recombination breakpoints suggests region with high pairing probability may be a more fundamental factor than sequence similarity affecting HIV-1 recombination.

    Science.gov (United States)

    Jia, Lei; Li, Lin; Gui, Tao; Liu, Siyang; Li, Hanping; Han, Jingwan; Guo, Wei; Liu, Yongjian; Li, Jingyun

    2016-09-21

    With increasing data on HIV-1, a more relevant molecular model describing mechanism details of HIV-1 genetic recombination usually requires upgrades. Currently an incomplete structural understanding of the copy choice mechanism along with several other issues in the field that lack elucidation led us to perform an analysis of the correlation between breakpoint distributions and (1) the probability of base pairing, and (2) intersubtype genetic similarity to further explore structural mechanisms. Near full length sequences of URFs from Asia, Europe, and Africa (one sequence/patient), and representative sequences of worldwide CRFs were retrieved from the Los Alamos HIV database. Their recombination patterns were analyzed by jpHMM in detail. Then the relationships between breakpoint distributions and (1) the probability of base pairing, and (2) intersubtype genetic similarities were investigated. Pearson correlation test showed that all URF groups and the CRF group exhibit the same breakpoint distribution pattern. Additionally, the Wilcoxon two-sample test indicated a significant and inexplicable limitation of recombination in regions with high pairing probability. These regions have been found to be strongly conserved across distinct biological states (i.e., strong intersubtype similarity), and genetic similarity has been determined to be a very important factor promoting recombination. Thus, the results revealed an unexpected disagreement between intersubtype similarity and breakpoint distribution, which were further confirmed by genetic similarity analysis. Our analysis reveals a critical conflict between results from natural HIV-1 isolates and those from HIV-1-based assay vectors in which genetic similarity has been shown to be a very critical factor promoting recombination. These results indicate the region with high-pairing probabilities may be a more fundamental factor affecting HIV-1 recombination than sequence similarity in natural HIV-1 infections. Our

  6. Effects of NMDA receptor antagonists on probability discounting depend on the order of probability presentation.

    Science.gov (United States)

    Yates, Justin R; Breitenstein, Kerry A; Gunkel, Benjamin T; Hughes, Mallory N; Johnson, Anthony B; Rogers, Katherine K; Shape, Sara M

    Risky decision making can be measured using a probability-discounting procedure, in which animals choose between a small, certain reinforcer and a large, uncertain reinforcer. Recent evidence has identified glutamate as a mediator of risky decision making, as blocking the N-methyl-d-aspartate (NMDA) receptor with MK-801 increases preference for a large, uncertain reinforcer. Because the order in which the probabilities associated with the large reinforcer are presented can modulate the effects of drugs on choice, the current study determined if NMDA receptor ligands alter probability discounting using ascending and descending schedules. Sixteen rats were trained in a probability-discounting procedure in which the odds against obtaining the large reinforcer increased (n=8) or decreased (n=8) across blocks of trials. Following behavioral training, rats received treatments of the NMDA receptor ligands MK-801 (uncompetitive antagonist; 0, 0.003, 0.01, or 0.03 mg/kg), ketamine (uncompetitive antagonist; 0, 1.0, 5.0, or 10.0 mg/kg), and ifenprodil (NR2B-selective non-competitive antagonist; 0, 1.0, 3.0, or 10.0 mg/kg). Results showed discounting was steeper (indicating increased risk aversion) for rats on the ascending schedule relative to rats on the descending schedule. Furthermore, the effects of MK-801, ketamine, and ifenprodil on discounting were dependent on the schedule used. Specifically, the highest dose of each drug decreased risk taking in rats on the descending schedule, but only MK-801 (0.03 mg/kg) increased risk taking in rats on the ascending schedule. These results show that probability presentation order modulates the effects of NMDA receptor ligands on risky decision making. Copyright © 2016 Elsevier Inc. All rights reserved.
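
    Probability-discounting data of this kind are commonly summarized with a hyperbolic function of the odds against reinforcement; the sketch below illustrates that descriptive model, with parameter values chosen for illustration rather than estimated from this study.

        def odds_against(p):
            # Odds against the large reinforcer: theta = (1 - p) / p
            return (1.0 - p) / p

        def discounted_value(amount, p, h):
            # Hyperbolic probability discounting: V = A / (1 + h * theta);
            # a larger h means steeper discounting (greater risk aversion).
            return amount / (1.0 + h * odds_against(p))

        large, small = 4.0, 1.0                 # illustrative reinforcer amounts
        for p in (1.0, 0.5, 0.25, 0.125):       # probability blocks for the large reinforcer
            v = discounted_value(large, p, h=1.5)
            print(f"p = {p:5.3f}  subjective value = {v:4.2f}  -> choose {'large' if v > small else 'small'}")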

  7. Can infants learn phonology in the lab? A meta-analytic answer.

    Science.gov (United States)

    Cristia, Alejandrina

    2018-01-01

    Two of the key tasks facing the language-learning infant lie at the level of phonology: establishing which sounds are contrastive in the native inventory, and determining what their possible syllabic positions and permissible combinations (phonotactics) are. In 2002-2003, two theoretical proposals, one bearing on how infants can learn sounds (Maye, Werker, & Gerken, 2002) and the other on phonotactics (Chambers, Onishi, & Fisher, 2003), were put forward on the pages of Cognition, each supported by two laboratory experiments, wherein a group of infants was briefly exposed to a set of pseudo-words, and plausible phonological generalizations were tested subsequently. These two papers have received considerable attention from the general scientific community, and inspired a flurry of follow-up work. In the context of questions regarding the replicability of psychological science, the present work uses a meta-analytic approach to appraise extant empirical evidence for infant phonological learning in the laboratory. It is found that neither seminal finding (on learning sounds and learning phonotactics) holds up when close methodological replications are integrated, although less close methodological replications do provide some evidence in favor of the sound learning strand of work. Implications for authors and readers of this literature are drawn out. It would be desirable that additional mechanisms for phonological learning be explored, and that future infant laboratory work employ paradigms that rely on constrained and unambiguous links between experimental exposure and measured infant behavior. Copyright © 2017. Published by Elsevier B.V.

  8. Expert estimation of human error probabilities in nuclear power plant operations: a review of probability assessment and scaling

    International Nuclear Information System (INIS)

    Stillwell, W.G.; Seaver, D.A.; Schwartz, J.P.

    1982-05-01

    This report reviews probability assessment and psychological scaling techniques that could be used to estimate human error probabilities (HEPs) in nuclear power plant operations. The techniques rely on expert opinion and can be used to estimate HEPs where data do not exist or are inadequate. These techniques have been used in various other contexts and have been shown to produce reasonably accurate probabilities. Some problems do exist, and limitations are discussed. Additional topics covered include methods for combining estimates from multiple experts, the effects of training on probability estimates, and some ideas on structuring the relationship between performance shaping factors and HEPs. Preliminary recommendations are provided along with cautions regarding the costs of implementing the recommendations. Additional research is required before definitive recommendations can be made

  9. Stimulus Probability Effects in Absolute Identification

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  10. Probability numeracy and health insurance purchase

    NARCIS (Netherlands)

    Dillingh, Rik; Kooreman, Peter; Potters, Jan

    2016-01-01

    This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.

  11. Rejecting probability summation for radial frequency patterns, not so Quick!

    Science.gov (United States)

    Baldwin, Alex S; Schmidtmann, Gunnar; Kingdom, Frederick A A; Hess, Robert F

    2016-05-01

    Radial frequency (RF) patterns are used to assess how the visual system processes shape. They are thought to be detected globally. This is supported by studies that have found summation for RF patterns to be greater than would be possible if the parts were detected independently and performance then improved with an increasing number of cycles only by probability summation between them. However, the model of probability summation employed in these previous studies was based on High Threshold Theory (HTT), rather than Signal Detection Theory (SDT). We conducted rating scale experiments to investigate the receiver operating characteristics. We find these are of the curved form predicted by SDT, rather than the straight lines predicted by HTT. This means that to test probability summation we must use a model based on SDT. We conducted a set of summation experiments, finding that thresholds decrease as the number of modulated cycles increases at approximately the same rate as previously found. As this could be consistent with either additive or probability summation, we performed maximum-likelihood fitting of a set of summation models (Matlab code provided in our Supplementary material) and assessed the fits using cross validation. We find we are not able to distinguish whether the responses to the parts of an RF pattern are combined by additive or probability summation, because the predictions are too similar. We present similar results for summation between separate RF patterns, suggesting that the summation process there may be the same as that within a single RF. Copyright © 2016 Elsevier Ltd. All rights reserved.
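
    The contrast between the two summation frameworks can be sketched numerically: under HTT, probability summation follows the familiar 1 - (1 - p)^n rule, whereas under SDT a max-rule observer has to be simulated. The snippet below is a generic Python illustration, not the authors' supplementary Matlab code; the d' value and channel counts are assumptions.

        import numpy as np

        rng = np.random.default_rng(0)

        def p_detect_htt(p_single, n):
            # High Threshold Theory: independent detectors, pattern detected if any one fires
            return 1.0 - (1.0 - p_single) ** n

        def p_correct_sdt_max(dprime, n_signal, n_monitored, trials=100_000):
            # SDT 2AFC max rule: the observer monitors n_monitored channels in each interval
            # and picks the interval with the larger maximum; n_signal channels carry signal.
            sig = rng.normal(size=(trials, n_monitored))
            sig[:, :n_signal] += dprime
            noi = rng.normal(size=(trials, n_monitored))
            return np.mean(sig.max(axis=1) > noi.max(axis=1))

        print("HTT, p_single = 0.3 :", [round(p_detect_htt(0.3, n), 3) for n in (1, 2, 4, 8)])
        print("SDT max, d' = 1.0   :", [round(p_correct_sdt_max(1.0, k, 8), 3) for k in (1, 2, 4, 8)])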

  12. Convergence of Transition Probability Matrix in CLVMarkov Models

    Science.gov (United States)

    Permana, D.; Pasaribu, U. S.; Indratno, S. W.; Suprayogi, S.

    2018-04-01

    A transition probability matrix is an arrangement of the transition probabilities from one state to another in a Markov chain model (MCM). One interesting aspect of the MCM is its long-run behavior, which is derived from a property of the n-step transition probability matrix: the convergence of the n-step transition matrix as n tends to infinity. Mathematically, establishing this convergence amounts to finding the limit of the transition matrix raised to the power n as n tends to infinity. The convergent form of the transition probability matrix is of particular interest because it brings the matrix to its stationary form, which is useful for predicting the probability of transitions between states in the future. The method usually used to find this limit is the limiting-distribution approach. In this paper, the convergence of the transition probability matrix is instead obtained using a simple concept of linear algebra, namely diagonalization of the matrix. This method is more involved, because the matrix must first be diagonalized, but it has the advantage of yielding a general closed form for the n-th power of the transition probability matrix, which is useful for examining the transition matrix before it reaches stationarity. Example cases are taken from the CLV model built on an MCM, called the CLV-Markov model; several of its transition probability matrices are examined to find their convergent forms. The result is that the convergence of the transition probability matrix obtained through diagonalization agrees with the convergence obtained by the commonly used limiting-distribution method.
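
    A minimal numerical sketch of the diagonalization approach, using an illustrative 3-state matrix rather than the paper's CLV-Markov matrices:

        import numpy as np

        P = np.array([[0.7, 0.2, 0.1],
                      [0.3, 0.5, 0.2],
                      [0.2, 0.3, 0.5]])      # illustrative 3-state transition matrix

        eigvals, V = np.linalg.eig(P)        # P = V diag(eigvals) V^{-1} (assumes diagonalizability)
        V_inv = np.linalg.inv(V)

        def n_step(n):
            # Closed form for P^n obtained from the diagonalization
            return (V @ np.diag(eigvals ** n) @ V_inv).real

        for n in (1, 5, 20, 100):
            print(f"P^{n} =\n{np.round(n_step(n), 4)}")   # rows converge to the stationary distribution

        print("matches matrix_power:", np.allclose(n_step(20), np.linalg.matrix_power(P, 20)))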

  13. Probability Theory Plus Noise: Descriptive Estimation and Inferential Judgment.

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2018-01-01

    We describe a computational model of two central aspects of people's probabilistic reasoning: descriptive probability estimation and inferential probability judgment. This model assumes that people's reasoning follows standard frequentist probability theory, but it is subject to random noise. This random noise has a regressive effect in descriptive probability estimation, moving probability estimates away from normative probabilities and toward the center of the probability scale. This random noise has an anti-regressive effect in inferential judgment, however. These regressive and anti-regressive effects explain various reliable and systematic biases seen in people's descriptive probability estimation and inferential probability judgment. This model predicts that these contrary effects will tend to cancel out in tasks that involve both descriptive estimation and inferential judgment, leading to unbiased responses in those tasks. We test this model by applying it to one such task, described by Gallistel et al. Participants' median responses in this task were unbiased, agreeing with normative probability theory over the full range of responses. Our model captures the pattern of unbiased responses in this task, while simultaneously explaining systematic biases away from normatively correct probabilities seen in other tasks. Copyright © 2018 Cognitive Science Society, Inc.
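
    The regressive effect of such noise on descriptive estimation can be illustrated with a small simulation in which each observation is independently misread with probability d, so the expected estimate is p(1 - 2d) + d; the noise rate and sample size below are assumptions for illustration only.

        import numpy as np

        rng = np.random.default_rng(1)
        d = 0.15                 # assumed probability of misreading a single observation
        n = 200                  # assumed sample size behind each estimate

        for p in (0.1, 0.3, 0.5, 0.7, 0.9):
            outcomes = rng.random(n) < p                  # true occurrences of the event
            flips = rng.random(n) < d                     # noisy read-out of each observation
            observed = np.where(flips, ~outcomes, outcomes)
            print(f"true p = {p:.1f}  noisy estimate = {observed.mean():.2f}  "
                  f"expected p(1-2d)+d = {p * (1 - 2 * d) + d:.2f}")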

  14. Estimating reliability of degraded system based on the probability density evolution with multi-parameter

    Directory of Open Access Journals (Sweden)

    Jiang Ge

    2017-01-01

    Full Text Available System degradation is usually caused by the degradation of multiple parameters. Reliability assessment by the universal generating function is of low accuracy when compared with Monte Carlo simulation, and the probability density function of the system output performance cannot be obtained. A reliability assessment method based on probability density evolution with multiple parameters is therefore presented for complex degraded systems. First, the system output function is constructed from the relation between the component parameters and the system output performance. Then, the probability density evolution equation is established from the probability conservation principle and the system output function. Furthermore, the probability distribution characteristics of the system output performance are obtained by solving the differential equation. Finally, the reliability of the degraded system is estimated. This method does not need to discretize the performance parameters and can establish a continuous probability density function of the system output performance with high calculation efficiency and low cost. A numerical example shows that the method is applicable to evaluating the reliability of multi-parameter degraded systems.

  15. Bremsstrahlung emission probability in the α decay of 210Po

    International Nuclear Information System (INIS)

    Boie, Hans-Hermann

    2009-01-01

    A high-statistics measurement of bremsstrahlung emitted in the α decay of 210 Po has been performed. The measured differential emission probabilities, which could be followed up to γ-energies of ≈ 500 keV, allow for the first time for a serious test of various model calculations of the bremsstrahlung-accompanied α decay. It is shown that corrections to the α-γ angular correlation due to the interference between the electric dipole and quadrupole amplitudes and due to the relativistic character of the process have to be taken into account. With the experimentally derived angular correlation the measured energy-differential bremsstrahlung emission probabilities show excellent agreement with the fully quantum mechanical calculation. (orig.)

  16. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. These NDE methods are intended to detect real flaws such as cracks and crack-like flaws, and a reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper discusses optimizing probability of detection (POD) demonstration experiments using the point estimate method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size (within some tolerance) is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false calls (POF) while keeping the flaw sizes in the set as small as possible.
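
    Under the binomial model, the probability of passing a 29-flaw point-estimate demonstration (all flaws detected) given the true POD can be sketched as follows; the POD values are illustrative.

        from scipy.stats import binom

        n_flaws = 29
        for true_pod in (0.90, 0.95, 0.99):
            ppd = binom.pmf(n_flaws, n_flaws, true_pod)   # P(all 29 detected) = POD**29
            print(f"true POD = {true_pod:.2f}  probability of passing the demonstration = {ppd:.3f}")

        # Detecting 29 of 29 demonstrates POD >= 0.90 at 95% confidence, since a system with
        # true POD = 0.90 would pass with probability 0.90**29, about 0.047 < 0.05.
        print("0.90**29 =", round(0.90 ** 29, 4))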

  17. Some open problems in noncommutative probability

    International Nuclear Information System (INIS)

    Kruszynski, P.

    1981-01-01

    A generalization of probability measures to non-Boolean structures is discussed. The starting point of the theory is the Gleason theorem about the form of measures on closed subspaces of a Hilbert space. The problems are formulated in terms of probability on lattices of projections in arbitrary von Neumann algebras. (Auth.)

  18. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  19. On the shake-off probability for atomic systems

    Energy Technology Data Exchange (ETDEWEB)

    Santos, A.C.F., E-mail: toniufrj@gmail.com [Instituto de Física, Universidade Federal do Rio de Janeiro, P.O. Box 68528, 21941-972 Rio de Janeiro, RJ (Brazil); Almeida, D.P. [Departamento de Física, Universidade Federal de Santa Catarina, 88040-900 Florianópolis (Brazil)

    2016-07-15

    Highlights: • The scope is to find the relationship among SO probabilities, Z and electron density. • A scaling law is suggested, allowing us to find the SO probabilities for atoms. • SO probabilities have been scaled as a function of target Z and polarizability. - Abstract: The main scope of this work has been the relationship between shake-off probabilities, target atomic number and electron density. By comparing the saturation values of measured double-to-single photoionization ratios from the literature, a simple scaling law has been found, which allows us to predict the shake-off probabilities for several elements up to Z = 54 within a factor of 2. The electron shake-off probabilities accompanying valence shell photoionization have been scaled as a function of the target atomic number, Z, and polarizability, α. This behavior is in qualitative agreement with the experimental results.

  20. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities, and their link to quantum computations, are discussed.

  1. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the ground work to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, Erdos-Kac invariance principle, functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a text book for students pursuing graduate programs in Mathematics and or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  2. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  3. The estimation of small probabilities and risk assessment

    International Nuclear Information System (INIS)

    Kalbfleisch, J.D.; Lawless, J.F.; MacKay, R.J.

    1982-01-01

    The primary contribution of statistics to risk assessment is in the estimation of probabilities. Frequently the probabilities in question are small, and their estimation is particularly difficult. The authors consider three examples illustrating some problems inherent in the estimation of small probabilities

  4. Probability Weighting and Loss Aversion in Futures Hedging

    NARCIS (Netherlands)

    Mattos, F.; Garcia, P.; Pennings, J.M.E.

    2008-01-01

    We analyze how the introduction of probability weighting and loss aversion in a futures hedging model affects decision making. Analytical findings indicate that probability weighting alone always affects optimal hedge ratios, while loss and risk aversion only have an impact when probability
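
    Probability weighting in such models is often represented by an inverse-S-shaped weighting function; the sketch below uses the Tversky-Kahneman (1992) one-parameter form with an illustrative parameter value, not a value estimated in this paper.

        def weight(p, gamma=0.61):
            # Inverse-S weighting: w(p) = p^g / (p^g + (1 - p)^g)^(1/g);
            # overweights small probabilities and underweights moderate-to-large ones.
            return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1.0 / gamma)

        for p in (0.01, 0.10, 0.50, 0.90, 0.99):
            print(f"p = {p:4.2f}  w(p) = {weight(p):.3f}")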

  5. More than words: Adults learn probabilities over categories and relationships between them.

    Science.gov (United States)

    Hudson Kam, Carla L

    2009-04-01

    This study examines whether human learners can acquire statistics over abstract categories and their relationships to each other. Adult learners were exposed to miniature artificial languages containing variation in the ordering of the Subject, Object, and Verb constituents. Different orders (e.g. SOV, VSO) occurred in the input with different frequencies, but the occurrence of one order versus another was not predictable. Importantly, the language was constructed such that participants could only match the overall input probabilities if they were tracking statistics over abstract categories, not over individual words. At test, participants reproduced the probabilities present in the input with a high degree of accuracy. Closer examination revealed that learners were matching the probabilities associated with individual verbs rather than the category as a whole. However, individual nouns had no impact on word orders produced. Thus, participants learned the probabilities of a particular ordering of the abstract grammatical categories Subject and Object associated with each verb. Results suggest that statistical learning mechanisms are capable of tracking relationships between abstract linguistic categories in addition to individual items.

  6. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  7. Waste Package Misload Probability

    International Nuclear Information System (INIS)

    Knudsen, J.K.

    2001-01-01

    The objective of this calculation is to calculate the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and of FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants, the categories being based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in each event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a

  8. Framing of decision problem in short and long term and probability perception

    Directory of Open Access Journals (Sweden)

    Anna Wielicka-Regulska

    2010-01-01

    Full Text Available Consumer preferences depend on problem framing and time perspective. For the experiment's participants, avoiding losses was judged less probable in a distant time perspective than in the near term. Conversely, achieving gains in the near future was judged less probable than in the remote future. One may therefore expect different reactions when a problem is presented in terms of gains rather than in terms of losses. This can be exploited in the promotion of highly desired social behaviours such as saving for retirement, keeping a good diet, investing in learning, and other advantageous activities that consumers usually put off.

  9. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  10. Using extreme value theory approaches to forecast the probability of outbreak of highly pathogenic influenza in Zhejiang, China.

    Directory of Open Access Journals (Sweden)

    Jiangpeng Chen

    Full Text Available Influenza is a highly transmissible contagious disease that spreads around the world with considerable morbidity and mortality, presenting an enormous burden on public health worldwide. Few mathematical models can be used because influenza incidence data are generally not normally distributed. We developed a mathematical model using Extreme Value Theory (EVT) to forecast the probability of an outbreak of highly pathogenic influenza. The incidence data for highly pathogenic influenza in Zhejiang province from April 2009 to November 2013 were retrieved from the website of the Health and Family Planning Commission of Zhejiang Province. The MATLAB "VIEM" toolbox was used for data analysis and modelling. In the present work, we used the Peak Over Threshold (POT) model, assuming the frequency to be a Poisson process and the intensity to be Pareto distributed, to characterize the temporal variability of the long-term extreme incidence of highly pathogenic influenza in Zhejiang, China. The skewness and kurtosis of the incidence of highly pathogenic influenza in Zhejiang between April 2009 and November 2013 were 4.49 and 21.12, which indicated a "fat tail" distribution. A QQ plot and a mean excess plot were used to further validate the features of the distribution. After determining the threshold, we modeled the extremes and estimated the shape and scale parameters by the maximum likelihood method. The results showed that months in which the incidence of highly pathogenic influenza is about 4462/2286/1311/487 are predicted to occur once every five/three/two/one years, respectively. Despite its simplicity, the present study offers a sound modeling strategy and a methodological avenue for forecasting an epidemic in the midst of its course.
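
    A minimal sketch of the peaks-over-threshold approach described above, fitting a generalized Pareto distribution to threshold exceedances and computing return levels; the synthetic series, threshold choice and return periods are stand-ins for the Zhejiang incidence data, not the paper's estimates.

        import numpy as np
        from scipy.stats import genpareto

        rng = np.random.default_rng(2)
        incidence = rng.lognormal(mean=5.0, sigma=1.0, size=200)   # synthetic stand-in monthly counts
        u = np.quantile(incidence, 0.90)                           # threshold choice
        exceed = incidence[incidence > u] - u

        c, loc, scale = genpareto.fit(exceed, floc=0.0)            # shape c and scale by maximum likelihood
        zeta_u = exceed.size / incidence.size                      # empirical P(X > u)

        def return_level(m):
            # Level exceeded on average once every m observations (here, months)
            return u + (scale / c) * ((m * zeta_u) ** c - 1.0)

        for years in (1, 2, 3, 5):
            print(f"{years}-year return level: {return_level(12 * years):,.0f} cases")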

  11. Maximum Entropy and Probability Kinematics Constrained by Conditionals

    Directory of Open Access Journals (Sweden)

    Stefan Lukits

    2015-03-01

    Full Text Available Two open questions of inductive reasoning are solved: (1) does the principle of maximum entropy (PME) give a solution to the obverse Majerník problem; and (2) is Wagner correct when he claims that Jeffrey’s updating principle (JUP) contradicts PME? Majerník shows that PME provides unique and plausible marginal probabilities, given conditional probabilities. The obverse problem posed here is whether PME also provides such conditional probabilities, given certain marginal probabilities. The theorem developed to solve the obverse Majerník problem demonstrates that in the special case introduced by Wagner, PME does not contradict JUP, but elegantly generalizes it and offers a more integrated approach to probability updating.
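
    For reference, Jeffrey's updating principle computes the new probability of a proposition as a mixture of its old conditional probabilities, weighted by the shifted probabilities of the evidence partition; the numbers in this sketch are illustrative.

        def jeffrey_update(p_a_given_b, p_a_given_not_b, q_new):
            # JUP: P_new(A) = P(A|B) * q_new + P(A|~B) * (1 - q_new),
            # where experience shifts the probability of B from its old value to q_new.
            return p_a_given_b * q_new + p_a_given_not_b * (1.0 - q_new)

        # Old conditionals P(A|B) = 0.8 and P(A|~B) = 0.3; experience raises P(B) to 0.9
        print("P_new(A) =", jeffrey_update(0.8, 0.3, 0.9))   # 0.75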

  12. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability, concurrent with and integrated with statistics, through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698). Incorporates more than 1,000 engaging problems with answers; includes more than 300 solved examples; uses varied problem solving methods.

  13. Teaching Probability with the Support of the R Statistical Software

    Science.gov (United States)

    dos Santos Ferreira, Robson; Kataoka, Verônica Yumi; Karrer, Monica

    2014-01-01

    The objective of this paper is to discuss aspects of high school students' learning of probability in a context where they are supported by the statistical software R. We report on the application of a teaching experiment, constructed using the perspective of Gal's probabilistic literacy and Papert's constructionism. The results show improvement…

  14. Random phenomena fundamentals of probability and statistics for engineers

    CERN Document Server

    Ogunnaike, Babatunde A

    2009-01-01

    Prelude; Approach Philosophy; Four Basic Principles. I Foundations: Two Motivating Examples; Yield Improvement in a Chemical Process; Quality Assurance in a Glass Sheet Manufacturing Process; Outline of a Systematic Approach; Random Phenomena, Variability, and Uncertainty; Two Extreme Idealizations of Natural Phenomena; Random Mass Phenomena; Introducing Probability; The Probabilistic Framework. II Probability: Fundamentals of Probability Theory; Building Blocks; Operations; Probability; Conditional Probability; Independence; Random Variables and Distributions; Distributions; Mathematical Expectation; Characterizing Distributions; Special Derived Probability Functions; Multidimensional Random Variables; Distributions of Several Random Variables; Distributional Characteristics of Jointly Distributed Random Variables; Random Variable Transformations; Single Variable Transformations; Bivariate Transformations; General Multivariate Transformations; Application Case Studies I: Probability; Mendel and Heredity; World War II Warship Tactical Response Under Attack. III Distributions: Ide...

  15. Examples of Neutrosophic Probability in Physics

    Directory of Open Access Journals (Sweden)

    Fu Yuhua

    2015-01-01

    Full Text Available This paper re-discusses the problems of the so-called “law of nonconservation of parity” and “accelerating expansion of the universe”, and presents the examples of determining Neutrosophic Probability of the experiment of Chien-Shiung Wu et al in 1957, and determining Neutrosophic Probability of accelerating expansion of the partial universe.

  16. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  17. Electrofishing capture probability of smallmouth bass in streams

    Science.gov (United States)

    Dauwalter, D.C.; Fisher, W.L.

    2007-01-01

    Abundance estimation is an integral part of understanding the ecology and advancing the management of fish populations and communities. Mark-recapture and removal methods are commonly used to estimate the abundance of stream fishes. Alternatively, abundance can be estimated by dividing the number of individuals sampled by the probability of capture. We conducted a mark-recapture study and used multiple repeated-measures logistic regression to determine the influence of fish size, sampling procedures, and stream habitat variables on the cumulative capture probability for smallmouth bass Micropterus dolomieu in two eastern Oklahoma streams. The predicted capture probability was used to adjust the number of individuals sampled to obtain abundance estimates. The observed capture probabilities were higher for larger fish and decreased with successive electrofishing passes for larger fish only. Model selection suggested that the number of electrofishing passes, fish length, and mean thalweg depth affected capture probabilities the most; there was little evidence for any effect of electrofishing power density and woody debris density on capture probability. Leave-one-out cross validation showed that the cumulative capture probability model predicts smallmouth abundance accurately. © Copyright by the American Fisheries Society 2007.

  18. On the Hitting Probability of Max-Stable Processes

    OpenAIRE

    Hofmann, Martin

    2012-01-01

    The probability that a max-stable process η in C[0, 1] with identical marginal distribution function F hits x ∈ ℝ with 0 < F(x) < 1 is the hitting probability of x. We show that the hitting probability is always positive, unless the components of η are completely dependent. Moreover, we consider the event that the paths of a standard MSP hit some x ∈ ℝ twice, and we give a sufficient condition for a positive probability of this event.

  19. Risk estimation using probability machines

    Science.gov (United States)

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
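
    A minimal sketch of the idea, using a random forest as the probability machine on synthetic data; the variable names, data-generating model and forest settings are illustrative assumptions, not the authors' analysis.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(3)
        n = 5000
        exposure = rng.integers(0, 2, n)                  # binary predictor of interest
        age = rng.normal(50, 10, n)                       # continuous covariate
        logit = -4.0 + 1.2 * exposure + 0.05 * age
        y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)   # binary outcome

        X = np.column_stack([exposure, age])
        rf = RandomForestClassifier(n_estimators=500, min_samples_leaf=25, random_state=0).fit(X, y)

        # Conditional probability estimates come from predict_proba; a counterfactual risk
        # difference for the exposure is the mean difference when everyone is set to 1 vs 0.
        X1, X0 = X.copy(), X.copy()
        X1[:, 0], X0[:, 0] = 1, 0
        risk_diff = rf.predict_proba(X1)[:, 1].mean() - rf.predict_proba(X0)[:, 1].mean()
        print(f"estimated average risk difference for the exposure: {risk_diff:.3f}")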

  20. Earthquake Probability Assessment for the Active Faults in Central Taiwan: A Case Study

    Directory of Open Access Journals (Sweden)

    Yi-Rui Lee

    2016-06-01

    Full Text Available Frequent high seismic activity occurs in Taiwan due to fast plate motions. According to the historical records, the most destructive earthquakes in Taiwan were caused mainly by inland active faults. The Central Geological Survey (CGS) of Taiwan has published active fault maps of Taiwan since 1998; 33 active faults are noted in the 2012 active fault map. After the Chi-Chi earthquake, the CGS launched a series of projects to investigate each active fault in Taiwan in detail. This article collected these data to develop active fault parameters and drew on experience from Japan and the United States to establish a methodology for earthquake probability assessment via active faults. We take the active faults in Central Taiwan as an example to present the earthquake probability assessment process and results. An appropriate probability model was used to estimate the conditional probability of M ≥ 6.5 and M ≥ 7.0 earthquakes. Our results show that the highest earthquake probability for an M ≥ 6.5 earthquake occurring in 30, 50, and 100 years in Central Taiwan is for the Tachia-Changhua fault system; conversely, the lowest earthquake probability is for the Chelungpu fault. The goal of our research is to calculate the earthquake probability for all 33 active faults in Taiwan. The active fault parameters are important information that can be applied in subsequent seismic hazard analysis and seismic simulation.
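
    As a simplified illustration of exposure-window probabilities of this kind, the sketch below uses a time-independent Poisson recurrence model, 1 - exp(-t/T); the recurrence interval is an assumed placeholder rather than a CGS fault parameter, and the paper's conditional (time-dependent) models would give different values.

        import math

        def poisson_prob(recurrence_interval_yr, window_yr):
            # P(at least one event in the window) = 1 - exp(-t / T) for a Poisson process
            return 1.0 - math.exp(-window_yr / recurrence_interval_yr)

        T = 350.0                      # assumed mean recurrence interval (years); placeholder value
        for t in (30, 50, 100):
            print(f"{t}-year probability of at least one M >= 6.5 event: {poisson_prob(T, t):.2f}")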

  1. Fixation Probability in a Haploid-Diploid Population.

    Science.gov (United States)

    Bessho, Kazuhiro; Otto, Sarah P

    2017-01-01

    Classical population genetic theory generally assumes either a fully haploid or fully diploid life cycle. However, many organisms exhibit more complex life cycles, with both free-living haploid and diploid stages. Here we ask what the probability of fixation is for selected alleles in organisms with haploid-diploid life cycles. We develop a genetic model that considers the population dynamics using both the Moran model and Wright-Fisher model. Applying a branching process approximation, we obtain an accurate fixation probability assuming that the population is large and the net effect of the mutation is beneficial. We also find the diffusion approximation for the fixation probability, which is accurate even in small populations and for deleterious alleles, as long as selection is weak. These fixation probabilities from branching process and diffusion approximations are similar when selection is weak for beneficial mutations that are not fully recessive. In many cases, particularly when one phase predominates, the fixation probability differs substantially for haploid-diploid organisms compared to either fully haploid or diploid species. Copyright © 2017 by the Genetics Society of America.

  2. Causal inference, probability theory, and graphical insights.

    Science.gov (United States)

    Baker, Stuart G

    2013-11-10

    Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design. Published 2013. This article is a US Government work and is in the public domain in the USA.

  3. Chance, determinism and the classical theory of probability.

    Science.gov (United States)

    Vasudevan, Anubav

    2018-02-01

    This paper situates the metaphysical antinomy between chance and determinism in the historical context of some of the earliest developments in the mathematical theory of probability. Since Hacking's seminal work on the subject, it has been a widely held view that the classical theorists of probability were guilty of an unwitting equivocation between a subjective, or epistemic, interpretation of probability, on the one hand, and an objective, or statistical, interpretation, on the other. While there is some truth to this account, I argue that the tension at the heart of the classical theory of probability is not best understood in terms of the duality between subjective and objective interpretations of probability. Rather, the apparent paradox of chance and determinism, when viewed through the lens of the classical theory of probability, manifests itself in a much deeper ambivalence on the part of the classical probabilists as to the rational commensurability of causal and probabilistic reasoning. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  5. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  6. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits...... in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  7. Spatial probability of soil water repellency in an abandoned agricultural field in Lithuania

    Science.gov (United States)

    Pereira, Paulo; Misiūnė, Ieva

    2015-04-01

    Water repellency is a natural soil property with implications for infiltration, erosion and plant growth. It depends on soil texture, type and amount of organic matter, fungi, microorganisms, and vegetation cover (Doerr et al., 2000). Human activities such as agriculture can affect soil water repellency (SWR) through tillage and the addition of organic compounds and fertilizers (Blanco-Canqui and Lal, 2009; Gonzalez-Penaloza et al., 2012). It is also assumed that SWR has high small-scale variability (Doerr et al., 2000). The aim of this work is to study the spatial probability of SWR in an abandoned field, testing several geostatistical methods: Ordinary Kriging (OK), Simple Kriging (SK), Indicator Kriging (IK), Probability Kriging (PK) and Disjunctive Kriging (DK). The study area is located near the Vilnius urban area (54°49' N, 25°22', 104 m a.s.l.) in Lithuania (Pereira and Oliva, 2013). An experimental plot of 21 m2 (7 x 3 m) was designed, and within this area SWR was measured every 50 cm using the water drop penetration time (WDPT) test (Wessel, 1998), giving a total of 105 measurement points. The probability of SWR was classified from 0 (no probability) to 1 (high probability). The accuracy of the methods was assessed with the cross validation method: the best interpolation method was the one with the lowest Root Mean Square Error (RMSE). The results showed that the most accurate probability method was SK (RMSE=0.436), followed by DK (RMSE=0.437), IK (RMSE=0.448), PK (RMSE=0.452) and OK (RMSE=0.537). Significant differences were identified among the tested techniques (Kruskal-Wallis test = 199.7597). Simple Kriging, DK, IK and PK identified the high SWR probabilities in the northeastern and central parts of the plot, while OK placed them mainly in the south-western part of the plot. In conclusion, before predicting the spatial probability of SWR it is important to test several methods in order to identify the most accurate one. Acknowledgments COST action ES

  8. 14 CFR 417.224 - Probability of failure analysis.

    Science.gov (United States)

    2010-01-01

    Title 14 Aeronautics and Space, Vol. 4 (2010-01-01): § 417.224 Probability of failure analysis — DEPARTMENT OF TRANSPORTATION, LICENSING, LAUNCH SAFETY, Flight Safety Analysis. "... must account for launch vehicle failure probability in a consistent manner. A launch vehicle failure..."

  9. Students' Understanding of Conditional Probability on Entering University

    Science.gov (United States)

    Reaburn, Robyn

    2013-01-01

    An understanding of conditional probability is essential for students of inferential statistics as it is used in Null Hypothesis Tests. Conditional probability is also used in Bayes' theorem, in the interpretation of medical screening tests and in quality control procedures. This study examines the understanding of conditional probability of…

  10. Elements of probability and statistics an introduction to probability with De Finetti’s approach and to Bayesian statistics

    CERN Document Server

    Biagini, Francesca

    2016-01-01

    This book provides an introduction to elementary probability and to Bayesian statistics using de Finetti's subjectivist approach. One of the features of this approach is that it does not require the introduction of sample space – a non-intrinsic concept that makes the treatment of elementary probability unnecessarily complicated – but introduces as fundamental the concept of random numbers directly related to their interpretation in applications. Events become a particular case of random numbers and probability a particular case of expectation when it is applied to events. The subjective evaluation of expectation and of conditional expectation is based on an economic choice of an acceptable bet or penalty. The properties of expectation and conditional expectation are derived by applying a coherence criterion that the evaluation has to follow. The book is suitable for all introductory courses in probability and statistics for students in Mathematics, Informatics, Engineering, and Physics.

  11. The collision probability modules of WIMS-E

    International Nuclear Information System (INIS)

    Roth, M.J.

    1985-04-01

    This report describes how flat source first flight collision probabilities are calculated and used in the WIMS-E modular program. It includes a description of the input to the modules W-FLU, W-THES, W-PIP, W-PERS and W-MERGE. Input to other collision probability modules are described in separate reports. WIMS-E is capable of calculating collision probabilities in a wide variety of geometries, some of them quite complicated. It can also use them for a variety of purposes. (author)

  12. Comparison of Wells and Revised Geneva Rule to Assess Pretest Probability of Pulmonary Embolism in High-Risk Hospitalized Elderly Adults.

    Science.gov (United States)

    Di Marca, Salvatore; Cilia, Chiara; Campagna, Andrea; D'Arrigo, Graziella; Abd ElHafeez, Samar; Tripepi, Giovanni; Puccia, Giuseppe; Pisano, Marcella; Mastrosimone, Gianluca; Terranova, Valentina; Cardella, Antonella; Buonacera, Agata; Stancanelli, Benedetta; Zoccali, Carmine; Malatino, Lorenzo

    2015-06-01

    To assess and compare the diagnostic power for pulmonary embolism (PE) of Wells and revised Geneva scores in two independent cohorts (training and validation groups) of elderly adults hospitalized in a non-emergency department. Prospective clinical study, January 2011 to January 2013. Unit of Internal Medicine inpatients, University of Catania, Italy. Elderly adults (mean age 76 ± 12), presenting with dyspnea or chest pain and with high clinical probability of PE or D-dimer values greater than 500 ng/mL (N = 203), were enrolled and consecutively assigned to a training (n = 101) or a validation (n = 102) group. The clinical probability of PE was assessed using Wells and revised Geneva scores. Clinical examination, D-dimer test, and multidetector computed angiotomography were performed in all participants. The accuracy of the scores was assessed using receiver operating characteristic analyses. PE was confirmed in 46 participants (23%) (24 training group, 22 validation group). In the training group, the area under the receiver operating characteristic curve was 0.91 (95% confidence interval (CI) = 0.85-0.98) for the Wells score and 0.69 (95% CI = 0.56-0.82) for the revised Geneva score (P < .001). These results were confirmed in the validation group (P < .05). The positive (LR+) and negative likelihood ratios (LR-) (two indices combining sensitivity and specificity) of the Wells score were superior to those of the revised Geneva score in the training (LR+, 7.90 vs 1.34; LR-, 0.23 vs 0.66) and validation (LR+, 13.5 vs 1.46; LR-, 0.47 vs 0.54) groups. In high-risk elderly hospitalized adults, the Wells score is more accurate than the revised Geneva score for diagnosing PE. © 2015, Copyright the Authors Journal compilation © 2015, The American Geriatrics Society.

  13. Fixation probability on clique-based graphs

    Science.gov (United States)

    Choi, Jeong-Ok; Yu, Unjong

    2018-02-01

    The fixation probability of a mutant in the evolutionary dynamics of Moran process is calculated by the Monte-Carlo method on a few families of clique-based graphs. It is shown that the complete suppression of fixation can be realized with the generalized clique-wheel graph in the limit of small wheel-clique ratio and infinite size. The family of clique-star is an amplifier, and clique-arms graph changes from amplifier to suppressor as the fitness of the mutant increases. We demonstrate that the overall structure of a graph can be more important to determine the fixation probability than the degree or the heat heterogeneity. The dependence of the fixation probability on the position of the first mutant is discussed.
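
    A minimal Monte Carlo sketch of the quantity being estimated: the fixation probability of a single mutant under birth-death Moran dynamics on a graph, checked against the well-mixed Moran formula on an isothermal (complete) graph. The graph family, fitness value and run count are illustrative, not the paper's clique-based constructions.

        import random
        import networkx as nx

        def fixation_probability(G, r, runs=2000):
            # Birth-death Moran dynamics: a node reproduces with probability proportional to
            # fitness (mutants have fitness r, residents 1) and its offspring replaces a
            # uniformly chosen neighbour; repeat until the mutant type fixes or goes extinct.
            nodes = list(G.nodes)
            fixed = 0
            for _ in range(runs):
                mutants = {random.choice(nodes)}          # a single mutant at a random node
                while 0 < len(mutants) < len(nodes):
                    weights = [r if v in mutants else 1.0 for v in nodes]
                    parent = random.choices(nodes, weights=weights, k=1)[0]
                    child = random.choice(list(G.neighbors(parent)))
                    if parent in mutants:
                        mutants.add(child)
                    else:
                        mutants.discard(child)
                fixed += len(mutants) == len(nodes)
            return fixed / runs

        def moran_well_mixed(r, N):
            # Fixation probability of one mutant in the classical (well-mixed) Moran process
            return (1.0 - 1.0 / r) / (1.0 - 1.0 / r ** N)

        random.seed(0)
        G, r = nx.complete_graph(20), 1.2                 # the complete graph is isothermal
        print("simulated :", fixation_probability(G, r))
        print("analytical:", round(moran_well_mixed(r, G.number_of_nodes()), 4))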

  14. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.

  15. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  16. Probabilities the little numbers that rule our lives

    CERN Document Server

    Olofsson, Peter

    2014-01-01

    Praise for the First Edition"If there is anything you want to know, or remind yourself, about probabilities, then look no further than this comprehensive, yet wittily written and enjoyable, compendium of how to apply probability calculations in real-world situations."- Keith Devlin, Stanford University, National Public Radio's "Math Guy" and author of The Math Gene and The Unfinished GameFrom probable improbabilities to regular irregularities, Probabilities: The Little Numbers That Rule Our Lives, Second Edition investigates the often surprising effects of risk and chance in our lives. Featur

  17. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  18. Evolvement simulation of the probability of neutron-initiating persistent fission chain

    International Nuclear Information System (INIS)

    Wang Zhe; Hong Zhenying

    2014-01-01

    Background: The probability of a neutron initiating a persistent fission chain, which has to be calculated in analyses of criticality safety, reactor start-up, burst waiting time on pulse reactors, bursting time on pulse reactors, etc., is an inherent parameter of a multiplying assembly. Purpose: We aim to derive a time-dependent integro-differential equation for this probability in relative velocity space according to probability conservation, and to develop the deterministic code Dynamic Segment Number Probability (DSNP) based on the multi-group S N method. Methods: The reliable convergence of the dynamic calculation was analyzed, and numerical simulation of the evolution of the dynamic probability for varying concentration was performed under different initial conditions. Results: For Highly Enriched Uranium (HEU) bare spheres, when the time is long enough the results of the dynamic calculation approach those of the static calculation; the largest difference between the DSNP and Partisn codes is less than 2%. For the Baker model, over the range of about 1 μs after the first criticality, the largest difference between the dynamic and static calculations is about 300%. For a supercritical system, the finite fission chains decrease and the persistent fission chains increase as the reactivity increases, and the dynamic evolution curve of the initiation probability is within 5% of the static curve when K eff is more than 1.2. The cumulative probability curve also indicates that the difference in integral results between the dynamic and static calculations decreases from 35% to 5% as K eff increases. This demonstrates that the ability to initiate a self-sustaining fission chain reaction approaches stabilization, while the former difference (35%) shows the important difference between the dynamic and static results near the first criticality. The DSNP code agrees well with the Partisn code. Conclusions: There are large numbers of

  19. Crash probability estimation via quantifying driver hazard perception.

    Science.gov (United States)

    Li, Yang; Zheng, Yang; Wang, Jianqiang; Kodaka, Kenji; Li, Keqiang

    2018-07-01

    Crash probability estimation is an important method to predict the potential reduction of crash probability contributed by forward collision avoidance technologies (FCATs). In this study, we propose a practical approach to estimate crash probability, which combines a field operational test and numerical simulations of a typical rear-end crash model. To consider driver hazard perception characteristics, we define a novel hazard perception measure, called driver risk response time, by considering both time-to-collision (TTC) and driver braking response to impending collision risk in a near-crash scenario. We also establish a driving database under mixed Chinese traffic conditions based on a CMBS (Collision Mitigation Braking Systems)-equipped vehicle. Applying the crash probability estimation to this database, we estimate the potential decrease in crash probability owing to use of CMBS. A comparison of the results with CMBS on and off shows a 13.7% reduction of crash probability in a typical rear-end near-crash scenario with a one-second delay of the driver's braking response. These results indicate that CMBS is effective in collision prevention, especially in the case of inattentive or older drivers. The proposed crash probability estimation offers a practical way of evaluating the safety benefits in the design and testing of FCATs. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Comments on the sequential probability ratio testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Racz, A. [Hungarian Academy of Sciences, Budapest (Hungary). Central Research Inst. for Physics

    1996-07-01

    In this paper the classical sequential probability ratio testing method (SPRT) is reconsidered. Every individual boundary-crossing event of the SPRT is regarded as a new piece of evidence about the problem under hypothesis testing. The Bayes method is applied for belief updating, i.e. integrating these individual decisions. The procedure is recommended for use when the user (1) would like to be informed about the tested hypothesis continuously and (2) would like to reach a final conclusion with a high confidence level. (Author).
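    As an illustration of the classical SPRT boundary-crossing step discussed above, the following minimal Python sketch (not from the paper; the Gaussian data model, thresholds, and error rates are illustrative assumptions) accumulates a log-likelihood ratio sample by sample and reports a decision when a Wald boundary is crossed.

        import math, random

        def sprt_gaussian(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.05, beta=0.05):
            """Wald SPRT for H0: mean=mu0 vs H1: mean=mu1 (known sigma).
            Returns ('H0' | 'H1' | 'undecided', number of samples used)."""
            upper = math.log((1 - beta) / alpha)   # accept H1 above this boundary
            lower = math.log(beta / (1 - alpha))   # accept H0 below this boundary
            llr = 0.0
            for n, x in enumerate(samples, start=1):
                # log-likelihood ratio increment for one Gaussian observation
                llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
                if llr >= upper:
                    return "H1", n
                if llr <= lower:
                    return "H0", n
            return "undecided", len(samples)

        random.seed(1)
        data = [random.gauss(1.0, 1.0) for _ in range(200)]   # data actually drawn under H1
        print(sprt_gaussian(data))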

  1. Escape probabilities for fluorescent x-rays

    International Nuclear Information System (INIS)

    Dance, D.R.; Day, G.J.

    1985-01-01

    Computation of the energy absorption efficiency of an x-ray photon detector involves consideration of the histories of the secondary particles produced in any initial or secondary interaction which may occur within the detector. In particular, the K or higher shell fluorescent x-rays which may be emitted following a photoelectric interaction can carry away a large fraction of the energy of the incident photon, especially if this energy is just above an absorption edge. The effects of such photons cannot be ignored and a correction term, depending upon the probability that the fluorescent x-rays will escape from the detector, must be applied to the energy absorption efficiency. For detectors such as x-ray intensifying screens, it has been usual to calculate this probability by numerical integration. In this note analytic expressions are derived for the escape probability of fluorescent photons from planar detectors in terms of exponential integral functions. Rational approximations for these functions are readily available and these analytic expressions therefore facilitate the computation of photon absorption efficiencies. A table is presented which should obviate the need for calculating the escape probability for most cases of interest. (author)
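    The analytic escape-probability expressions themselves are not reproduced in the record, but the exponential integral function in terms of which they are written is straightforward to evaluate numerically. A small sketch (my own illustration, using scipy rather than the rational approximations mentioned) evaluates E1 and cross-checks it against its defining integral:

        import numpy as np
        from scipy.special import exp1          # exponential integral E1(x)
        from scipy.integrate import quad

        x = 0.7
        e1_closed = exp1(x)
        # cross-check against the defining integral E1(x) = integral_1^inf exp(-x*t)/t dt
        e1_quad, _ = quad(lambda t: np.exp(-x * t) / t, 1.0, np.inf)
        print(e1_closed, e1_quad)               # the two values should agree closely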

  2. Transition probabilities of Ce I obtained from Boltzmann analysis of visible and near-infrared emission spectra

    Science.gov (United States)

    Nitz, D. E.; Curry, J. J.; Buuck, M.; DeMann, A.; Mitchell, N.; Shull, W.

    2018-02-01

    We report radiative transition probabilities for 5029 emission lines of neutral cerium within the wavelength range 417-1110 nm. Transition probabilities for only 4% of these lines have been previously measured. These results are obtained from a Boltzmann analysis of two high resolution Fourier transform emission spectra used in previous studies of cerium, obtained from the digital archives of the National Solar Observatory at Kitt Peak. The set of transition probabilities used for the Boltzmann analysis are those published by Lawler et al (2010 J. Phys. B: At. Mol. Opt. Phys. 43 085701). Comparisons of branching ratios and transition probabilities for lines common to the two spectra provide important self-consistency checks and test for the presence of self-absorption effects. Estimated 1σ uncertainties for our transition probability results range from 10% to 18%.
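    A minimal sketch of the Boltzmann-analysis idea described above (my own simplification, assuming optically thin lines in LTE so that a line's photon intensity is proportional to g_u A exp(-E_u/kT)): lines with known transition probabilities fix the excitation temperature and the proportionality constant, after which unknown A-values follow from measured intensities.

        import numpy as np

        k_B = 0.69503  # Boltzmann constant in cm^-1 per kelvin

        # Hypothetical reference lines: measured photon intensity I, upper-level
        # degeneracy g_u, upper-level energy E_u (cm^-1), known A-value (s^-1).
        I_ref = np.array([1.2e4, 4.1e3, 9.0e2])
        g_ref = np.array([5, 7, 9])
        E_ref = np.array([17000.0, 21000.0, 25000.0])
        A_ref = np.array([2.0e6, 1.1e6, 0.6e6])

        # Boltzmann plot: ln(I / (g_u * A)) = ln(C) - E_u / (k_B * T)
        y = np.log(I_ref / (g_ref * A_ref))
        slope, intercept = np.polyfit(E_ref, y, 1)
        T_exc = -1.0 / (k_B * slope)            # excitation temperature from the fit

        # Unknown line: infer its A-value from its measured intensity.
        I_new, g_new, E_new = 2.5e3, 7, 23000.0
        A_new = I_new / (g_new * np.exp(intercept - E_new / (k_B * T_exc)))
        print(T_exc, A_new)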

  3. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, more. For advanced undergraduates students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  4. An explicit statistical model of learning lexical segmentation using multiple cues

    NARCIS (Netherlands)

    Çöltekin, Ça ̆grı; Nerbonne, John; Lenci, Alessandro; Padró, Muntsa; Poibeau, Thierry; Villavicencio, Aline

    2014-01-01

    This paper presents an unsupervised and incremental model of learning segmentation that combines multiple cues whose use by children and adults were attested by experimental studies. The cues we exploit in this study are predictability statistics, phonotactics, lexical stress and partial lexical

  5. Phonotactics in inductive logic programming

    NARCIS (Netherlands)

    Nerbonne, J.; Konstantopoulos, S.; Klopotek, M.A.; Wierzchon, S.T.; Trojanowski, K.

    2004-01-01

    We examine the results of applying inductive logic programming (ILP) to a relatively simple linguistic task, that of recognizing monosyllables in one language. ILP is suited to linguistic problems given linguists' preference for formulating their theories in discrete rules, and because of ILP's

  6. Generic Degraded Configuration Probability Analysis for the Codisposal Waste Package

    International Nuclear Information System (INIS)

    S.F.A. Deng; M. Saglam; L.J. Gratton

    2001-01-01

    In accordance with the technical work plan, ''Technical Work Plan For: Department of Energy Spent Nuclear Fuel Work Packages'' (CRWMS M and O 2000c), this Analysis/Model Report (AMR) is developed for the purpose of screening out degraded configurations for U.S. Department of Energy (DOE) spent nuclear fuel (SNF) types. It performs the degraded configuration parameter and probability evaluations of the overall methodology specified in the ''Disposal Criticality Analysis Methodology Topical Report'' (YMP 2000, Section 3) to qualifying configurations. Degradation analyses are performed to assess realizable parameter ranges and physical regimes for configurations. Probability calculations are then performed for configurations characterized by k eff in excess of the Critical Limit (CL). The scope of this document is to develop a generic set of screening criteria or models to screen out degraded configurations having potential for exceeding a criticality limit. The developed screening criteria include arguments based on physical/chemical processes and probability calculations, and they apply to DOE SNF types when codisposed with the high-level waste (HLW) glass inside a waste package. The degradation takes place inside the waste package and occurs long after repository licensing has expired. The emphasis of this AMR is on degraded configuration screening, and the probability analysis is one of the approaches used for screening. The intended use of the model is to apply the developed screening criteria to each DOE SNF type following the completion of the degraded mode criticality analysis internal to the waste package

  7. A novel frequent probability pattern mining algorithm based on circuit simulation method in uncertain biological networks

    Science.gov (United States)

    2014-01-01

    Background Motif mining has always been a hot research topic in bioinformatics. Most current research on biological networks focuses on exact motif mining. However, due to inevitable experimental error and noisy data, biological network data represented with a probability model can better reflect authenticity and biological significance; it is therefore more biologically meaningful to discover probability motifs in uncertain biological networks. One of the key steps in probability motif mining is frequent pattern discovery, which is usually based on the possible world model and has a relatively high computational complexity. Methods In this paper, we present a novel method for detecting frequent probability patterns based on circuit simulation in uncertain biological networks. First, a partition-based efficient search is applied to non-tree-like subgraph mining, where the probability of occurrence in random networks is small. Then, an algorithm for probability isomorphism based on circuit simulation is proposed. The probability isomorphism test combines the analysis of circuit topology structure with related physical properties of voltage in order to evaluate the probability isomorphism between probability subgraphs. The circuit-simulation-based probability isomorphism avoids using the traditional possible world model. Finally, based on the probability subgraph isomorphism algorithm, a two-step hierarchical clustering method is used to cluster subgraphs and discover frequent probability patterns from the clusters. Results The experimental results on data sets of Protein-Protein Interaction (PPI) networks and the transcriptional regulatory networks of E. coli and S. cerevisiae show that the proposed method can efficiently discover the frequent probability subgraphs. The discovered subgraphs in our study contain all probability motifs reported in the experiments published in other related papers. Conclusions The algorithm of probability graph isomorphism

  8. Dependency models and probability of joint events

    International Nuclear Information System (INIS)

    Oerjasaeter, O.

    1982-08-01

    Probabilistic dependencies between components/systems are discussed with reference to a broad classification of potential failure mechanisms. Further, a generalized time-dependency model, based on conditional probabilities for estimation of the probability of joint events and event sequences is described. The applicability of this model is clarified/demonstrated by various examples. It is concluded that the described model of dependency is a useful tool for solving a variety of practical problems concerning the probability of joint events and event sequences where common cause and time-dependent failure mechanisms are involved. (Auth.)

  9. Quantifying seining detection probability for fishes of Great Plains sand‐bed rivers

    Science.gov (United States)

    Mollenhauer, Robert; Logue, Daniel R.; Brewer, Shannon K.

    2018-01-01

    Species detection error (i.e., imperfect and variable detection probability) is an essential consideration when investigators map distributions and interpret habitat associations. When fish detection error that is due to highly variable instream environments needs to be addressed, sand‐bed streams of the Great Plains represent a unique challenge. We quantified seining detection probability for diminutive Great Plains fishes across a range of sampling conditions in two sand‐bed rivers in Oklahoma. Imperfect detection resulted in underestimates of species occurrence using naïve estimates, particularly for less common fishes. Seining detection probability also varied among fishes and across sampling conditions. We observed a quadratic relationship between water depth and detection probability, in which the exact nature of the relationship was species‐specific and dependent on water clarity. Similarly, the direction of the relationship between water clarity and detection probability was species‐specific and dependent on differences in water depth. The relationship between water temperature and detection probability was also species dependent, where both the magnitude and direction of the relationship varied among fishes. We showed how ignoring detection error confounded an underlying relationship between species occurrence and water depth. Despite imperfect and heterogeneous detection, our results support that determining species absence can be accomplished with two to six spatially replicated seine hauls per 200‐m reach under average sampling conditions; however, required effort would be higher under certain conditions. Detection probability was low for the Arkansas River Shiner Notropis girardi, which is federally listed as threatened, and more than 10 seine hauls per 200‐m reach would be required to assess presence across sampling conditions. Our model allows scientists to estimate sampling effort to confidently assess species occurrence, which

  10. A Probability Distribution over Latent Causes, in the Orbitofrontal Cortex.

    Science.gov (United States)

    Chan, Stephanie C Y; Niv, Yael; Norman, Kenneth A

    2016-07-27

    The orbitofrontal cortex (OFC) has been implicated in both the representation of "state," in studies of reinforcement learning and decision making, and also in the representation of "schemas," in studies of episodic memory. Both of these cognitive constructs require a similar inference about the underlying situation or "latent cause" that generates our observations at any given time. The statistically optimal solution to this inference problem is to use Bayes' rule to compute a posterior probability distribution over latent causes. To test whether such a posterior probability distribution is represented in the OFC, we tasked human participants with inferring a probability distribution over four possible latent causes, based on their observations. Using fMRI pattern similarity analyses, we found that BOLD activity in the OFC is best explained as representing the (log-transformed) posterior distribution over latent causes. Furthermore, this pattern explained OFC activity better than other task-relevant alternatives, such as the most probable latent cause, the most recent observation, or the uncertainty over latent causes. Our world is governed by hidden (latent) causes that we cannot observe, but which generate the observations we see. A range of high-level cognitive processes require inference of a probability distribution (or "belief distribution") over the possible latent causes that might be generating our current observations. This is true for reinforcement learning and decision making (where the latent cause comprises the true "state" of the task), and for episodic memory (where memories are believed to be organized by the inferred situation or "schema"). Using fMRI, we show that this belief distribution over latent causes is encoded in patterns of brain activity in the orbitofrontal cortex, an area that has been separately implicated in the representations of both states and schemas. Copyright © 2016 the authors 0270-6474/16/367817-12$15.00/0.
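    The inference step described above, computing a posterior distribution over a small set of latent causes with Bayes' rule and then log-transforming it, can be sketched in a few lines (the priors and likelihoods here are hypothetical placeholders, not values from the study):

        import numpy as np

        prior = np.array([0.25, 0.25, 0.25, 0.25])        # four candidate latent causes
        likelihood = np.array([0.60, 0.20, 0.15, 0.05])    # P(observation | cause), hypothetical

        posterior = prior * likelihood
        posterior /= posterior.sum()                       # Bayes' rule, normalized
        log_posterior = np.log(posterior)                  # log-transformed belief distribution
        print(posterior, log_posterior)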

  11. Model uncertainty: Probabilities for models?

    International Nuclear Information System (INIS)

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising

  12. Probability of inadvertent operation of electrical components in harsh environments

    International Nuclear Information System (INIS)

    Knoll, A.

    1989-01-01

    Harsh environment, which means humidity and high temperature, may and will affect unsealed electrical components by causing leakage ground currents in ungrounded direct current systems. The concern in a nuclear power plant is that such harsh environment conditions could cause inadvertent operation of normally deenergized components, which may have a safety-related isolation function. Harsh environment is a common cause failure, and one way to approach the problem is to assume that all the unsealed electrical components will simultaneously and inadvertently energize as a result of the environmental common cause failure. This assumption is unrealistically conservative. Test results indicated that insulation resistances of any terminal block in harsh environments have a random distribution in the range of 1 to 270 kΩ, with a mean value of ∼59 kΩ. The objective of this paper is to evaluate a realistic conditional failure probability for inadvertent operation of electrical components in harsh environments. This value will be used thereafter in probabilistic safety evaluations of harsh environment events and will replace both the overconservative common cause probability of 1 and the random failure probability used for mild environments

  13. Grit-mediated frictional ignition of a polymer-bonded explosive during oblique impacts: Probability calculations for safety engineering

    International Nuclear Information System (INIS)

    Heatwole, Eric; Parker, Gary; Holmes, Matt; Dickson, Peter

    2015-01-01

    Frictional heating of high-melting-point grit particles during oblique impacts of consolidated explosives is considered to be the major source of ignition in accidents involving dropped explosives. It has been shown in other work that the lower temperature melting point of two frictionally interacting surfaces will cap the maximum temperature reached, which provides a simple way to mitigate the danger in facilities by implementing surfaces with melting points below the ignition temperature of the explosive. However, a recent series of skid testing experiments has shown that ignition can occur on low-melting-point surfaces with a high concentration of grit particles, most likely due to a grit–grit collision mechanism. For risk-based safety engineering purposes, the authors present a method to estimate the probability of grit contact and/or grit–grit collision during an oblique impact. These expressions are applied to potentially high-consequence oblique impact scenarios in order to give the probability of striking one or more grit particles (for high-melting-point surfaces), or the probability of one or more grit–grit collisions occurring (for low-melting-point surfaces). The probability is dependent on a variety of factors, many of which can be controlled for mitigation to achieve acceptable risk levels for safe explosives handling operations. - Highlights: • Unexpectedly, grit-mediated ignition of a PBX occurred on low-melting point surfaces. • On high-melting surfaces frictional heating is due to a grit–surface interaction. • For low-melting point surfaces the heating mechanism is grit–grit collisions. • A method for estimating the probability of ignition is presented for both surfaces

  14. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    The theoretical literature has a rich characterization of scoring rules for eliciting the subjective beliefs that an individual has for continuous events, but under the restrictive assumption of risk neutrality. It is well known that risk aversion can dramatically affect the incentives to correctly report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have to distort reports. We characterize the comparable implications of the general case of a risk averse agent when facing a popular scoring rule over continuous events, and find that these concerns do not apply with anything like the same force. For empirically plausible levels of risk aversion, one can...

  15. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  16. A discussion on the origin of quantum probabilities

    International Nuclear Information System (INIS)

    Holik, Federico; Sáenz, Manuel; Plastino, Angel

    2014-01-01

    We study the origin of quantum probabilities as arising from non-Boolean propositional-operational structures. We apply the method developed by Cox to non-distributive lattices and develop an alternative formulation of non-Kolmogorovian probability measures for quantum mechanics. By generalizing the method presented in previous works, we outline a general framework for the deduction of probabilities in general propositional structures represented by lattices (including the non-distributive case). -- Highlights: •Several recent works use a derivation similar to that of R.T. Cox to obtain quantum probabilities. •We apply Cox’s method to the lattice of subspaces of the Hilbert space. •We obtain a derivation of quantum probabilities which includes mixed states. •The method presented in this work is susceptible to generalization. •It includes quantum mechanics and classical mechanics as particular cases

  17. More efficient integrated safeguards by applying a reasonable detection probability for maintaining low presence probability of undetected nuclear proliferating activities

    International Nuclear Information System (INIS)

    Otsuka, Naoto

    2013-01-01

    Highlights: • A theoretical foundation is presented for more efficient Integrated Safeguards (IS). • Probability of undetected nuclear proliferation activities should be maintained low. • For nations under IS, the probability to start proliferation activities is very low. • The fact can decrease the detection probability of IS by dozens of percentage points. • The cost of IS per nation can be cut down by reducing inspection frequencies etc. - Abstract: A theoretical foundation is presented for implementing more efficiently the present International Atomic Energy Agency (IAEA) integrated safeguards (IS) on the basis of fuzzy evaluation of the probability that the evaluated nation will continue peaceful activities. It is shown that by determining the presence probability of undetected nuclear proliferating activities, nations under IS can be maintained at acceptably low proliferation risk levels even if the detection probability of the current IS is decreased by dozens of percentage points from the present value. This makes it possible to reduce inspection frequency and the number of collected samples, allowing the IAEA to cut costs per nation. This will contribute to further promotion and application of IS to more nations by the IAEA, and more efficient utilization of IAEA resources from the viewpoint of the whole IS framework

  18. The Probability of Uranium Deposit Occurrences at Hatapang and Its Surrounding

    International Nuclear Information System (INIS)

    Soepradto-Tjokrokardono; Ngadenin

    2004-01-01

    This study was carried out based on the geological conditions of Hatapang and its surroundings, which are favourable for uranium accumulation, as indicated by the existence of granite with high uranium content, uranium mobilization processes, and uranium-trapping rocks. Referring to the plate tectonic and geochemical situation of Hatapang, these conditions give significant indications of the possible occurrence of uranium deposits in the area. The goal of this study is to assess the probability of uranium deposit occurrence based on the regional tectonic, geological, mineralogical, geochemical, and radioactivity characters. It is concluded that the Hatapang granite is a potential uranium source granite, and that a uranium deposit of black-shale type probably occurs in this area. (author)

  19. Bremsstrahlung emission probability in the {alpha} decay of {sup 210}Po

    Energy Technology Data Exchange (ETDEWEB)

    Boie, Hans-Hermann

    2009-06-03

    A high-statistics measurement of bremsstrahlung emitted in the {alpha} decay of {sup 210}Po has been performed. The measured differential emission probabilities, which could be followed up to {gamma}-energies of ∼ 500 keV, allow for the first time a serious test of various model calculations of the bremsstrahlung-accompanied {alpha} decay. It is shown that corrections to the {alpha}-{gamma} angular correlation due to the interference between the electric dipole and quadrupole amplitudes and due to the relativistic character of the process have to be taken into account. With the experimentally derived angular correlation, the measured energy-differential bremsstrahlung emission probabilities show excellent agreement with the fully quantum mechanical calculation. (orig.)

  20. Single Trial Probability Applications: Can Subjectivity Evade Frequency Limitations?

    Directory of Open Access Journals (Sweden)

    David Howden

    2009-10-01

    Full Text Available Frequency probability theorists define an event’s probability distribution as the limit of a repeated set of trials belonging to a homogeneous collective. The subsets of this collective are events about which we have deficient knowledge on an individual level, although for the larger collective we have knowledge of its aggregate behavior. Hence, probabilities can only be achieved through repeated trials of these subsets, arriving at the established frequencies that define the probabilities. Crovelli (2009) argues that this is a mistaken approach, and that a subjective assessment of individual trials should be used instead. Bifurcating between the two concepts of risk and uncertainty, Crovelli first asserts that probability is the tool used to manage uncertain situations, and then attempts to rebuild a definition of probability theory with this in mind. We show that such an attempt has little to gain, and results in an indeterminate application of entrepreneurial forecasting to uncertain decisions, a process far removed from any application of probability theory.

  1. Predicting Cumulative Incidence Probability by Direct Binomial Regression

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    Binomial modelling; cumulative incidence probability; cause-specific hazards; subdistribution hazard

  2. Subjective Probabilities for State-Dependent Continuous Utility

    NARCIS (Netherlands)

    P.P. Wakker (Peter)

    1987-01-01

    For the expected utility model with state dependent utilities, Karni, Schmeidler and Vind (1983) have shown how to recover uniquely the involved subjective probabilities if the preferences, contingent on a hypothetical probability distribution over the state space, are known. This they

  3. Surprisingly rational: probability theory plus noise explains biases in judgment.

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2014-07-01

    The systematic biases seen in people's probability judgments are typically taken as evidence that people do not use the rules of probability theory when reasoning about probability but instead use heuristics, which sometimes yield reasonable judgments and sometimes yield systematic biases. This view has had a major impact in economics, law, medicine, and other fields; indeed, the idea that people cannot reason with probabilities has become a truism. We present a simple alternative to this view, where people reason about probability according to probability theory but are subject to random variation or noise in the reasoning process. In this account the effect of noise is canceled for some probabilistic expressions. Analyzing data from 2 experiments, we find that, for these expressions, people's probability judgments are strikingly close to those required by probability theory. For other expressions, this account produces systematic deviations in probability estimates. These deviations explain 4 reliable biases in human probabilistic reasoning (conservatism, subadditivity, conjunction, and disjunction fallacies). These results suggest that people's probability judgments embody the rules of probability theory and that biases in those judgments are due to the effects of random noise. (c) 2014 APA, all rights reserved.
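    A rough simulation of the probability-theory-plus-noise idea (my own toy version, not the authors' full model: each memory sample is read correctly with probability 1 − d and misread with probability d) reproduces the conservatism bias, with judged probabilities pulled toward 0.5:

        import random

        rng = random.Random(0)

        def judged_probability(p_true, d=0.15, n_samples=100, n_trials=5000):
            """Mean judged probability when each memory sample is misread with probability d."""
            total = 0.0
            for _ in range(n_trials):
                hits = 0
                for _ in range(n_samples):
                    event = rng.random() < p_true      # sample the event from memory
                    if rng.random() < d:
                        event = not event              # noisy read flips the sample
                    hits += event
                total += hits / n_samples
            return total / n_trials

        for p in (0.1, 0.5, 0.9):
            # the expected judgment is d + (1 - 2d) * p, i.e. estimates regress toward 0.5
            print(p, round(judged_probability(p), 3))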

  4. Eruption probabilities for the Lassen Volcanic Center and regional volcanism, northern California, and probabilities for large explosive eruptions in the Cascade Range

    Science.gov (United States)

    Nathenson, Manuel; Clynne, Michael A.; Muffler, L.J. Patrick

    2012-01-01

    Chronologies for eruptive activity of the Lassen Volcanic Center and for eruptions from the regional mafic vents in the surrounding area of the Lassen segment of the Cascade Range are here used to estimate probabilities of future eruptions. For the regional mafic volcanism, the ages of many vents are known only within broad ranges, and two models are developed that should bracket the actual eruptive ages. These chronologies are used with exponential, Weibull, and mixed-exponential probability distributions to match the data for time intervals between eruptions. For the Lassen Volcanic Center, the probability of an eruption in the next year is 1.4×10⁻⁴ for the exponential distribution and 2.3×10⁻⁴ for the mixed exponential distribution. For the regional mafic vents, the exponential distribution gives a probability of an eruption in the next year of 6.5×10⁻⁴, but the mixed exponential distribution indicates that the current probability, 12,000 years after the last event, could be significantly lower. For the exponential distribution, the highest probability is for an eruption from a regional mafic vent. Data on areas and volumes of lava flows and domes of the Lassen Volcanic Center and of eruptions from the regional mafic vents provide constraints on the probable sizes of future eruptions. Probabilities of lava-flow coverage are similar for the Lassen Volcanic Center and for regional mafic vents, whereas the probable eruptive volumes for the mafic vents are generally smaller. Data have been compiled for large explosive eruptions (≳5 km³ in deposit volume) in the Cascade Range during the past 1.2 m.y. in order to estimate probabilities of eruption. For erupted volumes ≳5 km³, the rate of occurrence since 13.6 ka is much higher than for the entire period, and we use these data to calculate the annual probability of a large eruption at 4.6×10⁻⁴. For erupted volumes ≥10 km³, the rate of occurrence has been reasonably constant from 630 ka to the present, giving
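    For the simplest of the models mentioned, the exponential (Poisson-process) recurrence distribution, the probability of at least one eruption in a forecast window follows directly from the mean recurrence interval. A short sketch (the recurrence interval below is a placeholder, not the Lassen chronology data):

        import math

        def eruption_probability(mean_recurrence_years, window_years=1.0):
            """P(at least one event in the window) for an exponential recurrence model."""
            rate = 1.0 / mean_recurrence_years
            return 1.0 - math.exp(-rate * window_years)

        # e.g. a hypothetical mean recurrence interval of about 7000 years gives an
        # annual probability of roughly 1.4e-4, the order of magnitude quoted above
        print(eruption_probability(7000.0, 1.0))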

  5. A Probability Co-Kriging Model to Account for Reporting Bias and Recognize Areas at High Risk for Zebra Mussels and Eurasian Watermilfoil Invasions in Minnesota

    Directory of Open Access Journals (Sweden)

    Kaushi S. T. Kanankege

    2018-01-01

    Full Text Available Zebra mussels (ZMs) (Dreissena polymorpha) and Eurasian watermilfoil (EWM) (Myriophyllum spicatum) are aggressive aquatic invasive species posing a conservation burden on Minnesota. Recognizing areas at high risk for invasion is a prerequisite for the implementation of risk-based prevention and mitigation management strategies. The early detection of invasion has been challenging, due in part to the imperfect observation process of invasions including the absence of a surveillance program, reliance on public reporting, and limited resource availability, which results in reporting bias. To predict the areas at high risk for invasions, while accounting for underreporting, we combined network analysis and probability co-kriging to estimate the risk of ZM and EWM invasions. We used network analysis to generate a waterbody-specific variable representing boater traffic, a known high risk activity for human-mediated transportation of invasive species. In addition, co-kriging was used to estimate the probability of species introduction, using waterbody-specific variables. A co-kriging model containing distance to the nearest ZM infested location, boater traffic, and road access was used to recognize the areas at high risk for ZM invasions (AUC = 0.78). The EWM co-kriging model included distance to the nearest EWM infested location, boater traffic, and connectivity to infested waterbodies (AUC = 0.76). Results suggested that, by 2015, nearly 20% of the waterbodies in Minnesota were at high risk of ZM (12.45%) or EWM (12.43%) invasions, whereas only 125/18,411 (0.67%) and 304/18,411 (1.65%) are currently infested, respectively. Prediction methods presented here can support decisions related to solving the problems of imperfect detection, which subsequently improve the early detection of biological invasions.

  6. A Probability Co-Kriging Model to Account for Reporting Bias and Recognize Areas at High Risk for Zebra Mussels and Eurasian Watermilfoil Invasions in Minnesota.

    Science.gov (United States)

    Kanankege, Kaushi S T; Alkhamis, Moh A; Phelps, Nicholas B D; Perez, Andres M

    2017-01-01

    Zebra mussels (ZMs) (Dreissena polymorpha) and Eurasian watermilfoil (EWM) (Myriophyllum spicatum) are aggressive aquatic invasive species posing a conservation burden on Minnesota. Recognizing areas at high risk for invasion is a prerequisite for the implementation of risk-based prevention and mitigation management strategies. The early detection of invasion has been challenging, due in part to the imperfect observation process of invasions including the absence of a surveillance program, reliance on public reporting, and limited resource availability, which results in reporting bias. To predict the areas at high risk for invasions, while accounting for underreporting, we combined network analysis and probability co-kriging to estimate the risk of ZM and EWM invasions. We used network analysis to generate a waterbody-specific variable representing boater traffic, a known high risk activity for human-mediated transportation of invasive species. In addition, co-kriging was used to estimate the probability of species introduction, using waterbody-specific variables. A co-kriging model containing distance to the nearest ZM infested location, boater traffic, and road access was used to recognize the areas at high risk for ZM invasions (AUC = 0.78). The EWM co-kriging model included distance to the nearest EWM infested location, boater traffic, and connectivity to infested waterbodies (AUC = 0.76). Results suggested that, by 2015, nearly 20% of the waterbodies in Minnesota were at high risk of ZM (12.45%) or EWM (12.43%) invasions, whereas only 125/18,411 (0.67%) and 304/18,411 (1.65%) are currently infested, respectively. Prediction methods presented here can support decisions related to solving the problems of imperfect detection, which subsequently improve the early detection of biological invasions.

  7. Predicting binary choices from probability phrase meanings.

    Science.gov (United States)

    Wallsten, Thomas S; Jang, Yoonhee

    2008-08-01

    The issues of how individuals decide which of two events is more likely and of how they understand probability phrases both involve judging relative likelihoods. In this study, we investigated whether derived scales representing probability phrase meanings could be used within a choice model to predict independently observed binary choices. If they can, this simultaneously provides support for our model and suggests that the phrase meanings are measured meaningfully. The model assumes that, when deciding which of two events is more likely, judges take a single sample from memory regarding each event and respond accordingly. The model predicts choice probabilities by using the scaled meanings of individually selected probability phrases as proxies for confidence distributions associated with sampling from memory. Predictions are sustained for 34 of 41 participants but, nevertheless, are biased slightly low. Sequential sampling models improve the fit. The results have both theoretical and applied implications.

  8. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    Kim, Y.K.

    1980-01-01

    Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods

  9. Operational Earthquake Forecasting and Decision-Making in a Low-Probability Environment

    Science.gov (United States)

    Jordan, T. H.; the International Commission on Earthquake ForecastingCivil Protection

    2011-12-01

    Operational earthquake forecasting (OEF) is the dissemination of authoritative information about the time dependence of seismic hazards to help communities prepare for potentially destructive earthquakes. Most previous work on the public utility of OEF has anticipated that forecasts would deliver high probabilities of large earthquakes; i.e., deterministic predictions with low error rates (false alarms and failures-to-predict) would be possible. This expectation has not been realized. An alternative to deterministic prediction is probabilistic forecasting based on empirical statistical models of aftershock triggering and seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains in excess of 100 relative to long-term forecasts. The utility of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing OEF in this sort of "low-probability environment." The need to move more quickly has been underscored by recent seismic crises, such as the 2009 L'Aquila earthquake sequence, in which an anxious public was confused by informal and inaccurate earthquake predictions. After the L'Aquila earthquake, the Italian Department of Civil Protection appointed an International Commission on Earthquake Forecasting (ICEF), which I chaired, to recommend guidelines for OEF utilization. Our report (Ann. Geophys., 54, 4, 2011; doi: 10.4401/ag-5350) concludes: (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and timely, and need to convey epistemic uncertainties. (b) Earthquake probabilities should be based on operationally qualified, regularly updated forecasting systems. (c) All operational models should be evaluated

  10. The transition probabilities of the reciprocity model

    NARCIS (Netherlands)

    Snijders, T.A.B.

    1999-01-01

    The reciprocity model is a continuous-time Markov chain model used for modeling longitudinal network data. A new explicit expression is derived for its transition probability matrix. This expression can be checked relatively easily. Some properties of the transition probabilities are given, as well

  11. Using the Reliability Theory for Assessing the Decision Confidence Probability for Comparative Life Cycle Assessments.

    Science.gov (United States)

    Wei, Wei; Larrey-Lassalle, Pyrène; Faure, Thierry; Dumoulin, Nicolas; Roux, Philippe; Mathias, Jean-Denis

    2016-03-01

    Comparative decision making is widely used to identify which option (system, product, service, etc.) has smaller environmental footprints and to provide recommendations that help stakeholders take future decisions. However, the uncertainty problem complicates the comparison and the decision making. Probability-based decision support in LCA is a way to help stakeholders in their decision-making process. It calculates the decision confidence probability, which expresses the probability of one option having a smaller environmental impact than another option. Here we apply reliability theory to approximate the decision confidence probability. We compare the traditional Monte Carlo method with a reliability method called the FORM method. The Monte Carlo method needs high computational time to calculate the decision confidence probability. The FORM method enables us to approximate the decision confidence probability with fewer simulations than the Monte Carlo method by approximating the response surface. Moreover, the FORM method calculates the associated importance factors that correspond to a sensitivity analysis in relation to the probability. The importance factors allow stakeholders to determine which factors influence their decision. Our results clearly show that the reliability method provides additional useful information to stakeholders while reducing the computational time.
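    As a point of comparison with the FORM approach discussed above, the baseline Monte Carlo estimate of a decision confidence probability, i.e. the probability that option A has a smaller environmental impact than option B, can be sketched as follows (the lognormal impact distributions are illustrative assumptions, not LCA data):

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000

        # Hypothetical uncertain impact scores for two options (e.g. kg CO2-eq).
        impact_a = rng.lognormal(mean=np.log(10.0), sigma=0.20, size=n)
        impact_b = rng.lognormal(mean=np.log(11.0), sigma=0.25, size=n)

        # Decision confidence probability: P(impact_A < impact_B).
        confidence = np.mean(impact_a < impact_b)
        print(confidence)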

  12. Hamiltonian theories quantization based on a probability operator

    International Nuclear Information System (INIS)

    Entral'go, E.E.

    1986-01-01

    A quantization method based on a linear mapping of classical coordinate-momentum-time functions Λ(q,p,t) to quantum operators in a space of quantum states ψ is considered. The probability operator satisfies a system of equations representing the principles of dynamical and canonical correspondence between the classical and quantum theories. The quantization based on a probability operator leads to a quantum theory with a nonnegative joint coordinate-momentum distribution function for any state ψ. The main consequences of quantum mechanics with a probability operator are discussed in comparison with the generally accepted quantum and classical theories. It is shown that a probability operator leads to the appearance of some new notions called ''subquantum'' ones. Hence the quantum theory with a probability operator does not pretend to any complete description of physical reality in terms of classical variables and for this reason contains no problems like the Einstein-Podolsky-Rosen paradox. The results of some concrete problems are given: a free particle, a harmonic oscillator, an electron in the Coulomb field. These results give hope for the possibility of an experimental verification of the quantization based on a probability operator

  13. Fitness Probability Distribution of Bit-Flip Mutation.

    Science.gov (United States)

    Chicano, Francisco; Sutton, Andrew M; Whitley, L Darrell; Alba, Enrique

    2015-01-01

    Bit-flip mutation is a common mutation operator for evolutionary algorithms applied to optimize functions over binary strings. In this paper, we develop results from the theory of landscapes and Krawtchouk polynomials to exactly compute the probability distribution of fitness values of a binary string undergoing uniform bit-flip mutation. We prove that this probability distribution can be expressed as a polynomial in p, the probability of flipping each bit. We analyze these polynomials and provide closed-form expressions for an easy linear problem (Onemax), and an NP-hard problem, MAX-SAT. We also discuss a connection of the results with runtime analysis.
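    For the Onemax case mentioned above, the fitness distribution after uniform bit-flip mutation can be written down exactly: if the parent has k ones among n bits, the offspring's number of ones is the sum of a Binomial(k, 1 − p) count (ones that survive) and a Binomial(n − k, p) count (zeros that flip). A short sketch computing this pmf by convolution (my own restatement, not the paper's Krawtchouk-polynomial derivation):

        import numpy as np
        from scipy.stats import binom

        def onemax_fitness_pmf(n, k, p):
            """Exact pmf of the offspring Onemax fitness after flipping each of n bits
            with probability p, given a parent with k ones."""
            surviving_ones = binom.pmf(np.arange(k + 1), k, 1 - p)      # ones kept
            flipped_zeros = binom.pmf(np.arange(n - k + 1), n - k, p)   # zeros turned to ones
            return np.convolve(surviving_ones, flipped_zeros)           # pmf over 0..n ones

        pmf = onemax_fitness_pmf(n=20, k=12, p=0.05)
        print(pmf.sum())              # ~1.0
        print(pmf.argmax())           # most probable offspring fitness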

  14. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.

  15. Modeling experiments using quantum and Kolmogorov probability

    International Nuclear Information System (INIS)

    Hess, Karl

    2008-01-01

    Criteria are presented that permit a straightforward partition of experiments into sets that can be modeled using both quantum probability and the classical probability framework of Kolmogorov. These new criteria concentrate on the operational aspects of the experiments and lead beyond the commonly appreciated partition by relating experiments to commuting and non-commuting quantum operators as well as non-entangled and entangled wavefunctions. In other words the space of experiments that can be understood using classical probability is larger than usually assumed. This knowledge provides advantages for areas such as nanoscience and engineering or quantum computation.

  16. Probability calculations for three-part mineral resource assessments

    Science.gov (United States)

    Ellefsen, Karl J.

    2017-06-27

    Three-part mineral resource assessment is a methodology for predicting, in a specified geographic region, both the number of undiscovered mineral deposits and the amount of mineral resources in those deposits. These predictions are based on probability calculations that are performed with newly implemented computer software. Compared to the previous implementation, the new implementation includes new features for the probability calculations themselves and for checks of those calculations. The development of the new implementation led to a new understanding of the probability calculations, namely the assumptions inherent in them. Several assumptions strongly affect the mineral resource predictions, so it is crucial that they are checked during an assessment. The evaluation of the new implementation leads to new findings about the probability calculations, namely findings regarding the precision of the computations, the computation time, and the sensitivity of the calculation results to the input.
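    A stripped-down sketch of the kind of probability calculation involved (my own toy version, not the USGS implementation: an elicited distribution over the number of undiscovered deposits combined with a lognormal tonnage model per deposit, propagated by Monte Carlo):

        import numpy as np

        rng = np.random.default_rng(0)
        n_sim = 50_000

        # Hypothetical elicited distribution for the number of undiscovered deposits.
        deposit_counts = np.array([0, 1, 2, 3, 4])
        count_probs = np.array([0.30, 0.35, 0.20, 0.10, 0.05])

        totals = np.zeros(n_sim)
        for i in range(n_sim):
            n_dep = rng.choice(deposit_counts, p=count_probs)
            if n_dep > 0:
                # Hypothetical lognormal tonnage per deposit (median 1 Mt).
                totals[i] = rng.lognormal(mean=np.log(1.0), sigma=1.0, size=n_dep).sum()

        print(np.mean(totals), np.percentile(totals, [10, 50, 90]))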

  17. Independent events in elementary probability theory

    Science.gov (United States)

    Csenki, Attila

    2011-07-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): if the n events E1, E2, …, En are jointly independent, then any two events A and B built in finitely many steps from two disjoint subsets of E1, E2, …, En are also independent. The operations 'union', 'intersection' and 'complementation' are permitted only when forming the events A and B. Here we examine this statement from the point of view of elementary probability theory. The approach described here is accessible also to users of probability theory and is believed to be novel.
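    The statement quoted above is easy to check numerically in small cases. A minimal sketch (my own illustration: exhaustive enumeration over three jointly independent events with assumed probabilities, and A and B built from disjoint subsets of them):

        from itertools import product

        p = {"E1": 0.3, "E2": 0.6, "E3": 0.8}    # jointly independent events (assumed probabilities)

        def prob(condition):
            """Total probability of outcomes (e1, e2, e3) satisfying `condition`."""
            total = 0.0
            for e1, e2, e3 in product([True, False], repeat=3):
                w = ((p["E1"] if e1 else 1 - p["E1"]) *
                     (p["E2"] if e2 else 1 - p["E2"]) *
                     (p["E3"] if e3 else 1 - p["E3"]))
                if condition(e1, e2, e3):
                    total += w
            return total

        A = lambda e1, e2, e3: e1 or e2           # built from {E1, E2}
        B = lambda e1, e2, e3: not e3             # built from {E3}
        AB = lambda e1, e2, e3: A(e1, e2, e3) and B(e1, e2, e3)

        print(prob(AB), prob(A) * prob(B))        # equal, so A and B are independent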

  18. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  19. Transition Probabilities in {sup 189}Os

    Energy Technology Data Exchange (ETDEWEB)

    Malmskog, S G; Berg, V; Baecklin, A

    1970-02-15

    The level structure of {sup 189}Os has been studied from the decay of {sup 189}Ir (13.3 days) produced in proton spallation at CERN and mass separated in the ISOLDE on-line facility. The gamma-ray spectrum has been recorded with both a high-resolution Si(Li) detector and Ge(Li) detectors. Three previously unreported transitions were observed, defining a new level at 348.5 keV. Special attention was given to the low-energy level band structure. Several multipolarity mixing ratios were deduced from measured L-subshell ratios which, together with measured level half-lives, gave absolute transition probabilities. The decay properties of the low-lying levels are discussed in terms of the Nilsson model with the inclusion of Coriolis coupling.

  20. Quantum interference of probabilities and hidden variable theories

    International Nuclear Information System (INIS)

    Srinivas, M.D.

    1984-01-01

    One of the fundamental contributions of Louis de Broglie, which does not get cited often, has been his analysis of the basic difference between the calculus of the probabilities as predicted by quantum theory and the usual calculus of probabilities - the one employed by most mathematicians, in its standard axiomatised version due to Kolmogorov. This paper is basically devoted to a discussion of the 'quantum interference of probabilities', discovered by de Broglie. In particular, it is shown that it is this feature of the quantum theoretic probabilities which leads to some serious constraints on the possible 'hidden-variable formulations' of quantum mechanics, including the celebrated theorem of Bell. (Auth.)

  1. A Challenge to Ludwig von Mises’s Theory of Probability

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2010-10-01

    Full Text Available The most interesting and completely overlooked aspect of Ludwig von Mises’s theory of probability is the total absence of any explicit definition for probability in his theory. This paper examines Mises’s theory of probability in light of the fact that his theory possesses no definition for probability. It is argued, first, that Mises’s theory differs in important respects from his brother’s famous theory of probability. A defense of the subjective definition for probability is then provided, which is subsequently used to critique Ludwig von Mises’s theory. It is argued that only the subjective definition for probability comports with Mises’s other philosophical positions. Since Mises did not provide an explicit definition for probability, it is suggested that he ought to have adopted a subjective definition.

  2. A Probability-Based Hybrid User Model for Recommendation System

    Directory of Open Access Journals (Sweden)

    Jia Hao

    2016-01-01

    Full Text Available With the rapid development of information communication technology, the available information or knowledge has increased exponentially, causing the well-known information overload phenomenon. This problem is more serious in product design corporations because over half of the valuable design time is consumed in knowledge acquisition, which greatly extends the design cycle and weakens competitiveness. Therefore, recommender systems are very important in the product design domain. This research presents a probability-based hybrid user model, which is a combination of collaborative filtering and content-based filtering. This hybrid model utilizes user ratings and item topics or classes, which are available in the domain of product design, to predict the knowledge requirement. The comprehensive analysis of the experimental results shows that the proposed method achieves better performance in most of the parameter settings. This work contributes a probability-based method to the community for implementing recommender systems when only user ratings and item topics are available.
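    A very small sketch of how ratings and item topics might be combined into such a hybrid score (my own illustration, not the paper's model: a topic-conditional "like" probability from the user's own history, blended with an estimate from similar users' ratings):

        def hybrid_score(user_topic_history, item_topic, neighbor_ratings, w_content=0.5):
            """Blend a content-based probability with a collaborative-filtering estimate.

            user_topic_history: dict topic -> (n_liked, n_seen) for this user
            item_topic:         topic label of the candidate item
            neighbor_ratings:   list of 0/1 ratings of this item by similar users
            """
            liked, seen = user_topic_history.get(item_topic, (0, 0))
            p_content = (liked + 1) / (seen + 2)              # Laplace-smoothed topic probability
            p_collab = (sum(neighbor_ratings) + 1) / (len(neighbor_ratings) + 2)
            return w_content * p_content + (1 - w_content) * p_collab

        history = {"bearings": (8, 10), "gearboxes": (1, 6)}   # hypothetical design-knowledge topics
        print(hybrid_score(history, "bearings", [1, 1, 0, 1]))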

  3. Cognitive-psychology expertise and the calculation of the probability of a wrongful conviction.

    Science.gov (United States)

    Rouder, Jeffrey N; Wixted, John T; Christenfeld, Nicholas J S

    2018-05-08

    Cognitive psychologists are familiar with how their expertise in understanding human perception, memory, and decision-making is applicable to the justice system. They may be less familiar with how their expertise in statistical decision-making and their comfort working in noisy real-world environments is just as applicable. Here we show how this expertise in ideal-observer models may be leveraged to calculate the probability of guilt of Gary Leiterman, a man convicted of murder on the basis of DNA evidence. We show by common probability theory that Leiterman is likely a victim of a tragic contamination event rather than a murderer. Making any calculation of the probability of guilt necessarily relies on subjective assumptions. The conclusion about Leiterman's innocence is not overly sensitive to the assumptions-the probability of innocence remains high for a wide range of reasonable assumptions. We note that cognitive psychologists may be well suited to make these calculations because as working scientists they may be comfortable with the role a reasonable degree of subjectivity plays in analysis.

  4. Statistical probability tables CALENDF program

    International Nuclear Information System (INIS)

    Ribon, P.

    1989-01-01

    The purpose of the probability tables is (1) to obtain a dense data representation and (2) to calculate integrals by quadratures. They are mainly used in the USA for calculations by Monte Carlo and in the USSR and Europe for self-shielding calculations by the sub-group method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity

  5. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data are scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (e.g., Dempster-Shafer theory or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider whether experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  6. A note on iterated function systems with discontinuous probabilities

    International Nuclear Information System (INIS)

    Jaroszewska, Joanna

    2013-01-01

    Highlights: ► A certain iterated function system with discontinuous probabilities is discussed. ► Existence of an invariant measure via the Schauder–Tychonov theorem is established. ► Asymptotic stability of the system under examination is proved. -- Abstract: We consider an example of an iterated function system with discontinuous probabilities. We prove that it possesses an invariant probability measure. We also prove that it is asymptotically stable provided the probabilities are positive

  7. Evaluation of Correlation between Pretest Probability for Clostridium difficile Infection and Clostridium difficile Enzyme Immunoassay Results.

    Science.gov (United States)

    Kwon, Jennie H; Reske, Kimberly A; Hink, Tiffany; Burnham, C A; Dubberke, Erik R

    2017-02-01

    The objective of this study was to evaluate the clinical characteristics and outcomes of hospitalized patients tested for Clostridium difficile and determine the correlation between pretest probability for C. difficile infection (CDI) and assay results. Patients with testing ordered for C. difficile were enrolled and assigned a high, medium, or low pretest probability of CDI based on clinical evaluation, laboratory, and imaging results. Stool was tested for C. difficile by toxin enzyme immunoassay (EIA) and toxigenic culture (TC). Chi-square analyses and the log rank test were utilized. Among the 111 patients enrolled, stool samples from nine were TC positive and four were EIA positive. Sixty-one (55%) patients had clinically significant diarrhea, 19 (17%) patients did not, and clinically significant diarrhea could not be determined for 31 (28%) patients. Seventy-two (65%) patients were assessed as having a low pretest probability of having CDI, 34 (31%) as having a medium probability, and 5 (5%) as having a high probability. None of the patients with low pretest probabilities had a positive EIA, but four were TC positive. None of the seven patients with a positive TC but a negative index EIA developed CDI within 30 days after the index test or died within 90 days after the index toxin EIA date. Pretest probability for CDI should be considered prior to ordering C. difficile testing and must be taken into account when interpreting test results. CDI is a clinical diagnosis supported by laboratory data, and the detection of toxigenic C. difficile in stool does not necessarily confirm the diagnosis of CDI. Copyright © 2017 American Society for Microbiology.

  8. Assessing wildfire occurrence probability in Pinus pinaster Ait. stands in Portugal

    Energy Technology Data Exchange (ETDEWEB)

    Marques, S.; Garcia-Gonzalo, J.; Botequim, B.; Ricardo, A.; Borges, J. G.; Tome, M.; Oliveira, M. M.

    2012-11-01

    Maritime pine (Pinus pinaster Ait.) is an important conifer from the western Mediterranean Basin extending over 22% of the forest area in Portugal. In the last three decades nearly 4% of the Maritime pine area has been burned by wildfires. Yet no wildfire occurrence probability models are available, and forest and fire management planning activities are thus carried out mostly independently of each other. This paper presents research to address this gap. Specifically, it presents a model to assess wildfire occurrence probability in regular and pure Maritime pine stands in Portugal. Emphasis was on developing a model based on easily available inventory data so that it might be useful to forest managers. For that purpose, data from the last two Portuguese National Forest Inventories (NFI) and data from wildfire perimeters in the years from 1998 to 2004 and from 2006 to 2007 were used. A binary logistic regression model was built using biometric data from the NFI. The biometric data included indicators that might be changed by operations prescribed in forest planning. Results showed that the probability of wildfire occurrence increases for stands located on steeper slopes and with a high shrub load, while it decreases with precipitation and with stand basal area. These results are instrumental for assessing the impact of forest management options on wildfire probability, thus helping forest managers to reduce the risk of wildfires. (Author) 57 refs.
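
    To make the modelling step concrete, the sketch below fits a binary logistic regression with the covariates named in the abstract (slope, shrub load, precipitation, stand basal area) to synthetic data; the data, units and coefficients are invented for illustration and are not the fitted Portuguese model.

```python
# Synthetic illustration of a wildfire-occurrence logistic regression; covariate
# names follow the abstract, but the data, units and coefficients are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
slope = rng.uniform(0, 40, n)        # stand slope (degrees)
shrubs = rng.uniform(0, 30, n)       # shrub load (t/ha)
precip = rng.uniform(500, 1500, n)   # annual precipitation (mm)
basal_area = rng.uniform(5, 40, n)   # stand basal area (m2/ha)

# Synthetic "truth" consistent with the reported signs of the effects.
logit = -1.0 + 0.05 * slope + 0.06 * shrubs - 0.002 * precip - 0.04 * basal_area
burned = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([slope, shrubs, precip, basal_area])
model = LogisticRegression(max_iter=5000).fit(X, burned)

# Predicted wildfire occurrence probability for a steep, shrubby, dry stand.
print(model.predict_proba([[35, 25, 600, 10]])[0, 1])
```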

  9. Probability of islanding in utility networks due to grid connected photovoltaic power systems

    Energy Technology Data Exchange (ETDEWEB)

    Verhoeven, B.

    2002-09-15

    This report for the International Energy Agency (IEA), prepared by Task 5 of the Photovoltaic Power Systems (PVPS) programme, examines the probability of islanding in utility networks due to grid-connected photovoltaic power systems. The mission of the Photovoltaic Power Systems Programme is to enhance the international collaboration efforts which accelerate the development and deployment of photovoltaic solar energy. Task 5 deals with issues concerning grid-interconnection and distributed PV power systems. This report summarises the results of a study of the probability of islanding in power networks with a high penetration level of grid-connected PV systems. The results are based on measurements performed during one year in a Dutch utility network. The measurements of active and reactive power were taken every second for two years and stored in a computer for off-line analysis. The area examined and its characteristics are described, as are the test set-up and the equipment used. The ratios between load and PV power are discussed. The general conclusion is that the probability of islanding is virtually zero for low, medium and high penetration levels of PV systems.

  10. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  11. Bayesian optimization for computationally extensive probability distributions.

    Science.gov (United States)

    Tamura, Ryo; Hukushima, Koji

    2018-01-01

    An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use the extreme values of acquisition functions obtained by Gaussian processes as the next training points, which should be located near a local or global maximum of the probability distribution. Our Bayesian optimization technique is applied to the posterior distribution in effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distribution is fixed to be small, Bayesian optimization provides a better maximizer of the posterior distribution than the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, Bayesian optimization improves the results efficiently when combined with the steepest descent method, and thus it is a powerful tool for searching for a better maximizer of computationally extensive probability distributions.
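
    A hedged sketch of a Bayesian-optimization loop of the general kind described above: a Gaussian-process surrogate and an upper-confidence-bound acquisition rule locate the maximizer of a toy "expensive" log-density. The kernel, acquisition function and target density are assumptions of this example, not the authors' exact settings.

```python
# Toy Bayesian-optimization loop: a Gaussian-process surrogate plus an
# upper-confidence-bound acquisition locates the maximizer of an "expensive"
# log-density (the density, kernel and acquisition rule are assumptions).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_log_density(x):
    return -0.5 * ((x - 1.7) / 0.3) ** 2   # pretend each call is costly

grid = np.linspace(-3, 3, 601).reshape(-1, 1)
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(3, 1))        # small initial design
y = np.array([expensive_log_density(x[0]) for x in X])

for _ in range(15):
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0),
                                  alpha=1e-6, normalize_y=True).fit(X, y)
    mean, std = gp.predict(grid, return_std=True)
    ucb = mean + 2.0 * std                 # acquisition: upper confidence bound
    x_next = grid[np.argmax(ucb)]
    X = np.vstack([X, x_next])
    y = np.append(y, expensive_log_density(x_next[0]))

print("best x found:", X[np.argmax(y), 0])  # should land near 1.7
```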

  12. Characteristic length of the knotting probability revisited

    International Nuclear Information System (INIS)

    Uehara, Erica; Deguchi, Tetsuo

    2015-01-01

    We present a self-avoiding polygon (SAP) model for circular DNA in which the radius of impermeable cylindrical segments corresponds to the screening length of double-stranded DNA surrounded by counter ions. For the model we evaluate the probability for a generated SAP with N segments having a given knot K through simulation. We call it the knotting probability of a knot K with N segments for the SAP model. We show that when N is large the most significant factor in the knotting probability is given by the exponentially decaying part exp(−N/N_K), where the estimates of parameter N_K are consistent with the same value for all the different knots we investigated. We thus call it the characteristic length of the knotting probability. We give formulae expressing the characteristic length as a function of the cylindrical radius r_ex, i.e. the screening length of double-stranded DNA. (paper)

  13. Effect of Urban Green Spaces and Flooded Area Type on Flooding Probability

    Directory of Open Access Journals (Sweden)

    Hyomin Kim

    2016-01-01

    Countermeasures to urban flooding should consider long-term perspectives, because climate change impacts are unpredictable and complex. Urban green spaces have emerged as a potential option to reduce urban flood risks, and their effectiveness has been highlighted in notable urban water management studies. In this study, flooded areas in Seoul, Korea, were divided into four flooded area types by cluster analysis based on topographic and physical characteristics and verified using discriminant analysis. After division by flooded area type, logistic regression analysis was performed to determine how the flooding probability changes with variations in green space area. Type 1 included regions where flooding occurred in a drainage basin that had a flood risk management infrastructure (FRMI). In Type 2, the slope was steep; the TWI (Topographic Wetness Index) was relatively low; and soil drainage was favorable. Type 3 represented the gentlest sloping areas, and these were associated with the highest TWI values. In addition, these areas had the worst soil drainage. Type 4 had moderate slopes, imperfect soil drainage and lower than average TWI values. We found that green spaces exerted a considerable influence on urban flooding probabilities in Seoul, and flooding probabilities could be reduced by over 50% depending on the green space area and the locations where green spaces were introduced. Increasing the area of green spaces was the most effective method of decreasing flooding probability in Type 3 areas. In Type 2 areas, the maximum hourly precipitation affected the flooding probability significantly, and the flooding probability in these areas was high despite the extensive green space area. These findings can contribute towards establishing guidelines for urban spatial planning to respond to urban flooding.

  14. A method to combine non-probability sample data with probability sample data in estimating spatial means of environmental variables

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    2003-01-01

    In estimating spatial means of environmental variables of a region from data collected by convenience or purposive sampling, validity of the results can be ensured by collecting additional data through probability sampling. The precision of the pi estimator that uses the probability sample can be

  15. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)
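
    The post-processing step described above reduces a stack of equally likely geostatistical realizations to a map of exceedance probabilities; a minimal sketch, with random stand-ins for the realizations and an invented action level, is given below.

```python
# Minimal post-processing sketch: many equally likely simulated contamination
# maps reduced to a per-cell probability of exceeding a clean-up threshold.
# The "realizations" here are random stand-ins, and the action level is invented.
import numpy as np

rng = np.random.default_rng(42)
n_realizations, ny, nx = 500, 20, 30

# Stand-in for geostatistical realizations of soil concentration (e.g. mg/kg).
simulations = rng.lognormal(mean=3.0, sigma=0.6, size=(n_realizations, ny, nx))

threshold = 35.0                                      # hypothetical action level
prob_exceed = (simulations > threshold).mean(axis=0)  # per-cell exceedance probability

print(prob_exceed.shape)                              # (20, 30) probability map
print(prob_exceed.min(), prob_exceed.max())
```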

  16. Probability mapping of contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).

  17. Factors influencing reporting and harvest probabilities in North American geese

    Science.gov (United States)

    Zimmerman, G.S.; Moser, T.J.; Kendall, W.L.; Doherty, P.F.; White, Gary C.; Caswell, D.F.

    2009-01-01

    We assessed variation in reporting probabilities of standard bands among species, populations, harvest locations, and size classes of North American geese to enable estimation of unbiased harvest probabilities. We included reward (US$10, $20, $30, $50, or $100) and control ($0) banded geese from 16 recognized goose populations of 4 species: Canada (Branta canadensis), cackling (B. hutchinsii), Ross's (Chen rossii), and snow geese (C. caerulescens). We incorporated spatially explicit direct recoveries and live recaptures into a multinomial model to estimate reporting, harvest, and band-retention probabilities. We compared various models for estimating harvest probabilities at country (United States vs. Canada), flyway (5 administrative regions), and harvest area (i.e., flyways divided into northern and southern sections) scales. Mean reporting probability of standard bands was 0.73 (95% CI 0.69–0.77). Point estimates of reporting probabilities for goose populations or spatial units varied from 0.52 to 0.93, but confidence intervals for individual estimates overlapped and model selection indicated that models with species, population, or spatial effects were less parsimonious than those without these effects. Our estimates were similar to recently reported estimates for mallards (Anas platyrhynchos). We provide current harvest probability estimates for these populations using our direct measures of reporting probability, improving the accuracy of previous estimates obtained from recovery probabilities alone. Goose managers and researchers throughout North America can use our reporting probabilities to correct recovery probabilities estimated from standard banding operations for deriving spatially explicit harvest probabilities.

  18. Generic Degraded Configuration Probability Analysis for DOE Codisposal Waste Package

    Energy Technology Data Exchange (ETDEWEB)

    S.F.A. Deng; M. Saglam; L.J. Gratton

    2001-05-23

    In accordance with the technical work plan, "Technical Work Plan For: Department of Energy Spent Nuclear Fuel Work Packages" (CRWMS M&O 2000c), this Analysis/Model Report (AMR) is developed for the purpose of screening out degraded configurations for U.S. Department of Energy (DOE) spent nuclear fuel (SNF) types. It performs the degraded configuration parameter and probability evaluations of the overall methodology specified in the "Disposal Criticality Analysis Methodology Topical Report" (YMP 2000, Section 3) to qualifying configurations. Degradation analyses are performed to assess realizable parameter ranges and physical regimes for configurations. Probability calculations are then performed for configurations characterized by k_eff in excess of the Critical Limit (CL). The scope of this document is to develop a generic set of screening criteria or models to screen out degraded configurations having potential for exceeding a criticality limit. The developed screening criteria include arguments based on physical/chemical processes and probability calculations, and apply to DOE SNF types when codisposed with the high-level waste (HLW) glass inside a waste package. The degradation takes place inside the waste package and occurs long after repository licensing has expired. The emphasis of this AMR is on degraded configuration screening, and the probability analysis is one of the approaches used for screening. The intended use of the model is to apply the developed screening criteria to each DOE SNF type following the completion of the degraded mode criticality analysis internal to the waste package.

  19. Probability functions in the context of signed involutive meadows

    NARCIS (Netherlands)

    Bergstra, J.A.; Ponse, A.

    2016-01-01

    The Kolmogorov axioms for probability functions are placed in the context of signed meadows. A completeness theorem is stated and proven for the resulting equational theory of probability calculus. Elementary definitions of probability theory are restated in this framework.

  20. Impact of high-flux haemodialysis on the probability of target attainment for oral amoxicillin/clavulanic acid combination therapy.

    Science.gov (United States)

    Hui, Katrina; Patel, Kashyap; Kong, David C M; Kirkpatrick, Carl M J

    2017-07-01

    Clearance of small molecules such as amoxicillin and clavulanic acid is expected to increase during high-flux haemodialysis, which may result in lower concentrations and thus reduced efficacy. To date, clearance of amoxicillin/clavulanic acid (AMC) during high-flux haemodialysis remains largely unexplored. Using published pharmacokinetic parameters, a two-compartment model with first-order input was simulated to investigate the impact of high-flux haemodialysis on the probability of target attainment (PTA) of orally administered AMC combination therapy. The following pharmacokinetic/pharmacodynamic targets were used to calculate the PTA. For amoxicillin, the target was a free concentration above the minimum inhibitory concentration (MIC) for ≥50% of the dosing period (≥50% fT>MIC). For clavulanic acid, the target was a free concentration above 0.1 mg/L for ≥45% of the dosing period (≥45% fT>0.1 mg/L). Dialysis clearance reported in low-flux haemodialysis for both compounds was doubled to represent the likely clearance during high-flux haemodialysis. Monte Carlo simulations were performed to produce concentration-time profiles over 10 days in 1000 virtual patients. Seven different regimens commonly seen in clinical practice were explored. When AMC was dosed twice daily, the PTA was mostly ≥90% for both compounds regardless of when haemodialysis commenced. When administered once daily, the PTA was 20-30% for clavulanic acid and ≥90% for amoxicillin. The simulations suggest that once-daily orally administered AMC in patients receiving high-flux haemodialysis may result in insufficient concentrations of clavulanic acid to effectively treat infections, especially on days when haemodialysis occurs. Copyright © 2017 Elsevier B.V. and International Society of Chemotherapy. All rights reserved.
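
    A simplified sketch of a probability-of-target-attainment (PTA) calculation is given below. For brevity it uses a one-compartment first-order-absorption model and a single dose rather than the study's two-compartment model with haemodialysis clearance, and every parameter value is an invented placeholder.

```python
# Simplified PTA sketch: one-compartment, first-order absorption, single oral
# dose; every parameter (dose, clearance, volume, MIC, variability) is invented.
import numpy as np

rng = np.random.default_rng(7)
n_patients = 1000
dose_mg, tau_h, mic = 500.0, 12.0, 8.0               # assumed scenario

# Log-normal between-subject variability on clearance and volume (assumed).
cl = rng.lognormal(np.log(2.0), 0.25, n_patients)    # L/h (reduced renal function)
v = rng.lognormal(np.log(20.0), 0.20, n_patients)    # L
ka = 1.0                                             # 1/h, absorption rate constant

t = np.linspace(0.0, tau_h, 241)
ke = cl / v
# Concentration-time profile for each virtual patient (mg/L).
conc = (dose_mg / v[:, None]) * ka / (ka - ke[:, None]) * (
    np.exp(-ke[:, None] * t) - np.exp(-ka * t))

f_t_above_mic = (conc > mic).mean(axis=1)            # fraction of interval above MIC
pta = (f_t_above_mic >= 0.5).mean()                  # target: >=50% fT>MIC
print(f"PTA = {pta:.2f}")
```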

  1. Using Thermal Inactivation Kinetics to Calculate the Probability of Extreme Spore Longevity: Implications for Paleomicrobiology and Lithopanspermia

    Science.gov (United States)

    Nicholson, Wayne L.

    2003-12-01

    Thermal inactivation kinetics with extrapolation were used to model the survival probabilities of spores of various Bacillus species over time periods of millions of years at the historical ambient temperatures (25-40 °C) encountered within the 250 million-year-old Salado formation, from which the putative ancient spore-forming bacterium Salibacillus marismortui strain 2-9-3 was recovered. The model indicated extremely low-to-moderate survival probabilities for spores of mesophiles, but surprisingly high survival probabilities for thermophilic spores. The significance of the results is discussed in terms of the survival probabilities of (i) terrestrial spores in ancient geologic samples and (ii) spores transported between planets within impact ejecta.
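
    A back-of-the-envelope sketch of the extrapolation idea: log-linear (D-value) inactivation kinetics applied over geological time, with the probability that at least one spore in a sample survives. The D-values and initial population below are placeholders, not the measured values from the study.

```python
# Back-of-the-envelope survival extrapolation with log-linear (D-value)
# kinetics; the D-values and initial spore count are placeholders.
import math

def surviving_fraction(time_years, d_value_years):
    """Per-spore survival probability under log-linear inactivation."""
    return 10.0 ** (-time_years / d_value_years)

n0 = 1e9        # assumed initial number of spores in the sample
t = 250e6       # years, age of the Salado formation

for label, d in [("mesophile-like", 2e7), ("thermophile-like", 8e7)]:
    p = surviving_fraction(t, d)
    # probability that at least one of the n0 spores survives
    p_any = 1.0 - math.exp(-n0 * p) if n0 * p < 1e-3 else 1.0 - (1.0 - p) ** n0
    print(f"{label}: per-spore survival {p:.3e}, P(at least one survivor) {p_any:.3e}")
```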

  2. VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES

    International Nuclear Information System (INIS)

    G.A. Valentine; F.V. Perry; S. Dartevelle

    2005-01-01

    Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields such as performance assessment for hazardous and/or radioactive waste disposal sites that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drives the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for "closure" of wall rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive "eruption" of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision

  3. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  4. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

    Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results"; special emphasis on simulation and discrete decision theory; a mathematically rich but self-contained text, at a gentle pace; review of calculus and linear algebra in an appendix; mathematical interludes (in each chapter) which examine mathematical techniques in the context of their probabilistic or statistical importance; numerous section exercises, summaries, historical notes, and further readings for reinforcement.

  5. Uncertainty relation and probability. Numerical illustration

    International Nuclear Information System (INIS)

    Fujikawa, Kazuo; Umetsu, Koichiro

    2011-01-01

    The uncertainty relation and the probability interpretation of quantum mechanics are intrinsically connected, as is evidenced by the evaluation of standard deviations. It is thus natural to ask if one can associate a very small uncertainty product of suitably sampled events with a very small probability. We have shown elsewhere that some examples of the evasion of the uncertainty relation noted in the past are in fact understood in this way. We here numerically illustrate that a very small uncertainty product is realized if one performs a suitable sampling of measured data that occur with a very small probability. We introduce a notion of cyclic measurements. It is also shown that our analysis is consistent with the Landau-Pollak-type uncertainty relation. It is suggested that the present analysis may help reconcile the contradicting views about the 'standard quantum limit' in the detection of gravitational waves. (author)

  6. A basic course in probability theory

    CERN Document Server

    Bhattacharya, Rabi

    2016-01-01

    This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added and basic theory has been expanded. General Markov dependent sequences and their convergence to equilibrium is the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and/or concentration inequalities ranging from those of Chebyshev, Cramer–Chernoff, Bahadur–Rao, to Hoeffding have been added, with illustrative comparisons of thei...

  7. Geometric modeling in probability and statistics

    CERN Document Server

    Calin, Ovidiu

    2014-01-01

    This book covers topics of Informational Geometry, a field which deals with the differential geometric study of the manifold of probability density functions. This is a field that is increasingly attracting the interest of researchers from many different areas of science, including mathematics, statistics, geometry, computer science, signal processing, physics and neuroscience. It is the authors’ hope that the present book will be a valuable reference for researchers and graduate students in one of the aforementioned fields. This textbook is a unified presentation of differential geometry and probability theory, and constitutes a text for a course directed at graduate or advanced undergraduate students interested in applications of differential geometry in probability and statistics. The book contains over 100 proposed exercises meant to help students deepen their understanding, and it is accompanied by software that is able to provide numerical computations of several information geometric objects. The reader...

  8. Sequential Probability Ratio Test for Spacecraft Collision Avoidance Maneuver Decisions

    Science.gov (United States)

    Carpenter, J. Russell; Markley, F. Landis

    2013-01-01

    A document discusses sequential probability ratio tests that explicitly allow decision-makers to incorporate false alarm and missed detection risks, and are potentially less sensitive to modeling errors than a procedure that relies solely on a probability of collision threshold. Recent work on constrained Kalman filtering has suggested an approach to formulating such a test for collision avoidance maneuver decisions: a filter bank with two norm-inequality-constrained epoch-state extended Kalman filters. One filter models the null hypothesis that the miss distance is inside the combined hard body radius at the predicted time of closest approach, and one filter models the alternative hypothesis. The epoch-state filter developed for this method explicitly accounts for any process noise present in the system. The method appears to work well using a realistic example based on an upcoming, highly elliptical orbit formation flying mission.
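
    For readers unfamiliar with the underlying test, the sketch below implements a generic Wald-style sequential probability ratio test with explicit false-alarm and missed-detection risks; the document's own formulation uses a bank of constrained epoch-state Kalman filters rather than this simple Gaussian-mean test.

```python
# Generic Wald-style sequential probability ratio test for a Gaussian mean,
# shown to make the false-alarm/missed-detection trade-off concrete; it is not
# the constrained-Kalman-filter formulation summarized above.
import math
import random

def sprt(samples, mu0, mu1, sigma, alpha=0.01, beta=0.01):
    """Decide between H0: mean=mu0 and H1: mean=mu1 as data arrive."""
    lower = math.log(beta / (1.0 - alpha))     # accept H0 at or below this
    upper = math.log((1.0 - beta) / alpha)     # accept H1 at or above this
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2.0 * sigma ** 2)
        if llr <= lower:
            return "accept H0", n
        if llr >= upper:
            return "accept H1", n
    return "undecided", len(samples)

random.seed(0)
data = [random.gauss(1.0, 1.0) for _ in range(200)]   # truth: mean 1.0 (H1)
print(sprt(data, mu0=0.0, mu1=1.0, sigma=1.0))
```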

  9. Establishment probability in newly founded populations

    Directory of Open Access Journals (Sweden)

    Gusset Markus

    2012-06-01

    Background: Establishment success in newly founded populations relies on reaching the established phase, which is defined by characteristic fluctuations of the population’s state variables. Stochastic population models can be used to quantify the establishment probability of newly founded populations; however, so far no simple but robust method for doing so existed. To determine a critical initial number of individuals that need to be released to reach the established phase, we used a novel application of the “Wissel plot”, where –ln(1 – P0(t)) is plotted against time t. This plot is based on the equation P0(t) = 1 – c1·e^(–ω1·t), which relates the probability of extinction by time t, P0(t), to two constants: c1 describes the probability of a newly founded population reaching the established phase, whereas ω1 describes the population’s probability of extinction per short time interval once established. Results: For illustration, we applied the method to a previously developed stochastic population model of the endangered African wild dog (Lycaon pictus). A newly founded population reaches the established phase if the intercept of the (extrapolated) linear part of the “Wissel plot” with the y-axis, which is –ln(c1), is negative. For wild dogs in our model, this is the case if a critical initial number of four packs, consisting of eight individuals each, are released. Conclusions: The method we present to quantify the establishment probability of newly founded populations is generic and inferences thus are transferable to other systems across the field of conservation biology. In contrast to other methods, our approach disaggregates the components of a population’s viability by distinguishing establishment from persistence.
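
    The "Wissel plot" construction can be sketched directly from the equation quoted above: fitting the late, linear part of –ln(1 – P0(t)) against t recovers ω1 from the slope and the establishment probability c1 from the intercept. The extinction curve below is simulated rather than taken from the wild dog model.

```python
# Sketch of the "Wissel plot": fit the late, linear part of -ln(1 - P0(t))
# against t; the slope estimates omega1 and the intercept estimates -ln(c1).
# The extinction curve below is simulated, not taken from the wild dog model.
import numpy as np

c1_true, omega1_true = 0.6, 0.05
t = np.arange(1, 61)
p0 = 1.0 - c1_true * np.exp(-omega1_true * t)     # P0(t): extinction by time t

y = -np.log(1.0 - p0)                             # transform used in the plot
slope, intercept = np.polyfit(t[20:], y[20:], 1)  # fit the late, linear part

c1_hat = np.exp(-intercept)                       # establishment probability
print(f"estimated c1 = {c1_hat:.3f}, estimated omega1 = {slope:.3f}")
```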

  10. Introducing Disjoint and Independent Events in Probability.

    Science.gov (United States)

    Kelly, I. W.; Zwiers, F. W.

    Two central concepts in probability theory are those of independence and mutually exclusive events. This document is intended to provide suggestions to teachers that can be used to equip students with an intuitive, comprehensive understanding of these basic concepts in probability. The first section of the paper delineates mutually exclusive and…

  11. Against All Odds: When Logic Meets Probability

    NARCIS (Netherlands)

    van Benthem, J.; Katoen, J.-P.; Langerak, R.; Rensink, A.

    2017-01-01

    This paper is a light walk along interfaces between logic and probability, triggered by a chance encounter with Ed Brinksma. It is not a research paper, or a literature survey, but a pointer to issues. I discuss both direct combinations of logic and probability and structured ways in which logic can

  12. Prospect evaluation as a function of numeracy and probability denominator.

    Science.gov (United States)

    Millroth, Philip; Juslin, Peter

    2015-05-01

    This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Sensitivity analysis of limit state functions for probability-based plastic design

    Science.gov (United States)

    Frangopol, D. M.

    1984-01-01

    The evaluation of the total probability of a plastic collapse failure P_f for a highly redundant structure of random interdependent plastic moments acted on by random interdependent loads is a difficult and computationally very costly process. The evaluation of reasonable bounds to this probability requires the use of second-moment algebra, which involves many statistical parameters. A computer program which selects the best strategy for minimizing the interval between upper and lower bounds of P_f is now in its final stage of development. The sensitivity of the resulting bounds of P_f to the various uncertainties involved in the computational process is analyzed. Response sensitivities for both mode and system reliability of an ideal plastic portal frame are shown.

  14. On probability-possibility transformations

    Science.gov (United States)

    Klir, George J.; Parviz, Behzad

    1992-01-01

    Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.

  15. Probability: A Matter of Life and Death

    Science.gov (United States)

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…

  16. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  17. Recursive recovery of Markov transition probabilities from boundary value data

    Energy Technology Data Exchange (ETDEWEB)

    Patch, Sarah Kathyrn [Univ. of California, Berkeley, CA (United States)

    1994-04-01

    In an effort to mathematically describe the anisotropic diffusion of infrared radiation in biological tissue Gruenbaum posed an anisotropic diffusion boundary value problem in 1989. In order to accommodate anisotropy, he discretized the temporal as well as the spatial domain. The probabilistic interpretation of the diffusion equation is retained; radiation is assumed to travel according to a random walk (of sorts). In this random walk the probabilities with which photons change direction depend upon their previous as well as present location. The forward problem gives boundary value data as a function of the Markov transition probabilities. The inverse problem requires finding the transition probabilities from boundary value data. Problems in the plane are studied carefully in this thesis. Consistency conditions amongst the data are derived. These conditions have two effects: they prohibit inversion of the forward map but permit smoothing of noisy data. Next, a recursive algorithm which yields a family of solutions to the inverse problem is detailed. This algorithm takes advantage of all independent data and generates a system of highly nonlinear algebraic equations. Pluecker-Grassmann relations are instrumental in simplifying the equations. The algorithm is used to solve the 4 x 4 problem. Finally, the smallest nontrivial problem in three dimensions, the 2 x 2 x 2 problem, is solved.

  18. Using multinomial and imprecise probability for non-parametric modelling of rainfall in Manizales (Colombia

    Directory of Open Access Journals (Sweden)

    Ibsen Chivatá Cárdenas

    2008-05-01

    This article presents a rainfall model constructed by applying non-parametric modelling and imprecise probabilities; these tools were used because there was not enough homogeneous information in the study area. The area’s hydrological information regarding rainfall was scarce, and the existing hydrological time series were not uniform. A distributed extended rainfall model was constructed from so-called probability boxes (p-boxes), multinomial probability distributions and confidence intervals (a friendly algorithm was constructed for non-parametric modelling by combining the last two tools). This model confirmed the high level of uncertainty involved in local rainfall modelling. Uncertainty encompassed the whole range (domain) of probability values, thereby showing the severe limitations of the available information and leading to the conclusion that a detailed estimation of probability would lead to significant error. Nevertheless, relevant information was extracted; it was estimated that the maximum daily rainfall threshold (70 mm) would be surpassed at least once every three years, as was the magnitude of uncertainty affecting hydrological parameter estimation. This paper’s conclusions may be of interest to non-parametric modellers and decision-makers, as such modelling with imprecise probability represents an alternative for hydrological variable assessment and maybe an obligatory procedure in the future. Its potential lies in treating scarce information, and it represents a robust modelling strategy for non-seasonal stochastic modelling conditions.

  19. Probability densities and Lévy densities

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler

    For positive Lévy processes (i.e. subordinators) formulae are derived that express the probability density or the distribution function in terms of power series in time t. The applicability of the results to finance and to turbulence is briefly indicated.

  20. Why not model spoken word recognition instead of phoneme monitoring?

    NARCIS (Netherlands)

    Vroomen, J.; de Gelder, B.

    2000-01-01

    Norris, McQueen & Cutler present a detailed account of the decision stage of the phoneme monitoring task. However, we question whether this contributes to our understanding of the speech recognition process itself, and we fail to see why phonotactic knowledge is playing a role in phoneme

  1. Unbiased multi-fidelity estimate of failure probability of a free plane jet

    Science.gov (United States)

    Marques, Alexandre; Kramer, Boris; Willcox, Karen; Peherstorfer, Benjamin

    2017-11-01

    Estimating failure probability related to fluid flows is a challenge because it requires a large number of evaluations of expensive models. We address this challenge by leveraging multiple low fidelity models of the flow dynamics to create an optimal unbiased estimator. In particular, we investigate the effects of uncertain inlet conditions in the width of a free plane jet. We classify a condition as failure when the corresponding jet width is below a small threshold, such that failure is a rare event (failure probability is smaller than 0.001). We estimate failure probability by combining the frameworks of multi-fidelity importance sampling and optimal fusion of estimators. Multi-fidelity importance sampling uses a low fidelity model to explore the parameter space and create a biasing distribution. An unbiased estimate is then computed with a relatively small number of evaluations of the high fidelity model. In the presence of multiple low fidelity models, this framework offers multiple competing estimators. Optimal fusion combines all competing estimators into a single estimator with minimal variance. We show that this combined framework can significantly reduce the cost of estimating failure probabilities, and thus can have a large impact in fluid flow applications. This work was funded by DARPA.
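
    A toy sketch of the multi-fidelity importance-sampling idea described above: a cheap low-fidelity model locates the failure region and shapes a mean-shifted biasing density, after which a modest number of high-fidelity evaluations gives an unbiased estimate. The one-dimensional models, threshold and distributions are invented for illustration.

```python
# Toy multi-fidelity importance sampling for a rare failure probability: a
# cheap low-fidelity model shapes a mean-shifted biasing density, and a small
# number of high-fidelity evaluations yields an unbiased estimate.  The models,
# threshold and distributions are invented stand-ins.
import numpy as np
from scipy import stats

def high_fidelity(x):            # "expensive" jet-width model (stand-in)
    return 3.0 + 0.8 * x

def low_fidelity(x):             # cheap approximation of the same quantity
    return 3.1 + 0.75 * x

threshold = 0.5                  # failure: jet width below this value
nominal = stats.norm(0.0, 1.0)   # uncertain inlet condition

# Step 1: many cheap evaluations locate the failure region.
x_cheap = nominal.rvs(size=100_000, random_state=0)
fail_cheap = x_cheap[low_fidelity(x_cheap) < threshold]

# Step 2: mean-shifted biasing density centred on the low-fidelity failures.
biasing = stats.norm(fail_cheap.mean(), 1.0)

# Step 3: unbiased estimate from a modest number of high-fidelity evaluations.
x_bias = biasing.rvs(size=2_000, random_state=1)
weights = nominal.pdf(x_bias) / biasing.pdf(x_bias)
p_fail = np.mean(weights * (high_fidelity(x_bias) < threshold))

print(f"estimated failure probability: {p_fail:.2e}")
print(f"exact value for this toy problem: {nominal.cdf((threshold - 3.0) / 0.8):.2e}")
```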

  2. Maximization of regional probabilities using Optimal Surface Graphs

    DEFF Research Database (Denmark)

    Arias Lorza, Andres M.; Van Engelen, Arna; Petersen, Jens

    2018-01-01

    Purpose: We present a segmentation method that maximizes regional probabilities enclosed by coupled surfaces using an Optimal Surface Graph (OSG) cut approach. This OSG cut determines the globally optimal solution given a graph constructed around an initial surface. While most methods for vessel wall segmentation only use edge information, we show that maximizing regional probabilities using an OSG improves the segmentation results. We applied this to automatically segment the vessel wall of the carotid artery in magnetic resonance images. Methods: First, voxel-wise regional probability maps were obtained using a Support Vector Machine classifier trained on local image features. Then, the OSG segments the regions that maximize the regional probabilities, considering smoothness and topological constraints. Results: The method was evaluated on 49 carotid arteries from 30 subjects...

  3. Zero field reversal probability in thermally assisted magnetization reversal

    Science.gov (United States)

    Prasetya, E. B.; Utari; Purnama, B.

    2017-11-01

    This paper discusses the zero-field reversal probability in thermally assisted magnetization reversal (TAMR). The appearance of a reversal probability at zero field is investigated through micromagnetic simulation by solving the stochastic Landau-Lifshitz-Gilbert (LLG) equation. A perpendicular-anisotropy magnetic dot of 50×50×20 nm³ is considered as a single-cell storage element of magnetic random access memory (MRAM). Thermally assisted magnetization reversal was performed by cooling the writing process from near the Curie point to room temperature, over 20 runs with different randomly magnetized initial states. The results show that the reversal probability under zero magnetic field decreases with increasing energy barrier. A zero-field switching probability of 55% was attained for an energy barrier of 60 k_B T, and the reversal probability became zero at an energy barrier of 2348 k_B T. The highest zero-field switching probability of 55%, attained at an energy barrier of 60 k_B T, corresponds to a switching field of 150 Oe.

  4. Path probability of stochastic motion: A functional approach

    Science.gov (United States)

    Hattori, Masayuki; Abe, Sumiyoshi

    2016-06-01

    The path probability of a particle undergoing stochastic motion is studied by the use of functional technique, and the general formula is derived for the path probability distribution functional. The probability of finding paths inside a tube/band, the center of which is stipulated by a given path, is analytically evaluated in a way analogous to continuous measurements in quantum mechanics. Then, the formalism developed here is applied to the stochastic dynamics of stock price in finance.

  5. An approach for estimating the breach probabilities of moraine-dammed lakes in the Chinese Himalayas using remote-sensing data

    Directory of Open Access Journals (Sweden)

    X. Wang

    2012-10-01

    To make first-order estimates of the probability of a moraine-dammed lake outburst flood (MDLOF) and prioritize the probabilities of breaching posed by potentially dangerous moraine-dammed lakes (PDMDLs) in the Chinese Himalayas, an objective approach is presented. We first select five indicators to identify PDMDLs according to four predesigned criteria. The climatic background was regarded as the climatic precondition of the moraine-dam failure, and under different climatic preconditions, we distinguish the trigger mechanisms of MDLOFs and subdivide them into 17 possible breach modes, with each mode having three or four components; we combined the precondition, modes and components to construct a decision-making tree of moraine-dam failure. Conversion guidelines were established so as to quantify the probabilities of components of a breach mode, employing the historic performance method combined with expert knowledge and experience. The region of the Chinese Himalayas was chosen as a study area where there have been frequent MDLOFs in recent decades. The results show that the breaching probabilities (P) of 142 PDMDLs range from 0.037 to 0.345, and they can be further categorized as 43 lakes with very high breach probabilities (P ≥ 0.24), 47 lakes with high breach probabilities (0.18 ≤ P < 0.24), 24 lakes with mid-level breach probabilities (0.12 ≤ P < 0.18), 24 lakes with low breach probabilities (0.06 ≤ P < 0.12), and four lakes with very low breach probabilities (P < 0.06).

  6. Reactor materials program process water component failure probability

    International Nuclear Information System (INIS)

    Daugherty, W. L.

    1988-01-01

    The maximum-rate loss-of-coolant accident (LOCA) for the Savannah River Production Reactors is presently specified as the abrupt double-ended guillotine break (DEGB) of a large process water pipe. This accident is not considered credible in light of the low applied stresses and the inherent ductility of the piping materials. The Reactor Materials Program was initiated to provide the technical basis for an alternate, credible maximum-rate LOCA. The major thrust of this program is to develop an alternate worst-case accident scenario by deterministic means. In addition, the probability of a DEGB is also being determined, to show that in addition to being mechanistically incredible, it is also highly improbable. The probability of a DEGB of the process water piping is evaluated in two parts: failure by direct means, and indirectly-induced failure. These two areas have been discussed in other reports. In addition, the frequency of a large break (equivalent to a DEGB) in other process water system components is assessed. This report reviews the large break frequency for each component as well as the overall large break frequency for the reactor system.

  7. Environmental DNA (eDNA) Detection Probability Is Influenced by Seasonal Activity of Organisms.

    Science.gov (United States)

    de Souza, Lesley S; Godwin, James C; Renshaw, Mark A; Larson, Eric

    2016-01-01

    Environmental DNA (eDNA) holds great promise for conservation applications like the monitoring of invasive or imperiled species, yet this emerging technique requires ongoing testing in order to determine the contexts over which it is effective. For example, little research to date has evaluated how seasonality of organism behavior or activity may influence detection probability of eDNA. We applied eDNA to survey for two highly imperiled species endemic to the upper Black Warrior River basin in Alabama, US: the Black Warrior Waterdog (Necturus alabamensis) and the Flattened Musk Turtle (Sternotherus depressus). Importantly, these species have contrasting patterns of seasonal activity, with N. alabamensis more active in the cool season (October-April) and S. depressus more active in the warm season (May-September). We surveyed sites historically occupied by these species across cool and warm seasons over two years with replicated eDNA water samples, which were analyzed in the laboratory using species-specific quantitative PCR (qPCR) assays. We then used occupancy estimation with detection probability modeling to evaluate both the effects of landscape attributes on organism presence and the effect of sampling season on eDNA detection probability. Importantly, we found that season strongly affected eDNA detection probability for both species, with N. alabamensis having higher eDNA detection probabilities during the cool season and S. depressus having higher eDNA detection probabilities during the warm season. These results illustrate the influence of organismal behavior or activity on eDNA detection in the environment and identify an important role for basic natural history in designing eDNA monitoring programs.

  8. Estimation of Extreme Response and Failure Probability of Wind Turbines under Normal Operation using Probability Density Evolution Method

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Liu, W. F.

    2013-01-01

    Estimation of the extreme response and failure probability of structures subjected to ultimate design loads is essential for the structural design of wind turbines according to the new standard IEC 61400-1. This task is addressed in the present paper by means of the probability density evolution method (PDEM), which underlies the schemes of random vibration analysis and structural reliability assessment. The short-term rare failure probability of 5-megawatt wind turbines, for illustrative purposes, at given mean wind speeds and turbulence levels is investigated through the scheme of the extreme value distribution instead of the approximate fitted distributions currently used in statistical extrapolation techniques. Besides, comparative studies against the classical fitted distributions and standard Monte Carlo techniques are carried out. Numerical results indicate that PDEM exhibits...

  9. The probability of the false vacuum decay

    International Nuclear Information System (INIS)

    Kiselev, V.; Selivanov, K.

    1983-01-01

    The closed expression for the probability of the false vacuum decay in (1+1) dimensions is given. The probability of false vacuum decay is expressed as the product of an exponential quasiclassical factor and a functional determinant of the given form. The method for calculation of this determinant is developed and a complete answer for (1+1) dimensions is given.

  10. Probability of crack-initiation and application to NDE

    Energy Technology Data Exchange (ETDEWEB)

    Prantl, G [Nuclear Safety Inspectorate HSK, (Switzerland)

    1988-12-31

    Fracture toughness is a property with a certain variability. When a statistical distribution is assumed, the probability of crack initiation may be calculated for a given problem defined by its geometry and the applied stress. Experiments have shown that cracks which experience a certain small amount of ductile growth can reliably be detected by acoustic emission measurements. The probability of crack detection by AE techniques may be estimated using this experimental finding and the calculated probability of crack initiation. (author).

  11. Introduction to probability and stochastic processes with applications

    CERN Document Server

    Castañ, Blanco; Arunachalam, Viswanathan; Dharmaraja, Selvamuthu

    2012-01-01

    An easily accessible, real-world approach to probability and stochastic processes Introduction to Probability and Stochastic Processes with Applications presents a clear, easy-to-understand treatment of probability and stochastic processes, providing readers with a solid foundation they can build upon throughout their careers. With an emphasis on applications in engineering, applied sciences, business and finance, statistics, mathematics, and operations research, the book features numerous real-world examples that illustrate how random phenomena occur in nature and how to use probabilistic t

  12. Connections rigidity effect on probability of fracture in steel moment frames

    Directory of Open Access Journals (Sweden)

    Gholamreza Abdollahzadeh

    2017-08-01

    Connections in steel moment frames are commonly idealized as either fully pinned or fully rigid. With this assumption, in spite of the real behavior of the connection, actual story drifts are underestimated and the frame may be designed without adequate bracing. There are several methods for modeling the actual behavior of semi-rigid connections. In the method used here, a connection with a given rigidity is modeled by a rotational spring with a corresponding stiffness. This stiffness is obtained from a known formula (a sketch follows below). In other words, each percentage of rigidity corresponds to a particular rotational spring stiffness. In this research, in order to evaluate the real behavior of connections in the analysis and design process and the resulting fracture probability, a four-story, one-bay frame with three types of connection was modeled and designed in ETABS. Each model has an individual connection rigidity of 10, 75 or 90 percent. With respect to the maximum roof drift under different PGA levels, the probabilities of low, medium, high and complete fracture were calculated. For this purpose, different PGA levels were applied to the modeled frames and the resulting roof drifts were obtained. These values were then compared with the limits given in the American code. Finally, the investigation showed that as the rigidity of the frame connections increases, the probability of frame fracture decreases. In other words, assuming fully rigid connections in the analysis process leads to underestimating the real probability of fracture in frames, which is a noticeable risk in the building design process.
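
    The rigidity-to-stiffness conversion mentioned above can be sketched with the common fixity-factor relation r = 1 / (1 + 3EI/(kL)); whether the study uses exactly this relation is an assumption of the example, as are the section properties.

```python
# Rigidity-to-stiffness conversion via the common fixity-factor relation
# r = 1 / (1 + 3EI / (k*L)), i.e. k = (3EI/L) * r / (1 - r); the section
# properties below and the use of this exact relation are assumptions.
def spring_stiffness(rigidity, e_modulus, inertia, length):
    """Rotational spring stiffness (kN*m/rad) for a given rigidity fraction."""
    if not 0.0 < rigidity < 1.0:
        raise ValueError("rigidity must be strictly between 0 and 1")
    return (3.0 * e_modulus * inertia / length) * rigidity / (1.0 - rigidity)

# Example: E = 2.1e8 kN/m2, I = 8.36e-5 m4 (IPE300-like), L = 6 m (all assumed).
for r in (0.10, 0.75, 0.90):
    k = spring_stiffness(r, 2.1e8, 8.36e-5, 6.0)
    print(f"rigidity {r:.0%}: k = {k:,.0f} kN*m/rad")
```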

  13. Landslide Probability Assessment by the Derived Distributions Technique

    Science.gov (United States)

    Muñoz, E.; Ochoa, A.; Martínez, H.

    2012-12-01

    is used along with the soil characteristic curve (suction vs. moisture) and the Mohr-Coulomb failure criterion in order to calculate the FOS of the slope. Data from two slopes located in steep tropical regions of the cities of Medellín (Colombia) and Rio de Janeiro (Brazil) were used to verify the model's performance. The results indicated significant differences between the obtained FOS values and the behavior observed in the field. The model shows relatively high values of FOS that do not reflect the instability of the analyzed slopes. For the two cases studied, the application of a simpler reliability concept (such as the Probability of Failure, PR, or the Reliability Index, β) instead of a FOS could lead to more realistic results.

  14. Fishnet model for failure probability tail of nacre-like imbricated lamellar materials

    Science.gov (United States)

    Luo, Wen; Bažant, Zdeněk P.

    2017-12-01

    Nacre, the iridescent material of the shells of pearl oysters and abalone, consists mostly of aragonite (a form of CaCO3), a brittle constituent of relatively low strength (≈10 MPa). Yet it has astonishing mean tensile strength (≈150 MPa) and fracture energy (≈350 to 1,240 J/m2). The reasons have recently become well understood: (i) the nanoscale thickness (≈300 nm) of nacre's building blocks, the aragonite lamellae (or platelets), and (ii) the imbricated, or staggered, arrangement of these lamellae, bound by biopolymer layers only ≈25 nm thick, occupying only a small fraction of the volume. In engineering applications, however, a failure probability of ≤10^-6 is generally required. To guarantee it, the type of probability density function (pdf) of strength, including its tail, must be determined. This objective, not pursued previously, is hardly achievable by experiments alone, since >10^8 specimen tests would be needed. Here we outline a statistical model of strength that resembles a fishnet pulled diagonally, captures the tail of the pdf of strength and, importantly, allows analytical safety assessments of nacreous materials. The analysis shows that, in terms of safety, the imbricated lamellar structure provides a major additional advantage: a ≈10% strength increase at a tail failure probability of 10^-6 and a 1 to 2 orders of magnitude decrease of the tail probability at fixed stress. Another advantage is that a high scatter of microstructure properties diminishes the strength difference between the mean and the probability tail, compared with the weakest-link model. These advantages of nacre-like materials are here justified analytically and supported by millions of Monte Carlo simulations.
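
    The weakest-link baseline that the fishnet model is compared against can be sampled directly; the sketch below estimates a tail failure probability by Monte Carlo with illustrative Weibull element strengths (not the paper's fishnet statistics), and far more samples would be needed to resolve probabilities near 10^-6:

        # Monte Carlo tail estimate for a weakest-link (chain) structure:
        # structure strength = min of n element strengths.
        import random

        def weakest_link_strength(n_links, shape=10.0, scale=150.0):
            # random.weibullvariate(alpha, beta): alpha = scale, beta = shape
            return min(random.weibullvariate(scale, shape) for _ in range(n_links))

        def tail_probability(stress, n_links, n_samples=100_000):
            failures = sum(weakest_link_strength(n_links) <= stress
                           for _ in range(n_samples))
            return failures / n_samples

        print(tail_probability(stress=60.0, n_links=100))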

  15. Accelerating rejection-based simulation of biochemical reactions with bounded acceptance probability

    Energy Technology Data Exchange (ETDEWEB)

    Thanh, Vo Hong, E-mail: vo@cosbi.eu [The Microsoft Research - University of Trento Centre for Computational and Systems Biology, Piazza Manifattura 1, Rovereto 38068 (Italy); Priami, Corrado, E-mail: priami@cosbi.eu [The Microsoft Research - University of Trento Centre for Computational and Systems Biology, Piazza Manifattura 1, Rovereto 38068 (Italy); Department of Mathematics, University of Trento, Trento (Italy); Zunino, Roberto, E-mail: roberto.zunino@unitn.it [Department of Mathematics, University of Trento, Trento (Italy)

    2016-06-14

    Stochastic simulation of large biochemical reaction networks is often computationally expensive due to disparate reaction rates and the high variability of chemical species populations. One approach to accelerating the simulation is to allow multiple reaction firings before performing an update, under the assumption that reaction propensities change by a negligible amount during a time interval. Species with small populations that are involved in fast reactions significantly affect both the performance and the accuracy of this approach, and the situation is even worse when these small-population species participate in a large number of reactions. We present in this paper a new approximate algorithm to cope with this problem. It is based on bounding the acceptance probability of a reaction selected by the exact rejection-based simulation algorithm, which employs propensity bounds of reactions and a rejection mechanism to select the next reaction firings. Each reaction is guaranteed to be selected to fire with an acceptance rate greater than a predefined probability, and the selection becomes exact if this probability is set to one. The new algorithm improves the computational cost of selecting the next reaction firing and reduces the cost of updating reaction propensities.
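
    A sketch of the rejection mechanism the new algorithm builds on: a candidate reaction is drawn in proportion to a propensity upper bound and accepted with probability a_j divided by its bound, so exact propensities are only evaluated lazily. The bounding of the acceptance probability proposed in the paper is not reproduced here, and all names are placeholders:

        import random

        def select_reaction(upper_bounds, exact_propensity):
            """upper_bounds[j] >= a_j for every reaction j; exact_propensity(j) -> a_j."""
            total = sum(upper_bounds)
            while True:
                # draw a candidate reaction proportional to its propensity upper bound
                r, acc, candidate = random.random() * total, 0.0, 0
                for candidate, ub in enumerate(upper_bounds):
                    acc += ub
                    if r <= acc:
                        break
                # accept with probability a_j / bound_j; otherwise reject and retry
                if random.random() <= exact_propensity(candidate) / upper_bounds[candidate]:
                    return candidate

        # toy example: three reactions with bounds [2.0, 1.0, 0.5] and exact a_j = 0.8 * bound
        bounds = [2.0, 1.0, 0.5]
        print(select_reaction(bounds, lambda j: 0.8 * bounds[j]))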

  16. Estimation of functional failure probability of passive systems based on adaptive importance sampling method

    International Nuclear Information System (INIS)

    Wang Baosheng; Wang Dongqing; Zhang Jianmin; Jiang Jing

    2012-01-01

    In order to estimate the functional failure probability of passive systems, an innovative adaptive importance sampling methodology is presented. In the proposed methodology, information about the variables is extracted by pre-sampling points in the failure region, and an importance sampling density is then constructed from the sample distribution in that region. Taking the AP1000 passive residual heat removal system as an example, the uncertainties related to the model of a passive system and to the numerical values of its input parameters are considered, and the probability of functional failure is estimated by combining the response surface method with the adaptive importance sampling method. The numerical results demonstrate the high computational efficiency and excellent accuracy of the methodology compared with traditional probability analysis methods. (authors)
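
    A minimal importance-sampling estimator of a failure probability is sketched below; the limit-state function and both densities are one-dimensional placeholders, whereas the paper builds its importance density adaptively from pre-samples of the failure region of the AP1000 model:

        import math, random

        def norm_pdf(x, mu, sigma):
            return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

        def failure_probability(g, mu_q, sigma_q, n=100_000):
            """Estimate P(g(X) < 0) for X ~ N(0, 1) by sampling from N(mu_q, sigma_q)."""
            total = 0.0
            for _ in range(n):
                x = random.gauss(mu_q, sigma_q)     # draw from the importance density q
                if g(x) < 0.0:                      # indicator of functional failure
                    total += norm_pdf(x, 0.0, 1.0) / norm_pdf(x, mu_q, sigma_q)
            return total / n

        # toy limit state: failure when x exceeds 4 nominal standard deviations
        print(failure_probability(lambda x: 4.0 - x, mu_q=4.0, sigma_q=1.0))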

  17. Use of probability tables for propagating uncertainties in neutronics

    International Nuclear Information System (INIS)

    Coste-Delclaux, M.; Diop, C.M.; Lahaye, S.

    2017-01-01

    Highlights: • Moment-based probability table formalism is described. • Representation by probability tables of any uncertainty distribution is established. • Multiband equations for two kinds of uncertainty propagation problems are solved. • Numerical examples are provided and validated against Monte Carlo simulations. - Abstract: Probability tables are a generic tool that allows representing any random variable whose probability density function is known. In the field of nuclear reactor physics, this tool is currently used to represent the variation of cross-sections versus energy (neutron transport codes TRIPOLI4®, MCNP, APOLLO2, APOLLO3®, ECCO/ERANOS…). In the present article we show how we can propagate uncertainties, thanks to a probability table representation, through two simple physical problems: an eigenvalue problem (neutron multiplication factor) and a depletion problem.
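
    A toy illustration of the idea of a probability table, i.e. a small set of (weight, representative value) bands standing in for a full distribution; the moment-preserving construction used in the article is more elaborate than this equal-probability version, and the response function is a placeholder:

        def make_table(samples, n_bands=4):
            """Equal-probability bands from sorted samples; each band keeps its mean."""
            samples = sorted(samples)
            size = len(samples) // n_bands
            return [(1.0 / n_bands, sum(samples[b * size:(b + 1) * size]) / size)
                    for b in range(n_bands)]

        def propagate_mean(table, model):
            """Mean of model(x) when the input x is described by the probability table."""
            return sum(p * model(v) for p, v in table)

        table = make_table([0.9, 1.0, 1.1, 1.05, 0.95, 1.2, 0.85, 1.15], n_bands=4)
        print(propagate_mean(table, lambda sigma: 1.0 / sigma))  # toy 1/x-type response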

  18. Quantum correlations in terms of neutrino oscillation probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Alok, Ashutosh Kumar, E-mail: akalok@iitj.ac.in [Indian Institute of Technology Jodhpur, Jodhpur 342011 (India); Banerjee, Subhashish, E-mail: subhashish@iitj.ac.in [Indian Institute of Technology Jodhpur, Jodhpur 342011 (India); Uma Sankar, S., E-mail: uma@phy.iitb.ac.in [Indian Institute of Technology Bombay, Mumbai 400076 (India)

    2016-08-15

    Neutrino oscillations provide evidence for the mode entanglement of neutrino mass eigenstates in a given flavour eigenstate. Given this mode entanglement, it is pertinent to consider the relation between the oscillation probabilities and other quantum correlations. In this work, we show that all the well-known quantum correlations, such as those constrained by Bell's inequality, are directly related to the neutrino oscillation probabilities. The results of neutrino oscillation experiments, which measure the neutrino survival probability to be less than unity, imply a violation of Bell's inequality.
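
    For reference, the textbook two-flavour survival probability that such experiments measure is sketched below; this is standard material, not the correlation analysis of the paper itself, and the parameter values are merely illustrative:

        import math

        def survival_probability(theta, dm2_ev2, L_km, E_GeV):
            """P(nu -> nu) = 1 - sin^2(2*theta) * sin^2(1.27 * dm2 * L / E)."""
            return 1.0 - (math.sin(2.0 * theta) ** 2
                          * math.sin(1.27 * dm2_ev2 * L_km / E_GeV) ** 2)

        # atmospheric-like parameters (illustrative)
        print(survival_probability(theta=0.78, dm2_ev2=2.5e-3, L_km=500.0, E_GeV=1.0))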

  19. Risk Probability Estimating Based on Clustering

    DEFF Research Database (Denmark)

    Chen, Yong; Jensen, Christian D.; Gray, Elizabeth

    2003-01-01

    of prior experiences, recommendations from a trusted entity or the reputation of the other entity. In this paper we propose a dynamic mechanism for estimating the risk probability of a certain interaction in a given environment using hybrid neural networks. We argue that traditional risk assessment models...... from the insurance industry do not directly apply to ubiquitous computing environments. Instead, we propose a dynamic mechanism for risk assessment, which is based on pattern matching, classification and prediction procedures. This mechanism uses an estimator of risk probability, which is based...

  20. Duelling idiots and other probability puzzlers

    CERN Document Server

    Nahin, Paul J

    2002-01-01

    What are your chances of dying on your next flight, being called for jury duty, or winning the lottery? We all encounter probability problems in our everyday lives. In this collection of twenty-one puzzles, Paul Nahin challenges us to think creatively about the laws of probability as they apply in playful, sometimes deceptive, ways to a fascinating array of speculative situations. Games of Russian roulette, problems involving the accumulation of insects on flypaper, and strategies for determining the odds of the underdog winning the World Series all reveal intriguing dimensions to the worki