WorldWideScience

Sample records for conditional probability based

  1. Statistical models based on conditional probability distributions

    International Nuclear Information System (INIS)

    Narayanan, R.S.

    1991-10-01

    We present a formulation of statistical mechanics models based on conditional probability distributions rather than a Hamiltonian. We show that it is possible to realize critical phenomena through this procedure. Closely linked with this formulation is a Monte Carlo algorithm in which a generated configuration is guaranteed to be statistically independent of any other configuration for all values of the parameters, in particular near the critical point. (orig.)

  2. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  3. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)

  4. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.
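The absolute-versus-conditional distinction this record manipulates can be sketched numerically. The trial counts below are hypothetical, chosen so that the overall color probability is 0.5 while the conditional probability given a cue pair is 0.9 or 0.1:

```python
# Hypothetical trials: (cue_pair, target_color). Overall P(red) = 0.5,
# but P(red | cues) is 0.9 for one cue pair and 0.1 for the other.
trials = ([(("A", "B"), "red")] * 9 + [(("A", "B"), "green")] * 1
          + [(("C", "D"), "red")] * 1 + [(("C", "D"), "green")] * 9)

def conditional_prob(trials, cues, color):
    """P(target = color | cue pair = cues), estimated from counts."""
    matching = [c for cu, c in trials if cu == cues]
    return matching.count(color) / len(matching)

def absolute_prob(trials, color):
    """P(target = color), ignoring the cues."""
    return sum(1 for _, c in trials if c == color) / len(trials)
```

A searcher who tracks only `absolute_prob` sees an uninformative 0.5; the cue-conditioned probabilities are what the informed participants could exploit.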

  5. Towards a Categorical Account of Conditional Probability

    Directory of Open Access Journals (Sweden)

    Robert Furber

    2015-11-01

    This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and there are also extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.

  6. Statistical learning of action: the role of conditional probability.

    Science.gov (United States)

    Meyer, Meredith; Baldwin, Dare

    2011-12-01

    Identification of distinct units within a continuous flow of human action is fundamental to action processing. Such segmentation may rest in part on statistical learning. In a series of four experiments, we examined what types of statistics people can use to segment a continuous stream involving many brief, goal-directed action elements. The results of Experiment 1 showed no evidence for sensitivity to conditional probability, whereas Experiment 2 displayed learning based on joint probability. In Experiment 3, we demonstrated that additional exposure to the input failed to engender sensitivity to conditional probability. However, the results of Experiment 4 showed that a subset of adults, namely those more successful at identifying actions that had been seen more frequently than comparison sequences, were also successful at learning conditional-probability statistics. These experiments help to clarify the mechanisms subserving processing of intentional action, and they highlight important differences from, as well as similarities to, prior studies of statistical learning in other domains, including language.

  7. Students' Understanding of Conditional Probability on Entering University

    Science.gov (United States)

    Reaburn, Robyn

    2013-01-01

    An understanding of conditional probability is essential for students of inferential statistics as it is used in Null Hypothesis Tests. Conditional probability is also used in Bayes' theorem, in the interpretation of medical screening tests and in quality control procedures. This study examines the understanding of conditional probability of…

  8. The Probability Approach to English If-Conditional Sentences

    Science.gov (United States)

    Wu, Mei

    2012-01-01

    Users of the Probability Approach choose the right one from four basic types of conditional sentences--factual, predictive, hypothetical and counterfactual conditionals, by judging how likely (i.e. the probability) the event in the result-clause will take place when the condition in the if-clause is met. Thirty-three students from the experimental…

  9. An Alternative Version of Conditional Probabilities and Bayes' Rule: An Application of Probability Logic

    Science.gov (United States)

    Satake, Eiki; Amato, Philip P.

    2008-01-01

    This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…
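Whatever route is taken to derive them, the resulting posterior probabilities must agree with the standard form of Bayes' rule. A minimal sketch, using hypothetical medical-screening figures (1% prevalence, 95% sensitivity, 5% false-positive rate) rather than anything from the paper:

```python
def bayes_posterior(prior, likelihood, false_alarm):
    """P(H | E) via Bayes' rule, with P(E) expanded by total probability:
    P(E) = P(E|H) P(H) + P(E|not H) P(not H)."""
    evidence = likelihood * prior + false_alarm * (1 - prior)
    return likelihood * prior / evidence

# Hypothetical screening test: P(disease) = 0.01,
# P(+ | disease) = 0.95, P(+ | healthy) = 0.05.
posterior = bayes_posterior(0.01, 0.95, 0.05)
# Despite the seemingly accurate test, P(disease | +) is only about 0.16,
# because false positives from the healthy majority dominate.
```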

  10. Maximum Entropy and Probability Kinematics Constrained by Conditionals

    Directory of Open Access Journals (Sweden)

    Stefan Lukits

    2015-03-01

    Two open questions of inductive reasoning are solved: (1) does the principle of maximum entropy (PME) give a solution to the obverse Majerník problem, and (2) is Wagner correct when he claims that Jeffrey's updating principle (JUP) contradicts PME? Majerník shows that PME provides unique and plausible marginal probabilities, given conditional probabilities. The obverse problem posed here is whether PME also provides such conditional probabilities, given certain marginal probabilities. The theorem developed to solve the obverse Majerník problem demonstrates that in the special case introduced by Wagner, PME does not contradict JUP, but elegantly generalizes it and offers a more integrated approach to probability updating.

  11. Conditional probabilities in Ponzano-Regge minisuperspace

    International Nuclear Information System (INIS)

    Petryk, Roman; Schleich, Kristin

    2003-01-01

    We examine the Hartle-Hawking no-boundary initial state for the Ponzano-Regge formulation of gravity in three dimensions. We consider the behavior of conditional probabilities and expectation values for geometrical quantities in this initial state for a simple minisuperspace model consisting of a two-parameter set of anisotropic geometries on a 2-sphere boundary. We find dependence on the cutoff used in the construction of Ponzano-Regge amplitudes for expectation values of edge lengths. However, these expectation values are cutoff independent when computed in certain, but not all, conditional probability distributions. Conditions that yield cutoff-independent expectation values are those that constrain the boundary geometry to a finite range of edge lengths. We argue that such conditions have a correspondence to fixing a range of local time, as classically associated with the area of a surface for spatially closed cosmologies. Thus, these results may hint at how classical spacetime emerges from quantum amplitudes.

  12. Solving probability reasoning based on DNA strand displacement and probability modules.

    Science.gov (United States)

    Zhang, Qiang; Wang, Xiaobiao; Wang, Xiaojun; Zhou, Changjun

    2017-12-01

    In computational biology, DNA strand displacement technology is used to simulate the computation process and has shown strong computing ability. Most researchers use it to solve logic problems, but it has only rarely been used in probabilistic reasoning. To process probabilistic reasoning, a conditional probability derivation model and a total probability model based on DNA strand displacement were established in this paper. The models were assessed through the game "read your mind" and shown to enable the application of probabilistic reasoning in genetic diagnosis. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Probability in reasoning: a developmental test on conditionals.

    Science.gov (United States)

    Barrouillet, Pierre; Gauffroy, Caroline

    2015-04-01

    Probabilistic theories have been claimed to constitute a new paradigm for the psychology of reasoning. A key assumption of these theories is captured by what they call the Equation, the hypothesis that the meaning of the conditional is probabilistic in nature and that the probability of If p then q is the conditional probability, in such a way that P(if p then q) = P(q|p). Using the probabilistic truth-table task, in which participants are required to evaluate the probability of If p then q sentences, the present study explored the pervasiveness of the Equation across ages (from early adolescence to adulthood), types of conditionals (basic, causal, and inducements) and contents. The results reveal that the Equation is a late developmental achievement, endorsed only by a narrow majority of educated adults for certain types of conditionals depending on the content they involve. Age-related changes in evaluating the probability of all the conditionals studied closely mirror the development of truth-value judgements observed in previous studies with traditional truth-table tasks. We argue that our modified mental model theory can account for this development, and hence for the findings related to the probability task, which consequently do not support the probabilistic approach to human reasoning over alternative theories. Copyright © 2014 Elsevier B.V. All rights reserved.
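The Equation is easy to state computationally: given frequencies over the four truth-table cells for p and q, P(if p then q) is identified with P(q|p), so only the true-antecedent cells matter. A minimal sketch with made-up cell counts, contrasting it with the material-conditional reading that the task distinguishes it from:

```python
# Hypothetical counts for the four truth-table cells (p, q).
counts = {(True, True): 30, (True, False): 10,
          (False, True): 20, (False, False): 40}

def equation_probability(counts):
    """P(q | p): per the Equation, only cases where p holds are relevant."""
    tt, tf = counts[(True, True)], counts[(True, False)]
    return tt / (tt + tf)

def material_probability(counts):
    """P(p -> q) on the material reading: every cell except (p, not q)."""
    total = sum(counts.values())
    return (total - counts[(True, False)]) / total
```

With these counts the two readings diverge (0.75 versus 0.9), which is what lets the truth-table task diagnose which interpretation a participant endorses.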

  14. Retrocausality and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two accounts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagen-like disavowal of realism in quantum mechanics. 6 refs. (Author)

  15. Modeling highway travel time distribution with conditional probability models

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira Neto, Francisco Moraes [ORNL]; Chin, Shih-Miao [ORNL]; Hwang, Ho-Ling [ORNL]; Han, Lee [University of Tennessee, Knoxville (UTK)]

    2014-01-01

    Under the sponsorship of the Federal Highway Administration's Office of Freight Management and Operations, the American Transportation Research Institute (ATRI) has developed performance measures through the Freight Performance Measures (FPM) initiative. Under this program, travel speed information is derived from data collected using wireless-based global positioning systems. These telemetric data systems are subscribed to and used by the trucking industry as an operations management tool. More than one telemetric operator submits data dumps to ATRI on a regular basis. Each data transmission contains a truck location, its travel time, and a clock time/date stamp. Data from the FPM program provide a unique opportunity for studying upstream-downstream speed distributions at different locations, as well as at different times of the day and days of the week. This research is focused on the stochastic nature of successive link travel speed data on the continental United States Interstate network. Specifically, a method to estimate route probability distributions of travel time is proposed. This method uses the concepts of convolution of probability distributions and bivariate, link-to-link, conditional probability to estimate the expected distributions for the route travel time. A major contribution of this study is the consideration of speed correlation between upstream and downstream contiguous Interstate segments through conditional probability. The established conditional probability distributions between successive segments can be used to provide travel time reliability measures. This study also suggests an adaptive method for calculating and updating the route travel time distribution as new data or information are added. This methodology can be useful for estimating performance measures as required by the recent Moving Ahead for Progress in the 21st Century Act (MAP-21).
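The route-level construction the abstract describes can be sketched on a toy two-link route; the travel times, marginal distribution, and link-to-link conditional table below are hypothetical stand-ins for the FPM data:

```python
# Hypothetical discrete travel times (minutes) for two successive links.
t1 = [10, 12]
p1 = [0.6, 0.4]                         # marginal distribution of link 1
t2 = [8, 11]
# cond[i][j] = P(link2 = t2[j] | link1 = t1[i]); rows sum to 1.
# Positive correlation: a slow upstream link tends to mean a slow downstream link.
cond = [[0.8, 0.2],
        [0.3, 0.7]]

# Convolve the link-1 marginal with the link-to-link conditional table
# to get the route travel time distribution.
route = {}                              # route travel time -> probability
for i, a in enumerate(t1):
    for j, b in enumerate(t2):
        route[a + b] = route.get(a + b, 0.0) + p1[i] * cond[i][j]
```

Replacing `cond[i][j]` with an unconditional link-2 marginal recovers the independence assumption; the conditional table is what lets the correlation between contiguous segments reshape the route distribution.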

  16. Conditional probability on MV-algebras

    Czech Academy of Sciences Publication Activity Database

    Kroupa, Tomáš

    2005-01-01

    Roč. 149, č. 2 (2005), s. 369-381 ISSN 0165-0114 R&D Projects: GA AV ČR IAA2075302 Institutional research plan: CEZ:AV0Z10750506 Keywords : conditional probability * tribe * MV-algebra Subject RIV: BA - General Mathematics Impact factor: 1.039, year: 2005

  17. Dynamic Forecasting Conditional Probability of Bombing Attacks Based on Time-Series and Intervention Analysis.

    Science.gov (United States)

    Li, Shuying; Zhuang, Jun; Shen, Shifei

    2017-07-01

    In recent years, various types of terrorist attacks have occurred, causing worldwide catastrophes. According to the Global Terrorism Database (GTD), among all attack tactics, bombing attacks happened most frequently, followed by armed assaults. In this article, a model for analyzing and forecasting the conditional probability of bombing attacks (CPBAs) based on time-series methods is developed. In addition, intervention analysis is used to analyze the sudden increase in the time-series process. The results show that the CPBA increased dramatically at the end of 2011. During that time, the CPBA increased by 16.0% in a two-month period to reach the peak value, but stayed 9.0% greater than the predicted level after the temporary effect gradually decayed. By contrast, no significant fluctuation can be found in the conditional probability process of armed assault. It can be inferred that some social unrest, such as America's troop withdrawal from Afghanistan and Iraq, could have led to the increase of the CPBA in Afghanistan, Iraq, and Pakistan. The integrated time-series and intervention model is used to forecast the monthly CPBA in 2014 and through 2064. The average relative error compared with the real data in 2014 is 3.5%. The model is also applied to the total number of attacks recorded by the GTD between 2004 and 2014. © 2016 Society for Risk Analysis.

  18. Study on conditional probability of surface rupture: effect of fault dip and width of seismogenic layer

    Science.gov (United States)

    Inoue, N.

    2017-12-01

    The conditional probability of surface rupture is affected by various factors, such as shallow material properties, the earthquake process, ground motions, and so on. Toda (2013) pointed out differences in the conditional probability between strike and reverse faults by considering the fault dip and the width of the seismogenic layer. This study evaluated the conditional probability of surface rupture with the following procedure. Fault geometry was determined from a randomly generated magnitude based on the method of The Headquarters for Earthquake Research Promotion (2017). If the defined fault plane did not saturate the assumed width of the seismogenic layer, the fault plane depth was assigned randomly within the seismogenic layer. Logistic analysis was performed on two data sets: surface displacement calculated by dislocation methods (Wang et al., 2003) from the defined source fault, and the depth of the top of the defined source fault. The conditional probability estimated from surface displacement indicated a higher probability for reverse faults than for strike faults, and this result coincides with previous similar studies (i.e., Kagawa et al., 2004; Kataoka and Kusakabe, 2005). On the contrary, the probability estimated from the depth of the source fault indicated a higher probability for thrust faults than for strike and reverse faults, and this trend is similar to the conditional probability of PFDHA results (Youngs et al., 2003; Moss and Ross, 2011). The combined simulated results of thrust and reverse faults also show a low probability. The worldwide compiled reverse-fault data include low-dip-angle earthquakes. In the case of Japanese reverse faults, on the other hand, there is a possibility that the conditional probability of reverse faults with fewer low-dip-angle earthquakes is low and similar to that of strike faults (i.e., Takao et al., 2013).
    In the future, numerical simulation considering the failure condition of the surface by the source

  19. Conditional probability of the tornado missile impact given a tornado occurrence

    International Nuclear Information System (INIS)

    Goodman, J.; Koch, J.E.

    1982-01-01

    Using an approach based on statistical mechanics, an expression for the probability of the first missile strike is developed. The expression depends on two generic parameters (injection probability eta(F) and height distribution psi(Z,F)), which are developed in this study, and one plant-specific parameter (number of potential missiles N/sub p/). The expression for the joint probability of simultaneous impact of multiple targets is also developed. This expression is applicable to the calculation of the probability of common cause failure due to tornado missiles. It is shown that the probability of the first missile strike can be determined using a uniform missile distribution model. It is also shown that the conditional probability of the second strike, given the first, is underestimated by the uniform model. The probability of the second strike is greatly increased if the missiles are in clusters large enough to cover both targets.

  20. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  1. Conditional Probabilities in the Excursion Set Theory. Generic Barriers and non-Gaussian Initial Conditions

    CERN Document Server

    De Simone, Andrea; Riotto, Antonio

    2011-01-01

    The excursion set theory, where density perturbations evolve stochastically with the smoothing scale, provides a method for computing the dark matter halo mass function. The computation of the mass function is mapped into the so-called first-passage time problem in the presence of a moving barrier. The excursion set theory is also a powerful formalism to study other properties of dark matter halos such as halo bias, accretion rate, formation time, merging rate and the formation history of halos. This is achieved by computing conditional probabilities with non-trivial initial conditions, and the conditional two-barrier first-crossing rate. In this paper we use the recently-developed path integral formulation of the excursion set theory to calculate analytically these conditional probabilities in the presence of a generic moving barrier, including the one describing the ellipsoidal collapse, and for both Gaussian and non-Gaussian initial conditions. The non-Markovianity of the random walks induced by non-Gaussi...

  2. Calculation of the Incremental Conditional Core Damage Probability on the Extension of Allowed Outage Time

    International Nuclear Information System (INIS)

    Kang, Dae Il; Han, Sang Hoon

    2006-01-01

    RG 1.177 requires that the conditional risk (incremental conditional core damage probability and incremental conditional large early release probability: ICCDP and ICLERP), given that a specific component is out of service (OOS), be quantified for a permanent change of the allowed outage time (AOT) of a safety system. An AOT is the length of time that a particular component or system is permitted to be OOS while the plant is operating. The ICCDP is defined as: ICCDP = [(conditional CDF with the subject equipment OOS) - (baseline CDF with nominal expected equipment unavailabilities)] × [duration of the single AOT under consideration]. Any event rendering the component OOS can start the time clock for the limiting condition of operation for a nuclear power plant. Thus, the largest ICCDP among the ICCDPs estimated from any occurrence of the basic events in the component fault tree should be selected for determining whether the AOT can be extended. If the component is under preventive maintenance, the conditional risk can be straightforwardly calculated without changing the CCF probability. The main concern is the estimation of the CCF probabilities, because of the possibility of failures of other similar components due to the same root causes. The quantifications of the risk, given that the subject equipment is in a failed state, are performed by setting the identified event of the subject equipment to TRUE. The CCF probabilities are also changed according to the identified failure cause. In the previous studies, however, the ICCDP was quantified with consideration of the possibility of a simultaneous occurrence of two CCF events. Based on the above, we derived formulas for the CCF probabilities for the cases where a specific component is in a failed state, and we present sample calculation results of the ICCDP for the low-pressure safety injection system (LPSIS) of Ulchin Unit 3.
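The ICCDP definition quoted above reduces to a one-line calculation once the two core damage frequencies are quantified. The numbers below are hypothetical, and the AOT is converted from hours to years so the units match the per-year CDFs:

```python
HOURS_PER_YEAR = 8760.0

def iccdp(conditional_cdf, baseline_cdf, aot_hours):
    """Incremental conditional core damage probability for a single AOT.

    conditional_cdf: CDF (per year) with the subject equipment out of service.
    baseline_cdf: CDF (per year) with nominal expected equipment unavailabilities.
    aot_hours: duration of the single AOT under consideration, in hours.
    """
    return (conditional_cdf - baseline_cdf) * (aot_hours / HOURS_PER_YEAR)

# Hypothetical values: 5e-5/yr conditional CDF, 2e-5/yr baseline, 72-hour AOT.
delta = iccdp(5.0e-5, 2.0e-5, 72.0)   # about 2.5e-7
```

The screening thresholds in RG 1.177 are stated against exactly this kind of per-AOT increment, which is why the largest ICCDP over the component's basic events is the quantity of interest.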

  3. Decomposition of conditional probability for high-order symbolic Markov chains

    Science.gov (United States)

    Melnik, S. S.; Usatenko, O. V.

    2017-07-01

    The main goal of this paper is to develop an estimate for the conditional probability function of random stationary ergodic symbolic sequences with elements belonging to a finite alphabet. We elaborate on a decomposition procedure for the conditional probability function of sequences considered to be high-order Markov chains. We represent the conditional probability function as the sum of multilinear memory function monomials of different orders (from zero up to the chain order). This allows us to introduce a family of Markov chain models and to construct artificial sequences via a method of successive iterations, taking into account at each step increasingly high correlations among random elements. At weak correlations, the memory functions are uniquely expressed in terms of the high-order symbolic correlation functions. The proposed method fills the gap between two approaches, namely the likelihood estimation and the additive Markov chains. The obtained results may have applications for sequential approximation of artificial neural network training.
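The paper's starting object, the conditional probability function of a high-order symbolic Markov chain, can be estimated from a sample sequence by simple counting; the sketch below does only that estimation step, not the decomposition into memory-function monomials:

```python
from collections import defaultdict

def estimate_conditional(seq, order):
    """Empirical P(next symbol | previous `order` symbols) from one sequence."""
    counts = defaultdict(lambda: defaultdict(int))
    for i in range(order, len(seq)):
        counts[tuple(seq[i - order:i])][seq[i]] += 1
    return {ctx: {sym: n / sum(nxt.values()) for sym, n in nxt.items()}
            for ctx, nxt in counts.items()}

probs = estimate_conditional("abababab", order=2)
# In this toy sequence the next symbol is fully determined by its context.
```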

  4. Estimation of functional failure probability of passive systems based on subset simulation method

    International Nuclear Information System (INIS)

    Wang Dongqing; Wang Baosheng; Zhang Jianmin; Jiang Jing

    2012-01-01

    In order to solve the problem of multi-dimensional epistemic uncertainties and the small functional failure probability of passive systems, an innovative reliability analysis algorithm called subset simulation, based on Markov chain Monte Carlo, was presented. The method is founded on the idea that a small failure probability can be expressed as a product of larger conditional failure probabilities by introducing a proper choice of intermediate failure events. Markov chain Monte Carlo simulation was implemented to efficiently generate conditional samples for estimating the conditional failure probabilities. Taking the AP1000 passive residual heat removal system as an example, the uncertainties related to the model of a passive system and the numerical values of its input parameters were considered in this paper. The probability of functional failure was then estimated with the subset simulation method. The numerical results demonstrate that the subset simulation method has high computing efficiency and excellent computing accuracy compared with traditional probability analysis methods. (authors)
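The core identity behind subset simulation, writing a rare-event probability as a product of larger conditional probabilities over nested intermediate failure events, can be illustrated on a toy uniform variable where each conditional distribution can be sampled exactly (the real method uses Markov chain Monte Carlo for that step):

```python
import random

def subset_estimate(levels, n=100_000, seed=1):
    """Estimate P(U > levels[-1]) for U ~ Uniform(0, 1) as the product of
    conditional probabilities P(U > l_k | U > l_{k-1}) over nested levels."""
    rng = random.Random(seed)
    prob, lo = 1.0, 0.0
    for level in levels:
        # Sample U conditional on U > lo (exact here; MCMC in general).
        samples = [lo + (1.0 - lo) * rng.random() for _ in range(n)]
        prob *= sum(s > level for s in samples) / n
        lo = level
    return prob

estimate = subset_estimate([0.9, 0.99, 0.999])   # true value is 1e-3
```

Each factor is about 0.1, so every stage estimates a moderate probability with ordinary sample sizes, whereas direct Monte Carlo would need millions of samples to see the 1e-3 event reliably.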

  5. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems

  6. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-01-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, L_st, P_st) for stochastic uncertainty, a probability space (S_su, L_su, P_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, L_st, P_st) and (S_su, L_su, P_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the US Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems

  7. A probability evaluation method of early deterioration condition for the critical components of wind turbine generator systems

    DEFF Research Database (Denmark)

    Hu, Y.; Li, H.; Liao, X.

    2016-01-01

    This study determines the early deterioration condition of critical components for a wind turbine generator system (WTGS). Due to the uncertain nature of the fluctuation and intermittence of wind, early deterioration condition evaluation poses a challenge to traditional vibration-based approaches, so the study proposes a probability evaluation method of early deterioration condition for critical components based only on temperature characteristic parameters. First, the dynamic threshold of the deterioration degree function was proposed by analyzing the operational data between temperature and rotor speed. Second, a probability evaluation method of early deterioration condition was presented. Finally, two cases showed the validity of the proposed probability evaluation method in detecting early deterioration condition and in tracking further deterioration of the critical components.

  8. A Novel Adaptive Conditional Probability-Based Predicting Model for User’s Personality Traits

    Directory of Open Access Journals (Sweden)

    Mengmeng Wang

    2015-01-01

    With the pervasive increase in social media use, the explosion of user-generated data provides a potentially rich source of information, which plays an important role in helping online researchers understand users' behaviors deeply. Since users' personality traits are the driving force of their behaviors, in this paper, along with social network features, we first extract linguistic features, emotional statistical features, and topic features from users' Facebook status updates, and then quantify the importance of features via the Kendall correlation coefficient. Then, on the basis of weighted features and dynamically updated thresholds of personality traits, we deploy a novel adaptive conditional probability-based predicting model which considers prior knowledge of correlations between a user's personality traits to predict the user's Big Five personality traits. In the experimental work, we explore the existence of correlations between users' personality traits, which provides theoretical support for our proposed method. Moreover, on the same Facebook dataset, compared to other methods, our method can achieve an F1-measure of 80.6% when taking into account correlations between personality traits, an impressive improvement of 5.8% over other approaches.

  9. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C. [Arizona State Univ., Tempe, AZ (United States)

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S{sub st}, S{sub st}, p{sub st}) for stochastic uncertainty, a probability space (S{sub su}, S{sub su}, p{sub su}) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S{sub st}, S{sub st}, p{sub st}) and (S{sub su}, S{sub su}, p{sub su}). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.

  10. Conditional Probability Analysis: A Statistical Tool for Environmental Analysis.

    Science.gov (United States)

    The use and application of environmental conditional probability analysis (CPA) is relatively recent. The first presentation using CPA was made in 2002 at the New England Association of Environmental Biologists Annual Meeting in Newport, Rhode Island. CPA has been used since the...
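In this spirit, a conditional probability of ecological impairment given stressor exceedance can be estimated directly from paired site observations; the data below are hypothetical:

```python
def conditional_probability(events, condition):
    """Estimate P(event | condition) from paired boolean observations."""
    both = sum(1 for e, c in zip(events, condition) if e and c)
    cond = sum(1 for c in condition if c)
    return both / cond if cond else float("nan")

# Hypothetical paired monitoring data: was the site biologically impaired,
# and did the stressor (e.g. a sediment metric) exceed its threshold?
stressor_high = [True, True, False, True, False, False, True, False]
impaired = [True, False, False, True, False, True, True, False]

p_impaired_given_stress = conditional_probability(impaired, stressor_high)  # 3/4
```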

  11. Hawkes-diffusion process and the conditional probability of defaults in the Eurozone

    Science.gov (United States)

    Kim, Jungmu; Park, Yuen Jung; Ryu, Doojin

    2016-05-01

    This study examines market information embedded in the European sovereign CDS (credit default swap) market by analyzing the sovereign CDSs of 13 Eurozone countries from January 1, 2008, to February 29, 2012, which includes the recent Eurozone debt crisis period. We design the conditional probability of defaults for the CDS prices based on the Hawkes-diffusion process and obtain the theoretical prices of CDS indexes. To estimate the model parameters, we calibrate the model prices to empirical prices obtained from individual sovereign CDS term structure data. The estimated parameters clearly explain both cross-sectional and time-series data. Our empirical results show that the probability of a huge loss event sharply increased during the Eurozone debt crisis, indicating a contagion effect. Even countries with strong and stable economies, such as Germany and France, suffered from the contagion effect. We also find that the probability of small events is sensitive to the state of the economy, spiking several times due to the global financial crisis and the Greek government debt crisis.
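The contagion ingredient of such a model can be illustrated with a univariate self-exciting (Hawkes) intensity, where each default event raises the conditional intensity of further events before it decays; the exponential kernel and all parameter values below are illustrative, not the calibrated model:

```python
import math

def hawkes_intensity(t, event_times, mu, alpha, beta):
    """lambda(t) = mu + sum over past events of alpha * exp(-beta * (t - t_i))."""
    return mu + sum(alpha * math.exp(-beta * (t - ti))
                    for ti in event_times if ti < t)

events = [1.0, 1.5]  # hypothetical event times
base = hawkes_intensity(0.5, events, mu=0.2, alpha=0.8, beta=2.0)     # before any event
excited = hawkes_intensity(2.0, events, mu=0.2, alpha=0.8, beta=2.0)  # after both events
```

The jump-then-decay behaviour of `excited` relative to `base` is what lets the model capture clustered defaults during a crisis.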

  12. [Biometric bases: basic concepts of probability calculation].

    Science.gov (United States)

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The bases of event algebra, the definition of probability, the classical probability model and the random variable are presented.
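The classical probability model summarized above (favourable outcomes over equally likely outcomes) can be made concrete with a two-dice sample space:

```python
from fractions import Fraction
from itertools import product

# Sample space: all 36 equally likely outcomes of rolling two fair dice.
omega = list(product(range(1, 7), repeat=2))

def prob(event):
    """Classical probability: favourable outcomes / all outcomes."""
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

p_sum_seven = prob(lambda w: w[0] + w[1] == 7)  # 6/36 = 1/6
p_doubles = prob(lambda w: w[0] == w[1])        # 6/36 = 1/6
```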

  13. An Alternative Teaching Method of Conditional Probabilities and Bayes' Rule: An Application of the Truth Table

    Science.gov (United States)

    Satake, Eiki; Vashlishan Murray, Amy

    2015-01-01

    This paper presents a comparison of three approaches to the teaching of probability to demonstrate how the truth table of elementary mathematical logic can be used to teach the calculations of conditional probabilities. Students are typically introduced to the topic of conditional probabilities--especially the ones that involve Bayes' rule--with…
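The truth-table idea can be sketched numerically: list the probabilities of the four conjunctions of two propositions, then read conditional probabilities, and hence Bayes' rule, directly off the table. The numbers below are a hypothetical diagnostic example, not the paper's materials:

```python
# Probabilities of the four rows of the truth table for (disease D, positive test T).
rows = {
    (True, True): 0.045,    # D and T
    (True, False): 0.005,   # D and not T
    (False, True): 0.095,   # not D and T
    (False, False): 0.855,  # not D and not T
}

def p(pred):
    """Probability of any event defined as a predicate over table rows."""
    return sum(q for row, q in rows.items() if pred(row))

# Bayes' rule falls out of the table: P(D | T) = P(D and T) / P(T).
p_positive = p(lambda r: r[1])
p_disease_given_positive = p(lambda r: r[0] and r[1]) / p_positive
```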

  14. A Semi-Continuous State-Transition Probability HMM-Based Voice Activity Detector

    Directory of Open Access Journals (Sweden)

    H. Othman

    2007-02-01

    Full Text Available We introduce an efficient hidden Markov model-based voice activity detection (VAD algorithm with time-variant state-transition probabilities in the underlying Markov chain. The transition probabilities vary in an exponential charge/discharge scheme and are softly merged with state conditional likelihood into a final VAD decision. Working in the domain of ITU-T G.729 parameters, with no additional cost for feature extraction, the proposed algorithm significantly outperforms G.729 Annex B VAD while providing a balanced tradeoff between clipping and false detection errors. The performance compares very favorably with the adaptive multirate VAD, option 2 (AMR2.
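The time-variant transition probabilities can be sketched as a first-order exponential charge/discharge recursion toward a target value; the paper's exact scheme, targets, and time constants are not reproduced here, so the constants below are assumptions:

```python
import math

def update_transition_prob(p_prev, target, tau):
    """One exponential charge/discharge step of a transition probability
    toward `target` with time constant `tau` (in frames)."""
    decay = math.exp(-1.0 / tau)
    return target + (p_prev - target) * decay

# Speech onset: the probability of staying in the speech state charges up.
p_stay_speech = 0.5
for _ in range(10):
    p_stay_speech = update_transition_prob(p_stay_speech, target=0.99, tau=2.0)
```

Charging and discharging use the same recursion; only the target and time constant change, e.g. a slower discharge toward a low target delays clipping of trailing speech.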

  16. Risk Probability Estimating Based on Clustering

    DEFF Research Database (Denmark)

    Chen, Yong; Jensen, Christian D.; Gray, Elizabeth

    2003-01-01

    …of prior experiences, recommendations from a trusted entity or the reputation of the other entity. In this paper we propose a dynamic mechanism for estimating the risk probability of a certain interaction in a given environment using hybrid neural networks. We argue that traditional risk assessment models from the insurance industry do not directly apply to ubiquitous computing environments. Instead, we propose a dynamic mechanism for risk assessment, which is based on pattern matching, classification and prediction procedures. This mechanism uses an estimator of risk probability, which is based…

  17. Conditional Probabilities of Large Earthquake Sequences in California from the Physics-based Rupture Simulator RSQSim

    Science.gov (United States)

    Gilchrist, J. J.; Jordan, T. H.; Shaw, B. E.; Milner, K. R.; Richards-Dinger, K. B.; Dieterich, J. H.

    2017-12-01

    Within the SCEC Collaboratory for Interseismic Simulation and Modeling (CISM), we are developing physics-based forecasting models for earthquake ruptures in California. We employ the 3D boundary element code RSQSim (Rate-State Earthquake Simulator of Dieterich & Richards-Dinger, 2010) to generate synthetic catalogs with tens of millions of events that span up to a million years each. This code models rupture nucleation by rate- and state-dependent friction and Coulomb stress transfer in complex, fully interacting fault systems. The Uniform California Earthquake Rupture Forecast Version 3 (UCERF3) fault and deformation models are used to specify the fault geometry and long-term slip rates. We have employed the Blue Waters supercomputer to generate long catalogs of simulated California seismicity from which we calculate the forecasting statistics for large events. We have performed probabilistic seismic hazard analysis with RSQSim catalogs that were calibrated with system-wide parameters and found a remarkably good agreement with UCERF3 (Milner et al., this meeting). We build on this analysis, comparing the conditional probabilities of sequences of large events from RSQSim and UCERF3. In making these comparisons, we consider the epistemic uncertainties associated with the RSQSim parameters (e.g., rate- and state-frictional parameters), as well as the effects of model-tuning (e.g., adjusting the RSQSim parameters to match UCERF3 recurrence rates). The comparisons illustrate how physics-based rupture simulators might assist forecasters in understanding the short-term hazards of large aftershocks and multi-event sequences associated with complex, multi-fault ruptures.

  18. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  19. Using Dynamic Geometry Software for Teaching Conditional Probability with Area-Proportional Venn Diagrams

    Science.gov (United States)

    Radakovic, Nenad; McDougall, Douglas

    2012-01-01

    This classroom note illustrates how dynamic visualization can be used to teach conditional probability and Bayes' theorem. There are two features of the visualization that make it an ideal pedagogical tool in probability instruction. The first feature is the use of area-proportional Venn diagrams that, along with showing qualitative relationships,…

  20. Class dependency of fuzzy relational database using relational calculus and conditional probability

    Science.gov (United States)

    Deni Akbar, Mohammad; Mizoguchi, Yoshihiro; Adiwijaya

    2018-03-01

    In this paper, we propose a design of a fuzzy relational database that handles a conditional probability relation using fuzzy relational calculus. Previous research has investigated equivalence classes in fuzzy databases using similarity or approximate relations, and it is an interesting topic to investigate fuzzy dependency using equivalence classes. Our goal is to introduce a formulation of a fuzzy relational database model using the relational calculus on the category of fuzzy relations. We also introduce general formulas of the relational calculus for database operations such as ‘projection’, ‘selection’, ‘injection’ and ‘natural join’. Using the fuzzy relational calculus and conditional probabilities, we introduce the notions of equivalence class, redundancy, and dependency in the theory of fuzzy relational databases.
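A conditional probability relation between fuzzy tuples is commonly defined as |a ∩ b| / |b|, with min for intersection and the sigma-count for cardinality; a small sketch under that convention (the attribute values are hypothetical):

```python
def conditional_prob_relation(a, b):
    """P(a | b) for fuzzy sets over the same domain:
    sigma-count of min(a, b) divided by the sigma-count of b."""
    intersection = sum(min(a[k], b[k]) for k in b)
    size_b = sum(b.values())
    return intersection / size_b if size_b else 0.0

# Hypothetical fuzzy values of an "age" attribute for two records.
a = {"young": 0.8, "middle": 0.3, "old": 0.0}
b = {"young": 0.6, "middle": 0.5, "old": 0.1}

p_a_given_b = conditional_prob_relation(a, b)  # 0.9 / 1.2
p_b_given_a = conditional_prob_relation(b, a)  # 0.9 / 1.1
```

Unlike a symmetric similarity relation, this measure is asymmetric, which is what lets it express dependency between tuples.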

  1. Estimating the Probability of Traditional Copying, Conditional on Answer-Copying Statistics.

    Science.gov (United States)

    Allen, Jeff; Ghattas, Andrew

    2016-06-01

    Statistics for detecting copying on multiple-choice tests produce p values measuring the probability of a value at least as large as that observed, under the null hypothesis of no copying. The posterior probability of copying is arguably more relevant than the p value, but cannot be derived from Bayes' theorem unless the population probability of copying and probability distribution of the answer-copying statistic under copying are known. In this article, the authors develop an estimator for the posterior probability of copying that is based on estimable quantities and can be used with any answer-copying statistic. The performance of the estimator is evaluated via simulation, and the authors demonstrate how to apply the formula using actual data. Potential uses, generalizability to other types of cheating, and limitations of the approach are discussed.
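The Bayesian structure of such an estimator can be sketched as follows; the three inputs correspond to the estimable quantities the abstract mentions (population copying rate, detection power of the statistic under copying, and the p value), but this is the generic Bayes'-theorem form with hypothetical numbers, not the authors' exact estimator:

```python
def posterior_copying(prior, power, p_value):
    """P(copying | statistic >= s) by Bayes' theorem, where
    prior   = population probability of copying,
    power   = P(statistic >= s | copying),
    p_value = P(statistic >= s | no copying)."""
    numerator = prior * power
    return numerator / (numerator + (1.0 - prior) * p_value)

post = posterior_copying(prior=0.002, power=0.60, p_value=0.001)
```

With copying this rare, a p value of 0.001 yields a posterior of only about 0.55, which is why the posterior is arguably more relevant than the p value.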

  2. Fixation probability on clique-based graphs

    Science.gov (United States)

    Choi, Jeong-Ok; Yu, Unjong

    2018-02-01

    The fixation probability of a mutant in the evolutionary dynamics of Moran process is calculated by the Monte-Carlo method on a few families of clique-based graphs. It is shown that the complete suppression of fixation can be realized with the generalized clique-wheel graph in the limit of small wheel-clique ratio and infinite size. The family of clique-star is an amplifier, and clique-arms graph changes from amplifier to suppressor as the fitness of the mutant increases. We demonstrate that the overall structure of a graph can be more important to determine the fixation probability than the degree or the heat heterogeneity. The dependence of the fixation probability on the position of the first mutant is discussed.
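The Monte-Carlo estimation of fixation probability can be sketched for the simplest case, a complete graph, where the Moran-process result (1 - 1/r)/(1 - 1/r^n) is known in closed form and serves as a check; the clique-based families from the paper (clique-wheel, clique-star, clique-arms) are not reproduced here:

```python
import random

def fixation_probability(n, r, trials, seed=1):
    """Monte-Carlo fixation probability of a single mutant of fitness r
    in a Moran process on a complete graph with n nodes."""
    rng = random.Random(seed)
    fixed = 0
    for _ in range(trials):
        m = 1  # number of mutants
        while 0 < m < n:
            # Reproducer chosen proportional to fitness; its offspring
            # replaces a uniformly random neighbour (everyone is adjacent).
            if rng.random() < m * r / (m * r + (n - m)):
                if rng.random() < (n - m) / (n - 1):  # offspring replaces a resident
                    m += 1
            else:
                if rng.random() < m / (n - 1):        # offspring replaces a mutant
                    m -= 1
        fixed += (m == n)
    return fixed / trials

estimate = fixation_probability(n=10, r=2.0, trials=2000)
exact = (1 - 1 / 2.0) / (1 - 1 / 2.0 ** 10)  # closed form for the complete graph
```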

  3. Secondary School Students' Reasoning about Conditional Probability, Samples, and Sampling Procedures

    Science.gov (United States)

    Prodromou, Theodosia

    2016-01-01

    In the Australian mathematics curriculum, Year 12 students (aged 16-17) are asked to solve conditional probability problems that involve the representation of the problem situation with two-way tables or three-dimensional diagrams and consider sampling procedures that result in different correct answers. In a small exploratory study, we…

  4. Conditional probability of intense rainfall producing high ground concentrations from radioactive plumes

    International Nuclear Information System (INIS)

    Wayland, J.R.

    1977-03-01

    The overlap of the expanding plume of radioactive material from a hypothetical nuclear accident with rainstorms over dense population areas is considered. The conditional probability of the occurrence of hot spots from intense cellular rainfall is presented

  5. Determination of the failure probability in the weld region of ap-600 vessel for transient condition

    International Nuclear Information System (INIS)

    Wahyono, I.P.

    1997-01-01

    The failure probability in the weld region of the AP-600 vessel was determined for a transient condition scenario. The transient considered is an increase in heat removal from the primary cooling system due to sudden opening of safety valves or steam relief valves on the secondary cooling system or the steam generator. Temperature and pressure in the vessel were taken as the basis of the deterministic calculation of the stress intensity factor. The film coefficient of convective heat transfer is calculated as a function of transient time and water parameters. Pressure, material temperature, flaw depth and transient time are the variables for the stress intensity factor. The failure probability was evaluated using the above information together with the flaw and probability distributions of Octavia II and Marshall. The failure probability was calculated by probabilistic fracture mechanics simulation applied to the weld region. Failure of the vessel is assumed to be failure of the weld material containing one crack for which the applied stress intensity factor exceeds the critical stress intensity factor. The VISA II code (Vessel Integrity Simulation Analysis II) was used for the deterministic calculation and simulation. The failure probability of the material is 1E-5 for the Octavia II distribution and 4E-6 for the Marshall distribution for each postulated transient event. The failure occurred at the 1.7th minute of the initial transient under a pressure of 12.53 ksi.
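The probabilistic-fracture-mechanics step amounts to sampling flaw and material variables, computing the stress intensity factor, and counting cases where it exceeds the toughness. The sketch below uses the elementary surface-flaw formula K_I = 1.12·σ·√(πa) and illustrative distributions; none of the constants are the VISA II inputs:

```python
import math
import random

def failure_probability(trials=10000, seed=7):
    """Monte-Carlo estimate of P(K_I > K_IC) for a single-crack weld region.
    All distributions and constants are illustrative only."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        stress = rng.gauss(150.0, 20.0)             # MPa, transient hoop stress
        depth = rng.expovariate(1.0 / 20.0) / 1000  # m, exponential flaw depth (mean 20 mm)
        k_ic = rng.gauss(60.0, 10.0)                # MPa*sqrt(m), fracture toughness
        k_i = 1.12 * stress * math.sqrt(math.pi * depth)  # surface-flaw SIF
        failures += (k_i > k_ic)
    return failures / trials

p_f = failure_probability()
```

A production analysis would, as in the abstract, tie the stress and toughness to the transient time history and to a calibrated flaw-depth distribution such as Octavia II or Marshall.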

  6. The effect of conditional probability of chord progression on brain response: an MEG study.

    Directory of Open Access Journals (Sweden)

    Seung-Goo Kim

    Full Text Available BACKGROUND: Recent electrophysiological and neuroimaging studies have explored how and where musical syntax in Western music is processed in the human brain. An inappropriate chord progression elicits an event-related potential (ERP) component called an early right anterior negativity (ERAN) or simply an early anterior negativity (EAN) in an early stage of processing the musical syntax. Though the possible underlying mechanism of the EAN is assumed to be probabilistic learning, the effect of the probability of chord progressions on the EAN response has not been previously explored explicitly. METHODOLOGY/PRINCIPAL FINDINGS: In the present study, the empirical conditional probabilities in a Western music corpus were employed as an approximation of the frequencies in participants' previous exposure. Three types of chord progression were presented to musicians and non-musicians in order to examine the correlation between the probability of chord progression and the neuromagnetic response using magnetoencephalography (MEG). Chord progressions were found to elicit early responses that correlated negatively with the conditional probability. Observed EANm (the magnetic counterpart of the EAN component) responses were consistent with previously reported EAN responses in terms of latency and location. The effect of conditional probability interacted with the effect of musical training. In addition, the neural response also correlated with the behavioral measures in the non-musicians. CONCLUSIONS/SIGNIFICANCE: Our study is the first to reveal the correlation between the probability of chord progression and the corresponding neuromagnetic response. The current results suggest that the physiological response is a reflection of the probabilistic representations of the musical syntax. Moreover, the results indicate that the probabilistic representation is related to musical training as well as the sensitivity of an individual.
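The empirical conditional probability of a chord progression is, in its simplest bigram form, P(next chord | current chord) counted over a corpus. A toy sketch (the three-progression corpus is made up, not the Western-music corpus of the study):

```python
from collections import Counter

def chord_bigram_probs(progressions):
    """Empirical P(next chord | current chord) from a corpus of progressions."""
    pair_counts, chord_counts = Counter(), Counter()
    for seq in progressions:
        for cur, nxt in zip(seq, seq[1:]):
            pair_counts[(cur, nxt)] += 1
            chord_counts[cur] += 1
    return {pair: c / chord_counts[pair[0]] for pair, c in pair_counts.items()}

corpus = [
    ["I", "IV", "V", "I"],
    ["I", "V", "I"],
    ["I", "IV", "I"],
]
probs = chord_bigram_probs(corpus)  # e.g. P(I | V) = 1.0, P(IV | I) = 2/3
```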

  7. Hamiltonian theories quantization based on a probability operator

    International Nuclear Information System (INIS)

    Entral'go, E.E.

    1986-01-01

    The quantization method based on a linear mapping of classical coordinate-momentum-time functions Λ(q,p,t) to quantum operators in a space of quantum states ψ is considered. The probability operator satisfies a system of equations representing the principles of dynamical and canonical correspondence between the classical and quantum theories. Quantization based on a probability operator leads to a quantum theory with a nonnegative joint coordinate-momentum distribution function for any state ψ. The main consequences of quantum mechanics with a probability operator are discussed in comparison with the generally accepted quantum and classical theories. It is shown that a probability operator leads to the appearance of some new notions, called ''subquantum'' ones. Hence the quantum theory with a probability operator does not pretend to a complete description of physical reality in terms of classical variables and for this reason contains no problems like the Einstein-Podolsky-Rosen paradox. The results for some concrete problems are given: a free particle, a harmonic oscillator, and an electron in the Coulomb field. These results give hope for the possibility of an experimental verification of quantization based on a probability operator.

  8. Evidence reasoning method for constructing conditional probability tables in a Bayesian network of multimorbidity.

    Science.gov (United States)

    Du, Yuanwei; Guo, Yubin

    2015-01-01

    The intrinsic mechanism of multimorbidity is difficult to recognize, and prediction and diagnosis are accordingly difficult to carry out. Bayesian networks can help to diagnose multimorbidity in health care, but it is difficult to obtain the conditional probability table (CPT) because of the lack of clinical statistical data. Today, expert knowledge and experience are increasingly used in training Bayesian networks in order to help predict or diagnose diseases, but the CPT in Bayesian networks is usually irrational or ineffective because realistic constraints are ignored, especially in multimorbidity. In order to solve these problems, an evidence reasoning (ER) approach is employed to extract and fuse inference data from experts using a belief distribution and a recursive ER algorithm, on the basis of which an evidence reasoning method for constructing conditional probability tables in a Bayesian network of multimorbidity is presented step by step. A multimorbidity numerical example is used to demonstrate the method and show its feasibility and applicability. The Bayesian network can be determined as long as the inference assessment is provided by each expert according to his/her knowledge or experience. Our method is more effective than existing methods at extracting expert inference data accurately and fusing it effectively for constructing CPTs in a Bayesian network of multimorbidity.
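At its core, the construction fills each CPT row by fusing expert belief distributions. The sketch below uses a simple weight-normalized linear pool in place of the paper's recursive ER algorithm (which also handles ignorance mass); the experts, weights, and numbers are hypothetical:

```python
def fuse_beliefs(expert_beliefs, weights):
    """Fuse expert belief distributions over the same states into one CPT row
    using a weighted linear pool (a simplification of recursive ER fusion)."""
    states = expert_beliefs[0].keys()
    total = sum(weights)
    return {s: sum(w * b[s] for w, b in zip(weights, expert_beliefs)) / total
            for s in states}

# Two hypothetical experts assess P(disease B | disease A present).
experts = [
    {"present": 0.7, "absent": 0.3},
    {"present": 0.5, "absent": 0.5},
]
cpt_row = fuse_beliefs(experts, weights=[0.6, 0.4])  # weights reflect expertise
```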

  9. Rainfall and net infiltration probabilities for future climate conditions at Yucca Mountain

    International Nuclear Information System (INIS)

    Long, A.; Childs, S.W.

    1993-01-01

    Performance assessment of repository integrity is a difficult task because it requires predicting the future. This challenge has occupied many scientists who realize that the best assessments are required to maximize the probability of successful repository siting and design. As part of a performance assessment effort directed by the EPRI, the authors have used probabilistic methods to assess the magnitude and timing of net infiltration at Yucca Mountain. A previously published mathematical model for net infiltration incorporated a probabilistic treatment of climate, surface hydrologic processes and a mathematical model of the infiltration process. In this paper, we present the details of the climatological analysis. The precipitation model is event-based, simulating characteristics of modern rainfall near Yucca Mountain and then extending the model to the most likely values for different degrees of pluvial climate. Next, the precipitation event model is fed into a process-based infiltration model that considers spatial variability in parameters relevant to net infiltration at Yucca Mountain. The model predicts that average annual net infiltration at Yucca Mountain will range from a mean of about 1 mm under present climatic conditions to a mean of at least 2.4 mm under full glacial (pluvial) conditions. Considerable variation about these means is expected to occur from year to year.

  10. Quantum-correlation breaking channels, quantum conditional probability and Perron-Frobenius theory

    Science.gov (United States)

    Chruściński, Dariusz

    2013-03-01

    Using the quantum analog of conditional probability and classical Bayes theorem we discuss some aspects of particular entanglement breaking channels: quantum-classical and classical-classical channels. Applying the quantum analog of Perron-Frobenius theorem we generalize the recent result of Korbicz et al. (2012) [8] on full and spectrum broadcasting from quantum-classical channels to arbitrary quantum channels.

  12. Probability based load factors for design of concrete containment structures

    International Nuclear Information System (INIS)

    Hwang, H.; Kagami, S.; Reich, M.; Ellingwood, B.; Shinozuka, M.

    1985-01-01

    This paper describes a procedure for developing probability-based load combinations for the design of concrete containments. The proposed criteria are in a load and resistance factor design (LRFD) format. The load factors and resistance factors are derived for use in limit states design and are based on a target limit state probability. In this paper, the load factors for accident pressure and safe shutdown earthquake are derived for three target limit state probabilities. Other load factors are recommended on the basis of prior experience with probability-based design criteria for ordinary building construction. 6 refs

  13. Eliciting conditional and unconditional rank correlations from conditional probabilities

    International Nuclear Information System (INIS)

    Morales, O.; Kurowicka, D.; Roelen, A.

    2008-01-01

    Causes of uncertainties may be interrelated and may introduce dependencies. Ignoring these dependencies may lead to large errors. A number of graphical models in probability theory such as dependence trees, vines and (continuous) Bayesian belief nets [Cooke RM. Markov and entropy properties of tree and vine-dependent variables. In: Proceedings of the ASA section on Bayesian statistical science, 1997; Kurowicka D, Cooke RM. Distribution-free continuous Bayesian belief nets. In: Proceedings of mathematical methods in reliability conference, 2004; Bedford TJ, Cooke RM. Vines: a new graphical model for dependent random variables. Ann Stat 2002; 30(4):1031-68; Kurowicka D, Cooke RM. Uncertainty analysis with high dimensional dependence modelling. New York: Wiley; 2006; Hanea AM, et al. Hybrid methods for quantifying and analyzing Bayesian belief nets. In: Proceedings of the 2005 ENBIS5 conference, 2005; Shachter RD, Kenley CR. Gaussian influence diagrams. Manage Sci 1998; 35(5).] have been developed to capture dependencies between random variables. The input for these models are various marginal distributions and dependence information, usually in the form of conditional rank correlations. Often expert elicitation is required. This paper focuses on dependence representation, and dependence elicitation. The techniques presented are illustrated with an application from aviation safety

  14. Conditional Independence in Applied Probability.

    Science.gov (United States)

    Pfeiffer, Paul E.

    This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…

  15. Multiregion, multigroup collision probability method with white boundary condition for light water reactor thermalization calculations

    International Nuclear Information System (INIS)

    Ozgener, B.; Ozgener, H.A.

    2005-01-01

    A multiregion, multigroup collision probability method with white boundary condition is developed for thermalization calculations of light water moderated reactors. Hydrogen scatterings are treated by Nelkin's kernel while scatterings from other nuclei are assumed to obey the free-gas scattering kernel. The isotropic return (white) boundary condition is applied directly by using the appropriate collision probabilities. Comparisons with alternate numerical methods show the validity of the present formulation. Comparisons with some experimental results indicate that the present formulation is capable of calculating disadvantage factors which are closer to the experimental results than alternative methods

  16. Probability matching in perceptrons: Effects of conditional dependence and linear nonseparability.

    Directory of Open Access Journals (Sweden)

    Michael R W Dawson

    Full Text Available Probability matching occurs when the behavior of an agent matches the likelihood of occurrence of events in the agent's environment. For instance, when artificial neural networks match probability, the activity in their output unit equals the past probability of reward in the presence of a stimulus. Our previous research demonstrated that simple artificial neural networks (perceptrons, which consist of a set of input units directly connected to a single output unit) learn to match probability when presented different cues in isolation. The current paper extends this research by showing that perceptrons can match probabilities when presented simultaneous cues, with each cue signaling different reward likelihoods. In our first simulation, we presented up to four different cues simultaneously; the likelihood of reward signaled by the presence of one cue was independent of the likelihood of reward signaled by other cues. Perceptrons learned to match reward probabilities by treating each cue as an independent source of information about the likelihood of reward. In a second simulation, we violated the independence between cues by making some reward probabilities depend upon cue interactions. We did so by basing reward probabilities on a logical combination (AND or XOR) of two of the four possible cues. We also varied the size of the reward associated with the logical combination. We discovered that this latter manipulation was a much better predictor of perceptron performance than was the logical structure of the interaction between cues. This indicates that when perceptrons learn to match probabilities, they do so by assuming that each signal of a reward is independent of any other; the best predictor of perceptron performance is a quantitative measure of the independence of these input signals, and not the logical structure of the problem being learned.
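The probability-matching behaviour described in this abstract can be reproduced with a minimal one-cue network trained by the delta rule; the learning rate, epoch count, and reward probability below are illustrative, not the simulation settings of the paper:

```python
import random

def train_output(p_reward, epochs=5000, lr=0.01, seed=3):
    """Train a single linear output unit on one always-present cue with the
    delta rule; its activity converges toward the reward probability."""
    rng = random.Random(seed)
    w = 0.0  # connection weight = output activity (input fixed at 1)
    for _ in range(epochs):
        target = 1.0 if rng.random() < p_reward else 0.0  # stochastic reward
        w += lr * (target - w)  # delta rule
    return w

output = train_output(p_reward=0.7)  # settles near 0.7: probability matching
```

With multiple simultaneous cues, the same rule drives each weight toward that cue's marginal reward contribution, which is the independence assumption the paper identifies.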

  18. Probability matching in perceptrons: Effects of conditional dependence and linear nonseparability.

    Science.gov (United States)

    Dawson, Michael R W; Gupta, Maya

    2017-01-01

    Probability matching occurs when the behavior of an agent matches the likelihood of occurrence of events in the agent's environment. For instance, when artificial neural networks match probability, the activity in their output unit equals the past probability of reward in the presence of a stimulus. Our previous research demonstrated that simple artificial neural networks (perceptrons, which consist of a set of input units directly connected to a single output unit) learn to match probability when presented different cues in isolation. The current paper extends this research by showing that perceptrons can match probabilities when presented simultaneous cues, with each cue signaling different reward likelihoods. In our first simulation, we presented up to four different cues simultaneously; the likelihood of reward signaled by the presence of one cue was independent of the likelihood of reward signaled by other cues. Perceptrons learned to match reward probabilities by treating each cue as an independent source of information about the likelihood of reward. In a second simulation, we violated the independence between cues by making some reward probabilities depend upon cue interactions. We did so by basing reward probabilities on a logical combination (AND or XOR) of two of the four possible cues. We also varied the size of the reward associated with the logical combination. We discovered that this latter manipulation was a much better predictor of perceptron performance than was the logical structure of the interaction between cues. This indicates that when perceptrons learn to match probabilities, they do so by assuming that each signal of a reward is independent of any other; the best predictor of perceptron performance is a quantitative measure of the independence of these input signals, and not the logical structure of the problem being learned.
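The matching behaviour described in these two records can be sketched with a delta-rule unit: a single weight trained on a binary reward signal settles near the reward probability rather than at 0 or 1. This is an illustrative reconstruction, not the authors' simulation code; the learning rate and trial count are arbitrary choices.

```python
import random

def train_matching_unit(p_reward, trials=20000, lr=0.01, seed=0):
    """Train a single output unit on one always-on cue with the delta rule.

    The unit's activity converges toward the past probability of reward,
    i.e., it 'matches' p_reward instead of maximizing (always predicting 1).
    """
    rng = random.Random(seed)
    w = 0.5  # output for the cue (a single bias-like weight)
    for _ in range(trials):
        r = 1.0 if rng.random() < p_reward else 0.0
        w += lr * (r - w)  # delta rule: nudge output toward observed reward
    return w

activity = train_matching_unit(0.7)  # settles near 0.7, not at 1.0
```

With several cues presented simultaneously, the same rule applied to a weight per cue yields the independent-sources behaviour the paper reports.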

  19. Schema bias in source monitoring varies with encoding conditions: support for a probability-matching account.

    Science.gov (United States)

    Kuhlmann, Beatrice G; Vaterrodt, Bianca; Bayen, Ute J

    2012-09-01

    Two experiments examined reliance on schematic knowledge in source monitoring. Based on a probability-matching account of source guessing, a schema bias will only emerge if participants do not have a representation of the source-item contingency in the study list, or if the perceived contingency is consistent with schematic expectations. Thus, the account predicts that encoding conditions that affect contingency detection also affect schema bias. In Experiment 1, the schema bias commonly found when schematic information about the sources is not provided before encoding was diminished by an intentional source-memory instruction. In Experiment 2, the depth of processing of schema-consistent and schema-inconsistent source-item pairings was manipulated. Participants consequently overestimated the occurrence of the pairing type they processed in a deep manner, and their source guessing reflected this biased contingency perception. Results support the probability-matching account of source guessing. PsycINFO Database Record (c) 2012 APA, all rights reserved.

  20. The probability estimate of the defects of the asynchronous motors based on the complex method of diagnostics

    Science.gov (United States)

    Zhukovskiy, Yu L.; Korolev, N. A.; Babanova, I. S.; Boikov, A. V.

    2017-10-01

This article is devoted to the development of a method for the probability estimation of failure of an asynchronous motor as a part of an electric drive with a frequency converter. The proposed method is based on a comprehensive method of diagnostics of vibration and electrical characteristics that takes into account the quality of the supply network and the operating conditions. The developed diagnostic system makes it possible to increase the accuracy and quality of diagnoses by determining the probability of failure-free operation of the electromechanical equipment when its parameters deviate from the norm. The system uses artificial neural networks (ANNs). The results of the system for estimating the technical condition are probability diagrams of the technical state and a quantitative evaluation of the defects of the asynchronous motor and its components.

  1. Heightened fire probability in Indonesia in non-drought conditions: the effect of increasing temperatures

    Science.gov (United States)

    Fernandes, Kátia; Verchot, Louis; Baethgen, Walter; Gutierrez-Velez, Victor; Pinedo-Vasquez, Miguel; Martius, Christopher

    2017-05-01

In Indonesia, drought-driven fires occur typically during the warm phase of the El Niño Southern Oscillation. This was the case for the events of 1997 and 2015 that resulted in months-long hazardous atmospheric pollution levels in Equatorial Asia and record greenhouse gas emissions. Nonetheless, anomalously active fire seasons have also been observed in non-drought years. In this work, we investigated the impact of temperature on fires and found that when the July-October (JASO) period is anomalously dry, the sensitivity of fires to temperature is modest. In contrast, under normal-to-wet conditions, fire probability increases sharply when JASO is anomalously warm. This describes a regime in which an active fire season is not limited to drought years. Greater susceptibility to fires in response to a warmer environment finds support in the high evapotranspiration rates observed in normal-to-wet and warm conditions in Indonesia. We also find that fire probability in wet JASOs would be considerably less sensitive to temperature were it not for the added effect of recent positive trends. Near-term regional climate projections reveal that, despite negligible changes in precipitation, a continuing warming trend will heighten fire probability over the next few decades, especially in non-drought years. Mild fire seasons currently observed in association with wet conditions and cool temperatures will become rare events in Indonesia.

  2. Know the risk, take the win: how executive functions and probability processing influence advantageous decision making under risk conditions.

    Science.gov (United States)

    Brand, Matthias; Schiebener, Johannes; Pertl, Marie-Theres; Delazer, Margarete

    2014-01-01

    Recent models on decision making under risk conditions have suggested that numerical abilities are important ingredients of advantageous decision-making performance, but empirical evidence is still limited. The results of our first study show that logical reasoning and basic mental calculation capacities predict ratio processing and that ratio processing predicts decision making under risk. In the second study, logical reasoning together with executive functions predicted probability processing (numeracy and probability knowledge), and probability processing predicted decision making under risk. These findings suggest that increasing an individual's understanding of ratios and probabilities should lead to more advantageous decisions under risk conditions.

  3. Developing a probability-based model of aquifer vulnerability in an agricultural region

    Science.gov (United States)

    Chen, Shih-Kai; Jang, Cheng-Shin; Peng, Yi-Huei

    2013-04-01

Hydrogeological settings of aquifers strongly influence regional groundwater movement and pollution processes. Establishing a map of aquifer vulnerability is critical for planning a scheme of groundwater quality protection. This study developed a novel probability-based DRASTIC model of aquifer vulnerability in the Choushui River alluvial fan, Taiwan, using indicator kriging to determine various risk categories of contamination potential based on estimated vulnerability indexes. Categories and ratings of the six parameters in the probability-based DRASTIC model were probabilistically characterized according to two parameter classification methods: selecting a maximum estimation probability and calculating an expected value. Moreover, the probability-based estimation and assessment gave an excellent insight into propagating the uncertainty of parameters due to limited observation data. To examine the pollution prediction capacity of the developed probability-based DRASTIC model, the medium, high, and very high risk categories of contamination potential were compared with observed nitrate-N exceeding 0.5 mg/L, which indicates anthropogenic groundwater pollution. The results reveal that the developed probability-based DRASTIC model is capable of predicting high nitrate-N groundwater pollution and characterizing parameter uncertainty via the probability estimation processes.

  4. Base pair probability estimates improve the prediction accuracy of RNA non-canonical base pairs.

    Directory of Open Access Journals (Sweden)

    Michael F Sloma

    2017-11-01

    Full Text Available Prediction of RNA tertiary structure from sequence is an important problem, but generating accurate structure models for even short sequences remains difficult. Predictions of RNA tertiary structure tend to be least accurate in loop regions, where non-canonical pairs are important for determining the details of structure. Non-canonical pairs can be predicted using a knowledge-based model of structure that scores nucleotide cyclic motifs, or NCMs. In this work, a partition function algorithm is introduced that allows the estimation of base pairing probabilities for both canonical and non-canonical interactions. Pairs that are predicted to be probable are more likely to be found in the true structure than pairs of lower probability. Pair probability estimates can be further improved by predicting the structure conserved across multiple homologous sequences using the TurboFold algorithm. These pairing probabilities, used in concert with prior knowledge of the canonical secondary structure, allow accurate inference of non-canonical pairs, an important step towards accurate prediction of the full tertiary structure. Software to predict non-canonical base pairs and pairing probabilities is now provided as part of the RNAstructure software package.

  5. Base pair probability estimates improve the prediction accuracy of RNA non-canonical base pairs.

    Science.gov (United States)

    Sloma, Michael F; Mathews, David H

    2017-11-01

    Prediction of RNA tertiary structure from sequence is an important problem, but generating accurate structure models for even short sequences remains difficult. Predictions of RNA tertiary structure tend to be least accurate in loop regions, where non-canonical pairs are important for determining the details of structure. Non-canonical pairs can be predicted using a knowledge-based model of structure that scores nucleotide cyclic motifs, or NCMs. In this work, a partition function algorithm is introduced that allows the estimation of base pairing probabilities for both canonical and non-canonical interactions. Pairs that are predicted to be probable are more likely to be found in the true structure than pairs of lower probability. Pair probability estimates can be further improved by predicting the structure conserved across multiple homologous sequences using the TurboFold algorithm. These pairing probabilities, used in concert with prior knowledge of the canonical secondary structure, allow accurate inference of non-canonical pairs, an important step towards accurate prediction of the full tertiary structure. Software to predict non-canonical base pairs and pairing probabilities is now provided as part of the RNAstructure software package.

  6. A Hierarchy of Compatibility and Comeasurability Levels in Quantum Logics with Unique Conditional Probabilities

    International Nuclear Information System (INIS)

    Niestegge, Gerd

    2010-01-01

    In the quantum mechanical Hilbert space formalism, the probabilistic interpretation is a later ad-hoc add-on, more or less enforced by the experimental evidence, but not motivated by the mathematical model itself. A model involving a clear probabilistic interpretation from the very beginning is provided by the quantum logics with unique conditional probabilities. It includes the projection lattices in von Neumann algebras and here probability conditionalization becomes identical with the state transition of the Lueders-von Neumann measurement process. This motivates the definition of a hierarchy of five compatibility and comeasurability levels in the abstract setting of the quantum logics with unique conditional probabilities. Their meanings are: the absence of quantum interference or influence, the existence of a joint distribution, simultaneous measurability, and the independence of the final state after two successive measurements from the sequential order of these two measurements. A further level means that two elements of the quantum logic (events) belong to the same Boolean subalgebra. In the general case, the five compatibility and comeasurability levels appear to differ, but they all coincide in the common Hilbert space formalism of quantum mechanics, in von Neumann algebras, and in some other cases. (general)
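The identification of probability conditionalization with the Lueders-von Neumann state transition can be illustrated numerically. The sketch below uses a hypothetical 4-dimensional example (not from the paper): it computes P(F|E) = tr(EρEF)/tr(EρE), and because the chosen projections commute with the state, the result reduces to the classical conditional probability 0.1/0.3.

```python
import numpy as np

# State: a diagonal density matrix on a 4-dimensional Hilbert space.
rho = np.diag([0.1, 0.2, 0.3, 0.4])

# Events as projections: E onto span{|0>,|1>}, F onto span{|0>,|2>}.
E = np.diag([1.0, 1.0, 0.0, 0.0])
F = np.diag([1.0, 0.0, 1.0, 0.0])

def lueders_conditional(rho, E, F):
    """P(F | E) via the Lueders-von Neumann state transition:
    condition rho on E (unnormalized post-measurement state E rho E),
    then evaluate F in that transformed state."""
    post = E @ rho @ E
    return float(np.trace(post @ F)) / float(np.trace(post))

p = lueders_conditional(rho, E, F)  # commuting case: 0.1 / 0.3 = 1/3
```

For non-commuting projections the same formula exhibits the quantum interference effects that separate the compatibility levels discussed in the abstract.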

  7. Joint Probability-Based Neuronal Spike Train Classification

    Directory of Open Access Journals (Sweden)

    Yan Chen

    2009-01-01

Full Text Available Neuronal spike trains are used by the nervous system to encode and transmit information. Euclidean distance-based methods (EDBMs) have been applied to quantify the similarity between temporally-discretized spike trains and model responses. In this study, using the same discretization procedure, we developed and applied a joint probability-based method (JPBM) to classify individual spike trains of slowly adapting pulmonary stretch receptors (SARs). The activity of individual SARs was recorded in anaesthetized, paralysed adult male rabbits, which were artificially ventilated at constant rate and one of three different volumes. Two-thirds of the responses to the 600 stimuli presented at each volume were used to construct three response models (one for each stimulus volume), each consisting of a series of time bins with spike probabilities. The remaining one-third of the responses were used as test responses to be classified into one of the three model responses. This was done by computing the joint probability of observing the same series of events (spikes or no spikes) dictated by the test response in a given model, and determining which probability of the three was highest. The JPBM generally produced better classification accuracy than the EDBM, and both performed well above chance. Both methods were similarly affected by variations in discretization parameters, response epoch duration, and two different response alignment strategies. Increasing bin widths increased classification accuracy, which also improved with increased observation time, but primarily during periods of increasing lung inflation. Thus, the JPBM is a simple and effective method for performing spike train classification.
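The classification step can be sketched as follows: treat each time bin as an independent Bernoulli trial with the model's spike probability, compute the joint (log-)probability of the test train under each model, and assign the train to the most probable model. The per-bin probabilities below are hypothetical, not data from the study.

```python
import math

def joint_log_prob(test, model):
    """Log joint probability of a binary spike train under a model of
    per-bin spike probabilities (independent Bernoulli bins)."""
    lp = 0.0
    for spike, p in zip(test, model):
        lp += math.log(p if spike else 1.0 - p)
    return lp

def classify(test, models):
    """Assign the test train to the model with the highest joint probability."""
    scores = {name: joint_log_prob(test, m) for name, m in models.items()}
    return max(scores, key=scores.get)

# Hypothetical per-bin spike probabilities for three stimulus volumes.
models = {
    "low":    [0.1, 0.2, 0.2, 0.1],
    "medium": [0.3, 0.5, 0.5, 0.3],
    "high":   [0.6, 0.9, 0.9, 0.6],
}
label = classify([1, 1, 1, 0], models)  # three spikes -> "high" model wins
```

Working in log space avoids underflow when the number of bins grows large, which matters at the finer discretizations the study examines.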

  8. Component fragility data base for reliability and probability studies

    International Nuclear Information System (INIS)

    Bandyopadhyay, K.; Hofmayer, C.; Kassier, M.; Pepper, S.

    1989-01-01

    Safety-related equipment in a nuclear plant plays a vital role in its proper operation and control, and failure of such equipment due to an earthquake may pose a risk to the safe operation of the plant. Therefore, in order to assess the overall reliability of a plant, the reliability of performance of the equipment should be studied first. The success of a reliability or a probability study depends to a great extent on the data base. To meet this demand, Brookhaven National Laboratory (BNL) has formed a test data base relating the seismic capacity of equipment specimens to the earthquake levels. Subsequently, the test data have been analyzed for use in reliability and probability studies. This paper describes the data base and discusses the analysis methods. The final results that can be directly used in plant reliability and probability studies are also presented in this paper

  9. Improving information extraction using a probability-based approach

    DEFF Research Database (Denmark)

    Kim, S.; Ahmed, Saeema; Wallace, K.

    2007-01-01

    Information plays a crucial role during the entire life-cycle of a product. It has been shown that engineers frequently consult colleagues to obtain the information they require to solve problems. However, the industrial world is now more transient and key personnel move to other companies...... or retire. It is becoming essential to retrieve vital information from archived product documents, if it is available. There is, therefore, great interest in ways of extracting relevant and sharable information from documents. A keyword-based search is commonly used, but studies have shown...... the recall, while maintaining the high precision, a learning approach that makes identification decisions based on a probability model, rather than simply looking up the presence of the pre-defined variations, looks promising. This paper presents the results of developing such a probability-based entity...

  10. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions, employing appropriate notions of conditional probability and paraconsistent updating via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.

  11. SQERTSS: Dynamic rank based throttling of transition probabilities in kinetic Monte Carlo simulations

    International Nuclear Information System (INIS)

    Danielson, Thomas; Sutton, Jonathan E.; Hin, Céline; Virginia Polytechnic Institute and State University; Savara, Aditya

    2017-01-01

Lattice-based Kinetic Monte Carlo (KMC) simulations offer a powerful simulation technique for investigating large reaction networks while retaining spatial configuration information, unlike ordinary differential equations. However, large chemical reaction networks can contain reaction processes with rates spanning multiple orders of magnitude. This can lead to the problem of “KMC stiffness” (similar to stiffness in differential equations), where the computational expense has the potential to be overwhelmed by very short time-steps during KMC simulations, with the simulation spending an inordinate number of KMC steps / cpu-time simulating fast frivolous processes (FFPs) without progressing the system (reaction network). In order to achieve simulation times that are experimentally relevant or desired for predictions, a dynamic throttling algorithm involving separation of the processes into speed-ranks based on event frequencies has been designed and implemented with the intent of decreasing the probability of FFP events, and increasing the probability of slow process events -- allowing rate-limiting events to become more likely to be observed in KMC simulations. This Staggered Quasi-Equilibrium Rank-based Throttling for Steady-state (SQERTSS) algorithm is designed for use in achieving and simulating steady-state conditions in KMC simulations. Lastly, as shown in this work, the SQERTSS algorithm also works for transient conditions: the correct configuration space and final state will still be achieved if the required assumptions are not violated, with the caveat that the sizes of the time-steps may be distorted during the transient period.
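The stiffness problem and the effect of throttling can be illustrated with simple arithmetic: in rejection-free KMC the next event is selected with probability proportional to its rate, so FFPs dominate the event selection until their rates are scaled down. The rates and the fixed throttling factor below are hypothetical; SQERTSS assigns factors dynamically by speed-rank rather than using one fixed divisor.

```python
def selection_probabilities(rates):
    """In rejection-free KMC, the next event is chosen with probability
    proportional to its rate (rate / total rate)."""
    total = sum(rates.values())
    return {name: r / total for name, r in rates.items()}

# Two fast frivolous processes (FFPs) vs. one rate-limiting slow process.
rates = {"fast_hop": 1e6, "fast_swap": 1e6, "slow_reaction": 1.0}
before = selection_probabilities(rates)["slow_reaction"]   # ~5e-7

# Throttle: scale the quasi-equilibrated fast processes down by a factor,
# making the slow (rate-limiting) event far more likely to be selected.
throttled = dict(rates, fast_hop=1e6 / 1e4, fast_swap=1e6 / 1e4)
after = selection_probabilities(throttled)["slow_reaction"]  # ~5e-3
```

Roughly four orders of magnitude fewer KMC steps are then spent on FFPs per slow event, which is the computational saving the algorithm targets.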

  12. Probabilities on Streams and Reflexive Games

    Directory of Open Access Journals (Sweden)

    Andrew Schumann

    2014-01-01

Full Text Available Probability measures on streams (e.g., on hypernumbers and p-adic numbers) have been defined. It was shown that these probabilities can be used for simulations of reflexive games. In particular, it can be proved that Aumann's agreement theorem does not hold for these probabilities. Instead of this theorem, there is a statement that is called the reflexion disagreement theorem. Based on this theorem, probabilistic and knowledge conditions can be defined for reflexive games at various reflexion levels, up to the infinite level. (original abstract)

  13. Effects of sampling conditions on DNA-based estimates of American black bear abundance

    Science.gov (United States)

    Laufenberg, Jared S.; Van Manen, Frank T.; Clark, Joseph D.

    2013-01-01

    DNA-based capture-mark-recapture techniques are commonly used to estimate American black bear (Ursus americanus) population abundance (N). Although the technique is well established, many questions remain regarding study design. In particular, relationships among N, capture probability of heterogeneity mixtures A and B (pA and pB, respectively, or p, collectively), the proportion of each mixture (π), number of capture occasions (k), and probability of obtaining reliable estimates of N are not fully understood. We investigated these relationships using 1) an empirical dataset of DNA samples for which true N was unknown and 2) simulated datasets with known properties that represented a broader array of sampling conditions. For the empirical data analysis, we used the full closed population with heterogeneity data type in Program MARK to estimate N for a black bear population in Great Smoky Mountains National Park, Tennessee. We systematically reduced the number of those samples used in the analysis to evaluate the effect that changes in capture probabilities may have on parameter estimates. Model-averaged N for females and males were 161 (95% CI = 114–272) and 100 (95% CI = 74–167), respectively (pooled N = 261, 95% CI = 192–419), and the average weekly p was 0.09 for females and 0.12 for males. When we reduced the number of samples of the empirical data, support for heterogeneity models decreased. For the simulation analysis, we generated capture data with individual heterogeneity covering a range of sampling conditions commonly encountered in DNA-based capture-mark-recapture studies and examined the relationships between those conditions and accuracy (i.e., probability of obtaining an estimated N that is within 20% of true N), coverage (i.e., probability that 95% confidence interval includes true N), and precision (i.e., probability of obtaining a coefficient of variation ≤20%) of estimates using logistic regression. The capture probability
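As a minimal illustration of capture-mark-recapture estimation, the sketch below simulates a two-occasion Lincoln-Petersen estimate with a homogeneous capture probability. This is a deliberately simpler stand-in for the heterogeneity-mixture models fitted in Program MARK; all numbers are hypothetical.

```python
import random

def simulate_lincoln_petersen(N=200, p=0.3, seed=1):
    """Two-occasion capture-mark-recapture with homogeneous capture
    probability p (no heterogeneity, unlike the mixture models above).

    Lincoln-Petersen: N_hat = n1 * n2 / m2, where m2 is the number of
    marked animals recaptured on the second occasion.
    """
    rng = random.Random(seed)
    occ1 = [rng.random() < p for _ in range(N)]  # captured & marked
    occ2 = [rng.random() < p for _ in range(N)]  # captured on occasion 2
    n1 = sum(occ1)
    n2 = sum(occ2)
    m2 = sum(a and b for a, b in zip(occ1, occ2))  # recaptures
    return n1 * n2 / m2

N_hat = simulate_lincoln_petersen()  # should land in the vicinity of N=200
```

Individual heterogeneity in p biases such estimates downward, which is precisely why the study examines mixture models and the sampling conditions under which they remain reliable.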

  14. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of an observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  15. Elements of probability and statistics an introduction to probability with De Finetti’s approach and to Bayesian statistics

    CERN Document Server

    Biagini, Francesca

    2016-01-01

This book provides an introduction to elementary probability and to Bayesian statistics using de Finetti's subjectivist approach. One of the features of this approach is that it does not require the introduction of sample space – a non-intrinsic concept that makes the treatment of elementary probability unnecessarily complicated – but introduces as fundamental the concept of random numbers, directly related to their interpretation in applications. Events become a particular case of random numbers, and probability a particular case of expectation when it is applied to events. The subjective evaluation of expectation and of conditional expectation is based on an economic choice of an acceptable bet or penalty. The properties of expectation and conditional expectation are derived by applying a coherence criterion that the evaluation has to follow. The book is suitable for all introductory courses in probability and statistics for students in Mathematics, Informatics, Engineering, and Physics.

  16. Explaining regional disparities in traffic mortality by decomposing conditional probabilities.

    Science.gov (United States)

    Goldstein, Gregory P; Clark, David E; Travis, Lori L; Haskins, Amy E

    2011-04-01

    In the USA, the mortality rate from traffic injury is higher in rural and in southern regions, for reasons that are not well understood. For 1754 (56%) of the 3142 US counties, we obtained data allowing for separation of the deaths/population rate into deaths/injury, injuries/crash, crashes/exposure and exposure/population, with exposure measured as vehicle miles travelled. A 'decomposition method' proposed by Li and Baker was extended to study how the contributions of these components were affected by three measures of rural location, as well as southern location. The method of Li and Baker extended without difficulty to include non-binary effects and multiple exposures. Deaths/injury was by far the most important determinant in the county-to-county variation in deaths/population, and accounted for the greatest portion of the rural/urban disparity. After controlling for the rural effect, injuries/crash accounted for most of the southern/northern disparity. The increased mortality rate from traffic injury in rural areas can be attributed to the increased probability of death given that a person has been injured, possibly due to challenges faced by emergency medical response systems. In southern areas, there is an increased probability of injury given that a person has crashed, possibly due to differences in vehicle, road, or driving conditions.
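The decomposition itself is an algebraic identity: the conditional rates cancel telescopically, so their product returns the overall mortality rate. A sketch with hypothetical county-level counts (not data from the study):

```python
# Hypothetical county-level counts, for illustration only.
deaths   = 20
injuries = 400
crashes  = 1_000
vmt      = 500_000_000   # exposure: vehicle miles travelled
pop      = 60_000

# deaths/pop = (deaths/injury) * (injuries/crash) * (crashes/vmt) * (vmt/pop)
components = {
    "deaths/injury":  deaths / injuries,
    "injuries/crash": injuries / crashes,
    "crashes/vmt":    crashes / vmt,
    "vmt/population": vmt / pop,
}

product = 1.0
for value in components.values():
    product *= value          # intermediate terms cancel telescopically

mortality_rate = deaths / pop  # equals the product above
```

The decomposition method then asks how much each factor contributes to the variation of deaths/population across counties, which is how the rural effect is traced to deaths/injury and the southern effect to injuries/crash.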

  17. Integrated failure probability estimation based on structural integrity analysis and failure data: Natural gas pipeline case

    International Nuclear Information System (INIS)

    Dundulis, Gintautas; Žutautaitė, Inga; Janulionis, Remigijus; Ušpuras, Eugenijus; Rimkevičius, Sigitas; Eid, Mohamed

    2016-01-01

    In this paper, the authors present an approach as an overall framework for the estimation of the failure probability of pipelines based on: the results of the deterministic-probabilistic structural integrity analysis (taking into account loads, material properties, geometry, boundary conditions, crack size, and defected zone thickness), the corrosion rate, the number of defects and failure data (involved into the model via application of Bayesian method). The proposed approach is applied to estimate the failure probability of a selected part of the Lithuanian natural gas transmission network. The presented approach for the estimation of integrated failure probability is a combination of several different analyses allowing us to obtain: the critical crack's length and depth, the failure probability of the defected zone thickness, dependency of the failure probability on the age of the natural gas transmission pipeline. A model's uncertainty analysis and uncertainty propagation analysis are performed, as well. - Highlights: • Degradation mechanisms of natural gas transmission pipelines. • Fracture mechanic analysis of the pipe with crack. • Stress evaluation of the pipe with critical crack. • Deterministic-probabilistic structural integrity analysis of gas pipeline. • Integrated estimation of pipeline failure probability by Bayesian method.
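The Bayesian step can be sketched with the simplest conjugate model: a Beta prior on the failure probability (standing in here for the information from the structural-integrity analysis) updated with observed failure counts. The numbers are hypothetical and the model is far simpler than the paper's integrated analysis.

```python
def beta_binomial_update(alpha, beta, failures, trials):
    """Conjugate Bayesian update of a failure probability.

    Prior Beta(alpha, beta) encodes prior analysis; observing `failures`
    out of `trials` yields posterior Beta(alpha + failures,
    beta + trials - failures).
    """
    return alpha + failures, beta + trials - failures

def posterior_mean(alpha, beta):
    return alpha / (alpha + beta)

# Prior: analysis suggests a failure probability around 0.01 -> Beta(1, 99).
a, b = 1.0, 99.0
# Hypothetical field data: 3 failures observed over 200 pipeline-years.
a, b = beta_binomial_update(a, b, failures=3, trials=200)
mean = posterior_mean(a, b)  # 4 / 300, pulled upward by the failure data
```

The conjugate form makes the blending explicit: the posterior mean is a weighted compromise between the prior estimate and the observed failure frequency.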

  18. Quantum probability ranking principle for ligand-based virtual screening

    Science.gov (United States)

    Al-Dabbagh, Mohammed Mumtaz; Salim, Naomie; Himmat, Mubarak; Ahmed, Ali; Saeed, Faisal

    2017-04-01

Chemical libraries contain thousands of compounds that need screening, which increases the need for computational methods that can rank or prioritize compounds. The tools of virtual screening are widely exploited to enhance the cost effectiveness of lead drug discovery programs by ranking chemical compound databases in decreasing probability of biological activity based upon the probability ranking principle (PRP). In this paper, we developed a novel ranking approach for molecular compounds inspired by quantum mechanics, called the quantum probability ranking principle (QPRP). The QPRP ranking criteria draw an analogy between a physical experiment and the process of ranking molecular structures for 2D fingerprints in ligand-based virtual screening (LBVS). The development of the QPRP criteria in LBVS employs quantum concepts at three levels: first, at the representation level, the model develops a new framework of molecular representation by connecting molecular compounds with a mathematical quantum space; secondly, it estimates the similarity between chemical libraries and reference structures using a quantum-based similarity searching method; finally, it ranks the molecules using the QPRP approach. Simulated virtual screening experiments with MDL Drug Data Report (MDDR) data sets showed that QPRP outperformed the classical probability ranking principle (PRP) for molecular chemical compounds.

  19. Quantum probability ranking principle for ligand-based virtual screening.

    Science.gov (United States)

    Al-Dabbagh, Mohammed Mumtaz; Salim, Naomie; Himmat, Mubarak; Ahmed, Ali; Saeed, Faisal

    2017-04-01

Chemical libraries contain thousands of compounds that need screening, which increases the need for computational methods that can rank or prioritize compounds. The tools of virtual screening are widely exploited to enhance the cost effectiveness of lead drug discovery programs by ranking chemical compound databases in decreasing probability of biological activity based upon the probability ranking principle (PRP). In this paper, we developed a novel ranking approach for molecular compounds inspired by quantum mechanics, called the quantum probability ranking principle (QPRP). The QPRP ranking criteria draw an analogy between a physical experiment and the process of ranking molecular structures for 2D fingerprints in ligand-based virtual screening (LBVS). The development of the QPRP criteria in LBVS employs quantum concepts at three levels: first, at the representation level, the model develops a new framework of molecular representation by connecting molecular compounds with a mathematical quantum space; secondly, it estimates the similarity between chemical libraries and reference structures using a quantum-based similarity searching method; finally, it ranks the molecules using the QPRP approach. Simulated virtual screening experiments with MDL Drug Data Report (MDDR) data sets showed that QPRP outperformed the classical probability ranking principle (PRP) for molecular chemical compounds.

  20. Dangerous "spin": the probability myth of evidence-based prescribing - a Merleau-Pontyian approach.

    Science.gov (United States)

    Morstyn, Ron

    2011-08-01

    The aim of this study was to examine the logical positivist statistical probability statements used to support and justify "evidence-based" prescribing rules in psychiatry when viewed from the major philosophical theories of probability, and to propose "phenomenological probability", based on Maurice Merleau-Ponty's philosophy of "phenomenological positivism", as a better clinical and ethical basis for psychiatric prescribing. The logical positivist statistical probability statements currently used to support "evidence-based" prescribing rules in psychiatry have little clinical or ethical justification when subjected to critical analysis from any of the major theories of probability, and they represent dangerous "spin" because they necessarily exclude the individual, intersubjective and ambiguous meaning of mental illness. A concept of "phenomenological probability" founded on Merleau-Ponty's philosophy of "phenomenological positivism" overcomes the clinically destructive "objectivist" and "subjectivist" consequences of logical positivist statistical probability and allows psychopharmacological treatments to be appropriately integrated into psychiatric treatment.

  1. A Probability-based Evolutionary Algorithm with Mutations to Learn Bayesian Networks

    Directory of Open Access Journals (Sweden)

    Sho Fukuda

    2014-12-01

    Full Text Available Bayesian networks are regarded as one of the essential tools for analyzing causal relationships between events from data. Learning the structure of highly reliable Bayesian networks from data as quickly as possible is one of the important problems that several studies have tried to solve. In recent years, probability-based evolutionary algorithms have been proposed as a new, efficient approach to learning Bayesian networks. In this paper, we focus on one of the probability-based evolutionary algorithms, PBIL (Probability-Based Incremental Learning), and propose a new mutation operator. Through performance evaluation, we found that the proposed mutation operator performs well in learning Bayesian networks.
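
    A minimal sketch of a PBIL-style update with a mutation operator, in the spirit of the proposed approach; the learning rate, mutation parameters, and one-max fitness below are illustrative assumptions, not the paper's settings:

```python
import random

def pbil_step(prob_vec, population, fitness, lr=0.1, mut_p=0.02, mut_shift=0.05):
    """One PBIL update: move the probability vector toward the best sample,
    then apply a mutation operator (random drift of each component)."""
    best = max(population, key=fitness)
    new = [p * (1 - lr) + b * lr for p, b in zip(prob_vec, best)]
    # mutation: with probability mut_p, nudge a component toward a random bit
    for i in range(len(new)):
        if random.random() < mut_p:
            new[i] = new[i] * (1 - mut_shift) + random.randint(0, 1) * mut_shift
    return new

random.seed(0)
onemax = lambda bits: sum(bits)               # toy fitness: count the ones
pv = [0.5] * 8                                # initial probability vector
for _ in range(50):
    pop = [[int(random.random() < p) for p in pv] for _ in range(20)]
    pv = pbil_step(pv, pop, onemax)
```

    After a few dozen generations the probability vector drifts toward the all-ones optimum; the mutation term keeps components from locking prematurely at 0 or 1.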

  2. Participatory design of probability-based decision support tools for in-hospital nurses.

    Science.gov (United States)

    Jeffery, Alvin D; Novak, Laurie L; Kennedy, Betsy; Dietrich, Mary S; Mion, Lorraine C

    2017-11-01

    To describe nurses' preferences for the design of a probability-based clinical decision support (PB-CDS) tool for in-hospital clinical deterioration. A convenience sample of bedside nurses, charge nurses, and rapid response nurses (n = 20) from adult and pediatric hospitals completed participatory design sessions with researchers in a simulation laboratory to elicit preferred design considerations for a PB-CDS tool. Following theme-based content analysis, we shared findings with user interface designers and created a low-fidelity prototype. Three major themes and several considerations for design elements of a PB-CDS tool surfaced from end users. Themes focused on "painting a picture" of the patient condition over time, promoting empowerment, and aligning probability information with what a nurse already believes about the patient. The most notable design element consideration included visualizing a temporal trend of the predicted probability of the outcome along with user-selected overlapping depictions of vital signs, laboratory values, and outcome-related treatments and interventions. Participants expressed that the prototype adequately operationalized requests from the design sessions. Participatory design served as a valuable method in taking the first step toward developing PB-CDS tools for nurses. This information about preferred design elements of tools that support, rather than interrupt, nurses' cognitive workflows can benefit future studies in this field as well as nurses' practice. Published by Oxford University Press on behalf of the American Medical Informatics Association 2017. This work is written by US Government employees and is in the public domain in the United States.

  3. Probabilistic analysis of Millstone Unit 3 ultimate containment failure probability given high pressure: Chapter 14

    International Nuclear Information System (INIS)

    Bickel, J.H.

    1983-01-01

    The quantification of the containment event trees in the Millstone Unit 3 Probabilistic Safety Study utilizes a conditional probability of failure given high pressure which is based on a new approach. The generation of this conditional probability was based on a weakest-link failure mode model which considered contributions from a number of overlapping failure modes. This overlap effect was due to a number of failure modes whose mean failure pressures were clustered within a 5 psi range and whose uncertainties, due to variances in material strengths and analytical uncertainties, were between 9 and 15 psi. Based on a review of possible probability laws to describe the failure probability of individual structural failure modes, it was determined that a Weibull probability law most adequately described the randomness in the physical process of interest. The resultant conditional probability of failure is found to have a median failure pressure of 132.4 psia. The corresponding 5th and 95th percentile values are 112 psia and 146.7 psia, respectively. The skewed nature of the conditional probability of failure vs. pressure results in a lower overall containment failure probability for an appreciable number of the severe accident sequences of interest, but also in probabilities which are more rigorously traceable from first principles.
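
    The study's fragility curve is only summarized by its quantiles, but a two-parameter Weibull can be fitted to the quoted median (132.4 psia) and 5th percentile (112 psia) as a sketch; the implied 95th percentile then lands near, though not exactly at, the quoted 146.7 psia, since the paper's curve need not be exactly this fit:

```python
import math

def weibull_cdf(p, shape, scale):
    """Conditional probability of failure by pressure p (psia)."""
    return 1.0 - math.exp(-((p / scale) ** shape))

def fit_from_quantiles(p50, p05):
    """Solve the two-parameter Weibull from the median and 5th percentile:
    (p50/scale)^shape = ln 2 and (p05/scale)^shape = -ln 0.95."""
    shape = math.log(math.log(2) / -math.log(0.95)) / math.log(p50 / p05)
    scale = p50 / math.log(2) ** (1 / shape)
    return shape, scale

shape, scale = fit_from_quantiles(132.4, 112.0)   # shape ~ 15.6, scale ~ 135.6
```

    The very large shape parameter reflects how tightly the overlapping failure modes cluster around the median pressure.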

  4. Dependency models and probability of joint events

    International Nuclear Information System (INIS)

    Oerjasaeter, O.

    1982-08-01

    Probabilistic dependencies between components/systems are discussed with reference to a broad classification of potential failure mechanisms. Further, a generalized time-dependency model, based on conditional probabilities for estimation of the probability of joint events and event sequences is described. The applicability of this model is clarified/demonstrated by various examples. It is concluded that the described model of dependency is a useful tool for solving a variety of practical problems concerning the probability of joint events and event sequences where common cause and time-dependent failure mechanisms are involved. (Auth.)
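
    The chain-rule bookkeeping behind such a conditional-probability model can be sketched as follows; the failure probabilities are invented for illustration of a common-cause dependency:

```python
def joint_probability(p_a, p_b_given_a):
    """P(A and B) via the chain rule P(A) * P(B|A)."""
    return p_a * p_b_given_a

def sequence_probability(p_first, conditionals):
    """Probability of an event sequence: P(E1) * P(E2|E1) * P(E3|E1,E2) * ..."""
    p = p_first
    for c in conditionals:
        p *= c
    return p

# common-cause dependency: pump B fails far more often once pump A has failed
p_joint_independent = 0.01 * 0.01             # naive independence assumption
p_joint_dependent = joint_probability(0.01, 0.4)
```

    Ignoring the dependency here understates the joint failure probability by a factor of 40, which is exactly the kind of error the conditional-probability model is meant to avoid.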

  5. Performance Analysis of Secrecy Outage Probability for AF-Based Partial Relay Selection with Outdated Channel Estimates

    Directory of Open Access Journals (Sweden)

    Kyu-Sung Hwang

    2017-01-01

    Full Text Available We study the secrecy outage probability of the amplify-and-forward (AF) relaying protocol, which consists of one source, one destination, multiple relays, and multiple eavesdroppers. In this system, the aim is to transmit confidential messages from a source to a destination via the selected relay in the presence of eavesdroppers. Moreover, a partial relay selection scheme is utilized, in which relay selection is based on outdated channel state information and only neighboring channel information (source-relays) is available, and passive eavesdroppers are considered, where the transmitter does not have any knowledge of the eavesdroppers' channels. Specifically, we offer the exact secrecy outage probability of the proposed system in single-integral form as well as the asymptotic secrecy outage probability in closed form. Numerical examples are given to verify our analytical results for different system conditions.

  6. Fuzzy probability based fault tree analysis to propagate and quantify epistemic uncertainty

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry; Sony Tjahyani, D.T.; Ekariansyah, Andi Sofrany; Tjahjono, Hendro

    2015-01-01

    Highlights: • Fuzzy probability based fault tree analysis is used to evaluate epistemic uncertainty in fuzzy fault tree analysis. • Fuzzy probabilities represent the likelihood of occurrence of all events in a fault tree. • A fuzzy multiplication rule quantifies the epistemic uncertainty of minimal cut sets. • A fuzzy complement rule estimates the epistemic uncertainty of the top event. • The proposed FPFTA has successfully evaluated the U.S. Combustion Engineering RPS. - Abstract: A number of fuzzy fault tree analysis approaches, which integrate fuzzy concepts into the quantitative phase of conventional fault tree analysis, have been proposed to study the reliability of engineering systems. These new approaches apply expert judgment to overcome the limitation of conventional fault tree analysis when basic events do not have probability distributions. Since expert judgments may come with epistemic uncertainty, it is important to quantify the overall uncertainties of the fuzzy fault tree analysis. Monte Carlo simulation is commonly used to quantify the overall uncertainties of conventional fault tree analysis. However, since Monte Carlo simulation is based on probability distributions, this technique is not appropriate for fuzzy fault tree analysis, which is based on fuzzy probabilities. The objective of this study is to develop a fuzzy probability based fault tree analysis that overcomes this limitation. To demonstrate the applicability of the proposed approach, a case study is performed and its results are compared to the results of a conventional fault tree analysis. The results confirm that the proposed fuzzy probability based fault tree analysis is able to propagate and quantify epistemic uncertainties in fault tree analysis.
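
    A minimal sketch of the fuzzy multiplication (AND gate) and fuzzy complement (OR gate) rules on triangular fuzzy probabilities, with hypothetical expert-judgment values; the component-wise arithmetic is a standard approximation and not necessarily the exact operators of the paper:

```python
def fuzzy_and(a, b):
    """Fuzzy multiplication rule for an AND gate on triangular fuzzy
    probabilities (lower, mode, upper), applied component-wise."""
    return tuple(x * y for x, y in zip(a, b))

def fuzzy_or(a, b):
    """Fuzzy complement rule for an OR gate: 1 - (1-a)(1-b), component-wise."""
    return tuple(1 - (1 - x) * (1 - y) for x, y in zip(a, b))

# hypothetical basic-event fuzzy probabilities elicited from experts
e1 = (0.01, 0.02, 0.04)
e2 = (0.005, 0.01, 0.02)
cut_set = fuzzy_and(e1, e2)                   # minimal cut set {e1, e2}
top = fuzzy_or(cut_set, (0.001, 0.002, 0.003))  # top event with a second cut set
```

    The spread between the lower and upper components of `top` carries the epistemic uncertainty through to the top event instead of collapsing it to a point estimate.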

  7. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions: Sample Space and Events; Probabilities; Counting Techniques. Independence and Conditional Probability: Independence; Conditioning; The Borel-Cantelli Theorem. Discrete Random Variables: Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation. Generating Functions, Branching Processes, Random Walk Revisited: Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk. Markov Chains: Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity. Continuous Random Variables: Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  8. Collapse susceptibility mapping in karstified gypsum terrain (Sivas basin - Turkey) by conditional probability, logistic regression, artificial neural network models

    Science.gov (United States)

    Yilmaz, Isik; Keskin, Inan; Marschalko, Marian; Bednarik, Martin

    2010-05-01

    This study compares GIS-based collapse susceptibility mapping methods, namely conditional probability (CP), logistic regression (LR) and artificial neural networks (ANN), applied to gypsum rock masses in the Sivas basin (Turkey). A Digital Elevation Model (DEM) was first constructed using GIS software. Collapse-related factors, directly or indirectly related to the causes of collapse occurrence, such as distance from faults, slope angle and aspect, topographical elevation, distance from drainage, topographic wetness index (TWI), stream power index (SPI), Normalized Difference Vegetation Index (NDVI) as a measure of vegetation cover, and distance from roads and settlements, were used in the collapse susceptibility analyses. In the last stage of the analyses, collapse susceptibility maps were produced from the CP, LR and ANN models, and they were then compared by means of their validations. Area Under Curve (AUC) values obtained from all three methodologies showed that the map obtained from the ANN model is more accurate than those from the other models, and the results also showed that artificial neural networks are a useful tool for preparing collapse susceptibility maps and are highly compatible with GIS operating features. Key words: Collapse; doline; susceptibility map; gypsum; GIS; conditional probability; logistic regression; artificial neural networks.

  9. Probability distribution of magnetization in the one-dimensional Ising model: effects of boundary conditions

    Energy Technology Data Exchange (ETDEWEB)

    Antal, T [Physics Department, Simon Fraser University, Burnaby, BC V5A 1S6 (Canada); Droz, M [Departement de Physique Theorique, Universite de Geneve, CH 1211 Geneva 4 (Switzerland); Racz, Z [Institute for Theoretical Physics, Eoetvoes University, 1117 Budapest, Pazmany setany 1/a (Hungary)

    2004-02-06

    Finite-size scaling functions are investigated both for the mean-square magnetization fluctuations and for the probability distribution of the magnetization in the one-dimensional Ising model. The scaling functions are evaluated in the limit of the temperature going to zero (T → 0) and the size of the system going to infinity (N → ∞) while N[1 - tanh(J/k_B T)] is kept finite (J being the nearest-neighbour coupling). Exact calculations using various boundary conditions (periodic, antiperiodic, free, block) demonstrate explicitly how the scaling functions depend on the boundary conditions. We also show that the block (small part of a large system) magnetization distribution results are identical to those obtained for free boundary conditions.

  10. The Use of Conditional Probability Integral Transformation Method for Testing Accelerated Failure Time Models

    Directory of Open Access Journals (Sweden)

    Abdalla Ahmed Abdel-Ghaly

    2016-06-01

    Full Text Available This paper suggests the use of the conditional probability integral transformation (CPIT) method as a goodness of fit (GOF) technique in the field of accelerated life testing (ALT), specifically for validating the underlying distributional assumption in the accelerated failure time (AFT) model. The method is based on transforming the data into independent and identically distributed (i.i.d.) Uniform(0, 1) random variables and then applying the modified Watson statistic to test the uniformity of the transformed random variables. This technique is used to validate each of the exponential, Weibull and lognormal distributional assumptions in the AFT model under constant stress and complete sampling. The performance of the CPIT method is investigated via a simulation study. It is concluded that this method performs well in the case of the exponential and lognormal distributions. Finally, a real-life example is provided to illustrate the application of the proposed procedure.
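
    The core of the transformation idea, pushing data through the hypothesized CDF so that a correct model yields Uniform(0, 1) values, can be sketched for the exponential case. The modified Watson statistic is not implemented here; the check below only inspects the sample mean and variance of the transformed values:

```python
import math
import random

def pit_exponential(samples, rate):
    """Probability integral transform: if the data really are Exp(rate),
    F(x) = 1 - exp(-rate * x) maps them to i.i.d. Uniform(0, 1) values."""
    return [1.0 - math.exp(-rate * x) for x in samples]

random.seed(1)
data = [random.expovariate(2.0) for _ in range(2000)]   # simulated lifetimes
u = pit_exponential(data, 2.0)                          # transform under H0

# crude uniformity check: mean ~ 1/2 and variance ~ 1/12 under H0
mean_u = sum(u) / len(u)
var_u = sum((v - mean_u) ** 2 for v in u) / len(u)
```

    A wrong distributional assumption (e.g., transforming Weibull data with an exponential CDF) would pull these moments away from 1/2 and 1/12, which is what the formal uniformity test detects.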

  11. Probabilistic Approach to Conditional Probability of Release of Hazardous Materials from Railroad Tank Cars during Accidents

    Science.gov (United States)

    2009-10-13

    This paper describes a probabilistic approach to estimate the conditional probability of release of hazardous materials from railroad tank cars during train accidents. Monte Carlo methods are used in developing a probabilistic model to simulate head ...

  12. Probability of detecting perchlorate under natural conditions in deep groundwater in California and the Southwestern United States

    Science.gov (United States)

    Fram, Miranda S.; Belitz, Kenneth

    2011-01-01

    We use data from 1626 groundwater samples collected in California, primarily from public drinking water supply wells, to investigate the distribution of perchlorate in deep groundwater under natural conditions. The wells were sampled for the California Groundwater Ambient Monitoring and Assessment Priority Basin Project. We develop a logistic regression model for predicting the probability of detecting perchlorate at concentrations greater than multiple threshold concentrations as a function of climate (represented by an aridity index) and potential anthropogenic contributions of perchlorate (quantified as an anthropogenic score, AS). AS is a composite categorical variable including terms for nitrate, pesticides, and volatile organic compounds. Incorporating water-quality parameters in AS permits identification of perturbation of natural occurrence patterns by flushing of natural perchlorate salts from unsaturated zones by irrigation recharge as well as addition of perchlorate from industrial and agricultural sources. The data and model results indicate that low concentrations (0.1-0.5 μg/L) of perchlorate occur under natural conditions in groundwater across a wide range of climates, beyond the arid to semiarid climates in which they have mostly been reported previously. The probability of detecting perchlorate at concentrations greater than 0.1 μg/L under natural conditions ranges from 50-70% in semiarid to arid regions of California and the Southwestern United States to 5-15% in the wettest regions sampled (the Northern California coast). The probability of concentrations above 1 μg/L under natural conditions is low (generally <3%).
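
    The shape of such a logistic regression model can be sketched with invented coefficients (the study's fitted values are not reproduced in the abstract); lower aridity-index values denote more arid climates, so a negative climate coefficient raises the detection probability in dry regions:

```python
import math

def detection_probability(aridity_index, anthro_score,
                          b0=0.5, b_ai=-2.0, b_as=0.8):
    """Logistic model P(detect) = 1 / (1 + exp(-(b0 + b_ai*AI + b_as*AS))).
    The coefficients are illustrative, not the fitted values from the study."""
    z = b0 + b_ai * aridity_index + b_as * anthro_score
    return 1.0 / (1.0 + math.exp(-z))

p_arid = detection_probability(aridity_index=0.05, anthro_score=0)  # dry region
p_wet = detection_probability(aridity_index=1.2, anthro_score=0)    # wet region
```

    With these illustrative coefficients the natural-conditions probabilities fall near the 50-70% (arid) and 5-15% (wet) ranges quoted in the abstract, and a nonzero anthropogenic score shifts the curve upward.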

  13. Modulation Based on Probability Density Functions

    Science.gov (United States)

    Williams, Glenn L.

    2009-01-01

    A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.
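
    The histogram-as-PDF idea can be sketched for one cycle of a unit sinusoid: uniform-in-time samples of a sine wave yield the characteristic arcsine-shaped ("bathtub") density, piled up near ±1, which is the kind of PDF signature the proposed modulation scheme works with:

```python
import math

def sample_histogram(samples, bins=10):
    """Normalized histogram (empirical PDF) of waveform samples on [-1, 1]."""
    counts = [0] * bins
    for s in samples:
        i = min(int((s + 1.0) / 2.0 * bins), bins - 1)  # clamp s == 1.0
        counts[i] += 1
    total = len(samples)
    return [c / total for c in counts]

# one full cycle of a unit sinusoid, sampled uniformly in time
n = 10000
wave = [math.sin(2 * math.pi * k / n) for k in range(n)]
pdf = sample_histogram(wave)
```

    The outermost bins (values near -1 and +1) collect the most samples because the sinusoid spends the most time near its extremes, while the bins around zero are sparse.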

  14. Conditional probability distribution associated to the E-M image reconstruction algorithm for neutron stimulated emission tomography

    International Nuclear Information System (INIS)

    Viana, R.S.; Yoriyaz, H.; Santos, A.

    2011-01-01

    The Expectation-Maximization (E-M) algorithm is an iterative computational method for maximum likelihood (M-L) estimates, useful in a variety of incomplete-data problems. Due to its stochastic nature, one of the most relevant applications of E-M algorithm is the reconstruction of emission tomography images. In this paper, the statistical formulation of the E-M algorithm was applied to the in vivo spectrographic imaging of stable isotopes called Neutron Stimulated Emission Computed Tomography (NSECT). In the process of E-M algorithm iteration, the conditional probability distribution plays a very important role to achieve high quality image. This present work proposes an alternative methodology for the generation of the conditional probability distribution associated to the E-M reconstruction algorithm, using the Monte Carlo code MCNP5 and with the application of the reciprocity theorem. (author)

  15. Conditional probability distribution associated to the E-M image reconstruction algorithm for neutron stimulated emission tomography

    Energy Technology Data Exchange (ETDEWEB)

    Viana, R.S.; Yoriyaz, H.; Santos, A., E-mail: rodrigossviana@gmail.com, E-mail: hyoriyaz@ipen.br, E-mail: asantos@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2011-07-01

    The Expectation-Maximization (E-M) algorithm is an iterative computational method for maximum likelihood (M-L) estimates, useful in a variety of incomplete-data problems. Due to its stochastic nature, one of the most relevant applications of E-M algorithm is the reconstruction of emission tomography images. In this paper, the statistical formulation of the E-M algorithm was applied to the in vivo spectrographic imaging of stable isotopes called Neutron Stimulated Emission Computed Tomography (NSECT). In the process of E-M algorithm iteration, the conditional probability distribution plays a very important role to achieve high quality image. This present work proposes an alternative methodology for the generation of the conditional probability distribution associated to the E-M reconstruction algorithm, using the Monte Carlo code MCNP5 and with the application of the reciprocity theorem. (author)

  16. Age replacement policy based on imperfect repair with random probability

    International Nuclear Information System (INIS)

    Lim, J.H.; Qu, Jian; Zuo, Ming J.

    2016-01-01

    In most of the literature on age replacement policy, failures before the planned replacement age can be either minimally repaired or perfectly repaired based on the type of failure, the cost of repairs and so on. In this paper, we propose an age replacement policy based on imperfect repair with random probability. The proposed policy incorporates the case in which an intermittent failure can be either minimally repaired or perfectly repaired with random probabilities. Mathematical formulas for the expected cost rate per unit time are derived for both the infinite-horizon case and the one-replacement-cycle case. For each case, we show that the optimal replacement age exists and is finite. - Highlights: • We propose a new age replacement policy with random probability of perfect repair. • We develop the expected cost per unit time. • We discuss the optimal replacement age minimizing the expected cost rate.
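
    For the classic infinite-horizon age replacement baseline (without the paper's random perfect/minimal repair extension), the expected cost rate is C(T) = (c_p R(T) + c_f F(T)) / ∫₀ᵀ R(t) dt, where R is the survival function and F = 1 - R. The sketch below evaluates it numerically for Weibull lifetimes and grid-searches the replacement age; all costs and distribution parameters are illustrative:

```python
import math

def cost_rate(T, c_fail=50.0, c_plan=10.0, shape=2.0, scale=100.0, steps=2000):
    """Long-run expected cost per unit time for age replacement at age T:
    C(T) = (c_plan*R(T) + c_fail*F(T)) / integral_0^T R(t) dt,
    with Weibull(shape, scale) lifetimes and midpoint-rule integration."""
    R = lambda t: math.exp(-((t / scale) ** shape))
    dt = T / steps
    mean_cycle = sum(R((i + 0.5) * dt) for i in range(steps)) * dt
    return (c_plan * R(T) + c_fail * (1.0 - R(T))) / mean_cycle

ages = list(range(10, 301, 5))
best_T = min(ages, key=cost_rate)     # finite interior optimum for shape > 1
```

    Because the Weibull shape parameter exceeds 1 (wear-out) and planned replacement is cheaper than failure, the cost rate is U-shaped in T and the optimal replacement age is finite, mirroring the existence result in the abstract.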

  17. A novel frequent probability pattern mining algorithm based on a circuit simulation method in uncertain biological networks

    Science.gov (United States)

    2014-01-01

    Background Motif mining has always been a hot research topic in bioinformatics. Most current research on biological networks focuses on exact motif mining. However, due to inevitable experimental error and noisy data, biological network data represented by a probability model can better reflect authenticity and biological significance; therefore, it is more biologically meaningful to discover probability motifs in uncertain biological networks. One of the key steps in probability motif mining is frequent pattern discovery, which is usually based on the possible-world model and has a relatively high computational complexity. Methods In this paper, we present a novel method for detecting frequent probability patterns based on circuit simulation in uncertain biological networks. First, a partition-based efficient search is applied to mining non-tree-like subgraphs whose probability of occurrence in random networks is small. Then, an algorithm for probability isomorphism based on circuit simulation is proposed. The probability isomorphism test combines the analysis of circuit topology structure with related physical properties of voltage in order to evaluate the probability isomorphism between probability subgraphs. The circuit-simulation-based probability isomorphism avoids the traditional possible-world model. Finally, based on the algorithm for probability subgraph isomorphism, a two-step hierarchical clustering method is used to cluster subgraphs and discover frequent probability patterns from the clusters. Results The experimental results on data sets of Protein-Protein Interaction (PPI) networks and the transcriptional regulatory networks of E. coli and S. cerevisiae show that the proposed method can efficiently discover the frequent probability subgraphs. The discovered subgraphs in our study contain all probability motifs reported in the experiments published in other related papers. Conclusions The algorithm of probability graph isomorphism

  18. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  19. A Probability-Based Hybrid User Model for Recommendation System

    Directory of Open Access Journals (Sweden)

    Jia Hao

    2016-01-01

    Full Text Available With the rapid development of information and communication technology, the available information or knowledge has increased exponentially, causing the well-known information overload phenomenon. This problem is more serious in product design corporations because over half of the valuable design time is consumed in knowledge acquisition, which greatly extends the design cycle and weakens competitiveness. Therefore, recommender systems have become very important in the product design domain. This research presents a probability-based hybrid user model, which is a combination of collaborative filtering and content-based filtering. This hybrid model utilizes user ratings and item topics or classes, which are available in the domain of product design, to predict the knowledge requirement. A comprehensive analysis of the experimental results shows that the proposed method gains better performance in most of the parameter settings. This work contributes a probability-based method to the community for implementing recommender systems when only user ratings and item topics are available.
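
    A minimal sketch of mixing the two filtering estimates as a weighted probability; the item names, probability estimates, and mixing weight below are invented, and the paper's actual combination rule may differ:

```python
def hybrid_score(p_cf, p_content, weight=0.6):
    """Probability-based hybrid: mix a collaborative-filtering estimate and a
    content-based estimate of P(user needs item); weight is a tuning knob."""
    return weight * p_cf + (1 - weight) * p_content

# hypothetical design-knowledge items: (P from ratings, P from item topics)
items = {
    "bearing_spec": (0.7, 0.9),
    "gear_note": (0.4, 0.2),
    "seal_faq": (0.8, 0.3),
}
ranked = sorted(items, key=lambda k: hybrid_score(*items[k]), reverse=True)
```

    The hybrid puts `bearing_spec` first even though `seal_faq` has the higher rating-based score alone, showing how the topic signal reorders the recommendation list.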

  20. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  1. A human error probability estimate methodology based on fuzzy inference and expert judgment on nuclear plants

    International Nuclear Information System (INIS)

    Nascimento, C.S. do; Mesquita, R.N. de

    2009-01-01

    Recent studies point to human error as an important factor in many industrial and nuclear accidents: Three Mile Island (1979), Bhopal (1984), Chernobyl and Challenger (1986) are classical examples. The human contribution to these accidents may be better understood and analyzed by using Human Reliability Analysis (HRA), which has been taken as an essential part of the Probabilistic Safety Analysis (PSA) of nuclear plants. Both HRA and PSA depend on Human Error Probabilities (HEP) for a quantitative analysis. These probabilities are strongly affected by the Performance Shaping Factors (PSF), which have a direct effect on human behavior and thus shape HEP according to specific environmental conditions and the personal characteristics of the individuals responsible for these actions. This PSF dependence raises a serious data-availability problem, as it makes the scarce existing databases either too generic or too specific. Besides this, most nuclear plants do not keep historical records of human error occurrences. Therefore, in order to overcome this data shortage, a methodology based on fuzzy inference and expert judgment was employed in this paper to determine human error occurrence probabilities and to evaluate the PSFs for actions performed by operators in a nuclear power plant (the IEA-R1 nuclear reactor). The obtained HEP values were compared with reference data tabulated in the current literature in order to show the coherence and validity of the approach. This comparison leads to the conclusion that the results can be employed in both HRA and PSA, enabling efficient assessment of potential improvements in plant safety conditions, operational procedures and local working conditions. (author)

  2. Single, Complete, Probability Spaces Consistent With EPR-Bohm-Bell Experimental Data

    Science.gov (United States)

    Avis, David; Fischer, Paul; Hilbert, Astrid; Khrennikov, Andrei

    2009-03-01

    We show that the paradoxical consequences of violations of Bell's inequality are induced by the use of an unsuitable probabilistic description for the EPR-Bohm-Bell experiment. The conventional description (due to Bell) is based on a combination of statistical data collected for different settings of polarization beam splitters (PBSs). In fact, such data consist of conditional probabilities which only partially define a probability space. Ignoring this conditioning leads to apparent contradictions in the classical probabilistic model (due to Kolmogorov). We show how to make a completely consistent probabilistic model by taking into account the probabilities of selecting the settings of the PBSs. Our model both matches the experimental data and is consistent with classical probability theory.

  3. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence. Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation. Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  4. Bayesian based Diagnostic Model for Condition based Maintenance of Offshore Wind Farms

    DEFF Research Database (Denmark)

    Asgarpour, Masoud; Sørensen, John Dalsgaard

    2018-01-01

    Operation and maintenance costs are a major contributor to the Levelized Cost of Energy for electricity produced by offshore wind and can be significantly reduced if existing corrective actions are performed as efficiently as possible and if future corrective actions are avoided by performing...... sufficient preventive actions. This paper presents an applied and generic diagnostic model for fault detection and condition based maintenance of offshore wind components. The diagnostic model is based on two probabilistic matrices; first, a confidence matrix, representing the probability of detection using...... for a wind turbine component based on vibration, temperature, and oil particle fault detection methods. The last part of the paper will have a discussion of the case study results and present conclusions....

  5. Probability-of-Superiority SEM (PS-SEM)—Detecting Probability-Based Multivariate Relationships in Behavioral Research

    Directory of Open Access Journals (Sweden)

    Johnson Ching-Hong Li

    2018-06-01

    Full Text Available In behavioral research, exploring bivariate relationships between variables X and Y based on the concept of probability-of-superiority (PS) has received increasing attention. Unlike the conventional, linear-based bivariate relationship (e.g., Pearson's correlation), PS posits that X and Y can be related based on their likelihood—e.g., a student who is above the mean in SAT has a 63% likelihood of achieving an above-mean college GPA. Despite this increasing attention, the concept of PS has been restricted to a simple bivariate scenario (an X-Y pair), which hinders the development and application of PS in popular multivariate modeling such as structural equation modeling (SEM). Therefore, this study presents an empirical simulation study that explores the potential of detecting PS-based relationships in SEM, called PS-SEM. The simulation results showed that the proposed PS-SEM method can detect and identify PS-based relationships when data follow them, thereby providing a useful method for researchers to explore PS-based SEM in their studies. Conclusions, implications, and future directions based on the findings are also discussed.
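As an illustration of the PS concept itself (not of the authors' PS-SEM estimator), PS for two samples can be estimated by comparing all pairs; the GPA values below are made up:

```python
import itertools

def prob_superiority(xs, ys):
    # PS = P(X > Y) + 0.5 * P(X == Y), estimated over all (x, y) pairs
    pairs = list(itertools.product(xs, ys))
    wins = sum(1.0 for x, y in pairs if x > y)
    ties = sum(1.0 for x, y in pairs if x == y)
    return (wins + 0.5 * ties) / len(pairs)

# Hypothetical GPAs of students above vs. below the SAT mean
above = [3.6, 3.4, 3.8, 3.1]
below = [3.0, 3.2, 2.9, 3.3]
print(prob_superiority(above, below))  # -> 0.875
```

A PS of 0.5 would indicate no relationship; values near 1 indicate strong superiority of the first group.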

  6. Approximation of Measurement Results of “Emergency” Signal Reception Probability

    Directory of Open Access Journals (Sweden)

    Gajda Stanisław

    2017-08-01

    Full Text Available The intended aim of this article is to present approximation results of exemplary measurements of EMERGENCY signal reception probability. The probability is understood as a function of the distance between the aircraft and a ground-based system under established conditions. The measurements were approximated using the properties of logistic functions. This probability, as a distance function, makes it possible to determine the range of the EMERGENCY signal for a pre-set confidence level.
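A minimal sketch of this approach, with made-up parameters `d0` and `k`: model reception probability as a decreasing logistic function of distance, then invert it to obtain the range at a pre-set confidence level.

```python
import math

def logistic_prob(d, d0, k):
    # Reception probability modelled as a decreasing logistic function of distance d
    return 1.0 / (1.0 + math.exp(k * (d - d0)))

def signal_range(p, d0, k):
    # Distance at which the reception probability drops to confidence level p
    return d0 + math.log(1.0 / p - 1.0) / k

d0, k = 50.0, 0.2   # hypothetical midpoint distance (km) and steepness
print(round(signal_range(0.9, d0, k), 1))   # -> 39.0
```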

  7. Analysis of Observation Data of Earth-Rockfill Dam Based on Cloud Probability Distribution Density Algorithm

    Directory of Open Access Journals (Sweden)

    Han Liwei

    2014-07-01

    Full Text Available Monitoring data on an earth-rockfill dam constitutes a form of spatial data. Such data include much uncertainty owing to the limitations of measurement information, material parameters, load, geometry size, initial conditions, boundary conditions and the calculation model. So the cloud probability density of the monitoring data must be addressed. In this paper, the cloud theory model was used to address the uncertainty transition between the qualitative concept and the quantitative description. Then an improved algorithm of cloud probability distribution density based on a backward cloud generator was proposed. This was used to effectively convert certain parcels of accurate data into concepts which can be described by proper qualitative linguistic values. Such qualitative description was expressed as the cloud numerical characteristics {Ex, En, He}, which can represent the characteristics of all cloud drops. The algorithm was then applied to analyze the observation data of a piezometric tube in an earth-rockfill dam. Experiment results proved that the proposed algorithm was feasible; through it, we could reveal the changing regularity of the piezometric tube's water level, and damage from seepage in the dam body could be detected.
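One common textbook form of the backward cloud generator estimates the three numerical characteristics {Ex, En, He} from a data sample; this sketch is a standard variant, not necessarily the paper's improved algorithm:

```python
import math

def backward_cloud(xs):
    # Ex: expectation; En: entropy; He: hyper-entropy of the cloud model
    n = len(xs)
    ex = sum(xs) / n
    en = math.sqrt(math.pi / 2.0) * sum(abs(x - ex) for x in xs) / n
    s2 = sum((x - ex) ** 2 for x in xs) / (n - 1)   # sample variance
    he = math.sqrt(abs(s2 - en * en))
    return ex, en, he

ex, en, he = backward_cloud([1.0, 2.0, 3.0, 4.0, 5.0])
```

A forward cloud generator would then draw drops x ~ N(Ex, En'^2) with En' ~ N(En, He^2) to reproduce the concept.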

  8. Reducing the Probability of Incidents Through Behavior-Based Safety -- An Anomaly or Not?

    International Nuclear Information System (INIS)

    Turek, John A

    2002-01-01

    Reducing the probability of incidents through Behavior-Based Safety--an anomaly or not? Can a Behavior-Based Safety (BBS) process reduce the probability of an employee sustaining a work-related injury or illness? This presentation describes the actions taken to implement a sustainable BBS process and evaluates its effectiveness. The BBS process at the Stanford Linear Accelerator Center used a pilot population of national laboratory employees to: Achieve employee and management support; Reduce the probability of employees' sustaining work-related injuries and illnesses; and Provide support for additional funding to expand within the laboratory

  9. Probability based calibration of pressure coefficients

    DEFF Research Database (Denmark)

    Hansen, Svend Ole; Pedersen, Marie Louise; Sørensen, John Dalsgaard

    2015-01-01

    Normally, a consistent basis for calculating partial factors focuses on a homogeneous reliability index neither depending on which material the structure is constructed of nor the ratio between the permanent and variable actions acting on the structure. Furthermore, the reliability index should n...... the characteristic shape coefficients are based on mean values as specified in background documents to the Eurocodes. Importance of hidden safeties judging the reliability is discussed for wind actions on low-rise structures....... not depend on the type of variable action. A probability based calibration of pressure coefficients have been carried out using pressure measurements on the standard CAARC building modelled on scale of 1:383. The extreme pressures measured on the CAARC building model in the wind tunnel have been fitted.......3, the Eurocode partial factor of 1.5 for variable actions agrees well with the inherent uncertainties of wind actions when the pressure coefficients are determined using wind tunnel test results. The increased bias and uncertainty when pressure coefficients mainly are based on structural codes lead to a larger...

  10. Analysis of blocking probability for OFDM-based variable bandwidth optical network

    Science.gov (United States)

    Gong, Lei; Zhang, Jie; Zhao, Yongli; Lin, Xuefeng; Wu, Yuyao; Gu, Wanyi

    2011-12-01

    Orthogonal Frequency Division Multiplexing (OFDM) has recently been proposed as a modulation technique for optical networks. Because of its good spectral efficiency, flexibility, and tolerance to impairments, optical OFDM is much more flexible than traditional WDM systems, enabling elastic bandwidth transmissions, and this form of optical networking is a future trend of development. In OFDM-based optical networks, analysis of the blocking probability is of great significance for network assessment. Current research on WDM networks is essentially based on fixed bandwidth; in order to accommodate future services and the fast-changing development of optical networks, our study addresses variable-bandwidth OFDM-based optical networks. Applying mathematical analysis and theoretical derivation, based on existing theory and algorithms, we study the blocking probability of variable-bandwidth optical networks and then build a model for the blocking probability.
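For the classical fixed-bandwidth baseline that such studies extend, the blocking probability of a single link with c channels offered A Erlangs of traffic is given by the Erlang B formula; the stable recurrence below is standard background, not the paper's variable-bandwidth model:

```python
def erlang_b(traffic, channels):
    # Recurrence: B(0) = 1, B(c) = A*B(c-1) / (c + A*B(c-1))
    b = 1.0
    for c in range(1, channels + 1):
        b = traffic * b / (c + traffic * b)
    return b

# 2 Erlangs of offered traffic on a 5-channel link
print(round(erlang_b(2.0, 5), 4))   # -> 0.0367
```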

  11. Concepts and Bounded Rationality: An Application of Niestegge's Approach to Conditional Quantum Probabilities

    Science.gov (United States)

    Blutner, Reinhard

    2009-03-01

    Recently, Gerd Niestegge developed a new approach to quantum mechanics via conditional probabilities, developing the well-known proposal to consider the Lüders-von Neumann measurement as a non-classical extension of probability conditionalization. I will apply his powerful and rigorous approach to the treatment of concepts using a geometrical model of meaning. In this model, instances are treated as vectors of a Hilbert space H. In the present approach there are at least two possibilities to form categories. The first possibility sees categories as a mixture of their instances (described by a density matrix). In the simplest case we get the classical probability theory including the Bayesian formula. The second possibility sees categories formed by a distinctive prototype which is the superposition of the (weighted) instances. The construction of prototypes can be seen as transferring a mixed quantum state into a pure quantum state, freezing the probabilistic characteristics of the superposed instances into the structure of the formed prototype. Closely related to the idea of forming concepts by prototypes is the existence of interference effects. Such interference effects are typically found in macroscopic quantum systems, and I will discuss them in connection with several puzzles of bounded rationality. The present approach nicely generalizes earlier proposals made by authors such as Diederik Aerts, Andrei Khrennikov, Ricardo Franco, and Jerome Busemeyer. Concluding, I will suggest that an active dialogue between cognitive approaches to logic and semantics and the modern approach of quantum information science is mandatory.
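The Lüders-von Neumann conditionalization underlying this approach has a compact matrix form, rho' = P rho P / tr(P rho); a minimal sketch for a qubit, using plain lists to stay dependency-free:

```python
def matmul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

def trace(m):
    return sum(m[i][i] for i in range(len(m)))

def luders_conditional(rho, proj):
    # Post-measurement state rho' = P.rho.P / tr(P.rho); returns (rho', probability)
    prob = trace(matmul(proj, rho))
    prp = matmul(matmul(proj, rho), proj)
    return [[x / prob for x in row] for row in prp], prob

rho = [[0.5, 0.0], [0.0, 0.5]]   # maximally mixed qubit state
proj = [[1.0, 0.0], [0.0, 0.0]]  # projector onto basis state |0>
post, p = luders_conditional(rho, proj)   # p = 0.5, post = |0><0|
```

For commuting projectors this reduces to the classical Bayesian update, which is the sense in which it extends probability conditionalization.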

  12. Input-profile-based software failure probability quantification for safety signal generation systems

    International Nuclear Information System (INIS)

    Kang, Hyun Gook; Lim, Ho Gon; Lee, Ho Jung; Kim, Man Cheol; Jang, Seung Cheol

    2009-01-01

    The approaches for software failure probability estimation are mainly based on the results of testing. Test cases represent the inputs which are encountered in actual use. The test inputs for a safety-critical application such as a reactor protection system (RPS) of a nuclear power plant are the inputs which cause the activation of a protective action such as a reactor trip. A digital system treats inputs from instrumentation sensors as discrete digital values by using an analog-to-digital converter. The input profile must be determined in consideration of these characteristics for effective software failure probability quantification. Another important characteristic of software testing is that we do not have to repeat the test for the same input value, since the software response is deterministic for each specific digital input. With these considerations, we propose an effective software testing method for quantifying the failure probability. As an example application, the input profile of the digital RPS is developed based on typical plant data. The proposed method in this study is expected to provide a simple but realistic means to quantify the software failure probability based on the input profile and system dynamics.
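A textbook relation behind testing-based quantification (not the authors' profile-based estimator): if n independent test cases drawn from the input profile all succeed, the per-demand failure probability p is bounded by solving (1 - p)^n = 1 - C at confidence level C:

```python
def failure_prob_upper_bound(n_tests, confidence=0.95):
    # Solve (1 - p)^n = 1 - C for p: upper bound after n failure-free tests
    return 1.0 - (1.0 - confidence) ** (1.0 / n_tests)

# Roughly 3000 failure-free tests support p < 1e-3 at 95% confidence
print(failure_prob_upper_bound(3000))
```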

  13. A New Method of Reliability Evaluation Based on Wavelet Information Entropy for Equipment Condition Identification

    International Nuclear Information System (INIS)

    He, Z J; Zhang, X L; Chen, X F

    2012-01-01

    Aiming at reliability evaluation of condition identification of mechanical equipment, it is necessary to analyze condition monitoring information. A new method of reliability evaluation based on wavelet information entropy extracted from vibration signals of mechanical equipment is proposed. The method is quite different from traditional reliability evaluation models that depend on statistical analysis of large samples of data. The vibration signals of mechanical equipment were analyzed by means of the second generation wavelet package (SGWP). We take the relative energy in each frequency band of the decomposed signal, i.e. its percentage of the whole signal energy, as a probability. Normalized information entropy (IE) is obtained based on the relative energies to describe the uncertainty of a system instead of a probability. The reliability degree is transformed from the normalized wavelet information entropy. A successful application has been achieved in evaluating the assembled quality reliability for a kind of dismountable disk-drum aero-engine. The reliability degree indicates the assembled quality satisfactorily.
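The entropy step can be sketched as follows: treating each band's share of the total energy as a probability, the normalized information entropy lies in [0, 1]; the band energies here are illustrative:

```python
import math

def normalized_band_entropy(energies):
    # Each band's share of the total energy is treated as a probability;
    # Shannon entropy is normalized by its maximum, log(number of bands).
    total = sum(energies)
    probs = [e / total for e in energies if e > 0]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(len(energies))

print(normalized_band_entropy([1.0, 1.0, 1.0, 1.0]))   # uniform bands -> 1.0
```

A reliability degree can then be taken as 1 minus this entropy: energy concentrated in few bands means low uncertainty and a high degree.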

  14. Multiple-event probability in general-relativistic quantum mechanics

    International Nuclear Information System (INIS)

    Hellmann, Frank; Mondragon, Mauricio; Perez, Alejandro; Rovelli, Carlo

    2007-01-01

    We discuss the definition of quantum probability in the context of 'timeless' general-relativistic quantum mechanics. In particular, we study the probability of sequences of events, or multievent probability. In conventional quantum mechanics this can be obtained by means of the 'wave function collapse' algorithm. We first point out certain difficulties of some natural definitions of multievent probability, including the conditional probability widely considered in the literature. We then observe that multievent probability can be reduced to single-event probability, by taking into account the quantum nature of the measuring apparatus. In fact, by exploiting the von Neumann freedom of moving the quantum/classical boundary, one can always trade a sequence of noncommuting quantum measurements at different times for an ensemble of simultaneous commuting measurements on the joint system + apparatus. This observation permits a formulation of quantum theory based only on single-event probability, where the results of the wave function collapse algorithm can nevertheless be recovered. The discussion also bears on the nature of the quantum collapse

  15. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  16. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Traffic conflict assessment for non-lane-based movements of motorcycles under congested conditions

    Directory of Open Access Journals (Sweden)

    Long Xuan Nguyen

    2014-03-01

    Full Text Available Traffic conflict under congested conditions is one of the main safety issues of motorcycle traffic in developing countries. Unlike cars, motorcycles often display non-lane-based movements such as swerving or oblique following of a lead vehicle when traffic becomes congested. Very few studies have quantitatively evaluated the effects of such non-lane-based movements on traffic conflict. Therefore, in this study we aim to develop an integrated model to assess the traffic conflict of motorcycles under congested conditions. The proposed model includes a concept of safety space to describe the non-lane-based movements unique to motorcycles, new features developed for traffic conflict assessment such as parameters of acceleration and deceleration, and the conditions for choosing a lead vehicle. Calibration data were extracted from video clips taken at two road segments in Ho Chi Minh City. A simulation based on the model was developed to verify the dynamic non-lane-based movements of motorcycles. Subsequently, the assessment of traffic conflict was validated by calculating the probability of sudden braking at each time interval according to the change in the density of motorcycle flow. Our findings underscore the fact that higher flow density may lead to conflicts associated with a greater probability of sudden braking. Three types of motorcycle traffic conflicts were confirmed, and the proportions of each type were calculated and discussed.

  18. Accelerating rejection-based simulation of biochemical reactions with bounded acceptance probability

    Energy Technology Data Exchange (ETDEWEB)

    Thanh, Vo Hong, E-mail: vo@cosbi.eu [The Microsoft Research - University of Trento Centre for Computational and Systems Biology, Piazza Manifattura 1, Rovereto 38068 (Italy); Priami, Corrado, E-mail: priami@cosbi.eu [The Microsoft Research - University of Trento Centre for Computational and Systems Biology, Piazza Manifattura 1, Rovereto 38068 (Italy); Department of Mathematics, University of Trento, Trento (Italy); Zunino, Roberto, E-mail: roberto.zunino@unitn.it [Department of Mathematics, University of Trento, Trento (Italy)

    2016-06-14

    Stochastic simulation of large biochemical reaction networks is often computationally expensive due to the disparate reaction rates and high variability of populations of chemical species. An approach to accelerate the simulation is to allow multiple reaction firings before performing an update, by assuming that reaction propensities change by a negligible amount during a time interval. Species with small populations involved in the firings of fast reactions significantly affect both the performance and accuracy of this simulation approach. It is even worse when these small-population species are involved in a large number of reactions. We present in this paper a new approximate algorithm to cope with this problem. It is based on bounding the acceptance probability of a reaction selected by the exact rejection-based simulation algorithm, which employs propensity bounds of reactions and the rejection-based mechanism to select next reaction firings. The reaction is ensured to be selected to fire with an acceptance rate greater than a predefined probability, and the selection becomes exact if that probability is set to one. Our new algorithm improves the computational cost of selecting the next reaction firing and reduces the cost of updating the reaction propensities.
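The rejection mechanism described above can be sketched as follows; `bounds` holds precomputed propensity upper bounds, and the exact propensity is evaluated only inside the acceptance test. This is a simplified sketch of the selection step, not the full published algorithm:

```python
import random

def select_reaction(bounds, true_propensity, rng=random.random):
    # Pick a candidate reaction j proportionally to its propensity bound,
    # then accept it with probability a_j / a_j_max; repeat on rejection.
    total = sum(bounds)
    while True:
        r = rng() * total
        j = 0
        while r > bounds[j]:
            r -= bounds[j]
            j += 1
        if rng() * bounds[j] <= true_propensity(j):
            return j

# With a single reaction whose bound is tight, selection is immediate
print(select_reaction([1.0], lambda j: 1.0))   # -> 0
```

The tighter the bounds, the fewer rejections; the method stays exact because rejected candidates are resampled rather than discarded from the distribution.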

  19. Capturing alternative secondary structures of RNA by decomposition of base-pairing probabilities.

    Science.gov (United States)

    Hagio, Taichi; Sakuraba, Shun; Iwakiri, Junichi; Mori, Ryota; Asai, Kiyoshi

    2018-02-19

    It is known that functional RNAs often switch their functions by forming different secondary structures. Popular tools for RNA secondary structure prediction, however, predict the single 'best' structure, and do not produce alternative structures. There are bioinformatics tools to predict suboptimal structures, but it is difficult to detect which alternative secondary structures are essential. We proposed a new computational method to detect essential alternative secondary structures from RNA sequences by decomposing the base-pairing probability matrix. The decomposition is calculated by a newly implemented software tool, RintW, which efficiently computes the base-pairing probability distributions over the Hamming distance from arbitrary reference secondary structures. The proposed approach has been demonstrated on the ROSE element RNA thermometer sequence and the Lysine RNA riboswitch, showing that the proposed approach captures conformational changes in secondary structures. We have shown that alternative secondary structures are captured by decomposing base-pairing probabilities over Hamming distance. Source code is available from http://www.ncRNA.org/RintW .

  20. A probability-based multi-cycle sorting method for 4D-MRI: A simulation study.

    Science.gov (United States)

    Liang, Xiao; Yin, Fang-Fang; Liu, Yilin; Cai, Jing

    2016-12-01

    To develop a novel probability-based sorting method capable of generating multiple breathing cycles of 4D-MRI images and to evaluate the performance of this new method by comparing it with conventional phase-based methods in terms of image quality and tumor motion measurement. Based on previous findings that the breathing motion probability density function (PDF) of a single breathing cycle is dramatically different from the true stabilized PDF resulting from many breathing cycles, it is expected that a probability-based sorting method capable of generating multiple breathing cycles of 4D images may capture breathing variation information missing from conventional single-cycle sorting methods. The overall idea is to identify a few main breathing cycles (and their corresponding weightings) that can best represent the main breathing patterns of the patient and then reconstruct a set of 4D images for each of the identified main breathing cycles. This method is implemented in three steps: (1) The breathing signal is decomposed into individual breathing cycles, characterized by amplitude and period; (2) individual breathing cycles are grouped based on amplitude and period to determine the main breathing cycles. If a group contains more than 10% of all breathing cycles in a breathing signal, it is determined as a main breathing pattern group and is represented by the average of individual breathing cycles in the group; (3) for each main breathing cycle, a set of 4D images is reconstructed using a result-driven sorting method adapted from our previous study. The probability-based sorting method was first tested on 26 patients' breathing signals to evaluate its feasibility of improving the target motion PDF. The new method was subsequently tested for a sequential image acquisition scheme on the 4D digital extended cardiac torso (XCAT) phantom. 
Performance of the probability-based and conventional sorting methods was evaluated in terms of target volume precision and accuracy as measured
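Step (2) of the method can be sketched as a greedy grouping of cycles by amplitude and period; the tolerances and the 10% threshold below are illustrative stand-ins for the paper's actual criteria:

```python
def main_breathing_cycles(cycles, amp_tol=0.2, per_tol=0.5, min_frac=0.10):
    # Greedily group (amplitude, period) cycles; a cycle joins the first group
    # whose running mean it matches within the tolerances.
    groups = []
    for amp, per in cycles:
        for g in groups:
            a0 = sum(c[0] for c in g) / len(g)
            p0 = sum(c[1] for c in g) / len(g)
            if abs(amp - a0) <= amp_tol and abs(per - p0) <= per_tol:
                g.append((amp, per))
                break
        else:
            groups.append([(amp, per)])
    # Keep groups holding more than min_frac of all cycles; represent each
    # by its average cycle and its weighting.
    n = len(cycles)
    main = []
    for g in groups:
        if len(g) / n > min_frac:
            avg = (sum(c[0] for c in g) / len(g), sum(c[1] for c in g) / len(g))
            main.append((avg, len(g) / n))
    return main

# Two distinct synthetic breathing patterns, five cycles each
patterns = main_breathing_cycles([(1.0, 4.0)] * 5 + [(3.0, 6.0)] * 5)
```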

  1. α-Cut method based importance measure for criticality analysis in fuzzy probability-based fault tree analysis

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry; Sony Tjahyani, D.T.; Widodo, Surip; Tjahjono, Hendro

    2017-01-01

    Highlights: •FPFTA deals with epistemic uncertainty using fuzzy probability. •Criticality analysis is important for reliability improvement. •An α-cut method based importance measure is proposed for criticality analysis in FPFTA. •The α-cut method based importance measure utilises α-cut multiplication, α-cut subtraction, and the area defuzzification technique. •Benchmarking confirms that the proposed method is feasible for criticality analysis in FPFTA. -- Abstract: Fuzzy probability-based fault tree analysis (FPFTA) has recently been developed and proposed to deal with the limitations of conventional fault tree analysis. In FPFTA, the reliabilities of basic events, intermediate events and the top event are characterized by fuzzy probabilities. Furthermore, the quantification of the FPFTA is based on the fuzzy multiplication rule and the fuzzy complementation rule to propagate uncertainties from the basic events to the top event. Since the objective of fault tree analysis is to improve the reliability of the system being evaluated, it is necessary to find the weakest path in the system. For this purpose, criticality analysis can be implemented. Various importance measures, which are based on conventional probabilities, have been developed and proposed for criticality analysis in fault tree analysis. However, none of those importance measures can be applied for criticality analysis in FPFTA, which is based on fuzzy probability. To be fully applied in nuclear power plant probabilistic safety assessment, FPFTA needs to have its corresponding importance measure. The objective of this study is to develop an α-cut method based importance measure to evaluate and rank the importance of basic events for criticality analysis in FPFTA. To demonstrate the applicability of the proposed measure, a case study is performed and its results are then benchmarked against the results generated by four well-known importance measures in conventional fault tree analysis. The results
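The α-cut operations named in the highlights can be sketched for triangular fuzzy probabilities; the fuzzy numbers below are illustrative, not from the case study:

```python
def alpha_cut_triangular(tfn, alpha):
    # Alpha-cut interval [a_L(alpha), a_U(alpha)] of a triangular fuzzy number (l, m, u)
    l, m, u = tfn
    return (l + alpha * (m - l), u - alpha * (u - m))

def interval_mul(a, b):
    # AND-gate propagation: endpoint-wise product is valid for positive intervals
    return (a[0] * b[0], a[1] * b[1])

def interval_sub(a, b):
    # Interval subtraction, used e.g. in complementation: [a] - [b] = [aL - bU, aU - bL]
    return (a[0] - b[1], a[1] - b[0])

# Two fuzzy basic-event probabilities combined through an AND gate at alpha = 0.5
p1 = alpha_cut_triangular((0.01, 0.02, 0.03), 0.5)   # (0.015, 0.025)
p2 = alpha_cut_triangular((0.10, 0.20, 0.30), 0.5)   # (0.15, 0.25)
top = interval_mul(p1, p2)
```

Sweeping alpha from 0 to 1 reconstructs the fuzzy top-event probability; a defuzzification step (such as the area method mentioned above) then yields a crisp ranking value.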

  2. The limiting conditional probability distribution in a stochastic model of T cell repertoire maintenance.

    Science.gov (United States)

    Stirk, Emily R; Lythe, Grant; van den Berg, Hugo A; Hurst, Gareth A D; Molina-París, Carmen

    2010-04-01

    The limiting conditional probability distribution (LCD) has been much studied in the field of mathematical biology, particularly in the context of epidemiology and the persistence of epidemics. However, it has not yet been applied to the immune system. One of the characteristic features of the T cell repertoire is its diversity. This diversity declines in old age, whence the concepts of extinction and persistence are also relevant to the immune system. In this paper we model T cell repertoire maintenance by means of a continuous-time birth and death process on the positive integers, where the origin is an absorbing state. We show that eventual extinction is guaranteed. The late-time behaviour of the process before extinction takes place is modelled by the LCD, which we prove always exists for the process studied here. In most cases, analytic expressions for the LCD cannot be computed but the probability distribution may be approximated by means of the stationary probability distributions of two related processes. We show how these approximations are related to the LCD of the original process and use them to study the LCD in two special cases. We also make use of the large N expansion to derive a further approximation to the LCD. The accuracy of the various approximations is then analysed. (c) 2009 Elsevier Inc. All rights reserved.
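A crude numerical route to an LCD (for a truncated chain; the paper instead develops analytic approximations) is to iterate a discrete-time skeleton of the sub-stochastic kernel over the non-absorbing states and renormalize after each step:

```python
def quasi_stationary(birth, death, n_states, dt=0.01, iters=2000):
    # Iterate the sub-stochastic kernel of a birth-death chain with absorbing
    # state 0, restricted to states 1..n_states, renormalizing each step.
    # dt must be small enough that (birth + death) * dt < 1 for all states.
    q = [1.0 / n_states] * n_states          # q[i] is the mass on state i+1
    for _ in range(iters):
        nxt = [0.0] * n_states
        for i in range(n_states):
            state = i + 1
            b, d = birth(state) * dt, death(state) * dt
            nxt[i] += q[i] * (1.0 - b - d)   # stay put
            if i + 1 < n_states:
                nxt[i + 1] += q[i] * b       # birth (mass above the truncation level is dropped)
            if i > 0:
                nxt[i - 1] += q[i] * d       # death toward state 1
        total = sum(nxt)                     # < 1: mass absorbed at 0 or truncated
        q = [x / total for x in nxt]
    return q

# Immigration-death chain: constant birth rate, linear death rate
lcd = quasi_stationary(lambda n: 1.0, lambda n: float(n), 10)
```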

  3. Prediction suppression in monkey inferotemporal cortex depends on the conditional probability between images.

    Science.gov (United States)

    Ramachandran, Suchitra; Meyer, Travis; Olson, Carl R

    2016-01-01

    When monkeys view two images in fixed sequence repeatedly over days and weeks, neurons in area TE of the inferotemporal cortex come to exhibit prediction suppression. The trailing image elicits only a weak response when presented following the leading image that preceded it during training. Induction of prediction suppression might depend either on the contiguity of the images, as determined by their co-occurrence and captured in the measure of joint probability P(A,B), or on their contingency, as determined by their correlation and as captured in the measures of conditional probability P(A|B) and P(B|A). To distinguish between these possibilities, we measured prediction suppression after imposing training regimens that held P(A,B) constant but varied P(A|B) and P(B|A). We found that reducing either P(A|B) or P(B|A) during training attenuated prediction suppression as measured during subsequent testing. We conclude that prediction suppression depends on contingency, as embodied in the predictive relations between the images, and not just on contiguity, as embodied in their co-occurrence. Copyright © 2016 the American Physiological Society.

  4. [Conditional probability analysis between tinnitus and comorbidities in patients attending the National Rehabilitation Institute-LGII in the period 2012-2013].

    Science.gov (United States)

    Gómez Toledo, Verónica; Gutiérrez Farfán, Ileana; Verduzco-Mendoza, Antonio; Arch-Tirado, Emilio

    Tinnitus is defined as the conscious perception of a sensation of sound that occurs in the absence of an external stimulus. This audiological symptom affects 7% to 19% of the adult population. The aim of this study is to describe the associated comorbidities present in patients with tinnitus using joint and conditional probability analysis. Patients of both genders, diagnosed with unilateral or bilateral tinnitus, aged between 20 and 45 years, who had a full computerised medical record, were selected. Study groups were formed on the basis of the following clinical aspects: 1) audiological findings; 2) vestibular findings; 3) comorbidities such as temporomandibular dysfunction, tubal dysfunction and otosclerosis; and 4) triggering factors of tinnitus: noise exposure, respiratory tract infection, use of ototoxic and/or drugs. Of the patients with tinnitus, 27 (65%) reported hearing loss, 11 (26.19%) temporomandibular dysfunction, and 11 (26.19%) vestibular disorders. When performing the joint probability analysis, it was found that the probability of a patient with tinnitus having hearing loss was 27/42 ≈ 0.65, and 20/42 ≈ 0.47 for the bilateral type. The result was P(A ∩ B) = 30%. Bayes' theorem, P(Ai|B) = P(Ai ∩ B)/P(B), was used, and various probabilities were calculated. Therefore, in patients with temporomandibular dysfunction and vestibular disorders, a posterior probability of P(Ai|B) = 31.44% was calculated. Consideration should be given to the joint and conditional probability approach as tools for the study of different pathologies. Copyright © 2016 Academia Mexicana de Cirugía A.C. Publicado por Masson Doyma México S.A. All rights reserved.
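The reported figures rest on elementary joint and conditional probability; a schematic reproduction of the formulas, where the counts and inputs are illustrative placeholders rather than the study's data:

```python
def conditional_prob(joint, marginal):
    # P(A|B) = P(A and B) / P(B), here computed from record counts
    return joint / marginal

def bayes(prior, likelihood, evidence):
    # Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
    return likelihood * prior / evidence

# P(hearing loss | tinnitus) from hypothetical counts: 27 of 42 patients
print(round(conditional_prob(27, 42), 2))   # -> 0.64
# Posterior from hypothetical P(B|A) = 0.6, P(A) = 0.25, P(B) = 0.5
print(bayes(0.25, 0.6, 0.5))   # -> 0.3
```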

  5. Failure Probability Calculation Method Using Kriging Metamodel-based Importance Sampling Method

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seunggyu [Korea Aerospace Research Institue, Daejeon (Korea, Republic of); Kim, Jae Hoon [Chungnam Nat’l Univ., Daejeon (Korea, Republic of)

    2017-05-15

    The kernel density was determined based on sampling points obtained in a Markov chain simulation and was assumed to be an important sampling function. A Kriging metamodel was constructed in more detail in the vicinity of a limit state. The failure probability was calculated based on importance sampling, which was performed for the Kriging metamodel. A pre-existing method was modified to obtain more sampling points for a kernel density in the vicinity of a limit state. A stable numerical method was proposed to find a parameter of the kernel density. To assess the completeness of the Kriging metamodel, the possibility of changes in the calculated failure probability due to the uncertainty of the Kriging metamodel was calculated.
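The importance-sampling step, stripped of the Kriging surrogate, can be sketched in one dimension for a standard-normal variable; the limit state g(x) = 3 - x and the mean-shifted sampling density are illustrative choices:

```python
import math, random

def importance_sampling_pf(g, n=20000, shift=3.0, seed=1):
    # Estimate Pf = P(g(X) <= 0) for X ~ N(0,1) by sampling from N(shift, 1)
    # and weighting each failure by the likelihood ratio f(x) / h(x).
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(n):
        x = rng.gauss(shift, 1.0)
        if g(x) <= 0.0:
            acc += math.exp(-0.5 * x * x + 0.5 * (x - shift) ** 2)
    return acc / n

# Limit state g(x) = 3 - x fails for x > 3; exact Pf = 1 - Phi(3) ~ 1.35e-3
pf = importance_sampling_pf(lambda x: 3.0 - x)
```

In the paper's setting, `g` would be replaced by the Kriging metamodel and `h` by the kernel density built from the Markov chain samples near the limit state.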

  6. Analytical and numerical studies of creation probabilities of hierarchical trees

    Directory of Open Access Journals (Sweden)

    S.S. Borysov

    2011-03-01

    Full Text Available We consider the creation conditions of diverse hierarchical trees both analytically and numerically. A connection between the probabilities to create hierarchical levels and the probability to associate these levels into a united structure is studied. We argue that a consistent probabilistic picture requires the use of deformed algebra. Our consideration is based on the study of the main types of hierarchical trees, among which both regular and degenerate ones are studied analytically, while the creation probabilities of Fibonacci, scale-free and arbitrary trees are determined numerically.

  7. The Probability of Uranium Deposit Occurrences at Hatapang and Its Surrounding

    International Nuclear Information System (INIS)

    Soepradto-Tjokrokardono; Ngadenin

    2004-01-01

    This study was carried out based on the geological conditions of Hatapang and its surrounding areas, which are favourable for uranium accumulation, as indicated by the existence of granite with high uranium content, mobilization processes, and uranium-trapping rocks. Referring to the plate tectonic and geochemical situation of Hatapang, those conditions give significant indications of the possible occurrence of uranium deposits in the area. The goal of this study is to assess the probability of uranium deposit occurrences based on the regional tectonic, geological, mineralogical, geochemical, and radioactivity characters. It is concluded that the Hatapang granite is a potential uranium source granite, and a uranium deposit of black shale type probably occurs in this area. (author)

  8. Estimating reliability of degraded system based on the probability density evolution with multi-parameter

    Directory of Open Access Journals (Sweden)

    Jiang Ge

    2017-01-01

    Full Text Available System degradation is usually caused by the degradation of multiple parameters. Assessment of system reliability by the universal generating function has low accuracy compared with Monte Carlo simulation, and the probability density function of the system output performance cannot be obtained. A reliability assessment method based on multi-parameter probability density evolution is therefore presented for complexly degraded systems. Firstly, the system output function is formulated according to the transitive relation between component parameters and the system output performance. Then, the probability density evolution equation is established based on the probability conservation principle and the system output function. Furthermore, the probability distribution characteristics of the system output performance are obtained by solving the differential equation. Finally, the reliability of the degraded system is estimated. This method does not require discretizing the performance parameters and can establish a continuous probability density function of the system output performance with high calculation efficiency and low cost. A numerical example shows that the method is applicable to evaluating the reliability of multi-parameter degraded systems.

  9. Probability based load combinations for design of category I structures

    International Nuclear Information System (INIS)

    Reich, M.; Hwang, H.

    1985-01-01

    This paper discusses a reliability analysis method and a procedure for developing load combination design criteria for category I structures. For the safety evaluation of category I concrete structures under various static and dynamic loads, a probability-based reliability analysis method has been developed. This reliability analysis method is also used as a tool for determining the load factors for the design of category I structures. In this paper, the load combinations for the design of concrete containments, corresponding to a target limit state probability of 1.0 x 10^-6 in 4 years, are described. A comparison of containments designed using the ASME code and the proposed design criteria is also presented.

  10. Power Allocation and Outage Probability Analysis for SDN-based Radio Access Networks

    Science.gov (United States)

    Zhao, Yongxu; Chen, Yueyun; Mai, Zhiyuan

    2018-01-01

    In this paper, the performance of an SDN-based (Software Defined Network) radio access network architecture is analyzed with respect to power allocation. A power allocation scheme based on a PSO-PA (Particle Swarm Optimization power allocation) algorithm is proposed; the scheme is subject to a constant total power constraint, with the objective of minimizing the system outage probability. The entire access network resource configuration is controlled by the SDN controller, which then sends the optimized power distribution factors to the base station source node (SN) and the relay node (RN). Simulation results show that the proposed scheme reduces the system outage probability at low complexity.
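    The particle swarm search itself can be sketched generically. The outage-probability objective of the paper is replaced here by a smooth toy function, and all parameter values (inertia 0.7, acceleration coefficients 1.5, swarm size 30) are common illustrative defaults, not those of the paper:

```python
import random

def pso_minimize(f, dim, n_particles=30, iters=200, lo=-1.0, hi=1.0, seed=0):
    """Minimize f over [lo, hi]^dim with a basic particle swarm:
    each particle is pulled toward its personal best and the global best."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]                      # inertia
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])  # cognitive
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))    # social
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            v = f(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

# Toy stand-in for the outage-probability objective: a smooth bowl
# minimized at (0.3, 0.3, 0.3).
best, val = pso_minimize(lambda x: sum((xi - 0.3) ** 2 for xi in x), dim=3)
print(best, val)
```

    In the paper's setting, f would map a vector of power distribution factors to the system outage probability, with the total-power constraint handled by the bounds or a normalization step.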

  11. A quantum probability model of causal reasoning

    Directory of Open Access Journals (Sweden)

    Jennifer S Trueblood

    2012-05-01

    Full Text Available People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects, thus proving to be a viable new candidate for modeling human judgment.

  12. Web-based experiments controlled by JavaScript: an example from probability learning.

    Science.gov (United States)

    Birnbaum, Michael H; Wakcher, Sandra V

    2002-05-01

    JavaScript programs can be used to control Web experiments. This technique is illustrated by an experiment that tested the effects of advice on performance in the classic probability-learning paradigm. Previous research reported that people tested via the Web or in the lab tended to match the probabilities of their responses to the probabilities that those responses would be reinforced. The optimal strategy, however, is to consistently choose the more frequent event; probability matching produces suboptimal performance. We investigated manipulations we reasoned should improve performance. A horse race scenario in which participants predicted the winner in each of a series of races between two horses was compared with an abstract scenario used previously. Ten groups of learners received different amounts of advice, including all combinations of (1) explicit instructions concerning the optimal strategy, (2) explicit instructions concerning a monetary sum to maximize, and (3) accurate information concerning the probabilities of events. The results showed minimal effects of horse race versus abstract scenario. Both advice concerning the optimal strategy and probability information contributed significantly to performance in the task. This paper includes a brief tutorial on JavaScript, explaining with simple examples how to assemble a browser-based experiment.
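    The suboptimality of probability matching mentioned above is easy to verify from expected accuracies. The event probability 0.7 below is an arbitrary illustration:

```python
# Expected prediction accuracy when one of two events occurs with
# probability p: a "matcher" predicts each event as often as it occurs,
# while a "maximizer" always predicts the more frequent event.
def matching_accuracy(p):
    return p * p + (1 - p) * (1 - p)

def maximizing_accuracy(p):
    return max(p, 1 - p)

p = 0.7
print(round(matching_accuracy(p), 2))   # 0.58
print(round(maximizing_accuracy(p), 2)) # 0.7
```

    For any p other than 0.5, maximizing strictly beats matching, which is why consistently choosing the more frequent event is the optimal strategy in this paradigm.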

  13. The Importance of Conditional Probability in Diagnostic Reasoning and Clinical Decision Making: A Primer for the Eye Care Practitioner.

    Science.gov (United States)

    Sanfilippo, Paul G; Hewitt, Alex W; Mackey, David A

    2017-04-01

    To outline and detail the importance of conditional probability in clinical decision making and discuss the various diagnostic measures eye care practitioners should be aware of in order to improve the scope of their clinical practice. We conducted a review of the importance of conditional probability in diagnostic testing for the eye care practitioner. Eye care practitioners use diagnostic tests on a daily basis to assist in clinical decision making and to optimize patient care and management. These tests provide probabilistic information that can enable the clinician to increase (or decrease) their level of certainty about the presence of a particular condition. While an understanding of the characteristics of diagnostic tests is essential to facilitate proper interpretation of test results and disease risk, many practitioners either confuse or misinterpret these measures. In the interests of their patients, practitioners should be aware of the basic concepts associated with diagnostic testing and the simple mathematical rule that underpins them. Importantly, the practitioner needs to recognize that the prevalence of a disease in the population greatly determines the clinical value of a diagnostic test.
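    The dependence of a test's clinical value on prevalence can be made concrete with Bayes' theorem. The sensitivity, specificity, and prevalences below are illustrative values, not drawn from the review:

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """P(disease | positive test) via Bayes' theorem."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# The same test (90% sensitive, 95% specific) has very different
# predictive value in a screening versus a referral population.
print(round(positive_predictive_value(0.90, 0.95, 0.01), 3))  # 0.154
print(round(positive_predictive_value(0.90, 0.95, 0.30), 3))  # 0.885
```

    At 1% prevalence, most positives are false positives despite the test's good characteristics; at 30% prevalence, the same positive result is strong evidence of disease.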

  14. Conditional non-independence of radiographic image features and the derivation of post-test probabilities – A mammography BI-RADS example

    International Nuclear Information System (INIS)

    Benndorf, Matthias

    2012-01-01

    Bayes' theorem has proven to be one of the cornerstones in medical decision making. It allows for the derivation of post-test probabilities, which in case of a positive test result become positive predictive values. If several test results are observed successively Bayes' theorem may be used with assumed conditional independence of test results or with incorporated conditional dependencies. Herein it is examined whether radiographic image features should be considered conditionally independent diagnostic tests when post-test probabilities are to be derived. For this purpose the mammographic mass dataset from the UCI (University of California, Irvine) machine learning repository is analysed. It comprises the description of 961 (516 benign, 445 malignant) mammographic mass lesions according to the BI-RADS (Breast Imaging: Reporting and Data System) lexicon. Firstly, an exhaustive correlation matrix is presented for mammography BI-RADS features among benign and malignant lesions separately; correlation can be regarded as a measure of conditional dependence. Secondly, it is shown that the derived positive predictive values for the conjunction of the two features “irregular shape” and “spiculated margin” differ significantly depending on whether conditional dependencies are incorporated into the decision process or not. It is concluded that radiographic image features should not generally be regarded as conditionally independent diagnostic tests.
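    The effect of ignoring conditional dependence can be illustrated with a small hypothetical example. All counts below are invented for illustration and are not the UCI mammography data:

```python
# Hypothetical counts for two image features A and B among malignant
# and benign lesions (200 lesions each); the features are deliberately
# made conditionally dependent within each class.
malignant = {"A&B": 120, "A_only": 30, "B_only": 30, "neither": 20}
benign    = {"A&B": 10,  "A_only": 40, "B_only": 40, "neither": 110}

def prob(counts, keys):
    return sum(counts[k] for k in keys) / sum(counts.values())

prior = 0.5  # equal numbers of malignant and benign lesions here

# Naive combination assumes A and B are conditionally independent:
# multiply the two marginal likelihood ratios.
lr_naive = (prob(malignant, ["A&B", "A_only"]) / prob(benign, ["A&B", "A_only"])) * \
           (prob(malignant, ["A&B", "B_only"]) / prob(benign, ["A&B", "B_only"]))
ppv_naive = lr_naive * prior / (lr_naive * prior + (1 - prior))

# Using the joint frequency of "A and B" respects the dependence.
lr_joint = prob(malignant, ["A&B"]) / prob(benign, ["A&B"])
ppv_joint = lr_joint * prior / (lr_joint * prior + (1 - prior))

print(round(ppv_naive, 3), round(ppv_joint, 3))  # 0.9 0.923
```

    The two post-test probabilities differ because the conjunction of the features is more diagnostic than the independence assumption predicts, which is the paper's point.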

  15. Failure probability analyses for PWSCC in Ni-based alloy welds

    International Nuclear Information System (INIS)

    Udagawa, Makoto; Katsuyama, Jinya; Onizawa, Kunio; Li, Yinsheng

    2015-01-01

    A number of cracks due to primary water stress corrosion cracking (PWSCC) in pressurized water reactors and Ni-based alloy stress corrosion cracking (NiSCC) in boiling water reactors have been detected around Ni-based alloy welds. The causes of crack initiation and growth due to stress corrosion cracking include weld residual stress, operating stress, the materials, and the environment. We have developed the analysis code PASCAL-NP for calculating failure probability and assessing the structural integrity of cracked components on the basis of probabilistic fracture mechanics (PFM), considering PWSCC and NiSCC. This PFM analysis code has functions for calculating the incubation time of PWSCC and NiSCC crack initiation, evaluating crack growth behavior considering certain crack location and orientation patterns, and evaluating failure behavior near Ni-based alloy welds due to PWSCC and NiSCC in a probabilistic manner. Herein, actual plants affected by PWSCC have been analyzed using PASCAL-NP. Failure probabilities calculated by PASCAL-NP are in reasonable agreement with the detection data. Furthermore, useful knowledge related to leakage due to PWSCC was obtained through parametric studies using this code.

  16. Reliability assessment and probability based design of reinforced concrete containments and shear walls

    International Nuclear Information System (INIS)

    Hwang, H.; Reich, M.; Ellingwood, B.; Shinozuka, M.

    1986-03-01

    This report summarizes work completed under the program entitled "Probability-Based Load Combinations for Design of Category I Structures." Under this program, the probabilistic models for various static and dynamic loads were formulated. The randomness and uncertainties in material strengths and structural resistance were established. Several limit states of concrete containments and shear walls were identified and analytically formulated. Furthermore, the reliability analysis methods for estimating limit state probabilities were established. These reliability analysis methods can be used to evaluate the safety levels of nuclear structures under various combinations of static and dynamic loads. They can also be used to generate analytically the fragility data for PRA studies. In addition to the development of reliability analysis methods, probability-based design criteria for concrete containments and shear wall structures have also been developed. The proposed design criteria are in the load and resistance factor design (LRFD) format. The load and resistance factors are determined for several limit states and target limit state probabilities. Thus, the proposed design criteria are risk-consistent and have a well-established rationale. 73 refs., 18 figs., 16 tabs.

  17. Localized probability of improvement for kriging based multi-objective optimization

    Science.gov (United States)

    Li, Yinjiang; Xiao, Song; Barba, Paolo Di; Rotaru, Mihai; Sykulski, Jan K.

    2017-12-01

    The paper introduces a new approach to kriging based multi-objective optimization by utilizing a local probability of improvement as the infill sampling criterion and the nearest neighbor check to ensure diversification and uniform distribution of Pareto fronts. The proposed method is computationally fast and linearly scalable to higher dimensions.
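    The standard probability-of-improvement criterion underlying the method can be written directly from the kriging prediction. This sketch shows the global PI formula for a minimization problem; the paper's localized variant, which restricts the criterion to neighbourhoods, is not reproduced here:

```python
import math

def probability_of_improvement(mu, sigma, best_y):
    """PI = Phi((best_y - mu) / sigma): the probability that the kriging
    prediction N(mu, sigma^2) at a candidate point falls below the
    current best observed value (minimization)."""
    if sigma == 0.0:
        return 1.0 if mu < best_y else 0.0
    z = (best_y - mu) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# A point predicted exactly at the current best (z = 0) has PI = 0.5.
print(probability_of_improvement(mu=1.0, sigma=0.2, best_y=1.0))  # 0.5
```

    Infill sampling then picks the candidate with the highest PI, trading off predicted mean against model uncertainty.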

  18. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  19. Bayesian probability analysis: a prospective demonstration of its clinical utility in diagnosing coronary disease

    International Nuclear Information System (INIS)

    Detrano, R.; Yiannikas, J.; Salcedo, E.E.; Rincon, G.; Go, R.T.; Williams, G.; Leatherman, J.

    1984-01-01

    One hundred fifty-four patients referred for coronary arteriography were prospectively studied with stress electrocardiography, stress thallium scintigraphy, cine fluoroscopy (for coronary calcifications), and coronary angiography. Pretest probabilities of coronary disease were determined based on age, sex, and type of chest pain. These and pooled literature values for the conditional probabilities of test results based on disease state were used in Bayes' theorem to calculate posttest probabilities of disease. The results of the three noninvasive tests were compared for statistical independence, a necessary condition for their simultaneous use in Bayes' theorem. The test results were found to demonstrate pairwise independence in patients with and those without disease. Some dependencies that were observed between the test results and the clinical variables of age and sex were not sufficient to invalidate application of the theorem. Sixty-eight of the study patients had at least one major coronary artery obstruction of greater than 50%. When these patients were divided into low-, intermediate-, and high-probability subgroups according to their pretest probabilities, noninvasive test results analyzed by Bayesian probability analysis appropriately advanced 17 of them by at least one probability subgroup while only seven were moved backward. Of the 76 patients without disease, 34 were appropriately moved into a lower probability subgroup while 10 were incorrectly moved up. We conclude that posttest probabilities calculated from Bayes' theorem more accurately classified patients with and without disease than did pretest probabilities, thus demonstrating the utility of the theorem in this application.
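    The sequential use of Bayes' theorem under conditional independence, as in the study above, is conveniently expressed in odds form. The pretest probability and likelihood ratios below are made-up values for illustration:

```python
def posttest_probability(pretest_p, likelihood_ratios):
    """Update a pretest probability through several conditionally
    independent test results, each summarized by a likelihood ratio
    (LR > 1 for a positive result, LR < 1 for a negative one)."""
    odds = pretest_p / (1 - pretest_p)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1 + odds)

# Intermediate pretest probability, one positive test (LR = 4)
# followed by one negative test (LR = 0.5).
print(round(posttest_probability(0.30, [4.0, 0.5]), 3))  # 0.462
```

    Conditional independence is what licenses multiplying the likelihood ratios, which is why the study's pairwise-independence check on the three noninvasive tests matters.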

  20. Risk estimation using probability machines

    Science.gov (United States)

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306

  1. Crash probability estimation via quantifying driver hazard perception.

    Science.gov (United States)

    Li, Yang; Zheng, Yang; Wang, Jianqiang; Kodaka, Kenji; Li, Keqiang

    2018-07-01

    Crash probability estimation is an important method to predict the potential reduction of crash probability contributed by forward collision avoidance technologies (FCATs). In this study, we propose a practical approach to estimate crash probability, which combines a field operational test and numerical simulations of a typical rear-end crash model. To consider driver hazard perception characteristics, we define a novel hazard perception measure, called driver risk response time, by considering both time-to-collision (TTC) and driver braking response to impending collision risk in a near-crash scenario. We also establish a driving database under mixed Chinese traffic conditions based on a CMBS (Collision Mitigation Braking Systems)-equipped vehicle. Applying the crash probability estimation to this database, we estimate the potential decrease in crash probability owing to the use of CMBS. A comparison of the results with CMBS on and off shows a 13.7% reduction of crash probability in a typical rear-end near-crash scenario with a one-second delay of the driver's braking response. These results indicate that CMBS is effective in collision prevention, especially in the case of inattentive or older drivers. The proposed crash probability estimation offers a practical way for evaluating the safety benefits in the design and testing of FCATs. Copyright © 2017 Elsevier Ltd. All rights reserved.
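    The TTC ingredient of the risk measure above has a standard definition, sketched below; the paper's full driver risk response time, which also folds in braking response, is not reproduced:

```python
def time_to_collision(gap_m, v_follow_mps, v_lead_mps):
    """TTC in seconds: bumper-to-bumper gap divided by the closing speed.
    Defined only while the following vehicle is closing on the lead."""
    closing = v_follow_mps - v_lead_mps
    if closing <= 0:
        return float("inf")  # not closing: no collision on the current course
    return gap_m / closing

# 20 m gap, follower at 25 m/s, lead at 15 m/s -> 2 s to impact.
print(time_to_collision(20.0, 25.0, 15.0))  # 2.0
```

    Small TTC values flag near-crash scenarios; comparing the driver's braking onset against TTC is what yields the paper's risk response time.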

  2. Value and probability coding in a feedback-based learning task utilizing food rewards.

    Science.gov (United States)

    Tricomi, Elizabeth; Lempert, Karolina M

    2015-01-01

    For the consequences of our actions to guide behavior, the brain must represent different types of outcome-related information. For example, an outcome can be construed as negative because an expected reward was not delivered or because an outcome of low value was delivered. Thus behavioral consequences can differ in terms of the information they provide about outcome probability and value. We investigated the role of the striatum in processing probability-based and value-based negative feedback by training participants to associate cues with food rewards and then employing a selective satiety procedure to devalue one food outcome. Using functional magnetic resonance imaging, we examined brain activity related to receipt of expected rewards, receipt of devalued outcomes, omission of expected rewards, omission of devalued outcomes, and expected omissions of an outcome. Nucleus accumbens activation was greater for rewarding outcomes than devalued outcomes, but activity in this region did not correlate with the probability of reward receipt. Activation of the right caudate and putamen, however, was largest in response to rewarding outcomes relative to expected omissions of reward. The dorsal striatum (caudate and putamen) at the time of feedback also showed a parametric increase correlating with the trialwise probability of reward receipt. Our results suggest that the ventral striatum is sensitive to the motivational relevance, or subjective value, of the outcome, while the dorsal striatum codes for a more complex signal that incorporates reward probability. Value and probability information may be integrated in the dorsal striatum, to facilitate action planning and allocation of effort. Copyright © 2015 the American Physiological Society.

  3. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  4. Cluster Validity Classification Approaches Based on Geometric Probability and Application in the Classification of Remotely Sensed Images

    Directory of Open Access Journals (Sweden)

    LI Jian-Wei

    2014-08-01

    Full Text Available On the basis of the cluster validity function based on geometric probability in literature [1, 2], we propose a cluster analysis method based on geometric probability for processing large amounts of data in a rectangular area. The basic idea is top-down stepwise refinement: first into categories, then into subcategories. On each clustering level, the cluster validity function based on geometric probability is first used to determine the clusters and the gathering direction, and then the cluster centers and cluster borders are determined. Through TM remote sensing image classification examples, the method is compared with supervised and unsupervised classification in ERDAS and with the cluster analysis method based on geometric probability in a two-dimensional square proposed in literature [2]. Results show that the proposed method can significantly improve classification accuracy.

  5. Triangulation based inclusion probabilities: a design-unbiased sampling approach

    OpenAIRE

    Fehrmann, Lutz; Gregoire, Timothy; Kleinn, Christoph

    2011-01-01

    A probabilistic sampling approach for design-unbiased estimation of area-related quantitative characteristics of spatially dispersed population units is proposed. The developed field protocol includes a fixed number of three units per sampling location and is based on partial triangulations over their natural neighbors to derive the individual inclusion probabilities. The performance of the proposed design is tested in comparison to fixed area sample plots in a simulation with two forest stands. ...

  6. Covariate-adjusted Spearman's rank correlation with probability-scale residuals.

    Science.gov (United States)

    Liu, Qi; Li, Chun; Wanga, Valentine; Shepherd, Bryan E

    2018-06-01

    It is desirable to adjust Spearman's rank correlation for covariates, yet existing approaches have limitations. For example, the traditionally defined partial Spearman's correlation does not have a sensible population parameter, and the conditional Spearman's correlation defined with copulas cannot be easily generalized to discrete variables. We define population parameters for both partial and conditional Spearman's correlation through concordance-discordance probabilities. The definitions are natural extensions of Spearman's rank correlation in the presence of covariates and are general for any orderable random variables. We show that they can be neatly expressed using probability-scale residuals (PSRs). This connection allows us to derive simple estimators. Our partial estimator for Spearman's correlation between X and Y adjusted for Z is the correlation of PSRs from models of X on Z and of Y on Z, which is analogous to the partial Pearson's correlation derived as the correlation of observed-minus-expected residuals. Our conditional estimator is the conditional correlation of PSRs. We describe estimation and inference, and highlight the use of semiparametric cumulative probability models, which allow preservation of the rank-based nature of Spearman's correlation. We conduct simulations to evaluate the performance of our estimators and compare them with other popular measures of association, demonstrating their robustness and efficiency. We illustrate our method in two applications, a biomarker study and a large survey. © 2017, The International Biometric Society.
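    The connection between PSRs and Spearman's rank correlation can be seen in the covariate-free case: without covariates the PSR of an observation is a linear function of its rank, so the Pearson correlation of PSRs recovers Spearman's rho. A minimal sketch with hypothetical tie-free data (no covariate adjustment, which in the paper requires models of X on Z and Y on Z):

```python
def psr(values):
    """Probability-scale residual: P(Y < y_i) - P(Y > y_i) under the
    empirical distribution of the sample."""
    n = len(values)
    return [(sum(v < y for v in values) - sum(v > y for v in values)) / n
            for y in values]

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

x = [2.0, 4.0, 7.0, 1.0, 9.0]
y = [1.0, 3.0, 9.0, 2.0, 8.0]
rho = pearson(psr(x), psr(y))  # equals Spearman's rho for tie-free data
print(round(rho, 3))  # 0.8
```

    The partial estimator in the paper replaces these marginal PSRs with residuals from models of X on Z and of Y on Z before correlating them.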

  7. Computing exact bundle compliance control charts via probability generating functions.

    Science.gov (United States)

    Chen, Binchao; Matis, Timothy; Benneyan, James

    2016-06-01

    Compliance to evidence-based practices, individually and in 'bundles', remains an important focus of healthcare quality improvement for many clinical conditions. The exact probability distribution of composite bundle compliance measures used to develop corresponding control charts and other statistical tests is based on a fairly large convolution whose direct calculation can be computationally prohibitive. Various series expansions and other approximation approaches have been proposed, each with computational and accuracy tradeoffs, especially in the tails. This same probability distribution also arises in other important healthcare applications, such as for risk-adjusted outcomes and bed demand prediction, with the same computational difficulties. As an alternative, we use probability generating functions to rapidly obtain exact results and illustrate the improved accuracy and detection over other methods. Numerical testing across a wide range of applications demonstrates the computational efficiency and accuracy of this approach.
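    For independent per-item compliance indicators, the generating-function approach amounts to multiplying the polynomials (q_i + p_i z); the coefficients of the product are the exact distribution of the compliance count (a Poisson-binomial distribution). A minimal sketch with illustrative compliance probabilities:

```python
def compliance_count_pmf(probs):
    """Exact pmf of the number of compliant items, obtained by multiplying
    the per-item generating functions (1 - p_i) + p_i * z."""
    pmf = [1.0]  # generating function of the empty product
    for p in probs:
        nxt = [0.0] * (len(pmf) + 1)
        for k, c in enumerate(pmf):
            nxt[k] += c * (1 - p)   # item non-compliant: count unchanged
            nxt[k + 1] += c * p     # item compliant: count + 1
        pmf = nxt
    return pmf

pmf = compliance_count_pmf([0.9, 0.8, 0.95])
print([round(v, 4) for v in pmf])  # [0.001, 0.032, 0.283, 0.684]
print(round(sum(pmf), 10))         # 1.0
```

    The same polynomial product gives exact tail probabilities for control limits, with none of the tail-accuracy tradeoffs of series approximations.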

  8. Precise Propagation of Upper and Lower Probability Bounds in System P

    OpenAIRE

    Gilio, Angelo

    2000-01-01

    In this paper we consider the inference rules of System P in the framework of coherent imprecise probabilistic assessments. Exploiting our algorithms, we propagate the lower and upper probability bounds associated with the conditional assertions of a given knowledge base, automatically obtaining the precise probability bounds for the derived conclusions of the inference rules. This allows a more flexible and realistic use of System P in default reasoning and provides an exact illustration of ...

  9. Experiencing El Niño conditions during early life reduces recruiting probabilities but not adult survival

    Science.gov (United States)

    Rodríguez, Cristina; Drummond, Hugh

    2018-01-01

    In wild long-lived animals, analysis of impacts of stressful natal conditions on adult performance has rarely embraced the entire age span, and the possibility that costs are expressed late in life has seldom been examined. Using 26 years of data from 8541 fledglings and 1310 adults of the blue-footed booby (Sula nebouxii), a marine bird that can live up to 23 years, we tested whether experiencing the warm waters and food scarcity associated with El Niño in the natal year reduces recruitment or survival over the adult lifetime. Warm water in the natal year reduced the probability of recruiting; each additional degree (°C) of water temperature meant a reduction of roughly 50% in fledglings' probability of returning to the natal colony as breeders. Warm water in the current year impacted adult survival, with greater effect at the oldest ages than during early adulthood. However, warm water in the natal year did not affect survival at any age over the adult lifespan. A previous study showed that early recruitment and widely spaced breeding allow boobies that experience warm waters in the natal year to achieve normal fledgling production over the first 10 years; our results now show that this reproductive effort incurs no survival penalty, not even late in life. This pattern is additional evidence of buffering against stressful natal conditions via life-history adjustments. PMID:29410788

  10. The Effect of Simulation-Based Learning on Prospective Teachers' Inference Skills in Teaching Probability

    Science.gov (United States)

    Koparan, Timur; Yilmaz, Gül Kaleli

    2015-01-01

    The effect of simulation-based probability teaching on prospective teachers' inference skills was examined in this research. In line with this purpose, the aim was to examine the design, implementation and efficiency of a learning environment for experimental probability. Activities were built on modeling, simulation and the…

  11. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  12. A basic course in probability theory

    CERN Document Server

    Bhattacharya, Rabi

    2016-01-01

    This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added and basic theory has been expanded. General Markov dependent sequences and their convergence to equilibrium is the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and/or concentration inequalities ranging from those of Chebyshev, Cramer–Chernoff, Bahadur–Rao, to Hoeffding have been added, with illustrative comparisons of thei...

  13. Task-set switching under cue-based versus memory-based switching conditions in younger and older adults.

    Science.gov (United States)

    Kray, Jutta

    2006-08-11

    Adult age differences in task switching and advance preparation were examined by comparing cue-based and memory-based switching conditions. Task switching was assessed by determining two types of costs that occur at the general (mixing costs) and specific (switching costs) level of switching. Advance preparation was investigated by varying the time interval until the next task (short, middle, very long). Results indicated that the implementation of task sets was different for cue-based switching with random task sequences and memory-based switching with predictable task sequences. Switching costs were strongly reduced under cue-based switching conditions, indicating that task-set cues facilitate the retrieval of the next task. Age differences were found for mixing costs and for switching costs only under cue-based conditions in which older adults showed smaller switching costs than younger adults. It is suggested that older adults adopt a less extreme bias between two tasks than younger adults in situations associated with uncertainty. For cue-based switching with random task sequences, older adults are less engaged in a complete reconfiguration of task sets because of the probability of a further task change. Furthermore, the reduction of switching costs was more pronounced for cue- than memory-based switching for short preparation intervals, whereas the reduction of switch costs was more pronounced for memory- than cue-based switching for longer preparation intervals at least for older adults. Together these findings suggest that the implementation of task sets is functionally different for the two types of task-switching conditions.

  14. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment made by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
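
    As an illustrative sketch of the maximum entropy assignment described above (not code from the paper), the following assumes six discrete outcomes constrained to a fixed mean and solves for the exponential-family weights by bisection:

```python
import numpy as np

def maxent_distribution(outcomes, target_mean, tol=1e-10):
    """Maximum-entropy probabilities over discrete outcomes subject to a
    fixed mean constraint: p_i proportional to exp(lam * x_i).
    The multiplier lam is found by bisection, since the mean is
    strictly increasing in lam."""
    x = np.asarray(outcomes, dtype=float)

    def mean_for(lam):
        w = np.exp(lam * (x - x.max()))  # shift exponent for stability
        p = w / w.sum()
        return p @ x, p

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        m, _ = mean_for(mid)
        if m < target_mean:
            lo = mid
        else:
            hi = mid
    return mean_for(0.5 * (lo + hi))[1]

# die faces with the mean constrained above the uniform value 3.5
p = maxent_distribution([1, 2, 3, 4, 5, 6], 4.5)
print(np.round(p, 4))
```

    With the mean pinned above 3.5 the probabilities tilt toward larger outcomes, the discrete analogue of a Boltzmann factor.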

  15. Dynamic probability control limits for risk-adjusted CUSUM charts based on multiresponses.

    Science.gov (United States)

    Zhang, Xiang; Loda, Justin B; Woodall, William H

    2017-07-20

    For a patient who has survived a surgery, there could be several levels of recovery. Thus, it is reasonable to consider more than two outcomes when monitoring surgical outcome quality. The risk-adjusted cumulative sum (CUSUM) chart based on multiresponses has been developed for monitoring a surgical process with three or more outcomes. However, there is a significant effect of varying risk distributions on the in-control performance of the chart when constant control limits are applied. To overcome this disadvantage, we apply the dynamic probability control limits to the risk-adjusted CUSUM charts for multiresponses. The simulation results demonstrate that the in-control performance of the charts with dynamic probability control limits can be controlled for different patient populations because these limits are determined for each specific sequence of patients. Thus, the use of dynamic probability control limits for risk-adjusted CUSUM charts based on multiresponses allows each chart to be designed for the corresponding patient sequence of a surgeon or a hospital and therefore does not require estimating or monitoring the patients' risk distribution. Copyright © 2017 John Wiley & Sons, Ltd.
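
    As a hedged illustration of risk-adjusted CUSUM scoring (simplified to two outcomes, whereas the paper's charts handle three or more, and omitting the control limits entirely; the odds multiplier and risk values are hypothetical):

```python
import math

def risk_adjusted_cusum(outcomes, risks, odds_multiplier=2.0):
    """Binary risk-adjusted CUSUM path. outcomes: 1 = adverse event,
    0 = success; risks: each patient's predicted probability of an
    adverse event under in-control conditions. The out-of-control
    hypothesis scales each patient's odds by odds_multiplier."""
    s, path = 0.0, []
    for y, p in zip(outcomes, risks):
        q = odds_multiplier * p / (1 - p + odds_multiplier * p)
        # log-likelihood-ratio score for this patient's outcome
        w = math.log(q / p) if y == 1 else math.log((1 - q) / (1 - p))
        s = max(0.0, s + w)  # CUSUM resets at zero
        path.append(s)
    return path

path = risk_adjusted_cusum([0, 1, 0, 1, 1], [0.1, 0.3, 0.05, 0.2, 0.4])
```

    Dynamic probability control limits, as in the paper, would then be computed for this specific patient sequence rather than held constant.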

  16. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
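
    One of the standard calculations in this setting, the probability of drawing two aces without replacement, can be checked both by the multiplication rule for conditional probabilities and by counting:

```python
from math import comb

# Multiplication rule: P(ace first) * P(ace second | ace first)
p_chain = (4 / 52) * (3 / 51)

# Counting: favorable 2-card hands over all 2-card hands
p_comb = comb(4, 2) / comb(52, 2)

# both routes give 1/221
assert abs(p_chain - p_comb) < 1e-12
```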

  17. Computing Moment-Based Probability Tables for Self-Shielding Calculations in Lattice Codes

    International Nuclear Information System (INIS)

    Hebert, Alain; Coste, Mireille

    2002-01-01

    As part of the self-shielding model used in the APOLLO2 lattice code, probability tables are required to compute self-shielded cross sections for coarse energy groups (typically with 99 or 172 groups). This paper describes the replacement of the multiband tables (typically with 51 subgroups) with moment-based tables in release 2.5 of APOLLO2. An improved Ribon method is proposed to compute moment-based probability tables, allowing important savings in CPU resources while maintaining the accuracy of the self-shielding algorithm. Finally, a validation is presented where the absorption rates obtained with each of these techniques are compared with exact values obtained using a fine-group elastic slowing-down calculation in the resolved energy domain. Other results, relative to Rowland's benchmark and to three assembly production cases, are also presented

  18. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions.

  19. The influence of initial beliefs on judgments of probability.

    Science.gov (United States)

    Yu, Erica C; Lagnado, David A

    2012-01-01

    This study aims to investigate whether experimentally induced prior beliefs affect processing of evidence including the updating of beliefs under uncertainty about the unknown probabilities of outcomes and the structural, outcome-generating nature of the environment. Participants played a gambling task in the form of computer-simulated slot machines and were given information about the slot machines' possible outcomes without their associated probabilities. One group was induced with a prior belief about the outcome space that matched the space of actual outcomes to be sampled; the other group was induced with a skewed prior belief that included the actual outcomes and also fictional higher outcomes. In reality, however, all participants sampled evidence from the same underlying outcome distribution, regardless of priors given. Before and during sampling, participants expressed their beliefs about the outcome distribution (values and probabilities). Evaluation of those subjective probability distributions suggests that all participants' judgments converged toward the observed outcome distribution. However, despite observing no supporting evidence for fictional outcomes, a significant proportion of participants in the skewed priors condition expected them in the future. A probe of the participants' understanding of the underlying outcome-generating processes indicated that participants' judgments were based on the information given in the induced priors and consequently, a significant proportion of participants in the skewed condition believed the slot machines were not games of chance while participants in the control condition believed the machines generated outcomes at random. Beyond Bayesian or heuristic belief updating, priors not only contribute to belief revision but also affect one's deeper understanding of the environment.
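
    A hedged sketch of this kind of belief updating (not the study's analysis): a Dirichlet prior over payouts, with a skewed prior placing pseudo-counts on a fictional payout that the sampled data never produce. All payout values and pseudo-counts are hypothetical.

```python
import numpy as np

outcomes = [0, 5, 10, 50]   # payouts the machine actually produces
fictional = [100]           # payout present only in the skewed prior
support = outcomes + fictional

# Dirichlet pseudo-counts: the skewed prior gives the fictional payout
# the same weight as the real ones
prior = np.array([2.0, 2.0, 2.0, 2.0, 2.0])

rng = np.random.default_rng(0)
sample = rng.choice(outcomes, size=200, p=[0.5, 0.3, 0.15, 0.05])
counts = np.array([(sample == o).sum() for o in support], dtype=float)

# posterior mean shrinks the never-observed payout's probability
# toward zero without ever reaching it
posterior_mean = (prior + counts) / (prior + counts).sum()
```

    This mirrors the study's finding qualitatively: under purely Bayesian updating the fictional outcome retains a small positive probability, even though no evidence for it is ever observed.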

  20. Informing Environmental Water Management Decisions: Using Conditional Probability Networks to Address the Information Needs of Planning and Implementation Cycles

    Science.gov (United States)

    Horne, Avril C.; Szemis, Joanna M.; Webb, J. Angus; Kaur, Simranjit; Stewardson, Michael J.; Bond, Nick; Nathan, Rory

    2018-03-01

    One important aspect of adaptive management is the clear and transparent documentation of hypotheses, together with the use of predictive models (complete with any assumptions) to test those hypotheses. Documentation of such models can improve the ability to learn from management decisions and supports dialog between stakeholders. A key challenge is how best to represent the existing scientific knowledge to support decision-making. Such challenges are currently emerging in the field of environmental water management in Australia, where managers are required to prioritize the delivery of environmental water on an annual basis, using a transparent and evidence-based decision framework. We argue that the development of models of ecological responses to environmental water use needs to support both the planning and implementation cycles of adaptive management. Here we demonstrate an approach based on the use of Conditional Probability Networks to translate existing ecological knowledge into quantitative models that include temporal dynamics to support adaptive environmental flow management. It equally extends to other applications where knowledge is incomplete, but decisions must still be made.
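
    A minimal illustration of the Conditional Probability Network idea (hypothetical numbers, not the paper's model): a single link expressing P(ecological response | flow condition), marginalized over a flow forecast:

```python
# one node of a conditional probability network: forecasted flow state
p_flow = {"low": 0.5, "medium": 0.3, "high": 0.2}

# conditional probability table: chance of, e.g., fish recruitment
# given each flow state (illustrative values only)
p_recruit_given_flow = {"low": 0.1, "medium": 0.4, "high": 0.8}

# marginal probability of recruitment under the forecast
p_recruit = sum(p_flow[f] * p_recruit_given_flow[f] for f in p_flow)
```

    Chaining such tables through time is what lets the approach support both annual planning and within-year implementation decisions.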


  2. Probability-based collaborative filtering model for predicting gene–disease associations

    OpenAIRE

    Zeng, Xiangxiang; Ding, Ningxiang; Rodríguez-Patón, Alfonso; Zou, Quan

    2017-01-01

    Background Accurately predicting pathogenic human genes has been challenging in recent research. Considering extensive gene–disease data verified by biological experiments, we can apply computational methods to perform accurate predictions with reduced time and expenses. Methods We propose a probability-based collaborative filtering model (PCFM) to predict pathogenic human genes. Several kinds of data sets, containing data of humans and data of other nonhuman species, are integrated in our mo...

  3. Jet identification based on probability calculations using Bayes' theorem

    International Nuclear Information System (INIS)

    Jacobsson, C.; Joensson, L.; Lindgren, G.; Nyberg-Werther, M.

    1994-11-01

    The problem of identifying jets at LEP and HERA has been studied. Identification using jet energies and fragmentation properties was treated separately in order to investigate the degree of quark-gluon separation that can be achieved by either of these approaches. In the case of the fragmentation-based identification, a neural network was used, and a test of the dependence on the jet production process and the fragmentation model was done. Instead of working with the separation variables directly, these have been used to calculate probabilities of having a specific type of jet, according to Bayes' theorem. This offers a direct interpretation of the performance of the jet identification and provides a simple means of combining the results of the energy- and fragmentation-based identifications. (orig.)
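
    The Bayes'-theorem step described above can be sketched as follows; the likelihood and prior values are hypothetical, and combining the two identifications by multiplying likelihoods assumes the energy- and fragmentation-based discriminants are independent:

```python
def bayes_posterior(likelihoods, priors):
    """P(class_k | x) proportional to P(x | class_k) * P(class_k)."""
    joint = [l * p for l, p in zip(likelihoods, priors)]
    z = sum(joint)
    return [j / z for j in joint]

# classes: (quark jet, gluon jet); hypothetical per-jet likelihoods
p_energy = bayes_posterior([0.8, 0.4], [0.6, 0.4])

# combining the energy- and fragmentation-based discriminants
p_both = bayes_posterior([0.8 * 0.7, 0.4 * 0.5], [0.6, 0.4])
```

    The posterior itself is the directly interpretable output the abstract refers to: a probability that a given jet is of a specific type.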

  4. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  5. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.

  6. Nuclear data uncertainties: I, Basic concepts of probability

    International Nuclear Information System (INIS)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs

  7. Internal Medicine residents use heuristics to estimate disease probability.

    Science.gov (United States)

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Training in Bayesian reasoning may have limited impact on accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. We randomized 55 Internal Medicine residents to different versions of four clinical vignettes and asked them to estimate probabilities of target conditions. We manipulated the clinical data for each vignette to be consistent with either 1) using a representative heuristic, by adding non-discriminating prototypical clinical features of the target condition, or 2) using anchoring with adjustment heuristic, by providing a high or low anchor for the target condition. When presented with additional non-discriminating data the odds of diagnosing the target condition were increased (odds ratio (OR) 2.83, 95% confidence interval [1.30, 6.15], p = 0.009). Similarly, the odds of diagnosing the target condition were increased when a high anchor preceded the vignette (OR 2.04, [1.09, 3.81], p = 0.025). Our findings suggest that despite previous exposure to the use of Bayesian reasoning, residents use heuristics, such as the representative heuristic and anchoring with adjustment, to estimate probabilities. Potential reasons for attribute substitution include the relative cognitive ease of heuristics vs. Bayesian reasoning or perhaps residents in their clinical practice use gist traces rather than precise probability estimates when diagnosing.
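
    The Bayesian calculation the residents were trained in can be sketched via the odds form of Bayes' theorem; the pre-test probability and likelihood ratios below are hypothetical, not from the study's vignettes:

```python
def post_test_probability(pre_test, likelihood_ratio):
    """Odds form of Bayes' theorem: post-odds = pre-odds * LR."""
    pre_odds = pre_test / (1 - pre_test)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# A non-discriminating finding (equally likely with or without the
# condition) has LR = 1 and should leave the estimate unchanged --
# unlike the inflation the residents showed.
unchanged = post_test_probability(0.2, 1.0)

# A genuinely discriminating finding with LR = 8
p = post_test_probability(0.2, 8.0)
```

    Under Bayesian reasoning only the second case should move the estimate; the study's point is that non-discriminating features and anchors moved the residents' estimates anyway.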

  8. Probability-Based Determination Methods for Service Waiting in Service-Oriented Computing Environments

    Science.gov (United States)

    Zeng, Sen; Huang, Shuangxi; Liu, Yang

    Cooperative business processes (CBP)-based service-oriented enterprise networks (SOEN) are emerging with the significant advances of enterprise integration and service-oriented architecture. The performance prediction and optimization for CBP-based SOEN is very complex. To meet these challenges, one of the key points is to reduce the waiting number of an abstract service's physical services. This paper introduces a probability-based determination method (PBDM) of an abstract service's waiting number, M i , and time span, τ i , for its physical services. The determination of M i and τ i is based on the physical services' arriving rule and the distribution functions of their overall performance. In PBDM, the arriving probability of the physical services with the best overall performance value is a pre-defined reliability. PBDM makes thorough use of the information in the physical services' arriving rule and performance distribution functions, which will improve the computational efficiency for the scheme design and performance optimization of the collaborative business processes in service-oriented computing environments.

  9. The analysis of probability task completion; Taxonomy of probabilistic thinking-based across gender in elementary school students

    Science.gov (United States)

    Sari, Dwi Ivayana; Budayasa, I. Ketut; Juniati, Dwi

    2017-08-01

    Formulation of mathematical learning goals is now oriented not only toward cognitive products but also toward cognitive processes, among them probabilistic thinking. Probabilistic thinking is needed by students to make decisions. Elementary school students are required to develop probabilistic thinking as a foundation for learning probability at higher levels. A framework of students' probabilistic thinking had been developed using the SOLO taxonomy, consisting of prestructural, unistructural, multistructural and relational probabilistic thinking. This study aimed to analyze probability task completion based on this taxonomy of probabilistic thinking. The subjects were two fifth-grade students, a boy and a girl, selected by administering a test of mathematical ability and choosing students with high ability. Subjects were given probability tasks covering sample space, probability of an event and probability comparison. The data analysis consisted of categorization, reduction, interpretation and conclusion. Credibility of the data was established through time triangulation. The results showed that the boy's probabilistic thinking in completing the tasks was at the multistructural level, while the girl's was at the unistructural level; that is, the boy's level of probabilistic thinking was higher than the girl's. The results could help curriculum developers formulate probability learning goals for elementary school students, and teachers could teach probability with regard to gender differences.

  10. VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES

    International Nuclear Information System (INIS)

    G.A. Valentine; F.V. Perry; S. Dartevelle

    2005-01-01

    Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields such as performance assessment for hazardous and/or radioactive waste disposal sites that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drive the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for "closure" of wall rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive "eruption" of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision

  11. Algorithms for the extension of precise and imprecise conditional probability assessments: an implementation with maple V

    Directory of Open Access Journals (Sweden)

    Veronica Biazzo

    2000-05-01

    Full Text Available In this paper, we illustrate an implementation with Maple V of some procedures which allow the exact propagation of precise and imprecise probability assessments. The extension of imprecise assessments is based on a suitable generalization of de Finetti's concept of coherence. The procedures described are supported by some examples and relevant cases.

  12. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  13. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

    It is anticipated that changes in the frequency of surveillance tests, preventive maintenance, or parts replacement of safety-related components may change component failure probabilities and, in turn, the core damage probability. It is also anticipated that the change differs depending on the initiating event frequency and the component type. This study assessed the change of core damage probability using a simplified PSA model capable of calculating core damage probability in a short time, developed by the US NRC to process accident sequence precursors, when various components' failure probabilities are varied between 0 and 1, or when Japanese or American initiating event frequency data are used. The analysis showed the following. (1) The frequency of surveillance tests, preventive maintenance, or parts replacement of motor-driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability increases. (2) Core damage probability is insensitive to changes in surveillance test frequency for motor-operated valves and the turbine-driven auxiliary feedwater pump, since the change in core damage probability is small even when their failure probabilities change by about an order of magnitude. (3) The change in core damage probability is small when Japanese failure probability data are applied to the emergency diesel generator, even if the failure probability changes by an order of magnitude from the base value; when American failure probability data are applied, however, the increase in core damage probability is large when the failure probability increases. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency, etc. (author)

  14. Human Error Probability Assessment During Maintenance Activities of Marine Systems

    Directory of Open Access Journals (Sweden)

    Rabiul Islam

    2018-03-01

    Full Text Available Background: Maintenance operations on-board ships are highly demanding. Maintenance operations are intensive activities requiring high man–machine interactions in challenging and evolving conditions. The evolving conditions are weather conditions, workplace temperature, ship motion, noise and vibration, and workload and stress. For example, extreme weather conditions affect seafarers' performance, increasing the chances of error, and, consequently, can cause injuries or fatalities to personnel. An effective human error probability model is required to better manage maintenance on-board ships. The developed model would assist in developing and maintaining effective risk management protocols. Thus, the objective of this study is to develop a human error probability model considering various internal and external factors affecting seafarers' performance. Methods: The human error probability model is developed using probability theory applied to a Bayesian network. The model is tested using the data received through the developed questionnaire survey of >200 experienced seafarers with >5 years of experience. The model developed in this study is used to find out the reliability of human performance on particular maintenance activities. Results: The developed methodology is tested on the maintenance of a marine engine's cooling water pump for the engine department and an anchor windlass for the deck department. In the considered case studies, human error probabilities are estimated in various scenarios and the results are compared between the scenarios and the different seafarer categories. The results of the case studies for both departments are also compared. Conclusion: The developed model is effective in assessing human error probabilities. These probabilities would get dynamically updated as and when new information is available on changes in either internal (i.e., training, experience, and fatigue) or external (i.e., environmental and operational) conditions

  15. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Full Text Available Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers' theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises' exceptionally restrictive definition of probability. This paper challenges Richard von Mises' definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that both the nature of human action and the relative frequency method for calculating numerical probabilities presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  16. Failure probability analysis of optical grid

    Science.gov (United States)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, an integrated computing environment based on optical networks, is expected to be an efficient infrastructure for advanced data-intensive grid applications. In optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. With optical-network-based distributed computing systems extensively applied to data processing, the application failure probability has become an important indicator of application quality and an important concern for operators. This paper presents a task-based method for analyzing the application failure probability in optical grid. The failure probability of the entire application can then be quantified, and the effectiveness of different backup strategies in reducing it can be compared, so that the different requirements of different clients can be satisfied. When an application modeled as a DAG (directed acyclic graph) is executed under different backup strategies, the application failure probability and completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new scheduling algorithm guarantees the failure probability requirement and improves network resource utilization, realizing a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in optical grid.
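
    A toy sketch of the task-based idea (assumptions for illustration only: independent task failures, an application that fails if any task fails, and replication as the backup strategy; none of these details are specified in the abstract):

```python
def app_failure_probability(task_fail_probs, replicas=1):
    """Series-system view of a task-based application: the application
    fails if any task fails; a task with k independent replicas fails
    only if all k replicas fail."""
    p_app_ok = 1.0
    for p in task_fail_probs:
        p_task_fail = p ** replicas
        p_app_ok *= (1.0 - p_task_fail)
    return 1.0 - p_app_ok

# same three tasks, with and without a one-replica backup strategy
p_no_backup = app_failure_probability([0.01, 0.02, 0.01], replicas=1)
p_backup = app_failure_probability([0.01, 0.02, 0.01], replicas=2)
```

    Comparing such quantities across backup strategies is the kind of trade-off (failure probability versus resource usage) the proposed MDSA algorithm navigates.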

  17. Unequal Probability Marking Approach to Enhance Security of Traceback Scheme in Tree-Based WSNs.

    Science.gov (United States)

    Huang, Changqin; Ma, Ming; Liu, Xiao; Liu, Anfeng; Zuo, Zhengbang

    2017-06-17

    Fog (from core to edge) computing is a newly emerging computing platform, which utilizes a large number of network devices at the edge of a network to provide ubiquitous computing, thus having great development potential. However, the issue of security poses an important challenge for fog computing. In particular, the Internet of Things (IoT) that constitutes the fog computing platform is crucial for preserving the security of a huge number of wireless sensors, which are vulnerable to attack. In this paper, a new unequal probability marking approach is proposed to enhance the security performance of logging and migration traceback (LM) schemes in tree-based wireless sensor networks (WSNs). The main contribution of this paper is to overcome the deficiency of the LM scheme regarding network lifetime and storage space. In the unequal probability marking logging and migration (UPLM) scheme of this paper, different marking probabilities are adopted for different nodes according to their distances to the sink. A large marking probability is assigned to nodes in remote areas (areas at a long distance from the sink), while a small marking probability is applied to nodes in nearby areas (areas at a short distance from the sink). This reduces the consumption of storage and energy in addition to enhancing the security performance, lifetime, and storage capacity. Marking information will be migrated to nodes at a longer distance from the sink to increase the amount of stored marking information, thus enhancing the security performance in the process of migration. The experimental simulation shows that for general tree-based WSNs, the UPLM scheme proposed in this paper can store 1.12-1.28 times the amount of marking information that the equal probability marking approach achieves, and has 1.15-1.26 times the storage utilization efficiency compared with other schemes.
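
    The distance-dependent marking rule can be illustrated as below; the linear interpolation and the parameter values are assumptions for illustration, since the abstract does not give the UPLM formula:

```python
def marking_probability(hops_to_sink, max_hops, p_min=0.1, p_max=0.9):
    """Unequal marking sketch: nodes farther from the sink mark with
    higher probability (linear interpolation between p_min and p_max;
    the actual UPLM rule may differ)."""
    frac = hops_to_sink / max_hops
    return p_min + (p_max - p_min) * frac

# a remote node marks far more often than one adjacent to the sink
p_near = marking_probability(1, 10)
p_far = marking_probability(9, 10)
```

    The intended effect is that storage and energy consumption concentrate away from the congested region near the sink.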

  18. Location Prediction Based on Transition Probability Matrices Constructing from Sequential Rules for Spatial-Temporal K-Anonymity Dataset

    Science.gov (United States)

    Liu, Zhao; Zhu, Yunhong; Wu, Chenxue

    2016-01-01

    Spatial-temporal k-anonymity has become a mainstream approach among techniques for protecting users’ privacy in location-based services (LBS) applications, and has been applied to several variants such as LBS snapshot queries and continuous queries. Analyzing large-scale spatial-temporal anonymity sets may benefit several LBS applications. In this paper, we propose two location prediction methods based on transition probability matrices constructed from sequential rules for spatial-temporal k-anonymity datasets. First, we define single-step sequential rules mined from sequential spatial-temporal k-anonymity datasets generated from continuous LBS queries for multiple users. We then construct transition probability matrices from the mined single-step sequential rules, and normalize the transition probabilities in the transition matrices. Next, we regard a mobility model for an LBS requester as a stationary stochastic process and compute the n-step transition probability matrices by raising the normalized transition probability matrices to the power n. Furthermore, we propose two location prediction methods: rough prediction and accurate prediction. The former obtains the probabilities of arriving at target locations along simple paths that include only current locations, target locations and transition steps. By iteratively combining the probabilities for simple paths with n steps and the probabilities for detailed paths with n-1 steps, the latter method calculates transition probabilities for detailed paths with n steps from current locations to target locations. Finally, we conduct extensive experiments that verify the correctness and flexibility of our proposed algorithms. PMID:27508502
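The normalize-then-exponentiate step is standard Markov-chain machinery and can be sketched directly. The 3-location count matrix below is invented for illustration; the paper builds its counts from mined single-step sequential rules:

```python
import numpy as np

# Toy illustration of the n-step transition idea: row-normalize a matrix of
# single-step rule counts into a stochastic matrix, then raise it to power n.
counts = np.array([[2.0, 6.0, 2.0],
                   [1.0, 1.0, 8.0],
                   [5.0, 4.0, 1.0]])
P = counts / counts.sum(axis=1, keepdims=True)   # row-stochastic matrix

n = 3
P_n = np.linalg.matrix_power(P, n)               # n-step transition probabilities

# Rows of P_n still sum to 1, and P_n[i, j] is the probability of reaching
# location j from location i in exactly n steps.
print(P_n.round(3))
```

This corresponds to the "rough prediction" ingredient: the chain is assumed stationary, so the same normalized matrix is reused at every step.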

  19. Knock probability estimation through an in-cylinder temperature model with exogenous noise

    Science.gov (United States)

    Bares, P.; Selmanaj, D.; Guardiola, C.; Onder, C.

    2018-01-01

    This paper presents a new knock model which combines a deterministic knock model based on the in-cylinder temperature with an exogenous noise disturbing this temperature. The autoignition of the end-gas is modelled by an Arrhenius-like function and the knock probability is estimated by propagating a virtual error probability distribution. Results show that the random nature of knock can be explained by uncertainties in the in-cylinder temperature estimation. The model has only one parameter for calibration and thus can be easily adapted online. In order to reduce the measurement uncertainties associated with the air mass flow sensor, the trapped mass is derived from the in-cylinder pressure resonance, which improves the knock probability estimation and reduces the number of sensors needed for the model. A four-stroke SI engine was used for model validation. By varying the intake temperature, the engine speed, the injected fuel mass, and the spark advance, specific tests were conducted which furnished data with various knock intensities and probabilities. The new model is able to predict the knock probability within a sufficient range at various operating conditions. The trapped mass obtained by the acoustical model was compared in steady conditions against a fuel balance and a lambda sensor, and differences below 1% were found.
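The core probabilistic idea — a deterministic temperature estimate plus exogenous Gaussian noise, with knock declared when the noisy temperature exceeds an autoignition threshold — reduces to a single Gaussian tail probability. The threshold and noise level below are invented for illustration; the paper's actual model propagates the error through an Arrhenius-like autoignition integral:

```python
import math

# Minimal sketch: P(knock) = P(T_est + noise > T_crit), noise ~ N(0, sigma^2).
# All numeric values (threshold, sigma) are illustrative assumptions.

def knock_probability(t_est: float, t_crit: float, sigma: float) -> float:
    """Gaussian tail probability that the noisy temperature exceeds t_crit."""
    z = (t_crit - t_est) / sigma
    return 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))

print(knock_probability(900.0, 950.0, 25.0))  # well below threshold: low probability
print(knock_probability(960.0, 950.0, 25.0))  # above threshold: high probability
```

The single tuning parameter in this sketch is `sigma`, mirroring the paper's point that one calibration parameter suffices and can be adapted online.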

  20. Computer-aided mathematical analysis of probability of intercept for ground-based communication intercept system

    Science.gov (United States)

    Park, Sang Chul

    1989-09-01

    We develop a mathematical analysis model to calculate the probability of intercept (POI) for a ground-based communication intercept (COMINT) system. The POI is a measure of the effectiveness of the intercept system. We define the POI as the product of the probability of detection and the probability of coincidence. The probability of detection is a measure of the receiver's capability to detect a signal in the presence of noise. The probability of coincidence is the probability that an intercept system is available, actively listening in the proper frequency band, in the right direction, and at the same time that the signal is received. We investigate the behavior of the POI with respect to the observation time, the separation distance, antenna elevations, the frequency of the signal, and the receiver bandwidths. We observe that the coincidence characteristic between the receiver scanning parameters and the signal parameters is the key factor in determining the time needed to obtain a given POI. This model can be used to find the optimal parameter combination to maximize the POI in a given scenario. We expand this model to multiple systems. The analysis is conducted on a personal computer to provide portability. The model is also flexible and can be easily implemented under different situations.
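The defining product POI = P(detection) × P(coincidence) can be sketched with a toy coincidence model in which the scanning receiver dwells on the signal's band a fraction of the time and the signal is transmitting a fraction of the time. The factorization of coincidence and all numbers are illustrative assumptions, not the paper's detailed scan/signal timing analysis:

```python
# POI as the product of detection and coincidence probabilities (toy model).
# band_dwell_fraction and signal_duty_cycle are invented illustrative inputs.

def probability_of_intercept(p_detection: float,
                             band_dwell_fraction: float,
                             signal_duty_cycle: float) -> float:
    p_coincidence = band_dwell_fraction * signal_duty_cycle
    return p_detection * p_coincidence

poi = probability_of_intercept(p_detection=0.9,
                               band_dwell_fraction=0.25,
                               signal_duty_cycle=0.5)
print(poi)  # 0.1125
```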

  1. Novel MGF-based expressions for the average bit error probability of binary signalling over generalized fading channels

    KAUST Repository

    Yilmaz, Ferkan

    2014-04-01

    The main idea in the moment generating function (MGF) approach is to alternatively express the conditional bit error probability (BEP) in a desired exponential form so that possibly multi-fold performance averaging is readily converted into a computationally efficient single-fold averaging - sometimes into a closed form - by means of using the MGF of the signal-to-noise ratio. However, as presented in [1] and specifically indicated in [2], and also to the best of our knowledge, there does not exist an MGF-based approach in the literature to represent Wojnar's generic BEP expression in a desired exponential form. This paper presents novel MGF-based expressions for calculating the average BEP of binary signalling over generalized fading channels, specifically by expressing Wojnar's generic BEP expression in a desirable exponential form. We also propose MGF-based expressions to explore the amount of dispersion in the BEP for binary signalling over generalized fading channels.
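The single-fold averaging the abstract refers to can be sketched numerically for the textbook case of coherent BPSK over Rayleigh fading (not Wojnar's generic expression treated in the paper): the average BEP becomes (1/π)∫₀^{π/2} M(−1/sin²θ) dθ with the Rayleigh SNR MGF M(s) = 1/(1 − s·γ̄), and it can be checked against the known closed form 0.5(1 − √(γ̄/(1+γ̄))):

```python
import math

# Single-fold MGF averaging for BPSK over Rayleigh fading (textbook case,
# used here only to illustrate the MGF approach the abstract describes).

def avg_bep_bpsk_rayleigh_mgf(gamma_bar: float, steps: int = 10000) -> float:
    mgf = lambda s: 1.0 / (1.0 - s * gamma_bar)   # Rayleigh SNR MGF
    h = (math.pi / 2) / steps
    total = 0.0
    for k in range(steps):
        theta = (k + 0.5) * h                     # midpoint rule
        total += mgf(-1.0 / math.sin(theta) ** 2)
    return total * h / math.pi

gamma_bar = 10.0
closed_form = 0.5 * (1.0 - math.sqrt(gamma_bar / (1.0 + gamma_bar)))
print(avg_bep_bpsk_rayleigh_mgf(gamma_bar), closed_form)
```

The point of the exponential form is visible here: the fading average collapses to one finite integral of the MGF instead of an integral over the fading density for every conditional BEP.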

  2. Transitional Probabilities Are Prioritized over Stimulus/Pattern Probabilities in Auditory Deviance Detection: Memory Basis for Predictive Sound Processing.

    Science.gov (United States)

    Mittag, Maria; Takegata, Rika; Winkler, István

    2016-09-14

    Representations encoding the probabilities of auditory events do not directly support predictive processing. In contrast, information about the probability with which a given sound follows another (transitional probability) allows predictions of upcoming sounds. We tested whether behavioral and cortical auditory deviance detection (the latter indexed by the mismatch negativity event-related potential) relies on probabilities of sound patterns or on transitional probabilities. We presented healthy adult volunteers with three types of rare tone-triplets among frequent standard triplets of high-low-high (H-L-H) or L-H-L pitch structure: proximity deviant (H-H-H/L-L-L), reversal deviant (L-H-L/H-L-H), and first-tone deviant (L-L-H/H-H-L). If deviance detection was based on pattern probability, reversal and first-tone deviants should be detected with similar latency because both differ from the standard at the first pattern position. If deviance detection was based on transitional probabilities, then reversal deviants should be the most difficult to detect because, unlike the other two deviants, they contain no low-probability pitch transitions. The data clearly showed that both behavioral and cortical auditory deviance detection uses transitional probabilities. Thus, the memory traces underlying cortical deviance detection may provide a link between stimulus probability-based change/novelty detectors operating at lower levels of the auditory system and higher auditory cognitive functions that involve predictive processing. Our research presents the first definite evidence for the auditory system prioritizing transitional probabilities over probabilities of individual sensory events. Forming representations for transitional probabilities paves the way for predictions of upcoming sounds. Several recent theories suggest that predictive processing provides the general basis of human perception, including important auditory functions, such as auditory scene analysis.

  3. F.Y. Edgeworth’s Treatise on Probabilities

    OpenAIRE

    Alberto Baccini

    2007-01-01

    Probability theory has a central role in Edgeworth’s thought; this paper examines the philosophical foundation of the theory. Starting from a frequentist position, Edgeworth introduced some innovations on the definition of primitive probabilities. He distinguished between primitive probabilities based on experience of statistical evidence, and primitive a priori probabilities based on a more general and less precise kind of experience, inherited by the human race through evolution. Given prim...

  4. Estimation of component failure probability from masked binomial system testing data

    International Nuclear Information System (INIS)

    Tan Zhibin

    2005-01-01

    The component failure probability estimates from analysis of binomial system testing data are very useful because they reflect the operational failure probability of components in the field, which is similar to the test environment. In practice, this type of analysis is often confounded by the problem of data masking: the status of tested components is unknown. Methods that consider this type of uncertainty are usually computationally intensive and not practical for complex systems. In this paper, we consider masked binomial system testing data and develop a probabilistic model to efficiently estimate component failure probabilities. In the model, all system tests are classified into test categories based on component coverage. Component coverage of test categories is modeled by a bipartite graph. Test category failure probabilities conditional on the status of covered components are defined. An EM algorithm to estimate component failure probabilities is developed based on a simple but powerful concept: equivalent failures and tests. By simulation we not only demonstrate the convergence and accuracy of the algorithm but also show that the probabilistic model is capable of analyzing systems in series, parallel and any other user-defined structures. A case study illustrates an application in test case prioritization.
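The "equivalent failures" idea can be illustrated with a toy EM iteration. The two-component setup, the counts, and the assumption that exactly one component causes each failed test are all invented for illustration and are far simpler than the paper's bipartite-graph model:

```python
# Toy EM for masked failure data: some failures are attributed to a known
# component, some are "masked" (a component failed, but we don't know which).
# E-step splits masked failures in proportion to current estimates, producing
# "equivalent failures"; M-step re-estimates the failure probabilities.

n = 100        # total system tests
f1, f2 = 8, 4  # failures attributed to component 1 / component 2
fm = 6         # masked failures

p1, p2 = 0.1, 0.1                              # initial guesses
for _ in range(200):
    w1 = p1 / (p1 + p2)                        # E-step: share of masked failures
    e1, e2 = f1 + fm * w1, f2 + fm * (1 - w1)  # equivalent failures
    p1, p2 = e1 / n, e2 / n                    # M-step

print(round(p1, 4), round(p2, 4))
```

In this toy case the EM fixed point is the intuitive one: masked failures end up apportioned in the ratio of the unmasked counts (8:4), so p1 and p2 converge to 0.12 and 0.06.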

  5. Truth, possibility and probability new logical foundations of probability and statistical inference

    CERN Document Server

    Chuaqui, R

    1991-01-01

    Anyone involved in the philosophy of science is naturally drawn into the study of the foundations of probability. Different interpretations of probability, based on competing philosophical ideas, lead to different statistical techniques, and frequently to mutually contradictory consequences. This unique book presents a new interpretation of probability, rooted in the traditional interpretation that was current in the 17th and 18th centuries. Mathematical models are constructed based on this interpretation, and statistical inference and decision theory are applied, including some examples in artificial intelligence, solving the main foundational problems. Nonstandard analysis is extensively developed for the construction of the models and in some of the proofs. Many nonstandard theorems are proved, some of them new, in particular, a representation theorem that asserts that any stochastic process can be approximated by a process defined over a space with equiprobable outcomes.

  6. Condition-based Human Reliability Assessment for digitalized control room

    International Nuclear Information System (INIS)

    Kang, H. G.; Jang, S. C.; Eom, H. S.; Ha, J. J.

    2005-04-01

    In safety-critical systems, the generation failure of an actuation signal is caused by the concurrent failures of the automated systems and an operator action. These two sources of safety signals are correlated in a complicated manner. The failures of sensors or automated systems will cause a lack of necessary information for a human operator and result in error-forcing contexts such as the loss of corresponding alarms and indications. In the conventional analysis, the Human Error Probabilities (HEP) are estimated based on the assumption of a 'normal condition of indications and alarms'. In order to construct a more realistic signal-generation failure model, we have to consider more complicated conditions in a more realistic manner. In this study, we performed two kinds of investigation to address this issue. We performed analytic calculations to estimate the effect of sensor failures on the system unavailability and plant risk. For the single-parameter safety signals, the analysis reveals that the quantification of the HEP should be performed by focusing on the 'no alarm from the automatic system and corresponding indications unavailable' situation. This study also proposes a Condition-Based Human Reliability Assessment (CBHRA) method in order to address these complicated conditions in a practical way. We apply the CBHRA method to the manual actuation of safety features such as a reactor trip and auxiliary feedwater actuation in Korean Standard Nuclear Power Plants. In the case of the conventional single-HEP method, it is very hard to consider multiple HE conditions. The merit of CBHRA is clearly shown in the application to the AFAS generation, where no dominating HE condition exists. In this case, even if the HE conditions are carefully investigated, the single-HEP method cannot accommodate the multiple conditions in a fault tree. On the other hand, the application result of the reactor trip in SLOCA shows that if there is a dominating condition, the use

  7. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Full Text Available Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic as a model within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. The probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
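The final mapping step — from a fitted logistic-regression model to a per-building fire probability — is a one-line logistic link. The coefficients and predictors (building age, floor area) below are invented for illustration; the study fits its model on real incident data:

```python
import math

# Hypothetical logistic-regression scoring for one building.
# Intercept, coefficients, and features are illustrative assumptions.

def fire_probability(intercept: float, coefs, features) -> float:
    z = intercept + sum(b * x for b, x in zip(coefs, features))
    return 1.0 / (1.0 + math.exp(-z))    # logistic link

p = fire_probability(intercept=-6.0,
                     coefs=[0.02, 0.001],        # per year of age, per m^2
                     features=[50.0, 1200.0])    # 50 years old, 1200 m^2
print(p)
```

Evaluating this score for every building in a region and binning the results is what produces the probability maps the abstract describes.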

  8. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations generally holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels 'possible to occur' or 'impossible to occur' to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion of possibilistic vs. probabilistic approaches. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  9. Thermal disadvantage factor calculation by the multiregion collision probability method

    International Nuclear Information System (INIS)

    Ozgener, B.; Ozgener, H.A.

    2004-01-01

    A multi-region collision probability formulation that is capable of applying the white boundary condition directly is presented and applied to thermal neutron transport problems. The disadvantage factors computed are compared with their counterparts calculated by S_N methods with both direct and indirect application of the white boundary condition. The results of the ABH and collision probability methods with indirect application of the white boundary condition are also considered, and comparisons with benchmark Monte Carlo results are carried out. The studies show that the proposed formulation is capable of calculating the thermal disadvantage factor with sufficient accuracy without resorting to the fictitious scattering outer shell approximation associated with the indirect application of the white boundary condition in collision probability solutions.

  10. Exploring non-signalling polytopes with negative probability

    International Nuclear Information System (INIS)

    Oas, G; Barros, J Acacio de; Carvalhaes, C

    2014-01-01

    Bipartite and tripartite EPR–Bell type systems are examined via joint quasi-probability distributions where probabilities are permitted to be negative. It is shown that such distributions exist only when the no-signalling condition is satisfied. A characteristic measure, the probability mass, is introduced and, via its minimization, limits the number of quasi-distributions describing a given marginal probability distribution. The minimized probability mass is shown to be an alternative way to characterize non-local systems. Non-signalling polytopes for two to eight settings in the bipartite scenario are examined and compared to prior work. Examining perfect cloning of non-local systems within the tripartite scenario suggests defining two categories of signalling. It is seen that many properties of non-local systems can be efficiently described by quasi-probability theory. (paper)

  11. Hydrological model calibration for flood prediction in current and future climates using probability distributions of observed peak flows and model based rainfall

    Science.gov (United States)

    Haberlandt, Uwe; Wallner, Markus; Radtke, Imke

    2013-04-01

    Derived flood frequency analysis based on continuous hydrological modelling is very demanding regarding the required length and temporal resolution of precipitation input data. Often such flood predictions are obtained using long precipitation time series from stochastic approaches or from regional climate models as input. However, the calibration of the hydrological model is usually done using short time series of observed data. This inconsistent employment of different data types for calibration and application of a hydrological model increases its uncertainty. Here, it is proposed to calibrate a hydrological model directly on probability distributions of observed peak flows using model based rainfall in line with its later application. Two examples are given to illustrate the idea. The first one deals with classical derived flood frequency analysis using input data from an hourly stochastic rainfall model. The second one concerns a climate impact analysis using hourly precipitation from a regional climate model. The results show that: (I) the same type of precipitation input data should be used for calibration and application of the hydrological model, (II) a model calibrated on extreme conditions works quite well for average conditions but not vice versa, (III) the calibration of the hydrological model using regional climate model data works as an implicit bias correction method and (IV) the best performance for flood estimation is usually obtained when model based precipitation and observed probability distribution of peak flows are used for model calibration.

  12. Constructing diagnostic likelihood: clinical decisions using subjective versus statistical probability.

    Science.gov (United States)

    Kinnear, John; Jackson, Ruth

    2017-07-01

    Although physicians are highly trained in the application of evidence-based medicine, and are assumed to make rational decisions, there is evidence that their decision making is prone to biases. One of the biases that has been shown to affect the accuracy of judgements is that of representativeness and base-rate neglect, where the saliency of a person's features leads to overestimation of their likelihood of belonging to a group. This results in the substitution of 'subjective' probability for statistical probability. This study examines clinicians' propensity to make estimations of subjective probability when presented with clinical information that is considered typical of a medical condition. The strength of the representativeness bias is tested by presenting choices in textual and graphic form. Understanding of statistical probability is also tested by omitting all clinical information. For the questions that included clinical information, 46.7% and 45.5% of clinicians made judgements of statistical probability, respectively. Where the question omitted clinical information, 79.9% of clinicians made a judgement consistent with statistical probability. There was a statistically significant difference in responses to the questions with and without representativeness information (χ2 (1, n=254)=54.45, p < 0.001), indicating that the presence of representative clinical information biased clinicians towards subjective probability. One of the causes for this representativeness bias may be the way clinical medicine is taught, where stereotypic presentations are emphasised in diagnostic decision making. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  13. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  14. On the Hitting Probability of Max-Stable Processes

    OpenAIRE

    Hofmann, Martin

    2012-01-01

    The probability that a max-stable process η in C[0, 1] with identical marginal distribution function F hits x ∈ R with 0 < F(x) < 1 is the hitting probability of x. We show that the hitting probability is always positive, unless the components of η are completely dependent. Moreover, we consider the event that the paths of a standard MSP hit some x ∈ R twice, and we give a sufficient condition for a positive probability of this event.

  15. Bayesian Based Diagnostic Model for Condition Based Maintenance of Offshore Wind Farms

    Directory of Open Access Journals (Sweden)

    Masoud Asgarpour

    2018-01-01

    Full Text Available Operation and maintenance costs are a major contributor to the Levelized Cost of Energy for electricity produced by offshore wind and can be significantly reduced if existing corrective actions are performed as efficiently as possible and if future corrective actions are avoided by performing sufficient preventive actions. This paper presents an applied and generic diagnostic model for fault detection and condition based maintenance of offshore wind components. The diagnostic model is based on two probabilistic matrices; first, a confidence matrix, representing the probability of detection using each fault detection method, and second, a diagnosis matrix, representing the individual outcome of each fault detection method. Once the confidence and diagnosis matrices of a component are defined, the individual diagnoses of each fault detection method are combined into a final verdict on the fault state of that component. Furthermore, this paper introduces a Bayesian updating model based on observations collected by inspections to decrease the uncertainty of initial confidence matrix. The framework and implementation of the presented diagnostic model are further explained within a case study for a wind turbine component based on vibration, temperature, and oil particle fault detection methods. The last part of the paper will have a discussion of the case study results and present conclusions.
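The Bayesian-updating step the abstract describes can be sketched for a single fault-detection method: a prior fault probability is updated by an alarm (or its absence) via Bayes' rule. The detection rate, false-alarm rate, and prior below are invented for illustration; the paper works with full confidence and diagnosis matrices over several methods:

```python
# Bayes update of a component's fault probability from one detection method.
# All rates and the prior are illustrative assumptions.

def posterior_fault_probability(prior_fault: float,
                                p_alarm_given_fault: float,
                                p_alarm_given_healthy: float,
                                alarm: bool) -> float:
    p_obs_fault = p_alarm_given_fault if alarm else 1.0 - p_alarm_given_fault
    p_obs_healthy = p_alarm_given_healthy if alarm else 1.0 - p_alarm_given_healthy
    evidence = prior_fault * p_obs_fault + (1.0 - prior_fault) * p_obs_healthy
    return prior_fault * p_obs_fault / evidence

p = 0.05                                               # prior fault probability
p = posterior_fault_probability(p, 0.9, 0.1, alarm=True)
print(round(p, 4))                                     # an alarm raises the fault probability
```

Combining several detection methods amounts to chaining such updates, one per method, which is the matrix-based combination the diagnostic model formalizes.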

  16. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  17. Mode of delivery and the probability of subsequent childbearing: a population-based register study.

    Science.gov (United States)

    Elvander, C; Dahlberg, J; Andersson, G; Cnattingius, S

    2015-11-01

    To investigate the relationship between mode of first delivery and probability of subsequent childbearing. Population-based study. Nationwide study in Sweden. A cohort of 771 690 women who delivered their first singleton infant in Sweden between 1992 and 2010. Using Cox's proportional-hazards regression models, risks of subsequent childbearing were compared across four modes of delivery. Hazard ratios (HRs) were calculated, using 95% confidence intervals (95% CIs). Probability of having a second and third child; interpregnancy interval. Compared with women who had a spontaneous vaginal first delivery, women who delivered by vacuum extraction were less likely to have a second pregnancy (HR 0.96, 95% CI 0.95-0.97), and the probabilities of a second childbirth were substantially lower among women with a previous emergency caesarean section (HR 0.85, 95% CI 0.84-0.86) or an elective caesarean section (HR 0.82, 95% CI 0.80-0.83). There were no clinically important differences in the median time between first and second pregnancy by mode of first delivery. Compared with women younger than 30 years of age, older women were more negatively affected by a vacuum extraction with respect to the probability of having a second child. A primary vacuum extraction decreased the probability of having a third child by 4%, but having two consecutive vacuum extraction deliveries did not further alter the probability. A first delivery by vacuum extraction does not reduce the probability of subsequent childbearing to the same extent as a first delivery by emergency or elective caesarean section. © 2014 Royal College of Obstetricians and Gynaecologists.

  18. Teaching Probability to Pre-Service Teachers with Argumentation Based Science Learning Approach

    Science.gov (United States)

    Can, Ömer Sinan; Isleyen, Tevfik

    2016-01-01

    The aim of this study is to explore the effects of the argumentation based science learning (ABSL) approach on the teaching probability to pre-service teachers. The sample of the study included 41 students studying at the Department of Elementary School Mathematics Education in a public university during the 2014-2015 academic years. The study is…

  19. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P plots.

  20. Reconciliation of Decision-Making Heuristics Based on Decision Trees Topologies and Incomplete Fuzzy Probabilities Sets.

    Science.gov (United States)

    Doubravsky, Karel; Dohnal, Mirko

    2015-01-01

    Complex decision making tasks of different natures, e.g. economics, safety engineering, ecology and biology, are based on vague, sparse, partially inconsistent and subjective knowledge. Moreover, decision making economists and engineers are usually not willing to invest too much time into the study of complex formal theories. They require decisions which can be (re)checked by human-like common sense reasoning. One important problem related to realistic decision making tasks is the incomplete data sets required by the chosen decision making algorithm. This paper presents a relatively simple algorithm showing how some missing III (input information items) can be generated using mainly decision tree topologies and integrated into incomplete data sets. The algorithm is based on an easy-to-understand heuristic, e.g. that a longer decision tree sub-path is less probable. This heuristic can solve decision problems under total ignorance, i.e. when the decision tree topology is the only information available. In practice, however, isolated information items, e.g. some vaguely known probabilities (e.g. fuzzy probabilities), are usually available. This means that a realistic problem is analysed under partial ignorance. The proposed algorithm reconciles topology-related heuristics and additional fuzzy sets using fuzzy linear programming. The case study, represented by a tree with six lotteries and one fuzzy probability, is presented in detail.

  1. Reconciliation of Decision-Making Heuristics Based on Decision Trees Topologies and Incomplete Fuzzy Probabilities Sets.

    Directory of Open Access Journals (Sweden)

    Karel Doubravsky

    Full Text Available Complex decision-making tasks of different natures, e.g. economics, safety engineering, ecology and biology, are based on vague, sparse, partially inconsistent and subjective knowledge. Moreover, decision-making economists and engineers are usually not willing to invest much time in the study of complex formal theories. They require decisions which can be (re)checked by human-like common-sense reasoning. One important problem related to realistic decision-making tasks is the incomplete data sets required by the chosen decision-making algorithm. This paper presents a relatively simple algorithm by which some missing input information items (III) can be generated, using mainly decision tree topologies, and integrated into incomplete data sets. The algorithm is based on an easy-to-understand heuristic, e.g. a longer decision tree sub-path is less probable. This heuristic can solve decision problems under total ignorance, i.e. when the decision tree topology is the only information available. In practice, however, isolated information items, e.g. some vaguely known (fuzzy) probabilities, are usually available. This means that a realistic problem is analysed under partial ignorance. The proposed algorithm reconciles topology-related heuristics and additional fuzzy sets using fuzzy linear programming. A case study, represented by a tree with six lotteries and one fuzzy probability, is presented in detail.

  2. Conditional net survival: Relevant prognostic information for colorectal cancer survivors. A French population-based study.

    Science.gov (United States)

    Drouillard, Antoine; Bouvier, Anne-Marie; Rollot, Fabien; Faivre, Jean; Jooste, Valérie; Lepage, Côme

    2015-07-01

    Traditionally, survival estimates have been reported as survival from the time of diagnosis. A patient's probability of survival changes according to time elapsed since the diagnosis and this is known as conditional survival. The aim was to estimate 5-year net conditional survival in patients with colorectal cancer in a well-defined French population at yearly intervals up to 5 years. Our study included 18,300 colorectal cancers diagnosed between 1976 and 2008 and registered in the population-based digestive cancer registry of Burgundy (France). We calculated conditional 5-year net survival, using the Pohar Perme estimator, for every additional year survived after diagnosis from 1 to 5 years. The initial 5-year net survival estimates varied between 89% for stage I and 9% for advanced stage cancer. The corresponding 5-year net survival for patients alive after 5 years was 95% and 75%. Stage II and III patients who survived 5 years had a similar probability of surviving 5 more years, respectively 87% and 84%. For survivors after the first year following diagnosis, five-year conditional net survival was similar regardless of age class and period of diagnosis. For colorectal cancer survivors, conditional net survival provides relevant and complementary prognostic information for patients and clinicians. Copyright © 2015 Editrice Gastroenterologica Italiana S.r.l. Published by Elsevier Ltd. All rights reserved.
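
The conditional-survival arithmetic behind the study is simple: the probability of surviving five more years given survival to year t is S(t+5)/S(t). The survival values below are hypothetical illustrations, not the Burgundy registry data.

```python
# Conditional survival sketch: P(survive to t+k | alive at t) = S(t+k)/S(t).
def conditional_survival(surv, t, k):
    """Conditional survival from a survival-curve dict {year: S(year)}."""
    return surv[t + k] / surv[t]

# Hypothetical net survival curve for an advanced-stage cancer.
S = {0: 1.00, 1: 0.50, 2: 0.30, 3: 0.20, 4: 0.15, 5: 0.12, 10: 0.09}

five_year_at_dx = conditional_survival(S, 0, 5)   # 0.12 at diagnosis
five_year_at_5 = conditional_survival(S, 5, 5)    # 0.09 / 0.12 = 0.75
# The conditional estimate improves markedly for survivors, mirroring the
# 9% -> 75% change reported above for advanced-stage patients.
```

The same ratio, computed on net (excess-mortality-adjusted) survival with the Pohar Perme estimator, yields the figures quoted in the abstract.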

  3. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming framework that has been shown to be effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  4. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples, from the most general scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management. Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, and from marketing to product discontinuation.

  5. Differentiated protection services with failure probability guarantee for workflow-based applications

    Science.gov (United States)

    Zhong, Yaoquan; Guo, Wei; Jin, Yaohui; Sun, Weiqiang; Hu, Weisheng

    2010-12-01

    A cost-effective and service-differentiated provisioning strategy is very desirable to service providers, so that they can offer users satisfactory services while optimizing network resource allocation. Providing differentiated protection services to connections for surviving link failure has been studied extensively in recent years. However, differentiated protection services for workflow-based applications, which consist of many interdependent tasks, have scarcely been studied. This paper investigates the problem of providing differentiated services for workflow-based applications in optical grids. We develop three differentiated protection service provisioning strategies which provide security-level guarantees and network-resource optimization for workflow-based applications. Simulations demonstrate that these heuristic algorithms provide protection cost-effectively while satisfying the applications' failure probability requirements.

  6. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion of possibilistic vs. probabilistic theories. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  7. Probability theory versus simulation of petroleum potential in play analysis

    Science.gov (United States)

    Crovelli, R.A.

    1987-01-01

    An analytic probabilistic methodology for resource appraisal of undiscovered oil and gas resources in play analysis is presented. This play-analysis methodology is a geostochastic system for petroleum resource appraisal in explored as well as frontier areas. An objective was to replace an existing Monte Carlo simulation method in order to increase the efficiency of the appraisal process. Underlying the two methods is a single geologic model which considers both the uncertainty of the presence of the assessed hydrocarbon and its amount if present. The results of the model are resource estimates of crude oil, nonassociated gas, dissolved gas, and gas for a geologic play in terms of probability distributions. The analytic method is based upon conditional probability theory and a closed form solution of all means and standard deviations, along with the probabilities of occurrence. © 1987 J.C. Baltzer A.G., Scientific Publishing Company.
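
The closed-form idea behind the analytic method can be sketched for the simplest case: if hydrocarbons are present with probability p, and the amount, if present, has mean mu and standard deviation sigma, then the unconditional resource estimate has an exact mean and standard deviation. The numbers below are illustrative, not from the paper's assessments.

```python
# Closed-form mean/std for X = (presence indicator) * (amount | present),
# combining the probability of occurrence with the conditional amount.
import math

def play_mean_std(p, mu, sigma):
    """Mean and std of the unconditional resource estimate."""
    mean = p * mu
    # Variance = within-play spread plus all-or-nothing presence spread.
    var = p * sigma**2 + p * (1 - p) * mu**2
    return mean, math.sqrt(var)

mean, std = play_mean_std(p=0.3, mu=100.0, sigma=40.0)
# mean = 0.3 * 100 = 30.0; the std exceeds sigma * p because the
# presence/absence uncertainty dominates when p is far from 1.
```

This is exactly the kind of quantity a Monte Carlo simulation would estimate by sampling; the closed form makes the appraisal deterministic and fast, which is the efficiency gain the abstract describes.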

  8. Statistics concerning the Apollo command module water landing, including the probability of occurrence of various impact conditions, successful impact, and body X-axis loads

    Science.gov (United States)

    Whitnah, A. M.; Howes, D. B.

    1971-01-01

    Statistical information for the Apollo command module water landings is presented. This information includes the probability of occurrence of various impact conditions, a successful impact, and body X-axis loads of various magnitudes.

  9. A Hybrid Generalized Hidden Markov Model-Based Condition Monitoring Approach for Rolling Bearings.

    Science.gov (United States)

    Liu, Jie; Hu, Youmin; Wu, Bo; Wang, Yan; Xie, Fengyun

    2017-05-18

    The operating condition of rolling bearings affects productivity and quality in the rotating machine process. Developing an effective rolling bearing condition monitoring approach is critical to accurately identify the operating condition. In this paper, a hybrid generalized hidden Markov model-based condition monitoring approach for rolling bearings is proposed, where interval valued features are used to efficiently recognize and classify machine states in the machine process. In the proposed method, vibration signals are decomposed into multiple modes with variational mode decomposition (VMD). Parameters of the VMD, in the form of generalized intervals, provide a concise representation for aleatory and epistemic uncertainty and improve the robustness of identification. The multi-scale permutation entropy method is applied to extract state features from the decomposed signals in different operating conditions. Traditional principal component analysis is adopted to reduce feature size and computational cost. With the extracted features' information, the generalized hidden Markov model, based on generalized interval probability, is used to recognize and classify the fault types and fault severity levels. Finally, the experiment results show that the proposed method is effective at recognizing and classifying the fault types and fault severity levels of rolling bearings. This monitoring method is also efficient enough to quantify the two uncertainty components.
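
One of the feature-extraction steps named above, permutation entropy, is compact enough to sketch. This is the standard single-scale definition (the building block of the multi-scale version used in the paper), not the authors' implementation.

```python
# Permutation entropy: the Shannon entropy of ordinal patterns of
# length `order` found in sliding windows of the signal, normalized
# so that 0 = fully regular and 1 = maximally irregular.
import math
from collections import Counter

def permutation_entropy(x, order=3):
    """Normalized permutation entropy of sequence x."""
    counts = Counter()
    for i in range(len(x) - order + 1):
        window = x[i:i + order]
        # Ordinal pattern: argsort of the window values.
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] += 1
    n = sum(counts.values())
    h = -sum((c / n) * math.log(c / n) for c in counts.values())
    return h / math.log(math.factorial(order))   # scale to [0, 1]

# A monotone ramp contains a single ordinal pattern, so its entropy is 0.
pe_ramp = permutation_entropy(list(range(10)))
```

In the paper's pipeline this feature is computed at multiple time scales on each VMD mode, then reduced by PCA before the generalized hidden Markov model classifies the fault state.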

  10. A Hybrid Generalized Hidden Markov Model-Based Condition Monitoring Approach for Rolling Bearings

    Directory of Open Access Journals (Sweden)

    Jie Liu

    2017-05-01

    Full Text Available The operating condition of rolling bearings affects productivity and quality in the rotating machine process. Developing an effective rolling bearing condition monitoring approach is critical to accurately identify the operating condition. In this paper, a hybrid generalized hidden Markov model-based condition monitoring approach for rolling bearings is proposed, where interval valued features are used to efficiently recognize and classify machine states in the machine process. In the proposed method, vibration signals are decomposed into multiple modes with variational mode decomposition (VMD). Parameters of the VMD, in the form of generalized intervals, provide a concise representation for aleatory and epistemic uncertainty and improve the robustness of identification. The multi-scale permutation entropy method is applied to extract state features from the decomposed signals in different operating conditions. Traditional principal component analysis is adopted to reduce feature size and computational cost. With the extracted features' information, the generalized hidden Markov model, based on generalized interval probability, is used to recognize and classify the fault types and fault severity levels. Finally, the experiment results show that the proposed method is effective at recognizing and classifying the fault types and fault severity levels of rolling bearings. This monitoring method is also efficient enough to quantify the two uncertainty components.

  11. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal-probability-plot-based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
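
The idea of simultaneous intervals can be illustrated with a generic Monte Carlo construction: simulate many standard normal samples of size n, then trim each order statistic's extremes equally until all n plotted points fall inside their intervals at once with probability about 1-α. This is a sketch for illustration, not the paper's exact interval construction.

```python
# Monte Carlo simultaneous bands for the order statistics of a
# standard normal sample of size n (joint coverage ~ 1 - alpha).
import random

def simultaneous_bands(n, alpha=0.05, reps=4000, seed=1):
    rng = random.Random(seed)
    sims = [sorted(rng.gauss(0.0, 1.0) for _ in range(n)) for _ in range(reps)]
    # cols[i] = sorted simulated values of the i-th order statistic.
    cols = [sorted(s[i] for s in sims) for i in range(n)]
    k = 0
    while True:
        lo = [cols[i][k] for i in range(n)]
        hi = [cols[i][reps - 1 - k] for i in range(n)]
        cover = sum(all(lo[i] <= s[i] <= hi[i] for i in range(n))
                    for s in sims) / reps
        if cover < 1.0 - alpha:          # trimmed one step too far
            k -= 1
            return ([cols[i][k] for i in range(n)],
                    [cols[i][reps - 1 - k] for i in range(n)])
        k += 1

lo, hi = simultaneous_bands(n=20)
# A sample (standardized and sorted) is judged consistent with
# normality when every ordered observation lies within its band.
```

The trimming depth is calibrated jointly, not pointwise, which is what makes the coverage statement simultaneous across all n points.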

  12. On parametric domain for asymptotic stability with probability one of zero solution of linear Ito stochastic differential equations

    International Nuclear Information System (INIS)

    Phan Thanh An; Phan Le Na; Ngo Quoc Chung

    2004-05-01

    We describe a practical implementation for finding the parametric domain of asymptotic stability with probability one of the zero solution of linear Ito stochastic differential equations, based on Korenevskij and Mitropolskij's sufficient condition and our sufficient conditions. Numerical results show that all of these sufficient conditions are crucial in the implementation. (author)
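
The kind of condition at stake can be illustrated in the simplest scalar case: for dX = aX dt + bX dW, the zero solution is asymptotically stable with probability one exactly when the sample Lyapunov exponent a - b²/2 is negative. The parameter values below are illustrative; the paper's multidimensional sufficient conditions are more involved.

```python
# Estimate the almost-sure Lyapunov exponent of dX = aX dt + bX dW
# along one sample path. For this equation, log X(t) is a Brownian
# motion with drift a - b^2/2, which we simulate in exact increments.
import math
import random

def sample_lyapunov(a, b, t_end=200.0, dt=1e-3, seed=42):
    """One-path estimate of lim log|X(t)| / t."""
    rng = random.Random(seed)
    log_x = 0.0
    for _ in range(int(t_end / dt)):
        log_x += (a - 0.5 * b * b) * dt + b * rng.gauss(0.0, math.sqrt(dt))
    return log_x / t_end

lam = sample_lyapunov(a=0.3, b=1.0)   # a - b^2/2 = -0.2 < 0: stable
# lam estimates -0.2, so |X(t)| decays to zero almost surely even
# though the drift coefficient a is positive (noise-induced stability).
```

Note that mean-square stability would require a + b²/2 < 0 instead; the gap between the two criteria is why probability-one (almost-sure) stability needs its own parametric domain.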

  13. Condition based spare parts supply

    NARCIS (Netherlands)

    Lin, X.; Basten, Robertus Johannes Ida; Kranenburg, A.A.; van Houtum, Geert-Jan

    2012-01-01

    We consider a spare parts stock point that serves an installed base of machines. Each machine contains the same critical component, whose degradation behavior is described by a Markov process. We consider condition based spare parts supply, and show that an optimal, condition based inventory policy

  14. Finding upper bounds for software failure probabilities - experiments and results

    International Nuclear Information System (INIS)

    Kristiansen, Monica; Winther, Rune

    2005-09-01

    This report looks into some aspects of using Bayesian hypothesis testing to find upper bounds for software failure probabilities. In the first part, the report evaluates the Bayesian hypothesis testing approach for finding upper bounds for failure probabilities of single software components. The report shows how different choices of prior probability distribution for a software component's failure probability influence the number of tests required to obtain adequate confidence in the component. The evaluation investigated both the effect of the shape of the prior distribution and the effect of one's prior confidence in the software component. In addition, different choices of prior probability distributions are discussed based on their relevance in a software context. In the second part, ideas are given on how the Bayesian hypothesis testing approach can be extended to assess systems consisting of multiple software components. One of the main challenges when assessing such systems is to include dependency aspects in the software reliability models. However, different types of failure dependencies between software components must be modelled differently. Identifying different types of failure dependencies is therefore an important condition for choosing a prior probability distribution which correctly reflects one's prior belief in the probability of software components failing dependently. In this report, software components include both general in-house software components and pre-developed software components (e.g. COTS, SOUP, etc.). (Author)

  15. Reliability analysis of reactor systems by applying probability method; Analiza pouzdanosti reaktorskih sistema primenom metoda verovatnoce

    Energy Technology Data Exchange (ETDEWEB)

    Milivojevic, S [Institute of Nuclear Sciences Boris Kidric, Vinca, Beograd (Serbia and Montenegro)

    1974-12-15

    The probability method chosen for analysing reactor system reliability is considered realistic, since it is based on verified experimental data. In fact, this is a statistical method. The probability method developed takes into account the probability distribution of permitted levels of relevant parameters and their particular influence on the reliability of the system as a whole. The proposed method is rather general, and was used for the problem of thermal safety analysis of a reactor system. This analysis makes it possible to study basic properties of the system under different operating conditions; expressed in the form of probabilities, the results show the reliability of the system as a whole as well as the reliability of each component.

  16. Interactive effects of senescence and natural disturbance on the annual survival probabilities of snail kites

    Science.gov (United States)

    Reichert, Brian E.; Martin, J.; Kendall, William L.; Cattau, Christopher E.; Kitchens, Wiley M.

    2010-01-01

    Individuals in wild populations face risks associated with both intrinsic (i.e. aging) and external (i.e. environmental) sources of mortality. Condition-dependent mortality occurs when there is an interaction between such factors; however, few studies have clearly demonstrated condition-dependent mortality and some have even argued that condition-dependent mortality does not occur in wild avian populations. Using large sample sizes (2084 individuals, 3746 re-sights) of individual-based longitudinal data collected over a 33 year period (1976-2008) on multiple cohorts, we used a capture-mark-recapture framework to model age-dependent survival in the snail kite Rostrhamus sociabilis plumbeus population in Florida. Adding to the growing amount of evidence for actuarial senescence in wild populations, we found evidence of senescent declines in survival probabilities in adult kites. We also tested the hypothesis that older kites experienced condition-dependent mortality during a range-wide drought event (2000-2002). The results provide convincing evidence that the annual survival probability of senescent kites was disproportionately affected by the drought relative to the survival probability of prime-aged adults. To our knowledge, this is the first evidence of condition-dependent mortality to be demonstrated in a wild avian population, a finding which challenges recent conclusions drawn in the literature. Our study suggests that senescence and condition-dependent mortality can affect the demography of wild avian populations. Accounting for these sources of variation may be particularly important to appropriately compute estimates of population growth rate, and probabilities of quasi-extinctions.

  17. UQ for Decision Making: How (at least five) Kinds of Probability Might Come Into Play

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    information the model-based probability is conditioned on holds. It is argued that no model-based climate-like probability forecast is complete without a quantitative estimate of its own irrelevance, and that the clear identification of model-based probability forecasts as mature or immature, are critical elements for maintaining the credibility of science-based decision support, and can shape uncertainty quantification more widely.

  18. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  19. A Bayesian-probability-based method for assigning protein backbone dihedral angles based on chemical shifts and local sequences

    Energy Technology Data Exchange (ETDEWEB)

    Wang Jun; Liu Haiyan [University of Science and Technology of China, Hefei National Laboratory for Physical Sciences at the Microscale, and Key Laboratory of Structural Biology, School of Life Sciences (China)], E-mail: hyliu@ustc.edu.cn

    2007-01-15

    Chemical shifts contain substantial information about protein local conformations. We present a method to assign individual protein backbone dihedral angles to specific regions on the Ramachandran map, based on the amino acid sequences and the chemical shifts of backbone atoms of tripeptide segments. The method uses a scoring function derived from the Bayesian probability for the central residue of a query tripeptide segment to have a particular conformation. The Ramachandran map is partitioned into representative regions at two levels of resolution. The lower-resolution partitioning is equivalent to the conventional definitions of different secondary structure regions on the map. At the higher resolution level, the α and β regions are further divided into subregions. Predictions are attempted at both levels of resolution. We compared our method with TALOS using the original TALOS database, and obtained comparable results. Although TALOS may produce the best results with currently available databases, which are much enlarged, the Bayesian-probability-based approach can provide a quantitative measure for the reliability of predictions.
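
The Bayesian scoring idea can be reduced to a toy example: treat each Ramachandran region as a class with a prior and a simple Gaussian model for one chemical shift, and score the central residue by the posterior P(region | shift). The priors, means, and widths below are invented for illustration only; the paper's scoring function is built from tripeptide sequence and multi-atom shift statistics.

```python
# Toy Bayesian region assignment: posterior over discrete conformational
# regions given one observed chemical shift, with Gaussian likelihoods.
import math

def posterior(shift, regions):
    """P(region | shift) for per-region (prior, mean, sd) Gaussian models."""
    def like(mu, sd):
        return math.exp(-0.5 * ((shift - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))
    scores = {name: prior * like(mu, sd)
              for name, (prior, mu, sd) in regions.items()}
    z = sum(scores.values())                 # Bayes normalization
    return {name: s / z for name, s in scores.items()}

# Hypothetical CA secondary-shift models for two coarse regions.
regions = {"alpha": (0.4, 2.5, 1.0), "beta": (0.6, -1.5, 1.2)}
post = posterior(1.8, regions)
# A shift near the alpha mean yields a posterior strongly favoring alpha,
# and the posterior value itself serves as a reliability measure.
```

The normalized posterior, rather than a hard label, is what gives the approach its quantitative reliability estimate for each prediction.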

  20. An extended car-following model considering the appearing probability of truck and driver's characteristics

    Science.gov (United States)

    Rong, Ying; Wen, Huiying

    2018-05-01

    In this paper, the appearing probability of a truck is introduced and an extended car-following model is presented to analyze the traffic flow, based on the consideration of driver's characteristics under a honk environment. The stability condition of the proposed model is obtained through linear stability analysis. In order to study the evolution properties of the traffic wave near the critical point, the mKdV equation is derived by the reductive perturbation method. The results show that the traffic flow becomes more disordered for a larger appearing probability of trucks. Besides, the appearance of a leading truck affects not only the stability of traffic flow, but also the effect of other aspects on traffic flow, such as the driver's reaction and the honk effect. Their effects on traffic flow are closely correlated with the appearing probability of trucks. Finally, numerical simulations under the periodic boundary condition are carried out to verify the proposed model; they are consistent with the theoretical findings.

  1. A Comparison of Urge Intensity and the Probability of Tic Completion During Tic Freely and Tic Suppression Conditions.

    Science.gov (United States)

    Specht, Matt W; Nicotra, Cassandra M; Kelly, Laura M; Woods, Douglas W; Ricketts, Emily J; Perry-Parrish, Carisa; Reynolds, Elizabeth; Hankinson, Jessica; Grados, Marco A; Ostrander, Rick S; Walkup, John T

    2014-03-01

    Tic-suppression-based treatments (TSBTs) represent a safe and effective treatment option for Chronic Tic Disorders (CTDs). Prior research has demonstrated that treatment naive youths with CTDs have the capacity to safely and effectively suppress tics for prolonged periods. It remains unclear how tic suppression is achieved. The current study principally examines how effective suppression is achieved and preliminary correlates of the ability to suppress tics. Twelve youths, ages 10 to 17 years, with moderate-to-marked CTDs participated in an alternating sequence of tic freely and reinforced tic suppression conditions during which urge intensity and tic frequency were frequently assessed. Probability of tics occurring was half as likely following high-intensity urges during tic suppression (31%) in contrast to low-intensity urges during tic freely conditions (60%). Age was not associated with ability to suppress. Intelligence indices were associated with or trended toward greater ability to suppress tics. Attention difficulties were not associated with ability to suppress but were associated with tic severity. In contrast to our "selective suppression" hypothesis, we found participants equally capable of suppressing their tics regardless of urge intensity during reinforced tic suppression. Tic suppression was achieved with an "across-the-board" effort to resist urges. Preliminary data suggest that ability to suppress may be associated with general cognitive variables rather than age, tic severity, urge severity, and attention. Treatment naive youths appear to possess a capacity for robust tic suppression. TSBTs may bolster these capacities and/or enable their broader implementation, resulting in symptom improvement. © The Author(s) 2014.

  2. Quantum processes: probability fluxes, transition probabilities in unit time and vacuum vibrations

    International Nuclear Information System (INIS)

    Oleinik, V.P.; Arepjev, Ju D.

    1989-01-01

    Transition probabilities in unit time and probability fluxes are compared in studying elementary quantum processes: the decay of a bound state under the action of time-varying and constant electric fields. It is shown that the difference between these quantities may be considerable, so the use of transition probabilities W instead of probability fluxes Π in calculating particle fluxes may lead to serious errors. The quantity W represents the rate of change with time of the population of energy levels relating partly to real states and partly to virtual ones, and it cannot be directly measured in experiment. The vacuum background is shown to be continuously distorted when a perturbation acts on a system. Because of this, the viewpoint of an observer on the physical properties of real particles continuously varies with time. This fact is not taken into consideration in the conventional theory of quantum transitions based on the notion of probability amplitude. As a result, the probability amplitudes lose their physical meaning. All the physical information on the quantum dynamics of a system is contained in the mean values of physical quantities. The existence of considerable differences between the quantities W and Π permits one, in principle, to make a choice of the correct theory of quantum transitions on the basis of experimental data. (author)

  3. Probability of Survival Decision Aid (PSDA)

    National Research Council Canada - National Science Library

    Xu, Xiaojiang; Amin, Mitesh; Santee, William R

    2008-01-01

    A Probability of Survival Decision Aid (PSDA) is developed to predict survival time for hypothermia and dehydration during prolonged exposure at sea in both air and water for a wide range of environmental conditions...

  4. A New Self-Constrained Inversion Method of Potential Fields Based on Probability Tomography

    Science.gov (United States)

    Sun, S.; Chen, C.; WANG, H.; Wang, Q.

    2014-12-01

    The self-constrained inversion method of potential fields uses a priori information self-extracted from the potential field data. Differing from external a priori information, the self-extracted information consists of parameters derived exclusively from the analysis of the gravity and magnetic data (Paoletti et al., 2013). Here we develop a new self-constrained inversion method based on probability tomography. Probability tomography does not need any a priori information, nor large inversion-matrix operations. Moreover, its result can describe the sources entirely and clearly, especially those whose distribution is complex and irregular. We therefore attempt to use the a priori information extracted from probability tomography results to constrain the inversion for physical properties. Magnetic anomaly data are taken as an example in this work. The probability tomography result of the magnetic total field anomaly (ΔΤ) shows a smoother distribution than the anomalous source and cannot display the source edges exactly. However, the gradients of ΔΤ have higher resolution than ΔΤ in their own directions, and this characteristic is also present in their probability tomography results. So we use some rules to combine the probability tomography results of ∂ΔΤ⁄∂x, ∂ΔΤ⁄∂y and ∂ΔΤ⁄∂z into a new result, which is used for extracting a priori information; we then incorporate that information into the model objective function as spatial weighting functions to invert the final magnetic susceptibility. Magnetic synthetic examples, with and without a priori information extracted from the probability tomography results, were computed for comparison; the results show that the former are more concentrated and resolve the source body edges with higher resolution. The method is finally applied to field-measured ΔΤ data from an iron mine in China and performs well. References: Paoletti, V., Ialongo, S., Florio, G., Fedi, M

  5. Random phenomena fundamentals of probability and statistics for engineers

    CERN Document Server

    Ogunnaike, Babatunde A

    2009-01-01

Prelude; Approach Philosophy; Four Basic Principles; I Foundations; Two Motivating Examples; Yield Improvement in a Chemical Process; Quality Assurance in a Glass Sheet Manufacturing Process; Outline of a Systematic Approach; Random Phenomena, Variability, and Uncertainty; Two Extreme Idealizations of Natural Phenomena; Random Mass Phenomena; Introducing Probability; The Probabilistic Framework; II Probability; Fundamentals of Probability Theory; Building Blocks; Operations; Probability; Conditional Probability; Independence; Random Variables and Distributions; Distributions; Mathematical Expectation; Characterizing Distributions; Special Derived Probability Functions; Multidimensional Random Variables; Distributions of Several Random Variables; Distributional Characteristics of Jointly Distributed Random Variables; Random Variable Transformations; Single Variable Transformations; Bivariate Transformations; General Multivariate Transformations; Application Case Studies I: Probability; Mendel and Heredity; World War II Warship Tactical Response Under Attack; III Distributions; Ide...

  6. Exact calculation of loop formation probability identifies folding motifs in RNA secondary structures

    Science.gov (United States)

    Sloma, Michael F.; Mathews, David H.

    2016-01-01

    RNA secondary structure prediction is widely used to analyze RNA sequences. In an RNA partition function calculation, free energy nearest neighbor parameters are used in a dynamic programming algorithm to estimate statistical properties of the secondary structure ensemble. Previously, partition functions have largely been used to estimate the probability that a given pair of nucleotides form a base pair, the conditional stacking probability, the accessibility to binding of a continuous stretch of nucleotides, or a representative sample of RNA structures. Here it is demonstrated that an RNA partition function can also be used to calculate the exact probability of formation of hairpin loops, internal loops, bulge loops, or multibranch loops at a given position. This calculation can also be used to estimate the probability of formation of specific helices. Benchmarking on a set of RNA sequences with known secondary structures indicated that loops that were calculated to be more probable were more likely to be present in the known structure than less probable loops. Furthermore, highly probable loops are more likely to be in the known structure than the set of loops predicted in the lowest free energy structures. PMID:27852924

  7. Ensemble based system for whole-slide prostate cancer probability mapping using color texture features.

    LENUS (Irish Health Repository)

    DiFranco, Matthew D

    2011-01-01

    We present a tile-based approach for producing clinically relevant probability maps of prostatic carcinoma in histological sections from radical prostatectomy. Our methodology incorporates ensemble learning for feature selection and classification on expert-annotated images. Random forest feature selection performed over varying training sets provides a subset of generalized CIEL*a*b* co-occurrence texture features, while sample selection strategies with minimal constraints reduce training data requirements to achieve reliable results. Ensembles of classifiers are built using expert-annotated tiles from training images, and scores for the probability of cancer presence are calculated from the responses of each classifier in the ensemble. Spatial filtering of tile-based texture features prior to classification results in increased heat-map coherence as well as AUC values of 95% using ensembles of either random forests or support vector machines. Our approach is designed for adaptation to different imaging modalities, image features, and histological decision domains.

  8. Quantum probabilities of composite events in quantum measurements with multimode states

    International Nuclear Information System (INIS)

    Yukalov, V I; Sornette, D

    2013-01-01

    The problem of defining quantum probabilities of composite events is considered. This problem is of great importance for the theory of quantum measurements and for quantum decision theory, which is a part of measurement theory. We show that the Lüders probability of consecutive measurements is a transition probability between two quantum states and that this probability cannot be treated as a quantum extension of the classical conditional probability. The Wigner distribution is shown to be a weighted transition probability that cannot be accepted as a quantum extension of the classical joint probability. We suggest the definition of quantum joint probabilities by introducing composite events in multichannel measurements. The notion of measurements under uncertainty is defined. We demonstrate that the necessary condition for mode interference is the entanglement of the composite prospect together with the entanglement of the composite statistical state. As an illustration, we consider an example of a quantum game. Special attention is paid to the application of the approach to systems with multimode states, such as atoms, molecules, quantum dots, or trapped Bose-condensed atoms with several coherent modes. (paper)

  9. USING RASCH ANALYSIS TO EXPLORE WHAT STUDENTS LEARN ABOUT PROBABILITY CONCEPTS

    Directory of Open Access Journals (Sweden)

    Zamalia Mahmud

    2015-01-01

Full Text Available Students’ understanding of probability concepts has been investigated from various perspectives. This study set out to investigate the perceived understanding of probability concepts of forty-four students from the STAT131 Understanding Uncertainty and Variation course at the University of Wollongong, NSW. Rasch measurement, which is based on a probabilistic model, was used to identify concepts that students find easy, moderate and difficult to understand. Data were captured from the e-learning Moodle platform, where students provided their responses through an online quiz. As illustrated in the Rasch map, 96% of the students could understand sample space, simple events, mutually exclusive events and tree diagrams, while 67% of the students found the concepts of conditional and independent events rather easy to understand. Keywords: Perceived Understanding, Probability Concepts, Rasch Measurement Model. DOI: dx.doi.org/10.22342/jme.61.1

  10. Methodology for reliability based condition assessment

    International Nuclear Information System (INIS)

    Mori, Y.; Ellingwood, B.

    1993-08-01

    Structures in nuclear power plants may be exposed to aggressive environmental effects that cause their strength to decrease over an extended period of service. A major concern in evaluating the continued service for such structures is to ensure that in their current condition they are able to withstand future extreme load events during the intended service life with a level of reliability sufficient for public safety. This report describes a methodology to facilitate quantitative assessments of current and future structural reliability and performance of structures in nuclear power plants. This methodology takes into account the nature of past and future loads, and randomness in strength and in degradation resulting from environmental factors. An adaptive Monte Carlo simulation procedure is used to evaluate time-dependent system reliability. The time-dependent reliability is sensitive to the time-varying load characteristics and to the choice of initial strength and strength degradation models but not to correlation in component strengths within a system. Inspection/maintenance strategies are identified that minimize the expected future costs of keeping the failure probability of a structure at or below an established target failure probability during its anticipated service period

  11. Normal mammogram detection based on local probability difference transforms and support vector machines

    International Nuclear Information System (INIS)

    Chiracharit, W.; Kumhom, P.; Chamnongthai, K.; Sun, Y.; Delp, E.J.; Babbs, C.F

    2007-01-01

Automatic detection of normal mammograms, as a ''first look'' for breast cancer, is a new approach to computer-aided diagnosis. This approach may be limited, however, by two main causes. The first problem is the presence of poorly separable ''crossed-distributions'' in which the correct classification depends upon the value of each feature. The second problem is overlap of the feature distributions that are extracted from digitized mammograms of normal and abnormal patients. Here we introduce a new Support Vector Machine (SVM) based method utilizing the proposed uncrossing mapping and Local Probability Difference (LPD). Crossed-distribution feature pairs are identified and mapped into new features that can be separated by a zero-hyperplane of the new axis. The probability density functions of the features of normal and abnormal mammograms are then sampled, and the local probability difference functions are estimated to enhance the features. From 1,000 ground-truth-known mammograms, 250 normal and 250 abnormal cases, including spiculated lesions, circumscribed masses or microcalcifications, are used for training a support vector machine. The classification results tested with another 250 normal and 250 abnormal sets show improved testing performance with 90% sensitivity and 89% specificity. (author)

  12. Operating envelope to minimize probability of fractures in Zircaloy-2 pressure tubes

    International Nuclear Information System (INIS)

    Azer, N.; Wong, H.

    1994-01-01

The failure mode of primary concern with Candu pressure tubes is fast fracture of a through-wall axial crack, resulting from delayed hydride crack growth. The application of operating envelopes to minimize the probability of fracture in Zircaloy-2 pressure tubes is demonstrated, based on Zr-2.5%Nb pressure tube experience. The technical basis for the development of the operating envelopes is also summarized. The operating envelope represents an area on the pressure versus temperature diagram within which the reactor may be operated without undue concern for pressure tube fracture. The envelopes presented address both normal operating conditions and the condition where a pressure tube leak has been detected. The examples in this paper are prepared to illustrate the methodology, and are not intended to be directly applicable to the operation of any specific reactor. The application of operating envelopes to minimize the probability of fracture in 80 mm diameter Zircaloy-2 pressure tubes is discussed, considering both normal operating and leaking pressure tube conditions. 3 refs., 4 figs

  13. Visualizing RNA Secondary Structure Base Pair Binding Probabilities using Nested Concave Hulls

    OpenAIRE

    Sansen , Joris; Bourqui , Romain; Thebault , Patricia; Allali , Julien; Auber , David

    2015-01-01

International audience; Challenge 1 of the BIOVIS 2015 design contest consists in designing an intuitive visual depiction of base pair binding probabilities for the secondary structure of ncRNA. Our representation depicts the potential nucleotide pair bindings using nested concave hulls over the computed MFE ncRNA secondary structure. It thus allows identification of regions with a high level of uncertainty in the MFE computation and of the structures which seem to match reality.

  14. Failure probability analysis on mercury target vessel

    International Nuclear Information System (INIS)

    Ishikura, Syuichi; Futakawa, Masatoshi; Kogawa, Hiroyuki; Sato, Hiroshi; Haga, Katsuhiro; Ikeda, Yujiro

    2005-03-01

Failure probability analysis was carried out to estimate the lifetime of the mercury target which will be installed into the JSNS (Japan spallation neutron source) in J-PARC (Japan Proton Accelerator Research Complex). The lifetime was estimated taking loading conditions and material degradation into account. The loads imposed on the target vessel were the static stresses due to thermal expansion and static pre-pressure of the He gas and mercury, and the dynamic stresses due to the thermally shocked pressure waves generated repeatedly at 25 Hz. Materials used in the target vessel will be degraded by fatigue, neutron and proton irradiation, mercury immersion, pitting damage, etc. The imposed stresses were evaluated through static and dynamic structural analyses. The material degradation was deduced from published experimental data. As a result, it was quantitatively confirmed that the failure probability for the lifetime expected in the design is very low, 10^-11 for the safety hull, meaning that it will hardly fail during the design lifetime. On the other hand, the beam window of the mercury vessel, which is subjected to high-pressure waves, exhibits a failure probability of 12%. It was concluded, therefore, that mercury leaking from a failed area at the beam window can be adequately contained in the space between the safety hull and the mercury vessel by using mercury-leakage sensors. (author)

  15. Safety constraints applied to an adaptive Bayesian condition-based maintenance optimization model

    International Nuclear Information System (INIS)

    Flage, Roger; Coit, David W.; Luxhøj, James T.; Aven, Terje

    2012-01-01

A model is described that determines an optimal inspection and maintenance scheme for a deteriorating unit with a stochastic degradation process with independent and stationary increments and for which the parameters are uncertain. This model and the resulting maintenance plans offer some distinct benefits compared to prior research, because the uncertainty of the degradation process is accommodated by a Bayesian approach and two new safety constraints have been applied to the problem: (1) with a given subjective probability (degree of belief), the limiting relative frequency of one or more failures during a fixed time interval is bounded; or (2) the subjective probability of one or more failures during a fixed time interval is bounded. In the model, the parameter(s) of a condition-based inspection scheduling function and a preventive replacement threshold are jointly optimized upon each replacement and inspection so as to minimize the expected long-run cost per unit of time, while also respecting one of the specified safety constraints. A numerical example is included to illustrate the effect of imposing each of the two different safety constraints.

  16. Markov transition probability-based network from time series for characterizing experimental two-phase flow

    International Nuclear Information System (INIS)

    Gao Zhong-Ke; Hu Li-Dan; Jin Ning-De

    2013-01-01

We generate a directed weighted complex network by a method based on Markov transition probability to represent an experimental two-phase flow. We first systematically carry out gas-liquid two-phase flow experiments for measuring the time series of flow signals. Then we construct directed weighted complex networks from various time series in terms of a network generation method based on Markov transition probability. We find that the generated network inherits the main features of the time series in the network structure. In particular, the networks from time series with different dynamics exhibit distinct topological properties. Finally, we construct two-phase flow directed weighted networks from experimental signals and associate the dynamic behavior of gas-liquid two-phase flow with the topological statistics of the generated networks. The results suggest that the topological statistics of two-phase flow networks allow quantitative characterization of the dynamic flow behavior in the transitions among different gas-liquid flow patterns. (general)
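    The construction described in the abstract (discretize a signal into states, count transitions, normalize each row into transition probabilities) can be sketched as follows. This is an illustrative reading, not the authors' code: the function name, the toy series, and the choice of three states are assumptions.

    ```python
    import numpy as np

    def transition_network(states, n_states):
        """Directed weighted adjacency matrix from a discretized time series:
        edge i -> j carries the Markov transition probability
        P(next state = j | current state = i)."""
        counts = np.zeros((n_states, n_states))
        for a, b in zip(states[:-1], states[1:]):
            counts[a, b] += 1.0
        row_sums = counts.sum(axis=1, keepdims=True)
        # Rows for states never visited stay all-zero instead of dividing by zero
        return np.divide(counts, row_sums,
                         out=np.zeros_like(counts), where=row_sums > 0.0)

    # Toy symbolic series with three states (e.g. from amplitude binning)
    series = [0, 1, 1, 2, 0, 1, 2, 2, 0]
    W = transition_network(series, 3)
    ```

    Each row of `W` is a conditional distribution over successor states, so edge weights directly encode the flow-signal dynamics that the network topology then summarizes.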

  17. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  18. Condition-based fault tree analysis (CBFTA): A new method for improved fault tree analysis (FTA), reliability and safety calculations

    International Nuclear Information System (INIS)

    Shalev, Dan M.; Tiran, Joseph

    2007-01-01

Condition-based maintenance methods have changed the reliability of systems in general and of individual systems in particular. Yet, this change is not reflected in system reliability analysis. System fault tree analysis (FTA) is performed during the design phase. It uses component failure rates derived from available sources such as handbooks, etc. Condition-based fault tree analysis (CBFTA) starts with the known FTA. Condition monitoring (CM) methods applied to systems (e.g. vibration analysis, oil analysis, electric current analysis, bearing CM, electric motor CM, and so forth) are used to determine updated failure rate values of sensitive components. The CBFTA method accepts the updated failure rates and applies them to the FTA. The CBFTA periodically recalculates the top event (TE) failure rate (λTE), thus determining the probability of system failure and the probability of successful system operation, i.e. the system's reliability. FTA is a tool for enhancing system reliability during the design stages, but it has disadvantages, chiefly that it does not relate to a specific system undergoing maintenance. CBFTA is a tool for updating the reliability values of a specific system and for calculating the residual life according to the system's monitored conditions. Using CBFTA, the original FTA is improved into a practical tool for use during the system's field life phase, not just during the system design phase. This paper describes the CBFTA method, and its advantages are demonstrated by an example
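    As a rough illustration of the recalculation step (not the authors' implementation), the sketch below recomputes a top-event probability for a hypothetical two-gate tree when condition monitoring raises one component's failure rate. The tree layout, the rates, and the mission time are all made up.

    ```python
    import math

    def p_fail(rate, t):
        # Probability of failure within time t for a constant failure rate
        return 1.0 - math.exp(-rate * t)

    def top_event(rates, t):
        # Hypothetical tree: TE = OR(AND(c1, c2), c3), basic events independent
        p1, p2, p3 = (p_fail(r, t) for r in rates)
        p_and = p1 * p2
        return 1.0 - (1.0 - p_and) * (1.0 - p3)

    design_rates = (1e-4, 2e-4, 5e-6)     # per hour, from handbooks
    monitored_rates = (3e-4, 2e-4, 5e-6)  # vibration analysis flags component 1

    p_design = top_event(design_rates, 1000.0)
    p_updated = top_event(monitored_rates, 1000.0)
    ```

    Re-running the same gate logic with monitored rates is the essence of the periodic λTE update: the tree structure is fixed at design time, while the leaf probabilities track the system's actual condition.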

  19. An evaluation method for tornado missile strike probability with stochastic correction

    Energy Technology Data Exchange (ETDEWEB)

    Eguchi, Yuzuru; Murakami, Takahiro; Hirakuchi, Hiromaru; Sugimoto, Soichiro; Hattori, Yasuo [Nuclear Risk Research Center (External Natural Event Research Team), Central Research Institute of Electric Power Industry, Abiko (Japan)

    2017-03-15

    An efficient evaluation method for the probability of a tornado missile strike without using the Monte Carlo method is proposed in this paper. A major part of the proposed probability evaluation is based on numerical results computed using an in-house code, Tornado-borne missile analysis code, which enables us to evaluate the liftoff and flight behaviors of unconstrained objects on the ground driven by a tornado. Using the Tornado-borne missile analysis code, we can obtain a stochastic correlation between local wind speed and flight distance of each object, and this stochastic correlation is used to evaluate the conditional strike probability, QV(r), of a missile located at position r, where the local wind speed is V. In contrast, the annual exceedance probability of local wind speed, which can be computed using a tornado hazard analysis code, is used to derive the probability density function, p(V). Then, we finally obtain the annual probability of tornado missile strike on a structure with the convolutional integration of product of QV(r) and p(V) over V. The evaluation method is applied to a simple problem to qualitatively confirm the validity, and to quantitatively verify the results for two extreme cases in which an object is located just in the vicinity of or far away from the structure.
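    The final convolution step, P = ∫ QV(r) p(V) dV, can be sketched numerically as below. Both stand-in functions are chosen purely for illustration; they are not the outputs of the Tornado-borne missile analysis code or the tornado hazard code.

    ```python
    import numpy as np

    def strike_prob_given_speed(V):
        # Stand-in for QV(r): conditional strike probability at a fixed
        # location, assumed to rise smoothly with local wind speed V (logistic)
        return 1.0 / (1.0 + np.exp(-(V - 60.0) / 10.0))

    def wind_speed_density(V, scale=15.0):
        # Stand-in for p(V): probability density of local wind speed
        return np.exp(-V / scale) / scale

    V = np.linspace(0.0, 150.0, 3001)
    dV = V[1] - V[0]
    # Integrate the product of the two factors over V (simple Riemann sum)
    annual_strike_prob = float(
        np.sum(strike_prob_given_speed(V) * wind_speed_density(V)) * dV)
    ```

    The point of the decomposition is that the expensive flight simulation only has to produce QV(r) once; any revised hazard curve p(V) then yields a new strike probability by redoing this cheap integral.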

  20. An evaluation method for tornado missile strike probability with stochastic correction

    International Nuclear Information System (INIS)

    Eguchi, Yuzuru; Murakami, Takahiro; Hirakuchi, Hiromaru; Sugimoto, Soichiro; Hattori, Yasuo

    2017-01-01

    An efficient evaluation method for the probability of a tornado missile strike without using the Monte Carlo method is proposed in this paper. A major part of the proposed probability evaluation is based on numerical results computed using an in-house code, Tornado-borne missile analysis code, which enables us to evaluate the liftoff and flight behaviors of unconstrained objects on the ground driven by a tornado. Using the Tornado-borne missile analysis code, we can obtain a stochastic correlation between local wind speed and flight distance of each object, and this stochastic correlation is used to evaluate the conditional strike probability, QV(r), of a missile located at position r, where the local wind speed is V. In contrast, the annual exceedance probability of local wind speed, which can be computed using a tornado hazard analysis code, is used to derive the probability density function, p(V). Then, we finally obtain the annual probability of tornado missile strike on a structure with the convolutional integration of product of QV(r) and p(V) over V. The evaluation method is applied to a simple problem to qualitatively confirm the validity, and to quantitatively verify the results for two extreme cases in which an object is located just in the vicinity of or far away from the structure

  1. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations, and on the central limit theorem for sums of dependent random variables

  2. Internal Medicine residents use heuristics to estimate disease probability

    OpenAIRE

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Background: Training in Bayesian reasoning may have limited impact on accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. Method: We randomized 55 In...

  3. Upper Bounds for Ruin Probability with Stochastic Investment Return

    Institute of Scientific and Technical Information of China (English)

    ZHANG Lihong

    2005-01-01

Risk models with stochastic investment return are widely used in practice, as well as in more challenging research fields. Risk theory is mainly concerned with ruin probability, and a tight bound for ruin probability is best for practical use. This paper presents a discrete-time risk model with stochastic investment return. Conditional expectation properties and martingale inequalities are used to obtain both exponential and non-exponential upper bounds for the ruin probability.
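    While the paper derives analytic upper bounds, the ruin probability of such a discrete-time model can also be estimated by simulation, which is a useful sanity check on any bound. All parameters below (claim and return distributions, premium, initial surplus) are hypothetical, not taken from the paper.

    ```python
    import random

    def ruin_probability(u0, premium, horizon, n_paths=20000, seed=1):
        """Monte Carlo estimate of the probability of ruin within `horizon`
        periods for a surplus process with stochastic investment return."""
        rng = random.Random(seed)
        ruined = 0
        for _ in range(n_paths):
            u = u0
            for _ in range(horizon):
                claim = rng.expovariate(1.0)          # mean-1 claim sizes
                growth = 1.0 + rng.gauss(0.03, 0.1)   # stochastic return
                u = u * growth + premium - claim
                if u < 0.0:
                    ruined += 1
                    break
        return ruined / n_paths

    p_small_surplus = ruin_probability(u0=0.5, premium=1.2, horizon=50)
    p_large_surplus = ruin_probability(u0=5.0, premium=1.2, horizon=50)
    ```

    A larger initial surplus should yield a smaller estimated ruin probability, and any valid analytic upper bound must sit above the corresponding Monte Carlo estimate.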

  4. Protein single-model quality assessment by feature-based probability density functions.

    Science.gov (United States)

    Cao, Renzhi; Cheng, Jianlin

    2016-04-04

Protein quality assessment (QA) has played an important role in protein structure prediction. We developed a novel single-model quality assessment method, Qprob. Qprob calculates the absolute error for each protein feature value against the true quality scores (i.e. GDT-TS scores) of protein structural models, and uses them to estimate its probability density distribution for quality assessment. Qprob has been blindly tested on the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11) as the MULTICOM-NOVEL server. The official CASP result shows that Qprob ranks as one of the top single-model QA methods. In addition, Qprob contributes to our protein tertiary structure predictor MULTICOM, which is officially ranked 3rd out of 143 predictors. The good performance shows that Qprob is good at assessing the quality of models of hard targets. These results demonstrate that this new probability density distribution based method is effective for protein single-model quality assessment and is useful for protein structure prediction. The webserver of Qprob is available at: http://calla.rnet.missouri.edu/qprob/. The software is now freely available in the web server of Qprob.

  5. Minimum Probability of Error-Based Equalization Algorithms for Fading Channels

    Directory of Open Access Journals (Sweden)

    Janos Levendovszky

    2007-06-01

    Full Text Available Novel channel equalizer algorithms are introduced for wireless communication systems to combat channel distortions resulting from multipath propagation. The novel algorithms are based on newly derived bounds on the probability of error (PE and guarantee better performance than the traditional zero forcing (ZF or minimum mean square error (MMSE algorithms. The new equalization methods require channel state information which is obtained by a fast adaptive channel identification algorithm. As a result, the combined convergence time needed for channel identification and PE minimization still remains smaller than the convergence time of traditional adaptive algorithms, yielding real-time equalization. The performance of the new algorithms is tested by extensive simulations on standard mobile channels.
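    For comparison, the MMSE baseline mentioned in the abstract can be sketched for a known FIR channel. The channel taps, tap count, and noise level below are invented for illustration; this is the classical linear MMSE solution, not the paper's PE-minimizing algorithm.

    ```python
    import numpy as np

    def mmse_equalizer(h, n_taps, noise_var):
        """Linear MMSE equalizer taps for a unit-power i.i.d. symbol stream
        sent through FIR channel h with additive white Gaussian noise."""
        L = len(h)
        # Convolution matrix: stacked, shifted copies of the channel response
        H = np.zeros((n_taps, n_taps + L - 1))
        for i in range(n_taps):
            H[i, i:i + L] = h
        delay = (n_taps + L - 1) // 2        # decision delay at the middle tap
        e = np.zeros(n_taps + L - 1)
        e[delay] = 1.0
        R = H @ H.T + noise_var * np.eye(n_taps)  # received autocorrelation
        w = np.linalg.solve(R, H @ e)             # Wiener solution
        return w, delay

    h = np.array([1.0, 0.5, 0.2])   # hypothetical channel impulse response
    w, delay = mmse_equalizer(h, 7, 0.01)
    combined = np.convolve(w, h)    # overall channel-plus-equalizer response
    ```

    A well-behaved equalizer concentrates the combined response at the chosen decision delay; the PE-based methods of the paper instead optimize the error probability directly rather than the mean square error.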

  6. Theoretical determination of gamma spectrometry systems efficiency based on probability functions. Application to self-attenuation correction factors

    Energy Technology Data Exchange (ETDEWEB)

    Barrera, Manuel, E-mail: manuel.barrera@uca.es [Escuela Superior de Ingeniería, University of Cadiz, Avda, Universidad de Cadiz 10, 11519 Puerto Real, Cadiz (Spain); Suarez-Llorens, Alfonso [Facultad de Ciencias, University of Cadiz, Avda, Rep. Saharaui s/n, 11510 Puerto Real, Cadiz (Spain); Casas-Ruiz, Melquiades; Alonso, José J.; Vidal, Juan [CEIMAR, University of Cadiz, Avda, Rep. Saharaui s/n, 11510 Puerto Real, Cádiz (Spain)

    2017-05-11

    A generic theoretical methodology for the calculation of the efficiency of gamma spectrometry systems is introduced in this work. The procedure is valid for any type of source and detector and can be applied to determine the full energy peak and the total efficiency of any source-detector system. The methodology is based on the idea of underlying probability of detection, which describes the physical model for the detection of the gamma radiation at the particular studied situation. This probability depends explicitly on the direction of the gamma radiation, allowing the use of this dependence the development of more realistic and complex models than the traditional models based on the point source integration. The probability function that has to be employed in practice must reproduce the relevant characteristics of the detection process occurring at the particular studied situation. Once the probability is defined, the efficiency calculations can be performed in general by using numerical methods. Monte Carlo integration procedure is especially useful to perform the calculations when complex probability functions are used. The methodology can be used for the direct determination of the efficiency and also for the calculation of corrections that require this determination of the efficiency, as it is the case of coincidence summing, geometric or self-attenuation corrections. In particular, we have applied the procedure to obtain some of the classical self-attenuation correction factors usually employed to correct for the sample attenuation of cylindrical geometry sources. The methodology clarifies the theoretical basis and approximations associated to each factor, by making explicit the probability which is generally hidden and implicit to each model. 
It has been shown that most of these self-attenuation correction factors can be derived by using a common underlying probability, with this probability growing in complexity as it reproduces more precisely

  7. Traffic simulation based ship collision probability modeling

    Energy Technology Data Exchange (ETDEWEB)

    Goerlandt, Floris, E-mail: floris.goerlandt@tkk.f [Aalto University, School of Science and Technology, Department of Applied Mechanics, Marine Technology, P.O. Box 15300, FI-00076 AALTO, Espoo (Finland); Kujala, Pentti [Aalto University, School of Science and Technology, Department of Applied Mechanics, Marine Technology, P.O. Box 15300, FI-00076 AALTO, Espoo (Finland)

    2011-01-15

Maritime traffic poses various risks in terms of human, environmental and economic loss. In a risk analysis of ship collisions, it is important to get a reasonable estimate for the probability of such accidents and the consequences they lead to. In this paper, a method is proposed to assess the probability of vessels colliding with each other. The method is capable of determining the expected number of accidents, the locations where and the time when they are most likely to occur, while providing input for models concerned with the expected consequences. At the basis of the collision detection algorithm lies an extensive time-domain micro-simulation of vessel traffic in the given area. The Monte Carlo simulation technique is applied to obtain a meaningful prediction of the relevant factors of the collision events. Data obtained through the Automatic Identification System is analyzed in detail to obtain realistic input data for the traffic simulation: traffic routes, the number of vessels on each route, the ship departure times, main dimensions and sailing speed. The results obtained by the proposed method for the studied case of the Gulf of Finland are presented, showing reasonable agreement with registered accident and near-miss data.

  8. Bearing Fault Classification Based on Conditional Random Field

    Directory of Open Access Journals (Sweden)

    Guofeng Wang

    2013-01-01

Full Text Available Condition monitoring of rolling element bearings is paramount for predicting the lifetime and performing effective maintenance of mechanical equipment. To overcome the drawbacks of the hidden Markov model (HMM) and improve diagnosis accuracy, a conditional random field (CRF) model based classifier is proposed. In this model, the feature vector sequences and the fault categories are linked by an undirected graphical model in which their relationship is represented by a global conditional probability distribution. In comparison with the HMM, the main advantage of the CRF model is that it can depict the temporal dynamic information between the observation sequences and state sequences without assuming the independence of the input feature vectors. Therefore, the interrelationship between adjacent observation vectors can also be depicted and integrated into the model, which makes the classifier more robust and accurate than the HMM. To evaluate the effectiveness of the proposed method, four kinds of bearing vibration signals, corresponding to normal, inner race pit, outer race pit and roller pit conditions respectively, are collected from the test rig. The CRF and HMM models are built respectively to perform fault classification by taking the sub-band energy features of wavelet packet decomposition (WPD) as the observation sequences. Moreover, the K-fold cross validation method is adopted to improve the evaluation accuracy of the classifier. The analysis and comparison under different fold times show that the accuracy rate of classification using the CRF model is higher than the HMM. This method sheds new light on the accurate classification of bearing faults.

  9. Left passage probability of Schramm-Loewner Evolution

    Science.gov (United States)

    Najafi, M. N.

    2013-06-01

    SLE(κ,ρ⃗) is a variant of Schramm-Loewner Evolution (SLE) which describes curves that are not conformally invariant, but are self-similar due to the presence of some other preferred points on the boundary. In this paper we study the left passage probability (LPP) of SLE(κ,ρ⃗) in a field-theoretical framework and find the differential equation governing this probability. This equation is solved numerically for the special case κ=2 and hρ=0, in which hρ is the conformal weight of the boundary condition changing (bcc) operator; this case may be related to the loop-erased random walk (LERW) and the Abelian sandpile model (ASM) with a sink on its boundary. For a curve that starts from ξ0 and is conditioned by a change of boundary conditions at x0, we find that this probability depends significantly on the factor x0-ξ0. We also present the perturbative general solution for large x0. As a prototype, we apply this formalism to SLE(κ,κ-6), which governs the curves that start from and end on the real axis.

  10. Measuring survival time: a probability-based approach useful in healthcare decision-making.

    Science.gov (United States)

    2011-01-01

    In some clinical situations, the choice between treatment options takes into account their impact on patient survival time. Due to practical constraints (such as loss to follow-up), survival time is usually estimated using a probability calculation based on data obtained in clinical studies or trials. The two techniques most commonly used to estimate survival times are the Kaplan-Meier method and the actuarial method. Despite their limitations, they provide useful information when choosing between treatment options.
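
    A minimal sketch of the Kaplan-Meier method named above. The follow-up times and event/censoring flags in the example are made-up illustration data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve. times: follow-up times; events:
    1 = event observed (e.g. death), 0 = censored (e.g. lost to
    follow-up). Returns (time, S(t)) pairs at each event time."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        group = [e for tt, e in data if tt == t]  # everyone leaving at t
        deaths = sum(group)
        if deaths:
            s *= 1 - deaths / at_risk
            curve.append((t, s))
        at_risk -= len(group)
        i += len(group)
    return curve
```

    For times [1, 2, 3, 4, 5] with events [1, 1, 0, 1, 0], survival steps down to 0.8, 0.6 and 0.3; the censored subjects at t = 3 and t = 5 shrink the risk set without moving the curve, which is exactly how the method copes with loss to follow-up.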

  11. Probability numeracy and health insurance purchase

    NARCIS (Netherlands)

    Dillingh, Rik; Kooreman, Peter; Potters, Jan

    2016-01-01

    This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that, for two out of three measures of health insurance, expenditure first rises with probability numeracy and then falls again.

  12. Estimation of post-test probabilities by residents: Bayesian reasoning versus heuristics?

    Science.gov (United States)

    Hall, Stacey; Phang, Sen Han; Schaefer, Jeffrey P; Ghali, William; Wright, Bruce; McLaughlin, Kevin

    2014-08-01

    Although the process of diagnosing invariably begins with a heuristic, we encourage our learners to support their diagnoses by analytical cognitive processes, such as Bayesian reasoning, in an attempt to mitigate the effects of heuristics on diagnosing. There are, however, limited data on the use, and impact, of Bayesian reasoning on the accuracy of disease probability estimates. In this study our objective was to explore whether Internal Medicine residents use a Bayesian process to estimate disease probabilities, by comparing their disease probability estimates to literature-derived Bayesian post-test probabilities. We gave 35 Internal Medicine residents four clinical vignettes in the form of a referral letter and asked them to estimate the post-test probability of the target condition in each case. We then compared these to literature-derived probabilities. For each vignette the estimated probability was significantly different from the literature-derived probability. For the two cases with low literature-derived probability our participants significantly overestimated the probability of these target conditions being the correct diagnosis, whereas for the two cases with high literature-derived probability the estimated probability was significantly lower than the calculated value. Our results suggest that residents generate inaccurate post-test probability estimates. Possible explanations for this include ineffective application of Bayesian reasoning, attribute substitution whereby a complex cognitive task is replaced by an easier one (e.g., a heuristic), or systematic rater bias, such as central tendency bias. Further studies are needed to identify the reasons for the inaccuracy of disease probability estimates and to explore ways of improving accuracy.
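
    The Bayesian calculation the residents were being compared against can be written down directly in odds form. The 20% pre-test probability and likelihood ratio of 4 in the example are invented numbers for illustration, not values from the study's vignettes:

```python
def post_test_probability(pre_test_p, likelihood_ratio):
    """Bayes' theorem in odds form:
    post-test odds = pre-test odds x likelihood ratio of the test result."""
    pre_odds = pre_test_p / (1.0 - pre_test_p)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# a 20% pre-test probability and a positive likelihood ratio of 4 give
# post-test odds of 0.25 x 4 = 1, i.e. a 50% post-test probability
p = post_test_probability(0.20, 4.0)
```

    A likelihood ratio of 1 leaves the probability unchanged, and ratios below 1 lower it, which is the behaviour the vignette comparisons probe.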

  13. USING RASCH ANALYSIS TO EXPLORE WHAT STUDENTS LEARN ABOUT PROBABILITY CONCEPTS

    Directory of Open Access Journals (Sweden)

    Zamalia Mahmud

    2015-01-01

    Full Text Available Students’ understanding of probability concepts has been investigated from various perspectives. This study was set out to investigate the perceived understanding of probability concepts of forty-four students from the STAT131 Understanding Uncertainty and Variation course at the University of Wollongong, NSW. Rasch measurement, which is based on a probabilistic model, was used to identify concepts that students find easy, moderately difficult and difficult to understand. Data were captured from the e-learning Moodle platform, where students provided their responses through an on-line quiz. As illustrated in the Rasch map, 96% of the students could understand sample space, simple events, mutually exclusive events and tree diagrams, while 67% of the students found the concepts of conditional and independent events rather easy to understand.
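
    The probabilistic model behind Rasch measurement assigns each response a probability from the gap between person ability and item difficulty; a minimal sketch:

```python
import math

def rasch_probability(ability, difficulty):
    """Dichotomous Rasch model:
    P(correct) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))
```

    Easy concepts (low difficulty, e.g. sample space in this study) sit low on the Rasch map, so most students answer them correctly; harder concepts such as conditional events sit higher and yield lower response probabilities for the same ability.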

  15. Effectiveness of Telephone-Based Health Coaching for Patients with Chronic Conditions: A Randomised Controlled Trial.

    Directory of Open Access Journals (Sweden)

    Martin Härter

    Full Text Available Chronic diseases, like diabetes mellitus, heart disease and cancer, are leading causes of death and disability. These conditions are at least partially preventable or modifiable, e.g. by enhancing patients' self-management. We aimed to examine the effectiveness of telephone-based health coaching (TBHC) in chronically ill patients. This prospective, pragmatic randomized controlled trial compares an intervention group (IG) of participants in TBHC to a control group (CG) without TBHC. Endpoints were assessed two years after enrolment. Three different groups of insurees were targeted: (1) multiple chronic conditions (chronic campaign), (2) heart failure (heart failure campaign), or (3) chronic mental illness (mental health campaign). The telephone coaching included evidence-based information and was based on the concepts of motivational interviewing, shared decision-making, and collaborative goal setting. Patients received an average of 12.9 calls. The primary outcome was time from enrolment until hospital readmission within a two-year follow-up period. Secondary outcomes comprised the probability of hospital readmission, number of daily defined medication doses (DDD), frequency and duration of inability to work, and mortality within two years. All outcomes were collected from routine data provided by the statutory health insurance. As informed consent was obtained after randomization, propensity score matching (PSM) was used to minimize the selection bias introduced by decliners. For the analysis of hospital readmission and mortality, we calculated Kaplan-Meier curves and estimated hazard ratios (HR). The probability of hospital readmission and the probability of death were analysed by calculating odds ratios (OR). Quantity of health service use and inability to work were analysed by linear random effects regression models. PSM resulted in patient samples of 5,309 (IG: 2,713; CG: 2,596) in the chronic campaign, of 660 (IG: 338; CG: 322) in the heart failure campaign, and of

  16. Targeting the probability versus cost of feared outcomes in public speaking anxiety.

    Science.gov (United States)

    Nelson, Elizabeth A; Deacon, Brett J; Lickel, James J; Sy, Jennifer T

    2010-04-01

    Cognitive-behavioral theory suggests that social phobia is maintained, in part, by overestimates of the probability and cost of negative social events. Indeed, empirically supported cognitive-behavioral treatments directly target these cognitive biases through the use of in vivo exposure or behavioral experiments. While cognitive-behavioral theories and treatment protocols emphasize the importance of targeting probability and cost biases in the reduction of social anxiety, few studies have examined specific techniques for reducing probability and cost bias, and thus the relative efficacy of exposure to the probability versus cost of negative social events is unknown. In the present study, 37 undergraduates with high public speaking anxiety were randomly assigned to a single-session intervention designed to reduce either the perceived probability or the perceived cost of negative outcomes associated with public speaking. Compared to participants in the probability treatment condition, those in the cost treatment condition demonstrated significantly greater improvement on measures of public speaking anxiety and cost estimates for negative social events. The superior efficacy of the cost treatment condition was mediated by greater treatment-related changes in social cost estimates. The clinical implications of these findings are discussed. Published by Elsevier Ltd.

  17. The transition probability and the probability for the left-most particle's position of the q-totally asymmetric zero range process

    Energy Technology Data Exchange (ETDEWEB)

    Korhonen, Marko [Department of Mathematics and Statistics, University of Helsinki, FIN-00014 (Finland); Lee, Eunghyun [Centre de Recherches Mathématiques (CRM), Université de Montréal, Quebec H3C 3J7 (Canada)

    2014-01-15

    We treat the N-particle zero range process whose jumping rates satisfy a certain condition. This condition is required to use the Bethe ansatz, and the resulting model is the q-boson model by Sasamoto and Wadati [“Exact results for one-dimensional totally asymmetric diffusion models,” J. Phys. A 31, 6057–6071 (1998)] or the q-totally asymmetric zero range process (TAZRP) by Borodin and Corwin [“Macdonald processes,” Probab. Theory Relat. Fields (to be published)]. We find the explicit formula for the transition probability of the q-TAZRP via the Bethe ansatz. Using the transition probability, we find the probability distribution of the left-most particle's position at time t. To do so, we find a new identity corresponding to the identity for the asymmetric simple exclusion process found by Tracy and Widom [“Integral formulas for the asymmetric simple exclusion process,” Commun. Math. Phys. 279, 815–844 (2008)]. For the initial state in which all particles occupy a single site, the probability distribution of the left-most particle's position at time t is represented by the contour integral of a determinant.

  18. Failure probability estimate of type 304 stainless steel piping

    International Nuclear Information System (INIS)

    Daugherty, W.L.; Awadalla, N.G.; Sindelar, R.L.; Mehta, H.S.; Ranganath, S.

    1989-01-01

    The primary source of in-service degradation of the SRS production reactor process water piping is intergranular stress corrosion cracking (IGSCC). IGSCC has occurred in a limited number of weld heat affected zones, areas known to be susceptible to IGSCC. A model has been developed to combine crack growth rates, crack size distributions, in-service examination reliability estimates and other considerations to estimate the pipe large-break frequency. This frequency estimates the probability that an IGSCC crack will initiate, escape detection by ultrasonic (UT) examination, and grow to instability prior to extending through-wall and being detected by the sensitive leak detection system. These events are combined as the product of four factors: (1) the probability that a given weld heat affected zone contains IGSCC; (2) the conditional probability, given the presence of IGSCC, that the cracking will escape detection during UT examination; (3) the conditional probability, given a crack escapes detection by UT, that it will not grow through-wall and be detected by leakage; (4) the conditional probability, given a crack is not detected by leakage, that it grows to instability prior to the next UT exam. These four factors estimate the occurrence of several conditions that must coexist in order for a crack to lead to a large break of the process water piping. When evaluated for the SRS production reactors, they produce an extremely low break frequency. The objective of this paper is to present the assumptions, methodology, results and conclusions of a probabilistic evaluation for the direct failure of the primary coolant piping resulting from normal operation and seismic loads. This evaluation was performed to support the ongoing PRA effort and to complement deterministic analyses addressing the credibility of a double-ended guillotine break.

  19. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of {\\em probabilistic} OLAP queries that operate on aggregate values that are probability distributions...... and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate...... multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions)....

  20. Future southcentral US wildfire probability due to climate change

    Science.gov (United States)

    Stambaugh, Michael C.; Guyette, Richard P.; Stroh, Esther D.; Struckhoff, Matthew A.; Whittier, Joanna B.

    2018-01-01

    Globally, changing fire regimes due to climate is one of the greatest threats to ecosystems and society. In this paper, we present projections of future fire probability for the southcentral USA using downscaled climate projections and the Physical Chemistry Fire Frequency Model (PC2FM). Future fire probability is projected to both increase and decrease across the study region of Oklahoma, New Mexico, and Texas. Among all end-of-century projections, changes in fire probability (CFPs) range from −51 to +240%. The greatest absolute increases in fire probability are shown for areas within the range of approximately 75 to 160 cm mean annual precipitation (MAP), regardless of climate model. Although fire is likely to become more frequent across the southcentral USA, spatial patterns may remain similar unless significant increases in precipitation occur, whereby more extensive areas with increased fire probability are predicted. Perhaps one of the most important results is the identification of climate changes at which the direction of the fire probability response (+, −) may shift (i.e., tipping points). Fire regimes of southcentral US ecosystems occur in a geographic transition zone from reactant- to reaction-limited conditions, potentially making them uniquely responsive to different scenarios of temperature and precipitation changes. Identification and description of these conditions may help anticipate fire regime changes that will affect human health, agriculture, species conservation, and nutrient and water cycling.

  1. ELIPGRID-PC: A PC program for calculating hot spot probabilities

    International Nuclear Information System (INIS)

    Davidson, J.R.

    1994-10-01

    ELIPGRID-PC, a new personal computer program, has been developed to provide easy access to Singer's 1972 ELIPGRID algorithm for hot-spot detection probabilities. Three features of the program are the ability to determine: (1) the grid size required for specified conditions, (2) the smallest hot spot that can be sampled with a given probability, and (3) the approximate grid size resulting from specified conditions and sampling cost. ELIPGRID-PC also provides probability-of-hit versus cost data for graphing with spreadsheets or graphics software. The program has been successfully tested using Singer's published ELIPGRID results. An apparent error in the original ELIPGRID code has been uncovered and an appropriate modification incorporated into the new program.
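
    The underlying geometry can be checked with a quick Monte Carlo stand-in. This is not Singer's ELIPGRID algorithm, which handles elliptical targets and grid orientation; it only illustrates the point-sample case for a circular hot spot of a given radius on a square grid:

```python
import math
import random

def hit_probability(radius, spacing, n_trials=20000, seed=7):
    """Monte Carlo probability that a square sampling grid (given node
    spacing) lands at least one node inside a circular hot spot, for a
    hot-spot centre uniform over one grid cell. For radius <= spacing/2
    the exact answer is pi * radius**2 / spacing**2."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        cx, cy = rng.uniform(0, spacing), rng.uniform(0, spacing)
        # nearest grid node to the hot-spot centre, per axis
        nx = round(cx / spacing) * spacing
        ny = round(cy / spacing) * spacing
        if math.hypot(cx - nx, cy - ny) <= radius:
            hits += 1
    return hits / n_trials
```

    Halving the grid spacing quadruples the hit probability for small hot spots, which is the grid-size-versus-cost trade-off the program tabulates.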

  2. Quantum probability and quantum decision-making.

    Science.gov (United States)

    Yukalov, V I; Sornette, D

    2016-01-13

    A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary. © 2015 The Author(s).

  3. Quantifying seining detection probability for fishes of Great Plains sand‐bed rivers

    Science.gov (United States)

    Mollenhauer, Robert; Logue, Daniel R.; Brewer, Shannon K.

    2018-01-01

    Species detection error (i.e., imperfect and variable detection probability) is an essential consideration when investigators map distributions and interpret habitat associations. When fish detection error that is due to highly variable instream environments needs to be addressed, sand‐bed streams of the Great Plains represent a unique challenge. We quantified seining detection probability for diminutive Great Plains fishes across a range of sampling conditions in two sand‐bed rivers in Oklahoma. Imperfect detection resulted in underestimates of species occurrence when naïve estimates were used, particularly for less common fishes. Seining detection probability also varied among fishes and across sampling conditions. We observed a quadratic relationship between water depth and detection probability, in which the exact nature of the relationship was species‐specific and dependent on water clarity. Similarly, the direction of the relationship between water clarity and detection probability was species‐specific and dependent on differences in water depth. The relationship between water temperature and detection probability was also species dependent, where both the magnitude and direction of the relationship varied among fishes. We showed how ignoring detection error confounded an underlying relationship between species occurrence and water depth. Despite imperfect and heterogeneous detection, our results indicate that species absence can be confidently determined with two to six spatially replicated seine hauls per 200‐m reach under average sampling conditions; however, the required effort would be higher under certain conditions. Detection probability was low for the Arkansas River Shiner Notropis girardi, which is federally listed as threatened, and more than 10 seine hauls per 200‐m reach would be required to assess presence across sampling conditions. Our model allows scientists to estimate the sampling effort needed to confidently assess species occurrence, which
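
    The replicate-effort numbers quoted above follow from the standard closure of a per-haul detection probability p over n independent hauls, 1 − (1 − p)^n. A sketch, where the 0.4 and 0.1 per-haul values are illustrative rather than the paper's estimates:

```python
import math

def hauls_needed(p_detect, confidence=0.95):
    """Smallest n with 1 - (1 - p)^n >= confidence: the number of seine
    hauls required to detect a species that is present with the stated
    confidence, assuming independent hauls."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_detect))
```

    With p = 0.4 per haul, six hauls give 95% confidence, consistent with the two-to-six range above; a poorly detected species with p = 0.1 needs 29, echoing the high effort required for the Arkansas River Shiner.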

  4. Probability analysis of MCO over-pressurization during staging

    International Nuclear Information System (INIS)

    Pajunen, A.L.

    1997-01-01

    The purpose of this calculation is to determine the probability of Multi-Canister Overpacks (MCOs) over-pressurizing during staging at the Canister Storage Building (CSB). Pressurization of an MCO during staging is dependent upon changes to the MCO gas temperature and the build-up of reaction products during the staging period. These effects are predominantly limited by the amount of water that remains in the MCO following cold vacuum drying that is available for reaction during staging conditions. Because of the potential for increased pressure within an MCO, provisions for a filtered pressure relief valve and rupture disk have been incorporated into the MCO design. This calculation provides an estimate of the frequency that an MCO will contain enough water to pressurize beyond the limits of these design features. The results of this calculation will be used in support of further safety analyses and operational planning efforts. Under the bounding steady state CSB condition assumed for this analysis, an MCO must contain less than 1.6 kg (3.7 lbm) of water available for reaction to preclude actuation of the pressure relief valve at 100 psid. To preclude actuation of the MCO rupture disk at 150 psid, an MCO must contain less than 2.5 kg (5.5 lbm) of water available for reaction. These limits are based on the assumption that hydrogen generated by uranium-water reactions is the sole source of gas produced within the MCO and that hydrates in fuel particulate are the primary source of water available for reactions during staging conditions. The results of this analysis conclude that the probability of the hydrate water content of an MCO exceeding 1.6 kg is 0.08 and the probability that it will exceed 2.5 kg is 0.01. This implies that approximately 32 of 400 staged MCOs may experience pressurization to the point where the pressure relief valve actuates. In the event that an MCO pressure relief valve fails to open, the probability is 1 in 100 that the MCO would experience
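
    The abstract's arithmetic can be reproduced directly from its two reported probabilities:

```python
def expected_pressurizations(n_mcos, p_relief=0.08, p_rupture=0.01):
    """Expected numbers of staged MCOs exceeding the relief-valve
    (1.6 kg) and rupture-disk (2.5 kg) water limits, using the
    probabilities reported in the abstract."""
    return n_mcos * p_relief, n_mcos * p_rupture

# 400 staged MCOs: about 32 may actuate the relief valve, 4 the
# rupture-disk limit
relief, rupture = expected_pressurizations(400)
```

    This matches the abstract's "approximately 32 of 400 staged MCOs" figure for relief-valve actuation.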

  5. A Trust-Based Adaptive Probability Marking and Storage Traceback Scheme for WSNs

    Science.gov (United States)

    Liu, Anfeng; Liu, Xiao; Long, Jun

    2016-01-01

    Security is a pivotal issue for wireless sensor networks (WSNs), which are emerging as a promising platform that enables a wide range of military, scientific, industrial and commercial applications. Traceback, a key cyber-forensics technology, can play an important role in tracing and locating a malicious source to guarantee cybersecurity. In this work a trust-based adaptive probability marking and storage (TAPMS) traceback scheme is proposed to enhance security for WSNs. In a TAPMS scheme, the marking probability is adaptively adjusted according to the security requirements of the network and can substantially reduce the number of marking tuples and improve network lifetime. More importantly, a high trust node is selected to store marking tuples, which can avoid the problem of marking information being lost. Experimental results show that the total number of marking tuples can be reduced in a TAPMS scheme, thus improving network lifetime. At the same time, since the marking tuples are stored in high trust nodes, storage reliability can be guaranteed, and the traceback time can be reduced by more than 80%. PMID:27043566

  6. Estimating the Probability of Vegetation to Be Groundwater Dependent Based on the Evaluation of Tree Models

    Directory of Open Access Journals (Sweden)

    Isabel C. Pérez Hoyos

    2016-04-01

    Full Text Available Groundwater Dependent Ecosystems (GDEs) are increasingly threatened by humans’ rising demand for water resources. Consequently, it is imperative to identify the location of GDEs in order to protect them. This paper develops a methodology to identify the probability of an ecosystem being groundwater dependent. Probabilities are obtained by modeling the relationship between the known locations of GDEs and factors influencing groundwater dependence, namely water table depth and climatic aridity index. Probabilities are derived for the state of Nevada, USA, using modeled water table depth and aridity index values obtained from the Global Aridity database. The selected model results from a performance comparison of classification trees (CT) and random forests (RF). Based on a threshold-independent accuracy measure, RF has a better ability to generate probability estimates. Considering a threshold that minimizes the misclassification rate for each model, RF also proves to be more accurate. Regarding training accuracy, performance measures such as accuracy, sensitivity, and specificity are higher for RF. For the test set, higher values of accuracy and kappa for CT highlight the fact that these measures are greatly affected by low prevalence. As shown for RF, the choice of the cutoff probability value has important consequences for model accuracy and the overall proportion of locations where GDEs are found.
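
    The abstract's point that the cutoff probability drives accuracy can be illustrated with a tiny threshold search over predicted probabilities. The probabilities and labels here are invented, not the Nevada data:

```python
def best_cutoff(probs, labels):
    """Return the cutoff in (0, 1) minimising the misclassification
    rate for predicted probabilities against binary labels, scanning a
    simple 0.01-step grid (ties go to the lowest such cutoff)."""
    grid = [i / 100 for i in range(1, 100)]

    def error(c):
        wrong = sum((p >= c) != bool(y) for p, y in zip(probs, labels))
        return wrong / len(labels)

    return min(grid, key=error)

# any cutoff in (0.3, 0.7] separates these toy predictions perfectly
c = best_cutoff([0.9, 0.8, 0.7, 0.3, 0.2, 0.1], [1, 1, 1, 0, 0, 0])
```

    With low prevalence, the error-minimising cutoff can sit far from 0.5, which is why the paper stresses the cutoff choice when mapping where GDEs are found.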

  7. Probability-Based Recognition Framework for Underwater Landmarks Using Sonar Images †.

    Science.gov (United States)

    Lee, Yeongjun; Choi, Jinwoo; Ko, Nak Yong; Choi, Hyun-Taek

    2017-08-24

    This paper proposes a probability-based framework for recognizing underwater landmarks using sonar images. Current recognition methods use a single image, which does not provide reliable results because of weaknesses of the sonar image, such as an unstable acoustic source, speckle noise, low resolution, and single-channel imagery. However, using consecutive sonar images, if the status (i.e., the existence and identity, or name) of an object is continuously evaluated by a stochastic method, the result of the recognition method can be used to calculate the uncertainty, making it more suitable for various applications. Our proposed framework consists of three steps: (1) candidate selection, (2) continuity evaluation, and (3) Bayesian feature estimation. Two probability methods, particle filtering and Bayesian feature estimation, are used to repeatedly estimate the continuity and features of objects in consecutive images. Thus, the status of the object is repeatedly predicted and updated by a stochastic method. Furthermore, we develop an artificial landmark to increase detectability by an imaging sonar, designed around the characteristics of acoustic waves, such as their instability and reflection depending on the roughness of the reflector surface. The proposed method is verified by basin experiments, and the results are presented.
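
    The repeated predict-and-update idea can be sketched as a recursive Bayesian filter on the existence of a landmark across consecutive frames. The detection and false-alarm rates below are invented, and the paper's full method additionally tracks identity with particle filtering:

```python
def update_existence(prior, detected, p_detect=0.8, p_false_alarm=0.1):
    """One Bayesian update of P(object exists) from a single sonar frame.
    p_detect = P(detection | object exists); p_false_alarm =
    P(detection | object absent). Illustrative numbers only."""
    if detected:
        num = p_detect * prior
        den = num + p_false_alarm * (1.0 - prior)
    else:
        num = (1.0 - p_detect) * prior
        den = num + (1.0 - p_false_alarm) * (1.0 - prior)
    return num / den

# belief about existence accumulates over consecutive noisy frames
belief = 0.5
for obs in [True, True, False, True]:
    belief = update_existence(belief, obs)
```

    A single missed detection lowers the belief but does not reset it, which is the robustness over single-image methods that the framework targets.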

  8. Evaluation of gene importance in microarray data based upon probability of selection

    Directory of Open Access Journals (Sweden)

    Fu Li M

    2005-03-01

    Full Text Available Abstract Background: Microarray devices permit a genome-scale evaluation of gene function. This technology has catalyzed biomedical research and development in recent years. As many important diseases can be traced down to the gene level, a long-standing research problem is to identify specific gene expression patterns linking to metabolic characteristics that contribute to disease development and progression. The microarray approach offers an expedited solution to this problem. However, it remains a challenging issue to recognize disease-related gene expression patterns embedded in the microarray data. In selecting a small set of biologically significant genes for classifier design, the high data dimensionality inherent in this problem creates a substantial amount of uncertainty. Results: Here we present a model for probability analysis of selected genes in order to determine their importance. Our contribution is that we show how to derive the P value of each selected gene in multiple gene selection trials based on different combinations of data samples and how to conduct a reliability analysis accordingly. The importance of a gene is indicated by its associated P value in that a smaller value implies higher information content from an information-theoretic standpoint. On microarray data concerning the subtype classification of small round blue cell tumors, we demonstrate that the method is capable of finding the smallest set of genes (19 genes) with optimal classification performance, compared with results reported in the literature. Conclusion: In classifier design based on microarray data, the probability value derived from gene selection based on multiple combinations of data samples enables an effective mechanism for reducing the tendency of fitting local data particularities.
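
    The P value construction described can be sketched with a binomial null hypothesis: if chance alone would select a gene at rate q in each trial, the probability of seeing it selected at least k times in n trials is a binomial tail. The rate q and the trial counts below are illustrative assumptions, not the paper's exact derivation:

```python
from math import comb

def selection_p_value(k, n, q):
    """P(gene selected >= k times in n independent selection trials |
    chance selection rate q). Smaller values indicate higher information
    content: the gene is selected far more often than chance predicts."""
    return sum(comb(n, i) * q**i * (1 - q)**(n - i) for i in range(k, n + 1))
```

    A gene selected in all 10 of 10 trials under a 10% chance rate gets P = 1e-10, while a gene selected once or twice is indistinguishable from noise.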

  9. Analytical Model for the Probability Characteristics of a Crack Penetrating Capsules in Capsule-Based Self-Healing Cementitious Materials

    Directory of Open Access Journals (Sweden)

    Zhong LV

    2017-08-01

    Full Text Available Autonomous crack healing using pre-embedded capsules containing a healing agent is becoming a promising approach to restore the strength of damaged structures. In addition to the material properties, the size and volume fraction of capsules influence crack healing in the matrix. Understanding the crack-capsule interaction is critical in the development and design of structures made of capsule-based self-healing materials. Continuing our previous study, this contribution presents a more practical rupturing mode, the penetrating mode, which characterizes the manner in which capsules are fractured by cracks in cementitious materials. Under the assumption that a crack penetrating a capsule leads to crack healing, geometrical probability theory is employed to develop the quantitative relationship between crack size, capsule size, and capsule concentration in a virtual capsule-based self-healing cementitious material. Moreover, an analytical expression for the probability of a crack penetrating randomly dispersed capsules is developed in a two-dimensional matrix setup. The influence of the induced rupturing modes of the embedded capsules on the self-healing efficiency is analyzed. Much attention is paid to comparing the penetrating probability and the hitting probability, in order to help the designer choose the optimal rupturing mode of the embedded capsules. The accuracy of the theoretical model is also compared with a Monte Carlo numerical analysis of a crack interacting with capsules. The developed probability characteristics of a crack interacting with capsules under different rupturing modes provide guidelines for designers working with capsule-based self-healing cementitious materials. DOI: http://dx.doi.org/10.5755/j01.ms.23.3.16888
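
    A two-dimensional toy version of the Monte Carlo check mentioned above: circular capsules are scattered in a box, and we estimate the probability that a straight crack segment intersects at least one of them. The geometry, sizes and densities are placeholders, not the paper's parameters:

```python
import math
import random

def penetration_probability(crack_len, cap_radius, n_caps, box=100.0,
                            n_trials=3000, seed=3):
    """Monte Carlo estimate of P(a straight crack hits >= 1 capsule).
    Capsules are random circles in a box; the crack is a horizontal
    segment starting at the box centre."""
    rng = random.Random(seed)
    x0 = y0 = box / 2
    hits = 0
    for _ in range(n_trials):
        hit = False
        for _ in range(n_caps):
            cx, cy = rng.uniform(0, box), rng.uniform(0, box)
            # closest point of segment [(x0,y0), (x0+crack_len,y0)] to centre
            px = min(max(cx, x0), x0 + crack_len)
            if math.hypot(cx - px, cy - y0) <= cap_radius:
                hit = True
                break
        hits += hit
    return hits / n_trials
```

    Analytically, the hit region for one capsule is a stadium of area 2·r·L + πr² around the segment, so the estimate should approach 1 − (1 − A/box²)^n, which is the kind of closed form the paper derives with geometrical probability.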

  10. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  11. Passage and survival probabilities of juvenile Chinook salmon at Cougar Dam, Oregon, 2012

    Science.gov (United States)

    Beeman, John W.; Evans, Scott D.; Haner, Philip V.; Hansel, Hal C.; Hansen, Amy C.; Smith, Collin D.; Sprando, Jamie M.

    2014-01-01

    This report describes studies of juvenile-salmon dam passage and apparent survival at Cougar Dam, Oregon, during two operating conditions in 2012. Cougar Dam is a 158-meter-tall rock-fill dam used primarily for flood control, and passes water through a temperature control tower to either a powerhouse penstock or to a regulating outlet (RO). The temperature control tower has moveable weir gates to enable water of different elevations and temperatures to be drawn through the dam to control water temperatures downstream. A series of studies of downstream dam passage of juvenile salmonids was begun after the National Oceanic and Atmospheric Administration determined that Cougar Dam was impacting the viability of anadromous fish stocks. The primary objectives of the studies described in this report were to estimate the route-specific fish passage probabilities at the dam and to estimate the survival probabilities of fish passing through the RO. The first set of dam operating conditions, studied in November, consisted of (1) a mean reservoir elevation of 1,589 feet, (2) water entering the temperature control tower through the weir gates, (3) most water routed through the turbines during the day and through the RO during the night, and (4) mean RO gate openings of 1.2 feet during the day and 3.2 feet during the night. The second set of dam operating conditions, studied in December, consisted of (1) a mean reservoir elevation of 1,507 feet, (2) water entering the temperature control tower through the RO bypass, (3) all water passing through the RO, and (4) mean RO gate openings of 7.3 feet during the day and 7.5 feet during the night. The studies were based on juvenile Chinook salmon (Oncorhynchus tshawytscha) surgically implanted with radio transmitters and passive integrated transponder (PIT) tags. Inferences about general dam passage percentage and timing of volitional migrants were based on surface-acclimated fish released in the reservoir. Dam passage and apparent

  12. What Are Probability Surveys used by the National Aquatic Resource Surveys?

    Science.gov (United States)

    The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample surveys or statistical surveys), sampling sites are selected randomly.

  13. Condition Monitoring for Roller Bearings of Wind Turbines Based on Health Evaluation under Variable Operating States

    Directory of Open Access Journals (Sweden)

    Lei Fu

    2017-10-01

    Full Text Available Condition monitoring (CM) is used to assess the health status of wind turbines (WT) by detecting turbine failure and predicting maintenance needs. However, fluctuating operating conditions cause variations in the monitored features and therefore increase the difficulty of CM; for example, frequency-domain analysis may lead to an inaccurate or even incorrect prediction when evaluating the health of WT components. In light of this challenge, this paper proposes a method for the health evaluation of WT components based on vibration signals, aimed at reducing the evaluation error caused by variable operating conditions. First, the vibration signal is decomposed into a set of sub-signals using variational mode decomposition (VMD). Next, the sub-signal energies and their probability distribution are obtained and normalized. Finally, the concept of entropy is introduced to evaluate the health condition of the monitored object and provide an effective guide for maintenance. In particular, the health evaluation is based on performance over a range of operating conditions, rather than at a single operating condition. Experimental investigations, including a comparison with a previous method, verified the efficiency of the evaluation method.
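The entropy-based evaluation step can be sketched as follows, assuming the VMD sub-signals have already been computed upstream (VMD itself is not implemented here); the function name and the normalization to [0, 1] are illustrative choices, not the authors' code.

```python
import math

def energy_entropy(sub_signals):
    """Normalized Shannon entropy of sub-signal energies, used as a
    health indicator; the VMD decomposition producing sub_signals is
    assumed to have been done upstream."""
    energies = [sum(x * x for x in s) for s in sub_signals]
    total = sum(energies)
    probs = [e / total for e in energies]          # energy probability distribution
    h = -sum(p * math.log(p) for p in probs if p > 0.0)
    return h / math.log(len(sub_signals))          # scale to [0, 1]
```

A value near 1 indicates energy spread evenly across the modes; a value near 0 indicates energy concentrated in a single mode, which can flag an abnormal condition.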

  14. New normative standards of conditional reasoning and the dual-source model.

    Science.gov (United States)

    Singmann, Henrik; Klauer, Karl Christoph; Over, David

    2014-01-01

    There has been a major shift in research on human reasoning toward Bayesian and probabilistic approaches, which has been called a new paradigm. The new paradigm sees most everyday and scientific reasoning as taking place in a context of uncertainty, in which inference proceeds from uncertain beliefs and not from arbitrary assumptions. In this manuscript we present an empirical test of normative standards in the new paradigm using a novel probabilized conditional reasoning task. Our results indicated that, for everyday conditionals with at least a weak causal connection between antecedent and consequent, only the conditional probability of the consequent given the antecedent contributed unique variance to predicting the probability of the conditional; neither the probability of the conjunction nor the probability of the material conditional did. Regarding normative accounts of reasoning, we found significant evidence that participants' responses were confidence preserving (i.e., p-valid in the sense of Adams, 1998) for MP inferences, but not for MT inferences. Additionally, only for MP inferences, and to a lesser degree for DA inferences, did the rate of responses inside the coherence intervals defined by mental probability logic (Pfeifer and Kleiter, 2005, 2010) exceed chance levels. In contrast to the normative accounts, the dual-source model (Klauer et al., 2010) is a descriptive model. It posits that participants integrate their background knowledge (i.e., the type of information primary to the normative approaches) with their subjective probability that a conclusion is warranted based on its logical form. Model fits showed that the dual-source model, which employed participants' responses to a deductive task with abstract contents to estimate the form-based component, provided as good an account of the data as a model that solely used data from the probabilized conditional reasoning task.
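As a concrete instance of the coherence intervals mentioned above: for MP, given P(A) and P(B|A), the coherent values of P(B) form an interval. A minimal sketch (the function name is ours, not the authors'):

```python
def mp_coherence_interval(p_if, p_antecedent):
    """Coherence interval for the modus ponens conclusion P(B), given
    P(B|A) = p_if and P(A) = p_antecedent (standard result of mental
    probability logic)."""
    lo = p_if * p_antecedent          # lower bound: P(A)P(B|A)
    hi = lo + (1.0 - p_antecedent)    # upper bound adds the mass outside A
    return lo, hi
```

With P(A) = 0.8 and P(B|A) = 0.9, any response for P(B) between 0.72 and 0.92 is coherent; responses outside that interval violate probabilistic coherence.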

  15. New Normative Standards of Conditional Reasoning and the Dual-Source Model

    Directory of Open Access Journals (Sweden)

    Henrik eSingmann

    2014-04-01

    Full Text Available There has been a major shift in research on human reasoning towards Bayesian and probabilistic approaches, which has been called a new paradigm. The new paradigm sees most everyday and scientific reasoning as taking place in a context of uncertainty, in which inference proceeds from uncertain beliefs and not from arbitrary assumptions. In this manuscript we present an empirical test of normative standards in the new paradigm using a novel probabilized conditional reasoning task. Our results indicated that, for everyday conditionals with at least a weak causal connection between antecedent and consequent, only the conditional probability of the consequent given the antecedent contributed unique variance to predicting the probability of the conditional; neither the probability of the conjunction nor the probability of the material conditional did. Regarding normative accounts of reasoning, we found significant evidence that participants' responses were confidence preserving (i.e., p-valid in the sense of Adams, 1998) for MP inferences, but not for MT inferences. Additionally, only for MP inferences, and to a lesser degree for DA inferences, did the rate of responses inside the coherence intervals defined by mental probability logic (Pfeifer & Kleiter, 2005, 2010) exceed chance levels. In contrast to the normative accounts, the dual-source model (Klauer, Beller, & Hütter, 2010) is a descriptive model. It posits that participants integrate their background knowledge (i.e., the type of information primary to the normative approaches) with their subjective probability that a conclusion is warranted based on its logical form. Model fits showed that the dual-source model, which employed participants' responses to a deductive task with abstract contents to estimate the form-based component, provided as good an account of the data as a model that solely used data from the probabilized conditional reasoning task.

  16. Learning difficulties of senior high school students based on probability understanding levels

    Science.gov (United States)

    Anggara, B.; Priatna, N.; Juandi, D.

    2018-05-01

    Identifying students' difficulties in learning the concept of probability is important for teachers in preparing appropriate learning processes and overcoming obstacles that may arise in subsequent lessons. This study revealed the levels of students' understanding of the concept of probability and identified their difficulties as part of identifying the epistemological obstacles to the concept of probability. The study employed a qualitative, descriptive approach involving 55 students of class XII. The writer used a diagnostic test of learning difficulties with the probability concept, observation, and interviews as the techniques to collect the data needed. The data were used to determine the levels of understanding and the learning difficulties experienced by the students. From the students' test results and learning observations, it was found that the mean cognitive level was at level 2. The findings indicated that students had appropriate quantitative information about the probability concept, but it might be incomplete or incorrectly used. The difficulties found concern constructing sample spaces, events, and mathematical models related to probability problems. Besides, students had difficulties in understanding the principles of events and prerequisite concepts.

  17. Estimating deficit probabilities with price-responsive demand in contract-based electricity markets

    International Nuclear Information System (INIS)

    Galetovic, Alexander; Munoz, Cristian M.

    2009-01-01

    Studies that estimate deficit probabilities in hydrothermal systems have generally ignored the response of demand to changing prices, in the belief that such response is largely irrelevant. We show that ignoring the response of demand to prices can lead to substantial overestimation or underestimation of the probability of an energy deficit. To make our point we present an estimation of deficit probabilities in Chile's Central Interconnected System between 2006 and 2010. This period is characterized by tight supply, fast consumption growth and rising electricity prices. When the response of demand to rising prices is acknowledged, forecasted deficit probabilities and marginal costs are shown to be substantially lower.
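The paper's point can be illustrated with a toy Monte Carlo: compare the deficit probability when demand is fixed with the probability when demand responds to a price rise through a constant-elasticity rule. All numbers and names below are hypothetical, not Chilean system data.

```python
import random

def deficit_probability(trials=20000, seed=3, price_ratio=1.5, elasticity=-0.2,
                        base_demand=100.0, supply_mean=105.0, supply_sd=15.0):
    """Estimate P(supply < demand) with fixed demand and with demand
    that responds to a price rise (demand = base * price_ratio**elasticity).
    Supply is drawn from a normal distribution; all figures are toy values."""
    rng = random.Random(seed)
    responsive = base_demand * price_ratio ** elasticity   # demand after price rise
    d_fixed = d_resp = 0
    for _ in range(trials):
        supply = rng.gauss(supply_mean, supply_sd)
        d_fixed += supply < base_demand
        d_resp += supply < responsive
    return d_fixed / trials, d_resp / trials
```

With a negative elasticity, the responsive-demand deficit probability is lower, which is the overestimation effect the abstract describes.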

  18. Probabilistic Inference: Task Dependency and Individual Differences of Probability Weighting Revealed by Hierarchical Bayesian Modeling.

    Science.gov (United States)

    Boos, Moritz; Seer, Caroline; Lange, Florian; Kopp, Bruno

    2016-01-01

    Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modeling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behavior. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modeling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modeling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.
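As an example of the (inverted) S-shaped family referred to above, the one-parameter Tversky-Kahneman weighting function is a common choice (the study's hierarchical Bayesian model is not reproduced here):

```python
def weight(p, gamma):
    """Tversky-Kahneman one-parameter probability weighting function,
    an instance of the (inverted) S-shaped family: for gamma < 1 small
    probabilities are overweighted and large ones underweighted."""
    num = p ** gamma
    return num / (num + (1.0 - p) ** gamma) ** (1.0 / gamma)
```

Setting gamma = 1 recovers the identity, i.e. undistorted probabilities; individual differences in gamma are exactly the kind of parameter variation the hierarchical model captures.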

  19. Probabilistic inference: Task dependency and individual differences of probability weighting revealed by hierarchical Bayesian modelling

    Directory of Open Access Journals (Sweden)

    Moritz eBoos

    2016-05-01

    Full Text Available Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modelling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behaviour. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model’s success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modelling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modelling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.

  20. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections

  1. A method for the calculation of the cumulative failure probability distribution of complex repairable systems

    International Nuclear Information System (INIS)

    Caldarola, L.

    1976-01-01

    A method is proposed for the analytical evaluation of the cumulative failure probability distribution of complex repairable systems. The method is based on a set of integral equations each one referring to a specific minimal cut set of the system. Each integral equation links the unavailability of a minimal cut set to its failure probability density distribution and to the probability that the minimal cut set is down at the time t under the condition that it was down at time t'(t'<=t). The limitations for the applicability of the method are also discussed. It has been concluded that the method is applicable if the process describing the failure of a minimal cut set is a 'delayed semi-regenerative process'. (Auth.)
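While the paper's method is analytical, the quantity involved can be illustrated numerically: the sketch below estimates, by Monte Carlo, the point unavailability of a single repairable component modeled as an alternating renewal (up/down) process with exponential failure and repair times. This is a simplified stand-in, not the minimal-cut-set integral equations themselves; all parameter names are ours.

```python
import random

def mcs_unavailability(mttf, mttr, t_end, trials=4000, seed=7):
    """Monte Carlo estimate of the probability that an alternating
    up/down repairable component is down at time t_end, with
    exponential times to failure (mean mttf) and repair (mean mttr)."""
    rng = random.Random(seed)
    down = 0
    for _ in range(trials):
        t, up = 0.0, True
        while t < t_end:
            # Draw the next failure (if up) or repair (if down) time.
            t += rng.expovariate(1.0 / (mttf if up else mttr))
            if t < t_end:
                up = not up
        if not up:
            down += 1
    return down / trials
```

For t_end much larger than one failure-repair cycle, the estimate approaches the steady-state value MTTR / (MTTF + MTTR).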

  2. A probable new type of osteopenic bone disease

    Energy Technology Data Exchange (ETDEWEB)

    Widhe, Torulf L. [Department of Orthopaedics, Huddinge University Hospital (Sweden)

    2002-06-01

    A probable new type of osteopenic bone disease in two sisters and one female cousin is described. In infancy, the radiological findings were osteopenia, coxa vara, periosteal cloaking, bowing of the long bones, and flaring of the metaphyses. During growth, spinal pathology developed with compression of the vertebral bodies and scoliosis in one girl and kyphosis in another. All three children had genu valgum and two developed severe S-shaped bowing of the tibiae. Growth was stunted. Inheritance of this disorder is probably recessive. Type I and III collagen biosynthesis was normal. This condition is probably a hitherto undescribed form of osteogenesis imperfecta type III or a new bone disease. (orig.)

  3. A probable new type of osteopenic bone disease

    International Nuclear Information System (INIS)

    Widhe, Torulf L.

    2002-01-01

    A probable new type of osteopenic bone disease in two sisters and one female cousin is described. In infancy, the radiological findings were osteopenia, coxa vara, periosteal cloaking, bowing of the long bones, and flaring of the metaphyses. During growth, spinal pathology developed with compression of the vertebral bodies and scoliosis in one girl and kyphosis in another. All three children had genu valgum and two developed severe S-shaped bowing of the tibiae. Growth was stunted. Inheritance of this disorder is probably recessive. Type I and III collagen biosynthesis was normal. This condition is probably a hitherto undescribed form of osteogenesis imperfecta type III or a new bone disease. (orig.)

  4. Fasting: Benefits and probable health harmfulness from the Islamic perspective

    Directory of Open Access Journals (Sweden)

    Mahdi Ebrahimi

    2015-06-01

    Medical science can determine the effects and consequences of thirst and hunger during the month of Ramadan. From the religious perspective, it has been emphasized that fasting is for achieving divine virtue, and this shouldn’t be in conflict with maintaining one’s health. Therefore, under conditions in which fasting would probably harm a person’s health, that person shouldn’t fast. As a result, medical science can identify the conditions under which fasting is probably harmful to health.

  5. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    Full Text Available The influence of “Grundbegriffe” by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory “calling for” an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, the dual maps to random variables) have very different “mathematical nature”. Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events and elementary category theory, covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing “fractions” of classical random events, and we upgrade the notions of probability measure and random variable.
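The Łukasiewicz operations mentioned above can be written down directly; on crisp 0/1 values they reduce to Boolean conjunction and disjunction, which is the embedding of classical events the abstract describes (a minimal sketch):

```python
def luk_and(a, b):
    """Łukasiewicz strong conjunction on [0, 1]-valued events."""
    return max(0.0, a + b - 1.0)

def luk_or(a, b):
    """Łukasiewicz strong disjunction on [0, 1]-valued events."""
    return min(1.0, a + b)

def luk_not(a):
    """Łukasiewicz negation."""
    return 1.0 - a
```

On "fractions" of events the operations give genuinely multivalued results, e.g. luk_and(0.6, 0.7) = 0.3, while restricting the arguments to {0, 1} recovers classical AND, OR, and NOT.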

  6. Estimation of the age-specific per-contact probability of Ebola virus transmission in Liberia using agent-based simulations

    Science.gov (United States)

    Siettos, Constantinos I.; Anastassopoulou, Cleo; Russo, Lucia; Grigoras, Christos; Mylonakis, Eleftherios

    2016-06-01

    Based on multiscale agent-based computations we estimated the per-contact probability of transmission by age of the Ebola virus disease (EVD) that swept through Liberia from May 2014 to March 2015. For the approximation of the epidemic dynamics we developed a detailed agent-based model with small-world interactions between individuals categorized by age. For the estimation of the structure of the evolving contact network, as well as of the per-contact transmission probabilities by age group, we exploited the so-called Equation-Free framework. Model parameters were fitted to official case counts reported by the World Health Organization (WHO) as well as to recently published data on key epidemiological variables, such as the mean times to death and recovery and the case fatality rate.
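The per-contact transmission idea can be sketched as a toy Bernoulli scheme: each infectious case makes a number of contacts, each contact's age group is sampled from an age distribution, and transmission occurs with an age-specific per-contact probability. The paper's small-world network and Equation-Free fitting are not reproduced; every name and parameter below is hypothetical.

```python
import random

def new_infections(n_infectious, contacts_per_case, age_dist, p_by_age, rng):
    """Draw one day's new infections: each contact's age group is sampled
    from age_dist, and transmission is a Bernoulli trial with the
    age-specific per-contact probability p_by_age (toy scheme only)."""
    count = 0
    for _ in range(n_infectious * contacts_per_case):
        age_group = rng.choices(range(len(age_dist)), weights=age_dist)[0]
        if rng.random() < p_by_age[age_group]:
            count += 1
    return count
```

Fitting, in this reading, means tuning `p_by_age` until simulated case counts match the reported ones.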

  7. The Efficacy of Using Diagrams When Solving Probability Word Problems in College

    Science.gov (United States)

    Beitzel, Brian D.; Staley, Richard K.

    2015-01-01

    Previous experiments have shown a deleterious effect of visual representations on college students' ability to solve total- and joint-probability word problems. The present experiments used conditional-probability problems, known to be more difficult than total- and joint-probability problems. The diagram group was instructed in how to use tree…
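A tree diagram for a conditional-probability problem encodes exactly the two-branch computation below (a generic textbook scheme, not the study's materials): multiply along each branch to get the joint probabilities, then renormalize.

```python
def bayes_from_tree(p_h, p_e_given_h, p_e_given_not_h):
    """P(H|E) from the two branches of a probability tree: multiply
    along each branch to get joint probabilities, then renormalize."""
    joint_h = p_h * p_e_given_h                    # branch through H
    joint_not_h = (1.0 - p_h) * p_e_given_not_h    # branch through not-H
    return joint_h / (joint_h + joint_not_h)
```

For example, with a 1% base rate, a 90% hit rate, and a 9% false-positive rate, P(H|E) is only about 0.092, the classic result a tree diagram makes visible.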

  8. Edge Probability and Pixel Relativity-Based Speckle Reducing Anisotropic Diffusion.

    Science.gov (United States)

    Mishra, Deepak; Chaudhury, Santanu; Sarkar, Mukul; Soin, Arvinder Singh; Sharma, Vivek

    2018-02-01

    Anisotropic diffusion filters are one of the best choices for speckle reduction in ultrasound images. These filters control the diffusion flux flow using local image statistics and provide the desired speckle suppression. However, inefficient use of edge characteristics results in either an oversmoothed image or an image containing misinterpreted spurious edges, so the diagnostic quality of the images becomes a concern. To alleviate such problems, a novel anisotropic diffusion-based speckle-reducing filter is proposed in this paper. A probability density function of the edges, along with pixel relativity information, is used to control the diffusion flux flow. The probability density function helps in removing the spurious edges, and the pixel relativity reduces the oversmoothing effects. Furthermore, the filtering is performed in the superpixel domain to reduce the execution time, wherein a minimum of 15% of the total number of image pixels can be used. For performance evaluation, 31 frames of three synthetic images and 40 real ultrasound images are used. In most of the experiments, the proposed filter performs better than the state-of-the-art filters in terms of the speckle region's signal-to-noise ratio and mean square error. It shows comparable performance for the figure of merit and the structural similarity index measure. Furthermore, in the subjective evaluation, performed by expert radiologists, the proposed filter's outputs are preferred for the improved contrast and sharpness of the object boundaries. Hence, the proposed filtering framework is suitable for reducing unwanted speckle and improving the quality of ultrasound images.
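For orientation, the classical Perona-Malik scheme that anisotropic diffusion filters build on is sketched below; the paper's edge-probability and pixel-relativity terms are not reproduced, and the parameter values are illustrative.

```python
def perona_malik_step(img, kappa=0.1, dt=0.2):
    """One explicit Perona-Malik diffusion step on a 2-D list-of-lists
    image: the diffusivity g shrinks where the local gradient is large,
    so smoothing is suppressed across edges."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w):
            c = img[y][x]
            flux = 0.0
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if 0 <= ny < h and 0 <= nx < w:
                    d = img[ny][nx] - c
                    g = 1.0 / (1.0 + (d / kappa) ** 2)   # small across strong edges
                    flux += g * d
            out[y][x] = c + dt * flux
    return out
```

Iterating the step smooths homogeneous (speckled) regions while largely preserving strong intensity edges; the speckle-reducing variants replace g with statistics suited to multiplicative noise.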

  9. MR-based automatic delineation of volumes of interest in human brain PET images using probability maps

    DEFF Research Database (Denmark)

    Svarer, Claus; Madsen, Karina; Hasselbalch, Steen G.

    2005-01-01

    The purpose of this study was to develop and validate an observer-independent approach for automatic generation of volume-of-interest (VOI) brain templates to be used in emission tomography studies of the brain. The method utilizes a VOI probability map created on the basis of a database of several...... delineation of the VOI set. The approach was also shown to work equally well in individuals with pronounced cerebral atrophy. Probability-map-based automatic delineation of VOIs is a fast, objective, reproducible, and safe way to assess regional brain values from PET or SPECT scans. In addition, the method...
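The delineation step can be reduced, in its simplest form, to thresholding the probability map (a sketch only; the MR-based normalization and template construction described above are not shown):

```python
def voi_from_probability_map(prob_map, threshold=0.5):
    """Binary VOI mask from a voxelwise probability map by simple
    thresholding: voxels whose probability of belonging to the region
    reaches the threshold are included in the volume of interest."""
    return [[1 if p >= threshold else 0 for p in row] for row in prob_map]
```

Regional values are then obtained by averaging the PET or SPECT image over the masked voxels.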

  10. Introduction to probability and statistics for science, engineering, and finance

    CERN Document Server

    Rosenkrantz, Walter A

    2008-01-01

    Data Analysis Orientation The Role and Scope of Statistics in Science and Engineering Types of Data: Examples from Engineering, Public Health, and Finance The Frequency Distribution of a Variable Defined on a Population Quantiles of a Distribution Measures of Location (Central Value) and Variability Covariance, Correlation, and Regression: Computing a Stock's Beta Mathematical Details and Derivations Large Data Sets Probability Theory Orientation Sample Space, Events, Axioms of Probability Theory Mathematical Models of Random Sampling Conditional Probability and Baye

  11. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  12. Testing the Conditional Mean Function of Autoregressive Conditional Duration Models

    DEFF Research Database (Denmark)

    Hautsch, Nikolaus

    This paper proposes a dynamic proportional hazard (PH) model with non-specified baseline hazard for the modelling of autoregressive duration processes. A categorization of the durations allows us to reformulate the PH model as an ordered response model based on extreme value distributed errors... be subject to censoring structures. In an empirical study based on financial transaction data we present an application of the model to estimate conditional asset price change probabilities. Evaluating the forecasting properties of the model, it is shown that the proposed approach is a promising competitor...

  13. Constructing inverse probability weights for continuous exposures: a comparison of methods.

    Science.gov (United States)

    Naimi, Ashley I; Moodie, Erica E M; Auger, Nathalie; Kaufman, Jay S

    2014-03-01

    Inverse probability-weighted marginal structural models with binary exposures are common in epidemiology. Constructing inverse probability weights for a continuous exposure can be complicated by the presence of outliers, and the need to identify a parametric form for the exposure and account for nonconstant exposure variance. We explored the performance of various methods to construct inverse probability weights for continuous exposures using Monte Carlo simulation. We generated two continuous exposures and binary outcomes using data sampled from a large empirical cohort. The first exposure followed a normal distribution with homoscedastic variance. The second exposure followed a contaminated Poisson distribution, with heteroscedastic variance equal to the conditional mean. We assessed six methods to construct inverse probability weights using: a normal distribution, a normal distribution with heteroscedastic variance, a truncated normal distribution with heteroscedastic variance, a gamma distribution, a t distribution (1, 3, and 5 degrees of freedom), and a quantile binning approach (based on 10, 15, and 20 exposure categories). We estimated the marginal odds ratio for a single-unit increase in each simulated exposure in a regression model weighted by the inverse probability weights constructed using each approach, and then computed the bias and mean squared error for each method. For the homoscedastic exposure, the standard normal, gamma, and quantile binning approaches performed best. For the heteroscedastic exposure, the quantile binning, gamma, and heteroscedastic normal approaches performed best. Our results suggest that the quantile binning approach is a simple and versatile way to construct inverse probability weights for continuous exposures.
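A single stabilized weight under the first approach (normal conditional density, with a normal marginal density as the stabilizer) can be sketched as follows; the fitted means and standard deviations are assumed to come from regression models estimated elsewhere, and the function names are ours.

```python
import math

def normal_pdf(x, mu, sd):
    """Density of a normal distribution at x."""
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2.0 * math.pi))

def stabilized_ipw(exposure, fitted_mean, marg_mean, marg_sd, resid_sd):
    """Stabilized inverse probability weight for one continuous-exposure
    observation: marginal density of the exposure divided by its density
    conditional on covariates (fitted_mean is the covariate-conditional
    prediction; all model fits are assumed done upstream)."""
    num = normal_pdf(exposure, marg_mean, marg_sd)
    den = normal_pdf(exposure, fitted_mean, resid_sd)
    return num / den
```

When the conditional model ignores the covariates entirely (same mean and spread as the marginal), every weight equals 1; large weights arise for exposures far from their covariate-conditional prediction.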

  14. An Optimization Method for Condition Based Maintenance of Aircraft Fleet Considering Prognostics Uncertainty

    Directory of Open Access Journals (Sweden)

    Qiang Feng

    2014-01-01

    Full Text Available An optimization method for condition based maintenance (CBM) of an aircraft fleet considering prognostics uncertainty is proposed. The CBM and dispatch process of the aircraft fleet is analyzed first, and the alternative strategy sets for a single aircraft are given. Then, the optimization problem of fleet CBM, with lower maintenance cost and dispatch risk, is translated into a combinatorial optimization problem over single-aircraft strategies. The remaining useful life (RUL) distribution of the key line-replaceable module (LRM) is transformed into the failure probability of the aircraft, and the fleet health status matrix is established. The calculation of the costs and risks of a mission, based on the health status matrix and the maintenance matrix, is given. Further, an optimization method for fleet dispatch and CBM under acceptable risk is proposed based on an improved genetic algorithm. Finally, a fleet of 10 aircraft is studied to verify the proposed method. The results show that the method can realize optimization and control of the aircraft fleet oriented to mission success.
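The RUL-to-failure-probability step can be sketched under a convenient normality assumption (the paper does not specify the LRM's RUL distribution; the names below are ours):

```python
import math

def failure_prob(rul_mean, rul_sd, mission_time):
    """P(failure during the mission) = P(RUL < mission_time), assuming a
    normal RUL distribution; uses the normal CDF via the error function."""
    z = (mission_time - rul_mean) / (rul_sd * math.sqrt(2.0))
    return 0.5 * (1.0 + math.erf(z))
```

Evaluating this for each aircraft's key LRM fills one entry of the fleet health status matrix; the genetic algorithm then searches the strategy combinations that keep total cost and dispatch risk acceptable.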

  15. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height...... associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination...... of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions....
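Under the Rayleigh assumption stated above, the exceedance probability of the largest of N individual waves, and the mode of that maximum, have closed forms (a sketch; Hs denotes the significant wave height):

```python
import math

def max_wave_exceedance(h, hs, n_waves):
    """Probability that the largest of n_waves Rayleigh-distributed
    individual wave heights exceeds h, given significant wave height hs."""
    p_single = math.exp(-2.0 * (h / hs) ** 2)      # P(H > h) for one wave
    return 1.0 - (1.0 - p_single) ** n_waves       # at least one exceedance

def modal_max_wave(hs, n_waves):
    """Mode (most probable value) of the maximum individual wave height
    among n_waves Rayleigh-distributed waves."""
    return hs * math.sqrt(0.5 * math.log(n_waves))
```

For N = 1000 waves the modal maximum is about 1.86 Hs, a familiar rule of thumb; combining such within-storm statistics with the long-term extreme significant wave height gives the encounter probability of an individual design wave.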

  16. Diagnostics of enterprise bankruptcy occurrence probability in an anti-crisis management: modern approaches and classification of models

    Directory of Open Access Journals (Sweden)

    I.V. Zhalinska

    2015-09-01

    Full Text Available Diagnostics of the probability of enterprise bankruptcy is an important tool for ensuring the viability of an organization under the conditions of an unpredictable, dynamic environment. The paper aims to define the basic features of models for diagnosing the probability of bankruptcy and to classify those models. The article substantiates the objective increase of crisis probability in modern enterprises, an increase that creates the need to raise the efficiency of anti-crisis management. The system of anti-crisis management is based on the subsystem for diagnosing the probability of bankruptcy; this subsystem is the starting point for further measures to prevent and overcome the crisis. A classification of existing models of enterprise bankruptcy probability is suggested, based on the methodical and methodological principles of the models. The following main groups of models are identified: models using financial ratios, aggregates, and scores; discriminant-analysis models; methods of strategic analysis; informal models; artificial intelligence systems; and combinations of these models. The classification makes it possible to identify the analytical capabilities of each of the suggested groups of models.
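A classic member of the "financial ratios, aggregates and scores" group is Altman's Z-score; a sketch with the original 1968 coefficients for public manufacturing firms (inputs in any one consistent currency unit):

```python
def altman_z(working_capital, retained_earnings, ebit, market_equity,
             sales, total_assets, total_liabilities):
    """Altman Z-score (original 1968 coefficients), a weighted sum of
    five financial ratios used to flag bankruptcy risk."""
    x1 = working_capital / total_assets
    x2 = retained_earnings / total_assets
    x3 = ebit / total_assets
    x4 = market_equity / total_liabilities
    x5 = sales / total_assets
    return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5
```

Conventionally, Z above 2.99 indicates the safe zone and Z below 1.81 the distress zone, with the interval between them a grey area.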

  17. PREDICTION OF RESERVOIR FLOW RATE OF DEZ DAM BY THE PROBABILITY MATRIX METHOD

    Directory of Open Access Journals (Sweden)

    Mohammad Hashem Kanani

    2012-12-01

    Full Text Available The data collected from the operation of existing storage reservoirs offer valuable information for better allocation and management of fresh water rates for future use and for mitigating drought effects. In this paper the long-term prediction of water rates for the Dez reservoir (Iran) is presented using the probability matrix method. Data are analyzed to find the probability matrix of water rates in the Dez reservoir based on the history of annual water inflow over the past 40 years. The algorithm developed covers both the overflow and non-overflow conditions in the reservoir. Results of this study show that in non-overflow conditions the most exigent case is equal to 75%: if the reservoir is empty (the stored water is less than 100 MCM) this year, it will also be empty with 75% probability next year. The stored water in the reservoir will be less than 300 MCM with 85% probability next year if the reservoir is empty this year. This percentage decreases to 70% next year if the water of the reservoir is less than 300 MCM this year, and to 5% next year if the reservoir is full this year. In overflow conditions the most exigent case is again equal to 75%. The reservoir volume will be less than 150 MCM with 90% probability next year if it is empty this year; this percentage decreases to 70% if its water volume is less than 300 MCM and to 55% if the water volume is less than 500 MCM this year. The results also show that if the probability matrix of water rates to a reservoir is multiplied by itself repeatedly, it converges to a constant probability matrix, which can be used to predict the long-term water rate of the reservoir. In other words, the probability matrix of a series of water rates converges to a steady probability matrix in the course of time, which reflects the hydrological behavior of the watershed and can easily be used for the long-term prediction of water storage in downstream reservoirs.
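The convergence property stated at the end of the abstract is easy to demonstrate: repeatedly multiplying a row-stochastic transition matrix by itself drives every row toward the same steady-state distribution. The 3-state matrix below is illustrative, not the Dez reservoir data:

```python
def mat_mul(a, b):
    """Square matrix product for plain nested lists."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Hypothetical annual-inflow transition matrix (rows: this year's state
# low/medium/high; columns: next year's state). Values are made up.
p = [[0.75, 0.20, 0.05],
     [0.30, 0.55, 0.15],
     [0.10, 0.35, 0.55]]

pk = p
for _ in range(50):          # repeated self-multiplication
    pk = mat_mul(pk, p)

# After enough steps every row approaches the same steady-state distribution,
# regardless of the starting state.
steady = [round(x, 4) for x in pk[0]]
print(steady)
```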

  18. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
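COVAL itself performs a numerical transformation of the input distributions; as a hedged sketch of the same task (the distribution of a function of random variables, applied to the reliability of a structure under random loads), a Monte Carlo version with made-up normal parameters looks like:

```python
import random

random.seed(1)

def failure_probability(n=100_000):
    """Estimate P(load > strength) by sampling the input distributions.
    The normal parameters are illustrative, not from the COVAL report."""
    failures = 0
    for _ in range(n):
        load = random.gauss(100.0, 20.0)      # random load
        strength = random.gauss(160.0, 15.0)  # random resistance
        if load > strength:                   # safety margin g = R - L < 0
            failures += 1
    return failures / n

print(failure_probability())
```

For normal inputs the margin is itself normal, so this estimate can be checked analytically; a numerical transformation, as in COVAL, gives the same answer without sampling noise.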

  19. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
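A minimal numeric illustration of the definition, under assumed values: if a model M gives a normal predictive distribution to a quantity that evidence E says is non-negative, the leakage is the mass M places below zero:

```python
import math

def normal_cdf(x, mu, sigma):
    """Standard closed-form normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Illustrative leakage: a regression model for a non-negative quantity y
# uses a normal predictive distribution, yet evidence E says y >= 0.
mu, sigma = 2.0, 1.5
leakage = normal_cdf(0.0, mu, sigma)   # probability the model puts on y < 0
print(round(leakage, 4))
```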

  20. Modeling spatial variability of sand-lenses in clay till settings using transition probability and multiple-point geostatistics

    DEFF Research Database (Denmark)

    Kessler, Timo Christian; Nilsson, Bertel; Klint, Knud Erik

    2010-01-01

    ...... of sand-lenses in clay till. Sand-lenses mainly account for horizontal transport and are prioritised in this study. Based on field observations, the distribution has been modeled using two different geostatistical approaches. One method uses a Markov chain model calculating the transition probabilities...... (TPROGS) of alternating geological facies. The second method, multiple-point statistics, uses training images to estimate the conditional probability of sand-lenses at a certain location. Both methods respect field observations such as local stratigraphy; however, only the multiple-point statistics can......

  1. On probability-possibility transformations

    Science.gov (United States)

    Klir, George J.; Parviz, Behzad

    1992-01-01

    Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.

  2. Literature-based condition-specific miRNA-mRNA target prediction.

    Directory of Open Access Journals (Sweden)

    Minsik Oh

    Full Text Available miRNAs are small non-coding RNAs that regulate gene expression by binding to the 3'-UTR of genes. Many recent studies have reported that miRNAs play important biological roles by regulating specific mRNAs or genes. Many sequence-based target prediction algorithms have been developed to predict miRNA targets. However, these methods are not designed for condition-specific target predictions and produce many false positives; thus, expression-based target prediction algorithms have been developed for condition-specific target predictions. A typical strategy to utilize expression data is to leverage the negative control roles of miRNAs on genes. To control false positives, a stringent cutoff value is typically set, but in this case, these methods tend to reject many true target relationships, i.e., false negatives. To overcome these limitations, additional information should be utilized. The literature is probably the best resource that we can utilize. Recent literature mining systems compile millions of articles with experiments designed for specific biological questions, and the systems provide a function to search for specific information. To utilize the literature information, we used a literature mining system, BEST, that automatically extracts information from the literature in PubMed and that allows the user to perform searches of the literature with any English words. By integrating omics data analysis methods and BEST, we developed Context-MMIA, a miRNA-mRNA target prediction method that combines expression data analysis results and the literature information extracted based on the user-specified context. In the pathway enrichment analysis using genes included in the top 200 miRNA-targets, Context-MMIA outperformed the four existing target prediction methods that we tested. 
In another test of whether prediction methods can reproduce experimentally validated target relationships, Context-MMIA outperformed the four existing target prediction

  3. Height probabilities in the Abelian sandpile model on the generalized finite Bethe lattice

    Science.gov (United States)

    Chen, Haiyan; Zhang, Fuji

    2013-08-01

    In this paper, we study the sandpile model on the generalized finite Bethe lattice with a particular boundary condition. Using a combinatorial method, we give the exact expressions for all single-site probabilities and some two-site joint probabilities. As a by-product, we prove that the height probabilities of bulk vertices are all the same for the Bethe lattice with certain given boundary condition, which was found from numerical evidence by Grassberger and Manna ["Some more sandpiles," J. Phys. (France) 51, 1077-1098 (1990)], 10.1051/jphys:0199000510110107700 but without a proof.

  4. Modeling Stochastic Complexity in Complex Adaptive Systems: Non-Kolmogorov Probability and the Process Algebra Approach.

    Science.gov (United States)

    Sulis, William H

    2017-10-01

    Walter Freeman III pioneered the application of nonlinear dynamical systems theories and methodologies in his work on mesoscopic brain dynamics. Sadly, mainstream psychology and psychiatry still cling to linear correlation-based data analysis techniques, which threaten to subvert the process of experimentation and theory building. In order to progress, it is necessary to develop tools capable of managing the stochastic complexity of complex biopsychosocial systems, which includes multilevel feedback relationships, nonlinear interactions, chaotic dynamics and adaptability. In addition, however, these systems exhibit intrinsic randomness, non-Gaussian probability distributions, non-stationarity, contextuality, and non-Kolmogorov probabilities, as well as the absence of mean and/or variance and conditional probabilities. These properties and their implications for statistical analysis are discussed. An alternative approach, the Process Algebra approach, is described. It is a generative model, capable of generating non-Kolmogorov probabilities. It has proven useful in addressing fundamental problems in quantum mechanics and in the modeling of developing psychosocial systems.

  5. On estimating probability of presence from use-availability or presence-background data.

    Science.gov (United States)

    Phillips, Steven J; Elith, Jane

    2013-06-01

    A fundamental ecological modeling task is to estimate the probability that a species is present in (or uses) a site, conditional on environmental variables. For many species, available data consist of "presence" data (locations where the species [or evidence of it] has been observed), together with "background" data, a random sample of available environmental conditions. Recently published papers disagree on whether probability of presence is identifiable from such presence-background data alone. This paper aims to resolve the disagreement, demonstrating that additional information is required. We defined seven simulated species representing various simple shapes of response to environmental variables (constant, linear, convex, unimodal, S-shaped) and ran five logistic model-fitting methods using 1000 presence samples and 10 000 background samples; the simulations were repeated 100 times. The experiment revealed a stark contrast between two groups of methods: those based on a strong assumption that species' true probability of presence exactly matches a given parametric form had highly variable predictions and much larger RMS error than methods that take population prevalence (the fraction of sites in which the species is present) as an additional parameter. For six species, the former group grossly under- or overestimated probability of presence. The cause was not model structure or choice of link function, because all methods were logistic with linear and, where necessary, quadratic terms. Rather, the experiment demonstrates that an estimate of prevalence is not just helpful, but is necessary (except in special cases) for identifying probability of presence. We therefore advise against use of methods that rely on the strong assumption, due to Lele and Keim (recently advocated by Royle et al.) and Lancaster and Imbens. The methods are fragile, and their strong assumption is unlikely to be true in practice. We emphasize, however, that we are not arguing against
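The identifiability argument can be restated with Bayes' rule. Presence-background data identify the density of conditions at presence sites, f1(x), and the background density, f(x), but P(present | x) = pi * f1(x) / f(x) still requires the prevalence pi as an extra input. A sketch with assumed Gaussian densities (all parameters here are invented for illustration):

```python
import math

def norm_pdf(x, mu, sigma):
    """Normal density, used as a stand-in for the two identifiable densities."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def prob_presence(x, prevalence, mu1=0.5, s1=0.2, mu=0.0, s=0.5):
    """P(present | x) = prevalence * f1(x) / f(x), capped at 1.
    f1: conditions at presence sites; f: available (background) conditions."""
    ratio = norm_pdf(x, mu1, s1) / norm_pdf(x, mu, s)
    return min(1.0, prevalence * ratio)

x = 0.5
for pi in (0.05, 0.20):   # identical data, different prevalence assumptions
    print(pi, round(prob_presence(x, pi), 3))
```

The density ratio is the same in both cases; only the assumed prevalence changes, and the predicted probability of presence scales with it, which is why prevalence cannot be recovered from presence-background data alone.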

  6. Probability distributions for Markov chain based quantum walks

    Science.gov (United States)

    Balu, Radhakrishnan; Liu, Chaobin; Venegas-Andraca, Salvador E.

    2018-01-01

    We analyze the probability distributions of the quantum walks induced from Markov chains by Szegedy (2004). The first part of this paper is devoted to the quantum walks induced from finite state Markov chains. It is shown that the probability distribution on the states of the underlying Markov chain is always convergent in the Cesaro sense. In particular, we deduce that the limiting distribution is uniform if the transition matrix is symmetric. In the case of a non-symmetric Markov chain, we exemplify that the limiting distribution of the quantum walk is not necessarily identical with the stationary distribution of the underlying irreducible Markov chain. The Szegedy scheme can be extended to infinite state Markov chains (random walks). In the second part, we formulate the quantum walk induced from a lazy random walk on the line. We then obtain the weak limit of the quantum walk. It is noted that the current quantum walk appears to spread faster than its counterpart, the quantum walk on the line driven by the Grover coin discussed in the literature. The paper closes with an outlook on possible future directions.

  7. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  8. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits...... in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  9. A Derivation of Probabilities of Correct and Wrongful Conviction in a Criminal Trial

    DEFF Research Database (Denmark)

    Lando, Henrik

    2006-01-01

    probabilities are the probability of observing (any given) evidence against individual i given that individual j committed the crime (for any j including j equal to i). The variables are derived from the conditional probabilities as a function of the standard of the proof using simple Bayesian updating....
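The simple Bayesian updating mentioned above can be sketched as follows, with assumed conditional probabilities of observing the evidence (these numbers are illustrative, not the paper's derivation):

```python
def posterior_guilt(p_e_given_guilty, p_e_given_innocent, n_suspects):
    """Posterior probability that individual i committed the crime, given
    evidence E against i, with a uniform prior over n_suspects."""
    prior = 1.0 / n_suspects
    num = p_e_given_guilty * prior
    den = num + p_e_given_innocent * (1.0 - prior)
    return num / den

# Even strong evidence (90% likely if guilty, 1% if innocent) leaves the
# posterior below 0.5 when the prior pool of suspects is large.
print(round(posterior_guilt(0.9, 0.01, 100), 3))
```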

  10. The probability of object-scene co-occurrence influences object identification processes.

    Science.gov (United States)

    Sauvé, Geneviève; Harmand, Mariane; Vanni, Léa; Brodeur, Mathieu B

    2017-07-01

    Contextual information allows the human brain to make predictions about the identity of objects that might be seen and irregularities between an object and its background slow down perception and identification processes. Bar and colleagues modeled the mechanisms underlying this beneficial effect suggesting that the brain stocks information about the statistical regularities of object and scene co-occurrence. Their model suggests that these recurring regularities could be conceptualized along a continuum in which the probability of seeing an object within a given scene can be high (probable condition), moderate (improbable condition) or null (impossible condition). In the present experiment, we propose to disentangle the electrophysiological correlates of these context effects by directly comparing object-scene pairs found along this continuum. We recorded the event-related potentials of 30 healthy participants (18-34 years old) and analyzed their brain activity in three time windows associated with context effects. We observed anterior negativities between 250 and 500 ms after object onset for the improbable and impossible conditions (improbable more negative than impossible) compared to the probable condition as well as a parieto-occipital positivity (improbable more positive than impossible). The brain may use different processing pathways to identify objects depending on whether the probability of co-occurrence with the scene is moderate (rely more on top-down effects) or null (rely more on bottom-up influences). The posterior positivity could index error monitoring aimed to ensure that no false information is integrated into mental representations of the world.

  11. On the failure probability of the primary piping of the PWR

    International Nuclear Information System (INIS)

    Schueller, G.I.; Hampl, N.C.

    1984-01-01

    A methodology for the quantification of the structural reliability of the primary piping (PP) of a PWR under operational and accidental conditions is developed. Biblis B is utilized as the reference plant. The PP structure is modeled utilizing finite element procedures. Based on the properties of the operational and internal accidental conditions, a static analysis suffices. However, a dynamic analysis considering non-linear effects of the soil-structure interaction is used to determine load effects due to earthquake-induced loading. Realistically considering the presence of initial cracks in welds and the annual frequencies of occurrence of the various loading conditions, a crack propagation calculation utilizing the Forman model is carried out. Simultaneously, leak and break probabilities are computed using the 'Two Criteria' approach. A Monte Carlo simulation procedure is used as the method of solution. (Author) [pt

  12. The probability of return conditional on migration duration: evidence from Kosovo

    Directory of Open Access Journals (Sweden)

    Kotorri Mrika

    2017-12-01

    Full Text Available The aim of this paper is to conceptualise the migration duration decision within the expected utility maximisation framework, and from that to derive and estimate an empirical proposition. For this purpose, the conceptual framework in Kotorri (2015) is extended, where households decide to return to the home country conditional on their migration duration. In the empirical analysis, the Cox proportional hazards model is employed. This analysis is the first to investigate migration duration based on a random sample drawn from the Kosovo census of population conducted in 2011. The findings suggest rather mixed support for the household approach. The hazard of return decreases with income, but not nonlinearly. The results indicate that household return migration behaviour is influenced by demographic characteristics, psychic income, and political factors.

  13. The creation and evaluation of a model predicting the probability of conception in seasonal-calving, pasture-based dairy cows.

    Science.gov (United States)

    Fenlon, Caroline; O'Grady, Luke; Doherty, Michael L; Dunnion, John; Shalloo, Laurence; Butler, Stephen T

    2017-07-01

    Reproductive performance in pasture-based production systems has a fundamentally important effect on economic efficiency. The individual factors affecting the probability of submission and conception are multifaceted and have been extensively researched. The present study analyzed some of these factors in relation to service-level probability of conception in seasonal-calving pasture-based dairy cows to develop a predictive model of conception. Data relating to 2,966 services from 737 cows on 2 research farms were used for model development and data from 9 commercial dairy farms were used for model testing, comprising 4,212 services from 1,471 cows. The data spanned a 15-yr period and originated from seasonal-calving pasture-based dairy herds in Ireland. The calving season for the study herds extended from January to June, with peak calving in February and March. A base mixed-effects logistic regression model was created using a stepwise model-building strategy and incorporated parity, days in milk, interservice interval, calving difficulty, and predicted transmitting abilities for calving interval and milk production traits. To attempt to further improve the predictive capability of the model, the addition of effects that were not statistically significant was considered, resulting in a final model composed of the base model with the inclusion of BCS at service. The models' predictions were evaluated using discrimination to measure their ability to correctly classify positive and negative cases. Precision, recall, F-score, and area under the receiver operating characteristic curve (AUC) were calculated. Calibration tests measured the accuracy of the predicted probabilities. These included tests of overall goodness-of-fit, bias, and calibration error. Both models performed better than using the population average probability of conception. 
Neither of the models showed high levels of discrimination (base model AUC 0.61, final model AUC 0.62), possibly because of the

  14. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    International Nuclear Information System (INIS)

    Vourdas, A.

    2014-01-01

    The orthocomplemented modular lattice of subspaces L[H(d)], of a quantum system with d-dimensional Hilbert space H(d), is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H1, H2), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H1), P(H2) onto the subspaces H1, H2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities
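A concrete 2-dimensional example, assuming only the usual quantum rule p(H) = &lt;psi|P|psi&gt; (this is an illustration of the underlying phenomenon, not the paper's operator D): two non-commuting projectors have a nonzero commutator, and the Kolmogorov additivity relation p(H1 v H2) + p(H1 ^ H2) = p(H1) + p(H2) fails for them:

```python
def mat_mul(a, b):
    """2x2 real matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# P1 projects onto span{|0>}, P2 onto span{(|0> + |1>)/sqrt(2)}.
p1 = [[1.0, 0.0], [0.0, 0.0]]
p2 = [[0.5, 0.5], [0.5, 0.5]]

ab, ba = mat_mul(p1, p2), mat_mul(p2, p1)
comm = [[ab[i][j] - ba[i][j] for j in range(2)] for i in range(2)]
print(comm)                       # nonzero commutator [P1, P2]

# Probabilities in the state |0>: p(H) = <0|P|0> is the (0,0) entry.
p_h1, p_h2 = p1[0][0], p2[0][0]   # 1.0 and 0.5
p_join, p_meet = 1.0, 0.0         # H1 v H2 = H(2); H1 ^ H2 = {0}
print(p_join + p_meet, p_h1 + p_h2)   # 1.0 vs 1.5: additivity fails
```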

  15. Stochastic Economic Dispatch with Wind using Versatile Probability Distribution and L-BFGS-B Based Dual Decomposition

    DEFF Research Database (Denmark)

    Huang, Shaojun; Sun, Yuanzhang; Wu, Qiuwei

    2018-01-01

    This paper focuses on economic dispatch (ED) in power systems with intermittent wind power, which is a very critical issue in future power systems. A stochastic ED problem is formed based on the recently proposed versatile probability distribution (VPD) of wind power. The problem is then analyzed...

  16. Evidence-Based Medicine as a Tool for Undergraduate Probability and Statistics Education.

    Science.gov (United States)

    Masel, J; Humphrey, P T; Blackburn, B; Levine, J A

    2015-01-01

    Most students have difficulty reasoning about chance events, and misconceptions regarding probability can persist or even strengthen following traditional instruction. Many biostatistics classes sidestep this problem by prioritizing exploratory data analysis over probability. However, probability itself, in addition to statistics, is essential both to the biology curriculum and to informed decision making in daily life. One area in which probability is particularly important is medicine. Given the preponderance of pre-health students, in addition to more general interest in medicine, we capitalized on students' intrinsic motivation in this area to teach both probability and statistics. We use the randomized controlled trial as the centerpiece of the course, because it exemplifies the most salient features of the scientific method, and the application of critical thinking to medicine. The other two pillars of the course are biomedical applications of Bayes' theorem and science and society content. Backward design from these three overarching aims was used to select appropriate probability and statistics content, with a focus on eliciting and countering previously documented misconceptions in their medical context. Pretest/posttest assessments using the Quantitative Reasoning Quotient and Attitudes Toward Statistics instruments are positive, bucking several negative trends previously reported in statistics education. © 2015 J. Masel et al. CBE—Life Sciences Education © 2015 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
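A typical biomedical application of Bayes' theorem from such a course, with illustrative numbers: computing the post-test probability of disease from prevalence, sensitivity, and specificity exposes the base-rate misconception directly:

```python
def post_test_probability(prevalence, sensitivity, specificity):
    """Bayes' theorem: P(disease | positive test)."""
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

# A rare disease (1% prevalence) with a fairly accurate test (95%/95%) still
# yields a low post-test probability -- the effect students often misjudge.
print(round(post_test_probability(0.01, 0.95, 0.95), 3))
```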

  17. Naive Probability: Model-based Estimates of Unique Events

    Science.gov (United States)

    2014-05-04

    [Abstract not recoverable from the source record; the surviving extraction fragments include a citation, 'Khemlani, S., & Johnson-Laird, P.N. (2012b). Theories of the syllogism: A meta-analysis,' and sample probability-judgment items of the form 'What is the probability that... space tourism will achieve widespread popularity in the next 50 years?' and '...governments dedicate more resources to contacting extra-terrestrials?']

  18. Modelling detection probabilities to evaluate management and control tools for an invasive species

    Science.gov (United States)

    Christy, M.T.; Yackel Adams, A.A.; Rodda, G.H.; Savidge, J.A.; Tyrrell, C.L.

    2010-01-01

    For most ecologists, detection probability (p) is a nuisance variable that must be modelled to estimate the state variable of interest (i.e. survival, abundance, or occupancy). However, in the realm of invasive species control, the rate of detection and removal is the rate-limiting step for management of this pervasive environmental problem. For strategic planning of an eradication (removal of every individual), one must identify the least likely individual to be removed, and determine the probability of removing it. To evaluate visual searching as a control tool for populations of the invasive brown treesnake Boiga irregularis, we designed a mark-recapture study to evaluate detection probability as a function of time, gender, size, body condition, recent detection history, residency status, searcher team and environmental covariates. We evaluated these factors using 654 captures resulting from visual detections of 117 snakes residing in a 5-ha semi-forested enclosure on Guam, fenced to prevent immigration and emigration of snakes but not their prey. Visual detection probability was low overall (= 0.07 per occasion) but reached 0.18 under optimal circumstances. Our results supported sex-specific differences in detectability that were a quadratic function of size, with both small and large females having lower detection probabilities than males of those sizes. There was strong evidence for individual periodic changes in detectability of a few days' duration, roughly doubling detection probability (comparing peak to non-elevated detections). Snakes in poor body condition had estimated mean detection probabilities greater than snakes with high body condition. Search teams with high average detection rates exhibited detection probabilities about twice that of search teams with low average detection rates. Surveys conducted with bright moonlight and strong wind gusts exhibited moderately decreased probabilities of detecting snakes. Synthesis and applications. By
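A hedged sketch of this kind of detection model: a logistic regression with a sex effect and a quadratic size effect. The coefficients below are invented for illustration, not the study's fitted values:

```python
import math

def detection_prob(size_cm, is_female):
    """Per-occasion visual detection probability on the logit scale.
    Illustrative coefficients only; not fitted to the Guam data."""
    z = size_cm / 100.0                      # rescale size
    logit = -2.5 + 1.2 * z - 0.6 * z ** 2    # quadratic size effect
    if is_female:
        logit -= 0.4                         # lower detectability for females
    return 1.0 / (1.0 + math.exp(-logit))

for size in (50, 100, 150):
    print(size, round(detection_prob(size, False), 3),
          round(detection_prob(size, True), 3))
```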

  19. Oil spill contamination probability in the southeastern Levantine basin.

    Science.gov (United States)

    Goldman, Ron; Biton, Eli; Brokovich, Eran; Kark, Salit; Levin, Noam

    2015-02-15

    Recent gas discoveries in the eastern Mediterranean Sea led to multiple operations with substantial economic interest, and with them there is a risk of oil spills and their potential environmental impacts. To examine the potential spatial distribution of this threat, we created seasonal maps of the probability of oil spill pollution reaching an area in the Israeli coastal and exclusive economic zones, given knowledge of its initial sources. We performed simulations of virtual oil spills using realistic atmospheric and oceanic conditions. The resulting maps show dominance of the alongshore northerly current, which causes the high probability areas to be stretched parallel to the coast, increasing contamination probability downstream of source points. The seasonal westerly wind forcing determines how wide the high probability areas are, and may also restrict these to a small coastal region near source points. Seasonal variability in probability distribution, oil state, and pollution time is also discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. Bayesian modeling and inference for diagnostic accuracy and probability of disease based on multiple diagnostic biomarkers with and without a perfect reference standard.

    Science.gov (United States)

    Jafarzadeh, S Reza; Johnson, Wesley O; Gardner, Ian A

    2016-03-15

    The area under the receiver operating characteristic (ROC) curve (AUC) is used as a performance metric for quantitative tests. Although multiple biomarkers may be available for diagnostic or screening purposes, diagnostic accuracy is often assessed individually rather than in combination. In this paper, we consider the interesting problem of combining multiple biomarkers for use in a single diagnostic criterion with the goal of improving the diagnostic accuracy above that of an individual biomarker. The diagnostic criterion created from multiple biomarkers is based on the predictive probability of disease, conditional on given multiple biomarker outcomes. If the computed predictive probability exceeds a specified cutoff, the corresponding subject is allocated as 'diseased'. This defines a standard diagnostic criterion that has its own ROC curve, namely, the combined ROC (cROC). The AUC metric for cROC, namely, the combined AUC (cAUC), is used to compare the predictive criterion based on multiple biomarkers to one based on fewer biomarkers. A multivariate random-effects model is proposed for modeling multiple normally distributed dependent scores. Bayesian methods for estimating ROC curves and corresponding (marginal) AUCs are developed when a perfect reference standard is not available. In addition, cAUCs are computed to compare the accuracy of different combinations of biomarkers for diagnosis. The methods are evaluated using simulations and are applied to data for Johne's disease (paratuberculosis) in cattle. Copyright © 2015 John Wiley & Sons, Ltd.
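The AUC comparison at the core of the method can be illustrated with its Mann-Whitney form: the AUC equals the probability that a randomly chosen diseased subject scores above a randomly chosen non-diseased one. The scores below are made-up stand-ins for predictive probabilities from a single biomarker versus a combination:

```python
def auc(diseased, healthy):
    """Mann-Whitney estimate of the AUC: fraction of (diseased, healthy)
    pairs correctly ordered, counting ties as 1/2."""
    wins = sum(1.0 if d > h else 0.5 if d == h else 0.0
               for d in diseased for h in healthy)
    return wins / (len(diseased) * len(healthy))

single = auc([0.4, 0.6, 0.7, 0.5], [0.3, 0.5, 0.2, 0.4])
combined = auc([0.6, 0.8, 0.9, 0.7], [0.3, 0.5, 0.2, 0.4])
print(single, combined)   # combining biomarkers should raise the AUC
```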

  1. On a paradox of probability theory

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard's proposal concerning physical retrocausality has been shown to fail on two crucial points. However, it is argued that his proposal still merits serious attention. The argument arises from showing that his proposal reveals a paradox involving relations between conditional probabilities, statistical correlations and reciprocal causalities of the type exhibited by cooperative dynamics in physical systems. 4 refs. (Author)

  2. Study of fusion probabilities with halo nuclei using different proximity based potentials

    International Nuclear Information System (INIS)

    Kumari, Raj

    2013-01-01

    We study the fusion of halo nuclei with heavy targets using proximity-based potentials due to Aage Winther (AW) 95, Bass 80 and Proximity 2010. In order to account for the extended matter distribution of halo nuclei, radii borrowed from cross-section measurements are included in these potentials. Our study reveals that the barrier heights are effectively reduced and fusion cross sections are appreciably enhanced by including the extended radii of these nuclei. We also find that the extended sizes of halos contribute towards enhancement of fusion probabilities in the case of proton halo nuclei, but contribute to transfer or break-up processes rather than fusion yield in the case of neutron halo nuclei.

  3. Survival modeling for the estimation of transition probabilities in model-based economic evaluations in the absence of individual patient data: a tutorial.

    Science.gov (United States)

    Diaby, Vakaramoko; Adunlin, Georges; Montero, Alberto J

    2014-02-01

Survival modeling techniques are increasingly being used as part of decision modeling for health economic evaluations. As many models are available, it is imperative for interested readers to know about the steps in selecting and using the most suitable ones. The objective of this paper is to propose a tutorial for the application of appropriate survival modeling techniques to estimate transition probabilities, for use in model-based economic evaluations, in the absence of individual patient data (IPD). An illustration of the use of the tutorial is provided based on the final progression-free survival (PFS) analysis of the BOLERO-2 trial in metastatic breast cancer (mBC). An algorithm was adopted from Guyot and colleagues, and was then run in the statistical package R to reconstruct IPD, based on the final PFS analysis of the BOLERO-2 trial. It should be emphasized that the reconstructed IPD represent an approximation of the original data. Afterwards, we fitted parametric models to the reconstructed IPD in the statistical package Stata. Both statistical and graphical tests were conducted to verify the relative and absolute validity of the findings. Finally, the equations for transition probabilities were derived using the general equation for transition probabilities used in model-based economic evaluations, and the parameters were estimated from fitted distributions. The results of the application of the tutorial suggest that the log-logistic model best fits the reconstructed data from the latest published Kaplan-Meier (KM) curves of the BOLERO-2 trial. Results from the regression analyses were confirmed graphically. An equation for transition probabilities was obtained for each arm of the BOLERO-2 trial. In this paper, a tutorial was proposed and used to estimate the transition probabilities for model-based economic evaluation, based on the results of the final PFS analysis of the BOLERO-2 trial in mBC. The results of our study can serve as a basis for any model
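The general equation for transition probabilities referred to above is tp(t) = 1 − S(t + u)/S(t) for a model cycle of length u. A minimal sketch with a log-logistic survival function — the parameter values are hypothetical, not those fitted to the BOLERO-2 data:

```python
import math

def loglogistic_survival(t, lam=0.08, gamma=1.3):
    """Log-logistic survival S(t) = 1 / (1 + (lam*t)^gamma); lam and gamma are
    hypothetical stand-ins for parameters fitted to reconstructed IPD."""
    return 1.0 / (1.0 + (lam * t) ** gamma)

def transition_prob(t, cycle, survival=loglogistic_survival):
    """Per-cycle transition probability used in state-transition models:
    tp(t) = 1 - S(t + cycle) / S(t)."""
    return 1.0 - survival(t + cycle) / survival(t)

# Monthly transition probabilities out of the progression-free state at
# months 0, 12 and 24 (time-dependent because the hazard is non-constant).
tps = [transition_prob(t, 1.0) for t in (0.0, 12.0, 24.0)]
print(tps)
```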

  4. Plyometric Training Improves Sprinting, Jumping and Throwing Capacities of High Level Female Volleyball Players Better Than Skill-Based Conditioning

    Science.gov (United States)

    Gjinovci, Bahri; Idrizovic, Kemal; Uljevic, Ognjen; Sekulic, Damir

    2017-01-01

There is an evident lack of studies on the effectiveness of plyometric- and skill-based-conditioning in volleyball. This study aimed to evaluate effects of 12-week plyometric- and volleyball-skill-based training on specific conditioning abilities in female volleyball players. The sample included 41 high-level female volleyball players (21.8 ± 2.1 years of age; 1.76 ± 0.06 m; 60.8 ± 7.0 kg), who participated in a plyometric- (n = 21) or skill-based-conditioning program (n = 20). Both programs were performed twice per week. Participants were tested on body-height, body-mass (BM), countermovement jump (CMJ), standing broad jump (SBJ), medicine ball throw (MBT), and 20-m sprint (S20M). All tests were assessed at the study baseline (pre-) and at the end of the 12-week programs (post-testing). Two-way ANOVA for repeated measurements showed significant (p < 0.05) improvements in these female volleyball players. Future studies should evaluate differential program effects in less experienced and younger players. Key points Plyometric- and skill-based-conditioning resulted in improvements in jumping and throwing capacities, but plyometric training additionally induced positive changes in anthropometrics and sprint-capacity The changes induced by plyometric training were larger in magnitude than those achieved by skill-based conditioning. The higher intensity, together with the possibility of more accurate adjustment of training load in plyometric training, is probably the most important determinant of this differential influence. It is likely that the skill-based conditioning program did not result in changes of higher magnitude because of the players’ familiarity with volleyball-related skills. PMID:29238253

  5. Waste Package Misload Probability

    International Nuclear Information System (INIS)

    Knudsen, J.K.

    2001-01-01

The objective of this calculation is to calculate the probability of occurrence for fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a

  6. Probability-Based Ship Design Procedures: A Demonstration. Phase 1.

    Science.gov (United States)

    1992-09-01

Although actuarially speaking, this should refer to the probability that the structure catastrophically fails, the term is generally and widely used as a...termed "empty", while otherwise they are called "qualified" upcrossings, a terminology devised by Vanmarcke (ASME, J. Applied Mechanics, March 1975

  7. Ignition probabilities for Compact Ignition Tokamak designs

    International Nuclear Information System (INIS)

    Stotler, D.P.; Goldston, R.J.

    1989-09-01

A global power balance code employing Monte Carlo techniques has been developed to study the ''probability of ignition'' and has been applied to several different configurations of the Compact Ignition Tokamak (CIT). Probability distributions for the critical physics parameters in the code were estimated using existing experimental data. This included a statistical evaluation of the uncertainty in extrapolating the energy confinement time. A substantial probability of ignition is predicted for CIT if peaked density profiles can be achieved or if one of the two higher plasma current configurations is employed. In other cases, values of the energy multiplication factor Q of order 10 are generally obtained. The Ignitor-U and ARIES designs are also examined briefly. Comparisons of our empirically based confinement assumptions with two theory-based transport models yield conflicting results. 41 refs., 11 figs
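The Monte Carlo approach described — sampling uncertain physics parameters from estimated distributions and counting the fraction of sampled configurations that ignite — can be illustrated with a deliberately toy power-balance surrogate. The functional form, threshold, and distributions below are invented for illustration only:

```python
import random

random.seed(1)

def ignites(h_factor, peaking):
    """Toy power-balance surrogate: declare 'ignition' when a made-up fusion
    gain Q, increasing in confinement multiplier and density peaking, exceeds
    a large threshold (Q >= 50 as a stand-in for an ignited plasma)."""
    q = 10.0 * h_factor ** 2 * peaking
    return q >= 50.0

trials = 100_000
hits = 0
for _ in range(trials):
    h = random.lognormvariate(0.0, 0.2)  # uncertain confinement multiplier
    p = random.uniform(1.0, 2.0)         # uncertain density peaking factor
    if ignites(h, p):
        hits += 1

p_ignition = hits / trials  # Monte Carlo estimate of the probability of ignition
print(p_ignition)
```

The estimate is simply a hit fraction; the real code replaces the surrogate with a full global power balance and data-driven parameter distributions.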

  8. Sampling the stream landscape: Improving the applicability of an ecoregion-level capture probability model for stream fishes

    Science.gov (United States)

    Mollenhauer, Robert; Mouser, Joshua B.; Brewer, Shannon K.

    2018-01-01

Temporal and spatial variability in streams results in heterogeneous gear capture probability (i.e., the proportion of available individuals identified) that confounds interpretation of data used to monitor fish abundance. We modeled tow-barge electrofishing capture probability at multiple spatial scales for nine Ozark Highland stream fishes. In addition to fish size, we identified seven reach-scale environmental characteristics associated with variable capture probability: stream discharge, water depth, conductivity, water clarity, emergent vegetation, wetted width–depth ratio, and proportion of riffle habitat. The magnitude of the relationship between capture probability and both discharge and depth varied among stream fishes. We also identified lithological characteristics among stream segments as a coarse-scale source of variable capture probability. The resulting capture probability model can be used to adjust catch data and derive reach-scale absolute abundance estimates across a wide range of sampling conditions, with effort similar to that used in more traditional fisheries surveys (i.e., catch per unit effort). Adjusting catch data based on variable capture probability improves the comparability of data sets, thus promoting both well-informed conservation and management decisions and advances in stream-fish ecology.
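Adjusting catch data by a modeled capture probability amounts to dividing the count by the estimated proportion of individuals captured (N̂ = C/p̂). A minimal sketch with hypothetical numbers:

```python
def adjusted_abundance(catch, capture_prob):
    """Estimate reach-scale absolute abundance by dividing catch by the
    modeled capture probability (N-hat = C / p-hat); values are hypothetical."""
    if not 0.0 < capture_prob <= 1.0:
        raise ValueError("capture probability must be in (0, 1]")
    return catch / capture_prob

# Same raw catch under different conditions: a low capture probability
# (e.g., a deep, turbid reach) implies more fish were present than counted.
print(adjusted_abundance(30, 0.6))  # favorable sampling conditions
print(adjusted_abundance(30, 0.2))  # unfavorable sampling conditions
```

This is why identical catches from two reaches are not directly comparable until each is corrected by its own modeled capture probability.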

  9. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique.The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc, all wrapped up in the scientific method.See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698)* Incorporates more than 1,000 engaging problems with answers* Includes more than 300 solved examples* Uses varied problem solving methods

  10. Scenario based optimization of a container vessel with respect to its projected operating conditions

    Science.gov (United States)

    Wagner, Jonas; Binkowski, Eva; Bronsart, Robert

    2014-06-01

In this paper the scenario based optimization of the bulbous bow of the KRISO Container Ship (KCS) is presented. The optimization of the parametrically modeled vessel is based on a statistically developed operational profile, generated from noon-to-noon reports of a comparable 3600 TEU container vessel, and on specific development functions representing the growth of the global economy during the vessel's service time. In order to consider uncertainties, statistical fluctuations are added. An analysis of these data leads to a number of most probable upcoming operating conditions (OC) the vessel will encounter in the future. According to their respective likelihood, an objective function for the evaluation of the optimal design variant of the vessel is derived and implemented within the parametrical optimization workbench FRIENDSHIP Framework. This evaluation is done with respect to the vessel's calculated effective power, based on the use of a potential flow code. The evaluation shows that the use of scenarios within the optimization process has a strong influence on the hull form.

  11. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  12. Models for probability and statistical inference theory and applications

    CERN Document Server

    Stapleton, James H

    2007-01-01

    This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readersModels for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping.Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...

  13. 28 CFR 2.101 - Probable cause hearing and determination.

    Science.gov (United States)

    2010-07-01

    ... who have given information upon which revocation may be based) at a postponed probable cause hearing... attendance, unless good cause is found for not allowing confrontation. Whenever a probable cause hearing is...

  14. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
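One classic instance is the derangement problem: the probability that a uniformly random permutation of n items has no fixed point tends to 1/e as n grows. A quick simulation (the framing is a standard example of this family of problems, not necessarily one of the three in the article):

```python
import math
import random

random.seed(7)

def no_fixed_point(n):
    """True when a uniformly random permutation of range(n) has no fixed point."""
    perm = list(range(n))
    random.shuffle(perm)
    return all(perm[i] != i for i in range(n))

trials = 100_000
p_hat = sum(no_fixed_point(10) for _ in range(trials)) / trials
print(p_hat, 1 / math.e)  # both close to 0.368
```

Even at n = 10 the derangement probability already agrees with 1/e to about four decimal places, which is why 1/e shows up so often in such problems.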

  15. Optimal Power Allocation Strategy in a Joint Bistatic Radar and Communication System Based on Low Probability of Intercept.

    Science.gov (United States)

    Shi, Chenguang; Wang, Fei; Salous, Sana; Zhou, Jianjiang

    2017-11-25

    In this paper, we investigate a low probability of intercept (LPI)-based optimal power allocation strategy for a joint bistatic radar and communication system, which is composed of a dedicated transmitter, a radar receiver, and a communication receiver. The joint system is capable of fulfilling the requirements of both radar and communications simultaneously. First, assuming that the signal-to-noise ratio (SNR) corresponding to the target surveillance path is much weaker than that corresponding to the line of sight path at radar receiver, the analytically closed-form expression for the probability of false alarm is calculated, whereas the closed-form expression for the probability of detection is not analytically tractable and is approximated due to the fact that the received signals are not zero-mean Gaussian under target presence hypothesis. Then, an LPI-based optimal power allocation strategy is presented to minimize the total transmission power for information signal and radar waveform, which is constrained by a specified information rate for the communication receiver and the desired probabilities of detection and false alarm for the radar receiver. The well-known bisection search method is employed to solve the resulting constrained optimization problem. Finally, numerical simulations are provided to reveal the effects of several system parameters on the power allocation results. It is also demonstrated that the LPI performance of the joint bistatic radar and communication system can be markedly improved by utilizing the proposed scheme.
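The bisection search mentioned relies on feasibility being monotone in transmitted power: once the power is high enough to satisfy the detection and information-rate constraints, any higher power also satisfies them, so the minimum feasible power can be bracketed and halved. A generic sketch with a hypothetical detection-probability curve standing in for the paper's radar model:

```python
import math

def min_power_bisection(feasible, lo, hi, tol=1e-6):
    """Bisection for the smallest power level at which `feasible` holds,
    assuming feasibility is monotone in power and feasible(hi) is True."""
    assert not feasible(lo) and feasible(hi)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if feasible(mid):
            hi = mid
        else:
            lo = mid
    return hi

# Hypothetical stand-in: detection probability grows with power; require >= 0.9.
detection = lambda p: 1.0 - math.exp(-0.5 * p)
p_min = min_power_bisection(lambda p: detection(p) >= 0.9, 0.0, 100.0)
print(p_min)  # ~ 2*ln(10) = 4.6052
```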

  16. Don't make cache too complex: A simple probability-based cache management scheme for SSDs.

    Directory of Open Access Journals (Sweden)

    Seungjae Baek

Full Text Available Solid-state drives (SSDs) have recently become a common storage component in computer systems, and they are fueled by continued bit cost reductions achieved with smaller feature sizes and multiple-level cell technologies. However, as the flash memory stores more bits per cell, the performance and reliability of the flash memory degrade substantially. To solve this problem, a fast non-volatile memory (NVM)-based cache has been employed within SSDs to reduce the long latency required to write data. Absorbing small writes in a fast NVM cache can also reduce the number of flash memory erase operations. To maximize the benefits of an NVM cache, it is important to increase the NVM cache utilization. In this paper, we propose and study ProCache, a simple NVM cache management scheme that makes cache-entrance decisions based on random probability testing. Our scheme is motivated by the observation that frequently written hot data will eventually enter the cache with a high probability, and that infrequently accessed cold data will not enter the cache easily. Owing to its simplicity, ProCache is easy to implement at a substantially smaller cost than similar previously studied techniques. We evaluate ProCache and conclude that it achieves performance comparable to that of a more complex reference-counter-based cache-management scheme.
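A cache-entrance decision by random probability testing can be sketched in a few lines; the class name, parameters, and eviction policy below are hypothetical illustrations, not ProCache's actual design. A key written k times is admitted with probability 1 − (1 − p)^k, so hot keys almost surely enter while cold keys mostly stay out:

```python
import random

class ProbAdmitCache:
    """Sketch of probability-based cache admission: an item enters the cache
    only if a coin flip with probability p_admit succeeds."""

    def __init__(self, capacity, p_admit, seed=0):
        self.capacity, self.p_admit = capacity, p_admit
        self.cache = {}  # key -> value; insertion order used for eviction
        self.rng = random.Random(seed)

    def put(self, key, value):
        if key in self.cache:          # already cached: just update in place
            self.cache[key] = value
            return
        if self.rng.random() < self.p_admit:
            if len(self.cache) >= self.capacity:
                self.cache.pop(next(iter(self.cache)))  # evict oldest entry
            self.cache[key] = value

cache = ProbAdmitCache(capacity=2000, p_admit=0.2, seed=1)
for _ in range(50):
    cache.put("hot", 1)            # hot key: admitted w.p. 1 - 0.8**50
for k in range(1000):
    cache.put(f"cold-{k}", k)      # cold keys: each admitted w.p. 0.2
cold_admitted = sum(1 for key in cache.cache if key.startswith("cold"))
print("hot" in cache.cache, cold_admitted)
```

No per-key metadata is kept, which is the scheme's selling point over reference-counter-based management.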

  17. Preservice Elementary Teachers and the Fundamentals of Probability

    Science.gov (United States)

    Dollard, Clark

    2011-01-01

    This study examined how preservice elementary teachers think about situations involving probability. Twenty-four preservice elementary teachers who had not yet studied probability as part of their preservice elementary mathematics coursework were interviewed using a task-based interview. The participants' responses showed a wide variety of…

  18. A generalized electron energy probability function for inductively coupled plasmas under conditions of nonlocal electron kinetics

    Science.gov (United States)

    Mouchtouris, S.; Kokkoris, G.

    2018-01-01

A generalized equation for the electron energy probability function (EEPF) of inductively coupled Ar plasmas is proposed under conditions of nonlocal electron kinetics and diffusive cooling. The proposed equation describes the local EEPF in a discharge and the independent variable is the kinetic energy of electrons. The EEPF consists of a bulk and a depleted tail part and incorporates the effect of the plasma potential, Vp, and pressure. Due to diffusive cooling, the break point of the EEPF is eVp. The pressure alters the shape of the bulk and the slope of the tail part. The parameters of the proposed EEPF are extracted by fitting to measured EEPFs (at one point in the reactor) at different pressures. By coupling the proposed EEPF with a hybrid plasma model, measurements in the gaseous electronics conference reference reactor concerning (a) the electron density and temperature and the plasma potential, either spatially resolved or at different pressure (10-50 mTorr) and power, and (b) the ion current density at the electrode, are well reproduced. The effect of the choice of the EEPF on the results is investigated by a comparison to an EEPF coming from the Boltzmann equation (local electron kinetics approach) and to a Maxwellian EEPF. The accuracy of the results and the fact that the proposed EEPF is predefined render its use a reliable alternative, with a low computational cost, compared to stochastic electron kinetic models at low pressure conditions, which can be extended to other gases and/or different electron heating mechanisms.

  19. Static Three-Dimensional Fuzzy Routing Based on the Receiving Probability in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Sohrab Khanmohammadi

    2013-11-01

Full Text Available A Wireless Sensor Network (WSN) is a collection of low-cost, low-power and large-scale wireless sensor nodes. Routing protocols are an important topic in WSNs. Every sensor node should use a proper mechanism to transmit the generated packets to its destination, usually a base station. In previous works, routing protocols use the global information of the network, which causes redundant packets to increase. Moreover, it leads to an increase in network traffic, a decrease in the delivery ratio of data packets, and a reduction in network lifetime. In this paper, we propose a new inferential routing protocol called SFRRP (Static Three-Dimensional Fuzzy Routing based on the Receiving Probability). The proposed protocol solves the above mentioned problems considerably. The data packets are transmitted by hop-to-hop delivery to the base station. It uses a fuzzy procedure to transmit the sensed data or the buffered data packets to one of the neighbors, called the selected node. In the proposed fuzzy system, the distance and number of neighbors are input variables, while the receiving probability is the output variable. SFRRP uses only the local neighborhood information to forward packets and does not need any redundant packets for route discovery. The proposed protocol has some advantages such as a high delivery ratio, lower delay, longer network lifetime, and less network traffic. The performance of the proposed protocol surpasses that of the Flooding routing protocol in terms of delivery ratio, delay time and network lifetime.

  20. The probability of containment failure by direct containment heating in zion

    International Nuclear Information System (INIS)

    Pilch, M.M.; Yan, H.; Theofanous, T.G.

    1994-01-01

This report is the first step in the resolution of the Direct Containment Heating (DCH) issue for the Zion Nuclear Power Plant using the Risk Oriented Accident Analysis Methodology (ROAAM). This report includes the definition of a probabilistic framework that decomposes the DCH problem into three probability density functions that reflect the most uncertain initial conditions (UO2 mass, zirconium oxidation fraction, and steel mass). Uncertainties in the initial conditions are significant, but the quantification approach is based on establishing reasonable bounds that are not unnecessarily conservative. To this end, the authors also make use of the ROAAM ideas of enveloping scenarios and ''splintering''. Two causal relations (CRs) are used in this framework: CR1 is a model that calculates the peak pressure in the containment as a function of the initial conditions, and CR2 is a model that returns the frequency of containment failure as a function of pressure within the containment. Uncertainty in CR1 is accounted for by the use of two independently developed phenomenological models, the Convection Limited Containment Heating (CLCH) model and the Two-Cell Equilibrium (TCE) model, and by probabilistically distributing the key parameter in both, which is the ratio of the melt entrainment time to the system blowdown time constant. The two phenomenological models have been compared with an extensive data base including recent integral simulations at two different physical scales (1/10th scale in the Surtsey facility at Sandia National Laboratories and 1/40th scale in the COREXIT facility at Argonne National Laboratory). The loads predicted by these models were significantly lower than those from previous parametric calculations. The containment load distributions do not intersect the containment strength curve in any significant way, resulting in containment failure probabilities of less than 10⁻³ for all scenarios considered
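The framework's structure — propagate distributions over the uncertain initial conditions through a load model (CR1) and a fragility curve (CR2) to obtain a failure probability — can be sketched by Monte Carlo. Every functional form, parameter, and number below is invented for illustration and is not from the Zion analysis:

```python
import math
import random

random.seed(0)

def peak_pressure(uo2_mass, zr_fraction, steel_mass):
    """Toy stand-in for CR1: peak containment pressure (MPa) as a made-up
    increasing function of the three uncertain initial conditions."""
    return 0.3 + 0.004 * uo2_mass + 0.5 * zr_fraction + 0.001 * steel_mass

def failure_frequency(pressure):
    """Toy stand-in for CR2: containment fragility as a smooth step around
    a hypothetical 1.0 MPa failure pressure."""
    return 1.0 / (1.0 + math.exp(-(pressure - 1.0) / 0.05))

trials = 50_000
acc = 0.0
for _ in range(trials):
    uo2 = random.uniform(20.0, 60.0)      # uncertain UO2 mass (tonnes)
    zr = random.uniform(0.2, 0.6)         # uncertain Zr oxidation fraction
    steel = random.uniform(50.0, 150.0)   # uncertain steel mass (tonnes)
    acc += failure_frequency(peak_pressure(uo2, zr, steel))

p_fail = acc / trials  # mean failure frequency over the initial-condition space
print(p_fail)
```

When the load distribution sits well below the fragility curve, as in the report's conclusion, this average comes out very small.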

  1. Optimize the Coverage Probability of Prediction Interval for Anomaly Detection of Sensor-Based Monitoring Series

    Directory of Open Access Journals (Sweden)

    Jingyue Pang

    2018-03-01

Full Text Available Effective anomaly detection of sensing data is essential for identifying potential system failures. Because they require no prior knowledge or accumulated labels, and provide uncertainty presentation, probability prediction methods (e.g., Gaussian process regression (GPR) and relevance vector machine (RVM)) are especially adaptable to perform anomaly detection for sensing series. Generally, one key parameter of prediction models is the coverage probability (CP), which controls the judging threshold of the testing sample and is generally set to a default value (e.g., 90% or 95%). There are few criteria to determine the optimal CP for anomaly detection. Therefore, this paper designs a graphic indicator of the receiver operating characteristic curve of prediction interval (ROC-PI) based on the definition of the ROC curve, which can depict the trade-off between the PI width and PI coverage probability across a series of cut-off points. Furthermore, the Youden index is modified to assess the performance of different CPs, the minimization of which yields the optimal CP via the simulated annealing (SA) algorithm. Experiments conducted on two simulation datasets demonstrate the validity of the proposed method. In particular, an actual case study on sensing series from an on-orbit satellite illustrates its significant performance in practical application.
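The classical Youden index J = sensitivity + specificity − 1, whose optimization picks a cut-off on the ROC curve, can be sketched directly. The paper modifies this index and searches over CP values with simulated annealing; the sketch below shows only the underlying criterion, on hypothetical anomaly scores:

```python
def youden_optimal_cutoff(cases, controls, cutoffs):
    """Return the cutoff maximizing Youden's J = sensitivity + specificity - 1
    over a candidate grid, together with the achieved J."""
    best = None
    for c in cutoffs:
        sens = sum(x > c for x in cases) / len(cases)      # true positive rate
        spec = sum(x <= c for x in controls) / len(controls)  # true negative rate
        j = sens + spec - 1.0
        if best is None or j > best[1]:
            best = (c, j)
    return best

# Hypothetical scores for anomalous (cases) and normal (controls) samples.
cases = [2.8, 3.1, 3.5, 4.0, 4.2, 5.0]
controls = [1.0, 1.5, 2.0, 2.2, 2.9, 3.0]
cutoff, j = youden_optimal_cutoff(cases, controls, [x / 10 for x in range(10, 51)])
print(cutoff, j)
```

A grid search suffices here; the paper resorts to simulated annealing because its modified index is evaluated through the prediction model rather than on raw scores.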

  2. Exact solutions and symmetry analysis for the limiting probability distribution of quantum walks

    International Nuclear Information System (INIS)

    Xu, Xin-Ping; Ide, Yusuke

    2016-01-01

    In the literature, there are numerous studies of one-dimensional discrete-time quantum walks (DTQWs) using a moving shift operator. However, there is no exact solution for the limiting probability distributions of DTQWs on cycles using a general coin or swapping shift operator. In this paper, we derive exact solutions for the limiting probability distribution of quantum walks using a general coin and swapping shift operator on cycles for the first time. Based on the exact solutions, we show how to generate symmetric quantum walks and determine the condition under which a symmetric quantum walk appears. Our results suggest that choosing various coin and initial state parameters can achieve a symmetric quantum walk. By defining a quantity to measure the variation of symmetry, deviation and mixing time of symmetric quantum walks are also investigated.

  3. Exact solutions and symmetry analysis for the limiting probability distribution of quantum walks

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Xin-Ping, E-mail: xuxp@mail.ihep.ac.cn [School of Physical Science and Technology, Soochow University, Suzhou 215006 (China); Ide, Yusuke [Department of Information Systems Creation, Faculty of Engineering, Kanagawa University, Yokohama, Kanagawa, 221-8686 (Japan)

    2016-10-15

    In the literature, there are numerous studies of one-dimensional discrete-time quantum walks (DTQWs) using a moving shift operator. However, there is no exact solution for the limiting probability distributions of DTQWs on cycles using a general coin or swapping shift operator. In this paper, we derive exact solutions for the limiting probability distribution of quantum walks using a general coin and swapping shift operator on cycles for the first time. Based on the exact solutions, we show how to generate symmetric quantum walks and determine the condition under which a symmetric quantum walk appears. Our results suggest that choosing various coin and initial state parameters can achieve a symmetric quantum walk. By defining a quantity to measure the variation of symmetry, deviation and mixing time of symmetric quantum walks are also investigated.

  4. Probability Theory, Not the Very Guide of Life

    Science.gov (United States)

    Juslin, Peter; Nilsson, Hakan; Winman, Anders

    2009-01-01

    Probability theory has long been taken as the self-evident norm against which to evaluate inductive reasoning, and classical demonstrations of violations of this norm include the conjunction error and base-rate neglect. Many of these phenomena require multiplicative probability integration, whereas people seem more inclined to linear additive…

  5. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)
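The post-processing step described — turning many equally likely simulations into a probability map — is just the per-parcel fraction of realizations exceeding the threshold. A minimal sketch with hypothetical concentration values:

```python
def exceedance_probability(simulations, threshold):
    """Post-process equally likely geostatistical realizations into a map of
    the probability that each parcel exceeds a contamination threshold."""
    n = len(simulations)
    cells = len(simulations[0])
    return [sum(sim[i] > threshold for sim in simulations) / n
            for i in range(cells)]

# Hypothetical: four simulated realizations of concentration over three parcels.
sims = [
    [35.0, 80.0, 10.0],
    [42.0, 75.0, 12.0],
    [28.0, 90.0, 30.0],
    [55.0, 60.0,  8.0],
]
print(exceedance_probability(sims, 50.0))  # -> [0.25, 1.0, 0.0]
```

In practice hundreds of realizations are used per parcel, so the fractions resolve probabilities finely enough to compare against clean-up thresholds.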

  6. Probability mapping of contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).

  7. Confirmation of the Conditional Outdoor Leadership Theory.

    Science.gov (United States)

    Dixon, Tim; Priest, Simon

    1991-01-01

    Responses of 75 expert outdoor leaders from Canada and the United States concerning leadership in 12 hypothetical backpacking scenarios provided partial support for a theory that predicted probability of leadership style (democratic, autocratic, or abdicratic) based on favorability of conditions, task orientation, and relationship orientation.…

  8. Probabilities for profitable fungicide use against gray leaf spot in hybrid maize.

    Science.gov (United States)

    Munkvold, G P; Martinson, C A; Shriver, J M; Dixon, P M

    2001-05-01

Gray leaf spot, caused by the fungus Cercospora zeae-maydis, causes considerable yield losses in hybrid maize grown in the north-central United States and elsewhere. Nonchemical management tactics have not adequately prevented these losses. The probability of profitably using fungicide application as a management tool for gray leaf spot was evaluated in 10 field experiments under conditions of natural inoculum in Iowa. Gray leaf spot severity in untreated control plots ranged from 2.6 to 72.8% for the ear leaf and from 3.0 to 7.7 (1 to 9 scale) for whole-plot ratings. In each experiment, fungicide applications with propiconazole or mancozeb significantly reduced gray leaf spot severity. Fungicide treatment significantly (P < 0.05) affected yield, and we calculated the probability of achieving a positive net return with one or two propiconazole applications, based on the mean yields and standard deviations for treated and untreated plots, the price of grain, and the costs of the fungicide applications. For one application, the probability ranged from approximately 0.06 to more than 0.99, and exceeded 0.50 in six of nine scenarios (specific experiment/hybrid). The highest probabilities occurred in the 1995 experiments with the most susceptible hybrid. Probabilities were almost always higher for a single application of propiconazole than for two applications. These results indicate that a single application of propiconazole frequently can be profitable for gray leaf spot management in Iowa, but the probability of a profitable application is strongly influenced by hybrid susceptibility. The calculation of probabilities for positive net returns was more informative than mean separation in terms of assessing the economic success of the fungicide applications.
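A probability of positive net return computed from mean yields and standard deviations, as described, reduces to a normal-tail calculation. The dollar figures below are hypothetical, not values from the study:

```python
import math

def p_positive_net_return(mean_gain, sd_gain, cost):
    """P(net return > 0) assuming the treated-minus-untreated yield gain,
    in revenue terms, is normally distributed (a common simplification)."""
    z = (mean_gain - cost) / sd_gain
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF

# Hypothetical: gain worth $25/acre on average (sd $20) vs. a $15/acre spray.
print(round(p_positive_net_return(25.0, 20.0, 15.0), 3))  # -> 0.691
```

Under this framing, a second application raises the cost term without a proportional gain, which is consistent with the lower probabilities reported for two applications.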

  9. Strategy evolution driven by switching probabilities in structured multi-agent systems

    Science.gov (United States)

    Zhang, Jianlei; Chen, Zengqiang; Li, Zhiqi

    2017-10-01

The evolutionary mechanism driving the commonly observed cooperation among unrelated individuals is puzzling. Models for evolutionary games on graphs traditionally assume that players imitate their more successful neighbours with higher pay-offs. An implicit assumption here is that players are always able to acquire the required pay-off information. To relax this restrictive assumption, a contact-based model has been proposed, in which switching probabilities between strategies drive the strategy evolution. However, the explicit, quantified relation between a player's switching probability for her strategies and the number of her neighbours remains unknown. This is especially important in heterogeneously structured systems, where players may differ in their numbers of neighbours. Focusing on this, we present an augmented model that introduces an attenuation coefficient and evaluate its influence on the evolutionary dynamics. Results show that an individual's influence on others is negatively correlated with the contact numbers specified by the network topology. Results further provide the conditions under which the coexisting strategies can be calculated analytically.

  10. Plant calendar pattern based on rainfall forecast and the probability of its success in Deli Serdang regency of Indonesia

    Science.gov (United States)

    Darnius, O.; Sitorus, S.

    2018-03-01

The objective of this study was to determine the plant calendar pattern of three types of crops, namely palawija, rice, and banana, based on rainfall in Deli Serdang Regency. In the first stage, we forecasted rainfall using time series analysis and obtained an appropriate ARIMA(1,0,0)(1,1,1)₁₂ model. Based on the forecast results, we designed a plant calendar pattern for the three types of plant. Furthermore, the probability of success for plantings that follow the calendar pattern was calculated using a Markov process, discretizing the continuous rainfall data into three categories, namely Below Normal (BN), Normal (N), and Above Normal (AN), to form the transition probability matrix. Finally, the combination of the rainfall forecasting model and the Markov process was used to determine the cropping calendar pattern and the probability of success for the three crops. This research used rainfall data for Deli Serdang Regency taken from the office of BMKG (Meteorology, Climatology, and Geophysics Agency), Sampali Medan, Indonesia.
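The Markov step described above amounts to estimating a 3x3 transition matrix from a discretized rainfall series. A minimal sketch in Python (the category sequence here is invented for illustration, not the Deli Serdang data):

```python
# Hypothetical sequence of monthly rainfall categories:
# Below Normal (BN), Normal (N), Above Normal (AN).
sequence = ["N", "AN", "N", "BN", "N", "N", "AN", "AN", "N", "BN", "BN", "N"]
states = ["BN", "N", "AN"]

# Count transitions between consecutive months.
counts = {s: {t: 0 for t in states} for s in states}
for a, b in zip(sequence, sequence[1:]):
    counts[a][b] += 1

# Row-normalise the counts into a transition probability matrix.
matrix = {}
for s in states:
    total = sum(counts[s].values())
    matrix[s] = {t: (counts[s][t] / total if total else 0.0) for t in states}
```

Each row of `matrix` sums to 1 and gives the probability of next month's category given the current one; powers of this matrix give multi-month-ahead probabilities.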

  11. Pemodelan Markov Switching Dengan Time-varying Transition Probability

    OpenAIRE

    Savitri, Anggita Puri; Warsito, Budi; Rahmawati, Rita

    2016-01-01

Exchange rate or currency is an economic variable which reflects a country's state of economy. It fluctuates over time because of its ability to switch the condition or regime caused by economic and political factors. The changes in the exchange rate are depreciation and appreciation. Therefore, it can be modeled using Markov Switching with Time-Varying Transition Probability, which observes the conditional changes and uses an information variable. From this model, time-varying transition probabili...

  12. Thermal and mechanical quantitative sensory testing in Chinese patients with burning mouth syndrome--a probable neuropathic pain condition?

    Science.gov (United States)

    Mo, Xueyin; Zhang, Jinglu; Fan, Yuan; Svensson, Peter; Wang, Kelun

    2015-01-01

To explore the hypothesis that burning mouth syndrome (BMS) probably is a neuropathic pain condition, thermal and mechanical sensory and pain thresholds were tested and compared with age- and gender-matched control participants using a standardized battery of psychophysical techniques. Twenty-five BMS patients (men: 8, women: 17, age: 49.5 ± 11.4 years) and 19 age- and gender-matched healthy control participants were included. The cold detection threshold (CDT), warm detection threshold (WDT), cold pain threshold (CPT), heat pain threshold (HPT), mechanical detection threshold (MDT) and mechanical pain threshold (MPT), in accordance with the German Network of Neuropathic Pain guidelines, were measured at the following four sites: the dorsum of the left hand (hand), the skin at the mental foramen (chin), on the tip of the tongue (tongue), and the mucosa of the lower lip (lip). Statistical analysis was performed using ANOVA with repeated measures to compare the means within and between groups. Furthermore, Z-score profiles were generated, and exploratory correlation analyses between QST and clinical variables were performed. Two-tailed tests with a significance level of 5 % were used throughout. CDTs (P < 0.02) were significantly lower (less sensitivity) and HPTs (P < 0.001) were significantly higher (less sensitivity) at the tongue and lip in BMS patients compared to control participants. WDT (P = 0.007) was also significantly higher at the tongue in BMS patients compared to control subjects. There were no significant differences in MDT and MPT between the BMS patients and healthy subjects at any of the four test sites. Z-scores showed that significant loss of function can be identified for CDT (Z-scores = -0.9±1.1) and HPT (Z-scores = 1.5±0.4). There were no significant correlations between QST and clinical variables (pain intensity, duration, depression scores). BMS patients had a significant loss of thermal function but not

  13. Fixation Probabilities of Evolutionary Graphs Based on the Positions of New Appearing Mutants

    Directory of Open Access Journals (Sweden)

    Pei-ai Zhang

    2014-01-01

Full Text Available Evolutionary graph theory is a useful framework for implementing evolutionary dynamics on spatially structured populations. Calculating the fixation probability is usually treated as a Markov chain process, which is affected by the number of individuals, the fitness of the mutant, the game strategy, and the structure of the population. However, the position at which the new mutant appears is also important to its fixation probability, and that position is the emphasis here. A method is put forward to calculate the fixation probability of an evolutionary graph (EG) of a single level. Then, for a class of bilevel EGs, the fixation probabilities are calculated and some propositions are discussed. The conclusion shows that the bilevel EG is more stable than the corresponding one-rooted EG.
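For the simplest reference case, a well-mixed population (the complete, isothermal graph), the Moran-process fixation probability of a single mutant has a closed form. The sketch below uses that standard textbook formula, not the bilevel construction of the paper:

```python
def fixation_probability(r: float, n: int) -> float:
    """Fixation probability of a single mutant with relative fitness r
    in a Moran process on a well-mixed population of size n."""
    if r == 1.0:
        return 1.0 / n          # neutral mutant fixes with probability 1/n
    return (1 - 1 / r) / (1 - 1 / r ** n)

p_neutral = fixation_probability(1.0, 100)       # 1/100
p_advantageous = fixation_probability(1.1, 100)  # noticeably larger
```

For large n this tends to 1 - 1/r for advantageous mutants; graph structure (a star graph, for example) can amplify or suppress this baseline, which is exactly why the mutant's position matters.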

  14. Chaos optimization algorithms based on chaotic maps with different probability distribution and search speed for global optimization

    Science.gov (United States)

    Yang, Dixiong; Liu, Zhenjun; Zhou, Jilei

    2014-04-01

Chaos optimization algorithms (COAs) usually utilize a chaotic map such as the Logistic map to generate the pseudo-random numbers mapped to the design variables for global optimization. Many existing studies have indicated that COAs can escape from local minima more easily than classical stochastic optimization algorithms. This paper reveals the inherent mechanism behind the high efficiency and superior performance of COAs from a new perspective: the probability distribution property and the search speed of the chaotic sequences generated by different chaotic maps. The statistical property and search speed of chaotic sequences are represented by the probability density function (PDF) and the Lyapunov exponent, respectively. Meanwhile, the computational performances of hybrid chaos-BFGS algorithms based on eight one-dimensional chaotic maps with different PDFs and Lyapunov exponents are compared, where BFGS is a quasi-Newton method for local optimization. Moreover, several multimodal benchmark examples illustrate that the probability distribution property and search speed of chaotic sequences from different chaotic maps significantly affect the global searching capability and optimization efficiency of COAs. To achieve high efficiency, it is recommended to adopt an appropriate chaotic map that generates sequences with a uniform or nearly uniform probability distribution and a large Lyapunov exponent.
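As a concrete illustration of the two quantities the abstract discusses (not code from the paper), the Logistic map at r = 4 generates a chaotic sequence whose Lyapunov exponent can be estimated as the orbit average of log |f'(x)|; the estimate should approach ln 2:

```python
import math

def logistic_orbit(x0: float, n: int, r: float = 4.0) -> list[float]:
    """Iterate the Logistic map x -> r * x * (1 - x)."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

def lyapunov_estimate(xs: list[float], r: float = 4.0) -> float:
    """Orbit average of log|f'(x)| with f'(x) = r * (1 - 2x)."""
    return sum(math.log(abs(r * (1 - 2 * x))) for x in xs) / len(xs)

orbit = logistic_orbit(0.123, 50000)
lam = lyapunov_estimate(orbit)   # should be close to ln 2 for r = 4
```

The invariant density of this map piles probability up near 0 and 1 rather than spreading it uniformly, which is the kind of PDF property the paper compares across maps.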

  15. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned

  16. Allelic drop-out probabilities estimated by logistic regression

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Asplund, Maria

    2012-01-01

We discuss the model for estimating drop-out probabilities presented by Tvedebrink et al. [7] and the concerns that have been raised. The criticism of the model has demonstrated that the model is not perfect. However, the model is very useful for advanced forensic genetic work, where allelic drop-out...... is occurring. With this discussion, we hope to improve the drop-out model so that it can be used for practical forensic genetics and stimulate further discussions. We discuss how to estimate drop-out probabilities when using a varying number of PCR cycles and other experimental conditions....

  17. The Effect of High Frequency Pulse on the Discharge Probability in Micro EDM

    Science.gov (United States)

    Liu, Y.; Qu, Y.; Zhang, W.; Ma, F.; Sha, Z.; Wang, Y.; Rolfe, B.; Zhang, S.

    2017-12-01

High frequency pulses improve the machining efficiency of micro electric discharge machining (micro EDM), but they also change the micro EDM process. This paper focuses on the influence of the skin effect under high frequency pulses on energy distribution and transmission in micro EDM, and on this basis the rules governing the discharge probability on the electrode end face are analysed. Starting from the electrical discharge process under high frequency pulse conditions in micro EDM, COMSOL Multiphysics software is used to establish an energy transmission model of the micro electrode. The discharge energy distribution and transmission within the tool electrode under different pulse frequencies, electrical currents, and permeability situations are studied in order to obtain the distribution pattern of current density and electric field intensity on the electrode end face as the electrical parameters change. The electric field intensity distribution is regarded as the parameter governing discharge probability on the electrode end. Finally, MATLAB is used to fit the curve and obtain the distribution of the discharge probability over the electrode end face.

  18. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  19. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  20. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
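The core idea of the paper — regress the 0/1 response nonparametrically and read the fitted value as a probability — can be sketched with a tiny k-nearest-neighbour "probability machine" (toy data and a 1-D implementation of our own, not the paper's R packages):

```python
def knn_probability(x_train, y_train, x, k=5):
    """Estimate P(Y=1 | X=x) as the mean of the 0/1 responses
    of the k nearest training points (1-D inputs for simplicity)."""
    neighbors = sorted(zip(x_train, y_train), key=lambda p: abs(p[0] - x))[:k]
    return sum(y for _, y in neighbors) / k

# Toy data: P(Y=1) jumps from 0 to 1 at x = 0.5.
xs = [i / 20 for i in range(21)]
ys = [1 if x > 0.5 else 0 for x in xs]

p_low = knn_probability(xs, ys, 0.1)   # near 0
p_high = knn_probability(xs, ys, 0.9)  # near 1
```

Because the estimate is a local average of 0/1 responses, it always lies in [0, 1] and, for a consistent regression method, converges to the true conditional probability — the property the paper establishes for random forests and nearest neighbors.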

  1. Gravity and count probabilities in an expanding universe

    Science.gov (United States)

    Bouchet, Francois R.; Hernquist, Lars

    1992-01-01

    The time evolution of nonlinear clustering on large scales in cold dark matter, hot dark matter, and white noise models of the universe is investigated using N-body simulations performed with a tree code. Count probabilities in cubic cells are determined as functions of the cell size and the clustering state (redshift), and comparisons are made with various theoretical models. We isolate the features that appear to be the result of gravitational instability, those that depend on the initial conditions, and those that are likely a consequence of numerical limitations. More specifically, we study the development of skewness, kurtosis, and the fifth moment in relation to variance, the dependence of the void probability on time as well as on sparseness of sampling, and the overall shape of the count probability distribution. Implications of our results for theoretical and observational studies are discussed.

  2. Qubit-qutrit separability-probability ratios

    International Nuclear Information System (INIS)

    Slater, Paul B.

    2005-01-01

Paralleling our recent computationally intensive (quasi-Monte Carlo) work for the case N=4 (e-print quant-ph/0308037), we undertake the task for N=6 of computing to high numerical accuracy, the formulas of Sommers and Zyczkowski (e-print quant-ph/0304041) for the (N^2-1)-dimensional volume and (N^2-2)-dimensional hyperarea of the (separable and nonseparable) NxN density matrices, based on the Bures (minimal monotone) metric--and also their analogous formulas (e-print quant-ph/0302197) for the (nonmonotone) flat Hilbert-Schmidt metric. With the same 7 x 10^9 well-distributed ('low-discrepancy') sample points, we estimate the unknown volumes and hyperareas based on five additional (monotone) metrics of interest, including the Kubo-Mori and Wigner-Yanase. Further, we estimate all of these seven volume and seven hyperarea (unknown) quantities when restricted to the separable density matrices. The ratios of separable volumes (hyperareas) to separable plus nonseparable volumes (hyperareas) yield estimates of the separability probabilities of generically rank-6 (rank-5) density matrices. The (rank-6) separability probabilities obtained based on the 35-dimensional volumes appear to be--independently of the metric (each of the seven inducing Haar measure) employed--twice as large as those (rank-5 ones) based on the 34-dimensional hyperareas. (An additional estimate--33.9982--of the ratio of the rank-6 Hilbert-Schmidt separability probability to the rank-4 one is quite clearly close to integral too.) The doubling relationship also appears to hold for the N=4 case for the Hilbert-Schmidt metric, but not the others. We fit simple exact formulas to our estimates of the Hilbert-Schmidt separable volumes and hyperareas in both the N=4 and N=6 cases

  3. Slamming Simulations in a Conditional Wave

    DEFF Research Database (Denmark)

    Seng, Sopheak; Jensen, Jørgen Juncher

    2012-01-01

A study of slamming events in conditional waves is presented in this paper. The ship is sailing in head sea and the motion is solved for under the assumption of rigid body motion constrained to two degrees of freedom, i.e. heave and pitch. Based on a time domain non-linear strip theory most probable...

  4. United States streamflow probabilities based on forecasted La Nina, winter-spring 2000

    Science.gov (United States)

    Dettinger, M.D.; Cayan, D.R.; Redmond, K.T.

    1999-01-01

Although for the last 5 months the Tahiti-Darwin Southern Oscillation Index (SOI) has hovered close to normal, the "equatorial" SOI has remained in the La Niña category and predictions are calling for La Niña conditions this winter. In view of these predictions of continuing La Niña and as a direct extension of previous studies of the relations between El Niño-Southern Oscillation (ENSO) conditions and streamflow in the United States (e.g., Redmond and Koch, 1991; Cayan and Webb, 1992; Redmond and Cayan, 1994; Dettinger et al., 1998; Garen, 1998; Cayan et al., 1999; Dettinger et al., in press), the probabilities that United States streamflows from December 1999 through July 2000 will be in the upper and lower thirds (terciles) of the historical records are estimated here. The processes that link ENSO to North American streamflow are discussed in detail in these diagnostic studies. Our justification for generating this forecast is threefold: (1) Cayan et al. (1999) recently have shown that ENSO influences on streamflow variations and extremes are proportionately larger than the corresponding precipitation teleconnections. (2) Redmond and Cayan (1994) and Dettinger et al. (in press) also have shown that the low-frequency evolution of ENSO conditions supports long-lead correlations between ENSO and streamflow in many rivers of the conterminous United States. (3) In many rivers, significant (weeks-to-months) delays between precipitation and the release to streams of snowmelt or ground-water discharge can support even longer term forecasts of streamflow than is possible for precipitation. The relatively slow, orderly evolution of El Niño-Southern Oscillation episodes, the accentuated dependence of streamflow upon ENSO, and the long lags between precipitation and flow encourage us to provide the following analysis as a simple prediction of this year's river flows.

  5. Dynamic SEP event probability forecasts

    Science.gov (United States)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
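The decay idea can be illustrated with Bayes' rule: if an SEP event would announce itself after an onset delay drawn from some distribution, then continued silence after the flare lowers the event probability. The exponential delay model and its 10 h mean below are illustrative assumptions, not the paper's fitted, longitude-dependent algorithm:

```python
import math

def dynamic_sep_probability(p0: float, t_hours: float,
                            mean_delay_hours: float = 10.0) -> float:
    """Posterior probability that an SEP event will still occur, given
    no onset t_hours after the flare.  Assumes exponential delay times."""
    survival = math.exp(-t_hours / mean_delay_hours)  # P(onset > t | event)
    return p0 * survival / (p0 * survival + (1 - p0))

p_at_flare = dynamic_sep_probability(0.5, 0.0)    # unchanged: 0.5
p_after_day = dynamic_sep_probability(0.5, 24.0)  # sharply reduced
```

In the paper's setting, the delay distribution is built empirically from the NOAA event list as a function of solar source longitude rather than assumed exponential.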

  6. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations, the relations between seismic strength of the earthquake, focal depth, distance and ground acceleration are calculated. We found that most Swedish earthquakes are shallow; the 1904 earthquake 100 km south of Oslo is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as of interest. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site. For consistency all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get the epicentral acceleration of this earthquake to be 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)

  7. Data analysis & probability drill sheets : grades 6-8

    CERN Document Server

    Forest, Chris

    2011-01-01

    For grades 6-8, our Common Core State Standards-based resource meets the data analysis & probability concepts addressed by the NCTM standards and encourages your students to review the concepts in unique ways. Each drill sheet contains warm-up and timed drill activities for the student to practice data analysis & probability concepts.

  8. Using Fuzzy Probability Weights in Cumulative Prospect Theory

    Directory of Open Access Journals (Sweden)

    Užga-Rebrovs Oļegs

    2016-12-01

Full Text Available During the past years, rapid growth has been seen in descriptive approaches to decision choice. As opposed to normative expected utility theory, these approaches are based on individuals' subjective perception of probabilities, which takes place in real situations of risky choice. The modelling of such perceptions is based on probability weighting functions. In cumulative prospect theory, which is the focus of this paper, decision weights for prospect outcomes are calculated from the obtained probability weights. If value functions are constructed on the sets of positive and negative outcomes, then, based on the outcome value evaluations and outcome decision weights, generalised evaluations of prospect value are calculated, which form the basis for choosing an optimal prospect.
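A common concrete choice for the weighting function in cumulative prospect theory is the Tversky-Kahneman form; decision weights for ranked gain outcomes are then differences of weighted cumulative probabilities. A sketch (gamma = 0.61 is Tversky and Kahneman's published estimate for gains; the outcome probabilities are invented):

```python
def tk_weight(p: float, gamma: float = 0.61) -> float:
    """Tversky-Kahneman probability weighting function."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def decision_weights(probs: list[float]) -> list[float]:
    """CPT decision weights for gain outcomes ranked best to worst:
    pi_i = w(p_1 + ... + p_i) - w(p_1 + ... + p_{i-1})."""
    weights, cum, prev = [], 0.0, 0.0
    for p in probs:
        cum = min(cum + p, 1.0)   # guard against float overshoot past 1
        w = tk_weight(cum)
        weights.append(w - prev)
        prev = w
    return weights

w_rare = tk_weight(0.01)            # small probabilities are overweighted
dw = decision_weights([0.1, 0.4, 0.5])
```

The characteristic inverse-S shape — overweighting rare outcomes, underweighting moderate and large probabilities — is exactly the subjective perception of probability the abstract describes.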

  9. Impact probabilities of meteoroid streams with artificial satellites: An assessment

    International Nuclear Information System (INIS)

    Foschini, L.; Cevolani, G.

    1997-01-01

Impact probabilities of artificial satellites with meteoroid streams were calculated using data collected with the CNR forward scatter (FS) bistatic radar over the Bologna-Lecce baseline (about 700 km). Results show that impact probabilities are 2 times higher than previously calculated values. Nevertheless, although catastrophic impacts remain rare even under meteor storm conditions, high meteoroid fluxes are expected to erode satellite surfaces and weaken their external structures

  10. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and overfitting the data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.

  11. Probability analysis of WWER-1000 fuel elements behavior under steady-state, transient and accident conditions of reactor operation

    International Nuclear Information System (INIS)

    Tutnov, A.; Alexeev, E.

    2001-01-01

'PULSAR-2' and 'PULSAR+' codes make it possible to simulate thermo-mechanical and thermo-physical parameters of WWER fuel elements. The probabilistic approach is used instead of the traditional deterministic one to carry out a sensitivity study of fuel element behavior under steady-state operation modes. Initial fuel element parameters are given as probability density distributions. Calculations are provided for all possible combinations of initial data: fuel-cladding gap, fuel density, and gas pressure. By dividing the values of these parameters into intervals, the final set of calculation variants is obtained. The intervals of permissible fuel-cladding gap size have been divided into 10 equal parts, and fuel density and gas pressure into 5 parts each. The probability of realizing each variant is determined by multiplying the probabilities of the separate parameters, because the tolerances of these parameters are distributed independently. Simulation results are presented as probabilistic bar charts. The charts present the probability distributions of changes in fuel outer diameter, hoop stress kinetics, and fuel temperature versus irradiation time. A normative safety factor is introduced to control the realization of any criterion and to determine the margin to criterion failure. A probabilistic analysis of fuel element behavior under a Reactivity Initiated Accident (RIA) is also performed, and the probability of fuel element depressurization under a hypothetical RIA is presented
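The combinatorial step — assigning each combination of discretised initial parameters a probability equal to the product of its independently distributed tolerances — can be sketched as follows (interval counts and probabilities invented for illustration):

```python
from itertools import product

# Hypothetical discretised tolerance distributions (each sums to 1).
gap_probs = [0.2, 0.5, 0.3]       # fuel-cladding gap intervals
density_probs = [0.3, 0.4, 0.3]   # fuel density intervals
pressure_probs = [0.5, 0.5]       # gas pressure intervals

# The probability of each calculation variant is the product of the
# interval probabilities, because the tolerances are independent.
variant_probs = [g * d * p
                 for g, d, p in product(gap_probs, density_probs, pressure_probs)]
```

With 10 gap intervals and 5 each for density and pressure, as in the abstract, the same enumeration yields 250 variants whose probabilities again sum to 1.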

  12. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  13. Probabilistic interpretation of command and control signals: Bayesian updating of the probability of nuclear attack

    International Nuclear Information System (INIS)

    Pate-Cornell, M.Elisabeth; Fischbeck, Paul S.

    1995-01-01

    A warning system such as the Command, Control, Communication, and Intelligence system (C 3 I) for the United States nuclear forces operates on the basis of various sources of information among which are signals from sensors. A fundamental problem in the use of such signals is that these sensors provide only imperfect information. Bayesian probability, defined as a degree of belief in the possibility of each event, is therefore a key concept in the logical treatment of the signals. However, the base of evidence for estimation of these probabilities may be small and, therefore, the results of the updating (posterior probabilities of attack) may also be uncertain. In this paper, we examine the case where uncertainties hinge upon the existence of several possible underlying hypotheses (or models), and where the decision-maker attributes a different probability of attack to each of these fundamental hypotheses. We present a two-stage Bayesian updating process, first of the probabilities of the fundamental hypotheses, then of the probabilities of attack conditional on each hypothesis, given a positive signal from the C 3 I. We illustrate the method in the discrete case where there are only two possible fundamental hypotheses, and in the case of a continuous set of hypotheses. We discuss briefly the implications of the results for decision-making. The method can be generalized to other warning systems with imperfect signals, when the prior probability of the event of interest is uncertain
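The two-stage update in the discrete case can be collapsed into one computation: weight each hypothesis by how well it explains the signal, then combine the per-hypothesis attack posteriors. A sketch with two invented sensor models (the numbers are illustrative, not from the paper):

```python
def attack_posterior(p_attack: float, hypotheses) -> float:
    """P(attack | positive signal), marginalised over hypotheses.

    hypotheses: iterable of (prior_of_hypothesis, hit_rate, false_alarm_rate),
    where hit_rate = P(signal | attack, h) and
    false_alarm_rate = P(signal | no attack, h).
    """
    joint_signal = 0.0         # accumulates P(signal)
    joint_signal_attack = 0.0  # accumulates P(signal and attack)
    for p_h, hit, fa in hypotheses:
        p_signal_h = p_attack * hit + (1 - p_attack) * fa
        joint_signal += p_h * p_signal_h
        joint_signal_attack += p_h * p_attack * hit
    return joint_signal_attack / joint_signal

# Hypothesis 1: reliable sensor; hypothesis 2: false-alarm-prone sensor.
posterior = attack_posterior(1e-3, [(0.7, 0.95, 0.01), (0.3, 0.95, 0.20)])
```

Even a high hit rate moves a tiny prior only modestly when one plausible hypothesis assigns the sensor a high false-alarm rate — the paper's point about posteriors remaining uncertain when the underlying model is.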

  14. Automatic Threshold Setting and Its Uncertainty Quantification in Wind Turbine Condition Monitoring System

    DEFF Research Database (Denmark)

    Marhadi, Kun Saptohartyadi; Skrimpas, Georgios Alexandros

    2015-01-01

Setting optimal alarm thresholds in a vibration based condition monitoring system is inherently difficult. There are no established thresholds for many vibration based measurements, so most of the time thresholds are set based on statistics of the collected data. Often the underlying probability distribution that describes the data is not known; choosing an incorrect distribution and then setting thresholds based on it can result in sub-optimal thresholds. Moreover, in wind turbine applications the collected data may not represent the whole range of operating conditions of a turbine, which results in uncertainty in the parameters of the fitted probability distribution and in the thresholds calculated. In this study, Johnson, Normal, and Weibull distributions are investigated to determine which distribution best fits the vibration data collected...
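The Normal case of this threshold-setting scheme can be sketched with the Python standard library alone (the readings are invented; Johnson and Weibull fits would need something like scipy.stats):

```python
from statistics import NormalDist, fmean, stdev

# Hypothetical healthy-state vibration readings (e.g. RMS velocity, mm/s).
readings = [2.1, 2.4, 1.9, 2.2, 2.0, 2.3, 2.5, 2.2, 2.1, 2.4]

# Fit a Normal distribution to the baseline data ...
fitted = NormalDist(mu=fmean(readings), sigma=stdev(readings))

# ... and place the alarm threshold at the 99th percentile of the fit,
# i.e. a nominal 1% false-alarm probability if the Normal model is correct.
threshold = fitted.inv_cdf(0.99)
```

If the data are actually skewed, as Weibull- or Johnson-distributed vibration features often are, this Normal-based threshold can be badly placed — precisely the mis-specification risk the study investigates.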

  15. Probability-based collaborative filtering model for predicting gene-disease associations.

    Science.gov (United States)

    Zeng, Xiangxiang; Ding, Ningxiang; Rodríguez-Patón, Alfonso; Zou, Quan

    2017-12-28

    Accurately predicting pathogenic human genes has been challenging in recent research. Considering the extensive gene-disease data verified by biological experiments, we can apply computational methods to perform accurate predictions with reduced time and expense. We propose a probability-based collaborative filtering model (PCFM) to predict pathogenic human genes. Several kinds of data sets, containing data of humans and data of other nonhuman species, are integrated in our model. Firstly, on the basis of a typical latent factorization model, we propose model I with an average heterogeneous regularization. Secondly, we develop modified model II with personal heterogeneous regularization to enhance the accuracy of the aforementioned model. In these models, vector space similarity or Pearson correlation coefficient metrics and data on related species are also used. We compared the results of PCFM with those of four state-of-the-art approaches. The results show that PCFM performs better than the other advanced approaches. PCFM can be leveraged for prediction of disease genes, especially for new human genes or diseases with no known relationships.
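
    The "typical latent factorization model" that PCFM builds on can be sketched in a few lines. This is a generic sketch with made-up data, not the paper's heterogeneous-regularization model: known gene-disease associations are entries of a sparse matrix, and gene and disease latent vectors are fitted by stochastic gradient descent so that their dot product approximates the association score.

```python
# Bare-bones latent factor model for gene-disease association (generic sketch).
import random

random.seed(1)
n_genes, n_diseases, k = 6, 4, 3
# observed (gene, disease, association) triples; unobserved pairs are scored later
observed = [(0, 0, 1.0), (0, 1, 1.0), (1, 0, 1.0), (2, 2, 1.0),
            (3, 3, 1.0), (4, 2, 1.0), (1, 3, 0.0), (5, 1, 0.0)]

G = [[random.gauss(0, 0.1) for _ in range(k)] for _ in range(n_genes)]
D = [[random.gauss(0, 0.1) for _ in range(k)] for _ in range(n_diseases)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

lr, reg = 0.05, 0.01
for _ in range(500):
    for g, d, y in observed:
        err = dot(G[g], D[d]) - y
        for f in range(k):  # gradient step with L2 regularization
            gf, df = G[g][f], D[d][f]
            G[g][f] -= lr * (err * df + reg * gf)
            D[d][f] -= lr * (err * gf + reg * df)

score = dot(G[0], D[0])  # reconstructed known association, should be near 1
```

    Predictions for unobserved gene-disease pairs are then simply the dot products of the corresponding learned vectors; PCFM adds regularization terms that pull similar genes (by vector space or Pearson similarity, including cross-species data) toward each other.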

  16. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  17. Future probabilities of coastal floods in Finland

    Science.gov (United States)

    Pellikka, Havu; Leijala, Ulpu; Johansson, Milla M.; Leinonen, Katri; Kahma, Kimmo K.

    2018-04-01

    Coastal planning requires detailed knowledge of future flooding risks, and effective planning must consider both short-term sea level variations and the long-term trend. We calculate distributions that combine short- and long-term effects to provide estimates of flood probabilities in 2050 and 2100 on the Finnish coast of the Baltic Sea. Our distributions of short-term sea level variations are based on 46 years (1971-2016) of observations from the 13 Finnish tide gauges. The long-term scenarios of mean sea level combine postglacial land uplift, regionally adjusted scenarios of global sea level rise, and the effect of changes in the wind climate. The results predict that flooding risks will clearly increase by 2100 in the Gulf of Finland and the Bothnian Sea, while only a small increase or no change compared to present-day conditions is expected in the Bothnian Bay, where the land uplift is stronger.
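
    The combination of short-term variability with a long-term mean shift can be illustrated schematically. The numbers below are invented for illustration and are not the paper's Finnish tide-gauge data: the flood probability is the chance that the net mean sea level change plus a short-term excursion exceeds a fixed flood level.

```python
# Schematic combination of short-term sea level variability and long-term trend.
from statistics import NormalDist

short_term = NormalDist(0.0, 0.25)   # metres; stand-in for the empirical distribution
flood_level = 1.0                    # metres above present mean sea level

def annual_flood_prob(mean_rise, land_uplift):
    # net long-term shift of the mean; uplift lowers relative sea level
    net_shift = mean_rise - land_uplift
    return 1.0 - short_term.cdf(flood_level - net_shift)

p_now  = annual_flood_prob(0.0, 0.0)
p_2100 = annual_flood_prob(0.6, 0.2)   # hypothetical scenario values
```

    Because the short-term distribution has a thin upper tail, even a modest net shift of the mean moves the flood level several standard deviations closer and multiplies the exceedance probability by orders of magnitude, which is why strong land uplift (as in the Bothnian Bay) can offset sea level rise.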

  18. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle, and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second edition, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber-Shiu functions and dependence.

  19. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  20. Performance Based Evaluation of Concrete Strength under Various Curing Conditions to Investigate Climate Change Effects

    Directory of Open Access Journals (Sweden)

    Tae-Kyun Kim

    2015-07-01

    Recently, the manifestation of global warming-induced climate change has been observed in super typhoons, heavy snowfalls, torrential rains, and extended heat waves. These climate changes have been occurring all over the world, and the resulting natural disasters have caused severe damage and deterioration of concrete structures and infrastructure. In an effort to deal with these problems due to extreme and abnormal climate change, studies have been conducted to develop construction technologies and design guidelines. Nevertheless, study results applicable to construction sites continue to be ineffective and insufficient. Therefore, this study proposes ways to cope with climate change by considering the effect of variations in concrete curing conditions on concrete material performance. More specifically, the 3-, 7-, and 28-day compressive and split tensile strength properties of concrete mixes cured under various climatic factors, including temperature, relative humidity, wind speed, and sunlight exposure time, were evaluated to determine whether the concrete meets the current design requirements. Thereafter, a performance based evaluation (PBE) was performed using satisfaction probabilities based on the test values to understand the problems associated with the current mix proportion design practice and to identify countermeasures to deal with climate change-induced curing conditions.

  1. Problems involved in calculating the probability of rare occurrences

    International Nuclear Information System (INIS)

    Tittes, E.

    1986-01-01

    Limits also have to be observed with regard to characteristics such as occurrence probability or occurrence rate, or else probability data, and thus the concept of determinable risk itself, will lose their practical value. The mathematical models applied for probability assessment are based on data supplied by insurance companies, by reliability experts in the automobile industry, or by planning experts in the fields of traffic or information supply. (DG) [de]

  2. Fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and the probabilities of trends of fuzzy logical relationships.

    Science.gov (United States)

    Chen, Shyi-Ming; Chen, Shen-Wen

    2015-03-01

    In this paper, we present a new method for fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and the probabilities of trends of fuzzy-trend logical relationships. Firstly, the proposed method fuzzifies the historical training data of the main factor and the secondary factor into fuzzy sets, respectively, to form two-factors second-order fuzzy logical relationships. Then, it groups the obtained two-factors second-order fuzzy logical relationships into two-factors second-order fuzzy-trend logical relationship groups. Then, it calculates the probability of the "down-trend," the probability of the "equal-trend" and the probability of the "up-trend" of the two-factors second-order fuzzy-trend logical relationships in each two-factors second-order fuzzy-trend logical relationship group, respectively. Finally, it performs the forecasting based on the probabilities of the down-trend, the equal-trend, and the up-trend of the two-factors second-order fuzzy-trend logical relationships in each two-factors second-order fuzzy-trend logical relationship group. We also apply the proposed method to forecast the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) and the NTD/USD exchange rates. The experimental results show that the proposed method outperforms the existing methods.
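
    The trend probabilities in the third step reduce to relative frequencies within each relationship group. A minimal sketch with hypothetical group data:

```python
# Relative-frequency estimate of down/equal/up trend probabilities in one group.
from collections import Counter

def trend_probabilities(trends):
    """trends: list of 'down'/'equal'/'up' labels observed in one
    two-factors second-order fuzzy-trend logical relationship group."""
    counts = Counter(trends)
    total = len(trends)
    return {t: counts.get(t, 0) / total for t in ("down", "equal", "up")}

probs = trend_probabilities(["up", "up", "down", "equal", "up"])
```

    The forecast for a new observation falling into this group is then weighted by these probabilities, so groups dominated by one trend produce confident directional forecasts while mixed groups hedge.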

  3. Failure probability of PWR reactor coolant loop piping

    International Nuclear Information System (INIS)

    Lo, T.; Woo, H.H.; Holman, G.S.; Chou, C.K.

    1984-02-01

    This paper describes the results of assessments performed on the PWR coolant loop piping of Westinghouse and Combustion Engineering plants. For direct double-ended guillotine break (DEGB), consideration was given to crack existence probability, initial crack size distribution, hydrostatic proof test, preservice inspection, leak detection probability, crack growth characteristics, and failure criteria based on the net section stress failure and tearing modulus stability concept. For indirect DEGB, fragilities of major component supports were estimated. The system level fragility was then calculated based on the Boolean expression involving these fragilities. Indirect DEGB due to seismic effects was calculated by convolving the system level fragility and the seismic hazard curve. The results indicate that the probability of occurrence of both direct and indirect DEGB is extremely small, thus, postulation of DEGB in design should be eliminated and replaced by more realistic criteria
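
    The convolution of the system-level fragility with the seismic hazard curve mentioned above is, numerically, a one-line discretized integral: the annual probability of indirect DEGB is the sum over acceleration bins of the fragility times the annual probability of that acceleration. The curves below are generic illustrative shapes, not plant-specific data.

```python
# Discretized convolution of a lognormal fragility with a toy hazard curve.
import math

def fragility(a, median=2.0, beta=0.4):
    # lognormal fragility: P(failure | peak ground acceleration a), in g
    return 0.5 * (1.0 + math.erf(math.log(a / median) / (beta * math.sqrt(2.0))))

def hazard_density(a):
    # toy annual hazard: exceedance ~ 1e-3 * a^-2, so density = 2e-3 * a^-3
    return 2e-3 * a ** -3

da = 0.01
p_annual = sum(fragility(a) * hazard_density(a) * da
               for a in [0.1 + i * da for i in range(1000)])
```

    Because the hazard density falls off rapidly while the fragility only rises at high acceleration, the product is small everywhere, which is the numerical reason such indirect-DEGB probabilities come out extremely low.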

  4. Scenario based optimization of a container vessel with respect to its projected operating conditions

    Directory of Open Access Journals (Sweden)

    Wagner Jonas

    2014-06-01

    In this paper the scenario based optimization of the bulbous bow of the KRISO Container Ship (KCS) is presented. The optimization of the parametrically modeled vessel is based on a statistically developed operational profile generated from noon-to-noon reports of a comparable 3600 TEU container vessel and on specific development functions representing the growth of the global economy during the vessel's service time. In order to consider uncertainties, statistical fluctuations are added. An analysis of these data leads to a number of the most probable upcoming operating conditions (OC) the vessel will encounter in the future. According to their respective likelihood, an objective function for the evaluation of the optimal design variant of the vessel is derived and implemented within the parametric optimization workbench FRIENDSHIP Framework. This evaluation is then carried out with respect to the vessel's calculated effective power, based on the use of a potential flow code. The evaluation shows that the usage of scenarios within the optimization process has a strong influence on the hull form.

  6. Probability calculations for three-part mineral resource assessments

    Science.gov (United States)

    Ellefsen, Karl J.

    2017-06-27

    Three-part mineral resource assessment is a methodology for predicting, in a specified geographic region, both the number of undiscovered mineral deposits and the amount of mineral resources in those deposits. These predictions are based on probability calculations performed with newly implemented computer software. Compared to the previous implementation, the new implementation includes new features for the probability calculations themselves and for checks of those calculations. The development of the new implementation led to a new understanding of the probability calculations, namely of the assumptions inherent in them. Several assumptions strongly affect the mineral resource predictions, so it is crucial that they are checked during an assessment. The evaluation of the new implementation leads to new findings about the probability calculations, namely findings regarding the precision of the computations, the computation time, and the sensitivity of the calculation results to the input.

  7. Certainties and probabilities of the IPCC

    International Nuclear Information System (INIS)

    2004-01-01

    Based on an analysis of information about climate evolution, simulations of global warming, and the snow coverage monitoring of Meteo-France, the IPCC presented its certainties and probabilities concerning the greenhouse effect. (A.L.B.)

  8. Probability and logical structure of statistical theories

    International Nuclear Information System (INIS)

    Hall, M.J.W.

    1988-01-01

    A characterization of statistical theories is given which incorporates both classical and quantum mechanics. It is shown that each statistical theory induces an associated logic and joint probability structure, and simple conditions are given for the structure to be of a classical or quantum type. This provides an alternative for the quantum logic approach to axiomatic quantum mechanics. The Bell inequalities may be derived for those statistical theories that have a classical structure and satisfy a locality condition weaker than factorizability. The relation of these inequalities to the issue of hidden variable theories for quantum mechanics is discussed and clarified

  9. When does inferring reputation probability countervail temptation in cooperative behaviors for the prisoners’ dilemma game?

    International Nuclear Information System (INIS)

    Dai, Yu; Lu, Peng

    2015-01-01

    In evolutionary games, the temptation mechanism reduces the cooperation percentage while the reputation mechanism promotes it. Inferring reputation theory proposes that an agent imitates the neighbor with the highest reputation only with a certain probability. Although reputation promotes cooperation, when and how it enhances cooperation is still a question. This paper investigates the conditions under which the inferring reputation probability promotes cooperation. Hence, the effects of reputation and temptation on cooperation are explored under the spatial prisoners’ dilemma game, utilizing the methods of simulation and statistical analysis. Results show that temptation reduces cooperation unconditionally while reputation promotes it conditionally, i.e. reputation countervails temptation only conditionally. When the inferring reputation probability is less than 0.5, reputation promotes cooperation substantially and thus countervails temptation. However, when the inferring reputation probability is larger than 0.5, its contribution to cooperation is relatively weak and cannot prevent temptation from undermining cooperation. Reputation even decreases cooperation together with temptation when the probability is higher than 0.8. It should be noted that inferring reputation does not always succeed in countervailing temptation, and there is a specific interval within which it promotes cooperation

  10. The Negated Conditional: A Litmus Test for the Suppositional Conditional?

    Science.gov (United States)

    Handley, Simon J.; Evans, Jonathan St. B. T.; Thompson, Valerie A.

    2006-01-01

    Under the suppositional account of conditionals, when people think about a conditional assertion, "if p then q," they engage in a mental simulation in which they imagine p holds and evaluate the probability that q holds under this supposition. One implication of this account is that belief in a conditional equates to conditional probability…

  11. ANALYSIS OF EFFECTIVENESS OF METHODOLOGICAL SYSTEM FOR PROBABILITY AND STOCHASTIC PROCESSES COMPUTER-BASED LEARNING FOR PRE-SERVICE ENGINEERS

    Directory of Open Access Journals (Sweden)

    E. Chumak

    2015-04-01

    The author substantiates that only methodological training systems for mathematical disciplines that implement information and communication technologies (ICT) can meet the requirements of the modern educational paradigm and make it possible to increase educational efficiency. Due to this fact, the necessity of developing a methodology for computer-based learning of probability theory and stochastic processes for pre-service engineers is underlined in the paper. The results of the experimental study of the efficiency of this methodological system are shown. The analysis includes three main stages: ascertaining, searching, and forming. The key criteria of the efficiency of the designed methodological system are the level of students' probabilistic and stochastic skills and their learning motivation. The effect of implementing the methodological system on the level of students' IT literacy is shown in the paper, and the expansion of the range of objectives for which students apply ICT is described. The level of formation of students' learning motivation at the ascertaining and forming stages of the experiment is analyzed, and the level of intrinsic learning motivation of pre-service engineers is defined at these stages. For this purpose, the methodology of testing the students' learning motivation in the chosen specialty is presented in the paper. The increase in intrinsic learning motivation of the experimental group students (E group) relative to the control group students (C group) is demonstrated.

  12. The probability distribution of maintenance cost of a system affected by the gamma process of degradation: Finite time solution

    International Nuclear Information System (INIS)

    Cheng, Tianjin; Pandey, Mahesh D.; Weide, J.A.M. van der

    2012-01-01

    The stochastic gamma process has been widely used to model uncertain degradation in engineering systems and structures. The optimization of the condition-based maintenance (CBM) policy is typically based on the minimization of the asymptotic cost rate. In the financial planning of a maintenance program, however, a more accurate prediction interval for the cost is needed for prudent decision making. The prediction interval cannot be estimated unless the probability distribution of cost is known. In this context, the asymptotic cost rate has a limited utility. This paper presents the derivation of the probability distribution of maintenance cost, when the system degradation is modelled as a stochastic gamma process. A renewal equation is formulated to derive the characteristic function, then the discrete Fourier transform of the characteristic function leads to the complete probability distribution of cost in a finite time setting. The proposed approach is useful for a precise estimation of prediction limits and optimization of the maintenance cost.
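
    The paper obtains the finite-time cost distribution analytically, via a renewal equation, the characteristic function, and a discrete Fourier inversion. The same distribution can be approximated by plain Monte Carlo, which makes the object of interest concrete: not just a mean cost rate, but a full distribution from which prediction limits can be read off. All parameters below are hypothetical.

```python
# Monte Carlo approximation of the finite-horizon maintenance cost distribution
# for a gamma-process degradation model under a condition-based policy.
import random

random.seed(42)

def horizon_cost(T=50, shape=0.5, scale=1.0, pm_level=8.0, fail_level=10.0,
                 c_inspect=1.0, c_pm=10.0, c_fail=50.0):
    x, cost = 0.0, 0.0
    for _ in range(T):                           # yearly inspections
        x += random.gammavariate(shape, scale)   # stationary gamma increment
        cost += c_inspect
        if x >= fail_level:                      # corrective replacement
            cost += c_fail
            x = 0.0
        elif x >= pm_level:                      # preventive replacement
            cost += c_pm
            x = 0.0
    return cost

costs = sorted(horizon_cost() for _ in range(2000))
mean_cost = sum(costs) / len(costs)
p90 = costs[int(0.9 * len(costs))]   # a prediction limit, not just a cost rate
```

    The spread between `mean_cost` and `p90` is exactly the information that the asymptotic cost rate cannot provide, and which the paper's Fourier-based solution delivers exactly rather than by sampling.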

  13. PROBABLE FORECASTING IN THE COURSE OF INTERPRETING

    Directory of Open Access Journals (Sweden)

    Ye. B. Kagan

    2017-01-01

    Full Text Available Introduction. Translation practice has a heuristic nature and involves cognitive structures of consciousness of any interpreter. When preparing translators, special attention is paid to the development of their skill of probable forecasting.The aim of the present publication is to understand the process of anticipation from the position of the cognitive model of translation, development of exercises aimed at the development of prognostic abilities of students and interpreters when working with newspaper articles, containing metaphorical headlines.Methodology and research methods. The study is based on the competence approach to the training of students-translators, the complex of interrelated scientific methods, the main of which is the psycholinguistic experiment. With the use of quantitative data the features of the perception of newspaper texts on their metaphorical titles are characterized.Results and scientific novelty. On the basis of the conducted experiment to predict the content of newspaper articles with metaphorical headlines it is concluded that the main condition of predictability is the expectation. Probable forecasting as a professional competence of a future translator is formed in the process of training activities by integrating efforts of various departments of any language university. Specific exercises for the development of anticipation of students while studying the course of translation and interpretation are offered.Practical significance. The results of the study can be used by foreign language teachers of both language and non-language universities in teaching students of different specialties to translate foreign texts. 

  14. Probabilistic Cloning of Three Real States with Optimal Success Probabilities

    Science.gov (United States)

    Rui, Pin-shu

    2017-06-01

    We investigate the probabilistic quantum cloning (PQC) of three real states with an average probability distribution. To get the analytic forms of the optimal success probabilities, we assume that the three states have only two pairwise inner products. Based on the optimal success probabilities, we derive the explicit form of 1 → 2 PQC for cloning three real states. The unitary operation needed in the PQC process is worked out too. The optimal success probabilities are also generalized to the M → N PQC case.

  15. Failure frequencies and probabilities applicable to BWR and PWR piping

    International Nuclear Information System (INIS)

    Bush, S.H.; Chockie, A.D.

    1996-03-01

    This report deals with failure probabilities and failure frequencies of nuclear plant piping and the failure frequencies of flanges and bellows. Piping failure probabilities are derived from Piping Reliability Analysis Including Seismic Events (PRAISE) computer code calculations based on fatigue and intergranular stress corrosion as failure mechanisms. Values for both failure probabilities and failure frequencies are cited from several sources to yield a better evaluation of the spread in mean and median values as well as the widths of the uncertainty bands. A general conclusion is that the numbers from WASH-1400 often used in PRAs are unduly conservative. Failure frequencies for both leaks and large breaks tend to be higher than would be calculated using the failure probabilities, primarily because the frequencies are based on a relatively small number of operating years. Also, failure probabilities are substantially lower because of the probability distributions used in PRAISE calculations. A general conclusion is that large LOCA probability values calculated using PRAISE will be quite small, on the order of less than 1E-8 per year (<1E-8/year). The values in this report should be recognized as having inherent limitations and should be considered as estimates and not absolute values. 24 refs

  16. Weighted conditional least-squares estimation

    International Nuclear Information System (INIS)

    Booth, J.G.

    1987-01-01

    A two-stage estimation procedure is proposed that generalizes the concept of conditional least squares. The method is instead based upon the minimization of a weighted sum of squares, where the weights are inverses of estimated conditional variance terms. Some general conditions are given under which the estimators are consistent and jointly asymptotically normal. More specific details are given for ergodic Markov processes with stationary transition probabilities. A comparison is made with the ordinary conditional least-squares estimators for two simple branching processes with immigration. The relationship between weighted conditional least squares and other, more well-known, estimators is also investigated. In particular, it is shown that in many cases estimated generalized least-squares estimators can be obtained using the weighted conditional least-squares approach. Applications to stochastic compartmental models, and linear models with nested error structures are considered
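
    The two-stage idea can be sketched in a generic regression setting (this is an illustration of the principle, not the paper's Markov-process framework): fit by ordinary least squares, estimate the conditional variances from the squared residuals, then refit with inverse-variance weights.

```python
# Two-stage weighted least squares sketch on heteroscedastic synthetic data.
import random

random.seed(3)
xs = [0.5 + 0.01 * i for i in range(500)]
ys = [2.0 * x + random.gauss(0, x) for x in xs]   # noise std grows with x

def weighted_slope(xs, ys, ws):
    # weighted least squares through the origin: slope = sum(w x y) / sum(w x^2)
    num = sum(w * x * y for w, x, y in zip(ws, xs, ys))
    den = sum(w * x * x for w, x in zip(ws, xs))
    return num / den

# Stage 1: ordinary (equal-weight) fit
b_ols = weighted_slope(xs, ys, [1.0] * len(xs))

# Stage 2: estimate the conditional variance model Var(y|x) = c * x^2
# from the squared stage-1 residuals, then use inverse estimated variances
resid2 = [(y - b_ols * x) ** 2 for x, y in zip(xs, ys)]
c = sum(r / (x * x) for r, x in zip(resid2, xs)) / len(xs)
ws = [1.0 / (c * x * x) for x in xs]
b_wls = weighted_slope(xs, ys, ws)
```

    Both estimators are consistent, but the weighted fit down-weights the noisy high-x observations and so has the smaller variance, which is the efficiency gain the paper establishes in the Markov setting.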

  17. Using experts’ consensus (the Delphi method) to evaluate weighting techniques in web surveys not based on probability schemes

    NARCIS (Netherlands)

    Toepoel, V.; Emerson, Hannah

    2017-01-01

    Weighting techniques in web surveys not based on probability schemes are devised to correct biases due to self-selection, undercoverage, and nonresponse. In an interactive panel, 38 survey experts addressed weighting techniques and auxiliary variables in web surveys. Most of them corrected all biases

  18. SureTrak Probability of Impact Display

    Science.gov (United States)

    Elliott, John

    2012-01-01

    The SureTrak Probability of Impact Display software was developed for use during rocket launch operations. The software displays probability of impact information for each ship near the hazardous area during the time immediately preceding the launch of an unguided vehicle. Wallops range safety officers need to be sure that the risk to humans is below a certain threshold during each use of the Wallops Flight Facility Launch Range. Under the variable conditions that can exist at launch time, the decision to launch must be made in a timely manner to ensure a successful mission while not exceeding those risk criteria. Range safety officers need a tool that can give them the needed probability of impact information quickly, and in a format that is clearly understandable. This application is meant to fill that need. The software is a reuse of part of software developed for an earlier project: Ship Surveillance Software System (S4). The S4 project was written in C++ using Microsoft Visual Studio 6. The data structures and dialog templates from it were copied into a new application that calls the implementation of the algorithms from S4 and displays the results as needed. In the S4 software, the list of ships in the area was received from one local radar interface and from operators who entered the ship information manually. The SureTrak Probability of Impact Display application receives ship data from two local radars as well as the SureTrak system, eliminating the need for manual data entry.

  19. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields a higher probability of detection than ranking by classical probability provided a given probability of ...

  20. Experience-based probabilities modulate expectations in a gender-coded artificial language

    Directory of Open Access Journals (Sweden)

    Anton Öttl

    2016-08-01

    The current study combines artificial language learning with visual world eyetracking to investigate acquisition of representations associating spoken words and visual referents using morphologically complex pseudowords. Pseudowords were constructed to consistently encode referential gender by means of suffixation for a set of imaginary figures that could be either male or female. During training, the frequency of exposure to pseudowords and their imaginary figure referents were manipulated such that a given word and its referent would be more likely to occur in either the masculine form or the feminine form, or both forms would be equally likely. Results show that these experience-based probabilities affect the formation of new representations to the extent that participants were faster at recognizing a referent whose gender was consistent with the induced expectation than a referent whose gender was inconsistent with this expectation. Disambiguating gender information available from the suffix did not mask the induced expectations. Eyetracking data provide additional evidence that such expectations surface during online lexical processing. Taken together, these findings indicate that experience-based information is accessible during the earliest stages of processing, and are consistent with the view that language comprehension depends on the activation of perceptual memory traces.

  1. Evolvement simulation of the probability of neutron-initiating persistent fission chain

    International Nuclear Information System (INIS)

    Wang Zhe; Hong Zhenying

    2014-01-01

    Background: The probability of a neutron initiating a persistent fission chain, which has to be calculated in analyses of criticality safety, reactor start-up, and burst waiting time and burst generation on pulsed reactors, is an inherent parameter of a multiplying assembly. Purpose: We aim to derive a time-dependent integro-differential equation for this probability in relative velocity space according to probability conservation, and to develop the deterministic code Dynamic Segment Number Probability (DSNP) based on the multi-group S_N method. Methods: The reliable convergence of the dynamic calculation was analyzed, and numerical simulation of the evolution of the dynamic probability for varying concentration was performed under different initial conditions. Results: On Highly Enriched Uranium (HEU) bare spheres, when the time is long enough, the results of the dynamic calculation approach those of the static calculation. The largest difference between the DSNP and Partisn results is less than 2%. On the Baker model, over a range of about 1 μs after first criticality, the largest difference between the dynamic and static calculations is about 300%. For a supercritical system, the finite fission chains decrease and the persistent fission chains increase as the reactivity grows; the dynamic evolution curve of the initiation probability is within 5% of the static curve when K_eff exceeds 1.2. The cumulative probability curve also indicates that the difference in integral results between the dynamic and static calculations decreases from 35% to 5% as K_eff increases. This demonstrates that the ability to initiate a self-sustaining fission chain reaction approaches stabilization, while the former difference (35%) shows that the dynamic results differ substantially from the static ones near first criticality. The DSNP code agrees well with the Partisn code. Conclusions: There are large numbers of
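
    In the static limit, the initiation probability the abstract compares against is the survival probability of a Galton-Watson branching process: with generating function f(s) for the number of next-generation neutrons produced per neutron, the extinction probability q is the smallest non-negative root of q = f(q), and the persistence probability is 1 - q. A sketch with a hypothetical offspring distribution:

```python
# Fixed-point solution of q = f(q) for a branching-process extinction probability.
def extinction_prob(pmf, tol=1e-12):
    """pmf[k] = P(k next-generation neutrons per neutron)."""
    q = 0.0
    for _ in range(10000):
        q_new = sum(p * q ** k for k, p in enumerate(pmf))  # q = f(q)
        if abs(q_new - q) < tol:
            break
        q = q_new
    return q

pmf = [0.3, 0.2, 0.3, 0.2]     # hypothetical; mean = 1.4 > 1, i.e. supercritical
q = extinction_prob(pmf)
p_persistent = 1.0 - q
```

    Iterating from q = 0 converges to the smallest root, which for a supercritical distribution is strictly less than 1; the dynamic calculation in the paper tracks how this quantity evolves in time rather than its static limit.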

  2. Formulating informative, data-based priors for failure probability estimation in reliability analysis

    International Nuclear Information System (INIS)

    Guikema, Seth D.

    2007-01-01

    Priors play an important role in the use of Bayesian methods in risk analysis, and using all available information to formulate an informative prior can lead to more accurate posterior inferences. This paper examines the practical implications of using five different methods for formulating an informative prior for a failure probability based on past data. These methods are the method of moments, maximum likelihood (ML) estimation, maximum entropy estimation, starting from a non-informative 'pre-prior', and fitting a prior based on confidence/credible interval matching. The priors resulting from the use of these different methods are compared qualitatively, and the posteriors are compared quantitatively based on a number of different scenarios of observed data used to update the priors. The results show that the amount of information assumed in the prior makes a critical difference in the accuracy of the posterior inferences. For situations in which the data used to formulate the informative prior is an accurate reflection of the data that is later observed, the ML approach yields the minimum variance posterior. However, the maximum entropy approach is more robust to differences between the data used to formulate the prior and the observed data because it maximizes the uncertainty in the prior subject to the constraints imposed by the past data
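
    The method-of-moments approach the paper compares can be sketched for a Beta prior on a failure probability: match the past data's mean and variance, then exploit Beta-binomial conjugacy for the update. The numbers below are illustrative, not from the paper.

```python
# Sketch: fitting an informative Beta prior for a failure probability by the
# method of moments, then updating it with observed failures (conjugate update).

def beta_from_moments(mean, var):
    """Return (alpha, beta) of the Beta distribution matching mean/variance."""
    common = mean * (1.0 - mean) / var - 1.0
    return mean * common, (1.0 - mean) * common

def update(alpha, beta, failures, trials):
    """Conjugate Bayesian update of a Beta prior with binomial data."""
    return alpha + failures, beta + trials - failures

# Past data suggest failure probability ~0.05 with variance 0.001.
a0, b0 = beta_from_moments(0.05, 0.001)          # prior Beta(2.325, 44.175)
a1, b1 = update(a0, b0, failures=2, trials=100)  # posterior after new data
posterior_mean = a1 / (a1 + b1)
```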

  3. Finite element model updating of concrete structures based on imprecise probability

    Science.gov (United States)

    Biswal, S.; Ramaswamy, A.

    2017-09-01

    Imprecise probability based methods are developed in this study for the parameter estimation, in finite element model updating for concrete structures, when the measurements are imprecisely defined. Bayesian analysis using Metropolis Hastings algorithm for parameter estimation is generalized to incorporate the imprecision present in the prior distribution, in the likelihood function, and in the measured responses. Three different cases are considered (i) imprecision is present in the prior distribution and in the measurements only, (ii) imprecision is present in the parameters of the finite element model and in the measurement only, and (iii) imprecision is present in the prior distribution, in the parameters of the finite element model, and in the measurements. Procedures are also developed for integrating the imprecision in the parameters of the finite element model, in the finite element software Abaqus. The proposed methods are then verified against reinforced concrete beams and prestressed concrete beams tested in our laboratory as part of this study.
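
    The Metropolis-Hastings machinery the authors generalize can be sketched in miniature for a single model parameter. The target here is a toy Gaussian posterior standing in for the finite-element likelihood; it is not the paper's model.

```python
import math
import random

# Minimal Metropolis-Hastings sampler for one parameter (e.g. a stiffness
# scale factor). The toy log-posterior is Gaussian(1.0, 0.1).

def log_post(theta):
    return -0.5 * ((theta - 1.0) / 0.1) ** 2

def metropolis_hastings(n_steps, step=0.2, seed=0):
    rng = random.Random(seed)
    theta, chain = 0.8, []
    for _ in range(n_steps):
        prop = theta + rng.gauss(0.0, step)
        # Accept with probability min(1, post(prop)/post(theta)).
        if math.log(rng.random()) < log_post(prop) - log_post(theta):
            theta = prop
        chain.append(theta)
    return chain

chain = metropolis_hastings(20000)
mean = sum(chain[5000:]) / len(chain[5000:])   # discard burn-in
```

The imprecise-probability extension in the paper would, roughly, repeat such runs over sets of priors and likelihoods rather than a single one.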

  4. Selection of risk reduction portfolios under interval-valued probabilities

    International Nuclear Information System (INIS)

    Toppila, Antti; Salo, Ahti

    2017-01-01

    A central problem in risk management is that of identifying the optimal combination (or portfolio) of improvements that enhance the reliability of the system most through reducing failure event probabilities, subject to the availability of resources. This optimal portfolio can be sensitive with regard to epistemic uncertainties about the failure events' probabilities. In this paper, we develop an optimization model to support the allocation of resources to improvements that mitigate risks in coherent systems in which interval-valued probabilities defined by lower and upper bounds are employed to capture epistemic uncertainties. Decision recommendations are based on portfolio dominance: a resource allocation portfolio is dominated if there exists another portfolio that improves system reliability (i) at least as much for all feasible failure probabilities and (ii) strictly more for some feasible probabilities. Based on non-dominated portfolios, recommendations about improvements to implement are derived by inspecting in how many non-dominated portfolios a given improvement is contained. We present an exact method for computing the non-dominated portfolios. We also present an approximate method that simplifies the reliability function using total order interactions so that larger problem instances can be solved with reasonable computational effort. - Highlights: • Reliability allocation under epistemic uncertainty about probabilities. • Comparison of alternatives using dominance. • Computational methods for generating the non-dominated alternatives. • Deriving decision recommendations that are robust with respect to epistemic uncertainty.
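
    The dominance rule can be sketched for a toy series system: portfolio A dominates B if it yields at least as high a reliability for every probability vector in the interval box and strictly higher for at least one. As a simplification, the sketch checks only the corners of the box (adequate for this monotone toy, not a substitute for the exact method in the paper); the intervals and reduction amounts are invented.

```python
from itertools import product

def series_reliability(p_fail):
    """Series system: it works only if every component works."""
    r = 1.0
    for p in p_fail:
        r *= 1.0 - p
    return r

def dominates(port_a, port_b, intervals):
    """port_* give per-component failure-probability reductions."""
    strictly_better = False
    for corner in product(*intervals):
        ra = series_reliability([max(p - a, 0.0) for p, a in zip(corner, port_a)])
        rb = series_reliability([max(p - b, 0.0) for p, b in zip(corner, port_b)])
        if ra < rb:
            return False
        if ra > rb:
            strictly_better = True
    return strictly_better

intervals = [(0.05, 0.10), (0.02, 0.08)]           # epistemic bounds per component
a_dom = dominates([0.04, 0.0], [0.0, 0.01], intervals)   # A dominates B
b_dom = dominates([0.04, 0.0], [0.0, 0.04], intervals)   # neither dominates
```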

  5. Prospect evaluation as a function of numeracy and probability denominator.

    Science.gov (United States)

    Millroth, Philip; Juslin, Peter

    2015-05-01

    This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    Full Text Available The purpose of this article is to give a new approach to calculating the probability of returning a loan. Many factors affect the value of this probability, and in this article several influencing factors are identified using statistical and econometric models. The main approach applies probit and logit models to loan management institutions, giving a new aspect to credit risk analysis. Calculating the probability of returning a loan is a difficult task. We assume that data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower (month of birth, year of birth (age), gender, region where he/she lives) may serve as independent variables in a binary logistic model whose dependent variable is the probability of returning a loan. It is shown that the month of signing a contract, the year of signing a contract, and the gender and age of the loan owner do not affect the probability of returning a loan, while the sum of the contract, the remoteness of the loan owner, and the month of birth do. The probability of returning a loan increases with the given sum, decreases with the proximity of the customer, and is higher for people born at the beginning of the year than for those born at the end of the year.
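
    Once a logit model like the one above is fitted, scoring a borrower is just the logistic transform of a linear predictor. The intercept and coefficients below are made up for illustration; they are not the estimates from the article.

```python
import math

# Converting a fitted logit linear predictor into a probability of
# returning a loan: P = 1 / (1 + exp(-(b0 + sum(b_i * x_i)))).

def logit_probability(intercept, coefs, x):
    z = intercept + sum(b * v for b, v in zip(coefs, x))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical predictors: contract sum (thousands), remoteness (km),
# month of birth (1-12); hypothetical coefficients.
p = logit_probability(-0.5, [0.02, -0.01, -0.03], [10.0, 50.0, 3.0])
```

With a zero linear predictor the model returns 0.5, the logistic curve's midpoint.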

  7. On the probability of occurrence of rogue waves

    Directory of Open Access Journals (Sweden)

    E. M. Bitner-Gregersen

    2012-03-01

    Full Text Available A number of extreme and rogue wave studies have been conducted theoretically, numerically, experimentally and based on field data in recent years, which have significantly advanced our knowledge of ocean waves. So far, however, consensus on the probability of occurrence of rogue waves has not been achieved. The present investigation addresses this topic from the perspective of design needs. The probability of occurrence of extreme and rogue wave crests in deep water is discussed here based on higher-order time simulations, experiments and hindcast data. Focus is given to the occurrence of rogue waves in high sea states.

  8. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another
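
    The a-priori dice example from the lecture notes can be computed by direct enumeration: with two fair dice every one of the 36 ordered outcomes is equally likely, so the probability of any specified event is simply (favourable outcomes) / 36.

```python
from fractions import Fraction
from itertools import product

# Enumerate all ordered outcomes of rolling two fair dice and compute the
# a-priori probability of a specified event (here, the faces summing to 7).
outcomes = list(product(range(1, 7), repeat=2))
p_sum_7 = Fraction(sum(1 for a, b in outcomes if a + b == 7), len(outcomes))
# p_sum_7 == Fraction(1, 6)
```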

  9. Probability of foliar injury for Acer sp. based on foliar fluoride concentrations.

    Science.gov (United States)

    McDonough, Andrew M; Dixon, Murray J; Terry, Debbie T; Todd, Aaron K; Luciani, Michael A; Williamson, Michele L; Roszak, Danuta S; Farias, Kim A

    2016-12-01

    Fluoride is considered one of the most phytotoxic elements to plants, and indicative fluoride injury has been observed over a wide range of foliar fluoride concentrations. The aim of this study was to determine the probability of indicative foliar fluoride injury based on Acer sp. foliar fluoride concentrations using a logistic regression model. Foliage from Acer negundo, Acer saccharinum, Acer saccharum and Acer platanoides was collected along a distance gradient from three separate brick manufacturing facilities in southern Ontario as part of a long-term monitoring programme between 1995 and 2014. Hydrogen fluoride is the major emission source associated with the manufacturing facilities, resulting in highly elevated foliar fluoride close to the facilities that decreases with distance. Consistent with other studies, indicative fluoride injury was observed over a wide range of foliar concentrations (9.9–480.0 μg F⁻ g⁻¹). The logistic regression model was statistically significant for the Acer sp. group, A. negundo and A. saccharinum, with A. negundo being the most sensitive species in the group; A. saccharum and A. platanoides were not statistically significant within the model. We are unaware of published foliar fluoride values for Acer sp. within Canada, and this research provides policy makers and scientists with probabilities of indicative foliar injury for common urban Acer sp. trees that can help guide decisions about emissions controls. Further research should focus on the mechanisms driving indicative fluoride injury over wide-ranging foliar fluoride concentrations and help determine foliar fluoride thresholds for damage.
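
    A one-predictor logistic regression of the kind used here (injury probability versus foliar fluoride) can be fitted by plain gradient ascent on the log-likelihood. The tiny data set below is synthetic and the fit is a sketch; the paper's actual model and coefficients are not reproduced.

```python
import math

# Fit P(injury) = logistic(b0 + b1 * x), x = log10 foliar fluoride,
# by gradient ascent on the Bernoulli log-likelihood.

def fit_logistic(xs, ys, lr=0.1, steps=5000):
    b0, b1 = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += (y - p) / n
            g1 += (y - p) * x / n
        b0 += lr * g0
        b1 += lr * g1
    return b0, b1

# Synthetic data: injury becomes likely as log10(F) rises past ~2 (100 ug/g).
xs = [1.0, 1.3, 1.7, 2.0, 2.3, 2.7]   # log10 foliar fluoride
ys = [0, 0, 0, 1, 1, 1]               # indicative injury observed?
b0, b1 = fit_logistic(xs, ys)
p_at_200 = 1.0 / (1.0 + math.exp(-(b0 + b1 * math.log10(200.0))))
```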

  10. Domestic wells have high probability of pumping septic tank leachate

    Science.gov (United States)

    Bremer, J. E.; Harter, T.

    2012-08-01

    Onsite wastewater treatment systems are common in rural and semi-rural areas around the world; in the US, about 25-30% of households are served by a septic (onsite) wastewater treatment system, and many property owners also operate their own domestic well nearby. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. In areas with small lots (thus high spatial septic system densities), shallow domestic wells are prone to contamination by septic system leachate. Mass balance approaches have been used to determine a maximum septic system density that would prevent contamination of groundwater resources. In this study, a source area model based on detailed groundwater flow and transport modeling is applied for a stochastic analysis of domestic well contamination by septic leachate. Specifically, we determine the probability that a source area overlaps with a septic system drainfield as a function of aquifer properties, septic system density and drainfield size. We show that high spatial septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We find that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and those that are harmful even at low concentrations (e.g., pathogens).
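
    A back-of-envelope stand-in for the overlap probability: if drainfields were scattered like a spatial Poisson process with density lam (per m²), the chance that a well's source area A (m²) overlaps at least one of them would be 1 - exp(-lam * A). This Boolean-model shortcut ignores the detailed groundwater flow and transport modeling the study actually uses, and the numbers are invented.

```python
import math

# Probability that a well's source area intersects at least one drainfield,
# assuming drainfield centers follow a spatial Poisson process.

def overlap_probability(density_per_m2, source_area_m2):
    return 1.0 - math.exp(-density_per_m2 * source_area_m2)

# E.g., one septic system per 4000 m^2 (small lots) and a 2000 m^2 source area:
p = overlap_probability(1.0 / 4000.0, 2000.0)   # about 0.39
```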

  11. Domestic wells have high probability of pumping septic tank leachate

    Directory of Open Access Journals (Sweden)

    J. E. Bremer

    2012-08-01

    Full Text Available Onsite wastewater treatment systems are common in rural and semi-rural areas around the world; in the US, about 25–30% of households are served by a septic (onsite) wastewater treatment system, and many property owners also operate their own domestic well nearby. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. In areas with small lots (thus high spatial septic system densities), shallow domestic wells are prone to contamination by septic system leachate. Mass balance approaches have been used to determine a maximum septic system density that would prevent contamination of groundwater resources. In this study, a source area model based on detailed groundwater flow and transport modeling is applied for a stochastic analysis of domestic well contamination by septic leachate. Specifically, we determine the probability that a source area overlaps with a septic system drainfield as a function of aquifer properties, septic system density and drainfield size. We show that high spatial septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We find that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and those that are harmful even at low concentrations (e.g., pathogens).

  12. Use of probability tables for propagating uncertainties in neutronics

    International Nuclear Information System (INIS)

    Coste-Delclaux, M.; Diop, C.M.; Lahaye, S.

    2017-01-01

    Highlights: • Moment-based probability table formalism is described. • Representation by probability tables of any uncertainty distribution is established. • Multiband equations for two kinds of uncertainty propagation problems are solved. • Numerical examples are provided and validated against Monte Carlo simulations. - Abstract: Probability tables are a generic tool that allows representing any random variable whose probability density function is known. In the field of nuclear reactor physics, this tool is currently used to represent the variation of cross-sections versus energy (neutron transport codes TRIPOLI4®, MCNP, APOLLO2, APOLLO3®, ECCO/ERANOS…). In the present article we show how we can propagate uncertainties, thanks to a probability table representation, through two simple physical problems: an eigenvalue problem (neutron multiplication factor) and a depletion problem.
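
    The core idea can be sketched with a minimal "probability table": a discrete set of (probability, value) bands standing in for a random variable. The two-point table below preserves the first two moments of the input distribution exactly, a toy version of the moment-based tables described in the article; real multiband tables carry many more bands.

```python
# A two-point probability table matching a given mean and standard deviation,
# and its propagation through a function of the random variable.

def two_point_table(mean, std):
    """Two equiprobable points at mean +/- std reproduce mean and variance."""
    return [(0.5, mean - std), (0.5, mean + std)]

def propagate(table, f):
    """Expectation of f(x) under the discrete table."""
    return sum(p * f(x) for p, x in table)

table = two_point_table(mean=2.0, std=0.3)
m1 = propagate(table, lambda x: x)       # = 2.0 (mean preserved)
m2 = propagate(table, lambda x: x * x)   # = 4.09 (mean^2 + variance)
```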

  13. Probability of detection of clinical seizures using heart rate changes.

    Science.gov (United States)

    Osorio, Ivan; Manly, B F J

    2015-08-01

    Heart rate-based seizure detection is a viable complement or alternative to ECoG/EEG. This study investigates the role of various biological factors on the probability of clinical seizure detection using heart rate. Regression models were applied to 266 clinical seizures recorded from 72 subjects to investigate whether factors such as age, gender, years with epilepsy, etiology, seizure site origin, seizure class, and data collection centers, among others, shape the probability of EKG-based seizure detection. Clinical seizure detection probability based on heart rate changes is significantly shaped by several of these factors: the probability of detecting clinical seizures (>0.8 in the majority of subjects) using heart rate is highest for complex partial seizures, increases with a patient's years with epilepsy, is lower for females than for males, and is unrelated to the side of hemisphere origin. Clinical seizure detection probability using heart rate is multi-factorially dependent and sufficiently high (>0.8) in most cases to be clinically useful. Knowledge of the role that these factors play in shaping said probability will enhance its applicability and usefulness. Heart rate is a reliable and practical signal for extra-cerebral detection of clinical seizures originating from or spreading to central autonomic network structures. Copyright © 2015 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.

  14. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    Science.gov (United States)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-10-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions. This capability would provide a needed input for estimating the success rate of any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, has the capability of estimating the probability of failure of components under varying loading and environmental conditions. This code performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses the risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, and on a user-provided fault tree and modes of failure. This paper briefly describes the capabilities of NESTEM, QRAS, and the interface, and illustrates the stepwise process the interface uses with an example.

  15. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    Science.gov (United States)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-01-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions. This capability would provide a needed input for estimating the success rate of any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, has the capability of estimating the probability of failure of components under varying loading and environmental conditions. This code performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses the risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, and on a user-provided fault tree and modes of failure. This paper briefly describes the capabilities of NESTEM, QRAS, and the interface, and illustrates the stepwise process the interface uses with an example.

  16. Wald Sequential Probability Ratio Test for Space Object Conjunction Assessment

    Science.gov (United States)

    Carpenter, James R.; Markley, F Landis

    2014-01-01

    This paper shows how satellite owner/operators may use sequential estimates of collision probability, along with a prior assessment of the base risk of collision, in a compound hypothesis ratio test to inform decisions concerning collision risk mitigation maneuvers. The compound hypothesis test reduces to a simple probability ratio test, which appears to be a novel result. The test satisfies tolerances related to targeted false alarm and missed detection rates. This result is independent of the method one uses to compute the probability density that one integrates to compute collision probability. A well-established test case from the literature shows that this test yields acceptable results within the constraints of a typical operational conjunction assessment decision timeline. Another example illustrates the use of the test in a practical conjunction assessment scenario based on operations of the International Space Station.
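
    The underlying mechanism can be sketched with the textbook Wald SPRT for Bernoulli data, deciding between two hypothesized per-trial event probabilities p1 and p2 under targeted false-alarm (alpha) and missed-detection (beta) rates. This is the generic test the paper's compound test reduces to, not the paper's specific conjunction-assessment formulation; all numbers are illustrative.

```python
import math

# Wald sequential probability ratio test on a stream of Bernoulli outcomes.

def sprt(observations, p1, p2, alpha=0.01, beta=0.01):
    upper = math.log((1 - beta) / alpha)   # accept H2 at or above this
    lower = math.log(beta / (1 - alpha))   # accept H1 at or below this
    llr = 0.0
    for x in observations:                 # x is 1 (event) or 0 (no event)
        llr += x * math.log(p2 / p1) + (1 - x) * math.log((1 - p2) / (1 - p1))
        if llr >= upper:
            return "accept H2"
        if llr <= lower:
            return "accept H1"
    return "continue sampling"

decision_hit = sprt([1] * 7, p1=0.1, p2=0.6)    # evidence favors H2 quickly
decision_miss = sprt([0] * 7, p1=0.1, p2=0.6)   # evidence favors H1
```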

  17. Dopaminergic Drug Effects on Probability Weighting during Risky Decision Making.

    Science.gov (United States)

    Ojala, Karita E; Janssen, Lieneke K; Hashemi, Mahur M; Timmer, Monique H M; Geurts, Dirk E M; Ter Huurne, Niels P; Cools, Roshan; Sescousse, Guillaume

    2018-01-01

    Dopamine has been associated with risky decision-making, as well as with pathological gambling, a behavioral addiction characterized by excessive risk-taking behavior. However, the specific mechanisms through which dopamine might act to foster risk-taking and pathological gambling remain elusive. Here we test the hypothesis that this might be achieved, in part, via modulation of subjective probability weighting during decision making. Human healthy controls (n = 21) and pathological gamblers (n = 16) played a decision-making task involving choices between sure monetary options and risky gambles both in the gain and loss domains. Each participant played the task twice, either under placebo or the dopamine D2/D3 receptor antagonist sulpiride, in a double-blind counterbalanced design. A prospect theory modelling approach was used to estimate subjective probability weighting and sensitivity to monetary outcomes. Consistent with prospect theory, we found that participants presented a distortion in the subjective weighting of probabilities, i.e., they overweighted low probabilities and underweighted moderate to high probabilities, both in the gain and loss domains. Compared with placebo, sulpiride attenuated this distortion in the gain domain. Across drugs, the groups did not differ in their probability weighting, although gamblers consistently underweighted losing probabilities in the placebo condition. Overall, our results reveal that dopamine D2/D3 receptor antagonism modulates the subjective weighting of probabilities in the gain domain, in the direction of more objective, economically rational decision making.
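
    The distortion described here is commonly parameterized with the one-parameter Tversky-Kahneman weighting function from cumulative prospect theory: gamma < 1 overweights small probabilities and underweights moderate-to-large ones. gamma = 0.6 below is an illustrative value, not an estimate from these data.

```python
# Tversky-Kahneman probability weighting: w(p) = p^g / (p^g + (1-p)^g)^(1/g).

def tk_weight(p, gamma):
    return p ** gamma / (p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)

w_low = tk_weight(0.05, gamma=0.6)    # overweighted: w(0.05) > 0.05
w_high = tk_weight(0.80, gamma=0.6)   # underweighted: w(0.80) < 0.80
```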

  18. Dopaminergic Drug Effects on Probability Weighting during Risky Decision Making

    Science.gov (United States)

    Timmer, Monique H. M.; ter Huurne, Niels P.

    2018-01-01

    Abstract Dopamine has been associated with risky decision-making, as well as with pathological gambling, a behavioral addiction characterized by excessive risk-taking behavior. However, the specific mechanisms through which dopamine might act to foster risk-taking and pathological gambling remain elusive. Here we test the hypothesis that this might be achieved, in part, via modulation of subjective probability weighting during decision making. Human healthy controls (n = 21) and pathological gamblers (n = 16) played a decision-making task involving choices between sure monetary options and risky gambles both in the gain and loss domains. Each participant played the task twice, either under placebo or the dopamine D2/D3 receptor antagonist sulpiride, in a double-blind counterbalanced design. A prospect theory modelling approach was used to estimate subjective probability weighting and sensitivity to monetary outcomes. Consistent with prospect theory, we found that participants presented a distortion in the subjective weighting of probabilities, i.e., they overweighted low probabilities and underweighted moderate to high probabilities, both in the gain and loss domains. Compared with placebo, sulpiride attenuated this distortion in the gain domain. Across drugs, the groups did not differ in their probability weighting, although gamblers consistently underweighted losing probabilities in the placebo condition. Overall, our results reveal that dopamine D2/D3 receptor antagonism modulates the subjective weighting of probabilities in the gain domain, in the direction of more objective, economically rational decision making. PMID:29632870

  19. Probability Distribution of Long-run Indiscriminate Felling of Trees in ...

    African Journals Online (AJOL)

    Bright

    conditionally independent of every prior state given the current state (Obodos, ... of events or experiments in which the probability of occurrence for an event ... represent the exhaustive and mutually exclusive outcomes (states) of a system at.

  20. Sensitivity analysis of limit state functions for probability-based plastic design

    Science.gov (United States)

    Frangopol, D. M.

    1984-01-01

    The evaluation of the total probability of plastic collapse failure P_f for a highly redundant structure of random interdependent plastic moments acted on by random interdependent loads is a difficult and computationally very costly process. The evaluation of reasonable bounds on this probability requires the use of second-moment algebra involving many statistical parameters. A computer program which selects the best strategy for minimizing the interval between the upper and lower bounds of P_f is now in its final stage of development. The sensitivity of the resulting bounds of P_f to the various uncertainties involved in the computational process is analyzed. Response sensitivities for both mode and system reliability of an ideal plastic portal frame are shown.
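
    The simplest interval of the kind such a program narrows comes from first-order bounds on the union of collapse modes: the system probability lies between the largest single-mode probability and the sum of all mode probabilities (Boole's inequality). The mode probabilities below are invented for illustration.

```python
# First-order bounds on system collapse probability from individual
# collapse-mode probabilities.

mode_probs = [1e-4, 3e-4, 5e-5]           # P(failure) of each mechanism
lower_bound = max(mode_probs)             # P(union) >= max P_i
upper_bound = min(1.0, sum(mode_probs))   # P(union) <= sum P_i (union bound)
```

Second-moment methods use mode correlations to tighten this interval well beyond these crude limits.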

  1. Probability of inadvertent operation of electrical components in harsh environments

    International Nuclear Information System (INIS)

    Knoll, A.

    1989-01-01

    Harsh environment, which here means humidity and high temperature, may and will affect unsealed electrical components by causing leakage ground currents in ungrounded direct-current systems. The concern in a nuclear power plant is that such harsh environment conditions could cause inadvertent operation of normally deenergized components, which may have a safety-related isolation function. Harsh environment is a common cause failure, and one way to approach the problem is to assume that all unsealed electrical components will simultaneously and inadvertently energize as a result of the environmental common cause failure. This assumption is unrealistically conservative. Test results indicated that the insulation resistances of any terminal block in harsh environments have a random distribution in the range of 1 to 270 kΩ, with a mean value of ∼59 kΩ. The objective of this paper is to evaluate a realistic conditional failure probability for inadvertent operation of electrical components in harsh environments. This value will thereafter be used in probabilistic safety evaluations of harsh environment events, replacing both the overconservative common cause probability of 1 and the random failure probability used for mild environments

  2. Urban seismic risk assessment: statistical repair cost data and probable structural losses based on damage scenario—correlation analysis

    Science.gov (United States)

    Eleftheriadou, Anastasia K.; Baltzopoulou, Aikaterini D.; Karabinis, Athanasios I.

    2016-06-01

    The current seismic risk assessment is based on two discrete approaches, actual and probable, with the produced results validated afterwards. In the first part of this research, the seismic risk is evaluated from the available data on the mean statistical repair/strengthening or replacement cost for the total number of damaged structures (180,427 buildings) after the 7/9/1999 Parnitha (Athens) earthquake. The actual evaluated seismic risk is then compared to the estimated probable structural losses, presented in the second part of the paper, based on a damage scenario for the same earthquake. The applied damage scenario is based on recently developed damage probability matrices (DPMs) from the Athens (Greece) damage database. The seismic risk estimation refers to 750,085 buildings situated in the extended urban region of Athens. The building exposure is categorized into five typical structural types and represents 18.80 % of the entire building stock in Greece, according to the 2000-2001 census of the National Statistics Service of Greece (NSSG). The seismic input is characterized by the ratio a_g/a_o, where a_g is the regional peak ground acceleration (PGA) evaluated from the earlier estimated research macroseismic intensities, and a_o is the PGA according to the hazard map of the 2003 Greek Seismic Code. Finally, the collected financial data from the different national services responsible for post-earthquake crisis management, concerning repair/strengthening or replacement costs and other categories of costs for the rehabilitation of earthquake victims (construction and operation of settlements for the earthquake homeless, rent support, demolitions, shorings), are used to determine the final total seismic risk factor.

  3. Recruitment in a Colorado population of big brown bats: Breeding probabilities, litter size, and first-year survival

    Science.gov (United States)

    O'Shea, T.J.; Ellison, L.E.; Neubaum, D.J.; Neubaum, M.A.; Reynolds, C.A.; Bowen, R.A.

    2010-01-01

    We used mark–recapture estimation techniques and radiography to test hypotheses about 3 important aspects of recruitment in big brown bats (Eptesicus fuscus) in Fort Collins, Colorado: adult breeding probabilities, litter size, and 1st-year survival of young. We marked 2,968 females with passive integrated transponder (PIT) tags at multiple sites during 2001-2005 and based our assessments on direct recaptures (breeding probabilities) and passive detection with automated PIT tag readers (1st-year survival). We interpreted our data in relation to hypotheses regarding demographic influences of bat age, roost, and effects of years with unusual environmental conditions: extreme drought (2002) and arrival of a West Nile virus epizootic (2003). Conditional breeding probabilities at 6 roosts sampled in 2002-2005 were estimated as 0.64 (95% confidence interval [95% CI] = 0.53–0.73) in 1-year-old females, but were consistently high (95% CI = 0.94–0.96) and did not vary by roost, year, or prior-year breeding status in older adults. Mean litter size was 1.11 (95% CI = 1.05–1.17), based on examination of 112 pregnant females by radiography. Litter size was not higher in older or larger females and was similar to results of other studies in western North America despite wide variation in latitude. First-year survival was estimated as 0.67 (95% CI = 0.61–0.73) for weaned females at 5 maternity roosts over 5 consecutive years, was lower than adult survival (0.79; 95% CI = 0.77–0.81), and varied by roost. Based on model selection criteria, strong evidence exists for complex roost and year effects on 1st-year survival. First-year survival was lowest in bats born during the drought year. Juvenile females that did not return to roosts as 1-year-olds had lower body condition indices in late summer of their natal year than those known to survive. © 2009 American Society of Mammalogists.

  4. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  5. The probability of containment failure by direct containment heating in Zion

    International Nuclear Information System (INIS)

    Pilch, M.M.; Yan, H.; Theofanous, T.G.

    1994-12-01

This report is the first step in the resolution of the Direct Containment Heating (DCH) issue for the Zion Nuclear Power Plant using the Risk Oriented Accident Analysis Methodology (ROAAM). This report includes the definition of a probabilistic framework that decomposes the DCH problem into three probability density functions that reflect the most uncertain initial conditions (UO₂ mass, zirconium oxidation fraction, and steel mass). Uncertainties in the initial conditions are significant, but our quantification approach is based on establishing reasonable bounds that are not unnecessarily conservative. To this end, we also make use of the ROAAM ideas of enveloping scenarios and ''splintering.'' Two causal relations (CRs) are used in this framework: CR1 is a model that calculates the peak pressure in the containment as a function of the initial conditions, and CR2 is a model that returns the frequency of containment failure as a function of pressure within the containment. Uncertainty in CR1 is accounted for by the use of two independently developed phenomenological models, the Convection Limited Containment Heating (CLCH) model and the Two-Cell Equilibrium (TCE) model, and by probabilistically distributing the key parameter in both, which is the ratio of the melt entrainment time to the system blowdown time constant. The two phenomenological models have been compared with an extensive database including recent integral simulations at two different physical scales. The containment load distributions do not intersect the containment strength (fragility) curve in any significant way, resulting in containment failure probabilities less than 10⁻³ for all scenarios considered. Sensitivity analyses did not show any areas of large sensitivity.
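The final step of a framework like this, propagating a load distribution (CR1) through a fragility curve (CR2), can be sketched with a small Monte Carlo calculation. All distribution parameters below are illustrative placeholders, not values from the Zion analysis.

```python
import math
import random

random.seed(0)

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def fragility(p_mpa, median=1.0, beta=0.3):
    # CR2 analogue: conditional containment failure frequency at peak
    # pressure p_mpa, modeled as a lognormal fragility curve.
    return norm_cdf(math.log(p_mpa / median) / beta)

# CR1 analogue: sample peak containment pressures from an assumed lognormal
# load distribution centred well below the fragility median.
loads = [random.lognormvariate(math.log(0.4), 0.25) for _ in range(100_000)]

# Failure probability = expectation of the fragility over the load distribution.
p_fail = sum(fragility(p) for p in loads) / len(loads)
print(f"estimated containment failure probability: {p_fail:.2e}")
```

The key point of the calculation is that failure probability stays small whenever the bulk of the load distribution sits well below the fragility median, mirroring the report's conclusion that the two curves barely intersect.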

  6. Prioritizing forest fuels treatments based on the probability of high-severity fire restores adaptive capacity in Sierran forests.

    Science.gov (United States)

    Krofcheck, Daniel J; Hurteau, Matthew D; Scheller, Robert M; Loudermilk, E Louise

    2018-02-01

    In frequent fire forests of the western United States, a legacy of fire suppression coupled with increases in fire weather severity have altered fire regimes and vegetation dynamics. When coupled with projected climate change, these conditions have the potential to lead to vegetation type change and altered carbon (C) dynamics. In the Sierra Nevada, fuels reduction approaches that include mechanical thinning followed by regular prescribed fire are one approach to restore the ability of the ecosystem to tolerate episodic fire and still sequester C. Yet, the spatial extent of the area requiring treatment makes widespread treatment implementation unlikely. We sought to determine if a priori knowledge of where uncharacteristic wildfire is most probable could be used to optimize the placement of fuels treatments in a Sierra Nevada watershed. We developed two treatment placement strategies: the naive strategy, based on treating all operationally available area and the optimized strategy, which only treated areas where crown-killing fires were most probable. We ran forecast simulations using projected climate data through 2,100 to determine how the treatments differed in terms of C sequestration, fire severity, and C emissions relative to a no-management scenario. We found that in both the short (20 years) and long (100 years) term, both management scenarios increased C stability, reduced burn severity, and consequently emitted less C as a result of wildfires than no-management. Across all metrics, both scenarios performed the same, but the optimized treatment required significantly less C removal (naive=0.42 Tg C, optimized=0.25 Tg C) to achieve the same treatment efficacy. Given the extent of western forests in need of fire restoration, efficiently allocating treatments is a critical task if we are going to restore adaptive capacity in frequent-fire forests. © 2017 John Wiley & Sons Ltd.

  7. Clinical Features in a Danish Population-Based Cohort of Probable Multiple System Atrophy Patients

    DEFF Research Database (Denmark)

    Starhof, Charlotte; Korbo, Lise; Lassen, Christina Funch

    2016-01-01

    Background: Multiple system atrophy (MSA) is a rare, sporadic and progressive neurodegenerative disorder. We aimed to describe the clinical features of Danish probable MSA patients, evaluate their initial response to dopaminergic therapy and examine mortality. Methods: From the Danish National...... the criteria for probable MSA. We recorded clinical features, examined differences by MSA subtype and used Kaplan-Meier survival analysis to examine mortality. Results: The mean age at onset of patients with probable MSA was 60.2 years (range 36-75 years) and mean time to wheelchair dependency was 4.7 years...

  8. Catalog of 199 register-based definitions of chronic conditions

    DEFF Research Database (Denmark)

    Hvidberg, Michael F; Johnsen, Søren P; Glümer, Charlotte

    2016-01-01

INTRODUCTION: The aim of the current study was to present and discuss a broad range of register-based definitions of chronic conditions for use in register research, as well as the challenges and pitfalls when defining chronic conditions by the use of registers. MATERIALS AND METHODS: The definitions were based on information from nationwide Danish public healthcare registers. Medical and epidemiological specialists identified and grouped relevant diagnosis codes that covered chronic conditions, using the International Classification of Diseases version 10 (ICD-10). Where relevant, definitions were proposed based on record linkage between multiple registers, including registers of prescribed drugs and use of general practitioners' services. CONCLUSIONS: This study provided a catalog of register-based definitions of chronic conditions for use in healthcare planning and research, which is...

  9. Probability- and curve-based fractal reconstruction on 2D DEM terrain profile

    International Nuclear Information System (INIS)

    Lai, F.-J.; Huang, Y.M.

    2009-01-01

Data compression and reconstruction play important roles in information science and engineering. Among these, image compression and reconstruction, which mainly deal with reducing image data sets for storage or transmission and restoring them with the least loss, remain topics deserving a great deal of attention. In this paper we propose a new scheme, compared with the well-known Improved Douglas-Peucker (IDP) method, for extracting characteristic or feature points of a two-dimensional digital elevation model (2D DEM) terrain profile to compress the data set. For reconstruction using fractal interpolation, we propose a probability-based method to speed up the fractal interpolation execution to a rate as high as three or even nine times the regular one. In addition, a curve-based method is proposed to determine the vertical scaling factor, which strongly affects the generation of the interpolated data points, to significantly improve the reconstruction performance. Finally, an evaluation is made to show the advantage of employing the proposed method to extract characteristic points in association with our novel fractal interpolation scheme.

  10. Statistical conditional sampling for variable-resolution video compression.

    Directory of Open Access Journals (Sweden)

    Alexander Wong

Full Text Available In this study, we investigate a variable-resolution approach to video compression based on conditional random field (CRF) modeling and statistical conditional sampling in order to further improve the compression rate while maintaining high-quality video. In the proposed approach, representative key-frames within a video shot are identified and stored at full resolution. The remaining frames within the video shot are stored and compressed at a reduced resolution. At the decompression stage, a region-based dictionary is constructed from the key-frames and used to restore the reduced-resolution frames to the original resolution via statistical conditional sampling. The sampling approach is based on the conditional probability of the CRF model, using the constructed dictionary. Experimental results show that the proposed variable-resolution approach via statistical conditional sampling has potential for improving compression rates when compared to compressing the video at full resolution, while achieving higher video quality when compared to compressing the video at reduced resolution.

  11. On estimating the fracture probability of nuclear graphite components

    International Nuclear Information System (INIS)

    Srinivasan, Makuteswara

    2008-01-01

The properties of nuclear grade graphites exhibit anisotropy and can vary considerably within a manufactured block. Graphite strength is affected by the direction of alignment of the constituent coke particles, which is dictated by the forming method, coke particle size, and the size, shape, and orientation distribution of pores in the structure. In this paper, a Weibull failure probability analysis for components is presented using the American Society for Testing and Materials strength specification for nuclear grade graphites for core components in advanced high-temperature gas-cooled reactors. The risk of rupture (probability of fracture) and survival probability (reliability) of large graphite blocks are calculated for varying and discrete values of service tensile stresses. The limitations of these calculations are discussed in light of actual reactor environmental conditions that could potentially degrade the specification properties because of damage due to complex interactions between irradiation, temperature, stress, and variability in reactor operation.
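The risk-of-rupture calculation described above follows the standard two-parameter Weibull form, P_f = 1 − exp(−(σ/σ₀)^m). A minimal sketch, with a characteristic strength and Weibull modulus chosen purely for illustration (not the ASTM specification values):

```python
import math

def weibull_failure_probability(stress_mpa, sigma0_mpa, m):
    """Two-parameter Weibull risk of rupture: P_f = 1 - exp(-(s/s0)^m)."""
    return 1.0 - math.exp(-((stress_mpa / sigma0_mpa) ** m))

# Illustrative values only: characteristic strength 20 MPa, Weibull modulus 8.
for stress in (5.0, 10.0, 15.0):
    pf = weibull_failure_probability(stress, sigma0_mpa=20.0, m=8.0)
    print(f"stress {stress:4.1f} MPa: risk of rupture {pf:.3e}, "
          f"reliability {1.0 - pf:.6f}")
```

A high Weibull modulus (narrow strength scatter) makes the failure probability drop steeply below the characteristic strength, which is why service stresses well under σ₀ give very high reliability.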

  12. Personnel reliability impact on petrochemical facilities monitoring system's failure skipping probability

    Science.gov (United States)

    Kostyukov, V. N.; Naumenko, A. P.

    2017-08-01

The paper dwells upon the urgent issue of evaluating the impact of actions by operators of complex technological systems on their safe operation, considering the application of condition monitoring systems to elements and subsystems of petrochemical production facilities. The main task of the research is to distinguish the factors and criteria describing monitoring system properties that allow the impact of personnel errors on the operation of real-time condition monitoring and diagnostic systems for petrochemical machinery to be evaluated, and to find objective criteria for classifying monitoring systems with the human factor taken into account. On the basis of the real-time condition monitoring concepts of sudden failure skipping risk and of static and dynamic monitoring system errors, one may solve the task of evaluating the impact of personnel qualification on monitoring system operation in terms of errors in operators' actions while receiving information from monitoring systems and operating a technological system. The operator is considered part of the technological system. Personnel behavior is usually a combination of the following parameters: input signal (information perception), reaction (decision making), and response (decision implementation). Based on several studies of the behavior of nuclear power station operators in the USA, Italy and other countries, as well as on research conducted by Russian scientists, the data on operator reliability required for analyzing operator behavior with technological facility diagnostics and monitoring systems were selected.
The calculations revealed that, for the monitoring system selected as an example, the failure skipping risk for the set values of the static (less than 0.01) and dynamic (less than 0.001) errors, considering all related factors of data on reliability of information perception, decision making and reaction, is 0.037 in the case when all the facilities and error probability are under

  13. Development of a methodology for probable maximum precipitation estimation over the American River watershed using the WRF model

    Science.gov (United States)

    Tan, Elcin

A new physically-based methodology for probable maximum precipitation (PMP) estimation is developed over the American River Watershed (ARW) using the Weather Research and Forecast (WRF-ARW) model. A persistent moisture flux convergence pattern, called the Pineapple Express, is analyzed for 42 historical extreme precipitation events, and it is found that the Pineapple Express causes extreme precipitation over the basin of interest. An average correlation between moisture flux convergence and maximum precipitation is estimated as 0.71 for the 42 events. The performance of the WRF model is verified for precipitation by means of calibration and independent validation of the model. The calibration procedure is performed only for the first-ranked flood event, the 1997 case, whereas the WRF model is validated for 42 historical cases. Three nested model domains are set up with horizontal resolutions of 27 km, 9 km, and 3 km over the basin of interest. As a result of Chi-square goodness-of-fit tests, the hypothesis that "the WRF model can be used in the determination of PMP over the ARW for both areal average and point estimates" is accepted at the 5% level of significance. The sensitivities of model physics options on precipitation are determined using 28 combinations of microphysics, atmospheric boundary layer, and cumulus parameterization schemes. It is concluded that the best triplet option is Thompson microphysics, Grell 3D ensemble cumulus, and YSU boundary layer (TGY), based on the 42 historical cases, and this TGY triplet is used for all analyses of this research. Four techniques are proposed to evaluate physically possible maximum precipitation using the WRF: 1. Perturbations of atmospheric conditions; 2. Shift in atmospheric conditions; 3. Replacement of atmospheric conditions among historical events; and 4. Thermodynamically possible worst-case scenario creation.
Moreover, climate change effect on precipitation is discussed by emphasizing temperature increase in order to determine the

  14. Physically based probability criterion for exceeding radionuclide concentration limits in heterogeneous bedrock

    International Nuclear Information System (INIS)

    Worman, A.; Xu, S.; Dverstorp, B.

    2004-01-01

A significant problem in a risk analysis of the repository for high-level nuclear waste is to estimate the barrier effect of the geosphere. The significant spatial variability of the rock properties implies that migrating radionuclides (RNs) encounter a distribution of bedrock properties and mass-transfer mechanisms in different proportions along the transport paths. For practical reasons, we will never be able to know exactly this distribution of properties by performing a reasonable amount of measurements in a site investigation. On the contrary, recent experimental studies reveal that crystalline bedrock can possess a marked heterogeneity of various physical and geochemical properties that potentially may have a certain impact on the transport of RNs in fractured bedrock. Also current field investigation techniques provide only fragmentary information of the properties of the geosphere. This is a basic motivation for treating flows of water and solute elements in groundwaters by means of stochastic continuum models. The stochastic analysis is based on the idea that we know only certain point values of the property fields and use this information to estimate intermediate values. The probabilistic properties of the stochastic analysis are suitable input variables for risk analyses of the relevant sequence of extreme events for which empirical observations are rare or non-existing. The purpose of this paper is to outline the implications of the stochastic approach for estimating probabilities that certain concentration limits are exceeded at discharge points from the bedrock in case of a leakage from the waste repository. The analysis is restricted to the water flow and solute transport in the bedrock alone without consideration of the full sequence of events in a full risk analysis and the Bayesian statistics involved in such conditioned (and cross-correlated) event series. The focus is on the implication for the risk analysis of the auto-covariance structure in bedrock

  15. Bounds on survival probability given mean probability of failure per demand; and the paradoxical advantages of uncertainty

    International Nuclear Information System (INIS)

    Strigini, Lorenzo; Wright, David

    2014-01-01

    When deciding whether to accept into service a new safety-critical system, or choosing between alternative systems, uncertainty about the parameters that affect future failure probability may be a major problem. This uncertainty can be extreme if there is the possibility of unknown design errors (e.g. in software), or wide variation between nominally equivalent components. We study the effect of parameter uncertainty on future reliability (survival probability), for systems required to have low risk of even only one failure or accident over the long term (e.g. their whole operational lifetime) and characterised by a single reliability parameter (e.g. probability of failure per demand – pfd). A complete mathematical treatment requires stating a probability distribution for any parameter with uncertain value. This is hard, so calculations are often performed using point estimates, like the expected value. We investigate conditions under which such simplified descriptions yield reliability values that are sure to be pessimistic (or optimistic) bounds for a prediction based on the true distribution. Two important observations are (i) using the expected value of the reliability parameter as its true value guarantees a pessimistic estimate of reliability, a useful property in most safety-related decisions; (ii) with a given expected pfd, broader distributions (in a formally defined meaning of “broader”), that is, systems that are a priori “less predictable”, lower the risk of failures or accidents. Result (i) justifies the simplification of using a mean in reliability modelling; we discuss within which scope this justification applies, and explore related scenarios, e.g. how things improve if we can test the system before operation. Result (ii) not only offers more flexible ways of bounding reliability predictions, but also has important, often counter-intuitive implications for decision making in various areas, like selection of components, project management
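Observations (i) and (ii) both follow from the convexity of (1 − q)^n in q: by Jensen's inequality, E[(1 − Q)^n] ≥ (1 − E[Q])^n, so a point estimate at the mean pfd is pessimistic, and a broader distribution with the same mean predicts higher survival. A small numerical check with invented illustrative numbers:

```python
n = 1000  # demands over the period of interest

mean_pfd = 1e-3
# A "broad" two-point distribution with the same mean pfd: half the systems
# are fault-free, half have twice the mean pfd.
broad = [(0.5, 0.0), (0.5, 2e-3)]

# Survival probability predicted from the point estimate (mean pfd)...
surv_point = (1 - mean_pfd) ** n
# ...and from the full distribution: E[(1 - Q)^n].
surv_broad = sum(w * (1 - q) ** n for w, q in broad)

# Jensen's inequality: E[(1-Q)^n] >= (1 - E[Q])^n, so the point estimate is
# a pessimistic bound and the "less predictable" system is the safer bet.
print(f"point estimate: {surv_point:.3f}, broad distribution: {surv_broad:.3f}")
```

Here the point estimate predicts roughly e⁻¹ survival, while the broad distribution does markedly better because half its mass sits on fault-free systems that survive any number of demands.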

  16. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  17. Generating prior probabilities for classifiers of brain tumours using belief networks

    Directory of Open Access Journals (Sweden)

    Arvanitis Theodoros N

    2007-09-01

Full Text Available Abstract Background Numerous methods for classifying brain tumours based on magnetic resonance spectra and imaging have been presented in the last 15 years. Generally, these methods use supervised machine learning to develop a classifier from a database of cases for which the diagnosis is already known. However, little has been published on developing classifiers based on mixed modalities, e.g. combining imaging information with spectroscopy. In this work a method of generating probabilities of tumour class from anatomical location is presented. Methods The method of "belief networks" is introduced as a means of generating probabilities that a tumour is any given type. The belief networks are constructed using a database of paediatric tumour cases consisting of data collected over five decades; the problems associated with using this data are discussed. To verify the usefulness of the networks, an application of the method is presented in which prior probabilities were generated and combined with a classification of tumours based solely on MRS data. Results Belief networks were constructed from a database of over 1300 cases. These can be used to generate a probability that a tumour is any given type. Networks are presented for astrocytoma grades I and II, astrocytoma grades III and IV, ependymoma, pineoblastoma, primitive neuroectodermal tumour (PNET), germinoma, medulloblastoma, craniopharyngioma and a group representing rare tumours, "other". Using the network to generate prior probabilities for classification improves the accuracy when compared with generating prior probabilities based on class prevalence. Conclusion Bayesian belief networks are a simple way of using discrete clinical information to generate probabilities usable in classification. The belief network method can be robust to incomplete datasets. Inclusion of a priori knowledge is an effective way of improving classification of brain tumours by non-invasive methods.
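The combination step, folding location-derived priors into a classifier's output, is ordinary Bayesian updating. A toy sketch with made-up class names and numbers (not taken from the paper's database):

```python
# Hypothetical numbers throughout: priors from a belief network given the
# tumour's anatomical location, combined with likelihoods produced by an
# MRS-based classifier for the same case.
priors = {"astrocytoma": 0.50, "medulloblastoma": 0.35, "ependymoma": 0.15}
likelihoods = {"astrocytoma": 0.20, "medulloblastoma": 0.70, "ependymoma": 0.10}

# Bayes' rule: posterior is proportional to prior * likelihood,
# renormalised over the candidate classes.
unnorm = {c: priors[c] * likelihoods[c] for c in priors}
z = sum(unnorm.values())
posterior = {c: v / z for c, v in unnorm.items()}

for c, p in posterior.items():
    print(f"{c}: {p:.3f}")
```

With these numbers the location prior is not decisive on its own, but it shifts the posterior toward medulloblastoma once the classifier's likelihoods are folded in, which is the mechanism by which informative priors beat class-prevalence priors.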

  18. An attempt to model the probability of growth and aflatoxin B1 production of Aspergillus flavus under non-isothermal conditions in pistachio nuts.

    Science.gov (United States)

    Aldars-García, Laila; Ramos, Antonio J; Sanchis, Vicente; Marín, Sonia

    2015-10-01

Human exposure to aflatoxins in foods is of great concern. The aim of this work was to use predictive mycology as a strategy to mitigate the aflatoxin burden in pistachio nuts postharvest. The probability of growth and aflatoxin B1 (AFB1) production of aflatoxigenic Aspergillus flavus, isolated from pistachio nuts, under static and non-isothermal conditions was studied. Four theoretical temperature scenarios, including temperature levels observed in pistachio nuts during shipping and storage, were used. Two types of inoculum were included: a cocktail of 25 A. flavus isolates and a single isolate inoculum. Initial water activity was adjusted to 0.87. Logistic models, with temperature and time as explanatory variables, were fitted to the probability of growth and AFB1 production under a constant temperature. Subsequently, they were used to predict probabilities under non-isothermal scenarios, with levels of concordance from 90 to 100% in most of the cases. Furthermore, the presence of AFB1 in pistachio nuts could be correctly predicted in 70-81% of the cases from a growth model developed in pistachio nuts, and in 67-81% of the cases from an AFB1 model developed in pistachio agar. The information obtained in the present work could be used by producers and processors to predict the time for AFB1 production by A. flavus on pistachio nuts during transport and storage. Copyright © 2015 Elsevier Ltd. All rights reserved.
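The fitted models have the standard logistic form with temperature and time as explanatory variables, and non-isothermal predictions come from evaluating the model along the temperature trajectory. A sketch with invented coefficients (the study's fitted values are not reproduced here):

```python
import math

def p_growth(temp_c, time_d, b0=-8.0, b1=0.25, b2=0.12):
    """Logistic probability-of-growth model:
    P = 1 / (1 + exp(-(b0 + b1*T + b2*t))).
    Coefficients are illustrative, not the fitted values from the study."""
    z = b0 + b1 * temp_c + b2 * time_d
    return 1.0 / (1.0 + math.exp(-z))

# Evaluate along a simple non-isothermal storage scenario in which the
# temperature changes over the course of shipping and storage.
profile = [(20.0, 5), (25.0, 10), (30.0, 20)]  # (temperature degC, day)
for temp, day in profile:
    print(f"T = {temp:.0f} degC, day {day:2d}: P(growth) = {p_growth(temp, day):.2f}")
```

Warmer and longer conditions push the probability toward 1, which is why the predicted time to AFB1 production is the operationally useful output for producers and processors.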

  19. Uncertainty plus Prior Equals Rational Bias: An Intuitive Bayesian Probability Weighting Function

    Science.gov (United States)

    Fennell, John; Baddeley, Roland

    2012-01-01

    Empirical research has shown that when making choices based on probabilistic options, people behave as if they overestimate small probabilities, underestimate large probabilities, and treat positive and negative outcomes differently. These distortions have been modeled using a nonlinear probability weighting function, which is found in several…
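A common one-parameter form of such a nonlinear weighting function is Prelec's w(p) = exp(−(−ln p)^γ), which overweights small probabilities and underweights large ones; this specific form is a standard choice in the literature, not necessarily the function derived in the cited work.

```python
import math

def prelec_w(p, gamma=0.65):
    """One-parameter Prelec weighting function: w(p) = exp(-(-ln p)^gamma).
    For gamma < 1 it overweights small p and underweights large p, with a
    fixed point at p = 1/e."""
    return math.exp(-((-math.log(p)) ** gamma))

for p in (0.01, 0.10, 0.50, 0.90, 0.99):
    print(f"p = {p:.2f} -> w(p) = {prelec_w(p):.3f}")
```

The inverse-S shape this produces (steep near 0 and 1, shallow in the middle) is exactly the distortion pattern the empirical research describes.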

  20. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  1. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

The theory of probability has been debated for centuries: as far back as the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution" that is a function providing the probabilities of occurrence of different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied to statistical analysis.

  2. Probability distribution functions for intermittent scrape-off layer plasma fluctuations

    Science.gov (United States)

    Theodorsen, A.; Garcia, O. E.

    2018-03-01

A stochastic model for intermittent fluctuations in the scrape-off layer of magnetically confined plasmas has been constructed based on a superposition of uncorrelated pulses arriving according to a Poisson process. In the most common applications of the model, the pulse amplitudes are assumed to be exponentially distributed, supported by conditional averaging of large-amplitude fluctuations in experimental measurement data. This basic assumption has two potential limitations. First, statistical analysis of measurement data using conditional averaging only reveals the tail of the amplitude distribution to be exponentially distributed. Second, exponentially distributed amplitudes lead to a positive definite signal, which cannot capture fluctuations in, for example, electric potential and radial velocity. Assuming pulse amplitudes which are not positive definite often makes finding a closed form for the probability density function (PDF) difficult, even if the characteristic function remains relatively simple. Thus estimating model parameters requires an approach based on the characteristic function, not the PDF. In this contribution, the effect of changing the amplitude distribution on the moments, PDF and characteristic function of the process is investigated, and a parameter estimation method using the empirical characteristic function is presented and tested on synthetically generated data. This proves valuable for describing intermittent fluctuations of all plasma parameters in the boundary region of magnetized plasmas.
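The underlying process, a superposition of uncorrelated pulses with Poisson arrivals and exponentially distributed amplitudes, can be simulated directly. The sketch below uses a one-sided exponential pulse shape and arbitrary illustrative parameters; for this process the signal mean equals the intermittency parameter τ_d/τ_w times the mean amplitude.

```python
import math
import random

random.seed(1)

# Superposition of uncorrelated pulses arriving according to a Poisson
# process, with exponentially distributed amplitudes (mean 1) and a
# one-sided exponential pulse shape. Parameters are illustrative.
T = 1000.0        # total signal duration
tau_d = 1.0       # pulse duration time
tau_w = 5.0       # mean waiting time between pulses
dt = 0.1          # sampling step

n_steps = int(T / dt)
signal = [0.0] * n_steps

t = random.expovariate(1.0 / tau_w)   # first Poisson arrival time
while t < T:
    a = random.expovariate(1.0)       # exponentially distributed amplitude
    for k in range(math.ceil(t / dt), n_steps):
        decay = math.exp(-(k * dt - t) / tau_d)
        if decay < 1e-6:              # pulse has decayed away
            break
        signal[k] += a * decay
    t += random.expovariate(1.0 / tau_w)

mean = sum(signal) / n_steps
# For this process the mean equals (tau_d / tau_w) times the mean amplitude.
print(f"sample mean {mean:.3f}, theoretical mean {tau_d / tau_w:.3f}")
```

Synthetic signals generated this way are what parameter-estimation methods (for instance, fits to the empirical characteristic function) are typically tested against.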

  3. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...
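For the multinomial logit special case, the CPGF is G(v) = log Σᵢ exp(vᵢ), and differentiating with respect to vᵢ recovers the familiar logit choice probabilities. A minimal sketch, with a finite-difference check of the gradient property:

```python
import math

def cpgf(v):
    """Choice-probability generating function of the multinomial logit:
    G(v) = log(sum_i exp(v_i)), computed stably."""
    m = max(v)
    return m + math.log(sum(math.exp(x - m) for x in v))

def choice_probabilities(v):
    """Gradient of the CPGF: dG/dv_i = exp(v_i) / sum_j exp(v_j)."""
    m = max(v)
    exps = [math.exp(x - m) for x in v]
    z = sum(exps)
    return [e / z for e in exps]

v = [1.0, 2.0, 0.5]   # systematic utilities of three alternatives
p = choice_probabilities(v)

# Finite-difference check that the CPGF gradient gives the probabilities.
eps = 1e-6
for i in range(len(v)):
    bumped = v.copy()
    bumped[i] += eps
    approx = (cpgf(bumped) - cpgf(v)) / eps
    print(f"alternative {i}: p = {p[i]:.4f}, dG/dv = {approx:.4f}")
```

The same gradient relation holds for any CPGF, including the cross-nested logit models the paper uses to approximate arbitrary ARUM choice probabilities.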

  4. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  5. Probability dynamics of a repopulating tumor in case of fractionated external radiotherapy.

    Science.gov (United States)

    Stavreva, Nadia; Stavrev, Pavel; Fallone, B Gino

    2009-12-01

    In this work two analytical methods are developed for computing the probability distribution of the number of surviving cells of a repopulating tumor during a fractionated external radio-treatment. Both methods are developed for the case of pure birth processes. They both allow the description of the tumor dynamics in case of cell radiosensitivity changing in time and for treatment schedules with variable dose per fraction and variable time intervals between fractions. The first method is based on a direct solution of the set of differential equations describing the tumor dynamics. The second method is based on the works of Hanin et al. [Hanin LG, Zaider M, Yakovlev AY. Distribution of the number of clonogens surviving fractionated radiotherapy: a long-standing problem revisited. Int J Radiat Biol 2001;77:205-13; Hanin LG. Iterated birth and death process as a model of radiation cell survival. Math Biosci 2001;169:89-107; Hanin LG. A stochastic model of tumor response to fractionated radiation: limit theorems and rate of convergence. Math Biosci 2004;191:1-17], where probability generating functions are used. In addition a Monte Carlo algorithm for simulating the probability distributions is developed for the same treatment conditions as for the analytical methods. The probability distributions predicted by the three methods are compared graphically for a certain set of values of the model parameters and an excellent agreement is found to exist between all three results, thus proving the correct implementation of the methods. However, numerical difficulties have been encountered with both analytical methods depending on the values of the model parameters. Therefore, the Poisson approximation is also studied and it is compared to the exact methods for several different combinations of the model parameter values. 
It is concluded that the Poisson approximation works sufficiently well only for slowly repopulating tumors and a low cell survival probability and that it
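The Monte Carlo side of such a calculation can be sketched as a pure-birth process with binomial cell kill at each fraction. The sketch below is illustrative only, with invented parameter values, and is not the authors' implementation:

```python
import random

def simulate_fractionated_course(n0, n_fractions, surv_prob, birth_prob, rng):
    """One realization: binomial cell kill at each dose fraction, followed by
    stochastic repopulation (each survivor divides at most once per interval).
    All parameter values are hypothetical."""
    n = n0
    for _ in range(n_fractions):
        # each cell independently survives the dose fraction
        n = sum(1 for _ in range(n) if rng.random() < surv_prob)
        # repopulation: each surviving cell independently divides
        n += sum(1 for _ in range(n) if rng.random() < birth_prob)
    return n

rng = random.Random(42)
# Empirical distribution of surviving clonogens over many realizations;
# the expectation is n0 * (surv_prob * (1 + birth_prob)) ** n_fractions.
runs = [simulate_fractionated_course(1000, 5, 0.5, 0.05, rng) for _ in range(500)]
mean_survivors = sum(runs) / len(runs)
```

Comparing the empirical histogram of `runs` against a Poisson with the same mean is one way to probe where the Poisson approximation breaks down.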

  6. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  7. Condition based maintenance pilot projects at Pickering ND

    International Nuclear Information System (INIS)

    Zemdegs, R.T.

    1995-01-01

    Ontario Hydro has recognized that approaches to maintenance have undergone significant changes over the past decades. The traditional breakdown maintenance approach has been replaced by preventative maintenance and, more recently, by condition based maintenance. The nuclear plants of Ontario Hydro have evaluated a number of alternative programs to improve their maintenance effectiveness and to reduce costs, including Reliability Centred Maintenance (RCM), call-up review, component-based PM programs, analysis of failure history and so on. Pickering ND (nuclear division) and Ontario Hydro's Nuclear Technologies Services Division have embarked on a Condition Based Maintenance pilot project to address the above issues as a breakthrough solution for smarter maintenance. The Condition Based Maintenance pilot project will demonstrate an end-to-end process utilizing a Reliability Centred Maintenance structured approach to re-engineer and redefine the existing maintenance programs. The project emphasizes on-condition maintenance where justified, and utilizes an information management tool to provide the required record keeping and analysis infrastructure. This paper briefly describes the planned maintenance model at Pickering ND used to guide the CBM pilot, and gives an overview of the methodology used to develop on-condition equipment indicators as part of a re-engineered maintenance plan

  8. Transmission probability method based on triangle meshes for solving unstructured geometry neutron transport problem

    Energy Technology Data Exchange (ETDEWEB)

    Wu Hongchun [Nuclear Engineering Department, Xi' an Jiaotong University, Xi' an 710049, Shaanxi (China)]. E-mail: hongchun@mail.xjtu.edu.cn; Liu Pingping [Nuclear Engineering Department, Xi' an Jiaotong University, Xi' an 710049, Shaanxi (China); Zhou Yongqiang [Nuclear Engineering Department, Xi' an Jiaotong University, Xi' an 710049, Shaanxi (China); Cao Liangzhi [Nuclear Engineering Department, Xi' an Jiaotong University, Xi' an 710049, Shaanxi (China)

    2007-01-15

    In advanced reactors, fuel assemblies or cores with unstructured geometry are frequently used, and the transmission probability method (TPM) has been widely used for calculating such fuel assemblies. However, the TPM codes for normal core structures mainly use rectangular or hexagonal meshes, whereas triangle meshes are the most useful for expressing complicated unstructured geometry. Although the finite element method and the Monte Carlo method are very good at solving unstructured geometry problems, they are very time consuming, so we developed a TPM code based on triangle meshes. The code was applied to a hybrid fuel geometry and compared with the MCNP code and other codes; the results were consistent with each other. The TPM with triangle meshes is thus expected to be applicable to two-dimensional arbitrary fuel assemblies.

  9. Does probability of occurrence relate to population dynamics?

    Science.gov (United States)

    Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M; Edwards, Thomas C; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E; Zurell, Damaris; Schurr, Frank M

    2014-12-01

    Hutchinson defined species' realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In terms of demography, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species' niches and ranges and to develop reliable mechanistic models of species geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (P_occ). We hypothesized that these relationships vary with species' competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, Western US, France and Switzerland). We used published information on shade tolerance as an indicator of light competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with P_occ, while N, and for most regions K, was generally positively correlated with P_occ. Thus, in temperate forest trees the regions of highest occurrence

  10. Epistemic-based investigation of the probability of hazard scenarios using Bayesian network for the lifting operation of floating objects

    Science.gov (United States)

    Toroody, Ahmad Bahoo; Abaiee, Mohammad Mahdi; Gholamnia, Reza; Ketabdari, Mohammad Javad

    2016-09-01

    Owing to the increase in unprecedented accidents with new root causes in almost all operational areas, the importance of risk management has dramatically risen. Risk assessment, one of the most significant aspects of risk management, has a substantial impact on the system-safety level of organizations, industries, and operations. If the causes of all kinds of failure and the interactions between them are considered, effective risk assessment can be highly accurate. A combination of traditional risk assessment approaches and modern scientific probability methods can help in realizing better quantitative risk assessment methods. Most researchers face the problem of minimal field data with respect to the probability and frequency of each failure. Because of this limitation in the availability of epistemic knowledge, it is important to conduct epistemic estimations by applying the Bayesian theory for identifying plausible outcomes. In this paper, we propose an algorithm and demonstrate its application in a case study for a light-weight lifting operation in the Persian Gulf of Iran. First, we identify potential accident scenarios and present them in an event tree format. Next, excluding human error, we use the event tree to roughly estimate the prior probability of other hazard-promoting factors using a minimal amount of field data. We then use the Success Likelihood Index Method (SLIM) to calculate the probability of human error. On the basis of the proposed event tree, we use the Bayesian network of the provided scenarios to compensate for the lack of data. Finally, we determine the resulting probability of each event based on its evidence in the epistemic estimation format by building on two Bayesian network types: the probability of hazard promotion factors and the Bayesian theory. The study results indicate that despite the lack of available information on the operation of floating objects, a satisfactory result can be achieved using epistemic data.
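As a toy illustration of how branch probabilities combine in an event tree (all numbers below are hypothetical and unrelated to the paper's case study):

```python
# Event tree for a hypothetical lifting mishap: an initiating event followed
# by two pivotal events (barrier failures). Probabilities multiply along each
# path, and the end states partition the initiating-event probability.
p_init = 0.01             # initiating event (e.g. load sway) per operation
p_barrier_a_fails = 0.10  # first safeguard fails, given initiation
p_barrier_b_fails = 0.20  # second safeguard fails, given A failed

p_recovered = p_init * (1 - p_barrier_a_fails)
p_near_miss = p_init * p_barrier_a_fails * (1 - p_barrier_b_fails)
p_accident = p_init * p_barrier_a_fails * p_barrier_b_fails
```

In the paper's approach, sparse or uncertain branch probabilities like these are treated as distributions and updated through a Bayesian network rather than fixed point values.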

  11. Recursive recovery of Markov transition probabilities from boundary value data

    Energy Technology Data Exchange (ETDEWEB)

    Patch, Sarah Kathyrn [Univ. of California, Berkeley, CA (United States)

    1994-04-01

    In an effort to mathematically describe the anisotropic diffusion of infrared radiation in biological tissue Gruenbaum posed an anisotropic diffusion boundary value problem in 1989. In order to accommodate anisotropy, he discretized the temporal as well as the spatial domain. The probabilistic interpretation of the diffusion equation is retained; radiation is assumed to travel according to a random walk (of sorts). In this random walk the probabilities with which photons change direction depend upon their previous as well as present location. The forward problem gives boundary value data as a function of the Markov transition probabilities. The inverse problem requires finding the transition probabilities from boundary value data. Problems in the plane are studied carefully in this thesis. Consistency conditions amongst the data are derived. These conditions have two effects: they prohibit inversion of the forward map but permit smoothing of noisy data. Next, a recursive algorithm which yields a family of solutions to the inverse problem is detailed. This algorithm takes advantage of all independent data and generates a system of highly nonlinear algebraic equations. Pluecker-Grassmann relations are instrumental in simplifying the equations. The algorithm is used to solve the 4 x 4 problem. Finally, the smallest nontrivial problem in three dimensions, the 2 x 2 x 2 problem, is solved.

  12. Constructing quantum games from symmetric non-factorizable joint probabilities

    International Nuclear Information System (INIS)

    Chappell, James M.; Iqbal, Azhar; Abbott, Derek

    2010-01-01

    We construct quantum games from a table of non-factorizable joint probabilities, coupled with a symmetry constraint, requiring symmetrical payoffs between the players. We give the general result for a Nash equilibrium and payoff relations for a game based on non-factorizable joint probabilities, which embeds the classical game. We study a quantum version of Prisoners' Dilemma, Stag Hunt, and the Chicken game constructed from a given table of non-factorizable joint probabilities to find new outcomes in these games. We show that this approach provides a general framework for both classical and quantum games without recourse to the formalism of quantum mechanics.

  13. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  14. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  15. A hydroclimatological approach to predicting regional landslide probability using Landlab

    Directory of Open Access Journals (Sweden)

    R. Strauch

    2018-02-01

    Full Text Available We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.

  16. A hydroclimatological approach to predicting regional landslide probability using Landlab

    Science.gov (United States)

    Strauch, Ronda; Istanbulluoglu, Erkan; Nudurupati, Sai Siddhartha; Bandaragoda, Christina; Gasparini, Nicole M.; Tucker, Gregory E.

    2018-02-01

    We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.
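The core probability calculation can be illustrated with a minimal Monte Carlo over an infinite-slope factor of safety (the standard textbook stability formula; the parameter ranges below are invented for illustration and are not the paper's calibrated inputs):

```python
import math, random

def factor_of_safety(c, phi_deg, theta_deg, h, m, gamma=18.0, gamma_w=9.81):
    """Infinite-slope factor of safety (dimensionless). c in kPa, angles in
    degrees, soil depth h in m, m = relative saturated depth (0..1),
    unit weights in kN/m^3. Standard form, for illustration only."""
    theta = math.radians(theta_deg)
    phi = math.radians(phi_deg)
    cohesion = c / (gamma * h * math.sin(theta) * math.cos(theta))
    friction = (1.0 - m * gamma_w / gamma) * math.tan(phi) / math.tan(theta)
    return cohesion + friction

def landslide_probability(n, rng):
    """Monte Carlo probability that FS < 1 under (hypothetical) parameter
    uncertainty: cohesion, friction angle, and wetness sampled uniformly."""
    fails = 0
    for _ in range(n):
        c = rng.uniform(2.0, 8.0)      # root + soil cohesion, kPa
        phi = rng.uniform(28.0, 40.0)  # internal friction angle, degrees
        m = rng.uniform(0.0, 1.0)      # relative saturated depth
        if factor_of_safety(c, phi, theta_deg=40.0, h=1.5, m=m) < 1.0:
            fails += 1
    return fails / n

p_fail = landslide_probability(5000, random.Random(0))
```

The published model additionally drives the wetness term with annual-maximum recharge from a hydrologic model and maps the resulting probability over a gridded landscape.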

  17. Binomial probability distribution model-based protein identification algorithm for tandem mass spectrometry utilizing peak intensity information.

    Science.gov (United States)

    Xiao, Chuan-Le; Chen, Xiao-Zhou; Du, Yang-Li; Sun, Xuesong; Zhang, Gong; He, Qing-Yu

    2013-01-04

    Mass spectrometry has become one of the most important technologies in proteomic analysis. Tandem mass spectrometry (LC-MS/MS) is a major tool for the analysis of peptide mixtures from protein samples. The key step of MS data processing is the identification of peptides from experimental spectra by searching public sequence databases. Although a number of algorithms to identify peptides from MS/MS data have been already proposed, e.g. Sequest, OMSSA, X!Tandem, Mascot, etc., they are mainly based on statistical models considering only peak-matches between experimental and theoretical spectra, but not peak intensity information. Moreover, different algorithms gave different results from the same MS data, implying their probable incompleteness and questionable reproducibility. We developed a novel peptide identification algorithm, ProVerB, based on a binomial probability distribution model of protein tandem mass spectrometry combined with a new scoring function, making full use of peak intensity information and, thus, enhancing the ability of identification. Compared with Mascot, Sequest, and SQID, ProVerB identified significantly more peptides from LC-MS/MS data sets than the current algorithms at 1% False Discovery Rate (FDR) and provided more confident peptide identifications. ProVerB is also compatible with various platforms and experimental data sets, showing its robustness and versatility. The open-source program ProVerB is available at http://bioinformatics.jnu.edu.cn/software/proverb/ .
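The binomial null model underlying this family of scores can be sketched generically (this is the general idea of binomial peak-match scoring, not ProVerB's exact scoring function):

```python
from math import comb

def peak_match_pvalue(n_peaks, n_matched, p_random):
    """P(at least n_matched of n_peaks theoretical fragment peaks match by
    chance), assuming each peak matches a random spectrum independently with
    probability p_random — a simplification for illustration."""
    return sum(comb(n_peaks, i) * p_random**i * (1 - p_random)**(n_peaks - i)
               for i in range(n_matched, n_peaks + 1))

# A peptide matching 15 of 40 predicted peaks is far less likely to be a
# chance hit than one matching only 5 of 40:
strong = peak_match_pvalue(40, 15, 0.1)
weak = peak_match_pvalue(40, 5, 0.1)
```

ProVerB's contribution is to fold peak intensity into the score rather than treating every matched peak as equally informative, as this sketch does.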

  18. Unbiased multi-fidelity estimate of failure probability of a free plane jet

    Science.gov (United States)

    Marques, Alexandre; Kramer, Boris; Willcox, Karen; Peherstorfer, Benjamin

    2017-11-01

    Estimating failure probability related to fluid flows is a challenge because it requires a large number of evaluations of expensive models. We address this challenge by leveraging multiple low fidelity models of the flow dynamics to create an optimal unbiased estimator. In particular, we investigate the effects of uncertain inlet conditions in the width of a free plane jet. We classify a condition as failure when the corresponding jet width is below a small threshold, such that failure is a rare event (failure probability is smaller than 0.001). We estimate failure probability by combining the frameworks of multi-fidelity importance sampling and optimal fusion of estimators. Multi-fidelity importance sampling uses a low fidelity model to explore the parameter space and create a biasing distribution. An unbiased estimate is then computed with a relatively small number of evaluations of the high fidelity model. In the presence of multiple low fidelity models, this framework offers multiple competing estimators. Optimal fusion combines all competing estimators into a single estimator with minimal variance. We show that this combined framework can significantly reduce the cost of estimating failure probabilities, and thus can have a large impact in fluid flow applications. This work was funded by DARPA.
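The two-stage idea — explore with a cheap model, then estimate without bias using the expensive one — can be sketched in one dimension with stand-in models (everything below is a hypothetical toy, not the authors' jet simulation):

```python
import math, random

def high_fidelity(x):   # "expensive" model of jet width (stand-in)
    return 1.0 + x

def low_fidelity(x):    # cheap, slightly biased surrogate (stand-in)
    return 1.05 + x

threshold = 0.1         # failure: jet width below threshold
mu, sigma = 0.0, 0.3    # nominal input distribution N(mu, sigma)

def normal_pdf(x, m, s):
    return math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2.0 * math.pi))

rng = random.Random(1)

# Stage 1: use the low-fidelity model to locate the failure region cheaply
# and center a biasing density there.
lf_fail = [x for x in (rng.gauss(mu, sigma) for _ in range(20000))
           if low_fidelity(x) < threshold]
bias_mu = sum(lf_fail) / len(lf_fail) if lf_fail else -3.0 * sigma

# Stage 2: importance sampling with the high-fidelity model; the likelihood
# ratio keeps the estimator unbiased despite the shifted proposal.
n = 5000
acc = 0.0
for _ in range(n):
    x = rng.gauss(bias_mu, sigma)
    if high_fidelity(x) < threshold:
        acc += normal_pdf(x, mu, sigma) / normal_pdf(x, bias_mu, sigma)
p_fail = acc / n
```

With multiple surrogates, each yields such an estimator, and the "optimal fusion" step combines them with weights chosen to minimize the variance of the fused estimate.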

  19. Site-to-Source Finite Fault Distance Probability Distribution in Probabilistic Seismic Hazard and the Relationship Between Minimum Distances

    Science.gov (United States)

    Ortega, R.; Gutierrez, E.; Carciumaru, D. D.; Huesca-Perez, E.

    2017-12-01

    We present a method to compute the conditional and unconditional probability density function (PDF) of the finite fault distance distribution (FFDD). Two cases are described: lines and areas. The case of lines has a simple analytical solution, while in the case of areas the geometrical probability of a fault based on the strike, dip, and fault segment vertices is obtained using the projection of spheres onto a piecewise rectangular surface. The cumulative distribution is computed by measuring the projection of a sphere of radius r in an effective area, using an algorithm that estimates the area of a circle within a rectangle. In addition, we introduce the finite fault distance metric: the distance at which the maximum stress release occurs within the fault plane and generates the peak ground motion. The appropriate ground motion prediction equations (GMPEs) can then be applied for PSHA. The conditional probability of distance given magnitude is also presented using different scaling laws. A simple model with a constant distribution of the centroid at the geometrical mean is discussed; in this model hazard is reduced at the edges because the effective size is reduced. There is currently a trend of using extended-source distances in PSHA; however, it is not possible to separate the fault geometry from the GMPE. With this new approach, it is possible to add fault rupture models, separating geometrical and propagation effects.
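For the line case, the distance distribution can be checked numerically by placing the rupture point uniformly along a fault trace (an illustrative sketch, not the paper's analytical solution):

```python
import math, random

def rupture_distance_samples(site, a, b, n, rng):
    """Sample site-to-rupture distances when the rupture point is uniform
    on the fault trace from a to b (2-D coordinates, arbitrary units)."""
    out = []
    for _ in range(n):
        t = rng.random()
        px = a[0] + t * (b[0] - a[0])
        py = a[1] + t * (b[1] - a[1])
        out.append(math.hypot(px - site[0], py - site[1]))
    return out

rng = random.Random(7)
# Site at the origin, a unit-length fault trace offset one unit to the north;
# distances then fall between 1 (closest point) and sqrt(2) (far endpoint).
dists = rupture_distance_samples((0.0, 0.0), (0.0, 1.0), (1.0, 1.0), 2000, rng)
```

Sorting these samples gives an empirical CDF against which a closed-form FFDD for the line case can be validated.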

  20. The Misapplication of Probability Theory in Quantum Mechanics

    Science.gov (United States)

    Racicot, Ronald

    2014-03-01

    This article is a revision of two papers submitted to the APS in the past two and a half years. In these papers, arguments and proofs are summarized for the following: (1) The wrong conclusion by EPR that Quantum Mechanics is incomplete, perhaps requiring the addition of "hidden variables" for completion. Theorems that assume such "hidden variables," such as Bell's theorem, are also wrong. (2) Quantum entanglement is not a realizable physical phenomenon and is based entirely on assuming a probability superposition model for quantum spin. Such a model directly violates conservation of angular momentum. (3) Simultaneous multiple-paths followed by a quantum particle traveling through space also cannot possibly exist. Besides violating Noether's theorem, the multiple-paths theory is based solely on probability calculations. Probability calculations by themselves cannot possibly represent simultaneous physically real events. None of the reviews of the submitted papers actually refuted the arguments and evidence that was presented. These analyses should therefore be carefully evaluated since the conclusions reached have such important impact in quantum mechanics and quantum information theory.

  1. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  2. Pipe failure probability - the Thomas paper revisited

    International Nuclear Information System (INIS)

    Lydell, B.O.Y.

    2000-01-01

    Almost twenty years ago, in Volume 2 of Reliability Engineering (the predecessor of Reliability Engineering and System Safety), a paper by H. M. Thomas of Rolls Royce and Associates Ltd. presented a generalized approach to the estimation of piping and vessel failure probability. The 'Thomas-approach' used insights from actual failure statistics to calculate the probability of leakage and conditional probability of rupture given leakage. It was intended for practitioners without access to data on the service experience with piping and piping system components. This article revisits the Thomas paper by drawing on insights from development of a new database on piping failures in commercial nuclear power plants worldwide (SKI-PIPE). Partially sponsored by the Swedish Nuclear Power Inspectorate (SKI), the R and D leading up to this note was performed during 1994-1999. Motivated by data requirements of reliability analysis and probabilistic safety assessment (PSA), the new database supports statistical analysis of piping failure data. Against the background of this database development program, the article reviews the applicability of the 'Thomas approach' in applied risk and reliability analysis. It addresses the question whether a new and expanded database on the service experience with piping systems would alter the original piping reliability correlation as suggested by H. M. Thomas
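The core of the Thomas-style decomposition is just the product rule, factoring rupture frequency into a leak frequency and a conditional rupture probability (the numbers below are invented for illustration, not values from the paper or the SKI-PIPE database):

```python
# P(rupture) = P(leak) * P(rupture | leak)
p_leak = 1.0e-4               # leak frequency per pipe-section-year (hypothetical)
p_rupture_given_leak = 0.05   # conditional rupture probability (hypothetical)
p_rupture = p_leak * p_rupture_given_leak

# Expected ruptures across a population of sections over a plant lifetime:
n_sections, years = 500, 40
expected_ruptures = p_rupture * n_sections * years
```

A service-experience database like SKI-PIPE supplies the statistical basis for both factors, which is exactly where the article questions whether the original correlation still holds.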

  3. Constraint-based Student Modelling in Probability Story Problems with Scaffolding Techniques

    Directory of Open Access Journals (Sweden)

    Nabila Khodeir

    2018-01-01

    Full Text Available Constraint-based student modelling (CBM) is an important technique employed in intelligent tutoring systems to model student knowledge to provide relevant assistance. This paper introduces the Math Story Problem Tutor (MAST), a Web-based intelligent tutoring system for probability story problems, which is able to generate problems of different contexts, types and difficulty levels for self-paced learning. Constraints in MAST are specified at a low level of granularity to allow fine-grained diagnosis of student errors. Furthermore, MAST extends CBM to address errors due to misunderstanding of the narrative story. It can locate and highlight keywords that may have been overlooked or misunderstood, leading to an error. This is achieved by utilizing the role of sentences and keywords that are defined through the Natural Language Generation (NLG) methods deployed in the story problem generation. MAST also integrates CBM with scaffolding questions and feedback to provide various forms of help and guidance to the student, allowing the student to discover and correct any errors in his/her solution. MAST has been preliminarily evaluated empirically, and the results show its potential effectiveness in tutoring students, with a decrease in the percentage of violated constraints along the learning curve. Additionally, students using MAST showed a significant improvement in post-test exam results, relative to the pre-test, in comparison to those relying on the textbook

  4. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...

  5. Tuned by experience: How orientation probability modulates early perceptual processing.

    Science.gov (United States)

    Jabar, Syaheed B; Filipowicz, Alex; Anderson, Britt

    2017-09-01

    Probable stimuli are more often and more quickly detected. While stimulus probability is known to affect decision-making, it can also be explained as a perceptual phenomenon. Using spatial gratings, we have previously shown that probable orientations are also more precisely estimated, even while participants remained naive to the manipulation. We conducted an electrophysiological study to investigate the effect that probability has on perception and visual-evoked potentials. In line with previous studies on oddballs and stimulus prevalence, low-probability orientations were associated with a greater late positive 'P300' component which might be related to either surprise or decision-making. However, the early 'C1' component, thought to reflect V1 processing, was dampened for high-probability orientations while later P1 and N1 components were unaffected. Exploratory analyses revealed a participant-level correlation between C1 and P300 amplitudes, suggesting a link between perceptual processing and decision-making. We discuss how these probability effects could be indicative of sharpening of neurons preferring the probable orientations, due either to perceptual learning, or to feature-based attention. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. An Intelligent Fleet Condition-Based Maintenance Decision Making Method Based on Multi-Agent

    OpenAIRE

    Bo Sun; Qiang Feng; Songjie Li

    2012-01-01

    According to the demand for condition-based maintenance online decision making among a mission oriented fleet, an intelligent maintenance decision making method based on Multi-agent and heuristic rules is proposed. The process of condition-based maintenance within an aircraft fleet (each containing one or more Line Replaceable Modules) based on multiple maintenance thresholds is analyzed. Then the process is abstracted into a Multi-Agent Model, a 2-layer model structure containing host negoti...

  7. Probability matching and strategy availability.

    Science.gov (United States)

    Koehler, Derek J; James, Greta

    2010-09-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought to their attention, more participants subsequently engage in maximizing. Third, matchers are more likely than maximizers to base decisions in other tasks on their initial intuitions, suggesting that they are more inclined to use a choice strategy that comes to mind quickly. These results indicate that a substantial subset of probability matchers are victims of "underthinking" rather than "overthinking": They fail to engage in sufficient deliberation to generate a superior alternative to the matching strategy that comes so readily to mind.
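The payoff asymmetry between the two strategies is easy to make concrete (a standard textbook calculation, not data from these experiments):

```python
# Outcome A occurs with probability p, outcome B with 1 - p.
p = 0.7

# Matching: predict A on a fraction p of trials, B on the rest.
# A prediction is correct when it agrees with the outcome:
matching_accuracy = p * p + (1 - p) * (1 - p)   # 0.58

# Maximizing: always predict the more frequent outcome A:
maximizing_accuracy = p                          # 0.70
```

For any p other than 0.5, maximizing strictly dominates matching, which is why the persistence of matching calls for the availability-based explanation given here.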

  8. Pure perceptual-based learning of second-, third-, and fourth-order sequential probabilities.

    Science.gov (United States)

    Remillard, Gilbert

    2011-07-01

    There is evidence that sequence learning in the traditional serial reaction time task (SRTT), where target location is the response dimension, and sequence learning in the perceptual SRTT, where target location is not the response dimension, are handled by different mechanisms. The ability of the latter mechanism to learn sequential contingencies that can be learned by the former mechanism was examined. Prior research has established that people can learn second-, third-, and fourth-order probabilities in the traditional SRTT. The present study reveals that people can learn such probabilities in the perceptual SRTT. This suggests that the two mechanisms may have similar architectures. A possible neural basis of the two mechanisms is discussed.
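
    An nth-order sequential probability is the probability of the next target location conditional on the previous n locations; a minimal sketch of estimating such probabilities from a symbol sequence (the sequence and order below are illustrative, not the study's materials):

```python
from collections import Counter

def conditional_probs(seq, order):
    """Estimate P(next symbol | previous `order` symbols) from counts."""
    ctx = Counter()    # counts of each length-`order` context
    joint = Counter()  # counts of context + next symbol
    for i in range(order, len(seq)):
        prev = tuple(seq[i - order:i])
        ctx[prev] += 1
        joint[prev + (seq[i],)] += 1
    return {k: v / ctx[k[:-1]] for k, v in joint.items()}

# Second-order example: after the context ('B', 'A'), the next symbol
# is 'B' two times out of three in this toy sequence.
probs = conditional_probs("ABABABAC", 2)
```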

  9. Modeling the probability distribution of peak discharge for infiltrating hillslopes

    Science.gov (United States)

    Baiamonte, Giorgio; Singh, Vijay P.

    2017-07-01

    Hillslope response plays a fundamental role in the prediction of peak discharge at the basin outlet. The peak discharge for the critical duration of rainfall and its probability distribution are needed for designing urban infrastructure facilities. This study derives the probability distribution, denoted as the GABS model, by coupling three models: (1) the Green-Ampt model for computing infiltration, (2) the kinematic wave model for computing the discharge hydrograph from the hillslope, and (3) the intensity-duration-frequency (IDF) model for computing the design rainfall intensity. The Hortonian mechanism for runoff generation is employed for computing the surface runoff hydrograph. Since the antecedent soil moisture condition (ASMC) significantly affects the rate of infiltration, its effect on the probability distribution of peak discharge is investigated. Application to a watershed in Sicily, Italy, shows that as the probability increases, the expected effect of ASMC in increasing the maximum discharge diminishes. Only for low values of probability is the critical duration of rainfall influenced by ASMC, whereas its effect on the peak discharge appears small for any probability. For a given set of parameters, the derived probability distribution of peak discharge is well fitted by the gamma distribution. Finally, an application to a small watershed, with the aim of testing the possibility of arranging in advance the rational runoff coefficient tables to be used with the rational method, and a comparison between peak discharges obtained by the GABS model and those measured in an experimental flume for a loamy-sand soil, were carried out.
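
    The Green-Ampt component of the coupled model can be sketched as follows: the implicit Green-Ampt equation K·t = F − ψΔθ·ln(1 + F/(ψΔθ)) is solved for cumulative infiltration F by fixed-point iteration (parameter values are illustrative, not the study's calibrated ones):

```python
import math

def green_ampt_F(t, K, psi, dtheta, iters=200):
    """Cumulative infiltration F(t) from the implicit Green-Ampt
    equation  K*t = F - psi*dtheta*ln(1 + F/(psi*dtheta)),
    solved by fixed-point iteration."""
    a = psi * dtheta  # suction head times moisture deficit
    F = K * t         # initial guess
    for _ in range(iters):
        F = K * t + a * math.log(1 + F / a)
    return F

def infiltration_rate(F, K, psi, dtheta):
    """Infiltration capacity f = K * (1 + psi*dtheta / F)."""
    return K * (1 + psi * dtheta / F)
```

    The iteration is a contraction (slope a/(a+F) < 1), so it converges for any positive starting value.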

  10. Landslide Probability Assessment by the Derived Distributions Technique

    Science.gov (United States)

    Muñoz, E.; Ochoa, A.; Martínez, H.

    2012-12-01

    Landslides are potentially disastrous events that bring along human and economic losses, especially in cities where accelerated and unorganized growth leads to settlements on steep and potentially unstable areas. Among the main causes of landslides are geological, geomorphological, geotechnical, climatological and hydrological conditions and anthropic intervention. This paper studies landslides triggered by rain, commonly known as "soil-slip", which are characterized by a superficial failure surface (typically between 1 and 1.5 m deep) parallel to the slope face and by being triggered by intense and/or sustained periods of rain. This type of landslide is caused by changes in the pore pressure produced by a decrease in the suction when a humid front enters the soil, as a consequence of the infiltration initiated by rain and ruled by the hydraulic characteristics of the soil. Failure occurs when this front reaches a critical depth and the shear strength of the soil is not enough to guarantee the stability of the mass. Critical rainfall thresholds in combination with a slope stability model are widely used for assessing landslide probability. In this paper we present a model for the estimation of the occurrence of landslides based on the derived distributions technique. Since the works of Eagleson in the 1970s, the derived distributions technique has been widely used in hydrology to estimate the probability of occurrence of extreme flows. The model estimates the probability density function (pdf) of the Factor of Safety (FOS) from the statistical behavior of the rainfall process and some slope parameters. The stochastic character of the rainfall is transformed by means of a deterministic failure model into the FOS pdf. Exceedance probability and return period estimation is then straightforward. The rainfall process is modeled as a Rectangular Pulses Poisson Process (RPPP) with independent exponential pdfs for mean intensity and duration of the storms. The Philip infiltration model
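
    The derived-distributions idea can be illustrated with a toy version of the chain: Poisson storm arrivals, an exponential intensity pdf, and a deterministic stability model that reduces failure to exceedance of a critical intensity (all values illustrative; the paper's FOS model is far more elaborate):

```python
import math

def annual_failure_probability(lam, mean_intensity, i_crit):
    """Derived-distributions sketch: storms arrive as a Poisson process
    (lam storms/year) with exponentially distributed intensity; a
    deterministic stability model reduces failure to I > i_crit."""
    p_exceed = math.exp(-i_crit / mean_intensity)  # P(I > i_crit)
    return 1 - math.exp(-lam * p_exceed)           # P(>=1 failure/year)

def return_period(lam, mean_intensity, i_crit):
    """Return period (years) of the failure event."""
    return 1 / annual_failure_probability(lam, mean_intensity, i_crit)
```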

  11. Betting on conditionals

    OpenAIRE

    Politzer , Guy; Over , David P; Baratgin , Jean

    2010-01-01

    A study is reported testing two hypotheses about a close parallel relation between indicative conditionals, if A then B, and conditional bets, I bet you that if A then B. The first is that both the indicative conditional and the conditional bet are related to the conditional probability, P(B|A). The second is that de Finetti's three-valued truth table has psychological reality for both types of conditional – true, false, or void for indicative conditionals and win, lose or void for conditiona...

  12. The Effects of Framing, Reflection, Probability, and Payoff on Risk Preference in Choice Tasks.

    Science.gov (United States)

    Kühberger; Schulte-Mecklenbeck; Perner

    1999-06-01

    A meta-analysis of Asian-disease-like studies is presented to identify the factors which determine risk preference. First the confoundings between probability levels, payoffs, and framing conditions are clarified in a task analysis. Then the role of framing, reflection, probability, type, and size of payoff is evaluated in a meta-analysis. It is shown that bidirectional framing effects exist for gains and for losses. Presenting outcomes as gains tends to induce risk aversion, while presenting outcomes as losses tends to induce risk seeking. Risk preference is also shown to depend on the size of the payoffs, on the probability levels, and on the type of good at stake (money/property vs human lives). In general, higher payoffs lead to increasing risk aversion. Higher probabilities lead to increasing risk aversion for gains and to increasing risk seeking for losses. These findings are confirmed by a subsequent empirical test. Shortcomings of existing formal theories, such as prospect theory, cumulative prospect theory, venture theory, and Markowitz's utility theory, are identified. It is shown that it is not probabilities or payoffs, but the framing condition, which explains most variance. These findings are interpreted as showing that no linear combination of formally relevant predictors is sufficient to capture the essence of the framing phenomenon. Copyright 1999 Academic Press.

  13. Sensitivity analysis on the effect of software-induced common cause failure probability in the computer-based reactor trip system unavailability

    International Nuclear Information System (INIS)

    Kamyab, Shahabeddin; Nematollahi, Mohammadreza; Shafiee, Golnoush

    2013-01-01

    Highlights: ► Importance and sensitivity analyses have been performed for a digitized reactor trip system. ► The results show acceptable trip unavailability for software failure probabilities below 1E-4. ► However, the value of the Fussell–Vesely measure indicates that software common cause failure is still risk significant. ► Diversity and effective testing are found beneficial for reducing the software contribution. - Abstract: The reactor trip system has been digitized in advanced nuclear power plants, since the programmable nature of computer based systems has a number of advantages over non-programmable systems. However, software is still vulnerable to common cause failure (CCF). Residual software faults represent a CCF concern, which threatens the achieved improvements. This study attempts to assess the effectiveness of so-called defensive strategies against software CCF with respect to reliability. Sensitivity analysis has been performed by re-quantifying the models upon changing the software failure probability. Importance measures have then been estimated in order to reveal the specific contribution of software CCF to the trip failure probability. The results reveal the importance and effectiveness of signal and software diversity as applicable strategies to ameliorate inefficiencies due to software CCF in the reactor trip system (RTS). No significant change has been observed in the RTS failure probability for basic software CCF probabilities greater than 1 × 10^-4; however, the related Fussell–Vesely measure has been greater than 0.005 for lower values. The study concludes that the risk associated with software based systems is a function of multiple variables, which requires the trade-offs among them to be weighed in more precise and comprehensive studies
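
    The Fussell–Vesely importance measure used in the abstract can be sketched under the rare-event approximation; the two-cut-set model and the probabilities below are illustrative, not the study's fault tree:

```python
def fussell_vesely(cut_sets, probs):
    """Fussell-Vesely importance under the rare-event approximation:
    FV(e) = (sum of minimal-cut-set probabilities containing e) / P(top).
    cut_sets: list of sets of basic-event names; probs: event -> prob."""
    def cs_prob(cs):
        p = 1.0
        for e in cs:
            p *= probs[e]
        return p
    top = sum(cs_prob(cs) for cs in cut_sets)
    return {e: sum(cs_prob(cs) for cs in cut_sets if e in cs) / top
            for e in probs}

# Illustrative model: the software CCF forms a cut set on its own, so
# even a small CCF probability dominates the top-event importance.
fv = fussell_vesely(
    [{"sw_ccf"}, {"hw_a", "hw_b"}],
    {"sw_ccf": 1e-4, "hw_a": 1e-3, "hw_b": 1e-3},
)
```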

  14. The Context Matters: Outcome Probability and Expectation Mismatch Modulate the Feedback Negativity When Self-Evaluation of Response Correctness Is Possible.

    Science.gov (United States)

    Leue, Anja; Cano Rodilla, Carmen; Beauducel, André

    2015-01-01

    Individuals typically evaluate whether their performance and the obtained feedback match. Previous research has shown that feedback negativity (FN) depends on outcome probability and feedback valence. It is, however, less clear to what extent previous effects of outcome probability on FN depend on self-evaluations of response correctness. Therefore, we investigated the effects of outcome probability on FN amplitude in a simple go/no-go task that allowed for the self-evaluation of response correctness. We also investigated effects of performance incompatibility and feedback valence. In a sample of N = 22 participants, outcome probability was manipulated by means of precues, feedback valence by means of monetary feedback, and performance incompatibility by means of feedback that induced a match versus mismatch with individuals' performance. We found that the 100% outcome probability condition induced a more negative FN following no-loss than the 50% outcome probability condition. The FN following loss was more negative in the 50% compared to the 100% outcome probability condition. Performance-incompatible loss resulted in a more negative FN than performance-compatible loss. Our results indicate that the self-evaluation of the correctness of responses should be taken into account when the effects of outcome probability and expectation mismatch on FN are investigated.

  15. Constructing quantum games from symmetric non-factorizable joint probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Chappell, James M., E-mail: james.m.chappell@adelaide.edu.a [School of Chemistry and Physics, University of Adelaide, South Australia 5005 (Australia); School of Electrical and Electronic Engineering, University of Adelaide, South Australia 5005 (Australia); Iqbal, Azhar [School of Electrical and Electronic Engineering, University of Adelaide, South Australia 5005 (Australia); Centre for Advanced Mathematics and Physics, National University of Sciences and Technology, Peshawar Road, Rawalpindi (Pakistan); Abbott, Derek [School of Electrical and Electronic Engineering, University of Adelaide, South Australia 5005 (Australia)

    2010-09-06

    We construct quantum games from a table of non-factorizable joint probabilities, coupled with a symmetry constraint, requiring symmetrical payoffs between the players. We give the general result for a Nash equilibrium and payoff relations for a game based on non-factorizable joint probabilities, which embeds the classical game. We study a quantum version of Prisoners' Dilemma, Stag Hunt, and the Chicken game constructed from a given table of non-factorizable joint probabilities to find new outcomes in these games. We show that this approach provides a general framework for both classical and quantum games without recourse to the formalism of quantum mechanics.

  16. On Farmer's line, probability density functions, and overall risk

    International Nuclear Information System (INIS)

    Munera, H.A.; Yadigaroglu, G.

    1986-01-01

    Limit lines used to define quantitative probabilistic safety goals can be categorized according to whether they are based on discrete pairs of event sequences and associated probabilities, on probability density functions (pdf's), or on complementary cumulative density functions (CCDFs). In particular, the concept of the well-known Farmer's line and its subsequent reinterpretations is clarified. It is shown that Farmer's lines are pdf's and, therefore, the overall risk (defined as the expected value of the pdf) that they represent can be easily calculated. It is also shown that the area under Farmer's line is proportional to probability, while the areas under CCDFs are generally proportional to expected value
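
    The paper's central point, that the overall risk is the expected value of the pdf represented by a Farmer-type line, is a simple integral; a sketch with trapezoidal integration over tabulated (consequence, density) points (the tabulated values are illustrative):

```python
def overall_risk(consequences, pdf_values):
    """Expected value of a pdf given as discrete points along a
    Farmer-type limit line: risk = integral of C * f(C) dC,
    computed by trapezoidal integration."""
    risk = 0.0
    for i in range(len(consequences) - 1):
        dc = consequences[i + 1] - consequences[i]
        avg = (consequences[i] * pdf_values[i]
               + consequences[i + 1] * pdf_values[i + 1]) / 2
        risk += avg * dc
    return risk
```

    For a uniform density f(C) = 1 on [0, 1] this returns 0.5, the expected consequence.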

  17. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  18. Condition based maintenance in the context of opportunistic maintenance

    NARCIS (Netherlands)

    Koochaki, Javid; Bokhorst, Jos A. C.; Wortmann, Hans; Klingenberg, Warse

    2012-01-01

    Condition based maintenance (CBM) uses the operating condition of a component to predict a failure event. Compared to age based replacement (ABR), CBM usually results in higher availability and lower maintenance costs, since it tries to prevent unplanned downtime and avoid unnecessary preventive

  19. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.
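
    For the familiar multinomial logit special case, the CPGF is the log-sum-exp function and its gradient is indeed the vector of choice probabilities; a minimal sketch (the utilities are illustrative):

```python
import math

def cpgf(u):
    """Multinomial logit CPGF: G(u) = log(sum_j exp(u_j)),
    computed with the max-shift trick for numerical stability."""
    m = max(u)
    return m + math.log(sum(math.exp(x - m) for x in u))

def choice_probs(u):
    """Gradient of the CPGF = the choice probabilities (softmax)."""
    m = max(u)
    e = [math.exp(x - m) for x in u]
    s = sum(e)
    return [x / s for x in e]

u = [1.0, 0.5, -0.2]
p = choice_probs(u)  # numerical gradient of cpgf matches p
```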

  20. Estimation of the probability of success in petroleum exploration

    Science.gov (United States)

    Davis, J.C.

    1977-01-01

    A probabilistic model for oil exploration can be developed by assessing the conditional relationship between perceived geologic variables and the subsequent discovery of petroleum. Such a model includes two probabilistic components, the first reflecting the association between a geologic condition (structural closure, for example) and the occurrence of oil, and the second reflecting the uncertainty associated with the estimation of geologic variables in areas of limited control. Estimates of the conditional relationship between geologic variables and subsequent production can be found by analyzing the exploration history of a "training area" judged to be geologically similar to the exploration area. The geologic variables are assessed over the training area using an historical subset of the available data, whose density corresponds to the present control density in the exploration area. The success or failure of wells drilled in the training area subsequent to the time corresponding to the historical subset provides empirical estimates of the probability of success conditional upon geology. Uncertainty in perception of geological conditions may be estimated from the distribution of errors made in geologic assessment using the historical subset of control wells. These errors may be expressed as a linear function of distance from available control. Alternatively, the uncertainty may be found by calculating the semivariogram of the geologic variables used in the analysis: the two procedures will yield approximately equivalent results. The empirical probability functions may then be transferred to the exploration area and used to estimate the likelihood of success of specific exploration plays. These estimates will reflect both the conditional relationship between the geological variables used to guide exploration and the uncertainty resulting from lack of control. The technique is illustrated with case histories from the mid-Continent area of the U.S.A. © 1977 Plenum

  1. Uncertainty plus prior equals rational bias: an intuitive Bayesian probability weighting function.

    Science.gov (United States)

    Fennell, John; Baddeley, Roland

    2012-10-01

    Empirical research has shown that when making choices based on probabilistic options, people behave as if they overestimate small probabilities, underestimate large probabilities, and treat positive and negative outcomes differently. These distortions have been modeled using a nonlinear probability weighting function, which is found in several nonexpected utility theories, including rank-dependent models and prospect theory; here, we propose a Bayesian approach to the probability weighting function and, with it, a psychological rationale. In the real world, uncertainty is ubiquitous and, accordingly, the optimal strategy is to combine probability statements with prior information using Bayes' rule. First, we show that any reasonable prior on probabilities leads to 2 of the observed effects; overweighting of low probabilities and underweighting of high probabilities. We then investigate 2 plausible kinds of priors: informative priors based on previous experience and uninformative priors of ignorance. Individually, these priors potentially lead to large problems of bias and inefficiency, respectively; however, when combined using Bayesian model comparison methods, both forms of prior can be applied adaptively, gaining the efficiency of empirical priors and the robustness of ignorance priors. We illustrate this for the simple case of generic good and bad options, using Internet blogs to estimate the relevant priors of inference. Given this combined ignorant/informative prior, the Bayesian probability weighting function is not only robust and efficient but also matches all of the major characteristics of the distortions found in empirical research. PsycINFO Database Record (c) 2012 APA, all rights reserved.
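
    The shrinkage effect the authors describe can be sketched with the simplest version of the idea: treat a stated probability as evidence weighed against a Beta prior. The effective sample size n and prior parameters below are illustrative, and this linear form omits the full model's S-shape:

```python
def weight(p, n=10.0, a=1.0, b=1.0):
    """Sketch of a Bayesian probability weighting function: treat a
    stated probability p as if it came from n observations and combine
    it with a Beta(a, b) prior; the posterior mean shrinks p toward
    the prior mean a/(a+b)."""
    return (a + n * p) / (a + b + n)
```

    With a symmetric prior, small probabilities are weighted up and large ones down, the two distortions the paper derives from any reasonable prior on probabilities.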

  2. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the 7 presentations made at the third meeting 'physics and fundamental questions' whose theme was probability and prediction. The concept of probability that was invented to apprehend random phenomena has become an important branch of mathematics and its application range spreads from radioactivity to species evolution via cosmology or the management of very weak risks. The notion of probability is the basis of quantum mechanics and then is bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  3. Quantified Risk Ranking Model for Condition-Based Risk and Reliability Centered Maintenance

    Science.gov (United States)

    Chattopadhyaya, Pradip Kumar; Basu, Sushil Kumar; Majumdar, Manik Chandra

    2017-06-01

    In the recent past, the risk and reliability centered maintenance (RRCM) framework was introduced with a shift in the methodological focus from reliability and probabilities (expected values) to reliability, uncertainty and risk. In this paper the authors explain a novel methodology for risk quantification and for ranking critical items to prioritize maintenance actions on the basis of condition-based risk and reliability centered maintenance (CBRRCM). The critical items are identified through criticality analysis of the RPN values of the items of a system, and the maintenance significant precipitating factors (MSPF) of the items are evaluated. The criticality of risk is assessed using three risk coefficients. The likelihood risk coefficient treats the probability as a fuzzy number. The abstract risk coefficient deduces risk influenced by uncertainty and sensitivity, besides other factors. The third risk coefficient is called the hazardous risk coefficient, which accounts for anticipated hazards that may occur in the future; this risk is deduced from criteria of consequences on safety, environment, maintenance and economic risks with corresponding costs for the consequences. The characteristic values of all three risk coefficients are obtained with a particular test. With a few more tests on the system, the values may change significantly within the controlling range of each coefficient; hence 'random number simulation' is used to obtain one distinctive value for each coefficient. The risk coefficients are statistically added to obtain the final risk coefficient of each critical item, and the final rankings of the critical items are then estimated. The prioritization in ranking of critical items using the developed mathematical model for risk assessment shall be useful in optimization of financial losses and timing of maintenance actions.

  4. Call Arrival Rate Prediction and Blocking Probability Estimation for Infrastructure based Mobile Cognitive Radio Personal Area Network

    Directory of Open Access Journals (Sweden)

    Neeta Nathani

    2017-08-01

    Full Text Available Cognitive Radio usage has been estimated as a non-emergency service with low-volume traffic. The present work proposes an infrastructure based Cognitive Radio network and the probability of success of CR traffic in a licensed band. The Cognitive Radio nodes will form clusters. The cluster nodes will communicate on the Industrial, Scientific and Medical band using an IPv6 over Low-Power Wireless Personal Area Network based protocol from sensor to Gateway Cluster Head. The Cognitive Radio-Media Access Control protocol for Gateway to Cognitive Radio-Base Station communication will use vacant channels of the licensed band. Standalone secondary users of the Cognitive Radio Network shall be considered as a Gateway with one user. The Gateway will handle multi-channel multi-radio communication with the Base Station. Cognitive Radio Network operators shall define various traffic data accumulation counters at the Base Station for storing signal strength, Carrier-to-Interference and Noise Ratio, etc. and record channel occupied/vacant status. The research done so far has used the hour as the polling interval, which is too long for parameters like holding time that are expressed in minutes; hence channel vacant/occupied status time could only be calculated probabilistically. In the present work, an infrastructure based architecture has been proposed which polls channel status every minute, in contrast to hourly polling of data. The Gateways of the Cognitive Radio Network shall monitor the status of each Primary User periodically inside its working range and shall inform the Cognitive Radio-Base Station for preparation of a minute-wise database. For simulation, the occupancy data for all primary user channels were pulled at one-minute intervals from a live mobile network. Hourly traffic data and minute-wise holding times have been analyzed to optimize the parameters of a Seasonal Auto Regressive Integrated Moving Average prediction model. The blocking probability of an incoming Cognitive Radio call has been
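
    A standard ingredient for blocking-probability estimation in loss systems (not necessarily the formula used in this paper) is the Erlang B recursion:

```python
def erlang_b(servers, offered_load):
    """Blocking probability of an M/M/m/m loss system via the Erlang B
    recursion: B(0) = 1, B(m) = a*B(m-1) / (m + a*B(m-1)),
    where a is the offered load in Erlangs."""
    b = 1.0
    for m in range(1, servers + 1):
        b = offered_load * b / (m + offered_load * b)
    return b

# e.g. 2 Erlangs offered to a single channel blocks 2/3 of calls.
```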

  5. Converting dose distributions into tumour control probability

    International Nuclear Information System (INIS)

    Nahum, A.E.

    1996-01-01

    The endpoints in radiotherapy that are truly of relevance are not dose distributions but the probability of local control, sometimes known as the Tumour Control Probability (TCP) and the Probability of Normal Tissue Complications (NTCP). A model for the estimation of TCP based on simple radiobiological considerations is described. It is shown that incorporation of inter-patient heterogeneity into the radiosensitivity parameter a through s_a can result in a clinically realistic slope for the dose-response curve. The model is applied to inhomogeneous target dose distributions in order to demonstrate the relationship between dose uniformity and s_a. The consequences of varying clonogenic density are also explored. Finally the model is applied to the target-volume DVHs for patients in a clinical trial of conformal pelvic radiotherapy; the effect of dose inhomogeneities on distributions of TCP are shown as well as the potential benefits of customizing the target dose according to normal-tissue DVHs. (author). 37 refs, 9 figs
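
    A Poisson-based TCP model with inter-patient heterogeneity in radiosensitivity can be sketched for a uniform dose, keeping only the linear cell-kill term (the clonogen number, a and s_a values below are illustrative, not the paper's):

```python
import math

def tcp_poisson(dose, n_clonogens, alpha):
    """Poisson TCP for a uniform dose with linear cell kill:
    TCP = exp(-N0 * exp(-alpha * D))."""
    return math.exp(-n_clonogens * math.exp(-alpha * dose))

def tcp_population(dose, n_clonogens, alpha_mean, alpha_sd, n_samples=2001):
    """Average TCP over a normal spread of alpha across patients
    (truncated at +/-3 sd and at alpha > 0), i.e. the s_a heterogeneity
    discussed in the abstract; simple weighted grid integration."""
    total = weight = 0.0
    for i in range(n_samples):
        z = -3 + 6 * i / (n_samples - 1)
        a = alpha_mean + alpha_sd * z
        if a <= 0:
            continue
        w = math.exp(-z * z / 2)  # unnormalized normal density
        total += w * tcp_poisson(dose, n_clonogens, a)
        weight += w
    return total / weight
```

    Averaging over patients flattens the steep single-patient dose-response curve, which is the paper's mechanism for obtaining clinically realistic slopes.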

  6. Converting dose distributions into tumour control probability

    Energy Technology Data Exchange (ETDEWEB)

    Nahum, A E [The Royal Marsden Hospital, London (United Kingdom). Joint Dept. of Physics

    1996-08-01

    The endpoints in radiotherapy that are truly of relevance are not dose distributions but the probability of local control, sometimes known as the Tumour Control Probability (TCP) and the Probability of Normal Tissue Complications (NTCP). A model for the estimation of TCP based on simple radiobiological considerations is described. It is shown that incorporation of inter-patient heterogeneity into the radiosensitivity parameter a through s_a can result in a clinically realistic slope for the dose-response curve. The model is applied to inhomogeneous target dose distributions in order to demonstrate the relationship between dose uniformity and s_a. The consequences of varying clonogenic density are also explored. Finally the model is applied to the target-volume DVHs for patients in a clinical trial of conformal pelvic radiotherapy; the effect of dose inhomogeneities on distributions of TCP are shown as well as the potential benefits of customizing the target dose according to normal-tissue DVHs. (author). 37 refs, 9 figs.

  7. Quaternion Based Thermal Condition Monitoring System

    Science.gov (United States)

    Wong, Wai Kit; Loo, Chu Kiong; Lim, Way Soong; Tan, Poi Ngee

    In this paper, we propose a new and effective machine condition monitoring system using a log-polar mapper, a quaternion-based thermal image correlator and a max-product fuzzy neural network classifier. Two classification characteristics, namely the peak-to-sidelobe ratio (PSR) and the real-to-complex ratio of the discrete quaternion correlation output (p-value), are applied in the proposed system. Large PSR and p-values are observed when the input thermal image correlates well with a particular reference image, while small PSR and p-values are observed for a poor match. In simulation, we also find that log-polar mapping helps solve rotation- and scaling-invariance problems in quaternion-based thermal image correlation. Besides that, log-polar mapping provides a two-fold data compression capability. Log-polar mapping also smooths the output correlation plane, allowing better measurement of PSR and p-values. Simulation results show that the proposed system is an efficient machine condition monitoring system with an accuracy of more than 98%.

  8. Predictive probability methods for interim monitoring in clinical trials with longitudinal outcomes.

    Science.gov (United States)

    Zhou, Ming; Tang, Qi; Lang, Lixin; Xing, Jun; Tatsuoka, Kay

    2018-04-17

    In clinical research and development, interim monitoring is critical for better decision-making and minimizing the risk of exposing patients to possible ineffective therapies. For interim futility or efficacy monitoring, predictive probability methods are widely adopted in practice. Those methods have been well studied for univariate variables. However, for longitudinal studies, predictive probability methods using univariate information from only completers may not be most efficient, and data from on-going subjects can be utilized to improve efficiency. On the other hand, leveraging information from on-going subjects could allow an interim analysis to be potentially conducted once a sufficient number of subjects reach an earlier time point. For longitudinal outcomes, we derive closed-form formulas for predictive probabilities, including Bayesian predictive probability, predictive power, and conditional power and also give closed-form solutions for predictive probability of success in a future trial and the predictive probability of success of the best dose. When predictive probabilities are used for interim monitoring, we study their distributions and discuss their analytical cutoff values or stopping boundaries that have desired operating characteristics. We show that predictive probabilities utilizing all longitudinal information are more efficient for interim monitoring than that using information from completers only. To illustrate their practical application for longitudinal data, we analyze 2 real data examples from clinical trials. Copyright © 2018 John Wiley & Sons, Ltd.
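
    A univariate toy version of a Bayesian predictive probability calculation (beta-binomial with a flat prior; the paper's closed forms extend this to longitudinal outcomes and to predictive power and conditional power):

```python
import math

def beta_binom_pmf(k, m, a, b):
    """P(k successes in m future trials) under a Beta(a, b) posterior:
    C(m, k) * B(a + k, b + m - k) / B(a, b), via log-gamma."""
    return math.comb(m, k) * math.exp(
        math.lgamma(a + k) + math.lgamma(b + m - k) - math.lgamma(a + b + m)
        - (math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)))

def predictive_probability(x, n, n_max, x_need, a=1.0, b=1.0):
    """Predictive probability of trial success: chance that, given x
    responders among n patients so far, the final count reaches x_need
    out of n_max under a Beta(a, b) prior."""
    m = n_max - n                       # patients still to enroll
    post_a, post_b = a + x, b + n - x   # posterior after interim data
    return sum(beta_binom_pmf(k, m, post_a, post_b)
               for k in range(max(0, x_need - x), m + 1))
```

    Futility and efficacy stopping rules then reduce to comparing this quantity with pre-specified cutoffs.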

  9. A diagnostic strategy for pulmonary embolism based on standardised pretest probability and perfusion lung scanning: a management study

    International Nuclear Information System (INIS)

    Miniati, Massimo; Monti, Simonetta; Bauleo, Carolina; Scoscia, Elvio; Tonelli, Lucia; Dainelli, Alba; Catapano, Giosue; Formichi, Bruno; Di Ricco, Giorgio; Prediletto, Renato; Carrozzi, Laura; Marini, Carlo

    2003-01-01

    Pulmonary embolism remains a challenging diagnostic problem. We developed a simple diagnostic strategy based on combining assessment of the pretest probability with perfusion lung scan results to reduce the need for pulmonary angiography. We studied 390 consecutive patients (78% in-patients) with suspected pulmonary embolism. The pretest probability was rated low (≤10%), intermediate (>10%, ≤50%), moderately high (>50%, ≤90%) or high (>90%) according to a structured clinical model. Perfusion lung scans were independently assigned to one of four categories: normal; near-normal; abnormal, suggestive of pulmonary embolism (wedge-shaped perfusion defects); abnormal, not suggestive of pulmonary embolism (perfusion defects other than wedge shaped). Pulmonary embolism was diagnosed in patients with abnormal scans suggestive of pulmonary embolism and moderately high or high pretest probability. Patients with normal or near-normal scans and those with abnormal scans not suggestive of pulmonary embolism and low pretest probability were deemed not to have pulmonary embolism. All other patients were allocated to pulmonary angiography. Patients in whom pulmonary embolism was excluded were left untreated. All patients were followed up for 1 year. Pulmonary embolism was diagnosed non-invasively in 132 patients (34%), and excluded in 191 (49%). Pulmonary angiography was required in 67 patients (17%). The prevalence of pulmonary embolism was 41% (n=160). Patients in whom pulmonary embolism was excluded had a thrombo-embolic risk of 0.4% (95% confidence interval: 0.0%-2.8%). Our strategy permitted a non-invasive diagnosis or exclusion of pulmonary embolism in 83% of the cases (95% confidence interval: 79%-86%), and appeared to be safe. (orig.)
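
    Combining a pretest probability with a scan finding is Bayes' rule in odds form; a sketch (the likelihood ratio below is illustrative, not estimated from this study):

```python
def posttest_probability(pretest, lr):
    """Bayes' rule in odds form: post-test odds = pretest odds * LR,
    where LR is the likelihood ratio of the scan finding."""
    odds = pretest / (1 - pretest) * lr
    return odds / (1 + odds)

# e.g. a moderately high pretest probability of 0.7 combined with a
# scan pattern carrying a hypothetical LR of 10 for pulmonary embolism:
p = posttest_probability(0.7, 10.0)
```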

  10. Predicting longitudinal trajectories of health probabilities with random-effects multinomial logit regression.

    Science.gov (United States)

    Liu, Xian; Engel, Charles C

    2012-12-20

    Researchers often encounter longitudinal health data characterized by three or more ordinal or nominal categories. Random-effects multinomial logit models are generally applied to account for the potential lack of independence inherent in such clustered data. When parameter estimates are used to describe longitudinal processes, however, random effects, both between and within individuals, need to be retransformed to predict outcome probabilities correctly. This study attempts to go beyond existing work by developing a retransformation method that derives longitudinal growth trajectories of unbiased health probabilities. We estimated variances of the predicted probabilities by using the delta method. Additionally, we transformed the covariates' regression coefficients on the multinomial logit function, which are not substantively meaningful, into conditional effects on the predicted probabilities. The empirical illustration uses the longitudinal data from the Asset and Health Dynamics among the Oldest Old. Our analysis compared three sets of predicted probabilities of three health states at six time points, obtained from, respectively, the retransformation method, the best linear unbiased prediction, and the fixed-effects approach. The results demonstrate that neglecting to retransform random errors in the random-effects multinomial logit model results in severely biased longitudinal trajectories of health probabilities as well as overestimated effects of covariates on the probabilities. Copyright © 2012 John Wiley & Sons, Ltd.
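The retransformation issue can be illustrated with a minimal Monte Carlo sketch (not the delta-method machinery of the paper): plugging a zero random effect into the multinomial logit gives a different answer from averaging the probabilities over the random-effect distribution. All numbers below are illustrative assumptions:

```python
import numpy as np

def marginal_probs(linpreds, sigma_u, n_draws=20000, seed=0):
    """Average multinomial-logit probabilities over a normal random
    intercept instead of plugging in u = 0 (a minimal sketch of the
    retransformation idea; sigma_u and linpreds are illustrative).

    linpreds: fixed-effect linear predictors for the non-reference
    categories, shape (K-1,); the reference category's predictor is 0.
    """
    rng = np.random.default_rng(seed)
    u = rng.normal(0.0, sigma_u, size=(n_draws, 1))  # shared random intercept
    eta = np.concatenate([np.zeros((n_draws, 1)), linpreds + u], axis=1)
    p = np.exp(eta)
    p /= p.sum(axis=1, keepdims=True)                # softmax per draw
    return p.mean(axis=0)                            # marginal (retransformed) probabilities

# Naive prediction with the random effect set to zero:
naive = np.exp(np.array([0.0, 0.5]))
naive /= naive.sum()
retrans = marginal_probs(np.array([0.5]), sigma_u=1.5)
print(retrans, naive)  # the marginal probability is pulled toward 0.5
```

Because the softmax is nonlinear, the marginal probability differs systematically from the naive plug-in value, which is the bias the paper's retransformation method corrects.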

  11. Probability Modeling and Thinking: What Can We Learn from Practice?

    Science.gov (United States)

    Pfannkuch, Maxine; Budgett, Stephanie; Fewster, Rachel; Fitch, Marie; Pattenwise, Simeon; Wild, Chris; Ziedins, Ilze

    2016-01-01

    Because new learning technologies are enabling students to build and explore probability models, we believe that there is a need to determine the big enduring ideas that underpin probabilistic thinking and modeling. By uncovering the elements of the thinking modes of expert users of probability models we aim to provide a base for the setting of…

  12. Failure probability assessment of wall-thinned nuclear pipes using probabilistic fracture mechanics

    International Nuclear Information System (INIS)

    Lee, Sang-Min; Chang, Yoon-Suk; Choi, Jae-Boong; Kim, Young-Jin

    2006-01-01

    The integrity of a nuclear piping system has to be maintained during operation. To maintain this integrity, reliable assessment procedures, including fracture mechanics analysis, are required. Up to now, such assessments have been performed using conventional deterministic approaches, even though many uncertainties hinder a rational evaluation. In this respect, probabilistic approaches are considered an appropriate method for piping system evaluation. The objectives of this paper are to estimate the failure probabilities of wall-thinned pipes in nuclear secondary systems and to propose limited operating conditions under different types of loadings. To do this, a probabilistic assessment program using reliability index and simulation techniques was developed and applied to evaluate the failure probabilities of wall-thinned pipes subjected to internal pressure, bending moment and their combined loading. The sensitivity analysis results, as well as the prototypal integrity assessment results, showed the promising applicability of the probabilistic assessment program, the necessity of practical evaluation reflecting combined loading conditions, and the need for operation under limited operating conditions.
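A Monte Carlo failure-probability estimate of the kind such a program performs can be sketched as follows. The limit state (thin-wall hoop-stress capacity versus internal pressure) and all distributions here are illustrative assumptions, not the authors' actual model:

```python
import numpy as np

def failure_probability(n=200_000, seed=1):
    """Monte Carlo estimate of a pipe failure probability (a minimal
    sketch; the limit state and distributions are illustrative, not
    those of the assessment program described above)."""
    rng = np.random.default_rng(seed)
    sigma_f = rng.normal(300.0, 20.0, n)   # flow stress [MPa]
    t = rng.normal(4.0, 0.4, n)            # remaining wall thickness [mm]
    radius, pressure = 50.0, 15.0          # mean radius [mm], internal pressure [MPa]
    capacity = sigma_f * t / radius        # thin-wall hoop-stress burst capacity [MPa]
    g = capacity - pressure                # limit state: failure when g < 0
    return np.mean(g < 0.0)

print(failure_probability())
```

Sensitivity analysis in this framework amounts to repeating the estimate while perturbing one input distribution (e.g. the thickness scatter) at a time.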

  13. Data systematics and semidirect decay probability of the giant dipole resonance

    International Nuclear Information System (INIS)

    Ishkhanov, B.S.; Kapitonov, I.M.; Tutyn', I.A.

    1998-01-01

    Information on the probability of semidirect decay of the giant dipole resonance in sd- and fp-shell nuclei (A = 16-58) is derived on the basis of recent (γ, xγ′) experimental results. A shell effect in the A-dependence of this probability is observed.

  14. Quantum operations, state transformations and probabilities

    International Nuclear Information System (INIS)

    Chefles, Anthony

    2002-01-01

    In quantum operations, probabilities characterize both the degree of the success of a state transformation and, as density operator eigenvalues, the degree of mixedness of the final state. We give a unified treatment of pure→pure state transformations, covering both probabilistic and deterministic cases. We then discuss the role of majorization in describing the dynamics of mixing in quantum operations. The conditions for mixing enhancement for all initial states are derived. We show that mixing is monotonically decreasing for deterministic pure→pure transformations, and discuss the relationship between these transformations and deterministic local operations with classical communication entanglement transformations
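The majorization condition governing deterministic pure→pure transformations under local operations with classical communication (Nielsen's theorem) can be checked numerically; the vectors below are illustrative Schmidt-coefficient distributions, not examples from the paper:

```python
import numpy as np

def majorizes(q, p):
    """Return True if q majorizes p (both probability vectors): every
    partial sum of q, sorted decreasingly, dominates that of p.
    By Nielsen's theorem, |psi> -> |phi> is possible deterministically
    by LOCC iff phi's Schmidt coefficients majorize psi's."""
    p = np.sort(p)[::-1]
    q = np.sort(q)[::-1]
    return bool(np.all(np.cumsum(q) >= np.cumsum(p) - 1e-12))

# A more concentrated (less entangled) Schmidt vector majorizes a flatter one:
print(majorizes([0.7, 0.3], [0.5, 0.5]))   # -> True
print(majorizes([0.5, 0.5], [0.7, 0.3]))   # -> False
```

This matches the abstract's monotonicity statement: deterministic pure→pure transformations can only decrease mixedness in the majorization order.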

  15. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable to quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited, and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H.)

  16. Fractal supersymmetric QM, Geometric Probability and the Riemann Hypothesis

    CERN Document Server

    Castro, C

    2004-01-01

    The Riemann hypothesis (RH) states that the nontrivial zeros of the Riemann zeta-function are of the form $ s_n = 1/2 + i\\lambda_n $. Earlier work on the RH based on supersymmetric QM, whose potential was related to the Gauss-Jacobi theta series, provides the proper framework for constructing a well-defined algorithm to compute the probability of finding a zero (an infinity of zeros) on the critical line. Geometric probability theory furnishes the answer to the very difficult question of whether the probability that the RH is true is indeed equal to unity or not. To test the validity of this geometric probabilistic framework for computing the probability that the RH is true, we apply it directly to the hyperbolic sine function $ \\sinh (s) $, which obeys a trivial analog of the RH (the HSRH). Its zeros are equally spaced on the imaginary axis, $ s_n = 0 + i n \\pi $. The geometric probability of finding a zero (and an infinity of zeros) on the imaginary axis is exactly unity. We proceed with a fractal supersymme...

  17. Sampling, Probability Models and Statistical Reasoning -RE ...

    Indian Academy of Sciences (India)

    random sampling allows data to be modelled with the help of probability ... g based on different trials to get an estimate of the experimental error. ... research interests lie in the .... if e is indeed the true value of the proportion of defectives in the.

  18. A probability model for the failure of pressure containing parts

    International Nuclear Information System (INIS)

    Thomas, H.M.

    1978-01-01

    The model provides a method of estimating the order of magnitude of the leakage failure probability of pressure containing parts. It is a fatigue-based model which makes use of the statistics available for both specimens and vessels. Some novel concepts are introduced, but essentially the model simply quantifies the obvious, i.e., that failure probability increases with increases in stress levels, number of cycles, volume of material and volume of weld metal. A further model based on fracture mechanics estimates the catastrophic fraction of leakage failures. (author)

  19. Establishment probability in newly founded populations

    Directory of Open Access Journals (Sweden)

    Gusset Markus

    2012-06-01

    Abstract Background Establishment success in newly founded populations relies on reaching the established phase, which is defined by characteristic fluctuations of the population’s state variables. Stochastic population models can be used to quantify the establishment probability of newly founded populations; however, so far no simple but robust method for doing so existed. To determine a critical initial number of individuals that need to be released to reach the established phase, we used a novel application of the “Wissel plot”, where –ln(1 – P0(t)) is plotted against time t. This plot is based on the equation P0(t) = 1 – c1e^(–ω1t), which relates the probability of extinction by time t, P0(t), to two constants: c1 describes the probability of a newly founded population reaching the established phase, whereas ω1 describes the population’s probability of extinction per short time interval once established. Results For illustration, we applied the method to a previously developed stochastic population model of the endangered African wild dog (Lycaon pictus). A newly founded population reaches the established phase if the intercept of the (extrapolated) linear part of the “Wissel plot” with the y-axis, which is –ln(c1), is negative. For wild dogs in our model, this is the case if a critical initial number of four packs, consisting of eight individuals each, is released. Conclusions The method we present to quantify the establishment probability of newly founded populations is generic and inferences thus are transferable to other systems across the field of conservation biology. In contrast to other methods, our approach disaggregates the components of a population’s viability by distinguishing establishment from persistence.
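The fitting step behind the “Wissel plot” can be sketched as follows: since P0(t) = 1 – c1·e^(–ω1·t) implies –ln(1 – P0(t)) = –ln(c1) + ω1·t, a linear fit to the late-time part of the plot recovers ω1 from the slope and –ln(c1) from the intercept. The c1 and ω1 values below are synthetic, chosen only to illustrate the fit, not taken from the wild-dog model:

```python
import numpy as np

# Synthetic extinction-probability curve P0(t) = 1 - c1 * exp(-omega1 * t).
true_c1, true_omega1 = 1.2, 0.05
t = np.arange(10, 60)                        # late times, where the relation is linear
p0 = 1.0 - true_c1 * np.exp(-true_omega1 * t)

# "Wissel plot": -ln(1 - P0(t)) against t is a straight line.
y = -np.log(1.0 - p0)
slope, intercept = np.polyfit(t, y, 1)       # slope = omega1, intercept = -ln(c1)
established = intercept < 0                  # negative intercept => establishment reached
print(slope, intercept, established)
```

With c1 > 1, as here, the intercept –ln(c1) is negative, which is exactly the establishment criterion the abstract describes.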

  20. Computing under-ice discharge: A proof-of-concept using hydroacoustics and the Probability Concept

    Science.gov (United States)

    Fulton, John W.; Henneberg, Mark F.; Mills, Taylor J.; Kohn, Michael S.; Epstein, Brian; Hittle, Elizabeth A.; Damschen, William C.; Laveau, Christopher D.; Lambrecht, Jason M.; Farmer, William H.

    2018-01-01

    Under-ice discharge is estimated using open-water reference hydrographs; however, the ratings for ice-affected sites are generally qualified as poor. The U.S. Geological Survey (USGS), in collaboration with the Colorado Water Conservation Board, conducted a proof-of-concept study to develop an alternative method for computing under-ice discharge using hydroacoustics and the Probability Concept. The study site was located south of Minturn, Colorado (CO), USA, and was selected because of (1) its proximity to the existing USGS streamgage 09064600 Eagle River near Minturn, CO, and (2) its ease of access for verifying discharge using a variety of conventional methods. From late September 2014 to early March 2015, hydraulic conditions varied from open water to under ice. These temporal changes led to variations in water depth and velocity. Hydroacoustics (tethered and uplooking acoustic Doppler current profilers and acoustic Doppler velocimeters) were deployed to measure the vertical-velocity profile at a singularly important vertical of the channel cross-section. Because the velocity profile was non-standard and could not be characterized using a Power Law or Log Law, velocity data were analyzed using the Probability Concept, which is a probabilistic formulation of the velocity distribution. The Probability Concept-derived discharge was compared to conventional methods, including stage-discharge and index-velocity ratings and concurrent field measurements; each is complicated by the dynamics of ice formation, pressure influences on stage measurements, and variations in cross-sectional area due to ice formation. No particular discharge method was assigned as truth. Rather, one statistical metric (Kolmogorov-Smirnov; KS), agreement plots, and concurrent measurements provided a measure of comparability between the various methods. Regardless of the method employed, comparisons among the methods revealed encouraging results, depending on the flow conditions and the absence or presence of ice.
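One common formulation of the Probability Concept relates mean to maximum velocity through a single entropy parameter M, so that one velocity profile at the key vertical yields the mean velocity and, times the area, the discharge. The sketch below uses that relation with hypothetical values for M, u_max and the cross-sectional area; the study's actual calibration is not reproduced here:

```python
import numpy as np

def chiu_mean_velocity(u_max, M):
    """Mean cross-sectional velocity from the Probability Concept:
    u_bar = phi(M) * u_max, with phi(M) = e^M / (e^M - 1) - 1/M.
    (A sketch under stated assumptions; in practice M is calibrated
    from paired u_bar/u_max observations at the site.)"""
    phi = np.exp(M) / (np.exp(M) - 1.0) - 1.0 / M
    return phi * u_max

# Hypothetical values: once M is calibrated, a measured u_max gives
# the mean velocity, and multiplying by area gives discharge.
u_bar = chiu_mean_velocity(u_max=1.2, M=2.1)   # mean velocity [m/s]
q = u_bar * 8.5                                # discharge for an 8.5 m^2 section [m^3/s]
print(u_bar, q)
```

Because phi(M) lies strictly between 0 and 1, the predicted mean velocity is always a fixed fraction of the measured maximum, which is what makes the single-vertical measurement sufficient once M is known.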