WorldWideScience

Sample records for subjective probability estimates

  1. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    Subjective probabilities play a central role in many economic decisions, and act as an immediate confound of inferences about behavior, unless controlled for. Several procedures to recover subjective probabilities have been proposed, but in order to recover the correct latent probability one must...

  2. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    Subjective probabilities play a central role in many economic decisions and act as an immediate confound of inferences about behavior, unless controlled for. Several procedures to recover subjective probabilities have been proposed, but in order to recover the correct latent probability one must ...

  3. Estimation of first excursion probability for mechanical appendage system subjected to nonstationary earthquake excitation

    Energy Technology Data Exchange (ETDEWEB)

    Aoki, Shigeru; Suzuki, Kohei (Tokyo Metropolitan Univ. (Japan))

    1984-06-01

    An estimation technique whereby the first excursion probability of a mechanical appendage system subjected to nonstationary seismic excitation can be conveniently calculated is proposed. The first excursion probability of the appendage system is estimated by using this method and the following results are obtained. (1) The probability from this technique is more conservative than that from a simulation technique taking artificial time histories compatible with the design spectrum as input excitation. (2) The first excursion probability is practically independent of the natural period of the appendage system when the tolerable barrier level is normalized by the response amplification factor given by the design spectrum. (3) The first excursion probability decreases as the damping ratio of the appendage system increases. It also decreases as the mass ratio of the appendage system to the supporting system increases. (4) For the inelastic appendage system, the first excursion probability is reduced if an appropriate elongation is permitted.

  4. Implementation of Subjective Probability Estimates in Army Intelligence Procedures: A Critical Review of Research Findings

    Science.gov (United States)

    1980-03-01

    subjective probability estimates have been incorporated routinely into tactical intelligence communications. Research in the area of intelligence...analysis: Report on Phase I. Report FSC-71-5047. Gaithersburg, Md.: International Business Machines (IBM), Federal Systems Division, 1971. Kelly, C. W

  5. Estimating tail probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Carr, D.B.; Tolley, H.D.

    1982-12-01

    This paper investigates procedures for univariate nonparametric estimation of tail probabilities. Extrapolated values for tail probabilities beyond the data are also obtained based on the shape of the density in the tail. Several estimators which use exponential weighting are described. These are compared in a Monte Carlo study to nonweighted estimators, to the empirical cdf, to an integrated kernel, to a Fourier series estimate, to a penalized likelihood estimate and a maximum likelihood estimate. Selected weighted estimators are shown to compare favorably to many of these standard estimators for the sampling distributions investigated.
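
    The record above combines an empirical estimator inside the data range with tail extrapolation based on the shape of the density. The sketch below is a rough, generic illustration of that idea rather than the paper's exponentially weighted estimators: it uses the empirical survival function where data exist and an exponential fit to the largest order statistics to extrapolate beyond them; the tail fraction and the exponential tail form are assumptions made for the example.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.standard_exponential(500)           # sample with an exponential-type tail

    def tail_probability(sample, t, tail_frac=0.1):
        """Estimate P(X > t): empirical survival function inside the data range,
        exponential extrapolation beyond the largest observations."""
        sample = np.sort(sample)
        n = len(sample)
        if t <= sample[-1]:
            return np.mean(sample > t)          # plain empirical estimate
        k = max(int(tail_frac * n), 2)          # number of upper order statistics used
        tail = sample[-k:]
        u = tail[0]                             # tail threshold
        rate = 1.0 / np.mean(tail - u)          # exponential rate fitted to the excesses
        return (k / n) * np.exp(-rate * (t - u))

    print(tail_probability(x, 2.0))    # inside the data range
    print(tail_probability(x, 10.0))   # extrapolated beyond the sample maximum
    ```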

  6. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have...
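
    The calibration issue raised here (risk averse agents do not report their true beliefs under a proper scoring rule) can be seen in a small numerical sketch. The code below is a minimal illustration, not the authors' calibration procedure: it finds the report that maximizes expected utility of a quadratic scoring rule payoff, and the square-root utility used for the risk averse case is an arbitrary assumption for the example.

    ```python
    import numpy as np

    def qsr_payoff(report, outcome):
        """Quadratic scoring rule payoff in [0, 1] for a binary event."""
        return 1.0 - (outcome - report) ** 2

    def optimal_report(p_true, utility=lambda w: w):
        """Report maximizing expected utility of the QSR payoff under belief p_true."""
        grid = np.linspace(0.001, 0.999, 999)
        eu = (p_true * utility(qsr_payoff(grid, 1))
              + (1 - p_true) * utility(qsr_payoff(grid, 0)))
        return grid[np.argmax(eu)]

    p = 0.7
    print(optimal_report(p))                   # risk neutral: report ~= 0.70 (truthful)
    print(optimal_report(p, utility=np.sqrt))  # risk averse: report pulled toward 0.5
    ```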

  7. Developing a Methodology for Eliciting Subjective Probability Estimates During Expert Evaluations of Safety Interventions: Application for Bayesian Belief Networks

    Science.gov (United States)

    Wiegmann, Douglas A.

    2005-01-01

    The NASA Aviation Safety Program (AvSP) has defined several products that will potentially modify airline and/or ATC operations, enhance aircraft systems, and improve the identification of potentially hazardous situations within the National Airspace System (NAS). Consequently, there is a need to develop methods for evaluating the potential safety benefit of each of these intervention products so that resources can be effectively invested. Expert judgments are needed to develop Bayesian Belief Networks (BBNs) that model the potential impact that specific interventions may have. Specifically, the present report summarizes methodologies for improving the elicitation of probability estimates during expert evaluations of AvSP products for use in BBNs. The work involved joint efforts between Professor James Luxhoj from Rutgers University and researchers at the University of Illinois. The Rutgers project to develop BBNs was funded by NASA under the project entitled "Probabilistic Decision Support for Evaluating Technology Insertion and Assessing Aviation Safety System Risk." The present project was funded separately but supported the existing Rutgers program.

  8. Estimating Probabilities in Recommendation Systems

    OpenAIRE

    Sun, Mingxuan; Lebanon, Guy; Kidwell, Paul

    2010-01-01

    Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computat...

  9. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    2014-01-01

    We evaluate a binary lottery procedure for inducing risk neutral behavior in a subjective belief elicitation task. Prior research has shown this procedure to robustly induce risk neutrality when subjects are given a single risk task defined over objective probabilities. Drawing a sample from...... the same subject population, we find evidence that the binary lottery procedure also induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation of subjective probabilities in subjects...

  10. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    We evaluate the binary lottery procedure for inducing risk neutral behavior in a subjective belief elicitation task. Harrison, Martínez-Correa and Swarthout [2013] found that the binary lottery procedure works robustly to induce risk neutrality when subjects are given one risk task defined over...... objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation...

  11. Probability machines: consistent probability estimation using nonparametric learning machines.

    Science.gov (United States)

    Malley, J D; Kruppa, J; Dasgupta, A; Malley, K G; Ziegler, A

    2012-01-01

    Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications.
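
    The paper's sample code is for R packages; a rough Python analogue of the probability machine idea, in which a regression forest fitted to a 0/1 response yields direct estimates of P(Y = 1 | x) rather than only class labels, might look like the sketch below. The synthetic data and hyperparameters are assumptions made for illustration only.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(1)
    n = 2000
    X = rng.normal(size=(n, 3))
    p_true = 1 / (1 + np.exp(-(0.8 * X[:, 0] - 1.2 * X[:, 1])))   # true conditional probability
    y = rng.binomial(1, p_true)                                    # binary response

    # Regression forest on the 0/1 outcome: predictions are probability estimates.
    forest = RandomForestRegressor(n_estimators=500, min_samples_leaf=20, random_state=0)
    forest.fit(X, y)

    X_new = rng.normal(size=(5, 3))
    print(np.round(forest.predict(X_new), 3))                      # estimated P(Y=1 | x)
    print(np.round(1 / (1 + np.exp(-(0.8 * X_new[:, 0] - 1.2 * X_new[:, 1]))), 3))  # truth
    ```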

  12. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433

  13. A Generic Simulation Approach for the Fast and Accurate Estimation of the Outage Probability of Single Hop and Multihop FSO Links Subject to Generalized Pointing Errors

    KAUST Repository

    Ben Issaid, Chaouki

    2017-07-28

    When assessing the performance of the free space optical (FSO) communication systems, the outage probability encountered is generally very small, and thereby the use of naive Monte Carlo simulations becomes prohibitively expensive. To estimate these rare event probabilities, we propose in this work an importance sampling approach which is based on the exponential twisting technique to offer fast and accurate results. In fact, we consider a variety of turbulence regimes, and we investigate the outage probability of FSO communication systems, under a generalized pointing error model based on the Beckmann distribution, for both single and multihop scenarios. Selected numerical simulations are presented to show the accuracy and the efficiency of our approach compared to naive Monte Carlo.
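
    As a toy illustration of the exponential twisting idea, with a standard normal tail probability standing in for the paper's FSO outage events, the sketch below shifts the sampling distribution toward the rare event and reweights each draw by the likelihood ratio. The threshold and the choice of twisting parameter are assumptions for the example.

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(2)
    t = 5.0                      # rare-event threshold: P(X > t) ~ 2.9e-7 for X ~ N(0,1)
    n = 100_000

    # Naive Monte Carlo: almost never observes the event at this sample size.
    naive = np.mean(rng.standard_normal(n) > t)

    # Exponential twisting for a Gaussian: sample from N(theta, 1) and reweight.
    theta = t                                        # standard choice for a Gaussian tail
    x = rng.normal(loc=theta, scale=1.0, size=n)
    weights = np.exp(-theta * x + 0.5 * theta**2)    # likelihood ratio dP/dQ
    is_estimate = np.mean(weights * (x > t))

    print(naive, is_estimate, norm.sf(t))            # compare with the exact value
    ```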

  14. Robust Model-Free Multiclass Probability Estimation

    Science.gov (United States)

    Wu, Yichao; Zhang, Hao Helen; Liu, Yufeng

    2010-01-01

    Classical statistical approaches for multiclass probability estimation are typically based on regression techniques such as multiple logistic regression, or density estimation approaches such as linear discriminant analysis (LDA) and quadratic discriminant analysis (QDA). These methods often make certain assumptions on the form of probability functions or on the underlying distributions of subclasses. In this article, we develop a model-free procedure to estimate multiclass probabilities based on large-margin classifiers. In particular, the new estimation scheme is employed by solving a series of weighted large-margin classifiers and then systematically extracting the probability information from these multiple classification rules. A main advantage of the proposed probability estimation technique is that it does not impose any strong parametric assumption on the underlying distribution and can be applied for a wide range of large-margin classification methods. A general computational algorithm is developed for class probability estimation. Furthermore, we establish asymptotic consistency of the probability estimates. Both simulated and real data examples are presented to illustrate competitive performance of the new approach and compare it with several other existing methods. PMID:21113386

  15. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws such as cracks and crack-like flaws need to be detected by these NDE methods, and a reliably detectable crack size is required for safe life analysis of fracture critical parts. The paper discusses optimizing probability of detection (POD) demonstration experiments using the point estimate method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size within some tolerance is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false calls (POF) while keeping the flaw sizes in the set as small as possible.
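
    A minimal sketch of the binomial point-estimate logic is shown below: it computes the probability of passing a zero-miss demonstration as a function of the true POD for the commonly cited 29-of-29 acceptance rule. The acceptance rule and the example POD values are assumptions for illustration, not the paper's optimized designs.

    ```python
    from scipy.stats import binom

    n_flaws = 29          # flaws of (nearly) the same size in the demonstration set
    n_required = 29       # zero misses allowed under this acceptance rule

    def prob_pass(pod_true, n=n_flaws, k=n_required):
        """Probability of passing the demonstration (PPD) given the true POD."""
        return binom.sf(k - 1, n, pod_true)   # P(number of hits >= k)

    for pod in (0.80, 0.90, 0.95, 0.99):
        print(f"true POD {pod:.2f}: probability of passing = {prob_pass(pod):.3f}")

    # The 29/29 rule demonstrates POD >= 0.90 at ~95% confidence because a POD of
    # exactly 0.90 passes the demonstration with probability 0.90**29 ~= 0.047 < 0.05.
    print(0.90 ** 29)
    ```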

  16. Constructing diagnostic likelihood: clinical decisions using subjective versus statistical probability.

    Science.gov (United States)

    Kinnear, John; Jackson, Ruth

    2017-07-01

    Although physicians are highly trained in the application of evidence-based medicine, and are assumed to make rational decisions, there is evidence that their decision making is prone to biases. One of the biases that has been shown to affect accuracy of judgements is that of representativeness and base-rate neglect, where the saliency of a person's features leads to overestimation of their likelihood of belonging to a group. This results in the substitution of 'subjective' probability for statistical probability. This study examines clinicians' propensity to make estimations of subjective probability when presented with clinical information that is considered typical of a medical condition. The strength of the representativeness bias is tested by presenting choices in textual and graphic form. Understanding of statistical probability is also tested by omitting all clinical information. For the questions that included clinical information, 46.7% and 45.5% of clinicians made judgements of statistical probability, respectively. Where the question omitted clinical information, 79.9% of clinicians made a judgement consistent with statistical probability. There was a statistically significant difference in responses to the questions with and without representativeness information (χ2 (1, n=254)=54.45, p<0.001), with the presence of clinical information making clinicians more likely to substitute subjective probability for statistical probability. One of the causes for this representativeness bias may be the way clinical medicine is taught, where stereotypic presentations are emphasised in diagnostic decision making. Published by the BMJ Publishing Group Limited.

  17. Subjective Probability of Receiving Harm as a Function of Attraction and Harm Delivered.

    Science.gov (United States)

    Schlenker, Barry R.; And Others

    It was hypothesized that subjects who liked a source of potential harm would estimate the probability of receiving harm mediated by him as lower than would subjects who disliked the source. To test the hypothesis, subjects were asked to estimate the probability that a liked or disliked confederate would deliver an electric shock on each of 10…

  18. Inferring Beliefs as Subjectively Imprecise Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2012-01-01

    The experimental task consists of a series of standard lottery choices in which the subject is assumed to use conventional risk attitudes to select one lottery or the other and then a series of betting choices in which the subject is presented with a range of bookies offering odds on the outcome of some event...

  19. Risk Probability Estimating Based on Clustering

    DEFF Research Database (Denmark)

    Chen, Yong; Jensen, Christian D.; Gray, Elizabeth

    2003-01-01

    of prior experiences, recommendations from a trusted entity or the reputation of the other entity. In this paper we propose a dynamic mechanism for estimating the risk probability of a certain interaction in a given environment using hybrid neural networks. We argue that traditional risk assessment models...... from the insurance industry do not directly apply to ubiquitous computing environments. Instead, we propose a dynamic mechanism for risk assessment, which is based on pattern matching, classification and prediction procedures. This mechanism uses an estimator of risk probability, which is based...

  20. Display advertising: Estimating conversion probability efficiently

    OpenAIRE

    Safari, Abdollah; Altman, Rachel MacKay; Loughin, Thomas M.

    2017-01-01

    The goal of online display advertising is to entice users to "convert" (i.e., take a pre-defined action such as making a purchase) after clicking on the ad. An important measure of the value of an ad is the probability of conversion. The focus of this paper is the development of a computationally efficient, accurate, and precise estimator of conversion probability. The challenges associated with this estimation problem are the delays in observing conversions and the size of the data set (both...

  1. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    2017-01-01

    Subjective beliefs are elicited routinely in economics experiments. However, such elicitation often suffers from two possible disadvantages. First, beliefs are recovered in the form of a summary statistic, usually the mean, of the underlying latent distribution. Second, recovered beliefs are bias...

  2. Subjective Probability and Information Retrieval: A Review of the Psychological Literature.

    Science.gov (United States)

    Thompson, Paul

    1988-01-01

    Reviews the subjective probability estimation literature of six schools of human judgement and decision making: decision theory, behavioral decision theory, psychological decision theory, social judgement theory, information integration theory, and attribution theory. Implications for probabilistic information retrieval are discussed, including…

  3. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers' theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises' exceptionally restrictive definition of probability. This paper challenges Richard von Mises' definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that both the nature of human action and the relative frequency method for calculating numerical probabilities presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  4. Estimation of Extreme Response and Failure Probability of Wind Turbines under Normal Operation using Probability Density Evolution Method

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Liu, W. F.

    2013-01-01

    Estimation of extreme response and failure probability of structures subjected to ultimate design loads is essential for structural design of wind turbines according to the new standard IEC61400-1. This task is focused on in the present paper in virtue of probability density evolution method (PDEM...

  5. Revising probability estimates: Why increasing likelihood means increasing impact.

    Science.gov (United States)

    Maglio, Sam J; Polman, Evan

    2016-08-01

    Forecasted probabilities rarely stay the same for long. Instead, they are subject to constant revision: moving upward or downward, uncertain events become more or less likely. Yet little is known about how people interpret probability estimates beyond static snapshots, like a 30% chance of rain. Here, we consider the cognitive, affective, and behavioral consequences of revisions to probability forecasts. Stemming from a lay belief that revisions signal the emergence of a trend, we find in 10 studies (comprising uncertain events such as weather, climate change, sex, sports, and wine) that upward changes to event-probability (e.g., increasing from 20% to 30%) cause events to feel less remote than downward changes (e.g., decreasing from 40% to 30%), and subsequently change people's behavior regarding those events despite the revised event-probabilities being the same. Our research sheds light on how revising the probabilities for future events changes how people manage those uncertain events. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  6. Support Theory: A Nonextensional Representation of Subjective Probability.

    Science.gov (United States)

    Tversky, Amos; Koehler, Derek J.

    1994-01-01

    A new theory of subjective probability is presented. According to this theory, different descriptions of the same event can give rise to different judgments. Experimental evidence supporting this theory is summarized, demonstrating that the theory provides a unified treatment of a wide range of empirical findings. (SLD)

  7. Continuous subjective expected utility with non-additive probabilities

    NARCIS (Netherlands)

    P.P. Wakker (Peter)

    1989-01-01

    A well-known theorem of Debreu about additive representations of preferences is applied in a non-additive context, to characterize continuous subjective expected utility maximization for the case where the probability measures may be non-additive. The approach of this paper does not need

  8. Single Trial Probability Applications: Can Subjectivity Evade Frequency Limitations?

    Directory of Open Access Journals (Sweden)

    David Howden

    2009-10-01

    Frequency probability theorists define an event's probability distribution as the limit of a repeated set of trials belonging to a homogeneous collective. The subsets of this collective are events which we have deficient knowledge about on an individual level, although for the larger collective we have knowledge of its aggregate behavior. Hence, probabilities can only be achieved through repeated trials of these subsets arriving at the established frequencies that define the probabilities. Crovelli (2009) argues that this is a mistaken approach, and that a subjective assessment of individual trials should be used instead. Bifurcating between the two concepts of risk and uncertainty, Crovelli first asserts that probability is the tool used to manage uncertain situations, and then attempts to rebuild a definition of probability theory with this in mind. We show that such an attempt has little to gain, and results in an indeterminate application of entrepreneurial forecasting to uncertain decisions—a process far-removed from any application of probability theory.

  9. Estimation of transition probabilities of credit ratings

    Science.gov (United States)

    Peng, Gan Chew; Hin, Pooi Ah

    2015-12-01

    The present research is based on the quarterly credit ratings of ten companies over 15 years taken from the database of the Taiwan Economic Journal. The components of the vector m_i = (m_i1, m_i2, ..., m_i10) denote the credit ratings of the ten companies in the i-th quarter. The vector m_{i+1} in the next quarter is modelled to be dependent on the vector m_i via a conditional distribution which is derived from a 20-dimensional power-normal mixture distribution. The transition probability P_kl(i, j) of getting m_{i+1,j} = l given that m_{i,j} = k is then computed from the conditional distribution. It is found that the variation of the transition probability P_kl(i, j) as i varies is able to give an indication of the possible transition of the credit rating of the j-th company in the near future.
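
    For contrast with the power-normal mixture approach described above, a much simpler count-based (maximum likelihood) estimator of a rating transition matrix can be sketched as follows; the rating sequences below are made up for illustration and this is not the paper's method.

    ```python
    import numpy as np

    # Hypothetical quarterly ratings for a few companies, coded 1 (best) to 4 (worst).
    ratings = [
        [1, 1, 2, 2, 2, 3, 2, 2],
        [2, 2, 2, 1, 1, 1, 2, 2],
        [3, 3, 4, 4, 3, 3, 3, 2],
    ]

    n_states = 4
    counts = np.zeros((n_states, n_states))
    for seq in ratings:
        for a, b in zip(seq[:-1], seq[1:]):
            counts[a - 1, b - 1] += 1          # one observed transition a -> b

    # Row-normalize to get maximum likelihood transition probabilities P(k -> l).
    row_sums = counts.sum(axis=1, keepdims=True)
    P = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)
    print(np.round(P, 2))
    ```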

  10. Multidimensional rare event probability estimation algorithm

    Directory of Open Access Journals (Sweden)

    Leonidas Sakalauskas

    2013-09-01

    This work presents a Markov chain Monte Carlo algorithm for estimating the frequencies of multi-dimensional rare events. The logits of the rare-event likelihood are modelled with a Poisson distribution whose parameters are distributed according to a multivariate normal law with unknown parameters, the mean vector and covariance matrix. The estimates of the unknown parameters are calculated by the maximum likelihood method, and the equations that the model's maximum likelihood parameter estimates must satisfy are derived. Positive definiteness of the estimated covariance matrices is controlled by calculating the ratio between the maximum and minimum eigenvalues of each matrix.

  11. Adaptive estimation of binomial probabilities under misclassification

    NARCIS (Netherlands)

    Albers, Willem/Wim; Veldman, H.J.

    1984-01-01

    If misclassification occurs the standard binomial estimator is usually seriously biased. It is known that an improvement can be achieved by using more than one observer in classifying the sample elements. Here it will be investigated which number of observers is optimal given the total number of

  12. Dental age estimation: the role of probability estimates at the 10 year threshold.

    Science.gov (United States)

    Lucas, Victoria S; McDonald, Fraser; Neil, Monica; Roberts, Graham

    2014-08-01

    The use of probability at the 18 year threshold has simplified the reporting of dental age estimates for emerging adults. The availability of simple to use, widely available software has enabled the development of the probability threshold for individual teeth in growing children. Tooth development stage data from a previous study at the 10 year threshold were reused to estimate the probability of developing teeth being above or below the 10 year threshold using the NORMDIST function in Microsoft Excel. The probabilities within an individual subject are averaged to give a single probability that a subject is above or below 10 years old. To test the validity of this approach, dental panoramic radiographs of 50 female and 50 male children within 2 years of the chronological age were assessed with the chronological age masked. Once the whole validation set of 100 radiographs had been assessed, the masking was removed and the chronological age and dental age compared. The dental age was compared with chronological age to determine whether the dental age correctly or incorrectly identified a validation subject as above or below the 10 year threshold. The probability estimates correctly identified children as above or below the threshold on 94% of occasions. Only 2% of the validation group with a chronological age of less than 10 years were assigned to the over 10 year group. This study indicates the very high accuracy of assignment at the 10 year threshold. Further work at other legally important age thresholds is needed to explore the value of this approach to the technique of age estimation. Copyright © 2014. Published by Elsevier Ltd.
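
    A rough sketch of the thresholding logic described above, using scipy.stats.norm in place of Excel's NORMDIST and averaging per-tooth probabilities into a single subject-level probability, is given below. The per-stage mean ages and standard deviations are invented numbers for illustration, not the study's reference data.

    ```python
    from scipy.stats import norm

    THRESHOLD = 10.0   # years

    # Hypothetical reference data: for each observed tooth development stage,
    # the mean and SD of chronological age at which that stage is attained.
    observed_stages = [
        {"tooth": "LL6", "mean_age": 9.2, "sd": 0.9},
        {"tooth": "LL7", "mean_age": 10.8, "sd": 1.1},
        {"tooth": "LL5", "mean_age": 9.8, "sd": 1.0},
    ]

    # Probability that each tooth's stage indicates an age below the threshold,
    # then averaged across teeth to give one subject-level probability.
    probs_below = [norm.cdf(THRESHOLD, t["mean_age"], t["sd"]) for t in observed_stages]
    p_below = sum(probs_below) / len(probs_below)

    print(f"P(subject is under {THRESHOLD:.0f} years) ~= {p_below:.2f}")
    print(f"P(subject is over  {THRESHOLD:.0f} years) ~= {1 - p_below:.2f}")
    ```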

  13. Probability shapes perceptual precision: A study in orientation estimation.

    Science.gov (United States)

    Jabar, Syaheed B; Anderson, Britt

    2015-12-01

    Probability is known to affect perceptual estimations, but an understanding of mechanisms is lacking. Moving beyond binary classification tasks, we had naive participants report the orientation of briefly viewed gratings where we systematically manipulated contingent probability. Participants rapidly developed faster and more precise estimations for high-probability tilts. The shapes of their error distributions, as indexed by a kurtosis measure, also showed a distortion from Gaussian. This kurtosis metric was robust, capturing probability effects that were graded, contextual, and varying as a function of stimulus orientation. Our data can be understood as a probability-induced reduction in the variability or "shape" of estimation errors, as would be expected if probability affects the perceptual representations. As probability manipulations are an implicit component of many endogenous cuing paradigms, changes at the perceptual level could account for changes in performance that might have traditionally been ascribed to "attention." (c) 2015 APA, all rights reserved.

  14. Accounting Fraud: an estimation of detection probability

    Directory of Open Access Journals (Sweden)

    Artur Filipe Ewald Wuerges

    2014-12-01

    Financial statement fraud (FSF) is costly for investors and can damage the credibility of the audit profession. To prevent and detect fraud, it is helpful to know its causes. The binary choice models (e.g. logit and probit) commonly used in the extant literature, however, fail to account for undetected cases of fraud and thus present unreliable hypothesis tests. Using a sample of 118 companies accused of fraud by the Securities and Exchange Commission (SEC), we estimated a logit model that corrects the problems arising from undetected frauds in U.S. companies. To avoid multicollinearity problems, we extracted seven factors from 28 variables using the principal factors method. Our results indicate that only 1.43 percent of the instances of FSF were publicized by the SEC. Of the six significant variables included in the traditional, uncorrected logit model, three were found to be actually non-significant in the corrected model. The likelihood of FSF is 5.12 times higher when the firm's auditor issues an adverse or qualified report.

  15. The estimation of tree posterior probabilities using conditional clade probability distributions.

    Science.gov (United States)

    Larget, Bret

    2013-07-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample.

  16. Detection probabilities for time-domain velocity estimation

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    1991-01-01

    programs, it is demonstrated that the probability of correct estimation depends on the signal-to-noise ratio, transducer bandwidth, number of A-lines and number of samples used in the correlation estimate. The influence of applying a stationary echo-canceler is explained. The echo canceling can be modeled...... as a filter with a transfer function depending on the actual velocity. This influences the detection probability, which gets lower at certain velocities. An index directly reflecting the probability of detection can easily be calculated from the cross-correlation estimated. This makes it possible to assess...

  17. Nonparametric Probability Density Estimation by Discrete Maximum Penalized- Likelihood Criteria

    OpenAIRE

    SCOTT, D. W.; Tapia, R. A.; Thompson, J. R.

    1980-01-01

    A nonparametric probability density estimator is proposed that is optimal with respect to a discretized form of a continuous penalized-likelihood criterion functional. Approximation results relating the discrete estimator to the estimate obtained by solving the corresponding infinite-dimensional problem are presented. The discrete estimator is shown to be consistent. The numerical implementation of this discrete estimator is outlined and examples displayed. A simulation study compares the int...

  18. Incorporating detection probability into northern Great Plains pronghorn population estimates

    Science.gov (United States)

    Jacques, Christopher N.; Jenks, Jonathan A.; Grovenburg, Troy W.; Klaver, Robert W.; DePerno, Christopher S.

    2014-01-01

    Pronghorn (Antilocapra americana) abundances commonly are estimated using fixed-wing surveys, but these estimates are likely to be negatively biased because of violations of key assumptions underpinning line-transect methodology. Reducing bias and improving precision of abundance estimates through use of detection probability and mark-resight models may allow for more responsive pronghorn management actions. Given their potential application in population estimation, we evaluated detection probability and mark-resight models for use in estimating pronghorn population abundance. We used logistic regression to quantify how the probability of detecting pronghorn was influenced by group size, animal activity, percent vegetation, cover type, and topography. We estimated pronghorn population size by study area and year using mixed logit-normal mark-resight (MLNM) models. Pronghorn detection probability increased with group size, animal activity, and percent vegetation; overall detection probability was 0.639 (95% CI = 0.612–0.667) with 396 of 620 pronghorn groups detected. Despite model selection uncertainty, the best detection probability models were 44% (range = 8–79%) and 180% (range = 139–217%) greater than traditional pronghorn population estimates. Similarly, the best MLNM models were 28% (range = 3–58%) and 147% (range = 124–180%) greater than traditional population estimates. Detection probability of pronghorn was not constant but depended on both intrinsic and extrinsic factors. When pronghorn detection probability is a function of animal group size, animal activity, landscape complexity, and percent vegetation, traditional aerial survey techniques will result in biased pronghorn abundance estimates. Standardizing survey conditions, increasing resighting occasions, or accounting for variation in individual heterogeneity in mark-resight models will increase the accuracy and precision of pronghorn population estimates.
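
    A minimal sketch of the first step described above, fitting a logistic regression for detection probability as a function of group-level covariates, might look like the following; the simulated covariates and coefficients are assumptions for the example, not the study's survey data.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(3)
    n_groups = 600

    group_size = rng.poisson(5, n_groups) + 1          # pronghorn per group
    active = rng.binomial(1, 0.5, n_groups)            # 1 = group was moving
    veg_cover = rng.uniform(0, 100, n_groups)          # percent vegetation cover

    # Simulate detections: larger, active groups and groups in areas with more
    # vegetation (as in the abstract) are detected more often in this toy setup.
    logit = -1.0 + 0.25 * group_size + 0.8 * active + 0.01 * veg_cover
    detected = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    X = np.column_stack([group_size, active, veg_cover])
    model = LogisticRegression(max_iter=1000).fit(X, detected)

    # Detection probability for a new group: e.g. 3 animals, bedded, 40% cover.
    print(model.predict_proba([[3, 0, 40]])[0, 1])
    ```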

  19. Estimating total suspended sediment yield with probability sampling

    Science.gov (United States)

    Robert B. Thomas

    1985-01-01

    The "Selection At List Time" (SALT) scheme controls sampling of concentration for estimating total suspended sediment yield. The probability of taking a sample is proportional to its estimated contribution to total suspended sediment discharge. This procedure gives unbiased estimates of total suspended sediment yield and the variance of the...
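
    The core idea, sampling each unit with probability proportional to its estimated contribution and weighting observations by the inverse of that probability, can be sketched as follows. This is a generic probability-proportional-to-size example using a with-replacement Hansen-Hurwitz estimator and made-up numbers, not the SALT implementation itself.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Hypothetical flow periods: true sediment discharge and a cheap auxiliary
    # estimate of each period's contribution (e.g. from a flow-sediment rating curve).
    true_load = rng.gamma(shape=2.0, scale=50.0, size=200)
    aux_estimate = true_load * rng.uniform(0.7, 1.3, size=200)   # rough, imperfect proxy

    p = aux_estimate / aux_estimate.sum()      # selection probabilities, proportional to size
    n_samples = 20
    idx = rng.choice(len(true_load), size=n_samples, replace=True, p=p)

    # Hansen-Hurwitz estimator of the total: average of (measured value / selection prob).
    total_estimate = np.mean(true_load[idx] / p[idx])
    print(total_estimate, true_load.sum())     # unbiased estimate vs. the true total
    ```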

  20. Estimating the probability of positive crossmatch after negative virtual crossmatch

    NARCIS (Netherlands)

    K.M. Glorie (Kristiaan)

    2012-01-01

    This paper estimates the probability of virtual crossmatch failure in kidney exchange matching. In particular, the probability of a positive crossmatch after a negative virtual crossmatch is related to the recipient's PRA level. Using Dutch kidney exchange data, we find significant

  1. Crash probability estimation via quantifying driver hazard perception.

    Science.gov (United States)

    Li, Yang; Zheng, Yang; Wang, Jianqiang; Kodaka, Kenji; Li, Keqiang

    2017-06-05

    Crash probability estimation is an important method to predict the potential reduction of crash probability contributed by forward collision avoidance technologies (FCATs). In this study, we propose a practical approach to estimate crash probability, which combines a field operational test and numerical simulations of a typical rear-end crash model. To consider driver hazard perception characteristics, we define a novel hazard perception measure, called driver risk response time, by considering both time-to-collision (TTC) and driver braking response to impending collision risk in a near-crash scenario. Also, we establish a driving database under mixed Chinese traffic conditions based on a CMBS (Collision Mitigation Braking Systems)-equipped vehicle. Applying the crash probability estimation in this database, we estimate the potential decrease in crash probability owing to use of CMBS. A comparison of the results with CMBS on and off shows a 13.7% reduction of crash probability in a typical rear-end near-crash scenario with a one-second delay of driver's braking response. These results indicate that CMBS has a positive effect on collision prevention, especially in the case of inattentive or older drivers. The proposed crash probability estimation offers a practical way for evaluating the safety benefits in the design and testing of FCATs. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Information-theoretic methods for estimating of complicated probability distributions

    CERN Document Server

    Zong, Zhi

    2006-01-01

    Mixing various disciplines frequently produces something that is profound and far-reaching. Cybernetics is such an often-quoted example. The mix of information theory, statistics and computing technology proves to be very useful, and has led to the recent development of information-theory based methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is the fundamental task for quite some fields besides statistics, such as reliability, probabilistic risk analysis (PSA), machine learning, pattern recognition, image processing, neur

  3. Naive Probability: Model-Based Estimates of Unique Events.

    Science.gov (United States)

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N

    2015-08-01

    We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning. © 2014 Cognitive Science Society, Inc.

  4. First hitting probabilities for semi-Markov chains and estimation

    DEFF Research Database (Denmark)

    Georgiadis, Stylianos

    2017-01-01

    We first consider a stochastic system described by an absorbing semi-Markov chain with finite state space and we introduce the absorption probability to a class of recurrent states. Afterwards, we study the first hitting probability to a subset of states for an irreducible semi-Markov chain....... In the latter case, a nonparametric estimator for the first hitting probability is proposed and the asymptotic properties of strong consistency and asymptotic normality are proven. Finally, a numerical application on a five-state system is presented to illustrate the performance of this estimator....

  5. ESTIMATION OF TRIP MODE PROBABILITY CHOICE USING MULTINOMIAL LOGISTIC MODEL

    Directory of Open Access Journals (Sweden)

    Bilous, A.

    2012-06-01

    The modal split step of the four-step model for determining urban travel demand is analyzed. Utility functions are composed and their coefficients are calibrated in TransCAD. Equations for estimating trip mode choice probability are shown and a numerical illustration of the estimation is given.
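
    As a small worked example of the choice-probability equation referred to above, the multinomial logit probability of each mode is the softmax of the mode utilities; the utility values below are made up and are not the calibrated TransCAD coefficients.

    ```python
    import numpy as np

    def mode_choice_probabilities(utilities):
        """Multinomial logit: P(mode m) = exp(V_m) / sum_k exp(V_k)."""
        v = np.asarray(utilities, dtype=float)
        e = np.exp(v - v.max())          # subtract the max for numerical stability
        return e / e.sum()

    # Hypothetical utilities for car, bus, and walking on one origin-destination pair,
    # e.g. V = beta_time * time + beta_cost * cost + mode-specific constant.
    V = {"car": -1.2, "bus": -1.8, "walk": -2.5}
    probs = mode_choice_probabilities(list(V.values()))
    print(dict(zip(V, np.round(probs, 3))))
    ```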

  6. Fast Estimation of Outage Probabilities in MIMO Channels

    NARCIS (Netherlands)

    Srinivasan, R.; Tiba, G.

    2004-01-01

    Fast estimation methods for small outage probabilities of signaling in fading multiple-input multiple-output (MIMO) channels are developed. Communication over such channels is of much current interest, and quick and accurate methods for estimating outage capacities are needed. The methods described

  7. Estimating Outcome Probabilities of Quantum Circuits Using Quasiprobabilities.

    Science.gov (United States)

    Pashayan, Hakop; Wallman, Joel J; Bartlett, Stephen D

    2015-08-14

    We present a method for estimating the probabilities of outcomes of a quantum circuit using Monte Carlo sampling techniques applied to a quasiprobability representation. Our estimate converges to the true quantum probability at a rate determined by the total negativity in the circuit, using a measure of negativity based on the 1-norm of the quasiprobability. If the negativity grows at most polynomially in the size of the circuit, our estimator converges efficiently. These results highlight the role of negativity as a measure of nonclassical resources in quantum computation.
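
    A generic sketch of this kind of estimator, applied to an abstract quasiprobability vector over outcomes rather than a specific circuit representation, is shown below: outcomes are sampled proportionally to the absolute quasiprobabilities, signs are carried as weights, and the estimate is scaled by the 1-norm, whose size reflects the negativity that governs the variance. The quasiprobability vector is invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    # A made-up quasiprobability vector over 8 outcomes: entries sum to 1 but some
    # are negative, so it is not a true probability distribution.
    q = np.array([0.35, 0.25, 0.20, 0.15, 0.10, 0.05, -0.05, -0.05])
    target = {0, 1}                     # estimate P(outcome in target) = sum of q over target

    norm1 = np.abs(q).sum()             # negativity enters through this 1-norm
    sampling = np.abs(q) / norm1        # sample outcomes proportionally to |q|

    n = 200_000
    draws = rng.choice(len(q), size=n, p=sampling)
    weights = norm1 * np.sign(q[draws]) * np.isin(draws, list(target))

    print(weights.mean(), q[list(target)].sum())   # Monte Carlo estimate vs. exact value
    ```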

  8. Estimating the empirical probability of submarine landslide occurrence

    Science.gov (United States)

    Geist, Eric L.; Parsons, Thomas E.; Mosher, David C.; Shipp, Craig; Moscardelli, Lorena; Chaytor, Jason D.; Baxter, Christopher D. P.; Lee, Homa J.; Urgeles, Roger

    2010-01-01

    The empirical probability for the occurrence of submarine landslides at a given location can be estimated from age dates of past landslides. In this study, tools developed to estimate earthquake probability from paleoseismic horizons are adapted to estimate submarine landslide probability. In both types of estimates, one has to account for the uncertainty associated with age-dating individual events as well as the open time intervals before and after the observed sequence of landslides. For observed sequences of submarine landslides, we typically only have the age date of the youngest event and possibly of a seismic horizon that lies below the oldest event in a landslide sequence. We use an empirical Bayes analysis based on the Poisson-Gamma conjugate prior model specifically applied to the landslide probability problem. This model assumes that landslide events as imaged in geophysical data are independent and occur in time according to a Poisson distribution characterized by a rate parameter λ. With this method, we are able to estimate the most likely value of λ and, importantly, the range of uncertainty in this estimate. Examples considered include landslide sequences observed in the Santa Barbara Channel, California, and in Port Valdez, Alaska. We confirm that, given the uncertainties of age dating, landslide complexes can be treated as single events by performing a statistical test on age dates representing the main failure episode of the Holocene Storegga landslide complex.
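
    A minimal sketch of the Poisson-Gamma calculation, with made-up landslide counts, record length, and prior hyperparameters rather than the Santa Barbara Channel or Port Valdez data, is shown below: conjugacy gives a Gamma posterior for the rate λ, and the probability of at least one event in a future window follows by averaging the Poisson probability over that posterior.

    ```python
    from scipy.stats import gamma

    # Observed record: n landslides imaged over a dated interval of T years (made up).
    n_events, T_record = 4, 12_000.0

    # Gamma prior on the Poisson rate lambda (shape a0, rate b0); a weak prior is assumed.
    a0, b0 = 0.5, 1000.0

    # Conjugate update: posterior is Gamma(a0 + n, b0 + T).
    a_post, b_post = a0 + n_events, b0 + T_record

    # Posterior mean rate and a 95% credible interval (scipy uses scale = 1/rate).
    post = gamma(a_post, scale=1.0 / b_post)
    print("rate per year:", post.mean(), post.interval(0.95))

    # Probability of at least one landslide in the next t years, averaging exp(-lambda*t)
    # over the posterior: 1 - (b_post / (b_post + t)) ** a_post.
    t_future = 1000.0
    p_event = 1.0 - (b_post / (b_post + t_future)) ** a_post
    print("P(>=1 event in next", t_future, "years) =", round(p_event, 3))
    ```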

  9. Failure Probability Estimation of Wind Turbines by Enhanced Monte Carlo

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Naess, Arvid

    2012-01-01

    This paper discusses the estimation of the failure probability of wind turbines required by codes of practice for designing them. The Standard Monte Carlo (SMC) simulations may be used for this reason conceptually as an alternative to the popular Peaks-Over-Threshold (POT) method. However......, estimation of very low failure probabilities with SMC simulations leads to unacceptably high computational costs. In this study, an Enhanced Monte Carlo (EMC) method is proposed that overcomes this obstacle. The method has advantages over both POT and SMC in terms of its low computational cost and accuracy...... is controlled by the pitch controller. This provides a fair framework for comparison of the behavior and failure event of the wind turbine with emphasis on the effect of the pitch controller. The Enhanced Monte Carlo method is then applied to the model and the failure probabilities of the model are estimated...

  10. Internal Medicine residents use heuristics to estimate disease probability.

    Science.gov (United States)

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Training in Bayesian reasoning may have limited impact on accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. We randomized 55 Internal Medicine residents to different versions of four clinical vignettes and asked them to estimate probabilities of target conditions. We manipulated the clinical data for each vignette to be consistent with either 1) using a representative heuristic, by adding non-discriminating prototypical clinical features of the target condition, or 2) using anchoring with adjustment heuristic, by providing a high or low anchor for the target condition. When presented with additional non-discriminating data the odds of diagnosing the target condition were increased (odds ratio (OR) 2.83, 95% confidence interval [1.30, 6.15], p = 0.009). Similarly, the odds of diagnosing the target condition were increased when a high anchor preceded the vignette (OR 2.04, [1.09, 3.81], p = 0.025). Our findings suggest that despite previous exposure to the use of Bayesian reasoning, residents use heuristics, such as the representative heuristic and anchoring with adjustment, to estimate probabilities. Potential reasons for attribute substitution include the relative cognitive ease of heuristics vs. Bayesian reasoning or perhaps residents in their clinical practice use gist traces rather than precise probability estimates when diagnosing.

  11. Allelic drop-out probabilities estimated by logistic regression

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Asplund, Maria

    2012-01-01

    We discuss the model for estimating drop-out probabilities presented by Tvedebrink et al. [7] and the concerns that have been raised. The criticism of the model has demonstrated that the model is not perfect. However, the model is very useful for advanced forensic genetic work, where allelic dro...

  12. Estimation of failure probabilities of linear dynamic systems by ...

    Indian Academy of Sciences (India)

    An iterative method for estimating the failure probability for certain time-variant reliability problems has been developed. In the paper, the focus is on the displacement response of a linear oscillator driven by white noise. Failure is then assumed to occur when the displacement response exceeds a critical threshold.

  13. COMPARATIVE ANALYSIS OF ESTIMATION METHODS OF PHARMACY ORGANIZATION BANKRUPTCY PROBABILITY

    Directory of Open Access Journals (Sweden)

    V. L. Adzhienko

    2014-01-01

    The purpose of this study was to determine the probability of bankruptcy by various methods in order to predict a financial crisis of the pharmacy organization. The probability of pharmacy organization bankruptcy was estimated using W. Beaver's method as adopted in the Russian Federation, together with an integrated assessment of financial stability based on scoring analysis. The results obtained by the different methods are comparable and show that the risk of bankruptcy of the pharmacy organization is small.

  14. Incorporating medical interventions into carrier probability estimation for genetic counseling

    Directory of Open Access Journals (Sweden)

    Katki Hormuzd A

    2007-03-01

    Background Mendelian models for predicting who may carry an inherited deleterious mutation of known disease genes based on family history are used in a variety of clinical and research activities. People presenting for genetic counseling are increasingly reporting risk-reducing medical interventions in their family histories because, recently, a slew of prophylactic interventions have become available for certain diseases. For example, oophorectomy reduces risk of breast and ovarian cancers, and is now increasingly being offered to women with family histories of breast and ovarian cancer. Mendelian models should account for medical interventions because interventions modify mutation penetrances and thus affect the carrier probability estimate. Methods We extend Mendelian models to account for medical interventions by accounting for post-intervention disease history through an extra factor that can be estimated from published studies of the effects of interventions. We apply our methods to incorporate oophorectomy into the BRCAPRO model, which predicts a woman's risk of carrying mutations in BRCA1 and BRCA2 based on her family history of breast and ovarian cancer. This new BRCAPRO is available for clinical use. Results We show that accounting for interventions undergone by family members can seriously affect the mutation carrier probability estimate, especially if the family member has lived many years post-intervention. We show that interventions have more impact on the carrier probability as the benefits of intervention differ more between carriers and non-carriers. Conclusion These findings imply that carrier probability estimates that do not account for medical interventions may be seriously misleading and could affect a clinician's recommendation about offering genetic testing. The BayesMendel software, which allows one to implement any Mendelian carrier probability model, has been extended to allow medical interventions, so future

  15. Collective animal behavior from Bayesian estimation and probability matching.

    Directory of Open Access Journals (Sweden)

    Alfonso Pérez-Escudero

    2011-11-01

    Animals living in groups make movement decisions that depend, among other factors, on social interactions with other group members. Our present understanding of social rules in animal collectives is mainly based on empirical fits to observations, with less emphasis in obtaining first-principles approaches that allow their derivation. Here we show that patterns of collective decisions can be derived from the basic ability of animals to make probabilistic estimations in the presence of uncertainty. We build a decision-making model with two stages: Bayesian estimation and probabilistic matching. In the first stage, each animal makes a Bayesian estimation of which behavior is best to perform taking into account personal information about the environment and social information collected by observing the behaviors of other animals. In the probability matching stage, each animal chooses a behavior with a probability equal to the Bayesian-estimated probability that this behavior is the most appropriate one. This model derives very simple rules of interaction in animal collectives that depend only on two types of reliability parameters, one that each animal assigns to the other animals and another given by the quality of the non-social information. We test our model by obtaining theoretically a rich set of observed collective patterns of decisions in three-spined sticklebacks, Gasterosteus aculeatus, a shoaling fish species. The quantitative link shown between probabilistic estimation and collective rules of behavior allows a better contact with other fields such as foraging, mate selection, neurobiology and psychology, and gives predictions for experiments directly testing the relationship between estimation and collective behavior.

  16. Collective animal behavior from Bayesian estimation and probability matching.

    Science.gov (United States)

    Pérez-Escudero, Alfonso; de Polavieja, Gonzalo G

    2011-11-01

    Animals living in groups make movement decisions that depend, among other factors, on social interactions with other group members. Our present understanding of social rules in animal collectives is mainly based on empirical fits to observations, with less emphasis in obtaining first-principles approaches that allow their derivation. Here we show that patterns of collective decisions can be derived from the basic ability of animals to make probabilistic estimations in the presence of uncertainty. We build a decision-making model with two stages: Bayesian estimation and probabilistic matching. In the first stage, each animal makes a Bayesian estimation of which behavior is best to perform taking into account personal information about the environment and social information collected by observing the behaviors of other animals. In the probability matching stage, each animal chooses a behavior with a probability equal to the Bayesian-estimated probability that this behavior is the most appropriate one. This model derives very simple rules of interaction in animal collectives that depend only on two types of reliability parameters, one that each animal assigns to the other animals and another given by the quality of the non-social information. We test our model by obtaining theoretically a rich set of observed collective patterns of decisions in three-spined sticklebacks, Gasterosteus aculeatus, a shoaling fish species. The quantitative link shown between probabilistic estimation and collective rules of behavior allows a better contact with other fields such as foraging, mate selection, neurobiology and psychology, and gives predictions for experiments directly testing the relationship between estimation and collective behavior.

  17. Maximum Entropy Estimation of Transition Probabilities of Reversible Markov Chains

    Directory of Open Access Journals (Sweden)

    Erik Van der Straeten

    2009-11-01

    Full Text Available In this paper, we develop a general theory for the estimation of the transition probabilities of reversible Markov chains using the maximum entropy principle. A broad range of physical models can be studied within this approach. We use one-dimensional classical spin systems to illustrate the theoretical ideas. The examples studied in this paper are: the Ising model, the Potts model and the Blume-Emery-Griffiths model.

  18. Estimating probable flaw distributions in PWR steam generator tubes

    Energy Technology Data Exchange (ETDEWEB)

    Gorman, J.A.; Turner, A.P.L. [Dominion Engineering, Inc., McLean, VA (United States)]

    1997-02-01

    This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary to secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the numbers of tubes with detectable flaws of various types as a function of time, and (2) the distributions in size of these flaws. Results are provided for hypothetical severely affected, moderately affected and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses.

  19. A Balanced Approach to Adaptive Probability Density Estimation

    Directory of Open Access Journals (Sweden)

    Julio A. Kovacs

    2017-04-01

    Full Text Available Our development of a Fast (Mutual) Information Matching (FIM) of molecular dynamics time series data led us to the general problem of how to accurately estimate the probability density function of a random variable, especially in cases of very uneven samples. Here, we propose a novel Balanced Adaptive Density Estimation (BADE) method that effectively optimizes the amount of smoothing at each point. To do this, BADE relies on an efficient nearest-neighbor search which results in good scaling for large data sizes. Our tests on simulated data show that BADE exhibits equal or better accuracy than existing methods, and visual tests on univariate and bivariate experimental data show that the results are also aesthetically pleasing. This is due in part to the use of a visual criterion for setting the smoothing level of the density estimate. Our results suggest that BADE offers an attractive new take on the fundamental density estimation problem in statistics. We have applied it to molecular dynamics simulations of membrane pore formation. We also expect BADE to be generally useful for low-dimensional applications in other statistical application domains such as bioinformatics, signal processing and econometrics.

  20. Polynomial probability distribution estimation using the method of moments

    Science.gov (United States)

    Munkhammar, Joakim; Mattsson, Lars; Rydén, Jesper

    2017-01-01

    We suggest a procedure for estimating Nth degree polynomial approximations to unknown (or known) probability density functions (PDFs) based on N statistical moments from each distribution. The procedure is based on the method of moments and is set up algorithmically to aid applicability and to ensure rigor in use. In order to show applicability, polynomial PDF approximations are obtained for the distribution families Normal, Log-Normal, Weibull as well as for a bimodal Weibull distribution and a data set of anonymized household electricity use. The results are compared with results for traditional PDF series expansion methods of Gram–Charlier type. It is concluded that this is a comparatively simple procedure that could be used when traditional distribution families are not applicable or when polynomial expansions of probability distributions might be considered useful approximations. In particular this approach is practical for calculating convolutions of distributions, since such operations become integrals of polynomial expressions. Finally, in order to show an advanced applicability of the method, it is shown to be useful for approximating solutions to the Smoluchowski equation. PMID:28394949

  1. Polynomial probability distribution estimation using the method of moments.

    Science.gov (United States)

    Munkhammar, Joakim; Mattsson, Lars; Rydén, Jesper

    2017-01-01

    We suggest a procedure for estimating Nth degree polynomial approximations to unknown (or known) probability density functions (PDFs) based on N statistical moments from each distribution. The procedure is based on the method of moments and is set up algorithmically to aid applicability and to ensure rigor in use. In order to show applicability, polynomial PDF approximations are obtained for the distribution families Normal, Log-Normal, Weibull as well as for a bimodal Weibull distribution and a data set of anonymized household electricity use. The results are compared with results for traditional PDF series expansion methods of Gram-Charlier type. It is concluded that this is a comparatively simple procedure that could be used when traditional distribution families are not applicable or when polynomial expansions of probability distributions might be considered useful approximations. In particular this approach is practical for calculating convolutions of distributions, since such operations become integrals of polynomial expressions. Finally, in order to show an advanced applicability of the method, it is shown to be useful for approximating solutions to the Smoluchowski equation.
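
    The core idea can be sketched as a small linear system (Python with NumPy; degree selection, positivity handling and the paper's other algorithmic safeguards are not reproduced): choose polynomial coefficients on a finite support so that the raw moments of the polynomial match the supplied moments.

    import numpy as np

    def polynomial_pdf_coeffs(moments, a, b):
        # Fit p(x) = c_0 + c_1 x + ... + c_N x^N on [a, b] so that its raw
        # moments match the supplied moments (moments[0] must equal 1 for a PDF).
        n = len(moments)
        A = np.empty((n, n))
        for m in range(n):
            for k in range(n):
                # integral of x^m * x^k over [a, b]
                A[m, k] = (b ** (m + k + 1) - a ** (m + k + 1)) / (m + k + 1)
        return np.linalg.solve(A, np.asarray(moments, dtype=float))

    # Example: matching the first three moments of a Uniform(0, 1) distribution
    # recovers the flat density p(x) = 1 (coefficients [1, 0, 0]).
    coeffs = polynomial_pdf_coeffs([1.0, 0.5, 1.0 / 3.0], 0.0, 1.0)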

  2. The estimated probability of dizygotic twins: a comparison of two methods.

    Science.gov (United States)

    Hardin, Jill; Selvin, Steve; Carmichael, Suzan L; Shaw, Gary M

    2009-02-01

    This study presents a general model of two binary variables and applies it to twin sex pairing data from 21 twin data sources to estimate the frequency of dizygotic twins. The purpose of this study is to clarify the relationship between maximum likelihood and Weinberg's differential rule zygosity estimation methods. We explore the accuracy of these zygosity estimation measures in relation to twin ascertainment methods and the probability of a male. Twin sex pairing data from 21 twin data sources representing 15 countries was collected for use in this study. Maximum likelihood estimation of the probability of dizygotic twins is applied to describe the variation in the frequency of dizygotic twin births. The differences between maximum likelihood and Weinberg's differential rule zygosity estimation methods are presented as a function of twin data ascertainment method and the probability of a male. Maximum likelihood estimation of the probability of dizygotic twins ranges from 0.083 (95% approximate CI: 0.082, 0.085) to 0.750 (95% approximate CI: 0.749, 0.752) for voluntary ascertainment data sources and from 0.374 (95% approximate CI: 0.373, 0.375) to 0.987 (95% approximate CI: 0.959, 1.016) for active ascertainment data sources. In 17 of the 21 twin data sources differences of 0.01 or less occur between maximum likelihood and Weinberg zygosity estimation methods. The Weinberg and maximum likelihood estimates are negligibly different in most applications. Using the above general maximum likelihood estimate, the probability of a dizygotic twin is subject to substantial variation that is largely a function of twin data ascertainment method.
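
    The two estimators being compared can be sketched as follows (Python; illustrative only, the paper's general two-binary-variable model and its confidence intervals are not reproduced). Weinberg's differential rule doubles the observed fraction of unlike-sex pairs, which implicitly assumes the probability of a male is exactly 0.5, whereas a likelihood-based estimate divides that fraction by 2p(1 - p) for an assumed probability of a male p.

    def weinberg_dz_fraction(unlike_sex_pairs, total_pairs):
        # Weinberg's differential rule: DZ pairs ~ 2 x unlike-sex pairs.
        return 2.0 * unlike_sex_pairs / total_pairs

    def ml_dz_fraction(unlike_sex_pairs, total_pairs, p_male=0.512):
        # Among DZ pairs the chance of an unlike-sex pair is 2 p (1 - p), so the
        # DZ fraction is the unlike-sex fraction divided by 2 p (1 - p).
        # p_male = 0.512 is only an illustrative value, not the study's estimate.
        u = unlike_sex_pairs / total_pairs
        return u / (2.0 * p_male * (1.0 - p_male))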

  3. Explaining participation differentials in Dutch higher education : the impact of subjective success probabilities on level choice and field choice

    NARCIS (Netherlands)

    Tolsma, J.; Need, A.; Jong, U. de

    2010-01-01

    In this article we examine whether subjective estimates of success probabilities explain the effect of social origin, sex, and ethnicity on students’ choices between different school tracks in Dutch higher education. The educational options analysed differ in level (i.e. university versus

  4. Verification and estimation of a posterior probability and probability density function using vector quantization and neural network

    Energy Technology Data Exchange (ETDEWEB)

    Koh, Hee Seok; Kim, Hyun Duck [Kyungnam University, Masan (Korea, Republic of)]; Lee, Kwang Seok [Chinju National University (Korea, Republic of)]

    1996-02-01

    In this paper, we propose a method for estimating a posterior probability and PDF (probability density function) using a feed-forward neural network and the code books of VQ (vector quantization). We estimate the posterior probability and probability density function, which together with the well-known Mel cepstrum compose a new parameter, and verify the performance for five vowels taken from syllables using an NN (neural network) and a PNN (probabilistic neural network). With the new parameter, the probabilistic neural network gave the best result, with an average recognition rate of 83.02%. (author). 7 refs., 4 figs., 3 tabs.

  5. Estimating occupancy probability of moose using hunter survey data

    Science.gov (United States)

    Crum, Nathan J.; Fuller, Angela K.; Sutherland, Christopher S.; Cooch, Evan G.; Hurst, Jeremy E.

    2017-01-01

    Monitoring rare species can be difficult, especially across large spatial extents, making conventional methods of population monitoring costly and logistically challenging. Citizen science has the potential to produce observational data across large areas that can be used to monitor wildlife distributions using occupancy models. We used citizen science (i.e., hunter surveys) to facilitate monitoring of moose (Alces alces) populations, an especially important endeavor because of their recent apparent declines in the northeastern and upper midwestern regions of the United States. To better understand patterns of occurrence of moose in New York, we used data collected through an annual survey of approximately 11,000 hunters between 2012 and 2014 that recorded detection–non-detection data of moose and other species. We estimated patterns of occurrence of moose in relation to land cover characteristics, climate effects, and interspecific interactions using occupancy models to analyze spatially referenced moose observations. Coniferous and deciduous forest with low prevalence of white-tailed deer (Odocoileus virginianus) had the highest probability of moose occurrence. This study highlights the potential of data collected using citizen science for understanding the spatial distribution of low-density species across large spatial extents and providing key information regarding where and when future research and management activities should be focused.

  6. Estimating the probability of neonatal early-onset infection on the basis of maternal risk factors.

    Science.gov (United States)

    Puopolo, Karen M; Draper, David; Wi, Soora; Newman, Thomas B; Zupancic, John; Lieberman, Ellice; Smith, Myesha; Escobar, Gabriel J

    2011-11-01

    To develop a quantitative model to estimate the probability of neonatal early-onset bacterial infection on the basis of maternal intrapartum risk factors. This was a nested case-control study of infants born at ≥34 weeks' gestation at 14 California and Massachusetts hospitals from 1993 to 2007. Case-subjects had culture-confirmed bacterial infection at <72 hours of age; [...] intrapartum antibiotics given >4 hours before delivery were associated with decreased risk. Our model showed good discrimination and calibration (c statistic = 0.800 and Hosmer-Lemeshow P = .142 in the entire data set). A predictive model based on information available in the immediate perinatal period performs better than algorithms based on risk-factor threshold values. This model establishes a prior probability for newborn sepsis, which could be combined with neonatal physical examination and laboratory values to establish a posterior probability to guide treatment decisions.

  7. Stability and coherence of health experts' upper and lower subjective probabilities about dose-response functions.

    Science.gov (United States)

    Wallsten, T S; Forsyth, B H

    1983-06-01

    As part of a method for assessing health risks associated with primary National Ambient Air Quality Standards, T. B. Feagans and W. F. Biller (Research Triangle Park, North Carolina: EPA Office of Air Quality Planning and Standards, May 1981) developed a technique for encoding experts' subjective probabilities regarding dose-response functions. The encoding technique is based on B. O. Koopman's (Bulletin of the American Mathematical Society, 1940, 46, 763-764; Annals of Mathematics, 1940, 41, 269-292) probability theory, which does not require probabilities to be sharp, but rather allows lower and upper probabilities to be associated with an event. Uncertainty about a dose-response function can be expressed either in terms of the response rate expected at a given concentration or, conversely, in terms of the concentration expected to support a given response rate. Feagans and Biller (1981, cited above) derive the relation between the two conditional probabilities, which is easily extended to upper and lower conditional probabilities. These relations were treated as coherence requirements in an experiment utilizing four ozone and four lead experts as subjects, each providing judgments on two separate occasions. Four subjects strongly satisfied the coherence requirements in both conditions, and three more did so in the second session only. The eighth subject also improved in Session 2. Encoded probabilities were highly correlated between the two sessions, but changed from the first to the second in a manner that improved coherence and reflected greater attention to certain parameters of the dose-response function.

  8. Optimized lower leg injury probability curves from postmortem human subject tests under axial impacts.

    Science.gov (United States)

    Yoganandan, Narayan; Arun, Mike W J; Pintar, Frank A; Szabo, Aniko

    2014-01-01

    Derive optimum injury probability curves to describe human tolerance of the lower leg using parametric survival analysis. The study reexamined lower leg postmortem human subjects (PMHS) data from a large group of specimens. Briefly, axial loading experiments were conducted by impacting the plantar surface of the foot. Both injury and noninjury tests were included in the testing process. They were identified by pre- and posttest radiographic images and detailed dissection following the impact test. Fractures included injuries to the calcaneus and distal tibia-fibula complex (including pylon), representing severities at the Abbreviated Injury Score (AIS) level 2+. For the statistical analysis, peak force was chosen as the main explanatory variable and age was chosen as the covariable. Censoring statuses depended on experimental outcomes. Parameters from the parametric survival analysis were estimated using the maximum likelihood approach and the dfbetas statistic was used to identify overly influential samples. The best fit from the Weibull, log-normal, and log-logistic distributions was chosen based on the Akaike information criterion. Plus and minus 95% confidence intervals were obtained for the optimum injury probability distribution. The relative sizes of the intervals were determined at predetermined risk levels. Quality indices were described at each of the selected probability levels. The mean age, stature, and weight were 58.2±15.1 years, 1.74±0.08 m, and 74.9±13.8 kg, respectively. Excluding all overly influential tests resulted in the tightest confidence intervals. The Weibull distribution was the most optimum function compared to the other 2 distributions. A majority of quality indices were in the good category for this optimum distribution when results were extracted for 25-, 45-, and 65-year-old age groups at 5, 25, and 50% risk levels for lower leg fracture. For 25, 45, and 65 years, peak forces were 8.1, 6.5, and 5.1 kN at 5% risk, and 9.6, 7.7, and 6.1 kN at 25% risk [...]
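
    A schematic of how a Weibull risk curve with an age covariate yields such force-risk pairs (Python; the shape, scale, and age coefficients below are placeholders for illustration, not the published estimates):

    import math

    def injury_probability(force_kn, age_years, shape=4.0, b0=2.6, b1=-0.006):
        # Weibull risk curve: P(fracture at or below this force) = 1 - exp(-(F/scale)^shape),
        # with the scale parameter shrinking as age increases.
        scale = math.exp(b0 + b1 * age_years)
        return 1.0 - math.exp(-(force_kn / scale) ** shape)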

  9. Traceable accounts of subjective probability judgments in the IPCC and beyond

    Science.gov (United States)

    Baer, P. G.

    2012-12-01

    Uncertainty Guidance Papers for the TAR and subsequent assessments have left open the possibility of using such an expert elicitation within the IPCC drafting process, but to my knowledge it has never been done. Were it in fact attempted, it would reveal the inconvenient truth that there is no uniquely correct method for aggregating probability statements; indeed the standard practice within climate-related expert elicitations has been to report all individual estimates without aggregation. But if a report requires a single "consensus estimate," once you have even a single divergent opinion, the question of how to aggregate becomes unavoidable. In this paper, I review in greater detail the match or lack of it between the vision of a "traceable account" and IPCC practice, and the public discussion of selected examples of probabilistic judgments in AR4. I propose elements of a structure based on a flexible software architecture that could facilitate the development and documentation of what I call "collective subjective probability." Using a simple prototype and a pair of sample "findings" from AR4, I demonstrate an example of how such a structure could be used by a small expert community to implement a practical model of a "traceable account." I conclude with a discussion of the prospects of using such modular elicitations in support of, or as an alternative to, conventional IPCC assessment processes.

  10. On estimating the fracture probability of nuclear graphite components

    Science.gov (United States)

    Srinivasan, Makuteswara

    2008-10-01

    The properties of nuclear grade graphites exhibit anisotropy and could vary considerably within a manufactured block. Graphite strength is affected by the direction of alignment of the constituent coke particles, which is dictated by the forming method, coke particle size, and the size, shape, and orientation distribution of pores in the structure. In this paper, a Weibull failure probability analysis for components is presented using the American Society of Testing Materials strength specification for nuclear grade graphites for core components in advanced high-temperature gas-cooled reactors. The risk of rupture (probability of fracture) and survival probability (reliability) of large graphite blocks are calculated for varying and discrete values of service tensile stresses. The limitations in these calculations are discussed from considerations of actual reactor environmental conditions that could potentially degrade the specification properties because of damage due to complex interactions between irradiation, temperature, stress, and variability in reactor operation.

  11. average probability of failure on demand estimation for burner ...

    African Journals Online (AJOL)

    HOD

    Only fragments of this record were extracted: nomenclature entries for architecture, common cause failure, proof test interval, and Pij (the probability of transition from state i to state j), plus introductory text noting that in the process industry the plant is designed to keep the process within safe limits and that a parametric model is developed to model CCFs.

  12. Estimation for Domains in Double Sampling with Probabilities ...

    African Journals Online (AJOL)

    Available publications show that the variance of an estimator of a domain parameter depends on the variance of the study variable for the domain elements and on the variance of the mean of that variable for elements of the domain in each constituent stratum. In this article, we show that the variance of an estimator of a domain total ...

  13. Bounding the Failure Probability Range of Polynomial Systems Subject to P-box Uncertainties

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2012-01-01

    This paper proposes a reliability analysis framework for systems subject to multiple design requirements that depend polynomially on the uncertainty. Uncertainty is prescribed by probability boxes, also known as p-boxes, whose distribution functions have free or fixed functional forms. An approach based on the Bernstein expansion of polynomials and optimization is proposed. In particular, we search for the elements of a multi-dimensional p-box that minimize (i.e., the best-case) and maximize (i.e., the worst-case) the probability of inner and outer bounding sets of the failure domain. This technique yields intervals that bound the range of failure probabilities. The offset between this bounding interval and the actual failure probability range can be made arbitrarily tight with additional computational effort.

  14. METAPHOR: Probability density estimation for machine learning based photometric redshifts

    Science.gov (United States)

    Amaro, V.; Cavuoti, S.; Brescia, M.; Vellucci, C.; Tortora, C.; Longo, G.

    2017-06-01

    We present METAPHOR (Machine-learning Estimation Tool for Accurate PHOtometric Redshifts), a method able to provide a reliable PDF for photometric galaxy redshifts estimated through empirical techniques. METAPHOR is a modular workflow, mainly based on the MLPQNA neural network as the internal engine to derive photometric galaxy redshifts, but giving the possibility to easily replace MLPQNA with any other method to predict photo-z's and their PDF. We present here the results of a validation test of the workflow on the galaxies from SDSS-DR9, also showing the universality of the method by replacing MLPQNA with KNN and Random Forest models. The validation test also includes a comparison with the PDFs derived from a traditional SED template-fitting method (Le Phare).

  15. Estimation of post-test probabilities by residents: Bayesian reasoning versus heuristics?

    Science.gov (United States)

    Hall, Stacey; Phang, Sen Han; Schaefer, Jeffrey P; Ghali, William; Wright, Bruce; McLaughlin, Kevin

    2014-08-01

    Although the process of diagnosing invariably begins with a heuristic, we encourage our learners to support their diagnoses by analytical cognitive processes, such as Bayesian reasoning, in an attempt to mitigate the effects of heuristics on diagnosing. There are, however, limited data on the use and impact of Bayesian reasoning on the accuracy of disease probability estimates. In this study our objective was to explore whether Internal Medicine residents use a Bayesian process to estimate disease probabilities by comparing their disease probability estimates to literature-derived Bayesian post-test probabilities. We gave 35 Internal Medicine residents four clinical vignettes in the form of a referral letter and asked them to estimate the post-test probability of the target condition in each case. We then compared these to literature-derived probabilities. For each vignette the estimated probability was significantly different from the literature-derived probability. For the two cases with low literature-derived probability our participants significantly overestimated the probability of these target conditions being the correct diagnosis, whereas for the two cases with high literature-derived probability the estimated probability was significantly lower than the calculated value. Our results suggest that residents generate inaccurate post-test probability estimates. Possible explanations for this include ineffective application of Bayesian reasoning, attribute substitution whereby a complex cognitive task is replaced by an easier one (e.g., a heuristic), or systematic rater bias, such as central tendency bias. Further studies are needed to identify the reasons for inaccuracy of disease probability estimates and to explore ways of improving accuracy.
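
    The literature-derived Bayesian post-test probabilities referred to above follow from the odds form of Bayes' theorem; a short sketch (Python, with illustrative numbers rather than the study's vignettes):

    def post_test_probability(pre_test_prob, likelihood_ratio):
        # Bayesian update on the odds scale:
        # post-test odds = pre-test odds x likelihood ratio.
        pre_odds = pre_test_prob / (1.0 - pre_test_prob)
        post_odds = pre_odds * likelihood_ratio
        return post_odds / (1.0 + post_odds)

    # Example: a 10% pre-test probability and a positive test with LR+ = 8
    # gives a post-test probability of roughly 47%.
    print(post_test_probability(0.10, 8.0))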

  16. Collective Animal Behavior from Bayesian Estimation and Probability Matching

    OpenAIRE

    Alfonso Perez-Escudero; Gonzalo G. de Polavieja

    2011-01-01

    Animals living in groups make movement decisions that depend, among other factors, on social interactions with other group members. Our present understanding of social rules in animal collectives is mainly based on empirical fits to observations, with less emphasis on obtaining first-principles approaches that allow their derivation. Here we show that patterns of collective decisions can be derived from the basic ability of animals to make probabilistic estimations in the presence of uncertainty ...

  17. Heritability estimates and correlations between subjectively ...

    African Journals Online (AJOL)

    PavarniN

    exceptions were positive genetic correlations of fibre diameter (FD) and coefficient of variation of FD with staple formation score and belly and points score. Genetic progress in subjective traits thus appears possible, if desired in a selection strategy. Keywords: Correlations, heritabilities, linearly assessed traits, subjective ...

  18. Actions and Beliefs : Estimating Distribution-Based Preferences Using a Large Scale Experiment with Probability Questions on Expectations

    NARCIS (Netherlands)

    Bellemare, C.; Kroger, S.; van Soest, A.H.O.

    2005-01-01

    We combine the choice data of proposers and responders in the ultimatum game, their expectations elicited in the form of subjective probability questions, and the choice data of proposers ("dictator") in a dictator game to estimate a structural model of decision making under uncertainty. We use a

  19. Development of a score and probability estimate for detecting angle closure based on anterior segment optical coherence tomography.

    Science.gov (United States)

    Nongpiur, Monisha E; Haaland, Benjamin A; Perera, Shamira A; Friedman, David S; He, Mingguang; Sakata, Lisandro M; Baskaran, Mani; Aung, Tin

    2014-01-01

    To develop a score along with an estimated probability of disease for detecting angle closure based on anterior segment optical coherence tomography (AS OCT) imaging. Cross-sectional study. A total of 2047 subjects 50 years of age and older were recruited from a community polyclinic in Singapore. All subjects underwent standardized ocular examination including gonioscopy and imaging by AS OCT (Carl Zeiss Meditec). Customized software (Zhongshan Angle Assessment Program) was used to measure AS OCT parameters. Complete data were available for 1368 subjects. Data from the right eyes were used for analysis. A stepwise logistic regression model with Akaike information criterion was used to generate a score that then was converted to an estimated probability of the presence of gonioscopic angle closure, defined as the inability to visualize the posterior trabecular meshwork for at least 180 degrees on nonindentation gonioscopy. Of the 1368 subjects, 295 (21.6%) had gonioscopic angle closure. The angle closure score was calculated from the shifted linear combination of the AS OCT parameters. The score can be converted to an estimated probability of having angle closure using the relationship: estimated probability = e^score / (1 + e^score), where e is the natural exponential. The score performed well in a second independent sample of 178 angle-closure subjects and 301 normal controls, with an area under the receiver operating characteristic curve of 0.94. A score derived from a single AS OCT image, coupled with an estimated probability, provides an objective platform for detection of angle closure. Copyright © 2014 Elsevier Inc. All rights reserved.
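
    The score-to-probability conversion quoted above is the standard logistic transform; a one-function sketch (Python; the regression coefficients that produce the score from the AS OCT parameters are not given in the abstract and are therefore not reproduced):

    import math

    def angle_closure_probability(score):
        # estimated probability = e^score / (1 + e^score)
        return math.exp(score) / (1.0 + math.exp(score))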

  20. Efficient Estimation of first Passage Probability of high-Dimensional Nonlinear Systems

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Bucher, Christian

    2011-01-01

    An efficient method for estimating low first passage probabilities of high-dimensional nonlinear systems based on asymptotic estimation of low probabilities is presented. The method does not require any a priori knowledge of the system, i.e. it is a black-box method, and has very low requirements ... the failure probabilities of three well-known nonlinear systems are estimated. Next, a reduced degree-of-freedom model of a wind turbine is developed and is exposed to a turbulent wind field. The model incorporates very high dimensions and strong nonlinearities simultaneously. The failure probability ...

  1. Estimating The Probability Of Achieving Shortleaf Pine Regeneration At Variable Specified Levels

    Science.gov (United States)

    Thomas B. Lynch; Jean Nkouka; Michael M. Huebschmann; James M. Guldin

    2002-01-01

    A model was developed that can be used to estimate the probability of achieving regeneration at a variety of specified stem density levels. The model was fitted to shortleaf pine (Pinus echinata Mill.) regeneration data, and can be used to estimate the probability of achieving desired levels of regeneration between 300 and 700 stems per acre 9-10 ...

  2. Maximum likelihood estimation for predicting the probability of obtaining variable shortleaf pine regeneration densities

    Science.gov (United States)

    Thomas B. Lynch; Jean Nkouka; Michael M. Huebschmann; James M. Guldin

    2003-01-01

    A logistic equation is the basis for a model that predicts the probability of obtaining regeneration at specified densities. The density of regeneration (trees/ha) for which an estimate of probability is desired can be specified by means of independent variables in the model. When estimating parameters, the dependent variable is set to 1 if the regeneration density (...

  3. Higher risk of probable mental emotional disorder in low or severe vision subjects

    Directory of Open Access Journals (Sweden)

    Lutfah Rif’ati

    2012-07-01

    health problem priority in Indonesia. This paper presents an assessment of severe visual impairments related to the risk of MED. Methods: This paper assessed a part of Basic Health Research (Riskesdas) 2007 data. For this assessment, subjects 15 years old or more had their visual acuity measured using the Snellen chart and their mental health status determined using the Self Reporting Questionnaire (SRQ) 20. A subject was considered to have probable MED if the subject had a total score of 6 or more on the SRQ. Based on the measure of visual acuity, visual acuity was divided into 3 categories: normal/mild (20/20 to 20/60), low vision (less than 20/60 to 3/60), and blind (less than 3/60 to 0/0). Results: Among 972,989 subjects, 554,886 were aged 15 years or older. 11.4% of the subjects had probable MED. The prevalence of low vision and blindness was 5.1% and 0.9%, respectively. Compared to subjects with normal or mild visual impairments, subjects with low vision had a 74% increased risk of probable MED [adjusted relative risk (RRa) = 1.75; 95% confidence interval (CI) = 1.71-1.79]. Blind subjects had a 2.7-fold risk of probable MED [RRa = 2.69; 95% CI = 2.60-2.78] compared to subjects with normal or mild visual impairments. Conclusion: Visual impairment severity increased the risk of probable MED. Therefore, subjects with visual impairment need more attention regarding probable MED. (Health Science Indones 2011;2:9-13)

  4. Estimation method of multivariate exponential probabilities based on a simplex coordinates transform

    NARCIS (Netherlands)

    Olieman, N.J.; Putten, van B.

    2006-01-01

    A novel unbiased estimator for estimating the probability mass of a multivariate exponential distribution over a measurable set is introduced and is called the Exponential Simplex (ES) estimator. For any measurable set, the standard error of the ES-estimator is at most the standard error of the well

  5. Estimation method of multivariate exponential probabilities based on a simplex coordinates transform

    NARCIS (Netherlands)

    Olieman, N.J.; Putten, van B.

    2010-01-01

    A novel unbiased estimator for estimating the probability mass of a multivariate exponential distribution over a measurable set is introduced and is called the exponential simplex (ES) estimator. For any measurable set and given sample size, the statistical efficiency of the ES estimator is higher

  6. Estimation and asymptotic theory for transition probabilities in Markov renewal multi-state models.

    Science.gov (United States)

    Spitoni, Cristian; Verduijn, Marion; Putter, Hein

    2012-08-07

    In this paper we discuss estimation of transition probabilities for semi-Markov multi-state models. Non-parametric and semi-parametric estimators of the transition probabilities for a large class of models (forward going models) are proposed. Large sample theory is derived using the functional delta method and the use of resampling is proposed to derive confidence bands for the transition probabilities. The last part of the paper concerns the presentation of the main ideas of the R implementation of the proposed estimators, and data from a renal replacement study are used to illustrate the behavior of the estimators proposed.

  7. The contribution of threat probability estimates to reexperiencing symptoms: a prospective analog study.

    Science.gov (United States)

    Regambal, Marci J; Alden, Lynn E

    2012-09-01

    Individuals with posttraumatic stress disorder (PTSD) are hypothesized to have a "sense of current threat." Perceived threat from the environment (i.e., external threat), can lead to overestimating the probability of the traumatic event reoccurring (Ehlers & Clark, 2000). However, it is unclear if external threat judgments are a pre-existing vulnerability for PTSD or a consequence of trauma exposure. We used trauma analog methodology to prospectively measure probability estimates of a traumatic event, and investigate how these estimates were related to cognitive processes implicated in PTSD development. 151 participants estimated the probability of being in car-accident related situations, watched a movie of a car accident victim, and then completed a measure of data-driven processing during the movie. One week later, participants re-estimated the probabilities, and completed measures of reexperiencing symptoms and symptom appraisals/reactions. Path analysis revealed that higher pre-existing probability estimates predicted greater data-driven processing which was associated with negative appraisals and responses to intrusions. Furthermore, lower pre-existing probability estimates and negative responses to intrusions were both associated with a greater change in probability estimates. Reexperiencing symptoms were predicted by negative responses to intrusions and, to a lesser degree, by greater changes in probability estimates. The undergraduate student sample may not be representative of the general public. The reexperiencing symptoms are less severe than what would be found in a trauma sample. Threat estimates present both a vulnerability and a consequence of exposure to a distressing event. Furthermore, changes in these estimates are associated with cognitive processes implicated in PTSD. Copyright © 2012 Elsevier Ltd. All rights reserved.

  8. Inverse Probability Weighted Generalised Empirical Likelihood Estimators : Firm Size and R&D Revisited

    NARCIS (Netherlands)

    Inkmann, J.

    2005-01-01

    The inverse probability weighted Generalised Empirical Likelihood (IPW-GEL) estimator is proposed for the estimation of the parameters of a vector of possibly non-linear unconditional moment functions in the presence of conditionally independent sample selection or attrition. The estimator is applied

  9. Methods of Conflict Prediction and Conflict Probability Estimation for en Route Flight

    Directory of Open Access Journals (Sweden)

    В.М. Васильєв

    2004-02-01

    Full Text Available Probabilistic methods of conflict prediction and conflict-situation estimation are offered for en route flight. The mathematical statement of the problem, the criterion of conflict detection, and a randomized estimation procedure are presented. Analytical expressions for evaluating conflict probability are derived for the estimation of air traffic safety in collision avoidance systems.

  10. An investigation of dentists' and dental students' estimates of diagnostic probabilities.

    Science.gov (United States)

    Chambers, David W; Mirchel, Ryan; Lundergan, William

    2010-06-01

    Research in medicine has shown that physicians have difficulty estimating the probability that a patient has a condition on the basis of available diagnostic evidence. They consistently undervalue baseline information about the patient relative to test information and are poor intuitive calculators of probability. The authors could not locate in the literature any studies of diagnostic probability estimates from baseline information and test data for dentists. Using two vignettes that contained different baseline information, dental students and clinical faculty members estimated the probability that the described hypothetical patient had the condition in question. Respondents also commented on the project. Both groups of respondents overemphasized the importance of test evidence relative to baseline information, although experienced practitioners did so to a lesser extent than did students. Respondents, especially practitioners, expressed resistance to performing a diagnostic task that required precise estimates of probability. Dentists appear to estimate diagnostic probabilities in an intuitive fashion, but they do so imprecisely. Clinical experience provides some protection against the bias of overestimating test evidence compared with baseline information. These findings raise questions about how practitioners use probability estimates and whether other models also may play a role. The incorporation of information from evidence-based dentistry into practice requires better understanding.

  11. Probability estimation with machine learning methods for dichotomous and multicategory outcome: theory.

    Science.gov (United States)

    Kruppa, Jochen; Liu, Yufeng; Biau, Gérard; Kohler, Michael; König, Inke R; Malley, James D; Ziegler, Andreas

    2014-07-01

    Probability estimation for binary and multicategory outcome using logistic and multinomial logistic regression has a long-standing tradition in biostatistics. However, biases may occur if the model is misspecified. In contrast, outcome probabilities for individuals can be estimated consistently with machine learning approaches, including k-nearest neighbors (k-NN), bagged nearest neighbors (b-NN), random forests (RF), and support vector machines (SVM). Because machine learning methods are rarely used by applied biostatisticians, the primary goal of this paper is to explain the concept of probability estimation with these methods and to summarize recent theoretical findings. Probability estimation in k-NN, b-NN, and RF can be embedded into the class of nonparametric regression learning machines; therefore, we start with the construction of nonparametric regression estimates and review results on consistency and rates of convergence. In SVMs, outcome probabilities for individuals are estimated consistently by repeatedly solving classification problems. For SVMs, we review the classification problem and then dichotomous probability estimation. Next we extend the algorithms for estimating probabilities using k-NN, b-NN, and RF to multicategory outcomes and discuss approaches for the multicategory probability estimation problem using SVM. In simulation studies for dichotomous and multicategory dependent variables we demonstrate the general validity of the machine learning methods and compare them with logistic regression. However, each method fails in at least one simulation scenario. We conclude with a discussion of the failures and give recommendations for selecting and tuning the methods. Applications to real data and example code are provided in a companion article (doi:10.1002/bimj.201300077). © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
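
    As one concrete instance of the nonparametric estimators discussed above, a k-nearest-neighbors probability estimate is simply the class frequency among the k nearest training points; a minimal sketch (Python with NumPy, illustrative rather than the authors' implementation):

    import numpy as np

    def knn_class_probability(x_train, y_train, x_new, k=25):
        # x_train: (n, d) array of features; y_train: (n,) array of 0/1 labels.
        # The estimated probability of class 1 is the fraction of the k nearest
        # training points (Euclidean distance) that belong to class 1.
        distances = np.linalg.norm(x_train - x_new, axis=1)
        nearest = np.argsort(distances)[:k]
        return float(np.mean(y_train[nearest]))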

  12. Neural correlates of decision making with explicit information about probabilities and incentives in elderly healthy subjects.

    Science.gov (United States)

    Labudda, Kirsten; Woermann, Friedrich G; Mertens, Markus; Pohlmann-Eden, Bernd; Markowitsch, Hans J; Brand, Matthias

    2008-06-01

    Recent functional neuroimaging and lesion studies demonstrate the involvement of the orbitofrontal/ventromedial prefrontal cortex as a key structure in decision making processes. This region seems to be particularly crucial when contingencies between options and consequences are unknown but have to be learned by the use of feedback following previous decisions (decision making under ambiguity). However, little is known about the neural correlates of decision making under risk conditions in which information about probabilities and potential outcomes is given. In the present study, we used functional magnetic resonance imaging to measure blood-oxygenation-level-dependent (BOLD) responses in 12 subjects during a decision making task. This task provided explicit information about probabilities and associated potential incentives. The responses were compared to BOLD signals in a control condition without information about incentives. In contrast to previous decision making studies, we completely removed the outcome phase following a decision to exclude the potential influence of feedback previously received on current decisions. The results indicate that the integration of information about probabilities and incentives leads to activations within the dorsolateral prefrontal cortex, the posterior parietal lobe, the anterior cingulate and the right lingual gyrus. We assume that this pattern of activation is due to the involvement of executive functions, conflict detection mechanisms and arithmetic operations during the deliberation phase of decisional processes that are based on explicit information.

  13. Easy probability estimation of the diagnosis of early axial spondyloarthritis by summing up scores.

    Science.gov (United States)

    Feldtkeller, Ernst; Rudwaleit, Martin; Zeidler, Henning

    2013-09-01

    Several sets of criteria for the diagnosis of axial SpA (including non-radiographic axial spondyloarthritis) have been proposed in the literature, in which scores are attributed to relevant findings and the diagnosis requires a minimal sum of these scores. To quantitatively estimate the probability of axial SpA, multiplying the likelihood ratios of all relevant findings was proposed by Rudwaleit et al. in 2004. The objective of our proposal is to combine the advantages of both, i.e. to estimate the probability by summing up scores instead of multiplying likelihood ratios. An easy way to estimate the probability of axial spondyloarthritis is to use the logarithms of the likelihood ratios as scores attributed to relevant findings and to use the sum of these scores for the probability estimation. A list of whole-numbered scores for relevant findings is presented, along with threshold sum values necessary for a definite and for a probable diagnosis of axial SpA, as well as a threshold below which the diagnosis of axial spondyloarthritis can be excluded. In a diagram, the probability of axial spondyloarthritis is given for sum values between these thresholds. The proposed method combines the advantages of both: the easy summing up of scores and the quantitative calculation of the diagnosis probability. Our method also makes it easier to estimate which additional tests are necessary to come to a definite diagnosis.
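
    The equivalence the proposal relies on can be sketched directly (Python; the whole-numbered scores and the diagnostic thresholds given in the paper are not reproduced): summing the logarithms of the likelihood ratios is the same as multiplying the ratios themselves, and the resulting sum maps back to a probability through the pre-test odds.

    import math

    def diagnosis_probability(pre_test_prob, likelihood_ratios):
        # Sum of log-likelihood-ratio "scores" == product of likelihood ratios.
        score_sum = sum(math.log10(lr) for lr in likelihood_ratios)
        pre_odds = pre_test_prob / (1.0 - pre_test_prob)
        post_odds = pre_odds * 10.0 ** score_sum
        return post_odds / (1.0 + post_odds)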

  14. Two-step estimation in ratio-of-mediator-probability weighted causal mediation analysis.

    Science.gov (United States)

    Bein, Edward; Deutsch, Jonah; Hong, Guanglei; Porter, Kristin E; Qin, Xu; Yang, Cheng

    2018-01-10

    This study investigates appropriate estimation of estimator variability in the context of causal mediation analysis that employs propensity score-based weighting. Such an analysis decomposes the total effect of a treatment on the outcome into an indirect effect transmitted through a focal mediator and a direct effect bypassing the mediator. Ratio-of-mediator-probability weighting estimates these causal effects by adjusting for the confounding impact of a large number of pretreatment covariates through propensity score-based weighting. In step 1, a propensity score model is estimated. In step 2, the causal effects of interest are estimated using weights derived from the prior step's regression coefficient estimates. Statistical inferences obtained from this 2-step estimation procedure are potentially problematic if the estimated standard errors of the causal effect estimates do not reflect the sampling uncertainty in the estimation of the weights. This study extends to ratio-of-mediator-probability weighting analysis a solution to the 2-step estimation problem by stacking the score functions from both steps. We derive the asymptotic variance-covariance matrix for the indirect effect and direct effect 2-step estimators, provide simulation results, and illustrate with an application study. Our simulation results indicate that the sampling uncertainty in the estimated weights should not be ignored. The standard error estimation using the stacking procedure offers a viable alternative to bootstrap standard error estimation. We discuss broad implications of this approach for causal analysis involving propensity score-based weighting. Copyright © 2018 John Wiley & Sons, Ltd.

  15. Estimating transition probabilities for stage-based population projection matrices using capture-recapture data

    Science.gov (United States)

    Nichols, J.D.; Sauer, J.R.; Pollock, K.H.; Hestbeck, J.B.

    1992-01-01

    In stage-based demography, animals are often categorized into size (or mass) classes, and size-based probabilities of surviving and changing mass classes must be estimated before demographic analyses can be conducted. In this paper, we develop two procedures for the estimation of mass transition probabilities from capture-recapture data. The first approach uses a multistate capture-recapture model that is parameterized directly with the transition probabilities of interest. Maximum likelihood estimates are then obtained numerically using program SURVIV. The second approach involves a modification of Pollock's robust design. Estimation proceeds by conditioning on animals caught in a particular class at time i, and then using closed models to estimate the number of these that are alive in other classes at i + 1. Both methods are illustrated by application to meadow vole, Microtus pennsylvanicus, capture-recapture data. The two methods produced reasonable estimates that were similar. Advantages of these two approaches include the directness of estimation, the absence of need for restrictive assumptions about the independence of survival and growth, the testability of assumptions, and the testability of related hypotheses of ecological interest (e.g., the hypothesis of temporal variation in transition probabilities).

  16. Estimating probability density functions and entropies of Chua's circuit using B-spline functions

    OpenAIRE

    Savacı, Ferit Acar; Güngör, Mesut

    2012-01-01

    In this paper, first the probability density functions (PDFs) of the states of Chua's circuit have been estimated using B-spline functions and then the state entropies of Chua's circuit with respect to the bifurcation parameter have been obtained. The results of the proposed B-spline density estimator have been compared with the results obtained from the Parzen density estimator. © 2012 World Scientific Publishing Company.

  17. Further Evidence That the Effects of Repetition on Subjective Time Depend on Repetition Probability.

    Science.gov (United States)

    Skylark, William J; Gheorghiu, Ana I

    2017-01-01

    Repeated stimuli typically have shorter apparent duration than novel stimuli. Most explanations for this effect have attributed it to the repeated stimuli being more expected or predictable than the novel items, but an emerging body of work suggests that repetition and expectation exert distinct effects on time perception. The present experiment replicated a recent study in which the probability of repetition was varied between blocks of trials. As in the previous work, the repetition effect was smaller when repeats were common (and therefore more expected) than when they were rare. These results add to growing evidence that, contrary to traditional accounts, expectation increases apparent duration whereas repetition compresses subjective time, perhaps via a low-level process like adaptation. These opposing processes can be seen as instances of a more general "processing principle," according to which subjective time is a function of the perceptual strength of the stimulus representation, and therefore depends on a confluence of "bottom-up" and "top-down" variables.

  18. Improving the Estimation of Markov Transition Probabilities Using Mechanistic-Empirical Models

    Directory of Open Access Journals (Sweden)

    Daijiro Mizutani

    2017-10-01

    Full Text Available In many current state-of-the-art bridge management systems, Markov models are used for both the prediction of deterioration and the determination of optimal intervention strategies. Although transition probabilities of Markov models are generally estimated using inspection data, it is not uncommon that there are situations where inadequate data are available to estimate the transition probabilities. In this article, a methodology is proposed to estimate the transition probabilities from mechanistic-empirical models for reinforced concrete elements. The proposed methodology includes the estimation of the transition probabilities analytically when possible and, when not, through the use of Bayesian statistics, which requires the formulation of a likelihood function and the use of Markov Chain Monte Carlo simulations. In an example, the difference between the average condition predicted over a 100-year time period with a Markov model developed using the proposed methodology and the condition predicted using mechanistic-empirical models was found to be 54% of that when the state-of-the-art methodology, i.e., a methodology that estimates the transition probabilities using best-fit curves based on yearly condition distributions, was used. The variation in accuracy of the Markov model as a function of the number of deterioration paths generated using the mechanistic-empirical models is also shown.
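
    When the mechanistic-empirical model can be run forward to produce yearly condition states, a one-step transition probability can be estimated as a simple relative frequency; a simplified sketch of that step (Python; the Bayesian/Markov Chain Monte Carlo route used when an analytical estimate is not possible is not shown):

    def transition_probability(paths, from_state, to_state):
        # paths: list of simulated yearly condition-state sequences.
        # Estimate P(from_state -> to_state) as the fraction of one-year steps
        # that leave from_state and end in to_state.
        leaving = 0
        arriving = 0
        for path in paths:
            for s0, s1 in zip(path[:-1], path[1:]):
                if s0 == from_state:
                    leaving += 1
                    if s1 == to_state:
                        arriving += 1
        return arriving / leaving if leaving else 0.0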

  19. The role of misclassification in estimating proportions and an estimator of misclassification probability

    Science.gov (United States)

    Patrick L. Zimmerman; Greg C. Liknes

    2010-01-01

    Dot grids are often used to estimate the proportion of land cover belonging to some class in an aerial photograph. Interpreter misclassification is an often-ignored source of error in dot-grid sampling that has the potential to significantly bias proportion estimates. For the case when the true class of items is unknown, we present a maximum-likelihood estimator of...

  20. Improved ischemic stroke outcome prediction using model estimation of outcome probability: the THRIVE-c calculation.

    Science.gov (United States)

    Flint, Alexander C; Rao, Vivek A; Chan, Sheila L; Cullen, Sean P; Faigeles, Bonnie S; Smith, Wade S; Bath, Philip M; Wahlgren, Nils; Ahmed, Niaz; Donnan, Geoff A; Johnston, S Claiborne

    2015-08-01

    The Totaled Health Risks in Vascular Events (THRIVE) score is a previously validated ischemic stroke outcome prediction tool. Although simplified scoring systems like the THRIVE score facilitate ease-of-use, when computers or devices are available at the point of care, a more accurate and patient-specific estimation of outcome probability should be possible by computing the logistic equation with patient-specific continuous variables. We used data from 12 207 subjects from the Virtual International Stroke Trials Archive and the Safe Implementation of Thrombolysis in Stroke - Monitoring Study to develop and validate the performance of a model-derived estimation of outcome probability, the THRIVE-c calculation. Models were built with logistic regression using the underlying predictors from the THRIVE score: age, National Institutes of Health Stroke Scale score, and the Chronic Disease Scale (presence of hypertension, diabetes mellitus, or atrial fibrillation). Receiver operating characteristic analysis was used to assess model performance and compare the THRIVE-c model to the traditional THRIVE score, using a two-tailed Chi-squared test. The THRIVE-c model performed similarly in the randomly chosen development cohort (n = 6194, area under the curve = 0·786, 95% confidence interval 0·774-0·798) and validation cohort (n = 6013, area under the curve = 0·784, 95% confidence interval 0·772-0·796) (P = 0·79). Similar performance was also seen in two separate external validation cohorts. The THRIVE-c model (area under the curve = 0·785, 95% confidence interval 0·777-0·793) had superior performance when compared with the traditional THRIVE score (area under the curve = 0·746, 95% confidence interval 0·737-0·755) (P < 0·001). By computing the logistic equation with patient-specific continuous variables in the THRIVE-c calculation, outcomes at the individual patient level are more accurately estimated. Given the widespread availability of

  1. Effect of family relatedness on characteristics of estimated IBD probabilities in relation to precision of QTL estimates.

    Science.gov (United States)

    Freyer, Gertraude; Hernández-Sánchez, Jules; Vukasinovic, Natascha

    2010-09-26

    A random QTL effects model uses a function of probabilities that two alleles in the same or in different animals at a particular genomic position are identical by descent (IBD). Estimates of such IBD probabilities and, therefore, modeling and estimating QTL variances depend on marker polymorphism, strength of linkage and linkage disequilibrium of markers and QTL, and the relatedness of animals in the pedigree. The effect of relatedness of animals in a pedigree on IBD probabilities and their characteristics was examined in a simulation study. The study was based on nine multi-generational family structures, similar to the pedigree structure of a real dairy population, distinguished by inbreeding levels increasing from zero to 28% across the studied population. The highest inbreeding level in the pedigree, connected with the highest relatedness, was accompanied by the highest IBD probabilities of two alleles at the same locus, and by lower relative variation coefficients. Profiles of correlation coefficients of IBD probabilities along the marked chromosomal segment with those at the true QTL position were steepest when the inbreeding coefficient in the pedigree was highest. Precision of estimated QTL location increased with increasing inbreeding and pedigree relatedness. A method to assess the optimum level of inbreeding for QTL detection is proposed, depending on population parameters. An increased overall relationship in a QTL mapping design has positive effects on the precision of QTL position estimates, but the relationship between inbreeding level and the capacity for QTL detection, which depends on the recombination rate between the QTL and the adjacent informative marker, is not linear.

  2. Multifractals embedded in short time series: An unbiased estimation of probability moment

    Science.gov (United States)

    Qiu, Lu; Yang, Tianguang; Yin, Yanhua; Gu, Changgui; Yang, Huijie

    2016-12-01

    An exact estimation of probability moments is the basis for several essential concepts, such as the multifractals, the Tsallis entropy, and the transfer entropy. By means of approximation theory we propose a new method called factorial-moment-based estimation of probability moments. Theoretical prediction and computational results show that it can provide an unbiased estimation of the probability moments of continuous order. Calculations on a probability redistribution model verify that it can extract exactly multifractal behaviors from several hundred recordings. Its power in monitoring the evolution of scaling behaviors is exemplified by two empirical cases, i.e., the gait time series for fast, normal, and slow trials of a healthy volunteer, and the closing price series for the Shanghai stock market. Using short time series with lengths of several hundred points, a comparison with well-established tools displays significant advantages of its performance over the other methods. The factorial-moment-based estimation can evaluate correctly the scaling behaviors in a scale range about three generations wider than the multifractal detrended fluctuation analysis and the basic estimation. The estimation of the partition function given by the wavelet transform modulus maxima has unacceptable fluctuations. Besides the scaling invariance focused on in the present paper, the proposed factorial moment of continuous order has various other uses, such as finding nonextensive behaviors of a complex system and reconstructing the causality relationship network between elements of a complex system.
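
    The integer-order version of the factorial-moment idea is easy to state (Python sketch; the paper's extension to moments of continuous order and its multifractal machinery are not reproduced): for a binomial count n out of N trials, the ratio of falling factorials n(n-1)...(n-q+1) / [N(N-1)...(N-q+1)] is an unbiased estimator of p^q.

    def falling_factorial(x, q):
        result = 1.0
        for i in range(q):
            result *= (x - i)
        return result

    def unbiased_probability_moment(count, n_total, q):
        # E[count_(q)] = (n_total)_(q) * p**q for a binomial count, so this
        # ratio estimates p**q without bias (integer q shown only).
        return falling_factorial(count, q) / falling_factorial(n_total, q)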

  3. Variation in the standard deviation of the lure rating distribution: Implications for estimates of recollection probability.

    Science.gov (United States)

    Dopkins, Stephen; Varner, Kaitlin; Hoyer, Darin

    2017-10-01

    In word recognition, semantic priming of test words increased the false-alarm rate and the mean of confidence ratings to lures. Such priming also increased the standard deviation of confidence ratings to lures and the slope of the z-ROC function, suggesting that the priming increased the standard deviation of the lure evidence distribution. The Unequal Variance Signal Detection (UVSD) model accordingly interpreted the priming as increasing the standard deviation of the lure evidence distribution. Without additional parameters, the Dual Process Signal Detection (DPSD) model could only accommodate the results by fitting the data for related and unrelated primes separately, interpreting the priming, implausibly, as decreasing the probability of target recollection. With an additional parameter for the probability of false (lure) recollection, the model could fit the data for related and unrelated primes together, interpreting the priming as increasing the probability of false recollection. These results suggest that DPSD estimates of target recollection probability will decrease with increases in the lure confidence/evidence standard deviation unless a parameter is included for false recollection. Unfortunately, the size of a given lure confidence/evidence standard deviation relative to other possible lure confidence/evidence standard deviations is often unspecified by context. Hence the model often has no way of estimating false recollection probability and thereby correcting its estimates of target recollection probability.

  4. An Advanced Dynamic Framed-Slotted ALOHA Algorithm Based on Bayesian Estimation and Probability Response

    Directory of Open Access Journals (Sweden)

    Chaowei Wang

    2013-01-01

    Full Text Available This paper proposes an advanced dynamic framed-slotted ALOHA algorithm based on Bayesian estimation and probability response (BE-PDFSA) to improve the performance of radio frequency identification (RFID) systems. Bayesian estimation is introduced to improve the accuracy of the tag-number estimation, which otherwise suffers from the small number of observations available in a single query. The probability response is used to adjust the response probability of the unrecognized tags so that the number of responding tags matches the frame length. In this way, the high collision rate that arises as the tag number increases can be mitigated and the throughput of the whole system improved. The simulation results show that the proposed algorithm greatly improves the stability of the RFID system compared with DFSA and other commonly used algorithms.

  5. The estimated lifetime probability of acquiring human papillomavirus in the United States.

    Science.gov (United States)

    Chesson, Harrell W; Dunne, Eileen F; Hariri, Susan; Markowitz, Lauri E

    2014-11-01

    Estimates of the lifetime probability of acquiring human papillomavirus (HPV) can help to quantify HPV incidence, illustrate how common HPV infection is, and highlight the importance of HPV vaccination. We developed a simple model, based primarily on the distribution of lifetime numbers of sex partners across the population and the per-partnership probability of acquiring HPV, to estimate the lifetime probability of acquiring HPV in the United States in the time frame before HPV vaccine availability. We estimated the average lifetime probability of acquiring HPV among those with at least 1 opposite sex partner to be 84.6% (range, 53.6%-95.0%) for women and 91.3% (range, 69.5%-97.7%) for men. Under base case assumptions, more than 80% of women and men acquire HPV by age 45 years. Our results are consistent with estimates in the existing literature suggesting a high lifetime probability of HPV acquisition and are supported by cohort studies showing high cumulative HPV incidence over a relatively short period, such as 3 to 5 years.
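
    A minimal sketch of the kind of calculation the abstract describes: averaging the per-partnership acquisition probability over a distribution of lifetime partner numbers. The partner-count proportions and the per-partnership probability below are illustrative placeholders, not the inputs used in the cited study.

    ```python
    # Sketch: lifetime probability of acquiring an infection, combining a
    # distribution of lifetime partner counts with a constant per-partnership
    # acquisition probability. All numbers are illustrative assumptions.

    partner_dist = {0: 0.10, 1: 0.20, 2: 0.15, 3: 0.15, 5: 0.20, 10: 0.15, 20: 0.05}
    p_per_partner = 0.4  # hypothetical per-partnership acquisition probability

    # P(acquire | k partners) = 1 - (1 - p)^k, averaged over the partner distribution.
    lifetime_all = sum(w * (1.0 - (1.0 - p_per_partner) ** k)
                       for k, w in partner_dist.items())

    # Conditional on having at least one partner, as reported in the abstract.
    w_pos = sum(w for k, w in partner_dist.items() if k >= 1)
    lifetime_pos = sum(w * (1.0 - (1.0 - p_per_partner) ** k)
                       for k, w in partner_dist.items() if k >= 1) / w_pos

    print(f"lifetime acquisition probability (all)          : {lifetime_all:.3f}")
    print(f"lifetime acquisition probability (>= 1 partner) : {lifetime_pos:.3f}")
    ```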

  6. Sequential design of computer experiments for the estimation of a probability of failure

    OpenAIRE

    Bect, Julien; Ginsbourger, David; Li, Ling; Picheny, Victor; Vazquez, Emmanuel

    2010-01-01

    International audience; This paper deals with the problem of estimating the volume of the excursion set of a function $f:\mathbb{R}^d \to \mathbb{R}$ above a given threshold, under a probability measure on $\mathbb{R}^d$ that is assumed to be known. In the industrial world, this corresponds to the problem of estimating a probability of failure of a system. When only an expensive-to-simulate model of the system is available, the budget for simulations is usually severely limited and therefore classic...
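
    For orientation, a brute-force Monte Carlo estimate of the excursion (failure) probability that the paper targets; the toy function, threshold, and input distribution below are assumptions, and the paper's point is precisely to avoid this many evaluations of an expensive simulator through sequential design.

    ```python
    import numpy as np

    # Brute-force Monte Carlo estimate of the excursion (failure) probability
    # P[f(X) > u] for a toy function f and a known input distribution. The
    # function, threshold, and distribution are illustrative assumptions only.

    rng = np.random.default_rng(0)

    def f(x):
        # Toy "expensive" simulator: a smooth function on R^2.
        return x[:, 0] ** 2 + np.sin(3.0 * x[:, 1])

    u = 3.0                                  # failure threshold
    x = rng.normal(size=(100_000, 2))        # X ~ N(0, I), the assumed known measure
    failures = f(x) > u

    p_hat = failures.mean()
    std_err = np.sqrt(p_hat * (1.0 - p_hat) / len(failures))
    print(f"P[f(X) > {u}] ~= {p_hat:.4f} +/- {std_err:.4f}")
    ```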

  7. Local linear estimation of concordance probability with application to covariate effects models on association for bivariate failure-time data.

    Science.gov (United States)

    Ding, Aidong Adam; Hsieh, Jin-Jian; Wang, Weijing

    2015-01-01

    Bivariate survival analysis has wide applications. In the presence of covariates, most literature focuses on studying their effects on the marginal distributions. However, covariates can also affect the association between the two variables. In this article we consider the latter issue by proposing a nonstandard local linear estimator for the concordance probability as a function of covariates. Under the Clayton copula, the conditional concordance probability has a simple one-to-one correspondence with the copula parameter for different data structures including those subject to independent or dependent censoring and dependent truncation. The proposed method can be used to study how covariates affect the Clayton association parameter without specifying marginal regression models. Asymptotic properties of the proposed estimators are derived and their finite-sample performances are examined via simulations. Finally, for illustration, we apply the proposed method to analyze a bone marrow transplant data set.
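
    A small sketch of the one-to-one correspondence mentioned in the abstract, for the Clayton copula with continuous margins and no ties: Kendall's tau is theta/(theta + 2), so the concordance probability of a random pair is (theta + 1)/(theta + 2). The parameter values below are illustrative only.

    ```python
    # Link between the Clayton copula parameter (theta) and the concordance
    # probability. For continuous margins with no ties, Kendall's tau for the
    # Clayton copula is tau = theta / (theta + 2), and the concordance
    # probability of a random pair is (1 + tau) / 2 = (theta + 1) / (theta + 2).

    def clayton_tau(theta: float) -> float:
        return theta / (theta + 2.0)

    def concordance_prob(theta: float) -> float:
        return (1.0 + clayton_tau(theta)) / 2.0

    def theta_from_concordance(pc: float) -> float:
        # Invert pc = (theta + 1) / (theta + 2); valid for 0.5 < pc < 1.
        return (2.0 * pc - 1.0) / (1.0 - pc)

    for theta in (0.5, 1.0, 2.0, 5.0):
        pc = concordance_prob(theta)
        print(f"theta={theta:4.1f}  tau={clayton_tau(theta):.3f}  "
              f"P(concordant)={pc:.3f}  inverted theta={theta_from_concordance(pc):.3f}")
    ```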

  8. Time of Arrival Estimation in Probability-Controlled Generalized CDMA Systems

    Directory of Open Access Journals (Sweden)

    Hagit Messer

    2007-11-01

    Full Text Available In recent years, more and more wireless communication systems are also required to provide a positioning measurement. In code division multiple access (CDMA) communication systems, the positioning accuracy is significantly degraded by the multiple access interference (MAI) caused by other users in the system. This MAI is commonly managed by a power control mechanism, and yet, MAI has a major effect on positioning accuracy. Probability control is a recently introduced interference management mechanism. In this mechanism, a user with excess power chooses not to transmit some of its symbols. The information in the nontransmitted symbols is recovered by an error-correcting code (ECC), while all other users receive more reliable data during these quiet periods. Previous research has shown that the implementation of a probability control mechanism can significantly reduce the MAI. In this paper, we show that probability control also improves the positioning accuracy. We focus on time-of-arrival (TOA) based positioning systems. We analyze the TOA estimation performance in a generalized CDMA system, in which the probability control mechanism is employed, where the transmitted signal is noncontinuous with a symbol transmission probability smaller than 1. The accuracy of the TOA estimation is determined using appropriate modifications of the Cramer-Rao bound on the delay estimation. Keeping the average transmission power constant, we show that the TOA accuracy of each user does not depend on its transmission probability, while being a nondecreasing function of the transmission probability of any other user. Therefore, a generalized, noncontinuous CDMA system with a probability control mechanism can always achieve better positioning performance, for all users in the network, than a conventional, continuous CDMA system.

  9. Conditional probability distribution (CPD) method in temperature based death time estimation: Error propagation analysis.

    Science.gov (United States)

    Hubig, Michael; Muggenthaler, Holger; Mall, Gita

    2014-05-01

    Bayesian estimation applied to temperature-based death time estimation was recently introduced as the conditional probability distribution or CPD-method by Biermann and Potente. The CPD-method is useful if there is external information that sets the boundaries of the true death time interval (victim last seen alive and found dead). CPD allows computation of probabilities for small time intervals of interest (e.g. no-alibi intervals of suspects) within the large true death time interval. In light of the importance of the CPD for conviction or acquittal of suspects, the present study identifies a potential error source. Deviations in death time estimates will cause errors in the CPD-computed probabilities. We derive formulae to quantify the CPD error as a function of the input error. Moreover, we observed the paradox that, in cases in which the small no-alibi time interval is located at the boundary of the true death time interval, adjacent to the erroneous death time estimate, the CPD-computed probabilities for that small no-alibi interval will increase with increasing input deviation; otherwise the CPD-computed probabilities will decrease. We therefore advise against using the CPD if there is an indication of an error or a contra-empirical deviation in the death time estimates, especially if the death time estimates fall outside the true death time interval, even if the 95%-confidence intervals of the estimate still overlap the true death time interval. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
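
    A minimal numerical sketch of the CPD idea and of the error-propagation question raised in the abstract, assuming a normally distributed death time estimate truncated to the externally known interval; all times and standard deviations are invented for illustration.

    ```python
    from scipy.stats import norm

    # Conditional probability distribution (CPD) sketch: a temperature-based
    # death time estimate is modelled as a normal distribution, truncated to the
    # interval bounded by "last seen alive" and "found dead", and the probability
    # of a small sub-interval (e.g. a suspect's no-alibi window) is computed
    # conditionally. All numbers are illustrative assumptions.

    t_est, sigma = 10.0, 2.5      # death time estimate (hours before finding) and its SD
    t_lo, t_hi = 4.0, 14.0        # bounds of the true death time interval
    a, b = 6.0, 7.0               # small no-alibi interval of interest

    def cpd_probability(a, b, t_est, sigma, t_lo, t_hi):
        """P(a < T < b | t_lo < T < t_hi) for T ~ N(t_est, sigma^2)."""
        denom = norm.cdf(t_hi, t_est, sigma) - norm.cdf(t_lo, t_est, sigma)
        numer = norm.cdf(b, t_est, sigma) - norm.cdf(a, t_est, sigma)
        return numer / denom

    # Error propagation in miniature: shift the death time estimate and observe
    # how the CPD-computed probability of the no-alibi interval changes.
    for delta in (-1.0, 0.0, 1.0):
        p = cpd_probability(a, b, t_est + delta, sigma, t_lo, t_hi)
        print(f"estimate shifted by {delta:+.1f} h -> CPD probability {p:.3f}")
    ```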

  10. Estimating the posterior probability that genome-wide association findings are true or false.

    Science.gov (United States)

    Bukszár, József; McClay, Joseph L; van den Oord, Edwin J C G

    2009-07-15

    A limitation of current methods used to declare significance in genome-wide association studies (GWAS) is that they do not provide clear information about the probability that GWAS findings are true or false. This lack of information increases the chance of false discoveries and may result in real effects being missed. We propose a method to estimate the posterior probability that a marker has (no) effect given its test statistic value, also called the local false discovery rate (FDR), in the GWAS. A critical step involves the estimation of the parameters of the distribution of the true alternative tests. For this, we derived and implemented the real maximum likelihood function, which turned out to provide us with significantly more accurate estimates than the widely used mixture model likelihood. Actual GWAS data are used to illustrate properties of the posterior probability estimates empirically. In addition to evaluating individual markers, a variety of applications are conceivable. For instance, posterior probability estimates can be used to control the FDR more precisely than the Benjamini-Hochberg procedure. The codes are freely downloadable from the web site http://www.people.vcu.edu/~jbukszar.
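
    A toy illustration of the local false discovery rate as a posterior probability under a two-group mixture; the null proportion and the alternative distribution are fixed assumptions here, whereas the cited method estimates the alternative component by maximum likelihood.

    ```python
    import numpy as np
    from scipy.stats import norm

    # Local false discovery rate (posterior probability of "no effect" given a
    # test statistic) under a simple two-group mixture:
    #   f(z) = pi0 * f0(z) + (1 - pi0) * f1(z).
    # The mixture parameters below are illustrative placeholders.

    pi0 = 0.99                 # assumed proportion of null markers
    f0 = norm(0.0, 1.0)        # null distribution of z-scores
    f1 = norm(0.0, 3.0)        # assumed marginal alternative distribution

    def local_fdr(z):
        z = np.asarray(z, dtype=float)
        num = pi0 * f0.pdf(z)
        den = num + (1.0 - pi0) * f1.pdf(z)
        return num / den

    for z in (2.0, 4.0, 6.0):
        fdr = local_fdr(z)
        print(f"z = {z:.1f}: local FDR = {fdr:.4f}, "
              f"posterior P(true effect) = {1.0 - fdr:.4f}")
    ```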

  11. Estimating Rare Event Probabilities in Large Scale Stochastic Hybrid Systems by Sequential Monte Carlo Simulation

    NARCIS (Netherlands)

    Blom, H.A.P.; Krystul, J.; Bakker, G.J.

    2006-01-01

    We study the problem of estimating small reachability probabilities for large scale stochastic hybrid processes through Sequential Monte Carlo (SMC) simulation. Recently, [Cerou et al., 2002, 2005] developed an SMC approach for diffusion processes, and referred to the resulting SMC algorithm as an

  12. Estimating the Probability of a Rare Event Over a Finite Time Horizon

    NARCIS (Netherlands)

    de Boer, Pieter-Tjerk; L'Ecuyer, Pierre; Rubino, Gerardo; Tuffin, Bruno

    2007-01-01

    We study an approximation for the zero-variance change of measure to estimate the probability of a rare event in a continuous-time Markov chain. The rare event occurs when the chain reaches a given set of states before some fixed time limit. The jump rates of the chain are expressed as functions of

  13. Impact of Uncertainty on Non-Medical Professionals' Estimates of Sexual Abuse Probability.

    Science.gov (United States)

    Fargason, Crayton A., Jr.; Peralta-Carcelen, Myriam C.; Fountain, Kathleen E.; Amaya, Michelle I.; Centor, Robert

    1997-01-01

    Assesses how an educational intervention describing uncertainty in child sexual-abuse assessments affects estimates of sexual abuse probability by non-physician child-abuse professionals (CAPs). Results, based on evaluations of 89 CAPs after the intervention, indicate they undervalued medical-exam findings and had difficulty adjusting for medical…

  14. Estimating success probability of a rugby goal kick and developing a ...

    African Journals Online (AJOL)

    The objective of this study was firstly to derive a formula to estimate the success probability of a particular rugby goal kick and, secondly to derive a goal kicker rating measure that could be used to rank rugby union goal kickers. Various factors that could influence the success of a particular goal kick were considered.

  15. Allelic drop-out probabilities estimated by logistic regression--Further considerations and practical implementation

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Asplund, Maria

    2012-01-01

    We discuss the model for estimating drop-out probabilities presented by Tvedebrink et al. [7] and the concerns that have been raised. The criticism of the model has demonstrated that the model is not perfect. However, the model is very useful for advanced forensic genetic work, where allelic dro...

  16. Fast estimation of false alarm probabilities of STAP detectors - the AMF

    NARCIS (Netherlands)

    Srinivasan, R.; Rangaswamy, Muralidhar

    2005-01-01

    This paper describes an attempt to harness the power of adaptive importance sampling techniques for estimating false alarm probabilities of detectors that use space-time adaptive processing. Fast simulation using these techniques has been notably successful in the study of conventional constant

  17. Estimating Transitional Probabilities with Cross-Sectional Data to Assess Smoking Behavior Progression: A Validation Analysis.

    Science.gov (United States)

    Chen, Xinguang; Lin, Feng

    2012-09-03

    New analytical tools are needed to advance tobacco research, tobacco control planning and tobacco use prevention practice. In this study, we validated a method to extract information from cross-sectional surveys for quantifying population dynamics of adolescent smoking behavior progression. With a 3-stage 7-path model, probabilities of smoking behavior progression were estimated employing the Probabilistic Discrete Event System (PDES) method and the cross-sectional data from the 1997-2006 National Survey on Drug Use and Health (NSDUH). Validity of the PDES method was assessed using data from the National Longitudinal Survey of Youth 1997 and trends in smoking transition covering the period during which funding for tobacco control was cut substantively in 2003 in the United States. Probabilities for all seven smoking progression paths were successfully estimated with the PDES method and the NSDUH data. The absolute differences in the estimated probabilities between the two approaches varied from 0.002 to 0.076 (p>0.05 for all), and the estimates were highly correlated with each other (R(2)=0.998), indicating that probabilities of smoking behavior progression can be quantified from cross-sectional survey data. The estimated transitional probabilities add new evidence supporting more advanced tobacco research, tobacco control planning and tobacco use prevention practice. This method can be easily extended to study other health risk behaviors.

  18. Wildland fire probabilities estimated from weather model-deduced monthly mean fire danger indices

    Science.gov (United States)

    Haiganoush K. Preisler; Shyh-Chin Chen; Francis Fujioka; John W. Benoit; Anthony L. Westerling

    2008-01-01

    The National Fire Danger Rating System indices deduced from a regional simulation weather model were used to estimate probabilities and numbers of large fire events on monthly and 1-degree grid scales. The weather model simulations and forecasts are ongoing experimental products from the Experimental Climate Prediction Center at the Scripps Institution of Oceanography...

  19. OPTIMAL ESTIMATION OF RANDOM PROCESSES ON THE CRITERION OF MAXIMUM A POSTERIORI PROBABILITY

    Directory of Open Access Journals (Sweden)

    A. A. Lobaty

    2016-01-01

    Full Text Available We consider the problem of obtaining the equations for the a posteriori probability density of a stochastic Markov process with a linear measurement model. Unlike common approaches that take the minimum mean square error of estimation as the optimization criterion, here the criterion is the maximum of the a posteriori probability density of the process being estimated. The a priori probability density of the estimated Gaussian process is initially treated as a differentiable function, which allows it to be expanded in a Taylor series without the use of intermediate transformations such as characteristic functions and harmonic decomposition. For small time intervals, the probability density of the measurement error vector is, by assumption, Gaussian with zero expectation. This makes it possible to obtain a mathematical expression for the residual function, which characterizes the deviation of the actual measurement process from its mathematical model. The optimal a posteriori estimate of the state vector is defined by the assumption that the estimate coincides with the maximum of the a posteriori probability density. This makes it possible, on the basis of Bayes' formula for the a priori and a posteriori probability densities, to obtain the Stratonovich-Kushner equation. Using the Stratonovich-Kushner equation with different forms and values of the drift vector and diffusion matrix of the Markov stochastic process, one can solve a variety of filtering, identification, smoothing, and state-forecasting tasks for both continuous and discrete systems. A discrete implementation of the developed a posteriori estimation algorithms yields specific discrete algorithms suitable for on-board computers, for example in mobile robot systems.

  20. Estimates of annual survival probabilities for adult Florida manatees (Trichechus manatus latirostris)

    Science.gov (United States)

    Langtimm, C.A.; O'Shea, T.J.; Pradel, R.; Beck, C.A.

    1998-01-01

    The population dynamics of large, long-lived mammals are particularly sensitive to changes in adult survival. Understanding factors affecting survival patterns is therefore critical for developing and testing theories of population dynamics and for developing management strategies aimed at preventing declines or extinction in such taxa. Few studies have used modern analytical approaches for analyzing variation and testing hypotheses about survival probabilities in large mammals. This paper reports a detailed analysis of annual adult survival in the Florida manatee (Trichechus manatus latirostris), an endangered marine mammal, based on a mark-recapture approach. Natural and boat-inflicted scars distinctively 'marked' individual manatees that were cataloged in a computer-based photographic system. Photo-documented resightings provided 'recaptures.' Using open population models, annual adult-survival probabilities were estimated for manatees observed in winter in three areas of Florida: Blue Spring, Crystal River, and the Atlantic coast. After using goodness-of-fit tests in Program RELEASE to search for violations of the assumptions of mark-recapture analysis, survival and sighting probabilities were modeled under several different biological hypotheses with Program SURGE. Estimates of mean annual probability of sighting varied from 0.948 for Blue Spring to 0.737 for Crystal River and 0.507 for the Atlantic coast. At Crystal River and Blue Spring, annual survival probabilities were best estimated as constant over the study period at 0.96 (95% CI = 0.951-0.975 and 0.900-0.985, respectively). On the Atlantic coast, where manatees are impacted more by human activities, annual survival probabilities had a significantly lower mean estimate of 0.91 (95% CI = 0.887-0.926) and varied unpredictably over the study period. For each study area, survival did not differ between sexes and was independent of relative adult age. The high constant adult-survival probabilities estimated

  1. Using zero-norm constraint for sparse probability density function estimation

    Science.gov (United States)

    Hong, X.; Chen, S.; Harris, C. J.

    2012-11-01

    A new sparse kernel probability density function (pdf) estimator based on a zero-norm constraint is constructed using the classical Parzen window (PW) estimate as the target function. The so-called zero-norm of the parameters is used in order to achieve enhanced model sparsity, and it is suggested to minimize an approximate function of the zero-norm. It is shown that, under certain conditions, the kernel weights of the proposed pdf estimator based on the zero-norm approximation can be updated using the multiplicative nonnegative quadratic programming algorithm. Numerical examples are employed to demonstrate the efficacy of the proposed approach.
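
    For context, a sketch of the classical Parzen window estimate that serves as the target function in the abstract; the sparse zero-norm estimator and its multiplicative nonnegative quadratic programming update are not reproduced here, and the data and bandwidth are arbitrary.

    ```python
    import numpy as np

    # Classical Parzen window (Gaussian kernel) density estimate, the target
    # function the sparse zero-norm estimator is fitted to. Data and bandwidth
    # are illustrative assumptions.

    rng = np.random.default_rng(1)
    data = np.concatenate([rng.normal(-2.0, 0.5, 150), rng.normal(1.5, 1.0, 150)])
    h = 0.3  # kernel bandwidth

    def parzen_pdf(x, data, h):
        x = np.atleast_1d(np.asarray(x, dtype=float))
        # Average of Gaussian kernels centred at each data point.
        z = (x[:, None] - data[None, :]) / h
        k = np.exp(-0.5 * z ** 2) / np.sqrt(2.0 * np.pi)
        return k.mean(axis=1) / h

    grid = np.linspace(-4.0, 4.0, 9)
    for x, p in zip(grid, parzen_pdf(grid, data, h)):
        print(f"x = {x:+.1f}  pdf estimate = {p:.4f}")
    ```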

  2. On the Estimation of Detection Probabilities for Sampling Stream-Dwelling Fishes.

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, James T.

    1999-11-01

    To examine the adequacy of fish probability of detection estimates, I examined distributional properties of survey and monitoring data for bull trout (Salvelinus confluentus), brook trout (Salvelinus fontinalis), westslope cutthroat trout (Oncorhynchus clarki lewisi), chinook salmon parr (Oncorhynchus tshawytscha), and steelhead/redband trout (Oncorhynchus mykiss spp.), from 178 streams in the Interior Columbia River Basin. Negative binomial dispersion parameters varied considerably among species and streams, but were significantly (P<0.05) positively related to fish density. Across streams, the variances in fish abundances differed greatly among species and indicated that the data for all species were overdispersed with respect to the Poisson (i.e., the variances exceeded the means). This significantly affected Poisson probability of detection estimates, which were the highest across species and were, on average, 3.82, 2.66, and 3.47 times greater than baseline values. Required sample sizes for species detection at the 95% confidence level were also lowest for the Poisson, which underestimated sample size requirements an average of 72% across species. Negative binomial and Poisson-gamma probability of detection and sample size estimates were more accurate than the Poisson and generally less than 10% from baseline values. My results indicate the Poisson and binomial assumptions often are violated, which results in probability of detection estimates that are biased high and sample size estimates that are biased low. To increase the accuracy of these estimates, I recommend that future studies use predictive distributions that can incorporate multiple sources of uncertainty or excess variance and that all distributional assumptions be explicitly tested.
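
    A small sketch of the core point: with the same mean count, an overdispersed (negative binomial) model gives a noticeably lower probability of detecting at least one individual than the Poisson model. The mean and dispersion values are illustrative, not the estimates from the study.

    ```python
    from scipy.stats import poisson, nbinom

    # Probability of detecting a species (observing at least one individual)
    # under a Poisson count model versus a negative binomial model with the same
    # mean but overdispersion. Mean and dispersion below are placeholders.

    mean_count = 1.5   # expected number of individuals in the sampled units
    k = 0.4            # negative binomial dispersion (smaller -> more overdispersed)

    p_detect_poisson = 1.0 - poisson.pmf(0, mean_count)

    # scipy parameterises nbinom by (n, p) with mean = n * (1 - p) / p;
    # for mean m and dispersion k: n = k, p = k / (k + m).
    p_detect_nbinom = 1.0 - nbinom.pmf(0, k, k / (k + mean_count))

    print(f"P(detect | Poisson)           = {p_detect_poisson:.3f}")
    print(f"P(detect | negative binomial) = {p_detect_nbinom:.3f}")
    ```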

  3. Fundamental questions of earthquake statistics, source behavior, and the estimation of earthquake probabilities from possible foreshocks

    Science.gov (United States)

    Michael, Andrew J.

    2012-01-01

    Estimates of the probability that an ML 4.8 earthquake, which occurred near the southern end of the San Andreas fault on 24 March 2009, would be followed by an M 7 mainshock over the following three days vary from 0.0009 using a Gutenberg–Richter model of aftershock statistics (Reasenberg and Jones, 1989) to 0.04 using a statistical model of foreshock behavior and long‐term estimates of large earthquake probabilities, including characteristic earthquakes (Agnew and Jones, 1991). I demonstrate that the disparity between the existing approaches depends on whether or not they conform to Gutenberg–Richter behavior. While Gutenberg–Richter behavior is well established over large regions, it could be violated on individual faults if they have characteristic earthquakes or over small areas if the spatial distribution of large‐event nucleations is disproportional to the rate of smaller events. I develop a new form of the aftershock model that includes characteristic behavior and combines the features of both models. This new model and the older foreshock model yield the same results when given the same inputs, but the new model has the advantage of producing probabilities for events of all magnitudes, rather than just for events larger than the initial one. Compared with the aftershock model, the new model has the advantage of taking into account long‐term earthquake probability models. Using consistent parameters, the probability of an M 7 mainshock on the southernmost San Andreas fault is 0.0001 for three days from long‐term models and the clustering probabilities following the ML 4.8 event are 0.00035 for a Gutenberg–Richter distribution and 0.013 for a characteristic‐earthquake magnitude–frequency distribution. Our decisions about the existence of characteristic earthquakes and how large earthquakes nucleate have a first‐order effect on the probabilities obtained from short‐term clustering models for these large events.

  4. Estimating reliability of degraded system based on the probability density evolution with multi-parameter

    Directory of Open Access Journals (Sweden)

    Jiang Ge

    2017-01-01

    Full Text Available System degradation is usually caused by the degradation of multiple parameters. The assessment of system reliability by the universal generating function has low accuracy when compared with Monte Carlo simulation, and the probability density function of the system output performance cannot be obtained. Therefore, a reliability assessment method based on probability density evolution with multiple parameters is presented for complex degraded systems. First, the system output function is constructed according to the relation between the component parameters and the system output performance. Then, the probability density evolution equation is established based on the probability conservation principle and the system output function. Furthermore, the probability distribution characteristics of the system output performance are obtained by solving the differential equation. Finally, the reliability of the degraded system is estimated. This method does not need to discretize the performance parameters and can establish a continuous probability density function of the system output performance with high calculation efficiency and low cost. A numerical example shows that this method is applicable to evaluating the reliability of multi-parameter degraded systems.

  5. A fast algorithm for estimating transmission probabilities in QTL detection designs with dense maps

    Directory of Open Access Journals (Sweden)

    Gilbert Hélène

    2009-11-01

    Full Text Available Abstract Background In the case of an autosomal locus, four transmission events from the parents to progeny are possible, specified by the grand parental origin of the alleles inherited by this individual. Computing the probabilities of these transmission events is essential to perform QTL detection methods. Results A fast algorithm for the estimation of these probabilities conditional to parental phases has been developed. It is adapted to classical QTL detection designs applied to outbred populations, in particular to designs composed of half and/or full sib families. It assumes the absence of interference. Conclusion The theory is fully developed and an example is given.

  6. Estimation of submarine mass failure probability from a sequence of deposits with age dates

    Science.gov (United States)

    Geist, Eric L.; Chaytor, Jason D.; Parsons, Thomas E.; ten Brink, Uri S.

    2013-01-01

    The empirical probability of submarine mass failure is quantified from a sequence of dated mass-transport deposits. Several different techniques are described to estimate the parameters for a suite of candidate probability models. The techniques, previously developed for analyzing paleoseismic data, include maximum likelihood and Type II (Bayesian) maximum likelihood methods derived from renewal process theory and Monte Carlo methods. The estimated mean return time from these methods, unlike estimates from a simple arithmetic mean of the center age dates and standard likelihood methods, includes the effects of age-dating uncertainty and of open time intervals before the first and after the last event. The likelihood techniques are evaluated using Akaike’s Information Criterion (AIC) and Akaike’s Bayesian Information Criterion (ABIC) to select the optimal model. The techniques are applied to mass transport deposits recorded in two Integrated Ocean Drilling Program (IODP) drill sites located in the Ursa Basin, northern Gulf of Mexico. Dates of the deposits were constrained by regional bio- and magnetostratigraphy from a previous study. Results of the analysis indicate that submarine mass failures in this location occur primarily according to a Poisson process in which failures are independent and return times follow an exponential distribution. However, some of the model results suggest that submarine mass failures may occur quasiperiodically at one of the sites (U1324). The suite of techniques described in this study provides quantitative probability estimates of submarine mass failure occurrence, for any number of deposits and age uncertainty distributions.
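
    A toy contrast between a naive mean of inter-event times and a Poisson maximum-likelihood return time that also uses the open intervals before the first and after the last deposit, as emphasized in the abstract; the ages and record span below are invented, not the Ursa Basin data.

    ```python
    import numpy as np

    # Naive arithmetic-mean return time (centre age dates only) versus a Poisson
    # maximum-likelihood estimate over the whole observed record, which also
    # accounts for the open intervals at both ends. All numbers are placeholders.

    event_ages = np.array([12.0, 25.0, 41.0, 44.0, 60.0])  # deposit ages (ka BP)
    record_start, record_end = 0.0, 70.0                    # observed record (ka BP)

    # Naive estimate: mean of inter-event times between centre dates.
    naive_return = np.diff(np.sort(event_ages)).mean()

    # Poisson MLE: rate = events per unit time over the whole record, so the
    # mean return time is the total record span divided by the number of events.
    poisson_return = (record_end - record_start) / len(event_ages)

    print(f"naive mean return time  : {naive_return:.1f} ka")
    print(f"Poisson MLE return time : {poisson_return:.1f} ka")
    ```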

  7. Use of attribute association error probability estimates to evaluate quality of medical record geocodes.

    Science.gov (United States)

    Klaus, Christian A; Carrasco, Luis E; Goldberg, Daniel W; Henry, Kevin A; Sherman, Recinda L

    2015-09-15

    The utility of patient attributes associated with the spatiotemporal analysis of medical records lies not just in their values but also the strength of association between them. Estimating the extent to which a hierarchy of conditional probability exists between patient attribute associations such as patient identifying fields, patient and date of diagnosis, and patient and address at diagnosis is fundamental to estimating the strength of association between patient and geocode, and patient and enumeration area. We propose a hierarchy for the attribute associations within medical records that enable spatiotemporal relationships. We also present a set of metrics that store attribute association error probability (AAEP), to estimate error probability for all attribute associations upon which certainty in a patient geocode depends. A series of experiments were undertaken to understand how error estimation could be operationalized within health data and what levels of AAEP in real data reveal themselves using these methods. Specifically, the goals of this evaluation were to (1) assess if the concept of our error assessment techniques could be implemented by a population-based cancer registry; (2) apply the techniques to real data from a large health data agency and characterize the observed levels of AAEP; and (3) demonstrate how detected AAEP might impact spatiotemporal health research. We present an evaluation of AAEP metrics generated for cancer cases in a North Carolina county. We show examples of how we estimated AAEP for selected attribute associations and circumstances. We demonstrate the distribution of AAEP in our case sample across attribute associations, and demonstrate ways in which disease registry specific operations influence the prevalence of AAEP estimates for specific attribute associations. The effort to detect and store estimates of AAEP is worthwhile because of the increase in confidence fostered by the attribute association level approach to the

  8. Annotated corpus and the empirical evaluation of probability estimates of grammatical forms

    Directory of Open Access Journals (Sweden)

    Ševa Nada

    2003-01-01

    Full Text Available The aim of the present study is to demonstrate the usage of an annotated corpus in the field of experimental psycholinguistics. Specifically, we demonstrate how the manually annotated Corpus of Serbian Language (Kostić, Đ. 2001) can be used for probability estimates of grammatical forms, which allow the control of independent variables in psycholinguistic experiments. We address the issue of processing Serbian inflected forms within two subparadigms of feminine nouns. In regression analysis, almost all processing variability of inflected forms has been accounted for by the amount of information (i.e. bits) carried by the presented forms. In spite of the fact that the probability distributions of inflected forms for the two paradigms differ, it was shown that the best prediction of processing variability is obtained by the probabilities derived from the predominant subparadigm, which encompasses about 80% of feminine nouns. The relevance of annotated corpora in experimental psycholinguistics is discussed in more detail.

  9. Estimating conditional probability of volcanic flows for forecasting event distribution and making evacuation decisions

    Science.gov (United States)

    Stefanescu, E. R.; Patra, A.; Sheridan, M. F.; Cordoba, G.

    2012-04-01

    In this study we propose a conditional probability framework for Galeras volcano, which is one of the most active volcanoes in the world. Nearly 400,000 people currently live near the volcano; 10,000 of them reside within the zone of high volcanic hazard. Pyroclastic flows pose a major hazard for this population. Some of the questions we try to answer when studying conditional probabilities for volcanic hazards are: "Should a village be evacuated and villagers moved to a different location?", "Should we construct a road along this valley or along a different one?", "Should this university be evacuated?" Here, we try to identify critical regions such as villages, infrastructure, cities, and universities to determine their relative probability of inundation in case of a volcanic eruption. In this study, a set of numerical simulations was performed using the computational tool TITAN2D, which simulates granular flow over a digital representation of the natural terrain. The particular choice from among the methods described below can be based on the amount of information necessary in the evacuation decision and on the complexity of the analysis required in taking such a decision. A set of 4200 TITAN2D runs was performed for several different locations so that the area of all probable vents is covered. The output of the geophysical model provides a flow map which contains the maximum flow depth over time. Frequency approach - In estimating the conditional probability of volcanic flows we define two discrete random variables (r.v.) A and B, where P(A=1) and P(B=1) represent the probability of having a flow at locations A and B, respectively. For this analysis we choose two critical locations identified by their UTM coordinates. The flow map is then used to identify, at the pixel level, flow or non-flow at the two locations. By counting the number of times there is flow or non-flow, we are able to find the marginal probabilities along with the joint probability associated with an
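
    A sketch of the frequency approach outlined in the abstract: tabulate flow / non-flow at two pixel locations over an ensemble of simulated flow maps and form marginal, joint, and conditional probabilities. The binary outcomes below are random placeholders standing in for TITAN2D flow-depth maps.

    ```python
    import numpy as np

    # Frequency-approach sketch for the conditional probability of flow at two
    # critical locations, using an ensemble of simulated runs. The correlated
    # Bernoulli outcomes below are stand-ins for "flow reaches location X".

    rng = np.random.default_rng(3)
    n_runs = 4200

    latent = rng.normal(size=n_runs)
    flow_a = (latent + rng.normal(scale=0.8, size=n_runs)) > 0.8   # flow at location A
    flow_b = (latent + rng.normal(scale=0.8, size=n_runs)) > 1.2   # flow at location B

    p_a = flow_a.mean()                      # marginal P(A = 1)
    p_b = flow_b.mean()                      # marginal P(B = 1)
    p_ab = (flow_a & flow_b).mean()          # joint P(A = 1, B = 1)
    p_a_given_b = p_ab / p_b                 # conditional P(A = 1 | B = 1)

    print(f"P(A)={p_a:.3f}  P(B)={p_b:.3f}  P(A,B)={p_ab:.3f}  P(A|B)={p_a_given_b:.3f}")
    ```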

  10. Time-Varying Transition Probability Matrix Estimation and Its Application to Brand Share Analysis

    Science.gov (United States)

    Chiba, Tomoaki; Akaho, Shotaro; Murata, Noboru

    2017-01-01

    In a product market or stock market, different products or stocks compete for the same consumers or purchasers. We propose a method to estimate the time-varying transition matrix of the product share using a multivariate time series of the product share. The method is based on the assumption that each of the observed time series of shares is a stationary distribution of the underlying Markov processes characterized by transition probability matrices. We estimate transition probability matrices for every observation under natural assumptions. We demonstrate, on a real-world dataset of the share of automobiles, that the proposed method can find intrinsic transition of shares. The resulting transition matrices reveal interesting phenomena, for example, the change in flows between TOYOTA group and GM group for the fiscal year where TOYOTA group’s sales beat GM’s sales, which is a reasonable scenario. PMID:28076383
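
    A simplified sketch of the underlying idea: recover a row-stochastic transition matrix from a share time series by least squares under the assumption that s_{t+1} is approximately s_t P. The cited method additionally lets P vary over time; here P is held fixed and the share data are synthetic.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Recover a static row-stochastic transition matrix P from a share time
    # series, assuming s_{t+1} ~= s_t P. Synthetic data; a simplified stand-in
    # for the time-varying estimation described in the abstract.

    rng = np.random.default_rng(4)
    true_P = np.array([[0.90, 0.08, 0.02],
                       [0.05, 0.90, 0.05],
                       [0.02, 0.08, 0.90]])

    shares = [np.array([0.5, 0.3, 0.2])]
    for _ in range(30):
        shares.append(shares[-1] @ true_P)
    shares = np.array(shares) + rng.normal(scale=0.002, size=(31, 3))  # noisy shares

    k = shares.shape[1]

    def loss(p_flat):
        P = p_flat.reshape(k, k)
        resid = shares[1:] - shares[:-1] @ P
        return np.sum(resid ** 2)

    # Each row of P must be a probability vector: entries in [0, 1], summing to 1.
    constraints = [{"type": "eq", "fun": lambda p, i=i: p.reshape(k, k)[i].sum() - 1.0}
                   for i in range(k)]
    result = minimize(loss, x0=np.full(k * k, 1.0 / k), method="SLSQP",
                      bounds=[(0.0, 1.0)] * (k * k), constraints=constraints)

    print("estimated transition matrix:")
    print(np.round(result.x.reshape(k, k), 3))
    ```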

  11. Estimating Effect Sizes and Expected Replication Probabilities from GWAS Summary Statistics

    DEFF Research Database (Denmark)

    Holland, Dominic; Wang, Yunpeng; Thompson, Wesley K

    2016-01-01

    Genome-wide Association Studies (GWAS) result in millions of summary statistics ("z-scores") for single nucleotide polymorphism (SNP) associations with phenotypes. These rich datasets afford deep insights into the nature and extent of genetic contributions to complex phenotypes such as psychiatric ... z-scores, as such knowledge would enhance causal SNP and gene discovery, help elucidate mechanistic pathways, and inform future study design. Here we present a parsimonious methodology for modeling effect sizes and replication probabilities, relying only on summary statistics from GWAS substudies, and a scheme allowing ... 9.3 million SNP z-scores in both cases. We show that, over a broad range of z-scores and sample sizes, the model accurately predicts expectation estimates of true effect sizes and replication probabilities in multistage GWAS designs. We assess the degree to which effect sizes are over-estimated when...

  12. Time-Varying Transition Probability Matrix Estimation and Its Application to Brand Share Analysis.

    Science.gov (United States)

    Chiba, Tomoaki; Hino, Hideitsu; Akaho, Shotaro; Murata, Noboru

    2017-01-01

    In a product market or stock market, different products or stocks compete for the same consumers or purchasers. We propose a method to estimate the time-varying transition matrix of the product share using a multivariate time series of the product share. The method is based on the assumption that each of the observed time series of shares is a stationary distribution of the underlying Markov processes characterized by transition probability matrices. We estimate transition probability matrices for every observation under natural assumptions. We demonstrate, on a real-world dataset of the share of automobiles, that the proposed method can find intrinsic transition of shares. The resulting transition matrices reveal interesting phenomena, for example, the change in flows between TOYOTA group and GM group for the fiscal year where TOYOTA group's sales beat GM's sales, which is a reasonable scenario.

  13. Tests of Catastrophic Outlier Prediction in Empirical Photometric Redshift Estimation with Redshift Probability Distributions

    Science.gov (United States)

    Jones, Evan; Singal, Jack

    2018-01-01

    We present results of using individual galaxies' redshift probability information derived from a photometric redshift (photo-z) algorithm, SPIDERz, to identify potential catastrophic outliers in photometric redshift determinations. By using test data comprised of COSMOS multi-band photometry and known spectroscopic redshifts from the 3D-HST survey spanning a wide redshift range, we demonstrate a method to flag potential catastrophic outliers in an analysis which relies on accurate photometric redshifts. SPIDERz is a custom support vector machine classification algorithm for photo-z analysis that naturally outputs a distribution of redshift probability information for each galaxy in addition to a discrete most probable photo-z value. By applying an analytic technique with flagging criteria to identify the presence of probability distribution features characteristic of catastrophic outlier photo-z estimates, such as multiple redshift probability peaks separated by substantial redshift distances, we can flag potential catastrophic outliers in photo-z determinations. We find that our proposed method can correctly flag large fractions of the outliers and catastrophic outlier galaxies, while only flagging a small fraction of the total non-outlier galaxies. We examine the performance of this strategy in photo-z determinations using a range of flagging parameter values. These results could potentially be useful for utilization of photometric redshifts in future large scale surveys where catastrophic outliers are particularly detrimental to the science goals.

  14. Estimating the Probability of Vegetation to Be Groundwater Dependent Based on the Evaluation of Tree Models

    Directory of Open Access Journals (Sweden)

    Isabel C. Pérez Hoyos

    2016-04-01

    Full Text Available Groundwater Dependent Ecosystems (GDEs) are increasingly threatened by humans’ rising demand for water resources. Consequently, it is imperative to identify the location of GDEs to protect them. This paper develops a methodology to identify the probability that an ecosystem is groundwater dependent. Probabilities are obtained by modeling the relationship between the known locations of GDEs and factors influencing groundwater dependence, namely water table depth and climatic aridity index. Probabilities are derived for the state of Nevada, USA, using modeled water table depth and aridity index values obtained from the Global Aridity database. The selected model results from a performance comparison of classification trees (CT) and random forests (RF). Based on a threshold-independent accuracy measure, RF has a better ability to generate probability estimates. Considering a threshold that minimizes the misclassification rate for each model, RF also proves to be more accurate. Regarding training accuracy, performance measures such as accuracy, sensitivity, and specificity are higher for RF. For the test set, higher values of accuracy and kappa for CT highlight the fact that these measures are greatly affected by low prevalence. As shown for RF, the choice of the cutoff probability value has important consequences for model accuracy and the overall proportion of locations where GDEs are found.
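
    A minimal sketch, with synthetic data, of how a random forest yields a probability of groundwater dependence from the two predictors named in the abstract (water table depth and aridity index); the labels, thresholds, and coefficients are invented and scikit-learn is assumed to be available.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Probability of groundwater dependence from a random forest trained on two
    # predictors. The training data are synthetic placeholders, not the Nevada
    # dataset used in the cited study.

    rng = np.random.default_rng(5)
    n = 500
    wtd = rng.uniform(0.0, 60.0, n)        # water table depth (m)
    aridity = rng.uniform(0.05, 1.2, n)    # aridity index (P/PET)

    # Synthetic rule: shallow water tables and less arid sites are more likely GDEs.
    logit = 2.5 - 0.15 * wtd + 1.5 * aridity
    labels = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    X = np.column_stack([wtd, aridity])
    rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, labels)

    # Probability of groundwater dependence at a few hypothetical sites.
    sites = np.array([[2.0, 0.3], [15.0, 0.3], [40.0, 0.8]])
    for site, p in zip(sites, rf.predict_proba(sites)[:, 1]):
        print(f"water table depth {site[0]:4.1f} m, aridity {site[1]:.2f} "
              f"-> P(GDE) = {p:.2f}")
    ```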

  15. Skew Generalized Extreme Value Distribution: Probability Weighted Moments Estimation and Application to Block Maxima Procedure

    OpenAIRE

    Ribereau, Pierre; Masiello, Esterina; Naveau, Philippe

    2014-01-01

    International audience; Following the work of Azzalini ([2] and [3]) on the skew normal distribution, we propose an extension of the Generalized Extreme Value (GEV) distribution, the SGEV. This new distribution allows for a better fit of maxima and can be interpreted as both the distribution of maxima when maxima are taken on dependent data and when maxima are taken over a random block size. We propose to estimate the parameters of the SGEV distribution via the Probability Weighted Moments meth...

  16. Probability estimates of seismic event occurrence compared to health hazards - Forecasting Taipei's Earthquakes

    Science.gov (United States)

    Fung, D. C. N.; Wang, J. P.; Chang, S. H.; Chang, S. C.

    2014-12-01

    Using a revised statistical model built on past seismic probability models, the probability of different magnitude earthquakes occurring within variable timespans can be estimated. The revised model is based on the Poisson distribution and includes the use of best-estimate values of the probability distribution of different magnitude earthquakes recurring from a fault from literature sources. Our study aims to apply this model to the Taipei metropolitan area with a population of 7 million, which lies in the Taipei Basin and is bounded by two normal faults: the Sanchaio and Taipei faults. The Sanchaio fault is suggested to be responsible for previous large magnitude earthquakes, such as the 1694 magnitude 7 earthquake in northwestern Taipei (Cheng et al., 2010). Based on a magnitude 7 earthquake return period of 543 years, the model predicts the occurrence of a magnitude 7 earthquake within 20 years at 1.81%, within 79 years at 6.77% and within 300 years at 21.22%. These estimates increase significantly when considering a magnitude 6 earthquake; the chance of one occurring within the next 20 years is estimated to be 3.61%, within 79 years at 13.54%, and within 300 years at 42.45%. The 79 year period represents the average lifespan of the Taiwan population. In contrast, based on data from 2013, the probabilities of Taiwan residents experiencing heart disease or malignant neoplasm are 11.5% and 29%, respectively. The inference of this study is that the risk to the Taipei population from a potentially damaging magnitude 6 or greater earthquake occurring within their lifetime is just as great as that of suffering from a heart attack or other health ailments.
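
    For reference, the plain Poisson-recurrence calculation that such estimates are built on: the probability of at least one event in an exposure time t, given a mean return period T, is 1 - exp(-t/T). The abstract's revised model differs in detail, so the numbers below are only indicative; the 543-year return period is taken from the abstract.

    ```python
    import math

    # Poisson recurrence: P(at least one event in time t) = 1 - exp(-t / T),
    # where T is the mean return period. The cited study's revised model differs,
    # so these values are only a baseline illustration.

    return_period = 543.0  # years, magnitude-7 return period cited in the abstract

    for t in (20.0, 79.0, 300.0):
        p = 1.0 - math.exp(-t / return_period)
        print(f"P(at least one M7 event in {t:5.0f} yr) = {p:.3f}")
    ```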

  17. Incident prediction: a statistical approach to dynamic probability estimation : application to a test site in Barcelona

    OpenAIRE

    Montero Mercadé, Lídia; Barceló Bugeda, Jaime; Perarnau, Josep

    2002-01-01

    DR 2002/08 Departament d'EIO - Research Supported by PRIME European Project Real-time models for estimating incident probabilities (EIP models) are innovative methods for predicting the potential occurrence of incidents and improving the effectiveness of incident management policies devoted to increasing road safety. EIP models imbedded in traffic management systems can lead to the development of control strategies for reducing the likelihood of incidents before they occur. This paper pre...

  18. The Probability of Default Under IFRS 9: Multi-period Estimation and Macroeconomic Forecast

    Directory of Open Access Journals (Sweden)

    Tomáš Vaněk

    2017-01-01

    Full Text Available In this paper we propose a straightforward, flexible and intuitive computational framework for the multi-period probability of default estimation incorporating macroeconomic forecasts. The concept is based on Markov models, the estimated economic adjustment coefficient and the official economic forecasts of the Czech National Bank. The economic forecasts are taken into account in a separate step to better distinguish between idiosyncratic and systemic risk. This approach is also attractive from the interpretational point of view. The proposed framework can be used especially when calculating lifetime expected credit losses under IFRS 9.
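
    A minimal sketch of the Markov building block: with a one-period transition matrix that has an absorbing default state, the cumulative probability of default at horizon t is the default-column entry of P^t. The matrix below is a placeholder, and the framework's macroeconomic adjustment step is not shown.

    ```python
    import numpy as np

    # Multi-period (lifetime) probability of default from a one-year rating
    # transition matrix with an absorbing default state, via matrix powers.
    # The transition matrix is an illustrative placeholder only.

    states = ["good", "watch", "default"]
    P = np.array([[0.92, 0.06, 0.02],
                  [0.15, 0.75, 0.10],
                  [0.00, 0.00, 1.00]])   # default is absorbing

    start = states.index("good")          # exposure currently in the "good" state
    default = states.index("default")

    Pt = np.eye(len(states))
    prev_cumulative = 0.0
    for year in range(1, 6):
        Pt = Pt @ P
        cumulative_pd = Pt[start, default]
        marginal_pd = cumulative_pd - prev_cumulative
        prev_cumulative = cumulative_pd
        print(f"year {year}: cumulative PD = {cumulative_pd:.4f}, "
              f"marginal PD = {marginal_pd:.4f}")
    ```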

  19. Remediating Non-Positive Definite State Covariances for Collision Probability Estimation

    Science.gov (United States)

    Hall, Doyle T.; Hejduk, Matthew D.; Johnson, Lauren C.

    2017-01-01

    The NASA Conjunction Assessment Risk Analysis team estimates the probability of collision (Pc) for a set of Earth-orbiting satellites. The Pc estimation software processes satellite position+velocity states and their associated covariance matrices. On occasion, the software encounters non-positive definite (NPD) state covariances, which can adversely affect or prevent the Pc estimation process. Interpolation inaccuracies appear to account for the majority of such covariances, although other mechanisms contribute also. This paper investigates the origin of NPD state covariance matrices, three different methods for remediating these covariances when and if necessary, and the associated effects on the Pc estimation process.
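
    As one illustration of what remediation can look like (the paper compares several methods; the eigenvalue clipping shown here is a common generic approach and only an assumption about the kind of fix involved):

    ```python
    import numpy as np

    # One common remediation for a non-positive definite (NPD) covariance matrix:
    # clip negative eigenvalues to a small positive floor and reassemble. This is
    # a generic sketch on an illustrative 3x3 matrix, not necessarily one of the
    # specific methods evaluated in the cited paper.

    def clip_to_positive_definite(cov, floor=1e-10):
        cov = 0.5 * (cov + cov.T)                 # enforce symmetry first
        eigvals, eigvecs = np.linalg.eigh(cov)
        clipped = np.clip(eigvals, floor, None)   # raise negative eigenvalues
        return eigvecs @ np.diag(clipped) @ eigvecs.T

    npd_cov = np.array([[ 4.0,  3.9, -2.0],
                        [ 3.9,  4.0, -2.1],
                        [-2.0, -2.1,  0.5]])      # illustrative NPD matrix

    print("original eigenvalues  :", np.round(np.linalg.eigvalsh(npd_cov), 4))
    fixed = clip_to_positive_definite(npd_cov)
    print("remediated eigenvalues:", np.round(np.linalg.eigvalsh(fixed), 4))
    ```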

  20. Statistical computation of Boltzmann entropy and estimation of the optimal probability density function from statistical sample

    Science.gov (United States)

    Sui, Ning; Li, Min; He, Ping

    2014-12-01

    In this work, we investigate the statistical computation of the Boltzmann entropy of statistical samples. For this purpose, we use both a histogram and a kernel function to estimate the probability density function of statistical samples. We find that, due to coarse-graining, the entropy is a monotonically increasing function of the bin width for the histogram, or of the bandwidth for kernel estimation, which makes it difficult to select an optimal bin width/bandwidth for computing the entropy. Fortunately, we notice that there exists a minimum of the first derivative of the entropy for both histogram and kernel estimation, and this minimum point of the first derivative asymptotically points to the optimal bin width or bandwidth. We have verified these findings by a large number of numerical experiments. Hence, we suggest that the minimum of the first derivative of the entropy be used as a selector for the optimal bin width or bandwidth of density estimation. Moreover, the optimal bandwidth selected by the minimum of the first derivative of the entropy is purely data-based, independent of the unknown underlying probability density distribution, which makes it superior to the existing estimators. Our results are not restricted to one-dimensional cases, but can also be extended to multivariate cases. It should be emphasized, however, that we do not provide a rigorous mathematical proof of these findings, and we leave these issues to those who are interested in them.
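
    A small numerical sketch of the proposed selector for the histogram case: compute the entropy estimate over a range of bin widths, differentiate numerically, and take the bin width at which the first derivative is minimal. The Gaussian sample and the scanned width range are arbitrary choices.

    ```python
    import numpy as np

    # Histogram entropy as a function of bin width, and the bin width at which
    # its first derivative is minimal, as suggested in the abstract. Sample and
    # scan range are illustrative.

    rng = np.random.default_rng(6)
    sample = rng.normal(0.0, 1.0, 5000)

    def histogram_entropy(data, width):
        edges = np.arange(data.min(), data.max() + width, width)
        counts, _ = np.histogram(data, bins=edges)
        p = counts[counts > 0] / counts.sum()
        # Differential entropy estimate: -sum p ln(p / width) = -sum p ln p + ln(width)
        return -np.sum(p * np.log(p)) + np.log(width)

    widths = np.linspace(0.02, 1.0, 200)
    entropies = np.array([histogram_entropy(sample, w) for w in widths])

    # First derivative of the entropy with respect to bin width (finite differences).
    dS_dw = np.gradient(entropies, widths)
    w_opt = widths[np.argmin(dS_dw)]
    print(f"suggested bin width (minimum of dS/dw): {w_opt:.3f}")
    ```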

  1. A robust design mark-resight abundance estimator allowing heterogeneity in resighting probabilities

    Science.gov (United States)

    McClintock, B.T.; White, Gary C.; Burnham, K.P.

    2006-01-01

    This article introduces the beta-binomial estimator (BBE), a closed-population abundance mark-resight model combining the favorable qualities of maximum likelihood theory and the allowance of individual heterogeneity in sighting probability (p). The model may be parameterized for a robust sampling design consisting of multiple primary sampling occasions where closure need not be met between primary occasions. We applied the model to brown bear data from three study areas in Alaska and compared its performance to the joint hypergeometric estimator (JHE) and Bowden's estimator (BOWE). BBE estimates suggest heterogeneity levels were non-negligible and discourage the use of JHE for these data. Compared to JHE and BOWE, confidence intervals were considerably shorter for the AICc model-averaged BBE. To evaluate the properties of BBE relative to JHE and BOWE when sample sizes are small, simulations were performed with data from three primary occasions generated under both individual heterogeneity and temporal variation in p. All models remained consistent regardless of levels of variation in p. In terms of precision, the AICc model-averaged BBE showed advantages over JHE and BOWE when heterogeneity was present and mean sighting probabilities were similar between primary occasions. Based on the conditions examined, BBE is a reliable alternative to JHE or BOWE and provides a framework for further advances in mark-resight abundance estimation. © 2006 American Statistical Association and the International Biometric Society.

  2. Issues in estimating probability of detection of NDT techniques - A model assisted approach.

    Science.gov (United States)

    Rentala, Vamsi Krishna; Mylavarapu, Phani; Gautam, Jai Prakash

    2018-02-13

    In order to successfully implement Damage Tolerance (DT) methodology for aero-engines, Non-Destructive Testing (NDT) techniques are vital for assessing the remaining life of the component. Probability of Detection (POD), a standard measure of NDT reliability, is usually estimated as per the MIL-HDBK-1823A standard. Estimation of the POD of any NDT technique can be obtained by both experimental and model assisted methods. POD depends on many factors such as material, geometry, defect characteristics, inspection technique, etc. These requirements put enormous limitations on generating experimental POD curves and hence, Model Assisted Probability of Detection (MAPOD) curves are currently in vogue. In this study, MAPOD approaches were demonstrated by addressing various issues related to the selection of the crack size distribution, challenges involved in censoring and regression, estimation of distribution parameters, etc. Ultrasonic testing on volumetric defects has been identified as a platform to discuss the challenges involved. A COMSOL Multiphysics based FEM numerical model developed to simulate the ultrasonic response from a Ti-6Al-4V cylindrical block has been validated experimentally. Further, the individual ultrasonic responses from various Flat Bottom Hole (FBH) defects following a lognormal distribution have been generated using the numerical model. The a90/95 value (the flaw size detected with 90% probability at 95% confidence) obtained from the POD curve increased with an increase in the decision threshold. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. Using optimal transport theory to estimate transition probabilities in metapopulation dynamics

    Science.gov (United States)

    Nichols, Jonathan M.; Spendelow, Jeffrey A.; Nichols, James D.

    2017-01-01

    This work considers the estimation of transition probabilities associated with populations moving among multiple spatial locations based on numbers of individuals at each location at two points in time. The problem is generally underdetermined as there exists an extremely large number of ways in which individuals can move from one set of locations to another. A unique solution therefore requires a constraint. The theory of optimal transport provides such a constraint in the form of a cost function, to be minimized in expectation over the space of possible transition matrices. We demonstrate the optimal transport approach on marked bird data and compare to the probabilities obtained via maximum likelihood estimation based on marked individuals. It is shown that by choosing the squared Euclidean distance as the cost, the estimated transition probabilities compare favorably to those obtained via maximum likelihood with marked individuals. Other implications of this cost are discussed, including the ability to accurately interpolate the population's spatial distribution at unobserved points in time and the more general relationship between the cost and minimum transport energy.
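
    A toy sketch of the optimal transport construction with squared Euclidean cost, solved as a linear program: the transport plan moving the first count vector to the second at minimum cost is converted into transition probabilities. Site coordinates and counts are invented placeholders.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Optimal transport between site counts at two time points, with squared
    # Euclidean distance as the cost; transition probabilities are read off the
    # optimal plan. All inputs are illustrative placeholders.

    coords = np.array([[0.0, 0.0], [2.0, 0.0], [1.0, 3.0]])   # site locations
    n_before = np.array([60.0, 30.0, 10.0])                    # counts at time 1
    n_after = np.array([40.0, 35.0, 25.0])                     # counts at time 2 (same total)

    k = len(coords)
    cost = ((coords[:, None, :] - coords[None, :, :]) ** 2).sum(axis=2).ravel()

    # Equality constraints: row sums of the plan equal n_before, column sums equal n_after.
    A_eq = np.zeros((2 * k, k * k))
    for i in range(k):
        A_eq[i, i * k:(i + 1) * k] = 1.0          # sum_j x[i, j] = n_before[i]
        A_eq[k + i, i::k] = 1.0                   # sum_i x[i, j] = n_after[i]
    b_eq = np.concatenate([n_before, n_after])

    res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
    plan = res.x.reshape(k, k)

    transition_probs = plan / n_before[:, None]   # P(move to j | started at i)
    print(np.round(transition_probs, 3))
    ```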

  4. How Much Will the Sea Level Rise? Outcome Selection and Subjective Probability in Climate Change Predictions.

    Science.gov (United States)

    Juanchich, Marie; Sirota, Miroslav

    2017-08-17

    We tested whether people focus on extreme outcomes to predict climate change and assessed the gap between the frequency of the predicted outcome and its perceived probability while controlling for climate change beliefs. We also tested 2 cost-effective interventions to reduce the preference for extreme outcomes and the frequency-probability gap by manipulating the probabilistic format: numerical or dual-verbal-numerical. In 4 experiments, participants read a scenario featuring a distribution of sea level rises, selected a sea rise to complete a prediction (e.g., "It is 'unlikely' that the sea level will rise . . . inches") and judged the likelihood of this sea rise occurring. Results showed that people have a preference for predicting extreme climate change outcomes in verbal predictions (59% in Experiments 1-4) and that this preference was not predicted by climate change beliefs. Results also showed an important gap between the predicted outcome frequency and participants' perception of the probability that it would occur. The dual-format reduced the preference for extreme outcomes for low and medium probability predictions but not for high ones, and none of the formats consistently reduced the frequency-probability gap. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  5. On estimating probability of presence from use-availability or presence-background data.

    Science.gov (United States)

    Phillips, Steven J; Elith, Jane

    2013-06-01

    A fundamental ecological modeling task is to estimate the probability that a species is present in (or uses) a site, conditional on environmental variables. For many species, available data consist of "presence" data (locations where the species [or evidence of it] has been observed), together with "background" data, a random sample of available environmental conditions. Recently published papers disagree on whether probability of presence is identifiable from such presence-background data alone. This paper aims to resolve the disagreement, demonstrating that additional information is required. We defined seven simulated species representing various simple shapes of response to environmental variables (constant, linear, convex, unimodal, S-shaped) and ran five logistic model-fitting methods using 1000 presence samples and 10 000 background samples; the simulations were repeated 100 times. The experiment revealed a stark contrast between two groups of methods: those based on a strong assumption that species' true probability of presence exactly matches a given parametric form had highly variable predictions and much larger RMS error than methods that take population prevalence (the fraction of sites in which the species is present) as an additional parameter. For six species, the former group grossly under- or overestimated probability of presence. The cause was not model structure or choice of link function, because all methods were logistic with linear and, where necessary, quadratic terms. Rather, the experiment demonstrates that an estimate of prevalence is not just helpful, but is necessary (except in special cases) for identifying probability of presence. We therefore advise against use of methods that rely on the strong assumption, due to Lele and Keim (recently advocated by Royle et al.) and Lancaster and Imbens. The methods are fragile, and their strong assumption is unlikely to be true in practice. We emphasize, however, that we are not arguing against

  6. The Effect of Computer-Assisted Teaching on Remedying Misconceptions: The Case of the Subject "Probability"

    Science.gov (United States)

    Gurbuz, Ramazan; Birgin, Osman

    2012-01-01

    The aim of this study is to determine the effects of computer-assisted teaching (CAT) on remedying misconceptions students often have regarding some probability concepts in mathematics. Toward this aim, computer-assisted teaching materials were developed and used in the process of teaching. Within the true-experimental research method, a pre- and…

  7. Estimating earthquake-induced failure probability and downtime of critical facilities.

    Science.gov (United States)

    Porter, Keith; Ramer, Kyle

    2012-01-01

    Fault trees have long been used to estimate failure risk in earthquakes, especially for nuclear power plants (NPPs). One interesting application is that one can assess and manage the probability that two facilities - a primary and backup - would be simultaneously rendered inoperative in a single earthquake. Another is that one can calculate the probabilistic time required to restore a facility to functionality, and the probability that, during any given planning period, the facility would be rendered inoperative for any specified duration. A large new peer-reviewed library of component damageability and repair-time data for the first time enables fault trees to be used to calculate the seismic risk of operational failure and downtime for a wide variety of buildings other than NPPs. With the new library, seismic risk of both the failure probability and probabilistic downtime can be assessed and managed, considering the facility's unique combination of structural and non-structural components, their seismic installation conditions, and the other systems on which the facility relies. An example is offered of real computer data centres operated by a California utility. The fault trees were created and tested in collaboration with utility operators, and the failure probability and downtime results validated in several ways.
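
    A minimal sketch of the fault-tree arithmetic for the primary-and-backup question discussed above, using invented component failure probabilities and assuming the two facilities respond independently (correlated ground motion, which the fault-tree approach can accommodate, would raise the joint probability):

        # Hypothetical per-event component failure probabilities; independence
        # between components and between facilities is assumed for simplicity.
        p_power = 0.05        # loss of off-site power
        p_generator = 0.10    # backup generator fails to start or run
        p_cooling = 0.02      # cooling system failure
        p_structure = 0.01    # structural or non-structural damage closes the room

        def or_gate(*probs):            # failure if any independent cut set occurs
            survive = 1.0
            for p in probs:
                survive *= (1.0 - p)
            return 1.0 - survive

        p_electrical = p_power * p_generator          # AND gate: both must fail
        p_primary_down = or_gate(p_electrical, p_cooling, p_structure)

        # With an identical, independently responding backup facility:
        p_both_down = p_primary_down ** 2
        print(f"P(primary down) = {p_primary_down:.4f}")
        print(f"P(primary and backup down) = {p_both_down:.6f}")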

  8. Tips for Teachers of Evidence-based Medicine: Clinical Prediction Rules (CPRs) and Estimating Pretest Probability

    Science.gov (United States)

    McGinn, Thomas; Jervis, Ramiro; Wisnivesky, Juan; Keitz, Sheri

    2008-01-01

    Background Clinical prediction rules (CPR) are tools that clinicians can use to predict the most likely diagnosis, prognosis, or response to treatment in a patient based on individual characteristics. CPRs attempt to standardize, simplify, and increase the accuracy of clinicians’ diagnostic and prognostic assessments. The teaching tips series is designed to give teachers advice and materials they can use to attain specific educational objectives. Educational Objectives In this article, we present 3 teaching tips aimed at helping clinical learners use clinical prediction rules and more accurately assess pretest probability in everyday practice. The first tip is designed to demonstrate variability in physician estimation of pretest probability. The second tip demonstrates how the estimate of pretest probability influences the interpretation of diagnostic tests and patient management. The third tip exposes learners to various examples and different types of Clinical Prediction Rules (CPR) and how to apply them in practice. Pilot Testing We field tested all 3 tips with 16 learners, a mix of interns and senior residents. Teacher preparatory time was approximately 2 hours. The field test utilized a board and a data projector; 3 handouts were prepared. The tips were felt to be clear and the educational objectives reached. Potential teaching pitfalls were identified. Conclusion Teaching with these tips will help physicians appreciate the importance of applying evidence to their everyday decisions. In 2 or 3 short teaching sessions, clinicians can also become familiar with the use of CPRs in applying evidence consistently in everyday practice. PMID:18491194
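
    The quantitative point behind the second tip can be shown with Bayes' theorem in odds form: the same test result maps to very different post-test probabilities depending on the pretest estimate. The likelihood ratio and pretest values below are illustrative and not taken from the article.

        def post_test_probability(pretest_p, likelihood_ratio):
            """Bayes' theorem in odds form: post-test odds = pretest odds * LR."""
            pretest_odds = pretest_p / (1.0 - pretest_p)
            post_odds = pretest_odds * likelihood_ratio
            return post_odds / (1.0 + post_odds)

        # A positive result of a test with LR+ = 8, applied to three different
        # pretest estimates for the same presentation (illustrative numbers):
        for pretest in (0.10, 0.30, 0.60):
            print(f"pretest {pretest:.2f} -> post-test {post_test_probability(pretest, 8.0):.2f}")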

  9. Estimating transmission probability in schools for the 2009 H1N1 influenza pandemic in Italy.

    Science.gov (United States)

    Clamer, Valentina; Dorigatti, Ilaria; Fumanelli, Laura; Rizzo, Caterina; Pugliese, Andrea

    2016-10-12

    Epidemic models are being extensively used to understand the main pathways of spread of infectious diseases, and thus to assess control methods. Schools are well known to represent hot spots for epidemic spread; hence, understanding typical patterns of infection transmission within schools is crucial for designing adequate control strategies. The attention that was given to the 2009 A/H1N1pdm09 flu pandemic has made it possible to collect detailed data on the occurrence of influenza-like illness (ILI) symptoms in two primary schools of Trento, Italy. The data collected in the two schools were used to calibrate a discrete-time SIR model, which was designed to estimate the probabilities of influenza transmission within the classes, grades and schools using Markov Chain Monte Carlo (MCMC) methods. We found that the virus was mainly transmitted within class, with lower levels of transmission between students in the same grade and even lower, though not significantly so, among different grades within the schools. We estimated median values of R0 from the epidemic curves in the two schools of 1.16 and 1.40; on the other hand, we estimated the average number of students infected by the first school case to be 0.85 and 1.09 in the two schools. The discrepancy between the values of R0 estimated from the epidemic curve or from the within-school transmission probabilities suggests that household and community transmission played an important role in sustaining the school epidemics. The high probability of infection between students in the same class confirms that targeting within-class transmission is key to controlling the spread of influenza in school settings and, as a consequence, in the general population.
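
    A minimal discrete-time (chain-binomial) sketch of the kind of within-school transmission structure described above; the class sizes, the seeding, the one-day infectious period and the transmission probabilities are invented for illustration, whereas the study estimates such probabilities from ILI data by MCMC.

        import numpy as np

        rng = np.random.default_rng(1)

        # 5 grades x 2 classes of 22 students; invented daily transmission
        # probabilities per infectious contact within class, grade and school.
        q_class, q_grade, q_school = 0.02, 0.005, 0.001
        classes = [(grade, 22) for grade in range(5) for _ in range(2)]

        S = np.array([size for _, size in classes])
        I = np.zeros(len(classes), dtype=int)
        R = np.zeros(len(classes), dtype=int)
        S[0] -= 1
        I[0] += 1                                   # seed one case in the first class

        for day in range(60):
            new_I = np.zeros_like(I)
            for i, (grade_i, _) in enumerate(classes):
                escape = 1.0                        # P(a susceptible escapes infection today)
                for j, (grade_j, _) in enumerate(classes):
                    q = q_class if j == i else (q_grade if grade_j == grade_i else q_school)
                    escape *= (1.0 - q) ** I[j]
                new_I[i] = rng.binomial(S[i], 1.0 - escape)
            R += I                                  # one-day infectious period (sketch only)
            S -= new_I
            I = new_I

        print("final epidemic size per class:", (R + I).tolist())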

  10. Subjective Quality Measurement of Speech Its Evaluation, Estimation and Applications

    CERN Document Server

    Kondo, Kazuhiro

    2012-01-01

    It is becoming crucial to accurately estimate and monitor speech quality in various ambient environments to guarantee high quality speech communication. This practical hands-on book shows speech intelligibility measurement methods so that the readers can start measuring or estimating speech intelligibility of their own system. The book also introduces subjective and objective speech quality measures, and describes in detail speech intelligibility measurement methods. It introduces a diagnostic rhyme test which uses rhyming word-pairs, and includes: An investigation into the effect of word familiarity on speech intelligibility. Speech intelligibility measurement of localized speech in virtual 3-D acoustic space using the rhyme test. Estimation of speech intelligibility using objective measures, including the ITU standard PESQ measures, and automatic speech recognizers.

  11. Estimation of the probability of exposure to machining fluids in a population-based case-control study.

    Science.gov (United States)

    Park, Dong-Uk; Colt, Joanne S; Baris, Dalsu; Schwenn, Molly; Karagas, Margaret R; Armenti, Karla R; Johnson, Alison; Silverman, Debra T; Stewart, Patricia A

    2014-01-01

    We describe an approach for estimating the probability that study subjects were exposed to metalworking fluids (MWFs) in a population-based case-control study of bladder cancer. Study subject reports on the frequency of machining and use of specific MWFs (straight, soluble, and synthetic/semi-synthetic) were used to estimate exposure probability when available. Those reports also were used to develop estimates for job groups, which were then applied to jobs without MWF reports. Estimates using both cases and controls and controls only were developed. The prevalence of machining varied substantially across job groups (0.1->0.9%), with the greatest percentage of jobs that machined being reported by machinists and tool and die workers. Reports of straight and soluble MWF use were fairly consistent across job groups (generally 50-70%). Synthetic MWF use was lower (13-45%). There was little difference in reports by cases and controls vs. controls only. Approximately, 1% of the entire study population was assessed as definitely exposed to straight or soluble fluids in contrast to 0.2% definitely exposed to synthetic/semi-synthetics. A comparison between the reported use of the MWFs and U.S. production levels found high correlations (r generally >0.7). Overall, the method described here is likely to have provided a systematic and reliable ranking that better reflects the variability of exposure to three types of MWFs than approaches applied in the past. [Supplementary materials are available for this article. Go to the publisher's online edition of Journal of Occupational and Environmental Hygiene for the following free supplemental resources: a list of keywords in the occupational histories that were used to link study subjects to the metalworking fluids (MWFs) modules; recommendations from the literature on selection of MWFs based on type of machining operation, the metal being machined and decade; popular additives to MWFs; the number and proportion of controls who

  12. Dynamic Estimation of the Probability of Patient Readmission to the ICU using Electronic Medical Records.

    Science.gov (United States)

    Caballero, Karla; Akella, Ram

    2015-01-01

    In this paper, we propose a framework to dynamically estimate the probability that a patient is readmitted after he is discharged from the ICU and transferred to a lower level care. We model this probability as a latent state which evolves over time using Dynamical Linear Models (DLM). We use as an input a combination of numerical and text features obtained from the patient Electronic Medical Records (EMRs). We process the text from the EMRs to capture different diseases, symptoms and treatments by means of noun phrases and ontologies. We also capture the global context of each text entry using Statistical Topic Models. We fill out the missing values using a Expectation Maximization based method (EM). Experimental results show that our method outperforms other methods in the literature terms of AUC, sensitivity and specificity. In addition, we show that the combination of different features (numerical and text) increases the prediction performance of the proposed approach.
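
    A heavily simplified sketch of the underlying idea: track a latent readmission-risk state with a one-dimensional dynamic linear model (a Kalman filter) and map it to a probability with a logistic link. The daily scores and variances below are invented; the paper's model combines many numeric and text-derived EMR features and uses EM-based imputation, none of which is reproduced here.

        import numpy as np

        def kalman_filter(scores, obs_var=0.5, state_var=0.05):
            """Track a latent risk state as a random walk observed with noise."""
            m, c = 0.0, 1.0                      # prior mean and variance of the state
            means = []
            for y in scores:
                c_pred = c + state_var           # predict: state evolves as a random walk
                gain = c_pred / (c_pred + obs_var)
                m = m + gain * (y - m)           # update with the day's observation
                c = (1.0 - gain) * c_pred
                means.append(m)
            return np.array(means)

        daily_scores = np.array([-1.2, -0.8, -0.9, -0.1, 0.4, 0.9])   # hypothetical
        latent = kalman_filter(daily_scores)
        readmission_prob = 1.0 / (1.0 + np.exp(-latent))              # logistic link
        print(np.round(readmission_prob, 2))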

  13. Estimating the Upper Limit of Lifetime Probability Distribution, Based on Data of Japanese Centenarians.

    Science.gov (United States)

    Hanayama, Nobutane; Sibuya, Masaaki

    2016-08-01

    In modern biology, theories of aging fall mainly into two groups: damage theories and programmed theories. If programmed theories are true, the probability that human beings live beyond a specific age will be zero. In contrast, if damage theories are true, such an age does not exist, and any longevity record will eventually be broken. In this article, to examine which is the case, a special type of binomial model based on the generalized Pareto distribution has been applied to data on Japanese centenarians. From the results, the upper limit of the lifetime probability distribution in the Japanese population is estimated to be 123 years. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
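
    The generalized-Pareto reasoning can be sketched as follows: fit the exceedances of lifetimes over a high threshold and, if the fitted shape parameter is negative, read off a finite upper endpoint. The synthetic excess data below are constructed to have an endpoint near 122 years purely for illustration; this is not the article's binomial formulation or its data.

        import numpy as np
        from scipy.stats import genpareto

        rng = np.random.default_rng(2)

        threshold = 100.0                               # age threshold (centenarians)
        # synthetic excess lifetimes, bounded near 122 years by construction
        excesses = 22.0 * rng.beta(a=1.0, b=4.0, size=500)

        shape, loc, scale = genpareto.fit(excesses, floc=0.0)
        if shape < 0:
            upper_limit = threshold - scale / shape     # finite endpoint of the fitted GPD
            print(f"shape = {shape:.3f}, estimated lifetime upper limit ≈ {upper_limit:.1f} years")
        else:
            print(f"shape = {shape:.3f} >= 0: the fit implies no finite upper limit")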

  14. Application of the Unbounded Probability Distribution of the Johnson System for Floods Estimation

    Directory of Open Access Journals (Sweden)

    Campos-Aranda Daniel Francisco

    2015-09-01

    Full Text Available Flood design estimates are key to sizing new water works and to reviewing the hydrological security of existing ones. The most reliable method for estimating their magnitudes associated with certain return periods is to fit a probabilistic model to available records of maximum annual flows. Since such a model is at first unknown, several models need to be tested in order to select the most appropriate one according to an arbitrary statistical index, commonly the standard error of fit. Several probability distributions have shown versatility and consistency of results when processing flood records and, therefore, their application has been established as a norm or precept. The Johnson system has three families of distributions, one of which is the Log-Normal model with three fit parameters, which is also the border between the bounded distributions and those with no upper limit. These families of distributions have four adjustment parameters and converge to the standard normal distribution, so that their predictions are obtained with such a model. Having contrasted the three probability distributions established by precept on 31 historical records of hydrological events, the Johnson system is applied to the same data. The results of the unbounded distribution of the Johnson system (SJU) are compared to the optimal results from the three distributions. It was found that the predictions of the SJU distribution are similar to those obtained with the other models for low return periods (up to 1000 years). Because of its theoretical support, the SJU model is recommended for flood estimation.
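
    A sketch of the practical use of the unbounded Johnson (SU) distribution for flood estimation: fit a synthetic annual-maximum series and read off design floods at a few return periods. The record and the maximum-likelihood fitting shortcut are assumptions, not the article's 31 historical series or its fitting procedure.

        import numpy as np
        from scipy.stats import johnsonsu

        rng = np.random.default_rng(3)

        # synthetic annual maximum flows (m^3/s), standing in for a gauged record
        annual_max_flow = rng.gumbel(loc=300.0, scale=120.0, size=60)

        params = johnsonsu.fit(annual_max_flow)          # (a, b, loc, scale)

        for T in (10, 100, 1000):                        # return periods in years
            design_flood = johnsonsu.ppf(1.0 - 1.0 / T, *params)
            print(f"T = {T:4d} yr  design flood ≈ {design_flood:7.1f} m^3/s")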

  15. Estimating occurrence and detection probabilities for stream-breeding salamanders in the Gulf Coastal Plain

    Science.gov (United States)

    Lamb, Jennifer Y.; Waddle, J. Hardin; Qualls, Carl P.

    2017-01-01

    Large gaps exist in our knowledge of the ecology of stream-breeding plethodontid salamanders in the Gulf Coastal Plain. Data describing where these salamanders are likely to occur along environmental gradients, as well as their likelihood of detection, are important for the prevention and management of amphibian declines. We used presence/absence data from leaf litter bag surveys and a hierarchical Bayesian multispecies single-season occupancy model to estimate the occurrence of five species of plethodontids across reaches in headwater streams in the Gulf Coastal Plain. Average detection probabilities were high (range = 0.432–0.942) and unaffected by sampling covariates specific to the use of litter bags (i.e., bag submergence, sampling season, in-stream cover). Estimates of occurrence probabilities differed substantially between species (range = 0.092–0.703) and were influenced by the size of the upstream drainage area and by the maximum proportion of the reach that dried. The effects of these two factors were not equivalent across species. Our results demonstrate that hierarchical multispecies models successfully estimate occurrence parameters for both rare and common stream-breeding plethodontids. The resulting models clarify how species are distributed within stream networks, and they provide baseline values that will be useful in evaluating the conservation statuses of plethodontid species within lotic systems in the Gulf Coastal Plain.
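
    The occupancy-modeling idea can be sketched in its simplest single-species, constant-parameter form, estimating occurrence probability psi and detection probability p by maximum likelihood from simulated detection histories; the hierarchical Bayesian multispecies model with covariates used in the study is considerably richer than this.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.special import expit

        rng = np.random.default_rng(4)

        n_sites, n_visits, psi_true, p_true = 200, 4, 0.45, 0.6
        z = rng.binomial(1, psi_true, n_sites)            # true occupancy state
        y = rng.binomial(n_visits, p_true * z)            # detections per site

        def neg_log_lik(theta):
            psi, p = expit(theta)                         # keep both parameters in (0, 1)
            detected = y > 0
            ll_det = (np.log(psi) + y[detected] * np.log(p)
                      + (n_visits - y[detected]) * np.log(1.0 - p))
            ll_none = np.log(psi * (1.0 - p) ** n_visits + (1.0 - psi))
            return -(ll_det.sum() + (~detected).sum() * ll_none)

        fit = minimize(neg_log_lik, x0=np.zeros(2), method="Nelder-Mead")
        psi_hat, p_hat = expit(fit.x)
        print(f"occupancy psi ≈ {psi_hat:.2f}, detection p ≈ {p_hat:.2f}")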

  16. Spencer-Brown vs. Probability and Statistics: Entropy’s Testimony on Subjective and Objective Randomness

    Directory of Open Access Journals (Sweden)

    Julio Michael Stern

    2011-04-01

    Full Text Available This article analyzes the role of entropy in Bayesian statistics, focusing on its use as a tool for detection, recognition and validation of eigen-solutions. “Objects as eigen-solutions” is a key metaphor of the cognitive constructivism epistemological framework developed by the philosopher Heinz von Foerster. Special attention is given to some objections to the concepts of probability, statistics and randomization posed by George Spencer-Brown, a figure of great influence in the field of radical constructivism.

  17. Estimating the probability of allelic drop-out of STR alleles in forensic genetics

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Mogensen, Helle Smidt

    2009-01-01

    In crime cases with available DNA evidence, the amount of DNA is often sparse due to the setting of the crime. In such cases, allelic drop-out of one or more true alleles in STR typing is possible. We present a statistical model for estimating the per locus and overall probability of allelic drop-out using the results of all STR loci in the case sample as reference. The methodology of logistic regression is appropriate for this analysis, and we demonstrate how to incorporate this in a forensic genetic framework....

  18. Estimating Probable Maximum Precipitation by Considering Combined Effect of Typhoon and Southwesterly Air Flow

    Directory of Open Access Journals (Sweden)

    Cheng-Chin Liu

    2016-01-01

    Full Text Available Typhoon Morakot hit southern Taiwan in 2009, bringing 48 hr of heavy rainfall [close to the Probable Maximum Precipitation (PMP)] to the Tsengwen Reservoir catchment. This extreme rainfall event resulted from the combined (co-movement) effect of two climate systems (i.e., typhoon and southwesterly air flow). Based on the traditional PMP estimation method (i.e., the storm transposition method, STM), two PMP estimation approaches that consider the combined effect are proposed in this work: the Amplification Index (AI) and Independent System (IS) approaches. The AI approach assumes that the southwesterly air flow precipitation in a typhoon event could reach its maximum value. The IS approach assumes that the typhoon and southwesterly air flow are independent weather systems. Based on these assumptions, calculation procedures for the two approaches were constructed for a case study on the Tsengwen Reservoir catchment. The results show that the PMP estimates for 6- to 60-hr durations using the two approaches are approximately 30% larger than the PMP estimates using the traditional STM without considering the combined effect. This work pioneers a PMP estimation method that considers the combined effect of a typhoon and southwesterly air flow. Further studies on this issue are essential and encouraged.

  19. Estimated Probability of a Cervical Spine Injury During an ISS Mission

    Science.gov (United States)

    Brooker, John E.; Weaver, Aaron S.; Myers, Jerry G.

    2013-01-01

    Introduction: The Integrated Medical Model (IMM) utilizes historical data, cohort data, and external simulations as input factors to provide estimates of crew health, resource utilization and mission outcomes. The Cervical Spine Injury Module (CSIM) is an external simulation designed to provide the IMM with parameter estimates for 1) a probability distribution function (PDF) of the incidence rate, 2) the mean incidence rate, and 3) the standard deviation associated with the mean resulting from injury/trauma of the neck. Methods: An injury mechanism based on an idealized low-velocity blunt impact to the superior posterior thorax of an ISS crewmember was used as the simulated mission environment. As a result of this impact, the cervical spine is inertially loaded from the mass of the head producing an extension-flexion motion deforming the soft tissues of the neck. A multibody biomechanical model was developed to estimate the kinematic and dynamic response of the head-neck system from a prescribed acceleration profile. Logistic regression was performed on a dataset containing AIS1 soft tissue neck injuries from rear-end automobile collisions with published Neck Injury Criterion values producing an injury transfer function (ITF). An injury event scenario (IES) was constructed such that crew 1 is moving through a primary or standard translation path transferring large volume equipment impacting stationary crew 2. The incidence rate for this IES was estimated from in-flight data and used to calculate the probability of occurrence. The uncertainties in the model input factors were estimated from representative datasets and expressed in terms of probability distributions. A Monte Carlo Method utilizing simple random sampling was employed to propagate both aleatory and epistemic uncertain factors. Scatterplots and partial correlation coefficients (PCC) were generated to determine input factor sensitivity. CSIM was developed in the SimMechanics/Simulink environment with a

  20. Estimating the Probability of Electrical Short Circuits from Tin Whiskers. Part 2

    Science.gov (United States)

    Courey, Karim J.; Asfour, Shihab S.; Onar, Arzu; Bayliss, Jon A.; Ludwig, Larry L.; Wright, Maria C.

    2009-01-01

    To comply with lead-free legislation, many manufacturers have converted from tin-lead to pure tin finishes of electronic components. However, pure tin finishes have a greater propensity to grow tin whiskers than tin-lead finishes. Since tin whiskers present an electrical short circuit hazard in electronic components, simulations have been developed to quantify the risk of said short circuits occurring. Existing risk simulations make the assumption that when a free tin whisker has bridged two adjacent exposed electrical conductors, the result is an electrical short circuit. This conservative assumption is made because shorting is a random event that had an unknown probability associated with it. Note however that due to contact resistance electrical shorts may not occur at lower voltage levels. In our first article we developed an empirical probability model for tin whisker shorting. In this paper, we develop a more comprehensive empirical model using a refined experiment with a larger sample size, in which we studied the effect of varying voltage on the breakdown of the contact resistance which leads to a short circuit. From the resulting data we estimated the probability distribution of an electrical short, as a function of voltage.

  1. First-passage Probability Estimation of an Earthquake Response of Seismically Isolated Containment Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Hahm, Dae-Gi [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Park, Kwan-Soon [Dongguk University, Seoul (Korea, Republic of); Koh, Hyun-Moo [Seoul National Univ., Seoul (Korea, Republic of)

    2008-10-15

    The awareness of seismic hazard and risk has increased rapidly with the frequent occurrence of huge earthquakes such as the 2008 Sichuan earthquake, which caused about 70,000 confirmed casualties and an economic loss of about 20 billion U.S. dollars. Since earthquake loads naturally contain various uncertainties, the safety of a structural system under earthquake excitation has been assessed by probabilistic approaches. In many structural applications for a probabilistic safety assessment, it is often regarded that the failure of a system will occur when the response of the structure first crosses the limit barrier within a specified interval of time. The determination of such a failure probability is usually called the 'first-passage problem' and has been extensively studied during the last few decades. However, especially for structures that show significant nonlinear dynamic behavior, an effective and accurate method for the estimation of such a failure probability is not yet fully established. In this study, we present a new approach to evaluate the first-passage probability of the earthquake response of seismically isolated structures. The proposed method is applied to the seismic isolation system for the containment buildings of a nuclear power plant. From the numerical example, we verified that the proposed method gives accurate results with greater computational efficiency than the conventional approaches.

  2. Estimating the Probability of Electrical Short Circuits from Tin Whiskers. Part 2

    Science.gov (United States)

    Courey, Karim J.; Asfour, Shihab S.; Onar, Arzu; Bayliss, Jon A.; Ludwig, Larry L.; Wright, Maria C.

    2010-01-01

    To comply with lead-free legislation, many manufacturers have converted from tin-lead to pure tin finishes of electronic components. However, pure tin finishes have a greater propensity to grow tin whiskers than tin-lead finishes. Since tin whiskers present an electrical short circuit hazard in electronic components, simulations have been developed to quantify the risk of said short circuits occurring. Existing risk simulations make the assumption that when a free tin whisker has bridged two adjacent exposed electrical conductors, the result is an electrical short circuit. This conservative assumption is made because shorting is a random event that had an unknown probability associated with it. Note however that due to contact resistance electrical shorts may not occur at lower voltage levels. In our first article we developed an empirical probability model for tin whisker shorting. In this paper, we develop a more comprehensive empirical model using a refined experiment with a larger sample size, in which we studied the effect of varying voltage on the breakdown of the contact resistance which leads to a short circuit. From the resulting data we estimated the probability distribution of an electrical short, as a function of voltage. In addition, the unexpected polycrystalline structure seen in the focused ion beam (FIB) cross section in the first experiment was confirmed in this experiment using transmission electron microscopy (TEM). The FIB was also used to cross section two card guides to facilitate the measurement of the grain size of each card guide's tin plating to determine its finish .

  3. Estimation of (n,f) Cross-Sections by Measuring Reaction Probability Ratios

    Energy Technology Data Exchange (ETDEWEB)

    Plettner, C; Ai, H; Beausang, C W; Bernstein, L A; Ahle, L; Amro, H; Babilon, M; Burke, J T; Caggiano, J A; Casten, R F; Church, J A; Cooper, J R; Crider, B; Gurdal, G; Heinz, A; McCutchan, E A; Moody, K; Punyon, J A; Qian, J; Ressler, J J; Schiller, A; Williams, E; Younes, W

    2005-04-21

    Neutron-induced reaction cross-sections on unstable nuclei are inherently difficult to measure due to target activity and the low intensity of neutron beams. In an alternative approach, named the 'surrogate' technique, one measures the decay probability of the same compound nucleus produced using a stable beam on a stable target to estimate the neutron-induced reaction cross-section. As an extension of the surrogate method, in this paper the authors introduce a new technique of measuring the fission probabilities of two different compound nuclei as a ratio, which has the advantage of removing most of the systematic uncertainties. This method was benchmarked in this report by measuring the probability of deuteron-induced fission events in coincidence with protons, and forming the ratio P(²³⁶U(d,pf))/P(²³⁸U(d,pf)), which serves as a surrogate for the known cross-section ratio of ²³⁶U(n,f)/²³⁸U(n,f). In addition, the P(²³⁸U(d,d′f))/P(²³⁶U(d,d′f)) ratio as a surrogate for the ²³⁷U(n,f)/²³⁵U(n,f) cross-section ratio was measured for the first time over an unprecedented range of excitation energies.

  4. Development of a statistical tool for the estimation of riverbank erosion probability

    Science.gov (United States)

    Varouchakis, Emmanouil

    2016-04-01

    Riverbank erosion affects river morphology and local habitat, and results in riparian land loss, property and infrastructure damage, and ultimately flood defence weakening. An important issue concerning riverbank erosion is the identification of the vulnerable areas in order to predict river changes and assist stream management/restoration. An approach to predict areas vulnerable to erosion is to quantify the erosion probability by identifying the underlying relations between riverbank erosion and geomorphological or hydrological variables that prevent or stimulate erosion. In the present work, an innovative statistical methodology is proposed to predict the probability of presence or absence of erosion in a river section. A physically based model determines the locations vulnerable to erosion by quantifying the potential eroded area. The derived results are used to determine validation locations for the evaluation of the statistical tool performance. The statistical tool is based on a series of independent local variables and employs the Logistic Regression methodology. It is developed in two forms, Logistic Regression and Locally Weighted Logistic Regression, which both deliver useful and accurate results. The second form, though, provides the most accurate results, as it validates the presence or absence of erosion at all validation locations. The proposed tool is easy to use, accurate and can be applied to any region and river. Varouchakis, E. A., Giannakis, G. V., Lilli, M. A., Ioannidou, E., Nikolaidis, N. P., and Karatzas, G. P.: Development of a statistical tool for the estimation of riverbank erosion probability, SOIL (EGU), in print, 2016.

  5. Estimating multidimensional probability fields using the Field Estimator for Arbitrary Spaces (FiEstAS) with applications to astrophysics

    Science.gov (United States)

    Ascasibar, Yago

    2010-08-01

    The Field Estimator for Arbitrary Spaces (FiEstAS) computes the continuous probability density field underlying a given discrete data sample in multiple, non-commensurate dimensions. The algorithm works by constructing a metric-independent tessellation of the data space based on a recursive binary splitting. Individual, data-driven bandwidths are assigned to each point, scaled so that a constant “mass”M is enclosed. Kernel density estimation may then be performed for different kernel shapes, and a combination of balloon and sample point estimators is proposed as a compromise between resolution and variance. A bias correction is evaluated for the particular (yet common) case where the density is computed exactly at the locations of the data points rather than at an uncorrelated set of locations. By default, the algorithm combines a top-hat kernel with M=2.0 with the balloon estimator and applies the corresponding bias correction. These settings are shown to yield reasonable results for a simple test case, a two-dimensional ring, that illustrates the performance for oblique distributions, as well as for a six-dimensional Hernquist sphere, a fairly realistic model of the dynamical structure of stellar bulges in galaxies and dark matter haloes in cosmological N-body simulations. Results for different parameter settings are discussed in order to provide a guideline to select an optimal configuration in other cases. Source code is available upon request.

  6. Estimated probability of postwildfire debris flows in the 2012 Whitewater-Baldy Fire burn area, southwestern New Mexico

    Science.gov (United States)

    Tillery, Anne C.; Matherne, Anne Marie; Verdin, Kristine L.

    2012-01-01

    Creek, Iron Creek, and West Fork Mogollon Creek. Drainage basins with estimated debris-flow volumes greater than 100,000 m3 for the 25-year-recurrence event, 24 percent of the basins modeled, also include tributaries to Deep Creek, Mineral Creek, Gilita Creek, West Fork Gila River, Mogollon Creek, and Turkey Creek, among others. Basins with the highest combined probability and volume relative hazard rankings for the 25-year-recurrence rainfall include tributaries to Whitewater Creek, Mineral Creek, Willow Creek, West Fork Gila River, West Fork Mogollon Creek, and Turkey Creek. Debris flows from Whitewater, Mineral, and Willow Creeks could affect the southwestern New Mexico communities of Glenwood, Alma, and Willow Creek. The maps presented herein may be used to prioritize areas where emergency erosion mitigation or other protective measures may be necessary within a 2- to 3-year period of vulnerability following the Whitewater-Baldy Fire. This work is preliminary and is subject to revision. It is being provided because of the need for timely "best science" information. The assessment herein is provided on the condition that neither the U.S. Geological Survey nor the U.S. Government may be held liable for any damages resulting from the authorized or unauthorized use of the assessment.

  7. Overestimating HIV infection: The construction and accuracy of subjective probabilities of HIV infection in rural Malawi

    OpenAIRE

    Anglewicz, Philip; Kohler, Hans-Peter

    2009-01-01

    In the absence of HIV testing, how do rural Malawians assess their HIV status? In this paper, we use a unique dataset that includes respondents' HIV status as well as their subjective likelihood of HIV infection. These data show that many rural Malawians overestimate their likelihood of current HIV infection. The discrepancy between actual and perceived status raises an important question: Why are so many wrong? We begin by identifying determinants of self-assessed HIV status, and then compar...

  8. Estimating probabilities of peptide database identifications to LC-FTICR-MS observations

    Directory of Open Access Journals (Sweden)

    Daly Don S

    2006-02-01

    Full Text Available Abstract Background The field of proteomics involves the characterization of the peptides and proteins expressed in a cell under specific conditions. Proteomics has made rapid advances in recent years following the sequencing of the genomes of an increasing number of organisms. A prominent technology for high throughput proteomics analysis is the use of liquid chromatography coupled to Fourier transform ion cyclotron resonance mass spectrometry (LC-FTICR-MS. Meaningful biological conclusions can best be made when the peptide identities returned by this technique are accompanied by measures of accuracy and confidence. Methods After a tryptically digested protein mixture is analyzed by LC-FTICR-MS, the observed masses and normalized elution times of the detected features are statistically matched to the theoretical masses and elution times of known peptides listed in a large database. The probability of matching is estimated for each peptide in the reference database using statistical classification methods assuming bivariate Gaussian probability distributions on the uncertainties in the masses and the normalized elution times. Results A database of 69,220 features from 32 LC-FTICR-MS analyses of a tryptically digested bovine serum albumin (BSA sample was matched to a database populated with 97% false positive peptides. The percentage of high confidence identifications was found to be consistent with other database search procedures. BSA database peptides were identified with high confidence on average in 14.1 of the 32 analyses. False positives were identified on average in just 2.7 analyses. Conclusion Using a priori probabilities that contrast peptides from expected and unexpected proteins was shown to perform better in identifying target peptides than using equally likely a priori probabilities. This is because a large percentage of the target peptides were similar to unexpected peptides which were included to be false positives. The use of

  9. Accuracy of clinicians and models for estimating the probability that a pulmonary nodule is malignant.

    Science.gov (United States)

    Balekian, Alex A; Silvestri, Gerard A; Simkovich, Suzanne M; Mestaz, Peter J; Sanders, Gillian D; Daniel, Jamie; Porcel, Jackie; Gould, Michael K

    2013-12-01

    Management of pulmonary nodules depends critically on the probability of malignancy. Models to estimate probability have been developed and validated, but most clinicians rely on judgment. The aim of this study was to compare the accuracy of clinical judgment with that of two prediction models. Physician participants reviewed up to five clinical vignettes, selected at random from a larger pool of 35 vignettes, all based on actual patients with lung nodules of known final diagnosis. Vignettes included clinical information and a representative slice from computed tomography. Clinicians estimated the probability of malignancy for each vignette. To examine agreement with models, we calculated intraclass correlation coefficients (ICC) and kappa statistics. To examine accuracy, we compared areas under the receiver operator characteristic curve (AUC). Thirty-six participants completed 179 vignettes, 47% of which described patients with malignant nodules. Agreement between participants and models was fair for the Mayo Clinic model (ICC, 0.37; 95% confidence interval [CI], 0.23-0.50) and moderate for the Veterans Affairs model (ICC, 0.46; 95% CI, 0.34-0.57). There was no difference in accuracy between participants (AUC, 0.70; 95% CI, 0.62-0.77) and the Mayo Clinic model (AUC, 0.71; 95% CI, 0.62-0.80; P = 0.90) or the Veterans Affairs model (AUC, 0.72; 95% CI, 0.64-0.80; P = 0.54). In this vignette-based study, clinical judgment and models appeared to have similar accuracy for lung nodule characterization, but agreement between judgment and the models was modest, suggesting that qualitative and quantitative approaches may provide complementary information.

  10. Modeling the relationship between most probable number (MPN) and colony-forming unit (CFU) estimates of fecal coliform concentration.

    Science.gov (United States)

    Gronewold, Andrew D; Wolpert, Robert L

    2008-07-01

    Most probable number (MPN) and colony-forming-unit (CFU) estimates of fecal coliform bacteria concentration are common measures of water quality in coastal shellfish harvesting and recreational waters. Estimating procedures for MPN and CFU have intrinsic variability and are subject to additional uncertainty arising from minor variations in experimental protocol. It has been observed empirically that the standard multiple-tube fermentation (MTF) decimal dilution analysis MPN procedure is more variable than the membrane filtration CFU procedure, and that MTF-derived MPN estimates are somewhat higher on average than CFU estimates, on split samples from the same water bodies. We construct a probabilistic model that provides a clear theoretical explanation for the variability in, and discrepancy between, MPN and CFU measurements. We then compare our model to water quality samples analyzed using both MPN and CFU procedures, and find that the (often large) observed differences between MPN and CFU values for the same water body are well within the ranges predicted by our probabilistic model. Our results indicate that MPN and CFU intra-sample variability does not stem from human error or laboratory procedure variability, but is instead a simple consequence of the probabilistic basis for calculating the MPN. These results demonstrate how probabilistic models can be used to compare samples from different analytical procedures, and to determine whether transitions from one procedure to another are likely to cause a change in quality-based management decisions.
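
    The probabilistic basis of the MPN referred to above can be made concrete: the MPN is the concentration that maximizes the likelihood of the observed pattern of positive tubes across dilutions. The 3-dilution, 5-tube design and the observed pattern below are just an example.

        import numpy as np
        from scipy.optimize import minimize_scalar

        volumes = np.array([10.0, 1.0, 0.1])     # mL of sample per tube at each dilution
        n_tubes = np.array([5, 5, 5])            # tubes per dilution
        positives = np.array([5, 3, 1])          # observed positive tubes

        def neg_log_lik(conc):                   # conc in organisms per mL
            p_pos = 1.0 - np.exp(-conc * volumes)          # P(tube positive) per dilution
            p_pos = np.clip(p_pos, 1e-12, 1.0 - 1e-12)
            ll = positives * np.log(p_pos) + (n_tubes - positives) * np.log(1.0 - p_pos)
            return -ll.sum()

        res = minimize_scalar(neg_log_lik, bounds=(1e-4, 100.0), method="bounded")
        print(f"MPN ≈ {100.0 * res.x:.0f} per 100 mL")     # conventional reporting units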

  11. Estimation of Probable Maximum Precipitation in Korea using a Regional Climate Model

    Directory of Open Access Journals (Sweden)

    Jeonghoon Lee

    2017-03-01

    Full Text Available Estimates of extreme precipitation events have been extensively applied to the design of social infrastructure. Thus, a method to more scientifically estimate the extreme event is required. This paper suggests a method to estimate the extreme precipitation in Korea using a regional climate model. First, several historical extreme events are identified and the most extreme event of Typhoon Rusa (2002) is selected. Second, the selected event is reconstructed through the Weather Research and Forecasting (WRF) model, one of the Regional Climate Models (RCMs). Third, the reconstructed event is maximized by adjusting initial and boundary conditions. Finally, the Probable Maximum Precipitation (PMP) is obtained. The WRF could successfully simulate the observed precipitation in terms of spatial and temporal distribution (R² = 0.81). The combination of the WRF Single-Moment (WSM) 6-class graupel scheme (microphysics), the Betts-Miller-Janjic scheme (cumulus parameterization) and the Mellor-Yamada-Janjic Turbulent Kinetic Energy (TKE) scheme (planetary boundary layer) was determined to be the best combination to reconstruct Typhoon Rusa. The estimated PMP (RCM_PMP) was compared with the existing PMP. The RCM_PMP was generally in good agreement with the PMP. The suggested methodology is expected to provide assessments of the existing PMP and to provide a new alternative for estimating PMP.

  12. Estimating the probability of arsenic occurrence in domestic wells in the United States

    Science.gov (United States)

    Ayotte, J.; Medalie, L.; Qi, S.; Backer, L. F.; Nolan, B. T.

    2016-12-01

    Approximately 43 million people (about 14 percent of the U.S. population) rely on privately owned domestic wells as their source of drinking water. Unlike public water systems, which are regulated by the Safe Drinking Water Act, there is no comprehensive national program to ensure that the water from domestic wells is routinely tested and that it is safe to drink. A study published in 2009 from the National Water-Quality Assessment Program of the U.S. Geological Survey assessed water-quality conditions from 2,100 domestic wells within 48 states and reported that more than one in five (23 percent) of the sampled wells contained one or more contaminants at a concentration greater than a human-health benchmark. In addition, there are many activities such as resource extraction, climate change-induced drought, and changes in land use patterns that could potentially affect the quality of the ground water source for domestic wells. The Health Studies Branch (HSB) of the National Center for Environmental Health, Centers for Disease Control and Prevention, created a Clean Water for Health Program to help address domestic well concerns. The goals of this program are to identify emerging public health issues associated with using domestic wells for drinking water and develop plans to address these issues. As part of this effort, HSB in cooperation with the U.S. Geological Survey has created probability models to estimate the probability of arsenic occurring at various concentrations in domestic wells in the U.S. We will present preliminary results of the project, including estimates of the population supplied by domestic wells that is likely to have arsenic greater than 10 micrograms per liter. Nationwide, we estimate this to be just over 2 million people. Logistic regression model results showing probabilities of arsenic greater than the Maximum Contaminant Level for public supply wells of 10 micrograms per liter in domestic wells in the U.S., based on data for arsenic

  13. Passive systems failure probability estimation by the meta-AK-IS² algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Cadini, Francesco, E-mail: francesco.cadini@polimi.it [Dipartimento di Energia, Politecnico di Milano, via Ponzio 34/3, 20133 Milano (Italy); Santos, Francisco [Dipartimento di Energia, Politecnico di Milano, via Ponzio 34/3, 20133 Milano (Italy); Departamento de Energía Nuclear, Politécnica de Madrid, C/ José Gutiérrez Abascal 2, 28006 Madrid (Spain); Zio, Enrico [Dipartimento di Energia, Politecnico di Milano, via Ponzio 34/3, 20133 Milano (Italy); Chair on Systems Science and Energetic Challenge, European Foundation for New Energy-Electricité de France, Ecole Centrale Paris and Supelec, Grande Voie des Vignes, 92295 Chatenay-Malabry Cedex (France)

    2014-10-01

    Highlights: • Many future nuclear reactor concepts rely on passive safety systems. • Uncertainties in physical behavior may give rise to functional failures. • Passive system failures are rare events difficult to estimate by crude Monte Carlo. • We propose a kriging-based importance sampling for estimating failure probabilities. • We compare the results with other variance reduction-based methods of the literature. - Abstract: Simplicity of design and independence from external inputs make passive safety systems very attractive both from the economic and safety points of view, for the development of future nuclear reactor concepts. On the other hand, concerns arise due to the not fully understood physical phenomena underlying their (passive) functioning and the scarce operating experience to characterize them, which can give rise to functional failures due to deviations from their modeled, expected behavior. The estimation of the probabilities of these failures requires the propagation of uncertainties in the models of the passive system functions, which can be done by classical, crude Monte Carlo schemes. However, the passive system design is such that failure is a rare event, which renders these approaches often impractical due to the computational efforts involved in the repetition of runs of the computer codes numerically encoding the system model. In order to overcome this problem, in this paper we propose to apply the meta-AK-IS² algorithm, previously introduced by the authors for obtaining improved computational efficiencies by coupling a kriging-based metamodel to an MC-based importance sampling strategy. The method is developed and demonstrated with reference to a case study of a natural convection-based cooling system of a gas-cooled fast reactor, operating under a post-loss-of-coolant accident (LOCA). A comparison is made with respect to other variance reduction-based methods of the literature.
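
    The importance-sampling half of the approach can be illustrated on a toy limit state (the kriging metamodel, which is the other half, is omitted): a rare failure probability is estimated by sampling from a proposal centred in the failure region and reweighting by the density ratio. The limit state and the proposal below are assumptions for illustration only.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)

        def failed(g):                       # toy limit state: failure when margin g < -4
            return g < -4.0

        n = 20_000

        # crude Monte Carlo: failures are so rare that few or none are observed
        p_crude = failed(rng.standard_normal(n)).mean()

        # importance sampling: propose from N(-4, 1) and reweight by the density ratio
        g = rng.normal(loc=-4.0, scale=1.0, size=n)
        weights = stats.norm.pdf(g) / stats.norm.pdf(g, loc=-4.0, scale=1.0)
        p_is = np.mean(failed(g) * weights)

        print(f"crude MC: {p_crude:.2e}, importance sampling: {p_is:.2e}, "
              f"exact: {stats.norm.cdf(-4.0):.2e}")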

  14. METAPHOR: a machine-learning-based method for the probability density estimation of photometric redshifts

    Science.gov (United States)

    Cavuoti, S.; Amaro, V.; Brescia, M.; Vellucci, C.; Tortora, C.; Longo, G.

    2017-02-01

    A variety of fundamental astrophysical science topics require the determination of very accurate photometric redshifts (photo-z). A plethora of methods have been developed, based either on template-model fitting or on empirical explorations of the photometric parameter space. Machine-learning-based techniques are not explicitly dependent on physical priors and are able to produce accurate photo-z estimations within the photometric ranges derived from the spectroscopic training set. These estimates, however, are not easy to characterize in terms of a photo-z probability density function (PDF), due to the fact that the analytical relation mapping the photometric parameters on to the redshift space is virtually unknown. We present METAPHOR (Machine-learning Estimation Tool for Accurate PHOtometric Redshifts), a method designed to provide a reliable PDF of the error distribution for empirical techniques. The method is implemented as a modular workflow, whose internal engine for photo-z estimation makes use of the MLPQNA neural network (Multi Layer Perceptron with Quasi Newton learning rule), with the possibility of easily replacing the specific machine-learning model chosen to predict photo-z. We present a summary of results on SDSS-DR9 galaxy data, used also to perform a direct comparison with PDFs obtained by the LE PHARE spectral energy distribution template fitting. We show that METAPHOR is capable of estimating the precision and reliability of photometric redshifts obtained with three different self-adaptive techniques, i.e. MLPQNA, Random Forest and the standard K-Nearest Neighbors models.
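
    The general perturbation idea behind an empirical photo-z PDF can be sketched as follows (this is not the MLPQNA implementation): train a regressor, then re-run it on many noise-perturbed copies of a source's photometry and take the spread of the outputs as the PDF. The synthetic photometry, the network and the assumed magnitude error below are illustrative.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(7)

        # fake 5-band photometry and a smooth synthetic redshift relation
        n = 5000
        mags = rng.uniform(18.0, 24.0, size=(n, 5))
        z_true = 0.05 * (mags.mean(axis=1) - 18.0) + rng.normal(0.0, 0.02, n)

        model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
        model.fit(mags, z_true)

        source = mags[0]
        mag_err = 0.05                                     # assumed photometric error
        perturbed = source + rng.normal(0.0, mag_err, size=(500, 5))
        z_samples = model.predict(perturbed)               # empirical photo-z PDF samples

        print("photo-z point estimate:", float(model.predict(source[None])[0]))
        print("PDF spread (std of perturbed predictions):", float(z_samples.std()))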

  15. Impact of Patient Affect on Physician Estimate of Probability of Serious Illness and Test Ordering

    Science.gov (United States)

    Neumann, Dawn; Raad, Samih; Schriger, David L.; Hall, Cassandra L.; Capito, Jake; Kammer, David

    2017-01-01

    Purpose The authors hypothesize patient facial affect may influence clinician pretest probability (PTP) estimate of cardiopulmonary emergency (CPE) and desire to order a computerized tomographic pulmonary angiogram (CTPA). Method This prospective study was conducted at three Indiana University–affiliated hospitals in two parts: collecting videos of patients undergoing CTPA for suspected acute pulmonary embolism watching a humorous video (August 2014–April 2015) and presenting the medical histories and videos to clinicians to determine the impact of patient facial affect on physicians’ PTP estimate of CPE and desire to order a CTPA (June–November 2015). Patient outcomes were adjudicated as CPE+ or CPE− by three independent reviewers. Physicians completed a standardized test of facial affect recognition, read standardized medical histories, then viewed videos of the patients’ faces. Clinicians marked their PTP estimate of CPE and desire for a CTPA before and after seeing the video on a visual analog scale (VAS). Results Fifty physicians completed all 73 videos. Seeing the patient’s face produced a > 10% absolute change in PTP estimate of CPE in 1,204/3,650 (33%) cases and desire for a CTPA in 1,095/3,650 (30%) cases. The mean area under the receiver operating characteristic curve for CPE estimate was 0.55 ± 0.15, and the change in CPE VAS was negatively correlated with physicians’ standardized test scores (r = −0.23). Conclusions Clinicians may use patients’ faces to make clinically important inferences about presence of serious illness and need for diagnostic testing. However, these inferences may fail to align with actual patient outcomes. PMID:28403005

  16. An Estimation of Human Error Probability of Filtered Containment Venting System Using Dynamic HRA Method

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Seunghyun; Jae, Moosung [Hanyang University, Seoul (Korea, Republic of)

    2016-10-15

    Human failure events (HFEs) are considered in the development of system fault trees as well as accident sequence event trees as part of Probabilistic Safety Assessment (PSA). Several methods for analyzing human error, such as the Technique for Human Error Rate Prediction (THERP), Human Cognitive Reliability (HCR), and Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H), are in use, and new methods for human reliability analysis (HRA) are currently under development. This paper presents a dynamic HRA method for assessing human failure events and applies it to estimate the human error probability for the filtered containment venting system (FCVS). The action associated with implementation of containment venting during a station blackout sequence is used as an example. In this report, the dynamic HRA method was used to analyze an FCVS-related operator action. The distributions of the required time and the available time were developed by the MAAP code and LHS sampling. Though the numerical calculations given here are only for illustrative purposes, the dynamic HRA method can be a useful tool for estimating human error probability and can be applied to any kind of operator action, including severe accident management strategies.
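
    The time-reliability core of such an estimate can be sketched as the probability that the required time exceeds the available time, here with invented lognormal parameters and simple random sampling in place of the MAAP-derived distributions and LHS used in the paper.

        import numpy as np

        rng = np.random.default_rng(6)

        n = 200_000
        t_required = rng.lognormal(mean=np.log(30.0), sigma=0.4, size=n)    # minutes
        t_available = rng.lognormal(mean=np.log(60.0), sigma=0.3, size=n)   # minutes

        hep = np.mean(t_required > t_available)     # human error probability
        print(f"estimated HEP ≈ {hep:.3g}")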

  17. Comparison of Subjective and Objective Sleep Estimations in Patients with Bipolar Disorder and Healthy Control Subjects

    Directory of Open Access Journals (Sweden)

    Philipp S. Ritter

    2016-01-01

    Full Text Available Background. Several studies have described but not formally tested discrepancies between subjective and objective measures of sleep. Study Objectives. To test the hypothesis that patients with bipolar disorder display a systematic bias to underestimate sleep duration and overestimate sleep latency. Methods. Actimetry was used to assess sleep latency and duration in 49 euthymic participants (bipolar = 21; healthy controls = 28) for 5–7 days. Participants simultaneously recorded estimated sleep duration and sleep latency on a daily basis via an online sleep diary. Group differences in the discrepancy between subjective and objective parameters were calculated using t-tests and corrected for multiple comparisons. Results. Patients with bipolar disorder significantly underestimated their sleep duration but did not overestimate their sleep latency compared to healthy controls. Conclusions. Studies utilizing diaries or questionnaires alone in patients with bipolar disorders may systematically underestimate sleep duration compared to healthy controls. The additional use of objective assessment methods such as actimetry is advisable.

  18. Estimating the probabilities of rare arrhythmic events in multiscale computational models of cardiac cells and tissue.

    Directory of Open Access Journals (Sweden)

    Mark A Walker

    2017-11-01

    Full Text Available Ectopic heartbeats can trigger reentrant arrhythmias, leading to ventricular fibrillation and sudden cardiac death. Such events have been attributed to perturbed Ca2+ handling in cardiac myocytes leading to spontaneous Ca2+ release and delayed afterdepolarizations (DADs. However, the ways in which perturbation of specific molecular mechanisms alters the probability of ectopic beats is not understood. We present a multiscale model of cardiac tissue incorporating a biophysically detailed three-dimensional model of the ventricular myocyte. This model reproduces realistic Ca2+ waves and DADs driven by stochastic Ca2+ release channel (RyR gating and is used to study mechanisms of DAD variability. In agreement with previous experimental and modeling studies, key factors influencing the distribution of DAD amplitude and timing include cytosolic and sarcoplasmic reticulum Ca2+ concentrations, inwardly rectifying potassium current (IK1 density, and gap junction conductance. The cardiac tissue model is used to investigate how random RyR gating gives rise to probabilistic triggered activity in a one-dimensional myocyte tissue model. A novel spatial-average filtering method for estimating the probability of extreme (i.e. rare, high-amplitude stochastic events from a limited set of spontaneous Ca2+ release profiles is presented. These events occur when randomly organized clusters of cells exhibit synchronized, high amplitude Ca2+ release flux. It is shown how reduced IK1 density and gap junction coupling, as observed in heart failure, increase the probability of extreme DADs by multiple orders of magnitude. This method enables prediction of arrhythmia likelihood and its modulation by alterations of other cellular mechanisms.

  19. Estimating the ground-state probability of a quantum simulation with product-state measurements

    Directory of Open Access Journals (Sweden)

    Bryce eYoshimura

    2015-10-01

    Full Text Available One of the goals in quantum simulation is to adiabatically generate the ground state of a complicated Hamiltonian by starting with the ground state of a simple Hamiltonian and slowly evolving the system to the complicated one. If the evolution is adiabatic and the initial and final ground states are connected due to having the same symmetry, then the simulation will be successful. But in most experiments, adiabatic simulation is not possible because it would take too long, and the system has some level of diabatic excitation. In this work, we quantify the extent of the diabatic excitation even if we do not know a priori what the complicated ground state is. Since many quantum simulator platforms, like trapped ions, can measure the probabilities to be in a product state, we describe techniques that can employ these simple measurements to estimate the probability of being in the ground state of the system after the diabatic evolution. These techniques do not require one to know any properties about the Hamiltonian itself, nor to calculate its eigenstate properties. All the information is derived by analyzing the product-state measurements as functions of time.

  20. Estimating landholders' probability of participating in a stewardship program, and the implications for spatial conservation priorities.

    Directory of Open Access Journals (Sweden)

    Vanessa M Adams

    Full Text Available The need to integrate social and economic factors into conservation planning has become a focus of academic discussions and has important practical implications for the implementation of conservation areas, both private and public. We conducted a survey in the Daly Catchment, Northern Territory, to inform the design and implementation of a stewardship payment program. We used a choice model to estimate the likely level of participation in two legal arrangements--conservation covenants and management agreements--based on payment level and proportion of properties required to be managed. We then spatially predicted landholders' probability of participating at the resolution of individual properties and incorporated these predictions into conservation planning software to examine the potential for the stewardship program to meet conservation objectives. We found that the properties that were least costly, per unit area, to manage were also the least likely to participate. This highlights a tension between planning for a cost-effective program and planning for a program that targets properties with the highest probability of participation.

  1. A method for Bayesian estimation of the probability of local intensity for some cities in Japan

    Directory of Open Access Journals (Sweden)

    G. C. Koravos

    2002-06-01

    Full Text Available Seismic hazard, in terms of the probability of exceedance of a given intensity in a given time span, was assessed for 12 sites in Japan. The method does not use any attenuation law. Instead, the dependence of local intensity on epicentral intensity I_0 is calculated directly from the data, using a Bayesian model. According to this model (Meroni et al., 1994), local intensity follows the binomial distribution with parameters (I_0, p). The parameter p is considered a random variable following the Beta distribution. In this manner, Bayesian estimates of p are obtained for various values of epicentral intensity and epicentral distance. In order to apply this model to the assessment of seismic hazard, the area under consideration is divided into seismic sources (zones) of known seismicity. The contribution of each source to the seismic hazard at every site is calculated according to the Bayesian model, and the result is the combined effect of all the sources. High probabilities of exceedance were calculated for the sites in the central part of the country, with hazard decreasing slightly towards the northern and southern parts.
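
    The Beta-Binomial structure described above admits a compact numerical sketch. The snippet below is a minimal illustration assuming hypothetical counts of exceedance/non-exceedance observations for one epicentral-intensity class; the prior parameters and counts are placeholders, not values from the paper.

        # Minimal sketch of a Beta-Binomial update for the exceedance parameter p,
        # assuming hypothetical observation counts (not data from the paper).
        from scipy import stats

        alpha0, beta0 = 1.0, 1.0      # placeholder Beta prior for p
        n_obs, n_exceed = 40, 9       # hypothetical trials and exceedances for one class

        # By conjugacy, the posterior of p is again a Beta distribution.
        posterior = stats.beta(alpha0 + n_exceed, beta0 + n_obs - n_exceed)
        print("posterior mean of p:", posterior.mean())
        # The probability that local intensity exceeds a target, given I_0, then follows
        # from the binomial tail evaluated with the posterior-mean p.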

  2. Estimation of the nuclear fuel assembly eigenfrequencies in the probability sense

    Directory of Open Access Journals (Sweden)

    Zeman V.

    2014-12-01

    Full Text Available The paper deals with the estimation of upper and lower limits of the nuclear fuel assembly eigenfrequencies, whose design and operation parameters are random variables. Each parameter is defined by its mean value and standard deviation or by a range of values. The gradient and three-sigma criterion approach is applied to the calculation of the upper and lower limits of fuel assembly eigenfrequencies in the probability sense. The presented analytical approach, used for the calculation of eigenfrequency sensitivities, is based on the modal synthesis method and the decomposition of the fuel assembly into six identical revolved fuel rod segments, a centre tube, and a load-bearing skeleton linked by spacer grids. The method is applied to the Russian TVSA-T fuel assembly in the WWER1000/320 type reactor core of the Czech nuclear power plant Temelín.
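
    As a rough illustration of the gradient/three-sigma idea, the sketch below propagates parameter standard deviations through a first-order sensitivity expansion of one eigenfrequency; the sensitivity values and parameter statistics are placeholders, not those of the TVSA-T assembly.

        # First-order (gradient) uncertainty propagation with a three-sigma band.
        import numpy as np

        # Hypothetical nominal eigenfrequency and sensitivities df/dp_i (placeholders).
        f_nominal = 25.0                             # Hz
        sensitivities = np.array([0.8, -0.3, 1.2])   # Hz per unit change of each parameter
        sigmas = np.array([0.05, 0.10, 0.02])        # parameter standard deviations

        sigma_f = np.sqrt(np.sum((sensitivities * sigmas) ** 2))
        lower, upper = f_nominal - 3.0 * sigma_f, f_nominal + 3.0 * sigma_f
        print(f"eigenfrequency limits (3-sigma): [{lower:.2f}, {upper:.2f}] Hz")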

  3. Simultaneous pixel detection probabilities and spatial resolution estimation of pixelized detectors by means of correlation measurements

    Energy Technology Data Exchange (ETDEWEB)

    Grabski, V. [Instituto de Fisica, Universidad Nacional Autonoma de Mexico, A.P. 20-364, 01000 Mexico, DF (Mexico)], E-mail: varlen.grabski@cern.ch

    2008-02-21

    On the basis of the determination of statistical correlations between neighboring detector pixels, a novel method of estimating the simultaneous detection probability of pixels and the spatial resolution of pixelized detectors is proposed. The correlations are determined using noise variance measurement for isolated pixels and for the difference between neighboring pixels. The method is validated using images from two image-acquisition devices, a General Electric Senographe 2000D and a SD mammographic unit. The pixelized detector is irradiated with X-rays over its entire surface. It is shown that the simultaneous pixel detection probabilities can be estimated with an accuracy of 0.001-0.003, with an estimated systematic error of less than 0.005. The two-dimensional pre-sampled point-spread function (PSF^0) is determined using a single Gaussian approximation and a sum of two Gaussian approximations. The results obtained for the pre-sampled PSF^0 show that the single Gaussian approximation is not appropriate, and the sum of two Gaussian approximations providing the best fit predicts the existence of a large (~50%) narrow component. Support for this observation can be found in the recent simulation study of columnar indirect digital detectors by Badano et al. The sampled two-dimensional PSF is determined using Monte Carlo simulation for the L-shaped, uniformly distributed acceptance function for different fill-factor values. The calculation of the pre-sampled modulation transfer function based on the estimated PSF^0 shows that the observed data can be reproduced only by the single Gaussian approximation, and that when the sum of two Gaussians is used, significantly larger values are apparent in the higher-frequency region for images from both detection devices. The proposed method does not require a precisely constructed tool. It is insensitive to beam collimation and to system physical size and may be indispensable in cases where thin

  4. Simultaneous pixel detection probabilities and spatial resolution estimation of pixelized detectors by means of correlation measurements

    Science.gov (United States)

    Grabski, V.

    2008-02-01

    On the basis of the determination of statistical correlations between neighboring detector pixels, a novel method of estimating the simultaneous detection probability of pixels and the spatial resolution of pixelized detectors is proposed. The correlations are determined using noise variance measurement for isolated pixels and for the difference between neighboring pixels. The method is validated using images from two image-acquisition devices, a General Electric Senographe 2000D and a SD mammographic unit. The pixelized detector is irradiated with X-rays over its entire surface. It is shown that the simultaneous pixel detection probabilities can be estimated with an accuracy of 0.001-0.003, with an estimated systematic error of less than 0.005. The two-dimensional pre-sampled point-spread function (PSF^0) is determined using a single Gaussian approximation and a sum of two Gaussian approximations. The results obtained for the pre-sampled PSF^0 show that the single Gaussian approximation is not appropriate, and the sum of two Gaussian approximations providing the best fit predicts the existence of a large (~50%) narrow component. Support for this observation can be found in the recent simulation study of columnar indirect digital detectors by Badano et al. The sampled two-dimensional PSF is determined using Monte Carlo simulation for the L-shaped, uniformly distributed acceptance function for different fill-factor values. The calculation of the pre-sampled modulation transfer function based on the estimated PSF^0 shows that the observed data can be reproduced only by the single Gaussian approximation, and that when the sum of two Gaussians is used, significantly larger values are apparent in the higher-frequency region for images from both detection devices. The proposed method does not require a precisely constructed tool. It is insensitive to beam collimation and to system physical size and may be indispensable in cases where thin absorption slits or edges are

  5. Integrating hyper-parameter uncertainties in a multi-fidelity Bayesian model for the estimation of a probability of failure

    OpenAIRE

    Stroh, Rémi; Bect, Julien; Demeyer, Séverine; Fischer, Nicolas; Vazquez, Emmanuel

    2017-01-01

    International audience; A multi-fidelity simulator is a numerical model, in which one of the inputs controls a trade-off between the realism and the computational cost of the simulation. Our goal is to estimate the probability of exceeding a given threshold on a multi-fidelity stochastic simulator. We propose a fully Bayesian approach based on Gaussian processes to compute the posterior probability distribution of this probability. We pay special attention to the hyper-parameters of the model...

  6. PItcHPERFeCT: Primary Intracranial Hemorrhage Probability Estimation using Random Forests on CT.

    Science.gov (United States)

    Muschelli, John; Sweeney, Elizabeth M; Ullman, Natalie L; Vespa, Paul; Hanley, Daniel F; Crainiceanu, Ciprian M

    2017-01-01

    Intracerebral hemorrhage (ICH), where a blood vessel ruptures into areas of the brain, accounts for approximately 10-15% of all strokes. X-ray computed tomography (CT) scanning is largely used to assess the location and volume of these hemorrhages. Manual segmentation of the CT scan using planimetry by an expert reader is the gold standard for volume estimation, but is time-consuming and has within- and across-reader variability. We propose a fully automated segmentation approach using a random forest algorithm with features extracted from X-ray computed tomography (CT) scans. The Minimally Invasive Surgery plus rt-PA in ICH Evacuation (MISTIE) trial was a multi-site Phase II clinical trial that tested the safety of hemorrhage removal using recombinant-tissue plasminogen activator (rt-PA). For this analysis, we use 112 baseline CT scans from patients enrolled in the MISTIE trial, one CT scan per patient. ICH was manually segmented on these CT scans by expert readers. We derived a set of imaging predictors from each scan. Using 10 randomly-selected scans, we used a first-pass voxel selection procedure based on quantiles of a set of predictors and then built 4 models estimating the voxel-level probability of ICH. The models used were: 1) logistic regression, 2) logistic regression with a penalty on the model parameters using LASSO, 3) a generalized additive model (GAM) and 4) a random forest classifier. The remaining 102 scans were used for model validation. For each validation scan, the model predicted the probability of ICH at each voxel. These voxel-level probabilities were then thresholded to produce binary segmentations of the hemorrhage. These masks were compared to the manual segmentations using the Dice Similarity Index (DSI) and the correlation of hemorrhage volumes between the two segmentations. We tested equality of median DSI using the Kruskal-Wallis test across the 4 models. We tested equality of the median DSI from sets of 2 models using a Wilcoxon
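
    To make the voxel-level classification and Dice evaluation concrete, here is a minimal sketch using scikit-learn; the feature matrix, labels, and threshold are synthetic placeholders and do not reproduce the MISTIE pipeline.

        # Sketch: voxel-wise ICH probability with a random forest, then Dice overlap.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        X_train = rng.normal(size=(5000, 6))                 # 6 image-derived predictors per voxel
        y_train = (X_train[:, 0] + X_train[:, 1] > 1).astype(int)

        rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

        X_test = rng.normal(size=(2000, 6))
        y_true = (X_test[:, 0] + X_test[:, 1] > 1).astype(int)
        prob = rf.predict_proba(X_test)[:, 1]                # voxel-level probability of ICH
        y_pred = (prob > 0.5).astype(int)                    # threshold to a binary segmentation

        dice = 2 * np.sum(y_pred * y_true) / (np.sum(y_pred) + np.sum(y_true))
        print("Dice similarity index:", round(dice, 3))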

  7. Estimation of the probability of bacterial population survival: Development of a probability model to describe the variability in time to inactivation of Salmonella enterica.

    Science.gov (United States)

    Koyama, Kento; Hokunan, Hidekazu; Hasegawa, Mayumi; Kawamura, Shuso; Koseki, Shigenobu

    2017-12-01

    Despite the development of numerous predictive microbial inactivation models, a model focusing on the variability in time to inactivation for a bacterial population has not been developed. Additionally, an appropriate estimation of the risk of there being any remaining bacterial survivors in foods after the application of an inactivation treatment has not yet been established. Here, Gamma distribution, as a representative probability distribution, was used to estimate the variability in time to inactivation for a bacterial population. Salmonella enterica serotype Typhimurium was evaluated for survival in a low relative humidity environment. We prepared bacterial cells with an initial concentration that was adjusted to 2 × 10^n colony-forming units/2 μl (n = 1, 2, 3, 4, 5) by performing a serial 10-fold dilution, and then we placed 2 μl of the inocula into each well of 96-well microplates. The microplates were stored in a desiccated environment at 10-20% relative humidity at 5, 15, or 25 °C. The survival or death of bacterial cells for each well in the 96-well microplate was confirmed by adding tryptic soy broth as an enrichment culture. The changes in the death probability of the 96 replicated bacterial populations were described as a cumulative Gamma distribution. The variability in time to inactivation was described by transforming the cumulative Gamma distribution into a Gamma distribution. We further examined the bacterial inactivation on almond kernels and radish sprout seeds. Additionally, we described certainty levels of bacterial inactivation that ensure the death probability of a bacterial population at six decimal reduction levels, ranging from 90 to 99.9999%. Consequently, the probability model developed in the present study enables us to estimate the death probability of bacterial populations in a desiccated environment over time. This probability model may be useful for risk assessment to estimate the amount of remaining bacteria in a given
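
    A minimal numerical sketch of the idea, assuming a hypothetical set of observed times to inactivation rather than the study's data, is shown below: a Gamma distribution is fitted and the probability that any survivors remain at a given storage time is read from its tail.

        # Sketch: describe variability in time to inactivation with a Gamma distribution
        # and read off the probability that a population still has survivors at time t.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        times_to_inactivation = rng.gamma(shape=4.0, scale=5.0, size=96)  # hypothetical days

        shape, loc, scale = stats.gamma.fit(times_to_inactivation, floc=0)
        fitted = stats.gamma(shape, loc=loc, scale=scale)

        t = 30.0                     # storage time in days (placeholder)
        p_survivor = fitted.sf(t)    # probability the population is not yet inactivated
        print(f"P(survivors remain at {t} days) = {p_survivor:.4f}")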

  8. Herpes simplex virus-2 transmission probability estimates based on quantity of viral shedding.

    Science.gov (United States)

    Schiffer, Joshua T; Mayer, Bryan T; Fong, Youyi; Swan, David A; Wald, Anna

    2014-06-06

    Herpes simplex virus (HSV)-2 is periodically shed in the human genital tract, most often asymptomatically, and most sexual transmissions occur during asymptomatic shedding. It would be helpful to identify a genital viral load threshold necessary for transmission, as clinical interventions that maintain viral quantity below this level would be of high utility. However, because viral expansion, decay and re-expansion kinetics are extremely rapid during shedding episodes, it is impossible to directly measure genital viral load at the time of sexual activity. We developed a mathematical model based on reproducing shedding patterns in transmitting partners, and median number of sex acts prior to transmission in discordant couples, to estimate infectivity of single viral particles in the negative partner's genital tract. We then inferred probability estimates for transmission at different levels of genital tract viral load in the transmitting partner. We predict that transmission is unlikely at viral loads less than 10^4 HSV DNA copies. Moreover, most transmissions occur during prolonged episodes with high viral copy numbers. Many shedding episodes that result in transmission do not reach the threshold of clinical detection, because the ulcer remains very small, highlighting one reason why HSV-2 spreads so effectively within populations.

  9. Average bit error probability of binary coherent signaling over generalized fading channels subject to additive generalized gaussian noise

    KAUST Repository

    Soury, Hamza

    2012-06-01

    This letter considers the average bit error probability of binary coherent signaling over flat fading channels subject to additive generalized Gaussian noise. More specifically, a generic closed-form expression in terms of the Fox's H function is offered for the extended generalized-K fading case. Simplifications for some special fading distributions such as generalized-K fading and Nakagami-m fading and special additive noise distributions such as Gaussian and Laplacian noise are then presented. Finally, the mathematical formalism is illustrated by some numerical examples verified by computer based simulations for a variety of fading and additive noise parameters. © 2012 IEEE.

  10. Exact Symbol Error Probability of Square M-QAM Signaling over Generalized Fading Channels subject to Additive Generalized Gaussian Noise

    KAUST Repository

    Soury, Hamza

    2013-07-01

    This paper considers the average symbol error probability of square Quadrature Amplitude Modulation (QAM) coherent signaling over flat fading channels subject to additive generalized Gaussian noise. More specifically, a generic closed-form expression in terms of the Fox H function and the bivariate Fox H function is offered for the extended generalized-K fading case. Simplifications for some special fading distributions such as generalized-K fading, Nakagami-m fading, and Rayleigh fading and special additive noise distributions such as Gaussian and Laplacian noise are then presented. Finally, the mathematical formalism is illustrated by some numerical examples verified by computer based simulations for a variety of fading and additive noise parameters.

  11. Joint Bayesian Estimation of Quasar Continua and the Lyα Forest Flux Probability Distribution Function

    Science.gov (United States)

    Eilers, Anna-Christina; Hennawi, Joseph F.; Lee, Khee-Gan

    2017-08-01

    We present a new Bayesian algorithm making use of Markov Chain Monte Carlo sampling that allows us to simultaneously estimate the unknown continuum level of each quasar in an ensemble of high-resolution spectra, as well as their common probability distribution function (PDF) for the transmitted Lyα forest flux. This fully automated PDF-regulated continuum fitting method models the unknown quasar continuum with a linear principal component analysis (PCA) basis, with the PCA coefficients treated as nuisance parameters. The method allows one to estimate parameters governing the thermal state of the intergalactic medium (IGM), such as the slope of the temperature-density relation γ − 1, while marginalizing out continuum uncertainties in a fully Bayesian way. Using realistic mock quasar spectra created from a simplified semi-numerical model of the IGM, we show that this method recovers the underlying quasar continua to a precision of ≃7% and ≃10% at z = 3 and z = 5, respectively. Given the number of principal component spectra, this is comparable to the underlying accuracy of the PCA model itself. Most importantly, we show that we can achieve a nearly unbiased estimate of the slope γ − 1 of the IGM temperature-density relation with a precision of ±8.6% at z = 3 and ±6.1% at z = 5, for an ensemble of ten mock high-resolution quasar spectra. Applying this method to real quasar spectra and comparing to a more realistic IGM model from hydrodynamical simulations would enable precise measurements of the thermal and cosmological parameters governing the IGM, albeit with somewhat larger uncertainties, given the increased flexibility of the model.

  12. Agricultural Fragility Estimates Subjected to Volcanic Ash Fall Hazards

    Science.gov (United States)

    Ham, H. J.; Lee, S.; Choi, S. H.; Yun, W. S.

    2015-12-01

    In this study, fragility functions are developed to estimate expected volcanic ash damage to the agricultural sector in Korea. The fragility functions are derived from two approaches: 1) an empirical approach based on field observations of impacts to agriculture from the 2006 eruption of Merapi volcano in Indonesia, and 2) the FOSM (first-order second-moment) analytical approach based on the distribution and thickness of volcanic ash observed from the 1980 eruption of Mt. Saint Helens and agricultural facility specifications in Korea. The fragility function for each agricultural commodity class is presented as a cumulative distribution function of the generalized extreme value distribution. Different functions are developed to estimate production losses from outdoor and greenhouse farming. Seasonal climate influences the vulnerability of each agricultural crop and is found to be a crucial component in determining the fragility of agricultural commodities to an ash fall. In the study, a seasonality coefficient is established as a multiplier of the fragility function to account for this seasonal vulnerability. Yields of the different agricultural commodities are obtained from the Korean Statistical Information Service to create a baseline for future agricultural volcanic loss estimation. Numerically simulated examples of scenario ash fall events at Mt. Baekdu volcano are utilized to illustrate the application of the developed fragility functions. Acknowledgements: This research was supported by a grant 'Development of Advanced Volcanic Disaster Response System considering Potential Volcanic Risk around Korea' [MPSS-NH-2015-81] from the Natural Hazard Mitigation Research Group, Ministry of Public Safety and Security of

  13. Estimating the probabilities of making a smoking quit attempt in Italy: stall in smoking cessation levels, 1986-2009

    Directory of Open Access Journals (Sweden)

    Carreras Giulia

    2012-03-01

    Full Text Available Abstract. Background: No data on the annual smoking cessation probability (i.e., the probability of successfully quitting in a given year) are available for Italy at a population level. Mathematical models typically used to estimate smoking cessation probabilities do not account for smoking relapse. In this paper, we developed a mathematical model to estimate annual quitting probabilities, taking into account smoking relapse and time since cessation. Methods: We developed a dynamic model describing the evolution of current, former, and never smokers. We estimated probabilities of smoking cessation by fitting the model with observed smoking prevalence in Italy, 1986-2009. Results: Annual cessation probabilities were higher than 5% only in elderly persons and in women aged ... Conclusions: Over the last 20 years, cessation probabilities among Italian smokers, particularly for those aged 30-59 years, have been very low and stalled. Quitting in Italy is considered a practicable strategy only by women at the age of pregnancy and by elderly persons, when it is likely that symptoms of tobacco-related diseases have already appeared. In order to increase cessation probabilities, smoking cessation treatment policies (introducing total reimbursement of cessation treatments, with a further development of quitlines and smoking cessation services) should be strengthened, and a country-wide mass media campaign targeting smokers aged 30-59 years and focusing on promotion of quitting should be implemented.

  14. Relationship between Future Time Orientation and Item Nonresponse on Subjective Probability Questions: A Cross-Cultural Analysis

    Science.gov (United States)

    Lee, Sunghee; Liu, Mingnan; Hu, Mengyao

    2017-01-01

    Time orientation is an unconscious yet fundamental cognitive process that provides a framework for organizing personal experiences in temporal categories of past, present and future, reflecting the relative emphasis given to these categories. Culture lies central to individuals’ time orientation, leading to cultural variations in time orientation. For example, people from future-oriented cultures tend to emphasize the future and store information relevant for the future more than those from present- or past-oriented cultures. For survey questions that ask respondents to report expected probabilities of future events, this may translate into culture-specific question difficulties, manifested through systematically varying “I don’t know” item nonresponse rates. This study drew on the time orientation theory and examined culture-specific nonresponse patterns on subjective probability questions using methodologically comparable population-based surveys from multiple countries. The results supported our hypothesis. Item nonresponse rates on these questions varied significantly in the way that future-orientation at the group as well as individual level was associated with lower nonresponse rates. This pattern did not apply to non-probability questions. Our study also suggested potential nonresponse bias. Examining culture-specific constructs, such as time orientation, as a framework for measurement mechanisms may contribute to improving cross-cultural research. PMID:28781381

  15. Estimating the probability of recontamination via the air using Monte Carlo simulations.

    Science.gov (United States)

    den Aantrekker, Esther D; Beumer, Rijkelt R; van Gerwen, Suzanne J C; Zwietering, Marcel H; van Schothorst, Mick; Boom, Remko M

    2003-10-15

    Recontamination of food products can cause foodborne illnesses or spoilage of foods. It is therefore useful to quantify this recontamination so that it can be incorporated in microbiological risk assessments (MRA). This paper describes a first attempt to quantify one of the recontamination routes: via the air. Data on the number of airborne microorganisms were collected from literature and industries. The settling velocities of different microorganisms were calculated for different products by combining the data on aerial concentrations with sedimentation counts assuming that settling is under the influence of gravity only. Air movement is not explicitly considered in this study. Statistical analyses were performed to clarify the effect of different products and seasons on the number of airborne microorganisms and the settling velocity. For both bacteria and moulds, three significantly different product categories with regard to the level of airborne organisms were identified. The statistical distribution in these categories was described by a lognormal distribution. The settling velocity did not depend on the product, the season of sampling or the type of microorganism, and had a geometrical mean value of 2.7 mm/s. The statistical distribution of the settling velocity was described by a lognormal distribution as well. The probability of recontamination via the air was estimated by the product of the number of bacteria in the air, the settling velocity, and the exposed area and time of the product. For three example products, the contamination level as a result of airborne recontamination was estimated using Monte Carlo simulations. What-if scenarios were used to exemplify determination of design criteria to control a specified contamination level.
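
    A minimal Monte Carlo sketch of the exposure calculation described above is given below; the lognormal parameters, exposed area, and exposure time are placeholders rather than the paper's fitted values (only the ~2.7 mm/s geometric-mean settling velocity is taken from the text).

        # Sketch: Monte Carlo estimate of airborne recontamination level
        # deposited CFU ~ airborne concentration * settling velocity * exposed area * time.
        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000

        # Placeholder lognormal parameters (log-mean, log-sd), not the fitted study values.
        conc_cfu_per_m3 = rng.lognormal(mean=2.0, sigma=1.0, size=n)              # CFU per m^3
        settling_v_m_s = rng.lognormal(mean=np.log(2.7e-3), sigma=0.5, size=n)    # m/s

        area_m2 = 0.05      # exposed product area (placeholder)
        time_s = 600.0      # exposure time (placeholder)

        deposited_cfu = conc_cfu_per_m3 * settling_v_m_s * area_m2 * time_s
        print("mean deposited CFU:", deposited_cfu.mean())
        print("95th percentile:", np.percentile(deposited_cfu, 95))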

  16. Programming errors contribute to death from patient-controlled analgesia: case report and estimate of probability.

    Science.gov (United States)

    Vicente, Kim J; Kada-Bekhaled, Karima; Hillel, Gillian; Cassano, Andrea; Orser, Beverley A

    2003-04-01

    To identify the factors that threaten patient safety when using patient-controlled analgesia (PCA) and to obtain an evidence-based estimate of the probability of death from user programming errors associated with PCA. A 19-yr-old woman underwent Cesarean section and delivered a healthy infant. Postoperatively, morphine sulfate (2 mg bolus, lockout interval of six minutes, four-hour limit of 30 mg) was ordered, to be delivered by an Abbott Lifecare 4100 Plus II Infusion Pump. A drug cassette containing 1 mg.mL(-1) solution of morphine was unavailable, so the nurse used a cassette that contained a more concentrated solution (5 mg.mL(-1)). 7.5 hr after the PCA was started, the patient was pronounced dead. Blood samples were obtained and autopsy showed a toxic concentration of morphine. The available evidence is consistent with a concentration programming error where morphine 1 mg.mL(-1) was entered instead of 5 mg.mL(-1). Based on a search of such incidents in the Food and Drug Administration MDR database and other sources and on a denominator of 22,000,000 provided by the device manufacturer, mortality from user programming errors with this device was estimated to be a low likelihood event (ranging from 1 in 33,000 to 1 in 338,800), but relatively numerous in absolute terms (ranging from 65-667 deaths). Anesthesiologists, nurses, human factors engineers, and device manufacturers can work together to enhance the safety of PCA pumps by redesigning user interfaces, drug cassettes, and hospital operating procedures to minimize programming errors and to enhance their detection before patients are harmed.

  17. Estimating TP53 Mutation Carrier Probability in Families with Li-Fraumeni Syndrome Using LFSPRO.

    Science.gov (United States)

    Peng, Gang; Bojadzieva, Jasmina; Ballinger, Mandy L; Li, Jialu; Blackford, Amanda L; Mai, Phuong L; Savage, Sharon A; Thomas, David M; Strong, Louise C; Wang, Wenyi

    2017-06-01

    Background: Li-Fraumeni syndrome (LFS) is associated with germline TP53 mutations and a very high lifetime cancer risk. Algorithms that assess a patient's risk of inherited cancer predisposition are often used in clinical counseling. The existing LFS criteria have limitations, suggesting the need for an advanced prediction tool to support clinical decision making for TP53 mutation testing and LFS management. Methods: Based on a Mendelian model, LFSPRO estimates TP53 mutation probability through the Elston-Stewart algorithm and consequently estimates future risk of cancer. With independent datasets of 1,353 tested individuals from 867 families, we evaluated the prediction performance of LFSPRO. Results: LFSPRO accurately predicted TP53 mutation carriers in a pediatric sarcoma cohort from MD Anderson Cancer Center in the United States, the observed to expected ratio (OE) = 1.35 (95% confidence interval, 0.99-1.80); area under the receiver operating characteristic curve (AUC) = 0.85 (0.75-0.93); a population-based sarcoma cohort from the International Sarcoma Kindred Study in Australia, OE = 1.62 (1.03-2.55); AUC = 0.67 (0.54-0.79); and the NCI LFS study cohort, OE = 1.28 (1.17-1.39); AUC = 0.82 (0.78-0.86). LFSPRO also showed higher sensitivity and specificity than the classic LFS and Chompret criteria. LFSPRO is freely available through the R packages LFSPRO and BayesMendel. Conclusions: LFSPRO shows good performance in predicting TP53 mutations in individuals and families in varied situations. Impact: LFSPRO is more broadly applicable than the current clinical criteria and may improve clinical management for individuals and families with LFS. Cancer Epidemiol Biomarkers Prev; 26(6); 837-44. ©2017 AACR . ©2017 American Association for Cancer Research.

  18. Estimation of Day-Specific Probabilities of Conception during Natural Cycle in Women from Babylon

    Directory of Open Access Journals (Sweden)

    Hanan Al-taee

    2017-10-01

    Full Text Available Background: Identifying predictors of the probabilities of conception related to the timing and frequency of intercourse in the menstrual cycle is essential for couples attempting pregnancy, users of natural family planning methods, and clinicians diagnosing possible causes of infertility. The aim of this study is to estimate the days on which conception is most likely to occur, using first-trimester ultrasound fetal biometry in natural cycles and spontaneous pregnancy, and to explore some factors that may affect them. Materials and Methods: This is a retrospective cohort study with random sampling. It involved 60 pregnant women in the first trimester; the date of conception was estimated using: i. crown-rump length biometry (routine ultrasound examinations were performed at a median of 70 days following the last menstrual period, or equivalently 10 weeks), ii. the date of the last menstrual period. Only women with previous infertility who were now conceiving naturally with a certain date of last menstrual period were selected. Results: The distribution of conception showed a sharp rise from day 8 onwards, reaching its maximum at day 13 and decreasing to zero by day 30 after the last menstrual period. Older and obese women conceived earlier than younger women, but the difference between the two groups was not significant (P>0.05). According to the type of infertility, women with secondary infertility conceived earlier than those with primary infertility, with a significant difference between the two groups (P<0.05). Conclusion: The day-specific probability of conception may be affected by factors such as age, BMI, and type of infertility. This should be confirmed with a larger sample size in a multicenter study.

  19. Estimating Route Choice Models from Stochastically Generated Choice Sets on Large-Scale Networks Correcting for Unequal Sampling Probability

    DEFF Research Database (Denmark)

    Vacca, Alessandro; Prato, Carlo Giacomo; Meloni, Italo

    2015-01-01

    is the dependency of the parameter estimates from the choice set generation technique. Bias introduced in model estimation has been corrected only for the random walk algorithm, which has problematic applicability to large-scale networks. This study proposes a correction term for the sampling probability of routes...

  20. A probability model for evaluating the bias and precision of influenza vaccine effectiveness estimates from case-control studies.

    Science.gov (United States)

    Haber, M; An, Q; Foppa, I M; Shay, D K; Ferdinands, J M; Orenstein, W A

    2015-05-01

    As influenza vaccination is now widely recommended, randomized clinical trials are no longer ethical in many populations. Therefore, observational studies on patients seeking medical care for acute respiratory illnesses (ARIs) are a popular option for estimating influenza vaccine effectiveness (VE). We developed a probability model for evaluating and comparing bias and precision of estimates of VE against symptomatic influenza from two commonly used case-control study designs: the test-negative design and the traditional case-control design. We show that when vaccination does not affect the probability of developing non-influenza ARI then VE estimates from test-negative design studies are unbiased even if vaccinees and non-vaccinees have different probabilities of seeking medical care against ARI, as long as the ratio of these probabilities is the same for illnesses resulting from influenza and non-influenza infections. Our numerical results suggest that in general, estimates from the test-negative design have smaller bias compared to estimates from the traditional case-control design as long as the probability of non-influenza ARI is similar among vaccinated and unvaccinated individuals. We did not find consistent differences between the standard errors of the estimates from the two study designs.
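
    As a purely illustrative sketch of how VE is computed under the test-negative design (one minus the odds ratio comparing vaccination among influenza-positive and test-negative patients), the snippet below uses hypothetical counts, not data from the paper.

        # Sketch: vaccine effectiveness from a test-negative design, VE = 1 - odds ratio.
        import numpy as np

        # Hypothetical counts: vaccinated/unvaccinated x influenza-positive/test-negative.
        vac_pos, vac_neg = 40, 360
        unvac_pos, unvac_neg = 120, 480

        odds_ratio = (vac_pos / vac_neg) / (unvac_pos / unvac_neg)
        ve = 1.0 - odds_ratio

        # Approximate 95% CI via the usual standard error of log(OR) for a 2x2 table.
        se_log_or = np.sqrt(1/vac_pos + 1/vac_neg + 1/unvac_pos + 1/unvac_neg)
        ci = 1.0 - np.exp(np.log(odds_ratio) + np.array([1.96, -1.96]) * se_log_or)
        print(f"VE = {ve:.2%}, 95% CI = ({ci[0]:.2%}, {ci[1]:.2%})")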

  1. Using of bayesian networks to estimate the probability of "NATECH" scenario occurrence

    Science.gov (United States)

    Dobes, Pavel; Dlabka, Jakub; Jelšovská, Katarína; Polorecká, Mária; Baudišová, Barbora; Danihelka, Pavel

    2015-04-01

    In the twentieth century, Bayesian statistics and probability were little used (and perhaps not a preferred approach) in the analysis and management of natural and industrial risks, nor in the analysis of so-called NATECH accidents (chemical accidents triggered by natural events such as earthquakes, floods, or lightning; ref. E. Krausmann, 2011, doi:10.5194/nhess-11-921-2011). From the beginning, the main role was played by so-called "classical" frequentist probability (ref. Neyman, 1937), which relies on the outcomes of experiments and monitoring and does not allow expert beliefs, expectations, and judgements to be taken into account (which, on the other hand, is one of the well-known pillars of the Bayesian approach to probability). Over the last 20 or 30 years, a renaissance of Bayesian statistics can be observed, through publications and conferences, in many scientific disciplines, including various branches of the geosciences. Is a certain level of trust in expert judgement within risk analysis needed once again? After several decades of development in this field, the following hypothesis (to be checked) can be proposed: the probabilities of complex crisis situations and their top events (many NATECH events can be classified as crisis situations or emergencies) cannot be estimated by the classical frequentist approach alone, but also require a Bayesian approach, i.e. a pre-staged Bayesian network that combines expert belief and expectation with classical frequentist inputs. This is because there is not always enough quantitative information from the monitoring of historical emergencies, several dependent or independent variables may need to be considered, and, in general, every emergency situation unfolds a little differently. On this topic, the team of authors presents its proposal of a pre-staged, typified Bayesian network model for a specified NATECH scenario
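
    Purely as an illustration of how a small pre-staged Bayesian network could combine an expert prior with frequency-based inputs, the sketch below enumerates a toy three-node network (flood, protection failure, release); all probabilities are placeholders, not values from the authors' model.

        # Toy Bayesian network: Flood -> ProtectionFails -> Release.
        # All probabilities are placeholders (expert judgement plus frequency estimates).
        p_flood = 0.02                                   # annual probability of a major flood
        p_fail_given_flood = {True: 0.30, False: 0.01}   # protection failure probability
        p_release_given_fail = {True: 0.50, False: 0.001}

        p_release = 0.0
        for flood in (True, False):
            pf = p_flood if flood else 1 - p_flood
            for fails in (True, False):
                pff = p_fail_given_flood[flood] if fails else 1 - p_fail_given_flood[flood]
                p_release += pf * pff * p_release_given_fail[fails]

        print(f"P(release) = {p_release:.5f}")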

  2. Estimation of the Probable Maximum Flood for a Small Lowland River in Poland

    Science.gov (United States)

    Banasik, K.; Hejduk, L.

    2009-04-01

    The planning, design and use of hydrotechnical structures often requires the assessment of maximum flood potentials. The most common term applied to this upper limit of flooding is the probable maximum flood (PMF). The PMP/UH (probable maximum precipitation/unit hydrograph) method has been used in this study to predict the PMF for a small agricultural lowland river basin, the Zagozdzonka (left tributary of the Vistula river) in Poland. The river basin, located about 100 km south of Warsaw, with an area of 82 km2 upstream of the Plachty gauge, has been investigated by the Department of Water Engineering and Environmental Restoration of Warsaw University of Life Sciences - SGGW since 1962. An over 40-year flow record was used in a previous investigation for predicting the T-year flood discharge (Banasik et al., 2003). The objective here was to estimate the PMF using the PMP/UH method and to compare the results with the 100-year flood. A new depth-duration relation of PMP for the local climatic conditions has been developed based on Polish maximum observed rainfall data (Ozga-Zielinska & Ozga-Zielinski, 2003). An exponential formula with an exponent of 0.47, i.e. close to the exponent in the formula for world PMP and also in the formula of PMP for Great Britain (Wilson, 1993), gives a rainfall depth about 40% lower than Wilson's. The effective rainfall (runoff volume) has been estimated from the PMP of various durations using the CN method (USDA-SCS, 1986). The CN value, as well as the parameters of the IUH model (Nash, 1957), have been established from 27 rainfall-runoff events recorded in the river basin in the period 1980-2004. Variability of the parameter values with the size of the events will be discussed in the paper. The results of the analysis have shown that the peak discharge of the PMF is 4.5 times larger than the 100-year flood, and the volume ratio of the respective direct hydrographs caused by rainfall events of critical duration is 4.0. References: 1. Banasik K
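
    For reference, a minimal sketch of the SCS-CN effective-rainfall step mentioned above is given below (standard USDA-SCS formula); the CN value and rainfall depth are placeholders, not the basin's calibrated values.

        # SCS Curve Number method: effective rainfall (runoff depth) from total rainfall.
        def scs_runoff_mm(p_mm: float, cn: float) -> float:
            s = 25400.0 / cn - 254.0      # potential maximum retention [mm]
            ia = 0.2 * s                  # initial abstraction [mm]
            if p_mm <= ia:
                return 0.0
            return (p_mm - ia) ** 2 / (p_mm + 0.8 * s)

        # Placeholder values for illustration only.
        print(scs_runoff_mm(p_mm=180.0, cn=72.0))   # runoff depth in mm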

  3. Moment-Based Probability Modeling and Extreme Response Estimation, The FITS Routine Version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    MANUEL,LANCE; KASHEF,TINA; WINTERSTEIN,STEVEN R.

    1999-11-01

    This report documents the use of the FITS routine, which provides automated fits of various analytical, commonly used probability models from input data. It is intended to complement the previously distributed FITTING routine documented in RMS Report 14 (Winterstein et al., 1994), which implements relatively complex four-moment distribution models whose parameters are fit with numerical optimization routines. Although these four-moment fits can be quite useful and faithful to the observed data, their complexity can make them difficult to automate within standard fitting algorithms. In contrast, FITS provides more robust (lower moment) fits of simpler, more conventional distribution forms. For each database of interest, the routine estimates the distribution of annual maximum response based on the data values and the duration, T, over which they were recorded. To focus on the upper tails of interest, the user can also supply an arbitrary lower-bound threshold, χ_low, above which a shifted distribution model--exponential or Weibull--is fit.
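
    A minimal sketch of the tail-fitting idea (not the FITS code itself): fit a shifted exponential to exceedances above a user-supplied threshold and convert the exceedance rate and record duration into a T-year response level. The data, duration, and threshold below are placeholders.

        # Sketch: shifted-exponential tail fit above a threshold and a T-year level.
        import numpy as np

        rng = np.random.default_rng(7)
        responses = rng.exponential(scale=2.0, size=500) + 1.0   # observed response peaks
        duration_years = 10.0                                    # record duration T
        x_low = 3.0                                              # lower-bound threshold

        exceedances = responses[responses > x_low] - x_low
        rate_per_year = exceedances.size / duration_years        # mean rate of threshold crossings
        scale = exceedances.mean()                               # MLE of the exponential scale

        def t_year_level(return_period_years: float) -> float:
            # Level x solving rate_per_year * exp(-(x - x_low) / scale) = 1 / return_period.
            return x_low + scale * np.log(rate_per_year * return_period_years)

        print("50-year response level:", round(t_year_level(50.0), 2))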

  4. Estimating Recovery Failure Probabilities in Off-normal Situations from Full-Scope Simulator Data

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yochan; Park, Jinkyun; Kim, Seunghwan; Choi, Sun Yeong; Jung, Wondea [Korea Atomic Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    As part of this effort, KAERI developed the Human Reliability data EXtraction (HuREX) framework and is collecting full-scope simulator-based human reliability data into the OPERA (Operator PErformance and Reliability Analysis) database. In this study, continuing the series of estimation studies for HEPs and PSF effects, recovery failure probabilities (RFPs), which are significant information for quantitative HRA, were produced from the OPERA database. Unsafe acts can occur at any time in safety-critical systems, and operators often manage the systems by discovering their errors and eliminating or mitigating them. To model recovery processes or recovery strategies, several studies have categorized recovery behaviors. Because recent human error trends need to be considered in a human reliability analysis, the work of Jang et al. can be seen as an essential data collection effort. However, since those empirical results regarding soft controls were produced in a controlled laboratory environment with student participants, it is necessary to analyze a wide range of operator behaviors using full-scope simulators. This paper presents the statistics related to human error recovery behaviors obtained from full-scope simulations in which on-site operators participated. In this study, the recovery effects of shift changes or technical support centers were not considered owing to a lack of simulation data.

  5. Maximum Entropy Estimation of Probability Distribution of Variables in Higher Dimensions from Lower Dimensional Data.

    Science.gov (United States)

    Das, Jayajit; Mukherjee, Sayak; Hodge, Susan E

    2015-07-01

    A common statistical situation concerns inferring an unknown distribution Q(x) from a known distribution P(y), where X (dimension n) and Y (dimension m) have a known functional relationship. Most commonly, n ≤ m, and the task is relatively straightforward for well-defined functional relationships. For example, if Y1 and Y2 are independent random variables, each uniform on [0, 1], one can determine the distribution of X = Y1 + Y2; here m = 2 and n = 1. However, biological and physical situations can arise where n > m and the functional relation Y→X is non-unique. In general, in the absence of additional information, there is no unique solution to Q in those cases. Nevertheless, one may still want to draw some inferences about Q. To this end, we propose a novel maximum entropy (MaxEnt) approach that estimates Q(x) based only on the available data, namely, P(y). The method has the additional advantage that one does not need to explicitly calculate the Lagrange multipliers. In this paper we develop the approach, for both discrete and continuous probability distributions, and demonstrate its validity. We give an intuitive justification as well, and we illustrate with examples.
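
    As a toy illustration of the MaxEnt principle invoked here (not the authors' algorithm), the sketch below finds the maximum-entropy distribution on a small discrete support subject to a single moment constraint; the support and target mean are placeholders standing in for information derived from P(y).

        # Toy MaxEnt: the maximum-entropy pmf on {0,...,10} with a prescribed mean has the
        # exponential-family form q_i ∝ exp(lam * x_i); solve for lam to match the mean.
        import numpy as np
        from scipy.optimize import brentq

        x = np.arange(11)
        target_mean = 3.2   # placeholder constraint derived from the known P(y)

        def mean_gap(lam: float) -> float:
            w = np.exp(lam * x)
            q = w / w.sum()
            return float(q @ x) - target_mean

        lam = brentq(mean_gap, -10.0, 10.0)
        q = np.exp(lam * x); q /= q.sum()
        print("MaxEnt pmf:", np.round(q, 4))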

  6. Clinical radiobiology of glioblastoma multiforme. Estimation of tumor control probability from various radiotherapy fractionation schemes

    Energy Technology Data Exchange (ETDEWEB)

    Pedicini, Piernicola [I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Unit of Nuclear Medicine, Department of Radiation and Metabolic Therapies, Rionero-in-Vulture (Italy); Department of Radiation and Metabolic Therapies, I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Unit of Radiotherapy, Rionero-in-Vulture (Italy); Fiorentino, Alba [Sacro Cuore - Don Calabria Hospital, Radiation Oncology Department, Negrar, Verona (Italy); Simeon, Vittorio [I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Laboratory of Preclinical and Translational Research, Rionero-in-Vulture (Italy); Tini, Paolo; Pirtoli, Luigi [University of Siena and Tuscany Tumor Institute, Unit of Radiation Oncology, Department of Medicine Surgery and Neurological Sciences, Siena (Italy); Chiumento, Costanza [Department of Radiation and Metabolic Therapies, I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Unit of Radiotherapy, Rionero-in-Vulture (Italy); Salvatore, Marco [I.R.C.C.S. SDN Foundation, Unit of Nuclear Medicine, Napoli (Italy); Storto, Giovanni [I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Unit of Nuclear Medicine, Department of Radiation and Metabolic Therapies, Rionero-in-Vulture (Italy)

    2014-10-15

    The aim of this study was to estimate a radiobiological set of parameters from the available clinical data on glioblastoma (GB). A number of clinical trial outcomes from patients affected by GB and treated with surgery and adjuvant radiochemotherapy were analyzed to estimate a set of radiobiological parameters for a tumor control probability (TCP) model. The analytical/graphical method employed to fit the clinical data allowed us to estimate the intrinsic tumor radiosensitivity (α), repair capability (b), and repopulation doubling time (T_d) in a first phase, and subsequently the number of clonogens (N) and kick-off time for accelerated proliferation (T_k). The results were used to formulate a hypothesis for a schedule expected to significantly improve local control. The 95% confidence intervals (CI_95%) of all parameters are also discussed. The pooled analysis employed to estimate the parameters summarizes the data of 559 patients, while the studies selected to verify the results summarize data of 104 patients. The best estimates and the CI_95% are α = 0.12 Gy^-1 (0.10-0.14), b = 0.015 Gy^-2 (0.013-0.020), α/b = 8 Gy (5.0-10.8), T_d = 15.4 days (13.2-19.5), N = 1 × 10^4 (1.2 × 10^3 - 1 × 10^5), and T_k = 37 days (29-46). The dose required to offset the repopulation occurring after 1 day (D_prolif), starting after T_k, was estimated as 0.30 Gy/day (0.22-0.39). The analysis confirms a high value for the α/b ratio. Moreover, a high intrinsic radiosensitivity together with a long kick-off time for accelerated repopulation and moderate repopulation kinetics were found. The results indicate a substantial independence of the duration of the overall treatment and an improvement in the treatment effectiveness by increasing the total dose without increasing the dose per fraction. (orig.) [German abstract, translated: Estimation of a radiobiological parameter set on the basis of clinical data in]
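
    For orientation, the sketch below evaluates a standard Poisson/linear-quadratic TCP expression with repopulation using the point estimates quoted above; the exact model form used by the authors may differ in detail, and the fractionation scheme is only an illustrative example.

        # Standard Poisson/LQ tumour control probability with repopulation after T_k.
        import numpy as np

        alpha, beta = 0.12, 0.015        # Gy^-1, Gy^-2 (point estimates quoted above)
        T_d, T_k, N0 = 15.4, 37.0, 1e4   # doubling time [d], kick-off time [d], clonogens

        def tcp(n_fractions: int, dose_per_fraction: float, overall_time_days: float) -> float:
            cell_kill = n_fractions * (alpha * dose_per_fraction + beta * dose_per_fraction ** 2)
            repop = np.log(2) * max(0.0, overall_time_days - T_k) / T_d
            surviving_clonogens = N0 * np.exp(-cell_kill + repop)
            return float(np.exp(-surviving_clonogens))

        # Conventional 30 x 2 Gy over 40 days (illustrative only):
        print(round(tcp(30, 2.0, 40.0), 3))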

  7. Exact closed form expressions for outage probability of GSC receivers over Rayleigh fading channel subject to self-interference

    KAUST Repository

    Nam, Sungsik

    2010-11-01

    Previous work on performance analyses of generalized selection combining (GSC) RAKE receivers based on the signal to noise ratio focused on the development of methodologies to derive exact closed-form expressions for various performance measures. However, some open problems related to the performance evaluation of GSC RAKE receivers still remain to be solved, such as the assessment of the impact of self-interference on the performance of GSC RAKE receivers. To have a full and exact understanding of the performance of GSC RAKE receivers, the outage probability of GSC RAKE receivers needs to be expressed in closed form. The major difficulty in this problem is to derive some joint statistics of ordered exponential variates. With this motivation in mind, we capitalize in this paper on some new order statistics results to derive exact closed-form expressions for the outage probability of GSC RAKE receivers subject to self-interference over independent and identically distributed Rayleigh fading channels. © 2010 IEEE.

  8. Small-area estimation of the probability of toxocariasis in New York City based on sociodemographic neighborhood composition.

    Directory of Open Access Journals (Sweden)

    Michael G Walsh

    Full Text Available Toxocariasis is increasingly recognized as an important neglected infection of poverty (NIP in developed countries, and may constitute the most important NIP in the United States (US given its association with chronic sequelae such as asthma and poor cognitive development. Its potential public health burden notwithstanding, toxocariasis surveillance is minimal throughout the US and so the true burden of disease remains uncertain in many areas. The Third National Health and Nutrition Examination Survey conducted a representative serologic survey of toxocariasis to estimate the prevalence of infection in diverse US subpopulations across different regions of the country. Using the NHANES III surveillance data, the current study applied the predicted probabilities of toxocariasis to the sociodemographic composition of New York census tracts to estimate the local probability of infection across the city. The predicted probability of toxocariasis ranged from 6% among US-born Latino women with a university education to 57% among immigrant men with less than a high school education. The predicted probability of toxocariasis exhibited marked spatial variation across the city, with particularly high infection probabilities in large sections of Queens, and smaller, more concentrated areas of Brooklyn and northern Manhattan. This investigation is the first attempt at small-area estimation of the probability surface of toxocariasis in a major US city. While this study does not define toxocariasis risk directly, it does provide a much needed tool to aid the development of toxocariasis surveillance in New York City.

  9. Estimating Probability of Default on Peer to Peer Market – Survival Analysis Approach

    Directory of Open Access Journals (Sweden)

    Đurović Andrija

    2017-05-01

    Full Text Available Arguably a cornerstone of credit risk modelling is the probability of default. This article aims to search for evidence of a relationship between loan characteristics and the probability of default on the peer-to-peer (P2P) market. In line with that, two loan characteristics are analysed: 1) loan term length and 2) loan purpose. The analysis is conducted using a survival analysis approach within the vintage framework. Firstly, the 12-month through-the-cycle probability of default is used to compare the riskiness of the analysed loan characteristics. Secondly, the log-rank test is employed in order to compare the complete survival period of the cohorts. The findings of the paper suggest that there is clear evidence of a relationship between the analysed loan characteristics and the probability of default. Longer-term loans are riskier than shorter-term ones, and the least risky loans are those used for credit card payoff.
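
    As a purely illustrative sketch of the survival-analysis view of default (not the article's code), the snippet below computes a Kaplan-Meier estimate of the 12-month probability of default from placeholder loan durations and default indicators.

        # Kaplan-Meier estimate of the probability of default within 12 months.
        # Durations (months on book) and default flags are synthetic placeholders.
        import numpy as np

        rng = np.random.default_rng(3)
        duration = rng.integers(1, 37, size=1000)       # months observed for each loan
        defaulted = rng.random(1000) < 0.15             # True if the loan defaulted at `duration`

        survival = 1.0
        for t in range(1, 13):                          # months 1..12
            at_risk = np.sum(duration >= t)
            events = np.sum((duration == t) & defaulted)
            if at_risk > 0:
                survival *= 1.0 - events / at_risk

        print("12-month PD (Kaplan-Meier):", round(1.0 - survival, 4))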

  10. The probability estimate of the defects of the asynchronous motors based on the complex method of diagnostics

    Science.gov (United States)

    Zhukovskiy, Yu L.; Korolev, N. A.; Babanova, I. S.; Boikov, A. V.

    2017-10-01

    This article is devoted to the development of a method for estimating the probability of failure of an asynchronous motor as part of an electric drive with a frequency converter. The proposed method is based on a comprehensive diagnostic method using vibration and electrical characteristics that takes into account the quality of the supply network and the operating conditions. The developed diagnostic system makes it possible to increase the accuracy and quality of diagnoses by determining the probability of failure-free operation of the electromechanical equipment when the parameters deviate from the norm. This system uses artificial neural networks (ANNs). The outputs of the system for estimating the technical condition are probability diagrams of the technical state and a quantitative evaluation of the defects of the asynchronous motor and its components.

  11. Statistics of Natural Populations. II. Estimating an Allele Probability in Families Descended from Cryptic Mothers.

    Science.gov (United States)

    Arnold, J; Morrison, M L

    1985-04-01

    In population studies, adults are frequently difficult or inconvenient to identify for genotype, but a family profile of genotypes can be obtained from an unidentified female crossed with a single unidentified male. The problem is to estimate an allele frequency in the cryptic parental gene pool from the observed family profiles. For example, a worker may wish to estimate inversion frequencies in Drosophila; inversion karyotypes are cryptic in adults but visible in salivary gland squashes from larvae. A simple mixture model, which assumes the Hardy-Weinberg law, Mendelian laws and a single randomly chosen mate per female, provides the vehicle for studying three competing estimators of an allele frequency. A simple, heuristically appealing estimator called the Dobzhansky estimator is compared with the maximum likelihood estimator and a close relative called the grouped profiles estimator. The Dobzhansky estimator is computationally simple, consistent and highly efficient and is recommended in practice over its competitors.

  12. Methods for estimating annual exceedance-probability discharges and largest recorded floods for unregulated streams in rural Missouri.

    Science.gov (United States)

    2014-01-01

    Regression analysis techniques were used to develop a set of equations for rural ungaged stream sites for estimating discharges with 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities, which are equivalent to ann...

  13. Introduction to the life estimation of materials - Application of the theory of probability statistics of extreme values to corrosion

    Energy Technology Data Exchange (ETDEWEB)

    Kowaka, M.

    1984-01-01

    The book contains a history of the application of the statistics of extreme values to corrosion, fundamentals of statistics, the probability of corrosion phenomena, and exercises to understand the theory. The corrosion phenomena are described, and quantitative analysis of localized corrosion and life estimation of materials can be carried out using the method.

  14. Experimental estimation of the photons visiting probability profiles in time-resolved diffuse reflectance measurement.

    Science.gov (United States)

    Sawosz, P; Kacprzak, M; Weigl, W; Borowska-Solonynko, A; Krajewski, P; Zolek, N; Ciszek, B; Maniewski, R; Liebert, A

    2012-12-07

    A time-gated intensified CCD camera was applied for time-resolved imaging of light penetrating in an optically turbid medium. Spatial distributions of light penetration probability in the plane perpendicular to the axes of the source and the detector were determined at different source positions. Furthermore, visiting probability profiles of diffuse reflectance measurement were obtained by the convolution of the light penetration distributions recorded at different source positions. Experiments were carried out on homogeneous phantoms, more realistic two-layered tissue phantoms based on the human skull filled with Intralipid-ink solution and on cadavers. It was noted that the photons visiting probability profiles depend strongly on the source-detector separation, the delay between the laser pulse and the photons collection window and the complex tissue composition of the human head.

  15. Empirical estimation of the conditional probability of natech events within the United States.

    Science.gov (United States)

    Santella, Nicholas; Steinberg, Laura J; Aguirra, Gloria Andrea

    2011-06-01

    Natural disasters are the cause of a sizeable number of hazmat releases, referred to as "natechs." An enhanced understanding of natech probability, allowing for predictions of natech occurrence, is an important step in determining how industry and government should mitigate natech risk. This study quantifies the conditional probabilities of natechs at TRI/RMP and SIC 1311 facilities given the occurrence of hurricanes, earthquakes, tornadoes, and floods. During hurricanes, a higher probability of releases was observed due to storm surge (7.3 releases per 100 TRI/RMP facilities exposed vs. 6.2 for SIC 1311) compared to category 1-2 hurricane winds (5.6 TRI, 2.6 SIC 1311). Logistic regression confirms the statistical significance of the greater propensity for releases at RMP/TRI facilities, and during some hurricanes, when controlling for hazard zone. The probability of natechs at TRI/RMP facilities during earthquakes increased from 0.1 releases per 100 facilities at MMI V to 21.4 at MMI IX. The probability of a natech at TRI/RMP facilities within 25 miles of a tornado was small (∼0.025 per 100 facilities), reflecting the limited area directly affected by tornadoes. Areas inundated during flood events had a probability of 1.1 releases per 100 facilities but demonstrated widely varying natech occurrence during individual events, indicating that factors not quantified in this study such as flood depth and speed are important for predicting flood natechs. These results can inform natech risk analysis, aid government agencies responsible for planning response and remediation after natural disasters, and should be useful in raising awareness of natech risk within industry. © 2011 Society for Risk Analysis.

  16. Procedures for using expert judgment to estimate human-error probabilities in nuclear power plant operations. [PWR; BWR

    Energy Technology Data Exchange (ETDEWEB)

    Seaver, D.A.; Stillwell, W.G.

    1983-03-01

    This report describes and evaluates several procedures for using expert judgment to estimate human-error probabilities (HEPs) in nuclear power plant operations. These HEPs are currently needed for several purposes, particularly for probabilistic risk assessments. Data do not exist for estimating these HEPs, so expert judgment can provide these estimates in a timely manner. Five judgmental procedures are described here: paired comparisons, ranking and rating, direct numerical estimation, indirect numerical estimation and multiattribute utility measurement. These procedures are evaluated in terms of several criteria: quality of judgments, difficulty of data collection, empirical support, acceptability, theoretical justification, and data processing. Situational constraints such as the number of experts available, the number of HEPs to be estimated, the time available, the location of the experts, and the resources available are discussed in regard to their implications for selecting a procedure for use.

  17. Maximum a posteriori probability estimates in infinite-dimensional Bayesian inverse problems

    Science.gov (United States)

    Helin, T.; Burger, M.

    2015-08-01

    A demanding challenge in Bayesian inversion is to efficiently characterize the posterior distribution. This task is especially problematic in high-dimensional non-Gaussian problems, where the structure of the posterior can be very chaotic and difficult to analyse. Current inverse problem literature often approaches the problem by considering suitable point estimators for the task. Typically the choice is made between the maximum a posteriori (MAP) or the conditional mean (CM) estimate. The benefits of either choice are not well-understood from the perspective of infinite-dimensional theory. Most importantly, there exists no general scheme regarding how to connect the topological description of a MAP estimate to a variational problem. The recent results by Dashti and others (Dashti et al 2013 Inverse Problems 29 095017) resolve this issue for nonlinear inverse problems in the Gaussian framework. In this work we improve the current understanding by introducing a novel concept called the weak MAP (wMAP) estimate. We show that any MAP estimate in the sense of Dashti et al (2013 Inverse Problems 29 095017) is a wMAP estimate and, moreover, how the wMAP estimate connects to a variational formulation in general infinite-dimensional non-Gaussian problems. The variational formulation enables the study of many properties of the infinite-dimensional MAP estimate that were previously impossible to study. In a recent work by the authors (Burger and Lucka 2014 Maximum a posteriori estimates in linear inverse problems with logconcave priors are proper bayes estimators preprint) the MAP estimator was studied in the context of the Bayes cost method. Using Bregman distances, proper convex Bayes cost functions were introduced for which the MAP estimator is the Bayes estimator. Here, we generalize these results to the infinite-dimensional setting. Moreover, we discuss the implications of our results for some examples of prior models such as the Besov prior and hierarchical prior.
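
    For orientation, the two point estimators contrasted above have the following standard finite-dimensional definitions; the paper's contribution is extending the MAP notion rigorously to the infinite-dimensional, non-Gaussian setting, and the generic notation below is not the authors' own.

```latex
u_{\mathrm{MAP}} = \arg\max_{u}\, \pi(u \mid y), \qquad
u_{\mathrm{CM}} = \mathbb{E}\,[u \mid y] = \int u \, \pi(u \mid y)\, \mathrm{d}u
```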

  18. Estimating the Probabilities of Low-Weight Differential and Linear Approximations on PRESENT-like Ciphers

    DEFF Research Database (Denmark)

    Abdelraheem, Mohamed Ahmed

    2012-01-01

    We use large but sparse correlation and transition-difference-probability submatrices to find the best linear and differential approximations respectively on PRESENT-like ciphers. This outperforms the branch and bound algorithm when the number of low-weight differential and linear characteristics...

  19. Effects of population variability on the accuracy of detection probability estimates

    DEFF Research Database (Denmark)

    Ordonez Gloria, Alejandro

    2011-01-01

    Observing a constant fraction of the population over time, locations, or species is virtually impossible. Hence, quantifying this proportion (i.e. detection probability) is an important task in quantitative population ecology. In this study we determined, via computer simulations, the effect of...

  20. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  1. Improving accuracy and efficiency of mutual information for multi-modal retinal image registration using adaptive probability density estimation.

    Science.gov (United States)

    Legg, P A; Rosin, P L; Marshall, D; Morgan, J E

    2013-01-01

    Mutual information (MI) is a popular similarity measure for performing image registration between different modalities. MI makes a statistical comparison between two images by computing the entropy from the probability distribution of the data. Therefore, to obtain an accurate registration it is important to have an accurate estimation of the true underlying probability distribution. Within the statistics literature, many methods have been proposed for finding the 'optimal' probability density, with the aim of improving the estimation by means of optimal histogram bin size selection. This provokes the common question of how many bins should actually be used when constructing a histogram. There is no definitive answer to this. This question itself has received little attention in the MI literature, and yet this issue is critical to the effectiveness of the algorithm. The purpose of this paper is to highlight this fundamental element of the MI algorithm. We present a comprehensive study that introduces methods from the statistics literature and incorporates them into image registration. We demonstrate this work for registration of multi-modal retinal images: colour fundus photographs and scanning laser ophthalmoscope images. The registration of these modalities offers significant enhancement to early glaucoma detection; however, traditional registration techniques fail to perform sufficiently well. We find that adaptive probability density estimation strongly affects registration accuracy and runtime, improving over traditional binning techniques. Copyright © 2013 Elsevier Ltd. All rights reserved.
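
    As a concrete illustration of the quantity being tuned, the sketch below computes MI from a joint histogram, with the bin count exposed as the free parameter the abstract discusses. This is a generic histogram estimator, not the adaptive method or code used in the paper; all names and values are illustrative.

```python
import numpy as np

def mutual_information(img_a, img_b, bins=64):
    """Histogram-based mutual information between two aligned images.

    The bin count is the free parameter discussed in the abstract:
    too few bins oversmooth the joint density, too many leave it noisy.
    """
    joint_hist, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p_ab = joint_hist / joint_hist.sum()           # joint probability estimate
    p_a = p_ab.sum(axis=1, keepdims=True)          # marginal of image A
    p_b = p_ab.sum(axis=0, keepdims=True)          # marginal of image B
    nonzero = p_ab > 0
    return float(np.sum(p_ab[nonzero] * np.log(p_ab[nonzero] / (p_a @ p_b)[nonzero])))

# Illustrative use: MI of a synthetic image with a noisy copy of itself
rng = np.random.default_rng(0)
img = rng.random((128, 128))
noisy = img + 0.1 * rng.standard_normal(img.shape)
print(mutual_information(img, noisy, bins=32))
```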

  2. Clinician gestalt estimate of pretest probability for acute coronary syndrome and pulmonary embolism in patients with chest pain and dyspnea.

    Science.gov (United States)

    Kline, Jeffrey A; Stubblefield, William B

    2014-03-01

    Pretest probability helps guide diagnostic testing for patients with suspected acute coronary syndrome and pulmonary embolism. Pretest probability derived from the clinician's unstructured gestalt estimate is easier and more readily available than methods that require computation. We compare the diagnostic accuracy of physician gestalt estimate for the pretest probability of acute coronary syndrome and pulmonary embolism with a validated, computerized method. This was a secondary analysis of a prospectively collected, multicenter study. Patients (N=840) had chest pain, dyspnea, nondiagnostic ECGs, and no obvious diagnosis. Clinician gestalt pretest probability for both acute coronary syndrome and pulmonary embolism was assessed by visual analog scale and from the method of attribute matching using a Web-based computer program. Patients were followed for outcomes at 90 days. Clinicians had significantly higher estimates than attribute matching for both acute coronary syndrome (17% versus 4%) and pulmonary embolism; on receiver operating curve analysis, clinician gestalt estimates were as accurate as the computerized method for pulmonary embolism but not for acute coronary syndrome. Copyright © 2013 American College of Emergency Physicians. Published by Mosby, Inc. All rights reserved.

  3. Geospatial tools effectively estimate nonexceedance probabilities of daily streamflow at ungauged and intermittently gauged locations in Ohio

    Science.gov (United States)

    Farmer, William H.; Koltun, Greg

    2017-01-01

    Study region: The state of Ohio in the United States, a humid, continental climate. Study focus: The estimation of nonexceedance probabilities of daily streamflows as an alternative means of establishing the relative magnitudes of streamflows associated with hydrologic and water-quality observations. New hydrological insights for the region: Several methods for estimating nonexceedance probabilities of daily mean streamflows are explored, including single-index methodologies (nearest-neighboring index) and geospatial tools (kriging and topological kriging). These methods were evaluated by conducting leave-one-out cross-validations based on analyses of nearly 7 years of daily streamflow data from 79 unregulated streamgages in Ohio and neighboring states. The pooled, ordinary kriging model, with a median Nash–Sutcliffe performance of 0.87, was superior to the single-site index methods, though there was some bias in the tails of the probability distribution. Incorporating network structure through topological kriging did not improve performance. The pooled, ordinary kriging model was applied to 118 locations without systematic streamgaging across Ohio where instantaneous streamflow measurements had been made concurrent with water-quality sampling on at least 3 separate days. Spearman rank correlations between estimated nonexceedance probabilities and measured streamflows were high, with a median value of 0.76. In consideration of application, the degree of regulation in a set of sample sites helped to specify the streamgages required to implement kriging approaches successfully.
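
    At a gauged site, the target quantity is straightforward to compute with an empirical plotting position, as sketched below; the study's contribution is transferring such estimates to ungauged locations by kriging, which this sketch does not attempt. Values are illustrative.

```python
import numpy as np
from scipy.stats import rankdata

def nonexceedance_probabilities(daily_flows):
    """Empirical nonexceedance probability of each daily mean streamflow at a
    gauged site, using the Weibull plotting position rank / (n + 1).
    """
    flows = np.asarray(daily_flows, dtype=float)
    ranks = rankdata(flows, method="average")   # rank 1 = smallest flow
    return ranks / (len(flows) + 1)

flows_cfs = [120.0, 85.0, 85.0, 430.0, 60.0, 210.0, 95.0]   # illustrative daily flows
print(nonexceedance_probabilities(flows_cfs))
```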

  4. Estimation of Subjective Stress in Acute Myocardial Infarction

    Directory of Open Access Journals (Sweden)

    Chockalingam A

    2003-01-01

    Full Text Available BACKGROUND AND AIMS: Mental stress is considered to be a precipitating factor in acute coronary events. We aimed to assess the association of subjective or 'perceived' mental stress with the occurrence of acute coronary events. SETTINGS AND DESIGN: A prospective case-control survey was carried out in a referral teaching hospital. SUBJECTS & METHODS: Consecutive patients with acute myocardial infarction and ST elevation on electrocardiogram who were admitted to the Coronary Care Unit of a referral teaching hospital were enrolled in the study as cases. Controls were unmatched and were enrolled from amongst patients with coronary artery disease who did not have recent acute coronary events. Subjective Stress Functional Classification (SS-FC) for the preceding 2-4 weeks was assessed and assigned four grades from I to IV as follows: I - baseline, II - more than usual but not affecting daily routine, III - significantly high stress affecting daily routine, and IV - worst stress in life. STATISTICAL ANALYSIS: Proportions of different characteristics were compared using the chi-square test with Yates continuity correction. Student's unpaired t test was applied for mean age. A 'p' value of < 0.05 was considered statistically significant. RESULTS: SS-FC could be reliably (99%) and easily assessed. Eighty (53%) of the total 150 patients with acute MI reported 'high' levels of stress (stress class III and IV). This is in contrast to only 30 (20%) of 150 healthy controls reporting high stress for the same period (p value < 0.001). CONCLUSION: Patients with acute myocardial infarction report a higher subjective mental stress during the 2 to 4 weeks preceding the acute coronary event.

  5. Estimating a Logistic Discrimination Functions When One of the Training Samples Is Subject to Misclassification: A Maximum Likelihood Approach.

    Directory of Open Access Journals (Sweden)

    Nico Nagelkerke

    Full Text Available The problem of discrimination and classification is central to much of epidemiology. Here we consider the estimation of a logistic regression/discrimination function from training samples, when one of the training samples is subject to misclassification or mislabeling, e.g. diseased individuals are incorrectly classified/labeled as healthy controls. We show that this leads to a zero-inflated binomial model with a defective logistic regression or discrimination function, whose parameters can be estimated using standard statistical methods such as maximum likelihood. These parameters can be used to estimate the probability of true group membership among those, possibly erroneously, classified as controls. Two examples are analyzed and discussed. A simulation study explores properties of the maximum likelihood parameter estimates and the estimates of the number of mislabeled observations.

  6. FUSE BEE: Fusion of Subjective Opinions through Behavior Estimation

    Science.gov (United States)

    2017-02-01

    estimated behaviors of sources. Through extensive simulations, we have shown that our approach has a lower computational complexity and achieves ... If the source is dishonest, it may not share its genuine opinion; instead, it may provide a random opinion (or an opinion uncorrelated with the ... that the proposed approach achieves a very low error rate of around 0.01 when more than 80 sources are queried. For a lower number of sources, the fusion

  7. An Empirical Model for Estimating the Probability of Electrical Short Circuits from Tin Whiskers. Part 2

    Science.gov (United States)

    Courey, Karim; Wright, Clara; Asfour, Shihab; Onar, Arzu; Bayliss, Jon; Ludwig, Larry

    2009-01-01

    In this experiment, an empirical model to quantify the probability of occurrence of an electrical short circuit from tin whiskers as a function of voltage was developed. This empirical model can be used to improve existing risk simulation models. FIB and TEM images of a tin whisker confirm the rare polycrystalline structure on one of the three whiskers studied. FIB cross-section of the card guides verified that the tin finish was bright tin.

  8. Survival probabilities of loggerhead sea turtles (Caretta caretta) estimated from capture-mark-recapture data in the Mediterranean Sea

    Directory of Open Access Journals (Sweden)

    Paolo Casale

    2007-06-01

    Full Text Available Survival probabilities of loggerhead sea turtles (Caretta caretta) are estimated for the first time in the Mediterranean by analysing 3254 tagging and 134 re-encounter data from this region. Most of these turtles were juveniles found at sea. Re-encounters were live resightings and dead recoveries and data were analysed with Barker’s model, a modified version of the Cormack-Jolly-Seber model which can combine recapture, live resighting and dead recovery data. An annual survival probability of 0.73 (CI 95% = 0.67-0.78; n=3254) was obtained, and should be considered as a conservative estimate due to an unknown, though not negligible, tag loss rate. This study makes a preliminary estimate of the survival probabilities of in-water developmental stages for the Mediterranean population of endangered loggerhead sea turtles and provides the first insights into the magnitude of the suspected human-induced mortality in the region. The model used here for the first time on sea turtles could be used to obtain survival estimates from other data sets with few or no true recaptures but with other types of re-encounter data, which are a common output of tagging programmes involving these wide-ranging animals.

  9. Convergence estimates in probability and in expectation for discrete least squares with noisy evaluations at random points

    KAUST Repository

    Migliorati, Giovanni

    2015-08-28

    We study the accuracy of the discrete least-squares approximation on a finite dimensional space of a real-valued target function from noisy pointwise evaluations at independent random points distributed according to a given sampling probability measure. The convergence estimates are given in mean-square sense with respect to the sampling measure. The noise may be correlated with the location of the evaluation and may have nonzero mean (offset). We consider both cases of bounded or square-integrable noise / offset. We prove conditions between the number of sampling points and the dimension of the underlying approximation space that ensure a stable and accurate approximation. Particular focus is on deriving estimates in probability within a given confidence level. We analyze how the best approximation error and the noise terms affect the convergence rate and the overall confidence level achieved by the convergence estimate. The proofs of our convergence estimates in probability use arguments from the theory of large deviations to bound the noise term. Finally we address the particular case of multivariate polynomial approximation spaces with any density in the beta family, including uniform and Chebyshev.

  10. Use of portable antennas to estimate abundance of PIT-tagged fish in small streams: Factors affecting detection probability

    Science.gov (United States)

    O'Donnell, Matthew J.; Horton, Gregg E.; Letcher, Benjamin H.

    2010-01-01

    Portable passive integrated transponder (PIT) tag antenna systems can be valuable in providing reliable estimates of the abundance of tagged Atlantic salmon Salmo salar in small streams under a wide range of conditions. We developed and employed PIT tag antenna wand techniques in two controlled experiments and an additional case study to examine the factors that influenced our ability to estimate population size. We used Pollock's robust-design capture–mark–recapture model to obtain estimates of the probability of first detection (p), the probability of redetection (c), and abundance (N) in the two controlled experiments. First, we conducted an experiment in which tags were hidden in fixed locations. Although p and c varied among the three observers and among the three passes that each observer conducted, the estimates of N were identical to the true values and did not vary among observers. In the second experiment using free-swimming tagged fish, p and c varied among passes and time of day. Additionally, estimates of N varied between day and night and among age-classes but were within 10% of the true population size. In the case study, we used the Cormack–Jolly–Seber model to examine the variation in p, and we compared counts of tagged fish found with the antenna wand with counts collected via electrofishing. In that study, we found that although p varied for age-classes, sample dates, and time of day, antenna and electrofishing estimates of N were similar, indicating that population size can be reliably estimated via PIT tag antenna wands. However, factors such as the observer, time of day, age of fish, and stream discharge can influence the initial and subsequent detection probabilities.

  11. On Using Maximum a Posteriori Probability Based on a Bayesian Model for Oscillometric Blood Pressure Estimation

    Directory of Open Access Journals (Sweden)

    Soojeong Lee

    2013-10-01

    Full Text Available The maximum amplitude algorithm (MAA) is generally utilized in the estimation of the pressure values, and it uses heuristically obtained ratios of systolic and diastolic oscillometric amplitude to the mean arterial pressure (known as systolic and diastolic ratios) in order to estimate the systolic and diastolic pressures. This paper proposes a Bayesian model to estimate the systolic and diastolic ratios. These ratios are an improvement over the single fixed systolic and diastolic ratios used in the algorithms that are available in the literature. The proposed method shows lower mean difference (MD) with standard deviation (SD) compared to the MAA for both SBP and DBP consistently in all the five measurements.

  12. On Using Maximum a Posteriori Probability Based on a Bayesian Model for Oscillometric Blood Pressure Estimation

    Science.gov (United States)

    Lee, Soojeong; Jeon, Gwanggil; Lee, Gangseong

    2013-01-01

    The maximum amplitude algorithm (MAA) is generally utilized in the estimation of the pressure values, and it uses heuristically obtained ratios of systolic and diastolic oscillometric amplitude to the mean arterial pressure (known as systolic and diastolic ratios) in order to estimate the systolic and diastolic pressures. This paper proposes a Bayesian model to estimate the systolic and diastolic ratios. These ratios are an improvement over the single fixed systolic and diastolic ratios used in the algorithms that are available in the literature. The proposed method shows lower mean difference (MD) with standard deviation (SD) compared to the MAA for both SBP and DBP consistently in all the five measurements. PMID:24152924
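
    For readers unfamiliar with the baseline being improved upon, the sketch below shows a fixed-ratio MAA. The ratio values, signal names, and toy deflation curve are illustrative placeholders; the paper's point is precisely that such fixed ratios are replaced by Bayesian estimates.

```python
import numpy as np

def maa_blood_pressure(cuff_pressure, osc_amplitude, sys_ratio=0.55, dia_ratio=0.75):
    """Fixed-ratio maximum amplitude algorithm (MAA), for orientation only.

    cuff_pressure and osc_amplitude are samples taken during cuff deflation
    (pressure decreasing over time). The sys_ratio/dia_ratio defaults are
    illustrative, not the ratios used in the paper.
    """
    cuff_pressure = np.asarray(cuff_pressure, dtype=float)
    osc_amplitude = np.asarray(osc_amplitude, dtype=float)
    i_map = int(np.argmax(osc_amplitude))             # mean arterial pressure at peak amplitude
    a_max = osc_amplitude[i_map]
    # systolic: on the high-pressure side of the peak, amplitude closest to sys_ratio * a_max
    i_sys = int(np.argmin(np.abs(osc_amplitude[:i_map + 1] - sys_ratio * a_max)))
    # diastolic: on the low-pressure side of the peak, amplitude closest to dia_ratio * a_max
    i_dia = i_map + int(np.argmin(np.abs(osc_amplitude[i_map:] - dia_ratio * a_max)))
    return cuff_pressure[i_sys], cuff_pressure[i_map], cuff_pressure[i_dia]

# Toy deflation curve: pressures 180 -> 40 mmHg with a bell-shaped oscillation envelope
pressures = np.linspace(180, 40, 141)
amplitudes = np.exp(-((pressures - 95) / 25) ** 2)
print(maa_blood_pressure(pressures, amplitudes))      # (systolic, MAP, diastolic) estimates
```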

  13. Bistatic-radar estimation of surface-slope probability distributions with applications to the moon.

    Science.gov (United States)

    Parker, M. N.; Tyler, G. L.

    1973-01-01

    A method for extracting surface-slope frequency distributions from bistatic-radar data has been developed and applied to the lunar surface. Telemetry transmissions from orbiting Apollo spacecraft were received on the earth after reflection from the lunar surface. The echo-frequency spectrum was related analytically to the probability distribution of lunar slopes. Standard regression techniques were used to solve the inverse problem of finding slope distributions from observed echo-frequency spectra. Data taken simultaneously at two wavelengths, 13 and 116 cm, have yielded diverse slope statistics.

  14. Estimating the benefits of single value and probability forecasting for flood warning

    Directory of Open Access Journals (Sweden)

    J. S. Verkade

    2011-12-01

    Full Text Available Flood risk can be reduced by means of flood forecasting, warning and response systems (FFWRS). These systems include a forecasting sub-system which is imperfect, meaning that inherent uncertainties in hydrological forecasts may result in false alarms and missed events. This forecasting uncertainty decreases the potential reduction of flood risk, but is seldom accounted for in estimates of the benefits of FFWRSs. In the present paper, a method to estimate the benefits of (imperfect) FFWRSs in reducing flood risk is presented. The method is based on a hydro-economic model of expected annual damage (EAD) due to flooding, combined with the concept of Relative Economic Value (REV). The estimated benefits include not only the reduction of flood losses due to a warning response, but also consider the costs of the warning response itself, as well as the costs associated with forecasting uncertainty. The method allows for estimation of the benefits of FFWRSs that use either deterministic or probabilistic forecasts. Through application to a case study, it is shown that FFWRSs using a probabilistic forecast have the potential to realise higher benefits at all lead-times. However, it is also shown that provision of warning at increasing lead-time does not necessarily lead to an increasing reduction of flood risk, but rather that an optimal lead-time at which warnings are provided can be established as a function of forecast uncertainty and the cost-loss ratio of the user receiving and responding to the warning.
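
    The cost-loss framing referred to above can be made concrete with the standard relative economic value calculation. The sketch below is a textbook cost-loss computation, not the paper's hydro-economic EAD model, and all numbers are illustrative.

```python
def relative_economic_value(hit_rate, false_alarm_rate, base_rate, cost_loss_ratio):
    """Relative economic value (REV) of a warning system in the standard
    static cost-loss framework.

    cost_loss_ratio = C/L, the cost of responding to a warning divided by the
    loss avoided when a flood is correctly anticipated.
    """
    r, s = cost_loss_ratio, base_rate
    # expected relative expense of the forecast-based strategy
    e_forecast = false_alarm_rate * (1 - s) * r + hit_rate * s * r + (1 - hit_rate) * s
    e_perfect = s * r                   # respond only when a flood actually occurs
    e_climate = min(r, s)               # best of "always respond" (r) or "never respond" (s)
    return (e_climate - e_forecast) / (e_climate - e_perfect)

# Example: 80% hit rate, 10% false alarm rate, event base rate 5%, C/L = 0.2
print(relative_economic_value(0.8, 0.1, 0.05, cost_loss_ratio=0.2))
```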

  15. EVALUATING PROBABILITY SAMPLING STRATEGIES FOR ESTIMATING REDD COUNTS: AN EXAMPLE WITH CHINOOK SALMON (Oncorhynchus tshawytscha)

    Science.gov (United States)

    Precise, unbiased estimates of population size are an essential tool for fisheries management. For a wide variety of salmonid fishes, redd counts from a sample of reaches are commonly used to monitor annual trends in abundance. Using a 9-year time series of georeferenced censuses...

  16. Estimated Probability of Traumatic Abdominal Injury During an International Space Station Mission

    Science.gov (United States)

    Lewandowski, Beth E.; Brooker, John E.; Weaver, Aaron S.; Myers, Jerry G., Jr.; McRae, Michael P.

    2013-01-01

    The Integrated Medical Model (IMM) is a decision support tool that is useful to spaceflight mission planners and medical system designers when assessing risks and optimizing medical systems. The IMM project maintains a database of medical conditions that could occur during a spaceflight. The IMM project is in the process of assigning an incidence rate, the associated functional impairment, and a best and a worst case end state for each condition. The purpose of this work was to develop the IMM Abdominal Injury Module (AIM). The AIM calculates an incidence rate of traumatic abdominal injury per person-year of spaceflight on the International Space Station (ISS). The AIM was built so that the probability of traumatic abdominal injury during one year on ISS could be predicted. This result will be incorporated into the IMM Abdominal Injury Clinical Finding Form and used within the parent IMM model.

  17. Maximum likelihood estimates with order restrictions on probabilities and odds ratios: A geometric programming approach

    Directory of Open Access Journals (Sweden)

    D. L. Bricker

    1997-01-01

    Full Text Available The problem of assigning cell probabilities to maximize a multinomial likelihood with order restrictions on the probabilities and/or restrictions on the local odds ratios is modeled as a posynomial geometric program (GP), a class of nonlinear optimization problems with a well-developed duality theory and collection of algorithms. (Local odds ratios provide a measure of association between categorical random variables.) A constrained multinomial MLE example from the literature is solved, and the quality of the solution is compared with that obtained by the iterative method of El Barmi and Dykstra, which is based upon Fenchel duality. Exploiting the proximity of the GP model of MLE problems to linear programming (LP) problems, we also describe as an alternative, in the absence of special-purpose GP software, an easily implemented successive LP approximation method for solving this class of MLE problems using one of the readily available LP solvers.

  18. Sensitivity of Reliability Estimates in Partially Damaged RC Structures subject to Earthquakes, using Reduced Hysteretic Models

    DEFF Research Database (Denmark)

    Iwankiewicz, R.; Nielsen, Søren R. K.; Skjærbæk, P. S.

    The subject of the paper is the investigation of the sensitivity of structural reliability estimation by a reduced hysteretic model for a reinforced concrete frame under an earthquake excitation.

  19. ESTIMATION OF PILOTING ERROR STATISTICS AND PROBABILITY OF SATELLITE NAVIGATION SYSTEM FAILURE

    Directory of Open Access Journals (Sweden)

    V. L. Kuznetsov

    2014-01-01

    Full Text Available A new approach to the problem of co-processing data about the aircraft position from satellite navigation systems and the secondary surveillance radar system has been developed. The purpose of the task is to obtain estimates of the error distributions of control systems and of piloting errors. The possibility of a statistical relationship between piloting errors and satellite navigation system errors is taken into account.

  20. Odds and Probabilities Estimation for the Survival of Breast Cancer Patients with Cancer Stages 2 & 3

    Directory of Open Access Journals (Sweden)

    Urrutia Jackie D.

    2016-01-01

    Full Text Available Breast cancer is one of the leading causes of death in the Philippines. One out of four patients diagnosed with breast cancer dies within the first five years, no less than 40 percent die within 10 years, and the toll continues to rise over time. It is therefore very important to know the factors that can improve patients' survival. The purpose of this study is to identify the best possible treatment or combination of treatments. The researchers considered four independent variables, namely: Completed Surgery, Completed Chemotherapy, Completed Hormonotherapy and Completed Radiotherapy. The study was limited to 160 patients with stage 2 and 135 patients with stage 3 disease, for a total of 295 patients, using data gathered from three hospitals in Metro Manila. The names of the hospitals were not disclosed due to data confidentiality. Logistic regression analysis was used to identify the best treatment or combination of treatments and to estimate patients' odds, probabilities, and odds ratios.

  1. The probability estimation of the electronic lesson implementation taking into account software reliability

    Science.gov (United States)

    Gurov, V. V.

    2017-01-01

    Software tools for educational purposes, such as e-lessons and computer-based testing systems, have a number of distinctive features from the point of view of reliability. The main ones are the need to ensure a sufficiently high probability of faultless operation for a specified time, and the impossibility of rapid recovery by replacing the tool with a similar working program during class. The article considers the peculiarities of evaluating the reliability of programs, in contrast to assessments of hardware reliability. The basic requirements for the reliability of software used to carry out practical and laboratory classes in the form of computer-based training programs are given. A mathematical tool based on Markov chains is presented that allows the degree of debugging of the training program to be determined from the interaction graph of its software modules, for use in the educational process.

  2. ESTIMATION OF BANKRUPTCY PROBABILITIES BY USING FUZZY LOGIC AND MERTON MODEL: AN APPLICATION ON USA COMPANIES

    Directory of Open Access Journals (Sweden)

    Çiğdem ÖZARİ

    2018-01-01

    Full Text Available In this study we develop a new index called the Fuzzy-bankruptcy index. The aim of this index is to find the default probability of any company X, independent of the sector to which it belongs. Fuzzy logic is used to express how financial ratios change over time and across sectors, and the new index is designed to reduce the relativity of financial ratios. Four of the five main input variables used in the fuzzy process are chosen from factor analysis and clustering, and the last input variable is calculated from the Merton model. Analysis of past corporate defaults points to different causes, such as managerial arrogance, fraud and managerial mistakes, that are responsible for the very poor endings of prestigious companies like Enron and K-Mart. Because of such situations, we try to design a model that gives a better view of a company's financial position, which could prevent credit loan companies from investing in the wrong company and possibly losing their entire investment, using our Fuzzy-bankruptcy index.
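
    The Merton-model input mentioned above is commonly computed as a distance to default; a minimal sketch of the textbook form follows. Parameter values are illustrative, and the paper may use a different calibration or variant.

```python
from math import log, sqrt
from statistics import NormalDist

def merton_default_probability(asset_value, debt_face_value, asset_volatility, mu, horizon=1.0):
    """Default probability under the standard Merton structural model:
    PD = N(-d2), where d2 is the distance to default. Textbook form only,
    not necessarily the exact variant used in the paper.
    """
    d2 = (log(asset_value / debt_face_value)
          + (mu - 0.5 * asset_volatility ** 2) * horizon) / (asset_volatility * sqrt(horizon))
    return NormalDist().cdf(-d2)

# Example: assets worth 120 against debt of 100, 25% asset volatility, 5% drift, 1-year horizon
print(merton_default_probability(120.0, 100.0, 0.25, 0.05))
```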

  3. Probable maximum precipitation 24 hours estimation: A case study of Zanjan province of Iran

    Directory of Open Access Journals (Sweden)

    Azim Shirdeli

    2012-10-01

    Full Text Available One of the primary concerns in designing civil structures such as water storage dams and irrigation and drainage networks is to find an economic scale based on the possibility of natural incidents such as floods, earthquakes, etc. Probable maximum precipitation (PMP) is one of the well-known methods that help design a civil structure properly. In this paper, we study the maximum one-day precipitation using 17 to 50 years of records at 13 stations located in the province of Zanjan, Iran. The study uses two Hershfield methods: the first one yields values of 18.17 to 18.48, with PMP24 between 170.14 mm and 255.28 mm. The second method gives values between 2.29 and 4.95, with PMP24 between 62.33 mm and 92.08 mm. In addition, when the out-of-range data were removed from the second method, values between 2.29 and 4.31 were obtained, with PMP24 between 76.08 mm and 117.28 mm. The preliminary results indicate that the second Hershfield method provides more stable results than the first one.
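
    The Hershfield estimate named above follows a simple frequency-factor formula; a minimal sketch is given below, assuming the frequency factor K_m is supplied (the two variants in the abstract differ mainly in how that factor is obtained). The sample series is invented for illustration.

```python
import numpy as np

def hershfield_pmp(annual_max_series, k_m):
    """Hershfield estimate of probable maximum precipitation (basic form):
    PMP = mean + K_m * standard deviation of the annual maximum series.
    """
    x = np.asarray(annual_max_series, dtype=float)
    return x.mean() + k_m * x.std(ddof=1)

# Illustrative annual maximum one-day rainfalls (mm) for a single station
series = [34.0, 41.5, 28.2, 55.0, 47.3, 39.8, 60.1, 33.4]
print(hershfield_pmp(series, k_m=4.0))
```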

  4. Simultaneous estimation of b-values and detection rates of earthquakes for the application to aftershock probability forecasting

    Science.gov (United States)

    Katsura, K.; Ogata, Y.

    2004-12-01

    Reasenberg and Jones [Science, 1989, 1994] proposed aftershock probability forecasting based on the joint distribution [Utsu, J. Fac. Sci. Hokkaido Univ., 1970] of the modified Omori formula of aftershock decay and the Gutenberg-Richter law of magnitude frequency, where the respective parameters are estimated by the maximum likelihood method [Ogata, J. Phys. Earth, 1983; Utsu, Geophys Bull. Hokkaido Univ., 1965; Aki, Bull. Earthq. Res. Inst., 1965]. The public forecast has been implemented by the responsible agencies in California and Japan. However, a considerable difficulty in the above procedure is that, due to the contamination of arriving seismic waves, the detection rate of aftershocks is extremely low during the period immediately after the main shock, say, during the first day, when the forecasting is most critical for the public in the affected area. Therefore, for forecasting a probability during such a period, they adopt a generic model with a set of standard parameter values for California or Japan. For an effective and realistic estimation, I propose to utilize the statistical model introduced by Ogata and Katsura [Geophys. J. Int., 1993] for the simultaneous estimation of the b-values of the Gutenberg-Richter law together with the detection rate (probability) of earthquakes in each magnitude band from the provided data of all detected events, where both parameters are allowed to change in time. Thus, by using all detected aftershocks from the beginning of the period, we can estimate the underlying modified Omori rate of both detected and undetected events and their b-value changes, taking the time-varying missing rates of events into account. A similar computation is applied to the ETAS model for complex aftershock activity or regional seismicity where substantial missing events are expected immediately after a large aftershock or another strong earthquake in the vicinity. Demonstrations of the present procedure will be shown for the recent examples
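
    The empirical laws being combined are standard; in the usual notation (not necessarily the authors' exact parameterization), the Gutenberg-Richter law, the modified Omori decay, and the Reasenberg-Jones aftershock rate they imply are:

```latex
\log_{10} N(M) = a - bM, \qquad
n(t) = \frac{K}{(t + c)^{p}}, \qquad
\lambda(t, M) = \frac{10^{\,a + b\,(M_{m} - M)}}{(t + c)^{p}}
```

    where M_m is the main-shock magnitude and λ(t, M) is the rate of aftershocks of magnitude M or larger at time t after the main shock.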

  5. Subjective Versus Objective Estimated Cardiovascular Disease Risk and Adherence to Physical Activity in African American Women.

    Science.gov (United States)

    Robinson, Nadia; Miller, Arlene; Wilbur, JoEllen; Fogg, Louis

    2017-07-18

    Cardiovascular disease (CVD) is the leading cause of death for African American (AA) women in the United States. Despite high prevalence of CVD risk factors, AA women perceive their CVD risk as low. Our objectives were to (1) identify relationships between subjective (self-reported perceived) CVD risk and objective CVD risk estimated by the American College of Cardiology/American Heart Association atherosclerotic CVD (ASCVD) risk estimator, (2) identify demographic and psychosocial factors associated with subjective perceived risk and discrepancy with objective estimated CVD risk, and (3) determine whether subjective perceived CVD risk was associated with physical activity (PA) adherence. This was a secondary data analysis of data collected from a 12-month lifestyle PA intervention conducted with 281 AA women. Subjective perceived CVD risk was measured by 1 question; objective estimated CVD risk was calculated using the ASCVD score. Women were categorized by congruence or discrepancy between subjective perceived and objective estimated CVD risk. Subjective perceived CVD risk and objective ASCVD risk scores were both low. Approximately 20% subjectively perceived their risk as lower than objective ASCVD scores. Atherosclerotic CVD risk discrepancy groups differed by depressed mood symptoms. Participants reported many perceived barriers to PA. Perceived CVD risk was not related to PA adherence. The significance of associated CVD risk factors may be underestimated by AA women, leading to discrepancy between subjective and objective risk estimates. Research is needed to clarify relationships among perceived risk, estimated risk using risk calculators such as ASCVD, and health behavior.

  6. An Empirical Model for Estimating the Probability of Electrical Short Circuits from Tin Whiskers-Part I

    Science.gov (United States)

    Courey, Karim; Wright, Clara; Asfour, Shihab; Bayliss, Jon; Ludwig, Larry

    2008-01-01

    Existing risk simulations make the assumption that when a free tin whisker has bridged two adjacent exposed electrical conductors, the result is an electrical short circuit. This conservative assumption is made because shorting is a random event that has a currently unknown probability associated with it. Due to contact resistance, electrical shorts may not occur at lower voltage levels. In this experiment, we study the effect of varying voltage on the breakdown of the contact resistance which leads to a short circuit. From this data, we can estimate the probability of an electrical short, as a function of voltage, given that a free tin whisker has bridged two adjacent exposed electrical conductors. In addition, three tin whiskers grown from the same Space Shuttle Orbiter card guide used in the aforementioned experiment were cross sectioned and studied using a focused ion beam (FIB).

  7. Estimation of default probability for corporate entities in Republic of Serbia

    Directory of Open Access Journals (Sweden)

    Vujnović Miloš

    2016-01-01

    Full Text Available In this paper a quantitative PD model has been developed according to the Basel Capital Accord standards. The modeling dataset is based on financial statements information from the Republic of Serbia. The goal of the paper is to develop a credit scoring model capable of producing a PD estimate with high predictive power on a sample of corporate entities. The modeling is based on 5 years of end-of-year financial statements data of available Serbian corporate entities. A weight of evidence (WOE) approach has been applied to quantitatively transform and prepare the financial ratios. Correlation analysis has been utilized to reduce the long list of variables and to remove highly interdependent variables from the training and validation datasets. Following best banking practice and the academic literature, the final model is obtained using adjusted stepwise logistic regression. The final proposed model and its financial ratio constituents are discussed and benchmarked against examples from the relevant academic literature.
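
    The weight-of-evidence transformation mentioned above has a standard form; the sketch below applies it to a toy binned ratio. Column names and data are illustrative, not drawn from the Serbian dataset.

```python
import numpy as np
import pandas as pd

def weight_of_evidence(df, feature_bin, target):
    """Weight of evidence per bin of a binned financial ratio:
    WOE = ln( share of non-defaulters in bin / share of defaulters in bin ).
    target is 1 for default, 0 otherwise.
    """
    grouped = df.groupby(feature_bin)[target].agg(total="count", bad="sum")
    grouped["good"] = grouped["total"] - grouped["bad"]
    dist_good = grouped["good"] / grouped["good"].sum()
    dist_bad = grouped["bad"] / grouped["bad"].sum()
    return np.log(dist_good / dist_bad)

# Illustrative use with a binned liquidity ratio (hypothetical column names)
data = pd.DataFrame({
    "liquidity_bin": ["low", "low", "low", "mid", "mid", "high", "high", "high"],
    "default": [1, 1, 0, 1, 0, 0, 0, 1],
})
print(weight_of_evidence(data, "liquidity_bin", "default"))
```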

  8. A population-based tissue probability map-driven level set method for fully automated mammographic density estimations.

    Science.gov (United States)

    Kim, Youngwoo; Hong, Byung Woo; Kim, Seung Ja; Kim, Jong Hyo

    2014-07-01

    A major challenge when distinguishing glandular tissues on mammograms, especially for area-based estimations, lies in determining a boundary on a hazy transition zone from adipose to glandular tissues. This stems from the nature of mammography, which is a projection of superimposed tissues consisting of different structures. In this paper, the authors present a novel segmentation scheme which incorporates the learned prior knowledge of experts into a level set framework for fully automated mammographic density estimations. The authors modeled the learned knowledge as a population-based tissue probability map (PTPM) that was designed to capture the classification of experts' visual systems. The PTPM was constructed using an image database of a selected population consisting of 297 cases. Three mammogram experts extracted regions for dense and fatty tissues on digital mammograms, which was an independent subset used to create a tissue probability map for each ROI based on its local statistics. This tissue class probability was taken as a prior in the Bayesian formulation and was incorporated into a level set framework as an additional term to control the evolution and followed the energy surface designed to reflect experts' knowledge as well as the regional statistics inside and outside of the evolving contour. A subset of 100 digital mammograms, which was not used in constructing the PTPM, was used to validate the performance. The energy was minimized when the initial contour reached the boundary of the dense and fatty tissues, as defined by experts. The correlation coefficient between mammographic density measurements made by experts and measurements by the proposed method was 0.93, while that with the conventional level set was 0.47. The proposed method showed a marked improvement over the conventional level set method in terms of accuracy and reliability. This result suggests that the proposed method successfully incorporated the learned knowledge of the experts

  9. Bayesian probability estimates are not necessary to make choices satisfying Bayes’ rule in elementary situations

    Directory of Open Access Journals (Sweden)

    Artur eDomurat

    2015-08-01

    Full Text Available This paper has two aims. First, we investigate how often people make choices conforming to Bayes’ rule when natural sampling is applied. Second, we show that using Bayes’ rule is not necessary to make choices satisfying Bayes’ rule. Simpler methods, even fallacious heuristics, might prescribe correct choices reasonably often under specific circumstances. We considered elementary situations with binary sets of hypotheses and data. We adopted an ecological approach and prepared two-stage computer tasks resembling natural sampling. Probabilistic relations were to be inferred from a set of pictures, followed by a choice between the data which was made to maximize a chance for a preferred outcome. Using Bayes’ rule was deduced indirectly from choices. Study 1 (N=60) followed a 2 (gender: female vs. male) x 2 (education: humanities vs. pure sciences) between-subjects factorial design with balanced cells, and a number of correct choices as a dependent variable. Choices satisfying Bayes’ rule were dominant. To investigate ways of making choices more directly, we replicated Study 1, adding a task with a verbal report. In Study 2 (N=76) choices conforming to Bayes’ rule dominated again. However, the verbal reports revealed use of a new, non-inverse rule, which always renders correct choices, but is easier than Bayes’ rule to apply. It does not require inversing conditions (transforming P(H) and P(D|H) into P(H|D)) when computing chances. Study 3 examined efficiency of the three fallacious heuristics (pre-Bayesian, representativeness, and evidence-only) in producing choices concordant with Bayes’ rule. Computer-simulated scenarios revealed that the heuristics produce correct choices reasonably often under specific base rates and likelihood ratios. Summing up we conclude that natural sampling leads to most choices conforming to Bayes’ rule. However, people tend to replace Bayes’ rule with simpler methods, and even use of fallacious heuristics may
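
    The contrast drawn above, between reading P(H|D) directly off natural frequencies and computing it by inverting conditions with Bayes' rule, can be made explicit in a few lines; the counts below are invented for illustration.

```python
from fractions import Fraction

def posterior_from_natural_frequencies(n_h_and_d, n_d):
    """P(H | D) computed directly from natural frequencies, i.e. counts
    observed in a sample: the share of D-cases that are also H-cases."""
    return Fraction(n_h_and_d, n_d)

def posterior_from_bayes_rule(p_h, p_d_given_h, p_d_given_not_h):
    """The same quantity via Bayes' rule, which requires inverting conditions."""
    p_d = p_h * p_d_given_h + (1 - p_h) * p_d_given_not_h
    return p_h * p_d_given_h / p_d

# Toy picture set: 20 cards, 8 show hypothesis H, 5 of those also show datum D,
# and 3 of the 12 non-H cards show D (counts are purely illustrative).
print(posterior_from_natural_frequencies(5, 5 + 3))        # 5/8
print(posterior_from_bayes_rule(8 / 20, 5 / 8, 3 / 12))    # 0.625, the same value
```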

  10. Probability of vertical transmission of Chlamydia trachomatis estimated from national registry data.

    Science.gov (United States)

    Honkila, Minna; Wikström, Erika; Renko, Marjo; Surcel, Heljä-Marja; Pokka, Tytti; Ikäheimo, Irma; Uhari, Matti; Tapiainen, Terhi

    2017-09-01

    Chlamydia trachomatis colonisation is common in pregnant women, and it has been claimed that mother-to-child transmission may occur in 10%-70% of deliveries. C. trachomatis infections are nevertheless rarely encountered in infants in clinical practice. In order to evaluate the reason for this discrepancy, we designed a nationwide study of the C. trachomatis vertical transmission. Children with a possible C. trachomatis infection were identified from two national health registries in 1996-2011. Copies of the children's medical records were reviewed and maternal serum bank samples obtained during the index pregnancies were analysed for C. trachomatis antibodies. The risk of vertical transmission was calculated using data from two earlier studies in which nucleic acid amplification test (NAAT) positivity and seroconversion rates among women in the general population were reported. Altogether 206 children had a possible C. trachomatis infection, which represents 0.22 per 1000 live births (95% CI 0.19 to 0.25). The risk of vertical transmission among the estimated 24 901 NAAT-positive mothers was 0.8% (95% CI 0.7 to 0.9). Based on the annual seroconversion rate of maternal antitrachomatis antibodies, the risk of vertical transmission was 1.8% (95% CI 1.5 to 2.0). Altogether 35% of the maternal serum samples obtained in the first trimester of a pregnancy leading to a C. trachomatis infection in the infant were negative, implying that the infection was acquired during pregnancy. C. trachomatis infections in infants were rare, with a population-based occurrence of 0.22 per 1000 live births. The risk of vertical transmission of C. trachomatis in the population was <2%, which is significantly lower than reported earlier. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
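
    A quick arithmetic check of the headline figure, using the counts quoted in the abstract (the paper's exact numerator may differ slightly):

```python
# Vertical-transmission risk among NAAT-positive mothers, from the abstract's figures
infected_infants = 206
naat_positive_mothers = 24_901
risk_percent = 100 * infected_infants / naat_positive_mothers
print(f"{risk_percent:.1f}% per NAAT-positive mother")   # ≈ 0.8%, as reported
```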

  11. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think ... By doing so, we will obtain a deeper insight into how events involving large values of sums of heavy-tailed random variables are likely to occur.

  12. Benchmarks for detecting 'breakthroughs' in clinical trials: empirical assessment of the probability of large treatment effects using kernel density estimation.

    Science.gov (United States)

    Miladinovic, Branko; Kumar, Ambuj; Mhaskar, Rahul; Djulbegovic, Benjamin

    2014-10-21

    The aim was to understand how often 'breakthroughs,' that is, treatments that significantly improve health outcomes, can be developed. We applied weighted adaptive kernel density estimation to construct the probability density function for observed treatment effects from five publicly funded cohorts and one privately funded group. 820 trials involving 1064 comparisons and enrolling 331,004 patients were conducted by five publicly funded cooperative groups. 40 cancer trials involving 50 comparisons and enrolling a total of 19,889 patients were conducted by GlaxoSmithKline. We calculated that the probability of detecting a treatment with large effects is 10% (5-25%), and that the probability of detecting a treatment with very large effects is 2% (0.3-10%). Researchers themselves judged that they discovered a new, breakthrough intervention in 16% of trials. We propose these figures as the benchmarks against which future development of 'breakthrough' treatments should be measured. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
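
    A minimal sketch of the estimation idea: fit a weighted Gaussian KDE over observed treatment effects and read off the tail mass beyond a "large effect" threshold. The synthetic data, weights, and HR ≤ 0.7 threshold are illustrative assumptions, not the study's values or its adaptive-bandwidth method.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
log_hr = rng.normal(loc=-0.05, scale=0.15, size=500)       # observed treatment effects (log hazard ratios)
weights = 1.0 / rng.uniform(0.01, 0.05, size=500)          # e.g. inverse-variance weights
weights /= weights.sum()

kde = gaussian_kde(log_hr, weights=weights)                # weighted density of treatment effects
prob_large = kde.integrate_box_1d(-np.inf, np.log(0.7))    # mass below log(0.7), i.e. HR <= 0.7
print(f"P(large treatment effect) ≈ {prob_large:.3f}")
```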

  13. Developing an Empirical Model for Estimating the Probability of Electrical Short Circuits from Tin Whiskers. Part 2

    Science.gov (United States)

    Courey, Karim J.; Asfour, Shihab S.; Onar, Arzu; Bayliss, Jon A.; Ludwig, Larry L.; Wright, Maria C.

    2009-01-01

    To comply with lead-free legislation, many manufacturers have converted from tin-lead to pure tin finishes of electronic components. However, pure tin finishes have a greater propensity to grow tin whiskers than tin-lead finishes. Since tin whiskers present an electrical short circuit hazard in electronic components, simulations have been developed to quantify the risk of said short circuits occurring. Existing risk simulations make the assumption that when a free tin whisker has bridged two adjacent exposed electrical conductors, the result is an electrical short circuit. This conservative assumption is made because shorting is a random event that had an unknown probability associated with it. Note however that due to contact resistance electrical shorts may not occur at lower voltage levels. In our first article we developed an empirical probability model for tin whisker shorting. In this paper, we develop a more comprehensive empirical model using a refined experiment with a larger sample size, in which we studied the effect of varying voltage on the breakdown of the contact resistance which leads to a short circuit. From the resulting data we estimated the probability distribution of an electrical short, as a function of voltage. In addition, the unexpected polycrystalline structure seen in the focused ion beam (FIB) cross section in the first experiment was confirmed in this experiment using transmission electron microscopy (TEM). The FIB was also used to cross section two card guides to facilitate the measurement of the grain size of each card guide's tin plating to determine its finish.

  14. PDE-Foam - a probability-density estimation method using self-adapting phase-space binning

    CERN Document Server

    Dannheim, Dominik; Voigt, Alexander; Grahn, Karl-Johan; Speckmayer, Peter

    2009-01-01

    Probability-Density Estimation (PDE) is a multivariate discrimination technique based on sampling signal and background densities defined by event samples from data or Monte-Carlo (MC) simulations in a multi-dimensional phase space. To efficiently use large event samples to estimate the probability density, a binary search tree (range searching) is used in the PDE-RS implementation. It is a generalisation of standard likelihood methods and a powerful classification tool for problems with highly non-linearly correlated observables. In this paper, we present an innovative improvement of the PDE method that uses a self-adapting binning method to divide the multi-dimensional phase space in a finite number of hyper-rectangles (cells). The binning algorithm adjusts the size and position of a predefined number of cells inside the multidimensional phase space, minimizing the variance of the signal and background densities inside the cells. The binned density information is stored in binary trees, allowing for a very ...

  15. Effect of subjective estimate of self-fitness on motivation for sport participation

    OpenAIRE

    杉本, 龍勇; Sugimoto, Tatsuo; 渡部, 近志; Watabe, Chikashi

    2015-01-01

    The focus of this study is the effect of subjective estimates of self-fitness on university students' motivation for sport participation. Three different characteristics were identified from factor analysis. 1. Subjective estimates of self-fitness or the results of the "Shin Tairyoku Test" (New Fitness Test) are not effective in motivating regular sport participation among university students. 2. Higher frequency of sports participation increases motivation for regular sport participation in university ...

  16. Building vulnerability to hydro-geomorphic hazards: Estimating damage probability from qualitative vulnerability assessment using logistic regression

    Science.gov (United States)

    Ettinger, Susanne; Mounaud, Loïc; Magill, Christina; Yao-Lafourcade, Anne-Françoise; Thouret, Jean-Claude; Manville, Vern; Negulescu, Caterina; Zuccaro, Giulio; De Gregorio, Daniela; Nardone, Stefano; Uchuchoque, Juan Alexis Luque; Arguedas, Anita; Macedo, Luisa; Manrique Llerena, Nélida

    2016-10-01

    The focus of this study is an analysis of building vulnerability through investigating impacts from the 8 February 2013 flash flood event along the Avenida Venezuela channel in the city of Arequipa, Peru. On this day, 124.5 mm of rain fell within 3 h (monthly mean: 29.3 mm) triggering a flash flood that inundated at least 0.4 km2 of urban settlements along the channel, affecting more than 280 buildings, 23 of a total of 53 bridges (pedestrian, vehicle and railway), and leading to the partial collapse of sections of the main road, paralyzing central parts of the city for more than one week. This study assesses the aspects of building design and site specific environmental characteristics that render a building vulnerable by considering the example of a flash flood event in February 2013. A statistical methodology is developed that enables estimation of damage probability for buildings. The applied method uses observed inundation height as a hazard proxy in areas where more detailed hydrodynamic modeling data is not available. Building design and site-specific environmental conditions determine the physical vulnerability. The mathematical approach considers both physical vulnerability and hazard related parameters and helps to reduce uncertainty in the determination of descriptive parameters, parameter interdependency and respective contributions to damage. This study aims to (1) enable the estimation of damage probability for a certain hazard intensity, and (2) obtain data to visualize variations in damage susceptibility for buildings in flood prone areas. Data collection is based on a post-flood event field survey and the analysis of high (sub-metric) spatial resolution images (Pléiades 2012, 2013). An inventory of 30 city blocks was collated in a GIS database in order to estimate the physical vulnerability of buildings. As many as 1103 buildings were surveyed along the affected drainage and 898 buildings were included in the statistical analysis. Univariate and
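
    The core statistical step, estimating a damage probability from an observed hazard proxy by logistic regression, can be sketched as follows. The data are synthetic, and the single inundation-height predictor stands in for the fuller set of building-design and site-specific variables used in the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
inundation_m = rng.uniform(0.0, 2.5, size=300)                  # observed water depth per building
true_logit = -3.0 + 2.2 * inundation_m                          # assumed underlying relation (illustrative)
damaged = rng.random(300) < 1.0 / (1.0 + np.exp(-true_logit))   # simulated damage observations

# Fit P(damage | inundation height) and query a few depths
model = LogisticRegression().fit(inundation_m.reshape(-1, 1), damaged)
for depth in (0.5, 1.0, 1.5):
    p = model.predict_proba([[depth]])[0, 1]
    print(f"P(damage | {depth:.1f} m inundation) ≈ {p:.2f}")
```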

  17. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which include subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  18. Increased probability of repetitive spinal motoneuron activation by transcranial magnetic stimulation after muscle fatigue in healthy subjects

    DEFF Research Database (Denmark)

    Andersen, Birgit; Felding, Ulrik Ascanius; Krarup, Christian

    2012-01-01

    Triple stimulation technique (TST) has previously shown that transcranial magnetic stimulation (TMS) fails to activate a proportion of spinal motoneurons (MNs) during motor fatigue. The TST response depression without attenuation of the conventional motor evoked potential suggested an increased probability of repetitive spinal MN activation during exercise, even if some MNs failed to discharge in response to the brain stimulus. Here we used a modified TST (Quadruple stimulation, QuadS, and Quintuple stimulation, QuintS) to examine the influence of fatiguing exercise on second and third MN discharges after ... the muscle is fatigued. Repetitive MN firing may provide an adaptive mechanism to maintain motor unit activation and task performance during sustained voluntary activity.

  19. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  20. Inclusion probability for DNA mixtures is a subjective one-sided match statistic unrelated to identification information

    Directory of Open Access Journals (Sweden)

    Mark William Perlin

    2015-01-01

    Full Text Available Background: DNA mixtures of two or more people are a common type of forensic crime scene evidence. A match statistic that connects the evidence to a criminal defendant is usually needed for court. Jurors rely on this strength of match to help decide guilt or innocence. However, the reliability of unsophisticated match statistics for DNA mixtures has been questioned. Materials and Methods: The most prevalent match statistic for DNA mixtures is the combined probability of inclusion (CPI), used by crime labs for over 15 years. When testing 13 short tandem repeat (STR) genetic loci, the CPI⁻¹ value is typically around a million, regardless of DNA mixture composition. However, actual identification information, as measured by a likelihood ratio (LR), spans a much broader range. This study examined probability of inclusion (PI) mixture statistics for 517 locus experiments drawn from 16 reported cases and compared them with LR locus information calculated independently on the same data. The log(PI⁻¹) values were examined and compared with corresponding log(LR) values. Results: The LR and CPI methods were compared in case examples of false inclusion, false exclusion, a homicide, and criminal justice outcomes. Statistical analysis of crime laboratory STR data shows that inclusion match statistics exhibit a truncated normal distribution having zero center, with little correlation to actual identification information. By the law of large numbers (LLN), CPI⁻¹ increases with the number of tested genetic loci, regardless of DNA mixture composition or match information. These statistical findings explain why CPI is relatively constant, with implications for DNA policy, criminal justice, cost of crime, and crime prevention. Conclusions: Forensic crime laboratories have generated CPI statistics on hundreds of thousands of DNA mixture evidence items. However, this commonly used match statistic behaves like a random generator of inclusionary values, following the LLN

  1. Evaluation of test-strategies for estimating probability of low prevalence of paratuberculosis in Danish dairy herds

    DEFF Research Database (Denmark)

    Sergeant, E.S.G.; Nielsen, Søren S.; Toft, Nils

    2008-01-01

    Paratuberculosis is a chronic infection affecting cattle and other ruminants. In the dairy industry, losses due to paratuberculosis can be substantial in infected herds and several countries have implemented national programmes based on herd-classification to manage the disease. The aim of this study was to develop a method to estimate the probability of low within-herd prevalence of paratuberculosis for Danish dairy herds. A stochastic simulation model was developed using the R® programming environment. Features of this model included: use of age-specific estimates of test.... Using this model, five herd-testing strategies were evaluated: (1) milk-ELISA on all lactating cows; (2) milk-ELISA on lactating cows <4 years old; (3) milk-ELISA on lactating cows >4 years old; (4) faecal culture on all lactating cows; and (5) milk-ELISA plus faecal culture in series on all lactating cows. The five testing strategies were evaluated...

  2. Subjectivity

    Directory of Open Access Journals (Sweden)

    Jesús Vega Encabo

    2015-11-01

    Full Text Available In this paper, I claim that subjectivity is a way of being that is constituted through a set of practices in which the self is subject to the dangers of fictionalizing and plotting her life and self-image. I examine some ways of becoming subject through narratives and through theatrical performance before others. Through these practices, a real and active subjectivity is revealed, capable of self-knowledge and self-transformation. 

  3. Methods for estimating annual exceedance-probability discharges and largest recorded floods for unregulated streams in rural Missouri

    Science.gov (United States)

    Southard, Rodney E.; Veilleux, Andrea G.

    2014-01-01

    Regression analysis techniques were used to develop a set of equations for rural ungaged stream sites for estimating discharges with 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities, which are equivalent to annual flood-frequency recurrence intervals of 2, 5, 10, 25, 50, 100, 200, and 500 years, respectively. Basin and climatic characteristics were computed using geographic information software and digital geospatial data. A total of 35 characteristics were computed for use in preliminary statewide and regional regression analyses. Annual exceedance-probability discharge estimates were computed for 278 streamgages by using the expected moments algorithm to fit a log-Pearson Type III distribution to the logarithms of annual peak discharges for each streamgage using annual peak-discharge data from water year 1844 to 2012. Low-outlier and historic information were incorporated into the annual exceedance-probability analyses, and a generalized multiple Grubbs-Beck test was used to detect potentially influential low floods. Annual peak flows less than a minimum recordable discharge at a streamgage were incorporated into the at-site station analyses. An updated regional skew coefficient was determined for the State of Missouri using Bayesian weighted least-squares/generalized least squares regression analyses. At-site skew estimates for 108 long-term streamgages with 30 or more years of record and the 35 basin characteristics defined for this study were used to estimate the regional variability in skew. However, a constant generalized-skew value of -0.30 and a mean square error of 0.14 were determined in this study. Previous flood studies indicated that the distinct physical features of the three physiographic provinces have a pronounced effect on the magnitude of flood peaks. Trends in the magnitudes of the residuals from preliminary statewide regression analyses from previous studies confirmed that regional analyses in this study were
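
    The at-site analysis summarized above fits a log-Pearson Type III distribution to the logarithms of annual peak discharges and reads off discharges at the desired annual exceedance probabilities. A minimal sketch of that idea, using a plain method-of-moments fit on synthetic peaks rather than the expected moments algorithm and regional-skew weighting applied in the study, might look as follows.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)

        # Synthetic annual peak discharges standing in for a streamgage record.
        peaks = rng.lognormal(mean=8.0, sigma=0.6, size=60)
        log_q = np.log10(peaks)

        # Simple method-of-moments fit of a Pearson Type III distribution to log10(Q).
        skew = stats.skew(log_q, bias=False)
        dist = stats.pearson3(skew, loc=log_q.mean(), scale=log_q.std(ddof=1))

        # Discharges for selected annual exceedance probabilities (AEP).
        for aep in [0.50, 0.20, 0.10, 0.04, 0.02, 0.01, 0.005, 0.002]:
            q = 10 ** dist.ppf(1.0 - aep)    # non-exceedance quantile -> discharge
            print(f"AEP {aep * 100:5.1f}%  ->  {q:12.0f} cubic feet per second")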

  4. Comparison of methods of estimating body fat in normal subjects and cancer patients

    Energy Technology Data Exchange (ETDEWEB)

    Cohn, S.H. (Brookhaven National Lab., Upton, NY); Ellis, K.J.; Vartsky, D.; Sawitsky, A.; Gartenhaus, W.; Yasumura, S.; Vaswani, A.N.

    1981-12-01

    Total body fat can be indirectly estimated by the following noninvasive techniques: determination of lean body mass by measurement of body potassium or body water, and determination of density by underwater weighing or by skinfold measurements. The measurement of total body nitrogen by neutron activation provides another technique for estimating lean body mass and hence body fat. The nitrogen measurement can also be combined with the measurement of total body potassium in a two compartment model of the lean body mass from which another estimate of body fat can be derived. All of the above techniques are subject to various errors and are based on a number of assumptions, some of which are incompletely validated. These techniques were applied to a population of normal subjects and to a group of cancer patients. The advantages and disadvantages of each method are discussed in terms of their ability to estimate total body fat.

  5. Comparison of methods of estimating body fat in normal subjects and cancer patients.

    Science.gov (United States)

    Cohn, S H; Ellis, K J; Vartsky, D; Sawitsky, A; Gartenhaus, W; Yasumura, S; Vaswani, A N

    1981-12-01

    Total body fat can be indirectly estimated by the following noninvasive techniques: determination of lean body mass by measurement of body potassium or body water, and determination of density by underwater weighing or by skinfold measurements. The measurement of total body nitrogen by neutron activation provides another technique for estimating lean body mass and hence body fat. The nitrogen measurement can also be combined with the measurement of total body potassium in a two compartment model of the lean body mass from which another estimate of body fat can be derived. All of the above techniques are subject to various errors and are based on a number of assumptions, some of which are incompletely validated. These techniques were applied to a population of normal subjects and to a group of cancer patients. The advantages and disadvantages of each method are discussed in terms of their ability to estimate total body fat.

  6. [Prevalence of osteoporosis, estimation of probability of fracture and bone metabolism study in patients with newly diagnosed prostate cancer in the health area of Lugo].

    Science.gov (United States)

    Miguel-Carrera, Jonatan; García-Porrua, Carlos; de Toro Santos, Francisco Javier; Picallo-Sánchez, Jose Antonio

    2017-06-16

    To study the prevalence of osteoporosis and fracture probability in patients diagnosed with prostate cancer. Observational descriptive cross-sectional study. SITE: The study was performed in the Primary Care setting of Lugo in collaboration with the Rheumatology and Urology Services of our referral hospital. Patients diagnosed with prostate cancer without bone metastatic disease from January to December 2012. Epidemiologic, clinical, laboratory and densitometric variables involved in osteoporosis were collected. The likelihood of fracture was estimated by the FRAX® tool. Eighty-three patients met the inclusion criteria. None was excluded. The average age was 67 years and the mean Body Mass Index was 28.28. Twenty-five patients (30.1%) had previous osteoporotic fractures. Other prevalent risk factors were alcohol (26.5%) and smoking (22.9%). Eighty-two subjects (98.80%) had vitamin D below the normal level. Femoral neck densitometry showed that 8.9% had osteoporosis and 54% osteopenia. The average fracture risk in this population, estimated by FRAX®, was 2.63% for hip fracture and 5.28% for major fracture. Using the FRAX® cut-off levels for major fracture without DXA of >5% and ≥7.5% proposed by Azagra et al., 24 patients (28.92%) and 8 patients (9.64%), respectively, exceeded the threshold. The prevalence of osteoporosis in this population was very high. The most frequent risk factors associated with osteoporosis were: previous osteoporotic fracture, alcohol consumption, smoking and family history of previous fracture. The probability of fracture estimated with the femoral neck FRAX® tool was low. Vitamin D deficiency was very common (98.8%). Copyright © 2017 Elsevier España, S.L.U. All rights reserved.

  7. Probability Distribution Estimated From the Minimum, Maximum, and Most Likely Values: Applied to Turbine Inlet Temperature Uncertainty

    Science.gov (United States)

    Holland, Frederic A., Jr.

    2004-01-01

    Modern engineering design practices are tending more toward the treatment of design parameters as random variables as opposed to fixed, or deterministic, values. The probabilistic design approach attempts to account for the uncertainty in design parameters by representing them as a distribution of values rather than as a single value. The motivations for this effort include preventing excessive overdesign as well as assessing and assuring reliability, both of which are important for aerospace applications. However, the determination of the probability distribution is a fundamental problem in reliability analysis. A random variable is often defined by the parameters of the theoretical distribution function that gives the best fit to experimental data. In many cases the distribution must be assumed from very limited information or data. Often the types of information that are available or reasonably estimated are the minimum, maximum, and most likely values of the design parameter. For these situations the beta distribution model is very convenient because the parameters that define the distribution can be easily determined from these three pieces of information. Widely used in the field of operations research, the beta model is very flexible and is also useful for estimating the mean and standard deviation of a random variable given only the aforementioned three values. However, an assumption is required to determine the four parameters of the beta distribution from only these three pieces of information (some of the more common distributions, like the normal, lognormal, gamma, and Weibull distributions, have two or three parameters). The conventional method assumes that the standard deviation is a certain fraction of the range. The beta parameters are then determined by solving a set of equations simultaneously. A new method developed in-house at the NASA Glenn Research Center assumes a value for one of the beta shape parameters based on an analogy with the normal
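
    One common convention of the kind referred to above assumes that the mean is (minimum + 4 × most likely + maximum)/6 and that the standard deviation is one sixth of the range (the PERT approximation), and then determines the beta shape parameters by moment matching. The sketch below implements that convention on illustrative values; it is not necessarily the in-house NASA method the record describes.

        import numpy as np

        def beta_from_three_points(a, m, b):
            """PERT-style beta fit from minimum a, most likely m, maximum b.

            Assumes mean = (a + 4m + b)/6 and standard deviation = (b - a)/6,
            i.e. the 'standard deviation is a fixed fraction of the range' convention.
            Returns the shape parameters (alpha, beta) of a beta distribution on [a, b].
            """
            mean = (a + 4.0 * m + b) / 6.0
            var = ((b - a) / 6.0) ** 2
            mu = (mean - a) / (b - a)          # standardized mean on [0, 1]
            v = var / (b - a) ** 2             # standardized variance
            common = mu * (1.0 - mu) / v - 1.0
            return mu * common, (1.0 - mu) * common

        # Illustrative turbine-inlet-temperature figures (K), not from the report.
        a, m, b = 1400.0, 1500.0, 1650.0
        alpha, beta = beta_from_three_points(a, m, b)

        # Mean and standard deviation implied by the fitted distribution.
        mean = a + (b - a) * alpha / (alpha + beta)
        sd = (b - a) * np.sqrt(alpha * beta / ((alpha + beta) ** 2 * (alpha + beta + 1.0)))
        print(f"alpha = {alpha:.2f}, beta = {beta:.2f}, mean = {mean:.1f} K, sd = {sd:.1f} K")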

  8. Methods for estimating annual exceedance-probability streamflows for streams in Kansas based on data through water year 2015

    Science.gov (United States)

    Painter, Colin C.; Heimann, David C.; Lanning-Rush, Jennifer L.

    2017-08-14

    A study was done by the U.S. Geological Survey in cooperation with the Kansas Department of Transportation and the Federal Emergency Management Agency to develop regression models to estimate peak streamflows of annual exceedance probabilities of 50, 20, 10, 4, 2, 1, 0.5, and 0.2 percent at ungaged locations in Kansas. Peak streamflow frequency statistics from selected streamgages were related to contributing drainage area and average precipitation using generalized least-squares regression analysis. The peak streamflow statistics were derived from 151 streamgages with at least 25 years of streamflow data through 2015. The developed equations can be used to predict peak streamflow magnitude and frequency within two hydrologic regions that were defined based on the effects of irrigation. The equations developed in this report are applicable to streams in Kansas that are not substantially affected by regulation, surface-water diversions, or urbanization. The equations are intended for use for streams with contributing drainage areas ranging from 0.17 to 14,901 square miles in the nonirrigation effects region and from 1.02 to 3,555 square miles in the irrigation-affected region, corresponding to the range of drainage areas of the streamgages used in the development of the regional equations.

  9. Bias Corrections for Standardized Effect Size Estimates Used with Single-Subject Experimental Designs

    Science.gov (United States)

    Ugille, Maaike; Moeyaert, Mariola; Beretvas, S. Natasha; Ferron, John M.; Van den Noortgate, Wim

    2014-01-01

    A multilevel meta-analysis can combine the results of several single-subject experimental design studies. However, the estimated effects are biased if the effect sizes are standardized and the number of measurement occasions is small. In this study, the authors investigated 4 approaches to correct for this bias. First, the standardized effect…

  10. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    Science.gov (United States)

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-01

    In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.

  11. Estimation of Subjective Difficulty and Psychological Stress by Ambient Sensing of Desk Panel Vibrations

    Science.gov (United States)

    Hamaguchi, Nana; Yamamoto, Keiko; Iwai, Daisuke; Sato, Kosuke

    We investigate ambient sensing techniques that recognize a writer's psychological states by measuring vibrations of handwriting on a desk panel using a piezoelectric contact sensor attached to its underside. In particular, we describe a technique for estimating the subjective difficulty of a question for a student as the ratio of the time duration of thinking to the total amount of time spent on the question. Through experiments, we confirm that our technique correctly recognizes whether or not a person writes something down on paper from the measured vibration data with an accuracy of over 80%, and that the order of computed subjective difficulties of three questions is coincident with that reported by the subject in 60% of experiments. We also propose a technique to estimate a writer's psychological stress by using the standard deviation of the spectrum of the measured vibration. Results of a proof-of-concept experiment show that the proposed technique correctly estimates whether or not the subject feels stress at least 90% of the time.

  12. Normative perceptual estimates for 91 healthy subjects age 60-75

    DEFF Research Database (Denmark)

    Wilms, Inge Linda; Nielsen, Simon

    2014-01-01

    Visual perception serves as the basis for much of the higher level cognitive processing as well as human activity in general. Here we present normative estimates for the following components of visual perception: the visual perceptual threshold, the visual short-term memory capacity and the visual... perceptual encoding/decoding speed (processing speed) of visual short-term memory based on an assessment of 91 healthy subjects aged 60-75. The estimates are presented at the total sample level as well as at the gender level. The estimates were modelled from input from a whole-report assessment based on A Theory... speed of Visual Short-term Memory (VSTM) but not the capacity of VSTM nor the visual threshold. The estimates will be useful for future studies into the effects of various types of intervention and training on cognition in general and visual attention in particular....

  13. Model approach to estimate the probability of accepting a lot of heterogeneously contaminated powdered food using different sampling strategies.

    Science.gov (United States)

    Valero, Antonio; Pasquali, Frédérique; De Cesare, Alessandra; Manfreda, Gerardo

    2014-08-01

    Current sampling plans assume a random distribution of microorganisms in food. However, food-borne pathogens are estimated to be heterogeneously distributed in powdered foods. This spatial distribution, together with very low levels of contamination, raises concerns about the efficiency of current sampling plans for the detection of food-borne pathogens like Cronobacter and Salmonella in powdered foods such as powdered infant formula or powdered eggs. An alternative approach based on a Poisson distribution of the contaminated part of the lot (Habraken approach) was used in order to evaluate the probability of falsely accepting a contaminated lot of powdered food when different sampling strategies were simulated, considering variables such as lot size, sample size, microbial concentration in the contaminated part of the lot and proportion of contaminated lot. The simulated results suggest that a sample size of 100 g or more corresponds to the lowest number of samples to be tested in comparison with sample sizes of 10 or 1 g. Moreover, the number of samples to be tested greatly decreases if the microbial concentration is 1 CFU/g instead of 0.1 CFU/g or if the proportion of contamination is 0.05 instead of 0.01. Mean contaminations higher than 1 CFU/g or proportions higher than 0.05 did not impact the number of samples. The Habraken approach represents a useful tool for risk management in order to design a fit-for-purpose sampling plan for the detection of low levels of food-borne pathogens in heterogeneously contaminated powdered food. However, it must be pointed out that, although effective in detecting pathogens, these sampling plans are difficult to apply because of the huge number of samples that need to be tested. Sampling does not seem an effective measure to control pathogens in powdered food. Copyright © 2014 Elsevier B.V. All rights reserved.
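
    A simplified version of the acceptance-probability calculation discussed above can be sketched by assuming that each analytical sample independently comes from the contaminated part of the lot with a given probability and that cells within that part are Poisson distributed. The function and the parameter values below are illustrative; they are not the Habraken implementation used in the study.

        import numpy as np

        def p_accept(n_samples, sample_mass_g, conc_cfu_per_g, contaminated_fraction):
            """Probability that all n samples test negative (contaminated lot falsely accepted)."""
            p_negative = (1.0 - contaminated_fraction) + contaminated_fraction * np.exp(
                -conc_cfu_per_g * sample_mass_g
            )
            return p_negative ** n_samples

        for mass in (1.0, 10.0, 100.0):
            p = p_accept(n_samples=30, sample_mass_g=mass,
                         conc_cfu_per_g=0.1, contaminated_fraction=0.01)
            print(f"{mass:6.0f} g samples -> P(accept contaminated lot) = {p:.3f}")

    Consistent with the record's conclusions, larger sample masses lower the probability of falsely accepting the lot for the same number of samples.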

  14. Estimating the Probability of Human Error by Incorporating Component Failure Data from User-Induced Defects in the Development of Complex Electrical Systems.

    Science.gov (United States)

    Majewicz, Peter J; Blessner, Paul; Olson, Bill; Blackburn, Timothy

    2017-04-05

    This article proposes a methodology for incorporating electrical component failure data into the human error assessment and reduction technique (HEART) for estimating human error probabilities (HEPs). The existing HEART method contains factors known as error-producing conditions (EPCs) that adjust a generic HEP to a more specific situation being assessed. The selection and proportioning of these EPCs are at the discretion of an assessor, and are therefore subject to the assessor's experience and potential bias. This dependence on expert opinion is prevalent in similar HEP assessment techniques used in numerous industrial areas. The proposed method incorporates factors based on observed trends in electrical component failures to produce a revised HEP that can trigger risk mitigation actions more effectively based on the presence of component categories or other hazardous conditions that have a history of failure due to human error. The data used for the additional factors are a result of an analysis of failures of electronic components experienced during system integration and testing at NASA Goddard Space Flight Center. The analysis includes the determination of root failure mechanisms and trend analysis. The major causes of these defects were attributed to electrostatic damage, electrical overstress, mechanical overstress, or thermal overstress. These factors representing user-induced defects are quantified and incorporated into specific hardware factors based on the system's electrical parts list. This proposed methodology is demonstrated with an example comparing the original HEART method and the proposed modified technique. © 2017 Society for Risk Analysis.
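
    The underlying HEART adjustment referred to above multiplies a generic HEP by a factor of ((EPC - 1) × assessed proportion of affect + 1) for each applicable error-producing condition. The sketch below applies that standard adjustment and adds one extra multiplier standing in for the component-failure-based factor the article proposes; all numbers are invented for illustration.

        def heart_hep(generic_hep, epcs, hardware_factor=1.0):
            """Adjust a generic human error probability (HEP) in the HEART manner.

            epcs: list of (epc_multiplier, assessed_proportion_of_affect) pairs.
            hardware_factor: hypothetical extra multiplier standing in for the
            component-failure-based factor proposed in the article.
            """
            hep = generic_hep
            for epc, apoa in epcs:
                hep *= (epc - 1.0) * apoa + 1.0
            return min(hep * hardware_factor, 1.0)   # probabilities are capped at 1

        # Generic task HEP of 0.003 with two error-producing conditions applied.
        hep = heart_hep(
            generic_hep=0.003,
            epcs=[(17.0, 0.4), (4.0, 0.2)],   # (EPC multiplier, proportion of affect)
            hardware_factor=1.5,              # e.g. ESD-sensitive parts present (assumed)
        )
        print(f"adjusted HEP = {hep:.4f}")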

  15. Bayesian pretest probability estimation for primary malignant bone tumors based on the Surveillance, Epidemiology and End Results Program (SEER) database.

    Science.gov (United States)

    Benndorf, Matthias; Neubauer, Jakob; Langer, Mathias; Kotter, Elmar

    2017-03-01

    In the diagnostic process of primary bone tumors, patient age, tumor localization and to a lesser extent sex affect the differential diagnosis. We therefore aim to develop a pretest probability calculator for primary malignant bone tumors based on population data taking these variables into account. We access the SEER (Surveillance, Epidemiology and End Results Program of the National Cancer Institute, 2015 release) database and analyze data of all primary malignant bone tumors diagnosed between 1973 and 2012. We record age at diagnosis, tumor localization according to the International Classification of Diseases (ICD-O-3) and sex. We take relative probability of the single tumor entity as a surrogate parameter for unadjusted pretest probability. We build a probabilistic (naïve Bayes) classifier to calculate pretest probabilities adjusted for age, tumor localization and sex. We analyze data from 12,931 patients (647 chondroblastic osteosarcomas, 3659 chondrosarcomas, 1080 chordomas, 185 dedifferentiated chondrosarcomas, 2006 Ewing's sarcomas, 281 fibroblastic osteosarcomas, 129 fibrosarcomas, 291 fibrous malignant histiocytomas, 289 malignant giant cell tumors, 238 myxoid chondrosarcomas, 3730 osteosarcomas, 252 parosteal osteosarcomas, 144 telangiectatic osteosarcomas). We make our probability calculator accessible at http://ebm-radiology.com/bayesbone/index.html . We provide exhaustive tables for age and localization data. Results from tenfold cross-validation show that in 79.8 % of cases the pretest probability is correctly raised. Our approach employs population data to calculate relative pretest probabilities for primary malignant bone tumors. The calculator is not diagnostic in nature. However, resulting probabilities might serve as an initial evaluation of probabilities of tumors on the differential diagnosis list.
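
    The naive Bayes calculation described above amounts to multiplying the relative frequency of each tumor entity (the unadjusted pretest probability) by the conditional probabilities of the observed age band, localization and sex, and renormalizing. The overall case counts below are taken from the record; the conditional probabilities are invented for illustration and are not SEER estimates.

        import numpy as np

        # Three of the tumour entities listed in the record, with their overall case
        # counts as the prior; the conditional probabilities are purely illustrative.
        tumours = ["osteosarcoma", "chondrosarcoma", "Ewing's sarcoma"]
        prior = np.array([3730.0, 3659.0, 2006.0])   # case counts from the record
        p_age = np.array([0.60, 0.05, 0.70])         # assumed P(age 10-19 | tumour)
        p_site = np.array([0.40, 0.20, 0.30])        # assumed P(femur | tumour)
        p_sex = np.array([0.55, 0.50, 0.60])         # assumed P(male | tumour)

        # Naive Bayes: posterior proportional to prior times the product of the
        # conditional feature probabilities, then normalized to sum to one.
        posterior = prior * p_age * p_site * p_sex
        posterior /= posterior.sum()

        for name, p in zip(tumours, posterior):
            print(f"{name:18s} adjusted pretest probability = {p:.2f}")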

  16. Predictive Models to Estimate Probabilities of Injuries and Adverse Performance Outcomes in U.S. Army Basic Combat Training

    Science.gov (United States)

    2014-03-01

    Report front matter only: the excerpt lists APFT data collection, Figure 2 (receiver operating characteristic curve for the logistic regression model predicting probability of final APFT failure), a table giving the predictive equation for the Final APFT Failure model, and Table 11 (predictive equation for the Attrition model).

  17. A Method to Estimate the Probability That Any Individual Lightning Stroke Contacted the Surface Within Any Radius of Any Point

    Science.gov (United States)

    Huddleston, Lisa L.; Roeder, William; Merceret, Francis J.

    2010-01-01

    A technique has been developed to calculate the probability that any nearby lightning stroke is within any radius of any point of interest. In practice, this provides the probability that a nearby lightning stroke was within a key distance of a facility, rather than the error ellipses centered on the stroke. This process takes the current bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to get the probability that the stroke is inside any specified radius. This new facility-centric technique will be much more useful to the space launch customers and may supersede the lightning error ellipse approach discussed in [5], [6].
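
    The core computation, integrating the bivariate Gaussian density defined by the location error ellipse over a circle around the point of interest, can be approximated by plain Monte Carlo sampling as in the sketch below. The ellipse parameters and distances are illustrative, and the operational technique presumably uses a dedicated numerical integration rather than sampling.

        import numpy as np

        rng = np.random.default_rng(42)

        # Illustrative error-ellipse parameters for the most likely stroke location
        # (metres, local east/north frame); not taken from the cited technique.
        stroke_xy = np.array([350.0, -150.0])           # reported stroke position
        cov = np.array([[250.0**2, 0.3 * 250 * 400],    # bivariate Gaussian covariance
                        [0.3 * 250 * 400, 400.0**2]])

        facility_xy = np.array([0.0, 0.0])              # point of interest
        radius = 500.0                                  # key distance (m)

        # Monte Carlo integration of the density over the disk around the facility.
        samples = rng.multivariate_normal(stroke_xy, cov, size=200_000)
        inside = np.linalg.norm(samples - facility_xy, axis=1) <= radius
        print(f"P(stroke within {radius:.0f} m of facility) = {inside.mean():.3f}")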

  18. Estimation of Partial Safety Factors and Target Failure Probability Based on Cost Optimization of Rubble Mound Breakwaters

    DEFF Research Database (Denmark)

    Kim, Seung-Woo; Suh, Kyung-Duck; Burcharth, Hans F.

    2010-01-01

    Breakwaters are designed on the basis of cost optimization because human risk is seldom a consideration. Most breakwaters, however, were constructed without considering cost optimization. In this study, the optimum return period, target failure probability and the partial safety factors were evaluated by applying cost optimization to rubble mound breakwaters in Korea. The applied method was developed by Hans F. Burcharth and John D. Sorensen in relation to the PIANC Working Group 47. The optimum return period was determined as 50 years in many cases and was found to be 100 years in the case of a high real interest rate. Target failure probabilities were suggested by using the probabilities of failure corresponding to the optimum return period and those from reliability analysis of existing structures. The final target failure probability is about 60% for the initial limit state...

  19. Estimation of Circadian Body Temperature Rhythm Based on Heart Rate in Healthy, Ambulatory Subjects.

    Science.gov (United States)

    Sim, Soo Young; Joo, Kwang Min; Kim, Han Byul; Jang, Seungjin; Kim, Beomoh; Hong, Seungbum; Kim, Sungwan; Park, Kwang Suk

    2017-03-01

    Core body temperature is a reliable marker for circadian rhythm. As characteristics of the circadian body temperature rhythm change during diverse health problems, such as sleep disorder and depression, body temperature monitoring is often used in clinical diagnosis and treatment. However, the use of current thermometers in circadian rhythm monitoring is impractical in daily life. As heart rate is a physiological signal relevant to thermoregulation, we investigated the feasibility of heart rate monitoring in estimating circadian body temperature rhythm. Various heart rate parameters and core body temperature were simultaneously acquired in 21 healthy, ambulatory subjects during their routine life. The performance of regression analysis and the extended Kalman filter on daily body temperature and circadian indicator (mesor, amplitude, and acrophase) estimation were evaluated. For daily body temperature estimation, mean R-R interval (RRI), mean heart rate (MHR), or normalized MHR provided a mean root mean square error of approximately 0.40 °C in both techniques. The mesor estimation regression analysis showed better performance than the extended Kalman filter. However, the extended Kalman filter, combined with RRI or MHR, provided better accuracy in terms of amplitude and acrophase estimation. We suggest that this noninvasive and convenient method for estimating the circadian body temperature rhythm could reduce discomfort during body temperature monitoring in daily life. This, in turn, could facilitate more clinical studies based on circadian body temperature rhythm.
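
    The circadian indicators mentioned above (mesor, amplitude and acrophase) are conventionally defined through a cosinor model. The sketch below fits such a model to a synthetic temperature series by linear least squares; it only illustrates how the three indicators are obtained and is not the regression or extended-Kalman-filter estimator evaluated in the study.

        import numpy as np

        rng = np.random.default_rng(7)

        # Synthetic 24-hour core-temperature series sampled every 10 minutes.
        t = np.arange(0, 24, 1 / 6)                              # hours
        true_mesor, true_amp, true_acrophase = 36.9, 0.35, 17.0  # degC, degC, hours
        temp = true_mesor + true_amp * np.cos(2 * np.pi * (t - true_acrophase) / 24)
        temp += rng.normal(0, 0.1, t.size)                       # measurement noise

        # Cosinor model T(t) = M + A*cos(wt) + B*sin(wt), solved by linear least squares.
        w = 2 * np.pi / 24
        X = np.column_stack([np.ones_like(t), np.cos(w * t), np.sin(w * t)])
        mesor, A, B = np.linalg.lstsq(X, temp, rcond=None)[0]

        amplitude = np.hypot(A, B)
        acrophase = (np.arctan2(B, A) / w) % 24                  # hour of peak temperature
        print(f"mesor {mesor:.2f} degC, amplitude {amplitude:.2f} degC, acrophase {acrophase:.1f} h")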

  20. An Approach to Mathematical Modeling and Estimation of Probe-Drogue Docking Success Probability for UAV Autonomous Aerial Refueling

    OpenAIRE

    Xufeng Wang; Jianmin Li; Xingwei Kong; Xinmin Dong; Bo Zhang

    2017-01-01

    One of the keys to the success of aerial refueling for probe-drogue aerial refueling system (PDARS) is the successful docking between the probe and drogue. The study of probe-drogue docking success probability offers an important support to achieving successful docking. During the docking phase of PDARS, based on prior information and reasonable assumptions for the movements of the drogue under atmospheric disturbance, the probe-drogue docking success probability is converted to the probabili...

  1. Hate Crimes and Stigma-Related Experiences among Sexual Minority Adults in the United States: Prevalence Estimates from a National Probability Sample

    Science.gov (United States)

    Herek, Gregory M.

    2009-01-01

    Using survey responses collected via the Internet from a U.S. national probability sample of gay, lesbian, and bisexual adults (N = 662), this article reports prevalence estimates of criminal victimization and related experiences based on the target's sexual orientation. Approximately 20% of respondents reported having experienced a person or…

  2. Estimating the transitional probabilities of smoking stages with cross-sectional data and 10-year projection for smoking behavior in Iranian adolescents

    Directory of Open Access Journals (Sweden)

    Ahmad Khosravi

    2016-01-01

    Conclusions: The present study showed a moderate but concerning prevalence of current smoking in Iranian adolescents and introduced a novel method for estimation of transitional probabilities from a cross-sectional study. The increasing trend of cigarette use among adolescents indicated the necessity of paying more attention to this group.

  3. An Approach to Mathematical Modeling and Estimation of Probe-Drogue Docking Success Probability for UAV Autonomous Aerial Refueling

    Directory of Open Access Journals (Sweden)

    Xufeng Wang

    2017-01-01

    Full Text Available One of the keys to the success of aerial refueling for the probe-drogue aerial refueling system (PDARS) is the successful docking between the probe and drogue. The study of probe-drogue docking success probability offers an important support to achieving successful docking. During the docking phase of PDARS, based on prior information and reasonable assumptions for the movements of the drogue under atmospheric disturbance, the probe-drogue docking success probability is converted to the probability of the drogue center being located in a specific area. A model of the probe-drogue docking success probability is established with and without actuation error, respectively. The curves of the probe-drogue docking success probability with the standard deviation of the drogue central position, the maximum distance from the drogue center position to the equilibrium position, the actuation error, and the standard deviation of the actuation error are obtained through simulations. The study provides a useful reference for docking maneuver decisions in aerial refueling with PDARS.

  4. Evaluating probability forecasts

    OpenAIRE

    Lai, Tze Leung; Gross, Shulamith T.; Shen, David Bo

    2011-01-01

    Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability for...

  5. The proportionator: unbiased stereological estimation using biased automatic image analysis and non-uniform probability proportional to size sampling

    DEFF Research Database (Denmark)

    Gardi, Jonathan Eyal; Nyengaard, Jens Randel; Gundersen, Hans Jørgen Gottlieb

    2008-01-01

    The proportionator is a novel and radically different approach to sampling with microscopes based on well-known statistical theory (probability proportional to size - PPS sampling). It uses automatic image analysis, with a large range of options, to assign to every field of view in the section a ...

  6. Estimating detection probability for Canada lynx Lynx canadensis using snow-track surveys in the northern Rocky Mountains, Montana, USA

    Science.gov (United States)

    John R. Squires; Lucretia E. Olson; David L. Turner; Nicholas J. DeCesare; Jay A. Kolbe

    2012-01-01

    We used snow-tracking surveys to determine the probability of detecting Canada lynx Lynx canadensis in known areas of lynx presence in the northern Rocky Mountains, Montana, USA during the winters of 2006 and 2007. We used this information to determine the minimum number of survey replicates necessary to infer the presence and absence of lynx in areas of similar lynx...

  7. Estimation of False Alarm Probabilities in Cell Averaging Constant False Alarm Rate Detectors via Monte Carlo Methods

    Science.gov (United States)

    2004-11-01

    ET, e-'J. This is the same as the second Monte Carlo estimator P_MC2 considered previously. Before considering IS estimation in more detail, we ... and given Y. Hence P_FA = E_0[g(τ_0 Y)], suggesting that IS estimation can be performed by biasing Y and estimating the expectation of g(τ_0 Y). The ... clutter and noise is non-Gaussian. The IS estimation techniques are also not restricted to CA-CFAR, but can be applied to other CFAR schemes. This will be

  8. A joint probability approach using a 1-D hydrodynamic model for estimating high water level frequencies in the Lower Rhine Delta

    Directory of Open Access Journals (Sweden)

    H. Zhong

    2013-07-01

    Full Text Available The Lower Rhine Delta, a transitional area between the River Rhine and Meuse and the North Sea, is at risk of flooding induced by infrequent events of a storm surge or upstream flooding, or by more infrequent events of a combination of both. A joint probability analysis of the astronomical tide, the wind induced storm surge, the Rhine flow and the Meuse flow at the boundaries is established in order to produce the joint probability distribution of potential flood events. Three individual joint probability distributions are established corresponding to three potential flooding causes: storm surges and normal Rhine discharges, normal sea levels and high Rhine discharges, and storm surges and high Rhine discharges. For each category, its corresponding joint probability distribution is applied, in order to stochastically simulate a large number of scenarios. These scenarios can be used as inputs to a deterministic 1-D hydrodynamic model in order to estimate the high water level frequency curves at the transitional locations. The results present the exceedance probability of the present design water level for the economically important cities of Rotterdam and Dordrecht. The calculated exceedance probability is evaluated and compared to the governmental norm. Moreover, the impact of climate change on the high water level frequency curves is quantified for the year 2050 in order to assist in decisions regarding the adaptation of the operational water management system and the flood defense system.

  9. Robust estimation of the expected survival probabilities from high-dimensional Cox models with biomarker-by-treatment interactions in randomized clinical trials

    Directory of Open Access Journals (Sweden)

    Nils Ternès

    2017-05-01

    Full Text Available Abstract Background Thanks to the advances in genomics and targeted treatments, more and more prediction models based on biomarkers are being developed to predict potential benefit from treatments in a randomized clinical trial. Although the methodological framework for the development and validation of prediction models in a high-dimensional setting is becoming more and more established, no clear guidance exists yet on how to estimate expected survival probabilities in a penalized model with biomarker-by-treatment interactions. Methods Based on a parsimonious biomarker selection in a penalized high-dimensional Cox model (lasso or adaptive lasso), we propose a unified framework to: estimate internally the predictive accuracy metrics of the developed model (using double cross-validation); estimate the individual survival probabilities at a given timepoint; construct confidence intervals thereof (analytical or bootstrap); and visualize them graphically (pointwise or smoothed with splines). We compared these strategies through a simulation study covering scenarios with or without biomarker effects. We applied the strategies to a large randomized phase III clinical trial that evaluated the effect of adding trastuzumab to chemotherapy in 1574 early breast cancer patients, for which the expression of 462 genes was measured. Results In our simulations, penalized regression models using the adaptive lasso estimated the survival probability of new patients with low bias and standard error; bootstrapped confidence intervals had empirical coverage probability close to the nominal level across very different scenarios. The double cross-validation performed on the training data set closely mimicked the predictive accuracy of the selected models in external validation data. We also propose a useful visual representation of the expected survival probabilities using splines. In the breast cancer trial, the adaptive lasso penalty selected a prediction model with 4

  10. A two-part model for reference curve estimation subject to a limit of detection.

    Science.gov (United States)

    Zhang, Z; Addo, O Y; Himes, J H; Hediger, M L; Albert, P S; Gollenberg, A L; Lee, P A; Louis, G M Buck

    2011-05-30

    Reference curves are commonly used to identify individuals with extreme values of clinically relevant variables or stages of progression which depend naturally on age or maturation. Estimation of reference curves can be complicated by a technical limit of detection (LOD) that censors the measurement from the left, as is the case in our study of reproductive hormone levels in boys around the time of the onset of puberty. We discuss issues with common approaches to the LOD problem in the context of our pubertal hormone study, and propose a two-part model that addresses these issues. One part of the proposed model specifies the probability of a measurement exceeding the LOD as a function of age. The other part of the model specifies the conditional distribution of a measurement given that it exceeds the LOD, again as a function of age. Information from the two parts can be combined to estimate the identifiable portion (i.e. above the LOD) of a reference curve and to calculate the relative standing of a given measurement above the LOD. Unlike some common approaches to LOD problems, the two-part model is free of untestable assumptions involving unobservable quantities, flexible for modeling the observable data, and easy to implement with existing software. The method is illustrated with hormone data from the Third National Health and Nutrition Examination Survey. This article is a U.S. Government work and is in the public domain in the U.S.A. Published in 2011 by John Wiley & Sons, Ltd.
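
    The two-part structure described above can be sketched with a logistic model for the probability of exceeding the LOD as a function of age, together with an ordinary regression for the (log) measurement given that it exceeds the LOD. The sketch uses synthetic data and simple linear-in-age components rather than the reference-curve models of the paper.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(3)

        # Synthetic hormone measurements around puberty, left-censored at the LOD.
        n, lod = 500, 0.1
        age = rng.uniform(8, 16, n)
        level = np.exp(-6.0 + 0.5 * age + rng.normal(0, 1.0, n))
        observed = np.where(level >= lod, level, np.nan)     # below LOD -> censored
        above = ~np.isnan(observed)

        X = sm.add_constant(age)

        # Part 1: probability of exceeding the LOD as a function of age.
        part1 = sm.GLM(above.astype(float), X, family=sm.families.Binomial()).fit()

        # Part 2: conditional distribution of log(level) given that it exceeds the LOD.
        part2 = sm.OLS(np.log(observed[above]), X[above]).fit()

        # Combine the two parts at age 12: P(above LOD) and the conditional median.
        x12 = np.array([[1.0, 12.0]])
        p_above = part1.predict(x12)[0]
        median_above = np.exp(part2.predict(x12)[0])
        print(f"P(above LOD at age 12) = {p_above:.2f}, conditional median = {median_above:.2f}")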

  11. Estimation of Extreme Responses and Failure Probability of Wind Turbines under Normal Operation by Controlled Monte Carlo Simulation

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri

    Extreme value predictions for application in wind turbine design are often based on asymptotic results. Assuming that the extreme values of a wind turbine responses, i.e. maximum values of the mud-line moment or blades’ root stress, follow a certain but unknown probability density (mass...... order statistical moments. The results obtained by extrapolation of the extreme values to the stipulated design period of the wind turbine depend strongly on the relevance of these adopted extreme value distributions. The problem is that this relevance cannot be decided from the data obtained....... The solution of the Fokker-Planck-Kolmogorov (FPK) equation for systems governed by a stochastic differential equation driven by Gaussian white noise will give the sought time variation of the probability density function. However the analytical solution of the FPK is available for only a few dynamic systems...

  12. Double-ended break probability estimate for the 304 stainless steel main circulation piping of a production reactor

    Energy Technology Data Exchange (ETDEWEB)

    Mehta, H.S. [General Electric Co., San Jose, CA (United States); Daugherty, W.L.; Awadalla, N.G.; Sindelar, R.L. [Westinghouse Savannah River Co., Aiken, SC (United States)

    1991-12-31

    The large break frequency resulting from intergranular stress corrosion cracking in the main circulation piping of the Savannah River Site (SRS) production reactors has been estimated. Four factors are developed to describe the likelihood that a crack exists that is not identified by ultrasonic inspection, and that grows to instability prior to growing through-wall and being detected by the ensuing leakage. The estimated large break frequency is 3.4 × 10⁻⁸ per reactor-year.

  13. Philosophy and probability

    CERN Document Server

    Childers, Timothy

    2013-01-01

    Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability (frequentist, propensity, classical, Bayesian, and objective Bayesian) and uses stimulating examples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,

  14. Men who have sex with men in Great Britain: comparing methods and estimates from probability and convenience sample surveys

    Science.gov (United States)

    Prah, Philip; Hickson, Ford; Bonell, Chris; McDaid, Lisa M; Johnson, Anne M; Wayal, Sonali; Clifton, Soazig; Sonnenberg, Pam; Nardone, Anthony; Erens, Bob; Copas, Andrew J; Riddell, Julie; Weatherburn, Peter; Mercer, Catherine H

    2016-01-01

    Objective To examine sociodemographic and behavioural differences between men who have sex with men (MSM) participating in recent UK convenience surveys and a national probability sample survey. Methods We compared 148 MSM aged 18–64 years interviewed for Britain's third National Survey of Sexual Attitudes and Lifestyles (Natsal-3) undertaken in 2010–2012, with men in the same age range participating in contemporaneous convenience surveys of MSM: 15 500 British resident men in the European MSM Internet Survey (EMIS); 797 in the London Gay Men's Sexual Health Survey; and 1234 in Scotland's Gay Men's Sexual Health Survey. Analyses compared men reporting at least one male sexual partner (past year) on similarly worded questions and multivariable analyses accounted for sociodemographic differences between the surveys. Results MSM in convenience surveys were younger and better educated than MSM in Natsal-3, and a larger proportion identified as gay (85%–95% vs 62%). Partner numbers were higher and same-sex anal sex more common in convenience surveys. Unprotected anal intercourse was more commonly reported in EMIS. Compared with Natsal-3, MSM in convenience surveys were more likely to report gonorrhoea diagnoses and HIV testing (both past year). Differences between the samples were reduced when restricting analysis to gay-identifying MSM. Conclusions National probability surveys better reflect the population of MSM but are limited by their smaller samples of MSM. Convenience surveys recruit larger samples of MSM but tend to over-represent MSM identifying as gay and reporting more sexual risk behaviours. Because both sampling strategies have strengths and weaknesses, methods are needed to triangulate data from probability and convenience surveys. PMID:26965869

  15. FuzzyStatProb: An R Package for the Estimation of Fuzzy Stationary Probabilities from a Sequence of Observations of an Unknown Markov Chain

    Directory of Open Access Journals (Sweden)

    Pablo J. Villacorta

    2016-07-01

    Full Text Available Markov chains are well-established probabilistic models of a wide variety of real systems that evolve along time. Countless examples of applications of Markov chains that successfully capture the probabilistic nature of real problems include areas as diverse as biology, medicine, social science, and engineering. One interesting feature which characterizes certain kinds of Markov chains is their stationary distribution, which stands for the global fraction of time the system spends in each state. The computation of the stationary distribution requires precise knowledge of the transition probabilities. When the only information available is a sequence of observations drawn from the system, such probabilities have to be estimated. Here we review an existing method to estimate fuzzy transition probabilities from observations and, with them, obtain the fuzzy stationary distribution of the resulting fuzzy Markov chain. The method also works when the user directly provides fuzzy transition probabilities. We provide an implementation in the R environment that is the first available to the community and serves as a proof of concept. We demonstrate the usefulness of our proposal with computational experiments on a toy problem, namely a time-homogeneous Markov chain that guides the randomized movement of an autonomous robot that patrols a small area.
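
    Setting the fuzziness aside, the crisp core of the procedure, estimating transition probabilities from an observed sequence and computing the stationary distribution of the resulting chain, can be sketched in a few lines of Python; the fuzzy interval estimates that FuzzyStatProb actually provides are beyond this sketch.

        import numpy as np

        rng = np.random.default_rng(11)

        # Simulate an observation sequence from a Markov chain unknown to the analyst.
        P_true = np.array([[0.7, 0.2, 0.1],
                           [0.3, 0.5, 0.2],
                           [0.2, 0.3, 0.5]])
        states = [0]
        for _ in range(5000):
            states.append(rng.choice(3, p=P_true[states[-1]]))

        # Point estimate of the transition matrix from transition counts.
        counts = np.zeros((3, 3))
        for a, b in zip(states[:-1], states[1:]):
            counts[a, b] += 1
        P_hat = counts / counts.sum(axis=1, keepdims=True)

        # Stationary distribution: left eigenvector of P_hat for eigenvalue 1.
        vals, vecs = np.linalg.eig(P_hat.T)
        pi = np.real(vecs[:, np.argmax(np.real(vals))])
        pi /= pi.sum()
        print("estimated stationary distribution:", np.round(pi, 3))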

  16. Interactive RadioEpidemiological Program (IREP): a web-based tool for estimating probability of causation/assigned share of radiogenic cancers.

    Science.gov (United States)

    Kocher, David C; Apostoaei, A Iulian; Henshaw, Russell W; Hoffman, F Owen; Schubauer-Berigan, Mary K; Stancescu, Daniel O; Thomas, Brian A; Trabalka, John R; Gilbert, Ethel S; Land, Charles E

    2008-07-01

    The Interactive RadioEpidemiological Program (IREP) is a Web-based, interactive computer code that is used to estimate the probability that a given cancer in an individual was induced by given exposures to ionizing radiation. IREP was developed by a Working Group of the National Cancer Institute and Centers for Disease Control and Prevention, and was adopted and modified by the National Institute for Occupational Safety and Health (NIOSH) for use in adjudicating claims for compensation for cancer under the Energy Employees Occupational Illness Compensation Program Act of 2000. In this paper, the quantity calculated in IREP is referred to as "probability of causation/assigned share" (PC/AS). PC/AS for a given cancer in an individual is calculated on the basis of an estimate of the excess relative risk (ERR) associated with given radiation exposures and the relationship PC/AS = ERR/(ERR + 1). IREP accounts for uncertainties in calculating probability distributions of ERR and PC/AS. An accounting of uncertainty is necessary when decisions about granting claims for compensation for cancer are made on the basis of an estimate of the upper 99% credibility limit of PC/AS to give claimants the "benefit of the doubt." This paper discusses models and methods incorporated in IREP to estimate ERR and PC/AS. Approaches to accounting for uncertainty are emphasized, and limitations of IREP are discussed. Although IREP is intended to provide unbiased estimates of ERR and PC/AS and their uncertainties to represent the current state of knowledge, there are situations described in this paper in which NIOSH, as a matter of policy, makes assumptions that give a higher estimate of the upper 99% credibility limit of PC/AS than other plausible alternatives and, thus, are more favorable to claimants.
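
    The central relationship PC/AS = ERR/(ERR + 1) and the role of the upper credibility limit can be illustrated with a toy Monte Carlo calculation. The lognormal uncertainty distribution assumed for ERR below is arbitrary and does not reproduce IREP's risk models.

        import numpy as np

        rng = np.random.default_rng(2008)

        # Assumed uncertainty distribution for the excess relative risk (ERR);
        # the lognormal shape and its parameters are illustrative only.
        err = rng.lognormal(mean=np.log(0.4), sigma=0.8, size=100_000)

        pc_as = err / (err + 1.0)   # probability of causation / assigned share

        print(f"median PC/AS               : {np.median(pc_as):.3f}")
        print(f"upper 99% credibility limit: {np.percentile(pc_as, 99):.3f}")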

  17. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  18. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  19. Cost and Benefit of Control Strategies - Estimation of Benefit functions, enforcement-probability function and enforcement-cost function

    DEFF Research Database (Denmark)

    Kronbak, Lone Grønbæk; Jensen, Frank

    levels and 4) the connection between different enforcement levels and costs. The purpose of estimating the functional relationships is for future application in the COBECOS computer modeling in order to carry out a cost-benefit analysis of control strategies and thereby find the optimal mix and level...

  20. Modeling and classification of knee-joint vibroarthrographic signals using probability density functions estimated with Parzen windows.

    Science.gov (United States)

    Rangayyan, Rangaraj M; Wu, Yunfeng

    2008-01-01

    Diagnostic information related to the articular cartilage surfaces of knee-joints may be derived from vibro-arthrographic (VAG) signals. Although several studies have proposed many different types of parameters for the analysis and classification of VAG signals, no statistical modeling methods have been explored to represent the fundamental distinctions between normal and abnormal VAG signals. In the present work, we derive models of probability density functions (PDFs), using the Parzen-window approach, to represent the basic statistical characteristics of normal and abnormal VAG signals. The Kullback-Leibler distance (KLD) is then computed between the PDF of the signal to be classified and the PDF models for normal and abnormal VAG signals. A classification accuracy of 73.03% was obtained with a database of 89 VAG signals. The screening efficiency was derived to be 0.6724, in terms of the area under the receiver operating characteristics curve.
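
    The classification scheme summarized above, Parzen-window density models for each class with assignment by the smaller Kullback-Leibler distance, can be sketched on one-dimensional synthetic feature values as follows. The feature, kernel bandwidths and class parameters are illustrative, not those used for the VAG database.

        import numpy as np
        from scipy.stats import gaussian_kde
        from scipy.integrate import trapezoid

        rng = np.random.default_rng(5)

        # Synthetic 1-D feature values (e.g. a signal statistic) for the two classes.
        normal_train = rng.normal(0.0, 1.0, 300)
        abnormal_train = rng.normal(1.5, 1.5, 300)

        # Parzen-window (Gaussian kernel) density models for each class.
        pdf_normal = gaussian_kde(normal_train)
        pdf_abnormal = gaussian_kde(abnormal_train)

        def kl_distance(sample, model_pdf, grid=np.linspace(-6, 8, 400)):
            """KL distance from the sample's Parzen-window PDF to a class model PDF."""
            p = gaussian_kde(sample)(grid) + 1e-12
            q = model_pdf(grid) + 1e-12
            p /= trapezoid(p, grid)
            q /= trapezoid(q, grid)
            return trapezoid(p * np.log(p / q), grid)

        # Classify a new signal by the smaller KL distance to the two class models.
        new_signal = rng.normal(1.4, 1.4, 100)
        d_norm = kl_distance(new_signal, pdf_normal)
        d_abn = kl_distance(new_signal, pdf_abnormal)
        print("classified as", "abnormal" if d_abn < d_norm else "normal")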

  1. Estimated probabilities, volumes, and inundation area depths of potential postwildfire debris flows from Carbonate, Slate, Raspberry, and Milton Creeks, near Marble, Gunnison County, Colorado

    Science.gov (United States)

    Stevens, Michael R.; Flynn, Jennifer L.; Stephens, Verlin C.; Verdin, Kristine L.

    2011-01-01

    During 2009, the U.S. Geological Survey, in cooperation with Gunnison County, initiated a study to estimate the potential for postwildfire debris flows to occur in the drainage basins occupied by Carbonate, Slate, Raspberry, and Milton Creeks near Marble, Colorado. Currently (2010), these drainage basins are unburned but could be burned by a future wildfire. Empirical models derived from statistical evaluation of data collected from recently burned basins throughout the intermountain western United States were used to estimate the probability of postwildfire debris-flow occurrence and debris-flow volumes for drainage basins occupied by Carbonate, Slate, Raspberry, and Milton Creeks near Marble. Data for the postwildfire debris-flow models included drainage basin area; area burned and burn severity; percentage of burned area; soil properties; rainfall total and intensity for the 5- and 25-year-recurrence, 1-hour-duration-rainfall; and topographic and soil property characteristics of the drainage basins occupied by the four creeks. A quasi-two-dimensional floodplain computer model (FLO-2D) was used to estimate the spatial distribution and the maximum instantaneous depth of the postwildfire debris-flow material during debris flow on the existing debris-flow fans that issue from the outlets of the four major drainage basins. The postwildfire debris-flow probabilities at the outlet of each drainage basin range from 1 to 19 percent for the 5-year-recurrence, 1-hour-duration rainfall, and from 3 to 35 percent for 25-year-recurrence, 1-hour-duration rainfall. The largest probabilities for postwildfire debris flow are estimated for Raspberry Creek (19 and 35 percent), whereas estimated debris-flow probabilities for the three other creeks range from 1 to 6 percent. The estimated postwildfire debris-flow volumes at the outlet of each creek range from 7,500 to 101,000 cubic meters for the 5-year-recurrence, 1-hour-duration rainfall, and from 9,400 to 126,000 cubic meters for

  2. Estimating User Influence in Online Social Networks Subject to Information Overload

    Science.gov (United States)

    Li, Pei; Sun, Yunchuan; Chen, Yingwen; Tian, Zhi

    2014-11-01

    Online social networks have attracted remarkable attention since they provide various approaches for hundreds of millions of people to stay connected with their friends. Due to the existence of information overload, the research on diffusion dynamics in epidemiology cannot be adopted directly to that in online social networks. In this paper, we consider diffusion dynamics in online social networks subject to information overload, and model the information-processing process of a user by a queue with a batch arrival and a finite buffer. We use the average number of times a message is processed after it is generated by a given user to characterize the user influence, which is then estimated through theoretical analysis for a given network. We validate the accuracy of our estimation by simulations, and apply the results to study the impacts of different factors on the user influence. Among the observations, we find that the impact of network size on the user influence is marginal while the user influence decreases with assortativity due to information overload, which is particularly interesting.

  3. Improved children's motor learning of the basketball free shooting pattern by associating subjective error estimation and extrinsic feedback.

    Science.gov (United States)

    Silva, Leandro de Carvalho da; Pereira-Monfredini, Carla Ferro; Teixeira, Luis Augusto

    2017-09-01

    This study aimed at assessing the interaction between subjective error estimation and frequency of extrinsic feedback in the learning of the basketball free shooting pattern by children. 10- to 12-year olds were assigned to 1 of 4 groups combining subjective error estimation and relative frequency of extrinsic feedback (33% × 100%). Analysis of performance was based on quality of movement pattern. Analysis showed superior learning of the group combining error estimation and 100% feedback frequency, both groups receiving feedback on 33% of trials achieved intermediate results, and the group combining no requirement of error estimation and 100% feedback frequency had the poorest learning. Our results show the benefit of subjective error estimation in association with high frequency of extrinsic feedback in children's motor learning of a sport motor pattern.

  4. A Method to Estimate the Probability that Any Individual Cloud-to-Ground Lightning Stroke was Within Any Radius of Any Point

    Science.gov (United States)

    Huddleston, Lisa; Roeder, WIlliam P.; Merceret, Francis J.

    2011-01-01

    A new technique has been developed to estimate the probability that a nearby cloud-to-ground lightning stroke was within a specified radius of any point of interest. This process uses the bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to determine the probability that the stroke is inside any specified radius of any location, even if that location is not centered on or even within the location error ellipse. This technique is adapted from a method of calculating the probability of debris collision with spacecraft. Such a technique is important in spaceport processing activities because it allows engineers to quantify the risk of induced current damage to critical electronics due to nearby lightning strokes. This technique was tested extensively and is now in use by space launch organizations at Kennedy Space Center and Cape Canaveral Air Force station. Future applications could include forensic meteorology.
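
    The integration described above can be approximated numerically. The following is a minimal Monte Carlo sketch, not the operational implementation used at Kennedy Space Center: it draws stroke locations from the reported error-ellipse bivariate Gaussian and counts how often they fall within the radius of interest. The ellipse covariance, the point of interest, and the radius are hypothetical values chosen for illustration.

        import numpy as np

        def prob_within_radius(mu, cov, point, radius, n=200_000, seed=0):
            # Monte Carlo estimate of P(stroke within `radius` of `point`), given a
            # bivariate Gaussian location error with mean `mu` and covariance `cov`
            # (all coordinates in the same units, e.g. km).
            rng = np.random.default_rng(seed)
            samples = rng.multivariate_normal(mu, cov, size=n)
            dist = np.linalg.norm(samples - np.asarray(point), axis=1)
            return (dist <= radius).mean()

        # Hypothetical error ellipse and asset location, in km.
        p = prob_within_radius(mu=[0.0, 0.0],
                               cov=[[0.25, 0.05], [0.05, 0.10]],
                               point=[0.5, 0.3], radius=0.4)
        print(f"P(stroke within 0.4 km) = {p:.3f}")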

  5. A Review of Mycotoxins in Food and Feed Products in Portugal and Estimation of Probable Daily Intakes.

    Science.gov (United States)

    Abrunhosa, Luís; Morales, Héctor; Soares, Célia; Calado, Thalita; Vila-Chã, Ana Sofia; Pereira, Martinha; Venâncio, Armando

    2016-01-01

    Mycotoxins are toxic secondary metabolites produced by filamentous fungi that occur naturally in agricultural commodities worldwide. Aflatoxins, ochratoxin A, patulin, fumonisins, zearalenone, trichothecenes, and ergot alkaloids are presently the most important for food and feed safety. These compounds are produced by several species that belong to the Aspergillus, Penicillium, Fusarium, and Claviceps genera and can be carcinogenic, mutagenic, teratogenic, cytotoxic, neurotoxic, nephrotoxic, estrogenic, and immunosuppressant. Human and animal exposure to mycotoxins is generally assessed by taking into account data on the occurrence of mycotoxins in food and feed as well as data on the consumption patterns of the concerned population. This evaluation is crucial to support measures to reduce consumer exposure to mycotoxins. This work reviews the occurrence and levels of mycotoxins in Portuguese food and feed to provide a global overview of this issue in Portugal. With the information collected, the exposure of the Portuguese population to those mycotoxins is assessed, and the estimated dietary intakes are presented.

  6. Probability estimation of rare extreme events in the case of small samples: Technique and examples of analysis of earthquake catalogs

    Science.gov (United States)

    Pisarenko, V. F.; Rodkin, M. V.; Rukavishnikova, T. A.

    2017-11-01

    The most general approach to studying the recurrence law in the area of the rare largest events is associated with the use of limit law theorems of the theory of extreme values. In this paper, we use the Generalized Pareto Distribution (GPD). The unknown GPD parameters are typically determined by the method of maximum likelihood (ML). However, the ML estimation is only optimal for the case of fairly large samples (>200-300), whereas in many practically important cases, there are only dozens of large events. It is shown that in the case of a small number of events, the highest accuracy when using the GPD is provided by the method of quantiles (MQs). In order to illustrate the methodological results obtained, we have formed compiled data sets characterizing the tails of the distributions for typical subduction zones, regions of intracontinental seismicity, and for the zones of midoceanic (MO) ridges. This approach paves the way for designing a new method for seismic risk assessment. Here, instead of the unstable characteristic—the uppermost possible magnitude Mmax—it is recommended to use the quantiles of the distribution of random maxima for a future time interval. The results of calculating such quantiles are presented.
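
    To make the contrast between maximum likelihood and quantile-based fitting concrete, here is a simplified sketch of a two-quantile GPD fit. It is not the authors' exact MQ estimator; the simulated excesses, the quantile levels, and the shape bracket are illustrative assumptions.

        import numpy as np
        from scipy import optimize, stats

        def gpd_fit_quantiles(excesses, p1=0.50, p2=0.90):
            # Fit a Generalized Pareto Distribution to threshold excesses by matching
            # two empirical quantiles, which can be more stable than ML for small samples.
            q1, q2 = np.quantile(excesses, [p1, p2])

            def ratio_gap(xi):
                # The ratio of GPD quantiles depends only on the shape parameter xi.
                if abs(xi) < 1e-8:
                    r = np.log(1 - p2) / np.log(1 - p1)
                else:
                    r = ((1 - p2) ** -xi - 1) / ((1 - p1) ** -xi - 1)
                return r - q2 / q1

            # Assumes the root lies in (-0.9, 0.9); widen the bracket if needed.
            xi = optimize.brentq(ratio_gap, -0.9, 0.9)
            sigma = q1 * xi / ((1 - p1) ** -xi - 1) if abs(xi) > 1e-8 else -q1 / np.log(1 - p1)
            return xi, sigma

        # Hypothetical small sample of excesses; compare with the ML fit.
        x = stats.genpareto.rvs(c=0.2, scale=0.5, size=40, random_state=1)
        print("quantile fit (shape, scale):", gpd_fit_quantiles(x))
        print("ML fit       (shape, scale):", stats.genpareto.fit(x, floc=0.0)[0::2])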

  7. Estimating the probability distribution of the incubation period for rabies using data from the 1948-1954 rabies epidemic in Tokyo.

    Science.gov (United States)

    Tojinbara, Kageaki; Sugiura, K; Yamada, A; Kakitani, I; Kwan, N C L; Sugiura, K

    2016-01-01

    Data of 98 rabies cases in dogs and cats from the 1948-1954 rabies epidemic in Tokyo were used to estimate the probability distribution of the incubation period. Lognormal, gamma and Weibull distributions were used to model the incubation period. The maximum likelihood estimates of the mean incubation period ranged from 27.30 to 28.56 days according to different distributions. The mean incubation period was shortest with the lognormal distribution (27.30 days), and longest with the Weibull distribution (28.56 days). The best distribution in terms of AIC value was the lognormal distribution with mean value of 27.30 (95% CI: 23.46-31.55) days and standard deviation of 20.20 (15.27-26.31) days. There were no significant differences between the incubation periods for dogs and cats, or between those for male and female dogs. Copyright © 2015 Elsevier B.V. All rights reserved.
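
    A minimal sketch of the model comparison described above — fitting lognormal, gamma and Weibull distributions by maximum likelihood and ranking them by AIC — using simulated stand-in data rather than the 98 Tokyo case records:

        import numpy as np
        from scipy import stats

        def fit_and_rank(days):
            # Fit candidate incubation-period distributions by maximum likelihood
            # and rank them by AIC (2k - 2 log L).
            candidates = {
                "lognormal": stats.lognorm,
                "gamma": stats.gamma,
                "weibull": stats.weibull_min,
            }
            results = {}
            for name, dist in candidates.items():
                params = dist.fit(days, floc=0)      # location fixed at zero
                loglik = np.sum(dist.logpdf(days, *params))
                k = len(params) - 1                  # loc was fixed, not estimated
                results[name] = (2 * k - 2 * loglik, params)
            return sorted(results.items(), key=lambda kv: kv[1][0])

        # Hypothetical incubation periods (days) standing in for the case records.
        days = stats.lognorm.rvs(s=0.65, scale=22.0, size=98, random_state=2)
        for name, (aic, _) in fit_and_rank(days):
            print(f"{name:10s} AIC = {aic:.1f}")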

  8. Random sampler M-estimator algorithm with sequential probability ratio test for robust function approximation via feed-forward neural networks.

    Science.gov (United States)

    El-Melegy, Moumen T

    2013-07-01

    This paper addresses the problem of fitting a functional model to data corrupted with outliers using a multilayered feed-forward neural network. Although it is of high importance in practical applications, this problem has not received careful attention from the neural network research community. One recent approach to solving this problem is to use a neural network training algorithm based on the random sample consensus (RANSAC) framework. This paper proposes a new algorithm that offers two enhancements over the original RANSAC algorithm. The first one improves the algorithm accuracy and robustness by employing an M-estimator cost function to decide on the best estimated model from the randomly selected samples. The other one improves the time performance of the algorithm by utilizing a statistical pretest based on Wald's sequential probability ratio test. The proposed algorithm is successfully evaluated on synthetic and real data, contaminated with varying degrees of outliers, and compared with existing neural network training algorithms.
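
    The core idea of scoring random-sample candidate models with a robust M-estimator cost, rather than a plain inlier count, can be sketched on a simple linear fit. This illustration omits the SPRT pretest and the neural-network model of the paper, and all data are synthetic.

        import numpy as np

        def huber_loss(r, delta=1.0):
            a = np.abs(r)
            return np.where(a <= delta, 0.5 * r**2, delta * (a - 0.5 * delta))

        def ransac_m_estimator(X, y, n_iter=200, sample_size=2, seed=0):
            # RANSAC-style search where each candidate model is scored with a
            # robust Huber cost over all points.
            rng = np.random.default_rng(seed)
            X1 = np.column_stack([np.ones(len(X)), X])        # add intercept
            best_cost, best_coef = np.inf, None
            for _ in range(n_iter):
                idx = rng.choice(len(X), size=sample_size, replace=False)
                coef, *_ = np.linalg.lstsq(X1[idx], y[idx], rcond=None)
                cost = huber_loss(y - X1 @ coef).sum()
                if cost < best_cost:
                    best_cost, best_coef = cost, coef
            return best_coef

        # Hypothetical 1-D data with 30% gross outliers.
        rng = np.random.default_rng(3)
        x = rng.uniform(0, 10, 200)
        y = 2.0 + 0.7 * x + rng.normal(0, 0.3, 200)
        out = rng.choice(200, 60, replace=False)
        y[out] += rng.normal(0, 8, 60)
        print("robust fit (intercept, slope):", ransac_m_estimator(x, y))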

  9. On the consideration of scaling properties of extreme rainfall in Madrid (Spain) for developing a generalized intensity-duration-frequency equation and assessing probable maximum precipitation estimates

    Science.gov (United States)

    Casas-Castillo, M. Carmen; Rodríguez-Solà, Raúl; Navarro, Xavier; Russo, Beniamino; Lastra, Antonio; González, Paula; Redaño, Angel

    2018-01-01

    The fractal behavior of extreme rainfall intensities registered between 1940 and 2012 by the Retiro Observatory of Madrid (Spain) has been examined, and a simple scaling regime ranging from 25 min to 3 days of duration has been identified. Thus, an intensity-duration-frequency (IDF) master equation of the location has been constructed in terms of the simple scaling formulation. The scaling behavior of probable maximum precipitation (PMP) for durations between 5 min and 24 h has also been verified. For the statistical estimation of the PMP, an envelope curve of the frequency factor (k_m) based on a total of 10,194 station-years of annual maximum rainfall from 258 stations in Spain has been developed. This curve could be useful to estimate suitable values of PMP at any point of the Iberian Peninsula from basic statistical parameters (mean and standard deviation) of its rainfall series.
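
    As a rough illustration of a Hershfield-type statistical PMP estimate of the kind referred to above — the mean annual maximum plus k_m standard deviations — the following sketch uses placeholder station statistics and an assumed envelope value of k_m, not figures from the study:

        def pmp_estimate(mean_annual_max, std_annual_max, k_m):
            # Statistical PMP: mean plus k_m standard deviations of the annual maxima.
            # k_m would be read from an envelope curve such as the one described above.
            return mean_annual_max + k_m * std_annual_max

        # Hypothetical station statistics (mm) and an assumed envelope factor.
        print(pmp_estimate(mean_annual_max=55.0, std_annual_max=18.0, k_m=15.0))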

  10. Population pharmacokinetics and maximum a posteriori probability Bayesian estimator of abacavir: application of individualized therapy in HIV-infected infants and toddlers.

    Science.gov (United States)

    Zhao, Wei; Cella, Massimo; Della Pasqua, Oscar; Burger, David; Jacqz-Aigrain, Evelyne

    2012-04-01

    Abacavir is used to treat HIV infection in both adults and children. The recommended paediatric dose is 8 mg kg(-1) twice daily up to a maximum of 300 mg twice daily. Weight was identified as the central covariate influencing pharmacokinetics of abacavir in children. A population pharmacokinetic model was developed to describe both once and twice daily pharmacokinetic profiles of abacavir in infants and toddlers. The standard dosage regimen is associated with large interindividual variability in abacavir concentrations. A maximum a posteriori probability Bayesian estimator of AUC(0-t) based on three time points (0, 1 or 2, and 3 h) is proposed to support area under the concentration-time curve (AUC) targeted individualized therapy in infants and toddlers. To develop a population pharmacokinetic model for abacavir in HIV-infected infants and toddlers, which will be used to describe both once and twice daily pharmacokinetic profiles, identify covariates that explain variability and propose optimal time points to optimize the area under the concentration-time curve (AUC) targeted dosage and individualize therapy. The pharmacokinetics of abacavir was described with plasma concentrations from 23 patients using nonlinear mixed-effects modelling (NONMEM) software. A two-compartment model with first-order absorption and elimination was developed. The final model was validated using bootstrap, visual predictive check and normalized prediction distribution errors. The Bayesian estimator was validated using the cross-validation and simulation-estimation method. The typical population pharmacokinetic parameters and relative standard errors (RSE) were apparent systemic clearance (CL) 13.4 l h−1 (RSE 6.3%), apparent central volume of distribution 4.94 l (RSE 28.7%), apparent peripheral volume of distribution 8.12 l (RSE 14.2%), apparent intercompartment clearance 1.25 l h−1 (RSE 16.9%) and absorption rate constant 0.758 h−1 (RSE 5.8%). The covariate analysis

  11. Multi-scale occupancy approach to estimate Toxoplasma gondii prevalence and detection probability in tissues: an application and guide for field sampling.

    Science.gov (United States)

    Elmore, Stacey A; Huyvaert, Kathryn P; Bailey, Larissa L; Iqbal, Asma; Su, Chunlei; Dixon, Brent R; Alisauskas, Ray T; Gajadhar, Alvin A; Jenkins, Emily J

    2016-08-01

    Increasingly, birds are recognised as important hosts for the ubiquitous parasite Toxoplasma gondii, although little experimental evidence exists to determine which tissues should be tested to maximise the detection probability of T. gondii. Also, Arctic-nesting geese are suspected to be important sources of T. gondii in terrestrial Arctic ecosystems, but the parasite has not previously been reported in the tissues of these geese. Using a domestic goose model, we applied a multi-scale occupancy framework to demonstrate that the probability of detection of T. gondii was highest in the brain (0.689, 95% confidence interval=0.486, 0.839) and the heart (0.809, 95% confidence interval=0.693, 0.888). Inoculated geese had an estimated T. gondii infection probability of 0.849, (95% confidence interval=0.643, 0.946), highlighting uncertainty in the system, even under experimental conditions. Guided by these results, we tested the brains and hearts of wild Ross's Geese (Chen rossii, n=50) and Lesser Snow Geese (Chen caerulescens, n=50) from Karrak Lake, Nunavut, Canada. We detected 51 suspected positive tissue samples from 33 wild geese using real-time PCR with melt-curve analysis. The wild goose prevalence estimates generated by our multi-scale occupancy analysis were higher than the naïve estimates of prevalence, indicating that multiple PCR repetitions on the same organs and testing more than one organ could improve T. gondii detection. Genetic characterisation revealed Type III T. gondii alleles in six wild geese and Sarcocystis spp. in 25 samples. Our study demonstrates that Arctic nesting geese are capable of harbouring T. gondii in their tissues and could transport the parasite from their southern overwintering grounds into the Arctic region. We demonstrate how a multi-scale occupancy framework can be used in a domestic animal model to guide resource-limited sample collection and tissue analysis in wildlife. Secondly, we confirm the value of traditional occupancy in

  12. Alpha-1 antitrypsin Pi*SZ genotype: estimated prevalence and number of SZ subjects worldwide

    Directory of Open Access Journals (Sweden)

    Blanco I

    2017-06-01

    The alpha-1 antitrypsin (AAT) haplotype Pi*S, when inherited along with the Pi*Z haplotype to form a Pi*SZ genotype, can be associated with pulmonary emphysema in regular smokers, and less frequently with liver disease, panniculitis, and systemic vasculitis in a small percentage of people, but this connection is less well established. Since the detection of cases can allow the application of preventive measures in patients and relatives with this congenital disorder, the objective of this study was to update the prevalence of the SZ genotype to achieve accurate estimates of the number of Pi*SZ subjects worldwide, based on studies performed according to the following criteria: 1) samples representative of the general population, 2) AAT phenotyping characterized by adequate methods, and 3) selection of studies with reliable results assessed with a coefficient of variation calculated from the sample size and 95% confidence intervals. Studies fulfilling these criteria were used to develop tables and maps with an inverse distance-weighted (IDW) interpolation method, to

  13. A model to estimate the probability of human immunodeficiency virus and hepatitis C infection despite negative nucleic acid testing among increased-risk organ donors.

    Science.gov (United States)

    Annambhotla, Pallavi D; Gurbaxani, Brian M; Kuehnert, Matthew J; Basavaraju, Sridhar V

    2017-04-01

    In 2013, guidelines were released for reducing the risk of viral bloodborne pathogen transmission through organ transplantation. Eleven criteria were described that result in a donor being designated at increased infectious risk. Human immunodeficiency virus (HIV) and hepatitis C virus (HCV) transmission risk from an increased-risk donor (IRD), despite negative nucleic acid testing (NAT), likely varies based on behavior type and timing. We developed a Monte Carlo risk model to quantify probability of HIV among IRDs. The model included NAT performance, viral load dynamics, and per-act risk of acquiring HIV by each behavior. The model also quantifies the probability of HCV among IRDs by non-medical intravenous drug use (IVDU). Highest risk is among donors with history of unprotected, receptive anal male-to-male intercourse with partner of unknown HIV status (MSM), followed by sex with an HIV-infected partner, IVDU, and sex with a commercial sex worker. With NAT screening, the estimated risk of undetected HIV remains small even at 1 day following a risk behavior. The estimated risk for HCV transmission through IVDU is likewise small and decreases quicker with time owing to the faster viral growth dynamics of HCV compared with HIV. These findings may allow for improved organ allocation, utilization, and recipient informed consent. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  14. Estimating reach-specific fish movement probabilities in rivers with a Bayesian state-space model: application to sea lamprey passage and capture at dams

    Science.gov (United States)

    Holbrook, Christopher M.; Johnson, Nicholas S.; Steibel, Juan P.; Twohey, Michael B.; Binder, Thomas R.; Krueger, Charles C.; Jones, Michael L.

    2014-01-01

    Improved methods are needed to evaluate barriers and traps for control and assessment of invasive sea lamprey (Petromyzon marinus) in the Great Lakes. A Bayesian state-space model provided reach-specific probabilities of movement, including trap capture and dam passage, for 148 acoustic tagged invasive sea lamprey in the lower Cheboygan River, Michigan, a tributary to Lake Huron. Reach-specific movement probabilities were combined to obtain estimates of spatial distribution and abundance needed to evaluate a barrier and trap complex for sea lamprey control and assessment. Of an estimated 21 828 – 29 300 adult sea lampreys in the river, 0%–2%, or 0–514 untagged lampreys, could have passed upstream of the dam, and 46%–61% were caught in the trap. Although no tagged lampreys passed above the dam (0/148), our sample size was not sufficient to consider the lock and dam a complete barrier to sea lamprey. Results also showed that existing traps are in good locations because 83%–96% of the population was vulnerable to existing traps. However, only 52%–69% of lampreys vulnerable to traps were caught, suggesting that traps can be improved. The approach used in this study was a novel use of Bayesian state-space models that may have broader applications, including evaluation of barriers for other invasive species (e.g., Asian carp (Hypophthalmichthys spp.)) and fish passage structures for other diadromous fishes.

  15. Estimating the probability and level of contamination with Salmonella of feed for finishing pigs produced in Switzerland--the impact of the production pathway.

    Science.gov (United States)

    Sauli, I; Danuser, J; Geeraerd, A H; Van Impe, J F; Rüfenacht, J; Bissig-Choisat, B; Wenk, C; Stärk, K D C

    2005-04-15

    Contaminated feed is a source of infection with Salmonella for livestock, including pigs. Because pigs rarely show clinical signs of salmonellosis, undetected carriers can enter the food production chain. In a "Farm to Fork" food safety concept, safe feed is the first step for ensuring safe food. Heat treatment or adding organic acids are process steps for reducing or eliminating a contamination with Salmonella. The aims of this study were (I) to estimate the probability and the level of Salmonella contamination in batches of feed for finishing pigs in Swiss mills and (II) to assess the efficacy of specific process steps for reducing the level of contamination with Salmonella. A quantitative release assessment was performed by gathering and combining data on the various parameters having an influence on the final contamination of feed. Fixed values and probability distributions attributed to these parameters were used as input values for a Monte Carlo simulation. The simulation showed that-depending on the production pathway-the probability that a batch of feed for finishing pigs contains Salmonella ranged from 34% (for feed on which no specific decontaminating step was applied) to 0% (for feed in which organic acids were added and a heat treatment was implemented). If contamination occurred, the level of contamination ranged from a few Salmonella kg(-1) feed to a maximum of 8E+04 Salmonella kg(-1) feed. Probability and levels of contamination were highest when no production process able to reduce or eliminate the pathogen was implemented. However, most of the Swiss production was shown to undergo some kind of decontaminating step. A heat treatment, in combination with the use of organic acids, was found as a solution of choice for the control of Salmonella in feed.
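
    The release-assessment logic — combining parameter distributions in a Monte Carlo simulation to obtain the probability and level of contamination — can be sketched as follows. Every input distribution here is an invented placeholder, not a value from the Swiss study.

        import numpy as np

        rng = np.random.default_rng(4)
        N = 100_000                                      # simulated feed batches

        # Assumed illustrative inputs, not the study's actual distributions:
        p_raw_contaminated = rng.beta(2, 4, N)           # per-batch raw-material contamination probability
        raw_level = 10 ** rng.uniform(0, 3, N)           # Salmonella per kg before processing
        heat_reduction = 10 ** rng.uniform(2, 4, N)      # 2-4 log10 reduction from heat treatment
        acid_reduction = 10 ** rng.uniform(0.5, 1.5, N)  # extra reduction from organic acids

        contaminated = rng.random(N) < p_raw_contaminated
        final_level = np.where(contaminated,
                               raw_level / (heat_reduction * acid_reduction), 0.0)
        detectable = final_level >= 1.0                  # at least one organism per kg remains

        print("P(batch still contaminated):", detectable.mean())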

  16. Estimating the Transitional Probabilities of Smoking Stages with Cross-sectional Data and 10-Year Projection for Smoking Behavior in Iranian Adolescents.

    Science.gov (United States)

    Khosravi, Ahmad; Mansournia, Mohammad Ali; Mahmoodi, Mahmood; Pouyan, Ali Akbar; Holakouie-Naieni, Kourosh

    2016-01-01

    Cigarette smoking is one of the most important health-related risk factors in terms of morbidity and mortality. In this study, we introduced a new method for deriving the transitional probabilities of smoking stages from a cross-sectional study and simulated long-term smoking behavior for adolescents. In this study in 2010, a total of 4853 high school students were randomly selected and completed a self-administered questionnaire about cigarette smoking. We used smoothed age- and sex-specific prevalence of smoking stages in a probabilistic discrete event system for estimating transitional probabilities. A nonhomogeneous discrete time Markov chain analysis was used to model the progression of smoking 10 years ahead in the same population. The mean age of the students was 15.69 ± 0.73 years (range: 14-19). The smoothed prevalence proportion of current smoking varies between 3.58 and 26.14%. The age-adjusted odds of initiation in boys is 8.9 (95% confidence interval [CI]: 7.9-10.0) times the odds of initiation in girls. Our study predicted that the prevalence proportion of current smokers would increase from 7.55% in 2010 to 20.31% (95% CI: 19.44-21.37) by 2019. The present study showed a moderate but concerning prevalence of current smoking in Iranian adolescents and introduced a novel method for estimation of transitional probabilities from a cross-sectional study. The increasing trend of cigarette use among adolescents indicated the necessity of paying more attention to this group.
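
    A minimal sketch of the projection step — propagating a prevalence vector through a discrete-time Markov chain of smoking stages — with a simplified three-state chain and assumed transition probabilities; the study's chain is nonhomogeneous and age- and sex-specific, so these numbers are illustrative only.

        import numpy as np

        # Smoking stages: never, current, former (simplified three-state chain).
        # Assumed annual transition probabilities, for illustration only.
        P = np.array([[0.92, 0.08, 0.00],    # never   -> never / current / former
                      [0.00, 0.85, 0.15],    # current -> ...
                      [0.00, 0.10, 0.90]])   # former  -> ...

        state = np.array([0.90, 0.07, 0.03]) # initial prevalence vector (e.g. 2010)
        for year in range(2010, 2020):
            print(year, np.round(state, 3))
            state = state @ P                # one-step projection of the population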

  17. A scenario tree model for the Canadian Notifiable Avian Influenza Surveillance System and its application to estimation of probability of freedom and sample size determination.

    Science.gov (United States)

    Christensen, Jette; Stryhn, Henrik; Vallières, André; El Allaki, Farouk

    2011-05-01

    In 2008, Canada designed and implemented the Canadian Notifiable Avian Influenza Surveillance System (CanNAISS) with six surveillance activities in a phased-in approach. CanNAISS was a surveillance system because it had more than one surveillance activity or component in 2008: passive surveillance; pre-slaughter surveillance; and voluntary enhanced notifiable avian influenza surveillance. Our objectives were to give a short overview of two active surveillance components in CanNAISS; describe the CanNAISS scenario tree model and its application to estimation of probability of populations being free of NAI virus infection and sample size determination. Our data from the pre-slaughter surveillance component included diagnostic test results from 6296 serum samples representing 601 commercial chicken and turkey farms collected from 25 August 2008 to 29 January 2009. In addition, we included data from a sub-population of farms with high biosecurity standards: 36,164 samples from 55 farms sampled repeatedly over the 24 months study period from January 2007 to December 2008. All submissions were negative for Notifiable Avian Influenza (NAI) virus infection. We developed the CanNAISS scenario tree model, so that it will estimate the surveillance component sensitivity and the probability of a population being free of NAI at the 0.01 farm-level and 0.3 within-farm-level prevalences. We propose that a general model, such as the CanNAISS scenario tree model, may have a broader application than more detailed models that require disease specific input parameters, such as relative risk estimates. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.
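
    The probability-of-freedom calculation that a scenario tree model feeds into can be sketched with the standard Bayesian update: given a prior probability of freedom and the surveillance component sensitivity at the design prevalence, all-negative surveillance results raise the posterior probability of freedom. The numbers below are hypothetical, not CanNAISS outputs.

        def prob_freedom(prior_free, surveillance_sensitivity):
            # Posterior probability that the population is free of infection after all
            # surveillance results come back negative; the design prevalence is implicit
            # in the component sensitivity.
            p_free = prior_free
            p_infected_missed = (1 - prior_free) * (1 - surveillance_sensitivity)
            return p_free / (p_free + p_infected_missed)

        # Hypothetical numbers: 50% prior, component sensitivity 0.95.
        print(prob_freedom(prior_free=0.5, surveillance_sensitivity=0.95))  # about 0.952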

  18. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.

  19. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.

  20. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  1. Estimations of cholesterol, triglycerides and fractionation of lipoproteins in serum samples of some Nigerian female subjects

    Directory of Open Access Journals (Sweden)

    E.I. Adeyeye

    2011-04-01

    Blood samples (serum) were collected to determine some biochemical parameters: triglycerides (TG), total cholesterol (TC), high density lipoprotein-cholesterol (HDL-C), low density lipoprotein-cholesterol (LDL-C) and very low density lipoprotein-cholesterol (VLDL-C) in 53 female subjects in Warri, Delta State, Nigeria using the Reflotron® (an auto analyser), supported with the use of a questionnaire to get information on age and sex. Age range of the subjects was 18–80 years. The TG levels in all the subjects were < 200 mg/dL; only one subject (1.89%) had TC < 200 mg/dL; nine subjects (17.0%) had HDL-C ≤ 35 mg/dL; for LDL-C only one subject (1.89%) had a desirable level of < 130 mg/dL; for VLDL-C, 29 subjects (54.7%) had values of 17.2 mg/dL and above. For therapeutic decision-making, TC/HDL-C and LDL-C/HDL-C were calculated. In TC/HDL-C, three subjects (5.66%) had values < 4.4 and in LDL-C/HDL-C, 41 subjects (77.4%) had values < 4.5. Hence, the TC, HDL-C, LDL-C, TC/HDL-C and, to a lesser extent, LDL-C/HDL-C and VLDL-C values of the subjects could lead to increased coronary heart disease. Results were matched for the age and sex of subjects.

  2. Composite Estimation for Single-Index Models with Responses Subject to Detection Limits

    KAUST Repository

    Tang, Yanlin

    2017-11-03

    We propose a semiparametric estimator for single-index models with censored responses due to detection limits. In the presence of left censoring, the mean function cannot be identified without any parametric distributional assumptions, but the quantile function is still identifiable at upper quantile levels. To avoid parametric distributional assumption, we propose to fit censored quantile regression and combine information across quantile levels to estimate the unknown smooth link function and the index parameter. Under some regularity conditions, we show that the estimated link function achieves the non-parametric optimal convergence rate, and the estimated index parameter is asymptotically normal. The simulation study shows that the proposed estimator is competitive with the omniscient least squares estimator based on the latent uncensored responses for data with normal errors but much more efficient for heavy-tailed data under light and moderate censoring. The practical value of the proposed method is demonstrated through the analysis of a human immunodeficiency virus antibody data set.

  3. Probability estimates of heavy precipitation events in a flood-prone central-European region with enhanced influence of Mediterranean cyclones

    Science.gov (United States)

    Kysely, J.; Picek, J.

    2007-07-01

    Due to synoptic-climatological reasons as well as a specific configuration of mountain ranges, the northeast part of the Czech Republic is an area with an enhanced influence of low-pressure systems of the Mediterranean origin. They are associated with an upper-level advection of warm and moist air and often lead to heavy precipitation events. Particularities of this area are evaluated using a regional frequency analysis. The northeast region is identified as a homogeneous one according to tests on statistical characteristics of precipitation extremes (annual maxima of 1- to 7-day amounts), and observed distributions follow a different model compared to the surrounding area. Noteworthy is the heavy tail of distributions of multi-day events, reflected also in inapplicability of the L-moment estimators for the general 4-parameter kappa distribution utilized in Monte Carlo simulations in regional homogeneity and goodness-of-fit tests. We overcome this issue by using the maximum likelihood estimation. The Generalized Logistic distribution is identified as the most suitable one for modelling annual maxima; advantages of the regional over local approach to the frequency analysis consist mainly in reduced uncertainty of the growth curves and design value estimates. The regional growth curves are used to derive probabilities of recurrence of recent heavy precipitation events associated with major floods in the Odra river basin.

  4. Estimation of physical activity levels using cell phone questionnaires: a comparison with accelerometry for evaluation of between-subject and within-subject variations.

    Science.gov (United States)

    Bexelius, Christin; Sandin, Sven; Trolle Lagerros, Ylva; Litton, Jan-Eric; Löf, Marie

    2011-09-25

    Physical activity promotes health and longevity. Further elaboration of the role of physical activity for human health in epidemiological studies on large samples requires accurate methods that are easy to use, cheap, and possible to repeat. The use of telecommunication technologies such as cell phones is highly interesting in this respect. In an earlier report, we showed that physical activity level (PAL) assessed using a cell phone procedure agreed well with corresponding estimates obtained using the doubly labeled water method. However, our earlier study indicated high within-subject variation in relation to between-subject variations in PAL using cell phones, but we could not assess if this was a true variation of PAL or an artifact of the cell phone technique. Our objective was to compare within- and between-subject variations in PAL by means of cell phones with corresponding estimates using an accelerometer. In addition, we compared the agreement of daily PAL values obtained using the cell phone questionnaire with corresponding data obtained using an accelerometer. PAL was measured both with the cell phone questionnaire and with a triaxial accelerometer daily during a 2-week study period in 21 healthy Swedish women (20 to 45 years of age and BMI from 17.7 kg/m² to 33.6 kg/m²). The results were evaluated by fitting linear mixed effect models and descriptive statistics and graphs. With the accelerometer, 57% (95% confidence interval [CI] 40%-66%) of the variation was within subjects, while with the cell phone, within-subject variation was 76% (95% CI 59%-83%). The day-to-day variations in PAL observed using the cell phone questions agreed well with the corresponding accelerometer results. Both the cell phone questionnaire and the accelerometer showed high within-subject variations. Furthermore, day-to-day variations in PAL within subjects assessed using the cell phone agreed well with corresponding accelerometer values. Consequently, our cell phone

  5. Estimating Intervention Effects across Different Types of Single-Subject Experimental Designs: Empirical Illustration

    Science.gov (United States)

    Moeyaert, Mariola; Ugille, Maaike; Ferron, John M.; Onghena, Patrick; Heyvaert, Mieke; Beretvas, S. Natasha; Van den Noortgate, Wim

    2015-01-01

    The purpose of this study is to illustrate the multilevel meta-analysis of results from single-subject experimental designs of different types, including AB phase designs, multiple-baseline designs, ABAB reversal designs, and alternating treatment designs. Current methodological work on the meta-analysis of single-subject experimental designs…

  6. Combining information from surveys of several species to estimate the probability of freedom from Echinococcus multilocularis in Sweden, Finland and mainland Norway

    Directory of Open Access Journals (Sweden)

    Hjertqvist Marika

    2011-02-01

    Background: The fox tapeworm Echinococcus multilocularis has foxes and other canids as definitive hosts and rodents as intermediate hosts. However, most mammals can be accidental intermediate hosts and the larval stage may cause serious disease in humans. The parasite has never been detected in Sweden, Finland and mainland Norway. All three countries currently require an anthelminthic treatment for dogs and cats prior to entry in order to prevent introduction of the parasite. Documentation of freedom from E. multilocularis is necessary for justification of the present import requirements. Methods: The probability that Sweden, Finland and mainland Norway were free from E. multilocularis and the sensitivity of the surveillance systems were estimated using scenario trees. Surveillance data from five animal species were included in the study: red fox (Vulpes vulpes), raccoon dog (Nyctereutes procyonoides), domestic pig, wild boar (Sus scrofa) and voles and lemmings (Arvicolinae). Results: The cumulative probability of freedom from E. multilocularis in December 2009 was high in all three countries: 0.98 (95% CI 0.96-0.99) in Finland, 0.99 (0.97-0.995) in Sweden and 0.98 (0.95-0.99) in Norway. Conclusions: Results from the model confirm that there is a high probability that in 2009 the countries were free from E. multilocularis. The sensitivity analyses showed that the choice of the design prevalences in different infected populations was influential. Therefore more knowledge on expected prevalences for E. multilocularis in infected populations of different species is desirable to reduce residual uncertainty of the results.

  7. Modification of electrical pain threshold by voluntary breathing-controlled electrical stimulation (BreEStim) in healthy subjects.

    Directory of Open Access Journals (Sweden)

    Shengai Li

    BACKGROUND: Pain has a distinct sensory and affective (i.e., unpleasantness) component. BreEStim, during which electrical stimulation is delivered during voluntary breathing, has been shown to selectively reduce the affective component of post-amputation phantom pain. The objective was to examine whether BreEStim increases pain threshold such that subjects could have improved tolerance of sensation of painful stimuli. METHODS: Eleven pain-free healthy subjects (7 males, 4 females) participated in the study. All subjects received BreEStim (100 stimuli) and conventional electrical stimulation (EStim, 100 stimuli) to two acupuncture points (Neiguan and Weiguan) of the dominant hand in a random order. The two different treatments were provided at least three days apart. Painful, but tolerable electrical stimuli were delivered randomly during EStim, but were triggered by effortful inhalation during BreEStim. Measurements of tactile sensation threshold, electrical sensation and electrical pain thresholds, and thermal (cold sensation, warm sensation, cold pain and heat pain) thresholds were recorded from the thenar eminence of both hands. These measurements were taken pre-intervention and 10-min post-intervention. RESULTS: There was no difference in the pre-intervention baseline measurement of all thresholds between BreEStim and EStim. The electrical pain threshold significantly increased after BreEStim (27.5±6.7% for the dominant hand and 28.5±10.8% for the non-dominant hand, respectively). The electrical pain threshold significantly decreased after EStim (9.1±2.8% for the dominant hand and 10.2±4.6% for the non-dominant hand, respectively) (F[1, 10] = 30.992, p = .00024). There was no statistically significant change in other thresholds after BreEStim and EStim. The intensity of electrical stimuli was progressively increased, but no difference was found between BreEStim and EStim. CONCLUSION: Voluntary breathing-controlled electrical stimulation

  8. The objective-subjective assessment of noise: young adults can estimate loudness of events and lifestyle noise.

    Science.gov (United States)

    Beach, Elizabeth Francis; Williams, Warwick; Gilliver, Megan

    2012-06-01

    The aim of the study was to establish whether individuals can subjectively estimate: (1) the loudness of events with respect to the objectively measured noise level; and (2) the overall loudness of their daily noise exposure level. Participants wore personal noise exposure meters for up to five days. During this time, participants kept diaries of daily events and estimated the loudness of these events and their overall noise exposure using 1-to-10 rating scales. A group of 45 volunteers aged between 18 and 35 years participated in the study. 86% of participants' subjective estimates were significantly correlated with the objective noise measurements. Multiple regression showed that age, overall lifestyle noise, and diary quality were predictors of the strength of correlation observed. In addition participants' subjective estimates of their overall noise exposure were significantly correlated with their actual average daily noise exposure. Results suggest that individuals can make a reasonable estimate of the loudness of events they experience and the overall level of noise they experience. These results may have significant influence for those interested in producing effective hearing health awareness programs in that individuals may be capable of assessing their own degree of hazard exposure.

  9. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  10. Probability theory

    CERN Document Server

    S Varadhan, S R

    2001-01-01

    This volume presents topics in probability theory covered during a first-year graduate course given at the Courant Institute of Mathematical Sciences. The necessary background material in measure theory is developed, including the standard topics, such as extension theorem, construction of measures, integration, product spaces, Radon-Nikodym theorem, and conditional expectation. In the first part of the book, characteristic functions are introduced, followed by the study of weak convergence of probability distributions. Then both the weak and strong limit theorems for sums of independent random variables are covered.

  11. Sensor Fusion Based on an Integrated Neural Network and Probability Density Function (PDF) Dual Kalman Filter for On-Line Estimation of Vehicle Parameters and States.

    Science.gov (United States)

    Vargas-Melendez, Leandro; Boada, Beatriz L; Boada, Maria Jesus L; Gauchia, Antonio; Diaz, Vicente

    2017-04-29

    Vehicles with a high center of gravity (COG), such as light trucks and heavy vehicles, are prone to rollover. This kind of accident causes nearly 33% of all deaths from passenger vehicle crashes. Nowadays, these vehicles are incorporating roll stability control (RSC) systems to improve their safety. Most of the RSC systems require the vehicle roll angle as a known input variable to predict the lateral load transfer. The vehicle roll angle can be directly measured by a dual antenna global positioning system (GPS), but it is expensive. For this reason, it is important to estimate the vehicle roll angle from sensors installed onboard in current vehicles. On the other hand, knowledge of the vehicle's parameter values is essential to obtain an accurate vehicle response. Some vehicle parameters cannot be easily obtained and they can vary over time. In this paper, an algorithm for the simultaneous on-line estimation of the vehicle's roll angle and parameters is proposed. This algorithm uses a probability density function (PDF)-based truncation method in combination with a dual Kalman filter (DKF), to guarantee that both the vehicle's states and parameters are within bounds that have a physical meaning, using the information obtained from sensors mounted on vehicles. Experimental results show the effectiveness of the proposed algorithm.

  12. Modeller subjectivity in estimating pesticide parameters for leaching models using the same laboratory data set

    NARCIS (Netherlands)

    Boesten, J.J.T.I.

    2000-01-01

    User-dependent subjectivity in the process of testing pesticide leaching models is relevant because it may result in wrong interpretation of model tests. About 20 modellers used the same data set to test pesticide leaching models (one or two models per modeller). The data set included laboratory

  13. Elaboration of a clinical and paraclinical score to estimate the probability of herpes simplex virus encephalitis in patients with febrile, acute neurologic impairment.

    Science.gov (United States)

    Gennai, S; Rallo, A; Keil, D; Seigneurin, A; Germi, R; Epaulard, O

    2016-06-01

    Herpes simplex virus (HSV) encephalitis is associated with a high risk of mortality and sequelae, and early diagnosis and treatment in the emergency department are necessary. However, most patients present with non-specific febrile, acute neurologic impairment; this may lead clinicians to overlook the diagnosis of HSV encephalitis. We aimed to identify which data collected in the first hours in a medical setting were associated with the diagnosis of HSV encephalitis. We conducted a multicenter retrospective case-control study in four French public hospitals from 2007 to 2013. The cases were the adult patients who received a confirmed diagnosis of HSV encephalitis. The controls were all the patients who attended the emergency department of Grenoble hospital with a febrile acute neurologic impairment, without HSV detection by polymerase chain reaction (PCR) in the cerebrospinal fluid (CSF), in 2012 and 2013. A multivariable logistic model was elaborated to estimate factors significantly associated with HSV encephalitis. Finally, an HSV probability score was derived from the logistic model. We identified 36 cases and 103 controls. Factors independently associated with HSV encephalitis were the absence of past neurological history (odds ratio [OR] 6.25 [95 % confidence interval (CI): 2.22-16.7]), the occurrence of seizure (OR 8.09 [95 % CI: 2.73-23.94]), a systolic blood pressure ≥140 mmHg (OR 5.11 [95 % CI: 1.77-14.77]), and a C-reactive protein <10 mg/L (OR 9.27 [95 % CI: 2.98-28.88]). An HSV probability score was calculated summing the value attributed to each independent factor. HSV encephalitis diagnosis may benefit from the use of this score based upon some easily accessible data. However, diagnostic evocation and probabilistic treatment must remain the rule.
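
    A sketch of how a clinical probability score of this kind can be assembled from the reported odds ratios: the points are the log odds ratios of the four independent factors. The intercept (baseline log-odds) is not given in the abstract, so the assumed value and the probability returned below are illustrative, not the published score.

        import math

        # Odds ratios reported in the abstract for each independent factor.
        ODDS_RATIOS = {
            "no_past_neurological_history": 6.25,
            "seizure": 8.09,
            "systolic_bp_ge_140": 5.11,
            "crp_lt_10": 9.27,
        }

        def hsv_log_odds_score(findings, intercept=0.0):
            # Sum the log odds ratios of the factors present, then convert the
            # resulting log-odds to a probability; `intercept` is an assumption.
            score = intercept + sum(math.log(or_)
                                    for f, or_ in ODDS_RATIOS.items() if findings.get(f))
            return score, 1 / (1 + math.exp(-score))

        print(hsv_log_odds_score({"seizure": True, "crp_lt_10": True}, intercept=-5.0))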

  14. The Use of Generalizability Theory to Estimate Data Reliability in Single-Subject Observational Research

    Science.gov (United States)

    Lei, Pui-Wa; Smith, Maria; Suen, Hoi K.

    2007-01-01

    Direct observation of behaviors is a data collection method customarily used in clinical and educational settings. Repeated measures and small samples are inherent characteristics of observational studies that pose challenges to the numerical estimation of reliability for observational data. In this article, we review some debates about the use of…

  15. A two-part model for reference curve estimation subject to a limit of detection

    OpenAIRE

    Zhang, Z.; Addo, O. Y.; Himes, J. H.; Hediger, M L; Albert, P S; Gollenberg, A. L.; Lee, P. A.; Louis, G. M. Buck

    2011-01-01

    Reference curves are commonly used to identify individuals with extreme values of clinically relevant variables or stages of progression which depend naturally on age or maturation. Estimation of reference curves can be complicated by a technical limit of detection (LOD) that censors the measurement from the left, as is the case in our study of reproductive hormone levels in boys around the time of the onset of puberty. We discuss issues with common approaches to the LOD problem in the context...

  16. Estimating species – area relationships by modeling abundance and frequency subject to incomplete sampling

    Science.gov (United States)

    Yamaura, Yuichi; Connor, Edward F.; Royle, Andy; Itoh, Katsuo; Sato, Kiyoshi; Taki, Hisatomo; Mishima, Yoshio

    2016-01-01

    Models and data used to describe species–area relationships confound sampling with ecological process as they fail to acknowledge that estimates of species richness arise due to sampling. This compromises our ability to make ecological inferences from and about species–area relationships. We develop and illustrate hierarchical community models of abundance and frequency to estimate species richness. The models we propose separate sampling from ecological processes by explicitly accounting for the fact that sampled patches are seldom completely covered by sampling plots and that individuals present in the sampling plots are imperfectly detected. We propose a multispecies abundance model in which community assembly is treated as the summation of an ensemble of species-level Poisson processes and estimate patch-level species richness as a derived parameter. We use sampling process models appropriate for specific survey methods. We propose a multispecies frequency model that treats the number of plots in which a species occurs as a binomial process. We illustrate these models using data collected in surveys of early-successional bird species and plants in young forest plantation patches. Results indicate that only mature forest plant species deviated from the constant density hypothesis, but the null model suggested that the deviations were too small to alter the form of species–area relationships. Nevertheless, results from simulations clearly show that the aggregate pattern of individual species density–area relationships and occurrence probability–area relationships can alter the form of species–area relationships. The plant community model estimated that only half of the species present in the regional species pool were encountered during the survey. The modeling framework we propose explicitly accounts for sampling processes so that ecological processes can be examined free of sampling artefacts. Our modeling approach is extensible and could be applied

  17. Comparison of an automated Most Probable Number (MPN) technique to traditional plating methods for estimating populations of total aerobes, coliforms and E. coli associated with freshly processed broiler chickens

    Science.gov (United States)

    Recently, an instrument (TEMPO™) has been developed to automate the Most Probable Number (MPN) technique and reduce the effort required to estimate some bacterial populations. We compared the automated MPN technique to traditional microbiological plating methods or Petrifilm™ for estimating the t...
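
    For reference, the Most Probable Number itself is a maximum likelihood estimate: each tube is scored positive if it received at least one organism under a Poisson assumption. Below is a minimal sketch of a generic textbook MPN calculation, not the TEMPO instrument's algorithm; the dilution series in the example is a classic illustration, not data from the study.

        import numpy as np
        from scipy.optimize import minimize_scalar

        def mpn(volumes_ml, tubes, positives):
            # Maximum likelihood Most Probable Number (organisms per mL) for a
            # dilution series: P(tube positive) = 1 - exp(-lambda * volume).
            v = np.asarray(volumes_ml, float)
            n = np.asarray(tubes, float)
            p = np.asarray(positives, float)

            def neg_loglik(log_lam):
                lam = np.exp(log_lam)
                prob_pos = 1.0 - np.exp(-lam * v)
                return -np.sum(p * np.log(prob_pos + 1e-300) - (n - p) * lam * v)

            res = minimize_scalar(neg_loglik, bounds=(-10, 10), method="bounded")
            return np.exp(res.x)

        # Classic 3-dilution, 3-tube example: 10, 1 and 0.1 mL of sample per tube.
        print(round(mpn([10, 1, 0.1], [3, 3, 3], [3, 1, 0]), 2), "organisms/mL")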

  18. Using competing speech to estimate articulatory automatization in children: the possible effect of masking level and subject grade.

    Science.gov (United States)

    Manning, W H; Scheer, B R

    1978-09-01

    In order to study the possible influence of masking level and subject grade on a procedure for determining a child's articulatory automatization (Manning et al., 1976) 47 first and second grade and 49 third and fourth grade children were administered the McDonald Deep Test of Articulation under one of five conditions of auditory masking (earphones only, or presentation of competing speech at 50, 60, 70, or 80 dB SPL). Results indicated no significant difference in subject performance across the factors of masking level and subject grade. The findings suggest that these factors do not appear to be critical in the clinical application of the suggested procedure for estimating children's automatization of newly acquired phonemes.

  19. Incorporating a Process-Based Land Use Variable into Species- Distribution Modelling and an Estimated Probability of Species Occurrence Into a Land Change Model: A Case of Albania

    Science.gov (United States)

    Laze, Kuenda

    2016-08-01

    Modelling of land use may be improved by incorporating the results of species distribution modelling, and species distribution modelling may be upgraded if a process-based variable of forest-cover change, or the accessibility of forest from human settlements, is included. This work presents the results of spatially explicit analyses of the changes in forest cover from 2000 to 2007 using Geographically Weighted Regression (GWR) and of the distribution of the protected species Lynx lynx martinoi and Ursus arctos using Generalized Linear Models (GLMs). The methodological approach searches separately for a parsimonious model of forest-cover change and of species distribution for the entire territory of Albania. The findings show that modelling of land change and of species distribution is indeed value-added, as indicated by improved corrected Akaike Information Criterion values in model selection. These results provide evidence of the effects of process-based variables on species distribution modelling and on the performance of species distribution modelling, and show an example of incorporating the estimated probability of species occurrence into a land change model.

  20. Probability Estimates of Solar Particle Event Doses During a Period of Low Sunspot Number for Thinly-Shielded Spacecraft and Short Duration Missions

    Science.gov (United States)

    Atwell, William; Tylka, Allan J.; Dietrich, William; Rojdev, Kristina; Matzkind, Courtney

    2016-01-01

    In an earlier paper (Atwell, et al., 2015), we investigated solar particle event (SPE) radiation exposures (absorbed dose) to small, thinly-shielded spacecraft during a period when the sunspot number (SSN) was less than 30. These SPEs contain Ground Level Events (GLE), sub-GLEs, and sub-sub-GLEs (Tylka and Dietrich, 2009, Tylka and Dietrich, 2008, and Atwell, et al., 2008). GLEs are extremely energetic solar particle events having proton energies extending into the several GeV range and producing secondary particles in the atmosphere, mostly neutrons, observed with ground station neutron monitors. Sub-GLE events are less energetic, extending into the several hundred MeV range, but do not produce secondary atmospheric particles. Sub-sub GLEs are even less energetic with an observable increase in protons at energies greater than 30 MeV, but no observable proton flux above 300 MeV. In this paper, we consider those SPEs that occurred during 1973-2010 when the SSN was greater than 30 but less than 50. In addition, we provide probability estimates of absorbed dose based on mission duration with a 95% confidence level (CL). We also discuss the implications of these data and provide some recommendations that may be useful to spacecraft designers of these smaller spacecraft.

  1. GIS-based estimation of the winter storm damage probability in forests: a case study from Baden-Wuerttemberg (Southwest Germany).

    Science.gov (United States)

    Schindler, Dirk; Grebhan, Karin; Albrecht, Axel; Schönborn, Jochen; Kohnle, Ulrich

    2012-01-01

    Data on storm damage attributed to the two high-impact winter storms 'Wiebke' (28 February 1990) and 'Lothar' (26 December 1999) were used for GIS-based estimation and mapping (in a 50 × 50 m resolution grid) of the winter storm damage probability (P(DAM)) for the forests of the German federal state of Baden-Wuerttemberg (Southwest Germany). The P(DAM)-calculation was based on weights of evidence (WofE) methodology. A combination of information on forest type, geology, soil type, soil moisture regime, and topographic exposure, as well as maximum gust wind speed field was used to compute P(DAM) across the entire study area. Given the condition that maximum gust wind speed during the two storm events exceeded 35 m s(-1), the highest P(DAM) values computed were primarily where coniferous forest grows in severely exposed areas on temporarily moist soils on bunter sandstone formations. Such areas are found mainly in the mountainous ranges of the northern Black Forest, the eastern Forest of Odes, in the Virngrund area, and in the southwestern Alpine Foothills.
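
    The weights-of-evidence step can be sketched for a single binary evidential layer: the positive weight W+ compares how often damaged cells fall inside the layer with how often undamaged cells do, and the weights of all layers are added to the prior log-odds to obtain P(DAM). The cell counts below are hypothetical, not values from the Baden-Wuerttemberg data.

        import numpy as np

        def weights_of_evidence(n_damage_in, n_damage_out, n_in, n_out):
            # Positive and negative weights for one binary evidential layer
            # (e.g. "coniferous forest on moist bunter-sandstone soil").
            p_b_given_d = n_damage_in / (n_damage_in + n_damage_out)
            p_b_given_nd = (n_in - n_damage_in) / ((n_in - n_damage_in) + (n_out - n_damage_out))
            w_plus = np.log(p_b_given_d / p_b_given_nd)
            w_minus = np.log((1 - p_b_given_d) / (1 - p_b_given_nd))
            return w_plus, w_minus

        # Hypothetical counts: 400 of 1000 damaged cells lie inside the layer,
        # which covers 2000 of the 10000 grid cells overall.
        print(weights_of_evidence(n_damage_in=400, n_damage_out=600, n_in=2000, n_out=8000))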

  2. Estimating the Burden of Leptospirosis among Febrile Subjects Aged below 20 Years in Kampong Cham Communities, Cambodia, 2007-2009.

    Science.gov (United States)

    Hem, Sopheak; Ly, Sowath; Votsi, Irene; Vogt, Florian; Asgari, Nima; Buchy, Philippe; Heng, Seiha; Picardeau, Mathieu; Sok, Touch; Ly, Sovann; Huy, Rekol; Guillard, Bertrand; Cauchemez, Simon; Tarantola, Arnaud

    2016-01-01

    Leptospirosis is an emerging but neglected public health challenge in the Asia/Pacific Region with an annual incidence estimated at 10-100 per 100,000 population. No accurate data, however, are available for at-risk rural Cambodian communities. We conducted anonymous, unlinked testing for IgM antibodies to Leptospira spp. on paired sera of Cambodian patients. Leptospirosis testing was done on paired serological samples negative for Dengue, Japanese encephalitis and Chikungunya viruses after random selection. Convalescent samples found positive while initial samples were negative were considered as proof of acute infection. We then applied a mathematical model to estimate the risk of fever caused by leptospirosis, dengue or other causes in rural Cambodia. A total of 630 samples came from a randomly selected subset of 2,358 samples. IgM-positive results were found on the convalescent serum sample, among which 100 (15.8%) samples had been IgM negative on an earlier sample. Seventeen of these 100 seroconversions were confirmed using a Microagglutination Test. We estimated the probability of having a fever due to leptospirosis at 1.03% (95% Credible Interval [CI]: 0.95%-1.22%) per semester. In comparison, this probability was 2.61% (95% CI: 2.55%, 2.83%) for dengue and 17.65% (95% CI: 17.49%, 18.08%) for other causes. Our data from febrile cases aged below 20 years suggest that the burden of leptospirosis is high in rural Cambodian communities. This is especially true during the rainy season, even in the absence of identified epidemics.

  3. Validation of air-displacement plethysmography for estimation of body fat mass in healthy elderly subjects.

    Science.gov (United States)

    Bosy-Westphal, A; Mast, M; Eichhorn, C; Becker, C; Kutzner, D; Heller, M; Müller, M J

    2003-08-01

    Air-displacement plethysmography (ADP) is a non-invasive method for body composition analysis that divides the body into fat-free mass (FFM) and fat mass (FM) (a two-compartment model, 2C). It places low demands on subject performance and is therefore most convenient in the elderly. The aim was to validate ADP against dual-energy X-ray absorptiometry (DEXA) and to compare it to a four-compartment model of body composition (4C; fat mass, total body water, bone mineral content and residual mass) in the elderly. Body composition was assessed by ADP, DEXA and bioelectrical impedance analysis (BIA) in 26 healthy elderly subjects (15 women, 11 men) aged 60-82 years. Despite a high correlation of %FM assessed by ADP and DEXA, we observed significant differences between the results of these methods for both sexes (2.5 +/- 3.4%; bias +/- SD). Deviations of %FM(ADP) from %FM(DEXA) were dependent on the bone mineral content (BMC(DEXA)) fraction of FFM. A low BMC(DEXA) was related to an overestimation of DEXA-derived %FM by ADP. There was a systematic bias between results from ADP and the 4C model; 76% of its variance was explained by the assumption of a fixed density of FFM. 96% of the variance in the density of FFM was explained by water content and only 4% by the BMC(DEXA) fraction of FFM. When compared to a 4C model, overestimation of %FM(ADP) increases with increasing water fraction of FFM. Although there is a tendency for overestimation of %FM(ADP), ADP is a valid method for body composition measurement in the elderly. The bias in %FM(ADP) is mainly related to the water content of FFM and indicates that a correction factor for TBW may improve the accuracy of ADP measurements in the elderly.
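    ADP measures body volume, which together with body mass gives whole-body density; a two-compartment equation then converts density to %FM under an assumed constant FFM density, and it is precisely that assumption the study identifies as the main source of bias in the elderly. A sketch using the widely used Siri conversion (the record does not state which 2C equation the authors used, so this choice is an assumption):

        def percent_fat_siri(mass_kg, body_volume_l):
            """Two-compartment %FM from whole-body density (Siri equation);
            assumes a fixed fat-free mass density, which drifts with hydration
            and bone mineral content in the elderly."""
            density = mass_kg / body_volume_l      # g/ml when mass is in kg and volume in litres
            return (495.0 / density) - 450.0

        # Illustrative subject: 70 kg body mass, 68 litres body volume from ADP.
        print(round(percent_fat_siri(mass_kg=70.0, body_volume_l=68.0), 1))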

  4. Normal Tissue Complication Probability Estimation by the Lyman-Kutcher-Burman Method Does Not Accurately Predict Spinal Cord Tolerance to Stereotactic Radiosurgery

    Energy Technology Data Exchange (ETDEWEB)

    Daly, Megan E.; Luxton, Gary [Department of Radiation Oncology, Stanford University, Stanford, CA (United States); Choi, Clara Y.H. [Department of Neurosurgery, Stanford University, Stanford, CA (United States); Gibbs, Iris C. [Department of Radiation Oncology, Stanford University, Stanford, CA (United States); Chang, Steven D.; Adler, John R. [Department of Neurosurgery, Stanford University, Stanford, CA (United States); Soltys, Scott G., E-mail: sgsoltys@stanford.edu [Department of Radiation Oncology, Stanford University, Stanford, CA (United States)

    2012-04-01

    Purpose: To determine whether normal tissue complication probability (NTCP) analyses of the human spinal cord by use of the Lyman-Kutcher-Burman (LKB) model, supplemented by linear-quadratic modeling to account for the effect of fractionation, predict the risk of myelopathy from stereotactic radiosurgery (SRS). Methods and Materials: From November 2001 to July 2008, 24 spinal hemangioblastomas in 17 patients were treated with SRS. Of the tumors, 17 received 1 fraction with a median dose of 20 Gy (range, 18-30 Gy) and 7 received 20 to 25 Gy in 2 or 3 sessions, with cord maximum doses of 22.7 Gy (range, 17.8-30.9 Gy) and 22.0 Gy (range, 20.2-26.6 Gy), respectively. By use of conventional values for α/β, volume parameter n, 50% complication probability dose TD50, and inverse slope parameter m, a computationally simplified implementation of the LKB model was used to calculate the biologically equivalent uniform dose and NTCP for each treatment. Exploratory calculations were performed with alternate values of α/β and n. Results: In this study 1 case (4%) of myelopathy occurred. The LKB model using radiobiological parameters from Emami and the logistic model with parameters from Schultheiss overestimated complication rates, predicting 13 complications (54%) and 18 complications (75%), respectively. An increase in the volume parameter (n), to assume greater parallel organization, improved the predictive value of the models. Maximum-likelihood LKB fitting of α/β and n yielded better predictions (0.7 complications), with n = 0.023 and α/β = 17.8 Gy. Conclusions: The spinal cord tolerance to the dosimetry of SRS is higher than predicted by the LKB model using any set of accepted parameters. Only a high α/β value in the LKB model and only a large volume effect in the logistic model with Schultheiss data could explain the low number of complications observed. This finding emphasizes that radiobiological models
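    The LKB calculation referred to above reduces the dose-volume histogram to a generalized equivalent uniform dose through the volume parameter n and then maps it through a probit curve defined by TD50 and m. A minimal sketch of that standard formulation (the DVH and parameter values below are placeholders, not the fitted values reported in the paper, and the linear-quadratic fractionation correction is omitted):

        import math

        def lkb_ntcp(dose_gy, vol_frac, n, td50, m):
            """Lyman-Kutcher-Burman NTCP: gEUD via the n-weighted power-law DVH
            reduction, then a probit dose-response with parameters TD50 and m."""
            geud = sum(v * d ** (1.0 / n) for d, v in zip(dose_gy, vol_frac)) ** n
            t = (geud - td50) / (m * td50)
            return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))   # standard normal CDF

        # Toy differential DVH: 10% of the cord volume at 20 Gy, 90% at 5 Gy.
        print(round(lkb_ntcp([20.0, 5.0], [0.1, 0.9], n=0.05, td50=66.5, m=0.175), 5))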

  5. Positron emission tomography/computerised tomography imaging in detecting and managing recurrent cervical cancer: systematic review of evidence, elicitation of subjective probabilities and economic modelling.

    Science.gov (United States)

    Meads, C; Auguste, P; Davenport, C; Małysiak, S; Sundar, S; Kowalska, M; Zapalska, A; Guest, P; Thangaratinam, S; Martin-Hirsch, P; Borowiack, E; Barton, P; Roberts, T; Khan, K

    2013-03-01

    Cancer of the uterine cervix is a common cause of mortality in women. After initial treatment women may be symptom free, but the cancer may recur within a few years. It is uncertain whether it is more clinically effective to survey asymptomatic women for signs of recurrence or to await symptoms or signs before using imaging. This project compared the diagnostic accuracy of imaging using positron emission tomography/computerised tomography (PET-CT) with that of imaging using CT or magnetic resonance imaging (MRI) alone and evaluated the cost-effectiveness of adding PET-CT as an adjunct to standard practice. Standard systematic review methods were used to obtain and evaluate relevant test accuracy and effectiveness studies. Databases searched included MEDLINE, EMBASE, Science Citation Index and The Cochrane Library. All databases were searched from inception to May 2010. Study quality was assessed using appropriately modified Quality Assessment of Diagnostic Accuracy Studies (QUADAS) criteria. Included were any studies of PET-CT, MRI or CT compared with the reference standard of histopathological findings or clinical follow-up in symptomatic women suspected of having recurrent or persistent cervical cancer and in asymptomatic women a minimum of 3 months after completion of primary treatment. Subjective elicitation of expert opinion was used to supplement diagnostic information needed for the economic evaluation. The effectiveness of treatment with chemotherapy, radiotherapy, chemoradiotherapy, radical hysterectomy and pelvic exenteration was systematically reviewed. Meta-analysis was carried out in RevMan 5.1 (The Cochrane Collaboration, The Nordic Cochrane Centre, Copenhagen, Denmark) and Stata version 11 (StataCorp LP, College Station, Texas, USA). A Markov model was developed to compare the relative cost-effectiveness using TreeAge Pro software version 2011 (TreeAge Software Inc., Evanston, IL, USA). For the diagnostic review, a total of 7524 citations were

  6. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramér-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  7. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  8. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  9. Estimation of spinopelvic muscles' volumes in young asymptomatic subjects: a quantitative analysis.

    Science.gov (United States)

    Amabile, Celia; Moal, Bertrand; Chtara, Oussama Arous; Pillet, Helene; Raya, Jose G; Iannessi, Antoine; Skalli, Wafa; Lafage, Virginie; Bronsard, Nicolas

    2017-04-01

    Muscles have been proved to be a major component in postural regulation during pathological evolution or aging. In particular, spinopelvic muscles are recruited for compensatory mechanisms such as pelvic retroversion or knee flexion. Change in muscle volume could, therefore, be a marker of greater postural degradation. Yet, it is difficult to interpret spinopelvic muscular degradation, as there are few reported values for young asymptomatic adults to compare to. The objective was to provide such reference values for the spinopelvic muscles. A model predicting the muscular volume from a reduced set of segmented MRI images was investigated. A total of 23 asymptomatic subjects younger than 24 years old underwent an MRI acquisition from T12 to the knee. Spinopelvic muscles were segmented to obtain an accurate 3D reconstruction, allowing precise computation of muscle volume. A model computing the volume of muscular groups from fewer than six segmented MRI slices was investigated. Baseline values have been reported in tables. For all muscles, invariance was found for the shape factor [ratio of volume over (area times length)]. Muscles' values for a reference population have been reported. A new model predicting the muscles' volumes from a reduced set of MRI slices is proposed. While this model still needs to be validated on other populations, the current study appears promising for clinical use to determine, quantitatively, the muscular degradation.

  10. Subject-specific body segment parameter estimation using 3D photogrammetry with multiple cameras

    Science.gov (United States)

    Morris, Mark; Sellers, William I.

    2015-01-01

    Inertial properties of body segments, such as mass, centre of mass or moments of inertia, are important parameters when studying movements of the human body. However, these quantities are not directly measurable. Current approaches include regression models, which have limited accuracy; geometric models, which involve lengthy measuring procedures; or acquiring and post-processing MRI scans of participants. We propose a geometric methodology based on 3D photogrammetry using multiple cameras to provide subject-specific body segment parameters while minimizing the interaction time with the participants. A low-cost body scanner was built using multiple cameras, and 3D point cloud data were generated using structure-from-motion photogrammetric reconstruction algorithms. The point cloud was manually separated into body segments, and convex hulling was applied to each segment to produce the required geometric outlines. The accuracy of the method can be adjusted by choosing the number of subdivisions of the body segments. The body segment parameters of six participants (four male and two female) are presented using the proposed method. The multi-camera photogrammetric approach is expected to be particularly suited for studies including populations for which regression models are not available in the literature and where other geometric techniques or MRI scanning are not applicable due to time or ethical constraints. PMID:25780778
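    Once each segment's point cloud is isolated, the convex hull gives a volume, an assumed uniform tissue density converts it to a mass, and the hull geometry yields an approximate centre of mass. A sketch of that step (the density value, the toy point cloud and the vertex-mean approximation of the centre of mass are illustrative assumptions, not the authors' exact procedure):

        import numpy as np
        from scipy.spatial import ConvexHull

        def segment_mass_and_com(points_xyz, density_kg_m3=1000.0):
            """Convex-hull volume of one body-segment point cloud, converted to
            mass with an assumed uniform tissue density; the centre of mass is
            approximated here by the mean of the hull vertices (a rough proxy)."""
            hull = ConvexHull(points_xyz)
            mass = hull.volume * density_kg_m3
            com = points_xyz[hull.vertices].mean(axis=0)
            return mass, com

        # Toy cloud: random points inside a 0.1 m x 0.1 m x 0.4 m box (a "forearm").
        rng = np.random.default_rng(0)
        pts = rng.uniform([0, 0, 0], [0.1, 0.1, 0.4], size=(2000, 3))
        mass, com = segment_mass_and_com(pts)
        print(round(mass, 2), np.round(com, 3))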

  11. Probably Almost Bayes Decisions

    DEFF Research Database (Denmark)

    Anoulova, S.; Fischer, Paul; Poelt, S.

    1996-01-01

    In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian...

  12. A comparison of Coomassie blue dye with radioiodinated albumin as an indicator for plasma volume estimation in human subjects

    Science.gov (United States)

    Menzies, Ian S.

    1966-01-01

    Plasma volume has been estimated in 10 human subjects using Coomassie blue and 131I radioiodinated human serum albumin dilution methods simultaneously. Three different methods of correction used by previous workers to overcome the error due to early dye loss were applied. Satisfactory agreement with the established radioiodinated albumin method was only obtained by extrapolation of the semilogarithmic plot of Coomassie blue plasma dye concentration between five and 10 minutes to the time of injection. The significance of the controversial Evans blue `mixing curve' is discussed. An analogous phase in the Coomassie blue disappearance slope is considered to be due to initial rapid loss of dye from the circulation rather than to the process of mixing. It is shown that Coomassie blue fulfils the criteria listed in the discussion for plasma volume estimation. PMID:4160095
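    In indicator-dilution estimation, plasma volume is the injected dye mass divided by the plasma concentration at the moment of injection; because dye leaves the circulation, that concentration is recovered by extrapolating the semilogarithmic decay of the 5- to 10-minute samples back to t = 0, which is the correction favoured above. A numerical sketch (all values are made up):

        import numpy as np

        def plasma_volume_ml(times_min, conc_mg_per_ml, dose_mg):
            """Fit ln(concentration) vs time over the sampled window and
            extrapolate to t = 0; plasma volume = dose / extrapolated C0."""
            slope, intercept = np.polyfit(times_min, np.log(conc_mg_per_ml), 1)
            c0 = np.exp(intercept)
            return dose_mg / c0

        # Hypothetical 5- and 10-minute Coomassie blue concentrations after a 25 mg dose.
        print(round(plasma_volume_ml([5.0, 10.0], [0.0095, 0.0085], dose_mg=25.0), 0))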

  13. Parameter Uncertainty in Exponential Family Tail Estimation

    OpenAIRE

    Landsman, Z.; Tsanakas, A.

    2012-01-01

    Actuaries are often faced with the task of estimating tails of loss distributions from just a few observations. Thus estimates of tail probabilities (reinsurance prices) and percentiles (solvency capital requirements) are typically subject to substantial parameter uncertainty. We study the bias and MSE of estimators of tail probabilities and percentiles, with focus on 1-parameter exponential families. Using asymptotic arguments it is shown that tail estimates are subject to significant positi...

  14. Estimation of probability for the presence of claw and digital skin diseases by combining cow- and herd-level information using a Bayesian network

    DEFF Research Database (Denmark)

    Ettema, Jehan Frans; Østergaard, Søren; Kristensen, Anders Ringgaard

    2009-01-01

    Cross sectional data on the prevalence of claw and (inter) digital skin diseases on 4854 Holstein Friesian cows in 50 Danish dairy herds was used in a Bayesian network to create herd specific probability distributions for the presence of lameness causing diseases. Parity and lactation stage...... probabilities and random herd effects are used to formulate cow-level probability distributions of disease presence in a specific Danish dairy herd. By step-wise inclusion of information on cow- and herd-level risk factors, lameness prevalence and clinical diagnosis of diseases on cows in the herd, the Bayesian...

  15. Eliciting Subjective Probability Distributions with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    2015-01-01

    We test in a laboratory experiment the theoretical prediction that risk attitudes have a surprisingly small role in distorting reports from true belief distributions. We find evidence consistent with theory in our experiment....

  16. Variance decomposition for single-subject task-based fMRI activity estimates across many sessions.

    Science.gov (United States)

    Gonzalez-Castillo, Javier; Chen, Gang; Nichols, Thomas E; Bandettini, Peter A

    2017-07-01

    Here we report an exploratory within-subject variance decomposition analysis conducted on a task-based fMRI dataset with an unusually large number of repeated measures (i.e., 500 trials in each of three different subjects) distributed across 100 functional scans and 9 to 10 different sessions. Within-subject variance was segregated into four primary components: variance across-sessions, variance across-runs within a session, variance across-blocks within a run, and residual measurement/modeling error. Our results reveal inhomogeneous and distinct spatial distributions of these variance components across significantly active voxels in grey matter. Measurement error is dominant across the whole brain. Detailed evaluation of the remaining three components shows that across-session variance is the second largest contributor to total variance in occipital cortex, while across-runs variance is the second dominant source for the rest of the brain. Network-specific analysis revealed that across-block variance contributes more to total variance in higher-order cognitive networks than in somatosensory cortex. Moreover, in some higher-order cognitive networks across-block variance can exceed across-session variance. These results help us better understand the temporal (i.e., across blocks, runs and sessions) and spatial distributions (i.e., across different networks) of within-subject natural variability in estimates of task responses in fMRI. They also suggest that different brain regions will show different natural levels of test-retest reliability even in the absence of residual artifacts and sufficiently high contrast-to-noise measurements. Further confirmation with a larger sample of subjects and other tasks is necessary to ensure generality of these results. Published by Elsevier Inc.

  17. Raster dataset showing the probability of elevated concentrations of nitrate in ground water in Colorado, hydrogeomorphic regions included and fertilizer use estimates not included.

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This dataset is one of eight datasets produced by this study. Four of the datasets predict the probability of detecting atrazine and(or) desethyl-atrazine (a...

  18. Raster dataset showing the probability of detecting atrazine/desethyl-atrazine in ground water in Colorado, hydrogeomorphic regions included and atrazine use estimates not included.

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This dataset is one of eight datasets produced by this study. Four of the datasets predict the probability of detecting atrazine and(or) desethyl-atrazine (a...

  19. Raster dataset showing the probability of detecting atrazine/desethyl-atrazine in ground water in Colorado, hydrogeomorphic regions and atrazine use estimates included.

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This dataset is one of eight datasets produced by this study. Four of the datasets predict the probability of detecting atrazine and(or) desethyl-atrazine (a...

  20. Raster dataset showing the probability of elevated concentrations of nitrate in ground water in Colorado, hydrogeomorphic regions not included and fertilizer use estimates included.

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This dataset is one of eight datasets produced by this study. Four of the datasets predict the probability of detecting atrazine and(or) desethyl-atrazine (a...

  1. Raster dataset showing the probability of elevated concentrations of nitrate in ground water in Colorado, hydrogeomorphic regions and fertilizer use estimates not included.

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This dataset is one of eight datasets produced by this study. Four of the datasets predict the probability of detecting atrazine and(or) desethyl-atrazine (a...

  2. Raster dataset showing the probability of detecting atrazine/desethyl-atrazine in ground water in Colorado, hydrogeomorphic regions and atrazine use estimates not included.

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This dataset is one of eight datasets produced by this study. Four of the datasets predict the probability of detecting atrazine and(or) desethyl-atrazine (a...

  3. Raster dataset showing the probability of detecting atrazine/desethyl-atrazine in ground water in Colorado, hydrogeomorphic regions not included and atrazine use estimates included.

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This dataset is one of eight datasets produced by this study. Four of the datasets predict the probability of detecting atrazine and(or) desethyl-atrazine (a...

  4. Raster dataset showing the probability of elevated concentrations of nitrate in ground water in Colorado, hydrogeomorphic regions and fertilizer use estimates included.

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This dataset is one of eight datasets produced by this study. Four of the datasets predict the probability of detecting atrazine and(or) desethyl-atrazine (a...

  5. Ambulatory Arterial Stiffness Index (AASI) is Unable to Estimate Arterial Stiffness of Hypertensive Subjects: Role of Nocturnal Dipping of Blood Pressure.

    Science.gov (United States)

    Di Raimondo, Domenico; Casuccio, Alessandra; Di Liberti, Rosangela; Musiari, Gaia; Zappulla, Valentina; D'Angelo, Alessandra; Pinto, Antonio

    2017-01-01

    the dependent variable confirmed the significant association between AASI and nocturnal dip (p = 0.015). The multinomial logistic regression analysis, in which AASI values were adjusted for the main confounders (age, sex, body mass index, 24-h SBP, 24-h DBP), showed that the association between AASI and dipping is maintained only for dipper and extreme-dipper hypertensives, losing significance for mild and reverse dippers. 1) AASI levels are associated with the night-to-day BP ratio; 2) lower levels of AASI are significantly associated with extreme-dipper and dipper nocturnal BP profiles when compared to healthy controls; 3) after correction for the major confounding factors, the association between AASI and the high-damage class of hypertensive subjects with lower or no nocturnal fall of BP is lost. Our findings support the hypothesis that AASI is unable to estimate the arterial stiffness (AS) of older hypertensive subjects with a high burden of organ and vascular damage and several comorbidities, probably because the nocturnal reduction of BP is the main determinant of AASI, being more powerful than AS itself. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  6. Comparison of the unstructured clinician gestalt, the wells score, and the revised Geneva score to estimate pretest probability for suspected pulmonary embolism.

    Science.gov (United States)

    Penaloza, Andrea; Verschuren, Franck; Meyer, Guy; Quentin-Georget, Sybille; Soulie, Caroline; Thys, Frédéric; Roy, Pierre-Marie

    2013-08-01

    The assessment of clinical probability (as low, moderate, or high) with clinical decision rules has become a cornerstone of diagnostic strategy for patients with suspected pulmonary embolism, but little is known about the use of physician gestalt assessment of clinical probability. We evaluate the performance of gestalt assessment for diagnosing pulmonary embolism. We conducted a retrospective analysis of a prospective observational cohort of consecutive suspected pulmonary embolism patients in emergency departments. Accuracy of gestalt assessment was compared with the Wells score and the revised Geneva score by the area under the curve (AUC) of receiver operating characteristic curves. Agreement between the 3 methods was determined by κ test. The study population was 1,038 patients, with a pulmonary embolism prevalence of 31.3%. AUC differed significantly between the 3 methods and was 0.81 (95% confidence interval [CI] 0.78 to 0.84) for gestalt assessment, 0.71 (95% CI 0.68 to 0.75) for Wells, and 0.66 (95% CI 0.63 to 0.70) for the revised Geneva score. The proportion of patients categorized as having low clinical probability was statistically higher with gestalt than with revised Geneva score (43% versus 26%; 95% CI for the difference of 17%=13% to 21%). Proportion of patients categorized as having high clinical probability was higher with gestalt than with Wells (24% versus 7%; 95% CI for the difference of 17%=14% to 20%) or revised Geneva score (24% versus 10%; 95% CI for the difference of 15%=13% to 21%). Pulmonary embolism prevalence was significantly lower with gestalt versus clinical decision rules in low clinical probability (7.6% for gestalt versus 13.0% for revised Geneva score and 12.6% for Wells score) and non-high clinical probability groups (18.3% for gestalt versus 29.3% for Wells and 27.4% for revised Geneva score) and was significantly higher with gestalt versus Wells score in high clinical probability groups (72.1% versus 58.1%). Agreement

  7. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  8. Interval estimation for the area under the receiver operating characteristic curve when data are subject to error.

    Science.gov (United States)

    Li, Yanhong; Koval, John J; Donner, Allan; Zou, G Y

    2010-10-30

    The area (A) under the receiver operating characteristic curve is commonly used to quantify the ability of a biomarker to correctly classify individuals into two populations. However, many markers are subject to measurement error, which must be accounted for to prevent understating their effectiveness. In this paper, we develop a new confidence interval procedure for A which is adjusted for measurement error using either external or internal replicated measurements. Based on the observation that A is a function of normal means and variances, we develop the procedure by recovering variance estimates needed from confidence limits for normal means and variances. Simulation results show that the procedure performs better than the previous ones based on the delta-method in terms of coverage percentage, balance of tail errors and interval width. Two examples are presented. Copyright © 2010 John Wiley & Sons, Ltd.
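    Under a binormal model, A is a function of the two group means and variances, and replicated measurements allow the error variance to be subtracted so that A reflects the true-score distributions. The sketch below shows only that point estimate and its attenuation correction (a simplified illustration; the paper's interval construction from confidence limits for normal means and variances is not reproduced):

        import math

        def auc_binormal(mu1, mu0, var1, var0):
            """A = Phi((mu1 - mu0) / sqrt(var1 + var0)) for normal biomarker scores."""
            z = (mu1 - mu0) / math.sqrt(var1 + var0)
            return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

        # Observed variances include measurement-error variance estimated from replicates.
        var_error = 0.3
        naive = auc_binormal(2.0, 1.0, 1.2, 1.1)
        adjusted = auc_binormal(2.0, 1.0, 1.2 - var_error, 1.1 - var_error)
        print(round(naive, 3), round(adjusted, 3))   # correction raises the estimated A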

  9. Estimating probabilities of infestation and extent of damage by the roundheaded pine beetle in ponderosa pine in the Sacramento Mountains, New Mexico

    Science.gov (United States)

    Jose Negron

    1997-01-01

    Classification trees and linear regression analysis were used to build models to predict probabilities of infestation and amount of tree mortality in terms of basal area resulting from roundheaded pine beetle, Dendroctonus adjunctus Blandford, activity in ponderosa pine, Pinus ponderosa Laws., in the Sacramento Mountains, New Mexico. Classification trees were built for...

  10. Two-step probability plot for parameter estimation of lifetime distribution affected by defect clustering in time-dependent dielectric breakdown

    Science.gov (United States)

    Yokogawa, Shinji

    2017-07-01

    In this study, a simple method of statistical parameter estimation is proposed for a lifetime distribution that has three parameters due to defect clustering in the middle-of-line and back-end-of-line. A two-step procedure provides effective estimates of the distribution parameters for time-dependent dielectric breakdown. In the first step, the clustering parameter of the distribution, which is one of the shape parameters, is estimated by a linearization treatment of the data plotted on the proposed chart. Then, in the second step, the shape and scale parameters are estimated by calculating a slope and an intercept, respectively. The statistical accuracy of the estimates is evaluated using the Monte Carlo simulation technique and the mean squared error of the estimates.

  11. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  12. SUBJECTIVE AND OBJECTIVE ESTIMATION OF THE LEVEL OF PHYSICAL EDUCATION IN SERVICE OF CONSERVING HEALTH STATUS IMPROVEMENT

    Directory of Open Access Journals (Sweden)

    Dragan Krivokapić

    2014-06-01

    One of the most important publications in which this connection is emphasized is the report of the American Ministry of Health, called Physical Activity and Health (1996), which lists a number of beneficial effects on the health status of people who participated in some form of physical activity. The exact minimal volume and intensity of physical activity needed to cause positive effects on health status are still unknown, so the estimation of the elements of health-related physical fitness has become important for many institutions concerned with people's health. Discussion: For each of the above-mentioned elements of health-related physical fitness, different subjective and objective procedures have been established that can be used for their estimation (Caspersen CJ, Powell KE, Christenson GM, 1985). Besides, it is very important to take into account a clear aim for which a certain estimation is done, because it enables implementation of the most appropriate protocol for the estimation of each element of physical fitness. In that sense, subjective and objective estimation of the level of physical activity of an individual is essential for the preservation and improvement of their health status. References: American College of Sports Medicine. Guidelines for Exercise Testing and Prescription, 8th ed. Philadelphia: Lippincott Williams & Wilkins, 2009, 248-52. Caspersen CJ, Powell KE, Christenson GM (1985). Physical activity, exercise, and physical fitness: definitions and distinctions for health-related research. Public Health Rep., 100(2), 126-31. U.S. Department of Health and Human Services and Centers for Disease Control and Prevention. Physical Activity and Health: A Report of the Surgeon General. Atlanta (GA): National Center for Chronic Disease Prevention and Health Promotion, 1996, 89-90.

  13. Calibration between the Estimated Probability of the Risk Assessment Chart of Japan Atherosclerosis Society and Actual Mortality Using External Population: Evidence for Cardiovascular Prevention from Observational Cohorts in Japan (EPOCH-JAPAN).

    Science.gov (United States)

    Nakai, Michikazu; Miyamoto, Yoshihiro; Higashiyama, Aya; Murakami, Yoshitaka; Nishimura, Kunihiro; Yatsuya, Hiroshi; Saitoh, Shigeyuki; Sakata, Kiyomi; Iso, Hiroyasu; Miura, Katsuyuki; Ueshima, Hirotsugu; Okamura, Tomonori

    2016-01-01

    In the Japan Atherosclerosis Society guidelines for the prevention of atherosclerotic cardiovascular diseases 2012 (JAS2012), the NIPPON DATA80 risk assessment chart (ND80RAC) was adopted to estimate the 10-year probability of coronary artery disease (CAD) mortality. However, there has been no comparison between the estimated mortality calculated by ND80RAC and actual mortality in external populations. Accordingly, we used the large pooled database of cohorts in Japan, EPOCH-JAPAN, as an external population. The participants of EPOCH-JAPAN without a history of cardiovascular disease (15,091 men and 18,589 women aged 40-74 years) were analyzed by sex. The 10-year probability of CAD/stroke mortality was estimated by ND80RAC. The participants were divided both into deciles of their estimated mortality and into three categories according to JAS2012. The calibration between the mean estimated mortality and the actual mortality was assessed with the Hosmer-Lemeshow (H-L) test. In both sexes, the estimated CAD mortality was higher than the actual mortality, particularly in the higher deciles of estimated mortality, whereas the estimated stroke mortality was almost concordant with the actual mortality in the low/moderate deciles of estimated mortality. As for the categories according to JAS2012, the estimated CAD mortality was higher than the actual mortality in both sexes; actual mortality in Category III was lower than that in Category II in women, but it increased in ascending order of category when the presence of diabetes was excluded from Category III. The estimated CAD mortality by ND80RAC tended to be higher than the actual mortality in this population, in which the baseline survey was more recently performed.
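    The calibration step groups subjects by deciles of estimated risk and compares observed with expected deaths through the Hosmer-Lemeshow chi-square statistic. A compact sketch of that computation on hypothetical decile counts (not the EPOCH-JAPAN data):

        import numpy as np
        from scipy.stats import chi2

        def hosmer_lemeshow(observed, expected, n_per_group):
            """H-L statistic: sum over groups of (O - E)^2 / (E * (1 - E/n)),
            referred to a chi-square distribution with (groups - 2) df."""
            observed = np.asarray(observed, float)
            expected = np.asarray(expected, float)
            n = np.asarray(n_per_group, float)
            stat = np.sum((observed - expected) ** 2 / (expected * (1.0 - expected / n)))
            p_value = chi2.sf(stat, df=len(observed) - 2)
            return stat, p_value

        # Hypothetical deciles: observed vs. chart-style expected CAD deaths, 1500 subjects each.
        obs = [1, 2, 2, 3, 4, 5, 6, 8, 10, 13]
        exp = [1.5, 2.5, 3.0, 4.0, 5.5, 7.0, 8.5, 11.0, 14.0, 19.0]
        print(hosmer_lemeshow(obs, exp, n_per_group=[1500] * 10))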

  14. Probability and statistics: selected problems

    OpenAIRE

    Machado, J.A. Tenreiro; Pinto, Carla M. A.

    2014-01-01

    Probability and Statistics—Selected Problems is a unique book for senior undergraduate and graduate students to quickly review basic material in probability and statistics. Descriptive statistics are presented first, and probability is reviewed second. Discrete and continuous distributions are presented. Sampling and estimation with hypothesis testing are presented in the last two chapters. Solutions to the proposed exercises are listed for the reader's reference.

  15. Liquefaction Probability Curves for Surficial Geologic Units

    Science.gov (United States)

    Holzer, T. L.; Noce, T. E.; Bennett, M. J.

    2009-12-01

    Liquefaction probability curves that predict the probability of surface manifestations of earthquake-induced liquefaction are developed for 14 different surficial geologic deposits. The geologic units include alluvial fan, beach ridge, river delta, eolian dune, point bar, floodbasin, natural river levee, abandoned river channel, deep-water lake, lagoonal, sandy artificial fill, and valley train deposits. Probability is conditioned on earthquake magnitude and peak ground acceleration. Curves are developed for water table depths of 1.5 and 5.0 m. Probabilities were derived from complementary cumulative frequency distributions of the liquefaction potential index (LPI) that were computed from 935 cone penetration tests. Most of the curves can be fit with a three-parameter logistic function, which facilitates computation of probability. For natural deposits with a water table at 1.5 m depth subjected to an M7.5 earthquake with a PGA of 0.25 g, probabilities reach 0.5 for fluvial point bar, barrier island beach ridge, and deltaic deposits. Retrospective predictions of liquefaction during historical earthquakes based on the curves compare favorably to post-earthquake observations. We have also used the curves to assign ranges of liquefaction probabilities to the susceptibility categories proposed by Youd and Perkins (1978) for different geologic deposits. For the earthquake loading and conditions described above, probabilities range from 0-0.08 for low, 0.09-0.30 for moderate, 0.31-0.62 for high, and 0.63-1.00 for very high susceptibility. Liquefaction probability curves have two primary practical applications. First, the curves can be combined with seismic source characterizations to transform surficial geologic maps into probabilistic liquefaction hazard maps. Geographic-specific curves are clearly desirable, but in the absence of such information, generic liquefaction probability curves provide a first approximation of liquefaction hazard. Such maps are useful both
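    Because the curves are summarized by a three-parameter logistic function, evaluating the probability of surface manifestation for a given loading reduces to evaluating that function. A sketch with placeholder parameters (the fitted values for each surficial unit are given in the paper, not here, and the exact parameterization used there may differ):

        import math

        def liquefaction_probability(pga_g, p_max, a, b):
            """Three-parameter logistic curve: upper asymptote p_max, midpoint a,
            and slope b, evaluated at peak ground acceleration (placeholder form)."""
            return p_max / (1.0 + math.exp(-(pga_g - a) / b))

        # Placeholder parameters for one surficial unit at a 1.5 m water table.
        for pga in (0.1, 0.25, 0.4):
            print(pga, round(liquefaction_probability(pga, p_max=0.55, a=0.22, b=0.07), 3))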

  16. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  17. Lévy laws in free probability

    OpenAIRE

    Barndorff-Nielsen, Ole E.; Thorbjørnsen, Steen

    2002-01-01

    This article and its sequel outline recent developments in the theory of infinite divisibility and Lévy processes in free probability, a subject area belonging to noncommutative (or quantum) probability. The present paper discusses the classes of infinitely divisible probability measures in classical and free probability, respectively, via a study of the Bercovici–Pata bijection between these classes.

  18. Population pharmacokinetics and maximum a posteriori probability Bayesian estimator of abacavir: application of individualized therapy in HIV-infected infants and toddlers.

    NARCIS (Netherlands)

    Zhao, W.; Cella, M.; Pasqua, O. Della; Burger, D.M.; Jacqz-Aigrain, E.

    2012-01-01

    WHAT IS ALREADY KNOWN ABOUT THIS SUBJECT: Abacavir is used to treat HIV infection in both adults and children. The recommended paediatric dose is 8 mg kg(-1) twice daily up to a maximum of 300 mg twice daily. Weight was identified as the central covariate influencing pharmacokinetics of abacavir in

  19. Feasibility study for the non-invasive blood pressure estimation based on ppg morphology: normotensive subject study.

    Science.gov (United States)

    Shin, Hangsik; Min, Se Dong

    2017-01-10

    Blood pressure is a critical bio-signal, and its importance has increased with the aging society and the growth of the cardiovascular disease population. However, most hypertensive patients find it inconvenient to monitor blood pressure in daily life because measurement depends on cuff-based techniques. There are now many attempts to measure blood pressure without a cuff; in particular, photoplethysmography (PPG) based research is being carried out in various ways. Our research is designed to hypothesize the relationship between vessel wall movement and the pressure-flow relationship of PPG and to validate its appropriateness experimentally. The PPG waveform is simplified by an approximate model and then analyzed in terms of the velocity and acceleration of blood flow using the derivatives of PPG. Finally, we develop a pressure index (PI) as an estimation factor for blood pressure by combining statistically significant segments of the photoplethysmographic waveform. Twenty-five subjects participated in the experiment. In the simulation results, correlation coefficients between the developed PI and blood pressure were R = 0.818, R = 0.827 and R = 0.615 for systolic blood pressure, pulse pressure and mean arterial pressure, respectively, and the results were statistically significant. The developed index could follow blood pressure but could not find the absolute pressure value. Moreover, the proposed index has limitations in tracing diastolic pressure. However, the results show that the proposed PI is significantly correlated with blood pressure, suggesting the proposed PI as a promising additional parameter for cuffless blood pressure monitoring.

  20. Structural Minimax Probability Machine.

    Science.gov (United States)

    Gu, Bin; Sun, Xingming; Sheng, Victor S

    2017-07-01

    Minimax probability machine (MPM) is an interesting discriminative classifier based on generative prior knowledge. It can directly estimate the probabilistic accuracy bound by minimizing the maximum probability of misclassification. The structural information of data is an effective way to represent prior knowledge, and has been found to be vital for designing classifiers in real-world problems. However, MPM only considers the prior probability distribution of each class with a given mean and covariance matrix, which does not efficiently exploit the structural information of data. In this paper, we use two finite mixture models to capture the structural information of the data from binary classification. For each subdistribution in a finite mixture model, only its mean and covariance matrix are assumed to be known. Based on the finite mixture models, we propose a structural MPM (SMPM). SMPM can be solved effectively by a sequence of the second-order cone programming problems. Moreover, we extend a linear model of SMPM to a nonlinear model by exploiting kernelization techniques. We also show that the SMPM can be interpreted as a large margin classifier and can be transformed to support vector machine and maxi-min margin machine under certain special conditions. Experimental results on both synthetic and real-world data sets demonstrate the effectiveness of SMPM.

  1. Subjective sleep quality in relation to objective sleep estimates: comparison, gender differences and changes between the acute phase and the six-month follow-up after stroke.

    Science.gov (United States)

    Bakken, Linda N; Kim, Hesook Suzie; Finset, Arnstein; Lerdal, Anners

    2014-03-01

    To describe sleep experiences after stroke using subjective and objective indicators and identify possible gender differences in sleep in the acute phase and at 6-month follow-up. Sleep disturbances after stroke are recognized, but poorly described. Gender differences in sleep exist in other populations, but have not been reported after stroke. A longitudinal cohort study. Subjective sleep quality was measured with the Pittsburgh Sleep Quality Index and objective sleep was estimated with actigraphy in 100 patients in the acute phase and six months after stroke, from April 2007 to March 2009. Subjective sleep quality was better and objective wake percentage was lower at follow-up than in the acute phase after stroke. Actigraphy estimated low sleep efficiency and many awakenings at both time points. Subjective and objective measures were correlated at the 6-month follow-up, but not in the acute phase. Women's subjective sleep efficiency and total score on the Pittsburgh Sleep Quality Index were worse than men's in the acute phase, but actigraphy estimated that women slept more than men in the course of a day. Women's subjective sleep quality was better at follow-up than in the acute phase. Men reported worse subjective sleep quality, but better subjective sleep efficiency at follow-up than in the acute phase, and also had lower objective wake percentage at follow-up. Subjective sleep quality was poor and actigraphy indicated disturbed sleep-wake patterns in the acute phase and at 6-month follow-up. Gender differences existed in subjective and objective sleep in the acute phase, but not at follow-up. © 2013 John Wiley & Sons Ltd.

  2. Estimating the probability of polyreactive antibodies 4E10 and 2F5 disabling a gp41 trimer after T cell-HIV adhesion.

    Directory of Open Access Journals (Sweden)

    Bin Hu

    2014-01-01

    Full Text Available A few broadly neutralizing antibodies, isolated from HIV-1 infected individuals, recognize epitopes in the membrane proximal external region (MPER of gp41 that are transiently exposed during viral entry. The best characterized, 4E10 and 2F5, are polyreactive, binding to the viral membrane and their epitopes in the MPER. We present a model to calculate, for any antibody concentration, the probability that during the pre-hairpin intermediate, the transient period when the epitopes are first exposed, a bound antibody will disable a trivalent gp41 before fusion is complete. When 4E10 or 2F5 bind to the MPER, a conformational change is induced that results in a stably bound complex. The model predicts that for these antibodies to be effective at neutralization, the time to disable an epitope must be shorter than the time the antibody remains bound in this conformation, about five minutes or less for 4E10 and 2F5. We investigate the role of avidity in neutralization and show that 2F5 IgG, but not 4E10, is much more effective at neutralization than its Fab fragment. We attribute this to 2F5 interacting more stably than 4E10 with the viral membrane. We use the model to elucidate the parameters that determine the ability of these antibodies to disable epitopes and propose an extension of the model to analyze neutralization data. The extended model predicts the dependencies of IC50 for neutralization on the rate constants that characterize antibody binding, the rate of fusion of gp41, and the number of gp41 bridging the virus and target cell at the start of the pre-hairpin intermediate. Analysis of neutralization experiments indicate that only a small number of gp41 bridges must be disabled to prevent fusion. However, the model cannot determine the exact number from neutralization experiments alone.

  3. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    A statistical procedure for estimating Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. The influence of the safety factor is also investigated. The DECOFF statistical method is benchmarked against the standard Alpha-factor method defined by DNV (2011) and model performance is evaluated. The effects that weather forecast uncertainty has on the output Probabilities of Failure are also analysed and reported....

  4. Using hierarchical Bayesian multi-species mixture models to estimate tandem hoop-net based habitat associations and detection probabilities of fishes in reservoirs

    Science.gov (United States)

    Stewart, David R.; Long, James M.

    2015-01-01

    Species distribution models are useful tools to evaluate habitat relationships of fishes. We used hierarchical Bayesian multispecies mixture models to evaluate the relationships of both detection and abundance with habitat of reservoir fishes caught using tandem hoop nets. A total of 7,212 fish from 12 species were captured, and the majority of the catch was composed of Channel Catfish Ictalurus punctatus (46%), Bluegill Lepomis macrochirus (25%), and White Crappie Pomoxis annularis (14%). Detection estimates ranged from 8% to 69%, and modeling results suggested that fishes were primarily influenced by reservoir size and context, water clarity and temperature, and land-use types. Species were differentially abundant within and among habitat types, and some fishes were found to be more abundant in turbid, less impacted (e.g., by urbanization and agriculture) reservoirs with longer shoreline lengths, whereas other species were found more often in clear, nutrient-rich impoundments that had generally shorter shoreline length and were surrounded by a higher percentage of agricultural land. Our results demonstrated that habitat and reservoir characteristics may differentially benefit species and assemblage structure. This study provides a useful framework for evaluating capture efficiency for not only hoop nets but other gear types used to sample fishes in reservoirs.

  5. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  6. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.

  7. Estimation of macular pigment optical density in the elderly: test-retest variability and effect of optical blur in pseudophakic subjects

    NARCIS (Netherlands)

    Gallaher, Kevin T.; Mura, Marco; Todd, Wm Andrew; Harris, Tarsha L.; Kenyon, Emily; Harris, Tamara; Johnson, Karen C.; Satterfield, Suzanne; Kritchevsky, Stephen B.; Iannaccone, Alessandro

    2007-01-01

    The reproducibility of macular pigment optical density (MPOD) estimates in the elderly was assessed in 40 subjects (age: 79.1+/-3.5). Test-retest variability was good (Pearson's r coefficient: 0.734), with an average coefficient of variation (CV) of 18.4% and an intraclass correlation coefficient

  8. So You Think You Look Young? Matching Older Adults' Subjective Ages with Age Estimations Provided by Younger, Middle-Aged, and Older Adults

    Science.gov (United States)

    Kotter-Gruhn, Dana; Hess, Thomas M.

    2012-01-01

    Perceived age plays an important role in the context of age identity and social interactions. To examine how accurate individuals are in estimating how old they look and how old others are, younger, middle-aged, and older adults rated photographs of older target persons (for whom we had information about objective and subjective age) in terms of…

  9. On Gnostical Estimates

    Czech Academy of Sciences Publication Activity Database

    Fabián, Zdeněk

    2017-01-01

    Roč. 56, č. 2 (2017), s. 125-132 ISSN 0973-1377 Institutional support: RVO:67985807 Keywords : gnostic theory * statistics * robust estimates Subject RIV: BB - Applied Statistics, Operational Research OBOR OECD: Statistics and probability http://www.ceser.in/ceserp/index.php/ijamas/article/view/4707

  10. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  11. Agreeing Probability Measures for Comparative Probability Structures

    NARCIS (Netherlands)

    P.P. Wakker (Peter)

    1981-01-01

    It is proved that fine and tight comparative probability structures (where the set of events is assumed to be an algebra, not necessarily a σ-algebra) have agreeing probability measures. Although this was often claimed in the literature, all proofs the author encountered are not valid

  12. Comparison of probability distribution models and estimation of the probable rainfall for the Barbacena County, MG (original title in Portuguese: Comparação de distribuições de probabilidade e estimativa da precipitação provável para região de Barbacena, MG)

    Directory of Open Access Journals (Sweden)

    Bruno Teixeira Ribeiro

    2007-10-01

    Full Text Available Probabilistic studies involving climatic variables are extremely important for agriculture, civil construction, tourism, transportation, and other activities. Seeking to contribute to the planning of irrigated agriculture, this work had as objectives to compare probability distributions fitted to the ten-day and monthly historical series and to estimate the probable rainfall for the Barbacena County, Minas Gerais State, Brazil. The months of December, January and February were studied over the period 1942 to 2003, constituting historical series with 62 years of observations. Daily rainfall depths were totalled over monthly and ten-day periods, and the two-parameter log-Normal, three-parameter log-Normal and Gamma distributions were applied. To evaluate the adequacy of the distributions in the studied periods, the Chi-square test (chi2) was used at the 5% significance level. The probable rainfall was estimated for each studied period using the distribution with the lowest chi2 value, at exceedance probability levels of 75, 90 and 98%. The Gamma distribution provided the best fit to the data. The study of probable rainfall is a good tool to support decision making on irrigation planning and use.
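    Probable rainfall at a 75% exceedance level is the 25th percentile of the fitted distribution, so once the Gamma parameters are estimated the value follows from the inverse CDF. A sketch of that workflow on synthetic data (the actual Barbacena series and the chi-square goodness-of-fit comparison are not reproduced here):

        import numpy as np
        from scipy import stats

        # Synthetic stand-in for a 62-year series of 10-day rainfall totals (mm).
        rng = np.random.default_rng(1)
        rain_mm = rng.gamma(shape=2.0, scale=30.0, size=62)

        # Fit a two-parameter Gamma distribution (location fixed at zero).
        shape, loc, scale = stats.gamma.fit(rain_mm, floc=0)

        # Probable rainfall at 75% exceedance = 25th percentile of the fitted law.
        probable_75 = stats.gamma.ppf(0.25, shape, loc=loc, scale=scale)
        print(round(probable_75, 1))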

  13. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  14. The correct estimate of the probability of false detection of the matched filter in weak-signal detection problems . II. Further results with application to a set of ALMA and ATCA data

    Science.gov (United States)

    Vio, R.; Vergès, C.; Andreani, P.

    2017-08-01

    The matched filter (MF) is one of the most popular and reliable techniques for detecting signals of known structure whose amplitude is smaller than the level of the contaminating noise. Under the assumption of stationary Gaussian noise, MF maximizes the probability of detection subject to a constant probability of false detection or false alarm (PFA). This property relies upon a priori knowledge of the position of the searched signals, which is usually not available. Recently, it has been shown that when applied in its standard form, MF may severely underestimate the PFA. As a consequence, the statistical significance of features that belong to noise is overestimated and the resulting detections are actually spurious. For this reason, an alternative method of computing the PFA has been proposed that is based on the probability density function (PDF) of the peaks of an isotropic Gaussian random field. In this paper we further develop this method. In particular, we discuss the statistical meaning of the PFA and show that, although useful as a preliminary step in a detection procedure, it is not able to quantify the actual reliability of a specific detection. For this reason, a new quantity is introduced, called the specific probability of false alarm (SPFA), which is able to carry out this computation. We show how this method works in targeted simulations and apply it to a few interferometric maps taken with the Atacama Large Millimeter/submillimeter Array (ALMA) and the Australia Telescope Compact Array (ATCA). We select a few potential new point sources and assign an accurate detection reliability to these sources.
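    For stationary white Gaussian noise and a signal at an a priori known position, the matched-filter statistic is Gaussian under the null hypothesis, so the standard PFA is a simple tail probability; the paper's point is that this number is misleading when candidate detections are instead selected at the peaks of the filtered map. The sketch below shows only the standard, known-position computation (the peak-PDF correction and the SPFA are not implemented here):

        import numpy as np
        from scipy.stats import norm

        def matched_filter_pfa(data, template, noise_sigma, threshold_snr):
            """Matched-filter statistic for white Gaussian noise and the PFA of
            exceeding a given SNR threshold at a known, fixed position."""
            stat = np.dot(data, template)
            stat_sigma = noise_sigma * np.linalg.norm(template)
            snr = stat / stat_sigma
            pfa = norm.sf(threshold_snr)      # valid only for an a priori position
            return snr, pfa

        rng = np.random.default_rng(2)
        template = np.exp(-0.5 * ((np.arange(64) - 32) / 3.0) ** 2)   # toy Gaussian profile
        noise = rng.normal(0.0, 1.0, 64)
        print(matched_filter_pfa(noise + 0.5 * template, template, 1.0, threshold_snr=4.0))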

  15. Impact of UKPDS risk estimation added to a first subjective risk estimation on management of coronary disease risk in type 2 diabetes - An observational study

    NARCIS (Netherlands)

    Wind, Anne E.; Gorter, Kees J.; Van Den Donk, Maureen; Rutten, Guy E H M

    2016-01-01

    Aims To investigate the impact of the UKPDS risk engine on management of CHD risk in T2DM patients. Methods Observational study among 139 GPs. Data from 933 consecutive patients treated with a maximum of two oral glucose-lowering drugs, collected at baseline and after twelve months. GPs estimated

  16. Experience matters: information acquisition optimizes probability gain.

    Science.gov (United States)

    Nelson, Jonathan D; McKenzie, Craig R M; Cottrell, Garrison W; Sejnowski, Terrence J

    2010-07-01

    Deciding which piece of information to acquire or attend to is fundamental to perception, categorization, medical diagnosis, and scientific inference. Four statistical theories of the value of information - information gain, Kullback-Leibler distance, probability gain (error minimization), and impact - are equally consistent with extant data on human information acquisition. Three experiments, designed via computer optimization to be maximally informative, tested which of these theories best describes human information search. Experiment 1, which used natural sampling and experience-based learning to convey environmental probabilities, found that probability gain explained subjects' information search better than the other statistical theories or the probability-of-certainty heuristic. Experiments 1 and 2 found that subjects behaved differently when the standard method of verbally presented summary statistics (rather than experience-based learning) was used to convey environmental probabilities. Experiment 3 found that subjects' preference for probability gain is robust, suggesting that the other models contribute little to subjects' search behavior.
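
    As an illustration of how the competing metrics can disagree, a small sketch (with made-up environmental probabilities, not the experimental values) comparing probability gain and information gain for a binary query about a binary category:

```python
import numpy as np

def probability_gain(prior, p_f, posteriors):
    """Expected increase in the probability of a correct guess after observing feature F.

    prior: P(A); p_f[k]: P(F = k); posteriors[k]: P(A | F = k).
    """
    before = max(prior, 1 - prior)
    after = sum(p * max(post, 1 - post) for p, post in zip(p_f, posteriors))
    return after - before

def information_gain(prior, p_f, posteriors):
    """Expected reduction in Shannon entropy about the category after observing F."""
    def H(p):
        return 0.0 if p in (0.0, 1.0) else -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
    return H(prior) - sum(p * H(post) for p, post in zip(p_f, posteriors))

# Illustrative environment only: P(A) = 0.7, and a query F with two equally likely outcomes.
prior = 0.7
p_f = [0.5, 0.5]            # P(F = f1), P(F = f2)
posteriors = [0.9, 0.5]     # P(A | F = f1), P(A | F = f2); consistent with the prior

# Here the query carries Shannon information but never changes the best guess,
# so information gain is positive while probability gain is zero.
print("probability gain:", probability_gain(prior, p_f, posteriors))
print("information gain:", information_gain(prior, p_f, posteriors))
```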

  17. Stationary algorithmic probability

    National Research Council Canada - National Science Library

    Müller, Markus

    2010-01-01

    ..., since their actual values depend on the choice of the universal reference computer. In this paper, we analyze a natural approach to eliminate this machine-dependence. Our method is to assign algorithmic probabilities to the different...

  18. The influence of the design matrix on treatment effect estimates in the quantitative analyses of single-subject experimental design research.

    Science.gov (United States)

    Moeyaert, Mariola; Ugille, Maaike; Ferron, John M; Beretvas, S Natasha; Van den Noortgate, Wim

    2014-09-01

    The quantitative methods for analyzing single-subject experimental data have expanded during the last decade, including the use of regression models to statistically analyze the data, but many questions remain. One question is how to specify predictors in a regression model to account for the specifics of the design and estimate the effect size of interest. These quantitative effect sizes are used in retrospective analyses and allow synthesis of single-subject experimental study results, which is informative for evidence-based decision making, research and theory building, and policy discussions. We discuss different design matrices that can be used for the most common single-subject experimental designs (SSEDs), namely, the multiple-baseline designs, reversal designs, and alternating treatment designs, and provide empirical illustrations. The purpose of this article is to guide single-subject experimental data analysts interested in analyzing and meta-analyzing SSED data. © The Author(s) 2014.
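
    A toy sketch (not taken from the article) of one common way to code a design matrix for a single case in a multiple-baseline-style design, with columns for baseline level, baseline trend, immediate level change, and change in trend; the session counts, phase-change point, and simulated effects are invented for illustration.

```python
import numpy as np

# Hypothetical single case: 10 baseline sessions followed by 10 treatment sessions.
n_baseline, n_treatment = 10, 10
sessions = np.arange(n_baseline + n_treatment)          # 0, 1, ..., 19
phase = (sessions >= n_baseline).astype(float)          # 0 = baseline, 1 = treatment
time_in_phase = np.where(phase == 1, sessions - n_baseline, 0.0)

# One common parameterization: intercept, baseline trend, level change, change in trend.
X = np.column_stack([
    np.ones_like(sessions, dtype=float),  # baseline level
    sessions.astype(float),               # baseline (overall) trend
    phase,                                # immediate level change at intervention
    time_in_phase,                        # change in slope after intervention
])

# Simulated outcome and ordinary least squares fit, just to show how the columns are read.
rng = np.random.default_rng(1)
y = 2.0 + 0.1 * sessions + 3.0 * phase + 0.4 * time_in_phase + rng.normal(0, 1, sessions.size)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("estimated [level, trend, level change, trend change]:", np.round(beta, 2))
```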

  19. An automated image-based method of 3D subject-specific body segment parameter estimation for kinetic analyses of rapid movements.

    Science.gov (United States)

    Sheets, Alison L; Corazza, Stefano; Andriacchi, Thomas P

    2010-01-01

    Accurate subject-specific body segment parameters (BSPs) are necessary to perform kinetic analyses of human movements with large accelerations, or no external contact forces or moments. A new automated topographical image-based method of estimating segment mass, center of mass (CM) position, and moments of inertia is presented. Body geometry and volume were measured using a laser scanner, then an automated pose and shape registration algorithm segmented the scanned body surface, and identified joint center (JC) positions. Assuming the constant segment densities of Dempster, thigh and shank masses, CM locations, and moments of inertia were estimated for four male subjects with body mass indexes (BMIs) of 19.7-38.2. The subject-specific BSP were compared with those determined using Dempster and Clauser regression equations. The influence of BSP and BMI differences on knee and hip net forces and moments during a running swing phase were quantified for the subjects with the smallest and largest BMIs. Subject-specific BSP for 15 body segments were quickly calculated using the image-based method, and total subject masses were overestimated by 1.7-2.9%. When compared with the Dempster and Clauser methods, image-based and regression estimated thigh BSP varied more than the shank parameters. Thigh masses and hip JC to thigh CM distances were consistently larger, and each transverse moment of inertia was smaller using the image-based method. Because the shank had larger linear and angular accelerations than the thigh during the running swing phase, shank BSP differences had a larger effect on calculated intersegmental forces and moments at the knee joint than thigh BSP differences did at the hip. It was the net knee kinetic differences caused by the shank BSP differences that were the largest contributors to the hip variations. Finally, BSP differences produced larger kinetic differences for the subject with larger segment masses, suggesting that parameter accuracy is more

  20. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords: Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  1. Factual and cognitive probability

    OpenAIRE

    Chuaqui, Rolando

    2012-01-01

    This modification separates the two aspects of probability: probability as a part of physical theories (factual), and as a basis for statistical inference (cognitive). Factual probability is represented by probability structures as in the earlier papers, but now built independently of the language. Cognitive probability is interpreted as a form of "partial truth". The paper also contains a discussion of the Principle of Insufficient Reason and of Bayesian and classical statistical methods, in...

  2. Lean body mass-adjusted Cockcroft and Gault formula improves the estimation of glomerular filtration rate in subjects with normal-range serum creatinine.

    Science.gov (United States)

    Lim, Wai H; Lim, Ee M; McDonald, Stephen

    2006-06-01

    Assessment of glomerular filtration rate (GFR) in individuals with normal-range serum creatinine is important in certain clinical situations, such as in potential living kidney donors. Accurate measurement of GFR invariably involves an invasive method (e.g. inulin clearance), which is inconvenient. The aim of the present study was to determine whether serum creatinine-based prediction formulae adjusted for lean body mass (LBM) could improve the accuracy of GFR estimation in these subjects. Glomerular filtration rate was determined by the clearance of technetium-99m-labelled diethylenetriamine penta-acetic acid ((99m)Tc DTPA) from plasma in 56 subjects with normal serum creatinine. For each subject, GFR was estimated using prediction formulae with and without LBM adjustment and compared with measured GFR. Formulae analysed include Cockcroft-Gault, Levey, Gates, Mawer, Hull, Toto, Jellife and Bjornsson. All formulae, with and without LBM adjustment, underestimated measured GFR, with poor precision, poor agreement and correlation (r² formulae correctly classified those with a normal measured GFR. LBM-adjusted formulae significantly improved the accuracy of GFR estimation compared with unadjusted formulae. The lean body mass-adjusted Cockcroft-Gault formula was the closest to measured GFR but is not accurate enough to replace radionuclide GFR measurement. Prediction formulae should be adjusted for LBM to improve GFR estimation.
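
    For context, a sketch of the standard Cockcroft-Gault creatinine-clearance estimate and an LBM-substituted variant of the kind evaluated in the study; the James LBM equations and the example subject values are assumptions for illustration, not taken from the paper.

```python
def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female):
    """Classic Cockcroft-Gault creatinine clearance estimate (mL/min)."""
    cl = (140 - age_years) * weight_kg / (72 * serum_creatinine_mg_dl)
    return 0.85 * cl if female else cl

def lean_body_mass_james(weight_kg, height_cm, female):
    """James approximation of lean body mass (kg); assumed here as the LBM formula."""
    ratio_sq = (weight_kg / height_cm) ** 2
    return (1.07 * weight_kg - 148 * ratio_sq) if female else (1.10 * weight_kg - 128 * ratio_sq)

# Hypothetical subject with normal-range serum creatinine.
age, weight, height, scr, female = 45, 92.0, 170.0, 0.9, False
lbm = lean_body_mass_james(weight, height, female)
print("Cockcroft-Gault (total body weight):", round(cockcroft_gault(age, weight, scr, female), 1))
print("Cockcroft-Gault (lean body mass):   ", round(cockcroft_gault(age, lbm, scr, female), 1))
```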

  3. Discharge estimation from H-ADCP measurements in a tidal river subject to sidewall effects and a mobile bed

    NARCIS (Netherlands)

    Sassi, M.G.; Hoitink, A.J.F.; Vermeulen, B.; Hidayat, H.

    2011-01-01

    Horizontal acoustic Doppler current profilers (H-ADCPs) can be employed to estimate river discharge based on water level measurements and flow velocity array data across a river transect. A new method is presented that accounts for the dip in velocity near the water surface, which is caused by

  4. Simultaneous estimation of liquid and solid gastric emptying using radiolabelled egg and water in supine normal subjects.

    Science.gov (United States)

    Kris, M G; Yeh, S D; Gralla, R J; Young, C W

    1986-01-01

    To develop an additional method for the measurement of gastric emptying in supine subjects, 10 normal subjects were given a test meal containing 99Tc-labelled scrambled egg as the "solid" phase marker and 111In in tapwater as the marker for the "liquid" phase. The mean time for emptying 50% of the "solid" phase (t1/2) was 85 min and 29 min for the "liquid" phase. Three individuals were restudied with a mean difference between the two determinations of 10.8% for the "solid" phase and 6.5% for the "liquid" phase. Twenty-six additional studies attempted have been successfully completed in symptomatic patients with advanced cancer. This method provides a simple and reproducible procedure for the determination of gastric emptying that yields results similar to those reported for other test meals and can be used in debilitated patients.

  5. Multinomial mixture model with heterogeneous classification probabilities

    Science.gov (United States)

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial and correct classification probability estimates when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.
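
    A small simulation sketch (independent of the authors' code) of the data-generating idea: multinomial counts observed through correct-classification probabilities that vary across sampling units on the logit scale. All constants are illustrative; the sketch only shows why the naive estimator is biased, not the paper's hierarchical Bayes fit.

```python
import numpy as np

rng = np.random.default_rng(42)

def inv_logit(x):
    return 1.0 / (1.0 + np.exp(-x))

# Two categories for simplicity; 50 sampling units with 100 classified items each.
n_units, n_items = 50, 100
true_prob = 0.6                                    # true probability of category 1

# Correct-classification probability varies by unit: logit-normal around 0.9.
mu, sd = np.log(0.9 / 0.1), 0.75
p_correct = inv_logit(rng.normal(mu, sd, n_units))

observed_cat1 = np.empty(n_units, dtype=int)
for i in range(n_units):
    truth = rng.random(n_items) < true_prob        # True = item really is category 1
    correct = rng.random(n_items) < p_correct[i]   # was the item classified correctly?
    observed = np.where(correct, truth, ~truth)    # misclassified items flip category
    observed_cat1[i] = observed.sum()

# The naive estimate ignoring misclassification is pulled toward 0.5;
# the elaborated Royle-Link model targets the true 0.6 instead.
print("naive estimate of P(category 1):", observed_cat1.sum() / (n_units * n_items))
```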

  6. A seismic probability map

    Directory of Open Access Journals (Sweden)

    J. M. MUNUERA

    1964-06-01

    Full Text Available The material included in two former papers (SB and EF), which sums 3307 shocks corresponding to 2360 years, up to 1960, was reduced to a 50-year period by means of the weight obtained for each epoch. The weighting factor is the ratio of 50 to the number of years in each epoch. The frequency has been referred to base VII of the international seismic intensity scale, for all cases in which the earthquakes are equal to or greater than VI and up to IX. The sum of the products of frequency and the parameters previously described is the probable frequency expected for the 50-year period. For each active small square we have made the corresponding computation, and so we have drawn Map No. 1, in percentage. The epicenters with intensity from X to XI are plotted in Map No. 2 in order to present complementary information. A table shows the return periods obtained for all data (VII to XI), and after checking them against others computed from the first up to the last shock, a list includes the probable approximate return periods estimated for the area. The solution we suggest is an appropriate form in which to express the seismic contingent phenomenon, and it improves the conventional maps showing the equal-intensity curves corresponding to the maximal values of a given site.

  7. What Are Probability Surveys?

    Science.gov (United States)

    The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.

  8. Probability Density Functions and Higher Order Statistics of Large-Scale Geostrophic Velocity Estimates and Sea Surface Height, as seen from the Jason-1 -TOPEX/Poseidon Tandem Mission

    Science.gov (United States)

    Scharffenberg, Martin G.; Biri, Stavroula; Stammer, Detlef

    2013-09-01

    Geostrophic velocity Probability Density Functions (PDF), Skewness (S) and Kurtosis (K) are shown for both velocity components (u, v) estimated from the 3-year long Jason-1 - TOPEX/Poseidon (JTP) Tandem Mission, which allowed both velocity components to be inferred directly from the altimeter observations. To be comparable to previous results for velocity (w) and SSH PDFs, we include the 18.5-year time series of SSH from the TOPEX/Poseidon, Jason-1 and Jason-2 (TPJJ) missions. The differences in the PDFs of the two velocity components are evident, with a wider shape for the zonal velocity component due to the larger variability in the zonal direction. Results confirm that the exponential shape of the global velocity PDF is a consequence of the spatially inhomogeneous EKE distribution over the global ocean. Only regions with a small variance in EKE have Gaussian-shaped PDFs; however, normalizing each time series by its STD results in Gaussian PDFs everywhere.

  9. Efficient probability sequence

    OpenAIRE

    Regnier, Eva

    2014-01-01

    A probability sequence is an ordered set of probability forecasts for the same event. Although single-period probabilistic forecasts and methods for evaluating them have been extensively analyzed, we are not aware of any prior work on evaluating probability sequences. This paper proposes an efficiency condition for probability sequences and shows properties of efficient forecasting systems, including memorylessness and increasing discrimination. These results suggest tests for efficiency and ...

  10. Efficient probability sequences

    OpenAIRE

    Regnier, Eva

    2014-01-01

    DRMI working paper A probability sequence is an ordered set of probability forecasts for the same event. Although single-period probabilistic forecasts and methods for evaluating them have been extensively analyzed, we are not aware of any prior work on evaluating probability sequences. This paper proposes an efficiency condition for probability sequences and shows properties of efficient forecasting systems, including memorylessness and increasing discrimination. These res...

  11. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned

  12. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  13. Exact Probability Distribution versus Entropy

    Directory of Open Access Journals (Sweden)

    Kerstin Andersson

    2014-10-01

    Full Text Available The problem addressed concerns the determination of the average number of successive attempts at guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
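
    To make the guessing setup concrete, a brute-force sketch (illustrative only, with a tiny alphabet and short words so exact enumeration is feasible) that computes the average number of guesses when words are guessed in decreasing order of probability, alongside the word entropy:

```python
from itertools import product
import numpy as np

# Tiny illustrative alphabet with letter probabilities (first-order approximation).
letters = ["a", "b", "c"]
p_letter = np.array([0.6, 0.3, 0.1])
word_len = 4

# Enumerate all words and their probabilities under independent letters.
words = list(product(range(len(letters)), repeat=word_len))
p_word = np.array([np.prod(p_letter[list(w)]) for w in words])

# Optimal strategy: guess in decreasing order of probability.
sorted_p = np.sort(p_word)[::-1]

# Average number of guesses: sum over rank r of r * P(word at rank r).
avg_guesses = np.sum((np.arange(len(sorted_p)) + 1) * sorted_p)
entropy_bits = -np.sum(p_word * np.log2(p_word))
print(f"{len(words)} words, average guesses = {avg_guesses:.2f}, entropy = {entropy_bits:.2f} bits")
```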

  14. Omega-3 fatty acids status in human subjects estimated using a food frequency questionnaire and plasma phospholipids levels

    Directory of Open Access Journals (Sweden)

    Garneau Véronique

    2012-07-01

    Full Text Available Abstract Background Intakes of omega-3 (n-3) fatty acids (FA) are associated with several health benefits. The aim of this study was to verify whether intakes of n-3 FA estimated from a food frequency questionnaire (FFQ) correlate with n-3 FA levels measured in plasma phospholipids (PL). Methods The study sample consisted of 200 French-Canadian men and women aged 18 to 55 years. Dietary data were collected using a validated FFQ. Fasting blood samples were collected and the plasma PL FA profile was measured by gas chromatography. Results Low intakes of n-3 long-chain FA together with low percentages of n-3 long-chain FA in plasma PL were found in the French-Canadian population. Daily intakes of eicosapentaenoic acid (EPA), docosapentaenoic acid (DPA) and docosahexaenoic acid (DHA) were similar between men and women. Yet, alpha-linolenic acid (ALA) and total n-3 FA intakes were significantly higher in men compared to women (ALA: 2.28 g and 1.69 g, p n-3 FA: 2.57 g and 1.99 g, p n-3 FA (men: r = 0.47, p Conclusion Estimated n-3 long-chain FA intake among this young and well-educated French-Canadian population is lower than the recommendations. Further, FFQ data is comparable to plasma PL results to estimate DHA and total n-3 FA status in healthy individuals as well as to evaluate the EPA and DPA status in women. Overall, this FFQ could be used as a simple, low-cost tool in future studies to rank n-3 FA status of individuals.

  15. Estimation of glomerular filtration rate in diabetic subjects: Cockcroft formula or modification of Diet in Renal Disease study equation?

    Science.gov (United States)

    Rigalleau, Vincent; Lasseur, Catherine; Perlemoine, Caroline; Barthe, Nicole; Raffaitin, C; Liu, Chung; Chauveau, Phillipe; Baillet-Blanco, Laurence; Beauvieux, Marie-Christine; Combe, C; Gin, Henri

    2005-04-01

    The Cockcroft-Gault formula is recommended for the evaluation of renal function in diabetic patients. The more recent Modification of Diet in Renal Disease (MDRD) study equation seems more accurate, but it has not been validated in diabetic patients. This study compares the two methods. In 160 diabetic patients, we compared the Cockcroft-Gault formula and MDRD equation estimations to glomerular filtration rates (GFRs) measured by an isotopic method ((51)Cr-EDTA) by correlation studies and a Bland-Altman procedure. Their accuracy for the diagnosis of moderately (GFR formula (r = 0.74; P formula. Analysis of ROC curves showed that the MDRD equation had a better maximal accuracy for the diagnosis of moderate (areas under the curve [AUCs] 0.868 for the Cockcroft-Gault formula and 0.927 for the MDRD equation; P = 0.012) and severe renal failure (AUC 0.883 for the Cockcroft-Gault formula and 0.962 for the MDRD equation; P = 0.0001). In the 87 patients with renal insufficiency, the MDRD equation estimation was better correlated with isotopic GFR (Cockcroft-Gault formula r = 0.57; the MDRD equation r = 0.78; P < 0.01), and it was not biased as evaluated by the Bland-Altman procedure. Although both equations have imperfections, the MDRD equation is more accurate for the diagnosis and stratification of renal failure in diabetic patients.

  16. Electron-ion temperature ratio estimations in the summer polar mesosphere when subject to HF radio wave heating

    Science.gov (United States)

    Pinedo, H.; La Hoz, C.; Havnes, O.; Rietveld, M.

    2014-10-01

    We have inferred the electron temperature enhancements above mesospheric altitudes under Polar Mesospheric Summer Echoes (PMSE) conditions when the ionosphere is exposed to artificial HF radio wave heating. The proposed method uses the dependence of the radar cross section on the electron-to-ion temperature ratio to infer the heating factor from incoherent scatter radar (ISR) power measurements above 90 km. Model heating temperatures match our ISR estimations between 90 and 130 km with a 0.94 Pearson correlation index. The PMSE strength measured by the MORRO MST radar is about 50% weaker during the heater-on period when the modeled electron-to-ion mesospheric temperature is approximately 10 times greater than the unperturbed value. No PMSE weakening is found when the mesospheric temperature enhancement is by a factor of three or less. The PMSE weakening and its absence are consistent with the modeled mesospheric electron temperatures. This consistency supports the proposed method for estimating mesospheric electron temperatures using independent MST and ISR radar measurements.

  17. Oxygen boundary crossing probabilities.

    Science.gov (United States)

    Busch, N A; Silver, I A

    1987-01-01

    The probability that an oxygen particle will reach a time dependent boundary is required in oxygen transport studies involving solution methods based on probability considerations. A Volterra integral equation is presented, the solution of which gives directly the boundary crossing probability density function. The boundary crossing probability is the probability that the oxygen particle will reach a boundary within a specified time interval. When the motion of the oxygen particle may be described as strongly Markovian, then the Volterra integral equation can be rewritten as a generalized Abel equation, the solution of which has been widely studied.

  18. Risk Stratification by 24-Hour Ambulatory Blood Pressure and Estimated Glomerular Filtration Rate in 5322 Subjects From 11 Populations

    DEFF Research Database (Denmark)

    Boggia, José; Thijs, Lutgarde; Li, Yan

    2013-01-01

    subjects (median age, 51.8 years; 43.1% women) randomly recruited from 11 populations, who had baseline measurements of 24-hour ambulatory blood pressure (ABP(24)) and eGFR. We computed hazard ratios using multivariable-adjusted Cox regression. Median follow-up was 9.3 years. In fully adjusted models......, which included both ABP(24) and eGFR, ABP(24) predicted (P≤0.008) both total (513 deaths) and cardiovascular (206) mortality; eGFR only predicted cardiovascular mortality (P=0.012). Furthermore, ABP(24) predicted (P≤0.0056) fatal combined with nonfatal events as a result of all cardiovascular causes...... (555 events), cardiac disease (335 events), or stroke (218 events), whereas eGFR only predicted the composite cardiovascular end point and stroke (P≤0.035). The interaction terms between ABP(24) and eGFR were all nonsignificant (P≥0.082). For cardiovascular mortality, the composite cardiovascular end...

  19. In All Probability, Probability is not All

    Science.gov (United States)

    Helman, Danny

    2004-01-01

    The national lottery is often portrayed as a game of pure chance with no room for strategy. This misperception seems to stem from the application of probability instead of expectancy considerations, and can be utilized to introduce the statistical concept of expectation.

  20. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the ground work to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, Erdos-Kac invariance principle, functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a text book for students pursuing graduate programs in Mathematics and or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  1. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  2. Normative perceptual estimates for 91 healthy subjects age 60-75: Impact of age, education, employment, physical exercise, alcohol and video gaming

    Directory of Open Access Journals (Sweden)

    Inge Linda Wilms

    2014-10-01

    Full Text Available Visual perception serves as the basis for much of the higher-level cognitive processing as well as human activity in general. Here we present normative estimates for the following components of visual perception: the visual perceptual threshold, the visual short-term memory capacity and the visual perceptual encoding/decoding speed (processing speed) of Visual Short-Term Memory (VSTM), based on an assessment of 91 healthy subjects aged 60-75. The estimates were modelled from input from a whole-report assessment based on A Theory of Visual Attention (TVA). In addition to the estimates themselves, we present correlational data and multiple regression analyses between the estimates and self-reported demographic data and lifestyle variables. The regression statistics suggest that education level, video gaming activity and employment status may significantly impact the encoding/decoding speed of VSTM, but not the capacity of VSTM nor the visual perceptual threshold. The estimates will be useful for future studies into the effects of various types of intervention and training on cognition in general and visual attention in particular.

  3. Development of a Probabilistic Safety Assessment Framework for an Interim Dry Storage Facility Subjected to an Aircraft Crash Using Best-Estimate Structural Analysis

    Directory of Open Access Journals (Sweden)

    Belal Almomani

    2017-03-01

    Full Text Available Using a probabilistic safety assessment, a risk evaluation framework for an aircraft crash into an interim spent fuel storage facility is presented. Damage evaluation of a detailed generic cask model in a simplified building structure under an aircraft impact is discussed through a numerical structural analysis and an analytical fragility assessment. Sequences of the impact scenario are shown in a developed event tree, with uncertainties considered in the impact analysis and failure probabilities calculated. To evaluate the influence of parameters relevant to design safety, risks are estimated for three specification levels of cask and storage facility structures. The proposed assessment procedure includes the determination of the loading parameters, reference impact scenario, structural response analyses of facility walls, cask containment, and fuel assemblies, and a radiological consequence analysis with dose–risk estimation. The risk results for the proposed scenario in this study are expected to be small relative to those of design basis accidents for best-estimated conservative values. The importance of this framework is seen in its flexibility to evaluate the capability of the facility to withstand an aircraft impact and in its ability to anticipate potential realistic risks; the framework also provides insight into epistemic uncertainty in the available data and into the sensitivity of the design parameters for future research.

  4. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  5. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis of a vibration isolation system for high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criteria. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters, damping and natural frequency, are then determined such that the probability of exceeding vibration criteria VC-E and VC-D is kept below 0.04.
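
    A minimal sketch of the underlying probability computation, assuming a zero-mean Gaussian displacement response with a known RMS value; the criterion and RMS numbers are placeholders, not the article's data.

```python
from math import erf, sqrt

def prob_exceeding(criterion_um, rms_um):
    """P(|displacement| > criterion) for a zero-mean Gaussian response with given RMS."""
    phi = 0.5 * (1.0 + erf(criterion_um / (rms_um * sqrt(2.0))))
    return 2.0 * (1.0 - phi)

# Placeholder values: displacement criterion vs. response RMS, both in micrometers.
for criterion, rms in [(6.0, 2.5), (3.0, 2.5), (3.0, 1.2)]:
    print(f"criterion={criterion} um, rms={rms} um -> P(exceed)={prob_exceeding(criterion, rms):.3f}")
```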

  6. Dynamic Connectivity States Estimated from Resting fMRI Identify Differences among Schizophrenia, Bipolar Disorder, and Healthy Control Subjects

    Directory of Open Access Journals (Sweden)

    Barnaly eRashid

    2014-11-01

    Full Text Available Schizophrenia and bipolar disorder share significant overlap in clinical symptoms, brain characteristics, and risk genes, and both are associated with dysconnectivity among large-scale brain networks. Resting state functional magnetic resonance imaging (rsfMRI) data facilitates studying macroscopic connectivity among distant brain regions. Standard approaches to identifying such connectivity include seed-based correlation and data-driven clustering methods such as independent component analysis (ICA), but typically focus on average connectivity. In this study, we utilize ICA on rsfMRI data to obtain intrinsic connectivity networks (ICNs) in cohorts of healthy controls (HC) and age-matched schizophrenia and bipolar disorder patients. Subsequently, we investigated differences in functional network connectivity (FNC), defined as pairwise correlations among the timecourses of ICNs, between healthy controls and patients. We quantified differences in both static (average) and dynamic (windowed) connectivity during the entire scan duration. Disease-specific differences were identified in connectivity within different dynamic states. Schizophrenia patients showed more differences from healthy subjects than did bipolar patients, including both hyper- and hypoconnectivity in one common connectivity state (dynamic state 3). Group differences between schizophrenia and bipolar patients were also identified in patterns (states) of connectivity involving the frontal (dynamic state 1) and frontal-parietal regions (dynamic state 3). Our results provide new information about these illnesses and strongly suggest that state-based analyses are critical to avoid averaging together important factors that can help distinguish these clinical groups.

  7. Probability output modeling for support vector machines

    Science.gov (United States)

    Zhang, Xiang; Xiao, Xiaoling; Tian, Jinwen; Liu, Jian

    2007-11-01

    In this paper we propose an approach to model the posterior probability output of multi-class SVMs. The sigmoid function is used to estimate the posterior probability output in binary classification. The posterior probability output of multi-class SVMs is then obtained by directly solving equations based on combining the probability outputs of the binary classifiers using Bayes' rule. The differences and different weights among these two-class SVM classifiers, based on the posterior probability, are taken into account when their probability outputs are combined. The comparative experiment results show that our method achieves better classification precision and a better posterior probability distribution than the pairwise coupling method and Hastie's optimization method.
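
    For the binary building block mentioned above, a sketch of the usual sigmoid (Platt-style) mapping from SVM decision values to posterior probabilities; the scikit-learn calls and the toy data are assumptions for illustration, not the authors' implementation (in practice the sigmoid parameters would be fit on held-out decision values).

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

# Toy binary problem and an SVM giving uncalibrated decision values f(x).
X, y = make_classification(n_samples=400, n_features=10, random_state=0)
svm = LinearSVC(C=1.0).fit(X, y)
f = svm.decision_function(X)

# Platt-style sigmoid: P(y=1 | f) = 1 / (1 + exp(A*f + B)),
# with (A, B) chosen to minimize the negative log-likelihood.
def neg_log_likelihood(params):
    A, B = params
    p = 1.0 / (1.0 + np.exp(A * f + B))
    eps = 1e-12
    return -np.sum(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

A, B = minimize(neg_log_likelihood, x0=[-1.0, 0.0], method="Nelder-Mead").x
posterior = 1.0 / (1.0 + np.exp(A * f + B))
print("fitted A, B:", round(float(A), 3), round(float(B), 3),
      "| mean posterior for class 1:", round(float(posterior[y == 1].mean()), 3))
```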

  8. Salivary protein concentration, flow rate, buffer capacity and pH estimation: A comparative study among young and elderly subjects, both normal and with gingivitis and periodontitis

    Science.gov (United States)

    Shaila, Mulki; Pai, G. Prakash; Shetty, Pushparaj

    2013-01-01

    Background: To evaluate the salivary protein concentration in gingivitis and periodontitis patients and compare the parameters like salivary total protein, salivary albumin, salivary flow rate, pH, buffer capacity and flow rate in both young and elderly patients with simple methods. Materials and Methods: One hundred and twenty subjects were grouped based on their age as young and elderly. Each group was subgrouped (20 subjects) as controls, gingivitis and periodontitis. Unstimulated whole saliva was collected from patients and flow rate was noted down during collection of the sample. Salivary protein estimation was done using the Biuret method and salivary albumin was assessed using the Bromocresol green method. pH was estimated with a pHmeter and buffering capacity was analyzed with the titration method. Student's t-test, Fisher's test (ANOVA) and Tukey HSD (ANOVA) tests were used for statistical analysis. Results: A very highly significant rise in the salivary total protein and albumin concentration was noted in gingivitis and periodontitis subjects of both young and elderly. An overall decrease in salivary flow rate was observed among the elderly, and also the salivary flow rate of women was significantly lower than that of men. Conclusion: Significant associations between salivary total protein and albumin in gingivitis and periodontitis were found with simple biochemical tests. A decrease in salivary flow rate among elderly and among women was noted. PMID:23633771

  9. Salivary protein concentration, flow rate, buffer capacity and pH estimation: A comparative study among young and elderly subjects, both normal and with gingivitis and periodontitis

    Directory of Open Access Journals (Sweden)

    Mulki Shaila

    2013-01-01

    Full Text Available Background: To evaluate the salivary protein concentration in gingivitis and periodontitis patients and compare the parameters like salivary total protein, salivary albumin, salivary flow rate, pH, buffer capacity and flow rate in both young and elderly patients with simple methods. Materials and Methods: One hundred and twenty subjects were grouped based on their age as young and elderly. Each group was subgrouped (20 subjects) as controls, gingivitis and periodontitis. Unstimulated whole saliva was collected from patients and flow rate was noted down during collection of the sample. Salivary protein estimation was done using the Biuret method and salivary albumin was assessed using the Bromocresol green method. pH was estimated with a pH meter and buffering capacity was analyzed with the titration method. Student's t-test, Fisher's test (ANOVA) and Tukey HSD (ANOVA) tests were used for statistical analysis. Results: A very highly significant rise in the salivary total protein and albumin concentration was noted in gingivitis and periodontitis subjects of both young and elderly. An overall decrease in salivary flow rate was observed among the elderly, and also the salivary flow rate of women was significantly lower than that of men. Conclusion: Significant associations between salivary total protein and albumin in gingivitis and periodontitis were found with simple biochemical tests. A decrease in salivary flow rate among elderly and among women was noted.

  10. Salivary protein concentration, flow rate, buffer capacity and pH estimation: A comparative study among young and elderly subjects, both normal and with gingivitis and periodontitis.

    Science.gov (United States)

    Shaila, Mulki; Pai, G Prakash; Shetty, Pushparaj

    2013-01-01

    To evaluate the salivary protein concentration in gingivitis and periodontitis patients and compare the parameters like salivary total protein, salivary albumin, salivary flow rate, pH, buffer capacity and flow rate in both young and elderly patients with simple methods. One hundred and twenty subjects were grouped based on their age as young and elderly. Each group was subgrouped (20 subjects) as controls, gingivitis and periodontitis. Unstimulated whole saliva was collected from patients and flow rate was noted down during collection of the sample. Salivary protein estimation was done using the Biuret method and salivary albumin was assessed using the Bromocresol green method. pH was estimated with a pHmeter and buffering capacity was analyzed with the titration method. Student's t-test, Fisher's test (ANOVA) and Tukey HSD (ANOVA) tests were used for statistical analysis. A very highly significant rise in the salivary total protein and albumin concentration was noted in gingivitis and periodontitis subjects of both young and elderly. An overall decrease in salivary flow rate was observed among the elderly, and also the salivary flow rate of women was significantly lower than that of men. Significant associations between salivary total protein and albumin in gingivitis and periodontitis were found with simple biochemical tests. A decrease in salivary flow rate among elderly and among women was noted.

  11. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

    compensation (a $10 lottery) on Amazon Mechanical Turk, an online platform hosted on Amazon.com [31]. All of the participants stated that they were native...

  12. Comparison of an automated most-probable-number technique with traditional plating methods for estimating populations of total aerobes, coliforms, and Escherichia coli associated with freshly processed broiler chickens.

    Science.gov (United States)

    Line, J E; Stern, N J; Oakley, B B; Seal, B S

    2011-09-01

    An instrument (TEMPO) has been developed to automate the most-probable-number (MPN) technique and reduce the effort required to estimate some bacterial populations. We compared the automated MPN technique with traditional microbiological plating methods and Petrifilm methods for estimating the total viable count of aerobic microorganisms (TVC), total coliforms (CC), and Escherichia coli populations (EC) on freshly processed broiler chicken carcasses (postchill whole carcass rinse [WCR] samples) and cumulative drip-line samples from a commercial broiler processing facility. Overall, 120 broiler carcasses, 36 prechill drip-line samples, and 40 postchill drip-line samples were collected over 5 days (representing five individual flocks) and analyzed by the automated MPN and direct agar plating and Petrifilm methods. The TVC correlation coefficient between the automated MPN and traditional methods was very high (0.972) for the prechill drip samples, which had mean log-transformed values of 3.09 and 3.02, respectively. The TVC correlation coefficient was lower (0.710) for the postchill WCR samples, which had lower mean log values of 1.53 and 1.31, respectively. Correlations between the methods for the prechill CC and EC samples were 0.812 and 0.880, respectively. The estimated number of total aerobes was generally greater than the total number of coliforms or E. coli recovered for all sample types (P < 2e⁻¹⁶). Significantly more bacteria were recovered from the prechill samples than from the postchill WCR or cumulative drip samples (P < 9.5e⁻¹² and P < 2e⁻¹⁶, respectively). When samples below the limit of detection were excluded, 92.1% of the total responses were within a single log difference between the traditional plating or Petrifilm methods and the automated MPN method.
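
    As background on the MPN technique that the instrument automates, a sketch of the classical maximum-likelihood most-probable-number estimate from a dilution series; the dilution volumes, tube counts, and positive-tube results below are hypothetical, and this is not the TEMPO instrument's algorithm.

```python
import numpy as np
from scipy.optimize import brentq

def mpn_estimate(volumes_ml, n_tubes, n_positive):
    """Maximum-likelihood most-probable-number estimate (organisms per mL)
    from a dilution series: inoculated volumes, tubes per dilution, positive tubes."""
    v = np.asarray(volumes_ml, dtype=float)
    n = np.asarray(n_tubes, dtype=float)
    p = np.asarray(n_positive, dtype=float)

    def score(lam):
        # derivative of the log-likelihood with respect to the concentration lam
        return np.sum(p * v * np.exp(-lam * v) / (1.0 - np.exp(-lam * v))) - np.sum((n - p) * v)

    # Requires at least one negative tube so that the root is finite.
    return brentq(score, 1e-9, 1e6)

# Hypothetical 3-tube series at 10, 1 and 0.1 mL inocula with 3, 2, 1 positive tubes.
print(round(mpn_estimate([10.0, 1.0, 0.1], [3, 3, 3], [3, 2, 1]), 2), "organisms/mL")
```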

  13. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
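
    A sketch of the basic construction, using pointwise order-statistic intervals only as an illustration; the paper's simultaneous 1-α intervals require a further adjustment that is not shown here, and the sample is simulated.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
x = np.sort(rng.normal(loc=10.0, scale=2.0, size=50))   # observed sample, illustrative
n = x.size
mean, sd = x.mean(), x.std(ddof=1)

# Pointwise 95% intervals for each ordered observation: under normality, the i-th order
# statistic of a uniform(0,1) sample is Beta(i, n - i + 1); mapping its quantiles through
# the fitted normal gives an interval for the i-th smallest data point.
alpha = 0.05
i = np.arange(1, n + 1)
lo = mean + sd * stats.norm.ppf(stats.beta.ppf(alpha / 2, i, n - i + 1))
hi = mean + sd * stats.norm.ppf(stats.beta.ppf(1 - alpha / 2, i, n - i + 1))

outside = np.sum((x < lo) | (x > hi))
print(f"{outside} of {n} ordered points fall outside their pointwise 95% intervals")
```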

  14. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications....

  15. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  16. Statistics and Probability

    Directory of Open Access Journals (Sweden)

    Laktineh Imad

    2010-04-01

    Full Text Available This course constitutes a brief introduction to probability applications in high energy physics. First, the mathematical tools related to the different probability concepts are introduced. The probability distributions which are commonly used in high energy physics and their characteristics are then shown and commented on. The central limit theorem and its consequences are analysed. Finally, some numerical methods used to produce different kinds of probability distributions are presented. The full article (17 p.) corresponding to this lecture is written in French and is provided in the proceedings of the book SOS 2008.

  17. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  18. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  19. Estimation of Pulmonary Motion in Healthy Subjects and Patients with Intrathoracic Tumors Using 3D-Dynamic MRI: Initial Results

    Energy Technology Data Exchange (ETDEWEB)

    Plathow, Christian; Schoebinger, Max; Meinzer, Heinz Peter [German Cancer Research Center, Heidelberg (Germany); Herth, Felix; Tuengerthal, Siegfried [Clinic of Thoracic Disease, Heidelberg (Germany); Kauczor, Hans Ulrich [University of Heidelberg, Heidelberg (Germany)

    2009-12-15

    To evaluate a new technique for quantifying regional lung motion using 3D-MRI in healthy volunteers and to apply the technique in patients with intra- or extrapulmonary tumors. Intraparenchymal lung motion during a whole breathing cycle was quantified in 30 healthy volunteers using 3D-dynamic MRI (FLASH [fast low angle shot] 3D, TRICKS [time-resolved interpolated contrast kinetics]). Qualitative and quantitative vector color maps and cumulative histograms were produced using an introduced semiautomatic algorithm. An analysis of lung motion was performed and correlated with an established 2D-MRI technique for verification. As a proof of concept, the technique was applied in five patients with non-small cell lung cancer (NSCLC) and five patients with malignant pleural mesothelioma (MPM). The correlation between intraparenchymal lung motion of the basal lung parts and the 2D-MRI technique was significant (r = 0.89, p < 0.05). The vector color maps also quantitatively illustrated regional lung motion in all healthy volunteers. No differences were observed between the two hemithoraces, which was verified by cumulative histograms. The patients with NSCLC showed a local lack of lung motion in the area of the tumor. In the patients with MPM, there was globally diminished motion of the tumor-bearing hemithorax, which improved significantly after chemotherapy (CHT) (assessed by the 2D and 3D techniques) (p < 0.01). Using global spirometry, an improvement could also be shown (vital capacity 2.9 ± 0.5 versus 3.4 ± 0.6 L, FEV1 0.9 ± 0.2 versus 1.4 ± 0.2 L) after CHT, but this improvement was not significant. 3D-dynamic MRI is able to quantify intraparenchymal lung motion. Local and global parenchymal pathologies can be precisely located, and the technique might be a new tool for quantifying even slight changes in lung motion (e.g. in therapy monitoring, follow-up studies or even benign lung diseases)

  20. Estimation of absorbed dose by newborn patients subjected to chest radiographs; Estimativa de dose efetiva para radiografias do torax em pediatria neonatal

    Energy Technology Data Exchange (ETDEWEB)

    Bunick, Ana P. [Faculdades Pequeno Principe, Curitiba, PR (Brazil); Schelin, Hugo R. [Instituto de Pesquisa Pele Pequeno Principe, Curitiba, PR (Brazil); Denyak, Valeriy [Hospital Infantil Pequeno Principe, Curitiba, PR (Brazil)

    2016-07-01

    The aim of this study is to present an estimate of the effective dose received by newborn patients hospitalized in the NICU and subjected to chest X-ray examinations in the AP projection. Initially, chest X-ray examinations performed on newborn patients were followed and subsequently simulated with a newborn phantom. The ESAK values obtained with TLDs were used to calculate the effective dose for each examination using the Caldose_X software. The effective doses estimated for the simulated examinations in this study range from 2.3 μSv to 10.7 μSv. The results are generally lower than those reported in similar previous studies. (author)

  1. On Probability Domains IV

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2017-12-01

    Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one —a quantum phenomenon and, dually, an observable can map a crisp random event to a genuine fuzzy random event —a fuzzy phenomenon. The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.

  2. Difficulties related to Probabilities

    OpenAIRE

    Rosinger, Elemer Elad

    2010-01-01

    Probability theory is often used as if it had the same ontological status as, for instance, Euclidean Geometry or Peano Arithmetic. In this regard, several highly questionable aspects of probability theory are mentioned which have earlier been presented in two arXiv papers.

  3. On Randomness and Probability

    Indian Academy of Sciences (India)

    casinos and gambling houses? How does one interpret a statement like "there is a 30 per cent chance of rain tonight" - a statement we often hear on the news? Such questions arise in the mind of every student when she/he is taught probability as part of mathematics. Many students who go on to study probability and ...

  4. Dynamic update with probabilities

    NARCIS (Netherlands)

    Van Benthem, Johan; Gerbrandy, Jelle; Kooi, Barteld

    2009-01-01

    Current dynamic-epistemic logics model different types of information change in multi-agent scenarios. We generalize these logics to a probabilistic setting, obtaining a calculus for multi-agent update with three natural slots: prior probability on states, occurrence probabilities in the relevant

  5. Elements of quantum probability

    NARCIS (Netherlands)

    Kummerer, B.; Maassen, H.

    1996-01-01

    This is an introductory article presenting some basic ideas of quantum probability. From a discussion of simple experiments with polarized light and a card game we deduce the necessity of extending the body of classical probability theory. For a class of systems, containing classical systems with

  6. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  7. On Probability Domains IV

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2017-06-01

    Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one —a quantum phenomenon and, dually, an observable can map a crisp random event to a genuine fuzzy random event —a fuzzy phenomenon. The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.

  8. Representing Uncertainty by Probability and Possibility

    DEFF Research Database (Denmark)

    Uncertain parameters in modeling are usually represented by probability distributions reflecting either the objective uncertainty of the parameters or the subjective belief held by the model builder. This approach is particularly suited for representing the statistical nature or variance of uncer...

  9. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  10. Recent Developments in Applied Probability and Statistics

    CERN Document Server

    Devroye, Luc; Kohler, Michael; Korn, Ralf

    2010-01-01

    This book presents surveys on recent developments in applied probability and statistics. The contributions include topics such as nonparametric regression and density estimation, option pricing, probabilistic methods for multivariate interpolation, robust graphical modelling and stochastic differential equations. Due to its broad coverage of different topics the book offers an excellent overview of recent developments in applied probability and statistics.

  11. A database of virtual healthy subjects to assess the accuracy of foot-to-foot pulse wave velocities for estimation of aortic stiffness.

    Science.gov (United States)

    Willemet, Marie; Chowienczyk, Phil; Alastruey, Jordi

    2015-08-15

    While central (carotid-femoral) foot-to-foot pulse wave velocity (PWV) is considered to be the gold standard for the estimation of aortic arterial stiffness, peripheral foot-to-foot PWVs (brachial-ankle, femoral-ankle, and carotid-radial) are being studied as substitutes for this central measurement. We present a novel methodology to theoretically assess these computed indexes and the hemodynamic mechanisms relating them. We created a database of 3,325 virtual healthy adult subjects using a validated one-dimensional model of the arterial hemodynamics, with cardiac and arterial parameters varied within physiological healthy ranges. For each virtual subject, foot-to-foot PWV was computed from numerical pressure waveforms at the same locations where clinical measurements are commonly taken. Our numerical results confirm clinical observations: 1) carotid-femoral PWV is a good indicator of aortic stiffness and correlates well with aortic PWV; 2) brachial-ankle PWV overestimates aortic PWV and is related to the stiffness and geometry of both elastic and muscular arteries; and 3) muscular PWV (carotid-radial, femoral-ankle) does not capture the stiffening of the aorta and should therefore not be used as a surrogate for aortic stiffness. In addition, our analysis highlights that the foot-to-foot PWV algorithm is sensitive to the presence of reflected waves in late diastole, which introduce errors in the PWV estimates. In this study, we have created a database of virtual healthy subjects, which can be used to theoretically assess the efficiency of physiological indexes based on pulse wave analysis. Copyright © 2015 the American Physiological Society.
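
    A rough illustration of the foot-to-foot principle described above (not the authors' algorithm or data): the sketch below locates each wave's foot with the intersecting-tangent method and divides an assumed path length by the transit time. The sampling rate, synthetic waveforms and path length are invented placeholders.

        # Minimal foot-to-foot PWV sketch (assumed inputs; not the paper's algorithm).
        import numpy as np

        def wave_foot(p, fs):
            """Locate the 'foot' of one pressure pulse: intersection of the horizontal
            line through the diastolic minimum with the tangent at the steepest upstroke."""
            dp = np.gradient(p) * fs            # first derivative (units/s)
            i_up = np.argmax(dp)                # index of steepest systolic upstroke
            i_min = np.argmin(p[:i_up])         # diastolic minimum before the upstroke
            t_up = i_up / fs
            # tangent at i_up: p(t) ~ p[i_up] + dp[i_up]*(t - t_up); solve p(t) = p[i_min]
            return t_up + (p[i_min] - p[i_up]) / dp[i_up]

        def foot_to_foot_pwv(p_prox, p_dist, fs, path_length_m):
            """PWV = path length / transit time between the two wave feet."""
            dt = wave_foot(p_dist, fs) - wave_foot(p_prox, fs)
            return path_length_m / dt

        # Synthetic single-beat example (purely illustrative numbers).
        fs = 1000.0                             # Hz
        t = np.arange(0, 0.8, 1 / fs)
        def pulse(delay):
            return 80 + 40 * np.clip(np.sin(np.pi * (t - delay) / 0.3), 0, None)
        print(foot_to_foot_pwv(pulse(0.05), pulse(0.11), fs, path_length_m=0.6))  # ~10 m/s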

  12. Integrated statistical modelling of spatial landslide probability

    Science.gov (United States)

    Mergili, M.; Chu, H.-J.

    2015-09-01

    Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) the landslide inventory is subset into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e. the spatial probability that at least one landslide pixel occurs within a zone of defined size. We quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km² study area in southern Taiwan, using an inventory of 1399 landslides triggered by typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability; (iv) removing the largest landslides from the analysis leads to an enhanced model performance.
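
    The combination rule in step (v) can be written compactly, as in the short sketch below; the array names and probability values are illustrative only and are not taken from the study.

        # Toy per-pixel combination of landslide probabilities (illustrative values only).
        import numpy as np

        p_release = np.array([[0.02, 0.10], [0.00, 0.30]])  # pixel-based release probability
        p_impact  = np.array([[0.50, 0.20], [0.80, 0.05]])  # impact probability from the angle of reach
        p_zonal   = np.array([[0.40, 0.60], [0.70, 0.90]])  # zonal release probability of the relevant zone

        # Integrated probability: maximum of release probability and impact * zonal release probability.
        p_integrated = np.maximum(p_release, p_impact * p_zonal)
        print(p_integrated)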

  13. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  14. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  15. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  16. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  17. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows growing importance of probability as coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  18. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  19. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  20. Elements of quantum probability

    OpenAIRE

    Kummerer, B.; Maassen, Hans

    1996-01-01

    This is an introductory article presenting some basic ideas of quantum probability. From a discussion of simple experiments with polarized light and a card game we deduce the necessity of extending the body of classical probability theory. For a class of systems, containing classical systems with finitely many states, a probabilistic model is developed. It can describe, in particular, the polarization experiments. Some examples of ‘quantum coin tosses’ are discussed, closely related to V.F.R....

  1. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  2. Estimation of pelvis kinematics in level walking based on a single inertial sensor positioned close to the sacrum: validation on healthy subjects with stereophotogrammetric system.

    Science.gov (United States)

    Buganè, Francesca; Benedetti, Maria Grazia; D'Angeli, Valentina; Leardini, Alberto

    2014-10-21

    Kinematic measures from inertial sensors have a value in the clinical assessment of pathological gait, to track quantitatively the outcome of interventions and rehabilitation programs. To become a standard tool for clinicians, it is necessary to evaluate their capability to provide reliable and comprehensible information, possibly by comparing this with that provided by the traditional gait analysis. The aim of this study was to assess by state-of-the-art gait analysis the reliability of a single inertial device attached to the sacrum to measure pelvis kinematics during level walking. The output signals of the three-axis gyroscope were processed to estimate the spatial orientation of the pelvis in the sagittal (tilt angle), frontal (obliquity) and transverse (rotation) anatomical planes. These estimated angles were compared with those provided by an 8-TV-camera stereophotogrammetric system utilizing a standard experimental protocol, with four markers on the pelvis. This was observed in a group of sixteen healthy subjects while performing three repetitions of level walking along a 10-meter walkway at slow, normal and fast speeds. The determination coefficient, the scale factor and the bias of a linear regression model were calculated to represent the differences between the angular patterns from the two measurement systems. For the intra-subject variability, one volunteer was asked to repeat walking at normal speed 10 times. A good match was observed for obliquity and rotation angles. For the tilt angle, the pattern and range of motion was similar, but a bias was observed, due to the different initial inclination angle in the sagittal plane of the inertial sensor with respect to the pelvis anatomical frame. A good intra-subject consistency has also been shown by the small variability of the pelvic angles as estimated by the new system, confirmed by very small values of standard deviation for all three angles. These results suggest that this inertial device is a
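
    To make the comparison metrics concrete, the following sketch fits a linear model between two angle traces and reports the scale factor, bias and determination coefficient, in the spirit of the comparison described above; the signals are synthetic placeholders, not study data.

        # Compare two angle traces via a linear model y ~ a*x + b (synthetic data, illustrative only).
        import numpy as np

        t = np.linspace(0, 1, 101)                            # one normalized gait cycle
        x = 4.0 * np.sin(2 * np.pi * t)                       # "stereophotogrammetry" pelvic angle (deg)
        y = 0.95 * x + 1.2 + 0.1 * np.random.randn(t.size)    # "inertial sensor" trace: scaled, biased, noisy

        a, b = np.polyfit(x, y, 1)                            # scale factor and bias
        r2 = np.corrcoef(x, y)[0, 1] ** 2                     # determination coefficient
        print(f"scale={a:.2f}, bias={b:.2f} deg, R^2={r2:.3f}")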

  3. Methods for fitting a parametric probability distribution to most probable number data.

    Science.gov (United States)

    Williams, Michael S; Ebel, Eric D

    2012-07-02

    Every year hundreds of thousands, if not millions, of samples are collected and analyzed to assess microbial contamination in food and water. The concentration of pathogenic organisms at the end of the production process is low for most commodities, so a highly sensitive screening test is used to determine whether the organism of interest is present in a sample. In some applications, samples that test positive are subjected to quantitation. The most probable number (MPN) technique is a common method to quantify the level of contamination in a sample because it is able to provide estimates at low concentrations. This technique uses a series of dilution count experiments to derive estimates of the concentration of the microorganism of interest. An application for these data is food-safety risk assessment, where the MPN concentration estimates can be fitted to a parametric distribution to summarize the range of potential exposures to the contaminant. Many different methods (e.g., substitution methods, maximum likelihood and regression on order statistics) have been proposed to fit microbial contamination data to a distribution, but the development of these methods rarely considers how the MPN technique influences the choice of distribution function and fitting method. An often overlooked aspect when applying these methods is whether the data represent actual measurements of the average concentration of microorganisms per milliliter or are real-valued estimates of the average concentration, as is the case with MPN data. In this study, we propose two methods for fitting MPN data to a probability distribution. The first method uses a maximum likelihood estimator that takes average concentration values as the data inputs. The second is a Bayesian latent variable method that uses the counts of the number of positive tubes at each dilution to estimate the parameters of the contamination distribution. The performance of the two fitting methods is compared for two
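
    For background, the standard MPN likelihood treats each tube as a Bernoulli trial that is positive with probability 1 - exp(-c*v), where c is the concentration and v the inoculated volume. The sketch below maximizes this likelihood for a single invented dilution series; it illustrates the kind of estimate the fitting methods start from, not the paper's own code.

        # Maximum-likelihood MPN estimate for one dilution series (invented example data).
        import numpy as np
        from scipy.optimize import minimize_scalar

        volumes = np.array([0.1, 0.01, 0.001])   # mL inoculated per tube at each dilution
        n_tubes = np.array([3, 3, 3])            # tubes per dilution
        n_pos   = np.array([3, 2, 0])            # positive tubes observed

        def neg_log_lik(log_c):
            c = np.exp(log_c)                     # concentration (organisms/mL), kept positive
            p = 1.0 - np.exp(-c * volumes)        # P(tube positive) at each dilution
            p = np.clip(p, 1e-12, 1 - 1e-12)
            return -np.sum(n_pos * np.log(p) + (n_tubes - n_pos) * np.log(1.0 - p))

        res = minimize_scalar(neg_log_lik, bounds=(np.log(1e-2), np.log(1e6)), method="bounded")
        print(f"MPN estimate: {np.exp(res.x):.1f} organisms/mL")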

  4. Experience Matters: Information Acquisition Optimizes Probability Gain

    Science.gov (United States)

    Nelson, Jonathan D.; McKenzie, Craig R.M.; Cottrell, Garrison W.; Sejnowski, Terrence J.

    2010-01-01

    Deciding which piece of information to acquire or attend to is fundamental to perception, categorization, medical diagnosis, and scientific inference. Four statistical theories of the value of information—information gain, Kullback-Leibler distance, probability gain (error minimization), and impact—are equally consistent with extant data on human information acquisition. Three experiments, designed via computer optimization to be maximally informative, tested which of these theories best describes human information search. Experiment 1, which used natural sampling and experience-based learning to convey environmental probabilities, found that probability gain explained subjects’ information search better than the other statistical theories or the probability-of-certainty heuristic. Experiments 1 and 2 found that subjects behaved differently when the standard method of verbally presented summary statistics (rather than experience-based learning) was used to convey environmental probabilities. Experiment 3 found that subjects’ preference for probability gain is robust, suggesting that the other models contribute little to subjects’ search behavior. PMID:20525915
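
    For reference, probability gain values a query by how much the expected posterior probability of a correct categorization exceeds the prior probability of guessing correctly. The minimal sketch below simply implements that definition for a binary category and a binary feature with made-up probabilities; it is not the experimental optimization used in the study.

        # Probability gain of a binary query (made-up environmental probabilities).
        import numpy as np

        p_c = np.array([0.7, 0.3])              # prior P(category)
        p_f_given_c = np.array([[0.9, 0.1],     # P(feature value | category 0)
                                [0.2, 0.8]])    # P(feature value | category 1)

        p_f = p_c @ p_f_given_c                              # marginal P(feature value)
        p_c_given_f = (p_c[:, None] * p_f_given_c) / p_f     # posterior, columns indexed by feature value

        baseline_accuracy = p_c.max()                             # guess the most likely category a priori
        expected_accuracy = np.sum(p_f * p_c_given_f.max(axis=0)) # expected accuracy after seeing the feature
        print("probability gain:", expected_accuracy - baseline_accuracy)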

  5. "I'm afraid I have bad news for you…" Estimating the impact of different health impairments on subjective well-being.

    Science.gov (United States)

    Binder, Martin; Coad, Alex

    2013-06-01

    Bad health decreases individuals' happiness, but few studies measure the impact of specific illnesses. We apply matching estimators to examine how changes in different (objective) conditions of bad health affect subjective well-being for a sample of 100,265 observations from the British Household Panel Survey (BHPS) database (1996-2006). The strongest effect is for alcohol and drug abuse, followed by anxiety, depression and other mental illnesses, stroke and cancer. Adaptation to health impairments varies across health impairments. There is also a puzzling asymmetry: strong adverse reactions to deteriorations in health appear alongside weak increases in well-being after health improvements. In conclusion, our analysis offers a more detailed account of how bad health influences happiness than accounts focusing on how bad self-assessed health affects individual well-being. Copyright © 2013 Elsevier Ltd. All rights reserved.
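
    As a schematic of the matching idea (nearest-neighbour matching on covariates, then averaging outcome differences), the sketch below uses invented toy data; the covariates, outcomes and sample sizes are placeholders and none of the BHPS-specific implementation details are reproduced.

        # Nearest-neighbour matching sketch (invented toy data, not the BHPS analysis).
        import numpy as np

        rng = np.random.default_rng(0)
        n_treated, n_control = 20, 200

        # Covariates: age (years) and log income, standardized before distance computation.
        x_t = np.column_stack([rng.normal(45, 10, n_treated), rng.normal(10, 1, n_treated)])
        x_c = np.column_stack([rng.normal(40, 12, n_control), rng.normal(10, 1, n_control)])
        y_t = rng.normal(-1.0, 1.0, n_treated)    # change in life satisfaction, treated (health shock)
        y_c = rng.normal(0.0, 1.0, n_control)     # change in life satisfaction, controls

        mu, sd = x_c.mean(axis=0), x_c.std(axis=0)
        z_t, z_c = (x_t - mu) / sd, (x_c - mu) / sd

        # For each treated unit, find the nearest control in covariate space.
        d = np.linalg.norm(z_t[:, None, :] - z_c[None, :, :], axis=2)
        match = d.argmin(axis=1)

        att = np.mean(y_t - y_c[match])           # average treatment effect on the treated
        print(f"matched ATT estimate: {att:.2f}")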

  6. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  7. Frequentist probability and frequentist statistics

    Energy Technology Data Exchange (ETDEWEB)

    Neyman, J.

    1977-01-01

    A brief, nontechnical outline is given of the author's views on the "frequentist" theory of probability and the "frequentist" theory of statistics; their applications are illustrated in a few domains of study of nature. The phenomenon of apparently stable relative frequencies as the source of the frequentist theories of probability and statistics is taken up first. Three steps are set out: empirical establishment of apparently stable long-run relative frequencies of events judged interesting, as they develop in nature; guessing and then verifying the chance mechanism, the repeated operation of which produced the observed frequencies - this is a problem of frequentist probability theory; using the hypothetical chance mechanism of the phenomenon studied to deduce rules of adjusting our actions to the observations to ensure the highest "measure" of "success". Illustrations of the three steps are given. The theory of testing statistical hypotheses is sketched: basic concepts, simple and composite hypotheses, hypothesis tested, importance of the power of the test used, practical applications of the theory of testing statistical hypotheses. Basic ideas and an example of the randomization of experiments are discussed, and an "embarrassing" example is given. The problem of statistical estimation is sketched: example of an isolated problem, example of connected problems treated routinely, empirical Bayes theory, point estimation. The theory of confidence intervals is outlined: basic concepts, anticipated misunderstandings, construction of confidence intervals: regions of acceptance. Finally, the theory of estimation by confidence intervals or regions is considered briefly. 4 figures. (RWR)

  8. UT Biomedical Informatics Lab (BMIL) Probability Wheel.

    Science.gov (United States)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B; Sun, Clement; Fan, Kaili; Reece, Gregory P; Kim, Min Soon; Markey, Mia K

    2016-01-01

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant," about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.

  9. Weight limits, estimations of future BMI, subjective pubertal timing and physical appearance comparisons among adolescent girls as precursors of disturbed eating behaviour in a community sample.

    Science.gov (United States)

    Berger, Uwe; Weitkamp, Katharina; Strauss, Bernhard

    2009-03-01

    From a clinical point of view, a high 'objective' BMI or an early biological onset of puberty are well-known risk factors for eating disorders. In contrast, little is known about irrational beliefs and subjective meanings of body weight and pubertal timing. Mostly using standardised questionnaires, 136 girls with an average age of 12 years were asked to report their eating behaviour, (body) self-esteem, body dissatisfaction, weight limits, estimations of future BMI, subjective pubertal timing and appearance-related social comparisons. Results showed significant correlations between disturbed eating behaviour and the existence of a weight limit, which was reported by 45% of the girls. Twenty-two per cent wished to have a future BMI beneath the 10th percentile. In terms of pubertal timing, girls who perceived themselves as either 'early starters' or 'late starters' reported significantly more risky eating behaviour. Results are discussed with a focus on the psychotherapeutic use of our findings as well as the opportunity for the development of preventive strategies.

  10. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...

  11. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications....... The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models....

  12. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments. Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables. The Law of Large Numbers; Conditional Expectation; Generating Functions. Branching Processes. Random Walk Revisited; Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples. Probability Distributions of Markov Chains; The First Step Analysis. Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case. Simulation; Distribution F...

  13. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  14. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  15. This data set represents the estimated percentage of the 1-km grid cell that is covered by or subject to the agricultural conservation practice (CPIS05), Combination of Irrigation Sources (CIS) on agricultural land by county (nri_is05)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This data set represents the estimated percentage of the 1-km grid cell that is covered by or subject to the agricultural conservation practice (CPIS05), Combination...

  16. This data set represents the estimated percentage of the 1-km grid cell that is covered by or subject to the agricultural conservation practice (CPIT01), Gravity Irrigation Source (GI) on agricultural land by county (nri_it01)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This data set represents the estimated percentage of the 1-km grid cell that is covered by or subject to the agricultural conservation practice (CPIT01), Gravity...

  17. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrödinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general.

  18. Huygens' foundations of probability

    NARCIS (Netherlands)

    Freudenthal, Hans

    It is generally accepted that Huygens based probability on expectation. The term “expectation,” however, stems from Van Schooten's Latin translation of Huygens' treatise. A literal translation of Huygens' Dutch text shows more clearly what Huygens actually meant and how he proceeded.

  19. Univariate Probability Distributions

    Science.gov (United States)

    Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.

    2012-01-01

    We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…

  20. The Theory of Probability

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 3, Issue 4, April 1998, pp. 103-112. The Theory of Probability. Andrei Nikolaevich Kolmogorov. Classics. Permanent link: http://www.ias.ac.in/article/fulltext/reso/003/04/0103-0112

  1. Probability Theory Without Tears!

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 1, Issue 2, February 1996, pp. 115-116. Probability Theory Without Tears! S Ramasubramanian. Book Review. Permanent link: http://www.ias.ac.in/article/fulltext/reso/001/02/0115-0116 ...

  2. probably mostly white

    African Journals Online (AJOL)

    Willem Scholtz

    internet – the (probably mostly white) public's interest in the so-called Border War is ostensibly at an all-time high. By far most of the publications are written by ex- ... understanding of this very important episode in the history of Southern Africa. It was, therefore, with some anticipation that one waited for this book, which.

  3. the theory of probability

    Indian Academy of Sciences (India)

    important practical applications in statistical quality control. Of a similar kind are the laws of probability for the scattering of missiles, which are basic in the ..... deviations for different ranges for each type of gun and of shell are found empirically in firing practice on an artillery range. But the subsequent solution of all possible ...

  4. On Randomness and Probability

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 1, Issue 2. On Randomness and Probability: How to Mathematically Model Uncertain Events ... Rajeeva L Karandikar, Statistics and Mathematics Unit, Indian Statistical Institute, 7 S J S Sansanwal Marg, New Delhi 110 016, India.

  5. On Probability Domains

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2010-12-01

    Motivated by IF-probability theory (intuitionistic fuzzy), we study n-component probability domains in which each event represents a body of competing components and the range of a state represents a simplex S_n of n-tuples of possible rewards - the sum of the rewards is a number from [0,1]. For n=1 we get fuzzy events, for example a bold algebra, and the corresponding fuzzy probability theory can be developed within the category ID of D-posets (equivalently effect algebras) of fuzzy sets and sequentially continuous D-homomorphisms. For n=2 we get IF-events, i.e., pairs (μ, ν) of fuzzy sets μ, ν ∈ [0,1]^X such that μ(x) + ν(x) ≤ 1 for all x ∈ X, but we order our pairs (events) coordinatewise. Hence the structure of IF-events (where (μ_1, ν_1) ≤ (μ_2, ν_2) whenever μ_1 ≤ μ_2 and ν_2 ≤ ν_1) is different and, consequently, the resulting IF-probability theory models a different principle. The category ID is cogenerated by I = [0,1] (objects of ID are subobjects of powers I^X), has nice properties, and basic probabilistic notions and constructions are categorical. For example, states are morphisms. We introduce the category S_nD cogenerated by S_n = {(x_1, x_2, ..., x_n) ∈ I^n : x_1 + ... + x_n ≤ 1}, carrying the coordinatewise partial order, difference, and sequential convergence, and we show how basic probability notions can be defined within S_nD.
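
    The two orders mentioned above differ only in how the second coordinate is compared. The tiny sketch below, using made-up membership values on a two-point set X, contrasts the IF-event order (ν compared in reverse) with the coordinatewise order used for pairs in S_2.

        # Contrast the IF-event order with the coordinatewise S_2 order (toy fuzzy sets on a finite X).
        import numpy as np

        mu1, nu1 = np.array([0.2, 0.4]), np.array([0.5, 0.5])   # event (mu1, nu1); mu + nu <= 1 pointwise
        mu2, nu2 = np.array([0.3, 0.6]), np.array([0.4, 0.3])   # event (mu2, nu2)

        if_order    = np.all(mu1 <= mu2) and np.all(nu2 <= nu1)  # IF-event order: second coordinate reversed
        coord_order = np.all(mu1 <= mu2) and np.all(nu1 <= nu2)  # coordinatewise order used in S_2

        print("IF-event order:  (mu1,nu1) <= (mu2,nu2)?", if_order)     # True
        print("coordinatewise:  (mu1,nu1) <= (mu2,nu2)?", coord_order)  # False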

  6. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  7. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  8. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however, the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  9. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    reveals the computational intuition lying behind the mathematics. In the second part of the thesis we provide an operational reading of continuous valuations on certain domains (the distributive concrete domains of Kahn and Plotkin) through the model of probabilistic event structures. Event structures......Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particular...... there is no categorical distributive law between them. We introduce the powerdomain of indexed valuations which modifies the usual probabilistic powerdomain to take more detailed account of where probabilistic choices are made. We show the existence of a distributive law between the powerdomain of indexed valuations...

  10. Waste Package Misload Probability

    Energy Technology Data Exchange (ETDEWEB)

    J.K. Knudsen

    2001-11-20

    The objective of this calculation is to calculate the probability of occurrence for fuel assembly (FA) misloads (i.e., FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using the information, a probability of occurrence will be calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.

  11. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  12. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, in general, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  13. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  14. Superpositions of probability distributions.

    Science.gov (United States)

    Jizba, Petr; Kleinert, Hagen

    2008-09-01

    Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much more easily than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
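
    To illustrate the idea of superposing Gaussians of different variances v, the sketch below draws from a Student-t distribution by mixing a zero-mean Gaussian over an inverse-gamma distributed variance. This is a generic scale-mixture construction for illustration, not the specific semigroup-preserving smearing derived in the paper.

        # Student-t as a superposition of Gaussians over variance v (generic scale mixture, illustrative).
        import numpy as np

        rng = np.random.default_rng(1)
        nu, n = 4.0, 100_000                       # degrees of freedom, sample size

        v = 1.0 / rng.gamma(shape=nu / 2.0, scale=2.0 / nu, size=n)   # v ~ Inv-Gamma(nu/2, nu/2)
        x = rng.normal(0.0, np.sqrt(v))                                # x | v ~ N(0, v), so x ~ t_nu

        # The mixture has heavier tails than any single Gaussian component; compare tail mass
        # against a plain unit Gaussian (for which P(|x| > 3) is roughly 0.0027).
        print("P(|x| > 3):", np.mean(np.abs(x) > 3.0))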

  15. SU-F-I-36: In-Utero Dose Measurements Within Postmortem Subjects for Estimating Fetal Doses in Pregnant Patients Examined with Pulmonary Embolism, Trauma, and Appendicitis CT

    Energy Technology Data Exchange (ETDEWEB)

    Lipnharski, I; Quails, N; Carranza, C; Correa, N; Bidari, S; Bickelhaup, M; Rill, L; Arreola, M [University of Florida, Gainesville, FL (United States)

    2016-06-15

    Purpose: The imaging of pregnant patients is medically necessary in certain clinical situations. The purpose of this work was to directly measure uterine doses in a cadaver scanned with CT protocols commonly performed on pregnant patients in order to estimate fetal dose and assess potential risk. Method: One postmortem subject was scanned on a 320-slice CT scanner with standard pulmonary embolism, trauma, and appendicitis protocols. All protocols were performed with the scan parameters and ranges currently used in clinical practice. Exams were performed both with and without iterative reconstruction to highlight the dose savings potential. Optically stimulated luminescent dosimeters (OSLDs) were inserted into the uterus in order to approximate fetal doses. Results: In the pulmonary embolism CT protocol, the uterus is outside of the primary beam, and the dose to the uterus was under 1 mGy. In the trauma and appendicitis protocols, the uterus is in the primary beam; the fetal dose estimates were 30.5 mGy for the trauma protocol and 20.6 mGy for the appendicitis protocol. Iterative reconstruction reduced fetal doses by 30%, with uterine doses of 21.3 mGy for the trauma protocol and 14.3 mGy for the appendicitis protocol. Conclusion: Fetal doses were under 1 mGy when exposed to scatter radiation, and under 50 mGy when exposed to primary radiation with the trauma and appendicitis protocols. Consistent with the National Council on Radiation Protection & Measurements (NCRP) and the International Commission on Radiological Protection (ICRP), these doses exhibit a negligible risk to the fetus, with only a small increased risk of cancer. Still, CT scans are not recommended during pregnancy unless the benefits of the exam clearly outweigh the potential risk. Furthermore, when possible, pregnant patients should be examined on CT scanners equipped with iterative reconstruction in order to keep patient doses as low as reasonably achievable.

  16. Measurement uncertainty and probability

    National Research Council Canada - National Science Library

    Willink, Robin

    2013-01-01

    ... and probability models; 3.4 Inference and confidence; 3.5 Two central limit theorems; 3.6 The Monte Carlo method and process simulation; 4 The randomization of systematic errors; The Working Group of 1980; From classical repetition to practica...

  17. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Full Text Available Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic as a model within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
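
    A minimal sketch of the modelling step described above: a logistic regression of fire occurrence on building attributes. The feature names, simulated data and coefficients are invented stand-ins for the study's building parameters and incident records.

        # Logistic-regression sketch for building-fire probability (invented features and data).
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(42)
        n = 500
        floor_area = rng.uniform(50, 2000, n)          # m^2 (hypothetical attribute)
        age        = rng.uniform(0, 120, n)            # years (hypothetical attribute)
        wooden     = rng.integers(0, 2, n)             # 1 = wooden construction (hypothetical)

        # Simulate incidents with a known relationship so the fit has something to recover.
        logit = -6.0 + 0.002 * floor_area + 0.02 * age + 1.0 * wooden
        fire = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

        X = np.column_stack([floor_area, age, wooden])
        model = LogisticRegression(max_iter=1000).fit(X, fire)

        # Predicted fire probability for one hypothetical building.
        new_building = np.array([[800.0, 60.0, 1]])
        print("estimated fire probability:", model.predict_proba(new_building)[0, 1])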

  18. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  19. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples from the most general, scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management. Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although "risk" generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and respective tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability." Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation. A probability concept is typically applied by risk managers as a combination of data-based measures of probability and a subjective "degree of belief" meaning of probability. Probability as

  20. Estimation of spatial-temporal gait parameters in level walking based on a single accelerometer: validation on normal subjects by standard gait analysis.

    Science.gov (United States)

    Bugané, F; Benedetti, M G; Casadio, G; Attala, S; Biagi, F; Manca, M; Leardini, A

    2012-10-01

    This paper investigates the ability of a single wireless inertial sensing device stuck on the lower trunk to provide spatial-temporal parameters during level walking. The 3-axial acceleration signals were filtered and the timing of the main gait events identified. Twenty-two healthy subjects were analyzed with this system for validation, and the estimated parameters were compared with those obtained with state-of-the-art gait analysis, i.e. stereophotogrammetry and dynamometry. For each side, from four to six gait cycles were measured with the device, of which two were validated by gait analysis. The new acquisition system is easy to use and does not interfere with regular walking. No statistically significant differences were found between the acceleration-based measurements and the corresponding ones from gait analysis for most of the spatial-temporal parameters, i.e. stride length, stride duration, cadence and speed, etc.; significant differences were found for the gait cycle phases, i.e. single and double support duration, etc. The system therefore shows promise also for a future routine clinical use. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
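
    As a rough illustration of extracting temporal gait parameters from a trunk acceleration signal, the sketch below estimates cadence and mean stride duration from step peaks in a synthetic vertical-acceleration trace. This is a generic peak-detection approach for illustration, not the authors' validated processing pipeline.

        # Cadence and stride duration from a synthetic trunk acceleration trace (illustrative only).
        import numpy as np
        from scipy.signal import find_peaks

        fs = 100.0                                        # Hz
        t = np.arange(0, 20, 1 / fs)                      # 20 s of walking
        step_freq = 1.8                                   # steps per second (~108 steps/min)
        acc_v = 9.81 + 1.5 * np.sin(2 * np.pi * step_freq * t) + 0.1 * np.random.randn(t.size)

        # Each positive peak is taken as one step; two steps make one stride.
        peaks, _ = find_peaks(acc_v, height=9.81, distance=0.4 * fs)
        step_times = peaks / fs

        cadence = 60.0 * (len(step_times) - 1) / (step_times[-1] - step_times[0])   # steps/min
        stride_duration = 2.0 * np.mean(np.diff(step_times))                        # s (two steps per stride)
        print(f"cadence ~ {cadence:.0f} steps/min, stride duration ~ {stride_duration:.2f} s")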