WorldWideScience

Sample records for kaplan-meier estimated probability

  1. Understanding survival analysis: Kaplan-Meier estimate

    Science.gov (United States)

    Goel, Manish Kumar; Khanna, Pardeep; Kishore, Jugal

    2010-01-01

    The Kaplan-Meier estimate is one of the best options for measuring the fraction of subjects who remain alive for a certain amount of time after treatment. In clinical trials or community trials, the effect of an intervention is assessed by measuring the number of subjects who survived or were saved by that intervention over a period of time. The time from a defined starting point to the occurrence of a given event, for example death, is called the survival time, and the analysis of such group data is called survival analysis. The analysis can be complicated when subjects are uncooperative and refuse to remain in the study, when some subjects do not experience the event or death before the end of the study although they would have if observation had continued, or when we lose touch with subjects midway through the study. We label these situations censored observations. The Kaplan-Meier estimate is the simplest way of computing survival over time in spite of these difficulties associated with subjects or situations. The survival curve can be created assuming various situations. It involves computing the probability of the event occurring at a certain point in time and multiplying these successive probabilities by any earlier computed probabilities to get the final estimate. This can be calculated for two groups of subjects, together with the statistical significance of the difference in their survival. The method can be used in Ayurveda research when two drugs are compared and the survival of subjects is of interest. PMID:21455458
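
    As an illustration of the product-limit computation described above (successive conditional survival probabilities multiplied together), the following is a minimal sketch of a Kaplan-Meier estimator in Python with NumPy. The data and function name are hypothetical; in practice a dedicated package (for example lifelines in Python or survival in R) would normally be used.

    ```python
    import numpy as np

    def kaplan_meier(times, events):
        """Product-limit estimate of S(t) from possibly right-censored data.

        times  : follow-up times
        events : 1 if the event (e.g. death) was observed, 0 if censored
        Returns the distinct event times and the estimated survival probabilities.
        """
        times = np.asarray(times, dtype=float)
        events = np.asarray(events, dtype=int)

        event_times = np.unique(times[events == 1])
        surv, s = [], 1.0
        for t in event_times:
            at_risk = np.sum(times >= t)                    # subjects still under observation at t
            deaths = np.sum((times == t) & (events == 1))
            s *= 1.0 - deaths / at_risk                     # multiply successive conditional probabilities
            surv.append(s)
        return event_times, np.array(surv)

    # Hypothetical follow-up times (months) and event indicators (0 = censored)
    t = [6, 7, 10, 15, 19, 25, 30, 34, 40, 46]
    d = [1, 0, 1, 1, 0, 1, 0, 1, 0, 1]
    for ti, si in zip(*kaplan_meier(t, d)):
        print(f"S({ti:.0f}) = {si:.3f}")
    ```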

  2. A comparison between Kaplan-Meier and weighted Kaplan-Meier methods of five-year survival estimation of patients with gastric cancer.

    Science.gov (United States)

    Zare, Ali; Mahmoodi, Mahmood; Mohammad, Kazem; Zeraati, Hojjat; Hosseini, Mostafa; Holakouie Naieni, Kourosh

    2014-01-01

    The 5-year survival rate is a good prognostic indicator for patients with gastric cancer and is usually estimated with the Kaplan-Meier method. In situations where there are many censored observations, this method produces biased estimates. This study aimed to compare estimates from the Kaplan-Meier method and the weighted Kaplan-Meier method, an alternative designed to deal with heavy censoring. Data from 330 patients with gastric cancer who had undergone surgery at the Iran Cancer Institute from 1995 to 1999 were analyzed. The survival time of these patients was determined after surgery, and the 5-year survival rate was evaluated with both the Kaplan-Meier and the weighted Kaplan-Meier method. A total of 239 (72.4%) patients had died by the end of the study and 91 (27.6%) were censored. The mean and median survival times were 24.86±23.73 and 16.33 months, respectively. The one-, two-, three-, four-, and five-year survival rates, with standard errors, estimated by the Kaplan-Meier method were 0.66 (0.0264), 0.42 (0.0284), 0.31 (0.0274), 0.26 (0.0264) and 0.21 (0.0256), respectively; the corresponding weighted Kaplan-Meier estimates were 0.62 (0.0251), 0.35 (0.0237), 0.24 (0.0211), 0.17 (0.0172), and 0.10 (0.0125). When the assumption about censoring does not hold and the study has many censored observations, estimates obtained from the Kaplan-Meier method are biased upward. The weighted Kaplan-Meier method reduces this bias by applying appropriate weights to the survival probabilities and gives a more accurate picture.

  3. A comparison between Kaplan-Meier and weighted Kaplan-Meier methods of five-year survival estimation of patients with gastric cancer.

    Directory of Open Access Journals (Sweden)

    Ali Zare

    2014-10-01

    The 5-year survival rate is a good prognostic indicator for patients with gastric cancer and is usually estimated with the Kaplan-Meier method. In situations where there are many censored observations, this method produces biased estimates. This study aimed to compare estimates from the Kaplan-Meier method and the weighted Kaplan-Meier method, an alternative designed to deal with heavy censoring. Data from 330 patients with gastric cancer who had undergone surgery at the Iran Cancer Institute from 1995 to 1999 were analyzed. The survival time of these patients was determined after surgery, and the 5-year survival rate was evaluated with both the Kaplan-Meier and the weighted Kaplan-Meier method. A total of 239 (72.4%) patients had died by the end of the study and 91 (27.6%) were censored. The mean and median survival times were 24.86±23.73 and 16.33 months, respectively. The one-, two-, three-, four-, and five-year survival rates, with standard errors, estimated by the Kaplan-Meier method were 0.66 (0.0264), 0.42 (0.0284), 0.31 (0.0274), 0.26 (0.0264) and 0.21 (0.0256), respectively; the corresponding weighted Kaplan-Meier estimates were 0.62 (0.0251), 0.35 (0.0237), 0.24 (0.0211), 0.17 (0.0172), and 0.10 (0.0125). When the assumption about censoring does not hold and the study has many censored observations, estimates obtained from the Kaplan-Meier method are biased upward. The weighted Kaplan-Meier method reduces this bias by applying appropriate weights to the survival probabilities and gives a more accurate picture.

  4. On an exponential bound for the Kaplan-Meier estimator.

    Science.gov (United States)

    Wellner, Jon A

    2007-12-01

    We review limit theory and inequalities for the Kaplan-Meier (Kaplan and Meier, J Am Stat Assoc 53:457-481, 1958) product limit estimator of a survival function on the whole real line ℝ. Along the way we provide bounds for the constant in an interesting inequality due to Bitouzé et al. (Ann Inst H Poincaré Probab Stat 35:735-763, 1999), and provide some numerical evidence in support of one of their conjectures.

  5. A Berry-Esseen Inequality for the Kaplan-Meier L-Estimator

    Institute of Scientific and Technical Information of China (English)

    Qi Hua WANG; Li Xing ZHU

    2001-01-01

    Let Fn be the Kaplan-Meier estimator of the distribution function F. Let J(.) be a measurable real-valued function. In this paper, a U-statistic representation for the Kaplan-Meier L-estimator, T(Fn) = ∫ x J(Fn(x)) dFn(x), is derived. Furthermore, the representation is also used to establish a Berry-Esseen inequality for T(Fn).

  6. A practical divergence measure for survival distributions that can be estimated from Kaplan-Meier curves.

    Science.gov (United States)

    Cox, Trevor F; Czanner, Gabriela

    2016-06-30

    This paper introduces a new simple divergence measure between two survival distributions. For two groups of patients, the divergence measure between their associated survival distributions is based on the integral of the absolute difference in probabilities that a patient from one group dies at time t and a patient from the other group survives beyond time t and vice versa. In the case of non-crossing hazard functions, the divergence measure is closely linked to the Harrell concordance index, C, the Mann-Whitney test statistic and the area under a receiver operating characteristic curve. The measure can be used in a dynamic way where the divergence between two survival distributions from time zero up to time t is calculated enabling real-time monitoring of treatment differences. The divergence can be found for theoretical survival distributions or can be estimated non-parametrically from survival data using Kaplan-Meier estimates of the survivor functions. The estimator of the divergence is shown to be generally unbiased and approximately normally distributed. For the case of proportional hazards, the constituent parts of the divergence measure can be used to assess the proportional hazards assumption. The use of the divergence measure is illustrated on the survival of pancreatic cancer patients. Copyright © 2016 John Wiley & Sons, Ltd.
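
    The following is a rough numerical sketch of how such a divergence could be estimated from two Kaplan-Meier curves, under one plausible reading of the description above (the probability mass of dying in a small interval in one group weighted by the probability of surviving beyond that time in the other group, and vice versa). The exact definition and weighting are those given in the paper, so this should be read as an illustration of the idea rather than the authors' implementation; the data and helper names are hypothetical.

    ```python
    import numpy as np

    def km_curve(times, events):
        """Kaplan-Meier survival estimate, returned as event times and S(t) just after each."""
        times, events = np.asarray(times, float), np.asarray(events, int)
        ts = np.unique(times[events == 1])
        s, surv = 1.0, []
        for t in ts:
            at_risk = np.sum(times >= t)
            deaths = np.sum((times == t) & (events == 1))
            s *= 1.0 - deaths / at_risk
            surv.append(s)
        return ts, np.array(surv)

    def survival_at(ts, surv, t):
        """Evaluate the right-continuous KM step function at time t."""
        idx = np.searchsorted(ts, t, side="right") - 1
        return 1.0 if idx < 0 else surv[idx]

    def divergence(t1, e1, t2, e2, horizon, n_grid=2000):
        """Accumulate |dF1(t) * S2(t) - dF2(t) * S1(t)| over a fine grid up to `horizon`."""
        ts1, s1 = km_curve(t1, e1)
        ts2, s2 = km_curve(t2, e2)
        grid = np.linspace(0.0, horizon, n_grid + 1)
        total = 0.0
        for a, b in zip(grid[:-1], grid[1:]):
            d1 = survival_at(ts1, s1, a) - survival_at(ts1, s1, b)   # death mass of group 1 in (a, b]
            d2 = survival_at(ts2, s2, a) - survival_at(ts2, s2, b)   # death mass of group 2 in (a, b]
            total += abs(d1 * survival_at(ts2, s2, a) - d2 * survival_at(ts1, s1, a))
        return total

    # Hypothetical survival data (months) for two treatment groups
    g1_t, g1_e = [3, 6, 8, 12, 15, 20, 24, 30], [1, 1, 0, 1, 1, 0, 1, 0]
    g2_t, g2_e = [5, 9, 11, 14, 18, 22, 27, 33], [1, 0, 1, 1, 0, 1, 1, 0]
    print(round(divergence(g1_t, g1_e, g2_t, g2_e, horizon=24.0), 4))
    ```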

  7. Factors Determining Disease Duration in Alzheimer’s Disease: A Postmortem Study of 103 Cases Using the Kaplan-Meier Estimator and Cox Regression

    Directory of Open Access Journals (Sweden)

    R. A. Armstrong

    2014-01-01

    Factors associated with duration of dementia in a consecutive series of 103 Alzheimer’s disease (AD) cases were studied using the Kaplan-Meier estimator and Cox regression analysis (proportional hazard model). Mean disease duration was 7.1 years (range: 6 weeks–30 years, standard deviation = 5.18); 25% of cases died within four years, 50% within 6.9 years, and 75% within 10 years. Familial AD cases (FAD) had a longer duration than sporadic cases (SAD), especially cases linked to presenilin (PSEN) genes. No significant differences in duration were associated with age, sex, or apolipoprotein E (Apo E) genotype. Duration was reduced in cases with arterial hypertension. Cox regression analysis suggested longer duration was associated with an earlier disease onset and increased senile plaque (SP) and neurofibrillary tangle (NFT) pathology in the orbital gyrus (OrG), CA1 sector of the hippocampus, and nucleus basalis of Meynert (NBM). The data suggest shorter disease duration in SAD and in cases with hypertensive comorbidity. In addition, degree of neuropathology did not influence survival, but spread of SP/NFT pathology into the frontal lobe, hippocampus, and basal forebrain was associated with longer disease duration.

  8. Quantitative estimation of the stability of methicillin-resistant Staphylococcus aureus strain-typing systems by use of Kaplan-Meier survival analysis.

    Science.gov (United States)

    O'Sullivan, Matthew V N; Sintchenko, Vitali; Gilbert, Gwendolyn L

    2013-01-01

    Knowledge concerning stability is important in the development and assessment of microbial molecular typing systems and is critical for the interpretation of their results. Typing system stability is usually measured as the fraction of isolates that change type after several in vitro passages, but this does not necessarily reflect in vivo stability. The aim of this study was to utilize survival analysis to provide an informative quantitative measure of in vivo stability and to compare the stabilities of various techniques employed in typing methicillin-resistant Staphylococcus aureus (MRSA). We identified 100 MRSA pairs (isolated from the same patient ≥ 1 month apart) and typed them using multilocus sequence typing (MLST), phage-derived open reading frame (PDORF) typing, toxin gene profiling (TGP), staphylococcal cassette chromosome mec (SCCmec) subtyping, pulsed-field gel electrophoresis (PFGE), and spa sequence typing. Discordant isolate pairs, belonging to different MLST clonal complexes, were excluded, leaving 81 pairs for analysis. The stabilities of these methods were examined using Kaplan-Meier survival analysis, and discriminatory power was measured by Simpson's index of diversity. The probabilities (%) that the type remained unchanged at 6 months for spa sequence typing, TGP, multilocus variable number of tandem repeats analysis (MLVA), SCCmec subtyping, PDORF typing, and PFGE were 95, 95, 88, 82, 71, and 58, respectively, while the Simpson's indices of diversity were 0.48, 0.47, 0.70, 0.72, 0.89, and 0.88, respectively. Survival analysis using sequential clinical isolates adds an important quantitative dimension to the measurement of stability of a microbial typing system. Of the methods compared here, PDORF typing provides high discriminatory power, comparable with that of PFGE, and a level of stability suitable for MRSA surveillance and outbreak investigations.

  9. THE LAW OF THE ITERATED LOGARITHM OF THE KAPLAN-MEIER INTEGRAL AND ITS APPLICATION

    Institute of Scientific and Technical Information of China (English)

    HE SHUYUAN; WANG YANHUA

    2004-01-01

    For right censored data, the law of the iterated logarithm of the Kaplan-Meier integral is established. As an application, the authors prove the law of the iterated logarithm for weighted least square estimates of randomly censored linear regression model.

  10. Total Ankle Replacement Survival Rates Based on Kaplan-Meier Survival Analysis of National Joint Registry Data.

    Science.gov (United States)

    Bartel, Annette F P; Roukis, Thomas S

    2015-10-01

    National joint registry data provide unique information about primary total ankle replacement (TAR) survival. We sought to recreate survival curves from published national joint registry data sets using the Kaplan-Meier estimator. Overall, 5152 primary TARs and 591 TAR revisions were included over a 2- to 13-year period, with prosthesis survival across all national joint registries of 0.94 at 2 years, 0.87 at 5 years and 0.81 at 10 years. National joint registry datasets should strive for complete data presentation, including revision definitions, modes and times of failure, and patients lost to follow-up or deceased, for complete accuracy of the Kaplan-Meier estimator.

  11. Application of Kaplan-Meier analysis in reliability evaluation of products cast from aluminium alloys

    Directory of Open Access Journals (Sweden)

    J. Szymszal

    2010-04-01

    The article evaluates the reliability of AlSi17CuNiMg alloys using a Kaplan-Meier-based technique, very popular as a survival estimation tool in medical science. The main object of survival analysis is a group (or groups) of units for which the time to occurrence of an event (failure), taking place after some period of waiting, is estimated. For example, in medicine, the failure can be a patient’s death. In this study, the failure was specimen fracture during a periodical fatigue test, while the survival time was either the test duration to specimen failure (complete observations) or the time at which the test ended (censored observations). The parameters of the theoretical survival function were estimated with procedures based on the method of least squares, while the typical survival time distribution followed either an exponential or a two-parameter Weibull distribution. The goodness of fit of a model survival function was assessed with an incremental chi-square test based on the values of the log likelihood ratio. The effect of alloy processing history on the shape of the survival function was examined. The factors defining the alloy processing history included: mould type (sand or metal mould), alloy modification process, and heat treatment type (solution heat treatment and ageing).
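
    A common way to carry out the least-squares fit described above is to linearize the two-parameter Weibull survivor function, S(t) = exp(-(t/λ)^k), so that ln(-ln S(t)) is linear in ln t, and regress the estimated survival probabilities on that scale. The sketch below does this for hypothetical, fully observed fatigue lives, using median ranks in place of the Kaplan-Meier estimate (to which they reduce when there is no censoring); it illustrates the general technique only, not the authors' exact procedure.

    ```python
    import numpy as np

    # Hypothetical fatigue lives (thousand cycles), all failures observed
    lives = np.sort(np.array([112., 135., 150., 168., 177., 190., 205., 222., 240., 261.]))
    n = len(lives)

    # Bernard's median-rank approximation of the survival probability at each ordered failure
    surv = 1.0 - (np.arange(1, n + 1) - 0.3) / (n + 0.4)

    # Linearize: ln(-ln S(t)) = k * ln(t) - k * ln(lambda)
    x = np.log(lives)
    y = np.log(-np.log(surv))
    k, intercept = np.polyfit(x, y, 1)      # least-squares straight line: slope and intercept
    lam = np.exp(-intercept / k)

    print(f"Weibull shape k ≈ {k:.2f}, scale lambda ≈ {lam:.1f} thousand cycles")
    ```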

  12. Kaplan-Meier analysis on seizure outcome after epilepsy surgery: do gender and race influence it?

    Science.gov (United States)

    Burneo, Jorge G; Villanueva, Vicente; Knowlton, Robert C; Faught, R Edward; Kuzniecky, Ruben I

    2008-06-01

    To evaluate seizure outcome following epilepsy surgery for patients with temporal lobe epilepsy and to evaluate whether gender and race/ethnicity influence it. Data were obtained from the discharge database of the University of Alabama at Birmingham Epilepsy Center, between 1985 and 2001. The sample consisted of all patients with a primary diagnosis of medically intractable temporal lobe epilepsy (TLE) who underwent anterior temporal lobectomy. Seizure recurrence was tabulated at 7 days, 2 months, 6 months, 1, 2, 3, 4, 5, and 6 years following surgery. Logistic regression analysis was used to model the presence of seizure recurrence after anterior temporal lobectomy for all patients. Kaplan-Meier analysis was done to obtain estimates and 95% CIs of seizure freedom from baseline. Baseline variables--age at surgery, age at seizure onset, sex, side of resection, immediate postoperative seizures, and pathology results--were assessed as potential predictors of each outcome by comparing the survival curves within each variable with a log rank test. Three hundred sixty-eight patients underwent surgical treatment for TLE, with a mean age of 30.2 years. Thirty-five patients were African American, and 43% were men. Immediate postoperative seizures were seen in 23 patients, while seizure recurrence occurred in 27.3% of patients within a year after surgery, and in 33.6% within 6 years. Logistic regression results showed no differences between African Americans and whites, or between males and females. The occurrence of immediate postoperative seizures was a strong predictor of late seizure recurrence only at 1 year after surgery. The occurrence of seizures in the immediate postoperative period is a strong predictor of later seizure recurrence. Sex and race/ethnicity do not appear to be predictors of long-term outcome following surgery for temporal lobe epilepsy.
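
    The curve comparison described above (Kaplan-Meier estimates of seizure freedom compared with a log-rank test) can be reproduced with standard survival software. A minimal sketch assuming the Python lifelines package; the groups, times and event indicators below are hypothetical:

    ```python
    import numpy as np
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    # Hypothetical time-to-seizure-recurrence data (months); event = 1 means recurrence observed
    time_a  = np.array([2, 6, 12, 24, 36, 48, 60, 72])    # e.g. patients with immediate postoperative seizures
    event_a = np.array([1, 1, 1, 0, 1, 0, 1, 0])
    time_b  = np.array([6, 12, 24, 36, 48, 60, 72, 72])   # e.g. patients without
    event_b = np.array([1, 0, 0, 1, 0, 0, 1, 0])

    kmf = KaplanMeierFitter()
    kmf.fit(time_a, event_a, label="immediate postoperative seizures")
    print(kmf.survival_function_)            # estimated seizure-freedom curve for group A

    result = logrank_test(time_a, time_b, event_observed_A=event_a, event_observed_B=event_b)
    print(f"log-rank p-value: {result.p_value:.3f}")
    ```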

  13. Hazard Rate Estimation for Censored Data via Strong Representation of the Kaplan-Meier Estimator.

    Science.gov (United States)

    1985-08-01

    The kernel k is assumed to be of bounded variation (condition (k4)). The approximating process has mean zero, with covariance r(s,t) = E[C(s)C(t)]. Suppose F is continuous with density f(x) > 0 at x, and that k is of bounded variation and continuous; then f_n(x) admits the strong approximation on the interval [0,T].

  14. Gastric emptying of solids in humans: improved evaluation by Kaplan-Meier plots, with special reference to obesity and gender

    Energy Technology Data Exchange (ETDEWEB)

    Grybäck, P. [Department of Diagnostic Radiology, Karolinska Hospital, Stockholm (Sweden)]; Näslund, E. [Department of Surgery, Karolinska Institute at Danderyd Hospital, Stockholm (Sweden)]; Hellström, P.M. [Department of Internal Medicine, Karolinska Hospital, Stockholm (Sweden)]; Jacobsson, H. [Department of Diagnostic Radiology, Karolinska Hospital, Stockholm (Sweden); Department of Nuclear Medicine, Karolinska Hospital, Stockholm (Sweden)]; Backman, L. [Department of Surgery, Karolinska Institute at Danderyd Hospital, Stockholm (Sweden)]

    1996-12-01

    It has been suggested that obesity is associated with an altered rate of gastric emptying, and that there are also sex differences in gastric emptying. The results of earlier studies examining gastric emptying rates in obesity and in males and females have proved inconsistent. The aim of this study was to investigate the influence of obesity and gender on gastric emptying, by extending conventional evaluation methods with Kaplan-Meier plots, in order to assess whether these factors have to be accounted for when interpreting results of scintigraphic gastric emptying tests. Twenty-one normal-weight volunteers and nine obese subjects were fed a standardised technetium-99m labelled albumin omelette. Imaging data were acquired at 5- and 10-min intervals in both posterior and anterior projections with the subjects in the sitting position. The half-emptying time, analysed by Kaplan-Meier plots (log-rank test), was shorter in obese subjects than in normal-weight subjects and longer in females than in males. Also, the lag-phase and half-emptying time were shorter in obese females than in normal females. This study shows an association between different gastric emptying rates and obesity and gender. Therefore, body mass index and gender have to be accounted for when interpreting results of scintigraphic gastric emptying studies. (orig.). With 6 figs., 4 tabs.

  15. KMWin--a convenient tool for graphical presentation of results from Kaplan-Meier survival time analysis.

    Directory of Open Access Journals (Sweden)

    Arnd Gross

    BACKGROUND: Analysis of clinical studies often necessitates multiple graphical representations of the results. Many professional software packages are available for this purpose. Most packages are either only commercially available or hard to use, especially if one aims to generate or customize a large number of similar graphical outputs. We developed a new, freely available software tool called KMWin (Kaplan-Meier for Windows) facilitating Kaplan-Meier survival time analysis. KMWin is based on the statistical software environment R and provides an easy-to-use graphical interface. Survival time data can be supplied as an SPSS (sav), SAS export (xpt) or text file (dat), the last of which is also a common export format of other applications such as Excel. Figures can be exported directly in any graphical file format supported by R. RESULTS: On the basis of a working example, we demonstrate how to use KMWin and present its main functions. We show how to control the interface, customize the graphical output, and analyse survival time data. A number of comparisons are performed between KMWin and SPSS regarding graphical output, statistical output, data management and development. Although the general functionality of SPSS is larger, KMWin comprises a number of features useful for survival time analysis in clinical trials and other applications, for example the display of the number of cases and the number of cases at risk within the figure, and a queue system for repetitive analyses of updated data sets. Moreover, major adjustments of graphical settings can be performed easily in a single window. CONCLUSIONS: We conclude that our tool is well suited and convenient for repetitive analyses of survival time data. It can be used by non-statisticians and provides frequently used functions as well as functions that are not supplied by standard software packages. The software is routinely applied in several clinical study groups.

  16. Days of Successive Rises and Falls of the Shanghai Stock Index Based on the Kaplan-Meier Algorithm

    Institute of Scientific and Technical Information of China (English)

    毕建欣

    2011-01-01

    The numbers of days of successive rises and successive falls of the Shanghai Stock Index are analyzed using the Kaplan-Meier algorithm. The effects of different market trading rules (namely T+0, T+1 and the daily price-limit system) on the number of days of successive rises and falls are examined. The results show that the Kaplan-Meier algorithm is effective for analyzing movements of the stock market.

  17. A review and comparison of methods for recreating individual patient data from published Kaplan-Meier survival curves for economic evaluations: a simulation study.

    Directory of Open Access Journals (Sweden)

    Xiaomin Wan

    In general, the individual patient-level data (IPD) collected in clinical trials are not available to independent researchers to conduct economic evaluations; researchers only have access to published survival curves and summary statistics. Thus, methods that use published survival curves and summary statistics to reproduce statistics for economic evaluations are essential. Four methods have been identified: two traditional methods, (1) the least squares method and (2) the graphical method; and two recently proposed methods, by (3) Hoyle and Henley and (4) Guyot et al. The four methods were first individually reviewed and subsequently assessed regarding their abilities to estimate mean survival through a simulation study. A number of different scenarios were developed that comprised combinations of various sample sizes, censoring rates and parametric survival distributions. One thousand simulated survival datasets were generated for each scenario, and all methods were also applied to actual IPD. The uncertainty in the estimate of mean survival time was also captured. All methods provided accurate estimates of the mean survival time when the sample size was 500 and a Weibull distribution was used. When the sample size was 100 and the Weibull distribution was used, the Guyot et al. method was almost as accurate as the Hoyle and Henley method; however, more bias was identified in the traditional methods. When a lognormal distribution was used, the Guyot et al. method generated noticeably less bias and a more accurate uncertainty estimate compared with the Hoyle and Henley method. The traditional methods should not be preferred because of their remarkable overestimation. When the Weibull distribution was used for the fitted model, the Guyot et al. method was almost as accurate as the Hoyle and Henley method. However, if the lognormal distribution was used, the Guyot et al. method was less biased compared with the Hoyle and Henley method.
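
    Of the four approaches, the traditional least squares method is the simplest to illustrate: a parametric survivor function is fitted directly to points digitized from a published Kaplan-Meier curve and then integrated to give mean survival. The sketch below, with hypothetical digitized points and a Weibull model, shows the general idea only; the Hoyle and Henley and Guyot et al. algorithms reconstruct pseudo individual patient data and are considerably more involved.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.special import gamma

    # Hypothetical points read off a published Kaplan-Meier curve: time (months) and S(t)
    t_pts = np.array([0., 6., 12., 18., 24., 36., 48., 60.])
    s_pts = np.array([1.00, 0.82, 0.66, 0.55, 0.47, 0.35, 0.28, 0.23])

    def weibull_surv(t, scale, shape):
        """Two-parameter Weibull survivor function."""
        return np.exp(-(t / scale) ** shape)

    # Least-squares fit of the parametric curve to the digitized points
    (scale, shape), _ = curve_fit(weibull_surv, t_pts, s_pts, p0=(24.0, 1.0),
                                  bounds=([1e-3, 1e-3], [np.inf, np.inf]))

    # Mean survival of a Weibull distribution: scale * Gamma(1 + 1/shape)
    mean_survival = scale * gamma(1.0 + 1.0 / shape)
    print(f"scale ≈ {scale:.1f}, shape ≈ {shape:.2f}, mean survival ≈ {mean_survival:.1f} months")
    ```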

  18. Estimating tail probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Carr, D.B.; Tolley, H.D.

    1982-12-01

    This paper investigates procedures for univariate nonparametric estimation of tail probabilities. Extrapolated values for tail probabilities beyond the data are also obtained based on the shape of the density in the tail. Several estimators which use exponential weighting are described. These are compared in a Monte Carlo study to nonweighted estimators, to the empirical cdf, to an integrated kernel, to a Fourier series estimate, to a penalized likelihood estimate and a maximum likelihood estimate. Selected weighted estimators are shown to compare favorably to many of these standard estimators for the sampling distributions investigated.

  19. Estimating Probabilities in Recommendation Systems

    CERN Document Server

    Sun, Mingxuan; Kidwell, Paul

    2010-01-01

    Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computation schemes using combinatorial properties of generating functions. We demonstrate our approach with several case studies involving real world movie recommendation data. The results are comparable with state-of-the-art techniques while also providing probabilistic preference estimates outside the scope of traditional recommender systems.

  20. Tail Probabilities for Regression Estimators

    NARCIS (Netherlands)

    T. Mikosch; C.G. de Vries (Casper)

    2006-01-01

    Estimators of regression coefficients are known to be asymptotically normally distributed, provided certain regularity conditions are satisfied. In small samples and if the noise is not normally distributed, this can be a poor guide to the quality of the estimators. The paper addresses

  1. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used to assess the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws, such as cracks and crack-like flaws, need to be detected with these NDE methods, and a reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper discusses optimizing probability of detection (POD) demonstration experiments using the point estimate method, which NASA uses for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false calls (POF) while keeping the flaw sizes in the set as small as possible.
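
    As a worked illustration of the binomial reasoning behind a point estimate demonstration, the sketch below computes the probability of passing the demonstration (no more than a given number of misses among the flaws in the set) as a function of the true POD. The pass criterion and the values tried are illustrative assumptions, not the quantities optimized in the paper.

    ```python
    from scipy.stats import binom

    n_flaws = 29        # flaws of (nominally) the same size in the demonstration set
    max_misses = 0      # pass criterion assumed here: detect all 29 flaws

    for true_pod in (0.80, 0.90, 0.95, 0.99):
        # Number of misses is Binomial(n_flaws, 1 - POD); passing means misses <= max_misses
        ppd = binom.cdf(max_misses, n_flaws, 1.0 - true_pod)
        print(f"true POD = {true_pod:.2f}: probability of passing the demonstration = {ppd:.3f}")
    ```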

  2. Channel Capacity Estimation using Free Probability Theory

    CERN Document Server

    Ryan, Øyvind

    2007-01-01

    In many channel measurement applications, one needs to estimate some characteristics of the channels based on a limited set of measurements. This is mainly due to the highly time-varying characteristics of the channel. In this contribution, it will be shown how free probability can be used for channel capacity estimation in MIMO systems. Free probability has already been applied in various application fields such as digital communications, nuclear physics and mathematical finance, and has been shown to be an invaluable tool for describing the asymptotic behaviour of many systems when the dimensions of the system (i.e. the number of antennas) get large. In particular, introducing the notion of free deconvolution, we provide hereafter an asymptotically (in the number of antennas) unbiased capacity estimator (w.r.t. the number of observations) for MIMO channels impaired with noise. Another unbiased estimator (for any number of observations) is also constructed by slightly modifying the free probability based estimator.

  3. Risk Probability Estimating Based on Clustering

    DEFF Research Database (Denmark)

    Chen, Yong; Jensen, Christian D.; Gray, Elizabeth

    2003-01-01

    Such evidence might include records of prior experiences, recommendations from a trusted entity or the reputation of the other entity. In this paper we propose a dynamic mechanism for estimating the risk probability of a certain interaction in a given environment using hybrid neural networks. We argue that traditional risk assessment models from the insurance industry do not directly apply to ubiquitous computing environments. Instead, we propose a dynamic mechanism for risk assessment, which is based on pattern matching, classification and prediction procedures. This mechanism uses an estimator of risk probability, which is based on hybrid neural networks.

  4. Probabilities of future VEI ≥ 2 eruptions at the Central American Volcanic Arc: a statistical perspective based on the past centuries' eruption record

    Science.gov (United States)

    Dzierma, Yvonne; Wehrmann, Heidi

    2014-10-01

    A probabilistic eruption forecast is provided for seven historically active volcanoes along the Central American Volcanic Arc (CAVA), as a pivotal empirical contribution to multi-disciplinary volcanic hazards assessment. The eruption probabilities are determined with a Kaplan-Meier estimator of survival functions, and parametric time series models are applied to describe the historical eruption records. Aside from the volcanoes that are currently in a state of eruptive activity (Santa María, Fuego, and Arenal), the highest probabilities for eruptions of VEI ≥ 2 occur at Concepción and Cerro Negro in Nicaragua, which have a 70-85 % probability of erupting within the next 10 years. Poás and Irazú in Costa Rica show a medium to high eruption probability, followed by San Miguel (El Salvador), Rincón de la Vieja (Costa Rica), and Izalco (El Salvador; 24 % within the next 10 years).

  5. A Thermodynamical Approach for Probability Estimation

    CERN Document Server

    Isozaki, Takashi

    2012-01-01

    The issue of discrete probability estimation for samples of small size is addressed in this study. The maximum likelihood method often suffers from over-fitting when insufficient data are available. Although the Bayesian approach can avoid over-fitting by using prior distributions, it still has problems with objective analysis. In response to these drawbacks, a new theoretical framework based on thermodynamics, in which energy and temperature are introduced, was developed. Entropy and likelihood are placed at the center of this method. The key principle of inference for probability mass functions is the minimum free energy, which is shown to unify the two principles of maximum likelihood and maximum entropy. Our method can robustly estimate probability functions from small-size data.

  6. A new estimator of the discovery probability.

    Science.gov (United States)

    Favaro, Stefano; Lijoi, Antonio; Prünster, Igor

    2012-12-01

    Species sampling problems have a long history in ecological and biological studies and a number of issues, including the evaluation of species richness, the design of sampling experiments, and the estimation of rare species variety, are to be addressed. Such inferential problems have recently emerged also in genomic applications, however, exhibiting some peculiar features that make them more challenging: specifically, one has to deal with very large populations (genomic libraries) containing a huge number of distinct species (genes) and only a small portion of the library has been sampled (sequenced). These aspects motivate the Bayesian nonparametric approach we undertake, since it allows to achieve the degree of flexibility typically needed in this framework. Based on an observed sample of size n, focus will be on prediction of a key aspect of the outcome from an additional sample of size m, namely, the so-called discovery probability. In particular, conditionally on an observed basic sample of size n, we derive a novel estimator of the probability of detecting, at the (n+m+1)th observation, species that have been observed with any given frequency in the enlarged sample of size n+m. Such an estimator admits a closed-form expression that can be exactly evaluated. The result we obtain allows us to quantify both the rate at which rare species are detected and the achieved sample coverage of abundant species, as m increases. Natural applications are represented by the estimation of the probability of discovering rare genes within genomic libraries and the results are illustrated by means of two expressed sequence tags datasets.

  7. Risk Probability Estimating Based on Clustering

    DEFF Research Database (Denmark)

    Chen, Yong; Jensen, Christian D.; Gray, Elizabeth;

    2003-01-01

    Ubiquitous computing environments are highly dynamic, with new unforeseen circumstances and constantly changing environments, which introduces new risks that cannot be assessed through traditional means of risk analysis. Mobile entities in a ubiquitous computing environment require the ability to perform an autonomous assessment of the risk incurred by a specific interaction with another entity in a given context. This assessment will allow a mobile entity to decide whether sufficient evidence exists to mitigate the risk and allow the interaction to proceed. Such evidence might include records of prior experiences, recommendations from a trusted entity or the reputation of the other entity. We argue that traditional risk assessment models from the insurance industry do not directly apply to ubiquitous computing environments. Instead, we propose a dynamic mechanism for risk assessment, which is based on pattern matching, classification and prediction procedures. This mechanism uses an estimator of risk probability, which is based on hybrid neural networks.

  8. Comparison of density estimators. [Estimation of probability density functions

    Energy Technology Data Exchange (ETDEWEB)

    Kao, S.; Monahan, J.F.

    1977-09-01

    Recent work in the field of probability density estimation has included the introduction of some new methods, such as the polynomial and spline methods and the nearest neighbor method, and the study of asymptotic properties in depth. This earlier work is summarized here. In addition, the computational complexity of the various algorithms is analyzed, as are some simulations. The object is to compare the performance of the various methods in small samples and their sensitivity to change in their parameters, and to attempt to discover at what point a sample is so small that density estimation can no longer be worthwhile. (RWR)

  9. Estimation of transition probabilities of credit ratings

    Science.gov (United States)

    Peng, Gan Chew; Hin, Pooi Ah

    2015-12-01

    The present research is based on the quarterly credit ratings of ten companies over 15 years taken from the database of the Taiwan Economic Journal. The components of the vector m_i = (m_{i1}, m_{i2}, ..., m_{i10}) may first be used to denote the credit ratings of the ten companies in the i-th quarter. The vector m_{i+1} in the next quarter is modelled to be dependent on the vector m_i via a conditional distribution which is derived from a 20-dimensional power-normal mixture distribution. The transition probability P_{kl}(i, j) of getting m_{i+1,j} = l given that m_{i,j} = k is then computed from the conditional distribution. It is found that the variation of the transition probability P_{kl}(i, j) as i varies is able to give an indication of the possible transition of the credit rating of the j-th company in the near future.
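
    For intuition, transition probabilities between rating classes are often first estimated by simply counting observed quarter-to-quarter moves, as in the sketch below for a single hypothetical rating sequence. This plain empirical estimator is not the paper's power-normal mixture approach, which models the joint behaviour of all ten companies, but it shows what a transition probability matrix looks like.

    ```python
    import numpy as np

    # Hypothetical quarterly ratings of one company, coded 1 (best) to 4 (worst)
    ratings = [1, 1, 2, 2, 2, 3, 2, 2, 1, 1, 2, 3, 3, 4, 3, 3]
    n_classes = 4

    counts = np.zeros((n_classes, n_classes))
    for current, nxt in zip(ratings[:-1], ratings[1:]):
        counts[current - 1, nxt - 1] += 1        # tally observed k -> l moves

    # Row-normalise to get empirical transition probabilities P(k -> l)
    row_sums = counts.sum(axis=1, keepdims=True)
    transition = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)
    print(np.round(transition, 2))
    ```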

  10. NONPARAMETRIC ESTIMATION OF CHARACTERISTICS OF PROBABILITY DISTRIBUTIONS

    Directory of Open Access Journals (Sweden)

    Orlov A. I.

    2015-10-01

    The article is devoted to nonparametric point and interval estimation of the characteristics of a probability distribution (the expectation, median, variance, standard deviation, and coefficient of variation) from sample results. Sample values are regarded as realizations of independent and identically distributed random variables with an arbitrary distribution function having the required number of moments. Nonparametric analysis procedures are compared with parametric procedures based on the assumption that the sample values have a normal distribution. Point estimators are constructed in the obvious way, using sample analogues of the theoretical characteristics. Interval estimators are based on the asymptotic normality of sample moments and functions of them. Nonparametric asymptotic confidence intervals are obtained through a special technique for deriving asymptotic relations in applied statistics. In the first step, this technique uses the multidimensional central limit theorem applied to sums of vectors whose coordinates are powers of the initial random variables. The second step transforms the limiting multivariate normal vector to obtain the vector of interest to the researcher; here linearization is used and infinitesimal quantities are discarded. The third step is a rigorous justification of the results at the standard level of asymptotic mathematical-statistical reasoning, which usually requires necessary and sufficient conditions for the inheritance of convergence. The article contains 10 numerical examples. The initial data are the operating times to the limit state of 50 cutting tools. Using methods developed under the assumption of a normal distribution can lead to noticeably distorted conclusions in situations where the normality hypothesis fails. The practical recommendation is that, for the analysis of real data, nonparametric confidence limits should be used.
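
    A minimal sketch of the kind of CLT-based nonparametric interval described above, here for the expectation (sample mean ± z·s/√n) with hypothetical operating-time data; intervals for the other characteristics follow the same pattern, with the delta method supplying the standard errors.

    ```python
    import math

    # Hypothetical operating times to the limit state (hours) for a batch of cutting tools
    x = [48, 53, 61, 44, 57, 66, 71, 39, 58, 62, 49, 55, 60, 68, 52]
    n = len(x)

    mean = sum(x) / n
    s2 = sum((v - mean) ** 2 for v in x) / (n - 1)     # unbiased sample variance
    se = math.sqrt(s2 / n)                             # standard error of the mean

    z = 1.96                                           # asymptotic 95% normal quantile
    lower, upper = mean - z * se, mean + z * se
    print(f"mean = {mean:.1f}, asymptotic 95% CI = ({lower:.1f}, {upper:.1f})")
    ```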

  11. Probability shapes perceptual precision: A study in orientation estimation.

    Science.gov (United States)

    Jabar, Syaheed B; Anderson, Britt

    2015-12-01

    Probability is known to affect perceptual estimations, but an understanding of mechanisms is lacking. Moving beyond binary classification tasks, we had naive participants report the orientation of briefly viewed gratings where we systematically manipulated contingent probability. Participants rapidly developed faster and more precise estimations for high-probability tilts. The shapes of their error distributions, as indexed by a kurtosis measure, also showed a distortion from Gaussian. This kurtosis metric was robust, capturing probability effects that were graded, contextual, and varying as a function of stimulus orientation. Our data can be understood as a probability-induced reduction in the variability or "shape" of estimation errors, as would be expected if probability affects the perceptual representations. As probability manipulations are an implicit component of many endogenous cuing paradigms, changes at the perceptual level could account for changes in performance that might have traditionally been ascribed to "attention." (c) 2015 APA, all rights reserved.

  12. The estimation of tree posterior probabilities using conditional clade probability distributions.

    Science.gov (United States)

    Larget, Bret

    2013-07-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample.

  13. Estimating the exceedance probability of extreme rainfalls up to the probable maximum precipitation

    Science.gov (United States)

    Nathan, Rory; Jordan, Phillip; Scorah, Matthew; Lang, Simon; Kuczera, George; Schaefer, Melvin; Weinmann, Erwin

    2016-12-01

    If risk-based criteria are used in the design of high hazard structures (such as dam spillways and nuclear power stations), then it is necessary to estimate the annual exceedance probability (AEP) of extreme rainfalls up to and including the Probable Maximum Precipitation (PMP). This paper describes the development and application of two largely independent methods to estimate the frequencies of such extreme rainfalls. One method is based on stochastic storm transposition (SST), which combines the "arrival" and "transposition" probabilities of an extreme storm using the total probability theorem. The second method, based on "stochastic storm regression" (SSR), combines frequency curves of point rainfalls with regression estimates of local and transposed areal rainfalls; rainfall maxima are generated by stochastically sampling the independent variates, where the required exceedance probabilities are obtained using the total probability theorem. The methods are applied to two large catchments (with areas of 3550 km² and 15,280 km²) located in inland southern Australia. Both methods were found to provide similar estimates of the frequency of extreme areal rainfalls for the two study catchments. The best estimates of the AEP of the PMP for the smaller and larger of the catchments were found to be 10⁻⁷ and 10⁻⁶, respectively, but the uncertainty of these estimates spans one to two orders of magnitude. Additionally, the SST method was applied to a range of locations within a meteorologically homogenous region to investigate the nature of the relationship between the AEP of PMP and catchment area.

  14. Information-theoretic methods for estimating of complicated probability distributions

    CERN Document Server

    Zong, Zhi

    2006-01-01

    Mixing various disciplines frequently produces something profound and far-reaching; cybernetics is an often-quoted example. The mix of information theory, statistics and computing technology has proved very useful and has led to the recent development of information-theory-based methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is a fundamental task in quite a few fields besides statistics, such as reliability, probabilistic risk analysis (PSA), machine learning, pattern recognition, image processing and neural networks.

  15. Detection probabilities for time-domain velocity estimation

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    1991-01-01

    Estimation of blood velocities by time-domain cross-correlation of successive high frequency sampled ultrasound signals is investigated. It is shown that any velocity can result from the estimator regardless of the true velocity due to the nonlinear technique employed. Using a simple simulation...... as a filter with a transfer function depending on the actual velocity. This influences the detection probability, which gets lower at certain velocities. An index directly reflecting the probability of detection can easily be calculated from the cross-correlation estimated. This makes it possible to assess...... the reliability of the velocity estimate in real time...

  16. Naive Probability: Model-Based Estimates of Unique Events.

    Science.gov (United States)

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N

    2015-08-01

    We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning.

  17. Revising probability estimates: Why increasing likelihood means increasing impact.

    Science.gov (United States)

    Maglio, Sam J; Polman, Evan

    2016-08-01

    Forecasted probabilities rarely stay the same for long. Instead, they are subject to constant revision: moving upward or downward, uncertain events become more or less likely. Yet little is known about how people interpret probability estimates beyond static snapshots, like a 30% chance of rain. Here, we consider the cognitive, affective, and behavioral consequences of revisions to probability forecasts. Stemming from a lay belief that revisions signal the emergence of a trend, we find in 10 studies (comprising uncertain events such as weather, climate change, sex, sports, and wine) that upward changes to event-probability (e.g., increasing from 20% to 30%) cause events to feel less remote than downward changes (e.g., decreasing from 40% to 30%), and subsequently change people's behavior regarding those events despite the revised event-probabilities being the same. Our research sheds light on how revising the probabilities for future events changes how people manage those uncertain events. (PsycINFO Database Record)

  18. False Alarm Probability Estimation for Compressive Sensing Radar

    NARCIS (Netherlands)

    Anitori, L.; Otten, M.P.G.; Hoogeboom, P.

    2011-01-01

    In this paper false alarm probability (FAP) estimation of a radar using Compressive Sensing (CS) in the frequency domain is investigated. Compressive Sensing is a recently proposed technique which allows reconstruction of sparse signals from sub-Nyquist-rate measurements. The estimation of the FAP is

  19. Estimation of State Transition Probabilities: A Neural Network Model

    Science.gov (United States)

    Saito, Hiroshi; Takiyama, Ken; Okada, Masato

    2015-12-01

    Humans and animals can predict future states on the basis of acquired knowledge. This prediction of the state transition is important for choosing the best action, and the prediction is only possible if the state transition probability has already been learned. However, how our brains learn the state transition probability is unknown. Here, we propose a simple algorithm for estimating the state transition probability by utilizing the state prediction error. We analytically and numerically confirmed that our algorithm is able to learn the probability completely with an appropriate learning rate. Furthermore, our learning rule reproduced experimentally reported psychometric functions and neural activities in the lateral intraparietal area in a decision-making task. Thus, our algorithm might describe the manner in which our brains learn state transition probabilities and predict future states.
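
    The abstract does not spell out the update rule, but a prediction-error-driven estimate of transition probabilities can be sketched with a simple delta rule in which the estimated probability of the observed successor state is nudged toward 1 and the others toward 0. The sketch below is an illustration under that assumption, not necessarily the authors' exact algorithm; the transition matrix and learning rate are made up.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_states = 3
    true_P = np.array([[0.7, 0.2, 0.1],
                       [0.1, 0.6, 0.3],
                       [0.2, 0.3, 0.5]])                     # hypothetical ground-truth transitions

    est_P = np.full((n_states, n_states), 1.0 / n_states)    # start from a uniform guess
    alpha = 0.05                                             # learning rate

    state = 0
    for _ in range(20000):
        next_state = rng.choice(n_states, p=true_P[state])
        # State prediction error: indicator of the state that occurred minus the current estimate
        target = np.zeros(n_states)
        target[next_state] = 1.0
        est_P[state] += alpha * (target - est_P[state])      # delta rule; rows keep summing to 1
        state = next_state

    print(np.round(est_P, 2))    # close to true_P for a suitably small learning rate
    ```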

  20. Estimating the empirical probability of submarine landslide occurrence

    Science.gov (United States)

    Geist, Eric L.; Parsons, Thomas E.; Mosher, David C.; Shipp, Craig; Moscardelli, Lorena; Chaytor, Jason D.; Baxter, Christopher D. P.; Lee, Homa J.; Urgeles, Roger

    2010-01-01

    The empirical probability for the occurrence of submarine landslides at a given location can be estimated from age dates of past landslides. In this study, tools developed to estimate earthquake probability from paleoseismic horizons are adapted to estimate submarine landslide probability. In both types of estimates, one has to account for the uncertainty associated with age-dating individual events as well as the open time intervals before and after the observed sequence of landslides. For observed sequences of submarine landslides, we typically only have the age date of the youngest event and possibly of a seismic horizon that lies below the oldest event in a landslide sequence. We use an empirical Bayes analysis based on the Poisson-Gamma conjugate prior model specifically applied to the landslide probability problem. This model assumes that landslide events as imaged in geophysical data are independent and occur in time according to a Poisson distribution characterized by a rate parameter λ. With this method, we are able to estimate the most likely value of λ and, importantly, the range of uncertainty in this estimate. Examples considered include landslide sequences observed in the Santa Barbara Channel, California, and in Port Valdez, Alaska. We confirm that, given the uncertainties of age dating, landslide complexes can be treated as single events by performing a statistical test of age dates representing the main failure episode of the Holocene Storegga landslide complex.
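
    Under the Poisson-Gamma conjugate model described above, the posterior for the rate λ after observing n dated landslides over a window of length T is again a Gamma distribution, and the probability of at least one event in the next t units of time has a closed form. The sketch below works through the algebra with hypothetical prior parameters and counts; these are not the calibrated values used for the Santa Barbara Channel or Port Valdez examples.

    ```python
    # Gamma(alpha, beta) prior on the landslide rate lambda (events per kyr); hypothetical values
    alpha_prior, beta_prior = 1.0, 5.0

    # Observed record: n dated landslides over an observation window of T kyr (hypothetical)
    n_events, T = 4, 12.0

    # Conjugate update: the posterior of lambda is Gamma(alpha + n, beta + T)
    alpha_post = alpha_prior + n_events
    beta_post = beta_prior + T
    print(f"posterior mean rate ≈ {alpha_post / beta_post:.3f} events per kyr")

    # Probability of at least one landslide in the next t kyr, averaging over the posterior of lambda:
    # P = 1 - E[exp(-lambda * t)] = 1 - (beta_post / (beta_post + t)) ** alpha_post
    for t in (0.1, 0.5, 1.0):
        p = 1.0 - (beta_post / (beta_post + t)) ** alpha_post
        print(f"P(at least one event within {t:g} kyr) ≈ {p:.3f}")
    ```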

  1. Estimating the probability of rare events: addressing zero failure data.

    Science.gov (United States)

    Quigley, John; Revie, Matthew

    2011-07-01

    Traditional statistical procedures for estimating the probability of an event result in an estimate of zero when no events are realized. Alternative inferential procedures have been proposed for the situation where zero events have been realized, but often these are ad hoc, relying on selecting methods dependent on the data that have been realized. Such data-dependent inference decisions violate fundamental statistical principles, resulting in estimation procedures whose benefits are difficult to assess. In this article, we propose estimating the probability of an event occurring through minimax inference on the probability that future samples of equal size realize no more events than that in the data on which the inference is based. Although motivated by inference on rare events, the method is not restricted to zero event data and closely approximates the maximum likelihood estimate (MLE) for nonzero data. The use of the minimax procedure provides a risk-averse inferential procedure where there are no events realized. A comparison is made with the MLE and regions of the underlying probability are identified where this approach is superior. Moreover, a comparison is made with three standard approaches to supporting inference where no event data are realized, which we argue are unduly pessimistic. We show that for situations of zero events the estimator can be simply approximated by 1/(2.5n), where n is the number of trials.
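
    A small numerical sketch of the closing approximation: with zero observed events in n trials the proposed estimate is roughly 1/(2.5n), whereas the maximum likelihood estimate k/n collapses to zero. The values below are purely illustrative.

    ```python
    def event_probability_estimate(k, n):
        """MLE k/n for nonzero data; the 1/(2.5 n) approximation when no events are observed."""
        if k == 0:
            return 1.0 / (2.5 * n)
        return k / n

    for n in (10, 50, 200):
        print(f"n = {n:>3}: zero-event estimate ≈ {event_probability_estimate(0, n):.4f}, "
              f"MLE with k = 2 -> {event_probability_estimate(2, n):.4f}")
    ```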

  2. Failure Probability Estimation of Wind Turbines by Enhanced Monte Carlo

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Naess, Arvid

    2012-01-01

    This paper discusses the estimation of the failure probability of wind turbines required by codes of practice for designing them. Standard Monte Carlo (SMC) simulations may conceptually be used for this purpose as an alternative to the popular Peaks-Over-Threshold (POT) method. However, estimation of very low failure probabilities with SMC simulations leads to unacceptably high computational costs. In this study, an Enhanced Monte Carlo (EMC) method is proposed that overcomes this obstacle. The method has advantages over both POT and SMC in terms of its low computational cost and accuracy... is controlled by the pitch controller. This provides a fair framework for comparison of the behavior and failure event of the wind turbine with emphasis on the effect of the pitch controller. The Enhanced Monte Carlo method is then applied to the model and the failure probabilities of the model are estimated...

  3. Robustness of survival estimates for radio-marked animals

    Science.gov (United States)

    Bunck, C.M.; Chen, C.-L.

    1992-01-01

    Telemetry techniques are often used to study the survival of birds and mammals, particularly when mark-recapture approaches are unsuitable. Both parametric and nonparametric methods to estimate survival have been developed or modified from other applications. An implicit assumption in these approaches is that the probability of re-locating an animal with a functioning transmitter is one. A Monte Carlo study was conducted to determine the bias and variance of the Kaplan-Meier estimator and of an estimator based on the assumption of constant hazard, and to evaluate the performance of the two-sample tests associated with each. Modifications of each estimator which allow a re-location probability of less than one are described and evaluated. Generally the unmodified estimators were biased but had lower variance. At low sample sizes all estimators performed poorly. Under the null hypothesis, the distribution of all test statistics reasonably approximated the null distribution when survival was low but not when it was high. The power of the two-sample tests was similar.

  4. Bias and precision of methods for estimating the difference in restricted mean survival time from an individual patient data meta-analysis

    Directory of Open Access Journals (Sweden)

    Béranger Lueza

    2016-03-01

    Background: The difference in restricted mean survival time, rmstD(t*), the area between two survival curves up to time horizon t*, is often used in cost-effectiveness analyses to estimate the treatment effect in randomized controlled trials. A challenge in individual patient data (IPD) meta-analyses is to account for the trial effect. We aimed at comparing different methods to estimate the rmstD(t*) from an IPD meta-analysis. Methods: We compared four methods: the area between Kaplan-Meier curves (experimental vs. control arm), ignoring the trial effect (Naïve Kaplan-Meier); the area between Peto curves computed at quintiles of event times (Peto-quintile); and the weighted average of the areas between either trial-specific Kaplan-Meier curves (Pooled Kaplan-Meier) or trial-specific exponential curves (Pooled Exponential). In a simulation study, we varied the between-trial heterogeneity for the baseline hazard and for the treatment effect (possibly correlated), the overall treatment effect, the time horizon t*, the number of trials and of patients, the use of fixed or DerSimonian-Laird random effects models, and the proportionality of hazards. We compared the methods in terms of bias, empirical and average standard errors. We used IPD from the Meta-Analysis of Chemotherapy in Nasopharynx Carcinoma (MAC-NPC) and its updated version MAC-NPC2 for illustration, which included respectively 1,975 and 5,028 patients in 11 and 23 comparisons. Results: The Naïve Kaplan-Meier method was unbiased, whereas the Pooled Exponential and, to a much lesser extent, the Pooled Kaplan-Meier methods showed a bias with non-proportional hazards. The Peto-quintile method underestimated the rmstD(t*), except with non-proportional hazards at t* = 5 years. In the presence of treatment effect
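
    The restricted mean survival time up to t* is the area under the survival curve from 0 to t*, so for a Kaplan-Meier step function it is a sum of rectangle areas, and rmstD(t*) is the difference between the two areas. A minimal sketch with hypothetical data, pooling the arms naively across trials in the spirit of the Naïve Kaplan-Meier method (the pooled and Peto-based variants require the trial labels):

    ```python
    import numpy as np

    def km_step(times, events):
        """Kaplan-Meier survival estimate as a right-continuous step function."""
        times, events = np.asarray(times, float), np.asarray(events, int)
        ts = np.unique(times[events == 1])
        s, surv = 1.0, []
        for t in ts:
            at_risk = np.sum(times >= t)
            deaths = np.sum((times == t) & (events == 1))
            s *= 1.0 - deaths / at_risk
            surv.append(s)
        return ts, np.array(surv)

    def rmst(times, events, t_star):
        """Area under the KM curve from 0 to t_star (sum of rectangles between jumps)."""
        ts, surv = km_step(times, events)
        knots = np.concatenate(([0.0], ts[ts < t_star], [t_star]))
        heights = np.concatenate(([1.0], surv[ts < t_star]))   # S(t) on each inter-jump interval
        return float(np.sum(np.diff(knots) * heights))

    # Hypothetical pooled arms (months); event = 1 means death observed
    exp_t, exp_e = [4, 9, 13, 20, 26, 33, 41, 55, 60, 60], [1, 1, 0, 1, 0, 1, 1, 0, 0, 0]
    ctl_t, ctl_e = [3, 7, 10, 15, 21, 28, 35, 44, 52, 60], [1, 1, 1, 1, 0, 1, 1, 1, 0, 0]

    t_star = 60.0
    rmst_d = rmst(exp_t, exp_e, t_star) - rmst(ctl_t, ctl_e, t_star)
    print(f"rmstD({t_star:.0f} months) ≈ {rmst_d:.1f} months in favour of the experimental arm")
    ```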

  5. Simulation and Estimation of Extreme Quantiles and Extreme Probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Guyader, Arnaud, E-mail: arnaud.guyader@uhb.fr [Universite Rennes 2 (France); Hengartner, Nicolas [Los Alamos National Laboratory, Information Sciences Group (United States); Matzner-Lober, Eric [Universite Rennes 2 (France)

    2011-10-15

    Let X be a random vector with distribution μ on ℝ^d and Φ be a mapping from ℝ^d to ℝ. That mapping acts as a black box, e.g., the result from some computer experiments for which no analytical expression is available. This paper presents an efficient algorithm to estimate a tail probability given a quantile or a quantile given a tail probability. The algorithm improves upon existing multilevel splitting methods and can be analyzed using Poisson process tools that lead to an exact description of the distribution of the estimated probabilities and quantiles. The performance of the algorithm is demonstrated in a problem related to digital watermarking.

  6. Allelic drop-out probabilities estimated by logistic regression

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Asplund, Maria

    2012-01-01

    We discuss the model for estimating drop-out probabilities presented by Tvedebrink et al. [7] and the concerns that have been raised. The criticism of the model has demonstrated that the model is not perfect. However, the model is very useful for advanced forensic genetic work, where allelic dro...

  7. Estimating the joint survival probabilities of married individuals

    NARCIS (Netherlands)

    Sanders, Lisanne; Melenberg, Bertrand

    2016-01-01

    We estimate the joint survival probability of spouses using a large random sample drawn from a Dutch census. As benchmarks we use two bivariate Weibull models. We consider more flexible models, using a semi-nonparametric approach, by extending the independent Weibull distribution using squared polyn

  8. Recursive estimation of prior probabilities using the mixture approach

    Science.gov (United States)

    Kazakos, D.

    1974-01-01

    The problem of estimating the prior probabilities q_k of a mixture of known density functions f_k(X), based on a sequence of N statistically independent observations, is considered. It is shown that for very mild restrictions on f_k(X), the maximum likelihood estimate of Q is asymptotically efficient. A recursive algorithm for estimating Q is proposed, analyzed, and optimized. For the M = 2 case, it is possible for the recursive algorithm to achieve the same performance as the maximum likelihood one. For M > 2, slightly inferior performance is the price for having a recursive algorithm. However, the loss is computable and tolerable.
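
    A hedged sketch of one such recursive scheme (a stochastic-approximation update, not necessarily Kazakos's exact algorithm): each new observation's posterior component probabilities, computed from the known densities f_k and the current weight estimate, nudge the weights with a 1/n step size. The component densities and true weights below are invented for illustration.

    ```python
    import numpy as np
    from scipy.stats import norm

    def recursive_mixture_weights(x_stream, densities, q0=None):
        """One-pass recursive estimate of the mixing weights q_k of a mixture
        whose component densities f_k are known."""
        M = len(densities)
        q = np.full(M, 1.0 / M) if q0 is None else np.asarray(q0, float)
        for n, x in enumerate(x_stream, start=1):
            f = np.array([d(x) for d in densities])   # f_k(x) for each component
            post = q * f / np.dot(q, f)               # posterior P(component k | x)
            q += (post - q) / n                       # stochastic-approximation step
        return q

    # two known normal components, true weights (0.3, 0.7)
    rng = np.random.default_rng(0)
    data = np.where(rng.random(5000) < 0.3, rng.normal(-2, 1, 5000), rng.normal(2, 1, 5000))
    q_hat = recursive_mixture_weights(data, [lambda x: norm.pdf(x, -2, 1),
                                             lambda x: norm.pdf(x, 2, 1)])
    ```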

  9. Estimating the historical and future probabilities of large terrorist events

    CERN Document Server

    Clauset, Aaron

    2012-01-01

    Quantities with right-skewed distributions are ubiquitous in complex social systems, including political conflict, economics and social networks, and these systems sometimes produce extremely large events. For instance, the 9/11 terrorist events produced nearly 3000 fatalities, nearly six times more than the next largest event. But, was this enormous loss of life statistically unlikely given modern terrorism's historical record? Accurately estimating the probability of such an event is complicated by the large fluctuations in the empirical distribution's upper tail. We present a generic statistical algorithm for making such estimates, which combines semi-parametric models of tail behavior and a non-parametric bootstrap. Applied to a global database of terrorist events, we estimate the worldwide historical probability of observing at least one 9/11-sized or larger event since 1968 to be 11-35%. These results are robust to conditioning on global variations in economic development, domestic versus international ...

  10. Probability Estimation in the Framework of Intuitionistic Fuzzy Evidence Theory

    Directory of Open Access Journals (Sweden)

    Yafei Song

    2015-01-01

    Full Text Available Intuitionistic fuzzy (IF) evidence theory, as an extension of Dempster-Shafer theory of evidence to the intuitionistic fuzzy environment, is exploited to process imprecise and vague information. Since its inception, much interest has been concentrated on IF evidence theory, and many works on belief functions in IF information systems have appeared. Although belief functions on IF sets can deal with uncertainty and vagueness well, they are not convenient for decision making. This paper addresses the issue of probability estimation in the framework of IF evidence theory with the hope of making rational decisions. Background knowledge about evidence theory, fuzzy sets, and IF sets is first reviewed, followed by an introduction to IF evidence theory. Axiomatic properties of the probability distribution are then proposed to assist our interpretation. Finally, probability estimations based on fuzzy and IF belief functions, together with their proofs, are presented. It is verified that the probability estimation method based on IF belief functions is also potentially applicable to classical evidence theory and fuzzy evidence theory. Moreover, IF belief functions can be combined in a convenient way once they are transformed to interval-valued possibilities.

  11. Incorporating medical interventions into carrier probability estimation for genetic counseling

    Directory of Open Access Journals (Sweden)

    Katki Hormuzd A

    2007-03-01

    Full Text Available Abstract Background Mendelian models for predicting who may carry an inherited deleterious mutation of known disease genes based on family history are used in a variety of clinical and research activities. People presenting for genetic counseling are increasingly reporting risk-reducing medical interventions in their family histories because, recently, a slew of prophylactic interventions have become available for certain diseases. For example, oophorectomy reduces risk of breast and ovarian cancers, and is now increasingly being offered to women with family histories of breast and ovarian cancer. Mendelian models should account for medical interventions because interventions modify mutation penetrances and thus affect the carrier probability estimate. Methods We extend Mendelian models to account for medical interventions by accounting for post-intervention disease history through an extra factor that can be estimated from published studies of the effects of interventions. We apply our methods to incorporate oophorectomy into the BRCAPRO model, which predicts a woman's risk of carrying mutations in BRCA1 and BRCA2 based on her family history of breast and ovarian cancer. This new BRCAPRO is available for clinical use. Results We show that accounting for interventions undergone by family members can seriously affect the mutation carrier probability estimate, especially if the family member has lived many years post-intervention. We show that interventions have more impact on the carrier probability as the benefits of intervention differ more between carriers and non-carriers. Conclusion These findings imply that carrier probability estimates that do not account for medical interventions may be seriously misleading and could affect a clinician's recommendation about offering genetic testing. The BayesMendel software, which allows one to implement any Mendelian carrier probability model, has been extended to allow medical interventions, so future

  12. Collective animal behavior from Bayesian estimation and probability matching.

    Directory of Open Access Journals (Sweden)

    Alfonso Pérez-Escudero

    2011-11-01

    Full Text Available Animals living in groups make movement decisions that depend, among other factors, on social interactions with other group members. Our present understanding of social rules in animal collectives is mainly based on empirical fits to observations, with less emphasis on first-principles approaches that allow their derivation. Here we show that patterns of collective decisions can be derived from the basic ability of animals to make probabilistic estimations in the presence of uncertainty. We build a decision-making model with two stages: Bayesian estimation and probabilistic matching. In the first stage, each animal makes a Bayesian estimation of which behavior is best to perform, taking into account personal information about the environment and social information collected by observing the behaviors of other animals. In the probability matching stage, each animal chooses a behavior with a probability equal to the Bayesian-estimated probability that this behavior is the most appropriate one. This model derives very simple rules of interaction in animal collectives that depend only on two types of reliability parameters, one that each animal assigns to the other animals and another given by the quality of the non-social information. We test our model by theoretically deriving a rich set of observed collective patterns of decisions in three-spined sticklebacks, Gasterosteus aculeatus, a shoaling fish species. The quantitative link shown between probabilistic estimation and collective rules of behavior allows better contact with other fields such as foraging, mate selection, neurobiology and psychology, and gives predictions for experiments directly testing the relationship between estimation and collective behavior.

  13. Estimating Income Variances by Probability Sampling: A Case Study

    Directory of Open Access Journals (Sweden)

    Akbar Ali Shah

    2010-08-01

    Full Text Available The main focus of the study is to estimate variability in the income distribution of households by conducting a survey. The variances in the income distribution have been calculated by probability sampling techniques. The variances are compared and relative gains are also obtained. It is concluded that the income distribution has improved compared with the first Household Income and Expenditure Survey (HIES) conducted in Pakistan in 1993-94.

  14. Maximum Entropy Estimation of Transition Probabilities of Reversible Markov Chains

    Directory of Open Access Journals (Sweden)

    Erik Van der Straeten

    2009-11-01

    Full Text Available In this paper, we develop a general theory for the estimation of the transition probabilities of reversible Markov chains using the maximum entropy principle. A broad range of physical models can be studied within this approach. We use one-dimensional classical spin systems to illustrate the theoretical ideas. The examples studied in this paper are: the Ising model, the Potts model and the Blume-Emery-Griffiths model.

  15. Ensembles of probability estimation trees for customer churn prediction

    OpenAIRE

    2010-01-01

    Customer churn prediction is one of the most important elements of a company's Customer Relationship Management (CRM) strategy. In this study, two strategies are investigated to increase the lift performance of ensemble classification models, i.e. (i) using probability estimation trees (PETs) instead of standard decision trees as base classifiers; and (ii) implementing alternative fusion rules based on lift weights for the combination of ensemble members' outputs. Experiments are conduct...

  16. Estimating the exceedance probability of rain rate by logistic regression

    Science.gov (United States)

    Chiu, Long S.; Kedem, Benjamin

    1990-01-01

    Recent studies have shown that the fraction of an area with rain intensity above a fixed threshold is highly correlated with the area-averaged rain rate. To estimate the fractional rainy area, a logistic regression model, which estimates the conditional probability that rain rate over an area exceeds a fixed threshold given the values of related covariates, is developed. The problem of dependency in the data in the estimation procedure is bypassed by the method of partial likelihood. Analyses of simulated scanning multichannel microwave radiometer and observed electrically scanning microwave radiometer data during the Global Atlantic Tropical Experiment period show that the use of logistic regression in pixel classification is superior to multiple regression in predicting whether rain rate at each pixel exceeds a given threshold, even in the presence of noisy data. The potential of the logistic regression technique in satellite rain rate estimation is discussed.
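
    A hedged sketch of the core idea with scikit-learn (the synthetic covariates below stand in for the radiometer channels, and the partial-likelihood treatment of spatial dependence described above is not reproduced):

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    X = rng.normal(size=(1000, 3))                       # per-pixel covariates (illustrative)
    y = (X @ np.array([1.5, -0.8, 0.3]) + rng.logistic(size=1000) > 0).astype(int)

    model = LogisticRegression().fit(X, y)               # P(rain rate > threshold | covariates)
    p_exceed = model.predict_proba(X)[:, 1]
    fractional_rainy_area = p_exceed.mean()              # proxy for the area-averaged rain statistic
    ```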

  17. Estimating probable flaw distributions in PWR steam generator tubes

    Energy Technology Data Exchange (ETDEWEB)

    Gorman, J.A.; Turner, A.P.L. [Dominion Engineering, Inc., McLean, VA (United States)

    1997-02-01

    This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary to secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the numbers of tubes with detectable flaws of various types as a function of time, and (2) the distributions in size of these flaws. Results are provided for hypothetical severely affected, moderately affected and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses.

  18. Probability Density and CFAR Threshold Estimation for Hyperspectral Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Clark, G A

    2004-09-21

    The work reported here shows the proof of principle (using a small data set) for a suite of algorithms designed to estimate the probability density function of hyperspectral background data and compute the appropriate Constant False Alarm Rate (CFAR) matched filter decision threshold for a chemical plume detector. Future work will provide a thorough demonstration of the algorithms and their performance with a large data set. The LASI (Large Aperture Search Initiative) Project involves instrumentation and image processing for hyperspectral images of chemical plumes in the atmosphere. The work reported here involves research and development on algorithms for reducing the false alarm rate in chemical plume detection and identification algorithms operating on hyperspectral image cubes. The chemical plume detection algorithms to date have used matched filters designed using generalized maximum likelihood ratio hypothesis testing algorithms [1, 2, 5, 6, 7, 12, 10, 11, 13]. One of the key challenges in hyperspectral imaging research is the high false alarm rate that often results from the plume detector [1, 2]. The overall goal of this work is to extend the classical matched filter detector to apply Constant False Alarm Rate (CFAR) methods to reduce the false alarm rate, or Probability of False Alarm P_FA of the matched filter [4, 8, 9, 12]. A detector designer is interested in minimizing the probability of false alarm while simultaneously maximizing the probability of detection P_D. This is summarized by the Receiver Operating Characteristic Curve (ROC) [10, 11], which is actually a family of curves depicting P_D vs. P_FA, parameterized by varying levels of signal to noise (or clutter) ratio (SNR or SCR). Often, it is advantageous to be able to specify a desired P_FA and develop a ROC curve (P_D vs. decision threshold r_0) for that case. That is the purpose of this work. Specifically, this work develops a set of algorithms and MATLAB...
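
    As a minimal stand-in for the density-based threshold selection described above (the paper estimates the background PDF; here the empirical quantile of background-only matched-filter scores is used instead, and `matched_filter` is a hypothetical function):

    ```python
    import numpy as np

    def cfar_threshold(background_scores, p_fa):
        """Decision threshold r_0 giving approximately the requested probability of
        false alarm P_FA on background-only matched-filter outputs."""
        return np.quantile(background_scores, 1.0 - p_fa)

    # scores_bg = matched_filter(background_cube)    # hypothetical plume-free scores
    # r0 = cfar_threshold(scores_bg, p_fa=1e-3)
    # detections = matched_filter(test_cube) > r0
    ```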

  19. Accurate photometric redshift probability density estimation - method comparison and application

    CERN Document Server

    Rau, Markus Michael; Brimioulle, Fabrice; Frank, Eibe; Friedrich, Oliver; Gruen, Daniel; Hoyle, Ben

    2015-01-01

    We introduce an ordinal classification algorithm for photometric redshift estimation, which vastly improves the reconstruction of photometric redshift probability density functions (PDFs) for individual galaxies and galaxy samples. As a use case we apply our method to CFHTLS galaxies. The ordinal classification algorithm treats distinct redshift bins as ordered values, which improves the quality of photometric redshift PDFs, compared with non-ordinal classification architectures. We also propose a new single value point estimate of the galaxy redshift, that can be used to estimate the full redshift PDF of a galaxy sample. This method is competitive in terms of accuracy with contemporary algorithms, which stack the full redshift PDFs of all galaxies in the sample, but requires orders of magnitudes less storage space. The methods described in this paper greatly improve the log-likelihood of individual object redshift PDFs, when compared with a popular Neural Network code (ANNz). In our use case, this improvemen...

  20. ESTIMATION OF INTRUSION DETECTION PROBABILITY BY PASSIVE INFRARED DETECTORS

    Directory of Open Access Journals (Sweden)

    V. V. Volkhonskiy

    2015-07-01

    Full Text Available Subject of Research. The paper deals with estimation of the detection probability of an intruder by a passive infrared detector under different conditions of velocity and direction, for automated analysis of the effectiveness of physical protection systems. Method. Analytic formulas for detection distance distribution laws, obtained by approximation of experimental histograms, are used. Main Results. The applicability of different distribution laws has been studied: the Rayleigh, Gauss, Gamma, Maxwell and Weibull distributions. Based on walk-test results, approximation of experimental histograms of the detection distance probability distribution laws of passive infrared detectors was performed. Conformity of the histograms to the mentioned analytical laws according to the χ2 fitting criterion has been checked for different conditions of velocity and direction of intruder movement. The mean and variance of the approximating distribution laws were set equal to the same parameters of the experimental histograms for the corresponding intruder movement parameters. Approximation accuracy for the above mentioned laws was evaluated at a significance level of 0.05. According to the χ2 fitting criterion, the Rayleigh and Gamma laws correspond most closely to the histograms for different velocities and directions of intruder movement. Dependences of approximation accuracy on the conditions of intrusion have been obtained; they can be used for choosing an approximation law for given conditions. Practical Relevance. The analytic formulas for detection probability are usable for modeling of the intrusion process and objective effectiveness estimation of physical protection systems by both developers and users.

  1. Estimation of the probability of success in petroleum exploration

    Science.gov (United States)

    Davis, J.C.

    1977-01-01

    A probabilistic model for oil exploration can be developed by assessing the conditional relationship between perceived geologic variables and the subsequent discovery of petroleum. Such a model includes two probabilistic components, the first reflecting the association between a geologic condition (structural closure, for example) and the occurrence of oil, and the second reflecting the uncertainty associated with the estimation of geologic variables in areas of limited control. Estimates of the conditional relationship between geologic variables and subsequent production can be found by analyzing the exploration history of a "training area" judged to be geologically similar to the exploration area. The geologic variables are assessed over the training area using an historical subset of the available data, whose density corresponds to the present control density in the exploration area. The success or failure of wells drilled in the training area subsequent to the time corresponding to the historical subset provides empirical estimates of the probability of success conditional upon geology. Uncertainty in perception of geological conditions may be estimated from the distribution of errors made in geologic assessment using the historical subset of control wells. These errors may be expressed as a linear function of distance from available control. Alternatively, the uncertainty may be found by calculating the semivariogram of the geologic variables used in the analysis: the two procedures will yield approximately equivalent results. The empirical probability functions may then be transferred to the exploration area and used to estimate the likelihood of success of specific exploration plays. These estimates will reflect both the conditional relationship between the geological variables used to guide exploration and the uncertainty resulting from lack of control. The technique is illustrated with case histories from the mid-Continent area of the U.S.A. © 1977 Plenum

  2. Clinical Features in a Danish Population-Based Cohort of Probable Multiple System Atrophy Patients

    DEFF Research Database (Denmark)

    Starhof, Charlotte; Korbo, Lise; Lassen, Christina Funch

    2016-01-01

    the criteria for probable MSA. We recorded clinical features, examined differences by MSA subtype and used Kaplan-Meier survival analysis to examine mortality. Results: The mean age at onset of patients with probable MSA was 60.2 years (range 36-75 years) and mean time to wheelchair dependency was 4.7 years … (range 0-15 years). One-third of patients experienced a transient improvement in motor symptoms with use of levodopa. Median survival from disease onset was 6.9 years (range 1-16 years, 95% CI 6.3-7.5) with no apparent variation according to gender or subtype. Conclusions: Our nationwide approach...

  3. Estimation of probability densities using scale-free field theories.

    Science.gov (United States)

    Kinney, Justin B

    2014-07-01

    The question of how best to estimate a continuous probability density from finite data is an intriguing open problem at the interface of statistics and physics. Previous work has argued that this problem can be addressed in a natural way using methods from statistical field theory. Here I describe results that allow this field-theoretic approach to be rapidly and deterministically computed in low dimensions, making it practical for use in day-to-day data analysis. Importantly, this approach does not impose a privileged length scale for smoothness of the inferred probability density, but rather learns a natural length scale from the data due to the tradeoff between goodness of fit and an Occam factor. Open source software implementing this method in one and two dimensions is provided.

  4. New estimators of the extreme value index under random right censoring, for heavy-tailed distributions

    OpenAIRE

    Worms, Julien; Worms, Rym

    2014-01-01

    This paper presents new approaches for the estimation of the extreme value index in the framework of randomly censored (from the right) samples, based on the ideas of Kaplan-Meier integration and the synthetic data approach of S. Leurgans (1987). These ideas are developed here in the heavy tail case and for the adaptation of the Hill estimator, for which consistency is proved under first order conditions. Simulations show good performance of the two approaches, with...

  5. Estimation of exposure distribution adjusting for association between exposure level and detection limit.

    Science.gov (United States)

    Yang, Yuchen; Shelton, Brent J; Tucker, Thomas T; Li, Li; Kryscio, Richard; Chen, Li

    2017-08-15

    In environmental exposure studies, it is common to observe a portion of exposure measurements to fall below experimentally determined detection limits (DLs). The reverse Kaplan-Meier estimator, which mimics the well-known Kaplan-Meier estimator for right-censored survival data with the scale reversed, has been recommended for estimating the exposure distribution for the data subject to DLs because it does not require any distributional assumption. However, the reverse Kaplan-Meier estimator requires the independence assumption between the exposure level and DL and can lead to biased results when this assumption is violated. We propose a kernel-smoothed nonparametric estimator for the exposure distribution without imposing any independence assumption between the exposure level and DL. We show that the proposed estimator is consistent and asymptotically normal. Simulation studies demonstrate that the proposed estimator performs well in practical situations. A colon cancer study is provided for illustration. Copyright © 2017 John Wiley & Sons, Ltd.
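
    For reference, a sketch of the benchmark reverse Kaplan-Meier estimator mentioned above (not the proposed kernel-smoothed estimator): flipping the sign turns values known only to lie below their detection limit into right-censored observations, and the survival curve on the flipped scale is the exposure CDF. It still relies on the independence assumption the proposed estimator is designed to relax.

    ```python
    import numpy as np

    def reverse_km(values, detected):
        """Reverse Kaplan-Meier estimate of the exposure CDF.
        `values` holds the measurement when detected, otherwise the detection limit."""
        t = -np.asarray(values, float)                 # flip: left-censoring -> right-censoring
        d = np.asarray(detected, dtype=int)
        event_times = np.unique(t[d == 1])
        surv = np.cumprod([1.0 - ((t == u) & (d == 1)).sum() / (t >= u).sum()
                           for u in event_times])
        # S(u) on the flipped scale equals P(X < -u), so map back to the original scale
        return -event_times[::-1], surv[::-1]          # x (ascending), F_hat just below each x
    ```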

  6. Renal tolerance to nonhomogenous irradiation: Comparison of observed effects to predictions of normal tissue complication probability from different biophysical models

    Energy Technology Data Exchange (ETDEWEB)

    Flentje, M.; Hensley, F.; Gademann, G.; Wannenmacher, M. (Univ. of Heidelberg (Germany)); Menke, M. (German Cancer Research Center, Heidelberg (Germany))

    1993-09-01

    A patient series was analyzed retrospectively as an example of whole-organ kidney irradiation with an inhomogeneous dose distribution, to test the validity of biophysical models predicting normal tissue tolerance to radiotherapy. From 1969 to 1984, 142 patients with seminoma were irradiated to the paraaortic region using predominantly rotational techniques, which led to variable but partly substantial exposure of the kidneys. Median follow-up was 8.2 (2.1-21) years and actuarial 10-year survival (Kaplan-Meier estimate) was 82.8%. For all patients, 3-dimensional dose distributions were reconstructed and normal tissue complication probabilities for the kidneys were generated from the individual dose-volume histograms. To this end, different published biophysical algorithms were implemented in a 3-dimensional treatment planning system. In seven patients, clinically manifest renal impairment was observed (interval 10-84 months). An excellent agreement between predicted and observed effects was seen for two volume-oriented models, whereas complications were overestimated by an algorithm based on critical element assumptions. Should these observations be confirmed and extended to different types of organs, corresponding algorithms could easily be integrated into 3-dimensional treatment planning programs and be used for comparing and judging different plans on a more biologically oriented basis.

  7. Site Specific Probable Maximum Precipitation Estimates and Professional Judgement

    Science.gov (United States)

    Hayes, B. D.; Kao, S. C.; Kanney, J. F.; Quinlan, K. R.; DeNeale, S. T.

    2015-12-01

    State and federal regulatory authorities currently rely upon the US National Weather Service Hydrometeorological Reports (HMRs) to determine probable maximum precipitation (PMP) estimates (i.e., rainfall depths and durations) for estimating flooding hazards for relatively broad regions in the US. PMP estimates for the contributing watersheds upstream of vulnerable facilities are used to estimate riverine flooding hazards, while site-specific estimates for small watersheds are appropriate for individual facilities such as nuclear power plants. The HMRs are often criticized due to their limitations on basin size, questionable applicability in regions affected by orographic effects, their lack of consistent methods, and generally by their age. HMR-51, for generalized PMP estimates for the United States east of the 105th meridian, was published in 1978 and is sometimes perceived as overly conservative. The US Nuclear Regulatory Commission (NRC) is currently reviewing several flood hazard evaluation reports that rely on site-specific PMP estimates that have been commercially developed. As such, NRC has recently investigated key areas of expert judgement via a generic audit and one in-depth site-specific review as they relate to identifying and quantifying actual and potential storm moisture sources, determining storm transposition limits, and adjusting available moisture during storm transposition. Though much of the approach reviewed was considered a logical extension of the HMRs, two key points of expert judgement stood out for further in-depth review. The first relates primarily to small storms and the use of a heuristic for storm representative dew point adjustment developed for the Electric Power Research Institute by North American Weather Consultants in 1993 in order to harmonize historic storms, for which only 12-hour dew point data were available, with more recent storms in a single database. The second issue relates to the use of climatological averages for spatially

  8. A method to combine non-probability sample data with probability sample data in estimating spatial means of environmental variables

    NARCIS (Netherlands)

    Brus, D.J.; Gruijter, de J.J.

    2003-01-01

    In estimating spatial means of environmental variables of a region from data collected by convenience or purposive sampling, validity of the results can be ensured by collecting additional data through probability sampling. The precision of the pi estimator that uses the probability sample can be

  10. Polynomial probability distribution estimation using the method of moments.

    Science.gov (United States)

    Munkhammar, Joakim; Mattsson, Lars; Rydén, Jesper

    2017-01-01

    We suggest a procedure for estimating Nth degree polynomial approximations to unknown (or known) probability density functions (PDFs) based on N statistical moments from each distribution. The procedure is based on the method of moments and is set up algorithmically to aid applicability and to ensure rigor in use. In order to show applicability, polynomial PDF approximations are obtained for the distribution families Normal, Log-Normal and Weibull, as well as for a bimodal Weibull distribution and a data set of anonymized household electricity use. The results are compared with results for traditional PDF series expansion methods of Gram-Charlier type. It is concluded that this is a comparatively simple procedure that could be used when traditional distribution families are not applicable or when polynomial expansions of probability distributions might be considered useful approximations. In particular, this approach is practical for calculating convolutions of distributions, since such operations become integrals of polynomial expressions. Finally, in order to show an advanced applicability of the method, it is shown to be useful for approximating solutions to the Smoluchowski equation.
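
    A minimal sketch of the moment-matching idea on a bounded support [a, b] (the algorithmic safeguards of the paper are omitted, and the resulting polynomial is not guaranteed to stay non-negative in the tails): matching the first N sample moments plus the normalisation constraint yields a linear system for the coefficients.

    ```python
    import numpy as np

    def polynomial_pdf(sample, degree, a, b):
        """Coefficients c_k of p(x) = sum_k c_k * x**k on [a, b] whose moments of order
        0..degree match the sample moments (method of moments)."""
        sample = np.asarray(sample, float)
        mom = np.array([np.mean(sample**m) for m in range(degree + 1)])   # m_0 = 1
        A = np.array([[(b**(m + k + 1) - a**(m + k + 1)) / (m + k + 1)
                       for k in range(degree + 1)]
                      for m in range(degree + 1)])                        # int_a^b x**(m+k) dx
        return np.linalg.solve(A, mom)

    rng = np.random.default_rng(0)
    c = polynomial_pdf(rng.normal(size=10_000), degree=4, a=-4.0, b=4.0)
    p = np.polynomial.Polynomial(c)    # rough quartic approximation of the standard normal PDF
    ```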

  11. Estimating age conditional probability of developing disease from surveillance data

    Directory of Open Access Journals (Sweden)

    Fay Michael P

    2004-07-01

    Full Text Available Abstract Fay, Pfeiffer, Cronin, Le, and Feuer (Statistics in Medicine 2003; 22: 1837–1848) developed a formula to calculate the age-conditional probability of developing a disease for the first time (ACPDvD) for a hypothetical cohort. The novelty of the formula of Fay et al (2003) is that one need not know the rates of first incidence of disease per person-years alive and disease-free, but may input the rates of first incidence per person-years alive only. Similarly the formula uses rates of death from disease and death from other causes per person-years alive. The rates per person-years alive are much easier to estimate than per person-years alive and disease-free. Fay et al (2003) used simple piecewise constant models for all three rate functions which have constant rates within each age group. In this paper, we detail a method for estimating rate functions which does not have jumps at the beginning of age groupings, and need not be constant within age groupings. We call this method the mid-age group joinpoint (MAJ) model for the rates. The drawback of the MAJ model is that numerical integration must be used to estimate the resulting ACPDvD. To increase computational speed, we offer a piecewise approximation to the MAJ model, which we call the piecewise mid-age group joinpoint (PMAJ) model. The PMAJ model for the rates input into the formula for ACPDvD described in Fay et al (2003) is the current method used in the freely available DevCan software made available by the National Cancer Institute.

  12. On estimation of survival function under random censoring model

    Institute of Scientific and Technical Information of China (English)

    JIANG Jiancheng (蒋建成); CHENG Bo (程博); WU Xizhi (吴喜之)

    2002-01-01

    We study an estimator of the survival function under the random censoring model. A Bahadur-type representation of the estimator is obtained and an asymptotic expression for its mean squared error is given, which leads to the consistency and asymptotic normality of the estimator. A data-driven local bandwidth selection rule for the estimator is proposed. It is worth noting that the estimator is consistent at left boundary points, which contrasts with the cases of density and hazard rate estimation. A Monte Carlo comparison of different estimators is made and it appears that the proposed data-driven estimators have certain advantages over the common Kaplan-Meier estimator.

  13. Dental age estimation: the role of probability estimates at the 10 year threshold.

    Science.gov (United States)

    Lucas, Victoria S; McDonald, Fraser; Neil, Monica; Roberts, Graham

    2014-08-01

    The use of probability at the 18-year threshold has simplified the reporting of dental age estimates for emerging adults. The availability of simple-to-use, widely available software has enabled the development of the probability threshold for individual teeth in growing children. Tooth development stage data from a previous study at the 10-year threshold were reused to estimate the probability of developing teeth being above or below the 10-year threshold using the NORMDIST function in Microsoft Excel. The probabilities within an individual subject are averaged to give a single probability that a subject is above or below 10 years old. To test the validity of this approach, dental panoramic radiographs of 50 female and 50 male children within 2 years of the chronological age were assessed with the chronological age masked. Once the whole validation set of 100 radiographs had been assessed, the masking was removed and the chronological age and dental age compared. The dental age was compared with chronological age to determine whether the dental age correctly or incorrectly identified a validation subject as above or below the 10-year threshold. The probability estimates correctly identified children as above or below on 94% of occasions. Only 2% of the validation group with a chronological age of less than 10 years were assigned to the over-10-year group. This study indicates the very high accuracy of assignment at the 10-year threshold. Further work at other legally important age thresholds is needed to explore the value of this approach to the technique of age estimation. Copyright © 2014. Published by Elsevier Ltd.
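
    A sketch of the spreadsheet calculation described above, using Python's statistics module in place of NORMDIST; the reference means and SDs below are invented for illustration, not taken from the study.

    ```python
    from statistics import NormalDist

    def prob_over_threshold(stage_refs, threshold=10.0):
        """Average over the assessed teeth of P(age > threshold), where each developing
        tooth contributes a normal age-at-stage distribution (mean, SD) from reference data."""
        probs = [1.0 - NormalDist(mu, sd).cdf(threshold) for mu, sd in stage_refs]
        return sum(probs) / len(probs)

    # hypothetical (mean age, SD) for the observed development stage of four teeth
    teeth = [(9.1, 0.8), (9.6, 0.9), (10.4, 1.1), (9.9, 0.7)]
    p_over_10 = prob_over_threshold(teeth)   # about 0.39 -> assigned to the under-10 group
    ```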

  14. First hitting probabilities for semi markov chains and estimation

    DEFF Research Database (Denmark)

    Georgiadis, Stylianos

    2017-01-01

    We first consider a stochastic system described by an absorbing semi-Markov chain with finite state space and we introduce the absorption probability to a class of recurrent states. Afterwards, we study the first hitting probability to a subset of states for an irreducible semi-Markov chain...

  15. Probability Prediction in Multistate Survival Models for Patients with Chronic Myeloid Leukaemia

    Institute of Scientific and Technical Information of China (English)

    FANG Ya; Hein Putter

    2005-01-01

    In order to find an appropriate model suitable for a multistate survival experiment, 634 patients with chronic myeloid leukaemia (CML) were selected to illustrate the method of analysis. After transplantation, there were 4 possible situations for a patient: disease free, relapse but still alive, death before relapse, and death after relapse. The last 3 events were considered as treatment failure. The results showed, with the competing-risks method, that the risk of death before relapse was higher than that of relapse, especially in the first year after transplantation. The result for patients with relapse time less than 12 months was much poorer by the Kaplan-Meier method. Multistate survival models were then developed, which were detailed and informative, building on the competing-risks and Kaplan-Meier analyses. With the multistate survival models, a further analysis of conditional probability was made for patients who were disease free and still alive at month 12 after transplantation. It was concluded that it is possible for an individual patient to predict the 4 possible probabilities at any time. Also the prognoses for relapse, whether or not followed by death, and for death either before or after relapse may be given. Furthermore, the conditional probabilities for patients who were disease free and still alive at a given time after transplantation can be predicted.

  16. probably

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    [Examples] 1. He can probably tell us the truth. 2. Will it rain this afternoon? Probably. [Explanation] Used as an adverb meaning "probably" or "perhaps"; it indicates a high likelihood and usually expresses a positive inference or judgment based on the current situation.

  17. A method to combine non-probability sample data with probability sample data in estimating spatial means of environmental variables.

    Science.gov (United States)

    Brus, D J; de Gruijter, J J

    2003-04-01

    In estimating spatial means of environmental variables of a region from data collected by convenience or purposive sampling, validity of the results can be ensured by collecting additional data through probability sampling. The precision of the pi estimator that uses the probability sample can be increased by interpolating the values at the nonprobability sample points to the probability sample points, and using these interpolated values as an auxiliary variable in the difference or regression estimator. These estimators are (approximately) unbiased, even when the nonprobability sample is severely biased such as in preferential samples. The gain in precision compared to the pi estimator in combination with Simple Random Sampling is controlled by the correlation between the target variable and interpolated variable. This correlation is determined by the size (density) and spatial coverage of the nonprobability sample, and the spatial continuity of the target variable. In a case study the average ratio of the variances of the simple regression estimator and pi estimator was 0.68 for preferential samples of size 150 with moderate spatial clustering, and 0.80 for preferential samples of similar size with strong spatial clustering. In the latter case the simple regression estimator was substantially more precise than the simple difference estimator.
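
    A minimal sketch of the difference estimator described above, with nearest-neighbour interpolation standing in for whatever interpolator (e.g. kriging) would be used in practice, and a simple random probability sample assumed for the correction term:

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def difference_estimator(y_prob, xy_prob, z_nonprob, xy_nonprob, xy_grid):
        """Spatial-mean estimate: interpolate the non-probability sample (auxiliary z),
        then correct its bias with the probability sample (assumed simple random here)."""
        tree = cKDTree(np.asarray(xy_nonprob, float))
        z_at_prob = np.asarray(z_nonprob)[tree.query(np.asarray(xy_prob, float))[1]]
        z_at_grid = np.asarray(z_nonprob)[tree.query(np.asarray(xy_grid, float))[1]]
        return z_at_grid.mean() + np.mean(np.asarray(y_prob) - z_at_prob)
    ```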

  18. Survival estimation in two-phase cohort studies with application to biomarkers evaluation.

    Science.gov (United States)

    Rebora, Paola; Valsecchi, Maria Grazia

    2016-12-01

    Two-phase studies are attractive for their economy and efficiency in research settings where large cohorts are available for investigating the prognostic and predictive role of novel genetic and biological factors. In this type of study, information on novel factors is collected only in a convenient subcohort (phase II) drawn from the cohort (phase I) according to a given (optimal) sampling strategy. Estimation of survival in the subcohort needs to account for the design. The Kaplan-Meier method, based on counts of events and of subjects at risk in time, must be applied accounting, with suitable weights, for the sampling probabilities of the subjects in phase II, in order to recover the representativeness of the subcohort for the entire cohort. The authors derived a proper variance estimator of survival by linearization. The proposed method is applied in the context of a two-phase study on childhood acute lymphoblastic leukemia, which was planned in order to evaluate the role of genetic polymorphisms on treatment failure due to relapse. The method has shown satisfactory performance through simulations under different scenarios, including the case-control setting, and proved to be useful for describing results in the clinical example.
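
    A hedged sketch of the design-weighted Kaplan-Meier idea described above (each phase-II subject weighted by the inverse of its sampling probability; the linearization variance estimator is not shown):

    ```python
    import numpy as np

    def weighted_km(time, event, sampling_prob):
        """Kaplan-Meier for a phase-II subcohort, weighting each subject by the inverse
        of its inclusion probability so the curve represents the whole phase-I cohort."""
        time, event = np.asarray(time, float), np.asarray(event, int)
        w = 1.0 / np.asarray(sampling_prob, float)
        event_times = np.unique(time[event == 1])
        surv, s = [], 1.0
        for t in event_times:
            s *= 1.0 - w[(time == t) & (event == 1)].sum() / w[time >= t].sum()
            surv.append(s)
        return event_times, np.array(surv)
    ```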

  19. The estimation of yearly probability gain for seismic statistical model

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Based on the calculation method for information gain in a stochastic process presented by Vere-Jones, the relation between information gain and probability gain, which is commonly used in earthquake prediction, is studied, and the yearly probability gain for seismic statistical models is proposed. The method is applied to the non-stationary Poisson model with whole-process exponential increase and to the stress release model. In addition, the prediction method for the stress release model is obtained based on the inverse function simulation method for a stochastic variable.

  20. Improved Estimation of Forestry Edge Effects Accounting for Detection Probability

    OpenAIRE

    Hocking, Daniel; Babbitt, Kimberly; Yamasaki, Mariko

    2013-01-01

    Poster presented at the 98th annual meeting of the Ecological Society of America (ESA) in Minneapolis, Minnesota, USA. We used a non-linear, parametric model accounting for detection probability to quantify red-backed salamander (Plethodon cinereus) abundance across clearcut-forest edges. This approach allows for projection across landscapes and prediction given alternative logging plans.

  1. Estimating Probabilities of Default for Low Default Portfolios

    OpenAIRE

    Katja Pluto; Dirk Tasche

    2004-01-01

    For credit risk management purposes in general, and for allocation of regulatory capital by banks in particular (Basel II), numerical assessments of the credit-worthiness of borrowers are indispensable. These assessments are expressed in terms of probabilities of default (PD) that should incorporate a certain degree of conservatism in order to reflect the prudential risk management style banks are required to apply. In case of credit portfolios that did not at all suffer defaults, or very few...

  2. Estimating the concordance probability in a survival analysis with a discrete number of risk groups.

    Science.gov (United States)

    Heller, Glenn; Mo, Qianxing

    2016-04-01

    A clinical risk classification system is an important component of a treatment decision algorithm. A measure used to assess the strength of a risk classification system is discrimination, and when the outcome is survival time, the most commonly applied global measure of discrimination is the concordance probability. The concordance probability represents the pairwise probability of lower patient risk given longer survival time. The c-index and the concordance probability estimate have been used to estimate the concordance probability when patient-specific risk scores are continuous. In the current paper, the concordance probability estimate and an inverse probability censoring weighted c-index are modified to account for discrete risk scores. Simulations are generated to assess the finite sample properties of the concordance probability estimate and the weighted c-index. An application of these measures of discriminatory power to a metastatic prostate cancer risk classification system is examined.
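
    For orientation, a naive Harrell-type concordance estimate with ties in a discrete risk score counted as 1/2 (the estimators studied in the paper additionally correct for censoring, e.g. by inverse probability of censoring weighting, which this sketch omits):

    ```python
    import numpy as np

    def naive_c_index(time, event, risk):
        """Fraction of usable pairs (the earlier time is an observed event) in which the
        patient with the shorter survival has the higher risk score; score ties count 1/2."""
        time, event, risk = map(np.asarray, (time, event, risk))
        num = den = 0.0
        for i in np.where(event == 1)[0]:
            longer = time > time[i]                   # patients who outlive patient i
            den += longer.sum()
            num += (risk[longer] < risk[i]).sum() + 0.5 * (risk[longer] == risk[i]).sum()
        return num / den
    ```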

  3. Estimating the posterior probabilities using the k-nearest neighbor rule.

    Science.gov (United States)

    Atiya, Amir F

    2005-03-01

    In many pattern classification problems, an estimate of the posterior probabilities (rather than only a classification) is required. This is usually the case when some confidence measure in the classification is needed. In this article, we propose a new posterior probability estimator. The proposed estimator considers the K-nearest neighbors. It attaches a weight to each neighbor that contributes in an additive fashion to the posterior probability estimate. The weights corresponding to the K-nearest-neighbors (which add to 1) are estimated from the data using a maximum likelihood approach. Simulation studies confirm the effectiveness of the proposed estimator.
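
    A sketch of the additive-weight construction described above (uniform weights shown; the paper estimates the K neighbor weights from the data by maximum likelihood, which is omitted here):

    ```python
    import numpy as np

    def knn_posterior(X_train, y_train, x, k, weights=None):
        """Posterior class probabilities at x: each of the k nearest neighbours contributes
        its weight (weights sum to 1) to the class it belongs to."""
        X_train, y_train = np.asarray(X_train, float), np.asarray(y_train)
        nn = np.argsort(np.linalg.norm(X_train - x, axis=1))[:k]
        w = np.full(k, 1.0 / k) if weights is None else np.asarray(weights, float)
        return {c: float(w[y_train[nn] == c].sum()) for c in np.unique(y_train)}
    ```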

  4. Estimating the probability of coexistence in cross-feeding communities.

    Science.gov (United States)

    Vessman, Björn; Gerlee, Philip; Lundh, Torbjörn

    2016-11-07

    The dynamics of many microbial ecosystems are driven by cross-feeding interactions, in which metabolites excreted by some species are metabolised further by others. The population dynamics of such ecosystems are governed by frequency-dependent selection, which allows for stable coexistence of two or more species. We have analysed a model of cross-feeding based on the replicator equation, with the aim of establishing criteria for coexistence in ecosystems containing three species, given the information of the three species' ability to coexist in their three separate pairs, i.e. the long-term dynamics in the three two-species component systems. The triple system is studied statistically and the probability of coexistence in the species triplet is computed for two models of species interactions. The interaction parameters are modelled either as stochastically independent or organised in a hierarchy where any derived metabolite carries less energy than previous nutrients in the metabolic chain. We differentiate between different modes of coexistence with respect to the pair-wise dynamics of the species, and find that the probability of coexistence is close to 1/2 for triplet systems with three pair-wise coexistent pairs and for the so-called intransitive systems. Systems with two and one pair-wise coexistent pairs are more likely to exist for random interaction parameters, but are on the other hand much less likely to exhibit triplet coexistence. Hence we conclude that certain species triplets are, from a statistical point of view, rare, but if allowed to interact are likely to coexist. This knowledge might be helpful when constructing synthetic microbial communities for industrial purposes. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. METAPHOR: Probability density estimation for machine learning based photometric redshifts

    Science.gov (United States)

    Amaro, V.; Cavuoti, S.; Brescia, M.; Vellucci, C.; Tortora, C.; Longo, G.

    2017-06-01

    We present METAPHOR (Machine-learning Estimation Tool for Accurate PHOtometric Redshifts), a method able to provide a reliable PDF for photometric galaxy redshifts estimated through empirical techniques. METAPHOR is a modular workflow, mainly based on the MLPQNA neural network as the internal engine to derive photometric galaxy redshifts, but giving the possibility to easily replace MLPQNA with any other method to predict photo-z's and their PDF. We present here the results of a validation test of the workflow on the galaxies from SDSS-DR9, also showing the universality of the method by replacing MLPQNA with KNN and Random Forest models. The validation test also includes a comparison with the PDFs derived from a traditional SED template fitting method (Le Phare).

  6. Estimation of Extreme Response and Failure Probability of Wind Turbines under Normal Operation using Probability Density Evolution Method

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Liu, W. F.

    2013-01-01

    Estimation of extreme response and failure probability of structures subjected to ultimate design loads is essential for structural design of wind turbines according to the new standard IEC61400-1. This task is focused on in the present paper in virtue of probability density evolution method (PDEM …), which underlies the schemes of random vibration analysis and structural reliability assessment. The short-term rare failure probability of 5-mega-watt wind turbines, for illustrative purposes, in case of given mean wind speeds and turbulence levels is investigated through the scheme of extreme value … distribution instead of any other approximate schemes of fitted distribution currently used in statistical extrapolation techniques. Besides, the comparative studies against the classical fitted distributions and the standard Monte Carlo techniques are carried out. Numerical results indicate that PDEM exhibits...

  7. The Estimation of Probability of Extreme Events for Small Samples

    Science.gov (United States)

    Pisarenko, V. F.; Rodkin, M. V.

    2017-02-01

    The most general approach to the study of rare extreme events is based on extreme value theory. The fundamental Generalized Extreme Value distribution lies at the basis of this theory, serving as the limit distribution for normalized maxima. It depends on three parameters. Usually the method of maximum likelihood (ML) is used for the estimation, which possesses well-known optimal asymptotic properties. However, this method works efficiently only when the sample size is large enough (200-500), whereas in many applications the sample size does not exceed 50-100. For such sizes, the advantage of the ML method in efficiency is not guaranteed. We have found that for this situation the method of statistical moments (SM) works more efficiently than other methods. The details of the estimation for small samples are studied. The SM is applied to the study of extreme earthquakes in three large virtual seismic zones, representing the regime of seismicity in subduction zones, the intracontinental regime of seismicity, and the regime in mid-ocean ridge zones. The 68%-confidence domains for the parameter pairs (ξ, σ) and (σ, μ) are derived.

  8. Estimating deficit probabilities with price-responsive demand in contract-based electricity markets

    Energy Technology Data Exchange (ETDEWEB)

    Galetovic, Alexander [Facultad de Ciencias Economicas y Empresariales, Universidad de los Andes, Santiago (Chile); Munoz, Cristian M. [Departamento de Ingenieria Electrica, Universidad de Chile, Mariano Sanchez Fontecilla 310, piso 3 Las Condes, Santiago (Chile)

    2009-02-15

    Studies that estimate deficit probabilities in hydrothermal systems have generally ignored the response of demand to changing prices, in the belief that such response is largely irrelevant. We show that ignoring the response of demand to prices can lead to substantial over- or underestimation of the probability of an energy deficit. To make our point we present an estimation of deficit probabilities in Chile's Central Interconnected System between 2006 and 2010. This period is characterized by tight supply, fast consumption growth and rising electricity prices. When the response of demand to rising prices is acknowledged, forecasted deficit probabilities and marginal costs are shown to be substantially lower. (author)

  9. Efficient Estimation of first Passage Probability of high-Dimensional Nonlinear Systems

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Bucher, Christian

    2011-01-01

    An efficient method for estimating low first passage probabilities of high-dimensional nonlinear systems based on asymptotic estimation of low probabilities is presented. The method does not require any a priori knowledge of the system, i.e. it is a black-box method, and has very low requirements …, the failure probabilities of three well-known nonlinear systems are estimated. Next, a reduced degree-of-freedom model of a wind turbine is developed and is exposed to a turbulent wind field. The model incorporates very high dimensions and strong nonlinearities simultaneously. The failure probability...

  10. Estimation and asymptotic theory for transition probabilities in Markov Renewal Multi–state models

    NARCIS (Netherlands)

    Spitoni, C.; Verduijn, M.; Putter, H.

    2012-01-01

    In this paper we discuss estimation of transition probabilities for semi-Markov multi-state models. Non-parametric and semi-parametric estimators of the transition probabilities for a large class of models (forward going models) are proposed. Large sample theory is derived using the functional delta

  11. The contribution of threat probability estimates to reexperiencing symptoms: a prospective analog study.

    Science.gov (United States)

    Regambal, Marci J; Alden, Lynn E

    2012-09-01

    Individuals with posttraumatic stress disorder (PTSD) are hypothesized to have a "sense of current threat." Perceived threat from the environment (i.e., external threat) can lead to overestimating the probability of the traumatic event reoccurring (Ehlers & Clark, 2000). However, it is unclear whether external threat judgments are a pre-existing vulnerability for PTSD or a consequence of trauma exposure. We used trauma analog methodology to prospectively measure probability estimates of a traumatic event, and to investigate how these estimates were related to cognitive processes implicated in PTSD development. 151 participants estimated the probability of being in car-accident related situations, watched a movie of a car accident victim, and then completed a measure of data-driven processing during the movie. One week later, participants re-estimated the probabilities, and completed measures of reexperiencing symptoms and symptom appraisals/reactions. Path analysis revealed that higher pre-existing probability estimates predicted greater data-driven processing, which was associated with negative appraisals and responses to intrusions. Furthermore, lower pre-existing probability estimates and negative responses to intrusions were both associated with a greater change in probability estimates. Reexperiencing symptoms were predicted by negative responses to intrusions and, to a lesser degree, by greater changes in probability estimates. The undergraduate student sample may not be representative of the general public. The reexperiencing symptoms are less severe than what would be found in a trauma sample. Threat estimates present both a vulnerability and a consequence of exposure to a distressing event. Furthermore, changes in these estimates are associated with cognitive processes implicated in PTSD. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. Inverse Probability Weighted Generalised Empirical Likelihood Estimators : Firm Size and R&D Revisited

    NARCIS (Netherlands)

    Inkmann, J.

    2005-01-01

    The inverse probability weighted Generalised Empirical Likelihood (IPW-GEL) estimator is proposed for the estimation of the parameters of a vector of possibly non-linear unconditional moment functions in the presence of conditionally independent sample selection or attrition. The estimator is applied

  13. Probability estimation with machine learning methods for dichotomous and multicategory outcome: theory.

    Science.gov (United States)

    Kruppa, Jochen; Liu, Yufeng; Biau, Gérard; Kohler, Michael; König, Inke R; Malley, James D; Ziegler, Andreas

    2014-07-01

    Probability estimation for binary and multicategory outcome using logistic and multinomial logistic regression has a long-standing tradition in biostatistics. However, biases may occur if the model is misspecified. In contrast, outcome probabilities for individuals can be estimated consistently with machine learning approaches, including k-nearest neighbors (k-NN), bagged nearest neighbors (b-NN), random forests (RF), and support vector machines (SVM). Because machine learning methods are rarely used by applied biostatisticians, the primary goal of this paper is to explain the concept of probability estimation with these methods and to summarize recent theoretical findings. Probability estimation in k-NN, b-NN, and RF can be embedded into the class of nonparametric regression learning machines; therefore, we start with the construction of nonparametric regression estimates and review results on consistency and rates of convergence. In SVMs, outcome probabilities for individuals are estimated consistently by repeatedly solving classification problems. For SVMs we review the classification problem and then dichotomous probability estimation. Next we extend the algorithms for estimating probabilities using k-NN, b-NN, and RF to multicategory outcomes and discuss approaches for the multicategory probability estimation problem using SVM. In simulation studies for dichotomous and multicategory dependent variables we demonstrate the general validity of the machine learning methods and compare them with logistic regression. However, each method fails in at least one simulation scenario. We conclude with a discussion of the failures and give recommendations for selecting and tuning the methods. Applications to real data and example code are provided in a companion article (doi:10.1002/bimj.201300077).

  14. Is Risk Aversion Really Correlated with Wealth? How estimated probabilities introduce spurious correlation

    OpenAIRE

    Lybbert, Travis J.; Just, David R

    2006-01-01

    Economists attribute many common behaviors to risk aversion and frequently focus on how wealth moderates risk preferences. This paper highlights a problem associated with empirical tests of the relationship between wealth and risk aversion that can arise when the probabilities individuals face are unobservable to researchers. The common remedy for unobservable probabilities involves estimating the probabilities in a profit or production function that includes farmer, farm and agro-climatic variable...

  15. Probability of an Error in Estimation of States of a Modulated Synchronous Flow of Physical Events

    Science.gov (United States)

    Gortsev, A. M.; Sirotina, M. N.

    2016-11-01

    A flow of physical events (photons, electrons, etc.) is considered. One of the mathematical models of such flows is a modulated synchronous doubly stochastic flow of events. Analytical results for conditional and unconditional probabilities of erroneous decision in optimal estimation of flow states upon the criterion of the a posteriori probability maximum are presented.

  16. Easy probability estimation of the diagnosis of early axial spondyloarthritis by summing up scores.

    Science.gov (United States)

    Feldtkeller, Ernst; Rudwaleit, Martin; Zeidler, Henning

    2013-09-01

    Several sets of criteria for the diagnosis of axial SpA (including non-radiographic axial spondyloarthritis) have been proposed in the literature, in which scores are attributed to relevant findings and the diagnosis requires a minimal sum of these scores. To quantitatively estimate the probability of axial SpA, multiplying the likelihood ratios of all relevant findings was proposed by Rudwaleit et al. in 2004. The objective of our proposal is to combine the advantages of both approaches, i.e., to estimate the probability by summing scores instead of multiplying likelihood ratios. An easy way to estimate the probability of axial spondyloarthritis is to use the logarithms of the likelihood ratios as scores attributed to relevant findings and to use the sum of these scores for the probability estimation. A list of whole-numbered scores for relevant findings is presented, together with the threshold sum values necessary for a definite and for a probable diagnosis of axial SpA, as well as a threshold below which the diagnosis of axial spondyloarthritis can be excluded. In a diagram, the probability of axial spondyloarthritis is given for sum values between these thresholds. The proposed method thus combines the ease of summing scores with a quantitative calculation of the diagnostic probability. It also makes it easier to determine which additional tests are necessary to reach a definite diagnosis.
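    The arithmetic behind the proposal can be illustrated with a short, purely hypothetical example: the likelihood ratios and pretest probability below are invented, not the published values, but they show how summing log-likelihood-ratio scores is equivalent to multiplying likelihood ratios before converting to a post-test probability.

```python
# Illustrative only: the likelihood ratios and the pretest probability below
# are hypothetical, not the published values of Rudwaleit et al. or the
# authors' score table.
import math

pretest_prob = 0.05                               # assumed pretest probability
findings_lr = {"inflammatory back pain": 3.1,     # hypothetical likelihood ratios
               "HLA-B27 positive": 9.0,
               "good response to NSAIDs": 5.1}

# Summing log-likelihood-ratio scores is equivalent to multiplying the
# likelihood ratios themselves.
log_score = sum(math.log10(lr) for lr in findings_lr.values())
combined_lr = 10 ** log_score

pretest_odds = pretest_prob / (1 - pretest_prob)
posttest_odds = pretest_odds * combined_lr
posttest_prob = posttest_odds / (1 + posttest_odds)
print(f"summed log10-LR score: {log_score:.2f}")
print(f"post-test probability of axial SpA: {posttest_prob:.2f}")
```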

  17. Inverse Probability Weighted Generalised Empirical Likelihood Estimators : Firm Size and R&D Revisited

    OpenAIRE

    2005-01-01

    The inverse probability weighted Generalised Empirical Likelihood (IPW-GEL) estimator is proposed for the estimation of the parameters of a vector of possibly non-linear unconditional moment functions in the presence of conditionally independent sample selection or attrition. The estimator is applied to the estimation of the firm size elasticity of product and process R&D expenditures using a panel of German manufacturing firms, which is affected by attrition and selection into R&D activities....

  18. Conditional probability distribution (CPD) method in temperature based death time estimation: Error propagation analysis.

    Science.gov (United States)

    Hubig, Michael; Muggenthaler, Holger; Mall, Gita

    2014-05-01

    Bayesian estimation applied to temperature-based death time estimation was recently introduced as the conditional probability distribution (CPD) method by Biermann and Potente. The CPD method is useful if there is external information that sets the boundaries of the true death time interval (victim last seen alive and found dead). CPD allows computation of probabilities for small time intervals of interest (e.g., no-alibi intervals of suspects) within the large true death time interval. In light of the importance of the CPD for conviction or acquittal of suspects, the present study identifies a potential error source. Deviations in death time estimates will cause errors in the CPD-computed probabilities. We derive formulae to quantify the CPD error as a function of the input error. Moreover, we observed a paradox: in cases in which the small no-alibi time interval is located at the boundary of the true death time interval, adjacent to the erroneous death time estimate, the CPD-computed probabilities for that small no-alibi interval increase with increasing input deviation; otherwise, the CPD-computed probabilities decrease. We therefore advise against using the CPD if there is an indication of an error or a contra-empirical deviation in the death time estimates, especially if the death time estimates fall outside the true death time interval, even if the 95% confidence intervals of the estimates still overlap the true death time interval.

  19. The role of misclassification in estimating proportions and an estimator of misclassification probability

    Science.gov (United States)

    Patrick L. Zimmerman; Greg C. Liknes

    2010-01-01

    Dot grids are often used to estimate the proportion of land cover belonging to some class in an aerial photograph. Interpreter misclassification is an often-ignored source of error in dot-grid sampling that has the potential to significantly bias proportion estimates. For the case when the true class of items is unknown, we present a maximum-likelihood estimator of...

  20. Predictiveness of sonographic fetal weight estimation as a function of prior probability of intrauterine growth retardation.

    Science.gov (United States)

    Simon, N V; Levisky, J S; Shearer, D M; Morris, K C; Hansberry, P A

    1988-06-01

    We evaluated the predictiveness of sonographically estimated fetal weight as a function of the probability of intrauterine growth retardation (IUGR) estimated before obtaining an ultrasound scan (the prior probability). The value of the estimated fetal weight resided more in its high specificity than in its sensitivity, hence in its ability to confirm that the fetus is normal. The predictiveness of the method was further enhanced when the fetal weight estimation was placed in the context of the prior probability of IUGR. In particular, both the positive predictive value of the test and the likelihood of having a growth-retarded infant despite an estimated fetal weight within the normal range were considerably higher as the prior probability of IUGR increased. Since the obstetrician, using all available evidence, is likely to form a rather good estimate of the probability of IUGR before ordering a scan, this improvement in the predictiveness of estimated fetal weight through a Bayesian approach can be advantageously applied to ultrasound analysis and can effectively support clinical decision making.
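    A minimal sketch of the Bayesian reasoning in this record is given below; the sensitivity, specificity, and prior probabilities are assumed values chosen only to show how the positive predictive value of an abnormal estimated fetal weight rises with the prior probability of IUGR.

```python
# Hedged sketch: the sensitivity, specificity and priors are assumed values,
# chosen only to show how the positive predictive value of an abnormal
# estimated fetal weight rises with the prior probability of IUGR.
def positive_predictive_value(prior, sensitivity, specificity):
    """P(IUGR | estimated fetal weight below normal range), via Bayes' rule."""
    true_pos = sensitivity * prior
    false_pos = (1 - specificity) * (1 - prior)
    return true_pos / (true_pos + false_pos)

sens, spec = 0.60, 0.90   # assumed test characteristics, not the study's values
for prior in (0.05, 0.20, 0.50):
    ppv = positive_predictive_value(prior, sens, spec)
    print(f"prior P(IUGR) = {prior:.2f}  ->  PPV = {ppv:.2f}")
```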

  1. Multifractals embedded in short time series: An unbiased estimation of probability moment

    Science.gov (United States)

    Qiu, Lu; Yang, Tianguang; Yin, Yanhua; Gu, Changgui; Yang, Huijie

    2016-12-01

    Accurate estimation of probability moments is the basis for several essential concepts, such as the multifractals, the Tsallis entropy, and the transfer entropy. By means of approximation theory we propose a new method called factorial-moment-based estimation of probability moments. Theoretical predictions and computational results show that it provides an unbiased estimate of probability moments of continuous order. Calculations on a probability redistribution model verify that it can extract multifractal behaviors exactly from several hundred recordings. Its power in monitoring the evolution of scaling behaviors is illustrated with two empirical cases: the gait time series for fast, normal, and slow trials of a healthy volunteer, and the closing price series for the Shanghai stock market. Using short time series of several hundred points, a comparison with well-established tools shows significant performance advantages of the proposed method over the alternatives. The factorial-moment-based estimation can correctly evaluate scaling behaviors over a scale range about three generations wider than multifractal detrended fluctuation analysis and the basic estimation, while the partition-function estimate given by the wavelet transform modulus maxima shows unacceptable fluctuations. Beyond the scaling invariance considered in the present paper, the proposed factorial moment of continuous order has various other uses, such as detecting nonextensive behavior of a complex system and reconstructing the causality network between its elements.

  2. First Passage Probability Estimation of Wind Turbines by Markov Chain Monte Carlo

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.

    2013-01-01

    Markov Chain Monte Carlo simulation has received considerable attention within the past decade as reportedly one of the most powerful techniques for the first passage probability estimation of dynamic systems. A very popular method in this direction capable of estimating the probability of rare events...... with low computation cost is the subset simulation (SS). The idea of the method is to break a rare event into a sequence of more probable events which are easier to estimate based on conditional simulation techniques. Recently, two algorithms have been proposed in order to increase the efficiency...... of the method by modifying the conditional sampler. In this paper, the applicability of the original SS is compared to the recently introduced modifications of the method on a wind turbine model. The model incorporates a PID pitch controller which aims at keeping the rotational speed of the wind turbine rotor equal...

  3. Estimating probability curves of rock variables using orthogonal polynomials and sample moments

    Institute of Scientific and Technical Information of China (English)

    DENG Jian; BIAN Li

    2005-01-01

    A new algorithm using orthogonal polynomials and sample moments was presented for estimating probability curves directly from experimental or field data of rock variables. The moments estimated directly from a sample of observed values of a random variable could be conventional moments (moments about the origin or central moments) or probability-weighted moments (PWMs). Probability curves derived from orthogonal polynomials and conventional moments are probability density functions (PDFs), and probability curves derived from orthogonal polynomials and PWMs are inverse cumulative distribution functions (CDFs) of random variables. The proposed approach is verified with the two most commonly used standard distributions: the normal and the exponential distributions. Examples from observed data of uniaxial compressive strength of a rock and concrete strength data are presented for illustrative purposes. The results show that probability curves of rock variables can be accurately derived from orthogonal polynomials and sample moments. Orthogonal polynomials and PWMs enable more secure inferences to be made from relatively small samples about an underlying probability curve.

  4. The estimated lifetime probability of acquiring human papillomavirus in the United States.

    Science.gov (United States)

    Chesson, Harrell W; Dunne, Eileen F; Hariri, Susan; Markowitz, Lauri E

    2014-11-01

    Estimates of the lifetime probability of acquiring human papillomavirus (HPV) can help to quantify HPV incidence, illustrate how common HPV infection is, and highlight the importance of HPV vaccination. We developed a simple model, based primarily on the distribution of lifetime numbers of sex partners across the population and the per-partnership probability of acquiring HPV, to estimate the lifetime probability of acquiring HPV in the United States in the time frame before HPV vaccine availability. We estimated the average lifetime probability of acquiring HPV among those with at least 1 opposite sex partner to be 84.6% (range, 53.6%-95.0%) for women and 91.3% (range, 69.5%-97.7%) for men. Under base case assumptions, more than 80% of women and men acquire HPV by age 45 years. Our results are consistent with estimates in the existing literature suggesting a high lifetime probability of HPV acquisition and are supported by cohort studies showing high cumulative HPV incidence over a relatively short period, such as 3 to 5 years.
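    The kind of calculation described can be sketched in a few lines; the partner-number distribution and per-partnership transmission probability below are hypothetical placeholders, not the study's inputs.

```python
# A minimal sketch of this kind of model (not the authors' code): the partner
# distribution and per-partnership probability below are hypothetical.
partner_dist = {1: 0.25, 2: 0.15, 4: 0.25, 8: 0.20, 15: 0.15}  # lifetime partners: share
p_per_partner = 0.60                                           # assumed transmission probability

# P(acquire HPV | n partners) = 1 - (1 - p)^n, averaged over the distribution.
lifetime_prob = sum(share * (1 - (1 - p_per_partner) ** n)
                    for n, share in partner_dist.items())
print(f"estimated lifetime probability of acquiring HPV: {lifetime_prob:.3f}")
```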

  5. Impaired probability estimation and decision-making in pathological gambling poker players.

    Science.gov (United States)

    Linnet, Jakob; Frøslev, Mette; Ramsgaard, Stine; Gebauer, Line; Mouridsen, Kim; Wohlert, Victoria

    2012-03-01

    Poker has gained tremendous popularity in recent years, increasing the risk for some individuals to develop pathological gambling. Here, we investigated cognitive biases in a computerized two-player poker task against a fictive opponent, among 12 pathological gambling poker players (PGP), 10 experienced poker players (ExP), and 11 inexperienced poker players (InP). Players were compared on probability estimation and decision-making with the hypothesis that ExP would have significantly lower cognitive biases than PGP and InP, and that the groups could be differentiated based on their cognitive bias styles. The results showed that ExP had a significantly lower average error margin in probability estimation than PGP and InP, and that PGP played hands with lower winning probability than ExP. Binomial logistic regression showed perfect differentiation (100%) between ExP and PGP, and 90.5% classification accuracy between ExP and InP. Multinomial logistic regression showed an overall classification accuracy of 23 out of 33 (69.7%) between the three groups. The classification accuracy of ExP was higher than that of PGP and InP due to the similarities in probability estimation and decision-making between PGP and InP. These impairments in probability estimation and decision-making of PGP may have implications for assessment and treatment of cognitive biases in pathological gambling poker players.

  6. Allelic drop-out probabilities estimated by logistic regression--further considerations and practical implementation.

    Science.gov (United States)

    Tvedebrink, Torben; Eriksen, Poul Svante; Asplund, Maria; Mogensen, Helle Smidt; Morling, Niels

    2012-03-01

    We discuss the model for estimating drop-out probabilities presented by Tvedebrink et al. [7] and the concerns that have been raised. The criticism has demonstrated that the model is not perfect; however, the model is very useful for advanced forensic genetic work where allelic drop-out occurs. With this discussion, we hope to improve the drop-out model so that it can be used in practical forensic genetics, and to stimulate further discussion. We also discuss how to estimate drop-out probabilities when using a varying number of PCR cycles and other experimental conditions.
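    In the spirit of such models (though not the published model itself), drop-out can be coded as a binary outcome and regressed on a proxy for the amount of DNA, for example the logarithm of a peak height; the sketch below uses statsmodels on simulated data with an assumed logistic relationship.

```python
# A sketch in the spirit of such models, not the published model: simulated
# data with an assumed logistic relationship between drop-out and the log of
# a peak-height proxy for the amount of DNA.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
log_peak_height = rng.uniform(np.log(50), np.log(5000), size=500)
true_prob = 1 / (1 + np.exp(-(8.0 - 1.5 * log_peak_height)))  # assumed relation
dropout = rng.binomial(1, true_prob)

X = sm.add_constant(log_peak_height)
fit = sm.Logit(dropout, X).fit(disp=False)
print(fit.params)                                  # intercept and slope (logit scale)
print(fit.predict(np.array([[1.0, np.log(200)]]))) # estimated P(drop-out) at peak height 200
```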

  7. Time of Arrival Estimation in Probability-Controlled Generalized CDMA Systems

    Directory of Open Access Journals (Sweden)

    Hagit Messer

    2007-11-01

    In recent years, more and more wireless communications systems are also required to provide a positioning measurement. In code division multiple access (CDMA) communication systems, the positioning accuracy is significantly degraded by the multiple access interference (MAI) caused by other users in the system. This MAI is commonly managed by a power control mechanism, and yet MAI has a major effect on positioning accuracy. Probability control is a recently introduced interference management mechanism. In this mechanism, a user with excess power chooses not to transmit some of its symbols. The information in the nontransmitted symbols is recovered by an error-correcting code (ECC), while all other users receive more reliable data during these quiet periods. Previous research has shown that the implementation of a probability control mechanism can significantly reduce the MAI. In this paper, we show that probability control also improves the positioning accuracy. We focus on time-of-arrival (TOA) based positioning systems. We analyze the TOA estimation performance in a generalized CDMA system in which the probability control mechanism is employed, where the transmitted signal is noncontinuous with a symbol transmission probability smaller than 1. The accuracy of the TOA estimation is determined using appropriate modifications of the Cramer-Rao bound on the delay estimation. Keeping the average transmission power constant, we show that the TOA accuracy of each user does not depend on its transmission probability, while being a nondecreasing function of the transmission probability of any other user. Therefore, a generalized, noncontinuous CDMA system with a probability control mechanism can always achieve better positioning performance, for all users in the network, than a conventional, continuous CDMA system.

  8. Estimating stage-specific daily survival probabilities of nests when nest age is unknown

    Science.gov (United States)

    Stanley, T.R.

    2004-01-01

    Estimation of daily survival probabilities of nests is common in studies of avian populations. Since the introduction of Mayfield's (1961, 1975) estimator, numerous models have been developed to relax Mayfield's assumptions and account for biologically important sources of variation. Stanley (2000) presented a model for estimating stage-specific (e.g., incubation stage, nestling stage) daily survival probabilities of nests that conditions on “nest type” and requires that nests be aged when they are found. Because aging nests typically requires handling the eggs, there may be situations where nests cannot or should not be aged and the Stanley (2000) model will be inapplicable. Here, I present a model for estimating stage-specific daily survival probabilities that conditions on nest stage for active nests, thereby obviating the need to age nests when they are found. Specifically, I derive the maximum likelihood function for the model, evaluate the model's performance using Monte Carlo simulations, and provide software for estimating parameters (along with an example). For sample sizes as low as 50 nests, bias was small and confidence interval coverage was close to the nominal rate, especially when a reduced-parameter model was used for estimation.
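    For background, the classical Mayfield estimator that these models generalize can be written in a few lines; the failure counts and exposure days below are made up for illustration.

```python
# Background sketch: the classical Mayfield estimator of daily nest survival,
# which the models discussed above generalise.  The counts are made up.
def mayfield_daily_survival(failures, exposure_days):
    """Daily survival rate = 1 - (failed nests / total nest-days of exposure)."""
    return 1.0 - failures / exposure_days

dsr = mayfield_daily_survival(failures=12, exposure_days=480.0)
print(f"daily survival rate: {dsr:.3f}")
print(f"probability of surviving a 24-day nesting period: {dsr ** 24:.3f}")
```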

  9. Allelic drop-out probabilities estimated by logistic regression--Further considerations and practical implementation

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Asplund, Maria

    2012-01-01

    We discuss the model for estimating drop-out probabilities presented by Tvedebrink et al. [7] and the concerns, that have been raised. The criticism of the model has demonstrated that the model is not perfect. However, the model is very useful for advanced forensic genetic work, where allelic dro...

  10. Fast estimation of false alarm probabilities of STAP detectors - the AMF

    NARCIS (Netherlands)

    Srinivasan, Rajan; Rangaswamy, Muralidhar

    2005-01-01

    This paper describes an attempt to harness the power of adaptive importance sampling techniques for estimating false alarm probabilities of detectors that use space-time adaptive processing. Fast simulation using these techniques has been notably successful in the study of conventional constant fal

  11. Probability of Error in Estimating States of a Flow of Physical Events

    Science.gov (United States)

    Gortsev, A. M.; Solov'ev, A. A.

    2016-09-01

    A flow of physical events (photons, electrons, etc.) is considered. One of the mathematical models of such flows is the MAP flow of events. Analytical results for conditional and unconditional probabilities of erroneous decision in optimal estimation of states of the MAP flow of events are presented.

  12. Wildland fire probabilities estimated from weather model-deduced monthly mean fire danger indices

    Science.gov (United States)

    Haiganoush K. Preisler; Shyh-Chin Chen; Francis Fujioka; John W. Benoit; Anthony L. Westerling

    2008-01-01

    The National Fire Danger Rating System indices deduced from a regional simulation weather model were used to estimate probabilities and numbers of large fire events on monthly and 1-degree grid scales. The weather model simulations and forecasts are ongoing experimental products from the Experimental Climate Prediction Center at the Scripps Institution of Oceanography...

  13. Hitchhikers on trade routes: A phenology model estimates the probabilities of gypsy moth introduction and establishment.

    Science.gov (United States)

    Gray, David R

    2010-12-01

    As global trade increases so too does the probability of introduction of alien species to new locations. Estimating the probability of an alien species introduction and establishment following introduction is a necessary step in risk estimation (probability of an event times the consequences, in the currency of choice, of the event should it occur); risk estimation is a valuable tool for reducing the risk of biological invasion with limited resources. The Asian gypsy moth, Lymantria dispar (L.), is a pest species whose consequence of introduction and establishment in North America and New Zealand warrants over US$2 million per year in surveillance expenditure. This work describes the development of a two-dimensional phenology model (GLS-2d) that simulates insect development from source to destination and estimates: (1) the probability of introduction from the proportion of the source population that would achieve the next developmental stage at the destination and (2) the probability of establishment from the proportion of the introduced population that survives until a stable life cycle is reached at the destination. The effect of shipping schedule on the probabilities of introduction and establishment was examined by varying the departure date from 1 January to 25 December by weekly increments. The effect of port efficiency was examined by varying the length of time that invasion vectors (shipping containers and ship) were available for infection. The application of GLS-2d is demonstrated using three common marine trade routes (to Auckland, New Zealand, from Kobe, Japan, and to Vancouver, Canada, from Kobe and from Vladivostok, Russia).

  14. Simple indicator kriging for estimating the probability of incorrectly delineating hazardous areas in a contaminated site

    Energy Technology Data Exchange (ETDEWEB)

    Juang, K.W.; Lee, D.Y. [National Taiwan Univ., Taipei (Taiwan, Province of China). Graduate Inst. of Agricultural Chemistry

    1998-09-01

    The probability of incorrectly delineating hazardous areas in a contaminated site is very important for decision-makers because it indicates the magnitude of confidence that decision-makers have in determining areas in need of remediation. In this study, simple indicator kriging (SIK) was used to estimate the probability of incorrectly delineating hazardous areas in a heavy metal-contaminated site, which is located at Taoyuan, Taiwan, and is about 10 ha in area. In the procedure, the values 0 and 1 were assigned to be the stationary means of the indicator codes in the SIK model to represent two hypotheses, hazardous and safe, respectively. The spatial distribution of the conditional probability of heavy metal concentrations lower than a threshold, given each hypothesis, was estimated using SIK. Then, the probabilities of false positives (α) (i.e., the probability of declaring a location hazardous when it is not) and false negatives (β) (i.e., the probability of declaring a location safe when it is not) in delineating hazardous areas for the heavy metal-contaminated site could be obtained. The spatial distribution of the probabilities of false positives and false negatives could help in delineating hazardous areas based on a tolerable probability level of incorrect delineation. In addition, delineation complicated by the cost of remediation, hazards in the environment, and hazards to human health could be made based on the minimum values of α and β. The results suggest that the proposed SIK procedure is useful for decision-makers who need to delineate hazardous areas in a heavy metal-contaminated site.

  15. Estimating the Probability of Elevated Nitrate Concentrations in Ground Water in Washington State

    Science.gov (United States)

    Frans, Lonna M.

    2008-01-01

    Logistic regression was used to relate anthropogenic (manmade) and natural variables to the occurrence of elevated nitrate concentrations in ground water in Washington State. Variables that were analyzed included well depth, ground-water recharge rate, precipitation, population density, fertilizer application amounts, soil characteristics, hydrogeomorphic regions, and land-use types. Two models were developed: one with and one without the hydrogeomorphic regions variable. The variables in both models that best explained the occurrence of elevated nitrate concentrations (defined as concentrations of nitrite plus nitrate as nitrogen greater than 2 milligrams per liter) were the percentage of agricultural land use in a 4-kilometer radius of a well, population density, precipitation, soil drainage class, and well depth. Based on the relations between these variables and measured nitrate concentrations, logistic regression models were developed to estimate the probability of nitrate concentrations in ground water exceeding 2 milligrams per liter. Maps of Washington State were produced that illustrate these estimated probabilities for wells drilled to 145 feet below land surface (median well depth) and the estimated depth to which wells would need to be drilled to have a 90-percent probability of drawing water with a nitrate concentration less than 2 milligrams per liter. Maps showing the estimated probability of elevated nitrate concentrations indicated that the agricultural regions are most at risk followed by urban areas. The estimated depths to which wells would need to be drilled to have a 90-percent probability of obtaining water with nitrate concentrations less than 2 milligrams per liter exceeded 1,000 feet in the agricultural regions; whereas, wells in urban areas generally would need to be drilled to depths in excess of 400 feet.

  16. Estimates of annual survival probabilities for adult Florida manatees (Trichechus manatus latirostris)

    Science.gov (United States)

    Langtimm, C.A.; O'Shea, T.J.; Pradel, R.; Beck, C.A.

    1998-01-01

    The population dynamics of large, long-lived mammals are particularly sensitive to changes in adult survival. Understanding factors affecting survival patterns is therefore critical for developing and testing theories of population dynamics and for developing management strategies aimed at preventing declines or extinction in such taxa. Few studies have used modern analytical approaches for analyzing variation and testing hypotheses about survival probabilities in large mammals. This paper reports a detailed analysis of annual adult survival in the Florida manatee (Trichechus manatus latirostris), an endangered marine mammal, based on a mark-recapture approach. Natural and boat-inflicted scars distinctively 'marked' individual manatees that were cataloged in a computer-based photographic system. Photo-documented resightings provided 'recaptures.' Using open population models, annual adult-survival probabilities were estimated for manatees observed in winter in three areas of Florida: Blue Spring, Crystal River, and the Atlantic coast. After using goodness-of-fit tests in Program RELEASE to search for violations of the assumptions of mark-recapture analysis, survival and sighting probabilities were modeled under several different biological hypotheses with Program SURGE. Estimates of mean annual probability of sighting varied from 0.948 for Blue Spring to 0.737 for Crystal River and 0.507 for the Atlantic coast. At Crystal River and Blue Spring, annual survival probabilities were best estimated as constant over the study period at 0.96 (95% CI = 0.951-0.975 and 0.900-0.985, respectively). On the Atlantic coast, where manatees are impacted more by human activities, annual survival probabilities had a significantly lower mean estimate of 0.91 (95% CI = 0.887-0.926) and varied unpredictably over the study period. For each study area, survival did not differ between sexes and was independent of relative adult age. The high constant adult-survival probabilities estimated

  17. Estimation of failure probabilities of linear dynamic systems by importance sampling

    Indian Academy of Sciences (India)

    Anna Ivanova Olsen; Arvid Naess

    2006-08-01

    An iterative method for estimating the failure probability for certain time-variant reliability problems has been developed. In the paper, the focus is on the displacement response of a linear oscillator driven by white noise. Failure is then assumed to occur when the displacement response exceeds a critical threshold. The iteration procedure is a two-step method. On the first iteration, a simple control function promoting failure is constructed using the design point weighting principle. After time discretization, two points are chosen to construct a compound deterministic control function. It is based on the time point when the first maximum of the homogeneous solution has occurred and on the point at the end of the considered time interval. An importance sampling technique is used in order to estimate the failure probability functional on a set of initial values of state space variables and time. On the second iteration, the concept of optimal control function can be implemented to construct a Markov control which allows much better accuracy in the failure probability estimate than the simple control function. On both iterations, the concept of changing the probability measure by the Girsanov transformation is utilized. As a result, the CPU time is substantially reduced compared with the crude Monte Carlo procedure.
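    The variance-reduction idea behind importance sampling can be illustrated with a toy example that is much simpler than the paper's Girsanov-based construction: estimate a small Gaussian tail probability by sampling from a density shifted toward the failure region and reweighting.

```python
# Toy illustration of importance sampling for a small tail probability (not
# the paper's Girsanov-based construction): P(X > b) for X ~ N(0, 1).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
b, n = 4.0, 100_000

# Crude Monte Carlo: almost no samples fall in the failure region.
x = rng.standard_normal(n)
p_crude = np.mean(x > b)

# Importance sampling with proposal N(b, 1); weight = phi(y) / phi(y - b).
y = rng.standard_normal(n) + b
weights = np.exp(-b * y + 0.5 * b ** 2)
p_is = np.mean((y > b) * weights)

print(f"crude Monte Carlo     : {p_crude:.2e}")
print(f"importance sampling   : {p_is:.2e}")
print(f"exact tail probability: {norm.sf(b):.2e}")
```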

  18. Efficient Estimation of first Passage Probability of high-Dimensional Nonlinear Systems

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Bucher, Christian

    2011-01-01

    on the system memory. Consequently, high-dimensional problems can be handled, and nonlinearities in the model neither bring any difficulty in applying it nor lead to considerable reduction of its efficiency. These characteristics suggest that the method is a powerful candidate for complicated problems. First......, the failure probabilities of three well-known nonlinear systems are estimated. Next, a reduced degree-of-freedom model of a wind turbine is developed and is exposed to a turbulent wind field. The model incorporates very high dimensions and strong nonlinearities simultaneously. The failure probability...

  19. Estimating Super Heavy Element Event Random Probabilities Using Monte Carlo Methods

    Science.gov (United States)

    Stoyer, Mark; Henderson, Roger; Kenneally, Jacqueline; Moody, Kenton; Nelson, Sarah; Shaughnessy, Dawn; Wilk, Philip

    2009-10-01

    Because superheavy element (SHE) experiments involve very low event rates and low statistics, estimating the probability that a given event sequence is due to random events is extremely important in judging the validity of the data. A Monte Carlo method developed at LLNL [1] is used on recent SHE experimental data to calculate random event probabilities. Current SHE experimental activities in collaboration with scientists at Dubna, Russia will be discussed. [1] N.J. Stoyer et al., Nucl. Instrum. Methods Phys. Res. A 455 (2000) 433.

  20. Estimating the Upper Limit of Lifetime Probability Distribution, Based on Data of Japanese Centenarians.

    Science.gov (United States)

    Hanayama, Nobutane; Sibuya, Masaaki

    2016-08-01

    In modern biology, theories of aging fall mainly into two groups: damage theories and programmed theories. If programmed theories are true, the probability that human beings live beyond a specific age will be zero. In contrast, if damage theories are true, such an age does not exist, and any longevity record will eventually be broken. In this article, to examine which is the case, a special type of binomial model based on the generalized Pareto distribution is applied to data on Japanese centenarians. From the results, the upper limit of the lifetime probability distribution in the Japanese population is estimated to be 123 years.
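    A hedged sketch of this type of analysis is shown below using synthetic exceedances rather than the centenarian records: a generalized Pareto distribution is fitted to lifetimes above a threshold, and a negative fitted shape parameter implies a finite upper endpoint at threshold - scale/shape.

```python
# Hedged sketch on synthetic data (not the centenarian records): fit a
# generalised Pareto distribution to exceedances over a threshold; a negative
# shape parameter implies a finite upper endpoint at threshold - scale/shape.
from scipy.stats import genpareto

threshold = 100.0   # e.g. exceedances are ages beyond 100 years
# Simulate exceedances from a GPD with shape -0.2 (true endpoint at 115 years).
exceedances = genpareto.rvs(c=-0.2, scale=3.0, size=500, random_state=42)

shape, loc, scale = genpareto.fit(exceedances, floc=0.0)
if shape < 0:
    print(f"estimated upper limit of lifetime: {threshold - scale / shape:.1f} years")
else:
    print("fitted shape >= 0: no finite upper limit implied")
```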

  1. A fast algorithm for estimating transmission probabilities in QTL detection designs with dense maps

    Directory of Open Access Journals (Sweden)

    Gilbert Hélène

    2009-11-01

    Background: In the case of an autosomal locus, four transmission events from the parents to a progeny are possible, specified by the grandparental origin of the alleles inherited by this individual. Computing the probabilities of these transmission events is essential for QTL detection methods. Results: A fast algorithm for the estimation of these probabilities conditional on parental phases has been developed. It is adapted to classical QTL detection designs applied to outbred populations, in particular designs composed of half- and/or full-sib families. It assumes the absence of interference. Conclusion: The theory is fully developed and an example is given.

  2. Inverse probability of censoring weighted estimates of Kendall's τ for gap time analyses.

    Science.gov (United States)

    Lakhal-Chaieb, Lajmi; Cook, Richard J; Lin, Xihong

    2010-12-01

    In life history studies, interest often lies in the analysis of the interevent, or gap, times and the association between event times. Gap time analyses are challenging, however, even when the length of follow-up is determined independently of the event process, because associations between gap times induce dependent censoring for second and subsequent gap times. This article discusses nonparametric estimation of the association between consecutive gap times based on Kendall's τ in the presence of this type of dependent censoring. A nonparametric estimator that uses inverse probability of censoring weights is provided. Estimates of conditional gap time distributions can be obtained following specification of a particular copula function. Simulation studies show the estimator performs well and compares favorably with an alternative estimator. Generalizations to a piecewise constant Clayton copula are given. Several simulation studies and illustrations with real data sets are also provided.
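    The generic inverse-probability-of-censoring-weighting (IPCW) idea, though not the authors' τ estimator itself, can be sketched as follows: estimate the censoring survival function with Kaplan-Meier (by flipping the event indicator) and weight each uncensored observation by its inverse.

```python
# Sketch of the generic IPCW idea only, not the authors' estimator of
# Kendall's tau: weight each uncensored observation by 1/G(T), where G is the
# Kaplan-Meier estimate of the censoring survival function.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(3)
event_time = rng.exponential(10.0, size=200)
censor_time = rng.exponential(15.0, size=200)
time = np.minimum(event_time, censor_time)
observed = (event_time <= censor_time).astype(int)

# Kaplan-Meier for the *censoring* distribution: flip the event indicator.
km_cens = KaplanMeierFitter().fit(time, event_observed=1 - observed)
G = km_cens.survival_function_at_times(time).to_numpy()

weights = np.where(observed == 1, 1.0 / np.clip(G, 1e-8, None), 0.0)
print("mean IPC weight among uncensored subjects:",
      round(weights[observed == 1].mean(), 3))
```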

  3. Estimation of nonuniform quantal parameters with multiple-probability fluctuation analysis: theory, application and limitations.

    Science.gov (United States)

    Silver, R Angus

    2003-12-15

    Synapses are a key determinant of information processing in the central nervous system. Investigation of the mechanisms underlying synaptic transmission at central synapses is complicated by the inaccessibility of synaptic contacts and the fact that their temporal dynamics are governed by multiple parameters. Multiple-probability fluctuation analysis (MPFA) is a recently developed method for estimating quantal parameters from the variance and mean amplitude of evoked steady-state synaptic responses recorded under a range of release probability conditions. This article describes the theoretical basis and the underlying assumptions of MPFA, illustrating how a simplified multinomial model can be used to estimate mean quantal parameters at synapses where quantal size and release probability are nonuniform. Interpretations of the quantal parameter estimates are discussed in relation to uniquantal and multiquantal models of transmission. Practical aspects of this method are illustrated including a new method for estimating quantal size and variability, approaches for optimising data collection, error analysis and a method for identifying multivesicular release. The advantages and limitations of investigating synaptic function with MPFA are explored and contrasted with those for traditional quantal analysis and more recent optical quantal analysis methods.
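    Under the simplest multinomial assumptions (uniform release probability, negligible quantal variance), the MPFA variance-mean relationship is Var = q*I - I^2/N, which can be fitted by least squares; the amplitudes and variances below are invented for illustration.

```python
# Minimal MPFA-style sketch under the simplest assumptions (uniform release
# probability, no quantal variance): Var = q*I - I**2/N.  The amplitudes and
# variances below are invented for illustration.
import numpy as np

mean_amp = np.array([ 20.,  60., 110., 160., 200.])   # mean response amplitude (pA)
variance = np.array([280., 650., 850., 700., 340.])   # variance (pA^2)

# Least-squares fit of Var = a*I + b*I**2 with a = q and b = -1/N.
A = np.column_stack([mean_amp, mean_amp ** 2])
(a, b), *_ = np.linalg.lstsq(A, variance, rcond=None)
q_hat, N_hat = a, -1.0 / b
print(f"estimated quantal size q ~ {q_hat:.1f} pA, release sites N ~ {N_hat:.1f}")
```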

  4. The effect of coupling hydrologic and hydrodynamic models on probable maximum flood estimation

    Science.gov (United States)

    Felder, Guido; Zischg, Andreas; Weingartner, Rolf

    2017-07-01

    Deterministic rainfall-runoff modelling usually assumes a stationary hydrological system, as model parameters are calibrated with, and are therefore dependent on, observed data. However, runoff processes are probably not stationary in the case of a probable maximum flood (PMF), where discharge greatly exceeds observed flood peaks. Developing hydrodynamic models and using them to build coupled hydrologic-hydrodynamic models can potentially improve the plausibility of PMF estimations. This study aims to assess the potential benefits and constraints of coupled modelling compared to standard deterministic hydrologic modelling when it comes to PMF estimation. The two modelling approaches are applied using a set of 100 spatio-temporal probable maximum precipitation (PMP) distribution scenarios. The resulting hydrographs, the resulting peak discharges, as well as the reliability and the plausibility of the estimates are evaluated. The discussion of the results shows that coupling hydrologic and hydrodynamic models substantially improves the physical plausibility of PMF modelling, although both modelling approaches lead to PMF estimations for the catchment outlet that fall within a similar range. Using a coupled model is particularly suggested in cases where considerable flood-prone areas are situated within a catchment.

  5. Errors in the estimation of the variance: implications for multiple-probability fluctuation analysis.

    Science.gov (United States)

    Saviane, Chiara; Silver, R Angus

    2006-06-15

    Synapses play a crucial role in information processing in the brain. Amplitude fluctuations of synaptic responses can be used to extract information about the mechanisms underlying synaptic transmission and its modulation. In particular, multiple-probability fluctuation analysis can be used to estimate the number of functional release sites, the mean probability of release and the amplitude of the mean quantal response from fits of the relationship between the variance and mean amplitude of postsynaptic responses, recorded at different probabilities. To determine these quantal parameters, calculate their uncertainties and the goodness-of-fit of the model, it is important to weight the contribution of each data point in the fitting procedure. We therefore investigated the errors associated with measuring the variance by determining the best estimators of the variance of the variance and have used simulations of synaptic transmission to test their accuracy and reliability under different experimental conditions. For central synapses, which generally have a low number of release sites, the amplitude distribution of synaptic responses is not normal, thus the use of a theoretical variance of the variance based on the normal assumption is not a good approximation. However, appropriate estimators can be derived for the population and for limited sample sizes using a more general expression that involves higher moments and introducing unbiased estimators based on the h-statistics. Our results are likely to be relevant for various applications of fluctuation analysis when few channels or release sites are present.

  6. Estimation of submarine mass failure probability from a sequence of deposits with age dates

    Science.gov (United States)

    Geist, Eric L.; Chaytor, Jason D.; Parsons, Thomas E.; ten Brink, Uri S.

    2013-01-01

    The empirical probability of submarine mass failure is quantified from a sequence of dated mass-transport deposits. Several different techniques are described to estimate the parameters for a suite of candidate probability models. The techniques, previously developed for analyzing paleoseismic data, include maximum likelihood and Type II (Bayesian) maximum likelihood methods derived from renewal process theory and Monte Carlo methods. The estimated mean return time from these methods, unlike estimates from a simple arithmetic mean of the center age dates and standard likelihood methods, includes the effects of age-dating uncertainty and of open time intervals before the first and after the last event. The likelihood techniques are evaluated using Akaike’s Information Criterion (AIC) and Akaike’s Bayesian Information Criterion (ABIC) to select the optimal model. The techniques are applied to mass transport deposits recorded in two Integrated Ocean Drilling Program (IODP) drill sites located in the Ursa Basin, northern Gulf of Mexico. Dates of the deposits were constrained by regional bio- and magnetostratigraphy from a previous study. Results of the analysis indicate that submarine mass failures in this location occur primarily according to a Poisson process in which failures are independent and return times follow an exponential distribution. However, some of the model results suggest that submarine mass failures may occur quasiperiodically at one of the sites (U1324). The suite of techniques described in this study provides quantitative probability estimates of submarine mass failure occurrence, for any number of deposits and age uncertainty distributions.
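    In the simplest Poisson case, and ignoring the age-dating uncertainty that the paper treats explicitly, the mean return time follows directly from the number of dated deposits and the length of the record; the ages below are hypothetical.

```python
# Simplest-case sketch only: a Poisson-process estimate of the mean return
# time that ignores the age-dating uncertainty treated in the paper.  The
# deposit ages and record length are hypothetical.
import numpy as np

deposit_ages_ka = np.array([2.1, 7.8, 11.4, 16.0, 24.3, 31.9])  # ages of dated deposits (ka)
record_length_ka = 40.0                                         # assumed length of record (ka)

# MLE of the Poisson rate over a fixed observation window.
rate = deposit_ages_ka.size / record_length_ka
mean_return_time = 1.0 / rate
prob_next_1ka = 1.0 - np.exp(-rate * 1.0)   # P(at least one failure in the next 1 ka)
print(f"mean return time ~ {mean_return_time:.1f} ka")
print(f"P(event within 1 ka) ~ {prob_next_1ka:.2f}")
```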

  7. Annotated corpus and the empirical evaluation of probability estimates of grammatical forms

    Directory of Open Access Journals (Sweden)

    Ševa Nada

    2003-01-01

    The aim of the present study is to demonstrate the usage of an annotated corpus in the field of experimental psycholinguistics. Specifically, we demonstrate how the manually annotated Corpus of Serbian Language (Kostić, Đ. 2001) can be used for probability estimates of grammatical forms, which allow the control of independent variables in psycholinguistic experiments. We address the issue of processing Serbian inflected forms within two subparadigms of feminine nouns. In regression analysis, almost all processing variability of inflected forms has been accounted for by the amount of information (i.e., bits) carried by the presented forms. In spite of the fact that probability distributions of inflected forms for the two paradigms differ, it was shown that the best prediction of processing variability is obtained by the probabilities derived from the predominant subparadigm, which encompasses about 80% of feminine nouns. The relevance of annotated corpora in experimental psycholinguistics is discussed in more detail.

  8. Estimates for the Finite-time Ruin Probability with Insurance and Financial Risks

    Institute of Scientific and Technical Information of China (English)

    Min ZHOU; Kai-yong WANG; Yue-bao WANG

    2012-01-01

    The paper gives estimates for the finite-time ruin probability with insurance and financial risks. When the distribution of the insurance risk belongs to the class L(γ) for some γ > 0 or to the subexponential distribution class, we obtain asymptotic equivalence relationships for the finite-time ruin probability. When the distribution of the insurance risk belongs to the dominated varying-tailed distribution class, we obtain asymptotic upper and lower bounds for the finite-time ruin probability; for the upper bound we completely remove the restriction of mutual independence on insurance risks, and for the lower bound we only require the insurance risks to have a weak positive association structure. The obtained results extend and improve some existing results.

  9. A simple method for realistic estimation of the most probable energy loss in thin gas layers

    Science.gov (United States)

    Grishin, V. M.; Merson, G. I.

    1989-01-01

    A simple method for estimating the relativistic rise of the most probable ionisation loss in thin gas layers is suggested. The method is based on the similarity between the most probable and the restricted energy loss of relativistic charged particles in matter. This makes it possible to correct the Landau-Sternheimer theory by taking into account the fact that particle collisions with internal atomic electrons do not influence the most probable value of the ionisation loss. Effective values of the charge number and average ionisation potential, which are simple to calculate, are used for this correction. A similarity of the energy loss distributions for various gases and gas layers is found; it is expressed in a constant fraction of the ionisation loss distribution tail area (approximately 1:3.5), which is the value used to correct the Landau-Sternheimer formula.

  10. Estimating migratory connectivity of birds when re-encounter probabilities are heterogeneous

    Science.gov (United States)

    Cohen, Emily B.; Hostelter, Jeffrey A.; Royle, J. Andrew; Marra, Peter P.

    2014-01-01

    Understanding the biology and conducting effective conservation of migratory species requires an understanding of migratory connectivity – the geographic linkages of populations between stages of the annual cycle. Unfortunately, for most species, we are lacking such information. The North American Bird Banding Laboratory (BBL) houses an extensive database of marking, recaptures and recoveries, and such data could provide migratory connectivity information for many species. To date, however, few species have been analyzed for migratory connectivity largely because heterogeneous re-encounter probabilities make interpretation problematic. We accounted for regional variation in re-encounter probabilities by borrowing information across species and by using effort covariates on recapture and recovery probabilities in a multistate capture–recapture and recovery model. The effort covariates were derived from recaptures and recoveries of species within the same regions. We estimated the migratory connectivity for three tern species breeding in North America and over-wintering in the tropics, common (Sterna hirundo), roseate (Sterna dougallii), and Caspian terns (Hydroprogne caspia). For western breeding terns, model-derived estimates of migratory connectivity differed considerably from those derived directly from the proportions of re-encounters. Conversely, for eastern breeding terns, estimates were merely refined by the inclusion of re-encounter probabilities. In general, eastern breeding terns were strongly connected to eastern South America, and western breeding terns were strongly linked to the more western parts of the nonbreeding range under both models. Through simulation, we found this approach is likely useful for many species in the BBL database, although precision improved with higher re-encounter probabilities and stronger migratory connectivity. We describe an approach to deal with the inherent biases in BBL banding and re-encounter data to demonstrate

  11. A new model to estimate prognosis in patients with hepatocellular carcinoma after Yttrium-90 radioembolization.

    Directory of Open Access Journals (Sweden)

    Zhihong Weng

    AIMS: The current prognostic model to estimate the survival in hepatocellular carcinoma (HCC) patients treated with transarterial hepatic selective internal radiotherapy (SIRT) is not fully characterized. The aim of this study was to establish a new scoring model including assessment of both tumor responses and therapy-induced systemic changes in HCC patients to predict survival at an early time point post-SIRT. METHODS AND MATERIALS: Between 2008 and 2012, 149 HCC patients treated with SIRT were included into this study. CT images and biomarkers in blood tested at one month post-SIRT were analyzed and correlated with clinical outcome. Tumor responses were assessed by RECIST 1.1, mRECIST, and Choi criteria. Kaplan-Meier methods were used to estimate survival curves. Cox regression was used in uni- and multivariable survival analyses and in the establishment of a prognostic model. RESULTS: A multivariate proportional hazards model was created based on the tumor response, the number of tumor nodules, the score of the model for end stage liver disease (MELD), and the serum C-reactive protein levels, which were independent predictors of survival in HCC patients at one month post-SIRT. This prognostic model accurately differentiated the outcome of patients with different risk scores in this cohort (P<0.001). The model also had the ability to assign a predicted survival probability for individual patients. CONCLUSIONS: A new model to predict survival of HCC patients mainly based on tumor responses and therapy-induced systemic changes provides reliable prognosis and accurately discriminates the survival at an early time point after SIRT in these patients.
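    The survival machinery used in this record (Kaplan-Meier curves by risk group and a Cox proportional-hazards model) can be sketched with the lifelines package on simulated data; the variables and effect sizes below are invented and do not reproduce the study's model.

```python
# Sketch with simulated data (not the study's patients): Kaplan-Meier curves
# by risk group and a Cox proportional-hazards model, as used in the record.
# Variable names and effect sizes are invented.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(7)
n = 200
risk_score = rng.integers(0, 3, size=n)                # hypothetical risk groups 0/1/2
true_time = rng.exponential(24.0 / (1 + risk_score))   # months; higher risk, shorter survival
censor = rng.uniform(6.0, 36.0, size=n)
df = pd.DataFrame({"time": np.minimum(true_time, censor),
                   "event": (true_time <= censor).astype(int),
                   "risk_score": risk_score})

# Kaplan-Meier estimate per risk group.
for group, sub in df.groupby("risk_score"):
    km = KaplanMeierFitter().fit(sub["time"], sub["event"], label=f"risk {group}")
    print(f"risk {group}: median survival {km.median_survival_time_:.1f} months")

# Cox model: hazard ratio per unit increase in risk score.
cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
print(cph.summary[["exp(coef)", "p"]])
```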

  12. Time-Varying Transition Probability Matrix Estimation and Its Application to Brand Share Analysis

    Science.gov (United States)

    Chiba, Tomoaki; Akaho, Shotaro; Murata, Noboru

    2017-01-01

    In a product market or stock market, different products or stocks compete for the same consumers or purchasers. We propose a method to estimate the time-varying transition matrix of the product share using a multivariate time series of the product share. The method is based on the assumption that each of the observed time series of shares is a stationary distribution of the underlying Markov processes characterized by transition probability matrices. We estimate transition probability matrices for every observation under natural assumptions. We demonstrate, on a real-world dataset of the share of automobiles, that the proposed method can find intrinsic transition of shares. The resulting transition matrices reveal interesting phenomena, for example, the change in flows between TOYOTA group and GM group for the fiscal year where TOYOTA group’s sales beat GM’s sales, which is a reasonable scenario. PMID:28076383

  13. Time-Varying Transition Probability Matrix Estimation and Its Application to Brand Share Analysis.

    Science.gov (United States)

    Chiba, Tomoaki; Hino, Hideitsu; Akaho, Shotaro; Murata, Noboru

    2017-01-01

    In a product market or stock market, different products or stocks compete for the same consumers or purchasers. We propose a method to estimate the time-varying transition matrix of the product share using a multivariate time series of the product share. The method is based on the assumption that each of the observed time series of shares is a stationary distribution of the underlying Markov processes characterized by transition probability matrices. We estimate transition probability matrices for every observation under natural assumptions. We demonstrate, on a real-world dataset of the share of automobiles, that the proposed method can find intrinsic transition of shares. The resulting transition matrices reveal interesting phenomena, for example, the change in flows between TOYOTA group and GM group for the fiscal year where TOYOTA group's sales beat GM's sales, which is a reasonable scenario.

  14. Probability estimates of seismic event occurrence compared to health hazards - Forecasting Taipei's Earthquakes

    Science.gov (United States)

    Fung, D. C. N.; Wang, J. P.; Chang, S. H.; Chang, S. C.

    2014-12-01

    Using a revised statistical model built on past seismic probability models, the probability of earthquakes of different magnitudes occurring within variable timespans can be estimated. The revised model is based on the Poisson distribution and includes best-estimate values of the probability distribution of different magnitude earthquakes recurring on a fault, taken from literature sources. Our study aims to apply this model to the Taipei metropolitan area, with a population of 7 million, which lies in the Taipei Basin and is bounded by two normal faults: the Sanchaio and Taipei faults. The Sanchaio fault is suggested to be responsible for previous large-magnitude earthquakes, such as the 1694 magnitude 7 earthquake in northwestern Taipei (Cheng et al., 2010). Based on a magnitude 7 earthquake return period of 543 years, the model predicts the occurrence of a magnitude 7 earthquake within 20 years at 1.81%, within 79 years at 6.77%, and within 300 years at 21.22%. These estimates increase significantly when considering a magnitude 6 earthquake: the chance of one occurring is estimated at 3.61% within the next 20 years, 13.54% within 79 years, and 42.45% within 300 years. The 79-year period represents the average lifespan of the Taiwan population. In contrast, based on data from 2013, the probabilities of Taiwan residents experiencing heart disease or malignant neoplasm are 11.5% and 29%, respectively. The inference of this study is that the calculated risk to the Taipei population from a potentially damaging magnitude 6 or greater earthquake occurring within their lifetime is just as great as the risk of suffering from a heart attack or other health ailments.
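    The textbook Poisson occurrence probability, P(at least one event in t years) = 1 - exp(-t/T) for a mean return period T, can be computed directly as below; note that the revised model in the abstract weights a distribution of recurrence estimates, so its magnitude-7 figures are not expected to match this plain calculation exactly.

```python
# Plain Poisson occurrence probability for a given mean return period T;
# the revised model in the abstract uses additional distributional weighting,
# so its magnitude-7 figures are not expected to match these values exactly.
import math

return_period = 543.0   # years (magnitude-7 return period quoted in the abstract)
for t in (20, 79, 300):
    p = 1.0 - math.exp(-t / return_period)
    print(f"P(at least one event within {t:>3d} years) = {p:.2%}")
```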

  15. Remediating Non-Positive Definite State Covariances for Collision Probability Estimation

    Science.gov (United States)

    Hall, Doyle T.; Hejduk, Matthew D.; Johnson, Lauren C.

    2017-01-01

    The NASA Conjunction Assessment Risk Analysis team estimates the probability of collision (Pc) for a set of Earth-orbiting satellites. The Pc estimation software processes satellite position+velocity states and their associated covariance matrices. On occasion, the software encounters non-positive definite (NPD) state covariances, which can adversely affect or prevent the Pc estimation process. Interpolation inaccuracies appear to account for the majority of such covariances, although other mechanisms contribute also. This paper investigates the origin of NPD state covariance matrices, three different methods for remediating these covariances when and if necessary, and the associated effects on the Pc estimation process.
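    One common remediation, shown here purely as an illustration rather than as one of the paper's three methods, is to project the symmetric matrix onto the nearest positive semi-definite matrix by clipping negative eigenvalues.

```python
# One common remediation, shown only as an illustration (the paper compares
# three methods): project onto the nearest positive semi-definite matrix by
# clipping negative eigenvalues.
import numpy as np

def nearest_psd(cov, eps=0.0):
    """Clip negative eigenvalues of a symmetric matrix to eps."""
    sym = 0.5 * (cov + cov.T)                 # enforce symmetry first
    vals, vecs = np.linalg.eigh(sym)
    return vecs @ np.diag(np.clip(vals, eps, None)) @ vecs.T

# A slightly non-positive-definite 3x3 "covariance" (one negative eigenvalue).
cov = np.array([[ 4.0,  3.9,  0.0 ],
                [ 3.9,  4.0,  0.0 ],
                [ 0.0,  0.0, -0.01]])
print(np.linalg.eigvalsh(cov))               # shows the negative eigenvalue
print(np.linalg.eigvalsh(nearest_psd(cov)))  # all eigenvalues now >= 0
```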

  16. Bounding and estimating an exceedance probability in output from monotonous time-consuming computer codes

    CERN Document Server

    Bousquet, Nicolas

    2010-01-01

    This article deals with the estimation of a probability p of an undesirable event. Its occurrence is formalized as the exceedance of a threshold reliability value by the unidimensional output of a time-consuming computer code G with multivariate probabilistic input X. When G is assumed monotonous with respect to X, the Monotonous Reliability Method was proposed by de Rocquigny (2009) in an engineering context to provide sequentially narrowing 100%-confidence bounds and a crude estimate of p, via deterministic or stochastic designs of experiments. The present article consists of a formalization and technical deepening of this idea, as a broad basis for future theoretical and applied studies. Three kinds of results are especially emphasized. First, the bounds themselves remain too crude and conservative estimators of p when the dimension of X is greater than 2. Second, a maximum-likelihood estimator of p can easily be built, presenting a high variance reduction with respect to a standard Monte Carlo case, but suffering ...

  17. METAPHOR: A machine learning based method for the probability density estimation of photometric redshifts

    CERN Document Server

    Cavuoti, Stefano; Brescia, Massimo; Vellucci, Civita; Tortora, Crescenzo; Longo, Giuseppe

    2016-01-01

    A variety of fundamental astrophysical science topics require the determination of very accurate photometric redshifts (photo-z's). A wide range of methods has been developed, based either on template-model fitting or on empirical explorations of the photometric parameter space. Machine learning based techniques are not explicitly dependent on physical priors and are able to produce accurate photo-z estimations within the photometric ranges derived from the spectroscopic training set. These estimates, however, are not easy to characterize in terms of a photo-z Probability Density Function (PDF), because the analytical relation mapping the photometric parameters onto the redshift space is virtually unknown. We present METAPHOR (Machine-learning Estimation Tool for Accurate PHOtometric Redshifts), a method designed to provide a reliable PDF of the error distribution for empirical techniques. The method is implemented as a modular workflow, whose internal engine for photo-z estimation makes use...

  18. PIGS: improved estimates of identity-by-descent probabilities by probabilistic IBD graph sampling.

    Science.gov (United States)

    Park, Danny S; Baran, Yael; Hormozdiari, Farhad; Eng, Celeste; Torgerson, Dara G; Burchard, Esteban G; Zaitlen, Noah

    2015-01-01

    Identifying segments in the genome of different individuals that are identical-by-descent (IBD) is a fundamental element of genetics. IBD data are used for numerous applications including demographic inference, heritability estimation, and mapping disease loci. Simultaneous detection of IBD over multiple haplotypes has proven to be computationally difficult. To overcome this, many state-of-the-art methods estimate the probability of IBD between each pair of haplotypes separately. While computationally efficient, these methods fail to leverage the clique structure of IBD, resulting in less powerful IBD identification, especially for small IBD segments.

  19. The Probability of Default Under IFRS 9: Multi-period Estimation and Macroeconomic Forecast

    Directory of Open Access Journals (Sweden)

    Tomáš Vaněk

    2017-01-01

    Full Text Available In this paper we propose a straightforward, flexible and intuitive computational framework for the multi-period probability of default estimation incorporating macroeconomic forecasts. The concept is based on Markov models, the estimated economic adjustment coefficient and the official economic forecasts of the Czech National Bank. The economic forecasts are taken into account in a separate step to better distinguish between idiosyncratic and systemic risk. This approach is also attractive from the interpretational point of view. The proposed framework can be used especially when calculating lifetime expected credit losses under IFRS 9.
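
    As an illustration of the Markov-chain core of such a framework, the sketch below computes multi-period probabilities of default by powering a hypothetical rating-transition matrix with an absorbing default state; the macroeconomic adjustment step described in the abstract is not shown.

```python
# Minimal sketch of multi-period PD from a rating-transition Markov chain, under the
# usual assumption of an absorbing default state; the matrix values are hypothetical.
import numpy as np

# States: 0 = performing, 1 = watch-list, 2 = default (absorbing)
P = np.array([[0.95, 0.04, 0.01],
              [0.30, 0.60, 0.10],
              [0.00, 0.00, 1.00]])

def cumulative_pd(transition: np.ndarray, start_state: int, horizon: int) -> float:
    """Cumulative probability of having defaulted within `horizon` periods."""
    dist = np.eye(transition.shape[0])[start_state]        # start with certainty in one state
    dist = dist @ np.linalg.matrix_power(transition, horizon)
    return dist[-1]                                        # mass in the absorbing default state

for h in (1, 3, 5, 10):
    print(f"{h}-period PD from 'performing': {cumulative_pd(P, 0, h):.4f}")
```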

  20. Estimating the Probability of Vegetation to Be Groundwater Dependent Based on the Evaluation of Tree Models

    Directory of Open Access Journals (Sweden)

    Isabel C. Pérez Hoyos

    2016-04-01

    Full Text Available Groundwater Dependent Ecosystems (GDEs) are increasingly threatened by humans’ rising demand for water resources. Consequently, it is imperative to identify the location of GDEs to protect them. This paper develops a methodology to identify the probability of an ecosystem to be groundwater dependent. Probabilities are obtained by modeling the relationship between the known locations of GDEs and factors influencing groundwater dependence, namely water table depth and climatic aridity index. Probabilities are derived for the state of Nevada, USA, using modeled water table depth and aridity index values obtained from the Global Aridity database. The selected model results from a performance comparison of classification trees (CT) and random forests (RF). Based on a threshold-independent accuracy measure, RF has a better ability to generate probability estimates. Considering a threshold that minimizes the misclassification rate for each model, RF also proves to be more accurate. Regarding training accuracy, performance measures such as accuracy, sensitivity, and specificity are higher for RF. For the test set, higher values of accuracy and kappa for CT highlight the fact that these measures are greatly affected by low prevalence. As shown for RF, the choice of the cutoff probability value has important consequences on model accuracy and the overall proportion of locations where GDEs are found.
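
    The comparison described above can be reproduced in spirit with a small hedged sketch: a classification tree and a random forest are fitted to synthetic two-feature data (standing in for water table depth and aridity index), and their probability estimates are compared with AUC as one possible threshold-independent measure. None of the data or settings come from the paper.

```python
# Hedged illustration: compare tree and forest probability estimates on synthetic,
# class-imbalanced data with a threshold-independent measure (AUC).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Two features stand in for water table depth and aridity index; low prevalence of class 1.
X, y = make_classification(n_samples=2000, n_features=2, n_informative=2,
                           n_redundant=0, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

ct = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)

for name, model in [("CT", ct), ("RF", rf)]:
    prob = model.predict_proba(X_te)[:, 1]          # estimated probability of class 1
    print(f"{name}: AUC = {roc_auc_score(y_te, prob):.3f}")
```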

  1. Using optimal transport theory to estimate transition probabilities in metapopulation dynamics

    Science.gov (United States)

    Nichols, Jonathan M.; Spendelow, Jeffrey A.; Nichols, James

    2017-01-01

    This work considers the estimation of transition probabilities associated with populations moving among multiple spatial locations based on numbers of individuals at each location at two points in time. The problem is generally underdetermined as there exists an extremely large number of ways in which individuals can move from one set of locations to another. A unique solution therefore requires a constraint. The theory of optimal transport provides such a constraint in the form of a cost function, to be minimized in expectation over the space of possible transition matrices. We demonstrate the optimal transport approach on marked bird data and compare to the probabilities obtained via maximum likelihood estimation based on marked individuals. It is shown that by choosing the squared Euclidean distance as the cost, the estimated transition probabilities compare favorably to those obtained via maximum likelihood with marked individuals. Other implications of this cost are discussed, including the ability to accurately interpolate the population's spatial distribution at unobserved points in time and the more general relationship between the cost and minimum transport energy.
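
    A minimal sketch of the idea, under the assumption of equal total counts at the two time points: the joint movement matrix that matches both count vectors while minimizing expected squared Euclidean cost is found by linear programming and then row-normalized into transition probabilities. The site coordinates and counts are invented.

```python
# Optimal-transport style estimate of transition probabilities from two count snapshots.
import numpy as np
from scipy.optimize import linprog

sites = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])   # hypothetical site coordinates
n_before = np.array([40.0, 30.0, 30.0])                  # counts at time 1
n_after = np.array([20.0, 50.0, 30.0])                   # counts at time 2 (same total)

k = len(sites)
cost = ((sites[:, None, :] - sites[None, :, :]) ** 2).sum(-1).ravel()  # squared distances

# Equality constraints: row sums equal n_before, column sums equal n_after.
A_eq = np.zeros((2 * k, k * k))
for i in range(k):
    A_eq[i, i * k:(i + 1) * k] = 1.0          # sum_j T[i, j] = n_before[i]
    A_eq[k + i, i::k] = 1.0                   # sum_i T[i, j] = n_after[i]
b_eq = np.concatenate([n_before, n_after])

res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
T = res.x.reshape(k, k)                       # minimum-cost joint movement matrix
transition_probs = T / n_before[:, None]      # row-normalize to get P(move i -> j)
print(np.round(transition_probs, 3))
```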

  2. Estimation of probability of failure for damage-tolerant aerospace structures

    Science.gov (United States)

    Halbert, Keith

    The majority of aircraft structures are designed to be damage-tolerant such that safe operation can continue in the presence of minor damage. It is necessary to schedule inspections so that minor damage can be found and repaired. It is generally not possible to perform structural inspections prior to every flight. The scheduling is traditionally accomplished through a deterministic set of methods referred to as Damage Tolerance Analysis (DTA). DTA has proven to produce safe aircraft but does not provide estimates of the probability of failure of future flights or the probability of repair at future inspections. Without these estimates, maintenance costs cannot be accurately predicted. Also, estimation of failure probabilities is now a regulatory requirement for some aircraft. The set of methods concerned with the probabilistic formulation of this problem is collectively referred to as Probabilistic Damage Tolerance Analysis (PDTA). The goal of PDTA is to control the failure probability while holding maintenance costs to a reasonable level. This work focuses specifically on PDTA for fatigue cracking of metallic aircraft structures. The growth of a crack (or cracks) must be modeled using all available data and engineering knowledge. The length of a crack can be assessed only indirectly through evidence such as non-destructive inspection results, failures or lack of failures, and the observed severity of usage of the structure. The current set of industry PDTA tools is lacking in several ways: they may in some cases yield poor estimates of failure probabilities, they cannot realistically represent the variety of possible failure and maintenance scenarios, and they do not allow for model updates which incorporate observed evidence. A PDTA modeling methodology must be flexible enough to estimate accurately the failure and repair probabilities under a variety of maintenance scenarios, and be capable of incorporating observed evidence as it becomes available. This

  3. On estimating probability of presence from use-availability or presence-background data.

    Science.gov (United States)

    Phillips, Steven J; Elith, Jane

    2013-06-01

    A fundamental ecological modeling task is to estimate the probability that a species is present in (or uses) a site, conditional on environmental variables. For many species, available data consist of "presence" data (locations where the species [or evidence of it] has been observed), together with "background" data, a random sample of available environmental conditions. Recently published papers disagree on whether probability of presence is identifiable from such presence-background data alone. This paper aims to resolve the disagreement, demonstrating that additional information is required. We defined seven simulated species representing various simple shapes of response to environmental variables (constant, linear, convex, unimodal, S-shaped) and ran five logistic model-fitting methods using 1000 presence samples and 10 000 background samples; the simulations were repeated 100 times. The experiment revealed a stark contrast between two groups of methods: those based on a strong assumption that species' true probability of presence exactly matches a given parametric form had highly variable predictions and much larger RMS error than methods that take population prevalence (the fraction of sites in which the species is present) as an additional parameter. For six species, the former group grossly under- or overestimated probability of presence. The cause was not model structure or choice of link function, because all methods were logistic with linear and, where necessary, quadratic terms. Rather, the experiment demonstrates that an estimate of prevalence is not just helpful, but is necessary (except in special cases) for identifying probability of presence. We therefore advise against use of methods that rely on the strong assumption, due to Lele and Keim (recently advocated by Royle et al.) and Lancaster and Imbens. The methods are fragile, and their strong assumption is unlikely to be true in practice. We emphasize, however, that we are not arguing against

  4. Calculation of the number of Monte Carlo histories for a planetary protection probability of impact estimation

    Science.gov (United States)

    Barengoltz, Jack

    2016-07-01

    Monte Carlo (MC) is a common method to estimate probability, effectively by a simulation. For planetary protection, it may be used to estimate the probability of impact P_I by a launch vehicle (upper stage) of a protected planet. The object of the analysis is to provide a value for P_I with a given level of confidence (LOC) that the true value does not exceed the maximum allowed value of P_I. In order to determine the number of MC histories required, one must also guess the maximum number of hits that will occur in the analysis. This extra parameter is needed because a LOC is desired. If more hits occur, the MC analysis would indicate that the true value may exceed the specification value with a higher probability than the LOC. (In the worst case, even the mean value of the estimated P_I might exceed the specification value.) After the analysis is conducted, the actual number of hits is, of course, the mean. The number of hits arises from a small probability per history and a large number of histories; these are the classic requirements for a Poisson distribution. For a known Poisson distribution (the mean is the only parameter), the probability for some interval in the number of hits is calculable. Before the analysis, this is not possible. Fortunately, there are methods that can bound the unknown mean for a Poisson distribution. F. Garwood (1936, "Fiduciary limits for the Poisson distribution," Biometrika 28, 437-442) published an appropriate method that uses the inverse of the Chi-squared function (the integral chi-squared function would yield the probability α as a function of the mean μ and an actual value n): ½χ²(α/2; 2n) ≤ μ ≤ ½χ²(1 − α/2; 2n + 2), where χ²(p; d) denotes the p-quantile of the chi-squared distribution with d degrees of freedom. This formula for the upper and lower limits of the mean μ with two-tailed probability 1 − α depends on the LOC and an estimated value of the number of "successes" n. In an MC analysis for planetary protection, only the upper limit is of interest, i.e., the single
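
    A hedged sketch of how such an upper limit can be used to size a Monte Carlo run: given a specification value for P_I, a level of confidence, and a guessed maximum number of hits, the smallest acceptable number of histories follows from a chi-squared quantile. The parameter names and example numbers are illustrative, not taken from the paper.

```python
# Sizing a Monte Carlo run with the chi-squared (Garwood-type) upper limit on a Poisson mean:
# choose N so that, even with n_hits_max hits, the upper limit on P_I stays below the spec.
import math
from scipy.stats import chi2

def poisson_upper_limit(n_hits: int, loc: float) -> float:
    """One-sided upper confidence limit on a Poisson mean given n observed hits."""
    return 0.5 * chi2.ppf(loc, 2 * (n_hits + 1))

def required_histories(p_spec: float, n_hits_max: int, loc: float = 0.95) -> int:
    """Smallest N such that n_hits_max hits still keeps the upper limit on P_I <= p_spec."""
    return math.ceil(poisson_upper_limit(n_hits_max, loc) / p_spec)

# Example: specification P_I <= 1e-4, allow up to 3 hits, 95% level of confidence.
print(required_histories(p_spec=1e-4, n_hits_max=3, loc=0.95))
```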

  5. Estimates of EPSP amplitude based on changes in motoneuron discharge rate and probability.

    Science.gov (United States)

    Powers, Randall K; Türker, K S

    2010-10-01

    When motor units are discharging tonically, transient excitatory synaptic inputs produce an increase in the probability of spike occurrence and also increase the instantaneous discharge rate. Several researchers have proposed that these induced changes in discharge rate and probability can be used to estimate the amplitude of the underlying excitatory post-synaptic potential (EPSP). We tested two different methods of estimating EPSP amplitude by comparing the amplitude of simulated EPSPs with their effects on the discharge of rat hypoglossal motoneurons recorded in an in vitro brainstem slice preparation. The first estimation method (simplified-trajectory method) is based on the assumptions that the membrane potential trajectory between spikes can be approximated by a 10 mV post-spike hyperpolarization followed by a linear rise to the next spike and that EPSPs sum linearly with this trajectory. We hypothesized that this estimation method would not be accurate due to interspike variations in membrane conductance and firing threshold that are not included in the model, and that an alternative method based on estimating the effective distance to threshold would provide more accurate estimates of EPSP amplitude. This second method (distance-to-threshold method) uses interspike interval statistics to estimate the effective distance to threshold throughout the interspike interval and incorporates this distance-to-threshold trajectory into a threshold-crossing model. We found that the first method systematically overestimated the amplitude of small (<5 mV) EPSPs and underestimated the amplitude of large (>5 mV) EPSPs. For large EPSPs, the degree of underestimation increased with increasing background discharge rate. Estimates based on the second method were more accurate for small EPSPs than those based on the first model, but estimation errors were still large for large EPSPs. These errors were likely due to two factors: (1) the distance to threshold can only be directly

  6. A flexible parametric approach for estimating continuous-time inverse probability of treatment and censoring weights.

    Science.gov (United States)

    Saarela, Olli; Liu, Zhihui Amy

    2016-10-15

    Marginal structural Cox models are used for quantifying marginal treatment effects on outcome event hazard function. Such models are estimated using inverse probability of treatment and censoring (IPTC) weighting, which properly accounts for the impact of time-dependent confounders, avoiding conditioning on factors on the causal pathway. To estimate the IPTC weights, the treatment assignment mechanism is conventionally modeled in discrete time. While this is natural in situations where treatment information is recorded at scheduled follow-up visits, in other contexts, the events specifying the treatment history can be modeled in continuous time using the tools of event history analysis. This is particularly the case for treatment procedures, such as surgeries. In this paper, we propose a novel approach for flexible parametric estimation of continuous-time IPTC weights and illustrate it in assessing the relationship between metastasectomy and mortality in metastatic renal cell carcinoma patients. Copyright © 2016 John Wiley & Sons, Ltd.

  7. ANNz2 - Photometric redshift and probability density function estimation using machine learning methods

    CERN Document Server

    Sadeh, Iftach; Lahav, Ofer

    2015-01-01

    We present ANNz2, a new implementation of the public software for photometric redshift (photo-z) estimation of Collister and Lahav (2004). Large photometric galaxy surveys are important for cosmological studies, and in particular for characterizing the nature of dark energy. The success of such surveys greatly depends on the ability to measure photo-zs, based on limited spectral data. ANNz2 utilizes multiple machine learning methods, such as artificial neural networks, boosted decision/regression trees and k-nearest neighbours. The objective of the algorithm is to dynamically optimize the performance of the photo-z estimation, and to properly derive the associated uncertainties. In addition to single-value solutions, the new code also generates full probability density functions (PDFs) in two different ways. In addition, estimators are incorporated to mitigate possible problems of spectroscopic training samples which are not representative or are incomplete. ANNz2 is also adapted to provide optimized solution...

  8. SVM model for estimating the parameters of the probability-integral method of predicting mining subsidence

    Institute of Scientific and Technical Information of China (English)

    ZHANG Hua; WANG Yun-jia; LI Yong-feng

    2009-01-01

    A new mathematical model to estimate the parameters of the probability-integral method for mining subsidence prediction is proposed. Based on least squares support vector machine (LS-SVM) theory, it is capable of improving the precision and reliability of mining subsidence prediction. Many of the geological and mining factors involved are related in a nonlinear way. The new model is based on statistical learning theory (SLT) and empirical risk minimization (ERM) principles. Typical data collected from observation stations were used for the learning and training samples. The calculated results from the LS-SVM model were compared with the prediction results of a back propagation neural network (BPNN) model. The results show that the parameters were more precisely predicted by the LS-SVM model than by the BPNN model. The LS-SVM model was faster in computation and had better generalized performance. It provides a highly effective method for calculating the predicting parameters of the probability-integral method.
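
    Since LS-SVM regression is closely related to kernel ridge regression, the sketch below uses scikit-learn's KernelRidge as a stand-in to illustrate the kind of nonlinear regression involved. The synthetic features merely stand in for the geological and mining factors; this is not the authors' implementation.

```python
# Kernel ridge regression as an illustrative stand-in for LS-SVM regression, on synthetic data.
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 3))                  # e.g. normalized geological/mining factors
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=200)

model = GridSearchCV(
    KernelRidge(kernel="rbf"),
    param_grid={"alpha": [1e-3, 1e-2, 1e-1], "gamma": [0.5, 1.0, 2.0]},
    cv=5,
)
model.fit(X, y)
print(model.best_params_, round(model.best_score_, 3))   # cross-validated R^2 of the best fit
```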

  9. Estimating occurrence and detection probabilities for stream-breeding salamanders in the Gulf Coastal Plain

    Science.gov (United States)

    Lamb, Jennifer Y.; Waddle, J. Hardin; Qualls, Carl P.

    2017-01-01

    Large gaps exist in our knowledge of the ecology of stream-breeding plethodontid salamanders in the Gulf Coastal Plain. Data describing where these salamanders are likely to occur along environmental gradients, as well as their likelihood of detection, are important for the prevention and management of amphibian declines. We used presence/absence data from leaf litter bag surveys and a hierarchical Bayesian multispecies single-season occupancy model to estimate the occurrence of five species of plethodontids across reaches in headwater streams in the Gulf Coastal Plain. Average detection probabilities were high (range = 0.432–0.942) and unaffected by sampling covariates specific to the use of litter bags (i.e., bag submergence, sampling season, in-stream cover). Estimates of occurrence probabilities differed substantially between species (range = 0.092–0.703) and were influenced by the size of the upstream drainage area and by the maximum proportion of the reach that dried. The effects of these two factors were not equivalent across species. Our results demonstrate that hierarchical multispecies models successfully estimate occurrence parameters for both rare and common stream-breeding plethodontids. The resulting models clarify how species are distributed within stream networks, and they provide baseline values that will be useful in evaluating the conservation statuses of plethodontid species within lotic systems in the Gulf Coastal Plain.

  10. Estimating superpopulation size and annual probability of breeding for pond-breeding salamanders

    Science.gov (United States)

    Kinkead, K.E.; Otis, D.L.

    2007-01-01

    It has long been accepted that amphibians can skip breeding in any given year, and environmental conditions act as a cue for breeding. In this paper, we quantify temporary emigration or nonbreeding probability for mole and spotted salamanders (Ambystoma talpoideum and A. maculatum). We estimated that 70% of mole salamanders may skip breeding during an average rainfall year and 90% may skip during a drought year. Spotted salamanders may be more likely to breed, with only 17% avoiding the breeding pond during an average rainfall year. We illustrate how superpopulations can be estimated using temporary emigration probability estimates. The superpopulation is the total number of salamanders associated with a given breeding pond. Although most salamanders stay within a certain distance of a breeding pond for the majority of their life spans, it is difficult to determine true overall population sizes for a given site if animals are only captured during a brief time frame each year with some animals unavailable for capture at any time during a given year. © 2007 by The Herpetologists' League, Inc.

  11. Three-dimensional super-resolution structured illumination microscopy with maximum a posteriori probability image estimation.

    Science.gov (United States)

    Lukeš, Tomáš; Křížek, Pavel; Švindrych, Zdeněk; Benda, Jakub; Ovesný, Martin; Fliegel, Karel; Klíma, Miloš; Hagen, Guy M

    2014-12-01

    We introduce and demonstrate a new high performance image reconstruction method for super-resolution structured illumination microscopy based on maximum a posteriori probability estimation (MAP-SIM). Imaging performance is demonstrated on a variety of fluorescent samples of different thickness, labeling density and noise levels. The method provides good suppression of out of focus light, improves spatial resolution, and allows reconstruction of both 2D and 3D images of cells even in the case of weak signals. The method can be used to process both optical sectioning and super-resolution structured illumination microscopy data to create high quality super-resolution images.

  12. Estimating the probability of allelic drop-out of STR alleles in forensic genetics

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Mogensen, Helle Smidt;

    2009-01-01

    In crime cases with available DNA evidence, the amount of DNA is often sparse due to the setting of the crime. In such cases, allelic drop-out of one or more true alleles in STR typing is possible. We present a statistical model for estimating the per locus and overall probability of allelic drop-out using the results of all STR loci in the case sample as reference. The methodology of logistic regression is appropriate for this analysis, and we demonstrate how to incorporate this in a forensic genetic framework.
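
    A minimal sketch of the logistic-regression idea on simulated data: the probability of drop-out is modeled as a function of a signal-strength covariate. Mean peak height is assumed here as the predictor purely for illustration; the paper's actual covariates and casework data are not used.

```python
# Logistic regression of drop-out status on a hypothetical signal-strength covariate.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
peak_height = rng.uniform(50, 2000, size=500)              # hypothetical RFU values
true_logit = 4.0 - 0.01 * peak_height                      # drop-out more likely when signal is low
dropout = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))   # simulated drop-out indicator

X = sm.add_constant(np.log(peak_height))                   # log-signal as the single predictor
fit = sm.Logit(dropout, X).fit(disp=False)
print(fit.params)                                          # intercept and slope
print(fit.predict(sm.add_constant(np.log([100, 500, 1500]))))  # P(drop-out) at three peak heights
```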

  13. On the method of logarithmic cumulants for parametric probability density function estimation.

    Science.gov (United States)

    Krylov, Vladimir A; Moser, Gabriele; Serpico, Sebastiano B; Zerubia, Josiane

    2013-10-01

    Parameter estimation of probability density functions is one of the major steps in the area of statistical image and signal processing. In this paper we explore several properties and limitations of the recently proposed method of logarithmic cumulants (MoLC) parameter estimation approach which is an alternative to the classical maximum likelihood (ML) and method of moments (MoM) approaches. We derive the general sufficient condition for a strong consistency of the MoLC estimates which represents an important asymptotic property of any statistical estimator. This result enables the demonstration of the strong consistency of MoLC estimates for a selection of widely used distribution families originating from (but not restricted to) synthetic aperture radar image processing. We then derive the analytical conditions of applicability of MoLC to samples for the distribution families in our selection. Finally, we conduct various synthetic and real data experiments to assess the comparative properties, applicability and small sample performance of MoLC notably for the generalized gamma and K families of distributions. Supervised image classification experiments are considered for medical ultrasound and remote-sensing SAR imagery. The obtained results suggest that MoLC is a feasible and computationally fast yet not universally applicable alternative to MoM. MoLC becomes especially useful when the direct ML approach turns out to be unfeasible.
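
    As a simple illustration of MoLC, using the plain two-parameter gamma distribution rather than the generalized gamma or K families studied in the paper, the first two log-cumulants are matched to their analytical expressions and inverted numerically.

```python
# MoLC for Gamma(shape k, scale theta): E[ln X] = psi(k) + ln(theta), Var[ln X] = psi'(k).
import numpy as np
from scipy.optimize import brentq
from scipy.special import digamma, polygamma

def molc_gamma(sample: np.ndarray):
    c1 = np.mean(np.log(sample))            # first log-cumulant
    c2 = np.var(np.log(sample))             # second log-cumulant
    # psi'(k) is strictly decreasing in k, so a bracketed root search recovers the shape.
    k = brentq(lambda x: polygamma(1, x) - c2, 1e-3, 1e4)
    theta = np.exp(c1 - digamma(k))
    return k, theta

rng = np.random.default_rng(2)
data = rng.gamma(shape=3.0, scale=2.0, size=20000)
print(molc_gamma(data))                     # should be close to (3.0, 2.0)
```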

  14. Estimated Probability of a Cervical Spine Injury During an ISS Mission

    Science.gov (United States)

    Brooker, John E.; Weaver, Aaron S.; Myers, Jerry G.

    2013-01-01

    Introduction: The Integrated Medical Model (IMM) utilizes historical data, cohort data, and external simulations as input factors to provide estimates of crew health, resource utilization and mission outcomes. The Cervical Spine Injury Module (CSIM) is an external simulation designed to provide the IMM with parameter estimates for 1) a probability distribution function (PDF) of the incidence rate, 2) the mean incidence rate, and 3) the standard deviation associated with the mean resulting from injury/trauma of the neck. Methods: An injury mechanism based on an idealized low-velocity blunt impact to the superior posterior thorax of an ISS crewmember was used as the simulated mission environment. As a result of this impact, the cervical spine is inertially loaded from the mass of the head producing an extension-flexion motion deforming the soft tissues of the neck. A multibody biomechanical model was developed to estimate the kinematic and dynamic response of the head-neck system from a prescribed acceleration profile. Logistic regression was performed on a dataset containing AIS1 soft tissue neck injuries from rear-end automobile collisions with published Neck Injury Criterion values producing an injury transfer function (ITF). An injury event scenario (IES) was constructed such that crew 1 is moving through a primary or standard translation path transferring large volume equipment impacting stationary crew 2. The incidence rate for this IES was estimated from in-flight data and used to calculate the probability of occurrence. The uncertainty in the model input factors were estimated from representative datasets and expressed in terms of probability distributions. A Monte Carlo Method utilizing simple random sampling was employed to propagate both aleatory and epistemic uncertain factors. Scatterplots and partial correlation coefficients (PCC) were generated to determine input factor sensitivity. CSIM was developed in the SimMechanics/Simulink environment with a

  15. A software for the estimation of binding parameters of biochemical equilibria based on statistical probability model.

    Science.gov (United States)

    Fisicaro, E; Braibanti, A; Sambasiva Rao, R; Compari, C; Ghiozzi, A; Nageswara Rao, G

    1998-04-01

    An algorithm is proposed for the estimation of binding parameters for the interaction of biologically important macromolecules with smaller ones from electrometric titration data. The mathematical model is based on the representation of equilibria in terms of probability concepts of statistical molecular thermodynamics. The refinement of equilibrium concentrations of the components and estimation of binding parameters (log site constant and cooperativity factor) is performed using singular value decomposition, a chemometric technique which overcomes the general obstacles due to near singularity. The present software is validated with a number of biochemical systems of varying number of sites and cooperativity factors. The effect of random errors of realistic magnitude in experimental data is studied using the simulated primary data for some typical systems. The safe area within which approximate binding parameters ensure convergence has been reported for the non-self starting optimization algorithms.

  16. Estimation of (n,f) Cross-Sections by Measuring Reaction Probability Ratios

    Energy Technology Data Exchange (ETDEWEB)

    Plettner, C; Ai, H; Beausang, C W; Bernstein, L A; Ahle, L; Amro, H; Babilon, M; Burke, J T; Caggiano, J A; Casten, R F; Church, J A; Cooper, J R; Crider, B; Gurdal, G; Heinz, A; McCutchan, E A; Moody, K; Punyon, J A; Qian, J; Ressler, J J; Schiller, A; Williams, E; Younes, W

    2005-04-21

    Neutron-induced reaction cross-sections on unstable nuclei are inherently difficult to measure due to target activity and the low intensity of neutron beams. In an alternative approach, named the 'surrogate' technique, one measures the decay probability of the same compound nucleus produced using a stable beam on a stable target to estimate the neutron-induced reaction cross-section. As an extension of the surrogate method, in this paper they introduce a new technique of measuring the fission probabilities of two different compound nuclei as a ratio, which has the advantage of removing most of the systematic uncertainties. This method was benchmarked in this report by measuring the probability of deuteron-induced fission events in coincidence with protons, and forming the ratio P(²³⁶U(d,pf))/P(²³⁸U(d,pf)), which serves as a surrogate for the known cross-section ratio of ²³⁶U(n,f)/²³⁸U(n,f). In addition, the P(²³⁸U(d,d′f))/P(²³⁶U(d,d′f)) ratio as a surrogate for the ²³⁷U(n,f)/²³⁵U(n,f) cross-section ratio was measured for the first time in an unprecedented range of excitation energies.

  17. Estimating the probability of an extinction or major outbreak for an environmentally transmitted infectious disease.

    Science.gov (United States)

    Lahodny, G E; Gautam, R; Ivanek, R

    2015-01-01

    Indirect transmission through the environment, pathogen shedding by infectious hosts, replication of free-living pathogens within the environment, and environmental decontamination are suspected to play important roles in the spread and control of environmentally transmitted infectious diseases. To account for these factors, the classic Susceptible-Infectious-Recovered-Susceptible epidemic model is modified to include a compartment representing the amount of free-living pathogen within the environment. The model accounts for host demography, direct and indirect transmission, replication of free-living pathogens in the environment, and removal of free-living pathogens by natural death or environmental decontamination. Based on the assumptions of the deterministic model, a continuous-time Markov chain model is developed. An estimate for the probability of disease extinction or a major outbreak is obtained by approximating the Markov chain with a multitype branching process. Numerical simulations illustrate important differences between the deterministic and stochastic counterparts, relevant for outbreak prevention, that depend on indirect transmission, pathogen shedding by infectious hosts, replication of free-living pathogens, and environmental decontamination. The probability of a major outbreak is computed for salmonellosis in a herd of dairy cattle as well as cholera in a human population. An explicit expression for the probability of disease extinction or a major outbreak in terms of the model parameters is obtained for systems with no direct transmission or replication of free-living pathogens.
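
    A heavily simplified, single-type version of the branching-process idea is sketched below: with a Poisson offspring distribution of mean R0, the extinction probability is the smallest fixed point of the offspring generating function, and the major-outbreak probability from i0 initial infectious hosts follows. The environmental compartment and multitype structure of the actual model are omitted, so this is only an illustration of the approach.

```python
# Extinction probability of a single-type branching process with Poisson offspring:
# q solves q = exp(R0 * (q - 1)); P(major outbreak from i0 hosts) = 1 - q**i0.
import math

def extinction_probability(r0: float, tol: float = 1e-12) -> float:
    q = 0.0
    for _ in range(10_000):                 # monotone fixed-point iteration q <- G(q)
        q_new = math.exp(r0 * (q - 1.0))
        if abs(q_new - q) < tol:
            break
        q = q_new
    return q_new

for r0 in (1.5, 2.5, 4.0):
    q = extinction_probability(r0)
    print(f"R0 = {r0}: P(extinction) = {q:.3f}, P(major outbreak, i0 = 2) = {1 - q**2:.3f}")
```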

  18. Development of a statistical tool for the estimation of riverbank erosion probability

    Science.gov (United States)

    Varouchakis, Emmanouil

    2016-04-01

    Riverbank erosion affects river morphology and local habitat, and results in riparian land loss, property and infrastructure damage, and ultimately flood defence weakening. An important issue concerning riverbank erosion is the identification of the vulnerable areas in order to predict river changes and assist stream management/restoration. An approach to predicting areas vulnerable to erosion is to quantify the erosion probability by identifying the underlying relations between riverbank erosion and the geomorphological or hydrological variables that prevent or stimulate erosion. In the present work, an innovative statistical methodology is proposed to predict the probability of presence or absence of erosion in a river section. A physically based model determines the locations vulnerable to erosion by quantifying the potential eroded area. The derived results are used to determine validation locations for the evaluation of the statistical tool performance. The statistical tool is based on a series of independent local variables and employs the Logistic Regression methodology. It is developed in two forms, Logistic Regression and Locally Weighted Logistic Regression, which both deliver useful and accurate results. The second form, though, provides the most accurate results, as it validates the presence or absence of erosion at all validation locations. The proposed tool is easy to use, accurate and can be applied to any region and river. Varouchakis, E. A., Giannakis, G. V., Lilli, M. A., Ioannidou, E., Nikolaidis, N. P., and Karatzas, G. P.: Development of a statistical tool for the estimation of riverbank erosion probability, SOIL (EGU), in print, 2016.

  19. Failure probability estimation of flaw in CANDU pressure tube considering the dimensional change

    Energy Technology Data Exchange (ETDEWEB)

    Kwak, Sang Log; Kim, Young Jin [Sungkyunkwan Univ., Suwon (Korea, Republic of); Lee, Joon Seong [Kyonggi Univ., Suwon (Korea, Republic of); Park, Youn Won [KINS, Taejon (Korea, Republic of)

    2002-11-01

    The pressure tube is a major component of the CANDU reactor, which supports the nuclear fuel bundle and heavy water coolant. Pressure tubes are installed horizontally inside the reactor and only selected samples are periodically examined during in-service inspection. In this respect, a probabilistic safety assessment method is more appropriate for the assessment of overall pressure tube safety. The failure behavior of CANDU pressure tubes, however, is governed by delayed hydride cracking, which is the major difference from piping and reactor pressure vessels. Since delayed hydride cracking has more widely distributed governing parameters, it is impossible to apply a general PFM methodology directly. In this paper, a PFM methodology for the safety assessment of CANDU pressure tubes is introduced by applying Monte Carlo simulation in determining the failure probability. Initial hydrogen concentration, flaw shape and depth, axial and radial crack growth rate, and fracture toughness were considered as probabilistic variables. A parametric study of pressure tube dimensions and hydride precipitation temperature was performed in calculating the failure probability. Unstable fracture and plastic collapse are used for the failure assessment. The estimated failure probability showed about a three-order-of-magnitude difference with changing pressure tube dimensions.

  20. Estimating the Probability of Electrical Short Circuits from Tin Whiskers. Part 2

    Science.gov (United States)

    Courey, Karim J.; Asfour, Shihab S.; Onar, Arzu; Bayliss, Jon A.; Ludwig, Larry L.; Wright, Maria C.

    2010-01-01

    To comply with lead-free legislation, many manufacturers have converted from tin-lead to pure tin finishes of electronic components. However, pure tin finishes have a greater propensity to grow tin whiskers than tin-lead finishes. Since tin whiskers present an electrical short circuit hazard in electronic components, simulations have been developed to quantify the risk of such short circuits occurring. Existing risk simulations make the assumption that when a free tin whisker has bridged two adjacent exposed electrical conductors, the result is an electrical short circuit. This conservative assumption is made because shorting is a random event that has an unknown probability associated with it. Note, however, that due to contact resistance, electrical shorts may not occur at lower voltage levels. In our first article we developed an empirical probability model for tin whisker shorting. In this paper, we develop a more comprehensive empirical model using a refined experiment with a larger sample size, in which we studied the effect of varying voltage on the breakdown of the contact resistance which leads to a short circuit. From the resulting data we estimated the probability distribution of an electrical short as a function of voltage. In addition, the unexpected polycrystalline structure seen in the focused ion beam (FIB) cross section in the first experiment was confirmed in this experiment using transmission electron microscopy (TEM). The FIB was also used to cross section two card guides to facilitate the measurement of the grain size of each card guide's tin plating to determine its finish.

  1. Eruptive probability calculation for the Yucca Mountain site, USA: statistical estimation of recurrence rates

    Science.gov (United States)

    Ho, Chih-Hsiang; Smith, Eugene I.; Feuerbach, Daniel L.; Naumann, Terry R.

    1991-12-01

    Investigations are currently underway to evaluate the impact of potentially adverse conditions (e.g. volcanism, faulting, seismicity) on the waste-isolation capability of the proposed nuclear waste repository at Yucca Mountain, Nevada, USA. This paper is the first in a series that will examine the probability of disruption of the Yucca Mountain site by volcanic eruption. In it, we discuss three estimating techniques for determining the recurrence rate of volcanic eruption (λ), an important parameter in the Poisson probability model. The first method is based on the number of events occurring over a certain observation period, the second is based on repose times, and the third is based on magma volume. All three require knowledge of the total number of eruptions (E) in the Yucca Mountain area during the observation period. Following this discussion we then propose an estimate of E which takes into account the possibility of polygenetic and polycyclic volcanism at all the volcanic centers near the Yucca Mountain site.

  2. Estimating probabilities of peptide database identifications to LC-FTICR-MS observations

    Directory of Open Access Journals (Sweden)

    Daly Don S

    2006-02-01

    Full Text Available Background: The field of proteomics involves the characterization of the peptides and proteins expressed in a cell under specific conditions. Proteomics has made rapid advances in recent years following the sequencing of the genomes of an increasing number of organisms. A prominent technology for high throughput proteomics analysis is the use of liquid chromatography coupled to Fourier transform ion cyclotron resonance mass spectrometry (LC-FTICR-MS). Meaningful biological conclusions can best be made when the peptide identities returned by this technique are accompanied by measures of accuracy and confidence. Methods: After a tryptically digested protein mixture is analyzed by LC-FTICR-MS, the observed masses and normalized elution times of the detected features are statistically matched to the theoretical masses and elution times of known peptides listed in a large database. The probability of matching is estimated for each peptide in the reference database using statistical classification methods assuming bivariate Gaussian probability distributions on the uncertainties in the masses and the normalized elution times. Results: A database of 69,220 features from 32 LC-FTICR-MS analyses of a tryptically digested bovine serum albumin (BSA) sample was matched to a database populated with 97% false positive peptides. The percentage of high confidence identifications was found to be consistent with other database search procedures. BSA database peptides were identified with high confidence on average in 14.1 of the 32 analyses. False positives were identified on average in just 2.7 analyses. Conclusion: Using a priori probabilities that contrast peptides from expected and unexpected proteins was shown to perform better in identifying target peptides than using equally likely a priori probabilities. This is because a large percentage of the target peptides were similar to unexpected peptides which were included to be false positives. The use of

  3. Fast and accurate probability density estimation in large high dimensional astronomical datasets

    Science.gov (United States)

    Gupta, Pramod; Connolly, Andrew J.; Gardner, Jeffrey P.

    2015-01-01

    Astronomical surveys will generate measurements of hundreds of attributes (e.g. color, size, shape) on hundreds of millions of sources. Analyzing these large, high dimensional data sets will require efficient algorithms for data analysis. An example of this is probability density estimation, which is at the heart of many classification problems such as the separation of stars and quasars based on their colors. Popular density estimation techniques use binning or kernel density estimation. Kernel density estimation has a small memory footprint but often requires large computational resources. Binning has small computational requirements, but binning is usually implemented with multi-dimensional arrays, which leads to memory requirements that scale exponentially with the number of dimensions. Hence neither technique scales well to large data sets in high dimensions. We present an alternative approach of binning implemented with hash tables (BASH tables). This approach uses the sparseness of data in the high dimensional space to ensure that the memory requirements are small. However, hashing requires some extra computation, so a priori it is not clear if the reduction in memory requirements will lead to increased computational requirements. Through an implementation of BASH tables in C++ we show that the additional computational requirements of hashing are negligible. Hence this approach has small memory and computational requirements. We apply our density estimation technique to photometric selection of quasars using non-parametric Bayesian classification and show that the accuracy of the classification is the same as the accuracy of earlier approaches. Since the BASH table approach is one to three orders of magnitude faster than the earlier approaches, it may be useful in various other applications of density estimation in astrostatistics.
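
    A toy Python version of the hash-table binning idea (the paper's implementation is in C++): bins are keyed by integer index tuples in a dictionary, so only occupied bins use memory, and the density estimate at a point is the count in its bin divided by n times the bin volume.

```python
# Sparse histogram density estimation with a hash table (dict keyed by bin-index tuples).
from collections import defaultdict
import numpy as np

def build_bash_table(points: np.ndarray, bin_width: float):
    table = defaultdict(int)
    for p in points:
        table[tuple(np.floor(p / bin_width).astype(int))] += 1   # count per occupied bin
    return table

def density(table, point: np.ndarray, bin_width: float, n_total: int, dim: int) -> float:
    key = tuple(np.floor(point / bin_width).astype(int))
    return table.get(key, 0) / (n_total * bin_width ** dim)      # count / (n * bin volume)

rng = np.random.default_rng(3)
data = rng.normal(size=(100_000, 5))                 # 5-dimensional sample
table = build_bash_table(data, bin_width=0.5)
print(len(table), "occupied bins out of a potentially enormous dense grid")
print(density(table, np.zeros(5), 0.5, len(data), dim=5))  # close to the 5-D normal peak density
```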

  4. Estimating the probability of arsenic occurrence in domestic wells in the United States

    Science.gov (United States)

    Ayotte, J.; Medalie, L.; Qi, S.; Backer, L. F.; Nolan, B. T.

    2016-12-01

    Approximately 43 million people (about 14 percent of the U.S. population) rely on privately owned domestic wells as their source of drinking water. Unlike public water systems, which are regulated by the Safe Drinking Water Act, there is no comprehensive national program to ensure that the water from domestic wells is routinely tested and that it is safe to drink. A study published in 2009 from the National Water-Quality Assessment Program of the U.S. Geological Survey assessed water-quality conditions from 2,100 domestic wells within 48 states and reported that more than one in five (23 percent) of the sampled wells contained one or more contaminants at a concentration greater than a human-health benchmark. In addition, there are many activities such as resource extraction, climate change-induced drought, and changes in land use patterns that could potentially affect the quality of the ground water source for domestic wells. The Health Studies Branch (HSB) of the National Center for Environmental Health, Centers for Disease Control and Prevention, created a Clean Water for Health Program to help address domestic well concerns. The goals of this program are to identify emerging public health issues associated with using domestic wells for drinking water and develop plans to address these issues. As part of this effort, HSB in cooperation with the U.S. Geological Survey has created probability models to estimate the probability of arsenic occurring at various concentrations in domestic wells in the U.S. We will present preliminary results of the project, including estimates of the population supplied by domestic wells that is likely to have arsenic greater than 10 micrograms per liter. Nationwide, we estimate this to be just over 2 million people. Logistic regression model results showing probabilities of arsenic greater than the Maximum Contaminant Level for public supply wells of 10 micrograms per liter in domestic wells in the U.S., based on data for arsenic

  5. Estimation of probability of coastal flooding: A case study in the Norton Sound, Alaska

    Science.gov (United States)

    Kim, S.; Chapman, R. S.; Jensen, R. E.; Azleton, M. T.; Eisses, K. J.

    2010-12-01

    Along the Norton Sound, Alaska, coastal communities have been exposed to flooding induced by extra-tropical storms. A lack of observational data, especially on long-term variability, makes it difficult to assess the probability of coastal flooding, which is critical in planning for development and evacuation of the coastal communities. We estimated the probability of coastal flooding with the help of an existing storm surge model using ADCIRC and a wave model using WAM for Western Alaska, which includes the Norton Sound as well as the adjacent Bering Sea and Chukchi Sea. The surface pressure and winds as well as ice coverage were analyzed and put in a gridded format with a 3 hour interval over the entire Alaskan Shelf by Ocean Weather Inc. (OWI) for the period between 1985 and 2009. OWI also analyzed the surface conditions for the storm events over the 31 year period between 1954 and 1984. The correlation between water levels recorded by the NOAA tide gage and local meteorological conditions at Nome between 1992 and 2005 suggested that strong local winds with prevailing southerly components are good proxies for high water events. We also heuristically selected local winds with prevailing westerly components at Shaktoolik, which is located at the eastern end of the Norton Sound, to provide an additional selection of flood events during the continuous meteorological data record between 1985 and 2009. The frequency analyses were performed using the simulated water levels and wave heights for the 56 year period between 1954 and 2009. Different methods of estimating return periods were compared, including the method according to the FEMA guideline, extreme value statistics, and fitting to statistical distributions such as Weibull and Gumbel. The estimates are, as expected, similar but show some variation.
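
    One of the listed approaches, fitting a statistical distribution to annual maxima and reading off return levels, can be sketched as follows with a Gumbel fit on synthetic data; nothing here comes from the ADCIRC/WAM simulations.

```python
# Gumbel fit to synthetic annual maximum water levels and the implied T-year return levels.
from scipy.stats import gumbel_r
import numpy as np

rng = np.random.default_rng(4)
annual_maxima = gumbel_r.rvs(loc=2.0, scale=0.6, size=56, random_state=rng)  # 56 synthetic years

loc, scale = gumbel_r.fit(annual_maxima)
for T in (10, 50, 100):
    level = gumbel_r.ppf(1 - 1 / T, loc=loc, scale=scale)   # exceeded on average once per T years
    print(f"{T:3d}-year return level: {level:.2f} (arbitrary units)")
```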

  6. Peritonitis-free survival in peritoneal dialysis: an update taking competing risks into account.

    Science.gov (United States)

    Evans, David W; Ryckelynck, Jean-Philippe; Fabre, Emmanuel; Verger, Christian

    2010-07-01

    Peritonitis-free survival is commonly reported in the peritoneal dialysis (PD) literature. The Kaplan-Meier method appears to be the only technique used to date, although it has known limitations for cohorts with multiple outcomes, as in PD. In the presence of these 'competing risks' outcomes, the Kaplan-Meier estimate is interpretable only under restrictive assumptions. In contrast, methods which take competing risks into account provide unbiased estimates of probabilities of outcomes as actually experienced by patients. We analysed peritonitis-free survival in a cohort of 8711 incident patients from the 'Registre de Dialyse Péritonéale de Langue Française' between 1 January 2000 and 31 December 2007 by calculating the cumulative incidence (CI) of the first episode of peritonitis using the Kaplan-Meier method and a method accounting for competing risks. We compared the CI in different patient groups by the log-rank test and a test developed for competing risk data, Gray's test. After 5 years of PD, the CI of at least one peritonitis episode was 0.4, and the probability of any outcome was 0.96. The Kaplan-Meier method overestimated the CI by a large amount. Compared with the log-rank test, Gray's test led to different conclusions in three out of seven comparisons. The competing risk approach shows that the CI of at least one peritonitis episode was lower than reported by the Kaplan-Meier method but that survival peritonitis-free and still on PD was overall low. The competing risk approach provides estimates which have a clearer interpretation than Kaplan-Meier methods and could be more widely used in PD research.
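
    The contrast described above can be illustrated with a small from-scratch calculation on invented data: the complement of the Kaplan-Meier estimate (treating competing events as censoring) is compared with the competing-risks cumulative incidence, which weights each event-of-interest increment by the overall event-free survival.

```python
# 1 - Kaplan-Meier versus competing-risks cumulative incidence on a tiny invented cohort.
import numpy as np

# 1 = first peritonitis, 2 = competing event (e.g. death or transfer), 0 = censored
times = np.array([2, 3, 3, 5, 7, 8, 10, 12, 14, 15], dtype=float)
events = np.array([1, 2, 1, 0, 1, 2, 0, 1, 2, 0])

def one_minus_km_and_cif(times, events, event_of_interest=1):
    order = np.argsort(times)
    times, events = times[order], events[order]
    n_at_risk = len(times)
    surv_all = 1.0      # event-free survival for *any* event (used by the CIF)
    surv_naive = 1.0    # Kaplan-Meier "survival" with competing events treated as censoring
    cif = 0.0           # competing-risks cumulative incidence of the event of interest
    for t in np.unique(times):
        at_t = times == t
        d_int = np.sum(at_t & (events == event_of_interest))
        d_any = np.sum(at_t & (events != 0))
        cif += surv_all * d_int / n_at_risk
        surv_naive *= 1 - d_int / n_at_risk
        surv_all *= 1 - d_any / n_at_risk
        n_at_risk -= np.sum(at_t)            # drop everyone whose time equals t
    return 1 - surv_naive, cif

naive, proper = one_minus_km_and_cif(times, events)
print(f"1 - Kaplan-Meier (competing events censored): {naive:.3f}")   # larger
print(f"Competing-risks cumulative incidence:         {proper:.3f}")  # smaller
```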

  7. METAPHOR: a machine-learning-based method for the probability density estimation of photometric redshifts

    Science.gov (United States)

    Cavuoti, S.; Amaro, V.; Brescia, M.; Vellucci, C.; Tortora, C.; Longo, G.

    2017-02-01

    A variety of fundamental astrophysical science topics require the determination of very accurate photometric redshifts (photo-z). A plethora of methods has been developed, based either on template model fitting or on empirical explorations of the photometric parameter space. Machine-learning-based techniques are not explicitly dependent on the physical priors and are able to produce accurate photo-z estimations within the photometric ranges derived from the spectroscopic training set. These estimates, however, are not easy to characterize in terms of a photo-z probability density function (PDF), due to the fact that the analytical relation mapping the photometric parameters onto the redshift space is virtually unknown. We present METAPHOR (Machine-learning Estimation Tool for Accurate PHOtometric Redshifts), a method designed to provide a reliable PDF of the error distribution for empirical techniques. The method is implemented as a modular workflow, whose internal engine for photo-z estimation makes use of the MLPQNA neural network (Multi Layer Perceptron with Quasi Newton learning rule), with the possibility to easily replace the specific machine-learning model chosen to predict photo-z. We present a summary of results on SDSS-DR9 galaxy data, used also to perform a direct comparison with PDFs obtained by the LE PHARE spectral energy distribution template fitting. We show that METAPHOR is capable of estimating the precision and reliability of photometric redshifts obtained with three different self-adaptive techniques, i.e. MLPQNA, Random Forest and the standard K-Nearest Neighbors models.

  8. An Estimation of Human Error Probability of Filtered Containment Venting System Using Dynamic HRA Method

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Seunghyun; Jae, Moosung [Hanyang University, Seoul (Korea, Republic of)

    2016-10-15

    The human failure events (HFEs) are considered in the development of system fault trees as well as accident sequence event trees as part of Probabilistic Safety Assessment (PSA). As methods for analyzing human error, several techniques, such as the Technique for Human Error Rate Prediction (THERP), Human Cognitive Reliability (HCR), and Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H), are used, and new methods for human reliability analysis (HRA) are under development at this time. This paper presents a dynamic HRA method for assessing human failure events, and an estimation of the human error probability for the filtered containment venting system (FCVS) is performed. The action associated with implementation of containment venting during a station blackout sequence is used as an example. In this report, the dynamic HRA method was used to analyze the FCVS-related operator action. The distributions of the required time and the available time were developed by the MAAP code and LHS sampling. Though the numerical calculations given here are only for illustrative purposes, the dynamic HRA method can be a useful tool for estimating human error probabilities, and it can be applied to any kind of operator action, including severe accident management strategies.
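
    A minimal sketch of the time-competition calculation at the core of such a dynamic HRA: the human error probability is taken as the probability that the required time exceeds the available time, here with hypothetical lognormal distributions standing in for the MAAP/LHS results.

```python
# Monte Carlo estimate of P(required time > available time) as a human error probability.
import numpy as np

rng = np.random.default_rng(5)
n = 100_000
time_required = rng.lognormal(mean=np.log(30.0), sigma=0.4, size=n)   # minutes, hypothetical
time_available = rng.lognormal(mean=np.log(60.0), sigma=0.3, size=n)  # minutes, hypothetical

hep = np.mean(time_required > time_available)   # fraction of samples where the action is too late
print(f"Estimated human error probability: {hep:.4f}")
```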

  9. ANNz2: Photometric Redshift and Probability Distribution Function Estimation using Machine Learning

    Science.gov (United States)

    Sadeh, I.; Abdalla, F. B.; Lahav, O.

    2016-10-01

    We present ANNz2, a new implementation of the public software for photometric redshift (photo-z) estimation of Collister & Lahav, which now includes generation of full probability distribution functions (PDFs). ANNz2 utilizes multiple machine learning methods, such as artificial neural networks and boosted decision/regression trees. The objective of the algorithm is to optimize the performance of the photo-z estimation, to properly derive the associated uncertainties, and to produce both single-value solutions and PDFs. In addition, estimators are made available, which mitigate possible problems of non-representative or incomplete spectroscopic training samples. ANNz2 has already been used as part of the first weak lensing analysis of the Dark Energy Survey, and is included in the experiment's first public data release. Here we illustrate the functionality of the code using data from the tenth data release of the Sloan Digital Sky Survey and the Baryon Oscillation Spectroscopic Survey. The code is available for download at http://github.com/IftachSadeh/ANNZ.

  10. Estimation of Transitional Probabilities of Discrete Event Systems from Cross-Sectional Survey and its Application in Tobacco Control.

    Science.gov (United States)

    Lin, Feng; Chen, Xinguang

    2010-02-01

    In order to find better strategies for tobacco control, it is often critical to know the transitional probabilities among various stages of tobacco use. Traditionally, such probabilities are estimated by analyzing data from longitudinal surveys that are often time-consuming and expensive to conduct. Since cross-sectional surveys are much easier to conduct, it will be much more practical and useful to estimate transitional probabilities from cross-sectional survey data if possible. However, no previous research has attempted to do this. In this paper, we propose a method to estimate transitional probabilities from cross-sectional survey data. The method is novel and is based on a discrete event system framework. In particular, we introduce state probabilities and transitional probabilities to conventional discrete event system models. We derive various equations that can be used to estimate the transitional probabilities. We test the method using cross-sectional data of the National Survey on Drug Use and Health. The estimated transitional probabilities can be used in predicting the future smoking behavior for decision-making, planning and evaluation of various tobacco control programs. The method also allows a sensitivity analysis that can be used to find the most effective way of tobacco control. Since there are much more cross-sectional survey data in existence than longitudinal ones, the impact of this new method is expected to be significant.

  11. Estimation of the nuclear fuel assembly eigenfrequencies in the probability sense

    Directory of Open Access Journals (Sweden)

    Zeman V.

    2014-12-01

    Full Text Available The paper deals with upper and lower limits estimation of the nuclear fuel assembly eigenfrequencies, whose design and operation parameters are random variables. Each parameter is defined by its mean value and standard deviation or by a range of values. The gradient and three sigma criterion approach is applied to the calculation of the upper and lower limits of fuel assembly eigenfrequencies in the probability sense. Presented analytical approach used for the calculation of eigenfrequencies sensitivity is based on the modal synthesis method and the fuel assembly decomposition into six identical revolved fuel rod segments, centre tube and load-bearing skeleton linked by spacer grids. The method is applied for the Russian TVSA-T fuel assembly in the WWER1000/320 type reactor core in the Czech nuclear power plant Temelín.

  12. Estimating landholders' probability of participating in a stewardship program, and the implications for spatial conservation priorities.

    Directory of Open Access Journals (Sweden)

    Vanessa M Adams

    Full Text Available The need to integrate social and economic factors into conservation planning has become a focus of academic discussions and has important practical implications for the implementation of conservation areas, both private and public. We conducted a survey in the Daly Catchment, Northern Territory, to inform the design and implementation of a stewardship payment program. We used a choice model to estimate the likely level of participation in two legal arrangements--conservation covenants and management agreements--based on payment level and proportion of properties required to be managed. We then spatially predicted landholders' probability of participating at the resolution of individual properties and incorporated these predictions into conservation planning software to examine the potential for the stewardship program to meet conservation objectives. We found that the properties that were least costly, per unit area, to manage were also the least likely to participate. This highlights a tension between planning for a cost-effective program and planning for a program that targets properties with the highest probability of participation.

  13. Estimating the ground-state probability of a quantum simulation with product-state measurements

    Directory of Open Access Journals (Sweden)

    Bryce eYoshimura

    2015-10-01

    Full Text Available One of the goals in quantum simulation is to adiabatically generate the ground state of a complicated Hamiltonian by starting with the ground state of a simple Hamiltonian and slowly evolving the system to the complicated one. If the evolution is adiabatic and the initial and final ground states are connected due to having the same symmetry, then the simulation will be successful. But in most experiments, adiabatic simulation is not possible because it would take too long, and the system has some level of diabatic excitation. In this work, we quantify the extent of the diabatic excitation even if we do not know a priori what the complicated ground state is. Since many quantum simulator platforms, like trapped ions, can measure the probabilities to be in a product state, we describe techniques that can employ these simple measurements to estimate the probability of being in the ground state of the system after the diabatic evolution. These techniques do not require one to know any properties about the Hamiltonian itself, nor to calculate its eigenstate properties. All the information is derived by analyzing the product-state measurements as functions of time.

  14. Measuring and Modeling Fault Density for Plume-Fault Encounter Probability Estimation

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, P.D.; Oldenburg, C.M.; Nicot, J.-P.

    2011-05-15

    Emission of carbon dioxide from fossil-fueled power generation stations contributes to global climate change. Storage of this carbon dioxide within the pores of geologic strata (geologic carbon storage) is one approach to mitigating the climate change that would otherwise occur. The large storage volume needed for this mitigation requires injection into brine-filled pore space in reservoir strata overlain by cap rocks. One of the main concerns of storage in such rocks is leakage via faults. In the early stages of site selection, site-specific fault coverages are often not available. This necessitates a method for using available fault data to develop an estimate of the likelihood of injected carbon dioxide encountering and migrating up a fault, primarily due to buoyancy. Fault population statistics provide one of the main inputs to calculate the encounter probability. Previous fault population statistics work is shown to be applicable to areal fault density statistics. This result is applied to a case study in the southern portion of the San Joaquin Basin, with the result that a carbon dioxide plume from a previously planned injection had a 3% chance of encountering a fully seal-offsetting fault.

  15. A method for Bayesian estimation of the probability of local intensity for some cities in Japan

    Directory of Open Access Journals (Sweden)

    G. C. Koravos

    2002-06-01

    Full Text Available Seismic hazard, in terms of the probability of exceedance of a given intensity in a given time span, was assessed for 12 sites in Japan. The method does not use any attenuation law. Instead, the dependence of local intensity on epicentral intensity I0 is calculated directly from the data, using a Bayesian model. According to this model (Meroni et al., 1994), local intensity follows the binomial distribution with parameters (I0, p). The parameter p is considered a random variable following the Beta distribution. In this manner, Bayesian estimates of p are obtained for various values of epicentral intensity and epicentral distance. In order to apply this model to the assessment of seismic hazard, the area under consideration is divided into seismic sources (zones) of known seismicity. The contribution of each source to the seismic hazard at every site is calculated according to the Bayesian model, and the result is the combined effect of all the sources. High probabilities of exceedance were calculated for the sites in the central part of the country, with hazard decreasing slightly towards the northern and southern parts.

  16. Comparison of disjunctive kriging to generalized probability kriging in application to the estimation of simulated and real data

    Energy Technology Data Exchange (ETDEWEB)

    Carr, J.R. (Nevada Univ., Reno, NV (United States). Dept. of Geological Sciences); Mao, Nai-hsien (Lawrence Livermore National Lab., CA (United States))

    1992-01-01

    Disjunctive kriging has been compared previously to multigaussian kriging and indicator cokriging for estimation of cumulative distribution functions; it has yet to be compared extensively to probability kriging. Herein, disjunctive kriging and generalized probability kriging are applied to one real and one simulated data set and compared for estimation of the cumulative distribution functions. Generalized probability kriging is an extension, based on generalized cokriging theory, of simple probability kriging for the estimation of the indicator and uniform transforms at each cutoff, Z_k. Disjunctive kriging and generalized probability kriging give similar results for the normally distributed simulated data, but differ considerably for the real data set, which has a non-normal distribution.

  17. Statistical Surrogate Models for Estimating Probability of High-Consequence Climate Change

    Science.gov (United States)

    Field, R.; Constantine, P.; Boslough, M.

    2011-12-01

    We have posed the climate change problem in a framework similar to that used in safety engineering, by acknowledging that probabilistic risk assessments focused on low-probability, high-consequence climate events are perhaps more appropriate than studies focused simply on best estimates. To properly explore the tails of the distribution requires extensive sampling, which is not possible with existing coupled atmospheric models due to the high computational cost of each simulation. We have developed specialized statistical surrogate models (SSMs) that can be used to make predictions about the tails of the associated probability distributions. An SSM differs from a deterministic surrogate model in that it represents each climate variable of interest as a space/time random field, that is, a random variable for every fixed location in the atmosphere at all times. The SSM can be calibrated to available spatial and temporal data from existing climate databases, or to a collection of outputs from general circulation models. Because of its reduced size and complexity, the realization of a large number of independent model outputs from an SSM becomes computationally straightforward, so that quantifying the risk associated with low-probability, high-consequence climate events becomes feasible. A Bayesian framework was also developed to provide quantitative measures of confidence, via Bayesian credible intervals, to assess these risks. To illustrate the use of the SSM, we considered two collections of NCAR CCSM 3.0 output data. The first collection corresponds to average December surface temperature for years 1990-1999, based on a collection of 8 different model runs obtained from the Program for Climate Model Diagnosis and Intercomparison (PCMDI). We calibrated the surrogate model to the available model data and made various point predictions. We also analyzed average precipitation rate in June, July, and August over a 54-year period assuming a cyclic Y2K ocean model. We

  18. Probability density function and estimation for error of digitized map coordinates in GIS

    Institute of Scientific and Technical Information of China (English)

    童小华; 刘大杰

    2004-01-01

    Traditionally, it is widely accepted that measurement error usually obeys the normal distribution. However, in this paper a new idea is proposed: the error in digitized data, which is a major derived data source in GIS, does not obey the normal distribution but rather the p-norm distribution with a determinate parameter. Assuming that the error is random and has the same statistical properties, the probability density functions of the normal distribution, Laplace distribution and p-norm distribution are derived based on the arithmetic mean axiom, median axiom and p-median axiom, which shows that the normal distribution is only one of these distributions, not the only one. Based on this idea, distribution fitness tests such as the skewness and kurtosis coefficient test, the Pearson chi-square test and the Kolmogorov test are conducted for digitized data. The results show that the error in map digitization obeys the p-norm distribution, whose parameter is close to 1.60. A least p-norm estimation and the least squares estimation of digitized data are further analyzed, showing that the least p-norm adjustment is better than the least squares adjustment for digitized data processing in GIS.
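
    A minimal sketch of a least p-norm fit, assuming p = 1.60 as quoted above and using synthetic "digitization errors"; scipy's scalar minimizer stands in for whatever adjustment procedure the authors actually used.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical digitized-coordinate errors (metres); in practice these would
# come from repeated digitization of known control points.
rng = np.random.default_rng(0)
errors = rng.laplace(loc=0.02, scale=0.05, size=500)

def lp_loss(theta, x, p):
    """Sum of absolute residuals raised to the power p (the L_p criterion)."""
    return np.sum(np.abs(x - theta) ** p)

# Least p-norm estimate of the systematic offset, with p = 1.60 as reported for
# map digitization errors, versus the ordinary least-squares estimate (p = 2).
p = 1.60
lp_fit = minimize_scalar(lp_loss, args=(errors, p), method="brent")
ls_fit = minimize_scalar(lp_loss, args=(errors, 2.0), method="brent")

print(f"least p-norm (p={p}) offset estimate: {lp_fit.x:.4f}")
print(f"least squares        offset estimate: {ls_fit.x:.4f}")
```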

  19. Estimates of probability of a cloud-free line of sight for RAPTOR TALON

    Science.gov (United States)

    Bauer, Ernest

    1994-07-01

    RAPTOR TALON is a concept that includes optical sensing of rocket plumes from low altitudes (2 km) through burnout from an air vehicle at 18-20 km, at long range, i.e., R approx. 20-100 km. The presence of clouds can interfere with optical sensing, and thus it is important to establish the Probability of a Cloud-Free Line of Sight (PCFLOS) at the locations and times of concern. Some previous estimates of PCFLOS were counter-intuitive and provided varying results; POET was asked to resolve the discrepancies. Here we ask for the PCFLOS for paths from 18 km altitude, above all clouds, down to 2 km, at a slant range of about 20-100 km, at two different locations (Baghdad, Iraq, and Seoul, Korea) for January and July at average cloudiness. There are very few clouds in Iraq during the summer, so for this case PCFLOS is about 0.9-0.95. At the other locations and seasons the mean cloud cover ranges between 0.4 and 0.7, and the PCFLOS values range between 0.6 and 0.7 at R = 20 km, and between 0.4 and 0.6 at R = 100 km. Estimates made by a variety of methods are generally consistent with this finding.

  20. Estimation of Time-Varying Channel State Transition Probabilities for Cognitive Radio Systems by means of Particle Swarm Optimization

    Directory of Open Access Journals (Sweden)

    A. Akbulut

    2012-04-01

    Full Text Available In this study, Particle Swarm Optimization is applied for the estimation of the channel state transition probabilities. Unlike most other studies, where the channel state transition probabilities are assumed to be known and/or constant, in this study, these values are realistically considered to be time-varying parameters, which are unknown to the secondary users of the cognitive radio systems. The results of this study demonstrate the following: without any a priori information about the channel characteristics, even in a very transient environment, it is quite possible to achieve reasonable estimates of channel state transition probabilities with a practical and simple implementation.
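
    The following is a hedged, self-contained sketch of the general idea: a hand-written particle swarm searches for the transition probabilities of a simple two-state (idle/busy) channel model by maximizing the likelihood of a simulated occupancy sequence. The cost function, swarm settings and channel parameters are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate an observed idle(0)/busy(1) channel occupancy sequence from a
# two-state Markov chain with "true" transition probabilities (assumed here).
true_p01, true_p10 = 0.15, 0.30          # P(0->1), P(1->0)
states = [0]
for _ in range(2000):
    s = states[-1]
    flip = rng.random() < (true_p01 if s == 0 else true_p10)
    states.append(1 - s if flip else s)
states = np.array(states)

def neg_log_likelihood(params, seq):
    """Negative log-likelihood of a 2-state Markov chain for given (p01, p10)."""
    p01, p10 = np.clip(params, 1e-6, 1 - 1e-6)
    prev, curr = seq[:-1], seq[1:]
    ll = np.where(prev == 0,
                  np.where(curr == 1, np.log(p01), np.log(1 - p01)),
                  np.where(curr == 0, np.log(p10), np.log(1 - p10)))
    return -ll.sum()

# Minimal particle swarm: positions are candidate (p01, p10) pairs in [0, 1]^2.
n_particles, n_iter, w, c1, c2 = 20, 60, 0.7, 1.5, 1.5
pos = rng.random((n_particles, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_cost = np.array([neg_log_likelihood(p, states) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 1))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.0, 1.0)
    cost = np.array([neg_log_likelihood(p, states) for p in pos])
    improved = cost < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], cost[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

print("PSO estimate of (p01, p10):", np.round(gbest, 3))
```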

  1. Estimating the probabilities of making a smoking quit attempt in Italy: stall in smoking cessation levels, 1986-2009

    Directory of Open Access Journals (Sweden)

    Carreras Giulia

    2012-03-01

    Full Text Available Abstract Background: No data on annual smoking cessation probability (i.e., the probability of successfully quitting in a given year) are available for Italy at a population level. Mathematical models typically used to estimate smoking cessation probabilities do not account for smoking relapse. In this paper, we developed a mathematical model to estimate annual quitting probabilities, taking into account smoking relapse and time since cessation. Methods: We developed a dynamic model describing the evolution of current, former, and never smokers. We estimated probabilities of smoking cessation by fitting the model to observed smoking prevalence in Italy, 1986-2009. Results: Annual cessation probabilities were higher than 5% only in elderly persons and in women aged Conclusions: Over the last 20 years, cessation probabilities among Italian smokers, particularly for those aged 30-59 years, have been very low and stalled. Quitting in Italy is considered a practicable strategy only by women of pregnancy age and by elderly persons, when it is likely that symptoms of tobacco-related diseases have already appeared. In order to increase cessation probabilities, smoking cessation treatment policies (introducing total reimbursement of cessation treatments, with further development of quitlines and smoking cessation services) should be strengthened, and a country-wide mass media campaign targeting smokers aged 30-59 years and focusing on promotion of quitting should be implemented.

  2. Estimation of peak discharge quantiles for selected annual exceedance probabilities in northeastern Illinois

    Science.gov (United States)

    Over, Thomas; Saito, Riki J.; Veilleux, Andrea; Sharpe, Jennifer B.; Soong, David T.; Ishii, Audrey

    2016-06-28

    This report provides two sets of equations for estimating peak discharge quantiles at annual exceedance probabilities (AEPs) of 0.50, 0.20, 0.10, 0.04, 0.02, 0.01, 0.005, and 0.002 (recurrence intervals of 2, 5, 10, 25, 50, 100, 200, and 500 years, respectively) for watersheds in Illinois based on annual maximum peak discharge data from 117 watersheds in and near northeastern Illinois. One set of equations was developed through a temporal analysis with a two-step least squares-quantile regression technique that measures the average effect of changes in the urbanization of the watersheds used in the study. The resulting equations can be used to adjust rural peak discharge quantiles for the effect of urbanization, and in this study the equations also were used to adjust the annual maximum peak discharges from the study watersheds to 2010 urbanization conditions. The other set of equations was developed by a spatial analysis. This analysis used generalized least-squares regression to fit the peak discharge quantiles computed from the urbanization-adjusted annual maximum peak discharges from the study watersheds to drainage-basin characteristics. The peak discharge quantiles were computed by using the Expected Moments Algorithm following the removal of potentially influential low floods defined by a multiple Grubbs-Beck test. To improve the quantile estimates, generalized skew coefficients were obtained from a newly developed regional skew model in which the skew increases with the urbanized land use fraction. The drainage-basin characteristics used as explanatory variables in the spatial analysis include drainage area, the fraction of developed land, the fraction of land with poorly drained soils or likely water, and the basin slope estimated as the ratio of the basin relief to basin perimeter. This report also provides the following: (1) examples to illustrate the use of the spatial and urbanization-adjustment equations for estimating peak discharge quantiles at

  3. A probability model for evaluating the bias and precision of influenza vaccine effectiveness estimates from case-control studies.

    Science.gov (United States)

    Haber, M; An, Q; Foppa, I M; Shay, D K; Ferdinands, J M; Orenstein, W A

    2015-05-01

    As influenza vaccination is now widely recommended, randomized clinical trials are no longer ethical in many populations. Therefore, observational studies on patients seeking medical care for acute respiratory illnesses (ARIs) are a popular option for estimating influenza vaccine effectiveness (VE). We developed a probability model for evaluating and comparing the bias and precision of estimates of VE against symptomatic influenza from two commonly used case-control study designs: the test-negative design and the traditional case-control design. We show that when vaccination does not affect the probability of developing non-influenza ARI, then VE estimates from test-negative design studies are unbiased even if vaccinees and non-vaccinees have different probabilities of seeking medical care for ARI, as long as the ratio of these probabilities is the same for illnesses resulting from influenza and non-influenza infections. Our numerical results suggest that in general, estimates from the test-negative design have smaller bias compared to estimates from the traditional case-control design as long as the probability of non-influenza ARI is similar among vaccinated and unvaccinated individuals. We did not find consistent differences between the standard errors of the estimates from the two study designs.
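
    A small simulation sketch of the test-negative logic described above, with made-up attack and care-seeking rates: vaccination reduces influenza ARI but not non-influenza ARI, and VE is estimated as one minus the odds ratio of vaccination among test-positive versus test-negative attendees. This is an illustration of the design, not the authors' probability model.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 200_000
true_ve = 0.6                                          # assumed true vaccine effectiveness

vaccinated = rng.random(n) < 0.5
flu_ari = rng.random(n) < np.where(vaccinated, 0.05 * (1 - true_ve), 0.05)
nonflu_ari = rng.random(n) < 0.10                      # unaffected by vaccination
seeks_care = rng.random(n) < 0.5                       # same rate in both groups here

cases = flu_ari & seeks_care                           # test-positive attendees
controls = nonflu_ari & ~flu_ari & seeks_care          # test-negative attendees

a = np.sum(cases & vaccinated)
b = np.sum(cases & ~vaccinated)
c = np.sum(controls & vaccinated)
d = np.sum(controls & ~vaccinated)
odds_ratio = (a / b) / (c / d)

print(f"true VE: {true_ve:.2f}, test-negative estimate: {1 - odds_ratio:.2f}")
```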

  4. Estimation of the Probable Maximum Flood for a Small Lowland River in Poland

    Science.gov (United States)

    Banasik, K.; Hejduk, L.

    2009-04-01

    The planning, design and use of hydrotechnical structures often requires the assessment of maximum flood potentials. The most common term applied to this upper limit of flooding is the probable maximum flood (PMF). The PMP/UH (probable maximum precipitation/unit hydrograph) method has been used in this study to predict the PMF for a small agricultural lowland river basin, the Zagozdzonka (a left tributary of the Vistula river) in Poland. The river basin, located about 100 km south of Warsaw, with an area - upstream of the gauge at Plachty - of 82 km2, has been investigated by the Department of Water Engineering and Environmental Restoration of Warsaw University of Life Sciences - SGGW since 1962. An over 40-year flow record was used in a previous investigation for predicting the T-year flood discharge (Banasik et al., 2003). The objective here was to estimate the PMF using the PMP/UH method and to compare the results with the 100-year flood. A new depth-duration curve of PMP for the local climatic conditions has been developed based on Polish maximum observed rainfall data (Ozga-Zielinska & Ozga-Zielinski, 2003). An exponential formula, with an exponent of 0.47, i.e. close to the exponent in the formula for the world PMP and also in the formula of PMP for Great Britain (Wilson, 1993), gives a rainfall depth about 40% lower than Wilson's. The effective rainfall (runoff volume) has been estimated from the PMP of various durations using the CN method (USDA-SCS, 1986). The CN value as well as the parameters of the IUH model (Nash, 1957) have been established from the 27 rainfall-runoff events recorded in the river basin in the period 1980-2004. Variability of the parameter values with the size of the events will be discussed in the paper. The results of the analysis have shown that the peak discharge of the PMF is 4.5 times larger than the 100-year flood, and the volume ratio of the respective direct hydrographs caused by rainfall events of critical duration is 4.0. References 1.Banasik K

  5. Using of bayesian networks to estimate the probability of "NATECH" scenario occurrence

    Science.gov (United States)

    Dobes, Pavel; Dlabka, Jakub; Jelšovská, Katarína; Polorecká, Mária; Baudišová, Barbora; Danihelka, Pavel

    2015-04-01

    In the twentieth century, Bayesian statistics and probability were not much used (perhaps not a preferred approach) in the area of natural and industrial risk analysis and management, nor in the analysis of so-called NATECH accidents (chemical accidents triggered by natural events such as earthquakes, floods or lightning; ref. E. Krausmann, 2011, doi:10.5194/nhess-11-921-2011). From the beginning, the main role was played by so-called "classical" frequentist probability (ref. Neyman, 1937), which relies on the outcomes of experiments and monitoring and does not allow expert beliefs, expectations and judgements to be taken into account (which, on the other hand, is one of the well-known pillars of the Bayesian approach to probability). Over the last 20 or 30 years, publications and conferences have shown a renaissance of Bayesian statistics in many scientific disciplines, including various branches of the geosciences. Is a certain level of trust in expert judgement within risk analysis back? After several decades of development in this field, the following hypothesis can be proposed (to be checked): probabilities of complex crisis situations and their top events (many NATECH events can be classified as crisis situations or emergencies) cannot be estimated by the classical frequentist approach alone, but also require a Bayesian approach (i.e., with the help of a pre-staged Bayesian network including expert belief and expectation as well as classical frequentist inputs), because there is not always enough quantitative information from the monitoring of historical emergencies, several dependent or independent variables may need to be considered and, in general, every emergency situation runs a little differently. On this topic, the team of authors presents its proposal of a pre-staged, typified Bayesian network model for a specified NATECH scenario
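
    A minimal sketch of the kind of blending the authors argue for, with hypothetical numbers: an expert Beta prior on P(failure | flood) is updated with sparse observed data and then propagated through a tiny two-node "flood -> tank failure" structure by total probability. This is not the authors' network, just an illustration of combining expert belief with frequentist inputs.

```python
# All probabilities below are assumed illustration values.
from scipy.stats import beta

p_flood = 0.02                      # annual flood probability from monitoring records (assumed)

# Expert belief about P(tank failure | flood): Beta(2, 38) encodes "around 5%,
# fairly uncertain". Observed history (assumed): 1 failure in 10 recorded floods.
prior_a, prior_b = 2, 38
failures, floods_observed = 1, 10
posterior = beta(prior_a + failures, prior_b + floods_observed - failures)

p_fail_given_flood = posterior.mean()
p_fail_no_flood = 1e-4              # background failure probability (assumed)

# Total probability of the NATECH top event in a given year.
p_natech = p_flood * p_fail_given_flood + (1 - p_flood) * p_fail_no_flood
print(f"posterior mean P(failure | flood): {p_fail_given_flood:.3f}")
print(f"annual P(NATECH top event):        {p_natech:.5f}")
```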

  6. The probability distribution for non-Gaussianity estimators constructed from the CMB trispectrum

    CERN Document Server

    Smith, Tristan L

    2012-01-01

    Considerable recent attention has focussed on the prospects of using the cosmic microwave background (CMB) trispectrum to probe the physics of the early universe. Here we evaluate the probability distribution function (PDF) for the standard estimator of the amplitude tau_nl of the CMB trispectrum, both for the null hypothesis (i.e., for Gaussian maps with tau_nl = 0) and for maps with a non-vanishing trispectrum (|tau_nl| > 0). We find these PDFs to be highly non-Gaussian in both cases. We also evaluate the variance with which the trispectrum amplitude can be measured as a function of its underlying value, tau_nl. We find a strong dependence of this variance on tau_nl. We also find that the variance does not, given the highly non-Gaussian nature of the PDF, effectively characterize the distribution. Detailed knowledge of these PDFs will therefore be imperative in order to properly interpret the implications of any given trispectrum measurement. For example, if a CMB experiment with a maximum multipole ...

  7. Estimating Recovery Failure Probabilities in Off-normal Situations from Full-Scope Simulator Data

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yochan; Park, Jinkyun; Kim, Seunghwan; Choi, Sun Yeong; Jung, Wondea [Korea Atomic Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    As part of this effort, KAERI developed the Human Reliability data EXtraction (HuREX) framework and is collecting full-scope simulator-based human reliability data into the OPERA (Operator PErformance and Reliability Analysis) database. In this study, as part of a series of estimation studies on HEPs and PSF effects, recovery failure probabilities (RFPs), which are significant information for a quantitative HRA, were produced from the OPERA database. Unsafe acts can occur at any time in safety-critical systems, and the operators often manage the systems by discovering their errors and eliminating or mitigating them. To model recovery processes or recovery strategies, several studies have categorized recovery behaviors. Because recent human error trends need to be considered in a human reliability analysis, the work of Jang et al. can be seen as an essential data collection effort. However, since those empirical results regarding soft controls were produced in a controlled laboratory environment with student participants, it is necessary to analyze a wider range of operator behaviors using full-scope simulators. This paper presents the statistics related to human error recovery behaviors obtained from full-scope simulations in which on-site operators participated. In this study, the recovery effects of shift changes or technical support centers were not considered owing to a lack of simulation data.

  8. Maximum Entropy Estimation of Probability Distribution of Variables in Higher Dimensions from Lower Dimensional Data

    Directory of Open Access Journals (Sweden)

    Jayajit Das '

    2015-07-01

    Full Text Available A common statistical situation concerns inferring an unknown distribution Q(x) from a known distribution P(y), where X (dimension n) and Y (dimension m) have a known functional relationship. Most commonly, n ≤ m, and the task is relatively straightforward for well-defined functional relationships. For example, if Y1 and Y2 are independent random variables, each uniform on [0, 1], one can determine the distribution of X = Y1 + Y2; here m = 2 and n = 1. However, biological and physical situations can arise where n > m and the functional relation Y→X is non-unique. In general, in the absence of additional information, there is no unique solution to Q in those cases. Nevertheless, one may still want to draw some inferences about Q. To this end, we propose a novel maximum entropy (MaxEnt) approach that estimates Q(x) based only on the available data, namely, P(y). The method has the additional advantage that one does not need to explicitly calculate the Lagrange multipliers. In this paper we develop the approach, for both discrete and continuous probability distributions, and demonstrate its validity. We give an intuitive justification as well, and we illustrate with examples.
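
    A discrete toy sketch of the MaxEnt idea (not the authors' algorithm, which avoids explicit Lagrange multipliers): Q(x) is chosen to maximize entropy subject to reproducing a known P(y) under an assumed many-to-one mapping g, using a generic constrained optimizer.

```python
# Assumed toy structure: x takes 6 discrete values, y = g(x) is many-to-one,
# and only P(y) is known. The analytic MaxEnt answer spreads each P(y)
# uniformly over the preimage g^{-1}(y), which the optimizer should recover.
import numpy as np
from scipy.optimize import minimize

g = np.array([0, 0, 1, 1, 1, 2])      # assumed mapping g(x) -> y in {0, 1, 2}
p_y = np.array([0.2, 0.5, 0.3])       # known distribution P(y)

def neg_entropy(q):
    q = np.clip(q, 1e-12, None)
    return np.sum(q * np.log(q))      # minimizing this maximizes entropy

constraints = [{"type": "eq",
                "fun": lambda q, y=y: np.sum(q[g == y]) - p_y[y]}
               for y in range(3)]

q0 = np.full(6, 1 / 6)
res = minimize(neg_entropy, q0, method="SLSQP",
               bounds=[(0, 1)] * 6, constraints=constraints)

print("MaxEnt estimate of Q(x):", np.round(res.x, 3))
# Expected roughly: [0.1, 0.1, 0.167, 0.167, 0.167, 0.3]
```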

  9. An Illustration of Inverse Probability Weighting to Estimate Policy-Relevant Causal Effects.

    Science.gov (United States)

    Edwards, Jessie K; Cole, Stephen R; Lesko, Catherine R; Mathews, W Christopher; Moore, Richard D; Mugavero, Michael J; Westreich, Daniel

    2016-08-15

    Traditional epidemiologic approaches allow us to compare counterfactual outcomes under 2 exposure distributions, usually 100% exposed and 100% unexposed. However, to estimate the population health effect of a proposed intervention, one may wish to compare factual outcomes under the observed exposure distribution to counterfactual outcomes under the exposure distribution produced by an intervention. Here, we used inverse probability weights to compare the 5-year mortality risk under observed antiretroviral therapy treatment plans to the 5-year mortality risk that would have been observed under an intervention in which all patients initiated therapy immediately upon entry into care among patients positive for human immunodeficiency virus in the US Centers for AIDS Research Network of Integrated Clinical Systems multisite cohort study between 1998 and 2013. Therapy-naïve patients (n = 14,700) were followed from entry into care until death, loss to follow-up, or censoring at 5 years or on December 31, 2013. The 5-year cumulative incidence of mortality was 11.65% under observed treatment plans and 10.10% under the intervention, yielding a risk difference of -1.57% (95% confidence interval: -3.08, -0.06). Comparing outcomes under the intervention with outcomes under observed treatment plans provides meaningful information about the potential consequences of new US guidelines to treat all patients with human immunodeficiency virus regardless of CD4 cell count under actual clinical conditions.
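
    A simplified inverse-probability-weighting sketch on simulated data (a point-treatment analogue, not the paper's dynamic treatment-plan estimator): a propensity model supplies the weights and the weighted 5-year risks are then compared.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 5000
cd4 = rng.normal(350, 120, n)                         # covariate (cells/mm^3), simulated
p_treat = 1 / (1 + np.exp(-(-2.0 + 0.006 * cd4)))     # confounded treatment assignment
treated = rng.random(n) < p_treat
# Outcome: 5-year death indicator, depends on both treatment and CD4 (simulated).
p_death = 1 / (1 + np.exp(-(-1.5 - 0.8 * treated - 0.004 * (cd4 - 350))))
died = rng.random(n) < p_death

# Propensity model and unstabilized inverse probability weights.
ps_model = LogisticRegression().fit(cd4.reshape(-1, 1), treated)
ps = ps_model.predict_proba(cd4.reshape(-1, 1))[:, 1]
weights = np.where(treated, 1 / ps, 1 / (1 - ps))

risk_treated = np.average(died[treated], weights=weights[treated])
risk_untreated = np.average(died[~treated], weights=weights[~treated])
print(f"IPW 5-year risk, treated:   {risk_treated:.3f}")
print(f"IPW 5-year risk, untreated: {risk_untreated:.3f}")
print(f"IPW risk difference:        {risk_treated - risk_untreated:+.3f}")
```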

  10. An empirical method for estimating probability density functions of gridded daily minimum and maximum temperature

    Science.gov (United States)

    Lussana, C.

    2013-04-01

    The presented work focuses on the investigation of gridded daily minimum (TN) and maximum (TX) temperature probability density functions (PDFs) with the intent of both characterising a region and detecting extreme values. The empirical PDF estimation procedure uses the most recent years of gridded temperature analysis fields available at ARPA Lombardia, in Northern Italy. The spatial interpolation is based on an implementation of Optimal Interpolation using observations from a dense surface network of automated weather stations. An effort has been made to identify both the time period and the spatial areas with a stable data density; otherwise, the elaboration could be influenced by the unsettled station distribution. The PDF used in this study is based on the Gaussian distribution; nevertheless, it is designed to have an asymmetrical (skewed) shape in order to enable distinction between warming and cooling events. Once the occurrence of extreme events is properly defined, the information can be delivered to users straightforwardly, on a local scale and in a concise way, such as: TX extremely cold/hot or TN extremely cold/hot.

  11. The proportionator: unbiased stereological estimation using biased automatic image analysis and non-uniform probability proportional to size sampling

    DEFF Research Database (Denmark)

    Gardi, Jonathan Eyal; Nyengaard, Jens Randel; Gundersen, Hans Jørgen Gottlieb

    2008-01-01

    The proportionator is a novel and radically different approach to sampling with microscopes based on well-known statistical theory (probability proportional to size - PPS sampling). It uses automatic image analysis, with a large range of options, to assign a weight to every field of view in the section; the desired number of fields are then sampled automatically with probability proportional to the weight and presented to the expert observer. Using any known stereological probe and estimator, the correct count in these fields leads to a simple, unbiased estimate of the total amount of structure in the sections. Because of its entirely different sampling strategy, based on known but non-uniform sampling probabilities, the proportionator for the first time allows the real CE at the section level to be automatically estimated (not just predicted), unbiased - for all estimators and at no extra cost to the user.
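
    A toy sketch of PPS sampling with replacement and the corresponding unbiased (Hansen-Hurwitz style) total estimate; the real proportionator samples fields without replacement and uses image-analysis weights, so the data and sampling scheme below are simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
n_fields = 400
true_counts = rng.poisson(3, n_fields)                   # cells per field (unknown in practice)
weights = true_counts + rng.uniform(0.5, 1.5, n_fields)  # noisy "image analysis" proxy weight
p_i = weights / weights.sum()                            # sampling probability per draw

# Draw fields with probability proportional to weight (with replacement), then
# weight each observed count by 1/p_i to estimate the total over all fields.
n_sample = 40
sampled = rng.choice(n_fields, size=n_sample, replace=True, p=p_i)
estimate = np.mean(true_counts[sampled] / p_i[sampled])

print(f"true total:   {true_counts.sum()}")
print(f"PPS estimate: {estimate:.1f}")
```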

  12. Estimation of probable maximum precipitation at the Kielce Upland (Poland) using meteorological method

    Science.gov (United States)

    Suligowski, Roman

    2014-05-01

    Probable Maximum Precipitation (PMP) is estimated here based upon the physical mechanisms of precipitation formation at the Kielce Upland. This estimation stems from a meteorological analysis of extremely high precipitation events which occurred in the area between 1961 and 2007, causing serious flooding from rivers that drain the entire Kielce Upland. The meteorological situation has been assessed drawing on synoptic maps, baric topography charts, satellite and radar images, as well as the results of meteorological observations derived from surface weather observation stations. The most significant elements of this research include the comparison between distinctive synoptic situations over Europe and the subsequent determination of the typical rainfall-generating mechanism. This allows the author to identify the source areas of the air masses responsible for extremely high precipitation at the Kielce Upland. Analysis of the meteorological situations showed that the source areas for the humid air masses which cause the largest rainfalls at the Kielce Upland are the northern Adriatic Sea and the north-eastern coast of the Black Sea. Flood hazard in the Kielce Upland catchments was triggered by daily precipitation of over 60 mm. The highest representative dew point temperature in the source areas of warm air masses (those responsible for high precipitation at the Kielce Upland) exceeded 20 degrees Celsius, with a maximum of 24.9 degrees Celsius, while precipitable water amounted to 80 mm. The value of precipitable water is also used for the computation of factors featuring the system, namely the mass transformation factor and the system effectiveness factor. The mass transformation factor is computed based on precipitable water in the feeding mass and precipitable water in the source area. The system effectiveness factor (as the indicator of the maximum inflow velocity and the maximum velocity in the zone of front or ascending currents, forced by orography) is computed from the quotient of precipitable water in

  13. Clinical radiobiology of glioblastoma multiforme. Estimation of tumor control probability from various radiotherapy fractionation schemes

    Energy Technology Data Exchange (ETDEWEB)

    Pedicini, Piernicola [I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Unit of Nuclear Medicine, Department of Radiation and Metabolic Therapies, Rionero-in-Vulture (Italy); Department of Radiation and Metabolic Therapies, I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Unit of Radiotherapy, Rionero-in-Vulture (Italy); Fiorentino, Alba [Sacro Cuore - Don Calabria Hospital, Radiation Oncology Department, Negrar, Verona (Italy); Simeon, Vittorio [I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Laboratory of Preclinical and Translational Research, Rionero-in-Vulture (Italy); Tini, Paolo; Pirtoli, Luigi [University of Siena and Tuscany Tumor Institute, Unit of Radiation Oncology, Department of Medicine Surgery and Neurological Sciences, Siena (Italy); Chiumento, Costanza [Department of Radiation and Metabolic Therapies, I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Unit of Radiotherapy, Rionero-in-Vulture (Italy); Salvatore, Marco [I.R.C.C.S. SDN Foundation, Unit of Nuclear Medicine, Napoli (Italy); Storto, Giovanni [I.R.C.C.S.-Regional-Cancer-Hospital-C.R.O.B, Unit of Nuclear Medicine, Department of Radiation and Metabolic Therapies, Rionero-in-Vulture (Italy)

    2014-10-15

    The aim of this study was to estimate a radiobiological set of parameters from the available clinical data on glioblastoma (GB). A number of clinical trial outcomes from patients affected by GB and treated with surgery and adjuvant radiochemotherapy were analyzed to estimate a set of radiobiological parameters for a tumor control probability (TCP) model. The analytical/graphical method employed to fit the clinical data allowed us to estimate the intrinsic tumor radiosensitivity (α), repair capability (b), and repopulation doubling time (T_d) in a first phase, and subsequently the number of clonogens (N) and the kick-off time for accelerated proliferation (T_k). The results were used to formulate a hypothesis for a schedule expected to significantly improve local control. The 95% confidence intervals (CI_95%) of all parameters are also discussed. The pooled analysis employed to estimate the parameters summarizes the data of 559 patients, while the studies selected to verify the results summarize data of 104 patients. The best estimates and the CI_95% are α = 0.12 Gy^-1 (0.10-0.14), b = 0.015 Gy^-2 (0.013-0.020), α/b = 8 Gy (5.0-10.8), T_d = 15.4 days (13.2-19.5), N = 1 × 10^4 (1.2 × 10^3 - 1 × 10^5), and T_k = 37 days (29-46). The dose required to offset the repopulation occurring after 1 day (D_prolif), starting after T_k, was estimated as 0.30 Gy/day (0.22-0.39). The analysis confirms a high value for the α/b ratio. Moreover, a high intrinsic radiosensitivity together with a long kick-off time for accelerated repopulation and moderate repopulation kinetics were found. The results indicate a substantial independence from the duration of the overall treatment and an improvement in treatment effectiveness by increasing the total dose without increasing the dose per fraction. (orig.) [German] Estimation of a radiobiological parameter set on the basis of clinical data in
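
    A hedged sketch of a Poisson/linear-quadratic TCP calculation using the point estimates quoted above (α = 0.12 Gy^-1, b = 0.015 Gy^-2, N = 10^4, T_d = 15.4 d, T_k = 37 d); the fractionation schedules below are illustrative examples, not the schedules analyzed in the paper.

```python
import numpy as np

def tcp(n_fractions, dose_per_fraction, overall_time_days,
        alpha=0.12, beta=0.015, n_clonogens=1e4, t_d=15.4, t_k=37.0):
    """Poisson TCP with linear-quadratic cell kill and delayed repopulation."""
    log_cell_kill = n_fractions * (alpha * dose_per_fraction
                                   + beta * dose_per_fraction ** 2)
    repop = np.log(2) / t_d * max(0.0, overall_time_days - t_k)
    surviving = n_clonogens * np.exp(-log_cell_kill + repop)
    return np.exp(-surviving)

# Conventional 60 Gy in 30 x 2 Gy over ~40 days (example schedule).
print(f"TCP(30 x 2 Gy, 40 d): {tcp(30, 2.0, 40):.2f}")
# Dose escalation at the same fraction size, e.g. 35 x 2 Gy over ~47 days.
print(f"TCP(35 x 2 Gy, 47 d): {tcp(35, 2.0, 47):.2f}")
```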

  14. Nonparametric estimation of transition probabilities in the non-Markov illness-death model: A comparative study.

    Science.gov (United States)

    de Uña-Álvarez, Jacobo; Meira-Machado, Luís

    2015-06-01

    Multi-state models are often used for modeling complex event history data. In these models the estimation of the transition probabilities is of particular interest, since they allow for long-term predictions of the process. These quantities have been traditionally estimated by the Aalen-Johansen estimator, which is consistent if the process is Markov. Several non-Markov estimators have been proposed in the recent literature, and their superiority with respect to the Aalen-Johansen estimator has been proved in situations in which the Markov condition is strongly violated. However, the existing estimators have the drawback of requiring that the support of the censoring distribution contains the support of the lifetime distribution, which is not often the case. In this article, we propose two new methods for estimating the transition probabilities in the progressive illness-death model. Some asymptotic results are derived. The proposed estimators are consistent regardless of the Markov condition and the aforementioned assumption about the censoring support. We explore the finite sample behavior of the estimators through simulations. The main conclusion of this piece of research is that the proposed estimators are much more efficient than the existing non-Markov estimators in most cases. An application to a clinical trial on colon cancer is included. Extensions to progressive processes beyond the three-state illness-death model are discussed.

  15. Small-area estimation of the probability of toxocariasis in New York City based on sociodemographic neighborhood composition.

    Science.gov (United States)

    Walsh, Michael G; Haseeb, M A

    2014-01-01

    Toxocariasis is increasingly recognized as an important neglected infection of poverty (NIP) in developed countries, and may constitute the most important NIP in the United States (US) given its association with chronic sequelae such as asthma and poor cognitive development. Its potential public health burden notwithstanding, toxocariasis surveillance is minimal throughout the US and so the true burden of disease remains uncertain in many areas. The Third National Health and Nutrition Examination Survey conducted a representative serologic survey of toxocariasis to estimate the prevalence of infection in diverse US subpopulations across different regions of the country. Using the NHANES III surveillance data, the current study applied the predicted probabilities of toxocariasis to the sociodemographic composition of New York census tracts to estimate the local probability of infection across the city. The predicted probability of toxocariasis ranged from 6% among US-born Latino women with a university education to 57% among immigrant men with less than a high school education. The predicted probability of toxocariasis exhibited marked spatial variation across the city, with particularly high infection probabilities in large sections of Queens, and smaller, more concentrated areas of Brooklyn and northern Manhattan. This investigation is the first attempt at small-area estimation of the probability surface of toxocariasis in a major US city. While this study does not define toxocariasis risk directly, it does provide a much needed tool to aid the development of toxocariasis surveillance in New York City.

  16. Small-area estimation of the probability of toxocariasis in New York City based on sociodemographic neighborhood composition.

    Directory of Open Access Journals (Sweden)

    Michael G Walsh

    Full Text Available Toxocariasis is increasingly recognized as an important neglected infection of poverty (NIP) in developed countries, and may constitute the most important NIP in the United States (US) given its association with chronic sequelae such as asthma and poor cognitive development. Its potential public health burden notwithstanding, toxocariasis surveillance is minimal throughout the US and so the true burden of disease remains uncertain in many areas. The Third National Health and Nutrition Examination Survey conducted a representative serologic survey of toxocariasis to estimate the prevalence of infection in diverse US subpopulations across different regions of the country. Using the NHANES III surveillance data, the current study applied the predicted probabilities of toxocariasis to the sociodemographic composition of New York census tracts to estimate the local probability of infection across the city. The predicted probability of toxocariasis ranged from 6% among US-born Latino women with a university education to 57% among immigrant men with less than a high school education. The predicted probability of toxocariasis exhibited marked spatial variation across the city, with particularly high infection probabilities in large sections of Queens, and smaller, more concentrated areas of Brooklyn and northern Manhattan. This investigation is the first attempt at small-area estimation of the probability surface of toxocariasis in a major US city. While this study does not define toxocariasis risk directly, it does provide a much needed tool to aid the development of toxocariasis surveillance in New York City.

  17. Kernel density estimation and marginalized-particle based probability hypothesis density filter for multi-target tracking

    Institute of Scientific and Technical Information of China (English)

    张路平; 王鲁平; 李飚; 赵明

    2015-01-01

    In order to improve the performance of the particle filter (PF) based probability hypothesis density (PHD) algorithm in terms of number estimation and state extraction of multiple targets, a new probability hypothesis density filter algorithm based on marginalized particles and kernel density estimation is proposed, which utilizes the idea of the marginalized particle filter to enhance the estimating performance of the PHD. The state variables are decomposed into linear and non-linear parts. The particle filter is adopted to predict and estimate the nonlinear states of the multi-target system after dimensionality reduction, while the Kalman filter is applied to estimate the linear parts under the linear Gaussian condition. Embedding the information of the linear states into the estimated nonlinear states helps to reduce the estimation variance and improve the accuracy of target number estimation. Mean-shift kernel density estimation, which inherently searches for peak values via adaptive gradient-ascent iterations, is introduced to cluster particles and extract target states; it is independent of the target number and can converge to the local peak position of the PHD distribution while avoiding errors due to inaccuracy in modeling and parameter estimation. Experiments show that the proposed algorithm can obtain higher tracking accuracy when using fewer sampling particles and has lower computational complexity compared with the PF-PHD.

  18. Threatened species and the potential loss of phylogenetic diversity: conservation scenarios based on estimated extinction probabilities and phylogenetic risk analysis.

    Science.gov (United States)

    Faith, Daniel P

    2008-12-01

    New species conservation strategies, including the EDGE of Existence (EDGE) program, have expanded threatened species assessments by integrating information about species' phylogenetic distinctiveness. Distinctiveness has been measured through simple scores that assign shared credit among species for evolutionary heritage represented by the deeper phylogenetic branches. A species with a high score combined with a high extinction probability receives high priority for conservation efforts. Simple hypothetical scenarios for phylogenetic trees and extinction probabilities demonstrate how such scoring approaches can provide inefficient priorities for conservation. An existing probabilistic framework derived from the phylogenetic diversity measure (PD) properly captures the idea of shared responsibility for the persistence of evolutionary history. It avoids static scores, takes into account the status of close relatives through their extinction probabilities, and allows for the necessary updating of priorities in light of changes in species threat status. A hypothetical phylogenetic tree illustrates how changes in extinction probabilities of one or more species translate into changes in expected PD. The probabilistic PD framework provided a range of strategies that moved beyond expected PD to better consider worst-case PD losses. In another example, risk aversion gave higher priority to a conservation program that provided a smaller, but less risky, gain in expected PD. The EDGE program could continue to promote a list of top species conservation priorities through application of probabilistic PD and simple estimates of current extinction probability. The list might be a dynamic one, with all the priority scores updated as extinction probabilities change. Results of recent studies suggest that estimation of extinction probabilities derived from the red list criteria linked to changes in species range sizes may provide estimated probabilities for many different species

  19. Normal Tissue Complication Probability (NTCP) modeling of late rectal bleeding following external beam radiotherapy for prostate cancer: A Test of the QUANTEC-recommended NTCP model

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Mitchell; Agranovich, Alexander; Karvat, Anand; Kwan, Winkle (Fraser Valley Centre, British Columbia Cancer Centre, Surrey, BC (Canada)); Moiseenko, Vitali (Vancouver Centre, British Columbia Cancer Agency, Vancouver, BC (Canada)); Saleh, Ziad H.; Apte, Aditya A.; Deasy, Joseph O. (Dept. of Radiation Oncology and the Mallinckrodt Inst. of Radiology, Washington Univ., St. Louis, MO (United States)), e-mail: deasyj@mskcc.org

    2010-10-15

    Purpose/background. Validating a predictive model for late rectal bleeding following external beam treatment for prostate cancer would enable safer treatments or dose escalation. We tested the normal tissue complication probability (NTCP) model recommended in the recent QUANTEC review (quantitative analysis of normal tissue effects in the clinic). Material and methods. One hundred and sixty-one prostate cancer patients were treated with 3D conformal radiotherapy for prostate cancer at the British Columbia Cancer Agency in a prospective protocol. The total prescription dose for all patients was 74 Gy, delivered in 2 Gy/fraction. 159 3D treatment planning datasets were available for analysis. Rectal dose volume histograms were extracted and fitted to a Lyman-Kutcher-Burman NTCP model. Results. Late rectal bleeding (>grade 2) was observed in 12/159 patients (7.5%). Multivariate logistic regression with dose-volume parameters (V50, V60, V70, etc.) was non-significant. Among clinical variables, only age was significant on a Kaplan-Meier log-rank test (p = 0.007, with an optimal cut point of 77 years). Best-fit Lyman-Kutcher-Burman model parameters (with 95% confidence intervals) were: n = 0.068 (0.01, +infinity); m = 0.14 (0.0, 0.86); and TD50 = 81 (27, 136) Gy. The peak values fall within the 95% QUANTEC confidence intervals. On this dataset, both models had only modest ability to predict complications: the best-fit model had a Spearman's rank correlation coefficient of rs = 0.099 (p = 0.11) and an area under the receiver operating characteristic curve (AUC) of 0.62; the QUANTEC model had rs = 0.096 (p = 0.11) and a corresponding AUC of 0.61. Although the QUANTEC model consistently predicted higher NTCP values, it could not be rejected according to the chi-square test (p = 0.44). Conclusions. Observed complications, and best-fit parameter estimates, were consistent with the QUANTEC-preferred NTCP model. However, predictive power was low, at least partly because the rectal dose
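
    A short sketch of the Lyman-Kutcher-Burman calculation with the best-fit parameters quoted above (n = 0.068, m = 0.14, TD50 = 81 Gy); the differential dose-volume histogram below is invented purely for illustration.

```python
import numpy as np
from scipy.stats import norm

def lkb_ntcp(doses_gy, frac_volumes, n=0.068, m=0.14, td50=81.0):
    """LKB model: generalized EUD reduction of the DVH, then a probit response."""
    frac_volumes = np.asarray(frac_volumes, dtype=float)
    frac_volumes /= frac_volumes.sum()
    geud = np.sum(frac_volumes * np.asarray(doses_gy, dtype=float) ** (1.0 / n)) ** n
    t = (geud - td50) / (m * td50)
    return geud, norm.cdf(t)

# Illustrative differential DVH: dose-bin centres (Gy) and relative volumes.
doses = [10, 30, 50, 65, 72, 75]
volumes = [0.30, 0.25, 0.20, 0.12, 0.08, 0.05]

geud, ntcp = lkb_ntcp(doses, volumes)
print(f"gEUD: {geud:.1f} Gy, predicted NTCP: {ntcp:.3f}")
```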

  20. Survival estimates for elite male and female Olympic athletes and tennis championship competitors.

    Science.gov (United States)

    Coate, D; Sun, R

    2013-12-01

    In this paper, we report survival estimates for male and female Olympic medal winners and for male and female finalists at the British and U.S. national tennis championships. We find a consistent longevity advantage of Olympic medal-winning female athletes over Olympic medal-winning male athletes competing separately in the same events since 1900, and of female finalists over male finalists competing separately in the finals of the national tennis championships of Britain and of the United States since the 1880s. This is the case for sample mean comparisons, for Kaplan-Meier survival function estimates, including life expectancy, and for Cox proportional hazards estimates, which show statistically significant lower hazard rates for women with birth year and other variables held constant. The female longevity advantage over males is similar in the early period samples (birth years before 1920) and in the full period samples, and is 5-7 years.
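
    A hand-rolled Kaplan-Meier product-limit sketch on synthetic lifetimes, illustrating the kind of sex-stratified survival comparison reported above; the simulated ages at death and the censoring fraction are assumptions, not the athletes' data.

```python
import numpy as np

def kaplan_meier(times, events):
    """Return (time, S(t)) pairs from the product-limit estimator."""
    times, events = np.asarray(times, dtype=float), np.asarray(events, dtype=int)
    surv, s = [], 1.0
    for t in np.unique(times[events == 1]):
        d = np.sum((times == t) & (events == 1))   # deaths at t
        n = np.sum(times >= t)                     # number at risk just before t
        s *= 1.0 - d / n
        surv.append((t, s))
    return np.array(surv)

rng = np.random.default_rng(4)
# Simulated ages at death (years), women living somewhat longer, ~10% censored.
women, men = rng.normal(82, 8, 200), rng.normal(76, 8, 200)
events_w = (rng.random(200) > 0.1).astype(int)
events_m = (rng.random(200) > 0.1).astype(int)

km_w, km_m = kaplan_meier(women, events_w), kaplan_meier(men, events_m)
print(f"median survival, women: {km_w[km_w[:, 1] <= 0.5][0, 0]:.1f}")
print(f"median survival, men:   {km_m[km_m[:, 1] <= 0.5][0, 0]:.1f}")
```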

  1. Procedures for using expert judgment to estimate human-error probabilities in nuclear power plant operations. [PWR; BWR

    Energy Technology Data Exchange (ETDEWEB)

    Seaver, D.A.; Stillwell, W.G.

    1983-03-01

    This report describes and evaluates several procedures for using expert judgment to estimate human-error probabilities (HEPs) in nuclear power plant operations. These HEPs are currently needed for several purposes, particularly for probabilistic risk assessments. Data do not exist for estimating these HEPs, so expert judgment can provide these estimates in a timely manner. Five judgmental procedures are described here: paired comparisons, ranking and rating, direct numerical estimation, indirect numerical estimation and multiattribute utility measurement. These procedures are evaluated in terms of several criteria: quality of judgments, difficulty of data collection, empirical support, acceptability, theoretical justification, and data processing. Situational constraints such as the number of experts available, the number of HEPs to be estimated, the time available, the location of the experts, and the resources available are discussed in regard to their implications for selecting a procedure for use.

  2. Development of a score and probability estimate for detecting angle closure based on anterior segment optical coherence tomography.

    Science.gov (United States)

    Nongpiur, Monisha E; Haaland, Benjamin A; Perera, Shamira A; Friedman, David S; He, Mingguang; Sakata, Lisandro M; Baskaran, Mani; Aung, Tin

    2014-01-01

    To develop a score along with an estimated probability of disease for detecting angle closure based on anterior segment optical coherence tomography (AS OCT) imaging. Cross-sectional study. A total of 2047 subjects 50 years of age and older were recruited from a community polyclinic in Singapore. All subjects underwent standardized ocular examination including gonioscopy and imaging by AS OCT (Carl Zeiss Meditec). Customized software (Zhongshan Angle Assessment Program) was used to measure AS OCT parameters. Complete data were available for 1368 subjects. Data from the right eyes were used for analysis. A stepwise logistic regression model with Akaike information criterion was used to generate a score that then was converted to an estimated probability of the presence of gonioscopic angle closure, defined as the inability to visualize the posterior trabecular meshwork for at least 180 degrees on nonindentation gonioscopy. Of the 1368 subjects, 295 (21.6%) had gonioscopic angle closure. The angle closure score was calculated from the shifted linear combination of the AS OCT parameters. The score can be converted to an estimated probability of having angle closure using the relationship: estimated probability = e^score / (1 + e^score), where e is the natural exponential. The score performed well in a second independent sample of 178 angle-closure subjects and 301 normal controls, with an area under the receiver operating characteristic curve of 0.94. A score derived from a single AS OCT image, coupled with an estimated probability, provides an objective platform for detection of angle closure. Copyright © 2014 Elsevier Inc. All rights reserved.
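
    The score-to-probability relationship quoted above is simply the logistic transform; a tiny sketch follows (the score values are made up for illustration).

```python
import math

def angle_closure_probability(score: float) -> float:
    """Convert a logistic-regression score to an estimated probability."""
    return math.exp(score) / (1.0 + math.exp(score))

print(f"score = -1.2 -> probability = {angle_closure_probability(-1.2):.2f}")
print(f"score = +0.8 -> probability = {angle_closure_probability(+0.8):.2f}")
```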

  3. Experimental estimation of the photons visiting probability profiles in time-resolved diffuse reflectance measurement.

    Science.gov (United States)

    Sawosz, P; Kacprzak, M; Weigl, W; Borowska-Solonynko, A; Krajewski, P; Zolek, N; Ciszek, B; Maniewski, R; Liebert, A

    2012-12-07

    A time-gated intensified CCD camera was applied for time-resolved imaging of light penetrating in an optically turbid medium. Spatial distributions of light penetration probability in the plane perpendicular to the axes of the source and the detector were determined at different source positions. Furthermore, visiting probability profiles of diffuse reflectance measurement were obtained by the convolution of the light penetration distributions recorded at different source positions. Experiments were carried out on homogeneous phantoms, more realistic two-layered tissue phantoms based on the human skull filled with Intralipid-ink solution and on cadavers. It was noted that the photons visiting probability profiles depend strongly on the source-detector separation, the delay between the laser pulse and the photons collection window and the complex tissue composition of the human head.

  4. Estimation of Circular Error Probability of Strapped Down Inertial Navigation System by Propagation of Error Covariance Matrix

    Directory of Open Access Journals (Sweden)

    S. Vathsal

    1994-01-01

    Full Text Available This paper provides an error model of the strapped-down inertial navigation system in the state space format. A method to estimate the circular error probability is presented using time propagation of the error covariance matrix. Numerical results have been obtained for a typical flight trajectory. Sensitivity studies have also been conducted for variations of sensor noise covariances and initial state uncertainty. This methodology seems to work in all the practical cases considered so far. The software has been tested for both the local vertical frame and the inertial frame. The covariance propagation technique provides accurate estimation of the dispersion of position at impact. This in turn enables the circular error probability (CEP) to be estimated very accurately.
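
    A hedged Monte Carlo sketch of turning a propagated 2 x 2 horizontal position-error covariance at impact into a CEP (the 50% radial-miss radius); the covariance entries below are illustrative, not from an actual INS error budget.

```python
import numpy as np

rng = np.random.default_rng(5)
cov_impact = np.array([[250.0, 40.0],     # m^2, east-east and east-north terms (assumed)
                       [40.0, 160.0]])    # m^2, north-east and north-north terms (assumed)

samples = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov_impact, size=200_000)
radial_miss = np.linalg.norm(samples, axis=1)

cep = np.percentile(radial_miss, 50)      # radius containing 50% of impacts
print(f"CEP (50% radius): {cep:.1f} m")
```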

  5. Effects of population variability on the accuracy of detection probability estimates

    DEFF Research Database (Denmark)

    Ordonez Gloria, Alejandro

    2011-01-01

    Observing a constant fraction of the population over time, locations, or species is virtually impossible. Hence, quantifying this proportion (i.e. detection probability) is an important task in quantitative population ecology. In this study we determined, via computer simulations, the effect of...

  6. How does new evidence change our estimates of probabilities? Carnap's formula revisited

    Science.gov (United States)

    Kreinovich, Vladik; Quintana, Chris

    1992-01-01

    The formula originally proposed by R. Carnap in his analysis of induction is reviewed and its natural generalization is presented. A situation is considered where the probability of a certain event is determined without using standard statistical methods due to the lack of observation.

  7. Using Preferred Outcome Distributions to estimate Value and Probability Weighting Functions in Decisions under Risk

    NARCIS (Netherlands)

    A.C.D. Donkers (Bas); T. Lourenco (Tania); B.G.C. Dellaert (Benedict); D.G. Goldstein (Daniel G.)

    2013-01-01

    In this paper we propose the use of preferred outcome distributions as a new method to elicit individuals' value and probability weighting functions in decisions under risk. Extant approaches for the elicitation of these two key ingredients of individuals' risk attitude typically rely

  9. Absolute probability estimates of lethal vessel strikes to North Atlantic right whales in Roseway Basin, Scotian Shelf.

    Science.gov (United States)

    van der Hoop, Julie M; Vanderlaan, Angelia S M; Taggart, Christopher T

    2012-10-01

    Vessel strikes are the primary source of known mortality for the endangered North Atlantic right whale (Eubalaena glacialis). Multi-institutional efforts to reduce mortality associated with vessel strikes include vessel-routing amendments such as the International Maritime Organization voluntary "area to be avoided" (ATBA) in the Roseway Basin right whale feeding habitat on the southwestern Scotian Shelf. Though relative probabilities of lethal vessel strikes have been estimated and published, absolute probabilities remain unknown. We used a modeling approach to determine the regional effect of the ATBA, by estimating reductions in the expected number of lethal vessel strikes. This analysis differs from others in that it explicitly includes a spatiotemporal analysis of real-time transits of vessels through a population of simulated, swimming right whales. Combining automatic identification system (AIS) vessel navigation data and an observationally based whale movement model allowed us to determine the spatial and temporal intersection of vessels and whales, from which various probability estimates of lethal vessel strikes are derived. We estimate one lethal vessel strike every 0.775-2.07 years prior to ATBA implementation, consistent with and more constrained than previous estimates of every 2-16 years. Following implementation, a lethal vessel strike is expected every 41 years. When whale abundance is held constant across years, we estimate that voluntary vessel compliance with the ATBA results in an 82% reduction in the per capita rate of lethal strikes; very similar to a previously published estimate of 82% reduction in the relative risk of a lethal vessel strike. The models we developed can inform decision-making and policy design, based on their ability to provide absolute, population-corrected, time-varying estimates of lethal vessel strikes, and they are easily transported to other regions and situations.

  10. Geospatial tools effectively estimate nonexceedance probabilities of daily streamflow at ungauged and intermittently gauged locations in Ohio

    Science.gov (United States)

    Farmer, William H.; Koltun, Greg

    2017-01-01

    Study region: The state of Ohio in the United States, a humid, continental climate. Study focus: The estimation of nonexceedance probabilities of daily streamflows as an alternative means of establishing the relative magnitudes of streamflows associated with hydrologic and water-quality observations. New hydrological insights for the region: Several methods for estimating nonexceedance probabilities of daily mean streamflows are explored, including single-index methodologies (nearest-neighboring index) and geospatial tools (kriging and topological kriging). These methods were evaluated by conducting leave-one-out cross-validations based on analyses of nearly 7 years of daily streamflow data from 79 unregulated streamgages in Ohio and neighboring states. The pooled, ordinary kriging model, with a median Nash–Sutcliffe performance of 0.87, was superior to the single-site index methods, though there was some bias in the tails of the probability distribution. Incorporating network structure through topological kriging did not improve performance. The pooled, ordinary kriging model was applied to 118 locations without systematic streamgaging across Ohio where instantaneous streamflow measurements had been made concurrent with water-quality sampling on at least 3 separate days. Spearman rank correlations between estimated nonexceedance probabilities and measured streamflows were high, with a median value of 0.76. In consideration of application, the degree of regulation in a set of sample sites helped to specify the streamgages required to implement kriging approaches successfully.
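
    The single-index idea evaluated above can be sketched in a few lines: nonexceedance probabilities are computed from Weibull plotting positions of an index streamgage's daily record, and a measurement at an ungauged site inherits the probability of the concurrent index-site flow. This is only a hedged illustration with synthetic flows, not the pooled kriging model the study found superior.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic daily mean streamflows (cubic feet per second) at an index streamgage
index_flows = rng.lognormal(mean=3.0, sigma=1.0, size=7 * 365)

def nonexceedance_probability(record, value):
    """Weibull plotting-position estimate of P(Q <= value) from a daily record."""
    n = len(record)
    rank = np.searchsorted(np.sort(record), value, side="right")  # count of record values <= value
    return rank / (n + 1.0)

# A streamflow measured at an ungauged site concurrent with an index-site flow of 35 cfs:
# the single-index method simply transfers the index-site nonexceedance probability.
p = nonexceedance_probability(index_flows, 35.0)
print(f"estimated nonexceedance probability: {p:.3f}")
```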

  11. Classification of Knee Joint Vibration Signals Using Bivariate Feature Distribution Estimation and Maximal Posterior Probability Decision Criterion

    Directory of Open Access Journals (Sweden)

    Fang Zheng

    2013-04-01

    Full Text Available Analysis of knee joint vibration or vibroarthrographic (VAG) signals using signal processing and machine learning algorithms possesses high potential for the noninvasive detection of articular cartilage degeneration, which may reduce unnecessary exploratory surgery. Feature representation of knee joint VAG signals helps characterize the pathological condition of degenerative articular cartilages in the knee. This paper used the kernel-based probability density estimation method to model the distributions of the VAG signals recorded from healthy subjects and patients with knee joint disorders. The estimated densities of the VAG signals showed explicit distributions of the normal and abnormal signal groups, along with the corresponding contours in the bivariate feature space. The signal classifications were performed by using the Fisher’s linear discriminant analysis, support vector machine with polynomial kernels, and the maximal posterior probability decision criterion. The maximal posterior probability decision criterion was able to provide the total classification accuracy of 86.67% and the area (Az) of 0.9096 under the receiver operating characteristics curve, which were superior to the results obtained by either the Fisher’s linear discriminant analysis (accuracy: 81.33%, Az: 0.8564) or the support vector machine with polynomial kernels (accuracy: 81.33%, Az: 0.8533). Such results demonstrated the merits of the bivariate feature distribution estimation and the superiority of the maximal posterior probability decision criterion for analysis of knee joint VAG signals.
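
    As a rough sketch of the decision rule described above — per-class kernel density estimates of a bivariate feature vector combined through Bayes' rule into a maximal posterior probability decision — the following uses synthetic placeholder features rather than the VAG signal descriptors from the study.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Placeholder bivariate features (two signal descriptors) for two classes
normal_feats = rng.multivariate_normal([0.2, 1.0], [[0.02, 0.0], [0.0, 0.1]], size=60).T
abnormal_feats = rng.multivariate_normal([0.5, 1.6], [[0.04, 0.01], [0.01, 0.2]], size=40).T

# Kernel-based estimates of the class-conditional densities p(x | class)
kde_normal = gaussian_kde(normal_feats)
kde_abnormal = gaussian_kde(abnormal_feats)

# Class priors estimated from the training proportions
prior_normal = 60 / 100
prior_abnormal = 40 / 100

def map_decision(x):
    """Maximal posterior probability decision for a 2-D feature vector x."""
    post_normal = kde_normal(x)[0] * prior_normal
    post_abnormal = kde_abnormal(x)[0] * prior_abnormal
    return "abnormal" if post_abnormal > post_normal else "normal"

print(map_decision(np.array([0.45, 1.5])))
```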

  12. Clinician gestalt estimate of pretest probability for acute coronary syndrome and pulmonary embolism in patients with chest pain and dyspnea.

    Science.gov (United States)

    Kline, Jeffrey A; Stubblefield, William B

    2014-03-01

    Pretest probability helps guide diagnostic testing for patients with suspected acute coronary syndrome and pulmonary embolism. Pretest probability derived from the clinician's unstructured gestalt estimate is easier and more readily available than methods that require computation. We compare the diagnostic accuracy of physician gestalt estimate for the pretest probability of acute coronary syndrome and pulmonary embolism with a validated, computerized method. This was a secondary analysis of a prospectively collected, multicenter study. Patients (N=840) had chest pain, dyspnea, nondiagnostic ECGs, and no obvious diagnosis. Clinician gestalt pretest probability for both acute coronary syndrome and pulmonary embolism was assessed by visual analog scale and from the method of attribute matching using a Web-based computer program. Patients were followed for outcomes at 90 days. Clinicians had significantly higher estimates than attribute matching for both acute coronary syndrome (17% versus 4%) and pulmonary embolism, and gestalt estimates correlated only weakly with attribute matching for acute coronary syndrome (r(2)=0.15) and pulmonary embolism (r(2)=0.06). Areas under the receiver operating characteristic curve were lower for clinician estimate compared with the computerized method for acute coronary syndrome: 0.64 (95% confidence interval [CI] 0.51 to 0.77) for clinician gestalt versus 0.78 (95% CI 0.71 to 0.85) for attribute matching. For pulmonary embolism, these values were 0.81 (95% CI 0.79 to 0.92) for clinician gestalt and 0.84 (95% CI 0.76 to 0.93) for attribute matching. Compared with a validated machine-based method, clinicians consistently overestimated pretest probability; on receiver operating characteristic curve analysis they were as accurate for pulmonary embolism but not for acute coronary syndrome. Copyright © 2013 American College of Emergency Physicians. Published by Mosby, Inc. All rights reserved.

  13. Survival probabilities of loggerhead sea turtles (Caretta caretta) estimated from capture-mark-recapture data in the Mediterranean Sea

    Directory of Open Access Journals (Sweden)

    Paolo Casale

    2007-06-01

    Full Text Available Survival probabilities of loggerhead sea turtles (Caretta caretta) are estimated for the first time in the Mediterranean by analysing 3254 tagging and 134 re-encounter data from this region. Most of these turtles were juveniles found at sea. Re-encounters were live resightings and dead recoveries and data were analysed with Barker’s model, a modified version of the Cormack-Jolly-Seber model which can combine recapture, live resighting and dead recovery data. An annual survival probability of 0.73 (CI 95% = 0.67-0.78; n=3254) was obtained, and should be considered as a conservative estimate due to an unknown, though not negligible, tag loss rate. This study makes a preliminary estimate of the survival probabilities of in-water developmental stages for the Mediterranean population of endangered loggerhead sea turtles and provides the first insights into the magnitude of the suspected human-induced mortality in the region. The model used here for the first time on sea turtles could be used to obtain survival estimates from other data sets with few or no true recaptures but with other types of re-encounter data, which are a common output of tagging programmes involving these wide-ranging animals.

  14. Convergence estimates in probability and in expectation for discrete least squares with noisy evaluations at random points

    KAUST Repository

    Migliorati, Giovanni

    2015-08-28

    We study the accuracy of the discrete least-squares approximation on a finite dimensional space of a real-valued target function from noisy pointwise evaluations at independent random points distributed according to a given sampling probability measure. The convergence estimates are given in mean-square sense with respect to the sampling measure. The noise may be correlated with the location of the evaluation and may have nonzero mean (offset). We consider both cases of bounded or square-integrable noise / offset. We prove conditions between the number of sampling points and the dimension of the underlying approximation space that ensure a stable and accurate approximation. Particular focus is on deriving estimates in probability within a given confidence level. We analyze how the best approximation error and the noise terms affect the convergence rate and the overall confidence level achieved by the convergence estimate. The proofs of our convergence estimates in probability use arguments from the theory of large deviations to bound the noise term. Finally we address the particular case of multivariate polynomial approximation spaces with any density in the beta family, including uniform and Chebyshev.
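
    A hedged illustration of the setting (not the paper's analysis): fit a polynomial least-squares approximation to noisy evaluations of an assumed target function at points drawn from the uniform sampling measure, with the number of samples chosen comfortably larger than the dimension of the approximation space, and report the error in the mean-square sense with respect to that measure.

```python
import numpy as np

rng = np.random.default_rng(1)

def target(x):
    return np.exp(x) * np.sin(3 * x)

n = 10                                   # dimension of the polynomial space (degree n-1)
m = int(n**2 * np.log(n))                # number of random sample points (assumed rule of thumb)

x = rng.uniform(-1.0, 1.0, size=m)       # independent draws from the sampling measure
y = target(x) + rng.normal(0.0, 0.1, m)  # noisy pointwise evaluations

# Discrete least-squares fit in the Legendre basis (well conditioned on [-1, 1])
coeffs = np.polynomial.legendre.legfit(x, y, deg=n - 1)

# Mean-square error with respect to the uniform sampling measure, by Monte Carlo
x_test = rng.uniform(-1.0, 1.0, size=100_000)
err = np.sqrt(np.mean((np.polynomial.legendre.legval(x_test, coeffs) - target(x_test)) ** 2))
print(f"RMS error of the discrete least-squares approximation: {err:.4f}")
```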

  15. Use of portable antennas to estimate abundance of PIT-tagged fish in small streams: Factors affecting detection probability

    Science.gov (United States)

    O'Donnell, Matthew J.; Horton, Gregg E.; Letcher, Benjamin H.

    2010-01-01

    Portable passive integrated transponder (PIT) tag antenna systems can be valuable in providing reliable estimates of the abundance of tagged Atlantic salmon Salmo salar in small streams under a wide range of conditions. We developed and employed PIT tag antenna wand techniques in two controlled experiments and an additional case study to examine the factors that influenced our ability to estimate population size. We used Pollock's robust-design capture–mark–recapture model to obtain estimates of the probability of first detection (p), the probability of redetection (c), and abundance (N) in the two controlled experiments. First, we conducted an experiment in which tags were hidden in fixed locations. Although p and c varied among the three observers and among the three passes that each observer conducted, the estimates of N were identical to the true values and did not vary among observers. In the second experiment using free-swimming tagged fish, p and c varied among passes and time of day. Additionally, estimates of N varied between day and night and among age-classes but were within 10% of the true population size. In the case study, we used the Cormack–Jolly–Seber model to examine the variation in p, and we compared counts of tagged fish found with the antenna wand with counts collected via electrofishing. In that study, we found that although p varied for age-classes, sample dates, and time of day, antenna and electrofishing estimates of N were similar, indicating that population size can be reliably estimated via PIT tag antenna wands. However, factors such as the observer, time of day, age of fish, and stream discharge can influence the initial and subsequent detection probabilities.

  16. A novel approach to estimate the eruptive potential and probability in open conduit volcanoes.

    Science.gov (United States)

    De Gregorio, Sofia; Camarda, Marco

    2016-01-01

    In open conduit volcanoes, volatile-rich magma continuously enters the feeding system, yet eruptive activity occurs only intermittently. From a practical perspective, the continuous steady input of magma into the feeding system is not able to produce eruptive events on its own; rather, surpluses of magma input are required to trigger the eruptive activity. The greater the amount of surplus magma within the feeding system, the higher the eruptive probability. Despite this observation, eruptive potential evaluations are commonly based on the regular magma supply, and in eruptive probability evaluations any magma input is generally given the same weight. Conversely, herein we present a novel approach based on the quantification of the surplus of magma progressively intruded into the feeding system. To quantify the surplus of magma, we suggest processing temporal series of measurable parameters linked to the magma supply. We successfully performed a practical application on Mt Etna using the soil CO2 flux recorded over ten years.

  17. Estimating the probability distribution of von Mises stress for structures undergoing random excitation. Part 1: Derivation

    Energy Technology Data Exchange (ETDEWEB)

    Segalman, D.; Reese, G.

    1998-09-01

    The von Mises stress is often used as the metric for evaluating design margins, particularly for structures made of ductile materials. For deterministic loads, both static and dynamic, the calculation of von Mises stress is straightforward, as is the resulting calculation of reliability. For loads modeled as random processes, the task is different; the response to such loads is itself a random process and its properties must be determined in terms of those of both the loads and the system. This has been done in the past by Monte Carlo sampling of numerical realizations that reproduce the second order statistics of the problem. Here, the authors present a method that provides analytic expressions for the probability distributions of von Mises stress which can be evaluated efficiently and with good precision numerically. Further, this new approach has the important advantage of providing the asymptotic properties of the probability distribution.
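
    The Monte Carlo route mentioned above as the traditional alternative can be sketched for a plane-stress state whose components are jointly Gaussian; the mean vector, covariance matrix, and allowable stress below are placeholders, not values from the report.

```python
import numpy as np

rng = np.random.default_rng(7)

# Placeholder second-order statistics of the stress components (sigma_x, sigma_y, tau_xy), in MPa
mean = np.array([40.0, 10.0, 0.0])
cov = np.array([[400.0,  80.0,  0.0],
                [ 80.0, 250.0,  0.0],
                [  0.0,   0.0, 90.0]])

samples = rng.multivariate_normal(mean, cov, size=1_000_000)
sx, sy, txy = samples[:, 0], samples[:, 1], samples[:, 2]

# von Mises stress for a plane-stress state
svm = np.sqrt(sx**2 - sx * sy + sy**2 + 3.0 * txy**2)

threshold = 120.0  # assumed allowable stress, MPa
print("P(von Mises stress exceeds allowable):", np.mean(svm > threshold))
```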

  18. An Empirical Model for Estimating the Probability of Electrical Short Circuits from Tin Whiskers. Part 2

    Science.gov (United States)

    Courey, Karim; Wright, Clara; Asfour, Shihab; Onar, Arzu; Bayliss, Jon; Ludwig, Larry

    2009-01-01

    In this experiment, an empirical model to quantify the probability of occurrence of an electrical short circuit from tin whiskers as a function of voltage was developed. This empirical model can be used to improve existing risk simulation models. FIB and TEM images of a tin whisker confirm the rare polycrystalline structure on one of the three whiskers studied. FIB cross-section of the card guides verified that the tin finish was bright tin.

  19. Estimates for the Tail Probability of the Supremum of a Random Walk with Independent Increments

    Institute of Scientific and Technical Information of China (English)

    Yang YANG; Kaiyong WANG

    2011-01-01

    The authors investigate the tail probability of the supremum of a random walk with independent increments and obtain some equivalent assertions in the case that the increments are independent and identically distributed random variables with O-subexponential integrated distributions. A uniform upper bound is derived for the distribution of the supremum of a random walk with independent but non-identically distributed increments, whose tail distributions are dominated by a common tail distribution with an O-subexponential integrated distribution.

  20. Information geometric algorithm for estimating switching probabilities in space-varying HMM.

    Science.gov (United States)

    Nascimento, Jacinto C; Barão, Miguel; Marques, Jorge S; Lemos, João M

    2014-12-01

    This paper proposes an iterative natural gradient algorithm to perform the optimization of switching probabilities in a space-varying hidden Markov model, in the context of human activity recognition in long-range surveillance. The proposed method is a version of the gradient method, developed under an information geometric viewpoint, where the usual Euclidean metric is replaced by a Riemannian metric on the space of transition probabilities. It is shown that the change in metric provides advantages over more traditional approaches, namely: 1) it turns the original constrained optimization into an unconstrained optimization problem; 2) the optimization behaves asymptotically as a Newton method and yields faster convergence than other methods for the same computational complexity; and 3) the natural gradient vector is an actual contravariant vector on the space of probability distributions for which an interpretation as the steepest descent direction is formally correct. Experiments on synthetic and real-world problems, focused on human activity recognition in long-range surveillance settings, show that the proposed methodology compares favorably with the state-of-the-art algorithms developed for the same purpose.

  1. A New Approximate Formula for Variance of Horvitz–Thompson Estimator using first order Inclusion Probabilities

    Directory of Open Access Journals (Sweden)

    Muhammad Qaiser Shahbaz

    2007-01-01

    Full Text Available A new approximate formula for the sampling variance of the Horvitz–Thompson (1952) estimator has been obtained. An empirical study of the approximate formula is presented to assess its performance.
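
    For context, a minimal sketch of the Horvitz–Thompson total and the common with-replacement approximation to its variance, both written in terms of first-order inclusion probabilities only (this is not the new formula proposed in the paper, which is not reproduced here):

```python
import numpy as np

# Sample values and their first-order inclusion probabilities pi_i
y = np.array([12.0, 7.5, 20.1, 3.3, 15.8])
pi = np.array([0.30, 0.10, 0.45, 0.05, 0.25])
n = len(y)

# Horvitz-Thompson estimator of the population total
t_ht = np.sum(y / pi)

# With-replacement approximation to the variance, using draw probabilities p_i = pi_i / n
p = pi / n
z = y / p
v_approx = np.sum((z - z.mean()) ** 2) / (n * (n - 1))

print(f"HT total: {t_ht:.2f},  approximate variance: {v_approx:.2f}")
```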

  2. Inverse problems in cancellous bone: estimation of the ultrasonic properties of fast and slow waves using Bayesian probability theory.

    Science.gov (United States)

    Anderson, Christian C; Bauer, Adam Q; Holland, Mark R; Pakula, Michal; Laugier, Pascal; Bretthorst, G Larry; Miller, James G

    2010-11-01

    Quantitative ultrasonic characterization of cancellous bone can be complicated by artifacts introduced by analyzing acquired data consisting of two propagating waves (a fast wave and a slow wave) as if only one wave were present. Recovering the ultrasonic properties of overlapping fast and slow waves could therefore lead to enhancement of bone quality assessment. The current study uses Bayesian probability theory to estimate phase velocity and normalized broadband ultrasonic attenuation (nBUA) parameters in a model of fast and slow wave propagation. Calculations are carried out using Markov chain Monte Carlo with simulated annealing to approximate the marginal posterior probability densities for parameters in the model. The technique is applied to simulated data, to data acquired on two phantoms capable of generating two waves in acquired signals, and to data acquired on a human femur condyle specimen. The models are in good agreement with both the simulated and experimental data, and the values of the estimated ultrasonic parameters fall within expected ranges.

  3. Estimation of Probability of Swarming Pedestrians Violation at Signalized Intersections in Developing Cities

    Institute of Scientific and Technical Information of China (English)

    LI Ying-feng; SHI Zhong-ke; ZHOU Zhi-na

    2009-01-01

    We made an on-site investigation of pedestrian violations at signalized intersections and built a model of the traffic violation decision that accounts for the number of pedestrians waiting together; the probability of pedestrian violation rose with the waiting time. A simulation of mixed vehicles and pedestrians was then carried out, and the on-site investigation data were used to validate the model. When traffic volume is light, the error between the simulated values and the measured ones is 2.47%. When traffic volume is heavy, the error is 3.38%.

  4. Limitation of Inverse Probability-of-Censoring Weights in Estimating Survival in the Presence of Strong Selection Bias

    OpenAIRE

    Howe, Chanelle J.; Cole, Stephen R.; Chmiel, Joan S.; Muñoz, Alvaro

    2011-01-01

    In time-to-event analyses, artificial censoring with correction for induced selection bias using inverse probability-of-censoring weights can be used to 1) examine the natural history of a disease after effective interventions are widely available, 2) correct bias due to noncompliance with fixed or dynamic treatment regimens, and 3) estimate survival in the presence of competing risks. Artificial censoring entails censoring participants when they meet a predefined study criterion, such as exp...

  5. Comments on “Estimating Income Variances by Probability Sampling: A Case Study by Shah and Aleem”

    Directory of Open Access Journals (Sweden)

    Jamal Abdul Nasir

    2012-06-01

    Full Text Available In this article, we wish to comment on the recently published article "Shah, A.A. and Aleem, M. (2010). Estimating income variances by probability sampling: a case study. Pakistan Journal of Commerce and Social Sciences, 4(2), 194-201", offering suggestions for improvement as well as criticism of the paper, and thereby contributing to the journal's repute and ranking.

  6. Probability Density Estimation for Non-flat Functions

    Institute of Scientific and Technical Information of China (English)

    汪洪桥; 蔡艳宁; 付光远; 王仕成

    2016-01-01

    Aiming at the probability density estimation problem for non-flat functions, this paper constructs a single-slack-factor multi-scale kernel support vector machine (SVM) probability density estimation model by improving the form of the constraint condition of the traditional SVM model and introducing the multi-scale kernel method. In the model, a single slack factor instead of two types of slack factors is used to control the learning error of the SVM, which reduces the computational complexity of the model. At the same time, by introducing the multi-scale kernel method, the model can fit well both regions where the function changes sharply and regions where it changes slowly. Several probability density estimation experiments with typical non-flat functions show that the single-slack-factor model has a faster learning speed than the common SVM model and that, compared with the single-kernel method, the multi-scale kernel SVM probability density estimation model achieves better estimation precision.

  7. Probability-based Clustering and Its Application to WLAN Location Estimation

    Institute of Scientific and Technical Information of China (English)

    ZHANG Ming-hua; ZHANG Shen-sheng; CAO Jian

    2008-01-01

    Wireless local area network (WLAN) localization based on received signal strength is becoming an important enabler of location-based services. Limited efficiency and accuracy are disadvantages of deterministic location estimation techniques, while probabilistic techniques achieve good accuracy but incur more computational overhead. A Gaussian mixture model based on a clustering technique was presented to improve location determination efficiency. The proposed clustering algorithm reduces the number of candidate locations from the whole area to a cluster. Within a cluster, an improved nearest neighbor algorithm was used to estimate user location using signal strength from more access points. Experiments show that the location estimation time is greatly decreased while high accuracy can still be achieved.

  8. Estimating Effect Sizes and Expected Replication Probabilities from GWAS Summary Statistics

    DEFF Research Database (Denmark)

    Holland, Dominic; Wang, Yunpeng; Thompson, Wesley K;

    2016-01-01

    for estimating the degree of polygenicity of the phenotype and predicting the proportion of chip heritability explainable by genome-wide significant SNPs in future studies with larger sample sizes. We apply the model to recent GWAS of schizophrenia (N = 82,315) and putamen volume (N = 12,596), with approximately...... based on linear-regression association coefficients. We estimate the polygenicity of schizophrenia to be 0.037 and the putamen to be 0.001, while the respective sample sizes required to approach fully explaining the chip heritability are 10^6 and 10^5. The model can be extended to incorporate prior...

  9. A Method for Estimating the Probability of Floating Gate Prompt Charge Loss in a Radiation Environment

    Science.gov (United States)

    Edmonds, L. D.

    2016-01-01

    Since advancing technology has been producing smaller structures in electronic circuits, the floating gates in modern flash memories are becoming susceptible to prompt charge loss from ionizing radiation environments found in space. A method for estimating the risk of a charge-loss event is given.

  10. Estimating the Probability of Being the Best System: A Generalized Method and Nonparametric Hypothesis Test

    Science.gov (United States)

    2013-03-01

  11. Estimating the benefits of single value and probability forecasting for flood warning

    Directory of Open Access Journals (Sweden)

    J. S. Verkade

    2011-12-01

    Full Text Available Flood risk can be reduced by means of flood forecasting, warning and response systems (FFWRS. These systems include a forecasting sub-system which is imperfect, meaning that inherent uncertainties in hydrological forecasts may result in false alarms and missed events. This forecasting uncertainty decreases the potential reduction of flood risk, but is seldom accounted for in estimates of the benefits of FFWRSs. In the present paper, a method to estimate the benefits of (imperfect) FFWRSs in reducing flood risk is presented. The method is based on a hydro-economic model of expected annual damage (EAD) due to flooding, combined with the concept of Relative Economic Value (REV). The estimated benefits include not only the reduction of flood losses due to a warning response, but also consider the costs of the warning response itself, as well as the costs associated with forecasting uncertainty. The method allows for estimation of the benefits of FFWRSs that use either deterministic or probabilistic forecasts. Through application to a case study, it is shown that FFWRSs using a probabilistic forecast have the potential to realise higher benefits at all lead-times. However, it is also shown that provision of warning at increasing lead-time does not necessarily lead to an increasing reduction of flood risk, but rather that an optimal lead-time at which warnings are provided can be established as a function of forecast uncertainty and the cost-loss ratio of the user receiving and responding to the warning.
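
    The cost-loss reasoning behind Relative Economic Value can be illustrated with a hedged sketch: compare the expected annual expense with no warning system, with an imperfect system characterized by a hit rate and false-alarm rate, and with a perfect system. All numbers are illustrative assumptions, not results from the case study.

```python
# Cost-loss sketch of the Relative Economic Value (REV) of an imperfect warning system.
# All numbers below are illustrative assumptions.

s = 0.05       # annual probability of a damaging flood
L = 1_000_000  # avoidable flood loss if no warning response is made (EUR)
C = 40_000     # cost of responding to a warning (EUR)
H = 0.80       # hit rate of the forecasting sub-system
F = 0.10       # false-alarm rate (fraction of non-event years with a warning)

expense_no_system = min(C, s * L)   # best static strategy: always respond or never respond
expense_perfect = s * C             # respond exactly when a flood occurs
expense_forecast = s * (H * C + (1 - H) * L) + (1 - s) * F * C

rev = (expense_no_system - expense_forecast) / (expense_no_system - expense_perfect)
print(f"expected annual expense with warnings: {expense_forecast:,.0f} EUR, REV = {rev:.2f}")
```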

  12. Maximum likelihood estimates with order restrictions on probabilities and odds ratios: A geometric programming approach

    Directory of Open Access Journals (Sweden)

    D. L. Bricker

    1997-01-01

    Full Text Available The problem of assigning cell probabilities to maximize a multinomial likelihood with order restrictions on the probabilities and/or restrictions on the local odds ratios is modeled as a posynomial geometric program (GP), a class of nonlinear optimization problems with a well-developed duality theory and collection of algorithms. (Local odds ratios provide a measure of association between categorical random variables.) A constrained multinomial MLE example from the literature is solved, and the quality of the solution is compared with that obtained by the iterative method of El Barmi and Dykstra, which is based upon Fenchel duality. Exploiting the proximity of the GP model of MLE problems to linear programming (LP) problems, we also describe as an alternative, in the absence of special-purpose GP software, an easily implemented successive LP approximation method for solving this class of MLE problems using one of the readily available LP solvers.

  13. Estimated Probability of Traumatic Abdominal Injury During an International Space Station Mission

    Science.gov (United States)

    Lewandowski, Beth E.; Brooker, John E.; Weavr, Aaron S.; Myers, Jerry G., Jr.; McRae, Michael P.

    2013-01-01

    The Integrated Medical Model (IMM) is a decision support tool that is useful to spaceflight mission planners and medical system designers when assessing risks and optimizing medical systems. The IMM project maintains a database of medical conditions that could occur during a spaceflight. The IMM project is in the process of assigning an incidence rate, the associated functional impairment, and a best and a worst case end state for each condition. The purpose of this work was to develop the IMM Abdominal Injury Module (AIM). The AIM calculates an incidence rate of traumatic abdominal injury per person-year of spaceflight on the International Space Station (ISS). The AIM was built so that the probability of traumatic abdominal injury during one year on ISS could be predicted. This result will be incorporated into the IMM Abdominal Injury Clinical Finding Form and used within the parent IMM model.
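
    The final step described — turning a per-person-year incidence rate into a probability of at least one traumatic abdominal injury during a mission — reduces to a one-line Poisson calculation; the rate used below is a placeholder, not the AIM's output.

```python
import math

incidence_rate = 0.001   # assumed injuries per person-year of spaceflight (placeholder)
crew_size = 6
mission_years = 1.0

expected_events = incidence_rate * crew_size * mission_years
# Poisson assumption: P(at least one event) = 1 - exp(-lambda)
p_at_least_one = 1.0 - math.exp(-expected_events)
print(f"probability of at least one injury during the mission: {p_at_least_one:.4f}")
```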

  14. Inverse Probability of Censoring Weighted Estimates of Kendall’s τ for Gap Time Analyses

    OpenAIRE

    Lakhal-Chaieb, Lajmi; Cook, Richard J.; Lin, Xihong

    2010-01-01

    In life history studies interest often lies in the analysis of the inter-event, or gap times and the association between event times. Gap time analyses are challenging however, even when the length of follow-up is determined independently of the event process, since associations between gap times induce dependent censoring for second and subsequent gap times. This paper discusses nonparametric estimation of the association between consecutive gap times based on Kendall’s τ in the presence of ...

  15. Estimating Route Choice Models from Stochastically Generated Choice Sets on Large-Scale Networks Correcting for Unequal Sampling Probability

    DEFF Research Database (Denmark)

    Vacca, Alessandro; Prato, Carlo Giacomo; Meloni, Italo

    2015-01-01

    is the dependency of the parameter estimates on the choice set generation technique. Bias introduced in model estimation has been corrected only for the random walk algorithm, which has problematic applicability to large-scale networks. This study proposes a correction term for the sampling probability of routes... extracted with stochastic route generation. The term is easily applicable to large-scale networks and various environments, given its dependence only on a random number generator and the Dijkstra shortest path algorithm. The implementation for revealed preferences data, which consist of actual route choices... collected in Cagliari, Italy, shows the feasibility of generating routes stochastically in a high-resolution network and calculating the correction factor. The model estimation with and without correction illustrates how the correction not only improves the goodness of fit but also turns illogical signs

  16. Limitation of inverse probability-of-censoring weights in estimating survival in the presence of strong selection bias.

    Science.gov (United States)

    Howe, Chanelle J; Cole, Stephen R; Chmiel, Joan S; Muñoz, Alvaro

    2011-03-01

    In time-to-event analyses, artificial censoring with correction for induced selection bias using inverse probability-of-censoring weights can be used to 1) examine the natural history of a disease after effective interventions are widely available, 2) correct bias due to noncompliance with fixed or dynamic treatment regimens, and 3) estimate survival in the presence of competing risks. Artificial censoring entails censoring participants when they meet a predefined study criterion, such as exposure to an intervention, failure to comply, or the occurrence of a competing outcome. Inverse probability-of-censoring weights use measured common predictors of the artificial censoring mechanism and the outcome of interest to determine what the survival experience of the artificially censored participants would be had they never been exposed to the intervention, complied with their treatment regimen, or not developed the competing outcome. Even if all common predictors are appropriately measured and taken into account, in the context of small sample size and strong selection bias, inverse probability-of-censoring weights could fail because of violations in assumptions necessary to correct selection bias. The authors used an example from the Multicenter AIDS Cohort Study, 1984-2008, regarding estimation of long-term acquired immunodeficiency syndrome-free survival to demonstrate the impact of violations in necessary assumptions. Approaches to improve correction methods are discussed.

  17. Probability and heritability estimates on primary osteoarthritis of the hip leading to total hip arthroplasty

    DEFF Research Database (Denmark)

    Skousgaard, Søren Glud; Hjelmborg, Jacob; Skytthe, Axel;

    2015-01-01

    INTRODUCTION: Primary hip osteoarthritis, radiographic as well as symptomatic, is highly associated with increasing age in both genders. However, little is known about the mechanisms behind this, in particular whether this increase is caused by genetic factors. This study examined the risk and heritability of primary osteoarthritis of the hip leading to a total hip arthroplasty, and whether this heritability increased with increasing age. METHODS: In a nationwide population-based follow-up study, 118,788 twins from the Danish Twin Register and 90,007 individuals from the Danish Hip Arthroplasty Register... not have had a total hip arthroplasty at the time of follow-up. RESULTS: There were 94,063 twins eligible for analyses, comprising 835 cases of 36 concordant and 763 discordant twin pairs. The probability increased particularly from 50 years of age. After sex and age adjustment a significant additive

  18. Probable maximum precipitation 24 hours estimation: A case study of Zanjan province of Iran

    Directory of Open Access Journals (Sweden)

    Azim Shirdeli

    2012-10-01

    Full Text Available One of the primary concerns in designing civil structures such as water-storage dams and irrigation and drainage networks is to find an economical scale based on the likelihood of natural events such as floods and earthquakes. Probable maximum precipitation (PMP) is one well-known method that helps design such structures properly. In this paper, we study the maximum one-day precipitation using 17 to 50 years of records from 13 stations located in the province of Zanjan, Iran. The study applies two Hershfield methods: the first yields values of 18.17 to 18.48, with PMP24 between 170.14 mm and 255.28 mm, while the second yields values between 2.29 and 4.95, with PMP24 between 62.33 mm and 92.08 mm. When out-of-range data were removed from the second method, values between 2.29 and 4.31 were obtained, with PMP24 between 76.08 mm and 117.28 mm. The preliminary results indicate that the second Hershfield method provides more stable results than the first.

  19. A physically-based earthquake recurrence model for estimation of long-term earthquake probabilities

    Science.gov (United States)

    Ellsworth, William L.; Matthews, Mark V.; Nadeau, Robert M.; Nishenko, Stuart P.; Reasenberg, Paul A.; Simpson, Robert W.

    1999-01-01

    A physically-motivated model for earthquake recurrence based on the Brownian relaxation oscillator is introduced. The renewal process defining this point process model can be described by the steady rise of a state variable from the ground state to failure threshold as modulated by Brownian motion. Failure times in this model follow the Brownian passage time (BPT) distribution, which is specified by the mean time to failure, μ, and the aperiodicity of the mean, α (equivalent to the familiar coefficient of variation). Analysis of 37 series of recurrent earthquakes, M -0.7 to 9.2, suggests a provisional generic value of α = 0.5. For this value of α, the hazard function (instantaneous failure rate of survivors) exceeds the mean rate for times > μ/2, and is approximately 2/μ for all times > μ. Application of this model to the next M 6 earthquake on the San Andreas fault at Parkfield, California suggests that the annual probability of the earthquake is between 1:10 and 1:13.
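
    A sketch of how the Brownian passage time model yields time-dependent probabilities: the inverse-Gaussian density below is parameterized by the mean recurrence time μ and aperiodicity α as in the abstract, the CDF is obtained by numerical integration, and a conditional 30-year probability is computed for an assumed elapsed quiet period. The μ value is a placeholder, not the Parkfield estimate.

```python
import numpy as np

def bpt_pdf(t, mu, alpha):
    """Brownian passage time (inverse Gaussian) density with mean mu and aperiodicity alpha."""
    coeff = np.sqrt(mu / (2.0 * np.pi * alpha**2 * t**3))
    return coeff * np.exp(-((t - mu) ** 2) / (2.0 * mu * alpha**2 * t))

mu, alpha = 25.0, 0.5          # assumed mean recurrence interval (years) and aperiodicity
t_grid = np.linspace(1e-3, 500.0, 200_001)
pdf = bpt_pdf(t_grid, mu, alpha)
cdf = np.cumsum(pdf) * (t_grid[1] - t_grid[0])   # crude numerical CDF

def cond_prob(t_elapsed, window):
    """P(event within `window` years | no event in the first `t_elapsed` years)."""
    F = np.interp([t_elapsed, t_elapsed + window], t_grid, cdf)
    return (F[1] - F[0]) / (1.0 - F[0])

print(f"30-year conditional probability after 20 quiet years: {cond_prob(20.0, 30.0):.2f}")
```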

  20. Estimating probabilities of recession in real time using GDP and GDI

    OpenAIRE

    Nalewaik, Jeremy J.

    2006-01-01

    This work estimates Markov switching models on real time data and shows that the growth rate of gross domestic income (GDI), deflated by the GDP deflator, has done a better job recognizing the start of recessions than has the growth rate of real GDP. This result suggests that placing an increased focus on GDI may be useful in assessing the current state of the economy. In addition, the paper shows that the definition of a low-growth phase in the Markov switching models has changed over the pa...

  1. An Empirical Model for Estimating the Probability of Electrical Short Circuits from Tin Whiskers-Part I

    Science.gov (United States)

    Courey, Karim; Wright, Clara; Asfour, Shihab; Bayliss, Jon; Ludwig, Larry

    2008-01-01

    Existing risk simulations make the assumption that when a free tin whisker has bridged two adjacent exposed electrical conductors, the result is an electrical short circuit. This conservative assumption is made because shorting is a random event that has a currently unknown probability associated with it. Due to contact resistance, electrical shorts may not occur at lower voltage levels. In this experiment, we study the effect of varying voltage on the breakdown of the contact resistance which leads to a short circuit. From this data, we can estimate the probability of an electrical short, as a function of voltage, given that a free tin whisker has bridged two adjacent exposed electrical conductors. In addition, three tin whiskers grown from the same Space Shuttle Orbiter card guide used in the aforementioned experiment were cross sectioned and studied using a focused ion beam (FIB).
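
    The kind of empirical model described — probability of a short as a function of applied voltage, estimated from bridge/short outcomes — can be sketched with a simple logistic fit. The voltages and outcomes below are hypothetical placeholders, not the experiment's measurements.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical test data: applied voltage (V) and whether a hard short occurred (1) or not (0)
voltage = np.array([0.5, 1.0, 2.0, 3.0, 5.0, 8.0, 12.0, 15.0, 20.0, 25.0]).reshape(-1, 1)
shorted = np.array([0,   0,   0,   0,   1,   0,    1,    1,    1,    1])

model = LogisticRegression().fit(voltage, shorted)

# Estimated probability of an electrical short given that a whisker bridges two conductors
for v in (3.0, 10.0, 20.0):
    p = model.predict_proba([[v]])[0, 1]
    print(f"P(short | bridge, {v:>4.1f} V) = {p:.2f}")
```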

  2. Estimated probability of postwildfire debris flows in the 2012 Whitewater-Baldy Fire burn area, southwestern New Mexico

    Science.gov (United States)

    Tillery, Anne C.; Matherne, Anne Marie; Verdin, Kristine L.

    2012-01-01

    In May and June 2012, the Whitewater-Baldy Fire burned approximately 1,200 square kilometers (300,000 acres) of the Gila National Forest, in southwestern New Mexico. The burned landscape is now at risk of damage from postwildfire erosion, such as that caused by debris flows and flash floods. This report presents a preliminary hazard assessment of the debris-flow potential from 128 basins burned by the Whitewater-Baldy Fire. A pair of empirical hazard-assessment models developed by using data from recently burned basins throughout the intermountain Western United States was used to estimate the probability of debris-flow occurrence and volume of debris flows along the burned area drainage network and for selected drainage basins within the burned area. The models incorporate measures of areal burned extent and severity, topography, soils, and storm rainfall intensity to estimate the probability and volume of debris flows following the fire. In response to the 2-year-recurrence, 30-minute-duration rainfall, modeling indicated that four basins have high probabilities of debris-flow occurrence (greater than or equal to 80 percent). For the 10-year-recurrence, 30-minute-duration rainfall, an additional 14 basins are included, and for the 25-year-recurrence, 30-minute-duration rainfall, an additional eight basins, 20 percent of the total, have high probabilities of debris-flow occurrence. In addition, probability analysis along the stream segments can identify specific reaches of greatest concern for debris flows within a basin. Basins with a high probability of debris-flow occurrence were concentrated in the west and central parts of the burned area, including tributaries to Whitewater Creek, Mineral Creek, and Willow Creek. Estimated debris-flow volumes ranged from about 3,000-4,000 cubic meters (m3) to greater than 500,000 m3 for all design storms modeled. Drainage basins with estimated volumes greater than 500,000 m3 included tributaries to Whitewater Creek, Willow

  3. An improved multilevel Monte Carlo method for estimating probability distribution functions in stochastic oil reservoir simulations

    Science.gov (United States)

    Lu, Dan; Zhang, Guannan; Webster, Clayton; Barbier, Charlotte

    2016-12-01

    In this work, we develop an improved multilevel Monte Carlo (MLMC) method for estimating cumulative distribution functions (CDFs) of a quantity of interest, coming from numerical approximation of large-scale stochastic subsurface simulations. Compared with Monte Carlo (MC) methods, that require a significantly large number of high-fidelity model executions to achieve a prescribed accuracy when computing statistical expectations, MLMC methods were originally proposed to significantly reduce the computational cost with the use of multifidelity approximations. The improved performance of the MLMC methods depends strongly on the decay of the variance of the integrand as the level increases. However, the main challenge in estimating CDFs is that the integrand is a discontinuous indicator function whose variance decays slowly. To address this difficult task, we approximate the integrand using a smoothing function that accelerates the decay of the variance. In addition, we design a novel a posteriori optimization strategy to calibrate the smoothing function, so as to balance the computational gain and the approximation error. The combined proposed techniques are integrated into a very general and practical algorithm that can be applied to a wide range of subsurface problems for high-dimensional uncertainty quantification, such as a fine-grid oil reservoir model considered in this effort. The numerical results reveal that with the use of the calibrated smoothing function, the improved MLMC technique significantly reduces the computational complexity compared to the standard MC approach. Finally, we discuss several factors that affect the performance of the MLMC method and provide guidance for effective and efficient usage in practice.
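
    A minimal two-level sketch of the key idea (smoothing the indicator before applying the multilevel telescoping estimate), using an assumed toy SDE discretized with coarse and fine Euler steps rather than a reservoir simulator; the smoothing width and sample sizes are arbitrary, and the a posteriori calibration strategy of the paper is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
MU, SIGMA, T, X0 = 0.05, 0.4, 1.0, 1.0   # assumed toy SDE: dX = MU*X dt + SIGMA*X dW

def euler(n_steps, n_paths=None, dW=None):
    """Euler-Maruyama endpoint X_T; pass dW to reuse Brownian increments across levels."""
    dt = T / n_steps
    if dW is None:
        dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
    X = np.full(dW.shape[0], X0)
    for k in range(n_steps):
        X = X + MU * X * dt + SIGMA * X * dW[:, k]
    return X, dW

def smooth_indicator(Q, q, delta):
    """Smooth surrogate for the indicator 1{Q <= q}; its level-to-level variance decays faster."""
    return 1.0 / (1.0 + np.exp(-(q - Q) / delta))

q, delta = 1.0, 0.02
n_coarse, n_fine = 16, 64

# Level 0: many cheap coarse-model samples
Q0, _ = euler(n_coarse, n_paths=200_000)
level0 = smooth_indicator(Q0, q, delta).mean()

# Level 1 correction: coupled fine/coarse paths driven by the same Brownian increments
Qf, dWf = euler(n_fine, n_paths=20_000)
dWc = dWf.reshape(-1, n_coarse, n_fine // n_coarse).sum(axis=2)
Qc, _ = euler(n_coarse, dW=dWc)
level1 = (smooth_indicator(Qf, q, delta) - smooth_indicator(Qc, q, delta)).mean()

print("two-level estimate of F(q) = P(Q <= q):", level0 + level1)
```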

  4. Estimate of the penetrance of BRCA mutation and the COS software for the assessment of BRCA mutation probability.

    Science.gov (United States)

    Berrino, Jacopo; Berrino, Franco; Francisci, Silvia; Peissel, Bernard; Azzollini, Jacopo; Pensotti, Valeria; Radice, Paolo; Pasanisi, Patrizia; Manoukian, Siranoush

    2015-03-01

    We have designed the user-friendly COS software with the intent to improve estimation of the probability of a family carrying a deleterious BRCA gene mutation. The COS software is similar to the widely-used Bayesian-based BRCAPRO software, but it incorporates improved assumptions on cancer incidence in women with and without a deleterious mutation, takes into account relatives up to the fourth degree and allows researchers to consider a hypothetical third gene or a polygenic model of inheritance. Since breast cancer incidence and penetrance increase over generations, we estimated birth-cohort-specific incidence and penetrance curves. We estimated breast and ovarian cancer penetrance in 384 BRCA1 and 229 BRCA2 mutated families. We tested the COS performance in 436 Italian breast/ovarian cancer families including 79 with BRCA1 and 27 with BRCA2 mutations. The area under the receiver operating characteristic curve (AUROC) was 84.4%. The best probability threshold for offering the test was 22.9%, with sensitivity 80.2% and specificity 80.3%. Notwithstanding very different assumptions, COS results were similar to BRCAPRO v6.0.

  5. The proportionator: unbiased stereological estimation using biased automatic image analysis and non-uniform probability proportional to size sampling.

    Science.gov (United States)

    Gardi, J E; Nyengaard, J R; Gundersen, H J G

    2008-03-01

    The proportionator is a novel and radically different approach to sampling with microscopes based on well-known statistical theory (probability proportional to size, or PPS, sampling). It uses automatic image analysis, with a large range of options, to assign to every field of view in the section a weight proportional to some characteristic of the structure under study. A typical and very simple example, examined here, is the amount of color characteristic for the structure, marked with a stain with known properties. The color may be specific or not. In the recorded list of weights in all fields, the desired number of fields is sampled automatically with probability proportional to the weight and presented to the expert observer. Using any known stereological probe and estimator, the correct count in these fields leads to a simple, unbiased estimate of the total amount of structure in the sections examined, which in turn leads to any of the known stereological estimates including size distributions and spatial distributions. The unbiasedness is not a function of the assumed relation between the weight and the structure, which is in practice always a biased relation from a stereological (integral geometric) point of view. The efficiency of the proportionator depends, however, directly on this relation being positive. The sampling and estimation procedure is simulated in sections with characteristics and various kinds of noises in possibly realistic ranges. In all cases examined, the proportionator is 2-15-fold more efficient than the common systematic, uniformly random sampling. The simulations also indicate that the lack of a simple predictor of the coefficient of error (CE) due to field-to-field variation is a more severe problem for uniform sampling strategies than anticipated. Because of its entirely different sampling strategy, based on known but non-uniform sampling probabilities, the proportionator for the first time allows the real CE at the section level to
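
    A stripped-down sketch of the sampling logic (with replacement for simplicity, i.e., a Hansen-Hurwitz-type estimate rather than the exact procedure of the paper): every field of view receives an image-analysis weight, fields are drawn with probability proportional to weight, and the expert counts in the sampled fields are divided by their selection probabilities. Note that the weight-count relation only needs to be positive for this to work; the data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

# Image-analysis weights for every field of view in the section (e.g., amount of stain colour)
weights = rng.gamma(shape=2.0, scale=1.0, size=500)
# True (unknown) particle counts per field, noisily but positively related to the weights
true_counts = rng.poisson(lam=2.0 * weights)

p = weights / weights.sum()          # selection probability of each field
n = 20                               # number of fields shown to the expert observer
sampled = rng.choice(len(weights), size=n, replace=True, p=p)

# PPS (with replacement) estimate of the total count over all fields
estimate = np.mean(true_counts[sampled] / p[sampled])
print(f"estimated total: {estimate:.0f}   true total: {true_counts.sum()}")
```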

  6. Cost and Benefit of Control Strategies - Estimation of Benefit functions, enforcement-probability function and enforcement-cost function

    DEFF Research Database (Denmark)

    Kronbak, Lone Grønbæk; Jensen, Frank

    Within the EU Sixth Framework Programme, an ongoing research project, COBECOS, has developed a theory of enforcement and a software code for computer modeling of different fisheries with fisheries enforcement cases. The case of the Danish fishery for Nephrops faces problems with landings...... on fishery enforcement from the COBECOS project to a specific case. It is done by estimating functional relationships describing 1) the fisheries benefit function, 2) the shadow value of biomass, and 3) the connection between the probability of being detected and apprehended for different enforcement...

  7. Developing an Empirical Model for Estimating the Probability of Electrical Short Circuits from Tin Whiskers. Part 2

    Science.gov (United States)

    Courey, Karim J.; Asfour, Shihab S.; Onar, Arzu; Bayliss, Jon A.; Ludwig, Larry L.; Wright, Maria C.

    2009-01-01

    To comply with lead-free legislation, many manufacturers have converted from tin-lead to pure tin finishes of electronic components. However, pure tin finishes have a greater propensity to grow tin whiskers than tin-lead finishes. Since tin whiskers present an electrical short circuit hazard in electronic components, simulations have been developed to quantify the risk of said short circuits occurring. Existing risk simulations make the assumption that when a free tin whisker has bridged two adjacent exposed electrical conductors, the result is an electrical short circuit. This conservative assumption is made because shorting is a random event that had an unknown probability associated with it. Note, however, that due to contact resistance, electrical shorts may not occur at lower voltage levels. In our first article we developed an empirical probability model for tin whisker shorting. In this paper, we develop a more comprehensive empirical model using a refined experiment with a larger sample size, in which we studied the effect of varying voltage on the breakdown of the contact resistance which leads to a short circuit. From the resulting data we estimated the probability distribution of an electrical short, as a function of voltage. In addition, the unexpected polycrystalline structure seen in the focused ion beam (FIB) cross section in the first experiment was confirmed in this experiment using transmission electron microscopy (TEM). The FIB was also used to cross section two card guides to facilitate the measurement of the grain size of each card guide's tin plating to determine its finish.

  8. Methods for estimating annual exceedance probability discharges for streams in Arkansas, based on data through water year 2013

    Science.gov (United States)

    Wagner, Daniel M.; Krieger, Joshua D.; Veilleux, Andrea G.

    2016-08-04

    In 2013, the U.S. Geological Survey initiated a study to update regional skew, annual exceedance probability discharges, and regional regression equations used to estimate annual exceedance probability discharges for ungaged locations on streams in the study area with the use of recent geospatial data, new analytical methods, and available annual peak-discharge data through the 2013 water year. An analysis of regional skew using Bayesian weighted least-squares/Bayesian generalized-least squares regression was performed for Arkansas, Louisiana, and parts of Missouri and Oklahoma. The newly developed constant regional skew of -0.17 was used in the computation of annual exceedance probability discharges for 281 streamgages used in the regional regression analysis. Based on analysis of covariance, four flood regions were identified for use in the generation of regional regression models. Thirty-nine basin characteristics were considered as potential explanatory variables, and ordinary least-squares regression techniques were used to determine the optimum combinations of basin characteristics for each of the four regions. Basin characteristics in candidate models were evaluated based on multicollinearity with other basin characteristics (variance inflation factor). The resulting regional regression equations apply only to locations on streams in Arkansas where annual peak discharges are not substantially affected by regulation, diversion, channelization, backwater, or urbanization. The applicability and accuracy of the regional regression equations depend on the basin characteristics measured for an ungaged location on a stream being within the range of those used to develop the equations.
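
    The multicollinearity screen mentioned above is commonly carried out with variance inflation factors; a small sketch with synthetic basin characteristics (not the study's 39 candidate variables) is:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic basin characteristics for 281 hypothetical streamgages (columns = candidate variables)
n = 281
drainage_area = rng.lognormal(4.0, 1.0, n)
mean_elev = rng.normal(300.0, 60.0, n)
basin_slope = 0.002 * mean_elev + rng.normal(0.0, 0.05, n)   # deliberately correlated with elevation
X = np.column_stack([np.log(drainage_area), mean_elev, basin_slope])

def vif(X, j):
    """Variance inflation factor of column j: 1 / (1 - R^2) from regressing X_j on the others."""
    y = X[:, j]
    others = np.delete(X, j, axis=1)
    A = np.column_stack([np.ones(len(y)), others])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    r2 = 1.0 - resid.var() / y.var()
    return 1.0 / (1.0 - r2)

for j, name in enumerate(["log drainage area", "mean elevation", "basin slope"]):
    print(f"VIF({name}) = {vif(X, j):.1f}")
```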

  9. Estimation of probability density functions of damage parameter for valve leakage detection in reciprocating pump used in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Kyeom; Kim, Tae Yun; Kim, Hyun Su; Chai, Jang Bom; Lee, Jin Woo [Div. of Mechanical Engineering, Ajou University, Suwon (Korea, Republic of)]

    2016-10-15

    This paper presents an advanced estimation method for obtaining the probability density functions of a damage parameter for valve leakage detection in a reciprocating pump. The estimation method is based on a comparison of model data which are simulated by using a mathematical model, and experimental data which are measured on the inside and outside of the reciprocating pump in operation. The mathematical model, which is simplified and extended on the basis of previous models, describes not only the normal state of the pump, but also its abnormal state caused by valve leakage. The pressure in the cylinder is expressed as a function of the crankshaft angle, and an additional volume flow rate due to the valve leakage is quantified by a damage parameter in the mathematical model. The change in the cylinder pressure profiles due to the suction valve leakage is noticeable in the compression and expansion modes of the pump. The damage parameter value over 300 cycles is calculated in two ways, considering advance or delay in the opening and closing angles of the discharge valves. The probability density functions of the damage parameter are compared for diagnosis and prognosis on the basis of the probabilistic features of valve leakage.

  10. The correct estimate of the probability of false detection of the matched filter in weak-signal detection problems

    Science.gov (United States)

    Vio, R.; Andreani, P.

    2016-05-01

    The reliable detection of weak signals is a critical issue in many astronomical contexts and may have severe consequences for determining number counts and luminosity functions, but also for optimizing the use of telescope time in follow-up observations. Because of its optimal properties, one of the most popular and widely used detection techniques is the matched filter (MF). This is a linear filter designed to maximise the detectability of a signal of known structure that is buried in additive Gaussian random noise. In this work we show that in the very common situation where the number and position of the searched signals within a data sequence (e.g. an emission line in a spectrum) or an image (e.g. a point-source in an interferometric map) are unknown, this technique, when applied in its standard form, may severely underestimate the probability of false detection. This is because the correct use of the MF relies upon a priori knowledge of the position of the signal of interest. In the absence of this information, the statistical significance of features that are actually noise is overestimated and detections claimed that are actually spurious. For this reason, we present an alternative method of computing the probability of false detection that is based on the probability density function (PDF) of the peaks of a random field. It is able to provide a correct estimate of the probability of false detection for the one-, two- and three-dimensional case. We apply this technique to a real two-dimensional interferometric map obtained with ALMA.
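
    The effect of an unknown signal position can be illustrated numerically: under noise only, the matched-filter output at one pre-specified position follows the usual Gaussian tail, but the maximum over all positions of a long sequence exceeds a fixed-position threshold far more often. The simulation below (white Gaussian noise, assumed Gaussian template) makes the point by brute force rather than with the peak-PDF formula used in the paper.

```python
import numpy as np

rng = np.random.default_rng(11)

# Unit-norm Gaussian template (the known signal shape)
t = np.arange(-10, 11)
template = np.exp(-0.5 * (t / 2.0) ** 2)
template /= np.linalg.norm(template)

n_len, n_trials = 2000, 2000
threshold = 3.0          # "3-sigma" detection threshold for the matched-filter output

false_fixed, false_anywhere = 0, 0
for _ in range(n_trials):
    noise = rng.normal(0.0, 1.0, n_len)
    mf = np.correlate(noise, template, mode="valid")  # matched-filter output, unit noise variance
    false_fixed += mf[len(mf) // 2] > threshold       # test at one pre-specified position
    false_anywhere += mf.max() > threshold            # search the whole sequence for a peak

print(f"false-detection rate, known position : {false_fixed / n_trials:.4f}")
print(f"false-detection rate, position free  : {false_anywhere / n_trials:.4f}")
```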

  11. Modeling the relationship between most probable number (MPN) and colony-forming unit (CFU) estimates of fecal coliform concentration.

    Science.gov (United States)

    Gronewold, Andrew D; Wolpert, Robert L

    2008-07-01

    Most probable number (MPN) and colony-forming-unit (CFU) estimates of fecal coliform bacteria concentration are common measures of water quality in coastal shellfish harvesting and recreational waters. Estimating procedures for MPN and CFU have intrinsic variability and are subject to additional uncertainty arising from minor variations in experimental protocol. It has been observed empirically that the standard multiple-tube fermentation (MTF) decimal dilution analysis MPN procedure is more variable than the membrane filtration CFU procedure, and that MTF-derived MPN estimates are somewhat higher on average than CFU estimates, on split samples from the same water bodies. We construct a probabilistic model that provides a clear theoretical explanation for the variability in, and discrepancy between, MPN and CFU measurements. We then compare our model to water quality samples analyzed using both MPN and CFU procedures, and find that the (often large) observed differences between MPN and CFU values for the same water body are well within the ranges predicted by our probabilistic model. Our results indicate that MPN and CFU intra-sample variability does not stem from human error or laboratory procedure variability, but is instead a simple consequence of the probabilistic basis for calculating the MPN. These results demonstrate how probabilistic models can be used to compare samples from different analytical procedures, and to determine whether transitions from one procedure to another are likely to cause a change in quality-based management decisions.
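
    The probabilistic basis of the MPN referred to above can be made concrete: assuming organisms are Poisson-distributed in the sample, a tube inoculated with volume v at concentration c is positive with probability 1 - exp(-c*v), and the MPN is the concentration that maximizes the resulting binomial likelihood. A minimal sketch with an assumed three-dilution, five-tube design:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Assumed multiple-tube fermentation design: 5 tubes at each of three ten-fold dilutions
volumes = np.array([10.0, 1.0, 0.1])    # mL of original sample per tube
n_tubes = np.array([5, 5, 5])
positive = np.array([5, 3, 1])          # observed positive tubes per dilution

def neg_log_likelihood(log_c):
    c = np.exp(log_c)                    # concentration in organisms per mL
    p = 1.0 - np.exp(-c * volumes)       # Poisson assumption: P(tube is positive)
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -np.sum(positive * np.log(p) + (n_tubes - positive) * np.log(1.0 - p))

res = minimize_scalar(neg_log_likelihood, bounds=(np.log(1e-4), np.log(1e4)), method="bounded")
mpn = np.exp(res.x)
print(f"MPN estimate: {mpn:.2f} organisms per mL (about {mpn * 100:.0f} per 100 mL)")
```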

  12. Estimation of the probability of exposure to machining fluids in a population-based case-control study.

    Science.gov (United States)

    Park, Dong-Uk; Colt, Joanne S; Baris, Dalsu; Schwenn, Molly; Karagas, Margaret R; Armenti, Karla R; Johnson, Alison; Silverman, Debra T; Stewart, Patricia A

    2014-01-01

    We describe an approach for estimating the probability that study subjects were exposed to metalworking fluids (MWFs) in a population-based case-control study of bladder cancer. Study subject reports on the frequency of machining and use of specific MWFs (straight, soluble, and synthetic/semi-synthetic) were used to estimate exposure probability when available. Those reports also were used to develop estimates for job groups, which were then applied to jobs without MWF reports. Estimates using both cases and controls and controls only were developed. The prevalence of machining varied substantially across job groups (0.1->0.9%), with the greatest percentage of jobs that machined being reported by machinists and tool and die workers. Reports of straight and soluble MWF use were fairly consistent across job groups (generally 50-70%). Synthetic MWF use was lower (13-45%). There was little difference in reports by cases and controls vs. controls only. Approximately, 1% of the entire study population was assessed as definitely exposed to straight or soluble fluids in contrast to 0.2% definitely exposed to synthetic/semi-synthetics. A comparison between the reported use of the MWFs and U.S. production levels found high correlations (r generally >0.7). Overall, the method described here is likely to have provided a systematic and reliable ranking that better reflects the variability of exposure to three types of MWFs than approaches applied in the past. [Supplementary materials are available for this article. Go to the publisher's online edition of Journal of Occupational and Environmental Hygiene for the following free supplemental resources: a list of keywords in the occupational histories that were used to link study subjects to the metalworking fluids (MWFs) modules; recommendations from the literature on selection of MWFs based on type of machining operation, the metal being machined and decade; popular additives to MWFs; the number and proportion of controls who

  13. Joint Behaviour of Semirecursive Kernel Estimators of the Location and of the Size of the Mode of a Probability Density Function

    Directory of Open Access Journals (Sweden)

    Abdelkader Mokkadem

    2011-01-01

    Full Text Available Consider the location and the size of the mode of a probability density. We study the joint convergence rates of semirecursive kernel estimators of the location and of the size of the mode. We show how the estimation of the size of the mode allows measuring the relevance of the estimation of its location. We also show that, beyond their computational advantage over nonrecursive estimators, the semirecursive estimators are preferable for the construction of confidence regions.

  14. Building vulnerability to hydro-geomorphic hazards: Estimating damage probability from qualitative vulnerability assessment using logistic regression

    Science.gov (United States)

    Ettinger, Susanne; Mounaud, Loïc; Magill, Christina; Yao-Lafourcade, Anne-Françoise; Thouret, Jean-Claude; Manville, Vern; Negulescu, Caterina; Zuccaro, Giulio; De Gregorio, Daniela; Nardone, Stefano; Uchuchoque, Juan Alexis Luque; Arguedas, Anita; Macedo, Luisa; Manrique Llerena, Nélida

    2016-10-01

    The focus of this study is an analysis of building vulnerability through investigating impacts from the 8 February 2013 flash flood event along the Avenida Venezuela channel in the city of Arequipa, Peru. On this day, 124.5 mm of rain fell within 3 h (monthly mean: 29.3 mm) triggering a flash flood that inundated at least 0.4 km2 of urban settlements along the channel, affecting more than 280 buildings, 23 of a total of 53 bridges (pedestrian, vehicle and railway), and leading to the partial collapse of sections of the main road, paralyzing central parts of the city for more than one week. This study assesses the aspects of building design and site specific environmental characteristics that render a building vulnerable by considering the example of a flash flood event in February 2013. A statistical methodology is developed that enables estimation of damage probability for buildings. The applied method uses observed inundation height as a hazard proxy in areas where more detailed hydrodynamic modeling data is not available. Building design and site-specific environmental conditions determine the physical vulnerability. The mathematical approach considers both physical vulnerability and hazard related parameters and helps to reduce uncertainty in the determination of descriptive parameters, parameter interdependency and respective contributions to damage. This study aims to (1) enable the estimation of damage probability for a certain hazard intensity, and (2) obtain data to visualize variations in damage susceptibility for buildings in flood prone areas. Data collection is based on a post-flood event field survey and the analysis of high (sub-metric) spatial resolution images (Pléiades 2012, 2013). An inventory of 30 city blocks was collated in a GIS database in order to estimate the physical vulnerability of buildings. As many as 1103 buildings were surveyed along the affected drainage and 898 buildings were included in the statistical analysis. Univariate and
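
    A hedged sketch of the kind of damage-probability model described above: a logistic regression of a binary damage indicator on observed inundation height and simple building descriptors. The predictor names and the synthetic data are assumptions for illustration, not the survey variables used by the authors.

```python
# Minimal logistic-regression sketch for building damage probability,
# using synthetic data in place of the post-flood survey variables.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 898  # number of surveyed buildings included in the statistical analysis

# Synthetic predictors: inundation height (m), number of storeys, masonry flag
X = np.column_stack([
    rng.gamma(2.0, 0.5, n),      # inundation height proxy
    rng.integers(1, 4, n),       # storeys
    rng.integers(0, 2, n),       # 1 = unreinforced masonry (hypothetical flag)
])
# Synthetic damage outcome that increases with inundation height
logit = -2.0 + 1.5 * X[:, 0] - 0.3 * X[:, 1] + 0.8 * X[:, 2]
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

model = LogisticRegression().fit(X, y)
# Estimated damage probability for a one-storey masonry building under 1.2 m of water
print(model.predict_proba([[1.2, 1, 1]])[0, 1])
```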

  15. Probability Estimation Method Based on Truncated Estimation

    Institute of Scientific and Technical Information of China (English)

    李熔

    2014-01-01

    It is an important question in compressive sensing theory whether sparse signals can be reconstructed correctly with high probability. The sparsity of the signal and the coherence properties of the atoms in the redundant dictionary are the key factors in this question. Using the concept of cumulative coherence, this paper proposes a truncated-estimation-based method for estimating the probability that the cumulative coherence satisfies the constraint bound. With this method one can judge whether the selected measurement matrix can correctly reconstruct the original signal. Matlab simulations verify that, with a Gaussian random matrix as the measurement matrix, the original signal can be reconstructed correctly with high probability under the OMP reconstruction algorithm, and they also verify that the proposed method is reasonable.
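
    The sketch below illustrates the quantities discussed in the abstract under stated assumptions: it computes the cumulative coherence (Babel function) of a unit-norm Gaussian random measurement matrix and then estimates, by repeated trials, the probability that OMP exactly recovers a k-sparse signal. It is a generic illustration in Python (the paper's simulations were in Matlab), not the truncated-estimation bound proposed in the paper.

```python
# Cumulative coherence of a Gaussian measurement matrix and an empirical
# OMP recovery probability; all dimensions are illustrative choices.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
m, n, k = 64, 256, 5
A = rng.normal(size=(m, n))
A /= np.linalg.norm(A, axis=0)                 # unit-norm columns (atoms)

G = np.abs(A.T @ A)
np.fill_diagonal(G, 0.0)
mu1_k = np.max(np.sum(np.sort(G, axis=1)[:, -k:], axis=1))   # Babel function mu_1(k)
print("cumulative coherence mu_1(k):", mu1_k)

successes = 0
for _ in range(200):
    x = np.zeros(n)
    idx = rng.choice(n, k, replace=False)
    x[idx] = rng.normal(size=k)
    y = A @ x
    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False).fit(A, y)
    successes += set(np.flatnonzero(omp.coef_)) == set(idx)
print("empirical exact-recovery probability:", successes / 200)
```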

  16. Effect of variation in probability of ascertainment by sources ("variable catchability") upon "capture-recapture" estimates of prevalence.

    Science.gov (United States)

    Hook, E B; Regal, R R

    1993-05-15

    Capture-recapture methods in epidemiology analyze data from overlapping lists of cases from various sources of ascertainment to generate estimates of missing cases and the total affected. Applications of these methods usually recognize the possibility of, and attempt to adjust for, nonindependent ascertainment by the various sources used. However, separate from the issue of dependencies between sources is the complexity of within-source variation in the probability of ascertainment of cases, e.g., variation in ascertainment by population subgroups, such as socioeconomic classes, races, or other subdivisions. The authors present a general approach to this issue for the two-source case that takes account of not only biases that arise from such "variable catchability" within sources but also the separate complexity of dependencies between sources. A general formula, (K - delta)/(K + delta), is derived that allows simultaneous calculation of the effects of variable catchability and of source dependencies upon the accuracy of the two-source estimate. The effect of variable catchability upon accuracy is examined, and applications to data, stratified by race, on the neurodegenerative disorder Huntington's disease are presented. In the latter analysis, multiple different two-source estimates of prevalence were made, considering each source versus all others pooled. Most of the likely bias was found to be due to source dependencies; variable catchability contributed relatively little bias. Multiple poolings of all but one source may prove a generally efficient method for overcoming the problem of likely variable catchability, at least when there are data from many distinct sources.

  17. Statistical methods for estimating the probability of spontaneous abortion in observational studies--analyzing pregnancies exposed to coumarin derivatives.

    Science.gov (United States)

    Meister, Reinhard; Schaefer, Christof

    2008-09-01

    Spontaneous abortion rates are of general interest when investigating pregnancy outcome. In most studies observations are left truncated, as pregnant women enter with a delay of several weeks after conception. Apart from spontaneous abortion, pregnancy may end in induced abortion or live birth. These outcomes are considered as competing events (risks). Although statistical methods for handling this setting have been available for more than 10 years, studies on pregnancy outcome after drug exposure usually report crude rates of spontaneous abortions, ignoring left truncation and competing risks. The authors propose simple methods which remove the bias inherent in crude rates. The probability of spontaneous abortion is estimated using an event-history-based approach for the subdistribution of competing risks that handles left truncation appropriately. Variance estimation enables the construction of approximate confidence intervals and of a simple test statistic for comparing rates between different cohorts. The proposed methods are applied to a comparative prospective study on the association of spontaneous abortion and exposure to coumarin derivatives. The naive analysis using crude rates gives substantially different results than those based on the proposed methods, with up to a twofold change. Correctly incorporating left truncation into the analysis may increase the variance of the estimators, relative to an ideal sample where all pregnancies are followed from the time of conception. The consequences of such truncation for study design are discussed. Combining corrections for left truncation and competing risks offers a powerful method for analyzing miscarriage risk.
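
    A minimal sketch of the kind of estimator described above, assuming a simple data layout (entry = gestational week at enrollment, time = week the pregnancy ended, event = 1 spontaneous abortion, 2 induced abortion, 3 live birth): it computes an Aalen-Johansen-type cumulative incidence for spontaneous abortion that respects left truncation and competing risks, without the variance estimation discussed in the paper.

```python
# Cumulative incidence of spontaneous abortion under left truncation and
# competing risks, on hypothetical data.
import numpy as np

def cumulative_incidence(entry, time, event, cause):
    """Aalen-Johansen cumulative incidence for `cause`, left-truncated at `entry`."""
    entry, time, event = map(np.asarray, (entry, time, event))
    event_times = np.unique(time[event > 0])
    surv, cuminc, out = 1.0, 0.0, []
    for t in event_times:
        at_risk = np.sum((entry < t) & (time >= t))   # delayed-entry risk set
        if at_risk == 0:
            continue
        d_cause = np.sum((time == t) & (event == cause))
        d_all   = np.sum((time == t) & (event > 0))
        cuminc += surv * d_cause / at_risk            # cause-specific increment
        surv   *= 1.0 - d_all / at_risk               # all-cause Kaplan-Meier update
        out.append((t, cuminc))
    return out

# Hypothetical pregnancies: (entry week, end week, outcome code)
entry = [6, 8, 7, 9, 5, 10]
time  = [10, 40, 12, 39, 9, 41]
event = [1, 3, 2, 3, 1, 3]
print(cumulative_incidence(entry, time, event, cause=1))  # P(spontaneous abortion by week t)
```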

  18. Context-adaptive binary arithmetic coding with precise probability estimation and complexity scalability for high-efficiency video coding

    Science.gov (United States)

    Karwowski, Damian; Domański, Marek

    2016-01-01

    An improved context-based adaptive binary arithmetic coding (CABAC) is presented. The idea for the improvement is to use a more accurate mechanism for estimating symbol probabilities in the standard CABAC algorithm. The authors' proposal for such a mechanism is based on the context-tree weighting technique. In the framework of a high-efficiency video coding (HEVC) video encoder, the improved CABAC allows a 0.7% to 4.5% bitrate saving compared to the original CABAC algorithm. The application of the proposed algorithm marginally affects the complexity of the HEVC video encoder, but the complexity of the video decoder increases by 32% to 38%. In order to decrease the complexity of video decoding, a new tool has been proposed for the improved CABAC that enables scaling of the decoder complexity. Experiments show that this tool gives a 5% to 7.5% reduction of the decoding time while still maintaining high compression efficiency.

  19. Effects of river reach discretization on the estimation of the probability of levee failure owing to piping

    Science.gov (United States)

    Mazzoleni, Maurizio; Brandimarte, Luigia; Barontini, Stefano; Ranzi, Roberto

    2014-05-01

    Over the centuries many societies have preferred to settle near floodplain areas to take advantage of the favorable environmental conditions. Due to changing hydro-meteorological conditions, levee systems along rivers have been raised over time to protect urbanized areas and reduce the impact of floods. As expressed by the so-called "levee paradox", many societies may tend to trust these levee protection systems because of an induced sense of safety and, as a consequence, invest even more in urban development in levee-protected, flood-prone areas. As a result, and considering also the growing world population, the number of people living in floodplains is increasing. However, human settlements in floodplains are not totally safe and have been continuously endangered by the risk of flooding. In fact, failures of levee systems during flood events have also produced the most devastating disasters of the last two centuries, due to the exposure of the developed flood-prone areas to risk. In those cases, property damage is certain, but loss of life can vary dramatically with the extent of the inundation area, the size of the population at risk, and the amount of warning time available. The aim of this study is to propose an innovative methodology to estimate the reliability of a general river levee system in case of piping, considering different sources of uncertainty, and to analyze the influence of different discretizations of the river reach into sub-reaches on the evaluation of the probability of failure. The reliability analysis, expressed in terms of a fragility curve, was performed by evaluating the probability of failure conditioned on a given hydraulic load for a certain levee failure mechanism, using Monte Carlo and First Order Reliability Methods. Based on the fragility curve of each discrete levee reach, different fragility indexes were introduced, which then made it possible to classify the river into sub-reaches.

  20. Three- to nine-year survival estimates and fracture mechanisms of zirconia- and alumina-based restorations using standardized criteria to distinguish the severity of ceramic fractures.

    Science.gov (United States)

    Moráguez, Osvaldo D; Wiskott, H W Anselm; Scherrer, Susanne S

    2015-12-01

    The aims of this study were set as follows: (1) to provide verifiable criteria to categorize ceramic fractures into non-critical (i.e., amenable to polishing) or critical (i.e., in need of replacement); (2) to establish the corresponding survival rates for alumina and zirconia restorations; and (3) to establish the mechanism of fracture using fractography. Fifty-eight patients restored with 115 alumina-/zirconia-based crowns and 26 zirconia-based fixed dental prostheses (FDPs) were included. Ceramic fractures were classified into four types and further subclassified into "critical" or "non-critical." Kaplan-Meier survival estimates were calculated for "critical fractures only" and "all fractures." Intra-oral replicas were taken for fractographic analyses. Kaplan-Meier survival estimates for "critical fractures only" and "all fractures" were, respectively: alumina single crowns, 90.9 and 68.3% after 9.5 years (mean 5.71 ± 2.6 years); zirconia single crowns, 89.4 and 80.9% after 6.3 years (mean 3.88 ± 1.2 years); zirconia FDPs, 68.6% (critical fractures) and 24.6% (all fractures) after 7.2 and 4.6 years respectively (FDP mean observation time 3.02 ± 1.4 years). No core/framework fractures were detected. Survival estimates varied significantly depending on whether "all" fractures were considered as failures or only those deemed as "critical". For all restorations, fractographic analyses of failed veneering ceramics systematically demonstrated heavy occlusal wear at the failure origin. Therefore, the relief of local contact pressures on unsupported ceramic is recommended. Occlusal contacts on mesial or distal ridges should systematically be eliminated. A classification standard for ceramic fractures into four categories with subtypes "critical" and "non-critical" provides a differentiated view of the survival of ceramic restorations.

  1. CFD modelling of most probable bubble nucleation rate from binary mixture with estimation of components' mole fraction in critical cluster

    Science.gov (United States)

    Hong, Ban Zhen; Keong, Lau Kok; Shariff, Azmi Mohd

    2016-05-01

    Employing different mathematical models specifically for the bubble nucleation rates of water vapour and of dissolved air molecules is essential, as the physics by which they form bubble nuclei is different. The available methods to calculate the bubble nucleation rate in a binary mixture, such as density functional theory, are difficult to couple with a computational fluid dynamics (CFD) approach. In addition, the effect of dissolved gas concentration was neglected in most studies of the prediction of bubble nucleation rates. In the current work, the most probable bubble nucleation rate for the water vapour and dissolved air mixture in a 2D quasi-stable flow across a cavitating nozzle was estimated via the statistical mean of all possible bubble nucleation rates of the mixture (different mole fractions of water vapour and dissolved air) and the corresponding number of molecules in the critical cluster. Theoretically, the bubble nucleation rate is greatly dependent on the components' mole fractions in the critical cluster; hence, the effect of dissolved gas concentration was included in the current work. In addition, the possible bubble nucleation rates were predicted based on the calculated number of molecules required to form a critical cluster. The estimation of the components' mole fractions in the critical cluster for the water vapour and dissolved air mixture was obtained by coupling the enhanced classical nucleation theory with the CFD approach. Finally, the distribution of bubble nuclei of the water vapour and dissolved air mixture could be predicted via the population balance model.

  2. Inverse probability weighting to estimate causal effect of a singular phase in a multiphase randomized clinical trial for multiple myeloma

    Directory of Open Access Journals (Sweden)

    Annalisa Pezzi

    2016-11-01

    Full Text Available Abstract Background The randomization procedure in randomized controlled trials (RCTs) permits an unbiased estimation of causal effects. However, in clinical practice, differential compliance between arms may cause a strong violation of randomization balance and a biased treatment effect among those who comply. We evaluated the effect of the consolidation phase on disease-free survival of patients with multiple myeloma in an RCT designed for another purpose, adjusting for potential selection bias due to different compliance with previous treatment phases. Methods We computed two propensity scores (PS) to model two different selection processes: the first to undergo autologous stem cell transplantation, the second to begin consolidation therapy. Combined stabilized inverse probability treatment weights were then introduced in the Cox model to estimate the causal effect of consolidation therapy, mimicking an ad hoc RCT protocol. Results We found that the effect of consolidation therapy was restricted to the first 18 months of the phase (HR: 0.40; robust 95% CI: 0.17-0.96), after which it disappeared. Conclusions PS-based methods could be a complementary approach within an RCT context to evaluate the effect of the last phase of a complex therapeutic strategy, adjusting for potential selection bias caused by different compliance with the previous phases of the therapeutic scheme, in order to simulate an ad hoc randomization procedure. Trial registration ClinicalTrials.gov: NCT01134484, May 28, 2010 (retrospectively registered); EudraCT: 2005-003723-39, December 17, 2008 (retrospectively registered).
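
    A hedged sketch of the two-step analysis described above: a propensity score for beginning consolidation is estimated, converted into stabilized inverse probability treatment weights, and used in a weighted Cox model with a robust variance. The file name, column names, and covariate list are assumptions for illustration, not the trial's actual variables.

```python
# Stabilized IPTW followed by a weighted Cox model (illustrative column names).
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

df = pd.read_csv("myeloma_phase_data.csv")            # hypothetical analysis file
covariates = ["age", "iss_stage", "response_to_induction"]

# Propensity of starting consolidation given baseline/post-induction covariates
ps = LogisticRegression(max_iter=1000).fit(df[covariates], df["consolidation"])
p_treat = ps.predict_proba(df[covariates])[:, 1]
p_marginal = df["consolidation"].mean()

# Stabilized weights: marginal treatment probability over the conditional one
df["sw"] = df["consolidation"] * p_marginal / p_treat + \
           (1 - df["consolidation"]) * (1 - p_marginal) / (1 - p_treat)

cph = CoxPHFitter()
cph.fit(df[["dfs_months", "dfs_event", "consolidation", "sw"]],
        duration_col="dfs_months", event_col="dfs_event",
        weights_col="sw", robust=True)
cph.print_summary()   # HR for consolidation in the weighted pseudo-population
```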

  3. Methods for estimating annual exceedance-probability discharges and largest recorded floods for unregulated streams in rural Missouri

    Science.gov (United States)

    Southard, Rodney E.; Veilleux, Andrea G.

    2014-01-01

    Regression analysis techniques were used to develop a set of equations for rural ungaged stream sites for estimating discharges with 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities, which are equivalent to annual flood-frequency recurrence intervals of 2, 5, 10, 25, 50, 100, 200, and 500 years, respectively. Basin and climatic characteristics were computed using geographic information software and digital geospatial data. A total of 35 characteristics were computed for use in preliminary statewide and regional regression analyses. Annual exceedance-probability discharge estimates were computed for 278 streamgages by using the expected moments algorithm to fit a log-Pearson Type III distribution to the logarithms of annual peak discharges for each streamgage using annual peak-discharge data from water year 1844 to 2012. Low-outlier and historic information were incorporated into the annual exceedance-probability analyses, and a generalized multiple Grubbs-Beck test was used to detect potentially influential low floods. Annual peak flows less than a minimum recordable discharge at a streamgage were incorporated into the at-site station analyses. An updated regional skew coefficient was determined for the State of Missouri using Bayesian weighted least-squares/generalized least squares regression analyses. At-site skew estimates for 108 long-term streamgages with 30 or more years of record and the 35 basin characteristics defined for this study were used to estimate the regional variability in skew. However, a constant generalized-skew value of -0.30 and a mean square error of 0.14 were determined in this study. Previous flood studies indicated that the distinct physical features of the three physiographic provinces have a pronounced effect on the magnitude of flood peaks. Trends in the magnitudes of the residuals from preliminary statewide regression analyses from previous studies confirmed that regional analyses in this study were
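
    For illustration only, the sketch below fits a log-Pearson Type III distribution to a short synthetic series of annual peak discharges with a plain scipy fit and reads off annual exceedance-probability quantiles; it does not reproduce the expected moments algorithm, regional skew weighting, or low-outlier screening used in the study.

```python
# Simplified log-Pearson Type III fit to hypothetical annual peak discharges.
import numpy as np
from scipy import stats

peaks_cfs = np.array([12000, 8500, 15300, 9800, 22100, 7600, 18400,
                      11100, 13900, 30200, 9100, 16800])   # hypothetical peaks, ft^3/s
log_q = np.log10(peaks_cfs)

skew, loc, scale = stats.pearson3.fit(log_q)
for aep in [0.5, 0.1, 0.04, 0.01, 0.002]:                  # 50-, 10-, 4-, 1-, 0.2-percent AEP
    q = 10 ** stats.pearson3.ppf(1.0 - aep, skew, loc=loc, scale=scale)
    print(f"{aep:.3f} AEP discharge: {q:,.0f} ft^3/s")
```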

  4. Estimating the probability of identity in a random dog population using 15 highly polymorphic canine STR markers.

    Science.gov (United States)

    Eichmann, Cordula; Berger, Burkhard; Steinlechner, Martin; Parson, Walther

    2005-06-30

    Dog DNA profiling is becoming an important supplementary technology for the investigation of accidents and crime, as dogs are intensely integrated in human social life. We investigated 15 highly polymorphic canine STR markers and two sex-related markers of 131 randomly selected dogs from the area around Innsbruck, Tyrol, Austria, which were co-amplified in three PCR multiplex reactions (ZUBECA6, FH2132, FH2087Ua, ZUBECA4, WILMSTF, PEZ15, PEZ6, FH2611, FH2087Ub, FH2054, PEZ12, PEZ2, FH2010, FH2079 and VWF.X). Linkage testing for our set of markers suggested no evidence for linkage between the loci. Heterozygosity (HET), polymorphism information content (PIC) and the probability of identity (P(ID)theoretical, P(ID)unbiased, P(ID)sib) were calculated for each marker. The HET(exp) values of the 15 markers lie between 0.6 (VWF.X) and 0.9 (ZUBECA6); P(ID)sib values were found to range between 0.49 (VWF.X) and 0.28 (ZUBECA6). Moreover, the P(ID)sib was computed for sets of loci by sequentially adding single loci to estimate the information content and the usefulness of the selected marker sets for the identification of dogs. The estimated P(ID)sib value of all 15 markers amounted to 8.5 x 10^-8. The presented estimations turned out to be a helpful approach for a reasonable choice of markers for the individualisation of dogs.
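
    The probability-of-identity calculations referred to above can be sketched with the standard single-locus formulas, multiplied across loci; the allele frequencies below are invented placeholders, and the marker names are used only as labels.

```python
# P(ID) for unrelated individuals and the more conservative P(ID)sib,
# combined across loci by multiplication (hypothetical allele frequencies).
import numpy as np

def pid_unrelated(p):
    """Single-locus P(ID) for unrelated individuals, allele frequencies p."""
    p = np.asarray(p, dtype=float)
    hom = np.sum(p ** 4)
    het = sum((2 * p[i] * p[j]) ** 2
              for i in range(len(p)) for j in range(i + 1, len(p)))
    return hom + het

def pid_sib(p):
    """Single-locus P(ID)sib, the full-sibling version."""
    p = np.asarray(p, dtype=float)
    s2, s4 = np.sum(p ** 2), np.sum(p ** 4)
    return 0.25 + 0.5 * s2 + 0.5 * s2 ** 2 - 0.25 * s4

loci = {  # placeholder allele frequency spectra for two markers
    "ZUBECA6": [0.15, 0.20, 0.10, 0.25, 0.30],
    "VWF.X":   [0.55, 0.30, 0.15],
}
print("combined P(ID):   ", np.prod([pid_unrelated(p) for p in loci.values()]))
print("combined P(ID)sib:", np.prod([pid_sib(p) for p in loci.values()]))
```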

  5. Methods for estimating annual exceedance probability discharges for streams in Arkansas, based on data through water year 2013

    Science.gov (United States)

    Wagner, Daniel M.; Krieger, Joshua D.; Veilleux, Andrea G.

    2016-08-04

    In 2013, the U.S. Geological Survey initiated a study to update regional skew, annual exceedance probability discharges, and regional regression equations used to estimate annual exceedance probability discharges for ungaged locations on streams in the study area with the use of recent geospatial data, new analytical methods, and available annual peak-discharge data through the 2013 water year. An analysis of regional skew using Bayesian weighted least-squares/Bayesian generalized-least squares regression was performed for Arkansas, Louisiana, and parts of Missouri and Oklahoma. The newly developed constant regional skew of -0.17 was used in the computation of annual exceedance probability discharges for 281 streamgages used in the regional regression analysis. Based on analysis of covariance, four flood regions were identified for use in the generation of regional regression models. Thirty-nine basin characteristics were considered as potential explanatory variables, and ordinary least-squares regression techniques were used to determine the optimum combinations of basin characteristics for each of the four regions. Basin characteristics in candidate models were evaluated based on multicollinearity with other basin characteristics (variance inflation factor < 2.5) and statistical significance at the 95-percent confidence level (p ≤ 0.05). Generalized least-squares regression was used to develop the final regression models for each flood region. Average standard errors of prediction of the generalized least-squares models ranged from 32.76 to 59.53 percent, with the largest range in flood region D. Pseudo coefficients of determination of the generalized least-squares models ranged from 90.29 to 97.28 percent, with the largest range also in flood region D. The regional regression equations apply only to locations on streams in Arkansas where annual peak discharges are not substantially affected by regulation, diversion, channelization, backwater, or urbanization

  6. Model assembly for estimating cell surviving fraction for both targeted and nontargeted effects based on microdosimetric probability densities.

    Directory of Open Access Journals (Sweden)

    Tatsuhiko Sato

    Full Text Available Here we propose a new model assembly for estimating the surviving fraction of cells irradiated with various types of ionizing radiation, considering both targeted and nontargeted effects in the same framework. The probability densities of specific energies in two scales, which are the cell nucleus and its substructure called a domain, were employed as the physical index for characterizing the radiation fields. In the model assembly, our previously established double stochastic microdosimetric kinetic (DSMK) model was used to express the targeted effect, whereas a newly developed model was used to express the nontargeted effect. The radioresistance caused by overexpression of the anti-apoptotic protein Bcl-2, known to occur frequently in human cancer, was also considered by introducing the concept of the adaptive response into the DSMK model. The accuracy of the model assembly was examined by comparing the computationally and experimentally determined surviving fractions of Bcl-2 cells (Bcl-2 overexpressing HeLa cells) and Neo cells (neomycin resistant gene-expressing HeLa cells) irradiated with microbeams or broadbeams of energetic heavy ions, as well as of WI-38 normal human fibroblasts irradiated with an X-ray microbeam. The model assembly reproduced very well the experimentally determined surviving fraction over a wide range of dose and linear energy transfer (LET) values. Our newly established model assembly will be worth incorporating into treatment planning systems for heavy-ion therapy, brachytherapy, and boron neutron capture therapy, given the critical roles of the frequent Bcl-2 overexpression and the nontargeted effect in estimating therapeutic outcomes and harmful effects of such advanced therapeutic modalities.

  7. Model assembly for estimating cell surviving fraction for both targeted and nontargeted effects based on microdosimetric probability densities.

    Science.gov (United States)

    Sato, Tatsuhiko; Hamada, Nobuyuki

    2014-01-01

    Here we propose a new model assembly for estimating the surviving fraction of cells irradiated with various types of ionizing radiation, considering both targeted and nontargeted effects in the same framework. The probability densities of specific energies in two scales, which are the cell nucleus and its substructure called a domain, were employed as the physical index for characterizing the radiation fields. In the model assembly, our previously established double stochastic microdosimetric kinetic (DSMK) model was used to express the targeted effect, whereas a newly developed model was used to express the nontargeted effect. The radioresistance caused by overexpression of the anti-apoptotic protein Bcl-2, known to occur frequently in human cancer, was also considered by introducing the concept of the adaptive response into the DSMK model. The accuracy of the model assembly was examined by comparing the computationally and experimentally determined surviving fractions of Bcl-2 cells (Bcl-2 overexpressing HeLa cells) and Neo cells (neomycin resistant gene-expressing HeLa cells) irradiated with microbeams or broadbeams of energetic heavy ions, as well as of WI-38 normal human fibroblasts irradiated with an X-ray microbeam. The model assembly reproduced very well the experimentally determined surviving fraction over a wide range of dose and linear energy transfer (LET) values. Our newly established model assembly will be worth incorporating into treatment planning systems for heavy-ion therapy, brachytherapy, and boron neutron capture therapy, given the critical roles of the frequent Bcl-2 overexpression and the nontargeted effect in estimating therapeutic outcomes and harmful effects of such advanced therapeutic modalities.

  8. Developing a Methodology for Eliciting Subjective Probability Estimates During Expert Evaluations of Safety Interventions: Application for Bayesian Belief Networks

    Science.gov (United States)

    Wiegmann, Douglas A.

    2005-01-01

    The NASA Aviation Safety Program (AvSP) has defined several products that will potentially modify airline and/or ATC operations, enhance aircraft systems, and improve the identification of potentially hazardous situations within the National Airspace System (NAS). Consequently, there is a need to develop methods for evaluating the potential safety benefit of each of these intervention products so that resources can be invested effectively. One such method relies on expert judgments to develop Bayesian Belief Networks (BBNs) that model the potential impact that specific interventions may have. Specifically, the present report summarizes methodologies for improving the elicitation of probability estimates during expert evaluations of AvSP products for use in BBNs. The work involved joint efforts between Professor James Luxhoj of Rutgers University and researchers at the University of Illinois. The Rutgers project to develop BBNs received funding from NASA under the title "Probabilistic Decision Support for Evaluating Technology Insertion and Assessing Aviation Safety System Risk." The proposed project was funded separately but supported the existing Rutgers program.

  9. Methods for estimating annual exceedance-probability streamflows for streams in Kansas based on data through water year 2015

    Science.gov (United States)

    Painter, Colin C.; Heimann, David C.; Lanning-Rush, Jennifer L.

    2017-08-14

    A study was done by the U.S. Geological Survey in cooperation with the Kansas Department of Transportation and the Federal Emergency Management Agency to develop regression models to estimate peak streamflows of annual exceedance probabilities of 50, 20, 10, 4, 2, 1, 0.5, and 0.2 percent at ungaged locations in Kansas. Peak streamflow frequency statistics from selected streamgages were related to contributing drainage area and average precipitation using generalized least-squares regression analysis. The peak streamflow statistics were derived from 151 streamgages with at least 25 years of streamflow data through 2015. The developed equations can be used to predict peak streamflow magnitude and frequency within two hydrologic regions that were defined based on the effects of irrigation. The equations developed in this report are applicable to streams in Kansas that are not substantially affected by regulation, surface-water diversions, or urbanization. The equations are intended for use for streams with contributing drainage areas ranging from 0.17 to 14,901 square miles in the nonirrigation-effects region and from 1.02 to 3,555 square miles in the irrigation-affected region, corresponding to the range of drainage areas of the streamgages used in the development of the regional equations.

  10. Estimation of probable maximum typhoon wave for coastal nuclear power plant

    Institute of Scientific and Technical Information of China (English)

    丁赟

    2011-01-01

    The third-generation wave model SWAN (Simulating Waves Nearshore) was employed to estimate the probable maximum typhoon wave at a coastal nuclear power plant site, and the relationship between the development of the probable maximum typhoon wave and that of the accompanying probable maximum storm surge was analyzed. It is shown that the probable maximum typhoon wave usually peaks later than the probable maximum storm surge, and that the estimated probable maximum typhoon wave is higher than the maximum wave height observed at the Zhelang ocean station. The approach utilized in this study to estimate the probable maximum typhoon wave provides a reference for the design of coastal nuclear power engineering.

  11. Survival Analysis of Patients with End Stage Renal Disease

    Science.gov (United States)

    Urrutia, J. D.; Gayo, W. S.; Bautista, L. A.; Baccay, E. B.

    2015-06-01

    This paper provides a survival analysis of End Stage Renal Disease (ESRD) using Kaplan-Meier estimates and the Weibull distribution. The data were obtained from the records of V. L. Makabali Memorial Hospital with respect to time t (patient's age), covariates such as developed secondary disease (Pulmonary Congestion and Cardiovascular Disease), gender, and the event of interest: the death of ESRD patients. Survival and hazard rates were estimated using NCSS for the Weibull distribution and SPSS for the Kaplan-Meier estimates. Both lead to the same conclusion: the hazard rate increases and the survival rate decreases over time for ESRD patients diagnosed with Pulmonary Congestion, Cardiovascular Disease, or both. The results also show that female patients have a greater risk of death compared to males. The probability of risk was given by the equation R = 1 - e^(-H(t)), where e^(-H(t)) is the survival function and H(t) is the cumulative hazard function, which was obtained using Cox regression.
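
    A minimal sketch, with made-up follow-up data, of the quantities used above: a Kaplan-Meier survival estimate, a Nelson-Aalen cumulative hazard H(t), and the corresponding risk R = 1 - exp(-H(t)).

```python
# Kaplan-Meier survival, cumulative hazard, and the associated cumulative risk.
import numpy as np
from lifelines import KaplanMeierFitter, NelsonAalenFitter

# Hypothetical follow-up times (months) and death indicator (1 = died, 0 = censored)
t = np.array([5, 12, 7, 30, 22, 14, 9, 40, 3, 18])
d = np.array([1, 1, 0, 0, 1, 1, 0, 0, 1, 1])

km = KaplanMeierFitter().fit(t, d)
na = NelsonAalenFitter().fit(t, d)

H = na.cumulative_hazard_.iloc[:, 0]     # H(t), the cumulative hazard
risk = 1.0 - np.exp(-H)                  # R = 1 - exp(-H(t))
print(km.survival_function_.tail(1))     # S(t) at the last observed time
print(risk.tail(1))                      # corresponding cumulative risk
```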

  12. Stochastic approach for an unbiased estimation of the probability of a successful separation in conventional chromatography and sequential elution liquid chromatography.

    Science.gov (United States)

    Ennis, Erin J; Foley, Joe P

    2016-07-15

    A stochastic approach was utilized to estimate the probability of a successful isocratic or gradient separation in conventional chromatography for numbers of sample components, peak capacities, and saturation factors ranging from 2 to 30, 20-300, and 0.017-1, respectively. The stochastic probabilities were obtained under conditions of (i) constant peak width ("gradient" conditions) and (ii) peak width increasing linearly with time ("isocratic/constant N" conditions). The isocratic and gradient probabilities obtained stochastically were compared with the probabilities predicted by Martin et al. [Anal. Chem., 58 (1986) 2200-2207] and Davis and Stoll [J. Chromatogr. A, (2014) 128-142]; for a given number of components and peak capacity the same trend is always observed: the probability obtained with the isocratic stochastic approach is the lowest and the probability predicted by Davis and Stoll the highest. When the stochastic results are applied to conventional HPLC and sequential elution liquid chromatography (SE-LC), the latter is shown to provide much greater probabilities of success for moderately complex samples (e.g., P(HPLC) = 31.2% versus P(SE-LC) = 69.1% for 12 components and the same analysis time). For a given number of components, the density of probability data provided over the range of peak capacities is sufficient to allow accurate interpolation of probabilities for peak capacities not reported. Applications of the stochastic approach include isothermal and programmed-temperature gas chromatography.

  13. Model approach to estimate the probability of accepting a lot of heterogeneously contaminated powdered food using different sampling strategies.

    Science.gov (United States)

    Valero, Antonio; Pasquali, Frédérique; De Cesare, Alessandra; Manfreda, Gerardo

    2014-08-01

    Current sampling plans assume a random distribution of microorganisms in food. However, food-borne pathogens are estimated to be heterogeneously distributed in powdered foods. This spatial distribution, together with very low levels of contamination, raises concern about the efficiency of current sampling plans for the detection of food-borne pathogens like Cronobacter and Salmonella in powdered foods such as powdered infant formula or powdered eggs. An alternative approach based on a Poisson distribution of the contaminated part of the lot (Habraken approach) was used in order to evaluate the probability of falsely accepting a contaminated lot of powdered food when different sampling strategies were simulated, considering variables such as lot size, sample size, microbial concentration in the contaminated part of the lot and proportion of contaminated lot. The simulated results suggest that a sample size of 100 g or more corresponds to the lowest number of samples to be tested in comparison with sample sizes of 10 or 1 g. Moreover, the number of samples to be tested decreases greatly if the microbial concentration is 1 CFU/g instead of 0.1 CFU/g or if the proportion of contamination is 0.05 instead of 0.01. Mean contaminations higher than 1 CFU/g or proportions higher than 0.05 did not affect the number of samples. The Habraken approach represents a useful tool for risk management in order to design a fit-for-purpose sampling plan for the detection of low levels of food-borne pathogens in heterogeneously contaminated powdered food. However, it must be pointed out that, although effective in detecting pathogens, these sampling plans are difficult to apply because of the huge number of samples that need to be tested. Sampling does not seem an effective measure to control pathogens in powdered food.
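
    A simplified sketch of the general idea (not the authors' exact model): the lot is contaminated only in a fraction of its mass, counts within that fraction are Poisson-distributed, and the lot is falsely accepted when every one of the n test samples is negative. All parameter values are illustrative.

```python
# Probability of falsely accepting a heterogeneously contaminated lot
# as a function of sample mass (simplified, illustrative model).
import numpy as np

def p_accept(n_samples, sample_mass_g, conc_cfu_per_g, contaminated_fraction):
    """P(all n samples negative), assuming a sample lands in the contaminated
    part of the lot with probability `contaminated_fraction` and the count in
    a contaminated sample is Poisson with mean conc * mass."""
    p_negative = (1 - contaminated_fraction) + \
                 contaminated_fraction * np.exp(-conc_cfu_per_g * sample_mass_g)
    return p_negative ** n_samples

for mass in (1, 10, 100):   # grams per sample, as compared in the study
    print(mass, "g:", p_accept(n_samples=30, sample_mass_g=mass,
                               conc_cfu_per_g=0.1, contaminated_fraction=0.01))
```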

  14. Developing a model to estimate the probability of bacteremia in women with community-onset febrile urinary tract infection.

    Science.gov (United States)

    Oh, Won Sup; Kim, Yeon-Sook; Yeom, Joon Sup; Choi, Hee Kyoung; Kwak, Yee Gyung; Jun, Jae-Bum; Park, Seong Yeon; Chung, Jin-Won; Rhee, Ji-Young; Kim, Baek-Nam

    2016-11-24

    Among patients with urinary tract infection (UTI), bacteremic cases show higher mortality rates than do nonbacteremic cases. Early identification of bacteremic cases is crucial for severity assessment of patients with febrile UTI. This study aimed to identify predictors associated with bacteremia in women with community-onset febrile UTI and to develop a prediction model to estimate the probability of bacteremic cases. This cross-sectional study included women consecutively hospitalized with community-onset febrile UTI at 10 hospitals in Korea. Multiple logistic regression identified predictors associated with bacteremia among candidate variables chosen from univariate analysis. A prediction model was developed using all predictors weighted by their regression coefficients. From July to September 2014, 383 women with febrile UTI were included: 115 (30.0%) bacteremic and 268 (70.0%) nonbacteremic cases. A prediction model consisted of diabetes mellitus (1 point), urinary tract obstruction by stone (2), costovertebral angle tenderness (2), a fraction of segmented neutrophils of > 90% (2), thrombocytopenia (2), azotemia (2), and the fulfillment of all criteria for systemic inflammatory response syndrome (2). The c statistic for the model was 0.807 (95% confidence interval [CI], 0.757-0.856). At a cutoff value of ≥ 3, the model had a sensitivity of 86.1% (95% CI, 78.1-91.6%) and a specificity of 54.9% (95% CI, 48.7-91.6%). Our model showed a good discriminatory power for early identification of bacteremic cases in women with community-onset febrile UTI. In addition, our model can be used to identify patients at low risk for bacteremia because of its relatively high sensitivity.

  15. Bayesian pretest probability estimation for primary malignant bone tumors based on the Surveillance, Epidemiology and End Results Program (SEER) database.

    Science.gov (United States)

    Benndorf, Matthias; Neubauer, Jakob; Langer, Mathias; Kotter, Elmar

    2017-03-01

    In the diagnostic process of primary bone tumors, patient age, tumor localization and to a lesser extent sex affect the differential diagnosis. We therefore aim to develop a pretest probability calculator for primary malignant bone tumors based on population data taking these variables into account. We access the SEER (Surveillance, Epidemiology and End Results Program of the National Cancer Institute, 2015 release) database and analyze data of all primary malignant bone tumors diagnosed between 1973 and 2012. We record age at diagnosis, tumor localization according to the International Classification of Diseases (ICD-O-3) and sex. We take relative probability of the single tumor entity as a surrogate parameter for unadjusted pretest probability. We build a probabilistic (naïve Bayes) classifier to calculate pretest probabilities adjusted for age, tumor localization and sex. We analyze data from 12,931 patients (647 chondroblastic osteosarcomas, 3659 chondrosarcomas, 1080 chordomas, 185 dedifferentiated chondrosarcomas, 2006 Ewing's sarcomas, 281 fibroblastic osteosarcomas, 129 fibrosarcomas, 291 fibrous malignant histiocytomas, 289 malignant giant cell tumors, 238 myxoid chondrosarcomas, 3730 osteosarcomas, 252 parosteal osteosarcomas, 144 telangiectatic osteosarcomas). We make our probability calculator accessible at http://ebm-radiology.com/bayesbone/index.html . We provide exhaustive tables for age and localization data. Results from tenfold cross-validation show that in 79.8 % of cases the pretest probability is correctly raised. Our approach employs population data to calculate relative pretest probabilities for primary malignant bone tumors. The calculator is not diagnostic in nature. However, resulting probabilities might serve as an initial evaluation of probabilities of tumors on the differential diagnosis list.
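
    A hedged sketch of a naive Bayes pretest-probability calculation of the kind described above; the prior proportions and conditional probabilities below are invented placeholders, not SEER-derived values, and only three tumor entities are included for brevity.

```python
# Naive Bayes pretest probability over a small set of tumor entities,
# given an age band, a tumor site, and sex (all inputs are placeholders).
def pretest_probability(priors, cond_age, cond_site, cond_sex, age_band, site, sex):
    """P(tumor | age band, site, sex), assuming conditional independence of the
    three predictors given the tumor entity (naive Bayes)."""
    scores = {t: priors[t] * cond_age[t][age_band] * cond_site[t][site] * cond_sex[t][sex]
              for t in priors}
    total = sum(scores.values())
    return {t: s / total for t, s in scores.items()}

priors    = {"osteosarcoma": 0.29, "chondrosarcoma": 0.28, "Ewing sarcoma": 0.16}
cond_age  = {"osteosarcoma":   {"10-19": 0.45, "60+": 0.10},
             "chondrosarcoma": {"10-19": 0.05, "60+": 0.35},
             "Ewing sarcoma":  {"10-19": 0.55, "60+": 0.02}}
cond_site = {"osteosarcoma":   {"femur": 0.40, "pelvis": 0.08},
             "chondrosarcoma": {"femur": 0.20, "pelvis": 0.25},
             "Ewing sarcoma":  {"femur": 0.20, "pelvis": 0.25}}
cond_sex  = {"osteosarcoma":   {"M": 0.55, "F": 0.45},
             "chondrosarcoma": {"M": 0.55, "F": 0.45},
             "Ewing sarcoma":  {"M": 0.60, "F": 0.40}}

print(pretest_probability(priors, cond_age, cond_site, cond_sex, "10-19", "femur", "M"))
```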

  16. Estimating present day extreme water level exceedance probabilities around the coastline of Australia: tides, extra-tropical storm surges and mean sea level

    Science.gov (United States)

    Haigh, Ivan D.; Wijeratne, E. M. S.; MacPherson, Leigh R.; Pattiaratchi, Charitha B.; Mason, Matthew S.; Crompton, Ryan P.; George, Steve

    2014-01-01

    The occurrence of extreme water levels along low-lying, highly populated and/or developed coastlines can lead to considerable loss of life and billions of dollars of damage to coastal infrastructure. Therefore it is vitally important that the exceedance probabilities of extreme water levels are accurately evaluated to inform risk-based flood management, engineering and future land-use planning. This ensures that the risks of catastrophic structural failure due to under-design, or of expensive waste due to over-design, are minimised. This paper estimates for the first time present day extreme water level exceedance probabilities around the whole coastline of Australia. A high-resolution depth-averaged hydrodynamic model has been configured for the Australian continental shelf region and has been forced with tidal levels from a global tidal model and meteorological fields from a global reanalysis to generate a 61-year hindcast of water levels. Output from this model has been successfully validated against measurements from 30 tide gauge sites. At each numeric coastal grid point, extreme value distributions have been fitted to the derived time series of annual maxima and the several largest water levels each year to estimate exceedance probabilities. This provides a reliable estimate of water level probabilities around southern Australia, a region mainly impacted by extra-tropical cyclones. However, as the meteorological forcing used only weakly includes the effects of tropical cyclones, extreme water level probabilities are underestimated around the western, northern and north-eastern Australian coastline. In a companion paper we build on the work presented here and more accurately include tropical cyclone-induced surges in the estimation of extreme water levels. The multi-decadal hindcast generated here has been used primarily to estimate extreme water level exceedance probabilities but could be used more widely in the future for a variety of other research and practical
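
    As an illustration of the final step described above, the sketch below fits a generalized extreme value (GEV) distribution to a synthetic 61-year series of annual maximum water levels and converts it into an exceedance probability and a return level; the r-largest-values analysis mentioned in the abstract is not reproduced.

```python
# GEV fit to synthetic annual maxima and conversion to exceedance probabilities.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
annual_maxima_m = 1.2 + 0.15 * rng.gumbel(size=61)   # synthetic 61-year hindcast (m)

shape, loc, scale = stats.genextreme.fit(annual_maxima_m)
level_100yr = stats.genextreme.ppf(1 - 0.01, shape, loc=loc, scale=scale)
p_exceed_2m = stats.genextreme.sf(2.0, shape, loc=loc, scale=scale)
print(f"1-percent AEP (100-year) level: {level_100yr:.2f} m")
print(f"annual probability of exceeding 2.0 m: {p_exceed_2m:.4f}")
```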

  17. A Method to Estimate the Probability That Any Individual Lightning Stroke Contacted the Surface Within Any Radius of Any Point

    Science.gov (United States)

    Huddleston, Lisa L.; Roeder, William; Merceret, Francis J.

    2010-01-01

    A technique has been developed to calculate the probability that any nearby lightning stroke is within any radius of any point of interest. In practice, this provides the probability that a nearby lightning stroke was within a key distance of a facility, rather than the error ellipses centered on the stroke. This process takes the bivariate Gaussian probability density provided by the lightning location error ellipse for the most likely location of a lightning stroke and integrates it to get the probability that the stroke is inside any specified radius. This new facility-centric technique will be much more useful to space launch customers and may supersede the lightning error ellipse approach discussed in [5], [6].
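
    A hedged sketch of the facility-centric calculation described above: the bivariate Gaussian location-error density of a reported stroke is integrated over a circle of radius r around a point of interest. The ellipse parameters and coordinates are illustrative, and plain Monte Carlo integration stands in for whatever quadrature the authors used.

```python
# Probability that the true stroke location lies within a radius of a facility,
# given the reported location and its bivariate Gaussian error covariance.
import numpy as np

def prob_within_radius(stroke_xy, cov, point_xy, radius, n=1_000_000, seed=0):
    """Monte Carlo integral of the bivariate normal density over a disc."""
    rng = np.random.default_rng(seed)
    samples = rng.multivariate_normal(stroke_xy, cov, size=n)
    dist = np.linalg.norm(samples - np.asarray(point_xy), axis=1)
    return np.mean(dist <= radius)

# Reported stroke 0.4 km east / 0.2 km north of the facility, with a 0.5 x 0.3 km
# error ellipse rotated 30 degrees (all values hypothetical).
theta = np.radians(30)
R = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
cov = R @ np.diag([0.5 ** 2, 0.3 ** 2]) @ R.T
print(prob_within_radius([0.4, 0.2], cov, point_xy=[0.0, 0.0], radius=0.8))
```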

  18. Estimating the per-contact probability of infection by highly pathogenic avian influenza (H7N7) virus during the 2003 epidemic in the Netherlands.

    NARCIS (Netherlands)

    Ssematimba, A.; Elbers, A.R.W.; Hagenaars, T.H.J.; Jong, de M.C.M.

    2012-01-01

    Estimates of the per-contact probability of transmission between farms of Highly Pathogenic Avian Influenza virus of H7N7 subtype during the 2003 epidemic in the Netherlands are important for the design of better control and biosecurity strategies. We used standardized data collected during the epid

  19. Impact of estimated HDL particle size via the ratio of HDL-C and apoprotein A-I on short-term prognosis of diabetic patients with stable coronary artery disease.

    Science.gov (United States)

    Hong, Li-Feng; Yang, Bo; Luo, Song-Hui; Li, Jian-Jun

    2014-09-01

    Revascularization and statin therapy are routinely used in the management of stable coronary artery disease. However, it is unclear whether the estimated high-density lipoprotein (HDL) particle size (eHDL-S), the ratio of HDL cholesterol (HDL-C) to apoprotein A-I (apoA-I), is associated with the clinical outcomes of diabetic patients with stable coronary artery disease (CAD). We performed a prospective cohort study of 328 patients diagnosed with stable CAD by coronary angiography. Patients were followed up for a mean duration of 12 months. The patients were divided into three groups by the tertiles of eHDL-S: low, intermediate, and high eHDL-S (the high tertile being eHDL-S > 0.79, n = 99). The associations between the baseline eHDL-S and short-term outcomes were evaluated using the Kaplan-Meier method and Cox proportional hazards regression. The low eHDL-S group had higher triglyceride, hemoglobin A1c, uric acid, and leukocyte count than the other groups. During the follow-up period, 47/328 patients experienced a pre-specified outcome. According to the Kaplan-Meier analysis, the incidence of pre-specified outcomes was lower in the high eHDL-S group (P = 0.04). However, eHDL-S was not independently associated with adverse outcomes in Cox proportional hazards regression (hazard ratio (HR): 0.23, 95% confidence interval (95% CI): 0.01-11.24, P = 0.493). Although the eHDL-S was associated with inflammatory biomarkers, it was not independently associated with the short-term prognosis of diabetic patients with stable CAD in the era of revascularization and potent statin therapy.

  20. A capture-recapture survival analysis model for radio-tagged animals

    Science.gov (United States)

    Pollock, K.H.; Bunck, C.M.; Winterstein, S.R.; Chen, C.-L.; North, P.M.; Nichols, J.D.

    1995-01-01

    In recent years, survival analysis of radio-tagged animals has developed using methods based on the Kaplan-Meier method used in medical and engineering applications (Pollock et al., 1989a,b). An important assumption of this approach is that all tagged animals with a functioning radio can be relocated at each sampling time with probability 1. This assumption may not always be reasonable in practice. In this paper, we show how a general capture-recapture model can be derived which allows for some probability (less than one) for animals to be relocated. This model is not simply a Jolly-Seber model because it is possible to relocate both dead and live animals, unlike when traditional tagging is used. The model can also be viewed as a generalization of the Kaplan-Meier procedure, thus linking the Jolly-Seber and Kaplan-Meier approaches to survival estimation. We present maximum likelihood estimators and discuss testing between submodels. We also discuss model assumptions and their validity in practice. An example is presented based on canvasback data collected by G. M. Haramis of Patuxent Wildlife Research Center, Laurel, Maryland, USA.

  1. Estimating detection probability for Canada lynx Lynx canadensis using snow-track surveys in the northern Rocky Mountains, Montana, USA

    Science.gov (United States)

    John R. Squires; Lucretia E. Olson; David L. Turner; Nicholas J. DeCesare; Jay A. Kolbe

    2012-01-01

    We used snow-tracking surveys to determine the probability of detecting Canada lynx Lynx canadensis in known areas of lynx presence in the northern Rocky Mountains, Montana, USA during the winters of 2006 and 2007. We used this information to determine the minimum number of survey replicates necessary to infer the presence and absence of lynx in areas of similar lynx...

  2. A Generic Simulation Approach for the Fast and Accurate Estimation of the Outage Probability of Single Hop and Multihop FSO Links Subject to Generalized Pointing Errors

    KAUST Repository

    Ben Issaid, Chaouki

    2017-07-28

    When assessing the performance of free space optical (FSO) communication systems, the outage probability encountered is generally very small, and thereby the use of naive Monte Carlo simulations becomes prohibitively expensive. To estimate these rare event probabilities, we propose in this work an importance sampling approach which is based on the exponential twisting technique to offer fast and accurate results. In fact, we consider a variety of turbulence regimes, and we investigate the outage probability of FSO communication systems, under a generalized pointing error model based on the Beckmann distribution, for both single and multihop scenarios. Selected numerical simulations are presented to show the accuracy and the efficiency of our approach compared to naive Monte Carlo.
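
    A generic illustration of importance sampling with exponential twisting for a rare-event probability (not the paper's FSO channel model): estimating P(Z > 6) for a standard normal variable, where naive Monte Carlo would need on the order of 10^10 samples to observe even a few events.

```python
# Importance sampling via exponential tilting of a standard normal.
import numpy as np
from scipy import stats

a, theta, n = 6.0, 6.0, 100_000          # threshold, tilt parameter, sample size
rng = np.random.default_rng(0)

x = rng.normal(loc=theta, scale=1.0, size=n)   # draw from the tilted density N(theta, 1)
lr = np.exp(-theta * x + 0.5 * theta ** 2)     # likelihood ratio f(x) / f_theta(x)
estimate = np.mean((x > a) * lr)

print(f"importance-sampling estimate: {estimate:.3e}")
print(f"exact tail probability:       {stats.norm.sf(a):.3e}")
```

    Choosing the tilt so that the rare event becomes typical under the sampling density is the essence of exponential twisting; the likelihood ratio then removes the bias introduced by sampling from the tilted distribution.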

  3. Estimation of most probable power distribution in BWRs by least squares method using in-core measurements

    Energy Technology Data Exchange (ETDEWEB)

    Ezure, Hideo

    1988-09-01

    An effective combination of measured data with theoretical analysis has permitted deriving a method for more accurately estimating the power distribution in BWRs. A least squares method is used to combine the relationship between the power distribution and the measured values with the model used in FLARE or in the three-dimensional two-group diffusion code. Trial application of the new method to estimating the power distribution in JPDR-1 has shown that the method provides reliable results.

  4. Estimation of the age-specific per-contact probability of Ebola virus transmission in Liberia using agent-based simulations

    Science.gov (United States)

    Siettos, Constantinos I.; Anastassopoulou, Cleo; Russo, Lucia; Grigoras, Christos; Mylonakis, Eleftherios

    2016-06-01

    Based on multiscale agent-based computations we estimated the per-contact probability of transmission by age of the Ebola virus disease (EVD) that swept through Liberia from May 2014 to March 2015. For the approximation of the epidemic dynamics we have developed a detailed agent-based model with small-world interactions between individuals categorized by age. For the estimation of the structure of the evolving contact network as well as the per-contact transmission probabilities by age group we exploited the so called Equation-Free framework. Model parameters were fitted to official case counts reported by the World Health Organization (WHO) as well as to recently published data of key epidemiological variables, such as the mean time to death, recovery and the case fatality rate.

  5. What makes a message real? The effects of perceived realism of alcohol- and drug-related messages on personal probability estimation.

    Science.gov (United States)

    Cho, Hyunyi; Shen, Lijiang; Wilson, Kari M

    2013-03-01

    Perceived lack of realism in alcohol advertising messages promising positive outcomes and antialcohol and antidrug messages portraying negative outcomes of alcohol consumption has been a cause for public health concern. This study examined the effects of perceived realism dimensions on personal probability estimation through identification and message minimization. Data collected from college students in U.S. Midwest in 2010 (N = 315) were analyzed with multilevel structural equation modeling. Plausibility and narrative consistency mitigated message minimization, but they did not influence identification. Factuality and perceptual quality influenced both message minimization and identification, but their effects were smaller than those of typicality. Typicality was the strongest predictor of probability estimation. Implications of the results and suggestions for future research are provided.

  6. Should Coulomb stress change calculations be used to forecast aftershocks and to influence earthquake probability estimates? (Invited)

    Science.gov (United States)

    Parsons, T.

    2009-12-01

    After a large earthquake, our concern immediately moves to the likelihood that another large shock could be triggered, threatening an already weakened building stock. A key question is whether it is best to map out Coulomb stress change calculations shortly after mainshocks to potentially highlight the most likely aftershock locations, or whether it is more prudent to wait until the best information is available. It has been shown repeatedly that spatial aftershock patterns can be matched with Coulomb stress change calculations a year or more after mainshocks. However, with the onset of rapid source slip model determinations, the method has produced encouraging results like the M=8.7 earthquake that was forecast using stress change calculations from 2004 great Sumatra earthquake by McCloskey et al. [2005]. Here, I look back at two additional prospective calculations published shortly after the 2005 M=7.6 Kashmir and 2008 M=8.0 Wenchuan earthquakes. With the benefit of 1.5-4 years of additional seismicity, it is possible to assess the performance of rapid Coulomb stress change calculations. In the second part of the talk, within the context of the ongoing Working Group on California Earthquake Probabilities (WGCEP) assessments, uncertainties associated with time-dependent probability calculations are convolved with uncertainties inherent to Coulomb stress change calculations to assess the strength of signal necessary for a physics-based calculation to merit consideration into a formal earthquake forecast. Conclusions are as follows: (1) subsequent aftershock occurrence shows that prospective static stress change calculations both for Kashmir and Wenchuan examples failed to adequately predict the spatial post-mainshock earthquake distributions. (2) For a San Andreas fault example with relatively well-understood recurrence, a static stress change on the order of 30 to 40 times the annual stressing rate would be required to cause a significant (90%) perturbation to the

  7. Men who have sex with men in Great Britain: comparing methods and estimates from probability and convenience sample surveys.

    Science.gov (United States)

    Prah, Philip; Hickson, Ford; Bonell, Chris; McDaid, Lisa M; Johnson, Anne M; Wayal, Sonali; Clifton, Soazig; Sonnenberg, Pam; Nardone, Anthony; Erens, Bob; Copas, Andrew J; Riddell, Julie; Weatherburn, Peter; Mercer, Catherine H

    2016-09-01

    To examine sociodemographic and behavioural differences between men who have sex with men (MSM) participating in recent UK convenience surveys and a national probability sample survey. We compared 148 MSM aged 18-64 years interviewed for Britain's third National Survey of Sexual Attitudes and Lifestyles (Natsal-3) undertaken in 2010-2012, with men in the same age range participating in contemporaneous convenience surveys of MSM: 15 500 British resident men in the European MSM Internet Survey (EMIS); 797 in the London Gay Men's Sexual Health Survey; and 1234 in Scotland's Gay Men's Sexual Health Survey. Analyses compared men reporting at least one male sexual partner (past year) on similarly worded questions and multivariable analyses accounted for sociodemographic differences between the surveys. MSM in convenience surveys were younger and better educated than MSM in Natsal-3, and a larger proportion identified as gay (85%-95% vs 62%). Partner numbers were higher and same-sex anal sex more common in convenience surveys. Unprotected anal intercourse was more commonly reported in EMIS. Compared with Natsal-3, MSM in convenience surveys were more likely to report gonorrhoea diagnoses and HIV testing (both past year). Differences between the samples were reduced when restricting analysis to gay-identifying MSM. National probability surveys better reflect the population of MSM but are limited by their smaller samples of MSM. Convenience surveys recruit larger samples of MSM but tend to over-represent MSM identifying as gay and reporting more sexual risk behaviours. Because both sampling strategies have strengths and weaknesses, methods are needed to triangulate data from probability and convenience surveys.

  8. Analysis and Mitigation of Tropospheric Effects on Ka Band Satellite Signals and Estimation of Ergodic Capacity and Outage Probability for Terrestrial Links

    OpenAIRE

    Enserink, Scott Warren

    2012-01-01

    The first part of this work covers the effect of the troposphere on Ka band (20-30 GHz) satellite signals. The second part deals with the estimation of the capacity and outage probability for terrestrial links when constrained to quadrature amplitude modulations. The desire for higher data rates and the need for available bandwidth has pushed satellite communications into the Ka band (20-30 GHz). At these higher carrier frequencies the effects of scintillation and rain attenuation are increased. In...

  9. Comparisons of estimates of annual exceedance-probability discharges for small drainage basins in Iowa, based on data through water year 2013

    Science.gov (United States)

    Eash, David A.

    2015-01-01

    Traditionally, the Iowa Department of Transportation has used the Iowa Runoff Chart and single-variable regional-regression equations (RREs) from a U.S. Geological Survey report (published in 1987) as the primary methods to estimate annual exceedance-probability discharge (AEPD) for small (20 square miles or less) drainage basins in Iowa. With the publication of new multi- and single-variable RREs by the U.S. Geological Survey (published in 2013), the Iowa Department of Transportation needs to determine which methods of AEPD estimation provide the best accuracy and the least bias for small drainage basins in Iowa.

  10. Efficiency of using correlation function for estimation of probability of substance detection on the base of THz spectral dynamics

    Science.gov (United States)

    Trofimov, Vyacheslav A.; Peskov, Nikolay V.; Kirillov, Dmitry A.

    2012-10-01

    One of the problems arising in time-domain THz spectroscopy for security applications is developing criteria for assessing the probability of detection and identification of explosives and drugs. We analyze the efficiency of using the correlation function and another functional (more precisely, a spectral norm) for this purpose. These criteria are applied to the dynamics of spectral lines. To increase the reliability of the assessment, we subtract the averaged value of the THz signal over the analysis interval, which removes the constant component from that part of the signal and thereby increases the contrast of the assessment. For obtaining the spectral line dynamics, we compare the application of the Fourier-Gabor transform with an unbounded (for example, Gaussian) window sliding along the signal with the application of the Fourier transform in a short time interval (FTST), in which the Fourier transform is applied to parts of the signal. The two methods are close to each other; nevertheless, they differ in the series of frequencies they use. It is important for practice that the optimal window shape depends on the method chosen for obtaining the spectral dynamics. The detection probability is enhanced if we can find a train of pulses with different frequencies that follow sequentially. We show that it is possible to obtain clean spectral line dynamics even when the spectrum of the substance's response to the THz pulse is distorted.

  11. Estimation of the probability of radiation failures and single particle upsets of integrated microcircuits onboard the Fobos-Grunt spacecraft

    NARCIS (Netherlands)

    Kuznetsov, NV; Popov, VD; Khamidullina, NM

    2005-01-01

    When designing the radio-electronic equipment for long-term operation in a space environment, one of the most important problems is a correct estimation of the radiation stability of its electric and radio components (ERC) against radiation-stimulated dose failures and single-particle effects (upsets). The

  12. Estimation of Partial Safety Factors and Target Failure Probability Based on Cost Optimization of Rubble Mound Breakwaters

    DEFF Research Database (Denmark)

    Kim, Seung-Woo; Suh, Kyung-Duck; Burcharth, Hans F.

    2010-01-01

    Breakwaters are designed on the basis of cost optimization because human risk is seldom a consideration. Most breakwaters, however, were constructed without considering cost optimization. In this study, the optimum return period, target failure probability and the partial safety factors...... were evaluated by applying cost optimization to the rubble mound breakwaters in Korea. The applied method was developed by Hans F. Burcharth and John D. Sorensen in relation to the PIANC Working Group 47. The optimum return period was determined as 50 years in many cases and was found to be 100 years...... of the national design standard, and then the overall safety factor is calculated as 1.09. It is required that the nominal diameter and weight of armor be respectively 9% and 30% larger than those of the existing design method. Moreover, partial safety factors considering cost optimization were compared...

  13. Estimation of Extreme Responses and Failure Probability of Wind Turbines under Normal Operation by Controlled Monte Carlo Simulation

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri

    order statistical moments. The results obtained by extrapolation of the extreme values to the stipulated design period of the wind turbine depend strongly on the relevance of these adopted extreme value distributions. The problem is that this relevance cannot be decided from the data obtained...... The solution of the Fokker-Planck-Kolmogorov (FPK) equation for systems governed by a stochastic differential equation driven by Gaussian white noise will give the sought time variation of the probability density function. However, the analytical solution of the FPK is available for only a few dynamic systems...... and the numerical solution is difficult for dynamic problems of more than 2-3 degrees of freedom. This confines the applicability of the FPK to a very narrow range of problems. On the other hand, the recently introduced Generalized Density Evolution Method (GDEM) has opened a new way toward realization...

  14. Estimated probabilities, volumes, and inundation-area depths of potential postwildfire debris flows from Carbonate, Slate, Raspberry, and Milton Creeks, near Marble, Gunnison County, Colorado

    Science.gov (United States)

    Stevens, Michael R.; Flynn, Jennifer L.; Stephens, Verlin C.; Verdin, Kristine L.

    2011-01-01

    During 2009, the U.S. Geological Survey, in cooperation with Gunnison County, initiated a study to estimate the potential for postwildfire debris flows to occur in the drainage basins occupied by Carbonate, Slate, Raspberry, and Milton Creeks near Marble, Colorado. Currently (2010), these drainage basins are unburned but could be burned by a future wildfire. Empirical models derived from statistical evaluation of data collected from recently burned basins throughout the intermountain western United States were used to estimate the probability of postwildfire debris-flow occurrence and debris-flow volumes for drainage basins occupied by Carbonate, Slate, Raspberry, and Milton Creeks near Marble. Data for the postwildfire debris-flow models included drainage basin area; area burned and burn severity; percentage of burned area; soil properties; rainfall total and intensity for the 5- and 25-year-recurrence, 1-hour-duration-rainfall; and topographic and soil property characteristics of the drainage basins occupied by the four creeks. A quasi-two-dimensional floodplain computer model (FLO-2D) was used to estimate the spatial distribution and the maximum instantaneous depth of the postwildfire debris-flow material during debris flow on the existing debris-flow fans that issue from the outlets of the four major drainage basins. The postwildfire debris-flow probabilities at the outlet of each drainage basin range from 1 to 19 percent for the 5-year-recurrence, 1-hour-duration rainfall, and from 3 to 35 percent for 25-year-recurrence, 1-hour-duration rainfall. The largest probabilities for postwildfire debris flow are estimated for Raspberry Creek (19 and 35 percent), whereas estimated debris-flow probabilities for the three other creeks range from 1 to 6 percent. The estimated postwildfire debris-flow volumes at the outlet of each creek range from 7,500 to 101,000 cubic meters for the 5-year-recurrence, 1-hour-duration rainfall, and from 9,400 to 126,000 cubic meters for

  15. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability provided the same data used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  16. Evaluation of test-strategies for estimating probability of low prevalence of paratuberculosis in Danish dairy herds

    DEFF Research Database (Denmark)

    Sergeant, E.S.G.; Nielsen, Søren S.; Toft, Nils

    2008-01-01

    . Using this model, five herd-testing strategies were evaluated: (1) milk-ELISA on all lactating cows; (2) milk-ELISA on lactating cows 4 years old; (4) faecal culture on all lactating cows; and (5) milk-ELISA plus faecal culture in series on all lactating cows. The five testing strategies were evaluated...... using observed milk-ELISA results from 19 Danish dairy herds as well as for simulated results from the same herds assuming that they were uninfected. Whole-herd milk-ELISA was the preferred strategy, and considered the most cost-effective strategy of the five alternatives. The five strategies were all...... efficient in detecting infection, i.e. estimating a low Pr-Low in infected herds, however, Pr-Low estimates for milk-ELISA on age-cohorts were too low in simulated uninfected herds and the strategies involving faecal culture were too expensive to be of practical interest. For simulated uninfected herds...

  17. Evaluating probability forecasts

    CERN Document Server

    Lai, Tze Leung; Shen, David Bo; 10.1214/11-AOS902

    2012-01-01

    Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability forecasts. This approach uses loss functions relating the predicted to the actual probabilities of the events and applies martingale theory to exploit the temporal structure between the forecast and the subsequent occurrence or nonoccurrence of the event.

  18. On the Correct Estimate of the Probability of False Detection of the Matched Filter in Weak-Signal Detection Problems

    CERN Document Server

    Vio, Roberto

    2016-01-01

    The detection reliability of weak signals is a critical issue in many astronomical contexts and may have severe consequences for determining number counts and luminosity functions, but also for optimising the use of telescope time in follow-up observations. Because of its optimal properties, one of the most popular and widely used detection techniques is the matched filter (MF). This is a linear filter designed to maximise the detectability of a signal of known structure that is buried in additive Gaussian random noise. In this work we show that in the very common situation where the number and position of the searched signals within a data sequence (e.g. an emission line in a spectrum) or an image (e.g. a point-source in an interferometric map) are unknown, this technique, when applied in its standard form, may severely underestimate the probability of false detection. This is because the correct use of the MF relies upon a priori knowledge of the position of the signal of interest. In the absence of this inf...

  19. A Method to Estimate the Probability that Any Individual Cloud-to-Ground Lightning Stroke was Within Any Radius of Any Point

    Science.gov (United States)

    Huddleston, Lisa; Roeder, WIlliam P.; Merceret, Francis J.

    2011-01-01

    A new technique has been developed to estimate the probability that a nearby cloud-to-ground lightning stroke was within a specified radius of any point of interest. This process uses the bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to determine the probability that the stroke is inside any specified radius of any location, even if that location is not centered on or even within the location error ellipse. This technique is adapted from a method of calculating the probability of debris collision with spacecraft. Such a technique is important in spaceport processing activities because it allows engineers to quantify the risk of induced current damage to critical electronics due to nearby lightning strokes. This technique was tested extensively and is now in use by space launch organizations at Kennedy Space Center and Cape Canaveral Air Force station. Future applications could include forensic meteorology.
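    To make the kind of calculation described above concrete, the sketch below estimates by Monte Carlo the probability that a stroke, located with a bivariate Gaussian error ellipse, fell within a given radius of a point of interest. It is only an illustration of the underlying integral, not the operational technique referenced in this record, and every numerical value (ellipse centre, covariance, radius) is a hypothetical placeholder.

```python
# Monte Carlo sketch: P(stroke within `radius` of `point`) for a stroke whose
# location is described by a bivariate Gaussian error ellipse.
import numpy as np

def prob_within_radius(mu, cov, point, radius, n=1_000_000, seed=0):
    rng = np.random.default_rng(seed)
    samples = rng.multivariate_normal(mu, cov, size=n)         # candidate stroke locations (km)
    dist = np.linalg.norm(samples - np.asarray(point), axis=1)
    return (dist <= radius).mean()

# Hypothetical ellipse centred 1 km east of the point of interest, 0.5 km standard errors
p = prob_within_radius(mu=[1.0, 0.0],
                       cov=[[0.25, 0.05], [0.05, 0.25]],
                       point=[0.0, 0.0],
                       radius=1.5)
print(f"Estimated probability: {p:.3f}")
```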

  20. Estimation of the Probability of Radiation Failures and Single Particle Upsets of Integrated Microcircuits onboard the Fobos-Grunt Spacecraft

    Science.gov (United States)

    Kuznetsov, N. V.; Popov, V. D.; Khamidullina, N. M.

    2005-05-01

    When designing the radio-electronic equipment for long-term operation in a space environment, one of the most important problems is a correct estimation of the radiation stability of its electric and radio components (ERC) against radiation-stimulated dose failures and single-particle effects (upsets). These problems are solved in this paper for the integrated microcircuits (IMC) of various types that are to be installed onboard the Fobos-Grunt spacecraft designed at the Federal State Unitary Enterprise “Lavochkin Research and Production Association.” The launching of this spacecraft is planned for 2009.

  1. Improving Ranking Using Quantum Probability

    CERN Document Server

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of false alarm and the same parameter estimation data. As quantum probability has provided more effective detectors than classical probability in domains other than data management, we conjecture that a system that can implement subspace-based detectors will be more effective than a system that implements set-based detectors, the effectiveness being calculated as expected recall estimated over the probability of detection and expected fallout estimated over the probability of false alarm.

  2. A Review of Mycotoxins in Food and Feed Products in Portugal and Estimation of Probable Daily Intakes.

    Science.gov (United States)

    Abrunhosa, Luís; Morales, Héctor; Soares, Célia; Calado, Thalita; Vila-Chã, Ana Sofia; Pereira, Martinha; Venâncio, Armando

    2016-01-01

    Mycotoxins are toxic secondary metabolites produced by filamentous fungi that occur naturally in agricultural commodities worldwide. Aflatoxins, ochratoxin A, patulin, fumonisins, zearalenone, trichothecenes, and ergot alkaloids are presently the most important for food and feed safety. These compounds are produced by several species that belong to the Aspergillus, Penicillium, Fusarium, and Claviceps genera and can be carcinogenic, mutagenic, teratogenic, cytotoxic, neurotoxic, nephrotoxic, estrogenic, and immunosuppressant. Human and animal exposure to mycotoxins is generally assessed by taking into account data on the occurrence of mycotoxins in food and feed as well as data on the consumption patterns of the concerned population. This evaluation is crucial to support measures to reduce consumer exposure to mycotoxins. This work reviews the occurrence and levels of mycotoxins in Portuguese food and feed to provide a global overview of this issue in Portugal. With the information collected, the exposure of the Portuguese population to those mycotoxins is assessed, and the estimated dietary intakes are presented.

  3. Common Cause Case Study: An Estimated Probability of Four Solid Rocket Booster Hold-Down Post Stud Hang-ups

    Science.gov (United States)

    Cross, Robert

    2005-01-01

    Until Solid Rocket Motor ignition, the Space Shuttle is mated to the Mobile Launch Platform in part via eight (8) Solid Rocket Booster (SRB) hold-down bolts. The bolts are fractured using redundant pyrotechnics, and are designed to drop through a hold-down post on the Mobile Launch Platform before the Space Shuttle begins movement. The Space Shuttle program has experienced numerous failures where a bolt has hung up. That is, it did not clear the hold-down post before liftoff and was caught by the SRBs. This places an additional structural load on the vehicle that was not included in the original certification requirements. The Space Shuttle is currently being certified to withstand the loads induced by up to three (3) of the eight (8) SRB hold-down studs experiencing a "hang-up". The results of loads analyses performed for four (4) stud hang-ups indicate that the internal vehicle loads exceed current structural certification limits at several locations. To determine the risk to the vehicle from four (4) stud hang-ups, the likelihood of the scenario occurring must first be evaluated. Prior to the analysis discussed in this paper, the likelihood of occurrence had been estimated assuming that the stud hang-ups were completely independent events. That is, it was assumed that no common causes or factors existed between the individual stud hang-up events. A review of the data associated with the hang-up events showed that a common factor (timing skew) was present. This paper summarizes a revised likelihood evaluation performed for the four (4) stud hang-ups case considering that there are common factors associated with the stud hang-ups. The results show that explicitly (i.e. not using standard common cause methodologies such as beta factor or Multiple Greek Letter modeling) taking into account the common factor of timing skew results in an increase in the estimated likelihood of four (4) stud hang-ups of an order of magnitude over the independent failure case.

  4. Low-order Probability-weighted Moments Method for Wind Speed Probability Distribution Parameter Estimation

    Institute of Scientific and Technical Information of China (English)

    潘晓春

    2012-01-01

    For offshore wind energy resource assessment and utilization, it is necessary to describe the statistical properties of wind speed with a three-parameter Weibull distribution. According to the functional relation between the parameters and the probability-weighted moments (PWM), the relation between the shape parameter and the PWMs was fitted with a logistic curve. Two parameter-estimation formulae were derived, based on low-order non-exceedance and exceedance PWMs. Accuracy tests show that these formulae have high precision over a wide range. Through a comparative analysis with the high-order PWM method on an example, the author concludes that the low-order PWM methods in this paper are worth popularizing.
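    As a small illustration of the quantities this record works with, the sketch below computes unbiased sample probability-weighted moments b_r = E[X F(X)^r] from an ordered sample; the mapping from low-order PWMs to the three Weibull parameters is the contribution of the cited paper and is not reproduced here. The wind-speed values are hypothetical.

```python
# Unbiased sample probability-weighted moments b_r from an ordered sample.
import numpy as np
from math import comb

def sample_pwm(x, r):
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    # weight for the j-th order statistic: C(j-1, r) / C(n-1, r), zero when j-1 < r
    w = np.array([comb(j - 1, r) for j in range(1, n + 1)]) / comb(n - 1, r)
    return np.mean(w * x)

wind = np.array([4.2, 5.1, 6.3, 3.8, 7.0, 5.5, 4.9, 6.1])   # hypothetical wind speeds (m/s)
b0, b1, b2 = (sample_pwm(wind, r) for r in range(3))
print(b0, b1, b2)
```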

  5. On the consideration of scaling properties of extreme rainfall in Madrid (Spain) for developing a generalized intensity-duration-frequency equation and assessing probable maximum precipitation estimates

    Science.gov (United States)

    Casas-Castillo, M. Carmen; Rodríguez-Solà, Raúl; Navarro, Xavier; Russo, Beniamino; Lastra, Antonio; González, Paula; Redaño, Angel

    2016-11-01

    The fractal behavior of extreme rainfall intensities registered between 1940 and 2012 by the Retiro Observatory of Madrid (Spain) has been examined, and a simple scaling regime ranging from 25 min to 3 days of duration has been identified. Thus, an intensity-duration-frequency (IDF) master equation of the location has been constructed in terms of the simple scaling formulation. The scaling behavior of probable maximum precipitation (PMP) for durations between 5 min and 24 h has also been verified. For the statistical estimation of the PMP, an envelope curve of the frequency factor (km) based on a total of 10,194 station-years of annual maximum rainfall from 258 stations in Spain has been developed. This curve could be useful to estimate suitable values of PMP at any point of the Iberian Peninsula from basic statistical parameters (mean and standard deviation) of its rainfall series.
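    The simple-scaling idea behind such an IDF master equation can be sketched as follows: if annual-maximum intensity scales as I_d ≈ (d/D)^(-η) I_D, the exponent η can be estimated from a log-log fit of mean annual-maximum intensity against duration and used to transfer quantiles between durations. The code below is a generic illustration with hypothetical numbers, not the Retiro data or the equation fitted in the study.

```python
# Estimate a simple-scaling exponent from mean annual-max intensities (hypothetical data)
# and rescale an intensity quantile from one duration to another.
import numpy as np

durations_min = np.array([25, 60, 180, 720, 1440])         # minutes
mean_intensity = np.array([55.0, 32.0, 15.0, 5.5, 3.2])     # mm/h, hypothetical
eta = -np.polyfit(np.log(durations_min), np.log(mean_intensity), 1)[0]

def rescale_quantile(i_ref, d_ref, d_new, eta=eta):
    """Intensity quantile at duration d_new from the quantile at duration d_ref."""
    return i_ref * (d_new / d_ref) ** (-eta)

print(f"eta = {eta:.2f}; I(6 h) from I(1 h)=60 mm/h: {rescale_quantile(60.0, 60, 360):.1f} mm/h")
```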

  6. Multi-scale occupancy approach to estimate Toxoplasma gondii prevalence and detection probability in tissues: an application and guide for field sampling.

    Science.gov (United States)

    Elmore, Stacey A; Huyvaert, Kathryn P; Bailey, Larissa L; Iqbal, Asma; Su, Chunlei; Dixon, Brent R; Alisauskas, Ray T; Gajadhar, Alvin A; Jenkins, Emily J

    2016-08-01

    Increasingly, birds are recognised as important hosts for the ubiquitous parasite Toxoplasma gondii, although little experimental evidence exists to determine which tissues should be tested to maximise the detection probability of T. gondii. Also, Arctic-nesting geese are suspected to be important sources of T. gondii in terrestrial Arctic ecosystems, but the parasite has not previously been reported in the tissues of these geese. Using a domestic goose model, we applied a multi-scale occupancy framework to demonstrate that the probability of detection of T. gondii was highest in the brain (0.689, 95% confidence interval=0.486, 0.839) and the heart (0.809, 95% confidence interval=0.693, 0.888). Inoculated geese had an estimated T. gondii infection probability of 0.849, (95% confidence interval=0.643, 0.946), highlighting uncertainty in the system, even under experimental conditions. Guided by these results, we tested the brains and hearts of wild Ross's Geese (Chen rossii, n=50) and Lesser Snow Geese (Chen caerulescens, n=50) from Karrak Lake, Nunavut, Canada. We detected 51 suspected positive tissue samples from 33 wild geese using real-time PCR with melt-curve analysis. The wild goose prevalence estimates generated by our multi-scale occupancy analysis were higher than the naïve estimates of prevalence, indicating that multiple PCR repetitions on the same organs and testing more than one organ could improve T. gondii detection. Genetic characterisation revealed Type III T. gondii alleles in six wild geese and Sarcocystis spp. in 25 samples. Our study demonstrates that Arctic nesting geese are capable of harbouring T. gondii in their tissues and could transport the parasite from their southern overwintering grounds into the Arctic region. We demonstrate how a multi-scale occupancy framework can be used in a domestic animal model to guide resource-limited sample collection and tissue analysis in wildlife. Secondly, we confirm the value of traditional occupancy in

  7. Imprecise Estimation for Conditional Outage Probabilities of Power Components

    Institute of Scientific and Technical Information of China (English)

    刁浩然; 杨明; 韩学山; 马世英; 刘道伟; 王剑辉

    2016-01-01

    By using limited outage samples, it is difficult to obtain the operational reliability indexes of power components accurately. Therefore, estimating the fluctuation ranges of the indexes can provide a more objective decision-making basis for power system operational risk control. In this paper, a novel approach was proposed to estimate the imprecise conditional outage probabilities of power components based on the credal network (CN), which can evaluate the interval ranges of conditional probabilities. Based on the historical outage statistics and the component operational conditions at the target period, a credal network, which uses the imprecise Dirichlet model (IDM) to obtain the imprecise probabilistic dependencies, was set up to estimate the imprecise conditional outage probabilities. The imprecise outage probabilities of components can then be obtained by using the reasoning algorithm of the credal network. The proposed approach can reflect the variation of the conditional outage probabilities with respect to the operational conditions of the power components, and opens a new way for the reliability assessment of power components with limited outage observations. Results on estimating the imprecise conditional outage probabilities of LGJ-300 transmission lines located in Shandong province illustrate the effectiveness of the proposed approach.

  8. Estimating reach-specific fish movement probabilities in rivers with a Bayesian state-space model: application to sea lamprey passage and capture at dams

    Science.gov (United States)

    Holbrook, Christopher M.; Johnson, Nicholas S.; Steibel, Juan P.; Twohey, Michael B.; Binder, Thomas R.; Krueger, Charles C.; Jones, Michael L.

    2014-01-01

    Improved methods are needed to evaluate barriers and traps for control and assessment of invasive sea lamprey (Petromyzon marinus) in the Great Lakes. A Bayesian state-space model provided reach-specific probabilities of movement, including trap capture and dam passage, for 148 acoustic tagged invasive sea lamprey in the lower Cheboygan River, Michigan, a tributary to Lake Huron. Reach-specific movement probabilities were combined to obtain estimates of spatial distribution and abundance needed to evaluate a barrier and trap complex for sea lamprey control and assessment. Of an estimated 21 828 – 29 300 adult sea lampreys in the river, 0%–2%, or 0–514 untagged lampreys, could have passed upstream of the dam, and 46%–61% were caught in the trap. Although no tagged lampreys passed above the dam (0/148), our sample size was not sufficient to consider the lock and dam a complete barrier to sea lamprey. Results also showed that existing traps are in good locations because 83%–96% of the population was vulnerable to existing traps. However, only 52%–69% of lampreys vulnerable to traps were caught, suggesting that traps can be improved. The approach used in this study was a novel use of Bayesian state-space models that may have broader applications, including evaluation of barriers for other invasive species (e.g., Asian carp (Hypophthalmichthys spp.)) and fish passage structures for other diadromous fishes.

  9. A model to estimate the probability of human immunodeficiency virus and hepatitis C infection despite negative nucleic acid testing among increased-risk organ donors.

    Science.gov (United States)

    Annambhotla, Pallavi D; Gurbaxani, Brian M; Kuehnert, Matthew J; Basavaraju, Sridhar V

    2017-04-01

    In 2013, guidelines were released for reducing the risk of viral bloodborne pathogen transmission through organ transplantation. Eleven criteria were described that result in a donor being designated at increased infectious risk. Human immunodeficiency virus (HIV) and hepatitis C virus (HCV) transmission risk from an increased-risk donor (IRD), despite negative nucleic acid testing (NAT), likely varies based on behavior type and timing. We developed a Monte Carlo risk model to quantify probability of HIV among IRDs. The model included NAT performance, viral load dynamics, and per-act risk of acquiring HIV by each behavior. The model also quantifies the probability of HCV among IRDs by non-medical intravenous drug use (IVDU). Highest risk is among donors with history of unprotected, receptive anal male-to-male intercourse with partner of unknown HIV status (MSM), followed by sex with an HIV-infected partner, IVDU, and sex with a commercial sex worker. With NAT screening, the estimated risk of undetected HIV remains small even at 1 day following a risk behavior. The estimated risk for HCV transmission through IVDU is likewise small and decreases quicker with time owing to the faster viral growth dynamics of HCV compared with HIV. These findings may allow for improved organ allocation, utilization, and recipient informed consent. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
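    The structure of such a Monte Carlo model can be sketched very roughly as follows: draw whether the donor was infected by the risk act, draw how far viral replication has progressed by the time of donation, and count the draws in which an infection is present but still below the NAT detection limit. Every parameter value below (per-act risk, eclipse period, doubling time, detection limit) is a placeholder for illustration only, not a value from the cited model.

```python
# Rough Monte Carlo sketch of P(infected but NAT-negative) for a single recent risk act.
import numpy as np

rng = np.random.default_rng(1)
N = 1_000_000
per_act_risk = 0.01                         # placeholder per-act acquisition probability
days_since_exposure = 7.0                   # days between risk behaviour and donation
eclipse_days = rng.normal(5.0, 1.0, N)      # placeholder eclipse phase (no detectable virus)
doubling_time = 0.85                        # placeholder viral doubling time (days)
nat_limit_copies = 20.0                     # placeholder NAT detection limit (copies/mL)

infected = rng.random(N) < per_act_risk
growth_days = np.maximum(days_since_exposure - eclipse_days, 0.0)
viral_load = 2.0 ** (growth_days / doubling_time)           # copies/mL, starting from ~1
undetected = infected & (viral_load < nat_limit_copies)
print("P(infected and NAT-negative):", undetected.mean())
```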

  10. Using probability-based spatial estimation of the river pollution index to assess urban water recreational quality in the Tamsui River watershed.

    Science.gov (United States)

    Jang, Cheng-Shin

    2016-01-01

    The Tamsui River watershed situated in Northern Taiwan provides a variety of water recreational opportunities such as riverbank park activities, fishing, cruising, rowing, sailing, and swimming. However, river water quality strongly affects water recreational quality. Moreover, the health of recreationists who are partially or fully exposed to polluted river water may be jeopardized. A river pollution index (RPI) composed of dissolved oxygen, biochemical oxygen demand, suspended solids, and ammonia nitrogen is typically used to gauge the river water quality and regulate the water body use in Taiwan. The purpose of this study was to probabilistically determine the RPI categories in the Tamsui River watershed and to assess the urban water recreational quality on the basis of the estimated RPI categories. First, according to various RPI categories, one-dimensional indicator kriging (IK) was adopted to estimate the occurrence probabilities of the RPI categories. The maximum occurrence probability among the categories was then employed to determine the most suitable RPI category. Finally, the most serious categories and seasonal variations of RPI were adopted to evaluate the quality of current water recreational opportunities in the Tamsui River watershed. The results revealed that the midstream and downstream sections of the Tamsui River and its tributaries with poor river water quality afford low water recreational quality, and water recreationists should avoid full or limited exposure to these bodies of water. However, the upstream sections of the Tamsui River watershed with high river water quality are suitable for all water recreational activities.

  11. A technique for estimating the probability of radiation-stimulated failures of integrated microcircuits in low-intensity radiation fields: Application to the Spektr-R spacecraft

    Science.gov (United States)

    Popov, V. D.; Khamidullina, N. M.

    2006-10-01

    In developing radio-electronic devices (RED) of spacecraft operating in the fields of ionizing radiation in space, one of the most important problems is the correct estimation of their radiation tolerance. The “weakest link” in the element base of onboard microelectronic devices under radiation effect is the integrated microcircuits (IMC), especially of large scale (LSI) and very large scale (VLSI) degree of integration. The main characteristic of IMC, which is taken into account when making decisions on using some particular type of IMC in the onboard RED, is the probability of non-failure operation (NFO) at the end of the spacecraft’s lifetime. It should be noted that, until now, the NFO has been calculated only from the reliability characteristics, disregarding the radiation effect. This paper presents the so-called “reliability” approach to determination of radiation tolerance of IMC, which allows one to estimate the probability of non-failure operation of various types of IMC with due account of radiation-stimulated dose failures. The described technique is applied to RED onboard the Spektr-R spacecraft to be launched in 2007.

  12. Probability estimation for semantic association on domain ontology

    Institute of Scientific and Technical Information of China (English)

    田萱; 李冬梅

    2011-01-01

    A probability model based on Bayesian principles is given to measure the semantic association from a concept to its directly related concepts in a domain ontology. The model is defined over different semantic relationships and is estimated by maximum likelihood estimation, with semantic distance used to estimate the semantic relationships during parameter estimation. On this basis, a method to measure the semantic association of any two concepts in the ontology is given. Experimental results of semantic retrieval on open data show that the estimated semantic association between concepts is effective and, when applied to semantic query expansion, clearly improves retrieval performance.

  13. Estimation of reproduction number and probable vector density of the first autochthonous dengue outbreak in Japan in the last 70 years.

    Science.gov (United States)

    Furuya, Hiroyuki

    2015-11-01

    The first autochthonous case of dengue fever in Japan since 1945 was reported on August 27, 2014. Infection was transmitted by Aedes albopictus mosquitoes in Tokyo's Yoyogi Park. A total of 65 cases with no history of overseas travel and who may have been infected around the park were reported as of September 5, 2014. To quantify infection risk of the local epidemic, the reproduction number and vector density per person at the onset of the epidemic were estimated. The estimated probability distribution and the number of female mosquitoes per person (MPP) were determined from the data of the initial epidemic. The estimated distribution of the initial reproduction number R0 was fitted to a Gamma distribution using location parameter 4.25, scale parameter 0.19, and shape parameter 7.76, with median 7.78 and IQR 7.21-8.40. The MPP was fitted to a normal distribution with mean 5.71 and standard deviation 0.53. Both the estimated reproduction number and vector density per person at the onset of the epidemic were higher than previously reported values. These results indicate the potential for dengue outbreaks in places with elevated vector density per person, even in dengue non-endemic countries. To investigate the cause of this outbreak, further studies will be needed, including assessments of social, behavioral, and environmental factors that may have contributed to this epidemic by altering host and vector conditions in the park.

  14. Estimated probability of becoming a case of drug dependence in relation to duration of drug-taking experience: a functional analysis approach.

    Science.gov (United States)

    Vsevolozhskaya, Olga A; Anthony, James C

    2016-06-29

    Measured as elapsed time from first use to dependence syndrome onset, the estimated "induction interval" for cocaine is thought to be short relative to the cannabis interval, but little is known about risk of becoming dependent during first months after onset of use. Virtually all published estimates for this facet of drug dependence epidemiology are from life histories elicited years after first use. To improve estimation, we turn to new month-wise data from nationally representative samples of newly incident drug users identified via probability sampling and confidential computer-assisted self-interviews for the United States National Surveys on Drug Use and Health, 2004-2013. Standardized modules assessed first and most recent use, and dependence syndromes, for each drug subtype. A four-parameter Hill function depicts the drug dependence transition for subgroups defined by units of elapsed time from first to most recent use, with an expectation of greater cocaine dependence transitions for cocaine versus cannabis. This study's novel estimates for cocaine users one month after first use show 2-4% with cocaine dependence; 12-17% are dependent when use has persisted. Corresponding cannabis estimates are 0-1% after one month, but 10-23% when use persists. Duration or persistence of cannabis smoking beyond an initial interval of a few months of use seems to be a signal of noteworthy risk for, or co-occurrence of, rapid-onset cannabis dependence, not too distant from cocaine estimates, when we sort newly incident users into subgroups defined by elapsed time from first to most recent use. Copyright © 2016 John Wiley & Sons, Ltd.
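    For readers unfamiliar with the functional form mentioned above, a four-parameter Hill function maps elapsed time since first use to the proportion meeting dependence criteria as P(t) = bottom + (top - bottom) * t^h / (t50^h + t^h). The sketch below evaluates such a curve with purely illustrative parameter values, not the estimates from the survey data.

```python
# Four-parameter Hill curve: proportion dependent as a function of months since first use.
import numpy as np

def hill(t, bottom, top, t50, slope):
    t = np.asarray(t, dtype=float)
    return bottom + (top - bottom) * t**slope / (t50**slope + t**slope)

months = np.arange(1, 13)
print(hill(months, bottom=0.02, top=0.15, t50=4.0, slope=2.0).round(3))
```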

  15. Estimation of Corrosion Probability for Steel Tank Floor-plates

    Institute of Scientific and Technical Information of China (English)

    杨廷鸿; 何超; 吴松林; 王春林

    2013-01-01

    This paper focuses on floor-plate corrosion, which is the main corrosion issue for steel oil tanks. First, it analyses the characteristics of engineering detection data on oil tank floor-plate corrosion and the statistical theory for analysing corrosion test data, and finds that the information in engineering detection data on floor-plate corrosion is incomplete for corrosion probability estimation. Based on this analysis, a corrosion probability estimation model is established through an estimate of the slightly corroded area, with prediction of the maximum corrosion depth as the goal. Finally, the correctness and feasibility of the model and the maximum corrosion depth of oil tank floor-plates are tested against approximately 900 engineering detection records from 27 tanks in Guangzhou and other areas. The maximum relative error is less than 45%, and about 80% of the relative errors are less than 30%.

  16. Estimation of flood discharges at selected annual exceedance probabilities for unregulated, rural streams in Vermont, with a section on Vermont regional skew regression

    Science.gov (United States)

    Olson, Scott A.; with a section by Veilleux, Andrea G.

    2014-01-01

    This report provides estimates of flood discharges at selected annual exceedance probabilities (AEPs) for streamgages in and adjacent to Vermont and equations for estimating flood discharges at AEPs of 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent (recurrence intervals of 2-, 5-, 10-, 25-, 50-, 100-, 200-, and 500-years, respectively) for ungaged, unregulated, rural streams in Vermont. The equations were developed using generalized least-squares regression. Flood-frequency and drainage-basin characteristics from 145 streamgages were used in developing the equations. The drainage-basin characteristics used as explanatory variables in the regression equations include drainage area, percentage of wetland area, and the basin-wide mean of the average annual precipitation. The average standard errors of prediction for estimating the flood discharges at the 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent AEP with these equations are 34.9, 36.0, 38.7, 42.4, 44.9, 47.3, 50.7, and 55.1 percent, respectively. Flood discharges at selected AEPs for streamgages were computed by using the Expected Moments Algorithm. To improve estimates of the flood discharges for given exceedance probabilities at streamgages in Vermont, a new generalized skew coefficient was developed. The new generalized skew for the region is a constant, 0.44. The mean square error of the generalized skew coefficient is 0.078. This report describes a technique for using results from the regression equations to adjust an AEP discharge computed from a streamgage record. This report also describes a technique for using a drainage-area adjustment to estimate flood discharge at a selected AEP for an ungaged site upstream or downstream from a streamgage. The final regression equations and the flood-discharge frequency data used in this study will be available in StreamStats. StreamStats is a World Wide Web application providing automated regression-equation solutions for user-selected sites on streams.
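    The drainage-area adjustment mentioned at the end of the abstract generally takes the form Q_u = Q_g (A_u / A_g)^b, transferring an AEP discharge from a streamgage to an ungaged site on the same stream. The sketch below shows that generic form only; the exponent and any applicability limits come from the report itself, and the value 0.7 used here is a placeholder.

```python
# Generic drainage-area adjustment of an AEP discharge (exponent is a placeholder).
def area_adjusted_discharge(q_gage, a_gage, a_ungaged, b=0.7):
    """Q_ungaged = Q_gage * (A_ungaged / A_gage) ** b."""
    return q_gage * (a_ungaged / a_gage) ** b

print(area_adjusted_discharge(q_gage=1250.0, a_gage=42.0, a_ungaged=55.0))
```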

  17. On ARMA Probability Density Estimation.

    Science.gov (United States)

    1981-12-01

    definitions of the constants b_k (k = 0, 1, ..., q) and a_k (k = 1, ..., p) will be given which, for a given function f(·), uniquely define an approximator f_{p,q}(·) for each... satisfied. When using f_{p,q}(·) for approximation purposes it is thus important to always verify whether or not this condition is met. In concluding

  18. Estimation of the Mean under Non-equal-probability Classified Cluster Sampling and Its Properties

    Institute of Scientific and Technical Information of China (English)

    孙道德

    2003-01-01

    Based on the multinomial distribution and its properties, the paper analyses the method of non-equal-probability classified cluster sample surveys and the unbiased estimation of the mean, and further derives the mean squared deviation of this estimator and an unbiased estimator of it.

  19. Estimating the probability of occurrence of earthquakes (M>6) in the Western part of the Corinth rift using fault-based and classical seismotectonic approaches.

    Science.gov (United States)

    Boiselet, Aurelien; Scotti, Oona; Lyon-Caen, Hélène

    2014-05-01

    -SISCOR Working Group. On the basis of this consensual logic tree, median probabilities of occurrence of M>=6 events were computed for the region of study. Time-dependent models (Brownian Passage Time and Weibull probability distributions) were also explored. The probability of an M>=6.0 event is found to be greater in the western region compared to the eastern part of the Corinth rift, whether a fault-based or a classical seismotectonic approach is used. Percentile probability estimates are also provided to represent the range of uncertainties in the results. The percentile results show that, in general, probability estimates following the classical approach (based on the definition of seismotectonic source zones) cover the median values estimated following the fault-based approach. On the contrary, the fault-based approach in this region is still affected by a high degree of uncertainty, because of the poor constraints on the 3D geometries of the faults and the high uncertainties in their slip rates.

  20. Combining information from surveys of several species to estimate the probability of freedom from Echinococcus multilocularis in Sweden, Finland and mainland Norway

    Directory of Open Access Journals (Sweden)

    Hjertqvist Marika

    2011-02-01

    Background: The fox tapeworm Echinococcus multilocularis has foxes and other canids as definitive hosts and rodents as intermediate hosts. However, most mammals can be accidental intermediate hosts, and the larval stage may cause serious disease in humans. The parasite has never been detected in Sweden, Finland and mainland Norway. All three countries currently require an anthelminthic treatment for dogs and cats prior to entry in order to prevent introduction of the parasite. Documentation of freedom from E. multilocularis is necessary for justification of the present import requirements. Methods: The probability that Sweden, Finland and mainland Norway were free from E. multilocularis and the sensitivity of the surveillance systems were estimated using scenario trees. Surveillance data from five animal species were included in the study: red fox (Vulpes vulpes), raccoon dog (Nyctereutes procyonoides), domestic pig, wild boar (Sus scrofa) and voles and lemmings (Arvicolinae). Results: The cumulative probability of freedom from E. multilocularis in December 2009 was high in all three countries: 0.98 (95% CI 0.96-0.99) in Finland, 0.99 (0.97-0.995) in Sweden and 0.98 (0.95-0.99) in Norway. Conclusions: Results from the model confirm that there is a high probability that in 2009 the countries were free from E. multilocularis. The sensitivity analyses showed that the choice of the design prevalences in different infected populations was influential. Therefore more knowledge on expected prevalences for E. multilocularis in infected populations of different species is desirable to reduce residual uncertainty of the results.
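    Scenario-tree analyses of this kind ultimately feed into a simple Bayesian update: each period of negative surveillance with system sensitivity SSe (the probability of detecting infection at the design prevalence, if present) raises the probability of freedom, and the risk of re-introduction discounts it. The sketch below shows that generic update with hypothetical sensitivities; it is not the model or the data of the cited study.

```python
# Generic probability-of-freedom update after a period of negative surveillance.
def update_prob_freedom(prior_free, sse, p_intro=0.0):
    posterior = prior_free / (prior_free + (1.0 - prior_free) * (1.0 - sse))
    return posterior * (1.0 - p_intro)          # discount for possible re-introduction

p = 0.5                                         # non-informative prior
for year_sse in [0.6, 0.7, 0.65, 0.7, 0.75]:    # hypothetical yearly surveillance sensitivities
    p = update_prob_freedom(p, year_sse, p_intro=0.01)
print(f"P(free) after 5 years of negative surveillance: {p:.3f}")
```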

  1. Quantum probability

    CERN Document Server

    Gudder, Stanley P

    2014-01-01

    Quantum probability is a subtle blend of quantum mechanics and classical probability theory. Its important ideas can be traced to the pioneering work of Richard Feynman in his path integral formalism.Only recently have the concept and ideas of quantum probability been presented in a rigorous axiomatic framework, and this book provides a coherent and comprehensive exposition of this approach. It gives a unified treatment of operational statistics, generalized measure theory and the path integral formalism that can only be found in scattered research articles.The first two chapters survey the ne

  2. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities......, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially...

  3. A barrier options model for estimating firms' probabilities of financial distress

    Directory of Open Access Journals (Sweden)

    Gastón S. Milanesi

    2016-11-01

    probabilities of financial distress. Exotic barrier options offer an alternative approach for predicting financial distress, and their structure fits the firm value-volatility relationship better. The paper proposes a "naive" barrier option model, because it simplifies the estimation of the unobservable variables, such as the firm's asset value and risk. First, simple call and barrier option models are developed in order to value the firm's capital and estimate the probability of financial distress. Using a hypothetical case, a sensitivity exercise over the period and the volatility is carried out. A similar exercise is applied to estimate the capital value and financial distress probability of two Argentinian firms with different degrees of leverage, confirming the consistency of the relationship between volatility, value and financial distress probability in the proposed model. Finally, the main conclusions are presented.
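    To illustrate the general mechanics behind barrier-option approaches of this kind (not the paper's "naive" model), the sketch below uses the classic first-passage result under a geometric Brownian motion for the firm's asset value: the probability that the value touches a distress barrier B before a horizon T.

```python
# First-passage (distress) probability for an asset value V following a GBM with drift mu
# and volatility sigma: P(V hits `barrier` at some time in [0, T]).
from math import log, sqrt, exp
from statistics import NormalDist

def prob_hit_barrier(v0, barrier, mu, sigma, T):
    Phi = NormalDist().cdf
    nu = mu - 0.5 * sigma**2
    b = log(barrier / v0)                   # negative when v0 > barrier
    s = sigma * sqrt(T)
    return Phi((b - nu * T) / s) + exp(2.0 * nu * b / sigma**2) * Phi((b + nu * T) / s)

# Illustrative inputs only
print(prob_hit_barrier(v0=100.0, barrier=60.0, mu=0.05, sigma=0.35, T=2.0))
```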

  4. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  5. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  6. Revised estimates of the risk of fetal loss following a prenatal diagnosis of trisomy 13 or trisomy 18.

    Science.gov (United States)

    Cavadino, Alana; Morris, Joan K

    2017-04-01

    Edwards syndrome (trisomy 18) and Patau syndrome (trisomy 13) both have high natural fetal loss rates. The aim of this study was to provide estimates of these fetal loss rates by single gestational week of age using data from the National Down Syndrome Cytogenetic Register. Data from all pregnancies with Edwards or Patau syndrome that were prenatally detected in England and Wales from 2004 to 2014 was analyzed using Kaplan-Meier survival estimates. Pregnancies were entered into the analysis at the time of gestation at diagnosis, and were considered "under observation" until the gestation at outcome. There were 4088 prenatal diagnoses of trisomy 18 and 1471 of trisomy 13 in the analysis. For trisomy 18, 30% (95%CI: 25-34%) of viable fetuses at 12 weeks will result in a live birth and at 39 weeks gestation 67% (60-73%) will result in a live birth. For trisomy 13 the survival is 50% (41-58%) at 12 weeks and 84% (73-90%) at 39 weeks. There was no significant difference in survival between males and females when diagnosed at 12 weeks for trisomy 18 (P-value = 0.27) or trisomy 13 (P-value = 0.47). This paper provides the most precise gestational age-specific estimates currently available for the risk of fetal loss in trisomy 13 and trisomy 18 pregnancies in a general population. © 2017 Wiley Periodicals, Inc.
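    The survival setup described here (entry into the risk set at the gestation of diagnosis, exit at the gestation of outcome) corresponds to a Kaplan-Meier estimate with delayed entry. The sketch below is a minimal, hand-rolled version of that estimator on hypothetical data, not the study's analysis.

```python
# Kaplan-Meier survival with delayed entry (left truncation).
import numpy as np

def km_left_truncated(entry, exit_time, event):
    entry, exit_time, event = map(np.asarray, (entry, exit_time, event))
    times = np.sort(np.unique(exit_time[event == 1]))
    surv, s = [], 1.0
    for t in times:
        at_risk = np.sum((entry < t) & (exit_time >= t))    # already entered, not yet exited
        deaths = np.sum((exit_time == t) & (event == 1))
        if at_risk > 0:
            s *= 1.0 - deaths / at_risk
        surv.append(s)
    return times, np.array(surv)

# entry = gestational week at diagnosis; exit = week of loss (event=1) or livebirth (event=0)
t, s = km_left_truncated(entry=[12, 13, 15, 12, 20],
                         exit_time=[18, 39, 22, 30, 39],
                         event=[1, 0, 1, 1, 0])
print(dict(zip(t.tolist(), s.round(3))))
```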

  7. A Unique Equation to Estimate Flash Points of Selected Pure Liquids: Application to the Correction of Probably Erroneous Flash Point Values

    Science.gov (United States)

    Catoire, Laurent; Naudet, Valérie

    2004-12-01

    A simple empirical equation is presented for the estimation of closed-cup flash points for pure organic liquids. Data needed for the estimation of a flash point (FP) are the normal boiling point (Teb), the standard enthalpy of vaporization at 298.15 K [ΔvapH°(298.15 K)] of the compound, and the number of carbon atoms (n) in the molecule. The bounds for this equation are: -100⩽FP(°C)⩽+200; 250⩽Teb(K)⩽650; 20⩽ΔvapH°(298.15 K)/(kJ mol-1)⩽110; 1⩽n⩽21. Compared to other methods (empirical equations, structural group contribution methods, and neural network quantitative structure-property relationships), this simple equation is shown to predict accurately the flash points for a variety of compounds, whatever their chemical groups (monofunctional compounds and polyfunctional compounds) and whatever their structure (linear, branched, cyclic). The same equation is shown to be valid for hydrocarbons, organic nitrogen compounds, organic oxygen compounds, organic sulfur compounds, organic halogen compounds, and organic silicone compounds. It seems that the flash points of organic deuterium compounds, organic tin compounds, organic nickel compounds, organic phosphorus compounds, organic boron compounds, and organic germanium compounds can also be predicted accurately by this equation. A mean absolute deviation of about 3 °C, a standard deviation of about 2 °C, and a maximum absolute deviation of 10 °C are obtained when predictions are compared to experimental data for more than 600 compounds. For all these compounds, the absolute deviation is equal to or lower than the reproducibility expected at a 95% confidence level for closed-cup flash point measurement. This estimation technique has its limitations concerning polyhalogenated compounds, for which the equation should be used with caution. The mean absolute deviation and maximum absolute deviation observed, and the fact that the equation provides unbiased predictions, lead to the conclusion that

  8. Lexicographic Probability, Conditional Probability, and Nonstandard Probability

    Science.gov (United States)

    2009-11-11

    the following conditions: CP1. µ(U | U) = 1 if U ∈ F′. CP2. µ(V1 ∪ V2 | U) = µ(V1 | U) + µ(V2 | U) if V1 ∩ V2 = ∅, U ∈ F′, and V1, V2 ∈ F. CP3. µ(V | U) = µ(V | X) × µ(X | U) if V ⊆ X ⊆ U, U, X ∈ F′, V ∈ F. Note that it follows from CP1 and CP2 that µ(· | U) is a probability measure on (W, F) (and, in... CP2 hold. This is easily seen to determine µ. Moreover, µ vacuously satisfies CP3, since there do not exist distinct sets U and X in F′ such that U

  9. Sensor Fusion Based on an Integrated Neural Network and Probability Density Function (PDF) Dual Kalman Filter for On-Line Estimation of Vehicle Parameters and States.

    Science.gov (United States)

    Vargas-Melendez, Leandro; Boada, Beatriz L; Boada, Maria Jesus L; Gauchia, Antonio; Diaz, Vicente

    2017-04-29

    Vehicles with a high center of gravity (COG), such as light trucks and heavy vehicles, are prone to rollover. This kind of accident causes nearly 33% of all deaths from passenger vehicle crashes. Nowadays, these vehicles are incorporating roll stability control (RSC) systems to improve their safety. Most RSC systems require the vehicle roll angle as a known input variable to predict the lateral load transfer. The vehicle roll angle can be directly measured by a dual-antenna global positioning system (GPS), but this is expensive. For this reason, it is important to estimate the vehicle roll angle from sensors already installed onboard current vehicles. On the other hand, knowledge of the vehicle's parameter values is essential to obtain an accurate vehicle response. Some vehicle parameters cannot be easily obtained, and they can vary over time. In this paper, an algorithm for the simultaneous on-line estimation of the vehicle's roll angle and parameters is proposed. This algorithm uses a probability density function (PDF)-based truncation method in combination with a dual Kalman filter (DKF) to guarantee that both the vehicle's states and parameters are within bounds that have a physical meaning, using the information obtained from sensors mounted on vehicles. Experimental results show the effectiveness of the proposed algorithm.
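    The PDF-truncation step mentioned above can be illustrated in one dimension: after a Kalman update, a scalar estimate with a Gaussian posterior is constrained to a physically meaningful interval by replacing it with the mean and variance of the corresponding truncated normal. This is a generic sketch of that idea, not the estimator of the cited paper, and the numbers are illustrative.

```python
# Mean and variance of a Gaussian N(mean, var) truncated to [a, b].
from math import sqrt, exp, pi
from statistics import NormalDist

def truncate_gaussian(mean, var, a, b):
    nd, sd = NormalDist(), sqrt(var)
    alpha, beta = (a - mean) / sd, (b - mean) / sd
    phi = lambda x: exp(-0.5 * x * x) / sqrt(2.0 * pi)      # standard normal pdf
    Z = nd.cdf(beta) - nd.cdf(alpha)
    m = mean + sd * (phi(alpha) - phi(beta)) / Z
    v = var * (1.0 + (alpha * phi(alpha) - beta * phi(beta)) / Z
               - ((phi(alpha) - phi(beta)) / Z) ** 2)
    return m, v

# e.g. a roll-angle estimate of 9 deg with variance 16 deg^2, constrained to [-8, 8] deg
print(truncate_gaussian(9.0, 16.0, -8.0, 8.0))
```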

  10. Estimation method for random sonic fatigue life of thin-walled structure of a combustor liner based on stress probability distribution

    Institute of Scientific and Technical Information of China (English)

    SHA Yun-dong; GUO Xiao-peng; LIAO Lian-fang; XIE Li-juan

    2011-01-01

    For the sonic fatigue problem of an aero-engine combustor liner structure under random acoustic loading, an effective method for predicting the fatigue life of a structure under random loading was studied. Firstly, the probability distribution of the von Mises stress of a thin-walled structure under random loading was studied; the analysis suggests that the probability density function of the von Mises stress process approximately follows a two-parameter Weibull distribution, and formulae for calculating the Weibull parameters are given. Based on the Miner linear damage theory, a method for predicting the random sonic fatigue life based on the stress probability density was developed, and a model for fatigue life prediction was constructed. As an example, an aero-engine combustor liner structure was considered. The power spectral density (PSD) of the vibrational stress response was calculated using a coupled FEM/BEM (finite element method/boundary element method) model, and the fatigue life was estimated using the constructed model. Considering the influence of the wide frequency band, the calculated results were then corrected. Comparative analysis shows that the sonic fatigue estimates for the combustor liner structure obtained using the Weibull distribution of the von Mises stress are somewhat more conservative than those obtained using the Dirlik distribution. The results show that the methods presented in this paper are practical for the random fatigue life analysis of aeronautical thin-walled structures.
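    A stripped-down version of the Miner-rule calculation described above, assuming a two-parameter Weibull distribution for the stress amplitude and a Basquin S-N curve N(s) = C * s^(-m), is sketched below. The closed form uses E[s^m] = scale^m * Gamma(1 + m/shape); all numerical values are placeholders, not properties of the combustor liner studied.

```python
# Expected fatigue life under Miner's rule with Weibull-distributed stress amplitudes
# and a Basquin S-N curve N(s) = C * s**(-m).
from math import gamma

def fatigue_life_seconds(rate_hz, weib_scale, weib_shape, sn_C, sn_m):
    damage_per_cycle = weib_scale**sn_m * gamma(1.0 + sn_m / weib_shape) / sn_C
    return 1.0 / (rate_hz * damage_per_cycle)

print(fatigue_life_seconds(rate_hz=150.0, weib_scale=40.0, weib_shape=1.8,
                           sn_C=1.0e12, sn_m=3.0))
```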

  11. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... of insurance companies facing losses due to natural disasters, banks seeking protection against huge losses, failures in expensive and sophisticated systems or loss of valuable information in electronic systems. The main difficulty when dealing with this kind of problem is the unavailability of a closed...

  12. Probability theory

    CERN Document Server

    Varadhan, S. R. S.

    2001-01-01

    This volume presents topics in probability theory covered during a first-year graduate course given at the Courant Institute of Mathematical Sciences. The necessary background material in measure theory is developed, including the standard topics, such as extension theorem, construction of measures, integration, product spaces, Radon-Nikodym theorem, and conditional expectation. In the first part of the book, characteristic functions are introduced, followed by the study of weak convergence of probability distributions. Then both the weak and strong limit theorems for sums of independent rando

  13. Comparison of Parameter Estimation Methods for Transformer Weibull Lifetime Modelling

    Institute of Scientific and Technical Information of China (English)

    ZHOU Dan; LI Chengrong; WANG Zhongdong

    2013-01-01

    The two-parameter Weibull distribution is the most widely adopted lifetime model for power transformers. An appropriate parameter estimation method is essential to guarantee the accuracy of the derived Weibull lifetime model. Six popular parameter estimation methods are reviewed and compared in order to find the one best suited to transformer Weibull lifetime modelling: the maximum likelihood estimation method, two median rank regression methods (one regressing X on Y and the other regressing Y on X), the Kaplan-Meier method, the method based on the cumulative hazard plot, and Li's method. The comparison considered several scenarios: 10,000 sets of lifetime data, each with a sample size of 40 to 1,000 and a censoring rate of 90%, were obtained by Monte Carlo simulation for each scenario. The scale and shape parameters of the Weibull distribution estimated by the six methods, together with their mean value, median value and 90% confidence band, were obtained. Cross-comparison of these results reveals that, among the six methods, the maximum likelihood method is the best one, since it provides the most accurate Weibull parameters, i.e. parameters having the smallest bias in both mean and median values as well as the shortest 90% confidence band. The maximum likelihood method is therefore recommended over the other methods for transformer Weibull lifetime modelling.
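
    The maximum likelihood approach recommended above can be sketched in a few lines for right-censored two-parameter Weibull data; the snippet below fits synthetic, heavily censored lifetimes by minimizing the censored negative log-likelihood. All data and starting values are illustrative, not from the study.

```python
import numpy as np
from scipy.optimize import minimize

# Maximum likelihood fit of a two-parameter Weibull model to right-censored
# lifetimes (the method the study above found most accurate).  Synthetic data.

def neg_log_lik(params, t, observed):
    shape, scale = np.exp(params)            # optimize on log scale for positivity
    z = t / scale
    # log f(t) for failures, log S(t) for censored units
    log_f = np.log(shape / scale) + (shape - 1) * np.log(z) - z ** shape
    log_S = -z ** shape
    return -(observed * log_f + (1 - observed) * log_S).sum()

rng = np.random.default_rng(1)
n, true_shape, true_scale = 200, 1.8, 40.0
lifetimes = true_scale * rng.weibull(true_shape, n)
censor_times = rng.uniform(5, 60, n)         # right-censoring times
t = np.minimum(lifetimes, censor_times)
observed = (lifetimes <= censor_times).astype(float)

res = minimize(neg_log_lik, x0=np.log([1.0, t.mean()]), args=(t, observed))
shape_hat, scale_hat = np.exp(res.x)
print(f"shape = {shape_hat:.2f} (true {true_shape}), "
      f"scale = {scale_hat:.1f} (true {true_scale})")
```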

  14. Elaboration of a clinical and paraclinical score to estimate the probability of herpes simplex virus encephalitis in patients with febrile, acute neurologic impairment.

    Science.gov (United States)

    Gennai, S; Rallo, A; Keil, D; Seigneurin, A; Germi, R; Epaulard, O

    2016-06-01

    Herpes simplex virus (HSV) encephalitis is associated with a high risk of mortality and sequelae, and early diagnosis and treatment in the emergency department are necessary. However, most patients present with non-specific febrile, acute neurologic impairment; this may lead clinicians to overlook the diagnosis of HSV encephalitis. We aimed to identify which data collected in the first hours in a medical setting were associated with the diagnosis of HSV encephalitis. We conducted a multicenter retrospective case-control study in four French public hospitals from 2007 to 2013. The cases were the adult patients who received a confirmed diagnosis of HSV encephalitis. The controls were all the patients who attended the emergency department of Grenoble hospital with a febrile acute neurologic impairment, without HSV detection by polymerase chain reaction (PCR) in the cerebrospinal fluid (CSF), in 2012 and 2013. A multivariable logistic model was elaborated to estimate factors significantly associated with HSV encephalitis. Finally, an HSV probability score was derived from the logistic model. We identified 36 cases and 103 controls. Factors independently associated with HSV encephalitis were the absence of past neurological history (odds ratio [OR] 6.25 [95% CI: 2.22-16.7]), the occurrence of seizure (OR 8.09 [95% CI: 2.73-23.94]), a systolic blood pressure ≥140 mmHg (OR 5.11 [95% CI: 1.77-14.77]), and a C-reactive protein criterion. The probability score was calculated by summing the values attributed to each independent factor. The diagnosis of HSV encephalitis may benefit from the use of this score, which is based on easily accessible data. However, clinical suspicion and probabilistic treatment must remain the rule.

  15. Estimation of the Failure Probability of Brittle Fracture in Cracked Structures

    Institute of Scientific and Technical Information of China (English)

    薛红军; 吕国志

    2001-01-01

    The stochastic finite element method is used to study the failure probability of structures in brittle fracture. With the aid of a crack-tip element that can describe the singular strain field near the crack tip, the stochastic finite element method is extended to probabilistic fracture mechanics, which simplifies the differentiation of the stress intensity factors with respect to the random variables. The statistical moments of the stress intensity factors are computed for the uncertainties of the various random variables, the reliability index is determined by an optimization procedure, and the failure probability of brittle fracture is obtained with a first-order reliability method. A mode I fracture example demonstrates the effectiveness of the computational model and method, which makes the approach useful for estimating the safety and the probability of fracture of a flawed structure.

  16. Estimating Survival Rates in Gastric Cancer Based on Pathologic and Demographic Factors in Fars Cancer Registry (2001-2005)

    Directory of Open Access Journals (Sweden)

    Rajaeifard Abdolreza

    2009-03-01

    Full Text Available Background: Gastric cancer remains one of the leading causes of death worldwide. In patients with gastric cancer, the survival rate after diagnosis is relatively low. The present study aimed to evaluate the impact of demographic factors on the estimated survival of patients with gastric cancer in order to provide updated evidence on these patients. Materials and Methods: All gastric cancer patients registered in the Fars cancer registry from 2001 to 2006 were entered in the study. The vital status of the patients was ascertained by telephone contact. Survival rates were estimated using the Kaplan-Meier method and compared with the log-rank test. All calculations were performed using STATA (v.8) software, with p values below 0.05 considered significant. Conclusion: Our results showed that the survival rates of gastric cancer patients in our study were relatively low. Late diagnosis and delayed therapy are important reasons for the low survival of these patients. Therefore, improving public education about the primary symptoms of gastric cancer through the media is recommended.

  17. People's conditional probability judgments follow probability theory (plus noise).

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2016-09-01

    A common view in current psychology is that people estimate probabilities using various 'heuristics' or rules of thumb that do not follow the normative rules of probability theory. We present a model where people estimate conditional probabilities such as P(A|B) (the probability of A given that B has occurred) by a process that follows standard frequentist probability theory but is subject to random noise. This model accounts for various results from previous studies of conditional probability judgment. This model predicts that people's conditional probability judgments will agree with a series of fundamental identities in probability theory whose form cancels the effect of noise, while deviating from probability theory in other expressions whose form does not allow such cancellation. Two experiments strongly confirm these predictions, with people's estimates on average agreeing with probability theory for the noise-cancelling identities, but deviating from probability theory (in just the way predicted by the model) for other identities. This new model subsumes an earlier model of unconditional or 'direct' probability judgment which explains a number of systematic biases seen in direct probability judgment (Costello & Watts, 2014). This model may thus provide a fully general account of the mechanisms by which people estimate probabilities.
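
    A toy simulation of the noise-cancellation argument (for the unconditional judgments covered by the earlier model the abstract mentions) is sketched below: each judgment is a sample proportion in which every remembered outcome is misread with probability d, so single judgments are biased while the identity P(A) + P(B) = P(A and B) + P(A or B) holds on average. The parameters and the assumption of independently drawn judgments are simplifications, not the authors' exact model.

```python
import numpy as np

# Toy simulation of the "probability theory plus noise" idea, using the
# unconditional identity P(A) + P(B) = P(A and B) + P(A or B).  Each judgment
# is formed from n memory samples, each misread with probability d; the
# misreading bias cancels in the identity but not in a single raw judgment.
# Parameters (n, d, event probabilities) are illustrative.

rng = np.random.default_rng(2)
n_samples, d, n_participants = 100, 0.15, 5000
p_a, p_b = 0.30, 0.50          # A and B independent here, so P(A and B) = 0.15

def noisy_estimate(p_true):
    outcomes = rng.random(n_samples) < p_true          # true event occurrences
    flips = rng.random(n_samples) < d                  # random misreadings
    return np.mean(outcomes ^ flips)                   # proportion judged "true"

lhs, rhs, direct = [], [], []
for _ in range(n_participants):
    pa, pb = noisy_estimate(p_a), noisy_estimate(p_b)
    pab = noisy_estimate(p_a * p_b)
    paob = noisy_estimate(p_a + p_b - p_a * p_b)
    lhs.append(pa + pb)
    rhs.append(pab + paob)
    direct.append(pa)

print("identity:  mean LHS - RHS =", round(np.mean(lhs) - np.mean(rhs), 4))
print("raw judgment bias for P(A):", round(np.mean(direct) - p_a, 4))
```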

  18. APJE-SLIM Based Method for Marine Human Error Probability Estimation

    Institute of Scientific and Technical Information of China (English)

    席永涛; 陈伟炯; 夏少生; 张晓东

    2011-01-01

    Safety is the eternal theme of the shipping industry. Research shows that human error is the main cause of maritime accidents. In order to study marine human errors, the performance shaping factors (PSF) are discussed and the human error probability (HEP) is estimated under the influence of the PSF. Based on a detailed investigation of human errors in collision avoidance, the most safety-critical task in navigation, and of the associated PSF, the reliability of mariners during collision avoidance is analysed using a combination of APJE and SLIM. The results show that PSF such as fatigue and health status, knowledge, experience and training, task complexity, and safety management and organizational effectiveness have varying degrees of influence on the HEP; if the level of these PSF is improved, the HEP can be greatly decreased. Using APJE to determine the absolute human error probabilities of the extreme points solves the problem that reference-point probabilities are hard to obtain in the SLIM method, and yields marine HEPs under different types and levels of PSF influence.

  19. First estimates of the probability of survival in a small-bodied, high-elevation frog (Boreal Chorus Frog, Pseudacris maculata), or how historical data can be useful

    Science.gov (United States)

    Muths, Erin L.; Scherer, R. D.; Amburgey, S. M.; Matthews, T.; Spencer, A. W.; Corn, P.S.

    2016-01-01

    In an era of shrinking budgets yet increasing demands for conservation, the value of existing (i.e., historical) data is elevated. Lengthy time series on common, or previously common, species are particularly valuable and may be available only through the use of historical information. We provide first estimates of the probability of survival and longevity (0.67–0.79 and 5–7 years, respectively) for a subalpine population of a small-bodied, ostensibly common amphibian, the Boreal Chorus Frog (Pseudacris maculata (Agassiz, 1850)), using historical data and contemporary, hypothesis-driven information–theoretic analyses. We also test a priori hypotheses about the effects of color morph (as suggested by early reports) and of drought (as suggested by recent climate predictions) on survival. Using robust mark–recapture models, we find some support for early hypotheses regarding the effect of color on survival, but we find no effect of drought. The congruence between early findings and our analyses highlights the usefulness of historical information in providing raw data for contemporary analyses and context for conservation and management decisions.

  20. Application of a maximum entropy method to estimate the probability density function of nonlinear or chaotic behavior in structural health monitoring data

    Science.gov (United States)

    Livingston, Richard A.; Jin, Shuang

    2005-05-01

    Bridges and other civil structures can exhibit nonlinear and/or chaotic behavior under ambient traffic or wind loadings. The probability density function (pdf) of the observed structural responses thus plays an important role for long-term structural health monitoring, LRFR and fatigue life analysis. However, the actual pdf of such structural response data often has a very complicated shape due to its fractal nature. Various conventional methods to approximate it can often lead to biased estimates. This paper presents recent research progress at the Turner-Fairbank Highway Research Center of the FHWA in applying a novel probabilistic scaling scheme for enhanced maximum entropy evaluation to find the most unbiased pdf. The maximum entropy method is applied with a fractal interpolation formulation based on contraction mappings through an iterated function system (IFS). Based on a fractal dimension determined from the entire response data set by an algorithm involving the information dimension, a characteristic uncertainty parameter, called the probabilistic scaling factor, can be introduced. This allows significantly enhanced maximum entropy evaluation through the added inferences about the fine scale fluctuations in the response data. Case studies using the dynamic response data sets collected from a real world bridge (Commodore Barry Bridge, PA) and from the simulation of a classical nonlinear chaotic system (the Lorenz system) are presented in this paper. The results illustrate the advantages of the probabilistic scaling method over conventional approaches for finding the unbiased pdf especially in the critical tail region that contains the larger structural responses.

  1. Variation of normal tissue complication probability (NTCP) estimates of radiation-induced hypothyroidism in relation to changes in delineation of the thyroid gland

    DEFF Research Database (Denmark)

    Rønjom, Marianne Feen; Brink, Carsten; Laugaard Lorenzen, Ebbe

    2015-01-01

    Background. To examine the variations of risk-estimates of radiation-induced hypothyroidism (HT) from our previously developed normal tissue complication probability (NTCP) model in patients with head and neck squamous cell carcinoma (HNSCC) in relation to variability of delineation of the thyroid...... gland. Patients and methods. In a previous study for development of an NTCP model for HT, the thyroid gland was delineated in 246 treatment plans of patients with HNSCC. Fifty of these plans were randomly chosen for re-delineation for a study of the intra- and inter-observer variability of thyroid......-observer variability resulted in a mean difference in thyroid volume and Dmean of 0.4 cm(3) (SD ± 1.6) and -0.5 Gy (SD ± 1.0), respectively, and 0.3 cm(3) (SD ± 1.8) and 0.0 Gy (SD ± 1.3) for inter-observer variability. The corresponding mean differences of NTCP values for radiation-induced HT due to intra- and inter...

  2. Exploiting an ensemble of regional climate models to provide robust estimates of projected changes in monthly temperature and precipitation probability distribution functions

    Energy Technology Data Exchange (ETDEWEB)

    Tapiador, Francisco J.; Sanchez, Enrique; Romera, Raquel (Inst. of Environmental Sciences, Univ. of Castilla-La Mancha (UCLM), 45071 Toledo (Spain)). e-mail: francisco.tapiador@uclm.es

    2009-07-01

    Regional climate models (RCMs) are dynamical downscaling tools aimed at improving the modelling of local physical processes. Ensembles of RCMs are widely used to improve the coarse-grain estimates of global climate models (GCMs), since the use of several RCMs helps to mitigate uncertainties arising from different dynamical cores and numerical schemes. In this paper, we analyse the differences and similarities in the climate change response for an ensemble of heterogeneous RCMs forced by one GCM (HadAM3H) and one emissions scenario (IPCC's SRES-A2 scenario). In contrast to previous approaches using the PRUDENCE database, the statistical description of climate characteristics is made through the spatial and temporal aggregation of the RCM outputs into probability distribution functions (PDF) of monthly values. This procedure is a complementary approach to conventional seasonal analyses. Our results provide new, stronger evidence of expected marked regional differences in Europe in the A2 scenario in terms of precipitation and temperature changes. While we found an overall increase in the mean temperature and extreme values, we also found mixed regional differences for precipitation

  3. Probability Estimates of Solar Particle Event Doses During a Period of Low Sunspot Number for Thinly-Shielded Spacecraft and Short Duration Missions

    Science.gov (United States)

    Atwell, William; Tylka, Allan J.; Dietrich, William; Rojdev, Kristina; Matzkind, Courtney

    2016-01-01

    In an earlier paper (Atwell, et al., 2015), we investigated solar particle event (SPE) radiation exposures (absorbed dose) to small, thinly-shielded spacecraft during a period when the sunspot number (SSN) was less than 30. These SPEs contain Ground Level Events (GLE), sub-GLEs, and sub-sub-GLEs (Tylka and Dietrich, 2009, Tylka and Dietrich, 2008, and Atwell, et al., 2008). GLEs are extremely energetic solar particle events having proton energies extending into the several GeV range and producing secondary particles in the atmosphere, mostly neutrons, observed with ground station neutron monitors. Sub-GLE events are less energetic, extending into the several hundred MeV range, but do not produce secondary atmospheric particles. Sub-sub GLEs are even less energetic with an observable increase in protons at energies greater than 30 MeV, but no observable proton flux above 300 MeV. In this paper, we consider those SPEs that occurred during 1973-2010 when the SSN was greater than 30 but less than 50. In addition, we provide probability estimates of absorbed dose based on mission duration with a 95% confidence level (CL). We also discuss the implications of these data and provide some recommendations that may be useful to spacecraft designers of these smaller spacecraft.

  4. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making.

  5. Actuarial and actual analysis of surgical results: empirical validation.

    Science.gov (United States)

    Grunkemeier, G L; Anderson, R P; Starr, A

    2001-06-01

    This report validates the use of the Kaplan-Meier (actuarial) method of computing survival curves by comparing 12-year estimates published in 1978 with current assessments. It also contrasts cumulative incidence curves, referred to as "actual" analysis in the cardiac-related literature, with Kaplan-Meier curves for thromboembolism, and demonstrates that the former estimate the percentage of events that will actually occur.
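
    The contrast between the two estimators can be reproduced with a few lines of code: the sketch below computes 1 minus the Kaplan-Meier estimate (competing events treated as censored) and the cumulative incidence ("actual") estimate on synthetic data with a competing risk, showing that the former exceeds the percentage of events that actually occur. The data are illustrative only, not the report's series.

```python
import numpy as np

# Contrast 1 - Kaplan-Meier with the cumulative incidence ("actual") estimate
# for an event of interest (e.g., thromboembolism) in the presence of a
# competing risk (death).  Synthetic data; event codes: 0 = censored,
# 1 = event of interest, 2 = competing event.

rng = np.random.default_rng(3)
n = 500
t_event = rng.exponential(10, n)      # time to event of interest
t_death = rng.exponential(8, n)       # time to competing event
t_cens = rng.uniform(0, 15, n)        # administrative censoring
time = np.minimum.reduce([t_event, t_death, t_cens])
code = np.select([t_event == time, t_death == time], [1, 2], default=0)

order = np.argsort(time)
time, code = time[order], code[order]
n_at_risk = np.arange(n, 0, -1)

# Kaplan-Meier treating competing events as censored
km = np.cumprod(1 - (code == 1) / n_at_risk)

# Cumulative incidence: overall survival just before t, times event hazard at t
surv_all = np.cumprod(1 - (code > 0) / n_at_risk)
surv_before = np.concatenate(([1.0], surv_all[:-1]))
cif = np.cumsum(surv_before * (code == 1) / n_at_risk)

horizon = time <= 12
print("1 - KM at 12 time units :", round(1 - km[horizon][-1], 3))
print("cumulative incidence    :", round(cif[horizon][-1], 3))
```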

  6. Impact of definitions of loss to follow-up (LTFU) in antiretroviral therapy program evaluation: variation in the definition can have an appreciable impact on estimated proportions of LTFU.

    Science.gov (United States)

    Grimsrud, Anna Thora; Cornell, Morna; Egger, Matthias; Boulle, Andrew; Myer, Landon

    2013-09-01

    To examine the impact of different definitions of loss to follow-up (LTFU) on estimates of program outcomes in cohort studies of patients on antiretroviral therapy (ART). We examined the impact of different definitions of LTFU using data from the International Epidemiological Databases to Evaluate AIDS-Southern Africa. The reference approach, Definition A, was compared with five alternative scenarios that differed in eligibility for analysis and the date assigned to the LTFU outcome. Kaplan-Meier estimates of LTFU were calculated up to 2 years after starting ART. Estimated cumulative LTFU were 14% and 22% at 12 and 24 months, respectively, using the reference approach. Differences in the proportion LTFU were reported in the alternative scenarios with 12-month estimates of LTFU varying by up to 39% compared with Definition A. Differences were largest when the date assigned to the LTFU outcome was 6 months after the date of last contact and when the site-specific definition of LTFU was used. Variation in the definitions of LTFU within cohort analyses can have an appreciable impact on estimated proportions of LTFU over 2 years of follow-up. Use of a standardized definition of LTFU is needed to accurately measure program effectiveness and comparability between programs. Copyright © 2013 Elsevier Inc. All rights reserved.

  7. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  8. Estimation of probability for the presence of claw and digital skin diseases by combining cow- and herd-level information using a Bayesian network

    DEFF Research Database (Denmark)

    Ettema, Jehan Frans; Østergaard, Søren; Kristensen, Anders Ringgaard

    2009-01-01

    Cross sectional data on the prevalence of claw and (inter) digital skin diseases on 4854 Holstein Friesian cows in 50 Danish dairy herds was used in a Bayesian network to create herd specific probability distributions for the presence of lameness causing diseases. Parity and lactation stage...... probabilities and random herd effects are used to formulate cow-level probability distributions of disease presence in a specific Danish dairy herd. By step-wise inclusion of information on cow- and herd-level risk factors, lameness prevalence and clinical diagnosis of diseases on cows in the herd, the Bayesian...... network systematically adjusts the probability distributions for disease presence in the specific herd. Information on population-, herd- and cow-level is combined and the uncertainty in inference on disease probability is quantified....

  9. Markov chains and semi-Markov models in time-to-event analysis.

    Science.gov (United States)

    Abner, Erin L; Charnigo, Richard J; Kryscio, Richard J

    2013-10-25

    A variety of statistical methods are available to investigators for analysis of time-to-event data, often referred to as survival analysis. Kaplan-Meier estimation and Cox proportional hazards regression are commonly employed tools but are not appropriate for all studies, particularly in the presence of competing risks and when multiple or recurrent outcomes are of interest. Markov chain models can accommodate censored data, competing risks (informative censoring), multiple outcomes, recurrent outcomes, frailty, and non-constant survival probabilities. Markov chain models, though often overlooked by investigators in time-to-event analysis, have long been used in clinical studies and have widespread application in other fields.
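
    A minimal example of the kind of quantity such models provide, beyond a single survival curve, is sketched below: a three-state illness-death chain whose state-occupancy probabilities are obtained by repeated multiplication with an assumed monthly transition matrix. The transition probabilities are invented for illustration.

```python
import numpy as np

# A minimal discrete-time illness-death Markov chain: states are
# 0 = healthy, 1 = ill, 2 = dead (absorbing).  The monthly transition
# probabilities below are assumed for illustration only.

P = np.array([
    [0.92, 0.05, 0.03],   # from healthy
    [0.00, 0.85, 0.15],   # from ill (no recovery in this toy model)
    [0.00, 0.00, 1.00],   # dead is absorbing
])

state0 = np.array([1.0, 0.0, 0.0])       # everyone starts healthy
occupancy = [state0]
for _ in range(24):                      # 24 monthly steps
    occupancy.append(occupancy[-1] @ P)
occupancy = np.array(occupancy)

# "Survival" here is the probability of not yet being in the absorbing state
survival = 1.0 - occupancy[:, 2]
print("P(alive) at 12 months:", round(survival[12], 3))
print("P(ill)   at 12 months:", round(occupancy[12, 1], 3))
```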

  10. Real-time sleep level estimation during naps based on conditional probability

    Institute of Scientific and Technical Information of China (English)

    王蓓; 张俊民; 张涛; 王行愚

    2015-01-01

    Objective: Based on the characteristics of the electroencephalogram (EEG), a real-time sleep level estimation method based on conditional probability is proposed, providing an objective measure of continuously changing sleep states for sleep monitoring. Methods: During daytime naps, four sleep-related EEG channels (C3-A2, C4-A1, O1-A2, O2-A1) were recorded synchronously. A Fourier transform was applied to every 5-second segment, and the energy ratios of the 8-13 Hz and 2-7 Hz EEG rhythms were computed as characteristic parameters. The method consists of a learning stage and a testing stage: in the learning stage, the probability density distributions of the EEG parameters are obtained from training data; in the testing stage, the conditional probabilities of the sleep stages are computed from the current features and combined into a sleep level estimate. Results: Nap data from 12 subjects were analysed and tested. Compared with manual sleep staging, the estimated sleep level showed a continuous change of sleep depth; the significance measure was 2.94 for wakefulness and 1.78 and 1.62 for sleep stages 1 and 2, respectively, consistent with expected patterns. Conclusion: The sleep level estimate defined in this paper captures the characteristics of sleep staging, reflects the continuous change of sleep stages during both stable and transitional periods, and provides an objective basis for real-time monitoring and analysis of daytime naps.

  11. Raster dataset showing the probability of elevated concentrations of nitrate in ground water in Colorado, hydrogeomorphic regions included and fertilizer use estimates not included.

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This dataset is one of eight datasets produced by this study. Four of the datasets predict the probability of detecting atrazine and(or) desethyl-atrazine (a...

  12. Raster dataset showing the probability of detecting atrazine/desethyl-atrazine in ground water in Colorado, hydrogeomorphic regions included and atrazine use estimates not included.

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This dataset is one of eight datasets produced by this study. Four of the datasets predict the probability of detecting atrazine and(or) desethyl-atrazine (a...

  13. Raster dataset showing the probability of detecting atrazine/desethyl-atrazine in ground water in Colorado, hydrogeomorphic regions not included and atrazine use estimates included.

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This dataset is one of eight datasets produced by this study. Four of the datasets predict the probability of detecting atrazine and(or) desethyl-atrazine (a...

  14. Raster dataset showing the probability of elevated concentrations of nitrate in ground water in Colorado, hydrogeomorphic regions and fertilizer use estimates included.

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This dataset is one of eight datasets produced by this study. Four of the datasets predict the probability of detecting atrazine and(or) desethyl-atrazine (a...

  15. Raster dataset showing the probability of detecting atrazine/desethyl-atrazine in ground water in Colorado, hydrogeomorphic regions and atrazine use estimates included.

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This dataset is one of eight datasets produced by this study. Four of the datasets predict the probability of detecting atrazine and(or) desethyl-atrazine (a...

  16. Raster dataset showing the probability of detecting atrazine/desethyl-atrazine in ground water in Colorado, hydrogeomorphic regions and atrazine use estimates not included.

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This dataset is one of eight datasets produced by this study. Four of the datasets predict the probability of detecting atrazine and(or) desethyl-atrazine (a...

  17. Raster dataset showing the probability of elevated concentrations of nitrate in ground water in Colorado, hydrogeomorphic regions and fertilizer use estimates not included.

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This dataset is one of eight datasets produced by this study. Four of the datasets predict the probability of detecting atrazine and(or) desethyl-atrazine (a...

  18. Raster dataset showing the probability of elevated concentrations of nitrate in ground water in Colorado, hydrogeomorphic regions not included and fertilizer use estimates included.

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This dataset is one of eight datasets produced by this study. Four of the datasets predict the probability of detecting atrazine and(or) desethyl-atrazine (a...

  19. Comparison of the unstructured clinician gestalt, the wells score, and the revised Geneva score to estimate pretest probability for suspected pulmonary embolism.

    Science.gov (United States)

    Penaloza, Andrea; Verschuren, Franck; Meyer, Guy; Quentin-Georget, Sybille; Soulie, Caroline; Thys, Frédéric; Roy, Pierre-Marie

    2013-08-01

    The assessment of clinical probability (as low, moderate, or high) with clinical decision rules has become a cornerstone of diagnostic strategy for patients with suspected pulmonary embolism, but little is known about the use of physician gestalt assessment of clinical probability. We evaluate the performance of gestalt assessment for diagnosing pulmonary embolism. We conducted a retrospective analysis of a prospective observational cohort of consecutive suspected pulmonary embolism patients in emergency departments. Accuracy of gestalt assessment was compared with the Wells score and the revised Geneva score by the area under the curve (AUC) of receiver operating characteristic curves. Agreement between the 3 methods was determined by κ test. The study population was 1,038 patients, with a pulmonary embolism prevalence of 31.3%. AUC differed significantly between the 3 methods and was 0.81 (95% confidence interval [CI] 0.78 to 0.84) for gestalt assessment, 0.71 (95% CI 0.68 to 0.75) for Wells, and 0.66 (95% CI 0.63 to 0.70) for the revised Geneva score. The proportion of patients categorized as having low clinical probability was statistically higher with gestalt than with revised Geneva score (43% versus 26%; 95% CI for the difference of 17%=13% to 21%). Proportion of patients categorized as having high clinical probability was higher with gestalt than with Wells (24% versus 7%; 95% CI for the difference of 17%=14% to 20%) or revised Geneva score (24% versus 10%; 95% CI for the difference of 15%=13% to 21%). Pulmonary embolism prevalence was significantly lower with gestalt versus clinical decision rules in low clinical probability (7.6% for gestalt versus 13.0% for revised Geneva score and 12.6% for Wells score) and non-high clinical probability groups (18.3% for gestalt versus 29.3% for Wells and 27.4% for revised Geneva score) and was significantly higher with gestalt versus Wells score in high clinical probability groups (72.1% versus 58.1%). Agreement

  20. Inferring Beliefs as Subjectively Imprecise Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.;

    2012-01-01

    We propose a method for estimating subjective beliefs, viewed as a subjective probability distribution. The key insight is to characterize beliefs as a parameter to be estimated from observed choices in a well-defined experimental task and to estimate that parameter as a random coefficient. The e...... probabilities are indeed best characterized as probability distributions with non-zero variance....

  1. DETERMINING TYPE Ia SUPERNOVA HOST GALAXY EXTINCTION PROBABILITIES AND A STATISTICAL APPROACH TO ESTIMATING THE ABSORPTION-TO-REDDENING RATIO R_V

    Energy Technology Data Exchange (ETDEWEB)

    Cikota, Aleksandar [European Southern Observatory, Karl-Schwarzschild-Strasse 2, D-85748 Garching b. München (Germany); Deustua, Susana [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Marleau, Francine, E-mail: acikota@eso.org [Institute for Astro- and Particle Physics, University of Innsbruck, Technikerstrasse 25/8, A-6020 Innsbruck (Austria)

    2016-03-10

    We investigate limits on the extinction values of Type Ia supernovae (SNe Ia) to statistically determine the most probable color excess, E(B – V), with galactocentric distance, and use these statistics to determine the absorption-to-reddening ratio, R_V, for dust in the host galaxies. We determined pixel-based dust mass surface density maps for 59 galaxies from the Key Insight on Nearby Galaxies: a Far-infrared Survey with Herschel (KINGFISH). We use SN Ia spectral templates to develop a Monte Carlo simulation of color excess E(B – V) with R_V = 3.1 and investigate the color excess probabilities E(B – V) with projected radial galaxy center distance. Additionally, we tested our model using observed spectra of SN 1989B, SN 2002bo, and SN 2006X, which occurred in three KINGFISH galaxies. Finally, we determined the most probable reddening for Sa–Sap, Sab–Sbp, Sbc–Scp, Scd–Sdm, S0, and irregular galaxy classes as a function of R/R_25. We find that the largest expected reddening probabilities are in Sab–Sb and Sbc–Sc galaxies, while S0 and irregular galaxies are very dust poor. We present a new approach for determining the absorption-to-reddening ratio R_V using color excess probability functions and find values of R_V = 2.71 ± 1.58 for 21 SNe Ia observed in Sab–Sbp galaxies, and R_V = 1.70 ± 0.38 for 34 SNe Ia observed in Sbc–Scp galaxies.

  2. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  4. Determining Type Ia Supernovae Host galaxy extinction probabilities and a statistical approach to estimating the absorption-to-reddening ratio $R_V$

    CERN Document Server

    Cikota, Aleksandar; Marleau, Francine

    2016-01-01

    We investigate limits on the extinction values of Type Ia supernovae to statistically determine the most probable color excess, E(B-V), with galactocentric distance, and use these statistics to determine the absorption-to-reddening ratio, $R_V$, for dust in the host galaxies. We determined pixel-based dust mass surface density maps for 59 galaxies from the Key Insight on Nearby Galaxies: a Far-Infrared Survey with \\textit{Herschel} (KINGFISH, Kennicutt et al. (2011)). We use Type Ia supernova spectral templates (Hsiao et al. 2007) to develop a Monte Carlo simulation of color excess E(B-V) with $R_V$ = 3.1 and investigate the color excess probabilities E(B-V) with projected radial galaxy center distance. Additionally, we tested our model using observed spectra of SN 1989B, SN 2002bo and SN 2006X, which occurred in three KINGFISH galaxies. Finally, we determined the most probable reddening for Sa-Sap, Sab-Sbp, Sbc-Scp, Scd-Sdm, S0 and Irregular galaxy classes as a function of $R/R_{25}$. We find that the larges...

  5. Probability distribution estimation for Web service QoS based on the maximum entropy principle

    Institute of Scientific and Technical Information of China (English)

    代志华; 付晓东; 黄袁; 贾楠

    2012-01-01

    To manage the risk of a service, it is necessary to understand the stochastic character of its Quality of Service (QoS), which can be represented effectively by an accurate probability distribution. This paper presents an approach for estimating the probability distribution of Web service QoS when only a small number of samples is available. Using the maximum entropy principle, an analytical form of the probability density function is obtained by transforming the distribution estimation problem into an optimization problem whose constraints are derived from the sampled QoS data. An algorithm for estimating the parameters of this density function is then designed. Experimental and simulation results based on real Web service QoS data show that the proposed approach is effective and reasonable for estimating the distributions of different QoS attributes, and validate the efficiency and termination of the estimation algorithm.
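
    A generic version of the maximum-entropy step can be sketched as follows: on a discretized support, the distribution matching the sample mean and second moment with maximal entropy has exponential-family form, and its multipliers are found by minimizing the convex dual. This is a textbook max-entropy sketch, not the paper's exact algorithm; the QoS sample and support are assumed.

```python
import numpy as np
from scipy.optimize import minimize

# Maximum-entropy estimate of a QoS distribution (e.g., response time in
# seconds) on a discretized support, constrained to match the sample mean and
# second moment.  A generic max-entropy sketch, not the paper's exact algorithm.

rng = np.random.default_rng(4)
samples = rng.gamma(shape=2.0, scale=0.15, size=30)      # small QoS sample
moments = np.array([samples.mean(), (samples ** 2).mean()])

x = np.linspace(0.0, 2.0, 400)                           # response-time grid
features = np.vstack([x, x ** 2])                        # moment features

def dual(lam):
    # Convex dual of the max-entropy problem: log-partition minus lam . moments
    logits = lam @ features
    m = logits.max()                                     # numerical stability
    return m + np.log(np.exp(logits - m).sum()) - lam @ moments

res = minimize(dual, x0=np.zeros(2), method="BFGS")
logits = res.x @ features
p = np.exp(logits - logits.max())
p /= p.sum()                                             # discrete max-entropy pmf

print("fitted mean  :", round(float(p @ x), 3), "  sample mean  :", round(float(moments[0]), 3))
print("fitted E[x^2]:", round(float(p @ x ** 2), 3), "  sample E[x^2]:", round(float(moments[1]), 3))
```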

  6. Estimating the five-year survival of cervical cancer patients treated in Hospital Universiti Sains Malaysia.

    Science.gov (United States)

    Razak, Nuradhiathy Abd; Mn, Khattak; Zubairi, Yong Zulina; Naing, Nyi Nyi; Zaki, Nik Mohamed

    2013-01-01

    The objective of this study was to determine the five-year survival among patients with cervical cancer treated in Hospital Universiti Sains Malaysia. One hundred and twenty cervical cancer patients diagnosed between 1st July 1995 and 30th June 2007 were identified. Data were obtained from medical records. The survival probability was determined using the Kaplan-Meier method and the log-rank test was applied to compare the survival distribution between groups. The overall five-year survival was 39.7% [95%CI (Confidence Interval): 30.7, 51.3] with a median survival time of 40.8 (95%CI: 34.0, 62.0) months. The log-rank test showed that there were survival differences between the groups for the following variables: stage at diagnosis (p=0.005); and primary treatment (p=0.0242). Patients who were diagnosed at the latest stage (III-IV) were found to have the lowest survival, 18.4% (95%CI: 6.75, 50.1), compared to stage I and II where the five-year survival was 54.7% (95%CI: 38.7, 77.2) and 40.8% (95%CI: 27.7, 60.3), respectively. The five-year survival was higher in patients who received surgery [52.6% (95%CI: 37.5, 73.6)] as a primary treatment compared to the non-surgical group [33.3% (95%CI: 22.9, 48.4)]. The five-year survival of cervical cancer patients in this study was low. The survival of those diagnosed at an advanced stage was low compared to early stages. In addition, those who underwent surgery had higher survival than those who had no surgery for primary treatment.
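
    The analysis pattern described above (Kaplan-Meier survival by group plus a log-rank comparison) can be sketched as follows, assuming the lifelines Python package; the two synthetic groups stand in for early- and late-stage patients and are not the study's data.

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Sketch of a Kaplan-Meier analysis with a log-rank comparison between two
# groups, assuming the `lifelines` package.  Synthetic data only.

rng = np.random.default_rng(5)
n = 60
t_early = np.minimum(rng.exponential(70, n), 60.0)       # months, capped at 5 years
t_late = np.minimum(rng.exponential(25, n), 60.0)
e_early = rng.random(n) < 0.7                            # True = death observed
e_late = rng.random(n) < 0.8

kmf = KaplanMeierFitter()
kmf.fit(t_early, event_observed=e_early, label="stage I-II")
print("5-year survival, stage I-II  :", round(float(kmf.predict(60.0)), 2))
kmf.fit(t_late, event_observed=e_late, label="stage III-IV")
print("5-year survival, stage III-IV:", round(float(kmf.predict(60.0)), 2))

result = logrank_test(t_early, t_late,
                      event_observed_A=e_early, event_observed_B=e_late)
print("log-rank p-value:", round(result.p_value, 4))
```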

  7. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  8. The probabilities of unique events.

    Directory of Open Access Journals (Sweden)

    Sangeet S Khemlani

    Full Text Available Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable.

  9. Subjective and Objective Data Integration Method for Estimating Probabilities of Emergency Scenarios

    Institute of Scientific and Technical Information of China (English)

    于超; 刘洋; 樊治平

    2012-01-01

    How to estimate the probabilities of different scenarios scientifically at the beginning of an emergency is a premise of, and key to, correctly choosing an emergency response plan. For this scenario probability estimation problem, a method integrating subjective and objective information is proposed. First, historical cases of the same type are screened according to their computed similarity to the current emergency, and the objective probability of each possible scenario is determined from the statistics of the scenarios of the similar cases. Then, the subjective scenario probabilities are obtained by aggregating expert judgments with a linear weighting method. The probability of each scenario is further determined by integrating the objective and subjective probability information. Finally, a numerical example illustrates the feasibility and effectiveness of the proposed method.
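
    The integration step has a straightforward counterpart in code, sketched below: objective scenario probabilities from the frequencies of similar historical cases, subjective probabilities as a weighted average of expert judgments, and a convex combination of the two. The weights and the combination coefficient are assumed for illustration; the paper's exact formulas are not reproduced here.

```python
import numpy as np

# Combine objective probabilities (frequencies of scenarios among retrieved
# similar historical cases) with subjective probabilities (weighted expert
# judgments).  All weights and the coefficient alpha are assumed.

scenarios = ["minor", "moderate", "severe"]

similar_case_counts = np.array([12, 6, 2])          # scenario counts in similar cases
p_objective = similar_case_counts / similar_case_counts.sum()

expert_judgments = np.array([                       # one row per expert
    [0.5, 0.3, 0.2],
    [0.6, 0.3, 0.1],
    [0.4, 0.4, 0.2],
])
expert_weights = np.array([0.5, 0.3, 0.2])          # assumed expert credibility
p_subjective = expert_weights @ expert_judgments

alpha = 0.6                                         # weight on the objective part
p_combined = alpha * p_objective + (1 - alpha) * p_subjective
p_combined /= p_combined.sum()                      # guard against rounding drift

for name, p in zip(scenarios, p_combined):
    print(f"P({name}) = {p:.3f}")
```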

  10. Probably Almost Bayes Decisions

    DEFF Research Database (Denmark)

    Anoulova, S.; Fischer, Paul; Poelt, S.

    1996-01-01

    In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian discriminant functions for this purpose. We analyze this approach for different classes of distribution functions of Boolean features: kth order Bahadur-Lazarsfeld expansions and kth order Chow expansions. In both cases, we obtain upper bounds for the required sample size which are small polynomials in the relevant parameters and which match the lower bounds known for these classes. Moreover, the learning algorithms are efficient.

  11. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  12. Towards the prediction of pre-mining stresses in the European continent. [Estimates of vertical and probable maximum lateral stress in Europe

    Energy Technology Data Exchange (ETDEWEB)

    Blackwood, R.L.

    1980-05-15

    There are now available sufficient data from in-situ, pre-mining stress measurements to allow a first attempt at predicting the maximum stress magnitudes likely to occur in a given mining context. The sub-horizontal (lateral) stress generally dominates the stress field, becoming critical to stope stability in many cases. For cut-and-fill mining in particular, where developed fill pressures are influenced by lateral displacement of pillars or stope backs, extraction maximization planning by mathematical modelling techniques demands the best available estimate of pre-mining stresses. While field measurements are still essential for this purpose, in the present paper it is suggested that the worst stress case can be predicted for preliminary design or feasibility study purposes. In the European continent the vertical component of pre-mining stress may be estimated by adding 2 MPa to the pressure due to overburden weight. The maximum lateral stress likely to be encountered is about 57 MPa at depths of some 800 m to 1000 m below the surface.

  13. Estimating the probability of polyreactive antibodies 4E10 and 2F5 disabling a gp41 trimer after T cell-HIV adhesion.

    Directory of Open Access Journals (Sweden)

    Bin Hu

    2014-01-01

    Full Text Available A few broadly neutralizing antibodies, isolated from HIV-1 infected individuals, recognize epitopes in the membrane proximal external region (MPER of gp41 that are transiently exposed during viral entry. The best characterized, 4E10 and 2F5, are polyreactive, binding to the viral membrane and their epitopes in the MPER. We present a model to calculate, for any antibody concentration, the probability that during the pre-hairpin intermediate, the transient period when the epitopes are first exposed, a bound antibody will disable a trivalent gp41 before fusion is complete. When 4E10 or 2F5 bind to the MPER, a conformational change is induced that results in a stably bound complex. The model predicts that for these antibodies to be effective at neutralization, the time to disable an epitope must be shorter than the time the antibody remains bound in this conformation, about five minutes or less for 4E10 and 2F5. We investigate the role of avidity in neutralization and show that 2F5 IgG, but not 4E10, is much more effective at neutralization than its Fab fragment. We attribute this to 2F5 interacting more stably than 4E10 with the viral membrane. We use the model to elucidate the parameters that determine the ability of these antibodies to disable epitopes and propose an extension of the model to analyze neutralization data. The extended model predicts the dependencies of IC50 for neutralization on the rate constants that characterize antibody binding, the rate of fusion of gp41, and the number of gp41 bridging the virus and target cell at the start of the pre-hairpin intermediate. Analysis of neutralization experiments indicate that only a small number of gp41 bridges must be disabled to prevent fusion. However, the model cannot determine the exact number from neutralization experiments alone.

  14. Estimation of Functional Failure Probability of Passive Systems Based on Subset Simulation Method

    Institute of Scientific and Technical Information of China (English)

    王冬青; 王宝生; 姜晶; 张建民

    2012-01-01

    In order to address the multi-dimensional epistemic uncertainties and the small functional failure probability of passive systems, a reliability analysis algorithm called subset simulation, based on Markov chain Monte Carlo, was presented. The method is founded on the idea that a small failure probability can be expressed as a product of larger conditional failure probabilities by introducing a suitable sequence of intermediate failure events. Markov chain Monte Carlo simulation is used to efficiently generate conditional samples for estimating these conditional failure probabilities. Taking the AP1000 passive residual heat removal system as an example, the uncertainties related to the model of the passive system and to the numerical values of its input parameters were considered, and the probability of functional failure was estimated with the subset simulation method. The numerical results demonstrate that, compared with traditional probabilistic analysis methods, subset simulation offers high computational efficiency while maintaining excellent accuracy, and adapts well to the nonlinear performance functions of passive safety systems.
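
    A minimal subset-simulation sketch is given below for a toy limit-state function g(x), with failure defined as g(x) <= 0: intermediate thresholds are set at the p0-quantile of g, and conditional samples are generated with a simple Metropolis move. The limit state and all tuning constants are illustrative, not the AP1000 thermal-hydraulic model.

```python
import numpy as np
from math import erf, sqrt

# Subset simulation for a small failure probability with a toy limit state.
# Failure when g(x) <= 0; the standard-normal input and all constants are
# illustrative only.

rng = np.random.default_rng(6)
d, N, p0 = 2, 2000, 0.1                    # dimension, samples per level, level prob.

def g(x):
    # Toy linear limit state far in the tail: P(g <= 0) = P(Z >= 4.5)
    return 4.5 - x.sum(axis=-1) / np.sqrt(d)

def metropolis_step(x, gx, threshold, step=1.0):
    y = x + step * rng.normal(size=x.shape)
    # Metropolis ratio for a standard-normal target (symmetric proposal)
    accept = rng.random(x.shape[0]) < np.exp(0.5 * ((x**2).sum(1) - (y**2).sum(1)))
    gy = g(y)
    accept &= gy <= threshold              # stay inside the intermediate region
    x = np.where(accept[:, None], y, x)
    gx = np.where(accept, gy, gx)
    return x, gx

x = rng.normal(size=(N, d))
gx = g(x)
prob = 1.0
for level in range(1, 21):                 # hard cap on the number of levels
    threshold = np.quantile(gx, p0)
    if threshold <= 0:                     # reached the true failure region
        prob *= np.mean(gx <= 0)
        break
    prob *= p0
    seeds = np.argsort(gx)[: int(p0 * N)]  # samples already below the threshold
    x = np.tile(x[seeds], (int(1 / p0), 1))
    gx = np.tile(gx[seeds], int(1 / p0))
    for _ in range(10):                    # a few Metropolis sweeps per level
        x, gx = metropolis_step(x, gx, threshold)

exact = 0.5 * (1.0 - erf(4.5 / sqrt(2.0)))       # P(Z >= 4.5)
print(f"subset simulation estimate: {prob:.2e}   exact: {exact:.2e}")
```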

  15. Probability distributions for magnetotellurics

    Energy Technology Data Exchange (ETDEWEB)

    Stodt, John A.

    1982-11-01

    Estimates of the magnetotelluric transfer functions can be viewed as ratios of two complex random variables. It is assumed that the numerator and denominator are governed approximately by a joint complex normal distribution. Under this assumption, probability distributions are obtained for the magnitude, squared magnitude, logarithm of the squared magnitude, and the phase of the estimates. Normal approximations to the distributions are obtained by calculating mean values and variances from error propagation, and the distributions are plotted with their normal approximations for different percentage errors in the numerator and denominator of the estimates, ranging from 10% to 75%. The distribution of the phase is approximated well by a normal distribution for the range of errors considered, while the distribution of the logarithm of the squared magnitude is approximated by a normal distribution for a much larger range of errors than is the distribution of the squared magnitude. The distribution of the squared magnitude is most sensitive to the presence of noise in the denominator of the estimate, in which case the true distribution deviates significantly from normal behavior as the percentage errors exceed 10%. In contrast, the normal approximation to the distribution of the logarithm of the magnitude is useful for errors as large as 75%.

  16. Using hierarchical Bayesian multi-species mixture models to estimate tandem hoop-net based habitat associations and detection probabilities of fishes in reservoirs

    Science.gov (United States)

    Stewart, David R.; Long, James M.

    2015-01-01

    Species distribution models are useful tools to evaluate habitat relationships of fishes. We used hierarchical Bayesian multispecies mixture models to evaluate the relationships of both detection and abundance with habitat of reservoir fishes caught using tandem hoop nets. A total of 7,212 fish from 12 species were captured, and the majority of the catch was composed of Channel Catfish Ictalurus punctatus (46%), Bluegill Lepomis macrochirus(25%), and White Crappie Pomoxis annularis (14%). Detection estimates ranged from 8% to 69%, and modeling results suggested that fishes were primarily influenced by reservoir size and context, water clarity and temperature, and land-use types. Species were differentially abundant within and among habitat types, and some fishes were found to be more abundant in turbid, less impacted (e.g., by urbanization and agriculture) reservoirs with longer shoreline lengths; whereas, other species were found more often in clear, nutrient-rich impoundments that had generally shorter shoreline length and were surrounded by a higher percentage of agricultural land. Our results demonstrated that habitat and reservoir characteristics may differentially benefit species and assemblage structure. This study provides a useful framework for evaluating capture efficiency for not only hoop nets but other gear types used to sample fishes in reservoirs.

  17. Ruin Probability and Asymptotic Estimate of a Compound Binomial Distribution Model

    Institute of Scientific and Technical Information of China (English)

    许璐; 赵闻达

    2012-01-01

    Classical probability theory is applied, through a suitable mathematical model, to derive an explicit solution for the ultimate ruin probability in a compound binomial distribution model, and an expression for its asymptotic estimate is then obtained. The conclusions include the results of the related literature.
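
    Since the record does not reproduce the explicit formula, the snippet below only checks the quantity by brute force: a finite-horizon Monte Carlo estimate of the ruin probability in a compound binomial surplus process with unit premium per period. The claim probability, claim-size distribution, initial surplus and the ruin convention are all assumed for illustration.

```python
import numpy as np

# Monte Carlo check of the ruin probability in a compound binomial risk model:
# unit premium per period, a claim occurs each period with probability p, and
# claim sizes are i.i.d. positive integers.  Finite horizon and the
# "surplus drops below zero" ruin convention are assumptions of this sketch.

rng = np.random.default_rng(7)
p = 0.3
claim_sizes, claim_probs = np.array([1, 2, 3]), np.array([0.5, 0.3, 0.2])
u0, horizon, n_paths = 5, 500, 5000               # initial surplus, steps, paths

# positive safety loading requires p * E[claim] < premium per period (= 1)
assert p * (claim_sizes * claim_probs).sum() < 1

ruined = 0
for _ in range(n_paths):
    surplus = u0
    for _ in range(horizon):
        surplus += 1                              # premium income
        if rng.random() < p:                      # a claim arrives
            surplus -= rng.choice(claim_sizes, p=claim_probs)
        if surplus < 0:
            ruined += 1
            break

print("estimated ruin probability:", ruined / n_paths)
```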

  18. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  19. Probability Ranking in Vector Spaces

    CERN Document Server

    Melucci, Massimo

    2011-01-01

    The Probability Ranking Principle states that the document set with the highest values of probability of relevance optimizes information retrieval effectiveness given the probabilities are estimated as accurately as possible. The key point of the principle is the separation of the document set into two subsets with a given level of fallout and with the highest recall. The paper introduces the separation between two vector subspaces and shows that the separation yields a more effective performance than the optimal separation into subsets with the same available evidence, the performance being measured with recall and fallout. The result is proved mathematically and exemplified experimentally.

  20. Holographic probabilities in eternal inflation.

    Science.gov (United States)

    Bousso, Raphael

    2006-11-10

    In the global description of eternal inflation, probabilities for vacua are notoriously ambiguous. The local point of view is preferred by holography and naturally picks out a simple probability measure. It is insensitive to large expansion factors or lifetimes and so resolves a recently noted paradox. Any cosmological measure must be complemented with the probability for observers to emerge in a given vacuum. In lieu of anthropic criteria, I propose to estimate this by the entropy that can be produced in a local patch. This allows for prior-free predictions.

  1. Estimation of discrete lifetime probability distributions from grouped censored data

    Institute of Scientific and Technical Information of China (English)

    侯超钧; 吴东庆; 王前; 杨志伟

    2012-01-01

    At present, grouped data from unknown discrete lifetime distributions have received little study. To avoid solving complex systems of nonlinear maximum likelihood equations, a recursive formula for the probability distribution is derived from the likelihood equations by introducing a Lagrange multiplier. The maximum likelihood estimate of p1 is obtained from the degenerate single-interval model, and the probabilities pi are then computed successively by recursion. An experiment shows that the method is effective.

  2. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
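
    A classic example of the phenomenon described above (hedged: the article's own three problems are not reproduced here) is the probability that a random permutation of n items has no fixed point, which tends to 1/e as n grows. The short Python simulation below, with an arbitrarily chosen n and number of trials, illustrates this.

        import math
        import random

        def has_no_fixed_point(n):
            # Shuffle 0..n-1 and check that no item lands in its original slot.
            perm = list(range(n))
            random.shuffle(perm)
            return all(perm[i] != i for i in range(n))

        n, trials = 20, 100_000   # illustrative values
        estimate = sum(has_no_fixed_point(n) for _ in range(trials)) / trials
        print(f"simulated ~ {estimate:.4f},  1/e ~ {1 / math.e:.4f}")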

  3. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  5. Comparison of probability distribution models and estimation of the probable rainfall for the Barbacena County, MG

    Directory of Open Access Journals (Sweden)

    Bruno Teixeira Ribeiro

    2007-10-01

    Full Text Available Probabilistic studies of climatic variables are extremely important for farming, construction, tourism, transportation, and other activities. Seeking to contribute to the planning of irrigated agriculture, this work aimed to compare probability distributions fitted to ten-day and monthly historical rainfall series and to estimate the probable rainfall for the Barbacena County, Minas Gerais State, Brazil. Rainfall data for December, January, and February from 1942 to 2003 were studied, constituting historical series with 62 years of observations. Daily rainfall depths were totaled over ten-day and monthly periods, and the two-parameter log-Normal, three-parameter log-Normal, and Gamma distributions were fitted. The goodness of fit of the distributions in each period was assessed with the chi-square (chi2) test at the 5% significance level. The probable rainfall was estimated for each period using the distribution with the smallest chi2 value, at exceedance probability levels of 75, 90, and 98%. The Gamma distribution provided the best fit to the data. The study of probable rainfall is a useful tool to support decision-making on irrigation planning and use.
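
    As a minimal sketch of the workflow described above (not the authors' code), the Python snippet below fits a two-parameter Gamma distribution to a hypothetical ten-day rainfall series with scipy and reads off the probable rainfall at the 75, 90, and 98% exceedance levels; the rainfall values are invented for illustration.

        import numpy as np
        from scipy import stats

        # Hypothetical ten-day rainfall totals in mm (illustrative numbers only).
        rain_mm = np.array([120.5, 98.3, 150.2, 80.7, 200.1, 175.4,
                            95.0, 130.8, 110.2, 160.9, 142.3, 88.6])

        # Fit a two-parameter Gamma distribution (location fixed at zero).
        shape, loc, scale = stats.gamma.fit(rain_mm, floc=0)

        # Probable rainfall at exceedance probability p is the (1 - p) quantile.
        for p in (0.75, 0.90, 0.98):
            depth = stats.gamma.ppf(1 - p, shape, loc=loc, scale=scale)
            print(f"exceedance {p:.0%}: probable rainfall ~ {depth:.1f} mm")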

  6. Adaptive Buffer Management Strategy with a Message Delivery Probability Estimation Method in Opportunistic Networks

    Institute of Scientific and Technical Information of China (English)

    吴大鹏; 张普宁; 王汝言

    2014-01-01

    Buffer resource utilization in opportunistic networks can be improved by an efficient buffer management strategy, and the delivery probability of a message directly determines whether it is worth forwarding and storing. This paper proposes an adaptive buffer management strategy with a message delivery probability estimation method. By building a node connection-state analysis model, the service ability of nodes is evaluated in a distributed way and used to estimate the delivery probability of each message; the forwarding and dropping priorities are then determined accordingly to perform buffer management operations. Numerical results show that the proposed strategy reduces the network overhead ratio by about 57%, improves the delivery ratio, and reduces the average delivery latency.

  7. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation between the Dirac delta-function and the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  8. Agreeing Probability Measures for Comparative Probability Structures

    NARCIS (Netherlands)

    P.P. Wakker (Peter)

    1981-01-01

    textabstractIt is proved that fine and tight comparative probability structures (where the set of events is assumed to be an algebra, not necessarily a σ-algebra) have agreeing probability measures. Although this was often claimed in the literature, all proofs the author encountered are not valid

  9. [Incidence of congenital heart disease and factors associated with mortality in children born in two Hospitals in the State of Mexico].

    Science.gov (United States)

    Mendieta-Alcántara, Gustavo Gabriel; Santiago-Alcántara, Elia; Mendieta-Zerón, Hugo; Dorantes-Piña, Ramsés; Ortiz de Zárate-Alarcón, Gabriela; Otero-Ojeda, Gloria A

    2013-01-01

    We studied the incidence, survival, and risk factors for mortality in a cohort of infants born over a period of five years in two hospitals in the City of Toluca, one a second-level general hospital and the other a tertiary perinatal hospital. Survival was analyzed with the Kaplan-Meier method, and Cox regression was used to estimate the risk of death according to different factors. We found an overall incidence of 7.4 per 1,000 live births; in preterm infants the rate was 35.6 per 1,000, and in term newborns it was 3.68 per 1,000. The most common heart disease was patent ductus arteriosus overall and in preterm infants; in term newborns the most common was the atrial septal defect. The specific mortality was 18.64%. Over a follow-up of 579 days, the Kaplan-Meier analysis showed a mean survival of 437.92 days, with a 95% confidence interval of 393.25 to 482.6 days and a standard error of 22.79 days; the cumulative probability of survival was 0.741, with a standard error of 0.44. In the Cox regression, two variables had a high hazard ratio (HR): the presence or absence of cyanosis, and the hospital where the newborns were treated.
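
    A minimal, hedged sketch of the analysis pattern described above (Kaplan-Meier survival curve plus Cox regression for risk factors) is shown below using the lifelines package, which is assumed to be available; the data frame and its covariates are invented for illustration and are not the study's data.

        import pandas as pd
        from lifelines import KaplanMeierFitter, CoxPHFitter

        # Invented follow-up data: time in days, death indicator, two covariates.
        df = pd.DataFrame({
            "days":     [579, 120, 437, 60, 300, 510, 45, 400, 250, 330],
            "death":    [0,   1,   0,   1,  0,   0,   1,  1,   0,   1],
            "cyanosis": [0,   1,   0,   1,  0,   1,   1,  0,   0,   1],
            "hospital": [0,   0,   1,   1,  0,   1,   0,  1,   1,   0],
        })

        kmf = KaplanMeierFitter()
        kmf.fit(df["days"], event_observed=df["death"])
        print(kmf.survival_function_)          # cumulative survival probability over time

        cph = CoxPHFitter()
        cph.fit(df, duration_col="days", event_col="death")
        print(cph.summary[["exp(coef)"]])      # hazard ratios for cyanosis and hospital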

  10. Cluster Membership Probability: Polarimetric Approach

    CERN Document Server

    Medhi, Biman J

    2013-01-01

    Interstellar polarimetric data of the six open clusters Hogg 15, NGC 6611, NGC 5606, NGC 6231, NGC 5749 and NGC 6250 have been used to estimate the membership probability for the stars within them. For proper-motion member stars, the membership probability estimated using the polarimetric data is in good agreement with the proper-motion cluster membership probability. However, for proper-motion non-member stars, the membership probability estimated by the polarimetric method is in total disagreement with the proper-motion cluster membership probability. The inconsistencies in the determined memberships may be because of the fundamental differences between the two methods of determination: one is based on stellar proper-motion in space and the other is based on selective extinction of the stellar output by the asymmetric aligned dust grains present in the interstellar medium. The results and analysis suggest that the scatter of the Stokes vectors q(%) and u(%) for the proper-motion member stars depends on the ...

  11. Assessing the effect of quantitative and qualitative predictors on gastric cancer individuals survival using hierarchical artificial neural network models.

    Science.gov (United States)

    Amiri, Zohreh; Mohammad, Kazem; Mahmoudi, Mahmood; Parsaeian, Mahbubeh; Zeraati, Hojjat

    2013-01-01

    There are numerous unanswered questions in the application of artificial neural network models for analysis of survival data. In most studies, independent variables have been studied as qualitative dichotomous variables, and results of using discrete and continuous quantitative, ordinal, or multinomial categorical predictive variables in these models are not well understood in comparison to conventional models. This study was designed and conducted to examine the application of these models in determining the survival of gastric cancer patients, in comparison to the Cox proportional hazards model. We studied the postoperative survival of 330 gastric cancer patients who underwent surgery at a surgical unit of the Iran Cancer Institute over a five-year period. Covariates of age, gender, history of substance abuse, cancer site, type of pathology, presence of metastasis, stage, and number of complementary treatments were entered in the models, and survival probabilities were calculated at 6, 12, 18, 24, 36, 48, and 60 months using the Cox proportional hazards and neural network models. We estimated coefficients of the Cox model and the weights in the neural network (with 3, 5, and 7 nodes in the hidden layer) in the training group, and used them to derive predictions in the study group. Predictions with these two methods were compared with those of the Kaplan-Meier product limit estimator as the gold standard. Comparisons were performed with the Friedman and Kruskal-Wallis tests. Survival probabilities at different times were determined using the Cox proportional hazards and a neural network with three nodes in the hidden layer; the ratios of standard errors with these two methods to the Kaplan-Meier method were 1.1593 and 1.0071, respectively, which revealed a significant difference between Cox and Kaplan-Meier (P neural network, and the neural network and the standard (Kaplan-Meier), as well as better accuracy for the neural network (with 3 nodes in the hidden layer

  12. Parameter Estimation of Generalized Partial Probability Weighted Moments for the Generalized Pareto Distribution%广义Pareto分布的广义有偏概率加权矩估计方法

    Institute of Scientific and Technical Information of China (English)

    赵旭; 程维虎; 李婧兰

    2012-01-01

    The generalized Pareto distribution (GPD) is one of the most important distributions in statistical analysis and has been widely applied in finance, insurance, hydrology, meteorology, and other fields. Traditional estimation methods such as maximum likelihood (ML), the method of moments (MOM), and probability weighted moments (PWM) have been applied extensively, but their use is often restricted. Alternative approaches (e.g., generalized probability weighted moments, L-moments, and LH-moments) exist, but they require complete (non-censored) samples, whereas censored samples are often encountered in hydrology and meteorology. In this article, we propose a computationally simple method for fitting the GPD from censored data that is resistant to extremely small or large outliers, i.e., robust with respect to the lower and upper breakdown points. The method is based on probability weighted moments: the shape parameter is estimated first with high precision, and the location and scale parameters are then obtained. Simulation studies show that the proposed method performs well compared with traditional techniques.
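
    For orientation only, the sketch below implements the classical (complete-sample) probability weighted moment estimators of Hosking and Wallis for a GPD with threshold 0, shape k, and scale sigma, where F(x) = 1 - (1 - k*x/sigma)**(1/k); it is not the authors' censored, robust variant. The simulated check data and scipy usage are assumptions for illustration.

        import numpy as np

        def gpd_pwm(x):
            # Sample PWMs b0 = E[X], b1 = E[X F(X)], then Hosking-Wallis inversion.
            x = np.sort(np.asarray(x, dtype=float))
            n = x.size
            b0 = x.mean()
            b1 = np.sum((np.arange(n) / (n - 1)) * x) / n
            k = b0 / (b0 - 2.0 * b1) - 2.0            # shape
            sigma = 2.0 * b0 * b1 / (b0 - 2.0 * b1)   # scale
            return k, sigma

        # Quick check on simulated data (scipy's shape c corresponds to k = -c here).
        from scipy.stats import genpareto
        sample = genpareto.rvs(c=0.2, scale=1.0, size=5000, random_state=0)
        print(gpd_pwm(sample))   # expect k ~ -0.2, sigma ~ 1.0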

  13. Detonation probabilities of high explosives

    Energy Technology Data Exchange (ETDEWEB)

    Eisenhawer, S.W.; Bott, T.F.; Bement, T.R.

    1995-07-01

    The probability of a high explosive violent reaction (HEVR) following various events is an extremely important aspect of estimating accident-sequence frequency for nuclear weapons dismantlement. In this paper, we describe the development of response curves for insults to PBX 9404, a conventional high-performance explosive used in US weapons. The insults during dismantlement include drops of high explosive (HE), strikes of tools and components on HE, and abrasion of the explosive. In the case of drops, we combine available test data on HEVRs and the results of flooring certification tests to estimate the HEVR probability. For other insults, it was necessary to use expert opinion. We describe the expert solicitation process and the methods used to consolidate the responses. The HEVR probabilities obtained from both approaches are compared.

  14. Comparison of an automated most-probable-number technique with traditional plating methods for estimating populations of total aerobes, coliforms, and Escherichia coli associated with freshly processed broiler chickens.

    Science.gov (United States)

    Line, J E; Stern, N J; Oakley, B B; Seal, B S

    2011-09-01

    An instrument (TEMPO) has been developed to automate the most-probable-number (MPN) technique and reduce the effort required to estimate some bacterial populations. We compared the automated MPN technique with traditional microbiological plating methods and Petrifilm methods for estimating the total viable count of aerobic microorganisms (TVC), total coliforms (CC), and Escherichia coli populations (EC) on freshly processed broiler chicken carcasses (postchill whole carcass rinse [WCR] samples) and cumulative drip-line samples from a commercial broiler processing facility. Overall, 120 broiler carcasses, 36 prechill drip-line samples, and 40 postchill drip-line samples were collected over 5 days (representing five individual flocks) and analyzed by the automated MPN and direct agar plating and Petrifilm methods. The TVC correlation coefficient between the automated MPN and traditional methods was very high (0.972) for the prechill drip samples, which had mean log-transformed values of 3.09 and 3.02, respectively. The TVC correlation coefficient was lower (0.710) for the postchill WCR samples, which had lower mean log values of 1.53 and 1.31, respectively. Correlations between the methods for the prechill CC and EC samples were 0.812 and 0.880, respectively. The estimated number of total aerobes was generally greater than the total number of coliforms or E. coli recovered for all sample types (P methods and the automated MPN method.
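
    As context for the MPN technique mentioned above, the following is a hedged sketch of the classical most-probable-number likelihood calculation that such instruments automate (it is not the TEMPO algorithm): the MPN is the concentration maximizing the likelihood of the observed pattern of positive tubes across a dilution series. The dilution scheme and tube counts are invented, and scipy is assumed available.

        import numpy as np
        from scipy.optimize import minimize_scalar

        volumes_ml = np.array([0.1, 0.01, 0.001])   # sample volume per tube at each dilution
        n_tubes    = np.array([3, 3, 3])            # tubes inoculated per dilution
        n_positive = np.array([3, 2, 0])            # tubes showing growth

        def neg_log_lik(log_conc):
            lam = np.exp(log_conc) * volumes_ml      # expected organisms per tube
            p_pos = 1.0 - np.exp(-lam)               # P(tube is positive)
            ll = (n_positive * np.log(p_pos)
                  + (n_tubes - n_positive) * (-lam)).sum()
            return -ll

        res = minimize_scalar(neg_log_lik, bounds=(-5, 15), method="bounded")
        print(f"MPN ~ {np.exp(res.x):.1f} organisms per mL")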

  15. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    A statistical procedure of estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. Also safety factor influence is investigated. DECOFF statistical method is benchmarked against standard Alpha...

  16. Multinomial mixture model with heterogeneous classification probabilities

    Science.gov (United States)

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of the multinomial and correct-classification probabilities when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.

  17. Estimating the mean and standard deviation of environmental data with below detection limit observations: Considering highly skewed data and model misspecification.

    Science.gov (United States)

    Shoari, Niloofar; Dubé, Jean-Sébastien; Chenouri, Shoja'eddin

    2015-11-01

    In environmental studies, concentration measurements frequently fall below detection limits of measuring instruments, resulting in left-censored data. Some studies employ parametric methods such as the maximum likelihood estimator (MLE), robust regression on order statistic (rROS), and gamma regression on order statistic (GROS), while others suggest a non-parametric approach, the Kaplan-Meier method (KM). Using examples of real data from a soil characterization study in Montreal, we highlight the need for additional investigations that aim at unifying the existing literature. A number of studies have examined this issue; however, those considering data skewness and model misspecification are rare. These aspects are investigated in this paper through simulations. Among other findings, results show that for low skewed data, the performance of different statistical methods is comparable, regardless of the censoring percentage and sample size. For highly skewed data, the performance of the MLE method under lognormal and Weibull distributions is questionable; particularly, when the sample size is small or censoring percentage is high. In such conditions, MLE under gamma distribution, rROS, GROS, and KM are less sensitive to skewness. Related to model misspecification, MLE based on lognormal and Weibull distributions provides poor estimates when the true distribution of data is misspecified. However, the methods of rROS, GROS, and MLE under gamma distribution are generally robust to model misspecifications regardless of skewness, sample size, and censoring percentage. Since the characteristics of environmental data (e.g., type of distribution and skewness) are unknown a priori, we suggest using MLE based on gamma distribution, rROS and GROS.
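
    A minimal sketch of the basic idea behind the parametric approaches compared above, assuming an underlying gamma distribution: detected values contribute the density to the likelihood, while non-detects contribute the probability mass below their detection limit. This is a generic censored-MLE illustration with invented data, not the study's rROS/GROS/KM implementations.

        import numpy as np
        from scipy import stats, optimize

        detected = np.array([3.2, 5.1, 7.8, 2.4, 9.6, 4.3, 6.0])   # measured concentrations
        det_limits = np.array([1.0, 1.0, 2.0])                      # limits of the non-detects

        def neg_log_lik(params):
            shape, scale = np.exp(params)    # optimize on the log scale for positivity
            ll = stats.gamma.logpdf(detected, shape, scale=scale).sum()
            ll += stats.gamma.logcdf(det_limits, shape, scale=scale).sum()
            return -ll

        res = optimize.minimize(neg_log_lik, x0=np.log([1.0, 3.0]), method="Nelder-Mead")
        shape_hat, scale_hat = np.exp(res.x)
        mean_hat = shape_hat * scale_hat                 # gamma mean
        sd_hat = np.sqrt(shape_hat) * scale_hat          # gamma standard deviation
        print(f"estimated mean ~ {mean_hat:.2f}, sd ~ {sd_hat:.2f}")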

  18. A Novel Probability-of-Detection-Difference Maximum-Based TOA Estimation Method in UWB Dense Multipath Channels

    Institute of Scientific and Technical Information of China (English)

    刘泽龙; 刘文彦; 丁宏; 黄晓涛

    2012-01-01

    In dense multipath channels, a practical ultra-wideband (UWB) ranging system usually adopts a time-of-arrival (TOA) based energy detection (ED) receiver, and the accuracy of the TOA estimate determines the ranging accuracy. Threshold comparison is widely used in ED, and the design of the threshold strongly affects the accuracy of TOA estimation. In this paper, we apply the maximum probability of detection (MPD) method to energy-based TOA estimators. By improving the way the probability of detection is calculated, a new criterion for determining the threshold and detecting the direct path (DP) is proposed: while searching over candidate thresholds, the algorithm computes the probability of correctly detecting the DP, declares the DP at the path where the difference between adjacent DP detection probabilities is maximal, and takes the corresponding threshold as the optimal threshold. The paper gives a theoretical analysis and the procedure of the proposed TOA estimation method. Simulation results at different signal-to-noise ratios show that the proposed method outperforms existing approaches and verify its effectiveness.

  19. Probability and Relative Frequency

    Science.gov (United States)

    Drieschner, Michael

    2016-01-01

    The concept of probability seems to have been inexplicable since its invention in the seventeenth century. In its use in science, probability is closely related with relative frequency. So the task seems to be interpreting that relation. In this paper, we start with predicted relative frequency and show that its structure is the same as that of probability. I propose to call that the `prediction interpretation' of probability. The consequences of that definition are discussed. The "ladder"-structure of the probability calculus is analyzed. The expectation of the relative frequency is shown to be equal to the predicted relative frequency. Probability is shown to be the most general empirically testable prediction.

  20. Elements of probability theory

    CERN Document Server

    Rumshiskii, L Z

    1965-01-01

    Elements of Probability Theory presents the methods of the theory of probability. This book is divided into seven chapters that discuss the general rule for the multiplication of probabilities, the fundamental properties of the subject matter, and the classical definition of probability. The introductory chapters deal with the functions of random variables; continuous random variables; numerical characteristics of probability distributions; center of the probability distribution of a random variable; definition of the law of large numbers; stability of the sample mean and the method of moments

  1. An importance sampling procedure for estimating failure probabilities of non-linear dynamic systems

    Institute of Scientific and Technical Information of China (English)

    任丽梅; 徐伟; 李战国

    2013-01-01

    The failure probability is one of the most important reliability measures in the structural reliability assessment of dynamical systems. Here, a procedure for estimating first-excursion failure probabilities of non-linear systems based on importance sampling is presented. First, using Rice's formula, an equivalent linear version of the non-linear system is derived; the design point of this equivalent linear system is then used to construct the control (importance sampling) function. Second, the importance sampling technique is applied to estimate the first-excursion probability of the non-linear system. Finally, a Duffing oscillator is taken as an example. The simulation results show that the proposed method is correct and effective; the number of samples and the computational time are reduced significantly compared with direct Monte Carlo simulation.
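
    The sketch below is a generic illustration of why importance sampling helps when failure probabilities are small; it is not the paper's equivalent-linearization scheme for the Duffing oscillator. It estimates p = P(Z > 4) for a standard normal with crude Monte Carlo and with a proposal density centred at the design point z* = 4, both with an arbitrarily chosen sample size.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n = 100_000
        threshold = 4.0

        # Crude Monte Carlo: almost no samples reach the failure region.
        z = rng.standard_normal(n)
        p_mc = np.mean(z > threshold)

        # Importance sampling: draw from N(4, 1) and reweight by the density ratio.
        y = rng.normal(loc=threshold, scale=1.0, size=n)
        w = stats.norm.pdf(y) / stats.norm.pdf(y, loc=threshold, scale=1.0)
        p_is = np.mean((y > threshold) * w)

        print(f"exact               = {stats.norm.sf(threshold):.3e}")
        print(f"crude Monte Carlo   = {p_mc:.3e}")
        print(f"importance sampling = {p_is:.3e}")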

  2. Estimation of Actual Tensile Strength of Dam Concrete Based on Small Probability Event Method

    Institute of Scientific and Technical Information of China (English)

    黄耀英; 郑宏; 周宜红; 武先伟

    2012-01-01

    The formula for converting the strains measured by a strain gauge group into actual stresses is analyzed in combination with a three-dimensional (3D) elastic creep simulation. The formulas currently used to compute 3D actual stresses from strain gauge group measurements are found to be imperfect, and a theoretically rigorous formula for transforming the measured strains into 3D actual stresses is given. A small probability event method is then proposed for estimating the actual tensile strength of dam concrete; using the measurements of strain gauge groups and no-stress meters embedded in a concrete dam, the estimation of the actual tensile strength and ultimate tensile strain is explored. Engineering practice shows that, once a long series of strain gauge measurements and a sufficient number of measurement samples are available, the small probability event method yields tensile strength and ultimate tensile strain values consistent with the actual condition of the dam concrete.

  3. Exact Probability Distribution versus Entropy

    Directory of Open Access Journals (Sweden)

    Kerstin Andersson

    2014-10-01

    Full Text Available The problem addressed concerns the determination of the average number of successive attempts of guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
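
    For a small alphabet and short words, the quantity discussed above can be computed exactly by brute force, as in the hedged sketch below: enumerate all words, sort them by probability, and take the expected rank under the optimal guessing order. The toy alphabet, letter probabilities, and word length are invented and far smaller than the realistic sizes considered in the article.

        import itertools
        import numpy as np

        letter_probs = {"a": 0.5, "b": 0.3, "c": 0.2}   # toy first-order model
        word_len = 4

        words = itertools.product(letter_probs, repeat=word_len)
        p = np.array([np.prod([letter_probs[ch] for ch in w]) for w in words])
        p.sort()
        p = p[::-1]                                     # guess in decreasing order of probability
        avg_guesses = np.sum(p * np.arange(1, p.size + 1))
        print(f"average number of guesses ~ {avg_guesses:.2f} out of {p.size} words")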

  4. What Are Probability Surveys?

    Science.gov (United States)

    The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.

  5. Introduction to probability

    CERN Document Server

    Roussas, George G

    2006-01-01

    Roussas's Introduction to Probability features exceptionally clear explanations of the mathematics of probability theory and explores its diverse applications through numerous interesting and motivational examples. It provides a thorough introduction to the subject for professionals and advanced students taking their first course in probability. The content is based on the introductory chapters of Roussas's book, An Intoduction to Probability and Statistical Inference, with additional chapters and revisions. Written by a well-respected author known for great exposition an

  6. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  7. Dependent Probability Spaces

    Science.gov (United States)

    Edwards, William F.; Shiflett, Ray C.; Shultz, Harris

    2008-01-01

    The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…

  8. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned

  9. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  10. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    2013-01-01

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned prob

  11. Estimation of chemical device failure probability based on a Monte Carlo simulation algorithm

    Institute of Scientific and Technical Information of China (English)

    窦站; 蒋军成; 朱常龙; 张明广; 赵声萍

    2012-01-01

    A method based on Monte Carlo simulation (MCS) is proposed for estimating the failure probability of chemical devices, in particular devices whose failures are characterized by domino effects. The instantaneous failure probability is first studied theoretically: a formula is derived from reliability theory, but this analytical approach suffers from complicated expressions and calculation errors. A Monte Carlo simulation procedure for estimating the failure probability of chemical devices is then constructed, and the corresponding program flow implemented in MATLAB is given. Finally, the instantaneous failure probability of a device is computed in a case study, and the results of the analytical and Monte Carlo algorithms are compared. The results show that the Monte Carlo algorithm is acceptable overall and that, as the number of devices increases, the instantaneous failure probability of the device declines more rapidly, which is consistent with practice.
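
    The following is a generic Monte Carlo sketch (not the paper's MATLAB model) of how a failure probability can be estimated by simulation when a primary failure may escalate to neighbouring units, a crude stand-in for a domino effect. The number of units and all probabilities are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(1)
        n_units = 5          # hypothetical number of devices in a row
        p_primary = 0.01     # illustrative primary failure probability per device
        p_escalate = 0.20    # illustrative probability a failed device knocks out a neighbour

        def one_trial():
            failed = rng.random(n_units) < p_primary
            for i in np.flatnonzero(failed):          # one escalation round to neighbours
                for j in (i - 1, i + 1):
                    if 0 <= j < n_units and rng.random() < p_escalate:
                        failed[j] = True
            return failed[n_units // 2]               # did the middle device fail?

        n_sim = 100_000
        p_fail = np.mean([one_trial() for _ in range(n_sim)])
        print(f"estimated failure probability of the middle device ~ {p_fail:.4f}")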

  12. Dynamical Simulation of Probabilities

    Science.gov (United States)

    Zak, Michail

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed. Special attention was focused upon coupled stochastic processes, defined in terms of conditional probabilities, for which the joint probability does not exist. Simulations of quantum probabilities are also discussed.

  13. Philosophy and probability

    CERN Document Server

    Childers, Timothy

    2013-01-01

    Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to the philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability-frequentist, propensity, classical, Bayesian, and objective Bayesian-and uses stimulating examples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,

  14. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Taking the Gaussian distribution into account, the author estimates the probability that the relative displacement of the isolated mass stays below the vibration criteria. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are developed such that the possibility of exceeding vibration criteria VC-E and VC-D is kept below 0.04.
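
    A hedged sketch of the basic calculation behind the abstract above: if the relative displacement of the isolated mass is modelled as zero-mean Gaussian with standard deviation sigma (which depends on damping and natural frequency), the chance of exceeding a vibration criterion vc in magnitude is 2*P(Z > vc/sigma). The numerical values below are invented for illustration.

        from scipy.stats import norm

        sigma_um = 0.12   # hypothetical RMS relative displacement, micrometres
        vc_um = 0.25      # hypothetical vibration criterion amplitude, micrometres

        # Two-sided exceedance probability for a zero-mean Gaussian displacement.
        p_exceed = 2.0 * norm.sf(vc_um / sigma_um)
        print(f"probability of exceeding the criterion ~ {p_exceed:.3f}")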

  15. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  16. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  17. Probability and radical behaviorism

    Science.gov (United States)

    Espinosa, James M.

    1992-01-01

    The concept of probability appears to be very important in the radical behaviorism of Skinner. Yet, it seems that this probability has not been accurately defined and is still ambiguous. I give a strict, relative frequency interpretation of probability and its applicability to the data from the science of behavior as supplied by cumulative records. Two examples of stochastic processes are given that may model the data from cumulative records that result under conditions of continuous reinforcement and extinction, respectively. PMID:22478114

  18. Probability and radical behaviorism

    OpenAIRE

    Espinosa, James M.

    1992-01-01

    The concept of probability appears to be very important in the radical behaviorism of Skinner. Yet, it seems that this probability has not been accurately defined and is still ambiguous. I give a strict, relative frequency interpretation of probability and its applicability to the data from the science of behavior as supplied by cumulative records. Two examples of stochastic processes are given that may model the data from cumulative records that result under conditions of continuous reinforc...

  19. ESTIMATE THE PROBABILITY OF WHETHER A PROTEIN FAMILY CONTRIBUTES A NEW FOLD

    Institute of Scientific and Technical Information of China (English)

    吕波; 刘心声

    2011-01-01

    Statistical inference about protein families, structures, and new functions is a frontier research area in applied statistics. Based on the SCOP (Structural Classification of Proteins) database and the Pfam sequence classification database, together with the dynamic information of the SCOP database, we first estimate the total number of folds needed to cover the current Pfam database. By constructing a Bayesian model whose prior information is whether the folds covering the Pfam families mapped by newly appearing SCOP families were previously known, we then estimate the probability that a Pfam family of a given size contributes a new fold.

  20. PROBABILITY AND STATISTICS.

    Science.gov (United States)

    Descriptors: statistical analysis, reports, probability, information theory, differential equations, statistical processes, stochastic processes, multivariate analysis, distribution theory, decision theory, measure theory, optimization.

  1. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  2. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  3. On Quantum Conditional Probability

    Directory of Open Access Journals (Sweden)

    Isabel Guerra Bobo

    2013-02-01

    Full Text Available We argue that quantum theory does not allow for a generalization of the notion of classical conditional probability by showing that the probability defined by the Lüders rule, standardly interpreted in the literature as the quantum-mechanical conditionalization rule, cannot be interpreted as such.

  4. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...

  5. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  7. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...

  8. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  9. Recent Developments in Applied Probability and Statistics

    CERN Document Server

    Devroye, Luc; Kohler, Michael; Korn, Ralf

    2010-01-01

    This book presents surveys on recent developments in applied probability and statistics. The contributions include topics such as nonparametric regression and density estimation, option pricing, probabilistic methods for multivariate interpolation, robust graphical modelling and stochastic differential equations. Due to its broad coverage of different topics the book offers an excellent overview of recent developments in applied probability and statistics.

  10. Integrated statistical modelling of spatial landslide probability

    Science.gov (United States)

    Mergili, M.; Chu, H.-J.

    2015-09-01

    Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) the landslide inventory is subset into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e. the spatial probability that at least one landslide pixel occurs within a zone of defined size, and quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km2 study area in southern Taiwan, using an inventory of 1399 landslides triggered by the typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability; and (iv) removing the largest landslides from the analysis leads to an enhanced model performance.
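
    A minimal sketch of the per-pixel combination rule stated in step (v) above: the integrated probability is the maximum of the pixel's release probability and the product of its impact probability with the zonal release probability. The three small arrays stand in for the raster layers of a study area and contain invented values.

        import numpy as np

        p_release = np.array([[0.02, 0.10], [0.00, 0.05]])   # pixel release probability
        p_impact  = np.array([[0.60, 0.20], [0.80, 0.40]])   # probability of being reached
        p_zonal   = np.array([[0.30, 0.30], [0.50, 0.50]])   # zonal release probability

        p_integrated = np.maximum(p_release, p_impact * p_zonal)
        print(p_integrated)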

  11. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  12. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  13. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, more. For advanced undergraduates students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  14. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  15. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  16. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  17. THE NUCLEAR ENCOUNTER PROBABILITY

    NARCIS (Netherlands)

    SMULDERS, PJM

    1994-01-01

    This Letter discusses the nuclear encounter probability as used in ion channeling analysis. A formulation is given, incorporating effects of large beam angles and beam divergence. A critical examination of previous definitions is made.

  18. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  19. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  20. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  1. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  2. Monte Carlo transition probabilities

    OpenAIRE

    Lucy, L. B.

    2001-01-01

    Transition probabilities governing the interaction of energy packets and matter are derived that allow Monte Carlo NLTE transfer codes to be constructed without simplifying the treatment of line formation. These probabilities are such that the Monte Carlo calculation asymptotically recovers the local emissivity of a gas in statistical equilibrium. Numerical experiments with one-point statistical equilibrium problems for Fe II and Hydrogen confirm this asymptotic behaviour. In addition, the re...

  3. UT Biomedical Informatics Lab (BMIL probability wheel

    Directory of Open Access Journals (Sweden)

    Sheng-Cheng Huang

    2016-01-01

    Full Text Available A probability wheel app is intended to facilitate communication between two people, an “investigator” and a “participant”, about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.

  4. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  6. PROBABILITY SAMPLING DESIGNS FOR VETERINARY EPIDEMIOLOGY

    OpenAIRE

    Xhelil Koleci; Coryn, Chris L.S.; Kristin A. Hobson; Rruzhdi Keci

    2011-01-01

    The objective of sampling is to estimate population parameters, such as incidence or prevalence, from information contained in a sample. In this paper, the authors describe sources of error in sampling; basic probability sampling designs, including simple random sampling, stratified sampling, systematic sampling, and cluster sampling; estimating a population size if unknown; and factors influencing sample size determination for epidemiological studies in veterinary medicine.

  7. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more -- these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  8. Carbonic anhydrase IX and response to postmastectomy radiotherapy in high-risk breast cancer: a subgroup analysis of the DBCG82 b and c trials

    DEFF Research Database (Denmark)

    Kyndi, M.; Sorensen, F.B.; Alsner, J.;

    2008-01-01

    -points were loco-regional recurrence, distant metastases, disease-specific survival and overall survival. Statistical analyses included kappa statistics, chi(2) or exact tests, Kaplan-Meier probability plots, Log-rank test and Cox regression analyses. Results CA IX was assessable in 945 cores. The percentage...

  9. Carbonic anhydrase IX and response to postmastectomy radiotherapy in high-risk breast cancer: a subgroup analysis of the DBCG82 b and c trials

    DEFF Research Database (Denmark)

    Kyndi, Marianne; Sørensen, Flemming Brandt; Knudsen, Helle;

    2008-01-01

    -points were loco-regional recurrence, distant metastases, disease-specific survival and overall survival. Statistical analyses included kappa statistics, chi2 or exact tests, Kaplan-Meier probability plots, Log-rank test and Cox regression analyses. RESULTS: CA IX was assessable in 945 cores. The percentage...

  10. Interim analysis for binary outcome trials with a long fixed follow-up time and repeated outcome assessments at pre-specified times.

    Science.gov (United States)

    Parpia, Sameer; Julian, Jim A; Gu, Chushu; Thabane, Lehana; Levine, Mark N

    2014-01-01

    In trials with binary outcomes, assessed repeatedly at pre-specified times and where the subject is considered to have experienced a failure at the first occurrence of the outcome, interim analyses are performed, generally, after half or more of the subjects have completed follow-up. Depending on the duration of accrual relative to the length of follow-up, this may be inefficient, since there is a possibility that the trial will have completed accrual prior to the interim analysis. An alternative is to plan the interim analysis after subjects have completed follow-up to a time that is less than the fixed full follow-up duration. Using simulations, we evaluated three methods to estimate the event proportion for the interim analysis in terms of type I and II errors and the probability of early stopping. We considered: 1) estimation of the event proportion based on subjects who have been followed for a pre-specified time (less than the full follow-up duration) or who experienced the outcome; 2) estimation of the event proportion based on data from all subjects that have been randomized by the time of the interim analysis; and 3) the Kaplan-Meier approach to estimate the event proportion at the time of the interim analysis. Our results show that all methods preserve, and have comparable, type I and II errors in certain scenarios. In these cases, we recommend using the Kaplan-Meier method because it incorporates all the available data and has a greater probability of early stopping when a treatment effect exists.
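
    As a rough sketch of the third (Kaplan-Meier) approach, the code below estimates the event proportion at an interim analysis from partially followed-up subjects. The data, follow-up times and the interim and full follow-up durations are all invented, and the lifelines package is assumed to be available.

        import numpy as np
        from lifelines import KaplanMeierFitter

        rng = np.random.default_rng(0)

        # Hypothetical trial snapshot at the interim analysis: latent time to first
        # event (months), censored at each subject's accrued follow-up if no event yet.
        n = 300
        event_time = rng.exponential(scale=36.0, size=n)
        followup = rng.uniform(0.0, 24.0, size=n)
        observed = event_time <= followup
        time = np.where(observed, event_time, followup)

        km = KaplanMeierFitter()
        km.fit(durations=time, event_observed=observed)

        t_full = 18.0  # fixed full follow-up duration (months), made up
        event_proportion = 1.0 - km.survival_function_at_times(t_full).iloc[0]
        print(f"Kaplan-Meier event proportion by {t_full} months: {event_proportion:.3f}")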

  11. Probabilities from Envariance

    CERN Document Server

    Zurek, W H

    2004-01-01

    I show how probabilities arise in quantum physics by exploring implications of environment-assisted invariance or envariance, a recently discovered symmetry exhibited by entangled quantum systems. Envariance of perfectly entangled states can be used to rigorously justify complete ignorance of the observer about the outcome of any measurement on either of the members of the entangled pair. Envariance leads to Born's rule, $p_k \propto |\psi_k|^2$. Probabilities derived in this manner are an objective reflection of the underlying state of the system -- they reflect experimentally verifiable symmetries, and not just a subjective "state of knowledge" of the observer. The envariance-based approach is compared with and found superior to the key pre-quantum definitions of probability, including the standard definition based on the 'principle of indifference' due to Laplace, and the relative frequency approach advocated by von Mises. Implications of envariance for the interpretation of quantu...
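
    Whatever its derivation, Born's rule itself is a one-line computation: the outcome probabilities are the normalized squared moduli of the amplitudes. A minimal sketch with made-up amplitudes:

        import numpy as np

        # Hypothetical (unnormalized) amplitudes of a three-outcome measurement.
        psi = np.array([1 + 1j, 2 - 0.5j, 0.3j])

        # Born's rule: p_k proportional to |psi_k|^2, normalized to sum to one.
        p = np.abs(psi) ** 2
        p /= p.sum()
        print(p, p.sum())  # probabilities and their total (1.0)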

  12. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...

  13. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications...
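
    A concrete example of such a generating function is the log-sum-exp function underlying the multinomial logit model: up to an additive constant it is a CPGF, and its gradient with respect to the utilities is exactly the vector of choice probabilities (the softmax). A small sketch with arbitrary utilities:

        import numpy as np

        def cpgf_logit(u):
            """Log-sum-exp of the utilities (the multinomial-logit generating function)."""
            return np.log(np.sum(np.exp(u)))

        def choice_probabilities(u):
            """Gradient of the log-sum-exp = softmax of the utilities."""
            e = np.exp(u - np.max(u))  # shift for numerical stability
            return e / e.sum()

        u = np.array([1.0, 0.2, -0.5])  # hypothetical systematic utilities
        print(cpgf_logit(u), choice_probabilities(u))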

  14. Negative Probabilities and Contextuality

    CERN Document Server

    de Barros, J Acacio; Oas, Gary

    2015-01-01

    There has been a growing interest, both in physics and psychology, in understanding contextuality in experimentally observed quantities. Different approaches have been proposed to deal with contextual systems, and a promising one is contextuality-by-default, put forth by Dzhafarov and Kujala. The goal of this paper is to present a tutorial on a different approach: negative probabilities. We do so by presenting the overall theory of negative probabilities in a way that is consistent with contextuality-by-default and by examining with this theory some simple examples where contextuality appears, both in physics and psychology.

  15. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  16. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  17. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  18. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  19. Varga: On Probability.

    Science.gov (United States)

    Varga, Tamas

    This booklet resulted from a 1980 visit by the author, a Hungarian mathematics educator, to the Teachers' Center Project at Southern Illinois University at Edwardsville. Included are activities and problems that make probablility concepts accessible to young children. The topics considered are: two probability games; choosing two beads; matching…

  20. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...

  1. On Probability Domains

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2010-12-01

    Motivated by IF-probability theory (intuitionistic fuzzy), we study n-component probability domains in which each event represents a body of competing components and the range of a state represents a simplex $S_n$ of n-tuples of possible rewards -- the sum of the rewards is a number from [0,1]. For n=1 we get fuzzy events, for example a bold algebra, and the corresponding fuzzy probability theory can be developed within the category ID of D-posets (equivalently effect algebras) of fuzzy sets and sequentially continuous D-homomorphisms. For n=2 we get IF-events, i.e., pairs $(\mu, \nu)$ of fuzzy sets $\mu, \nu \in [0,1]^X$ such that $\mu(x) + \nu(x) \le 1$ for all $x \in X$, but we order our pairs (events) coordinatewise. Hence the structure of IF-events (where $(\mu_1, \nu_1) \le (\mu_2, \nu_2)$ whenever $\mu_1 \le \mu_2$ and $\nu_2 \le \nu_1$) is different and, consequently, the resulting IF-probability theory models a different principle. The category ID is cogenerated by I=[0,1] (objects of ID are subobjects of powers $I^X$), has nice properties, and basic probabilistic notions and constructions are categorical. For example, states are morphisms. We introduce the category $S_n D$ cogenerated by $S_n = \{(x_1, x_2, \ldots, x_n) \in I^n; \sum_{i=1}^{n} x_i \le 1\}$ carrying the coordinatewise partial order, difference, and sequential convergence, and we show how basic probability notions can be defined within $S_n D$.

  2. Epidemiology of cryptosporidiosis among European AIDS patients

    DEFF Research Database (Denmark)

    Pedersen, C; Danner, S; Lazzarin, A;

    1996-01-01

    OBJECTIVE: To study epidemiology and possible risk factors associated with the development of cryptosporidiosis among European patients with AIDS. METHODS: An inception cohort of 6548 patients with AIDS, consecutively diagnosed from 1979 to 1989, from 52 centres in 17 European countries was studied....... Data on all AIDS defining events were collected retrospectively from patients' clinical records. Kaplan-Meier estimates, log rank tests and Cox proportional hazard models were used to examine for possible risk factors associated with cryptosporidiosis. RESULTS: Cryptosporidiosis was diagnosed in 432 (6.6%) patients, 216 at time of the AIDS diagnosis and 216 during follow-up. The probability of being diagnosed with cryptosporidiosis at AIDS diagnosis was significantly lower for intravenous drug users (1.3%) than for homosexual men (4.1%) and for patients belonging to other transmission categories (4.0%) (p...

  3. Stavudine- and nevirapine-related drug toxicity while on generic fixed-dose antiretroviral treatment: incidence, timing and risk factors in a three-year cohort in Kigali, Rwanda.

    Science.gov (United States)

    van Griensven, Johan; Zachariah, Rony; Rasschaert, Freya; Mugabo, Jules; Atté, Edi F; Reid, Tony

    2010-02-01

    This cohort study was conducted to report on the incidence, timing and risk factors for stavudine (d4T)- and nevirapine (NVP)-related severe drug toxicity (requiring substitution) with a generic fixed-dose combination under program conditions in Kigali, Rwanda. Probability of 'time to first toxicity-related drug substitution' was estimated using the Kaplan-Meier method and Cox-proportional hazards modeling was used to identify risk factors. Out of 2190 adults (median follow-up: 1.5 years), d4T was replaced in 175 patients (8.0%) for neuropathy, 69 (3.1%) for lactic acidosis and 157 (7.2%) for lipoatrophy, which was the most frequent toxicity by 3 years of antiretroviral treatment (ART). NVP was substituted in 4.9 and 1.3% of patients for skin rash and hepatotoxicity, respectively. Use of d4T 40 mg was associated with increased risk of lipoatrophy and early (strategies.
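
    The kind of analysis described above (a Kaplan-Meier curve for time to first toxicity-related substitution plus a Cox model for a risk factor such as d4T dose) can be sketched as follows; the data frame, follow-up times and column names are invented, and lifelines is assumed.

        import pandas as pd
        from lifelines import KaplanMeierFitter, CoxPHFitter

        # Hypothetical cohort: follow-up (years), substitution indicator, d4T 40 mg flag.
        df = pd.DataFrame({
            "years":       [0.5, 1.2, 2.8, 3.0, 1.9, 0.7, 2.5, 3.0, 1.1, 2.2, 0.9, 3.0],
            "substituted": [1,   0,   1,   0,   1,   0,   0,   1,   1,   0,   0,   1],
            "d4T_40mg":    [1,   0,   1,   0,   1,   1,   0,   0,   1,   0,   1,   0],
        })

        km = KaplanMeierFitter().fit(df["years"], df["substituted"])
        print(km.survival_function_)  # probability of remaining free of substitution

        cox = CoxPHFitter().fit(df, duration_col="years", event_col="substituted")
        cox.print_summary()  # hazard ratio for the d4T 40 mg flag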

  4. Improved fertility following conservative surgical treatment of ectopic pregnancy

    DEFF Research Database (Denmark)

    Bangsgaard, Nannie; Lund, Claus Otto; Ottesen, Bent;

    2003-01-01

    OBJECTIVE: To evaluate fertility after salpingectomy or tubotomy for ectopic pregnancy. DESIGN: Retrospective cohort study. SETTING: Clinical University Center, Hvidovre Hospital, Copenhagen. POPULATION: Two hundred and seventy-six women undergoing salpingectomy or tubotomy for their first ectopic...... pregnancy between January 1992 and January 1999 and who actively attempted to conceive were followed for a minimum of 18 months. METHODS: Retrospective cohort study combined with questionnaire to compare reproductive outcome following salpingectomy or tubotomy for ectopic pregnancy. Cumulative probabilities...... of pregnancy for each group were calculated by the Kaplan-Meier estimator and compared by Cox regression analysis to control for potential confounders. MAIN OUTCOME MEASURES: Intrauterine pregnancy rates and recurrence rates of ectopic pregnancy after surgery for ectopic pregnancy. RESULTS: The cumulative...

  5. Epidemiology of cryptosporidiosis among European AIDS patients

    DEFF Research Database (Denmark)

    Pedersen, C; Danner, S; Lazzarin, A

    1996-01-01

    OBJECTIVE: To study epidemiology and possible risk factors associated with the development of cryptosporidiosis among European patients with AIDS. METHODS: An inception cohort of 6548 patients with AIDS, consecutively diagnosed from 1979 to 1989, from 52 centres in 17 European countries was studied....... Data on all AIDS defining events were collected retrospectively from patients' clinical records. Kaplan-Meier estimates, log rank tests and Cox proportional hazard models were used to examine for possible risk factors associated with cryptosporidiosis. RESULTS: Cryptosporidiosis was diagnosed in 432 (6.6%) patients, 216 at time of the AIDS diagnosis and 216 during follow-up. The probability of being diagnosed with cryptosporidiosis at AIDS diagnosis was significantly lower for intravenous drug users (1.3%) than for homosexual men (4.1%) and for patients belonging to other transmission categories (4.0%) (p...

  6. Improved fertility following conservative surgical treatment of ectopic pregnancy

    DEFF Research Database (Denmark)

    Bangsgaard, Nannie; Lund, Claus Otto; Ottesen, Bent

    2003-01-01

    OBJECTIVE: To evaluate fertility after salpingectomy or tubotomy for ectopic pregnancy. DESIGN: Retrospective cohort study. SETTING: Clinical University Center, Hvidovre Hospital, Copenhagen. POPULATION: Two hundred and seventy-six women undergoing salpingectomy or tubotomy for their first ectopic...... pregnancy between January 1992 and January 1999 and who actively attempted to conceive were followed for a minimum of 18 months. METHODS: Retrospective cohort study combined with questionnaire to compare reproductive outcome following salpingectomy or tubotomy for ectopic pregnancy. Cumulative probabilities...... of pregnancy for each group were calculated by the Kaplan-Meier estimator and compared by Cox regression analysis to control for potential confounders. MAIN OUTCOME MEASURES: Intrauterine pregnancy rates and recurrence rates of ectopic pregnancy after surgery for ectopic pregnancy. RESULTS: The cumulative...

  7. Understanding Y haplotype matching probability.

    Science.gov (United States)

    Brenner, Charles H

    2014-01-01

    The Y haplotype population-genetic terrain is better explored from a fresh perspective rather than by analogy with the more familiar autosomal ideas. For haplotype matching probabilities, versus for autosomal matching probabilities, explicit attention to modelling - such as how evolution got us where we are - is much more important while consideration of population frequency is much less so. This paper explores, extends, and explains some of the concepts of "Fundamental problem of forensic mathematics - the evidential strength of a rare haplotype match". That earlier paper presented and validated a "kappa method" formula for the evidential strength when a suspect matches a previously unseen haplotype (such as a Y-haplotype) at the crime scene. Mathematical implications of the kappa method are intuitive and reasonable. Suspicions to the contrary that have been raised rest on elementary errors. Critical to deriving the kappa method or any sensible evidential calculation is understanding that thinking about haplotype population frequency is a red herring; the pivotal question is one of matching probability. But confusion between the two is unfortunately institutionalized in much of the forensic world. Examples make clear why (matching) probability is not (population) frequency and why uncertainty intervals on matching probabilities are merely confused thinking. Forensic matching calculations should be based on a model, on stipulated premises. The model inevitably only approximates reality, and any error in the results comes only from error in the model, the inexactness of the approximation. Sampling variation does not measure that inexactness and hence is not helpful in explaining evidence and is in fact an impediment. Alternative haplotype matching probability approaches that various authors have considered are reviewed. Some are based on no model and cannot be taken seriously. For the others, some evaluation of the models is discussed. Recent evidence supports the adequacy of

  8. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  10. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic as a model within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
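
    A minimal sketch of the modelling step described above: a logistic regression that turns building attributes into a fire probability suitable for mapping. The attributes and data are simulated stand-ins (real labels would come from historical incident records), and scikit-learn is assumed.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(42)

        # Hypothetical building attributes: floor area (m^2), age (years), solid-fuel heating flag.
        n = 500
        X = np.column_stack([
            rng.uniform(50, 2000, n),
            rng.uniform(0, 120, n),
            rng.integers(0, 2, n),
        ])
        # Simulated fire incidents for illustration only.
        logits = -6 + 0.001 * X[:, 0] + 0.01 * X[:, 1] + 1.2 * X[:, 2]
        y = rng.random(n) < 1 / (1 + np.exp(-logits))

        model = LogisticRegression(max_iter=1000).fit(X, y)
        p_fire = model.predict_proba(X)[:, 1]  # per-building fire probability for the map
        print(p_fire[:5])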

  11. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  12. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  13. Superpositions of probability distributions

    Science.gov (United States)

    Jizba, Petr; Kleinert, Hagen

    2008-09-01

    Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v=σ2 play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much easier than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
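
    A quick numerical illustration of such a superposition: drawing the variance from a smearing distribution and then sampling a Gaussian with that variance produces a heavy-tailed mixture. The gamma mixing law and its parameters below are chosen arbitrarily for illustration.

        import numpy as np

        rng = np.random.default_rng(7)

        # Draw a variance v from a smearing distribution, then x ~ N(0, v).
        n = 100_000
        v = rng.gamma(shape=2.0, scale=1.0, size=n)   # arbitrary smearing density in v
        x = rng.normal(loc=0.0, scale=np.sqrt(v))     # variance mixture of Gaussians

        # Compare tail weight with a single Gaussian of the same overall variance.
        g = rng.normal(0.0, np.sqrt(v.mean()), size=n)
        print(np.mean(np.abs(x) > 4), np.mean(np.abs(g) > 4))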

  14. Superpositions of probability distributions.

    Science.gov (United States)

    Jizba, Petr; Kleinert, Hagen

    2008-09-01

    Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v=σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much easier than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.

  15. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  16. Fractal probability laws.

    Science.gov (United States)

    Eliazar, Iddo; Klafter, Joseph

    2008-06-01

    We explore six classes of fractal probability laws defined on the positive half-line: Weibull, Fréchet, Lévy, hyper Pareto, hyper beta, and hyper shot noise. Each of these classes admits a unique statistical power-law structure, and is uniquely associated with a certain operation of renormalization. All six classes turn out to be one-dimensional projections of underlying Poisson processes which, in turn, are the unique fixed points of Poissonian renormalizations. The first three classes correspond to linear Poissonian renormalizations and are intimately related to extreme value theory (Weibull, Fréchet) and to the central limit theorem (Lévy). The other three classes correspond to nonlinear Poissonian renormalizations. Pareto's law--commonly perceived as the "universal fractal probability distribution"--is merely a special case of the hyper Pareto class.
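
    The first three classes are easy to sample by inverse-transform sampling from a uniform variate, as the short sketch below shows; the tail exponent is arbitrary.

        import numpy as np

        rng = np.random.default_rng(3)
        u = rng.random(100_000)
        alpha = 1.5  # arbitrary tail exponent

        weibull = (-np.log(u)) ** (1 / alpha)    # survival function exp(-x**alpha)
        frechet = (-np.log(u)) ** (-1 / alpha)   # distribution function exp(-x**(-alpha))
        pareto  = u ** (-1 / alpha)              # survival function x**(-alpha), x >= 1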

  17. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  18. Searching with Probabilities

    Science.gov (United States)

    1983-07-26

    DeGroot, Morris H. Probability and Statistics. Addison-Wesley Publishing Company, Reading, Massachusetts, 1975. [Gillogly 78] Gillogly, J.J. Performance... distribution [DeGroot 75] has just begun. The beta distribution has several features that might make it a more reasonable choice. As with the normal-based... 1982. [Cooley 65] Cooley, J.M. and Tukey, J.W. An algorithm for the machine calculation of complex Fourier series. Math. Comp. 19, 1965. [DeGroot 75]

  19. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  20. Asymptotic estimation of the ruin probability for a correlated insurance risk model perturbed by diffusion

    Institute of Scientific and Technical Information of China (English)

    陈安平; 王传玉; 郭红财

    2011-01-01

    This paper studies a correlated two-type insurance risk model perturbed by diffusion. Using stochastic-process methods, the integro-differential equations satisfied by the survival probabilities under the assumed model are derived, together with an upper bound on the ruin probabilities.
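
    Analytic bounds of this kind can be checked crudely by Monte Carlo. The sketch below estimates a finite-horizon ruin probability for a single-class diffusion-perturbed compound Poisson surplus (not the two-type model of the paper); every parameter is invented.

        import numpy as np

        rng = np.random.default_rng(1)

        def ruin_probability(u0=10.0, c=1.2, lam=1.0, mean_claim=1.0,
                             sigma=0.5, horizon=100.0, dt=0.05, n_paths=10_000):
            """Monte Carlo estimate of the finite-horizon ruin probability for the
            diffusion-perturbed surplus U(t) = u0 + c*t - aggregate claims + sigma*B(t)."""
            steps = int(horizon / dt)
            u = np.full(n_paths, u0)
            ruined = np.zeros(n_paths, dtype=bool)
            for _ in range(steps):
                k = rng.poisson(lam * dt, n_paths)  # number of claims in this step
                agg = np.where(k > 0, rng.gamma(np.maximum(k, 1), mean_claim), 0.0)
                u = u + c * dt - agg + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
                ruined |= u < 0  # sticky flag: ruin is first passage below zero
            return ruined.mean()

        print(ruin_probability())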

  1. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  2. Probabilities of exoplanet signals from posterior samplings

    CERN Document Server

    Tuomi, Mikko

    2011-01-01

    Estimating the marginal likelihoods is an essential feature of model selection in the Bayesian context. It is especially crucial to have good estimates when assessing the number of planets orbiting stars when the models explain the noisy data with different numbers of Keplerian signals. We introduce a simple method for approximating the marginal likelihoods in practice when a statistically representative sample from the parameter posterior density is available. We use our truncated posterior mixture estimate to receive accurate model probabilities for models with differing number of Keplerian signals in radial velocity data. We test this estimate in simple scenarios to assess its accuracy and rate of convergence in practice when the corresponding estimates calculated using deviance information criterion can be applied to receive trustworthy results for reliable comparison. As a test case, we determine the posterior probability of a planet orbiting HD 3651 given Lick and Keck radial velocity data. The posterio...
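
    However the marginal likelihoods are approximated, converting them into posterior model probabilities is simple Bayesian arithmetic: under equal prior model probabilities, p(M_k | d) is proportional to p(d | M_k). A minimal sketch with invented log-evidences for models with 0, 1 and 2 Keplerian signals:

        import numpy as np

        # Hypothetical log marginal likelihoods (log-evidences) for k = 0, 1, 2 planets.
        log_evidence = np.array([-350.2, -341.7, -340.9])

        # Posterior model probabilities under equal priors (shifted for stability).
        w = np.exp(log_evidence - log_evidence.max())
        post = w / w.sum()
        print(dict(zip([0, 1, 2], post.round(3))))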

  3. Site occupancy models with heterogeneous detection probabilities

    Science.gov (United States)

    Royle, J. Andrew

    2006-01-01

    Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these "site occupancy" models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123-1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
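
    The simplest version of this likelihood (constant occupancy psi and detection probability p, J visits per site) is the zero-inflated binomial mentioned above. Below is a minimal sketch of its negative log-likelihood and a maximum-likelihood fit on simulated data; the true parameter values are arbitrary.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import binom

        rng = np.random.default_rng(5)

        # Simulate detection histories at n sites with J visits each.
        n, J, psi_true, p_true = 200, 4, 0.6, 0.3
        occupied = rng.random(n) < psi_true
        y = rng.binomial(J, p_true * occupied)  # detections per site

        def negloglik(theta):
            psi, p = 1 / (1 + np.exp(-theta))   # logit parameterization keeps values in (0, 1)
            lik = psi * binom.pmf(y, J, p) + (1 - psi) * (y == 0)
            return -np.sum(np.log(lik))

        fit = minimize(negloglik, x0=np.zeros(2))
        psi_hat, p_hat = 1 / (1 + np.exp(-fit.x))
        print(psi_hat, p_hat)  # should be close to 0.6 and 0.3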

  4. Good outcome of total hip replacement in patients with cerebral palsy

    Science.gov (United States)

    King, Garry; Hunt, Linda P; Wilkinson, J Mark; Blom, Ashley W

    2016-01-01

    Background and purpose — People with cerebral palsy (CP) often have painful deformed hips, but they are seldom treated with hip replacement as the surgery is considered to be high risk. However, few data are available on the outcome of hip replacement in these patients. Patients and methods — We linked Hospital Episode Statistics (HES) records to the National Joint Registry for England and Wales to identify 389 patients with CP who had undergone hip replacement. Their treatment and outcomes were compared with those of 425,813 patients who did not have CP. Kaplan-Meier estimates were calculated to describe implant survivorship and the curves were compared using log-rank tests, with further stratification for age and implant type. Reasons for revision were quantified as patient-time incidence rates (PTIRs). Nationally collected patient-reported outcomes (PROMS) before and 6 months after operation were compared if available. Cumulative mortality (Kaplan-Meier) was estimated at 90 days and at 1, 3, and 5 years. Results — The cumulative probability of revision at 5 years post-surgery was 6.4% (95% CI: 3.8–11) in the CP cohort as opposed to 2.9% (CI 2.9–3%) in the non-CP cohort (p < 0.001). Patient-reported outcomes showed that CP patients had worse pain and function preoperatively, but had equivalent postoperative improvement. The median improvement in Oxford hip score at 6 months was 23 (IQR: 14–28) in CP and it was 21 (14–28) in non-CP patients. 91% of CP patients reported good or excellent satisfaction with their outcome. The cumulative probability of mortality for CP up to 7 years was similar to that in the controls after stratification for age and sex. Interpretation — Hip replacement for cerebral palsy appears to be safe and effective, although implant revision rates are higher than those in patients without cerebral palsy. PMID:26863583
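
    The survivorship comparison described above (Kaplan-Meier implant survival compared between cohorts with a log-rank test) can be sketched as follows; the follow-up times and revision indicators are simulated placeholders, and lifelines is assumed.

        import numpy as np
        from lifelines import KaplanMeierFitter
        from lifelines.statistics import logrank_test

        rng = np.random.default_rng(2)

        # Hypothetical follow-up (years) and revision indicators for two cohorts.
        t_cp, e_cp = rng.exponential(40, 300), rng.random(300) < 0.10
        t_other, e_other = rng.exponential(80, 3000), rng.random(3000) < 0.05

        km = KaplanMeierFitter().fit(t_cp, e_cp, label="CP")
        print(1 - km.survival_function_at_times(5.0).iloc[0])  # 5-year revision probability

        result = logrank_test(t_cp, t_other, event_observed_A=e_cp, event_observed_B=e_other)
        print(result.p_value)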

  5. Age-dependent trends in postoperative mortality and preoperative comorbidity in isolated coronary artery bypass surgery

    DEFF Research Database (Denmark)

    Thorsteinsson, Kristinn; Fonager, Kirsten; Mérie, Charlotte;

    2016-01-01

    . Predictors of 30-day mortality were analysed in multivariable Cox proportional-hazards models and survival at 1 and 5 years was estimated by Kaplan-Meier curves. RESULTS: A total of 38 830 patients were included; the median age was 65.4 ± 9.5 years, increasing over time to 66.6 ± 9.5 years. Males comprised...

  6. Factors associated with the risk of secondary progression in multiple sclerosis

    NARCIS (Netherlands)

    Koch, M; Uyttenboogaart, M; van Harten, A; De Keyser, J

    2008-01-01

    Objective To investigate factors associated with the risk of secondary progression in relapsing-remitting onset multiple sclerosis (MS). Methods We used Kaplan-Meier survival analyses and a multivariable Cox regression model to estimate the influence of the factors: gender, age at disease onset, use

  7. Reoperation for urinary incontinence

    DEFF Research Database (Denmark)

    Foss Hansen, Margrethe; Lose, Gunnar; Kesmodel, Ulrik Schiøler

    2016-01-01

    on a nationwide population. STUDY DESIGN: We used the Danish National Patient Registry to identify women who had surgery for urinary incontinence from 1998 through 2007 and the outcome was a reoperation within 5 years. Kaplan-Meier curves were used to estimate the rate of reoperation for 6 types of surgery...

  8. The association between biventricular pacing and cardiac resynchronization therapy-defibrillator efficacy when compared with implantable cardioverter defibrillator on outcomes and reverse remodelling

    DEFF Research Database (Denmark)

    Ruwald, Anne-Christine; Kutyifa, Valentina; Ruwald, Martin H;

    2015-01-01

    : Using Kaplan-Meier plots, we estimated the threshold of BIV pacing percentage needed for CRT-D to be superior to ICD on the end-point of heart failure (HF) or death in 1219 left bundle branch block (LBBB) patients in the MADIT-CRT trial. Patients were censored at the time of crossover. In multivariable...

  9. A nationwide study of serous “borderline” ovarian tumors in Denmark 1978–2002

    DEFF Research Database (Denmark)

    Hannibal, Charlotte Gerd; Vang, Russell; Junge, Jette;

    2014-01-01

    as noninvasive or invasive. Medical records were collected from hospital departments and reviewed. Data were analyzed using Kaplan-Meier and relative survival was estimated with follow-up through September 2, 2013. RESULTS: A cohort of 1042 women with a confirmed SBT diagnosis was identified. Women with stage I...

  10. Applying Popper's Probability

    CERN Document Server

    Whiting, Alan B

    2014-01-01

    Professor Sir Karl Popper (1902-1994) was one of the most influential philosophers of science of the twentieth century, best known for his doctrine of falsifiability. His axiomatic formulation of probability, however, is unknown to current scientists, though it is championed by several current philosophers of science as superior to the familiar version. Applying his system to problems identified by himself and his supporters, it is shown that it does not have some features he intended and does not solve the problems they have identified.

  11. Probability for physicists

    CERN Document Server

    Sirca, Simon

    2016-01-01

    This book is designed as a practical and intuitive introduction to probability, statistics and random quantities for physicists. The book aims at getting to the main points by a clear, hands-on exposition supported by well-illustrated and worked-out examples. A strong focus on applications in physics and other natural sciences is maintained throughout. In addition to basic concepts of random variables, distributions, expected values and statistics, the book discusses the notions of entropy, Markov processes, and fundamentals of random number generation and Monte-Carlo methods.

  12. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

    From the integration of non-symmetric hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments were also obtained analytically.
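
    One common one-parameter deformation of this kind (written here in the form used in nonextensive statistics; the paper's own parametrization may differ) reduces to the ordinary logarithm and exponential as the parameter goes to zero:

        import numpy as np

        def gen_log(x, q):
            """One-parameter generalized logarithm: (x**q - 1)/q, -> ln(x) as q -> 0."""
            return np.log(x) if q == 0 else (x**q - 1.0) / q

        def gen_exp(x, q):
            """Inverse of gen_log: (1 + q*x)**(1/q), -> exp(x) as q -> 0."""
            return np.exp(x) if q == 0 else (1.0 + q * x) ** (1.0 / q)

        x = np.linspace(0.5, 3.0, 6)
        print(gen_log(x, 0.1))
        print(gen_exp(gen_log(x, 0.1), 0.1))  # recovers x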

  13. The correct estimate of the probability of false detection of the matched filter in weak-signal detection problems . II. Further results with application to a set of ALMA and ATCA data

    Science.gov (United States)

    Vio, R.; Vergès, C.; Andreani, P.

    2017-08-01

    The matched filter (MF) is one of the most popular and reliable techniques to detect signals of known structure and amplitude smaller than the level of the contaminating noise. Under the assumption of stationary Gaussian noise, MF maximizes the probability of detection subject to a constant probability of false detection or false alarm (PFA). This property relies upon a priori knowledge of the position of the searched signals, which is usually not available. Recently, it has been shown that when applied in its standard form, MF may severely underestimate the PFA. As a consequence, the statistical significance of features that belong to noise is overestimated and the resulting detections are actually spurious. For this reason, an alternative method of computing the PFA has been proposed that is based on the probability density function (PDF) of the peaks of an isotropic Gaussian random field. In this paper we further develop this method. In particular, we discuss the statistical meaning of the PFA and show that, although useful as a preliminary step in a detection procedure, it is not able to quantify the actual reliability of a specific detection. For this reason, a new quantity is introduced called the specific probability of false alarm (SPFA), which is able to carry out this computation. We show how this method works in targeted simulations and apply it to a few interferometric maps taken with the Atacama Large Millimeter/submillimeter Array (ALMA) and the Australia Telescope Compact Array (ATCA). We select a few potential new point sources and assign an accurate detection reliability to these sources.
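
    The effect described above (the standard per-position PFA understating the false-alarm rate once the signal position is searched over) is easy to reproduce numerically. The sketch below matched-filters pure Gaussian noise with an arbitrary template and compares the exceedance rate at a fixed position with the exceedance rate of the map maximum; all sizes and thresholds are made up.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(4)

        N, trials, pfa_nominal = 1000, 2000, 0.01
        template = np.exp(-0.5 * (np.arange(-10, 11) / 3.0) ** 2)
        template /= np.linalg.norm(template)  # unit-norm filter -> unit output variance

        threshold = norm.isf(pfa_nominal)     # per-position threshold for unit Gaussian noise

        exceed_fixed, exceed_max = 0, 0
        for _ in range(trials):
            noise = rng.standard_normal(N)
            mf = np.convolve(noise, template[::-1], mode="valid")  # matched-filter output
            exceed_fixed += mf[len(mf) // 2] > threshold           # position known a priori
            exceed_max += mf.max() > threshold                     # position searched over
        print(exceed_fixed / trials, exceed_max / trials)  # ~0.01 vs much larger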

  14. Marrakesh International Conference on Probability and Statistics

    CERN Document Server

    Ouassou, Idir; Rachdi, Mustapha

    2015-01-01

    This volume, which highlights recent advances in statistical methodology and applications, is divided into two main parts. The first part presents theoretical results on estimation techniques in functional statistics, while the second examines three key areas of application: estimation problems in queuing theory, an application in signal processing, and the copula approach to epidemiologic modelling. The book’s peer-reviewed contributions are based on papers originally presented at the Marrakesh International Conference on Probability and Statistics held in December 2013.

  15. Measure, integral and probability

    CERN Document Server

    Capiński, Marek

    2004-01-01

    Measure, Integral and Probability is a gentle introduction that makes measure and integration theory accessible to the average third-year undergraduate student. The ideas are developed at an easy pace in a form that is suitable for self-study, with an emphasis on clear explanations and concrete examples rather than abstract theory. For this second edition, the text has been thoroughly revised and expanded. New features include: · a substantial new chapter, featuring a constructive proof of the Radon-Nikodym theorem, an analysis of the structure of Lebesgue-Stieltjes measures, the Hahn-Jordan decomposition, and a brief introduction to martingales · key aspects of financial modelling, including the Black-Scholes formula, discussed briefly from a measure-theoretical perspective to help the reader understand the underlying mathematical framework. In addition, further exercises and examples are provided to encourage the reader to become directly involved with the material.

  16. Probabilities for Solar Siblings

    Science.gov (United States)

    Valtonen, Mauri; Bajkova, A. T.; Bobylev, V. V.; Mylläri, A.

    2015-02-01

    We have shown previously (Bobylev et al. Astron Lett 37:550-562, 2011) that some of the stars in the solar neighborhood today may have originated in the same star cluster as the Sun, and could thus be called Solar Siblings. In this work we investigate the sensitivity of this result to galactic models and to parameters of these models, and also extend the sample of orbits. There are a number of good candidates for the sibling category, but due to the long period of orbit evolution since the break-up of the birth cluster of the Sun, one can only attach probabilities of membership. We find that up to 10 % (but more likely around 1 %) of the members of the Sun's birth cluster could be still found within 100 pc from the Sun today.

  17. Probabilities for Solar Siblings

    CERN Document Server

    Valtonen, M; Bobylev, V V; Myllari, A

    2015-01-01

    We have shown previously (Bobylev et al 2011) that some of the stars in the Solar neighborhood today may have originated in the same star cluster as the Sun, and could thus be called Solar Siblings. In this work we investigate the sensitivity of this result to Galactic models and to parameters of these models, and also extend the sample of orbits. There are a number of good candidates for the Sibling category, but due to the long period of orbit evolution since the break-up of the birth cluster of the Sun, one can only attach probabilities of membership. We find that up to 10% (but more likely around 1 %) of the members of the Sun's birth cluster could be still found within 100 pc from the Sun today.

  18. Emptiness Formation Probability

    Science.gov (United States)

    Crawford, Nicholas; Ng, Stephen; Starr, Shannon

    2016-08-01

    We present rigorous upper and lower bounds on the emptiness formation probability for the ground state of a spin-1/2 Heisenberg XXZ quantum spin system. For a d-dimensional system we find a rate of decay of the order $\exp(-c L^{d+1})$ where L is the sidelength of the box in which we ask for the emptiness formation event to occur. In the $d=1$ case this confirms previous predictions made in the integrable systems community, though our bounds do not achieve the precision predicted by Bethe ansatz calculations. On the other hand, our bounds in the case $d \ge 2$ are new. The main tools we use are reflection positivity and a rigorous path integral expansion, which is a variation on those previously introduced by Toth, Aizenman-Nachtergaele and Ueltschi.

  19. Learning unbelievable marginal probabilities

    CERN Document Server

    Pitkow, Xaq; Miller, Ken D

    2011-01-01

    Loopy belief propagation performs approximate inference on graphical models with loops. One might hope to compensate for the approximation by adjusting model parameters. Learning algorithms for this purpose have been explored previously, and the claim has been made that every set of locally consistent marginals can arise from belief propagation run on a graphical model. On the contrary, here we show that many probability distributions have marginals that cannot be reached by belief propagation using any set of model parameters or any learning algorithm. We call such marginals 'unbelievable'. This problem occurs whenever the Hessian of the Bethe free energy is not positive-definite at the target marginals. All learning algorithms for belief propagation necessarily fail in these cases, producing beliefs or sets of beliefs that may even be worse than the pre-learning approximation. We then show that averaging inaccurate beliefs, each obtained from belief propagation using model parameters perturbed about some le...

  20. Comparing coefficients of nested nonlinear probability models

    DEFF Research Database (Denmark)

    Kohler, Ulrich; Karlson, Kristian Bernt; Holm, Anders

    2011-01-01

    In a series of recent articles, Karlson, Holm and Breen have developed a method for comparing the estimated coefficients of two nested nonlinear probability models. This article describes this method and the user-written program khb that implements the method. The KHB-method is a general decomposition method that is unaffected by the rescaling or attenuation bias that arises in cross-model comparisons in nonlinear models. It recovers the degree to which a control variable, Z, mediates or explains the relationship between X and a latent outcome variable, Y*, underlying the nonlinear probability