WorldWideScience

Sample records for random survival probabilities

  1. On Randomness and Probability

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 1, Issue 2. On Randomness and Probability: How to Mathematically Model Uncertain Events ... Author Affiliations: Rajeeva L Karandikar, Statistics and Mathematics Unit, Indian Statistical Institute, 7 S J S Sansanwal Marg, New Delhi 110 016, India.

  2. On Randomness and Probability

    Indian Academy of Sciences (India)

    ... casinos and gambling houses? How does one interpret a statement like "there is a 30 per cent chance of rain tonight" - a statement we often hear on the news? Such questions arise in the mind of every student when she/he is taught probability as part of mathematics. Many students who go on to study probability and ...

  3. Robust estimation of the expected survival probabilities from high-dimensional Cox models with biomarker-by-treatment interactions in randomized clinical trials

    Directory of Open Access Journals (Sweden)

    Nils Ternès

    2017-05-01

    Abstract Background Thanks to the advances in genomics and targeted treatments, more and more prediction models based on biomarkers are being developed to predict the potential benefit from treatments in a randomized clinical trial. Although the methodological framework for the development and validation of prediction models in a high-dimensional setting is becoming increasingly established, no clear guidance exists yet on how to estimate expected survival probabilities in a penalized model with biomarker-by-treatment interactions. Methods Based on a parsimonious biomarker selection in a penalized high-dimensional Cox model (lasso or adaptive lasso), we propose a unified framework to: estimate internally the predictive accuracy metrics of the developed model (using double cross-validation); estimate the individual survival probabilities at a given timepoint; construct confidence intervals thereof (analytical or bootstrap); and visualize them graphically (pointwise or smoothed with splines). We compared these strategies through a simulation study covering scenarios with or without biomarker effects. We applied the strategies to a large randomized phase III clinical trial that evaluated the effect of adding trastuzumab to chemotherapy in 1574 early breast cancer patients, for which the expression of 462 genes was measured. Results In our simulations, penalized regression models using the adaptive lasso estimated the survival probability of new patients with low bias and standard error; bootstrapped confidence intervals had empirical coverage probability close to the nominal level across very different scenarios. The double cross-validation performed on the training data set closely mimicked the predictive accuracy of the selected models in external validation data. We also propose a useful visual representation of the expected survival probabilities using splines. In the breast cancer trial, the adaptive lasso penalty selected a prediction model with 4 ...
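
    The workflow this record describes (penalized Cox selection, individual survival probabilities at a fixed timepoint, bootstrap confidence intervals) can be sketched compactly. Below is a minimal illustration using the lifelines Python library on synthetic data; the penalty strength, timepoint, and data-generating model are assumptions for the example, not the authors' implementation.

        # Sketch: lasso-penalized Cox model, survival probability at a fixed
        # timepoint, and a bootstrap confidence interval. Synthetic data and
        # an arbitrary penalty; illustrative only, not the authors' code.
        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(0)
        n, p = 300, 20                           # patients, candidate biomarkers
        X = pd.DataFrame(rng.normal(size=(n, p)),
                         columns=[f"gene_{j}" for j in range(p)])
        hazard = np.exp(0.8 * X["gene_0"] - 0.5 * X["gene_1"])  # two true effects
        event_time = rng.exponential(1.0 / hazard)
        censor_time = rng.exponential(2.0, size=n)
        df = X.assign(time=np.minimum(event_time, censor_time),
                      event=(event_time <= censor_time).astype(int))

        # l1_ratio=1.0 requests a pure lasso penalty (lifelines >= 0.25)
        cph = CoxPHFitter(penalizer=0.1, l1_ratio=1.0)
        cph.fit(df, duration_col="time", event_col="event")

        t_star, new_patient = 1.0, X.iloc[[0]]   # timepoint and patient of interest
        s_hat = cph.predict_survival_function(new_patient, times=[t_star])
        print("S(t*) estimate:", float(s_hat.iloc[0, 0]))

        boot = []                                # bootstrap CI (small B for brevity)
        for _ in range(50):
            bs = df.sample(n=n, replace=True).reset_index(drop=True)
            m = CoxPHFitter(penalizer=0.1, l1_ratio=1.0)
            m.fit(bs, duration_col="time", event_col="event")
            boot.append(float(m.predict_survival_function(
                new_patient, times=[t_star]).iloc[0, 0]))
        print("95% bootstrap CI:", np.percentile(boot, [2.5, 97.5]))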

  4. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  5. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before outcrossing.
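
    The first-passage probability the record refers to is easy to probe numerically. A crude Monte Carlo sketch, using a stationary Ornstein-Uhlenbeck process as a stand-in for a randomly vibrating response (the process parameters, barrier, and horizon are assumptions, not the paper's):

        # Monte Carlo estimate of the first-passage (failure) probability
        #   P( max_{t <= T} X(t) > b )
        # for a stationary Ornstein-Uhlenbeck process standing in for a
        # randomly vibrating response.
        import numpy as np

        rng = np.random.default_rng(1)
        beta, sigma = 1.0, 1.0           # OU relaxation rate, noise intensity
        b, T, dt = 2.0, 10.0, 1e-3       # barrier, horizon, time step (assumed)
        n_paths = 2000

        x = rng.normal(0.0, sigma / np.sqrt(2 * beta), size=n_paths)  # stationary start
        crossed = np.zeros(n_paths, dtype=bool)
        for _ in range(int(T / dt)):
            x += -beta * x * dt + sigma * np.sqrt(dt) * rng.normal(size=n_paths)
            crossed |= x > b
        print("first-passage probability ~", crossed.mean())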

  6. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study.
    * Good and solid introduction to probability theory and stochastic processes
    * Logically organized; writing is presented in a clear manner
    * Choice of topics is comprehensive within the area of probability
    * Ample homework problems are organized into chapter sections

  7. A nonparametric method for predicting survival probabilities

    NARCIS (Netherlands)

    van der Klaauw, B.; Vriend, S.

    2015-01-01

    Public programs often use statistical profiling to assess the risk that applicants will become long-term dependent on the program. The literature uses linear probability models and (Cox) proportional hazard models to predict duration outcomes. These either focus on one threshold duration or impose ...

  8. Energy dependence of gap survival probability and antishadowing

    OpenAIRE

    Troshin, S M; Tyurin, N. E.

    2004-01-01

    We discuss the energy dependence of the gap survival probability that follows from a rational form of amplitude unitarization. In contrast to the eikonal form of unitarization, which leads to a decreasing energy dependence of the gap survival probability, we predict a non-monotonic form for this dependence.

  9. Nonequilibrium random matrix theory: Transition probabilities

    Science.gov (United States)

    Pedro, Francisco Gil; Westphal, Alexander

    2017-03-01

    In this paper we present an analytic method for calculating the transition probability between two random Gaussian matrices with given eigenvalue spectra in the context of Dyson Brownian motion. We show that in the Coulomb gas language, in large N limit, memory of the initial state is preserved in the form of a universal linear potential acting on the eigenvalues. We compute the likelihood of any given transition as a function of time, showing that as memory of the initial state is lost, transition probabilities converge to those of the static ensemble.
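
    The underlying eigenvalue dynamics can be simulated directly. A minimal Euler-Maruyama sketch of Dyson Brownian motion under one common convention (the beta = 2 normalization and the initial spectrum are assumptions for illustration):

        # Euler-Maruyama sketch of Dyson Brownian motion under the convention
        #   d(lam_i) = sqrt(2/beta) dB_i + sum_{j != i} dt / (lam_i - lam_j),
        # with beta = 2. Shows an assumed initial spectrum relaxing toward the
        # static ensemble -- the memory-loss effect described in the abstract.
        import numpy as np

        rng = np.random.default_rng(2)
        N, beta, dt, n_steps = 8, 2.0, 1e-5, 20000
        lam = np.linspace(-1.0, 1.0, N)      # assumed initial eigenvalue spectrum

        for _ in range(n_steps):
            diff = lam[:, None] - lam[None, :]
            np.fill_diagonal(diff, np.inf)   # drop the i == j term (1/inf == 0)
            drift = (1.0 / diff).sum(axis=1)
            lam = lam + drift * dt + np.sqrt(2.0 / beta * dt) * rng.normal(size=N)
            lam.sort()                       # eigenvalue paths do not cross

        print("spectrum after relaxation:", np.round(lam, 3))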

  10. Survival probability and order statistics of diffusion on disordered media.

    Science.gov (United States)

    Acedo, L; Yuste, S B

    2002-07-01

    We investigate the first passage time t(j,N) to a given chemical or Euclidean distance of the first j of a set of N>1 independent random walkers all initially placed on a site of a disordered medium. To solve this order-statistics problem we assume that, for short times, the survival probability (the probability that a single random walker is not absorbed by a hyperspherical surface during some time interval) decays for disordered media in the same way as for Euclidean and some class of deterministic fractal lattices. This conjecture is checked by simulation on the incipient percolation aggregate embedded in two dimensions. Arbitrary moments of t(j,N) are expressed in terms of an asymptotic series in powers of 1/ln N, which is formally identical to those found for Euclidean and (some class of) deterministic fractal lattices. The agreement of the asymptotic expressions with simulation results for the two-dimensional percolation aggregate is good when the boundary is defined in terms of the chemical distance. The agreement worsens slightly when the Euclidean distance is used.
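
    The order-statistics setup is simple to reproduce numerically. A toy simulation of the first arrival among N independent walkers on a plain 1D Euclidean lattice (N, the distance, and the lattice are illustrative; the paper's disordered media are not modelled here):

        # Toy order-statistics check: N independent 1D lattice walkers start
        # at the origin; t(1, N) is the first time any of them reaches
        # distance L.
        import numpy as np

        rng = np.random.default_rng(3)
        N, L, trials = 100, 20, 200
        t1 = []
        for _ in range(trials):
            pos = np.zeros(N, dtype=int)
            t = 0
            while np.abs(pos).max() < L:
                t += 1
                pos += rng.choice((-1, 1), size=N)
            t1.append(t)
        # The paper expands moments of t(j, N) in powers of 1 / ln N.
        print("mean t(1, N):", np.mean(t1))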

  11. Duality of circulation decay statistics and survival probability

    Science.gov (United States)

    2010-09-01

    Survival probability and circulation decay history have both been used for setting wake turbulence separation standards. Conceptually, a strong correlation should exist between these two characterizations of vortex behavior; however, the literature ...

  12. Negative probability of random multiplier in turbulence

    Science.gov (United States)

    Bai, Xuan; Su, Weidong

    2017-11-01

    The random multiplicative process (RMP), which has been proposed for over 50 years, is a convenient phenomenological ansatz for the turbulence cascade. In the RMP, the fluctuation at a large scale is statistically mapped to the one at a small scale by the linear action of an independent random multiplier (RM). Simple as it is, the RMP is powerful enough that all of the known scaling laws can be accommodated in this model. So far as we know, however, a direct extraction of the probability density function (PDF) of the RM has been lacking, because the deconvolution involved is ill-posed. Nevertheless, with progress in the study of inverse problems, the situation can be changed. Using some new regularization techniques, we recover for the first time the PDFs of the RMs in some turbulent flows. All the consistent results from various methods point to a striking observation: the PDFs can attain negative values in some intervals, which can also be justified by properties of infinitely divisible distributions. Despite its conceptual unconventionality, the present study illustrates the implications of negative probability in turbulence in several respects, with emphasis on its role in describing the interaction between fluctuations at different scales. This work is supported by the NSFC (No. 11221062 and No. 11521091).

  13. Probability Distributions for Random Quantum Operations

    Science.gov (United States)

    Schultz, Kevin

    Motivated by uncertainty quantification and inference of quantum information systems, in this work we draw connections between the notions of random quantum states and operations in quantum information with probability distributions commonly encountered in the field of orientation statistics. This approach identifies natural sample spaces and probability distributions upon these spaces that can be used in the analysis, simulation, and inference of quantum information systems. The theory of exponential families on Stiefel manifolds provides the appropriate generalization to the classical case. Furthermore, this viewpoint motivates a number of additional questions into the convex geometry of quantum operations relative to both the differential geometry of Stiefel manifolds as well as the information geometry of exponential families defined upon them. In particular, we draw on results from convex geometry to characterize which quantum operations can be represented as the average of a random quantum operation. This project was supported by the Intelligence Advanced Research Projects Activity via Department of Interior National Business Center Contract Number 2012-12050800010.

  14. Probability, random processes, and ergodic properties

    CERN Document Server

    Gray, Robert M

    1988-01-01

    This book has been written for several reasons, not all of which are academic. This material was for many years the first half of a book in progress on information and ergodic theory. The intent was and is to provide a reasonably self-contained advanced treatment of measure theory, probability theory, and the theory of discrete time random processes with an emphasis on general alphabets and on ergodic and stationary properties of random processes that might be neither ergodic nor stationary. The intended audience was mathematically inclined engineering graduate students and visiting scholars who had not had formal courses in measure theoretic probability. Much of the material is familiar stuff for mathematicians, but many of the topics and results have not previously appeared in books. The original project grew too large and the first part contained much that would likely bore mathematicians and discourage them from the second part. Hence I finally followed the suggestion to separate the material and split...

  15. Random phenomena fundamentals of probability and statistics for engineers

    CERN Document Server

    Ogunnaike, Babatunde A

    2009-01-01

    Contents: Prelude; Approach Philosophy; Four Basic Principles. I Foundations: Two Motivating Examples; Yield Improvement in a Chemical Process; Quality Assurance in a Glass Sheet Manufacturing Process; Outline of a Systematic Approach; Random Phenomena, Variability, and Uncertainty; Two Extreme Idealizations of Natural Phenomena; Random Mass Phenomena; Introducing Probability; The Probabilistic Framework. II Probability: Fundamentals of Probability Theory; Building Blocks; Operations; Probability; Conditional Probability; Independence; Random Variables and Distributions; Distributions; Mathematical Expectation; Characterizing Distributions; Special Derived Probability Functions; Multidimensional Random Variables; Distributions of Several Random Variables; Distributional Characteristics of Jointly Distributed Random Variables; Random Variable Transformations; Single Variable Transformations; Bivariate Transformations; General Multivariate Transformations; Application Case Studies I: Probability; Mendel and Heredity; World War II Warship Tactical Response Under Attack. III Distributions: Ide...

  16. Seasonal survival probabilities suggest low migration mortality in migrating bats.

    Directory of Open Access Journals (Sweden)

    Simone Giavi

    Migration is adaptive if survival benefits are larger than the costs of residency. Many aspects of bat migration ecology, such as migratory costs, stopover site use and fidelity, are largely unknown. Since many migrating bats are endangered, such information is urgently needed to promote conservation. We selected the migrating Leisler's bat (Nyctalus leisleri) as a model species and collected capture-recapture data in southern Switzerland year round during 6 years. We estimated seasonal survival and site fidelity with Cormack-Jolly-Seber models that accounted for the presence of transients, fitted with Bayesian methods, and assessed differences between sexes and seasons. Activity peaked in autumn and spring, whereas very few individuals were caught during summer. We hypothesize that the study site is a migratory stopover site used during fall and spring migration by most individuals, but there is also evidence for wintering. Additionally, we found strong clues for mating during fall. Summer survival, which included two major migratory journeys, was identical to winter survival in males and slightly higher in females, suggesting that the migratory journeys did not bear significant costs in terms of survival. Transience probability was higher in males than in females in both seasons. Our results suggest that, similarly to birds, Leisler's bats use stopover sites during migration with high site fidelity. In contrast to most birds, the stopover site was also used for mating, and migratory costs in terms of survival seemed to be low. Transient analyses highlighted strong individual variation in site use, which makes the study and modelling of these populations, as well as their conservation, particularly challenging.

  17. Seasonal survival probabilities suggest low migration mortality in migrating bats.

    Science.gov (United States)

    Giavi, Simone; Moretti, Marco; Bontadina, Fabio; Zambelli, Nicola; Schaub, Michael

    2014-01-01

    Migration is adaptive if survival benefits are larger than the costs of residency. Many aspects of bat migration ecology, such as migratory costs, stopover site use and fidelity, are largely unknown. Since many migrating bats are endangered, such information is urgently needed to promote conservation. We selected the migrating Leisler's bat (Nyctalus leisleri) as a model species and collected capture-recapture data in southern Switzerland year round during 6 years. We estimated seasonal survival and site fidelity with Cormack-Jolly-Seber models that accounted for the presence of transients, fitted with Bayesian methods, and assessed differences between sexes and seasons. Activity peaked in autumn and spring, whereas very few individuals were caught during summer. We hypothesize that the study site is a migratory stopover site used during fall and spring migration by most individuals, but there is also evidence for wintering. Additionally, we found strong clues for mating during fall. Summer survival, which included two major migratory journeys, was identical to winter survival in males and slightly higher in females, suggesting that the migratory journeys did not bear significant costs in terms of survival. Transience probability was higher in males than in females in both seasons. Our results suggest that, similarly to birds, Leisler's bats use stopover sites during migration with high site fidelity. In contrast to most birds, the stopover site was also used for mating, and migratory costs in terms of survival seemed to be low. Transient analyses highlighted strong individual variation in site use, which makes the study and modelling of these populations, as well as their conservation, particularly challenging.

  18. Hybrid computer technique yields random signal probability distributions

    Science.gov (United States)

    Cameron, W. D.

    1965-01-01

    Hybrid computer determines the probability distributions of instantaneous and peak amplitudes of random signals. This combined digital and analog computer system reduces the errors and delays of manual data analysis.

  19. Probability, random variables, and random processes theory and signal processing applications

    CERN Document Server

    Shynk, John J

    2012-01-01

    Probability, Random Variables, and Random Processes is a comprehensive textbook on probability theory for engineers that provides a more rigorous mathematical framework than is usually encountered in undergraduate courses. It is intended for first-year graduate students who have some familiarity with probability and random variables, though not necessarily with random processes and systems that operate on random signals. It is also appropriate for advanced undergraduate students who have a strong mathematical background. The book has the following features: Several app ...

  20. Probability of stress-corrosion fracture under random loading

    Science.gov (United States)

    Yang, J. N.

    1974-01-01

    The mathematical formulation is based on a cumulative-damage hypothesis and experimentally determined stress-corrosion characteristics. Under stationary random loadings, the mean value and variance of the cumulative damage are obtained. The probability of stress-corrosion fracture is then evaluated using the principle of maximum entropy.

  21. Non-equilibrium random matrix theory. Transition probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Pedro, Francisco Gil [Univ. Autonoma de Madrid (Spain). Dept. de Fisica Teorica]; Westphal, Alexander [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany). Gruppe Theorie]

    2016-06-15

    In this letter we present an analytic method for calculating the transition probability between two random Gaussian matrices with given eigenvalue spectra in the context of Dyson Brownian motion. We show that in the Coulomb gas language, in large N limit, memory of the initial state is preserved in the form of a universal linear potential acting on the eigenvalues. We compute the likelihood of any given transition as a function of time, showing that as memory of the initial state is lost, transition probabilities converge to those of the static ensemble.

  22. Time preference and its relationship with age, health, and survival probability

    Directory of Open Access Journals (Sweden)

    Li-Wei Chao

    2009-02-01

    Although theories from economics and evolutionary biology predict that one's age, health, and survival probability should be associated with one's subjective discount rate (SDR), few studies have empirically tested for these links. Our study analyzes in detail how the SDR is related to age, health, and survival probability, by surveying a sample of individuals in townships around Durban, South Africa. In contrast to previous studies, we find that age is not significantly related to the SDR, but both physical health and survival expectations have a U-shaped relationship with the SDR. Individuals in very poor health have high discount rates, and those in very good health also have high discount rates. Similarly, those with expected survival probability on the extremes have high discount rates. Therefore, health and survival probability, and not age, seem to be predictors of one's SDR in an area of the world with high morbidity and mortality.

  23. Computer routines for probability distributions, random numbers, and related functions

    Science.gov (United States)

    Kirby, W.

    1983-01-01

    Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor F. Other mathematical functions include the Bessel function I0, gamma and log-gamma functions, error functions, and the exponential integral. Auxiliary services include sorting and printer-plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)
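
    Most of the routines catalogued in this report have present-day equivalents in scipy; the mapping below is ours, not the report's, and is shown only to make the list concrete.

        # Present-day scipy counterparts for the kinds of routines the report
        # provides (mapping is ours, not the USGS report's).
        import numpy as np
        from scipy import stats, special

        x = 2.5
        print(stats.beta.cdf(0.4, a=2, b=5))       # beta
        print(stats.chi2.sf(x, df=4))              # chi-square
        print(stats.gamma.cdf(x, a=3.0))           # gamma
        print(stats.norm.ppf(0.99))                # Gaussian (normal) quantile
        print(stats.pearson3.cdf(x, skew=0.5))     # Pearson Type III
        print(stats.weibull_min.cdf(x, c=1.5))     # Weibull
        print(stats.t.sf(x, df=10))                # Student's t
        print(stats.nct.sf(x, df=10, nc=1.0))      # noncentral t
        print(stats.f.sf(x, dfn=3, dfd=12))        # Snedecor F
        print(special.i0(x))                       # Bessel function I0
        print(special.gammaln(x))                  # log-gamma
        print(special.erf(x), special.exp1(x))     # error fn, exponential integral

        rng = np.random.default_rng(0)             # uniform / normal generators
        print(rng.uniform(size=3), rng.normal(size=3))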

  24. Application of random match probability calculations to mixed STR profiles.

    Science.gov (United States)

    Bille, Todd; Bright, Jo-Anne; Buckleton, John

    2013-03-01

    Mixed DNA profiles are being encountered more frequently as laboratories analyze increasing amounts of touch evidence. If it is determined that an individual could be a possible contributor to the mixture, it is necessary to perform a statistical analysis to allow an assignment of weight to the evidence. Currently, the combined probability of inclusion (CPI) and the likelihood ratio (LR) are the most commonly used methods to perform the statistical analysis. A third method, random match probability (RMP), is available. This article compares the advantages and disadvantages of the CPI and LR methods to the RMP method. We demonstrate that although the LR method is still considered the most powerful of the binary methods, the RMP and LR methods make similar use of the observed data such as peak height, assumed number of contributors, and known contributors where the CPI calculation tends to waste information and be less informative. © 2013 American Academy of Forensic Sciences.
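
    The arithmetic difference between CPI and an RMP-style calculation is easy to show at a single locus. A toy example with invented allele frequencies and an assumed major-contributor genotype:

        # Toy single-locus comparison of CPI with a random-match-style figure
        # for a two-person mixture showing alleles {A, B, C, D}.
        # Allele frequencies are invented for illustration.
        freq = {"A": 0.10, "B": 0.20, "C": 0.05, "D": 0.15}

        # CPI: probability a random person carries only alleles seen in the
        # mixture, (p_A + p_B + p_C + p_D)^2 -- ignores peak heights and the
        # assumed number of contributors.
        cpi = sum(freq.values()) ** 2
        print(f"CPI (single locus): {cpi:.4f}")

        # RMP-style: with an assumed major-contributor genotype AB (e.g.
        # inferred from peak heights), the match probability is 2 * p_A * p_B.
        rmp_ab = 2 * freq["A"] * freq["B"]
        print(f"RMP for genotype AB: {rmp_ab:.4f}")
        # The RMP figure is far smaller: restricting the genotype set using
        # peak-height information wastes less information than CPI, which is
        # the abstract's point.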

  25. Estimates of annual survival probabilities for adult Florida manatees (Trichechus manatus latirostris)

    Science.gov (United States)

    Langtimm, C.A.; O'Shea, T.J.; Pradel, R.; Beck, C.A.

    1998-01-01

    The population dynamics of large, long-lived mammals are particularly sensitive to changes in adult survival. Understanding factors affecting survival patterns is therefore critical for developing and testing theories of population dynamics and for developing management strategies aimed at preventing declines or extinction in such taxa. Few studies have used modern analytical approaches for analyzing variation and testing hypotheses about survival probabilities in large mammals. This paper reports a detailed analysis of annual adult survival in the Florida manatee (Trichechus manatus latirostris), an endangered marine mammal, based on a mark-recapture approach. Natural and boat-inflicted scars distinctively 'marked' individual manatees that were cataloged in a computer-based photographic system. Photo-documented resightings provided 'recaptures.' Using open population models, annual adult-survival probabilities were estimated for manatees observed in winter in three areas of Florida: Blue Spring, Crystal River, and the Atlantic coast. After using goodness-of-fit tests in Program RELEASE to search for violations of the assumptions of mark-recapture analysis, survival and sighting probabilities were modeled under several different biological hypotheses with Program SURGE. Estimates of mean annual probability of sighting varied from 0.948 for Blue Spring to 0.737 for Crystal River and 0.507 for the Atlantic coast. At Crystal River and Blue Spring, annual survival probabilities were best estimated as constant over the study period at 0.96 (95% CI = 0.951-0.975 and 0.900-0.985, respectively). On the Atlantic coast, where manatees are impacted more by human activities, annual survival probabilities had a significantly lower mean estimate of 0.91 (95% CI = 0.887-0.926) and varied unpredictably over the study period. For each study area, survival did not differ between sexes and was independent of relative adult age. The high constant adult-survival probabilities estimated ...

  26. Monte Carlo based protocol for cell survival and tumour control probability in BNCT.

    Science.gov (United States)

    Ye, S J

    1999-02-01

    A mathematical model to calculate the theoretical cell survival probability (nominally, the cell survival fraction) is developed to evaluate preclinical treatment conditions for boron neutron capture therapy (BNCT). A treatment condition is characterized by the neutron beam spectra, single or bilateral exposure, and the choice of boron carrier drug (boronophenylalanine (BPA) or boron sulfhydryl hydride (BSH)). The cell survival probability defined from Poisson statistics is expressed with the cell-killing yield, the 10B(n,alpha)7Li reaction density, and the tolerable neutron fluence. The radiation transport calculation from the neutron source to tumours is carried out using Monte Carlo methods: (i) reactor-based BNCT facility modelling to yield the neutron beam library at an irradiation port; (ii) dosimetry to limit the neutron fluence below a tolerance dose (10.5 Gy-Eq); (iii) calculation of the 10B(n,alpha)7Li reaction density in tumours. A shallow surface tumour could be effectively treated by single exposure producing an average cell survival probability of 10(-3)-10(-5) for probable ranges of the cell-killing yield for the two drugs, while a deep tumour will require bilateral exposure to achieve comparable cell kills at depth. With very pure epithermal beams eliminating thermal, low epithermal and fast neutrons, the cell survival can be decreased by factors of 2-10 compared with the unmodified neutron spectrum. A dominant effect of cell-killing yield on tumour cell survival demonstrates the importance of choice of boron carrier drug. However, these calculations do not indicate an unambiguous preference for one drug, due to the large overlap of tumour cell survival in the probable ranges of the cell-killing yield for the two drugs. The cell survival value averaged over a bulky tumour volume is used to predict the overall BNCT therapeutic efficacy, using a simple model of tumour control probability (TCP).
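
    The Poisson definition of cell survival mentioned in the abstract reduces to a one-line formula. A sketch with placeholder numbers (the yield and reaction density below are assumptions, not the paper's values):

        # Poisson-statistics cell survival as sketched in the abstract: if
        # lethal events occur with mean yield y per 10B(n,alpha)7Li capture
        # reaction and R is the capture-reaction density per cell at the
        # tolerable neutron fluence, survival is P(no lethal event) = exp(-y*R).
        # The numbers below are placeholders, not the paper's values.
        import numpy as np

        y = 0.05        # lethal events per capture reaction (assumed)
        R = 200.0       # capture reactions per cell at tolerance fluence (assumed)
        print("cell survival probability:", np.exp(-y * R))
        # ~4.5e-5, inside the 1e-3 to 1e-5 band quoted in the abstract.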

  27. Limit law for transition probabilities and moderate deviations for Sinai's random walk in random environment

    CERN Document Server

    Comets, F

    2003-01-01

    We consider a one-dimensional random walk in random environment in Sinai's regime. Our main result is that the logarithms of the transition probabilities, after a suitable rescaling, converge in distribution as time tends to infinity to some functional of the Brownian motion. We compute the law of this functional when the initial and final points agree. Also, among other things, we estimate the probability of being at time t at a distance at least z from the initial position, when z is larger than ln^2(t) but still of logarithmic order in time.

  28. Probability of Survival Scores in Different Trauma Registries: A Systematic Review.

    Science.gov (United States)

    Stoica, Bogdan; Paun, Sorin; Tanase, Ioan; Negoi, Ionut; Chiotoroiu, Alexandru; Beuran, Mircea

    2016-01-01

    A mixed score to predict the probability of survival has a key role in modern trauma systems. The aim of the current study is to summarize the current knowledge about the estimation of survival in major trauma patients in different trauma registries. Systematic review of the literature using electronic searches in the PubMed/Medline, Web of Science Core Collection and EBSCO databases. We used as MeSH terms or truncated words a combination of trauma, "probability of survival" and "mixed scores". The search strategy in PubMed was: "((((trauma(MeSH Major Topic)) OR injury(Title/Abstract)) AND score (Title/Abstract)) AND survival) AND registry (Title/Abstract))))". We selected only English-language literature. There is no consensus between the major trauma registries regarding the estimation of the probability of survival in major trauma patients. The scores of the German (RISC II) and United Kingdom (PS Model 14) trauma registries are based on the largest populations, with demographics updated to the present-day European injury pattern. The revised TRISS, derived from the USA National Trauma Database, seems to be inaccurate for trauma systems managing predominantly blunt injuries. The probability of survival should be evaluated in all major trauma patients, with a score derived from a population that reproduces the current demographics. Only a careful audit of unpredicted deaths can continuously improve our care for severely injured patients.

  29. Survival probability of an immobile target in a sea of evanescent diffusive or subdiffusive traps: a fractional equation approach.

    Science.gov (United States)

    Abad, E; Yuste, S B; Lindenberg, Katja

    2012-12-01

    We calculate the survival probability of an immobile target surrounded by a sea of uncorrelated diffusive or subdiffusive evanescent traps (i.e., traps that disappear in the course of their motion). Our calculation is based on a fractional reaction-subdiffusion equation derived from a continuous time random walk model of the system. Contrary to an earlier method valid only in one dimension (d=1), the equation is applicable in any Euclidean dimension d and elucidates the interplay between anomalous subdiffusive transport, the irreversible evanescence reaction, and the dimension in which both the traps and the target are embedded. Explicit results for the survival probability of the target are obtained for a density ρ(t) of traps which decays (i) exponentially and (ii) as a power law. In the former case, the target has a finite asymptotic survival probability in all integer dimensions, whereas in the latter case there are several regimes where the values of the decay exponent for ρ(t) and the anomalous diffusion exponent of the traps determine whether or not the target has a chance of eternal survival in one, two, and three dimensions.

  30. 30-Day Survival Probabilities as a Quality Indicator for Norwegian Hospitals: Data Management and Analysis.

    Science.gov (United States)

    Hassani, Sahar; Lindman, Anja Schou; Kristoffersen, Doris Tove; Tomic, Oliver; Helgeland, Jon

    2015-01-01

    The Norwegian Knowledge Centre for the Health Services (NOKC) reports 30-day survival as a quality indicator for Norwegian hospitals. The indicators have been published annually since 2011 on the website of the Norwegian Directorate of Health (www.helsenorge.no), as part of the Norwegian Quality Indicator System authorized by the Ministry of Health. Openness regarding calculation of quality indicators is important, as it provides the opportunity to critically review and discuss the method. The purpose of this article is to describe the data collection, data pre-processing, and data analyses, as carried out by NOKC, for the calculation of 30-day risk-adjusted survival probability as a quality indicator. Three diagnosis-specific 30-day survival indicators (first-time acute myocardial infarction (AMI), stroke and hip fracture) are estimated based on all-cause deaths, occurring in-hospital or out-of-hospital, within 30 days counting from the first day of hospitalization. Furthermore, a hospital-wide (i.e. overall) 30-day survival indicator is calculated. Patient administrative data from all Norwegian hospitals and information from the Norwegian Population Register are retrieved annually, and linked to datasets for previous years. The outcome (alive/dead within 30 days) is attributed to every hospital by the fraction of time spent in each hospital. A logistic regression followed by a hierarchical Bayesian analysis is used for the estimation of risk-adjusted survival probabilities. A multiple testing procedure with a false discovery rate of 5% is used to identify hospitals, hospital trusts and regional health authorities with significantly higher/lower survival than the reference. In addition, estimated risk-adjusted survival probabilities are published per hospital, hospital trust and regional health authority. The variation in risk-adjusted survival probabilities across hospitals for AMI shows a decreasing trend over time: estimated survival probabilities for AMI in ...
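
    The first stage of such an indicator pipeline can be sketched briefly. Below, a logistic regression for 30-day death on synthetic case-mix data, followed by per-hospital observed-vs-expected ratios; the NOKC method additionally applies a hierarchical Bayesian model and multiple-testing control, which are omitted here.

        # Sketch of the first stage of a risk-adjusted 30-day indicator:
        # logistic regression of 30-day death on case-mix variables, then
        # per-hospital observed-vs-expected (O/E) mortality ratios.
        # Data are synthetic; variables and effect sizes are assumptions.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(4)
        n = 5000
        age = rng.normal(70, 10, n)
        comorbidity = rng.poisson(1.5, n)
        hospital = rng.integers(0, 20, n)            # 20 hypothetical hospitals
        logit = -6.0 + 0.05 * age + 0.3 * comorbidity
        death30 = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))

        X = np.column_stack([age, comorbidity])
        expected = LogisticRegression().fit(X, death30).predict_proba(X)[:, 1]

        for h in range(3):                           # first few hospitals
            m = hospital == h
            oe = death30[m].mean() / expected[m].mean()
            print(f"hospital {h}: O/E 30-day mortality ratio = {oe:.2f}")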

  31. 30-Day Survival Probabilities as a Quality Indicator for Norwegian Hospitals: Data Management and Analysis.

    Directory of Open Access Journals (Sweden)

    Sahar Hassani

    The Norwegian Knowledge Centre for the Health Services (NOKC) reports 30-day survival as a quality indicator for Norwegian hospitals. The indicators have been published annually since 2011 on the website of the Norwegian Directorate of Health (www.helsenorge.no), as part of the Norwegian Quality Indicator System authorized by the Ministry of Health. Openness regarding calculation of quality indicators is important, as it provides the opportunity to critically review and discuss the method. The purpose of this article is to describe the data collection, data pre-processing, and data analyses, as carried out by NOKC, for the calculation of 30-day risk-adjusted survival probability as a quality indicator. Three diagnosis-specific 30-day survival indicators (first-time acute myocardial infarction (AMI), stroke and hip fracture) are estimated based on all-cause deaths, occurring in-hospital or out-of-hospital, within 30 days counting from the first day of hospitalization. Furthermore, a hospital-wide (i.e. overall) 30-day survival indicator is calculated. Patient administrative data from all Norwegian hospitals and information from the Norwegian Population Register are retrieved annually, and linked to datasets for previous years. The outcome (alive/dead within 30 days) is attributed to every hospital by the fraction of time spent in each hospital. A logistic regression followed by a hierarchical Bayesian analysis is used for the estimation of risk-adjusted survival probabilities. A multiple testing procedure with a false discovery rate of 5% is used to identify hospitals, hospital trusts and regional health authorities with significantly higher/lower survival than the reference. In addition, estimated risk-adjusted survival probabilities are published per hospital, hospital trust and regional health authority. The variation in risk-adjusted survival probabilities across hospitals for AMI shows a decreasing trend over time: estimated survival probabilities ...

  32. Estimating Probability of Default on Peer to Peer Market – Survival Analysis Approach

    Directory of Open Access Journals (Sweden)

    Đurović Andrija

    2017-05-01

    Arguably a cornerstone of credit risk modelling is the probability of default. This article aims to search for evidence of a relationship between loan characteristics and the probability of default on the peer-to-peer (P2P) market. In line with that, two loan characteristics are analysed: (1) loan term length and (2) loan purpose. The analysis is conducted using a survival analysis approach within the vintage framework. Firstly, the 12-month through-the-cycle probability of default is used to compare the riskiness of the analysed loan characteristics. Secondly, the log-rank test is employed in order to compare the complete survival period of the cohorts. The findings of the paper suggest that there is clear evidence of a relationship between the analysed loan characteristics and the probability of default. Longer-term loans are riskier than shorter-term ones, and the least risky loans are those used for credit card payoff.
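
    The two steps named in the abstract (survival curves per cohort, then a log-rank comparison) can be reproduced with the lifelines library. A sketch on synthetic loan data; the hazard rates and observation window are assumptions:

        # Kaplan-Meier "survival" (staying non-defaulted) by loan term,
        # compared with a log-rank test. Loan data are synthetic stand-ins
        # for the P2P cohorts in the paper.
        import numpy as np
        from lifelines import KaplanMeierFitter
        from lifelines.statistics import logrank_test

        rng = np.random.default_rng(5)
        n = 1000
        long_term = rng.integers(0, 2, n).astype(bool)
        hazard = np.where(long_term, 0.04, 0.02)   # assumed monthly default hazards
        time_to_default = rng.exponential(1 / hazard)
        observed = time_to_default < 36            # 36-month observation window
        duration = np.minimum(time_to_default, 36)

        km = KaplanMeierFitter()
        km.fit(duration[long_term], observed[long_term], label="long term")
        print("P(no default by month 12):", float(km.predict(12)))

        res = logrank_test(duration[long_term], duration[~long_term],
                           observed[long_term], observed[~long_term])
        print("log-rank p-value:", res.p_value)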

  33. CGC/saturation approach for soft interactions at high energy: survival probability of central exclusive production

    Energy Technology Data Exchange (ETDEWEB)

    Gotsman, E.; Maor, U. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel)]; Levin, E. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel); Universidad Tecnica Federico Santa Maria, Departemento de Fisica, Centro Cientifico-Tecnologico de Valparaiso, Valparaiso (Chile)]

    2016-04-15

    We estimate the value of the survival probability for central exclusive production in a model which is based on the CGC/saturation approach. Hard and soft processes are described in the same framework. At LHC energies, we obtain a small value for the survival probability. The source of the small value is the impact parameter dependence of the hard amplitude. Our model has successfully described a large body of soft data: elastic, inelastic and diffractive cross sections, inclusive production and rapidity correlations, as well as the t-dependence of deep inelastic diffractive production of vector mesons. (orig.)

  34. Killing (absorption) versus survival in random motion

    Science.gov (United States)

    Garbaczewski, Piotr

    2017-09-01

    We address diffusion processes in a bounded domain, while focusing on somewhat unexplored affinities between the presence of absorbing and/or inaccessible boundaries. For the Brownian motion (Lévy-stable cases are briefly mentioned) model-independent features are established of the dynamical law that underlies the short-time behavior of these random paths, whose overall lifetime is predefined to be long. As a by-product, the limiting regime of a permanent trapping in a domain is obtained. We demonstrate that the adopted conditioning method, involving the so-called Bernstein transition function, works properly also in an unbounded domain, for stochastic processes with killing (Feynman-Kac kernels play the role of transition densities), provided the spectrum of the related semigroup operator is discrete. The method is shown to be useful in the case, when the spectrum of the generator goes down to zero and no isolated minimal (ground state) eigenvalue is in existence, like in the problem of the long-term survival on a half-line with a sink at origin.

  35. Notes on the Lumped Backward Master Equation for the Neutron Extinction/Survival Probability

    Energy Technology Data Exchange (ETDEWEB)

    Prinja, Anil K [Los Alamos National Laboratory]

    2012-07-02

    ... chains (a fission chain is defined as the initial source neutron and all its subsequent progeny) in which some chains are short lived while others propagate for unusually long times. Under these conditions, fission chains do not overlap strongly and this precludes the cancellation of neutron number fluctuations necessary for the mean to become established as the dominant measure of the neutron population. The fate of individual chains then plays a defining role in the evolution of the neutron population in strongly stochastic systems, and of particular interest and importance in supercritical systems is the extinction probability, defined as the probability that the neutron chain (initiating neutron and its progeny) will be extinguished at a particular time, or its complement, the time-dependent survival probability. The time-asymptotic limit of the latter, the probability of divergence, gives the probability that the neutron population will grow without bound, and is more commonly known as the probability of initiation or just POI. The ability to numerically compute these probabilities, with high accuracy and without overly restricting the underlying physics (e.g., fission neutron multiplicity, reactivity variation) is clearly essential in developing an understanding of the behavior of strongly stochastic systems.

  36. Effects of amphibian chytrid fungus on individual survival probability in wild boreal toads

    Science.gov (United States)

    Pilliod, D.S.; Muths, E.; Scherer, R. D.; Bartelt, P.E.; Corn, P.S.; Hossack, B.R.; Lambert, B.A.; Mccaffery, R.; Gaughan, C.

    2010-01-01

    Chytridiomycosis is linked to the worldwide decline of amphibians, yet little is known about the demographic effects of the disease. We collected capture-recapture data on three populations of boreal toads (Bufo boreas [Bufo = Anaxyrus]) in the Rocky Mountains (U.S.A.). Two of the populations were infected with chytridiomycosis and one was not. We examined the effect of the presence of amphibian chytrid fungus (Batrachochytrium dendrobatidis [Bd]; the agent of chytridiomycosis) on survival probability and population growth rate. Toads that were infected with Bd had lower average annual survival probability than uninfected individuals at sites where Bd was detected, which suggests chytridiomycosis may reduce survival by 31-42% in wild boreal toads. Toads that were negative for Bd at infected sites had survival probabilities comparable to toads at the uninfected site. Evidence that environmental covariates (particularly cold temperatures during the breeding season) influenced toad survival was weak. The number of individuals in diseased populations declined by 5-7%/year over the 6 years of the study, whereas the uninfected population had comparatively stable population growth. Our data suggest that the presence of Bd in these toad populations is not causing rapid population declines. Rather, chytridiomycosis appears to be functioning as a low-level, chronic disease whereby some infected individuals survive but the overall population effects are still negative. Our results show that some amphibian populations may be coexisting with Bd and highlight the importance of quantitative assessments of survival in diseased animal populations. Journal compilation. © 2010 Society for Conservation Biology. No claim to original US government works.

  37. Some Bounds on the Deviation Probability for Sums of Nonnegative Random Variables Using Upper Polynomials, Moment and Probability Generating Functions

    OpenAIRE

    From, Steven G.

    2010-01-01

    We present several new bounds for certain sums of deviation probabilities involving sums of nonnegative random variables. These are based upon upper bounds for the moment generating functions of the sums. We compare these new bounds to those of Maurer [2], Bernstein [4], Pinelis [16], and Bentkus [3]. We also briefly discuss the infinitely divisible distributions case.
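
    The MGF route to such bounds is the classical Chernoff argument: P(S_n >= a) <= exp(-t a) M(t)^n for any admissible t > 0, minimized over t. A numeric sketch for exponential summands (the distribution choice is ours, for illustration; the paper's own bounds are not reproduced):

        # Chernoff/MGF bound for a sum S_n of i.i.d. nonnegative variables:
        #   P(S_n >= a) <= exp(-t * a) * M(t)**n   for any admissible t > 0.
        # Exponential(1) summands chosen for illustration; M(t) = 1/(1 - t).
        import numpy as np
        from scipy.optimize import minimize_scalar
        from scipy.stats import gamma

        n, a = 20, 30.0                  # sum of 20 Exp(1) terms, threshold 30

        def log_bound(t):                # log of exp(-t a) * M(t)^n
            return -t * a - n * np.log1p(-t)

        res = minimize_scalar(log_bound, bounds=(1e-6, 1 - 1e-6), method="bounded")
        print("optimized Chernoff bound:", np.exp(res.fun))
        print("exact tail (S_n ~ Gamma(n, 1)):", gamma.sf(a, n))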

  38. Probability on graphs random processes on graphs and lattices

    CERN Document Server

    Grimmett, Geoffrey

    2018-01-01

    This introduction to some of the principal models in the theory of disordered systems leads the reader through the basics, to the very edge of contemporary research, with the minimum of technical fuss. Topics covered include random walk, percolation, self-avoiding walk, interacting particle systems, uniform spanning tree, random graphs, as well as the Ising, Potts, and random-cluster models for ferromagnetism, and the Lorentz model for motion in a random medium. This new edition features accounts of major recent progress, including the exact value of the connective constant of the hexagonal lattice, and the critical point of the random-cluster model on the square lattice. The choice of topics is strongly motivated by modern applications, and focuses on areas that merit further research. Accessible to a wide audience of mathematicians and physicists, this book can be used as a graduate course text. Each chapter ends with a range of exercises.

  39. Survival and compound nucleus probability of super heavy element Z = 117

    Energy Technology Data Exchange (ETDEWEB)

    Manjunatha, H.C. [Government College for Women, Department of Physics, Kolar, Karnataka (India)]; Sridhar, K.N. [Government First Grade College, Department of Physics, Kolar, Karnataka (India)]

    2017-05-15

    As a part of a systematic study for predicting the most suitable projectile-target combinations for heavy-ion fusion experiments in the synthesis of 289-297Ts, we have calculated the transmission probability (T_l), compound nucleus formation probabilities (P_CN) and survival probability (P_sur) of possible projectile-target combinations. We have also studied the fusion cross section, survival cross section and fission cross sections for different projectile-target combinations of 289-297Ts. These theoretical parameters are required before the synthesis of the super heavy element. The calculated probabilities and cross sections show that the production of isotopes of the super heavy element with Z = 117 is strongly dependent on the reaction systems. The most probable reactions to synthesize the super heavy nuclei 289-297Ts are worked out and listed explicitly. We have also studied the variation of P_CN and P_sur with the mass number of the projectile and target nuclei. This work is useful in the synthesis of the super heavy element Z = 117. (orig.)

  40. A lower bound on the probability that a binomial random variable is exceeding its mean

    OpenAIRE

    Pelekis, Christos; Ramon, Jan

    2016-01-01

    We provide a lower bound on the probability that a binomial random variable is exceeding its mean. Our proof employs estimates on the mean absolute deviation and the tail conditional expectation of binomial random variables.
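
    The quantity being bounded is easy to evaluate exactly for small cases, which is a useful sanity check on any such bound. A quick look with scipy (the parameter grid is arbitrary; the paper's bound itself is not reproduced here):

        # Exact evaluation of P(X >= E[X]) for X ~ Binomial(n, p) -- the
        # quantity the paper lower-bounds -- over an arbitrary parameter grid.
        from scipy.stats import binom

        for n, p in [(10, 0.3), (50, 0.5), (200, 0.07)]:
            mean = n * p
            # binom.sf(k) = P(X > k), so nudging below the mean gives P(X >= mean)
            prob = binom.sf(mean - 1e-9, n, p)
            print(f"n={n:3d} p={p:.2f}  P(X >= np) = {prob:.3f}")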

  41. rft1d: Smooth One-Dimensional Random Field Upcrossing Probabilities in Python

    Directory of Open Access Journals (Sweden)

    Todd C. Pataky

    2016-07-01

    Through topological expectations regarding smooth, thresholded n-dimensional Gaussian continua, random field theory (RFT) describes probabilities associated with both the field-wide maximum and threshold-surviving upcrossing geometry. A key application of RFT is a correction for multiple comparisons which affords field-level hypothesis testing for both univariate and multivariate fields. For unbroken isotropic fields just one parameter in addition to the mean and variance is required: the ratio of a field's size to its smoothness. Ironically, the simplest manifestation of RFT (1D unbroken fields) has rarely surfaced in the literature, even during its foundational development in the late 1970s. This Python package implements 1D RFT primarily for exploring and validating RFT expectations, but also describes how it can be applied to yield statistical inferences regarding sets of experimental 1D fields.
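
    The 1D special case is compact enough to write from scratch. Below, the standard expected-Euler-characteristic approximation for the maximum of a smooth unit-variance 1D Gaussian field, checked against simulation. This is an independent sketch of the theory, not the rft1d package API; the field size, FWHM, and threshold are assumptions.

        # 1D RFT upcrossing probability via the expected Euler characteristic:
        #   P(max Z >= u) ~ Phi_bar(u) + R * sqrt(4 ln 2) / (2 pi) * exp(-u^2/2),
        # with R = field length in FWHM units (resels). Plus a Monte Carlo check.
        import numpy as np
        from scipy.ndimage import gaussian_filter1d
        from scipy.stats import norm

        Q, FWHM, u = 101, 10.0, 3.0                 # nodes, smoothness, threshold
        resels = (Q - 1) / FWHM
        p_rft = norm.sf(u) + resels * np.sqrt(4 * np.log(2)) / (2 * np.pi) * np.exp(-u**2 / 2)
        print("RFT  P(max Z >= u) ~", p_rft)

        # Monte Carlo: smooth Gaussian white noise to the target FWHM and
        # rescale to unit variance, then look at the field maximum.
        rng = np.random.default_rng(6)
        sd = FWHM / np.sqrt(8 * np.log(2))          # kernel sd matching the FWHM
        impulse = np.zeros(Q)
        impulse[Q // 2] = 1.0
        scale = np.sqrt((gaussian_filter1d(impulse, sd) ** 2).sum())

        z = gaussian_filter1d(rng.normal(size=(20000, Q)), sd, axis=1) / scale
        print("MC   P(max Z >= u) ~", (z.max(axis=1) >= u).mean())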

  42. Interactive effects of senescence and natural disturbance on the annual survival probabilities of snail kites

    Science.gov (United States)

    Reichert, Brian E.; Martin, J.; Kendall, William L.; Cattau, Christopher E.; Kitchens, Wiley M.

    2010-01-01

    Individuals in wild populations face risks associated with both intrinsic (i.e. aging) and external (i.e. environmental) sources of mortality. Condition-dependent mortality occurs when there is an interaction between such factors; however, few studies have clearly demonstrated condition-dependent mortality and some have even argued that condition-dependent mortality does not occur in wild avian populations. Using large sample sizes (2084 individuals, 3746 re-sights) of individual-based longitudinal data collected over a 33-year period (1976-2008) on multiple cohorts, we used a capture-mark-recapture framework to model age-dependent survival in the snail kite Rostrhamus sociabilis plumbeus population in Florida. Adding to the growing amount of evidence for actuarial senescence in wild populations, we found evidence of senescent declines in survival probabilities in adult kites. We also tested the hypothesis that older kites experienced condition-dependent mortality during a range-wide drought event (2000-2002). The results provide convincing evidence that the annual survival probability of senescent kites was disproportionately affected by the drought relative to the survival probability of prime-aged adults. To our knowledge, this is the first evidence of condition-dependent mortality to be demonstrated in a wild avian population, a finding which challenges recent conclusions drawn in the literature. Our study suggests that senescence and condition-dependent mortality can affect the demography of wild avian populations. Accounting for these sources of variation may be particularly important to appropriately compute estimates of population growth rate, and probabilities of quasi-extinctions.

  43. Modeling of thermal stresses and probability of survival of tubular SOFC

    Energy Technology Data Exchange (ETDEWEB)

    Nakajo, Arata [Laboratory for Industrial Energy Systems (LENI), Faculty of Engineering, Swiss Federal Institute of Technology, 1015 Lausanne (Switzerland)]; Stiller, Christoph; Bolland, Olav [Department of Energy and Process Engineering, Norwegian University of Science and Technology, Trondheim N-7491 (Norway)]; Haerkegaard, Gunnar [Department of Engineering Design and Materials, Norwegian University of Science and Technology, Trondheim N-7491 (Norway)]

    2006-07-14

    The temperature profile generated by a thermo-electro-chemical model was used to calculate the thermal stress distribution in a tubular solid oxide fuel cell (SOFC). The solid heat balances were calculated separately for each layer of the MEA (membrane electrode assembly) in order to detect the radial thermal gradients more precisely. It appeared that the electrolyte undergoes high tensile stresses in limited areas at the ends of the cell and that the anode is subjected to moderate tensile stresses. A simplified version of the widely used Weibull analysis was used to calculate the global probability of survival for the assessment of the risks related to both operating points and load changes. The cell at room temperature was also considered and turned out to be critical. As a general trend, the computed probabilities of survival were too low for the typical requirements of a commercial product. A sensitivity analysis showed a strong influence of the thermal expansion mismatch between the layers of the MEA on the probability of survival. The lack of knowledge of mechanical material properties, as well as uncertainties about the phenomena occurring in the cell, proved to be a limiting factor in the simulation of thermal stresses. (author)

  44. Analysis of feedbacks between nucleation rate, survival probability and cloud condensation nuclei formation

    Science.gov (United States)

    Westervelt, D. M.; Pierce, J. R.; Adams, P. J.

    2014-06-01

    Aerosol nucleation is an important source of particle number in the atmosphere. However, in order to become cloud condensation nuclei (CCN), freshly nucleated particles must undergo significant condensational growth while avoiding coagulational scavenging. In an effort to quantify the contribution of nucleation to CCN, this work uses the GEOS-Chem-TOMAS global aerosol model to calculate changes in CCN concentrations against a broad range of nucleation rates and mechanisms. We then quantify the factors that control CCN formation from nucleation, including daily nucleation rates, growth rates, coagulation sinks, condensation sinks, survival probabilities, and CCN formation rates, in order to examine feedbacks that may limit growth of nucleated particles to CCN. Nucleation rate parameterizations tested in GEOS-Chem-TOMAS include ternary nucleation (with multiple tuning factors), activation nucleation (with two pre-factors), binary nucleation, and ion-mediated nucleation. We find that nucleation makes a significant contribution to boundary layer CCN(0.2%), but this contribution is only modestly sensitive to the choice of nucleation scheme, ranging from 49 to 78% increase in concentrations over a control simulation with no nucleation. Moreover, a two order-of-magnitude increase in the globally averaged nucleation rate (via changes to tuning factors) results in small changes (less than 10%) to global CCN(0.2%) concentrations. To explain this, we present a simple theory showing that survival probability has an exponentially decreasing dependence on the square of the condensation sink. This functional form stems from a negative correlation between condensation sink and growth rate and a positive correlation between condensation sink and coagulational scavenging. Conceptually, with a fixed condensable vapor budget (sulfuric acid and organics), any increase in CCN concentrations due to higher nucleation rates necessarily entails an increased aerosol surface area in the ...

  45. Random survival forests for competing risks

    DEFF Research Database (Denmark)

    Ishwaran, Hemant; Gerds, Thomas A; Kogalur, Udaya B

    2014-01-01

    We introduce a new approach to competing risks using random forests. Our method is fully non-parametric and can be used for selecting event-specific variables and for estimating the cumulative incidence function. We show that the method is highly effective for both prediction and variable selection in high-dimensional problems and in settings such as HIV/AIDS that involve many competing risks.
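
    For readers wanting to experiment, the full competing-risks forest is implemented in the authors' R package randomForestSRC. The closest Python analogue, a single-event random survival forest, can be sketched with scikit-survival (synthetic data; the competing-risks splitting rules of the paper are not included):

        # Single-event random survival forest with scikit-survival -- a Python
        # cousin of the paper's method, without competing-risks splitting.
        import numpy as np
        from sksurv.ensemble import RandomSurvivalForest
        from sksurv.util import Surv

        rng = np.random.default_rng(7)
        n, p = 300, 10
        X = rng.normal(size=(n, p))
        time = rng.exponential(np.exp(-0.7 * X[:, 0]))   # covariate-driven hazard
        event = rng.uniform(size=n) < 0.8                # ~20% censoring (assumed)
        y = Surv.from_arrays(event=event, time=time)

        rsf = RandomSurvivalForest(n_estimators=100, random_state=0).fit(X, y)
        surv_fns = rsf.predict_survival_function(X[:2])
        print("S(t=1) for two subjects:", [float(fn(1.0)) for fn in surv_fns])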

  46. From gap probabilities in random matrix theory to eigenvalue expansions

    Science.gov (United States)

    Bothner, Thomas

    2016-02-01

    We present a method to derive asymptotics of eigenvalues for trace-class integral operators $K : L^2(J; \mathrm{d}\lambda) \to L^2(J; \mathrm{d}\lambda)$, acting on a single interval $J \subset \mathbb{R}$, which belong to the ring of integrable operators (Its et al 1990 Int. J. Mod. Phys. B 4 1003-37). Our emphasis lies on the behavior of the spectrum $\{\lambda_i(J)\}_{i=0}^{\infty}$ of $K$ as $|J| \to \infty$ with $i$ fixed. We show that this behavior is intimately linked to the analysis of the Fredholm determinant $\det(I - \gamma K)|_{L^2(J)}$ as $|J| \to \infty$ and $\gamma \uparrow 1$ in a Stokes-type scaling regime. Concrete asymptotic formulae are obtained for the eigenvalues of the Airy and Bessel kernels in random matrix theory. Dedicated to Percy Deift and Craig Tracy on the occasion of their 70th birthdays.
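
    The eigenvalue behavior described here can be observed numerically with a Nystrom discretization. A sketch for the sine kernel, a standard member of the integrable ring (the kernel choice, interval sizes, and quadrature order are ours, not the paper's):

        # Nystrom approximation of the leading eigenvalues of the sine kernel
        #   K(x, y) = sin(x - y) / (pi (x - y))
        # on J = [-L, L]: eigenvalues of sqrt(w_i) K(x_i, x_j) sqrt(w_j) with
        # Gauss-Legendre nodes and weights.
        import numpy as np

        def sine_kernel_eigs(L, m=200):
            x, w = np.polynomial.legendre.leggauss(m)   # nodes, weights on [-1, 1]
            x, w = L * x, L * w                         # rescale to J = [-L, L]
            d = x[:, None] - x[None, :]
            K = np.sin(d) / (np.pi * np.where(d == 0.0, 1.0, d))
            np.fill_diagonal(K, 1.0 / np.pi)            # sinc limit on the diagonal
            A = np.sqrt(w)[:, None] * K * np.sqrt(w)[None, :]
            return np.linalg.eigvalsh(A)[::-1]          # descending order

        for L in (1.0, 3.0, 6.0):
            print(f"L={L}: top eigenvalues", np.round(sine_kernel_eigs(L)[:4], 4))
        # As |J| grows the leading eigenvalues crowd toward 1, which is what
        # makes the gamma -> 1 limit of det(I - gamma K) delicate.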

  47. Cell survival probability in a spread-out Bragg peak for novel treatment planning

    Science.gov (United States)

    Surdutovich, Eugene; Solov'yov, Andrey V.

    2017-08-01

    The problem of variable cell survival probability along the spread-out Bragg peak is one of the long-standing problems in the planning and optimisation of ion-beam therapy. This problem is considered using the multiscale approach to the physics of ion-beam therapy. The physical reasons for this problem are analysed and understood on a quantitative level. A recipe for solving this problem is suggested using this approach. This recipe can be used in the design of novel treatment planning and optimisation based on fundamental science.

  48. Lower survival probabilities for adult Florida manatees in years with intense coastal storms

    Science.gov (United States)

    Langtimm, C.A.; Beck, C.A.

    2003-01-01

    The endangered Florida manatee (Trichechus manatus latirostris) inhabits the subtropical waters of the southeastern United States, where hurricanes are a regular occurrence. Using mark-resighting statistical models, we analyzed 19 years of photo-identification data and detected significant annual variation in adult survival for a subpopulation in northwest Florida where human impact is low. That variation coincided with years when intense hurricanes (Category 3 or greater on the Saffir-Simpson Hurricane Scale) and a major winter storm occurred in the northern Gulf of Mexico. Mean survival probability during years with no or low intensity storms was 0.972 (approximate 95% confidence interval = 0.961-0.980) but dropped to 0.936 (0.864-0.971) in 1985 with Hurricanes Elena, Kate, and Juan; to 0.909 (0.837-0.951) in 1993 with the March "Storm of the Century"; and to 0.817 (0.735-0.878) in 1995 with Hurricanes Opal, Erin, and Allison. These drops in survival probability were not catastrophic in magnitude and were detected because of the use of state-of-the-art statistical techniques and the quality of the data. Because individuals of this small population range extensively along the north Gulf coast of Florida, it was possible to resolve storm effects on a regional scale rather than the site-specific local scale common to studies of more sedentary species. This is the first empirical evidence in support of storm effects on manatee survival and suggests a cause-effect relationship. The decreases in survival could be due to direct mortality, indirect mortality, and/or emigration from the region as a consequence of storms. Future impacts to the population by a single catastrophic hurricane, or series of smaller hurricanes, could increase the probability of extinction. With the advent in 1995 of a new 25- to 50-yr cycle of greater hurricane activity, and longer term change possible with global climate change, it becomes all the more important to reduce mortality and injury ...

  9. Discrete probability models and methods probability on graphs and trees, Markov chains and random fields, entropy and coding

    CERN Document Server

    Brémaud, Pierre

    2017-01-01

    The emphasis in this book is placed on general models (Markov chains, random fields, random graphs), universal methods (the probabilistic method, the coupling method, the Stein-Chen method, martingale methods, the method of types) and versatile tools (Chernoff's bound, Hoeffding's inequality, Holley's inequality) whose domain of application extends far beyond the present text. Although the examples treated in the book relate to possible applications in the communication and computing sciences, operations research and physics, the book is in the first instance concerned with theory. The level of the book is that of a beginning graduate course. It is self-contained, the prerequisites consisting merely of basic calculus (series) and basic linear algebra (matrices). The reader is not assumed to be trained in probability, since the first chapters give in considerable detail the background necessary to understand the rest of the book.

  10. University Students’ Conceptual Knowledge of Randomness and Probability in the Contexts of Evolution and Mathematics

    Science.gov (United States)

    Fiedler, Daniela; Tröbst, Steffen; Harms, Ute

    2017-01-01

    Students of all ages face severe conceptual difficulties regarding key aspects of evolution—the central, unifying, and overarching theme in biology. Aspects strongly related to abstract “threshold” concepts like randomness and probability appear to pose particular difficulties. A further problem is the lack of an appropriate instrument for assessing students’ conceptual knowledge of randomness and probability in the context of evolution. To address this problem, we have developed two instruments, Randomness and Probability Test in the Context of Evolution (RaProEvo) and Randomness and Probability Test in the Context of Mathematics (RaProMath), that include both multiple-choice and free-response items. The instruments were administered to 140 university students in Germany, then the Rasch partial-credit model was applied to assess them. The results indicate that the instruments generate reliable and valid inferences about students’ conceptual knowledge of randomness and probability in the two contexts (which are separable competencies). Furthermore, RaProEvo detected significant differences in knowledge of randomness and probability, as well as evolutionary theory, between biology majors and preservice biology teachers. PMID:28572180

  11. Probability of survival of implant-supported metal ceramic and CAD/CAM resin nanoceramic crowns.

    Science.gov (United States)

    Bonfante, Estevam A; Suzuki, Marcelo; Lorenzoni, Fábio C; Sena, Lídia A; Hirata, Ronaldo; Bonfante, Gerson; Coelho, Paulo G

    2015-08-01

    To evaluate the probability of survival and failure modes of implant-supported resin nanoceramic relative to metal-ceramic crowns. Resin nanoceramic molar crowns (LU) (Lava Ultimate, 3M ESPE, USA) were milled and metal-ceramic (MC) (Co-Cr alloy, Wirobond C+, Bego, USA) with identical anatomy were fabricated (n=21). The metal coping and a burnout-resin veneer were created by CAD/CAM, using an abutment (Stealth-abutment, Bicon LLC, USA) and a milled crown from the LU group as models for porcelain hot-pressing (GC-Initial IQ-Press, GC, USA). Crowns were cemented, the implants (n=42, Bicon) embedded in acrylic-resin for mechanical testing, and subjected to single-load to fracture (SLF, n=3 each) for determination of step-stress profiles for accelerated-life testing in water (n=18 each). Weibull curves (50,000 cycles at 200N, 90% CI) were plotted. Weibull modulus (m) and characteristic strength (η) were calculated and a contour plot used (m versus η) for determining differences between groups. Fractography was performed using SEM and polarized-light microscopy. SLF mean values were 1871N (±54.03) for MC and 1748N (±50.71) for LU. Beta values were 0.11 for MC and 0.49 for LU. Weibull modulus was 9.56 and η=1038.8N for LU, and m=4.57 and η=945.42N for MC (p>0.10). Probability of survival (50,000 and 100,000 cycles at 200 and 300N) was 100% for LU and 99% for MC. Failures were cohesive within LU. In MC crowns, porcelain veneer fractures frequently extended to the supporting metal coping. Probability of survival was not different between crown materials, but failure modes differed. In load bearing regions, similar reliability should be expected for metal ceramics, known as the gold standard, and resin nanoceramic crowns over implants. Failure modes involving porcelain veneer fracture and delamination in MC crowns are less likely to be successfully repaired compared to cohesive failures in resin nanoceramic material. Copyright © 2015 Academy of Dental Materials.
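
    For illustration of the Weibull quantities reported above (modulus m and characteristic strength η), here is a minimal sketch assuming hypothetical single-load-to-fracture data rather than the study's step-stress records:

        import numpy as np
        from scipy import stats

        # Hypothetical fracture loads (N); invented for the example.
        loads = np.array([1702.0, 1748.0, 1795.0, 1820.0, 1691.0,
                          1760.0, 1810.0, 1733.0])

        # Two-parameter Weibull fit: c is the modulus m, scale is the
        # characteristic strength eta (the load with 63.2% failure probability).
        m, _, eta = stats.weibull_min.fit(loads, floc=0)
        print(f"m = {m:.1f}, eta = {eta:.0f} N")

        # Probability of surviving a 200 N functional load under this fit.
        print("P(survive 200 N) =", stats.weibull_min.sf(200.0, m, scale=eta))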

  12. The two-parametric scaling and new temporal asymptotic of survival probability of diffusing particle in the medium with traps.

    Science.gov (United States)

    Arkhincheev, V E

    2017-03-01

    The new asymptotic behavior of the survival probability of particles in a medium with absorbing traps in an electric field has been established in two ways: by using the scaling approach and by directly solving the diffusion equation in the field. It is shown that at long times this drift mechanism leads to a new temporal behavior of the survival probability of particles in a medium with absorbing traps.

  13. Inelastic cross section and survival probabilities at the LHC in minijet models

    Science.gov (United States)

    Fagundes, Daniel A.; Grau, Agnes; Pancheri, Giulia; Shekhovtsova, Olga; Srivastava, Yogendra N.

    2017-09-01

    Recent results for the total and inelastic hadronic cross sections from LHC experiments are compared with predictions from a single-channel eikonal minijet model driven by parton density functions and from an empirical model. The role of soft gluon resummation in the infrared region in taming the rise of minijets and their contribution to the increase of the total cross sections at high energies are discussed. Survival probabilities at the LHC, whose theoretical estimates range from circa 10% to a few per mille, are estimated in this model and compared with results from QCD-inspired models and from multichannel eikonal models. We revisit a previous calculation and examine the origin of these discrepancies.

  14. Predicting longitudinal trajectories of health probabilities with random-effects multinomial logit regression.

    Science.gov (United States)

    Liu, Xian; Engel, Charles C

    2012-12-20

    Researchers often encounter longitudinal health data characterized with three or more ordinal or nominal categories. Random-effects multinomial logit models are generally applied to account for potential lack of independence inherent in such clustered data. When parameter estimates are used to describe longitudinal processes, however, random effects, both between and within individuals, need to be retransformed for correctly predicting outcome probabilities. This study attempts to go beyond existing work by developing a retransformation method that derives longitudinal growth trajectories of unbiased health probabilities. We estimated variances of the predicted probabilities by using the delta method. Additionally, we transformed the covariates' regression coefficients on the multinomial logit function, not substantively meaningful, to the conditional effects on the predicted probabilities. The empirical illustration uses the longitudinal data from the Asset and Health Dynamics among the Oldest Old. Our analysis compared three sets of the predicted probabilities of three health states at six time points, obtained from, respectively, the retransformation method, the best linear unbiased prediction, and the fixed-effects approach. The results demonstrate that neglect of retransforming random errors in the random-effects multinomial logit model results in severely biased longitudinal trajectories of health probabilities as well as overestimated effects of covariates on the probabilities. Copyright © 2012 John Wiley & Sons, Ltd.
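
    A minimal sketch of the retransformation idea, with made-up fixed effects and random-intercept standard deviations: the marginal category probabilities come from averaging the softmax over draws of the random effects, rather than plugging in their zero mean, and the two answers visibly differ:

        import numpy as np

        rng = np.random.default_rng(0)

        def softmax(z):
            e = np.exp(z - z.max(axis=-1, keepdims=True))
            return e / e.sum(axis=-1, keepdims=True)

        # Invented 3-category model: per non-reference category an intercept,
        # a slope on time, and a subject-level random intercept (sd sigma).
        beta = np.array([[0.5, -0.02], [1.0, -0.05]])
        sigma = np.array([0.8, 0.8])

        def plugin_probs(time):
            # Biased shortcut: evaluate at the random effects' mean (zero).
            eta = beta[:, 0] + beta[:, 1] * time
            return softmax(np.concatenate([[0.0], eta]))

        def marginal_probs(time, n_draws=200_000):
            # Retransformation: average probabilities over the random effects.
            u = rng.normal(0.0, sigma, size=(n_draws, 2))
            eta = beta[:, 0] + beta[:, 1] * time + u
            z = np.concatenate([np.zeros((n_draws, 1)), eta], axis=1)
            return softmax(z).mean(axis=0)

        for t in (0.0, 6.0):
            print(t, plugin_probs(t).round(3), marginal_probs(t).round(3))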

  15. Crossing probability for directed polymers in random media. II. Exact tail of the distribution.

    Science.gov (United States)

    De Luca, Andrea; Le Doussal, Pierre

    2016-03-01

    We study the probability p ≡ p_η(t) that two directed polymers in a given random potential η and with fixed and nearby endpoints do not cross until time t. This probability is itself a random variable (over samples η), which, as we show, acquires a very broad probability distribution at large time. In particular, the moments of p are found to be dominated by atypical samples where p is of order unity. Building on a formula established by us in a previous work using nested Bethe ansatz and Macdonald process methods, we obtain analytically the leading large-time behavior of all moments, ⟨p^m⟩ ≃ γ_m/t. From this, we extract the exact tail ∼ ρ(p)/t of the probability distribution of the noncrossing probability at large time. The exact formula is compared to numerical simulations, with excellent agreement.

  16. Survival probabilities of first and second clutches of blackbird (Turdus merula) in an urban environment

    Directory of Open Access Journals (Sweden)

    Kurucz Kornelia

    2010-01-01

    The breeding success of blackbirds was investigated in April and June of 2008 and 2009 in the Botanical Garden of the University of Pecs, with a total of 50 artificial nests at each of the four sessions (1 quail egg and 1 plasticine egg placed in every nest). In all four study periods of the two years, 2 nests (4%) were destroyed by predators. Six nests (12%) were not discovered in either of the cases. The survival probability of artificial nests was greater in April than in June in both years, but the difference was significant only in 2008. Nests placed into a curtain of ivy (Hedera helix) on a wall were located higher up than those in bushes, yet their predation rates were quite similar. The predation values of quail vs. plasticine eggs did not differ in 2008. In 2009, however, significantly more quail eggs were discovered (mostly removed) than plasticine eggs. Marks left on plasticine eggs originated mostly from small mammals and small-bodied birds, but the disappearance of a large number of quail and plasticine eggs was probably caused by larger birds, primarily jays.

  17. Use of ELVIS II platform for random process modelling and analysis of its probability density function

    Science.gov (United States)

    Maslennikova, Yu. S.; Nugmanov, I. S.

    2016-08-01

    The problem of estimating the probability density function of a random process is one of the most common in practice, and there are several methods to solve it. The laboratory work presented here uses methods of mathematical statistics to detect patterns in realizations of a random process. On the basis of ergodic theory, we construct an algorithm for estimating the univariate probability density function of a random process. Correlation analysis of the realizations is applied to estimate the necessary sample size and observation time. Hypothesis testing for two probability distributions (normal and Cauchy) is performed on the experimental data using the χ² criterion. To facilitate understanding and clarity of the problem solved, we use the ELVIS II platform and the LabVIEW software package, which allow us to make the necessary calculations, display the results of the experiment and, most importantly, control the experiment. At the same time, students are introduced to the LabVIEW software package and its capabilities.
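
    A minimal NumPy/SciPy rendering of the same workflow (histogram density estimate plus χ² goodness-of-fit against normal and Cauchy models), outside the ELVIS II/LabVIEW environment the work itself uses; the sample here is simulated:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        x = rng.normal(0.0, 1.0, size=5000)  # stand-in for a recorded realization

        # Histogram estimate of the univariate probability density function.
        counts, edges = np.histogram(x, bins=30)

        def chi2_stat(dist):
            # chi^2 statistic against a model with parameters fit to the sample.
            params = dist.fit(x)
            expected = np.diff(dist.cdf(edges, *params)) * counts.sum()
            keep = expected > 5           # usual rule of thumb for bin validity
            return (((counts - expected) ** 2 / expected)[keep]).sum()

        print("chi2 vs normal:", chi2_stat(stats.norm))
        print("chi2 vs Cauchy:", chi2_stat(stats.cauchy))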

  18. Escape probability and mean residence time in random flows with unsteady drift

    Directory of Open Access Journals (Sweden)

    Brannan James R.

    2001-01-01

    We investigate fluid transport in random velocity fields with unsteady drift. First, we propose to quantify fluid transport between flow regimes of different characteristic motion by escape probability and mean residence time. We then develop numerical algorithms to solve for escape probability and mean residence time, which are described by backward Fokker-Planck type partial differential equations. A few computational issues are also discussed. Finally, we apply these ideas and numerical algorithms to a tidal flow model.
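
    The paper solves backward Fokker-Planck boundary-value problems; as a rough cross-check, the same two quantities can be estimated by Monte Carlo over Euler-Maruyama paths. A sketch with an assumed oscillatory (unsteady) drift; all names and parameters are ours:

        import numpy as np

        rng = np.random.default_rng(2)

        def escape_stats(x0, drift, sigma, a=-1.0, b=1.0, dt=1e-3, n=20_000):
            # Euler-Maruyama paths of dX = drift(X, t) dt + sigma dW from x0;
            # estimate P(exit through b) and the mean residence time in (a, b).
            x = np.full(n, x0)
            alive = np.ones(n, dtype=bool)
            right = np.zeros(n, dtype=bool)
            t_exit = np.zeros(n)
            t = 0.0
            while alive.any() and t < 50.0:
                x[alive] += drift(x[alive], t) * dt \
                    + sigma * rng.normal(0.0, np.sqrt(dt), alive.sum())
                t += dt
                out = alive & ((x <= a) | (x >= b))
                right[out], t_exit[out] = x[out] >= b, t
                alive &= ~out
            return right.mean(), t_exit[~alive].mean()

        # Unsteady drift: a weak oscillatory sweep superposed on the noise.
        p, tau = escape_stats(0.0, lambda x, t: 0.5 * np.sin(t), sigma=1.0)
        print("escape probability (right):", p, "mean residence time:", tau)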

  19. Approximations to the Probability of Failure in Random Vibration by Integral Equation Methods

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    Close approximations to the first passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first passage probability density function and the distribution function for the time interval spent below a barrier before outcrossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval, and hence for the first passage probability density. The results of the theory agree well with simulation results for narrow banded processes dominated by a single frequency, as well as for bimodal processes with 2 dominating frequencies in the structural response.

  20. Odds and Probabilities Estimation for the Survival of Breast Cancer Patients with Cancer Stages 2 & 3

    Directory of Open Access Journals (Sweden)

    Urrutia Jackie D.

    2016-01-01

    Breast cancer is one of the leading causes of death in the Philippines. One out of four patients diagnosed with breast cancer dies within the first five years, no less than 40 percent die within 10 years, and the incidence continues to rise. It is therefore very important to know the factors that affect the survival of patients. The purpose of this study is to identify the best possible treatment or combination of treatments. The researchers considered four independent variables: completed surgery, completed chemotherapy, completed hormonotherapy and completed radiotherapy. The study was limited to 160 patients with stage 2 and 135 with stage 3 cancer, for a total of 295 patients, using data gathered from three hospitals in Metro Manila. The names of the hospitals are not disclosed due to confidentiality of the data. Logistic regression analysis was used to estimate the odds, probabilities and odds ratios of patients and to identify the best treatment or combination of treatments.
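
    A minimal sketch of the kind of logistic-regression computation described, on simulated data (the coefficients and treatment assignments below are invented, not the study's): the fitted coefficients exponentiate to odds ratios, and the model returns survival probabilities for treatment combinations:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(3)
        n = 295   # 160 stage-2 + 135 stage-3 patients, as in the study

        # Invented treatment indicators: surgery, chemo, hormonotherapy, radio.
        X = rng.integers(0, 2, size=(n, 4)).astype(float)
        true_logit = -0.5 + X @ np.array([0.9, 0.6, 0.3, 0.4])
        y = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

        fit = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
        print("odds ratios:", np.exp(fit.params))      # e^beta per treatment

        # Predicted survival probability for a patient completing all four.
        print("P(survival | all treatments):", fit.predict([[1, 1, 1, 1, 1]]))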

  1. PREDICTING LONGITUDINAL TRAJECTORIES OF HEALTH PROBABILITIES WITH RANDOM-EFFECTS MULTINOMIAL LOGIT REGRESSION

    OpenAIRE

    Liu, Xian; Engel, Charles C.

    2012-01-01

    Researchers often encounter longitudinal health data characterized with three or more ordinal or nominal categories. Random-effects multinomial logit models are generally applied to account for potential lack of independence inherent in such clustered data. When parameter estimates are used to describe longitudinal processes, however, random effects, both between and within individuals, need to be retransformed for correctly predicting outcome probabilities. This study attempts to go beyond existing work by developing a retransformation method that derives longitudinal growth trajectories of unbiased health probabilities.

  2. The probability of a random straight line in two and three dimensions

    NARCIS (Netherlands)

    Beckers, A.L.D.; Smeulders, A.W.M.

    1990-01-01

    Using properties of shift- and rotation-invariance, probability density distributions are derived for random straight lines in normal representation. It is found that in two-dimensional space the distribution of normal coordinates (r, phi) is uniform: p(r, phi) = c, where c is a normalisation constant.

  3. University Students' Conceptual Knowledge of Randomness and Probability in the Contexts of Evolution and Mathematics

    Science.gov (United States)

    Fiedler, Daniela; Tröbst, Steffen; Harms, Ute

    2017-01-01

    Students of all ages face severe conceptual difficulties regarding key aspects of evolution-- the central, unifying, and overarching theme in biology. Aspects strongly related to abstract "threshold" concepts like randomness and probability appear to pose particular difficulties. A further problem is the lack of an appropriate instrument…

  4. The random effects prep continues to mispredict the probability of replication

    NARCIS (Netherlands)

    Iverson, G.J.; Lee, M.D.; Wagenmakers, E.-J.

    2010-01-01

    In their reply, Lecoutre and Killeen (2010) argue for a random effects version of prep, in which the observed effect from one experiment is used to predict the probability that an effect from a different but related experiment will have the same sign. They present a figure giving the impression that

  5. HABITAT ASSESSMENT USING A RANDOM PROBABILITY BASED SAMPLING DESIGN: ESCAMBIA RIVER DELTA, FLORIDA

    Science.gov (United States)

    Smith, Lisa M., Darrin D. Dantin and Steve Jordan. In press. Habitat Assessment Using a Random Probability Based Sampling Design: Escambia River Delta, Florida (Abstract). To be presented at the SWS/GERS Fall Joint Society Meeting: Communication and Collaboration: Coastal Systems...

  6. Eliciting and Developing Teachers' Conceptions of Random Processes in a Probability and Statistics Course

    Science.gov (United States)

    Smith, Toni M.; Hjalmarson, Margret A.

    2013-01-01

    The purpose of this study is to examine prospective mathematics specialists' engagement in an instructional sequence designed to elicit and develop their understandings of random processes. The study was conducted with two different sections of a probability and statistics course for K-8 teachers. Thirty-two teachers participated. Video analyses…

  7. Problems in probability theory, mathematical statistics and theory of random functions

    CERN Document Server

    Sveshnikov, A A

    1979-01-01

    Problem solving is the main thrust of this excellent, well-organized workbook. Suitable for students at all levels in probability theory and statistics, the book presents over 1,000 problems and their solutions, illustrating fundamental theory and representative applications in the following fields: Random Events; Distribution Laws; Correlation Theory; Random Variables; Entropy & Information; Markov Processes; Systems of Random Variables; Limit Theorems; Data Processing; and more. The coverage of topics is both broad and deep, ranging from the most elementary combinatorial problems through limit theorems.

  8. Probability and Random Processes With Applications to Signal Processing and Communications

    CERN Document Server

    Miller, Scott

    2012-01-01

    Miller and Childers have focused on creating a clear presentation of foundational concepts with specific applications to signal processing and communications, clearly the two areas of most interest to students and instructors in this course. It is aimed at graduate students as well as practicing engineers, and includes unique chapters on narrowband random processes and simulation techniques. The appendices provide a refresher in such areas as linear algebra, set theory, random variables, and more. Probability and Random Processes also includes applications in digital communications and information theory.

  9. Model and test in a fungus of the probability that beneficial mutations survive drift

    NARCIS (Netherlands)

    Gifford, D.R.; Visser, de J.A.G.M.; Wahl, L.M.

    2013-01-01

    Determining the probability of fixation of beneficial mutations is critically important for building predictive models of adaptive evolution. Despite considerable theoretical work, models of fixation probability have stood untested for nearly a century. However, recent advances in experimental and

  10. Computer simulation of random variables and vectors with arbitrary probability distribution laws

    Science.gov (United States)

    Bogdan, V. M.

    1981-01-01

    Assume that there is given an arbitrary n-dimensional probability distribution F. A recursive construction is found for a sequence of functions x_1 = f_1(U_1, …, U_n), …, x_n = f_n(U_1, …, U_n) such that if U_1, …, U_n are independent random variables having uniform distribution over the open interval (0,1), then the joint distribution of the variables x_1, …, x_n coincides with the distribution F. Since uniform independent random variables can be well simulated by means of a computer, this result allows one to simulate arbitrary n-random variables if their joint probability distribution is known.
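
    A two-dimensional instance of this recursive construction (often called the conditional-quantile or Rosenblatt transform), for an assumed target distribution of our choosing:

        import numpy as np

        rng = np.random.default_rng(4)
        u1, u2 = rng.random(100_000), rng.random(100_000)

        # Assumed target F: X1 ~ Exp(1), and X2 | X1 = x ~ Exp(rate 1 + x).
        x1 = -np.log(1.0 - u1)                # f1(u1): inverse marginal CDF
        x2 = -np.log(1.0 - u2) / (1.0 + x1)   # f2(u1, u2): inverse conditional CDF

        # Sanity checks against the target.
        print("E[X1] (should be ~1):", x1.mean())
        print("E[X2 | X1 < 0.1] (should be ~0.95):", x2[x1 < 0.1].mean())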

  11. Hypothyroidism improves random-pattern skin flap survival in rats.

    Science.gov (United States)

    Rahimpour, Sina; Nezami, Behtash Ghazi; Karimian, Negin; Sotoudeh-Anvari, Maryam; Ebrahimi, Farzad; Taleb, Shayandokht; Mirazi, Naser; Dehpour, Ahmad Reza

    2012-11-01

    The protective effect of hypothyroidism against ischemic or toxic conditions has been shown in various tissues. We investigated the effect of propylthiouracil (PTU)/methimazole (MMI)-induced hypothyroidism and the acute local effect of MMI on the outcome of lethal ischemia in random-pattern skin flaps. Dorsal flaps with caudal pedicles were elevated at midline and flap survival was measured on the seventh day after surgery. The first group, as control, received 1 mL of 0.9% saline solution in the flap before flap elevation. In groups 2 and 3, hypothyroidism was induced by administration of either PTU 0.05% or MMI 0.04% in drinking water. The next four groups received local injections of MMI (10, 20, 50, or 100 μg/flap) before flap elevation. Local PTU injection was omitted due to the insolubility of the agent. Hypothyroidism was induced in the chronic PTU- and MMI-treated groups, and animals in these groups showed a significant increase in flap survival compared to control euthyroid rats (79.47% ± 10.49% and 75.48% ± 12.93% versus 52.26% ± 5.75%, respectively; P < 0.05). In conclusion, hypothyroidism improves survival of random-pattern skin flaps in rats. Copyright © 2012 Elsevier Inc. All rights reserved.

  12. How the probability of presentation to a primary care clinician correlates with cancer survival rates: a European survey using vignettes.

    Science.gov (United States)

    Harris, Michael; Frey, Peter; Esteva, Magdalena; Gašparović Babić, Svjetlana; Marzo-Castillejo, Mercè; Petek, Davorina; Petek Ster, Marija; Thulesius, Hans

    2017-03-01

    European cancer survival rates vary widely. System factors, including whether or not primary care physicians (PCPs) are gatekeepers, may account for some of these differences. This study explores where patients who may have cancer are likely to present for medical care in different European countries, and how probability of presentation to a primary care clinician correlates with cancer survival rates. Seventy-eight PCPs in a range of European countries assessed four vignettes representing patients who might have cancer, and consensus groups agreed how likely those patients were to present to different clinicians in their own countries. These data were compared with national cancer survival rates. Setting: a total of 14 countries. Subjects: consensus groups of PCPs. Main outcome measure: probability of initial presentation to a PCP for four clinical vignettes. There was no significant correlation between overall national 1-year relative cancer survival rates and the probability of initial presentation to a PCP (r = -0.16, 95% CI -0.39 to 0.08). Within that there was large variation depending on the type of cancer, with significantly poorer lung cancer survival in countries where patients were more likely to initially consult a PCP (lung: r = -0.57, 95% CI -0.83 to -0.12; ovary: r = -0.13, 95% CI -0.57 to 0.38; breast: r = 0.14, 95% CI -0.36 to 0.58; bowel: r = 0.20, 95% CI -0.31 to 0.62). There were wide variations in the degree of gatekeeping between countries, with no simple binary model as to whether or not a country has a "PCP-as-gatekeeper" system. While there was case-by-case variation, there was no overall evidence of a link between a higher probability of initial consultation with a PCP and poorer cancer survival. KEY POINTS: European cancer survival rates vary widely, and health system factors may account for some of these differences. The data from 14 European countries show a wide variation in the probability of initial presentation to a PCP. The degree to

  13. Estimation of the probability of bacterial population survival: Development of a probability model to describe the variability in time to inactivation of Salmonella enterica.

    Science.gov (United States)

    Koyama, Kento; Hokunan, Hidekazu; Hasegawa, Mayumi; Kawamura, Shuso; Koseki, Shigenobu

    2017-12-01

    Despite the development of numerous predictive microbial inactivation models, a model focusing on the variability in time to inactivation for a bacterial population has not been developed. Additionally, an appropriate estimation of the risk of there being any remaining bacterial survivors in foods after the application of an inactivation treatment has not yet been established. Here, the Gamma distribution, as a representative probability distribution, was used to estimate the variability in time to inactivation for a bacterial population. Salmonella enterica serotype Typhimurium was evaluated for survival in a low relative humidity environment. We prepared bacterial cells with an initial concentration that was adjusted to 2 × 10^n colony-forming units/2 μl (n = 1, 2, 3, 4, 5) by performing a serial 10-fold dilution, and then we placed 2 μl of the inocula into each well of 96-well microplates. The microplates were stored in a desiccated environment at 10-20% relative humidity at 5, 15, or 25 °C. The survival or death of bacterial cells for each well in the 96-well microplate was confirmed by adding tryptic soy broth as an enrichment culture. The changes in the death probability of the 96 replicated bacterial populations were described as a cumulative Gamma distribution. The variability in time to inactivation was described by transforming the cumulative Gamma distribution into a Gamma distribution. We further examined the bacterial inactivation on almond kernels and radish sprout seeds. Additionally, we described certainty levels of bacterial inactivation that ensure the death probability of a bacterial population at six decimal reduction levels, ranging from 90 to 99.9999%. Consequently, the probability model developed in the present study enables us to estimate the death probability of bacterial populations in a desiccated environment over time. This probability model may be useful for risk assessment to estimate the amount of remaining bacteria in a given
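
    A minimal sketch of the fitting step, with hypothetical inactivation times (not the study's data): the cumulative Gamma gives the death probability of a population by time t, and its quantile function gives the storage time needed for a chosen certainty level:

        import numpy as np
        from scipy import stats

        # Hypothetical times to inactivation (days) of replicate populations.
        t_inact = np.array([3.1, 4.5, 5.2, 6.0, 4.8, 5.5, 7.1, 3.9, 6.4, 5.0])

        # Fit the Gamma model for the variability in time to inactivation.
        a, _, scale = stats.gamma.fit(t_inact, floc=0)

        # Death probability of a population by time t: the cumulative Gamma.
        for t in (3.0, 7.0, 14.0):
            print(f"P(inactivated by day {t}):", stats.gamma.cdf(t, a, scale=scale))

        # Storage time for a 99.9999% death probability (six-log certainty).
        print("t(99.9999%):", stats.gamma.ppf(0.999999, a, scale=scale))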

  14. Fortran code for generating random probability vectors, unitaries, and quantum states

    Directory of Open Access Journals (Sweden)

    Jonas Maziero

    2016-03-01

    The usefulness of generating random configurations is recognized in many areas of knowledge. Fortran was born for scientific computing and has been one of the main programming languages in this area since then, and several ongoing projects targeting its betterment indicate that it will keep this status in the decades to come. In this article, we describe Fortran codes produced, or organized, for the generation of the following random objects: numbers, probability vectors, unitary matrices, and quantum state vectors and density matrices. Some matrix functions are also included and may be of independent interest.
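
    The article's routines are in Fortran; purely as an illustration, two of the standard constructions it covers look like this in Python (uniform sampling on the probability simplex, and random density matrices via the Ginibre ensemble):

        import numpy as np

        rng = np.random.default_rng(5)

        def random_probability_vector(d):
            # Normalised i.i.d. exponentials: uniform on the probability
            # simplex, i.e. Dirichlet(1, ..., 1).
            e = rng.exponential(1.0, size=d)
            return e / e.sum()

        def random_density_matrix(d):
            # Ginibre construction: G G^+ / tr(G G^+) samples the
            # Hilbert-Schmidt ensemble of density matrices.
            g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
            rho = g @ g.conj().T
            return rho / np.trace(rho)

        p, rho = random_probability_vector(4), random_density_matrix(2)
        print(p, p.sum())
        print(np.linalg.eigvalsh(rho))   # nonnegative, summing to 1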

  15. Estimating the probability of survival of individual shortleaf pine (Pinus echinata mill.) trees

    Science.gov (United States)

    Sudip Shrestha; Thomas B. Lynch; Difei Zhang; James M. Guldin

    2012-01-01

    A survival model is needed in a forest growth system that predicts the survival of trees on an individual basis or on a stand basis (Gertner, 1989). An individual-tree modeling approach is one of the better methods available for predicting growth and yield, as it provides essential information about a particular tree species: tree size, tree quality and present tree status...

  16. Survival probabilities of loggerhead sea turtles (Caretta caretta) estimated from capture-mark-recapture data in the Mediterranean Sea

    Directory of Open Access Journals (Sweden)

    Paolo Casale

    2007-06-01

    Survival probabilities of loggerhead sea turtles (Caretta caretta) are estimated for the first time in the Mediterranean by analysing 3254 tagging and 134 re-encounter data from this region. Most of these turtles were juveniles found at sea. Re-encounters were live resightings and dead recoveries and data were analysed with Barker's model, a modified version of the Cormack-Jolly-Seber model which can combine recapture, live resighting and dead recovery data. An annual survival probability of 0.73 (CI 95% = 0.67-0.78; n=3254) was obtained, and should be considered as a conservative estimate due to an unknown, though not negligible, tag loss rate. This study makes a preliminary estimate of the survival probabilities of in-water developmental stages for the Mediterranean population of endangered loggerhead sea turtles and provides the first insights into the magnitude of the suspected human-induced mortality in the region. The model used here for the first time on sea turtles could be used to obtain survival estimates from other data sets with few or no true recaptures but with other types of re-encounter data, which are a common output of tagging programmes involving these wide-ranging animals.

  17. Reducing bias in survival under non-random temporary emigration

    Science.gov (United States)

    Peñaloza, Claudia L.; Kendall, William L.; Langtimm, Catherine Ann

    2014-01-01

    Despite intensive monitoring, temporary emigration from the sampling area can induce bias severe enough for managers to discard life-history parameter estimates toward the terminus of the time series (terminal bias). Under random temporary emigration, unbiased parameters can be estimated with CJS models. However, unmodeled Markovian temporary emigration causes bias in parameter estimates, and an unobservable state is required to model this type of emigration. The robust design is most flexible when modeling temporary emigration, and partial solutions to mitigate bias have been identified; nonetheless, there are conditions where terminal bias prevails. Long-lived species with high adult survival and highly variable non-random temporary emigration present terminal bias in survival estimates, despite being modeled with the robust design and suggested constraints. Because this bias is due to uncertainty about the fate of individuals that are undetected toward the end of the time series, solutions should involve using additional information on survival status or location of these individuals at that time. Using simulation, we evaluated the performance of models that jointly analyze robust design data and an additional source of ancillary data (predictive covariate on temporary emigration, telemetry, dead recovery, or auxiliary resightings) in reducing terminal bias in survival estimates. The auxiliary resighting and predictive covariate models reduced terminal bias the most. Additional telemetry data was effective at reducing terminal bias only when individuals were tracked for a minimum of two years. High adult survival of long-lived species made the joint model with recovery data ineffective at reducing terminal bias because of small-sample bias. The naïve constraint model (last and penultimate temporary emigration parameters made equal) was the least efficient, though still able to reduce terminal bias when compared to an unconstrained model. Joint analysis of several

  18. Age-specific survival and reproductive probabilities: evidence for senescence in male fallow deer (Dama dama)

    National Research Council Canada - National Science Library

    A. G. McElligott; R. Altwegg; T. J. Altwegg

    2002-01-01

    The best-fitting model revealed that fallow bucks have four life-history stages: yearling, pre-reproductive, prime-age and senescent. Pre-reproductive males (2 and 3 years old) had the highest survival...

  19. Surviving probability indicators of landing juvenile magellanic penguins arriving along the southern Brazilian coast

    Directory of Open Access Journals (Sweden)

    Sandra Carvalho Rodrigues

    2010-04-01

    The aim of this work was to monitor the hematocrit and weight of juvenile penguins, with and without oil cover, found alive along the southern coast of Brazil, from capture until eventual death or release. Released juvenile penguins showed higher final weight and hematocrit (3.65 ± 0.06 kg and 44.63 ± 0.29%, respectively) than those that died (2.88 ± 0.08 kg and 34.42 ± 1.70%, respectively). Penguins with higher hematocrit and weight after capture had a higher mean weight gain than their counterparts with smaller hematocrit and weight after capture. Moreover, juveniles with higher hematocrit and weight after capture had higher survival rates, independent of the presence or absence of oil. The results suggest that the juveniles covered with oil might have been healthier than the juveniles without oil. The animals without oil probably died as a consequence of health disturbances, while the animals with oil were possibly healthy before contact with oil in the sea.

  20. Nonlinear effects of winter sea ice on the survival probabilities of Adélie penguins.

    Science.gov (United States)

    Ballerini, Tosca; Tavecchia, Giacomo; Olmastroni, Silvia; Pezzo, Francesco; Focardi, Silvano

    2009-08-01

    The population dynamics of Antarctic seabirds are influenced by variations in winter sea ice extent and persistence; however, the type of relationship differs according to the region and the demographic parameter considered. We used annual presence/absence data obtained from 1,138 individually marked birds to study the influence of environmental and individual characteristics on the survival of Adélie penguins Pygoscelis adeliae at Edmonson Point (Ross Sea, Antarctica) between 1994 and 2005. About 25% of 600 birds marked as chicks were reobserved at the natal colony. The capture and survival rates of Adélie penguins at this colony increased with the age of individuals, and five age classes were identified for both parameters. Mean adult survival was 0.85 (SE = 0.01), and no effect of sex on survival was evident. Breeding propensity, as measured by adult capture rates, was close to one, indicating a constant breeding effort through time. Temporal variations in survival were best explained by a quadratic relationship with winter sea ice extent anomalies in the Ross Sea, suggesting that for this region optimal conditions are intermediate between too much and too little winter sea ice. This is likely the result of a balance between suitable wintering habitat and food availability. Survival rates were not correlated with the Southern Oscillation Index. Low adult survival after a season characterized by severe environmental conditions at breeding but favorable conditions during winter suggested an additional mortality mediated by the reproductive effort. Adélie penguins are sensitive indicators of environmental changes in the Antarctic, and the results from this study provide insights into regional responses of this species to variability in winter sea ice habitat.

  1. Survival probability and first-passage-time statistics of a Wiener process driven by an exponential time-dependent drift

    Science.gov (United States)

    Urdapilleta, Eugenio

    2011-02-01

    The survival probability and the first-passage-time statistics are important quantities in different fields. The Wiener process is the simplest stochastic process with continuous variables, and important results can be explicitly found from it. The presence of a constant drift does not modify its simplicity; however, when the process has a time-dependent component the analysis becomes difficult. In this work we analyze the statistical properties of the Wiener process with an absorbing boundary, under the effect of an exponential time-dependent drift. Based on the backward Fokker-Planck formalism we set the time-inhomogeneous equation and conditions that rule the diffusion of the corresponding survival probability. We propose as the solution an expansion series in terms of the intensity of the exponential drift, resulting in a set of recurrence equations. We explicitly solve the expansion up to second order and comment on higher-order solutions. The first-passage-time density function arises naturally from the survival probability and preserves the proposed expansion. Explicit results, related properties, and limit behaviors are analyzed and extensively compared to numerical simulations.
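
    A hedged Monte Carlo counterpart to the analytic expansion described above: simulate the Wiener process with an exponentially decaying drift (all parameters below are invented) and an absorbing boundary, read off the survival probability, and difference it for the first-passage-time density:

        import numpy as np

        rng = np.random.default_rng(6)

        def survival(t_max, mu0=0.5, tau=1.0, x0=1.0, dt=1e-3, n=20_000):
            # Wiener paths with drift mu(t) = mu0 * exp(-t / tau), absorbed
            # at 0; S(t) is the fraction of paths not yet absorbed.
            x = np.full(n, x0)
            alive = np.ones(n, dtype=bool)
            t_grid = np.arange(0.0, t_max, dt)
            s = np.empty(t_grid.size)
            for i, t in enumerate(t_grid):
                x[alive] += mu0 * np.exp(-t / tau) * dt \
                    + rng.normal(0.0, np.sqrt(dt), alive.sum())
                alive &= x > 0.0
                s[i] = alive.mean()
            return t_grid, s

        t, s = survival(5.0)
        fpt_density = -np.gradient(s, t)  # first-passage-time density = -dS/dt
        print("S(5) =", s[-1], " peak of FPT density:", fpt_density.max())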


  3. PItcHPERFeCT: Primary Intracranial Hemorrhage Probability Estimation using Random Forests on CT.

    Science.gov (United States)

    Muschelli, John; Sweeney, Elizabeth M; Ullman, Natalie L; Vespa, Paul; Hanley, Daniel F; Crainiceanu, Ciprian M

    2017-01-01

    Intracerebral hemorrhage (ICH), where a blood vessel ruptures into areas of the brain, accounts for approximately 10-15% of all strokes. X-ray computed tomography (CT) scanning is largely used to assess the location and volume of these hemorrhages. Manual segmentation of the CT scan using planimetry by an expert reader is the gold standard for volume estimation, but is time-consuming and has within- and across-reader variability. We propose a fully automated segmentation approach using a random forest algorithm with features extracted from CT scans. The Minimally Invasive Surgery plus rt-PA in ICH Evacuation (MISTIE) trial was a multi-site Phase II clinical trial that tested the safety of hemorrhage removal using recombinant-tissue plasminogen activator (rt-PA). For this analysis, we use 112 baseline CT scans from patients enrolled in the MISTIE trial, one CT scan per patient. ICH was manually segmented on these CT scans by expert readers. We derived a set of imaging predictors from each scan. Using 10 randomly-selected scans, we used a first-pass voxel selection procedure based on quantiles of a set of predictors and then built 4 models estimating the voxel-level probability of ICH. The models used were: 1) logistic regression, 2) logistic regression with a penalty on the model parameters using LASSO, 3) a generalized additive model (GAM) and 4) a random forest classifier. The remaining 102 scans were used for model validation. For each validation scan, the model predicted the probability of ICH at each voxel. These voxel-level probabilities were then thresholded to produce binary segmentations of the hemorrhage. These masks were compared to the manual segmentations using the Dice Similarity Index (DSI) and the correlation of hemorrhage volumes between the two segmentations. We tested equality of median DSI using the Kruskal-Wallis test across the 4 models. We tested equality of the median DSI from sets of 2 models using a Wilcoxon
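
    A toy version of the pipeline's final stage (random-forest voxel probabilities, thresholding, Dice Similarity Index), with random stand-in features rather than the MISTIE-derived predictors; it only demonstrates the shape of the computation:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(7)

        # Random stand-ins for voxel-level CT features and manual labels.
        X_tr, y_tr = rng.normal(size=(5000, 6)), rng.integers(0, 2, 5000)
        X_va, y_va = rng.normal(size=(2000, 6)), rng.integers(0, 2, 2000)

        rf = RandomForestClassifier(n_estimators=200, random_state=0)
        rf.fit(X_tr, y_tr)

        # Voxel-level ICH probabilities, thresholded to a binary segmentation.
        seg = rf.predict_proba(X_va)[:, 1] > 0.5

        def dice(a, b):
            # Dice Similarity Index: 2|A & B| / (|A| + |B|).
            return 2.0 * (a & b).sum() / (a.sum() + b.sum())

        print("DSI vs manual segmentation:", dice(seg, y_va.astype(bool)))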

  4. Spencer-Brown vs. Probability and Statistics: Entropy’s Testimony on Subjective and Objective Randomness

    Directory of Open Access Journals (Sweden)

    Julio Michael Stern

    2011-04-01

    This article analyzes the role of entropy in Bayesian statistics, focusing on its use as a tool for detection, recognition and validation of eigen-solutions. “Objects as eigen-solutions” is a key metaphor of the cognitive constructivism epistemological framework developed by the philosopher Heinz von Foerster. Special attention is given to some objections to the concepts of probability, statistics and randomization posed by George Spencer-Brown, a figure of great influence in the field of radical constructivism.

  5. Looping probability of random heteropolymers helps to understand the scaling properties of biopolymers.

    Science.gov (United States)

    Zhan, Y; Giorgetti, L; Tiana, G

    2016-09-01

    Random heteropolymers are a minimal description of biopolymers and can provide a theoretical framework to investigate the formation of loops in biophysical experiments. In some biopolymers, such as chromosomes in cell nuclei or long RNA chains, the looping probability as a function of polymer length was observed to display anomalous scaling exponents. Combining a two-state model with self-adjusting simulated-tempering calculations, we calculate numerically the looping properties of several realizations of the random interactions within the chain. We find a continuous set of exponents upon varying the temperature, which arises from finite-size effects and is amplified by the disorder of the interactions. We suggest that this could provide a simple explanation for the anomalous scaling exponents found in experiments. In addition, our results have important implications, notably for the study of chromosome folding, as they show that scaling exponents cannot be the sole criteria for testing hypothesis-driven models of chromosome architecture.

  6. Intraseasonal variation in survival and probable causes of mortality in greater sage-grouse Centrocercus urophasianus

    Science.gov (United States)

    Blomberg, Erik J.; Gibson, Daniel; Sedinger, James S.; Casazza, Michael L.; Coates, Peter S.

    2013-01-01

    The mortality process is a key component of avian population dynamics, and understanding factors that affect mortality is central to grouse conservation. Populations of greater sage-grouse Centrocercus urophasianus have declined across their range in western North America. We studied cause-specific mortality of radio-marked sage-grouse in Eureka County, Nevada, USA, during two seasons, nesting (2008-2012) and fall (2008-2010), when survival was known to be lower compared to other times of the year. We used known-fate and cumulative incidence function models to estimate weekly survival rates and cumulative risk of cause-specific mortalities, respectively. These methods allowed us to account for temporal variation in sample size and staggered entry of marked individuals into the sample to obtain robust estimates of survival and cause-specific mortality. We monitored 376 individual sage-grouse during the course of our study, and investigated 87 deaths. Predation was the major source of mortality, and accounted for 90% of all mortalities during our study. During the nesting season (1 April - 31 May), the cumulative risk of predation by raptors (0.10; 95% CI: 0.05-0.16) and mammals (0.08; 95% CI: 0.03-0.13) was relatively equal. In the fall (15 August - 31 October), the cumulative risk of mammal predation (M_mam = 0.12; 95% CI: 0.04-0.19) was greater than that of either predation by raptors (M_rap = 0.05; 95% CI: 0.00-0.10) or hunting harvest (M_hunt = 0.02; 95% CI: 0.00-0.06). During both seasons, we observed relatively few additional sources of mortality (e.g. collision) and observed no evidence of disease-related mortality (e.g. West Nile Virus). In general, we found little evidence for intraseasonal temporal variation in survival, suggesting that the nesting and fall seasons represent biologically meaningful time intervals with respect to sage-grouse survival.

  7. The Development and Application of Random Match Probabilities to Firearm and Toolmark Identification.

    Science.gov (United States)

    Murdock, John E; Petraco, Nicholas D K; Thornton, John I; Neel, Michael T; Weller, Todd J; Thompson, Robert M; Hamby, James E; Collins, Eric R

    2017-05-01

    The field of firearms and toolmark analysis has encountered deep scrutiny of late, stemming from a handful of voices, primarily in the law and statistical communities. While strong scrutiny is a healthy and necessary part of any scientific endeavor, much of the current criticism leveled at firearm and toolmark analysis is, at best, misinformed and, at worst, punditry. One of the most persistent criticisms stems from the view that as the field lacks quantified random match probability data (or at least a firm statistical model) with which to calculate the probability of a false match, all expert testimony concerning firearm and toolmark identification or source attribution is unreliable and should be ruled inadmissible. However, this critique does not stem from the hard work of actually obtaining data and performing the scientific research required to support or reject current findings in the literature. Although there are sound reasons (described herein) why there is currently no unifying probabilistic model for the comparison of striated and impressed toolmarks as there is in the field of forensic DNA profiling, much statistical research has been, and continues to be, done to aid the criminal justice system. This research has thus far shown that error rate estimates for the field are very low, especially when compared to other forms of judicial error. The first purpose of this paper is to point out the logical fallacies in the arguments of a small group of pundits, who advocate a particular viewpoint but cloak it as fact and research. The second purpose is to give a balanced review of the literature regarding random match probability models and statistical applications that have been carried out in forensic firearm and toolmark analysis. © 2017 American Academy of Forensic Sciences.

  8. Assessing Uncertainties of Theoretical Atomic Transition Probabilities with Monte Carlo Random Trials

    Directory of Open Access Journals (Sweden)

    Alexander Kramida

    2014-04-01

    This paper suggests a method of evaluating uncertainties in calculated transition probabilities by randomly varying parameters of an atomic code and comparing the results. A control code has been written to randomly vary the input parameters with a normal statistical distribution around initial values with a certain standard deviation. For this particular implementation, Cowan's suite of atomic codes (R.D. Cowan, The Theory of Atomic Structure and Spectra, Berkeley, CA: University of California Press, 1981) was used to calculate radiative rates of magnetic-dipole and electric-quadrupole transitions within the ground configuration of titanium-like iron, Fe V. The Slater parameters used in the calculations were adjusted to fit experimental energy levels with Cowan's least-squares fitting program, RCE. The standard deviations of the fitted parameters were used as input to the control code, providing the distribution widths of random trials for these parameters. Propagation of errors through the matrix diagonalization and summation of basis state expansions leads to significant variations in the resulting transition rates. These variations vastly differ in their magnitude for different transitions, depending on their sensitivity to errors in parameters. With this method, the rate uncertainty can be individually assessed for each calculated transition.
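
    The essence of the method is independent of Cowan's codes: perturb the fitted input parameters with normal noise, rerun the calculation, and take the spread of outputs as the uncertainty. A sketch with an invented stand-in for the atomic code (all values below are assumptions):

        import numpy as np

        rng = np.random.default_rng(8)

        def transition_rate(p):
            # Stand-in for the atomic code: any deterministic map from input
            # (e.g. Slater) parameters to a computed transition rate.
            return p[0] ** 2 / (1.0 + np.exp(-p[1])) * p[2]

        p0 = np.array([2.0, 0.5, 1.3])      # fitted parameter values
        sd = np.array([0.05, 0.10, 0.02])   # standard deviations from the fit

        # Normal random trials around the fit; the output spread is the
        # uncertainty of the computed rate.
        trials = np.array([transition_rate(rng.normal(p0, sd))
                           for _ in range(5000)])
        print(f"rate = {trials.mean():.4f} +/- {trials.std(ddof=1):.4f}")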

  9. Survival under uncertainty an introduction to probability models of social structure and evolution

    CERN Document Server

    Volchenkov, Dimitri

    2016-01-01

    This book introduces and studies a number of stochastic models of subsistence, communication, social evolution and political transition that will allow the reader to grasp the role of uncertainty as a fundamental property of our irreversible world. At the same time, it aims to bring about a more interdisciplinary and quantitative approach across very diverse fields of research in the humanities and social sciences. Through the examples treated in this work – including anthropology, demography, migration, geopolitics, management, and bioecology, among other things – evidence is gathered to show that volatile environments may change the rules of the evolutionary selection and dynamics of any social system, creating a situation of adaptive uncertainty, in particular, whenever the rate of change of the environment exceeds the rate of adaptation. Last but not least, it is hoped that this book will contribute to the understanding that inherent randomness can also be a great opportunity – for social systems an...

  10. Novel head and neck cancer survival analysis approach: random survival forests versus Cox proportional hazards regression.

    Science.gov (United States)

    Datema, Frank R; Moya, Ana; Krause, Peter; Bäck, Thomas; Willmes, Lars; Langeveld, Ton; Baatenburg de Jong, Robert J; Blom, Henk M

    2012-01-01

    Electronic patient files generate an enormous amount of medical data. These data can be used for research, such as prognostic modeling. Automatization of statistical prognostication processes allows automatic updating of models when new data is gathered. The increase of power behind an automated prognostic model makes its predictive capability more reliable. Cox proportional hazard regression is most frequently used in prognostication. Automatization of a Cox model is possible, but we expect the updating process to be time-consuming. A possible solution lies in an alternative modeling technique called random survival forests (RSFs). RSF is easily automated and is known to handle the proportionality assumption coherently and automatically. Performance of RSF has not yet been tested on a large head and neck oncological dataset. This study investigates performance of head and neck overall survival of RSF models. Performances are compared to a Cox model as the "gold standard." RSF might be an interesting alternative modeling approach for automatization when performances are similar. RSF models were created in R (Cox also in SPSS). Four RSF splitting rules were used: log-rank, conservation of events, log-rank score, and log-rank approximation. Models were based on historical data of 1371 patients with primary head-and-neck cancer, diagnosed between 1981 and 1998. Models contain 8 covariates: tumor site, T classification, N classification, M classification, age, sex, prior malignancies, and comorbidity. Model performances were determined by Harrell's concordance error rate, in which 33% of the original data served as a validation sample. RSF and Cox models delivered similar error rates. The Cox model performed slightly better (error rate, 0.2826). The log-rank splitting approach gave the best RSF performance (error rate, 0.2873). In accord with Cox and RSF models, high T classification, high N classification, and severe comorbidity are very important covariates in the
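
    A minimal sketch of the Cox side of such a comparison using lifelines, on simulated covariates and survival times (a random survival forest counterpart would come from, e.g., scikit-survival's RandomSurvivalForest), with Harrell's concordance on a held-out third as in the study:

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter
        from lifelines.utils import concordance_index

        rng = np.random.default_rng(9)
        n = 500

        # Invented head-and-neck covariates and survival times.
        df = pd.DataFrame({"age": rng.normal(62, 10, n),
                           "t_class": rng.integers(1, 5, n).astype(float),
                           "comorbidity": rng.integers(0, 4, n).astype(float)})
        risk = 0.03 * df["age"] + 0.4 * df["t_class"] + 0.3 * df["comorbidity"]
        df["T"] = rng.exponential(np.exp(-risk) * 50.0)
        df["E"] = (rng.random(n) < 0.7).astype(int)   # ~30% censoring

        train, val = df.iloc[:333], df.iloc[333:]
        cph = CoxPHFitter().fit(train, duration_col="T", event_col="E")

        # Harrell's concordance on the held-out third; error rate = 1 - C.
        c = concordance_index(val["T"], -cph.predict_partial_hazard(val), val["E"])
        print("concordance:", round(c, 3), "error rate:", round(1 - c, 3))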

  11. Continuous-time random walk: exact solutions for the probability density function and first two moments

    Energy Technology Data Exchange (ETDEWEB)

    Kwok Sau Fa [Departamento de Fisica, Universidade Estadual de Maringa, Av. Colombo 5790, 87020-900 Maringa-PR (Brazil); Joni Fat, E-mail: kwok@dfi.uem.br [Jurusan Teknik Elektro-Fakultas Teknik, Universitas Tarumanagara, Jl. Let. Jend. S. Parman 1, Blok L, Lantai 3 Grogol, Jakarta 11440 (Indonesia)

    2011-10-15

    We consider the decoupled continuous-time random walk model with a finite characteristic waiting time and approximate jump length variance. We take the waiting time probability density function (PDF) given by a combination of the exponential and the Mittag-Leffler function. Using this waiting time PDF, we investigate the diffusion behavior for all times. We obtain exact solutions for the first two moments and the PDF for the force-free and linear force cases. Due to the finite characteristic waiting time and jump length variance, the model presents, for the force-free case, normal diffusive behavior in the long-time limit. Further, the model can describe anomalous behavior at intermediate times.

  12. Computer-assisted predictive formulas expressing survival probability and life expectancy in US adults, men and women, 2001.

    Science.gov (United States)

    Chung, Sung J

    2007-06-01

    The National Center for Health Statistics (NCHS) reported the United States life tables, 2001 for US total, male and female populations on the basis of 2001 mortality statistics, the 2000 decennial census and the data from the Medicare program [E. Arias, United States life tables, 2001, Natl. Vital Stat. Rep. 52 (2004) 1-40]. The life tables show life expectancy, survival and death rate at each year between birth and 100 years of age. In this study, formulas expressing survival probability and life expectancy in US adults, men and women are constructed from the data of the NCHS. A model of the 'probacent'-probability equation previously published by the author is employed in the study. Analysis of the formula-predicted values and the NCHS-reported data indicates that the formulas are accurate and reliable, with a close agreement. The formula, representing a generalized lognormal distribution, might be useful for biomedical investigation, and epidemiological and demographic studies in US adults, men and women.

  13. Sugar administration to newly emerged Aedes albopictus males increases their survival probability and mating performance.

    Science.gov (United States)

    Bellini, Romeo; Puggioli, Arianna; Balestrino, Fabrizio; Brunelli, Paolo; Medici, Anna; Urbanelli, Sandra; Carrieri, Marco

    2014-04-01

    Aedes albopictus male survival in laboratory cages is no more than 4-5 days when kept without any access to sugar, indicating their need to feed on a sugar source soon after emergence. We therefore developed a device to administer energetic substances to newly emerged males when released as pupae as part of a sterile insect technique (SIT) programme, made with a polyurethane sponge 4 cm thick and perforated with holes 2 cm in diameter. The sponge was imbibed with the required sugar solution and, due to its high retention capacity, the sugar solution was available for males to feed on for at least 48 h. When evaluated in lab cages, comparing adults emerged from the device with sugar solution vs the device with water only (as negative control), about half of the males tested positive for fructose using the Van Handel anthrone test, compared to none of the males in the control cage. We then tested the tool in semi-field and in field conditions with different sugar concentrations (10%, 15%, and 20%) and compared results to controls fed with water only. Males were recaptured by a battery-operated manual aspirator at 24 and 48 h after pupae release. A rather high share (10-25%) of males captured in the vicinity of the control stations tested positive for fructose, while around 40-55% of males in the vicinity of the sugar stations were positive, though variability between replicates was large. The sugar-positive males in the control test may have been released males that had access to natural sugar sources found close to the release station and/or wild males present in the environment. Only a slight increase in the proportion of positive males was obtained by increasing the sugar concentration in the feeding device from 10% to 20%. Surprisingly, modification of the device to add a black plastic inverted funnel above the container reduced rather than increased the proportion of fructose-positive males collected around the station. No evidence of difference in the

  14. How long do the dead survive on the road? Carcass persistence probability and implications for road-kill monitoring surveys.

    Science.gov (United States)

    Santos, Sara M; Carvalho, Filipe; Mira, António

    2011-01-01

    Road mortality is probably the best-known and visible impact of roads upon wildlife. Although several factors influence road-kill counts, carcass persistence time is considered the most important determinant underlying underestimates of road mortality. The present study aims to describe and model carcass persistence variability on the road for different taxonomic groups under different environmental conditions throughout the year; and also to assess the effect of sampling frequency on the relative variation in road-kill estimates registered within a survey. Daily surveys of road-killed vertebrates were conducted over one year along four road sections with different traffic volumes. Survival analysis was then used to i) describe carcass persistence timings for overall and for specific animal groups; ii) assess optimal sampling designs according to research objectives; and iii) model the influence of road, animal and weather factors on carcass persistence probabilities. Most animal carcasses persisted on the road for the first day only, with some groups disappearing at very high rates. The advisable periodicity of road monitoring that minimizes bias in road mortality estimates is daily monitoring for bats (in the morning) and lizards (in the afternoon), daily monitoring for toads, small birds, small mammals, snakes, salamanders, and lagomorphs; 1 day-interval (alternate days) for large birds, birds of prey, hedgehogs, and freshwater turtles; and 2 day-interval for carnivores. Multiple factors influenced the persistence probabilities of vertebrate carcasses on the road. Overall, the persistence was much lower for small animals, on roads with lower traffic volumes, for carcasses located on road lanes, and during humid conditions and high temperatures during the wet season and dry seasons, respectively. The guidance given here on monitoring frequencies is particularly relevant to provide conservation and transportation agencies with accurate numbers of road

  15. How long do the dead survive on the road? Carcass persistence probability and implications for road-kill monitoring surveys.

    Directory of Open Access Journals (Sweden)

    Sara M Santos

    Full Text Available BACKGROUND: Road mortality is probably the best-known and visible impact of roads upon wildlife. Although several factors influence road-kill counts, carcass persistence time is considered the most important determinant underlying underestimates of road mortality. The present study aims to describe and model carcass persistence variability on the road for different taxonomic groups under different environmental conditions throughout the year; and also to assess the effect of sampling frequency on the relative variation in road-kill estimates registered within a survey. METHODOLOGY/PRINCIPAL FINDINGS: Daily surveys of road-killed vertebrates were conducted over one year along four road sections with different traffic volumes. Survival analysis was then used to i) describe carcass persistence timings for overall and for specific animal groups; ii) assess optimal sampling designs according to research objectives; and iii) model the influence of road, animal and weather factors on carcass persistence probabilities. Most animal carcasses persisted on the road for the first day only, with some groups disappearing at very high rates. The advisable periodicity of road monitoring that minimizes bias in road mortality estimates is daily monitoring for bats (in the morning) and lizards (in the afternoon), daily monitoring for toads, small birds, small mammals, snakes, salamanders, and lagomorphs; 1 day-interval (alternate days) for large birds, birds of prey, hedgehogs, and freshwater turtles; and 2 day-interval for carnivores. Multiple factors influenced the persistence probabilities of vertebrate carcasses on the road. Overall, the persistence was much lower for small animals, on roads with lower traffic volumes, for carcasses located on road lanes, and during humid conditions and high temperatures during the wet season and dry seasons, respectively. CONCLUSION/SIGNIFICANCE: The guidance given here on monitoring frequencies is particularly relevant to provide

  16. How Long Do the Dead Survive on the Road? Carcass Persistence Probability and Implications for Road-Kill Monitoring Surveys

    Science.gov (United States)

    Santos, Sara M.; Carvalho, Filipe; Mira, António

    2011-01-01

    Background Road mortality is probably the best-known and visible impact of roads upon wildlife. Although several factors influence road-kill counts, carcass persistence time is considered the most important determinant underlying underestimates of road mortality. The present study aims to describe and model carcass persistence variability on the road for different taxonomic groups under different environmental conditions throughout the year; and also to assess the effect of sampling frequency on the relative variation in road-kill estimates registered within a survey. Methodology/Principal Findings Daily surveys of road-killed vertebrates were conducted over one year along four road sections with different traffic volumes. Survival analysis was then used to i) describe carcass persistence timings for overall and for specific animal groups; ii) assess optimal sampling designs according to research objectives; and iii) model the influence of road, animal and weather factors on carcass persistence probabilities. Most animal carcasses persisted on the road for the first day only, with some groups disappearing at very high rates. The advisable periodicity of road monitoring that minimizes bias in road mortality estimates is daily monitoring for bats (in the morning) and lizards (in the afternoon), daily monitoring for toads, small birds, small mammals, snakes, salamanders, and lagomorphs; 1 day-interval (alternate days) for large birds, birds of prey, hedgehogs, and freshwater turtles; and 2 day-interval for carnivores. Multiple factors influenced the persistence probabilities of vertebrate carcasses on the road. Overall, the persistence was much lower for small animals, on roads with lower traffic volumes, for carcasses located on road lanes, and during humid conditions and high temperatures during the wet season and dry seasons, respectively. Conclusion/Significance The guidance given here on monitoring frequencies is particularly relevant to provide conservation and
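    The carcass-persistence question in these records is a standard survival-analysis problem. Below is a minimal Kaplan-Meier sketch using the lifelines package on made-up persistence data (days until a carcass disappears, with carcasses still present at the end of monitoring treated as censored); the numbers are invented for illustration.

```python
import numpy as np
from lifelines import KaplanMeierFitter

# hypothetical data: days a carcass stayed on the road; removed=0 marks
# carcasses still present when monitoring ended (right-censored)
days    = np.array([1, 1, 1, 2, 1, 3, 1, 2, 5, 1, 4, 1])
removed = np.array([1, 1, 1, 1, 1, 0, 1, 1, 0, 1, 1, 1])

kmf = KaplanMeierFitter()
kmf.fit(days, event_observed=removed, label="all carcasses")
print(kmf.survival_function_)   # persistence probability by day
print(kmf.predict(1.0))         # chance a carcass persists past day 1
```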

  17. Digital simulation of two-dimensional random fields with arbitrary power spectra and non-Gaussian probability distribution functions

    DEFF Research Database (Denmark)

    Yura, Harold; Hanson, Steen Grüner

    2012-01-01

    Methods for simulation of two-dimensional signals with arbitrary power spectral densities and signal amplitude probability density functions are disclosed. The method relies on initially transforming a white noise sample set of random Gaussian distributed numbers into a corresponding set with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative…
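    A minimal sketch of the two-step recipe the abstract describes (color white Gaussian noise in the Fourier domain, then push the Gaussian marginal through an inverse-CDF transform) follows; the power-law spectrum and the gamma target marginal are arbitrary choices for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 256
white = rng.standard_normal((n, n))

# step 1: color the noise to an (illustrative) isotropic power-law PSD
kx = np.fft.fftfreq(n)[:, None]
ky = np.fft.fftfreq(n)[None, :]
k = np.hypot(kx, ky)
k[0, 0] = k[0, 1]                       # avoid division by zero at DC
amp = k ** -1.5                         # sqrt of the target spectrum
colored = np.fft.ifft2(np.fft.fft2(white) * amp).real
colored = (colored - colored.mean()) / colored.std()

# step 2: memoryless transform to the desired (here gamma) marginal PDF
u = stats.norm.cdf(colored)
field = stats.gamma(a=2.0).ppf(u)
```

    The marginal transform slightly distorts the spectrum, which is presumably why the abstract characterizes the method as an engineering approach rather than an exact one.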

  18. What's Missing in Teaching Probability and Statistics: Building Cognitive Schema for Understanding Random Phenomena

    Science.gov (United States)

    Kuzmak, Sylvia

    2016-01-01

    Teaching probability and statistics is more than teaching the mathematics itself. Historically, the mathematics of probability and statistics was first developed through analyzing games of chance such as the rolling of dice. This article makes the case that the understanding of probability and statistics is dependent upon building a…

  19. Impulsive synchronization of Markovian jumping randomly coupled neural networks with partly unknown transition probabilities via multiple integral approach.

    Science.gov (United States)

    Chandrasekar, A; Rakkiyappan, R; Cao, Jinde

    2015-10-01

    This paper studies the impulsive synchronization of Markovian jumping randomly coupled neural networks with partly unknown transition probabilities via a multiple integral approach. The array of neural networks is coupled in a random fashion governed by a Bernoulli random variable. The aim is to obtain synchronization criteria that are suitable for both exactly known and partly unknown transition probabilities, such that the coupled neural network is synchronized with mixed time-delay. Synchronization under the considered impulsive effects is established even when the transition probabilities are only partly known. In addition, a multiple integral approach is proposed to strengthen the results for Markovian jumping randomly coupled neural networks with partly unknown transition probabilities. By making use of the Kronecker product and some useful integral inequalities, a novel Lyapunov-Krasovskii functional is designed for handling the coupled neural network with mixed delay, and the impulsive synchronization criteria are then expressed as a solvable set of linear matrix inequalities. Finally, numerical examples are presented to illustrate the effectiveness and advantages of the theoretical results. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Internationally comparable diagnosis-specific survival probabilities for calculation of the ICD-10-based Injury Severity Score

    DEFF Research Database (Denmark)

    Gedeborg, R.; Warner, M.; Chen, L. H.

    2014-01-01

    BACKGROUND: The International Statistical Classification of Diseases, 10th Revision (ICD-10)-based Injury Severity Score (ICISS) performs well but requires diagnosis-specific survival probabilities (DSPs), which are empirically derived, for its calculation. The objective was to examine whether DSPs based on data pooled from several countries could increase the accuracy, precision, utility, and international comparability of DSPs and ICISS. METHODS: Australia, Argentina, Austria, Canada, Denmark, New Zealand, and Sweden provided ICD-10-coded injury hospital discharge data, including in-hospital mortality status. Data from the seven countries were pooled using four different methods to create an international collaborative effort ICISS (ICE-ICISS). The ability of the ICISS to predict mortality using the country-specific DSPs and the pooled DSPs was estimated and compared. RESULTS: The pooled DSPs… generated empirically derived DSPs. These pooled DSPs facilitate international comparisons and enable the use of ICISS in all settings where ICD-10 hospital discharge diagnoses are available. The modest reduction in performance of the ICE-ICISS compared with the country-specific scores is unlikely…
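    For concreteness, ICISS is conventionally computed as the product of the DSPs of all of a patient's injury diagnoses. The sketch below assumes a toy DSP table; the codes and probabilities are invented, not taken from the ICE-ICISS data.

```python
import math

# hypothetical DSP table: ICD-10 code -> diagnosis-specific survival prob.
DSP = {"S06.5": 0.85, "S27.0": 0.93, "S32.0": 0.98}

def iciss(diagnoses, dsp=DSP):
    """ICISS = product of the DSPs of all recorded injury diagnoses."""
    return math.prod(dsp[code] for code in diagnoses)

print(iciss(["S06.5", "S27.0"]))  # 0.7905 -> predicted survival probability
```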

  1. OPTIMAL ESTIMATION OF RANDOM PROCESSES ON THE CRITERION OF MAXIMUM A POSTERIORI PROBABILITY

    Directory of Open Access Journals (Sweden)

    A. A. Lobaty

    2016-01-01

    Full Text Available The paper addresses the problem of obtaining equations for the a posteriori probability density of a stochastic Markov process observed through a linear measurement model. Unlike common approaches that take the minimum mean-square estimation error as the optimization criterion, here the criterion is the maximum of the a posteriori probability density of the process being estimated. The a priori probability density of the estimated Gaussian process is assumed to be a differentiable function, which allows it to be expanded in a Taylor series without intermediate transformations involving characteristic functions and harmonic decomposition. For small time intervals, the probability density of the measurement error vector is, by definition, taken to be Gaussian with zero expectation. This makes it possible to obtain a mathematical expression for the residual function, which characterizes the deviation of the actual measurement process from its mathematical model. To determine the optimal a posteriori estimate of the state vector, the estimate is assumed to coincide with its expectation, the maximum of the a posteriori probability density. On the basis of Bayes' formula for the a priori and a posteriori probability densities, this yields the Stratonovich-Kushner equation. Using the Stratonovich-Kushner equation with different forms and values of the drift vector and diffusion matrix of a Markov stochastic process, a variety of filtering, identification, smoothing, and state-forecasting tasks can be solved for both continuous and discrete systems. Discrete and continuous implementations of the developed a posteriori estimation algorithms provide specific, discrete algorithms for on-board computers and mobile robot systems.

  2. On the probability of cost-effectiveness using data from randomized clinical trials

    Directory of Open Access Journals (Sweden)

    Willan Andrew R

    2001-09-01

    Full Text Available Abstract Background Acceptability curves have been proposed for quantifying the probability that a treatment under investigation in a clinical trial is cost-effective. Various definitions and estimation methods have been proposed. Loosely speaking, all the definitions, Bayesian or otherwise, relate to the probability that the treatment under consideration is cost-effective as a function of the value placed on a unit of effectiveness. These definitions are, in fact, expressions of the certainty with which the current evidence would lead us to believe that the treatment under consideration is cost-effective, and are dependent on the amount of evidence (i.e. sample size. Methods An alternative for quantifying the probability that the treatment under consideration is cost-effective, which is independent of sample size, is proposed. Results Non-parametric methods are given for point and interval estimation. In addition, these methods provide a non-parametric estimator and confidence interval for the incremental cost-effectiveness ratio. An example is provided. Conclusions The proposed parameter for quantifying the probability that a new therapy is cost-effective is superior to the acceptability curve because it is not sample size dependent and because it can be interpreted as the proportion of patients who would benefit if given the new therapy. Non-parametric methods are used to estimate the parameter and its variance, providing the appropriate confidence intervals and test of hypothesis.
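    The acceptability-curve idea this abstract starts from can be illustrated with a nonparametric bootstrap: resample patients, compute the incremental net benefit at a given willingness-to-pay, and report the proportion of replicates in which it is positive. The data below are synthetic, and the sketch shows the standard sample-size-dependent curve, not the paper's proposed alternative parameter.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
# synthetic per-patient costs and effects for treatment and control arms
cost_t, eff_t = rng.normal(12_000, 3_000, n), rng.normal(0.70, 0.20, n)
cost_c, eff_c = rng.normal(9_000, 2_500, n), rng.normal(0.60, 0.20, n)

lam = 20_000.0          # value placed on one unit of effectiveness
B = 2_000
inb = np.empty(B)
for b in range(B):
    it, ic = rng.integers(0, n, n), rng.integers(0, n, n)
    d_eff = eff_t[it].mean() - eff_c[ic].mean()
    d_cost = cost_t[it].mean() - cost_c[ic].mean()
    inb[b] = lam * d_eff - d_cost        # incremental net benefit

print("P(cost-effective at lambda):", (inb > 0).mean())
```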

  3. A new formulation of the probability density function in random walk models for atmospheric dispersion

    DEFF Research Database (Denmark)

    Falk, Anne Katrine Vinther; Gryning, Sven-Erik

    1997-01-01

    In this model for atmospheric dispersion particles are simulated by the Langevin Equation, which is a stochastic differential equation. It uses the probability density function (PDF) of the vertical velocity fluctuations as input. The PDF is constructed as an expansion after Hermite polynomials. ...
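    A Hermite-polynomial expansion of a velocity PDF of the kind described can be written in Gram-Charlier form. The sketch below uses probabilists' Hermite polynomials from NumPy with illustrative skewness and excess-kurtosis coefficients; the paper's actual construction is not reproduced.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermeval

def velocity_pdf(w, skew=0.3, exkurt=0.5):
    """Gram-Charlier form: Gaussian times a probabilists'-Hermite
    series; the He_3 and He_4 coefficients carry skewness/kurtosis."""
    phi = np.exp(-0.5 * w ** 2) / np.sqrt(2.0 * np.pi)
    return phi * hermeval(w, [1.0, 0.0, 0.0, skew / 6.0, exkurt / 24.0])

w = np.linspace(-4, 4, 801)
p = velocity_pdf(w)
print((p * (w[1] - w[0])).sum())   # ~1: the expansion stays normalized
```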

  4. PItcHPERFeCT: Primary Intracranial Hemorrhage Probability Estimation using Random Forests on CT

    Directory of Open Access Journals (Sweden)

    John Muschelli

    2017-01-01

    Results: All results presented are for the 102 scans in the validation set. The median DSI for each model was: 0.89 (logistic), 0.885 (LASSO), 0.88 (GAM), and 0.899 (random forest). Using the random forest resulted in a slightly higher median DSI compared with the other models. After Bonferroni correction, the hypothesis of equality of median DSI was rejected only when comparing the random forest DSI to the DSI from the logistic (p < 0.001), LASSO (p < 0.001), or GAM (p < 0.001) models. In practical terms, the difference between the random forest and the logistic regression is quite small. The correlation (95% CI) between the volume from manual segmentation and the predicted volume was 0.93 (0.90, 0.95) for the random forest model. These results indicate that the random forest approach can achieve accurate segmentation of ICH in a population of patients from a variety of imaging centers. We provide an R package (https://github.com/muschellij2/ichseg) and a Shiny R application online (http://johnmuschelli.com/ich_segment_all.html) for implementing and testing the proposed approach.

  5. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think… By doing so, we will obtain a deeper insight into how events involving large values of sums of heavy-tailed random variables are likely to occur.

  6. Random-effects regression analysis of correlated grouped-time survival data.

    Science.gov (United States)

    Hedeker, D; Siddiqui, O; Hu, F B

    2000-04-01

    Random-effects regression modelling is proposed for analysis of correlated grouped-time survival data. Two analysis approaches are considered. The first treats survival time as an ordinal outcome, which is either right-censored or not. The second approach treats survival time as a set of dichotomous indicators of whether the event occurred for time periods up to the period of the event or censor. For either approach both proportional hazards and proportional odds versions of the random-effects model are developed, while partial proportional hazards and odds generalizations are described for the latter approach. For estimation, a full-information maximum marginal likelihood solution is implemented using numerical quadrature to integrate over the distribution of multiple random effects. The quadrature solution allows some flexibility in the choice of distributions for the random effects; both normal and rectangular distributions are considered in this article. An analysis of a dataset where students are clustered within schools is used to illustrate features of random-effects analysis of clustered grouped-time survival data.
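    The second approach the abstract describes, treating grouped survival times as a set of person-period dichotomous indicators, can be sketched as follows; with a complementary log-log link this gives a grouped-time proportional hazards model. The data are toy values, a recent statsmodels with the CLogLog link class is assumed, and the random effects are omitted for brevity.

```python
import pandas as pd
import statsmodels.api as sm

# toy grouped-time data: time = last period observed (1-3),
# event = 1 if the event occurred in that period, 0 if censored there
df = pd.DataFrame({"time": [2, 3, 1, 3, 2],
                   "event": [1, 0, 1, 1, 0],
                   "x": [0.5, -1.2, 0.3, 2.0, -0.7]})

# expand to one record per subject-period: y = 1 only in the event period
pp = pd.DataFrame([{"period": t,
                    "y": int(r.event == 1 and t == r.time),
                    "x": r.x}
                   for r in df.itertuples()
                   for t in range(1, r.time + 1)])

# dummy-coded periods give the baseline hazard; the complementary
# log-log link makes this a grouped-time proportional hazards model
X = sm.add_constant(pd.get_dummies(pp["period"], prefix="t",
                                   drop_first=True).astype(float)
                    .join(pp["x"]))
fit = sm.GLM(pp["y"], X,
             family=sm.families.Binomial(sm.families.links.CLogLog())).fit()
print(fit.params)
```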

  7. Spectral shaping of a randomized PWM DC-DC converter using maximum entropy probability distributions

    CSIR Research Space (South Africa)

    Dove, Albert

    2017-01-01

    Full Text Available: The idea behind spectral shaping is to select a randomization technique with its associated PDF to analytically obtain a specified spectral profile [21]. The benefits of this idea come from being able to achieve some level of controllability over the spectral content…

  8. Tail probabilities and partial moments for quadratic forms in multivariate generalized hyperbolic random vectors

    NARCIS (Netherlands)

    Broda, S.A.

    2013-01-01

    Countless test statistics can be written as quadratic forms in certain random vectors, or ratios thereof. Consequently, their distribution has received considerable attention in the literature. Except for a few special cases, no closed-form expression for the cdf exists, and one resorts to numerical

  9. Is extrapair mating random? On the probability distribution of extrapair young in avian broods

    NARCIS (Netherlands)

    Brommer, Jon E.; Korsten, Peter; Bouwman, Karen A.; Berg, Mathew L.; Komdeur, Jan

    2007-01-01

    A dichotomy in female extrapair copulation (EPC) behavior, with some females seeking EPC and others not, is inferred if the observed distribution of extrapair young (EPY) over broods differs from a random process on the level of individual offspring (binomial, hypergeometrical, or Poisson). A review

  10. Random errors of oceanic monthly rainfall derived from SSM/I using probability distribution functions

    Science.gov (United States)

    Chang, Alfred T. C.; Chiu, Long S.; Wilheit, Thomas T.

    1993-01-01

    Global averages and random errors associated with the monthly oceanic rain rates derived from the Special Sensor Microwave/Imager (SSM/I) data using the technique developed by Wilheit et al. (1991) are computed. Accounting for the beam-filling bias, a global annual average rain rate of 1.26 m is computed. The error estimation scheme is based on the existence of independent (morning and afternoon) estimates of the monthly mean. Calculations show overall random errors of about 50-60 percent for each 5 deg x 5 deg box. The results are insensitive to different sampling strategy (odd and even days of the month). Comparison of the SSM/I estimates with raingage data collected at the Pacific atoll stations showed a low bias of about 8 percent, a correlation of 0.7, and an rms difference of 55 percent.

  11. Formulas for Rational-Valued Separability Probabilities of Random Induced Generalized Two-Qubit States

    Directory of Open Access Journals (Sweden)

    Paul B. Slater

    2015-01-01

    Full Text Available Previously, a formula, incorporating a 5F4 hypergeometric function, for the Hilbert-Schmidt-averaged determinantal moments ⟨|ρ^PT|^n |ρ|^k⟩/⟨|ρ|^k⟩ of 4×4 density matrices (ρ) and their partial transposes (ρ^PT), was applied with k=0 to the generalized two-qubit separability probability question. The formula can, furthermore, be viewed, as we note here, as an averaging over “induced measures in the space of mixed quantum states.” The associated induced-measure separability probabilities (k=1,2,…) are found—via a high-precision density approximation procedure—to assume interesting, relatively simple rational values in the two-re[al]bit (α=1/2), (standard) two-qubit (α=1), and two-quater[nionic]bit (α=2) cases. We deduce rather simple companion (rebit, qubit, quaterbit, …) formulas that successfully reproduce the rational values assumed for general k. These formulas are observed to share certain features, possibly allowing them to be incorporated into a single master formula.

  12. Effect of noise and detector sensitivity on a dynamical process: inverse power law and Mittag-Leffler interevent time survival probabilities.

    Science.gov (United States)

    Pramukkul, Pensri; Svenkeson, Adam; Grigolini, Paolo

    2014-02-01

    We study the combined effects of noise and detector sensitivity on a dynamical process that generates intermittent events mimicking the behavior of complex systems. By varying the sensitivity level of the detector we move between two forms of complexity, from inverse power law to Mittag-Leffler interevent time survival probabilities. Here fluctuations fight against complexity, causing an exponential truncation to the survival probability. We show that fluctuations of relatively weak intensity have a strong effect on the generation of Mittag-Leffler complexity, providing a reason why stretched exponentials are frequently found in nature. Our results afford a more unified picture of complexity resting on the Mittag-Leffler function and encompassing the standard inverse power law definition.

  13. Effect of noise and detector sensitivity on a dynamical process: Inverse power law and Mittag-Leffler interevent time survival probabilities

    Science.gov (United States)

    Pramukkul, Pensri; Svenkeson, Adam; Grigolini, Paolo

    2014-02-01

    We study the combined effects of noise and detector sensitivity on a dynamical process that generates intermittent events mimicking the behavior of complex systems. By varying the sensitivity level of the detector we move between two forms of complexity, from inverse power law to Mittag-Leffler interevent time survival probabilities. Here fluctuations fight against complexity, causing an exponential truncation to the survival probability. We show that fluctuations of relatively weak intensity have a strong effect on the generation of Mittag-Leffler complexity, providing a reason why stretched exponentials are frequently found in nature. Our results afford a more unified picture of complexity resting on the Mittag-Leffler function and encompassing the standard inverse power law definition.

  14. On Generating Optimal Signal Probabilities for Random Tests: A Genetic Approach

    Directory of Open Access Journals (Sweden)

    M. Srinivas

    1996-01-01

    Full Text Available Genetic Algorithms are robust search and optimization techniques. A Genetic Algorithm-based approach for determining the optimal input distributions for generating random test vectors is proposed in the paper. A cost function based on the COP testability measure for determining the efficacy of the input distributions is discussed. A brief overview of Genetic Algorithms (GAs) and the specific details of our implementation are described. Experimental results based on ISCAS-85 benchmark circuits are presented. The performance of our GA-based approach is compared with previous results. While the GA generates more efficient input distributions than the previous methods, which are based on gradient descent search, the overheads of the GA in computing the input distributions are larger.
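    A generic genetic-algorithm loop over input signal probabilities might look like the sketch below. The fitness function is a placeholder standing in for the paper's COP-based cost function (here a toy OR-gate circuit), so only the GA mechanics of selection, crossover, and mutation are meant to carry over.

```python
import numpy as np

rng = np.random.default_rng(7)
N_INPUTS, POP, GENS, P_MUT = 8, 40, 60, 0.05

def fitness(p):
    """Placeholder cost: output probability of a toy OR gate pushed
    toward 0.5 (NOT the paper's COP testability measure)."""
    q = 1.0 - np.prod(1.0 - p)
    return q * (1.0 - q)

pop = rng.uniform(0.05, 0.95, (POP, N_INPUTS))
for _ in range(GENS):
    f = np.array([fitness(ind) for ind in pop])
    i, j = rng.integers(0, POP, (2, POP))           # tournament selection
    parents = np.where((f[i] > f[j])[:, None], pop[i], pop[j])
    cut = rng.integers(1, N_INPUTS, POP)            # single-point crossover
    mask = np.arange(N_INPUTS)[None, :] < cut[:, None]
    children = np.where(mask, parents, np.roll(parents, 1, axis=0))
    mut = rng.uniform(size=children.shape) < P_MUT  # mutation
    children[mut] = rng.uniform(0.05, 0.95, mut.sum())
    pop = children

best = pop[np.argmax([fitness(ind) for ind in pop])]
print(np.round(best, 2))
```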

  15. Unbiased split variable selection for random survival forests using maximally selected rank statistics.

    Science.gov (United States)

    Wright, Marvin N; Dankowski, Theresa; Ziegler, Andreas

    2017-04-15

    The most popular approach for analyzing survival data is the Cox regression model. The Cox model may, however, be misspecified, and its proportionality assumption may not always be fulfilled. An alternative approach for survival prediction is random forests for survival outcomes. The standard split criterion for random survival forests is the log-rank test statistic, which favors splitting variables with many possible split points. Conditional inference forests avoid this split variable selection bias. However, linear rank statistics are utilized by default in conditional inference forests to select the optimal splitting variable, which cannot detect non-linear effects in the independent variables. An alternative is to use maximally selected rank statistics for the split point selection. As in conditional inference forests, splitting variables are compared on the p-value scale. However, instead of the conditional Monte-Carlo approach used in conditional inference forests, p-value approximations are employed. We describe several p-value approximations and the implementation of the proposed random forest approach. A simulation study demonstrates that unbiased split variable selection is possible. However, there is a trade-off between unbiased split variable selection and runtime. In benchmark studies of prediction performance on simulated and real datasets, the new method performs better than random survival forests if informative dichotomous variables are combined with uninformative variables with more categories and better than conditional inference forests if non-linear covariate effects are included. In a runtime comparison, the method proves to be computationally faster than both alternatives, if a simple p-value approximation is used. Copyright © 2017 John Wiley & Sons, Ltd. Copyright © 2017 John Wiley & Sons, Ltd.

  16. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.
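    The central relation, choice probabilities as the gradient of the CPGF, is easy to check numerically for the multinomial logit case, whose CPGF is the log-sum-exp function; a small sketch:

```python
import numpy as np

def cpgf(u):
    """CPGF of the multinomial logit: G(u) = log(sum(exp(u)))."""
    m = u.max()
    return m + np.log(np.exp(u - m).sum())

def choice_probs(u, eps=1e-6):
    """Choice probabilities as the (numerical) gradient of the CPGF."""
    return np.array([(cpgf(u + eps * e) - cpgf(u - eps * e)) / (2 * eps)
                     for e in np.eye(len(u))])

u = np.array([1.0, 0.5, -0.2])
print(choice_probs(u))                  # gradient of G
print(np.exp(u) / np.exp(u).sum())      # closed-form logit probabilities
```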

  17. Application of Survival Analysis to Study Timing and Probability of Outcome Attainment by a Community College Student Cohort

    Science.gov (United States)

    Mourad, Roger; Hong, Ji-Hee

    2008-01-01

    This study applies competing risks survival analysis to describe outcome attainment for an entire cohort of students who first attended a Midwestern community college in the Fall Semester 2001. Outcome attainment included transfer to a four-year institution, degree/ certificate attainment from the community college under study, and transfer to a…

  18. Frequency format diagram and probability chart for breast cancer risk communication: a prospective, randomized trial

    Directory of Open Access Journals (Sweden)

    Wahner-Roedler Dietlind

    2008-10-01

    Full Text Available Abstract Background Breast cancer risk education enables women to make informed decisions regarding their options for screening and risk reduction. We aimed to determine whether patient education regarding breast cancer risk using a bar graph, with or without a frequency format diagram, improved the accuracy of risk perception. Methods We conducted a prospective, randomized trial among women at increased risk for breast cancer. The main outcome measurement was patients' estimation of their breast cancer risk before and after education with a bar graph (BG group) or a bar graph plus a frequency format diagram (BG+FF group), which was assessed by previsit and postvisit questionnaires. Results Of 150 women in the study, 74 were assigned to the BG group and 76 to the BG+FF group. Overall, 72% of women overestimated their risk of breast cancer. The improvement in accuracy of risk perception from the previsit to the postvisit questionnaire (BG group, 19% to 61%; BG+FF group, 13% to 67%) was not significantly different between the 2 groups (P = .10). Among women who inaccurately perceived very high risk (≥ 50% risk), inaccurate risk perception decreased significantly in the BG+FF group (22% to 3%) compared with the BG group (28% to 19%) (P = .004). Conclusion Breast cancer risk communication using a bar graph plus a frequency format diagram can improve the short-term accuracy of risk perception among women perceiving inaccurately high risk.

  19. Convergence estimates in probability and in expectation for discrete least squares with noisy evaluations at random points

    KAUST Repository

    Migliorati, Giovanni

    2015-08-28

    We study the accuracy of the discrete least-squares approximation on a finite dimensional space of a real-valued target function from noisy pointwise evaluations at independent random points distributed according to a given sampling probability measure. The convergence estimates are given in mean-square sense with respect to the sampling measure. The noise may be correlated with the location of the evaluation and may have nonzero mean (offset). We consider both cases of bounded or square-integrable noise / offset. We prove conditions between the number of sampling points and the dimension of the underlying approximation space that ensure a stable and accurate approximation. Particular focus is on deriving estimates in probability within a given confidence level. We analyze how the best approximation error and the noise terms affect the convergence rate and the overall confidence level achieved by the convergence estimate. The proofs of our convergence estimates in probability use arguments from the theory of large deviations to bound the noise term. Finally we address the particular case of multivariate polynomial approximation spaces with any density in the beta family, including uniform and Chebyshev.

  20. Survival of two post systems--five-year results of a randomized clinical trial.

    Science.gov (United States)

    Schmitter, Marc; Hamadi, Khaled; Rammelsberg, Peter

    2011-01-01

    To assess the survival rate of two different post systems after 5 years of service with a prospective randomized controlled trial. One hundred patients in need of a post were studied. Half of the patients received long glass fiber-reinforced posts, while the other half received long metal screw posts. The posts were assigned randomly. After at least 5 years (mean, 61.37 months), follow-ups were established. When a complication occurred prior to this recall, the type and time of the complication was documented. Statistical analysis was performed using the log-rank test and Kaplan-Meier analysis. Additionally, a Cox regression was performed to analyze risk factors. The survival rate of fiber-reinforced posts was 71.8%. In the metal screw post group, the survival rate was significantly lower, 50.0% (log-rank test, P = .026). Metal posts resulted more often in more unfavorable complications (eg, root fractures); consequently, more teeth (n = 17) had to be extracted. The Cox regression identified the following risk factors: position of the tooth (anterior vs posterior teeth), degree of coronal tooth destruction, and the post system (fiber-reinforced post vs metal screw post). Fiber-reinforced restorations loosened in several patients; in some of these cases (n = 6), patients did not notice this, leading to the extraction of teeth. Long metal screw posts should be used with great care in endodontically treated teeth. Besides the selection of the post system, other factors influence the survival of the restoration.

  1. Effect of natural hirudin on random pattern skin flap survival in a porcine model.

    Science.gov (United States)

    Zhao, H; Shi, Q; Sun, Z Y; Yin, G Q; Yang, H L

    2012-01-01

    The effect of local administration of hirudin on random pattern skin flap survival was investigated in a porcine model. Three random pattern skin flaps (4 × 14 cm) were created on each flank of five Chinese minipigs. The experimental group (10 flaps) received 20 antithrombin units of hirudin, injected subdermally into the distal half immediately after surgery and on days 1 and 2; a control group (10 flaps) was injected with saline and a sham group (10 flaps) was not injected. All flaps were followed for 10 days postoperatively. Macroscopically, the congested/necrotic length in the experimental group was significantly decreased compared with the other two groups by day 3. Histopathological evaluation revealed venous congestion and inflammation in the control and sham groups from day 1, but minimal changes in the experimental group. By day 10, the mean ± SD surviving area was significantly greater in the experimental group (67.6 ± 2.1%) than in the control (45.2 ± 1.4%) or sham (48.3 ± 1.1%) groups. Local administration of hirudin can significantly increase the surviving area in overdimensioned random pattern skin flaps, in a porcine model.

  2. Aerobic Exercise Increases Hippocampal Volume in Older Women with Probable Mild Cognitive Impairment: A 6-Month Randomized Controlled Trial

    Science.gov (United States)

    ten Brinke, Lisanne F.; Bolandzadeh, Niousha; Nagamatsu, Lindsay S.; Hsu, Chun Liang; Davis, Jennifer C.; Miran-Khan, Karim; Liu-Ambrose, Teresa

    2015-01-01

    Background Mild cognitive impairment (MCI) is a well-recognized risk factor for dementia and represents a vital opportunity for intervening. Exercise is a promising strategy for combating cognitive decline, by improving both brain structure and function. Specifically, aerobic training (AT) improved spatial memory and hippocampal volume in healthy community-dwelling older adults. In older women with probable MCI, we previously demonstrated that both resistance training (RT) and AT improved memory. In this secondary analysis, we investigated: 1) the effect of both RT and AT on hippocampal volume; and 2) the association between change in hippocampal volume and change in memory. Methods Eighty-six females aged 70 to 80 years with probable MCI were randomly assigned to a six-month, twice-weekly program of: 1) AT, 2) RT, or 3) Balance and Tone Training (BAT; i.e., control). At baseline and trial completion, participants performed a 3T magnetic resonance imaging scan to determine hippocampal volume. Verbal memory and learning was assessed by Rey’s Auditory Verbal Learning Test. Results Compared with the BAT group, AT significantly improved left, right, and total hippocampal volumes (p≤0.03). After accounting for baseline cognitive function and experimental group, increased left hippocampal volume was independently associated with reduced verbal memory and learning performance as indexed by loss after interference (r=0.42, p=0.03). Conclusion Aerobic training significantly increased hippocampal volume in older women with probable MCI. More research is needed to ascertain the relevance of exercise-induced changes in hippocampal volume on memory performance in older adults with MCI. PMID:24711660

  3. Survival

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — These data provide information on the survival of California red-legged frogs in a unique ecosystem to better conserve this threatened species while restoring...

  4. Survival probability of Baltic larval cod in relation to spatial overlap patterns with their prey obtained from drift model studies

    DEFF Research Database (Denmark)

    Hinrichsen, H.H.; Schmidt, J.O.; Petereit, C.

    2005-01-01

    …patterns on the overlap of Baltic cod larvae with their prey. A three-dimensional hydrodynamic model was used to analyse spatio-temporally resolved drift patterns of larval Baltic cod. A coefficient of overlap between modelled larval and idealized prey distributions indicated the probability of predator-prey overlap, dependent on the hatching time of cod larvae. By performing model runs for the years 1979-1998, we investigated the intra- and interannual variability of potential spatial overlap between predator and prey. Assuming uniform prey distributions, we generally found the overlap to have decreased since the mid-1980s, but with the highest variability during the 1990s. Seasonally, predator-prey overlap on the Baltic cod spawning grounds was highest in summer and lowest at the end of the cod spawning season. Horizontally variable prey distributions generally resulted in decreased overlap coefficients…

  5. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  6. Modeling longitudinal data with nonparametric multiplicative random effects jointly with survival data.

    Science.gov (United States)

    Ding, Jimin; Wang, Jane-Ling

    2008-06-01

    In clinical studies, longitudinal biomarkers are often used to monitor disease progression and failure time. Joint modeling of longitudinal and survival data has certain advantages and has emerged as an effective way to mutually enhance information. Typically, a parametric longitudinal model is assumed to facilitate the likelihood approach. However, the choice of a proper parametric model turns out to be more elusive than models for standard longitudinal studies in which no survival endpoint occurs. In this article, we propose a nonparametric multiplicative random effects model for the longitudinal process, which has many applications and leads to a flexible yet parsimonious nonparametric random effects model. A proportional hazards model is then used to link the biomarkers and event time. We use B-splines to represent the nonparametric longitudinal process, and select the number of knots and degrees based on a version of the Akaike information criterion (AIC). Unknown model parameters are estimated through maximizing the observed joint likelihood, which is iteratively maximized by the Monte Carlo Expectation Maximization (MCEM) algorithm. Due to the simplicity of the model structure, the proposed approach has good numerical stability and compares well with the competing parametric longitudinal approaches. The new approach is illustrated with primary biliary cirrhosis (PBC) data, aiming to capture nonlinear patterns of serum bilirubin time courses and their relationship with survival time of PBC patients.

  7. Risk Prediction of One-Year Mortality in Patients with Cardiac Arrhythmias Using Random Survival Forest.

    Science.gov (United States)

    Miao, Fen; Cai, Yun-Peng; Zhang, Yu-Xiao; Li, Ye; Zhang, Yuan-Ting

    2015-01-01

    Existing models for predicting mortality based on the traditional Cox proportional hazards approach (CPH) often have low prediction accuracy. This paper aims to develop a clinical risk model with good accuracy for predicting 1-year mortality in cardiac arrhythmia patients using random survival forest (RSF), a robust approach for survival analysis. 10,488 cardiac arrhythmia patients available in the public MIMIC II clinical database were investigated, with 3,452 deaths occurring within 1-year follow-up. Forty risk factors, including demographics, clinical and laboratory information, and antiarrhythmic agents, were analyzed as potential predictors of all-cause mortality. RSF was adopted to build a comprehensive survival model and a simplified risk model composed of the 14 top risk factors. The comprehensive model achieved a prediction accuracy of 0.81 measured by c-statistic with 10-fold cross validation. The simplified risk model also achieved a good accuracy of 0.799. Both results outperformed traditional CPH (which achieved a c-statistic of 0.733 for the comprehensive model and 0.718 for the simplified model). Moreover, various factors are observed to have nonlinear impact on cardiac arrhythmia prognosis. As a result, the RSF-based model, which takes nonlinearity into account, significantly outperformed the traditional Cox proportional hazards model and has great potential to be a more effective approach for survival analysis.
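    A random survival forest of the kind used here can be sketched with the scikit-survival package on synthetic data; the MIMIC II variables are not reproduced, and the nonlinear hazard below is invented to show why an RSF can beat a linear Cox model.

```python
import numpy as np
from sksurv.ensemble import RandomSurvivalForest
from sksurv.metrics import concordance_index_censored
from sksurv.util import Surv

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 5))
# invented hazard, nonlinear in the first covariate
t = rng.exponential(1.0 / np.exp(0.8 * X[:, 0] ** 2 + 0.5 * X[:, 1]))
c = rng.exponential(2.0, n)               # independent censoring times
time, event = np.minimum(t, c), t <= c
y = Surv.from_arrays(event=event, time=time)

rsf = RandomSurvivalForest(n_estimators=200, min_samples_leaf=15,
                           random_state=0).fit(X, y)
cindex = concordance_index_censored(event, time, rsf.predict(X))[0]
print(f"apparent c-statistic: {cindex:.3f}")   # in-sample, so optimistic
```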

  8. Effects of Rosmarinus officinalis on the survivability of random-patterned skin flaps: an experimental study.

    Science.gov (United States)

    Ince, Bilsev; Yildirim, Alpagan Mustafa; Okur, Mehmet Ihsan; Dadaci, Mehmet; Yoruk, Ebru

    2015-04-01

    Improving survival of skin flaps used in soft-tissue reconstruction is clinically an important goal, and several systemic and local agents have been used for this purpose. However, a substance that prevents flap necrosis has not yet been identified. This study aimed to investigate whether a Rosmarinus officinalis extract could improve skin flap survival. In this study, 21 Wistar albino rats were divided into three groups. Rectangular 8 × 2 cm random-pattern flaps were elevated from the back of the rats. Group I was considered the control group. In Group II, 0.5 cc of Rosmarinus officinalis oil was applied with an ear bud to the flap area 30 minutes before flap elevation. After suturing the flaps to their location, the oil was administered twice a day for a week. In Group III, 0.5 cc of the oil was applied twice a day for a week to the area to be elevated, until surgery. At the end of the week, the flaps were sutured to their location and wiped postoperatively twice a day for a week with the oil. The mean percentage of the surviving flap areas was 29.81%, 58.99%, and 67.68% in Group I, Group II, and Group III, respectively. The mean percentages of the flap survival areas and the vessel diameters were significantly greater in Groups II and III than in the control group (p < 0.05). Rosmarinus officinalis extract can thus increase the survivability of random-pattern skin flaps.

  9. Predicting treatment effect from surrogate endpoints and historical trials: an extrapolation involving probabilities of a binary outcome or survival to a specific time.

    Science.gov (United States)

    Baker, Stuart G; Sargent, Daniel J; Buyse, Marc; Burzykowski, Tomasz

    2012-03-01

    Using multiple historical trials with surrogate and true endpoints, we consider various models to predict the effect of treatment on a true endpoint in a target trial in which only a surrogate endpoint is observed. This predicted result is computed using (1) a prediction model (mixture, linear, or principal stratification) estimated from historical trials and the surrogate endpoint of the target trial and (2) a random extrapolation error estimated from successively leaving out each trial among the historical trials. The method applies to either binary outcomes or survival to a particular time that is computed from censored survival data. We compute a 95% confidence interval for the predicted result and validate its coverage using simulation. To summarize the additional uncertainty from using a predicted instead of true result for the estimated treatment effect, we compute its multiplier of standard error. Software is available for download. © 2011, The International Biometric Society No claim to original US government works.

  10. Fracture strength and probability of survival of narrow and extra-narrow dental implants after fatigue testing: In vitro and in silico analysis.

    Science.gov (United States)

    Bordin, Dimorvan; Bergamo, Edmara T P; Fardin, Vinicius P; Coelho, Paulo G; Bonfante, Estevam A

    2017-07-01

    To assess the probability of survival (reliability) and failure modes of narrow implants with different diameters. For fatigue testing, 42 implants with the same macrogeometry and internal conical connection were divided, according to diameter, as follows: narrow (Ø3.3×10 mm) and extra-narrow (Ø2.9×10 mm) (21 per group). Identical abutments were torqued to the implants, and standardized maxillary incisor crowns were cemented and subjected to step-stress accelerated life testing (SSALT) in water. The use-level probability Weibull curves and the reliability for a mission of 50,000 and 100,000 cycles at 50 N, 100 N, 150 N, and 180 N were calculated. For the finite element analysis (FEA), two virtual models simulating the samples tested in fatigue were constructed. Loads of 50 N and 100 N were applied 30° off-axis at the crown, and the von Mises stress was calculated for implant and abutment. The beta (β) values were 0.67 for narrow and 1.32 for extra-narrow implants, indicating that failure rates did not increase with fatigue in the former, but were more likely associated with damage accumulation and wear-out failures in the latter. Both groups showed high reliability (up to 97.5%) at 50 and 100 N. A decreased reliability was observed for both groups at 150 and 180 N (ranging from 0 to 82.3%), but no significant difference was observed between groups. Failure predominantly involved abutment fracture in both groups. In the FEA at a 50 N load, the Ø3.3 mm implant showed higher von Mises stress for the abutment (7.75%) and the implant (2%) than the Ø2.9 mm implant. There was no significant difference between narrow and extra-narrow implants regarding probability of survival. The failure mode was similar for both groups, restricted to abutment fracture. Copyright © 2017 Elsevier Ltd. All rights reserved.
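    The reliability figures reported here follow from a Weibull life model, R(n) = exp(-(n/η)^β). The sketch below plugs in the reported β values with a purely hypothetical characteristic life η, so the printed numbers are illustrative only.

```python
import numpy as np

def weibull_reliability(n_cycles, beta, eta):
    """Probability of surviving n_cycles under a Weibull life model:
    R(n) = exp(-(n/eta)**beta). beta < 1: early failures dominate;
    beta > 1: damage accumulation / wear-out (as reported for the
    extra-narrow implants). eta is a hypothetical characteristic life."""
    return np.exp(-(n_cycles / eta) ** beta)

for beta, label in [(0.67, "narrow"), (1.32, "extra-narrow")]:
    r50k = weibull_reliability(50_000, beta, eta=2e6)
    r100k = weibull_reliability(100_000, beta, eta=2e6)
    print(f"{label}: R(50k) = {r50k:.3f}, R(100k) = {r100k:.3f}")
```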

  11. Random walks on three-strand braids and on related hyperbolic groups

    CERN Document Server

    Nechaev, S

    2003-01-01

    We investigate the statistical properties of random walks on the simplest nontrivial braid group B₃, and on related hyperbolic groups. We provide a method using Cayley graphs of groups allowing us to compute explicitly the probability distribution of the basic statistical characteristics of random trajectories - the drift and the return probability. The action of the groups under consideration in the hyperbolic plane is investigated, and the distribution of a geometric invariant - the hyperbolic distance - is analysed. It is shown that a random walk on B₃ can be viewed as a 'magnetic random walk' on the group PSL(2, Z).

  12. Random walks on three-strand braids and on related hyperbolic groups

    Energy Technology Data Exchange (ETDEWEB)

    Nechaev, Sergei [Laboratoire de Physique Theorique et Modeles Statistiques, Universite Paris Sud, 91405 Orsay Cedex (France); Voituriez, Raphael [Laboratoire de Physique Theorique et Modeles Statistiques, Universite Paris Sud, 91405 Orsay Cedex (France)

    2003-01-10

    We investigate the statistical properties of random walks on the simplest nontrivial braid group B₃, and on related hyperbolic groups. We provide a method using Cayley graphs of groups allowing us to compute explicitly the probability distribution of the basic statistical characteristics of random trajectories - the drift and the return probability. The action of the groups under consideration in the hyperbolic plane is investigated, and the distribution of a geometric invariant - the hyperbolic distance - is analysed. It is shown that a random walk on B₃ can be viewed as a 'magnetic random walk' on the group PSL(2, Z).

  13. Fatigue Reliability under Random Loads

    DEFF Research Database (Denmark)

    Talreja, R.

    1979-01-01

    We consider the problem of estimating the probability of survival (non-failure) and the probability of safe operation (strength greater than a limiting value) of structures subjected to random loads. These probabilities are formulated in terms of the probability distributions of the loads and the… propagation stage. The consequences of this behaviour on the fatigue reliability are discussed.
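    The survival probability described, strength exceeding a random load, is the classic stress-strength interference integral, which is easy to estimate by Monte Carlo; the distributions and parameter values below are assumptions chosen for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# assumed distributions: lognormal peak load, Weibull strength (MPa)
load = stats.lognorm(s=0.3, scale=100.0)
strength = stats.weibull_min(c=8.0, scale=220.0)

# P(survival) = P(strength > load), estimated by Monte Carlo
n = 1_000_000
p_surv = np.mean(strength.rvs(n, random_state=rng)
                 > load.rvs(n, random_state=rng))
print(f"probability of survival: {p_surv:.5f}")

# probability of safe operation: strength above a limiting value
print(f"P(strength > 180 MPa) = {strength.sf(180.0):.4f}")
```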

  14. Effects of Dopamine Donor Pretreatment on Graft Survival after Kidney Transplantation: A Randomized Trial.

    Science.gov (United States)

    Schnuelle, Peter; Schmitt, Wilhelm H; Weiss, Christel; Habicht, Antje; Renders, Lutz; Zeier, Martin; Drüschler, Felix; Heller, Katharina; Pisarski, Przemyslaw; Banas, Bernhard; Krämer, Bernhard K; Jung, Matthias; Lopau, Kai; Olbricht, Christoph J; Weihprecht, Horst; Schenker, Peter; De Fijter, Johan W; Yard, Benito A; Benck, Urs

    2017-03-07

    Donor dopamine improves initial graft function after kidney transplantation due to antioxidant properties. We investigated if a 4 µg/kg per minute continuous dopamine infusion administered after brain-death confirmation affects long-term graft survival and examined the exposure-response relationship with treatment duration. Five-year follow-up of 487 renal transplant patients from 60 European centers who had participated in the randomized, multicenter trial of dopamine donor pretreatment between 2004 and 2007 (ClinicalTrials.gov identifier: NCT00115115). Follow-up was complete in 99.2%. Graft survival was 72.6% versus 68.7% (P=0.34), and 83.3% versus 80.4% (P=0.42) after death-censoring in treatment and control arms according to trial assignment. Although infusion times varied substantially in the treatment arm (range 0-32.2 hours), duration of the dopamine infusion and all-cause graft failure exhibited an exposure-response relationship (hazard ratio, 0.96; 95% confidence interval [95% CI], 0.92 to 1.00, per hour). Cumulative frequency curves of graft survival and exposure time of the dopamine infusion indicated a maximum response rate at 7.10 hours (95% CI, 6.99 to 7.21), which almost coincided with the optimum infusion time for improvement of early graft function (7.05 hours; 95% CI, 6.92 to 7.18). Taking infusion time of 7.1 hours as threshold in subsequent graft survival analyses indicated a relevant benefit: Overall, 81.5% versus 68.5%; P=0.03; and 90.3% versus 80.2%; P=0.04 after death-censoring. We failed to show a significant graft survival advantage on intention-to-treat. Dopamine infusion time was very short in a considerable number of donors assigned to treatment. Our finding of a significant, nonlinear exposure-response relationship disclosed a threshold value of the dopamine infusion time that may improve long-term kidney graft survival. Copyright © 2017 by the American Society of Nephrology.

  15. Estimation of Survival Probabilities for Use in Cost-effectiveness Analyses: A Comparison of a Multi-state Modeling Survival Analysis Approach with Partitioned Survival and Markov Decision-Analytic Modeling.

    Science.gov (United States)

    Williams, Claire; Lewsey, James D; Mackay, Daniel F; Briggs, Andrew H

    2017-05-01

    Modeling of clinical effectiveness in a cost-effectiveness analysis typically involves some form of partitioned survival or Markov decision-analytic modeling. The health states progression-free, progression, and death, and the transitions between them, are frequently of interest. With partitioned survival, progression is not modeled directly as a state; instead, time in that state is derived from the difference in area between the overall survival and the progression-free survival curves. With Markov decision-analytic modeling, a priori assumptions are often made with regard to the transitions rather than using the individual patient data directly to model them. This article compares a multi-state modeling survival regression approach with these two common methods. As a case study, we use a trial comparing rituximab in combination with fludarabine and cyclophosphamide v. fludarabine and cyclophosphamide alone for the first-line treatment of chronic lymphocytic leukemia. We calculated mean Life Years and QALYs, which involved extrapolation of survival outcomes in the trial. We adapted an existing multi-state modeling approach to incorporate parametric distributions for transition hazards, to allow extrapolation. The comparison showed that, due to the different assumptions used in the different approaches, a discrepancy in results was evident. The partitioned survival and Markov decision-analytic modeling deemed the treatment cost-effective with ICERs of just over £16,000 and £13,000, respectively. However, the results with the multi-state modeling were less conclusive, with an ICER of just over £29,000. This work has illustrated that it is imperative to check whether assumptions are realistic, as different model choices can influence clinical and cost-effectiveness results.
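    The partitioned-survival accounting the abstract contrasts with multi-state modeling can be shown in a few lines: time in the progression state is the area between the overall-survival and progression-free curves. The exponential curves and rates below are illustrative, not the trial's.

```python
import numpy as np

t = np.linspace(0.0, 120.0, 1201)       # months, on a fine grid
dt = t[1] - t[0]
os_curve = np.exp(-0.02 * t)            # overall survival (illustrative)
pfs_curve = np.exp(-0.05 * t)           # progression-free survival

# partitioned-survival accounting: the progression state is the area
# between the OS and PFS curves; life-years come from the OS area
mean_pfs_years = pfs_curve.sum() * dt / 12.0
mean_prog_years = (os_curve - pfs_curve).sum() * dt / 12.0
life_years = os_curve.sum() * dt / 12.0
print(mean_pfs_years, mean_prog_years, life_years)
```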

  16. Survival benefit with capecitabine/docetaxel versus docetaxel alone: analysis of therapy in a randomized phase III trial.

    Science.gov (United States)

    Miles, David; Vukelja, Svetislava; Moiseyenko, Vladimir; Cervantes, Guadalupe; Mauriac, Louis; Van Hazel, Guy; Liu, Wing-Yiu; Ayoub, Jean-Pierre; O'Shaughnessy, Joyce A

    2004-10-01

    In a large phase III trial of 511 patients with anthracycline-pretreated advanced/metastatic breast cancer, capecitabine/docetaxel combination therapy was shown to have significantly superior efficacy compared with single-agent docetaxel, including superior progression-free and overall survival and objective response rate. An updated survival analysis with ≥ 27 months of follow-up shows that patients receiving combination therapy maintained significantly superior survival (hazard ratio [HR], 0.777 [95% CI, 0.645-0.942]; P < 0.01; median survival, 14.5 months vs. 11.5 months) compared with those receiving single-agent docetaxel. Following the failure of docetaxel monotherapy, 35% of patients did not receive additional cytotoxic chemotherapy. Among patients randomized to single-agent docetaxel, only those given poststudy single-agent capecitabine had significantly prolonged survival compared with those given any other poststudy chemotherapy (HR, 0.500; P = 0.0046; median survival, 21.0 months vs. 12.3 months, respectively). By contrast, poststudy vinorelbine-containing chemotherapy did not affect survival following progression on single-agent docetaxel compared with other poststudy chemotherapy regimens (HR, 1.014; P = 0.94; median survival, 13.5 months vs. 12.6 months, respectively). Among patients randomized to combination therapy, discontinuing docetaxel or capecitabine had a similar effect on survival (HR, 0.720; P = 0.20; median survival, 15.8 months vs. 18.3 months, respectively). Median survival was 18.3 months in patients who discontinued docetaxel and continued to receive capecitabine versus 15.8 months in patients who discontinued capecitabine and continued to receive docetaxel, with a trend toward improved survival in patients continuing to receive capecitabine. Although this is a retrospective analysis, these data suggest that sequential administration of docetaxel followed by capecitabine is associated with prolonged survival in patients who are

  17. What Are Probability Surveys?

    Science.gov (United States)

    The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.

  18. The adverse effect of selective cyclooxygenase-2 inhibitor on random skin flap survival in rats.

    Directory of Open Access Journals (Sweden)

    Haiyong Ren

    BACKGROUND: Cyclooxygenase-2 (COX-2) inhibitors provide desired analgesic effects after injury or surgery, but evidence suggests that they also attenuate wound healing. This study investigates the effect of a COX-2 inhibitor on random skin flap survival. METHODS: The McFarlane flap model was established in 40 rats evaluated in two groups; for 7 days, one group received injections of Parecoxib and the other the same volume of saline. The necrotic area of the flap was measured, and flap specimens were stained with haematoxylin-eosin (HE) for histologic analysis. Immunohistochemical staining was performed to analyse the levels of VEGF and COX-2. RESULTS: 7 days after the operation, the flap necrotic area ratio in the study group (66.65 ± 2.81%) was significantly larger than that in the control group (48.81 ± 2.33%) (P < 0.01). Histological analysis demonstrated angiogenesis, with the mean vessel density per mm² being lower in the study group (15.4 ± 4.4) than in the control group (27.2 ± 4.1) (P < 0.05). The expression of COX-2 and VEGF protein in intermediate area II was evaluated in the two groups by immunohistochemistry: COX-2 expression was 1022.45 ± 153.1 in the study group and 2638.05 ± 132.2 in the control group (P < 0.01), and VEGF expression was 2779.45 ± 472.0 and 4938.05 ± 123.6, respectively (P < 0.01). In the COX-2 inhibitor group, the expression of COX-2 and VEGF protein was markedly down-regulated compared with the control group. CONCLUSION: The selective COX-2 inhibitor had an adverse effect on random skin flap survival; suppression of neovascularization induced by the low level of VEGF is the proposed biological mechanism.

  19. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or mathematics. Includes problems with answers and six appendixes. 1965 edition.

  20. Determination of the compound nucleus survival probability Psurv for various "hot" fusion reactions based on the dynamical cluster-decay model

    Science.gov (United States)

    Chopra, Sahila; Kaur, Arshdeep; Gupta, Raj K.

    2015-03-01

    After a recent successful attempt to define and determine the compound nucleus (CN) fusion/formation probability P_CN within the dynamical cluster-decay model (DCM), we introduce and estimate here, for the first time, the survival probability P_surv of the CN against fission, again within the DCM. Calculated as a dynamical fragmentation process, P_surv is defined as the ratio of the evaporation residue (ER) cross section σ_ER to the sum of σ_ER and the fusion-fission (ff) cross section σ_ff, that sum being the CN formation cross section σ_CN; each contributing fragmentation cross section is determined in terms of its formation and barrier penetration probabilities P_0 and P. In the DCM, deformations up to hexadecapole and "compact" orientations for both in-plane (coplanar) and out-of-plane (noncoplanar) configurations are allowed. Some 16 "hot" fusion reactions, forming CN of mass number A_CN ≈ 100 up to superheavy nuclei, are analyzed for various nuclear interaction potentials, and the variation of P_surv with CN excitation energy E*, fissility parameter χ, CN mass A_CN, and Coulomb parameter Z₁Z₂ is investigated. An interesting result is that three groups, namely weakly fissioning, radioactive, and strongly fissioning superheavy nuclei, are identified, with P_surv of roughly 1, 10⁻⁶, and 10⁻¹⁰, respectively. For the weakly fissioning group (100…
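
    In display form, the definition just given reads (notation as in the abstract):

    ```latex
    P_{\mathrm{surv}}
      = \frac{\sigma_{\mathrm{ER}}}{\sigma_{\mathrm{ER}} + \sigma_{\mathrm{ff}}}
      = \frac{\sigma_{\mathrm{ER}}}{\sigma_{\mathrm{CN}}},
    \qquad
    \sigma_{\mathrm{CN}} = \sigma_{\mathrm{ER}} + \sigma_{\mathrm{ff}} .
    ```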

  1. Use of Systemic Rosmarinus Officinalis to Enhance the Survival of Random-Pattern Skin Flaps.

    Science.gov (United States)

    İnce, Bilsev; Bilgen, Fatma; Gündeşlioğlu, Ayşe Özlem; Dadacı, Mehmet; Kozacıoğlu, Sümeyye

    2016-11-01

    Skin flaps are commonly used in soft-tissue reconstruction; however, necrosis can be a frequent complication. Several systemic and local agents have been used in attempts to improve skin flap survival, but none that can prevent flap necrosis have been identified. This study aims to determine whether the use of systemic Rosmarinus officinalis (R. officinalis) extract can prevent flap necrosis and improve skin flap recovery. Animal experimentation. Thirty-five Wistar albino rats were divided into five groups. A rectangular random-pattern flap measuring 8×2 cm was elevated from the back of each rat. Group I was the control group. In Group II, 0.2 mL of R. officinalis oil was given orally 2 h before surgery; R. officinalis oil was then given orally twice a day for a week. In Group III, R. officinalis oil was given orally twice a day for one week before surgery, and at the end of the week, 0.2 mL was given orally 2 h before surgery. In Group IV, 0.2 mL of R. officinalis oil was injected subcutaneously 2 h before surgery, and after the surgery, 0.2 mL was injected subcutaneously twice a day for one week. In Group V, 0.2 mL of R. officinalis oil was injected subcutaneously twice a day for one week prior to surgery; at the end of the week, one last 0.2 mL injection was administered subcutaneously 2 h before surgery, and after the surgery, 0.2 mL was injected subcutaneously twice a day for one week. The mean percentage of viable flap surface area was significantly greater in the R. officinalis groups than in the control group. R. officinalis has vasodilatory effects that contribute to increased skin flap survival.

  2. Modeling Latin-American stock markets volatility: Varying probabilities and mean reversion in a random level shift model

    Directory of Open Access Journals (Sweden)

    Gabriel Rodríguez

    2016-06-01

    Following Xu and Perron (2014), I apply the extended RLS model to the daily stock market returns of Argentina, Brazil, Chile, Mexico and Peru. This model replaces the constant probability of level shifts for the entire sample with varying probabilities that record periods with extremely negative returns. Furthermore, it incorporates a mean reversion mechanism with which the magnitude and the sign of the level shift component vary in accordance with past level shifts that deviate from the long-term mean. Four RLS models are therefore estimated: the basic RLS, the RLS with varying probabilities, the RLS with mean reversion, and a combined RLS model with mean reversion and varying probabilities. The results show that the estimated parameters are highly significant, especially those of the mean reversion model. An analysis of ARFIMA and GARCH models is also performed in the presence of level shifts; it shows that once these shifts are taken into account in the modeling, the long-memory characteristics and GARCH effects disappear. I also find that the predictive performance of the RLS models is superior to that of classic long-memory models such as ARFIMA(p,d,q), GARCH and FIGARCH. The evidence indicates that, with rare exceptions, the RLS models (in all their variants) show the best performance or belong to the best 10% of the Model Confidence Set (MCS); on rare occasions the GARCH and ARFIMA models appear to dominate. When volatility is measured by squared returns, the great exception is Argentina, where the GARCH and FIGARCH models dominate.
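
    To make the level-shift component concrete, here is a hypothetical Python simulation of a basic RLS process with a constant shift probability (the simplest of the four variants above); all parameter values are invented for the sketch, not estimates from the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Basic random level shift (RLS) process: y_t = mu_t + e_t, where the
    # level mu_t jumps at random (Bernoulli) times. Illustrative parameters.
    T, p_shift, shift_sd, noise_sd = 2000, 0.01, 1.0, 0.5
    jumps = rng.random(T) < p_shift              # shift indicator per period
    sizes = rng.normal(0.0, shift_sd, T)         # shift magnitudes
    mu = np.cumsum(jumps * sizes)                # random-level-shift mean
    y = mu + rng.normal(0.0, noise_sd, T)        # observed series

    print(f"{int(jumps.sum())} level shifts over {T} periods")
    ```

    In the varying-probabilities variant, p_shift would itself depend on indicators of extremely negative past returns rather than being constant.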

  3. The Mean Distance to the nth Neighbour in a Uniform Distribution of Random Points: An Application of Probability Theory

    Science.gov (United States)

    Bhattacharyya, Pratip; Chakrabarti, Bikas K.

    2008-01-01

    We study different ways of determining the mean distance r_n between a reference point and its nth neighbour among random points distributed with uniform density in a D-dimensional Euclidean space. First, we present a heuristic method; though this method provides only a crude mathematical result, it shows a simple way of estimating…
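
    Such results are easy to sanity-check numerically. The hypothetical Python sketch below estimates the mean nth-neighbour distance by Monte Carlo, measuring from the centre of a box large enough that edge effects are negligible (the density and trial count are arbitrary choices):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def mean_nth_neighbour_distance(n, D, density=1000.0, trials=2000):
        """Monte Carlo mean distance from a central reference point to its
        nth neighbour in a uniform (Poisson) point pattern in D dimensions."""
        side = (5000.0 / density) ** (1.0 / D)    # box holds ~5000 points
        dists = []
        for _ in range(trials):
            n_pts = rng.poisson(density * side ** D)
            pts = rng.uniform(-side / 2, side / 2, size=(n_pts, D))
            d = np.sort(np.linalg.norm(pts, axis=1))
            dists.append(d[n - 1])                # nth-nearest distance
        return float(np.mean(dists))

    # In 2D the exact mean nearest-neighbour distance is 1/(2*sqrt(density)).
    print(mean_nth_neighbour_distance(n=1, D=2))  # ~0.0158 for density=1000
    ```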

  4. Ten-Year Survival Results of a Randomized Trial of Irradiation of Internal Mammary Nodes After Mastectomy

    Energy Technology Data Exchange (ETDEWEB)

    Hennequin, Christophe, E-mail: christophe.hennequin@sls.aphp.fr [Hôpital Saint-Louis, AP-HP et Université de Paris VII (France); Bossard, Nadine [Hospices Civils de Lyon, Service de Biostatistique, Université Lyon 1, Lyon, and CNRS, UMR5558, Laboratoire de Biométrie et Biologie Evolutive, Equipe Biostatistique-Santé, Villeurbanne (France); Servagi-Vernat, Stéphanie [Centre hospitalier Universitaire de Besançon (France); Maingon, Philippe [Centre François Leclerc, Dijon (France); Dubois, Jean-Bernard [Centre Val d'Aurelle, Montpellier (France); Datchary, Jean [Centre Hospitalier d'Annecy (France); Carrie, Christian [Centre Léon Bérard, Lyon (France); Roullet, Bernard [Centre Hospitalier Universitaire de Limoges (France); Suchaud, Jean-Philippe [Centre Hospitalier de Roanne (France); Teissier, Eric [Centre de Radiothérapie de Mougins (France); Lucardi, Audrey [Hospices Civils de Lyon (France); Gerard, Jean-Pierre [Centre Antoine Lacassagne, Nice (France); Belot, Aurélien [Hospices Civils de Lyon, Service de Biostatistique, Université Lyon 1, Lyon, and CNRS, UMR5558, Laboratoire de Biométrie et Biologie Evolutive, Equipe Biostatistique-Santé, Villeurbanne (France); Institut de Veille Sanitaire, Département des Maladies Chroniques et des Traumatismes, Saint-Maurice (France); and others

    2013-08-01

    Purpose: To evaluate the efficacy of irradiation of internal mammary nodes (IMN) on 10-year overall survival in breast cancer patients after mastectomy. Methods and Patients: This multicenter phase 3 study enrolled patients with positive axillary nodes (pN+) or central/medial tumors with or without pN+. Other inclusion criteria were age <75 and a Karnofsky index ≥70. All patients received postoperative irradiation of the chest wall and supraclavicular nodes and were randomly assigned to receive IMN irradiation or not. Randomization was stratified by tumor location (medial/central or lateral), axillary lymph node status, and adjuvant therapy (chemotherapy vs no chemotherapy). The prescribed dose of irradiation to the target volumes was 50 Gy or equivalent. The first 5 intercostal spaces were included in the IMN target volume, and two-thirds of the dose (31.5 Gy) was given by electrons. The primary outcome was overall survival at 10 years. Disease-free survival and toxicity were secondary outcomes. Results: A total of 1334 patients were analyzed after a median follow-up of 11.3 years among the survivors. No benefit of IMN irradiation on overall survival could be demonstrated: the 10-year overall survival was 59.3% in the IMN-nonirradiated group versus 62.6% in the IMN-irradiated group (P=.8). According to stratification factors, we defined 6 subgroups (medial/central or lateral tumor, pN0 [only for medial/central] or pN+, and chemotherapy or not). In all these subgroups, IMN irradiation did not significantly improve overall survival. Conclusions: In patients treated with 2-dimensional techniques, we failed to demonstrate a survival benefit for IMN irradiation. This study cannot rule out a moderate benefit, especially with more modern, conformal techniques applied to a higher risk population.

  5. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation; Generating Functions; Branching Processes; Random Walk Revisited; Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F…

  6. Probability calculus of fractional order and fractional Taylor's series application to Fokker-Planck equation and information of non-random functions

    Energy Technology Data Exchange (ETDEWEB)

    Jumarie, Guy [Department of Mathematics, University of Quebec at Montreal, P.O. Box 8888, Downtown Station, Montreal, Qc, H3C 3P8 (Canada)], E-mail: jumarie.guy@uqam.ca

    2009-05-15

    A probability distribution of fractional (or fractal) order is defined by the measure μ{dx} = p(x)(dx)^α, 0 < α < 1. Combining this definition with the fractional Taylor series f(x+h) = E_α(D_x^α h^α) f(x) provided by the modified Riemann-Liouville definition, one can expand a probability calculus parallel to the standard one. A Fourier transform of fractional order using the Mittag-Leffler function is introduced, together with its inversion formula, and it provides a suitable generalization of the characteristic function of fractal random variables. It appears that the state moments of fractional order are especially relevant. The main properties of this fractional probability calculus are outlined; it is shown that it provides a sound approach to Fokker-Planck equations that are fractional in both space and time, and it yields new results in the information theory of non-random functions.

  7. On Probability Domains IV

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2017-12-01

    Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one —a quantum phenomenon and, dually, an observable can map a crisp random event to a genuine fuzzy random event —a fuzzy phenomenon. The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.

  9. Probability problems in seismic risk analysis and load combinations for nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    George, L.L.

    1983-01-01

    This workshop describes some probability problems in power plant reliability and maintenance analysis. The problems are seismic risk analysis, loss of load probability, load combinations, and load sharing. The seismic risk problem is to compute power plant reliability given an earthquake and the resulting risk. A component survives if its peak random response to the earthquake does not exceed its strength. Power plant survival is a complicated Boolean function of component failures and survivals. The responses and strengths of components are dependent random processes, and the peak responses are maxima of random processes. The resulting risk is the expected cost of power plant failure.
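
    The survival logic in this abstract lends itself to a small Monte Carlo illustration. In the hypothetical Python sketch below, the lognormal distributions, the shared shaking term, and the series/parallel layout are all invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    n_sim = 100_000
    # Correlated lognormal peak responses for three components, driven by a
    # common earthquake term plus component-specific variation.
    common = rng.normal(size=n_sim)
    response = np.exp(0.5 * common[:, None] + 0.3 * rng.normal(size=(n_sim, 3)))
    strength = np.exp(rng.normal(loc=1.0, scale=0.2, size=(n_sim, 3)))
    survives = response < strength                # per-component survival

    # Example plant logic: components 0 and 1 are redundant (parallel),
    # and that pair is in series with component 2.
    plant_ok = (survives[:, 0] | survives[:, 1]) & survives[:, 2]
    print(f"Estimated plant survival probability: {plant_ok.mean():.3f}")
    ```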

  10. Multi-channel Dual Clocks three-dimensional probability Random Multiple Access protocol for Wireless Public Bus Networks based on RTS/CTS mechanism

    Directory of Open Access Journals (Sweden)

    Zhou Sheng Jie

    2016-01-01

    A MAC protocol for public bus networks, called the Bus MAC protocol, is designed to provide high-quality Internet service for bus passengers. The paper proposes a multi-channel, dual-clocks, three-dimensional-probability random multiple access protocol based on the RTS/CTS mechanism, decreasing the collisions caused by multiple access from multiple passengers. The RTS/CTS mechanism increases the reliability and stability of the system, reduces the collision probability of information packets to a certain extent, and improves channel utilization. The multi-channel mechanism not only enables channel load balancing but also solves the hidden-terminal and exposed-terminal problems. The dual-clocks mechanism reduces the system idle time. Finally, the three access probabilities can be selected so that the system throughput adapts to the network load, realizing the maximum system throughput.

  11. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables. The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vectors…

  12. Random sampler M-estimator algorithm with sequential probability ratio test for robust function approximation via feed-forward neural networks.

    Science.gov (United States)

    El-Melegy, Moumen T

    2013-07-01

    This paper addresses the problem of fitting a functional model to data corrupted with outliers using a multilayered feed-forward neural network. Although it is of high importance in practical applications, this problem has not received careful attention from the neural network research community. One recent approach to solving this problem is to use a neural network training algorithm based on the random sample consensus (RANSAC) framework. This paper proposes a new algorithm that offers two enhancements over the original RANSAC algorithm. The first one improves the algorithm accuracy and robustness by employing an M-estimator cost function to decide on the best estimated model from the randomly selected samples. The other one improves the time performance of the algorithm by utilizing a statistical pretest based on Wald's sequential probability ratio test. The proposed algorithm is successfully evaluated on synthetic and real data, contaminated with varying degrees of outliers, and compared with existing neural network training algorithms.
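
    To make the first enhancement concrete, here is a simplified, hypothetical Python sketch of a RANSAC loop that scores candidate models by total Huber (M-estimator) cost instead of an inlier count; line fitting stands in for network training, and the SPRT pretest is omitted for brevity:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def huber(r, delta=1.0):
        """Huber M-estimator cost: quadratic near zero, linear in the tails."""
        a = np.abs(r)
        return np.where(a <= delta, 0.5 * r ** 2, delta * (a - 0.5 * delta))

    def ransac_line(x, y, n_iter=200, delta=1.0):
        """Fit y = a*x + b from random 2-point samples, keeping the
        candidate with the lowest total Huber cost over all data."""
        best, best_cost = None, np.inf
        for _ in range(n_iter):
            i, j = rng.choice(len(x), size=2, replace=False)
            if x[i] == x[j]:
                continue
            a = (y[j] - y[i]) / (x[j] - x[i])
            b = y[i] - a * x[i]
            cost = float(huber(y - (a * x + b), delta).sum())
            if cost < best_cost:
                best, best_cost = (a, b), cost
        return best

    x = rng.uniform(0, 10, 100)
    y = 2.0 * x + 1.0 + rng.normal(0, 0.3, 100)
    y[:20] += rng.uniform(-20, 20, 20)            # contaminate 20% with outliers
    print(ransac_line(x, y))                      # close to (2.0, 1.0)
    ```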

  13. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Master's students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit theorem.

  14. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.

  15. Sequential FOLFIRI.3 + Gemcitabine Improves Health-Related Quality of Life Deterioration-Free Survival of Patients with Metastatic Pancreatic Adenocarcinoma: A Randomized Phase II Trial.

    Directory of Open Access Journals (Sweden)

    Amélie Anota

    A randomized multicenter phase II trial was conducted to assess a sequential treatment strategy using FOLFIRI.3 and gemcitabine alternately (Arm 2) compared to gemcitabine alone (Arm 1) in patients with previously untreated metastatic pancreatic adenocarcinoma. The primary endpoint was the progression-free survival (PFS) rate at 6 months. It concluded that the sequential treatment strategy appears to be feasible and effective, with a PFS rate of 43.5% in Arm 2 at 6 months (26.1% in Arm 1). This paper reports the results of the longitudinal analysis of health-related quality of life (HRQoL) as a secondary endpoint of this study. HRQoL was evaluated using the EORTC QLQ-C30 at baseline and every two months until the end of the study or death. HRQoL deterioration-free survival (QFS) was defined as the time from randomization to a first significant deterioration compared to the baseline score with no further significant improvement, or death. A propensity score was estimated comparing characteristics of partial and complete responders, and analyses were repeated with the inverse probability weighting method using the propensity score. Multivariate Cox regression analyses were performed to identify independent factors influencing QFS. 98 patients were included between 2007 and 2011. Adjusting for the propensity score, patients in Arm 2 presented a longer QFS for global health status (hazard ratio: 0.52 [0.31-0.85]), emotional functioning (0.35 [0.21-0.59]) and pain (0.50 [0.31-0.81]) than those in Arm 1. Patients in Arm 2 presented better HRQoL with a longer QFS than those in Arm 1; moreover, the propensity score method makes it possible to take into account missing data that depend on patients' characteristics. Eudract N° 2006-005703-34 (name of the trial: FIRGEM).

  16. Using Logistic Regression and Random Forests multivariate statistical methods for landslide spatial probability assessment in North-Est Sicily, Italy

    Science.gov (United States)

    Trigila, Alessandro; Iadanza, Carla; Esposito, Carlo; Scarascia-Mugnozza, Gabriele

    2015-04-01

    The first phase of the work identified the spatial relationships between landslide locations and the 13 related factors using the Frequency Ratio bivariate statistical method. The analysis was then carried out using a multivariate statistical approach, with the Logistic Regression technique and the Random Forests technique, which gave the best results in terms of AUC. The models were built and evaluated with different sample sizes, also taking into account the temporal variation of input variables such as areas burned by wildfire. The most significant outcomes of this work are the relevant influence of the sample size on the model results and the strong importance of some environmental factors (e.g., land use and wildfires) for the identification of the depletion zones of extremely rapid shallow landslides.

  17. Randomized controlled clinical trial of the 24-month survival of composite resin restorations after one-step incomplete and complete excavation on primary teeth.

    Science.gov (United States)

    Franzon, R; Opdam, N J; Guimarães, L F; Demarco, F F; Casagrande, L; Haas, A N; Araujo, F B

    2015-10-01

    This randomized clinical trial aimed to compare the 24-month survival of composite restorations in primary molars after partial caries removal (PCR) and total caries removal (TCR). Forty-eight children aged 3-8 years with at least one molar with a deep carious lesion were included (PCR: n=66; TCR: n=54 teeth). For PCR, excavation was stopped when dentine with a leathery consistency was reached; in the TCR group, total absence of carious tissue was confirmed using a blunt-tipped probe. Pulpotomy was performed in cases of pulp exposure. Success was assessed by modified USPHS criteria, with Alpha and Bravo scores recorded as success. Pulp exposure occurred in 1 and 15 of the teeth treated with PCR and TCR, respectively (p<0.01). The restoration survival rate after 24 months was 66% (PCR) and 86% (TCR) (p=0.03). When teeth that received pulpotomy were analyzed separately, the survival rate was 92% (p=0.09). PCR performed in occlusoproximal restorations demonstrated the lowest success rate (p=0.002). PCR increased the probability of restorative failure 2.90-fold compared with TCR (p=0.03), after adjusting for cavity type. When pulp exposure and restoration failure were considered together as the outcome, there was no significant difference between the two groups (p=0.10), with success rates of 64% (PCR) and 61% (TCR). Collectively, PCR prevented pulp exposure and, consequently, more invasive treatments in deciduous teeth; on the other hand, PCR yielded lower longevity of composite restorations compared with TCR, suggesting that PCR restorations need to be followed over time, especially when multi-surface restorations are involved. Composite restorations over remaining carious tissue require monitoring over time, especially those performed on more than one surface. Even if the restorations present shortcomings over time, the majority of them can be repaired, allowing more conservative approaches for teeth with deep caries lesions. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Statistical modelling of survival data with random effects h-likelihood approach

    CERN Document Server

    Ha, Il Do; Lee, Youngjo

    2017-01-01

    This book provides a groundbreaking introduction to the likelihood inference for correlated survival data via the hierarchical (or h-) likelihood in order to obtain the (marginal) likelihood and to address the computational difficulties in inferences and extensions. The approach presented in the book overcomes shortcomings in the traditional likelihood-based methods for clustered survival data such as intractable integration. The text includes technical materials such as derivations and proofs in each chapter, as well as recently developed software programs in R (“frailtyHL”), while the real-world data examples together with an R package, “frailtyHL” in CRAN, provide readers with useful hands-on tools. Reviewing new developments since the introduction of the h-likelihood to survival analysis (methods for interval estimation of the individual frailty and for variable selection of the fixed effects in the general class of frailty models) and guiding future directions, the book is of interest to research...

  19. Survival probabilities of Pugh-child-PBC classified patients in the Euricterus primary biliary cirrhosis population, based on the Mayo clinic prognostic model

    NARCIS (Netherlands)

    Reisman, Y; vanDam, GM; Gips, CH; Lavelle, SM; CuervasMons, [No Value; deDombal, FT; Gauthier, A; MalchowMoller, A; Molino, G; Theodossi, A; Tsiftsis, DD; Dawids, S; Larsson, L

    1997-01-01

    Background/Aims: Estimation of prognosis becomes increasingly important in primary biliary cirrhosis (PBC) with advancing disease and also with regard to patient management. The ubiquitously used Pugh score for severity of disease is simple, while the Mayo model has been validated for survival…

  20. Random Linear Network Coding is Key to Data Survival in Highly Dynamic Distributed Storage

    DEFF Research Database (Denmark)

    Sipos, Marton A.; Fitzek, Frank; Roetter, Daniel Enrique Lucani

    2015-01-01

    …and Reed-Solomon mechanisms. Our results use traces from a BitTorrent client for Android devices to show that RLNC outperforms the next best scheme (fully centralized Reed-Solomon) not only by having a much lower probability of data loss, but also by reducing storage requirements by up to 50% and reconstruction…

  1. Personalized Risk Prediction in Clinical Oncology Research: Applications and Practical Issues Using Survival Trees and Random Forests.

    Science.gov (United States)

    Hu, Chen; Steingrimsson, Jon Arni

    2017-10-19

    A crucial component of making individualized treatment decisions is to accurately predict each patient's disease risk. In clinical oncology, disease risks are often measured through time-to-event data, such as overall survival and progression/recurrence-free survival, and are often subject to censoring. Risk prediction models based on recursive partitioning methods are becoming increasingly popular largely due to their ability to handle nonlinear relationships, higher-order interactions, and/or high-dimensional covariates. The most popular recursive partitioning methods are versions of the Classification and Regression Tree (CART) algorithm, which builds a simple interpretable tree structured model. With the aim of increasing prediction accuracy, the random forest algorithm averages multiple CART trees, creating a flexible risk prediction model. Risk prediction models used in clinical oncology commonly use both traditional demographic and tumor pathological factors as well as high-dimensional genetic markers and treatment parameters from multimodality treatments. In this article, we describe the most commonly used extensions of the CART and random forest algorithms to right-censored outcomes. We focus on how they differ from the methods for noncensored outcomes, and how the different splitting rules and methods for cost-complexity pruning impact these algorithms. We demonstrate these algorithms by analyzing a randomized Phase III clinical trial of breast cancer. We also conduct Monte Carlo simulations to compare the prediction accuracy of survival forests with more commonly used regression models under various scenarios. These simulation studies aim to evaluate how sensitive the prediction accuracy is to the underlying model specifications, the choice of tuning parameters, and the degrees of missing covariates.
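
    For readers who want to experiment, the following is a minimal sketch using the scikit-survival package (an assumption: any survival-forest implementation would do; the data are synthetic, not the trial data analyzed in the article):

    ```python
    import numpy as np
    from sksurv.ensemble import RandomSurvivalForest
    from sksurv.util import Surv

    rng = np.random.default_rng(0)

    # Synthetic right-censored outcomes with a covariate-driven event time.
    n = 300
    X = rng.normal(size=(n, 5))
    t_event = rng.exponential(scale=np.exp(X[:, 0]))
    t_cens = rng.exponential(scale=2.0, size=n)
    y = Surv.from_arrays(event=t_event <= t_cens,     # True = event observed
                         time=np.minimum(t_event, t_cens))

    rsf = RandomSurvivalForest(n_estimators=200, min_samples_leaf=10,
                               random_state=0).fit(X, y)

    # Predicted survival probabilities for the first subject, evaluated on
    # the grid of unique event times seen during training.
    surv = rsf.predict_survival_function(X[:1], return_array=True)[0]
    print(np.round(surv[:5], 3))
    ```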

  2. Effect of everolimus on survival in advanced hepatocellular carcinoma after failure of sorafenib: the EVOLVE-1 randomized clinical trial.

    Science.gov (United States)

    Zhu, Andrew X; Kudo, Masatoshi; Assenat, Eric; Cattan, Stéphane; Kang, Yoon-Koo; Lim, Ho Yeong; Poon, Ronnie T P; Blanc, Jean-Frederic; Vogel, Arndt; Chen, Chao-Long; Dorval, Etienne; Peck-Radosavljevic, Markus; Santoro, Armando; Daniele, Bruno; Furuse, Junji; Jappe, Annette; Perraud, Kevin; Anak, Oezlem; Sellami, Dalila B; Chen, Li-Tzong

    2014-07-02

    Aside from the multikinase inhibitor sorafenib, there are no effective systemic therapies for the treatment of advanced hepatocellular carcinoma. To assess the efficacy of everolimus in patients with advanced hepatocellular carcinoma for whom sorafenib treatment failed. EVOLVE-1 was a randomized, double-blind, phase 3 study conducted among 546 adults with Barcelona Clinic Liver Cancer stage B or C hepatocellular carcinoma and Child-Pugh A liver function whose disease progressed during or after sorafenib or who were intolerant of sorafenib. Patients were enrolled from 17 countries between May 2010 and March 2012. Randomization was stratified by region (Asia vs rest of world) and macrovascular invasion (present vs absent). Everolimus, 7.5 mg/d, or matching placebo, both given in combination with best supportive care and continued until disease progression or intolerable toxicity. Per the 2:1 randomization scheme, 362 patients were randomized to the everolimus group and 184 patients to the placebo group. The primary end point was overall survival. Secondary end points included time to progression and the disease control rate (the percentage of patients with a best overall response of complete or partial response or stable disease). No significant difference in overall survival was seen between treatment groups, with 303 deaths (83.7%) in the everolimus group and 151 deaths (82.1%) in the placebo group (hazard ratio [HR], 1.05; 95% CI, 0.86-1.27; P = .68; median overall survival, 7.6 months with everolimus, 7.3 months with placebo). Median time to progression with everolimus and placebo was 3.0 months and 2.6 months, respectively (HR, 0.93; 95% CI, 0.75-1.15), and disease control rate was 56.1% and 45.1%, respectively (P = .01). The most common grade 3/4 adverse events for everolimus vs placebo were anemia (7.8% vs 3.3%, respectively), asthenia (7.8% vs 5.5%, respectively), and decreased appetite (6.1% vs 0.5%, respectively). No patients experienced hepatitis C…

  3. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended…
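
    As a concrete special case (the standard multinomial logit, a known member of this family rather than anything specific to the paper), the log-sum-exp function acts as a CPGF whose gradient is the familiar softmax; the sketch below verifies this numerically:

    ```python
    import numpy as np

    def cpgf_logsumexp(u):
        """Multinomial-logit CPGF: G(u) = log(sum_i exp(u_i))."""
        m = u.max()
        return m + np.log(np.exp(u - m).sum())

    def choice_probabilities(u, eps=1e-6):
        """Numerical gradient of the CPGF = choice probabilities."""
        g = np.empty_like(u)
        for i in range(len(u)):
            d = np.zeros_like(u)
            d[i] = eps
            g[i] = (cpgf_logsumexp(u + d) - cpgf_logsumexp(u - d)) / (2 * eps)
        return g

    u = np.array([1.0, 0.5, -0.2])                # systematic utilities
    print(choice_probabilities(u))                # gradient of the CPGF...
    print(np.exp(u) / np.exp(u).sum())            # ...matches softmax(u)
    ```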

  4. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability; The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence; Discrete Distributions; Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation; Continuous Probability; From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation; Continuous Distributions; The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables; Asymptotic Theory; Strong and Weak Laws of Large Numbers; Central Limit Theorem; Stochastic Processes and Applications; Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics; Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers…

  5. Physical Activity Improves Verbal and Spatial Memory in Older Adults with Probable Mild Cognitive Impairment: A 6-Month Randomized Controlled Trial

    Directory of Open Access Journals (Sweden)

    Lindsay S. Nagamatsu

    2013-01-01

    We report secondary findings from a randomized controlled trial on the effects of exercise on memory in older adults with probable MCI. We randomized 86 women aged 70-80 years with subjective memory complaints into one of three groups: resistance training, aerobic training, or balance and tone (control). All participants exercised twice per week for six months. We measured verbal memory and learning using the Rey Auditory Verbal Learning Test (RAVLT) and spatial memory using a computerized test, before and after trial completion. We found that the aerobic training group remembered significantly more items in the loss-after-interference condition of the RAVLT compared with the control group after six months of training. In addition, both experimental groups showed improved spatial memory performance in the most difficult condition, where they were required to memorize the spatial locations of three items, compared with the control group. Lastly, we found a significant correlation between spatial memory performance and overall physical capacity after the intervention in the aerobic training group. Taken together, our results provide support for the prevailing notion that exercise can positively impact cognitive functioning and may represent an effective strategy to improve memory in those who have begun to experience cognitive decline.

  6. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber-Shiu functions and dependence.

  8. Survival probability of larval sprat in response to decadal changes in diel vertical migration behavior and prey abundance in the Baltic Sea

    DEFF Research Database (Denmark)

    Hinrichsen, Hans-Harald; Peck, Myron A.; Schmidt, Jörn

    2010-01-01

    We employed a coupled three-dimensional biophysical model to explore long-term inter- and intra-annual variability in the survival of sprat larvae in the Bornholm Basin, a major sprat spawning area in the Baltic Sea. Model scenarios incorporated observed decadal changes in larval diel vertical migration behavior and prey abundance in the 1990s compared to the 1980s. After changing their foraging strategy by shifting from a mid-depth, low-prey environment to near-surface waters, first-feeding larvae experienced much higher prey encounter rates and almost optimal feeding conditions, and had a much higher growth potential. Consequently…

  9. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  10. The Norwegian dietary guidelines and colorectal cancer survival (CRC-NORDIET) study: a food-based multicentre randomized controlled trial.

    Science.gov (United States)

    Henriksen, Hege Berg; Ræder, Hanna; Bøhn, Siv Kjølsrud; Paur, Ingvild; Kværner, Ane Sørlie; Billington, Siv Åshild; Eriksen, Morten Tandberg; Wiedsvang, Gro; Erlund, Iris; Færden, Arne; Veierød, Marit Bragelien; Zucknick, Manuela; Smeland, Sigbjørn; Blomhoff, Rune

    2017-01-30

    Colorectal cancer survivors are not only at risk for recurrent disease but also at increased risk of comorbidities such as other cancers, cardiovascular disease, diabetes, hypertension and functional decline. In this trial, we aim to investigate whether a diet in accordance with the Norwegian food-based dietary guidelines, with a focus on dampening inflammation and oxidative stress, will improve long-term disease outcomes and survival in colorectal cancer patients. This paper presents the study protocol of the Norwegian Dietary Guidelines and Colorectal Cancer Survival study. Men and women aged 50-80 years diagnosed with primary invasive colorectal cancer (Stage I-III) are invited to this randomized controlled, parallel two-arm trial 2-9 months after curative surgery. The intervention group (n = 250) receives an intensive dietary intervention lasting for 12 months and a subsequent maintenance intervention for 14 years. The control group (n = 250) receives no dietary intervention other than standard clinical care. Both groups are offered equal general advice on physical activity. Patients are followed up at 6 months and 1, 3, 5, 7, 10 and 15 years after baseline. The study center is located at the Department of Nutrition, University of Oslo, and patients are recruited from two hospitals within the South-Eastern Norway Regional Health Authority. Primary outcomes are disease-free survival and overall survival. Secondary outcomes are time to recurrence, cardiovascular disease-free survival, compliance with the dietary recommendations and the effects of the intervention on new comorbidities, intermediate biomarkers, nutrition status, physical activity, physical function and quality of life. The current study is designed to gain a better understanding of the role of a healthy diet aimed at dampening inflammation and oxidative stress on long-term disease outcomes and survival in colorectal cancer patients. Since previous research on the role of diet for…

  11. Effect of Azithromycin on Airflow Decline-Free Survival After Allogeneic Hematopoietic Stem Cell Transplant: The ALLOZITHRO Randomized Clinical Trial.

    Science.gov (United States)

    Bergeron, Anne; Chevret, Sylvie; Granata, Angela; Chevallier, Patrice; Vincent, Laure; Huynh, Anne; Tabrizi, Reza; Labussiere-Wallet, Hélène; Bernard, Marc; Chantepie, Sylvain; Bay, Jacques-Olivier; Thiebaut-Bertrand, Anne; Thepot, Sylvain; Contentin, Nathalie; Fornecker, Luc-Matthieu; Maillard, Natacha; Risso, Karine; Berceanu, Ana; Blaise, Didier; Peffault de La Tour, Regis; Chien, Jason W; Coiteux, Valérie; Socié, Gérard

    2017-08-08

    Bronchiolitis obliterans syndrome has been associated with increased morbidity and mortality after allogeneic hematopoietic stem cell transplant (HSCT). Previous studies have suggested that azithromycin may reduce the incidence of post-lung transplant bronchiolitis obliterans syndrome. To evaluate whether the early administration of azithromycin can improve airflow decline-free survival after allogeneic HSCT. The ALLOZITHRO parallel-group trial was conducted in 19 French academic transplant centers and involved participants who were at least 16 years old, had undergone allogeneic HSCT for a hematological malignancy, and had available pretransplant pulmonary function test results. Enrollment was from February 2014 to August 2015, with follow-up through April 26, 2017. Patients were randomly assigned to receive, 3 times a week, either 250 mg of azithromycin (n = 243) or placebo (n = 237) for 2 years, starting at the time of the conditioning regimen. The primary efficacy end point was airflow decline-free survival at 2 years after randomization. Main secondary end points were overall survival and bronchiolitis obliterans syndrome at 2 years. Thirteen months after enrollment, the independent data and safety monitoring board detected an unanticipated imbalance across blinded groups in the number of hematological relapses, and the treatment was stopped December 26, 2016. Among 480 randomized participants, 465 (97%) were included in the modified intention-to-treat analysis (mean age, 52 [SD, 14] years; 75 women [35%]). At the time of data cutoff, 104 patients (22%; 54 azithromycin vs 50 placebo) had experienced an airflow decline; 138 patients (30%) died (78 azithromycin vs 60 placebo). Two-year airflow decline-free survival was 32.8% (95% CI, 25.9%-41.7%) with azithromycin and 41.3% (95% CI, 34.1%-50.1%) with placebo (unadjusted hazard ratio [HR], 1.3; 95% CI, 1.02-1.70; P = .03). Of the 22 patients (5%) who experienced bronchiolitis obliterans syndrome, 15 (6%) were in…

  12. Diversity and survival of artificial lifeforms under sedimentation and random motion.

    Science.gov (United States)

    Glade, Nicolas; Bastien, Olivier; Ballet, Pascal

    2017-12-01

    Cellular automata are often used to explore the numerous possible scenarios of what could have occurred at the origins of life and before, during the prebiotic ages, when very simple molecules started to assemble and organise into larger catalytic or informative structures, or to simulate ecosystems. Artificial self-maintained spatial structures emerge in cellular automata and are often used to represent molecules or living organisms. They generally converge towards homogeneous stationary soups of still-life creatures. It is hard for an observer to believe these are similar to living systems, in particular because nothing is moving anymore within such simulated environments after a few computation steps, because they present isotropic spatial organisation, because the diversity of self-maintained morphologies is poor, and because when stationary states are reached the creatures are immortal. Natural living systems, on the contrary, are composed of a high diversity of creatures in interaction, having limited lifetimes, and generally present a certain anisotropy of their spatial organisation, in particular frontiers and interfaces. In the present work, we propose that the presence of directional weak fields such as gravity may counter-balance the excess of mixing and disorder caused by Brownian motion and favour the appearance of specific regions, i.e. different strata or environmental layers, in which physical-chemical conditions favour the emergence and the survival of self-maintained spatial structures including living systems. We test this hypothesis by way of numerical simulations of a very simplified ecosystem model. We use the well-known Game of Life, to which we add rules simulating both sedimentation forces and thermal agitation. We show that this leads to more active (vitality and biodiversity) and robust (survival) dynamics. This effectively suggests that coupling such physical processes to reactive systems allows the separation of environments into different…
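
    A toy version of such an experiment is easy to prototype. The hypothetical Python sketch below (not the authors' model) couples a standard Game of Life update with crude sedimentation and thermal-agitation moves; all rates are arbitrary:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def life_step(grid):
        """One synchronous Game of Life update with toroidal edges."""
        nb = sum(np.roll(np.roll(grid, i, 0), j, 1)
                 for i in (-1, 0, 1) for j in (-1, 0, 1) if (i, j) != (0, 0))
        alive = grid.astype(bool)
        return ((nb == 3) | (alive & (nb == 2))).astype(np.uint8)

    def perturb(grid, p_down=0.05, n_swaps=80):
        """Crude stand-ins for sedimentation (live cells settle downward)
        and thermal agitation (random vertical swaps)."""
        g = grid.copy()
        rows, cols = g.shape
        for _ in range(n_swaps):                          # agitation
            r, c = rng.integers(rows - 1), rng.integers(cols)
            g[r, c], g[r + 1, c] = g[r + 1, c], g[r, c]
        settle = (rng.random(g.shape) < p_down) & (g == 1)
        for r, c in zip(*np.nonzero(settle)):             # sedimentation
            if r + 1 < rows and g[r + 1, c] == 0:
                g[r, c], g[r + 1, c] = 0, 1
        return g

    grid = (rng.random((64, 64)) < 0.3).astype(np.uint8)
    for _ in range(100):
        grid = perturb(life_step(grid))
    print(f"live cells after 100 steps: {int(grid.sum())}")
    ```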

  13. Survival Outcomes With Short-Course Radiation Therapy in Elderly Patients With Glioblastoma: Data From a Randomized Phase 3 Trial.

    Science.gov (United States)

    Guedes de Castro, Douglas; Matiello, Juliana; Roa, Wilson; Ghosh, Sunita; Kepka, Lucyna; Kumar, Narendra; Sinaika, Valery; Lomidze, Darejan; Hentati, Dalenda; Rosenblatt, Eduardo; Fidarova, Elena

    2017-07-15

    To perform a subset analysis of survival outcomes in elderly patients with glioblastoma from a randomized phase 3 trial comparing 2 short-course radiation therapy (RT) regimens in elderly and/or frail patients. The original trial population included elderly and/or frail patients with a diagnosis of glioblastoma. Patients joined the phase 3, randomized, multicenter, prospective, noninferiority trial; were assigned to 1 of 2 groups in a 1:1 ratio, either short-course RT (25 Gy in 5 fractions, arm 1) or commonly used RT (40 Gy in 15 fractions, arm 2); and were stratified by age (elderly and frail patients were defined as patients aged ≥65 years with KPS of 50%-70%; elderly and non-frail patients were defined as patients aged ≥65 years with KPS of 80%-100%); 61 of the 98 initial patients comprised the patient population, with 26 patients randomized to arm 1 and 35 to arm 2. In this unplanned analysis, the short-course RT results were not statistically significantly different from the results of commonly used RT in elderly patients. The median overall survival time was 6.8 months (95% confidence interval [CI], 4.5-9.1 months) in arm 1 and 6.2 months (95% CI, 4.7-7.7 months) in arm 2 (P=.936). The median progression-free survival time was 4.3 months (95% CI, 2.6-5.9 months) in arm 1 and 3.2 months (95% CI, 0.1-6.3 months) in arm 2 (P=.706). A short-course RT regimen of 25 Gy in 5 fractions is an acceptable treatment option for patients aged ≥65 years, mainly those with a poor performance status or contraindication to chemotherapy, which would be indicated in cases of methylated O6 methylguanine-DNA-methyltransferase promoter tumors. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  15. Probability theory

    CERN Document Server

    S Varadhan, S R

    2001-01-01

    This volume presents topics in probability theory covered during a first-year graduate course given at the Courant Institute of Mathematical Sciences. The necessary background material in measure theory is developed, including the standard topics, such as extension theorem, construction of measures, integration, product spaces, Radon-Nikodym theorem, and conditional expectation. In the first part of the book, characteristic functions are introduced, followed by the study of weak convergence of probability distributions. Then both the weak and strong limit theorems for sums of independent random variables…

  16. A comparison of the conditional inference survival forest model to random survival forests based on a simulation study as well as on two applications with time-to-event data.

    Science.gov (United States)

    Nasejje, Justine B; Mwambi, Henry; Dheda, Keertan; Lesosky, Maia

    2017-07-28

    Random survival forest (RSF) models have been identified as alternative methods to the Cox proportional hazards model in analysing time-to-event data. These methods, however, have been criticised for the bias that results from favouring covariates with many split-points, and hence conditional inference forests for time-to-event data have been suggested. Conditional inference forests (CIF) are known to correct the bias in RSF models by separating the procedure for selecting the best covariate to split on from the search for the best split point for the selected covariate. In this study, we compare the random survival forest model to the conditional inference forest (CIF) model using twenty-two simulated time-to-event datasets. We also analysed two real time-to-event datasets. The first dataset is based on the survival of children under five years of age in Uganda and consists of categorical covariates, most of them having more than two levels (many split-points). The second dataset is based on the survival of patients with extensively drug-resistant tuberculosis (XDR TB) and consists of mainly categorical covariates with two levels (few split-points). The study findings indicate that the conditional inference forest model is superior to random survival forest models in analysing time-to-event data whose covariates have many split-points, based on the values of the bootstrap cross-validated estimates of the integrated Brier score. However, conditional inference forests perform comparably to random survival forest models in analysing time-to-event data whose covariates have fewer split-points. Although survival forests are promising methods for analysing time-to-event data, it is important to identify the best forest model for analysis based on the nature of the covariates of the dataset in question.
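
    The integrated Brier score comparison described above can be reproduced in miniature with the scikit-survival package (an assumption; a single train/test split on synthetic data stands in for the paper's bootstrap cross-validation, and a Cox model stands in for one of the forest variants):

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sksurv.ensemble import RandomSurvivalForest
    from sksurv.linear_model import CoxPHSurvivalAnalysis
    from sksurv.metrics import integrated_brier_score
    from sksurv.util import Surv

    rng = np.random.default_rng(0)

    n = 500
    X = rng.normal(size=(n, 4))
    t_event = rng.exponential(scale=np.exp(0.7 * X[:, 0] - 0.5 * X[:, 1]))
    t_cens = rng.exponential(scale=3.0, size=n)
    y = Surv.from_arrays(event=t_event <= t_cens, time=np.minimum(t_event, t_cens))

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    times = np.percentile(y_te["time"], np.linspace(25, 75, 11))  # evaluation grid

    for name, model in [("Cox PH", CoxPHSurvivalAnalysis()),
                        ("RSF", RandomSurvivalForest(n_estimators=100,
                                                     random_state=0))]:
        model.fit(X_tr, y_tr)
        preds = np.asarray([[fn(t) for t in times]
                            for fn in model.predict_survival_function(X_te)])
        print(name, round(integrated_brier_score(y_tr, y_te, preds, times), 4))
    ```

    Lower scores indicate better-calibrated survival predictions over the chosen time grid.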

  17. Meta-analyses of randomized controlled trials show suboptimal validity of surrogate outcomes for overall survival in advanced colorectal cancer.

    Science.gov (United States)

    Ciani, Oriana; Buyse, Marc; Garside, Ruth; Peters, Jaime; Saad, Everardo D; Stein, Ken; Taylor, Rod S

    2015-07-01

    To quantify and compare the treatment effects on three surrogate end points, progression-free survival (PFS), time to progression (TTP), and tumor response rate (TR), vs. overall survival (OS), based on a meta-analysis of randomized controlled trials (RCTs) of drug interventions in advanced colorectal cancer (aCRC). We systematically searched for RCTs of pharmacologic therapies in aCRC between 2003 and 2013. Trial characteristics, risk of bias, and outcomes were recorded based on a predefined form. Univariate and multivariate random-effects meta-analyses were used to estimate pooled summary treatment effects. The ratio of hazard ratios (HRs)/odds ratios (ORs) and the difference in medians were used to quantify the degree of difference in treatment effects on the surrogate end points and OS. Spearman ρ, the surrogate threshold effect (STE), and R² were also estimated across predefined trial-level covariates. We included 101 RCTs. In univariate and multivariate meta-analyses, we found larger treatment effects for the surrogates than for OS. Compared with OS, treatment effects were on average 13% higher when HRs were measured and 3% to 45% higher when ORs were considered; differences in median PFS/TTP were higher than on OS by an average of 0.5 month. Spearman ρ ranged from 0.39 to 0.80, mean R² from 0.06 to 0.65, and the STE was 0.8 for HR_PFS, 0.64 for HR_TTP, and 0.28 for OR_TR. The stratified analyses revealed high variability across all strata. None of the end points in this study achieved the level of evidence (i.e., mean R²_trial > 0.60) that has been set to select high or excellent correlation levels by common surrogate evaluation tools. Previous surrogacy relationships observed between PFS and TTP vs. OS in selected settings may not apply across other classes or lines of therapy. Copyright © 2015 Elsevier Inc. All rights reserved.

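    For illustration only, the trial-level side of such a surrogacy analysis can be sketched in R as a random-effects meta-regression of the OS effect on the surrogate effect; the trial summaries below are fabricated, and the metafor package is my choice, not necessarily the authors'.

      library(metafor)

      set.seed(5)
      k <- 30                                        # number of trials
      log_hr_pfs <- rnorm(k, mean = -0.2, sd = 0.15)
      log_hr_os  <- 0.6 * log_hr_pfs + rnorm(k, sd = 0.08)  # attenuated OS effect
      se_os      <- runif(k, 0.05, 0.15)             # SEs of the log HR for OS

      fit <- rma(yi = log_hr_os, sei = se_os, mods = ~ log_hr_pfs)
      summary(fit)   # slope < 1 mirrors larger effects on the surrogate than on OS
      fit$R2         # trial-level R^2: heterogeneity accounted for by the surrogate
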
  18. Survival time outcomes in randomized, controlled trials and meta-analyses: the parallel universes of efficacy and cost-effectiveness.

    Science.gov (United States)

    Guyot, Patricia; Welton, Nicky J; Ouwens, Mario J N M; Ades, A E

    2011-01-01

    Many regulatory agencies require that manufacturers establish both efficacy and cost-effectiveness. The statistical analysis of the randomized, controlled trial (RCT) outcomes should be the same for both purposes. The question addressed by this article is the following: for survival outcomes, what is the relationship between the statistical analyses used to support inference and the statistical model used to support decision making based on cost-effectiveness analysis (CEA)? We performed a review of CEAs alongside trials and CEAs based on a synthesis of RCT results, which were submitted to the National Institute for Health and Clinical Excellence (NICE) Technology Appraisal program and included survival outcomes. We recorded the summary statistics and the statistical models used in both efficacy and cost-effectiveness analyses as well as procedures for model diagnosis and selection. In no case was the statistical model for efficacy and CEA the same. For efficacy, relative risks or Cox regression was used. For CEA, the common practice was to fit a parametric model to the control arm, then to apply the hazard ratio from the efficacy analysis to predict the treatment arm. The proportional hazards assumption was seldom checked; the choice of model was seldom based on formal criteria, and uncertainty in model choice was seldom addressed and never propagated through the model. Both inference and decisions based on CEAs should be based on the same statistical model. This article shows that for survival outcomes, this is not the case. In the interests of transparency, trial protocols should specify a common procedure for model choice for both purposes. Further, the sufficient statistics and the life tables for each arm should be reported to improve transparency and to facilitate secondary analyses of results of RCTs. Copyright © 2011 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

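    The criticized extrapolation practice is easy to make concrete. In this R sketch on fabricated trial data, a Weibull model is fit to the control arm only and the treatment arm is derived by applying the efficacy hazard ratio, so that S_trt(t) = S_ctrl(t)^HR under the (seldom checked) proportional hazards assumption.

      library(survival)

      set.seed(42)
      d <- data.frame(time   = rexp(200, rate = 0.05),
                      status = rbinom(200, 1, 0.7),
                      arm    = rep(c("control", "treatment"), each = 100))

      fit   <- survreg(Surv(time, status) ~ 1,
                       data = subset(d, arm == "control"), dist = "weibull")
      shape <- 1 / fit$scale              # survreg uses the AFT parameterization
      scale <- exp(coef(fit))
      t      <- seq(0, 60, by = 1)        # months
      S_ctrl <- exp(-(t / scale)^shape)   # parametric control-arm survival
      HR     <- 0.75                      # hazard ratio from the efficacy analysis
      S_trt  <- S_ctrl^HR                 # treatment arm under proportional hazards
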
  19. Survival and success rates of immediately and early loaded implants: 12-month results from a multicentric randomized clinical study.

    Science.gov (United States)

    Grandi, Tommaso; Garuti, Giovanna; Guazzi, Paolo; Tarabini, Luciano; Forabosco, Andrea

    2012-06-01

    Our objective was to compare survival and peri-implant bone levels of immediately nonocclusally vs early loaded implants in partially edentulous patients up to 12 months after implant placement. Eighty patients (inclusion criteria: general good health, good oral hygiene, 30-65 years old; exclusion criteria: head and neck irradiation/cancer, pregnancy, uncontrolled diabetes, substance abuse, bruxism, lack of opposing occluding dentition, smokers >10 cigarettes/day, need for bone augmentation procedures) were selected in 5 Italian study centers and randomized into 2 groups: 40 patients in the immediately loaded group (minimal insertion torque 30 Ncm) and 40 patients in the early loaded group. Immediately loaded implants were provided with nonoccluding temporary restorations. Final restorations were provided 2 months later. Early loaded implants were provided with a definitive restoration after 2 months. Peri-implant bone resorption was evaluated radiographically with software (ImageJ 1.42). No dropout occurred. Both groups gradually lost peri-implant bone. After 12 months, patients of both groups lost an average of 0.4 mm of peri-implant bone. There were no statistically significant differences (evaluated with t test) between the 2 loading strategies for peri-implant bone level changes at 2 (P = .6730), 6 (P = .6613) and 12 (P = .5957) months or for survival rates (100% in both groups). If adequate primary stability is achieved, immediate loading of dental implants can provide similar success rates, survival rates, and peri-implant bone resorption as compared with early loading, as evaluated in the present study.

  20. A Randomized Study on Postrelapse Disease-Free Survival with Adjuvant Mistletoe versus Oral Etoposide in Osteosarcoma Patients

    Directory of Open Access Journals (Sweden)

    Alessandra Longhi

    2014-01-01

    Full Text Available Background. Osteosarcoma is a highly malignant bone tumour. After the second relapse, the 12-month postrelapse disease-free survival (PRDFS) rate decreases below 20%. Oral Etoposide is often used in clinical practice after surgery as an “adjuvant” outside any protocol and with only limited evidence of improved survival. Viscum album fermentatum Pini (Viscum) is an extract of mistletoe plants grown on pine trees for subcutaneous (sc) injection with immunomodulatory activity. Methods. Encouraged by preliminary findings, we conducted a study where osteosarcoma patients free from disease after second metastatic relapse were randomly assigned to Viscum sc or Oral Etoposide. Our goal was to compare 12-month PRDFS rates with an equivalent historical control group. Results. Twenty patients have been enrolled, with a median age of 34 years (range 11–65) and a median follow-up time of 38.5 months (3–73). The median PRDFS is currently 4 months (1–47) in the Etoposide group and 39 months (2–73) in the Viscum group. Patients getting Viscum reported a higher quality of life due to lower toxicity. Conclusion. Viscum shows promise as adjuvant treatment in prolonging PRDFS after second relapse in osteosarcoma patients. A larger study is required to conclusively determine efficacy and immunomodulatory mechanisms of Viscum therapy in osteosarcoma patients.

  1. A Randomized Study on Postrelapse Disease-Free Survival with Adjuvant Mistletoe versus Oral Etoposide in Osteosarcoma Patients.

    Science.gov (United States)

    Longhi, Alessandra; Reif, Marcus; Mariani, Erminia; Ferrari, Stefano

    2014-01-01

    Background. Osteosarcoma is a highly malignant bone tumour. After the second relapse, the 12-month postrelapse disease-free survival (PRDFS) rate decreases below 20%. Oral Etoposide is often used in clinical practice after surgery as an "adjuvant" outside any protocol and with only limited evidence of improved survival. Viscum album fermentatum Pini (Viscum) is an extract of mistletoe plants grown on pine trees for subcutaneous (sc) injection with immunomodulatory activity. Methods. Encouraged by preliminary findings, we conducted a study where osteosarcoma patients free from disease after second metastatic relapse were randomly assigned to Viscum sc or Oral Etoposide. Our goal was to compare 12-month PRDFS rates with an equivalent historical control group. Results. Twenty patients have been enrolled, with a median age of 34 years (range 11-65) and a median follow-up time of 38.5 months (3-73). The median PRDFS is currently 4 months (1-47) in the Etoposide group and 39 months (2-73) in the Viscum group. Patients getting Viscum reported a higher quality of life due to lower toxicity. Conclusion. Viscum shows promise as adjuvant treatment in prolonging PRDFS after second relapse in osteosarcoma patients. A larger study is required to conclusively determine efficacy and immunomodulatory mechanisms of Viscum therapy in osteosarcoma patients.

  2. COUNTRY-LEVEL SOCIOECONOMIC INDICATORS ASSOCIATED WITH SURVIVAL PROBABILITY OF BECOMING A CENTENARIAN AMONG OLDER EUROPEAN ADULTS: GENDER INEQUALITY, MALE LABOUR FORCE PARTICIPATION AND PROPORTIONS OF WOMEN IN PARLIAMENTS.

    Science.gov (United States)

    Kim, Jong In; Kim, Gukbin

    2017-03-01

    This study confirms an association between survival probability of becoming a centenarian (SPBC) for those aged 65 to 69 and country-level socioeconomic indicators in Europe: the gender inequality index (GII), male labour force participation (MLP) rates and proportions of seats held by women in national parliaments (PWP). The analysis was based on SPBC data from 34 countries obtained from the United Nations (UN). Country-level socioeconomic indicator data were obtained from the UN and World Bank databases. The associations between socioeconomic indicators and SPBC were assessed using correlation coefficients and multivariate regression models. The findings show significant correlations between the SPBC for women and men aged 65 to 69 and country-level socioeconomic indicators: GII (r=-0.674, p=0.001), MLP (r=0.514, p=0.002) and PWP (r=0.498, p=0.003). The SPBC predictors for women and men were lower GIIs and higher MLP and PWP (R²=0.508, p=0.001). Country-level socioeconomic indicators appear to have an important effect on the probability of becoming a centenarian in European adults aged 65 to 69. Country-level gender equality policies in European countries may decrease the risk of unhealthy old age and increase longevity in elders through greater national gender equality; disparities in GII and other country-level socioeconomic indicators impact longevity probability. National longevity strategies should target country-level gender inequality.

  3. Modeling Transport in Fractured Porous Media with the Random-Walk Particle Method: The Transient Activity Range and the Particle-Transfer Probability

    Energy Technology Data Exchange (ETDEWEB)

    Lehua Pan; G.S. Bodvarsson

    2001-10-22

    Multiscale features of transport processes in fractured porous media make numerical modeling a difficult task, both in conceptualization and computation. Modeling the mass transfer through the fracture-matrix interface is one of the critical issues in the simulation of transport in a fractured porous medium. Because conventional dual-continuum-based numerical methods are unable to capture the transient features of the diffusion depth into the matrix (unless they assume a passive matrix medium), such methods will overestimate the transport of tracers through the fractures, especially for the cases with large fracture spacing, resulting in artificial early breakthroughs. We have developed a new method for calculating the particle-transfer probability that can capture the transient features of diffusion depth into the matrix within the framework of the dual-continuum random-walk particle method (RWPM) by introducing a new concept of activity range of a particle within the matrix. Unlike the multiple-continuum approach, the new dual-continuum RWPM does not require using additional grid blocks to represent the matrix. It does not assume a passive matrix medium and can be applied to the cases where global water flow exists in both continua. The new method has been verified against analytical solutions for transport in the fracture-matrix systems with various fracture spacing. The calculations of the breakthrough curves of radionuclides from a potential repository to the water table in Yucca Mountain demonstrate the effectiveness of the new method for simulating 3-D, mountain-scale transport in a heterogeneous, fractured porous medium under variably saturated conditions.

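    As a toy illustration of the bare idea of a particle-transfer probability in a dual-continuum random walk (the transfer probabilities below are arbitrary constants; the paper's transient, activity-range-dependent formulation is not reproduced here):

      set.seed(9)
      n_steps <- 1000
      p_fm <- 0.02     # fracture -> matrix transfer probability (arbitrary)
      p_mf <- 0.005    # matrix -> fracture transfer probability (arbitrary)
      in_fracture <- logical(n_steps)
      state <- TRUE    # the particle starts in the fracture continuum
      for (i in seq_len(n_steps)) {
        if (runif(1) < (if (state) p_fm else p_mf)) state <- !state
        in_fracture[i] <- state
      }
      mean(in_fracture)  # fraction of steps spent in the fracture continuum
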
  4. Effect of multiple micronutrient supplementation on survival of HIV-infected children in Uganda: a randomized, controlled trial.

    Science.gov (United States)

    Ndeezi, Grace; Tylleskär, Thorkild; Ndugwa, Christopher M; Tumwine, James K

    2010-06-03

    Micronutrient deficiencies compromise the survival of HIV-infected children in low-income countries. We assessed the effect of multiple micronutrient supplementation on the mortality of HIV-infected children in Uganda. In a randomized, controlled trial, 847 children aged one to five years and attending HIV clinics in Uganda were stratified by antiretroviral therapy (ART, n = 85 versus no ART, n = 762). The children were randomized to six months of either: twice the recommended dietary allowance of 14 micronutrients as the intervention arm (vitamins A, B1, B2, niacin, B6, B12, C, D and E, folate, zinc, copper, iodine and selenium); or the standard recommended dietary allowance of six multivitamins (vitamins A, D2, B1, B2, C and niacin) as a comparative "standard-of-care" arm. Mortality was analyzed at 12 months of follow-up using Kaplan-Meier curves and the log-rank test. Mortality at 12 months was 25 out of 426 (5.9%) children in the intervention arm and 28 out of 421 (6.7%) in the comparative arm: risk ratio 0.9 (95% CI 0.5 - 1.5). Two out of 85 (2.4%) children in the ART stratum died compared with 51 out of 762 (6.7%) in the non-ART stratum. Of those who died in the non-ART stratum, 25 of 383 (6.5%) were in the intervention arm and 26 of 379 (6.9%) in the comparative arm; risk ratio 1.0 (95% CI 0.6 - 1.6). There was no significant difference in survival at 12 months (p = 0.64, log-rank test). In addition, there was no significant difference in mean weight-for-height at 12 months; 0.70 +/- 1.43 (95% CI 0.52 - 0.88) for the intervention versus 0.59 +/- 1.15 (95% CI 0.45 - 0.75) in the comparative arm. The mean CD4 cell count; 1024 +/- 592 (95% CI 942 - 1107) versus 1060 +/- 553 (95% CI 985 - 1136) was also similar between the two groups. Twice the recommended dietary allowance of 14 micronutrients compared with a standard recommended dietary allowance of six multivitamins for six months was well tolerated, but it did not significantly alter mortality, growth or CD

  5. Survival Rate of Atraumatic Restorative Treatment (ART) Restorations Using a Glass Ionomer Bilayer Technique with a Nanofilled Coating: A Bi-center Randomized Clinical Trial.

    Science.gov (United States)

    Hesse, Daniela; Bonifácio, Clarissa Calil; Bönecker, Marcelo; Guglielmi, Camila de Almeida Brandão; da Franca, Carolina; van Amerongen, Willem Evert; Colares, Viviane; Raggio, Daniela Prócida

    2016-01-01

    The high-viscosity consistency of glass ionomer cement (GIC) contributes to its inappropriate adaptation, while the material's premature exposure to humidity decreases its mechanical properties. This study's purposes were to: (1) investigate approximal atraumatic restorative treatment (ART) restorations' survival in primary molars using two different insertion techniques and two surface protection materials; and (2) compare the results of cities where treatments were performed. A total of 389 six- to seven-year-olds were selected from two cities in Brazil and randomly assigned into four groups: (1) ART restorations plus petroleum jelly (PJ); (2) bilayer-ART restorations plus PJ; (3) ART restorations plus nanofilled coating for GIC (NC); (4) bilayer-ART restorations plus NC. Restorations were evaluated after one, six, 12, 18, and 24 months. Kaplan-Meier survival analysis, log-rank test, and Cox regression analysis were performed. Restorations' cumulative survival was 46.4 percent. There was a higher survival of bilayer-ART restorations (P=0.03). No difference was observed between surface protection materials (P=0.57). Restorations made in Barueri were almost 2.5-fold more likely to survive than those from Recife (P<.05). The bilayer technique increased ART restorations' survival in primary molars. The nanofilled coating does not influence restorations' survival rate, and the city where treatments were performed influences restoration survival.

  6. Manual vs. integrated automatic load-distributing band CPR with equal survival after out of hospital cardiac arrest. The randomized CIRC trial

    NARCIS (Netherlands)

    Wik, L.; Olsen, J.A.; Persse, D.; Sterz, F.; Lozano Jr, M.; Brouwer, M.A.; Westfall, M.; Souders, C.M.; Malzer, R.; Grunsven, P.M. van; Travis, D.T.; Whitehead, A.; Herken, U.R.; Lerner, E.B.

    2014-01-01

    OBJECTIVE: To compare integrated automated load distributing band CPR (iA-CPR) with high-quality manual CPR (M-CPR) to determine equivalence, superiority, or inferiority in survival to hospital discharge. METHODS: Between March 5, 2009 and January 11, 2011 a randomized, unblinded, controlled group

  7. Randomized controlled clinical trial of the 24-months survival of composite resin restorations after one-step incomplete and complete excavation on primary teeth

    NARCIS (Netherlands)

    Franzon, R.; Opdam, N.J.; Guimaraes, L.F.; Demarco, F.F.; Casagrande, L.; Haas, A.N de; Araujo, F.B.

    2015-01-01

    OBJECTIVE: This randomized clinical trial aimed to compare the 24-months survival of composite restorations in primary molars after partial caries removal (PCR) and total caries removal (TCR). METHODS: Forty-eight children aged 3-8 years with at least one molar with a deep carious lesion were

  8. Effect of Total Laparoscopic Hysterectomy vs Total Abdominal Hysterectomy on Disease-Free Survival Among Women With Stage I Endometrial Cancer: A Randomized Clinical Trial.

    Science.gov (United States)

    Janda, Monika; Gebski, Val; Davies, Lucy C; Forder, Peta; Brand, Alison; Hogg, Russell; Jobling, Thomas W; Land, Russell; Manolitsas, Tom; Nascimento, Marcelo; Neesham, Deborah; Nicklin, James L; Oehler, Martin K; Otton, Geoff; Perrin, Lewis; Salfinger, Stuart; Hammond, Ian; Leung, Yee; Sykes, Peter; Ngan, Hextan; Garrett, Andrea; Laney, Michael; Ng, Tong Yow; Tam, Karfai; Chan, Karen; Wrede, C David; Pather, Selvan; Simcock, Bryony; Farrell, Rhonda; Robertson, Gregory; Walker, Graeme; Armfield, Nigel R; Graves, Nick; McCartney, Anthony J; Obermair, Andreas

    2017-03-28

    Standard treatment for endometrial cancer involves removal of the uterus, tubes, ovaries, and lymph nodes. Few randomized trials have compared disease-free survival outcomes for surgical approaches. To investigate whether total laparoscopic hysterectomy (TLH) is equivalent to total abdominal hysterectomy (TAH) in women with treatment-naive endometrial cancer. The Laparoscopic Approach to Cancer of the Endometrium (LACE) trial was a multinational, randomized equivalence trial conducted between October 7, 2005, and June 30, 2010, in which 27 surgeons from 20 tertiary gynecological cancer centers in Australia, New Zealand, and Hong Kong randomized 760 women with stage I endometrioid endometrial cancer to either TLH or TAH. Follow-up ended on March 3, 2016. Patients were randomly assigned to undergo TAH (n = 353) or TLH (n = 407). The primary outcome was disease-free survival, which was measured as the interval between surgery and the date of first recurrence, including disease progression or the development of a new primary cancer or death assessed at 4.5 years after randomization. The prespecified equivalence margin was 7% or less. Secondary outcomes included recurrence of endometrial cancer and overall survival. Patients were followed up for a median of 4.5 years. Of 760 patients who were randomized (mean age, 63 years), 679 (89%) completed the trial. At 4.5 years of follow-up, disease-free survival was 81.3% in the TAH group and 81.6% in the TLH group. The disease-free survival rate difference was 0.3% (favoring TLH; 95% CI, -5.5% to 6.1%; P = .007), meeting criteria for equivalence. There was no statistically significant between-group difference in recurrence of endometrial cancer (28/353 in TAH group [7.9%] vs 33/407 in TLH group [8.1%]; risk difference, 0.2% [95% CI, -3.7% to 4.0%]; P = .93) or in overall survival (24/353 in TAH group [6.8%] vs 30/407 in TLH group [7.4%]; risk difference, 0.6% [95% CI, -3.0% to 4.2%]; P = .76). Among women

  9. Improved survival with ursodeoxycholic acid prophylaxis in allogeneic stem cell transplantation: long-term follow-up of a randomized study.

    Science.gov (United States)

    Ruutu, Tapani; Juvonen, Eeva; Remberger, Mats; Remes, Kari; Volin, Liisa; Mattsson, Jonas; Nihtinen, Anne; Hägglund, Hans; Ringdén, Olle

    2014-01-01

    We report the long-term results of a prospective randomized study on the use of ursodeoxycholic acid (UDCA) for prevention of hepatic complications after allogeneic stem cell transplantation. Two hundred forty-two patients, 232 with malignant disease, were randomized to receive (n = 123) or not to receive (n = 119) UDCA from the beginning of the conditioning until 90 days post-transplantation. The results were reported after 1-year follow-up. UDCA administration reduced significantly the proportion of patients developing high serum bilirubin levels as well as the incidence of severe acute graft-versus-host disease (GVHD), liver GVHD, and intestinal GVHD. In the UDCA prophylaxis group, nonrelapse mortality (NRM) was lower and overall survival better than in the control group. After a 10-year follow-up, the difference in the survival and NRM in favor of the UDCA-treated group, seen at 1 year, was maintained (survival 48% versus 38%, P = .037; NRM 28% versus 41%, P = .01). A landmark analysis in patients surviving at 1 year post-transplantation showed no significant differences between the study groups in the long-term follow-up in chronic GVHD, relapse rate, NRM, disease-free survival, or overall survival. These long-term results continue to support the useful role of UDCA in the prevention of transplant-related complications in allogeneic transplantation. Copyright © 2014 American Society for Blood and Marrow Transplantation. Published by Elsevier Inc. All rights reserved.

  10. Comparison of survival time of Hawley and Vacuum-formed retainers in orthodontic patients – a randomized clinical trial

    Directory of Open Access Journals (Sweden)

    Seyed Hossein Moslemzadeh

    2017-01-01

    Full Text Available Background: Maintaining the results of orthodontic treatment and keeping the teeth in the corrected position is a great challenge in orthodontics. This study aimed to compare the survival time of three types of retainers, including Hawley, 1-mm Vacuum-Formed (VF), and 1.5-mm VF, within a 6-month period. Methods: In this randomized clinical study, 152 patients were allocated into three groups to receive one type of the retainers. They were visited 1, 3, and 6 months after retainer delivery and checked for breakage, loss, local perforation, and discoloration from the patient's and clinician's point of view as indicators of failure. Chi-square and Fisher's exact tests were used as appropriate. Results: The results revealed that breakage was among the main reasons for failure of retainers within 6 months, which was statistically significantly different between Hawley and VF retainers, as well as between 1-mm and 1.5-mm VF retainers, in the three intervals (p < 0.05). Assessing the discoloration from the patient's point of view revealed statistically significant differences between Hawley and VF retainers within the first month; however, the difference was not significant at the third and sixth months (p > 0.05). By the end of the sixth month, some of the VF retainers had perforation, while perforation was not observed in Hawley retainers. Conclusion: Considering the higher breakage rate of 1-mm VF, 1.5-mm VF seems to be the retainer of choice.

  11. Survival paths through the forest

    DEFF Research Database (Denmark)

    Mogensen, Ulla Brasch

    when the information is high-dimensional, e.g. when there are many thousands of genes or markers. In these situations machine learning methods such as the random forest can still be applied and provide reasonable prediction accuracy. The main focus in this talk is the performance of random forest, in particular when the response is three-dimensional. In a diagnostic study of inflammatory bowel disease, three classes of patients have to be diagnosed based on microarray gene-expression data. The performance of random forest is compared on a probability scale and on a classification scale to elastic net. In survival analysis with competing risks, I present an extension of random forest using time-dependent pseudo-values to build event risk prediction models. This approach is evaluated with data from the Copenhagen Stroke Study. Further, I will explain how to use the R-package "pec" to evaluate random forests using...

  12. Probability in biology: overview of a comprehensive theory of probability in living systems.

    Science.gov (United States)

    Nakajima, Toshiyuki

    2013-09-01

    Probability is closely related to biological organization and adaptation to the environment. Living systems need to maintain their organizational order by producing specific internal events non-randomly, and must cope with the uncertain environments. These processes involve increases in the probability of favorable events for these systems by reducing the degree of uncertainty of events. Systems with this ability will survive and reproduce more than those that have less of this ability. Probabilistic phenomena have been deeply explored using the mathematical theory of probability since Kolmogorov's axiomatization provided mathematical consistency for the theory. However, the interpretation of the concept of probability remains both unresolved and controversial, which creates problems when the mathematical theory is applied to problems in real systems. In this article, recent advances in the study of the foundations of probability from a biological viewpoint are reviewed, and a new perspective is discussed toward a comprehensive theory of probability for understanding the organization and adaptation of living systems. Copyright © 2013 Elsevier Ltd. All rights reserved.

  13. Measurement uncertainty and probability

    National Research Council Canada - National Science Library

    Willink, Robin

    2013-01-01

    ... and probability models 3.4 Inference and confidence 3.5 Two central limit theorems 3.6 The Monte Carlo method and process simulation 4 The randomization of systematic errors 4.1 The Working Group of 1980 4.2 From classical repetition to practica...

  14. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  15. Models for probability and statistical inference theory and applications

    CERN Document Server

    Stapleton, James H

    2007-01-01

    This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readers. Models for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping. Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...

  16. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  17. Probability machines: consistent probability estimation using nonparametric learning machines.

    Science.gov (United States)

    Malley, J D; Kruppa, J; Dasgupta, A; Malley, K G; Ziegler, A

    2012-01-01

    Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications.

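    A minimal R sketch of the probability-machine idea on synthetic data: a regression (not classification) forest fit to a numeric 0/1 response returns estimates of P(Y = 1 | X); the randomForest package is one freely available implementation of the kind the authors mention.

      library(randomForest)

      set.seed(7)
      n <- 1000
      x <- data.frame(x1 = rnorm(n), x2 = rnorm(n), x3 = rnorm(n))
      p <- plogis(x$x1 - 0.5 * x$x2)   # true conditional probability
      y <- rbinom(n, 1, p)             # binary response, kept numeric

      pm <- randomForest(x, y, ntree = 500)  # regression forest, not a classifier
      p_hat <- predict(pm, x)                # leaf averages, hence within [0, 1]
      cor(p_hat, p)                          # agreement with the true probabilities
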
  18. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433

  19. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  20. A Randomized Study on Postrelapse Disease-Free Survival with Adjuvant Mistletoe versus Oral Etoposide in Osteosarcoma Patients

    OpenAIRE

    Longhi, Alessandra; Reif, Marcus; Mariani, Erminia; Ferrari, Stefano

    2014-01-01

    Background. Osteosarcoma is a highly malignant bone tumour. After the second relapse, the 12-month postrelapse disease-free survival (PRDFS) rate decreases below 20%. Oral Etoposide is often used in clinical practice after surgery as an “adjuvant” outside any protocol and with only limited evidence of improved survival. Viscum album fermentatum Pini (Viscum) is an extract of mistletoe plants grown on pine trees for subcutaneous (sc) injection with immunomodulatory activity. Methods. Encourage...

  1. Probability distribution of intersymbol distances in random symbolic sequences: Applications to improving detection of keywords in texts and of amino acid clustering in proteins.

    Science.gov (United States)

    Carpena, Pedro; Bernaola-Galván, Pedro A; Carretero-Campos, Concepción; Coronado, Ana V

    2016-11-01

    Symbolic sequences have been extensively investigated in the past few years within the framework of statistical physics. Paradigmatic examples of such sequences are written texts, and deoxyribonucleic acid (DNA) and protein sequences. In these examples, the spatial distribution of a given symbol (a word, a DNA motif, an amino acid) is a key property usually related to the symbol importance in the sequence: The more uneven and far from random the symbol distribution, the higher the relevance of the symbol to the sequence. Thus, many techniques of analysis measure in some way the deviation of the symbol spatial distribution with respect to the random expectation. The problem is then to know the spatial distribution corresponding to randomness, which is typically considered to be either the geometric or the exponential distribution. However, these distributions are only valid for very large symbolic sequences and for many occurrences of the analyzed symbol. Here, we obtain analytically the exact, randomly expected spatial distribution valid for any sequence length and any symbol frequency, and we study its main properties. The knowledge of the distribution allows us to define a measure able to properly quantify the deviation from randomness of the symbol distribution, especially for short sequences and low symbol frequency. We apply the measure to the problem of keyword detection in written texts and to study amino acid clustering in protein sequences. In texts, we show how the results improve with respect to previous methods when short texts are analyzed. In proteins, which are typically short, we show how the measure quantifies unambiguously the amino acid clustering and characterize its spatial distribution.

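    The asymptotic benchmark the authors improve upon is easy to reproduce. This short R sketch (my own toy example) compares the empirical intersymbol-distance distribution in a deliberately short random sequence with the geometric law, which is exact only in the limit of long sequences and many occurrences of the symbol.

      set.seed(3)
      N <- 200                                   # deliberately short sequence
      s <- sample(c("A", "B"), N, replace = TRUE, prob = c(0.1, 0.9))
      pos <- which(s == "A")
      dists <- diff(pos)                         # distances between occurrences
      emp <- table(dists) / length(dists)        # empirical distribution
      d <- as.integer(names(emp))
      cbind(empirical = as.numeric(emp),
            geometric = dgeom(d - 1, prob = 0.1))  # asymptotic expectation
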
  2. Cluster-randomized study of intermittent preventive treatment for malaria in infants (IPTi in southern Tanzania: evaluation of impact on survival

    Directory of Open Access Journals (Sweden)

    Schellenberg Joanna

    2011-12-01

    Full Text Available Background. Intermittent Preventive Treatment for malaria control in infants (IPTi) consists of the administration of a treatment dose of an anti-malarial drug, usually sulphadoxine-pyrimethamine, at scheduled intervals, regardless of the presence of Plasmodium falciparum infection. A pooled analysis of individually randomized trials reported that IPTi reduced clinical episodes by 30%. This study evaluated the effect of IPTi on child survival in the context of a five-district implementation project in southern Tanzania. [Trial registration: clinicaltrials.gov NCT00152204]. Methods. After baseline household and health facility surveys in 2004, five districts comprising 24 divisions were randomly assigned either to receive IPTi (n = 12) or not (n = 12). Implementation started in March 2005, led by routine health services with support from the research team. In 2007, a large household survey was undertaken to assess the impact of IPTi on survival in infants aged two-11 months through birth history interviews with all women aged 13-49 years. The analysis is based on an "intention-to-treat" ecological design, with survival outcomes analysed according to the cluster in which the mothers lived. Results. Survival in infants aged two-11 months was comparable in IPTi and comparison areas at baseline. In intervention areas in 2007, 48% of children aged 12-23 months had documented evidence of receiving three doses of IPTi, compared to 2% in comparison areas (P < 0.001); mortality in intervention and comparison areas did not differ significantly (P = 0.31). Conclusion. The lack of evidence of an effect of IPTi on survival could be a false negative result due to a lack of power or imbalance of unmeasured confounders. Alternatively, there could be no mortality impact of IPTi due to low coverage, late administration, drug resistance, decreased malaria transmission or improvements in vector control and case management. This study raises important questions for programme evaluation design.

  3. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  4. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  5. A random matrix/transition state theory for the probability distribution of state-specific unimolecular decay rates: Generalization to include total angular momentum conservation and other dynamical symmetries

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez, R.; Miller, W.H.; Moore, C.B. (Department of Chemistry, University of California, and Chemical Sciences Division, Lawrence Berkeley Laboratory, Berkeley, California 94720 (United States)); Polik, W.F. (Department of Chemistry, Hope College, Holland, Michigan 49423 (United States))

    1993-07-15

    A previously developed random matrix/transition state theory (RM/TST) model for the probability distribution of state-specific unimolecular decay rates has been generalized to incorporate total angular momentum conservation and other dynamical symmetries. The model is made into a predictive theory by using a semiclassical method to determine the transmission probabilities of a nonseparable rovibrational Hamiltonian at the transition state. The overall theory gives a good description of the state-specific rates for the D₂CO → D₂ + CO unimolecular decay; in particular, it describes the dependence of the distribution of rates on total angular momentum J. Comparison of the experimental values with results of the RM/TST theory suggests that there is mixing among the rovibrational states.

  6. Survival in Malnourished Older Patients Receiving Post-Discharge Nutritional Support; Long-Term Results of a Randomized Controlled Trial

    NARCIS (Netherlands)

    Neelemaat, F; van Keeken, S; Langius, J A E; de van der Schueren, M A E; Thijs, A; Bosmans, J E

    2017-01-01

    BACKGROUND: Previous analyses have shown that a post-discharge individualized nutritional intervention had positive effects on body weight, lean body mass, functional limitations and fall incidents in malnourished older patients. However, the impact of this intervention on survival has not yet been

  7. Survival and Complications of Single Dental Implants in the Edentulous Mandible Following Immediate or Delayed Loading: A Randomized Controlled Clinical Trial.

    Science.gov (United States)

    Kern, M; Att, W; Fritzer, E; Kappel, S; Luthardt, R G; Mundt, T; Reissmann, D R; Rädel, M; Stiesch, M; Wolfart, S; Passia, N

    2018-02-01

    It was the aim of this 24-mo randomized controlled clinical trial to investigate whether the survival of a single median implant placed in the edentulous mandible to retain a complete denture is not compromised by immediate loading. Secondary outcomes were differences in prosthetic complications between the loading principles. Each of the 158 patients who received an implant was randomly assigned to the immediate loading group (n = 81) or the delayed loading group (n = 77). Recall visits were performed 1 mo after implant placement (for only the delayed loading group) and 1, 4, 12, and 24 mo after implant loading. Nine implants failed in the immediate loading group, all within the first 3 mo of implant loading, and 1 implant failed in the delayed loading group prior to loading. Noninferiority of implant survival of the immediate loading group, as compared with the delayed loading group, could not be shown (P = 0.81). Consistent with this result, a secondary analysis with Fisher exact test revealed that the observed difference in implant survival between the treatment groups was indeed statistically significant (P = 0.019). The most frequent prosthetic complications and maintenance interventions in the mandible were retention adjustments, denture fractures, pressure sores, and matrix exchanges. There was only 1 statistically significant difference between the groups regarding the parameter "fracture of the denture base in the ball attachment area" (P = 0.007). The results indicate that immediate loading of a single implant in the edentulous mandible results in inferior survival compared with delayed loading and therefore should be considered only in exceptional cases (German Clinical Trials Register: DRKS00003730).

  8. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.

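    For reference, the last relationship on the list, Bayes' theorem, in its simplest form for a hypothesis H and data D:

      P(H | D) = P(D | H) P(H) / P(D)
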
  9. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which include subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  10. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  11. Effect of fish oil, arginine, and doxorubicin chemotherapy on remission and survival time for dogs with lymphoma: a double-blind, randomized placebo-controlled study.

    Science.gov (United States)

    Ogilvie, G K; Fettman, M J; Mallinckrodt, C H; Walton, J A; Hansen, R A; Davenport, D J; Gross, K L; Richardson, K L; Rogers, Q; Hand, M S

    2000-04-15

    Polyunsaturated n-3 fatty acids have been shown to inhibit the growth and metastasis of tumors. This double-blind, randomized study was designed to evaluate the hypothesis that polyunsaturated n-3 fatty acids can improve metabolic parameters, decrease chemical indices of inflammation, enhance quality of life, and extend disease free interval and survival time for dogs treated for lymphoblastic lymphoma with doxorubicin chemotherapy. Thirty-two dogs with lymphoma were randomized to receive one of two diets supplemented with menhaden fish oil and arginine (experimental diet) or an otherwise identical diet supplemented with soybean oil (control diet). Diets were fed before and after remission was attained with up to five dosages of doxorubicin. Parameters examined included blood concentrations of glucose, lactic acid, and insulin in response to glucose and diet tolerance tests; alpha-1 acid glycoprotein; tumor necrosis factor; interleukin-6; body weight; amino acid profiles; resting energy expenditure; disease free interval (DFI); survival time (ST); and clinical performance scores. Dogs fed the experimental diet had significantly (P < .05) higher serum levels of the n-3 fatty acids, including C22:6, on diet tolerance testing. Increasing C22:6 levels were significantly (P < .05) associated with longer DFI and ST for dogs with Stage III lymphoma fed the experimental diet. Fatty acids of the n-3 series normalize elevated blood lactic acid in a dose-dependent manner, resulting in an increase in DFI and ST for dogs with lymphoma. Copyright 2000 American Cancer Society.

  12. Baseline oxidative defense and survival after 5-7 years among elderly stroke patients at nutritional risk: Follow-up of a randomized, nutritional intervention trial.

    Science.gov (United States)

    Iversen, Per O; Ha, Lisa; Blomhoff, Rune; Hauge, Truls; Veierød, Marit B

    2015-08-01

    Patients at nutritional risk are particularly vulnerable to adverse outcomes of acute stroke. We previously found that increased energy- and protein intervention improved short-term survival among stroke patients with the highest baseline antioxidant capacity. We now examined survival of these patients after 5-7 years. We studied 165 patients >65 years admitted to hospital for acute stroke and enrolled in a randomized nutritional intervention study in 2005-2007. Cox regression analysis was used to estimate the associations between all-cause mortality (through 2011) and baseline plasma levels of antioxidant markers (glutathione reducing capacity, alpha-tocopherol, vitamin C and total carotenoids). We found no significant difference (P = 0.86) in survival between the intervention and control group. Among the tested antioxidant markers, plasma levels above the median for total carotenoids were associated with reduced risk of death in the intervention group (adjusted hazard ratio, 0.29; 95% confidence interval, 0.12-0.71). Hospitalized patients that received enhanced dietary energy- and protein after acute stroke and with baseline plasma total carotenoids above median level, had reduced risk of death after 5-7 years. Further trials testing intervention with diets rich in antioxidants are warranted. Copyright © 2014 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.

  13. A randomized controlled trial of cognitive-behavioral stress management in breast cancer: survival and recurrence at 11-year follow-up.

    Science.gov (United States)

    Stagl, Jamie M; Lechner, Suzanne C; Carver, Charles S; Bouchard, Laura C; Gudenkauf, Lisa M; Jutagir, Devika R; Diaz, Alain; Yu, Qilu; Blomberg, Bonnie B; Ironson, Gail; Glück, Stefan; Antoni, Michael H

    2015-11-01

    Non-metastatic breast cancer patients often experience psychological distress which may influence disease progression and survival. Cognitive-behavioral stress management (CBSM) improves psychological adaptation and lowers distress during breast cancer treatment and long-term follow-ups. We examined whether breast cancer patients randomized to CBSM had improved survival and recurrence 8-15 years post-enrollment. From 1998 to 2005, women (N = 240) 2-10 weeks post-surgery for non-metastatic Stage 0-IIIb breast cancer were randomized to a 10-week, group-based CBSM intervention (n = 120) or a 1-day psychoeducational seminar control (n = 120). In 2013, 8-15 years post-study enrollment (11-year median), recurrence and survival data were collected. Cox Proportional Hazards Models and Weibull Accelerated Failure Time tests were used to assess group differences in all-cause mortality, breast cancer-specific mortality, and disease-free interval, controlling for biomedical confounders. Relative to the control, the CBSM group was found to have a reduced risk of all-cause mortality (HR = 0.21; 95% CI [0.05, 0.93]; p = .040). Restricting analyses to women with invasive disease revealed significant effects of CBSM on breast cancer-related mortality (p = .006) and disease-free interval (p = .011). CBSM intervention delivered post-surgery may provide long-term clinical benefit for non-metastatic breast cancer patients in addition to previously established psychological benefits. Results should be interpreted with caution; however, the findings contribute to the limited evidence regarding physical benefits of psychosocial intervention post-surgery for non-metastatic breast cancer. Additional research is necessary to confirm these results and investigate potential explanatory mechanisms, including physiological pathways, health behaviors, and treatment adherence changes.

  14. Estimating Probabilities in Recommendation Systems

    OpenAIRE

    Sun, Mingxuan; Lebanon, Guy; Kidwell, Paul

    2010-01-01

    Recommendation systems are emerging as an important business application with significant economic impact. Currently popular systems include Amazon's book recommendations, Netflix's movie recommendations, and Pandora's music recommendations. In this paper we address the problem of estimating probabilities associated with recommendation system data using non-parametric kernel smoothing. In our estimation we interpret missing items as randomly censored observations and obtain efficient computat...

  15. Multicenter, phase III trial comparing selenium supplementation with observation in gynecologic radiation oncology: follow-up analysis of the survival data 6 years after cessation of randomization.

    Science.gov (United States)

    Muecke, Ralph; Micke, Oliver; Schomburg, Lutz; Glatzel, Michael; Reichl, Berthold; Kisters, Klaus; Schaefer, Ulrich; Huebner, Jutta; Eich, Hans T; Fakhrian, K; Adamietz, Irenaeus A; Buentzel, Jens

    2014-11-01

    In 2010, we reported that selenium (Se) supplementation during radiation therapy (RT) is effective for increasing blood Se levels in Se-deficient cervical and uterine cancer patients, and reduced the number of episodes and severity of RT-induced diarrhea. In the current study, we examine whether Se supplementation during adjuvant RT affects the long-term survival of these patients. Former patients were identified and questioned with respect to their health and well-being. A total of 81 patients were randomized in the initial supplementation study, 39 of whom received Se (selenium group, SeG) and 42 of whom served as controls (control group, CG). When former patients were reidentified after a median follow-up of 70 months (range = 0-136), the actuarial 10-year disease-free survival rate in the SeG was 80.1% compared to 83.2% in the CG (P = .65), and the actuarial 10-year overall survival rate of patients in the SeG was 55.3% compared to 42.7% in the CG (P = .09). Our extended follow-up analysis demonstrates that Se supplementation had no influence on the effectiveness of the anticancer irradiation therapy and did not negatively affect patients' long-term survival. In view of its positive effects on RT-induced diarrhea, we consider Se supplementation to be a meaningful and beneficial adjuvant treatment in Se-deficient cervical and uterine cancer patients while undergoing pelvic radiation therapy. © The Author(s) 2014.

  16. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  17. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.

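    A classic instance is the matching problem: the probability that a random permutation has no fixed point tends to 1/e. A quick R check (my own example):

      set.seed(2)
      n <- 52
      mean(replicate(1e4, !any(sample(n) == seq_len(n))))  # close to exp(-1)
      exp(-1)                                              # 0.3678794
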
  18. Genetic analysis of the cumulative pseudo-survival rate during lactation of Holstein cattle in Japan by using random regression models.

    Science.gov (United States)

    Sasaki, O; Aihara, M; Nishiura, A; Takeda, H; Satoh, M

    2015-08-01

    Longevity is a crucial economic trait in the dairy farming industry. In this study, our objective was to develop a random regression model for genetic evaluation of survival. For the analysis, we used test-day records obtained for the first 5 lactations of 380,252 cows from 1,296 herds in Japan between 2001 and 2010; this data set was randomly divided into 7 subsets. The cumulative pseudo-survival rate (PSR) was determined according to whether a cow was alive (1) or absent (0) in her herd on the test day within each lactation group. Each lactation number was treated as an independent trait in a random regression multiple-trait model (MTM) or as a repeated measure in a random regression single-trait repeatability model (STRM). A proportional hazard model (PHM) was also developed as a piecewise-hazards model. The average (± standard deviation) heritability estimates of the PSR at 365 d in milk (DIM) among the 7 data sets in the first (LG1), second (LG2), and third to fifth lactations (LG3) of the MTM were 0.042±0.007, 0.070±0.012, and 0.084±0.007, respectively. The heritability estimate of the STRM was 0.038±0.004. The genetic correlations of PSR between distinct DIM within or between lactation groups were high when the interval between DIM was short. These results indicated that whereas the genetic factors contributing to the PSR between closely associated DIM would be similar even for different lactation numbers, the genetic factors contributing to PSR would differ between distinct lactation periods. The average (± standard deviation) effective heritability estimate based on the relative risk of the PHM among the 7 data sets was 0.068±0.009. The estimated breeding values (EBV) in LG1, LG2, LG3, the STRM, and the PHM were unbiased estimates of the genetic trend. The absolute values of the Spearman's rank correlation coefficients between the EBV of the relative risk of the PHM and the EBV of PSR at 365 DIM for LG1, LG2, LG3, and the STRM were 0.75, 0.87, 0

  19. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation between the Dirac delta-function and the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  20. Agreeing Probability Measures for Comparative Probability Structures

    NARCIS (Netherlands)

    P.P. Wakker (Peter)

    1981-01-01

    It is proved that fine and tight comparative probability structures (where the set of events is assumed to be an algebra, not necessarily a σ-algebra) have agreeing probability measures. Although this has often been claimed in the literature, none of the proofs the author encountered is valid.

  1. Application of random survival forests in understanding the determinants of under-five child mortality in Uganda in the presence of covariates that satisfy the proportional and non-proportional hazards assumption.

    Science.gov (United States)

    Nasejje, Justine B; Mwambi, Henry

    2017-09-07

    Uganda, just like any other Sub-Saharan African country, has a high under-five child mortality rate. To inform policy on intervention strategies, sound statistical methods are required to critically identify factors strongly associated with under-five child mortality rates. The Cox proportional hazards model has been a common choice in analysing data to understand factors strongly associated with high child mortality rates, taking age as the time-to-event variable. However, due to its restrictive proportional hazards (PH) assumption, some covariates of interest which do not satisfy the assumption are often excluded from the analysis to avoid mis-specifying the model; using covariates that clearly violate the assumption would otherwise yield invalid results. Survival trees and random survival forests are increasingly popular in analysing survival data, particularly for large survey data, and could be attractive alternatives to models with the restrictive PH assumption. In this article, we adopt random survival forests, which have never been used in understanding factors affecting under-five child mortality rates in Uganda, using Demographic and Health Survey data. The first part of the analysis is based on the classical Cox PH model, and the second part on random survival forests in the presence of covariates that do not necessarily satisfy the PH assumption. Random survival forests and the Cox proportional hazards model agree that the sex of the household head, the sex of the child, and the number of births in the past year are strongly associated with under-five child mortality in Uganda, given that all three covariates satisfy the PH assumption. Random survival forests further demonstrated that covariates that were originally excluded from the earlier analysis due to violation of the PH assumption were important in explaining under-five child mortality rates. These covariates include the number of children under the
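
    As a concrete illustration of the modelling route described above, the sketch below fits a random survival forest with scikit-survival on synthetic data; the covariates and effect sizes are hypothetical stand-ins, not the study's actual DHS data or code:

        import numpy as np
        from sksurv.ensemble import RandomSurvivalForest
        from sksurv.util import Surv

        rng = np.random.default_rng(0)
        n = 500
        X = rng.normal(size=(n, 3))  # hypothetical covariates; PH is not assumed
        time = rng.exponential(scale=np.exp(0.3 * X[:, 0]))  # synthetic survival times
        event = rng.random(n) < 0.7  # ~70% observed events, the rest censored

        y = Surv.from_arrays(event=event, time=time)
        rsf = RandomSurvivalForest(n_estimators=100, random_state=0).fit(X, y)
        print(rsf.score(X, y))       # Harrell's concordance index

    Unlike a Cox fit, the forest makes no proportional-hazards assumption, which is the property the authors exploit for covariates that fail the PH test.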

  2. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    The influence of "Grundbegriffe" by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory "calling for" an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, the dual maps to random variables) have very different "mathematical nature". Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events, elementary category theory, and covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing "fractions" of classical random events, and we upgrade the notions of probability measure and random variable.

  3. Stationary algorithmic probability

    National Research Council Canada - National Science Library

    Müller, Markus

    2010-01-01

    ...since their actual values depend on the choice of the universal reference computer. In this paper, we analyze a natural approach to eliminate this machine-dependence. Our method is to assign algorithmic probabilities to the different...

  4. Comparison of survival among eligible patients not enrolled versus enrolled in the Collaborative Ocular Melanoma Study (COMS) randomized trial of pre-enucleation radiation of large choroidal melanoma.

    Science.gov (United States)

    Gilson, Marta M; Diener-West, Marie; Hawkins, Barbara S

    2007-01-01

    To compare survival between patients enrolled in the Collaborative Ocular Melanoma Study (COMS) randomized trial of pre-enucleation radiation therapy (PERT) for large choroidal melanoma and eligible patients who did not enroll. COMS clinical center personnel prospectively reported to the COMS Coordinating Center all patients with choroidal melanoma examined between November 1986 and December 1994. Deaths of enrolled patients were reported prospectively by clinical center personnel. In a COMS ancillary study, we retrospectively searched medical records of participating clinical centers, the Social Security Death Index, and the National Death Index to determine vital status of eligible patients not enrolled. Cox proportional hazards analysis was used to compare survival within 10 years of baseline reporting and before July 31, 2000, of enrolled patients versus eligible patients not enrolled. Clinical centers that received local institutional review board approval to participate in this ancillary study prospectively reported on 129 of 299 eligible patients not enrolled in the COMS PERT trial. The baseline characteristics of the 129 patients included in this ancillary study were similar to those of the 170 patients not included; 73 patients were reported as deceased. Previously identified prognostic covariates, i.e., age and longest tumor diameter, were confirmed to predict survival in both enrolled patients and eligible patients not enrolled; trial enrollment was not predictive. After adjusting for prognostic covariates and stratifying by clinical center, the estimated hazard ratio (enrolled vs. not-enrolled) was 1.12 (95% confidence interval: 0.83 to 1.51). The results of the COMS PERT trial should be generalizable to all patients with choroidal melanoma meeting the eligibility criteria for that trial. While the methods we used may not be generalizable to all clinical trials because of unique features of the COMS, other researchers may be able to use similar methods

  5. Survival rate of one-piece dental implants placed with a flapless or flap protocol--a randomized, controlled study: 12-month results.

    Science.gov (United States)

    Froum, Stuart J; Cho, Sang Choon; Elian, Nicholas; Romanos, George; Jalbout, Ziad; Natour, Mazen; Norman, Robert; Neri, Dinah; Tarnow, Dennis P

    2011-01-01

    The purpose of this randomized controlled clinical study was to compare the survival of a one-piece anodically oxidized surface implant when placed with a flapless or flap protocol. Bone loss measurements on radiographs and changes in clinical probing depths 1 year post-definitive restoration placement were recorded and compared. Fifty-two of 60 patients (implants) remained in the study at the 1-year follow-up. At the time of final evaluation, no implant was lost in either group. At the time of placement of the definitive restoration, there was a mean mesial and distal bone gain in both groups compared to bone levels present at the time of implant insertion. There were no significant changes in bone levels between placement of the definitive restoration and those recorded 12 months later, and no significant differences in bone levels between the flap or flapless group at 6 or 12 months were noted. No significant differences were seen either in pocket depth or change in pocket depth at 6 and 12 months in the flapless and flap groups. It was therefore concluded that one-piece anodically oxidized surface implants, 1 year post-definitive restoration insertion, had high survival rates (100%) and stable marginal bone and probing depth levels whether a flapless or flap protocol was used for implant insertion.

  6. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
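
    The effect is easy to reproduce numerically. The sketch below (our illustration, with an assumed sample size and nominal level, not the paper's worked example) sets the threshold at the estimated 99th percentile of a log-normally distributed risk factor and shows that a fresh observation exceeds it more often than the nominal 1%:

        import numpy as np

        rng = np.random.default_rng(1)
        alpha, n, reps = 0.01, 30, 200_000
        z = 2.3263  # standard normal 99th percentile

        fails = 0
        for _ in range(reps):
            sample = rng.normal(size=n)  # log of the risk factor
            threshold = sample.mean() + z * sample.std(ddof=1)
            fails += rng.normal() > threshold  # one future observation
        print(fails / reps)  # roughly 0.015 here, above the nominal 0.01

    The exceedance frequency depends on n but not on the true (unknown) parameters, which is the location-scale invariance the paper exploits.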

  7. Factual and cognitive probability

    OpenAIRE

    Chuaqui, Rolando

    2012-01-01

    This modification separates the two aspects of probability: probability as a part of physical theories (factual), and as a basis for statistical inference (cognitive). Factual probability is represented by probability structures as in the earlier papers, but now built independently of the language. Cognitive probability is interpreted as a form of "partial truth". The paper also contains a discussion of the Principle of Insufficient Reason and of Bayesian and classical statistical methods, in...

  8. Evaluating probability forecasts

    OpenAIRE

    Lai, Tze Leung; Gross, Shulamith T.; Shen, David Bo

    2011-01-01

    Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability for...
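
    For readers unfamiliar with scoring rules, the Brier score is the simplest example; the snippet below (with made-up forecasts, and not the authors' proposed alternative) computes it:

        import numpy as np

        p = np.array([0.9, 0.7, 0.2, 0.5])  # forecast probabilities (hypothetical)
        y = np.array([1, 1, 0, 1])          # observed occurrences
        print(np.mean((p - y) ** 2))        # 0 is perfect; 0.25 is always saying 0.5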

  9. Probability, random processes, and ergodic properties

    CERN Document Server

    Gray, Robert M

    2014-01-01

    In this new edition of this classic text, much of the material has been rearranged and revised for pedagogical reasons. Many classic inequalities and proofs are now incorporated into the text, and many citations have been added.

  10. MRE11-deficiency associated with improved long-term disease free survival and overall survival in a subset of stage III colon cancer patients in randomized CALGB 89803 trial.

    Directory of Open Access Journals (Sweden)

    Thomas Pavelitz

    Colon cancers deficient in mismatch repair (MMR) may exhibit diminished expression of the DNA repair gene MRE11 as a consequence of contraction of a T11 mononucleotide tract. This study investigated MRE11 status and its association with prognosis, survival and drug response in patients with stage III colon cancer. Cancer and Leukemia Group B 89803 (Alliance) randomly assigned 1,264 patients with stage III colon cancer to postoperative weekly adjuvant bolus 5-fluorouracil/leucovorin (FU/LV) or irinotecan+FU/LV (IFL), with 8-year follow-up. Tumors from these patients were analyzed to determine the stability of a T11 tract in the MRE11 gene. The primary endpoint was overall survival (OS), and a secondary endpoint was disease-free survival (DFS). Non-proportional hazards were addressed using time-dependent covariates in Cox analyses. Of 625 tumor cases examined, 70 (11.2%) exhibited contraction at the T11 tract in one or both MRE11 alleles and were thus predicted to be deficient in MRE11 (dMRE11). In pooled treatment analyses, dMRE11 patients showed initially reduced DFS and OS but improved long-term DFS and OS compared with patients with an intact MRE11 T11 tract. In the subgroup of dMRE11 patients treated with IFL, an unexplained early increase in mortality but better long-term DFS than in IFL-treated pMRE11 patients was observed. Analysis of this relatively small number of patients and events showed that the dMRE11 marker predicts better prognosis independent of treatment in the long term. In subgroup analyses, dMRE11 patients treated with irinotecan exhibited unexplained short-term mortality. MRE11 status is readily assayed and may therefore prove to be a useful prognostic marker, provided that the results reported here for a relatively small number of patients can be generalized in independent analyses of larger numbers of samples. ClinicalTrials.gov NCT00003835.

  11. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  12. Wear, bone density, functional outcome and survival in vitamin E-incorporated polyethylene cups in reversed hybrid total hip arthroplasty: design of a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    van der Veen Hugo C

    2012-09-01

    Background Aseptic loosening of total hip arthroplasties is generally caused by periprosthetic bone resorption due to tissue reactions to polyethylene wear particles. In vitro testing of polyethylene cups incorporated with vitamin E shows increased wear resistance. The objective of this study is to compare vitamin E-stabilized highly cross-linked polyethylene with conventional cross-linked polyethylene in "reversed hybrid" total hip arthroplasties (cemented all-polyethylene cups combined with uncemented femoral stems). We hypothesize that the adjunction of vitamin E leads to a decrease in polyethylene wear in the long term. We also expect changes in bone mineral density, less osteolysis, equal functional scores and increased implant survival in polyethylene cemented cups incorporated with vitamin E in the long term. Design A double-blinded randomized controlled trial will be conducted. Patients to be included are aged under 70, suffer from non-inflammatory degenerative joint disease of the hip and are scheduled for a primary total hip arthroplasty. The study group will receive a reversed hybrid total hip arthroplasty with a vitamin E-stabilized highly cross-linked polyethylene cemented cup. The control group will receive a reversed hybrid total hip arthroplasty with a conventional cross-linked polyethylene cemented cup. Radiological follow-up will be assessed at 6 weeks and at 1, 3, 5, 7 and 10 years postoperatively, to determine polyethylene wear and osteolysis. Patient-reported functional status (HOOS), physician-reported functional status (Harris Hip Score) and patients' physical activity behavior (SQUASH) will also be assessed at these intervals. Acetabular bone mineral density will be assessed by dual energy X-ray absorptiometry (DEXA) at 6 weeks and at 1 year and 2 years postoperatively. Implant survival will be determined at 10 years postoperatively. Discussion In vitro results of vitamin E-stabilized polyethylene are promising

  13. Surviving and thriving with cancer using a Web-based health behavior change intervention: randomized controlled trial.

    Science.gov (United States)

    Bantum, Erin O'Carrol; Albright, Cheryl L; White, Kami K; Berenberg, Jeffrey L; Layi, Gabriela; Ritter, Phillip L; Laurent, Diana; Plant, Katy; Lorig, Kate

    2014-02-24

    Given the substantial improvements in cancer screening and cancer treatment in the United States, millions of adult cancer survivors live for years following their initial cancer diagnosis and treatment. However, latent side effects can occur and some symptoms can be alleviated or managed effectively via changes in lifestyle behaviors. The purpose of this study was to test the effectiveness of a six-week Web-based multiple health behavior change program for adult survivors. Participants (n=352) were recruited from oncology clinics, a tumor registry, as well as through online mechanisms, such as Facebook and the Association of Cancer Online Resources (ACOR). Cancer survivors were eligible if they had completed their primary cancer treatment from 4 weeks to 5 years before enrollment. Participants were randomly assigned to the Web-based program or a delayed-treatment control condition. In total, 303 survivors completed the follow-up survey (six months after completion of the baseline survey) and participants in the Web-based intervention condition had significantly greater reductions in insomnia and greater increases in minutes per week of vigorous exercise and stretching compared to controls. There were no significant changes in fruit and vegetable consumption or other outcomes. The Web-based intervention impacted insomnia and exercise; however, a majority of the sample met or exceeded national recommendations for health behaviors and were not suffering from depression or fatigue at baseline. Thus, the survivors were very healthy and well-adjusted upon entry and their ability to make substantial health behavior changes may have been limited. Future work is discussed, with emphasis placed on ways in which Web-based interventions can be more specifically analyzed for benefit, such as in regard to social networking. Clinicaltrials.gov NCT00962494; http://www.clinicaltrials.gov/ct2/show/NCT00962494 (Archived by WebCite at http://www.webcitation.org/6NIv8Dc6Q).

  14. Duration of adjuvant trastuzumab in HER2 positive breast cancer: Overall and disease free survival results from meta-analyses of randomized controlled trials.

    Science.gov (United States)

    Gyawali, Bishal; Niraula, Saroj

    2017-11-01

    One year of trastuzumab, chosen empirically, improves survival of women with early-stage, HER2-positive breast cancer but also adds substantially to cost, toxicity, and inconvenience. Longer treatment does not improve outcomes, but potentiates toxicities. Medline, Embase, and major conference proceedings were searched systematically in June 2017 to identify Randomized Controlled Trials (RCTs) comparing one year versus shorter durations of trastuzumab in adjuvant treatment of breast cancer. Reported Hazard-Ratios (HR) for Overall Survival (OS) and Disease-Free Survival (DFS), and Odds-Ratios for cardiac events, with respective 95% Confidence Intervals (CI) from each study were weighted using generic inverse-variance, and pooled in a meta-analysis. Inter-study heterogeneity and sub-group differences (based on hormone-receptors and node-positivity) were assessed using I² and χ² statistics, respectively. Four studies (n=7614) satisfied inclusion criteria. Individual RCTs had diverse pre-specified upper-limits of 95% CI for declaring non-inferiority (range: <1.15 to <1.53). Pooled results demonstrated significant improvements in OS (HR 1.28, p=0.04) and DFS (HR 1.24, p=0.005) with 1 year of trastuzumab compared to shorter durations. Absence of a multiplicity argument allowed for declaring superiority of 1 year of trastuzumab based on our results despite the non-inferiority designs of individual trials. No influence on the overall effect by duration of trastuzumab in the experimental arm (9 weeks versus 6 months) was noted. No statistical interaction by hormone-receptor status and node-positivity on overall results was noticed [p(sub-group difference) 0.73 and 0.52, respectively]. The Odds-Ratio for cardiac events was 2.65 (p<0.001) favoring shorter duration. One year of trastuzumab prolongs overall and disease-free survival in women with early-stage HER2-positive breast cancer compared to shorter durations and this should remain the standard of care. Cardiotoxicity increased
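
    The pooling step described here is standard fixed-effect generic inverse-variance meta-analysis; the sketch below shows the mechanics with hypothetical hazard ratios and confidence limits (not the four trials' actual numbers):

        import numpy as np

        hr = np.array([1.30, 1.15, 1.40, 1.20])     # per-study HRs (hypothetical)
        ci_hi = np.array([1.70, 1.60, 1.95, 1.55])  # upper 95% CI limits (hypothetical)

        log_hr = np.log(hr)
        se = (np.log(ci_hi) - log_hr) / 1.96        # recover each SE from the CI
        w = 1.0 / se**2                             # inverse-variance weights
        pooled = np.sum(w * log_hr) / np.sum(w)
        pooled_se = np.sqrt(1.0 / np.sum(w))
        lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
        print(np.exp(pooled), np.exp(lo), np.exp(hi))  # pooled HR with 95% CI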

  15. Probability on real Lie algebras

    CERN Document Server

    Franz, Uwe

    2016-01-01

    This monograph is a progressive introduction to non-commutativity in probability theory, summarizing and synthesizing recent results about classical and quantum stochastic processes on Lie algebras. In the early chapters, focus is placed on concrete examples of the links between algebraic relations and the moments of probability distributions. The subsequent chapters are more advanced and deal with Wigner densities for non-commutative couples of random variables, non-commutative stochastic processes with independent increments (quantum Lévy processes), and the quantum Malliavin calculus. This book will appeal to advanced undergraduate and graduate students interested in the relations between algebra, probability, and quantum theory. It also addresses a more advanced audience by covering other topics related to non-commutativity in stochastic calculus, Lévy processes, and the Malliavin calculus.

  16. Probability Analysis of a Quantum Computer

    OpenAIRE

    Einarsson, Göran

    2003-01-01

    The quantum computer algorithm by Peter Shor for the factorization of integers is studied. The quantum nature of a QC makes its outcome random. The output probability distribution is investigated and the chances of a successful operation are determined.

  17. [Analysis of survival and mortality curves with the model of vital receptors. The maximal life span. Effect of temperature on the life span. The mortality probability density function (mortality curve) and its parameters].

    Science.gov (United States)

    Poltorakov, A P

    2001-01-01

    We have continued the analysis of survival curves with the model of vital receptors (MVR). The main types of survival function (E-, TW- and GM-distributions) have been considered. It was found that the maximal life span depends on the threshold concentration of vital receptors. Equations are obtained for the dependence of the maximal life span on the kinetic parameters of the reactions of inactivation and destruction. The dependence of the maximal life span on the initial size of the population has been considered. The influence of temperature on the survival curves is analysed using the E-distribution. Equations are found for the description of thermosurvival and thermoinactivation curves. Equations are obtained for the dependence of the mortality probability density function and its characteristics (modal and antimodal age, coefficient of asymmetry) on the MVR parameters. It was shown that the E-, TW- and GM-distributions have different types of asymmetry. The coefficient of asymmetry of the GM-distribution depends on the MVR parameters. It is assumed that the symmetry of the curves of mortality and birth-rate is coordinated by the mechanisms of MVR.

  18. Efficient probability sequence

    OpenAIRE

    Regnier, Eva

    2014-01-01

    A probability sequence is an ordered set of probability forecasts for the same event. Although single-period probabilistic forecasts and methods for evaluating them have been extensively analyzed, we are not aware of any prior work on evaluating probability sequences. This paper proposes an efficiency condition for probability sequences and shows properties of efficient forecasting systems, including memorylessness and increasing discrimination. These results suggest tests for efficiency and ...

  19. Efficient probability sequences

    OpenAIRE

    Regnier, Eva

    2014-01-01

    DRMI working paper A probability sequence is an ordered set of probability forecasts for the same event. Although single-period probabilistic forecasts and methods for evaluating them have been extensively analyzed, we are not aware of any prior work on evaluating probability sequences. This paper proposes an efficiency condition for probability sequences and shows properties of efficient forecasting systems, including memorylessness and increasing discrimination. These res...

  20. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  1. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    Subjective probabilities play a central role in many economic decisions, and act as an immediate confound of inferences about behavior, unless controlled for. Several procedures to recover subjective probabilities have been proposed, but in order to recover the correct latent probability one must...

  2. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    Subjective probabilities play a central role in many economic decisions and act as an immediate confound of inferences about behavior, unless controlled for. Several procedures to recover subjective probabilities have been proposed, but in order to recover the correct latent probability one must ...

  3. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned

  4. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  5. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular the probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  6. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  7. Oxygen boundary crossing probabilities.

    Science.gov (United States)

    Busch, N A; Silver, I A

    1987-01-01

    The probability that an oxygen particle will reach a time dependent boundary is required in oxygen transport studies involving solution methods based on probability considerations. A Volterra integral equation is presented, the solution of which gives directly the boundary crossing probability density function. The boundary crossing probability is the probability that the oxygen particle will reach a boundary within a specified time interval. When the motion of the oxygen particle may be described as strongly Markovian, then the Volterra integral equation can be rewritten as a generalized Abel equation, the solution of which has been widely studied.

  8. Philosophy and probability

    CERN Document Server

    Childers, Timothy

    2013-01-01

    Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability (frequentist, propensity, classical, Bayesian, and objective Bayesian) and uses stimulating examples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,

  9. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
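
    The idea of augmenting the plot with intervals can be approximated by Monte Carlo envelopes for the order statistics of a standard normal sample; the sketch below uses pointwise bands (the paper's simultaneous construction would widen them slightly) and assumes the data have been standardized:

        import numpy as np

        rng = np.random.default_rng(2)
        n, reps, alpha = 20, 10_000, 0.05
        sims = np.sort(rng.normal(size=(reps, n)), axis=1)
        lo = np.quantile(sims, alpha / 2, axis=0)      # lower band per order statistic
        hi = np.quantile(sims, 1 - alpha / 2, axis=0)  # upper band per order statistic

        x = np.sort(rng.normal(size=n))  # a (standardized) sample to assess
        print(np.all((lo <= x) & (x <= hi)))  # True is consistent with normality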

  10. In All Probability, Probability is not All

    Science.gov (United States)

    Helman, Danny

    2004-01-01

    The national lottery is often portrayed as a game of pure chance with no room for strategy. This misperception seems to stem from the application of probability instead of expectancy considerations, and can be utilized to introduce the statistical concept of expectation.
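
    A toy expectation calculation of the kind the article has in mind (the odds and prize below are invented, not any real lottery's):

        p_win = 1 / 14_000_000   # hypothetical jackpot odds
        jackpot = 5_000_000      # hypothetical prize
        ticket = 1.0
        print(p_win * jackpot - ticket)  # about -0.64: each ticket loses ~64 cents

    The probability of winning is the same whatever numbers are chosen; expected value is where considerations of strategy can enter.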

  11. The impact of prenatal vitamin A and zinc supplementation on birth size and neonatal survival - a double-blind, randomized controlled trial in a rural area of Indonesia.

    Science.gov (United States)

    Prawirohartono, Endy P; Nyström, Lennarth; Nurdiati, Detty S; Hakimi, Mohammad; Lind, Torbjörn

    2013-01-01

    Prenatal supplementation with micronutrients may increase birth weight and thus improve infant health and survival in settings where infants and children are at risk of micronutrient deficiencies. To assess whether vitamin A and/or zinc supplementation given during pregnancy can improve birth weight, birth length, neonatal morbidity, or infant mortality. A double-blind, randomized controlled trial supplementing women (n = 2173) in Central Java, Indonesia throughout pregnancy with vitamin A, zinc, combined vitamin A+zinc, or placebo. Out of 2173 supplemented pregnant women, 1956 neonates could be evaluated. Overall, zinc supplementation improved birth length compared to placebo or combined vitamin A+zinc (48.8 vs. 48.5 cm, p = 0.04); vitamin A supplementation improved birth length compared to placebo or combined vitamin A+zinc (48.7 vs. 48.2 cm, p = 0.04). These effects remained after adjusting for maternal height, pre-pregnancy weight, and parity. There was no effect of supplementation on birth weight, the proportion of low birth weight, neonatal morbidity, or mortality. Prenatal zinc or vitamin A supplementation demonstrates a small but significant effect on birth length, but supplementation with zinc, vitamin A or a combination of zinc and vitamin A, have no effect on birth weight, neonatal morbidity, or mortality.

  12. Survival-time statistics for sample space reducing stochastic processes.

    Science.gov (United States)

    Yadav, Avinash Chand

    2016-04-01

    Stochastic processes wherein the size of the state space is changing as a function of time offer models for the emergence of scale-invariant features observed in complex systems. I consider such a sample-space reducing (SSR) stochastic process that results in a random sequence of strictly decreasing integers {x(t)}, 0 ≤ t ≤ τ, with boundary conditions x(0) = N and x(τ) = 1. This model is shown to be exactly solvable: P_N(τ), the probability that the process survives for time τ, is analytically evaluated. In the limit of large N, the asymptotic form of this probability distribution is Gaussian, with mean and variance both varying logarithmically with system size: ⟨τ⟩ ∼ ln N and σ_τ² ∼ ln N. Correspondence can be made between survival-time statistics in the SSR process and record statistics of independent and identically distributed random variables.
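
    The SSR process is simple enough to simulate directly; the sketch below (our illustration of the model just described) checks that both the mean and the variance of the survival time track ln N:

        import math
        import random

        def ssr_time(N):
            """Steps until the strictly decreasing uniform jump chain hits 1."""
            x, t = N, 0
            while x > 1:
                x = random.randint(1, x - 1)  # uniform on {1, ..., x-1}
                t += 1
            return t

        N, reps = 10_000, 20_000
        times = [ssr_time(N) for _ in range(reps)]
        mean = sum(times) / reps
        var = sum((t - mean) ** 2 for t in times) / reps
        print(mean, var, math.log(N))  # mean and variance both near ln N ≈ 9.2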

  13. Statistics and Probability

    Directory of Open Access Journals (Sweden)

    Laktineh Imad

    2010-04-01

    This course constitutes a brief introduction to probability applications in high energy physics. First the mathematical tools related to the different probability concepts are introduced. The probability distributions which are commonly used in high energy physics and their characteristics are then shown and commented. The central limit theorem and its consequences are analysed. Finally some numerical methods used to produce different kinds of probability distribution are presented. The full article (17 p.) corresponding to this lecture is written in French and is provided in the proceedings of the book SOS 2008.

  14. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introduction

  15. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  16. A Novel Approach to Probability

    CERN Document Server

    Kafri, Oded

    2016-01-01

    When P indistinguishable balls are randomly distributed among L distinguishable boxes, and considering the dense system in which P is much greater than L, our natural intuition tells us that the box with the average number of balls has the highest probability and that none of the boxes are empty; however, in reality, the probability of the empty box is always the highest. This fact is in contradistinction to the sparse system, in which the number of balls is smaller than the number of boxes (e.g. energy distribution in a gas), where the average value has the highest probability. Here we show that when we postulate the requirement that all possible configurations of balls in the boxes have equal probabilities, a realistic "long tail" distribution is obtained. This formalism, when applied to sparse systems, converges to distributions in which the average is preferred. We calculate some of the distributions resulting from this postulate and obtain most of the known distributions in nature, namely, Zipf law, Benford law, part...
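
    The claim for the dense system can be checked by sampling configurations uniformly via the stars-and-bars bijection (our sketch, assuming the postulate of equiprobable configurations stated above):

        import random
        from collections import Counter

        def occupancy_of_box_one(P, L):
            # A configuration corresponds to L-1 divider positions among
            # P + L - 1 slots; box 1's occupancy is the gap before the first divider.
            dividers = sorted(random.sample(range(P + L - 1), L - 1))
            return dividers[0]

        counts = Counter(occupancy_of_box_one(100, 10) for _ in range(100_000))
        print(counts.most_common(3))  # occupancy 0 is the single most likely value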

  17. Difficulties related to Probabilities

    OpenAIRE

    Rosinger, Elemer Elad

    2010-01-01

    Probability theory is often used as if it had the same ontological status as, for instance, Euclidean geometry or Peano arithmetic. In this regard, several highly questionable aspects of probability theory are mentioned, which have earlier been presented in two arXiv papers.

  18. Dynamic update with probabilities

    NARCIS (Netherlands)

    Van Benthem, Johan; Gerbrandy, Jelle; Kooi, Barteld

    2009-01-01

    Current dynamic-epistemic logics model different types of information change in multi-agent scenarios. We generalize these logics to a probabilistic setting, obtaining a calculus for multi-agent update with three natural slots: prior probability on states, occurrence probabilities in the relevant

  19. Elements of quantum probability

    NARCIS (Netherlands)

    Kummerer, B.; Maassen, H.

    1996-01-01

    This is an introductory article presenting some basic ideas of quantum probability. From a discussion of simple experiments with polarized light and a card game we deduce the necessity of extending the body of classical probability theory. For a class of systems, containing classical systems with

  20. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  1. Probability on compact Lie groups

    CERN Document Server

    Applebaum, David

    2014-01-01

    Probability theory on compact Lie groups deals with the interaction between “chance” and “symmetry,” a beautiful area of mathematics of great interest in its own right but which is now also finding increasing applications in statistics and engineering (particularly with respect to signal processing). The author gives a comprehensive introduction to some of the principal areas of study, with an emphasis on applicability. The most important topics presented are: the study of measures via the non-commutative Fourier transform, existence and regularity of densities, properties of random walks and convolution semigroups of measures, and the statistical problem of deconvolution. The emphasis on compact (rather than general) Lie groups helps readers to get acquainted with what is widely seen as a difficult field but which is also justified by the wealth of interesting results at this level and the importance of these groups for applications. The book is primarily aimed at researchers working in probability, s...

  2. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  3. An Improved Upper Bound for the Critical Probability of the Frog Model on Homogeneous Trees

    Science.gov (United States)

    Lebensztayn, Élcio; Machado, Fábio P.; Popov, Serguei

    2005-04-01

    We study the frog model on homogeneous trees, a discrete time system of simple symmetric random walks whose description is as follows. There are active and inactive particles living on the vertices. Each active particle performs a simple symmetric random walk having a geometrically distributed random lifetime with parameter (1 - p). When an active particle hits an inactive particle, the latter becomes active. We obtain an improved upper bound for the critical parameter for having indefinite survival of active particles, in the case of one-particle-per-vertex initial configuration. The main tool is to construct a class of branching processes which are dominated by the frog model and analyze their supercritical behavior. This approach allows us also to present an upper bound for the critical probability in the case of random initial configuration.

  4. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  5. Cost-effectiveness of invitation to food supplementation early in pregnancy combined with multiple micronutrients on infant survival: analysis of data from MINIMat randomized trial, Bangladesh.

    Science.gov (United States)

    Shaheen, Rubina; Persson, Lars Åke; Ahmed, Shakil; Streatfield, Peter Kim; Lindholm, Lars

    2015-05-28

    Absence of cost-effectiveness (CE) analyses limits the relevance of large-scale nutrition interventions in low-income countries. We analyzed whether the effect of invitation to food supplementation early in pregnancy combined with multiple micronutrient supplements (MMS) on infant survival represented value for money compared to invitation to food supplementation at the usual time in pregnancy combined with iron-folic acid. Outcome data, infant mortality (IM) rates, came from the MINIMat trial (Maternal and Infant Nutrition Interventions, Matlab, ISRCTN16581394). In MINIMat, women were randomized to early (E, around 9 weeks of pregnancy) or usual invitation (U, around 20 weeks) to food supplementation and daily doses of 30 mg or 60 mg iron with 400 μg of folic acid, or MMS with 15 micronutrients including 30 mg iron and 400 μg of folic acid. In MINIMat, EMMS significantly reduced IM compared to UFe60F (U plus 60 mg iron and 400 μg folic acid). We present incremental CE ratios for incrementing UFe60F to EMMS. Costing data came mainly from a published study. By incrementing UFe60F to EMMS, one extra infant death could be averted at a cost of US$907 and US$797 for NGO-run and government-run CNCs, respectively, and at US$1024 for a hypothetical scenario of highest cost. These comparisons generated one extra life year (LY) saved at US$30, US$27, and US$34, respectively. Incrementing UFe60F to EMMS in pregnancy seems worthwhile from health economic and public health standpoints. Maternal and Infant Nutrition Interventions, Matlab; ISRCTN16581394; Date of registration: Feb 16, 2009.
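
    The incremental cost-effectiveness ratio behind such figures is a one-line calculation; the sketch below uses invented cost and mortality inputs (the study's underlying cost data are not reproduced here):

        # Hypothetical inputs: program costs per 1,000 pregnancies, and infant
        # deaths per 1,000 live births under each regimen.
        cost_emms, cost_ufe60f = 52_000.0, 43_000.0
        im_ufe60f, im_emms = 45.0, 35.0
        icer = (cost_emms - cost_ufe60f) / (im_ufe60f - im_emms)
        print(icer)  # incremental cost (US$) per extra infant death averted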

  6. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  7. Precise calculation of a bond percolation transition and survival rates of nodes in a complex network.

    Science.gov (United States)

    Kawamoto, Hirokazu; Takayasu, Hideki; Jensen, Henrik Jeldtoft; Takayasu, Misako

    2015-01-01

    Through precise numerical analysis, we reveal a new type of universal loopless percolation transition in randomly removed complex networks. As an example of a real-world network, we apply our analysis to a business relation network consisting of approximately 3,000,000 links among 300,000 firms and observe the transition with critical exponents close to the mean-field values taking into account the finite size effect. We focus on the largest cluster at the critical point, and introduce survival probability as a new measure characterizing the robustness of each node. We also discuss the relation between survival probability and k-shell decomposition.
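
    A toy version of this analysis fits in a few lines of networkx (our sketch on a small random graph standing in for the firm network; all parameters are illustrative):

        import random
        import networkx as nx

        G = nx.gnm_random_graph(1000, 3000, seed=0)
        p_keep, reps = 0.5, 200
        in_giant = {v: 0 for v in G}

        for _ in range(reps):
            H = nx.Graph([e for e in G.edges if random.random() < p_keep])
            H.add_nodes_from(G)  # keep nodes isolated by the bond removal
            giant = max(nx.connected_components(H), key=len)
            for v in giant:
                in_giant[v] += 1

        # Per-node survival probability: fraction of trials spent in the giant cluster.
        survival = {v: c / reps for v, c in in_giant.items()}
        print(min(survival.values()), max(survival.values()))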

  8. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  9. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  10. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  11. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  12. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  13. Elements of quantum probability

    OpenAIRE

    Kummerer, B.; Maassen, Hans

    1996-01-01

    This is an introductory article presenting some basic ideas of quantum probability. From a discussion of simple experiments with polarized light and a card game we deduce the necessity of extending the body of classical probability theory. For a class of systems, containing classical systems with finitely many states, a probabilistic model is developed. It can describe, in particular, the polarization experiments. Some examples of ‘quantum coin tosses’ are discussed, closely related to V.F.R....

  14. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  15. Statistics of adaptive optics speckles: From probability cloud to probability density function

    OpenAIRE

    Yaitskova, Natalia; Gladysz, Szymon

    2016-01-01

    The complex amplitude in the focal plane of an adaptive optics system is modelled as an elliptical complex random variable. The geometrical properties of the probability density function of such a variable relate directly to the statistics of the residual phase. Building solely on the two-dimensional geometry, the expression for the probability density function of speckle intensity is derived.

  16. Genetic correlations between the cumulative pseudo-survival rate, milk yield, and somatic cell score during lactation in Holstein cattle in Japan using a random regression model.

    Science.gov (United States)

    Sasaki, O; Aihara, M; Nishiura, A; Takeda, H

    2017-09-01

    Trends in genetic correlations between longevity, milk yield, and somatic cell score (SCS) during lactation in cows are difficult to trace. In this study, changes in the genetic correlations between milk yield, SCS, and cumulative pseudo-survival rate (PSR) during lactation were examined, and the effect of milk yield and SCS information on the reliability of estimated breeding value (EBV) of PSR were determined. Test day milk yield, SCS, and PSR records were obtained for Holstein cows in Japan from 2004 to 2013. A random subset of the data was used for the analysis (825 herds, 205,383 cows). This data set was randomly divided into 5 subsets (162-168 herds, 83,389-95,854 cows), and genetic parameters were estimated in each subset independently. Data were analyzed using multiple-trait random regression animal models including either the residual effect for the whole lactation period (H0), the residual effects for 5 lactation stages (H5), or both of these residual effects (HD). Milk yield heritability increased until 310 to 351 d in milk (DIM) and SCS heritability increased until 330 to 344 DIM. Heritability estimates for PSR increased with DIM from 0.00 to 0.05. The genetic correlation between milk yield and SCS increased negatively to under -0.60 at 455 DIM. The genetic correlation between milk yield and PSR increased until 342 to 355 DIM (0.53-0.57). The genetic correlation between the SCS and PSR was -0.82 to -0.83 at around 180 DIM, and decreased to -0.65 to -0.71 at 455 DIM. The reliability of EBV of PSR for sires with 30 or more recorded daughters was 0.17 to 0.45 when the effects of correlated traits were ignored. The maximum reliability of EBV was observed at 257 (H0) or 322 (HD) DIM. When the correlations of PSR with milk yield and SCS were considered, the reliabilities of PSR estimates increased to 0.31-0.76. The genetic parameter estimates of H5 were the same as those for HD. The rank correlation coefficients of the EBV of PSR between H0 and H5 or HD were

  17. 7th High Dimensional Probability Meeting

    CERN Document Server

    Mason, David; Reynaud-Bouret, Patricia; Rosinski, Jan

    2016-01-01

    This volume collects selected papers from the 7th High Dimensional Probability meeting held at the Institut d'Études Scientifiques de Cargèse (IESC) in Corsica, France. High Dimensional Probability (HDP) is an area of mathematics that includes the study of probability distributions and limit theorems in infinite-dimensional spaces such as Hilbert spaces and Banach spaces. The most remarkable feature of this area is that it has resulted in the creation of powerful new tools and perspectives, whose range of application has led to interactions with other subfields of mathematics, statistics, and computer science. These include random matrices, nonparametric statistics, empirical processes, statistical learning theory, concentration of measure phenomena, strong and weak approximations, functional estimation, combinatorial optimization, and random graphs. The contributions in this volume show that HDP theory continues to thrive and develop new tools, methods, techniques and perspectives to analyze random phenome...

  18. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
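
    The authors' two-parameter change-point model is not reproduced here, but a minimal sketch can convey the stepwise updating the abstract describes: hold the current estimate fixed and jump to the windowed maximum-likelihood estimate only when a likelihood-ratio test on recent outcomes clearly favours it. The window size and threshold below are arbitrary illustrative choices, not the paper's parameters.

    ```python
    import math
    import random

    def stepwise_estimates(outcomes, window=20, threshold=2.0):
        """Step-like tracking of a hidden Bernoulli parameter: the held
        estimate is revised only when the recent window's MLE beats it
        by a log-likelihood margin (a crude stand-in for the paper's
        change-point encoding model)."""
        def loglik(p, data):
            eps = 1e-9
            return sum(math.log(p + eps) if y else math.log(1.0 - p + eps)
                       for y in data)

        estimate, history, estimates = 0.5, [], []
        for x in outcomes:
            history.append(x)
            recent = history[-window:]
            p_hat = sum(recent) / len(recent)
            if loglik(p_hat, recent) - loglik(estimate, recent) > threshold:
                estimate = p_hat  # a discrete "step", not a smooth update
            estimates.append(estimate)
        return estimates

    random.seed(1)
    outcomes = [random.random() < 0.2 for _ in range(150)] + \
               [random.random() < 0.7 for _ in range(150)]  # hidden step change
    print(stepwise_estimates(outcomes)[140:160])  # jumps soon after the change
    ```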

  19. Designing a monitoring program to estimate estuarine survival of anadromous salmon smolts: simulating the effect of sample design on inference

    Science.gov (United States)

    Romer, Jeremy D.; Gitelman, Alix I.; Clements, Shaun; Schreck, Carl B.

    2015-01-01

    A number of researchers have attempted to estimate salmonid smolt survival during outmigration through an estuary. However, it is currently unclear how the design of such studies influences the accuracy and precision of survival estimates. In this simulation study we consider four patterns of smolt survival probability in the estuary, and test the performance of several different sampling strategies for estimating estuarine survival assuming perfect detection. The four survival probability patterns each incorporate a systematic component (constant, linearly increasing, increasing and then decreasing, and two pulses) and a random component to reflect daily fluctuations in survival probability. Generally, spreading sampling effort (tagging) across the season resulted in more accurate estimates of survival. All sampling designs in this simulation tended to under-estimate the variation in the survival estimates because seasonal and daily variation in survival probability are not incorporated in the estimation procedure. This under-estimation results in poorer performance of estimates from larger samples. Thus, tagging more fish may not result in better estimates of survival if important components of variation are not accounted for. The results of our simulation incorporate survival probabilities and run distribution data from previous studies to help illustrate the tradeoffs among sampling strategies in terms of the number of tags needed and distribution of tagging effort. This information will assist researchers in developing improved monitoring programs and encourage discussion regarding issues that should be addressed prior to implementation of any telemetry-based monitoring plan. We believe implementation of an effective estuary survival monitoring program will strengthen the robustness of life cycle models used in recovery plans by providing missing data on where and how much mortality occurs in the riverine and estuarine portions of smolt migration. These data
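
    A stripped-down version of such a simulation is straightforward to set up. The sketch below assumes a 120-day outmigration season, perfect detection, binomial survival of tagged fish, and four survival patterns loosely echoing those in the abstract; every numeric value is invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    DAYS = 120  # assumed season length

    def survival_curve(pattern):
        """Systematic daily survival component plus daily random noise."""
        t = np.linspace(0.0, 1.0, DAYS)
        base = {
            "constant": np.full(DAYS, 0.6),
            "linear":   0.4 + 0.4 * t,
            "dome":     0.4 + 0.4 * np.sin(np.pi * t),
            "pulses":   0.4 + 0.3 * (np.sin(4 * np.pi * t) > 0.8),
        }[pattern]
        return np.clip(base + rng.normal(0.0, 0.05, DAYS), 0.0, 1.0)

    def estimate(pattern, tag_days, fish_per_day=50):
        """Tag fish on the given days; with perfect detection the estimate
        is simply the fraction of tagged fish that survive."""
        p = survival_curve(pattern)
        survived = rng.binomial(fish_per_day, p[tag_days])
        return survived.sum() / (fish_per_day * len(tag_days))

    spread = np.arange(0, DAYS, 10)        # effort spread across the season
    clustered = np.arange(len(spread))     # same effort, early season only
    for pat in ("constant", "linear", "dome", "pulses"):
        print(pat, round(estimate(pat, spread), 3),
              round(estimate(pat, clustered), 3))
    ```

    In this toy setup, concentrating all tagging effort early in the season tracks the seasonal mean only under the constant pattern, which is the tradeoff the abstract highlights.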

  20. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  1. Absorbing boundary conditions for inertial random processes

    Energy Technology Data Exchange (ETDEWEB)

    Masoliver, J.; Porra, J.M. [Departament de Fisica Fonamental, Universitat de Barcelona, Diagonal, 647, 08028-Barcelona (Spain); Lindenberg, K. [Department of Chemistry and Biochemistry and Institute for Nonlinear Science, University of California, San Diego, La Jolla, California 92093-0340 (United States)

    1996-12-01

    A recent paper by J. Heinrichs [Phys. Rev. E 48, 2397 (1993)] presents analytic expressions for the first-passage times and the survival probability for a particle moving in a field of random correlated forces. We believe that the analysis there is flawed due to an improper use of boundary conditions. We compare that result, in the white noise limit, with the known exact expression of the mean exit time. © 1996 The American Physical Society.

  2. Comparative efficacy, tolerability, and survival outcomes of various radiopharmaceuticals in castration-resistant prostate cancer with bone metastasis: a meta-analysis of randomized controlled trials

    Directory of Open Access Journals (Sweden)

    Tunio M

    2015-09-01

    Full Text Available Mutahir Tunio,1 Mushabbab Al Asiri,1 Abdulrehman Al Hadab,1 Yasser Bayoumi2 1Radiation Oncology, Comprehensive Cancer Center, King Fahad Medical City, Riyadh, Saudi Arabia; 2Radiation Oncology, National Cancer Institute, Cairo University, Cairo, Egypt. Background: A meta-analysis was conducted to assess the impact of radiopharmaceuticals (RPs) in castration-resistant prostate cancer (CRPC) on pain control, symptomatic skeletal events (SSEs), toxicity profile, quality of life (QoL), and overall survival (OS). Materials and methods: The PubMed/MEDLINE, CANCERLIT, EMBASE, Cochrane Library database, and other search engines were searched to identify randomized controlled trials (RCTs) comparing RPs with control (placebo or radiation therapy) in metastatic CRPC. Data were extracted and assessed for the risk of bias (Cochrane's risk of bias tool). Pooled data were expressed as odds ratios (ORs) with 95% confidence intervals (CIs; Mantel–Haenszel fixed-effects model). Results: Eight RCTs with a total patient population of 1,877 patients were identified. The use of RPs was associated with a significant reduction in pain intensity and SSEs (OR: 0.63, 95% CI: 0.51–0.78, I2=27%, P<0.0001), improved QoL (OR: 0.71, 95% CI: 0.55–0.91, I2=65%, three trials, 1,178 patients, P=0.006), and a minimal, non-significant improvement in OS (OR: 0.84, 95% CI: 0.64–1.04, I2=47%, seven trials, 1,845 patients, P=0.11). A subgroup analysis suggested an improved OS with radium-223 (OR: 0.68, 95% CI: 0.51–0.90, one trial, 921 patients) and strontium-89 (OR: 0.21, 95% CI: 0.05–0.91, one trial, 49 patients). Strontium-89 (five trials) was associated with increased rates of grade 3 and 4 thrombocytopenia (OR: 4.26, 95% CI: 2.22–8.18, P=0.01), leucopenia (OR: 7.98, 95% CI: 1.82–34.95, P=0.02), pain flare (OR: 6.82, 95% CI: 3.42–13.55, P=0.04), and emesis (OR: 3.61, 95% CI: 1.76–7.40, P=0.02). Conclusion: The use of RPs was associated with significant reduction in SSEs and improved QoL, while the radium-223
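
    For readers unfamiliar with the pooling step, the sketch below shows how a Mantel–Haenszel fixed-effects odds ratio of the kind reported above is computed from per-trial 2×2 tables. The counts are invented, and the 95% CI uses the standard Robins–Breslow–Greenland variance of the log odds ratio; nothing here is specific to this meta-analysis.

    ```python
    import math

    def mh_pooled_or(tables):
        """Mantel-Haenszel pooled OR with a 95% CI from 2x2 tables
        (a, b, c, d): events/non-events in treatment, then control."""
        R = S = PR = PSQR = QS = 0.0
        for a, b, c, d in tables:
            n = a + b + c + d
            P, Q = (a + d) / n, (b + c) / n
            Ri, Si = a * d / n, b * c / n
            R += Ri; S += Si
            PR += P * Ri; PSQR += P * Si + Q * Ri; QS += Q * Si
        or_mh = R / S
        # Robins-Breslow-Greenland variance of log(OR_MH)
        var_log = PR / (2 * R * R) + PSQR / (2 * R * S) + QS / (2 * S * S)
        half = 1.96 * math.sqrt(var_log)
        return or_mh, or_mh * math.exp(-half), or_mh * math.exp(half)

    # Three hypothetical trials (made-up counts):
    print(mh_pooled_or([(12, 88, 25, 75), (8, 92, 15, 85), (20, 180, 35, 165)]))
    ```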

  3. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  4. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a lookout, etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving

  5. Predictors for contrast media-induced nephropathy and long-term survival: Prospectively assessed data from the randomized controlled Dialysis-Versus-Diuresis (DVD) trial

    Science.gov (United States)

    Hölscher, Birgit; Heitmeyer, Christine; Fobker, Manfred; Breithardt, Günter; Schaefer, Roland M; Reinecke, Holger

    2008-01-01

    BACKGROUND: Among the numerous studies concerning contrast media-induced nephropathy (CIN), there was no prospective trial that provided data on the long-term outcomes. OBJECTIVES: To prospectively assess predictors of CIN and long-term outcomes of affected patients. METHODS: Four hundred twelve consecutive patients with serum creatinine levels of 115 μmol/L to 309 μmol/L (1.3 mg/dL to 3.5 mg/dL) undergoing elective coronary angiography were included. Patients were randomly assigned to periprocedural hydration alone, hydration plus one-time hemodialysis or hydration plus N-acetylcysteine. RESULTS: Multivariate logistic regression identified the following as predictors of CIN within 72 h (equivalent to an increase in creatinine of 44.2 μmol/L [0.5 mg/dL] or more): prophylactic postprocedural hemodialysis (OR 2.86, 95% CI 1.07 to 7.69), use of angiotensin-converting enzyme inhibitors (OR 6.16, 95% CI 2.01 to 18.93), baseline glomerular filtration rate (OR 0.94, 95% CI 0.90 to 0.98) and the amount of contrast media given (OR 1.01, 95% CI 1.00 to 1.01). With regard to long-term outcome (mean follow-up 649 days), multivariate Cox regression models found elevated creatinine levels at 30 days (hazard rate ratio [HRR] 5.48, 95% CI 2.85 to 10.53), but not CIN within 72 h (HRR 1.12, 95% CI 0.63 to 2.02), to be associated with increased mortality. In addition, independent predictors for death during follow-up included left ventricular ejection fraction lower than 35% (HRR 4.01, 95% CI 2.22 to 7.26), serum phosphate (HRR 1.64, 95% CI 1.10 to 2.43) and hemoglobin (HRR 0.80, 95% CI 0.67 to 0.96). CONCLUSION: From the present prospective trial, performance of postprocedural hemodialysis, use of angiotensin-converting enzyme inhibitors, reduced baseline glomerular filtration rate and amount of contrast media were independent predictors of CIN within 72 h after catheterization. Assessing renal function after 30 days, rather than within 72 h, seemed to be more predictive for

  6. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  7. Estimating tail probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Carr, D.B.; Tolley, H.D.

    1982-12-01

    This paper investigates procedures for univariate nonparametric estimation of tail probabilities. Extrapolated values for tail probabilities beyond the data are also obtained based on the shape of the density in the tail. Several estimators which use exponential weighting are described. These are compared in a Monte Carlo study to nonweighted estimators, to the empirical cdf, to an integrated kernel, to a Fourier series estimate, to a penalized likelihood estimate and a maximum likelihood estimate. Selected weighted estimators are shown to compare favorably to many of these standard estimators for the sampling distributions investigated.
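
    The paper's exponentially weighted estimators are not reproduced here, but the following sketch illustrates the two ingredients the abstract mentions: the empirical survival function inside the data range, and a shape-based extrapolation beyond it (here, an exponential fit to the k largest observations; k and all data are illustrative assumptions).

    ```python
    import numpy as np

    def tail_probability(sample, x, k=50):
        """Estimate P(X > x): empirical survival function within the data
        range; beyond the sample maximum, extrapolate with an exponential
        fit to the k upper order statistics."""
        s = np.sort(sample)
        if x <= s[-1]:
            return float(np.mean(sample > x))
        top = s[-k:]
        u = top[0]                       # tail threshold
        rate = 1.0 / np.mean(top - u)    # approximate exponential-tail MLE
        return (k / len(s)) * float(np.exp(-rate * (x - u)))

    rng = np.random.default_rng(0)
    data = rng.exponential(scale=2.0, size=2000)
    print(tail_probability(data, 10.0), "vs true", np.exp(-10.0 / 2.0))
    ```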

  8. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  9. Multinomial mixture model with heterogeneous classification probabilities

    Science.gov (United States)

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of the multinomial and correct-classification probabilities when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.
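
    The bias mechanism is easy to reproduce in a toy version. The sketch below is a simplified binary (rather than multinomial) analogue: correct-classification probabilities vary logit-normally across sampling units, and a naive correction that plugs in one "typical" classification probability misses the true category proportion. All parameter values are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def observed_fraction(pi, mu, sigma, n=200):
        """One sampling unit: the unit's probability of recording the true
        binary category correctly is logit-normal, i.e. it varies among
        units as in the elaborated Royle-Link model."""
        theta = 1.0 / (1.0 + np.exp(-rng.normal(mu, sigma)))
        truth = rng.random(n) < pi
        correct = rng.random(n) < theta
        return np.where(correct, truth, ~truth).mean()

    pi, mu, sigma = 0.3, 1.5, 1.0
    obs = np.mean([observed_fraction(pi, mu, sigma) for _ in range(2000)])

    # Naive correction: assume every unit shares the classification
    # probability at the logit-scale mean, ignoring the heterogeneity.
    theta_fixed = 1.0 / (1.0 + np.exp(-mu))
    naive_pi = (obs + theta_fixed - 1.0) / (2.0 * theta_fixed - 1.0)
    print("true pi:", pi, "naive estimate:", round(float(naive_pi), 3))
    ```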

  10. Integration, measure and probability

    CERN Document Server

    Pitt, H R

    2012-01-01

    Introductory treatment develops the theory of integration in a general context, making it applicable to other branches of analysis. More specialized topics include convergence theorems and random sequences and functions. 1963 edition.

  11. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrödinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  12. Huygens' foundations of probability

    NARCIS (Netherlands)

    Freudenthal, Hans

    It is generally accepted that Huygens based probability on expectation. The term “expectation,” however, stems from Van Schooten's Latin translation of Huygens' treatise. A literal translation of Huygens' Dutch text shows more clearly what Huygens actually meant and how he proceeded.

  13. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  14. Probably Almost Bayes Decisions

    DEFF Research Database (Denmark)

    Anoulova, S.; Fischer, Paul; Poelt, S.

    1996-01-01

    In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian...

  15. Univariate Probability Distributions

    Science.gov (United States)

    Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.

    2012-01-01

    We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…

  16. The Theory of Probability

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 3; Issue 4. The Theory of Probability. Andrei Nikolaevich Kolmogorov. Classics Volume 3 Issue 4 April 1998 pp 103-112. Fulltext. Click here to view fulltext PDF. Permanent link: http://www.ias.ac.in/article/fulltext/reso/003/04/0103-0112. Author Affiliations.

  17. Probability Theory Without Tears!

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 1; Issue 2. Probability Theory Without Tears! S Ramasubramanian. Book Review Volume 1 Issue 2 February 1996 pp 115-116. Fulltext. Click here to view fulltext PDF. Permanent link: http://www.ias.ac.in/article/fulltext/reso/001/02/0115-0116 ...

  18. probably mostly white

    African Journals Online (AJOL)

    Willem Scholtz

    internet – the (probably mostly white) public's interest in the so-called Border War is ostensibly at an all-time high. By far most of the publications are written by ex- ... understanding of this very important episode in the history of Southern Africa. It was, therefore, with some anticipation that one waited for this book, which.

  19. the theory of probability

    Indian Academy of Sciences (India)

    important practical applications in statistical quality control. Of a similar kind are the laws of probability for the scattering of missiles, which are basic in the ..... deviations for different ranges for each type of gun and of shell are found empirically in firing practice on an artillery range. But the subsequent solution of all possible ...

  20. Prospective, Randomized, Double-Blind, Phase III Clinical Trial of Anti-T-Lymphocyte Globulin to Assess Impact on Chronic Graft-Versus-Host Disease-Free Survival in Patients Undergoing HLA-Matched Unrelated Myeloablative Hematopoietic Cell Transplantation.

    Science.gov (United States)

    Soiffer, Robert J; Kim, Haesook T; McGuirk, Joseph; Horwitz, Mitchell E; Johnston, Laura; Patnaik, Mrinal M; Rybka, Witold; Artz, Andrew; Porter, David L; Shea, Thomas C; Boyer, Michael W; Maziarz, Richard T; Shaughnessy, Paul J; Gergis, Usama; Safah, Hana; Reshef, Ran; DiPersio, John F; Stiff, Patrick J; Vusirikala, Madhuri; Szer, Jeff; Holter, Jennifer; Levine, James D; Martin, Paul J; Pidala, Joseph A; Lewis, Ian D; Ho, Vincent T; Alyea, Edwin P; Ritz, Jerome; Glavin, Frank; Westervelt, Peter; Jagasia, Madan H; Chen, Yi-Bin

    2017-10-17

    Purpose Several open-label randomized studies have suggested that in vivo T-cell depletion with anti-T-lymphocyte globulin (ATLG; formerly antithymocyte globulin-Fresenius) reduces chronic graft-versus-host disease (cGVHD) without compromising survival. We report a prospective, double-blind phase III trial to investigate the effect of ATLG (Neovii Biotech, Lexington, MA) on cGVHD-free survival. Patients and Methods Two hundred fifty-four patients 18 to 65 years of age with acute leukemia or myelodysplastic syndrome who underwent myeloablative HLA-matched unrelated hematopoietic cell transplantation (HCT) were randomly assigned 1:1 to placebo (n = 128) or ATLG (n = 126) treatment at 27 sites. Patients received either ATLG or placebo 20 mg/kg per day on days -3, -2, -1 in addition to tacrolimus and methotrexate as GVHD prophylaxis. The primary study end point was moderate-severe cGVHD-free survival. Results Despite a reduction in grade 2 to 4 acute GVHD (23% v 40%; P = .004) and moderate-severe cGVHD (12% v 33%; P < .001) in ATLG recipients, no difference in moderate-severe cGVHD-free survival between ATLG and placebo was found (2-year estimate: 48% v 44%, respectively; P = .47). Both progression-free survival (PFS) and overall survival (OS) were lower with ATLG (2-year estimate: 47% v 65% [P = .04] and 59% v 74% [P = .034], respectively). Multivariable analysis confirmed that ATLG was associated with inferior PFS (hazard ratio, 1.55; 95% CI, 1.05 to 2.28; P = .026) and OS (hazard ratio, 1.74; 95% CI, 1.12 to 2.71; P = .01). Conclusion In this prospective, randomized, double-blind trial of ATLG in unrelated myeloablative HCT, the incorporation of ATLG did not improve moderate-severe cGVHD-free survival. Moderate-severe cGVHD was significantly lower with ATLG, but PFS and OS also were lower. Additional analyses are needed to understand the appropriate role for ATLG in HCT.

  1. Intrauterine human chorionic gonadotropin infusion in oocyte donors promotes endometrial synchrony and induction of early decidual markers for stromal survival: a randomized clinical trial.

    Science.gov (United States)

    Strug, Michael R; Su, Renwei; Young, James E; Dodds, William G; Shavell, Valerie I; Díaz-Gimeno, Patricia; Ruíz-Alonso, Maria; Simón, Carlos; Lessey, Bruce A; Leach, Richard E; Fazleabas, Asgerally T

    2016-07-01

    Does a single intrauterine infusion of human chorionic gonadotropin (hCG) at the time corresponding to a Day 3 embryo transfer in oocyte donors induce favorable molecular changes in the endometrium for embryo implantation? Intrauterine hCG was associated with endometrial synchronization between endometrial glands and stroma following ovarian stimulation and the induction of early decidual markers associated with stromal cell survival. The clinical potential for increasing IVF success rates using an intrauterine hCG infusion prior to embryo transfer remains unclear based on previously reported positive and non-significant findings. However, infusion of CG in the non-human primate increases the expression of pro-survival early decidual markers important for endometrial receptivity, including α-smooth muscle actin (α-SMA) and NOTCH1. Oocyte donors (n=15) were randomly assigned to receive an intrauterine infusion of 500 IU hCG (n=7) or embryo culture media vehicle (n=8) 3 days following oocyte retrieval during their donor stimulation cycle. Endometrial biopsies were performed 2 days later, followed by either RNA isolation or tissue fixation in formalin and paraffin embedding. Reverse transcription of total RNA from endometrial biopsies generated cDNA, which was used for analysis in the endometrial receptivity array (ERA; n = 5/group) or quantitative RT-PCR to determine relative expression of ESR1, PGR, C3 and NOTCH1. Tissue sections were stained with hematoxylin and eosin followed by blinded staging analysis for dating of endometrial glands and stroma. Immunostaining for ESR1, PGR, α-SMA, C3 and NOTCH1 was performed to determine their tissue localization. Intrauterine hCG infusion was associated with endometrial synchrony and reprogramming of stromal development following ovarian stimulation. ESR1 and PGR were significantly elevated in the endometrium of hCG-treated patients, consistent with earlier staging. The ERA did not predict an overall positive impact of

  2. On Probability Domains

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2010-12-01

    Motivated by IF-probability theory (intuitionistic fuzzy), we study n-component probability domains in which each event represents a body of competing components and the range of a state represents a simplex S_n of n-tuples of possible rewards; the sum of the rewards is a number from [0,1]. For n = 1 we get fuzzy events, for example a bold algebra, and the corresponding fuzzy probability theory can be developed within the category ID of D-posets (equivalently, effect algebras) of fuzzy sets and sequentially continuous D-homomorphisms. For n = 2 we get IF-events, i.e., pairs (μ, ν) of fuzzy sets μ, ν ∈ [0,1]^X such that μ(x) + ν(x) ≤ 1 for all x ∈ X, but we order our pairs (events) coordinatewise. Hence the structure of IF-events (where (μ₁, ν₁) ≤ (μ₂, ν₂) whenever μ₁ ≤ μ₂ and ν₂ ≤ ν₁) is different and, consequently, the resulting IF-probability theory models a different principle. The category ID is cogenerated by I = [0,1] (objects of ID are subobjects of powers I^X), has nice properties, and basic probabilistic notions and constructions are categorical; for example, states are morphisms. We introduce the category S_nD cogenerated by S_n = {(x₁, x₂, …, x_n) ∈ I^n : Σᵢ₌₁ⁿ xᵢ ≤ 1}, carrying the coordinatewise partial order, difference, and sequential convergence, and we show how basic probability notions can be defined within S_nD.

  3. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  4. Probability learning and Piagetian probability conceptions in children 5 to 12 years old.

    Science.gov (United States)

    Kreitler, S; Zigler, E; Kreitler, H

    1989-11-01

    This study focused on the relations between performance on a three-choice probability-learning task and conceptions of probability as outlined by Piaget concerning mixture, normal distribution, random selection, odds estimation, and permutations. The probability-learning task and four Piagetian tasks were administered randomly to 100 male and 100 female, middle SES, average IQ children in three age groups (5 to 6, 8 to 9, and 11 to 12 years old) from different schools. Half the children were from Middle Eastern backgrounds, and half were from European or American backgrounds. As predicted, developmental level of probability thinking was related to performance on the probability-learning task. The more advanced the child's probability thinking, the higher his or her level of maximization and hypothesis formulation and testing and the lower his or her level of systematically patterned responses. The results suggest that the probability-learning and Piagetian tasks assess similar cognitive skills and that performance on the probability-learning task reflects a variety of probability concepts.

  5. Impact of 3-Monthly Vitamin D Supplementation Plus Exercise on Survival after Surgery for Osteoporotic Hip Fracture in Adult Patients over 50 Years: A Pragmatic Randomized, Partially Blinded, Controlled Trial.

    Science.gov (United States)

    Laiz, A; Malouf, J; Marin, A; Longobardi, V; de Caso, J; Farrerons, J; Casademont, J

    2017-01-01

    To determine whether 3-monthly supplementation of an oral vitamin D widely used in Spain (calcifediol) plus daily exercise could influence survival at one and four years after surgery for osteoporotic hip fracture. A pragmatic, randomized, partially single-blind placebo-controlled study. Patients admitted to a tertiary university hospital for acute hip fracture. 675 healthy adult patients undergoing surgery for osteoporotic hip fracture were recruited from January 2004 to December 2007. Patients were randomized to receive either 3-monthly oral doses of 3 mg calcifediol (Hidroferol Choque®) or placebo in the 12 months postsurgery. Patients who received calcifediol were also given an exercise programme. The placebo group received standard health recommendations only. The primary endpoint was survival at 1-year and at 4-year follow-up. We also recorded new fractures, medical complications and anti-osteoporotic treatment compliance. We included a total of 88 patients, aged 62 to 99 years. Mean age was 82 years and 88.6% were women. At 12 months, 10 (11.3%) patients had died, 9 of them from the non-intervention group. At 4 years after surgery, 20 (22.7%) had died, 3 (3.4%) from the intervention group and 17 (19.3%) from the non-intervention group. At this time, survival curve analysis showed 93% survival in the intervention group and 62% in the non-intervention group (p=0.001). At 12-month follow-up, there were 18 new fractures, 9 in each group. The non-intervention group had more medical complications, with significant differences at visit 2 (p = 0.04) and 3 (p = 0.02) but not at visit 4 (p = 0.18). No significant differences between groups were found regarding treatment compliance. 3-monthly oral supplements of 3 mg calcifediol plus daily exercise improved survival at one-year and four-year follow-up after surgery for an osteoporotic hip fracture.

  6. Refinement of Probability of Survival Decision Aid (PSDA)

    Science.gov (United States)

    2014-03-01

    Keywords: thermoregulation, search and rescue (SaR), predictive modeling. ...SCTM, then posts or updates the display predictions for cold functional time (i.e., the point in time when core temperature reaches 34 °C), cold...

  7. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  8. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    reveals the computational intuition lying behind the mathematics. In the second part of the thesis we provide an operational reading of continuous valuations on certain domains (the distributive concrete domains of Kahn and Plotkin) through the model of probabilistic event structures. Event structures......Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particular...... there is no categorical distributive law between them. We introduce the powerdomain of indexed valuations which modifies the usual probabilistic powerdomain to take more detailed account of where probabilistic choices are made. We show the existence of a distributive law between the powerdomain of indexed valuations...

  9. Waste Package Misload Probability

    Energy Technology Data Exchange (ETDEWEB)

    J.K. Knudsen

    2001-11-20

    The objective of this calculation is to calculate the probability of occurrence for fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.

  10. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  11. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  12. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  13. Superpositions of probability distributions.

    Science.gov (United States)

    Jizba, Petr; Kleinert, Hagen

    2008-09-01

    Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much easier than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
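
    A textbook instance of such a variance mixture: smearing v with an inverse-gamma distribution turns the Gaussian into a Student-t. The Monte Carlo sketch below checks the resulting heavy tails; the degrees-of-freedom value is an arbitrary illustrative choice, not taken from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, nu = 200_000, 3.0

    # v ~ inverse-gamma(nu/2, nu/2); the superposition of N(0, v) laws
    # is then a Student-t with nu degrees of freedom.
    v = 1.0 / rng.gamma(shape=nu / 2.0, scale=2.0 / nu, size=n)
    x = rng.normal(0.0, np.sqrt(v))

    print("P(|x| > 4):", np.mean(np.abs(x) > 4.0))  # far heavier tail
    print("N(0,1) reference: about 6.3e-05")        # than any one Gaussian
    ```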

  14. Concurrency meets probability: theory and practice (abstract)

    NARCIS (Netherlands)

    Katoen, Joost P.

    Treating random phenomena in concurrency theory has a long tradition. Petri nets [18, 10] and process algebras [14] have been extended with probabilities. The same applies to behavioural semantics such as strong and weak (bi)simulation [1], and testing pre-orders [5]. Beautiful connections between

  15. Sampling, Probability Models and Statistical Reasoning -RE ...

    Indian Academy of Sciences (India)

    eligible voters who support a particular political party. A random sample of size n is selected from this population and suppose k voters support this party. What is a good estimate of the required proportion? How do we obtain a probability model for the experi- ment just conducted? Let us examine the following simple ex-.

  16. Structural Minimax Probability Machine.

    Science.gov (United States)

    Gu, Bin; Sun, Xingming; Sheng, Victor S

    2017-07-01

    Minimax probability machine (MPM) is an interesting discriminative classifier based on generative prior knowledge. It can directly estimate the probabilistic accuracy bound by minimizing the maximum probability of misclassification. The structural information of data is an effective way to represent prior knowledge, and has been found to be vital for designing classifiers in real-world problems. However, MPM only considers the prior probability distribution of each class with a given mean and covariance matrix, which does not efficiently exploit the structural information of data. In this paper, we use two finite mixture models to capture the structural information of the data from binary classification. For each subdistribution in a finite mixture model, only its mean and covariance matrix are assumed to be known. Based on the finite mixture models, we propose a structural MPM (SMPM). SMPM can be solved effectively by a sequence of the second-order cone programming problems. Moreover, we extend a linear model of SMPM to a nonlinear model by exploiting kernelization techniques. We also show that the SMPM can be interpreted as a large margin classifier and can be transformed to support vector machine and maxi-min margin machine under certain special conditions. Experimental results on both synthetic and real-world data sets demonstrate the effectiveness of SMPM.

  17. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  18. Frequentist probability and frequentist statistics

    Energy Technology Data Exchange (ETDEWEB)

    Neyman, J.

    1977-01-01

    A brief, nontechnical outline is given of the author's views on the "frequentist" theory of probability and the "frequentist" theory of statistics; their applications are illustrated in a few domains of study of nature. The phenomenon of apparently stable relative frequencies as the source of the frequentist theories of probability and statistics is taken up first. Three steps are set out: empirical establishment of apparently stable long-run relative frequencies of events judged interesting, as they develop in nature; guessing and then verifying the chance mechanism, the repeated operation of which produced the observed frequencies (this is a problem of frequentist probability theory); using the hypothetical chance mechanism of the phenomenon studied to deduce rules of adjusting our actions to the observations to ensure the highest "measure" of "success". Illustrations of the three steps are given. The theory of testing statistical hypotheses is sketched: basic concepts, simple and composite hypotheses, hypothesis tested, importance of the power of the test used, practical applications of the theory of testing statistical hypotheses. Basic ideas and an example of the randomization of experiments are discussed, and an "embarrassing" example is given. The problem of statistical estimation is sketched: example of an isolated problem, example of connected problems treated routinely, empirical Bayes theory, point estimation. The theory of confidence intervals is outlined: basic concepts, anticipated misunderstandings, construction of confidence intervals: regions of acceptance. Finally, the theory of estimation by confidence intervals or regions is considered briefly. 4 figures. (RWR)

  19. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  20. What is Probability Theory?

    Indian Academy of Sciences (India)

    IAS Admin

    gambling problems in 18th century Europe. ... (random) phenomena, especially those evolving over time. The study of motion of physical objects over time by Newton led to his famous three laws of motion as well as many important developments in the theory of ordinary differential equations. Similarly, the construction ...

  1. Network ties and survival

    DEFF Research Database (Denmark)

    Acheampong, George; Narteh, Bedman; Rand, John

    2017-01-01

    Poultry farming has been touted as one of the major ways by which poverty can be reduced in low-income economies like Ghana. Yet, anecdotally there is a high failure rate among these poultry farms. This current study seeks to understand the relationship between network ties and survival chances...... of small commercial poultry farms (SCPFs). We utilize data from a 2-year network survey of SCPFs in rural Ghana. The survival of these poultry farms is modelled using a lagged probit model of farms that persisted from 2014 into 2015. We find that network ties are important to the survival chances...... but this probability falls as the number of industry ties increases, although moderation by the firm's dynamic capability reverses this trend. Our findings show that not all network ties aid survival and therefore small commercial poultry farmers need to be circumspect in the network ties they cultivate and develop....

  2. Diabetes Mellitus Is Associated With Decreased Limb Survival in Patients With Critical Limb Ischemia : Pooled Data From Two Randomized Controlled Trials

    NARCIS (Netherlands)

    Spreen, Marlon I; Gremmels, Hendrik; Teraa, Martin; Sprengers, Ralf W; Verhaar, Marianne C; Statius van Eps, Randolph G; de Vries, Jean-Paul P M; Mali, Willem P.Th.M.; van Overhagen, Hans

    2016-01-01

    OBJECTIVE: Although never assessed prospectively, diabetes mellitus (DM) is assumed to negatively affect the outcomes of critical limb ischemia (CLI). DM was highly prevalent in two recently conducted randomized controlled trials in CLI patients, the PADI (Percutaneous Transluminal Balloon

  3. Statistics, Probability and Chaos

    OpenAIRE

    Berliner, L. Mark

    1992-01-01

    The study of chaotic behavior has received substantial attention in many disciplines. Although often based on deterministic models, chaos is associated with complex, "random" behavior and forms of unpredictability. Mathematical models and definitions associated with chaos are reviewed. The relationship between the mathematics of chaos and probabilistic notions, including ergodic theory and uncertainty modeling, are emphasized. Popular data analytic methods appearing in the literature are disc...

  4. Probabilities for Solar Siblings

    Science.gov (United States)

    Valtonen, Mauri; Bajkova, A. T.; Bobylev, V. V.; Mylläri, A.

    2015-02-01

    We have shown previously (Bobylev et al. Astron Lett 37:550-562, 2011) that some of the stars in the solar neighborhood today may have originated in the same star cluster as the Sun, and could thus be called Solar Siblings. In this work we investigate the sensitivity of this result to galactic models and to parameters of these models, and also extend the sample of orbits. There are a number of good candidates for the sibling category, but due to the long period of orbit evolution since the break-up of the birth cluster of the Sun, one can only attach probabilities of membership. We find that up to 10% (but more likely around 1%) of the members of the Sun's birth cluster could still be found within 100 pc from the Sun today.

  5. Measure, integral and probability

    CERN Document Server

    Capiński, Marek

    2004-01-01

    Measure, Integral and Probability is a gentle introduction that makes measure and integration theory accessible to the average third-year undergraduate student. The ideas are developed at an easy pace in a form that is suitable for self-study, with an emphasis on clear explanations and concrete examples rather than abstract theory. For this second edition, the text has been thoroughly revised and expanded. New features include: · a substantial new chapter, featuring a constructive proof of the Radon-Nikodym theorem, an analysis of the structure of Lebesgue-Stieltjes measures, the Hahn-Jordan decomposition, and a brief introduction to martingales · key aspects of financial modelling, including the Black-Scholes formula, discussed briefly from a measure-theoretical perspective to help the reader understand the underlying mathematical framework. In addition, further exercises and examples are provided to encourage the reader to become directly involved with the material.

  6. A seismic probability map

    Directory of Open Access Journals (Sweden)

    J. M. MUNUERA

    1964-06-01

    Full Text Available The material included in two former papers (SB and EF), which sums 3307 shocks corresponding to 2360 years, up to 1960, was reduced to a 50-year period by means of the weight obtained for each epoch. The weighting factor is the ratio of 50 to the number of years in each epoch. The frequency has been referred to basis VII of the international seismic intensity scale, for all cases in which the earthquakes are equal to or greater than VI and up to IX. The sum of the products of frequency and the parameters previously exposed is the probable frequency expected for the 50-year period. For each active small square we have made the corresponding computation, and so we have drawn Map No. 1, in percentages. The epicenters with intensity X to XI are plotted in Map No. 2 in order to present complementary information. A table shows the return periods obtained for all data (VII to XI), and after checking them against others computed from the first up to the last shock, a list includes the probable approximate return periods estimated for the area. The solution we suggest is an appropriate form to express the seismic contingent phenomenon, and it improves the conventional maps showing the equal-intensity curves corresponding to the maximal values of a given side.

  7. Long-term survival results of a randomized trial comparing gemcitabine plus cisplatin, with methotrexate, vinblastine, doxorubicin, plus cisplatin in patients with bladder cancer

    DEFF Research Database (Denmark)

    Maase, Hans von der; Sengeløv, Lisa; Roberts, James T.

    2005-01-01

    PURPOSE: To compare long-term survival in patients with locally advanced or metastatic transitional cell carcinoma (TCC) of the urothelium treated with gemcitabine/cisplatin (GC) or methotrexate/vinblastine/doxorubicin/cisplatin (MVAC). PATIENTS AND METHODS: Efficacy data...... in patients with locally advanced or metastatic TCC...

  8. Long-term survival results of a randomized trial comparing gemcitabine/cisplatin and methotrexate/vinblastine/doxorubicin/cisplatin in patients with locally advanced and metastatic bladder cancer

    DEFF Research Database (Denmark)

    Roberts, J T; von der Maase, H; Sengeløv, L

    2006-01-01

    PURPOSE: To compare long-term survival in patients with locally advanced and metastatic transitional cell carcinoma (TCC) of the urothelium treated with gemcitabine plus cisplatin (GC) or methotrexate/vinblastine/doxorubicin/cisplatin (MVAC). PATIENTS AND METHODS: Efficacy data from a large....... These results strengthen the role of GC as a standard of care in patients with locally advanced and metastatic transitional-cell carcinoma (TCC)....

  9. Impact of controlling the sum of error probability in the sequential probability ratio test

    Directory of Open Access Journals (Sweden)

    Bijoy Kumarr Pradhan

    2013-05-01

    Full Text Available A generalized modified method is proposed to control the sum of error probabilities in the sequential probability ratio test, minimizing the weighted average of the two average sample numbers under a simple null hypothesis and a simple alternative hypothesis, with the restriction that the sum of the error probabilities is a pre-assigned constant, in order to find the optimal sample size; finally, a comparison is made with the optimal sample size found from the fixed-sample-size procedure. The results are applied to cases in which the random variate follows a normal law as well as a Bernoulli law.
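
    For context, the classical Wald SPRT that such work modifies can be sketched in a few lines. This version uses Wald's threshold approximations for prescribed error probabilities α and β; it does not implement the paper's variant, which fixes the sum α + β and optimizes the split.

    ```python
    import math
    import random

    def sprt_bernoulli(p0, p1, alpha, beta, stream):
        """Wald's SPRT for H0: p = p0 vs H1: p = p1 on Bernoulli data,
        with the usual approximate thresholds."""
        upper = math.log((1 - beta) / alpha)
        lower = math.log(beta / (1 - alpha))
        llr, n = 0.0, 0
        for x in stream:
            n += 1
            llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
            if llr >= upper:
                return "accept H1", n
            if llr <= lower:
                return "accept H0", n
        return "undecided", n

    random.seed(0)
    stream = (random.random() < 0.55 for _ in range(10_000))
    print(sprt_bernoulli(0.5, 0.6, alpha=0.05, beta=0.05, stream=stream))
    ```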

  10. Fifty challenging problems in probability with solutions

    CERN Document Server

    Mosteller, Frederick

    1987-01-01

    Can you solve the problem of "The Unfair Subway"? Marvin gets off work at random times between 3 and 5 p.m. His mother lives uptown, his girlfriend downtown. He takes the first subway that comes in either direction and eats dinner with the one he is delivered to. His mother complains that he never comes to see her, but he says she has a 50-50 chance. He has had dinner with her twice in the last 20 working days. Explain. Marvin's adventures in probability are one of the fifty intriguing puzzles that illustrate both elementary and advanced aspects of probability, each problem designed to chall

  11. Return probability: Exponential versus Gaussian decay

    Energy Technology Data Exchange (ETDEWEB)

    Izrailev, F.M. [Instituto de Fisica, BUAP, Apdo. Postal J-48, 72570 Puebla (Mexico)]. E-mail: izrailev@sirio.ifuap.buap.mx; Castaneda-Mendoza, A. [Instituto de Fisica, BUAP, Apdo. Postal J-48, 72570 Puebla (Mexico)

    2006-02-13

    We analyze, both analytically and numerically, the time-dependence of the return probability in closed systems of interacting particles. Main attention is paid to the interplay between two regimes: one characterized by Gaussian decay of the return probability, the other by the well-known exponential decay. Our analytical estimates are confirmed by the numerical data obtained for two models with random interaction. In view of these results, we also briefly discuss the dynamical model which was recently proposed for the implementation of a quantum computation.

  12. Breeding Experience Might Be a Major Determinant of Breeding Probability in Long-Lived Species: The Case of the Greater Flamingo

    Science.gov (United States)

    Pradel, Roger; Choquet, Rémi; Béchet, Arnaud

    2012-01-01

    The probability of breeding is known to increase with age early in life in many long-lived species. This increase may be due to experience accumulated through past breeding attempts. Recent methodological advances allow accounting for unobserved breeding episodes; we analyzed the encounter histories of 14,716 greater flamingos over 25 years to get a detailed picture of the interactions of age and experience. Survival did not improve with experience, seemingly ruling out the selection hypothesis. Breeding probability varied within three levels of experience: no breeding experience, 1 experience, 2+ experiences. We fitted models with and without among-individual differences in breeding probabilities by including or not an additive individual random effect. Including the individual random effect improved the model fit less than including experience, but the best model retained both. However, because modeling individual heterogeneity by means of an additive static individual random effect is currently criticized and may not be appropriate, we discuss the results with and without the random effect. Without the random effect, breeding probability of inexperienced birds was always lower than that of same-age experienced birds, and breeding probability increased more with one additional experience than with one additional year of age. With random effects, the advantage of experience was unequivocal only after age 9, while in young birds having experience was penalizing. Another pattern, that breeding probability of birds with experience dropped after some age (8 without the random effect; up to 11 with it), may point to differences in the timing of reproductive senescence or to the existence of a sensitive period for acquiring behavioral skills. Overall, the role of experience appears strong in this long-lived species. We argue that overlooking the role of experience may hamper detection of trade-offs and assessment of individual heterogeneity. However, manipulative experiments are
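    A schematic of the model class described, in assumed notation rather than the authors' own: breeding probability on the logit scale with age effects, experience-class effects, and an additive static individual random effect,

```latex
\operatorname{logit}\big(\psi_{i,t}\big) = \alpha_{a(i,t)} + \beta_{e(i,t)} + u_i,
\qquad u_i \sim \mathcal{N}(0, \sigma^2),
```

    where a(i,t) is the age of bird i in year t and e(i,t) ∈ {0, 1, 2+} its experience class; dropping u_i gives the "without random effect" variants discussed above.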

  13. Contrasting treatment-specific survival using double-robust estimators.

    Science.gov (United States)

    Zhang, Min; Schaubel, Douglas E

    2012-12-30

    In settings where a randomized trial is infeasible, observational data are frequently used to compare treatment-specific survival. The average causal effect (ACE) can be used to make inference regarding treatment policies on patient populations, and a valid ACE estimator must account for imbalances with respect to treatment-specific covariate distributions. One method through which the ACE on survival can be estimated involves appropriately averaging over Cox-regression-based fitted survival functions. A second available method balances the treatment-specific covariate distributions through inverse probability of treatment weighting and then contrasts weighted nonparametric survival function estimators. Because both methods have their advantages and disadvantages, we propose methods that essentially combine both estimators. The proposed methods are double robust, in the sense that they are consistent if at least one of the two working regression models (i.e., logistic model for treatment and Cox model for death hazard) is correct. The proposed methods involve estimating the ACE with respect to restricted mean survival time, defined as the area under the survival curve up to some prespecified time point. We derive and evaluate asymptotic results through simulation. We apply the proposed methods to estimate the ACE of donation-after-cardiac-death kidney transplantation with the use of data obtained from multiple centers in the Netherlands. Copyright © 2012 John Wiley & Sons, Ltd.
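    A minimal sketch of the second (weighting-based) ingredient only, under assumed column names and using the scikit-learn and lifelines APIs: estimate propensities, weight each arm by inverse propensity, and contrast restricted mean survival time up to a horizon tau. The double-robust augmentation with the Cox working model is omitted for brevity:

```python
import numpy as np
from lifelines import KaplanMeierFitter
from sklearn.linear_model import LogisticRegression

def ipw_rmst_contrast(df, covariates, tau):
    """IPTW contrast of restricted mean survival time (RMST) up to tau.

    df needs columns 'time', 'event', 'treated' plus the covariates.
    """
    # 1. Working treatment model: logistic regression (propensity score).
    ps = LogisticRegression(max_iter=1000).fit(df[covariates], df["treated"])
    p = ps.predict_proba(df[covariates])[:, 1]
    w = np.where(df["treated"] == 1, 1.0 / p, 1.0 / (1.0 - p))

    # 2. Weighted Kaplan-Meier per arm, then area under S(t) up to tau.
    grid = np.linspace(0.0, tau, 200)
    rmst = {}
    for arm in (0, 1):
        mask = (df["treated"] == arm).to_numpy()
        km = KaplanMeierFitter().fit(
            df.loc[mask, "time"], df.loc[mask, "event"], weights=w[mask]
        )
        surv = km.survival_function_at_times(grid).to_numpy()
        rmst[arm] = np.trapz(surv, grid)
    return rmst[1] - rmst[0]  # ACE on RMST under the weighting model
```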

  14. Inferential Statistics from Black Hispanic Breast Cancer Survival Data

    Directory of Open Access Journals (Sweden)

    Hafiz M. R. Khan

    2014-01-01

    In this paper we test statistical probability models for breast cancer survival data by race and ethnicity. Data were collected from breast cancer patients diagnosed in the United States during the years 1973–2009. We selected a stratified random sample of Black Hispanic female patients from the Surveillance, Epidemiology and End Results (SEER) database to derive the statistical probability models. We used three common model-building criteria, Akaike Information Criterion (AIC), Bayesian Information Criterion (BIC), and Deviance Information Criterion (DIC), to measure goodness of fit, and found that the survival data of Black Hispanic female patients were best fit by the exponentiated exponential probability model. A novel Bayesian method was used to derive the posterior density function for the model parameters as well as to derive the predictive inference for future response. We specifically focused on the Black Hispanic race. A Markov chain Monte Carlo (MCMC) method was used for obtaining the summary results of posterior parameters. Additionally, we report predictive intervals for future survival times. These findings would be of great significance in treatment planning and healthcare resource allocation.
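    A minimal sketch, on synthetic data, of fitting the exponentiated exponential model F(x) = (1 − e^(−λx))^α by maximum likelihood and scoring it with AIC and BIC; the paper's Bayesian posterior and DIC machinery are not reproduced here:

```python
import numpy as np
from scipy.optimize import minimize

def negloglik(params, x):
    """Negative log-likelihood of the exponentiated exponential law
    f(x) = a * l * exp(-l*x) * (1 - exp(-l*x))**(a - 1), x > 0."""
    a, l = params
    if a <= 0 or l <= 0:
        return np.inf
    return -np.sum(np.log(a) + np.log(l) - l * x
                   + (a - 1) * np.log1p(-np.exp(-l * x)))

rng = np.random.default_rng(0)
a_true, l_true = 1.8, 0.05
u = rng.uniform(size=500)
x = -np.log1p(-u ** (1.0 / a_true)) / l_true  # inverse-CDF sampling

fit = minimize(negloglik, x0=[1.0, 1.0 / x.mean()], args=(x,),
               method="Nelder-Mead")
k, n = 2, len(x)
print("MLE:", fit.x)
print("AIC:", 2 * k + 2 * fit.fun, "BIC:", k * np.log(n) + 2 * fit.fun)
```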

  15. Hazard Function Estimation with Cause-of-Death Data Missing at Random

    OpenAIRE

    Wang, Qihua; Dinse, Gregg E.; Liu, Chunling

    2012-01-01

    Hazard function estimation is an important part of survival analysis. Interest often centers on estimating the hazard function associated with a particular cause of death. We propose three nonparametric kernel estimators for the hazard function, all of which are appropriate when death times are subject to random censorship and censoring indicators can be missing at random. Specifically, we present a regression surrogate estimator, an imputation estimator, and an inverse probability weighted estimator.
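    As one hedged illustration of the third idea (schematic notation, not necessarily the authors' exact estimator): starting from the Ramlau-Hansen kernel hazard estimator and reweighting complete cases by the estimated probability that the cause indicator is observed gives

```latex
\hat\lambda_1(t) = \sum_{i=1}^{n} K_h(t - T_i)\,
\frac{\xi_i\,\delta_{1i}}{\hat\pi_i\,Y(T_i)},
```

    where T_i is the observed time, δ_{1i} indicates death from the cause of interest, ξ_i indicates that the cause indicator is non-missing, π̂_i estimates P(ξ_i = 1 | observed data), Y(t) is the number at risk, and K_h is a kernel with bandwidth h.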

  16. Quantization of Prior Probabilities for Hypothesis Testing

    OpenAIRE

    Varshney, Kush R.; Varshney, Lav R.

    2008-01-01

    Bayesian hypothesis testing is investigated when the prior probabilities of the hypotheses, taken as a random vector, are quantized. Nearest neighbor and centroid conditions are derived using mean Bayes risk error as a distortion measure for quantization. A high-resolution approximation to the distortion-rate function is also obtained. Human decision making in segregated populations is studied assuming Bayesian hypothesis testing with quantized priors.

  17. Multiple decomposability of probabilities on contractible locally ...

    Indian Academy of Sciences (India)

    Definition 3.1). As mentioned before, μ is n-times τ-decomposable iff μ has a representation as an (n + 1)-times iterated convolution product. To be allowed to ..... Then the classical version of the equivalence theorem holds: If ν_i, i ≥ 0, and ν are probabilities and X_i, i ≥ 0, and Y are independent G-valued random variables with ...

  18. A probability of synthesis of the superheavy element Z = 124

    Energy Technology Data Exchange (ETDEWEB)

    Manjunatha, H.C. [Government College for Women, Department of Physics, Kolar, Karnataka (India); Sridhar, K.N. [Government First Grade College, Department of Physics, Kolar, Karnataka (India)

    2017-10-15

    We have studied the fusion cross section, evaporation residue cross section, compound nucleus formation probability (P_CN) and survival probability (P_sur) of different projectile-target combinations for the synthesis of the superheavy element Z = 124, and have thereby identified the most probable combinations: Kr+Ra, Ni+Cm, Se+Th, Ge+U and Zn+Pu. We hope that our predictions may guide future experiments on the synthesis of superheavy nuclei with Z = 124. (orig.)
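    The quantities named here enter the standard factorization of the evaporation residue cross section (a textbook schematic, not the authors' specific parametrization):

```latex
\sigma_{\mathrm{ER}} = \sigma_{\mathrm{capture}} \; P_{\mathrm{CN}} \; P_{\mathrm{sur}},
```

    where σ_capture is the capture cross section, P_CN the probability that the composite system evolves into a compound nucleus rather than quasifissioning, and P_sur the probability that the excited compound nucleus survives fission during de-excitation.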

  19. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.
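    The usual explication can be made concrete by enumerating equally likely sibling pairs under each scenario (a standard illustration; the article's own scenarios may differ in detail):

```python
from itertools import product

pairs = list(product("BG", repeat=2))  # (older, younger), all equally likely

# Scenario 1: we learn only that at least one child is a girl.
s1 = [p for p in pairs if "G" in p]
print(sum(p == ("G", "G") for p in s1) / len(s1))  # 0.333... -> 1/3

# Scenario 2: we learn that the *older* child is a girl.
s2 = [p for p in pairs if p[0] == "G"]
print(sum(p == ("G", "G") for p in s2) / len(s2))  # 0.5 -> 1/2
```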

  20. Tobit regression for modeling mean survival time using data subject to multiple sources of censoring.

    Science.gov (United States)

    Gong, Qi; Schaubel, Douglas E

    2018-01-22

    Mean survival time is often of inherent interest in medical and epidemiologic studies. In the presence of censoring and when covariate effects are of interest, Cox regression is the strong default, but mostly due to convenience and familiarity. When survival times are uncensored, covariate effects can be estimated as differences in mean survival through linear regression. Tobit regression can validly be performed through maximum likelihood when the censoring times are fixed (ie, known for each subject, even in cases where the outcome is observed). However, Tobit regression is generally inapplicable when the response is subject to random right censoring. We propose Tobit regression methods based on weighted maximum likelihood which are applicable to survival times subject to both fixed and random censoring times. Under the proposed approach, known right censoring is handled naturally through the Tobit model, with inverse probability of censoring weighting used to overcome random censoring. Essentially, the re-weighted data are intended to represent those that would have been observed in the absence of random censoring. We develop methods for estimating the Tobit regression parameter, then the population mean survival time. A closed form large-sample variance estimator is proposed for the regression parameter estimator, with a semiparametric bootstrap standard error estimator derived for the population mean. The proposed methods are easily implementable using standard software. Finite-sample properties are assessed through simulation. The methods are applied to a large cohort of patients wait-listed for kidney transplantation. Copyright © 2018 John Wiley & Sons, Ltd.
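    Schematically, in assumed notation, the proposal amounts to maximizing an inverse-probability-of-censoring weighted Tobit log-likelihood. With fixed censoring time C_i, observed Y_i = min(T_i, C_i), δ_i = I(T_i ≤ C_i), and weights ŵ_i estimated from a model of the random-censoring mechanism:

```latex
\ell_w(\beta, \sigma) = \sum_{i=1}^{n} \hat w_i \left[
\delta_i \log\!\Big( \tfrac{1}{\sigma}\,\phi\Big(\tfrac{Y_i - x_i^{\top}\beta}{\sigma}\Big) \Big)
+ (1 - \delta_i)\, \log \Phi\Big(\tfrac{x_i^{\top}\beta - C_i}{\sigma}\Big)
\right],
```

    with φ and Φ the standard normal density and distribution functions; setting ŵ_i ≡ 1 recovers the classical Tobit likelihood for fixed censoring only.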

  1. Probability problems in seismic risk analysis and load combinations for nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    George, L.L.

    1983-01-01

    This paper describes seismic risk, load combination, and probabilistic risk problems in power plant reliability, and it suggests applications of extreme value theory. Seismic risk analysis computes the probability of power plant failure in an earthquake and the resulting risk. Components fail if their peak responses to an earthquake exceed their strengths. Dependent stochastic processes represent responses, and peak responses are maxima. A Boolean function of component failures and survivals represents plant failure. Load combination analysis computes the cdf of the peak of the superposition of stochastic processes that represent earthquake and operating loads. It also computes the probability of pipe fracture due to crack growth, modeled as a Markov process driven by loads; pipe fracture is an absorbing state. Probabilistic risk analysis computes the cdfs of probabilities which represent uncertainty. These cdfs are induced by randomizing parameters of cdfs and by randomizing properties of stochastic processes such as initial crack size distributions, marginal cdfs, and failure criteria.

  2. Quantum Probability, Orthogonal Polynomials and Quantum Field Theory

    Science.gov (United States)

    Accardi, Luigi

    2017-03-01

    The main thesis of the present paper is that Quantum Probability is not a generalization of classical probability, but a deeper level of it: classical random variables have an intrinsic (microscopic) non-commutative structure that generalizes usual quantum theory. The study of this generalization is the core of the non-linear quantization program.

  3. Visualizing and Understanding Probability and Statistics: Graphical Simulations Using Excel

    Science.gov (United States)

    Gordon, Sheldon P.; Gordon, Florence S.

    2009-01-01

    The authors describe a collection of dynamic interactive simulations for teaching and learning most of the important ideas and techniques of introductory statistics and probability. The modules cover such topics as randomness, simulations of probability experiments such as coin flipping, dice rolling and general binomial experiments, a simulation…
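    The same kind of dynamic experiment is easy to reproduce outside a spreadsheet; a minimal sketch of the coin-flipping/binomial simulation the modules describe, in Python rather than Excel:

```python
import random
from collections import Counter

def binomial_experiment(n_flips=10, p=0.5, trials=10_000, seed=42):
    """Repeat `trials` runs of n_flips biased coin flips and return the
    empirical distribution of the number of heads per run."""
    rng = random.Random(seed)
    counts = Counter(
        sum(rng.random() < p for _ in range(n_flips)) for _ in range(trials)
    )
    return {k: counts[k] / trials for k in sorted(counts)}

print(binomial_experiment())  # empirical pmf, close to Binomial(10, 0.5)
```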

  4. Introduction to probability and statistics for science, engineering, and finance

    CERN Document Server

    Rosenkrantz, Walter A

    2008-01-01

    Data Analysis: Orientation; The Role and Scope of Statistics in Science and Engineering; Types of Data: Examples from Engineering, Public Health, and Finance; The Frequency Distribution of a Variable Defined on a Population; Quantiles of a Distribution; Measures of Location (Central Value) and Variability; Covariance, Correlation, and Regression: Computing a Stock's Beta; Mathematical Details and Derivations; Large Data Sets. Probability Theory: Orientation; Sample Space, Events, Axioms of Probability Theory; Mathematical Models of Random Sampling; Conditional Probability and Bayes

  5. The Tanzania Connect Project: a cluster-randomized trial of the child survival impact of adding paid community health workers to an existing facility-focused health system

    Science.gov (United States)

    2013-01-01

    Background: Tanzania has been a pioneer in establishing community-level services, yet challenges remain in sustaining these systems and ensuring adequate human resource strategies. In particular, the added value of a cadre of professional community health workers is under debate. While Tanzania has the highest density of primary health care facilities in Africa, equitable access and quality of care remain a challenge. Utilization of many services proven to reduce child and maternal mortality is unacceptably low. Tanzanian policy initiatives have sought to address these problems by proposing expansion of community-based providers, but the Ministry of Health and Social Welfare (MoHSW) lacks evidence that this merits national implementation. The Tanzania Connect Project is a randomized cluster trial located in three rural districts with a population of roughly 360,000 (Kilombero, Rufiji, and Ulanga). Description of intervention: Connect aims to test whether introducing a community health worker into a general program of health systems strengthening and referral improvement will reduce child mortality, improve access to services, expand utilization, and alter reproductive, maternal, newborn and child health seeking behavior, thereby accelerating progress towards Millennium Development Goals 4 and 5. Connect has introduced a new cadre — Community Health Agents (CHA) — who were recruited from and work in their communities. To support the CHA, Connect developed supervisory systems, launched information and monitoring operations, and implemented logistics support for integration with existing district and village operations. In addition, Connect's district-wide emergency referral strengthening intervention includes clinical and operational improvements. Evaluation design: Designed as a community-based cluster-randomized trial, CHA were randomly assigned to 50 of the 101 villages within the Health and Demographic Surveillance System (HDSS) in the three study districts

  6. Smoking relapse-prevention intervention for cancer patients: Study design and baseline data from the surviving SmokeFree randomized controlled trial.

    Science.gov (United States)

    Díaz, Diana B; Brandon, Thomas H; Sutton, Steven K; Meltzer, Lauren R; Hoehn, Hannah J; Meade, Cathy D; Jacobsen, Paul B; McCaffrey, Judith C; Haura, Eric B; Lin, Hui-Yi; Simmons, Vani N

    2016-09-01

    Continued smoking after a cancer diagnosis contributes to several negative health outcomes. Although many cancer patients attempt to quit smoking, high smoking relapse rates have been observed. This highlights the need for a targeted, evidence-based smoking-relapse prevention intervention. The design, method, and baseline characteristics of a randomized controlled trial assessing the efficacy of a self-help smoking-relapse prevention intervention are presented. Cancer patients who had recently quit smoking were randomized to one of two conditions. The Usual Care (UC) group received the institution's standard of care. The smoking relapse-prevention intervention (SRP) group received standard of care, plus 8 relapse-prevention booklets mailed over a 3-month period, and a targeted educational DVD developed specifically for cancer patients. Four hundred and fourteen participants were enrolled and completed a baseline survey. Primary outcomes will be self-reported smoking status at 6 and 12 months after baseline. Biochemical verification of smoking status was completed for a subsample. If found to be efficacious, this low-cost intervention could be easily disseminated with significant potential for reducing the risk of negative cancer outcomes associated with continued smoking. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. Probability workshop to be better in probability topic

    Science.gov (United States)

    Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed

    2015-02-01

    The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students in higher education have an effect on their performance. 62 fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance in the probability topic is not related to anxiety level; that is, a higher level of statistics anxiety does not entail a lower score in probability performance. The study also revealed that motivated students gained from the probability workshop: their performance in the probability topic showed a positive improvement compared with before the workshop. In addition, there was a significant difference in students' performance between genders, with better achievement among female students than among male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.

  8. Community and District Empowerment for Scale-up (CODES): a complex district-level management intervention to improve child survival in Uganda: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Waiswa, Peter; O'Connell, Thomas; Bagenda, Danstan; Mullachery, Pricila; Mpanga, Flavia; Henriksson, Dorcus Kiwanuka; Katahoire, Anne Ruhweza; Ssegujja, Eric; Mbonye, Anthony K; Peterson, Stefan Swartling

    2016-03-11

    Innovative and sustainable strategies to strengthen districts and other sub-national health systems and management are urgently required to reduce child mortality. Although highly effective, evidence-based and affordable child survival interventions are well known, at the district level a lack of data, motivation, and analytic and planning capacity often impedes prioritization, while management weaknesses impede implementation. The Community and District Empowerment for Scale-up (CODES) project is a complex management intervention designed to test whether districts, when empowered with data and management tools, can prioritize and implement evidence-based child survival interventions equitably. The CODES strategy combines management, diagnostic, and evaluation tools to identify and analyze the causes of bottlenecks to implementation, build the capacity of district management teams to implement context-specific solutions, and foster community monitoring and social accountability to increase demand for services. CODES combines UNICEF tools designed to systematize priority setting, allocation of resources and problem solving with community dialogues based on Citizen Report Cards and U-Reports used to engage and empower communities in monitoring health service provision and demanding quality services. Implementation and all data collection will be carried out by the district teams or local community-based organizations, supported by two local implementing partners. The study will be evaluated as a cluster randomized trial with eight intervention and eight comparison districts over a period of 3 years. Evaluation will focus on differences in uptake of child survival interventions and will follow an intention-to-treat analysis. We will also document and analyze experiences in implementation, including changes in management practices. By increasing the District Health Management Teams' capacity to prioritize and implement context-specific solutions, and empowering communities to

  9. Probability and Statistics The Science of Uncertainty (Revised Edition)

    CERN Document Server

    Tabak, John

    2011-01-01

    Probability and Statistics, Revised Edition deals with the history of probability, describing the modern concept of randomness and examining "pre-probabilistic" ideas of what most people today would characterize as randomness. This revised book documents some historically important early uses of probability to illustrate some very important probabilistic questions. It goes on to explore statistics and the generations of mathematicians and non-mathematicians who began to address problems in statistical analysis, including the statistical structure of data sets as well as the theory of

  10. Entanglement probabilities of polymers: a white noise functional approach

    CERN Document Server

    Bernido, C C

    2003-01-01

    The entanglement probabilities for a highly flexible polymer to wind n times around a straight polymer are evaluated using white noise analysis. To introduce the white noise functional approach, the one-dimensional random walk problem is taken as an example. The polymer entanglement scenario, viewed as a random walk on a plane, is then treated and the entanglement probabilities are obtained for a magnetic flux confined along the straight polymer, and a case where an entangled polymer is subjected to the potential V = ḟ(s)θ. In the absence of the magnetic flux and the potential V, the entanglement probabilities reduce to a result obtained by Wiegel.

  11. Effect of provision of an integrated neonatal survival kit and early cognitive stimulation package by community health workers on developmental outcomes of infants in Kwale County, Kenya: study protocol for a cluster randomized trial.

    Science.gov (United States)

    Pell, Lisa G; Bassani, Diego G; Nyaga, Lucy; Njagi, Isaac; Wanjiku, Catherine; Thiruchselvam, Thulasi; Macharia, William; Minhas, Ripudaman S; Kitsao-Wekulo, Patricia; Lakhani, Amyn; Bhutta, Zulfiqar A; Armstrong, Robert; Morris, Shaun K

    2016-09-08

    Each year, more than 200 million children under the age of 5 years, almost all in low- and middle-income countries (LMICs), fail to achieve their developmental potential. Risk factors for compromised development often coexist and include inadequate cognitive stimulation, poverty, nutritional deficiencies, infection and complications of being born low birthweight and/or premature. Moreover, many of these risk factors are closely associated with newborn morbidity and mortality. As compromised development has significant implications on human capital, inexpensive and scalable interventions are urgently needed to promote neurodevelopment and reduce risk factors for impaired development. This cluster randomized trial aims at evaluating the impact of volunteer community health workers delivering either an integrated neonatal survival kit, an early stimulation package, or a combination of both interventions, to pregnant women during their third trimester of pregnancy, compared to the current standard of care in Kwale County, Kenya. The neonatal survival kit comprises a clean delivery kit (sterile blade, cord clamp, clean plastic sheet, surgical gloves and hand soap), sunflower oil emollient, chlorhexidine, ThermoSpot(TM), Mylar infant sleeve, and a reusable instant heater. Community health workers are also equipped with a portable hand-held electric scale. The early cognitive stimulation package focuses on enhancing caregiver practices by teaching caregivers three key messages that comprise combining a gentle touch with making eye contact and talking to children, responsive feeding and caregiving, and singing. The primary outcome measure is child development at 12 months of age assessed with the Protocol for Child Monitoring (Infant and Toddler version). The main secondary outcome is newborn mortality. This study will provide evidence on effectiveness of delivering an innovative neonatal survival kit and/or early stimulation package to pregnant women in Kwale County

  12. Probability and statistics: selected problems

    OpenAIRE

    Machado, J.A. Tenreiro; Pinto, Carla M. A.

    2014-01-01

    Probability and Statistics—Selected Problems is a unique book for senior undergraduate and graduate students to quickly review basic material in probability and statistics. Descriptive statistics are presented first, followed by a review of probability. Discrete and continuous distributions are presented. Sampling and estimation with hypothesis testing are covered in the last two chapters. Solutions to the proposed exercises are provided for the reader's reference.

  13. Long-term survival and dialysis dependency following acute kidney injury in intensive care: extended follow-up of a randomized controlled trial.

    Directory of Open Access Journals (Sweden)

    Martin Gallagher

    2014-02-01

    The incidence of acute kidney injury (AKI) is increasing globally and it is much more common than end-stage kidney disease. AKI is associated with high mortality and cost of hospitalisation. Studies of treatments to reduce this high mortality have used differing renal replacement therapy (RRT) modalities and have not shown improvement in the short term. The reported long-term outcomes of AKI are variable and the effect of differing RRT modalities upon them is not clear. We used the prolonged follow-up of a large clinical trial to prospectively examine the long-term outcomes and effect of RRT dosing in patients with AKI. We extended the follow-up of participants in the Randomised Evaluation of Normal vs. Augmented Levels of RRT (RENAL) study from 90 days to 4 years after randomization. Primary and secondary outcomes were mortality and requirement for maintenance dialysis, respectively, assessed in 1,464 (97%) patients at a median of 43.9 months (interquartile range [IQR] 30.0-48.6 months) post randomization. A total of 468/743 (63%) and 444/721 (62%) patients died in the lower and higher intensity groups, respectively (risk ratio [RR] 1.04, 95% CI 0.96-1.12, p = 0.49). Amongst survivors to day 90, 21 of 411 (5.1%) and 23 of 399 (5.8%) in the respective groups were treated with maintenance dialysis (RR 1.12, 95% CI 0.63-2.00, p = 0.69). The prevalence of albuminuria among survivors was 40% and 44%, respectively (p = 0.48). Quality of life was not different between the two treatment groups. The generalizability of these findings to other populations with AKI requires further exploration. Patients with AKI requiring RRT in intensive care have high long-term mortality but few require maintenance dialysis. Long-term survivors have a heavy burden of proteinuria. Increased intensity of RRT does not reduce mortality or subsequent treatment with dialysis. www.ClinicalTrials.gov NCT00221013.

  14. Non-random temporary emigration and the robust design: Conditions for bias at the end of a time series: Section VIII

    Science.gov (United States)

    Langtimm, Catherine A.

    2008-01-01

    Deviations from model assumptions in the application of capture–recapture models to real life situations can introduce unknown bias. Understanding the type and magnitude of bias under these conditions is important to interpreting model results. In a robust design analysis of long-term photo-documented sighting histories of the endangered Florida manatee, I found high survival rates, high rates of non-random temporary emigration, significant time-dependence, and a diversity of factors affecting temporary emigration that made it difficult to model emigration in any meaningful fashion. Examination of the time-dependent survival estimates indicated a suspicious drop in survival rates near the end of the time series that persisted when the original capture histories were truncated and reanalyzed under a shorter time frame. Given the wide swings in manatee emigration estimates from year to year, a likely source of bias in survival was the convention for resolving the confounding of the last survival probability in a time-dependent model with the last emigration probabilities: the last unmeasurable emigration probability is set equal to the previous year's probability, and here that equality was actually false. Results of a series of simulations demonstrated that if the unmeasurable temporary emigration probabilities in the last time period were not accurately modeled, an estimation model with significant annual variation in survival and emigration probabilities produced bias in survival estimates at the end of the study or time series being explored. Furthermore, the bias propagated back in time beyond the last two time periods, and the number of years affected varied positively with survival and emigration probabilities. Truncating the data to a shorter time frame and reanalyzing demonstrated that with additional years of data surviving temporary emigrants eventually return and are detected; thus, in subsequent analysis, unbiased estimates are eventually realized.

  15. Characteristic miRNA expression signature and random forest survival analysis identify potential cancer-driving miRNAs in a broad range of head and neck squamous cell carcinoma subtypes.

    Science.gov (United States)

    Nunez Lopez, Yury O; Victoria, Berta; Golusinski, Pawel; Golusinski, Wojciech; Masternak, Michal M

    2018-01-01

    To characterize the miRNA expression profile in head and neck squamous cell carcinoma (HNSCC) accounting for a broad range of cancer subtypes and consequently identify an optimal miRNA signature with prognostic value. HNSCC is consistently among the most common cancers worldwide. Its mortality rate is about 50% because of the characteristic aggressive behavior of these cancers and the prevalent late diagnosis. The heterogeneity of the disease has hampered the development of robust prognostic tools with broad clinical utility. The Cancer Genome Atlas HNSC dataset was used to analyze level 3 miRNA-Seq data from 497 HNSCC patients. Differential expression (DE) analysis was implemented using the limma package and a multivariate linear model that adjusted for the confounding effects of age at diagnosis, gender, race, alcohol history, anatomic neoplasm subdivision, pathologic stage, T and N stages, and vital status. Random forest (RF) survival analysis was implemented using the randomForestSRC package. A characteristic DE miRNA signature of HNSCC, comprised of 11 upregulated (i.e., miR-196b-5p, miR-1269a, miR-196a-5p, miR-4652-3p, miR-210-3p, miR-1293, miR-615-3p, miR-503-5p, miR-455-3p, miR-205-5p, and miR-21-5p) and 9 downregulated (miR-376c-3p, miR-378c, miR-29c-3p, miR-101-3p, miR-195-5p, miR-299-5p, miR-139-5p, miR-6510-3p, miR-375) miRNAs was identified. An optimal RF survival model was built from seven variables including age at diagnosis, miR-378c, miR-6510-3p, stage N, pathologic stage, gender, and race (listed in order of variable importance). The joint differential miRNA expression and survival analysis controlling for multiple confounding covariates implemented in this study allowed for the identification of a previously undetected prognostic miRNA signature characteristic of a broad range of HNSCC.
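    The study fit its forests with the R package randomForestSRC; a rough Python analogue of the same model class, via scikit-survival's RandomSurvivalForest (column names here are hypothetical):

```python
from sksurv.ensemble import RandomSurvivalForest
from sksurv.util import Surv

def fit_rsf(df, feature_cols):
    """Random survival forest on miRNA expression plus clinical
    covariates; df needs boolean-codable 'event' and numeric 'time'."""
    y = Surv.from_arrays(event=df["event"].astype(bool), time=df["time"])
    rsf = RandomSurvivalForest(n_estimators=500, min_samples_leaf=15,
                               n_jobs=-1, random_state=0)
    rsf.fit(df[feature_cols], y)
    return rsf  # rsf.predict(X) returns risk scores (higher = worse)
```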

  16. Considerations on probability: from games of chance to modern science

    Directory of Open Access Journals (Sweden)

    Paola Monari

    2015-12-01

    The article sets out a number of considerations on the distinction between variability and uncertainty over the centuries. Games of chance have always been useful random experiments which, through combinatorial calculation, opened the way to probability theory and to the interpretation of modern science through statistical laws. The article also looks briefly at the stormy nineteenth-century debate concerning the definitions of probability, which covered the same ground – sometimes without any historical awareness – as the debate that arose at the very beginnings of probability theory, when the great probability theorists were open to every possible meaning of the term.

  17. Hip Fracture Surgery and Survival in Centenarians.

    Science.gov (United States)

    Mazzola, Paolo; Rea, Federico; Merlino, Luca; Bellelli, Giuseppe; Dubner, Lauren; Corrao, Giovanni; Pasinetti, Giulio M; Annoni, Giorgio

    2016-11-01

    Hip fracture (HF) is increasingly frequent with advancing age. Studies describing the HF incidence rate and survival after surgery in centenarians are scarce. To fill this gap, we performed a large population-based investigation of centenarians in Lombardy (Italy). Retrospective observational cohort study based on information from the Healthcare Utilization Database. Among the cohort of 7,830 residents who reached 100 years of age between 2004 and 2011, the incidence rate of HF was calculated. Two hundred fifty-nine patients were discharged alive from a hospital after HF and surgical repair (HF cohort). For each HF cohort member, a control was randomly selected from the initial cohort, matched for gender and date of birth, who did not experience HF from the date of their hundredth birthday until the date of hospital discharge of the corresponding HF cohort member. The survival curves and the hazard functions of the HF and control cohorts were calculated within 2 years. Over a mean follow-up of 1.85 years, the HF incidence rate was 23.1 per 1,000 centenarians per year. Survival probability was significantly lower in the HF cohort than in the control cohort (31.5 vs 48.1%, p < .001). Hazard functions showed an increased risk of death in the HF cohort relative to the control cohort, especially in the 3 months after surgery. Survival analysis exhibited an excess mortality in the first 3 months among HF cohort members, but not beyond this period. Every effort to counteract HF is warranted, including prevention of falls and high quality of care, especially in the early post-surgical period. © The Author 2016. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  18. Survival Analysis

    CERN Document Server

    Miller, Rupert G

    2011-01-01

    A concise summary of the statistical methods used in the analysis of survival data with censoring. Emphasizes recently developed nonparametric techniques. Outlines methods in detail and illustrates them with actual data. Discusses the theory behind each method. Includes numerous worked problems and numerical exercises.

  19. Modelling survival

    DEFF Research Database (Denmark)

    Ashauer, Roman; Albert, Carlo; Augustine, Starrlight

    2016-01-01

    well GUTS, calibrated with short-term survival data of Gammarus pulex exposed to four pesticides, can forecast effects of longer-term pulsed exposures. Thirdly, we tested the ability of GUTS to estimate 14-day median effect concentrations of malathion for a range of species and use these estimates...

  20. The Prevention of Respiratory Insufficiency after Surgical Management (PRISM) Trial. Report of the protocol for a pragmatic randomized controlled trial of CPAP to prevent respiratory complications and improve survival following major abdominal surgery.

    Science.gov (United States)

    Pearse, Rupert M; Abbott, Tom E; Haslop, Richard; Ahmad, Tahania; Kahan, Brennan C; Filipini, Claudia; Rhodes, Andrew; Ranieri, Marco

    2017-02-01

    Over 300 million patients undergo surgery worldwide each year. Postoperative morbidity - particularly respiratory complications - is most frequent and severe among high-risk patients undergoing major abdominal surgery. However, standard treatments, like physiotherapy or supplemental oxygen, often fail to prevent these complications. Preliminary research suggests that prophylactic continuous positive airways pressure (CPAP) can reduce the risk of postoperative respiratory complications. However, without evidence from a large clinical effectiveness trial, CPAP has not become routine care. This trial aims to determine whether early postoperative CPAP reduces the incidence of respiratory complications and improves one-year survival following major intra-peritoneal surgery. This is an international multicenter randomized controlled trial with open study group allocation. Participants are patients aged 50 years and over undergoing major elective intra-peritoneal surgery. The intervention is CPAP for at least four hours, started within four hours of the end of surgery. The primary outcome is a composite of pneumonia, re-intubation, or death within 30 days of randomization. All participants with a recorded outcome will be analyzed on an intention-to-treat basis. The primary analysis will use a mixed-effects logistic regression model, which includes center as a random intercept, and will be adjusted for the minimization factors and other pre-specified covariates. Trial Registration: ISRCTN 56012545. This is the first proposed clinical effectiveness trial of postoperative CPAP to prevent respiratory complications of which we are aware. The large sample size and multicenter international design will make the result generalizable to a variety of healthcare settings.

  1. Training Teachers to Teach Probability

    Science.gov (United States)

    Batanero, Carmen; Godino, Juan D.; Roa, Rafael

    2004-01-01

    In this paper we analyze the reasons why the teaching of probability is difficult for mathematics teachers, describe the contents needed in the didactical preparation of teachers to teach probability and analyze some examples of activities to carry out this training. These activities take into account the experience at the University of Granada,…

  2. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  3. Probability and statistics for particle physics

    CERN Document Server

    Mana, Carlos

    2017-01-01

    This book comprehensively presents the basic concepts of probability and Bayesian inference with sufficient generality to make them applicable to current problems in scientific research. The first chapter provides the fundamentals of probability theory that are essential for the analysis of random phenomena. The second chapter includes a full and pragmatic review of the Bayesian methods that constitute a natural and coherent framework, with enough freedom to analyze all the information available from experimental data in a conceptually simple manner. The third chapter presents the basic Monte Carlo techniques used in scientific research, allowing a large variety of problems that are difficult to tackle by other procedures to be handled. The author also introduces a basic algorithm, which enables readers to simulate samples from simple distributions, and describes useful cases for researchers in particle physics. The final chapter is devoted to the basic ideas of Information Theory, which are important in the Bayesian me...

  4. [Physical activity and cancer survival].

    Science.gov (United States)

    Romieu, Isabelle; Touillaud, Marina; Ferrari, Pietro; Bignon, Yves-Jean; Antoun, Sami; Berthouze-Aranda, Sophie; Bachmann, Patrick; Duclos, Martine; Ninot, Grégory; Romieu, Gilles; Sénesse, Pierre; Behrendt, Jan; Balosso, Jacques; Pavic, Michel; Kerbrat, Pierre; Serin, Daniel; Trédan, Olivier; Fervers, Béatrice

    2012-10-01

    Physical activity has been shown in large cohort studies to positively impact survival in cancer survivors. Existing randomized controlled trials showed a beneficial effect of physical activity on physical fitness, quality of life, anxiety and self-esteem; however, the small sample sizes, the short follow-up and the lack of standardization of the physical activity interventions across studies precluded definite conclusions in terms of survival. Physical activity reduces adiposity and circulating estrogen levels and increases insulin sensitivity, among other effects. A workshop was conducted at the International Agency for Research on Cancer in April 2011 to discuss the role of physical activity in cancer survival and the methodology for developing multicentre randomized intervention trials, including the type of physical activity to implement and its association with nutritional recommendations. The authors discuss the beneficial effect of physical activity on cancer survival with a main focus on breast cancer and report the conclusions from this workshop.

  5. Probability density of quantum expectation values

    Science.gov (United States)

    Campos Venuti, L.; Zanardi, P.

    2013-10-01

    We consider the quantum expectation value <A> = <ψ|A|ψ> of an observable A over the state |ψ>. We derive the exact probability distribution of <A> seen as a random variable when |ψ> varies over the set of all pure states equipped with the Haar-induced measure. To illustrate our results we compare the exact predictions for a few concrete examples with the concentration bounds obtained using Levy's lemma. We also comment on the relevance of the central limit theorem and finally draw some results on an alternative statistical mechanics based on the uniform measure on the energy shell.

  6. Foreign Ownership and Long-term Survival

    DEFF Research Database (Denmark)

    Kronborg, Dorte; Thomsen, Steen

    2006-01-01

    Does foreign ownership enhance or decrease a firm's chances of survival? Over the 100 year period 1895-2001 this paper compares the survival of foreign subsidiaries in Denmark to a control sample matched by industry and firm size. We find that foreign-owned companies have higher survival...... probability. On average exit risk for domestic companies is 2.3 times higher than for foreign companies. First movers like Siemens, Philips, Kodak, Ford, GM or Goodyear have been active in the country for almost a century. Relative foreign survival increases with company age. However, the foreign survival...

  7. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  8. Flu Shots, Mammogram, and the Perception of Probabilities

    NARCIS (Netherlands)

    Carman, K.G.; Kooreman, P.

    2010-01-01

    We study individuals’ decisions to decline or accept preventive health care interventions such as flu shots and mammograms. In particular, we analyze the role of perceptions of the effectiveness of the intervention, by eliciting individuals' subjective probabilities of sickness and survival, with

  9. Maximum-entropy probability distributions under Lp-norm constraints

    Science.gov (United States)

    Dolinar, S.

    1991-01-01

    Continuous probability density functions and discrete probability mass functions are tabulated which maximize the differential entropy or absolute entropy, respectively, among all probability distributions with a given L_p norm (i.e., a given pth absolute moment when p is a finite integer) and unconstrained or constrained value set. Expressions for the maximum entropy are evaluated as functions of the L_p norm. The most interesting results are obtained and plotted for unconstrained (real valued) continuous random variables and for integer valued discrete random variables. The maximum entropy expressions are obtained in closed form for unconstrained continuous random variables, and in this case there is a simple straight line relationship between the maximum differential entropy and the logarithm of the L_p norm. Corresponding expressions for arbitrary discrete and constrained continuous random variables are given parametrically; closed form expressions are available only for special cases. However, simpler alternative bounds on the maximum entropy of integer valued discrete random variables are obtained by applying the differential entropy results to continuous random variables which approximate the integer valued random variables in a natural manner. All the results are presented in an integrated framework that includes continuous and discrete random variables, constraints on the permissible value set, and all possible values of p. Understanding such as this is useful in evaluating the performance of data compression schemes.
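    For the unconstrained continuous case, the closed form referred to is the generalized Gaussian family (stated here as a standard illustration): among densities with a given pth absolute moment, differential entropy is maximized by

```latex
f(x) = \frac{p}{2a\,\Gamma(1/p)}\, e^{-|x/a|^{p}},
\qquad
h_{\max} = \frac{1}{p} + \ln\frac{2a\,\Gamma(1/p)}{p},
```

    so the maximum differential entropy is linear in ln a, i.e., in the logarithm of the L_p norm, which is the straight-line relationship mentioned above.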

  10. Relief for surviving relatives following a suicide.

    NARCIS (Netherlands)

    Oud, MJT; de Groot, MH

    2006-01-01

    Relief for surviving relatives following a suicide. - After the suicide of a 43-year-old woman with known depression, a 41-year-old paraplegic man who recently developed diarrhoea and a 41-year-old woman with probable depression with symptoms of psychosis, the general practitioners of the surviving

  11. Introduction to probability and measure

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    According to a remark attributed to Mark Kac, 'Probability Theory is a measure theory with a soul'. This book, with its choice of proofs, remarks, examples and exercises, has been prepared taking both these aesthetic and practical aspects into account.

  12. Considerations on a posteriori probability

    Directory of Open Access Journals (Sweden)

    Corrado Gini

    2015-06-01

    In this first paper of 1911, relating to the sex ratio at birth, Gini recast Laplace's rule of succession in a Bayesian version. Gini's intuition consisted in assuming a Beta-type distribution for the prior probability and introducing the "method of results (direct and indirect)" for determining prior probabilities from the statistical frequencies obtained from statistical data.
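    In modern notation (symbols assumed), the construction is the conjugate Beta-binomial update: with prior p ~ Beta(a, b) and k successes observed in n trials,

```latex
p \mid k \sim \mathrm{Beta}(a + k,\; b + n - k),
\qquad
\Pr(\text{success on trial } n+1 \mid k) = \frac{a + k}{a + b + n},
```

    which for the uniform prior a = b = 1 reduces to Laplace's rule of succession, (k + 1)/(n + 2).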

  13. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    A statistical procedure for estimating Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. The influence of the safety factor is also investigated. The DECOFF statistical method is benchmarked against the standard Alpha-factor method defined by DNV (2011) and model performance is evaluated. The effects that weather forecast uncertainty has on the output Probabilities of Failure are also analysed and reported....

  14. Palliative radiotherapy in addition to self-expanding metal stent for improving dysphagia and survival in advanced oesophageal cancer (ROCS: Radiotherapy after Oesophageal Cancer Stenting): study protocol for a randomized controlled trial.

    Science.gov (United States)

    Adamson, Douglas; Blazeby, Jane; Nelson, Annmarie; Hurt, Chris; Nixon, Lisette; Fitzgibbon, Jim; Crosby, Tom; Staffurth, John; Evans, Mim; Kelly, Noreen Hopewell; Cohen, David; Griffiths, Gareth; Byrne, Anthony

    2014-10-22

    The single most distressing symptom for patients with advanced esophageal cancer is dysphagia. Amongst the more effective treatments for relief of dysphagia is insertion of a self-expanding metal stent (SEMS). It is possible that the addition of a palliative dose of external beam radiotherapy may prolong the relief of dysphagia and provide additional survival benefit. The ROCS trial will assess the effect of adding palliative radiotherapy after esophageal stent insertion. The study is a randomized multicenter phase III trial, with an internal pilot phase, comparing stent alone versus stent plus palliative radiotherapy in patients with incurable esophageal cancer. Eligible participants are those with advanced esophageal cancer who are in need of stent insertion for primary management of dysphagia. Radiotherapy will be administered as 20 Gray (Gy) in five fractions over one week or 30 Gy in 10 fractions over two weeks, within four weeks of stent insertion. The internal pilot will assess rates and methods of recruitment; pre-agreed criteria will determine progression to the main trial. In total, 496 patients will be randomized in a 1:1 ratio with follow up until death. The primary outcome is time to progression of patient-reported dysphagia. Secondary outcomes include survival, toxicity, health resource utilization, and quality of life. An embedded qualitative study will explore the feasibility of patient recruitment by examining patients' motivations for involvement and their experiences of consent and recruitment, including reasons for not consenting. It will also explore patients' experiences of each trial arm. The ROCS study will be a challenging trial studying palliation in patients with a poor prognosis. The internal pilot design will optimize methods for recruitment and data collection to ensure that the main trial is completed on time. As a pragmatic trial, study strengths include collection of all follow-up data in the usual place of care, and a focus on

  15. Measuring survival rates from sudden cardiac arrest: the elusive definition.

    Science.gov (United States)

    Sayre, Michael R; Travers, Andrew H; Daya, Mohamud; Greene, H Leon; Salive, Marcel E; Vijayaraghavan, Krishnaswami; Craven, Richard A; Groh, William J; Hallstrom, Alfred P

    2004-07-01

    Measuring survival from sudden out-of-hospital cardiac arrest (OOH-CA) is often used as a benchmark of the quality of a community's emergency medical service (EMS) system. The definition of OOH-CA survival rates depends upon both the numerator (surviving cases) and the denominator (all cases). The purpose of the public access defibrillation (PAD) trial was to measure the impact on survival of adding an automated external defibrillator (AED) to a volunteer response system trained in CPR. This paper reports the definition of OOH-CA developed by the PAD trial investigators, and it evaluates alternative statistical methods used to assess differences in reported "survival." Case surveillance was limited to the prospectively determined geographic boundaries of the participating trial units. The numerator in calculating a survival rate should include only those patients who survived an event but who otherwise would have died except for the application of some facet of emergency medical care; in this trial, a defibrillatory shock. Among denominators considered were: total population of the study unit, all deaths within the study unit, and documented ventricular fibrillation cardiac arrests. The PAD classification focused upon cases that might have benefited from the early use of an AED, in addition to the likely benefit from early recognition of OOH-CA, early access to EMS, and early cardiopulmonary resuscitation (CPR). Results of this classification system were used to evaluate the impact of the PAD definition on the distribution of cardiac arrest case types between CPR only and CPR + AED units. Potential OOH-CA episodes were classified into one of four groups: definite, probable, uncertain, or not an OOH-CA. About half of cardiac arrests in the PAD units were judged to be definite OOH-CA events and therefore potentially treatable with an AED. However, events that occurred in CPR-only units were less likely to be classified as definite or probable OOH-CA events than those

  16. Transition Probabilities of Gd I

    Science.gov (United States)

    Bilty, Katherine; Lawler, J. E.; Den Hartog, E. A.

    2011-01-01

    Rare earth transition probabilities are needed within the astrophysics community to determine rare earth abundances in stellar photospheres. The current work is part of an ongoing study of rare earth element neutrals. Transition probabilities are determined by combining radiative lifetimes, measured using time-resolved laser-induced fluorescence on a slow atom beam, with branching fractions measured from high-resolution Fourier transform spectra. Neutral rare earth transition probabilities will be helpful in improving abundances in cool stars in which a significant fraction of rare earths are neutral. Transition probabilities are also needed for research and development in the lighting industry. Rare earths have rich spectra containing hundreds to thousands of transitions throughout the visible and near UV. This makes rare earths valuable additives in Metal Halide - High Intensity Discharge (MH-HID) lamps, giving them a pleasing white light with good color rendering. This poster presents the work done on neutral gadolinium. We report radiative lifetimes for 135 levels and transition probabilities for upwards of 1500 lines of Gd I. The lifetimes are reported to ±5%, and the transition probability uncertainties range from 5% for strong lines to 25% for weak lines. This work is supported by the National Science Foundation under grant CTS 0613277 and the National Science Foundation's REU program through NSF Award AST-1004881.

  17. Estimating true instead of apparent survival using spatial Cormack-Jolly-Seber models

    Science.gov (United States)

    Schaub, Michael; Royle, J. Andrew

    2014-01-01

    Survival is often estimated from capture–recapture data using Cormack–Jolly–Seber (CJS) models, where mortality and emigration cannot be distinguished, and the estimated apparent survival probability is the product of the probabilities of true survival and of study area fidelity. Consequently, apparent survival is lower than true survival unless study area fidelity equals one. Underestimation of true survival from capture–recapture data is a main limitation of the method.
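    In symbols, the identity underlying the problem (standard CJS notation):

```latex
\phi = S \times F,
```

    where φ is apparent survival, S true survival, and F study-area fidelity; φ underestimates S whenever F < 1, and the spatial CJS models are designed to separate S from F.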

  18. Universal randomness

    Energy Technology Data Exchange (ETDEWEB)

    Dotsenko, Viktor S [Landau Institute for Theoretical Physics, Russian Academy of Sciences, Moscow (Russian Federation)

    2011-03-31

    In the last two decades, it has been established that a single universal probability distribution function, known as the Tracy-Widom (TW) distribution, in many cases provides a macroscopic-level description of the statistical properties of microscopically different systems, including both purely mathematical ones, such as increasing subsequences in random permutations, and quite physical ones, such as directed polymers in random media or polynuclear crystal growth. In the first part of this review, we use a number of models to examine this phenomenon at a simple qualitative level and then consider the exact solution for one-dimensional directed polymers in a random environment, showing that free energy fluctuations in such a system are described by the universal TW distribution. The second part provides detailed appendix material containing the necessary mathematical background for the first part. (reviews of topical problems)

  19. Sampling Random Bioinformatics Puzzles using Adaptive Probability Distributions

    DEFF Research Database (Denmark)

    Have, Christian Theil; Appel, Emil Vincent; Bork-Jensen, Jette

    2016-01-01

    We present a probabilistic logic program to generate an educational puzzle that introduces the basic principles of next generation sequencing, gene finding and the translation of genes to proteins following the central dogma in biology. In the puzzle, a secret "protein word" must be found by asse...

  20. Theory of overdispersion in counting statistics caused by fluctuating probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Semkow, Thomas M. E-mail: semkow@wadsworth.org

    1999-11-01

    It is shown that the random Lexis fluctuations of probabilities such as probability of decay or detection cause the counting statistics to be overdispersed with respect to the classical binomial, Poisson, or Gaussian distributions. The generating and the distribution functions for the overdispersed counting statistics are derived. Applications to radioactive decay with detection and more complex experiments are given, as well as distinguishing between the source and background, in the presence of overdispersion. Monte-Carlo verifications are provided.
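
    A quick Monte Carlo illustration of the effect described here: letting the Poisson rate itself fluctuate between measurements (a gamma-mixed, i.e. negative-binomial, setup chosen for convenience, not the paper's exact model) makes the counts overdispersed relative to a plain Poisson:

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000
        mu = 50.0

        # Classical Poisson counts: variance ~ mean.
        poisson_counts = rng.poisson(mu, size=n)

        # Lexis-type fluctuations: the rate itself varies between measurements
        # (here a gamma-distributed rate, giving a negative-binomial mixture).
        rates = rng.gamma(shape=25.0, scale=mu / 25.0, size=n)  # mean mu
        mixed_counts = rng.poisson(rates)

        print("Poisson: mean %.1f, var %.1f" % (poisson_counts.mean(), poisson_counts.var()))
        print("mixed:   mean %.1f, var %.1f (overdispersed)" % (mixed_counts.mean(), mixed_counts.var()))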

  1. Theory of overdispersion in counting statistics caused by fluctuating probabilities

    CERN Document Server

    Semkow, T M

    1999-01-01

    It is shown that the random Lexis fluctuations of probabilities such as probability of decay or detection cause the counting statistics to be overdispersed with respect to the classical binomial, Poisson, or Gaussian distributions. The generating and the distribution functions for the overdispersed counting statistics are derived. Applications to radioactive decay with detection and more complex experiments are given, as well as distinguishing between the source and background, in the presence of overdispersion. Monte-Carlo verifications are provided.

  2. Information-theoretic methods for estimating of complicated probability distributions

    CERN Document Server

    Zong, Zhi

    2006-01-01

    Mixing various disciplines frequently produces something profound and far-reaching. Cybernetics is such an often-quoted example. The mix of information theory, statistics and computing technology has proved very useful, leading to the recent development of information-theory based methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is the fundamental task for quite a few fields besides statistics, such as reliability, probabilistic risk analysis (PSA), machine learning, pattern recognition, image processing, neur

  3. Probability, statistics, and reliability for engineers and scientists

    CERN Document Server

    Ayyub, Bilal M

    2012-01-01

    Introduction: Knowledge, Information, and Opinions; Ignorance and Uncertainty; Aleatory and Epistemic Uncertainties in System Abstraction; Characterizing and Modeling Uncertainty; Simulation for Uncertainty Analysis and Propagation; Simulation Projects. Data Description and Treatment: Introduction; Classification of Data; Graphical Description of Data; Histograms and Frequency Diagrams; Descriptive Measures; Applications; Analysis of Simulated Data; Simulation Projects. Fundamentals of Probability: Introduction; Sets, Sample Spaces, and Events; Mathematics of Probability; Random Variables and Their Proba

  4. Innovations’ Survival

    Directory of Open Access Journals (Sweden)

    Jakub Tabas

    2016-01-01

    Full Text Available Innovations currently represent a tool for maintaining the going concern of a business entity and its competitiveness. However, the effects of innovations are not infinite, and if innovation is to constantly preserve the life of a business entity, it has to be a continual chain of innovations, i.e. a continual process. The effective life of a single innovation is limited, and the limitation derives especially from the industry. The paper provides the results of research on the effects of innovations on the financial performance of small and medium-sized enterprises in the Czech Republic. The objective of this paper is to determine the length and intensity of the effects of technical innovations on a company's financial performance. The economic effect of innovations has been measured by applying the company's gross production power, with Deviation Analysis applied to three-year time series. Subsequently, Survival Analysis has been applied. The analyses are elaborated for three statistical samples of SMEs constructed according to industry. The results obtained show significant differences in innovations' survival among these three samples of enterprises. The results are quite specific to the industries, and are confronted and discussed with the results of the authors' former research on the issue.

  5. Elements of probability and statistics an introduction to probability with De Finetti’s approach and to Bayesian statistics

    CERN Document Server

    Biagini, Francesca

    2016-01-01

    This book provides an introduction to elementary probability and to Bayesian statistics using de Finetti's subjectivist approach. One of the features of this approach is that it does not require the introduction of sample space, a non-intrinsic concept that makes the treatment of elementary probability unnecessarily complicated, but introduces as fundamental the concept of random numbers directly related to their interpretation in applications. Events become a particular case of random numbers and probability a particular case of expectation when it is applied to events. The subjective evaluation of expectation and of conditional expectation is based on an economic choice of an acceptable bet or penalty. The properties of expectation and conditional expectation are derived by applying a coherence criterion that the evaluation has to follow. The book is suitable for all introductory courses in probability and statistics for students in Mathematics, Informatics, Engineering, and Physics.

  6. Visualization techniques for spatial probability density function data

    Directory of Open Access Journals (Sweden)

    Udeepta D Bordoloi

    2006-01-01

    Full Text Available Novel visualization methods are presented for spatial probability density function data. These are spatial datasets in which each pixel is a random variable with multiple samples, the results of experiments on that random variable. We use clustering as a means to reduce the information contained in these datasets, and present two different ways of interpreting and clustering the data. The clustering methods are used on two datasets, and the results are discussed with the help of visualization techniques designed for the spatial probability data.
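
    A minimal sketch of the clustering idea, assuming synthetic data and using a per-pixel histogram as the reduced representation (the paper's own interpretations of the data may differ):

        import numpy as np
        from scipy.cluster.vq import kmeans2

        rng = np.random.default_rng(0)

        # Toy spatial PDF data: a 32x32 grid where each pixel holds 200 samples
        # of its own random variable (shapes and values are assumptions).
        samples = rng.normal(loc=rng.uniform(-1, 1, (32, 32, 1)),
                             scale=0.3, size=(32, 32, 200))

        # Summarise each pixel's empirical distribution by a small histogram...
        bins = np.linspace(-2, 2, 9)
        features = np.stack([np.histogram(px, bins=bins, density=True)[0]
                             for px in samples.reshape(-1, 200)])

        # ...then cluster pixels with similar distributions to reduce information.
        centroids, labels = kmeans2(features, k=4, minit='points')
        label_image = labels.reshape(32, 32)
        print(label_image[:4, :4])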

  7. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the ground work to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, Erdos-Kac invariance principle, functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a textbook for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  8. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  9. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criteria. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of a vibration isolation system exceeding the vibration criteria is evaluated. Optimal system parameters, damping and natural frequency, are then derived such that the probability of exceeding vibration criteria VC-E and VC-D is less than 0.04.
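
    Under the Gaussian assumption used here, the chance of exceeding a criterion is a simple tail probability. A sketch with assumed numbers, not the article's actual system parameters:

        from scipy.stats import norm

        # If the relative displacement is zero-mean Gaussian with RMS sigma
        # (determined by the system's damping and natural frequency), the
        # chance of exceeding a vibration criterion is a two-sided tail area.
        sigma_um = 0.08       # assumed RMS displacement, micrometres
        criterion_um = 0.20   # assumed criterion (e.g. a VC-type limit)

        p_exceed = 2.0 * norm.sf(criterion_um / sigma_um)
        print(f"P(exceed criterion) = {p_exceed:.4f}")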

  10. Flood hazard probability mapping method

    Science.gov (United States)

    Kalantari, Zahra; Lyon, Steve; Folkeson, Lennart

    2015-04-01

    In Sweden, spatially explicit approaches have been applied in various disciplines such as landslide modelling based on soil type data and flood risk modelling for large rivers. Regarding flood mapping, most previous studies have focused on complex hydrological modelling on a small scale whereas just a few studies have used a robust GIS-based approach integrating most physical catchment descriptor (PCD) aspects on a larger scale. The aim of the present study was to develop a methodology for predicting the spatial probability of flooding on a general large scale. Factors such as topography, land use, soil data and other PCDs were analysed in terms of their relative importance for flood generation. The specific objective was to test the methodology using statistical methods to identify factors having a significant role in controlling flooding. A second objective was to generate an index quantifying the flood probability value for each cell, based on different weighted factors, in order to provide a more accurate analysis of potential high flood hazards than can be obtained using just a single variable. The ability of indicator covariance to capture flooding probability was determined for different watersheds in central Sweden. Using data from this initial investigation, a method to extract spatial data for multiple catchments and to produce soft data for statistical analysis was developed. It allowed flood probability to be predicted from spatially sparse data without compromising the significant hydrological features on the landscape. By using PCD data, realistic representations of high-probability flood regions were made, regardless of the magnitude of rain events. This in turn allowed objective quantification of the probability of floods at the field scale for future model development and watershed management.
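
    A sketch of the weighted-index idea: combine rescaled PCD layers into a per-cell flood probability index. Layer names, values, and weights below are illustrative assumptions, not the study's fitted results:

        import numpy as np

        rng = np.random.default_rng(1)

        # Toy physical catchment descriptors on a grid, rescaled to [0, 1].
        wetness  = rng.random((50, 50))
        land_use = rng.random((50, 50))
        soil     = rng.random((50, 50))

        weights = {"wetness": 0.5, "land_use": 0.3, "soil": 0.2}
        flood_index = (weights["wetness"] * wetness
                       + weights["land_use"] * land_use
                       + weights["soil"] * soil)

        print("cells in top decile of flood probability:",
              int((flood_index > np.quantile(flood_index, 0.9)).sum()))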

  11. Incompatible Stochastic Processes and Complex Probabilities

    Science.gov (United States)

    Zak, Michail

    1997-01-01

    The definition of conditional probabilities is based upon the existence of a joint probability. However, a reconstruction of the joint probability from given conditional probabilities imposes certain constraints upon the latter, so that if several conditional probabilities are chosen arbitrarily, the corresponding joint probability may not exist.
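
    A tiny worked example of the constraint mentioned here: for two events, Bayes' identity P(A|B)P(B) = P(B|A)P(A) fixes P(B) once P(A|B), P(B|A) and P(A) are chosen, and arbitrary choices can force an impossible value:

        # Check whether an implied joint probability can exist for two events.
        def implied_p_b(p_a_given_b, p_b_given_a, p_a):
            return p_b_given_a * p_a / p_a_given_b

        for p_a in (0.05, 0.2):
            p_b = implied_p_b(p_a_given_b=0.1, p_b_given_a=0.8, p_a=p_a)
            ok = 0.0 <= p_b <= 1.0
            print(f"P(A)={p_a}: implied P(B)={p_b:.2f}  joint exists: {ok}")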

  12. Probability measures on metric spaces

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    In this book, the author gives a cohesive account of the theory of probability measures on complete metric spaces (which is viewed as an alternative approach to the general theory of stochastic processes). After a general description of the basics of topology on the set of measures, the author discusses regularity, tightness, and perfectness of measures, properties of sampling distributions, and metrizability and compactness theorems. Next, he describes arithmetic properties of probability measures on metric groups and locally compact abelian groups. Covered in detail are notions such as decom

  13. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  14. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  15. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  16. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  17. Asymptotic Theory for the Probability Density Functions in Burgers Turbulence

    CERN Document Server

    Weinan, E; Eijnden, Eric Vanden

    1999-01-01

    A rigorous study is carried out for the randomly forced Burgers equation in the inviscid limit. No closure approximations are made. Instead the probability density functions of velocity and velocity gradient are related to the statistics of quantities defined along the shocks. This method allows one to compute the anomalies, as well as asymptotics for the structure functions and the probability density functions. It is shown that the left tail for the probability density function of the velocity gradient has to decay faster than $|\\xi|^{-3}$. A further argument confirms the prediction of E et al., Phys. Rev. Lett. {\\bf 78}, 1904 (1997), that it should decay as $|\\xi|^{-7/2}$.

  18. Introduction to probability and stochastic processes with applications

    CERN Document Server

    Castañ, Blanco; Arunachalam, Viswanathan; Dharmaraja, Selvamuthu

    2012-01-01

    An easily accessible, real-world approach to probability and stochastic processes Introduction to Probability and Stochastic Processes with Applications presents a clear, easy-to-understand treatment of probability and stochastic processes, providing readers with a solid foundation they can build upon throughout their careers. With an emphasis on applications in engineering, applied sciences, business and finance, statistics, mathematics, and operations research, the book features numerous real-world examples that illustrate how random phenomena occur in nature and how to use probabilistic t

  19. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  20. Exact Probability Distribution versus Entropy

    Directory of Open Access Journals (Sweden)

    Kerstin Andersson

    2014-10-01

    Full Text Available The problem addressed concerns the determination of the average number of successive attempts at guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
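
    A small self-contained version of the computation for a toy first-order language (letter probabilities are assumptions): enumerate the words, sort them by decreasing probability, and average the rank:

        import itertools
        import numpy as np

        # Toy 3-letter alphabet with assumed letter probabilities.
        p = np.array([0.6, 0.3, 0.1])
        word_len = 4

        # Probability of each word is the product of its letter probabilities.
        words = list(itertools.product(range(3), repeat=word_len))
        probs = np.array([np.prod(p[list(w)]) for w in words])

        # Guess words in decreasing order of probability; the average number
        # of guesses is sum_i rank_i * p_i.
        order = np.argsort(-probs)
        ranks = np.empty_like(order)
        ranks[order] = np.arange(1, len(words) + 1)
        print("average number of guesses:", float((ranks * probs).sum()))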

  1. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords: Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  2. Probability and statistics: A reminder

    Science.gov (United States)

    Clément, Benoit

    2013-07-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from "data analysis in experimental sciences" given in [1].

  3. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  4. Probability and statistics: A reminder

    OpenAIRE

    Clément Benoit

    2013-01-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from “data analysis in experimental sciences” given in [1].

  5. Ergodicity of Random Walks on Random DFA

    OpenAIRE

    Balle, Borja

    2013-01-01

    Given a DFA we consider the random walk that starts at the initial state and at each time step moves to a new state by taking a random transition from the current state. This paper shows that for typical DFA this random walk induces an ergodic Markov chain. The notion of typical DFA is formalized by showing that ergodicity holds with high probability when a DFA is sampled uniformly at random from the set of all automata with a fixed number of states. We also show the same result applies to DF...

  6. Quantifying extinction probabilities from sighting records: inference and uncertainties.

    Directory of Open Access Journals (Sweden)

    Peter Caley

    Full Text Available Methods are needed to estimate the probability that a population is extinct, whether to underpin decisions regarding the continuation of an invasive species eradication program, or to decide whether further searches for a rare and endangered species could be warranted. Current models for inferring extinction probability based on sighting data typically assume a constant or declining sighting rate. We develop methods to analyse these models in a Bayesian framework to estimate detection and survival probabilities of a population conditional on sighting data. We note, however, that the assumption of a constant or declining sighting rate may be hard to justify, especially for incursions of invasive species with potentially positive population growth rates. We therefore explored introducing additional process complexity via density-dependent survival and detection probabilities, with population density no longer constrained to be constant or decreasing. These models were applied to sparse carcass discoveries associated with the recent incursion of the European red fox (Vulpes vulpes into Tasmania, Australia. While a simple model provided apparently precise estimates of parameters and extinction probability, estimates arising from the more complex model were much more uncertain, with the sparse data unable to clearly resolve the underlying population processes. The outcome of this analysis was a much higher possibility of population persistence. We conclude that if it is safe to assume detection and survival parameters are constant, then existing models can be readily applied to sighting data to estimate extinction probability. If not, methods reliant on these simple assumptions are likely overstating their accuracy, and their use to underpin decision-making is potentially fraught. Instead, researchers will need to more carefully specify priors about possible population processes.
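
    For the simplest constant-detection model mentioned here, the extinction probability after k sighting-free surveys has a closed form. A minimal Bayesian sketch (prior and detection probability are assumed values; the paper's density-dependent models are more involved):

        # If the population is extant, each of k survey years independently
        # fails to produce a sighting with probability (1 - p); extinction
        # explains the absence of sightings with probability 1.
        def p_extinct(prior, p_detect, k_years):
            like_extant = (1.0 - p_detect) ** k_years
            return prior / (prior + (1.0 - prior) * like_extant)

        for k in (1, 3, 5, 10):
            print(k, round(p_extinct(prior=0.5, p_detect=0.4, k_years=k), 3))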

  7. On the tail of the overlap probability distribution in the Sherrington-Kirkpatrick model

    CERN Document Server

    Billoire, A; Marinari, E

    2003-01-01

    We investigate the large deviation behaviour of the overlap probability density in the Sherrington-Kirkpatrick (SK) model using the coupled replica scheme, and we compare with the results of a large-scale numerical simulation. In the spin glass phase we show that, generically, for any model with continuous replica symmetry breaking (RSB), $\frac{1}{N}\log P_N(q) \approx -A(|q| - q_{EA})^3$, and we compute the first correction to the expansion of $A$ in powers of $T_c - T$ for the SK model. We also study the paramagnetic phase, where results are obtained in the replica symmetric scheme that do not involve an expansion in powers of $q - q_{EA}$ or $T_c - T$. Finally we give precise semi-analytical estimates of $P(|q| = 1)$. The overall agreement between the various points of view is very satisfactory.

  8. A standardized randomized 6-month aerobic exercise-training down-regulated pro-inflammatory genes, but up-regulated anti-inflammatory, neuron survival and axon growth-related genes.

    Science.gov (United States)

    Iyalomhe, Osigbemhe; Chen, Yuanxiu; Allard, Joanne; Ntekim, Oyonumo; Johnson, Sheree; Bond, Vernon; Goerlitz, David; Li, James; Obisesan, Thomas O

    2015-09-01

    There is considerable support for the view that aerobic exercise may confer cognitive benefits to mild cognitively impaired elderly persons. However, the biological mechanisms mediating these effects are not entirely clear. As a preliminary step towards filling this gap in knowledge, we enrolled older adults confirmed to have mild cognitive impairment (MCI) in a 6-month exercise program. Male and female subjects were randomized into a 6-month program of either aerobic or stretch (control) exercise. Data collected from the first 10 completers, aerobic exercise (n=5) or stretch (control) exercise (n=5), were used to determine intervention-induced changes in the global gene expression profiles of the aerobic and stretch groups. Using microarray, we identified genes with altered expression (relative to baseline values) in response to the 6-month exercise intervention. Genes whose expression was altered by at least two-fold, and met the p-value cutoff of 0.01, were inputted into the Ingenuity Pathway Knowledge Base Library to generate gene-interaction networks. After 6 months of aerobic exercise training, genes promoting inflammation became down-regulated, whereas genes having anti-inflammatory properties and those modulating immune function or promoting neuron survival and axon growth became up-regulated (all fold change ≥ ±2.0, p ≤ 0.01) in the aerobic program as opposed to the stretch group. We conclude that three distinct cellular pathways may collectively influence the training effects of aerobic exercise in MCI subjects. We plan to confirm these effects using rt-PCR and correlate such changes with the cognitive phenotype. Copyright © 2015. Published by Elsevier Inc.

  9. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  10. Probability, Information and Statistical Physics

    Science.gov (United States)

    Kuzemsky, A. L.

    2016-03-01

    In this short survey review we discuss foundational issues of the probabilistic approach to information theory and statistical mechanics from a unified standpoint. Emphasis is on the inter-relations between theories. The basic aim is tutorial, i.e. to give a basic introduction to the analysis and applications of probabilistic concepts in the description of various aspects of complexity and stochasticity. We consider probability as a foundational concept in statistical mechanics and review selected advances in the theoretical understanding of the interrelation of probability, information and statistical description with regard to basic notions of statistical mechanics of complex systems. It also includes a synthesis of past and present research and a survey of methodology. The purpose of this terse overview is to discuss and partially describe those probabilistic methods and approaches that are used in statistical mechanics, with the aim of making these ideas easier to understand and apply.

  11. Comments on quantum probability theory.

    Science.gov (United States)

    Sloman, Steven

    2014-01-01

    Quantum probability theory (QP) is the best formal representation available of the most common form of judgment involving attribute comparison (inside judgment). People are capable, however, of judgments that involve proportions over sets of instances (outside judgment). Here, the theory does not do so well. I discuss the theory both in terms of descriptive adequacy and normative appropriateness. Copyright © 2013 Cognitive Science Society, Inc.

  12. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  13. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

    compensation (a $10 lottery) on Amazon Mechanical Turk, an online platform hosted on Amazon.com [31]. All of the participants stated that they were native...

  14. Probability and statistics: A reminder

    Directory of Open Access Journals (Sweden)

    Clément Benoit

    2013-07-01

    Full Text Available The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from “data analysis in experimental sciences” given in [1].

  15. Probability, Statistics, and Computational Science

    OpenAIRE

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient infe...

  16. Two-slit experiment: quantum and classical probabilities

    Science.gov (United States)

    Khrennikov, Andrei

    2015-06-01

    Inter-relation between quantum and classical probability models is one of the most fundamental problems of quantum foundations. Nowadays this problem also plays an important role in quantum technologies, in quantum cryptography and the theory of quantum random generators. In this letter, we compare the viewpoint of Richard Feynman that the behavior of quantum particles cannot be described by classical probability theory with the viewpoint that quantum-classical inter-relation is more complicated (cf., in particular, the tomographic model of quantum mechanics developed in detail by Vladimir Man'ko). As a basic example, we consider the two-slit experiment, which played a crucial role in quantum foundational debates at the beginning of quantum mechanics (QM). In particular, its analysis led Niels Bohr to the formulation of the principle of complementarity. First, we demonstrate that in complete accordance with Feynman's viewpoint, the probabilities for the two-slit experiment have the non-Kolmogorovian structure, since they violate one of the basic laws of classical probability theory, the law of total probability (the heart of the Bayesian analysis). However, then we show that these probabilities can be embedded in a natural way into the classical (Kolmogorov, 1933) probability model. To do this, one has to take into account the randomness of selection of different experimental contexts, the joint consideration of which led Feynman to a conclusion about the non-classicality of quantum probability. We compare this embedding of non-Kolmogorovian quantum probabilities into the Kolmogorov model with well-known embeddings of non-Euclidean geometries into Euclidean space (e.g., the Poincaré disk model for the Lobachevsky plane).
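
    The violation of the law of total probability discussed here can be written down in a few lines: with both slits open, an interference term is added to the classical mixture. Numbers below are illustrative assumptions:

        import numpy as np

        # Quantum two-slit probabilities at a detector position x: with both
        # slits open an interference term appears, so in general
        # P(x) != P(x|slit 1)P(slit 1) + P(x|slit 2)P(slit 2).
        p1, p2 = 0.5, 0.5      # probability of passing slit 1 or 2
        p_x_given_1 = 0.10     # assumed detection probabilities at x
        p_x_given_2 = 0.10
        phase = np.pi / 3      # relative phase of the two amplitudes

        classical = p1 * p_x_given_1 + p2 * p_x_given_2
        quantum = classical + 2 * np.sqrt(p1 * p_x_given_1 * p2 * p_x_given_2) * np.cos(phase)
        print(f"law of total probability: {classical:.3f}, quantum: {quantum:.3f}")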

  17. Log-concave Probability Distributions: Theory and Statistical Testing

    DEFF Research Database (Denmark)

    An, Mark Yuing

    1996-01-01

    This paper studies the broad class of log-concave probability distributions that arise in economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing the differentiability of density functions. Discrete...

  18. On Field Size and Success Probability in Network Coding

    DEFF Research Database (Denmark)

    Geil, Hans Olav; Matsumoto, Ryutaroh; Thomsen, Casper

    2008-01-01

    Using tools from algebraic geometry and Gröbner basis theory we solve two problems in network coding. First we present a method to determine the smallest field size for which linear network coding is feasible. Second we derive improved estimates on the success probability of random linear network...

  19. Flipping Out: Calculating Probability with a Coin Game

    Science.gov (United States)

    Degner, Kate

    2015-01-01

    In the author's experience with this activity, students struggle with the idea of representativeness in probability. Therefore, this student misconception is part of the classroom discussion about the activities in this lesson. Representativeness is related to the (incorrect) idea that outcomes that seem more random are more likely to happen. This…

  20. Entropy in probability and statistics

    Energy Technology Data Exchange (ETDEWEB)

    Rolke, W.A.

    1992-01-01

    The author develops a theory of entropy, where entropy is defined as the Legendre-Fenchel transform of the logarithmic moment generating function of a probability measure on a Banach space. A variety of properties relating the probability measure and its entropy are proven. It is shown that the entropy of a large class of stochastic processes can be approximated by the entropies of the finite-dimensional distributions of the process. For several types of measures the author finds explicit formulas for the entropy, for example for stochastic processes with independent increments and for Gaussian processes. For the entropy of Markov chains, evaluated at the observations of the process, the author proves a central limit theorem. Theorems relating weak convergence of probability measures on a finite dimensional space and pointwise convergence of their entropies are developed and then used to give a new proof of Donsker's theorem. Finally the use of entropy in statistics is discussed. The author shows the connection between entropy and Kullback's minimum discrimination information. A central limit theorem yields a test for the independence of a sequence of observations.
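
    A numerical illustration of the central definition: the entropy (rate function) as the Legendre-Fenchel transform of the logarithmic moment generating function, evaluated on a grid for a standard Gaussian, where the exact answer is x²/2:

        import numpy as np

        # log M(t) = t^2 / 2 for a standard Gaussian; the Legendre-Fenchel
        # transform sup_t (t x - log M(t)) should recover I(x) = x^2 / 2.
        t = np.linspace(-10, 10, 20001)
        log_mgf = 0.5 * t**2

        def rate(x):
            return np.max(t * x - log_mgf)

        for x in (0.5, 1.0, 2.0):
            print(x, round(float(rate(x)), 4), "expected", x**2 / 2)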

  1. Probability density of quantum expectation values

    Energy Technology Data Exchange (ETDEWEB)

    Campos Venuti, L., E-mail: lcamposv@usc.edu; Zanardi, P.

    2013-10-30

    We consider the quantum expectation value A=〈ψ|A|ψ〉 of an observable A over the state |ψ〉. We derive the exact probability distribution of A seen as a random variable when |ψ〉 varies over the set of all pure states equipped with the Haar-induced measure. To illustrate our results we compare the exact predictions for a few concrete examples with the concentration bounds obtained using Levy's lemma. We also comment on the relevance of the central limit theorem and finally draw some results on an alternative statistical mechanics based on the uniform measure on the energy shell. - Highlights: • We compute the probability distribution of quantum expectation values for states sampled uniformly. • As a special case we consider in some detail the degenerate case where A is a one-dimensional projector. • We compare the concentration results obtained using Levy's lemma with the exact values obtained using our exact formulae. • We comment on the possibility of a Central Limit Theorem and show approach to Gaussian for a few physical operators. • Some implications of our results for the so-called “Quantum Microcanonical Equilibration” (Refs. [5–9]) are derived.
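
    The Haar-induced distribution of expectation values is easy to explore by sampling, since normalised complex Gaussian vectors are Haar-distributed pure states. A sketch with an arbitrary Hermitian observable (not the paper's exact formulae):

        import numpy as np

        rng = np.random.default_rng(7)
        dim = 16

        # A fixed Hermitian observable (here a random one, as an assumption).
        h = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
        a_op = (h + h.conj().T) / 2

        # Haar-random pure states: normalised complex Gaussian vectors.
        def haar_state(d):
            v = rng.normal(size=d) + 1j * rng.normal(size=d)
            return v / np.linalg.norm(v)

        values = np.array([np.real(np.vdot(psi, a_op @ psi))
                           for psi in (haar_state(dim) for _ in range(20000))])
        print("mean %.3f (Tr A / d = %.3f), std %.3f"
              % (values.mean(), np.trace(a_op).real / dim, values.std()))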

  2. Misuse of randomization

    DEFF Research Database (Denmark)

    Liu, Jianping; Kjaergard, Lise Lotte; Gluud, Christian

    2002-01-01

    The quality of randomization of Chinese randomized trials on herbal medicines for hepatitis B was assessed. Search strategy and inclusion criteria were based on the published protocol. One hundred and seventy-six randomized clinical trials (RCTs) involving 20,452 patients with chronic hepatitis B… (…/150) of the studies were imbalanced at the 0.05 level of probability for the two treatments and 13.3% (20/150) were imbalanced at the 0.01 level in the randomization. Based on the review, it is suggested that there may exist misunderstanding of the concept of randomization and misuse of randomization.

  3. Probability analysis of position errors using uncooled IR stereo camera

    Science.gov (United States)

    Oh, Jun Ho; Lee, Sang Hwa; Lee, Boo Hwan; Park, Jong-Il

    2016-05-01

    This paper analyzes the random phenomenon of 3D positions when tracking moving objects using an infrared (IR) stereo camera, and proposes a probability model of 3D positions. The proposed probability model integrates two random error phenomena. One is the pixel quantization error, which is caused by the discrete sampling pixels used in estimating the disparity values of a stereo camera. The other is the timing jitter, which results from the irregular acquisition timing of uncooled IR cameras. This paper derives a probability distribution function by combining the jitter model with the pixel quantization error. To verify the proposed probability function of 3D positions, experiments on tracking fast-moving objects are performed using an IR stereo camera system. The 3D depths of a moving object are estimated by stereo matching and compared with the ground truth obtained by a laser scanner system. According to the experiments, the 3D depths of the moving object are estimated within the statistically reliable range that is well described by the proposed probability distribution. It is expected that the proposed probability model of 3D positions can be applied to various IR stereo camera systems that deal with fast-moving objects.
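
    A Monte Carlo sketch of the two error sources combined: disparity quantization as a uniform half-pixel error and timing jitter as an additional Gaussian perturbation of the measured disparity (the jitter model and all numbers are assumptions for illustration):

        import numpy as np

        rng = np.random.default_rng(3)
        n = 100_000

        f_px, baseline_m = 800.0, 0.30   # assumed focal length (px) and baseline
        true_depth_m = 20.0
        true_disp = f_px * baseline_m / true_depth_m

        # Pixel quantization: disparity rounded to the nearest pixel.
        # Timing jitter: modelled here as a Gaussian perturbation.
        quantized = np.round(true_disp + rng.uniform(-0.5, 0.5, n))
        jittered = quantized + rng.normal(0.0, 0.2, n)

        depth = f_px * baseline_m / jittered
        err = depth - true_depth_m
        print("depth error: mean %.3f m, std %.3f m" % (err.mean(), err.std()))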

  4. On the probability of cure for heavy-ion radiotherapy.

    Science.gov (United States)

    Hanin, Leonid; Zaider, Marco

    2014-07-21

    The probability of a cure in radiation therapy (RT), viewed as the probability of eventual extinction of all cancer cells, is unobservable, and the only way to compute it is through modeling the dynamics of the cancer cell population during and post-treatment. The conundrum at the heart of biophysical models aimed at such prospective calculations is the absence of information on the initial size of the subpopulation of clonogenic cancer cells (also called stem-like cancer cells) that largely determines the outcome of RT, in both individual and population settings. Other relevant parameters (e.g. potential doubling time, cell loss factor and survival probability as a function of dose) are, at least in principle, amenable to empirical determination. In this article we demonstrate that, for heavy-ion RT, microdosimetric considerations (justifiably ignored in conventional RT) combined with an expression for the clone extinction probability obtained from a mechanistic model of radiation cell survival lead to useful upper bounds on the size of the pre-treatment population of clonogenic cancer cells as well as upper and lower bounds on the cure probability. The main practical impact of these limiting values is the ability to make predictions about the probability of a cure for a given population of patients treated with newer, still unexplored treatment modalities from the empirically determined probability of a cure for the same or similar population resulting from conventional low linear energy transfer (typically photon/electron) RT. We also propose that the current trend to deliver a lower total dose in a smaller number of fractions with larger-than-conventional doses per fraction has physical limits that must be understood before embarking on a particular treatment schedule.
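
    The dependence on the unknown number of clonogens can be seen in the standard Poisson tumour-control sketch below (a common simplification, not the authors' microdosimetric model): with N clonogenic cells each surviving the full course with probability s, the cure (extinction) probability is approximately exp(-N s):

        import numpy as np

        # Poisson tumour-control probability: TCP ~ exp(-N * s).
        def tcp(n_clonogens, s_survive):
            return np.exp(-n_clonogens * s_survive)

        # The same per-cell survival gives very different cure probabilities
        # depending on the (unobservable) initial clonogen number N.
        for n in (1e4, 1e6, 1e8):
            print(f"N = {n:.0e}: TCP = {tcp(n, 1e-7):.5f}")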

  5. Probability theory a concise course

    CERN Document Server

    Rozanov, Y A

    1977-01-01

    This clear exposition begins with basic concepts and moves on to combination of events, dependent events and random variables, Bernoulli trials and the De Moivre-Laplace theorem, a detailed treatment of Markov chains, continuous Markov processes, and more. Includes 150 problems, many with answers. Indispensable to mathematicians and natural scientists alike.

  6. Frailty Models in Survival Analysis

    CERN Document Server

    Wienke, Andreas

    2010-01-01

    The concept of frailty offers a convenient way to introduce unobserved heterogeneity and associations into models for survival data. In its simplest form, frailty is an unobserved random proportionality factor that modifies the hazard function of an individual or a group of related individuals. "Frailty Models in Survival Analysis" presents a comprehensive overview of the fundamental approaches in the area of frailty models. The book extensively explores how univariate frailty models can represent unobserved heterogeneity. It also emphasizes correlated frailty models as extensions of

  7. Probability of Detection Demonstration Transferability

    Science.gov (United States)

    Parker, Bradford H.

    2008-01-01

    The ongoing Mars Science Laboratory (MSL) Propellant Tank Penetrant Nondestructive Evaluation (NDE) Probability of Detection (POD) Assessment (NESC activity) has surfaced several issues associated with liquid penetrant POD demonstration testing. This presentation lists factors that may influence the transferability of POD demonstration tests. Initial testing will address the liquid penetrant inspection technique. Some of the factors to be considered in this task are crack aspect ratio, the extent of the crack opening, the material and the distance between the inspection surface and the inspector's eye.

  8. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  9. Nash equilibrium with lower probabilities

    DEFF Research Database (Denmark)

    Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1998-01-01

    We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible to distinguish between a player's assessment of ambiguity and his attitude towards ambiguity. We also generalize the concept of trembling hand perfect equilibrium. Finally, we demonstrate that for certain attitudes towards ambiguity it is possible to explain cooperation in the one-shot Prisoner's Dilemma...

  10. Random broadcast on random geometric graphs

    Energy Technology Data Exchange (ETDEWEB)

    Bradonjic, Milan [Los Alamos National Laboratory; Elsasser, Robert [UNIV OF PADERBORN; Friedrich, Tobias [ICSI/BERKELEY; Sauerwald, Tomas [ICSI/BERKELEY

    2009-01-01

    In this work, we consider the random broadcast time on random geometric graphs (RGGs). The classic random broadcast model, also known as the push algorithm, is defined as follows: starting with one informed node, in each succeeding round every informed node chooses one of its neighbors uniformly at random and informs it. We consider the random broadcast time on RGGs when, with high probability, (i) the RGG is connected, or (ii) the RGG contains a giant component. We show that the random broadcast time is bounded by O(√n + diam(component)), where diam(component) is the diameter of the entire graph or of the giant component, for regimes (i) and (ii), respectively. In other words, for both regimes, we derive the broadcast time to be Θ(diam(G)), which is asymptotically optimal.
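
    A compact simulation of the push protocol described here; the graph below is a toy stand-in (a cycle with chords) rather than an actual random geometric graph:

        import random

        def push_broadcast_rounds(adj):
            """Rounds for the push protocol to inform all nodes of a connected graph."""
            informed = {0}
            rounds = 0
            while len(informed) < len(adj):
                rounds += 1
                # Every informed node pushes to one uniformly random neighbor.
                newly = {random.choice(adj[u]) for u in informed}
                informed |= newly
            return rounds

        n = 200
        adj = {u: [(u - 1) % n, (u + 1) % n, (u + 7) % n, (u - 7) % n]
               for u in range(n)}
        random.seed(5)
        print("broadcast finished in", push_broadcast_rounds(adj), "rounds")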

  11. Lectures on probability and statistics

    Energy Technology Data Exchange (ETDEWEB)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.

  12. Social class and survival on the S.S. Titanic.

    Science.gov (United States)

    Hall, W

    1986-01-01

    Passengers' chances of surviving the sinking of the S.S. Titanic were related to their sex and their social class: females were more likely to survive than males, and the chances of survival declined with social class as measured by the class in which the passenger travelled. The probable reasons for these differences in rates of survival are discussed as are the reasons accepted by the Mersey Committee of Inquiry into the sinking.

  13. On estimating the fracture probability of nuclear graphite components

    Science.gov (United States)

    Srinivasan, Makuteswara

    2008-10-01

    The properties of nuclear grade graphites exhibit anisotropy and could vary considerably within a manufactured block. Graphite strength is affected by the direction of alignment of the constituent coke particles, which is dictated by the forming method, coke particle size, and the size, shape, and orientation distribution of pores in the structure. In this paper, a Weibull failure probability analysis for components is presented using the American Society for Testing and Materials strength specification for nuclear grade graphites for core components in advanced high-temperature gas-cooled reactors. The risk of rupture (probability of fracture) and survival probability (reliability) of large graphite blocks are calculated for varying and discrete values of service tensile stresses. The limitations in these calculations are discussed from considerations of actual reactor environmental conditions that could potentially degrade the specification properties because of damage due to complex interactions between irradiation, temperature, stress, and variability in reactor operation.
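
    The two-parameter Weibull form behind such an analysis gives survival (reliability) as exp[-(sigma/sigma_0)^m]. A sketch with assumed characteristic strength and Weibull modulus, not ASTM specification values:

        import numpy as np

        # Weibull reliability under tensile stress sigma, with assumed
        # characteristic strength sigma_0 (MPa) and Weibull modulus m.
        def survival_probability(sigma, sigma_0=25.0, m=10.0):
            return np.exp(-(sigma / sigma_0) ** m)

        for stress in (10.0, 15.0, 20.0, 25.0):
            p_s = survival_probability(stress)
            print(f"sigma = {stress:4.1f} MPa: reliability = {p_s:.4f}, "
                  f"fracture probability = {1 - p_s:.4f}")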

  14. Psychology and survival.

    Science.gov (United States)

    Phillips, D P; Ruth, T E; Wagner, L M

    1993-11-06

    We examined the deaths of 28,169 adult Chinese-Americans, and 412,632 randomly selected, matched controls coded "white" on the death certificate. Chinese-Americans, but not whites, die significantly earlier than normal (1.3-4.9 yr) if they have a combination of disease and birthyear which Chinese astrology and medicine consider ill-fated. The more strongly a group is attached to Chinese traditions, the more years of life are lost. Our results hold for nearly all major causes of death studied. The reduction in survival cannot be completely explained by a change in the behaviour of the Chinese patient, doctor, or death-registrar, but seems to result at least partly from psychosomatic processes.

  15. Survival function of hypo-exponential distributions

    OpenAIRE

    Lotfy, Mamdouh M.; Abdelsamad, Ali S.

    1985-01-01

    Approved for public release; distribution is unlimited. The reliability of a system is the probability that the system will survive or complete an intended mission of certain duration. Describing all possible ways that a system can survive a mission in reliability shorthand gives a simple approach to reliability computations. Reliability computation for a system defined by shorthand notation is greatly dependent upon the convolution problem. Assuming constant component failure rates, this p...
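
    For distinct stage failure rates, the hypo-exponential survival function has the closed form S(t) = sum_i w_i exp(-lambda_i t) with w_i = prod_{j != i} lambda_j / (lambda_j - lambda_i). A small sketch (the rates are arbitrary examples):

        import numpy as np

        # Survival function of a hypo-exponential random variable: a series
        # system of stages with distinct constant failure rates lambda_i.
        def hypoexp_survival(t, rates):
            rates = np.asarray(rates, dtype=float)
            s = 0.0
            for i, li in enumerate(rates):
                others = np.delete(rates, i)
                w = np.prod(others / (others - li))
                s += w * np.exp(-li * t)
            return s

        for t in (0.5, 1.0, 2.0):
            print(t, round(float(hypoexp_survival(t, [1.0, 2.0, 3.0])), 4))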

  16. Daniel Courgeau: Probability and social science: methodological relationships between the two approaches [Review of: Probability and social science: methodological relationships between the two approaches]

    NARCIS (Netherlands)

    Willekens, F.J.C.

    2013-01-01

    Throughout history, humans have engaged in games in which randomness plays a role. In the 17th century, scientists started to approach chance scientifically and to develop a theory of probability. Courgeau describes how the relationship between probability theory and the social sciences emerged and evolved.

  17. KNOTS AND RANDOM WALKS IN VIBRATED GRANULAR CHAINS

    Energy Technology Data Exchange (ETDEWEB)

    E. BEN-NAIM; ET AL

    2000-08-01

    The authors experimentally study the statistical properties of the opening times of knots in vertically vibrated granular chains. Their measurements are in good qualitative and quantitative agreement with a theoretical model involving three random walks interacting via hard-core exclusion in one spatial dimension. In particular, the knot survival probability follows a universal scaling function which is independent of the chain length, with a corresponding diffusive characteristic time scale. Both the large-exit-time and the small-exit-time tails of the distribution are suppressed exponentially, and the corresponding decay coefficients are in excellent agreement with the theoretical values.

  18. Lévy laws in free probability

    OpenAIRE

    Barndorff-Nielsen, Ole E.; Thorbjørnsen, Steen

    2002-01-01

    This article and its sequel outline recent developments in the theory of infinite divisibility and Lévy processes in free probability, a subject area belonging to noncommutative (or quantum) probability. The present paper discusses the classes of infinitely divisible probability measures in classical and free probability, respectively, via a study of the Bercovici–Pata bijection between these classes.

  19. What Randomized Benchmarking Actually Measures

    Science.gov (United States)

    Proctor, Timothy; Rudinger, Kenneth; Young, Kevin; Sarovar, Mohan; Blume-Kohout, Robin

    2017-09-01

    Randomized benchmarking (RB) is widely used to measure an error rate of a set of quantum gates, by performing random circuits that would do nothing if the gates were perfect. In the limit of no finite-sampling error, the exponential decay rate of the observable survival probabilities, versus circuit length, yields a single error metric r. For Clifford gates with arbitrarily small errors described by process matrices, r was believed to reliably correspond to the mean, over all Clifford gates, of the average gate infidelity between the imperfect gates and their ideal counterparts. We show that this quantity is not a well-defined property of a physical gate set. It depends on the representations used for the imperfect and ideal gates, and the variant typically computed in the literature can differ from r by orders of magnitude. We present new theories of the RB decay that are accurate for all small errors describable by process matrices, and show that the RB decay curve is a simple exponential for all such errors. These theories allow explicit computation of the error rate that RB measures (r), but as far as we can tell it does not correspond to the infidelity of a physically allowed (completely positive) representation of the imperfect gates.
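
    The standard RB analysis that the paper starts from fits the survival probabilities to A p^m + B and reports r = (d-1)(1-p)/d. A sketch on synthetic single-qubit data (all parameter values are assumed):

        import numpy as np
        from scipy.optimize import curve_fit

        rng = np.random.default_rng(11)

        # Standard single-exponential RB model: survival = A * p^m + B.
        def rb_model(m, a, p, b):
            return a * p**m + b

        lengths = np.arange(1, 201, 10)
        true_a, true_p, true_b = 0.45, 0.995, 0.5
        data = rb_model(lengths, true_a, true_p, true_b) \
               + rng.normal(0, 0.005, lengths.size)

        (a, p, b), _ = curve_fit(rb_model, lengths, data, p0=[0.5, 0.99, 0.5])

        d = 2                          # single-qubit example
        r = (d - 1) * (1 - p) / d      # the error rate RB reports
        print(f"fitted p = {p:.5f}, RB number r = {r:.2e}")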

  20. On the probability distribution for the cosmological constant

    Science.gov (United States)

    Elizalde, E.; Gaztañaga, E.

    1990-01-01

    The behaviour of the probability distribution for the cosmological constant Λ in Coleman's approach is shown to depend rather strongly on the corrections to the effective action. In particular, when one includes terms proportional to Λ², the infinite peak in the probability density at Λ=0 smoothly disappears (provided that the coefficient of Λ² is positive). A random distribution for Λ can then be obtained (as a limiting case) in a domain around Λ=0. This is in accordance with the results of an approach recently proposed by Fischler, Klebanov, Polchinski and Susskind.

  1. Gendist: An R Package for Generated Probability Distribution Models.

    Directory of Open Access Journals (Sweden)

    Shaiful Anuar Abu Bakar

    Full Text Available In this paper, we introduce the R package gendist that computes the probability density function, the cumulative distribution function, the quantile function and generates random values for several generated probability distribution models including the mixture model, the composite model, the folded model, the skewed symmetric model and the arc tan model. These models are extensively used in the literature and the R functions provided here are flexible enough to accommodate various univariate distributions found in other R packages. We also show its applications in graphing, estimation, simulation and risk measurements.

  2. Digital dice computational solutions to practical probability problems

    CERN Document Server

    Nahin, Paul J

    2013-01-01

    Some probability problems are so difficult that they stump the smartest mathematicians. But even the hardest of these problems can often be solved with a computer and a Monte Carlo simulation, in which a random-number generator simulates a physical process, such as a million rolls of a pair of dice. This is what Digital Dice is all about: how to get numerical answers to difficult probability problems without having to solve complicated mathematical equations. Popular-math writer Paul Nahin challenges readers to solve twenty-one difficult but fun problems, from determining the
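
    In the spirit of the book, a minimal Monte Carlo estimate for a dice question with a known exact answer (the choice of question is ours, not one of the book's twenty-one problems):

        import numpy as np

        rng = np.random.default_rng(2024)
        n_trials = 1_000_000

        # Monte Carlo estimate of the probability that the sum of a pair of
        # fair dice equals 7 (exact value 1/6).
        rolls = rng.integers(1, 7, size=(n_trials, 2))
        p_seven = np.mean(rolls.sum(axis=1) == 7)
        print(f"estimated P(sum = 7) = {p_seven:.4f} (exact {1/6:.4f})")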

  3. Gendist: An R Package for Generated Probability Distribution Models.

    Science.gov (United States)

    Abu Bakar, Shaiful Anuar; Nadarajah, Saralees; Absl Kamarul Adzhar, Zahrul Azmir; Mohamed, Ibrahim

    2016-01-01

    In this paper, we introduce the R package gendist that computes the probability density function, the cumulative distribution function, the quantile function and generates random values for several generated probability distribution models including the mixture model, the composite model, the folded model, the skewed symmetric model and the arc tan model. These models are extensively used in the literature and the R functions provided here are flexible enough to accommodate various univariate distributions found in other R packages. We also show its applications in graphing, estimation, simulation and risk measurements.
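
    gendist itself is an R package; purely to illustrate one of the generated models it covers, the Python sketch below builds the folded model for an assumed normal parent distribution, with density f(x) + f(-x) on x ≥ 0 and random values generated as |X|.

        import numpy as np
        from scipy import stats

        def folded_pdf(x, parent=stats.norm(loc=1.0, scale=2.0)):
            """Density of |X| when X follows the parent distribution."""
            x = np.asarray(x, dtype=float)
            return np.where(x >= 0, parent.pdf(x) + parent.pdf(-x), 0.0)

        def folded_rvs(size, parent=stats.norm(loc=1.0, scale=2.0), seed=0):
            """Random values from the folded model: simply |X|."""
            return np.abs(parent.rvs(size=size, random_state=seed))

        print(folded_pdf(np.linspace(0, 8, 5)))
        print(folded_rvs(3))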

  4. A practical overview on probability distributions

    OpenAIRE

    Viti, Andrea; Terzi, Alberto; Bertolaccini, Luca

    2015-01-01

    The aim of this paper is a general definition of probability, its main mathematical features, and the features it presents under particular circumstances. The behavior of probability is linked to the features of the phenomenon we would predict. This link can be defined as a probability distribution. Given the characteristics of phenomena (which we can also call variables), corresponding probability distributions are defined. For categorical (or discrete) variables, the probability can be described by a bino...

  5. Long-term survival results of a randomized phase III trial of vinflunine plus best supportive care versus best supportive care alone in advanced urothelial carcinoma patients after failure of platinum-based chemotherapy

    DEFF Research Database (Denmark)

    Bellmunt, J; Fougeray, R; Rosenberg, J E

    2013-01-01

    To compare long-term, updated overall survival (OS) of patients with advanced transitional cell carcinoma of the urothelium (TCCU) treated with vinflunine plus best supportive care (BSC) or BSC alone, after failure of platinum-based chemotherapy.

  6. Foreign Ownership and long-term Survival

    OpenAIRE

    Kronborg, Dorte; Thomsen, Steen

    2006-01-01

    Does foreign ownership enhance or decrease a firm’s chances of survival? Over the 100 year period 1895-2001 this paper compares the survival of foreign subsidiaries in Denmark to a control sample matched by industry and firm size. We find that foreign-owned companies have higher survival probability. On average exit risk for domestic companies is 2.3 times higher than for foreign companies. First movers like Siemens, Philips, Kodak, Ford, GM or Goodyear have been active in the country for alm...

  7. Random walks on random Koch curves

    Energy Technology Data Exchange (ETDEWEB)

    Seeger, S; Hoffmann, K H [Institut fuer Physik, Technische Universitaet, D-09107 Chemnitz (Germany); Essex, C [Department of Applied Mathematics, University of Western Ontario, London, ON N6A 5B7 (Canada)

    2009-06-05

    Diffusion processes in porous materials are often modeled as random walks on fractals. In order to capture the randomness of the materials random fractals are employed, which no longer show the deterministic self-similarity of regular fractals. Finding a continuum differential equation describing the diffusion on such fractals has been a long-standing goal, and we address the question of whether the concepts developed for regular fractals are still applicable. We use the random Koch curve as a convenient example as it provides certain technical advantages by its separation of time and space features. While some of the concepts developed for regular fractals can be used unaltered, others have to be modified. Based on the concept of fibers, we introduce ensemble-averaged density functions which produce a differentiable estimate of probability explicitly and compare it to random walk data.

  8. Statins and risk of diabetes: an analysis of electronic medical records to evaluate possible bias due to differential survival.

    Science.gov (United States)

    Danaei, Goodarz; García Rodríguez, Luis A; Fernandez Cantero, Oscar; Hernán, Miguel A

    2013-05-01

    Two meta-analyses of randomized trials of statins found increased risk of type 2 diabetes. One possible explanation is bias due to differential survival when patients who are at higher risk of diabetes survive longer under statin treatment. We used electronic medical records from 500 general practices in the U.K. and included data from 285,864 men and women aged 50-84 years from January 2000 to December 2010. We emulated the design and analysis of a hypothetical randomized trial of statins, estimated the observational analog of the intention-to-treat effect, and adjusted for differential survival bias using inverse-probability weighting. During 1.2 million person-years of follow-up, there were 13,455 cases of type 2 diabetes and 8,932 deaths. Statin initiation was associated with increased risk of type 2 diabetes. The hazard ratio (95% CI) of diabetes was 1.45 (1.39-1.50) before adjusting for potential confounders and 1.14 (1.10-1.19) after adjustment. Adjusting for differential survival did not change the estimates. Initiating atorvastatin and simvastatin was associated with increased risk of type 2 diabetes. In this sample of the general population, statin therapy was associated with 14% increased risk of type 2 diabetes. Differential survival did not explain this increased risk.
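
    The adjustment step rests on inverse-probability weighting: each subject is weighted by 1 / P(remaining under observation | covariates), creating a pseudo-population in which survival no longer depends on the measured covariates. The Python sketch below uses entirely synthetic data and a logistic model; it illustrates the weighting idea only, not the paper's actual analysis.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)
        n = 10_000
        age = rng.uniform(50, 84, n)
        statin = rng.integers(0, 2, n)
        # Synthetic survival that depends on age and treatment
        p_alive = 1 / (1 + np.exp(-(8 - 0.1 * age + 0.3 * statin)))
        alive = rng.random(n) < p_alive

        X = np.column_stack([age, statin])
        model = LogisticRegression().fit(X, alive)
        w = 1.0 / model.predict_proba(X)[:, 1]        # inverse-probability weights

        # Weighted (pseudo-population) diabetes risk by arm, synthetic outcome
        diabetes = rng.random(n) < (0.05 + 0.01 * statin)
        for arm in (0, 1):
            m = (statin == arm) & alive
            print(arm, np.average(diabetes[m], weights=w[m]))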

  9. Hitting probabilities for nonlinear systems of stochastic waves

    CERN Document Server

    Dalang, Robert C

    2015-01-01

    The authors consider a d-dimensional random field u = \{u(t,x)\} that solves a non-linear system of stochastic wave equations in spatial dimensions k \in \{1,2,3\}, driven by a spatially homogeneous Gaussian noise that is white in time. They mainly consider the case where the spatial covariance is given by a Riesz kernel with exponent \beta. Using Malliavin calculus, they establish upper and lower bounds on the probabilities that the random field visits a deterministic subset of \mathbb{R}^d, in terms, respectively, of Hausdorff measure and Newtonian capacity of this set. The dimension that ap

  10. Stochastic invertible mappings between power law and Gaussian probability distributions

    OpenAIRE

    Vignat, C.; Plastino, A.

    2005-01-01

    We construct "stochastic mappings" between power law probability distributions (PD's) and Gaussian ones. To a given vector $N$, Gaussian distributed (respectively $Z$, exponentially distributed), one can associate a vector $X$, "power law distributed", by multiplying $X$ by a random scalar variable $a$, $N= a X$. This mapping is "invertible": one can go via multiplication by another random variable $b$ from $X$ to $N$ (resp. from $X$ to $Z$), i.e., $X=b N$ (resp. $X=b Z$). Note that all the a...

  11. Some results of ruin probability for the classical risk process

    Directory of Open Access Journals (Sweden)

    He Yuanjiang

    2003-01-01

    the assumption that the random sequence followed the Γ distribution with density function $f(x)=\frac{x^{1/\beta-1}e^{-x/\beta}}{\beta^{1/\beta}\Gamma(1/\beta)}$, x>0, where β>1. This paper studies the ruin probability of the classical model where the random sequence follows the Γ distribution with density function $f(x)=\frac{\alpha^n}{\Gamma(n)}x^{n-1}e^{-\alpha x}$, x>0, where α>0 and n≥2 is a positive integer. An intermediate general result is given and a complete solution is provided for n=2. Simulation studies for the case of n=2 are also provided.
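
    For the n=2 case, the ruin probability can also be checked by straightforward Monte Carlo over the classical surplus process U(t) = u + ct - S(t); the parameter values in this Python sketch are illustrative assumptions, with the premium rate set above λE[claim] to give a positive safety loading.

        import numpy as np

        rng = np.random.default_rng(7)
        u, c, lam = 10.0, 5.0, 1.0        # initial surplus, premium rate, claim rate
        alpha, n_shape = 0.5, 2           # claims ~ Gamma(n=2, alpha=0.5), mean 4
        horizon, n_paths = 200.0, 10_000  # finite-horizon approximation

        ruined = 0
        for _ in range(n_paths):
            t, claims = 0.0, 0.0
            while True:
                t += rng.exponential(1 / lam)            # next claim arrival
                if t > horizon:
                    break
                claims += rng.gamma(n_shape, 1 / alpha)  # Gamma(n, scale=1/alpha) claim
                if u + c * t - claims < 0:               # surplus dips below zero
                    ruined += 1
                    break
        print("estimated ruin probability:", ruined / n_paths)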

  12. XI Symposium on Probability and Stochastic Processes

    CERN Document Server

    Pardo, Juan; Rivero, Víctor; Bravo, Gerónimo

    2015-01-01

    This volume features lecture notes and a collection of contributed articles from the XI Symposium on Probability and Stochastic Processes, held at CIMAT Mexico in September 2013. Since the symposium was part of the activities organized in Mexico to celebrate the International Year of Statistics, the program included topics from the interface between statistics and stochastic processes. The book starts with notes from the mini-course given by Louigi Addario-Berry with an accessible description of some features of the multiplicative coalescent and its connection with random graphs and minimum spanning trees. It includes a number of exercises and a section on unanswered questions. Further contributions provide the reader with a broad perspective on the state-of-the art of active areas of research. Contributions by: Louigi Addario-Berry Octavio Arizmendi Fabrice Baudoin Jochen Blath Loïc Chaumont J. Armando Domínguez-Molina Bjarki Eldon Shui Feng Tulio Gaxiola Adrián González Casanova Evgueni Gordienko Daniel...

  13. Essays on probability elicitation scoring rules

    Science.gov (United States)

    Firmino, Paulo Renato A.; dos Santos Neto, Ademir B.

    2012-10-01

    In probability elicitation exercises it has been usual to consider scoring rules (SRs) to measure the performance of experts when inferring about a given unknown, Θ, for which the true value, θ*, is (or will shortly be) known to the experimenter. Mathematically, SRs quantify the discrepancy between f(θ) (the distribution reflecting the expert's uncertainty about Θ) and d(θ), a zero-one indicator function of the observation θ*. Thus, a remarkable characteristic of SRs is to contrast the expert's beliefs with the observation θ*. The present work aims at extending SR concepts and formulas to the cases where Θ is aleatory, highlighting advantages of goodness-of-fit and entropy-like measures. Conceptually, it is argued that besides evaluating the personal performance of the expert, SRs may also play a role when comparing the elicitation processes adopted to obtain f(θ). Mathematically, it is proposed to replace d(θ) by g(θ), the distribution that models the randomness of Θ, and to also consider goodness-of-fit and entropy-like metrics, leading to SRs that measure the adherence of f(θ) to g(θ). The implications of this alternative perspective are discussed and illustrated by means of case studies based on the simulation of controlled experiments. The usefulness of the proposed approach for evaluating the performance of experts and elicitation processes is investigated.

  14. How to simulate a universal quantum computer using negative probabilities

    Science.gov (United States)

    Hofmann, Holger F.

    2009-07-01

    The concept of negative probabilities can be used to decompose the interaction of two qubits mediated by a quantum controlled-NOT into three operations that require only classical interactions (that is, local operations and classical communication) between the qubits. For a single gate, the probabilities of the three operations are 1, 1 and -1. This decomposition can be applied in a probabilistic simulation of quantum computation by randomly choosing one of the three operations for each gate and assigning a negative statistical weight to the outcomes of sequences with an odd number of negative probability operations. The maximal exponential speed-up of a quantum computer can then be evaluated in terms of the increase in the number of sequences needed to simulate a single operation of the quantum circuit.
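
    The bookkeeping behind such a simulation can be shown with a toy signed-weight estimator: draw each operation with probability proportional to |w_i|, attach its sign, and rescale by Σ|w_i|, which is the factor that grows exponentially with circuit size. The outcome values below are hypothetical placeholders, not quantities from the paper.

        import numpy as np

        rng = np.random.default_rng(3)
        w = np.array([1.0, 1.0, -1.0])           # quasi-probability weights
        v = np.array([0.2, 0.5, 0.4])            # hypothetical per-operation outcomes

        norm = np.abs(w).sum()                   # = 3; this factor compounds per gate
        probs = np.abs(w) / norm
        idx = rng.choice(3, size=200_000, p=probs)
        signs = np.sign(w)[idx]
        estimate = norm * np.mean(signs * v[idx])
        print(estimate, "exact:", np.dot(w, v))  # unbiased for sum_i w_i * v_i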

  15. Randomized study of whole-abdomen irradiation versus pelvic irradiation plus cyclophosphamide in treatment of early ovarian cancer

    Energy Technology Data Exchange (ETDEWEB)

    Sell, A.; Bertelsen, K.; Andersen, J.E.; Stroyer, I. (Arhus Univ. Hospital (Denmark))

    1990-06-01

    From 1 September 1981 to 1 January 1987, 118 patients with FIGO Stage IB, IC, IIA, IIB, and IIC epithelial ovarian cancer were randomized to abdominal irradiation or pelvic irradiation + cyclophosphamide. There was no difference between the regimens with respect to recurrence-free survival (55%) and 4-year overall survival (63%). At routine second-look laparotomy, 16% of patients without clinically detectable tumor showed recurrence. Twenty-five percent of the patients treated with pelvic irradiation + cyclophosphamide had hemorrhagic cystitis, probably caused by radiation damage and cyclophosphamide cystitis. Eight percent had late gastrointestinal symptoms requiring surgery.

  16. Theory of random sets

    CERN Document Server

    Molchanov, Ilya

    2017-01-01

    This monograph, now in a thoroughly revised second edition, offers the latest research on random sets. It has been extended to include substantial developments achieved since 2005, some of them motivated by applications of random sets to econometrics and finance. The present volume builds on the foundations laid by Matheron and others, including the vast advances in stochastic geometry, probability theory, set-valued analysis, and statistical inference. It shows the various interdisciplinary relationships of random set theory within other parts of mathematics, and at the same time fixes terminology and notation that often vary in the literature, establishing it as a natural part of modern probability theory and providing a platform for future development. It is completely self-contained, systematic and exhaustive, with the full proofs that are necessary to gain insight. Aimed at research level, Theory of Random Sets will be an invaluable reference for probabilists; mathematicians working in convex and integ...

  17. The probability that a pair of group elements is autoconjugate

    Indian Academy of Sciences (India)

    Abstract. Let g and h be arbitrary elements of a given finite group G. Then g and h are said to be autoconjugate if there exists some automorphism α of G such that h = g^α. In this article, we construct some sharp bounds for the probability that two random elements of G are autoconjugate, denoted by Pa(G). It is also shown ...

  18. Probability distribution of the number of deceits in collective robotics

    OpenAIRE

    Murciano, Antonio; Zamora, Javier; Lopez-Sanchez, Jesus; Rodriguez-Santamaria, Emilia

    2002-01-01

    The benefit obtained by a selfish robot by cheating in a real multirobotic system can be represented by the random variable Xn,q: the number of cheating interactions needed before all the members in a cooperative team of robots, playing a TIT FOR TAT strategy, recognize the selfish robot. Stability of cooperation depends on the ratio between the benefit obtained by selfish and cooperative robots. In this paper, we establish the probability model for Xn,q. If the values...

  19. Androgen Suppression Combined with Elective Nodal and Dose Escalated Radiation Therapy (the ASCENDE-RT Trial): An Analysis of Survival Endpoints for a Randomized Trial Comparing a Low-Dose-Rate Brachytherapy Boost to a Dose-Escalated External Beam Boost for High- and Intermediate-risk Prostate Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Morris, W. James, E-mail: jmorris@bccancer.bc.ca [Department of Surgery, University of British Columbia, Vancouver, British Columbia (Canada); BC Cancer Agency–Vancouver Centre, Vancouver, British Columbia (Canada); Tyldesley, Scott [Department of Surgery, University of British Columbia, Vancouver, British Columbia (Canada); BC Cancer Agency–Vancouver Centre, Vancouver, British Columbia (Canada); Rodda, Sree [Department of Surgery, University of British Columbia, Vancouver, British Columbia (Canada); Halperin, Ross [Department of Surgery, University of British Columbia, Vancouver, British Columbia (Canada); BC Cancer Agency–Centre for the Southern Interior, Vancouver, British Columbia (Canada); Pai, Howard [Department of Surgery, University of British Columbia, Vancouver, British Columbia (Canada); BC Cancer Agency–Vancouver Island Centre, Vancouver, British Columbia (Canada); McKenzie, Michael; Duncan, Graeme [Department of Surgery, University of British Columbia, Vancouver, British Columbia (Canada); BC Cancer Agency–Vancouver Centre, Vancouver, British Columbia (Canada); Morton, Gerard [Department of Radiation Oncology, University of Toronto, Sunnybrook Health Sciences Centre, Toronto, Ontario (Canada); Hamm, Jeremy [Department of Population Oncology, BC Cancer Agency, Vancouver, British Columbia (Canada); Murray, Nevin [BC Cancer Agency–Vancouver Centre, Vancouver, British Columbia (Canada); Department of Medicine, University of British Columbia, Vancouver, British Columbia (Canada)

    2017-06-01

    Purpose: To report the primary endpoint of biochemical progression-free survival (b-PFS) and secondary survival endpoints from ASCENDE-RT, a randomized trial comparing 2 methods of dose escalation for intermediate- and high-risk prostate cancer. Methods and Materials: ASCENDE-RT enrolled 398 men, with a median age of 68 years; 69% (n=276) had high-risk disease. After stratification by risk group, the subjects were randomized to a standard arm with 12 months of androgen deprivation therapy, pelvic irradiation to 46 Gy, followed by a dose-escalated external beam radiation therapy (DE-EBRT) boost to 78 Gy, or an experimental arm that substituted a low-dose-rate prostate brachytherapy (LDR-PB) boost. Of the 398 trial subjects, 200 were assigned to DE-EBRT boost and 198 to LDR-PB boost. The median follow-up was 6.5 years. Results: In an intent-to-treat analysis, men randomized to DE-EBRT were twice as likely to experience biochemical failure (multivariable analysis [MVA] hazard ratio [HR] 2.04; P=.004). The 5-, 7-, and 9-year Kaplan-Meier b-PFS estimates were 89%, 86%, and 83% for the LDR-PB boost versus 84%, 75%, and 62% for the DE-EBRT boost (log-rank P<.001). The LDR-PB boost benefited both intermediate- and high-risk patients. Because the b-PFS curves for the treatment arms diverge sharply after 4 years, the relative advantage of the LDR-PB should increase with longer follow-up. On MVA, the only variables correlated with reduced overall survival were age (MVA HR 1.06/y; P=.004) and biochemical failure (MVA HR 6.30; P<.001). Although biochemical failure was associated with increased mortality and randomization to DE-EBRT doubled the rate of biochemical failure, no significant overall survival difference was observed between the treatment arms (MVA HR 1.13; P=.62). Conclusions: Compared with 78 Gy EBRT, men randomized to the LDR-PB boost were twice as likely to be free of biochemical failure at a median follow-up of 6.5 years.

  20. Androgen Suppression Combined with Elective Nodal and Dose Escalated Radiation Therapy (the ASCENDE-RT Trial): An Analysis of Survival Endpoints for a Randomized Trial Comparing a Low-Dose-Rate Brachytherapy Boost to a Dose-Escalated External Beam Boost for High- and Intermediate-risk Prostate Cancer.

    Science.gov (United States)

    Morris, W James; Tyldesley, Scott; Rodda, Sree; Halperin, Ross; Pai, Howard; McKenzie, Michael; Duncan, Graeme; Morton, Gerard; Hamm, Jeremy; Murray, Nevin

    2017-06-01

    To report the primary endpoint of biochemical progression-free survival (b-PFS) and secondary survival endpoints from ASCENDE-RT, a randomized trial comparing 2 methods of dose escalation for intermediate- and high-risk prostate cancer. ASCENDE-RT enrolled 398 men, with a median age of 68 years; 69% (n=276) had high-risk disease. After stratification by risk group, the subjects were randomized to a standard arm with 12 months of androgen deprivation therapy, pelvic irradiation to 46 Gy, followed by a dose-escalated external beam radiation therapy (DE-EBRT) boost to 78 Gy, or an experimental arm that substituted a low-dose-rate prostate brachytherapy (LDR-PB) boost. Of the 398 trial subjects, 200 were assigned to DE-EBRT boost and 198 to LDR-PB boost. The median follow-up was 6.5 years. In an intent-to-treat analysis, men randomized to DE-EBRT were twice as likely to experience biochemical failure (multivariable analysis [MVA] hazard ratio [HR] 2.04; P=.004). The 5-, 7-, and 9-year Kaplan-Meier b-PFS estimates were 89%, 86%, and 83% for the LDR-PB boost versus 84%, 75%, and 62% for the DE-EBRT boost (log-rank P<.001). The LDR-PB boost benefited both intermediate- and high-risk patients. Because the b-PFS curves for the treatment arms diverge sharply after 4 years, the relative advantage of the LDR-PB should increase with longer follow-up. On MVA, the only variables correlated with reduced overall survival were age (MVA HR 1.06/y; P=.004) and biochemical failure (MVA HR 6.30; P<.001). Although biochemical failure was associated with increased mortality and randomization to DE-EBRT doubled the rate of biochemical failure, no significant overall survival difference was observed between the treatment arms (MVA HR 1.13; P=.62). Compared with 78 Gy EBRT, men randomized to the LDR-PB boost were twice as likely to be free of biochemical failure at a median follow-up of 6.5 years. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. Pediatric Triage in a Severe Pandemic: Maximizing Survival by Establishing Triage Thresholds.

    Science.gov (United States)

    Gall, Christine; Wetzel, Randall; Kolker, Alexander; Kanter, Robert K; Toltzis, Philip

    2016-09-01

    To develop and validate an algorithm to guide selection of patients for pediatric critical care admission during a severe pandemic when Crisis Standards of Care are implemented. Retrospective observational study using secondary data. Children admitted to VPS-participating PICUs between 2009 and 2012. A total of 111,174 randomly selected nonelective cases from the Virtual PICU Systems database were used to estimate each patient's probability of death and duration of ventilation employing previously derived predictive equations. Using real and projected statistics for the State of Ohio as an example, triage thresholds were established for casualty volumes ranging from 5,000 to 10,000 for a modeled pandemic with peak duration of 6 weeks and 280 pediatric intensive care beds. The goal was to simultaneously maximize casualty survival and bed occupancy. Discrete Event Simulation was used to determine triage thresholds for probability of death and duration of ventilation as a function of casualty volume and the total number of available beds. Simulation was employed to compare survival between the proposed triage algorithm and a first come first served distribution of scarce resources. Population survival was greater using the triage thresholds compared with a first come first served strategy. In this model, for five, six, seven, eight, and 10 thousand casualties, the triage algorithm increased the number of lives saved by 284, 386, 547, 746, and 1,089, respectively, compared with first come first served (all p < .001). Triage thresholds based on probability of death and duration of mechanical ventilation determined from actual critically ill children's data demonstrated superior population survival during a simulated overwhelming pandemic.

  2. Analytic Neutrino Oscillation Probabilities in Matter: Revisited

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT

    2018-01-02

    We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.

  3. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  4. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how...

  5. Probability output modeling for support vector machines

    Science.gov (United States)

    Zhang, Xiang; Xiao, Xiaoling; Tian, Jinwen; Liu, Jian

    2007-11-01

    In this paper we propose an approach to model the posterior probability output of multi-class SVMs. The sigmoid function is used to estimate the posterior probability output in binary classification. Modeling the posterior probability output of multi-class SVMs is achieved by directly solving equations based on the combination of the probability outputs of the binary classifiers using Bayes' rule. The differences and different weights among these two-class SVM classifiers, based on the posterior probability, are considered when combining the probability outputs of the two-class SVM classifiers in this method. The comparative experimental results show that our method achieves better classification precision and a better distribution of the posterior probability than the pairwise coupling method and Hastie's optimization method.
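
    For the binary building block, the sigmoid fit is essentially Platt scaling. A minimal Python sketch, assuming scikit-learn and a synthetic dataset, fits a one-dimensional logistic model to held-out SVM decision values; it shows the binary step only, not the paper's multi-class combination.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.svm import SVC
        from sklearn.linear_model import LogisticRegression

        X, y = make_classification(n_samples=600, random_state=0)
        svm = SVC(kernel="rbf").fit(X[:400], y[:400])

        # Decision values on held-out data, then a 1-D logistic (sigmoid) fit
        f_cal = svm.decision_function(X[400:500]).reshape(-1, 1)
        sigmoid = LogisticRegression().fit(f_cal, y[400:500])

        f_test = svm.decision_function(X[500:]).reshape(-1, 1)
        posterior = sigmoid.predict_proba(f_test)[:, 1]
        print(posterior[:5])                  # calibrated P(class 1 | x)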

  6. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  7. Data-driven probability concentration and sampling on manifold

    Energy Technology Data Exchange (ETDEWEB)

    Soize, C., E-mail: christian.soize@univ-paris-est.fr [Université Paris-Est, Laboratoire Modélisation et Simulation Multi-Echelle, MSME UMR 8208 CNRS, 5 bd Descartes, 77454 Marne-La-Vallée Cedex 2 (France); Ghanem, R., E-mail: ghanem@usc.edu [University of Southern California, 210 KAP Hall, Los Angeles, CA 90089 (United States)

    2016-09-15

    A new methodology is proposed for generating realizations of a random vector with values in a finite-dimensional Euclidean space that are statistically consistent with a dataset of observations of this vector. The probability distribution of this random vector, while a priori not known, is presumed to be concentrated on an unknown subset of the Euclidean space. A random matrix is introduced whose columns are independent copies of the random vector and for which the number of columns is the number of data points in the dataset. The approach is based on the use of (i) the multidimensional kernel-density estimation method for estimating the probability distribution of the random matrix, (ii) a MCMC method for generating realizations for the random matrix, (iii) the diffusion-maps approach for discovering and characterizing the geometry and the structure of the dataset, and (iv) a reduced-order representation of the random matrix, which is constructed using the diffusion-maps vectors associated with the first eigenvalues of the transition matrix relative to the given dataset. The convergence aspects of the proposed methodology are analyzed and a numerical validation is explored through three applications of increasing complexity. The proposed method is found to be robust to noise levels and data complexity as well as to the intrinsic dimension of data and the size of experimental datasets. Both the methodology and the underlying mathematical framework presented in this paper contribute new capabilities and perspectives at the interface of uncertainty quantification, statistical data analysis, stochastic modeling and associated statistical inverse problems.
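
    Steps (i) and (ii) can be illustrated in miniature with a kernel-density estimate followed by resampling; the noisy circle below is a synthetic stand-in for data concentrated near a manifold, and the sketch omits the MCMC and diffusion-maps machinery entirely.

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(5)
        theta = rng.uniform(0, 2 * np.pi, 300)
        # 300 observations concentrated near the unit circle (a 1-D manifold)
        data = np.vstack([np.cos(theta), np.sin(theta)]) + rng.normal(0, 0.1, (2, 300))

        kde = gaussian_kde(data)              # kernel-density estimate of the unknown law
        new_samples = kde.resample(1000, seed=6)
        print(new_samples.shape)              # (2, 1000) statistically consistent draws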

  8. Using Inverse Probability Bootstrap Sampling to Eliminate Sample Induced Bias in Model Based Analysis of Unequal Probability Samples.

    Science.gov (United States)

    Nahorniak, Matthew; Larsen, David P; Volk, Carol; Jordan, Chris E

    2015-01-01

    In ecology, as in other research fields, efficient sampling for population estimation often drives sample designs toward unequal probability sampling, such as in stratified sampling. Design-based statistical analysis tools are appropriate for seamless integration of sample design into the statistical analysis. However, it is also common and necessary, after a sampling design has been implemented, to use datasets to address questions that, in many cases, were not considered during the sampling design phase. Questions may arise requiring the use of model-based statistical tools such as multiple regression, quantile regression, or regression tree analysis. However, such model-based tools may require, for ensuring unbiased estimation, data from simple random samples, which can be problematic when analyzing data from unequal probability designs. Despite numerous method-specific tools available to properly account for sampling design, too often in the analysis of ecological data, sample design is ignored and consequences are not properly considered. We demonstrate here that violation of this assumption can lead to biased parameter estimates in ecological research. In addition to the set of tools available for researchers to properly account for sampling design in model-based analysis, we introduce inverse probability bootstrapping (IPB). Inverse probability bootstrapping is an easily implemented method for obtaining equal probability re-samples from a probability sample, from which unbiased model-based estimates can be made. We demonstrate the potential for bias in model-based analyses that ignore sample inclusion probabilities, and the effectiveness of IPB sampling in eliminating this bias, using both simulated and actual ecological data. For illustration, we considered three model-based analysis tools--linear regression, quantile regression, and boosted regression tree analysis. In all models, using both simulated and actual ecological data, we found inferences to be
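
    A compact sketch of the IPB idea, with simulated data: resample the realized unequal-probability sample with selection probabilities proportional to 1/π_i (the inverse inclusion probabilities), then apply the model-based estimator to each replicate. The data-generating choices here are assumptions for illustration.

        import numpy as np

        rng = np.random.default_rng(11)
        n = 2_000
        x = rng.normal(size=n)
        y = 2.0 * x + rng.normal(size=n)
        pi = 1 / (1 + np.exp(-x))            # unequal inclusion probabilities
        keep = rng.random(n) < pi            # the realized unequal-probability sample
        xs, ys, pis = x[keep], y[keep], pi[keep]

        slopes = []
        w = (1 / pis) / (1 / pis).sum()      # IPB selection probabilities
        for _ in range(500):                 # bootstrap replicates
            idx = rng.choice(len(xs), size=len(xs), replace=True, p=w)
            slopes.append(np.polyfit(xs[idx], ys[idx], 1)[0])
        print("IPB slope estimate:", np.mean(slopes))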

  9. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this one can “calibrate” inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have...

  10. Probability of flooding: An uncertainty analysis

    NARCIS (Netherlands)

    Slijkhuis, K.A.H.; Frijters, M.P.C.; Cooke, R.M.; Vrouwenvelder, A.C.W.M.

    1998-01-01

    In the Netherlands a new safety approach concerning the flood defences will probably be implemented in the near future. Therefore, an uncertainty analysis is currently being carried out to determine the uncertainty in the probability of flooding. The uncertainty of the probability of flooding could

  11. Lévy processes in free probability

    OpenAIRE

    Barndorff-Nielsen, Ole E.; Thorbjørnsen, Steen

    2002-01-01

    This is the continuation of a previous article that studied the relationship between the classes of infinitely divisible probability measures in classical and free probability, respectively, via the Bercovici–Pata bijection. Drawing on the results of the preceding article, the present paper outlines recent developments in the theory of Lévy processes in free probability.

  12. The trajectory of the target probability effect.

    Science.gov (United States)

    Hon, Nicholas; Yap, Melvin J; Jabar, Syaheed B

    2013-05-01

    The effect of target probability on detection times is well-established: Even when detection accuracy is high, lower probability targets are detected more slowly than higher probability ones. Although this target probability effect on detection times has been well-studied, one aspect of it has remained largely unexamined: How the effect develops over the span of an experiment. Here, we investigated this issue with two detection experiments that assessed different target probability ratios. Conventional block segment analysis and linear mixed-effects modeling converged on two key findings. First, we found that the magnitude of the target probability effect increases as one progresses through a block of trials. Second, we found, by examining the trajectories of the low- and high-probability targets, that this increase in effect magnitude was driven by the low-probability targets. Specifically, we found that low-probability targets were detected more slowly as a block of trials progressed. Performance to high-probability targets, on the other hand, was largely invariant across the block. The latter finding is of particular interest because it cannot be reconciled with accounts that propose that the target probability effect is driven by the high-probability targets.

  13. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Probability calculation. 1.1623 Section 1.1623... Mass Media Services General Procedures § 1.1623 Probability calculation. (a) All calculations shall be computed to no less than three significant digits. Probabilities will be truncated to the number of...

  14. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  15. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  16. USING THE WEB-SERVICES WOLFRAM|ALPHA TO SOLVE PROBLEMS IN PROBABILITY THEORY

    Directory of Open Access Journals (Sweden)

    Taras Kobylnyk

    2015-10-01

    Full Text Available The trend towards the use of remote network resources on the Internet is clearly delineated, and traditional training is increasingly combined with networked, remote technologies such as cloud computing. Methods of probability theory are used in various fields, notably in psychological and educational research for the statistical analysis of experimental data, and conducting such research is impossible without modern information technology. Given the advantages of web-based software, the article describes the web service Wolfram|Alpha and analyzes in detail how it can be used to solve problems in probability theory, in particular from the topics of random events and random variables. The case studies present the results of queries for the number of occurrences of an event A in n independent trials, for a continuous random variable with a normal or uniform probability distribution (including calculating the probability that the random variable falls in a given interval), and for the binomial and hypergeometric probability distributions of a discrete random variable.
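
    The same textbook computations can be reproduced offline; the scipy snippet below mirrors the kinds of queries described (the concrete parameter values are made up for illustration, and Wolfram|Alpha itself is reached through its web interface, not this code).

        from scipy import stats

        # Probability of exactly 3 occurrences of event A in 10 independent trials
        print(stats.binom.pmf(3, n=10, p=0.25))

        # P(-1 < X < 2) for X ~ N(0, 1)
        print(stats.norm.cdf(2) - stats.norm.cdf(-1))

        # The same interval probability for X uniform on [0, 5]
        u = stats.uniform(loc=0, scale=5)
        print(u.cdf(2) - u.cdf(1))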

  17. Probability distribution of the time-averaged mean-square displacement of a Gaussian process.

    Science.gov (United States)

    Grebenkov, Denis S

    2011-09-01

    We study the probability distribution of the time-averaged mean-square displacement of a discrete Gaussian process. An empirical approximation for the probability density is suggested and numerically validated for fractional Brownian motion. The optimality of quadratic forms for inferring dynamical and microrheological quantities from individual random trajectories is discussed, with emphasis on a reliable interpretation of single-particle tracking experiments.
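
    The quantity under study is easy to compute for a single trajectory. In this Python sketch ordinary Brownian motion stands in for the fractional case, which is a simplifying assumption, not the paper's setting.

        import numpy as np

        rng = np.random.default_rng(13)
        x = np.cumsum(rng.normal(size=10_000))        # discrete Brownian trajectory

        def tamsd(x, lag):
            """Time average of (x[t+lag] - x[t])**2 along one trajectory."""
            d = x[lag:] - x[:-lag]
            return np.mean(d * d)

        for lag in (1, 10, 100):
            print(lag, tamsd(x, lag))                 # grows roughly linearly in lag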

  18. Probability theory and statistical applications a profound treatise for self-study

    CERN Document Server

    Zörnig, Peter

    2016-01-01

    This accessible and easy-to-read book provides many examples to illustrate diverse topics in probability and statistics, from initial concepts up to advanced calculations. Special attention is devoted, e.g., to independence of events, inequalities in probability, and functions of random variables. The book is directed to students of mathematics, statistics, engineering, and other quantitative sciences.

  19. Design and simulation of stratified probability digital receiver with application to the multipath communication

    Science.gov (United States)

    Deal, J. H.

    1975-01-01

    One approach to simplifying complex nonlinear filtering algorithms is the use of stratified probability approximations, in which the continuous probability density functions of certain random variables are represented by discrete mass approximations. This technique is developed in this paper and used to simplify the filtering algorithms developed for the optimum receiver for signals corrupted by both additive and multiplicative noise.

  20. Probability Judgements in Multi-Stage Problems : Experimental Evidence of Systematic Biases

    NARCIS (Netherlands)

    Gneezy, U.

    1996-01-01

    We report empirical evidence that in problems of random walk with positive drift, bounded rationality leads individuals to under-estimate the probability of success in the long run. In particular, individuals who were given the stage by stage probability distribution failed to aggregate this
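
    The aggregation the subjects failed to perform is simple binomial arithmetic: with an assumed per-stage up-probability of 0.6 (a made-up value for illustration), the chance that a 25-stage walk ends above its start is far larger than the single-stage probability.

        from math import comb

        p, n = 0.6, 25                     # up-step probability, number of stages
        # P(more up-steps than down-steps after n stages), n odd so no ties
        prob = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n // 2 + 1, n + 1))
        print(prob)                        # ≈ 0.85, well above the per-stage 0.6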

  1. Ruin probability with claims modeled by a stationary ergodic stable process

    NARCIS (Netherlands)

    Mikosch, T; Samorodnitsky, G

    2000-01-01

    For a random walk with negative drift we study the exceedance probability (ruin probability) of a high threshold. The steps of this walk (claim sizes) constitute a stationary ergodic stable process. We study how ruin occurs in this situation and evaluate the asymptotic behavior of the ruin

  2. Dynamic randomization and a randomization model for clinical trials data.

    Science.gov (United States)

    Kaiser, Lee D

    2012-12-20

    Randomization models are useful in supporting the validity of linear model analyses applied to data from a clinical trial that employed randomization via permuted blocks. Here, a randomization model for clinical trials data with arbitrary randomization methodology is developed, with treatment effect estimators and standard error estimators valid from a randomization perspective. A central limit theorem for the treatment effect estimator is also derived. As with permuted-blocks randomization, a typical linear model analysis provides results similar to the randomization model results when, roughly, unit effects display no pattern over time. A key requirement for the randomization inference is that the unconditional probability that any patient receives active treatment is constant across patients; when this probability condition is violated, the treatment effect estimator is biased from a randomization perspective. Most randomization methods for balanced, 1 to 1, treatment allocation satisfy this condition. However, many dynamic randomization methods for planned unbalanced treatment allocation, like 2 to 1, do not satisfy this constant probability condition, and these methods should be avoided. Copyright © 2012 John Wiley & Sons, Ltd.
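
    The constant-probability condition is easy to check empirically for a given scheme. The sketch below verifies that 1:1 permuted blocks give every position within a block an unconditional 0.5 chance of active treatment; the block size of 4 is an arbitrary choice.

        import random

        def permuted_block(size=4, seed=None):
            """One permuted block with equal numbers of active (A) and placebo (P)."""
            rng = random.Random(seed)
            block = ["A"] * (size // 2) + ["P"] * (size // 2)
            rng.shuffle(block)
            return block

        reps = 100_000
        counts = [0] * 4
        for i in range(reps):
            for pos, arm in enumerate(permuted_block(seed=i)):
                counts[pos] += arm == "A"
        print([round(c / reps, 3) for c in counts])   # each entry close to 0.5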

  3. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  4. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs. We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.

  5. Probability Theory as Logic: Data Assimilation for Multiple Source Reconstruction

    Science.gov (United States)

    Yee, Eugene

    2012-03-01

    Probability theory as logic (or Bayesian probability theory) is a rational inferential methodology that provides a natural and logically consistent framework for source reconstruction. This methodology fully utilizes the information provided by a limited number of noisy concentration data obtained from a network of sensors and combines it in a consistent manner with the available prior knowledge (mathematical representation of relevant physical laws), hence providing a rigorous basis for the assimilation of this data into models of atmospheric dispersion for the purpose of contaminant source reconstruction. This paper addresses the application of this framework to the reconstruction of contaminant source distributions consisting of an unknown number of localized sources, using concentration measurements obtained from a sensor array. To this purpose, Bayesian probability theory is used to formulate the full joint posterior probability density function for the parameters of the unknown source distribution. A simulated annealing algorithm, applied in conjunction with a reversible-jump Markov chain Monte Carlo technique, is used to draw random samples of source distribution models from the posterior probability density function. The methodology is validated against a real (full-scale) atmospheric dispersion experiment involving a multiple point source release.

  6. Modelling survival after treatment of intraocular melanoma using artificial neural networks and Bayes theorem

    Energy Technology Data Exchange (ETDEWEB)

    Taktak, Azzam F G [Department of Clinical Engineering, Duncan Building, Royal Liverpool University Hospital, Liverpool L7 8XP (United Kingdom); Fisher, Anthony C [Department of Clinical Engineering, Duncan Building, Royal Liverpool University Hospital, Liverpool L7 8XP (United Kingdom); Damato, Bertil E [Department of Ophthalmology, Royal Liverpool University Hospital, Liverpool L7 8XP (United Kingdom)

    2004-01-07

    This paper describes the development of an artificial intelligence (AI) system for survival prediction from intraocular melanoma. The system used artificial neural networks (ANNs) with five input parameters: coronal and sagittal tumour location, anterior tumour margin, largest basal tumour diameter and the cell type. After excluding records with missing data, 2331 patients were included in the study. These were split randomly into training and test sets. Date censorship was applied to the records to deal with patients who were lost to follow-up and patients who died from general causes. Bayes theorem was then applied to the ANN output to construct survival probability curves. A validation set with 34 patients unseen to both training and test sets was used to compare the AI system with Cox's regression (CR) and Kaplan-Meier (KM) analyses. Results showed large differences in the mean 5-year survival probability figures when the number of records with matching characteristics was small. However, as the number of matches increased to >100 the system tended to agree with CR and KM. The validation set was also used to compare the system with a clinical expert in predicting time to metastatic death. The rms error was 3.7 years for the system and 4.3 years for the clinical expert for 15-year survival. For <10-year survival, these figures were 2.7 and 4.2, respectively. We concluded that the AI system can match, if not outperform, the clinical expert's prediction. There were significant differences with CR and KM analyses when the number of records was small, but it was not known which model is more accurate.

  7. Long-term survival of a randomized phase III trial of head and neck cancer patients receiving concurrent chemoradiation therapy with or without low-level laser therapy (LLLT) to prevent oral mucositis.

    Science.gov (United States)

    Antunes, Héliton S; Herchenhorn, Daniel; Small, Isabele A; Araújo, Carlos M M; Viégas, Celia Maria Pais; de Assis Ramos, Gabriela; Dias, Fernando L; Ferreira, Carlos G

    2017-08-01

    The impact of low-level laser therapy (LLLT) to prevent oral mucositis in patients treated with exclusive chemoradiation therapy remains unknown. This study evaluated the overall, disease-free and progression-free survival of these patients. Overall, disease-free and progression-free survival of 94 patients diagnosed with oropharynx, nasopharynx, and hypopharynx cancer, who participated in a phase III study, was evaluated from 2007 to 2015. The patients were subjected to conventional radiotherapy plus cisplatin every 3 weeks. LLLT was applied with an InGaAlP diode (660 nm, 100 mW, 1 J, 4 J/cm²). With a median follow-up of 41.3 months (range 0.7-101.9), patients receiving LLLT had a statistically significant better complete response to treatment than those in the placebo group (LG=89.1%; PG=67.4%; p=0.013). Patients subjected to LLLT also displayed greater progression-free survival than those in the placebo group (61.7% vs. 40.4%; p=0.030; HR: 1.93; CI 95%: 1.07-3.5) and had a tendency toward better overall survival (57.4% vs. 40.4%; p=0.090; HR: 1.64; CI 95%: 0.92-2.91). This is the first study to suggest that LLLT may improve survival of head and neck cancer patients treated with chemoradiotherapy. Further studies, with a larger sample, are necessary to confirm our findings. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Approximating the Probability of Mortality Due to Protracted Radiation Exposures

    Science.gov (United States)

    2016-06-01

    Radiological weapons (“dirty bombs”) will in most cases disperse radionuclides whose half-life is long enough that... Under the current Nuclear Survivability and Forensics contract, HDTRA1-14-D-0003; 0005, Dr. Paul Blake of DTRA/NTPR has supported the transition of... present approximate methods for estimating the probability of mortality due to radiological environments from nuclear weapon detonations or from a

  9. Adolescents' misinterpretation of health risk probability expressions.

    Science.gov (United States)

    Cohn, L D; Schydlower, M; Foley, J; Copeland, R L

    1995-05-01

    Objective: To determine if differences exist between adolescents and physicians in their numerical translation of 13 commonly used probability expressions (eg, possibly, might). Design: Cross-sectional. Setting: Adolescent medicine and pediatric orthopedic outpatient units. Participants: 150 adolescents and 51 pediatricians, pediatric orthopedic surgeons, and nurses. Main outcome measures: Numerical ratings of the degree of certainty implied by 13 probability expressions (eg, possibly, probably). Results: Adolescents were significantly more likely than physicians to display comprehension errors, reversing or equating the meaning of terms such as probably/possibly and likely/possibly. Numerical expressions of uncertainty (eg, 30% chance) elicited less variability in ratings than lexical expressions of uncertainty (eg, possibly). Conclusions: Physicians should avoid using probability expressions such as probably, possibly, and likely when communicating health risks to children and adolescents. Numerical expressions of uncertainty may be more effective for conveying the likelihood of an illness than lexical expressions of uncertainty (eg, probably).

  10. A practical overview on probability distributions.

    Science.gov (United States)

    Viti, Andrea; Terzi, Alberto; Bertolaccini, Luca

    2015-03-01

    The aim of this paper is a general definition of probability, its main mathematical features, and the features it presents under particular circumstances. The behavior of probability is linked to the features of the phenomenon we would predict. This link can be defined as a probability distribution. Given the characteristics of phenomena (which we can also call variables), corresponding probability distributions are defined. For categorical (or discrete) variables, the probability can be described by a binomial or Poisson distribution in the majority of cases. For continuous variables, the probability can be described by the most important distribution in statistics, the normal distribution. Distributions of probability are briefly described together with some examples for their possible application.

  11. Integrated statistical modelling of spatial landslide probability

    Science.gov (United States)

    Mergili, M.; Chu, H.-J.

    2015-09-01

    Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) the landslide inventory is subset into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e., the spatial probability that at least one landslide pixel occurs within a zone of defined size. We quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km² study area in southern Taiwan, using an inventory of 1399 landslides triggered by the typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability; (iv) removing the largest landslides from the analysis leads to an enhanced model performance.
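
    Written out as code, the combination rule in step (v) is a one-liner; the per-pixel input values below are hypothetical.

        def integrated_probability(p_release, p_impact, p_zonal_release):
            """max(release probability, impact probability * zonal release probability)."""
            return max(p_release, p_impact * p_zonal_release)

        print(integrated_probability(0.02, 0.6, 0.15))   # -> 0.09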

  12. Estimating survival of dental fillings on the basis of interval-censored data and multi-state models

    DEFF Research Database (Denmark)

    Joly, Pierre; Gerds, Thomas A; Qvist, Vibeke

    2012-01-01

    We aim to compare the life expectancy of a filling in a primary tooth between two types of treatments. We define the probabilities that a dental filling survives without complication until the permanent tooth erupts from beneath (exfoliation). We relate the time to exfoliation of the tooth...... with all these particularities, we propose to use a parametric four-state model with three random effects to take into account the hierarchical cluster structure. For inference, right and interval censoring as well as left truncation have to be dealt with. With the proposed approach, we can conclude...

  13. Blocked Randomization with Randomly Selected Block Sizes

    Directory of Open Access Journals (Sweden)

    Jimmy Efird

    2010-12-01

    Full Text Available When planning a randomized clinical trial, careful consideration must be given to how participants are selected for various arms of a study. Selection and accidental bias may occur when participants are not assigned to study groups with equal probability. A simple random allocation scheme is a process by which each participant has equal likelihood of being assigned to treatment versus referent groups. However, by chance an unequal number of individuals may be assigned to each arm of the study and thus decrease the power to detect statistically significant differences between groups. Block randomization is a commonly used technique in clinical trial design to reduce bias and achieve balance in the allocation of participants to treatment arms, especially when the sample size is small. This method increases the probability that each arm will contain an equal number of individuals by sequencing participant assignments by block. Yet still, the allocation process may be predictable, for example, when the investigator is not blind and the block size is fixed. This paper provides an overview of blocked randomization and illustrates how to avoid selection bias by using random block sizes.

  14. Blocked randomization with randomly selected block sizes.

    Science.gov (United States)

    Efird, Jimmy

    2011-01-01

    When planning a randomized clinical trial, careful consideration must be given to how participants are selected for various arms of a study. Selection and accidental bias may occur when participants are not assigned to study groups with equal probability. A simple random allocation scheme is a process by which each participant has equal likelihood of being assigned to treatment versus referent groups. However, by chance an unequal number of individuals may be assigned to each arm of the study and thus decrease the power to detect statistically significant differences between groups. Block randomization is a commonly used technique in clinical trial design to reduce bias and achieve balance in the allocation of participants to treatment arms, especially when the sample size is small. This method increases the probability that each arm will contain an equal number of individuals by sequencing participant assignments by block. Yet still, the allocation process may be predictable, for example, when the investigator is not blind and the block size is fixed. This paper provides an overview of blocked randomization and illustrates how to avoid selection bias by using random block sizes.
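
    A minimal sketch of such an allocation list in Python, assuming 1:1 allocation and randomly chosen block sizes of 2 or 4 (both illustrative choices), so the next assignment is harder to predict than with a fixed block length:

        import random

        def blocked_randomization(n, block_sizes=(2, 4), seed=2010):
            rng = random.Random(seed)
            allocation = []
            while len(allocation) < n:
                size = rng.choice(block_sizes)        # randomly selected block size
                block = ["T"] * (size // 2) + ["C"] * (size // 2)
                rng.shuffle(block)                    # permute within the block
                allocation.extend(block)
            return allocation[:n]

        print(blocked_randomization(12))              # balanced 1:1 list of T/C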

  15. Mathematical Methods in Survival Analysis, Reliability and Quality of Life

    CERN Document Server

    Huber, Catherine; Mesbah, Mounir

    2008-01-01

    Reliability and survival analysis are important applications of stochastic mathematics (probability, statistics and stochastic processes) that are usually covered separately in spite of the similarity of the involved mathematical theory. This title aims to redress this situation: it includes 21 chapters divided into four parts: Survival analysis, Reliability, Quality of life, and Related topics. Many of these chapters were presented at the European Seminar on Mathematical Methods for Survival Analysis, Reliability and Quality of Life in 2006.

  16. Trisomy 13 (Patau syndrome) with an 11-year survival.

    Science.gov (United States)

    Zoll, B; Wolf, J; Lensing-Hebben, D; Pruggmayer, M; Thorpe, B

    1993-01-01

    Trisomy 13 is very rare in live-born children. Only a small number of these children survive the first year, and very few cases are reported to live longer. Survival time depends partly on the cytogenetic findings (full trisomy 13 or trisomy 13 mosaicism) and partly on the existence of serious somatic malformations. We report on an 11-year-old girl with full trisomy 13. In this case, the absence of cerebral and cardiovascular malformations probably allowed the long survival.

  17. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and tools focus on managing severity but are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of data-based (frequency-based) calculation and a subjective "degree of belief" meaning of probability. Probability as a concept crucial for understanding and managing risk is discussed through examples, from the most general scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur across the pharmaceutical life cycle, from drug development to manufacture, and from marketing to product discontinuation. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management.

  18. Daily survival rate and habitat characteristics of nests of Wilson's Plover

    Science.gov (United States)

    Zinsser, Elizabeth; Sanders, Felicia J.; Gerard, Patrick D.; Jodice, Patrick G.R.

    2017-01-01

    We assessed habitat characteristics and measured the daily survival rate of 72 nests of Charadrius wilsonia (Wilson's Plover) during 2012 and 2013 on South Island and Sand Island on the central coast of South Carolina. At both study areas, nest sites were located at slightly higher elevations (i.e., small platforms of sand) relative to randomly selected nearby unused sites, and nests at each study area also appeared to be situated to enhance crypsis and/or vigilance. Daily survival rate (DSR) of nests ranged from 0.969 to 0.988 among study sites and years, and the probability of nest survival ranged from 0.405 to 0.764. Flooding and predation were the most common causes of nest failure at both sites. At South Island, DSR was most strongly related to maximum tide height, which suggests that flooding and overwash may be common causes of nest loss for Wilson's Plovers at these study sites. The difference in model results between the two nearby study sites may be partially due to more frequent flooding at Sand Island because of some underlying yet unmeasured physiographic feature. Remaining data gaps for the species include regional assessments of nest and chick survival and habitat requirements during chick rearing.
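
    The reported nest-survival probabilities follow from compounding the daily survival rate over the nesting exposure period. A small sketch, assuming a roughly 28-day exposure period (our assumption; the abstract does not state the period):

        def nest_survival(dsr: float, exposure_days: int) -> float:
            """Probability a nest survives the whole exposure period,
            assuming a constant daily survival rate and independent days."""
            return dsr ** exposure_days

        # DSR endpoints reported in the abstract; exposure length is assumed
        for dsr in (0.969, 0.988):
            print(f"DSR={dsr}: 28-day survival = {nest_survival(dsr, 28):.3f}")

    With these inputs the compounded values (about 0.41 and 0.71) fall inside the 0.405 to 0.764 range reported above.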

  19. Neyman, Markov processes and survival analysis.

    Science.gov (United States)

    Yang, Grace

    2013-07-01

    J. Neyman used stochastic processes extensively in his applied work. One example is the Fix and Neyman (F-N) competing risks model (1951), which uses finite homogeneous Markov processes to analyse clinical trials with breast cancer patients. We revisit the F-N model and compare it with the Kaplan-Meier (K-M) formulation for right-censored data. The comparison offers a way to generalize the K-M formulation to include risks of recovery and relapse in the calculation of a patient's survival probability. The generalization extends the F-N model to a nonhomogeneous Markov process. Closed-form solutions of the survival probability are available in special cases of the nonhomogeneous processes, such as the popular multiple decrement model (including the K-M model) and Chiang's staging model, but these models do not consider recovery and relapses while the F-N model does. An analysis of sero-epidemiological current-status data with recurrent events is illustrated. Fix and Neyman used Neyman's RBAN (regular best asymptotic normal) estimates for the risks, and provided a numerical example showing the importance of considering both the survival probability and the length of time a patient lives a normal life in the evaluation of clinical trials. The extension described here results in a complicated model for which analytical closed-form solutions are unlikely; with ever-increasing computing power, numerical methods offer a viable way of investigating the problem.
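
    The closing point, that numerical methods make the extended model tractable, amounts to approximating the product integral of a time-dependent intensity matrix. A minimal sketch follows; the three-state structure and the intensity values are invented for illustration and are not the Fix-Neyman model itself.

        import numpy as np

        def transition_matrix(Q, t0, t1, steps=1000):
            """Approximate P(t0, t1) as a product of (I + Q(t) dt) factors
            for a nonhomogeneous Markov process with intensity matrix Q(t)."""
            n = Q(t0).shape[0]
            P = np.eye(n)
            dt = (t1 - t0) / steps
            for k in range(steps):
                P = P @ (np.eye(n) + Q(t0 + k * dt) * dt)
            return P

        def Q(t):
            # Hypothetical rates: relapse a(t), death b, recovery r
            a, b, r = 0.05 + 0.01 * t, 0.02, 0.2
            return np.array([[-(a + b), a,        b  ],    # healthy
                             [ r,      -(r + b),  b  ],    # relapsed
                             [ 0.0,     0.0,      0.0]])   # dead (absorbing)

        P = transition_matrix(Q, 0.0, 5.0)
        print("P(alive at t=5 | healthy at 0) =", 1 - P[0, 2])

    Each row of Q sums to zero, so each row of the approximated P stays (approximately) a probability distribution; the survival probability is one minus the mass in the absorbing state.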

  20. Experience Matters: Information Acquisition Optimizes Probability Gain

    Science.gov (United States)

    Nelson, Jonathan D.; McKenzie, Craig R.M.; Cottrell, Garrison W.; Sejnowski, Terrence J.

    2010-01-01

    Deciding which piece of information to acquire or attend to is fundamental to perception, categorization, medical diagnosis, and scientific inference. Four statistical theories of the value of information—information gain, Kullback-Leibler distance, probability gain (error minimization), and impact—are equally consistent with extant data on human information acquisition. Three experiments, designed via computer optimization to be maximally informative, tested which of these theories best describes human information search. Experiment 1, which used natural sampling and experience-based learning to convey environmental probabilities, found that probability gain explained subjects’ information search better than the other statistical theories or the probability-of-certainty heuristic. Experiments 1 and 2 found that subjects behaved differently when the standard method of verbally presented summary statistics (rather than experience-based learning) was used to convey environmental probabilities. Experiment 3 found that subjects’ preference for probability gain is robust, suggesting that the other models contribute little to subjects’ search behavior. PMID:20525915
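
    Probability gain has a compact form: the expected accuracy of a maximum-probability guess after seeing the query outcome, minus the accuracy of the best prior guess. A hedged sketch (the priors and likelihoods are illustrative, not from the experiments):

        def probability_gain(prior, likelihoods):
            """Expected improvement in the probability of a correct best guess
            from observing a query; likelihoods[c][q] = P(outcome q | category c)."""
            n_outcomes = len(likelihoods[0])
            posterior_accuracy = 0.0
            for q in range(n_outcomes):
                # P(q) * max_c P(c | q) equals max_c P(c, q)
                posterior_accuracy += max(p * lik[q]
                                          for p, lik in zip(prior, likelihoods))
            return posterior_accuracy - max(prior)

        prior = [0.7, 0.3]
        likelihoods = [[0.9, 0.1],   # P(q | category 1)
                       [0.2, 0.8]]   # P(q | category 2)
        print(probability_gain(prior, likelihoods))  # 0.87 - 0.70 = 0.17

    Information gain, Kullback-Leibler distance, and impact substitute different payoff functions into the same expectation over query outcomes, which is why the four theories are hard to distinguish on casually chosen stimuli.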