WorldWideScience

Sample records for random survival probabilities

  1. Robust estimation of the expected survival probabilities from high-dimensional Cox models with biomarker-by-treatment interactions in randomized clinical trials

    Directory of Open Access Journals (Sweden)

    Nils Ternès

    2017-05-01

    Full Text Available Abstract Background Thanks to advances in genomics and targeted treatments, more and more prediction models based on biomarkers are being developed to predict potential benefit from treatments in a randomized clinical trial. Although the methodological framework for the development and validation of prediction models in a high-dimensional setting is becoming increasingly established, no clear guidance exists yet on how to estimate expected survival probabilities in a penalized model with biomarker-by-treatment interactions. Methods Based on a parsimonious biomarker selection in a penalized high-dimensional Cox model (lasso or adaptive lasso), we propose a unified framework to: estimate internally the predictive accuracy metrics of the developed model (using double cross-validation); estimate the individual survival probabilities at a given timepoint; construct confidence intervals thereof (analytical or bootstrap); and visualize them graphically (pointwise or smoothed with splines). We compared these strategies through a simulation study covering scenarios with or without biomarker effects. We applied the strategies to a large randomized phase III clinical trial that evaluated the effect of adding trastuzumab to chemotherapy in 1574 early breast cancer patients, for which the expression of 462 genes was measured. Results In our simulations, penalized regression models using the adaptive lasso estimated the survival probability of new patients with low bias and standard error; bootstrapped confidence intervals had empirical coverage probability close to the nominal level across very different scenarios. The double cross-validation performed on the training data set closely mimicked the predictive accuracy of the selected models in external validation data. We also propose a useful visual representation of the expected survival probabilities using splines. In the breast cancer trial, the adaptive lasso penalty selected a prediction model with 4
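
    A minimal sketch of the kind of calculation this record describes: fitting an L1-penalized Cox model and reading off per-patient survival probabilities at a fixed timepoint. It assumes the lifelines package; the synthetic data, column names, and penalty value are illustrative assumptions, not the authors' pipeline (which additionally uses double cross-validation and bootstrap confidence intervals).

      # Hypothetical illustration, not the paper's code.
      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(0)
      n, p = 200, 10
      df = pd.DataFrame(rng.normal(size=(n, p)), columns=[f"gene{i}" for i in range(p)])
      df["treatment"] = rng.integers(0, 2, size=n)
      df["gene0_x_treat"] = df["gene0"] * df["treatment"]   # biomarker-by-treatment interaction
      df["time"] = rng.exponential(scale=np.exp(-0.8 * df["gene0_x_treat"]))
      df["event"] = (rng.random(n) < 0.8).astype(int)       # crude ~20% random censoring

      cph = CoxPHFitter(penalizer=0.1, l1_ratio=1.0)        # lasso-type penalty
      cph.fit(df, duration_col="time", event_col="event")

      # expected survival probability at t = 1.0 for the first five patients
      covs = df.drop(columns=["time", "event"]).iloc[:5]
      print(cph.predict_survival_function(covs, times=[1.0]))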

  2. Probability of Survival Decision Aid (PSDA)

    National Research Council Canada - National Science Library

    Xu, Xiaojiang; Amin, Mitesh; Santee, William R

    2008-01-01

    A Probability of Survival Decision Aid (PSDA) is developed to predict survival time for hypothermia and dehydration during prolonged exposure at sea in both air and water for a wide range of environmental conditions...

  3. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  4. Estimating the joint survival probabilities of married individuals

    NARCIS (Netherlands)

    Sanders, Lisanne; Melenberg, Bertrand

    We estimate the joint survival probability of spouses using a large random sample drawn from a Dutch census. As benchmarks we use two bivariate Weibull models. We consider more flexible models, using a semi-nonparametric approach, by extending the independent Weibull distribution using squared

  5. Joint survival probability via truncated invariant copula

    International Nuclear Information System (INIS)

    Kim, Jeong-Hoon; Ma, Yong-Ki; Park, Chan Yeol

    2016-01-01

    Highlights: • We have studied the issue of dependence structure between default intensities. • We use a multivariate shot noise intensity process, where jumps occur simultaneously and their sizes are correlated. • We obtain the joint survival probability of the integrated intensities by using a copula. • We apply our theoretical result to pricing basket default swap spreads. - Abstract: Given an intensity-based credit risk model, this paper studies the dependence structure between default intensities. To model this structure, we use a multivariate shot noise intensity process, where jumps occur simultaneously and their sizes are correlated. Through very lengthy algebra, we obtain explicitly the joint survival probability of the integrated intensities by using the truncated invariant Farlie–Gumbel–Morgenstern copula with exponential marginal distributions. We also apply our theoretical result to pricing basket default swap spreads. This result can provide a useful guide for credit risk management.

  6. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  7. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...

  8. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections

  9. Path probabilities of continuous time random walks

    International Nuclear Information System (INIS)

    Eule, Stephan; Friedrich, Rudolf

    2014-01-01

    Employing the path integral formulation of a broad class of anomalous diffusion processes, we derive the exact relations for the path probability densities of these processes. In particular, we obtain a closed analytical solution for the path probability distribution of a Continuous Time Random Walk (CTRW) process. This solution is given in terms of its waiting time distribution and short time propagator of the corresponding random walk as a solution of a Dyson equation. Applying our analytical solution we derive generalized Feynman–Kac formulae. (paper)

  10. Finite-size scaling of survival probability in branching processes

    OpenAIRE

    Garcia-Millan, Rosalba; Font-Clos, Francesc; Corral, Alvaro

    2014-01-01

    Branching processes pervade many models in statistical physics. We investigate the survival probability of a Galton-Watson branching process after a finite number of generations. We reveal the finite-size scaling law of the survival probability for a given branching process ruled by a probability distribution of the number of offspring per element whose standard deviation is finite, obtaining the exact scaling function as well as the critical exponents. Our findings prove the universal behavi...

  11. Survival probability of diffusion with trapping in cellular neurobiology

    Science.gov (United States)

    Holcman, David; Marchewka, Avi; Schuss, Zeev

    2005-09-01

    The problem of diffusion with absorption and trapping sites arises in the theory of molecular signaling inside and on the membranes of biological cells. In particular, this problem arises in the case of spine-dendrite communication, where the number of calcium ions, modeled as random particles, is regulated across the spine microstructure by pumps, which play the role of killing sites, while the end of the dendritic shaft is an absorbing boundary. We develop a general mathematical framework for diffusion in the presence of absorption and killing sites and apply it to the computation of the time-dependent survival probability of ions. We also compute the ratio of the number of absorbed particles at a specific location to the number of killed particles. We show that the ratio depends on the distribution of killing sites. The biological consequence is that the position of the pumps regulates the fraction of calcium ions that reach the dendrite.
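
    A toy Monte Carlo version of the framework this abstract sketches: a one-dimensional random walker with an absorbing end (the dendrite) and a partially killing site (a pump). The geometry, killing probability, and site positions below are illustrative assumptions only.

      # Hypothetical illustration of survival with absorption and killing.
      import numpy as np

      rng = np.random.default_rng(8)
      n_walkers, steps = 2000, 2000
      pump_site, p_kill = 10, 0.2          # killing site and its per-visit probability
      right_wall, start = 30, 25           # reflecting wall and starting position

      absorbed = killed = 0
      for _ in range(n_walkers):
          x = start
          for _ in range(steps):
              x += rng.choice((-1, 1))
              if x >= right_wall:          # reflecting boundary (spine head)
                  x = right_wall - 1
              if x == pump_site and rng.random() < p_kill:
                  killed += 1              # ion removed by the pump
                  break
              if x <= 0:                   # absorbing boundary (dendritic shaft)
                  absorbed += 1
                  break

      print("absorbed:", absorbed / n_walkers, "killed:", killed / n_walkers,
            "still diffusing:", 1 - (absorbed + killed) / n_walkers)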

  12. Gluon saturation: Survival probability for leading neutrons in DIS

    International Nuclear Information System (INIS)

    Levin, Eugene; Tapia, Sebastian

    2012-01-01

    In this paper we discuss an example of a one-rapidity-gap process: the inclusive cross sections of leading neutrons in deep inelastic scattering (DIS) with protons. The equations for this process are proposed and solved, giving an example of a theoretical calculation of the survival probability for one-rapidity-gap processes. It turns out that the value of the survival probability is small and it decreases with energy.

  13. Survival probability in a one-dimensional quantum walk on a trapped lattice

    International Nuclear Information System (INIS)

    Gönülol, Meltem; Aydıner, Ekrem; Shikano, Yutaka; Müstecaplıoğlu, Özgür E

    2011-01-01

    The dynamics of the survival probability of quantum walkers on a one-dimensional lattice with a random distribution of absorbing immobile traps is investigated. The survival probability of quantum walkers is compared with that of classical walkers. It is shown, both numerically and analytically, that the time dependence of the survival probability of quantum walkers has a piecewise stretched exponential character depending on the density of traps. The crossover between the quantum analogues of the Rosenstock and Donsker-Varadhan behavior is identified.

  14. Probability, random processes, and ergodic properties

    CERN Document Server

    Gray, Robert M

    1988-01-01

    This book has been written for several reasons, not all of which are academic. This material was for many years the first half of a book in progress on information and ergodic theory. The intent was and is to provide a reasonably self-contained advanced treatment of measure theory, probability theory, and the theory of discrete time random processes with an emphasis on general alphabets and on ergodic and stationary properties of random processes that might be neither ergodic nor stationary. The intended audience was mathematically inclined engineering graduate students and visiting scholars who had not had formal courses in measure theoretic probability. Much of the material is familiar stuff for mathematicians, but many of the topics and results have not previously appeared in books. The original project grew too large and the first part contained much that would likely bore mathematicians and discourage them from the second part. Hence I finally followed the suggestion to separate the material and split...

  15. Bayesian Analysis for EMP Survival Probability of Solid State Relay

    International Nuclear Information System (INIS)

    Sun Beiyun; Zhou Hui; Cheng Xiangyue; Mao Congguang

    2009-01-01

    The principle of estimating the parameter p of a binomial distribution by the Bayesian method, together with several non-informative priors, is introduced. The survival probability of a DC solid state relay under current injection at a certain amplitude is obtained by this method. (authors)
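
    The calculation this record describes is compact enough to show in full. A sketch under assumed numbers (the trial counts below are invented, not the paper's data), using a Jeffreys Beta(1/2, 1/2) non-informative prior for the binomial survival probability:

      # Hypothetical counts; posterior for binomial p with a Beta prior.
      from scipy.stats import beta

      n_tested, n_survived = 20, 18          # assumed current-injection trials
      a, b = 0.5, 0.5                        # Jeffreys non-informative prior
      post = beta(a + n_survived, b + n_tested - n_survived)

      print("posterior mean survival probability:", post.mean())
      print("95% credible interval:", post.ppf([0.025, 0.975]))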

  16. Random phenomena fundamentals of probability and statistics for engineers

    CERN Document Server

    Ogunnaike, Babatunde A

    2009-01-01

    Prelude: Approach Philosophy; Four Basic Principles. I Foundations: Two Motivating Examples; Yield Improvement in a Chemical Process; Quality Assurance in a Glass Sheet Manufacturing Process; Outline of a Systematic Approach; Random Phenomena, Variability, and Uncertainty; Two Extreme Idealizations of Natural Phenomena; Random Mass Phenomena; Introducing Probability; The Probabilistic Framework. II Probability: Fundamentals of Probability Theory; Building Blocks; Operations; Probability; Conditional Probability; Independence; Random Variables and Distributions; Distributions; Mathematical Expectation; Characterizing Distributions; Special Derived Probability Functions; Multidimensional Random Variables; Distributions of Several Random Variables; Distributional Characteristics of Jointly Distributed Random Variables; Random Variable Transformations; Single Variable Transformations; Bivariate Transformations; General Multivariate Transformations; Application Case Studies I: Probability; Mendel and Heredity; World War II Warship Tactical Response Under Attack. III Distributions: Ide...

  17. Mean exit time and survival probability within the CTRW formalism

    Science.gov (United States)

    Montero, M.; Masoliver, J.

    2007-05-01

    Intense research on financial market microstructure is presently in progress. Continuous time random walks (CTRWs) are general models capable of capturing the small-scale properties that high frequency data series show. The use of CTRW models in the analysis of financial problems is quite recent and their potential has not been fully developed. Here we present two (closely related) applications of great interest in risk control. In the first place, we review the problem of modelling the behaviour of the mean exit time (MET) of a process out of a given region of fixed size. The surveyed stochastic processes are the cumulative returns of asset prices. The link between the value of the MET and the timescale of market fluctuations of a certain degree is crystal clear. In this sense, the MET value may help, for instance, in deciding the optimal time horizon for an investment. The MET is, however, one among the statistics of a distribution of bigger interest: the survival probability (SP), the likelihood that after some lapse of time a process remains inside the given region without having crossed its boundaries. The final part of the manuscript is devoted to the study of this quantity. Note that the use of SPs may outperform the standard "Value at Risk" (VaR) method for two reasons: we can consider market dynamics other than the limited Wiener process and, even in this case, a risk level derived from the SP will ensure (within the desired quantile) that the quoted value of the portfolio will not leave the safety zone. We present some preliminary theoretical and applied results concerning this topic.

  18. Finite-size scaling of survival probability in branching processes.

    Science.gov (United States)

    Garcia-Millan, Rosalba; Font-Clos, Francesc; Corral, Álvaro

    2015-04-01

    Branching processes pervade many models in statistical physics. We investigate the survival probability of a Galton-Watson branching process after a finite number of generations. We derive analytically the existence of finite-size scaling for the survival probability as a function of the control parameter and the maximum number of generations, obtaining the critical exponents as well as the exact scaling function, which is G(y) = 2y e^y / (e^y − 1), with y the rescaled distance to the critical point. Our findings are valid for any branching process of the Galton-Watson type, independently of the distribution of the number of offspring, provided its variance is finite. This proves the universal behavior of the finite-size effects in branching processes, including the universality of the metric factors. The direct relation to mean-field percolation is also discussed.
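
    A quick Monte Carlo cross-check of the quantity studied here, the survival probability of a Galton-Watson process after a finite number of generations. The Poisson offspring law (finite variance, as the result requires) and the parameter values are our choices for illustration.

      # Hypothetical parameters; estimates P(alive after a fixed number of generations).
      import numpy as np

      rng = np.random.default_rng(1)

      def survives(m, generations, z0=1):
          """True if the branching process is still alive after `generations` steps."""
          z = z0
          for _ in range(generations):
              if z == 0:
                  return False
              z = rng.poisson(m, size=z).sum()   # Poisson(m) offspring per individual
          return z > 0

      m, gens, trials = 1.05, 50, 20000          # slightly supercritical
      p_surv = np.mean([survives(m, gens) for _ in range(trials)])
      print(f"P(survive {gens} generations) ≈ {p_surv:.4f}")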

  19. [Survival analysis with competing risks: estimating failure probability].

    Science.gov (United States)

    Llorca, Javier; Delgado-Rodríguez, Miguel

    2004-01-01

    Objective: To show the impact of competing risks of death on survival analysis. Methods: We provide an example of survival time without chronic rejection after heart transplantation, where death before rejection acts as a competing risk. Using a computer simulation, we compare the Kaplan-Meier estimator and the multiple decrement model. Results: The Kaplan-Meier method overestimated the probability of rejection. Next, we illustrate the use of the multiple decrement model to analyze secondary end points (in our example: death after rejection). Finally, we discuss the Kaplan-Meier assumptions and why they fail in the presence of competing risks. Conclusions: Survival analysis should be adjusted for competing risks of death to avoid the overestimation of the risk of rejection produced by the Kaplan-Meier method.
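
    A sketch of the comparison the authors describe, on synthetic data rather than the transplant cohort: the Kaplan-Meier estimator with competing deaths treated as censoring, against the Aalen-Johansen (multiple decrement) estimator of the cumulative incidence. The lifelines package is assumed.

      # Hypothetical data; demonstrates the Kaplan-Meier overestimation.
      import numpy as np
      from lifelines import KaplanMeierFitter, AalenJohansenFitter

      rng = np.random.default_rng(2)
      n = 1000
      t_reject = rng.exponential(10.0, n)          # latent time to chronic rejection
      t_death = rng.exponential(8.0, n)            # latent time to death (competing risk)
      time = np.minimum(t_reject, t_death)
      event = np.where(t_reject <= t_death, 1, 2)  # 1 = rejection, 2 = death first

      # naive: treat competing deaths as censoring -> overestimates P(rejection)
      km = KaplanMeierFitter().fit(time, event_observed=(event == 1))
      print("naive 1 - KM at end of follow-up:", 1 - km.survival_function_.iloc[-1, 0])

      # multiple decrement / cumulative incidence of rejection
      aj = AalenJohansenFitter().fit(time, event, event_of_interest=1)
      print("Aalen-Johansen CIF at end of follow-up:", aj.cumulative_density_.iloc[-1, 0])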

  20. Survival probabilities for branching Brownian motion with absorption

    OpenAIRE

    Harris, John; Harris, Simon

    2007-01-01

    We study a branching Brownian motion (BBM) with absorption, in which particles move as Brownian motions with drift $-\rho$, undergo dyadic branching at rate $\beta>0$, and are killed on hitting the origin. In the case $\rho>\sqrt{2\beta}$ the extinction time for this process, $\zeta$, is known to be finite almost surely. The main result of this article is a large-time asymptotic formula for the survival probability $P^x(\zeta>t)$ in the case $\rho>\sqrt{2\beta}$, where $P^x$ is...

  1. Survival chance in papillary thyroid cancer in Hungary: individual survival probability estimation using the Markov method

    International Nuclear Information System (INIS)

    Esik, Olga; Tusnady, Gabor; Daubner, Kornel; Nemeth, Gyoergy; Fuezy, Marton; Szentirmay, Zoltan

    1997-01-01

    Purpose: The typically benign, but occasionally rapidly fatal clinical course of papillary thyroid cancer has raised the need for individual survival probability estimation, to tailor the treatment strategy exclusively to a given patient. Materials and methods: A retrospective study was performed on 400 papillary thyroid cancer patients with a median follow-up time of 7.1 years to establish a clinical database for uni- and multivariate analysis of the prognostic factors related to survival (Kaplan-Meier product limit method and Cox regression). For a more precise prognosis estimation, the effects of the most important clinical events were then investigated on the basis of a Markov renewal model. The basic concept of this approach is that each patient has an individual disease course which (besides the initial clinical categories) is affected by special events, e.g. internal covariates (local/regional/distant relapses). On the supposition that these events and cause-specific death are influenced by the same biological processes, the parameters of the transient survival probability characterizing the speed of the course of the disease were determined for each clinical event and their sequence. The individual survival curves for each patient were calculated by using these parameters and the independent significant clinical variables selected from the multivariate studies, summation of which resulted in a mean cause-specific survival function valid for the entire group. On the basis of this Markov model, prediction of the cause-specific survival probability is possible for extra-study cases, if it is supposed that the clinical events occur in new patients in the same manner and with similar probability as in the study population. Results: The patient's age, a distant metastasis at presentation, the extent of the surgical intervention, the primary tumor size and extent (pT), the external irradiation dosage and the degree of TSH suppression proved to be

  2. Fusion probability and survivability in estimates of heaviest nuclei production

    International Nuclear Information System (INIS)

    Sagaidak, Roman

    2012-01-01

    A number of theoretical models have been developed recently to predict production cross sections for the heaviest nuclei in fusion-evaporation reactions. All the models reproduce cross sections obtained in experiments quite well. At the same time they give fusion probability values P_fus ≡ P_CN that differ by several orders of magnitude. This difference implies a corresponding distinction in the calculated values of survivability. The production of the heaviest nuclei (from Cm to the region of superheavy elements (SHE) close to Z = 114 and N = 184) in fusion-evaporation reactions induced by heavy ions has been considered in a systematic way within the framework of the barrier-passing (fusion) model coupled with the standard statistical model (SSM) of compound nucleus (CN) decay. Both models are incorporated into the HIVAP code. Available data on the excitation functions for fission and evaporation residues (ER) produced in very asymmetric combinations can be described rather well within the framework of HIVAP. Cross-section data obtained in these reactions allow one to choose model parameters quite definitely. Thus one can scale and fix macroscopic (liquid-drop) fission barriers for nuclei involved in the evaporation-fission cascade. In less asymmetric combinations (with 22Ne and heavier projectiles), effects of fusion suppression caused by quasi-fission start to appear in the entrance channel of reactions. The P_fus values derived from the capture-fission and fusion-fission cross sections obtained at energies above the Bass barrier were plotted as a function of the Coulomb parameter. For more symmetric combinations one can deduce the P_fus values semi-empirically, using the ER and fission excitation functions measured in experiments, and applying the SSM model with parameters obtained in the analysis of a very asymmetric combination leading to the production of (nearly) the same CN, as was done for reactions leading to pre-actinide nuclei formation

  3. Probability, random variables, and random processes theory and signal processing applications

    CERN Document Server

    Shynk, John J

    2012-01-01

    Probability, Random Variables, and Random Processes is a comprehensive textbook on probability theory for engineers that provides a more rigorous mathematical framework than is usually encountered in undergraduate courses. It is intended for first-year graduate students who have some familiarity with probability and random variables, though not necessarily of random processes and systems that operate on random signals. It is also appropriate for advanced undergraduate students who have a strong mathematical background. The book has the following features: Several app

  4. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations, and on the central limit theorem for sums of dependent random variables.

  5. People's Intuitions about Randomness and Probability: An Empirical Study

    Science.gov (United States)

    Lecoutre, Marie-Paule; Rovira, Katia; Lecoutre, Bruno; Poitevineau, Jacques

    2006-01-01

    What people mean by randomness should be taken into account when teaching statistical inference. This experiment explored subjective beliefs about randomness and probability through two successive tasks. Subjects were asked to categorize 16 familiar items: 8 real items from everyday life experiences, and 8 stochastic items involving a repeatable…

  6. Fusion probability and survivability in estimates of heaviest nuclei production

    Directory of Open Access Journals (Sweden)

    Sagaidak Roman N.

    2012-02-01

    Full Text Available Production of the heavy and heaviest nuclei (from Po to the region of superheavy elements close to Z=114 and N=184) in fusion-evaporation reactions induced by heavy ions has been considered in a systematic way within the framework of the barrier-passing model coupled with the statistical model (SM) of de-excitation of a compound nucleus (CN). Excitation functions for fission and evaporation residues (ER) measured in very asymmetric combinations can be described rather well. One can scale and fix macroscopic (liquid-drop) fission barriers for nuclei involved in the calculation of survivability with the SM. In less asymmetric combinations, effects of fusion suppression caused by quasi-fission (QF) start to appear in the entrance channel of reactions. QF effects can be taken into account semi-empirically, using fusion probabilities deduced as the ratio of measured ER cross sections to the ones obtained under the assumption of no fusion suppression in the corresponding reactions. SM parameters (fission barriers) obtained in the analysis of a very asymmetric combination leading to the production of (nearly) the same CN should be used for this evaluation.

  7. Age replacement policy based on imperfect repair with random probability

    International Nuclear Information System (INIS)

    Lim, J.H.; Qu, Jian; Zuo, Ming J.

    2016-01-01

    In most of the literature on age replacement policies, failures before the planned replacement age can be either minimally repaired or perfectly repaired, depending on the type of failure, the cost of repairs and so on. In this paper, we propose an age replacement policy based on imperfect repair with random probability. The proposed policy incorporates the case in which an intermittent failure can be either minimally repaired or perfectly repaired with random probabilities. Mathematical formulas for the expected cost rate per unit time are derived for both the infinite-horizon case and the one-replacement-cycle case. For each case, we show that the optimal replacement age exists and is finite. - Highlights: • We propose a new age replacement policy with random probability of perfect repair. • We develop the expected cost per unit time. • We discuss the optimal age for replacement minimizing the expected cost rate.
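
    For orientation, the classic age-replacement cost-rate trade-off that this policy generalizes can be sketched in a few lines. The Weibull lifetime, the costs, and the perfect-repair-only setting are our simplifying assumptions; the paper's imperfect-repair-with-random-probability model adds further terms.

      # Classic renewal-reward cost rate C(T) = [c_f F(T) + c_p S(T)] / ∫₀ᵀ S(t) dt.
      import numpy as np
      from scipy.integrate import quad
      from scipy.optimize import minimize_scalar

      shape, scale = 2.5, 100.0              # assumed Weibull lifetime parameters
      c_fail, c_plan = 50.0, 10.0            # failure vs planned replacement cost

      F = lambda t: 1 - np.exp(-(t / scale) ** shape)   # lifetime CDF
      S = lambda t: np.exp(-(t / scale) ** shape)       # survival function

      def cost_rate(T):
          """Expected cost per unit time for replacement age T."""
          expected_cycle = quad(S, 0, T)[0]
          return (c_fail * F(T) + c_plan * S(T)) / expected_cycle

      res = minimize_scalar(cost_rate, bounds=(1.0, 300.0), method="bounded")
      print(f"optimal replacement age ≈ {res.x:.1f}, cost rate ≈ {res.fun:.4f}")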

  8. Non-equilibrium random matrix theory. Transition probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Pedro, Francisco Gil [Univ. Autonoma de Madrid (Spain). Dept. de Fisica Teorica; Westphal, Alexander [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany). Gruppe Theorie

    2016-06-15

    In this letter we present an analytic method for calculating the transition probability between two random Gaussian matrices with given eigenvalue spectra in the context of Dyson Brownian motion. We show that in the Coulomb gas language, in the large-N limit, memory of the initial state is preserved in the form of a universal linear potential acting on the eigenvalues. We compute the likelihood of any given transition as a function of time, showing that as memory of the initial state is lost, transition probabilities converge to those of the static ensemble.

  9. Non-equilibrium random matrix theory. Transition probabilities

    International Nuclear Information System (INIS)

    Pedro, Francisco Gil; Westphal, Alexander

    2016-06-01

    In this letter we present an analytic method for calculating the transition probability between two random Gaussian matrices with given eigenvalue spectra in the context of Dyson Brownian motion. We show that in the Coulomb gas language, in the large-N limit, memory of the initial state is preserved in the form of a universal linear potential acting on the eigenvalues. We compute the likelihood of any given transition as a function of time, showing that as memory of the initial state is lost, transition probabilities converge to those of the static ensemble.

  10. Data-Driven Lead-Acid Battery Prognostics Using Random Survival Forests

    Science.gov (United States)

    2014-10-02

    Random survival forest is a survival analysis extension of Random Forests (Breiman, 2001; Ishwaran, Kogalur, Blackstone, & Lauer, 2008; Ishwaran & Kogalur, 2010). ...

  11. The extinction probability in systems randomly varying in time

    Directory of Open Access Journals (Sweden)

    Imre Pázsit

    2017-09-01

    Full Text Available The extinction probability of a branching process (a neutron chain in a multiplying medium) is calculated for a system randomly varying in time. The evolution of the first two moments of such a process was calculated previously by the authors for a system randomly shifting between two states of different multiplication properties. The same model is used here for the investigation of the extinction probability. It is seen that the determination of the extinction probability is significantly more complicated than that of the moments, and it can only be achieved by purely numerical methods. The numerical results indicate that for systems fluctuating between two subcritical or two supercritical states, the extinction probability behaves as expected, but for systems fluctuating between a supercritical and a subcritical state, there is a crucial and unexpected deviation from the predicted behaviour. The results bear some significance not only for neutron chains in a multiplying medium, but also for the evolution of biological populations in a time-varying environment.

  12. Estimating the concordance probability in a survival analysis with a discrete number of risk groups.

    Science.gov (United States)

    Heller, Glenn; Mo, Qianxing

    2016-04-01

    A clinical risk classification system is an important component of a treatment decision algorithm. A measure used to assess the strength of a risk classification system is discrimination, and when the outcome is survival time, the most commonly applied global measure of discrimination is the concordance probability. The concordance probability represents the pairwise probability of lower patient risk given longer survival time. The c-index and the concordance probability estimate have been used to estimate the concordance probability when patient-specific risk scores are continuous. In the current paper, the concordance probability estimate and an inverse probability censoring weighted c-index are modified to account for discrete risk scores. Simulations are generated to assess the finite sample properties of the concordance probability estimate and the weighted c-index. An application of these measures of discriminatory power to a metastatic prostate cancer risk classification system is examined.
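
    A sketch of the global discrimination measure this record modifies, computed for a discrete risk score on synthetic data with lifelines' concordance_index (plain Harrell's c-index; the paper's censoring-weighted variants are not implemented here).

      # Hypothetical data; c-index for a discrete 3-level risk group.
      import numpy as np
      from lifelines.utils import concordance_index

      rng = np.random.default_rng(3)
      n = 500
      risk_group = rng.integers(0, 3, n)                   # discrete scores 0/1/2
      time = rng.exponential(scale=10 / (1 + risk_group))  # higher group -> shorter survival
      observed = rng.random(n) < 0.7                       # ~30% censoring

      # concordance_index expects scores that increase with survival time
      # (lower risk <-> longer survival), so pass the negated risk score.
      print("c-index:", concordance_index(time, -risk_group, observed))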

  13. Analytic results for asymmetric random walk with exponential transition probabilities

    International Nuclear Information System (INIS)

    Gutkowicz-Krusin, D.; Procaccia, I.; Ross, J.

    1978-01-01

    We present here exact analytic results for a random walk on a one-dimensional lattice with asymmetric, exponentially distributed jump probabilities. We derive the generating functions of such a walk for a perfect lattice and for a lattice with absorbing boundaries. We obtain solutions for some interesting moment properties, such as mean first passage time, drift velocity, dispersion, and branching ratio for absorption. The symmetric exponential walk is solved as a special case. The scaling of the mean first passage time with the size of the system for the exponentially distributed walk is determined by the symmetry and is independent of the range

  14. Estimates of annual survival probabilities for adult Florida manatees (Trichechus manatus latirostris)

    Science.gov (United States)

    Langtimm, C.A.; O'Shea, T.J.; Pradel, R.; Beck, C.A.

    1998-01-01

    The population dynamics of large, long-lived mammals are particularly sensitive to changes in adult survival. Understanding factors affecting survival patterns is therefore critical for developing and testing theories of population dynamics and for developing management strategies aimed at preventing declines or extinction in such taxa. Few studies have used modern analytical approaches for analyzing variation and testing hypotheses about survival probabilities in large mammals. This paper reports a detailed analysis of annual adult survival in the Florida manatee (Trichechus manatus latirostris), an endangered marine mammal, based on a mark-recapture approach. Natural and boat-inflicted scars distinctively 'marked' individual manatees that were cataloged in a computer-based photographic system. Photo-documented resightings provided 'recaptures.' Using open population models, annual adult-survival probabilities were estimated for manatees observed in winter in three areas of Florida: Blue Spring, Crystal River, and the Atlantic coast. After using goodness-of-fit tests in Program RELEASE to search for violations of the assumptions of mark-recapture analysis, survival and sighting probabilities were modeled under several different biological hypotheses with Program SURGE. Estimates of mean annual probability of sighting varied from 0.948 for Blue Spring to 0.737 for Crystal River and 0.507 for the Atlantic coast. At Crystal River and Blue Spring, annual survival probabilities were best estimated as constant over the study period at 0.96 (95% CI = 0.951-0.975 and 0.900-0.985, respectively). On the Atlantic coast, where manatees are impacted more by human activities, annual survival probabilities had a significantly lower mean estimate of 0.91 (95% CI = 0.887-0.926) and varied unpredictably over the study period. For each study area, survival did not differ between sexes and was independent of relative adult age. The high constant adult-survival probabilities estimated

  15. Measuring survival time: a probability-based approach useful in healthcare decision-making.

    Science.gov (United States)

    2011-01-01

    In some clinical situations, the choice between treatment options takes into account their impact on patient survival time. Due to practical constraints (such as loss to follow-up), survival time is usually estimated using a probability calculation based on data obtained in clinical studies or trials. The two techniques most commonly used to estimate survival times are the Kaplan-Meier method and the actuarial method. Despite their limitations, they provide useful information when choosing between treatment options.

  16. Killing (absorption) versus survival in random motion

    Science.gov (United States)

    Garbaczewski, Piotr

    2017-09-01

    We address diffusion processes in a bounded domain, while focusing on somewhat unexplored affinities between the presence of absorbing and/or inaccessible boundaries. For the Brownian motion (Lévy-stable cases are briefly mentioned), model-independent features are established of the dynamical law that underlies the short-time behavior of these random paths, whose overall lifetime is predefined to be long. As a by-product, the limiting regime of permanent trapping in a domain is obtained. We demonstrate that the adopted conditioning method, involving the so-called Bernstein transition function, works properly also in an unbounded domain, for stochastic processes with killing (Feynman-Kac kernels play the role of transition densities), provided the spectrum of the related semigroup operator is discrete. The method is shown to be useful in the case when the spectrum of the generator goes down to zero and no isolated minimal (ground-state) eigenvalue exists, as in the problem of long-term survival on a half-line with a sink at the origin.

  17. Inverse probability weighting for covariate adjustment in randomized studies.

    Science.gov (United States)

    Shen, Changyu; Li, Xiaochun; Li, Lingling

    2014-02-20

    Covariate adjustment in randomized clinical trials has the potential benefit of precision gain. It also has the potential pitfall of reduced objectivity, as it opens the possibility of selecting a 'favorable' model that yields a strong treatment benefit estimate. Although there is a large volume of statistical literature targeting the first aspect, realistic solutions that enforce objective inference and improve precision are rare. As a typical randomized trial needs to accommodate many implementation issues beyond statistical considerations, maintaining objectivity is at least as important as precision gain, if not more so, particularly from the perspective of the regulatory agencies. In this article, we propose a two-stage estimation procedure based on inverse probability weighting to achieve better precision without compromising objectivity. The procedure is designed in such a way that the covariate adjustment is performed before seeing the outcome, effectively reducing the possibility of selecting a 'favorable' model that yields a strong intervention effect. Both theoretical and numerical properties of the estimation procedure are presented. Application of the proposed method to a real data example is presented. Copyright © 2013 John Wiley & Sons, Ltd.
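
    A generic inverse-probability-weighting sketch in the spirit of the proposal (not the authors' exact two-stage procedure): the treatment model is fit on baseline covariates only, so it can be specified before the outcome is seen. scikit-learn is assumed and the data are synthetic.

      # Hypothetical randomized-trial data with an IPW-adjusted effect estimate.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(4)
      n = 2000
      x = rng.normal(size=(n, 3))                       # baseline covariates
      a = rng.integers(0, 2, n)                         # randomized 1:1 assignment
      y = 1.0 * a + x @ np.array([0.5, -0.3, 0.2]) + rng.normal(size=n)

      # Stage 1 (outcome-blind): model treatment on covariates; in a randomized
      # trial the true probability is 0.5, so weights mainly absorb chance imbalance.
      ps = LogisticRegression().fit(x, a).predict_proba(x)[:, 1]
      w = np.where(a == 1, 1 / ps, 1 / (1 - ps))

      effect = (np.average(y[a == 1], weights=w[a == 1])
                - np.average(y[a == 0], weights=w[a == 0]))
      print(f"IPW-adjusted effect ≈ {effect:.3f} (truth: 1.0)")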

  18. Survival probability of a local excitation in a non-Markovian environment: Survival collapse, Zeno and anti-Zeno effects

    International Nuclear Information System (INIS)

    Rufeil-Fiori, E.; Pastawski, H.M.

    2009-01-01

    The decay dynamics of a local excitation interacting with a non-Markovian environment, modeled by a semi-infinite tight-binding chain, is exactly evaluated. We identify distinctive regimes for the dynamics. Sequentially: (i) early quadratic decay of the initial-state survival probability, up to a spreading time t_S; (ii) exponential decay described by a self-consistent Fermi Golden Rule; and (iii) asymptotic behavior governed by quantum diffusion through the return processes, leading to an inverse power-law decay. At this last crossover time t_R a survival collapse becomes possible. This could reduce the survival probability by several orders of magnitude. The crossover times t_S and t_R allow one to assess the range of applicability of the Fermi Golden Rule and give the conditions for the observation of the Zeno and anti-Zeno effects.

  19. Estimating Probability of Default on Peer to Peer Market – Survival Analysis Approach

    Directory of Open Access Journals (Sweden)

    Đurović Andrija

    2017-05-01

    Full Text Available Arguably a cornerstone of credit risk modelling is the probability of default. This article searches for evidence of a relationship between loan characteristics and the probability of default on the peer-to-peer (P2P) market. In line with that, two loan characteristics are analysed: (1) loan term length and (2) loan purpose. The analysis is conducted using a survival analysis approach within the vintage framework. Firstly, the 12-month through-the-cycle probability of default is used to compare the riskiness of the analysed loan characteristics. Secondly, the log-rank test is employed in order to compare the complete survival period of cohorts. The findings of the paper suggest that there is clear evidence of a relationship between the analysed loan characteristics and the probability of default. Longer-term loans are riskier than shorter-term ones, and the least risky loans are those used for credit card payoff.
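
    A sketch of the two ingredients named in the abstract, on synthetic loan data: a 12-month probability of default read off one minus the Kaplan-Meier survival curve, and a log-rank comparison of loan-term cohorts (lifelines assumed; the paper's vintage bookkeeping is omitted).

      # Hypothetical loans; 12-month PD and log-rank test by loan term.
      import numpy as np
      from lifelines import KaplanMeierFitter
      from lifelines.statistics import logrank_test

      rng = np.random.default_rng(5)
      n = 3000
      long_term = rng.integers(0, 2, n)                       # 0 = short, 1 = long term
      t_default = rng.exponential(scale=np.where(long_term, 60, 90))
      time = np.minimum(t_default, 36.0)                      # observe 36 months
      defaulted = t_default <= 36.0

      km = KaplanMeierFitter().fit(time, defaulted)
      print("12-month PD:", 1 - km.predict(12.0))

      res = logrank_test(time[long_term == 1], time[long_term == 0],
                         defaulted[long_term == 1], defaulted[long_term == 0])
      print("log-rank p-value:", res.p_value)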

  20. Fingerprints of exceptional points in the survival probability of resonances in atomic spectra

    International Nuclear Information System (INIS)

    Cartarius, Holger; Moiseyev, Nimrod

    2011-01-01

    The unique time signature of the survival probability exactly at the exceptional point parameters is studied here for the hydrogen atom in strong static magnetic and electric fields. We show that indeed the survival probability S(t) = |⟨Ψ(0)|Ψ(t)⟩|² decays exactly as |1 − at|² e^(−Γ_EP t/ħ), where Γ_EP is associated with the decay rate at the exceptional point and a is a complex constant depending solely on the initial wave packet that populates exclusively the two almost degenerate states of the non-Hermitian Hamiltonian. This may open the possibility for a first experimental detection of exceptional points in a quantum system.

  1. Fingerprints of exceptional points in the survival probability of resonances in atomic spectra

    Science.gov (United States)

    Cartarius, Holger; Moiseyev, Nimrod

    2011-07-01

    The unique time signature of the survival probability exactly at the exceptional point parameters is studied here for the hydrogen atom in strong static magnetic and electric fields. We show that indeed the survival probability S(t) = |⟨Ψ(0)|Ψ(t)⟩|² decays exactly as |1 − at|² e^(−Γ_EP t/ħ), where Γ_EP is associated with the decay rate at the exceptional point and a is a complex constant depending solely on the initial wave packet that populates exclusively the two almost degenerate states of the non-Hermitian Hamiltonian. This may open the possibility for a first experimental detection of exceptional points in a quantum system.

  2. CGC/saturation approach for soft interactions at high energy: survival probability of central exclusive production

    Energy Technology Data Exchange (ETDEWEB)

    Gotsman, E.; Maor, U. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel); Levin, E. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel); Universidad Tecnica Federico Santa Maria, Departemento de Fisica, Centro Cientifico-Tecnologico de Valparaiso, Valparaiso (Chile)

    2016-04-15

    We estimate the value of the survival probability for central exclusive production in a model which is based on the CGC/saturation approach. Hard and soft processes are described in the same framework. At LHC energies, we obtain a small value for the survival probability. The source of the small value is the impact parameter dependence of the hard amplitude. Our model has successfully described a large body of soft data: elastic, inelastic and diffractive cross sections, inclusive production and rapidity correlations, as well as the t-dependence of deep inelastic diffractive production of vector mesons. (orig.)

  3. Notes on the Lumped Backward Master Equation for the Neutron Extinction/Survival Probability

    Energy Technology Data Exchange (ETDEWEB)

    Prinja, Anil K [Los Alamos National Laboratory

    2012-07-02

    chains (a fission chain is defined as the initial source neutron and all its subsequent progeny) in which some chains are short lived while others propagate for unusually long times. Under these conditions, fission chains do not overlap strongly and this precludes the cancellation of neutron number fluctuations necessary for the mean to become established as the dominant measure of the neutron population. The fate of individual chains then plays a defining role in the evolution of the neutron population in strongly stochastic systems, and of particular interest and importance in supercritical systems is the extinction probability, defined as the probability that the neutron chain (initiating neutron and its progeny) will be extinguished at a particular time, or its complement, the time-dependent survival probability. The time-asymptotic limit of the latter, the probability of divergence, gives the probability that the neutron population will grow without bound, and is more commonly known as the probability of initiation or just POI. The ability to numerically compute these probabilities, with high accuracy and without overly restricting the underlying physics (e.g., fission neutron multiplicity, reactivity variation) is clearly essential in developing an understanding of the behavior of strongly stochastic systems.
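
    The time-asymptotic quantity described here, the probability of initiation, has a compact worked example: POI = 1 − q, where q is the smallest root of the generating-function fixed point q = Σ_k p_k q^k. The multiplicity distribution below is invented for illustration (a one-speed, source-free simplification of the lumped model).

      # Hypothetical fission-multiplicity law; extinction probability by fixed point.
      import numpy as np

      p = np.array([0.25, 0.30, 0.25, 0.15, 0.05])   # assumed P(k neutrons), k = 0..4
      mean_nu = (np.arange(5) * p).sum()
      assert mean_nu > 1.0                            # supercritical

      q = 0.0
      for _ in range(200):                            # fixed-point iteration q <- f(q)
          q = np.polyval(p[::-1], q)                  # evaluates sum_k p_k q^k
      print(f"mean multiplicity = {mean_nu:.3f}, extinction prob q = {q:.4f}, POI = {1 - q:.4f}")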

  4. Effects of amphibian chytrid fungus on individual survival probability in wild boreal toads

    Science.gov (United States)

    Pilliod, D.S.; Muths, E.; Scherer, R. D.; Bartelt, P.E.; Corn, P.S.; Hossack, B.R.; Lambert, B.A.; Mccaffery, R.; Gaughan, C.

    2010-01-01

    Chytridiomycosis is linked to the worldwide decline of amphibians, yet little is known about the demographic effects of the disease. We collected capture-recapture data on three populations of boreal toads (Bufo boreas [Bufo = Anaxyrus]) in the Rocky Mountains (U.S.A.). Two of the populations were infected with chytridiomycosis and one was not. We examined the effect of the presence of amphibian chytrid fungus (Batrachochytrium dendrobatidis [Bd]; the agent of chytridiomycosis) on survival probability and population growth rate. Toads that were infected with Bd had lower average annual survival probability than uninfected individuals at sites where Bd was detected, which suggests chytridiomycosis may reduce survival by 31-42% in wild boreal toads. Toads that were negative for Bd at infected sites had survival probabilities comparable to toads at the uninfected site. Evidence that environmental covariates (particularly cold temperatures during the breeding season) influenced toad survival was weak. The number of individuals in diseased populations declined by 5-7%/year over the 6 years of the study, whereas the uninfected population had comparatively stable population growth. Our data suggest that the presence of Bd in these toad populations is not causing rapid population declines. Rather, chytridiomycosis appears to be functioning as a low-level, chronic disease whereby some infected individuals survive but the overall population effects are still negative. Our results show that some amphibian populations may be coexisting with Bd and highlight the importance of quantitative assessments of survival in diseased animal populations.

  5. Bounds on survival probability given mean probability of failure per demand; and the paradoxical advantages of uncertainty

    International Nuclear Information System (INIS)

    Strigini, Lorenzo; Wright, David

    2014-01-01

    When deciding whether to accept into service a new safety-critical system, or choosing between alternative systems, uncertainty about the parameters that affect future failure probability may be a major problem. This uncertainty can be extreme if there is the possibility of unknown design errors (e.g. in software), or wide variation between nominally equivalent components. We study the effect of parameter uncertainty on future reliability (survival probability), for systems required to have low risk of even only one failure or accident over the long term (e.g. their whole operational lifetime) and characterised by a single reliability parameter (e.g. probability of failure per demand – pfd). A complete mathematical treatment requires stating a probability distribution for any parameter with uncertain value. This is hard, so calculations are often performed using point estimates, like the expected value. We investigate conditions under which such simplified descriptions yield reliability values that are sure to be pessimistic (or optimistic) bounds for a prediction based on the true distribution. Two important observations are (i) using the expected value of the reliability parameter as its true value guarantees a pessimistic estimate of reliability, a useful property in most safety-related decisions; (ii) with a given expected pfd, broader distributions (in a formally defined meaning of “broader”), that is, systems that are a priori “less predictable”, lower the risk of failures or accidents. Result (i) justifies the simplification of using a mean in reliability modelling; we discuss within which scope this justification applies, and explore related scenarios, e.g. how things improve if we can test the system before operation. Result (ii) not only offers more flexible ways of bounding reliability predictions, but also has important, often counter-intuitive implications for decision making in various areas, like selection of components, project management

  6. Probability on graphs random processes on graphs and lattices

    CERN Document Server

    Grimmett, Geoffrey

    2018-01-01

    This introduction to some of the principal models in the theory of disordered systems leads the reader through the basics, to the very edge of contemporary research, with the minimum of technical fuss. Topics covered include random walk, percolation, self-avoiding walk, interacting particle systems, uniform spanning tree, random graphs, as well as the Ising, Potts, and random-cluster models for ferromagnetism, and the Lorentz model for motion in a random medium. This new edition features accounts of major recent progress, including the exact value of the connective constant of the hexagonal lattice, and the critical point of the random-cluster model on the square lattice. The choice of topics is strongly motivated by modern applications, and focuses on areas that merit further research. Accessible to a wide audience of mathematicians and physicists, this book can be used as a graduate course text. Each chapter ends with a range of exercises.

  7. Random survival forests for competing risks

    DEFF Research Database (Denmark)

    Ishwaran, Hemant; Gerds, Thomas A; Kogalur, Udaya B

    2014-01-01

    We introduce a new approach to competing risks using random forests. Our method is fully non-parametric and can be used for selecting event-specific variables and for estimating the cumulative incidence function. We show that the method is highly effective for both prediction and variable selection...
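
    For a flavor of the method: the competing-risks random survival forest of this record is available in the R package randomForestSRC, while the Python sketch below uses scikit-survival's single-event RandomSurvivalForest on synthetic data, so it illustrates the forest idea rather than the paper's competing-risks estimator.

      # Hypothetical single-event data with scikit-survival's forest.
      import numpy as np
      from sksurv.ensemble import RandomSurvivalForest

      rng = np.random.default_rng(6)
      n = 300
      X = rng.normal(size=(n, 5))
      time = rng.exponential(scale=np.exp(-0.5 * X[:, 0]))
      event = rng.random(n) < 0.7
      y = np.array(list(zip(event, time)), dtype=[("event", bool), ("time", float)])

      rsf = RandomSurvivalForest(n_estimators=100, random_state=0).fit(X, y)
      print("risk scores for 3 new cases:", rsf.predict(X[:3]))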

  8. Survival and compound nucleus probability of super heavy element Z = 117

    Energy Technology Data Exchange (ETDEWEB)

    Manjunatha, H.C. [Government College for Women, Department of Physics, Kolar, Karnataka (India); Sridhar, K.N. [Government First grade College, Department of Physics, Kolar, Karnataka (India)

    2017-05-15

    As a part of a systematic study for predicting the most suitable projectile-target combinations for heavy-ion fusion experiments in the synthesis of ^{289-297}Ts, we have calculated the transmission probability (T_l), compound nucleus formation probabilities (P_CN) and survival probability (P_sur) of possible projectile-target combinations. We have also studied the fusion cross section, survival cross section and fission cross sections for different projectile-target combinations of ^{289-297}Ts. These theoretical parameters are required before the synthesis of the superheavy element. The calculated probabilities and cross sections show that the production of isotopes of the superheavy element with Z = 117 is strongly dependent on the reaction systems. The most probable reactions to synthesize the superheavy nuclei ^{289-297}Ts are worked out and listed explicitly. We have also studied the variation of P_CN and P_sur with the mass number of the projectile and target nuclei. This work is useful in the synthesis of the superheavy element Z = 117. (orig.)

  9. Survival and compound nucleus probability of super heavy element Z = 117

    International Nuclear Information System (INIS)

    Manjunatha, H.C.; Sridhar, K.N.

    2017-01-01

    As a part of a systematic study for predicting the most suitable projectile-target combinations for heavy-ion fusion experiments in the synthesis of ^{289-297}Ts, we have calculated the transmission probability (T_l), compound nucleus formation probabilities (P_CN) and survival probability (P_sur) of possible projectile-target combinations. We have also studied the fusion cross section, survival cross section and fission cross sections for different projectile-target combinations of ^{289-297}Ts. These theoretical parameters are required before the synthesis of the superheavy element. The calculated probabilities and cross sections show that the production of isotopes of the superheavy element with Z = 117 is strongly dependent on the reaction systems. The most probable reactions to synthesize the superheavy nuclei ^{289-297}Ts are worked out and listed explicitly. We have also studied the variation of P_CN and P_sur with the mass number of the projectile and target nuclei. This work is useful in the synthesis of the superheavy element Z = 117. (orig.)

  10. Survival behavior in the cyclic Lotka-Volterra model with a randomly switching reaction rate

    Science.gov (United States)

    West, Robert; Mobilia, Mauro; Rucklidge, Alastair M.

    2018-02-01

    We study the influence of a randomly switching reproduction-predation rate on the survival behavior of the nonspatial cyclic Lotka-Volterra model, also known as the zero-sum rock-paper-scissors game, used to metaphorically describe the cyclic competition between three species. In large and finite populations, demographic fluctuations (internal noise) drive two species to extinction in a finite time, while the species with the smallest reproduction-predation rate is the most likely to be the surviving one (law of the weakest). Here we model environmental (external) noise by assuming that the reproduction-predation rate of the strongest species (the fastest to reproduce and predate) in a given static environment randomly switches between two values corresponding to more and less favorable external conditions. We study the joint effect of environmental and demographic noise on the species survival probabilities and on the mean extinction time. In particular, we investigate whether the survival probabilities follow the law of the weakest and analyze their dependence on the external noise intensity and switching rate. Remarkably, when, on average, there is a finite number of switches prior to extinction, the survival probability of the predator of the species whose reaction rate switches typically varies nonmonotonically with the external noise intensity (with optimal survival about a critical noise strength). We also outline the relationship with the case where all reaction rates switch on markedly different time scales.

  11. Survival behavior in the cyclic Lotka-Volterra model with a randomly switching reaction rate.

    Science.gov (United States)

    West, Robert; Mobilia, Mauro; Rucklidge, Alastair M

    2018-02-01

    We study the influence of a randomly switching reproduction-predation rate on the survival behavior of the nonspatial cyclic Lotka-Volterra model, also known as the zero-sum rock-paper-scissors game, used to metaphorically describe the cyclic competition between three species. In large and finite populations, demographic fluctuations (internal noise) drive two species to extinction in a finite time, while the species with the smallest reproduction-predation rate is the most likely to be the surviving one (law of the weakest). Here we model environmental (external) noise by assuming that the reproduction-predation rate of the strongest species (the fastest to reproduce and predate) in a given static environment randomly switches between two values corresponding to more and less favorable external conditions. We study the joint effect of environmental and demographic noise on the species survival probabilities and on the mean extinction time. In particular, we investigate whether the survival probabilities follow the law of the weakest and analyze their dependence on the external noise intensity and switching rate. Remarkably, when, on average, there is a finite number of switches prior to extinction, the survival probability of the predator of the species whose reaction rate switches typically varies nonmonotonically with the external noise intensity (with optimal survival about a critical noise strength). We also outline the relationship with the case where all reaction rates switch on markedly different time scales.
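
    A bare-bones Monte Carlo of the model described above: zero-sum rock-paper-scissors dynamics in which the strongest species' reproduction-predation rate switches between two environments at rate ν. All parameter values are illustrative assumptions, and time is not tracked, since the identity of the survivor depends only on the embedded jump chain.

      # Hypothetical parameters; counts which species survives each run.
      import numpy as np

      rng = np.random.default_rng(7)

      def surviving_species(n0=30, base_rates=(1.0, 0.8, 0.6), k_env=(2.0, 0.4), nu=2.0):
          """One Gillespie-style run; returns the index of the sole survivor."""
          n = np.array([n0, n0, n0])
          N = 3 * n0                                  # zero-sum: total is conserved
          rates = np.array(base_rates)
          env = 0
          while np.count_nonzero(n) > 1:
              rates[0] = k_env[env]                   # species 0's rate is environmental
              # reaction i: species i consumes species (i+1) mod 3 and reproduces
              prop = np.array([rates[i] * n[i] * n[(i + 1) % 3] / N for i in range(3)])
              total = prop.sum() + nu
              if rng.random() < nu / total:           # environment switch fires first
                  env = 1 - env
                  continue
              i = rng.choice(3, p=prop / prop.sum())
              n[i] += 1
              n[(i + 1) % 3] -= 1
          return int(np.argmax(n))

      wins = np.bincount([surviving_species() for _ in range(100)], minlength=3)
      print("survivor counts for species 0, 1, 2:", wins)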

  12. Exact results for survival probability in the multistate Landau-Zener model

    International Nuclear Information System (INIS)

    Volkov, M V; Ostrovsky, V N

    2004-01-01

    An exact formula is derived for the survival probability in the multistate Landau-Zener model in the special case where the initially populated state corresponds to the extremal (maximum or minimum) slope of a linear diabatic potential curve. The formula was originally guessed by S Brundobler and V Elser (1993 J. Phys. A: Math. Gen. 26 1211) based on numerical calculations. It is a simple generalization of the expression for the probability of diabatic passage in the famous two-state Landau-Zener model. Our result is obtained via analysis and summation of the entire perturbation theory series.
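
    For the record, the Brundobler-Elser expression generalizes the two-state exponent by summing pairwise Landau-Zener exponents over all other states. The exact form below is quoted from memory and should be treated as an assumption, not a transcription from the paper:

        import math

        def be_survival(b0, couplings_and_slopes):
            """Survival probability of the initially populated diabatic state with
            extremal slope b0, per the Brundobler-Elser formula (form assumed):
            P = exp(-2*pi * sum_k |V_k|^2 / |b0 - b_k|), with hbar = 1."""
            s = sum(abs(v) ** 2 / abs(b0 - b) for v, b in couplings_and_slopes)
            return math.exp(-2.0 * math.pi * s)

        # Two-state check: reduces to the classic result exp(-2*pi*V^2/|b1 - b2|).
        print(be_survival(1.0, [(0.2, -1.0)]))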

  13. Survival probability in small angle scattering of low energy alkali ions from alkali covered metal surfaces

    International Nuclear Information System (INIS)

    Neskovic, N.; Ciric, D.; Perovic, B.

    1982-01-01

    The survival probability in small angle scattering of low energy alkali ions from alkali covered metal surfaces is considered. The model is based on the momentum approximation. The projectiles are K⁺ ions and the target is the (001)Ni+K surface. The incident energy is 100 eV and the incident angle 5°. The interaction potential of the projectile and the target consists of the Born-Mayer, the dipole and the image charge potentials. The transition probability function corresponds to the resonant electron transition to the 4s projectile energy level. (orig.)

  14. Return probabilities for the reflected random walk on N_0

    NARCIS (Netherlands)

    Essifi, R.; Peigné, M.

    2015-01-01

    Let \\((Y_n)\\) be a sequence of i.i.d. \\(\\mathbb{Z }\\)-valued random variables with law \\(\\mu \\). The reflected random walk \\((X_n)\\) is defined recursively by \\(X_0=x \\in \\mathbb{N }_0, X_{n+1}=\\vert X_n+Y_{n+1}\\vert \\). Under mild hypotheses on the law \\(\\mu \\), it is proved that, for any \\( y \\in

  15. Passage and survival probabilities of juvenile Chinook salmon at Cougar Dam, Oregon, 2012

    Science.gov (United States)

    Beeman, John W.; Evans, Scott D.; Haner, Philip V.; Hansel, Hal C.; Hansen, Amy C.; Smith, Collin D.; Sprando, Jamie M.

    2014-01-01

    This report describes studies of juvenile-salmon dam passage and apparent survival at Cougar Dam, Oregon, during two operating conditions in 2012. Cougar Dam is a 158-meter tall rock-fill dam used primarily for flood control, and passes water through a temperature control tower to either a powerhouse penstock or to a regulating outlet (RO). The temperature control tower has moveable weir gates to enable water of different elevations and temperatures to be drawn through the dam to control water temperatures downstream. A series of studies of downstream dam passage of juvenile salmonids were begun after the National Oceanic and Atmospheric Administration determined that Cougar Dam was impacting the viability of anadromous fish stocks. The primary objectives of the studies described in this report were to estimate the route-specific fish passage probabilities at the dam and to estimate the survival probabilities of fish passing through the RO. The first set of dam operating conditions, studied in November, consisted of (1) a mean reservoir elevation of 1,589 feet, (2) water entering the temperature control tower through the weir gates, (3) most water routed through the turbines during the day and through the RO during the night, and (4) mean RO gate openings of 1.2 feet during the day and 3.2 feet during the night. The second set of dam operating conditions, studied in December, consisted of (1) a mean reservoir elevation of 1,507 ft, (2) water entering the temperature control tower through the RO bypass, (3) all water passing through the RO, and (4) mean RO gate openings of 7.3 feet during the day and 7.5 feet during the night. The studies were based on juvenile Chinook salmon (Oncorhynchus tshawytscha) surgically implanted with radio transmitters and passive integrated transponder (PIT) tags. Inferences about general dam passage percentage and timing of volitional migrants were based on surface-acclimated fish released in the reservoir. Dam passage and apparent

  16. Mean-field behavior for the survival probability and the point-to-surface connectivity

    CERN Document Server

    Sakai, A

    2003-01-01

    We consider the critical survival probability for oriented percolation and the contact process, and the point-to-surface connectivity for critical percolation. By similarity, let \rho denote the critical exponent for both quantities. We prove in a unified fashion that, if \rho exists and if both the two-point function and a certain restricted version of it exhibit the same mean-field behavior, then \rho=2 for percolation with d>7 and \rho=1 for the time-oriented models with d>4.

  17. Interactive effects of senescence and natural disturbance on the annual survival probabilities of snail kites

    Science.gov (United States)

    Reichert, Brian E.; Martin, J.; Kendall, William L.; Cattau, Christopher E.; Kitchens, Wiley M.

    2010-01-01

    Individuals in wild populations face risks associated with both intrinsic (i.e. aging) and external (i.e. environmental) sources of mortality. Condition-dependent mortality occurs when there is an interaction between such factors; however, few studies have clearly demonstrated condition-dependent mortality and some have even argued that condition-dependent mortality does not occur in wild avian populations. Using large sample sizes (2084 individuals, 3746 re-sights) of individual-based longitudinal data collected over a 33-year period (1976-2008) on multiple cohorts, we used a capture-mark-recapture framework to model age-dependent survival in the snail kite Rostrhamus sociabilis plumbeus population in Florida. Adding to the growing body of evidence for actuarial senescence in wild populations, we found evidence of senescent declines in survival probabilities in adult kites. We also tested the hypothesis that older kites experienced condition-dependent mortality during a range-wide drought event (2000-2002). The results provide convincing evidence that the annual survival probability of senescent kites was disproportionately affected by the drought relative to the survival probability of prime-aged adults. To our knowledge, this is the first evidence of condition-dependent mortality to be demonstrated in a wild avian population, a finding which challenges recent conclusions drawn in the literature. Our study suggests that senescence and condition-dependent mortality can affect the demography of wild avian populations. Accounting for these sources of variation may be particularly important to appropriately compute estimates of population growth rate, and probabilities of quasi-extinctions.

  18. Technical report. The application of probability-generating functions to linear-quadratic radiation survival curves.

    Science.gov (United States)

    Kendal, W S

    2000-04-01

    To illustrate how probability-generating functions (PGFs) can be employed to derive a simple probabilistic model for clonogenic survival after exposure to ionizing irradiation. Both repairable and irreparable radiation damage to DNA were assumed to occur by independent (Poisson) processes, at intensities proportional to the irradiation dose. Also, repairable damage was assumed to be either repaired or further (lethally) injured according to a third (Bernoulli) process, with the probability of lethal conversion being directly proportional to dose. Using the algebra of PGFs, these three processes were combined to yield a composite PGF that described the distribution of lethal DNA lesions in irradiated cells. The composite PGF characterized a Poisson distribution with mean αD + βD², where D was dose and α and β were radiobiological constants. This distribution yielded the conventional linear-quadratic survival equation. To test the composite model, the derived distribution was used to predict the frequencies of multiple chromosomal aberrations in irradiated human lymphocytes. The predictions agreed well with observation. This probabilistic model was consistent with single-hit mechanisms, but it was not consistent with binary misrepair mechanisms. A stochastic model for radiation survival has been constructed from elementary PGFs that exactly yields the linear-quadratic relationship. This approach can be used to investigate other simple probabilistic survival models.
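
    Concretely, survival is the probability of zero lethal lesions, i.e. the composite PGF evaluated at s = 0, which for a Poisson distribution with mean αD + βD² gives exp(-(αD + βD²)), the linear-quadratic law. A small sketch with assumed constants:

        import math

        # Survival = P(no lethal lesions) = G(0) for a Poisson lesion count with
        # mean alpha*D + beta*D^2. The constants below are placeholders, not the
        # paper's fitted values.
        alpha, beta = 0.3, 0.03   # Gy^-1 and Gy^-2 (assumed)

        def surviving_fraction(dose):
            return math.exp(-(alpha * dose + beta * dose ** 2))

        for dose in (1.0, 2.0, 4.0, 8.0):
            print(dose, "Gy ->", round(surviving_fraction(dose), 4))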

  19. Do ducks and songbirds initiate more nests when the probability of survival is greater?

    Science.gov (United States)

    Grant, Todd A.; Shaffer, Terry L.

    2015-01-01

    Nesting chronology in grassland birds can vary by species, locality, and year. The date a nest is initiated can influence the subsequent probability of its survival in some grassland bird species. Because predation is the most significant cause of nest loss in grassland birds, we examined the relation between timing of nesting and nest survival. Periods of high nest survival that correspond with the peak of nesting activity might reflect long-term adaptations to specific predation pressures commonly recurring during certain periods of the nesting cycle. We evaluated this theory by comparing timing of nesting with date-specific nest survival rates for several duck and passerine species breeding in north-central North Dakota during 1998–2003. Nest survival decreased seasonally with date for five of the seven species we studied. We found little evidence to support consistent relations between timing of nesting, the number of nest initiations, and nest survival for any species we studied, suggesting that factors other than nest predation may better explain nesting chronology for these species. The apparent mismatch between date-specific patterns of nest survival and nest initiation underscores uncertainty about the process of avian nest site selection driven mainly by predation. Although timing of nesting differed among species, the general nesting period was fairly predictable across all years of study, suggesting the potential for research activities or management actions to be timed to take advantage of known periods when nests are active (or inactive). However, our results do not support the notion that biologists can take advantage of periods when many nests are active and survival is also high.

  20. Effect of drift on the temporal asymptotic form of the particle survival probability in media with absorbing traps

    International Nuclear Information System (INIS)

    Arkhincheev, V. E.

    2017-01-01

    A new asymptotic form of the particle survival probability in media with absorbing traps has been established. It is shown that particle drift determines a new temporal behavior of the survival probability in media with absorbing traps over long time intervals.

  1. Corticosterone levels predict survival probabilities of Galápagos marine iguanas during El Niño events.

    Science.gov (United States)

    Romero, L M; Wikelski, M

    2001-06-19

    Plasma levels of corticosterone are often used as a measure of "stress" in wild animal populations. However, we lack conclusive evidence that different stress levels reflect different survival probabilities between populations. Galápagos marine iguanas offer an ideal test case because island populations are affected differently by recurring El Niño famine events, and population-level survival can be quantified by counting iguanas locally. We surveyed corticosterone levels in six populations during the 1998 El Niño famine and the 1999 La Niña feast period. Iguanas had higher baseline and handling stress-induced corticosterone concentrations during famine than feast conditions. Corticosterone levels differed between islands and predicted survival through an El Niño period. However, among individuals, baseline corticosterone was only elevated when body condition dropped below a critical threshold. Thus, the population-level corticosterone response was variable but nevertheless predicted overall population health. Our results lend support to the use of corticosterone as a rapid quantitative predictor of survival in wild animal populations.

  2. Lower survival probabilities for adult Florida manatees in years with intense coastal storms

    Science.gov (United States)

    Langtimm, C.A.; Beck, C.A.

    2003-01-01

    The endangered Florida manatee (Trichechus manatus latirostris) inhabits the subtropical waters of the southeastern United States, where hurricanes are a regular occurrence. Using mark-resighting statistical models, we analyzed 19 years of photo-identification data and detected significant annual variation in adult survival for a subpopulation in northwest Florida where human impact is low. That variation coincided with years when intense hurricanes (Category 3 or greater on the Saffir-Simpson Hurricane Scale) and a major winter storm occurred in the northern Gulf of Mexico. Mean survival probability during years with no or low intensity storms was 0.972 (approximate 95% confidence interval = 0.961-0.980) but dropped to 0.936 (0.864-0.971) in 1985 with Hurricanes Elena, Kate, and Juan; to 0.909 (0.837-0.951) in 1993 with the March "Storm of the Century"; and to 0.817 (0.735-0.878) in 1995 with Hurricanes Opal, Erin, and Allison. These drops in survival probability were not catastrophic in magnitude and were detected because of the use of state-of-the-art statistical techniques and the quality of the data. Because individuals of this small population range extensively along the north Gulf coast of Florida, it was possible to resolve storm effects on a regional scale rather than the site-specific local scale common to studies of more sedentary species. This is the first empirical evidence in support of storm effects on manatee survival and suggests a cause-effect relationship. The decreases in survival could be due to direct mortality, indirect mortality, and/or emigration from the region as a consequence of storms. Future impacts to the population by a single catastrophic hurricane, or series of smaller hurricanes, could increase the probability of extinction. With the advent in 1995 of a new 25- to 50-yr cycle of greater hurricane activity, and longer term change possible with global climate change, it becomes all the more important to reduce mortality and injury

  3. Prognostic Factors for Survival in Patients with Gastric Cancer using a Random Survival Forest

    Science.gov (United States)

    Adham, Davoud; Abbasgholizadeh, Nategh; Abazari, Malek

    2017-01-01

    Background: Gastric cancer is the fifth most common cancer and the third top cause of cancer related death with about 1 million new cases and 700,000 deaths in 2012. The aim of this investigation was to identify important factors for outcome using a random survival forest (RSF) approach. Materials and Methods: Data were collected from 128 gastric cancer patients through a historical cohort study in Hamedan-Iran from 2007 to 2013. The event under consideration was death due to gastric cancer. The random survival forest model in R software was applied to determine the key factors affecting survival. Four split criteria were used to determine importance of the variables in the model: log-rank, conservation of events, log-rank score, and random splitting. Efficiency of the model was confirmed in terms of Harrell’s concordance index. Results: The mean age at diagnosis was 63 ± 12.57 and mean and median survival times were 15.2 (95%CI: 13.3, 17.0) and 12.3 (95%CI: 11.0, 13.4) months, respectively. The one-year, two-year, and three-year survival rates were 51%, 13%, and 5%, respectively. Each RSF approach showed a slightly different ranking order. Very important covariates in nearly all the 4 RSF approaches were metastatic status, age at diagnosis and tumor size. The error rate of each RSF approach was in the range of 0.29-0.32 and the best error rate was obtained by the log-rank splitting rule; second, third, and fourth ranks were log-rank score, conservation of events, and the random splitting rule, respectively. Conclusion: Low survival rate of gastric cancer patients is an indication of absence of a screening program for early diagnosis of the disease. Timely diagnosis in early phases increases survival and decreases mortality.
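
    In outline, the pipeline can be reproduced with the scikit-survival package in Python (the authors worked in R); the synthetic data and all settings below are illustrative assumptions:

        import numpy as np
        from sksurv.ensemble import RandomSurvivalForest
        from sksurv.metrics import concordance_index_censored
        from sksurv.util import Surv

        # Synthetic stand-in for the 128-patient cohort (the registry data are not public).
        rng = np.random.default_rng(0)
        X = rng.normal(size=(128, 3))            # e.g. age at diagnosis, tumor size, metastasis
        time = rng.exponential(15.0, size=128)   # survival time in months
        event = rng.random(128) < 0.8            # True = death due to gastric cancer
        y = Surv.from_arrays(event=event, time=time)

        rsf = RandomSurvivalForest(n_estimators=500, min_samples_leaf=5, random_state=0)
        rsf.fit(X, y)
        cindex = concordance_index_censored(event, time, rsf.predict(X))[0]
        print("Harrell's concordance index (training):", round(cindex, 3))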

  4. Generation, combination and extension of random set approximations to coherent lower and upper probabilities

    International Nuclear Information System (INIS)

    Hall, Jim W.; Lawry, Jonathan

    2004-01-01

    Random set theory provides a convenient mechanism for representing uncertain knowledge including probabilistic and set-based information, and extending it through a function. This paper focuses upon the situation when the available information is in terms of coherent lower and upper probabilities, which are encountered, for example, when a probability distribution is specified by interval parameters. We propose an Iterative Rescaling Method (IRM) for constructing a random set with corresponding belief and plausibility measures that are a close outer approximation to the lower and upper probabilities. The approach is compared with the discrete approximation method of Williamson and Downs (sometimes referred to as the p-box), which generates a closer approximation to lower and upper cumulative probability distributions but in most cases a less accurate approximation to the lower and upper probabilities on the remainder of the power set. Four combination methods are compared by application to example random sets generated using the IRM

  5. Discrete probability models and methods probability on graphs and trees, Markov chains and random fields, entropy and coding

    CERN Document Server

    Brémaud, Pierre

    2017-01-01

    The emphasis in this book is placed on general models (Markov chains, random fields, random graphs), universal methods (the probabilistic method, the coupling method, the Stein-Chen method, martingale methods, the method of types) and versatile tools (Chernoff's bound, Hoeffding's inequality, Holley's inequality) whose domain of application extends far beyond the present text. Although the examples treated in the book relate to the possible applications, in the communication and computing sciences, in operations research and in physics, this book is in the first instance concerned with theory. The level of the book is that of a beginning graduate course. It is self-contained, the prerequisites consisting merely of basic calculus (series) and basic linear algebra (matrices). The reader is not assumed to be trained in probability since the first chapters give in considerable detail the background necessary to understand the rest of the book.

  6. Learning Binomial Probability Concepts with Simulation, Random Numbers and a Spreadsheet

    Science.gov (United States)

    Rochowicz, John A., Jr.

    2005-01-01

    This paper introduces the reader to the concepts of binomial probability and simulation. A spreadsheet is used to illustrate these concepts. Random number generators are great technological tools for demonstrating the concepts of probability. Ideas of approximation, estimation, and mathematical usefulness provide numerous ways of learning…
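
    The simulation idea translates directly into a few lines of code. A Python analogue of the spreadsheet exercise (the article itself works in a spreadsheet, so this is a stand-in):

        import math
        import random

        # Estimate a binomial probability by simulating coin flips with random
        # numbers, then compare with the exact value.
        n, p, k, trials = 10, 0.5, 7, 100_000
        rng = random.Random(42)
        hits = sum(1 for _ in range(trials)
                   if sum(rng.random() < p for _ in range(n)) == k)
        print("simulated P(X = 7):", hits / trials)
        print("exact     P(X = 7):", math.comb(n, k) * p**k * (1 - p)**(n - k))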

  7. The two-parametric scaling and new temporal asymptotic of survival probability of diffusing particle in the medium with traps.

    Science.gov (United States)

    Arkhincheev, V E

    2017-03-01

    The new asymptotic behavior of the survival probability of particles in a medium with absorbing traps in an electric field has been established in two ways: by using the scaling approach and by directly solving the diffusion equation in the field. It is shown that at long times, this drift mechanism leads to a new temporal behavior of the survival probability of particles in a medium with absorbing traps.

  8. Cluster geometry and survival probability in systems driven by reaction-diffusion dynamics

    International Nuclear Information System (INIS)

    Windus, Alastair; Jensen, Henrik J

    2008-01-01

    We consider a reaction-diffusion model incorporating the reactions A→φ, A→2A and 2A→3A. Depending on the relative rates for sexual and asexual reproduction of the quantity A, the model exhibits either a continuous or first-order absorbing phase transition to an extinct state. A tricritical point separates the two phase lines. While we comment on this critical behaviour, the main focus of the paper is on the geometry of the population clusters that form. We observe the different cluster structures that arise at criticality for the three different types of critical behaviour and show that there exists a linear relationship for the survival probability against initial cluster size at the tricritical point only.

  9. Cluster geometry and survival probability in systems driven by reaction-diffusion dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Windus, Alastair; Jensen, Henrik J [The Institute for Mathematical Sciences, 53 Prince's Gate, South Kensington, London SW7 2PG (United Kingdom)], E-mail: h.jensen@imperial.ac.uk

    2008-11-15

    We consider a reaction-diffusion model incorporating the reactions A→φ, A→2A and 2A→3A. Depending on the relative rates for sexual and asexual reproduction of the quantity A, the model exhibits either a continuous or first-order absorbing phase transition to an extinct state. A tricritical point separates the two phase lines. While we comment on this critical behaviour, the main focus of the paper is on the geometry of the population clusters that form. We observe the different cluster structures that arise at criticality for the three different types of critical behaviour and show that there exists a linear relationship for the survival probability against initial cluster size at the tricritical point only.

  10. Predicting longitudinal trajectories of health probabilities with random-effects multinomial logit regression.

    Science.gov (United States)

    Liu, Xian; Engel, Charles C

    2012-12-20

    Researchers often encounter longitudinal health data characterized with three or more ordinal or nominal categories. Random-effects multinomial logit models are generally applied to account for the potential lack of independence inherent in such clustered data. When parameter estimates are used to describe longitudinal processes, however, random effects, both between and within individuals, need to be retransformed for correctly predicting outcome probabilities. This study attempts to go beyond existing work by developing a retransformation method that derives longitudinal growth trajectories of unbiased health probabilities. We estimated variances of the predicted probabilities by using the delta method. Additionally, we transformed the covariates' regression coefficients on the multinomial logit function, which are not substantively meaningful, into conditional effects on the predicted probabilities. The empirical illustration uses the longitudinal data from the Asset and Health Dynamics among the Oldest Old. Our analysis compared three sets of the predicted probabilities of three health states at six time points, obtained from, respectively, the retransformation method, the best linear unbiased prediction, and the fixed-effects approach. The results demonstrate that neglect of retransforming random errors in the random-effects multinomial logit model results in severely biased longitudinal trajectories of health probabilities as well as overestimated effects of covariates on the probabilities. Copyright © 2012 John Wiley & Sons, Ltd.

  11. Survival probabilities of first and second clutches of blackbird (Turdus merula) in an urban environment

    Directory of Open Access Journals (Sweden)

    Kurucz Kornelia

    2010-01-01

    The breeding success of blackbirds was investigated in April and June 2008 and 2009 in the Botanical Garden of the University of Pecs, with a total of 50 artificial nests at each of the four sessions (with 1 quail egg and 1 plasticine egg placed in every nest). In all four study periods of the two years, 2 nests (4%) were destroyed by predators. Six nests (12%) were not discovered in either of the cases. The survival probability of artificial nests was greater in April than in June (both years), but the difference was significant only in 2008. Nests placed into a curtain of ivy (Hedera helix) on a wall were located higher up than those in bushes, yet their predation rates were quite similar. The predation values of quail vs. plasticine eggs did not differ in 2008. In the year 2009, however, significantly more quail eggs were discovered (mostly removed) than plasticine eggs. Marks that were left on plasticine eggs originated mostly from small mammals and small-bodied birds, but the disappearance of a large number of quail and plasticine eggs was probably caused by larger birds, primarily jays.

  12. Survival probability of precipitations and rain attenuation in tropical and equatorial regions

    Science.gov (United States)

    Mohebbi Nia, Masoud; Din, Jafri; Panagopoulos, Athanasios D.; Lam, Hong Yin

    2015-08-01

    This contribution presents a stochastic model useful for the generation of long-term tropospheric rain attenuation time series for Earth-space or terrestrial radio links in tropical and equatorial heavy rain regions, based on the well-known Cox-Ingersoll-Ross model previously employed in research in the fields of finance and economics. This model assumes a typical gamma distribution for rain attenuation in heavy rain climatic regions and utilises the temporal dynamics of precipitation collected in equatorial Johor, Malaysia. Different formations of survival probability are also discussed. Furthermore, the correlation between these probabilities and the Markov process is determined, and information on the variance and autocorrelation function of rain events with respect to the particular characteristics of precipitation in this area is presented. The proposed technique is shown to preserve the peculiarities of precipitation for an equatorial region and to reproduce fairly good statistics of the rain attenuation correlation function, which could help to improve the prediction of dynamic characteristics of rain fade events.
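
    The Cox-Ingersoll-Ross process referred to here is the square-root diffusion dX = κ(θ - X)dt + σ√X dW. A common way to generate sample paths is a full-truncation Euler scheme, sketched below with placeholder parameters, since the abstract does not quote the fitted Johor values:

        import math
        import random

        def cir_path(kappa=0.8, theta=5.0, sigma=1.2, x0=5.0, dt=0.01, n=50_000,
                     rng=random.Random(0)):
            """Full-truncation Euler scheme for dX = kappa*(theta - X) dt + sigma*sqrt(X) dW.
            Parameters are placeholders, not fitted rain-attenuation values."""
            x, out = x0, []
            for _ in range(n):
                xp = max(x, 0.0)
                x += kappa * (theta - xp) * dt + sigma * math.sqrt(xp * dt) * rng.gauss(0.0, 1.0)
                out.append(max(x, 0.0))
            return out

        path = cir_path()
        print("sample mean:", sum(path) / len(path))   # hovers near theta for long runs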

  13. Dynamic probability of reinforcement for cooperation: Random game termination in the centipede game.

    Science.gov (United States)

    Krockow, Eva M; Colman, Andrew M; Pulford, Briony D

    2018-03-01

    Experimental games have previously been used to study principles of human interaction. Many such games are characterized by iterated or repeated designs that model dynamic relationships, including reciprocal cooperation. To enable the study of infinite game repetitions and to avoid endgame effects of lower cooperation toward the final game round, investigators have introduced random termination rules. This study extends previous research that has focused narrowly on repeated Prisoner's Dilemma games by conducting a controlled experiment of two-player, random termination Centipede games involving probabilistic reinforcement and characterized by the longest decision sequences reported in the empirical literature to date (24 decision nodes). Specifically, we assessed mean exit points and cooperation rates, and compared the effects of four different termination rules: no random game termination, random game termination with constant termination probability, random game termination with increasing termination probability, and random game termination with decreasing termination probability. We found that although mean exit points were lower for games with shorter expected game lengths, the subjects' cooperativeness was significantly reduced only in the most extreme condition with decreasing computer termination probability and an expected game length of two decision nodes. © 2018 Society for the Experimental Analysis of Behavior.

  14. Probability for human intake of an atom randomly released into ground, rivers, oceans and air

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, B L

    1984-08-01

    Numerical estimates are developed for the probability of an atom randomly released in the top ground layers, in a river, or in the oceans to be ingested orally by a human, and for an atom emitted from an industrial source to be inhaled by a human. Estimates are obtained for both probability per year and for total eventual probability. Results vary considerably for different elements, but typical values for total probabilities are: ground, 3 X 10/sup -3/, oceans, 3 X 10/sup -4/; rivers, 1.7 x 10/sup -4/; and air, 5 X 10/sup -6/. Probabilities per year are typcially 1 X 10/sup -7/ for releases into the ground and 5 X 10/sup -8/ for releases into the oceans. These results indicate that for material with very long-lasting toxicity, it is important to include the pathways from the ground and from the oceans.

  15. Approximations to the Probability of Failure in Random Vibration by Integral Equation Methods

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    Close approximations to the first passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first passage probability density function and the distribution function for the time interval spent below a barrier before outcrossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval, and hence for the first passage probability density. The results of the theory agree well with simulation results for narrow banded processes dominated by a single frequency, as well as for bimodal processes with 2 dominating frequencies in the structural response.
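
    Simulation results of the kind used for comparison can be produced by brute force: integrate a lightly damped oscillator under white noise many times and count barrier crossings. A rough sketch under assumed parameters:

        import math
        import random

        def first_passage_probability(barrier=2.5, omega0=2 * math.pi, zeta=0.05,
                                      horizon=20.0, dt=0.005, runs=500,
                                      rng=random.Random(1)):
            """Monte Carlo estimate of P(max |x(t)| > barrier before the horizon) for
            a lightly damped SDOF oscillator under Gaussian white noise (Euler scheme).
            The noise intensity is chosen so the stationary displacement variance is 1,
            i.e. the barrier is in units of standard deviations (assumed setup)."""
            D = 2.0 * zeta * omega0 ** 3      # gives stationary Var[x] = 1
            crossed = 0
            for _ in range(runs):
                x = v = t = 0.0
                while t < horizon:
                    v += (-2.0 * zeta * omega0 * v - omega0 ** 2 * x) * dt \
                         + math.sqrt(2.0 * D * dt) * rng.gauss(0.0, 1.0)
                    x += v * dt
                    t += dt
                    if abs(x) > barrier:
                        crossed += 1
                        break
            return crossed / runs

        print("P(first passage before T):", first_passage_probability())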

  16. Problems in probability theory, mathematical statistics and theory of random functions

    CERN Document Server

    Sveshnikov, A A

    1979-01-01

    Problem solving is the main thrust of this excellent, well-organized workbook. Suitable for students at all levels in probability theory and statistics, the book presents over 1,000 problems and their solutions, illustrating fundamental theory and representative applications in the following fields: Random Events; Distribution Laws; Correlation Theory; Random Variables; Entropy & Information; Markov Processes; Systems of Random Variables; Limit Theorems; Data Processing; and more.The coverage of topics is both broad and deep, ranging from the most elementary combinatorial problems through lim

  17. A probability measure for random surfaces of arbitrary genus and bosonic strings in 4 dimensions

    International Nuclear Information System (INIS)

    Albeverio, S.; Høegh-Krohn, R.; Paycha, S.; Scarlatti, S.

    1989-01-01

    We define a probability measure describing random surfaces in R^D, 3≤D≤13, parametrized by compact Riemann surfaces of arbitrary genus. The measure involves the path space measure for scalar fields with exponential interaction in 2 space-time dimensions. We show that it gives a mathematical realization of Polyakov's heuristic measure for bosonic strings. (orig.)

  18. Eliciting and Developing Teachers' Conceptions of Random Processes in a Probability and Statistics Course

    Science.gov (United States)

    Smith, Toni M.; Hjalmarson, Margret A.

    2013-01-01

    The purpose of this study is to examine prospective mathematics specialists' engagement in an instructional sequence designed to elicit and develop their understandings of random processes. The study was conducted with two different sections of a probability and statistics course for K-8 teachers. Thirty-two teachers participated. Video analyses…

  19. Stationary Probability and First-Passage Time of Biased Random Walk

    International Nuclear Information System (INIS)

    Li Jing-Wen; Tang Shen-Li; Xu Xin-Ping

    2016-01-01

    In this paper, we consider the stationary probability and first-passage time of biased random walk on a 1D chain, where at each step the walker moves to the left and right with probabilities p and q respectively (0 ⩽ p, q ⩽ 1, p + q = 1). We derive exact analytical results for the stationary probability and first-passage time as a function of p and q for the first time. Our results suggest that the first-passage time shows a double power-law F ∼ (N − 1)^γ, where the exponent γ = 2 for N < |p − q|⁻¹ and γ = 1 for N > |p − q|⁻¹. Our study provides useful insights into the biased random-walk process. (paper)
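
    The predicted crossover is easy to probe by direct simulation; the boundary conditions and parameters in the sketch below are assumptions of the illustration, not of the paper:

        import random

        def mean_first_passage(N, p=0.55, runs=300, rng=random.Random(0)):
            """Mean first-passage time from site 1 to site N for a walker hopping
            right with probability p and left with probability q = 1 - p, with a
            reflecting wall at site 1 (boundary handling is an assumption here)."""
            total = 0
            for _ in range(runs):
                pos, steps = 1, 0
                while pos < N:
                    pos += 1 if rng.random() < p else -1
                    pos = max(pos, 1)
                    steps += 1
                total += steps
            return total / runs

        # Crossover expected near N ~ 1/|p - q| = 10 for p = 0.55:
        for N in (4, 8, 16, 32, 64):
            print(N, round(mean_first_passage(N), 1))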

  20. Probability distribution for the Gaussian curvature of the zero level surface of a random function

    Science.gov (United States)

    Hannay, J. H.

    2018-04-01

    A rather natural construction for a smooth random surface in space is the level surface of value zero, or ‘nodal’ surface f(x, y, z) = 0, of a (real) random function f; the interface between positive and negative regions of the function. A physically significant local attribute at a point of a curved surface is its Gaussian curvature (the product of its principal curvatures) because, when integrated over the surface, it gives the Euler characteristic. Here the probability distribution for the Gaussian curvature at a random point on the nodal surface f = 0 is calculated for a statistically homogeneous (‘stationary’) and isotropic zero mean Gaussian random function f. Capitalizing on the isotropy, a ‘fixer’ device for axes supplies the probability distribution directly as a multiple integral. Its evaluation yields an explicit algebraic function with a simple average. Indeed, this average Gaussian curvature has long been known. For a non-zero level surface instead of the nodal one, the probability distribution is not fully tractable, but is supplied as an integral expression.

  1. Recruitment in a Colorado population of big brown bats: Breeding probabilities, litter size, and first-year survival

    Science.gov (United States)

    O'Shea, T.J.; Ellison, L.E.; Neubaum, D.J.; Neubaum, M.A.; Reynolds, C.A.; Bowen, R.A.

    2010-01-01

    We used mark-recapture estimation techniques and radiography to test hypotheses about 3 important aspects of recruitment in big brown bats (Eptesicus fuscus) in Fort Collins, Colorado: adult breeding probabilities, litter size, and 1st-year survival of young. We marked 2,968 females with passive integrated transponder (PIT) tags at multiple sites during 2001-2005 and based our assessments on direct recaptures (breeding probabilities) and passive detection with automated PIT tag readers (1st-year survival). We interpreted our data in relation to hypotheses regarding demographic influences of bat age, roost, and effects of years with unusual environmental conditions: extreme drought (2002) and arrival of a West Nile virus epizootic (2003). Conditional breeding probabilities at 6 roosts sampled in 2002-2005 were estimated as 0.64 (95% confidence interval [95% CI] = 0.53-0.73) in 1-year-old females, but were consistently high (95% CI = 0.94-0.96) and did not vary by roost, year, or prior year breeding status in older adults. Mean litter size was 1.11 (95% CI = 1.05-1.17), based on examination of 112 pregnant females by radiography. Litter size was not higher in older or larger females and was similar to results of other studies in western North America despite wide variation in latitude. First-year survival was estimated as 0.67 (95% CI = 0.61-0.73) for weaned females at 5 maternity roosts over 5 consecutive years, was lower than adult survival (0.79; 95% CI = 0.77-0.81), and varied by roost. Based on model selection criteria, strong evidence exists for complex roost and year effects on 1st-year survival. First-year survival was lowest in bats born during the drought year. Juvenile females that did not return to roosts as 1-year-olds had lower body condition indices in late summer of their natal year than those known to survive. © 2009 American Society of Mammalogists.

  2. Estimating the probability of survival of individual shortleaf pine (Pinus echinata mill.) trees

    Science.gov (United States)

    Sudip Shrestha; Thomas B. Lynch; Difei Zhang; James M. Guldin

    2012-01-01

    A survival model is needed in a forest growth system which predicts the survival of trees on individual basis or on a stand basis (Gertner, 1989). An individual-tree modeling approach is one of the better methods available for predicting growth and yield as it provides essential information about particular tree species; tree size, tree quality and tree present status...

  3. Fortran code for generating random probability vectors, unitaries, and quantum states

    Directory of Open Access Journals (Sweden)

    Jonas Maziero

    2016-03-01

    The usefulness of generating random configurations is recognized in many areas of knowledge. Fortran was born for scientific computing and has been one of the main programming languages in this area ever since, and several ongoing projects aimed at its improvement indicate that it will keep this status in the decades to come. In this article, we describe Fortran codes produced, or organized, for the generation of the following random objects: numbers, probability vectors, unitary matrices, and quantum state vectors and density matrices. Some matrix functions are also included and may be of independent interest.

  4. Survival probabilities of loggerhead sea turtles (Caretta caretta) estimated from capture-mark-recapture data in the Mediterranean Sea

    Directory of Open Access Journals (Sweden)

    Paolo Casale

    2007-06-01

    Survival probabilities of loggerhead sea turtles (Caretta caretta) are estimated for the first time in the Mediterranean by analysing 3254 tagging and 134 re-encounter data from this region. Most of these turtles were juveniles found at sea. Re-encounters were live resightings and dead recoveries and data were analysed with Barker's model, a modified version of the Cormack-Jolly-Seber model which can combine recapture, live resighting and dead recovery data. An annual survival probability of 0.73 (CI 95% = 0.67-0.78; n=3254) was obtained, and should be considered as a conservative estimate due to an unknown, though not negligible, tag loss rate. This study provides a preliminary estimate of the survival probabilities of in-water developmental stages for the Mediterranean population of endangered loggerhead sea turtles and provides the first insights into the magnitude of the suspected human-induced mortality in the region. The model used here for the first time on sea turtles could be used to obtain survival estimates from other data sets with few or no true recaptures but with other types of re-encounter data, which are a common output of tagging programmes involving these wide-ranging animals.

  5. Probability of failure prediction for step-stress fatigue under sine or random stress

    Science.gov (United States)

    Lambert, R. G.

    1979-01-01

    A previously proposed cumulative fatigue damage law is extended to predict the probability of failure or fatigue life for structural materials with S-N fatigue curves represented as a scatterband of failure points. The proposed law applies to structures subjected to sinusoidal or random stresses and includes the effect of initial crack (i.e., flaw) sizes. The corrected cycle ratio damage function is shown to have physical significance.

  6. An extended car-following model considering random safety distance with different probabilities

    Science.gov (United States)

    Wang, Jufeng; Sun, Fengxin; Cheng, Rongjun; Ge, Hongxia; Wei, Qi

    2018-02-01

    Because of differences in vehicle type or driving skill, driving strategies are not exactly the same. The driving speeds of different vehicles may differ for the same headway. Since the optimal velocity function is determined by the safety distance as well as the maximum velocity and headway, an extended car-following model accounting for random safety distance with different probabilities is proposed in this paper. The linear stability condition for this extended traffic model is obtained by using linear stability theory. Numerical simulations are carried out to explore the complex phenomena resulting from multiple safety distances in the optimal velocity function. The cases of multiple types of safety distances selected with different probabilities are presented. Numerical results show that traffic flow with multiple safety distances with different probabilities will be more unstable than that with a single type of safety distance, and will result in more stop-and-go phenomena.

  7. Estimates for the probability of survival of electrons in passing through a radiator

    International Nuclear Information System (INIS)

    Loos, J.

    1977-01-01

    Some calculations on the survival of electrons passing through various radiator thicknesses are tabulated. The results of these calculations should serve as a guide for the expected attenuation of electrons in the beam when various Pb radiators are inserted.

  8. A cellular automata model of traffic flow with variable probability of randomization

    International Nuclear Information System (INIS)

    Zheng Wei-Fan; Zhang Ji-Ye

    2015-01-01

    Research on the stochastic behavior of traffic flow is important to understand the intrinsic evolution rules of a traffic system. By introducing an interactional potential of vehicles into the randomization step, an improved cellular automata traffic flow model with variable probability of randomization is proposed in this paper. In the proposed model, the driver is affected by the interactional potential of vehicles before him, and his decision-making process is related to the interactional potential. Compared with the traditional cellular automata model, the modeling is more suitable for the driver’s random decision-making process based on the vehicle and traffic situations in front of him in actual traffic. From the improved model, the fundamental diagram (flow–density relationship) is obtained, and the detailed high-density traffic phenomenon is reproduced through numerical simulation. (paper)
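
    A minimal version of such a model can be built on the Nagel-Schreckenberg cellular automaton by letting the randomization probability depend on the headway. The functional form used below is an assumed stand-in for the paper's interaction potential, not the authors' rule:

        import math
        import random

        def nasch_step(road, vmax=5, p0=0.3, alpha=0.5, rng=random):
            """One parallel update of a Nagel-Schreckenberg ring road in which the
            randomization probability decays with the headway, p = p0*exp(-alpha*gap)
            (an assumed form). road[i] is a velocity, or None for an empty cell."""
            L, cars = len(road), sorted(i for i, v in enumerate(road) if v is not None)
            new = [None] * L
            for idx, i in enumerate(cars):
                gap = (cars[(idx + 1) % len(cars)] - i - 1) % L
                v = min(road[i] + 1, vmax, gap)          # accelerate but do not collide
                if v > 0 and rng.random() < p0 * math.exp(-alpha * gap):
                    v -= 1                               # headway-dependent random braking
                new[(i + v) % L] = v
            return new

        rng = random.Random(1)
        road = [0 if i % 4 == 0 else None for i in range(100)]   # density 0.25
        for _ in range(500):
            road = nasch_step(road, rng=rng)
        print("flow:", sum(v for v in road if v) / len(road))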

  9. Some Limit Properties of Random Transition Probability for Second-Order Nonhomogeneous Markov Chains Indexed by a Tree

    Directory of Open Access Journals (Sweden)

    Shi Zhiyan

    2009-01-01

    We study some limit properties of the harmonic mean of random transition probability for a second-order nonhomogeneous Markov chain and a nonhomogeneous Markov chain indexed by a tree. As a corollary, we obtain the property of the harmonic mean of random transition probability for a nonhomogeneous Markov chain.

  10. Manuscripts as Evidence for the use of Classics in Education, c. 800–1200: Estimating the Randomness of Survival

    Directory of Open Access Journals (Sweden)

    Jaakko Tahkokallio

    2017-06-01

    Are the surviving copies of schooltexts representative of what was popularly used in schools in the medieval period? In other words, was the survival of these manuscripts a random or selective process? To approach this question, this article presents a series of comparisons between the numbers of manuscripts of different schooltexts. It demonstrates that the most popular schooltexts all survive in very similar numbers from each century, and that the typical number of copies varies from one century to another. The easiest explanation for such a survival pattern is to assume that the texts were produced in equal numbers and passed through a relatively random filter of losses. The article seeks to test this intuitive explanation by using a simple probability-theoretic experiment. In addition, the article analyses how the numbers of surviving manuscripts relate to entries in medieval book lists and medieval library catalogues. This examination supports the interpretation that the survival of schooltexts was a relatively random process. In addition, comparison between medieval book lists and extant manuscripts advocates caution in using the book lists as evidence for the popularity of texts in the medieval centuries. Even though the catalogues provide snapshots of specific historical situations, this paper concludes that the mass of extant books is more likely to give us a realistic picture of the contemporary popularity of texts.

  11. Malnutrition among rural and urban children in Lesotho: related hazard and survival probabilities

    Directory of Open Access Journals (Sweden)

    Zeleke Worku

    2003-11-01

    The relationship between the survival time of children and several variables that affect the survival and nutritional status of children under the age of five years in the Maseru District of Lesotho was investigated.

  12. Experiencing El Niño conditions during early life reduces recruiting probabilities but not adult survival

    Science.gov (United States)

    Rodríguez, Cristina; Drummond, Hugh

    2018-01-01

    In wild long-lived animals, analysis of impacts of stressful natal conditions on adult performance has rarely embraced the entire age span, and the possibility that costs are expressed late in life has seldom been examined. Using 26 years of data from 8541 fledglings and 1310 adults of the blue-footed booby (Sula nebouxii), a marine bird that can live up to 23 years, we tested whether experiencing the warm waters and food scarcity associated with El Niño in the natal year reduces recruitment or survival over the adult lifetime. Warm water in the natal year reduced the probability of recruiting; each additional degree (°C) of water temperature meant a reduction of roughly 50% in fledglings' probability of returning to the natal colony as breeders. Warm water in the current year impacted adult survival, with greater effect at the oldest ages than during early adulthood. However, warm water in the natal year did not affect survival at any age over the adult lifespan. A previous study showed that early recruitment and widely spaced breeding allow boobies that experience warm waters in the natal year to achieve normal fledgling production over the first 10 years; our results now show that this reproductive effort incurs no survival penalty, not even late in life. This pattern is additional evidence of buffering against stressful natal conditions via life-history adjustments. PMID:29410788

  13. Spencer-Brown vs. Probability and Statistics: Entropy’s Testimony on Subjective and Objective Randomness

    Directory of Open Access Journals (Sweden)

    Julio Michael Stern

    2011-04-01

    This article analyzes the role of entropy in Bayesian statistics, focusing on its use as a tool for detection, recognition and validation of eigen-solutions. “Objects as eigen-solutions” is a key metaphor of the cognitive constructivism epistemological framework developed by the philosopher Heinz von Foerster. Special attention is given to some objections to the concepts of probability, statistics and randomization posed by George Spencer-Brown, a figure of great influence in the field of radical constructivism.

  14. General Exact Solution to the Problem of the Probability Density for Sums of Random Variables

    Science.gov (United States)

    Tribelsky, Michael I.

    2002-07-01

    The exact explicit expression for the probability density pN(x) for a sum of N random, arbitrarily correlated summands is obtained. The expression is valid for any number N and any distribution of the random summands. Most attention is paid to application of the developed approach to the case of independent and identically distributed summands. The obtained results reproduce all known exact solutions valid for the so-called stable distributions of the summands. It is also shown that if the distribution is not stable, the profile of pN(x) may be divided into three parts, namely a core (small x), a tail (large x), and a crossover from the core to the tail (moderate x). The quantitative description of all three parts as well as that for the entire profile is obtained. A number of particular examples are considered in detail.
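
    For independent and identically distributed summands, pN is simply the N-fold convolution of the single-summand density, which is easy to check numerically; the exponential summand below is an arbitrary, non-stable choice:

        import numpy as np

        # Build p_4 as the 4-fold convolution of Exp(1) on a grid; the exact
        # answer is the Gamma(4, 1) density x^3 e^{-x}/3!, which the grid
        # convolution reproduces up to discretization error.
        dx = 0.01
        x = np.arange(0.0, 60.0, dx)
        p1 = np.exp(-x)
        pN = p1.copy()
        for _ in range(3):
            pN = np.convolve(pN, p1)[: len(x)] * dx
        exact = x ** 3 * np.exp(-x) / 6.0
        print("max abs deviation from Gamma(4,1):", np.abs(pN - exact).max())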

  15. Intraseasonal variation in survival and probable causes of mortality in greater sage-grouse Centrocercus urophasianus

    Science.gov (United States)

    Blomberg, Erik J.; Gibson, Daniel; Sedinger, James S.; Casazza, Michael L.; Coates, Peter S.

    2013-01-01

    The mortality process is a key component of avian population dynamics, and understanding factors that affect mortality is central to grouse conservation. Populations of greater sage-grouse Centrocercus urophasianus have declined across their range in western North America. We studied cause-specific mortality of radio-marked sage-grouse in Eureka County, Nevada, USA, during two seasons, nesting (2008-2012) and fall (2008-2010), when survival was known to be lower compared to other times of the year. We used known-fate and cumulative incidence function models to estimate weekly survival rates and cumulative risk of cause-specific mortalities, respectively. These methods allowed us to account for temporal variation in sample size and staggered entry of marked individuals into the sample to obtain robust estimates of survival and cause-specific mortality. We monitored 376 individual sage-grouse during the course of our study, and investigated 87 deaths. Predation was the major source of mortality, and accounted for 90% of all mortalities during our study. During the nesting season (1 April - 31 May), the cumulative risk of predation by raptors (0.10; 95% CI: 0.05-0.16) and mammals (0.08; 95% CI: 0.03-0.13) was relatively equal. In the fall (15 August - 31 October), the cumulative risk of mammal predation was greater (M(mam) = 0.12; 95% CI: 0.04-0.19) than either predation by raptors (M(rap) = 0.05; 95% CI: 0.00-0.10) or hunting harvest (M(hunt) = 0.02; 95% CI: 0.0-0.06). During both seasons, we observed relatively few additional sources of mortality (e.g. collision) and observed no evidence of disease-related mortality (e.g. West Nile Virus). In general, we found little evidence for intraseasonal temporal variation in survival, suggesting that the nesting and fall seasons represent biologically meaningful time intervals with respect to sage-grouse survival.

  16. Disparities in breast cancer tumor characteristics, treatment, time to treatment, and survival probability among African American and white women.

    Science.gov (United States)

    Foy, Kevin Chu; Fisher, James L; Lustberg, Maryam B; Gray, Darrell M; DeGraffinreid, Cecilia R; Paskett, Electra D

    2018-01-01

    African American (AA) women have a 42% higher breast cancer death rate compared to white women despite recent advancements in management of the disease. We examined racial differences in clinical and tumor characteristics, treatment and survival in patients diagnosed with breast cancer between 2005 and 2014 at a single institution, the James Cancer Hospital, who were included in the Arthur G. James Cancer Hospital and Richard J. Solove Research Institute Cancer Registry in Columbus, OH. Statistical analyses included likelihood ratio chi-square tests for differences in proportions, as well as univariate and multivariate Cox proportional hazards regressions to examine associations between race and overall and progression-free survival probabilities. AA women made up 10.2% (469 of 4593) of the sample. Average time to onset of treatment after diagnosis was almost two times longer in AA women compared to white women (62.0 days vs 35.5 days). AA women were more likely to present with triple negative and late stage breast cancer, and were less likely to receive surgery, especially mastectomy and reconstruction following mastectomy. After adjustment for confounding factors (age, grade, and surgery), overall survival probability was significantly associated with race (HR = 1.33; 95% CI 1.03-1.72). These findings highlight the need for efforts focused on screening and receipt of prompt treatment among AA women diagnosed with breast cancer.

  17. Corticosterone levels predict survival probabilities of Galápagos marine iguanas during El Niño events

    Science.gov (United States)

    Romero, L. Michael; Wikelski, Martin

    2001-01-01

    Plasma levels of corticosterone are often used as a measure of “stress” in wild animal populations. However, we lack conclusive evidence that different stress levels reflect different survival probabilities between populations. Galápagos marine iguanas offer an ideal test case because island populations are affected differently by recurring El Niño famine events, and population-level survival can be quantified by counting iguanas locally. We surveyed corticosterone levels in six populations during the 1998 El Niño famine and the 1999 La Niña feast period. Iguanas had higher baseline and handling stress-induced corticosterone concentrations during famine than feast conditions. Corticosterone levels differed between islands and predicted survival through an El Niño period. However, among individuals, baseline corticosterone was only elevated when body condition dropped below a critical threshold. Thus, the population-level corticosterone response was variable but nevertheless predicted overall population health. Our results lend support to the use of corticosterone as a rapid quantitative predictor of survival in wild animal populations. PMID:11416210

  18. Survival probability for diffractive dijet production in p anti p collisions from next-to-leading order calculations

    International Nuclear Information System (INIS)

    Klasen, M.; Kramer, G.

    2009-08-01

    We perform next-to-leading order calculations of the single-diffractive and non-diffractive cross sections for dijet production in proton-antiproton collisions at the Tevatron. By comparing their ratio to the data published by the CDF collaboration for two different center-of-mass energies, we deduce the rapidity-gap survival probability as a function of the momentum fraction of the parton in the antiproton. Assuming Regge factorization, this probability can be interpreted as a suppression factor for the diffractive structure function measured in deep-inelastic scattering at HERA. In contrast to the observations for photoproduction, the suppression factor in proton-antiproton collisions depends on the momentum fraction of the parton in the Pomeron even at next-to-leading order. (orig.)

  19. Method for Evaluation of Outage Probability on Random Access Channel in Mobile Communication Systems

    Science.gov (United States)

    Kollár, Martin

    2012-05-01

    In order to access the cell in all mobile communication technologies a so-called random-access procedure is used. For example, in GSM this is represented by sending the CHANNEL REQUEST message from the Mobile Station (MS) to the Base Transceiver Station (BTS), which is consequently forwarded as a CHANNEL REQUIRED message to the Base Station Controller (BSC). If the BTS decodes some noise on the Random Access Channel (RACH) as a random access by mistake (a so-called 'phantom RACH'), then it is a question of pure coincidence which 'establishment cause' the BTS thinks to have recognized. A typical invalid channel access request or phantom RACH is characterized by an IMMEDIATE ASSIGNMENT procedure (assignment of an SDCCH or TCH) which is not followed by sending an ESTABLISH INDICATION from MS to BTS. In this paper a mathematical model for evaluation of the Power RACH Busy Threshold (RACHBT) in order to guarantee a predetermined outage probability on the RACH is described and discussed. It focuses on the Global System for Mobile Communications (GSM); however, the obtained results can be generalized to the remaining mobile technologies (i.e. WCDMA and LTE).

  20. Convergence estimates in probability and in expectation for discrete least squares with noisy evaluations at random points

    KAUST Repository

    Migliorati, Giovanni; Nobile, Fabio; Tempone, Raul

    2015-01-01

    We study the accuracy of the discrete least-squares approximation on a finite dimensional space of a real-valued target function from noisy pointwise evaluations at independent random points distributed according to a given sampling probability measure.

  1. Survival under uncertainty an introduction to probability models of social structure and evolution

    CERN Document Server

    Volchenkov, Dimitri

    2016-01-01

    This book introduces and studies a number of stochastic models of subsistence, communication, social evolution and political transition that will allow the reader to grasp the role of uncertainty as a fundamental property of our irreversible world. At the same time, it aims to bring about a more interdisciplinary and quantitative approach across very diverse fields of research in the humanities and social sciences. Through the examples treated in this work – including anthropology, demography, migration, geopolitics, management, and bioecology, among other fields – evidence is gathered to show that volatile environments may change the rules of evolutionary selection and the dynamics of any social system, creating a situation of adaptive uncertainty, in particular whenever the rate of change of the environment exceeds the rate of adaptation. Last but not least, it is hoped that this book will contribute to the understanding that inherent randomness can also be a great opportunity – for social systems an...

  2. Some results on convergence rates for probabilities of moderate deviations for sums of random variables

    Directory of Open Access Journals (Sweden)

    Deli Li

    1992-01-01

    Full Text Available Let {X, X_n; n ≥ 1} be a sequence of i.i.d. real random variables, and let S_n = ∑_{k=1}^n X_k, n ≥ 1. Convergence rates of moderate deviations are derived, i.e., the rate of convergence to zero of certain tail probabilities of the partial sums is determined. For example, we obtain equivalent conditions for the convergence of the series ∑_{n≥1} (ψ²(n)/n) P(|S_n| ≥ √n φ(n)) only under the assumptions that EX = 0 and EX² = 1, where φ and ψ are taken from a broad class of functions. These results generalize and improve some recent results of Li (1991) and Gafurov (1982) and some previous work of Davis (1968). For b ∈ [0,1] and ϵ > 0, let λ_{ϵ,b} = ∑_{n≥3} ((log log n)^b / n) I(|S_n| ≥ √((2+ϵ) n log log n)). The behaviour of E λ_{ϵ,b} as ϵ ↓ 0 is also studied.

  3. Lay understanding of forensic statistics: Evaluation of random match probabilities, likelihood ratios, and verbal equivalents.

    Science.gov (United States)

    Thompson, William C; Newman, Eryn J

    2015-08-01

    Forensic scientists have come under increasing pressure to quantify the strength of their evidence, but it is not clear which of several possible formats for presenting quantitative conclusions will be easiest for lay people, such as jurors, to understand. This experiment examined the way that people recruited from Amazon's Mechanical Turk (n = 541) responded to 2 types of forensic evidence--a DNA comparison and a shoeprint comparison--when an expert explained the strength of this evidence 3 different ways: using random match probabilities (RMPs), likelihood ratios (LRs), or verbal equivalents of likelihood ratios (VEs). We found that verdicts were sensitive to the strength of DNA evidence regardless of how the expert explained it, but verdicts were sensitive to the strength of shoeprint evidence only when the expert used RMPs. The weight given to DNA evidence was consistent with the predictions of a Bayesian network model that incorporated the perceived risk of a false match from 3 causes (coincidence, a laboratory error, and a frame-up), but shoeprint evidence was undervalued relative to the same Bayesian model. Fallacious interpretations of the expert's testimony (consistent with the source probability error and the defense attorney's fallacy) were common and were associated with the weight given to the evidence and verdicts. The findings indicate that perceptions of forensic science evidence are shaped by prior beliefs and expectations as well as expert testimony and consequently that the best way to characterize and explain forensic evidence may vary across forensic disciplines. (c) 2015 APA, all rights reserved).
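
    For readers unfamiliar with the quantities compared in this study, the Bayesian identity linking them can be sketched in a few lines (numbers hypothetical): an RMP implies a likelihood ratio, which multiplies the prior odds of a common source.

```python
# Minimal illustration with hypothetical numbers: posterior odds of a
# common source = prior odds x likelihood ratio, where LR = 1/RMP in the
# idealised case that a true source always produces a reported match.
prior_odds = 1 / 1000   # a juror's prior odds that the defendant is the source
rmp = 1e-6              # random match probability quoted by the expert
lr = 1 / rmp            # implied likelihood ratio
posterior_odds = prior_odds * lr
posterior_prob = posterior_odds / (1 + posterior_odds)
print(f"LR = {lr:.0f}, posterior probability = {posterior_prob:.4f}")
```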

  4. Sugar administration to newly emerged Aedes albopictus males increases their survival probability and mating performance.

    Science.gov (United States)

    Bellini, Romeo; Puggioli, Arianna; Balestrino, Fabrizio; Brunelli, Paolo; Medici, Anna; Urbanelli, Sandra; Carrieri, Marco

    2014-04-01

    Aedes albopictus male survival in laboratory cages is no more than 4-5 days when males are kept without any access to sugar, indicating their need to feed on a sugar source soon after emergence. We therefore developed a device to administer energetic substances to newly emerged males when released as pupae as part of a sterile insect technique (SIT) programme, made from a polyurethane sponge 4 cm thick and perforated with holes 2 cm in diameter. The sponge was imbibed with the required sugar solution, and due to its high retention capacity the sugar solution remained available for males to feed on for at least 48 h. When evaluated in lab cages, comparing adults emerged from the device with sugar solution vs the device with water only (as negative control), about half of the males tested positive for fructose using the Van Handel anthrone test, compared to none of the males in the control cage. We then tested the tool in semi-field and field conditions with different sugar concentrations (10%, 15%, and 20%) and compared results to controls fed with water only. Males were recaptured with a battery-operated manual aspirator at 24 and 48 h after pupae release. A rather high share (10-25%) of the males captured in the vicinity of the control stations tested positive for fructose, while in the vicinity of the sugar stations around 40-55% of males were positive, though variability between replicates was large. The sugar-positive males in the control test may have been released males that had access to natural sugar sources found close to the release station and/or wild males present in the environment. Only a slight increase in the proportion of positive males was obtained by increasing the sugar concentration in the feeding device from 10% to 20%. Surprisingly, modification of the device to add a black plastic inverted funnel above the container reduced rather than increased the proportion of fructose-positive males collected around the station. No evidence of difference in the

  5. Characteristics of the probability function for three random-walk models of reaction--diffusion processes

    International Nuclear Information System (INIS)

    Musho, M.K.; Kozak, J.J.

    1984-01-01

    A method is presented for calculating exactly the relative width ⟨σ²⟩^{1/2}/⟨n⟩, the skewness γ₁, and the kurtosis γ₂ characterizing the probability distribution function for three random-walk models of diffusion-controlled processes. For processes in which a diffusing coreactant A reacts irreversibly with a target molecule B situated at a reaction center, three models are considered. The first is the traditional one of an unbiased, nearest-neighbor random walk on a d-dimensional periodic/confining lattice with traps; the second involves the consideration of unbiased, non-nearest-neighbor (i.e., variable-step-length) walks on the same d-dimensional lattice; and the third deals with the case of a biased, nearest-neighbor walk on a d-dimensional lattice (wherein a walker experiences a potential centered at the deep trap site of the lattice). Our method, which has been described in detail elsewhere [P.A. Politowicz and J. J. Kozak, Phys. Rev. B 28, 5549 (1983)], is based on the use of group theoretic arguments within the framework of the theory of finite Markov processes.
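
    A Monte Carlo counterpart to their exact calculation is easy to sketch for the simplest (one-dimensional, nearest-neighbour) case; the lattice size and trap placement below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def walk_length(n_sites=21, start=10):
    """Steps until an unbiased nearest-neighbour walk on a 1-D periodic
    lattice reaches the trap at site 0 (a stand-in for the paper's exact
    d-dimensional finite-Markov-chain calculation)."""
    pos, steps = start, 0
    while pos != 0:
        pos = (pos + rng.choice((-1, 1))) % n_sites
        steps += 1
    return steps

n = np.array([walk_length() for _ in range(20000)], dtype=float)
mu, sigma = n.mean(), n.std()
gamma1 = ((n - mu) ** 3).mean() / sigma**3         # skewness
gamma2 = ((n - mu) ** 4).mean() / sigma**4 - 3.0   # excess kurtosis
print(mu, sigma / mu, gamma1, gamma2)              # <n>, relative width, γ1, γ2
```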

  6. On the Generation of Random Ensembles of Qubits and Qutrits Computing Separability Probabilities for Fixed Rank States

    Directory of Open Access Journals (Sweden)

    Khvedelidze Arsen

    2018-01-01

    Full Text Available The generation of random mixed states is discussed, aiming for the computation of probabilistic characteristics of composite finite dimensional quantum systems. In particular, we consider the generation of random Hilbert-Schmidt and Bures ensembles of qubit and qutrit pairs and compute the corresponding probabilities to find a separable state among the states of a fixed rank.
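
    For the simplest case treated (full-rank qubit pairs under the Hilbert-Schmidt measure), the computation can be sketched as follows: sample ρ = G G†/tr(G G†) with G a complex Ginibre matrix and apply the Peres-Horodecki (PPT) test, which for 2×2 systems is equivalent to separability.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_hs_state(dim=4):
    """Full-rank density matrix from the Hilbert-Schmidt ensemble."""
    g = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    rho = g @ g.conj().T
    return rho / np.trace(rho).real

def is_ppt(rho, da=2, db=2):
    """Peres-Horodecki test; for 2x2 systems, PPT <=> separable."""
    pt = (rho.reshape(da, db, da, db)     # axes: (a_row, b_row, a_col, b_col)
             .transpose(0, 3, 2, 1)       # transpose the B subsystem
             .reshape(da * db, da * db))
    return np.linalg.eigvalsh(pt).min() >= -1e-12

trials = 20000
sep = sum(is_ppt(random_hs_state()) for _ in range(trials))
print(sep / trials)   # ~0.24; the conjectured exact value is 8/33
```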

  7. Contribution to the neutronic theory of random stacks (diffusion coefficient and first-flight collision probabilities) with a general theorem on collision probabilities

    International Nuclear Information System (INIS)

    Dixmier, Marc.

    1980-10-01

    A general expression for the diffusion coefficient (d.c.) of neutrons is given, with stress put on symmetries. A system of first-flight collision probabilities is built for the case of a random stack of any number of types of one- and two-zoned spherical pebbles, with an albedo at the boundaries of the elements or, alternatively, consideration of the interstitial medium; to that end, the bases of collision probability theory are reviewed, and a wide generalisation of the reciprocity theorem for those probabilities is demonstrated. The migration area of neutrons is expressed for any random stack of convex, 'simple' and 'regular-contact' elements, taking into account the correlations between free paths; the average cosine of re-emission of neutrons by an element is expressed for the case of a homogeneous spherical pebble and the transport approximation; the superiority of the result thus found over Behrens' theory, for the type of media under consideration, is established. The 'fine structure current term' of the d.c. is also expressed, and it is shown that its 'polarisation term' is negligible. Numerical applications show that the global heterogeneity effect on the d.c. of pebble-bed reactors is comparable with that for graphite-moderated, carbon-gas-cooled, natural-uranium reactors. The code CARACOLE, which integrates all the results obtained here, is introduced [fr]

  8. How long do the dead survive on the road? Carcass persistence probability and implications for road-kill monitoring surveys.

    Directory of Open Access Journals (Sweden)

    Sara M Santos

    Full Text Available BACKGROUND: Road mortality is probably the best-known and most visible impact of roads upon wildlife. Although several factors influence road-kill counts, carcass persistence time is considered the most important determinant underlying underestimates of road mortality. The present study aims to describe and model carcass persistence variability on the road for different taxonomic groups under different environmental conditions throughout the year, and also to assess the effect of sampling frequency on the relative variation in road-kill estimates registered within a survey. METHODOLOGY/PRINCIPAL FINDINGS: Daily surveys of road-killed vertebrates were conducted over one year along four road sections with different traffic volumes. Survival analysis was then used to (i) describe carcass persistence timings overall and for specific animal groups; (ii) assess optimal sampling designs according to research objectives; and (iii) model the influence of road, animal and weather factors on carcass persistence probabilities. Most animal carcasses persisted on the road for the first day only, with some groups disappearing at very high rates. The advisable periodicity of road monitoring that minimizes bias in road mortality estimates is daily monitoring for bats (in the morning) and lizards (in the afternoon); daily monitoring for toads, small birds, small mammals, snakes, salamanders, and lagomorphs; 1-day intervals (alternate days) for large birds, birds of prey, hedgehogs, and freshwater turtles; and 2-day intervals for carnivores. Multiple factors influenced the persistence probabilities of vertebrate carcasses on the road. Overall, persistence was much lower for small animals, on roads with lower traffic volumes, for carcasses located on road lanes, and during humid conditions and high temperatures during the wet and dry seasons, respectively. CONCLUSION/SIGNIFICANCE: The guidance given here on monitoring frequencies is particularly relevant to provide

  9. How long do the dead survive on the road? Carcass persistence probability and implications for road-kill monitoring surveys.

    Science.gov (United States)

    Santos, Sara M; Carvalho, Filipe; Mira, António

    2011-01-01

    Road mortality is probably the best-known and visible impact of roads upon wildlife. Although several factors influence road-kill counts, carcass persistence time is considered the most important determinant underlying underestimates of road mortality. The present study aims to describe and model carcass persistence variability on the road for different taxonomic groups under different environmental conditions throughout the year; and also to assess the effect of sampling frequency on the relative variation in road-kill estimates registered within a survey. Daily surveys of road-killed vertebrates were conducted over one year along four road sections with different traffic volumes. Survival analysis was then used to i) describe carcass persistence timings for overall and for specific animal groups; ii) assess optimal sampling designs according to research objectives; and iii) model the influence of road, animal and weather factors on carcass persistence probabilities. Most animal carcasses persisted on the road for the first day only, with some groups disappearing at very high rates. The advisable periodicity of road monitoring that minimizes bias in road mortality estimates is daily monitoring for bats (in the morning) and lizards (in the afternoon), daily monitoring for toads, small birds, small mammals, snakes, salamanders, and lagomorphs; 1 day-interval (alternate days) for large birds, birds of prey, hedgehogs, and freshwater turtles; and 2 day-interval for carnivores. Multiple factors influenced the persistence probabilities of vertebrate carcasses on the road. Overall, the persistence was much lower for small animals, on roads with lower traffic volumes, for carcasses located on road lanes, and during humid conditions and high temperatures during the wet season and dry seasons, respectively. The guidance given here on monitoring frequencies is particularly relevant to provide conservation and transportation agencies with accurate numbers of road
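
    The survival-analysis step can be sketched with the lifelines package on hypothetical persistence data (durations in days; removal either observed or right-censored at the end of monitoring):

```python
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(2)

# Hypothetical persistence data: day each carcass disappeared, and
# whether the removal was actually observed (otherwise right-censored).
days = np.ceil(rng.exponential(scale=1.3, size=200))
observed = rng.random(200) < 0.9

kmf = KaplanMeierFitter()
kmf.fit(days, event_observed=observed, label="all carcasses")
print(kmf.survival_function_.head())   # P(carcass still on the road at day t)
print(float(kmf.predict(1)))           # persistence probability past day 1
```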

  10. Random function representation of stationary stochastic vector processes for probability density evolution analysis of wind-induced structures

    Science.gov (United States)

    Liu, Zhangjun; Liu, Zenghui

    2018-06-01

    This paper develops a hybrid approach of spectral representation and random function for simulating stationary stochastic vector processes. In the proposed approach, the high-dimensional random variables included in the original spectral representation (OSR) formula are effectively reduced to only two elementary random variables by introducing random functions that serve as random constraints. On this basis, satisfactory simulation accuracy can be guaranteed by selecting a small representative point set of the elementary random variables. The probability information of the stochastic excitations can be fully captured with just several hundred sample functions generated by the proposed approach. Therefore, combined with the probability density evolution method (PDEM), the approach can be used for dynamic response analysis and reliability assessment of engineering structures. For illustrative purposes, a stochastic turbulent wind velocity field acting on a frame-shear-wall structure is simulated by constructing three types of random functions to demonstrate the accuracy and efficiency of the proposed approach. Careful and in-depth studies concerning the probability density evolution analysis of the wind-induced structure have been conducted to better illustrate the application prospects of the proposed approach. Numerical examples also show that the proposed approach possesses good robustness.
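
    The starting point, the original spectral representation with independent uniform random phases, can be sketched as below; the paper's contribution, replacing the many phase variables by random functions of two elementary variables, is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_osr(S, w_max, N, t):
    """Original spectral representation of a zero-mean stationary process:
    X(t) = sum_k sqrt(2 S(w_k) dw) cos(w_k t + phi_k), phi_k ~ U(0, 2pi)."""
    dw = w_max / N
    w = (np.arange(N) + 0.5) * dw
    phi = rng.uniform(0.0, 2.0 * np.pi, N)
    amp = np.sqrt(2.0 * S(w) * dw)
    return (amp[:, None] * np.cos(np.outer(w, t) + phi[:, None])).sum(axis=0)

S = lambda w: 1.0 / (1.0 + w**2)          # hypothetical one-sided PSD
t = np.linspace(0.0, 100.0, 2001)
x = simulate_osr(S, w_max=10.0, N=512, t=t)
print(x.var())   # should approach the integral of S over [0, w_max] (~1.47)
```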

  11. What's Missing in Teaching Probability and Statistics: Building Cognitive Schema for Understanding Random Phenomena

    Science.gov (United States)

    Kuzmak, Sylvia

    2016-01-01

    Teaching probability and statistics is more than teaching the mathematics itself. Historically, the mathematics of probability and statistics was first developed through analyzing games of chance such as the rolling of dice. This article makes the case that the understanding of probability and statistics is dependent upon building a…

  12. Digital simulation of two-dimensional random fields with arbitrary power spectra and non-Gaussian probability distribution functions.

    Science.gov (United States)

    Yura, Harold T; Hanson, Steen G

    2012-04-01

    Methods for simulation of two-dimensional signals with arbitrary power spectral densities and signal amplitude probability density functions are disclosed. The method relies on initially transforming a white noise sample set of random Gaussian distributed numbers into a corresponding set with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative examples with relevance for optics are given.
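
    A minimal sketch of the two-step recipe on a 2-D grid (hypothetical target spectrum and gamma marginal); note that the memoryless transform slightly distorts the spectrum, which is why the authors describe the method as an engineering approach.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 256

# Step 1: colour white Gaussian noise to a desired power spectrum.
white = rng.normal(size=(n, n))
k2 = np.fft.fftfreq(n)[:, None] ** 2 + np.fft.fftfreq(n)[None, :] ** 2
filt = np.where(k2 > 0.0, k2 ** -0.5, 0.0)          # hypothetical spectrum
g = np.fft.ifft2(np.fft.fft2(white) * filt).real
g = (g - g.mean()) / g.std()

# Step 2: memoryless transform to the target marginal PDF:
# u = Phi(g) is uniform on (0, 1); apply the target inverse CDF.
field = stats.gamma.ppf(stats.norm.cdf(g), a=2.0)
print(field.mean(), field.var())    # ~2 and ~2 for a gamma(2, 1) marginal
```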

  13. Impulsive synchronization of Markovian jumping randomly coupled neural networks with partly unknown transition probabilities via multiple integral approach.

    Science.gov (United States)

    Chandrasekar, A; Rakkiyappan, R; Cao, Jinde

    2015-10-01

    This paper studies the impulsive synchronization of Markovian jumping randomly coupled neural networks with partly unknown transition probabilities via a multiple integral approach. The array of neural networks is coupled in a random fashion governed by a Bernoulli random variable. The aim of this paper is to obtain synchronization criteria suitable for both exactly known and partly unknown transition probabilities, such that the coupled neural network is synchronized with mixed time-delay. The considered impulsive effects can achieve synchronization even under partly unknown transition probabilities. Besides, a multiple integral approach is proposed to strengthen the results for Markovian jumping randomly coupled neural networks with partly unknown transition probabilities. By making use of the Kronecker product and some useful integral inequalities, a novel Lyapunov-Krasovskii functional is designed for handling the coupled neural network with mixed delay, and the impulsive synchronization criteria are then solvable as a set of linear matrix inequalities. Finally, numerical examples are presented to illustrate the effectiveness and advantages of the theoretical results. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Estimating survival probabilities by exposure levels: utilizing vital statistics and complex survey data with mortality follow-up.

    Science.gov (United States)

    Landsman, V; Lou, W Y W; Graubard, B I

    2015-05-20

    We present a two-step approach for estimating hazard rates and, consequently, survival probabilities, by levels of general categorical exposure. The resulting estimator utilizes three sources of data: vital statistics data and census data are used at the first step to estimate the overall hazard rate for a given combination of gender and age group, and cohort data constructed from a nationally representative complex survey with linked mortality records are used at the second step to divide the overall hazard rate by exposure levels. We present an explicit expression for the resulting estimator and consider two methods for variance estimation that account for the complex multistage sample design: (1) the leaving-one-out jackknife method, and (2) the Taylor linearization method, which provides an analytic formula for the variance estimator. The methods are illustrated with smoking and all-cause mortality data from the US National Health Interview Survey Linked Mortality Files, and the proposed estimator is compared with a previously studied crude hazard rate estimator that uses survey data only. The advantages of a two-step approach and possible extensions of the proposed estimator are discussed. Copyright © 2015 John Wiley & Sons, Ltd.

  15. Exact calculations of survival probability for diffusion on growing lines, disks, and spheres: The role of dimension.

    Science.gov (United States)

    Simpson, Matthew J; Baker, Ruth E

    2015-09-07

    Unlike standard applications of transport theory, the transport of molecules and cells during embryonic development often takes place within growing multidimensional tissues. In this work, we consider a model of diffusion on uniformly growing lines, disks, and spheres. An exact solution of the partial differential equation governing the diffusion of a population of individuals on the growing domain is derived. Using this solution, we study the survival probability, S(t). For the standard non-growing case with an absorbing boundary, we observe that S(t) decays to zero in the long time limit. In contrast, when the domain grows linearly or exponentially with time, we show that S(t) decays to a constant, positive value, indicating that a proportion of the diffusing substance remains on the growing domain indefinitely. Comparing S(t) for diffusion on lines, disks, and spheres indicates that there are minimal differences in S(t) in the limit of zero growth and minimal differences in S(t) in the limit of fast growth. In contrast, for intermediate growth rates, we observe modest differences in S(t) between different geometries. These differences can be quantified by evaluating the exact expressions derived and presented here.
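
    A Monte Carlo sketch of the exponentially growing line is given below; the boundary conditions (reflecting at x = 0, absorbing at the moving end x = L(t)) are an assumption for illustration, and uniform growth advects a particle at position x with velocity x L'(t)/L(t).

```python
import numpy as np

rng = np.random.default_rng(5)

D, L0, beta = 1.0, 10.0, 0.1     # diffusivity, initial length, growth rate
dt, T, N = 0.01, 50.0, 5000      # time step, horizon, number of walkers

x = rng.uniform(0.0, L0, N)
alive = np.ones(N, dtype=bool)
for k in range(int(T / dt)):
    L = L0 * np.exp(beta * (k + 1) * dt)       # current domain length
    idx = np.flatnonzero(alive)
    x[idx] += beta * x[idx] * dt + np.sqrt(2 * D * dt) * rng.normal(size=idx.size)
    x[idx] = np.abs(x[idx])                    # reflect at x = 0
    alive[idx] = x[idx] < L                    # absorb at the moving boundary
print("S(T) ~", alive.mean())                  # tends to a positive constant
```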

  16. Non-stationary random vibration analysis of a 3D train-bridge system using the probability density evolution method

    Science.gov (United States)

    Yu, Zhi-wu; Mao, Jian-feng; Guo, Feng-qi; Guo, Wei

    2016-03-01

    Rail irregularity is one of the main sources of train-bridge random vibration. A new random vibration theory for coupled train-bridge systems is proposed in this paper. First, the number theory method (NTM) with 2N-dimensional vectors for the stochastic harmonic function (SHF) of the rail irregularity power spectrum density was adopted to determine the representative points of spatial frequencies and phases to generate the random rail irregularity samples, and the non-stationary rail irregularity samples were modulated with a slowly varying function. Second, the probability density evolution method (PDEM) was employed to calculate the random dynamic vibration of the three-dimensional (3D) train-bridge system by a program compiled on the MATLAB® software platform. Finally, the Newmark-β integration method and the double edge difference method of total variation diminishing (TVD) format were adopted to obtain the mean value curve, the standard deviation curve and the time-history probability density information of the responses. A case study was presented in which the ICE-3 train travels on a three-span simply-supported high-speed railway bridge with excitation of random rail irregularity. The results showed that compared to the Monte Carlo simulation, the PDEM has higher computational efficiency for the same accuracy, i.e., an improvement by 1-2 orders of magnitude. Additionally, the influences of rail irregularity and train speed on the random vibration of the coupled train-bridge system were discussed.

  17. Survival after relapse in patients with endometrial cancer : results from a randomized trial

    NARCIS (Netherlands)

    Creutzberg, CL; van Putten, WLJ; Koper, PC; Lybeert, MLM; Jobsen, JJ; Warlam-Rodenhuis, CC; De Winter, KAJ; Lutgens, LCHW; van den Bergh, ACM; van der Steen-Banasik, E; Beerman, H; van Lent, M

    Objective. The aim of this study was to determine the rates of local control and survival after relapse in patients with stage I endometrial cancer treated in the multicenter randomized PORTEC trial. Methods. The PORTEC trial included 715 patients with stage I endometrial cancer, either grade 1 or 2

  18. SNRFCB: sub-network based random forest classifier for predicting chemotherapy benefit on survival for cancer treatment.

    Science.gov (United States)

    Shi, Mingguang; He, Jianmin

    2016-04-01

    Adjuvant chemotherapy (CTX) should be individualized to provide potential survival benefit and avoid potential harm to cancer patients. Our goal was to establish a computational approach for making personalized estimates of the survival benefit from adjuvant CTX. We developed a Sub-Network based Random Forest classifier for predicting Chemotherapy Benefit (SNRFCB) based on gene expression datasets of lung cancer. The SNRFCB approach was then validated in independent test cohorts for identifying chemotherapy responder cohorts and chemotherapy non-responder cohorts. SNRFCB involved the pre-selection of gene sub-network signatures based on mutations and on protein-protein interaction data, as well as the application of the random forest algorithm to gene expression datasets. Adjuvant CTX was significantly associated with prolonged overall survival of lung cancer patients in the chemotherapy responder group (P = 0.008), but it was not beneficial to patients in the chemotherapy non-responder group (P = 0.657). Adjuvant CTX was significantly associated with prolonged overall survival of lung cancer squamous cell carcinoma (SQCC) subtype patients in the chemotherapy responder cohorts (P = 0.024), but it was not beneficial to patients in the chemotherapy non-responder cohorts (P = 0.383). SNRFCB improved prediction performance as compared to the machine learning method support vector machine (SVM). To test the general applicability of the predictive model, we further applied the SNRFCB approach to human breast cancer datasets and also observed superior performance. SNRFCB could provide recurrence probability for individual patients and identify which patients may benefit from adjuvant CTX in clinical trials.

  19. Spectral shaping of a randomized PWM DC-DC converter using maximum entropy probability distributions

    CSIR Research Space (South Africa)

    Dove, Albert

    2017-01-01

    Full Text Available The spectral shaping of a randomized PWM DC-DC converter while maintaining constraints is investigated. A probability distribution whose aim is to ensure maximal harmonic spreading and yet maintain constraints is presented. The PDFs are determined from a direct application of the method of Maximum...

  20. Making Heads or Tails of Probability: An Experiment with Random Generators

    Science.gov (United States)

    Morsanyi, Kinga; Handley, Simon J.; Serpell, Sylvie

    2013-01-01

    Background: The equiprobability bias is a tendency for individuals to think of probabilistic events as "equiprobable" by nature, and to judge outcomes that occur with different probabilities as equally likely. The equiprobability bias has been repeatedly found to be related to formal education in statistics, and it is claimed to be based…

  1. A new formulation of the probability density function in random walk models for atmospheric dispersion

    DEFF Research Database (Denmark)

    Falk, Anne Katrine Vinther; Gryning, Sven-Erik

    1997-01-01

    In this model for atmospheric dispersion, particles are simulated by the Langevin equation, which is a stochastic differential equation. It uses the probability density function (PDF) of the vertical velocity fluctuations as input. The PDF is constructed as an expansion in Hermite polynomials...
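
    A PDF of this type can be sketched as a Gram-Charlier-style expansion in probabilists' Hermite polynomials with prescribed skewness and excess kurtosis (values below hypothetical; for large moments such truncated expansions can turn slightly negative in the tails).

```python
import numpy as np
from numpy.polynomial.hermite_e import hermeval
from scipy.stats import norm

# p(w) = phi(w) [1 + (g1/6) He3(w) + (g2/24) He4(w)]: a Gaussian weighted
# by a Hermite-polynomial series, matching prescribed higher moments.
g1, g2 = 0.4, 0.2                   # hypothetical skewness, excess kurtosis
c = np.zeros(5)
c[0], c[3], c[4] = 1.0, g1 / 6.0, g2 / 24.0
w = np.linspace(-4.0, 4.0, 401)
p = norm.pdf(w) * hermeval(w, c)
print(np.trapz(p, w))               # ~1: normalisation is preserved
```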

  2. On the regularity of the extinction probability of a branching process in varying and random environments

    International Nuclear Information System (INIS)

    Alili, Smail; Rugh, Hans Henrik

    2008-01-01

    We consider a supercritical branching process in a time-dependent environment ξ. We assume that the offspring distributions depend regularly (C^k or real-analytically) on real parameters λ. We show that the extinction probability q_λ(ξ), given the environment ξ, 'inherits' this regularity whenever the offspring distributions satisfy a condition of contraction type. Our proof makes use of the Poincaré metric on the complex unit disc and a real-analytic implicit function theorem.
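
    In the classical time-homogeneous special case, the extinction probability is the smallest fixed point of the offspring probability generating function, and its smooth dependence on a parameter λ is easy to observe numerically:

```python
import numpy as np

def extinction_prob(pgf, iters=500):
    """Smallest fixed point of the offspring PGF, via q <- f(q) from q = 0."""
    q = 0.0
    for _ in range(iters):
        q = pgf(q)
    return q

# Poisson(lam) offspring: f(s) = exp(lam (s - 1)); supercritical for lam > 1.
for lam in (1.2, 1.5, 2.0, 3.0):
    q = extinction_prob(lambda s: np.exp(lam * (s - 1.0)))
    print(lam, round(q, 6))   # q(lam) varies smoothly (indeed analytically)
```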

  3. Prediction of 90Y Radioembolization Outcome from Pretherapeutic Factors with Random Survival Forests.

    Science.gov (United States)

    Ingrisch, Michael; Schöppe, Franziska; Paprottka, Karolin; Fabritius, Matthias; Strobl, Frederik F; De Toni, Enrico N; Ilhan, Harun; Todica, Andrei; Michl, Marlies; Paprottka, Philipp Marius

    2018-05-01

    Our objective was to predict the outcome of 90Y radioembolization in patients with intrahepatic tumors from pretherapeutic baseline parameters and to identify predictive variables using a machine-learning approach based on random survival forests. Methods: In this retrospective study, 366 patients with primary (n = 92) or secondary (n = 274) liver tumors who had received 90Y radioembolization were analyzed. A random survival forest was trained to predict individual risk from baseline values of cholinesterase, bilirubin, type of primary tumor, age at radioembolization, hepatic tumor burden, presence of extrahepatic disease, and sex. The predictive importance of each baseline parameter was determined using the minimal-depth concept, and the partial dependency of predicted risk on the continuous variables bilirubin level and cholinesterase level was determined. Results: Median overall survival was 11.4 mo (95% confidence interval, 9.7-14.2 mo), with 228 deaths occurring during the observation period. The random-survival-forest analysis identified baseline cholinesterase and bilirubin as the most important variables (forest-averaged lowest minimal depth, 1.2 and 1.5, respectively), followed by the type of primary tumor (1.7), age (2.4), tumor burden (2.8), and presence of extrahepatic disease (3.5). Sex had the highest forest-averaged minimal depth (5.5), indicating little predictive value. Baseline bilirubin levels above 1.5 mg/dL were associated with a steep increase in predicted mortality. Similarly, cholinesterase levels below 7.5 U predicted a strong increase in mortality. The trained random survival forest achieved a concordance index of 0.657, with an SE of 0.02, comparable to the concordance index of 0.652 and SE of 0.02 for a previously published Cox proportional hazards model. Conclusion: Random survival forests are a simple and straightforward machine-learning approach for prediction of overall survival. The predictive performance of the trained model
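
    A minimal sketch of fitting such a model with scikit-survival (synthetic data; the feature columns are hypothetical stand-ins for the paper's baseline variables):

```python
import numpy as np
from sksurv.ensemble import RandomSurvivalForest
from sksurv.util import Surv

rng = np.random.default_rng(6)
n = 366
X = np.column_stack([
    rng.normal(7.5, 2.0, n),      # "cholinesterase", hypothetical scale
    rng.lognormal(0.0, 0.5, n),   # "bilirubin" (mg/dL), synthetic
    rng.integers(0, 2, n),        # "extrahepatic disease" yes/no
])
time = rng.exponential(12.0, n)           # survival time in months, synthetic
event = rng.random(n) < 0.6               # True if death was observed
y = Surv.from_arrays(event=event, time=time)

rsf = RandomSurvivalForest(n_estimators=500, min_samples_leaf=15,
                           random_state=0)
rsf.fit(X, y)
print(rsf.score(X, y))    # concordance index (the paper reports ~0.657 on CV)
```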

  4. Adjuvant Hormone Therapy May Improve Survival in Epithelial Ovarian Cancer: Results of the AHT Randomized Trial.

    Science.gov (United States)

    Eeles, Rosalind A; Morden, James P; Gore, Martin; Mansi, Janine; Glees, John; Wenczl, Miklos; Williams, Christopher; Kitchener, Henry; Osborne, Richard; Guthrie, David; Harper, Peter; Bliss, Judith M

    2015-12-10

    To assess the effects of adjuvant hormone therapy (AHT) on survival and disease outcome in women with epithelial ovarian cancer. Participants were premenopausal and postmenopausal women who had been diagnosed with epithelial ovarian cancer (any International Federation of Gynecology and Obstetrics stage) 9 or fewer months previously. Ineligible patients included those with deliberately preserved ovarian function, with a history of a hormone-dependent malignancy, or with any contraindications to hormone-replacement therapy. Patients were centrally randomly assigned in a 1:1 ratio to either AHT for 5 years after random assignment or no AHT (control). Main outcome measures were overall survival (OS), defined as time from random assignment to death (any cause), and relapse-free survival, defined as time from random assignment to relapse or death (any cause). Patients who continued, alive and relapse free, were censored at their last known follow-up. A total of 150 patients (n = 75, AHT; n = 75, control) were randomly assigned from 1990 to 1995 from 19 centers in the United Kingdom, Spain, and Hungary; all patients were included in intention-to-treat analyses. The median follow-up in alive patients is currently 19.1 years. Of the 75 patients with AHT, 53 (71%) have died compared with 68 (91%) of 75 patients in the control group. OS was significantly improved in patients who were receiving AHT (hazard ratio, 0.63; 95% CI, 0.44 to 0.90; P = .011). A similar effect was seen for relapse-free survival (hazard ratio, 0.67; 95% CI, 0.47 to 0.97; P = .032). Effects remained after adjustment for known prognostic factors. These results show that women who have severe menopausal symptoms after ovarian cancer treatment can safely take hormone-replacement therapy, and this may, in fact, confer benefits in terms of OS in addition to known advantages in terms of quality of life. © 2015 by American Society of Clinical Oncology.

  5. Formulas for Rational-Valued Separability Probabilities of Random Induced Generalized Two-Qubit States

    Directory of Open Access Journals (Sweden)

    Paul B. Slater

    2015-01-01

    Full Text Available Previously, a formula, incorporating a 5F4 hypergeometric function, for the Hilbert-Schmidt-averaged determinantal moments ⟨|ρ^{PT}|^n |ρ|^k⟩/⟨|ρ|^k⟩ of 4×4 density matrices (ρ) and their partial transposes (ρ^{PT}), was applied with k = 0 to the generalized two-qubit separability probability question. The formula can, furthermore, be viewed, as we note here, as an averaging over “induced measures in the space of mixed quantum states.” The associated induced-measure separability probabilities (k = 1, 2, …) are found—via a high-precision density approximation procedure—to assume interesting, relatively simple rational values in the two-re[al]bit (α = 1/2), (standard) two-qubit (α = 1), and two-quater[nionic]bit (α = 2) cases. We deduce rather simple companion (rebit, qubit, quaterbit, …) formulas that successfully reproduce the rational values assumed for general k. These formulas are observed to share certain features, possibly allowing them to be incorporated into a single master formula.

  6. The mean distance to the nth neighbour in a uniform distribution of random points: an application of probability theory

    International Nuclear Information System (INIS)

    Bhattacharyya, Pratip; Chakrabarti, Bikas K

    2008-01-01

    We study different ways of determining the mean distance ⟨r_n⟩ between a reference point and its nth neighbour among random points distributed with uniform density in a D-dimensional Euclidean space. First, we present a heuristic method; though this method provides only a crude mathematical result, it shows a simple way of estimating ⟨r_n⟩. Next, we describe two alternative means of deriving the exact expression of ⟨r_n⟩: we review the method using absolute probability and develop an alternative method using conditional probability. Finally, we obtain an approximation to ⟨r_n⟩ from the mean volume between the reference point and its nth neighbour and compare it with the heuristic and exact results.
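
    The exact expression is easy to check by simulation; for points of density ρ in D dimensions the standard result is ⟨r_n⟩ = Γ(n + 1/D) / [Γ(n) (ρ V_D)^{1/D}], with V_D the volume of the unit D-ball.

```python
import numpy as np
from scipy.special import gamma as G

rng = np.random.default_rng(7)
D, rho, n = 2, 1.0, 3          # dimension, point density, neighbour index

# Exact mean distance to the n-th neighbour in a uniform (Poisson) field.
V_D = np.pi ** (D / 2) / G(D / 2 + 1)
exact = G(n + 1 / D) / G(n) / (rho * V_D) ** (1 / D)

# Monte Carlo check: uniform points in a large box, reference at the centre.
L, trials, est = 20.0, 2000, []
for _ in range(trials):
    pts = rng.uniform(-L / 2, L / 2, size=(rng.poisson(rho * L**D), D))
    r = np.sort(np.linalg.norm(pts, axis=1))
    est.append(r[n - 1])
print(exact, np.mean(est))     # the two values should agree closely
```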

  7. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance, one just needs to think that only in rare cases is there a closed-form analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...
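
    The core difficulty is easy to reproduce: for subexponential summands such as the lognormal, P(S_n > x) ~ n P(X_1 > x) as x grows, yet crude Monte Carlo returns essentially no signal at thresholds of practical interest, which motivates sharper asymptotics and estimators.

```python
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(8)

# Tail probability P(S > x) for a sum of n i.i.d. standard lognormals.
# Subexponentiality gives the asymptotic P(S > x) ~ n * P(X1 > x),
# but a crude Monte Carlo estimate at large x is hopelessly noisy.
n, x, reps = 10, 200.0, 10**6
S = rng.lognormal(mean=0.0, sigma=1.0, size=(reps, n)).sum(axis=1)
crude = (S > x).mean()
asymptotic = n * lognorm.sf(x, s=1.0)
print(crude, asymptotic)   # crude is typically 0.0 or a single lucky hit
```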

  8. Vasopressin, steroids, and epinephrine and neurologically favorable survival after in-hospital cardiac arrest: a randomized clinical trial.

    Science.gov (United States)

    Mentzelopoulos, Spyros D; Malachias, Sotirios; Chamos, Christos; Konstantopoulos, Demetrios; Ntaidou, Theodora; Papastylianou, Androula; Kolliantzaki, Iosifinia; Theodoridi, Maria; Ischaki, Helen; Makris, Dimosthemis; Zakynthinos, Epaminondas; Zintzaras, Elias; Sourlas, Sotirios; Aloizos, Stavros; Zakynthinos, Spyros G

    2013-07-17

    Among patients with cardiac arrest, preliminary data have shown improved return of spontaneous circulation and survival to hospital discharge with the vasopressin-steroids-epinephrine (VSE) combination. To determine whether combined vasopressin-epinephrine during cardiopulmonary resuscitation (CPR) and corticosteroid supplementation during and after CPR improve survival to hospital discharge with a Cerebral Performance Category (CPC) score of 1 or 2 in vasopressor-requiring, in-hospital cardiac arrest. Randomized, double-blind, placebo-controlled, parallel-group trial performed from September 1, 2008, to October 1, 2010, in 3 Greek tertiary care centers (2400 beds) with 268 consecutive patients with cardiac arrest requiring epinephrine according to resuscitation guidelines (from 364 patients assessed for eligibility). Patients received either vasopressin (20 IU/CPR cycle) plus epinephrine (1 mg/CPR cycle; cycle duration approximately 3 minutes) (VSE group, n = 130) or saline placebo plus epinephrine (1 mg/CPR cycle; cycle duration approximately 3 minutes) (control group, n = 138) for the first 5 CPR cycles after randomization, followed by additional epinephrine if needed. During the first CPR cycle after randomization, patients in the VSE group received methylprednisolone (40 mg) and patients in the control group received saline placebo. Shock after resuscitation was treated with stress-dose hydrocortisone (300 mg daily for 7 days maximum and gradual taper) (VSE group, n = 76) or saline placebo (control group, n = 73). Return of spontaneous circulation (ROSC) for 20 minutes or longer and survival to hospital discharge with a CPC score of 1 or 2. Follow-up was completed in all resuscitated patients. Patients in the VSE group vs patients in the control group had higher probability for ROSC of 20 minutes or longer (109/130 [83.9%] vs 91/138 [65.9%]; odds ratio [OR], 2.98; 95% CI, 1.39-6.40; P = .005) and survival to hospital discharge with CPC

  9. Effect of drain current on appearance probability and amplitude of random telegraph noise in low-noise CMOS image sensors

    Science.gov (United States)

    Ichino, Shinya; Mawaki, Takezo; Teramoto, Akinobu; Kuroda, Rihito; Park, Hyeonwoo; Wakashima, Shunichi; Goto, Tetsuya; Suwa, Tomoyuki; Sugawa, Shigetoshi

    2018-04-01

    Random telegraph noise (RTN), which occurs in in-pixel source follower (SF) transistors, has become one of the most critical problems in high-sensitivity CMOS image sensors (CIS) because it is a limiting factor of dark random noise. In this paper, the behaviors of RTN toward changes in SF drain current conditions were analyzed using a low-noise array test circuit measurement system with a floor noise of 35 µV rms. In addition to statistical analysis by measuring a large number of transistors (18048 transistors), we also analyzed the behaviors of RTN parameters such as amplitude and time constants in the individual transistors. It is demonstrated that the appearance probability of RTN becomes small under a small drain current condition, although large-amplitude RTN tends to appear in a very small number of cells.

  10. On Generating Optimal Signal Probabilities for Random Tests: A Genetic Approach

    Directory of Open Access Journals (Sweden)

    M. Srinivas

    1996-01-01

    Full Text Available Genetic Algorithms are robust search and optimization techniques. A Genetic Algorithm based approach for determining the optimal input distributions for generating random test vectors is proposed in the paper. A cost function based on the COP testability measure for determining the efficacy of the input distributions is discussed. A brief overview of Genetic Algorithms (GAs) and the specific details of our implementation are described. Experimental results based on ISCAS-85 benchmark circuits are presented. The performance of our GA-based approach is compared with previous results. While the GA generates more efficient input distributions than the previous methods, which are based on gradient descent search, the overheads of the GA in computing the input distributions are larger.

  11. Frequency format diagram and probability chart for breast cancer risk communication: a prospective, randomized trial

    Directory of Open Access Journals (Sweden)

    Wahner-Roedler Dietlind

    2008-10-01

    Full Text Available Abstract Background Breast cancer risk education enables women to make informed decisions regarding their options for screening and risk reduction. We aimed to determine whether patient education regarding breast cancer risk using a bar graph, with or without a frequency format diagram, improved the accuracy of risk perception. Methods We conducted a prospective, randomized trial among women at increased risk for breast cancer. The main outcome measurement was patients' estimation of their breast cancer risk before and after education with a bar graph (BG group) or a bar graph plus a frequency format diagram (BG+FF group), which was assessed by previsit and postvisit questionnaires. Results Of 150 women in the study, 74 were assigned to the BG group and 76 to the BG+FF group. Overall, 72% of women overestimated their risk of breast cancer. The improvement in accuracy of risk perception from the previsit to the postvisit questionnaire (BG group, 19% to 61%; BG+FF group, 13% to 67%) was not significantly different between the 2 groups (P = .10). Among women who inaccurately perceived very high risk (≥ 50% risk), inaccurate risk perception decreased significantly in the BG+FF group (22% to 3%) compared with the BG group (28% to 19%) (P = .004). Conclusion Breast cancer risk communication using a bar graph plus a frequency format diagram can improve the short-term accuracy of risk perception among women perceiving inaccurately high risk.

  12. The Random Walk of Cars and Their Collision Probabilities with Planets

    Directory of Open Access Journals (Sweden)

    Hanno Rein

    2018-05-01

    Full Text Available On 6 February 2018, SpaceX launched a Tesla Roadster on a Mars-crossing orbit. We perform N-body simulations to determine the fate of the object over the next 15 Myr. The orbital evolution is initially dominated by close encounters with the Earth. While a precise orbit cannot be predicted beyond the next several centuries due to these repeated chaotic scatterings, one can reliably predict the long-term outcomes by statistically analyzing a large suite of possible trajectories with slightly perturbed initial conditions. Repeated gravitational scatterings with Earth lead to a random walk. Collisions with the Earth, Venus and the Sun represent primary sinks for the Roadster's orbital evolution. Collisions with Mercury and Mars, or ejections from the Solar System by Jupiter, are highly unlikely. We calculate a dynamical half-life of the Tesla of approximately 15 Myr, with some 22%, 12% and 12% of Roadster orbit realizations impacting the Earth, Venus, and the Sun within one half-life, respectively. Because the eccentricities and inclinations in our ensemble increase over time due to mean-motion and secular resonances, the impact rates with the terrestrial planets decrease beyond a few million years, whereas the impact rate on the Sun remains roughly constant.

  13. Convergence estimates in probability and in expectation for discrete least squares with noisy evaluations at random points

    KAUST Repository

    Migliorati, Giovanni

    2015-08-28

    We study the accuracy of the discrete least-squares approximation on a finite dimensional space of a real-valued target function from noisy pointwise evaluations at independent random points distributed according to a given sampling probability measure. The convergence estimates are given in mean-square sense with respect to the sampling measure. The noise may be correlated with the location of the evaluation and may have nonzero mean (offset). We consider both cases of bounded or square-integrable noise / offset. We prove conditions between the number of sampling points and the dimension of the underlying approximation space that ensure a stable and accurate approximation. Particular focus is on deriving estimates in probability within a given confidence level. We analyze how the best approximation error and the noise terms affect the convergence rate and the overall confidence level achieved by the convergence estimate. The proofs of our convergence estimates in probability use arguments from the theory of large deviations to bound the noise term. Finally we address the particular case of multivariate polynomial approximation spaces with any density in the beta family, including uniform and Chebyshev.
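
    A minimal sketch of the object being analyzed, under the simplest assumptions (uniform sampling measure on [-1, 1], a Legendre polynomial space, independent noise); stability requires the number of samples to be sufficiently large relative to the space dimension.

```python
import numpy as np

rng = np.random.default_rng(9)

# Discrete least squares on a polynomial space from noisy evaluations
# at i.i.d. points drawn from the (uniform) sampling measure.
f = lambda x: np.exp(x)                  # target function
m, deg = 200, 5                          # samples, polynomial degree
x = rng.uniform(-1.0, 1.0, m)
y = f(x) + 0.01 * rng.normal(size=m)     # noisy pointwise evaluations

V = np.polynomial.legendre.legvander(x, deg)     # design matrix
coef, *_ = np.linalg.lstsq(V, y, rcond=None)

# Root-mean-square error with respect to the sampling measure.
xx = np.linspace(-1.0, 1.0, 2001)
err = np.polynomial.legendre.legval(xx, coef) - f(xx)
print(np.sqrt(np.trapz(err**2, xx) / 2.0))
```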

  14. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.
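
    For the multinomial logit special case, the CPGF is G(v) = log Σ_i exp(v_i), and its gradient returns the familiar logit choice probabilities; a finite-difference check makes the correspondence concrete.

```python
import numpy as np
from scipy.special import logsumexp, softmax

# Multinomial logit as a CPGF: G(v) = log(sum_i exp(v_i)).
# The choice probabilities are the gradient of G, i.e. the softmax.
v = np.array([1.0, 0.5, -0.2])     # utilities of three alternatives
p = softmax(v)                     # analytic gradient of G

eps = 1e-6                         # central finite differences
num = np.array([(logsumexp(v + eps * e) - logsumexp(v - eps * e)) / (2 * eps)
                for e in np.eye(len(v))])
print(p, num)                      # agree to roughly 1e-10
```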

  15. How Long Do the Dead Survive on the Road? Carcass Persistence Probability and Implications for Road-Kill Monitoring Surveys

    OpenAIRE

    Santos, Sara; Carvalho, Filipe; Mira, António

    2011-01-01

    Background: Road mortality is probably the best-known and visible impact of roads upon wildlife. Although several factors influence road-kill counts, carcass persistence time is considered the most important determinant underlying underestimates of road mortality. The present study aims to describe and model carcass persistence variability on the road for different taxonomic groups under different environmental conditions throughout the year; and also to assess the eff...

  16. Risk Prediction of One-Year Mortality in Patients with Cardiac Arrhythmias Using Random Survival Forest

    Directory of Open Access Journals (Sweden)

    Fen Miao

    2015-01-01

    Full Text Available Existing models for predicting mortality based on the traditional Cox proportional hazards approach (CPH) often have low prediction accuracy. This paper aims to develop a clinical risk model with good accuracy for predicting 1-year mortality in cardiac arrhythmia patients using random survival forest (RSF), a robust approach for survival analysis. 10,488 cardiac arrhythmia patients available in the public MIMIC II clinical database were investigated, with 3,452 deaths occurring within 1-year follow-up. Forty risk factors including demographics, clinical and laboratory information, and antiarrhythmic agents were analyzed as potential predictors of all-cause mortality. RSF was adopted to build a comprehensive survival model and a simplified risk model composed of the 14 top risk factors. The comprehensive model achieved a prediction accuracy of 0.81 measured by c-statistic with 10-fold cross validation. The simplified risk model also achieved a good accuracy of 0.799. Both results outperformed traditional CPH (which achieved c-statistics of 0.733 for the comprehensive model and 0.718 for the simplified model). Moreover, various factors are observed to have nonlinear impact on cardiac arrhythmia prognosis. As a result, the RSF-based model, which takes nonlinearity into account, significantly outperformed the traditional Cox proportional hazards model and has great potential to be a more effective approach for survival analysis.

  17. Survival probability of Baltic larval cod in relation to spatial overlap patterns with their prey obtained from drift model studies

    DEFF Research Database (Denmark)

    Hinrichsen, H.H.; Schmidt, J.O.; Petereit, C.

    2005-01-01

    Temporal mismatch between the occurrence of larvae and their prey potentially affects the spatial overlap and thus the contact rates between predator and prey. This might have important consequences for growth and survival. We performed a case study investigating the influence of circulation on predator-prey overlap, dependent on the hatching time of cod larvae. By performing model runs for the years 1979-1998, we investigated the intra- and interannual variability of potential spatial overlap between predator and prey. Assuming uniform prey distributions, we generally found the overlap to have decreased since

  18. Survival modeling for the estimation of transition probabilities in model-based economic evaluations in the absence of individual patient data: a tutorial.

    Science.gov (United States)

    Diaby, Vakaramoko; Adunlin, Georges; Montero, Alberto J

    2014-02-01

    Survival modeling techniques are increasingly being used as part of decision modeling for health economic evaluations. As many models are available, it is imperative for interested readers to know about the steps in selecting and using the most suitable ones. The objective of this paper is to propose a tutorial for the application of appropriate survival modeling techniques to estimate transition probabilities, for use in model-based economic evaluations, in the absence of individual patient data (IPD). An illustration of the use of the tutorial is provided based on the final progression-free survival (PFS) analysis of the BOLERO-2 trial in metastatic breast cancer (mBC). An algorithm was adopted from Guyot and colleagues, and was then run in the statistical package R to reconstruct IPD, based on the final PFS analysis of the BOLERO-2 trial. It should be emphasized that the reconstructed IPD represent an approximation of the original data. Afterwards, we fitted parametric models to the reconstructed IPD in the statistical package Stata. Both statistical and graphical tests were conducted to verify the relative and absolute validity of the findings. Finally, the equations for transition probabilities were derived using the general equation for transition probabilities used in model-based economic evaluations, and the parameters were estimated from fitted distributions. The results of the application of the tutorial suggest that the log-logistic model best fits the reconstructed data from the latest published Kaplan-Meier (KM) curves of the BOLERO-2 trial. Results from the regression analyses were confirmed graphically. An equation for transition probabilities was obtained for each arm of the BOLERO-2 trial. In this paper, a tutorial was proposed and used to estimate the transition probabilities for model-based economic evaluation, based on the results of the final PFS analysis of the BOLERO-2 trial in mBC. The results of our study can serve as a basis for any model
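
    A standard form of the general equation referred to converts a fitted survival curve S(t) into a per-cycle transition probability, tp(t) = 1 - S(t)/S(t - u) for a cycle of length u. A Weibull sketch with hypothetical parameters (the paper's best fit for the reconstructed BOLERO-2 data was log-logistic):

```python
import numpy as np

# Time-dependent transition probability for a cycle of length u from a
# parametric survival curve: tp(t) = 1 - S(t) / S(t - u).
# Weibull S(t) = exp(-lam * t**gam); parameters are hypothetical.
lam, gam, u = 0.05, 1.3, 1.0          # scale, shape, cycle length (months)
S = lambda t: np.exp(-lam * t**gam)

t = np.arange(1.0, 25.0)              # end of each monthly model cycle
tp = 1.0 - S(t) / S(t - u)
print(np.round(tp, 4))                # increases with t for gam > 1
```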

  19. Precise lim sup behavior of probabilities of large deviations for sums of i.i.d. random variables

    Directory of Open Access Journals (Sweden)

    Andrew Rosalsky

    2004-12-01

    Full Text Available Let {X, X_n; n ≥ 1} be a sequence of real-valued i.i.d. random variables and let S_n = ∑_{i=1}^n X_i, n ≥ 1. In this paper, we study the probabilities of large deviations of the form P(S_n > t n^{1/p}), P(S_n < −t n^{1/p}), and P(|S_n| > t n^{1/p}), where t > 0 and 0 < p < 2. It is shown, in particular, that if P(|X| > x^{1/p})/ϕ(x) → 1 as x → ∞ for a function ϕ that is regularly varying with index α, then for every t > 0, lim sup_{n→∞} P(|S_n| > t n^{1/p})/(n ϕ(n)) = t^{pα}.

  20. On Randomness and Probability

    Indian Academy of Sciences (India)

    An axiomatic development of such a model is given below. It is also shown ... teacher needs to decide which students deserve to be promoted to the next class - it is not ... whether an unborn child would be a boy or a girl, the total number of births in a ..... that the outcome of the previous trials has no influence on the next trial.

  1. Inference for binomial probability based on dependent Bernoulli random variables with applications to meta‐analysis and group level studies

    Science.gov (United States)

    Bakbergenuly, Ilyas; Morgenthaler, Stephan

    2016-01-01

    We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group-level studies or in meta-analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log-odds and arcsine transformations of the estimated probability p̂, both for single-group studies and in combining results from several groups or studies in meta-analysis. Our simulations confirm that these biases are linear in ρ, for small values of ρ, the intracluster correlation coefficient. These biases do not depend on the sample sizes or the number of studies K in a meta-analysis and result in abysmal coverage of the combined effect for large K. We also propose bias-correction for the arcsine transformation. Our simulations demonstrate that this bias-correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta-analyses of prevalence. PMID:27192062

  2. Inference for binomial probability based on dependent Bernoulli random variables with applications to meta-analysis and group level studies.

    Science.gov (United States)

    Bakbergenuly, Ilyas; Kulinskaya, Elena; Morgenthaler, Stephan

    2016-07-01

    We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group-level studies or in meta-analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log-odds and arcsine transformations of the estimated probability p̂, both for single-group studies and in combining results from several groups or studies in meta-analysis. Our simulations confirm that these biases are linear in ρ, for small values of ρ, the intracluster correlation coefficient. These biases do not depend on the sample sizes or the number of studies K in a meta-analysis and result in abysmal coverage of the combined effect for large K. We also propose bias-correction for the arcsine transformation. Our simulations demonstrate that this bias-correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta-analyses of prevalence. © 2016 The Authors. Biometrical Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.
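
    The bias can be reproduced in a few lines: simulate beta-binomial counts with intracluster correlation ρ (parameters hypothetical) and compare the mean arcsine-transformed estimate with the transform of the true p.

```python
import numpy as np

rng = np.random.default_rng(10)

# Beta-binomial counts with mean p and intracluster correlation rho:
# draw p_i ~ Beta(a, b) with a + b = (1 - rho)/rho, then x_i ~ Bin(n, p_i).
p, n, rho, reps = 0.2, 40, 0.1, 200_000
a = p * (1 - rho) / rho
b = (1 - p) * (1 - rho) / rho
x = rng.binomial(n, rng.beta(a, b, size=reps))

bias = np.arcsin(np.sqrt(x / n)).mean() - np.arcsin(np.sqrt(p))
print(bias)   # roughly linear in rho for small rho, as the paper reports
```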

  3. Continuous time random walk model with asymptotical probability density of waiting times via inverse Mittag-Leffler function

    Science.gov (United States)

    Liang, Yingjie; Chen, Wen

    2018-04-01

    The mean squared displacement (MSD) of traditional ultraslow diffusion is a logarithmic function of time. Recently, the continuous time random walk model has been employed to characterize these ultraslow diffusion dynamics by connecting the heavy-tailed logarithmic function and its variation as the asymptotical waiting time density. In this study we investigate the limiting waiting time density of a general ultraslow diffusion model via the inverse Mittag-Leffler function, whose special case includes the traditional logarithmic ultraslow diffusion model. The MSD of the general ultraslow diffusion model is analytically derived as an inverse Mittag-Leffler function and is observed to increase even more slowly than that of the logarithmic function model. Very long waiting times occur with the largest probability in the case of the inverse Mittag-Leffler function, compared with the power law model and the logarithmic function model. Monte Carlo simulations of the one-dimensional sample path of a single particle are also performed. The results show that the inverse Mittag-Leffler waiting time density is effective in depicting general ultraslow random motion.

  4. WE-H-BRA-08: A Monte Carlo Cell Nucleus Model for Assessing Cell Survival Probability Based On Particle Track Structure Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, B [Northwestern Memorial Hospital, Chicago, IL (United States); Georgia Institute of Technology, Atlanta, GA (United States)]; Wang, C [Georgia Institute of Technology, Atlanta, GA (United States)]

    2016-06-15

    Purpose: To correlate the damage produced by particles of different types and qualities to cell survival on the basis of nanodosimetric analysis and advanced DNA structures in the cell nucleus. Methods: A Monte Carlo code was developed to simulate subnuclear DNA chromatin fibers (CFs) of 30 nm utilizing a mean-free-path approach common to radiation transport. The cell nucleus was modeled as a spherical region containing 6000 chromatin-dense domains (CDs) of 400 nm diameter, with additional CFs modeled in a sparser interchromatin region. The Geant4-DNA code was utilized to produce a particle track database representing various particles at different energies and dose quantities. These tracks were used to stochastically position the DNA structures based on their mean free path to interaction with CFs. Excitation and ionization events intersecting CFs were analyzed using the DBSCAN clustering algorithm to assess the likelihood of producing DSBs. Simulated DSBs were then assessed based on their proximity to one another for a probability of inducing cell death. Results: Variations in energy deposition to chromatin fibers match expectations based on differences in particle track structure. The quality of damage to CFs indicates more severe damage by high-LET radiation than by low-LET radiation for identical particle types. In addition, the model indicates more severe damage by protons than by alpha particles of the same LET, which is consistent with differences in their track structure. Cell survival curves have been produced showing the L-Q behavior of sparsely ionizing radiation. Conclusion: Initial results indicate the feasibility of producing cell survival curves based on the Monte Carlo cell nucleus method. Accurate correlation between simulated DNA damage and cell survival on the basis of nanodosimetric analysis can provide insight into the biological responses to various radiation types. Current efforts are directed at producing cell
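
    A toy version of the clustering step might look like the following (synthetic event coordinates and illustrative DBSCAN parameters, not the authors' values):

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(1)

# Hypothetical energy-deposition event coordinates (nm) along a straight
# particle track, plus sparse stray (delta-ray-like) events off to the side.
z = np.sort(rng.uniform(0, 1000, 300))
track = np.column_stack([rng.normal(0, 2, 300), rng.normal(0, 2, 300), z])
stray = rng.uniform(-50, 50, size=(60, 3)) + [0, 0, 500]
events = np.vstack([track, stray])

# Group events into damage clusters; eps (nm) and min_samples are
# illustrative choices only.
labels = DBSCAN(eps=4.0, min_samples=3).fit_predict(events)

sizes = np.bincount(labels[labels >= 0])
print(f"{sizes.size} clusters; {np.sum(sizes >= 5)} dense enough to score as DSB candidates")
```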

  5. Predicting treatment effect from surrogate endpoints and historical trials: an extrapolation involving probabilities of a binary outcome or survival to a specific time.

    Science.gov (United States)

    Baker, Stuart G; Sargent, Daniel J; Buyse, Marc; Burzykowski, Tomasz

    2012-03-01

    Using multiple historical trials with surrogate and true endpoints, we consider various models to predict the effect of treatment on a true endpoint in a target trial in which only a surrogate endpoint is observed. This predicted result is computed using (1) a prediction model (mixture, linear, or principal stratification) estimated from historical trials and the surrogate endpoint of the target trial and (2) a random extrapolation error estimated from successively leaving out each trial among the historical trials. The method applies to either binary outcomes or survival to a particular time that is computed from censored survival data. We compute a 95% confidence interval for the predicted result and validate its coverage using simulation. To summarize the additional uncertainty from using a predicted instead of true result for the estimated treatment effect, we compute its multiplier of standard error. Software is available for download. © 2011, The International Biometric Society No claim to original US government works.
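
    The leave-one-trial-out idea can be sketched in a few lines (hypothetical effect estimates and a simple linear prediction model, not the mixture or principal-stratification models of the paper):

```python
import numpy as np

# Hypothetical treatment effects from K historical trials:
# column 0 = effect on surrogate endpoint, column 1 = effect on true endpoint.
hist = np.array([[0.10, 0.06], [0.22, 0.15], [0.05, 0.01],
                 [0.18, 0.12], [0.30, 0.20], [0.12, 0.09]])

def predict(train, s_new):
    """Predict the true-endpoint effect from the surrogate effect."""
    slope, intercept = np.polyfit(train[:, 0], train[:, 1], 1)
    return intercept + slope * s_new

# Leave-one-trial-out extrapolation errors (observed minus predicted).
errors = [
    hist[k, 1] - predict(np.delete(hist, k, axis=0), hist[k, 0])
    for k in range(len(hist))
]
print("extrapolation SD:", np.std(errors, ddof=1))
```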

  6. Fatigue Reliability under Random Loads

    DEFF Research Database (Denmark)

    Talreja, R.

    1979-01-01

    We consider the problem of estimating the probability of survival (non-failure) and the probability of safe operation (strength greater than a limiting value) of structures subjected to random loads. These probabilities are formulated in terms of the probability distributions of the loads...... propagation stage. The consequences of this behaviour on the fatigue reliability are discussed....

  7. Long-term survival in laparoscopic vs open resection for colorectal liver metastases: inverse probability of treatment weighting using propensity scores.

    Science.gov (United States)

    Lewin, Joel W; O'Rourke, Nicholas A; Chiow, Adrian K H; Bryant, Richard; Martin, Ian; Nathanson, Leslie K; Cavallucci, David J

    2016-02-01

    This study compares long-term outcomes between intention-to-treat laparoscopic and open approaches to colorectal liver metastases (CLM), using inverse probability of treatment weighting (IPTW) based on propensity scores to control for selection bias. Patients undergoing liver resection for CLM by 5 surgeons at 3 institutions from 2000 to early 2014 were analysed. IPTW weights based on propensity scores were generated and used to assess the marginal treatment effect of the laparoscopic approach via a weighted Cox proportional hazards model. A total of 298 operations were performed in 256 patients. Seven patients with planned two-stage resections were excluded, leaving 284 operations in 249 patients for analysis. After IPTW, the population was well balanced. With a median follow-up of 36 months, 5-year overall survival (OS) and recurrence-free survival (RFS) for the cohort were 59% and 38%. A total of 146 laparoscopic procedures were performed in 140 patients, with weighted 5-year OS and RFS of 54% and 36%, respectively. In the open group, 138 procedures were performed in 122 patients, with weighted 5-year OS and RFS of 63% and 38%, respectively. There was no significant difference between the two groups in terms of OS or RFS. In the Brisbane experience, after accounting for bias in treatment assignment, long-term survival after laparoscopic liver resection for CLM is equivalent to outcomes in open surgery. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
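
    A minimal sketch of an IPTW-weighted Cox analysis on synthetic data (column names and numbers are invented; assumes scikit-learn and the lifelines package, whose CoxPHFitter accepts a weights column):

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 300

# Synthetic cohort: one confounder, treatment, survival time, event flag.
df = pd.DataFrame({"size_cm": rng.normal(3, 1, n)})
p_treat = 1 / (1 + np.exp(-(0.8 - 0.3 * df["size_cm"])))
df["laparoscopic"] = rng.binomial(1, p_treat)
df["time"] = rng.exponential(36 / (1 + 0.2 * df["size_cm"]))
df["event"] = rng.binomial(1, 0.6, n)

# Stabilized inverse-probability-of-treatment weights from a propensity model.
ps = LogisticRegression().fit(df[["size_cm"]], df["laparoscopic"]).predict_proba(df[["size_cm"]])[:, 1]
pt = df["laparoscopic"].mean()
df["iptw"] = np.where(df["laparoscopic"] == 1, pt / ps, (1 - pt) / (1 - ps))

# Weighted Cox model for the marginal treatment effect; robust variance
# accounts for the weighting.
cph = CoxPHFitter()
cph.fit(df[["time", "event", "laparoscopic", "iptw"]],
        duration_col="time", event_col="event",
        weights_col="iptw", robust=True)
print(cph.summary[["exp(coef)", "p"]])
```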

  8. First estimates of the probability of survival in a small-bodied, high-elevation frog (Boreal Chorus Frog, Pseudacris maculata), or how historical data can be useful

    Science.gov (United States)

    Muths, Erin L.; Scherer, R. D.; Amburgey, S. M.; Matthews, T.; Spencer, A. W.; Corn, P.S.

    2016-01-01

    In an era of shrinking budgets yet increasing demands for conservation, the value of existing (i.e., historical) data is elevated. Lengthy time series on common, or previously common, species are particularly valuable and may be available only through the use of historical information. We provide first estimates of the probability of survival and longevity (0.67–0.79 and 5–7 years, respectively) for a subalpine population of a small-bodied, ostensibly common amphibian, the Boreal Chorus Frog (Pseudacris maculata (Agassiz, 1850)), using historical data and contemporary, hypothesis-driven information-theoretic analyses. We also test a priori hypotheses about the effects of color morph (as suggested by early reports) and of drought (as suggested by recent climate predictions) on survival. Using robust mark–recapture models, we find some support for early hypotheses regarding the effect of color on survival, but we find no effect of drought. The congruence between early findings and our analyses highlights the usefulness of historical information in providing raw data for contemporary analyses and context for conservation and management decisions.

  9. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  10. Fracture strength and probability of survival of narrow and extra-narrow dental implants after fatigue testing: In vitro and in silico analysis.

    Science.gov (United States)

    Bordin, Dimorvan; Bergamo, Edmara T P; Fardin, Vinicius P; Coelho, Paulo G; Bonfante, Estevam A

    2017-07-01

    To assess the probability of survival (reliability) and failure modes of narrow implants with different diameters. For fatigue testing, 42 implants with the same macrogeometry and internal conical connection were divided according to diameter into narrow (Ø3.3×10 mm) and extra-narrow (Ø2.9×10 mm) groups (21 per group). Identical abutments were torqued to the implants, and standardized maxillary incisor crowns were cemented and subjected to step-stress accelerated life testing (SSALT) in water. The use-level probability Weibull curves and the reliability for a mission of 50,000 and 100,000 cycles at 50, 100, 150, and 180 N were calculated. For the finite element analysis (FEA), two virtual models simulating the samples tested in fatigue were constructed. Loads of 50 and 100 N were applied 30° off-axis at the crown. The von Mises stress was calculated for implant and abutment. The beta (β) values were 0.67 for narrow and 1.32 for extra-narrow implants, indicating that failure rates did not increase with fatigue in the former, but more likely were associated with damage accumulation and wear-out failures in the latter. Both groups showed high reliability (up to 97.5%) at 50 and 100 N. A decreased reliability was observed for both groups at 150 and 180 N (ranging from 0 to 82.3%), but no significant difference was observed between groups. Failure predominantly involved abutment fracture in both groups. In the FEA at a 50-N load, the Ø3.3 mm implant showed higher von Mises stress in the abutment (7.75%) and implant (2%) than the Ø2.9 mm implant. There was no significant difference between narrow and extra-narrow implants regarding probability of survival. The failure mode was similar for both groups, restricted to abutment fracture. Copyright © 2017 Elsevier Ltd. All rights reserved.
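
    The reliability-for-a-mission calculation is a one-liner under a Weibull model; the sketch below uses the β values quoted in the abstract but an assumed characteristic life η, since the fitted scale parameters are not given:

```python
import numpy as np

def weibull_reliability(cycles, beta, eta):
    """Probability of surviving the mission: R(t) = exp(-(t/eta)**beta)."""
    return np.exp(-(cycles / eta) ** beta)

# Hypothetical characteristic lives (cycles); beta values from the abstract.
# beta < 1 suggests early/defect-driven failures, beta > 1 suggests wear-out.
for label, beta, eta in [("narrow (beta=0.67)", 0.67, 5e6),
                         ("extra-narrow (beta=1.32)", 1.32, 5e6)]:
    print(label, weibull_reliability(100_000, beta, eta))
```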

  11. The adverse effect of selective cyclooxygenase-2 inhibitor on random skin flap survival in rats.

    Directory of Open Access Journals (Sweden)

    Haiyong Ren

    Full Text Available BACKGROUND: Cyclooxygenase-2 (COX-2) inhibitors provide desired analgesic effects after injury or surgery, but evidence suggests they also attenuate wound healing. This study investigates the effect of a COX-2 inhibitor on random skin flap survival. METHODS: The McFarlane flap model was established in 40 rats evaluated in two groups; each group received the same volume of parecoxib or saline injection for 7 days. The necrotic area of the flap was measured, and specimens of the flap were stained with haematoxylin-eosin (HE) for histologic analysis. Immunohistochemical staining was performed to analyse the levels of VEGF and COX-2. RESULTS: 7 days after operation, the flap necrotic area ratio in the study group (66.65 ± 2.81%) was significantly larger than that of the control group (48.81 ± 2.33%) (P < 0.01). Histological analysis demonstrated angiogenesis, with mean vessel density per mm² being lower in the study group (15.4 ± 4.4) than in the control group (27.2 ± 4.1) (P < 0.05). The expression of COX-2 and VEGF protein in intermediate area II was evaluated in the two groups by immunohistochemistry. The expression of COX-2 was 1022.45 ± 153.1 in the study group and 2638.05 ± 132.2 in the control group (P < 0.01). The expression of VEGF was 2779.45 ± 472.0 in the study group versus 4938.05 ± 123.6 in the control group (P < 0.01). In the COX-2 inhibitor group, the expression of COX-2 and VEGF protein was markedly down-regulated compared with the control group. CONCLUSION: The selective COX-2 inhibitor had an adverse effect on random skin flap survival. Suppression of neovascularization induced by a low level of VEGF is the proposed biological mechanism.

  12. Effect of botulinum toxin A and nitroglycerin on random skin flap survival in rats.

    Science.gov (United States)

    Ghanbarzadeh, Kourosh; Tabatabaie, Omid Reza; Salehifar, Ebrahim; Amanlou, Massoud; Khorasani, Ghasemali

    2016-01-01

    A suitable pharmacological substitute for the well-established surgical delay technique for random skin flaps to increase viability has been elusive. To evaluate the effects of nitroglycerin and botulinum toxin type A on random flap survival in a rat model. The present controlled experimental study was performed in four groups of rats. One week after intervention in each group, the flap was raised and kept in situ, and flap necrosis was evaluated through follow-up. Group 1 received intradermal botulinum toxin type A (BTX-A) and topical nitroglycerin 2%; group 2 received BTX-A and topical Vaseline (Unilever, USA); group 3 received topical nitroglycerin and intradermal normal saline; and group 4 received topical Vaseline and intradermal normal saline. BTX-A reduced the area of necrosis compared with control (24% versus 56% respectively; P<0.001). Nitroglycerin application was associated with a trend toward improved flap viability (42% versus 56%; P=0.059). The combination of topical nitroglycerin and BTX-A, compared with Vaseline and BTX-A, was associated with decreased flap necrosis (16.1% versus 24%, respectively), although it was not statistically significant (P=0.45). BTX-A was effective in reducing distal flap necrosis. The effect of BTX-A was significantly more pronounced than nitroglycerin ointment.

  13. Association between expression of random gene sets and survival is evident in multiple cancer types and may be explained by sub-classification

    Science.gov (United States)

    2018-01-01

    One of the goals of cancer research is to identify a set of genes that cause or control disease progression. However, although multiple such gene sets were published, these are usually in very poor agreement with each other, and very few of the genes proved to be functional therapeutic targets. Furthermore, recent findings from a breast cancer gene-expression cohort showed that sets of genes selected randomly can be used to predict survival with a much higher probability than expected. These results imply that many of the genes identified in breast cancer gene expression analysis may not be causal of cancer progression, even though they can still be highly predictive of prognosis. We performed a similar analysis on all the cancer types available in the cancer genome atlas (TCGA), namely, estimating the predictive power of random gene sets for survival. Our work shows that most cancer types exhibit the property that random selections of genes are more predictive of survival than expected. In contrast to previous work, this property is not removed by using a proliferation signature, which implies that proliferation may not always be the confounder that drives this property. We suggest one possible solution in the form of data-driven sub-classification to reduce this property significantly. Our results suggest that the predictive power of random gene sets may be used to identify the existence of sub-classes in the data, and thus may allow better understanding of patient stratification. Furthermore, by reducing the observed bias this may allow more direct identification of biologically relevant, and potentially causal, genes. PMID:29470520
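
    The random-signature experiment is easy to reproduce in outline (fully synthetic data with no survival signal, so roughly 5% of random signatures come out "significant"; in the cohorts discussed above the observed fraction is much higher). Assumes the lifelines package:

```python
import numpy as np
from lifelines.statistics import logrank_test

rng = np.random.default_rng(3)
n_pat, n_genes = 200, 5000

# Synthetic expression matrix and survival data (no real signal).
expr = rng.normal(size=(n_pat, n_genes))
time = rng.exponential(60, n_pat)
event = rng.binomial(1, 0.7, n_pat)

def random_signature_p(k=50):
    """Score patients by a random gene set, median-split, log-rank test."""
    genes = rng.choice(n_genes, size=k, replace=False)
    score = expr[:, genes].mean(axis=1)
    high = score > np.median(score)
    return logrank_test(time[high], time[~high],
                        event[high], event[~high]).p_value

pvals = np.array([random_signature_p() for _ in range(200)])
print("fraction of 'significant' random signatures:", np.mean(pvals < 0.05))
```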

  14. Survival analysis with competing risks: estimating failure probability

    Directory of Open Access Journals (Sweden)

    Javier Llorca

    2004-10-01

    Full Text Available Objective: To show the impact of competing risks of death on survival analysis. Method: We provide an example of survival time without chronic rejection after heart transplantation, where death before rejection acts as a competing risk. Using a computer simulation, we compare the Kaplan-Meier estimator and the multiple decrement model. Results: The Kaplan-Meier method overestimated the probability of rejection. Next, we illustrate the use of the multiple decrement model to analyze secondary end points (in our example, death after rejection). Finally, we discuss the assumptions of the Kaplan-Meier method and the reasons why it cannot be applied in the presence of competing risks. Conclusions: Survival analysis should be adjusted for competing risks of death to avoid the overestimation of the risk of rejection produced by the Kaplan-Meier method.
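
    A small numerical illustration of the overestimation on synthetic competing-risks data (assumes lifelines, which provides both Kaplan-Meier and Aalen-Johansen estimators):

```python
import numpy as np
from lifelines import KaplanMeierFitter, AalenJohansenFitter

rng = np.random.default_rng(5)
n = 1000

# Synthetic competing-risks data: cause 1 = rejection, 2 = death before
# rejection, 0 = censored.
t_rej, t_death, t_cens = (rng.exponential(s, n) for s in (10, 8, 15))
time = np.minimum.reduce([t_rej, t_death, t_cens])
cause = np.select([t_rej == time, t_death == time], [1, 2], default=0)

# Naive 1 - KM: treats competing death as ordinary censoring.
km = KaplanMeierFitter().fit(time, cause == 1)

# Aalen-Johansen cumulative incidence accounts for the competing risk.
aj = AalenJohansenFitter().fit(time, cause, event_of_interest=1)

print("1-KM at t=10:", 1 - km.survival_function_at_times(10).iloc[0])
print("AJ   at t=10:", aj.cumulative_density_.loc[:10].iloc[-1, 0])  # smaller
```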

  15. Effect of Electroacupuncture at The Zusanli Point (Stomach-36) on Dorsal Random Pattern Skin Flap Survival in a Rat Model.

    Science.gov (United States)

    Wang, Li-Ren; Cai, Le-Yi; Lin, Ding-Sheng; Cao, Bin; Li, Zhi-Jie

    2017-10-01

    Random skin flaps are commonly used for wound repair and reconstruction. Electroacupuncture at The Zusanli point could enhance microcirculation and blood perfusion in random skin flaps. To determine whether electroacupuncture at The Zusanli point can improve the survival of random skin flaps in a rat model. Thirty-six male Sprague Dawley rats were randomly divided into 3 groups: control group (no electroacupuncture), Group A (electroacupuncture at a nonacupoint near The Zusanli point), and Group B (electroacupuncture at The Zusanli point). McFarlane flaps were established. On postoperative Day 2, malondialdehyde (MDA) and superoxide dismutase were detected. The flap survival rate was evaluated, inflammation was examined in hematoxylin and eosin-stained slices, and the expression of vascular endothelial growth factor (VEGF) was measured immunohistochemically on Day 7. The mean survival area of the flaps in Group B was significantly larger than that in the control group and Group A. Superoxide dismutase activity and VEGF expression level were significantly higher in Group B than those in the control group and Group A, whereas MDA and inflammation levels in Group B were significantly lower than those in the other 2 groups. Electroacupuncture at The Zusanli point can effectively improve the random flap survival.

  16. Probability calculus of fractional order and fractional Taylor's series application to Fokker-Planck equation and information of non-random functions

    International Nuclear Information System (INIS)

    Jumarie, Guy

    2009-01-01

    A probability distribution of fractional (or fractal) order is defined by the measure μ(dx) = p(x)(dx)^α, 0 < α < 1. Combining this definition with the fractional Taylor series f(x+h) = E_α(h^α D_x^α)f(x) provided by the modified Riemann-Liouville derivative, one can expand a probability calculus parallel to the standard one. A Fourier transform of fractional order using the Mittag-Leffler function is introduced, together with its inversion formula; and it provides a suitable generalization of the characteristic function of fractal random variables. It appears that the state moments of fractional order are more especially relevant. The main properties of this fractional probability calculus are outlined; it is shown that it provides a sound approach to Fokker-Planck equations which are fractional in both space and time, and it provides new results in the information theory of non-random functions.
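
    For reference, the one-parameter Mittag-Leffler function that drives the fractional Fourier transform can be evaluated by truncating its defining series (a naive sketch, adequate only for small |z|):

```python
import math

def mittag_leffler(z, alpha, n_terms=80):
    """Truncated series E_alpha(z) = sum_k z**k / Gamma(alpha*k + 1);
    fine for small |z|, not a production algorithm."""
    return sum(z**k / math.gamma(alpha * k + 1) for k in range(n_terms))

print(mittag_leffler(1.0, 1.0))   # E_1(z) = exp(z), so ~2.71828
print(mittag_leffler(-2.0, 0.5))  # E_{1/2}(-2), related to erfc
```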

  17. Digital simulation of two-dimensional random fields with arbitrary power spectra and non-Gaussian probability distribution functions

    DEFF Research Database (Denmark)

    Yura, Harold; Hanson, Steen Grüner

    2012-01-01

    with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative...
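
    A compact sketch of the two-step recipe: spectral shaping of white Gaussian noise, followed by a probability integral transform to the target marginal (the spectrum and target distribution below are arbitrary choices, not those of the paper):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
n = 256

# Step 1: colored Gaussian field -- filter white noise with an assumed
# power-law spectral shape.
kx = np.fft.fftfreq(n)[:, None]
ky = np.fft.fftfreq(n)[None, :]
k = np.sqrt(kx**2 + ky**2)
k[0, 0] = 1.0                       # avoid division by zero at DC
amplitude = k ** -1.5               # illustrative spectrum
noise = np.fft.fft2(rng.normal(size=(n, n)))
gauss = np.real(np.fft.ifft2(noise * amplitude))

# Step 2: map the Gaussian marginal onto the target distribution
# (here exponential) through the probability integral transform.
u = stats.norm.cdf((gauss - gauss.mean()) / gauss.std())
field = stats.expon.ppf(u)          # non-Gaussian field, correlation approx. kept
```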

  18. The design and analysis of salmonid tagging studies in the Columbia basin. Volume 8: A new model for estimating survival probabilities and residualization from a release-recapture study of fall chinook salmon (Oncorhynchus tschawytscha) smolts in the Snake River

    International Nuclear Information System (INIS)

    Lowther, A.B.; Skalski, J.

    1997-09-01

    Standard release-recapture analyses using Cormack-Jolly-Seber (CJS) models to estimate survival probabilities between hydroelectric facilities for Snake River fall chinook salmon (Oncorhynchus tschawytscha) ignore the possibility of individual fish residualizing and completing their migration in the year following tagging. These models do not utilize available capture history data from this second year and thus produce negatively biased estimates of survival probabilities. A new multinomial likelihood model was developed that results in biologically relevant, unbiased estimates of survival probabilities using the full two years of capture history data. This model was applied to 1995 Snake River fall chinook hatchery releases to estimate the true survival probability from one of three upstream release points (Asotin, Billy Creek, and Pittsburgh Landing) to Lower Granite Dam. In the data analyzed here, residualization is not a common physiological response, and thus the use of CJS models did not produce results appreciably different from the true survival probability obtained using the new multinomial likelihood model.

  19. Survival in Malnourished Older Patients Receiving Post-Discharge Nutritional Support; Long-Term Results of a Randomized Controlled Trial.

    Science.gov (United States)

    Neelemaat, F; van Keeken, S; Langius, J A E; de van der Schueren, M A E; Thijs, A; Bosmans, J E

    2017-01-01

    Previous analyses have shown that a post-discharge individualized nutritional intervention had positive effects on body weight, lean body mass, functional limitations and fall incidents in malnourished older patients. However, the impact of this intervention on survival has not yet been studied. The objective of this randomized controlled study was to examine the effect of a post-discharge individualized nutritional intervention on survival in malnourished older patients. Malnourished older patients, aged ≥ 60 years, were randomized during hospitalization to a three-month post-discharge nutritional intervention group (protein and energy enriched diet, oral nutritional supplements, vitamin D3/calcium supplement and telephone counseling by a dietitian) or to a usual care regimen (control group). Survival data were collected 4 years after enrollment. Survival analyses were performed using intention-to-treat analysis by Log-rank tests and Cox regression adjusted for confounders. The study population consisted of 94 men (45%) and 116 women with a mean age of 74.5 (SD 9.5) years. There were no statistically significant differences in baseline characteristics. Survival data were available in 208 out of 210 patients. After 1 and 4 years of follow-up, survival rates were respectively 66% and 29% in the intervention group (n=104) and 73% and 30% in the control group (n=104). There were no statistically significant differences in survival between the two groups 1 year (HR= 0.933, 95% CI=0.675-1.289) and 4 years after enrollment (HR=0.928, 95% CI=0.671-1.283). The current study failed to show an effect of a three-month post-discharge multi-component nutritional intervention in malnourished older patients on long-term survival, despite the positive effects on short-term outcome such as functional limitations and falls.

  20. Use of Systemic Rosmarinus Officinalis to Enhance the Survival of Random-Pattern Skin Flaps

    Directory of Open Access Journals (Sweden)

    Bilsev İnce

    2016-12-01

    Full Text Available Background: Skin flaps are commonly used in soft-tissue reconstruction; however, necrosis can be a frequent complication. Several systemic and local agents have been used in attempts to improve skin flap survival, but none that can prevent flap necrosis have been identified. Aims: This study aims to determine whether the use of systemic Rosmarinus officinalis (R. officinalis) extract can prevent flap necrosis and improve skin flap recovery. Study Design: Animal experimentation. Methods: Thirty-five Wistar albino rats were divided into five groups. A rectangular random-pattern flap measuring 8×2 cm was elevated from the back of each rat. Group I was the control group. In Group II, 0.2 mL of R. officinalis oil was given orally 2 h before surgery; R. officinalis oil was then applied orally twice a day for a week. In Group III, R. officinalis oil was given orally twice a day for one week before surgery. At the end of the week, 0.2 mL of R. officinalis oil was given orally 2 h before surgery. In Group IV, 0.2 mL of R. officinalis oil was injected subcutaneously 2 h before surgery. After the surgery, 0.2 mL of R. officinalis oil was injected subcutaneously twice a day for one week. In Group V, 0.2 mL of R. officinalis oil was injected subcutaneously twice a day for one week prior to surgery. At the end of the week, one last 0.2 mL R. officinalis oil injection was administered subcutaneously 2 h before surgery. After the surgery, 0.2 mL of R. officinalis oil was injected subcutaneously twice a day for one week. Results: The mean percentage of viable surface area was significantly greater (p<0.05) in Groups II, III, IV, and V as compared to Group I. Mean vessel diameter was significantly greater (p<0.05) in Groups II, III, IV, and V as compared to Group I. Conclusion: We have determined that, in addition to its anti-inflammatory and anti-oxidant effects, R. officinalis has vasodilatory effects that contribute to increased skin flap survival.

  1. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, more. For advanced undergraduate students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  2. Ten-Year Survival Results of a Randomized Trial of Irradiation of Internal Mammary Nodes After Mastectomy

    International Nuclear Information System (INIS)

    Hennequin, Christophe; Bossard, Nadine; Servagi-Vernat, Stéphanie; Maingon, Philippe; Dubois, Jean-Bernard; Datchary, Jean; Carrie, Christian; Roullet, Bernard; Suchaud, Jean-Philippe; Teissier, Eric; Lucardi, Audrey; Gerard, Jean-Pierre; Belot, Aurélien

    2013-01-01

    Purpose: To evaluate the efficacy of irradiation of internal mammary nodes (IMN) on 10-year overall survival in breast cancer patients after mastectomy. Methods and Patients: This multicenter phase 3 study enrolled patients with positive axillary nodes (pN+) or central/medial tumors with or without pN+. Other inclusion criteria were age <75 and a Karnofsky index ≥70. All patients received postoperative irradiation of the chest wall and supraclavicular nodes and were randomly assigned to receive IMN irradiation or not. Randomization was stratified by tumor location (medial/central or lateral), axillary lymph node status, and adjuvant therapy (chemotherapy vs no chemotherapy). The prescribed dose of irradiation to the target volumes was 50 Gy or equivalent. The first 5 intercostal spaces were included in the IMN target volume, and two-thirds of the dose (31.5 Gy) was given by electrons. The primary outcome was overall survival at 10 years. Disease-free survival and toxicity were secondary outcomes. Results: A total of 1334 patients were analyzed after a median follow-up of 11.3 years among the survivors. No benefit of IMN irradiation on the overall survival could be demonstrated: the 10-year overall survival was 59.3% in the IMN-nonirradiated group versus 62.6% in the IMN-irradiated group (P=.8). According to stratification factors, we defined 6 subgroups (medial/central or lateral tumor, pN0 [only for medial/central] or pN+, and chemotherapy or not). In all these subgroups, IMN irradiation did not significantly improve overall survival. Conclusions: In patients treated with 2-dimensional techniques, we failed to demonstrate a survival benefit for IMN irradiation. This study cannot rule out a moderate benefit, especially with more modern, conformal techniques applied to a higher risk population.

  3. Effects of Benazepril on Survival of Dogs with Chronic Kidney Disease: A Multicenter, Randomized, Blinded, Placebo-Controlled Clinical Trial.

    Science.gov (United States)

    King, J N; Font, A; Rousselot, J-F; Ash, R A; Bonfanti, U; Brovida, C; Crowe, I D; Lanore, D; Pechereau, D; Seewald, W; Strehlau, G

    2017-07-01

    Chronic kidney disease (CKD) is an important cause of morbidity and mortality in dogs. To evaluate the efficacy in prolonging survival, and the safety, of benazepril administration to dogs with CKD. Forty-nine client-owned dogs with CKD. Dogs were randomized to benazepril (0.25 to <0.5 mg/kg) or placebo once daily. No significant benefit of benazepril versus placebo was detected for renal survival time in all dogs; median (95% confidence interval (CI)) survival times were 305 (53-575) days in the benazepril group and 287 (152-not available) in the placebo group (P = .53). Renal survival times were not significantly longer with benazepril compared to placebo for subgroups: hazard ratios (95% CI) were 0.50 (0.21-1.22) with P = .12 for initial urine protein-to-creatinine ratio (UPC) >0.5, and 0.38 (0.12-1.19) with P = .080 for initial UPC >0.5 plus plasma creatinine ≤440 μmol/L. Proteinuria, assessed from the UPC, was significantly (P = .0032) lower after treatment with benazepril compared to placebo. There were no significant differences between groups for clinical signs or frequencies of adverse events. Benazepril significantly reduced proteinuria in dogs with CKD. Insufficient numbers of dogs were recruited to allow conclusions on survival time. Copyright © 2017 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.

  4. Probability problems in seismic risk analysis and load combinations for nuclear power plants

    International Nuclear Information System (INIS)

    George, L.L.

    1983-01-01

    This workshop describes some probability problems in power plant reliability and maintenance analysis. The problems are seismic risk analysis, loss of load probability, load combinations, and load sharing. The seismic risk problem is to compute power plant reliability given an earthquake and the resulting risk. Component survival occurs if its peak random response to the earthquake does not exceed its strength. Power plant survival is a complicated Boolean function of component failures and survivals. The responses and strengths of components are dependent random processes, and the peak responses are maxima of random processes. The resulting risk is the expected cost of power plant failure
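
    The core load-versus-strength computation can be illustrated by simple Monte Carlo with correlated lognormal response and strength (all numbers invented):

```python
import numpy as np

rng = np.random.default_rng(11)
n = 1_000_000

# Peak response and strength as correlated lognormals (illustrative values:
# medians 2.0 and 3.5, with a small positive log-scale covariance).
mu = np.array([np.log(2.0), np.log(3.5)])
cov = np.array([[0.25, 0.05],
                [0.05, 0.16]])
resp, strength = np.exp(rng.multivariate_normal(mu, cov, n)).T

p_fail = np.mean(resp > strength)   # component failure probability
print(f"P(peak response exceeds strength) ~ {p_fail:.4f}")
```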

  5. Statistical modelling of survival data with random effects h-likelihood approach

    CERN Document Server

    Ha, Il Do; Lee, Youngjo

    2017-01-01

    This book provides a groundbreaking introduction to the likelihood inference for correlated survival data via the hierarchical (or h-) likelihood in order to obtain the (marginal) likelihood and to address the computational difficulties in inferences and extensions. The approach presented in the book overcomes shortcomings in the traditional likelihood-based methods for clustered survival data such as intractable integration. The text includes technical materials such as derivations and proofs in each chapter, as well as recently developed software programs in R (“frailtyHL”), while the real-world data examples together with an R package, “frailtyHL” in CRAN, provide readers with useful hands-on tools. Reviewing new developments since the introduction of the h-likelihood to survival analysis (methods for interval estimation of the individual frailty and for variable selection of the fixed effects in the general class of frailty models) and guiding future directions, the book is of interest to research...

  6. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments. Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables. The Law of Large Numbers; Conditional Expectation; Generating Functions. Branching Processes. Random Walk Revisited; Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples. Probability Distributions of Markov Chains; The First Step Analysis. Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case. Simulation; Distribution F...

  7. Prestack inversion based on anisotropic Markov random field-maximum posterior probability inversion and its application to identify shale gas sweet spots

    Science.gov (United States)

    Wang, Kang-Ning; Sun, Zan-Dong; Dong, Ning

    2015-12-01

    Economic shale gas production requires hydraulic fracture stimulation to increase the formation permeability. Hydraulic fracturing strongly depends on geomechanical parameters such as Young's modulus and Poisson's ratio. Fracture-prone sweet spots can be predicted by prestack inversion, which is an ill-posed problem; thus, regularization is needed to obtain unique and stable solutions. To characterize gas-bearing shale sedimentary bodies, elastic parameter variations are regarded as an anisotropic Markov random field. Bayesian statistics are adopted to transform prestack inversion into a maximum posterior probability problem. Two energy functions for the lateral and vertical directions are used to describe the distribution, and the expectation-maximization algorithm is used to estimate the hyperparameters of the prior probability of elastic parameters. Finally, the inversion yields clear geological boundaries, high vertical resolution, and reasonable lateral continuity using the conjugate gradient method to minimize the objective function. The noise robustness and imaging ability of the method were tested using synthetic and real data.

  8. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables. The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  9. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    Foreword; Preface; Sets, Events, and Probability; The Algebra of Sets; The Bernoulli Sample Space; The Algebra of Multisets; The Concept of Probability; Properties of Probability Measures; Independent Events; The Bernoulli Process; The R Language; Finite Processes; The Basic Models; Counting Rules; Computing Factorials; The Second Rule of Counting; Computing Probabilities; Discrete Random Variables; The Bernoulli Process: Tossing a Coin; The Bernoulli Process: Random Walk; Independence and Joint Distributions; Expectations; The Inclusion-Exclusion Principle; General Random Variable

  10. Isoflurane Preconditioning Increases Survival of Rat Skin Random-Pattern Flaps by Induction of HIF-1α Expression

    Directory of Open Access Journals (Sweden)

    Yu Sun

    2013-04-01

    Full Text Available Background: Survival of random-pattern skin flaps is important for the success of plastic and reconstructive surgeries. This study investigates isoflurane-induced protection against ischemia of skin flaps and the underlying molecular mechanism in this process. Methods: Human umbilical vein endothelial cells (HUVECs) and human skin fibroblast cells were exposed to isoflurane for 4 h. Expression of hypoxia inducible factor-1α (HIF-1α), heme oxygenase-1 (HO-1) and vascular endothelial growth factor (VEGF) was analyzed up to 24 h post isoflurane exposure using qRT-PCR and western blot, or ELISA analyses. The PI3K inhibitors LY 294002 and wortmannin, the mTOR inhibitor rapamycin, and the GSK3β inhibitor SB 216763 were used, respectively, to assess the effects of isoflurane treatment on HIF-1α expression. Furthermore, 40 rats were randomly divided into 5 groups (control, isoflurane, scrambled siRNA plus isoflurane, HIF-1α siRNA plus isoflurane, and DMOG) and subjected to random-pattern skin flap operation. Rats were prepared for evaluation of flap survival and full-field laser perfusion imaging (FLPI) (at day 7) and microvessel density evaluation (at day 10). Results: Isoflurane exposure induced expression of HIF-1α protein, and of HO-1 and VEGF mRNA and proteins, in a time-dependent manner. Both LY 294002 and wortmannin inhibited phospho-Akt, phospho-mTOR, phospho-GSK 3β and HIF-1α expression after isoflurane exposure. Both wortmannin and rapamycin inhibited isoflurane-induced phospho-4E-BP1 (Ser 65), phospho-P70s6k (Thr 389) and HIF-1α expression. SB 216763 pre-treatment further enhanced isoflurane-induced expression of phospho-GSK 3β (Ser 9) and HIF-1α protein compared to the isoflurane-alone cells. In animal experiments, the isoflurane alone, scrambled siRNA plus isoflurane, and DMOG groups had significantly upregulated vascularity and increased survival of the skin flaps compared to the controls. However, HIF-1α knockdown abrogated the protective effect of

  11. Survival probability of larval sprat in response to decadal changes in diel vertical migration behavior and prey abundance in the Baltic Sea

    DEFF Research Database (Denmark)

    Hinrichsen, Hans-Harald; Peck, Myron A.; Schmidt, Jörn

    2010-01-01

    distribution and climate-driven abiotic and biotic environmental factors including variability in the abundance of different, key prey species (calanoid copepods) as well as seasonal changes, long-term trends, and spatial differences in water temperature. Climate forcing affected Baltic sprat larval survival......, larvae were predicted to experience optimal conditions to ensure higher survival throughout the later larval and early juvenile stages. However, this behavioral shift also increased the susceptibility of larvae to unfavorable wind-driven surface currents, contributing to the marked increase in interannual...

  12. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  13. Enriched enteral nutrition may improve short-term survival in stage IV gastric cancer patients: A randomized, controlled trial.

    Science.gov (United States)

    Klek, Stanislaw; Scislo, Lucyna; Walewska, Elzbieta; Choruz, Ryszard; Galas, Aleksander

    2017-04-01

    The aim of the study was to determine whether the postoperative use of enteral nutrition enriched with arginine, glutamine, and omega-3 fatty acids influences survival in patients diagnosed with stomach cancer. For the purpose of this study, a second wave of the trial performed in 2003 to 2009 was carried out. Ninety-nine patients who underwent surgery for gastric cancer (27 F, 72 M, mean age: 62.9 y) met the inclusion criteria. Of those, 54 were randomized to standard and 45 to enriched enteral nutrition (EEN). In all patients, short- and long-term (5 y) survival was analyzed. Analysis of the overall survival time did not reveal differences between groups (P = 0.663). Until the end of the third month, however, there were nine deaths in the standard enteral nutrition group and no deaths in the EEN group (16.7% versus 0.0%, P = 0.004). The univariate analyses suggested that the EEN group may have lower risk, especially during the first year after intervention. A significant reduction in the risk of death was seen during the early period after surgery (first 6 mo) in the EEN group in stage IV patients (hazard ratio = 0.25, P = 0.049). The use of the enriched enteral diet did not, however, influence the risk of death when all patients were analyzed together. The study does not support a beneficial effect of enriched enteral nutrition on long-term survival; however, the positive impact on stage IV patients suggests the need for further, more detailed studies. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. The Norwegian dietary guidelines and colorectal cancer survival (CRC-NORDIET) study: a food-based multicentre randomized controlled trial.

    Science.gov (United States)

    Henriksen, Hege Berg; Ræder, Hanna; Bøhn, Siv Kjølsrud; Paur, Ingvild; Kværner, Ane Sørlie; Billington, Siv Åshild; Eriksen, Morten Tandberg; Wiedsvang, Gro; Erlund, Iris; Færden, Arne; Veierød, Marit Bragelien; Zucknick, Manuela; Smeland, Sigbjørn; Blomhoff, Rune

    2017-01-30

    Colorectal cancer survivors are not only at risk for recurrent disease but also at increased risk of comorbidities such as other cancers, cardiovascular disease, diabetes, hypertension and functional decline. In this trial, we aim to investigate whether a diet in accordance with the Norwegian food-based dietary guidelines, focused on dampening inflammation and oxidative stress, will improve long-term disease outcomes and survival in colorectal cancer patients. This paper presents the study protocol of the Norwegian Dietary Guidelines and Colorectal Cancer Survival study. Men and women aged 50-80 years diagnosed with primary invasive colorectal cancer (Stage I-III) are invited to this randomized controlled, parallel two-arm trial 2-9 months after curative surgery. The intervention group (n = 250) receives an intensive dietary intervention lasting for 12 months and a subsequent maintenance intervention for 14 years. The control group (n = 250) receives no dietary intervention other than standard clinical care. Both groups are offered the same general advice on physical activity. Patients are followed up at 6 months and 1, 3, 5, 7, 10 and 15 years after baseline. The study center is located at the Department of Nutrition, University of Oslo, and patients are recruited from two hospitals within the South-Eastern Norway Regional Health Authority. Primary outcomes are disease-free survival and overall survival. Secondary outcomes are time to recurrence, cardiovascular disease-free survival, compliance with the dietary recommendations and the effects of the intervention on new comorbidities, intermediate biomarkers, nutrition status, physical activity, physical function and quality of life. The current study is designed to gain a better understanding of the role of a healthy diet aimed at dampening inflammation and oxidative stress on long-term disease outcomes and survival in colorectal cancer patients. Since previous research on the role of diet for

  15. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  16. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications....
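
    For the multinomial logit special case, the CPGF is the log-sum-exp function, and differentiating it recovers the familiar softmax choice probabilities; a quick numerical check:

```python
import numpy as np

def cpgf(u):
    """Log-sum-exp: the choice-probability generating function of the MNL model."""
    m = u.max()
    return m + np.log(np.exp(u - m).sum())

def choice_probs(u, eps=1e-6):
    """Numerical gradient of the CPGF recovers the choice probabilities."""
    grad = np.empty_like(u)
    for i in range(len(u)):
        d = np.zeros_like(u)
        d[i] = eps
        grad[i] = (cpgf(u + d) - cpgf(u - d)) / (2 * eps)
    return grad

u = np.array([1.0, 0.5, -0.2])
print(choice_probs(u))                 # matches softmax(u)
print(np.exp(u) / np.exp(u).sum())
```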

  17. Effect of Haloperidol on Survival Among Critically Ill Adults With a High Risk of Delirium: The REDUCE Randomized Clinical Trial.

    Science.gov (United States)

    van den Boogaard, Mark; Slooter, Arjen J C; Brüggemann, Roger J M; Schoonhoven, Lisette; Beishuizen, Albertus; Vermeijden, J Wytze; Pretorius, Danie; de Koning, Jan; Simons, Koen S; Dennesen, Paul J W; Van der Voort, Peter H J; Houterman, Saskia; van der Hoeven, J G; Pickkers, Peter; van der Woude, Margaretha C. E.; Besselink, Anna; Hofstra, Lieuwe S; Spronk, Peter E; van den Bergh, Walter; Donker, Dirk W; Fuchs, Malaika; Karakus, Attila; Koeman, M; van Duijnhoven, Mirella; Hannink, Gerjon

    2018-02-20

    Results of studies on use of prophylactic haloperidol in critically ill adults are inconclusive, especially in patients at high risk of delirium. To determine whether prophylactic use of haloperidol improves survival among critically ill adults at high risk of delirium, which was defined as an anticipated intensive care unit (ICU) stay of at least 2 days. Randomized, double-blind, placebo-controlled investigator-driven study involving 1789 critically ill adults treated at 21 ICUs, at which nonpharmacological interventions for delirium prevention are routinely used in the Netherlands. Patients without delirium whose expected ICU stay was at least a day were included. Recruitment was from July 2013 to December 2016 and follow-up was conducted at 90 days with the final follow-up on March 1, 2017. Patients received prophylactic treatment intravenously 3 times daily: either 1 mg (n = 350) or 2 mg (n = 732) of haloperidol, or placebo (n = 707) consisting of 0.9% sodium chloride. The primary outcome was the number of days that patients survived in 28 days. There were 15 secondary outcomes, including delirium incidence, 28-day delirium-free and coma-free days, duration of mechanical ventilation, and ICU and hospital length of stay. All 1789 randomized patients (mean age, 66.6 years [SD, 12.6]; 1099 men [61.4%]) completed the study. The 1-mg haloperidol group was prematurely stopped because of futility. There was no difference in the median days patients survived in 28 days, 28 days in the 2-mg haloperidol group vs 28 days in the placebo group, for a difference of 0 days (95% CI, 0-0; P = .93) and a hazard ratio of 1.003 (95% CI, 0.78-1.30, P=.82). None of the 15 secondary outcomes differed significantly. These included delirium incidence (mean difference, 1.5%, 95% CI, -3.6% to 6.7%), delirium-free and coma-free days (mean difference, 0 days, 95% CI, 0-0 days), and duration of mechanical ventilation, ICU, and hospital length of stay (mean difference

  18. [Effects and related mechanism of bivalirudin on the survival of random skin flap on the back of rat].

    Science.gov (United States)

    Cai, L Y; Wang, T; Lin, D S; Lu, D

    2017-04-20

    Objective: To investigate the effects and related mechanism of bivalirudin on the survival of random skin flaps on the back of rats. Methods: Thirty SD rats were divided into a bivalirudin group and a normal saline group according to a random number table, with 15 rats in each group. A random flap model measuring 9 cm×3 cm was established on the back of the rats in both groups. Immediately post injury, rats in the bivalirudin group were intraperitoneally injected with 5 mg/mL bivalirudin (0.8 mL/kg), while rats in the normal saline group were intraperitoneally injected with normal saline (0.8 mL/kg) once a day; the injections continued for 7 days. The flap was divided evenly into distal, middle, and proximal areas based on the flap blood supply. On post injury day (PID) 1, 3, and 7, the overall survival of each area of the flap was assessed by visual inspection. On PID 7, the survival rate of the flap was calculated; the morphology of skin tissue at the center of the three flap areas was then observed by HE staining, the microvessel density (MVD) of the middle area of the flap was calculated, and the expression of vascular endothelial growth factor (VEGF) in the middle area of the flap was detected with immunohistochemical staining. Data were analyzed with the t test. Results: (1) On PID 1, flaps in both groups showed varying degrees of swelling, mainly concentrated in the distal area, but no obvious necrosis. The middle and proximal areas of the flaps in both groups survived. On PID 3, necrosis of the flaps in both groups was concentrated in the middle area, while the proximal area remained viable and most of the distal area was necrotic with some scabbing. On PID 7, the necrosis of the middle area in both groups had gradually coalesced, and the surviving flap area in the bivalirudin group was larger than that in the normal saline group. The distal area of the flap was almost necrotic, and the proximal area of flap was

  19. Meta-analysis of single-arm survival studies: a distribution-free approach for estimating summary survival curves with random effects.

    Science.gov (United States)

    Combescure, Christophe; Foucher, Yohann; Jackson, Daniel

    2014-07-10

    In epidemiologic studies and clinical trials with time-dependent outcome (for instance death or disease progression), survival curves are used to describe the risk of the event over time. In meta-analyses of studies reporting a survival curve, the most informative finding is a summary survival curve. In this paper, we propose a method to obtain a distribution-free summary survival curve by expanding the product-limit estimator of survival for aggregated survival data. The extension of DerSimonian and Laird's methodology for multiple outcomes is applied to account for the between-study heterogeneity. The I² and H² statistics are used to quantify the impact of the heterogeneity in the published survival curves. A statistical test for between-strata comparison is proposed, with the aim to explore study-level factors potentially associated with survival. The performance of the proposed approach is evaluated in a simulation study. Our approach is also applied to synthesize the survival of untreated patients with hepatocellular carcinoma from aggregate data of 27 studies and synthesize the graft survival of kidney transplant recipients from individual data from six hospitals. Copyright © 2014 John Wiley & Sons, Ltd.
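
    A bare-bones version of the pooling step at a single timepoint, using DerSimonian-Laird random effects on the complementary log-log scale (hypothetical study inputs; the paper works with whole curves rather than one timepoint):

```python
import numpy as np

# Hypothetical 5-year survival proportions and SEs from aggregated curves.
s = np.array([0.42, 0.55, 0.38, 0.61, 0.47])
se_s = np.array([0.05, 0.04, 0.06, 0.05, 0.07])

# Work on the complementary log-log scale, theta = log(-log S(t)).
theta = np.log(-np.log(s))
var = (se_s / (s * np.log(s))) ** 2            # delta method

# DerSimonian-Laird between-study variance tau^2.
w = 1 / var
q = np.sum(w * (theta - np.sum(w * theta) / w.sum()) ** 2)
tau2 = max(0.0, (q - (len(s) - 1)) / (w.sum() - np.sum(w**2) / w.sum()))

w_star = 1 / (var + tau2)
theta_pool = np.sum(w_star * theta) / w_star.sum()
print("pooled 5-year survival:", np.exp(-np.exp(theta_pool)))
```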

  20. Random Linear Network Coding is Key to Data Survival in Highly Dynamic Distributed Storage

    DEFF Research Database (Denmark)

    Sipos, Marton A.; Fitzek, Frank; Roetter, Daniel Enrique Lucani

    2015-01-01

    Distributed storage solutions have become widespread due to their ability to store large amounts of data reliably across a network of unreliable nodes, by employing repair mechanisms to prevent data loss. Conventional systems rely on static designs with a central control entity to oversee...... and control the repair process. Given the large costs for maintaining and cooling large data centers, our work proposes and studies the feasibility of a fully decentralized system that can store data even on unreliable and, sometimes, unavailable mobile devices. This imposes new challenges on the design...... as the number of available nodes varies greatly over time and keeping track of the system's state becomes unfeasible. As a consequence, conventional erasure correction approaches are ill-suited for maintaining data integrity. In this highly dynamic context, random linear network coding (RLNC) provides...
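
    The decodability argument behind RLNC reduces to a rank condition on the random coefficient matrix. A toy sketch over GF(2), where addition is XOR (real systems typically work over GF(2^8)):

```python
import numpy as np

rng = np.random.default_rng(13)
K = 8                                   # number of original data blocks
data = rng.integers(0, 2, size=(K, 32), dtype=np.uint8)   # K blocks of 32 bits

def encode(n_packets):
    """Each packet is a random linear combination of the K blocks over GF(2)."""
    coeffs = rng.integers(0, 2, size=(n_packets, K), dtype=np.uint8)
    payloads = (coeffs @ data) % 2
    return coeffs, payloads

def gf2_rank(m):
    """Rank over GF(2) via Gaussian elimination with XOR row operations."""
    m = m.copy() % 2
    rank = 0
    for col in range(m.shape[1]):
        pivot = np.flatnonzero(m[rank:, col])
        if pivot.size == 0:
            continue
        r = rank + pivot[0]
        m[[rank, r]] = m[[r, rank]]     # move pivot row into place
        mask = (m[:, col] == 1) & (np.arange(len(m)) != rank)
        m[mask] ^= m[rank]              # clear the column elsewhere
        rank += 1
        if rank == m.shape[0]:
            break
    return rank

coeffs, _ = encode(10)                  # collect a few extra packets
print("decodable:", gf2_rank(coeffs) == K)   # full rank -> data recoverable
```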

  1. Positive random variables with a discrete probability mass at the origin: Parameter estimation for left-censored samples with application to air quality monitoring data

    International Nuclear Information System (INIS)

    Gogolak, C.V.

    1986-11-01

    The concentration of a contaminant measured in a particular medium might be distributed as a positive random variable when it is present, but it may not always be present. If there is a level below which the concentration cannot be distinguished from zero by the analytical apparatus, a sample from such a population will be censored on the left. The presence of both zeros and positive values in the censored portion of such samples complicates the problem of estimating the parameters of the underlying positive random variable and the probability of a zero observation. Using the method of maximum likelihood, it is shown that the solution to this estimation problem reduces largely to that of estimating the parameters of the distribution truncated at the point of censorship. The maximum likelihood estimate of the proportion of zero values follows directly. The derivation of the maximum likelihood estimates for a lognormal population with zeros is given in detail, and the asymptotic properties of the estimates are examined. The estimation method was used to fit several different distributions to a set of severely censored 85Kr monitoring data from six locations at the Savannah River Plant chemical separations facilities
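
    The delta-lognormal likelihood described above is straightforward to maximize numerically; a sketch with synthetic data (the parameter values and detection limit are invented):

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(17)

# Synthetic data: a spike at zero plus lognormal concentrations,
# left-censored at a detection limit L.
p0_true, mu, sigma, L = 0.2, 0.0, 1.0, 0.5
x = np.where(rng.random(2000) < p0_true, 0.0, rng.lognormal(mu, sigma, 2000))
observed = x[x >= L]
n_cens = np.sum(x < L)     # zeros and values below L are indistinguishable

def neg_log_lik(theta):
    p0, m, s = theta
    if not (0 < p0 < 1 and s > 0):
        return np.inf
    # A censored observation is either a true zero or a lognormal value < L.
    p_cens = p0 + (1 - p0) * stats.lognorm.cdf(L, s, scale=np.exp(m))
    ll = n_cens * np.log(p_cens)
    ll += np.sum(np.log(1 - p0) + stats.lognorm.logpdf(observed, s, scale=np.exp(m)))
    return -ll

res = optimize.minimize(neg_log_lik, x0=[0.3, 0.1, 0.8], method="Nelder-Mead")
print("estimated (p0, mu, sigma):", res.x)
```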

  2. Physical Activity Improves Verbal and Spatial Memory in Older Adults with Probable Mild Cognitive Impairment: A 6-Month Randomized Controlled Trial

    Directory of Open Access Journals (Sweden)

    Lindsay S. Nagamatsu

    2013-01-01

    Full Text Available We report secondary findings from a randomized controlled trial on the effects of exercise on memory in older adults with probable MCI. We randomized 86 women aged 70–80 years with subjective memory complaints into one of three groups: resistance training, aerobic training, or balance and tone (control). All participants exercised twice per week for six months. We measured verbal memory and learning using the Rey Auditory Verbal Learning Test (RAVLT) and spatial memory using a computerized test, before and after trial completion. We found that the aerobic training group remembered significantly more items in the loss after interference condition of the RAVLT compared with the control group after six months of training. In addition, both experimental groups showed improved spatial memory performance in the most difficult condition where they were required to memorize the spatial location of three items, compared with the control group. Lastly, we found a significant correlation between spatial memory performance and overall physical capacity after intervention in the aerobic training group. Taken together, our results provide support for the prevailing notion that exercise can positively impact cognitive functioning and may represent an effective strategy to improve memory in those who have begun to experience cognitive decline.

  3. A comparison of the conditional inference survival forest model to random survival forests based on a simulation study as well as on two applications with time-to-event data.

    Science.gov (United States)

    Nasejje, Justine B; Mwambi, Henry; Dheda, Keertan; Lesosky, Maia

    2017-07-28

    Random survival forest (RSF) models have been identified as alternatives to the Cox proportional hazards model for analysing time-to-event data. These methods, however, have been criticised for the bias that results from favouring covariates with many split-points, and hence conditional inference forests for time-to-event data have been suggested. Conditional inference forests (CIF) correct the bias in RSF models by separating the search for the best covariate to split on from the search for the best split point of the selected covariate. In this study, we compare the random survival forest model to the conditional inference forest model (CIF) using twenty-two simulated time-to-event datasets. We also analysed two real time-to-event datasets. The first is based on the survival of children under five years of age in Uganda and consists of categorical covariates, most of them with more than two levels (many split-points). The second is based on the survival of patients with extensively drug-resistant tuberculosis (XDR TB) and consists mainly of categorical covariates with two levels (few split-points). The study findings indicate that the conditional inference forest model is superior to random survival forest models for time-to-event data whose covariates have many split-points, based on bootstrap cross-validated estimates of the integrated Brier score. However, conditional inference forests perform comparably to random survival forest models for time-to-event data whose covariates have fewer split-points. Although survival forests are promising methods for analysing time-to-event data, it is important to identify the best forest model for the analysis based on the nature of the covariates of the dataset in question.
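    The headline metric in this comparison is the bootstrap cross-validated integrated Brier score. A rough sketch of the RSF half of such an analysis, assuming the scikit-survival package and its bundled veterans lung cancer data rather than the paper's Uganda or XDR TB datasets (CIF models are more commonly fitted with R packages such as party/partykit):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sksurv.datasets import load_veterans_lung_cancer
from sksurv.ensemble import RandomSurvivalForest
from sksurv.metrics import integrated_brier_score
from sksurv.preprocessing import OneHotEncoder

X, y = load_veterans_lung_cancer()       # y has fields Status, Survival_in_days
Xt = OneHotEncoder().fit_transform(X)    # expand categorical covariates
X_tr, X_te, y_tr, y_te = train_test_split(Xt, y, test_size=0.3, random_state=0)

rsf = RandomSurvivalForest(n_estimators=500, min_samples_leaf=15, random_state=0)
rsf.fit(X_tr, y_tr)

# Integrated Brier score evaluated on a grid inside the test follow-up range
times = np.percentile(y_te["Survival_in_days"], np.linspace(10, 80, 15))
surv = np.vstack([fn(times) for fn in rsf.predict_survival_function(X_te)])
print("IBS:", integrated_brier_score(y_tr, y_te, surv, times))
```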

  4. A comparison of the conditional inference survival forest model to random survival forests based on a simulation study as well as on two applications with time-to-event data

    Directory of Open Access Journals (Sweden)

    Justine B. Nasejje

    2017-07-01

    Full Text Available Abstract Background Random survival forest (RSF) models have been identified as alternatives to the Cox proportional hazards model for analysing time-to-event data. These methods, however, have been criticised for the bias that results from favouring covariates with many split-points, and hence conditional inference forests for time-to-event data have been suggested. Conditional inference forests (CIF) correct the bias in RSF models by separating the search for the best covariate to split on from the search for the best split point of the selected covariate. Methods In this study, we compare the random survival forest model to the conditional inference forest model (CIF) using twenty-two simulated time-to-event datasets. We also analysed two real time-to-event datasets. The first is based on the survival of children under five years of age in Uganda and consists of categorical covariates, most of them with more than two levels (many split-points). The second is based on the survival of patients with extensively drug-resistant tuberculosis (XDR TB) and consists mainly of categorical covariates with two levels (few split-points). Results The study findings indicate that the conditional inference forest model is superior to random survival forest models for time-to-event data whose covariates have many split-points, based on bootstrap cross-validated estimates of the integrated Brier score. However, conditional inference forests perform comparably to random survival forest models for time-to-event data whose covariates have fewer split-points. Conclusion Although survival forests are promising methods for analysing time-to-event data, it is important to identify the best forest model for the analysis based on the nature of the covariates of the dataset in question.

  5. Quantification of the heterogeneity of prognostic cellular biomarkers in ewing sarcoma using automated image and random survival forest analysis.

    Directory of Open Access Journals (Sweden)

    Claudia Bühnemann

    Full Text Available Driven by genomic somatic variation, tumour tissues are typically heterogeneous, yet unbiased quantitative methods are rarely used to analyse heterogeneity at the protein level. Motivated by this problem, we developed automated image segmentation of images of multiple biomarkers in Ewing sarcoma to generate distributions of biomarkers between and within tumour cells. We further integrate high dimensional data with patient clinical outcomes utilising random survival forest (RSF) machine learning. Using material from cohorts of genetically diagnosed Ewing sarcoma with EWSR1 chromosomal translocations, confocal images of tissue microarrays were segmented with level sets and watershed algorithms. Each cell nucleus and cytoplasm were identified in relation to DAPI and CD99, respectively, and protein biomarkers (e.g. Ki67, pS6, Foxo3a, EGR1, MAPK) localised relative to nuclear and cytoplasmic regions of each cell in order to generate image feature distributions. The image distribution features were analysed with RSF in relation to known overall patient survival from three separate cohorts (185 informative cases). Variation in pre-analytical processing resulted in elimination of a high number of non-informative images that had poor DAPI localisation or biomarker preservation (67 cases, 36%). The distribution of image features for biomarkers in the remaining high quality material (118 cases, 104 features per case) were analysed by RSF with feature selection, and performance assessed using internal cross-validation, rather than a separate validation cohort. A prognostic classifier for Ewing sarcoma with low cross-validation error rates (0.36) was comprised of multiple features, including the Ki67 proliferative marker and a sub-population of cells with low cytoplasmic/nuclear ratio of CD99. Through elimination of bias, the evaluation of high-dimensionality biomarker distribution within cell populations of a tumour using random forest analysis in quality...

  6. Quantification of the heterogeneity of prognostic cellular biomarkers in ewing sarcoma using automated image and random survival forest analysis.

    Science.gov (United States)

    Bühnemann, Claudia; Li, Simon; Yu, Haiyue; Branford White, Harriet; Schäfer, Karl L; Llombart-Bosch, Antonio; Machado, Isidro; Picci, Piero; Hogendoorn, Pancras C W; Athanasou, Nicholas A; Noble, J Alison; Hassan, A Bassim

    2014-01-01

    Driven by genomic somatic variation, tumour tissues are typically heterogeneous, yet unbiased quantitative methods are rarely used to analyse heterogeneity at the protein level. Motivated by this problem, we developed automated image segmentation of images of multiple biomarkers in Ewing sarcoma to generate distributions of biomarkers between and within tumour cells. We further integrate high dimensional data with patient clinical outcomes utilising random survival forest (RSF) machine learning. Using material from cohorts of genetically diagnosed Ewing sarcoma with EWSR1 chromosomal translocations, confocal images of tissue microarrays were segmented with level sets and watershed algorithms. Each cell nucleus and cytoplasm were identified in relation to DAPI and CD99, respectively, and protein biomarkers (e.g. Ki67, pS6, Foxo3a, EGR1, MAPK) localised relative to nuclear and cytoplasmic regions of each cell in order to generate image feature distributions. The image distribution features were analysed with RSF in relation to known overall patient survival from three separate cohorts (185 informative cases). Variation in pre-analytical processing resulted in elimination of a high number of non-informative images that had poor DAPI localisation or biomarker preservation (67 cases, 36%). The distribution of image features for biomarkers in the remaining high quality material (118 cases, 104 features per case) were analysed by RSF with feature selection, and performance assessed using internal cross-validation, rather than a separate validation cohort. A prognostic classifier for Ewing sarcoma with low cross-validation error rates (0.36) was comprised of multiple features, including the Ki67 proliferative marker and a sub-population of cells with low cytoplasmic/nuclear ratio of CD99. Through elimination of bias, the evaluation of high-dimensionality biomarker distribution within cell populations of a tumour using random forest analysis in quality controlled tumour

  7. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...
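    For the multinomial logit special case, the CPGF reduces to the familiar logsum, G(v) = log Σ_j exp(v_j), and its gradient recovers the choice probabilities. A minimal numerical check (the utilities are arbitrary illustrative values, not from the paper):

```python
import numpy as np

v = np.array([1.0, 0.5, -0.2])   # systematic utilities of three alternatives
G = np.log(np.exp(v).sum())      # logsum: the CPGF for the MNL case
probs = np.exp(v - G)            # gradient of G w.r.t. v = choice probabilities
print(probs, probs.sum())        # softmax weights, summing to 1
```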

  8. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence. Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation. Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  9. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities......, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially...... updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence....

  10. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real-valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P
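    For orientation, a sketch of the classical one-sample P-P plot that these generalized plots extend (the interval-indexed construction of the paper is not reproduced here); assumes matplotlib and a normal model fitted by moments:

```python
import numpy as np
from scipy import stats
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
x = np.sort(rng.normal(3.0, 2.0, 200))              # simulated sample
ecdf = np.arange(1, x.size + 1) / x.size            # empirical CDF at the data
model = stats.norm.cdf(x, loc=x.mean(), scale=x.std(ddof=1))  # fitted model CDF

plt.plot(model, ecdf, ".", label="P-P points")      # near the diagonal = good fit
plt.plot([0, 1], [0, 1], "k--", label="perfect agreement")
plt.xlabel("fitted model CDF"); plt.ylabel("empirical CDF"); plt.legend()
plt.show()
```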

  11. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  12. Identification by random forest method of HLA class I amino acid substitutions associated with lower survival at day 100 in unrelated donor hematopoietic cell transplantation.

    Science.gov (United States)

    Marino, S R; Lin, S; Maiers, M; Haagenson, M; Spellman, S; Klein, J P; Binkowski, T A; Lee, S J; van Besien, K

    2012-02-01

    The identification of important amino acid substitutions associated with low survival in hematopoietic cell transplantation (HCT) is hampered by the large number of observed substitutions compared with the small number of patients available for analysis. Random forest analysis is designed to address these limitations. We studied 2107 HCT recipients with good or intermediate risk hematological malignancies to identify HLA class I amino acid substitutions associated with reduced survival at day 100 post transplant. Random forest analysis and traditional univariate and multivariate analyses were used. Random forest analysis identified amino acid substitutions in 33 positions that were associated with reduced 100-day survival, including HLA-A 9, 43, 62, 63, 76, 77, 95, 97, 114, 116, 152, 156, 166 and 167; HLA-B 97, 109, 116 and 156; and HLA-C 6, 9, 11, 14, 21, 66, 77, 80, 95, 97, 99, 116, 156, 163 and 173. In all, 13 had been previously reported by other investigators using classical biostatistical approaches. Using the same data set, traditional multivariate logistic regression identified only five amino acid substitutions associated with lower day-100 survival. Random forest analysis is a novel statistical methodology for analysis of HLA mismatching and outcome studies, capable of identifying important amino acid substitutions missed by other methods.
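    The design is the classic many-predictors, few-events setting: one binary indicator per candidate substitution and a binary day-100 outcome. A sketch of the forest-plus-importance-ranking pattern with simulated genotypes (positions 9 and 42 are planted signals, not the study's loci), assuming scikit-learn:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
n_patients, n_positions = 2000, 300
X = rng.integers(0, 2, size=(n_patients, n_positions)).astype(float)
logit = -2.0 + 1.2 * X[:, 9] + 0.8 * X[:, 42]   # two truly informative positions
y = rng.random(n_patients) < 1.0 / (1.0 + np.exp(-logit))  # 1 = death by day 100

forest = RandomForestClassifier(n_estimators=1000, min_samples_leaf=20,
                                random_state=0).fit(X, y)
imp = permutation_importance(forest, X, y, n_repeats=10, random_state=0)
print("top positions:", np.argsort(imp.importances_mean)[::-1][:10])
```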

  13. Survival of bonded lingual retainers with chemical or photo polymerization over a 2-year period: a single-center, randomized controlled clinical trial

    NARCIS (Netherlands)

    Pandis, N.; Fleming, P.S.; Kloukos, D.; Polychronopoulou, A.; Katsaros, C.; Eliades, T.

    2013-01-01

    INTRODUCTION: The objective of this trial was to compare the survival rates of mandibular lingual retainers bonded with either chemically cured or light-cured adhesive after orthodontic treatment. METHODS: Patients having undergone orthodontic treatment at a private orthodontic office were randomly

  14. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
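    A Monte Carlo version of the same computation (propagating input distributions through a function and reading the output distribution off the samples) is easy to sketch; the load/strength example below is illustrative and is not COVAL's algorithm, which uses numerical transformation instead:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000
load = rng.lognormal(mean=2.0, sigma=0.3, size=n)   # random load on the structure
strength = rng.normal(loc=12.0, scale=1.5, size=n)  # random resistance
margin = strength - load                            # function of the random inputs

print("P(failure):", np.mean(margin < 0))           # output mass below zero
print("5th/95th percentiles:", np.percentile(margin, [5, 95]))
```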

  15. COUNTRY-LEVEL SOCIOECONOMIC INDICATORS ASSOCIATED WITH SURVIVAL PROBABILITY OF BECOMING A CENTENARIAN AMONG OLDER EUROPEAN ADULTS: GENDER INEQUALITY, MALE LABOUR FORCE PARTICIPATION AND PROPORTIONS OF WOMEN IN PARLIAMENTS.

    Science.gov (United States)

    Kim, Jong In; Kim, Gukbin

    2017-03-01

    This study confirms an association between the survival probability of becoming a centenarian (SPBC) for those aged 65 to 69 and country-level socioeconomic indicators in Europe: the gender inequality index (GII), male labour force participation (MLP) rates and the proportion of seats held by women in national parliaments (PWP). The analysis was based on SPBC data from 34 countries obtained from the United Nations (UN). Country-level socioeconomic indicator data were obtained from the UN and World Bank databases. The associations between socioeconomic indicators and SPBC were assessed using correlation coefficients and multivariate regression models. The findings show significant correlations between the SPBC for women and men aged 65 to 69 and country-level socioeconomic indicators: GII (r = -0.674, p = 0.001), MLP (r = 0.514, p = 0.002) and PWP (r = 0.498, p = 0.003). The SPBC predictors for women and men were lower GII and higher MLP and PWP (R² = 0.508, p = 0.001). Country-level socioeconomic indicators appear to have an important effect on the probability of becoming a centenarian among European adults aged 65 to 69. Country-level gender equality policies in European countries may decrease the risk of unhealthy old age and increase longevity in elders through greater national gender equality; disparities in the GII and other country-level socioeconomic indicators impact longevity probability. National longevity strategies should target country-level gender inequality.
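    The analysis pattern (pairwise correlations plus a multivariate regression across 34 countries) maps directly onto standard tools. A sketch with fabricated country-level numbers, assuming statsmodels; the study's UN/World Bank values are not reproduced:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 34                                           # countries
gii = rng.uniform(0.05, 0.40, n)                 # gender inequality index
mlp = rng.uniform(50.0, 75.0, n)                 # male labour participation (%)
pwp = rng.uniform(10.0, 45.0, n)                 # women in parliament (%)
spbc = 2.0 - 3.5 * gii + 0.02 * mlp + 0.03 * pwp + rng.normal(0, 0.3, n)

for name, col in (("GII", gii), ("MLP", mlp), ("PWP", pwp)):
    print(f"r({name}, SPBC) = {np.corrcoef(col, spbc)[0, 1]:+.3f}")

X = sm.add_constant(np.column_stack([gii, mlp, pwp]))
fit = sm.OLS(spbc, X).fit()
print("R^2 =", round(fit.rsquared, 3), "p-values:", np.round(fit.pvalues, 4))
```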

  16. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  17. Modeling Transport in Fractured Porous Media with the Random-Walk Particle Method: The Transient Activity Range and the Particle-Transfer Probability

    International Nuclear Information System (INIS)

    Lehua Pan; G.S. Bodvarsson

    2001-01-01

    Multiscale features of transport processes in fractured porous media make numerical modeling a difficult task, both in conceptualization and computation. Modeling the mass transfer through the fracture-matrix interface is one of the critical issues in the simulation of transport in a fractured porous medium. Because conventional dual-continuum-based numerical methods are unable to capture the transient features of the diffusion depth into the matrix (unless they assume a passive matrix medium), such methods will overestimate the transport of tracers through the fractures, especially for the cases with large fracture spacing, resulting in artificial early breakthroughs. We have developed a new method for calculating the particle-transfer probability that can capture the transient features of diffusion depth into the matrix within the framework of the dual-continuum random-walk particle method (RWPM) by introducing a new concept of activity range of a particle within the matrix. Unlike the multiple-continuum approach, the new dual-continuum RWPM does not require using additional grid blocks to represent the matrix. It does not assume a passive matrix medium and can be applied to the cases where global water flow exists in both continua. The new method has been verified against analytical solutions for transport in the fracture-matrix systems with various fracture spacing. The calculations of the breakthrough curves of radionuclides from a potential repository to the water table in Yucca Mountain demonstrate the effectiveness of the new method for simulating 3-D, mountain-scale transport in a heterogeneous, fractured porous medium under variably saturated conditions
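    A toy dual-continuum random walk conveys the role of a particle-transfer probability, though it omits the paper's transient activity-range construction: at each step a fracture particle transfers to the (here immobile) matrix with probability p_fm and returns with probability p_mf. All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(11)
n_particles, n_steps = 10_000, 500
v, D, dt = 1.0, 0.1, 0.01       # advection, dispersion, time step (fracture)
p_fm, p_mf = 0.05, 0.01         # fracture->matrix and matrix->fracture transfer

x = np.zeros(n_particles)
in_frac = np.ones(n_particles, dtype=bool)
for _ in range(n_steps):
    u = rng.random(n_particles)
    # Fracture particles stay with probability 1-p_fm; matrix particles
    # return to the fracture with probability p_mf.
    in_frac = np.where(in_frac, u >= p_fm, u < p_mf)
    step = v * dt + np.sqrt(2.0 * D * dt) * rng.normal(size=n_particles)
    x += np.where(in_frac, step, 0.0)  # matrix-resident particles do not advance

print("mean travel distance:", x.mean())  # retardation grows with p_fm / p_mf
```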

  18. Impact of weight loss on survival after chemoradiation for locally advanced head and neck cancer: secondary results of a randomized phase III trial (SAKK 10/94)

    International Nuclear Information System (INIS)

    Ghadjar, Pirus; Hayoz, Stefanie; Zimmermann, Frank; Bodis, Stephan; Kaul, David; Badakhshi, Harun; Bernier, Jacques; Studer, Gabriela; Plasswilm, Ludwig; Budach, Volker; Aebersold, Daniel M

    2015-01-01

    To analyze the impact of weight loss before and during chemoradiation on survival outcomes in patients with locally advanced head and neck cancer. From 07/1994-07/2000 a total of 224 patients with squamous cell carcinoma of the head and neck were randomized to either hyperfractionated radiation therapy alone or the same radiation therapy combined with two cycles of concomitant cisplatin. The primary endpoint was time to any treatment failure (TTF); secondary endpoints were locoregional recurrence-free survival (LRRFS), distant metastasis-free survival (DMFS) and overall survival (OS). Patient weight was measured 6 months before treatment, at treatment start and treatment end. The proportion of patients with >5% weight loss was 32% before, and 51% during treatment, and the proportion of patients with >10% weight loss was 12% before, and 17% during treatment. After a median follow-up of 9.5 years (range, 0.1 – 15.4 years) weight loss before treatment was associated with decreased TTF, LRRFS, DMFS, cancer specific survival and OS in a multivariable analysis. However, weight loss during treatment was not associated with survival outcomes. Weight loss before and during chemoradiation was commonly observed. Weight loss before but not during treatment was associated with worse survival

  19. [Biometric bases: basic concepts of probability calculation].

    Science.gov (United States)

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The bases of event algebra, the definition of probability, the classical probability model and the random variable are presented.

  20. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  1. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  2. Ovarian Suppression With Triptorelin During Adjuvant Breast Cancer Chemotherapy and Long-term Ovarian Function, Pregnancies, and Disease-Free Survival: A Randomized Clinical Trial.

    Science.gov (United States)

    Lambertini, Matteo; Boni, Luca; Michelotti, Andrea; Gamucci, Teresa; Scotto, Tiziana; Gori, Stefania; Giordano, Monica; Garrone, Ornella; Levaggi, Alessia; Poggio, Francesca; Giraudi, Sara; Bighin, Claudia; Vecchio, Carlo; Sertoli, Mario Roberto; Pronzato, Paolo; Del Mastro, Lucia

    Whether the administration of luteinizing hormone-releasing hormone analogues (LHRHa) during chemotherapy is a reliable strategy to preserve ovarian function is controversial owing to both the lack of data on long-term ovarian function and pregnancies and the safety concerns about the potential negative interactions between endocrine therapy and chemotherapy. To evaluate long-term results of LHRHa-induced ovarian suppression during breast cancer chemotherapy. Parallel, randomized, open-label, phase 3 superiority trial conducted at 16 Italian sites. Between October 2003 and January 2008, 281 premenopausal women with stage I to III hormone receptor-positive or hormone receptor-negative breast cancer were enrolled. Last annual follow-up was June 3, 2014. Patients were randomized to receive adjuvant or neoadjuvant chemotherapy alone (control group) or chemotherapy plus triptorelin (LHRHa group). The primary planned end point was incidence of chemotherapy-induced early menopause. Post hoc end points were long-term ovarian function (evaluated by yearly assessment of menstrual activity and defined as resumed by the occurrence of at least 1 menstrual cycle), pregnancies, and disease-free survival (DFS). A total of 281 women (median age, 39 [range, 24-45] years) were randomized. Median follow-up was 7.3 years (interquartile range, 6.3-8.2 years). The 5-year cumulative incidence estimate of menstrual resumption was 72.6% (95% CI, 65.7%-80.3%) among the 148 patients in the LHRHa group and 64.0% (95% CI, 56.2%-72.8%) among the 133 patients in the control group (hazard ratio [HR], 1.28 [95% CI, 0.98-1.68]; P = .07; age-adjusted HR, 1.48 [95% CI, 1.12-1.95]; P = .006). Eight pregnancies (5-year cumulative incidence estimate of pregnancy, 2.1% [95% CI, 0.7%-6.3%]) occurred in the LHRHa group and 3 (5-year cumulative incidence estimate of pregnancy, 1.6% [95% CI, 0.4%-6.2%]) in the control group (HR, 2.56 [95% CI, 0.68-9.60]; P = .14; age-adjusted HR, 2.40 [95% CI, 0

  3. Does quasi-long-range order in the two-dimensional XY model really survive weak random phase fluctuations?

    International Nuclear Information System (INIS)

    Mudry, Christopher; Wen Xiaogang

    1999-01-01

    Effective theories for random critical points are usually non-unitary, and thus may contain relevant operators with negative scaling dimensions. To study the consequences of the existence of negative-dimensional operators, we consider the random-bond XY model. It has been argued that the XY model on a square lattice, when weakly perturbed by random phases, has a quasi-long-range ordered phase (the random spin wave phase) at sufficiently low temperatures. We show that infinitely many relevant perturbations to the proposed critical action for the random spin wave phase were omitted in all previous treatments. The physical origin of these perturbations is intimately related to the existence of broadly distributed correlation functions. We find that those relevant perturbations do enter the Renormalization Group equations, and affect critical behavior. This raises the possibility that the random XY model has no quasi-long-range ordered phase and no Kosterlitz-Thouless (KT) phase transition

  4. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  5. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  6. Cluster-randomized study of intermittent preventive treatment for malaria in infants (IPTi) in southern Tanzania: evaluation of impact on survival

    Directory of Open Access Journals (Sweden)

    Schellenberg Joanna

    2011-12-01

    Full Text Available Abstract Background Intermittent Preventive Treatment for malaria control in infants (IPTi) consists of the administration of a treatment dose of an anti-malarial drug, usually sulphadoxine-pyrimethamine, at scheduled intervals, regardless of the presence of Plasmodium falciparum infection. A pooled analysis of individually randomized trials reported that IPTi reduced clinical episodes by 30%. This study evaluated the effect of IPTi on child survival in the context of a five-district implementation project in southern Tanzania. [Trial registration: clinicaltrials.gov NCT00152204]. Methods After baseline household and health facility surveys in 2004, five districts comprising 24 divisions were randomly assigned either to receive IPTi (n = 12) or not (n = 12). Implementation started in March 2005, led by routine health services with support from the research team. In 2007, a large household survey was undertaken to assess the impact of IPTi on survival in infants aged two-11 months through birth history interviews with all women aged 13-49 years. The analysis is based on an "intention-to-treat" ecological design, with survival outcomes analysed according to the cluster in which the mothers lived. Results Survival in infants aged two-11 months was comparable in IPTi and comparison areas at baseline. In intervention areas in 2007, 48% of children aged 12-23 months had documented evidence of receiving three doses of IPTi, compared to 2% in comparison areas. There was no evidence of an effect of IPTi on survival in infants aged two-11 months (P = 0.31). Conclusion The lack of evidence of an effect of IPTi on survival could be a false negative result due to a lack of power or imbalance of unmeasured confounders. Alternatively, there could be no mortality impact of IPTi due to low coverage, late administration, drug resistance, decreased malaria transmission or improvements in vector control and case management. This study raises important questions for programme evaluation design.

  7. Models for probability and statistical inference theory and applications

    CERN Document Server

    Stapleton, James H

    2007-01-01

    This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readersModels for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping.Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...

  8. A preliminary randomized clinical trial comparing diode laser and scalpel periosteal incision during implant surgery: impact on postoperative morbidity and implant survival.

    Science.gov (United States)

    Shahnaz, Aysan; Jamali, Raika; Mohammadi, Farnush; Khorsand, Afshin; Moslemi, Neda; Fekrazad, Reza

    2018-01-01

    The aim of this preliminary randomized clinical trial was to compare: (1) postoperative morbidity after application of laser or scalpel incision for flap advancement during implant surgery and bone grafting, and (2) implant survival rate following flap advancement with laser or scalpel incision after 6 months of loading. Eighteen patients who were scheduled for dental implant placement and simultaneous bone grafting were randomly assigned to test or control groups. Diode laser (810 nm, 2 W, pulse interval 200 μs; pulse length 100 μs, 400-μm initiated fiber tip) or scalpel (control) was used to sever the periosteum to create a tension-free flap. Visual analogue scale (VAS) pain score, rate of nonsteroidal anti-inflammatory drug (NSAID) consumption, intensity of swelling, and ecchymosis were measured for the six postsurgical days. Six months after loading, implant survival was assessed. VAS pain score (during the first four postoperative days), rate of NSAID consumption (during the first three postoperative days), and intensity of swelling (during the first five postoperative days) were significantly lower in the test group compared to the control group (all P values < 0.05). Application of laser for performing the periosteal releasing incision reduced the incidence and severity of postoperative morbidity in patients who underwent implant surgery in conjunction with a bone augmentation procedure. We did not find any detrimental effect of laser incision on implant survival within 6 months of loading.

  9. Spatial Random Effects Survival Models to Assess Geographical Inequalities in Dengue Fever Using Bayesian Approach: a Case Study

    Science.gov (United States)

    Astuti Thamrin, Sri; Taufik, Irfan

    2018-03-01

    Dengue haemorrhagic fever (DHF) is an infectious disease caused by the dengue virus. The increasing number of people with DHF correlates with the neighbourhood, for example sub-districts, and the characteristics of a sub-district are formed by the individuals domiciled in it. Data containing both individuals and sub-districts form a hierarchical data structure, calling for multilevel analysis. A frequently encountered response variable in such data is the time until an event occurs. Multilevel and spatial models are being increasingly used to obtain substantive information on area-level inequalities in DHF survival. Using a case study approach, we report on the implications of using multilevel and spatial survival models to study geographical inequalities in all-cause survival.

  10. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science, which uses probability and its offshoots, like statistics and the theory of random processes, to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  11. Survival in Malnourished Older Patients Receiving Post-Discharge Nutritional Support; Long-Term Results of a Randomized Controlled Trial

    NARCIS (Netherlands)

    Neelemaat, F; van Keeken, S; Langius, J A E; de van der Schueren, M A E; Thijs, A; Bosmans, J E

    2017-01-01

    BACKGROUND: Previous analyses have shown that a post-discharge individualized nutritional intervention had positive effects on body weight, lean body mass, functional limitations and fall incidents in malnourished older patients. However, the impact of this intervention on survival has not yet been

  12. Taylor-series and Monte-Carlo-method uncertainty estimation of the width of a probability distribution based on varying bias and random error

    International Nuclear Information System (INIS)

    Wilson, Brandon M; Smith, Barton L

    2013-01-01

    Uncertainties are typically assumed to be constant or a linear function of the measured value; however, this is generally not true. Particle image velocimetry (PIV) is one example of a measurement technique that has highly nonlinear, time varying local uncertainties. Traditional uncertainty methods are not adequate for the estimation of the uncertainty of measurement statistics (mean and variance) in the presence of nonlinear, time varying errors. Propagation of instantaneous uncertainty estimates into measured statistics is performed allowing accurate uncertainty quantification of time-mean and statistics of measurements such as PIV. It is shown that random errors will always elevate the measured variance, and thus turbulent statistics such as u'u'-bar. Within this paper, nonlinear, time varying errors are propagated from instantaneous measurements into the measured mean and variance using the Taylor-series method. With these results and knowledge of the systematic and random uncertainty of each measurement, the uncertainty of the time-mean, the variance and covariance can be found. Applicability of the Taylor-series uncertainty equations to time varying systematic and random errors and asymmetric error distributions are demonstrated with Monte-Carlo simulations. The Taylor-series uncertainty estimates are always accurate for uncertainties on the mean quantity. The Taylor-series variance uncertainty is similar to the Monte-Carlo results for cases in which asymmetric random errors exist or the magnitude of the instantaneous variations in the random and systematic errors is near the ‘true’ variance. However, the Taylor-series method overpredicts the uncertainty in the variance as the instantaneous variations of systematic errors are large or are on the same order of magnitude as the ‘true’ variance. (paper)
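    The paper's point that random errors always elevate the measured variance (var(x + e) = var(x) + var(e) for independent noise) can be checked in a few lines with synthetic measurements; nothing below is PIV-specific:

```python
import numpy as np

rng = np.random.default_rng(7)
true_signal = rng.normal(10.0, 1.0, 100_000)   # fluctuating 'true' quantity
noise_sd = 0.5                                 # instantaneous random error
measured = true_signal + rng.normal(0.0, noise_sd, true_signal.size)

print("true variance:    ", np.var(true_signal))
print("measured variance:", np.var(measured))  # elevated by ~noise_sd**2
print("after correction: ", np.var(measured) - noise_sd**2)
```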

  13. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
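    The paper points to R implementations; the same probability-machine idea is easy to sketch in Python: fit a regression forest to a 0/1 response so that the prediction estimates P(Y = 1 | x) rather than a class label. Data below are synthetic, with the true probabilities known so the estimates can be checked.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 4))
p_true = 1.0 / (1.0 + np.exp(-(X[:, 0] - 2.0 * X[:, 1])))  # known truth
y = (rng.random(5000) < p_true).astype(float)              # binary response

X_tr, X_te, y_tr, y_te, p_tr, p_te = train_test_split(
    X, y, p_true, test_size=0.3, random_state=0)

rf = RandomForestRegressor(n_estimators=500, min_samples_leaf=25,
                           random_state=0).fit(X_tr, y_tr)
p_hat = rf.predict(X_te)                   # individual probability estimates
print("mean |p_hat - p_true|:", np.abs(p_hat - p_te).mean())
```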

  14. Random magnetism

    International Nuclear Information System (INIS)

    Tahir-Kheli, R.A.

    1975-01-01

    A few simple problems relating to random magnetic systems are presented. Translational symmetry is assumed for these systems only on the macroscopic scale. On the microscopic scale, the parameters of the various regions of these systems are taken to be random, following a given probability distribution. Knowledge of the form of these probability distributions is assumed in all cases.

  15. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  16. The INCA trial (Impact of NOD2 genotype-guided antibiotic prevention on survival in patients with liver Cirrhosis and Ascites): study protocol for a randomized controlled trial.

    Science.gov (United States)

    Casper, Markus; Mengel, Martin; Fuhrmann, Christine; Herrmann, Eva; Appenrodt, Beate; Schiedermaier, Peter; Reichert, Matthias; Bruns, Tony; Engelmann, Cornelius; Grünhage, Frank; Lammert, Frank

    2015-03-08

    Patients with liver cirrhosis have a highly elevated risk of developing bacterial infections that significantly decrease survival rates. One of the most relevant infections is spontaneous bacterial peritonitis (SBP). Recently, NOD2 germline variants were found to be potential predictors of the development of infectious complications and mortality in patients with cirrhosis. The aim of the INCA (Impact of NOD2 genotype-guided antibiotic prevention on survival in patients with liver Cirrhosis and Ascites) trial is to investigate whether survival of this genetically defined high-risk group of patients with cirrhosis, defined by the presence of NOD2 variants, is improved by primary antibiotic prophylaxis of SBP. The INCA trial is a double-blind, placebo-controlled clinical trial with two parallel treatment arms (arm 1: norfloxacin 400 mg once daily; arm 2: placebo once daily; 12-month treatment and observational period). Balanced randomization of 186 eligible patients, stratified by the protein content of the ascites, is planned. The INCA trial is the first in the field of hepatology aimed at rapidly transferring and validating information on individual genetic risk into clinical decision algorithms. German Clinical Trials Register DRKS00005616. Registered 22 January 2014. EU Clinical Trials Register EudraCT 2013-001626-26. Registered 26 January 2015.

  17. A randomized controlled trial of cognitive-behavioral stress management in breast cancer: survival and recurrence at 11-year follow-up.

    Science.gov (United States)

    Stagl, Jamie M; Lechner, Suzanne C; Carver, Charles S; Bouchard, Laura C; Gudenkauf, Lisa M; Jutagir, Devika R; Diaz, Alain; Yu, Qilu; Blomberg, Bonnie B; Ironson, Gail; Glück, Stefan; Antoni, Michael H

    2015-11-01

    Non-metastatic breast cancer patients often experience psychological distress which may influence disease progression and survival. Cognitive-behavioral stress management (CBSM) improves psychological adaptation and lowers distress during breast cancer treatment and long-term follow-ups. We examined whether breast cancer patients randomized to CBSM had improved survival and recurrence 8-15 years post-enrollment. From 1998 to 2005, women (N = 240) 2-10 weeks post-surgery for non-metastatic Stage 0-IIIb breast cancer were randomized to a 10-week, group-based CBSM intervention (n = 120) or a 1-day psychoeducational seminar control (n = 120). In 2013, 8-15 years post-study enrollment (11-year median), recurrence and survival data were collected. Cox Proportional Hazards Models and Weibull Accelerated Failure Time tests were used to assess group differences in all-cause mortality, breast cancer-specific mortality, and disease-free interval, controlling for biomedical confounders. Relative to the control, the CBSM group was found to have a reduced risk of all-cause mortality (HR = 0.21; 95 % CI [0.05, 0.93]; p = .040). Restricting analyses to women with invasive disease revealed significant effects of CBSM on breast cancer-related mortality (p = .006) and disease-free interval (p = .011). CBSM intervention delivered post-surgery may provide long-term clinical benefit for non-metastatic breast cancer patients in addition to previously established psychological benefits. Results should be interpreted with caution; however, the findings contribute to the limited evidence regarding physical benefits of psychosocial intervention post-surgery for non-metastatic breast cancer. Additional research is necessary to confirm these results and investigate potential explanatory mechanisms, including physiological pathways, health behaviors, and treatment adherence changes.
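    The survival comparisons reported here follow the standard Cox proportional hazards pattern (a treatment indicator plus biomedical confounders). A sketch with simulated data, assuming the lifelines package; nothing below reproduces the trial's records:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 240
df = pd.DataFrame({"cbsm": rng.integers(0, 2, n),   # 1 = CBSM group
                   "age": rng.normal(50.0, 9.0, n),
                   "stage": rng.integers(0, 4, n)})
hazard = 0.02 * np.exp(-1.5 * df["cbsm"] + 0.02 * (df["age"] - 50.0))
df["time"] = rng.exponential(1.0 / hazard)          # simulated event times
df["event"] = (df["time"] < 15.0).astype(int)       # administrative censoring
df.loc[df["event"] == 0, "time"] = 15.0             # at 15 years of follow-up

cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
print(cph.summary[["exp(coef)", "p"]])              # hazard ratios and p-values
```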

  18. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the 7 presentations made at the third meeting 'physics and fundamental questions', whose theme was probability and prediction. The concept of probability, invented to apprehend random phenomena, has become an important branch of mathematics, and its applications range from radioactivity to species evolution, via cosmology and the management of very weak risks. The notion of probability underlies quantum mechanics and is thus bound to the very nature of matter. The 7 topics are: radioactivity and probability; statistical and quantum fluctuations; quantum mechanics as a generalized probability theory; probability and the irrational efficiency of mathematics; can we foresee the future of the universe?; chance, eventuality and necessity in biology; and how to manage weak risks? (A.C.)

  19. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  20. Survival of bonded lingual retainers with chemical or photo polymerization over a 2-year period: a single-center, randomized controlled clinical trial.

    Science.gov (United States)

    Pandis, Nikolaos; Fleming, Padhraig S; Kloukos, Dimitrios; Polychronopoulou, Argy; Katsaros, Christos; Eliades, Theodore

    2013-08-01

    The objective of this trial was to compare the survival rates of mandibular lingual retainers bonded with either chemically cured or light-cured adhesive after orthodontic treatment. Patients having undergone orthodontic treatment at a private orthodontic office were randomly allocated to fixed retainers placed with chemically cured composite or light-cured composite. Eligibility criteria included no active caries, restorations, or fractures on the mandibular anterior teeth, and adequate oral hygiene. The main outcome was any type of first-time lingual retainer breakage; pattern of failure (adapted adhesive remnant index scores) was a secondary outcome. Randomization was accomplished with random permuted blocks of 20 patients with allocation concealed in sequentially numbered, opaque, sealed envelopes. Blinding was applicable for outcome assessment only. Patients were reviewed at 1, 3, and 6 months and then every 6 months after placement of the retainer until completion of the study. Data were analyzed using survival analysis including Cox regression; sensitivity analysis was carried out after data imputation for subjects lost to follow-up. Two hundred twenty patients (median age, 16 years; interquartile range, 2; range, 12-47 years) were randomized in a 1:1 ratio to either chemical or light curing. Baseline characteristics were similar between groups, the median follow-up period was 2.19 years (range, 0.003-3.64 years), and 16 patients were lost to follow-up. At a minimum follow-up of 2 years, 47 of 110 (42.7%) and 55 of 110 (50.0%) retainers had some type of failure with chemically cured and light-cured adhesive, respectively (log-rank test, P = 0.35). Data were analyzed on an intention-to-treat basis, and the hazard ratio (HR) was 1.15 (95% confidence interval [CI], 0.88-1.70; P = 0.47). There was weak evidence that age is a significant predictor for lingual retainer failures (HR, 0.96; 95% CI, 0.93-1.00; P = 0.08). Adhesive remnant index scoring was

  1. Randomized, placebo controlled study of the effect of propentofylline on survival time and quality of life of cats with feline infectious peritonitis.

    Science.gov (United States)

    Fischer, Y; Ritz, S; Weber, K; Sauter-Louis, C; Hartmann, K

    2011-01-01

    Currently there is no drug proven to effectively treat cats with feline infectious peritonitis (FIP). Propentofylline (PPF) can decrease vasculitis, and therefore prolong survival time in cats with FIP and increase their quality of life. Twenty-three privately owned cats with FIP. Placebo-controlled double-blind trial. FIP was confirmed by histology or immunostaining of feline coronavirus (FCoV) antigen in effusion or tissue macrophages or both. The cats were randomly assigned to treatment with either PPF or placebo. All cats received additional treatment with glucocorticoids, antibiotics, and low molecular weight heparin according to the study protocol. There was no statistically significant difference in the survival time of cats treated with PPF (8 days, 95% CI 5.4-10.6) versus placebo (7.5 days, 95% CI 4.4-9.6). The median survival time of all cats was 8 days (4-36 days). There was neither a difference in quality of life (day 7, P = .892), nor in the amount of effusion (day 7, P = .710), nor in the tumor necrosis factor-alpha (TNF-α) concentration (day 7, P = .355), nor in any other variable investigated in this study, including a complete blood count and a small animal biochemistry profile. This study did not detect an effect of PPF on the survival time, the quality of life, or any clinical or laboratory parameter in cats with FIP. Therefore, PPF does not appear to be an effective treatment option in cats in a late stage of FIP. Copyright © 2011 by the American College of Veterinary Internal Medicine.

  2. Joint genome-wide prediction in several populations accounting for randomness of genotypes: A hierarchical Bayes approach. I: Multivariate Gaussian priors for marker effects and derivation of the joint probability mass function of genotypes.

    Science.gov (United States)

    Martínez, Carlos Alberto; Khare, Kshitij; Banerjee, Arunava; Elzo, Mauricio A

    2017-03-21

    It is important to consider heterogeneity of marker effects and allelic frequencies in across population genome-wide prediction studies. Moreover, all regression models used in genome-wide prediction overlook randomness of genotypes. In this study, a family of hierarchical Bayesian models to perform across population genome-wide prediction modeling genotypes as random variables and allowing population-specific effects for each marker was developed. Models shared a common structure and differed in the priors used and the assumption about residual variances (homogeneous or heterogeneous). Randomness of genotypes was accounted for by deriving the joint probability mass function of marker genotypes conditional on allelic frequencies and pedigree information. As a consequence, these models incorporated kinship and genotypic information that not only permitted to account for heterogeneity of allelic frequencies, but also to include individuals with missing genotypes at some or all loci without the need for previous imputation. This was possible because the non-observed fraction of the design matrix was treated as an unknown model parameter. For each model, a simpler version ignoring population structure, but still accounting for randomness of genotypes was proposed. Implementation of these models and computation of some criteria for model comparison were illustrated using two simulated datasets. Theoretical and computational issues along with possible applications, extensions and refinements were discussed. Some features of the models developed in this study make them promising for genome-wide prediction, the use of information contained in the probability distribution of genotypes is perhaps the most appealing. Further studies to assess the performance of the models proposed here and also to compare them with conventional models used in genome-wide prediction are needed. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  4. Joint probabilities and quantum cognition

    International Nuclear Information System (INIS)

    Acacio de Barros, J.

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  5. Joint probabilities and quantum cognition

    Energy Technology Data Exchange (ETDEWEB)

    Acacio de Barros, J. [Liberal Studies, 1600 Holloway Ave., San Francisco State University, San Francisco, CA 94132 (United States)

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  6. Overall Survival in Patients With Advanced Melanoma Who Received Nivolumab Versus Investigator's Choice Chemotherapy in CheckMate 037: A Randomized, Controlled, Open-Label Phase III Trial.

    Science.gov (United States)

    Larkin, James; Minor, David; D'Angelo, Sandra; Neyns, Bart; Smylie, Michael; Miller, Wilson H; Gutzmer, Ralf; Linette, Gerald; Chmielowski, Bartosz; Lao, Christopher D; Lorigan, Paul; Grossmann, Kenneth; Hassel, Jessica C; Sznol, Mario; Daud, Adil; Sosman, Jeffrey; Khushalani, Nikhil; Schadendorf, Dirk; Hoeller, Christoph; Walker, Dana; Kong, George; Horak, Christine; Weber, Jeffrey

    2018-02-01

    Purpose Until recently, limited options existed for patients with advanced melanoma who experienced disease progression while receiving treatment with ipilimumab. Here, we report the coprimary overall survival (OS) end point of CheckMate 037, which has previously shown that nivolumab resulted in more patients achieving an objective response compared with chemotherapy regimens in ipilimumab-refractory patients with advanced melanoma. Patients and Methods Patients were stratified by programmed death-ligand 1 expression, BRAF status, and best prior cytotoxic T-lymphocyte antigen-4 therapy response, then randomly assigned 2:1 to nivolumab 3 mg/kg intravenously every 2 weeks or investigator's choice chemotherapy (ICC; dacarbazine 1,000 mg/m² every 3 weeks or carboplatin area under the curve 6 plus paclitaxel 175 mg/m² every 3 weeks). Patients were treated until they experienced progression or unacceptable toxicity, with follow-up of approximately 2 years. Results Two hundred seventy-two patients were randomly assigned to nivolumab (99% treated) and 133 to ICC (77% treated). More nivolumab-treated patients had brain metastases (20% v 14%) and increased lactate dehydrogenase levels (52% v 38%) at baseline; 41% of patients treated with ICC versus 11% of patients treated with nivolumab received anti-programmed death 1 agents after randomly assigned therapy. Median OS was 16 months for nivolumab versus 14 months for ICC (hazard ratio, 0.95; 95.54% CI, 0.73 to 1.24); median progression-free survival was 3.1 months versus 3.7 months, respectively (hazard ratio, 1.0; 95.1% CI, 0.78 to 1.436). Overall response rate (27% v 10%) and median duration of response (32 months v 13 months) were notably higher for nivolumab versus ICC. Fewer grade 3 and 4 treatment-related adverse events were observed in patients on nivolumab (14% v 34%). Conclusion Nivolumab demonstrated higher, more durable responses but no difference in survival compared with ICC. OS should be interpreted with caution, given the baseline imbalance in prognostic factors and the far greater use of subsequent anti-programmed death 1 therapy in the ICC arm.

  7. Optimal dose selection accounting for patient subpopulations in a randomized Phase II trial to maximize the success probability of a subsequent Phase III trial.

    Science.gov (United States)

    Takahashi, Fumihiro; Morita, Satoshi

    2018-02-08

    Phase II clinical trials are conducted to determine the optimal dose of the study drug for use in Phase III clinical trials while also balancing efficacy and safety. In conducting these trials, it may be important to consider subpopulations of patients grouped by background factors such as drug metabolism and kidney and liver function. Determining the optimal dose, as well as maximizing the effectiveness of the study drug by analyzing patient subpopulations, requires a complex decision-making process. In extreme cases, drug development has to be terminated due to inadequate efficacy or severe toxicity. Such a decision may be based on a particular subpopulation. We propose a Bayesian utility approach (BUART) to randomized Phase II clinical trials which uses a first-order bivariate normal dynamic linear model for efficacy and safety in order to determine the optimal dose and study population in a subsequent Phase III clinical trial. We carried out a simulation study under a wide range of clinical scenarios to evaluate the performance of the proposed method in comparison with a conventional method separately analyzing efficacy and safety in each patient population. The proposed method showed more favorable operating characteristics in determining the optimal population and dose.

  8. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.

  9. Nuclear data uncertainties: I, Basic concepts of probability

    International Nuclear Information System (INIS)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs
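
    In the spirit of the report's numerous simple examples, the sketch below works one of its listed topics, Bayes' theorem, on a toy screening problem; the sketch and its numbers are illustrative additions, not taken from the report.

```python
# Toy Bayes' theorem example (illustrative numbers, not from the report).
prior = 0.01                 # P(A): prevalence of a condition
p_b_given_a = 0.95           # P(B | A): probability of a positive signal given A
p_b_given_not_a = 0.05       # P(B | not A): false-positive probability

p_b = p_b_given_a * prior + p_b_given_not_a * (1 - prior)  # law of total probability
posterior = p_b_given_a * prior / p_b                      # Bayes' theorem
print(round(posterior, 3))   # 0.161: at low prevalence most positives are false
```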

  10. Application of random survival forests in understanding the determinants of under-five child mortality in Uganda in the presence of covariates that satisfy the proportional and non-proportional hazards assumption.

    Science.gov (United States)

    Nasejje, Justine B; Mwambi, Henry

    2017-09-07

    Uganda, like many other Sub-Saharan African countries, has a high under-five child mortality rate. To inform policy on intervention strategies, sound statistical methods are required to critically identify factors strongly associated with under-five child mortality rates. The Cox proportional hazards model has been a common choice in analysing data to understand factors strongly associated with high child mortality rates, taking age as the time-to-event variable. However, due to its restrictive proportional hazards (PH) assumption, some covariates of interest which do not satisfy the assumption are often excluded from the analysis to avoid mis-specifying the model; using covariates that clearly violate the assumption would otherwise yield invalid results. Survival trees and random survival forests are increasingly becoming popular in analysing survival data, particularly in the case of large survey data, and could be attractive alternatives to models with the restrictive PH assumption. In this article, we adopt random survival forests, which have not previously been used to study factors affecting under-five child mortality rates in Uganda using Demographic and Health Survey data. Thus the first part of the analysis is based on the use of the classical Cox PH model and the second part of the analysis is based on the use of random survival forests in the presence of covariates that do not necessarily satisfy the PH assumption. Random survival forests and the Cox proportional hazards model agree that the sex of the household head, sex of the child, and number of births in the past year are strongly associated with under-five child mortality in Uganda; all three covariates satisfy the PH assumption. Random survival forests further demonstrated that covariates that were originally excluded from the earlier analysis due to violation of the PH assumption were important in explaining under-five child mortality rates. These covariates include the number of children under the
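
    The abstract does not name the software used for the forest analysis; purely as a hedged illustration, the sketch below fits a random survival forest to synthetic survey-style data with the Python package scikit-survival. The covariate names are hypothetical stand-ins for the survey variables discussed above.

```python
# Minimal random survival forest sketch on synthetic data (assumes the
# scikit-survival package; covariates are hypothetical binary survey items).
import numpy as np
from sksurv.ensemble import RandomSurvivalForest
from sksurv.util import Surv

rng = np.random.default_rng(0)
n = 500
X = rng.integers(0, 2, size=(n, 3)).astype(float)        # e.g. sex of child/head, recent births
time = rng.exponential(scale=30 + 10 * X[:, 0], size=n)  # age at death/censoring (months)
event = rng.random(n) < 0.3                              # True = death observed

y = Surv.from_arrays(event=event, time=time)             # structured survival outcome
rsf = RandomSurvivalForest(n_estimators=200, min_samples_leaf=15, random_state=0)
rsf.fit(X, y)
print(rsf.predict(X[:5]))                                # ensemble risk scores
```

    Unlike the Cox model, the forest imposes no proportional hazards assumption on any covariate, which is the property the article exploits.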

  11. A randomized trial of the effect of a plant-based dietary pattern on additional breast cancer events and survival: the Women's Healthy Eating and Living (WHEL) Study.

    Science.gov (United States)

    Pierce, John P; Faerber, Susan; Wright, Fred A; Rock, Cheryl L; Newman, Vicky; Flatt, Shirley W; Kealey, Sheila; Jones, Vicky E; Caan, Bette J; Gold, Ellen B; Haan, Mary; Hollenbach, Kathryn A; Jones, Lovell; Marshall, James R; Ritenbaugh, Cheryl; Stefanick, Marcia L; Thomson, Cynthia; Wasserman, Linda; Natarajan, Loki; Thomas, Ronald G; Gilpin, Elizabeth A

    2002-12-01

    The Women's Healthy Eating and Living (WHEL) Study is a multisite randomized controlled trial of the effectiveness of a high-vegetable, low-fat diet, aimed at markedly raising circulating carotenoid concentrations from food sources, in reducing additional breast cancer events and early death in women with early-stage invasive breast cancer (within 4 years of diagnosis). The study randomly assigned 3088 such women to an intensive diet intervention or to a comparison group between 1995 and 2000 and is expected to follow them through 2006. Two thirds of these women were under 55 years of age at randomization. This research study has a coordinating center and seven clinical sites. Randomization was stratified by age, stage of tumor and clinical site. A comprehensive intervention program that includes intensive telephone counseling, cooking classes and print materials helps shift the dietary pattern of women in the intervention. Through an innovative telephone counseling program, dietary counselors encourage women in the intervention group to meet the following daily behavioral targets: five vegetable servings, 16 ounces of vegetable juice, three fruit servings, 30 g of fiber and 15-20% energy from fat. Adherence assessments occur at baseline, 6, 12, 24 or 36, 48 and 72 months. These assessments can include dietary intake (repeated 24-hour dietary recalls and food frequency questionnaire), circulating carotenoid concentrations, physical measures and questionnaires about health symptoms, quality of life, personal habits and lifestyle patterns. Outcome assessments are completed by telephone interview every 6 months with medical record verification. We will assess evidence of effectiveness by the length of the breast cancer event-free interval, as well as by overall survival separately in all the women in the study as well as specifically in women under and over the age of 55 years.

  12. Bonded versus vacuum-formed retainers: a randomized controlled trial. Part 1: stability, retainer survival, and patient satisfaction outcomes after 12 months.

    Science.gov (United States)

    Forde, Katherine; Storey, Madeleine; Littlewood, Simon J; Scott, Paul; Luther, Friedy; Kang, Jing

    2017-10-20

    There is a shortage of evidence on the best type of retainer. Evaluate upper and lower bonded retainers (BRs) versus upper and lower vacuum-formed retainers (VFRs) over 12 months, in terms of stability, retainer survival, and patient satisfaction. Two-arm parallel-group multi-centre randomized controlled clinical trial. Sixty consecutive patients completing fixed appliance therapy and requiring retainers were recruited from 3 hospital departments. They were randomly allocated to either upper and lower labial segment BRs (n = 30) or upper and lower full-arch VFRs (n = 30). The primary outcome was stability. Secondary outcomes were retainer survival and patient satisfaction. A random sequence of treatment allocation was computer-generated and implemented using sequentially numbered opaque sealed envelopes independently prepared in advance. Patients, operators, and outcome assessment could not be blinded due to the nature of the intervention. Thirty patients received BRs (median [Mdn] age 16 years, inter-quartile range [IQR] = 2) and 30 received VFRs (Mdn age 17 years, IQR = 4). Baseline characteristics were similar between groups. At 12 months, there were no statistically significant inter-group differences in post-treatment change of maxillary labial segment alignment (BR = 1.1 mm, IQR = 1.56, VFR = 0.76 mm, IQR = 1.55, P = 0.61); however, there was greater post-treatment change in the mandibular VFR group (BR = 0.77 mm, IQR = 1.46, VFR = 1.69 mm, IQR = 2.00, P = 0.008). The difference in maxillary retainer survival rates was statistically non-significant, P = 0.34 (BR = 63.6%, 239.3 days, 95% confidence interval [CI] = 191.1-287.5, VFR = 73.3%, 311.1 days, 95% CI = 278.3-344.29). The mandibular BR had a lower survival rate (P = 0.01) at 12 months (BR = 50%, 239.3 days, 95% CI = 191.1-287.5, VFR = 80%, 324.9 days, 95% CI = 295.4-354.4). More subjects with VFRs reported discomfort (P = 0.002) and speech difficulties (P = 0.004) but found them easier to clean than those with BRs.

  13. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  14. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  15. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  16. A flexible and coherent test/estimation procedure based on restricted mean survival times for censored time-to-event data in randomized clinical trials.

    Science.gov (United States)

    Horiguchi, Miki; Cronin, Angel M; Takeuchi, Masahiro; Uno, Hajime

    2018-04-22

    In randomized clinical trials where time-to-event is the primary outcome, almost routinely, the logrank test is prespecified as the primary test and the hazard ratio is used to quantify treatment effect. If the ratio of 2 hazard functions is not constant, the logrank test is not optimal and the interpretation of the hazard ratio is not obvious. When such a nonproportional hazards case is expected at the design stage, the conventional practice is to prespecify another member of the weighted logrank tests, e.g., the Peto-Prentice-Wilcoxon test. Alternatively, one may specify a robust test as the primary test, which can capture various patterns of difference between 2 event time distributions. However, most of those tests do not have companion procedures to quantify the treatment difference, and investigators have fallen back on reporting treatment effect estimates not associated with the primary test. Such incoherence in the "test/estimation" procedure may potentially mislead clinicians/patients who have to balance risk-benefit for treatment decisions. To address this, we propose a flexible and coherent test/estimation procedure based on the restricted mean survival time, where the truncation time τ is selected in a data-dependent manner. The proposed procedure is composed of a prespecified test and an estimation of the corresponding robust and interpretable quantitative treatment effect. The utility of the new procedure is demonstrated by numerical studies based on 2 randomized cancer clinical trials; the test is dramatically more powerful than the logrank and Wilcoxon tests and the restricted mean survival time-based test with a fixed τ for the patterns of difference seen in these cancer clinical trials. Copyright © 2018 John Wiley & Sons, Ltd.
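
    To make the estimand concrete: the RMST up to a truncation time τ is the area under the survival curve on [0, τ]. The sketch below is a minimal NumPy illustration of that quantity computed from a Kaplan-Meier estimate; it is not the authors' procedure, which additionally selects τ from the data and builds a test around the estimate.

```python
# Restricted mean survival time: area under the Kaplan-Meier curve on [0, tau].
import numpy as np

def rmst(time, event, tau):
    """RMST up to tau from right-censored data (event=True means observed death)."""
    order = np.argsort(time)
    time, event = time[order], event[order]
    at_risk = len(time)
    surv, t_prev, area = 1.0, 0.0, 0.0
    for t, d in zip(time, event):
        if t > tau:
            break
        area += surv * (t - t_prev)      # S(t) is constant between event times
        if d:
            surv *= 1.0 - 1.0 / at_risk  # Kaplan-Meier drop at an observed death
        at_risk -= 1
        t_prev = t
    return area + surv * (tau - t_prev)  # last flat segment up to tau

rng = np.random.default_rng(1)
t, e = rng.exponential(12.0, size=200), rng.random(200) < 0.7
print(rmst(t, e, tau=24.0))
```

    A between-arm treatment effect would then be reported as a difference (or ratio) of the two arms' RMSTs at the same τ.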

  17. Prolonged survival in patients with breast cancer and a history of brain metastases: results of a preplanned subgroup analysis from the randomized phase III BEACON trial.

    Science.gov (United States)

    Cortés, Javier; Rugo, Hope S; Awada, Ahmad; Twelves, Chris; Perez, Edith A; Im, Seock-Ah; Gómez-Pardo, Patricia; Schwartzberg, Lee S; Diéras, Veronique; Yardley, Denise A; Potter, David A; Mailliez, Audrey; Moreno-Aspitia, Alvaro; Ahn, Jin-Seok; Zhao, Carol; Hoch, Ute; Tagliaferri, Mary; Hannah, Alison L; O'Shaughnessy, Joyce

    2017-09-01

    Conventional chemotherapy has limited activity in patients with breast cancer and brain metastases (BCBM). Etirinotecan pegol (EP), a novel long-acting topoisomerase-1 inhibitor, was designed using advanced polymer technology to preferentially accumulate in tumor tissue, including brain metastases, providing sustained cytotoxic SN38 levels. The phase 3 BEACON trial enrolled 852 women with heavily pretreated locally recurrent or metastatic breast cancer between 2011 and 2013. BEACON compared EP with treatment of physician's choice (TPC; eribulin, vinorelbine, gemcitabine, nab-paclitaxel, paclitaxel, ixabepilone, or docetaxel) in patients previously treated with anthracycline, taxane, and capecitabine, including those with treated, stable brain metastases. The primary endpoint, overall survival (OS), was assessed in a pre-defined subgroup of BCBM patients; an exploratory post hoc analysis adjusting for the diagnosis-specific graded prognostic assessment (GPA) index was also conducted. In the trial, 67 BCBM patients were randomized (EP, n = 36; TPC, n = 31). Treatment subgroups were balanced for baseline characteristics and GPA indices. EP was associated with a significant reduction in the risk of death (HR 0.51) compared with TPC. As in the overall BEACON population, fewer patients on EP experienced grade ≥3 toxicity (50 vs. 70%). The significant improvement in survival in BCBM patients provides encouraging data for EP in this difficult-to-treat subgroup of patients. A phase 3 trial of EP in BCBM patients is underway (ClinicalTrials.gov NCT02915744).

  18. Laparoscopic Complete Mesocolic Excision versus Open Complete Mesocolic Excision for Transverse Colon Cancer: Long-Term Survival Results of a Prospective Single Centre Non-Randomized Study.

    Science.gov (United States)

    Storli, Kristian Eeg; Eide, Geir Egil

    2016-01-01

    Laparoscopic complete mesocolic excision (CME) used in the treatment of transverse colon cancer has been questioned on the basis of the technical challenges. The aim of this study was to evaluate the medium- and long-term clinical and survival outcomes after laparoscopic and open CME for transverse colon cancer and to compare the 2 approaches. This study was a retrospective non-randomized study of patients with prospectively registered data on open and laparoscopic CME for transverse colon cancer tumour-node-metastasis stages I-III operated on between 2007 and 2014. This was a single-centre study in a community teaching hospital. A total of 56 patients with transverse colon cancer were included, excluding those with tumours in the colonic flexures. The outcome aims were 4-year time to recurrence (TTR) and cancer-specific survival (CSS). Morbidity was also measured. The 4-year TTR was 93.9% in the laparoscopic group and 91.3% in the open group (p = 0.71). The 4-year CSS was 97.0% in the laparoscopic group and 91.3% in the open group (p = 0.42). This was a prospective single-institution study with a small sample size. Results of the study suggest that the laparoscopic CME approach might be the preferred approach for transverse colon cancer, especially regarding its benefits in terms of short-term morbidity, length of stay and oncological outcome. © 2016 S. Karger AG, Basel.

  19. MRE11-Deficiency Associated with Improved Long-Term Disease Free Survival and Overall Survival in a Subset of Stage III Colon Cancer Patients in Randomized CALGB 89803 Trial

    Science.gov (United States)

    Pavelitz, Thomas; Renfro, Lindsay; Foster, Nathan R.; Caracol, Amber; Welsch, Piri; Lao, Victoria Valinluck; Grady, William B.; Niedzwiecki, Donna; Saltz, Leonard B.; Bertagnolli, Monica M.; Goldberg, Richard M.; Rabinovitch, Peter S.; Emond, Mary; Monnat, Raymond J.; Maizels, Nancy

    2014-01-01

    Purpose Colon cancers deficient in mismatch repair (MMR) may exhibit diminished expression of the DNA repair gene, MRE11, as a consequence of contraction of a T11 mononucleotide tract. This study investigated MRE11 status and its association with prognosis, survival and drug response in patients with stage III colon cancer. Patients and Methods Cancer and Leukemia Group B 89803 (Alliance) randomly assigned 1,264 patients with stage III colon cancer to postoperative weekly adjuvant bolus 5-fluorouracil/leucovorin (FU/LV) or irinotecan+FU/LV (IFL), with 8 year follow-up. Tumors from these patients were analyzed to determine stability of a T11 tract in the MRE11 gene. The primary endpoint was overall survival (OS), and a secondary endpoint was disease-free survival (DFS). Non-proportional hazards were addressed using time-dependent covariates in Cox analyses. Results Of 625 tumor cases examined, 70 (11.2%) exhibited contraction at the T11 tract in one or both MRE11 alleles and were thus predicted to be deficient in MRE11 (dMRE11). In pooled treatment analyses, dMRE11 patients showed initially reduced DFS and OS but improved long-term DFS and OS compared with patients with an intact MRE11 T11 tract. In the subgroup of dMRE11 patients treated with IFL, an unexplained early increase in mortality but better long-term DFS than IFL-treated pMRE11 patients was observed. Conclusions Analysis of this relatively small number of patients and events showed that the dMRE11 marker predicts better prognosis independent of treatment in the long-term. In subgroup analyses, dMRE11 patients treated with irinotecan exhibited unexplained short-term mortality. MRE11 status is readily assayed and may therefore prove to be a useful prognostic marker, provided that the results reported here for a relatively small number of patients can be generalized in independent analyses of larger numbers of samples. Trial Registration ClinicalTrials.gov NCT00003835 PMID:25310185

  20. MRE11-deficiency associated with improved long-term disease free survival and overall survival in a subset of stage III colon cancer patients in randomized CALGB 89803 trial.

    Directory of Open Access Journals (Sweden)

    Thomas Pavelitz

    Full Text Available Colon cancers deficient in mismatch repair (MMR) may exhibit diminished expression of the DNA repair gene, MRE11, as a consequence of contraction of a T11 mononucleotide tract. This study investigated MRE11 status and its association with prognosis, survival and drug response in patients with stage III colon cancer. Cancer and Leukemia Group B 89803 (Alliance) randomly assigned 1,264 patients with stage III colon cancer to postoperative weekly adjuvant bolus 5-fluorouracil/leucovorin (FU/LV) or irinotecan+FU/LV (IFL), with 8 year follow-up. Tumors from these patients were analyzed to determine stability of a T11 tract in the MRE11 gene. The primary endpoint was overall survival (OS), and a secondary endpoint was disease-free survival (DFS). Non-proportional hazards were addressed using time-dependent covariates in Cox analyses. Of 625 tumor cases examined, 70 (11.2%) exhibited contraction at the T11 tract in one or both MRE11 alleles and were thus predicted to be deficient in MRE11 (dMRE11). In pooled treatment analyses, dMRE11 patients showed initially reduced DFS and OS but improved long-term DFS and OS compared with patients with an intact MRE11 T11 tract. In the subgroup of dMRE11 patients treated with IFL, an unexplained early increase in mortality but better long-term DFS than IFL-treated pMRE11 patients was observed. Analysis of this relatively small number of patients and events showed that the dMRE11 marker predicts better prognosis independent of treatment in the long-term. In subgroup analyses, dMRE11 patients treated with irinotecan exhibited unexplained short-term mortality. MRE11 status is readily assayed and may therefore prove to be a useful prognostic marker, provided that the results reported here for a relatively small number of patients can be generalized in independent analyses of larger numbers of samples. ClinicalTrials.gov NCT00003835.

  1. Prostate cancer-specific survival among warfarin users in the Finnish Randomized Study of Screening for Prostate Cancer.

    Science.gov (United States)

    Kinnunen, Pete T T; Murtola, Teemu J; Talala, Kirsi; Taari, Kimmo; Tammela, Teuvo L J; Auvinen, Anssi

    2017-08-29

    Venous thromboembolic events (VTE) are common in cancer patients and associated with higher mortality. In vivo thrombosis and anticoagulation might be involved in tumor growth and progression. We studied the association of warfarin and other anticoagulant use as antithrombotic medication with prostate cancer (PCa) death in men with the disease. The study included 6,537 men diagnosed with PCa during 1995-2009. Information on anticoagulant use was obtained from a national reimbursement registry. Cox regression with adjustment for age, PCa risk group, primary therapy and use of other medication was performed to compare the risk of PCa death between warfarin users and (1) men using other types of anticoagulants and (2) non-users of anticoagulants. Medication use was analyzed as a time-dependent variable to minimize immortal time bias. In total, 728 men died from PCa during a median follow-up of 9 years. Compared to anticoagulant non-users, post-diagnostic use of warfarin was associated with an increased risk of PCa death (overall HR 1.47, 95% CI 1.13-1.93). However, this was limited to low-dose, low-intensity use. Otherwise, the risk was similar to anticoagulant non-users. Additionally, we found no risk difference between warfarin and other types of anticoagulants. Pre-diagnostic use of warfarin was not associated with the risk of PCa death. We found no reduction in the risk of PCa death associated with warfarin use. Conversely, the risk was increased in short-term use, which is probably explained by a higher risk of thrombotic events prompting warfarin use in patients with terminal PCa.
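
    The time-dependent treatment of medication use mentioned above is usually implemented by splitting each subject's follow-up into intervals of constant exposure. A minimal sketch, assuming the Python lifelines package (the paper does not name its software) and fully synthetic data:

```python
# Time-dependent exposure in a Cox model via (start, stop] intervals,
# which avoids immortal time bias; data and effect sizes are synthetic.
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(0)
rows = []
for i in range(300):
    starts_drug = rng.random() < 0.4
    switch = rng.uniform(0, 36)        # months until exposure starts, if it does
    end = rng.uniform(6, 96)           # end of follow-up
    died = int(rng.random() < 0.5)
    if starts_drug and switch < end:
        rows.append((i, 0.0, switch, 0, 0))     # unexposed interval
        rows.append((i, switch, end, 1, died))  # exposed interval
    else:
        rows.append((i, 0.0, end, 0, died))
df = pd.DataFrame(rows, columns=["id", "start", "stop", "warfarin", "event"])

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()  # HR for 'warfarin' should be near 1 in this null simulation
```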

  2. Lymphadenectomy extent and survival of patients with gastric carcinoma: a systematic review and meta-analysis of time-to-event data from randomized trials.

    Science.gov (United States)

    Mocellin, Simone; Nitti, Donato

    2015-05-01

    The extent of lymph node dissection in patients with resectable non-metastatic primary carcinoma of the stomach remains a matter of debate, particularly with regard to its effect on survival. We conducted a systematic review and meta-analysis of time-to-event data from randomized controlled trials (RCTs) comparing the three main types of lymphadenectomy (D1, D2, and D3) for gastric cancer. The hazard ratio (HR) was the effect measure for overall survival (OS), disease-specific survival (DSS) and disease-free survival (DFS). The quality of the available evidence was assessed using the GRADE system. Eight RCTs enrolling 2515 patients were eligible. The meta-analysis of four RCTs (n=1599) showed a significant impact of D2 versus D1 lymphadenectomy on DSS (summary HR=0.807, CI: 0.705-0.924, P=0.002), the corresponding number needed to treat being ten. This effect remained clinically valuable even after adjustment for postoperative mortality. However, the quality of evidence was graded as moderate due to inconsistency issues. When OS and DFS were considered, the meta-analysis of respectively five (n=1653) and three RCTs (n=1332) found no significant difference between D2 and D1 lymph node dissection (summary HR=0.911, CI: 0.708-1.172, P=0.471, and summary HR=0.946, CI: 0.840-1.066, P=0.366, respectively). However, at subgroup analysis D2 lymphadenectomy proved superior to D1 in terms of OS in the two RCTs carried out in Eastern countries (summary HR=0.627, CI: 0.396-0.994, P=0.047). As regards the D3 vs D2 comparison, the meta-analysis of the three available RCTs (n=862) showed no significant impact of more extended lymphadenectomy on OS (summary HR=0.990, CI: 0.814-1.205, P=0.924). Our findings support the superiority of D2 versus D1 lymphadenectomy in terms of survival benefit. However, this advantage is mainly limited to DSS, the level of evidence is moderate, and the interaction with other factors affecting patient survival (such as
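
    For orientation, the generic machinery behind summary HRs like those above is inverse-variance pooling of log hazard ratios; the sketch below shows the fixed-effect version with placeholder inputs (the review may well have used a random-effects variant, and its data are not reproduced here).

```python
# Fixed-effect inverse-variance pooling of log hazard ratios (placeholder data).
import math

def pooled_hr(hrs, ci_lo, ci_hi):
    logs = [math.log(h) for h in hrs]
    # standard errors recovered from 95% CI widths on the log scale
    ses = [(math.log(b) - math.log(a)) / (2 * 1.96) for a, b in zip(ci_lo, ci_hi)]
    ws = [1.0 / se**2 for se in ses]
    m = sum(w * l for w, l in zip(ws, logs)) / sum(ws)
    se = 1.0 / math.sqrt(sum(ws))
    return math.exp(m), math.exp(m - 1.96 * se), math.exp(m + 1.96 * se)

print(pooled_hr([0.85, 0.78, 0.92], [0.70, 0.60, 0.75], [1.03, 1.01, 1.13]))
```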

  3. Adjusting survival time estimates to account for treatment switching in randomized controlled trials--an economic evaluation context: methods, limitations, and recommendations.

    Science.gov (United States)

    Latimer, Nicholas R; Abrams, Keith R; Lambert, Paul C; Crowther, Michael J; Wailoo, Allan J; Morden, James P; Akehurst, Ron L; Campbell, Michael J

    2014-04-01

    Treatment switching commonly occurs in clinical trials of novel interventions in the advanced or metastatic cancer setting. However, methods to adjust for switching have been used inconsistently and potentially inappropriately in health technology assessments (HTAs). We present recommendations on the use of methods to adjust survival estimates in the presence of treatment switching in the context of economic evaluations. We provide background on the treatment switching issue and summarize methods used to adjust for it in HTAs. We discuss the assumptions and limitations associated with adjustment methods and draw on results of a simulation study to make recommendations on their use. We demonstrate that methods used to adjust for treatment switching have important limitations and often produce bias in realistic scenarios. We present an analysis framework that aims to increase the probability that suitable adjustment methods can be identified on a case-by-case basis. We recommend that the characteristics of clinical trials, and the treatment switching mechanism observed within them, should be considered alongside the key assumptions of the adjustment methods. Key assumptions include the "no unmeasured confounders" assumption associated with the inverse probability of censoring weights (IPCW) method and the "common treatment effect" assumption associated with the rank preserving structural failure time model (RPSFTM). The limitations associated with switching adjustment methods such as the RPSFTM and IPCW mean that they are appropriate in different scenarios. In some scenarios, both methods may be prone to bias; "2-stage" methods should be considered, and intention-to-treat analyses may sometimes produce the least bias. The data requirements of adjustment methods also have important implications for clinical trialists.

  4. Assessment of imatinib as first-line treatment of chronic myeloid leukemia: 10-year survival results of the randomized CML study IV and impact of non-CML determinants.

    Science.gov (United States)

    Hehlmann, R; Lauseker, M; Saußele, S; Pfirrmann, M; Krause, S; Kolb, H J; Neubauer, A; Hossfeld, D K; Nerl, C; Gratwohl, A; Baerlocher, G M; Heim, D; Brümmendorf, T H; Fabarius, A; Haferlach, C; Schlegelberger, B; Müller, M C; Jeromin, S; Proetel, U; Kohlbrenner, K; Voskanyan, A; Rinaldetti, S; Seifarth, W; Spieß, B; Balleisen, L; Goebeler, M C; Hänel, M; Ho, A; Dengler, J; Falge, C; Kanz, L; Kremers, S; Burchert, A; Kneba, M; Stegelmann, F; Köhne, C A; Lindemann, H W; Waller, C F; Pfreundschuh, M; Spiekermann, K; Berdel, W E; Müller, L; Edinger, M; Mayer, J; Beelen, D W; Bentz, M; Link, H; Hertenstein, B; Fuchs, R; Wernli, M; Schlegel, F; Schlag, R; de Wit, M; Trümper, L; Hebart, H; Hahn, M; Thomalla, J; Scheid, C; Schafhausen, P; Verbeek, W; Eckart, M J; Gassmann, W; Pezzutto, A; Schenk, M; Brossart, P; Geer, T; Bildat, S; Schäfer, E; Hochhaus, A; Hasford, J

    2017-11-01

    Chronic myeloid leukemia (CML)-study IV was designed to explore whether treatment with imatinib (IM) at 400 mg/day (n=400) could be optimized by doubling the dose (n=420), adding interferon (IFN) (n=430) or cytarabine (n=158), or using IM after IFN failure (n=128). From July 2002 to March 2012, 1551 newly diagnosed patients in chronic phase were randomized into a 5-arm study. The study was powered to detect a survival difference of 5% at 5 years. After a median observation time of 9.5 years, 10-year overall survival was 82%, 10-year progression-free survival was 80% and 10-year relative survival was 92%. Survival did not differ between IM400 mg and any experimental arm. In a multivariate analysis, risk group, major-route chromosomal aberrations, comorbidities, smoking and treatment center (academic vs other) influenced survival significantly, but no form of treatment optimization did. Patients reaching the molecular response milestones at 3, 6 and 12 months had a significant survival advantage. For responders, monotherapy with IM400 mg provides a close to normal life expectancy independent of the time to response. Survival is determined more by patient and disease factors than by initial treatment selection. Although improvements are also needed for refractory disease, more lifetime can currently be gained by carefully addressing non-CML determinants of survival.

  5. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events y, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
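
    As a concrete, hedged illustration of the definition (ours, not an example from the paper): a Gaussian model for a quantity that the evidence E says is non-negative leaks probability onto the impossible region y < 0.

```python
# Probability leakage of a normal model M for a quantity known to satisfy y >= 0.
from math import erf, sqrt

mu, sigma = 2.0, 1.5                                          # fitted model parameters
p_impossible = 0.5 * (1 + erf((0 - mu) / (sigma * sqrt(2))))  # P_M(y < 0)
print(p_impossible)                                           # ~0.091 leaked past E
```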

  6. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
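
    One classic instance of the theme (offered as an illustration; it may or may not be among the article's three problems) is the derangement problem: the probability that a random permutation has no fixed point tends to 1/e. A quick simulation:

```python
# Monte Carlo check that P(no fixed point) for a random permutation is ~ 1/e.
import math, random

def no_fixed_point(n):
    p = list(range(n))
    random.shuffle(p)
    return all(p[i] != i for i in range(n))

trials = 100_000
hits = sum(no_fixed_point(52) for _ in range(trials))
print(hits / trials, "vs 1/e =", 1 / math.e)
```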

  7. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  8. Probability, random processes, and ergodic properties

    CERN Document Server

    Gray, Robert M

    2014-01-01

    In this new edition of this classic text, much of the material has been rearranged and revised for pedagogical reasons. Many classic inequalities and proofs are now incorporated into the text, and many citations have been added.

  9. Statistical power in parallel group point exposure studies with time-to-event outcomes: an empirical comparison of the performance of randomized controlled trials and the inverse probability of treatment weighting (IPTW) approach.

    Science.gov (United States)

    Austin, Peter C; Schuster, Tibor; Platt, Robert W

    2015-10-15

    Estimating statistical power is an important component of the design of both randomized controlled trials (RCTs) and observational studies. Methods for estimating statistical power in RCTs have been well described and can be implemented simply. In observational studies, statistical methods must be used to remove the effects of confounding that can occur due to non-random treatment assignment. Inverse probability of treatment weighting (IPTW) using the propensity score is an attractive method for estimating the effects of treatment using observational data. However, sample size and power calculations have not been adequately described for these methods. We used an extensive series of Monte Carlo simulations to compare the statistical power of an IPTW analysis of an observational study with time-to-event outcomes with that of an analysis of a similarly structured RCT. We examined the impact of four factors on the statistical power function: number of observed events, prevalence of treatment, the marginal hazard ratio, and the strength of the treatment-selection process. We found that, on average, an IPTW analysis had lower statistical power compared to an analysis of a similarly structured RCT. The difference in statistical power increased as the strength of the treatment-selection process increased. The statistical power of an IPTW analysis tended to be lower than the statistical power of a similarly structured RCT.
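
    As a hedged sketch of the IPTW mechanics studied here (synthetic data, and a continuous outcome for brevity where the paper's setting is time-to-event, with the weights entering a weighted Cox model): treated subjects receive weight 1/e(x) and controls 1/(1 - e(x)), where e(x) is the estimated propensity score.

```python
# IPTW with a logistic propensity model; the weighting removes confounding
# induced by non-random treatment assignment. Synthetic data, true effect 2.0.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 5000
x = rng.normal(size=(n, 2))                                # confounders
p_treat = 1 / (1 + np.exp(-(0.8 * x[:, 0] - 0.5 * x[:, 1])))
z = rng.random(n) < p_treat                                # non-random assignment
y = 2.0 * z + x[:, 0] + rng.normal(size=n)                 # outcome

e = LogisticRegression().fit(x, z).predict_proba(x)[:, 1]  # propensity scores
w = np.where(z, 1 / e, 1 / (1 - e))                        # IPT weights

naive = y[z].mean() - y[~z].mean()
iptw = np.average(y[z], weights=w[z]) - np.average(y[~z], weights=w[~z])
print(f"naive {naive:.2f}  iptw {iptw:.2f}  truth 2.00")
```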

  10. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    Full Text Available The influence of "Grundbegriffe" by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory "calling for" an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, the dual maps to random variables) have very different "mathematical nature". Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events, elementary category theory, and covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing "fractions" of classical random events, and we upgrade the notions of probability measure and random variable.

  11. Radiographic Progression-Free Survival as a Clinically Meaningful End Point in Metastatic Castration-Resistant Prostate Cancer: The PREVAIL Randomized Clinical Trial.

    Science.gov (United States)

    Rathkopf, Dana E; Beer, Tomasz M; Loriot, Yohann; Higano, Celestia S; Armstrong, Andrew J; Sternberg, Cora N; de Bono, Johann S; Tombal, Bertrand; Parli, Teresa; Bhattacharya, Suman; Phung, De; Krivoshik, Andrew; Scher, Howard I; Morris, Michael J

    2018-05-01

    Drug development for metastatic castration-resistant prostate cancer has been limited by a lack of clinically relevant trial end points short of overall survival (OS). Radiographic progression-free survival (rPFS) as defined by the Prostate Cancer Clinical Trials Working Group 2 (PCWG2) is a candidate end point that represents a clinically meaningful benefit to patients. To demonstrate the robustness of the PCWG2 definition and to examine the relationship between rPFS and OS. PREVAIL was a phase 3, randomized, double-blind, placebo-controlled multinational study that enrolled 1717 chemotherapy-naive men with metastatic castration-resistant prostate cancer from September 2010 through September 2012. The data were analyzed in November 2016. Patients were randomized 1:1 to enzalutamide 160 mg or placebo until confirmed radiographic disease progression or a skeletal-related event and initiation of either cytotoxic chemotherapy or an investigational agent for prostate cancer treatment. Sensitivity analyses (SAs) of investigator-assessed rPFS were performed using the final rPFS data cutoff (May 6, 2012; 439 events; SA1) and the interim OS data cutoff (September 16, 2013; 540 events; SA2). Additional SAs using investigator-assessed rPFS from the final rPFS data cutoff assessed the impact of skeletal-related events (SA3), clinical progression (SA4), a confirmatory scan for soft-tissue disease progression (SA5), and all deaths regardless of time after study drug discontinuation (SA6). Correlations between investigator-assessed rPFS (SA2) and OS were calculated using Spearman ρ and Kendall τ via Clayton copula. In the 1717 men (mean age, 72.0 [range, 43.0-93.0] years in enzalutamide arm and 71.0 [range, 42.0-93.0] years in placebo arm), enzalutamide significantly reduced risk of radiographic progression or death in all SAs, with hazard ratios of 0.22 (SA1; 95% CI, 0.18-0.27), 0.31 (SA2; 95% CI, 0.27-0.35), 0.21 (SA3; 95% CI, 0.18-0.26), 0.21 (SA4; 95% CI, 0.17-0.26), 0

  12. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  13. Chemohormonal Therapy in Metastatic Hormone-Sensitive Prostate Cancer: Long-Term Survival Analysis of the Randomized Phase III E3805 CHAARTED Trial.

    Science.gov (United States)

    Kyriakopoulos, Christos E; Chen, Yu-Hui; Carducci, Michael A; Liu, Glenn; Jarrard, David F; Hahn, Noah M; Shevrin, Daniel H; Dreicer, Robert; Hussain, Maha; Eisenberger, Mario; Kohli, Manish; Plimack, Elizabeth R; Vogelzang, Nicholas J; Picus, Joel; Cooney, Matthew M; Garcia, Jorge A; DiPaola, Robert S; Sweeney, Christopher J

    2018-04-10

    Purpose Docetaxel added to androgen-deprivation therapy (ADT) significantly increases the longevity of some patients with metastatic hormone-sensitive prostate cancer. Herein, we present the outcomes of the CHAARTED (Chemohormonal Therapy Versus Androgen Ablation Randomized Trial for Extensive Disease in Prostate Cancer) trial with more mature follow-up and focus on tumor volume. Patients and Methods In this phase III study, 790 patients with metastatic hormone-sensitive prostate cancer were randomly assigned 1:1 to receive either ADT in combination with docetaxel 75 mg/m² for up to six cycles or ADT alone. The primary end point of the study was overall survival (OS). Additional analyses of the prospectively defined low- and high-volume disease subgroups were performed. High-volume disease was defined as presence of visceral metastases and/or ≥ four bone metastases with at least one outside of the vertebral column and pelvis. Results At a median follow-up of 53.7 months, the median OS was 57.6 months for the chemohormonal therapy arm versus 47.2 months for ADT alone (hazard ratio [HR], 0.72; 95% CI, 0.59 to 0.89; P = .0018). For patients with high-volume disease (n = 513), the median OS was 51.2 months with chemohormonal therapy versus 34.4 months with ADT alone (HR, 0.63; 95% CI, 0.50 to 0.79); for patients with low-volume disease, no OS benefit was observed (HR, 1.04; 95% CI, 0.70 to 1.55; P = .86). Conclusion The clinical benefit from chemohormonal therapy in prolonging OS was confirmed for patients with high-volume disease; however, for patients with low-volume disease, no OS benefit was discerned.

  14. Improved leukemia-free survival after postconsolidation immunotherapy with histamine dihydrochloride and interleukin-2 in acute myeloid leukemia: results of a randomized phase 3 trial.

    Science.gov (United States)

    Brune, Mats; Castaigne, Sylvie; Catalano, John; Gehlsen, Kurt; Ho, Anthony D; Hofmann, Wolf-Karsten; Hogge, Donna E; Nilsson, Bo; Or, Reuven; Romero, Ana I; Rowe, Jacob M; Simonsson, Bengt; Spearing, Ruth; Stadtmauer, Edward A; Szer, Jeff; Wallhult, Elisabeth; Hellstrand, Kristoffer

    2006-07-01

    The primary objective of this phase 3 study was to determine whether postconsolidation immunotherapy with interleukin-2 (IL-2) and histamine dihydrochloride (HDC) improved the leukemia-free survival (LFS) of adult patients with acute myeloid leukemia (AML) in complete remission (CR). Three hundred twenty patients with AML (median age, 57 years; range, 18-84 years) were stratified by CR1 or subsequent CR (CR > 1) and randomly assigned to treatment with HDC/IL-2 or no treatment (control). Treatment comprised ten 21-day cycles with IL-2 (16,400 U/kg) plus HDC (0.5 mg); both compounds were administered by subcutaneous injection twice daily. Study arms were balanced for age, sex, previous treatment, leukemic karyotypes, time from CR to inclusion, and frequency of secondary leukemia. Three years after enrollment of the last patient, treatment with HDC/IL-2 was found to improve LFS over control in the study population (CR1 + CR > 1, n = 320; P < .01, log-rank test). For patients in CR1 (n = 261), treatment significantly improved LFS (P = .01) with 3-year LFS estimates of 40% (HDC/IL-2) compared with 26% (control). Side effects were typically mild to moderate. These results indicate that HDC/IL-2 treatment offers an efficacious and tolerable treatment for patients with AML in remission.

  15. Randomized study of control of the primary tumor and survival using preoperative radiation, radiation alone, or surgery alone in head and neck carcinomas

    International Nuclear Information System (INIS)

    Hintz, B.; Charyulu, K.; Chandler, J.R.; Sudarsanam, A.; Garciga, C.

    1979-01-01

    Fifty-five selected patients with previously untreated squamous cell carcinoma of the head and neck regions were studied in a randomized, prospective manner. The three treatment categories were primary radiation (Gp R), primary surgery (Gp S), and preoperative radiation of 4000 rads in four weeks (Gp R/S). The local control rates for the 44 evaluable patients with a two-year minimum followup were 24%, 39%, and 43%, respectively. Further treatment attempts in patients failing initial therapy yielded local control rates of 35%, 39%, and 43% for Gp R, Gp S, and Gp R/S, respectively. Neither the local control rates nor the corresponding survival curves were significantly different at P < 0.10. However, the group sizes were sufficiently small that true differences might not have been detected. Postoperative complications were more frequent in primary radiation failures subsequently operated on than in the primary surgery group (P = 0.07). A table is included in which the types of postoperative complications are listed and enumerated according to treatment regimen.

  16. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
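
    A small simulation (ours, not the article's exact calculation) shows the headline effect for one member of the location-scale family named above: with a log-normal risk factor, a threshold set at the fitted 99th percentile from a finite sample fails more often than the nominal 1%.

```python
# Effect of parameter uncertainty on failure probability for a log-normal
# risk factor: threshold = fitted 99% quantile from n = 30 observations.
import numpy as np

rng = np.random.default_rng(3)
mu, sigma, n, nominal, trials = 0.0, 1.0, 30, 0.01, 20_000
fails = 0
for _ in range(trials):
    sample = rng.normal(mu, sigma, n)            # logs of observed risk factors
    m, s = sample.mean(), sample.std(ddof=1)
    threshold = m + 2.3263 * s                   # fitted 99% quantile (z_0.99)
    fails += rng.normal(mu, sigma) > threshold   # next period's risk factor
print(fails / trials, "vs nominal", nominal)     # expected frequency exceeds 1%
```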

  17. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  18. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives. Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: how travel time is affected by congestion, driving speed, and traffic lights; why different gambling ...

  19. Six weeks of structured exercise training and hypocaloric diet increases the probability of ovulation after clomiphene citrate in overweight and obese patients with polycystic ovary syndrome: a randomized controlled trial.

    Science.gov (United States)

    Palomba, S; Falbo, A; Giallauria, F; Russo, T; Rocca, M; Tolino, A; Zullo, F; Orio, F

    2010-11-01

    Clomiphene citrate (CC) is the first-line therapy for the induction of ovulation in infertile women with polycystic ovary syndrome (PCOS), but ∼20% of patients are unresponsive. The aim of the current study was to test the hypothesis that a 6-week intervention that consisted of structured exercise training (SET) and a hypocaloric diet increases the probability of ovulation after CC in overweight and obese CC-resistant PCOS patients. A cohort of 96 overweight and obese CC-resistant PCOS patients was enrolled consecutively in a three-arm randomized, parallel, controlled, assessor-blinded clinical trial. The three interventions were: SET plus hypocaloric diet for 6 weeks (Group A); 2 weeks of observation followed by one cycle of CC therapy (Group B); and SET plus hypocaloric diet for 6 weeks, with one cycle of CC after the first 2 weeks (Group C). The primary end point was the ovulation rate. Other reproductive data, as well as anthropometric, hormonal and metabolic data, were also collected and considered as secondary end points. After 6 weeks of SET plus hypocaloric diet, the ovulation rate was significantly (P = 0.008) higher in Group C [12/32 (37.5%)] than in Groups A [4/32 (12.5%)] and B [3/32 (9.4%)], with relative risks of 3.9 [95% confidence interval (CI) 1.1-8.3; P = 0.035] and 4.0 (95% CI 1.2-12.8; P = 0.020) compared with Groups A and B, respectively. Compared with baseline, in Groups A and C a significant improvement in clinical and biochemical androgen and insulin sensitivity indexes was observed. In the same two groups, the insulin sensitivity index improved significantly. In conclusion, a 6-week intervention of SET plus a hypocaloric diet was effective in increasing the probability of ovulation under CC treatment. The study was registered at ClinicalTrials.gov: NCT0100468.

  20. Survival curves for irradiated cells

    International Nuclear Information System (INIS)

    Gibson, D.K.

    1975-01-01

    The subject of the lecture is the probability of survival of biological cells which have been subjected to ionising radiation. The basic mathematical theories of cell survival as a function of radiation dose are developed. A brief comparison with observed survival curves is made. (author)
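
    The abstract does not say which dose-response theories the lecture develops; purely for orientation, one standard mathematical form for such survival curves is the linear-quadratic model, S(D) = exp(-(αD + βD²)), sketched here with hypothetical parameters.

```python
# Linear-quadratic cell survival curve (hypothetical alpha, beta parameters).
import numpy as np

alpha, beta = 0.15, 0.05                 # per Gy and per Gy^2, illustrative only
dose = np.linspace(0.0, 10.0, 6)         # absorbed dose in Gy
surviving = np.exp(-(alpha * dose + beta * dose**2))
for d, s in zip(dose, surviving):
    print(f"{d:4.1f} Gy -> surviving fraction {s:.4f}")
```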

  1. Individual Patient Data Analysis of Progression-Free Survival Versus Overall Survival As a First-Line End Point for Metastatic Colorectal Cancer in Modern Randomized Trials: Findings From the Analysis and Research in Cancers of the Digestive System Database

    NARCIS (Netherlands)

    Shi, Qian; de Gramont, Aimery; Grothey, Axel; Zalcberg, John; Chibaudel, Benoist; Schmoll, Hans-Joachim; Seymour, Matthew T.; Adams, Richard; Saltz, Leonard; Goldberg, Richard M.; Punt, Cornelis J. A.; Douillard, Jean-Yves; Hoff, Paulo M.; Hecht, Joel Randolph; Hurwitz, Herbert; Díaz-Rubio, Eduardo; Porschen, Rainer; Tebbutt, Niall C.; Fuchs, Charles; Souglakos, John; Falcone, Alfredo; Tournigand, Christophe; Kabbinavar, Fairooz F.; Heinemann, Volker; van Cutsem, Eric; Bokemeyer, Carsten; Buyse, Marc; Sargent, Daniel J.

    2015-01-01

    Purpose Progression-free survival (PFS) has previously been established as a surrogate for overall survival (OS) for first-line metastatic colorectal cancer (mCRC). Because mCRC treatment has advanced in the last decade with extended OS, this surrogacy requires re-examination. Methods Individual

  2. Is There a Role for Pelvic Irradiation in Localized Prostate Adenocarcinoma? Update of the Long-Term Survival Results of the GETUG-01 Randomized Study

    Energy Technology Data Exchange (ETDEWEB)

    Pommier, Pascal, E-mail: Pascal.pommier@lyon.unicancer.fr [Department of Radiation Oncology, Centre Léon Bérard, Lyon (France); Chabaud, Sylvie [Department of Clinical Research and Innovation, Centre Léon Bérard, Lyon (France); Lagrange, Jean-Leon [Department of Radiation Oncology, Centre Hospitalo-Universitaire H. Mondor, Créteil (France); Richaud, Pierre [Department of Radiation Oncology, Institut Bergognié, Bordeaux (France); Le Prise, Elisabeth [Department of Radiation Oncology, Centre Eugène Marquis, Rennes (France); Wagner, Jean-Philippe [Department of Radiation Oncology, Institut Andrée Dutreix, Dunkerque (France); Azria, David [Department of Radiation Oncology, Institut de Cancérologie de Montpellier, Montpellier (France); Beckendorf, Veronique [Department of Radiation Oncology, Institut de Cancérologie de Lorraine, Nancy (France); Suchaud, Jean-Philippe [Department of Radiation Oncology, Centre Hospitalier de Roanne, Roanne (France); Bernier, Valerie [Department of Radiation Oncology, Centre Oscar Lambret, Lille (France); Perol, David [Department of Clinical Research and Innovation, Centre Léon Bérard, Lyon (France); Carrie, Christian [Department of Radiation Oncology, Centre Léon Bérard, Lyon (France)

    2016-11-15

    Purpose: To report the long-term results of the French Genitourinary Study Group (GETUG)-01 study in terms of event-free survival (EFS) and overall survival (OS) and to assess the potential interaction between hormone therapy and pelvic node irradiation. Patients and Methods: Between December 1998 and June 2004, 446 patients with T1b-T3, N0pNx, M0 prostate carcinoma were randomly assigned to either pelvic-node-plus-prostate or prostate-only radiation therapy. Patients were stratified into 2 groups: “low risk” (T1-T2 and Gleason score 6 and prostate-specific antigen <3× the upper normal limit of the laboratory) (92 patients) versus “high risk” (T3 or Gleason score >6 or prostate-specific antigen >3× the upper normal limit of the laboratory). Short-term 6-month neoadjuvant and concomitant hormonal therapy was allowed only for high-risk patients. Radiation therapy was delivered with a 3-dimensional conformal technique, using a 4-field technique for the pelvic volume (46 Gy). The total dose recommended to the prostate moved from 66 Gy to 70 Gy during the course of the study. Events for EFS included biochemical prostate-specific antigen recurrence and/or local or metastatic progression. Results: With a median follow-up of 11.4 years, the 10-year OS and EFS were similar in the 2 treatment arms. A higher but nonsignificant EFS was observed in the low-risk subgroup in favor of pelvic node radiation therapy (77.2% vs 62.5%; P=.18). A post hoc subgroup analysis showed a significant benefit of pelvic irradiation when the risk of lymph node involvement was <15% (Roach formula). This benefit seemed to be limited to patients who did not receive hormonal therapy. Conclusion: Pelvic node irradiation did not statistically improve EFS or OS in the whole population but may be beneficial in selected low- and intermediate-risk prostate cancer patients treated with exclusive radiation therapy.

  3. Cost-effectiveness of invitation to food supplementation early in pregnancy combined with multiple micronutrients on infant survival: analysis of data from MINIMat randomized trial, Bangladesh.

    Science.gov (United States)

    Shaheen, Rubina; Persson, Lars Åke; Ahmed, Shakil; Streatfield, Peter Kim; Lindholm, Lars

    2015-05-28

    Absence of cost-effectiveness (CE) analyses limits the relevance of large-scale nutrition interventions in low-income countries. We analyzed whether the effect on infant survival of an invitation to food supplementation early in pregnancy, combined with multiple micronutrient supplements (MMS), represented value for money compared to invitation to food supplementation at the usual time in pregnancy combined with iron-folic acid. Outcome data, infant mortality (IM) rates, came from the MINIMat trial (Maternal and Infant Nutrition Interventions, Matlab; ISRCTN16581394). In MINIMat, women were randomized to early (E, around 9 weeks of pregnancy) or usual (U, around 20 weeks) invitation to food supplementation, and to daily doses of 30 mg or 60 mg of iron with 400 μg of folic acid, or MMS with 15 micronutrients including 30 mg iron and 400 μg of folic acid. In MINIMat, EMMS significantly reduced IM compared to UFe60F (U plus 60 mg iron and 400 μg folic acid). We present incremental CE ratios for moving from UFe60F to EMMS. Costing data came mainly from a published study. By moving from UFe60F to EMMS, one extra infant death could be averted at a cost of US$907 and US$797 for NGO-run and government-run community nutrition centres (CNCs), respectively, and at US$1024 for a hypothetical scenario of highest cost. These comparisons generated one extra life year (LY) saved at US$30, US$27, and US$34, respectively. Moving from UFe60F to EMMS in pregnancy seems worthwhile from health economic and public health standpoints. Maternal and Infant Nutrition Interventions, Matlab; ISRCTN16581394; date of registration: Feb 16, 2009.
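
    The cost-per-outcome figures above follow from the standard incremental cost-effectiveness ratio (ICER): the extra cost of the new strategy divided by the extra health effect. A minimal sketch, with hypothetical cohort numbers chosen only to reproduce the reported order of magnitude:

```python
def icer(delta_cost, delta_effect):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
    return delta_cost / delta_effect

# Hypothetical illustration only: if moving a cohort from UFe60F to EMMS cost an
# extra US$90,700 and averted 100 infant deaths, the ICER would be US$907 per
# death averted, matching the order of magnitude reported for NGO-run centres.
print(icer(90_700, 100))   # 907.0 US$ per infant death averted
```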

  4. Two-year survival analysis of twisted wire fixed retainer versus spiral wire and fiber-reinforced composite retainers: a preliminary explorative single-blind randomized clinical trial.

    Science.gov (United States)

    Sobouti, Farhad; Rakhshan, Vahid; Saravi, Mahdi Gholamrezaei; Zamanian, Ali; Shariati, Mahsa

    2016-03-01

    Traditional retainers (both metal and fiber-reinforced composite [FRC]) have limitations, and a retainer made from more flexible ligature wires might be advantageous. We aimed to compare an experimental design with two traditional retainers. In this prospective preliminary clinical trial, 150 post-treatment patients were enrolled and randomly divided into three groups of 50 patients each to receive mandibular canine-to-canine retainers made of FRC, flexible spiral wire (FSW), or twisted wire (TW). The patients were monitored monthly, and the time at which the first signs of breakage/debonding were detected was recorded. The success rates of the retainers were compared using chi-squared, Kaplan-Meier, and Cox proportional-hazards regression analyses (α = 0.05). In total, 42 patients in the FRC group, 41 in the FSW group, and 45 in the TW group completed the study. The 2-year failure rates were 35.7% in the FRC group, 26.8% in the FSW group, and 17.8% in the TW group; these rates did not differ significantly (chi-squared p = 0.167). According to the Kaplan-Meier analysis, the estimated survival time was 19.95 months in the FRC group, 21.37 months in the FSW group, and 22.36 months in the TW group. The differences between the survival rates in the three groups were not significant (Cox regression p = 0.146). Although the failure rate of the experimental retainer was half that of the FRC retainer, the difference was not statistically significant. The experimental TW retainer was successful, and larger studies are warranted to verify these results.
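
    For readers who want to reproduce this style of analysis, below is a hedged sketch using the lifelines library on synthetic follow-up data (the group names match the abstract, but the failure rates and 24-month censoring are assumptions, not the trial's data):

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import multivariate_logrank_test

rng = np.random.default_rng(0)
# Synthetic stand-in data: follow-up months (censored at 24) for three retainer groups.
groups, durations, failed = [], [], []
for name, rate in [("FRC", 0.020), ("FSW", 0.015), ("TW", 0.010)]:
    t = rng.exponential(1 / rate, size=45)
    groups += [name] * 45
    durations += list(np.minimum(t, 24.0))
    failed += list((t <= 24.0).astype(int))

# Kaplan-Meier curve per group.
for name in ("FRC", "FSW", "TW"):
    idx = [g == name for g in groups]
    km = KaplanMeierFitter().fit(
        np.array(durations)[idx], np.array(failed)[idx], label=name)
    print(name, "median survival:", km.median_survival_time_)

# Log-rank test across the three groups.
print(multivariate_logrank_test(durations, groups, failed).p_value)
```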

  5. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  6. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    47 CFR § 1.1623 (2010): Probability calculation. Federal Communications Commission, General Practice and Procedure, Random Selection Procedures for Mass Media Services, General Procedures. (a) All calculations shall be...

  7. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  8. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  9. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.

  10. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  11. Influence of complete administration of adjuvant chemotherapy cycles on overall and disease-free survival in locally advanced rectal cancer: post hoc analysis of a randomized, multicenter, non-inferiority, phase 3 trial.

    Science.gov (United States)

    Sandra-Petrescu, Flavius; Herrle, Florian; Burkholder, Iris; Kienle, Peter; Hofheinz, Ralf-Dieter

    2018-04-03

    A randomized trial demonstrated that capecitabine is at least as effective as fluorouracil in the adjuvant treatment of patients with locally advanced rectal cancer. However, not all patients receive all planned cycles of chemotherapy, so it is of interest how complete or partial administration of chemotherapy influences oncological outcome. A post hoc analysis of a trial with 401 randomized patients, nine of whom were excluded because of missing data, was performed. 392 patients (197 capecitabine, 195 fluorouracil) could be analyzed regarding the number of administered adjuvant chemotherapy cycles. In the subgroup of 361 patients with an overall survival of at least six months, five-year overall and disease-free survival were analyzed with respect to completion (complete vs. incomplete) of chemotherapy cycles. Survival rates and curves were calculated and compared using the log-rank test, and the effect of completion of chemotherapy was adjusted for relevant confounding factors. Two hundred fifty-one (64.0%) of the analyzed patients received all scheduled postoperative cycles; five-year overall survival was significantly better in these patients than in the incomplete group (76.0 vs. 60.6%). In the subgroup of patients surviving at least six months, five-year overall survival was also significantly better in the complete group than in the incomplete group (76.0 vs. 66.4%, p = 0.0073). Five-year disease-free survival was numerically better (64.9 vs. 58.7%, p = 0.0646; HR [not all cycles vs. all cycles] = 1.42, 95% CI 0.98 to 2.07). Cox regression models showed nonsignificantly better OS (p = 0.061) and DFS (p = 0.083) when all chemotherapy cycles were administered. Complete administration of chemotherapy cycles was associated with improved five-year overall and disease-free survival in patients with locally advanced rectal cancer.

  12. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal-probability-plot-based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
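
    As a rough illustration of the building block behind such intervals: the i-th order statistic of n uniform variables follows a Beta(i, n-i+1) distribution, which yields an interval for each plotted point. The sketch below shows only this pointwise construction; the paper's simultaneous 1-α intervals require a further calibration step that is not reproduced here:

```python
import numpy as np
from scipy import stats

def normal_plot_with_intervals(x, alpha=0.05):
    """Pointwise (not simultaneous) interval bands for a normal probability plot.

    The i-th order statistic of n uniforms is Beta(i, n - i + 1); mapping its
    quantiles through the fitted normal gives an interval for each plotted point.
    """
    x = np.sort(np.asarray(x))
    n = len(x)
    i = np.arange(1, n + 1)
    p = (i - 0.375) / (n + 0.25)              # plotting positions
    mu, sigma = x.mean(), x.std(ddof=1)       # fitted normal
    theory = stats.norm.ppf(p, mu, sigma)
    lo = stats.norm.ppf(stats.beta.ppf(alpha / 2, i, n - i + 1), mu, sigma)
    hi = stats.norm.ppf(stats.beta.ppf(1 - alpha / 2, i, n - i + 1), mu, sigma)
    return theory, lo, hi, x

theory, lo, hi, x = normal_plot_with_intervals(np.random.default_rng(1).normal(size=50))
print("points outside their pointwise band:", int(np.sum((x < lo) | (x > hi))))
```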

  13. Subdiffusivity of a random walk among a Poisson system of moving traps on ${\\mathbb Z}$

    OpenAIRE

    Athreya, Siva; Drewitz, Alexander; Sun, Rongfeng

    2016-01-01

    We consider a random walk among a Poisson system of moving traps on ${\\mathbb Z}$. In earlier work [DGRS12], the quenched and annealed survival probabilities of this random walk have been investigated. Here we study the path of the random walk conditioned on survival up to time $t$ in the annealed case and show that it is subdiffusive. As a by-product, we obtain an upper bound on the number of so-called thin points of a one-dimensional random walk, as well as a bound on the total volume of th...
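
    A crude Monte Carlo sketch of the annealed survival probability in this setting: a Poisson field of traps performing independent simple random walks, with the walker killed on coincidence. All parameters are illustrative, and the paper's conditioning on survival is not reproduced:

```python
import numpy as np

def annealed_survival(t_max=50, lam=0.5, trials=2000, window=60, seed=0):
    """Monte Carlo estimate of annealed survival among moving traps on Z.

    Traps start as a Poisson(lam) field on [-window, window] and perform
    independent simple random walks; the walker dies on coincidence.
    """
    rng = np.random.default_rng(seed)
    survived = np.zeros(t_max + 1)
    for _ in range(trials):
        counts = rng.poisson(lam, size=2 * window + 1)
        traps = np.repeat(np.arange(-window, window + 1), counts)
        walker, alive = 0, True
        for t in range(t_max + 1):
            alive = alive and not np.any(traps == walker)
            survived[t] += alive
            walker += rng.choice((-1, 1))
            traps = traps + rng.choice((-1, 1), size=traps.shape)
    return survived / trials

print(annealed_survival(t_max=20, trials=500)[::5])
```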

  14. Comparative efficacy, tolerability, and survival outcomes of various radiopharmaceuticals in castration-resistant prostate cancer with bone metastasis: a meta-analysis of randomized controlled trials

    Directory of Open Access Journals (Sweden)

    Tunio M

    2015-09-01

    Mutahir Tunio,1 Mushabbab Al Asiri,1 Abdulrehman Al Hadab,1 Yasser Bayoumi2 1Radiation Oncology, Comprehensive Cancer Center, King Fahad Medical City, Riyadh, Saudi Arabia; 2Radiation Oncology, National Cancer Institute, Cairo University, Cairo, Egypt Background: A meta-analysis was conducted to assess the impact of radiopharmaceuticals (RPs) in castration-resistant prostate cancer (CRPC) on pain control, symptomatic skeletal events (SSEs), toxicity profile, quality of life (QoL), and overall survival (OS). Materials and methods: The PubMed/MEDLINE, CANCERLIT, EMBASE, Cochrane Library database, and other search engines were searched to identify randomized controlled trials (RCTs) comparing RPs with control (placebo or radiation therapy) in metastatic CRPC. Data were extracted and assessed for the risk of bias (Cochrane's risk-of-bias tool). Pooled data were expressed as odds ratios (ORs) with 95% confidence intervals (CIs; Mantel–Haenszel fixed-effects model). Results: Eight RCTs with a total patient population of 1,877 patients were identified. The use of RPs was associated with significant reduction in pain intensity and SSEs (OR: 0.63, 95% CI: 0.51–0.78, I²=27%, P<0.0001), improved QoL (OR: 0.71, 95% CI: 0.55–0.91, I²=65%, three trials, 1,178 patients, P=0.006), and a minimal, nonsignificant improvement in OS (OR: 0.84, 95% CI: 0.64–1.04, I²=47%, seven trials, 1,845 patients, P=0.11). A subgroup analysis suggested an improved OS with radium-223 (OR: 0.68, 95% CI: 0.51–0.90, one trial, 921 patients) and strontium-89 (OR: 0.21, 95% CI: 0.05–0.91, one trial, 49 patients). Strontium-89 (five trials) was associated with increased rates of grade 3 and 4 thrombocytopenia (OR: 4.26, 95% CI: 2.22–8.18, P=0.01), leucopenia (OR: 7.98, 95% CI: 1.82–34.95, P=0.02), pain flare (OR: 6.82, 95% CI: 3.42–13.55, P=0.04), and emesis (OR: 3.61, 95% CI: 1.76–7.40, P=0.02). Conclusion: The use of RPs was associated with a significant reduction in SSEs and improved QoL, while the subgroup results for radium-223 also suggested an overall survival benefit.
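
    The pooling method named in the abstract, the Mantel–Haenszel fixed-effects model, combines the 2x2 table of each trial as sketched below; the tables are hypothetical, not the trials in this meta-analysis:

```python
def mantel_haenszel_or(tables):
    """Mantel-Haenszel fixed-effect pooled odds ratio.

    Each table is (a, b, c, d): events/non-events in the treatment arm (a, b)
    and the control arm (c, d) of one trial.
    OR_MH = sum(a*d/n) / sum(b*c/n), with n the trial's total size.
    """
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

# Hypothetical 2x2 tables from three trials.
trials = [(30, 70, 45, 55), (12, 88, 20, 80), (50, 150, 60, 140)]
print(round(mantel_haenszel_or(trials), 3))
```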

  15. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories, whether quantum or classical, is considered. The following points are discussed: (1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogorov) probabilities, formally speaking; (2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and (3) testing of the theory typically takes the form of confronting the expectation values of observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  16. Predictors for contrast media-induced nephropathy and long-term survival: Prospectively assessed data from the randomized controlled Dialysis-Versus-Diuresis (DVD) trial

    Science.gov (United States)

    Hölscher, Birgit; Heitmeyer, Christine; Fobker, Manfred; Breithardt, Günter; Schaefer, Roland M; Reinecke, Holger

    2008-01-01

    BACKGROUND: Among the numerous studies concerning contrast media-induced nephropathy (CIN), there was no prospective trial that provided data on the long-term outcomes. OBJECTIVES: To prospectively assess predictors of CIN and long-term outcomes of affected patients. METHODS: Four hundred twelve consecutive patients with serum creatinine levels of 115 μmol/L to 309 μmol/L (1.3 mg/dL to 3.5 mg/dL) undergoing elective coronary angiography were included. Patients were randomly assigned to periprocedural hydration alone, hydration plus one-time hemodialysis, or hydration plus N-acetylcysteine. RESULTS: Multivariate logistic regression identified the following as predictors of CIN within 72 h (defined as an increase in creatinine of 44.2 μmol/L [0.5 mg/dL] or more): prophylactic postprocedural hemodialysis (OR 2.86, 95% CI 1.07 to 7.69), use of angiotensin-converting enzyme inhibitors (OR 6.16, 95% CI 2.01 to 18.93), baseline glomerular filtration rate (OR 0.94, 95% CI 0.90 to 0.98) and the amount of contrast media given (OR 1.01, 95% CI 1.00 to 1.01). With regard to long-term outcome (mean follow-up 649 days), multivariate Cox regression models found elevated creatinine levels at 30 days (hazard rate ratio [HRR] 5.48, 95% CI 2.85 to 10.53), but not CIN within 72 h (HRR 1.12, 95% CI 0.63 to 2.02), to be associated with increased mortality. In addition, independent predictors for death during follow-up included left ventricular ejection fraction lower than 35% (HRR 4.01, 95% CI 2.22 to 7.26), serum phosphate (HRR 1.64, 95% CI 1.10 to 2.43) and hemoglobin (HRR 0.80, 95% CI 0.67 to 0.96). CONCLUSION: In the present prospective trial, performance of postprocedural hemodialysis, use of angiotensin-converting enzyme inhibitors, reduced baseline glomerular filtration rate and amount of contrast media were independent predictors of CIN within 72 h after catheterization. Assessing renal function after 30 days, rather than within 72 h, seemed to be more predictive of long-term outcome.
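
    A minimal sketch of the kind of multivariable logistic model described, fitted with statsmodels on synthetic data (the predictor names echo the abstract, but all coefficients and data are invented for illustration):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 400
# Synthetic stand-ins for the predictors named in the abstract.
df = pd.DataFrame({
    "post_hd": rng.integers(0, 2, n),        # prophylactic post-procedural dialysis
    "ace_inhibitor": rng.integers(0, 2, n),
    "gfr": rng.normal(45, 12, n),            # baseline GFR, mL/min
    "contrast_ml": rng.normal(150, 50, n),
})
# Invented data-generating model for the binary CIN outcome.
logit = (-1.0 + 1.0 * df.post_hd + 1.8 * df.ace_inhibitor
         - 0.06 * df.gfr + 0.008 * df.contrast_ml)
df["cin"] = rng.random(n) < 1 / (1 + np.exp(-logit))

exog = sm.add_constant(df[["post_hd", "ace_inhibitor", "gfr", "contrast_ml"]])
res = sm.Logit(df["cin"].astype(float), exog).fit(disp=0)
print(np.exp(res.params))   # odds ratios per predictor
```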

  17. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.
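
    The report's specific method is not reproduced in this record; a standard first-order estimate of the same quantity treats encounters as a Poisson process, P = 1 - exp(-flux x cross-section x time). A sketch under that assumption, with purely illustrative numbers:

```python
import math

def collision_probability(flux_per_m2_yr, cross_section_m2, years):
    """Poisson (kinetic-gas) estimate: P = 1 - exp(-flux * area * time).

    Assumes a uniform satellite/debris flux; this is a textbook estimate,
    not the specific method of the cited report.
    """
    return 1.0 - math.exp(-flux_per_m2_yr * cross_section_m2 * years)

# Illustrative numbers only.
print(collision_probability(flux_per_m2_yr=1e-6, cross_section_m2=500.0, years=10))
```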

  18. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introduction.

  19. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory. Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of various related topics.

  20. Risk estimation using probability machines

    Science.gov (United States)

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
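
    A hedged sketch of a "probability machine" in the sense described: a random forest used for conditional probability estimation, with a counterfactual shift of one predictor as a crude effect-size readout. The data-generating model and parameters are invented; this is not the authors' code:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)
X = rng.normal(size=(2000, 4))
# Logistic data-generating model with an interaction, as in the simulations.
p = 1 / (1 + np.exp(-(0.8 * X[:, 0] - 1.2 * X[:, 1] + 0.5 * X[:, 0] * X[:, 1])))
y = rng.random(2000) < p

rf = RandomForestClassifier(n_estimators=500, min_samples_leaf=20, random_state=0)
rf.fit(X, y)

# Conditional probability estimates, and a crude effect-size estimate for
# feature 0: the average change in predicted risk when it is shifted by +1.
X_shift = X.copy()
X_shift[:, 0] += 1.0
risk = rf.predict_proba(X)[:, 1]
risk_shift = rf.predict_proba(X_shift)[:, 1]
print("mean risk:", risk.mean(), "mean risk difference:", (risk_shift - risk).mean())
```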

  1. Intrauterine human chorionic gonadotropin infusion in oocyte donors promotes endometrial synchrony and induction of early decidual markers for stromal survival: a randomized clinical trial.

    Science.gov (United States)

    Strug, Michael R; Su, Renwei; Young, James E; Dodds, William G; Shavell, Valerie I; Díaz-Gimeno, Patricia; Ruíz-Alonso, Maria; Simón, Carlos; Lessey, Bruce A; Leach, Richard E; Fazleabas, Asgerally T

    2016-07-01

    Does a single intrauterine infusion of human chorionic gonadotropin (hCG) at the time corresponding to a Day 3 embryo transfer in oocyte donors induce favorable molecular changes in the endometrium for embryo implantation? Intrauterine hCG was associated with endometrial synchronization between endometrial glands and stroma following ovarian stimulation and the induction of early decidual markers associated with stromal cell survival. The clinical potential for increasing IVF success rates using an intrauterine hCG infusion prior to embryo transfer remains unclear based on previously reported positive and non-significant findings. However, infusion of CG in the non-human primate increases the expression of pro-survival early decidual markers important for endometrial receptivity, including α-smooth muscle actin (α-SMA) and NOTCH1. Oocyte donors (n = 15) were randomly assigned to receive an intrauterine infusion of 500 IU hCG (n = 7) or embryo culture media vehicle (n = 8) 3 days following oocyte retrieval during their donor stimulation cycle. Endometrial biopsies were performed 2 days later, followed by either RNA isolation or tissue fixation in formalin and paraffin embedding. Reverse transcription of total RNA from endometrial biopsies generated cDNA, which was used for analysis in the endometrial receptivity array (ERA; n = 5/group) or quantitative RT-PCR to determine relative expression of ESR1, PGR, C3 and NOTCH1. Tissue sections were stained with hematoxylin and eosin followed by blinded staging analysis for dating of endometrial glands and stroma. Immunostaining for ESR1, PGR, α-SMA, C3 and NOTCH1 was performed to determine their tissue localization. Intrauterine hCG infusion was associated with endometrial synchrony and reprogramming of stromal development following ovarian stimulation. ESR1 and PGR were significantly elevated in the endometrium of hCG-treated patients, consistent with earlier staging. The ERA did not predict an overall positive impact of intrauterine hCG on endometrial receptivity.

  2. Designing a monitoring program to estimate estuarine survival of anadromous salmon smolts: simulating the effect of sample design on inference

    Science.gov (United States)

    Romer, Jeremy D.; Gitelman, Alix I.; Clements, Shaun; Schreck, Carl B.

    2015-01-01

    A number of researchers have attempted to estimate salmonid smolt survival during outmigration through an estuary. However, it is currently unclear how the design of such studies influences the accuracy and precision of survival estimates. In this simulation study we consider four patterns of smolt survival probability in the estuary, and test the performance of several different sampling strategies for estimating estuarine survival assuming perfect detection. The four survival probability patterns each incorporate a systematic component (constant, linearly increasing, increasing and then decreasing, and two pulses) and a random component to reflect daily fluctuations in survival probability. Generally, spreading sampling effort (tagging) across the season resulted in more accurate estimates of survival. All sampling designs in this simulation tended to under-estimate the variation in the survival estimates because seasonal and daily variation in survival probability are not incorporated in the estimation procedure. This under-estimation results in poorer performance of estimates from larger samples. Thus, tagging more fish may not result in better estimates of survival if important components of variation are not accounted for. The results of our simulation incorporate survival probabilities and run distribution data from previous studies to help illustrate the tradeoffs among sampling strategies in terms of the number of tags needed and the distribution of tagging effort. This information will assist researchers in developing improved monitoring programs and encourage discussion regarding issues that should be addressed prior to implementation of any telemetry-based monitoring plan. We believe implementation of an effective estuary survival monitoring program will strengthen the robustness of life cycle models used in recovery plans by providing missing data on where and how much mortality occurs in the riverine and estuarine portions of smolt migration.
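
    The tradeoff the simulation describes can be illustrated in a few lines: when survival varies over the season, spreading tagging effort samples the whole survival curve, while concentrated tagging estimates only a local value. A toy version, assuming perfect detection as in the study (the seasonal survival curve and sample sizes are invented):

```python
import numpy as np

def simulate_design(release_days, fish_per_day, true_surv, reps=2000, seed=4):
    """Estimate seasonal survival by pooling tagged fish across release days.

    true_surv maps day -> survival probability; detection is assumed perfect.
    """
    rng = np.random.default_rng(seed)
    estimates = []
    for _ in range(reps):
        survivors = sum(rng.binomial(fish_per_day, true_surv[d]) for d in release_days)
        estimates.append(survivors / (fish_per_day * len(release_days)))
    return np.mean(estimates), np.std(estimates)

days = np.arange(100)
true_surv = 0.4 + 0.3 * np.sin(np.pi * days / 100)   # survival varies over the season
print("spread:      ", simulate_design(days[::10], 20, true_surv))
print("concentrated:", simulate_design(days[40:50], 20, true_surv))  # biased high
```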

  3. Prospective, Randomized, Double-Blind, Phase III Clinical Trial of Anti-T-Lymphocyte Globulin to Assess Impact on Chronic Graft-Versus-Host Disease-Free Survival in Patients Undergoing HLA-Matched Unrelated Myeloablative Hematopoietic Cell Transplantation.

    Science.gov (United States)

    Soiffer, Robert J; Kim, Haesook T; McGuirk, Joseph; Horwitz, Mitchell E; Johnston, Laura; Patnaik, Mrinal M; Rybka, Witold; Artz, Andrew; Porter, David L; Shea, Thomas C; Boyer, Michael W; Maziarz, Richard T; Shaughnessy, Paul J; Gergis, Usama; Safah, Hana; Reshef, Ran; DiPersio, John F; Stiff, Patrick J; Vusirikala, Madhuri; Szer, Jeff; Holter, Jennifer; Levine, James D; Martin, Paul J; Pidala, Joseph A; Lewis, Ian D; Ho, Vincent T; Alyea, Edwin P; Ritz, Jerome; Glavin, Frank; Westervelt, Peter; Jagasia, Madan H; Chen, Yi-Bin

    2017-12-20

    Purpose Several open-label randomized studies have suggested that in vivo T-cell depletion with anti-T-lymphocyte globulin (ATLG; formerly antithymocyte globulin-Fresenius) reduces chronic graft-versus-host disease (cGVHD) without compromising survival. We report a prospective, double-blind phase III trial to investigate the effect of ATLG (Neovii Biotech, Lexington, MA) on cGVHD-free survival. Patients and Methods Two hundred fifty-four patients 18 to 65 years of age with acute leukemia or myelodysplastic syndrome who underwent myeloablative HLA-matched unrelated hematopoietic cell transplantation (HCT) were randomly assigned one to one to placebo (n = 128) or ATLG (n = 126) treatment at 27 sites. Patients received either ATLG or placebo 20 mg/kg per day on days -3, -2, -1 in addition to tacrolimus and methotrexate as GVHD prophylaxis. The primary study end point was moderate-severe cGVHD-free survival. Results Despite a reduction in grade 2 to 4 acute GVHD (23% v 40%; P = .004) and moderate-severe cGVHD (12% v 33%; P < .001) in ATLG recipients, no difference in moderate-severe cGVHD-free survival between ATLG and placebo was found (2-year estimate: 48% v 44%, respectively; P = .47). Both progression-free survival (PFS) and overall survival (OS) were lower with ATLG (2-year estimate: 47% v 65% [P = .04] and 59% v 74% [P = .034], respectively). Multivariable analysis confirmed that ATLG was associated with inferior PFS (hazard ratio, 1.55; 95% CI, 1.05 to 2.28; P = .026) and OS (hazard ratio, 1.74; 95% CI, 1.12 to 2.71; P = .01). Conclusion In this prospective, randomized, double-blind trial of ATLG in unrelated myeloablative HCT, the incorporation of ATLG did not improve moderate-severe cGVHD-free survival. Moderate-severe cGVHD was significantly lower with ATLG, but PFS and OS also were lower. Additional analyses are needed to understand the appropriate role for ATLG in HCT.

  4. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  5. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particular...

  6. Impact of prior treatment and depth of response on survival in MM-003, a randomized phase 3 study comparing pomalidomide plus low-dose dexamethasone versus high-dose dexamethasone in relapsed/refractory multiple myeloma

    Science.gov (United States)

    San Miguel, Jesus F.; Weisel, Katja C.; Song, Kevin W.; Delforge, Michel; Karlin, Lionel; Goldschmidt, Hartmut; Moreau, Philippe; Banos, Anne; Oriol, Albert; Garderet, Laurent; Cavo, Michele; Ivanova, Valentina; Alegre, Adrian; Martinez-Lopez, Joaquin; Chen, Christine; Renner, Christoph; Bahlis, Nizar Jacques; Yu, Xin; Teasdale, Terri; Sternas, Lars; Jacques, Christian; Zaki, Mohamed H.; Dimopoulos, Meletios A.

    2015-01-01

    Pomalidomide is a distinct oral IMiD® immunomodulatory agent with direct antimyeloma, stromal-support inhibitory, and immunomodulatory effects. The pivotal, multicenter, open-label, randomized phase 3 trial MM-003 compared pomalidomide + low-dose dexamethasone vs high-dose dexamethasone in 455 patients with refractory or relapsed and refractory multiple myeloma after failure of bortezomib and lenalidomide treatment. Initial results demonstrated significantly longer progression-free survival and overall survival with an acceptable tolerability profile for pomalidomide + low-dose dexamethasone vs high-dose dexamethasone. This secondary analysis describes patient outcomes by treatment history and depth of response. Pomalidomide + low-dose dexamethasone significantly prolonged progression-free survival and favored overall survival vs high-dose dexamethasone for all subgroups analyzed, regardless of prior treatments or refractory status. Both univariate and multivariate analyses showed that no variable relating to either the number (≤ or > 3) or type of prior treatment was a significant predictor of progression-free survival or overall survival. No cross-resistance with prior lenalidomide or thalidomide treatment was observed. Patients achieving a minimal response or better to pomalidomide + low-dose dexamethasone treatment experienced a survival benefit, which was even higher in those achieving at least a partial response (17.2 and 19.9 months, respectively, as compared with 7.5 months for patients with less than minimal response). These data suggest that pomalidomide + low-dose dexamethasone should be considered a standard of care in patients with refractory or relapsed and refractory multiple myeloma regardless of prior treatment. ClinicalTrials.gov: NCT01311687; EudraCT: 2010-019820-30. PMID:26160879

  7. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  8. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  9. Enhanced secondary analysis of survival data: reconstructing the data from published Kaplan-Meier survival curves

    Directory of Open Access Journals (Sweden)

    Guyot Patricia

    2012-02-01

    Abstract Background The results of Randomized Controlled Trials (RCTs) on time-to-event outcomes that are usually reported are median time to events and the Cox hazard ratio. These do not constitute the sufficient statistics required for meta-analysis or cost-effectiveness analysis, and their use in secondary analyses requires strong assumptions that may not have been adequately tested. In order to enhance the quality of secondary data analyses, we propose a method which derives from the published Kaplan-Meier survival curves a close approximation to the original individual patient time-to-event data from which they were generated. Methods We develop an algorithm that maps from digitised curves back to KM data by finding numerical solutions to the inverted KM equations, using, where available, information on the number of events and numbers at risk. The reproducibility and accuracy of survival probabilities, median survival times and hazard ratios based on reconstructed KM data was assessed by comparing published statistics (survival probabilities, medians and hazard ratios) with statistics based on repeated reconstructions by multiple observers. Results The validation exercise established there was no material systematic error and that there was a high degree of reproducibility for all statistics. Accuracy was excellent for survival probabilities and medians; for hazard ratios, reasonable accuracy can only be obtained if at least numbers at risk or the total number of events are reported. Conclusion The algorithm is a reliable tool for meta-analysis and cost-effectiveness analyses of RCTs reporting time-to-event data. It is recommended that all RCTs should report information on numbers at risk and total number of events alongside KM curves.

  10. Enhanced secondary analysis of survival data: reconstructing the data from published Kaplan-Meier survival curves.

    Science.gov (United States)

    Guyot, Patricia; Ades, A E; Ouwens, Mario J N M; Welton, Nicky J

    2012-02-01

    The results of Randomized Controlled Trials (RCTs) on time-to-event outcomes that are usually reported are median time to events and the Cox hazard ratio. These do not constitute the sufficient statistics required for meta-analysis or cost-effectiveness analysis, and their use in secondary analyses requires strong assumptions that may not have been adequately tested. In order to enhance the quality of secondary data analyses, we propose a method which derives from the published Kaplan-Meier survival curves a close approximation to the original individual patient time-to-event data from which they were generated. We develop an algorithm that maps from digitised curves back to KM data by finding numerical solutions to the inverted KM equations, using, where available, information on the number of events and numbers at risk. The reproducibility and accuracy of survival probabilities, median survival times and hazard ratios based on reconstructed KM data was assessed by comparing published statistics (survival probabilities, medians and hazard ratios) with statistics based on repeated reconstructions by multiple observers. The validation exercise established there was no material systematic error and that there was a high degree of reproducibility for all statistics. Accuracy was excellent for survival probabilities and medians; for hazard ratios, reasonable accuracy can only be obtained if at least numbers at risk or the total number of events are reported. The algorithm is a reliable tool for meta-analysis and cost-effectiveness analyses of RCTs reporting time-to-event data. It is recommended that all RCTs should report information on numbers at risk and total number of events alongside KM curves.
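
    The core inversion step of such an algorithm can be sketched compactly: between digitised points, the Kaplan-Meier recursion S_k = S_{k-1}(1 - d_k/n_k) is solved for the number of events d_k. The version below ignores censoring entirely and is therefore only a naive illustration of the idea, not the published algorithm, which additionally distributes censoring using the reported numbers at risk:

```python
def invert_km(surv, n_at_risk_start):
    """Naive inversion of the Kaplan-Meier equations (no censoring assumed).

    Given digitised survival probabilities S_0 = 1, S_1, ..., S_K at the
    curve's steps and the initial number at risk, recover events per step
    via d_k = n_k * (1 - S_k / S_{k-1}).
    """
    events, n = [], n_at_risk_start
    for s_prev, s_curr in zip(surv, surv[1:]):
        d = round(n * (1 - s_curr / s_prev))
        events.append(d)
        n -= d
    return events

# Digitised curve (illustrative): survival at successive step times.
print(invert_km([1.0, 0.95, 0.90, 0.82, 0.75], n_at_risk_start=100))  # [5, 5, 8, 7]
```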

  11. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory. Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. It is now re-issued in a new style and format, but with the reliable content that the third edition was revered for.

  12. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  13. Probability densities and the radon variable transformation theorem

    International Nuclear Information System (INIS)

    Ramshaw, J.D.

    1985-01-01

    D. T. Gillespie recently derived a random variable transformation theorem relating the joint probability densities of functionally dependent sets of random variables. The present author points out that the theorem can be derived as an immediate corollary of a simpler and more fundamental relation, in which the probability density is represented as a delta function averaged over an unspecified distribution of unspecified internal random variables. The random variable transformation theorem is derived from this relation.
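
    In the usual notation, the relation described reads as follows (a standard identity, written here for a single transformed variable Y = f(X)):

```latex
p_Y(y) \;=\; \bigl\langle \delta\bigl(y - f(X)\bigr) \bigr\rangle
       \;=\; \int \delta\bigl(y - f(x)\bigr)\, p_X(x)\, \mathrm{d}x ,
```

    which for invertible f reduces to the familiar change-of-variables formula $p_Y(y) = p_X\bigl(f^{-1}(y)\bigr)\,\bigl|\mathrm{d}f^{-1}(y)/\mathrm{d}y\bigr|$.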

  14. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  15. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability, across psychological aspects of formulating subjective probability statements and abstract measure-theoretical considerations, to contributions to theoretical statistics and...

  16. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  17. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fields.

  18. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton‘s laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world‘s foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their  explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive. 

  19. Modelling survival

    DEFF Research Database (Denmark)

    Ashauer, Roman; Albert, Carlo; Augustine, Starrlight

    2016-01-01

    The General Unified Threshold model for Survival (GUTS) integrates previously published toxicokinetic-toxicodynamic models and estimates survival with explicitly defined assumptions. Importantly, GUTS accounts for time-variable exposure to the stressor. We performed three studies to test...

  20. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.
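
    The classical analogue mentioned is the residence-time density of a periodic orbit: the chance of finding the particle near a point is proportional to the time spent there. For a one-dimensional orbit of period T this gives the standard result below (a textbook identity, not the paper's derivation):

```latex
p_{\mathrm{cl}}(x) \;=\; \frac{2}{T\,\lvert v(x)\rvert } ,
```

    since each position on the orbit is visited twice per period at speed |v(x)|.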

  1. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  2. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  3. Survival analysis

    International Nuclear Information System (INIS)

    Badwe, R.A.

    1999-01-01

    The primary endpoint in the majority of studies has been either disease recurrence or death. This kind of analysis requires a special method because not all patients in the study experience the endpoint, so their follow-up times are censored. The standard method for estimating such a survival distribution is the Kaplan-Meier method. The survival function is defined as the proportion of individuals who survive beyond a certain time. Multivariate comparisons of survival are carried out with Cox's proportional hazards model.
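
    A from-scratch sketch of the Kaplan-Meier estimator matching this definition: each event time multiplies the survival estimate by 1 - d/n, while censored patients leave the risk set without contributing an event. The data below are invented:

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier estimate: S(t) = prod over event times t_i <= t of (1 - d_i/n_i)."""
    times, events = np.asarray(times, float), np.asarray(events, int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    n_at_risk = len(times)
    surv, s = [], 1.0
    for t in np.unique(times):
        at_t = times == t
        d = events[at_t].sum()            # deaths/recurrences at t
        if d > 0:
            s *= 1 - d / n_at_risk
        surv.append((t, s))
        n_at_risk -= at_t.sum()           # deaths and censorings leave the risk set
    return surv

# times in months; event=1 is death/recurrence, 0 is censored follow-up
print(kaplan_meier([5, 8, 8, 12, 16, 23], [1, 1, 0, 1, 0, 1]))
```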

  4. Individual patient data analysis of progression-free survival versus overall survival as a first-line end point for metastatic colorectal cancer in modern randomized trials: findings from the analysis and research in cancers of the digestive system database.

    Science.gov (United States)

    Shi, Qian; de Gramont, Aimery; Grothey, Axel; Zalcberg, John; Chibaudel, Benoist; Schmoll, Hans-Joachim; Seymour, Matthew T; Adams, Richard; Saltz, Leonard; Goldberg, Richard M; Punt, Cornelis J A; Douillard, Jean-Yves; Hoff, Paulo M; Hecht, Joel Randolph; Hurwitz, Herbert; Díaz-Rubio, Eduardo; Porschen, Rainer; Tebbutt, Niall C; Fuchs, Charles; Souglakos, John; Falcone, Alfredo; Tournigand, Christophe; Kabbinavar, Fairooz F; Heinemann, Volker; Van Cutsem, Eric; Bokemeyer, Carsten; Buyse, Marc; Sargent, Daniel J

    2015-01-01

    Progression-free survival (PFS) has previously been established as a surrogate for overall survival (OS) for first-line metastatic colorectal cancer (mCRC). Because mCRC treatment has advanced in the last decade with extended OS, this surrogacy requires re-examination. Individual patient data from 16,762 patients were available from 22 first-line mCRC studies conducted from 1997 to 2006; 12 of those studies tested antiangiogenic and/or anti-epidermal growth factor receptor agents. The relationship between PFS (first event of progression or death) and OS was evaluated by using R² statistics (the closer the value is to 1, the stronger the correlation) from weighted least squares regression of trial-specific hazard ratios estimated by using Cox and Copula models. Forty-four percent of patients received a regimen that included biologic agents. Median first-line PFS was 8.3 months, and median OS was 18.2 months. The correlation between PFS and OS was modest (R², 0.45 to 0.69). Analyses limited to trials that tested treatments with biologic agents, nonstrategy trials, or superiority trials did not improve surrogacy. In modern mCRC trials, in which survival after the first progression exceeds time to first progression, a positive but modest correlation was observed between OS and PFS at both the patient and trial levels. This finding demonstrates the substantial variability in OS introduced by the number of lines of therapy and types of effective subsequent treatments and the associated challenge to the use of OS as an end point to assess the benefit attributable to a single line of therapy. PFS remains an appropriate primary end point for first-line mCRC trials to detect the direct treatment effect of new agents. © 2014 by American Society of Clinical Oncology.
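
    The trial-level surrogacy metric described, an R² from weighted least squares regression of trial-specific hazard ratios, can be sketched as follows (the log hazard ratios and weights below are hypothetical, not values from the 22 studies):

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical trial-level log hazard ratios and weights (e.g. trial sizes).
log_hr_pfs = np.array([-0.35, -0.22, -0.10, 0.05, -0.28, -0.15])
log_hr_os  = np.array([-0.20, -0.15, -0.02, 0.08, -0.12, -0.10])
weights    = np.array([400, 650, 300, 500, 800, 450])

# Weighted least squares of OS effect on PFS effect; R^2 measures surrogacy.
wls = sm.WLS(log_hr_os, sm.add_constant(log_hr_pfs), weights=weights).fit()
print("trial-level R^2:", round(wls.rsquared, 2))   # closer to 1 = stronger surrogacy
```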

  5. Influence of intravenous amifostine on xerostomia, tumor control, and survival after radiotherapy for head-and- neck cancer: 2-year follow-up of a prospective, randomized, phase III trial

    International Nuclear Information System (INIS)

    Wasserman, Todd H.; Brizel, David M.; Henke, Michael; Monnier, Alain; Eschwege, Francois; Sauer, Rolf; Strnad, Vratislav

    2005-01-01

    Purpose: To evaluate chronic xerostomia and tumor control 18 and 24 months after initial treatment with amifostine in a randomized controlled trial of patients with head-and-neck cancer; at 12 months after radiotherapy (RT), amifostine had been shown to reduce xerostomia without changing tumor control. Methods and Materials: Adults with head-and-neck cancer who underwent once-daily RT for 5-7 weeks (total dose, 50-70 Gy) received either open-label amifostine (200 mg/m² i.v.) 15-30 min before each fraction of radiation (n = 150) or RT alone (control; n = 153). Results: Amifostine administration was associated with a reduced incidence of Grade ≥2 xerostomia over 2 years of follow-up (p = 0.002), an increase in the proportion of patients with meaningful (>0.1 g) unstimulated saliva production at 24 months (p = 0.011), and reduced mouth dryness scores on a patient benefit questionnaire at 24 months (p < 0.001). Locoregional control rate, progression-free survival, and overall survival were not significantly different between the amifostine group and the control group. Conclusions: Amifostine administration during head-and-neck RT reduces the severity and duration of xerostomia 2 years after treatment and does not seem to compromise locoregional control rates, progression-free survival, or overall survival.

  6. Network ties and survival

    DEFF Research Database (Denmark)

    Acheampong, George; Narteh, Bedman; Rand, John

    2017-01-01

    Poultry farming has been touted as one of the major ways by which poverty can be reduced in low-income economies like Ghana. Yet, anecdotally there is a high failure rate among these poultry farms. This study seeks to understand the relationship between network ties and the survival chances of small commercial poultry farms (SCPFs). We utilize data from a 2-year network survey of SCPFs in rural Ghana. The survival of these poultry farms is modelled using a lagged probit model of farms that persisted from 2014 into 2015. We find that network ties are important to the survival chances of SCPFs, but this probability falls as the number of industry ties increases; moderation by the dynamic capability of the firm reverses this trend. Our findings show that not all network ties aid survival, and small commercial poultry farmers therefore need to be circumspect in the network ties they cultivate and develop.

  7. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  8. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample-size-invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and overfitting the data, as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.
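
    The quantile residual diagnostic rests on a simple fact: if the estimated CDF is accurate, the sample pushed through it behaves like uniform order statistics, whose means and standard deviations are known in closed form. The sketch below is a minimal illustration of that idea (the paper's exact scaling may differ); the exponential fit to normal data is a deliberately bad estimate, so its residuals drift far from zero.

    ```python
    import numpy as np
    from scipy import stats

    def scaled_quantile_residuals(sample, cdf):
        """Push the sample through an estimated CDF and compare the sorted
        result with the mean/sd of uniform order statistics (Beta(k, n+1-k))."""
        u = np.sort(cdf(sample))
        n = len(u)
        k = np.arange(1, n + 1)
        mean = k / (n + 1)
        sd = np.sqrt(k * (n + 1 - k) / (n + 2)) / (n + 1)
        return (u - mean) / sd

    rng = np.random.default_rng(0)
    x = rng.normal(loc=3.0, scale=1.0, size=500)
    good = scaled_quantile_residuals(x, stats.norm(loc=x.mean(), scale=x.std()).cdf)
    bad = scaled_quantile_residuals(x, stats.expon(scale=x.mean()).cdf)
    print("max |residual|, correct family:", np.abs(good).max())
    print("max |residual|, wrong family:  ", np.abs(bad).max())
    ```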

  9. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)

  10. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  11. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of a ranking, we point out and show that ranking by quantum probability yields a higher probability of detection than ranking by classical probability provided a given probability of ...

  12. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look-out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...

  13. Orthogonal Algorithm of Logic Probability and Syndrome-Testable Analysis

    Institute of Scientific and Technical Information of China (English)

    1990-01-01

    A new method, the orthogonal algorithm, is presented to compute logic probabilities (i.e., signal probabilities) accurately. The transfer properties of logic probabilities are studied first; these are useful for calculating the logic probability of a circuit with random independent inputs. The orthogonal algorithm is then described for computing the logic probability of a Boolean function realized by a combinational circuit. This algorithm makes the Boolean function "orthogonal" so that the logic probabilities can be easily calculated by summing the logic probabilities of all orthogonal terms of the Boolean function.
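
    The key property, that the probability of a disjunction of pairwise-disjoint (orthogonal) terms is the sum of the term probabilities, is easy to demonstrate. The sketch below is not the paper's algorithm: it brute-forces disjoint terms by full Shannon expansion down to minterms (which a real orthogonalization would avoid), but it shows the summation step for independent inputs.

    ```python
    def shannon_disjoint_terms(f, n, assignment=()):
        """Recursively Shannon-expand a Boolean function f over n inputs into
        pairwise-disjoint terms (here: complete 0/1 assignments)."""
        if len(assignment) == n:
            return [assignment] if f(assignment) else []
        return (shannon_disjoint_terms(f, n, assignment + (0,)) +
                shannon_disjoint_terms(f, n, assignment + (1,)))

    def signal_probability(terms, p):
        """Sum the probabilities of disjoint terms; p[i] = P(input i = 1),
        inputs assumed independent."""
        total = 0.0
        for term in terms:
            prob = 1.0
            for bit, pi in zip(term, p):
                prob *= pi if bit else (1.0 - pi)
            total += prob
        return total

    # Example: f = x0 AND (x1 OR x2)
    f = lambda x: x[0] and (x[1] or x[2])
    terms = shannon_disjoint_terms(f, 3)
    print(signal_probability(terms, [0.5, 0.5, 0.5]))   # 0.375
    ```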

  14. Essays on Subjective Survival Probabilities, Consumption, and Retirement Decisions

    NARCIS (Netherlands)

    Kutlu Koc, Vesile

    2015-01-01

    Recent pension reforms in industrialized countries are, in part, motivated by increased life expectancy. As individuals are expected to take more responsibility in their retirement planning and savings decisions, it is important to understand whether they are aware of improvements in life expectancy.

  15. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments...... that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  16. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  17. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  18. Integration, measure and probability

    CERN Document Server

    Pitt, H R

    2012-01-01

    Introductory treatment develops the theory of integration in a general context, making it applicable to other branches of analysis. More specialized topics include convergence theorems and random sequences and functions. 1963 edition.

  19. Long-term survival results of a randomized trial comparing gemcitabine/cisplatin and methotrexate/vinblastine/doxorubicin/cisplatin in patients with locally advanced and metastatic bladder cancer

    DEFF Research Database (Denmark)

    Roberts, J. T.; Maase, Hans von der; Sengeløv, Lisa

    2006-01-01

    Purpose: To compare long-term survival in patients with locally advanced and metastatic transitional cell carcinoma (TCC) of the urothelium treated with gemcitabine plus cisplatin (GC) or methotrexate/vinblastine/doxorubicin/cisplatin (MVAC). PATIENTS AND METHODS: Efficacy....... CONCLUSIONS: Long-term overall and progression-free survival following treatment with GC or MVAC are similar. These results strengthen the role of GC as a standard of care in patients with locally advanced and metastatic transitional-cell carcinoma (TCC)....

  20. Long-term survival results of a randomized trial comparing gemcitabine plus cisplatin, with methotrexate, vinblastine, doxorubicin, plus cisplatin in patients with bladder cancer

    DEFF Research Database (Denmark)

    Maase, Hans von der; Sengeløv, Lisa; Roberts, James T.

    2005-01-01

    PURPOSE: To compare long-term survival in patients with locally advanced or metastatic transitional cell carcinoma (TCC) of the urothelium treated with gemcitabine/cisplatin (GC) or methotrexate/vinblastine/doxorubicin/cisplatin (MVAC). PATIENTS AND METHODS: Efficacy data...... in patients with locally advanced or metastatic TCC...

  1. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  2. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  3. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    Kim, Y.K.

    1980-01-01

    Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods

  4. Association between time to disease progression end points and overall survival in patients with neuroendocrine tumors

    Directory of Open Access Journals (Sweden)

    Singh S

    2014-08-01

    Simron Singh (Sunnybrook Odette Cancer Center, University of Toronto, Toronto, ON, Canada), Xufang Wang (Novartis Oncology, Florham Park, NJ, USA), Calvin H. L. Law (Sunnybrook Odette Cancer Center, University of Toronto, Toronto, ON, Canada). Abstract: Overall survival can be difficult to determine for slowly progressing malignancies, such as neuroendocrine tumors. We investigated whether time to disease progression is positively associated with overall survival in patients with such tumors. A literature review identified 22 clinical trials in patients with neuroendocrine tumors that reported survival probabilities for both time to disease progression (progression-free survival and time to progression) and overall survival. Associations between median time to disease progression and median overall survival, and between treatment effects on time to disease progression and treatment effects on overall survival, were analyzed using weighted least-squares regression. Median time to disease progression was significantly associated with median overall survival (coefficient 0.595; P=0.022). In the seven randomized studies identified, the risk reduction for time to disease progression was positively associated with the risk reduction for overall survival (coefficient on −ln[HR] 0.151; 95% confidence interval −0.843, 1.145; P=0.713). The significant association between median time to disease progression and median overall survival supports the assertion that time to disease progression is an alternative end point to overall survival in patients with neuroendocrine tumors. An apparent, albeit not significant, trend correlates treatment effects on time to disease progression and treatment effects on overall survival. Informal surveys of physicians' perceptions are consistent with these concepts, although additional randomized trials are needed. Keywords: neuroendocrine tumors, progression-free survival, disease progression, mortality
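
    The trial-level analysis can be reproduced in outline with a weighted least-squares fit. The sketch below uses made-up trial summaries (the medians and sample sizes are hypothetical, and using sample size as the weight is an assumption; the study does not state its weighting) purely to show the mechanics of regressing median overall survival on median time to progression.

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Hypothetical trial-level data (not the study's actual values):
    # median time to progression (months), median overall survival (months),
    # and number of patients per trial, used here as WLS weights.
    ttp = np.array([8.0, 11.2, 14.6, 16.4, 20.1, 23.9])
    os_ = np.array([21.0, 24.5, 28.0, 33.1, 35.8, 41.2])
    n_patients = np.array([85, 120, 64, 200, 150, 98])

    X = sm.add_constant(ttp)
    fit = sm.WLS(os_, X, weights=n_patients).fit()
    print(fit.params)    # intercept and slope (the reported coefficient)
    print(fit.pvalues)   # significance of the association
    ```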

  5. Random cyclic constitutive models of 0Cr18Ni10Ti pipe steel

    International Nuclear Information System (INIS)

    Zhao Yongxiang; Yang Bing

    2004-01-01

    Experimental study is performed on the random cyclic constitutive relations of a new pipe stainless steel, 0Cr18Ni10Ti, by an incremental strain-controlled fatigue test. The test verifies that the random cyclic constitutive relations, like the widely recognized random cyclic strain-life relations, are an intrinsic fatigue phenomenon of engineering materials. Extrapolating previous work by Zhao et al., probability-based constitutive models are constructed on the bases of the Ramberg-Osgood equation and its modified form, taking into account the scatter and the amount of the test data. The models consist of survival probability-strain-life curves, confidence strain-life curves, and survival probability-confidence-strain-life curves. The availability and feasibility of the models are indicated by analysis of the present test data.

  6. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  7. Sampling, Probability Models and Statistical Reasoning -RE ...

    Indian Academy of Sciences (India)

    Random sampling allows data to be modelled with the help of probability models; averaging over different trials gives an estimate of the experimental error. A further fragment asks whether e is indeed the true value of the proportion of defectives in the lot.

  8. Concurrency meets probability: theory and practice (abstract)

    NARCIS (Netherlands)

    Katoen, Joost P.

    Treating random phenomena in concurrency theory has a long tradition. Petri nets [18, 10] and process algebras [14] have been extended with probabilities. The same applies to behavioural semantics such as strong and weak (bi)simulation [1], and testing pre-orders [5]. Beautiful connections between

  9. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  10. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, in general, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels 'possible to occur' or 'impossible to occur' to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated by the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion of the possibilistic vs. the probabilistic approach. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  11. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, in general, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels 'possible to occur' or 'impossible to occur' to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated by the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion of the possibilistic vs. the probabilistic approach. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  12. Waste Package Misload Probability

    International Nuclear Information System (INIS)

    Knudsen, J.K.

    2001-01-01

    The objective of this calculation is to determine the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and of FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants; the categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in each event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work reported in Framatome ANP 2001a.

  13. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  14. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  15. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  16. Retrocausality and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two accounts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagenlike disavowal of realism in quantum mechanics. 6 refs. (Author)

  17. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  18. What is Probability Theory?

    Indian Academy of Sciences (India)

    IAS Admin

    Statistics at all levels builds on probability models. Snippet fragments cover the axioms of a probability space: finite additivity, P(A₁ ∪ ··· ∪ A_k) = Σ P(Aᵢ) for k < ∞ with A₁, A₂, ···, A_k ∈ F and Aᵢ ∩ Aⱼ = ∅ for i ≠ j; the requirement that F be closed under set operations; examples such as rolls of dice and card games such as Bridge; and generating data (i.e., generating random variables) according to a distribution.

  19. Racial Differences in CYP3A4 Genotype and Survival Among Men Treated on Radiation Therapy Oncology Group (RTOG) 9202: A Phase III Randomized Trial

    International Nuclear Information System (INIS)

    Roach, Mack; Silvio, Michelle de; Rebbick, Timothy; Grignon, David; Rotman, Marvin; Wolkov, Harvey; Fisher, Barbara; Hanks, Gerald; Shipley, William U.; Pollack, Alan; Sandler, Howard; Watkins-Bruner, Deborah Ph.D.

    2007-01-01

    Purpose: Inherited genotypes may explain the inferior outcomes of African American (AA) men with prostate cancer. To understand how variation in CYP3A4 correlated with outcomes, a retrospective examination of the CYP3A4*1B genotype was performed on men treated on Radiation Therapy Oncology Group (RTOG) 92-02. Methods and Materials: From 1,514 cases, we evaluated 56 (28.4%) of 197 AA and 54 (4.3%) of 1,274 European American (EA) patients. All patients received goserelin and flutamide for 2 months before and during RT (STAD-RT) ± 24 months of goserelin (long-term androgen deprivation plus radiation [LTAD-RT]). Events studied included overall survival and biochemical progression using American Society for Therapeutic Radiology and Oncology consensus guidelines. Results: There were no differences in outcome between patients with and without CYP3A4 data. There was an association between race and CYP3A4 polymorphisms, with 75% of EAs having the wild type compared with only 25% of AA men (p < 0.0001). There was no association between CYP3A4 classification or race and survival or progression. Conclusions: The samples analyzed support previously reported observations about the distribution of the CYP3A4*1B genotype by race, but race was not associated with poorer outcome. However, patient numbers were limited, and selection bias cannot be completely ruled out

  20. Evaluation of probability and hazard in nuclear energy

    International Nuclear Information System (INIS)

    Novikov, V.Ya.; Romanov, N.L.

    1979-01-01

    Various methods for evaluating accident probability at nuclear power plants (NPPs) are proposed, because statistical evaluation of NPP safety is unreliable. The concept of subjective probability for quantitative analysis of safety and hazard is described; its basis is the interpretation of probability as the actual degree of belief of an expert. It is suggested that event uncertainty be studied in the framework of subjective probability theory, which not only permits but demands that expert opinions be taken into account when evaluating probability. These subjective expert evaluations affect, to a certain extent, the calculation of the usual mathematical event probability. The above technique is advantageous for considering a separate experiment or random event.

  1. Bayesian optimization for computationally extensive probability distributions.

    Science.gov (United States)

    Tamura, Ryo; Hukushima, Koji

    2018-01-01

    An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use extreme values of acquisition functions by Gaussian processes for the next training phase, which should be located near a local maximum or a global maximum of the probability distribution. Our Bayesian optimization technique is applied to the posterior distribution in the effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distributions is fixed to be small, the Bayesian optimization provides a better maximizer of the posterior distributions in comparison to those by the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, the Bayesian optimization improves the results efficiently by combining the steepest descent method and thus it is a powerful tool to search for a better maximizer of computationally extensive probability distributions.
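
    The loop described, fit a Gaussian process to the sampled points, maximize an acquisition function, evaluate the target there, and repeat, can be sketched compactly. The code below is a generic Bayesian-optimization illustration (expected improvement as the acquisition function, a toy 1-D log-posterior as the expensive target), not the authors' implementation; all names and parameter values are assumptions.

    ```python
    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def log_posterior(x):
        # Stand-in for an expensive-to-evaluate log-probability; the real
        # target would be, e.g., a posterior over physical-model parameters.
        return -np.sin(3 * x) - x**2 + 0.7 * x

    rng = np.random.default_rng(1)
    X = rng.uniform(-2, 2, size=(4, 1))           # small initial design
    y = log_posterior(X).ravel()
    grid = np.linspace(-2, 2, 400).reshape(-1, 1)

    for _ in range(15):                           # few evaluations, as in the paper's setting
        gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(X, y)
        mu, sd = gp.predict(grid, return_std=True)
        best = y.max()
        z = (mu - best) / np.maximum(sd, 1e-12)
        ei = (mu - best) * norm.cdf(z) + sd * norm.pdf(z)   # expected improvement
        x_next = grid[np.argmax(ei)].reshape(1, 1)
        X = np.vstack([X, x_next])
        y = np.append(y, log_posterior(x_next).ravel())

    print("best maximizer found:", X[np.argmax(y)], "value:", y.max())
    ```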

  2. Randomized random walk on a random walk

    International Nuclear Information System (INIS)

    Lee, P.A.

    1983-06-01

    This paper discusses generalizations of the model introduced by Kehr and Kutner of the random walk of a particle on a one-dimensional chain which in turn has been constructed by a random walk procedure. The superimposed random walk is randomised in time according to the occurrences of a stochastic point process. The probability of finding the particle in a particular position at a certain instant is obtained explicitly in the transform domain. It is found that the asymptotic behaviour for large time of the mean-square displacement of the particle depends critically on the assumed structure of the basic random walk, giving a diffusion-like term for an asymmetric walk or a square root law if the walk is symmetric. Many results are obtained in closed form for the Poisson process case, and these agree with those given previously by Kehr and Kutner. (author)
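
    A direct Monte Carlo check of the asymptotic behaviour is straightforward to set up. The sketch below is an illustration, not the paper's transform-domain calculation: it builds a quenched chain by a ±1 random walk, superimposes a walk on the chain indices whose steps occur at the events of a Poisson process, and estimates the mean-square displacement in real space (all parameter values are arbitrary).

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def msd(t_max=100.0, rate=1.0, p_right=0.5, n_sites=4001, n_runs=2000):
        """Mean-square real-space displacement at time t_max for a walk on a
        chain that was itself built by a +/-1 random walk; step times follow
        a Poisson process. p_right != 0.5 gives the asymmetric case."""
        disp2 = []
        for _ in range(n_runs):
            chain = np.cumsum(rng.choice([-1, 1], size=n_sites))  # fresh chain per run
            idx = start = n_sites // 2
            t = rng.exponential(1.0 / rate)
            while t <= t_max:
                idx += 1 if rng.random() < p_right else -1
                t += rng.exponential(1.0 / rate)
            disp2.append((chain[idx] - chain[start]) ** 2)
        return np.mean(disp2)

    print("symmetric walk, MSD(t=100): ", msd(p_right=0.5))
    print("asymmetric walk, MSD(t=100):", msd(p_right=0.7))
    ```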

  3. Probability mapping of contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).

  4. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)
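
    The post-processing step, turning an ensemble of equally likely simulations into a map of exceedance probabilities, reduces to a per-cell frequency count. The sketch below substitutes uncorrelated lognormal noise for real conditional geostatistical simulations (the field, threshold, and grid size are all hypothetical) just to show that computation.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Stand-in for an ensemble of geostatistical simulations: each realization
    # is an equally likely image of contaminant concentration on a grid
    # (here just lognormal noise; real work would use conditional simulation).
    n_real, ny, nx = 200, 50, 50
    field = rng.lognormal(mean=1.0, sigma=0.5, size=(n_real, ny, nx))

    threshold = 5.0          # hypothetical clean-up threshold
    exceedance = (field > threshold).mean(axis=0)   # P(concentration > threshold), per cell

    print("fraction of cells with P(exceed) > 0.5:", (exceedance > 0.5).mean())
    ```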

  5. Inferential Statistics from Black Hispanic Breast Cancer Survival Data

    Directory of Open Access Journals (Sweden)

    Hafiz M. R. Khan

    2014-01-01

    In this paper we test statistical probability models for breast cancer survival data by race and ethnicity. Data were collected from breast cancer patients diagnosed in the United States during the years 1973-2009. We selected a stratified random sample of Black Hispanic female patients from the Surveillance Epidemiology and End Results (SEER) database to derive the statistical probability models. We used three common model-building criteria, the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC), and the Deviance Information Criterion (DIC), to measure goodness of fit, and found that the Black Hispanic female patients' survival data better fit the exponentiated exponential probability model. A novel Bayesian method was used to derive the posterior density function for the model parameters as well as the predictive inference for future response. We specifically focused on the Black Hispanic race. The Markov chain Monte Carlo (MCMC) method was used to obtain the summary results for the posterior parameters. Additionally, we report predictive intervals for future survival times. These findings would be of great significance in treatment planning and healthcare resource allocation.
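
    The exponentiated exponential density is f(t; α, λ) = αλ e^(−λt)(1 − e^(−λt))^(α−1) for t > 0. The paper fits it with Bayesian MCMC; the sketch below instead shows a plain maximum-likelihood fit with an AIC readout, on simulated (not SEER) survival times, as a lightweight stand-in for the model-comparison step.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(3)
    # Hypothetical survival times in months (the study used SEER data).
    t = rng.gamma(shape=2.0, scale=20.0, size=300)

    def negloglik(params):
        a, lam = np.exp(params)           # optimize on the log scale for positivity
        z = 1.0 - np.exp(-lam * t)
        # log f(t) = log a + log lam - lam*t + (a - 1) * log(1 - exp(-lam*t))
        return -np.sum(np.log(a) + np.log(lam) - lam * t + (a - 1) * np.log(z))

    fit = minimize(negloglik, x0=np.log([1.0, 0.05]))
    a_hat, lam_hat = np.exp(fit.x)
    aic = 2 * 2 + 2 * fit.fun             # AIC = 2k - 2 log L, k = 2 parameters
    print(f"alpha={a_hat:.3f}, lambda={lam_hat:.4f}, AIC={aic:.1f}")
    ```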

  6. Probability of causation approach

    International Nuclear Information System (INIS)

    Jose, D.E.

    1988-01-01

    Probability of causation (PC) is sometimes viewed as a great improvement by those persons who are not happy with the present rulings of courts in radiation cases. The author does not share that hope and expects that PC will not play a significant role in these issues for at least the next decade. If it is ever adopted in a legislative compensation scheme, it will be used in a way that is unlikely to please most scientists. Consequently, PC is a false hope for radiation scientists, and its best contribution may well lie in some of the spin-off effects, such as an influence on medical practice

  7. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

    From the integration of nonsymmetrical hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable for generalizing some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments are also obtained analytically.
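
    Under one common convention (assumed here; the paper's notation may differ), the generalized logarithm is ln_q(x) = (x^q − 1)/q, recovering the ordinary logarithm as q → 0, and the generalized exponential is its inverse. A minimal sketch:

    ```python
    import numpy as np

    def gen_log(x, q):
        """One-parameter generalized logarithm: (x**q - 1)/q, which recovers
        ln(x) as q -> 0 (one common convention; notation varies by author)."""
        return np.log(x) if q == 0 else (x**q - 1.0) / q

    def gen_exp(y, q):
        """Inverse of gen_log: (1 + q*y)**(1/q), recovering exp(y) as q -> 0."""
        return np.exp(y) if q == 0 else (1.0 + q * y) ** (1.0 / q)

    x, q = 2.5, 0.3
    assert np.isclose(gen_exp(gen_log(x, q), q), x)              # inverse pair
    assert np.isclose(gen_log(x, 1e-9), np.log(x), atol=1e-6)    # q -> 0 limit
    ```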

  8. Probability in High Dimension

    Science.gov (United States)

    2014-06-30

    Text and bibliography fragments from the monograph, including: "...precisely the content of the following result. The price we pay is that the assumption that A is a packing in (F, ‖·‖₁) is too weak to make this happen..."; a reference to "Régularité des trajectoires des fonctions aléatoires gaussiennes", in École d'Été de Probabilités de Saint-Flour, IV-1974, pp. 1-96, Lecture Notes in Math.; a reference to "Lectures on probability theory and statistics" (Saint-Flour, 1994), Lecture Notes in Math., vol. 1648, pp. 165-294, Springer, Berlin (1996); and Ledoux (ref. 50).

  9. Clinical impact of tumor location on the colon cancer survival and recurrence: analyses of pooled data from three large phase III randomized clinical trials.

    Science.gov (United States)

    Aoyama, Toru; Kashiwabara, Kosuke; Oba, Koji; Honda, Michitaka; Sadahiro, Sotaro; Hamada, Chikuma; Maeda, Hiromichi; Mayanagi, Shuhei; Kanda, Mitsuro; Sakamoto, Junichi; Saji, Shigetoyo; Yoshikawa, Takaki

    2017-11-01

    The aim of the present study was to determine whether or not the overall survival (OS) and disease-free survival (DFS) were affected by the tumor location in patients who underwent curative resection for colon cancer in a pooled analysis of three large phase III studies performed in Japan. In total, 4029 patients were included in the present study. Patients were classified as having right-side colon cancer (RC) if the primary tumor was located in the cecum, ascending colon, hepatic flexure or transverse colon, and left-side colon cancer (LCC) if the tumor site was within the splenic flexure, descending colon, sigmoid colon or rectosigmoid junction. The risk factors for the OS and DFS were analyzed. In the present study, 1449 patients had RC and 2580 had LCC. The OS rates at 3 and 5 years after surgery were 87.6% and 81.6% in the RC group and 91.5% and 84.5% in the LCC group, respectively. Uni- and multivariate analyses showed that RC increased the risk of death by 19.7% (adjusted hazard ratio = 1.197; 95% confidence interval, 1.020-1.408; P = 0.0272). In contrast, the DFS was similar between the two locations. The present study confirmed that the tumor location was a risk factor for the OS in patients who underwent curative treatment for colon cancer. Tumor location may, therefore, need to be considered a stratification factor in future phase III trials of colon cancer. © 2017 The Authors. Cancer Medicine published by John Wiley & Sons Ltd.

  10. Community and District Empowerment for Scale-up (CODES): a complex district-level management intervention to improve child survival in Uganda: study protocol for a randomized controlled trial.

    Science.gov (United States)

    Waiswa, Peter; O'Connell, Thomas; Bagenda, Danstan; Mullachery, Pricila; Mpanga, Flavia; Henriksson, Dorcus Kiwanuka; Katahoire, Anne Ruhweza; Ssegujja, Eric; Mbonye, Anthony K; Peterson, Stefan Swartling

    2016-03-11

    Innovative and sustainable strategies to strengthen districts and other sub-national health systems and management are urgently required to reduce child mortality. Although highly effective, evidence-based, and affordable child survival interventions are well known, at the district level a lack of data, motivation, and analytic and planning capacity often impedes prioritization, and management weaknesses impede implementation. The Community and District Empowerment for Scale-up (CODES) project is a complex management intervention designed to test whether districts, when empowered with data and management tools, can prioritize and implement evidence-based child survival interventions equitably. The CODES strategy combines management, diagnostic, and evaluation tools to identify and analyze the causes of bottlenecks to implementation, to build the capacity of district management teams to implement context-specific solutions, and to foster community monitoring and social accountability to increase demand for services. CODES combines UNICEF tools designed to systematize priority setting, allocation of resources, and problem solving with community dialogues based on Citizen Report Cards and U-Reports, which are used to engage and empower communities in monitoring health service provision and in demanding quality services. Implementation and all data collection will be carried out by the district teams or local community-based organizations, who will be supported by two local implementing partners. The study will be evaluated as a cluster randomized trial with eight intervention and eight comparison districts over a period of 3 years. Evaluation will focus on differences in uptake of child survival interventions and will follow an intention-to-treat analysis. We will also document and analyze experiences in implementation, including changes in management practices. By increasing the District Health Management Teams' capacity to prioritize and implement context-specific solutions, and empowering communities to

  11. Probable maximum flood control

    International Nuclear Information System (INIS)

    DeGabriele, C.E.; Wu, C.L.

    1991-11-01

    This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility.

  12. Probability Theory Plus Noise: Descriptive Estimation and Inferential Judgment.

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2018-01-01

    We describe a computational model of two central aspects of people's probabilistic reasoning: descriptive probability estimation and inferential probability judgment. This model assumes that people's reasoning follows standard frequentist probability theory, but it is subject to random noise. This random noise has a regressive effect in descriptive probability estimation, moving probability estimates away from normative probabilities and toward the center of the probability scale. This random noise has an anti-regressive effect in inferential judgment, however. These regressive and anti-regressive effects explain various reliable and systematic biases seen in people's descriptive probability estimation and inferential probability judgment. This model predicts that these contrary effects will tend to cancel out in tasks that involve both descriptive estimation and inferential judgment, leading to unbiased responses in those tasks. We test this model by applying it to one such task, described by Gallistel et al. Participants' median responses in this task were unbiased, agreeing with normative probability theory over the full range of responses. Our model captures the pattern of unbiased responses in this task, while simultaneously explaining systematic biases away from normatively correct probabilities seen in other tasks. Copyright © 2018 Cognitive Science Society, Inc.
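
    The regressive effect of random noise on descriptive estimation is easy to demonstrate: if each observed event is misread with probability d, the expected frequency estimate becomes (1 − 2d)p + d, pulled toward 0.5. The sketch below simulates that mechanism; the value of d and the trial counts are illustrative, and this is a simplified reading of the model, not the authors' code.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def noisy_estimate(p, d, n_events=100):
        """Count occurrences of an event with true probability p, but misread
        each event with probability d; E[estimate] = (1 - 2d) * p + d."""
        events = rng.random(n_events) < p
        flipped = np.where(rng.random(n_events) < d, ~events, events)
        return flipped.mean()

    d = 0.2
    for p in [0.1, 0.5, 0.9]:
        est = np.mean([noisy_estimate(p, d) for _ in range(2000)])
        print(f"true p={p:.1f}  mean estimate={est:.3f}  predicted={(1 - 2*d)*p + d:.3f}")
    ```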

  13. Effect of Tailored Dose-Dense Chemotherapy vs Standard 3-Weekly Adjuvant Chemotherapy on Recurrence-Free Survival Among Women With High-Risk Early Breast Cancer: A Randomized Clinical Trial.

    Science.gov (United States)

    Foukakis, Theodoros; von Minckwitz, Gunter; Bengtsson, Nils-Olof; Brandberg, Yvonne; Wallberg, Birgitta; Fornander, Tommy; Mlineritsch, Brigitte; Schmatloch, Sabine; Singer, Christian F; Steger, Günther; Egle, Daniel; Karlsson, Eva; Carlsson, Lena; Loibl, Sibylle; Untch, Michael; Hellström, Mats; Johansson, Hemming; Anderson, Harald; Malmström, Per; Gnant, Michael; Greil, Richard; Möbus, Volker; Bergh, Jonas

    2016-11-08

    Standard dosing of chemotherapy based on body surface area results in marked interpatient variation in pharmacokinetics, toxic effects, and efficacy. Whether tailored dosing can improve outcomes is unknown, as is the role of dose-dense adjuvant chemotherapy. To determine whether tailored dose-dense adjuvant chemotherapy improves the outcomes of early breast cancer compared with a standard 3-weekly chemotherapy schedule. A randomized, open-label, phase 3 trial of women aged 65 years and younger who had surgery for nonmetastatic node-positive or high-risk node-negative breast cancer at 86 sites in Sweden, Germany, and Austria between February 20, 2007, and September 14, 2011. Patients were randomized 1:1 either to 4 cycles of leukocyte nadir-based tailored and dose-dense adjuvant epirubicin and cyclophosphamide every 2 weeks followed by 4 cycles of tailored dose-dense docetaxel every 2 weeks, or to standard-interval chemotherapy with 3 cycles of fluorouracil and epirubicin-cyclophosphamide every 3 weeks followed by 3 cycles of docetaxel every 3 weeks. The primary end point was breast cancer recurrence-free survival (BCRFS). Secondary end points included 5-year event-free survival (EFS), distant disease-free survival (DDFS), overall survival (OS), and rates of grade 3 or 4 toxic effects. Among 2017 randomized patients (1006 in the tailored dose-dense group and 1011 in the control group; median [IQR] age, 51 [45-58] years; 80% with hormone receptor-positive tumors; 97% with node-positive disease), 2000 received study treatment (≥1 cycle of chemotherapy; 1001 in the tailored dose-dense group and 999 in the control group). After a median follow-up of 5.3 years (IQR, 4.5-6.1 years), 269 BCRFS events were reported, 118 in the tailored dose-dense group and 151 in the control group (HR, 0.79; 95% CI, 0.61-1.01; log-rank P = .06; 5-year BCRFS, 88.7% vs 85.0%). The tailored dose-dense group had significantly better EFS than the control group (HR, 0.79; 95% CI, 0

  14. Impact of controlling the sum of error probability in the sequential probability ratio test

    Directory of Open Access Journals (Sweden)

    Bijoy Kumarr Pradhan

    2013-05-01

    A generalized modified method is proposed to control the sum of error probabilities in the sequential probability ratio test, minimizing the weighted average of the two average sample numbers under a simple null hypothesis and a simple alternative hypothesis, with the restriction that the sum of the error probabilities is a pre-assigned constant, in order to find the optimal sample size. Finally, a comparison is made with the optimal sample size found by the fixed-sample-size procedure. The results are applied to cases where the random variate follows a normal law as well as a Bernoulli law.
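
    For reference, the classical Wald SPRT that the paper modifies can be sketched in a few lines for Bernoulli data. The bounds below use the standard approximate thresholds that control each error probability separately, not the paper's sum constraint; all parameter values are illustrative.

    ```python
    import numpy as np
    from math import log

    def sprt_bernoulli(data, p0, p1, alpha=0.025, beta=0.025):
        """Classical Wald SPRT for H0: p = p0 vs H1: p = p1 on Bernoulli data,
        with the usual approximate decision bounds."""
        a = log(beta / (1 - alpha))        # lower bound: accept H0
        b = log((1 - beta) / alpha)        # upper bound: accept H1
        llr = 0.0
        for n, x in enumerate(data, start=1):
            llr += log(p1 / p0) if x else log((1 - p1) / (1 - p0))
            if llr <= a:
                return "accept H0", n
            if llr >= b:
                return "accept H1", n
        return "no decision", len(data)

    rng = np.random.default_rng(11)
    print(sprt_bernoulli(rng.random(1000) < 0.6, p0=0.4, p1=0.6))
    ```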

  15. Fifty challenging problems in probability with solutions

    CERN Document Server

    Mosteller, Frederick

    1987-01-01

    Can you solve the problem of ""The Unfair Subway""? Marvin gets off work at random times between 3 and 5 p.m. His mother lives uptown, his girlfriend downtown. He takes the first subway that comes in either direction and eats dinner with the one he is delivered to. His mother complains that he never comes to see her, but he says she has a 50-50 chance. He has had dinner with her twice in the last 20 working days. Explain. Marvin's adventures in probability are one of the fifty intriguing puzzles that illustrate both elementary and advanced aspects of probability, each problem designed to chall
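
    The resolution of the Unfair Subway puzzle is scheduling: if both trains run every 10 minutes but the uptown train always arrives one minute after the downtown train, a rider arriving at a uniformly random time catches the uptown train only 10% of the time. A quick Monte Carlo check, where the 1-minute offset is the assumed schedule consistent with Marvin's 2-in-20 record:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Downtown train at minute 0 of each 10-minute cycle, uptown at minute 1.
    # Arriving during (0, 1) means the uptown train comes first.
    arrivals = rng.uniform(0, 10, size=100_000)
    uptown_first = arrivals < 1.0

    print("P(dinner with mother)     ≈", uptown_first.mean())        # ≈ 0.1
    print("P(dinner with girlfriend) ≈", 1 - uptown_first.mean())    # ≈ 0.9
    ```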

  16. Inference of a random potential from random walk realizations: Formalism and application to the one-dimensional Sinai model with a drift

    International Nuclear Information System (INIS)

    Cocco, S; Monasson, R

    2009-01-01

    We consider the Sinai model, in which a random walker moves in a random quenched potential V, and ask the following questions: 1. how can the quenched potential V be inferred from the observations of one or more realizations of the random motion? 2. how many observations (walks) are required to make a reliable inference, that is, to be able to distinguish between two similar but distinct potentials, V₁ and V₂? We show how question 1 can be easily solved within the Bayesian framework. In addition, we show that the answer to question 2 is, in general, intimately connected to the calculation of the survival probability of a fictitious walker in a potential W defined from V₁ and V₂, with partial absorption at sites where V₁ and V₂ do not coincide. For the one-dimensional Sinai model, this survival probability can be analytically calculated, in excellent agreement with numerical simulations.
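
    Question 2's connection to a survival probability can be illustrated with a crude Monte Carlo: a walker is partially absorbed wherever the two candidate potentials disagree, and its survival probability measures how hard the potentials are to distinguish. The sketch below simplifies drastically (a symmetric walk on a ring, ignoring the potential-dependent hop rates of the real model; all parameters are arbitrary).

    ```python
    import numpy as np

    rng = np.random.default_rng(9)

    n_sites, p_abs, t_max, n_walkers = 200, 0.3, 300, 2000
    differ = rng.random(n_sites) < 0.05   # sites where V1 and V2 disagree

    survived = 0
    for _ in range(n_walkers):
        x = int(rng.integers(n_sites))
        alive = True
        for _ in range(t_max):
            x = (x + (1 if rng.random() < 0.5 else -1)) % n_sites
            if differ[x] and rng.random() < p_abs:
                alive = False             # partial absorption at a disagreement site
                break
        survived += alive

    print("survival probability after", t_max, "steps ≈", survived / n_walkers)
    ```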

  17. Effect of provision of an integrated neonatal survival kit and early cognitive stimulation package by community health workers on developmental outcomes of infants in Kwale County, Kenya: study protocol for a cluster randomized trial.

    Science.gov (United States)

    Pell, Lisa G; Bassani, Diego G; Nyaga, Lucy; Njagi, Isaac; Wanjiku, Catherine; Thiruchselvam, Thulasi; Macharia, William; Minhas, Ripudaman S; Kitsao-Wekulo, Patricia; Lakhani, Amyn; Bhutta, Zulfiqar A; Armstrong, Robert; Morris, Shaun K

    2016-09-08

    Each year, more than 200 million children under the age of 5 years, almost all in low- and middle-income countries (LMICs), fail to achieve their developmental potential. Risk factors for compromised development often coexist and include inadequate cognitive stimulation, poverty, nutritional deficiencies, infection and complications of being born low birthweight and/or premature. Moreover, many of these risk factors are closely associated with newborn morbidity and mortality. As compromised development has significant implications on human capital, inexpensive and scalable interventions are urgently needed to promote neurodevelopment and reduce risk factors for impaired development. This cluster randomized trial aims at evaluating the impact of volunteer community health workers delivering either an integrated neonatal survival kit, an early stimulation package, or a combination of both interventions, to pregnant women during their third trimester of pregnancy, compared to the current standard of care in Kwale County, Kenya. The neonatal survival kit comprises a clean delivery kit (sterile blade, cord clamp, clean plastic sheet, surgical gloves and hand soap), sunflower oil emollient, chlorhexidine, ThermoSpot(TM), Mylar infant sleeve, and a reusable instant heater. Community health workers are also equipped with a portable hand-held electric scale. The early cognitive stimulation package focuses on enhancing caregiver practices by teaching caregivers three key messages that comprise combining a gentle touch with making eye contact and talking to children, responsive feeding and caregiving, and singing. The primary outcome measure is child development at 12 months of age assessed with the Protocol for Child Monitoring (Infant and Toddler version). The main secondary outcome is newborn mortality. This study will provide evidence on effectiveness of delivering an innovative neonatal survival kit and/or early stimulation package to pregnant women in Kwale County

  18. Carbonaceous Survivability on Impact

    Science.gov (United States)

    Bunch, T. E.; Becker, Luann; Morrison, David (Technical Monitor)

    1994-01-01

    In order to gain knowledge about the potential contributions of comets and cosmic dust to the origin of life on Earth, we need to explore the survivability of their potential organic compounds on impact and the formation of secondary products that may have arisen from the chaotic events sustained by the carriers as they fell to Earth. We have performed a series of hypervelocity impact experiments using carbon-bearing impactors (diamond, graphite, kerogens, PAH crystals, and Murchison and Nogoya meteorites) into Al plate targets at velocities of ~6 km/s. Estimated peak shock pressures probably did not exceed 120 GPa and peak shock temperatures were probably less than 4000 K for times of nano- to microseconds. Nominal crater diameters are less than 1 mm. The most significant results of these experiments are the preservation of the higher-mass PAHs (e.g., pyrene relative to naphthalene) and the formation of additional alkylated PAHs. We have also examined the residues of polystyrene projectiles launched by a microparticle accelerator into targets at velocities up to 15 km/s. This talk will discuss the results of these experiments and their implications with respect to the survival of carbonaceous deliverables to the early Earth. The prospects for survivability of organic molecules on "intact" capture of cosmic dust in space via soft and hard cosmic dust collectors will also be discussed.

  19. Factors influencing survival and mark retention in postmetamorphic boreal chorus frogs

    Science.gov (United States)

    Swanson, Jennifer E; Bailey, Larissa L.; Muths, Erin L.; Funk, W. Chris

    2013-01-01

    The ability to track individual animals is crucial in many field studies and often requires applying marks to captured individuals. Toe clipping has historically been a standard marking method for wild amphibian populations, but more recent marking methods include visual implant elastomer and photo identification. Unfortunately, few studies have investigated the influence and effectiveness of marking methods for recently metamorphosed individuals, and as a result little is known about this life-history phase for most amphibians. Our focus was to explore survival probabilities, mark retention, and mark migration in postmetamorphic Boreal Chorus Frogs (Pseudacris maculata) in a laboratory setting. One hundred forty-seven individuals were assigned randomly to two treatment groups or a control group. Frogs in the first treatment group were marked with visual implant elastomer, while frogs in the second treatment group were toe clipped. Growth and mortality were recorded for one year and the resulting data were analyzed using known-fate models in Program MARK. Model selection results suggested that survival probabilities of frogs varied with time and showed some variation among marking treatments. We found that frogs with multiple toes clipped on the same foot had lower survival probabilities than individuals in other treatments, but individuals can be marked by clipping a single toe on two different feet without any mark loss or negative survival effects. Individuals treated with visual implant elastomer had a mark migration rate of 4% and a mark loss rate of 6%, and showed very little negative survival impact relative to control individuals.
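
    A known-fate model treats each marked animal's fate over an interval as a fully observed Bernoulli trial, so interval survival is estimated as a binomial proportion. The sketch below illustrates only that core idea with made-up numbers; it is not the authors' Program MARK analysis, and it ignores censoring and the time-dependent models they compared.

```python
import numpy as np

# Hypothetical known-fate records: 1 = survived the interval, 0 = died.
fates = {
    "control":   np.array([1, 1, 1, 0, 1, 1, 1, 1, 0, 1]),
    "elastomer": np.array([1, 1, 0, 1, 1, 1, 1, 0, 1, 1]),
    "toe_clip":  np.array([1, 0, 1, 0, 1, 1, 0, 1, 0, 1]),
}

for group, x in fates.items():
    n = x.size
    s_hat = x.mean()                       # MLE of interval survival probability
    se = np.sqrt(s_hat * (1 - s_hat) / n)  # binomial standard error
    print(f"{group:10s} S = {s_hat:.2f} +/- {se:.2f} (n = {n})")
```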

  20. Probability and rational choice

    Directory of Open Access Journals (Sweden)

    David Botting

    2014-05-01

    Full Text Available http://dx.doi.org/10.5007/1808-1711.2014v18n1p1 In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.

  1. Immune Biomarkers Predictive for Disease-Free Survival with Adjuvant Sunitinib in High-Risk Locoregional Renal Cell Carcinoma: From Randomized Phase III S-TRAC Study.

    Science.gov (United States)

    George, Daniel J; Martini, Jean-François; Staehler, Michael; Motzer, Robert J; Magheli, Ahmed; Escudier, Bernard; Gerletti, Paola; Li, Sherry; Casey, Michelle; Laguerre, Brigitte; Pandha, Hardev S; Pantuck, Allan J; Patel, Anup; Lechuga, Maria J; Ravaud, Alain

    2018-04-01

    Purpose: Adjuvant sunitinib therapy compared with placebo prolonged disease-free survival (DFS) in patients with locoregional high-risk renal cell carcinoma (RCC) in the S-TRAC trial (ClinicalTrials.gov number NCT00375674). A prospectively designed exploratory analysis of tissue biomarkers was conducted to identify predictors of treatment benefit. Experimental Design: Tissue blocks were used for immunohistochemistry (IHC) staining of programmed cell death ligand 1 (PD-L1), CD4, CD8, and CD68. DFS was compared between < versus ≥ median IHC parameter using the Kaplan-Meier method. For biomarkers with predictive potential, receiver operating characteristic curves were generated. Results: Baseline characteristics were similar in patients with (n = 191) and without (n = 419) IHC analysis. Among patients with IHC, longer DFS was observed in patients with tumor CD8+ T-cell density ≥ versus < median [median (95% CI), not reached (6.83-not reached) versus 3.47 years (1.73-not reached); hazard ratio (HR) 0.40 (95% CI, 0.20-0.81); P = 0.009] treated with sunitinib (n = 101), but not with placebo (n = 90). The sensitivity and specificity for CD8+ T-cell density in predicting DFS were 0.604 and 0.658, respectively. Shorter DFS was observed in placebo-treated patients with PD-L1+ versus PD-L1- tumors (HR 1.75; P = 0.103). Among all patients with PD-L1+ tumors, DFS was numerically longer with sunitinib versus placebo (HR 0.58; P = 0.175). Conclusions: Greater CD8+ T-cell density in tumor tissue was associated with longer DFS with sunitinib but not placebo, suggesting predictive treatment effect utility. Further independent cohort validation studies are warranted. The prognostic value of PD-L1 expression in primary tumors from patients with high-risk nonmetastatic RCC should also be further explored. Clin Cancer Res; 24(7); 1554-61. ©2018 American Association for Cancer Research.
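
    The median-split Kaplan-Meier comparison described in this record can be sketched with the lifelines library. Everything below is hypothetical (variable names and simulated data are mine); it mirrors the analysis pattern, not the S-TRAC code.

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical cohort: DFS time in years, event indicator, CD8+ T-cell density.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "dfs_years": rng.exponential(4.0, 200),
    "event": rng.integers(0, 2, 200),
    "cd8_density": rng.lognormal(0.0, 1.0, 200),
})

high = df["cd8_density"] >= df["cd8_density"].median()  # >= median vs. < median

kmf = KaplanMeierFitter()
for label, grp in [("CD8 >= median", df[high]), ("CD8 < median", df[~high])]:
    kmf.fit(grp["dfs_years"], grp["event"], label=label)
    print(label, "median DFS:", kmf.median_survival_time_)

res = logrank_test(df[high]["dfs_years"], df[~high]["dfs_years"],
                   event_observed_A=df[high]["event"],
                   event_observed_B=df[~high]["event"])
print("log-rank p =", res.p_value)
```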

  2. Random Fields

    Science.gov (United States)

    Vanmarcke, Erik

    1983-03-01

    Random variation over space and time is one of the few attributes that might safely be predicted as characterizing almost any given complex system. Random fields or "distributed disorder systems" confront astronomers, physicists, geologists, meteorologists, biologists, and other natural scientists. They appear in the artifacts developed by electrical, mechanical, civil, and other engineers. They even underlie the processes of social and economic change. The purpose of this book is to bring together existing and new methodologies of random field theory and indicate how they can be applied to these diverse areas where a "deterministic treatment is inefficient and conventional statistics insufficient." Many new results and methods are included. After outlining the extent and characteristics of the random field approach, the book reviews the classical theory of multidimensional random processes and introduces basic probability concepts and methods in the random field context. It next gives a concise account of the second-order analysis of homogeneous random fields, in both the space-time domain and the wave number-frequency domain. This is followed by a chapter on spectral moments and related measures of disorder and on level excursions and extremes of Gaussian and related random fields. After developing a new framework of analysis based on local averages of one-, two-, and n-dimensional processes, the book concludes with a chapter discussing ramifications in the important areas of estimation, prediction, and control. The mathematical prerequisite has been held to basic college-level calculus.

  3. Long-term survival in the randomized trial of drug treatment in mild to moderate hypertension of the Oslo study 1972-3.

    Science.gov (United States)

    Holme, Ingar; Kjeldsen, Sverre E

    2015-03-01

    In the Oslo cardiovascular study of 1972-3, a 5-year randomized trial in mild to moderate hypertension was performed. Several changes in treatment practices have been recommended since that time. We followed the mortality patterns up to 40 years. In all, 25,915 middle-aged men were invited to the Oslo study screening and 16,203 (63%) participated. Reexaminations were done to select suitable participants for the trial. Men had blood pressure 150-179/95-109 mm Hg and the active group (n=406) was treated with thiazides, alpha-methyldopa and propranolol versus untreated controls (n=379). Cox regression analysis was used for statistical analyses. There was no trend towards reduction in total mortality by treatment. A nominally significant increase in risk of death at first myocardial infarction was observed in the trial treatment group across the follow-up period, HR=1.51 (1.01-2.25); P=0.042. The excess risk developed rapidly during the first 15 years; the gap between the groups then diminished to a large extent during the next 15 years, although the curves stayed a certain distance apart for the last 10 years. Cerebrovascular death tended to be non-significantly reduced, HR=0.85 (0.52-1.41). Drug treatment of mildly hypertensive men initiated in the 1970s did not reduce mortality at first MI or total mortality. However, during the late 1980s and the 1990s, when hypertension treatment practices changed markedly toward regimens with more use of combination therapies including metabolically neutral drugs at lower doses, beneficial effects on MI mortality could be observed. Copyright © 2015 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.

  4. Biomarker analyses and final overall survival results from a phase III, randomized, open-label, first-line study of gefitinib versus carboplatin/paclitaxel in clinically selected patients with advanced non-small-cell lung cancer in Asia (IPASS).

    Science.gov (United States)

    Fukuoka, Masahiro; Wu, Yi-Long; Thongprasert, Sumitra; Sunpaweravong, Patrapim; Leong, Swan-Swan; Sriuranpong, Virote; Chao, Tsu-Yi; Nakagawa, Kazuhiko; Chu, Da-Tong; Saijo, Nagahiro; Duffield, Emma L; Rukazenkov, Yuri; Speake, Georgina; Jiang, Haiyi; Armour, Alison A; To, Ka-Fai; Yang, James Chih-Hsin; Mok, Tony S K

    2011-07-20

    The results of the Iressa Pan-Asia Study (IPASS), which compared gefitinib and carboplatin/paclitaxel in previously untreated never-smokers and light ex-smokers with advanced pulmonary adenocarcinoma, were published previously. This report presents overall survival (OS) and efficacy according to epidermal growth factor receptor (EGFR) biomarker status. In all, 1,217 patients were randomly assigned. Biomarkers analyzed were EGFR mutation (amplification refractory mutation system; 437 patients evaluable), EGFR gene copy number (fluorescence in situ hybridization; 406 patients evaluable), and EGFR protein expression (immunohistochemistry; 365 patients evaluable). OS analysis was performed at 78% maturity. A Cox proportional hazards model was used to assess biomarker status by randomly assigned treatment interactions for progression-free survival (PFS) and OS. OS (954 deaths) was similar for gefitinib and carboplatin/paclitaxel, with no significant difference between treatments overall (hazard ratio [HR], 0.90; 95% CI, 0.79 to 1.02; P = .109) or in EGFR mutation-positive (HR, 1.00; 95% CI, 0.76 to 1.33; P = .990) or EGFR mutation-negative (HR, 1.18; 95% CI, 0.86 to 1.63; P = .309; treatment by EGFR mutation interaction P = .480) subgroups. A high proportion (64.3%) of EGFR mutation-positive patients randomly assigned to carboplatin/paclitaxel received subsequent EGFR tyrosine kinase inhibitors. PFS was significantly longer with gefitinib for patients whose tumors had both high EGFR gene copy number and EGFR mutation (HR, 0.48; 95% CI, 0.34 to 0.67) but significantly shorter when high EGFR gene copy number was not accompanied by EGFR mutation (HR, 3.85; 95% CI, 2.09 to 7.09). EGFR mutations are the strongest predictive biomarker for PFS and tumor response to first-line gefitinib versus carboplatin/paclitaxel. The predictive value of EGFR gene copy number was driven by coexisting EGFR mutation (post hoc analysis). Treatment-related differences observed for PFS in the EGFR

  5. Progression-free survival results in postmenopausal Asian women: subgroup analysis from a phase III randomized trial of fulvestrant 500 mg vs anastrozole 1 mg for hormone receptor-positive advanced breast cancer (FALCON).

    Science.gov (United States)

    Noguchi, Shinzaburo; Ellis, Matthew J; Robertson, John F R; Thirlwell, Jackie; Fazal, Mehdi; Shao, Zhimin

    2018-05-01

    The international, phase III FALCON study (NCT01602380) in postmenopausal patients with hormone receptor-positive, locally advanced/metastatic breast cancer (LA/MBC) who had not received prior endocrine therapy, demonstrated statistically significant improvement in progression-free survival (PFS) for patients who received fulvestrant 500 mg vs anastrozole 1 mg. This subgroup analysis evaluated PFS in Asian (randomized in China, Japan, or Taiwan) and non-Asian patients from the FALCON study. Eligible patients (estrogen receptor- and/or progesterone receptor-positive LA/MBC; World Health Organization performance status 0-2; ≥ 1 measurable/non-measurable lesion[s]) were randomized. PFS was assessed via Response Evaluation Criteria in Solid Tumours version 1.1, surgery/radiotherapy for disease worsening, or death (any cause). Secondary endpoints included: objective response rate, clinical benefit rate, duration of response, and duration of clinical benefit. Consistency of effect across subgroups was assessed via hazard ratios and 95% confidence intervals (CIs) using a log-rank test. Adverse events (AEs) were evaluated. Of the 462 randomized patients, the Asian and non-Asian subgroups comprised 67 and 395 patients, respectively. In the Asian subgroup, median PFS was 16.6 and 15.9 months with fulvestrant and anastrozole, respectively (hazard ratio 0.81; 95% CI 0.44-1.50). In the non-Asian subgroup, median PFS was 16.5 and 13.8 months, respectively (hazard ratio 0.79; 95% CI 0.62-1.01). Secondary outcomes were numerically improved with fulvestrant vs anastrozole in both subgroups. AE profiles were generally consistent between Asian and non-Asian subgroups. Results of this subgroup analysis suggest that treatment effects in the Asian patient subgroup are broadly consistent with the non-Asian population.

  6. A probability of synthesis of the superheavy element Z = 124

    Energy Technology Data Exchange (ETDEWEB)

    Manjunatha, H.C. [Government College for Women, Department of Physics, Kolar, Karnataka (India); Sridhar, K.N. [Government First Grade College, Department of Physics, Kolar, Karnataka (India)

    2017-10-15

    We have studied the fusion cross section, evaporation residue cross section, compound nucleus formation probability (P_CN) and survival probability (P_sur) of different projectile-target combinations in order to identify the most probable combination for synthesizing the superheavy element Z = 124. The most probable projectile-target combinations are Kr+Ra, Ni+Cm, Se+Th, Ge+U and Zn+Pu. We hope that our predictions may serve as a guide for future experiments on the synthesis of the superheavy nucleus Z = 124. (orig.)
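
    The quantities named in this record combine, in the standard picture of fusion-evaporation reactions, into the evaporation residue cross section. A schematic form is given below as general background; it is not an equation quoted from this paper.

```latex
% Schematic evaporation-residue cross section for superheavy-element synthesis;
% P_CN and P_sur follow the record's notation, sigma_capture is the capture
% cross section, and E* is the compound-nucleus excitation energy.
\sigma_{\mathrm{ER}}(E) \;=\; \sigma_{\mathrm{capture}}(E)\, P_{\mathrm{CN}}(E)\, P_{\mathrm{sur}}(E^{*})
```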

  7. Survival Analysis

    CERN Document Server

    Miller, Rupert G

    2011-01-01

    A concise summary of the statistical methods used in the analysis of survival data with censoring. Emphasizes recently developed nonparametric techniques. Outlines methods in detail and illustrates them with actual data. Discusses the theory behind each method. Includes numerous worked problems and numerical exercises.

  8. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  9. Visualizing and Understanding Probability and Statistics: Graphical Simulations Using Excel

    Science.gov (United States)

    Gordon, Sheldon P.; Gordon, Florence S.

    2009-01-01

    The authors describe a collection of dynamic interactive simulations for teaching and learning most of the important ideas and techniques of introductory statistics and probability. The modules cover such topics as randomness, simulations of probability experiments such as coin flipping, dice rolling and general binomial experiments, a simulation…

  10. Surprisingly rational: probability theory plus noise explains biases in judgment.

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2014-07-01

    The systematic biases seen in people's probability judgments are typically taken as evidence that people do not use the rules of probability theory when reasoning about probability but instead use heuristics, which sometimes yield reasonable judgments and sometimes yield systematic biases. This view has had a major impact in economics, law, medicine, and other fields; indeed, the idea that people cannot reason with probabilities has become a truism. We present a simple alternative to this view, where people reason about probability according to probability theory but are subject to random variation or noise in the reasoning process. In this account the effect of noise is canceled for some probabilistic expressions. Analyzing data from 2 experiments, we find that, for these expressions, people's probability judgments are strikingly close to those required by probability theory. For other expressions, this account produces systematic deviations in probability estimates. These deviations explain 4 reliable biases in human probabilistic reasoning (conservatism, subadditivity, conjunction, and disjunction fallacies). These results suggest that people's probability judgments embody the rules of probability theory and that biases in those judgments are due to the effects of random noise. (c) 2014 APA, all rights reserved.
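
    A minimal simulation of the probability-theory-plus-noise idea (my sketch, not the authors' code): each of N retrieved memory samples of an event is misread with probability d, so the expected judgment is (1 - 2d)p + d, which pulls estimates toward 0.5 and produces conservatism.

```python
import numpy as np

rng = np.random.default_rng(1)

def noisy_judgment(p, d=0.1, n_samples=100):
    """Judged probability of an event with true probability p, when each
    retrieved sample is misread (flipped) with probability d."""
    samples = rng.random(n_samples) < p   # true event occurrences in memory
    flips = rng.random(n_samples) < d     # random read errors
    return np.mean(samples ^ flips)       # proportion judged 'occurred'

for p in [0.05, 0.25, 0.50, 0.75, 0.95]:
    est = np.mean([noisy_judgment(p) for _ in range(2000)])
    print(f"true p = {p:.2f}  mean judgment = {est:.3f}  "
          f"theory = {(1 - 2*0.1)*p + 0.1:.3f}")
```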

  11. Internal Medicine residents use heuristics to estimate disease probability

    OpenAIRE

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Background: Training in Bayesian reasoning may have limited impact on accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. Method: We randomized 55 In...

  12. Introduction to probability and statistics for science, engineering, and finance

    CERN Document Server

    Rosenkrantz, Walter A

    2008-01-01

    Data Analysis Orientation The Role and Scope of Statistics in Science and Engineering Types of Data: Examples from Engineering, Public Health, and Finance The Frequency Distribution of a Variable Defined on a Population Quantiles of a Distribution Measures of Location (Central Value) and Variability Covariance, Correlation, and Regression: Computing a Stock's Beta Mathematical Details and Derivations Large Data Sets Probability Theory Orientation Sample Space, Events, Axioms of Probability Theory Mathematical Models of Random Sampling Conditional Probability and Bayes' Theorem

  13. Non-random temporary emigration and the robust design: Conditions for bias at the end of a time series: Section VIII

    Science.gov (United States)

    Langtimm, Catherine A.

    2008-01-01

    Deviations from model assumptions in the application of capture–recapture models to real life situations can introduce unknown bias. Understanding the type and magnitude of bias under these conditions is important to interpreting model results. In a robust design analysis of long-term photo-documented sighting histories of the endangered Florida manatee, I found high survival rates, high rates of non-random temporary emigration, significant time-dependence, and a diversity of factors affecting temporary emigration that made it difficult to model emigration in any meaningful fashion. Examination of the time-dependent survival estimates indicated a suspicious drop in survival rates near the end of the time series that persisted when the original capture histories were truncated and reanalyzed under a shorter time frame. Given the wide swings in manatee emigration estimates from year to year, a likely source of bias in survival was the convention to resolve confounding of the last survival probability in a time-dependent model with the last emigration probabilities by setting the last unmeasurable emigration probability equal to the previous year’s probability when the equality was actually false. Results of a series of simulations demonstrated that if the unmeasurable temporary emigration probabilities in the last time period were not accurately modeled, an estimation model with significant annual variation in survival probabilities and emigration probabilities produced bias in survival estimates at the end of the study or time series being explored. Furthermore, the bias propagated back in time beyond the last two time periods and the number of years affected varied positively with survival and emigration probabilities. Truncating the data to a shorter time frame and reanalyzing demonstrated that with additional years of data surviving temporary emigrants eventually return and are detected, thus in subsequent analysis unbiased estimates are eventually realized.

  14. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  15. Classical probability model for Bell inequality

    International Nuclear Information System (INIS)

    Khrennikov, Andrei

    2014-01-01

    We show that by taking into account randomness of realization of experimental contexts it is possible to construct a common Kolmogorov space for data collected in these contexts, although they can be incompatible. We call such a construction 'Kolmogorovization' of contextuality. This construction of a common probability space is applied to Bell's inequality. It is well known that its violation is a consequence of collecting statistical data in a few incompatible experiments. In experiments performed in quantum optics, contexts are determined by selections of pairs of angles (θ_i, θ'_j) fixing orientations of polarization beam splitters. Contrary to common opinion, we show that statistical data corresponding to measurements of polarizations of photons in the singlet state, e.g., in the form of correlations, can be described in the classical probabilistic framework. The crucial point is that in constructing the common probability space one has to take into account not only randomness of the source (as Bell did), but also randomness of context-realizations (in particular, realizations of pairs of angles (θ_i, θ'_j)). One may (but need not) say that randomness of 'free will' has to be accounted for.

  16. School and conference on probability theory

    International Nuclear Information System (INIS)

    Lawler, G.F.

    2004-01-01

    This volume includes expanded lecture notes from the School and Conference in Probability Theory held at ICTP in May, 2001. Probability theory is a very large area, too large for a single school and conference. The organizers, G. Lawler, C. Newman, and S. Varadhan chose to focus on a number of active research areas that have their roots in statistical physics. The pervasive theme in these lectures is trying to find the large time or large space behaviour of models defined on discrete lattices. Usually the definition of the model is relatively simple: either assigning a particular weight to each possible configuration (equilibrium statistical mechanics) or specifying the rules under which the system evolves (nonequilibrium statistical mechanics). Interacting particle systems is the area of probability that studies the evolution of particles (either finite or infinite in number) under random motions. The evolution of particles depends on the positions of the other particles; often one assumes that it depends only on the particles that are close to the particular particle. Thomas Liggett's lectures give an introduction to this very large area. Claudio Landim's lectures follow up by discussing hydrodynamic limits of particle systems. The goal of this area is to describe the long time, large system size dynamics in terms of partial differential equations. The area of random media is concerned with the properties of materials or environments that are not homogeneous. Percolation theory studies one of the simplest stated models for impurities - taking a lattice and removing some of the vertices or bonds. Luiz Renato G. Fontes and Vladas Sidoravicius give a detailed introduction to this area. Random walk in random environment combines two sources of randomness - a particle performing stochastic motion in which the transition probabilities depend on position and have been chosen from some probability distribution. Alain-Sol Sznitman gives a survey of recent developments in this

  17. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, L_st, p_st) for stochastic uncertainty, a probability space (S_su, L_su, p_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, L_st, p_st) and (S_su, L_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems

  18. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-01-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, L_st, P_st) for stochastic uncertainty, a probability space (S_su, L_su, P_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, L_st, P_st) and (S_su, L_su, P_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the US Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems

  19. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in 1600, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", that is, a function providing the probabilities of occurrence of different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied to statistical analysis.
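
    As a small, self-contained illustration of a probability distribution in the sense used above (an example of mine, not taken from the paper), scipy exposes the normal density and cumulative distribution directly:

```python
from scipy.stats import norm

# Normal distribution with mean 100 and standard deviation 15
# (a hypothetical IQ-like scale).
dist = norm(loc=100, scale=15)

print(dist.pdf(100))                 # density at the mean
print(dist.cdf(115))                 # P(X <= 115), about 0.841
print(dist.cdf(130) - dist.cdf(70))  # P(70 <= X <= 130), about 0.954
```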

  20. On the universality of knot probability ratios

    Energy Technology Data Exchange (ETDEWEB)

    Janse van Rensburg, E J [Department of Mathematics and Statistics, York University, Toronto, Ontario M3J 1P3 (Canada); Rechnitzer, A, E-mail: rensburg@yorku.ca, E-mail: andrewr@math.ubc.ca [Department of Mathematics, University of British Columbia, 1984 Mathematics Road, Vancouver, BC V6T 1Z2 (Canada)

    2011-04-22

    Let p_n denote the number of self-avoiding polygons of length n on a regular three-dimensional lattice, and let p_n(K) be the number which have knot type K. The probability that a random polygon of length n has knot type K is p_n(K)/p_n and is known to decay exponentially with length (Sumners and Whittington 1988 J. Phys. A: Math. Gen. 21 1689-94, Pippenger 1989 Discrete Appl. Math. 25 273-8). Little is known rigorously about the asymptotics of p_n(K), but there is substantial numerical evidence. It is believed that the entropic exponent, α, is universal, while the exponential growth rate is independent of the knot type but varies with the lattice. The amplitude, C_K, depends on both the lattice and the knot type. The asymptotic form implies that the relative probability of a random polygon of length n having prime knot type K rather than prime knot type L is given by the ratio p_n(K)/p_n(L). In the thermodynamic limit this probability ratio becomes an amplitude ratio; it should be universal and depend only on the knot types K and L. In this communication we examine the universality of these probability ratios for polygons in the simple cubic, face-centred cubic and body-centred cubic lattices. Our results support the hypothesis that these are universal quantities. For example, we estimate that a long random polygon is approximately 28 times more likely to be a trefoil than a figure-eight, independent of the underlying lattice, giving an estimate of the intrinsic entropy associated with knot types in closed curves. (fast track communication)
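
    The "asymptotic form" this record alludes to appears to have been lost in extraction. The form commonly conjectured in the knotted-polygon literature is reproduced below as background, not as a quotation from the record; here N_K denotes the number of prime components of knot type K.

```latex
% Conjectured asymptotics for the number of length-n polygons of knot type K;
% mu is the lattice-dependent growth rate, alpha the universal entropic exponent.
p_n(K) \simeq C_K \, \mu^{n} \, n^{\alpha - 3 + N_K},
\qquad \text{hence} \qquad
\frac{p_n(K)}{p_n(L)} \to \frac{C_K}{C_L}
\quad \text{for prime } K, L .
```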

  1. Palliative radiotherapy in addition to self-expanding metal stent for improving dysphagia and survival in advanced oesophageal cancer (ROCS: Radiotherapy after Oesophageal Cancer Stenting): study protocol for a randomized controlled trial.

    Science.gov (United States)

    Adamson, Douglas; Blazeby, Jane; Nelson, Annmarie; Hurt, Chris; Nixon, Lisette; Fitzgibbon, Jim; Crosby, Tom; Staffurth, John; Evans, Mim; Kelly, Noreen Hopewell; Cohen, David; Griffiths, Gareth; Byrne, Anthony

    2014-10-22

    The single most distressing symptom for patients with advanced esophageal cancer is dysphagia. Amongst the more effective treatments for relief of dysphagia is insertion of a self-expanding metal stent (SEMS). It is possible that the addition of a palliative dose of external beam radiotherapy may prolong the relief of dysphagia and provide additional survival benefit. The ROCS trial will assess the effect of adding palliative radiotherapy after esophageal stent insertion. The study is a randomized multicenter phase III trial, with an internal pilot phase, comparing stent alone versus stent plus palliative radiotherapy in patients with incurable esophageal cancer. Eligible participants are those with advanced esophageal cancer who are in need of stent insertion for primary management of dysphagia. Radiotherapy will be administered as 20 Gray (Gy) in five fractions over one week or 30 Gy in 10 fractions over two weeks, within four weeks of stent insertion. The internal pilot will assess rates and methods of recruitment; pre-agreed criteria will determine progression to the main trial. In total, 496 patients will be randomized in a 1:1 ratio with follow-up until death. The primary outcome is time to progression of patient-reported dysphagia. Secondary outcomes include survival, toxicity, health resource utilization, and quality of life. An embedded qualitative study will explore the feasibility of patient recruitment by examining patients' motivations for involvement and their experiences of consent and recruitment, including reasons for not consenting. It will also explore patients' experiences of each trial arm. The ROCS study will be a challenging trial studying palliation in patients with a poor prognosis. The internal pilot design will optimize methods for recruitment and data collection to ensure that the main trial is completed on time. As a pragmatic trial, the study's strengths include collection of all follow-up data in the usual place of care, and a focus on

  2. Organ Preservation in Rectal Adenocarcinoma: a phase II randomized controlled trial evaluating 3-year disease-free survival in patients with locally advanced rectal cancer treated with chemoradiation plus induction or consolidation chemotherapy, and total mesorectal excision or nonoperative management

    International Nuclear Information System (INIS)

    Smith, J. Joshua; Chow, Oliver S.; Gollub, Marc J.; Nash, Garrett M.; Temple, Larissa K.; Weiser, Martin R.; Guillem, José G.; Paty, Philip B.; Avila, Karin; Garcia-Aguilar, Julio

    2015-01-01

    Treatment of patients with non-metastatic, locally advanced rectal cancer (LARC) includes pre-operative chemoradiation, total mesorectal excision (TME) and post-operative adjuvant chemotherapy. This trimodality treatment provides local tumor control in most patients, but almost one-third ultimately die from distant metastasis. Most survivors experience significant impairment in quality of life (QoL), due primarily to removal of the rectum. A current challenge lies in identifying patients who could safely undergo rectal preservation without sacrificing survival benefit and QoL. This multi-institutional, phase II study investigates the efficacy of total neoadjuvant therapy (TNT) and selective non-operative management (NOM) in LARC. Patients with MRI-staged Stage II or III rectal cancer amenable to TME will be randomized to receive FOLFOX/CAPEOX either: a) before chemoradiation, as induction neoadjuvant chemotherapy (INCT); or b) after chemoradiation, as consolidation neoadjuvant chemotherapy (CNCT); chemoradiation in both arms is 5-FU- or capecitabine-based. Patients in both arms will be re-staged after completing all neoadjuvant therapy. Those with residual tumor at the primary site will undergo TME. Patients with clinical complete response (cCR) will receive non-operative management (NOM). NOM patients will be followed every 3 months for 2 years, and every 6 months thereafter. TME patients will be followed according to NCCN guidelines. All will be followed for at least 5 years from the date of surgery or—in patients treated with NOM—the last day of treatment. The studies published thus far on the safety of NOM in LARC have compared survival between select groups of patients with a cCR after NOM and patients with a pathologic complete response (pCR) after TME. The current study compares 3-year disease-free survival (DFS) in an entire population of patients with LARC, including those with cCR and those with pCR. We will compare the two arms of the study with respect to organ preservation at 3 years, treatment

  3. Organ Preservation in Rectal Adenocarcinoma: a phase II randomized controlled trial evaluating 3-year disease-free survival in patients with locally advanced rectal cancer treated with chemoradiation plus induction or consolidation chemotherapy, and total mesorectal excision or nonoperative management.

    Science.gov (United States)

    Smith, J Joshua; Chow, Oliver S; Gollub, Marc J; Nash, Garrett M; Temple, Larissa K; Weiser, Martin R; Guillem, José G; Paty, Philip B; Avila, Karin; Garcia-Aguilar, Julio

    2015-10-23

    Treatment of patients with non-metastatic, locally advanced rectal cancer (LARC) includes pre-operative chemoradiation, total mesorectal excision (TME) and post-operative adjuvant chemotherapy. This trimodality treatment provides local tumor control in most patients, but almost one-third ultimately die from distant metastasis. Most survivors experience significant impairment in quality of life (QoL), due primarily to removal of the rectum. A current challenge lies in identifying patients who could safely undergo rectal preservation without sacrificing survival benefit and QoL. This multi-institutional, phase II study investigates the efficacy of total neoadjuvant therapy (TNT) and selective non-operative management (NOM) in LARC. Patients with MRI-staged Stage II or III rectal cancer amenable to TME will be randomized to receive FOLFOX/CAPEOX either: a) before chemoradiation, as induction neoadjuvant chemotherapy (INCT); or b) after chemoradiation, as consolidation neoadjuvant chemotherapy (CNCT); chemoradiation in both arms is 5-FU- or capecitabine-based. Patients in both arms will be re-staged after completing all neoadjuvant therapy. Those with residual tumor at the primary site will undergo TME. Patients with clinical complete response (cCR) will receive non-operative management (NOM). NOM patients will be followed every 3 months for 2 years, and every 6 months thereafter. TME patients will be followed according to NCCN guidelines. All will be followed for at least 5 years from the date of surgery or--in patients treated with NOM--the last day of treatment. The studies published thus far on the safety of NOM in LARC have compared survival between select groups of patients with a cCR after NOM and patients with a pathologic complete response (pCR) after TME. The current study compares 3-year disease-free survival (DFS) in an entire population of patients with LARC, including those with cCR and those with pCR. We will compare the two arms of the study with respect to organ preservation at 3 years, treatment compliance

  4. Probability and Statistics The Science of Uncertainty (Revised Edition)

    CERN Document Server

    Tabak, John

    2011-01-01

    Probability and Statistics, Revised Edition deals with the history of probability, describing the modern concept of randomness and examining "pre-probabilistic" ideas of what most people today would characterize as randomness. This revised book documents some historically important early uses of probability to illustrate some very important probabilistic questions. It goes on to explore statistics and the generations of mathematicians and non-mathematicians who began to address problems in statistical analysis, including the statistical structure of data sets as well as the theory of

  5. Entanglement probabilities of polymers: a white noise functional approach

    International Nuclear Information System (INIS)

    Bernido, Christopher C; Carpio-Bernido, M Victoria

    2003-01-01

    The entanglement probabilities for a highly flexible polymer to wind n times around a straight polymer are evaluated using white noise analysis. To introduce the white noise functional approach, the one-dimensional random walk problem is taken as an example. The polymer entanglement scenario, viewed as a random walk on a plane, is then treated and the entanglement probabilities are obtained for a magnetic flux confined along the straight polymer, and a case where an entangled polymer is subjected to the potential V = ḟ(s)θ. In the absence of the magnetic flux and the potential V, the entanglement probabilities reduce to a result obtained by Wiegel

  6. Decitabine improves progression-free survival in older high-risk MDS patients with multiple autosomal monosomies: results of a subgroup analysis of the randomized phase III study 06011 of the EORTC Leukemia Cooperative Group and German MDS Study Group.

    Science.gov (United States)

    Lübbert, Michael; Suciu, Stefan; Hagemeijer, Anne; Rüter, Björn; Platzbecker, Uwe; Giagounidis, Aristoteles; Selleslag, Dominik; Labar, Boris; Germing, Ulrich; Salih, Helmut R; Muus, Petra; Pflüger, Karl-Heinz; Schaefer, Hans-Eckart; Bogatyreva, Lioudmila; Aul, Carlo; de Witte, Theo; Ganser, Arnold; Becker, Heiko; Huls, Gerwin; van der Helm, Lieke; Vellenga, Edo; Baron, Frédéric; Marie, Jean-Pierre; Wijermans, Pierre W

    2016-01-01

    In a study of elderly AML patients treated with the hypomethylating agent decitabine (DAC), we noted a surprisingly favorable outcome in the (usually very unfavorable) subgroup with two or more autosomal monosomies (MK2+) within a complex karyotype (Lübbert et al., Haematologica 97:393-401, 2012). We now analyzed 206 myelodysplastic syndrome (MDS) patients (88% of 233 patients randomized in the EORTC/GMDSSG phase III trial 06011, 61 of them with RAEBt, i.e. AML by WHO) with cytogenetics informative for MK status. Endpoints are the following: complete/partial response (CR/PR), overall response rate (ORR), progression-free survival (PFS), and overall survival (OS). Cytogenetic subgroups are the following: 63 cytogenetically normal (CN) patients and 143 with cytogenetic abnormalities, 73 of them MK-negative (MK-) and 70 MK-positive (MK+). The MK+ patients could be divided into 17 with a single autosomal monosomy (MK1) and 53 with at least two monosomies (MK2+). ORR with DAC was 36.1% in CN patients, 16.7% in MK- patients, and 43.6% in MK+ patients (MK1: 44.4%, MK2+: 43.3%). PFS was prolonged by DAC compared to best supportive care (BSC) in the CN (hazard ratio (HR) 0.55; 99% confidence interval (CI), 0.26-1.15; p = 0.03) and MK2+ (HR 0.50; 99% CI, 0.23-1.06; p = 0.016) subgroups, but not in the MK-, MK+, and MK1 subgroups. OS was not improved by DAC in any subgroup. In conclusion, we demonstrate for the first time in a randomized phase III trial that high-risk MDS patients with complex karyotypes harboring two or more autosomal monosomies attain encouraging responses and have improved PFS with DAC treatment compared to BSC.

  7. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes' theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  8. The Influence of Mark-Recapture Sampling Effort on Estimates of Rock Lobster Survival.

    Directory of Open Access Journals (Sweden)

    Ziya Kordjazi

    Full Text Available Five annual capture-mark-recapture surveys on Jasus edwardsii were used to evaluate the effect of sample size and fishing effort on the precision of estimated survival probability. Datasets of different numbers of individual lobsters (ranging from 200 to 1,000 lobsters) were created by random subsampling from each annual survey. This process of random subsampling was also used to create 12 datasets of different levels of effort based on three levels of the number of traps (15, 30 and 50 traps per day) and four levels of the number of sampling-days (2, 4, 6 and 7 days). The most parsimonious Cormack-Jolly-Seber (CJS) model for estimating survival probability shifted from a constant model towards sex-dependent models with increasing sample size and effort. A sample of 500 lobsters or 50 traps used on four consecutive sampling-days was required for obtaining precise survival estimations for males and females, separately. Reduced sampling effort of 30 traps over four sampling days was sufficient if a survival estimate for both sexes combined was sufficient for management of the fishery.

  9. The Influence of Mark-Recapture Sampling Effort on Estimates of Rock Lobster Survival

    Science.gov (United States)

    Kordjazi, Ziya; Frusher, Stewart; Buxton, Colin; Gardner, Caleb; Bird, Tomas

    2016-01-01

    Five annual capture-mark-recapture surveys on Jasus edwardsii were used to evaluate the effect of sample size and fishing effort on the precision of estimated survival probability. Datasets of different numbers of individual lobsters (ranging from 200 to 1,000 lobsters) were created by random subsampling from each annual survey. This process of random subsampling was also used to create 12 datasets of different levels of effort based on three levels of the number of traps (15, 30 and 50 traps per day) and four levels of the number of sampling-days (2, 4, 6 and 7 days). The most parsimonious Cormack-Jolly-Seber (CJS) model for estimating survival probability shifted from a constant model towards sex-dependent models with increasing sample size and effort. A sample of 500 lobsters or 50 traps used on four consecutive sampling-days was required for obtaining precise survival estimations for males and females, separately. Reduced sampling effort of 30 traps over four sampling days was sufficient if a survival estimate for both sexes combined was sufficient for management of the fishery. PMID:26990561
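
    A bare-bones version of the subsampling experiment in these two records (simulated fates instead of lobster surveys, and ignoring detection probability, which a real CJS model estimates jointly with survival) shows how the precision of a survival estimate improves with sample size:

```python
import numpy as np

rng = np.random.default_rng(42)
TRUE_SURVIVAL = 0.7
population = rng.random(5000) < TRUE_SURVIVAL  # simulated fates of marked animals

for n in [200, 400, 600, 800, 1000]:
    # Repeatedly subsample n individuals and estimate survival from each subsample.
    estimates = [rng.choice(population, size=n, replace=False).mean()
                 for _ in range(1000)]
    print(f"n = {n:4d}  mean estimate = {np.mean(estimates):.3f}  "
          f"SE = {np.std(estimates):.4f}")
```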

  10. Foreign Ownership and Long-term Survival

    DEFF Research Database (Denmark)

    Kronborg, Dorte; Thomsen, Steen

    2006-01-01

    probability. On average exit risk for domestic companies is 2.3 times higher than for foreign companies. First movers like Siemens, Philips, Kodak, Ford, GM or Goodyear have been active in the country for almost a century. Relative foreign survival increases with company age. However, the foreign survival...

  11. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step by step procedure for constructing a (compound) free Poisso...

  12. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  13. Probabilistic Survivability Versus Time Modeling

    Science.gov (United States)

    Joyner, James J., Sr.

    2016-01-01

    This presentation documents Kennedy Space Center's Independent Assessment work completed on three assessments for the Ground Systems Development and Operations (GSDO) Program to assist the Chief Safety and Mission Assurance Officer during key programmatic reviews, and it provided the GSDO Program with analyses of how egress time affects the likelihood of astronaut and ground worker survival during an emergency. For each assessment, a team developed probability distributions for hazard scenarios to address statistical uncertainty, resulting in survivability plots over time. The first assessment developed a mathematical model of probabilistic survivability versus time to reach a safe location using an ideal Emergency Egress System at Launch Complex 39B (LC-39B); the second used the first model to evaluate and compare various egress systems under consideration at LC-39B. The third used a modified LC-39B model to determine whether a specific hazard decreased survivability more rapidly than other events during flight hardware processing in Kennedy's Vehicle Assembly Building.
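
    A toy Monte Carlo in the spirit of the survivability-versus-time curves described above (entirely hypothetical numbers, not the KSC models): a person survives a trial if their egress time beats the randomly distributed onset time of the hazard.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100_000

# Hypothetical hazard-onset times in seconds, lognormally distributed
# (median about 55 s under these illustrative parameters).
hazard_onset = rng.lognormal(mean=4.0, sigma=0.5, size=N)

# Survivability as a function of the time needed to reach a safe location:
# a trial is survived if egress completes before the hazard arrives.
for egress_time in [20, 40, 60, 90, 120]:
    p_survive = np.mean(hazard_onset > egress_time)
    print(f"egress {egress_time:3d} s -> survival probability {p_survive:.3f}")
```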

  14. Absolute transition probabilities for 559 strong lines of neutral cerium

    Energy Technology Data Exchange (ETDEWEB)

    Curry, J J, E-mail: jjcurry@nist.go [National Institute of Standards and Technology, Gaithersburg, MD 20899-8422 (United States)

    2009-07-07

    Absolute radiative transition probabilities are reported for 559 strong lines of neutral cerium covering the wavelength range 340-880 nm. These transition probabilities are obtained by scaling published relative line intensities (Meggers et al 1975 Tables of Spectral Line Intensities (National Bureau of Standards Monograph 145)) with a smaller set of published absolute transition probabilities (Bisson et al 1991 J. Opt. Soc. Am. B 8 1545). All 559 new values are for lines for which transition probabilities have not previously been available. The estimated relative random uncertainty of the new data is ±35% for nearly all lines.

  15. What Are Probability Surveys used by the National Aquatic Resource Surveys?

    Science.gov (United States)

    The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.

  16. Estimating true instead of apparent survival using spatial Cormack-Jolly-Seber models

    Science.gov (United States)

    Schaub, Michael; Royle, J. Andrew

    2014-01-01

    Survival is often estimated from capture–recapture data using Cormack–Jolly–Seber (CJS) models, where mortality and emigration cannot be distinguished, and the estimated apparent survival probability is the product of the probabilities of true survival and of study area fidelity. Consequently, apparent survival is lower than true survival unless study area fidelity equals one. Underestimation of true survival from capture–recapture data is a main limitation of the method.
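
    The relationship described here can be written compactly (standard capture-recapture notation, not an equation quoted from the paper):

```latex
% Apparent survival from a CJS model confounds true survival S with
% study-area fidelity F:
\phi_{\text{apparent}} = S \cdot F \le S ,
\qquad \text{with equality only when } F = 1 .
```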

  17. Probability and statistics for particle physics

    CERN Document Server

    Mana, Carlos

    2017-01-01

    This book comprehensively presents the basic concepts of probability and Bayesian inference with sufficient generality to make them applicable to current problems in scientific research. The first chapter provides the fundamentals of probability theory that are essential for the analysis of random phenomena. The second chapter includes a full and pragmatic review of the Bayesian methods that constitute a natural and coherent framework with enough freedom to analyze all the information available from experimental data in a conceptually simple manner. The third chapter presents the basic Monte Carlo techniques used in scientific research, allowing a large variety of problems that are difficult to tackle by other procedures to be handled. The author also introduces a basic algorithm, which enables readers to simulate samples from simple distributions, and describes useful cases for researchers in particle physics. The final chapter is devoted to the basic ideas of Information Theory, which are important in the Bayesian me...

  18. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Vol. 315, No. 1 (2017), pp. 240-248. ISSN 0377-0427. Institutional support: RVO:67985556. Keywords: decomposition integral; superdecomposition integral; probability inequalities. Subject RIV: BA - General Mathematics. OECD field: Statistics and probability. Impact factor: 1.357, year: 2016. http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  19. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  20. Quantum correlations in terms of neutrino oscillation probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Alok, Ashutosh Kumar, E-mail: akalok@iitj.ac.in [Indian Institute of Technology Jodhpur, Jodhpur 342011 (India); Banerjee, Subhashish, E-mail: subhashish@iitj.ac.in [Indian Institute of Technology Jodhpur, Jodhpur 342011 (India); Uma Sankar, S., E-mail: uma@phy.iitb.ac.in [Indian Institute of Technology Bombay, Mumbai 400076 (India)

    2016-08-15

    Neutrino oscillations provide evidence for the mode entanglement of neutrino mass eigenstates in a given flavour eigenstate. Given this mode entanglement, it is pertinent to consider the relation between the oscillation probabilities and other quantum correlations. In this work, we show that all the well-known quantum correlations, such as Bell's inequality, are directly related to the neutrino oscillation probabilities. The results of the neutrino oscillation experiments, which measure the neutrino survival probability to be less than unity, imply Bell's inequality violation.
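
    For reference, the survival probability such experiments measure has, in the two-flavor vacuum case, a standard closed form. The following is a quick numerical sketch of that textbook formula, not code from this paper; the parameter values are purely illustrative.

```python
import numpy as np

def survival_probability(L_km, E_GeV, sin2_2theta=0.85, dm2_eV2=2.5e-3):
    """Two-flavor vacuum survival probability P(nu -> nu).

    Standard formula: P = 1 - sin^2(2 theta) * sin^2(1.267 * dm^2 * L / E),
    with L in km, E in GeV, and dm^2 in eV^2.
    """
    return 1.0 - sin2_2theta * np.sin(1.267 * dm2_eV2 * L_km / E_GeV) ** 2

# The survival probability dips below unity near oscillation maxima:
for L in [0, 250, 500, 750, 1000]:
    print(f"L = {L:4d} km  P_survival = {survival_probability(L, 1.0):.3f}")
```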

  1. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  2. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  3. STRIP: stream learning of influence probabilities

    DEFF Research Database (Denmark)

    Kutzkov, Konstantin

    2013-01-01

    ...cascades, and developing applications such as viral marketing. Motivated by modern microblogging platforms, such as Twitter, in this paper we study the problem of learning influence probabilities in a data-stream scenario, in which the network topology is relatively stable and the challenge of a learning algorithm is to keep up with a continuous stream of tweets using a small amount of time and memory. Our contribution is a number of randomized approximation algorithms, categorized according to the available space (superlinear, linear, and sublinear in the number of nodes n) and according to different models...

  4. APPROXIMATION OF PROBABILITY DISTRIBUTIONS IN QUEUEING MODELS

    Directory of Open Access Journals (Sweden)

    T. I. Aliev

    2013-03-01

    Full Text Available For probability distributions with a coefficient of variation not equal to unity, mathematical dependences for approximating distributions on the basis of the first two moments are derived by making use of multi-exponential distributions. It is proposed to approximate distributions with a coefficient of variation less than unity by using the hypoexponential distribution, which makes it possible to generate random variables with a coefficient of variation taking any value in the range (0; 1), as opposed to the Erlang distribution, which attains only discrete values of the coefficient of variation.
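
    A minimal sketch of the two-moment fit described above may be useful. It assumes a two-phase hypoexponential, which can only realize squared coefficients of variation in [0.5, 1) (lower values require more phases); the function names and target values are illustrative.

    ```python
    import numpy as np

    def fit_hypoexponential(mean, cv):
        """Two-moment fit of a two-phase hypoexponential.

        With phase means x = 1/lam1 and y = 1/lam2 we need
            x + y = mean,   x**2 + y**2 = (cv * mean)**2,
        so x and y are the roots of t**2 - mean*t + mean**2*(1 - cv**2)/2 = 0.
        """
        assert 0.5 <= cv ** 2 < 1.0, "two phases only cover cv^2 in [0.5, 1)"
        disc = mean ** 2 * (2 * cv ** 2 - 1)
        x = (mean + np.sqrt(disc)) / 2
        y = mean - x
        return 1 / x, 1 / y  # the two rates

    rng = np.random.default_rng(0)
    lam1, lam2 = fit_hypoexponential(mean=2.0, cv=0.8)
    # A hypoexponential variate is a sum of independent exponential phases.
    s = rng.exponential(1 / lam1, 100_000) + rng.exponential(1 / lam2, 100_000)
    print(f"empirical mean {s.mean():.3f}, empirical cv {s.std() / s.mean():.3f}")
    ```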

  5. Parent–offspring resemblance in colony-specific adult survival of cliff swallows

    Science.gov (United States)

    Brown, Charles R.; Roche, Erin A.; Brown, Mary Bomberger

    2015-01-01

    Survival is a key component of fitness. Species that occupy discrete breeding colonies with different characteristics are often exposed to varying costs and benefits associated with group size or environmental conditions, and survival is an integrative net measure of these effects. We investigated the extent to which survival probability of adult (≥1-year old) cliff swallows (Petrochelidon pyrrhonota) occupying different colonies resembled that of their parental cohort and thus whether the natal colony had long-term effects on individuals. Individuals were cross-fostered between colonies soon after hatching and their presence as breeders monitored at colonies in the western Nebraska study area for the subsequent decade. Colony-specific adult survival probabilities of offspring born and reared in the same colony, and those cross-fostered away from their natal colony soon after birth, were positively and significantly related to subsequent adult survival of the parental cohort from the natal colony. This result held when controlling for the effect of natal colony size and the age composition of the parental cohort. In contrast, colony-specific adult survival of offspring cross-fostered to a site was unrelated to that of their foster parent cohort or to the cohort of non-fostered offspring with whom they were reared. Adult survival at a colony varied inversely with fecundity, as measured by mean brood size, providing evidence for a survival–fecundity trade-off in this species. The results suggest some heritable variation in adult survival, likely maintained by negative correlations between fitness components. The study provides additional evidence that colonies represent non-random collections of individuals.

  6. Extensions and Applications of the Cox-Aalen Survival Model

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    2003-01-01

    Aalen additive risk model; competing risk; counting processes; Cox model; cumulative incidence function; goodness of fit; prediction of survival probability; time-varying effects

  7. Flu Shots, Mammogram, and the Perception of Probabilities

    NARCIS (Netherlands)

    Carman, K.G.; Kooreman, P.

    2010-01-01

    We study individuals’ decisions to decline or accept preventive health care interventions such as flu shots and mammograms. In particular, we analyze the role of perceptions of the effectiveness of the intervention, by eliciting individuals' subjective probabilities of sickness and survival, with...

  8. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  9. Sampling Random Bioinformatics Puzzles using Adaptive Probability Distributions

    DEFF Research Database (Denmark)

    Have, Christian Theil; Appel, Emil Vincent; Bork-Jensen, Jette

    2016-01-01

    We present a probabilistic logic program to generate an educational puzzle that introduces the basic principles of next generation sequencing, gene finding and the translation of genes to proteins following the central dogma in biology. In the puzzle, a secret "protein word" must be found by asse...

  10. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application...

  11. Surviving Sengstaken.

    Science.gov (United States)

    Jayakumar, S; Odulaja, A; Patel, S; Davenport, M; Ade-Ajayi, N

    2015-07-01

    To report the outcomes of children who underwent Sengstaken-Blakemore tube (SBT) insertion for life-threatening haematemesis. Single institution retrospective review (1997-2012) of children managed with SBT insertion. Patient demographics, diagnosis and outcomes were noted. Data are expressed as median (range). 19 children [10 male, age 1 (0.4-16) yr] were identified; 18 had gastro-oesophageal varices and 1 an aorto-oesophageal fistula. Varices were secondary to: biliary atresia (n=8), portal vein thrombosis (n=5), alpha-1-anti-trypsin deficiency (n=1), cystic fibrosis (n=1), intrahepatic cholestasis (n=1), sclerosing cholangitis (n=1) and nodular hyperplasia with arterio-portal shunt (n=1). Three children deteriorated rapidly and did not survive to have post-SBT endoscopy. The child with an aorto-oesophageal fistula underwent aortic stent insertion and subsequently oesophageal replacement. Complications included gastric mucosal ulceration (n=3, 16%), pressure necrosis at lips and cheeks (n=6, 31%) and SBT dislodgement (n=1, 6%). Six (31%) children died. The remaining 13 have been followed up for 62 (2-165) months; five required liver transplantation, two underwent a mesocaval shunt procedure and six have completed endoscopic variceal obliteration and are under surveillance. SBT insertion can be an effective, albeit temporary, life-saving manoeuvre in children with catastrophic haematemesis. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. How Life History Can Sway the Fixation Probability of Mutants

    Science.gov (United States)

    Li, Xiang-Yi; Kurokawa, Shun; Giaimo, Stefano; Traulsen, Arne

    2016-01-01

    In this work, we study the effects of demographic structure on evolutionary dynamics when selection acts on reproduction, survival, or both. In contrast to the previously discovered pattern that the fixation probability of a neutral mutant decreases while the population becomes younger, we show that a mutant with a constant selective advantage may have a maximum or a minimum of the fixation probability in populations with an intermediate fraction of young individuals. This highlights the importance of life history and demographic structure in studying evolutionary dynamics. We also illustrate the fundamental differences between selection on reproduction and selection on survival when age structure is present. In addition, we evaluate the relative importance of size and structure of the population in determining the fixation probability of the mutant. Our work lays the foundation for also studying density- and frequency-dependent effects in populations when demographic structures cannot be neglected. PMID:27129737

  13. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Introduction to probability and measure

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.

  15. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...

  16. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

    Washington, DC, USA. Max Lotstein and Phil Johnson-Laird, Department of Psychology, Princeton University, Princeton, NJ, USA. August 30th, 2012. ...social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as... ...retorted that such a flagrant violation of the probability calculus was a result of a psychological experiment that obscured the rationality of the...

  17. Probability Matching, Fast and Slow

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2014-01-01

    A prominent point of contention among researchers regarding the interpretation of probability-matching behavior is whether it represents a cognitively sophisticated, adaptive response to the inherent uncertainty of the tasks or settings in which it is observed, or whether instead it represents a fundamental shortcoming in the heuristics that support and guide human decision making. Put crudely, researchers disagree on whether probability matching is "smart" or "dumb." Here, we consider eviden...

  18. Reliability of structures by using probability and fatigue theories

    International Nuclear Information System (INIS)

    Lee, Ouk Sub; Kim, Dong Hyeok; Park, Yeon Chang

    2008-01-01

    Methodologies to calculate the failure probability and to estimate the reliability of fatigue-loaded structures are developed. The applicability of the methodologies is evaluated with the help of the fatigue crack growth models suggested by Paris and Walker. Probability theories such as the FORM (first-order reliability method), the SORM (second-order reliability method) and MCS (Monte Carlo simulation) are utilized. It is found that the failure probability decreases with an increase of the design fatigue life and the applied minimum stress, and with a decrease of the initial edge crack size, the applied maximum stress and the slope of the Paris equation. Furthermore, according to the sensitivity analysis of random variables, the slope of the Paris equation affects the failure probability dominantly among the random variables in the Paris and Walker models.
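
    The record gives no computational detail, so the following Monte Carlo sketch is only a plausible reconstruction of such a failure-probability estimate: Paris' law with a unit geometry factor (so the cycles-to-failure integral has a closed form) and entirely illustrative input distributions.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 200_000

    # Illustrative random inputs (units: MPa, m; all distributions assumed).
    a0 = rng.lognormal(np.log(1e-3), 0.2, n)      # initial edge crack size [m]
    C = rng.lognormal(np.log(5e-12), 0.3, n)      # Paris coefficient
    m = 3.0                                       # Paris exponent (fixed here)
    dsig = rng.normal(100.0, 10.0, n)             # stress range [MPa]
    a_c = 0.02                                    # critical crack size [m]

    # Paris law da/dN = C*(dsig*sqrt(pi*a))**m integrated from a0 to a_c:
    # N_f = (a_c**(1-m/2) - a0**(1-m/2)) / (C * (dsig*sqrt(pi))**m * (1-m/2))
    e = 1 - m / 2
    N_f = (a_c ** e - a0 ** e) / (C * (dsig * np.sqrt(np.pi)) ** m * e)

    N_design = 1e6  # assumed design fatigue life [cycles]
    print(f"estimated failure probability: {np.mean(N_f < N_design):.4f}")
    ```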

  19. Theory of overdispersion in counting statistics caused by fluctuating probabilities

    International Nuclear Information System (INIS)

    Semkow, Thomas M.

    1999-01-01

    It is shown that the random Lexis fluctuations of probabilities such as probability of decay or detection cause the counting statistics to be overdispersed with respect to the classical binomial, Poisson, or Gaussian distributions. The generating and the distribution functions for the overdispersed counting statistics are derived. Applications to radioactive decay with detection and more complex experiments are given, as well as distinguishing between the source and background, in the presence of overdispersion. Monte-Carlo verifications are provided
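
    The mechanism is easy to reproduce in a toy simulation: draw the per-trial probability anew for each experiment (here from a Beta distribution, an assumption made purely for convenience) and the counts come out overdispersed relative to the plain binomial.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n_trials, n_exp = 10_000, 50_000
    p_mean, kappa = 0.3, 50.0  # kappa sets the size of the Lexis fluctuations

    # Fixed probability: classical binomial counting statistics.
    fixed = rng.binomial(n_trials, p_mean, size=n_exp)

    # Fluctuating probability: p redrawn for every experiment (beta-binomial).
    p = rng.beta(p_mean * kappa, (1 - p_mean) * kappa, size=n_exp)
    fluct = rng.binomial(n_trials, p)

    print("binomial variance n*p*(1-p):", n_trials * p_mean * (1 - p_mean))
    print("fixed-p sample variance    :", round(fixed.var()))
    print("fluctuating-p variance     :", round(fluct.var()))  # overdispersed
    ```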

  20. Information-theoretic methods for estimating of complicated probability distributions

    CERN Document Server

    Zong, Zhi

    2006-01-01

    Mixing up various disciplines frequently produces something profound and far-reaching. Cybernetics is such an often-quoted example. The mix of information theory, statistics and computing technology proves to be very useful, which leads to the recent development of information-theory based methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is the fundamental task for quite a few fields besides statistics, such as reliability, probabilistic risk analysis (PSA), machine learning, pattern recognition, image processing, neur...

  1. Random phenomena; Phenomenes aleatoires

    Energy Technology Data Exchange (ETDEWEB)

    Bonnet, G. [Commissariat a l' energie atomique et aux energies alternatives - CEA, C.E.N.G., Service d' Electronique, Section d' Electronique, Grenoble (France)

    1963-07-01

    This document gathers a set of lectures presented in 1962. The first proposes a mathematical introduction to the analysis of random phenomena. The second presents an axiomatics of probability calculus. The third gives an overview of one-dimensional random variables. The fourth addresses random pairs and presents basic theorems regarding the algebra of mathematical expectations. The fifth discusses some probability laws: the binomial distribution, the Poisson distribution, and the Laplace-Gauss distribution. The last deals with the issues of stochastic convergence and asymptotic distributions.

  2. Elements of probability and statistics an introduction to probability with De Finetti’s approach and to Bayesian statistics

    CERN Document Server

    Biagini, Francesca

    2016-01-01

    This book provides an introduction to elementary probability and to Bayesian statistics using de Finetti's subjectivist approach. One of the features of this approach is that it does not require the introduction of sample space – a non-intrinsic concept that makes the treatment of elementary probability unnecessarily complicated – but introduces as fundamental the concept of random numbers directly related to their interpretation in applications. Events become a particular case of random numbers and probability a particular case of expectation when it is applied to events. The subjective evaluation of expectation and of conditional expectation is based on an economic choice of an acceptable bet or penalty. The properties of expectation and conditional expectation are derived by applying a coherence criterion that the evaluation has to follow. The book is suitable for all introductory courses in probability and statistics for students in Mathematics, Informatics, Engineering, and Physics.

  3. Visualization techniques for spatial probability density function data

    Directory of Open Access Journals (Sweden)

    Udeepta D Bordoloi

    2006-01-01

    Full Text Available Novel visualization methods are presented for spatial probability density function data. These are spatial datasets, where each pixel is a random variable, and has multiple samples which are the results of experiments on that random variable. We use clustering as a means to reduce the information contained in these datasets; and present two different ways of interpreting and clustering the data. The clustering methods are used on two datasets, and the results are discussed with the help of visualization techniques designed for the spatial probability data.

  4. Betting on Illusory Patterns: Probability Matching in Habitual Gamblers.

    Science.gov (United States)

    Gaissmaier, Wolfgang; Wilke, Andreas; Scheibehenne, Benjamin; McCanney, Paige; Barrett, H Clark

    2016-03-01

    Why do people gamble? A large body of research suggests that cognitive distortions play an important role in pathological gambling. Many of these distortions are specific cases of a more general misperception of randomness, specifically of an illusory perception of patterns in random sequences. In this article, we provide further evidence for the assumption that gamblers are particularly prone to perceiving illusory patterns. In particular, we compared habitual gamblers to a matched sample of community members with regard to how much they exhibit the choice anomaly 'probability matching'. Probability matching describes the tendency to match response proportions to outcome probabilities when predicting binary outcomes. It leads to a lower expected accuracy than the maximizing strategy of predicting the most likely event on each trial. Previous research has shown that an illusory perception of patterns in random sequences fuels probability matching. So does impulsivity, which is also reported to be higher in gamblers. We therefore hypothesized that gamblers will exhibit more probability matching than non-gamblers, which was confirmed in a controlled laboratory experiment. Additionally, gamblers scored much lower than community members on the cognitive reflection task, which indicates higher impulsivity. This difference could account for the difference in probability matching between the samples. These results suggest that gamblers are more willing to bet impulsively on perceived illusory patterns.
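
    The accuracy gap between the two strategies mentioned above is straightforward to check numerically: for a binary outcome with probability p, matching earns about p**2 + (1-p)**2 while maximizing earns about p. The simulation below uses an invented p.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    p, n = 0.7, 100_000
    outcomes = rng.random(n) < p          # the binary sequence to predict

    match_pred = rng.random(n) < p        # matching: predict "1" with prob p
    max_pred = np.ones(n, dtype=bool)     # maximizing: always predict "1"

    print("matching accuracy  :", np.mean(match_pred == outcomes))  # ~0.58
    print("maximizing accuracy:", np.mean(max_pred == outcomes))    # ~0.70
    ```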

  5. Understanding survival analysis: Kaplan-Meier estimate.

    Science.gov (United States)

    Goel, Manish Kumar; Khanna, Pardeep; Kishore, Jugal

    2010-10-01

    Kaplan-Meier estimate is one of the best options to be used to measure the fraction of subjects living for a certain amount of time after treatment. In clinical trials or community trials, the effect of an intervention is assessed by measuring the number of subjects survived or saved after that intervention over a period of time. The time starting from a defined point to the occurrence of a given event, for example death, is called the survival time, and the analysis of group data the survival analysis. This can be affected by subjects under study who are uncooperative and refuse to remain in the study, by subjects who do not experience the event or death before the end of the study (although they would have, had observation continued), or by subjects with whom we lose touch midway through the study. We label these situations as censored observations. The Kaplan-Meier estimate is the simplest way of computing survival over time in spite of all these difficulties associated with subjects or situations. The survival curve can be created assuming various situations. It involves computing the probabilities of occurrence of the event at certain points in time and multiplying these successive probabilities by any earlier computed probabilities to get the final estimate. This can be calculated for two groups of subjects, and the statistical difference between their survivals can also be assessed. This can be used in Ayurveda research when comparing two drugs and looking for survival of subjects.
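
    Because the abstract describes the product-limit computation only in words, a minimal sketch is added here; the data are invented, and a real analysis would use a vetted implementation (e.g., the lifelines package or R's survival) rather than this hand-rolled one.

    ```python
    import numpy as np

    def kaplan_meier(times, event):
        """Product-limit estimate: at each event time, multiply the running
        survival by (1 - d/n), where d = events at that time and n = number
        still at risk just before it; censored subjects only shrink n."""
        order = np.argsort(times)
        times, event = np.asarray(times)[order], np.asarray(event)[order]
        at_risk, s, curve = len(times), 1.0, []
        for t in np.unique(times):
            mask = times == t
            d = int(event[mask].sum())
            if d > 0:
                s *= 1 - d / at_risk
                curve.append((t, s))
            at_risk -= int(mask.sum())
        return curve

    # Toy data: follow-up time in months; event = 1 death, 0 censored.
    times = [2, 3, 3, 5, 6, 8, 8, 12]
    event = [1, 1, 0, 1, 0, 1, 1, 0]
    for t, s in kaplan_meier(times, event):
        print(f"t = {t:>2}: S(t) = {s:.3f}")
    ```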

  6. Introduction to probability and stochastic processes with applications

    CERN Document Server

    Castañ, Blanco; Arunachalam, Viswanathan; Dharmaraja, Selvamuthu

    2012-01-01

    An easily accessible, real-world approach to probability and stochastic processes Introduction to Probability and Stochastic Processes with Applications presents a clear, easy-to-understand treatment of probability and stochastic processes, providing readers with a solid foundation they can build upon throughout their careers. With an emphasis on applications in engineering, applied sciences, business and finance, statistics, mathematics, and operations research, the book features numerous real-world examples that illustrate how random phenomena occur in nature and how to use probabilistic t

  7. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the groundwork to later claim the existence of stochastic processes with prescribed finite-dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, Erdos-Kac invariance principle, functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a textbook for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  8. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis of a vibration isolation system for high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains lower than the vibration criterion. This problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criterion for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are developed, such that the probability of exceeding vibration criteria VC-E and VC-D is kept below 0.04.

  9. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  10. Use of probability tables for propagating uncertainties in neutronics

    International Nuclear Information System (INIS)

    Coste-Delclaux, M.; Diop, C.M.; Lahaye, S.

    2017-01-01

    Highlights: • Moment-based probability table formalism is described. • Representation by probability tables of any uncertainty distribution is established. • Multiband equations for two kinds of uncertainty propagation problems are solved. • Numerical examples are provided and validated against Monte Carlo simulations. - Abstract: Probability tables are a generic tool that allows representing any random variable whose probability density function is known. In the field of nuclear reactor physics, this tool is currently used to represent the variation of cross-sections versus energy (neutron transport codes TRIPOLI4®, MCNP, APOLLO2, APOLLO3®, ECCO/ERANOS…). In the present article we show how we can propagate uncertainties, thanks to a probability table representation, through two simple physical problems: an eigenvalue problem (neutron multiplication factor) and a depletion problem.
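
    As a deliberately simplified sketch of the generic idea, the code below represents an uncertain input by a small table of (value, weight) pairs and pushes it through a response function. The equal-probability-band construction and the toy attenuation model are assumptions for illustration; CALENDF-style moment-based tables are not reproduced here.

    ```python
    import numpy as np
    from scipy import stats

    def probability_table(dist, n_bands=8):
        """Crude probability table: one representative value per
        equal-probability band (the band's midpoint quantile)."""
        w = np.full(n_bands, 1.0 / n_bands)
        x = dist.ppf((np.arange(n_bands) + 0.5) / n_bands)
        return x, w

    def propagate(x, w, f):
        """Mean and standard deviation of f(X) from the table of X."""
        y = f(x)
        mean = np.sum(w * y)
        return mean, np.sqrt(np.sum(w * (y - mean) ** 2))

    # Hypothetical uncertain cross-section entering a toy attenuation model.
    sigma = stats.lognorm(s=0.25, scale=2.0)
    x, w = probability_table(sigma)
    mean, std = propagate(x, w, lambda s: np.exp(-0.5 * s))
    print(f"response: {mean:.4f} +/- {std:.4f}")
    ```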

  11. Random broadcast on random geometric graphs

    Energy Technology Data Exchange (ETDEWEB)

    Bradonjic, Milan [Los Alamos National Laboratory; Elsasser, Robert [UNIV OF PADERBORN; Friedrich, Tobias [ICSI/BERKELEY; Sauerwald, Tomas [ICSI/BERKELEY

    2009-01-01

    In this work, we consider the random broadcast time on random geometric graphs (RGGs). The classic random broadcast model, also known as the push algorithm, is defined as follows: starting with one informed node, in each succeeding round every informed node chooses one of its neighbors uniformly at random and informs it. We consider the random broadcast time on RGGs when, with high probability: (i) the RGG is connected, or (ii) there exists a giant component in the RGG. We show that the random broadcast time is bounded by O(√n + diam(component)), where diam(component) is the diameter of the entire graph, or of the giant component, for regimes (i) or (ii), respectively. In other words, for both regimes, we derive the broadcast time to be Θ(diam(G)), which is asymptotically optimal.
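
    A small simulation makes the push model concrete; it is not the authors' code. The radius is set slightly above the usual RGG connectivity threshold sqrt(log n / (pi n)), networkx supplies the graph, and the broadcast runs on the largest connected component.

    ```python
    import math
    import random
    import networkx as nx

    def push_broadcast_rounds(G, seed=0):
        """Push model: every round, each informed node informs one
        uniformly random neighbor; returns rounds until all are informed."""
        rng = random.Random(seed)
        informed = {next(iter(G.nodes))}
        rounds = 0
        while len(informed) < G.number_of_nodes():
            newly = {rng.choice(list(G.neighbors(u))) for u in informed}
            informed |= newly
            rounds += 1
        return rounds

    n = 2000
    r = 1.3 * math.sqrt(math.log(n) / (math.pi * n))  # just above connectivity
    G = nx.random_geometric_graph(n, r, seed=42)
    giant = G.subgraph(max(nx.connected_components(G), key=len)).copy()
    print("component size:", giant.number_of_nodes(),
          "| rounds:", push_broadcast_rounds(giant))
    ```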

  12. Model uncertainty: Probabilities for models?

    International Nuclear Information System (INIS)

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising.

  13. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data are scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (e.g., Dempster-Shafer theory or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider whether experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  14. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  15. Statistical probability tables CALENDF program

    International Nuclear Information System (INIS)

    Ribon, P.

    1989-01-01

    The purpose of the probability tables is twofold: to obtain a dense data representation and to calculate integrals by quadratures. They are mainly used in the USA for calculations by Monte Carlo and in the USSR and Europe for self-shielding calculations by the sub-group method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity

  16. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  17. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, their aims and scopes, the future direction of research, and how their own work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  18. Three Versus 6 Months of Oxaliplatin-Based Adjuvant Chemotherapy for Patients With Stage III Colon Cancer: Disease-Free Survival Results From a Randomized, Open-Label, International Duration Evaluation of Adjuvant (IDEA) France, Phase III Trial.

    Science.gov (United States)

    André, Thierry; Vernerey, Dewi; Mineur, Laurent; Bennouna, Jaafar; Desrame, Jérôme; Faroux, Roger; Fratte, Serge; Hug de Larauze, Marine; Paget-Bailly, Sophie; Chibaudel, Benoist; Bez, Jeremie; Dauba, Jérôme; Louvet, Christophe; Lepere, Céline; Dupuis, Olivier; Becouarn, Yves; Mabro, May; Egreteau, Joëlle; Bouche, Olivier; Deplanque, Gaël; Ychou, Marc; Galais, Marie Pierre; Ghiringhelli, François; Dourthe, Louis Marie; Bachet, Jean-Baptiste; Khalil, Ahmed; Bonnetain, Franck; de Gramont, Aimery; Taieb, Julien

    2018-05-20

    Purpose Reduction of adjuvant treatment duration may decrease toxicities without loss of efficacy in stage III colon cancer. This could offer clear advantages to patients and health care providers. Methods In International Duration Evaluation of Adjuvant Chemotherapy (IDEA) France, as part of the IDEA international collaboration, patients with colon cancer were randomly assigned to 3 or 6 months of modified FOLFOX6 (mFOLFOX6: infusional fluorouracil, leucovorin, and oxaliplatin) or capecitabine plus oxaliplatin (CAPOX) by physician choice. The primary end point was disease-free survival (DFS), and analyses were descriptive. Results A total of 2,010 eligible patients received either 3 or 6 months of chemotherapy (modified intention-to-treat population); 2,000 (99%) had stage III colon cancer (N1: 75%, N2: 25%); 1,809 (90%) received mFOLFOX6, and 201 (10%) received CAPOX. The median age was 64 years, and the median follow-up time was 4.3 years. Overall, 94% (3 months) and 78% (6 months) of patients completed treatment (fluoropyrimidines ± oxaliplatin). Maximal grade 2 and 3 neuropathy rates were 28% and 8% in the 3-month arm and 41% and 25% in the 6-month arm (P < .001). Final rates of residual neuropathy greater than grade 1 were 3% in the 3-month arm and 7% in the 6-month arm (P < .001). There were 578 DFS events: 314 and 264 in the 3- and 6-month arms, respectively. The 3-year DFS rates were 72% and 76% in the 3- and 6-month arms, respectively (hazard ratio [HR], 1.24; 95% CI, 1.05 to 1.46; P = .0112). In the 3- and 6-month arms, respectively, for patients who received mFOLFOX6, the 3-year DFS rates were 72% and 76% (HR, 1.27; 95% CI, 1.07 to 1.51); for the T4 and/or N2 population, they were 58% and 66% (HR, 1.44; 95% CI, 1.14 to 1.82); and for the T1-3N1 population, they were 81% and 83% (HR, 1.15; 95% CI, 0.89 to 1.49). Conclusion IDEA France, in which 90% of patients received mFOLFOX6, shows superiority of 6 months of adjuvant chemotherapy compared

  19. Dynamic SEP event probability forecasts

    Science.gov (United States)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
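
    The paper's actual algorithm is not reproduced here, but its core idea can be sketched as a Bayes update: start from the probability issued at the X-ray peak and discount it as the flare-to-onset delay-time distribution (assumed exponential below; the paper derives it from the NOAA event list as a function of source longitude) is exhausted without an onset.

    ```python
    import numpy as np

    def dynamic_sep_probability(p0, t_hours, median_delay=8.0):
        """Posterior SEP-event probability after t hours with no 10 pfu onset.

        p0           -- probability issued at the X-ray peak
        median_delay -- assumed median flare-to-onset delay [hours]
        """
        lam = np.log(2) / median_delay
        s = np.exp(-lam * t_hours)           # P(onset later than t | event)
        return p0 * s / (p0 * s + (1 - p0))  # Bayes update on "no onset yet"

    for t in (0, 6, 12, 24, 48):
        print(f"t = {t:>2} h: P = {dynamic_sep_probability(0.5, t):.3f}")
    ```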

  20. Conditional Independence in Applied Probability.

    Science.gov (United States)

    Pfeiffer, Paul E.

    This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…

  1. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  2. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  3. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave-propagation computations, the relation between the seismic strength of the earthquake, the focal depth, the distance, and the ground acceleration is calculated. We found that the largest Swedish earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as of interest. This probability gives ground accelerations in the range of 5-20% g for the sites. This acceleration is for a free bedrock site. For consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get the epicentral acceleration of this earthquake to be 5-15% g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)

  4. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2015-01-01

    A statistical procedure of estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. Also safety factor influence is investigated. DECOFF statistical method is benchmarked against standard Alpha-factor...

  5. Probability and statistics: A reminder

    International Nuclear Information System (INIS)

    Clement, B.

    2013-01-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from 'data analysis in experimental sciences' given in [1]. (authors)

  6. Nash equilibrium with lower probabilities

    DEFF Research Database (Denmark)

    Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1998-01-01

    We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible...

  7. On probability-possibility transformations

    Science.gov (United States)

    Klir, George J.; Parviz, Behzad

    1992-01-01

    Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.

  8. Estimating survival of dental fillings on the basis of interval-censored data and multi-state models

    DEFF Research Database (Denmark)

    Joly, Pierre; Gerds, Thomas A; Qvist, Vibeke

    2012-01-01

    We aim to compare the life expectancy of a filling in a primary tooth between two types of treatments. We define the probabilities that a dental filling survives without complication until the permanent tooth erupts from beneath (exfoliation). We relate the time to exfoliation of the tooth... To deal with all these particularities, we propose to use a parametric four-state model with three random effects to take into account the hierarchical cluster structure. For inference, right and interval censoring as well as left truncation have to be dealt with. With the proposed approach, we can conclude that the estimated probability that a filling survives without complication until exfoliation is larger for one treatment than for the other, for all ages of the child at the time of treatment.

  9. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C. [Arizona State Univ., Tempe, AZ (United States)

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty, and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems.

  10. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    Energy Technology Data Exchange (ETDEWEB)

    Garza, J. [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States); Millwater, H., E-mail: harry.millwater@utsa.edu [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States)

    2012-04-15

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.

  11. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2012-01-01

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.

  12. Social class and survival on the S.S. Titanic.

    Science.gov (United States)

    Hall, W

    1986-01-01

    Passengers' chances of surviving the sinking of the S.S. Titanic were related to their sex and their social class: females were more likely to survive than males, and the chances of survival declined with social class as measured by the class in which the passenger travelled. The probable reasons for these differences in rates of survival are discussed as are the reasons accepted by the Mersey Committee of Inquiry into the sinking.

  13. Determination of bounds on failure probability in the presence of ...

    Indian Academy of Sciences (India)

    In particular, fuzzy set theory provides a more rational framework for ..... indicating that the random variations in T and O2 do not affect the failure probability significantly. ... The upper bound for PF shown in figure 6 can be used in decision-making.

  14. Flipping Out: Calculating Probability with a Coin Game

    Science.gov (United States)

    Degner, Kate

    2015-01-01

    In the author's experience with this activity, students struggle with the idea of representativeness in probability. Therefore, this student misconception is part of the classroom discussion about the activities in this lesson. Representativeness is related to the (incorrect) idea that outcomes that seem more random are more likely to happen. This…

  15. The probability that a pair of group elements is autoconjugate

    Indian Academy of Sciences (India)

    Let g and h be arbitrary elements of a given finite group G. Then g and h are said to be autoconjugate if there exists some automorphism α of G such that h = α(g). In this article, we construct some sharp bounds for the probability that two random elements of G are autoconjugate, denoted by Pa(G). It is also shown that P...

  16. Knots and Random Walks in Vibrated Granular Chains

    International Nuclear Information System (INIS)

    Ben-Naim, E.; Daya, Z. A.; Vorobieff, P.; Ecke, R. E.

    2001-01-01

    We study experimentally statistical properties of the opening times of knots in vertically vibrated granular chains. Our measurements are in good qualitative and quantitative agreement with a theoretical model involving three random walks interacting via hard-core exclusion in one spatial dimension. In particular, the knot survival probability follows a universal scaling function which is independent of the chain length, with a corresponding diffusive characteristic time scale. Both the large-exit-time and the small-exit-time tails of the distribution are suppressed exponentially, and the corresponding decay coefficients are in excellent agreement with theoretical values

  17. Two-slit experiment: quantum and classical probabilities

    International Nuclear Information System (INIS)

    Khrennikov, Andrei

    2015-01-01

    Inter-relation between quantum and classical probability models is one of the most fundamental problems of quantum foundations. Nowadays this problem also plays an important role in quantum technologies, in quantum cryptography and the theory of quantum random generators. In this letter, we compare the viewpoint of Richard Feynman that the behavior of quantum particles cannot be described by classical probability theory with the viewpoint that the quantum–classical inter-relation is more complicated (cf., in particular, the tomographic model of quantum mechanics developed in detail by Vladimir Man'ko). As a basic example, we consider the two-slit experiment, which played a crucial role in quantum foundational debates at the beginning of quantum mechanics (QM). In particular, its analysis led Niels Bohr to the formulation of the principle of complementarity. First, we demonstrate that in complete accordance with Feynman's viewpoint, the probabilities for the two-slit experiment have a non-Kolmogorovian structure, since they violate one of the basic laws of classical probability theory, the law of total probability (the heart of the Bayesian analysis). However, we then show that these probabilities can be embedded in a natural way into the classical (Kolmogorov, 1933) probability model. To do this, one has to take into account the randomness of selection of different experimental contexts, the joint consideration of which led Feynman to a conclusion about the non-classicality of quantum probability. We compare this embedding of non-Kolmogorovian quantum probabilities into the Kolmogorov model with well-known embeddings of non-Euclidean geometries into Euclidean space (e.g., the Poincaré disk model for the Lobachevsky plane). (paper)
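
    A short numeric illustration of the non-Kolmogorovian structure discussed above: with both slits open, the quantum intensity differs from the law-of-total-probability combination of the single-slit intensities by the interference term 2 Re(psi_A* psi_B). The geometry and parameters are invented.

    ```python
    import numpy as np

    x = np.linspace(-5, 5, 5)     # screen positions (arbitrary units)
    k, d, L = 2.0, 1.0, 20.0      # wave number, slit separation, screen distance
    psi_a = np.exp(1j * k * np.hypot(x - d / 2, L)) / np.sqrt(2)
    psi_b = np.exp(1j * k * np.hypot(x + d / 2, L)) / np.sqrt(2)

    p_a = 2 * np.abs(psi_a) ** 2             # intensity with only slit A open
    p_b = 2 * np.abs(psi_b) ** 2             # intensity with only slit B open
    p_classical = 0.5 * p_a + 0.5 * p_b      # law of total probability
    p_quantum = np.abs(psi_a + psi_b) ** 2   # both slits open: interference

    for xi, pc, pq in zip(x, p_classical, p_quantum):
        print(f"x = {xi:+.2f}: total-probability {pc:.3f} vs quantum {pq:.3f}")
    ```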

  18. What Randomized Benchmarking Actually Measures

    International Nuclear Information System (INIS)

    Proctor, Timothy; Rudinger, Kenneth; Young, Kevin; Sarovar, Mohan; Blume-Kohout, Robin

    2017-01-01

    Randomized benchmarking (RB) is widely used to measure an error rate of a set of quantum gates, by performing random circuits that would do nothing if the gates were perfect. In the limit of no finite-sampling error, the exponential decay rate of the observable survival probabilities, versus circuit length, yields a single error metric r. For Clifford gates with arbitrary small errors described by process matrices, r was believed to reliably correspond to the mean, over all Clifford gates, of the average gate infidelity between the imperfect gates and their ideal counterparts. We show that this quantity is not a well-defined property of a physical gate set. It depends on the representations used for the imperfect and ideal gates, and the variant typically computed in the literature can differ from r by orders of magnitude. We present new theories of the RB decay that are accurate for all small errors describable by process matrices, and show that the RB decay curve is a simple exponential for all such errors. Here, these theories allow explicit computation of the error rate that RB measures (r), but as far as we can tell it does not correspond to the infidelity of a physically allowed (completely positive) representation of the imperfect gates.
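
    For orientation, the sketch below runs the standard zeroth-order RB analysis that the abstract scrutinizes, on synthetic data: fit the averaged survival probabilities to A*p**m + B and convert the decay constant to r = (d-1)*(1-p)/d. All parameters and the noise level are invented.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def rb_model(m, A, B, p):
        """Zeroth-order RB decay of the survival probability."""
        return A * p ** m + B

    rng = np.random.default_rng(4)
    lengths = np.arange(1, 200, 10)
    # Synthetic data: true p = 0.98, i.e. r = 0.01 for a single qubit (d = 2).
    data = rb_model(lengths, 0.5, 0.5, 0.98) + rng.normal(0, 0.005, lengths.size)

    (A, B, p), _ = curve_fit(rb_model, lengths, data, p0=[0.5, 0.5, 0.95])
    d = 2
    r = (d - 1) * (1 - p) / d
    print(f"fitted decay p = {p:.4f}  ->  RB number r = {r:.4f}")
    ```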

  19. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  20. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    Andre; C. R. Martins

    2006-11-01

    Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated to them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors under the normative theory, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as has been observed.

  1. Probability matching and strategy availability.

    Science.gov (United States)

    Koehler, Derek J; James, Greta

    2010-09-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought to their attention, more participants subsequently engage in maximizing. Third, matchers are more likely than maximizers to base decisions in other tasks on their initial intuitions, suggesting that they are more inclined to use a choice strategy that comes to mind quickly. These results indicate that a substantial subset of probability matchers are victims of "underthinking" rather than "overthinking": They fail to engage in sufficient deliberation to generate a superior alternative to the matching strategy that comes so readily to mind.

  2. Probability as a Physical Motive

    Directory of Open Access Journals (Sweden)

    Peter Martin

    2007-04-01

    Full Text Available Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production (“MEP”) to the information-theoretical “MaxEnt” principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand “the adjacent possible” as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  3. Logic, Probability, and Human Reasoning

    Science.gov (United States)

    2015-01-01

    ...accordingly suggest a way to integrate probability and deduction. The nature of deductive reasoning: to be rational is to be able to make deductions...[3–6] and they underlie mathematics, science, and technology [7–10]. Plato claimed that emotions upset reasoning. However, individuals in the grip...fundamental to human rationality. So, if counterexamples to its principal predictions occur, the theory will at least explain its own refutation

  4. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  5. Probability matching and strategy availability

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2010-01-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought...

  6. Confidence intervals for the lognormal probability distribution

    International Nuclear Information System (INIS)

    Smith, D.L.; Naberejnev, D.G.

    2004-01-01

    The present communication addresses the topic of symmetric confidence intervals for the lognormal probability distribution. This distribution is frequently utilized to characterize inherently positive, continuous random variables that are selected to represent many physical quantities in applied nuclear science and technology. The basic formalism is outlined herein and a conjured numerical example is provided for illustration. It is demonstrated that when the uncertainty reflected in a lognormal probability distribution is large, the use of a confidence interval provides much more useful information about the variable used to represent a particular physical quantity than can be had by adhering to the notion that the mean value and standard deviation of the distribution ought to be interpreted as best value and corresponding error, respectively. Furthermore, it is shown that if the uncertainty is very large a disturbing anomaly can arise when one insists on interpreting the mean value and standard deviation as the best value and corresponding error, respectively. Reliance on using the mode and median as alternative parameters to represent the best available knowledge of a variable with large uncertainties is also shown to entail limitations. Finally, a realistic physical example involving the decay of radioactivity over a time period that spans many half-lives is presented and analyzed to further illustrate the concepts discussed in this communication
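
    The anomaly described above is easy to reproduce numerically. A minimal sketch (our own example; the parameter values are invented and scipy's lognormal parameterization is assumed):

      import numpy as np
      from scipy.stats import lognorm

      mu, sigma = 0.0, 1.5                    # parameters of ln(X); sigma deliberately large
      X = lognorm(s=sigma, scale=np.exp(mu))  # scipy convention: s = sigma, scale = exp(mu)

      mean, std, median = X.mean(), X.std(), X.median()
      mode = np.exp(mu - sigma**2)            # mode of a lognormal density
      lo, hi = X.interval(0.95)               # central 95% probability interval

      # "Best value +/- error" here gives mean - std < 0, impossible for a
      # positive variable; the confidence interval stays strictly positive.
      print(f"mean - std = {mean - std:.2f}, 95% interval = ({lo:.3f}, {hi:.3f})")
      print(f"median = {median:.3f}, mode = {mode:.3f}")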

  7. Zero field reversal probability in thermally assisted magnetization reversal

    Science.gov (United States)

    Prasetya, E. B.; Utari; Purnama, B.

    2017-11-01

    This paper discusses the zero-field reversal probability in thermally assisted magnetization reversal (TAMR). The appearance of a reversal probability at zero field is investigated through micromagnetic simulation by solving the stochastic Landau-Lifshitz-Gilbert (LLG) equation. A perpendicular-anisotropy magnetic dot of 50×50×20 nm³ is considered as a single-cell storage element of magnetic random access memory (MRAM). Thermally assisted magnetization reversal was performed by cooling the writing process from near the Curie point to room temperature, repeated over 20 runs for different randomly magnetized initial states. The results show that the reversal probability under zero magnetic field decreases as the energy barrier increases: a zero-field switching probability of 55% is attained for an energy barrier of 60 k_B T (corresponding to a switching field of 150 Oe), and the reversal probability becomes zero at an energy barrier of 2348 k_B T.

  8. On the probability of cure for heavy-ion radiotherapy

    International Nuclear Information System (INIS)

    Hanin, Leonid; Zaider, Marco

    2014-01-01

    The probability of a cure in radiation therapy (RT)—viewed as the probability of eventual extinction of all cancer cells—is unobservable, and the only way to compute it is through modeling the dynamics of the cancer cell population during and post-treatment. The conundrum at the heart of biophysical models aimed at such prospective calculations is the absence of information on the initial size of the subpopulation of clonogenic cancer cells (also called stem-like cancer cells) that largely determines the outcome of RT, in both individual and population settings. Other relevant parameters (e.g. potential doubling time, cell loss factor and survival probability as a function of dose) are, at least in principle, amenable to empirical determination. In this article we demonstrate that, for heavy-ion RT, microdosimetric considerations (justifiably ignored in conventional RT) combined with an expression for the clone extinction probability obtained from a mechanistic model of radiation cell survival lead to useful upper bounds on the size of the pre-treatment population of clonogenic cancer cells as well as upper and lower bounds on the cure probability. The main practical impact of these limiting values is the ability to make predictions about the probability of a cure for a given population of patients treated with newer, still unexplored treatment modalities from the empirically determined probability of a cure for the same or a similar population resulting from conventional low linear energy transfer (typically photon/electron) RT. We also propose that the current trend to deliver a lower total dose in a smaller number of fractions with larger-than-conventional doses per fraction has physical limits that must be understood before embarking on a particular treatment schedule. (paper)

  9. Probability theory a concise course

    CERN Document Server

    Rozanov, Y A

    1977-01-01

    This clear exposition begins with basic concepts and moves on to combination of events, dependent events and random variables, Bernoulli trials and the De Moivre-Laplace theorem, a detailed treatment of Markov chains, continuous Markov processes, and more. Includes 150 problems, many with answers. Indispensable to mathematicians and natural scientists alike.

  10. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  11. Androgen Suppression Combined with Elective Nodal and Dose Escalated Radiation Therapy (the ASCENDE-RT Trial): An Analysis of Survival Endpoints for a Randomized Trial Comparing a Low-Dose-Rate Brachytherapy Boost to a Dose-Escalated External Beam Boost for High- and Intermediate-risk Prostate Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Morris, W. James, E-mail: jmorris@bccancer.bc.ca [Department of Surgery, University of British Columbia, Vancouver, British Columbia (Canada); BC Cancer Agency–Vancouver Centre, Vancouver, British Columbia (Canada); Tyldesley, Scott [Department of Surgery, University of British Columbia, Vancouver, British Columbia (Canada); BC Cancer Agency–Vancouver Centre, Vancouver, British Columbia (Canada); Rodda, Sree [Department of Surgery, University of British Columbia, Vancouver, British Columbia (Canada); Halperin, Ross [Department of Surgery, University of British Columbia, Vancouver, British Columbia (Canada); BC Cancer Agency–Centre for the Southern Interior, Vancouver, British Columbia (Canada); Pai, Howard [Department of Surgery, University of British Columbia, Vancouver, British Columbia (Canada); BC Cancer Agency–Vancouver Island Centre, Vancouver, British Columbia (Canada); McKenzie, Michael; Duncan, Graeme [Department of Surgery, University of British Columbia, Vancouver, British Columbia (Canada); BC Cancer Agency–Vancouver Centre, Vancouver, British Columbia (Canada); Morton, Gerard [Department of Radiation Oncology, University of Toronto, Sunnybrook Health Sciences Centre, Toronto, Ontario (Canada); Hamm, Jeremy [Department of Population Oncology, BC Cancer Agency, Vancouver, British Columbia (Canada); Murray, Nevin [BC Cancer Agency–Vancouver Centre, Vancouver, British Columbia (Canada); Department of Medicine, University of British Columbia, Vancouver, British Columbia (Canada)

    2017-06-01

    Purpose: To report the primary endpoint of biochemical progression-free survival (b-PFS) and secondary survival endpoints from ASCENDE-RT, a randomized trial comparing 2 methods of dose escalation for intermediate- and high-risk prostate cancer. Methods and Materials: ASCENDE-RT enrolled 398 men, with a median age of 68 years; 69% (n=276) had high-risk disease. After stratification by risk group, the subjects were randomized to a standard arm with 12 months of androgen deprivation therapy, pelvic irradiation to 46 Gy, followed by a dose-escalated external beam radiation therapy (DE-EBRT) boost to 78 Gy, or an experimental arm that substituted a low-dose-rate prostate brachytherapy (LDR-PB) boost. Of the 398 trial subjects, 200 were assigned to DE-EBRT boost and 198 to LDR-PB boost. The median follow-up was 6.5 years. Results: In an intent-to-treat analysis, men randomized to DE-EBRT were twice as likely to experience biochemical failure (multivariable analysis [MVA] hazard ratio [HR] 2.04; P=.004). The 5-, 7-, and 9-year Kaplan-Meier b-PFS estimates were 89%, 86%, and 83% for the LDR-PB boost versus 84%, 75%, and 62% for the DE-EBRT boost (log-rank P<.001). The LDR-PB boost benefited both intermediate- and high-risk patients. Because the b-PFS curves for the treatment arms diverge sharply after 4 years, the relative advantage of the LDR-PB should increase with longer follow-up. On MVA, the only variables correlated with reduced overall survival were age (MVA HR 1.06/y; P=.004) and biochemical failure (MVA HR 6.30; P<.001). Although biochemical failure was associated with increased mortality and randomization to DE-EBRT doubled the rate of biochemical failure, no significant overall survival difference was observed between the treatment arms (MVA HR 1.13; P=.62). Conclusions: Compared with 78 Gy EBRT, men randomized to the LDR-PB boost were twice as likely to be free of biochemical failure at a median follow-up of 6.5 years.

  12. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  13. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  14. Sensitivity analysis using probability bounding

    International Nuclear Information System (INIS)

    Ferson, Scott; Troy Tucker, W.

    2006-01-01

    Probability bounds analysis (PBA) provides analysts a convenient means to characterize the neighborhood of possible results that would be obtained from plausible alternative inputs in probabilistic calculations. We show the relationship between PBA and the methods of interval analysis and probabilistic uncertainty analysis from which it is jointly derived, and indicate how the method can be used to assess the quality of probabilistic models such as those developed in Monte Carlo simulations for risk analyses. We also illustrate how a sensitivity analysis can be conducted within a PBA by pinching inputs to precise distributions or real values
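
    A crude stand-in for the pinching idea (our simplification, not Ferson and Tucker's p-box machinery; the model and all bounds are invented): propagate interval inputs through a monotone model, then pinch one input to a point value and compare the widths of the output intervals.

      def model(a, b):
          # Simple model, monotone increasing in both inputs.
          return a * b + a

      a_lo, a_hi = 1.0, 2.0      # interval-valued input a
      b_lo, b_hi = 3.0, 5.0      # interval-valued input b

      full = (model(a_lo, b_lo), model(a_hi, b_hi))   # both inputs uncertain
      pinched = (model(a_lo, 4.0), model(a_hi, 4.0))  # b pinched to a point value

      # The reduction in output width measures how much of the result's
      # uncertainty is attributable to b.
      print("full   :", full)     # (4.0, 12.0)
      print("pinched:", pinched)  # (5.0, 10.0)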

  15. An analytical calculation of neighbourhood order probabilities for high dimensional Poissonian processes and mean field models

    International Nuclear Information System (INIS)

    Tercariol, Cesar Augusto Sangaletti; Kiipper, Felipe de Moura; Martinez, Alexandre Souto

    2007-01-01

    Consider that the coordinates of N points are randomly generated along the edges of a d-dimensional hypercube (random point problem). The probability P^{(d,N)}_{m,n} that an arbitrary point is the mth nearest neighbour to its own nth nearest neighbour (Cox probabilities) plays an important role in spatial statistics. Also, it has been useful in the description of physical processes in disordered media. Here we propose a simpler derivation of the Cox probabilities, where we stress the role played by the system dimensionality d. In the limit d → ∞, the distances between pairs of points become independent (random link model) and closed analytical forms for the neighbourhood probabilities are obtained, both in the thermodynamic limit and for finite-size systems. Breaking the distance symmetry constraint leads to the random map model, for which the Cox probabilities are obtained for two cases: whether a point is its own nearest neighbour or not
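
    The definition lends itself to a direct Monte Carlo check (our illustration, not the paper's analytical derivation; all parameter values are arbitrary):

      import numpy as np

      def cox_probability(d=2, N=100, m=1, n=1, trials=2000, seed=1):
          # Estimate P^{(d,N)}_{m,n}: the chance that a point is the m-th nearest
          # neighbour of its own n-th nearest neighbour, for N uniform points in
          # the d-dimensional unit hypercube.
          rng = np.random.default_rng(seed)
          hits = 0
          for _ in range(trials):
              pts = rng.random((N, d))
              dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
              np.fill_diagonal(dist, np.inf)      # exclude self-distances
              j = np.argsort(dist[0])[n - 1]      # n-th nearest neighbour of point 0
              rank = list(np.argsort(dist[j])).index(0) + 1
              hits += (rank == m)
          return hits / trials

      print(cox_probability())  # mutual-nearest-neighbour probability, roughly 0.6 in d = 2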

  16. Complexity for survival of livings

    International Nuclear Information System (INIS)

    Zak, Michail

    2007-01-01

    A connection between the survivability of livings and the complexity of their behavior is established. New physical paradigms (exchange of information via reflections, and a chain of abstractions) explaining and describing the progressive evolution of complexity in living (active) systems are introduced. A biological origin of these paradigms is associated with the recently discovered mirror neuron, which is able to learn by imitation. As a result, an active element possesses self-nonself images and interacts with them, creating the world of mental dynamics. Three fundamental types of complexity of mental dynamics that contribute to survivability are identified. A mathematical model of the corresponding active systems is described by coupled motor-mental dynamics represented by Langevin and Fokker-Planck equations, respectively, while the progressive evolution of complexity is provided by the nonlinear evolution of probability density. Application of the proposed formalism to modeling a common-sense-based decision-making process is discussed

  17. Complexity for survival of livings

    Energy Technology Data Exchange (ETDEWEB)

    Zak, Michail [Jet Propulsion Laboratory, California Institute of Technology, Advance Computing Algorithms and IVHM Group, Pasadena, CA 91109 (United States)]. E-mail: Michail.Zak@jpl.nasa.gov

    2007-05-15

    A connection between the survivability of livings and the complexity of their behavior is established. New physical paradigms (exchange of information via reflections, and a chain of abstractions) explaining and describing the progressive evolution of complexity in living (active) systems are introduced. A biological origin of these paradigms is associated with the recently discovered mirror neuron, which is able to learn by imitation. As a result, an active element possesses self-nonself images and interacts with them, creating the world of mental dynamics. Three fundamental types of complexity of mental dynamics that contribute to survivability are identified. A mathematical model of the corresponding active systems is described by coupled motor-mental dynamics represented by Langevin and Fokker-Planck equations, respectively, while the progressive evolution of complexity is provided by the nonlinear evolution of probability density. Application of the proposed formalism to modeling a common-sense-based decision-making process is discussed.

  18. Theory of random sets

    CERN Document Server

    Molchanov, Ilya

    2017-01-01

    This monograph, now in a thoroughly revised second edition, offers the latest research on random sets. It has been extended to include substantial developments achieved since 2005, some of them motivated by applications of random sets to econometrics and finance. The present volume builds on the foundations laid by Matheron and others, including the vast advances in stochastic geometry, probability theory, set-valued analysis, and statistical inference. It shows the various interdisciplinary relationships of random set theory within other parts of mathematics, and at the same time fixes terminology and notation that often vary in the literature, establishing it as a natural part of modern probability theory and providing a platform for future development. It is completely self-contained, systematic and exhaustive, with the full proofs that are necessary to gain insight. Aimed at research level, Theory of Random Sets will be an invaluable reference for probabilists; mathematicians working in convex and integ...

  19. On estimating the fracture probability of nuclear graphite components

    International Nuclear Information System (INIS)

    Srinivasan, Makuteswara

    2008-01-01

    The properties of nuclear grade graphites exhibit anisotropy and can vary considerably within a manufactured block. Graphite strength is affected by the direction of alignment of the constituent coke particles, which is dictated by the forming method, coke particle size, and the size, shape, and orientation distribution of pores in the structure. In this paper, a Weibull failure probability analysis for components is presented using the American Society for Testing and Materials (ASTM) strength specification for nuclear grade graphites for core components in advanced high-temperature gas-cooled reactors. The risk of rupture (probability of fracture) and survival probability (reliability) of large graphite blocks are calculated for varying and discrete values of service tensile stresses. The limitations of these calculations are discussed from considerations of actual reactor environmental conditions that could potentially degrade the specification properties because of damage due to complex interactions between irradiation, temperature, stress, and variability in reactor operation
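
    A minimal sketch of the Weibull survival calculation (our own, with invented parameter values rather than ASTM specification data):

      import numpy as np

      def survival_probability(stress, sigma_0=25.0, m=10.0, volume_ratio=1.0):
          # Weakest-link Weibull model: P_s = exp[-(V/V0) * (stress/sigma_0)^m],
          # so the risk of rupture is 1 - P_s.
          return np.exp(-volume_ratio * (stress / sigma_0) ** m)

      for s in (10.0, 15.0, 20.0, 25.0):  # hypothetical service tensile stresses, MPa
          print(f"stress {s:4.1f} MPa -> survival probability {survival_probability(s):.4f}")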

  20. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another

  1. Daniel Courgeau: Probability and social science: methodological relationships between the two approaches [Review of: Probability and social science: methodological relationships between the two approaches]

    NARCIS (Netherlands)

    Willekens, F.J.C.

    2013-01-01

    Throughout history, humans have engaged in games in which randomness plays a role. In the 17th century, scientists started to approach chance scientifically and to develop a theory of probability. Courgeau describes how the relationship between probability theory and the social sciences emerged and evolved.

  2. Generation of pseudo-random numbers

    Science.gov (United States)

    Howell, L. W.; Rheinfurth, M. H.

    1982-01-01

    Practical methods for generating acceptable random numbers from a variety of probability distributions which are frequently encountered in engineering applications are described. The speed, accuracy, and guarantee of statistical randomness of the various methods are discussed.
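
    One classic method of this kind is inverse-transform sampling (a generic sketch, not necessarily one of the report's specific algorithms): uniform variates are pushed through the inverse CDF of the target distribution, here an exponential.

      import numpy as np

      rng = np.random.default_rng(42)

      def exponential_samples(lam, size):
          u = rng.random(size)            # U ~ Uniform(0, 1)
          return -np.log(1.0 - u) / lam   # F^{-1}(u) for F(x) = 1 - exp(-lam*x)

      x = exponential_samples(lam=2.0, size=100_000)
      print(x.mean())  # should be close to 1/lam = 0.5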

  3. Adaptive random walks on the class of Web graphs

    Science.gov (United States)

    Tadić, B.

    2001-09-01

    We study random walks with adaptive move strategies on a class of directed graphs with variable wiring diagram. The graphs are grown from evolution rules compatible with the dynamics of the world-wide Web [B. Tadić, Physica A 293, 273 (2001)], and are characterized by a pair of power-law distributions of out- and in-degree for each value of the parameter β, which measures the degree of rewiring in the graph. The walker adapts its move strategy according to locally available information both on the out-degree of the visited node and the in-degree of the target node. A standard random walk, on the other hand, uses the out-degree only. We compute the distribution of connected subgraphs visited by an ensemble of walkers, the average access time and the survival probability of the walks. We discuss these properties of the walk dynamics relative to the changes in the global graph structure when the control parameter β is varied. For β ≥ 3, corresponding to the world-wide Web, the access time of the walk to a given level of hierarchy on the graph is much shorter compared to the standard random walk on the same graph. By reducing the amount of rewiring towards the rigidity limit β → β_c ≲ 0.1, corresponding to the range of naturally occurring biochemical networks, the survival probabilities of adaptive and standard random walks become increasingly similar. The adaptive random walk can be used as an efficient message-passing algorithm on this class of graphs for a large degree of rewiring.

  4. The randomly renewed general item and the randomly inspected item with exponential life distribution

    International Nuclear Information System (INIS)

    Schneeweiss, W.G.

    1979-01-01

    For a randomly renewed item, the probability distributions of the time to failure and of the duration of down time, and the expectations of these random variables, are determined. Moreover, it is shown that the same theory applies to randomly checked items with an exponential probability distribution of life, such as electronic items. The case of periodic renewals is treated as an example. (orig.)

  5. A Randomized Central Limit Theorem

    International Nuclear Information System (INIS)

    Eliazar, Iddo; Klafter, Joseph

    2010-01-01

    The Central Limit Theorem (CLT), one of the most elemental pillars of Probability Theory and Statistical Physics, asserts that: the universal probability law of large aggregates of independent and identically distributed random summands with zero mean and finite variance, scaled by the square root of the aggregate-size (√(n)), is Gaussian. The scaling scheme of the CLT is deterministic and uniform - scaling all aggregate-summands by the common and deterministic factor √(n). This Letter considers scaling schemes which are stochastic and non-uniform, and presents a 'Randomized Central Limit Theorem' (RCLT): we establish a class of random scaling schemes which yields universal probability laws of large aggregates of independent and identically distributed random summands. The RCLT universal probability laws, in turn, are the one-sided and the symmetric Levy laws.
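
    The deterministic scaling that the Letter generalizes is easy to visualize numerically (our illustration of the classical CLT baseline, not of the RCLT itself; the summand distribution and sizes are arbitrary):

      import numpy as np

      rng = np.random.default_rng(0)
      n, reps = 1_000, 5_000
      summands = rng.uniform(-1, 1, size=(reps, n))  # zero mean, variance 1/3
      scaled = summands.sum(axis=1) / np.sqrt(n)     # common, deterministic sqrt(n) scaling

      print(scaled.std())                            # ~ sqrt(1/3) = 0.577
      print(np.mean(scaled**4) / scaled.var()**2)    # kurtosis ~ 3, as for a Gaussian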

  6. Digital dice computational solutions to practical probability problems

    CERN Document Server

    Nahin, Paul J

    2013-01-01

    Some probability problems are so difficult that they stump the smartest mathematicians. But even the hardest of these problems can often be solved with a computer and a Monte Carlo simulation, in which a random-number generator simulates a physical process, such as a million rolls of a pair of dice. This is what Digital Dice is all about: how to get numerical answers to difficult probability problems without having to solve complicated mathematical equations. Popular-math writer Paul Nahin challenges readers to solve twenty-one difficult but fun problems, from determining the
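
    In the book's spirit, the entire method fits in a few lines (a generic sketch, not one of Nahin's twenty-one problems): estimate by simulation the probability that a pair of fair dice totals 7.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 1_000_000
      rolls = rng.integers(1, 7, size=(n, 2)).sum(axis=1)  # a million rolls of two dice
      print((rolls == 7).mean())                           # exact answer is 6/36 = 0.1667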

  7. Optimal exploitation of a renewable resource with stochastic nonconvex technology: An analysis of extinction and survival

    International Nuclear Information System (INIS)

    Mitra, Tapan; Roy, Santanu

    1992-11-01

    This paper analyzes the possibilities of extinction and survival of a renewable resource whose technology of reproduction is both stochastic and nonconvex. In particular, the production function is subject to random shocks over time and is allowed to be nonconcave, though it eventually exhibits bounded growth. The existence of a minimum biomass below which the resource can only decrease, is allowed for. Society harvests a part of the current stock every time period over an infinite horizon so as to maximize the expected discounted sum of one period social utilities from the harvested resource. The social utility function is strictly concave. The stochastic process of optimal stocks generated by the optimal stationary policy is analyzed. The nonconvexity in the optimization problem implies that the optimal policy functions are not 'well behaved'. The behaviour of the probability of extinction (and the expected time to extinction), as a function of initial stock, is characterized for various possible configurations of the optimal policy and the technology. Sufficient conditions on the utility and production functions and the rate of impatience, are specified in order to ensure survival of the resource with probability one from some stock level (the minimum safe standard of conservation). Sufficient conditions for almost sure extinction and almost sure survival from all stock levels are also specified. These conditions are related to the corresponding conditions derived in models with deterministic and/or convex technology. 4 figs., 29 refs

  8. Optimal exploitation of a renewable resource with stochastic nonconvex technology: An analysis of extinction and survival

    Energy Technology Data Exchange (ETDEWEB)

    Mitra, Tapan [Department of Economics, Cornell University, Ithaca, NY (United States); Roy, Santanu [Econometric Institute, Erasmus University, Rotterdam (Netherlands)

    1992-11-01

    This paper analyzes the possibilities of extinction and survival of a renewable resource whose technology of reproduction is both stochastic and nonconvex. In particular, the production function is subject to random shocks over time and is allowed to be nonconcave, though it eventually exhibits bounded growth. The existence of a minimum biomass below which the resource can only decrease, is allowed for. Society harvests a part of the current stock every time period over an infinite horizon so as to maximize the expected discounted sum of one period social utilities from the harvested resource. The social utility function is strictly concave. The stochastic process of optimal stocks generated by the optimal stationary policy is analyzed. The nonconvexity in the optimization problem implies that the optimal policy functions are not 'well behaved'. The behaviour of the probability of extinction (and the expected time to extinction), as a function of initial stock, is characterized for various possible configurations of the optimal policy and the technology. Sufficient conditions on the utility and production functions and the rate of impatience, are specified in order to ensure survival of the resource with probability one from some stock level (the minimum safe standard of conservation). Sufficient conditions for almost sure extinction and almost sure survival from all stock levels are also specified. These conditions are related to the corresponding conditions derived in models with deterministic and/or convex technology. 4 figs., 29 refs.

  9. Continuous-time random walks with reset events. Historical background and new perspectives

    Science.gov (United States)

    Montero, Miquel; Masó-Puigdellosas, Axel; Villarroel, Javier

    2017-09-01

    In this paper, we consider a stochastic process that may experience random reset events which relocate the system to its starting position. We focus our attention on a one-dimensional, monotonic continuous-time random walk with a constant drift: the process moves in a fixed direction between the reset events, either by the effect of the random jumps, or by the action of a deterministic bias. However, the orientation of its motion is randomly determined after each restart. As a result of these alternating dynamics, interesting properties do emerge. General formulas for the propagator as well as for two extreme statistics, the survival probability and the mean first-passage time, are also derived. The rigor of these analytical results is verified by numerical estimations, for particular but illuminating examples.
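
    A toy version of such dynamics (our simplification of the paper's setting; rates and targets are invented) already shows how resets shape the first-passage statistics:

      import numpy as np

      rng = np.random.default_rng(3)

      def first_passage_time(L=5.0, reset_rate=0.2, dt=0.01, t_max=1e3):
          # Ballistic motion at unit speed; Poissonian resets return the walker to
          # the origin and re-draw its orientation at random after each restart.
          x, t = 0.0, 0.0
          direction = rng.choice([-1.0, 1.0])
          while t < t_max:
              if rng.random() < reset_rate * dt:   # reset event
                  x, direction = 0.0, rng.choice([-1.0, 1.0])
              x += direction * dt
              t += dt
              if x >= L:
                  return t
          return np.nan

      times = [first_passage_time() for _ in range(200)]
      print(np.nanmean(times))  # crude estimate of the mean first-passage time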

  10. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions were suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert spaces with dimension larger than two. If measurement contexts are included in the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.

  11. Vicious random walkers in the limit of a large number of walkers

    International Nuclear Information System (INIS)

    Forrester, P.J.

    1989-01-01

    The vicious random walker problem on a line is studied in the limit of a large number of walkers. The multidimensional integral representing the probability that the p walkers will survive a time t (denoted P_t^{(p)}) is shown to be analogous to the partition function of a particular one-component Coulomb gas. By assuming the existence of the thermodynamic limit for the Coulomb gas, one can deduce asymptotic formulas for P_t^{(p)} in the large-p, large-t limit. A straightforward analysis gives rigorous asymptotic formulas for the probability that after a time t the walkers are in their initial configuration (this event is termed a reunion). Consequently, asymptotic formulas for the conditional probability of a reunion, given that all walkers survive, are derived. Also, an asymptotic formula for the conditional probability density that any walker will arrive at a particular point in time t, given that all p walkers survive, is calculated in the limit t >> p
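
    The definition is simple to probe by simulation on a small scale (our illustration, not the paper's Coulomb-gas method; all parameter values are arbitrary):

      import numpy as np

      rng = np.random.default_rng(5)

      def survival_probability(p=3, t=100, spacing=2, trials=2000):
          # Fraction of realizations in which p walkers on the integer line,
          # each taking +/-1 steps, never occupy the same site up to time t.
          survived = 0
          for _ in range(trials):
              pos = np.arange(p) * spacing
              alive = True
              for _ in range(t):
                  pos = pos + rng.choice([-1, 1], size=p)
                  if np.unique(pos).size < p:   # two walkers met: the walk is "killed"
                      alive = False
                      break
              survived += alive
          return survived / trials

      print(survival_probability())  # Monte Carlo estimate of P_t^(p)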

  12. Survival and weak chaos.

    Science.gov (United States)

    Nee, Sean

    2018-05-01

    Survival analysis in biology and reliability theory in engineering concern the dynamical functioning of bio/electro/mechanical units. Here we incorporate effects of chaotic dynamics into the classical theory. Dynamical systems theory now distinguishes strong and weak chaos. Strong chaos generates Type II survivorship curves entirely as a result of the internal operation of the system, without any age-independent, external, random forces of mortality. Weak chaos exhibits (a) intermittency and (b) Type III survivorship, defined as a decreasing per capita mortality rate: engineering explicitly defines this pattern of decreasing hazard as 'infant mortality'. Weak chaos generates two phenomena from the normal functioning of the same system. First, infant mortality (sensu engineering) without any external explanatory factors, such as manufacturing defects, which is followed by increased average longevity of survivors. Second, sudden failure of units during their normal period of operation, before the onset of age-dependent mortality arising from senescence. The relevance of these phenomena encompasses, for example: no-fault-found failure of electronic devices; high rates of human early spontaneous miscarriage/abortion; runaway pacemakers; sudden cardiac death in young adults; bipolar disorder; and epilepsy.

  13. Modelling survival after treatment of intraocular melanoma using artificial neural networks and Bayes theorem

    International Nuclear Information System (INIS)

    Taktak, Azzam F G; Fisher, Anthony C; Damato, Bertil E

    2004-01-01

    This paper describes the development of an artificial intelligence (AI) system for survival prediction from intraocular melanoma. The system used artificial neural networks (ANNs) with five input parameters: coronal and sagittal tumour location, anterior tumour margin, largest basal tumour diameter and the cell type. After excluding records with missing data, 2331 patients were included in the study. These were split randomly into training and test sets. Date censorship was applied to the records to deal with patients who were lost to follow-up and patients who died from general causes. Bayes theorem was then applied to the ANN output to construct survival probability curves. A validation set with 34 patients unseen to both training and test sets was used to compare the AI system with Cox's regression (CR) and Kaplan-Meier (KM) analyses. Results showed large differences in the mean 5-year survival probability figures when the number of records with matching characteristics was small. However, as the number of matches increased to >100 the system tended to agree with CR and KM. The validation set was also used to compare the system with a clinical expert in predicting time to metastatic death. The rms error was 3.7 years for the system and 4.3 years for the clinical expert for 15-year survival. For <10-year survival, these figures were 2.7 and 4.2, respectively. We concluded that the AI system can match, if not outperform, the clinical expert's predictions. There were significant differences with CR and KM analyses when the number of records was small, but it was not known which model is more accurate

  14. Modelling survival after treatment of intraocular melanoma using artificial neural networks and Bayes theorem

    Energy Technology Data Exchange (ETDEWEB)

    Taktak, Azzam F G [Department of Clinical Engineering, Duncan Building, Royal Liverpool University Hospital, Liverpool L7 8XP (United Kingdom); Fisher, Anthony C [Department of Clinical Engineering, Duncan Building, Royal Liverpool University Hospital, Liverpool L7 8XP (United Kingdom); Damato, Bertil E [Department of Ophthalmology, Royal Liverpool University Hospital, Liverpool L7 8XP (United Kingdom)

    2004-01-07

    This paper describes the development of an artificial intelligence (AI) system for survival prediction from intraocular melanoma. The system used artificial neural networks (ANNs) with five input parameters: coronal and sagittal tumour location, anterior tumour margin, largest basal tumour diameter and the cell type. After excluding records with missing data, 2331 patients were included in the study. These were split randomly into training and test sets. Date censorship was applied to the records to deal with patients who were lost to follow-up and patients who died from general causes. Bayes theorem was then applied to the ANN output to construct survival probability curves. A validation set with 34 patients unseen to both training and test sets was used to compare the AI system with Cox's regression (CR) and Kaplan-Meier (KM) analyses. Results showed large differences in the mean 5-year survival probability figures when the number of records with matching characteristics was small. However, as the number of matches increased to >100 the system tended to agree with CR and KM. The validation set was also used to compare the system with a clinical expert in predicting time to metastatic death. The rms error was 3.7 years for the system and 4.3 years for the clinical expert for 15-year survival. For <10-year survival, these figures were 2.7 and 4.2, respectively. We concluded that the AI system can match, if not outperform, the clinical expert's predictions. There were significant differences with CR and KM analyses when the number of records was small, but it was not known which model is more accurate.

  15. Blocked Randomization with Randomly Selected Block Sizes

    Directory of Open Access Journals (Sweden)

    Jimmy Efird

    2010-12-01

    Full Text Available When planning a randomized clinical trial, careful consideration must be given to how participants are selected for various arms of a study. Selection and accidental bias may occur when participants are not assigned to study groups with equal probability. A simple random allocation scheme is a process by which each participant has equal likelihood of being assigned to treatment versus referent groups. However, by chance an unequal number of individuals may be assigned to each arm of the study and thus decrease the power to detect statistically significant differences between groups. Block randomization is a commonly used technique in clinical trial design to reduce bias and achieve balance in the allocation of participants to treatment arms, especially when the sample size is small. This method increases the probability that each arm will contain an equal number of individuals by sequencing participant assignments by block. Yet still, the allocation process may be predictable, for example, when the investigator is not blind and the block size is fixed. This paper provides an overview of blocked randomization and illustrates how to avoid selection bias by using random block sizes.
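
    A minimal sketch of the scheme just described (our own; the arm labels and block-size menu are invented): draw each block's size at random, balance assignments within the block, and concatenate.

      import random

      def blocked_randomization(n_participants, block_sizes=(2, 4, 6), seed=2010):
          rng = random.Random(seed)
          allocation = []
          while len(allocation) < n_participants:
              size = rng.choice(block_sizes)    # randomly selected (even) block size
              block = ["T", "C"] * (size // 2)  # balanced treatment/control block
              rng.shuffle(block)                # random order within the block
              allocation.extend(block)
          return allocation[:n_participants]

      print("".join(blocked_randomization(20)))  # e.g. TCCTTC... with near-balance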

  16. Survival pathways under stress

    Indian Academy of Sciences (India)

    Bacteria survive by changing gene expression patterns. Three important pathways will be discussed: the stringent response, quorum sensing, and proteins performing functions to control oxidative damage.

  17. Random number generation

    International Nuclear Information System (INIS)

    Coveyou, R.R.

    1974-01-01

    The subject of random number generation is currently controversial. Differing opinions on this subject seem to stem from implicit or explicit differences in philosophy; in particular, from differing ideas concerning the role of probability in the real world of physical processes, electronic computers, and Monte Carlo calculations. An attempt is made here to reconcile these views. The role of stochastic ideas in mathematical models is discussed. In illustration of these ideas, a mathematical model of the use of random number generators in Monte Carlo calculations is constructed. This model is used to set up criteria for the comparison and evaluation of random number generators. (U.S.)

  18. Non-randomized study on the effects of preoperative radiotherapy and daily administration of low-dose cisplatin against those of radiotherapy alone for oral cancer. Effects on local control, control of metastases, and overall survival

    International Nuclear Information System (INIS)

    Kurita, Hiroshi; Ohtsuka, Akiko; Kobayashi, Hiroichi; Kurashina, Kenji; Shikama, Naoto; Oguchi, Masahiko

    2000-01-01

    Cisplatin is a known radiation modifier. Our previous study suggested that daily administration of low-dose cisplatin enhanced the efficacy of radiotherapy against primary oral squamous carcinoma. In this paper, we follow the patients who participated in the previous study and survey the benefit of combined low-dose cisplatin in improving local control, prevention of metastases, and overall survival. This study included patients with surgically resectable advanced oral tumors. Ten patients underwent preoperative radiotherapy of 30-40 Gy/15-20 days with concomitant daily administration of low-dose cisplatin (5 mg/body or 5 mg/m²). Ten other patients received external radiotherapy alone. All patients then underwent a planned radical tumor resection. No significant difference was seen in loco-regional control rates (primary: 86 vs. 88%, neck: 83 vs. 78% at 48 months) or in the incidence of metastasis (70 vs. 64%) between the two groups. Nor was there a significant difference in the overall survival rate (60 vs. 66%). The results of this study suggest that the concomitant use of daily low-dose cisplatin with preoperative radiation brings no statistically significant benefit in improving local control and survival rates in patients with advanced resectable oral cancer. (author)

  19. On randomly interrupted diffusion

    International Nuclear Information System (INIS)

    Luczka, J.

    1993-01-01

    Processes driven by randomly interrupted Gaussian white noise are considered. An evolution equation for single-event probability distributions is presented. Stationary states are considered as solutions of a second-order ordinary differential equation with two imposed conditions. A linear model is analyzed and its stationary distributions are given explicitly. (author). 10 refs

  20. Hitting probabilities for nonlinear systems of stochastic waves

    CERN Document Server

    Dalang, Robert C

    2015-01-01

    The authors consider a d-dimensional random field u = {u(t,x)} that solves a non-linear system of stochastic wave equations in spatial dimensions k ∈ {1,2,3}, driven by a spatially homogeneous Gaussian noise that is white in time. They mainly consider the case where the spatial covariance is given by a Riesz kernel with exponent β. Using Malliavin calculus, they establish upper and lower bounds on the probabilities that the random field visits a deterministic subset of R^d, in terms, respectively, of Hausdorff measure and Newtonian capacity of this set. The dimension that ap