WorldWideScience

Sample records for survival probability parameters

  1. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
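
    As a sanity check on the headline claim, the following minimal Monte Carlo sketch (not the authors' code; the log-normal loss model, sample size and seed are illustrative assumptions) sets the threshold at the plug-in (1 - eps)-quantile estimated from n observations and measures the realized failure frequency, which comes out above the nominal eps.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(1)
      mu, sigma = 0.0, 1.0                 # true (unknown) log-loss parameters
      n, eps, reps = 30, 0.01, 20000

      failures = 0
      for _ in range(reps):
          sample = rng.normal(mu, sigma, n)            # observed log-losses
          m, s = sample.mean(), sample.std(ddof=1)     # estimated parameters
          threshold = m + s * norm.ppf(1 - eps)        # plug-in (1 - eps)-quantile
          failures += rng.normal(mu, sigma) > threshold
      print("nominal:", eps, "realized:", failures / reps)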

  2. Probability of Survival Decision Aid (PSDA)

    National Research Council Canada - National Science Library

    Xu, Xiaojiang; Amin, Mitesh; Santee, William R

    2008-01-01

    A Probability of Survival Decision Aid (PSDA) is developed to predict survival time for hypothermia and dehydration during prolonged exposure at sea in both air and water for a wide range of environmental conditions...

  3. Joint survival probability via truncated invariant copula

    International Nuclear Information System (INIS)

    Kim, Jeong-Hoon; Ma, Yong-Ki; Park, Chan Yeol

    2016-01-01

    Highlights: • We have studied an issue of dependence structure between default intensities. • We use a multivariate shot noise intensity process, where jumps occur simultaneously and their sizes are correlated. • We obtain the joint survival probability of the integrated intensities by using a copula. • We apply our theoretical result to pricing basket default swap spread. - Abstract: Given an intensity-based credit risk model, this paper studies dependence structure between default intensities. To model this structure, we use a multivariate shot noise intensity process, where jumps occur simultaneously and their sizes are correlated. Through very lengthy algebra, we obtain explicitly the joint survival probability of the integrated intensities by using the truncated invariant Farlie–Gumbel–Morgenstern copula with exponential marginal distributions. We also apply our theoretical result to pricing basket default swap spreads. This result can provide a useful guide for credit risk management.
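
    For orientation only, a hedged sketch of the copula idea using a plain (untruncated) Farlie-Gumbel-Morgenstern copula with exponential marginals; the paper's truncated invariant variant and the shot noise intensity dynamics are not reproduced here.

      import numpy as np

      def joint_survival(t1, t2, lam1, lam2, theta):
          # FGM survival copula applied to exponential survival functions
          s1, s2 = np.exp(-lam1 * t1), np.exp(-lam2 * t2)
          return s1 * s2 * (1.0 + theta * (1.0 - s1) * (1.0 - s2))

      print(joint_survival(1.0, 2.0, 0.3, 0.5, 0.4))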

  4. Bayesian Analysis for EMP Survival Probability of Solid State Relay

    International Nuclear Information System (INIS)

    Sun Beiyun; Zhou Hui; Cheng Xiangyue; Mao Congguang

    2009-01-01

    The principle of estimating the parameter p of a binomial distribution by the Bayesian method and several non-informative priors are introduced. The survival probability of a DC solid state relay under current injection at a certain amplitude is obtained by this method. (authors)
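
    A minimal sketch of the stated method, assuming the Jeffreys non-informative prior Beta(1/2, 1/2) and made-up test counts: with s survivals in n go/no-go EMP tests, the posterior for the survival probability p is Beta(s + 1/2, n - s + 1/2).

      from scipy.stats import beta

      n, s = 20, 19                        # hypothetical test counts
      post = beta(s + 0.5, n - s + 0.5)    # Jeffreys posterior for p
      print("posterior mean:", post.mean())
      print("95% lower credible bound:", post.ppf(0.05))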

  5. Finite-size scaling of survival probability in branching processes.

    Science.gov (United States)

    Garcia-Millan, Rosalba; Font-Clos, Francesc; Corral, Álvaro

    2015-04-01

    Branching processes pervade many models in statistical physics. We investigate the survival probability of a Galton-Watson branching process after a finite number of generations. We derive analytically the existence of finite-size scaling for the survival probability as a function of the control parameter and the maximum number of generations, obtaining the critical exponents as well as the exact scaling function, which is G(y) = 2y e^y / (e^y - 1), with y the rescaled distance to the critical point. Our findings are valid for any branching process of the Galton-Watson type, independently of the distribution of the number of offspring, provided its variance is finite. This proves the universal behavior of the finite-size effects in branching processes, including the universality of the metric factors. The direct relation to mean-field percolation is also discussed.
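
    The survival probability after a finite number of generations is easy to reproduce numerically: iterate the offspring probability generating function f, since the extinction probability by generation t satisfies q_t = f(q_{t-1}). The Poisson offspring law below is an illustrative choice; per the paper, any finite-variance law shows the same scaling behavior.

      import numpy as np

      def survival(m, t):
          q = 0.0                         # P(extinct by generation 0)
          for _ in range(t):
              q = np.exp(m * (q - 1.0))   # offspring PGF of Poisson(m) at q
          return 1.0 - q

      for m in (0.95, 1.0, 1.05):         # sub-, exactly, and supercritical
          print(m, survival(m, 100))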

  6. Survival chance in papillary thyroid cancer in Hungary: individual survival probability estimation using the Markov method

    International Nuclear Information System (INIS)

    Esik, Olga; Tusnady, Gabor; Daubner, Kornel; Nemeth, Gyoergy; Fuezy, Marton; Szentirmay, Zoltan

    1997-01-01

    Purpose: The typically benign, but occasionally rapidly fatal clinical course of papillary thyroid cancer has raised the need for individual survival probability estimation, to tailor the treatment strategy exclusively to a given patient. Materials and methods: A retrospective study was performed on 400 papillary thyroid cancer patients with a median follow-up time of 7.1 years to establish a clinical database for uni- and multivariate analysis of the prognostic factors related to survival (Kaplan-Meier product limit method and Cox regression). For a more precise prognosis estimation, the effect of the most important clinical events were then investigated on the basis of a Markov renewal model. The basic concept of this approach is that each patient has an individual disease course which (besides the initial clinical categories) is affected by special events, e.g. internal covariates (local/regional/distant relapses). On the supposition that these events and the cause-specific death are influenced by the same biological processes, the parameters of transient survival probability characterizing the speed of the course of the disease for each clinical event and their sequence were determined. The individual survival curves for each patient were calculated by using these parameters and the independent significant clinical variables selected from multivariate studies, summation of which resulted in a mean cause-specific survival function valid for the entire group. On the basis of this Markov model, prediction of the cause-specific survival probability is possible for extrastudy cases, if it is supposed that the clinical events occur within new patients in the same manner and with the similar probability as within the study population. Results: The patient's age, a distant metastasis at presentation, the extent of the surgical intervention, the primary tumor size and extent (pT), the external irradiation dosage and the degree of TSH suppression proved to be

  7. Finite-size scaling of survival probability in branching processes

    OpenAIRE

    Garcia-Millan, Rosalba; Font-Clos, Francesc; Corral, Alvaro

    2014-01-01

    Branching processes pervade many models in statistical physics. We investigate the survival probability of a Galton-Watson branching process after a finite number of generations. We reveal the finite-size scaling law of the survival probability for a given branching process ruled by a probability distribution of the number of offspring per element whose standard deviation is finite, obtaining the exact scaling function as well as the critical exponents. Our findings prove the universal behavi...

  8. Gluon saturation: Survival probability for leading neutrons in DIS

    International Nuclear Information System (INIS)

    Levin, Eugene; Tapia, Sebastian

    2012-01-01

    In this paper we discuss an example of a one rapidity gap process: the inclusive cross sections of leading neutrons in deep inelastic scattering (DIS) on protons. The equations for this process are proposed and solved, giving an example of a theoretical calculation of the survival probability for one rapidity gap processes. It turns out that the value of the survival probability is small and decreases with energy.

  9. Fusion probability and survivability in estimates of heaviest nuclei production

    International Nuclear Information System (INIS)

    Sagaidak, Roman

    2012-01-01

    A number of theoretical models have been recently developed to predict production cross sections for the heaviest nuclei in fusion-evaporation reactions. All the models reproduce cross sections obtained in experiments quite well. At the same time they give fusion probability values P_fus ≡ P_CN that differ by several orders of magnitude. This difference implies a corresponding distinction in the calculated values of survivability. The production of the heaviest nuclei (from Cm to the region of superheavy elements (SHE) close to Z = 114 and N = 184) in fusion-evaporation reactions induced by heavy ions has been considered in a systematic way within the framework of the barrier-passing (fusion) model coupled with the standard statistical model (SSM) of the compound nucleus (CN) decay. Both models are incorporated into the HIVAP code. Available data on the excitation functions for fission and evaporation residues (ER) produced in very asymmetric combinations can be described rather well within the framework of HIVAP. Cross-section data obtained in these reactions allow one to choose model parameters quite definitely. Thus one can scale and fix macroscopic (liquid-drop) fission barriers for nuclei involved in the evaporation-fission cascade. In less asymmetric combinations (with 22Ne and heavier projectiles) effects of fusion suppression caused by quasi-fission are starting to appear in the entrance channel of reactions. The P_fus values derived from the capture-fission and fusion-fission cross-sections obtained at energies above the Bass barrier were plotted as a function of the Coulomb parameter. For more symmetric combinations one can deduce the P_fus values semi-empirically, using the ER and fission excitation functions measured in experiments, and applying the SSM with parameters obtained in the analysis of a very asymmetric combination leading to the production of (nearly) the same CN, as was done for reactions leading to pre-actinide nuclei formation

  10. Estimating the joint survival probabilities of married individuals

    NARCIS (Netherlands)

    Sanders, Lisanne; Melenberg, Bertrand

    We estimate the joint survival probability of spouses using a large random sample drawn from a Dutch census. As benchmarks we use two bivariate Weibull models. We consider more flexible models, using a semi-nonparametric approach, by extending the independent Weibull distribution using squared
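
    As a point of reference, the independent-Weibull benchmark reduces to a product of marginal survival functions; a short sketch with illustrative parameters (not the Dutch-census estimates) is given below.

      import numpy as np

      def weibull_s(t, scale, shape):
          return np.exp(-(t / scale) ** shape)

      def joint_s_independent(t1, t2, sc1, sh1, sc2, sh2):
          # joint survival of two independent Weibull lifetimes
          return weibull_s(t1, sc1, sh1) * weibull_s(t2, sc2, sh2)

      print(joint_s_independent(10.0, 10.0, 80.0, 9.0, 85.0, 9.5))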

  11. Fusion probability and survivability in estimates of heaviest nuclei production

    Directory of Open Access Journals (Sweden)

    Sagaidak Roman N.

    2012-02-01

    Production of the heavy and heaviest nuclei (from Po to the region of superheavy elements close to Z=114 and N=184) in fusion-evaporation reactions induced by heavy ions has been considered in a systematic way within the framework of the barrier-passing model coupled with the statistical model (SM) of de-excitation of a compound nucleus (CN). Excitation functions for fission and evaporation residues (ER) measured in very asymmetric combinations can be described rather well. One can scale and fix macroscopic (liquid-drop) fission barriers for nuclei involved in the calculation of survivability with SM. In less asymmetric combinations, effects of fusion suppression caused by quasi-fission (QF) are starting to appear in the entrance channel of reactions. QF effects could be semi-empirically taken into account using fusion probabilities deduced as the ratio of measured ER cross sections to the ones obtained under the assumption of no fusion suppression in the corresponding reactions. SM parameters (fission barriers) obtained in the analysis of a very asymmetric combination leading to the production of (nearly) the same CN should be used for this evaluation.

  12. [Survival analysis with competing risks: estimating failure probability].

    Science.gov (United States)

    Llorca, Javier; Delgado-Rodríguez, Miguel

    2004-01-01

    To show the impact of competing risks of death on survival analysis. We provide an example of survival time without chronic rejection after heart transplantation, where death before rejection acts as a competing risk. Using a computer simulation, we compare the Kaplan-Meier estimator and the multiple decrement model. The Kaplan-Meier method overestimated the probability of rejection. Next, we illustrate the use of the multiple decrement model to analyze secondary end points (in our example: death after rejection). Finally, we discuss Kaplan-Meier assumptions and why they fail in the presence of competing risks. Survival analysis should be adjusted for competing risks of death to avoid overestimation of the risk of rejection produced with the Kaplan-Meier method.
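
    The size of the bias is easy to see analytically in a toy version with independent exponential hazards for rejection (lam_r) and prior death (lam_d): Kaplan-Meier with deaths censored targets 1 - exp(-lam_r t), while the multiple decrement (cumulative incidence) probability of rejection by time t is strictly smaller. The rates below are made up.

      import numpy as np

      lam_r, lam_d, t = 0.10, 0.05, 10.0
      km_estimand = 1.0 - np.exp(-lam_r * t)
      cum_incidence = lam_r / (lam_r + lam_d) * (1.0 - np.exp(-(lam_r + lam_d) * t))
      print("Kaplan-Meier estimand:", km_estimand)
      print("cumulative incidence :", cum_incidence)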

  13. Fingerprints of exceptional points in the survival probability of resonances in atomic spectra

    International Nuclear Information System (INIS)

    Cartarius, Holger; Moiseyev, Nimrod

    2011-01-01

    The unique time signature of the survival probability exactly at the exceptional point parameters is studied here for the hydrogen atom in strong static magnetic and electric fields. We show that indeed the survival probability S(t) = |⟨Ψ(0)|Ψ(t)⟩|² decays exactly as |1 - at|² e^(-Γ_EP t/ℏ), where Γ_EP is associated with the decay rate at the exceptional point and a is a complex constant depending solely on the initial wave packet that populates exclusively the two almost degenerate states of the non-Hermitian Hamiltonian. This may open the possibility for a first experimental detection of exceptional points in a quantum system.

  14. Fingerprints of exceptional points in the survival probability of resonances in atomic spectra

    Science.gov (United States)

    Cartarius, Holger; Moiseyev, Nimrod

    2011-07-01

    The unique time signature of the survival probability exactly at the exceptional point parameters is studied here for the hydrogen atom in strong static magnetic and electric fields. We show that indeed the survival probability S(t) = |⟨Ψ(0)|Ψ(t)⟩|² decays exactly as |1 - at|² e^(-Γ_EP t/ℏ), where Γ_EP is associated with the decay rate at the exceptional point and a is a complex constant depending solely on the initial wave packet that populates exclusively the two almost degenerate states of the non-Hermitian Hamiltonian. This may open the possibility for a first experimental detection of exceptional points in a quantum system.
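
    The quoted time signature is straightforward to evaluate; the sketch below uses hbar = 1 and invented values of a and Gamma_EP, not the hydrogen-atom parameters of the paper.

      import numpy as np

      hbar = 1.0
      a = 0.4 + 0.2j          # hypothetical complex constant from the wave packet
      gamma_ep = 0.5          # decay rate at the exceptional point
      t = np.linspace(0.0, 20.0, 5)
      s = np.abs(1.0 - a * t) ** 2 * np.exp(-gamma_ep * t / hbar)
      print(s)                # |1 - at|^2 exp(-Gamma_EP t / hbar)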

  15. CGC/saturation approach for soft interactions at high energy: survival probability of central exclusive production

    Energy Technology Data Exchange (ETDEWEB)

    Gotsman, E.; Maor, U. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel); Levin, E. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel); Universidad Tecnica Federico Santa Maria, Departemento de Fisica, Centro Cientifico-Tecnologico de Valparaiso, Valparaiso (Chile)

    2016-04-15

    We estimate the value of the survival probability for central exclusive production in a model which is based on the CGC/saturation approach. Hard and soft processes are described in the same framework. At LHC energies, we obtain a small value for the survival probability. The source of the small value is the impact parameter dependence of the hard amplitude. Our model has successfully described a large body of soft data: elastic, inelastic and diffractive cross sections, inclusive production and rapidity correlations, as well as the t-dependence of deep inelastic diffractive production of vector mesons. (orig.)

  16. Survival probabilities for branching Brownian motion with absorption

    OpenAIRE

    Harris, John; Harris, Simon

    2007-01-01

    We study a branching Brownian motion (BBM) with absorption, in which particles move as Brownian motions with drift $-\rho$, undergo dyadic branching at rate $\beta>0$, and are killed on hitting the origin. In the case $\rho>\sqrt{2\beta}$ the extinction time for this process, $\zeta$, is known to be finite almost surely. The main result of this article is a large-time asymptotic formula for the survival probability $P^x(\zeta>t)$ in the case $\rho>\sqrt{2\beta}$, where $P^x$ is...

  17. Survival probability of diffusion with trapping in cellular neurobiology

    Science.gov (United States)

    Holcman, David; Marchewka, Avi; Schuss, Zeev

    2005-09-01

    The problem of diffusion with absorption and trapping sites arises in the theory of molecular signaling inside and on the membranes of biological cells. In particular, this problem arises in the case of spine-dendrite communication, where the number of calcium ions, modeled as random particles, is regulated across the spine microstructure by pumps, which play the role of killing sites, while the end of the dendritic shaft is an absorbing boundary. We develop a general mathematical framework for diffusion in the presence of absorption and killing sites and apply it to the computation of the time-dependent survival probability of ions. We also compute the ratio of the number of absorbed particles at a specific location to the number of killed particles. We show that the ratio depends on the distribution of killing sites. The biological consequence is that the position of the pumps regulates the fraction of calcium ions that reach the dendrite.
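
    A toy one-dimensional version of the framework can be simulated directly (geometry, rates and discretization below are all assumptions): particles diffuse on [0, L], are absorbed at x = L standing in for the dendrite, and are killed at rate k while inside a pump region.

      import numpy as np

      rng = np.random.default_rng(8)
      L, D, dt, k = 1.0, 1.0, 1e-3, 5.0
      p0, p1, n = 0.2, 0.4, 2000           # pump region and particle count

      absorbed = killed = 0
      for _ in range(n):
          x = 0.0
          while True:
              x += np.sqrt(2 * D * dt) * rng.standard_normal()
              x = abs(x)                   # reflecting boundary at x = 0
              if x >= L:                   # absorbed at the dendrite end
                  absorbed += 1; break
              if p0 < x < p1 and rng.random() < k * dt:
                  killed += 1; break       # captured by a pump
      print("absorbed:", absorbed / n, " killed:", killed / n)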

  18. Bounds on survival probability given mean probability of failure per demand; and the paradoxical advantages of uncertainty

    International Nuclear Information System (INIS)

    Strigini, Lorenzo; Wright, David

    2014-01-01

    When deciding whether to accept into service a new safety-critical system, or choosing between alternative systems, uncertainty about the parameters that affect future failure probability may be a major problem. This uncertainty can be extreme if there is the possibility of unknown design errors (e.g. in software), or wide variation between nominally equivalent components. We study the effect of parameter uncertainty on future reliability (survival probability), for systems required to have low risk of even only one failure or accident over the long term (e.g. their whole operational lifetime) and characterised by a single reliability parameter (e.g. probability of failure per demand – pfd). A complete mathematical treatment requires stating a probability distribution for any parameter with uncertain value. This is hard, so calculations are often performed using point estimates, like the expected value. We investigate conditions under which such simplified descriptions yield reliability values that are sure to be pessimistic (or optimistic) bounds for a prediction based on the true distribution. Two important observations are (i) using the expected value of the reliability parameter as its true value guarantees a pessimistic estimate of reliability, a useful property in most safety-related decisions; (ii) with a given expected pfd, broader distributions (in a formally defined meaning of “broader”), that is, systems that are a priori “less predictable”, lower the risk of failures or accidents. Result (i) justifies the simplification of using a mean in reliability modelling; we discuss within which scope this justification applies, and explore related scenarios, e.g. how things improve if we can test the system before operation. Result (ii) not only offers more flexible ways of bounding reliability predictions, but also has important, often counter-intuitive implications for decision making in various areas, like selection of components, project management
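
    Observation (i) is an instance of Jensen's inequality: survival over n independent demands is (1 - p)^n, a convex function of p, so E[(1 - p)^n] >= (1 - E[p])^n. A quick numerical sketch follows; the Beta distribution for the pfd is an arbitrary illustrative choice.

      import numpy as np

      rng = np.random.default_rng(0)
      p = rng.beta(0.5, 499.5, size=200_000)    # uncertain pfd, E[p] = 1e-3
      n = 1000                                  # number of demands
      print("true survival  E[(1-p)^n]:", np.mean((1.0 - p) ** n))
      print("plug-in bound (1-E[p])^n :", (1.0 - p.mean()) ** n)

    Making the distribution of p broader while holding E[p] fixed raises E[(1 - p)^n], which is result (ii) in miniature.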

  19. Impact parameter dependence of inner-shell ionization probabilities

    International Nuclear Information System (INIS)

    Cocke, C.L.

    1974-01-01

    The probability for ionization of an inner shell of a target atom by a heavy charged projectile is a sensitive function of the impact parameter characterizing the collision. This probability can be measured experimentally by detecting the x-ray resulting from radiative filling of the inner shell in coincidence with the projectile scattered at a determined angle, and by using the scattering angle to deduce the impact parameter. It is conjectured that the functional dependence of the ionization probability may be a more sensitive probe of the ionization mechanism than is a total cross section measurement. Experimental results for the K-shell ionization of both solid and gas targets by oxygen, carbon and fluorine projectiles in the MeV/amu energy range will be presented, and their use in illuminating the inelastic collision process discussed

  20. Mean exit time and survival probability within the CTRW formalism

    Science.gov (United States)

    Montero, M.; Masoliver, J.

    2007-05-01

    Intense research on financial market microstructure is presently in progress. Continuous time random walks (CTRWs) are general models capable of capturing the small-scale properties that high frequency data series show. The use of CTRW models in the analysis of financial problems is quite recent and their potential has not been fully developed. Here we present two (closely related) applications of great interest in risk control. In the first place, we review the problem of modelling the behaviour of the mean exit time (MET) of a process out of a given region of fixed size. The surveyed stochastic processes are the cumulative returns of asset prices. The link between the value of the MET and the timescale of market fluctuations of a certain degree is clear. In this sense, the MET value may help, for instance, in deciding the optimal time horizon for an investment. The MET is, however, one among the statistics of a distribution of greater interest: the survival probability (SP), the likelihood that after some lapse of time a process remains inside the given region without having crossed its boundaries. The final part of the manuscript is devoted to the study of this quantity. Note that the use of SPs may outperform the standard “Value at Risk” (VaR) method for two reasons: we can consider market dynamics other than the limited Wiener process and, even in this case, a risk level derived from the SP will ensure (within the desired quantile) that the quoted value of the portfolio will not leave the safety zone. We present some preliminary theoretical and applied results concerning this topic.
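
    A bare-bones sketch of both statistics (plain Gaussian increments rather than a fitted CTRW; interval size, volatility and horizon are invented): simulate cumulative returns, record when they first leave [-L, L], and report the mean exit time and the survival probability at the horizon.

      import numpy as np

      rng = np.random.default_rng(2)
      L, sigma, n_paths, n_steps = 0.05, 0.01, 20_000, 200

      x = np.zeros(n_paths)
      alive = np.ones(n_paths, dtype=bool)
      exit_time = np.full(n_paths, n_steps)
      for step in range(1, n_steps + 1):
          x[alive] += rng.normal(0.0, sigma, alive.sum())
          just_out = alive & (np.abs(x) > L)
          exit_time[just_out] = step
          alive &= ~just_out
      print("mean exit time (steps):", exit_time[exit_time < n_steps].mean())
      print("survival probability at horizon:", alive.mean())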

  1. Survival and compound nucleus probability of super heavy element Z = 117

    Energy Technology Data Exchange (ETDEWEB)

    Manjunatha, H.C. [Government College for Women, Department of Physics, Kolar, Karnataka (India); Sridhar, K.N. [Government First grade College, Department of Physics, Kolar, Karnataka (India)

    2017-05-15

    As a part of a systematic study for predicting the most suitable projectile-target combinations for heavy-ion fusion experiments in the synthesis of {sup 289-297}Ts, we have calculated the transmission probability (T{sub l}), compound nucleus formation probabilities (P{sub CN}) and survival probability (P{sub sur}) of possible projectile-target combinations. We have also studied the fusion cross section, survival cross section and fission cross sections for different projectile-target combinations of {sup 289-297}Ts. These theoretical parameters are required before the synthesis of the super heavy element. The calculated probabilities and cross sections show that the production of isotopes of the super heavy element with Z = 117 is strongly dependent on the reaction systems. The most probable reactions to synthesize the super heavy nuclei {sup 289-297}Ts are worked out and listed explicitly. We have also studied the variation of P{sub CN} and P{sub sur} with the mass number of projectile and target nuclei. This work is useful in the synthesis of the super heavy element Z = 117. (orig.)

  2. Survival and compound nucleus probability of super heavy element Z = 117

    International Nuclear Information System (INIS)

    Manjunatha, H.C.; Sridhar, K.N.

    2017-01-01

    As a part of a systematic study for predicting the most suitable projectile-target combinations for heavy-ion fusion experiments in the synthesis of 289-297Ts, we have calculated the transmission probability (T_l), compound nucleus formation probabilities (P_CN) and survival probability (P_sur) of possible projectile-target combinations. We have also studied the fusion cross section, survival cross section and fission cross sections for different projectile-target combinations of 289-297Ts. These theoretical parameters are required before the synthesis of the super heavy element. The calculated probabilities and cross sections show that the production of isotopes of the super heavy element with Z = 117 is strongly dependent on the reaction systems. The most probable reactions to synthesize the super heavy nuclei 289-297Ts are worked out and listed explicitly. We have also studied the variation of P_CN and P_sur with the mass number of projectile and target nuclei. This work is useful in the synthesis of the super heavy element Z = 117. (orig.)

  3. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    Energy Technology Data Exchange (ETDEWEB)

    Garza, J. [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States); Millwater, H., E-mail: harry.millwater@utsa.edu [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States)

    2012-04-15

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.
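
    The reuse-the-samples idea can be illustrated with the score (likelihood-ratio) method: each Monte Carlo failure indicator is weighted by the derivative of the log-probability of its detection outcome with respect to a POD parameter. The log-probit POD form and every number below are assumptions for the sketch, not the paper's model.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(3)
      N, a_crit = 500_000, 2.0
      mu_pod, sig_pod = 0.5, 0.4               # POD(a) = Phi((ln a - mu)/sig)

      a = rng.lognormal(0.3, 0.5, N)           # sampled crack sizes
      z = (np.log(a) - mu_pod) / sig_pod
      pod = np.clip(norm.cdf(z), 1e-12, 1 - 1e-12)
      detected = rng.random(N) < pod
      fail = (~detected) & (a > a_crit)        # undetected critical crack
      pof = fail.mean()

      # score of each Bernoulli detection outcome w.r.t. mu_pod
      score = np.where(detected, -norm.pdf(z) / (sig_pod * pod),
                       norm.pdf(z) / (sig_pod * (1.0 - pod)))
      sens = (fail * score).mean()             # d(POF)/d(mu_pod), same samples
      print("POF:", pof, " dPOF/dmu_pod:", sens)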

  4. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2012-01-01

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.

  5. Assigning probability distributions to input parameters of performance assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, Srikanta [INTERA Inc., Austin, TX (United States)

    2002-02-01

    This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness of fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, and are based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of the Bayes theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available.
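
    Two of the surveyed ingredients in a minimal sketch, with made-up data and priors: (a) maximum likelihood fitting of a continuous distribution, and (c) a Bayesian update of a prior when new observations arrive (here the conjugate Normal-mean case for simplicity).

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)

      # (a) fit a log-normal to sample data by maximum likelihood
      data = rng.lognormal(mean=1.0, sigma=0.5, size=200)
      shape, loc, scale = stats.lognorm.fit(data, floc=0)
      print("fitted sigma and median:", shape, scale)

      # (c) conjugate Bayes update of a Normal mean with known variance
      prior_mu, prior_var, obs_var = 0.0, 4.0, 1.0
      x = rng.normal(1.5, np.sqrt(obs_var), size=10)
      post_var = 1.0 / (1.0 / prior_var + len(x) / obs_var)
      post_mu = post_var * (prior_mu / prior_var + x.sum() / obs_var)
      print("posterior mean and variance:", post_mu, post_var)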

  6. Assigning probability distributions to input parameters of performance assessment models

    International Nuclear Information System (INIS)

    Mishra, Srikanta

    2002-02-01

    This study presents an overview of various approaches for assigning probability distributions to input parameters and/or future states of performance assessment models. Specifically, three broad approaches are discussed for developing input distributions: (a) fitting continuous distributions to data, (b) subjective assessment of probabilities, and (c) Bayesian updating of prior knowledge based on new information. The report begins with a summary of the nature of data and distributions, followed by a discussion of several common theoretical parametric models for characterizing distributions. Next, various techniques are presented for fitting continuous distributions to data. These include probability plotting, method of moments, maximum likelihood estimation and nonlinear least squares analysis. The techniques are demonstrated using data from a recent performance assessment study for the Yucca Mountain project. Goodness of fit techniques are also discussed, followed by an overview of how distribution fitting is accomplished in commercial software packages. The issue of subjective assessment of probabilities is dealt with in terms of the maximum entropy distribution selection approach, as well as some common rules for codifying informal expert judgment. Formal expert elicitation protocols are discussed next, and are based primarily on the guidance provided by the US NRC. The Bayesian framework for updating prior distributions (beliefs) when new information becomes available is discussed. A simple numerical approach is presented for facilitating practical applications of the Bayes theorem. Finally, a systematic framework for assigning distributions is presented: (a) for the situation where enough data are available to define an empirical CDF or fit a parametric model to the data, and (b) to deal with the situation where only a limited amount of information is available.

  7. Probability of detection as a function of multiple influencing parameters

    Energy Technology Data Exchange (ETDEWEB)

    Pavlovic, Mato

    2014-10-15

    Non-destructive testing is subject to measurement uncertainties. In safety critical applications the reliability assessment of its capability to detect flaws is therefore necessary. In most applications, the flaw size is the single most important parameter that influences the probability of detection (POD) of the flaw. That is why the POD is typically calculated and expressed as a function of the flaw size. The capability of the inspection system to detect flaws is established by comparing the size of reliably detected flaw with the size of the flaw that is critical for the structural integrity. Applications where several factors have an important influence on the POD are investigated in this dissertation. To devise a reliable estimation of the NDT system capability it is necessary to express the POD as a function of all these factors. A multi-parameter POD model is developed. It enables POD to be calculated and expressed as a function of several influencing parameters. The model was tested on the data from the ultrasonic inspection of copper and cast iron components with artificial flaws. Also, a technique to spatially present POD data called the volume POD is developed. The fusion of the POD data coming from multiple inspections of the same component with different sensors is performed to reach the overall POD of the inspection system.
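
    A minimal sketch of the multi-parameter idea: a logistic POD in log flaw size with one extra influencing factor. The functional form and coefficients are illustrative assumptions, not the dissertation's fitted model.

      import numpy as np

      def pod(size_mm, angle_deg, b0=-8.0, b1=4.0, b2=-0.02):
          # logistic POD in ln(size) with a second influencing parameter
          eta = b0 + b1 * np.log(size_mm) + b2 * angle_deg
          return 1.0 / (1.0 + np.exp(-eta))

      for size in (2.0, 5.0, 10.0):
          print(size, pod(size, 0.0), pod(size, 45.0))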

  8. Probability of detection as a function of multiple influencing parameters

    International Nuclear Information System (INIS)

    Pavlovic, Mato

    2014-01-01

    Non-destructive testing is subject to measurement uncertainties. In safety critical applications the reliability assessment of its capability to detect flaws is therefore necessary. In most applications, the flaw size is the single most important parameter that influences the probability of detection (POD) of the flaw. That is why the POD is typically calculated and expressed as a function of the flaw size. The capability of the inspection system to detect flaws is established by comparing the size of reliably detected flaw with the size of the flaw that is critical for the structural integrity. Applications where several factors have an important influence on the POD are investigated in this dissertation. To devise a reliable estimation of the NDT system capability it is necessary to express the POD as a function of all these factors. A multi-parameter POD model is developed. It enables POD to be calculated and expressed as a function of several influencing parameters. The model was tested on the data from the ultrasonic inspection of copper and cast iron components with artificial flaws. Also, a technique to spatially present POD data called the volume POD is developed. The fusion of the POD data coming from multiple inspections of the same component with different sensors is performed to reach the overall POD of the inspection system.

  9. Estimating the concordance probability in a survival analysis with a discrete number of risk groups.

    Science.gov (United States)

    Heller, Glenn; Mo, Qianxing

    2016-04-01

    A clinical risk classification system is an important component of a treatment decision algorithm. A measure used to assess the strength of a risk classification system is discrimination, and when the outcome is survival time, the most commonly applied global measure of discrimination is the concordance probability. The concordance probability represents the pairwise probability of lower patient risk given longer survival time. The c-index and the concordance probability estimate have been used to estimate the concordance probability when patient-specific risk scores are continuous. In the current paper, the concordance probability estimate and an inverse probability censoring weighted c-index are modified to account for discrete risk scores. Simulations are generated to assess the finite sample properties of the concordance probability estimate and the weighted c-index. An application of these measures of discriminatory power to a metastatic prostate cancer risk classification system is examined.
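
    A naive pairwise sketch of the concordance idea for discrete scores (higher score = higher risk): comparable pairs are those in which the shorter observed time ends in an event, and tied scores count one half. The inverse probability censoring weights of the paper's estimators are omitted.

      import numpy as np

      def c_index(time, event, score):
          num = den = 0.0
          n = len(time)
          for i in range(n):
              for j in range(n):
                  if event[i] and time[i] < time[j]:  # i fails before j
                      den += 1.0
                      if score[i] > score[j]:
                          num += 1.0
                      elif score[i] == score[j]:      # tied risk groups
                          num += 0.5
          return num / den

      t = np.array([5., 8., 3., 9., 6., 2.])
      e = np.array([1, 0, 1, 1, 0, 1])
      g = np.array([2, 1, 3, 1, 2, 3])                # discrete risk groups
      print(c_index(t, e, g))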

  10. Survival probability in a one-dimensional quantum walk on a trapped lattice

    International Nuclear Information System (INIS)

    Gönülol, Meltem; Aydıner, Ekrem; Shikano, Yutaka; Müstecaplıoğlu, Özgür E

    2011-01-01

    The dynamics of the survival probability of quantum walkers on a one-dimensional lattice with random distribution of absorbing immobile traps is investigated. The survival probability of quantum walkers is compared with that of classical walkers. It is shown that the time dependence of the survival probability of quantum walkers has a piecewise stretched exponential character depending on the density of traps in numerical and analytical observations. The crossover between the quantum analogues of the Rosenstock and Donsker-Varadhan behavior is identified.
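
    The classical (Rosenstock) side of the comparison is simple to simulate: with traps placed independently at density c, the trap-averaged survival of a walker after n steps is E[(1 - c)^R_n], where R_n is the number of distinct sites visited. The sketch below does only this classical benchmark; the quantum walk itself is not modeled.

      import numpy as np

      rng = np.random.default_rng(4)
      c, n_steps, n_walks = 0.1, 500, 5000

      surv = 0.0
      for _ in range(n_walks):
          path = rng.choice((-1, 1), size=n_steps).cumsum()
          distinct = len(np.unique(np.concatenate(([0], path))))
          surv += (1.0 - c) ** distinct
      print("survival probability:", surv / n_walks)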

  11. Estimates of annual survival probabilities for adult Florida manatees (Trichechus manatus latirostris)

    Science.gov (United States)

    Langtimm, C.A.; O'Shea, T.J.; Pradel, R.; Beck, C.A.

    1998-01-01

    The population dynamics of large, long-lived mammals are particularly sensitive to changes in adult survival. Understanding factors affecting survival patterns is therefore critical for developing and testing theories of population dynamics and for developing management strategies aimed at preventing declines or extinction in such taxa. Few studies have used modern analytical approaches for analyzing variation and testing hypotheses about survival probabilities in large mammals. This paper reports a detailed analysis of annual adult survival in the Florida manatee (Trichechus manatus latirostris), an endangered marine mammal, based on a mark-recapture approach. Natural and boat-inflicted scars distinctively 'marked' individual manatees that were cataloged in a computer-based photographic system. Photo-documented resightings provided 'recaptures.' Using open population models, annual adult-survival probabilities were estimated for manatees observed in winter in three areas of Florida: Blue Spring, Crystal River, and the Atlantic coast. After using goodness-of-fit tests in Program RELEASE to search for violations of the assumptions of mark-recapture analysis, survival and sighting probabilities were modeled under several different biological hypotheses with Program SURGE. Estimates of mean annual probability of sighting varied from 0.948 for Blue Spring to 0.737 for Crystal River and 0.507 for the Atlantic coast. At Crystal River and Blue Spring, annual survival probabilities were best estimated as constant over the study period at 0.96 (95% CI = 0.951-0.975 and 0.900-0.985, respectively). On the Atlantic coast, where manatees are impacted more by human activities, annual survival probabilities had a significantly lower mean estimate of 0.91 (95% CI = 0.887-0.926) and varied unpredictably over the study period. For each study area, survival did not differ between sexes and was independent of relative adult age. The high constant adult-survival probabilities estimated

  12. Measuring survival time: a probability-based approach useful in healthcare decision-making.

    Science.gov (United States)

    2011-01-01

    In some clinical situations, the choice between treatment options takes into account their impact on patient survival time. Due to practical constraints (such as loss to follow-up), survival time is usually estimated using a probability calculation based on data obtained in clinical studies or trials. The two techniques most commonly used to estimate survival times are the Kaplan-Meier method and the actuarial method. Despite their limitations, they provide useful information when choosing between treatment options.
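
    A self-contained Kaplan-Meier sketch (assuming distinct observation times): the survival estimate is the product over event times of (1 - d_i/n_i), with d_i events and n_i subjects still at risk; censored subjects leave the risk set without contributing a factor.

      import numpy as np

      def kaplan_meier(time, event):
          order = np.argsort(time)
          time, event = time[order], event[order]
          n = len(time)
          s, curve = 1.0, []
          for i, (t, e) in enumerate(zip(time, event)):
              if e:                         # event: multiply in (1 - 1/n_i)
                  s *= 1.0 - 1.0 / (n - i)
              curve.append((t, s))          # censored: risk set just shrinks
          return curve

      t = np.array([2., 3., 4., 7., 8., 12.])
      e = np.array([1, 1, 0, 1, 0, 1])      # 1 = event, 0 = censored
      for point in kaplan_meier(t, e):
          print(point)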

  13. Survival probability of a local excitation in a non-Markovian environment: Survival collapse, Zeno and anti-Zeno effects

    International Nuclear Information System (INIS)

    Rufeil-Fiori, E.; Pastawski, H.M.

    2009-01-01

    The decay dynamics of a local excitation interacting with a non-Markovian environment, modeled by a semi-infinite tight-binding chain, is exactly evaluated. We identify distinctive regimes for the dynamics. Sequentially: (i) early quadratic decay of the initial-state survival probability, up to a spreading time t_S, (ii) exponential decay described by a self-consistent Fermi Golden Rule, and (iii) asymptotic behavior governed by quantum diffusion through the return processes, leading to an inverse power law decay. At this last cross-over time t_R a survival collapse becomes possible. This could reduce the survival probability by several orders of magnitude. The cross-over times t_S and t_R allow one to assess the range of applicability of the Fermi Golden Rule and give the conditions for the observation of the Zeno and anti-Zeno effects.

  14. Estimating Probability of Default on Peer to Peer Market – Survival Analysis Approach

    Directory of Open Access Journals (Sweden)

    Đurović Andrija

    2017-05-01

    Arguably a cornerstone of credit risk modelling is the probability of default. This article aims to search for evidence of a relationship between loan characteristics and the probability of default on the peer-to-peer (P2P) market. In line with that, two loan characteristics are analysed: 1) loan term length and 2) loan purpose. The analysis is conducted using the survival analysis approach within the vintage framework. Firstly, the 12-month through-the-cycle probability of default is used to compare the riskiness of the analysed loan characteristics. Secondly, the log-rank test is employed in order to compare the complete survival period of cohorts. Findings of the paper suggest that there is clear evidence of a relationship between the analysed loan characteristics and the probability of default. Longer term loans are riskier than shorter term ones, and the least risky loans are those used for credit card payoff.
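
    A hedged sketch of the log-rank comparison between two cohorts (data invented): the statistic accumulates observed-minus-expected events in group 1 over the pooled event times and is referred to a chi-square distribution with one degree of freedom.

      import numpy as np
      from scipy.stats import chi2

      def logrank(t1, e1, t2, e2):
          times = np.unique(np.concatenate([t1[e1 == 1], t2[e2 == 1]]))
          o_minus_e = var = 0.0
          for t in times:
              n1, n2 = (t1 >= t).sum(), (t2 >= t).sum()      # at risk
              d1 = ((t1 == t) & (e1 == 1)).sum()             # events in group 1
              d2 = ((t2 == t) & (e2 == 1)).sum()
              n, d = n1 + n2, d1 + d2
              if n > 1:
                  o_minus_e += d1 - d * n1 / n
                  var += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
          stat = o_minus_e ** 2 / var
          return stat, chi2.sf(stat, df=1)

      t1 = np.array([3., 6., 9., 12., 12.]); e1 = np.array([1, 1, 0, 1, 0])
      t2 = np.array([2., 4., 5., 8., 11.]);  e2 = np.array([1, 1, 1, 1, 1])
      print(logrank(t1, e1, t2, e2))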

  15. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has also included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables.

  16. Notes on the Lumped Backward Master Equation for the Neutron Extinction/Survival Probability

    Energy Technology Data Exchange (ETDEWEB)

    Prinja, Anil K [Los Alamos National Laboratory

    2012-07-02

    chains (a fission chain is defined as the initial source neutron and all its subsequent progeny) in which some chains are short lived while others propagate for unusually long times. Under these conditions, fission chains do not overlap strongly and this precludes the cancellation of neutron number fluctuations necessary for the mean to become established as the dominant measure of the neutron population. The fate of individual chains then plays a defining role in the evolution of the neutron population in strongly stochastic systems, and of particular interest and importance in supercritical systems is the extinction probability, defined as the probability that the neutron chain (initiating neutron and its progeny) will be extinguished at a particular time, or its complement, the time-dependent survival probability. The time-asymptotic limit of the latter, the probability of divergence, gives the probability that the neutron population will grow without bound, and is more commonly known as the probability of initiation or just POI. The ability to numerically compute these probabilities, with high accuracy and without overly restricting the underlying physics (e.g., fission neutron multiplicity, reactivity variation) is clearly essential in developing an understanding of the behavior of strongly stochastic systems.
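
    The time-asymptotic quantity is easy to reproduce for a lumped toy model: the extinction probability q is the smallest fixed point of the progeny probability generating function, q = G(q), and POI = 1 - q. The discrete multiplicity distribution below is invented for illustration.

      import numpy as np

      p = np.array([0.2, 0.3, 0.3, 0.15, 0.05])  # P(k offspring), k = 0..4

      def G(s):
          return sum(pk * s**k for k, pk in enumerate(p))

      q = 0.0
      for _ in range(200):                        # iterate q <- G(q) from 0
          q = G(q)
      print("mean multiplicity:", (np.arange(5) * p).sum())
      print("extinction probability:", q, " POI:", 1.0 - q)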

  17. Effects of amphibian chytrid fungus on individual survival probability in wild boreal toads

    Science.gov (United States)

    Pilliod, D.S.; Muths, E.; Scherer, R. D.; Bartelt, P.E.; Corn, P.S.; Hossack, B.R.; Lambert, B.A.; Mccaffery, R.; Gaughan, C.

    2010-01-01

    Chytridiomycosis is linked to the worldwide decline of amphibians, yet little is known about the demographic effects of the disease. We collected capture-recapture data on three populations of boreal toads (Bufo boreas [Bufo = Anaxyrus]) in the Rocky Mountains (U.S.A.). Two of the populations were infected with chytridiomycosis and one was not. We examined the effect of the presence of amphibian chytrid fungus (Batrachochytrium dendrobatidis [Bd]; the agent of chytridiomycosis) on survival probability and population growth rate. Toads that were infected with Bd had lower average annual survival probability than uninfected individuals at sites where Bd was detected, which suggests chytridiomycosis may reduce survival by 31-42% in wild boreal toads. Toads that were negative for Bd at infected sites had survival probabilities comparable to toads at the uninfected site. Evidence that environmental covariates (particularly cold temperatures during the breeding season) influenced toad survival was weak. The number of individuals in diseased populations declined by 5-7%/year over the 6 years of the study, whereas the uninfected population had comparatively stable population growth. Our data suggest that the presence of Bd in these toad populations is not causing rapid population declines. Rather, chytridiomycosis appears to be functioning as a low-level, chronic disease whereby some infected individuals survive but the overall population effects are still negative. Our results show that some amphibian populations may be coexisting with Bd and highlight the importance of quantitative assessments of survival in diseased animal populations. Journal compilation © 2010 Society for Conservation Biology. No claim to original US government works.

  18. Exact results for survival probability in the multistate Landau-Zener model

    International Nuclear Information System (INIS)

    Volkov, M V; Ostrovsky, V N

    2004-01-01

    An exact formula is derived for survival probability in the multistate Landau-Zener model in the special case where the initially populated state corresponds to the extremal (maximum or minimum) slope of a linear diabatic potential curve. The formula was originally guessed by S Brundobler and V Elzer (1993 J. Phys. A: Math. Gen. 26 1211) based on numerical calculations. It is a simple generalization of the expression for the probability of diabatic passage in the famous two-state Landau-Zener model. Our result is obtained via analysis and summation of the entire perturbation theory series
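
    The two-state special case is easy to check numerically (hbar = 1, fixed-step RK4, invented sweep rate v and coupling d): integrating i dc/dt = H(t) c with H = [[v t / 2, d], [d, -v t / 2]], the diabatic survival probability |c_1|^2 should approach the Landau-Zener value exp(-2 pi d^2 / v).

      import numpy as np

      v, d = 1.0, 0.25
      T, dt = 60.0, 0.002

      def rhs(t, c):
          h = np.array([[0.5 * v * t, d], [d, -0.5 * v * t]], dtype=complex)
          return -1j * (h @ c)

      c = np.array([1.0 + 0j, 0.0 + 0j])    # start in diabatic state 1
      t = -T
      while t < T:                          # fixed-step fourth-order Runge-Kutta
          k1 = rhs(t, c)
          k2 = rhs(t + dt / 2, c + dt / 2 * k1)
          k3 = rhs(t + dt / 2, c + dt / 2 * k2)
          k4 = rhs(t + dt, c + dt * k3)
          c = c + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
          t += dt
      print("numerical:", abs(c[0]) ** 2, " exact:", np.exp(-2 * np.pi * d**2 / v))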

  19. Survival probability in small angle scattering of low energy alkali ions from alkali covered metal surfaces

    International Nuclear Information System (INIS)

    Neskovic, N.; Ciric, D.; Perovic, B.

    1982-01-01

    The survival probability in small angle scattering of low energy alkali ions from alkali covered metal surfaces is considered. The model is based on the momentum approximation. The projectiles are K+ ions and the target is the (001)Ni+K surface. The incident energy is 100 eV and the incident angle 5°. The interaction potential of the projectile and the target consists of the Born-Mayer, the dipole and the image charge potentials. The transition probability function corresponds to the resonant electron transition to the 4s projectile energy level. (orig.)

  20. Passage and survival probabilities of juvenile Chinook salmon at Cougar Dam, Oregon, 2012

    Science.gov (United States)

    Beeman, John W.; Evans, Scott D.; Haner, Philip V.; Hansel, Hal C.; Hansen, Amy C.; Smith, Collin D.; Sprando, Jamie M.

    2014-01-01

    This report describes studies of juvenile-salmon dam passage and apparent survival at Cougar Dam, Oregon, during two operating conditions in 2012. Cougar Dam is a 158-meter tall rock-fill dam used primarily for flood control, and passes water through a temperature control tower to either a powerhouse penstock or to a regulating outlet (RO). The temperature control tower has moveable weir gates to enable water of different elevations and temperatures to be drawn through the dam to control water temperatures downstream. A series of studies of downstream dam passage of juvenile salmonids were begun after the National Oceanic and Atmospheric Administration determined that Cougar Dam was impacting the viability of anadromous fish stocks. The primary objectives of the studies described in this report were to estimate the route-specific fish passage probabilities at the dam and to estimate the survival probabilities of fish passing through the RO. The first set of dam operating conditions, studied in November, consisted of (1) a mean reservoir elevation of 1,589 feet, (2) water entering the temperature control tower through the weir gates, (3) most water routed through the turbines during the day and through the RO during the night, and (4) mean RO gate openings of 1.2 feet during the day and 3.2 feet during the night. The second set of dam operating conditions, studied in December, consisted of (1) a mean reservoir elevation of 1,507 ft, (2) water entering the temperature control tower through the RO bypass, (3) all water passing through the RO, and (4) mean RO gate openings of 7.3 feet during the day and 7.5 feet during the night. The studies were based on juvenile Chinook salmon (Oncorhynchus tshawytscha) surgically implanted with radio transmitters and passive integrated transponder (PIT) tags. Inferences about general dam passage percentage and timing of volitional migrants were based on surface-acclimated fish released in the reservoir. Dam passage and apparent

  1. Mean-field behavior for the survival probability and the point-to-surface connectivity

    CERN Document Server

    Sakai, A

    2003-01-01

    We consider the critical survival probability for oriented percolation and the contact process, and the point-to-surface connectivity for critical percolation. By similarity, let \rho denote the critical exponents for both quantities. We prove in a unified fashion that, if \rho exists and if both the two-point function and its certain restricted version exhibit the same mean-field behavior, then \rho=2 for percolation with d>7 and \rho=1 for the time-oriented models with d>4.

  2. Interactive effects of senescence and natural disturbance on the annual survival probabilities of snail kites

    Science.gov (United States)

    Reichert, Brian E.; Martin, J.; Kendall, William L.; Cattau, Christopher E.; Kitchens, Wiley M.

    2010-01-01

    Individuals in wild populations face risks associated with both intrinsic (i.e. aging) and external (i.e. environmental) sources of mortality. Condition-dependent mortality occurs when there is an interaction between such factors; however, few studies have clearly demonstrated condition-dependent mortality and some have even argued that condition-dependent mortality does not occur in wild avian populations. Using large sample sizes (2084 individuals, 3746 re-sights) of individual-based longitudinal data collected over a 33 year period (1976-2008) on multiple cohorts, we used a capture-mark-recapture framework to model age-dependent survival in the snail kite Rostrhamus sociabilis plumbeus population in Florida. Adding to the growing amount of evidence for actuarial senescence in wild populations, we found evidence of senescent declines in survival probabilities in adult kites. We also tested the hypothesis that older kites experienced condition-dependent mortality during a range-wide drought event (2000-2002). The results provide convincing evidence that the annual survival probability of senescent kites was disproportionately affected by the drought relative to the survival probability of prime-aged adults. To our knowledge, this is the first evidence of condition-dependent mortality to be demonstrated in a wild avian population, a finding which challenges recent conclusions drawn in the literature. Our study suggests that senescence and condition-dependent mortality can affect the demography of wild avian populations. Accounting for these sources of variation may be particularly important to appropriately compute estimates of population growth rate, and probabilities of quasi-extinctions.

  3. Technical report. The application of probability-generating functions to linear-quadratic radiation survival curves.

    Science.gov (United States)

    Kendal, W S

    2000-04-01

    To illustrate how probability-generating functions (PGFs) can be employed to derive a simple probabilistic model for clonogenic survival after exposure to ionizing irradiation. Both repairable and irreparable radiation damage to DNA were assumed to occur by independent (Poisson) processes, at intensities proportional to the irradiation dose. Also, repairable damage was assumed to be either repaired or further (lethally) injured according to a third (Bernoulli) process, with the probability of lethal conversion being directly proportional to dose. Using the algebra of PGFs, these three processes were combined to yield a composite PGF that described the distribution of lethal DNA lesions in irradiated cells. The composite PGF characterized a Poisson distribution with mean αD + βD², where D was dose and α and β were radiobiological constants. This distribution yielded the conventional linear-quadratic survival equation. To test the composite model, the derived distribution was used to predict the frequencies of multiple chromosomal aberrations in irradiated human lymphocytes. The predictions agreed well with observation. This probabilistic model was consistent with single-hit mechanisms, but it was not consistent with binary misrepair mechanisms. A stochastic model for radiation survival has been constructed from elementary PGFs that exactly yields the linear-quadratic relationship. This approach can be used to investigate other simple probabilistic survival models.
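
    The derived model compresses to one line: lethal lesions per cell are Poisson with mean αD + βD², so the surviving fraction is the zero-lesion probability exp(-(αD + βD²)). A small Monte Carlo check with illustrative constants:

      import numpy as np

      rng = np.random.default_rng(5)
      alpha, beta = 0.2, 0.05              # illustrative Gy^-1 and Gy^-2 values
      for dose in (1.0, 2.0, 4.0):
          mean = alpha * dose + beta * dose**2
          simulated = (rng.poisson(mean, 100_000) == 0).mean()
          print(dose, simulated, np.exp(-mean))   # simulated vs analytic S(D)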

  4. Do ducks and songbirds initiate more nests when the probability of survival is greater?

    Science.gov (United States)

    Grant, Todd A.; Shaffer, Terry L.

    2015-01-01

    Nesting chronology in grassland birds can vary by species, locality, and year. The date a nest is initiated can influence the subsequent probability of its survival in some grassland bird species. Because predation is the most significant cause of nest loss in grassland birds, we examined the relation between timing of nesting and nest survival. Periods of high nest survival that correspond with the peak of nesting activity might reflect long-term adaptations to specific predation pressures commonly recurring during certain periods of the nesting cycle. We evaluated this theory by comparing timing of nesting with date-specific nest survival rates for several duck and passerine species breeding in north-central North Dakota during 1998–2003. Nest survival decreased seasonally with date for five of the seven species we studied. We found little evidence to support consistent relations between timing of nesting, the number of nest initiations, and nest survival for any species we studied, suggesting that factors other than nest predation may better explain nesting chronology for these species. The apparent mismatch between date-specific patterns of nest survival and nest initiation underscores uncertainty about the process of avian nest site selection driven mainly by predation. Although timing of nesting differed among species, the general nesting period was fairly predictable across all years of study, suggesting the potential for research activities or management actions to be timed to take advantage of known periods when nests are active (or inactive). However, our results do not support the notion that biologists can take advantage of periods when many nests are active and survival is also high.

  5. Effect of drift on the temporal asymptotic form of the particle survival probability in media with absorbing traps

    International Nuclear Information System (INIS)

    Arkhincheev, V. E.

    2017-01-01

    A new asymptotic form of the particle survival probability in media with absorbing traps has been established for the case of drift. It is shown that the drift mechanism determines a new temporal behavior of the probability of particle survival in media with absorbing traps over long time intervals.

  6. Parameter resolution in two models for cell survival after radiation

    International Nuclear Information System (INIS)

    Di Cera, E.; Andreasi Bassi, F.; Arcovito, G.

    1989-01-01

    The resolvability of model parameters for the linear-quadratic and the repair-misrepair models for cell survival after radiation has been studied by Monte Carlo simulations as a function of the number of experimental data points collected in a given dose range and the experimental error. Statistical analysis of the results reveals the range of experimental conditions under which the model parameters can be resolved with sufficient accuracy, and points out some differences in the operational aspects of the two models. (orig.)
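
    A minimal version of the simulation described here, assuming a log-scale least-squares fit and invented "true" parameters, error level and dose design: repeated synthetic experiments show how tightly α and β are resolved and how strongly they correlate.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

def log_surv(D, alpha, beta):            # linear-quadratic model, log scale
    return -(alpha * D + beta * D**2)

# Invented "true" parameters and experimental design:
alpha_t, beta_t = 0.20, 0.04
D = np.linspace(1.0, 8.0, 8)             # data points in the dose range
rel_err = 0.10                           # relative experimental error

fits = []
for _ in range(1000):                    # simulated repeat experiments
    S_obs = np.exp(log_surv(D, alpha_t, beta_t))
    S_obs = S_obs * (1.0 + rel_err * rng.standard_normal(D.size))
    S_obs = np.clip(S_obs, 1e-6, None)
    popt, _ = curve_fit(log_surv, D, np.log(S_obs), p0=(0.1, 0.01))
    fits.append(popt)

fits = np.array(fits)
print(f"alpha: mean {fits[:, 0].mean():.3f}, sd {fits[:, 0].std():.3f}")
print(f"beta : mean {fits[:, 1].mean():.4f}, sd {fits[:, 1].std():.4f}")
print(f"corr(alpha, beta) = {np.corrcoef(fits.T)[0, 1]:.2f}")
```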

  7. Corticosterone levels predict survival probabilities of Galápagos marine iguanas during El Niño events.

    Science.gov (United States)

    Romero, L M; Wikelski, M

    2001-06-19

    Plasma levels of corticosterone are often used as a measure of "stress" in wild animal populations. However, we lack conclusive evidence that different stress levels reflect different survival probabilities between populations. Galápagos marine iguanas offer an ideal test case because island populations are affected differently by recurring El Niño famine events, and population-level survival can be quantified by counting iguanas locally. We surveyed corticosterone levels in six populations during the 1998 El Niño famine and the 1999 La Niña feast period. Iguanas had higher baseline and handling stress-induced corticosterone concentrations during famine than feast conditions. Corticosterone levels differed between islands and predicted survival through an El Niño period. However, among individuals, baseline corticosterone was only elevated when body condition dropped below a critical threshold. Thus, the population-level corticosterone response was variable but nevertheless predicted overall population health. Our results lend support to the use of corticosterone as a rapid quantitative predictor of survival in wild animal populations.

  8. Lower survival probabilities for adult Florida manatees in years with intense coastal storms

    Science.gov (United States)

    Langtimm, C.A.; Beck, C.A.

    2003-01-01

    The endangered Florida manatee (Trichechus manatus latirostris) inhabits the subtropical waters of the southeastern United States, where hurricanes are a regular occurrence. Using mark-resighting statistical models, we analyzed 19 years of photo-identification data and detected significant annual variation in adult survival for a subpopulation in northwest Florida where human impact is low. That variation coincided with years when intense hurricanes (Category 3 or greater on the Saffir-Simpson Hurricane Scale) and a major winter storm occurred in the northern Gulf of Mexico. Mean survival probability during years with no or low intensity storms was 0.972 (approximate 95% confidence interval = 0.961-0.980) but dropped to 0.936 (0.864-0.971) in 1985 with Hurricanes Elena, Kate, and Juan; to 0.909 (0.837-0.951) in 1993 with the March "Storm of the Century"; and to 0.817 (0.735-0.878) in 1995 with Hurricanes Opal, Erin, and Allison. These drops in survival probability were not catastrophic in magnitude and were detected because of the use of state-of-the-art statistical techniques and the quality of the data. Because individuals of this small population range extensively along the north Gulf coast of Florida, it was possible to resolve storm effects on a regional scale rather than the site-specific local scale common to studies of more sedentary species. This is the first empirical evidence in support of storm effects on manatee survival and suggests a cause-effect relationship. The decreases in survival could be due to direct mortality, indirect mortality, and/or emigration from the region as a consequence of storms. Future impacts to the population by a single catastrophic hurricane, or series of smaller hurricanes, could increase the probability of extinction. With the advent in 1995 of a new 25- to 50-yr cycle of greater hurricane activity, and longer term change possible with global climate change, it becomes all the more important to reduce mortality and injury

  9. Calculation of parameter failure probability of thermodynamic system by response surface and importance sampling method

    International Nuclear Information System (INIS)

    Shang Yanlong; Cai Qi; Chen Lisheng; Zhang Yangwei

    2012-01-01

    In this paper, a combined response surface and importance sampling method was applied to calculate the parameter failure probability of a thermodynamic system. A mathematical model was presented for parameter failure of the physical process in the thermodynamic system, from which the combined response surface and importance sampling algorithm was established; the performance degradation model of the components and the simulation process of parameter failure in the physical process of the thermodynamic system were also presented. The parameter failure probability of the purification water system in a nuclear reactor was obtained by the combined method. The results show that the combined method is effective for calculating the parameter failure probability of a thermodynamic system with high dimensionality and non-linear characteristics, achieving satisfactory precision with less computing time than direct sampling while avoiding the drawbacks of the response surface method alone. (authors)
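
    The importance-sampling half of the combined method can be illustrated on a toy limit state with a known answer; the limit state, shift point and sample sizes below are illustrative, not the purification-water-system model.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
beta_idx = 4.0                    # reliability index of the toy limit state

def g(u):                         # failure whenever g(u) < 0
    return beta_idx - (u[:, 0] + u[:, 1]) / np.sqrt(2.0)

n = 100_000
u = rng.standard_normal((n, 2))
print("crude MC :", (g(u) < 0).mean())        # almost never hits failures

# Importance sampling: recentre the proposal at the design point u*.
u_star = np.full(2, beta_idx / np.sqrt(2.0))
v = rng.standard_normal((n, 2)) + u_star
weight = np.exp(-v @ u_star + 0.5 * u_star @ u_star)   # N(0,I)/N(u*,I)
print("IS       :", ((g(v) < 0) * weight).mean())
print("exact    :", norm.cdf(-beta_idx))
```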

  10. Separating the contributions of variability and parameter uncertainty in probability distributions

    International Nuclear Information System (INIS)

    Sankararaman, S.; Mahadevan, S.

    2013-01-01

    This paper proposes a computational methodology to quantify the individual contributions of variability and distribution parameter uncertainty to the overall uncertainty in a random variable. Even if the distribution type is assumed to be known, sparse or imprecise data lead to uncertainty about the distribution parameters. If uncertain distribution parameters are represented using probability distributions, then the random variable can be represented using a family of probability distributions. The family-of-distributions concept has been used to obtain qualitative, graphical inference of the contributions of natural variability and distribution parameter uncertainty. The proposed methodology provides quantitative estimates of the contributions of the two types of uncertainty. Using variance-based global sensitivity analysis, the contributions of variability and distribution parameter uncertainty to the overall uncertainty are computed. The proposed method is developed at two different levels: first, at the level of a variable whose distribution parameters are uncertain, and second, at the level of a model output whose inputs have uncertain distribution parameters.
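
    A stripped-down version of the decomposition, using the law of total variance on a normal variable with assumed priors on its distribution parameters in place of the paper's variance-based sensitivity machinery:

```python
import numpy as np

rng = np.random.default_rng(2)
n_outer = 2000

# Assumed priors on the distribution parameters (epistemic uncertainty):
mu = rng.normal(10.0, 0.5, n_outer)
sigma = rng.lognormal(np.log(2.0), 0.1, n_outer)

# Law of total variance: Var(X) = E[Var(X|theta)] + Var(E[X|theta]).
var_variability = np.mean(sigma**2)   # aleatory (natural variability) part
var_parameters = np.var(mu)           # epistemic (parameter) part
total = var_variability + var_parameters
print(f"variability share: {var_variability / total:.1%}")
print(f"parameter share  : {var_parameters / total:.1%}")

# Brute-force check against the full family of distributions:
x = rng.normal(mu[:, None], sigma[:, None], (n_outer, 2000))
print(f"total variance   : {total:.3f} (sampled {x.var():.3f})")
```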

  11. Estimating reliability of degraded system based on the probability density evolution with multi-parameter

    Directory of Open Access Journals (Sweden)

    Jiang Ge

    2017-01-01

    Full Text Available System degradation is usually caused by degradation of multiple parameters. Assessments of system reliability by the universal generating function are of low accuracy compared with Monte Carlo simulation, and the probability density function of the system output performance cannot be obtained. A reliability assessment method based on multi-parameter probability density evolution was therefore presented for complex degraded systems. First, the system output function was formulated from the transitive relation between component parameters and system output performance. Then, the probability density evolution equation was established from the probability conservation principle and the system output function. Furthermore, probability distribution characteristics of the system output performance were obtained by solving the differential equation, and the reliability of the degraded system was estimated. This method does not require discretization of the performance parameters and can establish a continuous probability density function of the system output performance with high computational efficiency and low cost. A numerical example shows that the method is applicable to evaluating the reliability of multi-parameter degraded systems.
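
    The probability-density-evolution solver itself is not reproduced here; the sketch below is a crude sampling stand-in that propagates an invented two-component degradation model and recovers a continuous output density and a reliability estimate at each time.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(3)
n = 20_000

# Invented two-component degradation model with random rates (per year).
k1 = rng.normal(1.00, 0.05, n)
k2 = rng.normal(0.80, 0.04, n)
r1 = rng.normal(0.020, 0.004, n)
r2 = rng.normal(0.015, 0.003, n)

threshold = 1.2                  # system fails once G drops below this
for t in range(0, 21, 5):        # years
    G = k1 * np.exp(-r1 * t) + k2 * np.exp(-r2 * t)   # system output function
    pdf = gaussian_kde(G)        # continuous estimate of the output density
    print(f"t = {t:2d} y  mean G = {G.mean():.3f}  "
          f"pdf at threshold = {float(pdf(threshold)[0]):.2f}  "
          f"R(t) = {(G > threshold).mean():.4f}")
```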

  12. The two-parameter scaling and a new temporal asymptotic of the survival probability of a diffusing particle in a medium with traps.

    Science.gov (United States)

    Arkhincheev, V E

    2017-03-01

    The new asymptotic behavior of the survival probability of particles in a medium with absorbing traps in an electric field has been established in two ways: by using the scaling approach and by direct solution of the diffusion equation in the field. It has been shown that, at long times, this drift mechanism leads to a new temporal behavior of the survival probability of particles in a medium with absorbing traps.
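
    A quick numerical check of the effect described in this and the earlier Arkhincheev record: a 1D Brownian particle with drift between its nearest Poisson-distributed traps, with all rates invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

c, D, v = 0.5, 1.0, 0.5          # trap density, diffusivity, drift velocity
dt, t_max, n = 0.01, 40.0, 5000
steps = int(t_max / dt)

# For Poisson-distributed traps on a line, the distances from the starting
# point to the nearest trap on each side are exponential with mean 1/c.
left = -rng.exponential(1.0 / c, n)
right = rng.exponential(1.0 / c, n)

x = np.zeros(n)
alive = np.ones(n, dtype=bool)
for s in range(1, steps + 1):
    x[alive] += v * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal(alive.sum())
    alive &= (x > left) & (x < right)       # absorbed on reaching a trap
    if s % 1000 == 0:
        print(f"t = {s * dt:5.1f}   P(survive) = {alive.mean():.4f}")
```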

  13. Probability of extreme interference levels computed from reliability approaches: application to transmission lines with uncertain parameters

    International Nuclear Information System (INIS)

    Larbi, M.; Besnier, P.; Pecqueux, B.

    2014-01-01

    This paper deals with the risk analysis of an EMC failure using a statistical approach based on reliability methods from probabilistic engineering mechanics. The probability of failure (i.e. the probability of exceeding a threshold) of a current induced by crosstalk is computed by taking into account uncertainties in the input parameters that influence interference levels on transmission lines. The study allowed us to evaluate the probability of failure of the induced current using reliability methods at a relatively low computational cost compared to Monte Carlo simulation. (authors)
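
    A first-order reliability method (FORM) is one of the low-cost techniques alluded to here; the sketch below applies it to an invented induced-current model and compares the result against crude Monte Carlo.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Invented induced-current model with lognormal inputs (all values are
# placeholders, not the paper's transmission-line model).
a, b, I_max = 2.0, 3.0, 25.0

def limit_state(u):
    """Map standard normals to physical variables; failure when g <= 0."""
    x1 = np.exp(1.0 + 0.3 * u[0])
    x2 = np.exp(1.2 + 0.2 * u[1])
    return I_max - (a * x1 + b * x2)

# FORM: the design point is the closest point to the origin on g(u) = 0.
res = minimize(lambda u: u @ u, x0=np.array([1.0, 1.0]),
               constraints={"type": "eq", "fun": limit_state})
beta = float(np.sqrt(res.fun))
print(f"reliability index beta = {beta:.2f}")
print(f"FORM failure probability ~ {norm.cdf(-beta):.2e}")

# Crude Monte Carlo reference (expensive for rarer events):
u = np.random.default_rng(5).standard_normal((1_000_000, 2))
I = a * np.exp(1.0 + 0.3 * u[:, 0]) + b * np.exp(1.2 + 0.2 * u[:, 1])
print(f"MC failure probability   ~ {(I > I_max).mean():.2e}")
```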

  14. Cluster geometry and survival probability in systems driven by reaction-diffusion dynamics

    International Nuclear Information System (INIS)

    Windus, Alastair; Jensen, Henrik J

    2008-01-01

    We consider a reaction-diffusion model incorporating the reactions A→φ, A→2A and 2A→3A. Depending on the relative rates for sexual and asexual reproduction of the quantity A, the model exhibits either a continuous or first-order absorbing phase transition to an extinct state. A tricritical point separates the two phase lines. While we comment on this critical behaviour, the main focus of the paper is on the geometry of the population clusters that form. We observe the different cluster structures that arise at criticality for the three different types of critical behaviour and show that there exists a linear relationship for the survival probability against initial cluster size at the tricritical point only.
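
    A well-mixed (non-spatial) analogue of this reaction scheme can be simulated with the Gillespie algorithm to show how survival probability depends on initial population size; the rates are illustrative, and the spatial cluster geometry studied in the paper is not captured.

```python
import numpy as np

rng = np.random.default_rng(6)

def survives(n0, d=1.0, b=0.95, c=0.05, n_cap=200, t_max=50.0):
    """One Gillespie run of A->0 (rate d), A->2A (rate b), 2A->3A (rate c).

    Returns True if the population is still alive at the end of the run."""
    n, t = n0, 0.0
    while 0 < n < n_cap and t < t_max:
        r_death = d * n
        r_birth = b * n + c * n * (n - 1) / 2.0   # both reactions add one A
        total = r_death + r_birth
        t += rng.exponential(1.0 / total)
        n += -1 if rng.random() * total < r_death else 1
    return n > 0

for n0 in (1, 2, 4, 8, 16):
    p = np.mean([survives(n0) for _ in range(400)])
    print(f"initial cluster size {n0:2d}: survival probability ~ {p:.3f}")
```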

  15. Cluster geometry and survival probability in systems driven by reaction-diffusion dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Windus, Alastair; Jensen, Henrik J [The Institute for Mathematical Sciences, 53 Prince's Gate, South Kensington, London SW7 2PG (United Kingdom)], E-mail: h.jensen@imperial.ac.uk

    2008-11-15

    We consider a reaction-diffusion model incorporating the reactions A→φ, A→2A and 2A→3A. Depending on the relative rates for sexual and asexual reproduction of the quantity A, the model exhibits either a continuous or first-order absorbing phase transition to an extinct state. A tricritical point separates the two phase lines. While we comment on this critical behaviour, the main focus of the paper is on the geometry of the population clusters that form. We observe the different cluster structures that arise at criticality for the three different types of critical behaviour and show that there exists a linear relationship for the survival probability against initial cluster size at the tricritical point only.

  16. Survival probabilities of first and second clutches of blackbird (Turdus merula) in an urban environment

    Directory of Open Access Journals (Sweden)

    Kurucz Kornelia

    2010-01-01

    Full Text Available The breeding success of blackbirds was investigated in April and June of 2008 and 2009 in the Botanical Garden of the University of Pécs, with a total of 50 artificial nests at each of the four sessions (with 1 quail egg and 1 plasticine egg placed in every nest). In all four study periods of the two years, 2 nests (4%) were destroyed by predators. Six nests (12%) were not discovered in either case. The survival probability of artificial nests was greater in April than in June in both years, but the difference was significant only in 2008. Nests placed in a curtain of ivy (Hedera helix) on a wall were located higher up than those in bushes, yet their predation rates were quite similar. The predation values of quail vs. plasticine eggs did not differ in 2008. In 2009, however, significantly more quail eggs than plasticine eggs were discovered (mostly removed). Marks left on plasticine eggs originated mostly from small mammals and small-bodied birds, but the disappearance of a large number of quail and plasticine eggs was probably caused by larger birds, primarily jays.

  17. Survival probability of precipitations and rain attenuation in tropical and equatorial regions

    Science.gov (United States)

    Mohebbi Nia, Masoud; Din, Jafri; Panagopoulos, Athanasios D.; Lam, Hong Yin

    2015-08-01

    This contribution presents a stochastic model useful for the generation of long-term tropospheric rain attenuation time series for an Earth-space or terrestrial radio link in tropical and equatorial heavy-rain regions, based on the well-known Cox-Ingersoll-Ross model previously employed in the fields of finance and economics. This model assumes a typical gamma distribution for rain attenuation in heavy-rain climatic regions and utilises the temporal dynamics of precipitation collected in equatorial Johor, Malaysia. Different formulations of survival probability are also discussed. Furthermore, the correlation between these probabilities and the Markov process is determined, and information on the variance and autocorrelation function of rain events with respect to the particular characteristics of precipitation in this area is presented. The proposed technique proved to preserve the peculiarities of precipitation for an equatorial region and to reproduce fairly good statistics of the rain attenuation correlation function, which could help to improve the prediction of the dynamic characteristics of rain fade events.
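
    A minimal full-truncation Euler simulation of a Cox-Ingersoll-Ross attenuation series, with placeholder parameters rather than values fitted to the Johor measurements:

```python
import numpy as np

rng = np.random.default_rng(7)

# Assumed CIR parameters (illustrative, not fitted to the Johor data):
kappa, theta, sigma = 0.5, 3.0, 1.2   # reversion rate, mean level (dB), vol
dt, n_steps = 1.0 / 60.0, 24 * 60     # one-minute steps over one day

# Full-truncation Euler scheme keeps the simulated attenuation well behaved.
a = np.empty(n_steps)
a[0] = theta
for k in range(1, n_steps):
    x = max(a[k - 1], 0.0)
    a[k] = (a[k - 1] + kappa * (theta - x) * dt
            + sigma * np.sqrt(x * dt) * rng.standard_normal())

print(f"mean {a.mean():.2f} dB, std {a.std():.2f} dB")
for level in (3.0, 6.0, 9.0):
    print(f"fraction of time above {level:.0f} dB: {(a > level).mean():.3f}")
```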

  18. Focusing on a Probability Element: Parameter Selection of Message Importance Measure in Big Data

    OpenAIRE

    She, Rui; Liu, Shanyun; Dong, Yunquan; Fan, Pingyi

    2017-01-01

    Message importance measure (MIM) is applicable to characterizing the importance of information in the scenario of big data, similar to entropy in information theory. In fact, MIM with a variable parameter can affect the characterization of a distribution. Furthermore, by choosing an appropriate parameter of MIM, it is possible to emphasize the message importance of a certain probability element in a distribution. Therefore, parametric MIM can play a vital role in anomaly detection of big data.
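
    For readers unfamiliar with the measure, the sketch below evaluates one published form of the parametric MIM (an assumption on our part, not a quotation of this record) and shows how increasing the parameter shifts emphasis toward low-probability elements:

```python
import numpy as np

def mim(p, w):
    """Parametric message importance measure, in the form used in the MIM
    literature as we read it: L(p, w) = log(sum_i p_i * exp(w * (1 - p_i)))."""
    p = np.asarray(p, dtype=float)
    return float(np.log(np.sum(p * np.exp(w * (1.0 - p)))))

p = np.array([0.80, 0.15, 0.04, 0.01])    # a skewed distribution
for w in (0.5, 2.0, 10.0):
    contrib = p * np.exp(w * (1.0 - p))   # per-element contribution
    print(f"w = {w:5.1f}  MIM = {mim(p, w):.3f}  "
          f"dominant element index = {int(np.argmax(contrib))}")
```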

  19. Regional probability distribution of the annual reference evapotranspiration and its effective parameters in Iran

    Science.gov (United States)

    Khanmohammadi, Neda; Rezaie, Hossein; Montaseri, Majid; Behmanesh, Javad

    2017-10-01

    The reference evapotranspiration (ET0) plays an important role in water management plans in arid and semi-arid countries such as Iran, so regional analysis of this parameter is important. The ET0 process is affected by several meteorological parameters, such as wind speed, solar radiation, temperature, and relative humidity; therefore, the effect of the distribution type of the effective meteorological variables on the ET0 distribution was analyzed. For this purpose, the regional probability distributions of annual ET0 and its effective parameters were selected. The data used in this research were recorded at 30 synoptic stations in Iran during 1960-2014. Using the probability plot correlation coefficient (PPCC) test and the L-moment method, five common distributions were compared and the best distribution was selected. The results of the PPCC test and L-moment diagram indicated that the Pearson type III distribution was the best probability distribution for fitting annual ET0 and its four effective parameters. The RMSE results showed that the PPCC test and the L-moment method performed similarly for regional analysis of reference evapotranspiration and its effective parameters. The results also showed that the distribution type of the parameters affecting ET0 can affect the distribution of reference evapotranspiration.
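
    Sample L-moments, the building blocks of the PPCC/L-moment selection described here, are easy to compute directly; the estimators below are the standard unbiased ones, applied to synthetic rather than Iranian data.

```python
import numpy as np

def sample_lmoments(x):
    """First three sample L-moments and the L-skewness t3 (Hosking's
    unbiased estimators based on order statistics)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    return l1, l2, l3, l3 / l2

# Illustrative skewed annual series (mm/year), not the Iranian records:
rng = np.random.default_rng(8)
et0 = rng.gamma(shape=25.0, scale=60.0, size=55)
l1, l2, l3, t3 = sample_lmoments(et0)
print(f"l1 = {l1:.1f}, l2 = {l2:.1f}, t3 = {t3:.3f}")
```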

  20. Volumetric and MGMT parameters in glioblastoma patients: Survival analysis

    International Nuclear Information System (INIS)

    Iliadis, Georgios; Kotoula, Vassiliki; Chatzisotiriou, Athanasios; Televantou, Despina; Eleftheraki, Anastasia G; Lambaki, Sofia; Misailidou, Despina; Selviaridis, Panagiotis; Fountzilas, George

    2012-01-01

    In this study several tumor-related volumes were assessed by means of a computer-based application and a survival analysis was conducted to evaluate the prognostic significance of pre- and postoperative volumetric data in patients harboring glioblastomas. In addition, MGMT (O6-methylguanine methyltransferase) related parameters were compared with those of volumetry in order to observe possible relevance of this molecule in tumor development. We prospectively analyzed 65 patients suffering from glioblastoma (GBM) who underwent radiotherapy with concomitant adjuvant temozolomide. For the purpose of volumetry T1 and T2-weighted magnetic resonance (MR) sequences were used, acquired both pre- and postoperatively (pre-radiochemotherapy). The volumes measured on preoperative MR images were necrosis, enhancing tumor and edema (including the tumor) and on postoperative ones, net-enhancing tumor. Age, sex, performance status (PS) and type of operation were also included in the multivariate analysis. MGMT was assessed for promoter methylation with Multiplex Ligation-dependent Probe Amplification (MLPA), for RNA expression with real time PCR, and for protein expression with immunohistochemistry in a total of 44 cases with available histologic material. In the multivariate analysis a negative impact was shown for pre-radiochemotherapy net-enhancing tumor on the overall survival (OS) (p = 0.023) and for preoperative necrosis on progression-free survival (PFS) (p = 0.030). Furthermore, the multivariate analysis confirmed the importance of PS in PFS and OS of patients. MGMT promoter methylation was observed in 13/23 (43.5%) evaluable tumors; complete methylation was observed in 3/13 methylated tumors only. High rate of MGMT protein positivity (> 20% positive neoplastic nuclei) was inversely associated with pre-operative tumor necrosis (p = 0.021). Our findings implicate that volumetric parameters may have a significant role in the prognosis of GBM patients. Furthermore
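
    An analysis of this shape is commonly run as a multivariate Cox proportional-hazards model; the sketch below does so with the lifelines library on synthetic volumetric covariates (all names, distributions and effect sizes are invented).

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(9)
n = 65

# Synthetic stand-in for the cohort: tumor volumes in cm^3 plus a
# performance-status score; values are invented for illustration.
df = pd.DataFrame({
    "net_enhancing_cm3": rng.gamma(2.0, 8.0, n),
    "necrosis_cm3": rng.gamma(1.5, 5.0, n),
    "performance": rng.choice([70, 80, 90, 100], n).astype(float),
})
risk = 0.03 * df["net_enhancing_cm3"] - 0.02 * (df["performance"] - 80.0)
df["os_months"] = rng.exponential(14.0 * np.exp(-risk))
df["death"] = (rng.random(n) < 0.8).astype(int)   # some censored cases

cph = CoxPHFitter()
cph.fit(df, duration_col="os_months", event_col="death")
cph.print_summary()   # hazard ratios for each volumetric/clinical covariate
```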

  1. Global parameter optimization for maximizing radioisotope detection probabilities at fixed false alarm rates

    Energy Technology Data Exchange (ETDEWEB)

    Portnoy, David, E-mail: david.portnoy@jhuapl.edu [Johns Hopkins University Applied Physics Laboratory, 11100 Johns Hopkins Road, Laurel, MD 20723 (United States); Feuerbach, Robert; Heimberg, Jennifer [Johns Hopkins University Applied Physics Laboratory, 11100 Johns Hopkins Road, Laurel, MD 20723 (United States)

    2011-10-01

    Today there is a tremendous amount of interest in systems that can detect radiological or nuclear threats. Many of these systems operate in extremely high throughput situations where delays caused by false alarms can have a significant negative impact. Thus, calculating the tradeoff between detection rates and false alarm rates is critical for their successful operation. Receiver operating characteristic (ROC) curves have long been used to depict this tradeoff. The methodology was first developed in the field of signal detection. In recent years it has been used increasingly in machine learning and data mining applications. It follows that this methodology could be applied to radiological/nuclear threat detection systems. However, many of these systems do not fit into the classic principles of statistical detection theory because they tend to lack tractable likelihood functions and have many parameters, which, in general, do not have a one-to-one correspondence with the detection classes. This work proposes a strategy to overcome these problems by empirically finding parameter values that maximize the probability of detection for a selected number of probabilities of false alarm. To find these parameter values a statistical global optimization technique that seeks to estimate portions of a ROC curve is proposed. The optimization combines elements of simulated annealing with elements of genetic algorithms. Genetic algorithms were chosen because they can reduce the risk of getting stuck in local minima. However, classic genetic algorithms operate on arrays of Boolean values or bit strings, so simulated annealing is employed to perform mutation in the genetic algorithm. The presented initial results were generated using an isotope identification algorithm developed at Johns Hopkins University Applied Physics Laboratory. The algorithm has 12 parameters: 4 real-valued and 8 Boolean. A simulated dataset was used for the optimization study; the 'threat' set of

  2. Global parameter optimization for maximizing radioisotope detection probabilities at fixed false alarm rates

    International Nuclear Information System (INIS)

    Portnoy, David; Feuerbach, Robert; Heimberg, Jennifer

    2011-01-01

    Today there is a tremendous amount of interest in systems that can detect radiological or nuclear threats. Many of these systems operate in extremely high throughput situations where delays caused by false alarms can have a significant negative impact. Thus, calculating the tradeoff between detection rates and false alarm rates is critical for their successful operation. Receiver operating characteristic (ROC) curves have long been used to depict this tradeoff. The methodology was first developed in the field of signal detection. In recent years it has been used increasingly in machine learning and data mining applications. It follows that this methodology could be applied to radiological/nuclear threat detection systems. However, many of these systems do not fit into the classic principles of statistical detection theory because they tend to lack tractable likelihood functions and have many parameters, which, in general, do not have a one-to-one correspondence with the detection classes. This work proposes a strategy to overcome these problems by empirically finding parameter values that maximize the probability of detection for a selected number of probabilities of false alarm. To find these parameter values a statistical global optimization technique that seeks to estimate portions of a ROC curve is proposed. The optimization combines elements of simulated annealing with elements of genetic algorithms. Genetic algorithms were chosen because they can reduce the risk of getting stuck in local minima. However, classic genetic algorithms operate on arrays of Boolean values or bit strings, so simulated annealing is employed to perform mutation in the genetic algorithm. The presented initial results were generated using an isotope identification algorithm developed at Johns Hopkins University Applied Physics Laboratory. The algorithm has 12 parameters: 4 real-valued and 8 Boolean. A simulated dataset was used for the optimization study; the 'threat' set of spectra

  3. Global parameter optimization for maximizing radioisotope detection probabilities at fixed false alarm rates

    Science.gov (United States)

    Portnoy, David; Feuerbach, Robert; Heimberg, Jennifer

    2011-10-01

    Today there is a tremendous amount of interest in systems that can detect radiological or nuclear threats. Many of these systems operate in extremely high throughput situations where delays caused by false alarms can have a significant negative impact. Thus, calculating the tradeoff between detection rates and false alarm rates is critical for their successful operation. Receiver operating characteristic (ROC) curves have long been used to depict this tradeoff. The methodology was first developed in the field of signal detection. In recent years it has been used increasingly in machine learning and data mining applications. It follows that this methodology could be applied to radiological/nuclear threat detection systems. However, many of these systems do not fit into the classic principles of statistical detection theory because they tend to lack tractable likelihood functions and have many parameters, which, in general, do not have a one-to-one correspondence with the detection classes. This work proposes a strategy to overcome these problems by empirically finding parameter values that maximize the probability of detection for a selected number of probabilities of false alarm. To find these parameter values a statistical global optimization technique that seeks to estimate portions of a ROC curve is proposed. The optimization combines elements of simulated annealing with elements of genetic algorithms. Genetic algorithms were chosen because they can reduce the risk of getting stuck in local minima. However, classic genetic algorithms operate on arrays of Boolean values or bit strings, so simulated annealing is employed to perform mutation in the genetic algorithm. The presented initial results were generated using an isotope identification algorithm developed at Johns Hopkins University Applied Physics Laboratory. The algorithm has 12 parameters: 4 real-valued and 8 Boolean. A simulated dataset was used for the optimization study; the "threat" set of spectra
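
    A toy version of the strategy proposed in the three records above, with continuous parameters only (the real detector mixes real-valued and Boolean parameters): a genetic algorithm whose mutation step is annealed, scoring each candidate by detection probability at a fixed false alarm rate. All data and dimensions are synthetic.

```python
import numpy as np

rng = np.random.default_rng(10)

# Synthetic detector features for background and threat spectra.
bg = rng.normal(0.0, 1.0, (5000, 5))
threat = rng.normal([0.4, 0.1, 0.8, 0.0, 0.3], 1.0, (1000, 5))
FAR = 0.01

def pd_at_far(w):
    """Detection probability with the alarm threshold chosen to fix the FAR."""
    cut = np.quantile(bg @ w, 1.0 - FAR)
    return np.mean(threat @ w > cut)

# Genetic algorithm whose mutation magnitude is annealed over generations.
pop = rng.normal(0.0, 1.0, (40, 5))
temperature = 1.0
for gen in range(60):
    fitness = np.array([pd_at_far(w) for w in pop])
    elite = pop[np.argsort(fitness)[-10:]]            # keep the best quarter
    children = elite[rng.integers(0, 10, 30)]         # resample parents
    children = children + temperature * rng.normal(0.0, 0.3, children.shape)
    pop = np.vstack([elite, children])
    temperature *= 0.95                               # annealing schedule

best = max(pd_at_far(w) for w in pop)
print(f"best P(detect) at FAR = {FAR:.0%}: {best:.3f}")
```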

  4. Parameter Analysis of the VPIN (Volume synchronized Probability of Informed Trading) Metric

    Energy Technology Data Exchange (ETDEWEB)

    Song, Jung Heon; Wu, Kesheng; Simon, Horst D.

    2014-03-01

    VPIN (Volume-synchronized Probability of Informed Trading) is a leading indicator of liquidity-induced volatility. It is best known for having produced a signal more than an hour before the Flash Crash of 2010. On that day, the market saw the biggest one-day point decline in the Dow Jones Industrial Average, which culminated in $1 trillion of market value disappearing, only to recover those losses twenty minutes later (Lauricella 2010). The computation of VPIN requires the user to set up a handful of free parameters. The values of these parameters significantly affect the effectiveness of VPIN as measured by the false positive rate (FPR). An earlier publication reported that a brute-force search of simple parameter combinations yielded a number of parameter combinations with an FPR of 7%. This work is a systematic attempt to find an optimal parameter set using an optimization package, NOMAD (Nonlinear Optimization by Mesh Adaptive Direct Search) by Audet, Le Digabel, and Tribes (2009) and Le Digabel (2011). We have implemented a number of techniques to reduce the computation time with NOMAD. Tests show that we can reduce the FPR to only 2%. To better understand the parameter choices, we have conducted a series of sensitivity analyses via uncertainty quantification on the parameter spaces using UQTK (Uncertainty Quantification Toolkit). Results show the dominance of two parameters in the computation of FPR. Using the outputs from NOMAD optimization and sensitivity analysis, we recommend a range of values for each of the free parameters that perform well on a large set of futures trading records.
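
    A compact VPIN computation under bulk volume classification, exposing two of the free parameters discussed here (bucket volume and averaging window); the data and parameter values are synthetic, and the bucketing is simplified.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(11)

# Synthetic 1-minute bars: price changes and traded volumes.
dp = rng.standard_t(df=4, size=20_000) * 0.01
vol = rng.integers(50, 500, size=20_000).astype(float)

V = 10_000.0        # volume per bucket, one of VPIN's free parameters
window = 50         # buckets in the moving average, another one

# Bulk volume classification: split each bar's volume into buy/sell parts
# according to the standardized price change.
z = norm.cdf(dp / dp.std())
buy, sell = vol * z, vol * (1.0 - z)

# Fill fixed-volume buckets sequentially (bars are not split across
# buckets here, a simplification of the published procedure).
imbalances, b, s, acc = [], 0.0, 0.0, 0.0
for vb, vs in zip(buy, sell):
    b, s, acc = b + vb, s + vs, acc + vb + vs
    if acc >= V:
        imbalances.append(abs(b - s))
        b, s, acc = 0.0, 0.0, 0.0

imb = np.array(imbalances)
vpin = np.convolve(imb, np.ones(window) / window, mode="valid") / V
print(f"{imb.size} buckets; VPIN range {vpin.min():.3f}-{vpin.max():.3f}")
```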

  5. Recruitment in a Colorado population of big brown bats: Breeding probabilities, litter size, and first-year survival

    Science.gov (United States)

    O'Shea, T.J.; Ellison, L.E.; Neubaum, D.J.; Neubaum, M.A.; Reynolds, C.A.; Bowen, R.A.

    2010-01-01

    We used mark-recapture estimation techniques and radiography to test hypotheses about 3 important aspects of recruitment in big brown bats (Eptesicus fuscus) in Fort Collins, Colorado: adult breeding probabilities, litter size, and 1st-year survival of young. We marked 2,968 females with passive integrated transponder (PIT) tags at multiple sites during 2001-2005 and based our assessments on direct recaptures (breeding probabilities) and passive detection with automated PIT tag readers (1st-year survival). We interpreted our data in relation to hypotheses regarding demographic influences of bat age, roost, and effects of years with unusual environmental conditions: extreme drought (2002) and arrival of a West Nile virus epizootic (2003). Conditional breeding probabilities at 6 roosts sampled in 2002-2005 were estimated as 0.64 (95% confidence interval [95% CI] = 0.53-0.73) in 1-year-old females, but were consistently high (95% CI = 0.94-0.96) and did not vary by roost, year, or prior year breeding status in older adults. Mean litter size was 1.11 (95% CI = 1.05-1.17), based on examination of 112 pregnant females by radiography. Litter size was not higher in older or larger females and was similar to results of other studies in western North America despite wide variation in latitude. First-year survival was estimated as 0.67 (95% CI = 0.61-0.73) for weaned females at 5 maternity roosts over 5 consecutive years, was lower than adult survival (0.79; 95% CI = 0.77-0.81), and varied by roost. Based on model selection criteria, strong evidence exists for complex roost and year effects on 1st-year survival. First-year survival was lowest in bats born during the drought year. Juvenile females that did not return to roosts as 1-year-olds had lower body condition indices in late summer of their natal year than those known to survive. © 2009 American Society of Mammalogists.

  6. Estimating the probability of survival of individual shortleaf pine (Pinus echinata mill.) trees

    Science.gov (United States)

    Sudip Shrestha; Thomas B. Lynch; Difei Zhang; James M. Guldin

    2012-01-01

    A survival model is needed in a forest growth system to predict the survival of trees on an individual or stand basis (Gertner, 1989). An individual-tree modeling approach is one of the better methods available for predicting growth and yield, as it provides essential information about a particular tree species: tree size, tree quality, and present tree status...

  7. Survival probabilities of loggerhead sea turtles (Caretta caretta) estimated from capture-mark-recapture data in the Mediterranean Sea

    Directory of Open Access Journals (Sweden)

    Paolo Casale

    2007-06-01

    Full Text Available Survival probabilities of loggerhead sea turtles (Caretta caretta) are estimated for the first time in the Mediterranean by analysing 3254 tagging and 134 re-encounter data from this region. Most of these turtles were juveniles found at sea. Re-encounters were live resightings and dead recoveries, and data were analysed with Barker's model, a modified version of the Cormack-Jolly-Seber model which can combine recapture, live resighting and dead recovery data. An annual survival probability of 0.73 (95% CI = 0.67-0.78; n = 3254) was obtained, and should be considered a conservative estimate due to an unknown, though not negligible, tag loss rate. This study makes a preliminary estimate of the survival probabilities of in-water developmental stages for the Mediterranean population of endangered loggerhead sea turtles and provides the first insights into the magnitude of the suspected human-induced mortality in the region. The model used here for the first time on sea turtles could be used to obtain survival estimates from other data sets with few or no true recaptures but with other types of re-encounter data, which are a common output of tagging programmes involving these wide-ranging animals.

  8. Estimates for the probability of survival of electrons in passing through a radiator

    International Nuclear Information System (INIS)

    Loos, J.

    1977-01-01

    Some calculations of the survival of electrons passing through various radiator thicknesses are tabulated. The results of these calculations should serve as a guide to the expected attenuation of electrons in the beam when various Pb radiators are inserted.

  9. Malnutrition among rural and urban children in Lesotho: related hazard and survival probabilities

    Directory of Open Access Journals (Sweden)

    Zeleke Worku

    2003-11-01

    Full Text Available The relationship between the survival time of children and several variables that affect the survival and nutritional status of children under the age of five years in the Maseru District of Lesotho was investigated.

  10. Experiencing El Niño conditions during early life reduces recruiting probabilities but not adult survival

    Science.gov (United States)

    Rodríguez, Cristina; Drummond, Hugh

    2018-01-01

    In wild long-lived animals, analysis of impacts of stressful natal conditions on adult performance has rarely embraced the entire age span, and the possibility that costs are expressed late in life has seldom been examined. Using 26 years of data from 8541 fledglings and 1310 adults of the blue-footed booby (Sula nebouxii), a marine bird that can live up to 23 years, we tested whether experiencing the warm waters and food scarcity associated with El Niño in the natal year reduces recruitment or survival over the adult lifetime. Warm water in the natal year reduced the probability of recruiting; each additional degree (°C) of water temperature meant a reduction of roughly 50% in fledglings' probability of returning to the natal colony as breeders. Warm water in the current year impacted adult survival, with greater effect at the oldest ages than during early adulthood. However, warm water in the natal year did not affect survival at any age over the adult lifespan. A previous study showed that early recruitment and widely spaced breeding allow boobies that experience warm waters in the natal year to achieve normal fledgling production over the first 10 years; our results now show that this reproductive effort incurs no survival penalty, not even late in life. This pattern is additional evidence of buffering against stressful natal conditions via life-history adjustments. PMID:29410788

  11. Additional components of risk assessment and their impact on the probability parameter

    Directory of Open Access Journals (Sweden)

    Piotr Saja

    2017-04-01

    Full Text Available The article raises the issue of risk assessment and its impact on the quality and safety of work. During the assessment of a turning lathe workstation, additional components associated with the personalization of the job were taken into account. Paragraph 2, item 7 of the Regulation of the Minister of Labor and Social Policy of 26 September 1997 on general safety regulations defines occupational risk as the likelihood of an adverse event. The authors drew attention to the fact that the occurrence of an accident sometimes depends on the predisposition of the employee, so correctly estimating the probability of an accident, in order to be able to react in a timely way, is extremely important. This parameter can be assessed more accurately if a number of additional components resulting from the characteristics of the employee are taken into account. The results of a personalized risk assessment may allow appropriate planning of corrective and preventive actions.

  12. Probability and statistical correlation of the climatic parameters for estimatingenergy consumption of a building

    Directory of Open Access Journals (Sweden)

    Samarin Oleg Dmitrievich

    2014-01-01

    Full Text Available The accurate estimation of energy consumption by ventilation and air conditioning systems in buildings is now a high-priority task, because of dwindling energy and fuel sources and the revision of building standards in the Russian Federation. It is therefore very important to find simple but sufficiently accurate correlations of the climatic parameters in the heating and cooling seasons of a year. The probabilistic and statistical relationships of the external climate parameters in the warm and cold seasons are therefore considered. The climatic curves for the cold and warm seasons in Moscow, showing the most probable combinations of external air temperature and relative air humidity, are plotted using data from the Design Guidelines to the State Building Code "Building Climatology". The statistical relationship of enthalpy and external air temperature for the climatic conditions of Moscow is determined using these climatic curves and formulas connecting relative air humidity with other measures of air moisture. The mean value of external air enthalpy for the heating season is calculated in order to simplify the determination of the full heat consumption of ventilation and air conditioning systems, taking into account the real mean state of the external air. The field of application and estimates of the accuracy and standard deviation of the presented dependences are given. The obtained model contains only one independent parameter, namely the external air temperature, and can therefore easily be used in engineering practice, especially during preliminary calculations.

  13. Probabilities of Pulmonary and Cardiac Complications and Radiographic Parameters in Breast Cancer Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Noh, O Kyu; Paek, Sung Ho; Ahn, Seung Do; Choi, Eun Kyung; Lee, Sang Wook; Song, Si Yeol; Yoon, Sang Min; Kim, Jong Hoon [Dept. of Radiation Oncology, Asan Medical Center, University of Ulsan College of Medicine, Seoul (Korea, Republic of)

    2010-11-15

    To evaluate the relationship between the normal tissue complication probability (NTCP) of 3-dimensional (3-D) radiotherapy and the radiographic parameters of 2-dimensional (2-D) radiotherapy, such as central lung distance (CLD) and maximal heart distance (MHD). We analyzed 110 patients who were treated with postoperative radiotherapy for breast cancer. A two-field tangential technique, a three-field technique, and the reverse hockey stick method were used. The radiation dose administered to the whole breast or the chest wall was 50.4 Gy, whereas 45 Gy was administered to the supraclavicular field. The NTCPs of the heart and lung were calculated by the modified Lyman model and the relative seriality model. For all patients, the NTCPs of radiation-induced pneumonitis and cardiac mortality were 0.5% and 0.7%, respectively. The NTCP of radiation-induced pneumonitis was higher in patients treated with the reverse hockey stick method than in those treated by the other two techniques (0.0%, 0.0%, 3.1%, p<0.001). The NTCP of radiation-induced pneumonitis increased with CLD. The NTCP of cardiac mortality increased with MHD (R²=0.808). We found a close correlation between the NTCP of 3-D radiotherapy and 2-D radiographic parameters. Our results are useful for reanalyzing previous 2-D based clinical reports about breast radiation therapy complications from the viewpoint of NTCP.

  14. Probabilities of Pulmonary and Cardiac Complications and Radiographic Parameters in Breast Cancer Radiotherapy

    International Nuclear Information System (INIS)

    Noh, O Kyu; Paek, Sung Ho; Ahn, Seung Do; Choi, Eun Kyung; Lee, Sang Wook; Song, Si Yeol; Yoon, Sang Min; Kim, Jong Hoon

    2010-01-01

    To evaluate the relationship between the normal tissue complication probability (NTCP) of 3-dimensional (3-D) radiotherapy and the radiographic parameters of 2-dimensional (2-D) radiotherapy, such as central lung distance (CLD) and maximal heart distance (MHD). We analyzed 110 patients who were treated with postoperative radiotherapy for breast cancer. A two-field tangential technique, a three-field technique, and the reverse hockey stick method were used. The radiation dose administered to the whole breast or the chest wall was 50.4 Gy, whereas 45 Gy was administered to the supraclavicular field. The NTCPs of the heart and lung were calculated by the modified Lyman model and the relative seriality model. For all patients, the NTCPs of radiation-induced pneumonitis and cardiac mortality were 0.5% and 0.7%, respectively. The NTCP of radiation-induced pneumonitis was higher in patients treated with the reverse hockey stick method than in those treated by the other two techniques (0.0%, 0.0%, 3.1%, p<0.001). The NTCP of radiation-induced pneumonitis increased with CLD. The NTCP of cardiac mortality increased with MHD (R²=0.808). We found a close correlation between the NTCP of 3-D radiotherapy and 2-D radiographic parameters. Our results are useful for reanalyzing previous 2-D based clinical reports about breast radiation therapy complications from the viewpoint of NTCP.
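
    The modified Lyman (Lyman-Kutcher-Burman) calculation used in these two records can be written in a few lines; the DVH and the parameter values below are placeholders in the spirit of published pneumonitis fits, not values taken from this study.

```python
import numpy as np
from scipy.stats import norm

def lkb_ntcp(doses_gy, volumes, n, m, td50):
    """Lyman-Kutcher-Burman NTCP from a differential DVH.

    n: volume-effect parameter, m: slope, td50: 50%-complication dose."""
    v = np.asarray(volumes, dtype=float)
    v = v / v.sum()
    geud = np.sum(v * np.asarray(doses_gy) ** (1.0 / n)) ** n
    return norm.cdf((geud - td50) / (m * td50))

# Toy whole-lung DVH; n=1, m=0.37, TD50=30.8 Gy echo published lung fits.
doses = np.array([2.0, 6.0, 10.0, 16.0, 24.0, 40.0])
vols = np.array([0.35, 0.25, 0.15, 0.12, 0.08, 0.05])
print(f"NTCP(pneumonitis) ~ {lkb_ntcp(doses, vols, 1.0, 0.37, 30.8):.2%}")
```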

  15. Intraseasonal variation in survival and probable causes of mortality in greater sage-grouse Centrocercus urophasianus

    Science.gov (United States)

    Blomberg, Erik J.; Gibson, Daniel; Sedinger, James S.; Casazza, Michael L.; Coates, Peter S.

    2013-01-01

    The mortality process is a key component of avian population dynamics, and understanding factors that affect mortality is central to grouse conservation. Populations of greater sage-grouse Centrocercus urophasianus have declined across their range in western North America. We studied cause-specific mortality of radio-marked sage-grouse in Eureka County, Nevada, USA, during two seasons, nesting (2008-2012) and fall (2008-2010), when survival was known to be lower compared to other times of the year. We used known-fate and cumulative incidence function models to estimate weekly survival rates and cumulative risk of cause-specific mortalities, respectively. These methods allowed us to account for temporal variation in sample size and staggered entry of marked individuals into the sample to obtain robust estimates of survival and cause-specific mortality. We monitored 376 individual sage-grouse during the course of our study, and investigated 87 deaths. Predation was the major source of mortality, and accounted for 90% of all mortalities during our study. During the nesting season (1 April - 31 May), the cumulative risk of predation by raptors (0.10; 95% CI: 0.05-0.16) and mammals (0.08; 95% CI: 0.03-013) was relatively equal. In the fall (15 August - 31 October), the cumulative risk of mammal predation was greater (M(mam) = 0.12; 95% CI: 0.04-0.19) than either predation by raptors (M(rap) = 0.05; 95% CI: 0.00-0.10) or hunting harvest (M(hunt) = 0.02; 95% CI: 0.0-0.06). During both seasons, we observed relatively few additional sources of mortality (e.g. collision) and observed no evidence of disease-related mortality (e.g. West Nile Virus). In general, we found little evidence for intraseasonal temporal variation in survival, suggesting that the nesting and fall seasons represent biologically meaningful time intervals with respect to sage-grouse survival.

  16. Stark broadening parameters and transition probabilities of persistent lines of Tl II

    Science.gov (United States)

    de Andrés-García, I.; Colón, C.; Fernández-Martínez, F.

    2018-05-01

    The presence of singly ionized thallium in the stellar atmosphere of the chemically peculiar star χ Lupi was reported by Leckrone et al. in 1999 by analysis of its stellar spectrum obtained with the Goddard High Resolution Spectrograph (GHRS) on board the Hubble Space Telescope. Atomic data about the spectral line of 1307.50 Å and about the hyperfine components of the spectral lines of 1321.71 Å and 1908.64 Å were taken from different sources and used to analyse the isotopic abundance of thallium II in the star χ Lupi. From their results the authors concluded that the photosphere of the star presents an anomalous isotopic composition of Tl II. A study of the atomic parameters of Tl II and of the broadening by the Stark effect of its spectral lines (and therefore of the possible overlaps of these lines) can help to clarify the conclusions about the spectral abundance of Tl II in different stars. In this paper we present calculated values of the atomic transition probabilities and Stark broadening parameters for 49 spectral lines of Tl II obtained by using the Cowan code including core polarization effects and the Griem semiempirical approach. Theoretical values of radiative lifetimes for 11 levels (eight with experimental values in the bibliography) are calculated and compared with the experimental values in order to test the quality of our results. Theoretical trends of the Stark width and shift parameters versus the temperature for spectral lines of astrophysical interest are displayed. Trends of our calculated Stark width for the isoelectronic sequence Tl II-Pb III-Bi IV are also displayed.

  17. Disparities in breast cancer tumor characteristics, treatment, time to treatment, and survival probability among African American and white women.

    Science.gov (United States)

    Foy, Kevin Chu; Fisher, James L; Lustberg, Maryam B; Gray, Darrell M; DeGraffinreid, Cecilia R; Paskett, Electra D

    2018-01-01

    African American (AA) women have a 42% higher breast cancer death rate compared to white women despite recent advancements in management of the disease. We examined racial differences in clinical and tumor characteristics, treatment, and survival in patients diagnosed with breast cancer between 2005 and 2014 at a single institution, the James Cancer Hospital, and who were included in the Arthur G. James Cancer Hospital and Richard J. Solove Research Institute Cancer Registry in Columbus, OH. Statistical analyses included likelihood ratio chi-square tests for differences in proportions, as well as univariate and multivariate Cox proportional hazards regressions to examine associations between race and overall and progression-free survival probabilities. AA women made up 10.2% (469 of 4593) of the sample. Average time to onset of treatment after diagnosis was almost two times longer in AA women compared to white women (62.0 days vs 35.5 days). AA women were more likely to present with triple-negative and late-stage breast cancer, and were less likely to receive surgery, especially mastectomy and reconstruction following mastectomy. After adjustment for confounding factors (age, grade, and surgery), overall survival probability was significantly associated with race (HR = 1.33; 95% CI 1.03-1.72). These findings highlight the need for efforts focused on screening and receipt of prompt treatment among AA women diagnosed with breast cancer.

  18. Survival modeling for the estimation of transition probabilities in model-based economic evaluations in the absence of individual patient data: a tutorial.

    Science.gov (United States)

    Diaby, Vakaramoko; Adunlin, Georges; Montero, Alberto J

    2014-02-01

    Survival modeling techniques are increasingly being used as part of decision modeling for health economic evaluations. As many models are available, it is imperative for interested readers to know about the steps in selecting and using the most suitable ones. The objective of this paper is to propose a tutorial for the application of appropriate survival modeling techniques to estimate transition probabilities, for use in model-based economic evaluations, in the absence of individual patient data (IPD). An illustration of the use of the tutorial is provided based on the final progression-free survival (PFS) analysis of the BOLERO-2 trial in metastatic breast cancer (mBC). An algorithm was adopted from Guyot and colleagues, and was then run in the statistical package R to reconstruct IPD, based on the final PFS analysis of the BOLERO-2 trial. It should be emphasized that the reconstructed IPD represent an approximation of the original data. Afterwards, we fitted parametric models to the reconstructed IPD in the statistical package Stata. Both statistical and graphical tests were conducted to verify the relative and absolute validity of the findings. Finally, the equations for transition probabilities were derived using the general equation for transition probabilities used in model-based economic evaluations, and the parameters were estimated from fitted distributions. The results of the application of the tutorial suggest that the log-logistic model best fits the reconstructed data from the latest published Kaplan-Meier (KM) curves of the BOLERO-2 trial. Results from the regression analyses were confirmed graphically. An equation for transition probabilities was obtained for each arm of the BOLERO-2 trial. In this paper, a tutorial was proposed and used to estimate the transition probabilities for model-based economic evaluation, based on the results of the final PFS analysis of the BOLERO-2 trial in mBC. The results of our study can serve as a basis for any model
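
    The last step of the tutorial, turning a fitted parametric survival curve into cycle-to-cycle transition probabilities, reduces to the standard relation tp(t) = 1 - S(t)/S(t-u); the log-logistic parameters below are placeholders, not the BOLERO-2 estimates.

```python
# Assumed log-logistic fit S(t) = 1 / (1 + (lam*t)**gamma), t in months;
# lam and gamma are illustrative values only.
lam, gamma = 0.12, 1.6

def S(t):
    return 1.0 / (1.0 + (lam * t) ** gamma)

def transition_prob(t, u=1.0):
    """P(event in (t-u, t] | event-free at t-u) = 1 - S(t)/S(t-u)."""
    return 1.0 - S(t) / S(t - u)

for t in (1.0, 6.0, 12.0, 24.0):   # one-month model cycles
    print(f"month {t:4.0f}: tp = {transition_prob(t):.4f}")
```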

  19. Corticosterone levels predict survival probabilities of Galápagos marine iguanas during El Niño events

    Science.gov (United States)

    Romero, L. Michael; Wikelski, Martin

    2001-01-01

    Plasma levels of corticosterone are often used as a measure of “stress” in wild animal populations. However, we lack conclusive evidence that different stress levels reflect different survival probabilities between populations. Galápagos marine iguanas offer an ideal test case because island populations are affected differently by recurring El Niño famine events, and population-level survival can be quantified by counting iguanas locally. We surveyed corticosterone levels in six populations during the 1998 El Niño famine and the 1999 La Niña feast period. Iguanas had higher baseline and handling stress-induced corticosterone concentrations during famine than feast conditions. Corticosterone levels differed between islands and predicted survival through an El Niño period. However, among individuals, baseline corticosterone was only elevated when body condition dropped below a critical threshold. Thus, the population-level corticosterone response was variable but nevertheless predicted overall population health. Our results lend support to the use of corticosterone as a rapid quantitative predictor of survival in wild animal populations. PMID:11416210

  20. Survival probability for diffractive dijet production in p anti p collisions from next-to-leading order calculations

    International Nuclear Information System (INIS)

    Klasen, M.; Kramer, G.

    2009-08-01

    We perform next-to-leading order calculations of the single-diffractive and non-diffractive cross sections for dijet production in proton-antiproton collisions at the Tevatron. By comparing their ratio to the data published by the CDF collaboration for two different center-of-mass energies, we deduce the rapidity-gap survival probability as a function of the momentum fraction of the parton in the antiproton. Assuming Regge factorization, this probability can be interpreted as a suppression factor for the diffractive structure function measured in deep-inelastic scattering at HERA. In contrast to the observations for photoproduction, the suppression factor in proton-antiproton collisions depends on the momentum fraction of the parton in the Pomeron even at next-to-leading order. (orig.)

  1. Method for Automatic Selection of Parameters in Normal Tissue Complication Probability Modeling.

    Science.gov (United States)

    Christophides, Damianos; Appelt, Ane L; Gusnanto, Arief; Lilley, John; Sebag-Montefiore, David

    2018-07-01

    To present a fully automatic method to generate multiparameter normal tissue complication probability (NTCP) models and compare its results with those of a published model, using the same patient cohort. Data were analyzed from 345 rectal cancer patients treated with external radiation therapy to predict the risk of patients developing grade 1 or ≥2 cystitis. In total, 23 clinical factors were included in the analysis as candidate predictors of cystitis. Principal component analysis was used to decompose the bladder dose-volume histogram into 8 principal components, explaining more than 95% of the variance. The data set of clinical factors and principal components was divided into training (70%) and test (30%) data sets, with the training data set used by the algorithm to compute an NTCP model. The first step of the algorithm was to obtain a bootstrap sample, followed by multicollinearity reduction using the variance inflation factor and genetic algorithm optimization to determine an ordinal logistic regression model that minimizes the Bayesian information criterion. The process was repeated 100 times, and the model with the minimum Bayesian information criterion was recorded on each iteration. The most frequent model was selected as the final "automatically generated model" (AGM). The published model and AGM were fitted on the training data sets, and the risk of cystitis was calculated. The 2 models had no significant differences in predictive performance, both for the training and test data sets (P value > .05), and identified similar clinical and dosimetric factors as predictors. Both models exhibited good explanatory performance on the training data set (P values > .44), which was reduced on the test data sets (P values < .05). The predictive value of the AGM is equivalent to that of the expert-derived published model. It demonstrates potential in saving time, tackling problems with a large number of parameters, and standardizing variable selection in NTCP modeling.
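
    Two steps of the pipeline, DVH compression by PCA and BIC scoring of a candidate model, are sketched below on synthetic data; a binary logistic model stands in for the paper's ordinal regression, and no genetic search or bootstrap is included.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.decomposition import PCA

rng = np.random.default_rng(12)

# Synthetic stand-ins: a bladder DVH sampled at 50 dose levels for 345
# patients, plus a binary toxicity label (random here, for illustration).
n, bins = 345, 50
dvh = np.sort(rng.random((n, bins)), axis=1)[:, ::-1]   # decreasing curves
y = (rng.random(n) < 0.25).astype(int)

# Step 1: compress the DVH into principal components (>= 95% variance).
pca = PCA(n_components=0.95)
pcs = pca.fit_transform(dvh)
print(f"{pca.n_components_} components keep "
      f"{pca.explained_variance_ratio_.sum():.1%} of the DVH variance")

# Step 2: score a candidate predictor subset by the Bayesian information
# criterion, computed from the fitted log-likelihood.
def bic(X, y):
    fit = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
    k = X.shape[1] + 1
    return k * np.log(len(y)) - 2.0 * fit.llf

print(f"BIC, first 3 PCs: {bic(pcs[:, :3], y):.1f}")
print(f"BIC, all PCs    : {bic(pcs, y):.1f}")
```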

  2. Sugar administration to newly emerged Aedes albopictus males increases their survival probability and mating performance.

    Science.gov (United States)

    Bellini, Romeo; Puggioli, Arianna; Balestrino, Fabrizio; Brunelli, Paolo; Medici, Anna; Urbanelli, Sandra; Carrieri, Marco

    2014-04-01

    Aedes albopictus male survival in laboratory cages is no more than 4-5 days when males are kept without any access to sugar, indicating their need to feed on a sugar source soon after emergence. We therefore developed a device to administer energetic substances to newly emerged males when released as pupae as part of a sterile insect technique (SIT) programme; it consists of a polyurethane sponge 4 cm thick, perforated with holes 2 cm in diameter. The sponge was imbibed with the required sugar solution, and owing to its high retention capacity the solution remained available for males to feed on for at least 48 h. When evaluated in lab cages, comparing adults emerged from the device with sugar solution vs the device with water only (as negative control), about half of the males tested positive for fructose using the Van Handel anthrone test, compared with none of the males in the control cage. We then tested the tool under semi-field and field conditions with different sugar concentrations (10%, 15%, and 20%) and compared results to controls fed with water only. Males were recaptured with a battery-operated manual aspirator at 24 and 48 h after pupae release. A rather high share (10-25%) of males captured in the vicinity of the control stations tested positive for fructose, while around 40-55% of males captured in the vicinity of the sugar stations were positive, though variability between replicates was large. The sugar-positive males in the control test may have been released males that had access to natural sugar sources found close to the release station and/or wild males present in the environment. Only a slight increase in the proportion of positive males was obtained by increasing the sugar concentration in the feeding device from 10% to 20%. Surprisingly, modifying the device by adding a black plastic inverted funnel above the container reduced rather than increased the proportion of fructose-positive males collected around the station. No evidence of difference in the

  3. Estimation of failure probability of the end induced current depending on uncertain parameters of a transmission line

    International Nuclear Information System (INIS)

    Larbi, M.; Besnier, P.; Pecqueux, B.

    2014-01-01

    This paper addresses the risk analysis of an EMC failure using a statistical approach based on reliability methods. A probability of failure (i.e., the probability of exceeding a threshold) of a current induced by crosstalk is computed by taking into account uncertainties on the input parameters influencing extreme levels of interference in the context of transmission lines. Results are compared to Monte Carlo simulation (MCS). (authors)
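
    Stripped to its core, the Monte Carlo benchmark mentioned at the end of the abstract is a threshold-exceedance estimate. A minimal sketch, assuming a purely illustrative lognormal model for the induced current:

    ```python
    # Crude Monte Carlo estimate of a failure probability P(I > threshold).
    # The lognormal model for the induced current and its parameters are
    # illustrative placeholders, not values from the paper.
    import numpy as np

    rng = np.random.default_rng(42)
    threshold = 5.0                      # hypothetical current threshold (A)
    samples = rng.lognormal(mean=0.8, sigma=0.6, size=1_000_000)
    p_fail = np.mean(samples > threshold)
    # Binomial standard error of the estimate:
    se = np.sqrt(p_fail * (1 - p_fail) / samples.size)
    print(f"P(failure) ≈ {p_fail:.4e} ± {se:.1e}")
    ```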

  4. Probability encoding of hydrologic parameters for basalt: Elicitation of expert opinions from a panel of five consulting hydrologists

    International Nuclear Information System (INIS)

    Davis, J.D.

    1984-01-01

    The Columbia River basalts underlying the Hanford Site in Washington State are being considered as a possible location for a geologic repository for high-level nuclear waste. To investigate the feasibility of a repository at this site, the hydrologic parameters of the site must be evaluated. Among hydrologic parameters of particular interest are the effective porosity of the Cohassett flow top and flow interior and the vertical-to-horizontal hydraulic conductivity, or anisotropy ratio, of the Cohassett flow interior. Site-specific data for these hydrologic parameters are currently inadequate. To obtain credible, auditable, and independently derived estimates of the specified hydrologic parameters for the purpose of preliminary assessment of candidate repository performance, a panel of five nationally recognized hydrologists was assembled. Their expert judgments were quantified during two rounds of a Delphi process by means of a probability encoding method developed to estimate the probability distributions of the selected hydrologic variables. 210 refs., 12 figs., 5 tabs

  5. How long do the dead survive on the road? Carcass persistence probability and implications for road-kill monitoring surveys.

    Directory of Open Access Journals (Sweden)

    Sara M Santos

    BACKGROUND: Road mortality is probably the best-known and most visible impact of roads upon wildlife. Although several factors influence road-kill counts, carcass persistence time is considered the most important determinant underlying underestimates of road mortality. The present study aims to describe and model carcass persistence variability on the road for different taxonomic groups under different environmental conditions throughout the year; and also to assess the effect of sampling frequency on the relative variation in road-kill estimates registered within a survey. METHODOLOGY/PRINCIPAL FINDINGS: Daily surveys of road-killed vertebrates were conducted over one year along four road sections with different traffic volumes. Survival analysis was then used to i) describe carcass persistence timings for overall and for specific animal groups; ii) assess optimal sampling designs according to research objectives; and iii) model the influence of road, animal and weather factors on carcass persistence probabilities. Most animal carcasses persisted on the road for the first day only, with some groups disappearing at very high rates. The advisable periodicity of road monitoring that minimizes bias in road mortality estimates is daily monitoring for bats (in the morning) and lizards (in the afternoon), daily monitoring for toads, small birds, small mammals, snakes, salamanders, and lagomorphs; 1 day-interval (alternate days) for large birds, birds of prey, hedgehogs, and freshwater turtles; and 2 day-interval for carnivores. Multiple factors influenced the persistence probabilities of vertebrate carcasses on the road. Overall, the persistence was much lower for small animals, on roads with lower traffic volumes, for carcasses located on road lanes, and during humid conditions and high temperatures during the wet season and dry seasons, respectively. CONCLUSION/SIGNIFICANCE: The guidance given here on monitoring frequencies is particularly relevant to provide

  6. How long do the dead survive on the road? Carcass persistence probability and implications for road-kill monitoring surveys.

    Science.gov (United States)

    Santos, Sara M; Carvalho, Filipe; Mira, António

    2011-01-01

    Road mortality is probably the best-known and visible impact of roads upon wildlife. Although several factors influence road-kill counts, carcass persistence time is considered the most important determinant underlying underestimates of road mortality. The present study aims to describe and model carcass persistence variability on the road for different taxonomic groups under different environmental conditions throughout the year; and also to assess the effect of sampling frequency on the relative variation in road-kill estimates registered within a survey. Daily surveys of road-killed vertebrates were conducted over one year along four road sections with different traffic volumes. Survival analysis was then used to i) describe carcass persistence timings for overall and for specific animal groups; ii) assess optimal sampling designs according to research objectives; and iii) model the influence of road, animal and weather factors on carcass persistence probabilities. Most animal carcasses persisted on the road for the first day only, with some groups disappearing at very high rates. The advisable periodicity of road monitoring that minimizes bias in road mortality estimates is daily monitoring for bats (in the morning) and lizards (in the afternoon), daily monitoring for toads, small birds, small mammals, snakes, salamanders, and lagomorphs; 1 day-interval (alternate days) for large birds, birds of prey, hedgehogs, and freshwater turtles; and 2 day-interval for carnivores. Multiple factors influenced the persistence probabilities of vertebrate carcasses on the road. Overall, the persistence was much lower for small animals, on roads with lower traffic volumes, for carcasses located on road lanes, and during humid conditions and high temperatures during the wet season and dry seasons, respectively. The guidance given here on monitoring frequencies is particularly relevant to provide conservation and transportation agencies with accurate numbers of road-kills.
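
    The survival-analysis step described here maps directly onto a standard Kaplan-Meier estimate of carcass persistence. A minimal sketch with the lifelines library, using simulated persistence times rather than the study's field data:

    ```python
    # Kaplan-Meier estimate of carcass persistence on the road. Times and
    # censoring flags below are made up for illustration.
    import numpy as np
    from lifelines import KaplanMeierFitter

    rng = np.random.default_rng(1)
    days_on_road = rng.exponential(scale=1.2, size=200)  # hypothetical times
    observed = rng.random(200) < 0.9                     # 0 = censored record

    kmf = KaplanMeierFitter()
    kmf.fit(days_on_road, event_observed=observed, label="all carcasses")
    print(kmf.survival_function_.head())     # P(carcass still present at t)
    print("median persistence:", kmf.median_survival_time_, "days")
    ```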

  7. Probability encoding of hydrologic parameters for basalt. Elicitation of expert opinions from a panel of five consulting hydrologists

    International Nuclear Information System (INIS)

    Runchal, A.K.; Merkhofer, M.W.; Olmsted, E.; Davis, J.D.

    1984-11-01

    The Columbia River basalts underlying the Hanford Site in Washington State are being considered as a possible location for a geologic repository for high-level nuclear waste. To investigate the feasibility of a repository at this site, the hydrologic parameters of the site must be evaluated. Among hydrologic parameters of particular interest are the effective porosity of the Cohassett basalt flow top and flow interior and the vertical-to-horizontal hydraulic conductivity, or anisotropy ratio, of the Cohassett basalt flow interior. The Cohassett basalt flow is the prime candidate horizon for repository studies. Site-specific data for these hydrologic parameters are currently inadequate for the purpose of preliminary assessment of candidate repository performance. To obtain credible, auditable, and independently derived estimates of the specified hydrologic parameters, a panel of five nationally recognized hydrologists was assembled. Their expert judgments were quantified during two rounds of a Delphi process by means of a probability encoding method developed to estimate the probability distributions of the selected hydrologic variables. The results indicate significant differences of expert opinion for cumulative probabilities of less than 10% and greater than 90%, but relatively close agreement in the middle ranges of values. The principal causes of the diversity of opinion are believed to be the lack of site-specific data and the absence of a single, widely accepted, conceptual or theoretical basis for analyzing these variables.

  8. Task 4.1: Development of a framework for creating a databank to generate probability density functions for process parameters

    International Nuclear Information System (INIS)

    Burgazzi, Luciano

    2011-01-01

    PSA analysis should be based on the best available data for the types of equipment and systems in the plant. In some cases very limited data may be available for evolutionary designs or new equipment, especially in the case of passive systems. It has been recognized that difficulties arise in addressing the uncertainties related to the physical phenomena and in characterizing the parameters relevant to the passive system performance evaluation, owing to the unavailability of a consistent operational and experimental database. This lack of experimental evidence and validated data forces the analyst to resort to expert/engineering judgment to a large extent, making the results strongly dependent upon the expert elicitation process. This prompts the need for the development of a framework for constructing a database to generate probability distributions for the parameters influencing the system behaviour. The objective of the task is to develop a consistent framework aimed at creating probability distributions for the parameters relevant to the passive system performance evaluation. To achieve this goal, considerable experience and engineering judgement are also required to determine which existing data are most applicable to the new systems, or which generic databases or models provide the best information for the system design. Finally, in the absence of documented specific reliability data, documented expert judgement derived from a well-structured procedure could be used to develop sound probability distributions for the parameters of interest.

  9. Impact-parameter-averaged probability of 3dσ-vacancy sharing in heavy systems

    International Nuclear Information System (INIS)

    Marble, D.K.; McDaniel, F.D.; Zoran, V.; Szilagyi, Z.; Piticu, I.; Fluerasu, D.; Enulescu, A.; Dumitriu, D.; Bucur, B.I.; Ciortea, C.

    1993-01-01

    The probabilities for 3dσ molecular vacancy sharing in 0.08-1.75 MeV/u F, Co, Ni, Cu + Bi collisions have been estimated using integral X-ray spectrum measurements. The analytic two-state exponential model of Nikitin has been applied to 3dσ-2p3/2 vacancy sharing in these collision systems. It describes the velocity dependence at low energies, < 0.5 MeV/u, satisfactorily, but around 1 MeV/u the velocity dependence changes its character, indicating a departure from the hypotheses of the model. (Author)

  10. Estimating survival probabilities by exposure levels: utilizing vital statistics and complex survey data with mortality follow-up.

    Science.gov (United States)

    Landsman, V; Lou, W Y W; Graubard, B I

    2015-05-20

    We present a two-step approach for estimating hazard rates and, consequently, survival probabilities, by levels of a general categorical exposure. The resulting estimator utilizes three sources of data: vital statistics data and census data are used at the first step to estimate the overall hazard rate for a given combination of gender and age group, and cohort data constructed from a nationally representative complex survey with linked mortality records are used at the second step to divide the overall hazard rate by exposure levels. We present an explicit expression for the resulting estimator and consider two methods for variance estimation that account for the complex multistage sample design: (1) the leave-one-out jackknife method, and (2) the Taylor linearization method, which provides an analytic formula for the variance estimator. The methods are illustrated with smoking and all-cause mortality data from the US National Health Interview Survey Linked Mortality Files, and the proposed estimator is compared with a previously studied crude hazard rate estimator that uses survey data only. The advantages of the two-step approach and possible extensions of the proposed estimator are discussed. Copyright © 2015 John Wiley & Sons, Ltd.
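
    The two-step decomposition can be illustrated with a toy calculation: the overall hazard h for a stratum is split across exposure levels so that the prevalence-weighted, level-specific hazards reproduce h. This is only my reading of the idea, not the authors' estimator; all numbers are hypothetical and a constant hazard is assumed purely for illustration.

    ```python
    # Illustrative two-step split of an overall hazard by exposure level.
    # Step 1: overall hazard h for an age/sex stratum (vital statistics +
    # census). Step 2: survey-based relative hazards and prevalences divide
    # h across levels. All values are invented placeholders.
    import numpy as np

    h_overall = 0.012        # deaths per person-year, from vital statistics
    prevalence = np.array([0.55, 0.25, 0.20])  # never / former / current smokers
    rel_hazard = np.array([1.0, 1.4, 2.3])     # survey-estimated hazard ratios

    # Solve h_overall = sum_k prevalence_k * h_k with h_k = h_ref * rel_hazard_k
    h_ref = h_overall / np.dot(prevalence, rel_hazard)
    h_by_level = h_ref * rel_hazard

    t = 10.0                                   # years
    surv_by_level = np.exp(-h_by_level * t)    # constant-hazard survival
    for lvl, s in zip(["never", "former", "current"], surv_by_level):
        print(f"{lvl:8s} 10-year survival ≈ {s:.3f}")
    ```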

  11. Exact calculations of survival probability for diffusion on growing lines, disks, and spheres: The role of dimension.

    Science.gov (United States)

    Simpson, Matthew J; Baker, Ruth E

    2015-09-07

    Unlike standard applications of transport theory, the transport of molecules and cells during embryonic development often takes place within growing multidimensional tissues. In this work, we consider a model of diffusion on uniformly growing lines, disks, and spheres. An exact solution of the partial differential equation governing the diffusion of a population of individuals on the growing domain is derived. Using this solution, we study the survival probability, S(t). For the standard non-growing case with an absorbing boundary, we observe that S(t) decays to zero in the long time limit. In contrast, when the domain grows linearly or exponentially with time, we show that S(t) decays to a constant, positive value, indicating that a proportion of the diffusing substance remains on the growing domain indefinitely. Comparing S(t) for diffusion on lines, disks, and spheres indicates that there are minimal differences in S(t) in the limit of zero growth and minimal differences in S(t) in the limit of fast growth. In contrast, for intermediate growth rates, we observe modest differences in S(t) between different geometries. These differences can be quantified by evaluating the exact expressions derived and presented here.

  12. THE REGULARITY OF INFLUENCE OF TRAFFIC PARAMETERS ON THE PROBABILITY OF REALISATION OF PLANNED PASSENGER TRANSFER AT TRANSFER NODES

    Directory of Open Access Journals (Sweden)

    G. Samchuk

    2017-06-01

    The article deals with the definition of traffic parameters that ensure the minimum transfer waiting time for passengers. On the basis of experimental study results, a regression equation was proposed to determine the probability of realisation of the planned transfer between a pair of vehicles. Using the identified regression equation, the transfer waiting time can be assessed for any headway exceeding 7.5 min.

  13. Experimental impact-parameter-dependent probabilities for K-shell vacancy production by fast heavy-ion projectiles

    International Nuclear Information System (INIS)

    Randall, R.R.; Bednar, J.A.; Curnutte, B.; Cocke, C.L.

    1976-01-01

    The impact-parameter dependence of the probability for production of target K x rays has been measured for oxygen projectiles on copper and for carbon and fluorine projectiles on argon at scaled velocities near 0.5. The O-on-Cu data were taken for 1.56-, 1.88-, and 2.69-MeV/amu O beams incident upon thin Cu foils. A thin Ar-gas target was used for 1.56-MeV/amu C and F beams, permitting measurements to be made for charge-pure C4+, C6+, F9+, and F5+ projectiles. Ar and Cu K x rays were observed with a Si(Li) detector and scattered projectiles with a collimated surface-barrier detector. Comparison of the shapes of the measured K-vacancy-production probability curves with predictions of the semiclassical Coulomb approximation (SCA) shows adequate agreement for the O-on-Cu system. For the higher ratio of projectile-to-target nuclear charge (Z1/Z2) characterizing the C-on-Ar and F-on-Ar systems, the SCA predictions are entirely inadequate in describing the observed impact-parameter dependence. In particular, they cannot account for the large probabilities found at large impact parameters. Furthermore, the dependence of the shapes on the projectile charge state is found to become pronounced at larger Z1/Z2. Attempts to account for this behavior in terms of alternative vacancy-production processes are discussed.

  14. Nomogram including pretherapeutic parameters for prediction of survival after SIRT of hepatic metastases from colorectal cancer

    International Nuclear Information System (INIS)

    Fendler, Wolfgang Peter; Ilhan, Harun; Paprottka, Philipp M.; Jakobs, Tobias F.; Heinemann, Volker; Bartenstein, Peter; Haug, Alexander R.; Khalaf, Feras; Ezziddin, Samer; Hacker, Marcus

    2015-01-01

    Pre-therapeutic prediction of outcome is important for clinicians and patients in determining whether selective internal radiation therapy (SIRT) is indicated for hepatic metastases of colorectal cancer (CRC). Pre-therapeutic characteristics of 100 patients with colorectal liver metastases (CRLM) treated by radioembolization were analyzed to develop a nomogram for predicting survival. Prognostic factors were selected by univariate Cox regression analysis and subsequently tested by multivariate analysis for predicting patient survival. The nomogram was validated with reference to an external patient cohort (n = 25) from the Bonn University Department of Nuclear Medicine. Of the 13 parameters tested, four were independently associated with reduced patient survival in multivariate analysis. These parameters included no liver surgery before SIRT (HR:1.81, p = 0.014), CEA serum level ≥ 150 ng/ml (HR:2.08, p = 0.001), transaminase toxicity level ≥2.5 x upper limit of normal (HR:2.82, p = 0.001), and summed computed tomography (CT) size of the largest two liver lesions ≥10 cm (HR:2.31, p < 0.001). The area under the receiver-operating characteristic curve for our prediction model was 0.83 for the external patient cohort, indicating superior performance of our multivariate model compared to a model ignoring covariates. The nomogram developed in our study entailing four pre-therapeutic parameters gives good prediction of patient survival post SIRT. (orig.)

  15. Nomogram including pretherapeutic parameters for prediction of survival after SIRT of hepatic metastases from colorectal cancer

    Energy Technology Data Exchange (ETDEWEB)

    Fendler, Wolfgang Peter [Ludwig-Maximilians-University of Munich, Department of Nuclear Medicine, Munich (Germany); Klinik und Poliklinik fuer Nuklearmedizin, Munich (Germany); Ilhan, Harun [Ludwig-Maximilians-University of Munich, Department of Nuclear Medicine, Munich (Germany); Paprottka, Philipp M. [Ludwig-Maximilians-University of Munich, Department of Clinical Radiology, Munich (Germany); Jakobs, Tobias F. [Hospital Barmherzige Brueder, Department of Diagnostic and Interventional Radiology, Munich (Germany); Heinemann, Volker [Ludwig-Maximilians-University of Munich, Department of Internal Medicine III, Munich (Germany); Ludwig-Maximilians-University of Munich, Comprehensive Cancer Center, Munich (Germany); Bartenstein, Peter; Haug, Alexander R. [Ludwig-Maximilians-University of Munich, Department of Nuclear Medicine, Munich (Germany); Ludwig-Maximilians-University of Munich, Comprehensive Cancer Center, Munich (Germany); Khalaf, Feras [University Hospital Bonn, Department of Nuclear Medicine, Bonn (Germany); Ezziddin, Samer [Saarland University Medical Center, Department of Nuclear Medicine, Homburg (Germany); Hacker, Marcus [Vienna General Hospital, Department of Nuclear Medicine, Vienna (Austria)

    2015-09-15

    Pre-therapeutic prediction of outcome is important for clinicians and patients in determining whether selective internal radiation therapy (SIRT) is indicated for hepatic metastases of colorectal cancer (CRC). Pre-therapeutic characteristics of 100 patients with colorectal liver metastases (CRLM) treated by radioembolization were analyzed to develop a nomogram for predicting survival. Prognostic factors were selected by univariate Cox regression analysis and subsequently tested by multivariate analysis for predicting patient survival. The nomogram was validated with reference to an external patient cohort (n = 25) from the Bonn University Department of Nuclear Medicine. Of the 13 parameters tested, four were independently associated with reduced patient survival in multivariate analysis. These parameters included no liver surgery before SIRT (HR:1.81, p = 0.014), CEA serum level ≥ 150 ng/ml (HR:2.08, p = 0.001), transaminase toxicity level ≥2.5 x upper limit of normal (HR:2.82, p = 0.001), and summed computed tomography (CT) size of the largest two liver lesions ≥10 cm (HR:2.31, p < 0.001). The area under the receiver-operating characteristic curve for our prediction model was 0.83 for the external patient cohort, indicating superior performance of our multivariate model compared to a model ignoring covariates. The nomogram developed in our study entailing four pre-therapeutic parameters gives good prediction of patient survival post SIRT. (orig.)
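
    The multivariate step behind such a nomogram is a Cox proportional hazards fit. A minimal sketch with lifelines, in which the four binary factors mirror those reported above but the data frame is simulated, not the study cohort:

    ```python
    # Cox proportional hazards fit over four binary pre-therapeutic factors.
    # The covariate names echo the abstract; all values are simulated.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(7)
    n = 100
    df = pd.DataFrame({
        "no_prior_liver_surgery":  rng.integers(0, 2, n),
        "cea_ge_150":              rng.integers(0, 2, n),
        "transaminase_ge_2_5_uln": rng.integers(0, 2, n),
        "sum_two_lesions_ge_10cm": rng.integers(0, 2, n),
        "months":                  rng.exponential(12, n),
        "died":                    rng.integers(0, 2, n),
    })
    cph = CoxPHFitter()
    cph.fit(df, duration_col="months", event_col="died")
    cph.print_summary()   # hazard ratios analogous to the HRs quoted above
    ```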

  16. Modeling visual search using three-parameter probability functions in a hierarchical Bayesian framework.

    Science.gov (United States)

    Lin, Yi-Shin; Heinke, Dietmar; Humphreys, Glyn W

    2015-04-01

    In this study, we applied Bayesian-based distributional analyses to examine the shapes of response time (RT) distributions in three visual search paradigms, which varied in task difficulty. In further analyses we investigated two common observations in visual search (the effects of display size and of variations in search efficiency across different task conditions), following a design that had been used in previous studies (Palmer, Horowitz, Torralba, & Wolfe, Journal of Experimental Psychology: Human Perception and Performance, 37, 58-71, 2011; Wolfe, Palmer, & Horowitz, Vision Research, 50, 1304-1311, 2010) in which parameters of the response distributions were measured. Our study showed that the distributional parameters in an experimental condition can be reliably estimated by moderate sample sizes when Monte Carlo simulation techniques are applied. More importantly, by analyzing trial RTs, we were able to extract paradigm-dependent shape changes in the RT distributions that could be accounted for by using the EZ2 diffusion model. The study showed that Bayesian-based RT distribution analyses can provide an important means to investigate the underlying cognitive processes in search, including stimulus grouping and the bottom-up guidance of attention.

  17. Estimation of probability density functions of damage parameter for valve leakage detection in reciprocating pump used in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Kyeom; Kim, Tae Yun; Kim, Hyun Su; Chai, Jang Bom; Lee, Jin Woo [Div. of Mechanical Engineering, Ajou University, Suwon (Korea, Republic of)

    2016-10-15

    This paper presents an advanced estimation method for obtaining the probability density functions of a damage parameter for valve leakage detection in a reciprocating pump. The estimation method is based on a comparison of model data, simulated using a mathematical model, with experimental data measured on the inside and outside of the reciprocating pump in operation. The mathematical model, which is simplified and extended on the basis of previous models, describes not only the normal state of the pump, but also its abnormal state caused by valve leakage. The pressure in the cylinder is expressed as a function of the crankshaft angle, and the additional volume flow rate due to valve leakage is quantified by a damage parameter in the mathematical model. The change in the cylinder pressure profiles due to suction valve leakage is noticeable in the compression and expansion modes of the pump. The damage parameter value over 300 cycles is calculated in two ways, considering advance or delay in the opening and closing angles of the discharge valves. The probability density functions of the damage parameter are compared for diagnosis and prognosis on the basis of the probabilistic features of valve leakage.

  18. Estimation of probability density functions of damage parameter for valve leakage detection in reciprocating pump used in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Jong Kyeom; Kim, Tae Yun; Kim, Hyun Su; Chai, Jang Bom; Lee, Jin Woo

    2016-01-01

    This paper presents an advanced estimation method for obtaining the probability density functions of a damage parameter for valve leakage detection in a reciprocating pump. The estimation method is based on a comparison of model data, simulated using a mathematical model, with experimental data measured on the inside and outside of the reciprocating pump in operation. The mathematical model, which is simplified and extended on the basis of previous models, describes not only the normal state of the pump, but also its abnormal state caused by valve leakage. The pressure in the cylinder is expressed as a function of the crankshaft angle, and the additional volume flow rate due to valve leakage is quantified by a damage parameter in the mathematical model. The change in the cylinder pressure profiles due to suction valve leakage is noticeable in the compression and expansion modes of the pump. The damage parameter value over 300 cycles is calculated in two ways, considering advance or delay in the opening and closing angles of the discharge valves. The probability density functions of the damage parameter are compared for diagnosis and prognosis on the basis of the probabilistic features of valve leakage.

  19. Estimation of Probability Density Functions of Damage Parameter for Valve Leakage Detection in Reciprocating Pump Used in Nuclear Power Plants

    Directory of Open Access Journals (Sweden)

    Jong Kyeom Lee

    2016-10-01

    This paper presents an advanced estimation method for obtaining the probability density functions of a damage parameter for valve leakage detection in a reciprocating pump. The estimation method is based on a comparison of model data, simulated using a mathematical model, with experimental data measured on the inside and outside of the reciprocating pump in operation. The mathematical model, which is simplified and extended on the basis of previous models, describes not only the normal state of the pump, but also its abnormal state caused by valve leakage. The pressure in the cylinder is expressed as a function of the crankshaft angle, and the additional volume flow rate due to valve leakage is quantified by a damage parameter in the mathematical model. The change in the cylinder pressure profiles due to suction valve leakage is noticeable in the compression and expansion modes of the pump. The damage parameter value over 300 cycles is calculated in two ways, considering advance or delay in the opening and closing angles of the discharge valves. The probability density functions of the damage parameter are compared for diagnosis and prognosis on the basis of the probabilistic features of valve leakage.
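
    The diagnostic use of the damage-parameter densities can be sketched with a kernel density estimate per pump state and a crossing-point decision rule. The per-cycle samples below are simulated stand-ins for the 300-cycle estimates described in the paper:

    ```python
    # Compare damage-parameter PDFs under "normal" and "leaky" valve states
    # via Gaussian KDEs; the density crossing suggests a detection threshold.
    # All sample values are hypothetical.
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(3)
    normal_cycles = rng.normal(0.02, 0.01, 300)  # damage parameter, normal state
    leaky_cycles = rng.normal(0.08, 0.02, 300)   # damage parameter, leaky state

    kde_normal = gaussian_kde(normal_cycles)
    kde_leaky = gaussian_kde(leaky_cycles)

    # Locate the density crossing between the two state means.
    x = np.linspace(normal_cycles.mean(), leaky_cycles.mean(), 400)
    threshold = x[np.argmin(np.abs(kde_normal(x) - kde_leaky(x)))]
    print(f"suggested leak-detection threshold ≈ {threshold:.3f}")
    ```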

  20. Determination of probability density functions for parameters in the Munson-Dawson model for creep behavior of salt

    International Nuclear Information System (INIS)

    Pfeifle, T.W.; Mellegard, K.D.; Munson, D.E.

    1992-10-01

    The modified Munson-Dawson (M-D) constitutive model that describes the creep behavior of salt will be used in performance assessment calculations to assess compliance of the Waste Isolation Pilot Plant (WIPP) facility with requirements governing the disposal of nuclear waste. One of these standards requires that the uncertainty of future states of the system, material model parameters, and data be addressed in the performance assessment models. This paper presents a method in which measurement uncertainty and the inherent variability of the material are characterized by treating the M-D model parameters as random variables. The random variables can be described by appropriate probability distribution functions, which then can be used in Monte Carlo or structural reliability analyses. Estimates of three random variables in the M-D model were obtained by fitting a scalar form of the model to triaxial compression creep data generated from tests of WIPP salt. Candidate probability distribution functions for each of the variables were then fitted to the estimates and their relative goodness-of-fit was tested using the Kolmogorov-Smirnov statistic. A sophisticated statistical software package obtained from BMDP Statistical Software, Inc. was used in the M-D model fitting. A separate software package, STATGRAPHICS, was used in fitting the candidate probability distribution functions to estimates of the variables. Skewed distributions, i.e., lognormal and Weibull, were found to be appropriate for the random variables analyzed.
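
    The fitting-and-testing step generalizes readily. The sketch below uses scipy in place of the BMDP and STATGRAPHICS packages named above, with simulated stand-ins for the M-D parameter estimates (note that fitting and testing on the same data makes the KS p-values optimistic):

    ```python
    # Fit candidate distributions to parameter estimates and rank them with
    # the Kolmogorov-Smirnov statistic. Data are simulated placeholders.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    param_estimates = rng.lognormal(mean=-1.0, sigma=0.4, size=60)

    for label, dist in [("lognormal", stats.lognorm),
                        ("weibull", stats.weibull_min),
                        ("normal", stats.norm)]:
        params = dist.fit(param_estimates)
        ks_stat, p_value = stats.kstest(param_estimates, dist.name, args=params)
        print(f"{label:9s}  KS = {ks_stat:.3f}  p = {p_value:.3f}")
    ```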

  1. Probability Distributions for Cyclone Key Parameters and Cyclonic Wind Speed for the East Coast of Indian Region

    Directory of Open Access Journals (Sweden)

    Pradeep K. Goyal

    2011-09-01

    This paper presents a study of the probabilistic distribution of key cyclone parameters and cyclonic wind speed, based on an analysis of cyclone track records obtained from the India Meteorological Department for the east coast region of India. The dataset of historical landfalling storm tracks in India from 1975-2007, with latitude/longitude and landfall locations, is used to map the cyclone tracks in the region of study. Statistical tests were performed to find the best-fit distribution to the track data for each cyclone parameter. These parameters include the central pressure difference, the radius of maximum wind speed, the translation velocity, and the track angle with the site, and are used to generate digitally simulated cyclones using wind field simulation techniques. For this, different sets of values for all the cyclone key parameters are generated randomly from their probability distributions. Using these simulated values of the cyclone key parameters, the distribution of wind velocity at a particular site is obtained. The same distribution of wind velocity at the site is also obtained from actual track records and using the distributions of the cyclone key parameters as published in the literature. The simulated distribution is compared with the wind speed distributions obtained from actual track records. The findings are useful in cyclone disaster mitigation.

  2. Robust estimation of the expected survival probabilities from high-dimensional Cox models with biomarker-by-treatment interactions in randomized clinical trials

    Directory of Open Access Journals (Sweden)

    Nils Ternès

    2017-05-01

    Background: Thanks to the advances in genomics and targeted treatments, more and more prediction models based on biomarkers are being developed to predict potential benefit from treatments in a randomized clinical trial. Although the methodological framework for the development and validation of prediction models in a high-dimensional setting is becoming more established, no clear guidance exists yet on how to estimate expected survival probabilities in a penalized model with biomarker-by-treatment interactions. Methods: Based on a parsimonious biomarker selection in a penalized high-dimensional Cox model (lasso or adaptive lasso), we propose a unified framework to: estimate internally the predictive accuracy metrics of the developed model (using double cross-validation); estimate the individual survival probabilities at a given timepoint; construct confidence intervals thereof (analytical or bootstrap); and visualize them graphically (pointwise or smoothed with splines). We compared these strategies through a simulation study covering scenarios with or without biomarker effects. We applied the strategies to a large randomized phase III clinical trial that evaluated the effect of adding trastuzumab to chemotherapy in 1574 early breast cancer patients, for which the expression of 462 genes was measured. Results: In our simulations, penalized regression models using the adaptive lasso estimated the survival probability of new patients with low bias and standard error; bootstrapped confidence intervals had empirical coverage probability close to the nominal level across very different scenarios. The double cross-validation performed on the training data set closely mimicked the predictive accuracy of the selected models in external validation data. We also propose a useful visual representation of the expected survival probabilities using splines. In the breast cancer trial, the adaptive lasso penalty selected a prediction model with 4

  3. Measurement of the ionization probability of the 1s sigma molecular orbital in half a collision at zero impact parameter

    International Nuclear Information System (INIS)

    Chemin, J.F.; Andriamonje, S.; Guezet, D.; Thibaud, J.P.; Aguer, P.; Hannachi, F.; Bruandet, J.F.

    1984-01-01

    We have measured, for the first time, the ionization probability P(1sσ) of the 1sσ molecular orbital on the way into a nuclear reaction (in half a collision at zero impact parameter) in the near-symmetric collision 58Ni + 54Fe at 230 MeV, which leads to a highly excited compound nucleus 112Xe that decays first by sequential emission of charged particles and then by sequential emission of gamma rays. The determination of P(1sσ) is based on the coincidence measurement between X-rays and γ-rays, and the Doppler shift method is used to discriminate between the ''atomic'' and ''nuclear'' X-rays.

  4. Optimum parameters in a model for tumour control probability, including interpatient heterogeneity: evaluation of the log-normal distribution

    International Nuclear Information System (INIS)

    Keall, P J; Webb, S

    2007-01-01

    The heterogeneity of human tumour radiation response is well known. Researchers have used the normal distribution to describe interpatient tumour radiosensitivity. However, many natural phenomena show a log-normal distribution. Log-normal distributions are common when mean values are low, variances are large and values cannot be negative. These conditions apply to radiosensitivity. The aim of this work was to evaluate the log-normal distribution to predict clinical tumour control probability (TCP) data and to compare the results with the homogeneous (δ-function with a single α-value) and normal distributions. The clinically derived TCP data for four tumour types (melanoma, breast, squamous cell carcinoma and nodes) were used to fit the TCP models. Three forms of interpatient tumour radiosensitivity were considered: the log-normal, normal and δ-function. The free parameters in the models were the radiosensitivity mean, standard deviation and clonogenic cell density. The evaluation metric was the deviance of the maximum likelihood estimation of the fit of the TCP calculated using the predicted parameters to the clinical data. We conclude that (1) the log-normal and normal distributions of interpatient tumour radiosensitivity heterogeneity more closely describe clinical TCP data than a single radiosensitivity value and (2) the log-normal distribution has some theoretical and practical advantages over the normal distribution. Further work is needed to test these models on higher quality clinical outcome datasets.
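
    The population-averaged TCP under interpatient heterogeneity can be sketched numerically: a Poisson TCP for each patient, averaged over a log-normal radiosensitivity distribution. All parameter values below are illustrative, not the fitted values of the paper:

    ```python
    # Population TCP: Poisson TCP exp(-N exp(-alpha*D)) averaged over a
    # log-normal alpha distribution. Dose, clonogen number and the alpha
    # parameters are invented for illustration.
    import numpy as np

    D = 60.0                          # total dose (Gy)
    N = 1e7                           # clonogen number
    mu, sigma = np.log(0.25), 0.3     # log-normal parameters for alpha (1/Gy)

    rng = np.random.default_rng(11)
    alpha = rng.lognormal(mu, sigma, size=100_000)
    tcp_individual = np.exp(-N * np.exp(-alpha * D))
    print(f"population TCP  ≈ {tcp_individual.mean():.3f}")
    # Homogeneous (delta-function) case at the median alpha for comparison:
    print(f"homogeneous TCP ≈ {np.exp(-N * np.exp(-np.exp(mu) * D)):.3f}")
    ```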

  5. Overview of input parameters for calculation of the probability of a brittle fracture of the reactor pressure vessel

    International Nuclear Information System (INIS)

    Horacek, L.

    1994-12-01

    The parameters are summarized for a calculation of the probability of brittle fracture of the WWER-440 reactor pressure vessel (RPV). The parameters were selected for 2 basic approaches, viz., one based on the Monte Carlo method and the other on the FORM and SORM methods (First and Second Order Reliability Methods). The approaches were represented by the US computer codes VISA-II and OCA-P and by the German ZERBERUS code. The philosophy of the deterministic and probabilistic aspects of the VISA-II code is outlined, and the differences between the US and Czech PWRs are discussed in this context. Briefly described is the partial approach to the evaluation of WWER-type RPVs based on the assessment of their resistance to brittle fracture by fracture mechanics tools and by using the FORM and SORM methods. Attention is paid to the input data for the WWER modification of the VISA-II code. The data are categorized with respect to randomness, i.e., to the stochastic or deterministic nature of their behavior. 18 tabs., 14 refs

  6. Quantifying Uranium Isotope Ratios Using Resonance Ionization Mass Spectrometry: The Influence of Laser Parameters on Relative Ionization Probability

    Energy Technology Data Exchange (ETDEWEB)

    Isselhardt, Brett H. [Univ. of California, Berkeley, CA (United States)

    2011-09-01

    Resonance Ionization Mass Spectrometry (RIMS) has been developed as a method to measure relative uranium isotope abundances. In this approach, RIMS is used as an element-selective ionization process to provide a distinction between uranium atoms and potential isobars without the aid of chemical purification and separation. We explore the laser parameters critical to the ionization process and their effects on the measured isotope ratio. Specifically, broad bandwidth lasers with automated feedback control of wavelength were applied to the measurement of 235U/238U ratios to decrease laser-induced isotopic fractionation. By broadening the bandwidth of the first laser in a 3-color, 3-photon ionization process from 1.8 GHz to about 10 GHz, the variation in sequential relative isotope abundance measurements decreased from >10% to less than 0.5%. This procedure was demonstrated for the direct interrogation of uranium oxide targets with essentially no sample preparation. A rate equation model for predicting the relative ionization probability has been developed to study the effect of variation in laser parameters on the measured isotope ratio. This work demonstrates that RIMS can be used for the robust measurement of uranium isotope ratios.

  7. How Long Do the Dead Survive on the Road? Carcass Persistence Probability and Implications for Road-Kill Monitoring Surveys

    OpenAIRE

    Santos, Sara; Carvalho, Filipe; Mira, António

    2011-01-01

    Background: Road mortality is probably the best-known and visible impact of roads upon wildlife. Although several factors influence road-kill counts, carcass persistence time is considered the most important determinant underlying underestimates of road mortality. The present study aims to describe and model carcass persistence variability on the road for different taxonomic groups under different environmental conditions throughout the year; and also to assess the eff...

  8. Survival probability of Baltic larval cod in relation to spatial overlap patterns with their prey obtained from drift model studies

    DEFF Research Database (Denmark)

    Hinrichsen, H.H.; Schmidt, J.O.; Petereit, C.

    2005-01-01

    Temporal mismatch between the occurrence of larvae and their prey potentially affects the spatial overlap and thus the contact rates between predator and prey. This might have important consequences for growth and survival. We performed a case study investigating the influence of circulation on predator-prey overlap, dependent on the hatching time of cod larvae. By performing model runs for the years 1979-1998, we investigated the intra- and interannual variability of potential spatial overlap between predator and prey. Assuming uniform prey distributions, we generally found the overlap to have decreased since...

  9. Colon cancer: association of histopathological parameters and patients' survival with clinical presentation.

    Science.gov (United States)

    Alexiusdottir, Kristin K; Snaebjornsson, Petur; Tryggvadottir, Laufey; Jonasson, Larus; Olafsdottir, Elinborg J; Björnsson, Einar Stefan; Möller, Pall Helgi; Jonasson, Jon G

    2013-10-01

    Available data correlating symptoms of colon cancer patients with the severity of the disease are very limited. In a population-based setting, we correlated information on symptoms of colon cancer patients with several pathological tumor parameters and survival. Information on all patients diagnosed with colon cancer in Iceland in 1995-2004 for this retrospective, population-based study was obtained from the Icelandic Cancer Registry. Information on symptoms of patients and blood hemoglobin was collected from patients' files. Pathological parameters were obtained from a previously performed standardized tumor review. A total of 768 patients entered this study; the median age was 73 years. Tumors in patients presenting at diagnosis with visible blood in stools were significantly more likely to be of lower grade, having pushing border, conspicuous peritumoral lymphocytic infiltration, and lower frequency of vessel invasion. Patients with abdominal pain and anemia were significantly more likely to have vessel invasion. Logistic regression showed that visible blood in stools was significantly associated with protective pathological factors (OR range 0.38-0.83), whereas abdominal pain and anemia were associated with adverse tumor characteristics and adverse outcome for patients. © 2013 APMIS Published by Blackwell Publishing Ltd.

  10. Determination of diffuseness parameter to estimate the survival probability of projectile using Woods-Saxon formula at intermediate beam energies

    International Nuclear Information System (INIS)

    Kumar, Rajiv; Goyal, Monika; Roshni; Singh, Pradeep; Kharab, Rajesh

    2017-01-01

    In the present work, the S-matrix has been evaluated using the simple Woods-Saxon formula as well as the realistic expression for a number of projectiles ranging from 26Ne to 76Ge at intermediate incident beam energies from 30 MeV/A to 300 MeV/A. The target is 197Au in every case. The realistic S-matrix is compared with that obtained using the simple Woods-Saxon formula. The motive of this comparison is to fix the value of the otherwise free diffuseness parameter Δ, so that the much more involved evaluation of the realistic S-matrix can be replaced by the simple Woods-Saxon formula.
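
    A common Woods-Saxon parameterisation of the survival probability involves a strong-absorption radius R and the diffuseness Δ. A minimal sketch, with illustrative rather than fitted values:

    ```python
    # Woods-Saxon form of the survival probability |S(b)|^2 as a function of
    # impact parameter b. R and delta below are illustrative placeholders,
    # not the values determined in the paper.
    import numpy as np

    def survival_probability(b, R=10.5, delta=0.6):
        """|S(b)|^2 = 1 / (1 + exp((R - b) / delta)): -> 0 inside R, -> 1 outside."""
        return 1.0 / (1.0 + np.exp((R - b) / delta))

    for bi in np.linspace(5.0, 16.0, 5):   # impact parameter grid (fm)
        print(f"b = {bi:5.2f} fm   |S|^2 = {survival_probability(bi):.3f}")
    ```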

  11. WE-H-BRA-08: A Monte Carlo Cell Nucleus Model for Assessing Cell Survival Probability Based On Particle Track Structure Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, B [Northwestern Memorial Hospital, Chicago, IL (United States); Georgia Institute of Technology, Atlanta, GA (Georgia); Wang, C [Georgia Institute of Technology, Atlanta, GA (Georgia)

    2016-06-15

    Purpose: To correlate the damage produced by particles of different types and qualities to cell survival on the basis of nanodosimetric analysis and advanced DNA structures in the cell nucleus. Methods: A Monte Carlo code was developed to simulate subnuclear DNA chromatin fibers (CFs) of 30 nm utilizing a mean-free-path approach common to radiation transport. The cell nucleus was modeled as a spherical region containing 6000 chromatin-dense domains (CDs) of 400 nm diameter, with additional CFs modeled in a sparser interchromatin region. The Geant4-DNA code was utilized to produce a particle track database representing various particles at different energies and dose quantities. These tracks were used to stochastically position the DNA structures based on their mean free path to interaction with CFs. Excitation and ionization events intersecting CFs were analyzed using the DBSCAN clustering algorithm to assess the likelihood of producing double-strand breaks (DSBs). Simulated DSBs were then assessed based on their proximity to one another for a probability of inducing cell death. Results: Variations in energy deposition to chromatin fibers match expectations based on differences in particle track structure. The quality of damage to CFs for different particle types indicates more severe damage by high-LET radiation than by low-LET radiation of identical particles. In addition, the model indicates more severe damage by protons than by alpha particles of the same LET, which is consistent with differences in their track structure. Cell survival curves have been produced showing the linear-quadratic behavior of sparsely ionizing radiation. Conclusion: Initial results indicate the feasibility of producing cell survival curves based on the Monte Carlo cell nucleus method. Accurate correlation between simulated DNA damage and cell survival on the basis of nanodosimetric analysis can provide insight into the biological responses to various radiation types. Current efforts are directed at producing cell

  12. Long-term survival in laparoscopic vs open resection for colorectal liver metastases: inverse probability of treatment weighting using propensity scores.

    Science.gov (United States)

    Lewin, Joel W; O'Rourke, Nicholas A; Chiow, Adrian K H; Bryant, Richard; Martin, Ian; Nathanson, Leslie K; Cavallucci, David J

    2016-02-01

    This study compares long-term outcomes between intention-to-treat laparoscopic and open approaches to colorectal liver metastases (CLM), using inverse probability of treatment weighting (IPTW) based on propensity scores to control for selection bias. Patients undergoing liver resection for CLM by 5 surgeons at 3 institutions from 2000 to early 2014 were analysed. IPTW weights based on propensity scores were generated and used to assess the marginal treatment effect of the laparoscopic approach via a weighted Cox proportional hazards model. A total of 298 operations were performed in 256 patients. Seven patients with planned two-stage resections were excluded, leaving 284 operations in 249 patients for analysis. After IPTW, the population was well balanced. With a median follow-up of 36 months, 5-year overall survival (OS) and recurrence-free survival (RFS) for the cohort were 59% and 38%. A total of 146 laparoscopic procedures were performed in 140 patients, with weighted 5-year OS and RFS of 54% and 36%, respectively. In the open group, 138 procedures were performed in 122 patients, with weighted 5-year OS and RFS of 63% and 38%, respectively. There was no significant difference between the two groups in terms of OS or RFS. In the Brisbane experience, after accounting for bias in treatment assignment, long-term survival after laparoscopic liver resection for CLM is equivalent to outcomes in open surgery. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
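
    The IPTW pipeline (propensity model, stabilised weights, weighted Cox fit) can be sketched in a few lines. Covariates and outcomes below are simulated placeholders, not the Brisbane data:

    ```python
    # IPTW sketch: propensity scores from logistic regression, stabilised
    # inverse-probability weights, then a weighted Cox model via lifelines.
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(9)
    n = 284
    df = pd.DataFrame({
        "age":          rng.normal(63, 10, n),
        "n_lesions":    rng.poisson(2, n) + 1,
        "laparoscopic": rng.integers(0, 2, n),
        "months":       rng.exponential(36, n),
        "died":         rng.integers(0, 2, n),
    })
    ps_model = LogisticRegression().fit(df[["age", "n_lesions"]], df["laparoscopic"])
    p_treat = ps_model.predict_proba(df[["age", "n_lesions"]])[:, 1]
    treated = df["laparoscopic"] == 1
    df["iptw"] = np.where(treated, treated.mean() / p_treat,
                          (1 - treated.mean()) / (1 - p_treat))  # stabilised

    cph = CoxPHFitter()
    cph.fit(df[["laparoscopic", "months", "died", "iptw"]],
            duration_col="months", event_col="died",
            weights_col="iptw", robust=True)
    cph.print_summary()   # marginal hazard ratio for the laparoscopic approach
    ```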

  13. First estimates of the probability of survival in a small-bodied, high-elevation frog (Boreal Chorus Frog, Pseudacris maculata), or how historical data can be useful

    Science.gov (United States)

    Muths, Erin L.; Scherer, R. D.; Amburgey, S. M.; Matthews, T.; Spencer, A. W.; Corn, P.S.

    2016-01-01

    In an era of shrinking budgets yet increasing demands for conservation, the value of existing (i.e., historical) data are elevated. Lengthy time series on common, or previously common, species are particularly valuable and may be available only through the use of historical information. We provide first estimates of the probability of survival and longevity (0.67–0.79 and 5–7 years, respectively) for a subalpine population of a small-bodied, ostensibly common amphibian, the Boreal Chorus Frog (Pseudacris maculata (Agassiz, 1850)), using historical data and contemporary, hypothesis-driven information–theoretic analyses. We also test a priori hypotheses about the effects of color morph (as suggested by early reports) and of drought (as suggested by recent climate predictions) on survival. Using robust mark–recapture models, we find some support for early hypotheses regarding the effect of color on survival, but we find no effect of drought. The congruence between early findings and our analyses highlights the usefulness of historical information in providing raw data for contemporary analyses and context for conservation and management decisions.

  14. Fracture strength and probability of survival of narrow and extra-narrow dental implants after fatigue testing: In vitro and in silico analysis.

    Science.gov (United States)

    Bordin, Dimorvan; Bergamo, Edmara T P; Fardin, Vinicius P; Coelho, Paulo G; Bonfante, Estevam A

    2017-07-01

    To assess the probability of survival (reliability) and failure modes of narrow implants with different diameters. For fatigue testing, 42 implants with the same macrogeometry and internal conical connection were divided, according to diameter, as follows: narrow (Ø3.3 × 10 mm) and extra-narrow (Ø2.9 × 10 mm) (21 per group). Identical abutments were torqued to the implants, and standardized maxillary incisor crowns were cemented and subjected to step-stress accelerated life testing (SSALT) in water. The use-level probability Weibull curves and the reliability for a mission of 50,000 and 100,000 cycles at 50, 100, 150, and 180 N were calculated. For the finite element analysis (FEA), two virtual models, simulating the samples tested in fatigue, were constructed. Loads of 50 N and 100 N were applied 30° off-axis at the crown, and the von Mises stress was calculated for implant and abutment. The beta (β) values were 0.67 for narrow and 1.32 for extra-narrow implants, indicating that failure rates did not increase with fatigue in the former, but more likely were associated with damage accumulation and wear-out failures in the latter. Both groups showed high reliability (up to 97.5%) at 50 and 100 N. A decreased reliability was observed for both groups at 150 and 180 N (ranging from 0 to 82.3%), but no significant difference was observed between groups. Failure predominantly involved abutment fracture in both groups. In the FEA at a 50 N load, the Ø3.3 mm model showed higher von Mises stress in the abutment (7.75%) and implant (2%) than the Ø2.9 mm model. There was no significant difference between narrow and extra-narrow implants regarding probability of survival. The failure mode was similar for both groups, restricted to abutment fracture. Copyright © 2017 Elsevier Ltd. All rights reserved.
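
    The mission reliabilities quoted above follow from the two-parameter Weibull model R(t) = exp[-(t/η)^β]. The sketch below reuses the reported β values but invents the characteristic lives η, which in the study come from the SSALT use-level fits:

    ```python
    # Weibull mission reliability R(t) = exp(-(t/eta)^beta). The beta values
    # echo the abstract; the eta values are invented placeholders.
    import numpy as np

    def weibull_reliability(t, beta, eta):
        return np.exp(-(t / eta) ** beta)

    missions = np.array([50_000, 100_000])          # cycles
    for label, beta, eta in [("narrow (Ø3.3 mm)", 0.67, 5e6),
                             ("extra-narrow (Ø2.9 mm)", 1.32, 1.2e6)]:
        rel = weibull_reliability(missions, beta, eta)
        print(label, [f"{r:.3f}" for r in rel])
    ```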

  15. Impact-parameter dependence of the total probability for electromagnetic electron-positron pair production in relativistic heavy-ion collisions

    International Nuclear Information System (INIS)

    Hencken, K.; Trautmann, D.; Baur, G.

    1995-01-01

    We calculate the impact-parameter-dependent total probability P_total(b) for the electromagnetic production of electron-positron pairs in relativistic heavy-ion collisions in lowest order. We study especially impact parameters smaller than the Compton wavelength of the electron, where the equivalent-photon approximation cannot be used. Calculations with and without a form factor for the heavy ions are done; the influence is found to be small. The lowest-order results are found to violate unitarity and are used for the calculation of multiple-pair production probabilities with the help of the approximate Poisson distribution already found in earlier publications.
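
    The Poisson ansatz mentioned in the last sentence can be stated in one line: the lowest-order P(b) is reinterpreted as the Poisson mean, giving N-pair probabilities P_N(b) = P(b)^N e^(-P(b)) / N!. A sketch with an invented value of P(b):

    ```python
    # Poisson unitarisation of a lowest-order pair-production "probability":
    # P(b) > 1 is reinterpreted as the mean number of pairs at impact
    # parameter b. The value 1.3 is a hypothetical example.
    from math import exp, factorial

    def n_pair_probability(mean_pairs, n):
        return mean_pairs ** n * exp(-mean_pairs) / factorial(n)

    p_lowest_order = 1.3   # hypothetical lowest-order result at small b
    for n in range(4):
        print(f"P_{n}(b) = {n_pair_probability(p_lowest_order, n):.3f}")
    ```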

  16. ASSOCIATION BETWEEN PARAMETERS OF MINERAL BONE METABOLISM AND SURVIVAL IN PATIENTS UNDERGOING CHRONIC HEMODIALYSIS

    Directory of Open Access Journals (Sweden)

    Branislav Apostolović

    2015-12-01

    Besides the traditional risk factors for cardiovascular disease, hemodialysis patients are exposed to metabolic factors, such as malnutrition, microinflammation and oxidative stress, along with mineral bone disorder. The aim of this study was to determine three-year survival in patients undergoing chronic hemodialysis and to analyse its correlation with parameters of mineral bone metabolism. During the three-year follow-up, 186 patients were included, of which 115 were men (61.83%) and 71 women, with a mean age of 61.47±12.42 years. The exact date and the direct cause of death were recorded, and mineral bone metabolism parameters were analysed. Out of 67 deceased patients, 33 (49.25%) died from cardiovascular causes. Out of the total number of deaths in our study, only 11.9% of patients had target PTH values. Patients with PTH>600 pg/ml are exposed to an increased risk of overall mortality (RR=0.48, 95% CI 0.24-0.95, p=0.04), but also of cardiovascular mortality (RR=0.34, 95% CI 0.12-0.93, p=0.034), compared to patients with normal serum PTH. These patients have statistically significantly higher serum phosphorus in comparison with patients with normal PTH levels (1.72±0.42 vs. 1.39±0.36, p=0.032). Phosphorus above 2.10 mmol/L increases the relative risk of overall mortality by 60% (RR=0.59, 95% CI 0.35-0.89, p=0.049). In our study, a 2-fold higher risk of all-cause mortality (RR=2.00, 95% CI 0.92-4.36, p=0.048), and even a 3-fold higher risk of cardiovascular mortality (RR=3.03, 95% CI 0.71-1.29, p=0.039), were found in patients with CaxP levels above 4.50 mmol²/L². The three-year mortality rate of patients undergoing hemodialysis was 36.02%, while half of the patients died from cardiovascular disease. Patients with hyperparathyroidism and an elevated calcium-phosphorus product are at the highest risk, both for all-cause and cardiovascular mortality. Patients with hyperphosphatemia are at higher risk for all-cause mortality.

  17. Genetic parameters and factors influencing survival to 24 hrs after birth in Danish meat sheep breeds

    DEFF Research Database (Denmark)

    Maxa, J; Sharifi, A R; Pedersen, J

    2009-01-01

    In this study, influential factors and (co)variance components for survival to 24 h after birth were determined and estimated for Texel, Shropshire, and Oxford Down, the most common sheep breeds in Denmark. Data from 1992 to 2006 containing 138,813 survival records were extracted from the sheep recording database at the Danish Agricultural Advisory Service. Estimation of (co)variance components was carried out using univariate animal models, applying logistic link functions. The logistic functions were also used for estimation of fixed effects. Both direct and maternal additive genetic effects, as well as common litter effects, were included in the models. The mean survival to 24 h after birth was 92.5, 91.7, and 88.5% for Texel, Shropshire, and Oxford Down, respectively. There was a curvilinear relationship between survival to 24 h after birth and birth weight, with survival lower for light...

  18. Análisis de supervivencia en presencia de riesgos competitivos: estimadores de la probabilidad de suceso / Survival analysis with competing risks: estimating failure probability

    Directory of Open Access Journals (Sweden)

    Javier Llorca

    2004-10-01

    Objective: To show the impact of competing risks of death on survival analysis. Method: We provide an example of survival time without chronic rejection after heart transplantation, where death before rejection acts as a competing risk. Using a computer simulation, we compare the Kaplan-Meier estimator and the multiple decrement model. Results: The Kaplan-Meier method overestimated the probability of rejection. Next, we illustrate the use of the multiple decrement model to analyze secondary end points (in our example: death after rejection). Finally, we discuss the Kaplan-Meier assumptions and why they fail in the presence of competing risks. Conclusions: Survival analysis should be adjusted for competing risks of death to avoid the overestimation of the risk of rejection produced with the Kaplan-Meier method.
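
    The overestimation can be reproduced with a small simulation: treating the competing death as censoring (as Kaplan-Meier does) inflates the apparent rejection risk relative to the true cumulative incidence. The rates below are arbitrary illustrative values:

    ```python
    # Simulate latent rejection and death times; compare 1 - KM (death
    # treated as censoring) with the true cumulative incidence of rejection.
    import numpy as np
    from lifelines import KaplanMeierFitter

    rng = np.random.default_rng(13)
    n = 5_000
    t_reject = rng.exponential(1 / 0.10, n)    # latent time to rejection
    t_death = rng.exponential(1 / 0.15, n)     # latent competing time to death
    time = np.minimum(t_reject, t_death)
    rejected_first = t_reject < t_death

    kmf = KaplanMeierFitter().fit(time, event_observed=rejected_first)
    t0 = 5.0
    cif_true = np.mean((time <= t0) & rejected_first)  # no censoring: empirical CIF
    print(f"1 - KM at t = 5:            {1 - kmf.predict(t0):.3f}")
    print(f"true cumulative incidence:  {cif_true:.3f}")
    ```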

  19. Prediction uncertainty assessment of a systems biology model requires a sample of the full probability distribution of its parameters

    Directory of Open Access Journals (Sweden)

    Simon van Mourik

    2014-06-01

    Full Text Available Multi-parameter models in systems biology are typically ‘sloppy’: some parameters or combinations of parameters may be hard to estimate from data, whereas others are not. One might expect that parameter uncertainty automatically leads to uncertain predictions, but this is not the case. We illustrate this by showing that the prediction uncertainty of each of six sloppy models varies enormously among different predictions. Statistical approximations of parameter uncertainty may lead to dramatic errors in prediction uncertainty estimation. We argue that prediction uncertainty assessment must therefore be performed on a per-prediction basis using a full computational uncertainty analysis. In practice this is feasible by providing a model with a sample or ensemble representing the distribution of its parameters. Within a Bayesian framework, such a sample may be generated by a Markov Chain Monte Carlo (MCMC) algorithm that infers the parameter distribution based on experimental data. Matlab code for generating the sample (with the Differential Evolution Markov Chain sampler) and the subsequent uncertainty analysis using such a sample is supplied as Supplemental Information.
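
    As a hedged illustration of the workflow the abstract describes (posterior sampling, then per-prediction uncertainty), the sketch below uses the emcee ensemble sampler rather than the authors' Differential Evolution Markov Chain code, and an invented two-parameter exponential-decay model standing in for a systems biology model:

```python
# Hedged sketch: propagate a posterior parameter ensemble to per-prediction
# uncertainty. Model, data, and prior are made-up stand-ins, not the paper's.
import numpy as np
import emcee

t_obs = np.linspace(0, 10, 20)
y_obs = 2.0 * np.exp(-0.3 * t_obs) + np.random.normal(0, 0.1, t_obs.size)

def model(theta, t):
    a, k = theta
    return a * np.exp(-k * t)

def log_prob(theta):
    if np.any(theta <= 0) or np.any(theta > 10):   # flat prior on (0, 10]
        return -np.inf
    return -0.5 * np.sum((model(theta, t_obs) - y_obs) ** 2 / 0.1 ** 2)

ndim, nwalkers = 2, 16
p0 = np.abs(np.random.normal([2.0, 0.3], 0.01, (nwalkers, ndim)))
sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob)
sampler.run_mcmc(p0, 2000, progress=False)
ensemble = sampler.get_chain(discard=500, flat=True)

# Per-prediction uncertainty: push the whole ensemble through each prediction.
pred_t5 = np.array([model(th, 5.0) for th in ensemble])
print("prediction at t=5:", np.percentile(pred_t5, [2.5, 50, 97.5]))
```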

  20. Assignment of probability distributions for parameters in the 1996 performance assessment for the Waste Isolation Pilot Plant. Part 1: description of process

    International Nuclear Information System (INIS)

    Rechard, Rob P.; Tierney, Martin S.

    2005-01-01

    A managed process was used to consistently and traceably develop probability distributions for parameters representing epistemic uncertainty in four preliminary and the final 1996 performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP). The key to the success of the process was the use of a three-member team consisting of a Parameter Task Leader, PA Analyst, and Subject Matter Expert. This team, in turn, relied upon a series of guidelines for selecting distribution types. The primary function of the guidelines was not to constrain the actual process of developing a parameter distribution but rather to establish a series of well-defined steps where recognized methods would be consistently applied to all parameters. An important guideline was to use a small set of distributions satisfying the maximum entropy formalism. Another important guideline was the consistent use of the log transform for parameters with large ranges (i.e. maximum/minimum > 10³). A parameter development team assigned 67 probability density functions (PDFs) in the 1989 PA and 236 PDFs in the 1996 PA using these and other guidelines described
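
    A minimal sketch of the log-transform guideline, with invented parameter ranges (this is not the WIPP code): a parameter whose range spans more than three orders of magnitude is sampled log-uniformly, otherwise uniformly, both being maximum-entropy choices under the corresponding constraints.

```python
# Illustrative sketch: applying the log-transform guideline when a parameter's
# range spans more than three orders of magnitude. Ranges below are invented.
import numpy as np

def assign_distribution(pmin, pmax, rng=np.random.default_rng(0), n=10000):
    """Sample a maximum-entropy-style distribution for a bounded parameter:
    uniform for modest ranges, log-uniform when max/min > 1e3."""
    if pmin > 0 and pmax / pmin > 1e3:
        return np.exp(rng.uniform(np.log(pmin), np.log(pmax), n))  # log-uniform
    return rng.uniform(pmin, pmax, n)                              # uniform

permeability = assign_distribution(1e-21, 1e-12)   # ratio 1e9 -> log-uniform
porosity = assign_distribution(0.05, 0.30)         # ratio 6   -> uniform
print(permeability.min(), permeability.max(), porosity.mean())
```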

  1. The influence of printing parameters on cell survival rate and printability in microextrusion-based 3D cell printing technology.

    Science.gov (United States)

    Zhao, Yu; Li, Yang; Mao, Shuangshuang; Sun, Wei; Yao, Rui

    2015-11-02

    Three-dimensional (3D) cell printing technology has provided a versatile methodology to fabricate cell-laden tissue-like constructs and in vitro tissue/pathological models for tissue engineering, drug testing and screening applications. However, it remains a challenge to print bioinks of high viscoelasticity so as to achieve a long-term stable structure while maintaining a high cell survival rate after printing. In this study, we systematically investigated the influence of 3D cell printing parameters, i.e. composition and concentration of bioink, holding temperature and holding time, on the printability and cell survival rate in microextrusion-based 3D cell printing technology. Rheological measurements were utilized to characterize the viscoelasticity of gelatin-based bioinks. Results demonstrated that bioink viscoelasticity increased when increasing the bioink concentration, increasing holding time and decreasing holding temperature below the gelation temperature. A decline in cell survival rate after the 3D cell printing process was observed when the viscoelasticity of the gelatin-based bioinks was increased. However, different process parameter combinations could result in similar rheological characteristics and thus showed similar cell survival rates after the 3D bioprinting process. On the other hand, bioink viscoelasticity should also reach a certain point to ensure good printability and shape fidelity. Finally, we proposed a protocol for 3D bioprinting of temperature-sensitive gelatin-based hydrogel bioinks with both high cell survival rate and good printability. This research should be useful for biofabrication researchers in adjusting 3D bioprinting process parameters quickly, and as a referable template for designing new bioinks.

  2. Effect of inactive yeast cell wall on growth performance, survival rate and immune parameters in Pacific White Shrimp (Litopenaeus vannamei

    Directory of Open Access Journals (Sweden)

    Rutchanee Chotikachinda

    2008-10-01

    Full Text Available The effects of dietary inactive yeast cell wall on growth performance, survival rate, and immune parameters in Pacific white shrimp (Litopenaeus vannamei) were investigated. Three dosages of inactive yeast cell wall (0, 1, and 2 g kg-1) were tested in three replicate groups of juvenile shrimp with an average initial weight of 7.15±0.05 g for four weeks. There were no significant differences in final weight, survival rate, specific growth rate, feed conversion ratio, feed intake, protein efficiency ratio, or apparent net protein utilization among treatments. However, different levels of inactive yeast cell wall had an effect on certain immune parameters (p<0.05). Total hemocyte count, granular hemocyte count, and bacterial clearance were better in shrimp fed diets supplemented with 1 and 2 g kg-1 inactive yeast cell wall as compared with the control group.

  3. The design and analysis of salmonid tagging studies in the Columbia basin. Volume 8: A new model for estimating survival probabilities and residualization from a release-recapture study of fall chinook salmon (Oncorhynchus tschawytscha) smolts in the Snake River

    International Nuclear Information System (INIS)

    Lowther, A.B.; Skalski, J.

    1997-09-01

    Standard release-recapture analysis using Cormack-Jolly-Seber (CJS) models to estimate survival probabilities between hydroelectric facilities for Snake River fall chinook salmon (Oncorhynchus tschawytscha) ignores the possibility of individual fish residualizing and completing their migration in the year following tagging. These models do not utilize the available capture-history data from this second year and thus produce negatively biased estimates of survival probabilities. A new multinomial likelihood model was developed that results in biologically relevant, unbiased estimates of survival probabilities using the full two years of capture-history data. This model was applied to 1995 Snake River fall chinook hatchery releases to estimate the true survival probability from one of three upstream release points (Asotin, Billy Creek, and Pittsburgh Landing) to Lower Granite Dam. In the data analyzed here, residualization is not a common physiological response, and thus the use of CJS models did not produce appreciably different results from the true survival probability obtained using the new multinomial likelihood model.
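
    To make the multinomial structure concrete, here is a toy sketch (not the report's two-year residualization model) fitting a single-release, two-detection-site likelihood by maximum likelihood; the capture-history counts are invented, and at the last site only the product of survival and detection probability is estimable:

```python
# Toy sketch of a release-recapture multinomial likelihood: one release group,
# two downstream detection sites, invented counts.
import numpy as np
from scipy.optimize import minimize

# counts of capture histories [11, 10, 01, 00] for 1000 released smolts
counts = np.array([280, 120, 170, 430])

def neg_log_lik(params):
    phi1, p1, lam = params        # lam = phi2 * p2, confounded at the last site
    if not all(0 < x < 1 for x in params):
        return np.inf
    pr = np.array([
        phi1 * p1 * lam,          # detected at both sites
        phi1 * p1 * (1 - lam),    # detected at site 1 only
        phi1 * (1 - p1) * lam,    # missed at site 1, detected at site 2
    ])
    pr = np.append(pr, 1 - pr.sum())  # never detected again
    return -np.sum(counts * np.log(pr))

fit = minimize(neg_log_lik, x0=[0.7, 0.6, 0.5], method="Nelder-Mead")
phi1, p1, lam = fit.x
print(f"survival to site 1: {phi1:.3f}, detection at site 1: {p1:.3f}")
```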

  4. The inflammatory milieu within the pancreatic cancer microenvironment correlates with clinicopathologic parameters, chemoresistance and survival

    International Nuclear Information System (INIS)

    Delitto, Daniel; Black, Brian S.; Sorenson, Heather L.; Knowlton, Andrea E.; Thomas, Ryan M.; Sarosi, George A.; Moldawer, Lyle L.; Behrns, Kevin E.; Liu, Chen; George, Thomas J.; Trevino, Jose G.; Wallet, Shannon M.; Hughes, Steven J.

    2015-01-01

    The tumor microenvironment impacts pancreatic cancer (PC) development, progression and metastasis. How intratumoral inflammatory mediators modulate this biology remains poorly understood. We hypothesized that the inflammatory milieu within the PC microenvironment would correlate with clinicopathologic findings and survival. Pancreatic specimens from normal pancreas (n = 6), chronic pancreatitis (n = 9) and pancreatic adenocarcinoma (n = 36) were homogenized immediately upon resection. Homogenates were subjected to multiplex analysis of 41 inflammatory mediators. Twenty-three mediators were significantly elevated in adenocarcinoma specimens compared to nonmalignant controls. Increased intratumoral IL-8 concentrations were associated with larger tumors (P = .045) and poor differentiation (P = .038); the administration of neoadjuvant chemotherapy was associated with reduced IL-8 concentrations (P = .003). Neoadjuvant therapy was also associated with elevated concentrations of Flt-3L (P = .005). Elevated levels of the pro-inflammatory cytokines IL-1β (P = .017) and TNFα (P = .033) were associated with a poor histopathologic response to neoadjuvant therapy. Elevated concentrations of G-CSF (P = .016) and PDGF-AA (P = .012) correlated with reduced overall survival. Conversely, elevated concentrations of FGF-2 (P = .038), TNFα (P = .031) and MIP-1α (P = .036) were associated with prolonged survival. The pancreatic cancer microenvironment harbors a unique inflammatory milieu with potential diagnostic and prognostic value.

  5. On the surviving fraction in irradiated multicellular tumour spheroids: calculation of overall radiosensitivity parameters, influence of hypoxia and volume effects

    International Nuclear Information System (INIS)

    Horas, Jorge A; Olguin, Osvaldo R; Rizzotto, Marcos G

    2005-01-01

    We model the heterogeneous response to radiation of multicellular tumour spheroids assuming position- and volume-dependent radiosensitivity. We propose a method to calculate the overall radiosensitivity parameters to obtain the surviving fraction of tumours. A mathematical model of a spherical tumour with a hypoxic core and a viable rim, a caricature of a real tumour, is constructed. The model is embedded in a two-compartment linear-quadratic (LQ) model, assuming a mixed bivariate Gaussian distribution to obtain the radiosensitivity parameters. Ergodicity, i.e., the equivalence between ensemble and volumetric averages, is used to obtain the overall radiosensitivities for the two compartments. We obtain expressions for the overall radiosensitivity parameters resulting from the use of both a linear and a nonlinear dependence of the local radiosensitivity on position. The model's results are compared with experimental surviving fraction (SF) data for multicellular spheroids of different sizes. We make one fit using only the smallest-spheroid data and are able to predict the SF for the larger spheroids. These predictions are acceptable, particularly when bounded sensitivities are used. We conclude by noting the importance of taking into account the contribution of clonogenic hypoxic cells to radiosensitivity and the convenience of using bounded local sensitivities to predict overall radiosensitivity parameters
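
    The two-compartment LQ construction can be sketched as follows; the hypoxic fraction, the oxygen-enhancement-style scaling and the radiosensitivity values are invented stand-ins, not the paper's fitted parameters:

```python
# Hedged sketch of a two-compartment linear-quadratic surviving fraction;
# compartment fraction and radiosensitivities below are invented.
import numpy as np

def lq_sf(dose, alpha, beta):
    """Single-compartment LQ model: SF(D) = exp(-alpha*D - beta*D^2)."""
    return np.exp(-alpha * dose - beta * dose ** 2)

def spheroid_sf(dose, f_hypoxic=0.3, oer=2.5, alpha=0.35, beta=0.035):
    """Overall SF for a spheroid with a hypoxic core (reduced sensitivity,
    modeled via an oxygen-enhancement-ratio-like factor) and a viable rim."""
    sf_oxic = lq_sf(dose, alpha, beta)
    sf_hyp = lq_sf(dose, alpha / oer, beta / oer ** 2)
    return (1 - f_hypoxic) * sf_oxic + f_hypoxic * sf_hyp

for d in (2, 4, 8):
    print(f"SF({d} Gy) = {spheroid_sf(d):.3e}")
```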

  6. Post-treatment changes of tumour perfusion parameters can help to predict survival in patients with high-grade astrocytoma

    Energy Technology Data Exchange (ETDEWEB)

    Sanz-Requena, Roberto; Marti-Bonmati, Luis [Hospital Quironsalud Valencia, Radiology Department, Valencia (Spain); Hospital Universitari i Politecnic La Fe, Grupo de Investigacion Biomedica en Imagen, Valencia (Spain); Revert-Ventura, Antonio J.; Salame-Gamarra, Fares [Hospital de Manises, Radiology Department, Manises (Spain); Garcia-Marti, Gracian [Hospital Quironsalud Valencia, Radiology Department, Valencia (Spain); Hospital Universitari i Politecnic La Fe, Grupo de Investigacion Biomedica en Imagen, Valencia (Spain); CIBER-SAM, Instituto de Salud Carlos III, Madrid (Spain); Perez-Girbes, Alexandre [Hospital Universitari i Politecnic La Fe, Grupo de Investigacion Biomedica en Imagen, Valencia (Spain); Molla-Olmos, Enrique [Hospital La Ribera, Radiology Department, Alzira (Spain)

    2017-08-15

    Vascular characteristics of tumour and peritumoral volumes of high-grade gliomas change with treatment. This work evaluates the variations of T2*-weighted perfusion parameters as overall survival (OS) predictors. Forty-five patients with histologically confirmed high-grade astrocytoma (8 grade III and 37 grade IV) were included. All patients underwent pre- and post-treatment T2*-weighted contrast-enhanced magnetic resonance (MR) imaging. Tumour, peritumoral and control volumes were segmented. Relative variations of cerebral blood flow (CBF), cerebral blood volume (CBV), mean transit time (MTT), Ktrans-T2*, kep-T2*, ve-T2* and vp-T2* were calculated. Differences regarding tumour grade and surgical resection extension were evaluated with ANOVA tests. For each parameter, two groups were defined by non-supervised clusterisation. Survival analyses were performed on these groups. For the tumour region, an increase or stagnation of the 90th-percentile CBV was associated with shorter survival, while a decrease was related to longer survival (393 ± 189 vs 594 ± 294 days; log-rank p = 0.019; Cox hazard ratio, 2.31; 95% confidence interval [CI], 1.12-4.74). Ktrans-T2* showed similar results (414 ± 177 vs 553 ± 312 days; log-rank p = 0.037; hazard ratio, 2.19; 95% CI, 1.03-4.65). The peritumoral area values showed no relationship with OS. Post-treatment variations of the highest CBV and Ktrans-T2* values in the tumour volume are predictive factors of OS in patients with high-grade gliomas. (orig.)

  7. Accurate potential energy curves, spectroscopic parameters, transition dipole moments, and transition probabilities of 21 low-lying states of the CO+ cation

    Science.gov (United States)

    Xing, Wei; Shi, Deheng; Zhang, Jicai; Sun, Jinfeng; Zhu, Zunlue

    2018-05-01

    This paper calculates the potential energy curves of 21 Λ-S and 42 Ω states, which arise from the first two dissociation asymptotes of the CO+ cation. The calculations are conducted using the complete active space self-consistent field method, which is followed by the valence internally contracted multireference configuration interaction approach with the Davidson correction. To improve the reliability and accuracy of the potential energy curves, core-valence correlation and scalar relativistic corrections, as well as the extrapolation of potential energies to the complete basis set limit are taken into account. The spectroscopic parameters and vibrational levels are determined. The spin-orbit coupling effect on the spectroscopic parameters and vibrational levels is evaluated. To better study the transition probabilities, the transition dipole moments are computed. The Franck-Condon factors and Einstein coefficients of some emissions are calculated. The radiative lifetimes are determined for a number of vibrational levels of several states. The transitions between different Λ-S states are evaluated. Spectroscopic routines for observing these states are proposed. The spectroscopic parameters, vibrational levels, transition dipole moments, and transition probabilities reported in this paper can be considered to be very reliable and can be used as guidelines for detecting these states in an appropriate spectroscopy experiment, especially for the states that were very difficult to observe or were not detected in previous experiments.

  8. Echocardiographic parameters and survival in Chagas heart disease with severe systolic dysfunction.

    Science.gov (United States)

    Rassi, Daniela do Carmo; Vieira, Marcelo Luiz Campos; Arruda, Ana Lúcia Martins; Hotta, Viviane Tiemi; Furtado, Rogério Gomes; Rassi, Danilo Teixeira; Rassi, Salvador

    2014-03-01

    Echocardiography provides important information on the cardiac evaluation of patients with heart failure. The identification of echocardiographic parameters in severe Chagas heart disease would help implement treatment and assess prognosis. To correlate echocardiographic parameters with the endpoint cardiovascular mortality in patients with ejection fraction < 35%. Study with retrospective analysis of pre-specified echocardiographic parameters prospectively collected from 60 patients included in the Multicenter Randomized Trial of Cell Therapy in Patients with Heart Diseases (Estudo Multicêntrico Randomizado de Terapia Celular em Cardiopatias) - Chagas heart disease arm. The following parameters were collected: left ventricular systolic and diastolic diameters and volumes; ejection fraction; left atrial diameter; left atrial volume; indexed left atrial volume; systolic pulmonary artery pressure; integral of the aortic flow velocity; myocardial performance index; rate of increase of left ventricular pressure; isovolumic relaxation time; E, A, Em, Am and Sm wave velocities; E wave deceleration time; E/A and E/Em ratios; and mitral regurgitation. In the mean 24.18-month follow-up, 27 patients died. The mean ejection fraction was 26.6 ± 5.34%. In the multivariate analysis, the parameters ejection fraction (HR = 1.114; p = 0.3704), indexed left atrial volume (HR = 1.033; p < 0.0001) and E/Em ratio (HR = 0.95; p = 0.1261) were excluded. The indexed left atrial volume was an independent predictor in relation to the endpoint, and values > 70.71 mL/m² were associated with a significant increase in mortality (log rank p < 0.0001). The indexed left atrial volume was the only independent predictor of mortality in this population of Chagasic patients with severe systolic dysfunction.

  9. Echocardiographic Parameters and Survival in Chagas Heart Disease with Severe Systolic Dysfunction

    International Nuclear Information System (INIS)

    Rassi, Daniela do Carmo; Vieira, Marcelo Luiz Campos; Arruda, Ana Lúcia Martins; Hotta, Viviane Tiemi; Furtado, Rogério Gomes; Rassi, Danilo Teixeira; Rassi, Salvador

    2014-01-01

    Echocardiography provides important information on the cardiac evaluation of patients with heart failure. The identification of echocardiographic parameters in severe Chagas heart disease would help implement treatment and assess prognosis. To correlate echocardiographic parameters with the endpoint cardiovascular mortality in patients with ejection fraction < 35%. Study with retrospective analysis of pre-specified echocardiographic parameters prospectively collected from 60 patients included in the Multicenter Randomized Trial of Cell Therapy in Patients with Heart Diseases (Estudo Multicêntrico Randomizado de Terapia Celular em Cardiopatias) - Chagas heart disease arm. The following parameters were collected: left ventricular systolic and diastolic diameters and volumes; ejection fraction; left atrial diameter; left atrial volume; indexed left atrial volume; systolic pulmonary artery pressure; integral of the aortic flow velocity; myocardial performance index; rate of increase of left ventricular pressure; isovolumic relaxation time; E, A, Em, Am and Sm wave velocities; E wave deceleration time; E/A and E/Em ratios; and mitral regurgitation. In the mean 24.18-month follow-up, 27 patients died. The mean ejection fraction was 26.6 ± 5.34%. In the multivariate analysis, the parameters ejection fraction (HR = 1.114; p = 0.3704), indexed left atrial volume (HR = 1.033; p < 0.0001) and E/Em ratio (HR = 0.95; p = 0.1261) were excluded. The indexed left atrial volume was an independent predictor in relation to the endpoint, and values > 70.71 mL/m² were associated with a significant increase in mortality (log rank p < 0.0001). The indexed left atrial volume was the only independent predictor of mortality in this population of Chagasic patients with severe systolic dysfunction.

  10. Echocardiographic Parameters and Survival in Chagas Heart Disease with Severe Systolic Dysfunction

    Energy Technology Data Exchange (ETDEWEB)

    Rassi, Daniela do Carmo, E-mail: dani.rassi@hotmail.com [Faculdade de Medicina e Hospital das Clínicas da Universidade Federal de Goiás (UFG), Goiânia, GO (Brazil); Vieira, Marcelo Luiz Campos [Instituto do Coração da Faculdade de Medicina da Universidade de São Paulo (USP), São Paulo, SP (Brazil); Arruda, Ana Lúcia Martins [Instituto de Radiologia da Faculdade de Medicina da Universidade de São Paulo (USP), São Paulo, SP (Brazil); Hotta, Viviane Tiemi [Instituto do Coração da Faculdade de Medicina da Universidade de São Paulo (USP), São Paulo, SP (Brazil); Furtado, Rogério Gomes; Rassi, Danilo Teixeira; Rassi, Salvador [Faculdade de Medicina e Hospital das Clínicas da Universidade Federal de Goiás (UFG), Goiânia, GO (Brazil)

    2014-03-15

    Echocardiography provides important information on the cardiac evaluation of patients with heart failure. The identification of echocardiographic parameters in severe Chagas heart disease would help implement treatment and assess prognosis. To correlate echocardiographic parameters with the endpoint cardiovascular mortality in patients with ejection fraction < 35%. Study with retrospective analysis of pre-specified echocardiographic parameters prospectively collected from 60 patients included in the Multicenter Randomized Trial of Cell Therapy in Patients with Heart Diseases (Estudo Multicêntrico Randomizado de Terapia Celular em Cardiopatias) - Chagas heart disease arm. The following parameters were collected: left ventricular systolic and diastolic diameters and volumes; ejection fraction; left atrial diameter; left atrial volume; indexed left atrial volume; systolic pulmonary artery pressure; integral of the aortic flow velocity; myocardial performance index; rate of increase of left ventricular pressure; isovolumic relaxation time; E, A, Em, Am and Sm wave velocities; E wave deceleration time; E/A and E/Em ratios; and mitral regurgitation. In the mean 24.18-month follow-up, 27 patients died. The mean ejection fraction was 26.6 ± 5.34%. In the multivariate analysis, the parameters ejection fraction (HR = 1.114; p = 0.3704), indexed left atrial volume (HR = 1.033; p < 0.0001) and E/Em ratio (HR = 0.95; p = 0.1261) were excluded. The indexed left atrial volume was an independent predictor in relation to the endpoint, and values > 70.71 mL/m² were associated with a significant increase in mortality (log rank p < 0.0001). The indexed left atrial volume was the only independent predictor of mortality in this population of Chagasic patients with severe systolic dysfunction.

  11. The association of 18F-FDG PET/CT parameters with survival in malignant pleural mesothelioma

    Energy Technology Data Exchange (ETDEWEB)

    Klabatsa, Astero; Lang-Lazdunski, Loic [Guys and St Thomas' NHS Foundation Trust, Department of Thoracic Oncology, London (United Kingdom); Chicklore, Sugama; Barrington, Sally F.; Goh, Vicky [Kings College London, Division of Imaging Sciences and Biomedical Engineering, London (United Kingdom); Cook, Gary J.R. [Kings College London, Division of Imaging Sciences and Biomedical Engineering, London (United Kingdom); Kings College London, Clinical PET Centre, Division of Imaging Sciences and Biomedical Engineering, St Thomas' Hospital, London (United Kingdom)

    2014-02-15

    Malignant pleural mesothelioma (MPM) is a disease with poor prognosis despite multimodal therapy but there is variation in survival between patients. Prognostic information is therefore potentially valuable in managing patients, particularly in the context of clinical trials where patients could be stratified according to risk. Therefore we have evaluated the prognostic ability of parameters derived from baseline 2-[18F]fluoro-2-deoxy-D-glucose positron emission tomography/computed tomography (18F-FDG PET/CT). In order to determine the relationships between metabolic activity and prognosis we reviewed all 18F-FDG PET/CT scans used for pretreatment staging of MPM patients in our institution between January 2005 and December 2011 (n = 60) and measured standardised uptake values (SUV) including mean, maximum and peak values, metabolic tumour volume (MTV) and total lesion glycolysis (TLG). Overall survival (OS) or time to last censor was recorded, as well as histological subtypes. Median follow-up was 12.7 months (1.9-60.9) and median OS was 14.1 months (1.9-54.9). By univariable analysis histological subtype (p = 0.013), TLG (p = 0.024) and MTV (p = 0.038) were significantly associated with OS and SUVmax was borderline (p = 0.051). On multivariable analysis histological subtype and TLG were associated with OS but at borderline statistical significance (p = 0.060 and 0.058, respectively). No statistically significant differences in any PET parameters were found between the epithelioid and non-epithelioid histological subtypes. 18F-FDG PET/CT parameters that take into account functional volume (MTV, TLG) show significant associations with survival in patients with MPM before adjusting for histological subtype and are worthy of further evaluation to determine their ability to stratify patients in clinical trials. (orig.)
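
    For reference, the volumetric parameters named above are simple functions of the voxel-level SUV map. The sketch below uses synthetic voxels and an assumed fixed SUV threshold of 2.5 for segmentation (the abstract does not state the segmentation method):

```python
# Illustrative sketch (synthetic voxels): deriving MTV and TLG, the volumetric
# PET parameters the study found associated with survival.
import numpy as np

voxel_volume_ml = 0.4 ** 3            # assumed 4 mm isotropic voxels, in mL
suv = np.random.default_rng(1).gamma(2.0, 1.5, size=50000)  # fake SUV map

threshold = 2.5                        # assumed fixed SUV threshold
tumour = suv[suv >= threshold]

mtv = tumour.size * voxel_volume_ml    # metabolic tumour volume (mL)
tlg = tumour.mean() * mtv              # total lesion glycolysis = SUVmean * MTV
print(f"SUVmax={suv.max():.1f}, MTV={mtv:.1f} mL, TLG={tlg:.1f}")
```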

  12. Comparative studies of parameters based on the most probable versus an approximate linear extrapolation distance estimates for circular cylindrical absorbing rod

    International Nuclear Information System (INIS)

    Wassef, W.A.

    1982-01-01

    Estimates and techniques that are valid for calculating the linear extrapolation distance for an infinitely long circular cylindrical absorbing region are reviewed. Two estimates in particular are considered: the most probable value, and the value resulting from an approximate technique based on matching the integral transport equation inside the absorber with the diffusion approximation in the surrounding infinite scattering medium. Consequently, the effective diffusion parameters and the blackness of the cylinder are derived and subjected to comparative studies. A computer code is set up to calculate and compare the different parameters; it is useful in reactor analysis and serves to establish beneficial estimates that are amenable to direct application in reactor design codes.

  13. Effect of gamma radiation on the growth, survival, hematology and histological parameters of rainbow trout (Oncorhynchus mykiss) larvae

    Energy Technology Data Exchange (ETDEWEB)

    Oujifard, Amin, E-mail: oujifard.amin@gmail.com [Fisheries Department, Faculty of Agriculture and Natural Resources, Persian Gulf University, Borazjan, Bushehr (Iran, Islamic Republic of); Amiri, Roghayeh [Department of Veterinary, Agricultural Medical and Industrial Research School, Nuclear Science and Technology Research Institute, AEOI, Karaj (Iran, Islamic Republic of); Shahhosseini, Gholamreza [Fisheries Department, Faculty of Natural Resources and Marine Sciences, TarbiatModares University, Noor, Mazandaran (Iran, Islamic Republic of); Davoodi, Reza [Fisheries Department, Faculty of Agriculture and Natural Resources, Persian Gulf University, Borazjan, Bushehr (Iran, Islamic Republic of); Moghaddam, Jamshid Amiri [Fisheries Department, Faculty of Natural Resources and Marine Sciences, TarbiatModares University, Noor, Mazandaran (Iran, Islamic Republic of)

    2015-08-15

    Highlights: • Increasing gamma radiation had negative effects on fish larvae. • Radiation adversely affected the weight, blood cells and intestinal morphology of the larvae. • No mortality was observed at low doses of gamma radiation. - Abstract: Effects of low (1, 2.5 and 5 Gy) and high doses (10, 20 and 40 Gy) of gamma radiation were examined on the growth, survival, blood parameters and morphological changes of the intestines of rainbow trout (Oncorhynchus mykiss) larvae (103 ± 20 mg) after 12 weeks of exposure. Negative effects of gamma radiation on growth and survival were observed as radiation level and time increased. Changes were well documented at 10 and 20 Gy. All the fish were dead at the dose of 40 Gy. In all the treatments, levels of red blood cells (RBC), hematocrit (HCT) and hemoglobin (HB) declined significantly (P < 0.05) as the irradiation level increased, whereas mean corpuscular volume (MCV) and mean corpuscular hemoglobin (MCH) did not change. No significant differences (P > 0.05) were found in the levels of white blood cells (WBC), lymphocytes and monocytes. Destruction of the intestinal epithelium cells was observed as the irradiation level increased to 1 Gy and above. The highest levels of growth, survival, specific growth rate (SGR), condition factor (CF) and protein efficiency rate (PER) were obtained in the control treatment. The results showed that gamma rays can be a potential means of damaging rainbow trout cells.

  14. Effect of gamma radiation on the growth, survival, hematology and histological parameters of rainbow trout (Oncorhynchus mykiss) larvae

    International Nuclear Information System (INIS)

    Oujifard, Amin; Amiri, Roghayeh; Shahhosseini, Gholamreza; Davoodi, Reza; Moghaddam, Jamshid Amiri

    2015-01-01

    Highlights: • Increasing gamma radiation had negative effects on fish larvae. • Radiation adversely affected the weight, blood cells and intestinal morphology of the larvae. • No mortality was observed at low doses of gamma radiation. - Abstract: Effects of low (1, 2.5 and 5 Gy) and high doses (10, 20 and 40 Gy) of gamma radiation were examined on the growth, survival, blood parameters and morphological changes of the intestines of rainbow trout (Oncorhynchus mykiss) larvae (103 ± 20 mg) after 12 weeks of exposure. Negative effects of gamma radiation on growth and survival were observed as radiation level and time increased. Changes were well documented at 10 and 20 Gy. All the fish were dead at the dose of 40 Gy. In all the treatments, levels of red blood cells (RBC), hematocrit (HCT) and hemoglobin (HB) declined significantly (P < 0.05) as the irradiation level increased, whereas mean corpuscular volume (MCV) and mean corpuscular hemoglobin (MCH) did not change. No significant differences (P > 0.05) were found in the levels of white blood cells (WBC), lymphocytes and monocytes. Destruction of the intestinal epithelium cells was observed as the irradiation level increased to 1 Gy and above. The highest levels of growth, survival, specific growth rate (SGR), condition factor (CF) and protein efficiency rate (PER) were obtained in the control treatment. The results showed that gamma rays can be a potential means of damaging rainbow trout cells.

  15. Toxicity of mercury (Hg) on survival and growth rate, hemato- and histopathological parameters of Oreochromis niloticus

    Directory of Open Access Journals (Sweden)

    Kukuh Nirmala

    2013-11-01

    Full Text Available Heavy metals are serious pollutants of the aquatic environment because of their environmental persistence and ability to accumulate in aquatic organisms. Oreochromis niloticus were exposed to 0, 0.16, 0.5, and 1.0 ppm Hg for 30 days. The aim of this study was to determine the influence of mercury in water on the survival rate, growth rate, and hematological and histological parameters of Oreochromis niloticus. This study was conducted from May to June 2009. The experiment was arranged in a completely randomized design with four treatments and three replications. Stocking density was 8 fish/aquarium, with a mean initial body weight of 15.70±1.13 g. Growth and survival rates of the test fish decreased with increasing Hg concentration. Red blood cell (RBC) count, hematocrit content, and hemoglobin content decreased when compared to the control. The number of white blood cells (WBC) increased in mercury-treated fish. The results are statistically significant at the p<0.05 level. Keywords: mercury, survival and growth rate, hematology, histopathology, Oreochromis niloticus

  16. Prognostic value of pre-treatment DCE-MRI parameters in predicting disease free and overall survival for breast cancer patients undergoing neoadjuvant chemotherapy

    International Nuclear Information System (INIS)

    Pickles, Martin D.; Manton, David J.; Lowry, Martin; Turnbull, Lindsay W.

    2009-01-01

    The purpose of this study was to investigate whether dynamic contrast enhanced MRI (DCE-MRI) data, both pharmacokinetic and empirical, can predict, prior to neoadjuvant chemotherapy, which patients are likely to have a shorter disease free survival (DFS) and overall survival (OS) interval following surgery. Traditional prognostic parameters were also included in the survival analysis. Consequently, a comparison of the prognostic value could be made between all the parameters studied. MR examinations were conducted on a 1.5 T system in 68 patients prior to the initiation of neoadjuvant chemotherapy. DCE-MRI consisted of a fast spoiled gradient echo sequence acquired over 35 phases with a mean temporal resolution of 11.3 s. Both pharmacokinetic and empirical parameters were derived from the DCE-MRI data. Kaplan-Meier survival plots were generated for each parameter and group comparisons were made utilising logrank tests. The results from the 54 patients entered into the univariate survival analysis demonstrated that traditional prognostic parameters (tumour grade, hormonal status and size), empirical parameters (maximum enhancement index, enhancement index at 30 s, area under the curve and initial slope) and adjuvant therapies demonstrated significant differences in survival intervals. Further multivariate Cox regression survival analysis revealed that empirical enhancement parameters contributed the greatest prediction of both DFS and OS in the resulting models. In conclusion, this study has demonstrated that in patients who exhibit high levels of perfusion and vessel permeability pre-treatment, evidenced by elevated empirical DCE-MRI parameters, a significantly lower disease free survival and overall survival can be expected.
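
    The survival machinery used here (Kaplan-Meier curves per group, log-rank comparison) can be sketched with the lifelines library on invented data; the split into high/low enhancement groups at a median parameter value is an assumption for illustration:

```python
# Hedged sketch (invented data): the kind of Kaplan-Meier / log-rank comparison
# used to contrast high vs low pre-treatment enhancement groups.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(2)
# disease-free survival in months for two groups split on an empirical
# DCE-MRI parameter (e.g. maximum enhancement index above/below its median)
t_high = rng.exponential(24, 27)       # high enhancement: shorter survival
t_low = rng.exponential(48, 27)        # low enhancement: longer survival
observed_high = rng.random(27) < 0.7   # ~30% censored
observed_low = rng.random(27) < 0.7

kmf = KaplanMeierFitter()
kmf.fit(t_high, event_observed=observed_high, label="high enhancement")
print(kmf.median_survival_time_)

result = logrank_test(t_high, t_low, event_observed_A=observed_high,
                      event_observed_B=observed_low)
print(f"log-rank p = {result.p_value:.4f}")
```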

  17. Prognostic value of pre-treatment DCE-MRI parameters in predicting disease free and overall survival for breast cancer patients undergoing neoadjuvant chemotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Pickles, Martin D. [Centre for Magnetic Resonance Investigations, Division of Cancer, Postgraduate Medical School, University of Hull, Hull Royal Infirmary, Anlaby Road, Hull, HU3 2JZ (United Kingdom)], E-mail: m.pickles@hull.ac.uk; Manton, David J. [Centre for Magnetic Resonance Investigations, Division of Cancer, Postgraduate Medical School, University of Hull, Hull Royal Infirmary, Anlaby Road, Hull, HU3 2JZ (United Kingdom)], E-mail: d.j.manton@hull.ac.uk; Lowry, Martin [Centre for Magnetic Resonance Investigations, Division of Cancer, Postgraduate Medical School, University of Hull, Hull Royal Infirmary, Anlaby Road, Hull, HU3 2JZ (United Kingdom)], E-mail: m.lowry@hull.ac.uk; Turnbull, Lindsay W. [Centre for Magnetic Resonance Investigations, Division of Cancer, Postgraduate Medical School, University of Hull, Hull Royal Infirmary, Anlaby Road, Hull, HU3 2JZ (United Kingdom)], E-mail: l.w.turnbull@hull.ac.uk

    2009-09-15

    The purpose of this study was to investigate whether dynamic contrast enhanced MRI (DCE-MRI) data, both pharmacokinetic and empirical, can predict, prior to neoadjuvant chemotherapy, which patients are likely to have a shorter disease free survival (DFS) and overall survival (OS) interval following surgery. Traditional prognostic parameters were also included in the survival analysis. Consequently, a comparison of the prognostic value could be made between all the parameters studied. MR examinations were conducted on a 1.5 T system in 68 patients prior to the initiation of neoadjuvant chemotherapy. DCE-MRI consisted of a fast spoiled gradient echo sequence acquired over 35 phases with a mean temporal resolution of 11.3 s. Both pharmacokinetic and empirical parameters were derived from the DCE-MRI data. Kaplan-Meier survival plots were generated for each parameter and group comparisons were made utilising logrank tests. The results from the 54 patients entered into the univariate survival analysis demonstrated that traditional prognostic parameters (tumour grade, hormonal status and size), empirical parameters (maximum enhancement index, enhancement index at 30 s, area under the curve and initial slope) and adjuvant therapies demonstrated significant differences in survival intervals. Further multivariate Cox regression survival analysis revealed that empirical enhancement parameters contributed the greatest prediction of both DFS and OS in the resulting models. In conclusion, this study has demonstrated that in patients who exhibit high levels of perfusion and vessel permeability pre-treatment, evidenced by elevated empirical DCE-MRI parameters, a significantly lower disease free survival and overall survival can be expected.

  18. Sensor Fusion Based on an Integrated Neural Network and Probability Density Function (PDF) Dual Kalman Filter for On-Line Estimation of Vehicle Parameters and States.

    Science.gov (United States)

    Vargas-Melendez, Leandro; Boada, Beatriz L; Boada, Maria Jesus L; Gauchia, Antonio; Diaz, Vicente

    2017-04-29

    Vehicles with a high center of gravity (COG), such as light trucks and heavy vehicles, are prone to rollover. This kind of accident causes nearly 33% of all deaths from passenger vehicle crashes. Nowadays, these vehicles are incorporating roll stability control (RSC) systems to improve their safety. Most RSC systems require the vehicle roll angle as a known input variable to predict the lateral load transfer. The vehicle roll angle can be directly measured by a dual-antenna global positioning system (GPS), but this is expensive. For this reason, it is important to estimate the vehicle roll angle from sensors installed onboard current vehicles. On the other hand, knowledge of the vehicle's parameter values is essential to obtain an accurate vehicle response. Some vehicle parameters cannot be easily obtained, and they can vary over time. In this paper, an algorithm for the simultaneous on-line estimation of the vehicle's roll angle and parameters is proposed. This algorithm uses a probability density function (PDF)-based truncation method in combination with a dual Kalman filter (DKF) to guarantee that both the vehicle's states and parameters are within bounds that have a physical meaning, using the information obtained from sensors mounted on vehicles. Experimental results show the effectiveness of the proposed algorithm.
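
    A heavily simplified sketch of the idea, not the paper's algorithm: a scalar Kalman measurement update followed by a bound-enforcement step. A real PDF-truncation method renormalizes the Gaussian over the feasible interval rather than merely clipping, and the roll-angle bounds and noise values below are invented:

```python
# Minimal sketch: scalar Kalman update plus a crude truncation step that keeps
# the estimate inside assumed physical bounds.
import numpy as np

def kalman_update(x, P, z, R):
    """Scalar Kalman measurement update (H = 1)."""
    K = P / (P + R)               # Kalman gain
    x = x + K * (z - x)           # state update
    P = (1 - K) * P               # covariance update
    return x, P

def truncate(x, P, lo, hi):
    """Clip the estimate to [lo, hi]; a stand-in for the PDF-truncation method,
    which instead renormalizes the Gaussian over the feasible interval."""
    return min(max(x, lo), hi), P

x, P = 0.0, 1.0                   # roll angle estimate (rad) and its variance
for z in np.random.default_rng(3).normal(0.05, 0.02, 20):  # fake measurements
    x, P = kalman_update(x, P, z, R=0.02 ** 2)
    x, P = truncate(x, P, lo=-0.3, hi=0.3)   # assumed physical roll limits
print(f"roll angle estimate: {x:.4f} rad, variance: {P:.2e}")
```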

  19. Probability encoding of hydrologic parameters for basalt. Elicitation of expert opinions from a panel of three basalt waste isolation project staff hydrologists

    International Nuclear Information System (INIS)

    Runchal, A.K.; Merkhofer, M.W.; Olmsted, E.; Davis, J.D.

    1984-11-01

    The present study implemented a probability encoding method to estimate the probability distributions of selected hydrologic variables for the Cohassett basalt flow top and flow interior, and the anisotropy ratio of the interior of the Cohassett basalt flow beneath the Hanford Site. Site-specific data for these hydrologic parameters are currently inadequate for the purpose of preliminary assessment of candidate repository performance. However, this information is required to complete preliminary performance assessment studies. Rockwell chose a probability encoding method developed by SRI International to generate credible and auditable estimates of the probability distributions of effective porosity and hydraulic conductivity anisotropy. The results indicate significant differences of opinion among the experts. This was especially true of the values of the effective porosity of the Cohassett basalt flow interior, for which estimates differ by more than five orders of magnitude. The experts are in greater agreement about the values of effective porosity of the Cohassett basalt flow top; their estimates for this variable are generally within one to two orders of magnitude of each other. For the anisotropy ratio, the expert estimates are generally within two or three orders of magnitude of each other. Based on this study, the Rockwell hydrologists estimate the effective porosity of the Cohassett basalt flow top to be generally higher than do the independent experts. For the effective porosity of the Cohassett basalt flow top, the estimates of the Rockwell hydrologists indicate a smaller uncertainty than do the estimates of the independent experts. On the other hand, for the effective porosity and anisotropy ratio of the Cohassett basalt flow interior, the estimates of the Rockwell hydrologists indicate a larger uncertainty than do the estimates of the independent experts
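
    One simple form of probability encoding is to convert an expert's elicited percentiles into a parametric distribution. The sketch below fits a lognormal to invented 5th/95th percentiles for effective porosity; it is an illustration, not SRI International's encoding protocol:

```python
# Illustrative sketch: turning elicited percentiles into a lognormal
# distribution, one simple form of probability encoding (values invented).
import numpy as np
from scipy import stats

# expert's elicited 5th and 95th percentiles for effective porosity
p05, p95 = 1e-5, 1e-2
z95 = stats.norm.ppf(0.95)

# solve for lognormal parameters: ln(p95) - ln(p05) spans 2*z95 standard deviations
mu = 0.5 * (np.log(p05) + np.log(p95))
sigma = (np.log(p95) - np.log(p05)) / (2 * z95)

encoded = stats.lognorm(s=sigma, scale=np.exp(mu))
print(f"median: {encoded.median():.2e}, 5th: {encoded.ppf(0.05):.2e}, "
      f"95th: {encoded.ppf(0.95):.2e}")
```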

  20. Spectroscopic parameters, vibrational levels, transition dipole moments and transition probabilities of the 9 low-lying states of the NCl+ cation

    Science.gov (United States)

    Yin, Yuan; Shi, Deheng; Sun, Jinfeng; Zhu, Zunlue

    2018-03-01

    This work calculates the potential energy curves of 9 Λ-S and 28 Ω states of the NCl+ cation. The technique employed is the complete active space self-consistent field method, followed by the internally contracted multireference configuration interaction approach with the Davidson correction. The Λ-S states are X2Π, 12Σ+, 14Π, 14Σ+, 14Σ-, 24Π, 14Δ, 16Σ+, and 16Π, which arise from the first two dissociation channels of the NCl+ cation. The Ω states are generated from these Λ-S states. The 14Π, 14Δ, 16Σ+, and 16Π states are inverted when the spin-orbit coupling effect is included. The 14Σ+, 16Σ+, and 16Π states are very weakly bound, with well depths of only several hundred cm-1. One avoided crossing of the PECs occurs between the 12Σ+ and 22Σ+ states. To improve the quality of the potential energy curves, core-valence correlation and scalar relativistic corrections are included. The potential energies are extrapolated to the complete basis set limit. The spectroscopic parameters and vibrational levels are calculated. The transition dipole moments are computed. The Franck-Condon factors, Einstein coefficients, and radiative lifetimes of many transitions are determined. Spectroscopic approaches for observing these states are proposed according to the transition probabilities. The spin-orbit coupling effect on the spectroscopic and vibrational properties is evaluated. The spectroscopic parameters, vibrational levels, transition dipole moments, as well as transition probabilities reported in this paper could be considered to be very reliable.

  1. Normal tissue complication probability model parameter estimation for xerostomia in head and neck cancer patients based on scintigraphy and quality of life assessments

    International Nuclear Information System (INIS)

    Lee, Tsair-Fwu; Chao, Pei-Ju; Wang, Hung-Yu; Hsu, Hsuan-Chih; Chang, PaoShu; Chen, Wen-Cheng

    2012-01-01

    With advances in modern radiotherapy (RT), many patients with head and neck (HN) cancer can be effectively cured. However, xerostomia is a common complication in patients after RT for HN cancer. The purpose of this study was to use the Lyman–Kutcher–Burman (LKB) model to derive parameters for the normal tissue complication probability (NTCP) for xerostomia based on scintigraphy assessments and quality of life (QoL) questionnaires. We performed validation tests of the Quantitative Analysis of Normal Tissue Effects in the Clinic (QUANTEC) guidelines against prospectively collected QoL and salivary scintigraphic data. Thirty-one patients with HN cancer were enrolled. Salivary excretion factors (SEFs) measured by scintigraphy and QoL data from self-reported questionnaires were used for NTCP modeling to describe the incidence of grade 3+ xerostomia. The NTCP parameters estimated from the QoL and SEF datasets were compared. Model performance was assessed using Pearson’s chi-squared test, Nagelkerke’s R², the area under the receiver operating characteristic curve, and the Hosmer–Lemeshow test. The negative predictive value (NPV) was checked for the rate of correctly predicting the lack of incidence. Pearson’s chi-squared test was used to test the goodness of fit and association. Using the LKB NTCP model and assuming n=1, the dose for uniform irradiation of the whole or partial volume of the parotid gland that results in 50% probability of a complication (TD50) and the slope of the dose–response curve (m) were determined from the QoL and SEF datasets, respectively. The NTCP-fitted parameters for local disease were TD50=43.6 Gy and m=0.18 with the SEF data, and TD50=44.1 Gy and m=0.11 with the QoL data. The rate of grade 3+ xerostomia for treatment plans meeting the QUANTEC guidelines was specifically predicted, with a NPV of 100%, using either the QoL or SEF dataset. Our study shows the agreement between the NTCP parameter modeling based on SEF and
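
    The LKB model quoted above has a compact closed form, sketched below with the paper's fitted values (TD50, m) and an invented toy dose-volume histogram; with n = 1 the generalized EUD reduces to the mean parotid dose:

```python
# Hedged sketch of the Lyman-Kutcher-Burman NTCP model with n = 1, using the
# fitted parameters quoted above; the dose-volume histogram is made up.
import numpy as np
from scipy import stats

def lkb_ntcp(doses, frac_volumes, td50, m, n=1.0):
    """LKB NTCP: t = (gEUD - TD50) / (m * TD50), NTCP = Phi(t).
    With n = 1 the generalized EUD reduces to the mean dose."""
    geud = np.sum(frac_volumes * doses ** (1.0 / n)) ** n
    t = (geud - td50) / (m * td50)
    return stats.norm.cdf(t)

# toy parotid DVH: three dose bins with fractional volumes summing to 1
doses = np.array([20.0, 35.0, 50.0])
vols = np.array([0.5, 0.3, 0.2])
print(f"NTCP (SEF fit): {lkb_ntcp(doses, vols, 43.6, 0.18):.3f}")
print(f"NTCP (QoL fit): {lkb_ntcp(doses, vols, 44.1, 0.11):.3f}")
```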

  2. Normal tissue complication probability model parameter estimation for xerostomia in head and neck cancer patients based on scintigraphy and quality of life assessments

    Science.gov (United States)

    2012-01-01

    Background With advances in modern radiotherapy (RT), many patients with head and neck (HN) cancer can be effectively cured. However, xerostomia is a common complication in patients after RT for HN cancer. The purpose of this study was to use the Lyman–Kutcher–Burman (LKB) model to derive parameters for the normal tissue complication probability (NTCP) for xerostomia based on scintigraphy assessments and quality of life (QoL) questionnaires. We performed validation tests of the Quantitative Analysis of Normal Tissue Effects in the Clinic (QUANTEC) guidelines against prospectively collected QoL and salivary scintigraphic data. Methods Thirty-one patients with HN cancer were enrolled. Salivary excretion factors (SEFs) measured by scintigraphy and QoL data from self-reported questionnaires were used for NTCP modeling to describe the incidence of grade 3+ xerostomia. The NTCP parameters estimated from the QoL and SEF datasets were compared. Model performance was assessed using Pearson’s chi-squared test, Nagelkerke’s R2, the area under the receiver operating characteristic curve, and the Hosmer–Lemeshow test. The negative predictive value (NPV) was checked for the rate of correctly predicting the lack of incidence. Pearson’s chi-squared test was used to test the goodness of fit and association. Results Using the LKB NTCP model and assuming n=1, the dose for uniform irradiation of the whole or partial volume of the parotid gland that results in 50% probability of a complication (TD50) and the slope of the dose–response curve (m) were determined from the QoL and SEF datasets, respectively. The NTCP-fitted parameters for local disease were TD50=43.6 Gy and m=0.18 with the SEF data, and TD50=44.1 Gy and m=0.11 with the QoL data. The rate of grade 3+ xerostomia for treatment plans meeting the QUANTEC guidelines was specifically predicted, with a NPV of 100%, using either the QoL or SEF dataset. Conclusions Our study shows the agreement between the NTCP

  3. Normal tissue complication probability model parameter estimation for xerostomia in head and neck cancer patients based on scintigraphy and quality of life assessments

    Directory of Open Access Journals (Sweden)

    Lee Tsair-Fwu

    2012-12-01

    Full Text Available Abstract Background With advances in modern radiotherapy (RT), many patients with head and neck (HN) cancer can be effectively cured. However, xerostomia is a common complication in patients after RT for HN cancer. The purpose of this study was to use the Lyman–Kutcher–Burman (LKB) model to derive parameters for the normal tissue complication probability (NTCP) for xerostomia based on scintigraphy assessments and quality of life (QoL) questionnaires. We performed validation tests of the Quantitative Analysis of Normal Tissue Effects in the Clinic (QUANTEC) guidelines against prospectively collected QoL and salivary scintigraphic data. Methods Thirty-one patients with HN cancer were enrolled. Salivary excretion factors (SEFs) measured by scintigraphy and QoL data from self-reported questionnaires were used for NTCP modeling to describe the incidence of grade 3+ xerostomia. The NTCP parameters estimated from the QoL and SEF datasets were compared. Model performance was assessed using Pearson’s chi-squared test, Nagelkerke’s R2, the area under the receiver operating characteristic curve, and the Hosmer–Lemeshow test. The negative predictive value (NPV) was checked for the rate of correctly predicting the lack of incidence. Pearson’s chi-squared test was used to test the goodness of fit and association. Results Using the LKB NTCP model and assuming n=1, the dose for uniform irradiation of the whole or partial volume of the parotid gland that results in 50% probability of a complication (TD50) and the slope of the dose–response curve (m) were determined from the QoL and SEF datasets, respectively. The NTCP-fitted parameters for local disease were TD50=43.6 Gy and m=0.18 with the SEF data, and TD50=44.1 Gy and m=0.11 with the QoL data. The rate of grade 3+ xerostomia for treatment plans meeting the QUANTEC guidelines was specifically predicted, with a NPV of 100%, using either the QoL or SEF dataset. Conclusions Our study shows the agreement

  4. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  5. Positive random variables with a discrete probability mass at the origin: Parameter estimation for left-censored samples with application to air quality monitoring data

    International Nuclear Information System (INIS)

    Gogolak, C.V.

    1986-11-01

    The concentration of a contaminant measured in a particular medium might be distributed as a positive random variable when it is present, but it may not always be present. If there is a level below which the concentration cannot be distinguished from zero by the analytical apparatus, a sample from such a population will be censored on the left. The presence of both zeros and positive values in the censored portion of such samples complicates the problem of estimating the parameters of the underlying positive random variable and the probability of a zero observation. Using the method of maximum likelihood, it is shown that the solution to this estimation problem reduces largely to that of estimating the parameters of the distribution truncated at the point of censorship. The maximum likelihood estimate of the proportion of zero values follows directly. The derivation of the maximum likelihood estimates for a lognormal population with zeros is given in detail, and the asymptotic properties of the estimates are examined. The estimation method was used to fit several different distributions to a set of severely censored 85Kr monitoring data from six locations at the Savannah River Plant chemical separations facilities.
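
    A minimal sketch of the estimation problem, with synthetic data in place of the 85Kr measurements: the likelihood combines a point mass at zero, left-censoring at a detection limit L, and a lognormal for detected values.

```python
# Hedged sketch (synthetic data): maximum likelihood for a lognormal with a
# point mass at zero, observed only above a left-censoring limit L.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(4)
L = 0.5                                   # detection limit
true = np.where(rng.random(500) < 0.2, 0.0, rng.lognormal(0.0, 1.0, 500))
detected = true[true >= L]                # values below L are indistinguishable
n_censored = 500 - detected.size          # zeros and small positives combined

def neg_log_lik(params):
    p_zero, mu, sigma = params
    if not (0 < p_zero < 1 and sigma > 0):
        return np.inf
    # P(observation < L) = p_zero + (1 - p_zero) * F_lognormal(L)
    p_below = p_zero + (1 - p_zero) * stats.lognorm.cdf(L, sigma, scale=np.exp(mu))
    ll = n_censored * np.log(p_below)
    ll += detected.size * np.log(1 - p_zero)
    ll += stats.lognorm.logpdf(detected, sigma, scale=np.exp(mu)).sum()
    return -ll

fit = optimize.minimize(neg_log_lik, x0=[0.3, 0.0, 1.0], method="Nelder-Mead")
print("p_zero, mu, sigma =", np.round(fit.x, 3))
```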

  6. MRCI study on the spectroscopic parameters, transition dipole moments and transition probabilities of 16 low-lying states of the BeB radical

    Science.gov (United States)

    Zhou, Dan; Shi, Deheng; Sun, Jinfeng; Zhu, Zunlue

    2018-03-01

    In this work, we calculate the potential energy curves of 16 Λ-S and 36 Ω states of beryllium boride (BeB) radical using the complete active space self-consistent field method, followed by the valence internally contracted multireference configuration interaction approach with Davidson correction. The 16 Λ-S states are the X2Π, A2Σ+, B2Π, C2Δ, D2Ʃ-, E2Σ+, G2Π, I2Σ+, a4Σ-, b4Π, c4Σ-, d4Δ, e4Σ+, g4Π, h4Π, and 24Σ+, which are obtained from the first three dissociation channels of the BeB radical. The Ω states are obtained from the Λ-S states. Of the Λ-S states, the G2Π, I2Σ+, and h4Π states exhibit double well curves. The G2Π, b4Π, and g4Π states are inverted with the spin-orbit coupling effect included. The d4Δ, e4Σ+, and g4Π states as well as the second well of the h4Π state are very weakly bound. Avoided crossings exist between the G2Π and H2Π states, the A2Σ+ and E2Σ+ states, the c4Σ- and f4Σ- states, the g4Π and h4Π states, the I2Σ+ and 42Σ+ states, as well as the 24Σ+ and 34Σ+ states. To improve the quality of the potential energy curves, core-valence correlation and scalar relativistic corrections, as well as the extrapolation of the potential energies to the complete basis set limit, are included. The transition dipole moments are computed. Spectroscopic parameters and vibrational levels are determined along with Franck-Condon factors, Einstein coefficients, and radiative lifetimes of many electronic transitions. The transition probabilities are evaluated. The spin-orbit coupling effect on the spectroscopic parameters and vibrational levels is discussed. The spectroscopic parameters, vibrational levels, and transition probabilities reported in this paper can be considered very reliable and can be employed to predict these states in an appropriate spectroscopy experiment.

  7. Survival probability of larval sprat in response to decadal changes in diel vertical migration behavior and prey abundance in the Baltic Sea

    DEFF Research Database (Denmark)

    Hinrichsen, Hans-Harald; Peck, Myron A.; Schmidt, Jörn

    2010-01-01

    distribution and climate-driven abiotic and biotic environmental factors including variability in the abundance of different, key prey species (calanoid copepods) as well as seasonal changes, long-term trends, and spatial differences in water temperature. Climate forcing affected Baltic sprat larval survival......, larvae were predicted to experience optimal conditions to ensure higher survival throughout the later larval and early juvenile stages. However, this behavioral shift also increased the susceptibility of larvae to unfavorable winddriven surface currents, contributing to the marked increase in interannual...

  8. Predicting treatment effect from surrogate endpoints and historical trials: an extrapolation involving probabilities of a binary outcome or survival to a specific time.

    Science.gov (United States)

    Baker, Stuart G; Sargent, Daniel J; Buyse, Marc; Burzykowski, Tomasz

    2012-03-01

    Using multiple historical trials with surrogate and true endpoints, we consider various models to predict the effect of treatment on a true endpoint in a target trial in which only a surrogate endpoint is observed. This predicted result is computed using (1) a prediction model (mixture, linear, or principal stratification) estimated from historical trials and the surrogate endpoint of the target trial and (2) a random extrapolation error estimated from successively leaving out each trial among the historical trials. The method applies to either binary outcomes or survival to a particular time that is computed from censored survival data. We compute a 95% confidence interval for the predicted result and validate its coverage using simulation. To summarize the additional uncertainty from using a predicted instead of true result for the estimated treatment effect, we compute its multiplier of standard error. Software is available for download. © 2011, The International Biometric Society. No claim to original US government works.
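
    The leave-one-trial-out extrapolation error can be sketched as follows; the trial-level surrogate and true effects are invented, and a plain linear prediction model stands in for the mixture and principal-stratification variants the paper also considers:

```python
# Hedged sketch (toy data): leave-one-trial-out extrapolation error for
# predicting a true-endpoint treatment effect from a surrogate endpoint.
import numpy as np

# historical trials: (surrogate effect, true effect) pairs, invented numbers
surrogate = np.array([0.10, 0.18, 0.05, 0.22, 0.14, 0.09])
true_eff = np.array([0.08, 0.15, 0.02, 0.19, 0.13, 0.06])

def linear_fit(x, y):
    slope, intercept = np.polyfit(x, y, 1)
    return lambda s: slope * s + intercept

# leave each trial out, predict its true effect from the rest
errors = []
for i in range(len(surrogate)):
    keep = np.arange(len(surrogate)) != i
    predict = linear_fit(surrogate[keep], true_eff[keep])
    errors.append(true_eff[i] - predict(surrogate[i]))
extrap_sd = np.std(errors, ddof=1)

# predict the target trial, widening the interval by the extrapolation error
target_surrogate = 0.16
pred = linear_fit(surrogate, true_eff)(target_surrogate)
print(f"predicted effect: {pred:.3f} +/- {1.96 * extrap_sd:.3f} (95% approx.)")
```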

  9. Influence of binder type and process parameters on the compression properties and microbial survival in diclofenac tablet formulations

    Directory of Open Access Journals (Sweden)

    John Oluwasogo Ayorinde

    2011-12-01

    Full Text Available The influence of binder type and process parameters on the compression properties and microbial survival in diclofenac tablet formulations was studied using a novel gum from Albizia zygia. Tablets were produced from diclofenac formulations containing corn starch, lactose and dicalcium phosphate. Formulations were analyzed using the Heckel and Kawakita plots. Microbial viability was determined on compressed tablets prepared from both contaminated and uncontaminated formulations. Direct compression imparted a higher plasticity on the materials than the wet granulation method. Tablets produced by wet granulation presented a higher crushing strength than those produced by the direct compression method. Significantly higher microbial survival (p<0.05) was observed in formulations prepared by direct compression. The percentage survival of Bacillus subtilis spores decreased with increasing binder concentration. The study showed that Albizia gum imparted greater plasticity to the materials and produced a greater reduction in...

  10. Survival of inlays and partial crowns made of IPS empress after a 10-year observation period and in relation to various treatment parameters.

    Science.gov (United States)

    Stoll, Richard; Cappel, I; Jablonski-Momeni, Anahita; Pieper, K; Stachniss, V

    2007-01-01

    This study evaluated the long-term survival of inlays and partial crowns made of IPS Empress. For this purpose, the patient data of a prospective study were examined in retrospect and statistically evaluated. All of the inlays and partial crowns fabricated of IPS-Empress within the Department of Operative Dentistry at the School of Dental Medicine of Philipps University, Marburg, Germany were systematically recorded in a database between 1991 and 2001. The corresponding patient files were revised at the end of 2001. The information gathered in this way was used to evaluate the survival of the restorations using the method described by Kaplan and Meier. A total of n = 1624 restorations were fabricated of IPS-Empress within the observation period. During this time, n = 53 failures were recorded. The remaining restorations were observed for a mean period of 18.77 months. The failures were mainly attributed to fractures, endodontic problems and cementation errors. The last failure was established after 82 months. At this stage, a cumulative survival probability of p = 0.81 was registered with a standard error of 0.04. At this time, n = 30 restorations were still being observed. Restorations on vital teeth (n = 1588) showed 46 failures, with a cumulative survival probability of p = 0.82. Restorations performed on non-vital teeth (n = 36) showed seven failures, with a cumulative survival probability of p = 0.53. Highly significant differences were found between the two groups (p < 0.0001) in a log-rank test. No significant difference (p = 0.41) was found between the patients treated by students (n = 909) and those treated by qualified dentists (n = 715). Likewise, no difference (p = 0.13) was established between the restorations seated with a high viscosity cement (n = 295) and those placed with a low viscosity cement (n = 1329).
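
    The cumulative survival probabilities quoted here come from the Kaplan-Meier product-limit estimator: at each distinct failure time the current survival estimate is multiplied by (1 - d/n), where d is the number of failures at that time and n the number of restorations still at risk. The following Python sketch is purely illustrative (the follow-up data are invented, not the study's records):

        import numpy as np

        def kaplan_meier(time, event):
            """Product-limit estimate S(t). time: follow-up in months;
            event: 1 = failure observed, 0 = censored."""
            time, event = np.asarray(time, float), np.asarray(event, int)
            order = np.argsort(time)
            time, event = time[order], event[order]
            s, curve = 1.0, []
            for t in np.unique(time[event == 1]):
                n_at_risk = np.sum(time >= t)           # still under observation at t
                d = np.sum((time == t) & (event == 1))  # failures at t
                s *= 1.0 - d / n_at_risk
                curve.append((t, s))
            return curve

        # Toy data: months to failure/censoring for ten restorations (assumed values).
        months = [12, 20, 20, 35, 40, 47, 60, 70, 82, 82]
        failed = [ 1,  0,  1,  0,  0,  1,  0,  0,  1,  0]
        for t, s in kaplan_meier(months, failed):
            print(f"t = {t:5.1f} months  S(t) = {s:.3f}")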

  11. Effects of Garlic (Allium sativum) and chloramphenicol on growth performance, physiological parameters and survival of Nile tilapia (Oreochromis niloticus)

    Directory of Open Access Journals (Sweden)

    A. M. Shalaby

    2006-04-01

    Full Text Available We studied and compared the effects of chloramphenicol antibiotic and garlic (Allium sativum), used as immunostimulants and growth promoters, on some physiological parameters, growth performance, survival rate, and bacteriological characteristics of Nile tilapia (Oreochromis niloticus). Fish (7±1 g/fish) were assigned to eight treatments, with three replicates each. Treatment groups had different levels of Allium sativum (10, 20, 30, and 40 g/kg diet) and chloramphenicol (15, 30, and 45 mg/kg diet) added to their diets; the control group diet was free from garlic and antibiotic. Diets also contained 32% crude protein (CP) and were administered at a rate of 3% live body weight twice daily for 90 days. Results showed that the final weight and specific growth rate (SGR) of O. niloticus increased significantly with increasing levels of Allium sativum and chloramphenicol. The highest growth performance was verified with 30 g Allium sativum/kg diet and 30 mg chloramphenicol/kg diet. The lowest feed conversion ratio (FCR) was observed with 30 g Allium sativum/kg diet and 30 mg chloramphenicol/kg diet. There were significant differences in the protein efficiency ratio (PER) with all treatments, except with 45 mg chloramphenicol/kg diet. No changes in the hepatosomatic index and survival rate were observed. Crude protein content in whole fish increased significantly in the group fed on 30 g Allium sativum/kg diet, while total lipids decreased significantly in the same group. Ash of whole fish showed significantly high values with 30 g Allium sativum and 15 mg chloramphenicol/kg diet, while the lowest value was observed in the control group. Blood parameters, erythrocyte count (RBC), and hemoglobin content in fish fed on diets containing 40 g Allium sativum and all levels of chloramphenicol were significantly higher than in controls. Significantly higher hematocrit values were seen with 30 and 45 mg chloramphenicol/kg diet. There were no significant differences...

  12. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities......, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially...... updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence....

  13. p53/Survivin Ratio as a Parameter for Chemotherapy Induction Response in Children with Acute Myeloid Leukemia

    Directory of Open Access Journals (Sweden)

    Rinaldi Lenggana

    2016-11-01

    Full Text Available Acute myeloid leukemia (AML) is a malignancy that is often found in children. Failure of apoptosis, or programmed cell death, one of the most important regulatory mechanisms of cellular homeostasis, is closely linked to the development of cancer. Regulation of the apoptotic (p53) and anti-apoptotic (survivin) proteins also influences treatment outcome. One role of p53 is to monitor the cellular stress necessary to induce apoptosis. Survivin (BIRC5) is a member of the inhibitor of apoptosis protein family and acts by inhibiting caspase-3. Survivin is considered very important in oncogenesis, proliferation, and the regulation of cell growth. Chemotherapy in childhood AML can inhibit cell growth and slow or arrest the cell cycle. Thus, the aim of this study was to compare p53 and survivin before and after induction chemotherapy in children with AML, and to determine the p53/survivin ratio. Peripheral blood mononuclear cells were collected from children with AML before treatment and three months after starting their induction therapy. p53 and survivin were measured by flow cytometry using monoclonal antibodies. Data were analyzed by t-test for comparisons between groups and Spearman's test for correlations between variables, with a significance level of p < 0.05. A total of 8 children were evaluated. The intensity of p53 expression was not significantly increased after induction-phase chemotherapy (p = 0.224), but survivin expression and the p53/survivin ratio were significantly increased in the treatment group compared with the levels prior to chemotherapy (p = 0.002, p = 0.034), and there was a strong negative correlation between p53 and survivin after chemotherapy (r = −0.63, p = 0.049).

  14. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real-valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P

  15. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  16. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  17. Association analysis of insulin-like growth factor-1 axis parameters with survival and functional status in nonagenarians of the Leiden Longevity Study

    DEFF Research Database (Denmark)

    van der Spoel, Evie; Rozing, Maarten P; Houwing-Duistermaat, Jeanine J

    2015-01-01

    Reduced insulin/insulin-like growth factor 1 (IGF-1) signaling has been associated with longevity in various model organisms. However, the role of insulin/IGF-1 signaling in human survival remains controversial. The aim of this study was to test whether circulating IGF-1 axis parameters associate...... with old age survival and functional status in nonagenarians from the Leiden Longevity Study. This study examined 858 Dutch nonagenarian (males≥89 years; females≥91 years) siblings from 409 families, without selection on health or demographic characteristics. Nonagenarians were divided over sex...

  18. COUNTRY-LEVEL SOCIOECONOMIC INDICATORS ASSOCIATED WITH SURVIVAL PROBABILITY OF BECOMING A CENTENARIAN AMONG OLDER EUROPEAN ADULTS: GENDER INEQUALITY, MALE LABOUR FORCE PARTICIPATION AND PROPORTIONS OF WOMEN IN PARLIAMENTS.

    Science.gov (United States)

    Kim, Jong In; Kim, Gukbin

    2017-03-01

    This study confirms an association between survival probability of becoming a centenarian (SPBC) for those aged 65 to 69 and country-level socioeconomic indicators in Europe: the gender inequality index (GII), male labour force participation (MLP) rates and proportions of seats held by women in national parliaments (PWP). The analysis was based on SPBC data from 34 countries obtained from the United Nations (UN). Country-level socioeconomic indicator data were obtained from the UN and World Bank databases. The associations between socioeconomic indicators and SPBC were assessed using correlation coefficients and multivariate regression models. The findings show significant correlations between the SPBC for women and men aged 65 to 69 and country-level socioeconomic indicators: GII (r=-0.674, p=0.001), MLP (r=0.514, p=0.002) and PWP (r=0.498, p=0.003). The SPBC predictors for women and men were lower GIIs and higher MLP and PWP (R²=0.508, p=0.001). Country-level socioeconomic indicators appear to have an important effect on the probability of becoming a centenarian in European adults aged 65 to 69. Country-level gender equality policies in European countries may decrease the risk of unhealthy old age and increase longevity in elders through greater national gender equality; disparities in GII and other country-level socioeconomic indicators impact longevity probability. National longevity strategies should target country-level gender inequality.

  19. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  20. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...

  1. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  2. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  3. Impact of immune parameters on long-term survival in metastatic renal cell carcinoma

    DEFF Research Database (Denmark)

    Donskov, Frede; Maase, Hans von der

    2006-01-01

    with estimated 5-year survival rates of 60%, 25%, and 0%, respectively. These findings were apparent in both our own prognostic model and in an extended Memorial Sloan-Kettering Cancer Center (New York, NY) prognostic model. CONCLUSION: This study points to five clinical and three...

  4. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications....... The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models....

  5. A study of V79 cell survival after proton and carbon ion beams as represented by the parameters of Katz' track structure model

    DEFF Research Database (Denmark)

    Grzanka, Leszek; Waligórski, M. P. R.; Bassler, Niels

    different sets of data obtained for the same cell line and different ions, measured at different laboratories, we have fitted model parameters to a set of carbon-irradiated V79 cells, published by Furusawa et al. (2), and to a set of proton-irradiated V79 cells, published by Wouters et al. (3), separately....... We found that values of model parameters best fitted to the carbon data of Furusawa et al. yielded predictions of V79 survival after proton irradiation which did not match the V79 proton data of Wouters et al. Fitting parameters to both sets combined did not improve the accuracy of model predictions...... carbon irradiation. 1. Katz, R., Track structure in radiobiology and in radiation detection. Nuclear Track Detection 2: 1-28 (1978). 2. Furusawa Y. et al. Inactivation of aerobic and hypoxic cells from three different cell lines by accelerated 3He-, 12C- and 20Ne beams. Radiat Res. 2012 Jan; 177...

  6. Can cell survival parameters be deduced from non-clonogenic assays of radiation damage to normal tissue?

    International Nuclear Information System (INIS)

    Michalowski, A.; Wheldon, T.E.; Kirk, J.

    1984-01-01

    The relationship between dose-response curves for large scale radiation injury to tissues and survival curves for clonogenic cells is not necessarily simple. Sterilization of clonogenic cells occurs near-instantaneously compared with the protracted lag period for gross injury to tissues. Moreover, with some types of macroscopic damage, the shapes of the dose-response curves may depend on time of assay. Changes in the area or volume of irradiated tissue may also influence the shapes of these curves. The temporal pattern of expression of large scale injury also varies between tissues, and two distinct groups can be recognized. In rapidly proliferating tissues, lag period is almost independent of dose, whilst in slowly proliferating tissues, it is inversely proportional to dose. This might be explained by invoking differences in corresponding proliferative structures of the tissues. (Three compartmental Type H versus one compartmental Type F proliferative organization). For the second group of tissues particularly, mathematical modelling suggests a systematic dissociation of the dose-response curves for clonogenic cell survival and large scale injury. In particular, it may be difficult to disentangle the contributions made to inter-fraction sparing by cellular repair processes and by proliferation-related factors. (U.K.)

  7. Influence of inoculation levels and processing parameters on the survival of Campylobacter jejuni in German style fermented turkey sausages.

    Science.gov (United States)

    Alter, Thomas; Bori, Anouchka; Hamedi, Ahmad; Ellerbroek, Lüppo; Fehlhaber, Karsten

    2006-10-01

    This study investigated the influence of inoculum levels and manufacturing methods on the survival of Campylobacter (C.) jejuni in raw fermented turkey sausages. Sausages were prepared and inoculated with C. jejuni. After inoculation, these sausages were processed and ripened for 8 days. Samples were taken throughout the ripening process. The presence of C. jejuni was established bacteriologically. Additionally, lactic acid bacteria were enumerated, and pH values and water activity were measured to verify the ripening process. To detect changes in genotype and verify the identity of the recovered clones, AFLP analysis was carried out on the re-isolated strains. Whereas no C. jejuni were detectable when inoculating the sausages with the lowest inoculum (0.08-0.44 log10 cfu/g sausage emulsion), C. jejuni were detectable for 12-24 h by enrichment when inoculated with approximately 2 log10 cfu/g. After inoculation with 4 and 6 log10 cfu/g respectively, C. jejuni were detectable without enrichment for 12-48 h and by enrichment for 144 h at the most. The greatest decrease of the C. jejuni population occurred during the first 4 h of ripening. Only a very high inoculum level allowed the organism to survive the fermentation and ripening process and thereby pose a potential risk for consumers. Lower initial Campylobacter inocula will be eliminated during proper ripening of the sausages, if a sufficient decrease in water activity and pH value is ensured.

  8. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  9. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  10. Change in volume parameters induced by neoadjuvant chemotherapy provide accurate prediction of overall survival after resection in patients with oesophageal cancer

    Energy Technology Data Exchange (ETDEWEB)

    Tamandl, Dietmar; Fueger, Barbara; Kinsperger, Patrick; Haug, Alexander; Ba-Ssalamah, Ahmed [Medical University of Vienna, Department of Biomedical Imaging and Image-Guided Therapy, Comprehensive Cancer Center GET-Unit, Vienna (Austria); Gore, Richard M. [University of Chicago Pritzker School of Medicine, Department of Radiology, Chicago, IL (United States); Hejna, Michael [Medical University of Vienna, Department of Internal Medicine, Division of Medical Oncology, Comprehensive Cancer Center GET-Unit, Vienna (Austria); Paireder, Matthias; Schoppmann, Sebastian F. [Medical University of Vienna, Department of Surgery, Upper-GI-Service, Comprehensive Cancer Center GET-Unit, Vienna (Austria)

    2016-02-15

    To assess the prognostic value of volumetric parameters measured with CT and PET/CT in patients with neoadjuvant chemotherapy (NACT) and resection for oesophageal cancer (EC). Patients with locally advanced EC, who were treated with NACT and resection, were retrospectively analysed. Data from CT volumetry and ¹⁸F-FDG PET/CT (maximum standardized uptake [SUVmax], metabolic tumour volume [MTV], and total lesion glycolysis [TLG]) were recorded before and after NACT. The impact of volumetric parameter changes induced by NACT (MTVratio, TLGratio, etc.) on overall survival (OS) was assessed using a Cox proportional hazards model. Eighty-four patients were assessed using CT volumetry; of those, 50 also had PET/CT before and after NACT. Low post-treatment CT volume and thickness, MTV, TLG, and SUVmax were all associated with longer OS (p < 0.05), as were CTthicknessratio, MTVratio, TLGratio, and SUVmaxratio (p < 0.05). In the multivariate analysis, only MTVratio (hazard ratio, HR 2.52 [95% confidence interval, CI 1.33-4.78], p = 0.005), TLGratio (HR 3.89 [95% CI 1.46-10.34], p = 0.006), and surgical margin status (p < 0.05) were independent predictors of OS. MTVratio and TLGratio are independent prognostic factors for survival in patients after NACT and resection for EC. (orig.)
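
    A Cox proportional hazards analysis of the kind reported here can be sketched with the lifelines package (assumed installed); the data frame below is invented for illustration, with mtv_ratio and tlg_ratio standing in for the post/pre-NACT change ratios:

        import pandas as pd
        from lifelines import CoxPHFitter  # assumes the lifelines package is installed

        # Invented example data: overall survival in months, death indicator, and
        # post/pre-NACT change ratios for MTV and TLG (all values fabricated).
        df = pd.DataFrame({
            "os_months": [34, 12, 55, 8, 40, 22, 61, 15, 28, 47],
            "death":     [ 0,  1,  0, 1,  1,  0,  0,  1,  1,  0],
            "mtv_ratio": [0.5, 1.1, 0.3, 1.4, 0.6, 0.9, 0.4, 1.2, 0.8, 1.0],
            "tlg_ratio": [0.4, 1.0, 0.35, 1.5, 0.8, 0.7, 0.5, 0.9, 1.1, 0.6],
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="os_months", event_col="death")
        cph.print_summary()  # hazard ratios exp(coef) with 95% confidence intervals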

  11. Volumetric parameters on FDG PET can predict early intrahepatic recurrence-free survival in patients with hepatocellular carcinoma after curative surgical resection

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jeong Won [Catholic Kwandong University College of Medicine, Department of Nuclear Medicine, Incheon (Korea, Republic of); Hwang, Sang Hyun; Kim, Hyun Jeong; Kim, Dongwoo; Cho, Arthur; Yun, Mijin [Yonsei University College of Medicine, Department of Nuclear Medicine, Seoul (Korea, Republic of)

    2017-11-15

    This study assessed the prognostic values of volumetric parameters on ¹⁸F-fluorodeoxyglucose (FDG) positron emission tomography (PET) in predicting early intrahepatic recurrence-free survival (RFS) after curative resection in patients with hepatocellular carcinoma (HCC). A retrospective analysis was performed on 242 patients with HCC who underwent staging FDG PET and subsequent curative surgical resection. The tumor-to-non-tumorous liver uptake ratio, metabolic tumor volume (MTV), and total lesion glycolysis (TLG) of the HCC lesions on PET were measured. The prognostic values of clinical factors and PET parameters for predicting overall RFS, overall survival (OS), extrahepatic RFS, and early and late intrahepatic RFS were assessed. The median follow-up period was 54.7 months, during which 110 patients (45.5%) experienced HCC recurrence and 62 (25.6%) died. Patients with extrahepatic and early intrahepatic recurrence showed worse OS than did those with no recurrence or late intrahepatic recurrence (p < 0.001). Serum bilirubin level, MTV, and TLG were independent prognostic factors for overall RFS and OS (p < 0.05). Only MTV and TLG were prognostic for extrahepatic RFS (p < 0.05). Serum alpha-fetoprotein and bilirubin levels, MTV, and TLG were prognostic for early intrahepatic RFS (p < 0.05) and hepatitis C virus (HCV) positivity and serum albumin level were independently prognostic for late intrahepatic RFS (p < 0.05). Intrahepatic recurrence showed different prognoses according to the time interval of recurrence in which early recurrence had as poor survival as extrahepatic recurrence. MTV and TLG on initial staging PET were significant independent factors for predicting early intrahepatic and extrahepatic RFS in patients with HCC after curative resection. Only HCV positivity and serum albumin level were significant for late intrahepatic RFS, which is mainly attributable to the de novo formation of new primary HCC. (orig.)

  12. Pretreatment F-18 FDG PET/CT Parameters to Evaluate Progression-Free Survival in Gastric Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jeonghun; Lim, Seok Tae; Na, Chang Ju; Han, Yeonhee; Kim, Chanyoung; Jeong, Hwanjeong; Sohn, Myunghee [Chonbuk National Univ., Jeonju (Korea, Republic of)

    2014-03-15

    We performed this study to evaluate the predictive value of pretreatment F-18 FDG PET/CT for progression-free survival (PFS) in patients with gastric cancer. Of 321 patients with a diagnosis of gastric cancer, we retrospectively enrolled 97 patients (men:women = 61:36, age 59.8±13.2 years), who underwent pretreatment F-18 fluoro-2-deoxyglucose positron emission tomography/computed tomography (F-18 FDG PET/CT) from January 2009 to December 2009. Maximum standardized uptake value (SUVmax) was measured for each case with detectable primary lesions. In the remaining non-detectable cases, SUVmax was measured from the corresponding site seen on gastroduodenoscopy for analysis. In subgroup analysis, metabolic tumor volume (MTV) was measured in 50 patients with clearly distinguishable primary lesions. SUVmax, stage, depth of tumor invasion and presence of lymph node metastasis were analyzed in terms of PFS. Receiver operating characteristic (ROC) curves were used to find optimal cutoff values of SUVmax and MTV for disease progression. The relationship between SUVmax, MTV and PFS was analyzed using the Kaplan-Meier method with log-rank test and Cox's proportional hazard regression methods. Of 97 patients, 15 (15.5%) had disease progression. The mean follow-up duration was 29.6±10.2 months. The mean PFS of the low SUVmax group (≤5.74) was significantly longer than that of the high SUVmax group (>5.74) (30.9±8.0 vs 24.3±13.6 months, p = 0.008). In univariate analysis, stage (I vs II, III, IV), depth of tumor invasion (T1 vs T2, T3, T4), presence of lymph node metastasis and SUVmax (>5.74 vs ≤5.74) were significantly associated with recurrence. In multivariate analysis, high SUVmax (>5.74) was the only poor prognostic factor for PFS (p = 0.002, HR 11.03, 95% CI 2.48-49.05). Subgroup multivariate analysis revealed that high MTV (>16.42) was the only poor prognostic factor for PFS (p = 0.034, HR 3.59, 95% CI 1.10-11.71). In gastric cancer, SUVmax measured by pretreatment F-18
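
    The record's workflow, dichotomizing patients at an SUVmax cutoff found by ROC analysis and then comparing progression-free survival with a log-rank test, can be sketched as follows. The cohort is simulated and the reuse of the 5.74 cutoff is illustrative only; the lifelines package is assumed available:

        import numpy as np
        from lifelines.statistics import logrank_test  # assumed available

        rng = np.random.default_rng(0)

        # Simulated cohort: SUVmax, PFS in months and a progression indicator.
        # The exponential means loosely echo the record's 24.3 vs 30.9 months.
        suvmax = rng.uniform(2.0, 12.0, 60)
        pfs = np.where(suvmax > 5.74,
                       rng.exponential(24.3, 60),
                       rng.exponential(30.9, 60))
        progressed = rng.integers(0, 2, 60)  # 1 = progression, 0 = censored

        high = suvmax > 5.74  # cutoff taken from the record's ROC analysis
        result = logrank_test(pfs[high], pfs[~high],
                              event_observed_A=progressed[high],
                              event_observed_B=progressed[~high])
        print(f"log-rank p = {result.p_value:.3f}")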

  13. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a lookout etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...

  14. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  15. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

    Full Text Available From the integration of nonsymmetrical hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, the generalized Gaussian and Laplace pdf. Their cumulative functions and moments were also obtained analytically.

  16. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which include subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  17. A new formalism for modelling parameters α and β of the linear-quadratic model of cell survival for hadron therapy

    Science.gov (United States)

    Vassiliev, Oleg N.; Grosshans, David R.; Mohan, Radhe

    2017-10-01

    We propose a new formalism for calculating parameters α and β of the linear-quadratic model of cell survival. This formalism, primarily intended for calculating relative biological effectiveness (RBE) for treatment planning in hadron therapy, is based on a recently proposed microdosimetric revision of the single-target multi-hit model. The main advantage of our formalism is that it reliably produces α and β that have correct general properties with respect to their dependence on physical properties of the beam, including the asymptotic behavior for very low and high linear energy transfer (LET) beams. For example, in the case of monoenergetic beams, our formalism predicts that, as a function of LET, (a) α has a maximum and (b) the α/β ratio increases monotonically with increasing LET. No prior models reviewed in this study predict both properties (a) and (b) correctly, and therefore, these prior models are valid only within a limited LET range. We first present our formalism in a general form, for polyenergetic beams. A significant new result in this general case is that parameter β is represented as an average over the joint distribution of energies E 1 and E 2 of two particles in the beam. This result is consistent with the role of the quadratic term in the linear-quadratic model. It accounts for the two-track mechanism of cell kill, in which two particles, one after another, damage the same site in the cell nucleus. We then present simplified versions of the formalism, and discuss predicted properties of α and β. Finally, to demonstrate consistency of our formalism with experimental data, we apply it to fit two sets of experimental data: (1) α for heavy ions, covering a broad range of LETs, and (2) β for protons. In both cases, good agreement is achieved.
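
    For readers who want to experiment with the linear-quadratic form itself, the short sketch below evaluates the surviving fraction S(D) = exp(-αD - βD²) for a few (α, β) pairs; the parameter values are arbitrary placeholders, not values fitted with the proposed formalism:

        import numpy as np

        def lq_survival(dose, alpha, beta):
            """Linear-quadratic surviving fraction S(D) = exp(-alpha*D - beta*D^2)."""
            dose = np.asarray(dose, dtype=float)
            return np.exp(-alpha * dose - beta * dose**2)

        doses = np.linspace(0.0, 8.0, 5)  # Gy
        # Placeholder (alpha, beta) pairs mimicking increasing LET; note that
        # alpha/beta rises across the rows, as property (b) requires.
        for alpha, beta in [(0.15, 0.05), (0.35, 0.04), (0.90, 0.03)]:
            sf = lq_survival(doses, alpha, beta)
            print(f"alpha/beta = {alpha / beta:5.1f} Gy:",
                  " ".join(f"{s:.3e}" for s in sf))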

  18. Impact of CyberKnife Radiosurgery on Overall Survival and Various Parameters of Patients with 1-3 versus ≥ 4 Brain Metastases.

    Science.gov (United States)

    Murovic, Judith; Ding, Victoria; Han, Summer S; Adler, John R; Chang, Steven D

    2017-10-24

    Introduction: This study's objective is to compare the overall survivals (OSs) and various parameters of patients with 1-3 versus ≥ 4 brain metastases post-CyberKnife radiosurgery (CKRS) (Accuray, Sunnyvale, California) alone. Methods: Charts of 150 patients, from 2009-2014, treated with only CKRS for brain metastases were reviewed retrospectively for overall survival (OS) and patient, tumor, and imaging characteristics. Parameters included demographics, Eastern Cooperative Oncology Group (ECOG) performance scores, number and control of extracranial disease (ECD) sites, cause of death (COD), histology, tumor volume (TV), and post-CKRS whole brain radiotherapy (WBRT). The imaging characteristics assessed were time of complete response (CR), partial response (PR), stable imaging or local failure (LF), and distal brain failure (DBF). Patients and their data were divided into those with 1-3 (group 1) versus ≥ 4 brain metastases (group 2). For each CR and LF patient, absolute neutrophil count (ANC), absolute lymphocyte count (ALC), and ANC/ALC ratio (NLR) were obtained, when available, at the time of CKRS. Results: Both group 1 and group 2 had a median OS of 13 months. The patient median age for the 115 group 1 patients versus the 35 group 2 patients was 62 versus 56 years. Group 1 had slightly more males and group 2, females. The predominant ECOG score for each group was 1, and the number of ECD sites was one and two, respectively. Uncontrolled ECD occurred in the majority of both group 1 and group 2 patients. The main COD was ECD in both groups. The prevalent tumor histology for groups 1 and 2 was non-small cell lung carcinoma. Median TVs were 1.08 cc versus 1.42 cc for groups 1 and 2, respectively. The majority of patients in both groups did not undergo post-CKRS WBRT. Imaging outcomes were LC (CR, PR, or stable imaging) in 93 (80.9%) and 26 (74.3%) group 1 and 2 patients, of whom 32 (27.8%) and six (17.1%) had CR; 38 (33.0%) and 18 (51.4%), PR and 23

  19. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  20. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  1. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.

  2. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  3. On the probability of cure for heavy-ion radiotherapy

    International Nuclear Information System (INIS)

    Hanin, Leonid; Zaider, Marco

    2014-01-01

    The probability of a cure in radiation therapy (RT)—viewed as the probability of eventual extinction of all cancer cells—is unobservable, and the only way to compute it is through modeling the dynamics of the cancer cell population during and post-treatment. The conundrum at the heart of biophysical models aimed at such prospective calculations is the absence of information on the initial size of the subpopulation of clonogenic cancer cells (also called stem-like cancer cells), which largely determines the outcome of RT, in both individual and population settings. Other relevant parameters (e.g. potential doubling time, cell loss factor and survival probability as a function of dose) are, at least in principle, amenable to empirical determination. In this article we demonstrate that, for heavy-ion RT, microdosimetric considerations (justifiably ignored in conventional RT) combined with an expression for the clone extinction probability obtained from a mechanistic model of radiation cell survival lead to useful upper bounds on the size of the pre-treatment population of clonogenic cancer cells as well as upper and lower bounds on the cure probability. The main practical impact of these limiting values is the ability to make predictions about the probability of a cure for a given population of patients treated with newer, still unexplored treatment modalities from the empirically determined probability of a cure for the same or a similar population resulting from conventional low linear energy transfer (typically photon/electron) RT. We also propose that the current trend to deliver a lower total dose in a smaller number of fractions with larger-than-conventional doses per fraction has physical limits that must be understood before embarking on a particular treatment schedule. (paper)
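
    The clone-extinction view of cure adopted here is commonly summarized by the Poisson tumour control probability TCP = exp(-N·S(D)), where N is the initial number of clonogenic cells and S(D) their survival probability after dose D. The sketch below (all parameter values assumed, with a fractionated linear-quadratic survival model) shows how upper and lower bounds on N translate directly into bounds on the cure probability:

        import numpy as np

        def tcp(n_clonogens, n_fractions=30, d=2.0, alpha=0.3, beta=0.03):
            """Poisson tumour control probability with fractionated LQ survival:
            S = exp(-n_fractions * (alpha*d + beta*d^2)); TCP = exp(-N*S).
            alpha (Gy^-1) and beta (Gy^-2) are placeholder values."""
            log_s = -n_fractions * (alpha * d + beta * d**2)
            return np.exp(-n_clonogens * np.exp(log_s))

        # Bounds on the initial clonogen number N translate into TCP bounds.
        for n in (1e8, 1e9, 1e10):
            print(f"N = {n:.0e}: TCP = {tcp(n):.3f}")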

  4. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  5. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  6. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Taking the Gaussian distribution into account, the author estimates the probability of the relative displacement of the isolated mass remaining below the vibration criteria. This problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are derived, so that the possibility of exceeding vibration criteria VC-E and VC-D is kept below 0.04.
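
    The central computation in this record, the chance that a zero-mean Gaussian relative displacement exceeds a vibration criterion, reduces to an error-function evaluation. A minimal sketch under assumed numbers (the criterion level and RMS responses are placeholders, not the article's):

        import math

        def prob_exceed(sigma_um, criterion_um):
            """Two-sided exceedance P(|x| > criterion) for a zero-mean Gaussian
            relative displacement with standard deviation sigma_um."""
            z = criterion_um / sigma_um
            return math.erfc(z / math.sqrt(2.0))

        # Placeholder RMS responses for two candidate designs against an
        # assumed criterion level of 6.25 (in the same units as sigma).
        for sigma in (2.0, 3.0):
            print(f"sigma = {sigma}: P(exceed) = {prob_exceed(sigma, 6.25):.4f}")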

  7. Survival curves for irradiated cells

    International Nuclear Information System (INIS)

    Gibson, D.K.

    1975-01-01

    The subject of the lecture is the probability of survival of biological cells which have been subjected to ionising radiation. The basic mathematical theories of cell survival as a function of radiation dose are developed. A brief comparison with observed survival curves is made. (author)

  8. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, psychophysical foundations of the probability weighting functions have been unknown. Notably, a behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p)=exp(-(-ln p)^α) (where 0<α<1, w(0)=0, w(1/e)=1/e, w(1)=1), which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
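
    Prelec's one-parameter weighting function, as reconstructed above, is straightforward to evaluate. This short sketch (the α value is arbitrary) verifies the fixed points w(0)=0, w(1/e)=1/e, w(1)=1 and the characteristic overweighting of small probabilities:

        import math

        def prelec_w(p, alpha=0.65):
            """Prelec (1998) weighting: w(p) = exp(-(-ln p)^alpha), 0 < alpha < 1."""
            if p == 0.0:
                return 0.0
            return math.exp(-((-math.log(p)) ** alpha))

        for p in (0.0, 0.01, 0.1, 1 / math.e, 0.5, 0.9, 1.0):
            print(f"p = {p:.3f}  w(p) = {prelec_w(p):.3f}")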

  9. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  10. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned

  11. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  12. Feeding ω-3 PUFA enriched rotifers to Galaxias maculatus (Jenyns, 1842 larvae reared at different salinity conditions: effects on growth parameters, survival and fatty acids profile

    Directory of Open Access Journals (Sweden)

    Patricio Dantagnan

    2013-07-01

    Full Text Available Despite the well-known importance of ω-3 polyunsaturated fatty acids (PUFA) in marine and freshwater fish larvae, there are few studies on how essential fatty acid requirements and whole-body composition can be altered by changes in water salinity. The present study aimed to determine the effect of salinity on ω-3 PUFA requirements, larval growth, survival and fatty acid composition of Galaxias maculatus larvae cultured at two different salinities (0 and 15 g L-1) for 20 days while fed rotifers containing two different levels of ω-3 PUFA (1.87 and 3.16%). The results showed a marked difference in ω-3 PUFA requirements and in the pattern of fatty acid deposition in the whole body of larvae reared at different salinities, depending on the ω-3 PUFA levels in the diets. Thus, to improve growth and survival, larvae of G. maculatus reared at 0 g L-1 require higher levels of ω-3 PUFA, principally 18:3 ω-3. Larvae reared at salinities of 15 g L-1 require low levels of ω-3 PUFA for optimal survival, especially 18:3 ω-3. Eicosapentaenoic acid and docosahexaenoic acid content in the whole body of larvae was also affected by water salinity.

  13. Comparison of Effects of Separate and Combined Sugammadex and Lipid Emulsion Administration on Hemodynamic Parameters and Survival in a Rat Model of Verapamil Toxicity.

    Science.gov (United States)

    Tulgar, Serkan; Kose, Halil Cihan; Demir Piroglu, Isılay; Karakilic, Evvah; Ates, Nagihan Gozde; Demir, Ahmet; Gergerli, Ruken; Guven, Selin; Piroglu, Mustafa Devrim

    2016-03-25

    Toxicity of calcium channel blockers leads to high patient mortality and there is no effective antidote. The benefit of using 20% lipid emulsion and sugammadex has been reported. The present study measured the effect of sugammadex and 20% lipid emulsion on hemodynamics and survival in a rat model of verapamil toxicity. In this single-blinded randomized control study, rats were separated into 4 groups of 7 rats each: Sugammadex (S), Sugammadex plus 20% lipid emulsion (SL), 20% lipid emulsion (L), and control (C). Heart rates and mean arterial pressures were monitored and noted each minute until death. Average time to death was 21.0±9.57 minutes for group C, 35.57±10.61 minutes for group S, 37.14±16.6 minutes for group L and 49.86±27.56 minutes for group SL. Time to death was significantly longer in the other groups than in the control group (p<0.05). These results suggest that sugammadex and lipid emulsion may have a positive effect on survival in patients with calcium channel blocker toxicity. Sugammadex and intralipid increased survival in a rat model of verapamil toxicity. The combination of both drugs may decrease cardiotoxicity. Sugammadex alone or combined with 20% lipid emulsion reduced the need for inotropic agents. The mechanism requires clarification with larger studies.

  14. The role of circulating anti-p53 antibodies in patients with advanced non-small cell lung cancer and their correlation to clinical parameters and survival

    International Nuclear Information System (INIS)

    Bergqvist, Michael; Brattström, Daniel; Larsson, Anders; Hesselius, Patrik; Brodin, Ola; Wagenius, Gunnar

    2004-01-01

    Lung cancer causes approximately one million deaths each year worldwide, and protein p53 has been shown to be involved in the intricate processes regulating response to radiation and/or chemotherapeutic treatment. Consequently, since antibodies against p53 (anti-p53 antibodies) are associated with mutations within the p53 gene, it seems likely that these antibodies could, hypothetically, be correlated with prognosis. Serum samples from patients with non-small cell lung cancer (NSCLC) admitted to the Department of Oncology, University Hospital, Uppsala, Sweden, during 1983–1996 were studied. Anti-p53 antibodies were measured using a sandwich ELISA (Dianova, Hamburg, Germany). The present study included 84 patients with stage IIIA-IV (advanced NSCLC). At least three serum samples from each patient were collected, and altogether 529 serum samples were analysed for the presence of anti-p53 antibodies. The median value of anti-p53 antibodies was 0.06 (range 0 – 139.8). Seventeen percent of the investigated NSCLC first serum samples (n = 84) expressed elevated levels of anti-p53 antibodies. Anti-p53 antibodies were not correlated with tumour volume or platelets. Survival analysis showed that anti-p53 antibodies were not associated with survival in univariate analysis (p = 0.29). However, patients with adenocarcinoma had a significantly poorer survival if they expressed anti-p53 antibodies (p = 0.01), whereas this was not found for patients with squamous cell carcinoma (p = 0.13). In patients where the blood samples were collected during radiation therapy, a statistically significant correlation towards poorer survival was found (p = 0.05) when elevated anti-p53 antibody levels were present. No correlations to survival were found for serum samples collected prior to radiation therapy, during chemotherapy, or during follow-up. When anti-p53 antibodies were measured continuously, no increase in median anti-p53 values was observed the closer the individual patient came to...

  15. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories, whether quantum or classical, is considered. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogorov) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of an observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  16. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  17. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications....

  18. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  19. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  20. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introduction...

  1. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory. Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of various...

  2. An investigation of the ignition probability and data analysis for the detection of relevant parameters of mechanically generated steel sparks in explosive gas/air-mixtures; Untersuchungen zur Zuendwahrscheinlichkeit und Datenanalyse zur Erfassung der Einflussgroessen mechanisch erzeugter Stahl-Schlagfunken in explosionsfaehigen Brenngas/Luft-Gemischen

    Energy Technology Data Exchange (ETDEWEB)

    Grunewald, Thomas; Finke, Robert; Graetz, Rainer

    2010-07-01

    Mechanically generated sparks are a potential source of ignition in highly combustible atmospheres. A multiplicity of mechanical and reaction-kinetic influences causes a complex interaction of parameters, and little is known about their effect on the ignition probability. The ignition probability of mechanically generated sparks, for a material combination of unalloyed steel/unalloyed steel and a kinetic impact energy between 3 and 277 Nm, could be determined with statistically acceptable confidence. In addition, the explosiveness of non-oxidized particles at increased temperatures in over-stoichiometric mixtures was proven. A unique correlation between impact energy and ignition probability, as well as a correlation between impact energy and the number of separated particles, could be determined. However, a principal component analysis considering the interaction of individual particles could not find a specific combination of measurable particle characteristics that correlates with a distinct increase of the ignition probability.

  3. Apoptosis and survival parameters during protection from radiation-induced thymocyte death by a candidate radioprotector, GC-2112, from Allium sativum

    International Nuclear Information System (INIS)

    Bunagan, J.; Perey, K.; Deocaris, C.C.

    1996-01-01

    Biomedical studies on nuclear fallout effects show that whole-body exposure to relatively low doses of ionizing radiation (2-10 Gy) induces the hematopoietic syndrome (HS), characterized by severe anemia and immunodeficiency, and death within 10-30 days. The thymocyte model is used in many cell death studies and is found to undergo a morphologically and molecularly distinct p53-based apoptosis after DNA-damaging insults, such as radiation exposure. We have shown that an exogenously applied radioprotector from Allium sativum (garlic), GC-2112, improves total cellular survival over various observation periods, concomitantly shifting the LD 50/24 from 7 Gy (control) to 21 Gy (GC-2112). This increased survival of the radioprotected, macrophage-free thymocytes, however, fails to correlate with prevention of apoptosis-associated DNA scissions. The mechanisms of the observed radiomodification may involve the cysteine compounds that are abundant in garlic. These preliminary findings show promise for the application of selected herbal drugs as dietary prophylaxis against clinical morbidities arising from medical, occupational or environmental exposures to ionizing radiation. (author)

  4. Apoptosis and survival parameters during protection from radiation-induced thymocyte death by a candidate radioprotector, GC-2112, from Allium sativum

    Energy Technology Data Exchange (ETDEWEB)

    Bunagan, J; Perey, K [Pamantasan ng Lungsod ng Maynila, Manila (Philippines); Deocaris, C C [Philippine Nuclear Research Inst., Diliman, Quezon City (Philippines)

    1997-12-31

    Biomedical studies on nuclear fallout effects show that whole-body exposure to relatively low doses of ionizing radiation (2-10 Gy) induces the hematopoietic syndrome (HS), characterized by severe anemia and immunodeficiency, and death within 10-30 days. The thymocyte model is used in many cell death studies and is found to undergo a morphologically and molecularly distinct p53-based apoptosis after DNA-damaging insults, such as radiation exposure. We have shown that an exogenously applied radioprotector from Allium sativum (garlic), GC-2112, improves total cellular survival over various observation periods, concomitantly shifting the LD{sub 50/24} from 7 Gy (control) to 21 Gy (GC-2112). This increased survival of the radioprotected, macrophage-free thymocytes, however, fails to correlate with prevention of apoptosis-associated DNA scissions. The mechanisms of the observed radiomodification may involve the cysteine compounds that are abundant in garlic. These preliminary findings show promise for the application of selected herbal drugs as dietary prophylaxis against clinical morbidities arising from medical, occupational or environmental exposures to ionizing radiation. (author).

  5. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  6. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particular...

  7. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  8. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this...

  9. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  10. Rapidity gap survival in the black-disk regime

    International Nuclear Information System (INIS)

    Leonid Frankfurt; Charles Hyde; Mark Strikman; Christian Weiss

    2007-01-01

    We summarize how the approach to the black-disk regime (BDR) of strong interactions at TeV energies influences rapidity gap survival in exclusive hard diffraction pp -> p + H + p (H = dijet, Qbar Q, Higgs). Employing a recently developed partonic description of such processes, we discuss (a) the suppression of diffraction at small impact parameters by soft spectator interactions in the BDR; (b) further suppression by inelastic interactions of hard spectator partons in the BDR; (c) correlations between hard and soft interactions. Hard spectator interactions substantially reduce the rapidity gap survival probability at LHC energies compared to previously reported estimates

  11. Early differential cell death and survival mechanisms initiate and contribute to the development of OPIDN: A study of molecular, cellular, and anatomical parameters

    International Nuclear Information System (INIS)

    Damodaran, T.V.; Attia, M.K.; Abou-Donia, M.B.

    2011-01-01

    Organophosphorus-ester induced delayed neurotoxicity (OPIDN) is a neurodegenerative disorder characterized by ataxia progressing to paralysis, with a concomitant central and peripheral distal axonopathy. A single dose of diisopropylphosphorofluoridate (DFP) produces OPIDN in the chicken, resulting in mild ataxia in 7–14 days and severe paralysis as the disease progresses. White leghorn layer hens were treated with DFP (1.7 mg/kg, sc) after prophylactic treatment with atropine (1 mg/kg, sc) in normal saline and eserine (1 mg/kg, sc) in dimethyl sulfoxide. Control groups were treated with the vehicle propylene glycol (0.1 ml/kg, sc), atropine in normal saline and eserine in dimethyl sulfoxide. The hens were euthanized at different time points (1, 2, 5, 10 and 20 days), and the tissues from cerebrum, midbrain, cerebellum, brainstem and spinal cord were quickly dissected and frozen for mRNA (northern) studies. Northern blots were probed with BCL2, GADD45, beta actin, and 28S RNA to investigate their expression patterns. Another set of hens was treated for a series of time points and perfused with phosphate buffered saline and fixative for histological studies. Various staining protocols, such as Hematoxylin and Eosin (H and E), Sevier-Munger, Cresyl echt Violet for Nissl substance, and Gallocyanin stain for Nissl granules, were used to assess patterns of cell death and degenerative changes. Complex cell death mechanisms may be involved in the neuronal and axonal degeneration. These data indicate altered and differential mRNA expression of BCL2 (an anti-apoptotic gene) and GADD45 (a DNA damage-inducible gene) in the various tissues. Increased cell death and other degenerative changes, noted more in the susceptible regions (spinal cord and cerebellum) than in the resistant region (cerebrum), may indicate complex molecular pathways acting via altered BCL2 and GADD45 gene expression and causing a homeostatic imbalance between cell survival and cell death mechanisms. Semi-quantitative...

  12. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  13. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or mathematics. Includes problems with answers and six appendixes. 1965 edition.

  14. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability, across psychological aspects of formulating subjective probability statements and abstract measure-theoretical considerations, to contributions to theoretical statistics and...

  15. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  16. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fields...

  17. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Master's students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit theorem...

  18. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  19. Modelling survival

    DEFF Research Database (Denmark)

    Ashauer, Roman; Albert, Carlo; Augustine, Starrlight

    2016-01-01

    The General Unified Threshold model for Survival (GUTS) integrates previously published toxicokinetic-toxicodynamic models and estimates survival with explicitly defined assumptions. Importantly, GUTS accounts for time-variable exposure to the stressor. We performed three studies to test...

  20. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  1. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  2. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  3. Survival analysis

    International Nuclear Information System (INIS)

    Badwe, R.A.

    1999-01-01

    The primary endpoint in the majority of the studies has been either disease recurrence or death. This kind of analysis requires a special method, since not all patients in the study experience the endpoint. The standard method for estimating such a survival distribution is the Kaplan-Meier method. The survival function is defined as the proportion of individuals who survive beyond a certain time. Multivariate comparison of survival has been carried out with Cox's proportional hazards model.
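
    As a concrete illustration of the estimator named above, the following is a minimal Kaplan-Meier sketch; the follow-up times and censoring flags are invented for illustration and are not taken from the study.

```python
# Minimal Kaplan-Meier estimator: S(t) is the product over distinct event
# times t_i <= t of (1 - d_i / n_i), where d_i endpoints occur among the
# n_i individuals still at risk. Input data here are illustrative only.

def kaplan_meier(times, events):
    """times: follow-up times; events: 1 = endpoint observed, 0 = censored."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = leaving = 0
        while i < len(data) and data[i][0] == t:
            leaving += 1
            deaths += data[i][1]
            i += 1
        if deaths:
            survival *= 1.0 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= leaving
    return curve

# Six patients: endpoint at 3, 5, 8, 16; censored at 5 and 12.
print(kaplan_meier([3, 5, 5, 8, 12, 16], [1, 1, 0, 1, 0, 1]))
```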

  4. Network ties and survival

    DEFF Research Database (Denmark)

    Acheampong, George; Narteh, Bedman; Rand, John

    2017-01-01

    Poultry farming has been touted as one of the major ways by which poverty can be reduced in low-income economies like Ghana. Yet, anecdotally there is a high failure rate among these poultry farms. This study seeks to understand the relationship between network ties and the survival chances of small commercial poultry farms (SCPFs). We utilize data from a 2-year network survey of SCPFs in rural Ghana. The survival of these poultry farms is modelled using a lagged probit model of farms that persisted from 2014 into 2015. We find that network ties are important to survival chances, but this probability falls as the number of industry ties increases; moderation by the dynamic capability of the firm reverses this trend. Our findings show that not all network ties aid survival, and therefore small commercial poultry farmers need to be circumspect in the network ties they cultivate and develop.

  5. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)

  6. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science, which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairly...

  7. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  8. Compensatory effects of recruitment and survival when amphibian populations are perturbed by disease

    Science.gov (United States)

    Muths, E.; Scherer, R. D.; Pilliod, D.S.

    2011-01-01

    The need to increase our understanding of factors that regulate animal population dynamics has been catalysed by recent, observed declines in wildlife populations worldwide. Reliable estimates of demographic parameters are critical for addressing basic and applied ecological questions and understanding the response of parameters to perturbations (e.g. disease, habitat loss, climate change). However, to fully assess the impact of perturbation on population dynamics, all parameters contributing to the response of the target population must be estimated. We applied the reverse-time model of Pradel in Program MARK to 6 years of capture-recapture data from two populations of Anaxyrus boreas (boreal toad), one with disease and one without. We then assessed a priori hypotheses about differences in survival and recruitment relative to local environmental conditions and the presence of disease. We further explored the relative contribution of survival probability and recruitment rate to population growth and investigated how shifts in these parameters can alter population dynamics when a population is perturbed. High recruitment rates (0.41) are probably compensating for low survival probability (range 0.51–0.54) in the population challenged by an emerging pathogen, resulting in a relatively slow rate of decline. In contrast, the population with no evidence of disease had high survival probability (range 0.75–0.78) but lower recruitment rates (0.25). Synthesis and applications. We suggest that the relationship between survival and recruitment may be compensatory, providing evidence that populations challenged with disease are not necessarily doomed to extinction. A better understanding of these interactions may help to explain, and be used to predict, population regulation and persistence for wildlife threatened with disease. Further, reliable estimates of population parameters such as recruitment and survival can guide the formulation and implementation of...
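
    The compensation argument can be made concrete through Pradel's identity, in which the realized population growth rate is the sum of apparent survival and per-capita recruitment; the midpoint values below are assumptions taken from the ranges quoted in the abstract.

```python
# In Pradel's temporal-symmetry model the realized growth rate is
# lambda = phi + f (apparent survival plus per-capita recruitment).
# Midpoints of the reported ranges are used as illustrative inputs.
phi_diseased, f_diseased = 0.525, 0.41  # low survival, high recruitment
phi_healthy, f_healthy = 0.765, 0.25    # high survival, low recruitment

print(phi_diseased + f_diseased)  # ~0.94: slow decline despite disease
print(phi_healthy + f_healthy)    # ~1.02: roughly stable population
```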

  9. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions: Sample Space and Events; Probabilities; Counting Techniques. Independence and Conditional Probability: Independence; Conditioning; The Borel-Cantelli Theorem. Discrete Random Variables: Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions. Convergence of Random Variables: The Law of Large Numbers; Conditional Expectation. Generating Functions, Branching Processes, Random Walk Revisited: Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk. Markov Chains: Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity. Continuous Random Variables: Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation. Distribution F...

  10. Essays on Subjective Survival Probabilities, Consumption, and Retirement Decisions

    NARCIS (Netherlands)

    Kutlu Koc, Vesile

    2015-01-01

    Recent pension reforms in industrialized countries are, in part, motivated by increased life expectancy. As individuals are expected to take more responsibility in their retirement planning and savings decisions, it is important to understand whether they are aware of improvements in life expectancy...

  11. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    ...either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  12. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, including...

  13. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  14. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  15. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  16. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    Kim, Y.K.

    1980-01-01

    Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods

  17. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extended...

  18. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. The probability of fire calculated by means of the model parameters and the attributes of specific buildings can subsequently be visualized in probability maps.
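
    To make the mapping step concrete, here is a minimal sketch of how a fitted logistic regression turns building attributes into a fire probability; the feature set and coefficients are hypothetical, not the study's estimates.

```python
# Logistic model: P(fire) = 1 / (1 + exp(-(b0 + b . x))).
# Features and coefficients below are made up for illustration.
import math

def fire_probability(x, beta0, betas):
    z = beta0 + sum(b * xi for b, xi in zip(betas, x))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical features: floor area (100 m^2), age (decades), solid-fuel heating flag
print(fire_probability([4.2, 5.0, 1.0], beta0=-6.0, betas=[0.15, 0.12, 0.8]))
```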

  19. Inferential Statistics from Black Hispanic Breast Cancer Survival Data

    Directory of Open Access Journals (Sweden)

    Hafiz M. R. Khan

    2014-01-01

    In this paper we test statistical probability models for breast cancer survival data by race and ethnicity. Data were collected from breast cancer patients diagnosed in the United States during the years 1973–2009. We selected a stratified random sample of Black Hispanic female patients from the Surveillance, Epidemiology, and End Results (SEER) database to derive the statistical probability models. We used three common model-building criteria, the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC), and the Deviance Information Criterion (DIC), to measure goodness of fit, and found that the Black Hispanic female patients' survival data better fit the exponentiated exponential probability model. A novel Bayesian method was used to derive the posterior density function for the model parameters as well as the predictive inference for future response. We specifically focused on the Black Hispanic race. The Markov chain Monte Carlo (MCMC) method was used to obtain the summary results for the posterior parameters. Additionally, we report predictive intervals for future survival times. These findings would be of great significance in treatment planning and healthcare resource allocation.
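
    The model-scoring step can be sketched as follows, assuming the standard two-parameter form of the exponentiated exponential distribution; the survival times are simulated stand-ins, not SEER data.

```python
# Fit the exponentiated exponential distribution,
# f(x) = a * l * exp(-l*x) * (1 - exp(-l*x))**(a - 1),
# by maximum likelihood and score it with AIC/BIC.
import math
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x = rng.gamma(shape=2.0, scale=30.0, size=200)  # stand-in survival times (months)

def neg_log_lik(params):
    a, l = params
    if a <= 0 or l <= 0:
        return np.inf
    u = 1.0 - np.exp(-l * x)
    return -np.sum(np.log(a) + np.log(l) - l * x + (a - 1.0) * np.log(u))

fit = minimize(neg_log_lik, x0=[1.0, 0.02], method="Nelder-Mead")
k, n = 2, len(x)
aic = 2 * k + 2 * fit.fun
bic = k * math.log(n) + 2 * fit.fun
print(fit.x, aic, bic)
```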

  20. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels 'possible to occur' or 'impossible to occur' to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome...

  1. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels 'possible to occur' or 'impossible to occur' to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a...

  2. Waste Package Misload Probability

    International Nuclear Information System (INIS)

    Knudsen, J.K.

    2001-01-01

    The objective of this calculation is to determine the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and of FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in each event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.
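
    The final step of such a calculation reduces to a frequency estimate, as in the sketch below; the counts are placeholders, not figures from the Framatome ANP report.

```python
# Probability of occurrence as an observed frequency: misload events
# divided by total fuel assembly (FA) movements. Counts are placeholders.
misload_events = 3
total_fa_moves = 120_000
print(f"P(misload per FA move) ~ {misload_events / total_fa_moves:.2e}")
```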

  3. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  4. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  5. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  6. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  7. Retrocausality and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two accounts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagen-like disavowal of realism in quantum mechanics. 6 refs. (Author)

  8. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  9. A track-event theory of cell survival

    International Nuclear Information System (INIS)

    Besserer, Juergen; Schneider, Uwe

    2015-01-01

    When fractionation schemes for hypofractionation and stereotactic body radiotherapy are considered, a reliable cell survival model at high dose is needed for calculating doses of similar biological effectiveness. In this work a simple model for cell survival, valid also at high dose, is developed from Poisson statistics. An event is defined by two double strand breaks (DSB) on the same or different chromosomes. An event is always lethal, due either to direct lethal damage or to lethal binary misrepair through the formation of chromosome aberrations. Two different mechanisms can produce events: one-track events (OTE) or two-track events (TTE). The target for an OTE is always a lethal event; the target for a TTE is one DSB. At least two TTEs on the same or different chromosomes are necessary to produce an event. The OTE and the TTE are statistically independent. From the stochastic nature of cell kill, which is described by the Poisson distribution, the cell survival probability was derived. It was shown that a solution based on Poisson statistics exists for cell survival. It exhibits exponential cell survival at high dose and a finite gradient of cell survival at vanishing dose, which is in agreement with experimental cell studies. The model fits the experimental data nearly as well as the three-parameter formula of Hug-Kellerer, yet is based on only two free parameters. It is shown that the LQ formalism is an approximation of the model derived in this work. It could also be shown that the derived model predicts a fractionated cell survival experiment better than the LQ model. In summary, cell survival can be described with a simple analytical formula on the basis of Poisson statistics; this solution exhibits the typical exponential behaviour in the limit of large dose and predicts cell survival after fractionated dose application better than the LQ model.
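
    The Poisson bookkeeping described above can be sketched as follows, assuming lethal one-track events arrive with mean pD and single-DSB two-track events with mean qD, so that survival requires zero OTEs and at most one TTE; the published two-parameter formula may differ in detail, and the values of p and q are illustrative.

```python
# Survival under the stated Poisson assumptions:
# S(D) = P(no OTE) * P(at most one TTE) = exp(-p*D) * exp(-q*D) * (1 + q*D).
# At high dose log-survival becomes asymptotically linear, while the
# slope at D -> 0 is finite, matching the behaviour the abstract describes.
import math

def survival(dose, p, q):
    no_ote = math.exp(-p * dose)                             # P(zero one-track events)
    at_most_one_tte = math.exp(-q * dose) * (1 + q * dose)   # P(N_TTE <= 1)
    return no_ote * at_most_one_tte

for d in (0, 2, 4, 8, 16):
    print(d, survival(d, p=0.15, q=0.35))  # p, q per Gy, illustrative only
```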

  10. A track-event theory of cell survival

    Energy Technology Data Exchange (ETDEWEB)

    Besserer, Juergen; Schneider, Uwe [Zuerich Univ. (Switzerland). Inst. of Physics; Radiotherapy Hirslanden, Zuerich (Switzerland)

    2015-09-01

    When fractionation schemes for hypofractionation and stereotactic body radiotherapy are considered, a reliable cell survival model at high dose is needed for calculating doses of similar biological effectiveness. In this work a simple model for cell survival, valid also at high dose, is developed from Poisson statistics. An event is defined by two double strand breaks (DSB) on the same or different chromosomes. An event is always lethal, due either to direct lethal damage or to lethal binary misrepair through the formation of chromosome aberrations. Two different mechanisms can produce events: one-track events (OTE) or two-track events (TTE). The target for an OTE is always a lethal event; the target for a TTE is one DSB. At least two TTEs on the same or different chromosomes are necessary to produce an event. The OTE and the TTE are statistically independent. From the stochastic nature of cell kill, which is described by the Poisson distribution, the cell survival probability was derived. It was shown that a solution based on Poisson statistics exists for cell survival. It exhibits exponential cell survival at high dose and a finite gradient of cell survival at vanishing dose, which is in agreement with experimental cell studies. The model fits the experimental data nearly as well as the three-parameter formula of Hug-Kellerer, yet is based on only two free parameters. It is shown that the LQ formalism is an approximation of the model derived in this work. It could also be shown that the derived model predicts a fractionated cell survival experiment better than the LQ model. In summary, cell survival can be described with a simple analytical formula on the basis of Poisson statistics; this solution exhibits the typical exponential behaviour in the limit of large dose and predicts cell survival after fractionated dose application better than the LQ model.

  11. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.
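
    For the probit case, one such derived correlation follows directly from the latent-variable setup y* = b*x + e with standard normal e; the sketch below rests on that assumption, with made-up inputs rather than the authors' empirical applications.

```python
# With y* = b*x + e, e ~ N(0, 1), the latent correlation is
# corr(y*, x) = b * sd(x) / sqrt(b**2 * var(x) + 1).
import math

def latent_correlation(b, sd_x):
    return b * sd_x / math.sqrt(b**2 * sd_x**2 + 1.0)

print(latent_correlation(b=0.9, sd_x=1.3))  # ~0.76 for these invented values
```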

  12. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.

  13. Probability mapping of contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).
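
    The post-processing step described above lends itself to a one-line estimate: across many equally likely simulated maps, the probability of exceeding a threshold in each parcel is the fraction of realizations that exceed it. The array below is random stand-in data, not the Fernald geochemistry.

```python
# Per-parcel exceedance probability from an ensemble of simulations.
import numpy as np

rng = np.random.default_rng(2)
sims = rng.lognormal(mean=2.0, sigma=0.8, size=(500, 40, 40))  # (realization, y, x)
threshold = 15.0
prob_map = (sims > threshold).mean(axis=0)  # P(exceedance) per parcel
print(prob_map.shape, prob_map.max())
```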

  14. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)

  15. Probability of causation approach

    International Nuclear Information System (INIS)

    Jose, D.E.

    1988-01-01

    Probability of causation (PC) is sometimes viewed as a great improvement by those who are dissatisfied with the present rulings of courts in radiation cases. The author does not share that hope and expects that PC will not play a significant role in these issues for at least the next decade. If it is ever adopted in a legislative compensation scheme, it will be used in a way that is unlikely to please most scientists. Consequently, PC is a false hope for radiation scientists, and its best contribution may well lie in some of the spin-off effects, such as an influence on medical practice.

  16. Probability in High Dimension

    Science.gov (United States)

    2014-06-30

    precisely the content of the following result. The price we pay is that the assumption that A is a packing in (F, ‖·‖₁) is too weak to make this happen... Régularité des trajectoires des fonctions aléatoires gaussiennes. In: École d'Été de Probabilités de Saint-Flour, IV-1974, pp. 1-96. Lecture Notes in... Lectures on probability theory and statistics (Saint-Flour, 1994), Lecture Notes in Math., vol. 1648, pp. 165-294. Springer, Berlin (1996). 50. Ledoux

  17. Increasing Winter Maximal Metabolic Rate Improves Intrawinter Survival in Small Birds.

    Science.gov (United States)

    Petit, Magali; Clavijo-Baquet, Sabrina; Vézina, François

    Small resident bird species living at northern latitudes increase their metabolism in winter, and this is widely assumed to improve their chances of survival. However, the relationship between winter metabolic performance and survival has yet to be demonstrated. Using capture-mark-recapture, we followed a population of free-living black-capped chickadees (Poecile atricapillus) over 3 yr and evaluated their survival probability within and among winters. We also measured size-independent body mass (Ms), hematocrit (Hct), basal metabolic rate (BMR), and maximal thermogenic capacity (Msum) and investigated how these parameters influenced survival within and among winters. Results showed that survival probability was high and constant both within (0.92) and among (0.96) winters. They also showed that while Ms, Hct, and BMR had no significant influence, survival was positively related to Msum, following a sigmoid relationship, within but not among winters. Birds expressing an Msum below 1.26 W (i.e., similar to summer levels) had a markedly reduced probability of surviving the winter. Our data therefore suggest that black-capped chickadees that are either too slow or unable to adjust their phenotype from summer to winter have little chance of survival, and thus that seasonal upregulation of metabolic performance is highly beneficial. This study is the first to document in an avian system the relationship between thermogenic capacity and winter survival, a proxy of fitness.
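
    The sigmoid relationship reported above can be sketched as a logistic curve anchored at the 1.26 W value quoted in the abstract; the slope and the interpretation of the midpoint are illustrative assumptions, not the authors' fitted estimates:

        # Logistic (sigmoid) relation between maximal thermogenic capacity
        # (Msum, in W) and within-winter survival probability.
        import numpy as np

        def survival_prob(msum, midpoint=1.26, slope=15.0):
            """Survival rises steeply once Msum exceeds the summer-level midpoint."""
            return 1.0 / (1.0 + np.exp(-slope * (msum - midpoint)))

        for msum in (1.10, 1.26, 1.40, 1.60):
            print(f"Msum = {msum:.2f} W -> survival ~ {survival_prob(msum):.2f}")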

  18. Probability distribution of atmospheric pollutants: comparison among four methods for the determination of the log-normal distribution parameters

    Energy Technology Data Exchange (ETDEWEB)

    Bellasio, R [Enviroware s.r.l., Agrate Brianza, Milan (Italy). Centro Direzionale Colleoni; Lanzani, G; Ripamonti, M; Valore, M [Amministrazione Provinciale, Como (Italy)

    1998-04-01

    This work illustrates the possibility of interpolating the concentrations of CO, NO, NO₂, O₃, and SO₂ measured during one year (1995) at the 13 stations of the air quality monitoring network of the Provinces of Como and Lecco (Italy) by means of a log-normal distribution. Particular attention was given to choosing among four possible methods for determining the log-normal distribution parameters: (I) natural, (II) percentiles, (III) moments, and (IV) maximum likelihood. In order to evaluate the goodness of fit, a ranking procedure was carried out over the values of four indices: absolute deviation, weighted absolute deviation, the Kolmogorov-Smirnov index, and the Cramer-von Mises-Smirnov index. The capability of the log-normal distribution to fit the measured data is then discussed as a function of the pollutant and of the monitoring station. Finally, an example of application is given: the effect of an emission reduction strategy in the Lombardy Region (the so-called 'bollino blu') is evaluated using a log-normal distribution.
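
    Two of the four estimation methods named above (moments and maximum likelihood) are easy to sketch, together with a Kolmogorov-Smirnov index for ranking the fits; the data are synthetic stand-ins for measured concentrations:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        conc = rng.lognormal(mean=1.2, sigma=0.6, size=1000)   # synthetic data

        # Method of moments: match the sample mean and variance of a log-normal
        m, v = conc.mean(), conc.var()
        sigma2_mom = np.log(1.0 + v / m**2)
        mu_mom = np.log(m) - 0.5 * sigma2_mom

        # Maximum likelihood: sample statistics of the log-transformed data
        logs = np.log(conc)
        mu_mle, sigma_mle = logs.mean(), logs.std()

        for name, mu, sg in [("moments", mu_mom, np.sqrt(sigma2_mom)),
                             ("max. likelihood", mu_mle, sigma_mle)]:
            ks = stats.kstest(conc, "lognorm", args=(sg, 0, np.exp(mu))).statistic
            print(f"{name:16s} mu={mu:.3f} sigma={sg:.3f} KS={ks:.4f}")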

  19. Probability dynamics of a repopulating tumor in case of fractionated external radiotherapy.

    Science.gov (United States)

    Stavreva, Nadia; Stavrev, Pavel; Fallone, B Gino

    2009-12-01

    In this work two analytical methods are developed for computing the probability distribution of the number of surviving cells of a repopulating tumor during a fractionated external radio-treatment. Both methods are developed for the case of pure birth processes. They both allow the description of the tumor dynamics in case of cell radiosensitivity changing in time and for treatment schedules with variable dose per fraction and variable time intervals between fractions. The first method is based on a direct solution of the set of differential equations describing the tumor dynamics. The second method is based on the works of Hanin et al. [Hanin LG, Zaider M, Yakovlev AY. Distribution of the number of clonogens surviving fractionated radiotherapy: a long-standing problem revisited. Int J Radiat Biol 2001;77:205-13; Hanin LG. Iterated birth and death process as a model of radiation cell survival. Math Biosci 2001;169:89-107; Hanin LG. A stochastic model of tumor response to fractionated radiation: limit theorems and rate of convergence. Math Biosci 2004;191:1-17], where probability generating functions are used. In addition, a Monte Carlo algorithm for simulating the probability distributions is developed for the same treatment conditions as for the analytical methods. The probability distributions predicted by the three methods are compared graphically for a certain set of values of the model parameters, and an excellent agreement is found to exist between all three results, thus proving the correct implementation of the methods. However, numerical difficulties have been encountered with both analytical methods depending on the values of the model parameters. Therefore, the Poisson approximation is also studied and compared to the exact methods for several different combinations of the model parameter values. It is concluded that the Poisson approximation works sufficiently well only for slowly repopulating tumors and a low cell survival probability.
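
    The Monte Carlo route described above can be sketched for a pure birth (Yule) process with binomial cell kill at each fraction; all parameter values are illustrative, not taken from the paper, and the final comparison mirrors the Poisson approximation being tested:

        import numpy as np

        rng = np.random.default_rng(2)
        n0, s, b = 100, 0.5, 0.4        # initial cells, per-fraction survival, birth rate /day
        dt, n_frac, n_mc = 1.0, 10, 20000

        counts = np.full(n_mc, n0)
        for _ in range(n_frac):
            counts = rng.binomial(counts, s)     # stochastic cell kill at the fraction
            # Yule growth over dt: each surviving cell leaves a Geometric(e^{-b dt})
            # number of descendants, so their total is negative binomial.
            p = np.exp(-b * dt)
            grown = rng.negative_binomial(np.maximum(counts, 1), p) + counts
            counts = np.where(counts > 0, grown, 0)

        mean = counts.mean()
        print(f"MC extinction prob. {np.mean(counts == 0):.3f} "
              f"vs Poisson approx. {np.exp(-mean):.3f}")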

  20. Probable maximum flood control

    International Nuclear Information System (INIS)

    DeGabriele, C.E.; Wu, C.L.

    1991-11-01

    This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Flood protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility.

  1. Characteristic length of the knotting probability revisited

    International Nuclear Information System (INIS)

    Uehara, Erica; Deguchi, Tetsuo

    2015-01-01

    We present a self-avoiding polygon (SAP) model for circular DNA in which the radius of impermeable cylindrical segments corresponds to the screening length of double-stranded DNA surrounded by counter ions. For the model we evaluate the probability for a generated SAP with N segments having a given knot K through simulation. We call it the knotting probability of a knot K with N segments for the SAP model. We show that when N is large the most significant factor in the knotting probability is given by the exponentially decaying part exp(−N/N_K), where the estimates of the parameter N_K are consistent with the same value for all the different knots we investigated. We thus call it the characteristic length of the knotting probability. We give formulae expressing the characteristic length as a function of the cylindrical radius r_ex, i.e. the screening length of double-stranded DNA. (paper)
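
    The exponential decay makes the characteristic length easy to read off a fit; the sketch below uses synthetic placeholder values for the simulated knotting probabilities:

        import numpy as np

        N = np.array([200, 400, 600, 800, 1000])
        P = 0.8 * np.exp(-N / 300.0)        # synthetic P_K(N) with N_K = 300
        # Linear fit of log P against N: the slope is -1/N_K
        slope, _ = np.polyfit(N, np.log(P), 1)
        print(f"estimated characteristic length N_K = {-1.0 / slope:.0f}")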

  2. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to illustrate the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698). * Incorporates more than 1,000 engaging problems with answers. * Includes more than 300 solved examples. * Uses varied problem solving methods.

  3. Carbonaceous Survivability on Impact

    Science.gov (United States)

    Bunch, T. E.; Becker, Luann; Morrison, David (Technical Monitor)

    1994-01-01

    In order to gain knowledge about the potential contributions of comets and cosmic dust to the origin of life on Earth, we need to explore the survivability of their potential organic compounds on impact and the formation of secondary products that may have arisen from the chaotic events sustained by the carriers as they fell to Earth. We have performed a series of hypervelocity impact experiments using carbon-bearing impactors (diamond, graphite, kerogens, PAH crystals, and Murchison and Nogoya meteorites) into Al plate targets at velocities of ~6 km/s. Estimated peak shock pressures probably did not exceed 120 GPa and peak shock temperatures were probably less than 4000 K for times of nano- to microseconds. Nominal crater diameters are less than one mm. The most significant results of these experiments are the preservation of the higher mass PAHs (e.g., pyrene relative to naphthalene) and the formation of additional alkylated PAHs. We have also examined the residues of polystyrene projectiles impacted by a microparticle accelerator into targets at velocities up to 15 km/s. This talk will discuss the results of these experiments and their implications with respect to the survival of carbonaceous deliverables to early Earth. The prospects of survivability of organic molecules on "intact" capture of cosmic dust in space via soft and hard cosmic dust collectors will also be discussed.

  4. Probability and rational choice

    Directory of Open Access Journals (Sweden)

    David Botting

    2014-05-01

    Full Text Available http://dx.doi.org/10.5007/1808-1711.2014v18n1p1 In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.

  5. Analysis of histological and immunological parameters of metastatic lymph nodes from colon cancer patients reveals that T-helper 1 type immune response is associated with improved overall survival.

    Science.gov (United States)

    Nizri, Eran; Greenman-Maaravi, Nofar; Bar-David, Shoshi; Ben-Yehuda, Amir; Weiner, Gilad; Lahat, Guy; Klausner, Joseph

    2016-11-01

    Lymph node (LN) involvement in colonic carcinoma (CC) is a grave prognostic sign and mandates the addition of adjuvant treatment. However, in light of the histological variability and outcomes observed, we hypothesized that patients with LN metastases (LNM) comprise different subgroups. We retrospectively analyzed the histological sections of 82 patients with CC and LNM. We studied various histological parameters (such as tumor grade, desmoplasia, and preservation of LN architecture) as well as the prevalence of specific peritumoral immune cells (CD8, CD20, T-bet, and GATA-3). We correlated the histological and immunological data with patient outcome. Tumor grade was a significant prognostic factor even in patients with LNM, as was the number of LNs involved (N1/N2 stage). Of the morphological parameters tested (LN extracapsular invasion, desmoplasia in LN, LN architecture preservation, and mode of metastases distribution), none was found to be significantly associated with overall survival (OS). The mean OS of CD8-low patients was 66.6 ± 6.25 versus 71.4 ± 5.1 months for CD8-high patients (P = 0.79). However, T-helper (Th) 1 immune response skewing (measured by a Th1/Th2 ratio > 1) was significantly associated with improved OS. For patients with a low ratio, the median OS was 35.5 ± 5 versus 83.5 months for patients with a high Th1/Th2 ratio (P = 0.001). The histological presentation of LNM does not entail specific prognostic information. However, the finding of a Th1 immune response in LNs signifies a protective immune response. Future studies should be carried out to verify this marker and to develop a strategy that augments this immune response during subsequent adjuvant treatment.
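
    A minimal Kaplan-Meier estimator of the kind that could underlie such an OS comparison between low and high Th1/Th2-ratio groups; the survival times and censoring flags below are invented illustration data, not the study's:

        import numpy as np

        def kaplan_meier(time, event):
            """Return (time, S(t)) pairs; event=1 for death, 0 for censoring."""
            order = np.argsort(time)
            time, event = np.asarray(time)[order], np.asarray(event)[order]
            n, s, curve = len(time), 1.0, []
            for i, (t, e) in enumerate(zip(time, event)):
                if e:                        # survival drops only at observed deaths
                    s *= 1.0 - 1.0 / (n - i)
                curve.append((t, round(s, 3)))
            return curve

        print("low Th1/Th2 :", kaplan_meier([12, 20, 35, 40, 55], [1, 1, 1, 0, 1]))
        print("high Th1/Th2:", kaplan_meier([30, 60, 83, 90, 120], [1, 0, 1, 0, 0]))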

  6. Representing Uncertainty by Probability and Possibility

    DEFF Research Database (Denmark)

    Uncertain parameters in modeling are usually represented by probability distributions reflecting either the objective uncertainty of the parameters or the subjective belief held by the model builder. This approach is particularly suited for representing the statistical nature or variance of uncertain parameters. Monte Carlo simulation is readily used for practical calculations. However, an alternative approach is offered by possibility theory making use of possibility distributions such as intervals and fuzzy intervals. This approach is well suited to represent lack of knowledge or imprecision.

  7. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
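
    COVAL's own numerical transformation method is not reproduced here, but the problem it solves can be sketched by plain Monte Carlo propagation through a margin function g(R, L) = R − L; the load and strength distributions are illustrative assumptions:

        import numpy as np

        rng = np.random.default_rng(3)
        load = rng.gumbel(loc=50.0, scale=8.0, size=200_000)        # random loads L
        strength = rng.normal(loc=100.0, scale=12.0, size=200_000)  # resistance R

        margin = strength - load       # function of the random variables
        print(f"P(failure) = P(margin < 0) ~ {np.mean(margin < 0):.4f}")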

  8. Rapidity gap survival in central exclusive diffraction: Dynamical mechanisms and uncertainties

    International Nuclear Information System (INIS)

    Strikman, Mark; Weiss, Christian

    2009-01-01

    We summarize our understanding of the dynamical mechanisms governing rapidity gap survival in central exclusive diffraction, pp -> p + H + p (H = high-mass system), and discuss the uncertainties in present estimates of the survival probability. The main suppression of diffractive scattering is due to inelastic soft spectator interactions at small pp impact parameters and can be described in a mean-field approximation (independent hard and soft interactions). Moderate extra suppression results from fluctuations of the partonic configurations of the colliding protons. At LHC energies absorptive interactions of hard spectator partons associated with the gg -> H process reach the black-disk regime and cause substantial additional suppression, pushing the survival probability below 0.01.

  9. Assessing the clinical probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Miniati, M.; Pistolesi, M.

    2001-01-01

    Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to identify parameters associated with PE. A score ≤ 4 identified patients with low probability, of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38% and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid for assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3% in those rated as having a low clinical probability.
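
    The pretest-to-post-test step mentioned above is a direct application of Bayes' rule on the odds scale; the likelihood ratio below is a hypothetical value for an objective test, not one taken from the cited studies:

        def posttest_probability(pretest, likelihood_ratio):
            """Convert a pretest probability to a post-test probability via odds."""
            odds = pretest / (1.0 - pretest)
            post_odds = odds * likelihood_ratio
            return post_odds / (1.0 + post_odds)

        # Low clinical probability (10%) combined with a positive test of LR+ = 8
        print(f"post-test probability: {posttest_probability(0.10, 8.0):.2f}")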

  10. A probability of synthesis of the superheavy element Z = 124

    Energy Technology Data Exchange (ETDEWEB)

    Manjunatha, H.C. [Government College for Women, Department of Physics, Kolar, Karnataka (India); Sridhar, K.N. [Government First Grade College, Department of Physics, Kolar, Karnataka (India)

    2017-10-15

    We have studied the fusion cross section, evaporation residue cross section, compound nucleus formation probability (P_CN) and survival probability (P_sur) of different projectile-target combinations to synthesize the superheavy element Z = 124, and have thereby identified the most probable projectile-target combinations: Kr+Ra, Ni+Cm, Se+Th, Ge+U and Zn+Pu. We hope that our predictions may serve as a guide for future experiments on the synthesis of the superheavy nucleus Z = 124. (orig.)

  11. Survival Analysis

    CERN Document Server

    Miller, Rupert G

    2011-01-01

    A concise summary of the statistical methods used in the analysis of survival data with censoring. Emphasizes recently developed nonparametric techniques. Outlines methods in detail and illustrates them with actual data. Discusses the theory behind each method. Includes numerous worked problems and numerical exercises.

  12. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  13. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable

  14. Bayesian estimation of core-melt probability

    International Nuclear Information System (INIS)

    Lewis, H.W.

    1984-01-01

    A very simple application of the canonical Bayesian algorithm is made to the problem of estimating the probability of core melt in a commercial power reactor. An approximation to the results of the Rasmussen study on reactor safety is used as the prior distribution, and the observation that there has been no core melt yet is used as the single experiment. The result is a substantial decrease in the mean probability of core melt, by factors of 2 to 4 for reasonable choices of parameters. The purpose is to illustrate the procedure, not to argue for the decrease.
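
    The canonical update is easy to reproduce on a grid: a prior over the core-melt frequency is multiplied by the Poisson likelihood of observing zero events. The lognormal prior and the number of failure-free reactor-years below are stand-ins, not the Rasmussen figures:

        import numpy as np

        lam = np.linspace(1e-7, 1e-3, 20001)     # core-melt frequency grid (/reactor-yr)
        # Unnormalized lognormal prior (median 5e-5, log-sd 1.4; assumed values)
        prior = (1.0 / lam) * np.exp(-0.5 * ((np.log(lam) - np.log(5e-5)) / 1.4) ** 2)
        prior /= prior.sum()

        T = 2000.0                               # failure-free reactor-years (assumed)
        posterior = prior * np.exp(-lam * T)     # Poisson likelihood of zero events
        posterior /= posterior.sum()

        factor = np.sum(lam * prior) / np.sum(lam * posterior)
        print(f"mean core-melt frequency decreases by a factor of {factor:.1f}")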

  15. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  16. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: as early as the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution," a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, the most relevant distribution applied in statistical analysis.

  17. Ignition probabilities for Compact Ignition Tokamak designs

    International Nuclear Information System (INIS)

    Stotler, D.P.; Goldston, R.J.

    1989-09-01

    A global power balance code employing Monte Carlo techniques has been developed to study the "probability of ignition" and has been applied to several different configurations of the Compact Ignition Tokamak (CIT). Probability distributions for the critical physics parameters in the code were estimated using existing experimental data. This included a statistical evaluation of the uncertainty in extrapolating the energy confinement time. A substantial probability of ignition is predicted for CIT if peaked density profiles can be achieved or if one of the two higher plasma current configurations is employed. In other cases, values of the energy multiplication factor Q of order 10 are generally obtained. The Ignitor-U and ARIES designs are also examined briefly. Comparisons of our empirically based confinement assumptions with two theory-based transport models yield conflicting results. 41 refs., 11 figs
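
    The "probability of ignition" idea reduces to counting Monte Carlo draws that clear an ignition margin. The toy power balance below is a placeholder, not the CIT code; every distribution and the 1.6 threshold are assumptions:

        import numpy as np

        rng = np.random.default_rng(4)
        n = 100_000
        h_factor = rng.normal(1.0, 0.2, n)            # confinement multiplier (assumed)
        density_peaking = rng.uniform(1.0, 2.0, n)    # profile peaking factor (assumed)

        # Toy criterion: ignition when the combined multiplier exceeds 1.6
        margin = h_factor * np.sqrt(density_peaking)
        print(f"probability of ignition ~ {np.mean(margin > 1.6):.2f}")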

  18. Pointwise probability reinforcements for robust statistical inference.

    Science.gov (United States)

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be with respect to their theoretical probability, and include, e.g., outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR, and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually, since an abnormality degree is obtained for each observation. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Prospect evaluation as a function of numeracy and probability denominator.

    Science.gov (United States)

    Millroth, Philip; Juslin, Peter

    2015-05-01

    This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used. Copyright © 2015 Elsevier B.V. All rights reserved.
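
    The cumulative-prospect-theory analysis mentioned above rests on a probability weighting function; the one-parameter form of Tversky and Kahneman (1992) is sketched below with illustrative gamma values (a smaller gamma means stronger overweighting of small probabilities):

        def weight(p, gamma):
            """Inverse-S probability weighting (Tversky & Kahneman, 1992)."""
            return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

        for gamma in (0.6, 0.9, 1.0):
            print(f"gamma={gamma:.1f}: w(0.01)={weight(0.01, gamma):.3f}, "
                  f"w(0.50)={weight(0.50, gamma):.3f}")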

  20. A survivability model for ejection of green compacts in powder metallurgy technology

    Directory of Open Access Journals (Sweden)

    Payman Ahi

    2012-01-01

    Full Text Available Reliability and quality assurance have become major considerations in the design and manufacture of today’s parts and products. Survivability of green compact using powder metallurgy technology is considered as one of the major quality attributes in manufacturing systems today. During powder metallurgy (PM production, the compaction conditions and behavior of the metal powder dictate the stress and density distribution in the green compact prior to sintering. These parameters greatly influence the mechanical properties and overall strength of the final component. In order to improve these properties, higher compaction pressures are usually employed, which make unloading and ejection of green compacts more challenging, especially for the powder-compacted parts with relatively complicated shapes. This study looked at a mathematical survivability model concerning green compact characteristics in PM technology and the stress-strength failure model in reliability engineering. This model depicts the relationship between mechanical loads (stress during ejection, experimentally determined green strength and survivability of green compact. The resulting survivability is the probability that a green compact survives during and after ejection. This survivability model can be used as an efficient tool for selecting the appropriate parameters for the process planning stage in PM technology. A case study is presented here in order to demonstrate the application of the proposed survivability model.
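
    The stress-strength calculation behind such a model is standard: if ejection stress L and green strength S are modeled as independent normals, survivability is P(S > L). The means and standard deviations below are hypothetical process values, not the paper's data:

        from math import erf, sqrt

        def survivability(mu_s, sd_s, mu_l, sd_l):
            """P(S > L) for independent normal strength S and stress L."""
            z = (mu_s - mu_l) / sqrt(sd_s**2 + sd_l**2)
            return 0.5 * (1.0 + erf(z / sqrt(2.0)))    # standard normal CDF

        print(f"P(green compact survives ejection) = {survivability(12.0, 1.5, 8.0, 1.2):.3f}")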

  1. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  2. Foreign Ownership and Long-term Survival

    DEFF Research Database (Denmark)

    Kronborg, Dorte; Thomsen, Steen

    2006-01-01

    probability. On average exit risk for domestic companies is 2.3 times higher than for foreign companies. First movers like Siemens, Philips, Kodak, Ford, GM or Goodyear have been active in the country for almost a century. Relative foreign survival increases with company age. However, the foreign survival...

  3. Statistical models based on conditional probability distributions

    International Nuclear Information System (INIS)

    Narayanan, R.S.

    1991-10-01

    We present a formulation of statistical mechanics models based on conditional probability distribution rather than a Hamiltonian. We show that it is possible to realize critical phenomena through this procedure. Closely linked with this formulation is a Monte Carlo algorithm, in which a configuration generated is guaranteed to be statistically independent from any other configuration for all values of the parameters, in particular near the critical point. (orig.)

  4. Probability of Criticality for MOX SNF

    International Nuclear Information System (INIS)

    P. Gottlieb

    1999-01-01

    The purpose of this calculation is to provide a conservative (upper bound) estimate of the probability of criticality for mixed oxide (MOX) spent nuclear fuel (SNF) of the Westinghouse pressurized water reactor (PWR) design that has been proposed for use with the Plutonium Disposition Program (Ref. 1, p. 2). This calculation uses a Monte Carlo technique similar to that used for ordinary commercial SNF (Ref. 2, Sections 2 and 5.2). Several scenarios, covering a range of parameters, are evaluated for criticality. Parameters specifying the loss of fission products and iron oxide from the waste package are particularly important. This calculation is associated with disposal of MOX SNF.

  5. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the 7 presentations made at the third meeting 'physics and fundamental questions' whose theme was probability and prediction. The concept of probability that was invented to apprehend random phenomena has become an important branch of mathematics and its application range spreads from radioactivity to species evolution via cosmology or the management of very weak risks. The notion of probability is the basis of quantum mechanics and then is bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  6. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  7. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability, and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step-by-step procedure for constructing a (compound) free Poisson process.

  8. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  9. Probabilistic Survivability Versus Time Modeling

    Science.gov (United States)

    Joyner, James J., Sr.

    2016-01-01

    This presentation documents Kennedy Space Center's Independent Assessment work completed on three assessments for the Ground Systems Development and Operations (GSDO) Program to assist the Chief Safety and Mission Assurance Officer during key programmatic reviews and provided the GSDO Program with analyses of how egress time affects the likelihood of astronaut and ground worker survival during an emergency. For each assessment, a team developed probability distributions for hazard scenarios to address statistical uncertainty, resulting in survivability plots over time. The first assessment developed a mathematical model of probabilistic survivability versus time to reach a safe location using an ideal Emergency Egress System at Launch Complex 39B (LC-39B); the second used the first model to evaluate and compare various egress systems under consideration at LC-39B. The third used a modified LC-39B model to determine if a specific hazard decreased survivability more rapidly than other events during flight hardware processing in Kennedy's Vehicle Assembly Building.
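
    The survivability-versus-time idea can be sketched as the probability that a hazard's lethal onset occurs after egress is complete; the lognormal onset-time distribution and its parameters are invented for illustration:

        import numpy as np

        rng = np.random.default_rng(5)
        onset = rng.lognormal(mean=np.log(120.0), sigma=0.5, size=100_000)  # seconds

        for t in (30, 60, 120, 240):
            print(f"egress completed in {t:4d} s -> survivability {np.mean(onset > t):.2f}")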

  10. Estimating true instead of apparent survival using spatial Cormack-Jolly-Seber models

    Science.gov (United States)

    Schaub, Michael; Royle, J. Andrew

    2014-01-01

    Survival is often estimated from capture–recapture data using Cormack–Jolly–Seber (CJS) models, where mortality and emigration cannot be distinguished, and the estimated apparent survival probability is the product of the probabilities of true survival and of study area fidelity. Consequently, apparent survival is lower than true survival unless study area fidelity equals one. Underestimation of true survival from capture–recapture data is a main limitation of the method.
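
    The identity stated above is worth making explicit; with illustrative values, apparent survival understates true survival whenever fidelity is below one:

        true_survival, fidelity = 0.80, 0.90
        apparent = true_survival * fidelity      # what a non-spatial CJS model estimates
        print(f"apparent survival {apparent:.2f} vs true survival {true_survival:.2f}")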

  11. Rapidity gap survival in the black-disk regime

    International Nuclear Information System (INIS)

    Leonid Frankfurt; Charles Hyde; Mark Strikman; Christian Weiss

    2007-01-01

    We summarize how the approach to the black-disk regime (BDR) of strong interactions at TeV energies influences rapidity gap survival in exclusive hard diffraction pp -> p + H + p (H = dijet, Qbar Q, Higgs). Employing a recently developed partonic description of such processes, we discuss (a) the suppression of diffraction at small impact parameters by soft spectator interactions in the BDR; (b) further suppression by inelastic interactions of hard spectator partons in the BDR; (c) effects of correlations between hard and soft interactions, as suggested by various models of proton structure (color fluctuations, spatial correlations of partons). Hard spectator interactions in the BDR substantially reduce the rapidity gap survival probability at LHC energies compared to previously reported estimates

  12. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords : Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics OBOR OECD: Statistics and probability Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  13. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions.

  14. Quantum correlations in terms of neutrino oscillation probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Alok, Ashutosh Kumar, E-mail: akalok@iitj.ac.in [Indian Institute of Technology Jodhpur, Jodhpur 342011 (India); Banerjee, Subhashish, E-mail: subhashish@iitj.ac.in [Indian Institute of Technology Jodhpur, Jodhpur 342011 (India); Uma Sankar, S., E-mail: uma@phy.iitb.ac.in [Indian Institute of Technology Bombay, Mumbai 400076 (India)

    2016-08-15

    Neutrino oscillations provide evidence for the mode entanglement of neutrino mass eigenstates in a given flavour eigenstate. Given this mode entanglement, it is pertinent to consider the relation between the oscillation probabilities and other quantum correlations. In this work, we show that all the well-known quantum correlations, such as Bell's inequality, are directly related to the neutrino oscillation probabilities. The results of the neutrino oscillation experiments, which measure the neutrino survival probability to be less than unity, imply Bell's inequality violation.
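
    The survival probability referred to above is, in the standard two-flavour approximation, P_ee = 1 − sin²(2θ) sin²(1.27 Δm² L/E) with Δm² in eV², L in km and E in GeV; the parameter values below are illustrative:

        import numpy as np

        def survival(theta, dm2, L_km, E_GeV):
            """Two-flavour neutrino survival probability."""
            return 1.0 - np.sin(2.0 * theta)**2 * np.sin(1.27 * dm2 * L_km / E_GeV)**2

        print(f"P_ee = {survival(theta=0.15, dm2=2.5e-3, L_km=500.0, E_GeV=1.0):.3f}")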

  15. Continuation of probability density functions using a generalized Lyapunov approach

    NARCIS (Netherlands)

    Baars, S.; Viebahn, J. P.; Mulder, T. E.; Kuehn, C.; Wubs, F. W.; Dijkstra, H. A.

    2017-01-01

    Techniques from numerical bifurcation theory are very useful to study transitions between steady fluid flow patterns and the instabilities involved. Here, we provide computational methodology to use parameter continuation in determining probability density functions of systems of stochastic partial differential equations.

  16. Some simple applications of probability models to birth intervals

    International Nuclear Information System (INIS)

    Shrestha, G.

    1987-07-01

    An attempt has been made in this paper to apply some simple probability models to birth intervals under the assumption of constant fecundability and varying fecundability among women. The parameters of the probability models are estimated by using the method of moments and the method of maximum likelihood. (author). 9 refs, 2 tabs

  17. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability; The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence; Discrete Distributions; Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation; Continuous Probability; From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation; Continuous Distributions; The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables; Asymptotic Theory; Strong and Weak Laws of Large Numbers; Central Limit Theorem; Stochastic Processes and Applications; Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics; Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  18. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  19. Extensions and Applications of the Cox-Aalen Survival Model

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    2003-01-01

    Aalen additive risk model; competing risk; counting processes; Cox model; cumulative incidence function; goodness of fit; prediction of survival probability; time-varying effects.

  20. Flu Shots, Mammogram, and the Perception of Probabilities

    NARCIS (Netherlands)

    Carman, K.G.; Kooreman, P.

    2010-01-01

    We study individuals' decisions to decline or accept preventive health care interventions such as flu shots and mammograms. In particular, we analyze the role of perceptions of the effectiveness of the intervention by eliciting individuals' subjective probabilities of sickness and survival.

  1. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Background: Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives: The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods: Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results: Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions: Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433

  2. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  3. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level. Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book emphasizes simulation through the use of the popular R software language.

  4. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application of these principles.

  5. Safeguards systems parameters

    International Nuclear Information System (INIS)

    Avenhaus, R.; Heil, J.

    1979-01-01

    In this paper, analyses are made of the values of those parameters that characterize the present safeguards system applied to a national fuel cycle; these values have to be fixed quantitatively so that all actions of the safeguards authority are specified precisely. The analysis starts by introducing three categories of quantities: the design parameters (number of MBAs, inventory frequency, variance of MUF, verification effort and false-alarm probability) describe those quantities whose values have to be specified before the safeguards system can be implemented; the performance criteria (probability of detection, expected detection time, goal quantity) measure the effectiveness of a safeguards system; and the standards (threshold amount and critical time) characterize the magnitude of the proliferation problem. We investigate the means by which the values of the individual design parameters can be determined with the help of the performance criteria, the qualitative arguments that can narrow down the arbitrariness of the choice of values of the remaining parameters, and the parameter values that have to be fixed more or less arbitrarily. As a result of these considerations, which include the optimal allocation of a given inspection effort, the problem of analysing the structure of the safeguards system is reduced to an evaluation of the interplay of only a few parameters, essentially the quality of the measurement system (variance of MUF), verification effort, false-alarm probability, goal quantity and probability of detection.
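
    The interplay of false-alarm probability, measurement quality, and detection probability can be sketched for a normally distributed MUF statistic; the goal quantity, sigma, and alpha below are illustrative assumptions:

        from statistics import NormalDist

        def detection_probability(goal_M, sigma_muf, alpha):
            """P(detect diversion of goal_M) at false-alarm probability alpha."""
            nd = NormalDist()
            threshold = nd.inv_cdf(1.0 - alpha) * sigma_muf   # alarm level on MUF
            return 1.0 - nd.cdf((threshold - goal_M) / sigma_muf)

        print(f"P(detect) = {detection_probability(goal_M=8.0, sigma_muf=3.0, alpha=0.05):.2f}")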

  6. Surviving Sengstaken.

    Science.gov (United States)

    Jayakumar, S; Odulaja, A; Patel, S; Davenport, M; Ade-Ajayi, N

    2015-07-01

    To report the outcomes of children who underwent Sengstaken-Blakemore tube (SBT) insertion for life-threatening haematemesis. Single institution retrospective review (1997-2012) of children managed with SBT insertion. Patient demographics, diagnosis and outcomes were noted. Data are expressed as median (range). 19 children [10 male, age 1 (0.4-16) yr] were identified; 18 had gastro-oesophageal varices and 1 an aorto-oesophageal fistula. Varices were secondary to: biliary atresia (n=8), portal vein thrombosis (n=5), alpha-1-antitrypsin deficiency (n=1), cystic fibrosis (n=1), intrahepatic cholestasis (n=1), sclerosing cholangitis (n=1) and nodular hyperplasia with arterio-portal shunt (n=1). Three children deteriorated rapidly and did not survive to have post-SBT endoscopy. The child with an aorto-oesophageal fistula underwent aortic stent insertion and subsequently oesophageal replacement. Complications included gastric mucosal ulceration (n=3, 16%), pressure necrosis at lips and cheeks (n=6, 31%) and SBT dislodgment (n=1, 6%). Six (31%) children died. The remaining 13 have been followed up for 62 (2-165) months; five required liver transplantation, two underwent a mesocaval shunt procedure and 6 have completed endoscopic variceal obliteration and are under surveillance. SBT can be an effective, albeit temporary, life-saving manoeuvre in children with catastrophic haematemesis. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. Survival of radio-implanted drymarchon couperi (Eastern Indigo Snake) in relation to body size and sex

    Science.gov (United States)

    Hyslop, N.L.; Meyers, J.M.; Cooper, R.J.; Norton, Terry M.

    2009-01-01

    Drymarchon couperi (eastern indigo snake) has experienced population declines across its range primarily as a result of extensive habitat loss, fragmentation, and degradation. Conservation efforts for D. couperi have been hindered, in part, because of informational gaps regarding the species, including a lack of data on population ecology and estimates of demographic parameters such as survival. We conducted a 2-year radiotelemetry study of D. couperi on Fort Stewart Military Reservation and adjacent private lands located in southeastern Georgia to assess individual characteristics associated with probability of survival. We used known-fate modeling to estimate survival, and an information-theoretic approach, based on a priori hypotheses, to examine intraspecific differences in survival probabilities relative to individual covariates (sex, size, size standardized by sex, and overwintering location). Annual survival in 2003 and 2004 was 0.89 (95% CI = 0.73-0.97, n = 25) and 0.72 (95% CI = 0.52-0.86; n = 27), respectively. Results indicated that body size, standardized by sex, was the most important covariate determining survival of adult D. couperi, suggesting lower survival for larger individuals within each sex. We are uncertain of the mechanisms underlying this result, but possibilities may include greater resource needs for larger individuals within each sex, necessitating larger or more frequent movements, or a population with older individuals. Our results may also have been influenced by analysis limitations because of sample size, other sources of individual variation, or environmental conditions. © 2009 by The Herpetologists' League, Inc.

  8. How Life History Can Sway the Fixation Probability of Mutants

    Science.gov (United States)

    Li, Xiang-Yi; Kurokawa, Shun; Giaimo, Stefano; Traulsen, Arne

    2016-01-01

    In this work, we study the effects of demographic structure on evolutionary dynamics when selection acts on reproduction, survival, or both. In contrast to the previously discovered pattern that the fixation probability of a neutral mutant decreases while the population becomes younger, we show that a mutant with a constant selective advantage may have a maximum or a minimum of the fixation probability in populations with an intermediate fraction of young individuals. This highlights the importance of life history and demographic structure in studying evolutionary dynamics. We also illustrate the fundamental differences between selection on reproduction and selection on survival when age structure is present. In addition, we evaluate the relative importance of size and structure of the population in determining the fixation probability of the mutant. Our work lays the foundation for also studying density- and frequency-dependent effects in populations when demographic structures cannot be neglected. PMID:27129737
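
    The unstructured baseline against which these age-structured results are set is the classical Moran fixation probability for a mutant of constant advantage r; the values below are illustrative:

        def fixation_probability(r, N):
            """rho = (1 - 1/r) / (1 - 1/r^N); reduces to 1/N for a neutral mutant."""
            if r == 1.0:
                return 1.0 / N
            return (1.0 - 1.0 / r) / (1.0 - 1.0 / r**N)

        for r in (1.0, 1.05, 1.2):
            print(f"r = {r:.2f}, N = 100 -> rho = {fixation_probability(r, 100):.4f}")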

  9. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  11. Introduction to probability and measure

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.

  12. Joint probabilities and quantum cognition

    International Nuclear Information System (INIS)

    Acacio de Barros, J.

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  13. Joint probabilities and quantum cognition

    Energy Technology Data Exchange (ETDEWEB)

    Acacio de Barros, J. [Liberal Studies, 1600 Holloway Ave., San Francisco State University, San Francisco, CA 94132 (United States)

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  14. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...
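
    For orientation, the default probability in the Merton framework the abstract starts from can be sketched in a few lines, assuming lognormal asset dynamics; the firm parameters below are hypothetical.

      from math import log, sqrt
      from statistics import NormalDist

      def merton_default_probability(V0, D, mu, sigma, T):
          """P(asset value < debt face value at horizon T) under Merton's model,
          with V_T = V0 * exp((mu - sigma^2/2) * T + sigma * W_T)."""
          d = (log(V0 / D) + (mu - 0.5 * sigma**2) * T) / (sigma * sqrt(T))
          return NormalDist().cdf(-d)

      # hypothetical firm: assets 120, debt 100, 5% drift, 25% asset vol, 1 year
      print(merton_default_probability(120.0, 100.0, 0.05, 0.25, 1.0))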

  15. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

    Max Lotstein and Phil Johnson-Laird, Department of Psychology, Princeton University, Princeton, NJ, USA, August 30th 2012. ...social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as... retorted that such a flagrant violation of the probability calculus was a result of a psychological experiment that obscured the rationality of the...

  16. Probability Matching, Fast and Slow

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2014-01-01

    A prominent point of contention among researchers regarding the interpretation of probability-matching behavior is whether it represents a cognitively sophisticated, adaptive response to the inherent uncertainty of the tasks or settings in which it is observed, or whether instead it represents a fundamental shortcoming in the heuristics that support and guide human decision making. Put crudely, researchers disagree on whether probability matching is "smart" or "dumb." Here, we consider eviden...

  17. Measurement of the resonance escape probability

    International Nuclear Information System (INIS)

    Anthony, J.P.; Bacher, P.; Lheureux, L.; Moreau, J.; Schmitt, A.P.

    1957-01-01

    The average cadmium ratio in natural uranium rods has been measured, using equal diameter natural uranium disks. These values, correlated with independent measurements of the lattice buckling, enabled us to calculate values of the resonance escape probability for the G1 reactor with one or the other of two definitions. Measurements were performed on 26 mm and 32 mm rods, giving the following values for the resonance escape probability p: 0.8976 ± 0.005 and 0.912 ± 0.006 (d. 26 mm), 0.8627 ± 0.009 and 0.884 ± 0.01 (d. 32 mm). The influence of either definition on the lattice parameters is discussed, leading to values of the effective integral. Similar experiments have been performed with thorium rods. (author) [fr

  18. Conditional probabilities in Ponzano-Regge minisuperspace

    International Nuclear Information System (INIS)

    Petryk, Roman; Schleich, Kristin

    2003-01-01

    We examine the Hartle-Hawking no-boundary initial state for the Ponzano-Regge formulation of gravity in three dimensions. We consider the behavior of conditional probabilities and expectation values for geometrical quantities in this initial state for a simple minisuperspace model consisting of a two-parameter set of anisotropic geometries on a 2-sphere boundary. We find dependence on the cutoff used in the construction of Ponzano-Regge amplitudes for expectation values of edge lengths. However, these expectation values are cutoff independent when computed in certain, but not all, conditional probability distributions. Conditions that yield cutoff independent expectation values are those that constrain the boundary geometry to a finite range of edge lengths. We argue that such conditions have a correspondence to fixing a range of local time, as classically associated with the area of a surface for spatially closed cosmologies. Thus these results may hint at how classical spacetime emerges from quantum amplitudes.

  19. Bayesian Prior Probability Distributions for Internal Dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Miller, G.; Inkret, W.C.; Little, T.T.; Martz, H.F.; Schillaci, M.E

    2001-07-01

    The problem of choosing a prior distribution for the Bayesian interpretation of measurements (specifically internal dosimetry measurements) is considered using a theoretical analysis and by examining historical tritium and plutonium urine bioassay data from Los Alamos. Two models for the prior probability distribution are proposed: (1) the log-normal distribution, when there is some additional information to determine the scale of the true result, and (2) the 'alpha' distribution (a simplified variant of the gamma distribution) when there is not. These models have been incorporated into version 3 of the Bayesian internal dosimetric code in use at Los Alamos (downloadable from our web site). Plutonium internal dosimetry at Los Alamos is now being done using prior probability distribution parameters determined self-consistently from population averages of Los Alamos data. (author)
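
    A toy grid-based version of the Bayesian measurement interpretation described above, assuming normal measurement error around the true result and a log-normal prior; all numbers are illustrative, and this is not the Los Alamos code.

      import numpy as np

      true = np.linspace(1e-3, 20.0, 4000)        # candidate true results (grid)
      # log-normal prior (hypothetical median 2.0, log-sd 1.0)
      prior = np.exp(-0.5 * ((np.log(true) - np.log(2.0)) / 1.0) ** 2) / true
      meas, sigma = 3.0, 1.5                      # one bioassay measurement, its SD
      likelihood = np.exp(-0.5 * ((meas - true) / sigma) ** 2)
      post = prior * likelihood
      post /= np.trapz(post, true)                # normalize the posterior
      print("posterior mean of the true result:", np.trapz(true * post, true))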

  20. Detection probability of least tern and piping plover chicks in a large river system

    Science.gov (United States)

    Roche, Erin A.; Shaffer, Terry L.; Anteau, Michael J.; Sherfy, Mark H.; Stucker, Jennifer H.; Wiltermuth, Mark T.; Dovichin, Colin M.

    2014-01-01

    Monitoring the abundance and stability of populations of conservation concern is often complicated by an inability to perfectly detect all members of the population. Mark-recapture offers a flexible framework in which one may identify factors contributing to imperfect detection, while at the same time estimating demographic parameters such as abundance or survival. We individually color-marked, recaptured, and re-sighted 1,635 federally listed interior least tern (Sternula antillarum; endangered) chicks and 1,318 piping plover (Charadrius melodus; threatened) chicks from 2006 to 2009 at 4 study areas along the Missouri River and investigated effects of observer-, subject-, and site-level covariates suspected of influencing detection. Increasing the time spent searching and crew size increased the probability of detecting both species regardless of study area and detection methods were not associated with decreased survival. However, associations between detection probability and the investigated covariates were highly variable by study area and species combinations, indicating that a universal mark-recapture design may not be appropriate.

  1. Probably not: future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  2. Evolvement simulation of the probability of neutron-initiating persistent fission chain

    International Nuclear Information System (INIS)

    Wang Zhe; Hong Zhenying

    2014-01-01

    Background: The probability of a neutron initiating a persistent fission chain, which has to be calculated in analyses of criticality safety, reactor start-up, burst waiting time on pulse reactors, bursting time on pulse reactors, etc., is an inherent parameter of a multiplying assembly. Purpose: We aim to derive a time-dependent integro-differential equation for this probability in relative velocity space according to probability conservation, and to develop the deterministic code Dynamic Segment Number Probability (DSNP) based on the multi-group S_N method. Methods: The reliable convergence of the dynamic calculation was analyzed, and numerical simulation of the evolvement process of the dynamic probability for varying concentration was performed under different initial conditions. Results: On Highly Enriched Uranium (HEU) bare spheres, when the time is long enough, the results of the dynamic calculation approach those of the static calculation. The largest difference between the DSNP and Partisn results is less than 2%. On the Baker model, over the range of about 1 μs after the first criticality, the largest difference between the dynamic and static calculations is about 300%. For a supercritical system, the finite fission chains decrease and the persistent fission chains increase as the reactivity grows; the dynamic evolvement curve of the initiation probability is close to the static curve, within a difference of 5%, when K_eff is more than 1.2. The cumulative probability curve also indicates that the difference of integral results between the dynamic and static calculations decreases from 35% to 5% as K_eff increases. This demonstrates that the ability to initiate a self-sustaining fission chain reaction approaches stabilization, while the former difference (35%) shows the important difference of the dynamic results near the first criticality from the static ones. The DSNP code agrees well with the Partisn code. Conclusions: There are large numbers of
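
    The DSNP code solves a time-dependent transport problem; its static analogue is the classical branching-process calculation, in which the persistent-chain probability is one minus the extinction probability, the smallest fixed point of the offspring generating function. A sketch of that static calculation only, with an invented neutron-multiplicity distribution:

      def persistent_chain_probability(offspring_pmf, tol=1e-12):
          """1 - q, where q is the extinction probability of a Galton-Watson
          process: the smallest root of q = g(q), g being the offspring pgf."""
          def g(q):
              return sum(p * q**k for k, p in enumerate(offspring_pmf))
          q = 0.0
          while True:                     # fixed-point iteration from q = 0
              q_next = g(q)
              if abs(q_next - q) < tol:
                  return 1.0 - q_next
              q = q_next

      # hypothetical next-generation-neutron distribution, mean 1.25 (supercritical)
      pmf = [0.25, 0.35, 0.30, 0.10]      # P(0), P(1), P(2), P(3)
      print(persistent_chain_probability(pmf))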

  3. Physiological parameters

    International Nuclear Information System (INIS)

    Natera, E.S.

    1998-01-01

    The physiological characteristics of man depend on the intake, metabolism and excretion of stable elements from food, water, and air. The physiological behavior of natural radionuclides and of radionuclides from nuclear weapons testing and from the utilization of nuclear energy is believed to follow the pattern of stable elements. Hence information on the normal physiological processes occurring in the human body plays an important role in the assessment of the radiation dose received by man. Two important physiological parameters needed for internal dose determination are the pulmonary function and the water balance. In the Coordinated Research Programme on the characterization of the Asian population, five participants - China, India, Japan, the Philippines and Viet Nam - submitted data on these physiological characteristics. During the CRP, data on other pertinent characteristics, such as physical and dietary characteristics, were being collected simultaneously. Hence, the information on the physiological characteristics alone, coming from the five participants, was not complete and is probably not sufficient to establish standard values for the Reference Asian Man. Nonetheless, the data collected are a valuable contribution to this research programme.

  4. The effect of chemical weapons incineration on the survival rates of Red-tailed Tropicbirds

    Science.gov (United States)

    Schreiber, E.A.; Schenk, G.A.; Doherty, P.F.

    2001-01-01

    In 1992, the Johnston Atoll Chemical Agent Disposal System (JACADS) began incinerating U.S. chemical weapons stockpiles on Johnston Atoll (Pacific Ocean) where about 500,000 seabirds breed, including Red-tailed Tropicbirds (Phaethon rubricauda). We hypothesized that survival rates were lower for birds nesting downwind of the incinerator smokestack than for those upwind, and that birds might move away from the area. From 1992 to 2000 we monitored survival and movements between areas upwind and downwind from the JACADS facility. We used a multi-strata mark-recapture approach to model survival, probability of recapture and movement. Probability of recapture was significantly higher for birds in downwind areas (owing to greater recapture effort) and thus was an important 'nuisance' parameter to take into account in modeling. We found no differences in survival between birds nesting upwind (0.8588) and downwind (0.8550). There was no consistent difference in movement rates between upwind and downwind areas from year to year: differences found may be attributed to differing vegetation growth and human activities between the areas. Our results suggest that JACADS has had no documentable influence on the survival and year to year movement of Red-tailed Tropicbirds.

  5. The design and analysis of salmonid tagging studies in the Columbia River. Volume 7: Monte-Carlo comparison of confidence interval procedures for estimating survival in a release-recapture study, with applications to Snake River salmonids

    International Nuclear Information System (INIS)

    Lowther, A.B.; Skalski, J.

    1996-06-01

    Confidence intervals for survival probabilities of juvenile salmonids migrating between hydroelectric facilities can be computed from the output of the SURPH software developed at the Center for Quantitative Science at the University of Washington. These intervals have been constructed using the estimate of the survival probability, its associated standard error, and assuming the estimate is normally distributed. In order to test the validity and performance of this procedure, two additional confidence interval procedures for estimating survival probabilities were tested and compared using simulated mark-recapture data. Intervals were constructed using normal probability theory, using a percentile-based empirical bootstrap algorithm, and using the profile likelihood concept. Performance of each method was assessed for a variety of initial conditions (release sizes, survival probabilities, detection probabilities). These initial conditions were chosen to encompass the range of parameter values seen in the 1993 and 1994 Snake River juvenile salmonid survival studies. The comparisons among the three estimation methods included average interval width, interval symmetry, and interval coverage.
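
    Two of the three procedures compared (normal theory and the percentile bootstrap) can be illustrated on a deliberately simplified case that treats single-reach survival as a binomial proportion; the study itself estimates survival from full release-recapture likelihoods, and the counts here are invented.

      import numpy as np

      rng = np.random.default_rng(1)
      released, recovered = 200, 150            # hypothetical single-reach counts
      s_hat = recovered / released              # point estimate of survival

      # percentile bootstrap: resample individual fates, re-estimate, take quantiles
      fates = np.concatenate([np.ones(recovered), np.zeros(released - recovered)])
      boot = np.array([rng.choice(fates, size=released, replace=True).mean()
                       for _ in range(4000)])
      lo, hi = np.quantile(boot, [0.025, 0.975])

      # normal-theory interval for comparison
      se = np.sqrt(s_hat * (1 - s_hat) / released)
      print(f"estimate {s_hat:.3f}; bootstrap CI ({lo:.3f}, {hi:.3f}); "
            f"normal CI ({s_hat - 1.96 * se:.3f}, {s_hat + 1.96 * se:.3f})")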

  6. PWR reactor pressure vessel failure probabilities

    International Nuclear Information System (INIS)

    Dufresne, J.; Lanore, J.M.; Lucia, A.C.; Elbaz, J.; Brunnhuber, R.

    1980-05-01

    To evaluate the rupture probability of an LWR vessel, a probabilistic method using fracture mechanics in probabilistic form was proposed previously, but it appears that a more accurate evaluation is possible. In consequence, a joint collaboration agreement signed in 1976 between CEA, EURATOM, JRC Ispra and FRAMATOME set up and started a research program covering three parts: computer code development, data acquisition and processing, and a supporting experimental program which aims at clarifying the most important parameters used in the COVASTOL computer code.

  7. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
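
    The envelope idea can be sketched by simulation, assuming standardized samples and pointwise 2.5%/97.5% bands; note that the paper constructs intervals with exact simultaneous 1-α coverage, which this simplified version does not achieve.

      import numpy as np

      rng = np.random.default_rng(0)
      x = np.sort(rng.standard_normal(50))      # sample whose normality we assess
      n = len(x)

      # envelope from many standardized normal samples of the same size
      sims = np.sort(rng.standard_normal((10000, n)), axis=1)
      sims = (sims - sims.mean(axis=1, keepdims=True)) / sims.std(axis=1, keepdims=True)
      lo, hi = np.quantile(sims, [0.025, 0.975], axis=0)

      z = (x - x.mean()) / x.std()              # standardized order statistics
      print("all points inside envelope:", bool(np.all((z >= lo) & (z <= hi))))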

  8. Understanding survival analysis: Kaplan-Meier estimate.

    Science.gov (United States)

    Goel, Manish Kumar; Khanna, Pardeep; Kishore, Jugal

    2010-10-01

    The Kaplan-Meier estimate is one of the best options for measuring the fraction of subjects living for a certain amount of time after treatment. In clinical trials or community trials, the effect of an intervention is assessed by measuring the number of subjects who survived or were saved after that intervention over a period of time. The time starting from a defined point to the occurrence of a given event, for example death, is called the survival time, and the analysis of group data is called survival analysis. This can be affected by subjects under study who are uncooperative and refuse to remain in the study, by subjects who do not experience the event or death before the end of the study although they would have experienced or died had observation continued, or by subjects we lose touch with midway through the study. We label these situations as censored observations. The Kaplan-Meier estimate is the simplest way of computing survival over time in spite of all these difficulties associated with subjects or situations. The survival curve can be created assuming various situations. It involves computing the probabilities of occurrence of the event at a certain point of time and multiplying these successive probabilities by any earlier computed probabilities to get the final estimate. This can be calculated for two groups of subjects, as can the statistical difference in their survival. This can be used in Ayurveda research when comparing two drugs and looking for the survival of subjects.
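
    The multiplication rule just described is short enough to code directly; a minimal Kaplan-Meier estimator on toy censored data (follow-up times in months, event = death):

      import numpy as np

      def kaplan_meier(times, events):
          """At each distinct event time t, multiply the running survival by
          (1 - d/n): d = deaths at t, n = subjects still at risk at t.
          Censored subjects only shrink the risk set."""
          times = np.asarray(times, dtype=float)
          events = np.asarray(events, dtype=bool)
          s, curve = 1.0, []
          for t in np.unique(times[events]):
              n_at_risk = int(np.sum(times >= t))
              d = int(np.sum((times == t) & events))
              s *= 1.0 - d / n_at_risk
              curve.append((float(t), s))
          return curve

      # toy data: 1 = death observed, 0 = censored (left study / study ended)
      print(kaplan_meier([2, 3, 3, 5, 8, 9, 12], [1, 1, 0, 1, 0, 1, 0]))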

  9. Gap probability - Measurements and models of a pecan orchard

    Science.gov (United States)

    Strahler, Alan H.; Li, Xiaowen; Moody, Aaron; Liu, YI

    1992-01-01

    Measurements and models are compared for gap probability in a pecan orchard. Measurements are based on panoramic photographs with a 50° by 135° view angle made under the canopy looking upwards at regular positions along transects between orchard trees. The gap probability model is driven by geometric parameters at two levels - crown and leaf. Crown-level parameters include the shape of the crown envelope and the spacing of crowns; leaf-level parameters include leaf size and shape, leaf area index, and leaf angle, all as functions of canopy position.
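
    The two-level crown/leaf model itself is not reproduced here, but the simpler turbid-medium gap-fraction formula often used as its leaf-level ingredient is easy to sketch; G = 0.5 assumes a spherical leaf angle distribution, and all values are illustrative.

      import numpy as np

      def gap_probability(lai, theta_deg, G=0.5):
          """Turbid-medium gap fraction P(theta) = exp(-G * LAI / cos(theta)),
          where G is the leaf projection coefficient and theta the view zenith."""
          theta = np.radians(theta_deg)
          return np.exp(-G * lai / np.cos(theta))

      print(gap_probability(lai=3.0, theta_deg=30.0))   # e.g. LAI 3, 30 deg view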

  10. Estimation of survival of adult Florida manatees in the Crystal River, at Blue Spring, and on the Atlantic Coast

    Science.gov (United States)

    O'Shea, Thomas J.; Langtimm, Catherine A.; O'Shea, Thomas J.; Ackerman, B.B.; Percival, H. Franklin

    1995-01-01

    We applied Cormack-Jolly-Seber open population models to manatee (Trichechus manatus latirostris) photo-identification databases to estimate adult survival probabilities. The computer programs JOLLY and RECAPCO were used to estimate survival of 677 individuals in three study areas: Crystal River (winters 1977-78 to 1990-91), Blue Spring (winters 1977-78 to 1990-91), and the Atlantic Coast (winters 1984-85 to 1990-91). We also estimated annual survival from observations of 111 manatees tagged for studies with radiotelemetry. Survival estimated from observations with telemetry had broader confidence intervals than survival estimated with the Cormack-Jolly-Seber models. Annual probabilities of capture based on photo-identification records were generally high. The mean annual adult survival estimated from sighting-resighting records was 0.959-0.962 in the Crystal River and 0.936-0.948 at Blue Spring and may be high enough to permit population growth, given the values of other life-history parameters. On the Atlantic Coast, the estimated annual adult survival (range of means = 0.877-0.885) may signify a declining population. However, for several reasons, interpretation of data from the latter study group should be tempered with caution. Adult survivorship seems to be constant with age in all three study groups. No strong differences were apparent between adult survival of males and females in the Crystal River or at Blue Spring; the basis of significant differences between sexes on the Atlantic Coast is unclear. Future research into estimating survival with photo-identification and the Cormack-Jolly-Seber models should be vigorously pursued. Estimates of annual survival can provide an additional indication of Florida manatee population status with a stronger statistical basis than aerial counts and carcass totals.
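
    For readers new to Cormack-Jolly-Seber models, the probability of one sighting history under constant survival phi and resight probability p can be written in a few lines; summing negative log-probabilities over animals gives the likelihood that programs such as JOLLY maximize. This sketch assumes constant parameters, a simplification of the analyses above.

      def cjs_history_probability(history, phi, p):
          """Probability of one sighting history (e.g. [1, 0, 1, 1, 0]) under a
          CJS model with constant survival phi and resight probability p,
          conditional on the first release."""
          K = len(history)
          f = history.index(1)                   # first capture occasion
          l = K - 1 - history[::-1].index(1)     # last sighting occasion
          prob = 1.0
          for i in range(f, l):                  # known alive between f and l
              prob *= phi * (p if history[i + 1] == 1 else 1.0 - p)
          chi = 1.0                              # P(never seen after occasion l)
          for _ in range(K - 1 - l):
              chi = (1.0 - phi) + phi * (1.0 - p) * chi
          return prob * chi

      print(cjs_history_probability([1, 0, 1, 1, 0], phi=0.96, p=0.8))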

  11. Probability theory: a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the ground work to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, Erdos-Kac invariance principle, functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a text book for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  12. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  13. Model uncertainty: Probabilities for models?

    International Nuclear Information System (INIS)

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising.

  14. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (e.g., Dempster-Shafer theory or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  15. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  16. Statistical probability tables CALENDF program

    International Nuclear Information System (INIS)

    Ribon, P.

    1989-01-01

    The purpose of the probability tables is to obtain dense data representation and to calculate integrals by quadratures. They are mainly used in the USA for calculations by Monte Carlo and in the USSR and Europe for self-shielding calculations by the sub-group method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity.

  17. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  18. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  19. Models for probability and statistical inference: theory and applications

    CERN Document Server

    Stapleton, James H

    2007-01-01

    This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readersModels for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping.Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...

  20. Probability of seeing increases saccadic readiness.

    Directory of Open Access Journals (Sweden)

    Thérèse Collins

    Associating movement directions or endpoints with monetary rewards or costs influences movement parameters in humans, and associating movement directions or endpoints with food reward influences movement parameters in non-human primates. Rewarded movements are facilitated relative to non-rewarded movements. The present study examined to what extent successful foveation facilitated saccadic eye movement behavior, with the hypothesis that foveation may constitute an informational reward. Human adults performed saccades to peripheral targets that either remained visible after saccade completion or were extinguished, preventing visual feedback. Saccades to targets that were systematically extinguished were slower and easier to inhibit than saccades to targets that afforded successful foveation, and this effect was modulated by the probability of successful foveation. These results suggest that successful foveation facilitates behavior, and that obtaining the expected sensory consequences of a saccadic eye movement may serve as a reward for the oculomotor system.

  1. Converting dose distributions into tumour control probability

    International Nuclear Information System (INIS)

    Nahum, A.E.

    1996-01-01

    The endpoints in radiotherapy that are truly of relevance are not dose distributions but the probability of local control, sometimes known as the Tumour Control Probability (TCP), and the Probability of Normal Tissue Complications (NTCP). A model for the estimation of TCP based on simple radiobiological considerations is described. It is shown that incorporation of inter-patient heterogeneity into the radiosensitivity parameter α through σ_α can result in a clinically realistic slope for the dose-response curve. The model is applied to inhomogeneous target dose distributions in order to demonstrate the relationship between dose uniformity and σ_α. The consequences of varying clonogenic density are also explored. Finally the model is applied to the target-volume DVHs for patients in a clinical trial of conformal pelvic radiotherapy; the effect of dose inhomogeneities on distributions of TCP are shown as well as the potential benefits of customizing the target dose according to normal-tissue DVHs. (author). 37 refs, 9 figs
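
    A minimal sketch of the model described: a Poisson TCP per patient, averaged over a Gaussian spread σ_α of radiosensitivity across the population. Clonogen number and α values are illustrative, not the paper's.

      import numpy as np

      def population_tcp(dose_gy, n_clonogens=1e7, alpha_mean=0.3, alpha_sd=0.08,
                         n_patients=20000, seed=0):
          """Poisson TCP averaged over inter-patient radiosensitivity:
          TCP_i = exp(-N0 * exp(-alpha_i * D)), alpha_i ~ N(alpha_mean, alpha_sd)."""
          rng = np.random.default_rng(seed)
          alpha = rng.normal(alpha_mean, alpha_sd, n_patients).clip(min=1e-6)
          surviving = n_clonogens * np.exp(-alpha * dose_gy)  # mean clonogens left
          return float(np.exp(-surviving).mean())

      for d in (50, 60, 70, 80):                # dose-response curve, doses in Gy
          print(d, round(population_tcp(d), 3))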

  2. Converting dose distributions into tumour control probability

    Energy Technology Data Exchange (ETDEWEB)

    Nahum, A E [The Royal Marsden Hospital, London (United Kingdom). Joint Dept. of Physics

    1996-08-01

    The endpoints in radiotherapy that are truly of relevance are not dose distributions but the probability of local control, sometimes known as the Tumour Control Probability (TCP), and the Probability of Normal Tissue Complications (NTCP). A model for the estimation of TCP based on simple radiobiological considerations is described. It is shown that incorporation of inter-patient heterogeneity into the radiosensitivity parameter α through σ_α can result in a clinically realistic slope for the dose-response curve. The model is applied to inhomogeneous target dose distributions in order to demonstrate the relationship between dose uniformity and σ_α. The consequences of varying clonogenic density are also explored. Finally the model is applied to the target-volume DVHs for patients in a clinical trial of conformal pelvic radiotherapy; the effect of dose inhomogeneities on distributions of TCP are shown as well as the potential benefits of customizing the target dose according to normal-tissue DVHs. (author). 37 refs, 9 figs.

  3. Dynamic SEP event probability forecasts

    Science.gov (United States)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
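
    The published algorithm is empirical, built from the NOAA onset-delay distributions; the generic Bayesian form of the idea can be sketched by assuming some delay distribution, here a hypothetical lognormal stand-in.

      import math

      def dynamic_sep_probability(p0, t_hours, median_delay=8.0, log_sd=1.0):
          """P(SEP event still to come | none observed t hours after the flare),
          via Bayes: p0 * (1 - F(t)) / (1 - p0 * F(t)), with F a lognormal CDF.
          The delay distribution here is an illustrative assumption only."""
          if t_hours <= 0.0:
              return p0
          z = (math.log(t_hours) - math.log(median_delay)) / log_sd
          F = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
          return p0 * (1.0 - F) / (1.0 - p0 * F)

      for t in (0, 6, 12, 24, 48):              # hours since the X-ray flare
          print(t, round(dynamic_sep_probability(0.4, t), 3))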

  4. Conditional Independence in Applied Probability.

    Science.gov (United States)

    Pfeiffer, Paul E.

    This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…

  5. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  6. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  7. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations, the relations between seismic strength of the earthquake, focal depth, distance and ground acceleration are calculated. We found that the largest Swedish earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10⁻⁵ has been proposed as interesting. This probability gives ground accelerations in the range 5-20% g for the sites. This acceleration is for a free bedrock site. For consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get the epicentral acceleration of this earthquake to be 5-15% g. The results above are based on an analysis of macroseismic data, as relevant instrumental data is lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)

  8. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2015-01-01

    A statistical procedure of estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. Also safety factor influence is investigated. DECOFF statistical method is benchmarked against standard Alpha-factor...

  9. Risk estimation using probability machines

    Science.gov (United States)

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
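
    The comparison reported can be reproduced in miniature with scikit-learn: when the data-generating model really is logistic, a random-forest "probability machine" tracks the true conditional probabilities nearly as well as the correctly specified logistic model. Synthetic data; all settings below are illustrative.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      X = rng.normal(size=(5000, 3))
      true_p = 1.0 / (1.0 + np.exp(-(0.8 * X[:, 0] - 1.2 * X[:, 1] + 0.5 * X[:, 2])))
      y = rng.random(5000) < true_p             # logistic data-generating model

      rf = RandomForestClassifier(n_estimators=300, min_samples_leaf=25,
                                  random_state=0).fit(X, y)
      lr = LogisticRegression().fit(X, y)

      for name, p in [("forest", rf.predict_proba(X)[:, 1]),
                      ("logistic", lr.predict_proba(X)[:, 1])]:
          print(name, "mean abs error vs true p:",
                round(float(np.mean(np.abs(p - true_p))), 4))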

  10. Probability and statistics: A reminder

    International Nuclear Information System (INIS)

    Clement, B.

    2013-01-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from 'data analysis in experimental sciences' given in [1]. (authors)

  11. Nash equilibrium with lower probabilities

    DEFF Research Database (Denmark)

    Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1998-01-01

    We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible...

  12. On probability-possibility transformations

    Science.gov (United States)

    Klir, George J.; Parviz, Behzad

    1992-01-01

    Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.

  13. Social class and survival on the S.S. Titanic.

    Science.gov (United States)

    Hall, W

    1986-01-01

    Passengers' chances of surviving the sinking of the S.S. Titanic were related to their sex and their social class: females were more likely to survive than males, and the chances of survival declined with social class as measured by the class in which the passenger travelled. The probable reasons for these differences in rates of survival are discussed as are the reasons accepted by the Mersey Committee of Inquiry into the sinking.

  14. Joint modelling of longitudinal CEA tumour marker progression and survival data on breast cancer

    Science.gov (United States)

    Borges, Ana; Sousa, Inês; Castro, Luis

    2017-06-01

    This work proposes the use of biostatistics methods to study breast cancer in patients of Braga's Hospital Senology Unit, located in Portugal. The primary motivation is to contribute to the understanding of the progression of breast cancer within the Portuguese population, using more complex statistical model assumptions than the traditional analysis, taking into account the possible existence of a serial correlation structure within the observations of a same subject. We aim to infer which risk factors affect the survival of Braga's Hospital patients diagnosed with breast tumour, whilst analysing risk factors that affect a tumour marker used in the surveillance of disease progression, the carcinoembryonic antigen (CEA). As the survival and longitudinal processes may be associated, it is important to model these two processes together. Hence, a joint modelling of these two processes was conducted to infer on their association. A data set of 540 patients, along with 50 variables, was collected from medical records of the Hospital. A joint model approach was used to analyse these data. Two different joint models, with different parameterizations that give different interpretations to model parameters, were applied to the same data set; these were chosen for convenience as the ones implemented in the R software. Results from the two models were compared. Results from the joint models showed that the longitudinal CEA values were significantly associated with the survival probability of these patients. A comparison between the parameter estimates obtained in this analysis and previous independent survival [4] and longitudinal analyses [5][6] leads us to conclude that independent analyses produce biased parameter estimates. Hence, an assumption of association between the two processes in a joint model of breast cancer data is necessary. Results indicate that the longitudinal progression of CEA is significantly associated with the probability of survival of these patients. Hence, an assumption of

  15. Simultaneous use of mark-recapture and radiotelemetry to estimate survival, movement, and capture rates

    Science.gov (United States)

    Powell, L.A.; Conroy, M.J.; Hines, J.E.; Nichols, J.D.; Krementz, D.G.

    2000-01-01

    Biologists often estimate separate survival and movement rates from radio-telemetry and mark-recapture data from the same study population. We describe a method for combining these data types in a single model to obtain joint, potentially less biased estimates of survival and movement that use all available data. We furnish an example using wood thrushes (Hylocichla mustelina) captured at the Piedmont National Wildlife Refuge in central Georgia in 1996. The model structure allows estimation of survival and capture probabilities, as well as estimation of movements away from and into the study area. In addition, the model structure provides many possibilities for hypothesis testing. Using the combined model structure, we estimated that wood thrush weekly survival was 0.989 ± 0.007 (±SE). Survival rates of banded and radio-marked individuals were not different (alpha hat [S_radioed, S_banded] = log[S hat_radioed / S hat_banded] = 0.0239 ± 0.0435). Fidelity rates (weekly probability of remaining in a stratum) did not differ between geographic strata (psi hat = 0.911 ± 0.020; alpha hat [psi11, psi22] = 0.0161 ± 0.047), and recapture rates (p hat = 0.097 ± 0.016) of banded and radio-marked individuals were not different (alpha hat [p_radioed, p_banded] = 0.145 ± 0.655). Combining these data types in a common model resulted in more precise estimates of movement and recapture rates than separate estimation, but the ability to detect stratum- or mark-specific differences in parameters was weak. We conducted simulation trials to investigate the effects of varying study designs on parameter accuracy and statistical power to detect important differences. Parameter accuracy was high (relative bias [RBIAS] was low); for inference from this model, study designs should seek a minimum of 25 animals of each marking type observed (marked or observed via telemetry) in each time period and geographic stratum.

  16. Challenging a dogma: five-year survival does not equal cure in all colorectal cancer patients.

    Science.gov (United States)

    Abdel-Rahman, Omar

    2018-02-01

    The current study tried to evaluate the factors affecting 10- to 20-year survival among long-term survivors (>5 years) of colorectal cancer (CRC). The Surveillance, Epidemiology and End Results (SEER) database (1988-2008) was queried through the SEER*Stat program. Univariate probability of overall and cancer-specific survival was determined, and the difference between groups was examined. Multivariate analysis for factors affecting overall and cancer-specific survival was also conducted. Among node-positive patients (Dukes C), 34% of the deaths beyond 5 years can be attributed to CRC, while among M1 patients, 63% of the deaths beyond 5 years can be attributed to CRC. The following factors were predictors of better overall survival in multivariate analysis: younger age, white race (versus black race), female gender, right colon location (versus rectal location), earlier stage and surgery (P <0.0001 for all parameters). Similarly, the following factors were predictors of better cancer-specific survival in multivariate analysis: younger age, white race (versus black race), female gender, right colon location (versus left colon and rectal locations), earlier stage and surgery (P <0.0001 for all parameters). Among node-positive long-term CRC survivors, more than one third of all deaths can be attributed to CRC.

  17. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  18. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    Andre; C. R. Martins

    2006-11-01

    In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated to them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors under the rules of probability theory, they can be understood as adaptations to the solution of real life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.

  19. Probability matching and strategy availability.

    Science.gov (United States)

    Koehler, Derek J; James, Greta

    2010-09-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought to their attention, more participants subsequently engage in maximizing. Third, matchers are more likely than maximizers to base decisions in other tasks on their initial intuitions, suggesting that they are more inclined to use a choice strategy that comes to mind quickly. These results indicate that a substantial subset of probability matchers are victims of "underthinking" rather than "overthinking": They fail to engage in sufficient deliberation to generate a superior alternative to the matching strategy that comes so readily to mind.

  20. Probability as a Physical Motive

    Directory of Open Access Journals (Sweden)

    Peter Martin

    2007-04-01

    Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production (“MEP”) to the information-theoretical “MaxEnt” principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand “the adjacent possible” as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  1. Logic, Probability, and Human Reasoning

    Science.gov (United States)

    2015-01-01

    ...accordingly suggest a way to integrate probability and deduction. The nature of deductive reasoning: to be rational is to be able to make deductions... [3–6] and they underlie mathematics, science, and technology [7–10]. Plato claimed that emotions upset reasoning. However, individuals in the grip... fundamental to human rationality. So, if counterexamples to its principal predictions occur, the theory will at least explain its own refutation

  2. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  3. Probability matching and strategy availability

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2010-01-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought...

  4. The statistical treatment of cell survival data

    International Nuclear Information System (INIS)

    Boag, J.W.

    1975-01-01

    The paper considers the sources of experimental error in cell survival experiments and discusses in simple terms how these combine to influence the accuracy of single points and the parameters of complete survival curves. Cell sampling and medium-dilution errors are discussed at length and one way of minimizing the former is examined. The Monte-Carlo method of estimating the distribution of derived parameters in small samples is recommended and illustrated. (author)
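    As an illustration of the recommended Monte-Carlo approach, the sketch below refits a linear-quadratic survival curve to synthetic data perturbed by binomial counting error; the doses, surviving fractions, and colony counts are hypothetical, chosen only to show the mechanics.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical cell-survival data: dose (Gy) and observed surviving fractions.
    dose = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 8.0])
    surv = np.array([1.0, 0.80, 0.55, 0.22, 0.065, 0.016])
    cells_plated = 500  # assumed number of cells plated per point

    # Fit the linear-quadratic model -ln S = a*D + b*D^2 by least squares.
    def fit_lq(dose, surv):
        X = np.column_stack([dose, dose**2])
        y = -np.log(surv)
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        return coef  # [alpha, beta]

    alpha, beta = fit_lq(dose, surv)

    # Monte Carlo: perturb each point with binomial counting error and refit,
    # giving an empirical small-sample distribution of the derived parameters.
    fits = []
    for _ in range(2000):
        counts = rng.binomial(cells_plated, surv)
        s = np.clip(counts / cells_plated, 1e-6, 1.0)
        fits.append(fit_lq(dose, s))
    fits = np.array(fits)
    print(f"alpha = {alpha:.3f} +/- {fits[:, 0].std():.3f}")
    print(f"beta  = {beta:.4f} +/- {fits[:, 1].std():.4f}")
    ```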

  5. [Biometric bases: basic concepts of probability calculation].

    Science.gov (United States)

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The bases of event algebra, the definition of probability, the classical probability model and the random variable are presented.

  6. Lower Confidence Bounds for the Probabilities of Correct Selection

    Directory of Open Access Journals (Sweden)

    Radhey S. Singh

    2011-01-01

    Full Text Available We extend the results of Gupta and Liang (1998), derived for location parameters, to obtain lower confidence bounds for the probability of correctly selecting the t best populations (PCS_t) simultaneously for all t = 1, …, k−1 for the general scale parameter models, where k is the number of populations involved in the selection problem. The application of the results to the exponential and normal probability models is discussed. The implementation of the simultaneous lower confidence bounds for PCS_t is illustrated through real-life datasets.

  7. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design in support of

  8. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  9. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  10. Sensitivity analysis using probability bounding

    International Nuclear Information System (INIS)

    Ferson, Scott; Troy Tucker, W.

    2006-01-01

    Probability bounds analysis (PBA) provides analysts a convenient means to characterize the neighborhood of possible results that would be obtained from plausible alternative inputs in probabilistic calculations. We show the relationship between PBA and the methods of interval analysis and probabilistic uncertainty analysis from which it is jointly derived, and indicate how the method can be used to assess the quality of probabilistic models such as those developed in Monte Carlo simulations for risk analyses. We also illustrate how a sensitivity analysis can be conducted within a PBA by pinching inputs to precise distributions or real values

  11. Clustered survival data with left-truncation

    DEFF Research Database (Denmark)

    Eriksson, Frank; Martinussen, Torben; Scheike, Thomas H.

    2015-01-01

    Left-truncation occurs frequently in survival studies, and it is well known how to deal with this for univariate survival times. However, there are few results on how to estimate dependence parameters and regression effects in semiparametric models for clustered survival data with delayed entry. Surprisingly, existing methods only deal with special cases. In this paper, we clarify different kinds of left-truncation and suggest estimators for semiparametric survival models under specific truncation schemes. The large-sample properties of the estimators are established. Small-sample properties

  12. Flexural strength and the probability of failure of cold isostatic pressed zirconia core ceramics.

    Science.gov (United States)

    Siarampi, Eleni; Kontonasaki, Eleana; Papadopoulou, Lambrini; Kantiranis, Nikolaos; Zorba, Triantafillia; Paraskevopoulos, Konstantinos M; Koidis, Petros

    2012-08-01

    The flexural strength of zirconia core ceramics must predictably withstand the high stresses developed during oral function. The in-depth interpretation of strength parameters and the probability of failure during clinical performance could assist the clinician in selecting the optimum materials while planning treatment. The purpose of this study was to evaluate the flexural strength based on survival probability and Weibull statistical analysis of 2 zirconia cores for ceramic restorations. Twenty bar-shaped specimens were milled from 2 core ceramics, IPS e.max ZirCAD and Wieland ZENO Zr (WZ), and were loaded until fracture according to ISO 6872 (3-point bending test). An independent samples t test was used to assess significant differences of fracture strength (α=.05). Weibull statistical analysis of the flexural strength data provided 2 parameter estimates: Weibull modulus (m) and characteristic strength (σ(0)). The fractured surfaces of the specimens were evaluated by scanning electron microscopy (SEM) and energy dispersive spectroscopy (EDS). The investigation of the crystallographic state of the materials was performed with x-ray diffraction analysis (XRD) and Fourier transform infrared (FTIR) spectroscopy. Higher mean flexural strength was found for the WZ ceramics. Both groups primarily sustained the tetragonal phase of zirconia and a negligible amount of the monoclinic phase. Although both zirconia ceramics presented similar fractographic and crystallographic properties, the higher flexural strength of WZ ceramics was associated with a lower m and more voids in their microstructure. These findings suggest a greater scattering of strength values and a flaw distribution that are expected to increase failure probability. Copyright © 2012 The Editorial Council of the Journal of Prosthetic Dentistry. Published by Mosby, Inc. All rights reserved.

  13. Complexity for survival of livings

    International Nuclear Information System (INIS)

    Zak, Michail

    2007-01-01

    A connection between survivability of livings and complexity of their behavior is established. New physical paradigms, namely exchange of information via reflections and a chain of abstractions, explaining and describing progressive evolution of complexity in living (active) systems are introduced. A biological origin of these paradigms is associated with a recently discovered mirror neuron that is able to learn by imitation. As a result, an active element possesses the self-nonself images and interacts with them, creating the world of mental dynamics. Three fundamental types of complexity of mental dynamics that contribute to survivability are identified. The mathematical model of the corresponding active systems is described by coupled motor-mental dynamics represented by Langevin and Fokker-Planck equations, respectively, while the progressive evolution of complexity is provided by nonlinear evolution of probability density. Application of the proposed formalism to modeling a common-sense-based decision-making process is discussed

  14. Complexity for survival of livings

    Energy Technology Data Exchange (ETDEWEB)

    Zak, Michail [Jet Propulsion Laboratory, California Institute of Technology, Advance Computing Algorithms and IVHM Group, Pasadena, CA 91109 (United States)]. E-mail: Michail.Zak@jpl.nasa.gov

    2007-05-15

    A connection between survivability of livings and complexity of their behavior is established. New physical paradigms, namely exchange of information via reflections and a chain of abstractions, explaining and describing progressive evolution of complexity in living (active) systems are introduced. A biological origin of these paradigms is associated with a recently discovered mirror neuron that is able to learn by imitation. As a result, an active element possesses the self-nonself images and interacts with them, creating the world of mental dynamics. Three fundamental types of complexity of mental dynamics that contribute to survivability are identified. The mathematical model of the corresponding active systems is described by coupled motor-mental dynamics represented by Langevin and Fokker-Planck equations, respectively, while the progressive evolution of complexity is provided by nonlinear evolution of probability density. Application of the proposed formalism to modeling a common-sense-based decision-making process is discussed.

  15. Biologically-equivalent dose and long-term survival time in radiation treatments

    International Nuclear Information System (INIS)

    Zaider, Marco; Hanin, Leonid

    2007-01-01

    Within the linear-quadratic model the biologically-effective dose (BED), taken to represent treatments with an equal tumor control probability (TCP), is commonly (and plausibly) calculated according to BED(D) = -log[S(D)]/α. We ask whether in the presence of cellular proliferation this claim is justified and examine, as a related question, the extent to which BED approximates an isoeffective dose (IED) defined, more sensibly, in terms of an equal long-term survival probability, rather than TCP. We derive, under the assumption that cellular birth and death rates are time homogeneous, exact equations for the isoeffective dose, IED. As well, we give a rigorous definition of effective long-term survival time, T_eff. By using several sets of radiobiological parameters, we illustrate potential differences between BED and IED on the one hand and, on the other, between T_eff calculated as suggested here or by an earlier recipe. In summary: (a) the equations currently in use for calculating the effective treatment time may underestimate the isoeffective dose and should be avoided. The same is the case for the tumor control probability (TCP), only more so; (b) for permanent implants BED may be a poor substitute for IED; (c) for a fractionated treatment schedule, interpreting the observed probability of cure in terms of a TCP formalism that refers to the end of the treatment (rather than T_eff) may result in a miscalculation (underestimation) of the initial number of clonogens
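    A minimal numerical sketch of the BED(D) = -log[S(D)]/α prescription discussed above, with illustrative linear-quadratic parameters and a standard fractionation scheme (no proliferation term); all values are assumptions for demonstration only.

    ```python
    import numpy as np

    # Illustrative LQ parameters and schedule (assumed, not from the study)
    alpha, beta = 0.3, 0.03   # Gy^-1, Gy^-2
    n, d = 30, 2.0            # number of fractions, dose per fraction (Gy)

    # Surviving fraction after n fractions of size d (no proliferation)
    S = np.exp(-n * (alpha * d + beta * d**2))

    # BED(D) = -ln S(D) / alpha, as in the abstract
    BED = -np.log(S) / alpha
    print(f"S = {S:.3e}, BED = {BED:.1f} Gy")  # equals n*d*(1 + d/(alpha/beta))
    ```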

  16. On estimating the fracture probability of nuclear graphite components

    International Nuclear Information System (INIS)

    Srinivasan, Makuteswara

    2008-01-01

    The properties of nuclear grade graphites exhibit anisotropy and could vary considerably within a manufactured block. Graphite strength is affected by the direction of alignment of the constituent coke particles, which is dictated by the forming method, coke particle size, and the size, shape, and orientation distribution of pores in the structure. In this paper, a Weibull failure probability analysis for components is presented using the American Society of Testing Materials strength specification for nuclear grade graphites for core components in advanced high-temperature gas-cooled reactors. The risk of rupture (probability of fracture) and survival probability (reliability) of large graphite blocks are calculated for varying and discrete values of service tensile stresses. The limitations in these calculations are discussed from considerations of actual reactor environmental conditions that could potentially degrade the specification properties because of damage due to complex interactions between irradiation, temperature, stress, and variability in reactor operation
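    The Weibull risk-of-rupture calculation described above reduces, for a single stress level, to a reliability R = exp[-(σ/σ₀)^m]. A short sketch with assumed (not ASTM-specified) modulus and characteristic strength:

    ```python
    import numpy as np

    # Two-parameter Weibull model for brittle fracture; the values below are
    # illustrative assumptions, not actual nuclear-graphite specifications.
    m, sigma0 = 8.0, 25.0          # Weibull modulus, characteristic strength (MPa)

    def survival_probability(stress_mpa):
        """Reliability R = exp[-(sigma/sigma0)^m]; risk of rupture is 1 - R."""
        return np.exp(-(stress_mpa / sigma0) ** m)

    # Survival and fracture probability at discrete service tensile stresses
    for s in (10.0, 15.0, 20.0, 25.0):
        R = survival_probability(s)
        print(f"stress = {s:4.1f} MPa  survival = {R:.4f}  fracture = {1 - R:.4f}")
    ```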

  17. Probability problems in seismic risk analysis and load combinations for nuclear power plants

    International Nuclear Information System (INIS)

    George, L.L.

    1983-01-01

    This workshop describes some probability problems in power plant reliability and maintenance analysis. The problems are seismic risk analysis, loss of load probability, load combinations, and load sharing. The seismic risk problem is to compute power plant reliability given an earthquake and the resulting risk. Component survival occurs if its peak random response to the earthquake does not exceed its strength. Power plant survival is a complicated Boolean function of component failures and survivals. The responses and strengths of components are dependent random processes, and the peak responses are maxima of random processes. The resulting risk is the expected cost of power plant failure
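    A toy version of the component-survival calculation sketched above, assuming lognormal peak response and strength with an assumed correlation between their logarithms; all numbers are illustrative, not drawn from the workshop.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Illustrative lognormal peak response and strength (in units of g),
    # with correlated ln-variables mimicking the dependence described above.
    mu_r, sig_r = np.log(0.5), 0.4   # ln-median response, ln-std (assumed)
    mu_s, sig_s = np.log(1.2), 0.3   # ln-median strength, ln-std (assumed)
    rho = 0.5                        # assumed correlation of the ln-variables

    cov = [[sig_r**2, rho * sig_r * sig_s],
           [rho * sig_r * sig_s, sig_s**2]]
    ln_r, ln_s = rng.multivariate_normal([mu_r, mu_s], cov, size=n).T

    # Component survives if its peak response does not exceed its strength.
    p_fail = np.mean(ln_r > ln_s)
    print(f"P(component failure | earthquake) ~ {p_fail:.4f}")
    ```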

  18. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another
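    The dice example above contrasts the forward problem (probability) with the inverse problem (statistics); a small simulation, with all quantities hypothetical, makes the contrast concrete.

    ```python
    import numpy as np
    from collections import Counter

    rng = np.random.default_rng(1)

    # Probability: from an a priori model of fair dice, the chance that the
    # sum of two dice equals 7 is 6/36.
    p_seven = 6 / 36

    # Statistics: start from data (simulated rolls here) and infer the
    # frequency of each face, i.e. reason back from outcomes to the model.
    rolls = rng.integers(1, 7, size=(10_000, 2))
    freq_seven = np.mean(rolls.sum(axis=1) == 7)
    face_counts = Counter(rolls.ravel().tolist())

    print(f"a priori P(sum=7) = {p_seven:.4f}, observed = {freq_seven:.4f}")
    print("estimated face probabilities:",
          {f: round(c / rolls.size, 3) for f, c in sorted(face_counts.items())})
    ```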

  19. Normal tissue complication probability for salivary glands

    International Nuclear Information System (INIS)

    Rana, B.S.

    2008-01-01

    The purpose of radiotherapy is to make a profitable balance between the morbidity (due to side effects of radiation) and cure of malignancy. To achieve this, one needs to know the relation between NTCP (normal tissue complication probability) and various treatment variables of a schedule viz. daily dose, duration of treatment, total dose and fractionation along with tissue conditions. Prospective studies require that a large number of patients be treated with varied schedule parameters and a statistically acceptable number of patients develop complications so that a true relation between NTCP and a particular variable is established. In this study Salivary Glands Complications have been considered. The cases treated in 60 Co teletherapy machine during the period 1994 to 2002 were analyzed and the clinicians judgement in ascertaining the end points was the only means of observations. The only end points were early and late xerestomia which were considered for NTCP evaluations for a period of 5 years

  20. Confidence intervals for the lognormal probability distribution

    International Nuclear Information System (INIS)

    Smith, D.L.; Naberejnev, D.G.

    2004-01-01

    The present communication addresses the topic of symmetric confidence intervals for the lognormal probability distribution. This distribution is frequently utilized to characterize inherently positive, continuous random variables that are selected to represent many physical quantities in applied nuclear science and technology. The basic formalism is outlined herein and a conjured numerical example is provided for illustration. It is demonstrated that when the uncertainty reflected in a lognormal probability distribution is large, the use of a confidence interval provides much more useful information about the variable used to represent a particular physical quantity than can be had by adhering to the notion that the mean value and standard deviation of the distribution ought to be interpreted as best value and corresponding error, respectively. Furthermore, it is shown that if the uncertainty is very large a disturbing anomaly can arise when one insists on interpreting the mean value and standard deviation as the best value and corresponding error, respectively. Reliance on using the mode and median as alternative parameters to represent the best available knowledge of a variable with large uncertainties is also shown to entail limitations. Finally, a realistic physical example involving the decay of radioactivity over a time period that spans many half-lives is presented and analyzed to further illustrate the concepts discussed in this communication
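    A brief sketch of the anomaly discussed above: for a lognormal variable with large uncertainty, a central confidence interval stays sensibly positive while the mean-minus-standard-deviation recipe produces a negative "lower error bar" for an inherently positive quantity. Parameter values are invented for illustration.

    ```python
    import numpy as np
    from scipy import stats

    # Lognormal variable: X = exp(Y), Y ~ N(mu, sigma^2). Illustrative values;
    # a large sigma gives a strongly skewed distribution.
    mu, sigma = 1.0, 1.5

    # Central 95% confidence interval for X
    lo, hi = np.exp(stats.norm.ppf([0.025, 0.975], loc=mu, scale=sigma))

    mean = np.exp(mu + sigma**2 / 2)            # arithmetic mean
    sd = mean * np.sqrt(np.exp(sigma**2) - 1)   # standard deviation
    median, mode = np.exp(mu), np.exp(mu - sigma**2)

    print(f"95% interval: [{lo:.2f}, {hi:.2f}]")
    print(f"mean = {mean:.1f}, sd = {sd:.1f}, median = {median:.2f}, mode = {mode:.2f}")
    # With sigma this large, 'best value minus error' is negative, which is
    # meaningless for a positive quantity: the anomaly the abstract warns about.
    print(f"mean - sd = {mean - sd:.1f}")
    ```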

  1. Estimating population parameters of longsnout seahorses, Hippocampus reidi (Teleostei: Syngnathidae through mark-recapture

    Directory of Open Access Journals (Sweden)

    Alexandre C. Siqueira

    2017-12-01

    Full Text Available Estimating population parameters is essential for understanding the ecology of species, which ultimately helps to assess their conservation status. The seahorse Hippocampus reidi is directly exposed to anthropogenic threats along the Brazilian coast, but the species still figures as Data Deficient (DD) on IUCN’s Red List. To provide better information on the ecology of this species, we studied how population parameters vary over time in a natural subtropical environment. By combining mark-recapture models for open and closed populations, we estimated abundance, survival rate, emigration probability, and capture probability. We marked 111 individuals, which showed a 1:1 sex ratio and an average size of 10.5 cm. The population showed a high survival rate, low temporary emigration probability, and variable capture probability and abundance. Our models, which consider relevant biological criteria, illuminate the relatively poorly known population ecology and life history of seahorses. It is our hope that this study inspires the use of mark-recapture methods in other populations of H. reidi in a collective effort to properly assess their conservation status.

  2. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  3. Probability Distribution for Flowing Interval Spacing

    International Nuclear Information System (INIS)

    Kuzio, S.

    2001-01-01

    The purpose of this analysis is to develop a probability distribution for flowing interval spacing. A flowing interval is defined as a fractured zone that transmits flow in the Saturated Zone (SZ), as identified through borehole flow meter surveys (Figure 1). This analysis uses the term "flowing interval spacing" as opposed to fracture spacing, which is typically used in the literature. The term fracture spacing was not used in this analysis because the data used identify a zone (or a flowing interval) that contains fluid-conducting fractures but does not distinguish how many or which fractures comprise the flowing interval. The flowing interval spacing is measured between the midpoints of each flowing interval. Fracture spacing within the SZ is defined as the spacing between fractures, with no regard to which fractures are carrying flow. The Development Plan associated with this analysis is entitled "Probability Distribution for Flowing Interval Spacing" (CRWMS M and O 2000a). The parameter from this analysis may be used in the TSPA SR/LA Saturated Zone Flow and Transport Work Direction and Planning Documents: (1) "Abstraction of Matrix Diffusion for SZ Flow and Transport Analyses" (CRWMS M and O 1999a) and (2) "Incorporation of Heterogeneity in SZ Flow and Transport Analyses" (CRWMS M and O 1999b). A limitation of this analysis is that the probability distribution of flowing interval spacing may underestimate the effect of incorporating matrix diffusion processes in the SZ transport model because of the possible overestimation of the flowing interval spacing. Larger flowing interval spacing results in a decrease in the matrix diffusion processes. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be determined from the data. Because each flowing interval probably has more than one fracture contributing to a flowing interval, the true flowing interval spacing could be

  4. Parameter Estimation

    DEFF Research Database (Denmark)

    Sales-Cruz, Mauricio; Heitzig, Martina; Cameron, Ian

    2011-01-01

    In this chapter the importance of parameter estimation in model development is illustrated through various applications related to reaction systems. In particular, rate constants in a reaction system are obtained through parameter estimation methods. These approaches often require the application of optimisation techniques coupled with dynamic solution of the underlying model. Linear and nonlinear approaches to parameter estimation are investigated. There is also the application of maximum likelihood principles in the estimation of parameters, as well as the use of orthogonal collocation to generate a set of algebraic equations as the basis for parameter estimation. These approaches are illustrated using estimations of kinetic constants from reaction system models.
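    A minimal sketch of estimating a rate constant by nonlinear least squares, in the spirit of the chapter; the reaction, data, and starting values below are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical first-order reaction A -> B: C(t) = C0 * exp(-k t)
    def model(t, c0, k):
        return c0 * np.exp(-k * t)

    t = np.array([0.0, 1.0, 2.0, 4.0, 8.0, 12.0])          # time (min), assumed
    c = np.array([2.00, 1.47, 1.09, 0.59, 0.17, 0.05])     # conc. (mol/L), assumed

    # Nonlinear least-squares estimate of the rate constant k and C0
    (c0_hat, k_hat), cov = curve_fit(model, t, c, p0=[2.0, 0.1])
    c0_err, k_err = np.sqrt(np.diag(cov))
    print(f"k = {k_hat:.3f} +/- {k_err:.3f} 1/min, "
          f"C0 = {c0_hat:.2f} +/- {c0_err:.2f} mol/L")
    ```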

  5. Repair-dependent cell radiation survival and transformation: an integrated theory

    International Nuclear Information System (INIS)

    Sutherland, John C

    2014-01-01

    The repair-dependent model of cell radiation survival is extended to include radiation-induced transformations. The probability of transformation is presumed to scale with the number of potentially lethal damages that are repaired in a surviving cell or the interactions of such damages. The theory predicts that at doses corresponding to high survival, the transformation frequency is the sum of simple polynomial functions of dose; linear, quadratic, etc, essentially as described in widely used linear-quadratic expressions. At high doses, corresponding to low survival, the ratio of transformed to surviving cells asymptotically approaches an upper limit. The low-dose fundamental and high-dose plateau domains are separated by a downwardly concave transition region. Published transformation data for mammalian cells show the high-dose plateaus predicted by the repair-dependent model for both ultraviolet and ionizing radiation. For the neoplastic transformation experiments that were analyzed, the data can be fit with only the repair-dependent quadratic function. At low doses, the transformation frequency is strictly quadratic, but becomes sigmoidal over a wider range of doses. Inclusion of data from the transition region in a traditional linear-quadratic analysis of neoplastic transformation frequency data can exaggerate the magnitude of, or create the appearance of, a linear component. Quantitative analysis of survival and transformation data shows good agreement for ultraviolet radiation; the shapes of the transformation components can be predicted from survival data. For ionizing radiations, both neutrons and x-rays, survival data overestimate the transforming ability for low to moderate doses. The presumed cause of this difference is that, unlike UV photons, a single x-ray or neutron may generate more than one lethal damage in a cell, so the distribution of such damages in the population is not accurately described by Poisson statistics. However, the complete

  6. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions were suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert space with dimension larger than two. If measurement contexts are included into the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.

  7. Recommendations for the tuning of rare event probability estimators

    International Nuclear Information System (INIS)

    Balesdent, Mathieu; Morio, Jérôme; Marzat, Julien

    2015-01-01

    Being able to accurately estimate rare event probabilities is a challenging issue for improving the reliability of complex systems. Several powerful methods such as importance sampling, importance splitting or extreme value theory have been proposed in order to reduce the computational cost and to improve the accuracy of extreme probability estimation. However, the performance of these methods is highly correlated with the choice of tuning parameters, which are very difficult to determine. In order to highlight recommended tunings for such methods, an empirical campaign of automatic tuning on a set of representative test cases is conducted for splitting methods. It yields a reduced set of tuning parameters that may lead to the reliable estimation of rare event probability for various problems. The relevance of the obtained result is assessed on a series of real-world aerospace problems
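    The abstract's tuning campaign concerns splitting methods; as a simpler illustration of why variance-reduction tuning matters, the sketch below uses importance sampling (a related technique named in the abstract), where the tilt location is the tuning parameter. The target probability and distributions are illustrative assumptions.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    n, level = 100_000, 5.0   # estimate p = P(X > 5) for X ~ N(0,1), p ~ 2.9e-7

    # Crude Monte Carlo almost never sees the event at this sample size.
    crude = np.mean(rng.standard_normal(n) > level)

    # Importance sampling: draw from N(level, 1) and reweight by the
    # likelihood ratio phi(x) / phi(x - level) = exp(-level*x + level^2/2).
    x = rng.normal(loc=level, size=n)
    w = np.exp(-level * x + level**2 / 2)
    is_est = np.mean((x > level) * w)

    print(f"exact               = {stats.norm.sf(level):.3e}")
    print(f"crude MC            = {crude:.3e}")
    print(f"importance sampling = {is_est:.3e}")
    ```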

  8. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  9. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  10. Joint probabilities reproducing three EPR experiments on two qubits

    NARCIS (Netherlands)

    Roy, S. M.; Atkinson, D.; Auberson, G.; Mahoux, G.; Singh, V.

    2007-01-01

    An eight-parameter family of the most general non-negative quadruple probabilities is constructed for EPR-Bohm-Aharonov experiments when only three pairs of analyser settings are used. It is a simultaneous representation of three different Bohr-incompatible experimental configurations involving

  11. Probability analysis of dynamical effects of axial piston hydraulic motor

    OpenAIRE

    Sapietova Alzbeta; Dekys Vladimír; Sapieta Milan; Sulka Peter; Gajdos Lukas; Rojek Izabela

    2018-01-01

    The paper presents an analysis of the impact force on the stopper screw in an axial piston hydraulic motor. The solution contains a probabilistic description of the input variables. If the output parameters of the probabilistic solution are compared with arbitrary values and with values acquired by an analytical solution, the probability of proper operation of the device can be evaluated.

  12. Probability, statistics, and associated computing techniques

    International Nuclear Information System (INIS)

    James, F.

    1983-01-01

    This chapter attempts to explore the extent to which it is possible for the experimental physicist to find optimal statistical techniques to provide a unique and unambiguous quantitative measure of the significance of raw data. Discusses statistics as the inverse of probability; normal theory of parameter estimation; normal theory (Gaussian measurements); the universality of the Gaussian distribution; real-life resolution functions; combination and propagation of uncertainties; the sum or difference of 2 variables; local theory, or the propagation of small errors; error on the ratio of 2 discrete variables; the propagation of large errors; confidence intervals; classical theory; Bayesian theory; use of the likelihood function; the second derivative of the log-likelihood function; multiparameter confidence intervals; the method of MINOS; least squares; the Gauss-Markov theorem; maximum likelihood for uniform error distribution; the Chebyshev fit; the parameter uncertainties; the efficiency of the Chebyshev estimator; error symmetrization; robustness vs. efficiency; testing of hypotheses (e.g., the Neyman-Pearson test); goodness-of-fit; distribution-free tests; comparing two one-dimensional distributions; comparing multidimensional distributions; and permutation tests for comparing two point sets

  13. When celibacy matters: incorporating non-breeders improves demographic parameter estimates.

    Science.gov (United States)

    Pardo, Deborah; Weimerskirch, Henri; Barbraud, Christophe

    2013-01-01

    In long-lived species only a fraction of a population breeds at a given time. Non-breeders can represent more than half of adult individuals, calling into doubt the relevance of estimating demographic parameters from the breeders alone. Here we demonstrate the importance of considering observable non-breeders to estimate reliable demographic traits: survival, return, breeding, hatching and fledging probabilities. We study the long-lived quasi-biennial breeding wandering albatross (Diomedea exulans). In this species, the breeding cycle lasts almost a year, and birds that succeed in a given year tend to skip the next breeding occasion, while birds that fail tend to breed again the following year. Most non-breeders remain unobservable at sea, but a substantial number of observable non-breeders (ONB) were still identified on breeding sites. Using multi-state capture-mark-recapture analyses, we used several measures to compare the performance of demographic estimates between models incorporating or ignoring ONB: bias (difference in mean), precision (difference in standard deviation) and accuracy (differences in both mean and standard deviation). Our results highlight that ignoring ONB leads to bias and loss of accuracy in breeding probability and survival estimates. These effects are even stronger when studied in an age-dependent framework. Biases in breeding probabilities and survival increased with age, leading to overestimation of survival at old age, and thus of actuarial senescence, and underestimation of reproductive senescence. We believe our study sheds new light on the difficulties of estimating demographic parameters in species/taxa where a significant part of the population does not breed every year. Taking into account ONB appeared important to improve demographic parameter estimates, models of population dynamics and evolutionary conclusions regarding senescence within and across taxa.

  14. Does probability of occurrence relate to population dynamics?

    Science.gov (United States)

    Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M; Edwards, Thomas C; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E; Zurell, Damaris; Schurr, Frank M

    2014-12-01

    Hutchinson defined species' realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In terms of demography, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species' niches and ranges and to develop reliable mechanistic models of species geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (P_occ). We hypothesized that these relationships vary with species' competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, Western US, France and Switzerland). We used published information of shade tolerance as indicators of light competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with P_occ, while N, and for most regions K, was generally positively correlated with P_occ. Thus, in temperate forest trees the regions of highest occurrence

  15. Does probability of occurrence relate to population dynamics?

    Science.gov (United States)

    Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H.; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M.; Edwards, Thomas C.; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E.; Zurell, Damaris; Schurr, Frank M.

    2014-01-01

    Hutchinson defined species' realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In terms of demography, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species' niches and ranges and to develop reliable mechanistic models of species geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (P_occ). We hypothesized that these relationships vary with species' competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, western USA, France and Switzerland). We used published information of shade tolerance as indicators of light competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with P_occ, while N, and for most regions K, was generally positively correlated with P_occ. Thus, in temperate forest trees the regions of highest occurrence

  16. Modeling the probability distribution of peak discharge for infiltrating hillslopes

    Science.gov (United States)

    Baiamonte, Giorgio; Singh, Vijay P.

    2017-07-01

    Hillslope response plays a fundamental role in the prediction of peak discharge at the basin outlet. The peak discharge for the critical duration of rainfall and its probability distribution are needed for designing urban infrastructure facilities. This study derives the probability distribution, denoted as the GABS model, by coupling three models: (1) the Green-Ampt model for computing infiltration, (2) the kinematic wave model for computing the discharge hydrograph from the hillslope, and (3) the intensity-duration-frequency (IDF) model for computing design rainfall intensity. The Hortonian mechanism for runoff generation is employed for computing the surface runoff hydrograph. Since the antecedent soil moisture condition (ASMC) significantly affects the rate of infiltration, its effect on the probability distribution of peak discharge is investigated. Application to a watershed in Sicily, Italy, shows that as the probability increases, the expected effect of ASMC in increasing the maximum discharge diminishes. Only for low values of probability is the critical duration of rainfall influenced by ASMC, whereas its effect on the peak discharge seems to be smaller for any probability. For a given set of parameters, the derived probability distribution of peak discharge is fitted well by the gamma distribution. Finally, an application to a small watershed, with the aim of testing the possibility of arranging in advance the rational runoff coefficient tables to be used for the rational method, and a comparison between peak discharges obtained by the GABS model and those measured in an experimental flume for a loamy-sand soil were carried out.

  17. Survival pathways under stress

    Indian Academy of Sciences (India)

    Bacteria survive by changing their gene expression pattern. Three important pathways will be discussed: the stringent response, quorum sensing, and proteins performing functions to control oxidative damage.

  18. K-forbidden transition probabilities

    International Nuclear Information System (INIS)

    Saitoh, T.R.; Sletten, G.; Bark, R.A.; Hagemann, G.B.; Herskind, B.; Saitoh-Hashimoto, N.; Tsukuba Univ., Ibaraki

    2000-01-01

    Reduced hindrance factors of K-forbidden transitions are compiled for nuclei with A ≈ 180 where γ-vibrational states are observed. Correlations between these reduced hindrance factors and Coriolis forces, statistical level mixing and γ-softness have been studied. It is demonstrated that the K-forbidden transition probabilities are related to γ-softness. The decay of the high-K bandheads has been studied by means of the two-state mixing, which would be induced by the γ-softness, with the use of a number of K-forbidden transitions compiled in the present work, where high-K bandheads are depopulated by both E2 and ΔI=1 transitions. The validity of the two-state mixing scheme has been examined by using the proposed identity of the B(M1)/B(E2) ratios of transitions depopulating high-K bandheads and levels of low-K bands. A breakdown of the identity might indicate that other levels would mediate transitions between high- and low-K states. (orig.)

  19. Direct probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.

    1993-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration
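    A schematic of the post-processing step described above: given a stack of equally likely simulated maps, the probability map is simply the per-cell fraction of realizations exceeding the action level. The random "realizations" below stand in for true conditional geostatistical simulations, and all values (grid size, action level, units) are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Stand-in for geostatistical realizations: n_sim equally likely maps of
    # uranium concentration on a grid (real studies use conditional simulation).
    n_sim, ny, nx = 200, 50, 50
    base = rng.gamma(shape=2.0, scale=20.0, size=(ny, nx))       # assumed trend
    sims = base + rng.normal(scale=10.0, size=(n_sim, ny, nx))   # realizations

    # Direct probability map: fraction of realizations exceeding the action level.
    action_level = 35.0  # hypothetical cleanup threshold (pCi/g)
    prob_exceed = np.mean(sims > action_level, axis=0)

    # Expected-magnitude map, e.g. for sizing remediation equipment.
    expected = sims.mean(axis=0)
    print(prob_exceed.shape, prob_exceed.min(), prob_exceed.max())
    ```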

  20. Mathematical Methods in Survival Analysis, Reliability and Quality of Life

    CERN Document Server

    Huber, Catherine; Mesbah, Mounir

    2008-01-01

    Reliability and survival analysis are important applications of stochastic mathematics (probability, statistics and stochastic processes) that are usually covered separately in spite of the similarity of the involved mathematical theory. This title aims to redress this situation: it includes 21 chapters divided into four parts: Survival analysis, Reliability, Quality of life, and Related topics. Many of these chapters were presented at the European Seminar on Mathematical Methods for Survival Analysis, Reliability and Quality of Life in 2006.

  1. Effects of mercury on health and first-year survival of free-ranging great egrets (Ardea albus) from southern Florida.

    Science.gov (United States)

    Sepúlveda, M S; Williams, G E; Frederick, P C; Spalding, M G

    1999-10-01

    The objectives of this study were to determine whether elevated mercury (Hg) concentrations have a negative impact on the health and survival of nestling and juvenile free-ranging great egrets (Ardea albus) from southern Florida. During 1994, when health and survival were monitored in a cohort of young birds with naturally variable concentrations of Hg, packed cell volume was positively correlated with blood Hg concentrations, and high Hg concentration in blood was not related to the probability of surviving the first 10.5 months of life. During 1995, 70 first-hatched great egret chicks were included in a Hg field-dosing experiment to compare the effects of elevated Hg on health and survival. Birds in the nest were dosed orally every 2.5 days for 15 days with 0.5 mg of methyl mercury chloride (MeHgCl), for an estimated intake of 1.54 mg MeHgCl/kg food intake. These birds were compared with controls, which received an estimated 0.41 mg MeHgCl/kg food. No differences were observed in health parameters or in the probability of surviving the first 8 months of age between egrets that were dosed with Hg and those that were not. A likely explanation for the lack of any effects on health and survival between both groups could be that chicks at this age were eliminating most of the dietary Hg through the production of new feathers.

  2. THE BLACK HOLE FORMATION PROBABILITY

    Energy Technology Data Exchange (ETDEWEB)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D., E-mail: dclausen@tapir.caltech.edu [TAPIR, Walter Burke Institute for Theoretical Physics, California Institute of Technology, Mailcode 350-17, Pasadena, CA 91125 (United States)

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  3. THE BLACK HOLE FORMATION PROBABILITY

    International Nuclear Information System (INIS)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-01-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment

  4. The Black Hole Formation Probability

    Science.gov (United States)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  5. Measures, Probability and Holography in Cosmology

    Science.gov (United States)

    Phillips, Daniel

    This dissertation compiles four research projects on predicting values for cosmological parameters and models of the universe on the broadest scale. The first examines the Causal Entropic Principle (CEP) in inhomogeneous cosmologies. The CEP aims to predict the unexpectedly small value of the cosmological constant Λ using a weighting by entropy increase on causal diamonds. The original work assumed a purely isotropic and homogeneous cosmology. But even the level of inhomogeneity observed in our universe forces reconsideration of certain arguments about entropy production. In particular, we must consider an ensemble of causal diamonds associated with each background cosmology and we can no longer immediately discard entropy production in the far future of the universe. Depending on our choices for a probability measure and our treatment of black hole evaporation, the prediction for Λ may be left intact or dramatically altered. The second related project extends the CEP to universes with curvature. We have found that curvature values larger than ρ_k = 40ρ_m are disfavored by more than 99.99%, with a peak value at ρ_Λ = 7.9 x 10^-123 and ρ_k = 4.3ρ_m for open universes. For universes that allow only positive curvature or both positive and negative curvature, we find a correlation between curvature and dark energy that leads to an extended region of preferred values. Our universe is found to be disfavored to an extent depending on the priors on curvature. We also provide a comparison to previous anthropic constraints on open universes and discuss future directions for this work. The third project examines how cosmologists should formulate basic questions of probability. We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We

  6. Influence of dose distribution homogeneity on the tumor control probability in heavy-ion radiotherapy

    International Nuclear Information System (INIS)

    Wen Xiaoqiong; Li Qiang; Zhou Guangming; Li Wenjian; Wei Zengquan

    2001-01-01

    In order to estimate the influence of a non-uniform dose distribution on the clinical treatment result, the influence of dose distribution homogeneity on the tumor control probability was investigated. Based on the formula deduced previously for the survival fraction of cells irradiated by a non-uniform heavy-ion irradiation field and the theory of tumor control probability, the tumor control probability was calculated for a tumor model exposed to different dose distribution homogeneities. The results show that the tumor control probability corresponding to the same total dose will decrease if the dose distribution homogeneity gets worse. In clinical treatment, the dose distribution homogeneity should be better than 95%
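    A sketch of how a cold spot lowers TCP at fixed mean dose, using the common Poisson TCP model with linear-quadratic survival; the parameters and voxel doses are illustrative assumptions, not taken from the study.

    ```python
    import numpy as np

    # Poisson TCP model with linear-quadratic cell survival:
    #   TCP = exp(-N0 * SF), SF = exp(-n * (alpha*d + beta*d^2))
    # All parameter values below are illustrative assumptions.
    alpha, beta, N0, n = 0.3, 0.03, 1e9, 30   # Gy^-1, Gy^-2, clonogens, fractions

    def surviving_fraction(d):
        return np.exp(-n * (alpha * d + beta * d**2))

    # Uniform 2 Gy per fraction everywhere in the tumor:
    tcp_uniform = np.exp(-N0 * surviving_fraction(2.0))

    # Same mean dose, but +/-10% inhomogeneity across equal tumor sub-volumes;
    # the cold spot dominates the mean surviving fraction and drags TCP down.
    d_voxels = np.array([1.8, 2.0, 2.2])
    tcp_inhomog = np.exp(-N0 * surviving_fraction(d_voxels).mean())

    print(f"TCP uniform = {tcp_uniform:.2f}, TCP inhomogeneous = {tcp_inhomog:.2f}")
    ```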

  7. Neyman, Markov processes and survival analysis.

    Science.gov (United States)

    Yang, Grace

    2013-07-01

    J. Neyman used stochastic processes extensively in his applied work. One example is the Fix and Neyman (F-N) competing risks model (1951) that uses finite homogeneous Markov processes to analyse clinical trials with breast cancer patients. We revisit the F-N model, and compare it with the Kaplan-Meier (K-M) formulation for right censored data. The comparison offers a way to generalize the K-M formulation to include risks of recovery and relapses in the calculation of a patient's survival probability. The generalization is to extend the F-N model to a nonhomogeneous Markov process. Closed-form solutions of the survival probability are available in special cases of the nonhomogeneous processes, like the popular multiple decrement model (including the K-M model) and Chiang's staging model, but these models do not consider recovery and relapses while the F-N model does. An analysis of sero-epidemiology current status data with recurrent events is illustrated. Fix and Neyman used Neyman's RBAN (regular best asymptotic normal) estimates for the risks, and provided a numerical example showing the importance of considering both the survival probability and the length of time of a patient living a normal life in the evaluation of clinical trials. The said extension would result in a complicated model and it is unlikely to find analytical closed-form solutions for survival analysis. With ever increasing computing power, numerical methods offer a viable way of investigating the problem.
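    For reference, a self-contained sketch of the K-M formulation mentioned above, applied to hypothetical right-censored follow-up data.

    ```python
    import numpy as np

    def kaplan_meier(times, events):
        """Kaplan-Meier survival estimate for right-censored data.

        times  : observed times (event or censoring)
        events : 1 if the event occurred, 0 if right-censored
        Returns the distinct event times and estimated survival probabilities.
        """
        times, events = np.asarray(times, float), np.asarray(events, int)
        t_unique = np.unique(times[events == 1])
        surv, s = [], 1.0
        for t in t_unique:
            at_risk = np.sum(times >= t)            # still under observation
            deaths = np.sum((times == t) & (events == 1))
            s *= 1.0 - deaths / at_risk             # conditional survival at t
            surv.append(s)
        return t_unique, np.array(surv)

    # Hypothetical follow-up data; events=0 marks censored subjects
    t, s = kaplan_meier([2, 3, 3, 5, 8, 8, 12, 14], [1, 1, 0, 1, 1, 0, 1, 0])
    for ti, si in zip(t, s):
        print(f"S({ti:g}) = {si:.3f}")
    ```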

  8. Comparison between standard culture and peptide nucleic acid 16S rRNA hybridization quantification to study the influence of physico-chemical parameters on Legionella pneumophila survival in drinking water biofilms.

    Science.gov (United States)

    Gião, M S; Wilks, S A; Azevedo, N F; Vieira, M J; Keevil, C W

    2009-01-01

    Legionella pneumophila is a waterborne pathogen that is mainly transmitted by the inhalation of contaminated aerosols. In this article, the influence of several physico-chemical parameters relating to the supply of potable water was studied using a L. pneumophila peptide nucleic acid (PNA) specific probe to quantify total L. pneumophila in addition to standard culture methods. A two-stage chemostat was used to form the heterotrophic biofilms, with biofilm generating vessels fed with naturally occurring L. pneumophila. The substratum was the commonly used potable water pipe material, uPVC. It proved impossible to recover cultivable L. pneumophila due to overgrowth by other microorganisms and/or the loss of cultivability of this pathogen. Nevertheless, results obtained for total L. pneumophila cells in biofilms using a specific PNA probe showed that for the two temperatures studied (15 and 20 degrees C), there were no significant differences when shear stress was increased. However, when a source of carbon was added there was a significant increase in numbers at 20 degrees C. A comparison of the two temperatures showed that at 15 degrees C, the total cell numbers for L. pneumophila were generally higher compared with the total microbial flora, suggesting that lower temperatures support the inclusion of L. pneumophila in drinking water biofilms. The work reported in this article suggests that standard culture methods are not accurate for the evaluation of water quality in terms of L. pneumophila. This raises public health concerns since culture methods are still considered to be the gold standard for assessing the presence of this opportunistic pathogen in water.

  9. Network survivability performance

    Science.gov (United States)

    1993-11-01

    This technical report has been developed to address the survivability of telecommunications networks, including services. It responds to the need for a common understanding of, and assessment techniques for, network survivability, availability, integrity, and reliability. It provides a basis for designing and operating telecommunications networks to user expectations for network survivability, and a foundation for continuing industry activities in the subject area. This report focuses on the survivability of both public and private networks and covers a wide range of users. Two frameworks are established: one for quantifying and categorizing service outages, and one for classifying network survivability techniques and measures. The performance of the network survivability techniques is considered; however, recommended objectives are not established for network survivability performance.

  10. Simple and compact expressions for neutrino oscillation probabilities in matter

    International Nuclear Information System (INIS)

    Minakata, Hisakazu; Parke, Stephen J.

    2016-01-01

    We reformulate perturbation theory for neutrino oscillations in matter with an expansion parameter related to the ratio of the solar to the atmospheric Δm² scales. Unlike previous works, we use a renormalized basis in which certain first-order effects are taken into account in the zeroth-order Hamiltonian. We show that the new framework has an exceptional feature that leads to the neutrino oscillation probability in matter with the same structure as in vacuum to first order in the expansion parameter. It facilitates immediate physical interpretation of the formulas, and makes the expressions for the neutrino oscillation probabilities extremely simple and compact. We find, for example, that the ν_e disappearance probability at this order is of a simple two-flavor form with an appropriately identified mixing angle and Δm². More generally, all the oscillation probabilities can be written in the universal form with the channel-discrimination coefficient of 0, ±1 or simple functions of θ₂₃. Despite their simple forms they include all order effects of θ₁₃ and all order effects of the matter potential, to first order in our expansion parameter.
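
    The two-flavor form referred to in the abstract has, in vacuum, the textbook expression P(ν_e → ν_e) = 1 - sin²2θ · sin²(1.267 Δm² L/E), with Δm² in eV², L in km and E in GeV. The sketch below evaluates only this vacuum formula; the mixing and mass-splitting values are illustrative placeholders, and the paper's matter-effect corrections to the effective parameters are not reproduced here.

```python
import numpy as np

def pee_vacuum(L_km, E_GeV, sin2_2theta=0.085, dm2_eV2=2.5e-3):
    """Two-flavour electron-neutrino survival probability in vacuum.
    The angle and mass splitting shown are illustrative placeholder values."""
    return 1.0 - sin2_2theta * np.sin(1.267 * dm2_eV2 * L_km / E_GeV) ** 2

print(pee_vacuum(L_km=1500.0, E_GeV=2.0))
```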

  11. Estimating demographic parameters using a combination of known-fate and open N-mixture models.

    Science.gov (United States)

    Schmidt, Joshua H; Johnson, Devin S; Lindberg, Mark S; Adams, Layne G

    2015-10-01

    Accurate estimates of demographic parameters are required to infer appropriate ecological relationships and inform management actions. Known-fate data from marked individuals are commonly used to estimate survival rates, whereas N-mixture models use count data from unmarked individuals to estimate multiple demographic parameters. However, a joint approach combining the strengths of both analytical tools had not been developed. Here we develop an integrated model combining known-fate and open N-mixture models, allowing the estimation of detection probability and recruitment, and the joint estimation of survival. We demonstrate our approach through both simulations and an applied example using four years of known-fate and pack count data for wolves (Canis lupus). Simulation results indicated that the integrated model reliably recovered parameters with no evidence of bias, and survival estimates were more precise under the joint model. Results from the applied example indicated that the marked sample of wolves was biased toward individuals with higher apparent survival rates than their unmarked pack mates, suggesting that joint estimates may be more representative of the overall population. Our integrated model is a practical approach for reducing bias while increasing precision and the amount of information gained from mark-resight data sets. We provide implementations in both the BUGS language and an R package.
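
    The known-fate half of such an integrated model has a particularly simple likelihood: conditional on the fates of marked animals being known, the number of survivors is binomial in the survival rate. The sketch below shows only this component with made-up numbers; the paper's joint model additionally couples it to an open N-mixture likelihood for the unmarked counts.

```python
import numpy as np
from scipy.stats import binom

# Hypothetical known-fate data: 40 radio-collared wolves tracked over a year,
# of which 31 survived. The known-fate likelihood is binomial in survival s.
n, y = 40, 31
s_grid = np.linspace(0.01, 0.99, 981)
loglik = binom.logpmf(y, n, s_grid)      # binomial log-likelihood over a grid
s_hat = s_grid[np.argmax(loglik)]
print(f"MLE survival rate: {s_hat:.3f}")  # close to y/n = 0.775
```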

  12. Foundations of the theory of probability

    CERN Document Server

    Kolmogorov, AN

    2018-01-01

    This famous little book remains a foundational text for the understanding of probability theory, important both to students beginning a serious study of probability and to historians of modern mathematics. 1956 second edition.

  13. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  14. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  15. Analytic Neutrino Oscillation Probabilities in Matter: Revisited

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT

    2018-01-02

    We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.

  16. Void probability scaling in hadron nucleus interactions

    International Nuclear Information System (INIS)

    Ghosh, Dipak; Deb, Argha; Bhattacharyya, Swarnapratim; Ghosh, Jayita; Bandyopadhyay, Prabhat; Das, Rupa; Mukherjee, Sima

    2002-01-01

    Hegyi, while investigating the rapidity gap probability (which measures the chance of finding no particle in the pseudo-rapidity interval Δη), found that a scaling behavior in the rapidity gap probability has a close correspondence with the scaling of the void probability in galaxy correlation studies. The main aim of this paper is to study the scaling behavior of the rapidity gap probability.

  17. Bayesian probability theory applications in the physical sciences

    CERN Document Server

    Linden, Wolfgang von der; Toussaint, Udo von

    2014-01-01

    From the basics to the forefront of modern research, this book presents all aspects of probability theory, statistics and data analysis from a Bayesian perspective for physicists and engineers. The book presents the roots, applications and numerical implementation of probability theory, and covers advanced topics such as maximum entropy distributions, stochastic processes, parameter estimation, model selection, hypothesis testing and experimental design. In addition, it explores state-of-the-art numerical techniques required to solve demanding real-world problems. The book is ideal for students and researchers in physical sciences and engineering.

  18. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  19. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  20. Hyperprolinemic larvae of the drosophilid fly, Chymomyza costata, survive cryopreservation in liquid nitrogen.

    Science.gov (United States)

    Kostál, Vladimír; Zahradnícková, Helena; Šimek, Petr

    2011-08-09

    The larva of the drosophilid fly, Chymomyza costata, is probably the most complex metazoan organism that can survive submergence in liquid nitrogen (-196 °C) in a fully hydrated state. We examined the associations between the physiological and biochemical parameters of differently acclimated larvae and their freeze tolerance. Entering diapause is an essential and sufficient prerequisite for attaining high levels of survival in liquid nitrogen (23% survival to adult stage), although cold acclimation further improves this capacity (62% survival). Profiling of 61 different metabolites identified proline as a prominent compound whose concentration increased from 20 to 147 mM during diapause transition and subsequent cold acclimation. This study provides direct evidence for the essential role of proline in high freeze tolerance. We increased the levels of proline in the larval tissues by feeding larvae proline-augmented diets and found that this simple treatment dramatically improved their freeze tolerance. Cell and tissue survival following exposure to liquid nitrogen was evident in proline-fed nondiapause larvae, and survival to adult stage increased from 0% to 36% in proline-fed diapause-destined larvae. A significant statistical correlation was found between the whole-body concentration of proline, either natural or artificial, and survival to the adult stage in liquid nitrogen for diapause larvae. Differential scanning calorimetry analysis suggested that high proline levels, in combination with a relatively low content of osmotically active water and freeze dehydration, increased the propensity of the remaining unfrozen water to undergo a glass-like transition (vitrification) and thus facilitated the prevention of cryoinjury.

  1. Conditional probability of the tornado missile impact given a tornado occurrence

    International Nuclear Information System (INIS)

    Goodman, J.; Koch, J.E.

    1982-01-01

    Using an approach based on statistical mechanics, an expression for the probability of the first missile strike is developed. The expression depends on two generic parameters (injection probability eta(F) and height distribution psi(Z,F)), which are developed in this study, and one plant-specific parameter (number of potential missiles N_p). The expression for the joint probability of simultaneous impact of multiple targets is also developed. This expression is applicable to the calculation of the probability of common cause failure due to tornado missiles. It is shown that the probability of the first missile strike can be determined using a uniform missile distribution model. It is also shown that the conditional probability of the second strike, given the first, is underestimated by the uniform model. The probability of the second strike is greatly increased if the missiles are in clusters large enough to cover both targets.

  2. 51Cr - erythrocyte survival curves

    International Nuclear Information System (INIS)

    Paiva Costa, J. de.

    1982-07-01

    Sixteen subjects were studied: fifteen patients in a hemolytic state and one normal individual as a control. The aim was to obtain better techniques for the analysis of erythrocyte survival curves, according to the recommendations of the International Committee of Hematology. Radioactive chromium (51Cr) was used as the tracer. A review of the international literature was first carried out on the aspects relevant to the work in progress, making it possible to establish comparisons and to clarify phenomena observed in our investigation. Several parameters were considered in this study, involving both the exponential and the linear curves. The analysis of the survival curves of the erythrocytes in the studied group revealed that the elution factor did not present a quantitatively homogeneous response in all cases, although the results of the analysis of these curves were established through programs run on an electronic calculator. (Author) [pt

  3. Inventory parameters

    CERN Document Server

    Sharma, Sanjay

    2017-01-01

    This book provides a detailed overview of various parameters/factors involved in inventory analysis. It especially focuses on the assessment and modeling of basic inventory parameters, namely demand, procurement cost, cycle time, ordering cost, inventory carrying cost, inventory stock, stock out level, and stock out cost. In the context of economic lot size, it provides equations related to the optimum values. It also discusses why the optimum lot size and optimum total relevant cost are considered to be key decision variables, and uses numerous examples to explain each of these inventory parameters separately. Lastly, it provides detailed information on parameter estimation for different sectors/products. Written in a simple and lucid style, it offers a valuable resource for a broad readership, especially Master of Business Administration (MBA) students.
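
    The economic-lot-size equations mentioned in the description revolve around the classical EOQ result Q* = sqrt(2DS/H), which balances ordering cost against carrying cost. A minimal sketch with assumed demand and cost figures:

```python
import math

def eoq(annual_demand, ordering_cost, holding_cost):
    """Classical economic order quantity: minimizes ordering + carrying cost."""
    return math.sqrt(2.0 * annual_demand * ordering_cost / holding_cost)

q = eoq(annual_demand=12000, ordering_cost=50.0, holding_cost=2.4)
print(f"optimum lot size: {q:.0f} units")  # ~707 for these assumed figures
```

    At this lot size the annual ordering cost equals the annual carrying cost, which is what makes Q* optimal in the basic model.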

  4. Modelling survival after treatment of intraocular melanoma using artificial neural networks and Bayes theorem

    International Nuclear Information System (INIS)

    Taktak, Azzam F G; Fisher, Anthony C; Damato, Bertil E

    2004-01-01

    This paper describes the development of an artificial intelligence (AI) system for survival prediction from intraocular melanoma. The system used artificial neural networks (ANNs) with five input parameters: coronal and sagittal tumour location, anterior tumour margin, largest basal tumour diameter and the cell type. After excluding records with missing data, 2331 patients were included in the study. These were split randomly into training and test sets. Date censorship was applied to the records to deal with patients who were lost to follow-up and patients who died from general causes. Bayes theorem was then applied to the ANN output to construct survival probability curves. A validation set with 34 patients unseen to both training and test sets was used to compare the AI system with Cox's regression (CR) and Kaplan-Meier (KM) analyses. Results showed large differences in the mean 5-year survival probability figures when the number of records with matching characteristics was small. However, as the number of matches increased to >100, the system tended to agree with CR and KM. The validation set was also used to compare the system with a clinical expert in predicting time to metastatic death. The rms error was 3.7 years for the system and 4.3 years for the clinical expert for 15-year survival. For <10-year survival, these figures were 2.7 and 4.2 years, respectively. We concluded that the AI system can match, if not outperform, the clinical expert's prediction. There were significant differences with CR and KM analyses when the number of records was small, but it was not known which model is more accurate.

  5. Modelling survival after treatment of intraocular melanoma using artificial neural networks and Bayes theorem

    Energy Technology Data Exchange (ETDEWEB)

    Taktak, Azzam F G [Department of Clinical Engineering, Duncan Building, Royal Liverpool University Hospital, Liverpool L7 8XP (United Kingdom); Fisher, Anthony C [Department of Clinical Engineering, Duncan Building, Royal Liverpool University Hospital, Liverpool L7 8XP (United Kingdom); Damato, Bertil E [Department of Ophthalmology, Royal Liverpool University Hospital, Liverpool L7 8XP (United Kingdom)

    2004-01-07

    This paper describes the development of an artificial intelligence (AI) system for survival prediction from intraocular melanoma. The system used artificial neural networks (ANNs) with five input parameters: coronal and sagittal tumour location, anterior tumour margin, largest basal tumour diameter and the cell type. After excluding records with missing data, 2331 patients were included in the study. These were split randomly into training and test sets. Date censorship was applied to the records to deal with patients who were lost to follow-up and patients who died from general causes. Bayes theorem was then applied to the ANN output to construct survival probability curves. A validation set with 34 patients unseen to both training and test sets was used to compare the AI system with Cox's regression (CR) and Kaplan-Meier (KM) analyses. Results showed large differences in the mean 5-year survival probability figures when the number of records with matching characteristics was small. However, as the number of matches increased to >100, the system tended to agree with CR and KM. The validation set was also used to compare the system with a clinical expert in predicting time to metastatic death. The rms error was 3.7 years for the system and 4.3 years for the clinical expert for 15-year survival. For <10-year survival, these figures were 2.7 and 4.2 years, respectively. We concluded that the AI system can match, if not outperform, the clinical expert's prediction. There were significant differences with CR and KM analyses when the number of records was small, but it was not known which model is more accurate.

  6. Combining Gene Signatures Improves Prediction of Breast Cancer Survival

    Science.gov (United States)

    Zhao, Xi; Naume, Bjørn; Langerød, Anita; Frigessi, Arnoldo; Kristensen, Vessela N.; Børresen-Dale, Anne-Lise; Lingjærde, Ole Christian

    2011-01-01

    Background Several gene sets for prediction of breast cancer survival have been derived from whole-genome mRNA expression profiles. Here, we develop a statistical framework to explore whether combination of the information from such sets may improve prediction of recurrence and breast cancer specific death in early-stage breast cancers. Microarray data from two clinically similar cohorts of breast cancer patients are used as training (n = 123) and test set (n = 81), respectively. Gene sets from eleven previously published gene signatures are included in the study. Principal Findings To investigate the relationship between breast cancer survival and gene expression on a particular gene set, a Cox proportional hazards model is applied using partial likelihood regression with an L2 penalty to avoid overfitting and using cross-validation to determine the penalty weight. The fitted models are applied to an independent test set to obtain a predicted risk for each individual and each gene set. Hierarchical clustering of the test individuals on the basis of the vector of predicted risks results in two clusters with distinct clinical characteristics in terms of the distribution of molecular subtypes, ER, PR status, TP53 mutation status and histological grade category, and associated with significantly different survival probabilities (recurrence: p = 0.005; breast cancer death: p = 0.014). Finally, principal components analysis of the gene signatures is used to derive combined predictors used to fit a new Cox model. This model classifies test individuals into two risk groups with distinct survival characteristics (recurrence: p = 0.003; breast cancer death: p = 0.001). The latter classifier outperforms all the individual gene signatures, as well as Cox models based on traditional clinical parameters and the Adjuvant! Online for survival prediction. Conclusion Combining the predictive strength of multiple gene signatures improves prediction of breast cancer survival.
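
    The core fitting step described above, Cox partial likelihood regression with an L2 (ridge) penalty, can be sketched in a self-contained way. The implementation below uses a Breslow-style partial likelihood without tie handling, and the simulated data and fixed penalty weight are stand-ins; the study tuned the penalty by cross-validation.

```python
import numpy as np
from scipy.optimize import minimize

def neg_penalized_partial_loglik(beta, X, time, event, lam):
    """Negative Cox partial log-likelihood with an L2 (ridge) penalty."""
    eta = X @ beta
    order = np.argsort(-time)                 # sort by decreasing time
    eta, ev = eta[order], event[order]
    log_risk = np.logaddexp.accumulate(eta)   # log of cumulative risk sets
    pll = np.sum(ev * (eta - log_risk))
    return -pll + lam * np.sum(beta ** 2)

rng = np.random.default_rng(0)
n, p = 200, 5
X = rng.normal(size=(n, p))
true_beta = np.array([0.8, -0.5, 0.0, 0.0, 0.3])
time = rng.exponential(scale=np.exp(-X @ true_beta))   # hazard = exp(X beta)
event = (rng.uniform(size=n) < 0.8).astype(float)      # illustrative ~20% censoring

fit = minimize(neg_penalized_partial_loglik, np.zeros(p),
               args=(X, time, event, 1.0), method="BFGS")
print(fit.x)   # shrunken estimates of the regression coefficients
```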

  7. Combining gene signatures improves prediction of breast cancer survival.

    Directory of Open Access Journals (Sweden)

    Xi Zhao

    BACKGROUND: Several gene sets for prediction of breast cancer survival have been derived from whole-genome mRNA expression profiles. Here, we develop a statistical framework to explore whether combination of the information from such sets may improve prediction of recurrence and breast cancer specific death in early-stage breast cancers. Microarray data from two clinically similar cohorts of breast cancer patients are used as training (n = 123) and test set (n = 81), respectively. Gene sets from eleven previously published gene signatures are included in the study. PRINCIPAL FINDINGS: To investigate the relationship between breast cancer survival and gene expression on a particular gene set, a Cox proportional hazards model is applied using partial likelihood regression with an L2 penalty to avoid overfitting and using cross-validation to determine the penalty weight. The fitted models are applied to an independent test set to obtain a predicted risk for each individual and each gene set. Hierarchical clustering of the test individuals on the basis of the vector of predicted risks results in two clusters with distinct clinical characteristics in terms of the distribution of molecular subtypes, ER, PR status, TP53 mutation status and histological grade category, and associated with significantly different survival probabilities (recurrence: p = 0.005; breast cancer death: p = 0.014). Finally, principal components analysis of the gene signatures is used to derive combined predictors used to fit a new Cox model. This model classifies test individuals into two risk groups with distinct survival characteristics (recurrence: p = 0.003; breast cancer death: p = 0.001). The latter classifier outperforms all the individual gene signatures, as well as Cox models based on traditional clinical parameters and the Adjuvant! Online for survival prediction. CONCLUSION: Combining the predictive strength of multiple gene signatures improves prediction of breast cancer survival.

  8. Dependent Human Error Probability Assessment

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2006-01-01

    This paper presents an assessment of the dependence between dynamic operator actions modeled in a Nuclear Power Plant (NPP) PRA and estimates the associated impact on core damage frequency (CDF). This assessment was done to improve the implementation of HEP dependencies in the existing PRA. All of the dynamic operator actions modeled in the NPP PRA are included in this assessment. Determining the level of HEP dependence and the associated influence on CDF are the major steps of this assessment. A decision on how to apply the results, i.e., whether permanent HEP model changes should be made, is based on the resulting relative CDF increase. A CDF increase threshold was selected based on the NPP base CDF value and acceptance guidelines from Regulatory Guide 1.174. HEP dependences resulting in a CDF increase of > 5E-07 would be considered potential candidates for specific incorporation into the baseline model. The approach used to judge the level of dependence between operator actions is based on dependency level categories and conditional probabilities developed in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications, NUREG/CR-1278. To simplify the process, NUREG/CR-1278 identifies five levels of dependence: ZD (zero dependence), LD (low dependence), MD (moderate dependence), HD (high dependence), and CD (complete dependence). NUREG/CR-1278 also identifies several qualitative factors that could be involved in determining the level of dependence. Based on the NUREG/CR-1278 information, Time, Function, and Spatial attributes were judged to be the most important considerations when determining the level of dependence between operator actions within an accident sequence. These attributes were used to develop qualitative criteria (rules) that were used to judge the level of dependence (CD, HD, MD, LD, ZD) between the operator actions. After the level of dependence between the various HEPs is judged, quantitative values associated with the

  9. A new algorithm for finding survival coefficients employed in reliability equations

    Science.gov (United States)

    Bouricius, W. G.; Flehinger, B. J.

    1973-01-01

    Product reliabilities are predicted from past failure rates and a reasonable estimate of future failure rates. An algorithm is used to calculate the probability that the product will function correctly. The algorithm sums, over all possible ways in which the product can survive, the probability of each survival pattern multiplied by the number of permutations for that pattern.
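
    As a generic illustration of the pattern-summing idea (not the authors' specific algorithm), consider the textbook case of a system that works when at least k of n identical, independent units survive; the reliability is a sum over all surviving patterns, grouped by the number of permutations of each pattern.

```python
from math import comb

def k_of_n_reliability(n, k, p):
    """Probability that at least k of n identical, independent units survive,
    summing C(n, i) * p**i * (1-p)**(n-i) over all surviving patterns."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

print(k_of_n_reliability(n=5, k=3, p=0.9))  # ~0.991
```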

  10. Challenges in the estimation of Net SURvival: The CENSUR working survival group.

    Science.gov (United States)

    Giorgi, R

    2016-10-01

    Net survival, the survival probability that would be observed, in a hypothetical world, where the cancer of interest would be the only possible cause of death, is a key indicator in population-based cancer studies. Accounting for mortality due to other causes, it allows cross-country comparisons or trends analysis and provides a useful indicator for public health decision-making. The objective of this study was to show how the creation and formalization of a network comprising established research teams, which already had substantial and complementary experience in both cancer survival analysis and methodological development, make it possible to meet challenges and thus provide more adequate tools, to improve the quality and the comparability of cancer survival data, and to promote methodological transfers in areas of emerging interest. The Challenges in the Estimation of Net SURvival (CENSUR) working survival group is composed of international researchers highly skilled in biostatistics, methodology, and epidemiology, from different research organizations in France, the United Kingdom, Italy, Slovenia, and Canada, and involved in French (FRANCIM) and European (EUROCARE) cancer registry networks. The expected advantages are an interdisciplinary, international, synergistic network capable of addressing problems in public health, for decision-makers at different levels; tools for those in charge of net survival analyses; a common methodology that makes unbiased cross-national comparisons of cancer survival feasible; transfer of methods for net survival estimations to other specific applications (clinical research, occupational epidemiology); and dissemination of results during an international training course. The formalization of the international CENSUR working survival group was motivated by a need felt by scientists conducting population-based cancer research to discuss, develop, and monitor implementation of a common methodology to analyze net survival in order

  11. Using Thermal Inactivation Kinetics to Calculate the Probability of Extreme Spore Longevity: Implications for Paleomicrobiology and Lithopanspermia

    Science.gov (United States)

    Nicholson, Wayne L.

    2003-12-01

    Thermal inactivation kinetics with extrapolation were used to model the survival probabilities of spores of various Bacillus species over time periods of millions of years at the historical ambient temperatures (25-40 °C) encountered within the 250 million-year-old Salado formation, from which the putative ancient spore-forming bacterium Salibacillus marismortui strain 2-9-3 was recovered. The model indicated extremely low-to-moderate survival probabilities for spores of mesophiles, but surprisingly high survival probabilities for thermophilic spores. The significance of the results is discussed in terms of the survival probabilities of (i) terrestrial spores in ancient geologic samples and (ii) spores transported between planets within impact ejecta.
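
    The extrapolation rests on standard log-linear thermal inactivation kinetics: a decimal reduction time D(T) = D_ref * 10^((T_ref - T)/z), with the surviving fraction falling as 10^(-t/D). The sketch below applies this with placeholder D and z values, not the paper's fitted parameters, to show how the calculation scales to geologic time.

```python
def log10_survival(t_years, temp_c, d121_min=1.0, z_c=10.0):
    """Log-linear thermal inactivation extrapolated to geologic time.
    D(T) = D_ref * 10**((T_ref - T)/z); survival fraction = 10**(-t/D).
    The D121 and z values here are placeholders, not the paper's fits."""
    d_minutes = d121_min * 10.0 ** ((121.0 - temp_c) / z_c)
    t_minutes = t_years * 365.25 * 24 * 60
    return -t_minutes / d_minutes   # log10 of the surviving fraction

# Expected log10 reduction after 250 Myr at 30 degrees C:
print(log10_survival(t_years=250e6, temp_c=30.0))
```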

  12. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, and to demonstrate their applicability to the solution of engineering problems.

  13. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability of failure.

  14. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axiomatics.

  15. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions, employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes' theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.

  16. Statistical physics of pairwise probability models

    Directory of Open Access Journals (Sweden)

    Yasser Roudi

    2009-11-01

    Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the means and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying and using pairwise models. We build on our previous work on the subject and study the relation between different methods for fitting these models and evaluating their quality. In particular, using data from simulated cortical networks we study how the quality of various approximate methods for inferring the parameters in a pairwise model depends on the time bin chosen for binning the data. We also study the effect of the size of the time bin on the model quality itself, again using simulated data. We show that using finer time bins increases the quality of the pairwise model. We offer new ways of deriving the expressions reported in our previous work for assessing the quality of pairwise models.
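
    For small systems, the pairwise model discussed here can be written down and normalized exactly: P(s) is proportional to exp(sum_i h_i s_i + sum_{i<j} J_ij s_i s_j). The sketch below enumerates all states of a three-unit binary system with assumed fields h and couplings J; real applications instead fit h and J to the measured means and pairwise correlations.

```python
import numpy as np
from itertools import product

def pairwise_probs(h, J):
    """Exact probabilities of a pairwise (Ising-type) model over binary states.
    Feasible only for small n, since all 2**n states are enumerated."""
    n = len(h)
    states = np.array(list(product([0, 1], repeat=n)))
    # sum_i h_i s_i plus sum_{i<j} J_ij s_i s_j for every state
    energies = states @ h + np.einsum('ki,ij,kj->k', states, np.triu(J, 1), states)
    w = np.exp(energies)
    return states, w / w.sum()

h = np.array([-1.0, 0.5, -0.2])          # assumed fields
J = np.array([[0.0, 0.8, 0.0],           # assumed symmetric couplings
              [0.8, 0.0, -0.4],
              [0.0, -0.4, 0.0]])
states, p = pairwise_probs(h, J)
print(p.sum(), p.max())                  # probabilities sum to 1
```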

  17. Multinationals and plant survival

    DEFF Research Database (Denmark)

    Bandick, Roger

    2010-01-01

    The aim of this paper is twofold: first, to investigate how different ownership structures affect plant survival, and second, to analyze how the presence of foreign multinational enterprises (MNEs) affects domestic plants' survival. Using a unique and detailed data set on the Swedish manufacturing sector, I am able to separate plants into those owned by foreign MNEs, domestic MNEs, exporting non-MNEs, and purely domestic firms. In line with previous findings, the result, when conditioned on other factors affecting survival, shows that foreign MNE plants have lower survival rates than non-MNE plants. However, separating the non-MNEs into exporters and non-exporters, the result shows that foreign MNE plants have higher survival rates than non-exporting non-MNEs, while the survival rates of foreign MNE plants and exporting non-MNE plants do not seem to differ.

  18. Probability Distribution for Flowing Interval Spacing

    International Nuclear Information System (INIS)

    S. Kuzio

    2004-01-01

    Fracture spacing is a key hydrologic parameter in analyses of matrix diffusion. Although the individual fractures that transmit flow in the saturated zone (SZ) cannot be identified directly, it is possible to determine the fractured zones that transmit flow from flow meter survey observations. The fractured zones that transmit flow, as identified through borehole flow meter surveys, have been defined in this report as flowing intervals. The flowing interval spacing is measured between the midpoints of each flowing interval. The determination of flowing interval spacing is important because the flowing interval spacing parameter is a key hydrologic parameter in SZ transport modeling, which impacts the extent of matrix diffusion in the SZ volcanic matrix. The output of this report is input to the "Saturated Zone Flow and Transport Model Abstraction" (BSC 2004 [DIRS 170042]). Specifically, the analysis of data and development of a data distribution reported herein is used to develop the uncertainty distribution for the flowing interval spacing parameter for the SZ transport abstraction model. Figure 1-1 shows the relationship of this report to other model reports that also pertain to flow and transport in the SZ. Figure 1-1 also shows the flow of key information among the SZ reports. It should be noted that Figure 1-1 does not contain a complete representation of the data and parameter inputs and outputs of all SZ reports, nor does it show inputs external to this suite of SZ reports. Use of the developed flowing interval spacing probability distribution is subject to the limitations of the assumptions discussed in Sections 5 and 6 of this analysis report. The number of fractures in a flowing interval is not known. Therefore, the flowing intervals are assumed to be composed of one flowing zone in the transport simulations. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be determined.

  19. Can confidence indicators forecast the probability of expansion in Croatia?

    Directory of Open Access Journals (Sweden)

    Mirjana Čižmešija

    2016-04-01

    The aim of this paper is to investigate how reliable confidence indicators are in forecasting the probability of expansion. We consider three Croatian Business Survey indicators: the Industrial Confidence Indicator (ICI), the Construction Confidence Indicator (BCI), and the Retail Trade Confidence Indicator (RTCI). The quarterly data used in the research cover the period from 1999/Q1 to 2014/Q1. The empirical analysis consists of two parts. The non-parametric Bry-Boschan algorithm is used for distinguishing periods of expansion from periods of recession in the Croatian economy. Then, various nonlinear probit models were estimated. The models differ with respect to the regressors (confidence indicators) and the time lags. The positive signs of the estimated parameters suggest that the probability of expansion increases with an increase in the confidence indicators. Based on the obtained results, the conclusion is that the ICI is the most powerful predictor of the probability of expansion in Croatia.
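
    A minimal version of the probit step might look as follows, regressing a binary expansion indicator on a lagged confidence indicator; the data here are simulated stand-ins, and statsmodels is an assumed dependency rather than anything the paper specifies.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical quarterly data: expansion indicator (1 = expansion, e.g. from a
# Bry-Boschan dating) regressed on a lagged confidence indicator.
rng = np.random.default_rng(42)
ici_lag1 = rng.normal(0, 10, size=60)                        # stand-in for ICI(t-1)
expansion = (0.08 * ici_lag1 + rng.normal(size=60) > 0).astype(int)

X = sm.add_constant(ici_lag1)
fit = sm.Probit(expansion, X).fit(disp=0)
print(fit.params)           # positive slope -> higher ICI raises P(expansion)
print(fit.predict(X)[:4])   # fitted probabilities of expansion
```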

  20. Efficient simulation of tail probabilities of sums of correlated lognormals

    DEFF Research Database (Denmark)

    Asmussen, Søren; Blanchet, José; Juneja, Sandeep

    We consider the problem of efficient estimation of tail probabilities of sums of correlated lognormals via simulation. This problem is motivated by the tail analysis of portfolios of assets driven by correlated Black-Scholes models. We propose two estimators that can be rigorously shown to be efficient; for the first, we optimize the scaling parameter of the covariance. The second estimator decomposes the probability of interest in two contributions and takes advantage of the fact that large deviations for a sum of correlated lognormals are (asymptotically) caused by the largest increment.
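
    As a baseline for comparison with the efficient estimators described above, crude Monte Carlo for the tail probability P(sum_i exp(X_i) > t) of a sum of correlated lognormals looks as follows; its relative error blows up as the event becomes rarer, which is precisely what motivates importance sampling. The mean vector, covariance and threshold are arbitrary illustrative choices.

```python
import numpy as np

def crude_mc_tail(mu, cov, threshold, n_samples=1_000_000, seed=1):
    """Crude Monte Carlo estimate of P(sum_i exp(X_i) > threshold), X ~ N(mu, cov).
    The paper's efficient estimators replace this baseline with importance
    sampling; shown here only for orientation."""
    rng = np.random.default_rng(seed)
    x = rng.multivariate_normal(mu, cov, size=n_samples)
    hits = np.exp(x).sum(axis=1) > threshold
    return hits.mean(), hits.std(ddof=1) / np.sqrt(n_samples)

mu = np.zeros(3)
cov = 0.25 * np.eye(3) + 0.15 * (np.ones((3, 3)) - np.eye(3))  # correlated
print(crude_mc_tail(mu, cov, threshold=10.0))  # rare event: noisy estimate
```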

  1. Survival analysis approach to account for non-exponential decay rate effects in lifetime experiments

    Energy Technology Data Exchange (ETDEWEB)

    Coakley, K.J., E-mail: kevincoakley@nist.gov [National Institute of Standards and Technology, 325 Broadway, Boulder, CO 80305 (United States); Dewey, M.S.; Huber, M.G. [National Institute of Standards and Technology, 100 Bureau Drive, Stop 8461, Gaithersburg, MD 20899 (United States); Huffer, C.R.; Huffman, P.R. [North Carolina State University, 2401 Stinson Drive, Box 8202, Raleigh, NC 27695 (United States); Triangle Universities Nuclear Laboratory, 116 Science Drive, Box 90308, Durham, NC 27708 (United States); Marley, D.E. [National Institute of Standards and Technology, 100 Bureau Drive, Stop 8461, Gaithersburg, MD 20899 (United States); North Carolina State University, 2401 Stinson Drive, Box 8202, Raleigh, NC 27695 (United States); Mumm, H.P. [National Institute of Standards and Technology, 100 Bureau Drive, Stop 8461, Gaithersburg, MD 20899 (United States); O' Shaughnessy, C.M. [University of North Carolina at Chapel Hill, 120 E. Cameron Ave., CB #3255, Chapel Hill, NC 27599 (United States); Triangle Universities Nuclear Laboratory, 116 Science Drive, Box 90308, Durham, NC 27708 (United States); Schelhammer, K.W. [North Carolina State University, 2401 Stinson Drive, Box 8202, Raleigh, NC 27695 (United States); Triangle Universities Nuclear Laboratory, 116 Science Drive, Box 90308, Durham, NC 27708 (United States); Thompson, A.K.; Yue, A.T. [National Institute of Standards and Technology, 100 Bureau Drive, Stop 8461, Gaithersburg, MD 20899 (United States)

    2016-03-21

    In experiments that measure the lifetime of trapped particles, in addition to loss mechanisms with exponential survival probability functions, particles can be lost by mechanisms with non-exponential survival probability functions. Failure to account for such loss mechanisms produces systematic measurement error and associated systematic uncertainties in these measurements. In this work, we develop a general competing risks survival analysis method to account for the joint effect of loss mechanisms with either exponential or non-exponential survival probability functions, and a method to quantify the size of systematic effects and associated uncertainties for lifetime estimates. As a case study, we apply our survival analysis formalism and method to the Ultra Cold Neutron lifetime experiment at NIST. In this experiment, neutrons can escape a magnetic trap before they decay due to a wall loss mechanism with an associated non-exponential survival probability function.
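
    Under independent competing risks, the joint survival probability factorizes into the product of the per-mechanism survival functions, which is the structure the formalism above generalizes. Below is a toy sketch with an exponential decay channel and a deliberately non-exponential (Weibull-shaped) wall-loss channel; all numerical values are illustrative, not the NIST experiment's.

```python
import numpy as np

def survival(t, tau_beta=880.0, wall_loss_cdf=None):
    """Joint survival with an exponential decay channel and an arbitrary
    (possibly non-exponential) loss channel, assuming independent risks:
    S(t) = exp(-t/tau) * S_wall(t). tau ~ 880 s is roughly the free-neutron
    lifetime; the wall-loss shape below is a toy assumption."""
    s_decay = np.exp(-t / tau_beta)
    s_wall = 1.0 - wall_loss_cdf(t) if wall_loss_cdf else 1.0
    return s_decay * s_wall

t = np.linspace(0.0, 3000.0, 4)
# Toy non-exponential wall loss: Weibull-shaped cumulative loss probability.
wall = lambda t: 0.1 * (1.0 - np.exp(-(t / 500.0) ** 0.5))
print(survival(t))                        # pure beta decay
print(survival(t, wall_loss_cdf=wall))    # decay plus wall losses
```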

  2. Survival analysis approach to account for non-exponential decay rate effects in lifetime experiments

    International Nuclear Information System (INIS)

    Coakley, K.J.; Dewey, M.S.; Huber, M.G.; Huffer, C.R.; Huffman, P.R.; Marley, D.E.; Mumm, H.P.; O'Shaughnessy, C.M.; Schelhammer, K.W.; Thompson, A.K.; Yue, A.T.

    2016-01-01

    In experiments that measure the lifetime of trapped particles, in addition to loss mechanisms with exponential survival probability functions, particles can be lost by mechanisms with non-exponential survival probability functions. Failure to account for such loss mechanisms produces systematic measurement error and associated systematic uncertainties in these measurements. In this work, we develop a general competing risks survival analysis method to account for the joint effect of loss mechanisms with either exponential or non-exponential survival probability functions, and a method to quantify the size of systematic effects and associated uncertainties for lifetime estimates. As a case study, we apply our survival analysis formalism and method to the Ultra Cold Neutron lifetime experiment at NIST. In this experiment, neutrons can escape a magnetic trap before they decay due to a wall loss mechanism with an associated non-exponential survival probability function.

  3. Demisability and survivability sensitivity to design-for-demise techniques

    Science.gov (United States)

    Trisolini, Mirko; Lewis, Hugh G.; Colombo, Camilla

    2018-04-01

    The paper is concerned with examining the effects that design-for-demise solutions can have not only on the demisability of components, but also on their survivability, that is, their capability to withstand impacts from space debris. First, two models are introduced: a demisability model to predict the behaviour of spacecraft components during atmospheric re-entry, and a survivability model to assess the vulnerability of spacecraft structures against space debris impacts. Two indices that evaluate the level of demisability and survivability are also proposed. The two models are then used to study the sensitivity of the demisability and survivability indices as a function of typical design-for-demise options. The demisability and the survivability can in fact be influenced by the same design parameters in a competing fashion: while the demisability is improved, the survivability is worsened, and vice versa. The analysis shows how the design-for-demise solutions influence the demisability and the survivability independently. In addition, the effect that a solution has simultaneously on the two criteria is assessed. Results show which design-for-demise parameters most influence the demisability and the survivability. For such design parameters, maps are presented describing their influence on the demisability and survivability indices. These maps represent a useful tool to quickly assess the level of demisability and survivability that can be expected from a component when specific design parameters are changed.

  4. A hydroclimatological approach to predicting regional landslide probability using Landlab

    Science.gov (United States)

    Strauch, Ronda; Istanbulluoglu, Erkan; Nudurupati, Sai Siddhartha; Bandaragoda, Christina; Gasparini, Nicole M.; Tucker, Gregory E.

    2018-02-01

    We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.
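
    The Monte Carlo core of such an approach can be sketched with the dimensionless infinite-slope factor of safety, FS = (tan φ / tan θ)(1 - R_w ρ_w/ρ_s) + C / (h ρ_s g sin θ cos θ), sampling the uncertain inputs and counting the fraction of realizations with FS < 1. The parameter distributions below are invented for illustration and are not the study's calibrated inputs.

```python
import numpy as np

def factor_of_safety(theta, phi, c, h, rw, rho_s=2000.0, rho_w=1000.0, g=9.81):
    """Infinite-slope stability: FS < 1 is taken as failure.
    theta: slope angle, phi: friction angle (radians), c: cohesion (Pa),
    h: soil depth (m), rw: relative wetness (0-1)."""
    frictional = (np.tan(phi) / np.tan(theta)) * (1.0 - rw * rho_w / rho_s)
    cohesive = c / (h * rho_s * g * np.sin(theta) * np.cos(theta))
    return frictional + cohesive

# Monte Carlo over uncertain inputs (distributions are illustrative only).
rng = np.random.default_rng(7)
n = 100_000
phi = np.deg2rad(rng.uniform(28, 40, n))      # friction angle
c = rng.uniform(1e3, 8e3, n)                  # cohesion, Pa
rw = np.clip(rng.normal(0.6, 0.2, n), 0, 1)   # recharge-driven relative wetness
fs = factor_of_safety(theta=np.deg2rad(35), phi=phi, c=c, h=1.5, rw=rw)
print("annual probability of initiation ~", (fs < 1.0).mean())
```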

  5. A hydroclimatological approach to predicting regional landslide probability using Landlab

    Directory of Open Access Journals (Sweden)

    R. Strauch

    2018-02-01

    We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.

  6. Generic Degraded Configuration Probability Analysis for the Codisposal Waste Package

    International Nuclear Information System (INIS)

    S.F.A. Deng; M. Saglam; L.J. Gratton

    2001-01-01

    In accordance with the technical work plan, "Technical Work Plan For: Department of Energy Spent Nuclear Fuel Work Packages" (CRWMS M and O 2000c), this Analysis/Model Report (AMR) is developed for the purpose of screening out degraded configurations for U.S. Department of Energy (DOE) spent nuclear fuel (SNF) types. It performs the degraded configuration parameter and probability evaluations of the overall methodology specified in the "Disposal Criticality Analysis Methodology Topical Report" (YMP 2000, Section 3) to qualifying configurations. Degradation analyses are performed to assess realizable parameter ranges and physical regimes for configurations. Probability calculations are then performed for configurations characterized by k_eff in excess of the Critical Limit (CL). The scope of this document is to develop a generic set of screening criteria or models to screen out degraded configurations having the potential to exceed a criticality limit. The developed screening criteria include arguments based on physical/chemical processes and probability calculations, and apply to DOE SNF types when codisposed with the high-level waste (HLW) glass inside a waste package. The degradation takes place inside the waste package, long after repository licensing has expired. The emphasis of this AMR is on degraded configuration screening, and the probability analysis is one of the approaches used for screening. The intended use of the model is to apply the developed screening criteria to each DOE SNF type following the completion of the degraded mode criticality analysis internal to the waste package.

  7. Ubiquitous log odds: a common representation of probability and frequency distortion in perception, action and cognition

    Directory of Open Access Journals (Sweden)

    Hang eZhang

    2012-01-01

    In decision from experience, the source of probability information affects how probability is distorted in the decision task. Understanding how and why probability is distorted is a key issue in understanding the peculiar character of experience-based decision. We consider how probability information is used not just in decision making but also in a wide variety of cognitive, perceptual and motor tasks. Very similar patterns of distortion of probability/frequency information have been found in visual frequency estimation, frequency estimation based on memory, signal detection theory, and in the use of probability information in decision-making under risk and uncertainty. We show that distortion of probability in all cases is well captured as linear transformations of the log odds of frequency and/or probability, a model with a slope parameter and an intercept parameter. We then consider how task and experience influence these two parameters and the resulting distortion of probability. We review how the probability distortions change in systematic ways with task and report three experiments on frequency distortion where the distortions change systematically in the same task. We found that the slope of frequency distortions decreases with the sample size, which is echoed by findings in decision from experience. We review previous models of the representation of uncertainty and find that none can account for the empirical findings.
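
    The linear-in-log-odds family described above can be written with a slope γ and a crossover point p0 fixing the intercept: logit(w(p)) = γ·logit(p) + (1-γ)·logit(p0). The sketch below uses placeholder parameter values; with γ < 1 it reproduces the familiar inflation of small probabilities and deflation of large ones.

```python
import numpy as np

def llo(p, gamma, p0=0.37):
    """Linear-in-log-odds distortion: the log-odds of the reported probability
    is a linear function of the log-odds of the true probability, with slope
    gamma and crossover point p0 (both values here are placeholders)."""
    lo = gamma * np.log(p / (1 - p)) + (1 - gamma) * np.log(p0 / (1 - p0))
    return 1.0 / (1.0 + np.exp(-lo))

p = np.array([0.01, 0.1, 0.37, 0.9, 0.99])
print(llo(p, gamma=0.6))  # small probabilities inflated, large ones deflated
```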

  8. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples, from the most general, scenario-defining and ranking tools that use probability implicitly, to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management. Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture and from marketing to product discontinuation.

  9. ASURV: Astronomical SURVival Statistics

    Science.gov (United States)

    Feigelson, E. D.; Nelson, P. I.; Isobe, T.; LaValley, M.

    2014-06-01

    ASURV (Astronomical SURVival Statistics) provides astronomy survival analysis for right- and left-censored data including the maximum-likelihood Kaplan-Meier estimator and several univariate two-sample tests, bivariate correlation measures, and linear regressions. ASURV is written in FORTRAN 77, and is stand-alone and does not call any specialized libraries.

  10. Reach/frequency for printed media: Personal probabilities or models

    DEFF Research Database (Denmark)

    Mortensen, Peter Stendahl

    2000-01-01

    The author evaluates two different ways of estimating the reach and frequency of plans for printed media. The first assigns reading probabilities to groups of respondents and calculates reach and frequency by simulation. The second estimates parameters of a model for reach/frequency. It is concluded that the model-based approach is preferable, and estimates from such models are shown to be closer to panel data. The problem, however, is to get valid input for such models from readership surveys. Means for this are discussed.

  11. Transition probability spaces in loop quantum gravity

    Science.gov (United States)

    Guo, Xiao-Kan

    2018-03-01

    We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.

  12. Towards a Categorical Account of Conditional Probability

    Directory of Open Access Journals (Sweden)

    Robert Furber

    2015-11-01

    Full Text Available This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and also there are extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.

  13. UT Biomedical Informatics Lab (BMIL) probability wheel

    Science.gov (United States)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.

  14. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, choosing the constraints and making the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
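
    The probability assignment step described here can be sketched with a single mean-value constraint (Python with numpy/scipy; the function name and the constraint are illustrative): maximizing entropy subject to a fixed average yields exponential, Boltzmann-like weights, with the Lagrange multiplier fixed by the constraint:

      import numpy as np
      from scipy.optimize import brentq

      def maxent_distribution(values, target_mean):
          # Maximum entropy under a mean constraint: p_i proportional to
          # exp(-beta * x_i), with beta chosen so the mean matches.
          values = np.asarray(values, dtype=float)

          def mean_gap(beta):
              w = np.exp(-beta * values)
              return (w / w.sum()) @ values - target_mean

          beta = brentq(mean_gap, -50.0, 50.0)  # root of the constraint equation
          w = np.exp(-beta * values)
          return w / w.sum()

      # Four "energy levels" 0..3 with the average constrained to 1.0
      print(maxent_distribution([0, 1, 2, 3], 1.0))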

  15. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections

  16. Striatal activity is modulated by target probability.

    Science.gov (United States)

    Hon, Nicholas

    2017-06-14

    Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of the fact that different components of the striatum are sensitive to different types of task-relevant information.

  17. Defining Probability in Sex Offender Risk Assessment.

    Science.gov (United States)

    Elwood, Richard W

    2016-12-01

    There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.

  18. How Does Firm Survival Differ between Business Takeovers and New Venture Start-Ups?

    NARCIS (Netherlands)

    G. Xi (Guoqian); J.H. Block (Jörn); F. Lasch (Frank); F. Robert (Frank); A.R. Thurik (Roy)

    2017-01-01

    Focusing on entrepreneurship entry modes, we investigate two research questions regarding firm survival: how does the survival probability differ between business takeovers and new venture start-ups? And how do the determinants of survival differ between the two entry modes? Using a

  19. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid to visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  20. Is probability of frequency too narrow?

    International Nuclear Information System (INIS)

    Martz, H.F.

    1993-01-01

    Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed

  1. Survivability of systems under multiple factor impact

    International Nuclear Information System (INIS)

    Korczak, Edward; Levitin, Gregory

    2007-01-01

    The paper considers vulnerable multi-state series-parallel systems operating under influence of external impacts. Both the external impacts and internal failures affect system survivability, which is determined as the probability of meeting a given demand. The external impacts are characterized by several destructive factors affecting the system or its parts simultaneously. In order to increase the system's survivability a multilevel protection against the destructive factors can be applied to its subsystems. In such systems, the protected subsystems can be destroyed only if all of the levels of their protection are destroyed. The paper presents an algorithm for evaluating the survivability of series-parallel systems with arbitrary configuration of multilevel protection against multiple destructive factor impacts. The algorithm is based on a composition of Boolean and the Universal Generating Function techniques. Illustrative examples are presented
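
    The composition step of the Universal Generating Function technique admits a compact sketch. In this illustrative Python fragment (names and numbers are invented, and the multilevel protection is folded into the elements' state probabilities for brevity), each element's performance distribution is a dictionary; parallel elements add their performances, series elements are limited by the weakest subsystem, and survivability is the probability of meeting the demand:

      from itertools import product

      def compose(u1, u2, op):
          # Combine two u-functions {performance: probability} with operator op.
          out = {}
          for (g1, p1), (g2, p2) in product(u1.items(), u2.items()):
              g = op(g1, g2)
              out[g] = out.get(g, 0.0) + p1 * p2
          return out

      def survivability(u, demand):
          # Probability that system performance meets the demand.
          return sum(p for g, p in u.items() if g >= demand)

      gen = {0: 0.1, 50: 0.9}        # each generator destroyed with probability 0.1
      line = {0: 0.05, 100: 0.95}    # transmission line survives with 0.95
      system = compose(compose(gen, gen, lambda a, b: a + b), line, min)
      print(survivability(system, demand=80))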

  2. Effects of growth rate, size, and light availability on tree survival across life stages: a demographic analysis accounting for missing values and small sample sizes.

    Science.gov (United States)

    Moustakas, Aristides; Evans, Matthew R

    2015-02-28

    Plant survival is a key factor in forest dynamics, and survival probabilities often vary across life stages. Studies specifically aimed at assessing tree survival are unusual, so data initially designed for other purposes often need to be used; such data are more likely to contain errors than data collected for this specific purpose. We investigate the survival rates of ten tree species in a dataset designed to monitor growth rates. As some individuals were not included in the census at some time points, we use capture-mark-recapture methods both to account for missing individuals and to estimate relocation probabilities. Growth rates, size, and light availability were included as covariates in the model predicting survival rates. The study demonstrates that, for most species of UK hardwood, tree mortality is best described as constant between years, size-dependent at early life stages, and size-independent at later life stages. We have demonstrated that even with a twenty-year dataset it is possible to discern variability both between individuals and between species. Our work illustrates the potential utility of the method applied here for calculating plant population dynamics parameters in time-replicated datasets with small sample sizes and missing individuals, without any loss of sample size and including explanatory covariates.

  3. K-shell ionization probability in energetic nearly symmetric heavy-ion collisions

    International Nuclear Information System (INIS)

    Tserruya, I.; Schmidt-Boecking, H.; Schuch, R.

    1977-01-01

    Impact parameter dependent K-x-ray emission probabilities for the projectile and target atoms have been measured in 35 MeV Cl on Cl, Cl on Ti and Cl on Ni collisions. The sum of projectile plus target K-shell ionization probability is taken as a measure of the total 2pσ ionization probability. The 2pπ-2pσ rotational coupling model is in clear disagreement with the present results. On the other hand, the sum of probabilities is reproduced both in shape and absolute magnitude by the statistical model for inner-shell ionization. The K-shell ionization probability of the higher-Z collision partner is well described by this model, including the 2pσ-1sσ vacancy sharing probability calculated as a function of the impact parameter. (author)

  4. A methodology for the transfer of probabilities between accident severity categories

    International Nuclear Information System (INIS)

    Whitlow, J.D.; Neuhauser, K.S.

    1991-01-01

    A methodology has been developed which allows the accident probabilities associated with one accident-severity category scheme to be transferred to another severity category scheme. The methodology requires that the schemes use a common set of parameters to define the categories. The transfer of accident probabilities is based on the relationships between probability of occurrence and each of the parameters used to define the categories. Because of the lack of historical data describing accident environments in engineering terms, these relationships may be difficult to obtain directly for some parameters; numerical models or expert judgement are often needed to obtain them. These relationships, even if they are not exact, allow the accident probability associated with any severity category to be distributed within that category in a manner consistent with accident experience, which in turn allows the accident probability to be appropriately transferred to a different category scheme.

  5. Bomb parameters

    International Nuclear Information System (INIS)

    Kerr, George D.; Young, Robert W.; Cullings, Harry M.; Christy, Robert F.

    2005-01-01

    The reconstruction of neutron and gamma-ray doses at Hiroshima and Nagasaki begins with a determination of the parameters describing the explosion. The calculations of the air transported radiation fields and survivor doses from the Hiroshima and Nagasaki bombs require knowledge of a variety of parameters related to the explosions. These various parameters include the heading of the bomber when the bomb was released, the epicenters of the explosions, the bomb yields, and the tilt of the bombs at time of explosion. The epicenter of a bomb is the explosion point in air that is specified in terms of a burst height and a hypocenter (or the point on the ground directly below the epicenter of the explosion). The current reassessment refines the energy yield and burst height for the Hiroshima bomb, as well as the locations of the Hiroshima and Nagasaki hypocenters on the modern city maps used in the analysis of the activation data for neutrons and TLD data for gamma rays. (J.P.N.)

  6. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship...

  7. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed. The present notes outline a method for evaluation of the probability...

  8. Introducing Disjoint and Independent Events in Probability.

    Science.gov (United States)

    Kelly, I. W.; Zwiers, F. W.

    Two central concepts in probability theory are those of independence and mutually exclusive events. This document is intended to provide suggestions to teachers that can be used to equip students with an intuitive, comprehensive understanding of these basic concepts in probability. The first section of the paper delineates mutually exclusive and…

  9. Selected papers on probability and statistics

    CERN Document Server

    2009-01-01

    This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.

  10. Collective probabilities algorithm for surface hopping calculations

    International Nuclear Information System (INIS)

    Bastida, Adolfo; Cruz, Carlos; Zuniga, Jose; Requena, Alberto

    2003-01-01

    General equations are derived that the transition probabilities of the hopping algorithms in surface hopping calculations must obey to assure equality between the average quantum and classical populations. These equations are solved for two particular cases. In the first it is assumed that probabilities are the same for all trajectories and that the number of hops is kept to a minimum. These assumptions specify the collective probabilities (CP) algorithm, for which the transition probabilities depend on the average populations for all trajectories. In the second case, the probabilities for each trajectory are supposed to be completely independent of the results from the other trajectories. There is, then, a unique solution of the general equations assuring that the transition probabilities are equal to the quantum population of the target state, which is referred to as the independent probabilities (IP) algorithm. The fewest switches (FS) algorithm developed by Tully is accordingly understood as an approximate hopping algorithm which takes elements from the accurate CP and IP solutions. A numerical test of all these hopping algorithms is carried out for a one-dimensional two-state problem with two avoided crossings, which shows the accuracy and computational efficiency of the collective probabilities algorithm proposed, the limitations of the FS algorithm and the similarity between the results offered by the IP algorithm and those obtained with the Ehrenfest method.
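
    For concreteness, one fewest-switches hopping decision can be sketched as follows (Python with numpy; the variable names follow Tully's notation loosely and the inputs are invented). The CP algorithm differs in that the populations entering the transition probabilities are averaged over the whole ensemble of trajectories rather than taken per trajectory:

      import numpy as np

      def fewest_switches_hop(k, a, b, dt, rng):
          # a[k, k]: population of the current state k;
          # b[j]: population flux from state k to state j.
          # Hop probabilities are clamped at zero and a uniform draw
          # selects the target state, keeping the number of hops minimal.
          g = np.maximum(b * dt / a[k, k].real, 0.0)
          g[k] = 0.0
          xi, cum = rng.uniform(), 0.0
          for j, gj in enumerate(g):
              cum += gj
              if xi < cum:
                  return j   # hop to state j
          return k           # no hop

      rng = np.random.default_rng(0)
      a = np.array([[0.7, 0.1 + 0.2j], [0.1 - 0.2j, 0.3]])
      print(fewest_switches_hop(0, a, b=np.array([0.0, 0.5]), dt=0.1, rng=rng))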

  11. Examples of Neutrosophic Probability in Physics

    Directory of Open Access Journals (Sweden)

    Fu Yuhua

    2015-01-01

    Full Text Available This paper re-discusses the problems of the so-called “law of nonconservation of parity” and “accelerating expansion of the universe”, and presents the examples of determining Neutrosophic Probability of the experiment of Chien-Shiung Wu et al in 1957, and determining Neutrosophic Probability of accelerating expansion of the partial universe.

  12. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation...

  13. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…

  14. Some open problems in noncommutative probability

    International Nuclear Information System (INIS)

    Kruszynski, P.

    1981-01-01

    A generalization of probability measures to non-Boolean structures is discussed. The starting point of the theory is the Gleason theorem about the form of measures on closed subspaces of a Hilbert space. The problems are formulated in terms of probability on lattices of projections in arbitrary von Neumann algebras. (Auth.)

  15. Probability: A Matter of Life and Death

    Science.gov (United States)

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…
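
    The life-table computations referred to here reduce to a short calculation (a toy Python example; the q(x) values are invented and the table is truncated after five one-year intervals, so the resulting expectancy applies only to the truncated table):

      def life_table(qx):
          # l(x): survivors out of 100,000 births, given probabilities of
          # dying q(x); deaths are counted for half an interval each.
          lx = [100_000.0]
          for q in qx:
              lx.append(lx[-1] * (1.0 - q))
          person_years = [(lx[i] + lx[i + 1]) / 2.0 for i in range(len(qx))]
          return lx, sum(person_years) / lx[0]

      lx, e0 = life_table([0.010, 0.002, 0.002, 0.003, 0.004])
      print(lx)            # the survivorship column
      print(round(e0, 2))  # expectation of life over the truncated table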

  16. Teaching Probability: A Socio-Constructivist Perspective

    Science.gov (United States)

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.

  17. Stimulus Probability Effects in Absolute Identification

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  18. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    47 CFR 1.1623 (2010-10-01): Telecommunication, Federal Communications Commission, General Practice and Procedure, Random Selection Procedures for Mass Media Services, General Procedures, § 1.1623 Probability calculation. (a) All calculations shall be...

  19. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  20. Against All Odds: When Logic Meets Probability

    NARCIS (Netherlands)

    van Benthem, J.; Katoen, J.-P.; Langerak, R.; Rensink, A.

    2017-01-01

    This paper is a light walk along interfaces between logic and probability, triggered by a chance encounter with Ed Brinksma. It is not a research paper, or a literature survey, but a pointer to issues. I discuss both direct combinations of logic and probability and structured ways in which logic can

  1. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  2. The probability of the false vacuum decay

    International Nuclear Information System (INIS)

    Kiselev, V.; Selivanov, K.

    1983-01-01

    The closed expression for the probability of the false vacuum decay in (1+1) dimensions is given. The probability of false vacuum decay is expressed as the product of an exponential quasiclassical factor and a functional determinant of the given form. The method for calculation of this determinant is developed and a complete answer for (1+1) dimensions is given.
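
    For reference, the standard closed form of this kind of result (the Callan-Coleman expression, quoted from the general literature rather than from this paper) writes the decay probability per unit time and volume as a quasiclassical exponential multiplied by a ratio of functional determinants evaluated at the bounce solution:

      \frac{\Gamma}{V} = A\, e^{-S_E[\phi_b]/\hbar}\,\bigl(1 + O(\hbar)\bigr),
      \qquad
      A \propto \left| \frac{\det'\left[-\partial^2 + U''(\phi_b)\right]}
                            {\det\left[-\partial^2 + U''(\phi_{\mathrm{fv}})\right]} \right|^{-1/2}

    where det' omits zero modes, \phi_b is the bounce and \phi_{\mathrm{fv}} the false vacuum configuration.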

  3. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  4. The transition probabilities of the reciprocity model

    NARCIS (Netherlands)

    Snijders, T.A.B.

    1999-01-01

    The reciprocity model is a continuous-time Markov chain model used for modeling longitudinal network data. A new explicit expression is derived for its transition probability matrix. This expression can be checked relatively easily. Some properties of the transition probabilities are given, as well

  5. Probability numeracy and health insurance purchase

    NARCIS (Netherlands)

    Dillingh, Rik; Kooreman, Peter; Potters, Jan

    2016-01-01

    This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.

  6. Clinical Features in a Danish Population-Based Cohort of Probable Multiple System Atrophy Patients

    DEFF Research Database (Denmark)

    Starhof, Charlotte; Korbo, Lise; Lassen, Christina Funch

    2016-01-01

    Background: Multiple system atrophy (MSA) is a rare, sporadic and progressive neurodegenerative disorder. We aimed to describe the clinical features of Danish probable MSA patients, evaluate their initial response to dopaminergic therapy and examine mortality. Methods: From the Danish National...... the criteria for probable MSA. We recorded clinical features, examined differences by MSA subtype and used Kaplan-Meier survival analysis to examine mortality. Results: The mean age at onset of patients with probable MSA was 60.2 years (range 36-75 years) and mean time to wheelchair dependency was 4.7 years...

  7. Survival under uncertainty an introduction to probability models of social structure and evolution

    CERN Document Server

    Volchenkov, Dimitri

    2016-01-01

    This book introduces and studies a number of stochastic models of subsistence, communication, social evolution and political transition that will allow the reader to grasp the role of uncertainty as a fundamental property of our irreversible world. At the same time, it aims to bring about a more interdisciplinary and quantitative approach across very diverse fields of research in the humanities and social sciences. Through the examples treated in this work – including anthropology, demography, migration, geopolitics, management, and bioecology, among other things – evidence is gathered to show that volatile environments may change the rules of the evolutionary selection and dynamics of any social system, creating a situation of adaptive uncertainty, in particular, whenever the rate of change of the environment exceeds the rate of adaptation. Last but not least, it is hoped that this book will contribute to the understanding that inherent randomness can also be a great opportunity – for social systems an...

  8. DETERMINE THE PROBABILITY OF PASSENGER SURVIVAL IN AN AVIATION INCIDENT WITH FIRE ON THE GROUND

    Directory of Open Access Journals (Sweden)

    Vladislav Pavlovich Turko

    2017-05-01

    Full Text Available We assess the risk level of an aviation incident with fire on the ground and the impact of the associated affecting factors on people. Based on statistical data on aviation incidents, a model of an aircraft fire situation on the ground is offered.

  9. Analysis of probability of defects in the disposal canisters

    International Nuclear Information System (INIS)

    Holmberg, J.-E.; Kuusela, P.

    2011-06-01

    This report presents a probability model for the reliability of the spent nuclear waste final disposal canister. Reliability means here that the welding of the canister lid has no critical defects from the long-term safety point of view. From the reliability point of view, both the reliability of the welding process (that no critical defects will be born) and the non-destructive testing (NDT) process (all critical defects will be detected) are equally important. In the probability model, critical defects in a weld were simplified into a few types. Also the possibility of human errors in the NDT process was taken into account in a simple manner. At this moment there is very little representative data to determine the reliability of welding and also the data on NDT is not well suited for the needs of this study. Therefore calculations presented here are based on expert judgements and on several assumptions that have not been verified yet. The Bayesian probability model shows the importance of the uncertainty in the estimation of the reliability parameters. The effect of uncertainty is that the probability distribution of the number of defective canisters becomes flat for larger numbers of canisters compared to the binomial probability distribution in case of known parameter values. In order to reduce the uncertainty, more information is needed from both the reliability of the welding and NDT processes. It would also be important to analyse the role of human factors in these processes since their role is not reflected in typical test data which is used to estimate 'normal process variation'. The reported model should be seen as a tool to quantify the roles of different methods and procedures in the weld inspection process. (orig.)
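
    The flattening effect described here can be reproduced with a toy calculation (Python with scipy; all numbers are illustrative and not taken from the report). With the defect probability known exactly, the count of defective canisters is binomial; with a Beta prior expressing parameter uncertainty it becomes beta-binomial, whose distribution is visibly flatter:

      import numpy as np
      from scipy.stats import binom, betabinom

      n = 3000                       # canisters welded and inspected
      p = 0.001                      # nominal defect probability
      a, b = 2.0, 2.0 / p - 2.0      # weak Beta prior with mean p

      ks = np.arange(16)
      print("binomial      :", np.round(binom.pmf(ks, n, p), 4))
      print("beta-binomial :", np.round(betabinom.pmf(ks, n, a, b), 4))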

  10. Undetected error probability for data services in a terrestrial DAB single frequency network

    NARCIS (Netherlands)

    Schiphorst, Roelof; Hoeksema, F.W.; Slump, Cornelis H.; Veldhuis, Raymond N.J.; Veldhuis, R.N.J.; Cronie, H.S.

    2007-01-01

    DAB (Digital Audio Broadcasting) is the European successor of FM radio. Besides audio services, other services such as traffic information can be provided. An important parameter for data services is the probability of non-recognized or undetected errors in the system. To derive this probability, we

  11. Interpretation of the results of statistical measurements. [search for basic probability model

    Science.gov (United States)

    Olshevskiy, V. V.

    1973-01-01

    For random processes, the calculated probability characteristic and the measured statistical estimate are used in a quality functional, which defines the difference between the two functions. Based on the assumption that the statistical measurement procedure is organized so that the parameters of a selected model are optimized, it is shown that the interpretation of experimental research is a search for a basic probability model.

  12. Analysis of the probability of channel satisfactory state in P2P live ...

    African Journals Online (AJOL)

    In this paper, a model based on user behaviour in P2P live streaming systems was developed in order to analyse one of the key QoS parameters of such systems, i.e. the probability of the channel-satisfactory state. The impact of upload bandwidths and channels' popularity on the probability of the channel-satisfactory state was also ...

  13. The enigma of probability and physics

    International Nuclear Information System (INIS)

    Mayants, L.

    1984-01-01

    This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (science of probability) and probabilistic physics (application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed - quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)

  14. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws, such as cracks and crack-like flaws, need to be detected reliably using these NDE methods, and a reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper provides discussion on optimizing probability of detection (POD) demonstration experiments using the Point Estimate Method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false (POF) calls while keeping the flaw sizes in the set as small as possible.
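
    The arithmetic behind a point-estimate demonstration is a binomial tail probability, sketched below (Python with scipy; the 29-of-29 criterion is the classic example for demonstrating 90% POD at 95% confidence, and the numbers are illustrative rather than NASA's exact acceptance values):

      from scipy.stats import binom

      n_flaws, n_required = 29, 29   # all 29 flaws must be detected to pass

      def prob_pass(true_pod):
          # Probability of detecting at least n_required of n_flaws.
          return binom.sf(n_required - 1, n_flaws, true_pod)

      print(prob_pass(0.95))  # a good procedure passes with probability ~0.226
      print(prob_pass(0.90))  # a borderline procedure still passes ~0.047 of the time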

  15. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  16. Repair models of cell survival and corresponding computer program for survival curve fitting

    International Nuclear Information System (INIS)

    Shen Xun; Hu Yiwei

    1992-01-01

    Some basic concepts and formulations of two repair models of survival, the incomplete repair (IR) model and the lethal-potentially lethal (LPL) model, are introduced. An IBM-PC computer program for survival curve fitting with these models was developed and applied to fit the survival of human melanoma cells HX118 irradiated at different dose rates. Comparison was made between the repair models and two non-repair models, the multitarget-single hit model and the linear-quadratic model, in the fitting and analysis of the survival-dose curves. It was shown that either the IR model or the LPL model can fit a set of survival curves at different dose rates with the same parameters and provide information on the repair capacity of cells. These two mathematical models could be very useful in quantitative studies of the radiosensitivity and repair capacity of cells.
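
    As an illustration of survival-curve fitting with the two non-repair comparison models named above (Python with scipy; the dose-survival data are invented, not the HX118 measurements):

      import numpy as np
      from scipy.optimize import curve_fit

      def linear_quadratic(D, alpha, beta):
          # LQ model: S = exp(-(alpha*D + beta*D^2))
          return np.exp(-(alpha * D + beta * D**2))

      def multitarget_single_hit(D, n, D0):
          # Multitarget-single hit model: S = 1 - (1 - exp(-D/D0))^n
          return 1.0 - (1.0 - np.exp(-D / D0)) ** n

      dose = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 8.0])
      surv = np.array([1.0, 0.75, 0.50, 0.18, 0.05, 0.012])

      (alpha, beta), _ = curve_fit(linear_quadratic, dose, surv, p0=[0.2, 0.02])
      (n, D0), _ = curve_fit(multitarget_single_hit, dose, surv, p0=[2.0, 1.5])
      print(f"LQ fit: alpha={alpha:.3f}, beta={beta:.3f}")
      print(f"Multitarget fit: n={n:.2f}, D0={D0:.2f}")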

  17. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

    It is anticipated that changes in the frequency of surveillance tests, preventive maintenance or parts replacement of safety related components may change component failure probabilities and, as a result, the core damage probability, and that the change differs depending on the initiating event frequency and the component type. This study assessed the change of core damage probability using a simplified PSA model capable of calculating core damage probability in a short time period, developed by the US NRC to process accident sequence precursors, when various components' failure probabilities are varied between 0 and 1 and when Japanese or American initiating event frequency data are used. As a result of the analysis: (1) the frequency of surveillance tests, preventive maintenance or parts replacement of motor driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability increases; (2) core damage probability is insensitive to changes in surveillance test frequency for motor operated valves and the turbine driven auxiliary feedwater pump, since the change in core damage probability is small when their failure probabilities change by about one order of magnitude; (3) the change in core damage probability is small when Japanese failure probability data are applied to the emergency diesel generator, even if the failure probability changes by an order of magnitude from the base value, whereas when American failure probability data are applied, the increase in core damage probability is large when the failure probability increases. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency, etc. (author)

  18. Age-specific survival of tundra swans on the lower Alaska Peninsula

    Science.gov (United States)

    Meixell, Brandt W.; Lindberg, Mark S.; Conn, Paul B.; Dau, Christian P.; Sarvis, John E.; Sowl, Kristine M.

    2013-01-01

    The population of Tundra Swans (Cygnus columbianus columbianus) breeding on the lower Alaska Peninsula represents the southern extremity of the species' range and is uniquely nonmigratory. We used data on recaptures, resightings, and recoveries of neck-collared Tundra Swans on the lower Alaska Peninsula to estimate collar loss, annual apparent survival, and other demographic parameters for the years 1978–1989. Annual collar loss was greater for adult males fitted with either the thinner collar type (0.34) or the thicker collar type (0.15) than for other age/sex classes (thinner: 0.10, thicker: 0.04). The apparent mean probability of survival of adults (0.61) was higher than that of immatures (0.41) and for both age classes varied considerably by year (adult range: 0.44–0.95, immature range: 0.25–0.90). To assess effects of permanent emigration by age and breeding class, we analyzed post hoc the encounter histories of swans known to breed in our study area. The apparent mean survival of known breeders (0.65) was generally higher than that of the entire marked sample but still varied considerably by year (range 0.26–1.00) and indicated that permanent emigration of breeding swans was likely. We suggest that reductions in apparent survival probability were influenced primarily by high and variable rates of permanent emigration and that immigration by swans from elsewhere may be important in sustaining a breeding population at and near Izembek National Wildlife Refuge.

  19. Probabilistic Inference: Task Dependency and Individual Differences of Probability Weighting Revealed by Hierarchical Bayesian Modeling.

    Science.gov (United States)

    Boos, Moritz; Seer, Caroline; Lange, Florian; Kopp, Bruno

    2016-01-01

    Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modeling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behavior. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modeling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modeling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.

  20. Probabilistic inference: Task dependency and individual differences of probability weighting revealed by hierarchical Bayesian modelling

    Directory of Open Access Journals (Sweden)

    Moritz eBoos

    2016-05-01

    Full Text Available Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modelling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities by two (likelihoods design. Five computational models of cognitive processes were compared with the observed behaviour. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model’s success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modelling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modelling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.